
Tools/en

From Kiwix

Revision as of 00:57, 25 March 2010


The Kiwix tools are a set of scripts (mostly in Perl) that help create content usable by Kiwix.

Kiwix is primarily designed as a tool for publishing copies of Wikipedia, but every effort is made to ensure it is also useful for other content.

Since the heart of Kiwix is Gecko, the HTML rendering engine, the objective of the Kiwix tools is to produce:

  • first, a coherent set of static HTML files and the resources they need: stylesheets, JavaScript code, images, etc.;
  • then, from these static files, a file in the ZIM format (see below).

Storage

We call such a coherent set of multimedia content a dump or a corpus. These dumps can take many forms: previous versions of Kiwix used a simple directory layout; MoulinWiki used a file compressed with bzip2 and indexed in an SQLite database.

Today, Kiwix uses the ZIM format: a single file contains the entire dump, allowing fast access, high compression, and configurability.
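To illustrate the idea behind such a container (a toy sketch only, not the actual ZIM binary layout, which is specified by the openZIM project), here is how a single file can hold many compressed articles together with an index that allows random access:

```python
import json
import zlib


def pack(articles, path):
    """Write {url: html} pairs into one file: compressed blobs plus an index."""
    index = {}
    with open(path, "wb") as f:
        for url, html in articles.items():
            blob = zlib.compress(html.encode("utf-8"))
            index[url] = (f.tell(), len(blob))  # offset and length of this blob
            f.write(blob)
        # Append the index and, in the last 8 bytes, the index's own offset.
        index_offset = f.tell()
        f.write(json.dumps(index).encode("utf-8"))
        f.write(index_offset.to_bytes(8, "little"))


def read_entry(path, url):
    """Fetch a single article without decompressing the whole file."""
    with open(path, "rb") as f:
        f.seek(-8, 2)  # the index offset is stored at the end of the file
        index_offset = int.from_bytes(f.read(8), "little")
        f.seek(index_offset)
        index = json.loads(f.read()[:-8].decode("utf-8"))
        offset, length = index[url]
        f.seek(offset)
        return zlib.decompress(f.read(length)).decode("utf-8")
```

A real ZIM file adds clusters (many articles compressed together), MIME types, and checksums, but the principle — one file, an index, and compressed payloads — is the same.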

ZIM is an open, standard format created and maintained by the openZIM project, of which Kiwix is a founding member. ZIM is based on an older format, Zeno, created by the Berlin publishing house Directmedia for the German Wikipedia released on CD-ROM. The Zeno format was later abandoned, but we wanted to continue its development. The future will tell whether this initiative succeeds, but the goal is to make ZIM a standard and thus simplify dump storage for everyone. It is, in any case, already the best free solution.

Generating ZIM Files From Wikis

The question of how to generate a dump is not a simple one. For several reasons, Kiwix has so far concentrated on generating dumps offering a selection of entries from a given wiki site, even if publishing complete Wikipedia dumps remains a clear objective. The Kiwix tools are designed to assist in selecting entries, replicating content from the online site to a local mirror, and then turning the mirror into a ZIM file.

But this is not the only way to generate a dump: in theory, it can be done in several ways. Here is a small, non-exhaustive list of approaches:

  • If you want to produce a complete dump, you can:
    • obtain a ready-made HTML dump provided by the wiki administrators, as the Wikimedia Foundation provides, for example.
    • set up a local mirror of the wiki, loading the data (the content of another wiki) into the database and then generating an HTML dump yourself. Such data can be found for the Wikimedia Foundation projects. For a selection rather than a complete dump, you can also retrieve the data dynamically from the site (since the wiki software is open source).
    • generate an HTML dump directly (by retrieving the HTML pages) using software such as Vacuum on the website (be careful not to abuse the remote web site with inordinate amounts of traffic, though!).
  • If you want a partial dump, you must make a selection of entries; once you have only the entries you want, the same process applies as for a complete dump.
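Whatever the approach, a partial dump boils down to computing which entries you still have to fetch. A minimal sketch of that step (the title sets below are hypothetical placeholders; in practice they would come from scripts such as listAllPages.pl or listCategoryEntries.pl):

```python
# Titles we want in the dump (e.g. a hand-made or category-based selection).
wanted = {"Berlin", "Paris", "Rome", "Madrid"}

# Titles already present in the local mirror.
mirrored = {"Berlin", "Rome"}

# What remains to be copied from the online site, in a stable order.
to_fetch = sorted(wanted - mirrored)
print(to_fetch)  # with this hypothetical data: ['Madrid', 'Paris']
```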

There are certain constraints that should be taken into account. Here are the most important ones:

  • the hardware resources (machines, power) of the server
  • your own hardware resources
  • the storage space you have for the final result
  • how to make the selection of entries, if necessary.

Usage

Here is a list of the available scripts (many of them are specific to MediaWiki):

Mediawiki Maintenance

  • backupMediawikiInstall.pl creates a tgz archive of a complete existing MediaWiki installation (code + resources + database).
  • installMediawiki.pl brings up a MediaWiki instance from source code without human intervention, simulating the manual MediaWiki installation process.
  • resetMediawikiDatabase.pl empties a local MediaWiki instance of all its pages.
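The idea behind a script like backupMediawikiInstall.pl can be sketched in a few lines. This is a simplified illustration, not the real script: it archives only the installation directory, while the real backup also includes a database dump; the paths and function name are hypothetical.

```python
import os
import tarfile
import time


def backup_install(install_dir, backup_dir):
    """Archive a wiki installation directory (code + resources) as a .tgz file."""
    name = time.strftime("mediawiki-backup-%Y%m%d-%H%M%S.tgz")
    archive_path = os.path.join(backup_dir, name)
    with tarfile.open(archive_path, "w:gz") as tar:
        # A real backup would also add a database dump (e.g. mysqldump output).
        tar.add(install_dir, arcname=os.path.basename(install_dir))
    return archive_path
```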

Mirroring Tools

  • buildHistoryFile.pl, given a list of articles and an online MediaWiki site, obtains the complete history of each page on the list.
    • extractContributorsFromHistoryFile.pl extracts a list of authors from the histories obtained by the buildHistoryFile.pl script.
  • buildContributorsHtmlPages.pl, given a template and a list of authors, builds a custom set of HTML pages containing all of the authors on the list.
  • checkMediawikiPageCompleteness.pl checks whether the local copies of pages from an online MediaWiki site are complete, i.e. have no missing dependencies (template files, multimedia resources, etc.).
  • checkPageExistence.pl, given a list of page titles and an online MediaWiki site, checks whether those pages exist on it. This can be handy, for example, to see which pages have been replicated.
  • checkRedirects.pl checks that no pages redirect to non-existent pages (i.e. broken redirects). Eventually, it should also detect pages that redirect to each other.
  • listAllImages.pl lists all the images of an online MediaWiki site.
  • listAllPages.pl lists all the pages of an online MediaWiki site.
  • listCategoryEntries.pl lists the pages belonging to a category, recursively.
  • listRedirects.pl lists the page redirects of an online MediaWiki site.
  • mirrorMediawikiCode.pl downloads the exact MediaWiki version used by an online site; this includes both the MediaWiki code and its extensions.
  • mirrorMediawikiInterwikis.pl installs on a local MediaWiki site interwiki links (cross-language links) identical to those of an online MediaWiki site.
  • mirrorMediawikiPages.pl copies a set of pages and their dependencies (templates and multimedia resources) from an online MediaWiki site to a local MediaWiki site.
  • modifyMediawikiEntry.pl removes, deletes, or replaces a list of pages on an online MediaWiki site.
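As an illustration of what a check like checkRedirects.pl does, here is the core logic applied to an in-memory redirect table (the data is hypothetical; the real script reads the wiki itself):

```python
def find_broken_redirects(redirects, existing_pages):
    """Return the redirects whose target page does not exist."""
    return {src: dst for src, dst in redirects.items()
            if dst not in existing_pages}


def find_mutual_redirects(redirects):
    """Return pairs of pages that redirect to each other (the loops the
    script should eventually detect)."""
    return sorted({tuple(sorted((a, b)))
                   for a, b in redirects.items()
                   if redirects.get(b) == a and a != b})


# Hypothetical redirect table: source title -> target title.
redirects = {"Colour": "Color", "Color": "Colour", "Old": "Gone"}
existing = {"Color", "Colour"}
print(find_broken_redirects(redirects, existing))  # {'Old': 'Gone'}
print(find_mutual_redirects(redirects))            # [('Color', 'Colour')]
```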

Dumping Tools

  • checkEmptyFilesInHtmlDirectory.pl checks whether a directory and its subdirectories contain empty files.
  • dumpHtml.pl, given a local MediaWiki site, makes a fully static copy of its pages, i.e. creates a directory with all the needed HTML.
  • launchTntreader.pl easily launches the tntreader program.
  • optimizeContents.pl optimizes a directory of HTML pages and resources. This script calls the following external tools: HTML Tidy for HTML files; the Little utils for images.
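A typical sanity check before packaging a dump is that no HTML file came out empty. A minimal sketch of such a check (the same idea as checkEmptyFilesInHtmlDirectory.pl, with a hypothetical function name):

```python
import os


def find_empty_files(root):
    """Return the paths of all zero-byte files under root, recursively."""
    empty = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getsize(path) == 0:
                empty.append(path)
    return sorted(empty)
```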

ZIM Generation

  • buildZimFileFromDirectory.pl creates a ZIM file from a directory tree containing static HTML and other content files.