Benutzer:Dirk Hünniger/wb2pdf


Summary

mediawiki2latex converts MediaWiki markup to LaTeX and, via LaTeX, to PDF. It can be used to export pages from any project running MediaWiki, such as Wikipedia. It is also possible to generate epub and odt output files.

Web Version

You may try mediawiki2latex at the following URL:

http://mediawiki2latex.wmflabs.org/

Remember:

  • There is a time limit of one hour and a limit of 200 pages per request on the server.
  • There is no limit on the locally installed versions described below.
  • There is another server that can compile larger documents (up to 800 pages): http://mediawiki2latex-large.wmflabs.org/

Installation Instructions

see Installation Instructions

User Manual

see the User Manual



Command Line Version

A command line version is currently available as part of the Debian Stretch distribution, as well as the current Ubuntu distribution.
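
On Debian or Ubuntu it can be installed from the package repositories and run directly from the shell. A minimal sketch of a typical session follows; the -u (source URL) and -o (output file) option names are assumptions based on common usage, so check mediawiki2latex --help for the options of your installed version:

 # install the command line version from the distribution repositories
 sudo apt-get install mediawiki2latex
 # convert a single wiki page to PDF (option names assumed; see --help)
 mediawiki2latex -u https://en.wikipedia.org/wiki/Haskell -o haskell.pdf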

LaTeX Intermediate Code

On Linux, you can use the -c command line option with an absolute pathname to keep the intermediate LaTeX code.
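
A minimal sketch, reusing the assumed -u and -o options from the example above; here -c is given an absolute path under which the intermediate LaTeX code is stored:

 # keep the intermediate LaTeX source under /tmp/latex-tree (absolute path)
 mediawiki2latex -c /tmp/latex-tree -u https://en.wikipedia.org/wiki/Haskell -o haskell.pdf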

Media

Talk

File:Wb2pdfTalk.ogv

Slides

Wb2pdfTalk.pdf

Poster

File:Wb2pdfPoster.png

In Action

To see it in action, look here: Datei:Wb2latexCompilingWikibook2PDF.ogg

Developers

The following link Benutzer:Dirk Huenniger/wb2pdf/details explains some of the inner workings of the software.

Quality and Statistics

A test run in October 2014, processing 4369 featured articles of the English Wikipedia, produced a PDF file in each case. These were all the featured articles we were able to find at the beginning of the test.

In December 2018 we looked at the usage of the web server and saw that the 50 requests examined resulted in the following output:

PDF 29
FAIL 10
EPUB 7
ZIP 2
ODT 2

The failures are believed to be caused by attempts to process large books from the Wikipedia book namespace that exceeded the time limit of one hour.

In December 2018 we also did a test run on 100 featured articles from the English Wikipedia. In two cases no PDF was created; running those two cases again produced a PDF each time. The total size of all PDFs was 2.2 GB on disk, and 5 GB were downloaded in order to create them. The process took 6 hours and 15 minutes. The computer used was an i5-8250U notebook with 8 GB of memory, running only one instance of mediawiki2latex at a time. The internet downstream speed was 11.6 MBit/s.

The largest book we have created with mediawiki2latex so far (5000 pages) is available here:

https://drive.google.com/file/d/1SA6TEKWrdpXAxDyHZe-umBa2cJ5Ya77X/view?usp=sharing