2010-03-10, 22:19 | #2
2010-03-12, 14:35 | #3
2010-03-12, 15:38 | #4
Am I right in thinking this isn't terribly stable at the moment? I left this running on two VMs overnight and each processed a few chunks fine, then both downloaded new copies of the 11.3 GB dump, and since then they're throwing out various MediaWiki/database errors every time they're given a new job to process.
2010-03-12, 16:16 | #5
2010-03-13, 17:17 | #6
Thanks for the info - it looks like my copies are broken. Both clients did update a short while ago, but I'm still getting one of two faults; which one appears varies each time I kick off the client:
- "Incorrect information" errors for ./wikidb/user.frm (the user table seems to be empty, even after the client update)
- spurious "MediaWiki internal error. Exception caught inside exception handler" faults after the client is assigned a job and the static HTML dump folder is created.
So I'm now redownloading both the commons and en data from scratch and will see how it gets on.
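For what it's worth, one way to tell whether the user table is actually corrupt or just empty is to query it directly. A minimal sketch, assuming a local MySQL server, default credentials and the database name taken from the error message (adjust to your own setup):
Code:
# Quick check of the wiki database's user table; host, user and password
# are assumptions here - adjust them to the client's actual MySQL setup.
import MySQLdb  # classic Python MySQL driver (python-mysqldb package)

conn = MySQLdb.connect(host="localhost", user="root", passwd="", db="wikidb")
cur = conn.cursor()

# CHECK TABLE reports problems such as a broken .frm file
cur.execute("CHECK TABLE `user`")
for row in cur.fetchall():
    print(row)

# A count of zero would match the "user table seems to be empty" symptom
cur.execute("SELECT COUNT(*) FROM `user`")
print("rows in user table: %d" % cur.fetchone()[0])

cur.close()
conn.close()
If CHECK TABLE itself complains about the table, the definition is damaged and redownloading the data is probably the only fix.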
2010-03-15, 15:56 | #7
Are you on a 64-bit system? You may need to install libc6-i686 (Debian) or glibc.i686 (Fedora).
2010-03-18, 18:09 | #8
2010-08-25, 14:30 | #9
Version 3 of Evopedia, the offline Wikipedia reader for (not only) Maemo, has now been available in Extras-testing for quite a while. Unfortunately there is no recent dump for the English edition of Wikipedia. Evopedia uses a compressed database of pre-rendered article pages (called a dump) to deliver articles even faster than reading them online. The drawback of this strategy is the amount of time needed to pre-render every single article when creating such a dump, which is why there is no current English dump yet.
To remedy this situation, dump at home was created: a system for distributed rendering of Wikipedia pages. If you have a Linux computer with some spare CPU cycles and would like a recent English (or any other edition, but English has priority at the moment) Wikipedia dump, please consider joining the project. More information is available on the dump at home project site: http://dumpathome.evopedia.info/contribute
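Conceptually, each contributing machine just loops over small work units: ask the coordinator for a batch of article titles, render them to static HTML, and upload the result. A minimal sketch of that cycle is below; the URLs, field names and renderer are placeholders for illustration, not the actual dump at home protocol:
Code:
# Illustrative fetch-render-upload loop for a distributed rendering client.
# All endpoints and field names below are made up for the example.
import json
import time
import urllib2

SERVER = "http://dumpathome.example.org"  # placeholder coordinator URL

def render_article(title):
    # Stand-in for the real MediaWiki rendering step
    return "<html><body><h1>%s</h1></body></html>" % title

while True:
    # Ask the coordinator for the next batch of article titles
    job = json.load(urllib2.urlopen(SERVER + "/next-job"))
    if not job["titles"]:
        time.sleep(60)  # nothing to do right now, back off and retry
        continue

    pages = dict((title, render_article(title)) for title in job["titles"])

    # Send the rendered pages back so they can be merged into the dump
    request = urllib2.Request(SERVER + "/submit/%s" % job["id"],
                              data=json.dumps(pages),
                              headers={"Content-Type": "application/json"})
    urllib2.urlopen(request)
The real client naturally does much more (it renders against a full local MediaWiki install fed by the database dump), but this fetch-render-upload shape is what lets many machines share the work.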
Note that the platform is still somewhat in beta, so please forgive me if there are still some bugs (and please report them).
Thanks for your time and interest in the project.