pelago
Posts: 2,121 | Thanked: 1,540 times | Joined on Mar 2008 @ Oxford, UK
#61
Originally Posted by crei
Perhaps the "simple English" version would be the way to go. I will make a dump for that.
It will certainly be smaller, but mainly because it has many fewer articles (59,435 versus 3,239,806 in the normal English Wikipedia). No N900 article in the Simple English version!
 
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#62
Hi!

There are now dumps for evopedia in Arabic and Russian. Unfortunately, the search function does not work well with them. This is fixed in version 0.3.0, which is currently in extras-devel. So please use evopedia 0.3.0 (NOT 0.3.0 RC 3, which is the current version in extras) for languages not based on the Latin alphabet.
 

The Following User Says Thank You to crei For This Useful Post:
Posts: 75 | Thanked: 78 times | Joined on Jan 2010 @ Germany
#63
Does the new dump process (dumpathome) include formulas?

If so, I will start a new dump of the German Wikipedia, because formulas are rather important for me…
 
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#64
Originally Posted by hcm
Does the new dump process (dumpathome) include formulas?

If so, I will start a new dump of the German Wikipedia, because formulas are rather important for me…
The newly created dumps include formulas; the January 2010 dump does not. I will recreate the German dump and perhaps also make a new English dump, as I think the quality of the dumps has improved over the last few weeks.

At the moment, I'm working on the software itself again. We will have incremental regular expression search in the titles soon.
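For the curious, incremental search of this kind is usually implemented by re-filtering the previous result set as the query grows, so each keystroke only scans the titles that already matched. A minimal sketch in Python (hypothetical, not evopedia's actual code; `incremental_search` and the sample titles are made up for illustration):

```python
import re

def incremental_search(titles, query, previous_matches=None):
    """Filter article titles by a regular expression.

    When the user extends the query, we re-scan only the previous
    matches instead of the full title list, which keeps each
    keystroke cheap.
    """
    candidates = previous_matches if previous_matches is not None else titles
    pattern = re.compile(query, re.IGNORECASE)
    return [t for t in candidates if pattern.search(t)]

# Typing "n9", then "n90", then "n900" narrows the same result set:
titles = ["Nokia N900", "Nokia N810", "Maemo", "N900 (disambiguation)"]
step1 = incremental_search(titles, "n9")
step2 = incremental_search(titles, "n90", step1)
step3 = incremental_search(titles, "n900", step2)
```

Note that re-filtering previous matches is only safe when extending the query cannot match titles the shorter query missed (true for a plain substring typed left to right); an arbitrary regex change requires rescanning the whole title list.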
 

Posts: 75 | Thanked: 78 times | Joined on Jan 2010 @ Germany
#65
OK, thank you! I am currently running the dumpathome script, but my only task is to wait.
Are you changing the dumpathome script itself? Tell me if I have to redownload it to help with dumping…
 
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#66
Originally Posted by hcm
OK, thank you! I am currently running the dumpathome script, but my only task is to wait.
Are you changing the dumpathome script itself? Tell me if I have to redownload it to help with dumping…
I am currently importing and compressing the German database; the rendering process itself should start in a few minutes (if nothing goes wrong). The script downloads everything on its own. Since the script sleeps for a few minutes if it does not get a job, you can restart it as soon as the state on http://dumpathome.evopedia.info/ changes from importing to rendering.
Thanks for your help!
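The restart trick works because of how such a poll-and-sleep worker loop is typically structured; a hedged sketch (hypothetical `fetch_job`/`render` callables, not the real dumpathome script):

```python
import time

def worker_loop(fetch_job, render, sleep_seconds=300, max_idle_polls=None):
    """Poll-and-sleep worker loop (illustrative sketch only).

    fetch_job() returns a job or None; render(job) processes it.
    When no job is available, the worker sleeps instead of hammering
    the server -- which is why restarting the script by hand the
    moment the server switches from 'importing' to 'rendering' gets
    you a job sooner than waiting out the sleep.
    """
    idle_polls = 0
    while True:
        job = fetch_job()
        if job is None:
            idle_polls += 1
            if max_idle_polls is not None and idle_polls >= max_idle_polls:
                return  # give up (used here so the sketch terminates)
            time.sleep(sleep_seconds)
        else:
            idle_polls = 0
            render(job)
```

Restarting the script resets the sleep timer, so the next poll happens immediately instead of minutes later.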
 

Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#67
The new German dump is finished. I hope it will work, as I am not able to test it at the moment. Thanks to the immense support from the community, the rendering process is now almost as fast as the average download speed the server can provide for each user. Perhaps I can raise the throttle a bit in the next few days.
Thanks for your help!
 
Posts: 69 | Thanked: 14 times | Joined on Dec 2009
#68
I read somewhere how much data was downloaded, either in total or just for the English dump, but I cannot find it now.

I have some hosting space for a couple of months which I might be able to make available. It has a 7 TB monthly allowance.

BitTorrent is painfully slow. I am trying to help: I am uploading at 1.2 Mbps.
 

The Following User Says Thank You to whats_up_skip For This Useful Post:
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#69
Thank you for your help, that is really great!

The contract for my server actually says that the traffic is unlimited, but in fact the provider has warned me that I am using too much bandwidth and should restrict it to 70 GB daily. Oh well...

The data was: 2 TB in one week for everything, which includes all end-user dump downloads but also the traffic caused by the distributed rendering system, which can be quite substantial, too.
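For scale, that averages out to roughly four times the requested daily cap; a quick back-of-the-envelope check (decimal units assumed):

```python
weekly_tb = 2                      # reported traffic: 2 TB per week
daily_gb = weekly_tb * 1000 / 7    # average daily traffic in GB
cap_gb = 70                        # provider's requested daily cap
ratio = daily_gb / cap_gb          # how far over the cap the average is
print(round(daily_gb), round(ratio, 1))  # prints: 286 4.1
```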

Last edited by crei; 2010-04-11 at 23:02.
 
Posts: 69 | Thanked: 14 times | Joined on Dec 2009
#70
Well, it sounds like I might be able to help, but as you say it will be interesting to see what happens if I actually start using that sort of bandwidth.

I haven't used BitTorrent before, but it seems to be very poor at using even the upload capacity I have made available. I suppose that is because I haven't been able to download much yet. In the last hour there has only been one ten-minute block when the upload averaged 600 Kbps; the rest of the time it is almost nothing except for the odd spike.
 