2010-04-20, 20:52 | Posts: 58 | Thanked: 42 times | Joined on Jan 2010 | #75
Hi all,
I still have a problem with evopedia and the German dump.
I installed evopedia 0.3.0, downloaded the German dump (http://evopedia.info/dumps/wikipedia...26.zip.torrent, 2,471,212,452 bytes, MD5 sum of the zip file: a330c072ab3f47935ee30791830439b2), and copied the unzipped dump into /home/user/MyDocs.
After starting evopedia I pointed it to this folder, but it only shows me the contents of the folder.
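A rough sketch of the check-and-unpack steps described above, in case anyone wants to reproduce them; the zip filename below is only a placeholder, since the link above is shortened, and the expected MD5 sum and target directory are the ones quoted in this post.
Code:
# Sketch only: verify the downloaded zip against the MD5 sum quoted above
# and unpack it into /home/user/MyDocs.
import hashlib
import zipfile

DUMP_ZIP = "german_dump.zip"              # placeholder name, not the real file
EXPECTED_MD5 = "a330c072ab3f47935ee30791830439b2"
TARGET_DIR = "/home/user/MyDocs"

md5 = hashlib.md5()
with open(DUMP_ZIP, "rb") as f:
    # read in 1 MiB chunks so the 2.4 GB file is not loaded at once
    for chunk in iter(lambda: f.read(1 << 20), b""):
        md5.update(chunk)

if md5.hexdigest() != EXPECTED_MD5:
    raise SystemExit("MD5 mismatch - the download is probably corrupted")

with zipfile.ZipFile(DUMP_ZIP) as z:
    z.extractall(TARGET_DIR)
print("Dump extracted into", TARGET_DIR)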
What is going wrong, or what am I doing wrong here?
Cheers
2010-04-24, 14:27 | Posts: 58 | Thanked: 42 times | Joined on Jan 2010 | #79
Sometimes, when I search for a certain term, the result page says "ERROR - Page not found". It looks like the term is indexed, but there is no actual article behind it.
For example, the term "Machine Learning" in the "wikipedia_en_2010-01-16" dump.
My other question is about the dumps themselves.
I found that there are original dumps from Wikipedia (http://dumps.wikimedia.org/) and that the evopedia dump is different from the original Wikipedia dump (and also different from the Aard dictionary format?). In that case, is there any way to build a dump from one of the other dumps (a kind of personal Wikipedia)?
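I cannot say how the evopedia dumps are built, but as a rough way to check whether a missing article is also absent upstream (or perhaps only exists as a redirect), one could stream through an original pages-articles XML dump from dumps.wikimedia.org and look for the title. A minimal Python sketch, assuming the dump has already been decompressed; the file path is a placeholder.
Code:
# Sketch only: scan a Wikipedia pages-articles XML dump for a given title.
# Real dumps are bz2-compressed and several GB, so this just shows the idea.
import xml.etree.ElementTree as ET

DUMP_XML = "pages-articles.xml"   # placeholder path to the decompressed dump
WANTED = "Machine Learning"

def find_title(path, wanted):
    title = None
    # iterparse streams the file instead of loading it all into memory
    for event, elem in ET.iterparse(path, events=("end",)):
        tag = elem.tag.rsplit("}", 1)[-1]  # strip the XML namespace prefix
        if tag == "title":
            title = elem.text
        elif tag == "page":
            if title == wanted:
                return True
            elem.clear()                   # keep memory bounded on huge dumps
            title = None
    return False

print(WANTED, "found:", find_title(DUMP_XML, WANTED))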
2010-05-09, 22:38 | Posts: 29 | Thanked: 10 times | Joined on Feb 2010 @ Germany | #80
wikipedia_en_2009-02-28.tar.bz2