Difference between revisions of "User talk:Kelson"

::I'm currently working on building a new Wikipedia dumper in Node.js. Could you maybe have a look? https://sourceforge.net/p/kiwix/other/ci/master/tree/mwhtmldumper/ [[User:Kelson|Kelson]] ([[User talk:Kelson|talk]]) 17:39, 1 July 2013 (CEST)
::I read it. So far, the code downloads content from Wikipedia and creates HTML files, right? Though I am new to Node.js, I think I could help if you can guide me on what to do and also provide some details and documentation on how we create the dump. --[[User:Akapribot|Ashutosh Kumar Singh]] ([[User talk:Akapribot|talk]]) 01:29, 2 July 2013 (CEST)
:::Yes, this script is a prototype to download articles from Wikipedia. The old way we were using no longer works, which is why I am working on this new script. The approach is to create a directory on the hard disk with all the pictures and HTML pages, and then transform this directory into a ZIM file (I have a script for that). One thing this script still needs is better retrieval of JS/CSS: if you look at it in detail, you will see that not all the necessary CSS/JS code is downloaded from Wikipedia. It would be great if you could fix that. 10:14, 2 July 2013 (CEST)
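:::A minimal sketch of what "retrieving the CSS/JS" could look like (my own illustration, not the actual mwhtmldumper code — the function name, regexes, and sample HTML are all assumptions): given a downloaded article's HTML, collect the stylesheet and script URLs so they can be fetched alongside the page before the directory is turned into a ZIM file.

```javascript
// Hypothetical helper, not from mwhtmldumper: scan an article's HTML
// and collect the URLs of linked stylesheets and scripts.
function extractAssets(html) {
  const css = [];
  const js = [];
  // <link ...> tags whose rel attribute mentions "stylesheet"
  for (const m of html.matchAll(/<link\b[^>]*>/gi)) {
    const tag = m[0];
    if (/rel=["'][^"']*stylesheet/i.test(tag)) {
      const href = tag.match(/href=["']([^"']+)["']/i);
      if (href) css.push(href[1]);
    }
  }
  // <script src="..."> tags
  for (const m of html.matchAll(/<script\b[^>]*\bsrc=["']([^"']+)["']/gi)) {
    js.push(m[1]);
  }
  return { css, js };
}

// Example on a stripped-down page skeleton (made-up URLs):
const sample =
  '<link rel="stylesheet" href="/w/load.php?modules=site.styles">' +
  '<script src="/w/load.php?modules=startup"></script>';
console.log(extractAssets(sample));
```

A real implementation would resolve the collected URLs against the article's base URL and download each one into the output directory; a proper HTML parser would also be more robust than regexes on complex pages.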
