The original was posted on /r/datahoarder by /u/Mhanz97 on 2024-03-18 17:00:31.
Hi everyone, like the title says: why the hell is it so hard to download a complete website for offline viewing?
I was trying to download a Fandom wiki (the entire wiki for a videogame). I tried a lot of tools and always ran into problems… Here's what I tried:
- Wget: had problems downloading images… a lot of them were never downloaded (see the first sketch after this list)…
- HTTrack: takes forever / super slow, doesn't download all the images either, and even with depth-level restrictions it keeps downloading useless outside-domain websites
- Offline Explorer: maybe the worst, since everything was messed up after the download + not all the images
- Cyotek WebCopy: same problems as Offline Explorer
- WikiTeam software (dumpgenerator.py): ultra messy, super hard to install, and it didn't work on my Windows machine (see the second sketch after this list)
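For reference, this is roughly the wget setup that usually matters for a wiki mirror, wrapped in a small Python launcher; a minimal sketch, not a guaranteed recipe. It assumes GNU wget is on the PATH, and that the wiki's images live on a separate static host (Fandom generally serves them from static.wikia.nocookie.net, but verify for the specific wiki; both hostnames below are placeholders). Not allowing that extra host is a common reason the images never come down.

```python
# Sketch: build and run a wget mirror command for a Fandom-style wiki.
# Assumptions: GNU wget is installed, and images are hosted on a separate
# static domain that must be explicitly allowed via --span-hosts/--domains.
import subprocess

WIKI_HOST = "yourgame.fandom.com"          # placeholder, the wiki to mirror
IMAGE_HOST = "static.wikia.nocookie.net"   # typical Fandom image host; verify

subprocess.run([
    "wget",
    "--mirror",               # recurse the site and keep timestamps
    "--convert-links",        # rewrite links so pages work offline
    "--page-requisites",      # also fetch the images/CSS/JS each page needs
    "--adjust-extension",     # save pages with an .html extension
    "--span-hosts",           # allow leaving the wiki's own host...
    f"--domains={WIKI_HOST},{IMAGE_HOST}",  # ...but only to these two hosts
    "--wait=1",               # be polite to the server
    f"https://{WIKI_HOST}/",
], check=True)
```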
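And for comparison, this is a minimal sketch of what dumpgenerator.py is essentially automating: walking the wiki's MediaWiki API and saving each page's wikitext. It assumes the wiki exposes api.php (Fandom wikis generally do, but the URL below is a placeholder) and it only grabs the current text of articles, not images or edit history.

```python
# Sketch: dump current article wikitext from a MediaWiki/Fandom wiki via api.php.
# Assumptions: the API URL below is a placeholder; only namespace-0 articles,
# current revisions, no images.
import json
import re
import time
import urllib.parse
import urllib.request

WIKI_API = "https://yourgame.fandom.com/api.php"  # placeholder, adjust to the target wiki

def api_get(params):
    """GET api.php with the given parameters and return the parsed JSON."""
    params = dict(params, format="json")
    url = WIKI_API + "?" + urllib.parse.urlencode(params)
    req = urllib.request.Request(url, headers={"User-Agent": "offline-wiki-sketch/0.1"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def all_page_titles():
    """Yield every article title, following the API's continuation tokens."""
    params = {"action": "query", "list": "allpages", "aplimit": "500"}
    while True:
        data = api_get(params)
        for page in data["query"]["allpages"]:
            yield page["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])

def page_wikitext(title):
    """Return the current wikitext of a single page."""
    data = api_get({"action": "parse", "page": title, "prop": "wikitext"})
    return data["parse"]["wikitext"]["*"]

if __name__ == "__main__":
    for title in all_page_titles():
        text = page_wikitext(title)
        # Replace characters that are invalid in Windows filenames.
        safe_name = re.sub(r'[\\/:*?"<>|]', "_", title) + ".wikitext"
        with open(safe_name, "w", encoding="utf-8") as f:
            f.write(text)
        time.sleep(0.5)  # be polite to the wiki's servers
```

The output is raw wikitext rather than rendered HTML, so it's more of a backup than an offline browsing copy, but it avoids the crawling problems above entirely.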
Basically the only thing that at least downloads all the text + images is Chrome's Ctrl+S (save page), but I need to manually fully load and save page by page… and when I read it in offline mode it's a bit messed up, but at least I have everything saved…