
easy download of a whole website for offline viewing?


@ademalsasa @luka These approaches sadly fail with a good number of sites, which then only give you some JavaScript that loads the content afterwards. Seems to be a common (and problematic) design pattern.

@wmd @luka that makes sense, webbie. However, most websites I downloaded were not too crazy with JavaScript.

@wmd @ademalsasa @luka I find that to be quite a pain. I understand why they want to do it: this way they only need one API call to access the data, but it ends up sending far more data and making far more round trips between the web browser and the server. I prefer making my pages contain the initial data, and only update via JavaScript.
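
A minimal sketch of that pattern, assuming a hypothetical /api/posts endpoint and placeholder data (none of this is from the thread): the server embeds the initial data as JSON in the page itself, and client-side JavaScript only touches the network for later updates.

<script id="initial-data" type="application/json">
  {"posts": [{"id": 1, "title": "Hello"}]}
</script>
<script>
  // Initial render reads the data already embedded in the page,
  // so no extra request is needed just to show content.
  const initial = JSON.parse(
    document.getElementById('initial-data').textContent
  );
  console.log('initial posts:', initial.posts);

  // Only subsequent updates go back over the network.
  async function refresh() {
    const res = await fetch('/api/posts?after=' + initial.posts.length);
    if (res.ok) console.log('updates:', await res.json());
  }
  setInterval(refresh, 60000);
</script>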

@luka Not sure if that's what you mean: I use Firefox 'save as single html' for archiving a lot. If I need to scrape multiple pages of a site then wget, but rarely.
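
For reference, a typical wget invocation for mirroring a site for offline viewing looks roughly like this (example.org is a placeholder; all flags are standard wget options, tune them per site):

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent --wait=1 https://example.org/

--convert-links rewrites links so the local copy can be browsed offline, and --page-requisites also pulls in the CSS and images each page needs.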

@luka I wish we had some kind of standard plugin or proxy able to make the web experience asynchronous.

Like, you could experience some "web location" even if you only have net access 1-2 hours a week, as long as you pin down beforehand which websites you will scavenge later on, and your every answer/contribution would just be sent later, like in git (even if merging is not super duper fun).

@luka I don't know about a whole website, but it would be nice if blogs and sites that feature articles offered a way to download just the text of an article, without also saving the whole webpage including style data and sidebars you don't care about.
