single-file-cli is a small Deno application that saves a web page into a single HTML file. It works quite well, and it ships as a single binary. It does depend on Chrome or Chromium being installed, but I was pleasantly surprised that I could download the prebuilt binary and just run it.
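For reference, usage is a single command, something like this (assuming you've renamed the downloaded release binary to `single-file` and it can find your Chrome install):

```
./single-file https://example.com/some-post some-post.html
```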
It is a bit hefty at 77 MB, and honestly it seems like there should be a simpler tool for this kind of stuff. It also takes a hot second to actually download the page, but it works!
Launching a whole headless browser to load the page is probably the slow part, and I don't see any way of getting around that. However, I think rewriting the tool in a language that compiles to a native binary could end up much smaller.
I need to set up a cron job that goes through my posts and saves any links it finds. The links would then get processed by a script that calls single-file-cli for each one. This way I get some archiving and don't have to deal with bitrot.
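Here's a rough sketch of what that script could look like as a Deno program. The `posts/` and `archive/` directories, the link regex, and the cron schedule are all placeholders of mine; it also assumes the `single-file` binary is on PATH, which single-file-cli doesn't prescribe:

```typescript
// archive-links.ts — a sketch of the cron-driven archiver.
// Example crontab entry (hypothetical schedule):
//   0 3 * * * deno run --allow-read --allow-write --allow-run archive-links.ts

// Naive link extraction; good enough for markdown posts.
const LINK_RE = /https?:\/\/[^\s)"'<>]+/g;

const links = new Set<string>();

// Collect every link from every post.
for await (const entry of Deno.readDir("./posts")) {
  if (!entry.isFile || !entry.name.endsWith(".md")) continue;
  const text = await Deno.readTextFile(`./posts/${entry.name}`);
  for (const url of text.match(LINK_RE) ?? []) links.add(url);
}

await Deno.mkdir("./archive", { recursive: true });

for (const url of links) {
  // Derive a filesystem-safe name from the URL.
  const name = url.replace(/[^a-zA-Z0-9]+/g, "_").slice(0, 100) + ".html";
  const out = `./archive/${name}`;

  // Skip links that were archived on a previous run.
  try {
    await Deno.stat(out);
    continue;
  } catch {
    // Not archived yet, fall through and fetch it.
  }

  const { code } = await new Deno.Command("single-file", {
    args: [url, out],
  }).output();
  if (code !== 0) console.error(`failed to archive ${url}`);
}
```

Skipping files that already exist keeps the cron run idempotent, so re-running it only fetches links from new posts.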
This is a bit trickier for sites I just like in general: I don't want to write a crawler, and I'm not datahoardery enough to self-host something like that.
I'm also of two minds about archiving things. Sometimes it's okay to forget. Not everything on the internet has to actually be forever. It may be nice, but I think there is value in losing things. Maybe.