===============================================================================
blog.notryan.com/005.txt                                       Mon, 27 Apr 2020
Ryan Jacobs                                                      02:15:00 -0700

Returning to LA
. . . . . . . .
===============================================================================

Tonight I flew into LA. It's weird coming back from a small town. Masks are
now mandatory here, apparently. In the airport there is an eerie, on-edge
vibe. You feel guilty for sitting around. You're nudged to keep moving, not
to stay put, to leave. Which makes sense, I guess.

On my flight, I read some Erlang material I scraped before boarding: "Learn
You Some Erlang..." by Fred Hebert. His whole online e-book (HTML) scraped
flawlessly and weighs about 6 MB. I really appreciated that. I used
`wget -mpkEnp https://learnyousomeerlang.com/content`, and it worked
perfectly. (There's a flag-by-flag breakdown at the bottom of this post.)

/ipfs/Qme7aKWACFpG2pp4CaBUmn77cSgn6MscUsZDKyRJGaFjrE/learnyousomeerlang.com

---

If you're mindful of Internet longevity, you will do a couple of things:

1. Make your content easily scrapeable. Keep pages light. Don't overuse
   CDNs. Keep references relative and local.

2. If you can manage it, embed your styles and your JS. (Even better, don't
   use JS?)

Single-file .html sites are great for archiving. You can ctrl+s the page, or
throw a curl or wget at it, without having to set flags that trawl through
the DOM in search of references.

---

Also, I've subscribed to Hebert's RSS feed. I have just "discovered" RSS, and
I think it's awesome. I'm going to attempt to create an RSS generator for my
blog. (There's a rough sketch of the idea at the bottom of this post.)

I have a couple of RSS feeds linked in my Thunderbird account, and honestly,
it's goddamn near perfect. News comes to me, but only when I want to hit the
refresh button. There is no endless scrolling through feeds of
hyper-optimized, always-novel content. It's my curated list... and slower?
I'm enjoying it.

I'm also subscribed to 4-5 different HNRSS.org feeds. One is based on
keywords for topics I'm interested in. Another is set to return only the
most popular posts (comments >= 250 || points >= 500). It keeps the noise
down at the expense of missing out on some gold. But at least I can skip
reading HN for a few days and still have the good ones show up in my reader.
If I have time, I will occasionally browse HN manually. But it's good to
know I won't miss the "big ones".

Newsboat is another great CLI client. I've been using it to read the Arch
Linux News feed before I run system updates. Sometimes there are gotchas
that the maintainers warn you about.

HNRSS is open-source. One time it went down. (Nothing wrong with that; it's
free, and someone is paying for a server to host it.) But... I'm trying to
think of an easy way for additional people to donate their server resources
to mirror it and prevent downtime. Mirroring works great for static content,
but for something dynamic like this... we might need some
consensus/verification algorithms. Might get tricky. But I will keep
thinking about it.

Signing off..
Ryan
~2:40-ish (I'm tired.)
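
---

For anyone wondering what that wget incantation actually does, flag by flag
(straight from the wget man page):

  -m    mirror: recursive download with timestamping
  -p    page requisites: also grab the images/CSS each page needs
  -k    convert links so the local copy browses offline
  -E    adjust extensions: save HTML pages with a .html suffix
  -np   no parent: don't ascend above the starting directory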
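
---

About that RSS generator: here's a rough sketch of what I have in mind, just
a little shell script that walks the NNN.txt posts and spits out an RSS 2.0
file. None of this is built yet; the paths, the filename-as-title, and using
mtimes as pubDates are all assumptions.

  #!/bin/sh
  # Rough sketch: generate rss.xml from a directory of NNN.txt posts.
  # Assumes posts live in the current directory and uses each file's
  # mtime as the pubDate (needs GNU date for -R/-r).

  SITE="https://blog.notryan.com"
  OUT="rss.xml"

  {
    printf '<?xml version="1.0" encoding="UTF-8"?>\n'
    printf '<rss version="2.0"><channel>\n'
    printf '<title>blog.notryan.com</title>\n'
    printf '<link>%s</link>\n' "$SITE"
    printf '<description>plain-text blog</description>\n'

    for f in [0-9][0-9][0-9].txt; do
      [ -e "$f" ] || continue
      printf '<item>\n'
      printf '<title>%s</title>\n' "$f"
      printf '<link>%s/%s</link>\n' "$SITE" "$f"
      printf '<guid>%s/%s</guid>\n' "$SITE" "$f"
      printf '<pubDate>%s</pubDate>\n' "$(date -R -r "$f")"
      printf '</item>\n'
    done

    printf '</channel></rss>\n'
  } > "$OUT"

Thunderbird or Newsboat should be able to point straight at the resulting
rss.xml once it's served alongside the posts.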