Engineering For Slow Internet Even When Not Stuck In Antarctica
I've had this thought for a while.
If humans ever go to other planets, it's going to be VERY hard to keep software up to date without some serious thought and relay stations. The speed of light is a hard restriction.
Lots of devices are designed only for "always on" connectivity. What happens when it's near impossible to "phone home"?
Local mirrors and caching proxies.
I've worked in an environment like this. We had a local server for Windows and Mac updates. Direct updates were blocked. It's a solved problem, you just need developers to participate.
This is what IPFS is for. Instead of linking to a location that might be far away off-planet, it links to the content, which could well be cached on-planet or on a closer relay station. Sure, one person has to pull it down from the incredibly far-away place, but once it's been pulled down at least once, everybody else fetches it from that person's more local copy. Timeouts will still need to be increased, though. Maybe not to some insanely stupid amount, but somewhat.
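The content-addressing idea is small enough to demo with nothing but a hash (a toy stand-in, not real IPFS; real IPFS uses CIDs and a DHT, and `sha256sum` just plays the role of the content address here):

```shell
# The "link" is the hash of the bytes, not a location, so any node
# that holds the bytes can serve the request.
printf 'planetary software update v1.2' > update.bin
CID=$(sha256sum update.bin | cut -d' ' -f1)

# A mirror that cached the same bytes yields the same address,
# so a fetch can be satisfied locally instead of off-planet:
cp update.bin mirror-copy.bin
MIRROR=$(sha256sum mirror-copy.bin | cut -d' ' -f1)
[ "$CID" = "$MIRROR" ] && echo "cache hit: $CID"
```

Because the address is derived from the bytes themselves, a cached copy is provably identical to the origin's copy, which is what makes the "one person pulls it, everyone else shares it" model safe.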
This is one of the reasons I preach against Electron apps and the "storage is cheap" argument. Storage can also be really expensive for people in third-world countries.
Instead of sneaker-net it will be rocket-net... and at a certain point you need an on-prem on-planet support team to just figure things out.
This very much bothers me as a web developer. I go hard on conditional GET request support and compression, as well as using HTTP/2+. I'm tired of using websites (outside of work) that need to load a fuckton of assets (even after I block 99% of advertising and tracking domains).
macOS and iOS actually allow updates to be cached locally on the network, and if I remember correctly Windows has some sort of peer-to-peer mechanism for updates too (I can’t remember if that works over the LAN though; I don’t use Windows).
The part I struggle with is caching HTTP. It used to be easy pre-HTTPS but now it’s practically impossible. I do think other types of apps do a poor job of caching things though too.
Yes, Windows peer to peer update downloads work over LAN. (In theory, I've never verified it.)
HTTP caching still works fine, if your proxy performs SSL termination and reencryption. In an enterprise environment that's fine, for individuals it's a non-starter. In this case, you'd want to have a local CDN mirror.
I couldn’t get SSL bumping working in Squid on Alpine Linux about a year ago, but I’m willing to give it another shot.
My home router is also a mini PC on Alpine Linux. I do transparent caching of plain HTTP (it’s minimal but it works) but with others using the router I do feel uneasy about SSL bumping, not to mention some apps (banks) are a lot more strict about it.
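For whoever retries this: the core of an SSL-bumping squid.conf is only a handful of directives. This is a sketch; the paths, the CA file, and the cache sizes are placeholders, and Squid must be built with OpenSSL support (the `tls-cert=` spelling is for newer Squid versions; older ones use `cert=`):

```
# Generate a local CA first, e.g.:
#   openssl req -new -newkey rsa:2048 -days 365 -nodes -x509 \
#     -keyout ca.pem -out ca.pem
http_port 3128 ssl-bump \
    tls-cert=/etc/squid/ca.pem \
    generate-host-certificates=on \
    dynamic_cert_mem_cache_size=4MB

sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/lib/squid/ssl_db -M 4MB

acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all
```

Every client then has to trust ca.pem, which is exactly why this flies in an enterprise and is a non-starter for other people's devices; it is also why certificate-pinning apps like banking clients break behind it.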
This low bandwidth led to highly aggravating scenarios, such as when a web app would time out on [Paul] while downloading a 20 MB JavaScript file, simply because things were going too slow.
Two major applications I've used that don't deal well with slow cell links:
I remember there are some timeout flags on curl that you can use in conjunction with git... but it's been nearly a decade since I've done anything of the sort. Modern-day GitHub is fast-ish... but yeah, bigger stuff still has some big git issues.
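If anyone's hunting for those flags again: git exposes curl's low-speed abort knobs as plain config (`http.lowSpeedLimit` / `http.lowSpeedTime`), so you can loosen them instead of patching anything. The values below are just illustrative:

```shell
# Only give up on a fetch if throughput stays below 1000 bytes/s
# for a full 600 seconds, instead of the transport's defaults.
git config --global http.lowSpeedLimit 1000
git config --global http.lowSpeedTime 600

# The equivalent raw curl flags (shown rather than run here,
# since the command would hit the network):
echo "curl --speed-limit 1000 --speed-time 600 -O https://example.com/big.tar.gz"
```

The same pair can be set per-repository (drop `--global`), which is handy when only one huge upstream misbehaves over a slow link.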
Good points! Didn't know about Lemmyverse.net!
As a PieFed user, you soon won't need to: PieFed instances will automatically subscribe to every community in newcommunities@lemmy.world, so the local communities finder will always have everything you need.
Coming in v1.2.
Every third-party site hanging around the fringes of Lemmy is a crutch for missing features in Lemmy, and an opportunity for PieFed to incorporate its functionality.
A bit of banging away later --- I haven't touched Linux traffic shaping in some years --- I've got a quick-and-dirty script to set a machine up to temporarily simulate a slow inbound interface for testing.
I'm going to see whether I can still reproduce that git failure for Cataclysm on git 2.47.2, which is what's in Debian trixie. As I recall, it got a fair bit of the way into the download before bailing out. Including the script here, since I think that the article makes a good point that there probably should be more slow-network testing, and maybe someone else wants to test something themselves on a slow network.
Probably be better to have something a little fancier to only slow traffic for one particular application --- maybe create a "slow Podman container" and match on traffic going to that? --- but this is good enough for a quick-and-dirty test.
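A sketch of the usual tc + ifb pattern for that kind of inbound shaping (interface name and rate are placeholders; the ifb redirect trick is needed because qdiscs only shape egress). It prints the commands by default and only applies them with DRY_RUN=0 as root:

```shell
#!/bin/sh
# Simulate a slow inbound link with tc + ifb. Prints the commands by
# default; run with DRY_RUN=0 as root to actually apply them.
IFACE="${1:-eth0}"
RATE="${2:-50kbit}"

run() {
    if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

run modprobe ifb numifbs=1
run ip link set ifb0 up

# Qdiscs only shape egress, so redirect ingress traffic to ifb0 first...
run tc qdisc add dev "$IFACE" handle ffff: ingress
run tc filter add dev "$IFACE" parent ffff: matchall \
    action mirred egress redirect dev ifb0

# ...then rate-limit it there with a token bucket filter.
run tc qdisc add dev ifb0 root tbf rate "$RATE" burst 32kbit latency 400ms

# Cleanup when done:
#   tc qdisc del dev "$IFACE" ingress; tc qdisc del dev ifb0 root
```

For per-application shaping, matching on the traffic of a dedicated container or network namespace instead of `matchall` would do the "slow Podman container" idea.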
Nice! Scientific data!
Also looks like it's still an issue with GH in countries with slower connections: https://github.com/orgs/community/discussions/135808. So yeah, never mind, it's still a huge issue even today.
Meshtastic LongFast is a blazing 1.09 kbps, and even ShortFast is ~10 kbps. Wi-Fi 802.11ah (HaLow) can do 4 MHz channels and 16 Mbps max.
Have they tried sending their packets using pigeons?
Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway.
-- Andrew S. Tanenbaum
heh! That's such a funny/cool protocol.
While this is indeed a noble cause, I wonder whether the internet being slow in Antarctica is real. A large number of receiving stations for polar satellites are located in Antarctica, and they send data to other continents through high-speed fiber lines which are also used for internet.
It is quite real. The satellite links are like 10 Mbps. Go far enough south and you can't even hit the satellite, because it's below the horizon. There aren't any high-speed polar satellites; companies don't send their satellites that far south because there are too few customers to justify the cost.
That's changing with Starlink, though, since those satellites are in polar orbits.
10 Mbps is like average Scotland internet unless you're in a major city.
My point is that Antarctica is well connected by fiber. Am I mistaken?