I was thinking about this, and I suspect having crappy equipment and a crappy Internet connection has given me a weird requirements-engineering advantage: it trained me to look for ways to do things offline or with reduced bandwidth and processing power.
I'm always thinking, okay... and now assume the Internet sometimes doesn't work or is glacially slow, or the power to the router occasionally shuts down, or your mobile data is out of range or you're using some really old, 2-thread netbook that your Uncle Robert gave you, or you've only got 1 GB of storage available, or you're looking at it on ePaper, or working offline on a long train ride... Will it still work? Will it work smoothly?
Because, you know, that's what most people's communications are actually like. I want to design things that work... for most people.
#SecondWorldProblems
nostr:note167z253pc3pyq7a4csmkhnggyv6t0u3vz8amfcdrcpy3njgckgnnsdmrqk4
First job I had was programming for a mobile device with 4 MB of RAM, of which 2 MB were reserved, and a 🐌-slow ARM CPU.
It's been ages, but I still catch myself coding for a device like that, even though the code now runs on something with a bajillion times the resources and it doesn't really matter whether it takes 10 or 100 ms.
Old habits die hard
I've always thought that a lot of software companies encourage inefficient software: people need a new computer to run it, the new computer comes with new software, and then people buy even more software on top. It keeps people buying (I believe that's especially Microsoft's business model). Meanwhile, since most programmers have nicer computers than their users, they get lazy and write inefficient code. The two together mean everything takes more resources and is less efficient than it should be. If programmers still tried to be efficient, I'd never have to upgrade my computer until it died. That would be wonderful.
this comes and goes, but at one time i bumped into source code that required more than 16 GB of memory to compile
it's notable that the language i prefer, Go, can build almost anything even with 512 MB of memory and a 7 GB HDD (i literally did this about 4 months ago on a VPS), but if you needed C++ or JavaScript or Rust to do that, sorry, nope
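for anyone who wants to reproduce that on a tiny VPS, here's a minimal sketch — not my exact commands, and the numbers are just illustrative guesses:

    # cap the toolchain's heap, collect garbage aggressively, build one package at a time
    GOGC=25 GOMEMLIMIT=400MiB go build -p 1 ./...

GOGC=25 makes the garbage collector run more often (lower peak memory, slower build), GOMEMLIMIT sets a soft heap cap (Go 1.19+, and it should apply to the compiler too, since the compiler is itself a Go program), and -p 1 disables parallel package builds so only one compiler process is resident at a time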
For most software companies, it's a matter of "Get it out the door now, so we can get paid." Spending the time and resources to optimize code doesn't impress people enough to be part of the sales pitch, so it's ignored unless it brings something to a halt.
When I saw software bloat increasing with hardware performance, I assumed there would be cycles of optimization, but it seems that libraries are used more often, not less, and many libraries are super-bloated.
Rushing products to market is the short-term win. Crafting your product well will attract its own attention.
The world of computing is basically finding ways to make the things that were possible 50 years ago increasingly difficult.
I totally felt that.
Everyone skips right over simple, efficient solutions to join the hype.