My company tried to jump onto the bandwagon in 2018 or so, but it fizzled out very quickly. Fortunately.
It’s the same crap as with blockchain.
People have no idea how sophisticated modern IT systems already are, and if you glue fancy words onto solved problems, people will cheer you for being super innovative.
Our legacy system always puts the label in the water and our clients rely on the faint cardboard flavor.
Microcontrollers aren’t “the whole board”; by that definition, an SoC wouldn’t have a CPU either.
MCUs require support components: clocks, power converters, level shifters, modems, etc. You’ll hardly wire a barrel plug and a servo directly to a DIP (though that would be pretty cool).
Those are still CPUs. Microcontrollers contain CPUs, and a CPU is the smallest unit that can actually run code in a meaningful way.
However, as far as I know, Linux needs an MMU, so you won’t see Ubuntu boot on an ESP32, even though it does have a CPU.
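For a sense of scale, here’s a minimal ESP-IDF-style blink sketch (GPIO_NUM_2 is a guess at the onboard LED pin, so that part is an assumption): the ESP32’s core runs this directly under FreeRTOS, with no MMU and no Linux anywhere in sight.

```c
// Minimal sketch, assuming the standard ESP-IDF toolchain.
// GPIO_NUM_2 is assumed to be the onboard LED; adjust for your board.
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"
#include "driver/gpio.h"

void app_main(void)
{
    gpio_set_direction(GPIO_NUM_2, GPIO_MODE_OUTPUT);
    int level = 0;
    while (1) {
        gpio_set_level(GPIO_NUM_2, level ^= 1); // toggle the LED
        vTaskDelay(pdMS_TO_TICKS(500));         // hand control back to the scheduler
    }
}
```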
Sure, it can be done, but no corporation in the world will do that, and the extremely large population of people who simply don’t care all that much about computers (and I don’t mean that as an insult) won’t do it either.
So effectively, a whole bunch of machines will get scrapped, or their users won’t get any updates. And knowing Microsoft’s history, they’ll probably scare people into buying a new PC via weekly pop-ups.
But they should be. Or at least comparable.
Think about the difference between Reddit and Lemmy. They both offer similar functionality, but Reddit will set your phone on fire if it gets the chance.
The same is true for YouTube. Browsing YouTube is just scrolling through an image gallery; only video playback should be demanding. Yet it consumes more resources than a well-equipped laptop had when YouTube launched. That’s insane.
We’re moving in a direction where computers get faster and faster, but for the last 10 years or so, the actual utility of the system as a whole has stagnated. Besides games, what can a modern computer do that a 2014 model couldn’t?
Yeah, you want to sniff Nix first before you mainline NixOS.
I wonder what will happen with all the compute once the AI bubble bursts.
It seems like gaming made GPU manufacturing scale enough for GPUs to be used as general compute; Bitcoin then pumped billions into that market, driving down prices (per FLOP); and AI reaped the benefit once crypto moved to ASICs and later crashed.
But what’s next? We’ve got more compute than we could reasonably use. The factories are already there, the knowledge and techniques exist.
Never heard of it; it makes total sense, but I’d guess 95% of developers just nuke the directory raw.
I feel like there’s a very fine balance in the effort required to publish a package.
Too easy and you get npm.
Too hard and you get an empty repo.
I feel like Java is actually doing a relatively good job here. Most packages are at least documented a bit, though obviously many are outdated.