Nonsense, software has always been crap, we just have more resources.
The only significant progress will be made with Rust and further formal-methods enhancements.
I don't trust some of the numbers in this article.
Microsoft Teams: 100% CPU usage on 32GB machines
I'm literally sitting here right now on a Teams call (I've already contributed what I needed to), looking at my CPU usage, which is staying in the 4.6% to 7.3% range.
Is that still too high? Probably. Have I seen it hit 100% CPU usage? Yes, rarely (but that's usually a sign of a deeper issue).
Maybe the author is going with the worst-case scenario. But in that case he should probably qualify the examples more.
I haven't measured it carefully, but CPU usage in Teams while just sitting in a call is low; using the camera with filters clearly uses more. Even just watching CPU temps gives you a rough sense of how hard a program is working the CPU. So this is presumably the worst-case scenario: camera on with filters on top.
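If someone wants a number more direct than temps, something like this is a quick way to sample it (a rough sketch using Python's psutil; the "teams" name match is just a guess, and the actual process name differs per platform and client version):

    # Sample the combined CPU usage of all processes whose name matches a hint.
    # Assumes `pip install psutil`; the "teams" filter is illustrative only.
    import time
    import psutil

    def matching_cpu_percent(name_hint="teams", interval=1.0):
        procs = [p for p in psutil.process_iter(["name"])
                 if name_hint in (p.info["name"] or "").lower()]
        for p in procs:
            p.cpu_percent(None)      # prime the per-process counters
        time.sleep(interval)         # measure over this window
        total = 0.0
        for p in procs:
            try:
                total += p.cpu_percent(None)
            except psutil.NoSuchProcess:
                pass                 # process exited during the window
        return total                 # can exceed 100% on multi-core machines

    print(f"Matching processes: {matching_cpu_percent():.1f}% CPU")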
My issue with Teams is that it uses a whole GB of RAM on my machine just by existing. It's like it loads the entire .NET runtime in the browser or something. IDK if it uses C# on the frontend.
IDK if it uses C# on the frontend.
Pretty sure it's a webview app, so probably all JavaScript.
Unless you’re running out of RAM, what’s the issue? Unused RAM is wasted RAM.
RAM usage today is insane, because there are two types of app on the desktop today: web browsers, and things pretending not to be web browsers.
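And part of why the numbers look so bad is that these browser-wrapped apps are never a single process; there's a main process plus a pile of renderer/helper processes, and you have to add them all up to see the real footprint. A rough sketch of doing that with Python's psutil (the "teams" filter is just an example, and RSS double-counts memory shared between processes, so treat it as an upper bound):

    # Sum resident memory (RSS) across all processes whose name matches a hint.
    # Assumes `pip install psutil`; RSS overestimates because shared pages are
    # counted once per process.
    import psutil

    def matching_rss_mb(name_hint="teams"):
        total = 0
        for p in psutil.process_iter(["name", "memory_info"]):
            mem = p.info["memory_info"]
            if mem and name_hint in (p.info["name"] or "").lower():
                total += mem.rss
        return total / (1024 * 1024)

    print(f"~{matching_rss_mb():.0f} MB resident across matching processes")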
That's been going on for a lot longer. We've replaced systems running on a single computer less powerful than my phone, which could nevertheless switch screens in the blink of an eye and update their information several times per second, with new systems running on several servers with all the latest gadgets that take ten seconds to switch screens and update their information once per second at best. Yeah, those layers of abstraction start adding up over the years.
I've not read the article, but if you actually look at old code, it's pretty awful too. I've found bugs in the Bash codebase that are much older than me. If you try using Windows 95 or something, you will cry and weep. Linux used to be so much more painful 20 years ago too; anyone remember the "plasma doesn't crash" proto-memes? So the whole "quality was better before" thing is absolute bullshit.
What is happening today is that more and more people can do stuff with computers, so naturally you get "chaos", as in a lot of software that does things, perhaps not in the best way possible, but does them nonetheless. You will still have the more professional developers doing their thing and building great, high-quality software, faster and better than ever before thanks to all the new tooling and optimizations.
Yes, the average or median quality is perhaps going down, but this is a bit like complaining about the invention of the printing press and how people are now printing low-quality, barely edited books for cheap. Yeah, there's going to be a lot of that, but it produces a lot of awesome stuff too!
Software quality collapse
That started happening years ago.
The developers of .net should be put on trial for crimes against humanity.
You mean .NET
.net is the name of a fairly high-quality web developer industry magazine from the early 2010s, now sadly out of print.