I think because such an undertaking would require a wide breadth of extremely specialized knowledge. It would require intense coordination of many experts working together over many years, all to design something that:

1. will already be behind the state of the art by the time it ships, and
2. will cost a fortune to actually manufacture.
Item 1 is OK for hobbyists, who might value open source over newness, but item 2 all but guarantees that only big corporations can actually get involved. They don't care about free and open source; they just want a computing platform that their engineers can develop a product for. As long as there's enough documentation to meet their goals, open source is irrelevant.
The power of modern computing comes partly from how it enables abstraction. You don't need to understand the physics of electrons flowing through a transistor to write a video game. Overall, the open source community has generally converged on the idea that abstracting away the really hard stuff is an acceptable tradeoff.
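To make that concrete, here's a toy Python sketch of what those layers look like; every class and method name below is invented purely for illustration, not taken from any real driver or engine API:

```python
# Toy sketch of abstraction layers (all names here are made up for
# illustration; this is not a real driver or engine API).

class Transistor:
    """The physics layer nobody above ever has to think about."""
    def switch(self, on: bool) -> bool:
        return on

class GpuDriver:
    """Knows about the hardware, exposes a pixel-level interface."""
    def __init__(self) -> None:
        self._gate = Transistor()

    def write_pixel(self, x: int, y: int, color: str) -> None:
        self._gate.switch(True)  # hand-waving billions of real switches
        print(f"pixel ({x}, {y}) -> {color}")

class GameEngine:
    """What the game developer actually programs against."""
    def __init__(self) -> None:
        self._driver = GpuDriver()

    def draw_sprite(self, x: int, y: int) -> None:
        self._driver.write_pixel(x, y, "red")

# The person writing the game only ever touches this top layer:
GameEngine().draw_sprite(10, 20)
```

The game only ever calls the top layer, and each layer only needs to know about the one directly beneath it.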
I actually disagree with point 1 to an extent. The startup work for such a machine would indeed require a lot of effort, but once that groundwork is in place, wouldn't that make it easier to maintain momentum and release a successor?
I guess it would depend on whether or not the project spawns a dedicated community that lasts for a long time. Without a wide pool of knowledgeable contributors, I think it would be hard for the original team to support the existing design while also developing the next iteration.
Not to bring it up as a whipping boy, but let's take the case of Wayland, which is "just" a software protocol. It was started back in 2008 and is still under active development. As more projects support it, more edge cases come up, which is why new features are added to the protocol all the time. In those 15 years, its developers have had to adjust to technologies that didn't exist back in 2008, like the widespread adoption of 4K HDR displays, or Vulkan. Now imagine that, but with every aspect of a computer. In 2008, DDR3 RAM was just a year old; today we're on DDR5, and you (probably) can't buy a new machine that takes DDR3. PCIe 2 was the latest shit in 2007. Now I see that PCIe 7 is planned for next year.
A global corporation can support old products while also developing new technologies because they have unfathomable labor and capital at their beck and call.
I think that free software can keep up with proprietary offerings because the barrier to entry is relatively low: you just need free time and a source control client. It would be a different story if your project's toolchain involved literal tools that cost millions of dollars.