Stagnation isn't always evil; it's just part of tech. Once a piece of tech solves the problem it set out to solve, it should stagnate. Adding more bells and whistles makes things better less often than it makes them bloated and more prone to breaking. On the flip side, software that hasn't changed much beyond bugfixes and security patches is the backbone of a loooot of our tech infrastructure. Edit: @SkyeStarfall@lemmy.blahaj.zone provides an excellent refutation, with counterexamples showing where the lack of new features is hurting X11, here (the direct link broke for me on lemmy.ml, hence the redirect)
I fail to see how the architectural difference fundamentally solves the issue of changes breaking compatibility. Now, instead of breaking compatibility with the server, you're "only" breaking compatibility with the compositor. But that's okay, because at least there are other compositors that fulfill this use case... oops, switching to that compositor broke three of your other apps, well, let's try another! ... and now my PC won't communicate with my GPU... well, we can always... and so on and so on.
I'm not saying that Wayland is bad or that X is better, but these are the two most common "cases against X/for Wayland" that I hear, and I just don't buy them. As much as I argued against it above, I love trying new and different software and eking every last bit of performance out of my 8-year-old PCs, so I can't wait to give Wayland a try and see if there's a noticeable difference... I just wish these two arguments would go away already.
The issue is that X was never a mature, feature-complete, stable project. It was always a hideous and bloated hodgepodge of disparate and barely working patches. The entire point of Wayland is to do exactly what you say tech should do: solve the particular problem (graphics server) well and cleanly, and limit itself to a definable set of features so it can actually reach that point of stability.
Looool. It was too stable, which means stagnation.
You mean a bloated protocol or a bloated implementation? Because kwin_wayland is pretty bloated.
Tying the graphics server to the audio server is very clean.
As I understand it, Wayland offloads a ton of stuff that was core to X11 (like input device handling) directly to the compositor. The end result is every compositor handling things differently. Compare something like i3 to Sway: Sway has to handle input, displays, keyboard layouts, etc. directly in its config. If I switch to Hyprland, I then have to learn Hyprland's configuration options for doing the same things. Meanwhile, switching from i3 to dwm requires only setting up the WM to behave how I want - no setting up keyboards, mice, etc. It just feels clunky to work with Wayland compositors, frankly.
Also, when something breaks in Wayland, the fix is almost always hard to find or incredibly obscure, because the fix isn't for Wayland itself; it's for the compositor. If your compositor isn't popular, then good luck!
Can someone debunk this, please? It feels like something is being overlooked here.
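To make that concrete, here's a minimal sketch of the kind of thing a Sway config has to carry that an i3 config never did (directive names are from sway-input(5)/sway-output(5); the output name eDP-1 is just a placeholder):

```
# Sway: input and output handling is the compositor's job now
input "type:keyboard" {
    xkb_layout us       # keyboard layout, set per compositor
}

input "type:touchpad" {
    tap enabled         # touchpad behavior, also per compositor
}

output eDP-1 scale 2    # display scaling, again compositor-specific
```

Under i3/X11 the equivalents live outside the window manager (setxkbmap, xinput, xrandr, or an xorg.conf snippet), which is why swapping X11 window managers doesn't force you to relearn any of this, while swapping Wayland compositors usually does.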
Not exactly. Imagine if Xorg were also a desktop environment with its own compositor and effects. That's what a Wayland compositor is.
For claim 1, the big issue is that new standards in display technology keep appearing. Two semi-recent examples are HDR and VRR, both of which X11 struggles with; implementing them in X11 has been described as painful by its own developers.
This is an excellent counterexample to claim 1, and I wish it were the top response to my comment. It not only rebuts the claim that "maintenance mode" isn't bad, it also provides specific examples of when it is bad.
Stagnation here specifically does mean that nobody is making bug fixes or security patches anymore. Xorg is abandoned, kaput, a former software project.
The new architecture allows developers to fix one thing without accidentally breaking 3 others.
Then the problem is that it's abandoned, not that it has stagnated (which can also be phrased as "stabilized," depending entirely on context and the speaker's/author's personal feelings about the project). Once again, I'm not saying that Xorg is good, but that particular critique needs to stop; its major flaw is that even the "maintainers" are sick of it and want it to die, not that it has ceased major development.
Even the article acknowledges this:
But it also falls into the "Bells and whistles" side of the critique immediately after:
and it even starts off explaining the problems with X by saying it's in "maintenance mode." I couldn't care less about new features; the Pareto principle implies 80% of users don't need new features, regardless of how much dopamine they get from seeing the marketing hype. "Maintenance mode" isn't a bad thing, it's a good thing. An abandoned project that most GUIs still rely on is a disaster waiting to happen.
That's an extremely bold and vague claim, with no actual examples. Do I take it on faith that changing code can break things in X? Yes, but having worked with code, I just assume that's what happens to all software. Do I believe that Wayland has found a way to do away with that problem of software architecture (as opposed to protocol architecture)? Not unless they've somehow found a way to compartmentalize every single module such that every aspect is fully isolated and yet has interfaces for every potential use case that could ever be dreamed up. Any devs in the comments want to pipe up and let me know how that endeavor has worked out for them in past projects?
The problem is not the code per se, but that we can't add anything anymore without somehow breaking the core protocol. The plain fact is that for decades we've been tacking things onto X11 that it was never designed to do, and we reached a breaking point a while ago.
Stuff like multi-DPI setups is impossible to implement in X11's single-framebuffer model; security on X11 is non-existent, but we can't retroactively fit any kind of permission model onto the protocol, because that breaks X11 applications that (rightfully) assumed they could get a pixmap of the root window. There's so much more; just take a look at https://www.youtube.com/watch?v=RIctzAQOe44
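To illustrate the root-window point: under core X11, any connected client can read back the entire screen with no permission check at all. A minimal sketch using plain Xlib (assumes libX11 headers are installed; compile with something like `gcc grab.c -lX11`):

```c
/* Sketch: any X11 client can capture the whole screen without asking. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);      /* connect to the X server */
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    Window root = DefaultRootWindow(dpy);   /* the root window spans the screen */
    XWindowAttributes attrs;
    XGetWindowAttributes(dpy, root, &attrs);

    /* Read back every pixel currently on screen - no prompt, no capability check. */
    XImage *shot = XGetImage(dpy, root, 0, 0, attrs.width, attrs.height,
                             AllPlanes, ZPixmap);
    if (shot) {
        printf("captured a %dx%d screenshot\n", shot->width, shot->height);
        XDestroyImage(shot);
    }

    XCloseDisplay(dpy);
    return 0;
}
```

That's essentially how X11 screenshot tools work, which is why a permission model can't be bolted on after the fact without breaking them; under Wayland, screen capture has to go through explicit compositor protocols or portals instead.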
By all means, feel free to start working on it!
All the people who developed Xorg for 20+ years decided that creating and working on Wayland was a better use of their time. But I'm sure you know better...
The problem isn't that Xorg is spaghetti code (it's pretty good for a large C project, imho). The problem is that the X11 protocol was designed to expose the capabilities of 1980s display hardware.
Wayland will become spaghetti too, unless you "compositor-hop" because one compositor isn't complete and you need to use another; I don't know if that would be a good idea.
Stop putting words in my mouth. I never mentioned spaghetti code, and I said nothing about being better or smarter than either the Xorg or Wayland devs.
You said that Xorg being abandoned is the problem. How should we interpret that, other than a criticism of the decision-making process of the devs?