this post was submitted on 12 May 2024
471 points (85.7% liked)

linuxmemes


Hint: :q!




Community rules

1. Follow the site-wide rules

2. Be civil
  • Understand the difference between a joke and an insult.
  • Do not harass or attack members of the community for any reason.
  • Leave remarks of "peasantry" to the PCMR community. If you dislike an OS/service/application, attack the thing you dislike, not the individuals who use it. Some people may not have a choice.
  • Bigotry will not be tolerated.
  • These rules are somewhat loosened when the subject is a public figure. Still, do not attack their person or incite harassment.

3. Post Linux-related content
  • Including Unix and BSD.
  • Non-Linux content is acceptable as long as it makes a reference to Linux. For example, the poorly made mockery of sudo in Windows.
  • No porn. Even if you watch it on a Linux machine.

4. No recent reposts
  • Everybody uses Arch btw, can't quit Vim, and wants to interject for a moment. You can stop now.

    Please report posts and comments that break these rules!


    Important: never execute code or follow advice that you don't understand or can't verify, especially here. The word of the day is credibility. This is a meme community -- even the most helpful comments might just be shitposts that can damage your system. Be aware, be smart, don't fork-bomb your computer.

    founded 2 years ago
    you are viewing a single comment's thread
areyouevenreal@lemm.ee 0 points 7 months ago

    Sigh. It's not just a fricking driver. It's an entire framebuffer you plug into a USB or Thunderbolt port. That's why they are more expensive, and why they even need a driver.

A 1080p monitor has a quarter of the pixels of a 4K monitor, and the bandwidth you need scales with the number of pixels you drive. Apple chooses to spend the bandwidth they have on a couple of 5K and 6K monitors instead of supporting, say, 8 or 10 1080p monitors. That's a design decision they probably thought made sense for the product they wanted to produce. Honestly, I agree with them for the most part. Most people don't run 8 monitors; very few even have 3, and those who do can just buy the higher-end model or get an adapter like you did. If you're the kind of person who uses 3 monitors, you probably also want the extra performance.
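For a rough sense of the numbers, here's a quick back-of-the-envelope sketch in Python. The 60 Hz refresh rate and 30-bit color are assumptions for illustration only; real links also carry blanking overhead or use DSC compression, so actual link rates differ.

```python
# Back-of-the-envelope pixel counts and raw (uncompressed) data rates.
# Assumptions, not from the thread: 60 Hz refresh, 30 bits per pixel (10-bit RGB),
# no blanking overhead, no DSC compression.

RESOLUTIONS = {
    "1080p":                (1920, 1080),
    "4K UHD":               (3840, 2160),
    "5K (Studio Display)":  (5120, 2880),
    "6K (Pro Display XDR)": (6016, 3384),
}

REFRESH_HZ = 60
BITS_PER_PIXEL = 30

def raw_gbps(width, height):
    """Uncompressed pixel data rate in Gbit/s at the assumed refresh and depth."""
    return width * height * REFRESH_HZ * BITS_PER_PIXEL / 1e9

base_pixels = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name:>22}: {pixels / 1e6:5.1f} MP "
          f"({pixels / base_pixels:4.1f}x 1080p), ~{raw_gbps(w, h):5.1f} Gbit/s raw")
```

By that rough count, a single 6K stream eats about as much raw bandwidth as ten 1080p streams, which is the trade-off being described above.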

dditty@lemm.ee 1 points 7 months ago (last edited 7 months ago)

Thank you for taking the time to reply, and for contributing more of your expertise to our conversation! I understand the different resolutions, that the docking station has its own chipset, and why the Plugable is more expensive than other docking stations as a result. I now have a more nuanced understanding of framebuffers and how DisplayLink interfaces with an OS like macOS.

Allow me to clarify the point I was trying to make (admittedly, I didn't express it well previously). Rather than focusing on the technical specs, I had intended to have a more general conversation about design decisions and Apple's philosophy. They know that consumers will want to hook up a base-tier MacBook Air to two external displays, and they intentionally chose not to build in an additional framebuffer in order to force users to spend more. I sincerely doubt there's any cost saving passed on to the customer from Apple not including that out of the box.

Apple's philosophy has always been that they know what's best for their users. If a 2020 M1 MacBook Air supports both the internal 2K display and a single external 6K display, that suggests to me it should have the horsepower to drive two external 1080p displays (that's just a feeling I have, not a known fact). And I'll acknowledge that Apple has addressed this limitation in the newer MBAs, which let you disable the built-in display and use two external displays.
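For what it's worth, here's the same back-of-the-envelope pixel math for that scenario, using only the panel sizes (the M1 Air's built-in 2560x1600 panel, a 6016x3384 6K display, and 1920x1080 panels); refresh rate and bit depth are ignored since they would cancel out in the comparison:

```python
# Rough pixel-budget comparison for the M1 MacBook Air scenario above.
internal = 2560 * 1600   # built-in panel of the M1 Air, ~4.1 MP
six_k    = 6016 * 3384   # one external 6K display, ~20.4 MP
fhd      = 1920 * 1080   # one 1080p display, ~2.1 MP

supported = internal + six_k     # internal plus one 6K: what macOS allows
requested = internal + 2 * fhd   # internal plus two 1080p: what gets refused

print(f"supported today: {supported / 1e6:.1f} MP")
print(f"two-1080p setup: {requested / 1e6:.1f} MP "
      f"({requested / supported:.0%} of the supported pixel count)")
```

So in raw pixels, the two-1080p setup asks for roughly a third of what the machine already drives, which is the "horsepower" intuition above.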

My broader point is that Apple "knows what's best" for their users: they want customers to buy an Apple display rather than just stick with the 1080p LCDs they already own, because those aren't Retina®. Which do you honestly think is the more common use case for a MacBook Air user: connecting to two monitors (home office, university classroom systems, the numerous board-room setups I've worked in, etc.), or connecting their $1200 MBA to a $1600-$2300+ Studio Display? For the latter, anyone with an iota of common sense would be using an MBP or similar, since they're likely a creative professional who wants the additional compute and graphics power for photo/video editing and the like.

I don't disagree with your explanation of the thought process behind why Apple may have made this hardware decision for MBAs, but it is effectively an arbitrary, non-cost-saving decision that will certainly impede customers who expect two displays to just work, since they can do that on their 10-year-old Toshiba Satellite or w/e.

    Thanks, and have a great day

areyouevenreal@lemm.ee 1 points 7 months ago (last edited 7 months ago)

It's not just about Retina displays. High-res and HDR aren't uncommon anymore; pretty much every new TV anybody would want to buy is 4K. And it has to support Apple's 5K display anyway, because that's one of their own products.

As we've discussed, two external displays are supported on the new MacBook base models. It was a bit of an oversight on the original, sure, but that's been fixed now.

Also, the same SoC is used in iPads, so it's not Mac-only. I can't imagine wanting three displays on an iPad.