this post was submitted on 28 Aug 2024
32 points (67.0% liked)

Technology

[–] corroded@lemmy.world 90 points 2 months ago (2 children)

This is kind of a shit article. Most of these are just old hardware that eventually had modern improvements, not "trends."

A "trend" is cold cathode black lights inside the case, not a silly naming scheme for CPU revisions.

[–] AlternateRoute@lemmy.ca 20 points 2 months ago (1 children)

Yeah, acrylic side cases were a trend, and maybe 3D monitors, but everything else in there was just specific technology that has been replaced by better technology.

[–] DarkThoughts@fedia.io 4 points 2 months ago (1 children)

The blower gpu fans were definitely a trend. I remember buying third party coolers and strapping 120mm fans onto them with zip ties.

[–] AlternateRoute@lemmy.ca 8 points 2 months ago (2 children)

Blower fans had a technical reason to exist that isn’t very relevant anymore.

It used to be about keeping the card profile low so the other PCI slots could stay usable. These days, though, everything including Wi-Fi comes pre-populated on the motherboard, so it's rare to put any additional PCIe cards in a modern personal system.

[–] piccolo@ani.social 5 points 2 months ago

They also keep hot air from the GPU from being dumped inside the case, blowing it straight out instead. But the better solution is just better case airflow, so their remaining use case is mostly in server racks.

[–] DarkThoughts@fedia.io 2 points 2 months ago

ATX boards already had most of the relevant slots below that, especially with SLI/CrossFire being a thing at the time. I know because I used WLAN and audio cards back then - that is, with the third-party cooler + fans, which blocked like 3 slots.

[–] yokonzo@lemmy.world 7 points 2 months ago (5 children)

IDK, I would say 3D monitors were a trend that died pretty hard.

[–] conciselyverbose@sh.itjust.works 9 points 2 months ago

A trend implies a level of popularity. There was none.

It's ultimately just failed (or "pre-successful") technology that wasn't able to do the job well enough at a sufficient price to develop a market.

[–] spechter@lemmy.ml 42 points 2 months ago (1 children)

How was IDE a hardware trend?

[–] dinckelman@lemmy.world 21 points 2 months ago

It's an XDA article, what did you expect?

None of these are trends. They're all hardware standards, and all but one of them are still very much here anyway.

[–] tal@lemmy.today 35 points 2 months ago (5 children)

"Molex connectors were almost universally hated for being flimsy and requiring a lot of effort to connect properly. They were fortunately replaced by SATA connectors."

I can understand the "lot of effort", but flimsy? Those things were built like a tank. SATA connectors certainly aren't more durable (not that that normally matters inside a case).

[–] lazynooblet@lazysoci.al 21 points 2 months ago* (last edited 2 months ago)

Yes, they were flimsy. When pushing them together, the crimped ends would get pushed out the back of the plastic connector casing, or they wouldn't align properly and would require either major force or fiddly realignment.

[–] extremeboredom@lemmy.world 13 points 2 months ago

I remember instances where the force required to disconnect the connector caused me to slip and rip a wire out.

[–] dgriffith@aussie.zone 8 points 2 months ago* (last edited 2 months ago)

They also came from a time when hard drives could draw several amps while in use and much more on spin-up. There was a good reason why SCSI drive arrays used to spin each disk up one-by-one.

Molex connectors are good for 10 amps or so, SATA connectors couldn't have handled that amount of current.
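The current figures here can be sanity-checked with quick math. A minimal sketch, assuming the commonly cited ratings of ~10 A per pin for the classic 4-pin Molex peripheral connector and 1.5 A per pin, 3 pins per voltage rail, for SATA power (both are assumed round numbers, not taken from a datasheet):

```python
# Back-of-the-envelope comparison of 12 V delivery capacity.
# Per-pin ratings are commonly cited values, assumed for illustration.
MOLEX_AMPS_PER_PIN = 10.0   # classic 4-pin peripheral connector
SATA_AMPS_PER_PIN = 1.5     # SATA power connector, per pin
SATA_PINS_PER_RAIL = 3      # each voltage rail uses 3 pins

molex_12v_watts = 12.0 * MOLEX_AMPS_PER_PIN
sata_12v_watts = 12.0 * SATA_AMPS_PER_PIN * SATA_PINS_PER_RAIL

print(f"Molex 12 V capacity: {molex_12v_watts:.0f} W")  # 120 W
print(f"SATA 12 V capacity:  {sata_12v_watts:.0f} W")   # 54 W
```

Under those assumptions the old connector has roughly double the 12 V headroom, which is why it shrugged off multi-amp spin-up surges.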

[–] TheGrandNagus@lemmy.world 4 points 2 months ago* (last edited 2 months ago)

I have seen so many flimsy Molex connectors. They were enormously flimsy, and SATA was far, far, far more robust. Are you thinking of the right connector?

[–] SomethingBurger@jlai.lu 29 points 2 months ago (2 children)

RGB. Please. Finding hardware that doesn't light up like a Christmas tree is harder than it should be. Even a simple power LED can light up an entire room.

[–] echodot@feddit.uk 11 points 2 months ago (2 children)

I don't really mind RGB, but my complaint is that every single LED has to be vivid electric blue. I want the old red LEDs back; they were nice, and they didn't scorch my retinas.

[–] JohnEdwa@sopuli.xyz 2 points 2 months ago* (last edited 2 months ago)

Agreed. My PC case came with a blue power light. After one night of watching the blinking illuminate my entire room, I ripped it out and swapped in a dim red one myself.

For a quick fix, you can make blue power LEDs slightly more tolerable by sticking a piece of yellow Post-it note over them; it turns the light white.

[–] flodabo@programming.dev 2 points 2 months ago (2 children)

Not anytime soon. Way too cheap to include (a few cents for a mouse or RAM, a few dollars for a keyboard), and way too popular not to. Well, at least you can disable it.

[–] isolatedscotch@discuss.tchncs.de 3 points 2 months ago (1 children)

right, you can disable them using their proprietary software, which you have to install for every component, signing away your life (cough cough Disney) in the process

[–] sorghum@sh.itjust.works 22 points 2 months ago (3 children)

I remember my first serious build: a blue acrylic case with as many black-light-reactive components as I could get.

[–] Badeendje@lemmy.world 8 points 2 months ago (1 children)

My case is an old Tower Server Case tucked away behind my monitors. Loads of space and no need for cable management.

[–] linearchaos@lemmy.world 9 points 2 months ago (1 children)

That bastard would slice you open and gut you like a pig at the first opportunity though.

[–] Badeendje@lemmy.world 4 points 2 months ago

I have sacrificed to the case god already

[–] nokturne213@sopuli.xyz 6 points 2 months ago

I remember the first full build I did. All of my fans had LEDs, and the case had LEDs. The first time I tried to play on it in the dark basement, the setup was blinding. I disconnected all of the case LEDs and replaced my fans with plain black ones.

[–] __init__@programming.dev 4 points 2 months ago

Oh man I went through this phase too. I had the clear acrylic case and a bunch of those UV CCFL tubes.

[–] cmnybo@discuss.tchncs.de 19 points 2 months ago (3 children)

The thing that I wish would go away is oversized graphics cards that take up 3 or more slots. There need to be more options for liquid cooling that don't require modifying the card.

[–] borari@lemmy.dbzer0.com 4 points 2 months ago (2 children)

I think I’m misunderstanding your comment. Once you liquid cool the card, it’s no longer an oversized behemoth. My reference 4080S is only taking up a single slot.

[–] cmnybo@discuss.tchncs.de 5 points 2 months ago (1 children)

Most graphics cards have massive air coolers that block other PCIe slots. I want more water cooled options since they are low profile. I just don't want to have to void the warranty on a brand new card to install a water block.

[–] borari@lemmy.dbzer0.com 3 points 2 months ago (1 children)

I know for sure that installing a water block does not void the warranty on reference Nvidia cards. I've read that Asus (and EVGA, RIP) are the same. Not sure about MSI, and I have read that Gigabyte will try to void the warranty.

[–] MonkderVierte@lemmy.ml 2 points 2 months ago (1 children)

The PCB is still big in those cases.

[–] borari@lemmy.dbzer0.com 4 points 2 months ago

Sure, but the PCB with water block only takes up a single PCIe slot, and is shortened enough to fit in pretty much any case. Is my water cooled 4080S longer than my water cooled RX 480? Yes. Substantially longer? No. Thicker? Also no, basically same thickness.

[–] catloaf@lemm.ee 3 points 2 months ago

That would require cooler mount standards. I don't think AMD or Nvidia currently have a standard.

[–] tal@lemmy.today 3 points 2 months ago (1 children)

I am thinking that maybe more liquid cooling will happen with the whole AI thing on the datacenter side. That has a lot of parallel compute cards generating a lot of heat. Easier to move it with liquid than air.

Some other liquid-cooling annoyances:

  • Cases don't really have a standard-size mounting spot for the radiators.

  • I want to use one radiator for all of the things that require cooling. Like, I'd rather have an AIO device that provides multiple cold plates.

[–] conciselyverbose@sh.itjust.works 2 points 2 months ago* (last edited 2 months ago)

I really doubt liquid is easier for a data center. They have airflow solved pretty well and noise doesn't really matter. Liquid failing could potentially do way more damage, and might require shutting down whole areas for repair/damage prevention in the case of a single leak.

If they did do liquid at scale, it wouldn't be done in a way it would work down to consumers. It would be like custom boards with full coverage blocks for the whole system that tied into whole room water chillers or something.

[–] MonkderVierte@lemmy.ml 11 points 2 months ago (1 children)

The worst is still around: GPUs requiring more and more power. I wish there were more focus on efficiency. It won't be long until water cooling is mandatory just to get all the heat away.

[–] JohnEdwa@sopuli.xyz 9 points 2 months ago* (last edited 2 months ago)

They are. GTX 590 from 2011 has a TDP of 375W. RTX 4080 has 320W, while offering over ten times better performance. 4060 outperforms the 1060, 2060 and 3060 while having a lower TDP than any of them.

If you want low TDP, the RX 6400 is twice as powerful as the 590 while having a TDP of 53W.

It's the very top of the line stuff like 4090 that push the limit by achieving that very last 10% performance bump at the cost of using double the power, and that's kinda like complaining a Bugatti Veyron gets terrible highway MPG figures.
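The perf-per-watt point above can be made concrete. A rough sketch using the TDPs quoted in the comment, and treating the "over ten times better performance" claim as an assumed index of 10 (illustrative numbers, not benchmark results):

```python
# TDPs as quoted in the comment above; the 10x performance index is
# the commenter's claim, used here purely for illustration.
gtx590_tdp, gtx590_perf = 375, 1.0
rtx4080_tdp, rtx4080_perf = 320, 10.0

# Performance per watt, normalized to the GTX 590.
efficiency_gain = (rtx4080_perf / rtx4080_tdp) / (gtx590_perf / gtx590_tdp)
print(f"RTX 4080 perf-per-watt vs GTX 590: {efficiency_gain:.1f}x")  # ~11.7x
```

Even with those assumed numbers, efficiency per watt improves by an order of magnitude between the two generations.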

[–] hellothere@sh.itjust.works 8 points 2 months ago (1 children)

The lack of PhysX cards is upsetting.

[–] Wispy2891@lemmy.world 2 points 2 months ago

Unfortunately those cards came and went so fast that the LLM that wrote this "article" didn't have enough data on them

[–] outrageousmatter@lemmy.world 6 points 2 months ago* (last edited 2 months ago)

The capacitor plague era. Ever wonder why we don't see a lot of PCs from the early 2000s? This is why: anything with a bad cap would eventually fail and kill the board, essentially forcing you to call on the OEM to fix it.

[–] Fluffy_Ruffs@lemmy.world 5 points 2 months ago

Intel's slot CPU interface. Sure, it cleaned up motherboard layouts, but the need for more comprehensive cooling solutions that soon followed made this a bad direction to go in.

[–] lnxtx@feddit.nl 4 points 2 months ago (1 children)

Did bottom-PSU ATX cases disappear? Floor dust suckers.
