You don't need it though. Power negotiation is a decision between the load and the supply devices; the cable is an unnecessary third party. The cable should just be a multicore connection between two things, not a third device.
If I had to go out on a limb, though, I'd say it's because manufacturers were selling cheap cables that didn't meet the specification, and people were using them with higher-power devices, causing overheating. By including a chip in the spec for the cable, you push some of the responsibility back towards the cable manufacturer, and they can limit the maximum current to whatever they've designed the cable for.

In that sense we already do have different cables for different power levels - if your cable isn't rated for 100W, it can force a lower power even if your device and charger can both do 100W. But it would be better if cable manufacturers just met the basic design specification to begin with, rather than creating unnecessary overhead.
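A rough sketch of that idea, in C, just to make the logic concrete. The names, numbers, and single-step "negotiation" are made up for illustration; the real USB PD exchange (source capabilities, sink requests, e-marker reads over the CC line) is more involved:

```c
/* Illustrative only: models the idea that the negotiated current is
 * capped by whatever the cable declares it can carry. Names and values
 * are invented; this is not the actual USB PD message flow. */
#include <stdio.h>

struct port  { int max_current_ma; };    /* charger or device capability */
struct cable { int rated_current_ma; };  /* what the cable claims (e-marker) */

static int negotiate_ma(struct port src, struct port sink, struct cable c)
{
    int i = src.max_current_ma;
    if (sink.max_current_ma < i) i = sink.max_current_ma;
    if (c.rated_current_ma  < i) i = c.rated_current_ma;  /* cable caps the result */
    return i;
}

int main(void)
{
    struct port  charger = { 5000 };  /* 5 A-capable charger */
    struct port  laptop  = { 5000 };  /* 5 A-capable load */
    struct cable cheap   = { 3000 };  /* cable only rated for 3 A */

    printf("negotiated: %d mA\n", negotiate_ma(charger, laptop, cheap));
    return 0;
}
```

The point of the chip, in this view, is just to make the cable a participant whose rating the two ends can't exceed by accident.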
The cable has to carry the negotiated power safely. It's not unnecessary, it's absolutely critical - I've personally seen and diagnosed what happens when this fails.
For low-power applications there's no need for the chip, and the spec allows for that.
It wouldn't be critical if the cables were suitably rated for the specification. If you put a 0.5A cable in a 3A circuit, you're gonna have a bad time. If you use a 3A or better cable, then you don't need a cable chip to tell the actual devices to only work at 0.5A.
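To put rough numbers on the "bad time": conductor heating scales with the square of the current, so an under-built cable pushed to 3A dissipates far more heat than it was designed for. The resistance figures below are assumptions picked for the example, not measurements of real cables:

```c
/* Illustrative I^2 * R heating estimate. The resistance values are
 * assumed for the sake of the example, not measured from real cables. */
#include <stdio.h>

int main(void)
{
    double thin_cable_ohms  = 0.50;  /* assumed round-trip resistance of a flimsy cable */
    double rated_cable_ohms = 0.10;  /* assumed resistance of a properly rated cable */
    double current_a        = 3.0;   /* current a higher-power device might draw */

    printf("thin cable:  %.2f W dissipated\n", current_a * current_a * thin_cable_ohms);
    printf("rated cable: %.2f W dissipated\n", current_a * current_a * rated_cable_ohms);
    return 0;
}
```

With these assumed values the thin cable ends up dumping several times the heat of the rated one, which is exactly the overheating scenario the spec is trying to prevent.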
How do you have the cable correctly identify itself if you don’t put some smarts in it? Or are you saying we should only be able to buy expensive cables fully rated for 100W (or higher as the spec has been updated) — and how do you prevent an older cable rated for 100W from being abused in a newer 200W circuit?
Divider resistors are okay, but the IC is a better choice for future-proofing and reliability.
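A sketch of the two approaches, to show why the IC wins on future-proofing. The voltage thresholds and the "e-marker" fields are invented for illustration; real USB-C uses specific pull-up/pull-down values on the CC pin for analog advertisement and structured messages for e-marked cables, with different numbers and layouts:

```c
/* Two ways a cable could identify its rating. Thresholds and fields below
 * are made up; they are not the actual USB-C values. */
#include <stdio.h>

/* Analog: bin a divider voltage into a coarse rating. Cheap, but only a few
 * distinguishable levels, and it drifts with tolerance and temperature. */
static int rating_from_divider_mv(int cc_mv)
{
    if (cc_mv > 1200) return 5000;  /* assumed threshold for a 5 A cable */
    if (cc_mv > 600)  return 3000;  /* assumed threshold for a 3 A cable */
    return 500;                     /* anything else: default 0.5 A */
}

/* Digital: an e-marker-style ID carries an exact rating plus room for fields
 * a future spec revision might add (higher voltages, active cables, ...). */
struct emarker {
    int rated_current_ma;
    int rated_voltage_mv;
    int spec_revision;
};

int main(void)
{
    struct emarker id = { 5000, 48000, 3 };  /* made-up example values */

    printf("divider says:  %d mA\n", rating_from_divider_mv(1350));
    printf("e-marker says: %d mA at %d mV (rev %d)\n",
           id.rated_current_ma, id.rated_voltage_mv, id.spec_revision);
    return 0;
}
```

A divider can only ever encode a handful of levels, while the digital ID can grow with the spec, which is the future-proofing argument in a nutshell.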