It wouldn't be critical if the cables were suitably rated for the specification. If you put a 0.5A cable in a 3A circuit, you're gonna have a bad time. If you use a 3A or better cable, then you don't need a cable chip to tell the actual devices to only work at 0.5A.
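To put numbers on it: the only safe decision a source can make is to cap itself at the weakest link in the path. A toy sketch of that logic; the function name and milliamp values are mine, not from any spec:

```c
#include <stdio.h>

/* Toy model of the source-side decision. Values are illustrative. */
#define SOURCE_MAX_MA 3000  /* what this source can deliver        */
#define DEFAULT_MA     500  /* safe fallback for an unmarked cable */

/* Never offer more current than the weakest link in the path allows. */
static int offered_current_ma(int source_max_ma, int cable_max_ma)
{
    return (cable_max_ma < source_max_ma) ? cable_max_ma : source_max_ma;
}

int main(void)
{
    /* A 0.5A cable caps a 3A source at 0.5A... */
    printf("0.5A cable: %d mA\n", offered_current_ma(SOURCE_MAX_MA, 500));
    /* ...while a 3A-or-better cable needs no derating at all. */
    printf("3A cable:   %d mA\n", offered_current_ma(SOURCE_MAX_MA, 3000));
    return 0;
}
```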
How do you have the cable correctly identify itself if you don't put some smarts in it? Or are you saying we should only be able to buy expensive cables fully rated for 100W (or higher, now that the spec has been updated)? And how do you prevent an older cable rated for 100W from being abused in a newer 240W circuit?
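This is roughly what "smarts in the cable" buys you: the cable answers a discovery request with a descriptor the source decodes before offering power. A hedged sketch; the field offset and two-bit encoding are assumptions loosely modeled on USB PD cable descriptors, not quoted from them:

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative decode of a cable's self-reported current rating.
 * The bit layout here is an assumption for this sketch, loosely
 * modeled on USB PD cable descriptors rather than quoted from them. */
typedef enum {
    CABLE_CURRENT_UNKNOWN,  /* no marker or unrecognized code */
    CABLE_CURRENT_3A,
    CABLE_CURRENT_5A
} cable_current_t;

static cable_current_t decode_cable_current(uint32_t cable_descriptor)
{
    switch ((cable_descriptor >> 5) & 0x3) {  /* assumed 2-bit rating field */
    case 1:  return CABLE_CURRENT_3A;
    case 2:  return CABLE_CURRENT_5A;
    default: return CABLE_CURRENT_UNKNOWN;    /* fall back to a safe default */
    }
}

int main(void)
{
    /* An older cable can only report codes that existed when it was
     * built, so a newer, higher-power source never offers it more
     * than it claimed to handle. */
    uint32_t old_cable = 1u << 5;  /* reports 3A */
    printf("old cable decodes to: %d\n", decode_cable_current(old_cable));
    return 0;
}
```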
Divider resistors are okay, but an IC is a better choice for future-proofing and reliability.
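For comparison, the resistor approach looks something like this: the host reads a divider formed by its own pull-up and a rating resistor in the cable, then maps voltage bands to ratings. All component values and the code-to-rating mapping are made up for illustration:

```c
#include <stdio.h>

/* Hedged sketch of resistor-based cable ID: the host puts a known
 * pull-up between a 3.3V rail and an ID pin, the cable ties that pin
 * to ground through a rating-specific resistor, and the host reads
 * the resulting divider voltage. Values are illustrative. */
#define VCC      3.3      /* host-side rail, volts   */
#define R_PULLUP 10000.0  /* host-side pull-up, ohms */

static double divider_voltage(double r_id_ohms)
{
    return VCC * r_id_ohms / (R_PULLUP + r_id_ohms);
}

int main(void)
{
    /* Hypothetical coding: 2.2k marks a 0.5A cable, 22k a 3A cable. */
    printf("0.5A cable (2.2k): %.2f V\n", divider_voltage(2200.0));
    printf("3A cable   (22k):  %.2f V\n", divider_voltage(22000.0));
    return 0;
}
```

The limitation shows up right in the sketch: a resistor encodes one value forever, and measurement tolerance limits how many voltage bands you can reliably tell apart, whereas an IC can report a structured descriptor that later spec revisions can extend.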