this post was submitted on 20 Sep 2023
171 points (98.3% liked)

[–] Chobbes@lemmy.world 30 points 1 year ago (14 children)

That’s not really something that’s on the horizon at all. There’s some experimental quantum computing hardware, but it’s not really practical for anything yet (and certainly not in a personal computer!). It’s also likely not going to be better at the stuff we use normal CPUs for. Eventually quantum computers might be useful for certain classes of problems, but probably in more of a coprocessor-like capacity (a dedicated side unit, much like a GPU, that’s good at certain tasks). Obviously it’s unknown what the future holds, but I don’t think quantum computing is going to replace silicon any time soon.
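To make the coprocessor idea concrete, here is a minimal sketch (assuming Qiskit and its Aer simulator are installed, neither of which is mentioned above): the host CPU describes a small problem as a circuit, hands it off to a quantum backend (a local simulator standing in for a real QPU), and reads back ordinary classical results, much like offloading a kernel to a GPU.

```python
# Minimal sketch of the "quantum coprocessor" model (assumes qiskit and qiskit-aer are installed).
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# Host CPU side: describe the problem as a small circuit (a Bell state here).
qc = QuantumCircuit(2)
qc.h(0)           # put qubit 0 into superposition
qc.cx(0, 1)       # entangle qubit 0 with qubit 1
qc.measure_all()

# "Coprocessor" side: dispatch the circuit to a backend and read back classical counts.
backend = AerSimulator()  # a real QPU backend would slot in here instead of a simulator
job = backend.run(transpile(qc, backend), shots=1024)
print(job.result().get_counts())  # e.g. {'00': ~512, '11': ~512}
```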

[–] DokPsy@infosec.pub 2 points 1 year ago (10 children)

I think it'll take a new component/circuit design for quantum to be viable for home computing, similar to the transformation computers went through after the introduction of the transistor.

[–] herrvogel@lemmy.world 1 points 1 year ago (1 children)

I doubt quantum computing is ever gonna be viable for home computing. The benefits it offers over conventional computing are largely irrelevant to almost anything you might be doing at home, and better materials or manufacturing methods won't change that.

[–] DokPsy@infosec.pub 1 points 1 year ago

Depends on how we approach viability, imo

Can we currently see a reason for it, given its current abilities and functions? No.

But

We can look right at the history of conventional computing to predict a possible timeline for it:

- Single-purpose computational machines that took a lot of power, a lot of room, and were fairly rare. Used for military or research purposes.
- Multi-purpose machines that could run user-created calculations and were slightly smaller and more efficient. Begin to be used in more academic settings.
- Multi-purpose machines capable of being used to aid general office staff; continue to become more compact and efficient.
- Portability becomes possible for a select few with a need.
- And so on, until we arrive at now, where nearly everything and everyone has a computer.
