this post was submitted on 16 Jul 2023
4 points (100.0% liked)

Technology

The risk, explains Yuste, is that the same tools which – in medicine – can help improve people's lives can also end up violating the information stored in the brain. "Although the roadmap is beneficial, these technologies are neutral and can be used for better or worse," he notes. This isn't only about securing personal data, such as shopping habits, a home address, or which political party one supports – it also involves things as intimate as memories and thoughts. And, in the not-so-distant future, even the subconscious.

top 1 comment
[–] Peanutbjelly@sopuli.xyz 0 points 1 year ago* (last edited 1 year ago)

Think "shopping habits" already includes subconscious thoughts. Advertisers know when you will quit a brand before you do.

The title made me think this was a "sentient AI" argument, but I'm glad to see it's not. Human neurorights are exactly what I think we need to be thinking about.

We also need a fix for entrenched class divides in society. Why has the smallest fraction of the population hoarded almost all of the benefits from humanity's advancements over the past 50 years? It's unconscionable.

Not actually reading the article, though, because I can't easily get past the cookie confirmation.