I mean, having it not help people commit suicide would be a good starting point for AI safety.
It would only take another five seconds to find the same info on the web. Unless you also think we should censor the entire web and make it illegal to have any information about things that can hurt people, like knives, guns, stress, partners, cars....
People won't stop killing themselves just because a chatbot won't tell them the best way, unfortunately.
good. every additional hurdle between a suicidal person and the actual act saves lives.
this isn’t a slippery slope. we can land on a reasonable middle ground.
you don’t know that. maybe some will.
the general sense i get from your comment is that you're thinking in very black-and-white terms. the world doesn't operate on all-or-nothing rules. there is always a balance between safety and practicality.