this post was submitted on 14 May 2026
305 points (96.4% liked)
Technology
The "correct" way to use AI for coding (and anything, really) is to ask for explanations / tutorials when you can't find one online, then learn from that.
Never let it do something for you. That's how you lose. If you're not actively learning, you're actively rotting, and that goes for life in general too.
So using it as my emotional dumping machine is wrong?
I don't think that's a good idea. If you can't find an explanation online, that means there's not much info available, in which case the best thing would be to ask on a forum; that way other people looking for that info will find it too.
Not really, Google results have been just that bad for the last 10 years. I can spend 10 minutes looking for a piece of documentation and not find it, or I can prompt an internet-connected AI and have it spit out links to the relevant docs. It's gotten THAT bad.
Except the "explanation" will frequently be 100% "hallucinated" bullshit.
That's why I always ask it to cite sources. It's basically Google at this point, since Google is turning to shit and all the other search engines still aren't quite as good.
It could very easily use a completely different or hallucinated source.
But a lot of LLM products now provide source links right in the response. I've found them useful, and hopefully they aren't produced just by feeding the text back in and asking for a link.
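Whether the links come from a search tool or the model itself, the only real defense is pulling them out and opening them. A minimal sketch of that workflow, with a hypothetical helper (the function name and sample reply are mine, not from any particular LLM product):

```python
import re

def extract_source_urls(response_text):
    """Pull http(s) URLs out of an LLM response so the cited
    sources can be opened and checked by hand.
    Hypothetical helper, not part of any LLM product's API."""
    # Match URLs up to whitespace or common closing delimiters.
    urls = re.findall(r"https?://[^\s)\]>\"']+", response_text)
    seen, unique = set(), []
    for url in urls:
        url = url.rstrip(".,;")  # strip trailing sentence punctuation
        if url not in seen:
            seen.add(url)
            unique.append(url)
    return unique

# Example: a made-up model reply with an inline citation.
reply = (
    "std::vector grows geometrically "
    "(source: https://en.cppreference.com/w/cpp/container/vector). "
    "See also https://en.cppreference.com/w/cpp/container/vector."
)
print(extract_source_urls(reply))
# → ['https://en.cppreference.com/w/cpp/container/vector']
```

This only tells you which links to check, of course; a hallucinated citation can still point at a real page that doesn't say what the model claims, so the pages themselves still need a human read.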