this post was submitted on 05 Jun 2025
668 points (97.4% liked)

People Twitter


People tweeting stuff. We allow tweets from anyone.

RULES:

  1. Mark NSFW content.
  2. No doxxing people.
  3. Must be a pic of the tweet or similar. No direct links to the tweet.
  4. No bullying or international politics.
  5. Be excellent to each other.
  6. Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician.

[–] solsangraal@lemmy.zip 132 points 1 month ago (54 children)

It only takes a couple of made-up bullshit answers from ChatGPT to learn your lesson and just skip asking ChatGPT anything altogether.

[–] papalonian@lemmy.world 14 points 1 month ago (4 children)

I was using it to blow through an online math course I'd ultimately decided I didn't need but didn't want to drop. One step of a problem I had it solve involved finding a square root; it spat out a number that was kind of close, but functionally unusable. I told it three times that it had made a mistake, and it gave a different number each time. When I finally gave it the right answer and asked, "Are you running a calculation or just making up a number?", it said that if I logged in, it would use real-time calculations. I logged in on a different device and asked the same question; it again made up a number, but when I pointed that out, it corrected itself on the first try. Very janky.

[–] stratoscaster@lemmy.world 11 points 1 month ago (1 children)

ChatGPT doesn't actually do calculations. It can generate code that will calculate the answer, or provide a formula, but it can't do the math itself.
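
A minimal sketch of that difference, assuming Python and the square-root situation from the comment above (the specific numbers here are hypothetical, not from an actual ChatGPT reply): the model's "answer" is just plausible-looking predicted text, while running generated code gives the exact value.

```python
import math

# What the model effectively does: emit a plausible-looking number.
# (Illustrative only -- not an actual ChatGPT response.)
guessed = 52.7           # "kind of close, but functionally unusable"

# What generated code does: compute the real value deterministically.
value = 2781             # hypothetical input; the comment doesn't say which number it was
computed = math.sqrt(value)

print(f"guessed:  {guessed}")
print(f"computed: {computed:.6f}")   # ~52.735187
```

Which is roughly what the "run the calculation" style features do: hand the arithmetic off to an interpreter instead of the language model.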

[–] SaharaMaleikuhm@feddit.org 3 points 1 month ago

It's just like me fr fr
