this post was submitted on 31 Dec 2023
418 points (99.1% liked)

Not The Onion

12272 readers
1430 users here now


Michael Cohen, the former lawyer for Donald Trump, admitted to citing fake, AI-generated court cases in a legal document that wound up in front of a federal judge, as reported earlier by The New York Times. A filing unsealed on Friday says Cohen used Google’s Bard to perform research after mistaking it for “a super-charged search engine” rather than an AI chatbot.

I... don't even. I lack the words.

all 40 comments
[–] fine_sandy_bottom@lemmy.dbzer0.com 92 points 10 months ago (3 children)

I'm genuinely amazed at the calibre of people running the US. More so that apparently half the nation thinks it's the best choice.

[–] jonne@infosec.pub 23 points 10 months ago (1 children)

Michael Cohen was working for Trump precisely because he couldn't get a proper lawyer job elsewhere. Good lawyers will steer clear of a client that will ask them to commit crimes for them.

[–] fine_sandy_bottom@lemmy.dbzer0.com 5 points 10 months ago (1 children)

I think this is fairly reductive. I work in a related industry. Often it's the best lawyers working for the shadier clients, for obvious reasons.

[–] rambling_lunatic@sh.itjust.works 1 points 10 months ago

That being said, the best lawyers avoid clients who have a bad habit of not paying.

[–] Carighan@lemmy.world 23 points 10 months ago

Yeah, the mental acumen on display is truly terrifying. Just not in the way Cohen would like that sentence to be understood. 😅

[–] FlashMobOfOne@lemmy.world 0 points 10 months ago (1 children)

I'm not.

We've seen little other than the loss of economic and social liberty in the last 40 years.

99% of voters still choose the same two parties in charge of it like clockwork.

Instead of amazement, I feel cynical resignation.

[–] nilloc@discuss.tchncs.de 1 points 10 months ago (1 children)

The impending doom of the fascist right is the only thing keeping me voting for the Dems. If we had ranked-choice voting I'd be so much happier voting every election.

[–] FlashMobOfOne@lemmy.world 1 points 10 months ago (2 children)

That's the thing. I look around and have no reason to think fascism is impending. It's here.

Women are getting jailed for miscarriages, cops hung out lackadaisically on their phones outside a school shooting with zero consequences, homelessness jumped 12% in one year, and the big issue is sending hundreds of billions more off to other countries' wars.

The only plus is that things have gotten so bad it's forced unions to become more aggressive and unyielding, which has effected more positive change for workers than the ruling parties have achieved in decades.

[–] TheaoneAndOnly27@kbin.social 0 points 10 months ago (1 children)

I don't know if you're into podcasts, but Adam Conover's podcast Factually is really great. They have an episode called "What the Left Gets Wrong About the Right" that dives into how the right is primarily a reactionary movement against social and economic progress, trying to maintain power for the owning and ruling class. It's a really great episode, and it hits a lot of the same points you were mentioning about the unions. It's definitely worth checking out if you've got like an hour to kill. It's super dope.

[–] nilloc@discuss.tchncs.de 1 points 10 months ago

Conover’s rant about the Patagonia founder’s billion dollar donation to avoid taxes and set his kids up is really good too.

[–] pearsaltchocolatebar@discuss.online 0 points 10 months ago (1 children)

Women are getting jailed for miscarriage

Are you talking about the Ohio woman? Because she wasn't arrested for a miscarriage.

[–] Snoozemumrik@lemmy.world 1 points 10 months ago (1 children)

Sure, you can argue she got arrested for "abuse of a corpse", but

In September, when Watts went to the hospital in pain and passing large blood clots, doctors told her that despite some fetal cardiac activity, her roughly 22-week pregnancy was not viable. She was in and out of the hospital over the next three days, including a lengthy wait for a hospital ethics panel to determine whether her preterm pregnancy, which was on the borderline of Ohio’s abortion limit, could be induced without legal liability for the doctors. Watts eventually went home against medical advice and experienced the miscarriage on the toilet

[–] pearsaltchocolatebar@discuss.online 0 points 10 months ago (1 children)

You're leaving out the part where she tried to shove it down the toilet, then left it there for an extended period of time when that didn't work.

[–] LemmysMum@lemmy.world 1 points 10 months ago

Yes, people afraid of going to jail for a bodily function will behave in unpredictable ways.

[–] TheBat@lemmy.world 76 points 10 months ago

I... don't even. I lack the words.

Have you tried ChatGPT?

[–] CommanderCloon@lemmy.ml 67 points 10 months ago (1 children)

That's the second time a lawyer has made this mistake, though the previous case wasn't at such a high level

[–] huginn@feddit.it 54 points 10 months ago (1 children)

Not even close to the second time. It's happening constantly but is getting missed.

Too many people think LLMs are accurate.

[–] PriorityMotif@lemmy.world 5 points 10 months ago (2 children)

Problem is that LLM answers like this will find their way onto search engines like Google. Then it will be even more difficult to find real answers to questions.

[–] ghurab@lemmy.world 3 points 10 months ago (1 children)

Some LLMs are already generating answers based on other LLM-generated content. We've gone full circle.

I was using Phind to get some information about e-drum sensors (not the intended use case, but I was just messing around), and one of the sources was a very obvious AI-generated article from a content mill.

Skynet is going to be so inbred

[–] huginn@feddit.it 2 points 10 months ago

Model collapse is going to be a big deal and it doesn't take too much poisoned content to cause model collapse.
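A toy sketch of the mechanism (nothing from the article, just an illustration): fit a simple model to some data, sample fresh "training data" from the fit, refit, and repeat. Even with a well-behaved Gaussian, the spread of the data collapses over generations:

```python
import random
import statistics

def fit_and_resample(samples, n):
    """Fit a normal distribution to the samples, then draw n fresh samples from that fit."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(50)]  # generation 0: "real" data
original_spread = statistics.stdev(data)

# Each new generation is trained only on the previous generation's output.
for _ in range(2000):
    data = fit_and_resample(data, 50)

print(original_spread)          # roughly 1.0
print(statistics.stdev(data))   # far smaller: the distribution has collapsed
```

Real LLM collapse is messier than a Gaussian, but the feedback loop is the same: each generation only ever sees the previous generation's output, so the tails and diversity get clipped away.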

[–] huginn@feddit.it 1 points 10 months ago

Have found, not will find.

There are so many spam sites with LLM content.

[–] CareHare@sh.itjust.works 30 points 10 months ago

This is what you get when the political system favours lies over truth.

The more these people lie and get away with it, the more it becomes the culture. China-level Big Brother oppression is only a decade or so away if this keeps up.

[–] rsuri@lemmy.world 15 points 10 months ago* (last edited 10 months ago)

The problem is breathless AI news stories have made people misunderstand LLMs. The capabilities tend to get a lot of attention, but not so much for the limitations.

And one important limitation of LLMs: they're really bad at being exactly right, while being really good at looking right. So if you ask one to do an arithmetic problem you can't do in your head, it'll give you an answer that looks right. But if you check it with a calculator, you find the only thing right about the answer is how it sounds.

So if you use it to find cases, it's gonna be really good at finding cases that look exactly like what you need. The only problem is, they're not exactly what you need, because they're not real cases.
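To make the arithmetic point concrete with a made-up example (the "LLM answer" below is invented for illustration, not real model output), here's what checking with a calculator looks like:

```python
# Suppose an LLM is asked for 48,673 x 9,241 and confidently answers 449,822,593.
claimed_by_llm = 449_822_593   # plausible-sounding but wrong (invented for this sketch)
actual = 48_673 * 9_241        # what a calculator says

print(actual)                  # 449787193
print(claimed_by_llm == actual)  # False
```

The claimed answer has the right magnitude and even the right last digit, so it *sounds* right; that's exactly the trap.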

[–] JeeBaiChow@lemmy.world 12 points 10 months ago (3 children)

And this is the guy they want testifying about 45?

'Your honor, I object on the grounds that the prosecution witness is incompetent.'

[–] RaincoatsGeorge@lemmy.zip 14 points 10 months ago

Only so much you can do when you have to source your information from the dickheads that trump surrounded himself with.

[–] Tar_alcaran@sh.itjust.works 9 points 10 months ago

Because it's devastating to my case!

[–] Gamoc@lemmy.world 1 points 10 months ago

"then he is on equal footing with the defense, objection overruled"

[–] QubaXR@lemmy.world 11 points 10 months ago (1 children)

Stable Geniuses, the whole bunch of them.

[–] theterrasque@infosec.pub 2 points 10 months ago

That explains the horsing around

[–] gerowen@lemmy.world 11 points 10 months ago (1 children)

While the individuals have a responsibility to double check things, I think Google is a big part of this. They're rolling "AI" into their search engine, so people are being fed made up, inaccurate bullshit by a search engine that they've trusted for decades.

[–] Carighan@lemmy.world 10 points 10 months ago (1 children)

That's not what they're talking about here. Unless this is somehow different in the US, only Microsoft so far shows an LLM "answer" next to search results.

[–] gerowen@lemmy.world 6 points 10 months ago (2 children)

Google may not be showing an "AI" tagged answer, but they're using AI to automatically generate web pages with information collated from outside sources to keep you on Google instead of citing and directing you to the actual sources of the information they're using.

Here's an example. I'm on a laptop with a 1080p screen. I went to Google (which I basically never use, so it shouldn't be biased for or against me) and did a search for "best game of 2023". I got no actual results in the entire first screen. Instead, their AI or other machine learning algorithms collated information from other people and built a little chart for me right there on the search page and stuck some YouTube (also Google) links below that, so if you want to read an article you have to scroll down past all the Google generated fluff.

I performed the exact same search with DuckDuckGo, and here's what I got.

And that's not to mention all the "news" sites that have straight up fired their human writers and replaced them with AI whose sole job is to just generate word salads on the fly to keep people engaged and scrolling past ads, accuracy be damned.

[–] thisisnotgoingwell@programming.dev 5 points 10 months ago* (last edited 10 months ago)

I mean, I kind of see your point, but calling those results AI isn't accurate unless you call any kind of data collation/wrangling, or even basic programming logic, "AI". What Google is doing is counting how often a game is mentioned in pages in the gaming category and trying to spoon-feed you what it thinks you want. But that isn't AI. The point of the person you were replying to is that it wasn't as if Cohen intended to perform a Google search and was misled; you have to go to Google Bard or ChatGPT or whatever and prompt it, meaning it's on you if you're a professional who cites unverified word salad. The YouTube stuff is pretty obvious; it's part of their platform. None of that has anything to do with web searches.

[–] xx3rawr@sh.itjust.works 4 points 10 months ago* (last edited 10 months ago) (1 children)

It was kinda funny to me when everyone freaked out about misinformation and the "death of search", when I already see a lot of people who never leave Google and treat Instant Answers as the truth, like they do with ChatGPT, despite it being very inaccurate and out of context a lot of the time.

[–] LemmysMum@lemmy.world 0 points 10 months ago (1 children)

Never expect the bottom 80% of the bell curve to have self awareness. That's a bet you lose 9 times out of 10.

[–] fsmacolyte@lemmy.world 1 points 10 months ago

Funny how "self awareness" has two meanings here. It's the essence of what makes humans the smartest animals, but the problem you're referring to—lack of self reflection—is one of the most common problems amongst people today. Common sense ain't so common.

[–] dasgoat@lemmy.world 7 points 10 months ago

Well to be fair Michael Cohen is not a lawyer, so how could he have known?