this post was submitted on 10 Oct 2023
261 points (96.8% liked)

[–] flossdaily@lemmy.world 52 points 11 months ago (2 children)

I cloned my own voice to prank a friend, and... Wow, it was a gut-dropping moment when I understood just how dangerous this tool is for precisely this type of scam.

It's one thing to hear about it, but to actually experience it... Terrifying.

[–] qooqie@lemmy.world 1 points 11 months ago (1 children)

Mind sharing more info about the prank? Sounds like an interesting story

[–] flossdaily@lemmy.world 19 points 11 months ago

Oh, it was nothing more than just showing off the technology, really. It wasn't a committed bit.

I cloned my voice, then left a voicemail that said something like: "Hey buddy, it's me. My car broke down and I'm at... Actually, I don't know where I'm at. I walked to the gas station and borrowed this guy's phone. He said he'll give me a ride into town if I can get him $50. Could you Venmo it to him at @franks_diner? I'll get you back as soon as I can find my phone. ... By the way, this is really me, definitely not a bot pretending to be me."

[–] AnokLola@lemm.ee 1 points 11 months ago (1 children)
[–] flossdaily@lemmy.world 4 points 11 months ago

Check out ElevenLabs.

[–] Imgonnatrythis@sh.itjust.works 28 points 11 months ago (2 children)

Do you guys remember when the T-1000 did this?

[–] remotelove@lemmy.ca 21 points 11 months ago

What's wrong with Wolfie? I can hear him barking...

[–] Hamartiogonic@sopuli.xyz 3 points 11 months ago

In Terminator 1 the T-800 made a scam call to Sarah in order to find out where she was. It deepfaked the voice of Sarah's mother, and she fell for it.

[–] Heratiki@lemmy.ml 17 points 11 months ago (1 children)

Good luck criminals. I ignore nearly every call.

[–] Catfish@lemmygrad.ml 2 points 11 months ago (1 children)

Yeah, but they'll call your family. A friend of mine was recently affected by this: a scammer used a clone of her voice to ask for around $300 to fix her car after supposedly getting stranded in the middle of nowhere. So they call up your parents, and to your mom it's like "Oh no! My baby! Of course I'll help you!" and she gives them $300 thinking it's you.

[–] Heratiki@lemmy.ml 1 points 11 months ago

Yeah, my family knows better. I don't call anyone either, plus I've got all of my family on DEFCON 1 when it comes to asking for money. Had someone try to scam my mom via Facebook pretending to be my sister. I have family members contacting me ALL the time with issues with their stuff, so they don't trust anything at all.

This all stems from myself getting scammed nearly 20 years ago via email so I’ve educated everyone immensely.

[–] sramder@lemmy.world 11 points 11 months ago (3 children)

Anyone know how many hours of training data it takes to build up a convincing model of someone's voice? It was tens of hours when I did a bit of research a year ago… the article says social media is the likely source of training data for these scams, but that seems unlikely at this point.

[–] treefrog@lemm.ee 11 points 11 months ago (2 children)

I don't remember the exact number but I did see an article recently that said it was videos on social media like you surmised.

And it was a pretty minimal amount of data needed. Definitely not tens of hours. Less than one hour iirc.

[–] Rozz@lemmy.sdf.org 3 points 11 months ago (2 children)

Is it safe to assume that if you don't have any family that posts videos to Facebook/socials you are in a safer place?

[–] Sacreblew@lemmy.ca 2 points 11 months ago

Make sure to use a fake accent when talking to strangers on the phone

[–] treefrog@lemm.ee 0 points 11 months ago

I certainly am hoping so myself.

[–] sramder@lemmy.world 3 points 11 months ago

The technology has clearly come a long way in a short time, really fascinating.

I remember the first examples I read about being trained with celebrity read audiobooks because they needed so much audio data. I want to say Tom Hanks or Anthony Hopkins but I could have that confused with something else.

[–] Even_Adder@lemmy.dbzer0.com 3 points 11 months ago (1 children)

TorToiSe can work off of just three ten-second clips when you're using a pre-trained model. No telling if that'll sound any good.
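For reference, a minimal sketch of what that looks like with the tortoise-tts Python package (the clip paths and sample text here are made up, and the exact API may differ between versions):

```python
# Rough sketch: cloning a voice from a few short reference clips with tortoise-tts.
# Clip paths are placeholders; function signatures may vary by package version.
import torchaudio
from tortoise.api import TextToSpeech
from tortoise.utils.audio import load_audio

# Load three ~10-second clips of the target voice (tortoise expects 22.05 kHz input).
clips = [load_audio(p, 22050) for p in ("clip1.wav", "clip2.wav", "clip3.wav")]

tts = TextToSpeech()  # downloads the pre-trained models on first run
audio = tts.tts_with_preset(
    "Hey, it's me. My car broke down, can you send some money?",
    voice_samples=clips,
    preset="fast",  # trades quality for speed; "standard" is slower but better
)

# Output is generated at 24 kHz.
torchaudio.save("cloned.wav", audio.squeeze(0).cpu(), 24000)
```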

[–] sramder@lemmy.world 2 points 11 months ago

I’ll have to check that out, thanks for the link.

[–] Johanno@feddit.de 1 points 11 months ago (1 children)

The most advanced model I know of just needs half an hour of your voice or so.

[–] sramder@lemmy.world 4 points 11 months ago

Someone else mentioned that Microsoft has one capable of working with far less material.

But 30 minutes is definitely short enough to make this sort of scam/attack feasible in my mind.

[–] just_another_person@lemmy.world 5 points 11 months ago (2 children)

Whoever is stupid enough to think that Tom Hanks is calling them personally probably needs a court-appointed guardian.

[–] TheFriar@lemm.ee 51 points 11 months ago (2 children)

Did you read the article? It's talking about taking kids' voices from TikTok and shit. Social media. People have been posting videos of themselves talking for years. That's enough data to train an AI to leave a message saying, "Mom, I lost my phone and I'm in trouble. I need some money." Or something of that sort. It's been happening for a long time. This is only making it more convincing.

[–] PlantJam@lemmy.world 3 points 11 months ago

enough data

To be clear, about three seconds of your voice is "enough".

[–] just_another_person@lemmy.world 1 points 11 months ago (1 children)

The entire article is talking about scammers using AI models of voices you know. None of these scam rings have the time to narrow it down to your family.

[–] TheFriar@lemm.ee 1 points 11 months ago* (last edited 11 months ago)

You sure? It's very easy for these scammers to make a bot that trawls those "address/people lookup" sites, gets family names and numbers, then searches for those people's public social media and compiles that footage. It wouldn't be much work at all after creating the bot. Those creepy people-lookup sites list an absurd amount of information, which would make doing this very easy. And think of how much work already goes into scams that rely on sheer numbers and a basic ruse to boost the likelihood of working. Even if they could only trim that list of available phone numbers down to 30%, or 15%, they'd now have personal information and an in by imitating someone the target knows and loves. That's still a fuckload of people, and the likelihood of success would shoot WAY up while actually cutting down on the amount of work they'd need to do. So I'd argue you have that backwards.

[–] MargotRobbie@lemmy.world 8 points 11 months ago* (last edited 11 months ago)

Unless you actually know Tom Hanks personally and are expecting a call from him, of course.

[–] d4rknusw1ld@lemmy.world 1 points 11 months ago
[–] TheFriar@lemm.ee -2 points 11 months ago

The Industrial Revolution and its consequences have been a disaster for the human race.