this post was submitted on 03 Oct 2023
692 points (95.9% liked)


Robin Williams' daughter Zelda says AI recreations of her dad are 'personally disturbing': 'The worst bits of everything this industry is'

top 50 comments
[–] Blapoo@lemmy.ml 138 points 1 year ago (11 children)

Disturbing is an understatement. I'd call them repulsive. Relatives should be the only ones with this power, if at all.

Sure as shit not corporations. Fuck.

[–] whatwhatwhatwhat@lemmy.world 48 points 1 year ago (5 children)

Agreed, we desperately need regulations on who has the right to reproduce another person’s image/voice/likeness. I know that there will always be people on the internet who do it anyway, but international copyright laws still mostly work in spite of that, so I imagine that regulations on this type of AI would mostly work as well.

We’re really in the Wild West of machine learning right now. It’s beautiful and terrifying all at the same time.

[–] trachemys@lemmy.world 14 points 1 year ago (2 children)

It would be a shame to lose valuable things like There I Ruined It, which seems like a perfectly fair use of copyrighted works. Copyright is already too strong.

[–] TwilightVulpine@lemmy.world 20 points 1 year ago (26 children)

Copyright IS too strong, but paradoxically artists' rights are too weak. Everything is aimed at boosting the profits of media companies, not at protecting the people who actually make the works. Now those artists are under threat of being replaced by AI trained on their own works, no less. Is it really worth it to defend AI if we end up with fewer novel human works because of it?

[–] lloram239@feddit.de 8 points 1 year ago (1 children)

but international copyright laws still mostly work in spite of that, so I imagine that regulations on this type of AI would mostly work as well.

The thing is, people still don't grasp how easy this will be, and to a large degree already is. It doesn't need hours of training anymore: you can clone a voice from three seconds of audio and a face from a single image. Simple images can be clicked together in seconds with zero effort. Give it a few more years and video will be possible with equal ease.

You can regulate commercial use of somebody's likeness, which largely already is regulated, but people doing it for fun is unstoppable. This stuff is here today and it will get a whole lot more powerful going forward.

[–] vidarh@lemmy.world 2 points 1 year ago

Just a few years back, Vernor Vinge's sci-fi novels still seemed reasonably futuristic in how they handled the issue of fakes, including several bits where the resolution of imagery was a factor in being able to determine with sufficient certainty that you were talking to the right person. Now that notion already seems dated, and certainly not enough for a setting far into the future.

(At least they still don't seem as dated as Johnny Mnemonic's plot of erasing a chunk of your memories to transport an amount of data that would be easier and less painful to carry by stuffing a microSD card up your nose.)

[–] _number8_@lemmy.world 4 points 1 year ago (1 children)

Yeah, I don't think it should be legislated against, especially for private use [people will always work around it anyway], but using it for profit is really, viscerally wrong.

[–] TwilightVulpine@lemmy.world 6 points 1 year ago

You know I'm not generally a defender of intellectual property, but I don't think in this case "not legislating because people will work around it" is a good idea. Or ever, really. It's because people will try to work around laws to take advantage of people that laws need to be updated.

It's not just about celebrities, or even just about respect towards dead people. In this case, what if somebody takes the voice of a family member of yours to scam your family or harass them? This technology can lead to unprecedented forms of abuse.

In light of that, I can't even mourn losing the ability to make an AI Robin Williams talk to you for fun.

[–] banneryear1868@lemmy.world 4 points 1 year ago* (last edited 1 year ago)

IMO people doing it on their own for fun/expression is different from corporations doing it for profit, and there's no real way to stop the former. I think if famous AI constructs become part of big media productions, it will come with a constructed moral justification. The system will basically internalize and commodify the repulsion to itself exploiting the likeness of dead (or living) actors. This could be media that blurs the line and purports to ask "deep questions" about exploiting people, while exploiting people as a sort of intentional irony. Or it will be more of a moral appeal to sentimentality: "in honor of their legacy we are exploiting their image, some proceeds will support causes they cared about, we are doing this to spread awareness, the issues they represent are too important, they would have loved this project, we've worked closely with their estate." Eventually there's going to be a film like this, complete with teary-eyed behind-the-scenes interviews about how emotional it was to reproduce the likeness of the actor and what an honor it was, as soon as the moral justification can be made and the actor's image can be constructed just well enough. People will go see it so they can comment on what they thought about it and take part in the cultural moment.

[–] assassin_aragorn@lemmy.world 3 points 1 year ago (1 children)

We need something like the fair use doctrine coupled with identity rights.

If you want to use X's voice and likeness in something, you have to purchase that privilege from X or X's estate, and they can tell you to pay them massive fees or to fuck off.

Fair use would be exclusively for comedy, but would still face regulation. There are plenty of hilarious TikToks that use AI to make characters say stupid shit, but we can find a way to protect voice actors and creators without stifling creativity. Fair use would still require the person's permission; you just wouldn't need to pay to use it for such a minor thing -- a meme of Mickey Mouse saying fuck, for example.

At the end of the day though, people need to hold the exclusive and ultimate right to how their likeness and voice are used, and they need to be able to shut down anything they deem unacceptable. Too many people are more concerned with what's technically possible than with whether they're acting like an asshole. It's just common kindness to ask someone if you can use their voice for something, and to respect their wishes if they say no.

I don't know if this is a hot take or not, but I'll stand by it either way -- using AI to emulate someone without their permission is a fundamental violation of their rights and privacy. If OpenAI or whoever wants to claim that makes their product unusable, tough fucking luck. Every technology has faced regulations to maintain our rights, and if a company can't survive under them, it deserves to die.

[–] whatwhatwhatwhat@lemmy.world 2 points 1 year ago

This was very well stated, and I wholeheartedly agree.

[–] OprahsedCreature@lemmy.ml 52 points 1 year ago (1 children)

Capitalism literally Weekend at Berniesing the corpse of Robin Williams for profit.

This is fine

[–] dylanTheDeveloper@lemmy.world 10 points 1 year ago (1 children)

This gives me Michael Jackson hologram vibes

[–] DudemanJenkins@lemmy.world 8 points 1 year ago

At least that was just smoke screen trickery and not literal digital necromancy

[–] Case@unilem.org 38 points 1 year ago

Imagine losing your father in a tragic fashion, only for Hollywood execs to make a marketable facsimile of his appearance and voice. If they could store his corpse and make it dance like a marionette, they would.

Talk about retraumatizing the poor lady.

[–] AllonzeeLV@lemmy.world 35 points 1 year ago* (last edited 1 year ago) (2 children)

Hate it all you want. There's a buck to be made by our owners, so it will proceed.

Humanity at large is literally letting humanity's owner class destroy our species' only habitat, Earth, in the name of further growing their ego scores in the form of short term profit.

Who gives a shit about them stealing a dead celebrity's voice in the face of that? The hyper-rich stealing IP from the regular rich is wrong and should be illegal, but is clearly pretty far down the totem pole. Let's say we put all our effort into stopping them from doing that and win. We're still terraforming the planet to be less hospitable to human life, Zelda Williams included.

Priorities, can we have them? And no we can't "do both," because we have had no success stopping the owner class from doing anything that hurts others to further enrich themselves. I'm for putting all our effort into our species still being able to feed itself and having enough fresh water.

[–] daemoz@lemmy.world 2 points 1 year ago

Extremely anti-post-modern-organic bias you seem to have. If we don't fill space with plastic and heat it enough, then HOW exactly do you propose we encourage establishing an entire carbon-polyethylene-based evolutionary tree?? 🌳

[–] scarabic@lemmy.world 22 points 1 year ago (3 children)

This is such a random thought and I don’t mean to conclude anything by it:

I’ll bet people felt this way about the very first audio recordings.

How creepy to hear your sibling’s voice when that sibling is not even in the room!

…and moving pictures:

It looks like your mother is right there but she’s been dead for 10 years! Gah!

[–] Something_Complex@lemmy.world 21 points 1 year ago* (last edited 1 year ago) (1 children)

To be honest, it is a bit creepy precisely because it isn't coming from Robin Williams' personality.

Hearing a message your brother left you is one thing. Listening to him talk while someone else fakes his voice and says whatever they want is another.

That's the only difference: those video recordings were of your brother.

These deep-fakes are someone else speaking in your brother's voice. A corporation using your brother to sell products and services.

Nothing to do with him or his personality.

[–] Comment105@lemm.ee 11 points 1 year ago (3 children)

Yeah, there's a significant difference between recording and generating.

[–] eumesmo@lemmings.world 4 points 1 year ago (1 children)

It's not just a matter of discomfort with something new, but with something highly dangerous. Deepfakes have several bad and disturbing use cases, like identity theft, sexual exploitation, marketing abuse, political manipulation, etc. In fact, I find it hard to name a significant good use of such technology.

[–] anon_8675309@lemmy.world 22 points 1 year ago (1 children)

Almost 10 years without him. He was so great. This should not be his legacy.

[–] pete_the_cat@lemmy.world 6 points 1 year ago

If there is any celebrity that I hold dear, it's Robin Williams.

[–] Blizzard@lemmy.zip 11 points 1 year ago (2 children)

Another repost by the bot.

[–] WuTang@lemmy.ninja 9 points 1 year ago

You don't need to be the son or daughter of a celebrity to see the problem, just think about it for 5 freaking seconds.

[–] banneryear1868@lemmy.world 4 points 1 year ago* (last edited 1 year ago) (1 children)

Seeing Tupac's hologram perform to a cheering crowd was when it crossed the line into creepy for me. A lot of people seem turned off by this at least, and it's really exposing how these studios think of people. I think this could turn into a situation where the studios really push these personality constructs while many actors and the public are morally opposed to them. So the studios might have to appeal to a moral justification for when it's appropriate to use these AI constructs, like, "we really wanted to honor Robin with this project that we felt carried on his legacy, and a percentage of proceeds will go to a good foundation to help others who suffer like Robin did, so seeing Robin's personality construct perform for you is really a moral duty and helps make the world a better place." They'll also use AI anywhere it isn't noticeable to the viewer, for the cost savings and to avoid the negative reaction to it.

I think there will be studios producing fully AI-driven content, though. It'll be low-budget and corny, a diarrhea level of quantity and quality. Not unlike those campy dramatized skits on YouTube now where it's like, "homeless girl steals a rich man's heart, will make you cry." They'll be these ultra-niche AI-generated shorts that are a mix of advertisement and generic story arc. The AI spam is already pretty hilarious: "Elon has an invention that can make anyone a millionaire in 30 days." I think we're about to witness a deluge of content so shitty that no present-day comparison could describe it.

[–] BillMurray@lemmy.ca 3 points 1 year ago* (last edited 1 year ago)

Hold on, 50 cent had a hologram? Wouldn't it be easier and cheaper to just hire him, since he's still alive... when was this?

edit: see OP changed his comment from 50 cent to Tupac 🙄

[–] ShittyRedditWasBetter@lemmy.world 3 points 1 year ago (2 children)

Get used to it. Best case, stuff like this gets regulated for commercial use. Nobody is going to be able to regulate what individuals can do.

[–] _number8_@lemmy.world 2 points 1 year ago (3 children)

imaginary scenario:

you love Good Will Hunting, you're going through a tough time, and you use AI to have Robin Williams say something gentle and therapist-y that directly applies to you and your situation -- is this wrong?

[–] Naz@sh.itjust.works 10 points 1 year ago (1 children)

I've asked an extremely high-end AI questions on ethics of this nature, and after thinking for exactly 14.7 seconds it responded with:

• The ethics of generating images, sound, or other representations of real people is considered no different than active imagination when done for fun and in privacy.

• However, spreading those images to others without the original person's consent is considered a form of invasion of privacy and impersonation, and is therefore unethical.

Basically, you're fine with imagining Robin Williams talking to you, but if you record that and share it with others/disseminate the content, then it becomes unethical.

[–] TwilightVulpine@lemmy.world 2 points 1 year ago (1 children)

• The ethics of generating images, sound, or other representations of real people is considered no different than active imagination when done for fun and in privacy.

That doesn't sound right at all. Copying and processing somebody's works for the sake of creating a replica is completely different from imagining it yourself. Depending on how it's done, even the claim that it's being done solely for yourself is incorrect. Many AI-based services take feedback from what their users do, even if the users don't actively share it.

Just as looking at something, memorizing it, and imitating it is allowed while taking a picture of it may not be, AI would not necessarily get the right to engage with media the way people do. It's not an independent actor with personal rights. It's not an extension of the user. It's a tool.

Then again, I shouldn't be surprised that an AI used and trained by AI users describes its own use as basically a natural right.

[–] JackbyDev@programming.dev 3 points 1 year ago (1 children)

Please see the second point. Essentially you cannot commit copyright violation if you don't distribute anything. Same concept.

[–] TwilightVulpine@lemmy.world 2 points 1 year ago (1 children)

These AIs are not being produced by the rights owners so it seems unlikely that they are being built without unauthorized distribution.

[–] JackbyDev@programming.dev 2 points 1 year ago

I get your point, but I think for the purpose of the thought exercise it's better to assume you built the model yourself, to get at the crux of "I am interested in making an image of a dead celebrity say nice things to me" -- especially since whether building and sharing models of copyrighted content is ethical is a totally different question with its own can of worms.

[–] Empricorn@feddit.nl 4 points 1 year ago

I wouldn't apply morality, but I bet it isn't healthy. I would urge this theoretical person to consult with an actual licensed therapist.
