this post was submitted on 15 Jul 2024
64 points (93.2% liked)


Meta plans to scan for "skin vibrations" to combat deepfakes

Meta’s Creepy Skin Deep “Security” Idea

https://reclaimthenet.org/metas-creepy-skin-deep-security-idea

@technology@lemmy.world

#privacy #technology

top 13 comments
[–] jet@hackertalks.com 45 points 3 months ago (1 children)

This has been around for a while in research papers: getting people's pulse rate, and even blood pressure, from videos.

Other things you can get from videos: electrical interference in the footage can reveal which power grid somebody is using, and background noises can be mapped as well. So uploading a video deanonymizes you quite well, for a properly motivated investigator.

In the escalating war against deepfakes, however, it will just be part of the arms race, and the next generation of deepfakes will simply include those fluctuations.
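
To give a rough idea of how the pulse-from-video part works, here's a minimal remote-photoplethysmography (rPPG) sketch; the filename, the crude Haar-cascade face detection, and the FFT peak-picking are illustrative assumptions, not any particular paper's method:

```python
import cv2
import numpy as np

# Rough rPPG sketch: the average green value of the face region rises and
# falls very slightly with each heartbeat.
cap = cv2.VideoCapture("clip.mp4")  # hypothetical input file
fps = cap.get(cv2.CAP_PROP_FPS)
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

green_means = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    faces = face_cascade.detectMultiScale(
        cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 1.3, 5)
    if len(faces) == 0:
        continue
    x, y, w, h = faces[0]
    green_means.append(frame[y:y + h, x:x + w, 1].mean())  # green channel mean

# Pick the dominant frequency in a plausible heart-rate band (0.7-4 Hz).
signal = np.array(green_means) - np.mean(green_means)
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
spectrum = np.abs(np.fft.rfft(signal))
band = (freqs > 0.7) & (freqs < 4.0)
print(f"Estimated pulse: {freqs[band][np.argmax(spectrum[band])] * 60:.0f} bpm")
```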

[–] BrianTheeBiscuiteer@lemmy.world 15 points 3 months ago (2 children)

The only other way to combat deep fakes is something that people and companies constantly fuck up: cryptography.

[–] Etterra@lemmy.world 4 points 3 months ago

Or, alternatively, just showing up to do stuff in person. Of course that's not always feasible but still.

[–] sugar_in_your_tea@sh.itjust.works 3 points 3 months ago (1 children)

Yeah, it really shouldn't be hard to digitally sign a video along with a number of the frames. We've had the tech for decades.
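
As a rough illustration, the signing part is a few lines with an off-the-shelf library. This sketch uses Python's `cryptography` package with a throwaway Ed25519 key and a made-up filename; a real camera would keep its key in a secure element and embed the signature in the file's metadata:

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Stand-in for the camera's baked-in signing key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Hash the encoded video (or individual frames) and sign the digest.
digest = hashlib.sha256(open("clip.mp4", "rb").read()).digest()  # hypothetical file
signature = private_key.sign(digest)

# Anyone holding the vendor's public key can later confirm the file hasn't
# changed since capture; verify() raises InvalidSignature if it has.
public_key.verify(signature, digest)
print("signature verifies")
```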

[–] GamingChairModel@lemmy.world 2 points 3 months ago (1 children)

We're starting to see it in some cameras, mostly for still photography, but I don't see why the basic concept wouldn't extend to video files, too. Leica released a camera last year that signs the photo, including the timestamp and location data, and Canon, Nikon, Sony, Adobe, and Getty have various implementations of the technique.

Once the major photo software editing workflows support it, we'll probably see some kind of chain of custody authentication support from camera to publication.

Of course, that doesn't prevent fakes in the sense of staged productions, but the timestamp and location data would go a long way.

[–] Laser@feddit.org 4 points 3 months ago (2 children)

But then what? So you have a camera signing its files, and we pretend that extraction of the secret key is impossible (which it probably isn't). You load the file into your editing program, because the source files are usually processed further. You create a derivative of the signed file and there's no connection to the old signature anymore, so this would only make sense if you provide the original file for verification purposes, which most people won't do.

I guess it's better than nothing, but it will require more infrastructure to turn it into something usable, or it would only be used in important situations where manual checking isn't an issue, like a newspaper posting a picture but keeping the original to verify its authenticity.

[–] GamingChairModel@lemmy.world 1 points 3 months ago

so this would only make sense if you provide the original file for verification purposes

Yes, that's exactly what I'm imagining. You're keeping receipts for after-the-fact proof, in case it needs to be audited. If you have a newsworthy photograph, or evidence that needs to be presented to the court system, this could provide an important method of proving an untampered original.

Maybe a central trusted authority could verify the signatures and generate a thumbnail for verification (take the signed photo and run it through an established, open-source, destructive algorithm to punch out a 200x300 lossy compressed JPEG that at least confirms the approximate photo was taken at that time and place, but without enough resolution/bit depth to compete with the original author on postprocessing).
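
A minimal sketch of what that destructive thumbnail step might look like, assuming Pillow; the filenames and the JPEG quality are just illustrative assumptions:

```python
from PIL import Image

def make_verification_thumbnail(src_path: str, dst_path: str) -> None:
    """Punch out a small, heavily compressed copy of a signed original."""
    img = Image.open(src_path)
    img.thumbnail((200, 300))  # downscale in place, preserving aspect ratio
    img.convert("RGB").save(dst_path, "JPEG", quality=40)  # lossy on purpose

make_verification_thumbnail("signed_original.jpg", "public_thumb.jpg")
```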

[–] EngineerGaming@feddit.nl 1 points 3 months ago

Also, at least the last time I heard about these cameras, only specific proprietary editors (like Adobe's) were compatible, which introduces all sorts of other problems.

[–] c0smokram3r@midwest.social 26 points 3 months ago (2 children)

Let’s see Mark Zuckerberg’s skin vibrations 🐍

[–] ivanafterall@lemmy.world 7 points 3 months ago

I bet his skin vibrates so humanly.

[–] AnAmericanPotato@programming.dev 12 points 3 months ago* (last edited 3 months ago)

Honestly, I don't find this very creepy. This is information you are already putting out there for everyone to see. If I post a video of myself speaking, I am not concerned about people seeing how my skin vibrates in that video.

As video generation tools become more advanced, we will need better algorithms to validate videos. The bar for "fooling the vast majority of humans" is much, much lower than the bar for "being literally indistinguishable from a real video". The main problem I see is that it's going to be a cat-and-mouse game, and I don't think any method you publish will remain valid for very long in practice. The same method will be used to improve the next version of video generators.

Also, lots of real videos use post-processing that might wash out some of the details they are looking for. Video producers might re-record lines so they don't perfectly match the video to begin with. It's been a long time since I used a Samsung phone, but on my old S6, I remember that it always had a beauty filter applied to the selfie camera that made me look like a creepy porcelain doll. I could probably make a deepfake of myself that looks more "real" than those real videos and photos.

[–] DudeDudenson@lemmings.world 8 points 3 months ago

So we'll have deepfaked skin vibrations?