The White House wants to 'cryptographically verify' videos of Joe Biden so viewers don't mistake them for AI deepfakes::Biden's AI advisor Ben Buchanan said a method of clearly verifying White House releases is "in the works."

[–] abhibeckert@lemmy.world 4 points 9 months ago* (last edited 9 months ago) (1 children)

Click the padlock in your browser, and you'll see that this webpage (if you're using lemmy.world) was served over an encrypted connection using a certificate from Google Trust Services, which verified that the server is controlled by lemmy.world. In addition, your browser will remember that, and if it later gets a page from the same address vouched for by a different provider, the browser (should) flag that and warn you it might be an impostor.
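If you want to poke at what the padlock actually shows, here's a minimal sketch using only Python's standard library. It connects to lemmy.world (just the example from this comment), lets the TLS stack validate the certificate against the system's trusted roots, and prints who issued it:

```python
import socket
import ssl

host = "lemmy.world"
context = ssl.create_default_context()  # uses the system's trusted root CAs

with socket.create_connection((host, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()  # only available once validation succeeded

issuer = dict(item[0] for item in cert["issuer"])
subject = dict(item[0] for item in cert["subject"])
print("Issued to:", subject.get("commonName"))
print("Issued by:", issuer.get("organizationName"))  # e.g. "Google Trust Services"
```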

The idea is you'll be able to view metadata on an image and see that it comes from a source that has been verified by a third party such as Google Trust Services.

How it works, mathematically... well, look up "asymmetric cryptography and hashing". It gets pretty complicated and there are a few different mathematical approaches. Basically though, the White House will have a private key that they don't share with anyone, and only that key can produce a valid signature over the metadata. Anyone can then check that signature using the matching public key, which is published. Even Google Trust Services (or whatever cloud provider you use) doesn't have the private key.
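Here's a hedged sketch of that sign/verify flow, using Ed25519 from the third-party "cryptography" package (pip install cryptography). The key handling and the idea of signing a SHA-256 hash of the file are illustrative assumptions for this comment, not the White House's actual scheme:

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Done once by the publisher: generate a key pair, keep the private half secret.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()  # this half is published for everyone

# Signing: hash the image, then sign the hash with the private key.
image_bytes = b"...raw bytes of the photo or video..."
digest = hashlib.sha256(image_bytes).digest()
signature = private_key.sign(digest)   # only the private key can produce this

# The signature (plus which key signed it) is what would ride along as metadata.

# Verification: anyone holding the public key can check it.
try:
    public_key.verify(signature, hashlib.sha256(image_bytes).digest())
    print("Signature valid: this file was signed by the key holder.")
except InvalidSignature:
    print("Signature invalid: the file was altered or signed by someone else.")
```

If even one byte of the file changes, the hash changes and the old signature no longer verifies, which is why faking a "verified" image means getting at the private key rather than editing pixels.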

There's been a lot of effort to detect fake images, but that's never really going to work reliably. Proving an image is authentic, however... that can be done with pretty good reliability. An attack would be at home in Mission Impossible: break into a White House photographer's home at night, press their finger onto their laptop's fingerprint scanner without waking them, use the laptop to sign the fake photo, delete all traces of evidence, and GTFO. Oh, and everyone would know which photographer supposedly took the photo; ask them how they got that shot of Biden acting out of character, and the real photographer will immediately say they didn't take it.

[–] FrostKing@lemmy.world 1 points 9 months ago

Thanks a lot, that helped me understand. Seems like a good idea