‘We definitely messed up’: why did Google AI tool make offensive historical images?
(www.theguardian.com)
It's not just historical. I'm a white male and I prompted Gemini to create images for me of a middle-aged white man building a Lego set, etc. Only one image was of a white male; two of the others were an Indian man and a Black man. Why, when I asked for a white male? It was an image I wanted to share with my family. Why would Gemini go off the prompt? I did not ask for diversity, nor was it expected for that purpose, and I got no other image options I could consider, so it was a fail.
Could you elaborate on the use case you're describing? You were trying to make an image of a middle aged white man building Lego for your family?
Yes, but it does not really matter what the rest of the prompt detail was. The point is, it was supposed to be an image of me doing an activity. I'd clearly prompted for a white man, but it gave me two other images that were completely not that. Why was Gemini deviating from specific prompts like that? It seems like the identical issue to the case with the Nazis: introducing variations completely of its own.
Yeah yeah sure sure but why were you generating an image of a middle aged white man building Lego for your family? I'm baffled.
That is really just not relevant at all to the discussion here, but to satisfy your curiosity: I'm busy building a Lego model that a family member sent me, so the generated AI photo was supposed to depict someone who looked vaguely like me building such a Lego model. I used Bing in the past, and it has usually delivered four usable choices. The fact that Google gave me something that was distinctly NOT what I asked for means it is messing with the specifics that are asked for.
Why use an AI? Just like... take a selfie
So, what you're saying is that white people shouldn't use AI?
It would appear that is exactly what I'm saying, provided the reader lacks any reading comprehension skills.
I'm not the lego person, but I am not taking that selfie because: 1) I don't want to clean the house to make it look all nice before judgey relatives critique the pic, 2) my phone is old and all its pics are kinda fish-eyed, 3) I don't actually want to spend the time doing the task right now when AI can get me an image in seconds.