this post was submitted on 02 Sep 2023
177 points (92.8% liked)

Technology @ lemmy.world

Call Of Duty using AI to listen out for hate speech during online matches

The tool, which will monitor voice chat for any bullying and harassment, will be part of Modern Warfare III - the next game in the series - when it launches in November.

all 30 comments
[–] FireTower@lemmy.world 55 points 1 year ago (5 children)

Do people use CoD VOIP chat for anything but slurs, bullying, & harassment?

I can see this leading to fan complaints about heavy-handed over-moderation, like we've seen in text moderation. For example, Spanish speakers getting banned for saying "negro," or someone getting removed after naively being tricked into answering a trap question.
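As a hypothetical illustration of why crude text moderation misfires (the blocklist term and function here are made up for the example), a substring match flags innocent idioms and loanwords just as readily as actual abuse:

```python
# A made-up, minimal blocklist filter. Real systems are more sophisticated,
# but substring matching is where many over-moderation stories start.
BLOCKLIST = {"chink"}

def naive_filter(message: str) -> bool:
    """Return True if any blocked term appears anywhere in the message."""
    text = message.lower()
    return any(term in text for term in BLOCKLIST)

# An innocent idiom trips the filter exactly like a slur would:
print(naive_filter("we found the chink in their armor, boys!"))  # True
print(naive_filter("good game everyone"))                        # False
```

Word-boundary matching helps, but it still can't distinguish the Spanish word "negro" from a slur without understanding the language being spoken.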

[–] Kolanaki@yiffit.net 33 points 1 year ago

I got banned once from something because I said "we found the chink in their armor, boys!" Took me forever to even figure out what word triggered it.

[–] phoneymouse@lemmy.world 10 points 1 year ago (2 children)

I mute voice chat because I play games to enjoy myself.

[–] TORFdot0@lemmy.world 3 points 1 year ago

Seriously, considering there's no dedicated server browser and even lobbies aren't consistent, what on earth does random voice chat add to the experience? I just hit the mute button on my console as soon as I boot the game.

[–] Dettweiler42@lemmyfly.org -1 points 1 year ago

I leave it on so I can hear people on the receiving end rage. We are not the same.

[–] specialneedz@lemmy.ml 4 points 1 year ago

Some sort of system to get someone to listen back to flagged content could work well. It could be trusted members of the community, like DOTA 2's Overwatch feature.

[–] SnipingNinja@slrpnk.net 3 points 1 year ago

I read that this will only mark the voice chat for human moderation, so I hope they'll use local language staff. It'll still have false positives if someone who doesn't understand the language tries to moderate

[–] CmdrShepard@lemmy.one 2 points 1 year ago (1 children)

I used to forget my mic was turned on and vape right next to the microphone. That's an additional use for VOIP chat.

[–] FireTower@lemmy.world 2 points 1 year ago

Yeah I forgot about that and hearing some kid's parent's marriage failing in the background.

[–] Hotdogman@lemmy.world 38 points 1 year ago (1 children)

Great. Now I'll never know whose house my mom is at every morning.

[–] Kolanaki@yiffit.net 22 points 1 year ago (1 children)

If they count "your mom" jokes as hate speech, the dialogue going on in game is just going to plummet to zero.

[–] BeakersBunsen@lemmy.zip 23 points 1 year ago (2 children)

Actually ban cheaters? Nah, police voice chat instead.

[–] ramjambamalam@lemmy.ca 2 points 1 year ago

It's not exclusive. A twelve year old yelling slurs into their microphone is easily detectable using modern technology. Why not?

[–] mrpants@midwest.social 0 points 1 year ago

What a stupid take, these are completely different and valid problems with entirely separate solutions. One of which the gaming industry has spent decades fighting and the other they literally just got tools for.

[–] Candelestine@lemmy.world 19 points 1 year ago (1 children)

Hey, another thing a chatbot could be trained to be good at. That's actually not a bad idea.

[–] SkyeStarfall@lemmy.blahaj.zone 10 points 1 year ago (1 children)

Yeah, it could actually work if it's tuned properly. Certainly it could at least be an additional tool for moderators.

[–] greybeard@lemmy.one 11 points 1 year ago (1 children)

If it's only used to flag, then passed to a moderation team for verification, it would work really well to police something that is almost impossible to police otherwise. That said, I'm sure they won't do that; they'll just let it handle the moderation and ignore the false positives. Honestly, I'm still OK with that. I haven't used voice chat in games in 20 years because it always devolves into a cesspool, so even bad moderation is better than what we have now.

[–] freecandy@lemmy.world 1 points 1 year ago (1 children)

Training that neural network must have been a treat

[–] greybeard@lemmy.one 2 points 1 year ago

I'd guess that it is doing voice to text, then standard automatic moderation on the text, rather than a new AI that understands hateful sounds. Just a guess though. At this point, you could run the voice to text on local machines and pass that off to the server. Of course that means modders could disable the protection, but the vast majority of users wouldn't be able to do that. It would also give the added benefit of transcriptions for players that can't hear voice chat.
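The pipeline being guessed at here could be sketched roughly like this. This is purely speculative: the function names are invented, and the transcription step is a stand-in for whatever client-side speech-to-text model would actually be used.

```python
# Speculative sketch: transcribe on the client, moderate the text server-side.
def transcribe(audio_chunk: bytes) -> str:
    """Stand-in for a local speech-to-text model running on the player's machine."""
    # A real implementation would invoke an actual model here.
    return "placeholder transcript"

def moderate_text(transcript: str, blocklist: set[str]) -> list[str]:
    """Ordinary text moderation: return any flagged terms for human review."""
    words = transcript.lower().split()
    return [w for w in words if w in blocklist]

# Server side: only flagged clips get queued for a human moderator.
flags = moderate_text(transcribe(b"..."), {"exampleslur"})
if flags:
    print("queue clip for review:", flags)
```

Doing transcription locally also sidesteps shipping raw audio to the server, though as noted, anything client-side can be tampered with.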

[–] lilShalom@lemmy.basedcount.com 13 points 1 year ago

Kids will create new slang for derogatory terms.

[–] Sanctus@lemmy.world 8 points 1 year ago

To boost it in a nostalgia effort.

[–] scarabic@lemmy.world 8 points 1 year ago

It’s “listening for it.”

LOL. This isn’t SETI where you’ll go 30 years without hearing anything. It should find hate speech within seconds. So it either works or not, and we should know already. Which is it?

[–] foggy@lemmy.world 8 points 1 year ago

Its associations are gonna be so strong that any human voice will immediately trigger the harassment alarm lol.

[–] autotldr@lemmings.world 7 points 1 year ago

This is the best summary I could come up with:


Publisher Activision said the moderation tool, which uses machine learning technology, would be able to identify discriminatory language and harassment in real time.

Activision's chief technology officer Michael Vance said it would help make the game "a fun, fair and welcoming experience for all players".

The issue is exacerbated in popular multiplayer games due to the sheer number of players, with around 90 million people playing Call Of Duty each month.

Activision said its existing tools, including the ability for gamers to report others and the automatic monitoring of text chat and offensive usernames, had already seen one million accounts given communications restrictions.

Call Of Duty's code of conduct bans bullying and harassment, including insults based on race, sexual orientation, gender identity, age, culture, faith, and country of origin.

Mr Vance said ToxMod allows the company's moderation efforts to be scaled up significantly by categorising toxic behaviour based on its severity, before a human decides whether action should be taken.


The original article contains 357 words, the summary contains 160 words. Saved 55%. I'm a bot and I'm open source!

[–] T156@lemmy.world 6 points 1 year ago* (last edited 1 year ago)

Wonder how effective it will be. People screaming into cheap mics tends to be incomprehensible at the best of times. It might not be able to detect much at all.

[–] ArchmageAzor@lemmy.world 2 points 1 year ago

Just wait until it mishears what is actually a man with an Irish accent saying the word "neither".

[–] Gerula@lemmy.world 2 points 1 year ago

This is dumb. Even without AI, almost every generation has its own slang and slurs...

[–] giantofthenorth@lemm.ee 0 points 1 year ago (1 children)

Glad I haven't played a cod game in years.

There's already a report button if someone has an issue with it let them report it, this is just going to lead to a ton of false positives or be completely useless.

[–] CmdrShepard@lemmy.one 2 points 1 year ago

I'd argue the report button has the same flaws. Whoop someone badly in a match? Reported. Report someone making bigoted remarks? You'll still see them in a lobby weeks later.