this post was submitted on 04 Oct 2023
148 points (97.4% liked)

Technology


Los Angeles is using AI to predict who might become homeless and help before they do

[–] qooqie@lemmy.world 44 points 1 year ago (3 children)

This is what state run AI models should be doing, not any of that other wack ass shit

[–] Piecemakers3Dprints@lemmy.world 27 points 1 year ago (1 children)

Color me skeptical, considering this city specifically has the single most notoriously corrupt and violent police force in the history of the nation. Yeah, that model is being trained to "help".

[–] LibertyLizard@slrpnk.net 8 points 1 year ago (1 children)

If you’ve never worked with local governments, you may not realize how independent these departments can be. And also that the people who go into this line of work usually really want to help people. I can’t speak to the situation in LA directly but I seriously doubt they would be sharing their tools with the police unless there was political pressure to do so. Which I think is unlikely in LA.

[–] Piecemakers3Dprints@lemmy.world 2 points 1 year ago (2 children)

There's a non-zero chance that LLM is safely secured and incapable of being used for unethical purposes, no matter how "independent" the political groups are.

[–] LibertyLizard@slrpnk.net 1 points 1 year ago

They can just build their own model if they want to. It’s not that hard, especially since the police have a lot of money. So the question is more "do we allow this?" than "will they somehow steal it from another unrelated program?"

[–] Touching_Grass@lemmy.world 1 points 1 year ago

They're going to get it regardless. The question is whether we will.

Definitely. Policy should be made on the basis of what's proven to be effective, not ideology.

AI could be more effective, provided that what's fed into it isn't garbage.

[–] ShakeThatYam@lemmy.world 3 points 1 year ago (1 children)

No thanks. If this is remotely successful these fucks will next use it to Minority Report us.