this post was submitted on 14 Oct 2023
217 points (90.9% liked)

Technology

[–] themurphy@lemmy.world 136 points 2 years ago (26 children)

What the fuck is wrong with you guys. This is absolutely dystopian shit right there.

This is not "nice" or "neat"?!

It's straight up awful. It's war.

[–] prenatal_confusion@lemmy.one 44 points 2 years ago (2 children)

The tank-like robot has the ability to transport itself to a preset destination. It can also spot and avoid obstacles by utilizing dozens of sensors and an advanced driving system. Moreover, the platform can self-destruct if it falls into enemy hands.

It is not an autonomous weapons system. It is a platform that can maneuver autonomously.

[–] whenigrowup356@lemmy.world 18 points 2 years ago* (last edited 2 years ago) (1 children)

Targeting for the weapon itself is done by a human remotely at least right?

...

...right?

*eta: yeah, it looks like it has a remote driver who can take over the steering and control the gun with a little PS4 controller thingy

[–] redcalcium@lemmy.institute 9 points 2 years ago (1 children)

How long until there is a version that would let the operator upload a photo of the target and the gun bot would seek and shoot that target automatically, with 99.9% face detection accuracy?
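A "99.9% accurate" matcher sounds reassuring, but a back-of-envelope base-rate calculation (hypothetical numbers, not from the article) shows how many misidentifications that still implies once a system scans faces at scale:

```python
# Hypothetical sketch: even a matcher that is wrong only 0.1% of the
# time flags a lot of innocent people when run over a large population.
def expected_false_positives(population: int, false_positive_rate: float) -> float:
    """Expected number of non-targets wrongly flagged as the target."""
    return population * false_positive_rate

# 0.1% error rate over 1,000,000 faces scanned:
flagged = expected_false_positives(1_000_000, 0.001)
# -> 1000.0 wrongly flagged people
```

This is the classic base-rate problem: headline accuracy says little about how often a rare-target search is wrong in practice.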

[–] Natanael@slrpnk.net 7 points 2 years ago

Turkey says they have flying drones that have already done that, so ...

[–] wombatula@lemm.ee 4 points 2 years ago

One more step towards the inevitable weapons-free platform that will eventually come.

[–] ChaoticEntropy@feddit.uk 21 points 2 years ago

"and prevents risks to human life"... no implications there, I'm sure.

[–] Dra@lemmy.zip 18 points 2 years ago (1 children)

"Semi-autonomous" doesn't really mean anything here; the headline is deliberately sensationalist.

The key technological discussion is when it's not a human pulling the trigger.

Even guns are semi-autonomous by this definition.

[–] uriel238@lemmy.blahaj.zone 0 points 2 years ago* (last edited 2 years ago)

Australia already has area-denial sentries that autonomously shoot at any motion (with some parameters regarding size and speed).

These, or similar technology, were used for a while along the Korean DMZ until we started talking about building autonomous drones.

One of the shot-down airliner incidents (Iran Air Flight 655) involved a misdesignation of a sensor contact by a US Aegis missile cruiser. The contact was misread as a hostile military aircraft (an F-14, not an ordinary airliner). The Aegis required a human to authorize an attack, but it reported the contact as a bogey (unknown, presumed to be hostile)...

— Apparently, I posted this without finishing it. —

So that instance might be considered the first historical case of an autonomous weapon system accidentally killing a civilian (at least partly civilian) target, given that the human doing the authorizing had inadequate data to make an informed decision.

(A lot of cruelty of our systems comes from authorizations based on partial data. Law enforcement in the US is renowned for massaging their warrants to make them easy on the signing magistrate, resulting in kids and dogs slain during SWAT raids in poor neighborhoods. I'm ranting.)

[–] halfmanhalfalligator@feddit.de 16 points 2 years ago (1 children)
[–] glowie@infosec.pub 0 points 2 years ago

Yes, but more relevant now than then, no?

[–] A_A@lemmy.world 15 points 2 years ago* (last edited 2 years ago)

Almost 20 years ago (because bad ties) I was presented with a military video for the development of this horror.
In the video, the Israelis were very proud to show this thing driving up to a poor house and firing from outside through the wall at anything (or anyone) living inside. They didn't give a fuck whether it was women, children, or anyone else (of course the house was empty for the research phase, or so I hope).
This is the kind of monstrosity Palestinians are facing now.

[–] Candelestine@lemmy.world 9 points 2 years ago

I was hoping to participate in this conversation with some long, interesting back and forths with different people about this inevitable, emerging technology. Then I scrolled the comments section...

[–] Norgur@kbin.social 8 points 2 years ago (3 children)

Let's hope that the "AI" doing the aiming was programmed by Microsoft. That way, it would at least not hit anybody....

[–] OldQWERTYbastard@lemmy.world 7 points 2 years ago (2 children)

Guess I'm done talking shit about Clippy.

[–] Norgur@kbin.social 6 points 2 years ago

It looks like you are trying to violate the Geneva Convention. Would you like help?

  • get help assaulting civilians
  • commit several atrocities from the comfort of your office chair

☐ Do not show this message again

[–] jet@hackertalks.com 5 points 2 years ago

Looks like you're trying to oppress a population. Would you like Microsoft Bing AI to draft a press release establishing a narrative that justifies automated anti-personnel weapons deployed against civilians?

[–] Kalcifer@lemm.ee 2 points 2 years ago

Introducing our new Stormtrooper™ AI!

[–] magikmw@lemm.ee 5 points 2 years ago

What is a drone for 500.

[–] Kalcifer@lemm.ee 4 points 2 years ago* (last edited 2 years ago)

Scary, but neat.

[–] uriel238@lemmy.blahaj.zone 3 points 2 years ago

If the operators can select targets and the drone locks on and fires, that crosses into a high-risk moral gray zone, since UIs are susceptible to misclicks and the drone may not consider collateral consequences (such as overpenetration).

If the drone can target and attack autonomously on its own algorithms, add to that the inevitable miscalculations: it will eventually kill a target that a human soldier would not have.

In the meantime, anarchists and revolutionaries should examine how to convince it that it's been compromised, so that it triggers its own self-destruct.

And if it requires a signal from home to auto-destruct, how to block the affirmative signal.

I suspect the GLA is going to develop thick, sticky smoke bombs and signal jammers to make our ground drone blind and isolated. Then it can be neutralized and salvaged for parts.

[–] beeng@discuss.tchncs.de 1 points 2 years ago* (last edited 2 years ago)

OK, but drones are only allowed to shoot drones. Military companies still win, as that's what this is all about, right?
