What the fuck is wrong with you guys. This is absolutely dystopian shit right there.
This is not "nice" or "neat"?!
It's straight up awful. It's war.
The tank-like robot has the ability to transport itself to a preset destination. It can also spot and avoid obstacles by utilizing dozens of sensors and an advanced driving system. Moreover, the platform can self-destruct if it falls into enemy hands.
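The "preset destination, avoid obstacles" part is bog-standard mobile robotics underneath the marketing. A toy sketch of the idea (nothing here is from the article, every name is made up):

```python
def drive_to(dest, obstacle_at=None, start=0, safety_gap=2):
    """Toy 1-D waypoint driver: step toward the preset destination,
    but hold position if a sensed obstacle comes within safety_gap.
    Returns the list of positions actually driven through."""
    pos, path = start, [start]
    while pos != dest:
        step = 1 if dest > pos else -1
        nxt = pos + step
        # the real platform's "dozens of sensors" reduced to one range check
        if obstacle_at is not None and abs(obstacle_at - nxt) < safety_gap:
            break  # stop rather than drive into the obstacle
        pos = nxt
        path.append(pos)
    return path

print(drive_to(10))                 # clear path: drives all the way to 10
print(drive_to(10, obstacle_at=6))  # halts short of the obstacle
```

The real system is obviously doing sensor fusion and path replanning, not a range check, but the control loop shape is the same.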
It is not an autonomous weapons system. It is a platform that can maneuver autonomously.
Targeting for the weapon itself is done by a human remotely at least right?
...
...right?
*eta: yeah, it looks like it has a remote driver who can take over the steering and control the gun with a little PS4 controller thingy
How long until there is a version that would let the operator upload a photo of the target and the gun bot would seek and shoot that target automatically, with 99.9% face detection accuracy?
Turkey says their flying drones have already done that, so ...
One more step towards the inevitable weapons-free platform that will eventually come.
"and prevents risks to human life"... no implications there, I'm sure.
"Semi-autonomous" doesn't really mean anything; it's a deliberately sensationalist headline.
The key technological discussion is when it's not a human pulling the trigger.
Even guns are semi-autonomous by this definition.
Australia already has area-denial sentries that autonomously shoot at any motion (with some parameters regarding size and speed).
These, or a similar technology, were used for a while along the Korean DMZ until we started talking about building autonomous drones.
One of the shot-down airliner incidents (Iran Air Flight 655, maybe?) involved a misdesignation of a sensor contact by a US Aegis missile cruiser's system. The plane reportedly pinged with an F-4 Phantom radar signature (which ruled out an ordinary airliner). The Aegis required a human to authorize an attack, but it reported the contact as a bogey (unknown, presumed to be hostile)...
— Apparently, I posted this without finishing it. —
So that instance might be considered the first historical case of an autonomous weapon system accidentally killing a civilian (or at least partially civilian) target, given that the human doing the authorizing had inadequate data to make an informed decision.
(A lot of cruelty of our systems comes from authorizations based on partial data. Law enforcement in the US is renowned for massaging their warrants to make them easy on the signing magistrate, resulting in kids and dogs slain during SWAT raids in poor neighborhoods. I'm ranting.)
This story is from 2021.
Yes, but more relevant now than then, no?
Almost 20 years ago (because of some bad ties) I was shown a military video from the development of this horror.
The Israelis in the video were very proud to show this thing driving up to a poor house and firing from outside, through the wall, at anything (possibly) living inside. They didn't give a fuck whether it would be women, children, or anyone else. (Of course the house was empty during the research phase. Well, I hope so.)
This is the kind of monstrosity Palestinians are facing now.
I was hoping to participate in this conversation with some long, interesting back and forths with different people about this inevitable, emerging technology. Then I scrolled the comments section...
Let's hope that the "AI" doing the aiming was programmed by Microsoft. That way, it would at least not hit anybody....
Guess I'm done talking shit about Clippy.
It looks like you are trying to violate the Geneva Convention. Would you like help?
☐ Do not show this message again
Looks like you're trying to oppress a population. Would you like Microsoft Bing AI to draft a press release to establish a narrative justifying automated anti-personnel weapons deployed against civilians?
Ricochets gotta land somewhere. Just hope it's not a fleshy bit.
I was hoping more for an immediate gun jam when the gun can't reach Microsoft's telemetry servers in time, followed by a bluescreen because the right headlight (not the left) was initialized at kernel level for some reason and threw a memory access violation.
Introducing our new Stormtrooper™ AI!
What is a drone for 500.
Scary, but neat.
Here we go. I can't wait for the Boston Dynamics one to be outfitted with a .50 and drone support.
That's going to be totally cool and amazing, watching it launch its 300 lb body off stuff all parkour-like, 360-no-scoping dudes while the drone drops grenades.
/s in case you need it. This is totally going to suck.
If the operators can select targets and the drone locks on and fires, that crosses into a high-risk moral gray zone, since UIs are susceptible to misclicks and the drone may not consider collateral consequences (such as overpenetration).
If the drone can autonomously target and attack using its own algorithms, add the inevitable miscalculations: it will eventually kill a target that a soldier would not.
In the meantime, anarchists and revolutionaries should examine how to convince it it's been compromised, so that it self-destructs.
And if it requires a signal from home to auto-destruct, how to block that affirmative signal.
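On the "affirmative signal" point, the design question is whether the thing fails safe (no heartbeat from home means disarm) or fails deadly (acts only on an explicit command, which jamming blocks). A hypothetical sketch of the fail-safe variant, all names made up:

```python
import time

class Watchdog:
    """Hypothetical link watchdog: 'home' calls heartbeat() periodically;
    if the beats stop for longer than timeout, the link counts as dead."""
    def __init__(self, timeout=5.0):
        self.timeout = timeout
        self.last_beat = time.monotonic()

    def heartbeat(self):
        # called every time a message from home arrives
        self.last_beat = time.monotonic()

    def link_alive(self, now=None):
        now = time.monotonic() if now is None else now
        return (now - self.last_beat) < self.timeout

def failsafe_state(wd, now=None):
    # fail-safe: *losing* the link triggers the safety action, so there
    # is no affirmative signal for a jammer to block
    return "armed" if wd.link_alive(now) else "disabled"
```

Under that design, jamming would disable the platform rather than strand it armed; blocking a signal only works against the fail-deadly variant.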
I suspect the GLA is going to develop thick, sticky smoke bombs and signal jammers to make our ground drone blind and isolated. Then it can be neutralized and salvaged for parts.
OK, but drones should only be allowed to shoot drones. Military companies still win either way, since that's what this is all about, right?