this post was submitted on 13 Jan 2026
590 points (99.7% liked)
Technology
you are viewing a single comment's thread
I'd love to have private ALPRs in my neighborhood. We've had mailbox thefts and people driving around breaking into cars; we even had a car stolen. These guys are changing their plates regularly, but it would be super cool to at least get a neighborhood-wide alert if someone who's done shit has re-entered our neighborhood. I'm just not keen on giving that data to third parties lock, stock, and barrel.
So you could do what?
Violence, I am guessing.
Our (prior) neighborhood had 3 miles of roads with one entrance, a 911 call could get a sheriff's car response to block the entrance with a description of the vehicle (plate number, even).
Our current neighborhood, only 1/4 mile of road, so yeah, you'd have to shoot 'em.
So you want to monitor people so you can harm them, even though harm is not the appropriate punishment for their crime?
Never said I would shoot them, just pointing out the reality of modern life that the only possible response is to go interact with unknown belligerent members of the public in person, who may themselves be carrying firearms...
Jesus fuck that escalated quickly!
It's MAD on the local scale...
That's going to be unpopular to say around here, but the truth is that technology is largely amoral.
While the tech may be amoral, it's still implemented and utilized by pricks whose goal is control.
The real conundrum is: once you have unique identifiers on vehicles (which pretty much all countries with cars have), where's the line? Do you require people to visually read the plates and write them down on paper? Who is allowed to keep databases of the information? How do you prevent people from keeping their own private databases? How do you prevent someone from creating a dash-cam app that builds a GPS/time-coded database of every plate number it observes while driving? If a neighborhood HOA wants to network all their dash-cam (and fixed-location) ALPR information into a central database, when does it become too much to allow? And how could you possibly enforce limits against overstepping them?
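To show how low the barrier really is, here's a rough sketch of the core of such a dash-cam logger. This is purely hypothetical illustration (the OCR step of actually reading a plate out of a camera frame is assumed to exist elsewhere; the function names are made up); everything else is just a plain local database:

```python
import sqlite3
import time

# Hypothetical sketch: a GPS/time-coded plate log is only a few lines of
# code once you assume an OCR step exists. Nothing here is specialized
# surveillance tech; it's an ordinary local SQLite database.

def open_log(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS sightings ("
        "plate TEXT, lat REAL, lon REAL, ts REAL)"
    )
    return db

def log_sighting(db, plate, lat, lon, ts=None):
    # Record one observation: plate string plus GPS coordinates and time.
    db.execute(
        "INSERT INTO sightings VALUES (?, ?, ?, ?)",
        (plate, lat, lon, ts if ts is not None else time.time()),
    )
    db.commit()

def history(db, plate):
    # Every time and place this camera has ever seen a given plate.
    return db.execute(
        "SELECT lat, lon, ts FROM sightings WHERE plate = ? ORDER BY ts",
        (plate,),
    ).fetchall()
```

The point of the sketch is that the hard part (plate OCR) already ships in open-source libraries, so any limit would have to be legal rather than technical.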
Scenario: an HOA has fixed-cam automatic plate reader information and video evidence proving that XM3 5D9 was out smashin' mailboxes on Friday night. The HOA president is cruising downtown Saturday morning and, using his dash-mounted ALPR software, finds XM3 5D9 parked on the street. He calls the cops (in a vain attempt) to have them come arrest the mailbox smashers, who were recorded in close-up 4K high-def night vision doing the deed from the window of their car. This feels close to the overstepping limit, but what if there were no cameras or software involved and the same XM3 5D9 plate ID was used by the same people to make the same accusation against the same mailbox smashers, this time based on telephoto chemical-film pictures?
This also ignores the fact that the person in the car the second time XM3 5D9 was spotted is not necessarily the same person who was in it the first time. So one could easily falsely accuse someone.
Oh, that's what the photos / videos are for... but, sure, circumstantial evidence is a super basis for harassment of the innocent.
Yup, and it's important to communicate that, or we risk losing our voice with the general public and looking like Luddites.
Just FYI, using the term Luddite derogatorily may not be as cool as you think it is. They were essentially an instance of organized labor flexing their power, not really "against technological advancement" the way the term gets bandied about.
https://en.wikipedia.org/wiki/Luddite
I am aware, but I am using it in the colloquial sense. And you understood my point, which is exactly how the general public that needs to be swayed will interpret it.
You can and should make your point without denigrating labor movements.
Originally the Pedants were a group of trans atheist Linux users from Pedantia, so I won’t use it as a pejorative in this context.
Uhh, okay? Language and its use change. If you want to be effective in getting your point across, you need to keep up. The choir on Lemmy isn't who needs to be persuaded.
Feel free to be technically correct, but I would like to see the idea take mass adoption instead.
Despite the memes, also typically not the best kind of correct.
Before someone says it.
The problem with surveillance tech is that even if it was initially implemented with the best intentions by good people that aren't seeking to abuse it, it can change hands.
That is true
Enabling a surveillance state is not amoral.
Your phrasing seems to imply I said it was, but I never said that.
If you are in a discussion about the development and deployment of technology to facilitate a surveillance state, then saying “technology is neutral” is the least interesting thing you could possibly say on the subject.
In a completely abstract, disconnected-from-society-and-current-events sense it is correct to say technology is amoral. But we live in a world where surveillance technology is developed to make it easier for corporations and the state to invade the privacy of individuals. We live in a world where legal rights are being eroded by the use of this technology. We live in a world where this technology is profitable because it helps organizations violate individual rights. If you live in the US, as I do, then you live in a world where federal law enforcement agencies have become completely contemptuous of the law and are literally abducting innocent people off the street. They use the technology under discussion here to help them do that.
That a piece of tech might potentially be used for a not-immoral purpose is completely irrelevant to how it is actually being used in the real world.
And that is what we need to focus our messaging on: the evil people and institutions enabling this, as those are permanent. Tech comes and goes (and should not be anthropomorphized). Focusing on the tech just means the institution looks for another path; focusing on the institution blocks it at the source.
“Technology is neutral” is a bromide engineers use to avoid thinking about how their work impacts people. If you are an engineer working for Flock or a similar company, you are harming people. You are doing harm through the technology you help to develop.
The massive surveillance systems that currently exist were built by engineers who advanced technology for that purpose. The scale and totality of the resulting surveillance states are simply not possible without the tech. The closest alternatives are Stasi-like systems that are nowhere near as vast or continuous. In the actual world the actual tech is immoral, because it was created for immoral purposes and because it is used for immoral purposes.
All technology has that potential. Some more than others. The issue is that institutions, like Flock, exist solely for the evil applications.
As I said before: in a conversation about technology as it actually exists, talking about potentials is not interesting. Yes, all technology has the potential to be good or bad. The massive surveillance tech is actually bad right now in the real world.
The issue with asserting that technology is neutral is that it lets the people who develop it ignore the impacts of their work. The engineers who make surveillance tech make it, ultimately, for immoral purposes. When they are confronted with the effects of their work on society, they avoid reckoning with the ethics of what they are doing by deploying bromides like “technology is neutral.”
Example: building an operant-conditioning feedback system into a social media app or video game is not inherently bad; you could use it to reinforce good behaviors and deploy it ethically by obtaining the consent of the people you use it on. But the operant-conditioning tech in social media apps and video games that actually exists is very clearly and unambiguously bad. It exists to get people addicted to a game or media app so that they can be more easily exploited. Engineers built that tech stack out for the purpose of exploiting people. The tech, as it exists in the real world, is bad. When these folks were confronted with what they had done, they responded by claiming that tech is not inherently good or bad. (This is a real thing social media engineers really said.) They ignored the tech as it actually exists in favor of an abstract conversation about some potential alternative tech that does not exist. The effect is that the people doing harm built a terrible system without ever confronting what it was they were doing.
I don't see how that is the case. The tech is neutral, but the engineers know what the application they are hired for is. That is determined by people and subject to morality.
Would you say OpenCV or the people working on it are evil? I wouldn't. I would say that once someone takes that project and turns it into Flock, that is evil.
I think this framing is more important when talking with the general public, as they are likely to walk away thinking that it's the tech that creates problems, not the for-profit corporations, who will be free to continue doing the same so long as they don't use that tech.
It is literally the case. People who have literally made tools to do bad things justified it by claiming that tech is neutral in an abstract sense. Find an engineer who is building a tool to do something they think is bad, they will tell you that bromide.
OpenCV is not, in itself, immoral. But OpenCV is, once again, actual tech that exists in the actual world. In fact, that is how I know it is not bad: I use the context of reality, rather than hypotheticals or abstractions, to assess the morality of the tech. The tech stack that makes up Flock is bad; once again, I make that determination by using the actual world as a reference point. It does not matter that some of the tech could be used to do good. In the case of Flock, it is not, so it's bad.
Bolded a keyword there for you.
At no point in this conversation have I ever said that tech in an abstract sense is inherently good or bad. The point that I am making (and this is the last time I will make it) is that it is not interesting to talk about the ethics of some technology in the abstract in cases where the actual tech, as it is actually implemented, is clearly bad.
Saying “tech is neutral” is a dodge. People say that to avoid thinking about the ethics of what it is they are doing.
But that is what you are doing and I am saying that it is people who are responsible for the implementation.
People are the ones who do things with tech; hence they are responsible for the actions. Tech is just an object with no will of its own to do right or wrong.
Last attempt, I swear.
By digressing to abstraction, good people can and do justify building tech for immoral purposes. It is irrelevant that tech is not inherently good or bad in cases where it is built to do bad things. Talking about potential alternate uses in cases where tech is being used to do bad is just a way of avoiding the issues.
I have no problem calling Flock's or Facebook's tech stack bad, because the intentions behind the tech are immoral. The application of the tech by those organizations is for immoral purposes (making people addicted, invading their privacy, etc.). The tech is an extension of bad people trying to do bad things. Commentary about tech's abstract nature is irrelevant at that point. Yeah, it could be used to do good. But it's not. Yeah, it is not in and of itself good or bad. Who cares? This instantiation of the tech is immoral, because its purposes are immoral.
The engineers who help make immoral things possible should think about that, rather than the abstract nature of their technology. In these cases, saying technology is neutral is to invite the listener to consider a world that doesn’t exist instead of the one that does.
And did those assemble themselves to be evil? Or did someone make them that way?
To go back to my OpenCV example: it is just tech. It does not become an LPR with a cop back end until Flock configures it that way.
Yes, exactly my point.
The technology enables the surveillance state. Therefore the technology is not amoral.
The issue you'll run into is effectiveness at that small scale, so you'll be tempted to share data with other systems like yours, and eventually you'll end up creating a different Flock.
The idea and motive and intention is great. The (edit: eventual) outcome is always evil.
I wonder if a segregated system design could address this: in-system segregation, similar to a TPM, with the actual detection/matching part of the system separated from the command and control part.
As in, the camera and OCR operations would live in their own embedded system which could never receive code updates from the outside. Perhaps this is etched into the silicon SoC itself. Also on silicon would be a small NVRAM that could only hold requested license plate numbers (or perhaps a hash of them). This NVRAM would be WRITE ONLY, so it could never be queried from outside the SoC. The raw camera feed would be wired to the SoC. The only input would be from an outside command and control system (still local to our SoC) that an administrator could use to send in new license plate numbers to search against. The output of the SoC would be "Match found against License Plate X". Even the timestamp would have to be applied by the outside command and control system.
This would have some natural barriers against dragnet surveillance abuse.
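The write-only matching idea above can be sketched in software to make the behavior concrete. This is a minimal, hypothetical model (the class and method names are made up, and a salted hash plus an in-memory set stand in for the on-silicon NVRAM); in real hardware the salt would never leave the chip:

```python
import hashlib
import os

class WriteOnlyWatchlist:
    """Sketch of the write-only NVRAM idea: plates go in as salted
    hashes and can never be read back out, only matched against."""

    def __init__(self):
        self._salt = os.urandom(16)   # per-device secret, never exported
        self._hashes = set()          # stands in for the write-only NVRAM

    def _digest(self, plate):
        # Salting means a dump of the stored hashes can't easily be
        # brute-forced back into plate numbers without the device secret.
        return hashlib.sha256(self._salt + plate.encode()).digest()

    def add(self, plate):
        # Write path: command-and-control loads a plate to watch for.
        # There is deliberately no method that enumerates stored plates.
        self._hashes.add(self._digest(plate))

    def observe(self, plate):
        # Read path: the OCR side reports only "match" / "no match".
        return self._digest(plate) in self._hashes
```

Because the only query is a yes/no match against plates someone already chose to load, the device can't be repurposed after the fact into a log of everyone who drove past, which is the dragnet property the design is trying to rule out.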