[–] blarghly@lemmy.world 8 points 5 days ago* (last edited 5 days ago) (7 children)

Seems like an uphill battle, legally. I assume a good analogy would be a bar. Suppose two people meet in a bar, they consensually leave together, and then one rapes the other. Even if the bar was informed that one of these people had raped people he met in the bar before, afaik the bar doesn't have a legal responsibility to ban him, since the bar isn't a court of law and it would be way too much responsibility to saddle every bar owner with deciding the guilt or innocence of someone.

Otoh, even if the case doesn't pan out, it might push Match Group to be more aware of these sorts of things and implement features that actually work to reduce incidents. But still, it's a difficult problem to solve. They can't discriminate based on sex/gender, so all reports would need to be handled via the same mechanism. So imagine they implement a reporting system with harsher penalties: if you are accused of assault, you are instantly permabanned. Well, now expect things like, say, some neckbeard spamming assault reports against every woman he matches with who doesn't agree to go out with him.

Also, iirc, the reason you can rejoin after being banned is that the apps delete all your data six months after you delete your account. Assuming they actually do this (I'm a little doubtful), keeping that data around instead to enforce permanent bans would get the privacy people all riled up.

[–] chickenf622@sh.itjust.works 4 points 4 days ago (2 children)

You could hash the data, so you can keep a list of bad actors while keeping the data relatively private. Something like the sketch below.
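
Very rough sketch of what I mean (Python; the field names and the key are made up by me, not anything Match Group actually does; the important bit is that only a keyed hash ever gets stored, never the raw PII):

```python
import hashlib
import hmac

# Illustrative only. A real system would load this from a secrets
# manager; without a secret key, hashes of low-entropy PII like phone
# numbers could be brute-forced straight out of a leaked ban list.
SECRET_KEY = b"example-secret-do-not-hardcode"

def hash_pii(value: str) -> str:
    """Keyed hash (HMAC-SHA256) of a normalized PII field."""
    normalized = value.strip().lower()
    return hmac.new(SECRET_KEY, normalized.encode("utf-8"), hashlib.sha256).hexdigest()

# A ban-list entry stores only the hashes, never the raw PII.
banned_entry = {
    "phone": hash_pii("+1 555 0100"),
    "email": hash_pii("someone@example.com"),
    "name": hash_pii("Jane Doe"),
}
```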

[–] astutemural@midwest.social 3 points 4 days ago (1 children)

Up until they get hacked, and then face lawsuits for improper storage of personal information. This seems like a no-win scenario for these apps. Not that I'm losing any sleep over it, mind.

[–] chickenf622@sh.itjust.works 3 points 4 days ago

That's the point of hashing the data. When it gets stolen, it's very difficult to reverse the hash, assuming a good algorithm has been used.

[–] DoctorPress@lemmy.zip 1 points 4 days ago (1 children)

That is 100% useless without a proper way of knowing whether someone is dangerous, and reports alone shouldn't be what defines that.

[–] chickenf622@sh.itjust.works -1 points 4 days ago (1 children)

A person is reported by multiple people, so you take hashes of their PII. If someone with the same PII tries to sign up, as stated in the article, don't let them sign up if some or all of the hashes match. Roughly like the sketch below.
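
Roughly like this, reusing the keyed-hash idea from my earlier comment (the threshold for "some or all" is a number I made up, not anything from the article):

```python
import hashlib
import hmac

SECRET_KEY = b"example-secret-do-not-hardcode"  # same keyed hash as above

def hash_pii(value: str) -> str:
    normalized = value.strip().lower()
    return hmac.new(SECRET_KEY, normalized.encode("utf-8"), hashlib.sha256).hexdigest()

def pii_match_count(signup: dict[str, str], banned: dict[str, str]) -> int:
    """Count how many hashed PII fields a new signup shares with a banned entry."""
    return sum(1 for field, h in banned.items() if signup.get(field) == h)

# "Some or all" as a threshold: 2+ matching fields here. Too low a
# threshold and common names alone would block innocent people.
THRESHOLD = 2

banned = {"phone": hash_pii("+1 555 0100"), "name": hash_pii("Jane Doe")}
signup = {"phone": hash_pii("+1 555 0100"), "name": hash_pii("Jane D. Doe")}

# Only the phone hash matches here (the name differs), so with a
# threshold of 2 this signup would be allowed through.
if pii_match_count(signup, banned) >= THRESHOLD:
    print("reject signup: matches a banned entry")
else:
    print("allow signup")
```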

[–] DoctorPress@lemmy.zip 5 points 4 days ago (1 children)

Multiple reports don't make an accusation correct or truthful.

In fact, you can already see abuses of systems like this, where people mass-report a target to get them kicked out, whether the reports are actually valid or not.

[–] chickenf622@sh.itjust.works 0 points 4 days ago

Good point. I was just trying to figure out a way to identify a known bad actor without storing their PII.
