this post was submitted on 20 Aug 2023
Technology
While yes, that's an accurate quip, it actually does highlight a deeper issue in the industry. If everyone passes your scam test, they don't need to buy your scam test.
Additionally, spotting scam emails isn't a 50/50, yes/no, pass/fail exercise. It's more about weighing a combination of red flags to gauge how risky an email is to click links in, reply to, download attachments from, et cetera; a rough sketch of that kind of scoring is below.
Currently the scam-testing industry has no way to rate an individual's ability other than how many scam emails they did or didn't click on. That is a flawed metric, and it incentivizes scam testers to trick people in order to justify their value to the customer.
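To make the "combination of red flags" idea concrete, here is a toy sketch of a weighted risk score. The specific flags and weights are invented for illustration and aren't taken from any real phishing-simulation product:

```python
# Toy illustration: score an email by a weighted combination of red flags,
# rather than a binary pass/fail. Flags and weights are made up for the example.
RED_FLAG_WEIGHTS = {
    "mismatched_sender_domain": 0.3,    # display name doesn't match the sending domain
    "urgent_or_threatening_tone": 0.2,
    "unexpected_attachment": 0.25,
    "link_text_differs_from_url": 0.25,
}

def email_risk_score(flags_present: set) -> float:
    """Return a 0.0-1.0 risk estimate from the red flags observed in an email."""
    score = sum(RED_FLAG_WEIGHTS.get(flag, 0.0) for flag in flags_present)
    return min(score, 1.0)

# Example: an email with a spoofed-looking sender and a deceptive link
print(email_risk_score({"mismatched_sender_domain", "link_text_differs_from_url"}))  # 0.55
```

Under a model like this, "how many test emails did you click" tells you very little; what matters is whether people notice the individual flags.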
Maybe a better way would be to stick with pentesters. The real test is whether they can actually scam someone.
I mean, they are two different aspects of security. Pen testers are important, but they can't help you if an employee clicks on the wrong link.
Isn't social engineering a part of what they do? The goal would be to train employees to look out for both pentesters and real scammers.