serenissi

joined 2 months ago
[–] serenissi@lemmy.world 1 points 1 week ago

The text. And probably the images too (though the fact that the only mistake is the wrong port depiction (all C) points more toward a human).

[–] serenissi@lemmy.world 2 points 4 weeks ago* (last edited 4 weeks ago)

AFAIK the Steam Deck doesn't have a GPS module. You'll have to get hold of some other identifying information.

What you could do is send memes with a different canary-token redirect for each worker. Send them around while the device is in use; that way you can compare the grabbed IPs with the Steam log and see which worker's matches. As the Deck doesn't have a SIM, they'll either be on home wifi or a mobile hotspot, and both will work this way.
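A minimal sketch of the redirect part, assuming you self-host a small Flask app; the tokens, worker names, and meme URL are all made up for illustration:

```python
# Toy canary-token redirect: logs the requester's IP, then forwards to
# the real meme. One unique token per worker (all values illustrative).
from datetime import datetime, timezone

from flask import Flask, redirect, request

app = Flask(__name__)

TOKENS = {          # token -> worker; generate with secrets.token_urlsafe()
    "k3xq81": "worker_a",
    "m9zt47": "worker_b",
}
MEME_URL = "https://example.com/meme.jpg"

@app.route("/m/<token>")
def track(token):
    worker = TOKENS.get(token, "unknown")
    # honour X-Forwarded-For in case this sits behind a reverse proxy
    ip = request.headers.get("X-Forwarded-For", request.remote_addr)
    print(f"{datetime.now(timezone.utc).isoformat()} {worker} {ip}")
    return redirect(MEME_URL)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Each worker gets a link like `https://your-host/m/k3xq81`; whoever's logged IP matches the Steam session IP is your answer.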

[–] serenissi@lemmy.world 27 points 1 month ago (2 children)

> scaled by system/themselves

Looks like those are X11 apps. Why is Firefox in that list? Run it as native Wayland with MOZ_ENABLE_WAYLAND=1.

[–] serenissi@lemmy.world 16 points 1 month ago (4 children)

I've a suggestion that might work, depending on how honest the person hiring the workers is and on their contract. You can tell them to send some questionnaire or feedback form to all the workers, one that tracks IP and name/email (say, a unique form per worker). Then you can match the IPs, as home IPs are mostly static over short durations. Tell them to send the form at night, or some other time when the workers will be at home, and give it a short deadline.
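The matching step afterwards is trivial; a toy sketch with placeholder addresses:

```python
# Cross-reference form-submission IPs against IPs seen in the Steam log.
# All addresses below are made-up placeholders.
form_hits = {
    "worker_a": "203.0.113.7",
    "worker_b": "198.51.100.22",
}
steam_login_ips = {"198.51.100.22", "192.0.2.5"}

for worker, ip in form_hits.items():
    if ip in steam_login_ips:
        print(f"{worker} filled the form from an IP in the Steam log: {ip}")
```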

[–] serenissi@lemmy.world 1 points 1 month ago (1 children)

The bad guys use bots or solver services and are done. Regular users have to endure the friction while no security is added.

Put another way: common users can't easily become the 'bad guy', i.e. the cost of an attack is higher, hence fewer script kiddies and automated attacks. That's the number you want to reduce. These protections are nothing to botnet owners or other high-profile bad actors.

PS: reCAPTCHA (or CAPTCHA in general) isn't a security feature. At most it can be a safety feature.

[–] serenissi@lemmy.world 1 points 1 month ago (3 children)

> stopping automated requests

Yeah, my bad. I meant too many automated requests. Both humans and bots generate spam, and the issue is a high influx of it. Legitimate users also use bots, and by no means is that harmful. That's why you don't encounter a CAPTCHA every time you visit a Google page, nor does a couple of scraping scripts run into problems. reCAPTCHA (or hCaptcha, say) triggers when a high volume of requests comes from the same IP. Instead of blocking everyone out to protect their servers, they can allow slower requests so legitimate users face minimal hindrance.
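That throttling behaviour is basically a per-IP token bucket; a toy sketch (the rate and burst numbers are arbitrary):

```python
# Per-IP rate limiting: allow a burst, then throttle instead of
# blocking outright. Numbers below are arbitrary for illustration.
import time
from collections import defaultdict

RATE = 5.0    # tokens refilled per second
BURST = 20.0  # bucket capacity

buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

def allow(ip: str) -> bool:
    b = buckets[ip]
    now = time.monotonic()
    b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
    b["last"] = now
    if b["tokens"] >= 1.0:
        b["tokens"] -= 1.0
        return True
    return False  # over budget: serve a CAPTCHA or slow the response
```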

Most Google services nowadays require accounts with stronger verification (like a cell phone number), so automated spam isn't a big deal.

[–] serenissi@lemmy.world 1 points 1 month ago (1 children)

And what will you do if someone behind CGNAT is DoSing/scraping your site while you want everyone else on that same IP to get through? IP-based limiting isn't very useful, in either direction.

[–] serenissi@lemmy.world 1 points 1 month ago (3 children)

hCaptcha, Microsoft's CAPTCHA, they all do the same. Can you give an example of one that can't easily be overcome just by throwing better compute hardware at it?

[–] serenissi@lemmy.world 2 points 1 month ago

If you need to switch without rebooting then dual booting is out of the question, and hence so is Asahi; Asahi is for running Linux natively on Apple hardware. In a VM you can run anything. Drawbacks include non-native performance, no direct access to the touchpad, GPU, and other hardware, and the fact that macOS is still running underneath, which may be a privacy concern depending on how much you trust Apple's proprietary code, plus you're not on a fully free software stack.

[–] serenissi@lemmy.world 1 points 1 month ago (5 children)

There isn't a good way to distinguish human users from scripts without adding too much friction for normal use. Also, bots are sometimes welcome and useful; it only becomes a problem when someone tries to mine data in large volumes or effectively DoSes the server.

Forget bots: there are centers in India and other countries where you can employ humans to do 'automated things' (YouTube likes and watch hours, for example) at about the same cost as bots. There are similar CAPTCHA-solving services too. Good luck with those :)

Rate limiting is the only effective option.

[–] serenissi@lemmy.world 35 points 1 month ago (13 children)

The objective of reCAPTCHA (or any CAPTCHA) isn't to detect bots. It's more about stopping automated requests, i.e. rate limiting. A CAPTCHA is 'defeated' if the time it takes to solve, whether by human or bot, is less than expected. Humans are very slow, so they can't beat it anyway.
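A back-of-the-envelope way to see it (all numbers are made-up placeholders):

```python
# A CAPTCHA only helps while solving it stays the bottleneck.
human_solve_s = 10.0  # rough guess: a human needs ~10 s per challenge
bot_solve_s = 0.5     # rough guess: a solver service needs ~0.5 s
requests = 100_000    # size of the attack

print(f"human farm: ~{requests * human_solve_s / 3600:.0f} hours of labour")
print(f"bot solver: ~{requests * bot_solve_s / 3600:.1f} hours of compute")
```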
