Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.
- Don't duplicate the full text of your blog or github here. Just post the link for folks to click.
- Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues on the community? Report them using the report flag.
Questions? DM the mods!
It's a shame we don't have those banner ad schemes anymore. Cybersquatting could be a viable income stream if you could convince the crawlers to click banner ads for a fraction of a penny each.
That's insane... Can't a website owner require bots (at least those identifying themselves as such) to prove they're at least affiliated with a certain domain?
I don't know what "12,181+181" means (edit: thanks @Thunraz@feddit.org, see Edit 1) but absolutely not 1.2181 × 10^185^. That many requests can't be made within the 39 × 10^9^ bytes of bandwidth − in fact, they exceed the number of atoms on Earth times its age in microseconds (that's close to 10^70^). Also, "0+57" in another row would be dubious exponential notation, the exponent should be 0 (or omitted) if the mantissa (and thus the value represented) is 0.
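For scale, a quick sanity check of those magnitudes (the atom count and Earth's age are rough, commonly cited figures, not from the screenshot):

```python
import math

# Rough ballpark figures (assumptions, not from the screenshot):
atoms_on_earth = 1.3e50                        # ~10^50 atoms in the Earth
age_in_us = 4.5e9 * 365.25 * 24 * 3600 * 1e6   # Earth's age (~4.5 Gyr) in microseconds

product = atoms_on_earth * age_in_us
claimed_requests = 1.2181e185                  # the "exponential" misreading

print(f"atoms x age ~ 10^{math.log10(product):.0f}")  # -> atoms x age ~ 10^73
print(claimed_requests / product)  # still more orders of magnitude larger
```

So even granting that generous yardstick, 10^185 requests is off by over a hundred orders of magnitude from anything physically possible.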
AI bots killing the internet again? You don't say
I had to pull an all-nighter to fix an unoptimized query. I had just launched a new website with barely any visitors and hadn’t implemented caching yet for something I thought no one used anyway, but a bot found it and broke my entire DB by hitting the endpoint again and again until nothing worked anymore.
fracking clankers.
You can also use CrowdSec on your server to stop similar BS. They use a community-based blacklist, and you choose what you want to block. Check it out.
I'm going to try to implement CrowdSec for all my Proxmox containers over Cloudflare tunnels. Wish me luck, and that my wife and kids let me do this without constantly making stuff up for me to do.
They also have a plugin for OPNsense (if you use that).
I used to, but moved on to a full Unifi infrastructure about 2 years ago.
Yeah, then you need to implement it at the webhost level.
Good luck, and if you need help, drop by their Discord. They have an active community.
Can they help me keep my wife and kids at bay too? That's what I need the most help with 😂
I don't think asking help about domestic issues on the Internet is healthy... However, who knows maybe they can ( ͡~ ͜ʖ ͡°)
Fucking hell.
Yeah and that's why people are using cloudflare so much.
One corporation DDOS's your server to death so that you need the other corporations' protection.
basically protection racket
That's a nice website you gots there, would be ashame if something weres to happen to it.
A friend (works in IT, but asks me about server-related things) of a friend (not in tech at all) has an incredibly low-traffic niche forum. It was running really slow (on shared hosting) due to bots. The forum software counts unique visitors per 15 minutes, and it was about 15k/15 min for over a week. I told him to add Cloudflare. It dropped to about 6k/15 min. We experimented with turning Cloudflare off/on and it was pretty consistent. So then I put Anubis on a server I have and they pointed the domain to my server. Traffic dropped to less than 10/15 min. I've been experimenting with toggling Anubis/Cloudflare on/off for a couple of months now with this forum. I have no idea how the bots haven't scraped all of the content by now.
TLDR: in my single isolated test, Cloudflare blocks 60% of crawlers. Anubis blocks presumably all of them.
Also if anyone active on Lemmy runs a low traffic personal site and doesn't know how or can't run Anubis (eg shared hosting), I have plenty of excess resources I can run Anubis for you off one of my servers (in a data center) at no charge (probably should have some language about it not being perpetual, I have the right to terminate without cause for any reason and without notice, no SLA, etc). Be aware that it does mean HTTPS is terminated at my Anubis instance, so I could log/monitor your traffic if I wanted as well, so that's a risk you should be aware of.
It's interesting that anubis has worked so well for you in practice.
What do you think of this guy's take?
I wouldn't be surprised if most bots just don't run any JavaScript so the check always fails
AI scrapers are the new internet DDoS.
Might want to throw something in front of your blog to ward them off, like Anubis or a tarpit.
the one with the quadrillion hits is this bad boy: https://www.babbar.tech/crawler
Babbar.tech is operating a crawler service named Barkrowler which fuels and updates our graph representation of the world wide web. This database and all the metrics we compute with it are used to provide a set of online marketing and referencing tools for the SEO community.
Metrics on what - how much beating can a server take before it commits ritual Sudoku and fries itself?
I run an ecommerce site and lately they've latched onto one very specific product with attempts to hammer its page and any of those branching from it for no readily identifiable reason, at the rate of several hundred times every second. I found out pretty quickly, because suddenly our view stats for that page in particular rocketed into the millions.
I had to insert a little script to IP ban these fuckers, which kicks in if I see a malformed user agent string or if you try to hit this page specifically more than 100 times. Through this I discovered that the requests are coming from hundreds of thousands of individual random IP addresses, many of which are located in Singapore, Brazil, and India, and mostly resolve down into those owned by local ISPs and cell phone carriers.
Of course they ignore your robots.txt as well. This smells like some kind of botnet thing to me.
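A ban rule like the one described might look something like this (a hypothetical sketch, not the commenter's actual code; the rolling window and the user-agent heuristic are my assumptions, the 100-hit threshold comes from the comment):

```python
import re
import time
from collections import defaultdict

HIT_LIMIT = 100   # from the comment: ban after hitting the page 100+ times
WINDOW = 60.0     # rolling window in seconds (assumption; not stated above)
# Crude "well-formed UA" heuristic: real browsers start with "Mozilla/x.y ("
UA_PATTERN = re.compile(r"^Mozilla/\d+\.\d+ \(")

hits = defaultdict(list)   # ip -> recent hit timestamps for the hot page
banned = set()

def should_ban(ip, user_agent, now=None):
    """Ban on a malformed user-agent string, or on hammering the page."""
    if ip in banned:
        return True
    if not UA_PATTERN.match(user_agent):
        banned.add(ip)
        return True
    now = time.monotonic() if now is None else now
    recent = [t for t in hits[ip] if now - t < WINDOW] + [now]
    hits[ip] = recent
    if len(recent) > HIT_LIMIT:
        banned.add(ip)
        return True
    return False
```

In practice you'd hook this into your framework's middleware and push the bans into a firewall set (ipset/nftables) rather than keeping them in process memory, but the logic is the same.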
I don’t really get those bots.
Like, there are bots that are trying to scrape product info, or prices, or scan for quantity fields. But why the hell do some of these bots behave the way they do?
Do you use Shopify by chance? With Shopify the bots could be scraping the product.json endpoint unless it’s disabled in your theme. Shopify just seems to show the updated at timestamp from the db in their headers+product data, so inventory quantity changes actually result in a timestamp change that can be used to estimate your sales.
There are companies that do that and sell sales numbers to competitors.
No idea why they have inventory info on their products table, it’s probably a performance optimization.
I haven’t really done much scraping work in a while, not since before these new stupid scrapers started proliferating.
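The timestamp trick described above boils down to diffing two snapshots of that endpoint (Shopify's `/products.json` really does expose `updated_at` per product; the function and JSON shape here are an illustrative sketch):

```python
import json

def changed_products(snapshot_a, snapshot_b):
    """Given two /products.json payloads fetched at different times, return the
    IDs of products whose updated_at moved - a proxy for stock changes/sales."""
    def index(payload):
        return {p["id"]: p["updated_at"] for p in json.loads(payload)["products"]}
    a, b = index(snapshot_a), index(snapshot_b)
    return [pid for pid in a if pid in b and a[pid] != b[pid]]
```

Poll that daily and you get a rough sales signal per product without ever touching a checkout, which is presumably what those data-reselling outfits do.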
Negative. Our solution is completely home-grown. All artisanal-like, from scratch. I can't imagine I reveal anything anyone would care about much except product specs, and our inventory and pricing really doesn't change very frequently.
Even so, you think someone bothering to run a botnet to hound our site would distribute page loads across all of our products, right? Not just one. It's nonsensical.
Downloading your wallpapers? Lol, what for?
It's 12,181 hits, and the number after the plus sign is robots.txt hits. See the footnote at the bottom of your screenshot.
Phew, so I'm a dumbass and not reading it right. I wonder how they've managed to use 3MB per visit?
- Get a blocklist
- Enable rate limits
- Get a proper robots.txt
- ~~Profit~~ Silence
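For the rate-limit step, a minimal nginx sketch (the zone name, rate, and file paths are illustrative, not a recommendation for your traffic profile):

```nginx
# /etc/nginx/conf.d/ratelimit.conf -- illustrative values
limit_req_zone $binary_remote_addr zone=perip:10m rate=5r/s;

server {
    listen 80;
    location / {
        limit_req zone=perip burst=20 nodelay;  # allow short bursts, then 503
        # blocklist step: a file of "deny 1.2.3.0/24;" lines works here, e.g.
        # include /etc/nginx/blocklist.conf;
    }
}
```

Keep in mind the robots.txt step only helps against crawlers that actually honor it; the rate limit and blocklist are what stop the rest.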
Can you just turn the robots.txt into a click wrap agreement to charge robots high fees for access above a certain threshold?
Why do an agreement when you can serve a zip bomb :D
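Joking aside, the zip-bomb trick works because a run of zeros compresses at roughly 1000:1 under gzip, so a tiny response inflates into something enormous on the client. A standard-library sketch (sizes are illustrative; whether to actually serve this is your call):

```python
import gzip
import io

def make_gzip_bomb(uncompressed_mb=10):
    """Gzip a run of zero bytes; the client pays the decompression cost.
    Served with 'Content-Encoding: gzip', most bots inflate it automatically."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb", compresslevel=9) as f:
        f.write(b"\x00" * (uncompressed_mb * 1024 * 1024))
    return buf.getvalue()

bomb = make_gzip_bomb(10)
print(f"{len(bomb)} bytes on the wire -> {10 * 1024 * 1024} bytes decompressed")
```

Dedicated tarpits go further by nesting compressed streams for far larger ratios, but even a single-layer bomb makes a naive scraper regret the visit.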
Puts the full EU regulations in robots.txt
Check out Anubis. If you have a reverse proxy it's very easy to add, and the bots stopped spamming after I added it to mine.
What is the blog about? It may be increased interest as search providers use them for normal searches now, or it could be a couple of already-sentient doombots.
Please don't be a blog about von Neumann probes. Please don't be a blog about von Neumann probes. Please don't be a blog about von Neumann probes..
What’s wrong with blogs about von Neumann probes? Genuinely curious!
If an AI read it several thousand times... I thought it was a too-on-the-nose joke, sorry.
lol that’s funny. I guess I’m just slow
I want to search for a blog on this now...
Hydrogen bomb vs coughing baby type shit