You can also self host it: https://github.com/schlagmichdoch/pairdrop
If you're "sorry for politics" why not post it in a political meme community? We posted corn over this.
Shh. You're stepping on the non-American throwing out ragebait in US news and politics with your pesky follow-up questions.
[Holds hand 1mm from your face]
I'm not touching you, I'm not touching you, I'm not touching you. Filter it out, we're in public.
This probably isn't the answer you want to hear, but yes, I still consider it slop.
Not everyone is an artist, and that's okay. Just do your best, and even the worst chicken-scratch doodles are better than what any AI craps out.
Lol, but I have a hard time believing O'Brien didn't prepare a backup power supply for his pattern buffer.
O'Brien when the power is out but he can't just go fix it himself:

So, I set this up recently and agree with all of your points about the actual integration being glossed over.
I already had bot detection set up in my Nginx config, so adding Nepenthes was just a matter of changing what that detection does. Previously, I returned either 404 or 444 to those requests; now it redirects them to Nepenthes.
Rather than trying to do rewrites and pretend the Nepenthes content is under my app's URL namespace, I just do a redirect, which the bot crawlers tend to follow just fine.
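For reference, the old behavior was basically this (a minimal sketch; $ua_disallowed is the variable set by the map file shown further down):
if ($ua_disallowed) {
    # Old behavior: 444 closes the connection without sending any response
    return 444;
}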
There are several parts to this to keep my config sane. Each of them lives in an include file:
- An include file that looks at the user agent, compares it to a list of bot UA regexes, and sets a variable to either 0 or 1. By itself, that include file doesn't do anything more than set that variable. This allows me to have it as a global config without having it apply to every virtual host.
- An include file that performs the action if the variable is set to true. This has to be included in the server portion of each virtual host where I want the bot traffic to go to Nepenthes. If this isn't included in a virtual host's server block, then bot traffic is allowed.
- A virtual host where the Nepenthes content is presented. I run a subdomain (content.mydomain.xyz). You could also do this as a path off of your protected domain, but this works for me and keeps my already complex config from getting any worse. Plus, it was easier to integrate into my existing bot config. Had I not already had that, I would have run it off of a path (and may go back and do that when I have time to mess with it again).
The map-bot-user-agents.conf is included in the http section of Nginx and applies to all virtual hosts. You can either include this in the main nginx.conf or at the top (above the server section) in your individual virtual host config file(s).
The deny-disallowed.conf is included individually in each virtual host's server section. Even though the bot detection is global, if the virtual host's server section does not include the action file, then nothing is done.
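To make the wiring concrete, here's roughly how the pieces fit together (a sketch only; the /etc/nginx/snippets/ paths and the myapp.mydomain.xyz name are just placeholders, adjust for your own layout):
# In nginx.conf (http context), or above the server blocks in a vhost file
include /etc/nginx/snippets/map-bot-user-agents.conf;
# In each virtual host you want protected
server {
    listen 443 ssl;
    server_name myapp.mydomain.xyz;
    # Without this include, bot traffic to this vhost is allowed through
    include /etc/nginx/snippets/deny-disallowed.conf;
    # ... rest of the vhost config ...
}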
Files
map-bot-user-agents.conf
Note that I'm treating Google's crawler the same as an AI bot because... well, it is. They're abusing their search position by double-dipping on the crawler, so you can't opt out of being crawled for AI training without also preventing it from crawling you for search-engine indexing. Depending on your needs, you may need to comment that out. I've also commented out the Python requests user agent. And forgive the mess at the bottom of the file; I inherited the seed list of user agents and haven't cleaned up that massive regex one-liner.
# Map bot user agents
## Sets the $ua_disallowed variable to 0 or 1 depending on the user agent. Non-bot UAs are 0, bots are 1
map $http_user_agent $ua_disallowed {
default 0;
"~PerplexityBot" 1;
"~PetalBot" 1;
"~applebot" 1;
"~compatible; zot" 1;
"~Meta" 1;
"~SurdotlyBot" 1;
"~zgrab" 1;
"~OAI-SearchBot" 1;
"~Protopage" 1;
"~Google-Test" 1;
"~BacklinksExtendedBot" 1;
"~microsoft-for-startups" 1;
"~CCBot" 1;
"~ClaudeBot" 1;
"~VelenPublicWebCrawler" 1;
"~WellKnownBot" 1;
#"~python-requests" 1;
"~bitdiscovery" 1;
"~bingbot" 1;
"~SemrushBot" 1;
"~Bytespider" 1;
"~AhrefsBot" 1;
"~AwarioBot" 1;
# "~Poduptime" 1;
"~GPTBot" 1;
"~DotBot" 1;
"~ImagesiftBot" 1;
"~Amazonbot" 1;
"~GuzzleHttp" 1;
"~DataForSeoBot" 1;
"~StractBot" 1;
"~Googlebot" 1;
"~Barkrowler" 1;
"~SeznamBot" 1;
"~FriendlyCrawler" 1;
"~facebookexternalhit" 1;
"~*(?i)(80legs|360Spider|Aboundex|Abonti|Acunetix|^AIBOT|^Alexibot|Alligator|AllSubmitter|Apexoo|^asterias|^attach|^BackDoorBot|^BackStreet|^BackWeb|Badass|Bandit|Baid|Baiduspider|^BatchFTP|^Bigfoot|^Black.Hole|^BlackWidow|BlackWidow|^BlowFish|Blow|^BotALot|Buddy|^BuiltBotTough|
^Bullseye|^BunnySlippers|BBBike|^Cegbfeieh|^CheeseBot|^CherryPicker|^ChinaClaw|^Cogentbot|CPython|Collector|cognitiveseo|Copier|^CopyRightCheck|^cosmos|^Crescent|CSHttp|^Custo|^Demon|^Devil|^DISCo|^DIIbot|discobot|^DittoSpyder|Download.Demon|Download.Devil|Download.Wonder|^dragonfl
y|^Drip|^eCatch|^EasyDL|^ebingbong|^EirGrabber|^EmailCollector|^EmailSiphon|^EmailWolf|^EroCrawler|^Exabot|^Express|Extractor|^EyeNetIE|FHscan|^FHscan|^flunky|^Foobot|^FrontPage|GalaxyBot|^gotit|Grabber|^GrabNet|^Grafula|^Harvest|^HEADMasterSEO|^hloader|^HMView|^HTTrack|httrack|HTT
rack|htmlparser|^humanlinks|^IlseBot|Image.Stripper|Image.Sucker|imagefetch|^InfoNaviRobot|^InfoTekies|^Intelliseek|^InterGET|^Iria|^Jakarta|^JennyBot|^JetCar|JikeSpider|^JOC|^JustView|^Jyxobot|^Kenjin.Spider|^Keyword.Density|libwww|^larbin|LeechFTP|LeechGet|^LexiBot|^lftp|^libWeb|
^likse|^LinkextractorPro|^LinkScan|^LNSpiderguy|^LinkWalker|msnbot|MSIECrawler|MJ12bot|MegaIndex|^Magnet|^Mag-Net|^MarkWatch|Mass.Downloader|masscan|^Mata.Hari|^Memo|^MIIxpc|^NAMEPROTECT|^Navroad|^NearSite|^NetAnts|^Netcraft|^NetMechanic|^NetSpider|^NetZIP|^NextGenSearchBot|^NICErs
PRO|^niki-bot|^NimbleCrawler|^Nimbostratus-Bot|^Ninja|^Nmap|nmap|^NPbot|Offline.Explorer|Offline.Navigator|OpenLinkProfiler|^Octopus|^Openfind|^OutfoxBot|Pixray|probethenet|proximic|^PageGrabber|^pavuk|^pcBrowser|^Pockey|^ProPowerBot|^ProWebWalker|^psbot|^Pump|python-requests\/|^Qu
eryN.Metasearch|^RealDownload|Reaper|^Reaper|^Ripper|Ripper|Recorder|^ReGet|^RepoMonkey|^RMA|scanbot|SEOkicks-Robot|seoscanners|^Stripper|^Sucker|Siphon|Siteimprove|^SiteSnagger|SiteSucker|^SlySearch|^SmartDownload|^Snake|^Snapbot|^Snoopy|Sosospider|^sogou|spbot|^SpaceBison|^spanne
r|^SpankBot|Spinn4r|^Sqworm|Sqworm|Stripper|Sucker|^SuperBot|SuperHTTP|^SuperHTTP|^Surfbot|^suzuran|^Szukacz|^tAkeOut|^Teleport|^Telesoft|^TurnitinBot|^The.Intraformant|^TheNomad|^TightTwatBot|^Titan|^True_Robot|^turingos|^TurnitinBot|^URLy.Warning|^Vacuum|^VCI|VidibleScraper|^Void
EYE|^WebAuto|^WebBandit|^WebCopier|^WebEnhancer|^WebFetch|^Web.Image.Collector|^WebLeacher|^WebmasterWorldForumBot|WebPix|^WebReaper|^WebSauger|Website.eXtractor|^Webster|WebShag|^WebStripper|WebSucker|^WebWhacker|^WebZIP|Whack|Whacker|^Widow|Widow|WinHTTrack|^WISENutbot|WWWOFFLE|^
WWWOFFLE|^WWW-Collector-E|^Xaldon|^Xenu|^Zade|^Zeus|ZmEu|^Zyborg|SemrushBot|^WebFuck|^MJ12bot|^majestic12|^WallpapersHD)" 1;
}
deny-disallowed.conf
# Deny disallowed user agents
if ($ua_disallowed) {
# This redirects them to the Nepenthes domain. So far, pretty much all the bot crawlers have been happy to accept the redirect and crawl the tarpit continuously
return 301 https://content.mydomain.xyz/;
}
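For completeness, the Nepenthes virtual host itself is just a reverse proxy in front of wherever Nepenthes is listening (sketch only; the 127.0.0.1:8893 upstream is a placeholder for your actual Nepenthes address/port, and the TLS config is omitted):
server {
    listen 443 ssl;
    server_name content.mydomain.xyz;
    # ssl_certificate / ssl_certificate_key go here
    location / {
        # Placeholder upstream; point this at your Nepenthes instance
        proxy_pass http://127.0.0.1:8893;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}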
Is the bootloader unlockable?
Yeah, I wanted to buy one, but by the time I was in a position to, they were gone. Ended up getting a Fusion Hybrid instead. Love it, but I wish I'd bought the plug-in version.
This is mostly adding on to another reply, but there are two types of hybrid drivetrains:
- Parallel: Both the internal combustion engine and the electric motors are coupled to the drivetrain and either "share the load" or operate independently depending on demand (electric only, engine only, or both simultaneously). This is the most common type and is seen in the Toyota hybrids (Prius, Camry Hybrid, etc.) and in Ford's previous hybrids (Ford and Toyota cross-license a lot of their hybrid drivetrain tech, so this makes sense).
- Series: The drivetrain is fully electric, and the internal combustion engine only drives a generator to charge the battery and send power to the traction motor. The Chevy Volt is (well, was) the only true series hybrid I can think of right now (not to be confused with the Chevy Bolt, which is an EV).
For all intents and purposes, these gas-powered range-extended trucks are just series hybrids. I think the main differentiator is that the traction battery and generator portion are a bit larger.
My record is 9.9 years (and going). This is my old ThinkPad T420, which lives on my equipment rack to act as an SSH / web console for my equipment. I just close the lid and put it to sleep when it's not in use. It doesn't even connect to the internet, just my isolated management VLAN.
My HomeAssistant server (also an old ThinkPad) is the next longest at just under a year. It also lives on an isolated VLAN.
Both of these are repurposed laptops with batteries in good condition, so they have a built-in UPS (in addition to the UPS they're plugged into).
The rest average about 4-7 months, depending on power outages (rare, but when they do occur, they last longer than my UPS can cover) and rebooting for kernel updates.