brickfrog

joined 2 years ago
[–] brickfrog@lemmy.dbzer0.com 13 points 1 month ago* (last edited 1 month ago) (15 children)

Not overly active, but there are a few communities you could join if you like:

!OpenSignups@lemmy.dbzer0.com

!Opensignups@lemmy.ml

!Opensignups@noworriesto.day

https://opentrackers.org/ is also a good site to keep an eye on (though it seems to be less active at the moment).

[–] brickfrog@lemmy.dbzer0.com 9 points 1 month ago (2 children)

Not sure which country you're in, but in the U.S. I haven't seen many gift cards that are contactless tap-to-pay, so you would want to double-check. Without built-in tap-to-pay, that type of card would need to be added to a phone wallet app (Google Wallet / Apple Pay) before you could tap-to-pay with it.

It's possible that tap-to-pay gift cards are more common outside the U.S.

Or if you're talking about store gift cards, the same applies: most of those aren't tap-to-pay either, so you'd want to double-check.

[–] brickfrog@lemmy.dbzer0.com 1 points 1 month ago (1 children)

Prime95 and Memtest86+ both run under Linux, so no issues there.

You could also run that sort of app off a Linux boot USB, or off one of the Linux-based diagnostic boot USB projects. I like SystemRescue (https://www.system-rescue.org/System-tools), but there are plenty of others you can check out. SystemRescue includes memtest86+ / stress / stress-ng / stressapptest for system stress testing, so that could be something to try.
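If you'd rather script the burn-in than run the tools by hand, something like this works - just a sketch that shells out to stress-ng (which ships with SystemRescue); the duration and stressor settings are arbitrary examples:

```python
# Minimal burn-in sketch: shells out to stress-ng, which ships with
# SystemRescue. Duration and stressor choices are arbitrary examples.
import subprocess

def burn_in(minutes: int = 30) -> None:
    # --cpu 0 spawns one worker per CPU core; matrixprod is a
    # computation-heavy method that generates plenty of heat.
    subprocess.run(
        ["stress-ng",
         "--cpu", "0", "--cpu-method", "matrixprod",
         "--timeout", f"{minutes}m",
         "--metrics-brief"],
        check=True,  # raise if stress-ng exits with an error
    )

if __name__ == "__main__":
    burn_in(30)
```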

[–] brickfrog@lemmy.dbzer0.com 1 points 1 month ago (3 children)

Yup, like others said, the lack of a CPU cooler is definitely the problem here. CPUs heat up quickly, and once they hit their thermal limit the system will shut down to try to avoid hardware failure. Hopefully the CPU wasn't damaged from repeatedly overheating while you were testing without cooling... it might be okay, but the only way to know for sure is to properly install a cooler and then test.

Once you've got it going I'd suggest doing a burn-in test just to be sure the CPU will last. It's been a while since I've done a build, but usually I'd run something like Prime95 to be sure the CPU and cooling are stable.

[–] brickfrog@lemmy.dbzer0.com 1 points 1 month ago

Ah yeah, I saw that one, but I don't think it does quite what OP wants. It seems more like it's designed to monitor a running qBittorrent client and then copy the .torrent file(s) to Transmission, with all torrent data in the same data folder. That might not help much for OP with all the different data folders they have in their current setup.

> My concept is as such: have a shared folder where everything is moved after download. I call this /mnt/torrents.
>
> The script provided that makes all of this happen is a python script. It queries the qBittorrent client for uploading or completed downloads, checks to see if they are private or public torrents, then copies the .torrent files to the respective "watched" directory of the public or private (transmission) client. It just copies the .torrent files to directories, so it should be usable with other torrent clients that have "watched" directories.

But either way, nice effort! I'm kind of surprised at the lack of scripts to import torrents into Transmission. The only related script I could find goes Transmission --> qBittorrent, and it doesn't seem to do the reverse: https://github.com/Mythologyli/transmission-to-qbittorrent (rough sketch of the quoted flow below, for reference)
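For anyone curious, the quoted flow would look roughly like this. This is my own illustration, not the author's actual script - it assumes the qbittorrent-api package (pip install qbittorrent-api), made-up paths/credentials, and the is_private field, which only newer qBittorrent versions expose (on older ones you'd inspect the torrent's trackers instead):

```python
# Rough sketch of the quoted flow: ask qBittorrent for completed torrents,
# sort them into private/public watch folders. All paths, credentials, and
# the watch-folder layout are placeholders.
import shutil
from pathlib import Path

import qbittorrentapi

BT_BACKUP = Path.home() / ".local/share/qBittorrent/BT_backup"  # qBittorrent's .torrent store
PRIVATE_WATCH = Path("/mnt/torrents/watch-private")  # private client's watch dir
PUBLIC_WATCH = Path("/mnt/torrents/watch-public")    # public client's watch dir

client = qbittorrentapi.Client(host="localhost:8080",
                               username="admin", password="adminadmin")

for torrent in client.torrents_info(status_filter="completed"):
    props = client.torrents_properties(torrent_hash=torrent.hash)
    # is_private is only present on newer qBittorrent builds (assumption);
    # missing key falls through to the public watch folder here.
    dest = PRIVATE_WATCH if props.get("is_private") else PUBLIC_WATCH
    src = BT_BACKUP / f"{torrent.hash}.torrent"  # BT_backup names files by infohash
    if src.exists():
        shutil.copy2(src, dest / src.name)
```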

[–] brickfrog@lemmy.dbzer0.com 2 points 1 month ago* (last edited 1 month ago) (2 children)

> and even then, I tried one and for some reason it wouldn’t verify my downloaded files and insisted on redownloading the torrent from scratch. Even though I had made sure I was pointing to the correct directory. This may be because I’ve renamed files in the past

That should work fine... I suspect it failed because of the renaming, like you said. Make sure Transmission is adding torrents in paused mode, then do another test with a torrent you definitely didn't rename. Maybe do a test download in qBittorrent and then attempt to add it into Transmission; a Linux Mint torrent or similar is usually a safe test: https://www.linuxmint.com/edition.php?id=319

Because of how you have your torrents organized, it does sound like you'll need to tough it out and add and configure each torrent manually.

It would be easier if you had all the torrent data saved in the same folder(s). In that case you'd just configure Transmission to add torrents in paused mode, configure a watch folder, copy your qBittorrent .torrent files into that watch folder, and finally do a re-check in Transmission and start all the torrents. Then you'd hardlink the torrent data out into your own nested folders however you want them set up, so the same data exists and is linked in two places (the torrent data folder and your own folders) - see the sketch below. Maybe it's something to consider for your future configuration, but it's not going to help you much right now.
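A hardlink pass over a finished download might look something like this - just a sketch with made-up paths, and note that hardlinks only work within a single filesystem:

```python
# Sketch of the hardlink idea: mirror a finished download into your own
# folder layout without duplicating any data. Paths are made up, and both
# sides must live on the same filesystem for hardlinks to work.
import os
from pathlib import Path

def hardlink_tree(src: Path, dest: Path) -> None:
    """Recreate src's directory structure under dest, hardlinking each file."""
    for path in src.rglob("*"):
        target = dest / path.relative_to(src)
        if path.is_dir():
            target.mkdir(parents=True, exist_ok=True)
        else:
            target.parent.mkdir(parents=True, exist_ok=True)
            # Same inode: the seeding copy and the library copy share the data.
            os.link(path, target)

hardlink_tree(Path("/mnt/torrents/Some.Release"),
              Path("/mnt/media/Movies/Some.Release"))
```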

For now, yeah, the best you could do is set Transmission to add torrents in paused mode, configure a watch folder, copy your current qBittorrent .torrent files over, and then in Transmission change each torrent's data location and re-check, one by one. Not sure if that's any faster than just adding the torrents manually one-by-one :/

You should be able to find the current .torrent files wherever macOS saves your qBittorrent data - look for a folder like qBittorrent/BT_backup; all the .torrent files in BT_backup are the torrents currently loaded in qBittorrent.
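The bulk copy itself is trivial to script - here's a sketch assuming the usual macOS qBittorrent data path (double-check it on your machine) and a made-up watch folder:

```python
# Copy every loaded .torrent file into Transmission's watch folder.
# BT_backup location assumes a typical macOS install; adjust both paths.
import shutil
from pathlib import Path

BT_BACKUP = Path.home() / "Library/Application Support/qBittorrent/BT_backup"
WATCH_DIR = Path.home() / "transmission-watch"  # whatever you configured in Transmission

WATCH_DIR.mkdir(exist_ok=True)
for torrent_file in BT_BACKUP.glob("*.torrent"):
    shutil.copy2(torrent_file, WATCH_DIR / torrent_file.name)
```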

With some luck maybe you can find a tool that does qBittorrent --> Transmission migrations? I'm not sure any exist; all I can find are tools that go Transmission --> qBittorrent, e.g. https://github.com/undertheironbridge/transmission2qbt

(note I'm not on macOS, so maybe someone else has more direct advice to offer)

[–] brickfrog@lemmy.dbzer0.com 4 points 1 month ago

> and then aggressively super-seed that content back out.

What exactly do you mean by super-seed? In torrent clients there is indeed something called super-seeding (aka initial seeding), but it does quite the opposite of "aggressively" seeding anything. The whole point of super-seeding is to encourage other peers/leechers to share data amongst themselves and hopefully become seeds themselves. That lets your own torrent client avoid uploading torrent data to the swarm more than necessary - and minimizing uploaded data is the opposite of building ratio.

https://en.wikipedia.org/wiki/Super-seeding

https://www.bittorrent.org/beps/bep_0016.html

It might be that you meant aggressively seeding on an internet pipe with high upload bandwidth, e.g. one of those 20 Gbps seedboxes or similar - that would make sense.

[–] brickfrog@lemmy.dbzer0.com 11 points 1 month ago* (last edited 1 month ago)

The vast majority of private trackers do not have a "hard" ratio economy like you describe. Most private trackers are flexible and give users ways to increase their own upload ratio without requiring that ratio to be "paid" by another user doing the downloading. For example, when torrents are freeleech, users get to download for free but can still upload to improve their own ratio. Where bonus systems are in place, bonus points can be spent to add to a user's own uploaded data count. And sometimes private trackers run events where the entire tracker, or entire categories of torrents, go freeleech, so a whole ton of users get to download for free and can still seed those same torrents afterwards.
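To make the arithmetic concrete, here's a toy illustration with invented numbers - grabbing data during freeleech doesn't add to your counted download, but everything you seed back afterwards still counts as upload:

```python
# Toy ratio math with invented numbers: freeleech downloads don't count
# against you, so the "1 GB up must be 1 GB down somewhere" framing breaks.
def ratio(uploaded_gib: float, counted_download_gib: float) -> float:
    if counted_download_gib == 0:
        return float("inf")  # nothing counted against you yet
    return uploaded_gib / counted_download_gib

# 50 GiB grabbed during a freeleech event, 10 GiB seeded back so far:
print(ratio(10.0, 0.0))    # inf - the freeleech download never counted
print(ratio(10.0, 50.0))   # 0.2 - what it would have been without freeleech
```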

> does that mean that there are some users who will forever be below 1, and thus end up getting kicked out, thus resulting in the private tracker just… shrinking over time?

Sure, that could happen too. Private trackers will always get some users who just aren't going to cut it and will eventually lose access. In most cases the tracker will simply add new users and maintain the total user count. Each tracker approaches this differently... I think the user churn lessens over time: at some point there are enough users doing fine with ratio and whatnot while the tracker hits its own maximum user count, so replacing users with new signups becomes less of a priority.

[–] brickfrog@lemmy.dbzer0.com 5 points 1 month ago

Agree with you - SO is great for finding info. There are solutions on there for niche problems that I haven't been able to find elsewhere: the type of thing where someone actually took the time to type out a step-by-step answer, and it's now there and searchable on SO. It's a bummer that so many people seem to hate on the site nowadays.

And let's not forget the whole reason SO came about in the first place: back then, web results were littered with question/answer links to sites like Experts-Exchange. I hated trying to figure out if an answer was on there - most of the time you ended up with a link to a question that looked like it had an answer, but oh no, you needed to subscribe to view an answer that may or may not exist.

[–] brickfrog@lemmy.dbzer0.com 4 points 1 month ago* (last edited 1 month ago)

Core 2 Duos are slow, yeah. I've got an Asus F8SP-X1 laptop from ~2008 with a Core 2 Duo T9500, 4 GB RAM, and a SATA SSD in it. It was originally a mid-range Windows Vista system, and over the years I managed to upgrade it as far as it could go. It does run standard Ubuntu and Windows 10 - certainly not fast, but it runs. Performance would lean towards unbearable without the SSD. I suspect GNOME isn't doing it any favors, and switching to a lighter DE or distro would help (or maybe just ditching the DE altogether), but since it's just a spare laptop it's no big deal.

One of the takeaways from your experiment is that if the system was already crap at running Windows 10, it's not necessarily going to fare better with Linux, at least if you're expecting a nice desktop environment. I don't know that in 2025 we still need the "will this run Linux?" challenge on old Windows XP/7-era hardware, aside from the geek/techie users who want to do something with that old hardware. Anyone non-technical stuck with that type of hardware isn't thinking about Windows 10 being retired.

[–] brickfrog@lemmy.dbzer0.com 3 points 1 month ago

OP can go from Comcast Xfinity to 2 gig fiber - seems like a good call. Hell, I'm jealous; I'm still stuck with the Comcast Xfinity cable shit where I'm at.

[–] brickfrog@lemmy.dbzer0.com 15 points 1 month ago* (last edited 1 month ago)

You may as well call them and ask. The main things you want to find out are what plans/prices they offer and whether they have any data caps. And if it's still under construction, definitely ask to be put on their list of interested customers.

Honestly, just about anything fiber is going to be an improvement over Comcast cable internet... if I were you I'd at least ask whether they have a 1 gig download/upload plan and work from there. Good luck!
