CPUs don't run LLMs at any usable speed.
Even if you get a 3090 with 24GB of VRAM, you're going to load the biggest model you can and realize it's useless for most tasks. With less than that, I don't even know what you would use it for.
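To put a number on what 24GB buys you: weight size is just parameter count times bits per weight, plus a little headroom for the KV cache. A rough sketch (the model sizes, quant levels, and overhead figure are just illustrative assumptions):

    # Rough VRAM check: weights take params * bits_per_weight / 8 bytes,
    # plus a couple of GB allowance for KV cache and CUDA overhead (rough).
    def fits_in_24gb(params_billions, bits_per_weight, overhead_gb=2.0):
        weights_gb = params_billions * bits_per_weight / 8  # billions of params -> GB
        return weights_gb + overhead_gb <= 24.0

    for params_b, bpw in [(7, 16), (13, 8), (33, 4), (70, 4)]:
        verdict = "fits" if fits_in_24gb(params_b, bpw) else "does not fit"
        print(f"{params_b}B at {bpw} bpw: {verdict} in 24GB")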
If you guys really want to make people care about using crypto for payments, implement Dai Hard on ETH.
https://medium.com/@coinop.logan/daihard-game-theory-21a456ef224e
Basically, it creates a scenario where people can trust each other for anonymous transactions because both parties are highly incentivized to act fairly. This is accomplished by having both parties make a deposit that can be burned if things go south.
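A toy sketch of the incentive structure (just an illustration of the burnable-deposit idea, not the actual DAIHard contract, and the numbers are made up):

    # Toy payoff model of a burnable-deposit trade (illustrative only; not
    # the actual DAIHard contract, and the numbers are made up).
    def outcome(trade_gain, deposit, both_act_fairly):
        """Each party locks `deposit`. A fair trade returns the deposit plus
        the gain from trading; if either side cheats, the other can burn both
        deposits, leaving everyone strictly worse off."""
        if both_act_fairly:
            return trade_gain    # deposits returned, trade completed
        return -deposit          # deposits burned, no trade

    # As long as the deposit exceeds anything a cheater could hope to grab,
    # acting fairly dominates.
    print(outcome(trade_gain=20, deposit=150, both_act_fairly=True))   # 20
    print(outcome(trade_gain=20, deposit=150, both_act_fairly=False))  # -150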
Even if you just make an alias in pfSense and block its traffic, you're fine.
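Under the hood a pfSense alias just becomes a pf table, so the generated ruleset looks roughly like this (the alias name and addresses are placeholders):

    # Rough pf.conf equivalent of an alias plus a block rule; pfSense builds
    # this from the GUI, and the name/addresses here are placeholders.
    table <blocked_hosts> { 192.0.2.10, 192.0.2.11 }
    block drop quick from any to <blocked_hosts>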
12.8? That's nothing.
Very few people on the planet face a threat model where memorizing a seed phrase is worthwhile. Leaving a seed phrase accessible long enough to memorize it considerably increases the likelihood of it being compromised. Locking it up and hiding it immediately greatly reduces this possibility.
You are right about it not being difficult to memorize a seed phrase. It is not hard to do many things that are unnecessary and unlikely to be useful. Only an idiot would fail to realize that such an exercise is essentially pointless, and then proceed to brag about his ability to memorize a few words, while simultaneously demonstrating his inability to reason about basic things.
The only thing Ethereum accomplished is becoming superior money. I wish we could say we had another significant use-case.
For some people "uncensored" means it hasn't been lobotomized, but for others it means it can write porn.
Nvidia is the only game in town right now. I decided on a 3090 for the time being, with the option of adding another one later. I think in two years we will have 100x better options specifically tailored for AI.
It's Notepad with extra steps.
I think it's overkill and you're going to be sitting at like 10-15% CPU when it's done. But for projects like this, saving a few bucks isn't worth it. You need room to grow.
With a dedicated 3090 (another card for the OS), a 34B model at 5bpw just fits and runs very fast, like 10-20 t/s. The quality is good for my application, but I'm not coding.
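For what it's worth, the "just fits" part checks out on paper (the cache/overhead allowance below is a rough assumption):

    # Why a 34B model at 5 bits per weight just squeezes into 24GB
    # (the cache/overhead allowance is a rough assumption).
    weights_gb = 34e9 * 5 / 8 / 1e9      # ~21.3 GB of quantized weights
    cache_and_overhead_gb = 2.0          # KV cache for a few-K context + CUDA overhead
    print(f"{weights_gb + cache_and_overhead_gb:.1f} GB needed vs 24 GB on a 3090")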