this post was submitted on 02 Oct 2024
335 points (91.6% liked)
Technology
Honestly, I don't really get it either. I'd only use the open-source models anyway, and from what I can tell it just seems rather silly.
I feel like the last few months have been an inflection point, at least for me. Qwen 2.5 and the new Command-R really make a 24GB GPU feel "dumb, but smart," useful enough that I pretty much always keep Qwen 32B loaded on the desktop for its sheer utility.
It's still in the realm of enthusiast hardware (aka a used 3090), but hopefully that's about to be shaken up by bitnet and some upcoming hardware from AMD/Intel.
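For anyone wondering why a 32B model fits on a 24GB card at all: it comes down to quantization. A rough back-of-envelope sketch (my numbers, not the commenter's; ~4.5 bits per weight is a typical figure for a 4-bit quant including overhead, and the KV cache allowance is a guess that grows with context length):

```python
# Rough VRAM estimate for a 32B-parameter model under 4-bit quantization.
# All figures are illustrative assumptions, not measured values.
params = 32e9              # 32 billion parameters
bits_per_weight = 4.5      # ~4-bit quant plus scales/overhead (assumed)
weight_gb = params * bits_per_weight / 8 / 1e9   # bits -> bytes -> GB
kv_cache_gb = 3.0          # rough allowance for KV cache and activations
total_gb = weight_gb + kv_cache_gb
print(f"weights: {weight_gb:.1f} GB, total: ~{total_gb:.1f} GB")
# -> weights: 18.0 GB, total: ~21.0 GB
```

So ~21 GB all-in, which squeezes into a 24GB GPU with a little headroom, while an unquantized fp16 copy of the same model (~64 GB of weights alone) would not come close.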
Altman is literally a vampire though, and thankfully I think he's going to burn OpenAI to the ground.
What do you think about the possibility of decentralized AI via blockchain? You'd pay tokens (or something like that) to rent GPUs and run your AI for as long as you wish, instead of having to buy all the hardware and assemble it yourself.
Isn't that just cloud computing but with extra steps?