this post was submitted on 20 Nov 2023
133 points (100.0% liked)
Technology
you are viewing a single comment's thread
Nothing that runs on my GPU/CPU comes even close to GPT-3.5, and GPT-4 isn't even in the same universe. And that's with the local models running far more slowly.
In my tests, the self-hosted options with access to a 30xx or 40xx graphics card return results far faster than GPT-4.
Which model are you talking about?
Mistral, as a ChatGPT replacement. And I'm not saying it gives better answers, just that it's much faster than my web portal to GPT-4.
Oh, faster is easy. GPT-3.5 is also far faster than GPT-4. Faster at quality replies is the issue.
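For anyone who wants to put a number on "faster", here is a minimal Python sketch of a tokens-per-second benchmark. The `generate` callable is a stand-in for whichever backend you're timing (a local Mistral, or a web API wrapper); `fake_generate` below is a purely hypothetical stub, not a real model binding.

```python
import time

def tokens_per_second(generate, prompt, n_runs=3):
    """Rough throughput benchmark: call `generate(prompt)` n_runs times
    and return generated tokens per second (whitespace-split, so only
    an approximation of real tokenizer counts)."""
    total_tokens = 0
    total_time = 0.0
    for _ in range(n_runs):
        start = time.perf_counter()
        text = generate(prompt)
        total_time += time.perf_counter() - start
        total_tokens += len(text.split())  # crude token count
    return total_tokens / total_time

# Hypothetical stub standing in for a local model or an API call.
def fake_generate(prompt):
    time.sleep(0.01)  # pretend inference latency
    return "word " * 50

rate = tokens_per_second(fake_generate, "Hello")
print(f"{rate:.1f} tokens/sec")
```

Note this only measures raw speed, not answer quality, which is the distinction being made above.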