Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec
(www.businessinsider.com)
That's already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.
Llama.cpp and koboldcpp let anyone run models locally, even with only a CPU if there's no dedicated graphics card available (although more slowly). And there are numerous open-source models available that can be fine-tuned for just about any task.
Hell, you can even run llama.cpp on Android phones.
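For anyone who wants to see how simple it's getting, here's a minimal sketch using the llama-cpp-python bindings to run a quantized model on CPU only. The model path and prompt are placeholders, and the exact options may vary a bit between versions:

```python
# Minimal sketch: CPU-only inference with llama.cpp's Python bindings
# (pip install llama-cpp-python). The GGUF path below is a placeholder --
# point it at whatever quantized model you've downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-7b-model.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,        # context window size
    n_threads=4,       # CPU threads to use
    n_gpu_layers=0,    # 0 = run entirely on CPU
)

output = llm(
    "Q: What's a good beginner-friendly local LLM setup?\nA:",
    max_tokens=128,
    stop=["Q:"],
)
print(output["choices"][0]["text"])
```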
This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for a mobile Internet connection when it comes to looking up information.
Yes, and you can run a language model like Pygmalion AI locally on koboldcpp and have a naughty AI chat as well. Or non-sexual roleplay.
Absolutely, and there are many, many models that have iterated on and surpassed Pygmalion, as well as loads of uncensored models specifically tuned for erotic chat. Steamy roleplay is one of the driving forces behind the rapid development of the technology on lower-powered, local machines.
And where would one look for these sexy sexy AI models, so I can avoid them, of course...
Hugging Face is where the models live. Anything that's uncensored (and preferably based on Llama 2) should work.
Some popular suggestions at the moment might be Hermes-LimaRP-L2 7B and MythoMax-L2 13B for general roleplay that can easily include NSFW.
There are lots of talented people releasing models every day, tuned to assist with coding, translation, roleplay, general assistance (like ChatGPT), writing, all kinds of things, really. Explore and try different models.
General rule: if you don't have a dedicated GPU, stick with 7B models. Otherwise, the bigger the better.
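If it helps, here's a rough sketch of grabbing a quantized GGUF from Hugging Face and loading it, offloading layers to a GPU if you have one. The repo id, filename, and layer count are just examples; check the model's page for the actual quantized files:

```python
# Sketch: download a quantized GGUF from Hugging Face and load it.
# The repo id and filename are examples -- browse the model's "Files"
# tab on Hugging Face for the quant you actually want.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="TheBloke/MythoMax-L2-13B-GGUF",   # example repo
    filename="mythomax-l2-13b.Q4_K_M.gguf",    # example quant file
)

llm = Llama(
    model_path=model_path,
    n_ctx=4096,
    n_gpu_layers=35,   # offload layers to the GPU; set 0 for CPU-only (stick to 7B there)
)

result = llm("Write the opening line of a fantasy roleplay.", max_tokens=64)
print(result["choices"][0]["text"])
```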