So, real quick: I've been exploring local LLMs for a bit. In this video I get into what I think is the future for LLMs, but in a nutshell I think Microsoft will eventually push a local LLM out to machines to cut down on a lot of resources and cost. In doing so, it will likely become possible for developers to tap into that local LLM for their games.

The worries I've seen brought up are:

  1. Spoilers - As mentioned in the video, it is currently possible, and should always be possible, to solve this by controlling what gets sent to the LLM: the LLM can't talk about what it doesn't know (rough sketch after this list).
  2. The NPC talks about stuff it shouldn't - Fine-tuning solves this to a large degree. The better you prep the model, the less likely it is to go off script, and even more so depending on how you handle things in your own code.
  3. Story lines shouldn't be dynamic - The answer to this is simple: don't use the LLM for those lines or for a given NPC.
  4. Cost - Assuming I'm right that Microsoft and others will add a local LLM, the local part removes this problem, since inference runs on the player's machine instead of a paid API.
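
To make worries 1 and 2 concrete, here's a minimal sketch of what I mean. It is purely illustrative: `query_local_llm` is a made-up stand-in for whatever local-LLM API the OS or engine would actually expose, and the NPC names are invented.

```python
# Minimal sketch, not an actual implementation: the game only ever sends the
# lore the player has already unlocked, so the model can't spoil anything,
# and a strict persona prompt keeps the NPC on script.

def query_local_llm(prompt: str) -> str:
    """Placeholder for the assumed built-in local LLM call (hypothetical)."""
    raise NotImplementedError

def build_npc_prompt(npc_name: str, persona: str,
                     known_lore: list[str], player_line: str) -> str:
    # Only lore flagged as discovered is included; hidden plot points never
    # reach the model, so it cannot leak them.
    lore_block = "\n".join(f"- {fact}" for fact in known_lore)
    return (
        f"You are {npc_name}. {persona}\n"
        "Only use the facts listed below. If asked about anything else, "
        "say you don't know.\n"
        f"Known facts:\n{lore_block}\n\n"
        f"Player: {player_line}\n{npc_name}:"
    )

prompt = build_npc_prompt(
    npc_name="Mara the blacksmith",
    persona="Gruff but fair. Never breaks character.",
    known_lore=["The old mine closed after the cave-in.",
                "The baron raised taxes last spring."],
    player_line="What happened at the mine?",
)
# reply = query_local_llm(prompt)
```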

https://www.youtube.com/watch?v=N31x4qHBsNM

It is also possible to have a given NPC show different emotions, and to direct those emotions, as shown here where I tested it with anger.

https://www.youtube.com/shorts/5mPjOLT7H-Q
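
This isn't the exact code from the clip, just a rough sketch of how the emotion direction could be wired up. The guard and mood strings are made up, and `query_local_llm` is the same hypothetical placeholder as in the sketch above.

```python
# The game tracks an emotion state per NPC and injects a matching
# instruction into the prompt each turn.

EMOTION_DIRECTIONS = {
    "angry":   "You are furious. Answer in short, clipped sentences.",
    "fearful": "You are scared. Hesitate and try to change the subject.",
    "calm":    "You are relaxed and friendly.",
}

def npc_turn(base_prompt: str, emotion: str, player_line: str) -> str:
    direction = EMOTION_DIRECTIONS.get(emotion, EMOTION_DIRECTIONS["calm"])
    return f"{base_prompt}\nCurrent mood: {direction}\nPlayer: {player_line}\nNPC:"

# e.g. the player insults the guard, so the game flips the state to "angry":
prompt = npc_turn("You are a city guard at the north gate.",
                  emotion="angry",
                  player_line="Out of my way, fool.")
# reply = query_local_llm(prompt)
```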

thewayupisdown@alien.top · 10 months ago

(I only lurk here, so apologies for any mistakes or dumb ideas.) Anyway, I spent some time trying to get a text-based fan-fiction game set in the Disco Elysium universe working in GPT-4 back when it was still only 8k (?) tokens. I got it to keep track of the sympathy level between Harry and Kim, and to have the voices in Harry's head as well as the NPCs speak a) in a dialect (Scouse, Cockney, ...) and style fitting their character, and b) reproduce to some extent the Martinaise Créole spoken in the game, by having x% of words (primarily nouns) permanently replaced with their German, Romanian, French, ... translations. After an instruction phase I had the game first think up the story and then transform it into a branching narrative. It worked pretty well for a while, until it would start forgetting things and hallucinating about prior events, so I never got to finish the first day or judge the narrative. Then it felt like GPT-4 was taking a dive: the dialects were verbalized poorly (lots of 'h' in Cockney), or at some point it would just write "(Received Pronunciation)" or similar in front and print the answer in standard English.
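
For anyone curious, a sketch of the kind of per-turn state header I have in mind. All names, numbers, and word pairs below are made up for illustration; they are not my original prompts.

```python
# A "state header" prepended to every prompt, carrying the sympathy score,
# each voice's dialect, and the fixed Martinaise word swaps, so none of it
# depends on the model remembering earlier turns.

state = {
    "sympathy_harry_kim": 3,  # e.g. -5 (hostile) .. +5 (warm)
    "dialects": {"Electrochemistry": "Scouse", "Drama": "Cockney"},
    # the x% of nouns permanently swapped for foreign translations
    "creole_map": {"door": "porte", "money": "Geld", "night": "noapte"},
}

def state_header(s: dict) -> str:
    dialects = "; ".join(f"{voice}: {d}" for voice, d in s["dialects"].items())
    creole = ", ".join(f"{en} -> {xx}" for en, xx in s["creole_map"].items())
    return (
        f"[Harry/Kim sympathy: {s['sympathy_harry_kim']}]\n"
        f"[Dialects: {dialects}]\n"
        f"[Martinaise swaps, always use the right-hand word: {creole}]\n"
    )

# prepend state_header(state) to every prompt sent to the model
print(state_header(state))
```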

I was trying to implement a stripped-down version of a savegame, having GPT-4 compress the game state and hand it over from round to round, when I lost interest / realized I needed a better understanding of the fundamentals.
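
Purely as an illustration of that savegame idea (the `llm` function below is a made-up placeholder, not a real API, and the summary instruction is just an example):

```python
# Ask the model to compress the state into one short line at the end of a
# round, store it, and prepend it next round instead of the full transcript,
# so the context window stays small.

def llm(prompt: str) -> str:
    raise NotImplementedError  # stand-in for whatever model call is used

SAVE_INSTRUCTION = (
    "Summarize the game state in under 60 words: location, time of day, "
    "open tasks, and the current Harry/Kim sympathy score. One line only."
)

def end_of_round(transcript: str) -> str:
    # produce the compact "savegame" line for the next round
    return llm(transcript + "\n\n" + SAVE_INSTRUCTION)

def start_of_round(savegame: str, player_input: str) -> str:
    # rebuild the prompt from the savegame instead of the full history
    return f"Previous state: {savegame}\n\nPlayer: {player_input}\nNarrator:"
```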