this post was submitted on 12 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.


So, real quick: I've been exploring local LLMs for a bit. In this video I get into what I think is the future for LLMs, but in a nutshell, I think Microsoft will eventually push a local LLM out to machines to cut down on resources and cost. In doing so, it will likely become possible for developers to tap into that local LLM for their games.

The main worries I've seen brought up are:

  1. Spoilers - As mentioned in the video, it is currently (and should always be) possible to solve this by controlling what gets sent to the LLM. The LLM can't talk about what it doesn't know.
  2. The NPC talks about stuff it shouldn't - Fine-tuning solves this to a large degree. The better you prep the model, the less likely it is to go off script, and careful coding on your end helps even more.
  3. Story lines shouldn't be dynamic - The answer to this is simple: don't use the LLM for those lines or for the NPCs involved.
  4. Cost - Assuming I'm right that Microsoft and others will ship a local LLM, the local part removes this problem.
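The spoiler fix in point 1 can be sketched in a few lines: since the model can only leak what the prompt contains, gate each lore entry on the player's story progress before it ever reaches the LLM. The lore entries and flag names below are made-up examples, not from any actual game.

```python
# Hypothetical lore table: each entry is gated by a progress flag
# (None = always known to the NPC).
LORE = [
    {"text": "The blacksmith sells iron swords.", "requires": None},
    {"text": "The king is secretly the final boss.", "requires": "act3_reveal"},
]

def build_npc_context(unlocked_flags):
    """Build the NPC's system prompt from lore the player has unlocked.

    Anything the player hasn't discovered yet is simply never sent,
    so the model cannot spoil it.
    """
    lines = [e["text"] for e in LORE
             if e["requires"] is None or e["requires"] in unlocked_flags]
    return "You are a village NPC. You only know:\n" + "\n".join(lines)

# Early game: the spoiler line is absent from the prompt entirely.
prompt = build_npc_context(unlocked_flags=set())
```

The same filtering step works regardless of whether the model runs locally or in the cloud, since it happens before inference.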

https://www.youtube.com/watch?v=N31x4qHBsNM

It is also possible to have a given NPC show different emotions and to direct those emotions, as shown here where I tested it with anger.
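One way to direct an NPC's emotion, as a rough sketch: inject the current emotional state into the prompt each turn, so the same character can answer angrily, calmly, and so on. The emotion names, character, and template below are illustrative assumptions, not the method used in the video.

```python
# Illustrative set of emotions the game logic is allowed to request.
EMOTIONS = {"angry", "calm", "afraid", "joyful"}

def npc_prompt(name, persona, emotion, player_line):
    """Build a per-turn prompt that steers the NPC's emotional tone."""
    if emotion not in EMOTIONS:
        raise ValueError(f"unknown emotion: {emotion}")
    return (
        f"You are {name}, {persona}. "
        f"You are currently feeling {emotion}; let that color your reply. "
        f"Stay in character and never mention being an AI.\n"
        f"Player: {player_line}\n{name}:"
    )

p = npc_prompt("Garrick", "a gruff blacksmith", "angry",
               "Why is your shop closed?")
```

The game's state machine, not the model, decides the emotion, which keeps the behavior predictable while the wording stays dynamic.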

https://www.youtube.com/shorts/5mPjOLT7H-Q

AffectionateCan2342@alien.top · 10 months ago

We are already testing local LLMs with Unreal 5 for educational purposes (digital twins), combining them with RAG and faster-whisper. Still in the testing phase, but it seems really promising: https://vm.tiktok.com/ZGe1fstF9/ (starting at 0:40)
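The RAG part mentioned here can be sketched minimally: retrieve the document most relevant to a (transcribed) question and prepend it to the local model's prompt. Real pipelines use embedding similarity; plain word overlap and the tiny document list below are stand-ins used only to keep the example self-contained.

```python
# Toy document store standing in for a real knowledge base.
DOCS = [
    "Photosynthesis converts light energy into chemical energy.",
    "The mitochondria is the powerhouse of the cell.",
]

def retrieve(question):
    """Pick the doc sharing the most words with the question.

    A real RAG setup would rank by embedding similarity instead.
    """
    q = set(question.lower().split())
    return max(DOCS, key=lambda d: len(q & set(d.lower().split())))

def rag_prompt(question):
    """Prepend the retrieved context so the local LLM can ground its answer."""
    return f"Context: {retrieve(question)}\nQuestion: {question}\nAnswer:"

prompt = rag_prompt("What does the mitochondria do?")
```

The resulting prompt would then be sent to whatever local model is running; the speech side (faster-whisper) just supplies the question string.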