this post was submitted on 14 Nov 2023
1 points (100.0% liked)
LocalLLaMA
I'm gonna say you can remedy this problem to an extent even with 7B or 13B models, but you're gonna need to shift most of your game's logic to the backend side of your game (the programming and database side) and feed the model the gist of your game-state representation as text (use templates or simple paraphrasing methods for this part) with each prompt.
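To make that concrete, here's a minimal sketch of the idea (all names here are hypothetical and not tied to any particular engine or model API): the backend owns the game state, a simple template renders it into a short text gist, and that gist gets prepended to every prompt you send to the model.

```python
from dataclasses import dataclass, field

# Hypothetical game state owned entirely by the backend;
# the model never manages it, it only sees a text summary.
@dataclass
class GameState:
    location: str = "village tavern"
    health: int = 80
    inventory: list[str] = field(default_factory=lambda: ["rusty sword", "healing potion"])
    quest: str = "find the missing blacksmith"

def state_to_gist(state: GameState) -> str:
    """Render the backend game state as a short text gist using a simple template."""
    return (
        f"Location: {state.location}. "
        f"Health: {state.health}/100. "
        f"Inventory: {', '.join(state.inventory)}. "
        f"Active quest: {state.quest}."
    )

def build_prompt(state: GameState, player_input: str) -> str:
    """Prepend the game-state gist to every prompt sent to the model."""
    return (
        "You are the narrator of a text adventure. Stay consistent with the game state below.\n"
        f"[GAME STATE] {state_to_gist(state)}\n"
        f"[PLAYER] {player_input}\n"
        "[NARRATOR]"
    )

if __name__ == "__main__":
    state = GameState()
    print(build_prompt(state, "I drink the healing potion and ask the barkeep about the blacksmith."))
```

The point is that the small model only has to narrate around a state it's handed each turn; the backend code and database stay the source of truth, so the model can't "forget" or corrupt the game state.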