this post was submitted on 14 Nov 2023

LocalLLaMA


I've been using self-hosted LLMs for roleplay purposes, but these are the worst problems I face every time, no matter which model and parameter preset I use.

I'm using:

Pygmalion 13B AWQ

Mistral 7B AWQ

SynthIA 13B AWQ [Favourite]

WizardLM 7B AWQ

  1. It mixes up who's who, and often starts to behave like the user.

  2. It writes in third-person or narrative perspective.

  3. Sometimes it generates the exact same reply (word for word) back to back, even though new inputs were given.

  4. It starts generating a dialogue or screenplay-style script instead of a normal conversation.

Does anyone have solutions for these?

[–] Curious_Drive_4194@alien.top 1 points 1 year ago (3 children)

For 1 and 2, apply grammar sampling to force the LLM to start all of its replies with the character's name followed by a colon:

CharacterName:

This "forces" the LLM to write dialogue as the specified character. It won't work 100% of the time, but failures become a very rare event.
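
If it helps, here's a minimal sketch of one way to set that up, using llama.cpp-style GBNF grammars through llama-cpp-python. The model path, the character name "Alice", and the example prompt are placeholders I made up, and other backends expose grammar sampling a bit differently:

    # Sketch: constrain generation so every reply starts with "Alice: ".
    # Assumes llama-cpp-python; model path and character name are placeholders.
    from llama_cpp import Llama, LlamaGrammar

    grammar_text = r'''
    root ::= "Alice: " line
    line ::= [^\n]+ "\n"
    '''

    llm = Llama(model_path="./your-model.gguf")  # placeholder path
    grammar = LlamaGrammar.from_string(grammar_text)

    out = llm("User: Hi, how are you?\n", grammar=grammar, max_tokens=128)
    print(out["choices"][0]["text"])  # always begins with "Alice: "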

[–] tronathan@alien.top 1 points 11 months ago

I use a custom front-end and append the character name plus a colon to the end of all my prompts to force this. I wonder if grammar sampling would be better; I don't really have my head around grammar sampling yet.
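
Roughly what my front-end does, as a simplified sketch (the function and variable names here are just placeholders):

    # Simplified sketch of the prompt-suffix trick: ending the prompt with
    # "Name:" nudges the model to continue as that character, not as the user.
    def build_prompt(history: str, char_name: str) -> str:
        return f"{history}\n{char_name}:"

    prompt = build_prompt("User: Hi there!", "Alice")
    # prompt == "User: Hi there!\nAlice:"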

[–] Hey_You_Asked@alien.top 1 points 1 year ago

Where did you learn about this?

I'm really struggling to wrap my head around the intuition that goes into imposing grammars on LLM generation.

[–] fox-lad@alien.top 1 points 1 year ago

Very cool trick, thanks.