this post was submitted on 24 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


I tried applying a lot of prompting techniques to 7B and 13B models, and no matter how hard I tried, there was barely any improvement.

[–] mrsublimezen@alien.top 1 points 2 years ago

from my experience, everything is a prompt:

question = prompt

answer = prompt

--------

example:

question: do you know what apples are

answer: yes apples are red fruit

now it accesses the knowledge nodes for fruit and apples

--------

so each layer of question and answer accesses different nodes in the LLM,

niching down the answers and making the next answer more relevant to the question
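a minimal sketch of what this comment describes: in a multi-turn chat, each question *and* each answer gets appended to a running context, so the next completion is conditioned on everything before it. the model call here is a hypothetical stub (`fake_llm`), not a real API — any real setup would swap in an actual model:

```python
def fake_llm(prompt: str) -> str:
    # Stand-in for a real model call; just reports how many
    # questions were present in the context it received.
    return f"[answer conditioned on {prompt.count('Q:')} question(s)]"

def chat_turn(history: list[str], question: str) -> str:
    history.append(f"Q: {question}")
    prompt = "\n".join(history)      # the full history IS the prompt
    answer = fake_llm(prompt)
    history.append(f"A: {answer}")   # answers feed back into the context
    return answer

history: list[str] = []
chat_turn(history, "do you know what apples are")
chat_turn(history, "what colour are they")
# by the second turn, the prompt already contains the first Q and A,
# which is the "niching down" effect the comment is pointing at
```

since answers are fed back in as part of the prompt, a vague or wrong answer early on also narrows later turns in the wrong direction — which may be part of why heavy prompting gains so little on small 7B/13B models.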