this post was submitted on 10 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
founded 1 year ago
Trained and fine-tuned - two different things.
They trained it on Wikipedia - yes, they feed the Wikipedia articles to it raw, hook, line and sinker. No Q/A pairs. But that doesn't mean it will be able to answer your questions - not until you fine-tune it with a Q/A "I want you to behave like this" template. And here's the kicker - the thing we're all using to our huge advantage - it can be fine-tuned on a totally different Q/A dataset and it will still answer from the Wikipedia knowledge. It's quite a trick.
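To make the contrast concrete, here's a minimal sketch of the two data formats. The template below is an illustrative Alpaca-style layout, not the exact format any particular model was trained with; the texts are made-up examples.

```python
# Pretraining data: raw article text fed to the model as-is.
# No question, no answer, no structure beyond the document itself.
pretrain_example = (
    "Paris is the capital and most populous city of France. "
    "It has been one of Europe's major centres of finance and "
    "culture since the 17th century."
)

def to_instruction_format(question: str, answer: str) -> str:
    """Wrap a Q/A pair in an Alpaca-style 'behave like this' template.

    The section markers here are illustrative; different fine-tunes
    use different templates.
    """
    return (
        "### Instruction:\n"
        f"{question}\n\n"
        "### Response:\n"
        f"{answer}"
    )

# Fine-tuning data: the same kind of knowledge, but framed as a
# dialogue turn the model learns to imitate.
finetune_example = to_instruction_format(
    "What is the capital of France?",
    "The capital of France is Paris.",
)

print(finetune_example)
```

The point of the trick described above: the Q/A pairs teach the *format* of answering, while the factual knowledge mostly comes from the raw pretraining text, so the fine-tuning set doesn't need to cover Wikipedia's topics at all.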
Thanks for the information and explanation.