this post was submitted on 09 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
So what are the implications for real-world usage?
It's able to retrieve any piece of information from at least 65k tokens of context, provided the piece is small enough.
What are the results when the chunks to be retrieved are bigger?
Is it able to process all 64k tokens in order to generate an answer that takes the full 64k into account?
It's certainly interesting, but many more tests are needed to get a full picture of the real capabilities.
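The kind of test being asked for here is usually a "needle in a haystack" probe: hide a small fact at different positions in a long filler context and check whether the model can recall it. A minimal sketch of such a harness is below; `query_model` is a hypothetical stand-in (here it just scans the text) and would be replaced by an actual call to the model under test.

```python
def query_model(context: str, question: str) -> str:
    # Hypothetical stand-in for a real LLM call: it simply scans the
    # context for the needle sentence. In a real test, you would send
    # context + question to the model under evaluation instead.
    for line in context.split("\n"):
        if "magic number" in line:
            return line
    return "not found"

def build_context(filler_sentences: int, needle: str, depth: float) -> str:
    # Repeat innocuous filler, then splice the needle in at a relative
    # depth (0.0 = start, 1.0 = end) to test position sensitivity.
    filler = ["The sky was clear and the day was quiet."] * filler_sentences
    pos = int(len(filler) * depth)
    return "\n".join(filler[:pos] + [needle] + filler[pos:])

needle = "The magic number is 7481."
results = {}
for depth in (0.0, 0.5, 1.0):
    ctx = build_context(2000, needle, depth)
    answer = query_model(ctx, "What is the magic number?")
    results[depth] = "7481" in answer

print(results)  # per-depth recall: did the model recover the needle?
```

To probe the "bigger chunks" question from the comment, the same harness can be run with a multi-sentence needle and scored on how much of it the answer reproduces, rather than a single exact match.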