If you're making a LoRA, training on Wikipedia directly will pretty much make it output text that looks like Wikipedia, which is to say it will (probably) be worse at chatting.
A strategy I've been using lately is to get GPT-4 to generate a conversation in my chosen chat format *about* each chapter of my "textbook". I can automate this with pretty good results, and it's done in about 10 minutes. It does kind of work: it'll at least get the bot to talk about the topics I chose, but as far as actually comprehending the information it's referencing, it's bad. Comprehension gets better as I increase the LoRA rank, but higher ranks take a lot of VRAM; I can only get to around rank 256 before training runs out of memory.
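Roughly, the loop looks like this. This is a minimal sketch rather than my exact script: it assumes the OpenAI Python client (v1+), and the chapter splitter, prompt wording, and file names are all placeholders.

```python
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def chapters(path):
    # Hypothetical splitter: assumes chapters are separated by "## " headings.
    with open(path, encoding="utf-8") as f:
        return f.read().split("\n## ")

PROMPT = (
    "Write a conversation between a curious user and a knowledgeable "
    "assistant about the following chapter. Use the format:\n"
    "USER: ...\nASSISTANT: ...\n\nChapter:\n{chapter}"
)

# One generated conversation per chapter, written out as JSONL for training.
with open("train.jsonl", "w", encoding="utf-8") as out:
    for chapter in chapters("textbook.txt"):
        resp = client.chat.completions.create(
            model="gpt-4",
            messages=[{"role": "user", "content": PROMPT.format(chapter=chapter)}],
        )
        out.write(json.dumps({"text": resp.choices[0].message.content}) + "\n")
```

From there it's just feeding `train.jsonl` into whatever LoRA trainer you already use.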
please share!!