this post was submitted on 30 Oct 2023

LocalLLaMA


Community for discussing Llama, the family of large language models created by Meta AI.

founded 1 year ago

Hey everyone, happy to say I’m officially announcing Obsidian V0.5 as part of my work at Nous Research and building upon my work creating the Capybara V1.9 dataset.

This model is blazing fast and is likely the first multi-modal model efficient enough to fit within the RAM constraints of even a non-Pro iPhone, at practical speeds as well!
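Some back-of-the-envelope arithmetic (my own ballpark, not from the announcement) on why that RAM claim is plausible: at roughly 6.5 bits per weight, which is in the neighborhood of a Q6-style quantization, a 3B model's weights come to under 2.5 GB.

```python
# Rough memory estimate for a 3B-parameter model quantized to ~6.5 bits
# per weight (approximate 6-bit-quant footprint; my own assumption).
params = 3_000_000_000
bits_per_weight = 6.5

weight_bytes = params * bits_per_weight / 8
print(f"{weight_bytes / 1e9:.2f} GB")  # prints 2.44 GB
```

That leaves headroom for the KV cache and the OS even on a phone with only a few GB of RAM, which a full-precision 7B model (14+ GB in fp16) obviously does not.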

This model in its current state is largely a multi-modal version of Nous-Capybara-3B, which I also only recently released. I designed the dataset with novel synthesis methods (paper currently in progress); it's made to be robust in conversational ability and even includes multi-turn data synthesized as continuations of single-turn examples from datasets like Airoboros, Know_Logic, EverythingLM, and more.

It’s built using LLaVA 1.5 techniques, but instead of a 7B Llama as a base, we chose the new StableLM 3B model trained for 4 trillion tokens. (We plan to train on Mistral as well.)

Any questions or feedback are much appreciated!

Download here: https://huggingface.co/NousResearch/Obsidian-3B-V0.5

Or download quantized version here, Courtesy of Nisten: https://huggingface.co/nisten/obsidian-3b-multimodal-q6-gguf

[–] toothpastespiders@alien.top 1 points 1 year ago (6 children)

I was extraordinarily skeptical of the utility of 3B models until about a day ago, when I gave Orca Mini a fair shot, in particular by training it on one specialized task. It wound up producing results that honestly floored me.

All of which is to say that I'm VERY excited to see this. I really think 3B models can be something of a perfect Swiss Army knife: compact and always available. Multi-modal capabilities are a perfect fit for exactly that kind of use. Can't wait to give this a shot!

[–] InTheTransition@alien.top 1 points 1 year ago (1 children)

What was the task? Just curious about what I can use mini models for.

[–] toothpastespiders@alien.top 1 points 1 year ago

Creating Alpaca-formatted JSON data from big blocks of text that often have a lot of garbage in them. The untrained Orca 3B model wasn't able to stick to the format when I provided it as an example in the instructions, but it did great after training on a small dataset of about 100 examples.

It's still a bit early to call it a total success, since I've only run it through a handful of tests on similar blocks of text. But just the fact that it's grabbing facts from the text and correctly formulating prompts around them is really impressive to me. A 13B model trained on the same dataset is, unsurprisingly, still quite a bit better, but the 3B is doing far, far better than I would have thought possible. It'd be really cool to get a little scraping pipeline going with next to no resource use.
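For anyone unfamiliar with the format mentioned above: an Alpaca-style record is just a JSON object with instruction/input/output fields, and a dataset is an array of them. A minimal sketch (the example content is invented for illustration):

```python
import json

# One Alpaca-format training record; datasets in this style are
# typically a JSON array of many such objects.
record = {
    "instruction": "Extract the key facts from the passage below.",
    "input": "Messy source text, possibly with boilerplate mixed in...",
    "output": "- Fact one\n- Fact two",
}

print(json.dumps([record], indent=2))
```

So the extraction task amounts to: feed the model a noisy block of text and have it emit records like this one, which you can then validate with an ordinary JSON parser before adding them to a training set.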
