[–] toothpastespiders@alien.top 1 points 1 year ago

Creating Alpaca-formatted JSON data from big blocks of text that often have a lot of garbage in them. Before fine-tuning, the orca 3b model wasn't able to stick to the format even when I provided it as an example in the instructions. But it did great after training on a small dataset of about 100 examples.
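
For anyone unfamiliar, the Alpaca format is just a JSON list of instruction/input/output records. Here's a minimal sketch of what a training example might look like; the content below is made up for illustration, not from my actual dataset:

```python
import json

# Hypothetical records in the Alpaca instruction/input/output schema.
# The real dataset contents aren't shown here, so these are stand-ins.
records = [
    {
        "instruction": "Extract the main factual claim from the passage below.",
        "input": "A noisy scraped paragraph, navigation links and all.",
        "output": "The passage claims that fine-tuning helps small models follow strict output formats.",
    },
]

# Write the dataset in the plain-JSON layout Alpaca-style trainers expect.
with open("train.json", "w", encoding="utf-8") as f:
    json.dump(records, f, ensure_ascii=False, indent=2)
```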

It's still a bit early to call it a total success since I've only run it through a handful of tests on similar blocks of text. But just the fact that it's grabbing facts from the text and correctly formulating prompts around them is really impressive to me. The 13b trained on the same dataset is, unsurprisingly, still quite a bit better. But the 3b is doing far, far better than I would have thought possible. It'd be really cool to get a little scraping pipeline going with next to no resource use, along the lines of the sketch below.
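
If that pans out, the pipeline part is mostly just validating the model's output against the schema and dropping anything malformed. A rough sketch of what I have in mind, where `generate` is a hypothetical stand-in for whatever inference backend you'd actually use:

```python
import json

REQUIRED_KEYS = {"instruction", "input", "output"}

def parse_alpaca(raw):
    """Return the parsed records if raw is valid Alpaca JSON, else None."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if not isinstance(data, list):
        return None
    for rec in data:
        if not isinstance(rec, dict) or set(rec) != REQUIRED_KEYS:
            return None
    return data

def build_dataset(chunks, generate):
    """Run each scraped text chunk through the model, keeping only valid output.

    `generate` is a hypothetical callable wrapping the fine-tuned 3b model;
    swap in whatever inference library you actually run it with.
    """
    dataset = []
    for chunk in chunks:
        records = parse_alpaca(generate(chunk))
        if records:  # silently drop chunks the model mangled
            dataset.extend(records)
    return dataset
```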

[–] toothpastespiders@alien.top 1 points 1 year ago (6 children)

I was extraordinarily skeptical of the utility of 3b models until about a day ago, when I gave orca mini a fair shot by training it on one specialized task. It wound up producing results that honestly floored me.

All of which is to say that I'm VERY excited to see this. I really think 3B models can be something of a perfect Swiss Army knife: compact and always available. Multimodal capabilities are a perfect fit for that exact type of methodology. Can't wait to give this a shot!
