leaderof13@alien.top · joined 10 months ago

Which expert are you talking about? Machine learning noob here.

 

I have set up the Llama 2 7B model locally and was able to get it to work. After the initial setup I tried to upload a text file and train it with LoRA as described in this post:

https://www.reddit.com/r/LocalLLaMA/s/t8BfMr0qF2

The above post links a rentry guide that offers very good information, and I followed it. After fine-tuning completes, I run the trained model to check for specific answers from the training data (I changed the name William Shakespeare to something else to verify the model was actually trained), but it's not providing any coherent response at all: it either outputs garbage text or doesn't understand my question.
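For reference, this is roughly the kind of inference step I mean, sketched with the Hugging Face transformers + peft stack (the base-model path, adapter directory, prompt, and generation settings below are placeholders, not my exact setup):

```python
# Minimal sketch: load the base Llama 2 7B model plus a trained LoRA adapter and query it.
# Paths and settings are placeholders, not my exact configuration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_path = "meta-llama/Llama-2-7b-hf"      # or a local copy of the base model
adapter_path = "loras/my-shakespeare-lora"  # output directory of the LoRA training run

tokenizer = AutoTokenizer.from_pretrained(base_path)
base = AutoModelForCausalLM.from_pretrained(base_path, torch_dtype=torch.float32)  # CPU-only box
model = PeftModel.from_pretrained(base, adapter_path)  # attach the trained LoRA weights
model.eval()

# The base (non-chat) model just continues text, so the check is phrased as a completion
# rather than a chat-style question.
prompt = "The author of Hamlet is"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

One thing I'm unsure about is whether the garbage output is a prompt-format issue, since the non-chat base model expects plain completions rather than questions.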

I'm not sure where I'm going wrong. I'm on a CPU-only Windows machine with 12 cores and 32 GB of RAM.

Any help in getting correct responses after training would be appreciated.

 

I would like to understand whether anyone has implemented generative AI without using OpenAI, and if so, which open-source models you used and how successful it has been so far.

We have enterprise-level incident data, relevant documentation, etc. that users will search, and we need to generate responses over it using generative AI.

Is it possible to do this without relying on OpenAI at all?
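For illustration, this is the kind of retrieval-plus-generation flow I'm imagining, sketched with open-source components only (the embedding model, generator model, and sample documents below are just examples, not a recommendation):

```python
# Rough sketch of a retrieval-augmented setup using only open-source pieces.
# The models and the incident snippets are placeholder examples.
from sentence_transformers import SentenceTransformer, util
from transformers import pipeline

# 1. Embed the enterprise documents / incident records once and keep the vectors.
docs = [
    "Incident 1042: payment gateway timeout, resolved by restarting the API pods.",
    "Runbook: how to rotate database credentials for the billing service.",
]
embedder = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
doc_vectors = embedder.encode(docs, convert_to_tensor=True)

# 2. At query time, retrieve the most relevant chunks for the user's question.
question = "How was the payment gateway timeout fixed?"
query_vector = embedder.encode(question, convert_to_tensor=True)
hits = util.semantic_search(query_vector, doc_vectors, top_k=2)[0]
context = "\n".join(docs[h["corpus_id"]] for h in hits)

# 3. Feed the question plus retrieved context to a locally hosted open model.
generator = pipeline("text-generation", model="meta-llama/Llama-2-7b-chat-hf")
prompt = (
    "Answer using only the context below.\n\n"
    f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
)
print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```

The appeal of this split is that the retrieval step keeps the proprietary incident data on our side, and the generator is just whichever open model we choose to host.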