this post was submitted on 28 Nov 2023

LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.


I'm a true lurker but am tinkering with some models for coding.

My aim is to unashamedly stop outsourcing some of my coding tasks to developers and, instead, ask my AI. I'm trying to load my entire application codebase and then ask for tweaks and new features.

Does anyone have any ideas about providing multiple (code) files as input/context so that the structure is meaningful? Until now I've been doing something like this:

---codefile1.code---

Some actual code...

---codefile2.code---

Some actual code...

Then I prime it by explaining that this is the format the code will be given in, and give it my task.
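
To be concrete, what I mean is a little script that walks the project and stitches the files together in that delimiter format before prepending the task. This is only a rough sketch: the build_prompt name, the extension list and the instruction wording are placeholder choices, not anything battle-tested.

import pathlib

def build_prompt(root, task, extensions=(".py", ".js")):
    # Explain the delimiter convention up front so the model can refer to files by path.
    parts = [
        "Each file below starts with a line like ---path/to/file---.",
        "Refer to files by those exact paths when proposing changes.",
        "",
    ]
    for path in sorted(pathlib.Path(root).rglob("*")):
        if path.is_file() and path.suffix in extensions:
            parts.append(f"---{path.relative_to(root)}---")
            parts.append(path.read_text(encoding="utf-8", errors="replace"))
            parts.append("")
    parts.append(f"Task: {task}")
    return "\n".join(parts)

print(build_prompt("my_app", "Add input validation to the signup form."))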

This leaves a couple of questions:

Is there a better way to do this that I'm missing? It sometimes understands which file it is modifying... but sometimes it's just plain wrong.

Are there any clever tricks to reduce the amount of context the code takes up? (Thinking along the lines of minify for JavaScript, but reversible!)
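
One crude option (lossy, so not the reversible minify I'm really after) would be to drop blank lines and full-line comments before building the prompt. A rough sketch only; strip_noise is just an illustrative name.

def strip_noise(source, comment_prefix="#"):
    # Cut token count by dropping blank lines and full-line comments.
    # Lossy: the comments are gone for good, so keep the originals on disk.
    kept = []
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith(comment_prefix):
            continue
        kept.append(line.rstrip())
    return "\n".join(kept)

print(strip_noise("x = 1\n\n# a comment\ny = 2  # inline comments survive\n"))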

Would there be any value at all in fine-tuning/LoRA-ing (forgive my ignorance) on my full codebase?
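
From what I can tell (and I may well be wrong), the LoRA route with Hugging Face's peft looks roughly like the sketch below. The base model name and the r/alpha/target_modules values are guesses on my part, and the actual training loop on the codebase is omitted.

from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Attach a low-rank adapter to a causal LM; only the adapter weights train.
base = AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-hf")

lora_cfg = LoraConfig(
    r=16,                                 # adapter rank
    lora_alpha=32,                        # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()        # tiny fraction of the full model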

And lastly, the staple query: which coding model is 'best' currently?! 😂 (I'm using Codebooga-34b-v0.1 and it's pretty reasonable, but it often steps outside of my reference code and gets a bit confused.) Are there any coding model tests/leaderboards people know of?

Any thoughts welcome. If I nail this, I'll be looking to build a little open project tool to make it a nice workflow where you can perhaps pick a subset of code files as context, etc. (for most tasks only a subset is required, tbh).

Thanks.

Edit - Newline after filename

[–] Hey_You_Asked@alien.top 1 points 11 months ago (1 children)
[–] antsloveit@alien.top 1 points 11 months ago

How excellent. Cheers. I may need to tweak it for a local LLM, but this seems like an incredible leg up on what I was about to undertake!