this post was submitted on 13 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Using and losing lots of money on GPT-4 ATM. It works great, but for the amount of code I'm generating I'd rather have a self-hosted model. What should I look into?

[–] DifferentPhrase@alien.top 1 points 1 year ago (4 children)

As far as self-hosted models go, deepseek-coder-33B-instruct is the best model I have found for coding. Anecdotally, it seems more coherent and gives better results than Phind-CodeLlama-34B-v2.
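A minimal sketch of how one might run a local GGUF quant of deepseek-coder-33B-instruct through llama-cpp-python; the model path, quant level, and generation settings are placeholders, not the commenter's setup:

```python
# Rough sketch: run a local GGUF quant of deepseek-coder-33B-instruct
# via llama-cpp-python. Path, quant, and settings below are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/deepseek-coder-33b-instruct.Q4_K_M.gguf",  # hypothetical local path
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": "Write a Python function that merges two sorted lists."},
    ],
    max_tokens=512,
    temperature=0.2,
)
print(response["choices"][0]["message"]["content"])
```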

[–] SlateHardjaw@alien.top 1 points 1 year ago (1 children)

What environment do you use to interact with self-hosted code models when coding? I've been using and enjoying Cursor for the way it's integrated into the IDE, but I've been exploring options for going self-hosted just to feel freer from whatever record I'm putting on someone else's server.

[–] DifferentPhrase@alien.top 1 points 11 months ago

My code editor of choice (Helix) doesn’t support integrations or plugins, so I haven’t tried Cursor or Copilot. I’m building my own UI right now that focuses on first-class support for models served by llama.cpp.
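A minimal sketch of the kind of request such a UI would make, assuming llama.cpp's built-in server is running locally with its OpenAI-compatible chat endpoint; the host, port, and parameters are placeholders:

```python
# Sketch: POST a chat request to a llama.cpp server assumed to be
# listening on localhost:8080 with an OpenAI-compatible endpoint.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "messages": [
            {"role": "user", "content": "Explain what this regex does: ^\\d{3}-\\d{4}$"}
        ],
        "max_tokens": 256,
        "temperature": 0.2,
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```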
