
I am a teacher and I have a LOT of different literature material that I wish to study and play around with.

I wish to have a self-hosted, reasonably smart LLM into which I can feed all the textual material I have generated over the years. I would be interested to see whether this model can answer some of the subjective course questions I have set on my exams, or write short paragraphs about the topics I teach.

In terms of hardware, I have an old Lenovo laptop with an NVIDIA graphics card.

P.S.: I am not technically very experienced. I run Linux and can do very basic stuff. I've never self-hosted anything other than LibreTranslate and a Pi-hole!
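For illustration, here is a minimal sketch of the retrieval-augmented approach this kind of setup usually takes: chunk the course material, pick the chunks most relevant to a question, and prepend them to the prompt sent to a local model. The folder name is a placeholder, the relevance scoring is naive word overlap (a real setup would use embeddings), and the final prompt would go to whatever local model server ends up being used (e.g. Ollama or llama.cpp):

```python
# Minimal retrieval-augmented generation (RAG) sketch in plain Python.
# Assumptions: course notes are .txt files in ./material (placeholder path);
# the built prompt is then handed to whatever local LLM you run.

from pathlib import Path

def load_chunks(folder="material", chunk_size=500):
    """Split every .txt file in the folder into ~chunk_size-character chunks."""
    chunks = []
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8")
        chunks += [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    return chunks

def top_chunks(question, chunks, k=3):
    """Rank chunks by naive word overlap with the question; return the best k."""
    q_words = set(question.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(q_words & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(question, chunks):
    """Assemble the retrieved context and the question into one prompt."""
    context = "\n---\n".join(top_chunks(question, chunks))
    return (f"Use the course material below to answer.\n\n{context}\n\n"
            f"Question: {question}\nAnswer:")

# Example usage:
# chunks = load_chunks()
# print(build_prompt("Discuss the theme of exile in the assigned novel.", chunks))
```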

[–] applepie@kbin.social -5 points 5 months ago (2 children)

You would need a 24 GB VRAM card to even start this thing up. Prolly would yield shitty results

[–] Bipta@kbin.social 5 points 5 months ago (1 children)

They didn't even mention a specific model. Why would you say they need 24 GB to run any model? That's just not true.

[–] applepie@kbin.social -1 points 5 months ago

I didn't say any. Based on what he is asking, he can't just run this shit on an old laptop.
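For context on the hardware debate: whether an old laptop is enough depends mostly on model size and quantization, not on any fixed 24 GB threshold. A 4-bit quantized 7B model typically needs roughly 4-6 GB of memory, and layers that don't fit in VRAM can run on the CPU, just slowly. A sketch, assuming llama-cpp-python and an already-downloaded GGUF file (the model path is a placeholder, not something named in this thread):

```python
# Sketch of running a small quantized model on modest hardware with
# llama-cpp-python. n_gpu_layers controls how many layers are offloaded
# to the laptop's GPU; the rest run on the CPU. The model path below is
# a placeholder for whatever GGUF file you download.

from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,        # context window size
    n_gpu_layers=20,   # offload what fits in VRAM; 0 = CPU only
)

output = llm(
    "Write a short paragraph on the theme of exile in Romantic poetry.",
    max_tokens=200,
)
print(output["choices"][0]["text"])
```

Tuning `n_gpu_layers` down until the model loads is the usual workaround on cards with limited VRAM; it trades speed for fitting in memory.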