TheBluePillock

joined 11 months ago
[–] TheBluePillock@lemmy.world 5 points 4 months ago (1 children)

The term narcissist is somewhat overused, though there are also a lot of them these days. To echo a bit of what others said, thinking average high schoolers are dumb is not a sign of narcissism. Average high schoolers are notoriously foolish. Even if you feel it's more than that and it's a serious problem, that alone does not make someone a narcissist.

Actual narcissists are unstable. They need the adoration of others to feel good about themselves. They're prone to fits of rage when anything damages their ego, and they can take just about anything as criticism and then fly off the handle.

If that is you, get help from a professional who specializes in it. If anyone reading this knows a person like that, read up and find a way to save yourself.

[–] TheBluePillock@lemmy.world 2 points 5 months ago

Seconding Etymotic. And if you wanted to upgrade later, you can get custom ear molds from an audiologist and use those instead of the included foam or rubber tips. It's among the best sound isolation you can get. I've also always had excellent experiences with their customer service.

[–] TheBluePillock@lemmy.world 5 points 8 months ago

My Pixel (5a) only does adaptive charging if your alarm is set for the A.M. If you work second or third shift, it doesn't even try, and there's no way to turn it on, even in developer options. It was a pretty big wtf when I figured that one out.

[–] TheBluePillock@lemmy.world 3 points 8 months ago

I would love to be corrected, but when I looked into it, it sounded like you'd probably want 32GB of VRAM or better for actual chat ability. You have to have enough memory to load the model, and anything not handled by your GPU takes a major performance hit. Then you probably want to aim for a 72 billion parameter model. That's a decently conversational level and maybe close to the one you're using (though it's possible they're higher, I'm just guessing). I think 34B models are comparatively more prone to hallucination and inaccuracy. It sounded like 32GB of VRAM was kind of the entry point for the 72B models, so I stopped looking, because I can't afford that.

So somebody with more experience or knowledge can hopefully correct me or give a better explanation, but just in case, maybe this is a helpful starting point for someone.

You can download models on huggingface.co and interact with them through a web-ui like this one.
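If it helps, here's a back-of-the-envelope sketch of the memory math. This is only a rule of thumb under my own assumptions (weights ≈ parameter count × bytes per parameter, with a guessed ~20% overhead for the KV cache and activations; the quantization levels listed are just common examples, not from any spec):

```python
# Rough VRAM estimate for loading an LLM locally (rule of thumb only).
# Weights take roughly params x bytes-per-parameter; quantizing to 8-bit or
# 4-bit shrinks that, and the KV cache / activations add some overhead on top.

def estimate_vram_gb(params_billion: float, bytes_per_param: float,
                     overhead: float = 0.2) -> float:
    """Approximate GB needed to keep the whole model on the GPU."""
    weights_gb = params_billion * bytes_per_param  # 1B params ~ 1 GB per byte/param
    return weights_gb * (1 + overhead)

for size in (34, 72):
    for label, bpp in (("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)):
        print(f"{size}B @ {label}: ~{estimate_vram_gb(size, bpp):.0f} GB")
```

By that math, even a 4-bit 72B model wants roughly 40GB for the weights alone, so on a smaller card you'd be offloading layers to system RAM and eating the performance hit mentioned above, while a model that fits entirely in VRAM runs much faster.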