this post was submitted on 25 Nov 2023

LocalLLaMA

Title sums it up.

[–] hysterian@alien.top 1 points 11 months ago (1 children)

That’s odd, considering the desktop 4060 Ti is 8GB VRAM. But are you talking just about speed, or can you run larger-parameter LLMs on your laptop that your desktop couldn’t?

[–] __SlimeQ__@alien.top 1 points 11 months ago

I have the 16GB version of the 4060 Ti, so the cards have nearly identical capabilities.
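
For anyone curious what that VRAM figure means in practice, here's a rough back-of-envelope sketch of whether a quantized model fits on a given card. The bytes-per-parameter figures and the ~20% overhead allowance for KV cache and activations are illustrative assumptions, not measurements from either commenter's setup:

```python
# Back-of-envelope check: does an N-billion-parameter model fit in a card's VRAM?
# Quantization byte counts and the 1.2x overhead factor are rough assumptions.

BYTES_PER_PARAM = {
    "fp16": 2.0,   # half-precision weights
    "q8_0": 1.0,   # ~8-bit quantization
    "q4_0": 0.5,   # ~4-bit quantization
}

def fits_in_vram(params_billions: float, vram_gb: float,
                 quant: str = "q4_0", overhead: float = 1.2) -> bool:
    """Return True if the quantized weights (plus assumed overhead) fit in vram_gb."""
    weight_gb = params_billions * BYTES_PER_PARAM[quant]  # 1B params ~ 1 GB at 8-bit
    return weight_gb * overhead <= vram_gb

if __name__ == "__main__":
    for card, vram in [("4060 Ti 8GB", 8), ("4060 Ti 16GB", 16)]:
        for size in (7, 13, 34):
            for quant in ("q4_0", "q8_0", "fp16"):
                verdict = "fits" if fits_in_vram(size, vram, quant) else "too big"
                print(f"{card}: {size}B {quant} -> {verdict}")
```

By this estimate the extra 8GB is mostly about fitting bigger or less aggressively quantized models, rather than raw speed.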