So, how long until they sell out to microsoft? ^(/s)
Void_0000
How hard can it be?
Seriously though, what makes it require more VRAM than regular inference? You're still loading the same model, aren't you?
I self-hosted SearXNG, but the problem is that after I was done I realised it defeats most of the privacy benefit: if I'm the only one using the instance, I might as well be using the search engines directly.
So now I also have Firefox running in a Docker container, searching random junk on SearXNG every couple of minutes.
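If you'd rather not keep a whole browser alive for that, a small script can generate the same kind of noise traffic. Here's a minimal sketch; the instance URL (`http://localhost:8080`) and the wordlist path (`/usr/share/dict/words`) are just placeholders for whatever your setup actually uses, and the interval is arbitrary.

```python
# Noise-query loop for a self-hosted SearXNG instance (sketch, not a polished tool).
import random
import time
import urllib.parse
import urllib.request

SEARX_URL = "http://localhost:8080/search"  # assumed local SearXNG instance
WORDLIST = "/usr/share/dict/words"          # assumed wordlist; any text file of words works

def random_query(words, n=2):
    """Pick a few random words to use as a throwaway search query."""
    return " ".join(random.sample(words, n))

def main():
    with open(WORDLIST) as f:
        words = [w.strip() for w in f if w.strip()]
    while True:
        query = random_query(words)
        url = SEARX_URL + "?" + urllib.parse.urlencode({"q": query})
        try:
            # Fire the search and discard the response; the point is only to
            # mix junk queries in with your real ones.
            urllib.request.urlopen(url, timeout=30).read()
        except OSError:
            pass  # ignore transient network errors and keep going
        # Wait a randomised "couple of minutes" between queries.
        time.sleep(random.uniform(60, 180))

if __name__ == "__main__":
    main()
```

Same idea as the Firefox container, just lighter: whether that noise actually helps depends on how distinguishable random word pairs are from real queries, so treat it as chaff, not a guarantee.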