Thanks!
GPT-3 is around 800 GB, while the entirety of the English Wikipedia is around 10 GB compressed. So yeah, it doesn't store every detail of everything, but LLMs do memorize a lot of things verbatim. Also see https://bair.berkeley.edu/blog/2020/12/20/lmmem/
A big issue for me with Snap is that the server-side software is proprietary. So it really does feel like they're going for lock-in.
Relevant xkcd: https://xkcd.com/1102/