Posted in Machine Learning on 29 Nov 2023
Zondartul@alien.top 1 point 2 years ago

The point of the paper is that LLMs memorize an insane amount of training data and, with some massaging, can be made to output it verbatim. If that training data has PII (personally identifiable information), you're in trouble.

Another big takeaway is that training for more epochs leads to more memorization.
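The basic probe is simple: feed the model a prefix from a training document and check whether greedy decoding reproduces the known continuation verbatim. A minimal sketch, assuming a Hugging Face causal LM (the model name and example text are placeholders, and the paper's actual attacks are more involved than this):

```python
# Minimal memorization probe, assuming a Hugging Face causal LM.
# Model name and the example text are placeholders, not taken from the paper.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # stand-in model
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def reproduces_verbatim(prefix: str, known_continuation: str,
                        max_new_tokens: int = 50) -> bool:
    """Greedy-decode from a training-data prefix and test for verbatim recall."""
    inputs = tokenizer(prefix, return_tensors="pt")
    with torch.no_grad():
        out = model.generate(**inputs,
                             max_new_tokens=max_new_tokens,
                             do_sample=False)  # greedy favors memorized text
    new_tokens = out[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens).startswith(known_continuation)

# True here would mean the model spits the continuation back word for word.
print(reproduces_verbatim("We the People of the United States, in Order to form",
                          " a more perfect Union"))
```

For aligned chat models the paper needed extra massaging (the much-quoted trick of asking ChatGPT to repeat a word forever until it diverges into training data), but the verbatim-match check at the end is the same idea.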

Mandelmus100@alien.top 1 point 2 years ago

> Another big takeaway is that training for more epochs leads to more memorization.

Should be expected. It's overfitting.

Hostilis_@alien.top 1 point 2 years ago

That's not overfitting. That's just fitting.
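For what it's worth, the two are measurable separately: track validation loss alongside a verbatim-recall rate over epochs, and you can watch memorization climb while val loss is still improving. A toy sketch (the training helpers, the loaders, and `probe_texts` here are hypothetical stand-ins):

```python
# Toy loop showing memorization climbing per epoch, independent of val loss.
# train_one_epoch, eval_loss, the loaders, and probe_texts are hypothetical.
import torch

def memorization_rate(model, tokenizer, probe_texts,
                      prefix_len=32, suffix_len=32):
    """Fraction of training samples whose suffix greedy decoding recalls verbatim."""
    hits = 0
    for text in probe_texts:
        ids = tokenizer(text, return_tensors="pt")["input_ids"][0]
        prefix = ids[:prefix_len].unsqueeze(0)
        target = ids[prefix_len:prefix_len + suffix_len]
        out = model.generate(prefix, max_new_tokens=suffix_len, do_sample=False)
        hits += int(torch.equal(out[0, prefix_len:prefix_len + suffix_len], target))
    return hits / len(probe_texts)

for epoch in range(num_epochs):                        # num_epochs: hypothetical
    train_loss = train_one_epoch(model, train_loader)  # hypothetical helper
    val_loss = eval_loss(model, val_loader)            # hypothetical helper
    mem = memorization_rate(model, tokenizer, probe_texts)
    # Rising mem while val_loss still falls: the model is fitting.
    # Rising mem while val_loss climbs: now it's also overfitting.
    print(f"epoch {epoch}: train={train_loss:.3f} val={val_loss:.3f} mem={mem:.1%}")
```

The point of the distinction: memorization going up with more epochs is exactly what minimizing training loss produces, so it only counts as overfitting once generalization actually degrades.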
