this post was submitted on 29 Nov 2023

Machine Learning

[–] Mandelmus100@alien.top 1 points 2 years ago (7 children)

> Another big takeaway is that training for more epochs leads to more memorization.

Should be expected. It's overfitting.
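
To make "more epochs leads to more memorization" concrete: this kind of memorization is commonly measured as verbatim regurgitation of training prefixes, tracked across epoch checkpoints. A minimal sketch in Python, where `generate_fn` is a hypothetical stand-in for whatever model call is being probed (the name and the prefix/continuation lengths are illustrative, not from the paper under discussion):

```python
# Sketch only: quantify memorization as verbatim continuation of training prefixes.
# `generate_fn(prefix, n_tokens)` is a hypothetical wrapper around the model being probed.

def verbatim_recall_rate(train_texts, generate_fn, prefix_len=32, cont_len=32):
    """Fraction of sufficiently long training samples whose continuation is reproduced verbatim."""
    hits, total = 0, 0
    for text in train_texts:
        tokens = text.split()
        if len(tokens) < prefix_len + cont_len:
            continue  # too short to split into prefix + continuation
        total += 1
        prefix = " ".join(tokens[:prefix_len])
        target = " ".join(tokens[prefix_len:prefix_len + cont_len])
        if generate_fn(prefix, cont_len).strip() == target:
            hits += 1
    return hits / total if total else 0.0
```

Evaluated once per epoch checkpoint, this rate is what typically climbs as the model keeps seeing the same data.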

[–] we_are_mammals@alien.top 1 points 2 years ago (3 children)

> It's overfitting.

Overfitting, by definition, happens when your generalization error goes up.
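
For concreteness, the textbook signal behind that definition is a validation loss that turns upward while training loss keeps falling. A toy sketch with made-up numbers:

```python
# Toy loss curves over 8 epochs; the numbers are made up for illustration.
train_loss = [2.10, 1.60, 1.20, 0.90, 0.70, 0.55, 0.45, 0.38]
val_loss   = [2.20, 1.80, 1.50, 1.40, 1.38, 1.41, 1.47, 1.55]

def first_overfitting_epoch(val_loss):
    """Epoch index at which validation loss bottoms out; None if it is still falling."""
    best = min(range(len(val_loss)), key=val_loss.__getitem__)
    return best if best < len(val_loss) - 1 else None

print("validation loss starts rising after epoch", first_overfitting_epoch(val_loss))  # epoch 4
```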

[–] DigThatData@alien.top 1 points 2 years ago (2 children)

It's possible to "overfit" to a subset of the data. Generalization error going up is a symptom of "overfitting" to the entire dataset. Memorization is functionally equivalent to locally overfitting, i.e. generalization error going up in a specific neighborhood of the data. You can have a global reduction in generalization error while also having neighborhoods where generalization gets worse.
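
A small numerical sketch of that point, using hypothetical per-example validation losses at an early and a late checkpoint: the global mean improves while one neighborhood of the data gets worse.

```python
import numpy as np

# Hypothetical per-example validation losses at two checkpoints, with each example
# assigned to a "neighborhood" (cluster 0 or cluster 1) of the data. Numbers are made up.
neighborhood = np.array([0] * 80 + [1] * 20)
loss_early = np.concatenate([np.full(80, 1.00), np.full(20, 1.00)])
loss_late  = np.concatenate([np.full(80, 0.70), np.full(20, 1.30)])

print("global val loss: %.2f -> %.2f" % (loss_early.mean(), loss_late.mean()))
for c in (0, 1):
    m = neighborhood == c
    print("cluster %d val loss: %.2f -> %.2f" % (c, loss_early[m].mean(), loss_late[m].mean()))

# global val loss: 1.00 -> 0.82    (generalization improves overall)
# cluster 0 val loss: 1.00 -> 0.70
# cluster 1 val loss: 1.00 -> 1.30 (generalization gets worse in this neighborhood)
```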

[–] Hostilis_@alien.top 1 points 2 years ago

> Memorization is functionally equivalent to locally overfitting.

Uh, no, it is not. Memorization and overfitting are not the same thing. You are certainly capable of memorizing things without degrading your generalization performance (I hope).
