GissaMittJobb

joined 2 years ago
[–] GissaMittJobb@lemmy.ml 24 points 3 months ago (6 children)

I have no idea what they think this will accomplish, to be honest. It has the legal value of posting on Facebook that you don't allow them to use your photos.

[–] GissaMittJobb@lemmy.ml 9 points 3 months ago (1 children)

Well, he has terminal cancer, so his spewing of hatred will soon come to a permanent end.

[–] GissaMittJobb@lemmy.ml 2 points 3 months ago

I recommend the original Swedish movie; it's quite good.

[–] GissaMittJobb@lemmy.ml 1 points 3 months ago (1 children)

> No, the old model does not have the training data. It only has "model weights". You can conceptualize those as the abstract rules that the old model learned when it read the training data. By design, they are not supposed to memorize their training data.

I expressed myself poorly; this is what I meant - it has the "essence" of the training data, but of course not the verbatim text.

To outperform the old model, the new model needs more than what the old model learned. It needs primary sources, i.e. the training data itself. Which is going to be deleted.

I wonder how valuable in relative terms the old training data is to the process, compared to just the new training data. I can't answer it, but it would be interesting to know.

[–] GissaMittJobb@lemmy.ml 2 points 3 months ago (3 children)

I think we're in agreement with each other? The old model has the old training data, and then you train a new one on that model with new training data, right?

[–] GissaMittJobb@lemmy.ml 3 points 3 months ago (6 children)

I guess it depends on how important the old data is when building new models on top of old ones, which I fully admit I don't know the answer to. As I understand it though, new models are not trained fully from scratch, but are instead a continuation of the older model, trained with new techniques/new data.

To speculate, not having the older data present in the new training stages might make the attributes of that data less pronounced in the resulting model.

Maybe they could cheat the system by trying to distill that data back out of the older models and feed it into the new training data, but I guess the risk of model collapse is not insignificant there.

Again, limited understanding here; take everything I speculate with a grain of salt.

[–] GissaMittJobb@lemmy.ml 3 points 3 months ago (8 children)

They essentially still have the information in the weights, so I guess they won't fret too much over losing the original training data.

[–] GissaMittJobb@lemmy.ml 10 points 3 months ago

Heads up - your location is showing in this picture

[–] GissaMittJobb@lemmy.ml 14 points 3 months ago

I'm guessing it means "blow", so cocaine in this context. The cocaine is burning.

[–] GissaMittJobb@lemmy.ml 3 points 3 months ago

I honestly think it's dumber than that - he's still salty about wind farms being put up outside his golf course in Scotland and is taking his petty revenge on the world.

[–] GissaMittJobb@lemmy.ml 14 points 3 months ago (1 children)

I've noticed this when the nights get hot here - having my feet outside the blankets and in the path of the fan helps a great deal with not overheating.
