BFrizzleFoShizzle

joined 1 year ago
[–] BFrizzleFoShizzle@lemmy.nz 2 points 1 year ago

I think the main point they're disagreeing with is this:

you wouldn’t be able to mathematically prove that the signal is perfectly recovered 100% of the time for all possible inputs

They explain why you don't need 100% accuracy: most compression codecs would only use the network for a prediction, and that prediction doesn't actually have to be correct. The codec stores the residual (the difference between the prediction and the real value) losslessly, so the signal is always recovered exactly; a better predictor just produces smaller residuals that compress further. The prediction only has to be "more likely to be correct" than existing algorithms for the codec to come out ahead.

If you want to read up more on the context of these prediction functions, the general class of compression algorithms you'd use for this is called predictive coding. FLAC and arguably PNG are both predictive codecs: FLAC predicts each audio sample with a linear predictor and encodes the residual, and PNG's filters predict each pixel from its neighbours before entropy coding.
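
To make the "prediction doesn't have to be correct" point concrete, here's a minimal sketch of prediction + residual coding (a toy illustration, not any real codec's format): the decoder reruns the same predictor, so reconstruction is exact no matter how wrong the predictions are, and a better predictor only shrinks the residuals an entropy coder would then compress.

```python
def encode(samples, predict):
    # Store only the prediction errors (residuals).
    prev = 0
    residuals = []
    for s in samples:
        residuals.append(s - predict(prev))
        prev = s
    return residuals

def decode(residuals, predict):
    # Rerun the same predictor; the residual corrects any misprediction,
    # so the output is bit-exact regardless of predictor quality.
    prev = 0
    samples = []
    for r in residuals:
        s = predict(prev) + r
        samples.append(s)
        prev = s
    return samples

# Trivial predictor (= delta coding); a neural net could slot in here
# without ever threatening lossless reconstruction.
predict = lambda prev: prev
data = [10, 12, 13, 13, 11]
assert decode(encode(data, predict), predict) == data
```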

[–] BFrizzleFoShizzle@lemmy.nz 8 points 1 year ago

there will be a continuous group of chrome users of ~5% that have the feature disabled

inb4 that 5% gets killedbygoogle.com