Yes.
But "the haves" like to pretend its not in order to make it seem like everything's "fair".
Geoff Hinton's 1986 backpropagation paper is like 4 pages.
Nowadays that would be called a brain-fart.
And it had already been invented like a dozen times. Also, it's just the chain rule.
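To be fair to the "just the chain rule" point, here's a minimal sketch of what that means concretely. The toy network, the loss, and all the names here are mine, purely for illustration, not from the paper: backprop on a one-hidden-unit net is nothing but the chain rule peeled one layer at a time.

```python
import math

# Toy "network": y = w2 * sigmoid(w1 * x), with loss L = 0.5 * y^2.
# Illustrative only -- not taken from Hinton's paper.

def forward(x, w1, w2):
    z = w1 * x
    h = 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
    y = w2 * h
    return z, h, y

def backward(x, w1, w2):
    z, h, y = forward(x, w1, w2)
    # Chain rule, applied outermost-first:
    dy = y                   # dL/dy  for L = 0.5 * y^2
    dw2 = dy * h             # dL/dw2 = dL/dy * dy/dw2
    dh = dy * w2             # dL/dh  = dL/dy * dy/dh
    dz = dh * h * (1.0 - h)  # dL/dz  = dL/dh * sigmoid'(z)
    dw1 = dz * x             # dL/dw1 = dL/dz * dz/dw1
    return dw1, dw2

print(backward(x=1.0, w1=0.5, w2=-0.3))
```

That's the whole mechanism; the calculus itself wasn't the news in 1986.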
Hinton's paper was famous not because he claimed to invent backprop, but because (iirc) it was the first instance of it being used to optimize neural nets.
Like the transformer paper is famous, but it didn't invent attention; it just applied it in a novel way.