The main trick is learning to filter out the bs "attention aware physics informed multimodal graph centric two-stage transformer attention LLM with clip-aware positional embeddings for text-to-image-to-audio-to-image-again finetuned representation learning for dog vs cat recognition and also blockchain" papers with no code.
That still leaves you with quite a few good papers, so you need to narrow down to your specific research area. There's no way you can stay caught up on all of ML.
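To make it concrete, here's the kind of dumb keyword filter I mean — just a toy sketch with a made-up buzzword list, threshold, and example titles, not a real classifier:

```python
# Toy buzzword filter: score a paper title by how many hype terms it stacks
# together and skip anything past a threshold. The list and threshold here
# are arbitrary placeholders.
BUZZWORDS = {
    "attention", "physics informed", "multimodal", "graph", "transformer",
    "llm", "clip", "finetuned", "representation learning", "blockchain",
}

def buzzword_score(title: str) -> int:
    """Count how many known buzzwords appear in the title."""
    t = title.lower()
    return sum(term in t for term in BUZZWORDS)

def looks_like_bs(title: str, threshold: int = 4) -> bool:
    """Flag titles that pile up too many buzzwords at once."""
    return buzzword_score(title) >= threshold

if __name__ == "__main__":
    # Hypothetical example titles, just to show the filter in action.
    titles = [
        "Attention Aware Physics Informed Multimodal Graph Centric "
        "Transformer LLM for Dog vs Cat Recognition (and Also Blockchain)",
        "A Study of Optimizer Stability in Low-Precision Training",
    ]
    for t in titles:
        print(("SKIP" if looks_like_bs(t) else "READ"), "-", t)
```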
Yeah, those bs ones pop up everywhere. If only there were some model to sort those from the good ones... And I'm kind of giving up on staying caught up, seeing all the answers.