A few months ago I came across maximum mean discrepancy (MMD) as a measure of the difference between distributions. Today I ran into the term again, had totally forgotten what it means, and had to find a YouTube video to refresh my understanding. This happens a lot in my research. I feel like unless something is really basic (e.g. CNNs, cross entropy) and used constantly in my day-to-day model building, I easily forget what I have read. I wonder: is it just that I have a bad memory, or that I don't have a good way to organize information?
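
Writing it down here partly so I don't forget it again: a minimal NumPy sketch of the (biased) squared-MMD estimator with an RBF kernel. The bandwidth and the toy Gaussians are arbitrary illustration choices, not anything from a specific paper.

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    # Pairwise RBF kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))
    sq_dists = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-sq_dists / (2 * sigma**2))

def mmd2(x, y, sigma=1.0):
    # Biased estimate of squared MMD between samples x ~ P and y ~ Q:
    # MMD^2 = E[k(x, x')] - 2 E[k(x, y)] + E[k(y, y')]
    return (rbf_kernel(x, x, sigma).mean()
            - 2 * rbf_kernel(x, y, sigma).mean()
            + rbf_kernel(y, y, sigma).mean())

rng = np.random.default_rng(0)
x  = rng.normal(0.0, 1.0, size=(500, 2))  # samples from P
x2 = rng.normal(0.0, 1.0, size=(500, 2))  # a second sample from P
y  = rng.normal(0.5, 1.0, size=(500, 2))  # samples from a shifted Q
print(mmd2(x, x2), mmd2(x, y))            # small vs. clearly larger
```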

Maykey@alien.top · 11 months ago

I like to think of one piece of research in terms of another. For example, I see Luna as a cousin of RMT: the core idea of both is to get a smaller sequence from a bigger one, but the methods and goals are very different; if you squint, though, you can see the similarities. It helps with breaking a whole paper down into smaller parts and seeing how one piece of research differs from another and where they overlap. I reward myself with a cookie if I find similarities when the papers don't mention each other. I also keep a (paper) notebook where I write down notes.
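
To make that "smaller sequence from a bigger one" idea concrete, here is a toy NumPy sketch: a handful of summary vectors cross-attend over a long sequence and replace it with a short one. The sizes and the random "learned" queries are placeholders, and neither Luna nor RMT does exactly this; it's only the shared skeleton.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d = 64                                    # model width (placeholder)
long_seq = rng.normal(size=(1024, d))     # a long input sequence
queries = rng.normal(size=(16, d))        # 16 "summary" slots (would be learned in practice)

# Cross-attention: each summary slot attends over the whole long sequence,
# producing a 16-token summary that downstream layers can work with.
attn = softmax(queries @ long_seq.T / np.sqrt(d), axis=-1)  # (16, 1024)
short_seq = attn @ long_seq                                 # (16, 64)
print(short_seq.shape)
```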

Disclaimer: I'm not a student/researcher, just a dirty hobbyist.