ChatGPT recommends the paper "Rectified Linear Units Improve Restricted Boltzmann Machines" by Vinod Nair and Geoffrey E. Hinton, as it is one of the foundational papers introducing and exploring the benefits of ReLUs in neural networks. It also says it is a good starting point for learning about ReLUs and their advantages in machine learning models.

But from your experience, do you have any other papers, textbooks, or even videos that you would recommend to someone learning about it? I don't mind if they're math-heavy, as I have a BSc Honours in Applied Math.

Thanks!

d84-n1nj4@alien.top 1 points 11 months ago

I believe it first appeared in "Cognitron: A self-organizing multilayered neural network" (Fukushima, 1975), though it was not referred to as ReLU there. It was popularized by "Deep Sparse Rectifier Neural Networks" and "Rectified Linear Units Improve Restricted Boltzmann Machines".

In regard to deep learning and GPU use: it's efficient compared to other activation functions because the forward pass is just a comparison and thresholding operation, and the derivative is simply 1 for positive inputs and 0 otherwise (for backpropagation). It's effective because it adds non-linearity between layers of linear operations like convolutions.
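
For concreteness, here's a minimal NumPy sketch of the forward and backward rules just described (the function names are my own, not from any of the papers):

```python
import numpy as np

def relu(x):
    # Forward pass: elementwise max(0, x).
    # Just a comparison and a select, which is why it's so cheap on GPUs.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Backward pass: derivative is 1 where x > 0 and 0 elsewhere.
    # (The subgradient at x == 0 is conventionally taken to be 0.)
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

Compare that with something like sigmoid, whose forward pass needs an exponential and whose gradient needs extra multiplies; the ReLU versions are branchless elementwise ops that vectorize trivially.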