CLIPS is used in my graduate-level AI class and across various industries that use expert systems. It's definitely not abandoned.
I believe it was first introduced in "Cognitron: A self-organizing multilayered neural network," though it wasn't referred to as ReLU there. It was popularized by "Deep Sparse Rectifier Neural Networks" and "Rectified Linear Units Improve Restricted Boltzmann Machines".
In regard to deep learning and GPU use: it's efficient compared to other activation functions because it consists only of a comparison and thresholding operation, and its derivative is just 1 for positive inputs and 0 otherwise (for backpropagation). It's effective because it adds non-linearity between layers of linear operations like convolutions.
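As a rough sketch of why that's cheap, here's a minimal NumPy example of the forward pass and the gradient mask (not tied to any particular framework; function names are my own):

```python
import numpy as np

def relu(x):
    # Forward pass: elementwise comparison/thresholding, no exp or division
    return np.maximum(x, 0.0)

def relu_grad(x, upstream_grad):
    # Backward pass: the derivative is 1 where x > 0 and 0 otherwise,
    # so the gradient is just a boolean mask applied to the upstream gradient
    return upstream_grad * (x > 0.0)

# Example: pre-activations coming out of a linear/convolution layer
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))                        # [0.  0.  0.  0.5 2. ]
print(relu_grad(x, np.ones_like(x)))  # [0. 0. 0. 1. 1.]
```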
Start contributing to GitHub repositories like tinygrad. If you have the skills of an ML engineer, you can make these contributions. Your work will speak for itself.
My exact setup: a MacBook Air to SSH into my Linux machine with an Nvidia RTX 3090.
You should find a GitHub repo, like tinygrad, and make contributions to it. Eventually, your work will speak for itself and you'll be much better positioned to land a job. If, after finding several repos to contribute to, you find yourself unable to make such contributions, take an honest assessment of your skills and set goals to improve where you fall short. I've been working in this area for several years and would never dare claim something like "knowing how to do everything". Be specific about what you know how to do. Maybe you're just trollin' us.