ChaosLamp_Genie@alien.top · 1 point · 10 months ago

Excited for you. My advice is to buddy up with other PhD students, as that is often the best part of the journey. Academics in CS can be shy, so put some effort into making the first move.

We really need more efficient networks! Here is an insight for you. Current large models apply the same operator to billions of neurons. But just as in classical circuits, where implementing a truth table with NAND gates alone can require exponentially more gates than the best mix of gate types, uniform neural circuitry can suffer from the same kind of inefficiency. For example, you can approximate a smooth curve with thousands of ReLU units, or you can use a single trigonometric function. Neural architecture search (NAS) looks at this, but it is definitely not there yet, and it is even more expensive than ordinary training!
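To make the ReLU-versus-trig point concrete, here is a minimal sketch (my own toy illustration, not from any particular paper): a one-hidden-layer ReLU network with one unit per knot realizes exactly a piecewise-linear interpolant, so counting knots needed to hit a given accuracy on sin(x) shows how many units the "wrong" operator costs, where a single sin() evaluation reproduces the curve exactly.

```python
import numpy as np

def relu_approx(x, knots):
    """Piecewise-linear interpolant of sin at the given knots.

    This is exactly the function computed by a 1-hidden-layer ReLU
    network with one unit per knot, so the knot count stands in for
    the number of ReLU units required.
    """
    return np.interp(x, knots, np.sin(knots))

x = np.linspace(0, 2 * np.pi, 10_000)
for n in (8, 64, 512):
    knots = np.linspace(0, 2 * np.pi, n)
    err = np.max(np.abs(np.sin(x) - relu_approx(x, knots)))
    print(f"{n:4d} ReLU units -> max error {err:.2e}")
# By contrast, the single operator sin(x) is exact with one "unit".
```

The max error shrinks roughly like 1/n², so squeezing the error by 100x costs about 10x more ReLU units, while the right operator costs one.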