Ok_Reality2341

joined 11 months ago
[–] Ok_Reality2341@alien.top 1 points 11 months ago

Send a DM bro, got a few questions

 

Title really - right now the leadership of OpenAI is unclear, which isn’t good for the industry that follows it.

We need a strong leader to focus our efforts, but if this isn’t the case - perhaps it’ll get chaotic?

 

[–] Ok_Reality2341@alien.top 1 points 11 months ago

6th degree polynomial?
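If the plot in the parent is what I think it is, here’s a rough sketch of what a 6th degree polynomial fit looks like with numpy.polyfit - the data below is made up just to illustrate:

    import numpy as np
    import matplotlib.pyplot as plt

    # toy data standing in for whatever the original plot shows (assumption)
    x = np.linspace(-3, 3, 50)
    y = np.sin(x) + 0.1 * np.random.randn(50)

    # fit a degree-6 polynomial and evaluate it on a dense grid
    coeffs = np.polyfit(x, y, deg=6)
    xs = np.linspace(x.min(), x.max(), 200)
    ys = np.polyval(coeffs, xs)

    plt.scatter(x, y, label="data")
    plt.plot(xs, ys, label="degree-6 fit")
    plt.legend()
    plt.show()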

[–] Ok_Reality2341@alien.top 1 points 11 months ago (1 children)

Why do you not see the value in Bitcoin? It is a decentralised currency - that is valuable to a lot of people

[–] Ok_Reality2341@alien.top 1 points 11 months ago

Your problem is that you are surrounded by other ML engineers

Every bit of data is quickly fed to a neural network - new problems require new data

The best advice I can give is to join an industry where ML engineers are rare - you’ll have first pick of all the data

 

In my master’s degree I always ran many computations, as did all my peers

The reality is that more of us than not are using huge HPC clusters / cloud computing for many hours on each project

The industry is just GPUs going BRRR

I’m wondering if this has potential implications for ML in society as AI/ML becomes more mainstream

I could see this narrative being easily played up in legacy media

PS - yeah, while there are researchers trying to make things more efficient, the general trend is that we are using more GPU hours per year in order to continue innovating at the forefront of artificial intelligence

 
