Relevant-Yak-9657

joined 11 months ago
[–] Relevant-Yak-9657@alien.top 1 points 11 months ago

But saying that it dropped AngularJS is like saying that Google dropped TensorFlow. They just rebooted it, the same way TensorFlow got rebooted, right? Thanks for mentioning Noop though, I had no idea it existed lol.

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago

Yes, completely. In fact, tf.keras will just be swapped internally (in the source code) for keras_core, but you won't notice any difference in tf.keras (except for the removal of some currently deprecated legacy code and a visual update in .fit()).
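To illustrate, here is a rough sketch of what the user-facing side looks like, assuming keras_core is installed (`pip install keras-core`); the model and data below are toy examples, and existing tf.keras code is expected to keep working unchanged:

```python
import numpy as np
import keras_core as keras  # multi-backend Keras; defaults to the TensorFlow backend

# Toy model, same API shape as tf.keras
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(64, 8).astype("float32")
y = np.random.rand(64, 1).astype("float32")
model.fit(x, y, epochs=1, batch_size=16)  # .fit() is where the visual refresh shows up
```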

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago

Yes, same story. Keras allowed me to understand the basics. Personally, my journey has been Keras for architecture, PyTorch/TensorFlow for implicit gradient differentiation, JAX for explicit gradient optimization, and then creating a library on top of JAX to understand how these libraries work.
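For anyone curious, this is roughly what I mean by "explicit" gradients in JAX (a minimal sketch with a made-up toy loss; PyTorch's loss.backward() accumulates the same gradients implicitly on the tensors):

```python
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

params = {"w": jnp.zeros((8, 1)), "b": jnp.zeros((1,))}
x = jnp.ones((16, 8))
y = jnp.ones((16, 1))

# Gradients come back as an explicit pytree that you update yourself
grads = jax.grad(loss_fn)(params, x, y)
params = jax.tree_util.tree_map(lambda p, g: p - 0.1 * g, params, grads)
```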

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago

As the others said, it's a pain to reimplement common layers in JAX (specifically). PyTorch is much higher level in its nn API, but personally I despise rewriting the same training loop for every implementation. That's why even JAX users lean on Flax for common layers: why use an error-prone operator like jax.lax.conv_general_dilated and fill in its ten-odd arguments every time? I would rather use flax.linen.Conv or keras_core.layers.Conv2D in my Sequential model and save myself a million rounds of debugging. Compared to a hand-written PyTorch loop, model.fit() can quickly suffice and be customized later.
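To make the contrast concrete, here is a small Flax sketch (toy network, made-up shapes) where nn.Conv hides the low-level jax.lax.conv_general_dilated call:

```python
import jax
import jax.numpy as jnp
import flax.linen as nn

class SmallCNN(nn.Module):
    @nn.compact
    def __call__(self, x):
        x = nn.Conv(features=16, kernel_size=(3, 3), padding="SAME")(x)  # wraps the lax conv op
        x = nn.relu(x)
        return nn.Dense(features=10)(x.reshape((x.shape[0], -1)))

model = SmallCNN()
x = jnp.ones((1, 28, 28, 1))                   # NHWC input
params = model.init(jax.random.PRNGKey(0), x)  # parameters live outside the module
logits = model.apply(params, x)
```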

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago (2 children)

I would also like to hear about some programming frameworks (or languages) that Google has abandoned before.

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago

Actually, another perspective is that TensorFlow's deployment story is something JAX doesn't have (not that I know of), and cutting it would be idiotic for Google, since they would be eliminating their own tooling in the middle of an ongoing AI revolution. TensorFlow is their current production tool, and if they are ever going to abandon it, they will need a strong replacement for its deployability, which guarantees it a few more years (the JAX team doesn't seem to be very focused on deployment). IIRC JAX currently deploys through TensorFlow.
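For reference, the usual JAX-through-TensorFlow path goes via jax.experimental.jax2tf; a rough sketch (the function and shapes below are made up):

```python
import jax.numpy as jnp
import tensorflow as tf
from jax.experimental import jax2tf

def predict(x):  # toy JAX function standing in for a real model
    return jnp.tanh(x) * 2.0

# Convert to a TF-compatible function and export as a standard SavedModel,
# which the usual TF serving/TFLite tooling can then pick up.
module = tf.Module()
module.f = tf.function(jax2tf.convert(predict), autograph=False,
                       input_signature=[tf.TensorSpec([None, 4], tf.float32)])
tf.saved_model.save(module, "/tmp/jax_saved_model")
```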

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago

Yeah, unifying these tools feels like the best way to go for me too. I also like JAX for a similar reason: there are 50 different libraries with different use cases, and it is easy to mix parts of them together thanks to the common infrastructure. Like Keras losses + Flax models + Optax training + my own library's superclasses. It's great tbh.
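As a small taste of that mixing (just the Flax + Optax part, with a made-up toy loss and data):

```python
import jax
import jax.numpy as jnp
import flax.linen as nn
import optax

model = nn.Dense(features=1)                                  # Flax layer
params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 8)))

optimizer = optax.adam(1e-3)                                  # Optax optimizer
opt_state = optimizer.init(params)

def loss_fn(params, x, y):
    return jnp.mean((model.apply(params, x) - y) ** 2)

@jax.jit
def train_step(params, opt_state, x, y):
    grads = jax.grad(loss_fn)(params, x, y)
    updates, opt_state = optimizer.update(grads, opt_state)
    return optax.apply_updates(params, updates), opt_state

x, y = jnp.ones((16, 8)), jnp.ones((16, 1))
params, opt_state = train_step(params, opt_state, x, y)
```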

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago (5 children)

Also, JAX is not officially a Google product, but rather a research project. So on paper, TensorFlow is Google's official framework for deep learning.

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago

That could be a valid concern. Personally, I'm not too worried, since this is just speculation. Besides, the field is diverse enough that most people would benefit from learning multiple frameworks.

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago

Yeah, from what I see, despite the mess TensorFlow might be, it is still getting updated frequently and has been improving lately. Not sure why they would deprecate it anytime soon.

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago

It saves time when creating neural nets. If you want to tap into some of its speed without spending hours working out training methods, Keras can be a good addition. Other than that, Flax is probably the best way to get full tinkering freedom while still having a nicer abstraction than a NumPy-like API.
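If it helps, this is the kind of setup I mean by using Keras as an addition (a rough sketch using keras_core's backend switch; the toy model and data are made up):

```python
import os
os.environ["KERAS_BACKEND"] = "jax"   # must be set before importing keras_core

import numpy as np
import keras_core as keras

model = keras.Sequential([
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(np.random.rand(32, 4).astype("float32"),
          np.random.rand(32, 1).astype("float32"),
          epochs=1)
```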

[–] Relevant-Yak-9657@alien.top 1 points 11 months ago (4 children)

Most people here would say that PyTorch is better, but IMO Keras is fine, and no hate to TensorFlow either. They just made a lot of questionable API design changes, and FC has been weird on Twitter. For me, it is pretty exciting, since Keras Core seems pretty stable as I use it, and it is just another great framework for people new to deep learning or for quick prototyping.
