this post was submitted on 25 Nov 2023
Machine Learning
Biggest 'advantage' I can see is that, since Google is deprecating TF soon, JAX is the only Googly deep learning lib left. It fills a niche, insofar as that is a definable niche. I'm sticking with PyTorch for now.
No clue about things like speed/efficiency, which may be a factor.
Do you have a source? IMO TF is too big to deprecate soon. They did stop support for Windows, but nobody abandons an enormous project suddenly.
TLDR: No, they are not officially planning to deprecate TF. Yes they are still actively developing TF. No, that doesn't fill me with much confidence, coming from Google, especially while they are also developing Jax.
Just searched this again and kudos, I can't find anything but official Google statements that they are continuing support for TF in the foreseeable future. For a while people were doom-saying so confidently that Google is completely dropping TF for JAX that I kinda just took it on blind faith.
All that said: TF REALLY COULD GET DEPRECATED SOON. Despite their insistence that this won't happen, Google is known for deprecating strong projects with bright futures with little or no warning. Do not take the size of TensorFlow as evidence that Google is going to stand by it, especially when they are actively developing a competing product in the same niche.
fwiw, it is also the current fad in tech to make high level decisions abruptly without proper warning to engineers. It really does mean almost nothing when a company's engineers are enthusiastically continuing their support of a product.
TF is just not on solid ground.
Also, JAX is not officially a Google product, but rather a research project. So on paper, TensorFlow is Google's official framework for deep learning.
What obligation does Google have to not deprecate tf? Google abandons projects all the time.
Another perspective: deployment is something TensorFlow has that JAX doesn't (not that I know of), and cutting TF would be idiotic for Google, since they'd be eliminating their own tool in the middle of an ongoing AI revolution. TensorFlow is their current tool, and if they are going to abandon it, they will need a strong replacement for its deployability, which does guarantee it a few more years (the JAX team doesn't seem to be very focused on deployment). IIRC JAX deploys via TensorFlow right now.
I would also like to hear about some programming frameworks (or languages) that Google has abandoned before.
Noop (Language)
AngularJS (Framework)
The latter was quite popular as a JavaScript web framework. There may be more examples; I'm not an expert at hating Google.
But saying that Google dropped AngularJS is like saying that Google dropped TensorFlow. They just rebooted it, like TensorFlow, right? Thanks for Noop, though. Had no idea it existed lol.
That could be a valid concern. Personally, I'm not too worried, since this is just speculation. Besides, the field is diverse enough that most people would benefit from learning multiple frameworks.
Yeah, from what I see, despite the mess TensorFlow might be, it still gets updated frequently and has been improving these days. Not sure why they would deprecate it anytime soon.
Google isn’t deprecating TF.
My experience is that JAX is much lower level and doesn't come with batteries included, so you have to pick your own optimization library or module abstraction. But I also find it makes way more sense than PyTorch (`requires_grad`?), and JAX's autograd algorithm is substantially better thought out and more robust than PyTorch's (my background was in compilers and autograd before moving into deep learning during postdocs, so I have dug into that side of things). Plus the support for TPUs makes life a bit easier compared to competing for instances on AWS.
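To illustrate the difference in style: in PyTorch, gradients flow through tensors flagged with `requires_grad`, while in JAX, differentiation is a function transformation. A minimal sketch (the `loss` function and toy data here are made up for illustration, not from the comment above):

```python
import jax
import jax.numpy as jnp

# A plain Python function of its inputs -- no mutable tensor flags needed.
def loss(w, x, y):
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# jax.grad returns a NEW function computing d(loss)/dw (the first argument).
grad_fn = jax.grad(loss)

w = jnp.array([1.0, -2.0])
x = jnp.array([[1.0, 0.0], [0.0, 1.0]])
y = jnp.array([0.0, 0.0])
g = grad_fn(w, x, y)  # gradient of the mean squared error w.r.t. w
```

Because `grad_fn` is just another function, it composes with `jax.jit`, `jax.vmap`, and whatever third-party optimizer library you pick, which is what "batteries not included" ends up meaning in practice.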
It's a drop-in replacement for numpy. It does not get sexier than that. I use it for my research on PDE solvers and deep learning, and being able to just use numpy with automatic differentiation on top is very useful. Previously I was looking at autodiff frameworks like Tapenade, but that's not required anymore.
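The "numpy plus autodiff" workflow can be sketched as follows (this toy function is hypothetical, not the commenter's actual PDE code; Tapenade-style source transformation would be needed for the same derivative in plain numpy):

```python
import jax
import jax.numpy as jnp  # same API surface as numpy for most array code

# Toy scalar functional written exactly as it would be in numpy.
def f(x):
    return jnp.sum(x ** 3)

# Exact derivative via autodiff: df/dx = 3 * x**2, elementwise.
df = jax.grad(f)

x = jnp.array([1.0, 2.0])
g = df(x)  # 3 * x**2, i.e. [3., 12.]
```

Swapping `import numpy as np` for `import jax.numpy as jnp` is often the only change needed before transformations like `grad` become available, which is the drop-in appeal.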