Word is that AMD support is getting better, but the vast majority of people are still using Nvidia.
this post was submitted on 28 Nov 2023
Machine Learning
Take a look at Flux.jl. Last time I checked, they were building in AMD support. You may have more success joining that community than sticking with PyTorch or TensorFlow.
I know people have started implementing AMD support for some popular use cases, like Stable Diffusion inference, but most things still require CUDA. Support is getting better, but there is a long way to go before widespread adoption makes sense. Nvidia has invested heavily in ML software for close to a decade now, which is why nearly everyone is using Nvidia GPUs.
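For what it's worth, PyTorch's ROCm (AMD) builds reuse the `torch.cuda` API, so the same script runs on both vendors; the build's `torch.version.hip` / `torch.version.cuda` metadata tells you which backend you got. A minimal sketch (the `backend_name` helper is mine, not part of PyTorch, and the import is guarded in case PyTorch isn't installed):

```python
def backend_name(hip_version, cuda_version):
    """Classify a PyTorch build by its version metadata.

    ROCm wheels set torch.version.hip; CUDA wheels set
    torch.version.cuda; CPU-only wheels set neither.
    """
    if hip_version is not None:
        return "ROCm"
    if cuda_version is not None:
        return "CUDA"
    return "CPU-only"

try:
    import torch
    # Works unchanged on AMD: ROCm builds answer through torch.cuda.
    print("build:", backend_name(torch.version.hip, torch.version.cuda))
    print("GPU visible:", torch.cuda.is_available())
except ImportError:
    # PyTorch not installed; demonstrate the classification logic only.
    print("build:", backend_name(None, "12.1"))  # hypothetical CUDA wheel
```

So code written against `torch.cuda` doesn't need changes on AMD, but everything built on the CUDA toolkit itself (custom kernels, many extensions) still does.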