Are there any good cloud hosts offering AMD GPUs?
Machine Learning
Well, now I know what I’m doing with my weekend. Thanks for sharing! Hopefully I can report back some XTX performance numbers.
I tried this config: Ryzen 9 7950X + MI210. I got a throughput of 129 requests/min, 1028.89 tokens/s on Llama 2 7B, which is even better than the performance they cite in the post.
Will report back on 13b performance ASAP
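For anyone wanting to reproduce numbers like these, here is a minimal measurement sketch. The `throughput_stats` helper is my own, and the commented-out benchmark loop is an assumption-laden example of vLLM's offline `LLM.generate` API (model name, prompt set, and token budget are all illustrative); it requires a ROCm build of vLLM on AMD hardware:

```python
import time

def throughput_stats(num_requests: int, num_output_tokens: int, elapsed_s: float):
    """Turn raw counts into the two figures quoted above: requests/min and tokens/s."""
    return num_requests / elapsed_s * 60.0, num_output_tokens / elapsed_s

# The actual run needs a ROCm build of vLLM and an AMD GPU, e.g.:
#
#   from vllm import LLM, SamplingParams
#   llm = LLM(model="meta-llama/Llama-2-7b-hf")   # model name is an assumption
#   prompts = ["Hello, my name is"] * 1000        # illustrative prompt set
#   t0 = time.time()
#   outputs = llm.generate(prompts, SamplingParams(max_tokens=128))
#   elapsed = time.time() - t0
#   tokens = sum(len(o.outputs[0].token_ids) for o in outputs)
#   print(throughput_stats(len(prompts), tokens, elapsed))
```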
Will this run on a Ryzen with built-in Radeon graphics?
If so, couldn’t you build a Ryzen machine with, say, 128GB of RAM and dedicate nearly all of it to video memory?
There are limits, though some motherboards allow more than others: https://www.tomshardware.com/news/dollar95-amd-cpu-becomes-16gb-gpu-to-run-ai-software
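If you experiment with this, a quick way to check how much memory the iGPU actually ended up with is to query the device properties. A small sketch (ROCm builds of PyTorch expose AMD GPUs through the `torch.cuda` namespace; the query itself is commented out since it needs a GPU, and only my unit-conversion helper runs standalone):

```python
def bytes_to_gib(n_bytes: int) -> float:
    """Convert a raw byte count (as reported by the driver) to GiB."""
    return n_bytes / 2**30

# On a machine with ROCm PyTorch installed:
#
#   import torch
#   props = torch.cuda.get_device_properties(0)  # AMD GPUs appear under torch.cuda on ROCm
#   print(props.name, bytes_to_gib(props.total_memory), "GiB")
#
# Or, without Python: rocm-smi --showmeminfo vram
```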