[–] GinjaTurtles@alien.top 1 points 11 months ago

I’ve messed around with a bunch of open source AI TTS models that I can self-host. Here’s my 2 cents:

  • mrq has a repo where you can fine-tune Tortoise on your own audio samples using a GUI: https://git.ecker.tech/mrq/ai-voice-cloning. There are some good YouTube videos about it by Jarrod’s Journey
  • If you want some of the best-sounding local TTS, a fine-tuned Tortoise model plus a fine-tuned RVC model gives very nice quality
  • The Tortoise maintainer recently added HiFi-GAN for even faster inference, but I don’t think you can fine-tune that HiFi-GAN model yet since it’s a custom implementation for Tortoise
  • One of the models I’m going to look into next, and which sounds incredible, is Google’s SoundStorm. I believe a few people have published open source PyTorch implementations of SoundStorm on GitHub
  • I’m not sure how good a fine-tuned SoundStorm would be, but that’s what I’m going to try next when I have time (work sucks)
[–] GinjaTurtles@alien.top 1 points 11 months ago

I recently scoured the internet looking for good options for a custom deep learning API project I’m working on (I wanted to host the API on cloud GPUs, be able to scale it, and not burn my wallet).

The big guys (Microsoft, AWS, Google) can be pretty pricey for cloud GPUs, especially for side projects. But I think they offer startup credits, which can be very useful.

The TensorDock marketplace ended up being my top choice. You can get some really cheap 4090s/3090s ($0.30-$0.40 an hour) running in data centers. I haven’t had any issues with them yet.

Other options I found were:

RunPod: $0.44/hr for a 3090

Genesis Cloud: $0.30/hr for a 3080

Vast.ai: also really cheap GPUs, but I’ve heard mixed things about them

Google Cloud: a T4 for, I believe, $0.35/hr

I’m not paid by any of these companies to promote their stuff. Hopefully this comment helps someone
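If you want to sanity-check what those hourly rates actually cost you per month, here’s a quick back-of-the-envelope sketch. The rates are the approximate ones quoted above (they change often, so treat them as placeholders), and the 8 hours/day usage assumption is mine:

```python
# Back-of-the-envelope monthly GPU cost comparison.
# Rates below are approximate and change frequently; check each provider.
RATES_PER_HOUR = {
    "TensorDock 4090/3090": 0.35,  # roughly $0.30-$0.40/hr
    "RunPod 3090": 0.44,
    "Genesis Cloud 3080": 0.30,
    "Google Cloud T4": 0.35,
}

def monthly_cost(rate_per_hour: float, hours_per_day: float = 8.0, days: int = 30) -> float:
    """Estimated cost of running one instance hours_per_day per day for `days` days."""
    return rate_per_hour * hours_per_day * days

if __name__ == "__main__":
    # Print cheapest first.
    for name, rate in sorted(RATES_PER_HOUR.items(), key=lambda kv: kv[1]):
        print(f"{name}: ~${monthly_cost(rate):.2f}/month at 8h/day")
```

The main takeaway is that even the cheap marketplace GPUs add up fast if you leave instances running 24/7, which is why I only spin them up while the API is actually serving traffic.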
