You can use the 1070 to run the desktop, Stable Diffusion, Whisper, TTS, and other add-ons.
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
It will work, but you will be limited to 1070 speeds if you use all 3 GPUs.
Be sure to prioritize the 3090's PCIe lanes.
I just use Linux. Stop the X server, or don't install it at all. (If you have it, just hit Ctrl+Alt+F1, log in, and run sudo systemctl stop lightdm.) Enjoy having the entirety of your VRAM, minus about 1 MB, available to you; from there you run oobabooga or an API server or whatever and connect to it from your laptop.
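For anyone following along, the steps above look roughly like this. This assumes lightdm is the display manager; on your distro it might be gdm3 or sddm instead, so check first:

```shell
# Find out which display manager is actually running:
systemctl status display-manager

# Switch to a TTY with Ctrl+Alt+F1, log in, then stop it
# (substitute gdm3/sddm if that's what the status showed):
sudo systemctl stop lightdm

# Confirm the GPU is now nearly idle and VRAM is free:
nvidia-smi
```

To make it permanent, `sudo systemctl disable lightdm` keeps the box booting straight to a console.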
As an added benefit, I can leave the GPU, with all its noisy fans and heat, somewhere else in my home. And you don't need to connect a display to the Linux box either; just set up openssh-server and keys, and work remotely.
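A rough sketch of that remote setup, assuming a Debian/Ubuntu box; the hostname `gpu-box`, the username, and the port are placeholders (7860 is the default for oobabooga's web UI, yours may differ):

```shell
# On the headless GPU box: install and enable the SSH server
sudo apt install openssh-server
sudo systemctl enable --now ssh

# On the laptop: generate a key and copy it to the box
ssh-keygen -t ed25519
ssh-copy-id user@gpu-box

# Forward the web UI / API port to the laptop, then browse localhost:7860
ssh -L 7860:localhost:7860 user@gpu-box
```

The port forward means the server never has to be exposed on the LAN at all; only SSH is listening.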
Clever. I often run SD and the text UI in --listen mode, then use them on my iPad.
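For reference, both web UIs take a --listen flag that binds them to all interfaces instead of just localhost, which is what makes the iPad trick work (launch paths are the standard ones from each repo; adjust to where you installed them):

```shell
# AUTOMATIC1111 Stable Diffusion web UI, reachable from other devices on the LAN:
./webui.sh --listen

# text-generation-webui (oobabooga), same idea:
python server.py --listen
```

Note that --listen exposes the UI to anything on your network, so only use it on a LAN you trust.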