Unless your 10TB is text, there's no way you're going to get 20% compression to fit 8TB.
Data Hoarder
We are digital librarians. Among us are represented the various reasons to keep data -- legal requirements, competitive requirements, uncertainty of permanence of cloud services, distaste for transmitting your data externally (e.g. government or corporate espionage), cultural and familial archivists, internet collapse preppers, and people who do it themselves so they're sure it's done right. Everyone has their reasons for curating the data they have decided to keep (either forever or For A Damn Long Time (tm)). Along the way we have sought out like-minded individuals to exchange strategies, war stories, and cautionary tales of failures.
Have you weeded out or linked any duplicate files first? Might save you some room.
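If you want a quick way to find candidates, a rough sketch like this works on any unix-like system with GNU coreutils (the /data path is just a placeholder); dedicated tools like fdupes or rmlint do the same job with more safety checks:

    # Hash every file under /data, then print groups of files that share a checksum.
    # These are only *candidate* duplicates -- compare the files before deleting anything.
    find /data -type f -print0 \
      | xargs -0 sha256sum \
      | sort \
      | uniq -w64 --all-repeated=separate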
I'd STRONGLY suggest not messing with TB-sized archives or archive parts. Pool the smaller drives together with mergerfs (or use an rclone union remote if you're on an OS that doesn't have mergerfs), then copy everything you want onto the pool. If it looks like it won't all fit, enable filesystem compression (assuming the files will actually compress); otherwise, exclude some directories you don't really need to back up.
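For reference, the mergerfs side of that can be a single mount command; a rough sketch, assuming the smaller drives are already mounted at /mnt/disk1 and /mnt/disk2 (made-up paths) and you want the pool at /mnt/pool:

    # Present two existing mounts as one filesystem at /mnt/pool.
    # category.create=mfs writes each new file to whichever branch has the most free space;
    # moveonenospc=true retries on another branch if the chosen one fills up mid-write.
    mergerfs -o defaults,allow_other,category.create=mfs,moveonenospc=true \
      /mnt/disk1:/mnt/disk2 /mnt/pool

Then you just copy into /mnt/pool and let mergerfs decide which physical disk each file lands on.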
You could use tar to do that; it's standard on unix-like systems (macOS, Linux, etc.) and is even built into Windows nowadays.
Here is a website with more information:
https://www.thewebhelp.com/linux/creating-multivolume-tar-files/
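The short version of that page: GNU tar can split an archive into fixed-size volumes with --multi-volume and --tape-length. Note this is GNU tar only; the bsdtar that ships with Windows doesn't support multi-volume archives, so this particular trick is for Linux/WSL and the like. A rough sketch with placeholder sizes and names:

    # Create a multi-volume archive of /data in ~100 GiB pieces
    # (--tape-length is in units of 1024 bytes). tar pauses and asks for the
    # next volume name each time a piece fills up. Compression (-z) can't be
    # combined with --multi-volume in GNU tar.
    tar --create --multi-volume --tape-length=104857600 --file=backup.part1.tar /data

    # Extract by feeding the volumes back in order.
    tar --extract --multi-volume --file=backup.part1.tar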
Or something like this would also work, but it requires the split command, which I don't think is available on Windows:
https://superuser.com/a/290990
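That answer boils down to piping tar into split; a minimal sketch, assuming a unix-like shell and made-up paths and sizes:

    # Stream a compressed tar of /data through split, producing 1 GiB chunks
    # named backup.tar.gz.part-aa, backup.tar.gz.part-ab, and so on.
    tar -czf - /data | split -b 1G - backup.tar.gz.part-

    # To restore, concatenate the chunks back into a single stream and untar it.
    cat backup.tar.gz.part-* | tar -xzf -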