For those of us who are unaware: what purpose does aria2c serve in your downloading process?
It can speed up downloads on sites that throttle per connection, so downloading eight fragments at a time speeds things up considerably.
That's what I've used it for.
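A minimal sketch of that setup, assuming yt-dlp's external-downloader options; the URL and the connection counts are placeholders, not from the thread:

```shell
# Hand each download to aria2c, opening up to 8 connections per server (-x 8)
# and splitting the file into 8 pieces (-s 8) to dodge per-connection limits.
yt-dlp \
  --downloader aria2c \
  --downloader-args "aria2c:-x 8 -s 8" \
  "https://www.youtube.com/watch?v=EXAMPLE"
```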
Have you tried checking the aria2c settings? They might be causing the fragmentation issue. Also, try the --fixup flag in yt-dlp to see if it can repair the fragmented files. Good luck!
What I would do is create a new directory, then download a few videos into subdirectories within it using only yt-dlp, no aria2c. When that's done, compare the videos from the yt-dlp-only session with the earlier aria2c ones: play the files side by side if possible, check the durations, check the file sizes, and run diff in a terminal with both files as arguments. If diff reports a mismatch between the binary files, they really do differ somewhere.
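The comparison step above can be sketched like this; the filenames are stand-ins (here fabricated so the commands run anywhere), and cmp is used alongside diff since it is quieter for binary files:

```shell
# Stand-ins for the two download runs: video_a.mp4 (yt-dlp alone)
# and video_b.mp4 (yt-dlp + aria2c).
printf 'same bytes' > video_a.mp4
printf 'same bytes' > video_b.mp4

# Cheap check first: compare file sizes.
ls -l video_a.mp4 video_b.mp4

# Then compare contents byte for byte. cmp -s is silent on a match and
# exits nonzero at the first differing byte.
if cmp -s video_a.mp4 video_b.mp4; then
  echo "files are identical"
else
  echo "files differ"
fi
```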
Maybe the NAS is the problem.
aria2c will fetch these fragments, but at the end ffmpeg (or an equivalent merger) should merge everything and delete all the temporary fragments.
Somehow they are not getting deleted.
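If the leftovers really are just consecutive pieces of one stream, the merge-and-clean step amounts to something like the sketch below. The fragment names and the assumption that they concatenate byte-for-byte are mine, not from the thread; real fragments may need ffmpeg to remux after joining:

```shell
# Fabricated stand-ins for leftover fragment files (yt-dlp names real ones
# along the lines of <title>.f137.mp4.part-Frag0, -Frag1, ...).
printf 'AAA' > video.part-Frag0
printf 'BBB' > video.part-Frag1

# Join the fragments in order to rebuild the stream.
cat video.part-Frag0 video.part-Frag1 > video.rebuilt

# This delete is the cleanup step that is apparently not happening on the NAS.
rm video.part-Frag0 video.part-Frag1
```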
Ask in the yt-dlp Discussions on GitHub.