I've tried to help by keeping and seeding some historical archives, but downloading them one by one gets tedious quickly (and there's not much activity in them anyway). It would be nice if there were a more organized way to help with archival, like downloading a large chunk of their database and seeding it automatically!
On one of my high seas sites there is an archive team that uses the site's API to find "at risk" torrents (<3 seeds). People dedicate X TB and run a script that hits the API, finds some at-risk torrents, grabs them to seed, and repeats until it fills the space allotted for archiving purposes.
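The selection loop such a script runs is simple enough to sketch. This is a minimal illustration of the idea, not that site's actual script; the torrent fields and the "rarest first" ordering are assumptions.

```python
# Sketch of the at-risk selection logic: keep grabbing torrents with
# fewer than `seed_threshold` seeds until the dedicated space is full.
# The dict fields ("size", "seeders") are hypothetical, not a real API.

def pick_at_risk(torrents, budget_bytes, seed_threshold=3):
    """Return at-risk torrents, rarest first, that fit in the budget."""
    at_risk = [t for t in torrents if t["seeders"] < seed_threshold]
    at_risk.sort(key=lambda t: t["seeders"])  # rarest first
    picked, used = [], 0
    for t in at_risk:
        if used + t["size"] <= budget_bytes:
            picked.append(t)
            used += t["size"]
    return picked

# Example with made-up data and a 1 TiB budget:
torrents = [
    {"name": "a", "size": 600 * 1024**3, "seeders": 1},
    {"name": "b", "size": 500 * 1024**3, "seeders": 2},
    {"name": "c", "size": 300 * 1024**3, "seeders": 9},
]
print([t["name"] for t in pick_at_risk(torrents, 1024**4)])  # → ['a']
```

In practice the API call replaces the hardcoded list, and the loop re-runs periodically as torrents gain or lose seeds.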
Seems like something the IA could implement with some help.
Multiple sources within each tier are treated equally.
The general thought process there is more about the site disappearing, so the focus on sources makes sense: transcodes can be created trivially for a theoretical new site if needed.
With the IA this is unlikely to be an issue, as they'd just want to make sure each torrent has seeds, so there's no real need for prioritization around sources.