How do I go about this? I’ve got 30TB available, 24/7 uptime. It’s not much but it might help.
In case you haven’t looked into it yourself yet…
ArchiveTeam is independent of the Internet Archive, but most of what they collect does end up in the Wayback Machine. Storage space (like yours) isn’t usually what they’re looking for; what they need is the internet bandwidth and “virgin” IP addresses of volunteer “warriors” running their code to scrape various websites and upload the results to AT’s servers, where everything is collected and eventually uploaded again to IA.
Check out https://tracker.archiveteam.org/ for current projects
Talk to the Archive Team on their IRC channel or take a look at their website; they’re sometimes looking for programmers, archivists, donors, or a variety of other people to help with what they need.
Consider running the Warrior too; it’s pretty lightweight.
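If it helps, here’s roughly what starting a Warrior under Docker looks like, written with the docker Python SDK rather than the usual one-line `docker run`. This is just a sketch: the image name is the one the ArchiveTeam wiki has documented, so double-check it’s still current before using it.

```python
# Minimal sketch: start an ArchiveTeam Warrior container via the Docker SDK.
# Assumptions: Docker is installed and running, and the image name below
# (taken from the ArchiveTeam wiki) is still current -- verify before running.
import docker

client = docker.from_env()

container = client.containers.run(
    "atdr.meo.ws/archiveteam/warrior-dockerfile",  # image name: check the AT wiki
    name="archiveteam-warrior",
    detach=True,
    ports={"8001/tcp": 8001},                      # web UI ends up on http://localhost:8001
    restart_policy={"Name": "unless-stopped"},     # keeps it running for that 24/7 uptime
)
print(f"Warrior started: {container.short_id} -- open http://localhost:8001 to pick a project")
```

Once it’s up, you choose a project (or “ArchiveTeam’s choice”) from the web UI and it handles the scraping and uploading on its own.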
If that doesn’t work out, with that much storage you could also back up a big chunk of Wikipedia and still have space to spare. They make torrents and database dumps publicly available.
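Pulling a dump down is about this simple, though for the really big files their torrents are the friendlier option. The URL below assumes the standard dumps.wikimedia.org layout for the English Wikipedia “latest” articles dump; check the site for other wikis or dated snapshots.

```python
# Minimal sketch: stream one of Wikimedia's public database dumps to disk.
# Assumption: the URL follows the usual dumps.wikimedia.org layout for the
# English Wikipedia "latest" articles dump -- browse the site to confirm.
import requests

DUMP_URL = "https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2"
OUT_FILE = "enwiki-latest-pages-articles.xml.bz2"

with requests.get(DUMP_URL, stream=True, timeout=60) as resp:
    resp.raise_for_status()
    with open(OUT_FILE, "wb") as fh:
        # Stream in 1 MiB chunks so the multi-GB file never sits in memory.
        for chunk in resp.iter_content(chunk_size=1024 * 1024):
            fh.write(chunk)

print(f"Saved dump to {OUT_FILE}")
```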
Thanks, I’ll look into it.
I already have a full backup of the entire English Wikipedia using Kiwix. It’s surprisingly small: around 100 GB.