
Yeah, as a sort of pet project, I don’t think backing up the whole thing is possible.

You might be able to back up a significant portion of the unique data in the IA if you limited it to text files. I think they probably have the highest information-to-file-size ratio.

It’s also probably the data most likely to already be backed up, though. Interesting issue; you might also get somewhere by cutting the 50TB up into 10GB torrents (or 100GB or whatever, something reasonable for a consumer hard drive) and maybe adding a script that checks the torrent swarm stats to recommend a torrent to download.

Something where I run it, tell it I want to let it use 600GB, and it hands me torrent files for the least seeded 600GB. Maybe a super basic web UI so people can see how well backed up it is?
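A minimal sketch of that "hand me the least-seeded 600GB" step, assuming the chunk catalog and current seeder counts have already been pulled from a tracker scrape; Torrent and pick_least_seeded here are made-up names for illustration, not an existing tool:

    from dataclasses import dataclass

    @dataclass
    class Torrent:
        name: str        # e.g. "ia_texts_chunk_0423.torrent"
        size_bytes: int  # payload size of the chunk
        seeders: int     # current seed count from a tracker scrape

    def pick_least_seeded(torrents: list[Torrent], budget_bytes: int) -> list[Torrent]:
        """Greedily pick the worst-seeded chunks that fit in the donor's budget."""
        chosen: list[Torrent] = []
        used = 0
        # Sort so the rarest chunks (fewest seeders) come first.
        for t in sorted(torrents, key=lambda t: t.seeders):
            if used + t.size_bytes <= budget_bytes:
                chosen.append(t)
                used += t.size_bytes
        return chosen

    if __name__ == "__main__":
        catalog = [
            Torrent("chunk_000", 10 * 2**30, seeders=42),
            Torrent("chunk_001", 10 * 2**30, seeders=1),
            Torrent("chunk_002", 10 * 2**30, seeders=0),
        ]
        # "let it use 600GB" -> hand back torrents for the least-seeded 600GB
        for t in pick_least_seeded(catalog, budget_bytes=600 * 2**30):
            print(t.name, t.seeders)

A real version would presumably also re-check the swarm periodically and swap out chunks that have picked up enough seeders, but a greedy rarest-first pass like this is probably enough to start with.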

Unsure if people would sign on or not; I probably would. I’ve got 10 or so TB of NFS storage I’m not using that I could chuck at it. I would guess there are other data hoarders out there who would do the same, but only if it were somewhat easy. I’m probably not going to volunteer to do an hour of rtorrent cleanup a week to make sure I’m backing up the right things.

I think part of the scope of this project may have already been solved by the Freenet Project (now Hyphanet) [0].

[0]: https://www.hyphanet.org/