Ask HN: What's your backup solution?
140 points by SnowingXIV on Nov 20, 2016 | 145 comments
Do you keep work and personal separate? Do you use multiple providers like iCloud/OneDrive/Dropbox/Google Drive or do you stick to one? Do you do any type of auto-sync folders?

What's your process like? I'm curious what people find to be the most automatic and reliable solution for typical day-to-day use with photos, documents, and work backups. (Thinking more along the lines of your own side projects or things you're more in control of, less the corporate side that has imposed solutions managed by a team.)




I own some land in rural TN, with a winterized trailer and access to spring water. That is backup water and shelter for minimal cost. A person-year of shelf-stable food (not fun, but indefinitely survivable) is about $1000. So if I am smart about it and fall back gracefully, every $3k-$4k in savings is a year of "backup survival". I still need to make sure I have at least a year's worth of currency in liquid form in case of infrastructure or banking collapse, and ideally at least six months of food dry-packed on site in case purchasing food while falling back is not viable.

Just a reminder: there is a lot more to back up than your hard drive.


If banking fails, your dollars are going to be in a world of hurt. Do you invest in gold/silver, too?

Unrelated question, what part of TN is your unprotected, cash filled trailer in?


My general expectation is to the contrary. I think people underestimate how psychologically ingrained the dollar is in the American people. Money will still be worth something, maybe not as much as I'd like, but something. People have so much faith in the system that everybody will be hedging against "when things get better".

Even then, a "failure of banking" does not have to be universal. Maybe my assets are frozen, or there are other similar, localized issues.

In the grand scheme of things, gold is also likely to be difficult to liquidate. It does not really have any intrinsic value at this point.

As to where: On a hill, by a lake, surrounded by paranoid gulf war vets with lots of guns.


I would personally stockpile hard liquor if I wanted to hedge for a social collapse. Whiskey never goes bad, it's always useful and a few ounces can go a long way.


> Whiskey never goes bad

As long as you keep it sealed in an air-tight container and in complete darkness. Oxygen and/or light will trigger chemical processes that result in undesirable flavor changes.

Just a heads-up, speaking from experience and research :)


Stockpile antibiotics.


In the Argentine bank crash, the government froze all assets, but they also printed a lot of money, so cash lost much of its value too. But yeah, I believe it would still be worth something.

https://en.wikipedia.org/wiki/1998%E2%80%932002_Argentine_gr...

Edit: I think that other posters mentioning other practical barter-friendly alternatives to cash have a point (stuff like whiskey or soap)


Lots of practical things have value other than money and gold, like ammunition and whiskey.

Tennessee, remember?


You'd be surprised how effective simply burying something in the middle of the woods at a known location is.


How far away are you from the nearest large settlement (1000+ people)? This sort of stuff fascinates me; Europe is so densely populated that this sort of thing isn't as viable.

Possibility to hunt and fish in the region too?


The region is weird. I am about 20 miles from each of the two closest towns. Close enough to have hospital and Walmart access.

I am right up against a TVA lake, which makes for weird geopolitics. As the crow flies, I am only a mile or so from my nearest neighbors, but between impassable terrain and foliage you have to actually travel 2+ miles. Natural lakes tend to have gentle slopes down to them, but the TVA lakes have many ridges jutting out into the lake, each essentially having only 1 access road and cliffs over water on the other 3 sides. Paranoid crazy people like me do tend to like the region because of this. I make a point of spending some time with the neighbors who control the easements, retired vets mostly, and they are more than happy to keep an eye on things in exchange for mostly absentee neighbors.

The deer and rabbits are almost at nuisance levels, and like most TVA lakes, this one has both a local fish population and artificial stocking. Hunters wandering in through the hills or via the lake are my biggest hazard. Subsistence farming plus hunting is common out here.


Similar setup. Didn't get into it for backup but it could serve as that. Add a rifle to the mix for lots of reasons.

Edit: About digital backups I'm less concerned. I've lost stuff and survived just fine. Most of my digital trail is ephemeral and not worth as much to me personally as I thought it was. Pretty much only some writings and many of my photos matter. It's really only about augmenting my memory. Plus the NSA's backups.


I'm using Arq (https://www.arqbackup.com/) with Google Cloud Storage. Arq is a stand alone backup program that supports a bunch of different storage providers. From their site:

> Amazon Cloud Drive, Amazon Web Services, Google Cloud Storage, Google Drive, Dropbox, or OneDrive accounts, or your SFTP server or NAS.

It encrypts locally so I don't have to worry about the backup company reading my files.

I have 3 backups configured. One for "Documents", which backs up every day. Another for photos (which are stored locally on an external HD) and another for music (apparently everyone but me has stopped owning music and just uses Spotify), both of which I run manually once in a while. If I ever get a desktop again I'll make those automated.

I deleted my entire hard drive by accident 5 months ago, so since then I've also been using Time Machine with an external drive. It starts nagging me every 10 days to plug it in.

I also use Google Photos' free service, not as backup but just for access to my photos. Earlier this year I uploaded all 130k photos at their "free" resolution.

I don't back up work; work does that. Most of it is in git. Same with personal projects: it's all on GitHub.


I really wish Arq would make a Unix app. If they did, I would buy licenses for all my machines.


How does this handle one of the backup sources going down or losing your data?


Work must always be separate from personal. Period. End of story.

I use Synology. I simply love it. I have RAID 6, which cuts down on the space I can use, but it gives me the peace of mind that I need, especially after one of my drives already died and I had to get it replaced. With RAID 6 I saw no impact on speed whatsoever, and 2 more drives would have to fail before I lost anything. Thanks to Amazon, I can get a new drive the next day, so the chance of data loss is relatively small.

I also have an external USB drive connected to it that I do full backups to occasionally, every few months (my data doesn't change a huge amount).

Using a Synology plugin, I upload all my pics, encrypted, to Amazon Cloud for $60/year. I'm still figuring out if it's worth it to expand this to all my data, given that I don't completely trust how secure this is, and what it would look like if I did suffer a catastrophic data loss. I'm limited by time though.


For my personal stuff I have a Seafile server on a VPS. I have 'full history' turned on, which essentially gives me snapshots of every uploaded change of every file, browsable by individual file or by date and time for the whole backup space.

For work stuff, I've been loving borg-backup. Others here have mentioned why it's superior; I agree. Once they get support for remote object stores like S3, it will be hands down the best backup solution: https://github.com/borgbackup/borg/issues/102. That said, there is a FUSE layer for S3 that may work (I haven't tried it yet).
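
For anyone who hasn't tried it, the day-to-day borg workflow is only a few commands. A minimal sketch (repo location, source paths, and retention numbers are placeholders, not a recommendation):

    # one-time setup: encrypted, deduplicated repo on a remote host
    borg init --encryption=repokey user@backuphost:/srv/borg/work
    # nightly: create an archive named after host + date
    borg create --stats --compression lz4 \
        user@backuphost:/srv/borg/work::"$(hostname)-$(date +%Y-%m-%d)" ~/work
    # thin out old archives
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
        user@backuphost:/srv/borg/work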


CrashPlan. Dirt cheap for unlimited storage for generational backups for 10 machines. Whole extended family fits in one subscription.


Tried restoring from it yet?

/r/datahoarders has a long list of horror stories about that. Admittedly, these guys operate in TBs, but the general feeling on the sub towards CrashPlan is to avoid it at all costs (no pun intended).


Yes, it has saved my behind several times.

My requirements are pretty simple: has to work well on Windows + Mac, has to be cloud based, have decent bandwidth, unlimited storage, unlimited retention, and has to require no configuration of my own apart from selecting files and schedules.

I just haven't found anything with the level of polish that CrashPlan has in terms of UI and usability. To avoid having all eggs in one basket I try to keep at least a mirror of everything too, e.g. on a NAS.

I have been a DataHoarder kind of guy in the past (dabbled with my own backup scripts, etc.) but found that the chance of user error on my part is probably an order of magnitude greater than the risk of losing data any other way - and with user error there is always the risk that you also bork your backups. So I don't want any system that has any kind of complex configuration on my part.

Further, I don't want to separate the storage from the backup, simply because buying unlimited cloud storage is too expensive. I don't need fast random-access storage, I need something cheap like Glacier, but that quickly becomes complex. So I'm very happy to hide that complexity behind a cloud backup provider that gives unlimited retention and storage.


I just restored ~3TB from it a couple months ago. It worked very smoothly, and faster than the upload process: each TB took about a day to download vs 1.5-2 weeks to upload.


I also use CrashPlan. For the price it can't be beat. Plus the upload speeds are very fast compared to something like iDrive.

However, be warned: their client is a Java application, and the more files and terabytes you want to back up, the more RAM it takes.

For example, I back up about 3 terabytes on a server and it takes 3+ GB of constant RAM usage to do this.

https://support.code42.com/CrashPlan/4/Troubleshooting/Adjus...


Yes, I wish they made a leaner client. I have a budget NAS with 128MB RAM, which I have to back up from a client machine because 128MB isn't enough.

I thought they had fixed this by now.


Please double-check that CP really does back up all your files; it doesn't for me (Ubuntu 14.04; I haven't checked after upgrading to 16.04 since I've cancelled my account anyway).

I suspect it's related to the huge number of files in average JS projects, even though I'm now excluding node_modules/ from the backup. I've also spent quite some time with CP support debugging the issue, but without any result.

Nonetheless, I no longer trust CP to back up all my files. I've actually needed access to files in the past and wasn't able to find them.


I must admit I have only done partial restores, but it looks right size-wise. I also use CP on Windows exclusively, where many small files and deep hierarchies are basically frowned upon anyway...

If I had that problem I'd include a huge tar of everything in the file set. Or go to Backblaze.


For low-volume data (mostly configuration files) of my personal machines, I have been using tarsnap for a couple of months now. It has saved me once already when the µSD card holding the operating system on my home server gave up the ghost, so I consider it a good investment (it helps that tarsnap is pretty cheap thanks to deduplication).
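
If it helps anyone evaluating tarsnap: the interface is deliberately tar-like, so a nightly cron job can be as simple as the sketch below (paths and archive names are examples, and it assumes your keyfile/cachedir are already set in tarsnap.conf):

    # create a uniquely named archive each night
    tarsnap -c -f "config-$(date +%Y%m%d)" /etc /home/me/.config
    # list what's stored; restore a file when needed
    tarsnap --list-archives
    tarsnap -x -f config-20161120 etc/fstab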

For higher volume data (source code, mostly) I use a droplet at Digital Ocean and keep my data spread across several machines at home.

On my desktop, a Mac Mini, I use Time Machine with an external ~1.7 TB USB drive that I hook up every couple of days (roughly twice a week).

It is not the most sophisticated arrangement, but it works well enough as insurance in case a machine suddenly dies. It works less well against accidentally removing files, but I have cut that particular finger often enough to finally learn my lesson. ;-) [The downside of this lesson is that I've become kind of a hoarder when it comes to files. But these days disk space is cheap, so it is not that much of a problem.]


Recently built myself a home box from eBay parts: two four-core Xeons, 72GB of DDR3 memory, and six onboard SATA ports. Two 120GB SSDs running Proxmox on that for virtualisation and a 2TB drive for misc storage. Bought an LSI 9211 HBA with another eight SATA ports for fifty euros, which gets PCI-passthrough'ed to a FreeNAS VM. Four cheap 2TB drives for the initial pool.

Another VM provides an instance of NextCloud locally and that's pretty much it. Works, feels secure enough and didn't cost more than 500€ total. A four disk Synology Home NAS with drives would cost more and offer significantly less.

As for online backup, being on satellite internet makes uploading pretty much anything too painful to even consider.


Yeah, I went the NAS route and now I think I should have built my own server instead, like you did. I ended up adding an Intel NUC (running Debian Linux) which does all the work, and the NAS is used for storage only. The NAS runs an extremely stripped-down version of Linux which was not flexible enough for my needs.


Arq (https://www.arqbackup.com) and AWS. As close to "set it and forget it" as you can get.

Process:

- Local TimeMachine backups

- Arq backup to AWS Glacier


Note that Arq with Glacier only is usually a mistake (as discovered by the poster below), since restores need to be requested and queued up, and take hours to come back with the results. Restores are also very expensive.

As an Arq user who used Glacier previously, I much rather recommend using Google Cloud Storage:

* Nearline restore is instantaneous;

* Storage per month is >30% cheaper [1];

* Recovering data from GCS is super cheap; about 3% (!) of equivalent Glacier costs.

For the super paranoid, GCS also has more expensive multi-region buckets.

[1] https://cloud.google.com/pricing/tco/storage-nearline


Arq also supports Amazon Cloud Drive as a target, and if you're in the US, it's a flat $60 per year for unlimited storage.


There are some Terms of Service [1] items that might come back and bite you (or not — who knows, they're a bit vague):

    The Service is offered in the United States. We may 
    restrict access from other locations. There may be
    limits on the types of content you can store and 
    share using the Service, such as file types we 
    don't support, and on the number or type of devices
    you can use to access the Service.
The fact that they "may restrict access from other locations" could be a problem when you travel.

[1] http://www.zdnet.com/article/is-amazons-online-storage-reall...


Anyone doing this? Sounds like something that will be banned once it catches on, but also sounds good.


I used to use Arq, but I wanted to pull a single 1KB file off it from a computer sitting next to me and the ETA I got was around 90 minutes. I deleted Arq and its backups after that.


If you used Arq with AWS's Glacier, that is not Arq's fault, but AWS's; Glacier is designed to be (extremely-)high-latency.

If you want low-latency restores, see my other comment: https://news.ycombinator.com/item?id=13000979. I've restored small, individual files with Arq + GCS several times, and each time takes just a few seconds.

All the other storage targets that Arq supports (Dropbox, S3, SFTP, Google Drive, etc.) should also be very fast to restore from.


Should, yes. My restore from SFTP over a LAN was going to be about 90 minutes.


I've never used SFTP with Arq. Did you investigate it any further? I'm sure the author would love a bug report.


I heard a lot of horror stories about people trying to get data back from Glacier, so I decided that I'll simply use S3. Aren't you afraid of Glacier?


I consider Glacier as the final offsite backup in the event my apartment burns down or something terrible happens to all of my local backups. In that case I don't care about how long it might take to restore - I will throttle the recovery as needed to not blow the AWS budget.


Arq + Dropbox

I used to back up to Glacier but I was already paying for Dropbox so I moved everything there.


Can you elaborate on the speed of backup restores with Dropbox? I'm also paying for Dropbox and planning to start using it as a backup destination with arq.


Bup. https://github.com/bup/bup

Git + bloom filters + Python

I back up hundreds of thousands of user accounts at cloud.sagemath.com using it, plus my own data, etc. I've been using it for four years and haven't found anything better for my requirements.


When I examined the available backup schemes that were built on a content-addressable storage scheme, all of them had one shortcoming or another except for Borg. If I remember correctly, Bup's shortcoming is that it is git based and hence architected to be immutable, meaning that pruning old backups is hairy.


The new bup version can do pruning, FWIW (haven't tried to see how hairy it actually is).

bup does have a huge advantage: deduplicating images of multiple machines that can back up simultaneously. With Borg, there is a lock held on the repository (only one machine at a time), and if multiple machines do use the same repository, they need to download the indices (slow compared to bup's bloom filter) or use sshfs etc. (also slow).

borg also has internal encryption. If borg adopted bup's bloom filters and concurrent access, it would only have advantages...


I just spent some time this weekend rebuilding my backup routines. In short, I use Borg [1] with multiple targets. For a local target at home, I use a RasPi with RAID1 [2].

For company sync/backup, we use AeroFS [3].

[1] https://borgbackup.readthedocs.io/en/stable/

[2] https://news.ycombinator.com/item?id=13000024

[3] https://aerofs.com/


Tarsnap is my preferred backup provider for personal and work. Do be careful with the key. I also have a OneDrive.

Work stuff also goes on an LTO tape. I guess we are just traditionalists at heart.


I use multiple Backblaze accounts (https://www.backblaze.com/).

My requirements, and why I picked Backblaze:

1) Work and personal data must be kept separate. (I use multiple Backblaze accounts.)

2) Backups must be continuous and happen behind the scenes; I must never have to manually trigger a backup.

3) Must be able to restore not just a file, but a file on a specific date (Backblaze keeps all changes for up to 30 days... wish it were longer, but this hasn't caused me any issues yet).

4) Must be secure, encrypted, and support 2FA.

5) Must allow me to access my files from a different computer, or phone.

6) Must be simple enough that my parents can use it with minimal training (I've had my 65-year-old mother on Backblaze for about a year now... no reason not to keep family safe too).

7) Must be affordable... Backblaze is about $50/year per account for unlimited data. It's half the price of Dropbox, and you don't have to mess with any silly symlinks that break when Apple pushes new software updates...

Bottom line, the people at Backblaze set out to build a product that does backups right, and for me they are succeeding. I love their product.


> 4) Must be secure, encrypted, and support 2FA.

Doesn't Backblaze require that you enter your password/keyphrase on their website to restore from an encrypted backup?



Hmm, good point. Yes, I have to enter the key to use the iOS app. I don't know how that works but guessing it would be better to encrypt then upload... but if they did that I'd have to download then decrypt. Sucks for phones.


Encrypted rclone [1] to local external HDD and Amazon Cloud.

[1] http://rclone.org
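
For the curious: after an interactive `rclone config` session that defines a cloud remote (say "amazon") and a "crypt" remote wrapping it (say "secret"), the actual runs are one-liners. The remote names here are assumptions for illustration:

    # encrypted copy to the cloud
    rclone sync /home/me/documents secret:documents
    # plain copy to the local external HDD
    rclone sync /home/me/documents /mnt/external/documents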


Thanks for making me aware of the existence of rclone. I know what my next weekend project is. Getting a proper backup system set up. :)


I have a 4 TB external USB hard drive split in two equal partitions. One is dedicated to Time Machine backups and the other has full-disk disk-image backups made by SuperDuper!. The partition with disk images on it also has installer files for programs that I can't redownload on a whim.

Since I can only hold a few disk images on a 2 TB drive, I copy these disk images to a ProLiant MicroServer with a raidz3 (I've since heard this is a bad idea compared to mirror sets) to give me access to older backups in case my external drive dies.

I also use Backblaze; it backs up everything on the iMac and the external drive.

The laptop pretty much doesn't get backed up, but I don't have much state on there that can't be recreated with `brew install` and `git clone`.

Dropbox syncs some things, but I use SyncThing for syncing large files between my iMac, MacBook, and Windows machine. I also use a second SyncThing shared folder to keep my 1Password vault in sync between my iMac and laptop.


> I also use Backblaze; it backs up everything on the iMac and the external drive.

Be careful with that external drive. Backblaze deletes external drive backups if you don't sync them every 30 days [1]. I've lodged frustrated complaints with them over data loss but that is their current policy.

[1]: https://help.backblaze.com/hc/en-us/articles/217664898-What-...


Tarsnap. I like knowing that cperciva is ensuring the security of my backups. ;-)


I'm in the process of setting up a system. Anyone using Amazon Drive unlimited storage? https://www.amazon.com/clouddrive/home

I'd like to keep my laptop files backed up automatically.


I'm currently using Amazon's Cloud Drive for backups with Arq, see: http://arqbackup.com/ - It's a pretty sweet deal if it lasts; no unlimited service is truly unlimited, though. For example, Microsoft's OneDrive was also unlimited at first.

However, if you end up doing this, it's a pretty bad idea to use your account for anything else but backups, especially with Amazon's Cloud Drive. I also can't speak to how durable the stored data is, so it's a good idea to keep important data backed up in two places.


Maybe this should be a separate ask HN: what restore solution worked best for you?


Backblaze. They ship you multi-TB external hard drives with all your data for a fee, and refund the fee if you return the drives.


It would be great if they kept that level of convenience for external drives. Backblaze deleting external drive backups every 30 days [1] is maddening.

[1]: https://help.backblaze.com/hc/en-us/articles/217664898-What-...


Wow, I didn't know about that... that's disappointing.


I'm pretty confident they make it intentionally hidden so you don't know until it affects you. For example, titling the FAQ "What happens to my backups when I'm away or on vacation?" vs "What is your external drive retention policy?"


I don't think it's hidden. Backblaze will warn you if it hasn't seen a drive in a few days, and right in that email is the heads-up that the drive will be deleted. I thought it was also in the external drive management pane of the app. However, if you just let Backblaze do its thing and don't have warnings enabled, you might get burned.


Not the warnings themselves — the fact that data on external drives is not retained isn't mentioned in the onboarding or FAQ.


I use borg for laptops and servers. I use Syncthing on (Android) phones and tablets to sync photos/videos to a home server, where they are then backed up by borg.

In the past, I have used rdiff-backup and tarsnap.


I live nomadically and am often without a stable, fast internet connection, so my backup solution is tailored to what makes the most sense for me. (Of course, always open to improvements!)

I have a 1TB Google Drive plan; this stores all of my photos - whether coming in from Google Photos (Uncompressed) on my phone, GoPro, etc. as well as pretty much any sort of documents - everything goes straight to Drive. Anything I deem sensitive I throw in an encrypted volume which is then synced.

As I run macOS, I use Time Machine for backups: I travel with a 500GB portable SSD and update the Time Machine backup on there regularly. This doesn't include Google Drive or any other data.

If I'm traveling somewhere, I keep my drive and laptop separate in case one is lost.

If my computer ever fails, is stolen, etc., then I should be able to purchase a new one, restore with Time Machine, and then camp out in an area with fast internet to restore anything I'd like from Google Drive, and I should be able to start working again immediately.

I know I have a single point of failure with only one Time Machine backup; I'd like to improve that. However, it's also not catastrophic if it's lost, just a pain in the butt, as it'd take me time to get my environment back to the way I like it.

Recently, with the stories of Google locking users out, I'd like to set up a routine of storing 'Google Takeout' archives on some drives and throwing those in storage, as I know I have a lot of information in there.


All of my machines get backed up to my Synology NAS. The NAS then syncs all the backups out to Amazon Drive. It's not "real" offsite backup, but it's good enough for what I'm doing. In addition, the NAS also syncs music, video, and images out to Amazon Drive separately so that we can view our media from anywhere without depending on our home internet connection.


All machines in my household get backed up to a home-built NAS. It runs low-power, inexpensive hardware that's plenty fast, along with Linux Mint + Samba to manage the file shares. I preferred this since I've used Linux most of my life; I prefer avoiding learning new systems when possible, which is why I'm not running something like FreeNAS, Synology, etc.


Local backup using Time Machine to a USB HDD plugged into an AirPort Extreme.

Offsite backup to Backblaze.

Photos are also backed up on iCloud (paid) & Google Photos (free).


A little barebones PC (Intel NUC) used as a sync server, connected to a 9TB NAS (Thecus N4310 with 4x 3TB HDDs in RAID 5). I do not like the apps included with the NAS; that's why I use the NUC (with Linux) as the "brain" of the backup process, which is done by calling rsync through various cron jobs.
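
The cron jobs are nothing fancy; roughly this kind of entry, with the paths, schedule, and mount points as examples only:

    # /etc/cron.d/backup on the NUC: pull from the PC share, push to the NAS
    30 2 * * * root rsync -a --delete /mnt/pc-share/ /mnt/nas/backup/pc/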

The NUC and the NAS are on their own subnet (with a router dedicated to them) and not accessible from the external network, however the NUC can access a shared folder on my PC which is where it takes the data from (and then it saves it to the NAS).

I do have online, cloud-based backup for less sensitive data, so I can access that while on the move. I use the cloud Drive offer from Mailbox.org (I also use Mailbox.org for mail/calendar/notes/address book).

Lately I'm thinking of replacing the online part with my own cloud server though. Probably I could run something like mail-in-a-box and NextCloud/OwnCloud on my own online VMs. But I'm too lazy and, at least for now, I trust Mailbox.org.


Praying and LVM RAID1.


That's how real men do it.


GitHub for code and Google Drive for documents & photos. Google Photos turned out to be the best solution for photo syncing, sharing, simple editing (rotation with a keybind) and unlimited free storage of pretty-good-quality photos. Google Docs turned out to be the most practical editor (cloud, sharing, all the basic editing commands). I am on Unix systems most of the time and I always find something missing in Pages or LibreOffice. If Word were available on all systems I would probably use it together with Dropbox, not because I prefer Word, but because most of the files I get are in Word's format, and if I have to change or help with something, it's best to do it in Word.

I don't really have anything else to backup. What do you guys have on your systems to require complete system backup?


I have all kinds of dotfiles I'd hate to recreate from scratch. Mostly application configuration stored in ~/Library or /usr/local.


GitHub works well for dotfiles.


Personal (family) stuff - I have a 3TB NAS in a RAID1 configuration at home that my wife and I save our photos and videos to. We also back up our personal documents and such there. We are not super-prolific in terms of media creation, so currently I think we only have about 200GB worth of files. About 3 times a year, I make a copy of the drive and store it in a safety deposit box at our bank (e.g. in case the house burns down). I just don't like the idea of our entire set of family photos and videos in the cloud.

Work - I'm a consultant/freelancer, so for some things I rely on the client's backup systems and for things that are more my own responsibility, I rely mostly on cloud version control (github, bitbucket), as my work artifacts are mostly code.


Anything even remotely important gets printed out on paper. Even source code. I was burned once when one of my computers died and my NAS died while trying to recall the backup.

On a semi-related note, I find it easier to read printed code and mark changes than to stare at the screen.


So do you print the source again after every change? And what happens to the then obsolete paper with the previous version? (or do they get version numbers?)


Backblaze. Set it up and forget about it. I've done a handful of recoveries and it's never failed.

I also have dropbox and sync most of my code to bitbucket/github, but I don't really consider that "backup" even if I can sometimes use it as such.


I use rsync, mostly to back up from LVM snapshots over SSH to either rsync --link-dest snapshots or ZFS snapshots.

https://gitlab.paivola.fi/tech/pvl-backup
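
The --link-dest trick is the core of this approach and is worth spelling out (a generic sketch of the idea, not the linked script): each run creates a new dated directory, and unchanged files are hard-linked against the previous snapshot, so they cost no extra space.

    today=$(date +%Y-%m-%d)
    rsync -a --delete --link-dest=/backups/latest /home/ "/backups/$today/"
    # repoint "latest" at the new snapshot for the next run
    ln -sfn "/backups/$today" /backups/latest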


Do you also keep your zfs snapshots around?

(they're a lot more efficient than lvm snapshots. One machine I use took snapshots every half hour, and we accidentally filled it up with >3000 snapshots before anyone noticed that the reaper script for old snapshots wasn't running :-P . No performance degradation!)


Currently setting up rclone+duplicity -> rsync.net

rsync.net partly because I want a solution whereby, if an attacker ever gains access to our credentials, they could not wipe all of our backups, a la http://www.infoworld.com/article/2608076/data-center/murder-... . Most other backup setups do not survive that type of threat.

rclone because it allows consolidation of various cloud data (sending that to rsync.net in case it ever gets damaged/wiped).

Work and personal are separate, of course.


For home: a Synology NAS backing up to Glacier weekly. It doesn't get a lot of use, so it's very cost-effective.

For work: projects are on gitlab.com, and for servers (all Linux-based) I use Backup Ninja. Backup Ninja is really just a front-end for several other packages, making it easy to set up backups via SSH on a remote server.

Databases are backed up locally, then everything is backed up to a remote server using rdiff. I can have 30 or 60 days of backups for a relatively small cost. The backups go to a 2TB server hosted by another cloud provider, which is more space than I need.


Arq [1] with Google Cloud Storage here, scheduled nightly. See my other comments about pricing: https://news.ycombinator.com/item?id=13000979.

Arq supports multiple sources and targets. When a source is an external hard disk that is only occasionally plugged in, Arq does the sensible thing and backs up the disk when it's plugged in, but otherwise leaves it alone.

[1] https://www.arqbackup.com


I used to use Backblaze, but their client would take up too much space - I've commented on here about it before being very buggy. So I switched to Amazon Cloud and have been syncing files that way.


I have three backup systems.

My main backup is a local hard drive formatted with btrfs. I have a script in cron.daily that runs rsync and then makes a btrfs snapshot.
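
The whole script is only a few lines; something along these lines (a sketch with placeholder paths, assuming /mnt/backup/current is itself a btrfs subvolume):

    #!/bin/sh
    # e.g. /etc/cron.daily/backup: mirror, then freeze a read-only snapshot
    rsync -a --delete /home/ /mnt/backup/current/
    btrfs subvolume snapshot -r /mnt/backup/current \
        "/mnt/backup/snapshots/$(date +%Y-%m-%d)"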

In addition to that, I run CrashPlan to back up my computer to the cloud. In some sense this is actually my main backup, because I trust them far more than I trust my own backup system, but it is not as convenient to restore from as a local drive (though still quite convenient).

I also keep two small hard drives in a safety deposit box. These drives are exact mirrors of each other and are tested/updated once a year.


Work: A 1TB external enclosure. That's more for "oh shit my work laptop's SSD failed" backup. The real work is done on Bitbucket and a Google Apps account. I do not manage the servers.

Home: My home server has a 4TB main backup drive and a 2TB redundant drive (easily re-downloaded "safe" stuff, like Steam, Origin, and GOG backups). The 4TB drive is synced to 3 other 4TB external drives that I swap between home, work, and my parents' house (100+ miles away).

All encrypted of course, whether it's Bitlocker or LUKS.


I can't promise this won't eat your lunch and pee on the floor, but I've added a Ceph backend to Preserve: https://github.com/cholcombe973/preserve. I'll be adding integration tests with checksums to verify integrity this week. If anyone wants to build another backend I'd be happy to merge it.


GitHub for my contracts.

My music collection sits on a server hooked up to my stereo. I had the disks in RAID-Z1, but when one of the disks pulled a Seagate I accidentally made the disks striped vdevs. To back this up, I boot up a rackmount server with larger disks in RAID-Z1 and use rsync.

This is the only data that I really care about preserving. I found this out through changing my operating system a lot and having my laptop stolen.


I gleefully deleted my music collection after getting back to the USA and being able to use Spotify. Having access to an astounding quantity of music without having to maintain a library is so, so freeing.


There's a lot of music that isn't available on Spotify. You also lose access to that "astounding quantity of music" as soon as you stop paying your monthly fee.


Don't use scare quotes; it's rude.


I'm strongly considering this. I've got about 30-40GB of music stored locally (relatively small, I know), and I'm not sure why anymore. I tend to prefer mainstream music, so it's easy to find this stuff, and I've not listened to anything local in many years.

My biggest concern with deleting them would be forgetting about music that I previously liked. Seems like running `find` on my music directory would solve that.

Then again, with 30-40GB, I could just stick it all on a couple SD cards (for redundancy) and ignore them in a drawer.


Keep in mind it doesn't need to be an overnight solution! Sign up for a streaming service you like, and subscribe to the music you want and delete your local copy as you go.

Unsure about Spotify - but Google Music allows you to upload your own collection as well.


Cloudberry Backup (closed source, proprietary, $30) with Backblaze B2 ($0.005/GB/month). Once it is set up it runs in the background and just does its thing. I just migrated from Cloudberry/Glacier to Cloudberry/B2 and don't see any reason to use Glacier any more. Data is locally encrypted.


Just curious to hear a little more about your choice of B2 over Glacier or S3. I haven't seen much written about it yet. Do retention policies work with B2?


1. I have a Synology which acts as the central store for everything.

2. Everything on the Synology (except for videos) gets backed up to CrashPlan.

3. Once every few months I back up the Synology to my 5TB external drive.

I hate to lose any file, even if it's not important :) I still have files from the floppy disk era.


I backup my main system to Backblaze, and then back it up to an external drive every month or so. That external drive gets plugged into my laptop and the files get copied there. Yeah, I have Dropbox, but I also have a music collection that's a bit too big.


For my personal systems, I use rsnapshot to make incremental copies on a nightly basis to another internal SSD used only for backup. Because I only use SSDs, the whole thing runs in under a minute. Then monthly this gets copied to an external USB HDD.


I have a Synology box with a pair of drives in RAID 1. That's not bad, but if something happens to it or my place catches fire, well, I'll lose it all.

So I need a remote backup solution for it. I haven't found anything cheap and satisfactory yet...


Daily Duplicity snapshots that sync to S3, plus hourly snapshots via Back In Time to an internal drive, plus an occasional manual rsync to a portable USB drive off-site.
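
For reference, the duplicity half of that is a couple of commands in a nightly script; the bucket name and retention window are placeholders:

    # GPG-encrypted incremental backup to S3
    # (assumes AWS credentials are exported in the environment)
    export PASSPHRASE=...   # placeholder: passphrase duplicity encrypts with
    duplicity /home/me s3://s3.amazonaws.com/my-backup-bucket/home
    # drop backup chains older than six months
    duplicity remove-older-than 6M --force \
        s3://s3.amazonaws.com/my-backup-bucket/home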

Probably overkill, but if one backup method ever has a flaw I'm still fine.


> Probably overkill,

Yeah... I was about to say. LOL. Have you ever been in a situation where you had to restore?


Yes, I regularly test the backups by restoring a random directory to my desktop and then opening a few random files to test. For things like photos which don't change (as opposed to a document which may have been edited) I'll do a quick spot-check of file hashes as well.

I'm picky about backups and have never had an issue where files were permanently lost. I'm to the point now where if I'm not sure I want a file I'll go ahead and delete it, and then if I need it later I restore from the Back In Time snapshot.

Storage is dirt cheap, so spend the time up front getting your backup 'idiot proof' and automatic and it's totally worth it in the long term.


Storage is indeed cheap. I've just had a hard time figuring out how to set everything up and validate it. Thank you for the detailed explanation.


Locally: QNAP NAS which accepts Time Machine backups from Apple computers. The NAS also runs a Linux container with SpiderOak on it for any NAS-only data and critical time machine backups. Endpoints also have SpiderOak.


A combination of a NAS, Crashplan (I may be switching to BackBlaze) and occasionally burning critical data to BluRay discs for cold storage.

On my work desktop I also do incremental Acronis images and Windows File History on local drives.


I've used both Crashplan and Backblaze personally but found both pieces of software to be painfully resource intensive, particularly in CPU usage on a rMBP with good specs. (I should not be able to know your software is running by the slowness it causes me in the middle of development when I have it set on the "optimal" settings.)

IMO neither is really better than the other. I feel like a good complete remote personal backup solution for someone with multiple drives [1] doesn't exist today.

[1]: Read: someone with an SSD MacBook and years of data accumulated from other machines that doesn't fit on the built-in SSD.


Why do you want to switch from Crashplan to Backblaze?


I have become less and less of a fan of the software over time.


Personal: cloud server (it's quite expensive, like 40€/month but it's the best I could find) with rsync without --delete

Pro: BackupPC (used at a "large" scale without issues) or snapshots on big file servers.


I use aws-cli tools. I create buckets on S3 and then use a simple script:

    aws s3 sync <source> s3://<bucket-name> --delete
If needed, put this in a cron job and you are good to go.
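
A concrete crontab entry might look like this (schedule and paths are examples; note that --delete propagates local deletions, so consider enabling S3 bucket versioning if you want protection against accidental deletes):

    # crontab -e: nightly at 02:00; may need the full path to aws under cron
    0 2 * * * aws s3 sync /home/me/documents s3://my-backup-bucket --delete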


GitHub and Google Drive are enough for me, really. I don't back up what isn't important, and I subscribe to services for consuming music/movies/TV, so I only have a small amount to store.


Dropbox + SpiderOak.

Dropbox for non-sensitive stuff and SpiderOak for complete PC backup.


Time Machine + Backblaze.


Everything personal (local drives, Google Drive, Dropbox) goes to my Synology NAS; the NAS is backed up weekly to S3. The best thing about Synology is that it solves this out of the box.


Oh, and I forgot: server stuff is backed up to another server with rdiff-backup. Some years ago this was the quickest solution, and I've never had any issues with it.


Dropbox for most stuff. External HDD for manual image backups.


1. Daily: rsync snapshots to disk attached to Raspberry Pi (similar to rsnapshot)

2. Periodically: duplicity to Oracle archive storage (7x cheaper than AWS Glacier, at $0.001/GB/month)


Encrypted rclone [1] with multiple cascaded self-hosted peers.

[1] http://rclone.org/


A separate file server (QNAP) that backs up to CrashPlan and to an external USB drive. Not ideal, but it works.

I don't bring work back to home.


Crashplan with user-specified encryption key.


I use rsnapshot each night to an encrypted external drive. The drive gets swapped each week with one kept off-site.
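
For anyone new to rsnapshot, the config is short. An illustrative excerpt (values are examples; fields must be TAB-separated, and older versions spell "retain" as "interval"):

    # /etc/rsnapshot.conf
    snapshot_root   /mnt/encrypted-backup/
    retain  daily   7
    retain  weekly  4
    backup  /home/  localhost/

Cron then runs `rsnapshot daily` each night and `rsnapshot weekly` once a week; rotation and hard-linking of unchanged files are handled for you.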


I definitely keep work and personal data separate. There are a number of legal and ethical reasons for this.


rsync to btrfs volume, then take a snapshot. 2 LOC.

I do this to a pen drive on site and to a remote volume.


I'd probably be what you'd call a digital nomad. I keep a Time Machine backup on a ruggedized external drive, and I use Backblaze. I've been very happy with Backblaze, if that's what you're asking about. Additionally, Google Drive and another external USB flash drive for a couple sensitive documents.

I don't work.


Well, you probably don't have that much to back up then :)


Heh, no. ;)


Time Machine, Backblaze, and a bit of Dropbox (particularly for syncing photos from my phone).


Personal backups are done with Dropbox, which stores several EncFS collections.


For the most part I keep work and home separate, although there is some overlap as I exclusively work from home. I have several backup processes:

First, I take automatic daily differential backup images of the primary hard drive on my main desktop. (This is a Windows machine, so I use Macrium Reflect.) Images are great because if your drive dies or gets corrupted, you drop in a new one, copy the image over, and are back up and running as quickly as possible.

For the same reason I take images of the volumes on my home Linux server. For this I take an LVM snapshot, mount it, create a tarball, then unmount and remove the snapshot. These are only done monthly because the process isn't incremental or differential, so takes longer, and the contents don't change as often anyway. It could be completely automated, but I just have a script that I run manually, because I like to do any updates that have the potential to break things while the snapshots are open for ease of rollback. (I have a monthly calendar reminder to do this along with quickly checking that my various other backups are running properly, and backing up my phone using Helium.)
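
In shell terms, that monthly snapshot-and-tar step is roughly the following (VG/LV names, mount point, and snapshot size are placeholders, not my exact script):

    # freeze a point-in-time view of the volume
    lvcreate --snapshot --size 5G --name rootsnap /dev/vg0/root
    mount -o ro /dev/vg0/rootsnap /mnt/snap
    tar -czf "/backups/root-$(date +%Y-%m).tar.gz" -C /mnt/snap .
    # release the snapshot so it stops accumulating copy-on-write churn
    umount /mnt/snap
    lvremove -f /dev/vg0/rootsnap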

I continuously mirror the user folder from my desktop (and my wife's computer) to a Samba share on the home server using FreeFileSync. (There are many different ways to sync files, but since the desktops are on windows, this was easiest.) Then I run Crashplan on the home server to do incremental cloud backups of all these files, both to deal with the worst case scenario of the house burning down, as well as to catch any files changed intra-day, not caught by the daily disc images.

Misc: Regarding the phone backup, my preference is to do a full disc image, for the same reason as on the computers - break your phone and you can replace it and be back where you started in no time. Unfortunately that requires rooting, and in Samsung Hell rooting permanently devalues your phone due to Knox insanity. So for now I back up those apps that allow it with Helium, a couple of high priority ones using their built-in backups, and grumble about how I should just root. I also usually only bother to do this every few months, since anything really important on the phone is in the cloud anyway.

Except photos. I run FolderSync on the phone to continually sync any new photos to my OwnCloud on the home server via WebDAV (from which they're backed up to Crashplan), as well as back to Google via their photos backup tool, but in "standard" res - not as a backup, but just so I have my full photo library accessible on my phone.

I also occasionally fire up Thunderbird to download email from Gmail. I use the web client exclusively, but like to keep a local copy of all my email on the off chance I'm ever somehow locked out. It's highly unlikely, but the peace of mind costs all of 5 seconds every couple months.

I think that's it. Sounds like a lot, but I set it up incrementally over time, and for the most part it all just works. If I were setting things up fresh, I'd probably save some time by paying for more Google storage or something rather than using OwnCloud, and by using the CrashPlan family plan rather than syncing everything to the home server then backing up from there. Regardless, I do believe it's good practice to double-check whatever you use for your backups, at least every couple months.

(This is all leaving aside the Tempest servers, which use various RAID, master/slave SQL, LVM snapshot backups of the same sort I do locally, and rsync to off-site storage.)


Arq.

S3 and google drive.


+1 for Arq. I use the 1TB of storage I get with my Office 365 subscription to back up my documents, keys, photos, and other volatile files. Everything else gets backed up to AWS Glacier.


Backblaze. Works great, $5/month for unlimited space, covers everything.


crashplan.com

Works on Windows, Mac, and Linux; $150/yr gets you an unlimited number of machines and unlimited storage, and it keeps unlimited versions for recovery.

Also important: data sent to the cloud is encrypted.


I use rsnapshot, with SSH to back up remote hosts/servers.


CrashPlan. It's awesome.


We have Crashplan at work. Haven't needed to use it yet, but it gets out of the way and it seems like a good solution.


Bitbucket for work backup and SpiderOak for personal files.


Two Time Capsules in separate buildings.


Retired, so no need for work/personal separation. I never did it before anyway, but I made sure my company knew that any work I did at home was going onto my general tape backups and was not going to be deleted any time soon after I left the company.

Nowadays: the most important stuff goes to rsync.net. More than a bit expensive at 20 cents/GB/month, but their many other virtues, especially simplicity, make up for it, including support for private git repos. Currently $9.60/month; that's a grandfathered price, as the minimum you can start with now is larger.

Saved my email repository and a fair amount else when the 2011 Joplin tornado trashed the machine it was on, as well as my BackupPC discs, which were in another room that suffered the only wall breach (my main system came through fine, and that near miss of losing everything else prompted me to start using tape again; LTO-4 was capacious and cheap enough by then, and I'd long outgrown DAT).

Most of the most important stuff also goes via BackupPC to a 1TB USB 3.0 drive (it would all go to a bigger drive, and will when the next item changes state, but I'm economizing now).

Since my two 2TB bulk drives are both over 5 years old now, they're getting nightly rsyncs to a new 4TB drive on my other system, which will physically replace both of them when one fails.

Backstopping all of that: an LTO-4 tape drive on my other system, tar incrementals every night, those tapes cycled every week from a pool of 5, then full backups every month, which are taken off-site sometime during the month.
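
For concreteness, GNU tar's incremental mode against a tape device looks roughly like this; the device and snapshot-file paths are generic examples, not my exact invocation:

    tar --create --listed-incremental=/var/lib/backup/home.snar \
        --file=/dev/nst0 /home

tar records file state in the .snar file, so the next run with the same snapshot file writes only what changed since.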

Now that I've got a serious Internet connection, if I was starting over, and didn't have the sunk costs of the LTO-4 tape drive system (drive, SAS controller, fast disk to feed it) and plenty of tapes, I'd probably do this level of backup to the cloud at a fairly raw level to S3, Glacier, GCS, or Backblaze.


> Now that I've got a serious Internet connection, if I was starting over, and didn't have the sunk costs of the LTO-4 tape drive system (drive, SAS controller, fast disk to feed it) and plenty of tapes, I'd probably do this level of backup to the cloud at a fairly raw level to S3, Glacier, GCS, or Backblaze.

Personal warning to anyone else attempting to create a multi-drive setup with Backblaze. Backblaze deletes external drive backups if you don't sync them every 30 days [1]. I've lodged frustrated complaints with them over data loss but that is their current policy.

[1]: https://help.backblaze.com/hc/en-us/articles/217664898-What-...


Errr, I mean their new low-cost raw storage, B2 Cloud Storage (now $0.001/GB/month more than Glacier: https://news.ycombinator.com/item?id=13010949), not their total solutions for those not wanting to go to that sort of trouble/complexity/whatever.

(As someone who's been using tape drive backups since 1978, DECtape to start with, fortunately before I learned the -rf flags for rm ^_^, I'm not a conventional user.)


Gave up on cloud backups. Takes too long on OneDrive; Dropbox is too expensive.

For personal machines I now use EaseUS Todo on Windows. I keep a backup HDD at work and one at home.

For work I let them take care of it. I connect to a VM and it's all backed up.



My backup solution is #YOLO



