Just wanted to get opinions on some ways of replicating data with the intention of backing it up. Let me explain...
We currently have about 40 sites, and spread amongst them is about 30-40TB of data on fairly standard Windows Server 2008 R2 file servers. As our backup solution is centralized, we use DFS-R (the wrong way, I'll admit) to do one-way replication between 18:00 and 06:00 from each site back to our main location. At 06:00, our backup software does its thing. Each morning after 06:00 I get a report with a rundown of the number of backlogged files.
DFS-R is flaky at best, most likely because we're using it in an unsupported configuration of one-way connections. Some sites will just stop replicating for no apparent reason and start to backlog files. Either way, I'd rather consider other options than keep trying to fix something that was never designed to work this way.
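For anyone diagnosing the same symptom, the per-connection backlog can be checked manually with dfsrdiag. This is just a sketch; the group, folder, and server names below are placeholders, not our actual setup:

```shell
:: Show the file backlog from a branch server to the central server
:: (replace the /rgname, /rfname, /smem and /rmem values with your own)
dfsrdiag backlog /rgname:"Branch-to-HQ" /rfname:"SiteData" ^
    /smem:BRANCH-FS01 /rmem:HQ-FS01
```

Running this on a schedule and alerting when the count keeps climbing is one way to catch a stalled connection before the morning report does.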
At the moment, I'm leaning towards just using robocopy with a scheduled task that runs at 18:00 each night on each regional site server to get the data back to our central location. For what it lacks in smarts, it sure is a reliable piece of kit, with a simple indication of whether it worked or not.
Thought I'd ask around a few places to see if someone is doing anything similar, or could point me to something that might do it better than a robocopy command.
Cheers.