Microsoft Releases Public Beta of Data Protection 262

Torrey Clark writes "Microsoft has released the public beta of its disk-to-disk backup product, Data Protection Manager. The product is designed to make backups easier than simply backing up to tape. Disk-to-disk backup completes images in significantly less time, meaning much less downtime for systems during backups."
  • by R.D.Olivaw ( 826349 ) on Friday April 15, 2005 @07:45AM (#12242927)
    Don't know about the rest of the world, but we don't have to take systems down to back them up here.
    • Agreed -- though snapshotting *is* necessary to get consistent state; that's provided by AFS (when backing up our fileservers), Oracle (when backing up database contents) and LVM (everywhere else).
    • You don't with Windows either, but you have to make sure there are no handles to critical files when you do. After that you can just use dd or whatever; I use dd because I came from a Unix background and found it the simplest solution.

      If you're not in the know and still reboot, why not just use g4u [feyrer.de]?
    • by ntshma ( 864614 ) on Friday April 15, 2005 @07:59AM (#12243007)
      Microsoft Data Protection Instructions: 1: Click on START and then SHUTDOWN.
    • Sometimes systems will need to be brought down before performing a backup. Otherwise the image might be in a state of flux, causing potential problems during recovery.

      Some backup utilities provide the capability to take a snapshot and back up that snapshot while the system continues to be used.

      One of the features Microsoft is touting in this product is 'moving only the byte-level changes of the file servers' thus eliminating any downtime.
      • It would be a poetic mental picture if we weren't talking about fat, overworked, graveyard-shift sysadmins trying to finish their nightlies and go home...
      • This is why you use a NetApp [netapp.com]. Backups are atomic, guaranteeing filesystem integrity, so recovering from a restore is exactly identical to recovering from a system crash, except you have more guarantees about the state of your data. I've used NetApp snapshots (the fundamental building block upon which NetApp backups are based) to back up Oracle databases that were under heavy read/write load, and restored backups of same. No worries.
    • I think what they are getting at here is the time it takes to restore the data. Rather than a restore from tape, which is very time consuming, it images to a hard drive. That drive can simply be plugged in when the machine is down for the drive replacement. As opposed to taking down the system, replacing the drive, reloading the OS, installing the backup software, restoring the data, verifying its integrity...
    • Don't know about the rest of the world but we don't have to take systems down to backup them here.

      Yeah, it's pretty pathetic. Contrast their approach with our simple "poor man's RAID" backup solution which has worked on Sun systems, *BSD systems, and GNU/Linux systems for over 10 years:

      (install two identical hard drives)

      dd if=/dev/hda of=/dev/hdb bs=1048576


      Run as frequently as you need a backup image. This has worked, as I said, for over a decade, and has allowed quick and easy recovery of every m
  • by teh_mykel ( 756567 ) on Friday April 15, 2005 @07:45AM (#12242929) Homepage
    Now, if Microsoft could actually release a product that didn't require an amazing array of backup software, we'd be talking business.
  • Surprising (Score:5, Funny)

    by ThePlague ( 30616 ) * on Friday April 15, 2005 @07:45AM (#12242932)
    The really surprising thing is that they released the source code, and here it is:

    xcopy *.* "x:\" /d/s/e/c/f/h/k/y
    • by WARM3CH ( 662028 ) on Friday April 15, 2005 @08:25AM (#12243151)
      Disk-to-disk backup? In fact, I (i.e. my computer!) do it every night. A simple copy command? I don't think that cuts it. I'm in a tight development cycle: each day I write a lot of code and documents and receive/generate lots of data files. I need to back up all the important data, but I surely don't need backups of the executable files, temp files, OS system files and such.

      The solution I use is simple: I have two hard disks in my computer, and the files I need backed up are scattered across both drives. I have made a BackUp directory on each of these drives and put a copy of all important files in them. So I have 3 copies of every important file: the original, and two backups. In case a hard disk goes bananas, I always have a copy of all important files on the other one.

      I run the backup every night, copying just the files that have changed or the new directories made during the day. So the problem is: I need two destinations for each source, I need to be able to select which directories or even which files to back up (or not to back up), I need to check which files have been changed or which new files (or directories) have been created, I need to be able to schedule the backups for midnight, and I need to be able to forget about all these details in practice, as I have to focus on my work :)

      How did I do it? Well, I tried a script in the beginning but found it difficult to manage over time, and it was very tedious. Now I use SyncBack [2brightsparks.com], which is a freeware program with all these features that I need (and more, like FTP and compression to Zip, etc.). QED.
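      For what it's worth, the core of that nightly two-destination scheme is only a few lines of shell (a sketch only, demoed in a temp sandbox; the drive layout and directory names are made-up stand-ins, and SyncBack obviously does far more):

      ```shell
      #!/bin/sh
      # Two-destination nightly backup sketch: mirror each important tree
      # into a BackUp directory on the *other* drive, copying only files
      # that are new or have changed. All paths are hypothetical stand-ins.
      set -e

      backup() {
          src="$1"; dst="$2"
          mkdir -p "$dst"
          cp -au "$src/." "$dst/"     # GNU cp: -u copies only newer files
      }

      # Demo sandbox standing in for the two real drives:
      root=$(mktemp -d)
      mkdir -p "$root/drive_c/projects" "$root/drive_d/data"
      echo 'source code' > "$root/drive_c/projects/main.c"
      echo 'data file'   > "$root/drive_d/data/results.txt"

      backup "$root/drive_c/projects" "$root/drive_d/BackUp/projects"
      backup "$root/drive_d/data"     "$root/drive_c/BackUp/data"
      ```

      Scheduling it for midnight is then just a cron entry; per-file include/exclude rules are where a tool like SyncBack earns its keep.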
    • Only they will call it Data Protection Manager 2006 and charge $800 (Base XCopy) for it... Plus the $400 Exchange Agent (/EA) and the $400 SQL Agent (/SQLA) and the $400 Multi-Server (/MS)Agent and the $400 Netware Compatibility Agent (/FU) and the $40,000 Linux Compatibility Agent (/HAHA)

      - usage: Xcopy c:\*.* x:\ /h/e/c/k/HAHA...
    • The really surprising thing is that they released the source code, and here it is:

      xcopy *.* "x:\" /d/s/e/c/f/h/k/y


      Sir, a DMCA takedown notice has been filed with your ISP and Slashdot. Please remove all source codes at once.

      - Friendly protector of your rights
  • by xtracto ( 837672 ) on Friday April 15, 2005 @07:47AM (#12242941) Journal
    So it seems DPM is only a "data-mover", meaning it will need to be combined with another technology. After some research I found this:

    StoreAge Networking Technologies announced that it will be developing enhanced solutions to support Microsoft System Center Data Protection Manager

    The full article is: here [wwpi.com]
  • by Tanami ( 601011 ) on Friday April 15, 2005 @07:54AM (#12242975)
    We've been doing disk-to-disk for a year or so now using rsync's --link-dest feature to create apparently complete mirrors each night, but with only the changed files actually occupying disk space (unchanged files are just hard links). Makes restoration an absolute breeze compared to tape, but I'm not sure if this M/S effort does the same. *runs off to look*
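    For the curious, the --link-dest arrangement looks roughly like this (a sketch demoed in a temp sandbox; real paths and a cron schedule are left out, and note that unchanged files end up as hard links, not copies):

    ```shell
    #!/bin/sh
    # Rotating rsync snapshots: each run produces what looks like a full
    # mirror, but files unchanged since the previous snapshot are hard
    # links into it, so they occupy no extra space.
    set -e
    root=$(mktemp -d)               # sandbox standing in for real paths
    SRC="$root/data"; DEST="$root/snapshots"
    mkdir -p "$SRC" "$DEST"
    echo 'unchanged content' > "$SRC/file.txt"

    snapshot() {                    # one "nightly" run, tagged $1
        link=""
        [ -d "$DEST/latest" ] && link="--link-dest=$DEST/latest"
        rsync -a --delete $link "$SRC/" "$DEST/$1/"
        rm -f "$DEST/latest"
        ln -s "$DEST/$1" "$DEST/latest"
    }

    snapshot night1
    snapshot night2                 # file.txt unchanged -> hard-linked
    ```

    Restoring any given night is then an ordinary copy out of that night's directory.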
    • by gtoomey ( 528943 ) on Friday April 15, 2005 @07:58AM (#12243002)
      I rsync from my web server in USA to Australia using my ADSL connection.

      Rsync usually reports a 1000-fold speedup over a dumb copy.

    • Which is great till you start thinking about stuff like disaster recovery, offsite copies etc.

      Tape's fine if you have decent software managing it. My personal recommendation is Tivoli Storage Manager if you have money to throw at it, and Bacula if you don't. Both, however, are rather more than simple network backup systems and may be overkill on a small network.

      • Which is great till you start thinking about stuff like disaster recovery, offsite copies etc.

        I don't understand? That's precisely why we started doing it.

        We have an offsite server in a managed facility to which we back up each night over SDSL - nightly update via RSync for ~400GB of total data is around an hour on average. This server collects data from two sites; in the event of total system failure at either site, we've got lots of options depending on the disaster - home users could connect direct

        • " nightly update via RSync for ~400GB of total data is around an hour on average"

          Ah. That explains it. 400GB is what I'd describe as a small system. We do an incremental of around 16TB and we're not a particularly large site. Pushing 400GB over a WAN is expensive enough; try it with a bit more data.

          "I could restore right now, in literally 10 seconds, any file on our network shares exactly as it was at the end of any working day between now and the 5th of January. Perhaps more importantly, pretty much any
  • by pcmanjon ( 735165 ) on Friday April 15, 2005 @07:55AM (#12242982)
    What's wrong with:

    dd if=/dev/hdb1 of=/mnt/hdh1/path/to/desired/backup/image/here.iso

    Oh, it's not available for Windows, so you'll have to buy a product instead. But isn't dd much easier than using a program that expires after 270 days?

    http://www.microsoft.com/windowsserversystem/dpm/evaluation/faq.mspx
    Q. When does the DPM beta expire?
    A. The Data Protection Manager software expires 270 days after installation.
  • But.... (Score:3, Funny)

    by Anonymous Coward on Friday April 15, 2005 @07:57AM (#12242993)
    Microsoft has also said that it won't be using its own software, since it prefers to destroy any information that could be used against it in a court of law.
  • Now you can (Score:3, Funny)

    by R.Caley ( 126968 ) on Friday April 15, 2005 @07:57AM (#12242995)
    have your viruses and trojans back and working again in a fraction of the time.

    Never more will you have to endure those painful minutes between rebuilding your system and getting re-infected.

  • by suman28 ( 558822 ) <suman28@@@hotmail...com> on Friday April 15, 2005 @07:57AM (#12242996)
    Imagine that...less down time. Who would have ever thunk it.
  • by (trb001) ( 224998 ) on Friday April 15, 2005 @08:02AM (#12243023) Homepage
    Or is this just RAID-1 backup without the read performance boost?

    --trb
    • Our company just takes a SCSI hard drive out of our RAID1 and BAM, instant backup. Replace the removed drive with another one and mirror. If there is a problem with the filesystem (say, after a bad patch), out with the new and in with the old to go back right where you were. Of course this only works with our OS and not company data, as that's stored on RAID5 and archived to tape.
    • It could be useful for other things. For example, doing tape backups over the network. A full backup of a 50GB drive over the network is going to take two hours (or longer). During that time your server's hard disk is going to be thrashing like crazy and performance will be shot to hell. But if you've got two disks, then you just "break" the array (making one of the disks readonly) and backup from just one disk. Minimal performance hit, you get the benefits of RAID1 (and snapshots), and it's a pretty s
    • I just want to add my recommendation for RAID-1. I set it up a couple of years ago when my last drive died, and I decided I was sick enough of losing data that it was worth it. I just had one of the drives die on me last weekend, and my downtime was just the few hours it took me to go to the store, get a new drive, and swap it in. Then I just told my Intel RAID app to mirror to the new drive and I could continue working while it rebuilt the array.

      Without RAID-1 I'd be pissed for the rest of the month.
    • It's not doing a sector by sector copy from one disk to an identically sized disk, it's copying the files from one drive to space *within* another drive. The receiving drive (or "drive") will most likely be on a separate server with space to hold multiple full and/or incremental copies. This is a very common feature of backup software, they do backups to more than just tape.

      What makes this newsworthy is it's Microsoft, so they'll likely undercut the prices of their competitors and use their insider knowledge
  • A few points (Score:4, Interesting)

    by erroneus ( 253617 ) on Friday April 15, 2005 @08:03AM (#12243031) Homepage
    Someone criticized the "downtime" thing. Frankly, in order to get a good backup, the data should not be in flux while other processes run on it, or the backup itself could be corrupt. So even in most conventional backup schemes, there is a period of time in which backups run and nothing else does.

    Another point is that I do not see where it will support operating systems other than Windows. This is to be expected, but a mature solution should be capable of backing up multiple operating systems, as many sites I have seen have a heterogeneous computing environment. At my site there are Windows servers, but there are also Novell, Linux and SunOS. Is there a solution for those too?

    On the other hand, if we're talking about what essentially amounts to "dd" I am sure there could be a handy Knoppix CD created to suit the task in some automated way. It could actually be quite simple in that at a certain time of day (night?) power to a bootable external CD drive is enabled, the system is scheduled to reboot at the same time, it boots from CD, runs "dd" per the scripting in the custom Knoppix where it finishes the job by writing out information to a log file about success or failure and then reboots the computer again. That's just off the top of my head but I am sure that even more elegant schemes could be cooked up. This solution would be effective at creating viable images at a good speed and could even utilize compression along the way.

    If Microsoft wants to make a "ghost" backup, then maybe they should just license the technology from Symantec.
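    For what it's worth, the scripted part of that custom-Knoppix idea is only a few lines (a sketch demoed against an ordinary file so it is safe to run; on the real CD, SRC would be a device such as /dev/hda and the output would land on the mounted backup disk):

    ```shell
    #!/bin/sh
    # Unattended dd image with compression and a success/failure log,
    # as in the custom-Knoppix scheme described above. All paths are
    # hypothetical; the demo images an ordinary file, not a real disk.
    root=$(mktemp -d)
    SRC="$root/fake-disk"            # stand-in for /dev/hda
    LOG="$root/backup.log"
    dd if=/dev/zero of="$SRC" bs=1024 count=64 2>/dev/null  # demo data

    if dd if="$SRC" of="$root/disk.img" bs=1048576 2>>"$LOG" &&
       gzip -f "$root/disk.img"; then
        echo "$(date): imaged $SRC -> $root/disk.img.gz OK" >> "$LOG"
    else
        echo "$(date): imaging $SRC FAILED" >> "$LOG"
    fi
    # (the Knoppix version would now reboot back into the host OS)
    ```

    Compressing after dd rather than in a pipeline keeps dd's exit status visible, so the log line actually reflects whether the image read succeeded.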
    • Re:A few points (Score:3, Informative)

      You've never heard of using broken mirrors, I take it?

      Zero downtime. Instant backup for a change. Sure, you need RAID-1 for it, but disk is cheap compared to the data on it.

      Also, various products have quite capable open file managers. We use Veritas Netbackup at my workplace, and it's excellent, cross platform and high performance.
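      Under Linux md RAID, the broken-mirror dance goes roughly like this (a sketch with hypothetical device names; it needs root, so it is written as a function and deliberately not invoked here):

      ```shell
      #!/bin/sh
      # Broken-mirror backup sketch: detach one half of a RAID-1 array,
      # back it up while quiescent, then re-add it so md resyncs it.
      # /dev/md0, /dev/sdb1 and the paths are hypothetical; mounting a
      # raw member directly assumes the md metadata sits at the end of
      # the partition. Not invoked: requires root and real hardware.
      backup_from_broken_mirror() {
          mdadm /dev/md0 --fail /dev/sdb1      # mark one half faulty
          mdadm /dev/md0 --remove /dev/sdb1    # detach it from the array
          mount -o ro /dev/sdb1 /mnt/frozen    # quiescent point-in-time copy
          tar -czf "/backups/md0-$(date +%F).tar.gz" -C /mnt/frozen .
          umount /mnt/frozen
          mdadm /dev/md0 --add /dev/sdb1       # re-attach; md resyncs
      }
      ```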
      • A nice idea until Murphy's law strikes.
        • ...broken mirrors...
          A nice idea until Murphy's law strikes.

          Not an issue. There are a couple of solutions:

          1. 3-way mirror ... Break off the third mirror, back it up, then re-attach it before the next backup.
          2. Primary is RAID-5. Secondary is a simple concatenation. (I presume Windows can handle this sort of setup.) Once again, break off the secondary to do the backup.
          3. A filesystem (such as Veritas) which allows snapshot filesystems. Snapshot systems keep track of old data when new data is written to the filesyst
    • Re:A few points (Score:4, Informative)

      by jbarr ( 2233 ) on Friday April 15, 2005 @08:13AM (#12243080) Homepage
      I have used Acronis' backup product on my workstations, and it works on-the-fly. I even tested it by doing a full backup, formatting the system disk, and restoring, and everything "came back" like it was before I did the backup. Acronis certainly has an excellent product.
    • Re:A few points (Score:3, Informative)

      by jabuzz ( 182671 )
      I take it that you have never heard of snapshots and filesystem freezes, then? It goes like this: freeze the filesystem, take a snapshot, unfreeze the filesystem. Typically this takes place in under 10 seconds. Then you back up using the snapshot, which can take as long as you need - provided you don't run out of snapshot space, of course. Then you release the snapshot once the backup is complete. Try man xfs_freeze for information on how you back up on real operating systems.
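      Concretely, for XFS on LVM the freeze/snapshot/unfreeze sequence looks something like this (hypothetical volume names and sizes; it needs root, so it is shown as a function rather than run):

      ```shell
      #!/bin/sh
      # Freeze -> snapshot -> unfreeze, then back up from the snapshot.
      # Assumes /data is an XFS filesystem on /dev/vg0/data; all names
      # are hypothetical. Not invoked: requires root and real volumes.
      snapshot_and_backup() {
          xfs_freeze -f /data                            # quiesce writes
          lvcreate -s -L 5G -n data_snap /dev/vg0/data   # point-in-time copy
          xfs_freeze -u /data                            # back in use in seconds
          mount -o ro,nouuid /dev/vg0/data_snap /mnt/snap
          tar -czf "/backups/data-$(date +%F).tar.gz" -C /mnt/snap .
          umount /mnt/snap
          lvremove -f /dev/vg0/data_snap                 # release snapshot space
      }
      ```

      The -L 5G is the copy-on-write area; if more than that changes while the backup runs, the snapshot is invalidated, which is exactly the "run out of snapshot space" caveat above.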
  • Haven't RTFA because I couldn't be arsed.

    Shouldn't the system back up to a disk spool and then to tape for offsite storage? Hell, even the freebies Amanda and Bacula do that already. And yup, Bacula is available for Windows.

    It does have to be said, though, that some very expensive commercial backup systems are only just managing to include disk spooling prior to tape (having had to deal with it for several years, I refer to that steaming pile of dung which is Netbackup).

  • Why is this news? (Score:2, Informative)

    by Anonymous Coward
    I don't recall /. articles for the release of any of these applications:

    Freshmeat Backup Apps [freshmeat.net]

    (flame away)
  • by bro1 ( 143618 ) on Friday April 15, 2005 @08:22AM (#12243135) Homepage
    Now it would be nice to get $5 each time data is corrupted by this backup system.
  • to Disk? (Score:5, Funny)

    by rossdee ( 243626 ) on Friday April 15, 2005 @08:25AM (#12243154)
    "Please insert disk 2 of 1,270,196 in drive A: and click continue"
    • I actually backed up my hard drive to 216 floppies once. I used MSBACKUP, which thankfully skipped the files that couldn't be recovered due to bad floppies and restored the rest.

      No one believes me though.
      • No one believes me though.

        MSBACKUP actually worked? It could skip bad floppies? WTF!? No way! In my experience, the smallest of errors will make any Microsoft program blow up.
  • Great! (Score:3, Funny)

    by Mad Merlin ( 837387 ) on Friday April 15, 2005 @08:26AM (#12243163) Homepage
    A beta Microsoft product for backing up all of my critical data! Where do I sign up?
  • It's a BETA - use it in a production environment and you deserve whatever bad things happen to you.

    Now *really* dangerous product groups with pre-programmed expiries are foods! They're not even marked as BETA! Go waste your time bitching about those non-BETA products that expire even though you've paid for them instead.
  • by damieng ( 230610 ) on Friday April 15, 2005 @08:42AM (#12243267) Homepage Journal
    I've recently been using Subversion as a backup solution at home with great success.

    My server runs its own SVN repository and each of my machines can check its important files into the tree.

    This backup solution is quick and thanks to tools like TortoiseSVN integrates into the desktop for ease of use.

    Additional bonus factors are the ability to see the revision history, roll back changes, and get full cross-platform support.

    You can also check out copies of the same files onto multiple machines should you need to work on them or just want additional resilience.

    The real icing on the cake, of course, is that you can run it over SSL via Apache or over SSH, and therefore remotely access your backed-up files from out on the Internet should you suddenly need an invoice or a photograph while sitting in a net cafe in a foreign country.

    Oh, and it's free by both definitions. http://subversion.tigris.org
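    Under stated assumptions (svn installed, hypothetical repository and document paths), the whole arrangement amounts to something like:

    ```shell
    #!/bin/sh
    # Subversion-as-backup sketch: one repository on the server, one
    # working copy per machine. Paths and URLs are hypothetical, so
    # the steps are shown as functions rather than executed here.
    one_time_setup() {
        svnadmin create /srv/backup-repo
        svn import "$HOME/documents" file:///srv/backup-repo/documents \
            -m "initial import"
        svn checkout file:///srv/backup-repo/documents "$HOME/documents-wc"
    }

    nightly_backup() {
        cd "$HOME/documents-wc" || return 1
        svn add --force . >/dev/null   # pick up any new files
        svn commit -m "nightly backup $(date +%F)"
    }
    ```

    Getting back an old version is then just svn update -r N in the working copy, which is the roll-back feature mentioned above.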

  • If you really need the uptime, you may already have a storage unit, which is almost certainly capable of snapshots/snapclones with close to zero downtime (some of them don't even bother copying the full contents of the drives -- just the differences!).

    Anyway, this would be only for databases, AFAICT. Any other kind of data usually does not need that kind of bringing-down-the-server-for-backups consistency.

    So, what's the point? Is this to be sold to enterprises that are so small that don't use storage syst
  • RTFA (Score:2, Informative)

    It seems like a decent feature set: a sort of scheduled mirroring of volumes over a LAN, utilising Volume Shadow Copy. An agent runs on the source server and logs the data changes. At scheduled times this agent transfers the accumulated changes to the DPM server.

    The server can produce snapshots etc and there seems to be some tie in to standard file save/open dialogs so users can access previous versions.

    Disk manufacturers will love it :-)

  • Too late (Score:3, Interesting)

    by dtfinch ( 661405 ) * on Friday April 15, 2005 @09:06AM (#12243435) Journal
    I've already switched to Samba and rsync. Microsoft's backup was outdated by at least a decade, and even failed to complete at random when I used it for disk-to-disk backups. And Windows' mandatory file locking policy makes safe, reliable backups entirely impossible. An xcopy backup is even dangerous, because it temporarily locks files as it opens them for reading, potentially causing other server processes to fail if they attempt to write to the files.
  • Hell no (Score:3, Interesting)

    by dtfinch ( 661405 ) * on Friday April 15, 2005 @09:42AM (#12243704) Journal
    From the FAQ:
    a customer has to purchase a server license for every DPM server that is deployed and a Data Protection Management License (DPML) for every server they protect.

    Now they have an incentive to never upgrade the poor-quality backup software already included in Windows. Admins will have to buy their backup software separately or look elsewhere. Server operating systems are expected to come with _good_ backup software, so in a strictly technical sense Microsoft is being an ass.
  • It looks like you are backing up data...
  • by Anonymous Coward
    Hi!

    You all seem to bash MS again... .. but tell me, where can I find a usable backup program for my SuSE 9.2 Professional? Windows 2000 Professional as well as Windows XP Professional both have a good schedulable backup program (included free, as it should be). But there is nothing on SuSE. (Ok, there is tar, but that definitely does not count! And then there is that system backup in YaST, but even that doesn't come close to what a backup program should be like - in order to be usable.) So, in terms
