BitTorrent For Enterprise File Distribution? 291
HotTuna writes "I'm responsible for a closed, private network of retail stores connected to our corporate office (and to each other) with IPsec over DSL, and no access to the public internet. We have about 4GB of disaster recovery files that need to be replicated at each site, and updated monthly. The challenge is that all the enterprise file replication tools out there seem to be client/server and not peer-to-peer. This crushes our bandwidth at the corporate office and leaves hundreds of 7Mb DSL connections (at the stores) virtually idle. I am dreaming of a tool which can 'seed' different parts of a file to different peers, and then have those peers exchange those parts, rapidly replicating the file across the entire network. Sounds like BitTorrent you say? Sure, except I would need to 'push' the files out, and not rely on users to click a torrent file at each site. I could imagine a homebrew tracker, with uTorrent and an RSS feed at each site, but that sounds a little too patchwork to fly by the CIO. What do you think? Is BitTorrent an appropriate protocol for file distribution in the business sector? If not, why not? If so, how would you implement it?"
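The uTorrent-plus-RSS idea the submitter sketches needs little more than a feed the store clients can poll. A minimal sketch of generating one with the Python standard library — the hostnames, paths, and file names below are invented for illustration:

```python
import xml.etree.ElementTree as ET
from email.utils import formatdate

def torrent_feed(items):
    """Build an RSS 2.0 feed whose entries point at .torrent files.

    items: list of (title, torrent_url) tuples. The URLs are hypothetical;
    the corporate office would serve the feed and torrents over HTTP.
    """
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "DR file pushes"
    ET.SubElement(channel, "link").text = "http://hq.example.internal/feed.xml"
    ET.SubElement(channel, "description").text = "New torrents to auto-download"
    for title, url in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        # RSS-aware clients such as uTorrent follow the enclosure URL.
        ET.SubElement(item, "enclosure", url=url,
                      type="application/x-bittorrent")
        ET.SubElement(item, "pubDate").text = formatdate()
    return ET.tostring(rss, encoding="unicode")

feed = torrent_feed([("dr-2009-03",
                      "http://hq.example.internal/dr-2009-03.torrent")])
```

Each month's push then reduces to creating the torrent, dropping it on the web server, and appending one feed item; the clients do the rest on their polling interval.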
Different torrent client ? (Score:5, Informative)
ask us (Score:5, Informative)
Next time you should ask at the official BitTorrent IRC channel [irc].
The Python BitTorrent client [bittorrent.com], which runs on Unix, has a version called "launchmany" which is easily controlled via script. It should fit your needs very nicely.
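Since launchmany seeds every torrent it finds in the directory it watches, "pushing" a new file can be as simple as dropping a fresh .torrent there. A sketch of that push step — the directory layout is an assumption about your setup:

```python
import shutil
from pathlib import Path

def push_torrent(torrent_path, watch_dir):
    """Copy a freshly made .torrent into the directory launchmany watches.

    The client picks the new file up on its next rescan and starts seeding;
    the payload files must already sit where the torrent's paths point.
    """
    dest = Path(watch_dir) / Path(torrent_path).name
    shutil.copy2(torrent_path, dest)
    return dest
```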
rsync (Score:5, Informative)
In a word, Yes (Score:5, Informative)
I've seen BitTorrent used for several business-critical functions. One example is World of Warcraft, which distributes its updates with it.
Cisco already makes a product to do this - WAAS (Score:5, Informative)
It is like rsync on steroids. Cisco's WAN optimization and application acceleration product lets you "seed" your remote locations with files. It also uses a technique called Data Redundancy Elimination (DRE) that replaces large data segments that would be sent over your WAN with small signatures.
What this means in a functional sense is that you push that 4 GB file over the WAN once. On any subsequent push, only the bit-level changes are synced, effectively transferring just the 10 megabytes that actually changed.
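The signature idea is easy to illustrate: split the file into blocks, hash each, and ship only the blocks whose hashes the remote doesn't already hold. A toy sketch of the concept — not Cisco's actual algorithm, which uses finer, content-defined chunks:

```python
import hashlib

BLOCK = 64 * 1024  # 64 KiB fixed blocks, purely for illustration

def signatures(data):
    """SHA-1 of every fixed-size block of a byte string."""
    return [hashlib.sha1(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def changed_blocks(old, new):
    """Return (index, bytes) for each block of `new` that differs from `old`."""
    old_sigs = signatures(old)
    out = []
    for i, sig in enumerate(signatures(new)):
        if i >= len(old_sigs) or sig != old_sigs[i]:
            out.append((i, new[i * BLOCK:(i + 1) * BLOCK]))
    return out
```

Only the returned blocks cross the WAN; the receiver splices them into its existing copy.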
While it is nice to get the propeller spinning, there is no sense reinventing the wheel.
Cisco WAAS - http://www.cisco.com/en/US/products/ps5680/Products_Sub_Category_Home.html [cisco.com]
If the CIO expects "official" support... (Score:5, Informative)
Personally I like the portable media shipment suggestions. But if your CIO/company requires enterprise software from a large vendor with good support, have a look at IBM's Tivoli Provisioning Manager for Software:
http://www-01.ibm.com/software/tivoli/products/prov-mgrproductline/ [ibm.com]
Besides the usual software distribution, this package has a peer-to-peer function. It also senses bandwidth. If there's other traffic it slows down temporarily so it won't saturate the link. Once the other traffic is done (like during your off-hours or maintenance windows) it'll go as fast as it can to finish distributing files.
WAFS from GlobalScape (Score:1, Informative)
We do something similar using WAFS from GlobalScape (previously Availl).
http://www.globalscape.com/wafs/
It provides bit-level updates to data either on a schedule or continuously, and can keep a specified file version archive too. The continuous update to HQ should keep DSL utilisation low.
Re:Bittorrent is not secure (Score:5, Informative)
While security is always something to be considered, this from the question:
"private network of retail stores connected to our corporate office (and to each other) with IPsec over DSL, and no access to the public internet"
Private network? Check.
No access to public internet? Check.
So pretty much no way for the files to be seeded outside the company.
And even if there were a way to seed on the internet when they don't have access to it, password-protect the file so only a client with the password can open it. That's not unbreakable, but if a competitor wanted the information, there are easier ways to get it.
Captain disillusion (Score:5, Informative)
with IPsec over DSL, and no access to the public internet.
Unless you have very long wires, some box is going to route them. Are those your own?
Otherwise, your ISP's router, diligent in separating traffic though it may be, can get hacked.
Why am I saying this? Not to make you don your tinfoil hat, certainly, but just to point out that if the scenario is as I describe, you're not 100% GUARANTEED to be invulnerable. Maybe a few tinfoil strips in your hair would look nice... ;)
About the actual question: BitTorrent would probably be fine, but if most of the data is unchanged between updates, you may want to compute the diff and then BT-share just that. How do you store the data? If it's just a big tar(.gz|.bz2) archive, bsdiff might be your friend.
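If the archive does live as a single file, the bsdiff round trip is just two commands: build a (usually tiny) patch at HQ, ship it, and rebuild the new archive at each site. A sketch of the command lines, assuming the bsdiff tools are installed; the file names are illustrative:

```python
def bsdiff_cmds(old, new, patch):
    """argv pairs for the bsdiff/bspatch round trip.

    `bsdiff old new patch` writes the patch at HQ;
    `bspatch old new patch`, run at a remote site, rebuilds `new`
    from that site's copy of `old` plus the shipped patch.
    """
    make = ["bsdiff", old, new, patch]
    apply_ = ["bspatch", old, new, patch]
    return make, apply_
```

The patch is what you would BT-share; each site runs the bspatch command against last month's archive.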
If you push from a single seeder to many clients, maybe multicast would be a good solution. But that's in the early design phase I think, which is not what you need :)
Best of luck!
How is the VPN setup (Score:5, Informative)
Your best bet is multicast, there are programs for software distribution that use multicast.
Cleversafe? (Score:1, Informative)
You should take a look at cleversafe.org - it's an open-source 'dispersed storage' infrastructure that lets you slice up files and distribute them across a network of storage servers. Not sure if this would get you what you want, but it's worth looking into.
Re:it's called dsync (Score:5, Informative)
I hate to reply to my posts, but this link has an even shorter description of the tool:
conferences.sigcomm.org/sigcomm/2008/papers/p505-puchaA.pdf
Foldershare? (Score:1, Informative)
Windows DFS -- Dont use FRS (Score:5, Informative)
Re:Cisco already makes a product to do this - WAAS (Score:4, Informative)
BitTorrent is not very flexible in this regard: if bits are -added- in the middle, every piece after the first added bit fails its hash check and has to be re-downloaded.
The worst case, of course, is new material at the beginning, which shifts everything after it. BitTorrent is not designed for that.
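The effect is easy to demonstrate with the piece hashing BitTorrent actually performs: insert a single byte and every piece hash from that point on changes. A toy demonstration with a deliberately tiny piece size:

```python
import hashlib

PIECE = 16  # toy piece size; real torrents use 256 KiB and up

def piece_hashes(data):
    """SHA-1 of each fixed-size piece, as a .torrent file stores them."""
    return [hashlib.sha1(data[i:i + PIECE]).digest()
            for i in range(0, len(data), PIECE)]

original = bytes(range(10)) * 10                 # 100 bytes -> 7 pieces
shifted = original[:20] + b"X" + original[20:]   # one byte inserted into piece 1

before = piece_hashes(original)
after = piece_hashes(shifted)
# only piece 0, ahead of the insertion point, keeps its hash
unchanged = sum(1 for a, b in zip(before, after) if a == b)
```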
Re:Sure, why not? (Score:5, Informative)
One of the things that always amused me was when people claimed Bram Cohen was "selling out" by working with the movie/music industry. BitTorrent was never intended for piracy; that's merely its most common use.
It's very regularly used for Linux distros, game patches (World of Warcraft!), etc.
Kontiki (Score:3, Informative)
Re:Sure, why not? (Score:2, Informative)
Re:Cisco already makes a product to do this - WAAS (Score:1, Informative)
>BitTorrent is not very flexible in this regard and so if you have bits -added- to the middle, then everything after the first added bit will need to be updated.
I disagree. Your point holds true only if we are talking about a single large file (e.g. a DVD image). As the question is about replicating files, plural, BitTorrent does appear to have the necessary flexibility. Adding a new file, or modifying a file in the middle of a multi-file torrent, does not force re-downloading the entire torrent or even the files that appear after the addition; it only requires downloading the pieces covering the changed/added file.
Re:rsync (Score:5, Informative)
Yes, and there are ways you can use rsync from well-planned scripts that are very powerful beyond just file transfer.
1. The basic case of "transfer or update existing files at destination to match source." It always takes advantage of existing destination data to reduce network transfers.
2. The creation of a new destination tree that efficiently reuses existing destination data in another tree without modifying the old tree. See --copy-dest option.
3. In addition to the previous, don't even create local disk traffic of copying existing files from the old tree to new, but just hard link them. This is useful for things like incremental backup snapshots. See --link-dest option.
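The three modes above map onto rsync invocations like these; a sketch with invented host and path names:

```python
def rsync_cmd(mode, src="hq::dr/", dest="/data/dr/"):
    """argv for the three rsync patterns described above.

    'update'    -> plain incremental transfer into an existing tree
    'copy_dest' -> build a new tree, reusing data found in an old one
    'link_dest' -> like copy_dest, but hard-link unchanged files
    All paths are illustrative only.
    """
    base = ["rsync", "-a", "--partial"]
    if mode == "update":
        return base + [src, dest]
    if mode == "copy_dest":
        return base + ["--copy-dest=/data/dr-old/", src, "/data/dr-new/"]
    if mode == "link_dest":
        return base + ["--link-dest=/data/dr-old/", src, "/data/dr-new/"]
    raise ValueError(mode)
```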
It may not be as sexy as p2p protocols, but you can implement your own "broadcast" network via a scattered set of rsync jobs that incrementally push their data between hops in your network. And a final rsync with the master as the source can guarantee that all data matches source checksums while having pre-fetched most of the bulk data from other locations.
I've been enjoying various rsync applications such as the following (to give you an idea of its power): obtain any old or partial mirror of a Fedora repository and update it from an appropriate rsync-enabled mirror site, to fill in any missing packages. This is a file tree of packages and other metadata. Concatenate all of the tree's files into one large file. Then use rsync to "update" this file to match a corresponding DVD re-spin image on a distro website. Rsync will figure out that most of the file extents cooked into the ISO image are already in the destination file, and just go about repositioning them and filling in the ISO filesystem's metadata. An incredibly small amount of traffic is spent performing this amazing feat.
Re:How I would do it... (Score:5, Informative)
Not necessarily true. PGP allows you to encrypt one file to multiple recipient keys. Each site would have its own key that it would use to decrypt the file. One file, multiple keys, multiple users. Simple.
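In gpg terms that is just multiple recipients on one encryption pass. A sketch of the command line; the key IDs are placeholders:

```python
def gpg_encrypt_cmd(path, recipients):
    """argv that encrypts one file so any of the listed keys can decrypt it."""
    cmd = ["gpg", "--encrypt"]
    for key in recipients:
        cmd += ["--recipient", key]  # one -r per site key
    return cmd + [path]
```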
Re:The question remains.. (Score:3, Informative)
I also assumed that this was hub-and-spoke and that the "to each other" statement was just routing. Given the number of remote sites, and since he did not mention a specific hardware supplier, I would assume a meshed IPsec VPN setup would be a chore to maintain, as it would likely be all manual.
I am all for open-source systems, but I find Cisco 8xx-series routers well priced (under $500) and easily managed for mesh VPN setups of up to 20 links. I run this setup with an ASA 5510 at the center; each site connects to the ASA and to 4 other sites for the remote administration office, and any other connections are just routed. Basically a hybrid of hub-and-spoke plus appropriate meshing.
Re:Sure, why not? (Score:3, Informative)
You're talking about the difference between the provider pirates and the end-user pirates. SCENE people hate p2p. Average Joe-wants-stuff-for-free doesn't know what the "scene" is, and uses p2p (always wondering why torrents say RELOADED or RAZOR1911).
Re:Cisco already makes a product to do this - WAAS (Score:4, Informative)
Even with a large file, only the differences need to be retransmitted with BitTorrent, provided that the overall file size doesn't change. At startup, BitTorrent will verify the local data and then discard and re-download only the chunks that don't match the checksums in the torrent file.
But rsync would be a better solution in this scenario as it was explicitly designed for such a use and will handle changes to the file much better.
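That recheck step can be sketched with the same piece hashing: compare local piece hashes against those recorded in the torrent and re-fetch only the mismatches. A toy sketch with a tiny piece size:

```python
import hashlib

PIECE = 32  # toy piece size for illustration

def stale_pieces(local, torrent_hashes):
    """Indices of local pieces whose SHA-1 doesn't match the .torrent.

    Only these pieces need to be re-downloaded from the swarm.
    """
    out = []
    for i, expected in enumerate(torrent_hashes):
        piece = local[i * PIECE:(i + 1) * PIECE]
        if hashlib.sha1(piece).digest() != expected:
            out.append(i)
    return out
```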
Re:Chained client/server (Score:1, Informative)
Bittorrent will do this for you.
Especially with Super-seeding/Initial seeding.
Re:Kontiki (Score:2, Informative)
Re:Snail-mail USB sticks (Score:3, Informative)
We use both to replicate data between windows servers internally and on external sites.
Re:rsync (Score:2, Informative)
Re:BitTorrent not efficient for this scenario (Score:4, Informative)