Software

Secure File Storage Over Non-Trusted FTP?

Posted by kdawson
from the beeping-sounds-while-backing-up dept.
hmckee writes "Does any software exist that enables me to store/backup/sync files from my local computer to a non-trusted FTP site? To accomplish this, I'm using a script to check timestamps, encrypt and sign the files individually, then copy each file to an offsite FTP directory. I've looked over many different tools (Duplicity, Amanda, Bacula, WinSCP, FileZilla) but none of them seem to do exactly what I want: (1) multi-platform (Windows and Linux), stand-alone client (can be run from a portable drive). (2) Secure backup (encrypted and signed) to non-trusted FTP site. (3) Sync individual files without saving to a giant tar file. (4) Securely store timestamps and file names on the FTP server. Any help or info on alternative solutions appreciated."
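The workflow the submitter describes (check timestamps, encrypt and sign each file individually, push to FTP) can be sketched in a few dozen lines of Python. This is a hedged illustration rather than a finished tool: it assumes `gpg` is on the PATH, the recipient key and FTP credentials are placeholders, and error handling is omitted.

```python
import ftplib
import os
import subprocess

def changed_files(root, manifest):
    """Yield files under root whose mtime differs from the stored manifest."""
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            mtime = os.stat(path).st_mtime
            if manifest.get(rel) != mtime:
                manifest[rel] = mtime
                yield path

def encrypt_and_sign(path, recipient):
    """Encrypt and sign one file with gpg; returns the .gpg output path."""
    out = path + ".gpg"
    subprocess.run(["gpg", "--batch", "--yes", "--output", out,
                    "--recipient", recipient, "--sign", "--encrypt", path],
                   check=True)
    return out

def upload(host, user, password, local_path, remote_name):
    """Store one encrypted file on the (untrusted) FTP server."""
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "rb") as fh:
            ftp.storbinary("STOR " + remote_name, fh)
```

Requirement (4) can be approximated by keeping the manifest of names and timestamps in a local file, encrypting it like any other file, and uploading it last; the server then sees no plaintext metadata beyond whatever the chosen remote file names reveal.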
This discussion has been archived. No new comments can be posted.

  • by BadAnalogyGuy (945258) <BadAnalogyGuy@gmail.com> on Monday August 18, 2008 @02:31AM (#24641803)

    This guy was always complaining about headaches. He would constantly be pounding his head into his fist and whimper to me that he felt like his head would split open. He took pain killers all the time, and for a long duration was addicted to a certain prescription pain medication. But none of that helped because as soon as the medication started to wear off, the pain would come right back again.

    Finally, I had had enough of his complaining. I told him to stop pounding his head with his fist. Whaddayano! His headaches went away in a day.

    Moral of the story: Don't try to find workarounds for your problem. Fix the problem.

    • by elronxenu (117773) on Monday August 18, 2008 @02:42AM (#24641849) Homepage

      The problem is asking questions to Slashdot?

      • by gmack (197796) <gmack&innerfire,net> on Monday August 18, 2008 @05:49AM (#24642669) Homepage Journal

        The real problem is not knowing about rsync since it's designed for exactly his problem.

        • by B'Trey (111263) on Monday August 18, 2008 @09:02AM (#24643767)

          The real problem is not knowing about rsync since it's designed for exactly his problem.

          No, rsync isn't a very good solution, for a couple of reasons. First, unless there are capabilities I'm not aware of, rsync has no encryption support. Given an unencrypted file tree and an encrypted version of that tree, rsync has no way to compare the two for changes. The only solution I see is to maintain a local encrypted mirror of your file tree. But then you need twice as much space, since you're maintaining two local trees, and you need a tool to automatically sync the working tree with its local encrypted copy. If you have such a tool, it can probably work (or be hacked to work) against a remote tree directly, removing the need for rsync entirely. And even if you found such a tool that only works locally, you'd be nullifying the primary advantage of rsync.

          rsync is designed to do incremental updates. If you have a text file and change one word, rsync doesn't transfer the whole file. It only sends enough info to correctly update the remote file so that it matches the new local file. (Or vice versa, of course.) But when you change a single word and reencrypt a text file, the whole file changes. So rsync will have to transfer the whole file. So will any other solution, of course, but it does mean that rsync loses much of the capability which makes it so valuable.

          You could do something like unencrypt the local file tree mirror, rsync with the working file tree, reencrypt the file tree and then rsync the local encrypted tree with the remote encrypted tree mirror, but that's a lot of work and processing power and hardly matches the clean, integrated solution that the article is asking for. It's probably more cumbersome than whatever it is he's doing now.
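For reference, the reason rsync's incremental transfer is cheap (and why re-encrypted files defeat it) is its rolling weak checksum: sliding the block window one byte to the right costs O(1) instead of re-summing the block. A toy sketch of the idea, hedged as an illustration rather than rsync's exact algorithm (the real tool pairs this weak sum with a strong per-block hash):

```python
MOD = 1 << 16

def weak_sum(block):
    """rsync-style weak checksum: cheap to compute, cheap to slide."""
    a = sum(block) % MOD
    b = sum((len(block) - i) * x for i, x in enumerate(block)) % MOD
    return a, b

def roll(a, b, out_byte, in_byte, blocklen):
    """Slide the window one byte right in O(1) instead of re-summing."""
    a = (a - out_byte + in_byte) % MOD
    b = (b - blocklen * out_byte + a) % MOD
    return a, b

data = bytes(range(97, 123)) * 8   # sample bytes
n = 16
a, b = weak_sum(data[:n])
for k in range(1, len(data) - n + 1):
    a, b = roll(a, b, data[k - 1], data[k - 1 + n], n)
    assert (a, b) == weak_sum(data[k:k + n])   # rolled == recomputed
```

After a file is re-encrypted, essentially every window's checksum changes, so no blocks match and the whole file is resent, which is the point made above.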

          • by mishehu (712452) on Monday August 18, 2008 @09:32AM (#24644081)
            It appears that there are options for rsync encrypting files on the far end. rsyncrypto [lingnu.com] might be just one of these. I have not used them but I remember them coming across my 'radar screen' in the past.
            • by B'Trey (111263) on Monday August 18, 2008 @09:52AM (#24644293)

              Interesting. Things like this are why I always hedge my bets and say things like "...unless there's some capabilities that I'm not aware of, rsync has no encryption capabilities..."

              That being said, I'd be extremely leery of this program. The website says: "Rsyncrypto does, however, do one thing differently. It changes the encryption schema from plain CBC to a slightly modified version. This modification ensures that two almost identical files, such as the same file before and after a change, when encrypted using rsyncrypto and the same key, will produce almost identical encrypted files." I'm far from an expert at crypto, but I know enough to be extremely suspicious of that claim. A "slight change" in an encryption algorithm can be enough to transform it from highly secure to trivially crackable. And I strongly suspect that making similar files produce similar encrypted files means a great deal of information about the unencrypted file becomes available from examining the encrypted one. I wouldn't trust this without extensive review from some heavyweights in the crypto field.

              • Re: (Score:3, Informative)

                by vrmlguy (120854)

                From http://rsyncrypto.lingnu.com/index.php/Algorithm [lingnu.com]: "The entire rsyncrypto can be summarized with one sentence. Rsyncrypto uses the standard CBC encryption mode, except every time the decision function triggers, it uses the original IV for the file instead of the previous cypher block."

                So you're basically dividing each file into chunks and encrypting them separately using the standard algorithm. Seems pretty safe to me. The only obvious leakage is that an attacker can tell if two files are substantially identical.

                • Re: (Score:3, Insightful)

                  by B'Trey (111263)

                  It might be safe but unless you're quite knowledgeable about encryption, gut feelings about what seems safe aren't very reliable. I still suspect that doing this opens up more areas of attack. Note that I'm making no claims of expertise, so I don't KNOW this to be the case. I'm just saying that I'd be leery.

                • by mdmkolbe (944892) on Monday August 18, 2008 @11:37AM (#24645975)

                  Rsyncrypto is insecure. By resetting to the IV, it opens an information leak similar to the one with ECB mode [wikipedia.org] (see the picture of the penguin on that page).

                  To see why CBC with an occasional reset-to-IV is insecure (regardless of the trigger function), consider a long repeating pattern of the same bytes (e.g. the white space in the penguin picture). CBC won't encrypt them all to the same value (like ECB does), but every time the IV resets, the same sequence of encrypted bytes will appear. This pattern is detectable, and the places where the pattern is disrupted are detectable too. So going back to the penguin picture, the non-background portions cast a shadow that disrupts the repeating background pattern, revealing the content of the file.
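That argument can be demonstrated in a few lines of Python. The "cipher" below is a keyed one-way stand-in for AES, used only because this demo never needs to decrypt; it is emphatically not real crypto, but it shows how resetting CBC to the IV makes a run of identical plaintext produce a visibly repeating ciphertext pattern:

```python
import hashlib

BLOCK = 16

def toy_block_cipher(key, block):
    # NOT a real cipher: a keyed one-way function standing in for AES,
    # good enough for an encryption-only demo of ciphertext patterns.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def cbc_with_reset(key, iv, plaintext, reset_every):
    """CBC mode, except the chain restarts from the IV every reset_every
    blocks (standing in for rsyncrypto's decision function firing)."""
    prev, out = iv, []
    for i in range(0, len(plaintext), BLOCK):
        if (i // BLOCK) % reset_every == 0:
            prev = iv                    # the reset that leaks patterns
        chunk = plaintext[i:i + BLOCK]
        c = toy_block_cipher(key, bytes(x ^ y for x, y in zip(chunk, prev)))
        out.append(c)
        prev = c
    return out

key, iv = b"key", b"\x00" * BLOCK
background = b"\xff" * (BLOCK * 12)      # long run of identical bytes
leaky = cbc_with_reset(key, iv, background, reset_every=4)
assert leaky[0:4] == leaky[4:8] == leaky[8:12]   # repeating ciphertext: leak
honest = cbc_with_reset(key, iv, background, reset_every=10**9)  # plain CBC
assert len(set(honest)) == len(honest)           # no repeats without resets
```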

                • by Sun (104778) <shachar@shemesh.biz> on Monday August 18, 2008 @11:55AM (#24646297) Homepage

                  Note - I'm the one who designed and wrote rsyncrypto.

                  The only obvious leakage is that an attacker can tell if two files are substantially identical.

                  Well, no. Two files will likely be encrypted using different session (read: AES) keys, and will therefore not be even remotely similar. One file having two significantly similar areas will show up, however. That case was deemed fairly remote in my analysis. You are free to perform your own, of course. If you do, please feel free to email me.

                  The one place where actual data leakage may happen is that a persistent attacker can compare cipher texts and learn where the decision function triggered. That is a good point to ask "how much information is gleaned about the file?"

                  I think current rsyncrypto is ok on that front, and future plans include improvements. These improvements, alas, will cost in performance.

                  Shachar

              • Re: (Score:3, Interesting)

                by Anonymous Coward

                Here's a review of rsyncrypto [linux.com] that also says it isn't really secure:

                For an example of an information leak, suppose you have an XML file and you use rsyncrypto to copy the file to a remote host. Then you change a single XML attribute and use rsyncrypto to copy the updates across. Now suppose an attacker captured the encrypted versions in transit, and thus has copies of both the encrypted file before the change and after the change. The first thing they learn is that only the first 8KB of the file changed, be

      • Re: (Score:3, Informative)

        by jetole (1242490)
        The problem is FTP. It is an old, deprecated protocol that is inherently insecure, and even FTP w/ SSL is just a workaround bolted onto a broken protocol. As long as you are using plain FTP you are officially screwed, and I seriously doubt any company is building a product for it when they know FTP has the SSL option (a workaround, but it works). The real answer to your problem is to use a secure protocol like SSH, which does everything you just asked for natively. Now because I just posted two easy answers to yo
        • by thegrassyknowl (762218) on Monday August 18, 2008 @06:39AM (#24642891)

          The real answer to your problem is use a secure protocol like SSH which does everything you just asked for natively.

          Does it encrypt and sign the files one-by-one so that the admin of the remote site (who you don't trust) can't read, alter or share them on you?

          • Re: (Score:3, Interesting)

            by MagdJTK (1275470)

            Gah. I wish people wouldn't keep trying to use public key encryption when it's not needed. Public key encryption is used to get around the key distribution problem. Signing is used because anyone can easily encrypt stuff using your public key and you can't guarantee they are who they say they are.

            From what I can tell, he's not sending these files to anyone. He's uploading them and the only person who will access them will be himself. This is exactly what regular, symmetric encryption is for!

            Encrypt the file

            • by Leebert (1694) on Monday August 18, 2008 @02:45PM (#24648915)

              What's wrong with using a public key for the backups? That's what I do.

              If you're using purely symmetric encryption for your backups, you have to store the keys somewhere, and that somewhere has to be online when the backup is generated. Then you have to physically move it somewhere that it's not reachable. It's a manual process.

              With a public key system, you can store your private key offline all of the time, and not have to deal with symmetric key management. GPG does that for you.

              Where is the downside?
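The point above, that the machine taking backups needs only public material, is the defining property of asymmetric encryption. Textbook RSA with toy parameters makes it concrete. This is insecure by construction (tiny primes, no padding) and is not what GPG does internally; it only illustrates why encrypting requires no secrets online:

```python
# Textbook RSA with toy parameters: insecure, illustration only.
p, q = 61, 53
n = p * q                    # public modulus
e = 17                       # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # private exponent: lives offline (Python 3.8+)

def encrypt(m):
    # The backup host needs only the public pair (n, e): no secrets online.
    return pow(m, e, n)

def decrypt(c):
    # Restoring requires d, fetched from offline storage only when needed.
    return pow(c, d, n)

message = 42                 # stand-in for a (tiny) backup payload
assert decrypt(encrypt(message)) == message
```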

          • Re: (Score:3, Insightful)

            by poot_rootbeer (188613)

            Does it encrypt and sign the files one-by-one so that the admin of the remote site (who you don't trust) can't read, alter or share them on you?

            If you don't trust the remote server, why the fuck would you consider using it as a backup site?

            There isn't an encryption/protection scheme possible that will prevent the remote admin from outright deleting whatever files on his own filesystem that he wishes to. Oops, no more backups.

            • Re: (Score:3, Interesting)

              by Sloppy (14984)
              Yep. I spotted that problem too. The fact that he has the requirement to use non-trusted FTP (i.e. he can't install OpenSSH) strongly suggests this FTP server is someone else's. His backup could vanish at any moment. Since he can't rely on it, he'll still need to back up somewhere else too. So why not switch his attention to that "somewhere else"?
        • by B'Trey (111263) on Monday August 18, 2008 @08:45AM (#24643623)

          The problem is FTP. It is an old deprecated protocol that is inherently insecure and even FTP w/ SSL is simply a work around to a broken problem.

          Wow. It might be better to understand the problem before you make suggestions. FTP isn't the problem. FTP is just a way to move files from here to there. It's unsecured and untrusted but, in this case, SO IS THE REPOSITORY. Exactly what benefit do you get from using SSH to securely transfer files to an unsecure location? That's like using an armored truck to move your valuables to the QuickStorage down the road. What's wanted is an automated way to encrypt the files locally, then transfer the encrypted files to an untrusted site. If the files are encrypted, then it doesn't matter that FTP is unsecured.

    • by ettlz (639203)
      A slightly less acerbic answer is "Get Python, and code it yourself, schmuck!"
    • by stephanruby (542433) on Monday August 18, 2008 @04:19AM (#24642301)

      Yeah, I don't get this guy. First, he says he wants it for his home computer. Then, he says it has to be multi-platform (Windows and Linux) plus stand-alone that can be run from a portable drive.

      And I say why? Let's assume for a moment that this guy has two computers at home, one running Linux and one running Windows. He doesn't need an app that does everything perfectly on both platforms. He just needs an app that does it perfectly on one, and either one is fine, really. If he prefers to use his Linux box to coordinate the secure backup to an untrusted FTP site, he just needs to have his Windows machine send the data unencrypted over to his Linux box, which can then do the bulk of the job. Or, if he prefers to use his Windows machine for the secure backup to the untrusted site, he can have his Linux box send the data unencrypted to his Windows machine instead.

      And of course, why does it even need to go over FTP instead of SFTP? Instead of wasting valuable man-hours reinventing SFTP from scratch, or finding someone else who has, he could just pay a few dollars to a provider who offers SFTP. And if his current provider won't do that, get another provider that will. If backing up is really as important as he makes it seem, then spending a few extra dollars each month shouldn't be a problem.

      • by hmckee (10407) on Monday August 18, 2008 @04:32AM (#24642373)

        I should have stated that the data wasn't THAT important since it's already backed up in two other places.

        I was initially using Amazon S3 to do the backups, but since I had 20 GB of spare storage on my hosting site, I figured someone else must have tried doing the exact same thing because it's the cheapest solution. It didn't take me long to write a small script to encrypt files and send them to the FTP server, another reason I figured someone else may have done this. So, rather than extending the script, I thought I'd "Ask Slashdot" to see if anyone else had completed the exercise.

        If it were REALLY important for me to have this storage, I'd go back to using S3 or spring an extra $10 a month to get my account upgraded to use SSH/SFTP.

        As it stands now, I may just get a kick out of implementing the project for fun.

    • by n3tcat (664243)
      That was a pretty bad analogy
  • Really is a pity (Score:5, Informative)

    by pembo13 (770295) on Monday August 18, 2008 @02:36AM (#24641817) Homepage
    I have explicitly asked my web host provider for either SFTP or FTPS. They basically said that it wasn't possible to provide that on a shared host. This seems untrue to me; I just can't state it as fact since I haven't attempted it myself. But to get what the OP wants, one would essentially need a secure file system implementation on top of FTP, i.e. only the client can see the unencrypted file, not the FTP transport in between, and not the server-side disk.
    • Re:Really is a pity (Score:5, Informative)

      by ThePromenader (878501) on Monday August 18, 2008 @02:46AM (#24641883) Homepage Journal

      I'd translate "wasn't possible" to "couldn't be bothered". Once SSH is installed (and it is there by default in most *nix distros), you have but one 'user' file to configure (to 'jail' you within a certain hierarchy). Ta-da! Change your host and use SFTP.
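For reference, the jail described above is typically a few lines of sshd_config. The directives are standard OpenSSH options (note that ChrootDirectory requires the chroot path to be root-owned and not user-writable); the user name and path here are placeholders:

```
# /etc/ssh/sshd_config -- confine one account to SFTP inside a chroot
Subsystem sftp internal-sftp

Match User backupuser
    ChrootDirectory /srv/backups/%u
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
```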

      • I'd translate "wasn't possible" to "couldn't be bothered".

        I'd translate it as uneconomic.

         

      • Re: (Score:3, Interesting)

        by Builder (103701)

        I fully agree with this provider. Providing shell access to a shared machine is madness and you cannot provide security for your users this way.

        SFTP requires that SSH be running, so there is always a risk of shell access being gained through breaking scponly or whatever other jail you use.

        Virtual machines are the only way I know of providing this, and they cost more because of setup / maintenance costs. Failing that, FreeBSD jails, but they are unpopular due to people wanting Linux hosting.

        You get what you

    • Re:Really is a pity (Score:5, Informative)

      by EdIII (1114411) * on Monday August 18, 2008 @03:30AM (#24642089)

      It is ENTIRELY possible to provide that on any host, regardless of the number of users. All you are asking (correct me if I am wrong) is that the connection between you and the FTP server is secured through SSH or TLS.

      That is trivial. Sounds like they cannot be bothered to enact rudimentary security. As a policy in my own systems, and any systems that I pay to use, I demand that any connections that go over untrusted networks be encrypted. There are so many products that help you do this it just makes their refusal all the more ridiculous. I have a product that does not support encrypted connections and I just stunnel to protect it.

      Anything less is just reckless. Tell them to protect your connection or you will get another provider. Simple as that.

    • by Rufus211 (221883)

      They might be telling the truth, depending on how they share the hosts and how they have logins setup.

      HTTPS is not possible with virtual hosts (where foo.com and bar.com are both running on 1.2.3.4). The reason is that the HTTP server doesn't know whether you're talking to foo.com or bar.com until after the connection has started, but it needs to send out one of their certificates in order to get the connection started.

      I'd guess FTPS has the same issue, as the FTP server won't know what to respond as. SF

  • by RAMMS+EIN (578166) on Monday August 18, 2008 @02:43AM (#24641859) Homepage Journal

    I'm working on a backup solution that allows people to back up their data to a remote server securely and efficiently. For "efficiently", think rsync: only the differences are sent (plus some information necessary to identify what the differences are). For "securely", think asymmetric cryptography: your backup is stored in encrypted form, so that only someone who possesses your private key can use it.

    All this is currently in very early stages of design. I'd welcome any suggestions for protocols or software I could use. Currently, I am thinking of implementing a transactional network block device protocol and building the backup protocol on top of that. I still need to decide on a programming language for the parts I need to write myself (something safe (no buffer overflows, please), yet with byte-level access... and no Java or .NET, please).

    By the way, this is going to be a commercial product, but the code and the protocols will be open. I'll charge for the storage and bandwidth. :-D

    • If the backup is going to be stored in encrypted form then how is efficient "rsync-like" difference identification going to be possible?

      A small change in a source file will likely change everything following it in the encrypted version.

    • Re: (Score:3, Interesting)

      by davidkv (302725)

      Have you checked out rsyncrypto [lingnu.com]?

      • by RAMMS+EIN (578166)

        No, but I will. It looks like it could be very useful to me. Thanks for the pointer!

        • by davidkv (302725)

          There's also esync [zexia.co.uk], but as far as I know (I emailed the guy a few years ago) he got swamped with other stuff and never got any further.

          There's quite a bit of theory on his pages though. Might be of interest.

          • Re: (Score:3, Informative)

            by Sun (104778)

            I've looked at it a while back (I'm the one who wrote rsyncrypto). When compared with rsyncrypto, the main thing I didn't like (aside from the fact there appears to be no implementation... ) is the amount of state stored. Esync actually needs access to the old plain text file in order to work (or a substantially similar state). Rsyncrypto, on the other hand, needs just a few pieces of state per file, that once created never change. These include the symmetric session key and such stuff, and are about 68 byt

    • by gsasha (550394)
      Manent does it - I'm welcoming collaboration. Check it out at http://freshmeat.net/projects/manent [freshmeat.net]. I'm not leaving my email here but you'll find it if you drill down to the project website.
    • by sumdumass (711423)

      Here is a suggestion: make sure something forces the user/admin on the client side to back up their encryption keys and settings, either locally or to a combination of local and remote storage that isn't on the computers being backed up. And make sure this is verified every so often, either by requiring a "file" stored on the local backup for the program to work, or by having it refuse to run until another updated backup of the keys and settings is made.

      Nothing pisses me off more than walking into a job

    • Re: (Score:3, Interesting)

      by Bert64 (520050)

      Well, I use rsync over SSH (so the network traffic and authentication is encrypted)...
      You could potentially use an encrypted disk locally, and rsync the encrypted disk image over (it should still only xfer the changes), assuming you don't trust the target host.

  • TrueCrypt (Score:5, Informative)

    by kcbanner (929309) * on Monday August 18, 2008 @02:50AM (#24641909) Homepage Journal
    See http://www.truecrypt.org/ [truecrypt.org] for cross platform encryption...you can throw your files in there.
    • by hmckee (10407)

      I use TrueCrypt on my portable hard drive and tried using it for this application. The problem was that TrueCrypt couldn't create a file system on an FTP server.

      I've been using TrueCrypt to encrypt individual files before sending them to the FTP server. I'll have to give it a look again since my version might be a little out of date.

      • by kcbanner (929309) *
        If you don't mind the bandwidth use can't you just sync up your TrueCrypt encrypted file?
  • by pananza (1228694)
    I use Amazon's S3 service and a great multi-platform UI called JungleDisk. S3 costs a little bit, but you get security (encryption), backup, and reliability for a cheap price. Check out: http://www.amazon.com/s3 [amazon.com] and http://www.jungledisk.com/ [jungledisk.com]
    • I second the recommendation.

      Backups are differential on a block level (blocks are a few MB, if I'm not mistaken). File identities and extended attributes are preserved. Upload resume and "on the fly" (i.e., without re-uploading) encryption key changes are supported for a premium (JD Plus service).

      I'm not sure how secure the web access interface is, but I think you can disable it.

    • by hmckee (10407)

      I was using Amazon S3 before realizing I was paying double when I had a spare 20 gigabytes on my FTP/HTTP hosting service. I could pay an extra $10 a month to get SFTP/SSH service but I guess I'm being cheap.

      I'm also not storing anything so important that I need a technically superior solution.

  • by Horus107 (1316815) <Florian...Lindner@@@xgm...de> on Monday August 18, 2008 @03:02AM (#24641957)

    duplicity combined with ftplicity:

    "Anyone storing data on an unfamiliar FTP server needs to encrypt and sign it to ensure reliable protection against prying eyes and external manipulation. duplicity is just the tool for this, and the ftplicity script from c't magazine makes working with it child's play."

    http://www.heise-online.co.uk/security/Backups-on-non-trusted-FTP-servers--/features/79882 [heise-online.co.uk]
    http://duplicity.nongnu.org/ [nongnu.org]

    • by hmckee (10407)

      Yes, I've looked at this, but I'm already using a Python script to do most of that. I was hoping to find something with a GUI and that was easier to put on a portable hard drive than Python.

      • I was hoping to find something with a GUI

        Then you should have put this as a requirement in your query. But I would ask WHY you want a gui? Backups should be set-and-forget! My USB sticks have multi-platform autorun scripts to execute my backup. I only need an interface if I choose to expand or shrink the backup set--I can edit a text file that has the list of what to exclude.

        and that was easier to put on a portable hard drive than Python.

        Python is pretty easy to put on a portable hard drive and there ar

        • by hmckee (10407)

          Blame this on my not writing up a really thorough spec for the small summary. You can see some of my other posts for more info, but this was sort of a query to see if anyone had done something similar because it seems like a simple project that might be useful.

          As to the GUI, I was thinking it would be nice if it could double as a backup tool and a remote file system tool, i.e. access the files from another computer.

    • by sumdumass (711423)

      Anyone storing data on an unfamiliar FTP server had better make sure that it isn't important or private. FTP doesn't encrypt anything, including the user name and password. Simply sniffing either end of the connection could allow anyone to delete the files, download them to be cracked on a 2-million-node botnet contributing cycles, or even infect the files with something that would notify me when you access them and either send your keys to me or give me access to your system.

      And I said me not because I would

  • OK, FTP supports reading chunks of data from files, i.e. byte range n-m.

    However, it doesn't support (I strongly suspect) _writing_ chunks. Sure, you can say 'REST n' and start writing, but I think the file would be truncated.

    This means encrypted images like TrueCrypt containers are out, since you'd be writing the entire file over and over again.

    So you'll have to stay with single files.

  • by zonky (1153039) on Monday August 18, 2008 @04:02AM (#24642227)
    This may well mean that despite whatever you do (encrypt, etc.), someone can sniff the password and then simply come in and delete all your files. I.e., whatever other steps you take, this is inherently worthless.
    • by GauteL (29207) on Monday August 18, 2008 @07:30AM (#24643109)

      This may well mean that despite whatever you do (encrypt, etc.), someone can sniff the password and then simply come in and delete all your files.

      I.e., whatever other steps you take, this is inherently worthless.

      Hardly. As long as the data is encrypted well enough to stop people from stealing or modifying the data in ways that could have serious privacy and financial implications this is a net gain in data availability.

      Even if the chance of someone doing this was as high as 5% over the period in question, it would still mean that there was 95% chance of you having a good off site backup. That is better than nothing as long as you realise that there is still a 5% risk and don't act like it is totally secure.

      As a simplified example: if your PC at home is 95% sure of retaining all of its data in the period and your portable USB hard drive is 95% sure of retaining all of the data, the chance of you losing any data at all is 0.25%. Even with exaggerated risk factors, this is not bad.
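As a quick check of that arithmetic, assuming the two failure events are independent:

```python
p_fail = 0.05                    # chance one copy is lost in the period
p_lose_all = p_fail * p_fail     # both copies must fail together
assert abs(p_lose_all - 0.0025) < 1e-9    # i.e. 0.25%
print(f"chance at least one copy survives: {1 - p_lose_all:.2%}")
# -> chance at least one copy survives: 99.75%
```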

  • If you can ssh to the site, you should be able to use SFTP, a file transfer protocol that runs over SSH. That is about as secure as it gets without personalized encryption keys.

    If you cannot ssh to the site, then you should find another host.
  • by gsasha (550394) on Monday August 18, 2008 @04:08AM (#24642247) Homepage
    Well, its feature list is exactly what you want, and then some :). Here's the project description:
    Manent is an algorithmically strong backup and archival program. It features efficient backup to anything that looks like storage. Currently it supports plain filesystems ("directories"), FTP, and SFTP. Planned are Amazon S3, optical disks, and email (SMTP and IMAP). It can work (making progress towards finishing a backup) over a slow and unreliable network. It can offer online access to the contents of the backup. Backed up storage is completely encrypted. Backup is incremental, including changed parts of large files. Moved, renamed, and duplicate files will not require additional storage. Several computers can use the same storage for backup, automatically sharing data. Both very large and very small files are supported efficiently. Manent does not rely on timestamps of the remote system to detect changes.
    Check it out: http://freshmeat.net/projects/manent [freshmeat.net]. It's under active development (the UI and the setup are currently in fetal stage) but the basic functionality is there and is well tested.
    Disclaimer: I am the author.
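Manent's "moved, renamed, and duplicate files will not require additional storage" property is characteristic of content-addressed storage. A toy sketch of the general idea (not Manent's actual implementation): files are indexed by a hash of their contents, so identical data is stored exactly once no matter what it is named.

```python
import hashlib

class ContentStore:
    """Toy content-addressed store: duplicate/renamed files share one blob."""
    def __init__(self):
        self.blobs = {}    # digest -> data (stored once)
        self.index = {}    # path -> digest (cheap to duplicate)

    def put(self, path, data):
        digest = hashlib.sha256(data).hexdigest()
        new = digest not in self.blobs
        if new:
            self.blobs[digest] = data
        self.index[path] = digest
        return new          # True only when fresh storage was needed

store = ContentStore()
assert store.put("a/report.txt", b"quarterly numbers") is True
assert store.put("b/copy-of-report.txt", b"quarterly numbers") is False
assert len(store.blobs) == 1    # the duplicate cost no extra storage
```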
  • If possible, keep it simple. This is what I do - it is from UNIX, I don't know if Windows can handle it, but probably through a proper UNIX subsystem:

    (cd /source/directory;tar cf - *)|ssh user@target '(cd /target/directory;tar xvf -)'

    The left side will copy the whole directory tree under /source/directory and put it out on stdout in tar format; the right side will route the stdout to the target machine, where it will be unpacked under the target directory. If you don't want to copy everything, there are way

  • I've looked into this in the past. There is nothing better than Duplicity.

    I eventually gave up and started backing up my data to servers that I do trust. You should too. You can rent a VPS for only $20 per month. It's just easier, and you *know* that you're the only one who has root access (assuming that you keep your system updated, of course).

  • I can see a relatively easy solution for Linux, and that is just scripting the whole thing. Almost any backup script should be able to do what you want, and can pull the files from Windows machines as well. That will be CLI, which should not be an issue, as backups should not run in a GUI anyway but automagically from cron.

    It becomes different if you also want the restore to be in the same tool.
