Eric Nagel

CTO, PHP Programmer, Affiliate Marketer & IT Consultant

FTP or Amazon S3 Server Backup PHP Script

I have been, or can be if you click on a link and make a purchase, compensated via a cash payment, gift, or something else of value for writing this post. Regardless, I only recommend products or services I use personally and believe will be good for my readers.

I’ve got a dedicated server with GoDaddy, and have been pretty happy with things. I know how important backups are, but don’t like how GoDaddy handles it.

First, you have to set it up yourself, which is OK if you know what you’re doing in Plesk, but Plesk backups SUCK. You can’t extract a single file from them, or get the SQL commands to rebuild part of the database.

Furthermore, Plesk restorations SUCK. Plesk removes the current (live) site, then attempts the restore. Oh, and if the restore fails? Sorry – you have nothing now. Talk about back-asswards.

The final nail in the coffin is the fact that you can ONLY access the GoDaddy backup FTP server from your main server. So if your main server fails, don’t worry: you have a backup – you just can’t get to it.

After all that, I decided to write my own backup script. If you want to do something right, you have to do it yourself.

I started with a basic script that saved the files to an FTP server, which was a good first step. But I couldn’t find a cheap FTP server for backups, so I looked at Amazon S3. While the pricing is a bit confusing, I know it’s cheap: it looks like I can back up my entire server, daily, for about $0.02 / day. You’re not going to find a better deal than that!


Enough already! Here’s the script. It’s completely free, and probably has more comments than actual code in it. Just PLEASE don’t ask for support on this one – get a sysadmin if you need help configuring it, or figure it out with trial & error.
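The script’s flow is simple: dump the databases, archive the web roots, and push both archives to S3. Here is a rough sketch of that flow (not the original script), assuming Donovan Schönknecht’s standalone S3.php class, the one discussed in the comments below; the credentials, bucket name, and paths are all placeholders:

```php
<?php
// Sketch only. Assumes S3.php (Donovan Schönknecht's standalone class)
// sits next to this script. All credentials and paths are placeholders.
require_once 'S3.php';

$nDays      = 5;                      // days of backups to keep
$workingDir = '/root/backups/';       // scratch space for the archives
$cVhostsDir = '/var/www/vhosts/';     // Plesk web roots; adjust for cPanel
$bucket     = 'my-backup-bucket';     // hypothetical bucket name

$stamp   = date('Y-m-d');
$sqlFile = $workingDir . "db.$stamp.sql.gz";
$tarFile = $workingDir . "files.$stamp.tar.gz";

// 1. Dump all databases, compressed.
//    (In practice, keep the password in ~/.my.cnf, not on the command line.)
exec('mysqldump --all-databases -u root -pPASSWORD | gzip > ' . escapeshellarg($sqlFile));

// 2. Archive the web roots.
exec('tar -czf ' . escapeshellarg($tarFile) . ' ' . escapeshellarg($cVhostsDir));

// 3. Upload both archives to S3, then free the local disk space.
$s3 = new S3('ACCESS_KEY', 'SECRET_KEY');
foreach (array($sqlFile, $tarFile) as $file) {
    S3::putObject(S3::inputFile($file), $bucket, basename($file), S3::ACL_PRIVATE);
    unlink($file);
}
```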

Storing your data in “the cloud” can be dangerous if you don’t have permissions set right. A webinar by @WilsonMattos cleared that up for me (previous webinar available for download).

So, get your server backed up, and once you do, make sure you can successfully restore the data!

  • Rodney
    Posted May 14, 2009 3:43 pm

    Thanks for sharing your script. Is this setup to only work on Plesk servers, or do you think it might work on a WHM/cpanel server as well?

  • Eric
    Posted May 14, 2009 4:23 pm

    If you change the paths (mainly $cVhostsDir), this should work on WHM/cpanel as well.
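    For example (these paths are typical defaults, so verify them on your own box):

```php
<?php
// Typical locations, shown as a config fragment; verify before relying on them.
// Plesk keeps site files under /var/www/vhosts/<domain>/, while cPanel
// puts them under /home/<user>/public_html/.
$cVhostsDir = '/home/';            // was '/var/www/vhosts/' for Plesk
$workingDir = '/root/backups/';    // any scratch directory with enough free space
```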

  • Dash
    Posted June 26, 2009 9:19 am

    Hi Eric, how will this script handle the number of backups in the S3 bucket? Do I manually have to delete old backups? Or is that included in the script?

  • Eric
    Posted June 28, 2009 2:08 pm

    @Dash: the very first variable in the script is $nDays, which is “How many days of backups to keep?” It’s set to 5, but you can change that to whatever you’d like. After backing up, the script takes care of cleanup.
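    A sketch of what that age-based cleanup might look like with the same S3.php class (the bucket name and credentials here are placeholders):

```php
<?php
// Sketch of the cleanup step, assuming Donovan Schönknecht's S3.php class.
require_once 'S3.php';

$nDays  = 5;                    // how many days of backups to keep
$bucket = 'my-backup-bucket';   // hypothetical

$s3     = new S3('ACCESS_KEY', 'SECRET_KEY');
$cutoff = time() - $nDays * 86400;

// getBucket() returns an array keyed by object name; each entry carries a
// 'time' (last-modified) timestamp we can compare against the cutoff.
foreach ((array) S3::getBucket($bucket) as $name => $info) {
    if ($info['time'] < $cutoff) {
        S3::deleteObject($bucket, $name);
    }
}
```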

  • Z
    Posted July 29, 2009 1:45 am

    Just wanted to stop by and say thank you very much for sharing this script. It’s very well commented, and worked perfectly. I set it up as a daily cron job. I now have peace of mind. Thanks a lot!


  • RPM
    Posted October 23, 2009 3:32 pm

    Great script!

    I have a question, if you don’t mind: if I want to run this script from the daily cron folder, should I change $workingDir?


  • Eric Nagel
    Posted October 23, 2009 3:51 pm

    @RPM – everything above “Done editing script” should be looked over and probably changed, including $workingDir.
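    One way to schedule it (the script path here is hypothetical) is a crontab entry rather than the cron.daily folder:

```shell
# Hypothetical crontab entry (crontab -e) running the backup nightly at 2:30 am.
# Scripts dropped in /etc/cron.daily/ run with / as their working directory,
# so $workingDir, and every other path in the script, should be absolute.
30 2 * * * /usr/bin/php -q /root/backup.php >> /var/log/backup.log 2>&1
```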

  • Niro
    Posted November 22, 2009 2:25 am

    Eric, thanks for the excellent script.
    For some reason it gets stuck when trying to send Amazon a database file that is 400 MB after compression. The top command shows the CPU at 100% on the httpd process. Smaller files did work.

    I tried removing the https in the s3 initialization but it did not help.

    Anyone know how to solve this?

  • Jordi
    Posted January 5, 2010 10:48 am

    I have the same problem as Niro. Is there some sort of file limit? The file that it’s uploading is around 2GB.

  • Eric Nagel
    Posted January 5, 2010 10:58 am

    Hi Niro, Jordi – I’m not sure why it’s getting stuck. According to the author of the S3 component:

    Known Issues:
    Files larger than 2GB are not supported on 32 bit systems due to PHP’s signed integer problem
    SSL is enabled by default and can cause problems with large files. If you don’t need SSL, disable it with S3::$useSSL = false;
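    With that class, disabling SSL is a one-line change near the top of the script (shown here in isolation as a config fragment):

```php
<?php
require_once 'S3.php';

// $useSSL is a static property on the class, so setting it once applies
// to every transfer the script makes afterwards.
S3::$useSSL = false;
```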

    For the database, you can change the dump to export tables individually rather than the whole database at once, and maybe then you’d be under 2GB. Or, take the .sql file that’s the export from the database and split it up.
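    The splitting idea can be sketched with the standard split utility. This demo uses a tiny stand-in file so it runs anywhere; for a real multi-gigabyte .sql dump you would use something like split -b 1900m so each piece stays under the 2 GB limit:

```shell
# Tiny stand-in for a large .sql dump (10 KB of zeros).
dd if=/dev/zero of=backup.sql bs=1024 count=10 2>/dev/null

# Split into 4 KB pieces: backup.sql.part-aa, -ab, -ac.
split -b 4096 backup.sql backup.sql.part-

# The pieces concatenate back into an identical file.
cat backup.sql.part-* > restored.sql
cmp backup.sql restored.sql && echo "pieces reassemble cleanly"
```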

  • Jordi
    Posted January 5, 2010 2:32 pm

    Thanks for the quick response. Unfortunately it didn’t work for me, though. I tried to turn off SSL, but that didn’t do the trick. It’s probably the 2GB file limit then, because the tar.gz file is just a little over 2GB.

    For me it’s not the database dump that’s causing problems. That’s actually uploading correctly. The problem lies with the complete file dump; I probably have too many images 😀

    I’ll have to try something else then. Maybe I’ll have to split it into different files. Too bad automated backups with an option to upload to S3 aren’t baked into Plesk.

    Thanks again!

  • Jordi
    Posted January 7, 2010 10:41 am

    I wrote a new version which fixed my large-file problems and the timeout issue. Hopefully this will fix your problem too, Niro, or anyone else who stumbles on the same issues 🙂

    My server is backing up again. I feel much safer now hahaha 😀

  • Eric Nagel
    Posted January 7, 2010 11:24 am

    Nice job, Jordi! Thanks for not only improving on the backup script, but giving it away, too!

    @Niro – take a look at what Jordi did, and you should be able to back up your server now.

  • Trackback: Backup Your WordPress Installation Easily | HighEdWebTech
  • Adam J
    Posted April 1, 2010 8:12 am

    Does this back up the plesk settings?
    i.e. dns/ ssl certs/ email accounts etc?

  • Eric Nagel
    Posted April 1, 2010 8:15 am

    Hi Adam – no, it does not. Also, Plesk is not required… this script will back up just about any configuration (I just happen to have & use Plesk).

  • Adam J
    Posted April 1, 2010 8:20 am

    Hi Eric,

    I use Plesk as well, and recreating and re-entering all those settings would take me a long time, as I have many domains and email accounts.

  • Brade
    Posted August 24, 2010 5:46 pm

    Just an FYI: I tried Jordi’s script, and it worked great for DBs but slowed our server way down during the actual web-directory backups. I’ve since found that JungleDisk now has a server-specific version, which I personally have installed on our server, and it works really great:

    $5 a month + $0.15 per GB (the first 10 GB are free), so still a pretty cheap solution. It has functions to automate recovery, etc. as well, and is much less resource-intensive during the backup.

  • Carlos
    Posted February 12, 2011 8:08 pm

    Thank you, friend! It worked perfectly!

    To make a backup from a Linux server, I made these changes:

    First, I changed the original values of $mysql_server, $mysql_username and $mysql_password.

    Then I changed the path (originally for Plesk) to the path on my Linux server:

    $workingDir = '/home/myusername/';

    I downloaded the S3 class file (S3.php), version 0.4.0 (20th Jul 2009).

    I put the file in a new folder, at public_html/backupfiles/s3.php.

    And later:


    And done!

    Obviously I’m a newbie user, but I’m sure this information will be useful for other users.

  • guilliam
    Posted August 24, 2011 5:16 am

    Hello Eric,

    Is there a way I can include cPanel’s email in my backup? Thanks for this newbie-friendly solution you shared.

    – g

    • Eric Nagel
      Posted August 26, 2011 5:08 pm

      Hrm… not really sure. You’d have to either dump the email from cPanel automatically, or find where the messages are stored and tar/gzip them in their native format.

  • Chris Seckler
    Posted July 8, 2012 7:11 am

    I noticed you mentioned your script will work on Plesk servers and WHM/cpanel servers, but will your script work on Media Temple servers as well?

    • Eric Nagel
      Posted July 8, 2012 7:20 am

      Hi Chris,

      This will work with any set-up, as long as you set the variables at the beginning of the script properly. You do NOT have to be running Plesk. I just used the basis of this script for a Webmin set-up.

  • Juan Lopez
    Posted July 31, 2012 2:01 am

    Hi Eric,

    This script will dump all databases and public_html files and compress it all, correct?


  • microno
    Posted November 28, 2014 6:58 am

    Hi, this script is no longer available, could you please share it again, or share it with me? I’m really interested, thank you.
