
Automating Backup Of Forum

Started by 3guk, August 16, 2005, 08:15:38 AM


3guk

Right Guys,

Spent the last 9 hours working on my own website and learning about cron jobs and stuff.

I have two scripts here for you that automatically back up your site / forum / whatever.

MySQL Backup
The first one is to back up a mysql database.

#!/bin/sh
# Date stamp for the archive name, e.g. 08-16-2005-0100
DATESTAMP=`date +%m-%d-%Y-%H%M`
cd /home/yourusername/backups/database/
# Dump the database and compress it into a date-stamped archive
mysqldump --opt -pPASSWORD -uUSERNAME DATABASE | gzip >$DATESTAMP.gz
cd /home/yourusername/
# Append the date stamp to the notification message, then send it
echo $DATESTAMP >> msg.txt
cat msg.txt | /usr/lib/sendmail -t


Save this as daily.sh and put it in your home directory; that is, the directory you land in when you log in via FTP, one level above public_html. Edit the file, replacing yourusername, PASSWORD, USERNAME and DATABASE with your own values. Then create the directory backups, and the directory database inside backups.

From: [email protected]
To: [email protected]
Subject: Daily Backup Routine

Begin your message here. It is very important to leave a blank line before you start the body of your message.
The Backup Routine has been executed


Obviously, save this as msg.txt and put it in the same place, editing the email@here.com addresses to your own values. The message can be sent to any email address; it just confirms that the script has executed.

Chmod daily.sh to 755 and msg.txt to 777.
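If you have shell access, the equivalent commands would be:

chmod 755 daily.sh
chmod 777 msg.txt

Otherwise, most FTP clients and file managers can set the same permissions.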

If you would just like to back up MySQL only and nothing else, you can skip the next step and go straight to the last step.



Files Backup

This script is a little more complex.

On the 1st of the month a permanent full backup is made.

Every Sunday a full backup is made, overwriting last Sunday's backup.

The rest of the time an incremental backup is made. Each incremental backup overwrites last week's incremental backup of the same name.

If NEWER = "", then tar backs up all files in the directories;
otherwise it backs up only files newer than the NEWER date. NEWER gets its date from the file written every Sunday.

#!/bin/sh
# full and incremental backup script
# created 07 February 2000
# Based on a script by Daniel O'Callaghan <[email protected]>
# and modified by Gerhard Mourani <[email protected]>

#Change the 5 variables below to fit your computer/backup

COMPUTER=computername                          # name of this computer or site name
DIRECTORIES="/home/username/public_html"       # directories to backup
BACKUPDIR=/home/username/backups               # where to store the backups
TIMEDIR=/home/username/backups/last-full       # where to store time of full backup
TAR=/bin/tar                                   # name and location of tar

#You should not have to change anything below here

PATH=/usr/local/bin:/usr/bin:/bin
DOW=`date +%a`              # Day of the week e.g. Mon
DOM=`date +%d`              # Date of the Month e.g. 27
DM=`date +%d%b`              # Date and Month e.g. 27Sep

# On the 1st of the month a permanent full backup is made
# Every Sunday a full backup is made - overwriting last Sunday's backup
# The rest of the time an incremental backup is made. Each incremental
# backup overwrites last week's incremental backup of the same name.
#
# if NEWER = "", then tar backs up all files in the directories
# otherwise it backs up files newer than the NEWER date. NEWER
# gets its date from the file written every Sunday.


# Monthly full backup
if [ $DOM = "01" ]; then
        NEWER=""
        $TAR $NEWER -cf $BACKUPDIR/$COMPUTER-$DM.tar $DIRECTORIES
fi

# Weekly full backup
if [ $DOW = "Sun" ]; then
        NEWER=""
        NOW=`date +%d-%b`

        # Update full backup date
        echo $NOW > $TIMEDIR/$COMPUTER-full-date
        $TAR $NEWER -cf $BACKUPDIR/$COMPUTER-$DOW.tar $DIRECTORIES

# Make incremental backup - overwrite last weeks
else

        # Get date of last full backup
        NEWER="--newer `cat $TIMEDIR/$COMPUTER-full-date`"
        $TAR $NEWER -cf $BACKUPDIR/$COMPUTER-$DOW.tar $DIRECTORIES
fi
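To make the NEWER logic concrete, here is roughly what tar ends up running on a Sunday (full) versus a Monday (incremental), using the placeholder paths above:

# Sunday: NEWER is empty, so everything under public_html is archived
/bin/tar -cf /home/username/backups/computername-Sun.tar /home/username/public_html
# Monday: only files changed since the date stored in the last-full file
/bin/tar --newer 16-Aug -cf /home/username/backups/computername-Mon.tar /home/username/public_html

The 16-Aug here is just an example date; the script reads the real one from the COMPUTERNAME-full-date file described below.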


Save this as data.sh in the same place as the other script. Edit the variables at the top of the script; they should be fairly self-explanatory. I have set the computer name to the name of my site.

Chmod data.sh to 755.

Obviously, if you have not done the MySQL one, you will have to create the directories backups and backups/last-full. If you change these paths to something else, please create those directories instead, or the script will not work.

Then, in last-full, you need to create a file (Notepad is fine) with the date on the first line, like so: 16-Aug. This tells the script that the last full backup was on that date, so that it does not get confused. :P Call this file COMPUTERNAME-full-date, replacing COMPUTERNAME with the name you set.
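If you prefer the shell, the same file can be created in one line (using the placeholder paths from the script):

echo "16-Aug" > /home/username/backups/last-full/computername-full-date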

After doing this you need to configure a cron job to execute this script at a given time.

Cron Jobs
MySQL Backup

Cron jobs are easily accessible via cPanel: click Cron Jobs and then click Advanced (Unix Style).

You then need to create a cron job; for the MySQL backup script you would enter:
minute : 00
hour : 01
day : *
month : *
weekday : *
command : /home/username/daily.sh

This tells the server to execute daily.sh at 1 am every day. Obviously you can change this to whatever time you like, but at 1 am few people tend to be browsing, so server load is low. Edit the username in the command to suit your cPanel login name.
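If your host gives you a raw crontab instead of cPanel's form, the equivalent entry is a single line:

0 1 * * * /home/username/daily.sh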

If you want to test it, just set the hour and minute accordingly and click Commit Changes. It should execute at the time you set and leave a nice date-stamped archive for you to download at your leisure. This will leave huge amounts of data on the server, so please keep on top of it; I download all my MySQL dumps every week and empty the directory.

Files Backup


You then need to create a cron job; for the files backup script you would enter:
minute : 00
hour : 02
day : *
month : *
weekday : *
command : /home/username/data.sh

This tells the server to execute data.sh at 2 am every day. Again, you can change this to whatever time you like, but at 2 am hardly anyone is browsing, so server load is low. I have set it to 2 am so that it doesn't clash with the MySQL dump script, again just to reduce the server load at any one time. Edit the username in the command to suit your cPanel login name.
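Again, as a raw crontab entry this would be:

0 2 * * * /home/username/data.sh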

If you want to test it, just set the hour and minute accordingly and click Commit Changes. It should execute at the time you set and leave an incremental backup on the server, named something like computername-Mon.tar.

I recommend editing the line if [ $DOM = "01" ]; then, changing the 01 to the current day of the month (17, in this case). This forces it to do a full backup. Obviously, once this is done, change it back to 01.
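So on the 17th, the edited line would temporarily read:

if [ $DOM = "17" ]; then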

On the first of each month a permanent full backup will be made, meaning that if you choose not to delete them, you will have 12 monthly backup files in the directory within a year.

Every Sunday you will also get a full backup, but the next Sunday it gets wiped and replaced with a new Sunday backup for that week.

The daily incremental backups also get wiped weekly, with each day overwriting the previous week's backup of the same name, so week 1's Monday will be replaced by week 2's Monday.

Finally,

These are not my scripts and I take no credit for creating or finding them; I just felt a tutorial was needed on how to auto-backup your site. These days data just isn't safe.

I have set CuteFTP to auto-download the files, meaning that each day when my computer turns on, CuteFTP downloads the incremental backup. On Sunday it downloads the latest full backup, and on the first of each month it downloads the main monthly backup.

All to folders on my hard drive, meaning that I will always have a very recent backup spare should my server break :P.

Hope this is useful.



3guk

I know there are probably countless other scripts that do this, but I have found this one to be extremely stable and to do exactly what I want.

I just felt that some people with larger boards may find this useful.

Jerry

I think this looks like a nice tutorial, I will try it out soon :)



3guk

Jerry,

If you do, let me know how it goes. I have done it on my site and it works like a charm, but I would be interested to hear how it goes for other people.

Zenigata

Quote from: Jerry on August 21, 2005, 07:57:08 AM
I think this looks like a nice tutorial, I will try it out soon :)

Hi,
any news about this tutorial?

Thanks

dtm.exe

Quote from: Zenigata on September 28, 2005, 03:49:35 PM
Hi,
any news about this tutorial?

Thanks

Why don't you try it and see how it works?  :).

Zenigata

mysqldump --opt -pPASSWORD -uUSERNAME DATABASE | gzip >$DATESTAMP.gz

If my password is test and my username is user, do I have to write:

mysqldump --opt -p test -u user

or

mysqldump --opt -ptest -uuser

Thanks

3guk

mysqldump --opt -ptest -uuser

Don't forget the database name too!!
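So with those example credentials and a hypothetical database named mydb, the full line in daily.sh would read:

mysqldump --opt -ptest -uuser mydb | gzip >$DATESTAMP.gz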

rendi

OK, how do I modify your script so that it downloads the MySQL dump from my site once the other script has already made the backup? I tried to modify it, but it gives me an error. Here is my script:


#! /bin/sh
DATESTAMP=`date +%m-%d-%Y-%H%M`
wget http://mysite.com/$DATESTAMP.gz
echo $DATESTAMP >> msg.txt
cat msg.txt | /usr/lib/sendmail -t



From: [email protected]
To: [email protected]
Subject: Your Database has been downloaded

Wget Database successful

Zenigata

Hi,
I have this script called daily.sh to perform a database backup to my space:

#! /bin/sh
DATESTAMP=`date +%m-%d-%Y-%H%M`
cd /home/mydirectory
mysqldump --opt -pmypassword -umyusername mydatabasename | gzip >$DATESTAMP.gz
cd /home/mydirectory
echo $DATESTAMP >> msg.txt
cat msg.txt | /usr/lib/sendmail -t



It runs via a cronjob.

Now I need to:


  • FTP the backup to a remote server
  • Delete the daily backup from my space and from the remote server once it's a week old

Please help me to complete this script. Thanks.

ameir

Quote from: Zenigata on March 22, 2006, 11:57:22 AM
Hi,
I have this script called daily.sh to perform a database backup to my space: [...]
I created a script similar to 3guk's that dumps multiple or all databases, gzips them, emails them as file attachments, and can FTP them over to a remote server.  You can delete old files from your own space, but I haven't explored how to delete files over FTP yet.  The script is relatively long, so go to http://ameir.net/serendipity/index.php?/archives/14-MySQL-Backup-to-FTP-and-Email-Shell-Script-for-Cron-v2.html to see the code.  Just copy and paste it into a ".sh" file, chmod to 755, and follow 3guk's cron instructions.  For anyone who is also interested in a directory backup script I wrote with similar functionality, visit http://ameir.net/serendipity/index.php?/archives/15-Folder-Backup-to-FTP-and-Email-Shell-Script-for-Cron-v2.html.  I hope you enjoy those scripts, as they've been very handy for me.
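For the local clean-up half of that request, a minimal sketch using find, assuming the dumps live in /home/username/backups/database and end in .gz, could be appended to daily.sh:

# Delete local dump files older than 7 days
find /home/username/backups/database -name '*.gz' -mtime +7 -exec rm -f {} \;

The FTP-side deletion would still need to be scripted separately.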

SleePy

Looks good  ;)
You could also do it via PHP and a cron job:

Code (DBbackup.php):

<?php
$datestamp = date("Y-m-d");      // Current date to append to filename of backup file in format of YYYY-MM-DD

/* CONFIGURE THE FOLLOWING SEVEN VARIABLES TO MATCH YOUR SETUP */
$dbuser = "USERNAME";            // Database username
$dbpwd = "PASSWORD";             // Database password
$dbname = "DATABASE";            // Database name. Use --all-databases if you have more than one
$filename = "backup-$datestamp.sql.gz";   // The name (and optionally path) of the dump file
$to = "[email protected]";      // Email address to send dump file to
$from = "[email protected]";      // Email address message will show as coming from.
$subject = "MySQL backup file - " . $dbname;      // Subject of email

$command = "mysqldump -u $dbuser --password=$dbpwd $dbname | gzip > $filename";
$result = passthru($command);

$attachmentname = array_pop(explode("/", $filename));   // If a path was included, strip it out for the attachment name

$message = "Compressed database backup file $attachmentname attached.";
$mime_boundary = "<<<:" . md5(time());
$data = chunk_split(base64_encode(implode("", file($filename))));

$headers = "From: $from\r\n";
$headers .= "MIME-Version: 1.0\r\n";
$headers .= "Content-type: multipart/mixed;\r\n";
$headers .= " boundary=\"".$mime_boundary."\"\r\n";

$content = "This is a multi-part message in MIME format.\r\n\r\n";
$content.= "--".$mime_boundary."\r\n";
$content.= "Content-Type: text/plain; charset=\"iso-8859-1\"\r\n";
$content.= "Content-Transfer-Encoding: 7bit\r\n\r\n";
$content.= $message."\r\n";
$content.= "--".$mime_boundary."\r\n";
$content.= "Content-Disposition: attachment;\r\n";
$content.= "Content-Type: Application/Octet-Stream; name=\"$attachmentname\"\r\n";
$content.= "Content-Transfer-Encoding: base64\r\n\r\n";
$content.= $data."\r\n";
$content.= "--" . $mime_boundary . "\r\n";

mail($to, $subject, $content, $headers);

unlink($filename);   // delete the backup file from the server
?>


then for your cron job command do this:
php -q /home/path_to_you/DBbackup.php

I would recommend keeping this file above your web root (usually public_html, www, or htdocs).

I set mine up like that, then told my mail program to search for "MySQL backup file - " (what I put in the subject line) and move it to a folder. Everything automated, for lazy me ;D
You can do your files too via the same method.

I got this from a post Danielle made at Lunarpages
http://www.lunarforums.com/forum/index.php?topic=22118.0

you could also add this into the script before it does the backup.

$db = mysql_connect("localhost", $dbuser, $dbpwd);
if (!$db)
    $message .= "Didn't connect to database. Optimize failed";
mysql_select_db($dbname);  // your database name; --all-databases cannot be used here
$alltables = mysql_query("SHOW TABLES");
while ($table = mysql_fetch_assoc($alltables))
{
    foreach ($table as $key => $tablename)
    {
        mysql_query("OPTIMIZE TABLE `" . $tablename . "`")
            or die(mysql_error());
    }
}


and that will optimize your tables before they get backed up.
If I knew how to pass the MySQL optimize command through without logging into mysql I would post it, but I don't; otherwise you could do it in your .sh file with a loop until it's finished.
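One way to run the optimize from a shell script without the PHP connection at all is mysqlcheck, which ships with MySQL; assuming it is installed on the host, a line like this near the top of daily.sh should work (same credential placeholders as above):

# Optimize all tables in the database before dumping it
mysqlcheck -o -uUSERNAME -pPASSWORD DATABASE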

topdog2k

And now for the dumbest question in the thread....

Where do I find the TAR=/bin/tar?

I don't know the name or location of my tar.
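On most Linux hosts a shell prompt will tell you:

which tar

It is usually /bin/tar or /usr/bin/tar.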

Klozi

Nice work, but most hosting services already include a backup job like this. For everyone else, it's a good script. :>

3guk

It was more for people who run their own servers, or for people who don't like using the host-provided script.

I hope it has worked well for people, as those scripts took me a while to get working properly!

mark7144

How do you get CuteFTP to download the backup automatically?

Sarge

Quote from: mark7144 on July 21, 2007, 12:58:08 PM
How do you get CuteFTP to download the backup automatically?

You probably need to automate CuteFTP (client, right?) using command-line parameters and either a batch file or some other program to run CuteFTP at specified intervals. For more information on how this can be done, check out the CuteFTP documentation or support site.


TheDisturbedOne

Bringing back an old topic, can I make this so that it emails me the backup?

Simplemind

Quote from: 3guk on August 16, 2005, 08:15:38 AM
Right Guys,

Spent the last 9 hours working on my own website and learning about cron jobs and stuff. [...]
TY so much, it's working perfectly. Nice!!! :)
