How to Schedule Backups With a Cron Job


Hello, I need to schedule a daily backup with a cron job.


I have created a file backup.sh with this content:


_________________________________________________________________________


#!/bin/sh


wget -O /dev/null -q --http-user=$1 --http-password=$2 [redacted] --post-data="dest=homedir&email=$3&email_radio=1&user=$1&pass=$2"


_________________________________________________________________________


and a cron job with this command:


lynx -dump -useragent=/home/myUserName/backup.sh “myUserName” “myPassword” “myEmailAddress”


I also tried:


lynx -dump -useragent=lynx(/home/myUserName/backup.sh) myUserName myPassword myEmailAddress


I got an email at the correct time, with the following message:


_________________________________________________________________________


/bin/sh: -c: line 0: syntax error near unexpected token `('


/bin/sh: -c: line 0: `lynx -dump -useragent=lynx(/home/myUserName/backup.sh) myUserName myPassword myEmailAddress


_________________________________________________________________________


and another one, from previous tests, with the following message:


_________________________________________________________________________


Warning: User-Agent string does not contain "Lynx" or "L_y_n_x"!


Alert!: User-Agent string does not contain "Lynx" or "L_y_n_x"


Can't Access `file://localhost/home/myUserName/myEmailAddress


Alert!: Unable to access document.


lynx: Can't access startfile


_________________________________________________________________________


What is the correct syntax to make it work?


Thank you!

Duplicates 1
Looking for help concerning backup of my MySQL databases.
I am looking for help concerning my MySQL databases.

I am very concerned about hacking and want to back up my databases several times a day, because they will most probably change many times in a day.

It seems that a cron job is the logical answer.

I cannot find a cron job to do this, AND it must not overwrite the previous backup, as any corruption would be written into the new one.
(I think that having the cron email the databases to me would prevent overwriting.)

I am open to any suggestions... including paying for the cron job to be written.

Can anyone help and/or advise me?

Thank you
Jim MacLeod
anonymous
Hello Jim,

There is no hard limit as long as you are running them at different times and not all 18 at the same time.

Best Regards,
TJ Edens
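
As a rough illustration of running several jobs at different times rather than all at once, staggered crontab entries might look something like the following; the script name, paths, and schedule here are placeholders, not taken from this thread:

_________________________________________________________________________

# Hypothetical example: three database backups staggered 20 minutes apart
# so they never run at the same moment.
# minute hour day month weekday  command
0  2 * * * /home/myUserName/backup_db.sh db_one
20 2 * * * /home/myUserName/backup_db.sh db_two
40 2 * * * /home/myUserName/backup_db.sh db_three

_________________________________________________________________________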
Scott
Hello Jim,

Thank you for your question on MySQL cron job backups. We do not have any ready-made cron job scripts, but you can certainly use a cron job to back up your databases. I would not recommend more than once a day, however, as it may cause resource usage issues when combined with the CPU usage of normal site behavior.

Below are links to two articles that may assist you in creating your scheduled cron:

How to set up a cron
How to schedule backups with a cron

Kindest Regards,
Scott M
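
As a starting point, here is a minimal sketch of a dump-to-timestamped-file approach, assuming mysqldump is available; the database name, credentials, and backup directory are placeholders and would need to be adjusted to the account. Because each run writes a new file, it does not overwrite the previous backup:

_________________________________________________________________________

#!/bin/sh
# Sketch only: dump one MySQL database to a timestamped file so each
# run creates a new backup instead of overwriting the previous one.
# DB_USER, DB_PASS, DB_NAME and BACKUP_DIR are placeholders.
DB_USER="myUserName"
DB_PASS="myPassword"
DB_NAME="my_database"
BACKUP_DIR="/home/myUserName/backups"

STAMP=`date +%Y%m%d_%H%M%S`
mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$BACKUP_DIR/${DB_NAME}_$STAMP.sql"

_________________________________________________________________________

A crontab entry such as 0 3 * * * sh /home/myUserName/backup_db.sh would then run it once a day at 3:00 AM, in line with the once-a-day recommendation above.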
BradM
Hi AndreasKoug,

"What is the correct syntax to make it work?"

I don't have a script readily available for you to back up your account via a cron job, but I'd be happy to help troubleshoot some of the error messages that you are receiving.

"line 0: syntax error near unexpected token `('"

Sometimes code doesn't paste correctly when asking a question here in Community Support. Can you post a copy of your script to Pastebin so we can test a little further? Be sure to post a comment at the bottom of this page with a link to your code.

"Warning: User-Agent string does not contain "Lynx" or "L_y_n_x"!"

Have you tried setting your user agent to Lynx instead of lynx(/home/myUserName/backup.sh)?

"I have created file backup.sh with this content:"

Before attempting to set up the cron, have you first verified that your backup.sh file is working correctly? If you are on a VPS or Dedicated server, you can SSH into your account and run your script. Once you know that is working correctly, then you can focus on the syntax of the cron.

Again, feel free to post a comment at the bottom of this page with more details, and we'll be happy to assist further.

Thanks,
- Brad
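
Building on the suggestions above, the simplest cron syntax is usually to have cron run the script directly with the shell, rather than passing its path to lynx through -useragent (that option only sets the User-Agent string; it does not execute a script). A minimal sketch, using the paths and arguments from the question:

_________________________________________________________________________

# Run the script once by hand first (for example over SSH) to confirm it works:
sh /home/myUserName/backup.sh myUserName myPassword myEmailAddress

# Then use that same command as the cron job itself, e.g. daily at 2:30 AM:
30 2 * * * sh /home/myUserName/backup.sh myUserName myPassword myEmailAddress

_________________________________________________________________________

If lynx really is needed, its -useragent option expects a plain string such as -useragent="Lynx", which would also clear the "User-Agent string does not contain Lynx" warning shown earlier.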