Shell script to copy data from remote server to Google Cloud Storage using Cron - gsutil

I want to sync my server data to Google Cloud Storage automatically using a shell script, but I don't know how to write the script. Every time I need to run:
gsutil -m rsync -d -r [Source] gs://[Bucket-name]
If anyone knows the answer, please help me!

To automate the sync process, use a cron job:
Create a script to run with cron: $ nano backup.sh
Paste your gsutil command into the script (no $ prompt inside the file): gsutil -m rsync -d -r [Source_PATH] gs://bucket-name
Make the script executable: $ chmod +x backup.sh
Based on your use case, put the shell script (backup.sh) in one of the following folders: a) /etc/cron.daily b) /etc/cron.hourly c) /etc/cron.monthly d) /etc/cron.weekly
If you want to run this script at a specific time instead, go to the terminal and type: $ crontab -e
Then simply call the script from cron as often as you want; for example, to run it at midnight: 00 00 * * * /path/to/your/backup.sh
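Putting the pieces together, a minimal backup.sh could look like this (just a sketch; [Source_PATH] and [Bucket-name] are the placeholders from the question):
#!/bin/bash
# -m runs the transfer in parallel, -d deletes remote files that no longer
# exist locally, -r recurses into subdirectories
gsutil -m rsync -d -r [Source_PATH] gs://[Bucket-name]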
In case you are using Windows on your local server, the commands are the same as above, but make sure to use Windows paths instead.
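For example, on Windows the same sync might look like this (the path below is illustrative), and the scheduling would be done with Task Scheduler rather than cron:
gsutil -m rsync -d -r "C:\ServerData" gs://[Bucket-name]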

Related

Run interactive local script on remote machine using docker-machine ssh

I have a local interactive (Ruby) script, script.rb. I have a docker-machine, aws01. (The script pulls large files from point A, does some simple processing, and uploads them to S3.)
Unfortunately, this incantation doesn't seem to do it:
docker-machine ssh aws01 -t ruby < script.rb
It runs the script, but not interactively :/
Any ideas how to do this in a single command?
(You could copy the script over and run it, you could grab the docker-machine's info and plug it into SSH with the -t flag... but I don't know how to do that in a single command)
You are putting the script itself on the standard input of the remote command (the < redirection), so there is no other channel left for you to interact with the script.
In short, it is not possible with a single command. I would go with two:
docker-machine ssh aws02 "cat > script.rb" < script.rb
docker-machine ssh aws02 -t "ruby script.rb"
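If your docker-machine version includes the scp subcommand, the copy step can also be written that way (a sketch; the remote user's home directory is assumed as the destination):
docker-machine scp script.rb aws01:script.rb
docker-machine ssh aws01 -t "ruby script.rb"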

Create a backup of a database every day using Cron [PuTTY]

I have this code, which creates a backup of my database:
pg_dump -U dbadmin -h 127.0.0.1 123telcom -f dbbackup
Now I want to create a backup every night.
Is there a way to execute this code with crontab?
0 3 * * * pg_dump -U dbadmin -h 127.0.0.1 123telcom -f dbbackup
I'm new to PuTTY, so if anyone could help me a little, that would be great.
I suspect that you have fallen foul of cron's PATH setup.
If you look in /etc/crontab, it will define a PATH for itself and you will probably have a different PATH set up for your login.
Create your script with the first 2 lines:
#!/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
where the PATH includes whatever is set up in your environment. Ensure that the script is executable.
To test what is going on try this script:
#!/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
echo $PATH >> /home/yourhome/cron.txt
Create an entry in /etc/crontab:
* * * * * root /home/yourhome/yourshell.sh
Tell cron about the changes by using sudo crontab -e, then just save and exit (often Ctrl+O and Ctrl+X if using the nano editor); alternatively, I think you can just kill the cron process and it will re-spawn.
Then check the cron.txt file to see what it is using for PATH.
PS: Don't forget to remove this test entry from the crontab afterwards.
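Applied to the original question, the backup script could look like this (a sketch; the dated file name and home-directory path are my additions, and password-less access, e.g. via a .pgpass file, is assumed):
#!/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
# dump the 123telcom database to a dated file so nightly runs don't overwrite each other
pg_dump -U dbadmin -h 127.0.0.1 123telcom -f /home/yourhome/dbbackup_$(date +%Y%m%d)
Scheduled with the entry from the question (0 3 * * *), this runs every night at 3 a.m.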

Unable to run a PostgreSQL script from bash

I am learning the shell language. I have created a shell script whose function is to log into the DB and run a .sql file. Following are the contents of the script:
#!/bin/bash
set -x
echo "Login to postgres user for autoqa_rpt_production"
$DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT
echo "Running SQL Dump - auto_qa_db_sync"
\i auto_qa_db_sync.sql
After running the above script, I get the following error
./autoqa_script.sh: 39: ./autoqa_script.sh: /i: not found
Following one article, I tried reversing the slash, but it didn't work.
I don't understand why this is happening, because when I run the SQL file manually, it works properly. Can anyone help?
Assuming $DB_PATH points at psql, use its -f option to run the SQL file as part of the same invocation:
#!/bin/bash
set -x
echo "Login to postgres user for autoqa_rpt_production and run script"
$DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT -f auto_qa_db_sync.sql
The lines you put in a shell script are (more or less; let's say so for now) equivalent to what you would type at the Bash prompt (the one ending with '$', or '#' if you're root). When you execute a script (a list of commands), each command runs after the previous one terminates.
What you wanted to do is run the client and issue the "\i auto_qa_db_sync.sql" command inside it.
What you did was run the client and then, after the client terminated, issue that command in Bash.
You should read about Bash pipelines; they are the way to run programs and feed text into them. Following your original idea for solving the problem, you'd write something like:
echo '\i auto_qa_db_sync.sql' | $DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT
Hope that helps you understand.
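An equivalent that avoids quoting pitfalls with echo is a here-document (standard Bash; the variables are the ones from the question):
$DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT <<'EOF'
\i auto_qa_db_sync.sql
EOF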

Chaining terminal script on mac os x

I am trying to chain some terminal commands together so that I can wget a file, unzip it, and then sync it directly to Amazon S3. I have the s3cmd tool installed properly and working, and this works for me:
mkdir extract; wget http://wordpress.org/latest.tar.gz; mv latest.tar.gz extract/; cd extract; tar -xvf latest.tar.gz; cd ..; s3cmd -P sync extract s3://suys.media/
How do I then go about creating a simple script in which I can just use variables?
You will probably want to look at bash scripting.
This guide can help you a lot: http://bash.cyberciti.biz/guide/Main_Page
For your question:
Create a file called mysync:
#!/bin/bash
mkdir extract && cd extract
wget "$1"
# extract each downloaded archive (note: don't use PATH as a variable name,
# it would clobber the shell's command search path)
for f in *.tar.gz
do
  tar -xvf "$f"
done
cd ..
s3cmd -P sync extract "$2"
$1 and $2 are the parameters you pass to your script. You can look here for more information about how to use command-line parameters: http://bash.cyberciti.biz/guide/How_to_use_positional_parameters
PS: the #!/bin/bash line is a necessity; you need to tell your script where bash is stored. It's /bin/bash on most Unix systems, but if you're not sure whether it is the same on Mac OS X, you can find out by calling the which command in the terminal:
→ which bash
/bin/bash
You need to give your script execute permission to run it:
chmod +x mysync
Then you can call it from the command line:
mysync url_to_download s3_address
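For example, with the URL and bucket from the original question (the ./ is needed if the script's directory is not on your PATH):
./mysync http://wordpress.org/latest.tar.gz s3://suys.media/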
PS2: I haven't tested the code above, but that's the idea. Hope this helps.

rsync: polling for new files

I've got:
$ rsync -azv zope#myserver:/smb/Data/*/*/* ~/rsynced_samples/
And I want it to run forever, syncing any new file as soon as it appears on myserver
(specifying a poll interval, such as 4 seconds, would be an OK compromise).
Instead of rsync you can use inotifywait, which uses kernel-level file-change triggers.
This script (inotify.sh) can give you an idea:
#!/bin/bash
directory=$1
inotifywait -q -m --format '%f' -e modify -e move -e create -e delete ${directory} | while read line
do
echo "doing something with: $line";
# for example:
# cp $line to <somewhere>
done
You can invoke this script, specifying the directory to monitor, in this way:
./inotify.sh ~/Desktop/
The $line variable contains the full file path.
If you want to limit it to newly created files only, use just the -e create flag.
Use cron to set up a check based on your time interval (say, every minute, perhaps?). This link should help: http://www.cyberciti.biz/faq/how-do-i-add-jobs-to-cron-under-linux-or-unix-oses/
Note that the crontab is set up on your machine's side, not in your bash script.
Also useful: http://benr75.com/pages/using_crontab_mac_os_x_unix_linux
And here is a code example:
1) crontab -e // this opens your current crontab, or creates one if it does not exist
2) enter: * * * * * file.sh >> log.txt // this appends the output of your script to a log file and runs it every minute
Hope that helps.
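If you'd rather stick with plain rsync and the 4-second poll interval mentioned in the question, a minimal sketch is an endless loop (password-less SSH access to myserver is assumed):
#!/bin/bash
# re-run the rsync from the question every 4 seconds, forever
while true
do
    rsync -azv zope@myserver:/smb/Data/*/*/* ~/rsynced_samples/
    sleep 4
done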