FTP Script - Put images in year/date folders - ssh

I have the following script, which I am running on CentOS; it should connect to an external FTP server and upload to a directory.
The directory depends on the date, so the script needs to dynamically fetch the year and week number.
#!/bin/sh
USER3='USERNAME'
PASSWD3='PASSWORD'
YEAR= date "+%G"
WEEK= date "+%V"
ftp -n -i HOST.com <<SCRIPT3
user $USER3 $PASSWD3
binary
cd htdocs/uploads/$YEAR/$WEEK/
bin
mput *.jpg
quit
SCRIPT3
If I run the script I get this as a response:
# bash test.sh
2014
28
So it looks like it is printing the year and week number but not substituting them into the folder path in the script.
How do I get the year/week values to expand into the directory path?

Like this:
#!/bin/sh
USER3='USERNAME'
PASSWD3='PASSWORD'
YEAR=$(date "+%G")
WEEK=$(date "+%V")
ftp ...
Note that you may also need to create the directory before changing into it...
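For completeness, a minimal sketch of the corrected script with the week directory created inside the same FTP session (host, credentials, and paths are the placeholders from the question; the mkdir commands simply report an error and are skipped if the directories already exist):
#!/bin/sh
USER3='USERNAME'
PASSWD3='PASSWORD'
YEAR=$(date "+%G")   # ISO week-numbering year
WEEK=$(date "+%V")   # ISO week number
ftp -n -i HOST.com <<SCRIPT3
user $USER3 $PASSWD3
binary
mkdir htdocs/uploads/$YEAR
mkdir htdocs/uploads/$YEAR/$WEEK
cd htdocs/uploads/$YEAR/$WEEK/
mput *.jpg
quit
SCRIPT3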

Related

No such file or directory from sh script

Looking for the origin of this error message:
Processing: +([^_]).flv
date: +([^_]).flv: No such file or directory
I started getting this at some point in the last few months (can't say when as I wasn't logging my cron output. I know, I know!).
When I originally wrote this, it worked ok for at least two months. I'm wondering if there was an sh update that broke it?
The script runs via crontab and processes every .flv file in the current directory whose name contains no underscore. For each file modified in the last 24 hours it runs the yamdi metadata injector.
It seems to me like it's not recognizing the pattern as a pattern and is instead looking for it as a literal file name. If I run this script from an SSH shell it works fine; it's only when running via cron that it gives this error.
shopt -s extglob
now=$(date +"%s")
for f in +([^_]).flv; do
    echo "Processing: $f"
    age=$(date -r "$f" +"%s")            # file's modification time (epoch seconds)
    calc=$(( (now - age) / 60 / 60 ))    # age in hours
    if (( calc < 24 )); then
        echo "$f age=$calc"
        yamdi -i "$f" -o "$f".seek
        rm "$f"
        cp "$f".seek "$f"
        touch -d @$age "$f"              # restore the original modification time
    fi
done
This is most likely a problem of the wrong shell being used; make sure your script's first line names the right shell:
#!/bin/bash
for bash, or whatever shell you wrote this for. As posted, the script has no shebang at all, so under cron it gets interpreted by plain /bin/sh, which doesn't understand shopt or the extglob pattern +([^_]).flv; the unexpanded pattern is then passed to date as a literal file name. When you launch it from an interactive bash session, bash interprets it instead, which is why it works there. You should also check the environment variables cron sets (a very common problem: one assumes everything is set up as in an interactive shell, but the environment cron offers to the scripts it executes is much more minimal).
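For example, a hedged sketch of a crontab entry that forces bash explicitly (the paths and schedule are hypothetical):
0 3 * * * cd /path/to/videos && /bin/bash /home/user/bin/process_flv.sh >> /home/user/process_flv.log 2>&1
Alternatively, leave the crontab entry alone and just make sure the script itself starts with #!/bin/bash, so it is never interpreted by cron's default /bin/sh.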

scp files in a certain order using ls

Whenever I try to SCP files (in bash), they end up in a seemingly random(?) order.
I've found a simple but not-very-elegant way of keeping a desired order, described below. Is there a clever way of doing it?
Edit: deleted my early solution from here, cleaned, adapted using other suggestions, and added as an answer below.
To send files from a local machine (e.g. your laptop) to a remote (e.g. your calculation server), you can use Merlin2011's clever solution:
Go into the folder in your local machine where you want to copy files from.
Execute the scp command, assuming you have an access key for the remote server:
scp -r $(ls -rt) user@foo.bar:/where/you/want/them/.
Note: if you don't have key-based access set up, it may be better to pack the files into a single archive with tar, i.e. tar -zcvf files.tar.gz $(ls -rt), and then send that one file on its own using scp.
But to do it the other way around, you might not be able to run scp on the remote server and push files to, say, your laptop. Instead you have to pull the files from your laptop. My brute-force solution is:
In the remote server, cd into the folder you want to copy files from.
Create a list of the files in the order you want. For example, for reverse order of creation (most recent copied last):
ls -rt > ../filenames.txt
Now you need to prefix each file name with its path. Before you go up to the directory where the list was written, print the path using pwd. Then go up: cd ..
Prepend that path to each file name in the list. There are many ways to do this; here's one using awk:
cat filenames.txt | awk '{print "path/to/files/" $0}' > delete_me.txt
You need the filenames to be in the same line, separated by a space, so change newlines to spaces:
tr '\n' ' ' < delete_me.txt > filenames.txt
Get filenames.txt onto your local machine, and put it in the folder you want to copy the files into.
The scp run would be:
scp -r user@foo.bar:"$(cat filenames.txt)" .
Again, this assumes you have key-based access; otherwise it's much simpler to tar the files on the remote machine and bring over the single archive.
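For what it's worth, those steps can be collapsed into two commands run from the local machine; a minimal sketch, assuming the same placeholder host and path and that the remote file names contain no spaces:
ssh user@foo.bar 'cd /path/to/files && ls -rt | sed "s|^|/path/to/files/|" | tr "\n" " "' > filenames.txt
scp user@foo.bar:"$(cat filenames.txt)" .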
One can achieve file transfer with alphabetical order using rsync:
rsync -P -e ssh -r user@remote_host:/some_path local_path
-P keeps partially transferred files and shows progress, -e ssh tells rsync to use SSH as the remote shell, and -r recurses into directories.
You can do it in one line, without an intermediate file, using xargs:
ls -r <directory> | xargs -I {} scp <directory>/{} user@foo.bar:folder/
Of course, this will prompt for your password once per file if you do not have public key authentication.
You can also use cd and still skip the intermediate file.
cd <directory>
scp $(ls -r) user@foo.bar:folder/
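If you don't have public key authentication set up yet (the reason for the repeated password prompts mentioned above), a minimal sketch with a placeholder host, assuming a reasonably recent OpenSSH:
ssh-keygen -t ed25519        # generate a key pair if you don't already have one
ssh-copy-id user@foo.bar     # install your public key on the remote host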

List all files ssh; limit to filetype

Using SSH, how can I list all files in current directory (ls) but limit it to a certain filetype? (.pdf in this case)
so basically
ssh user@mydomain.com
cd public_html
ls (only pdfs)
Wildcard patterns are expanded by the shell before ls ever runs; try something like ls *.pdf to list all PDFs in the current directory.
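If you don't need an interactive session at all, you can run the listing through ssh in one go; a minimal sketch using the host and directory from the question (the single quotes make the glob expand on the remote side):
ssh user@mydomain.com 'ls public_html/*.pdf'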

Putty ssh commands zip all the files within this folder then download

OK, so I cd into my folder:
ls
cgi-bin wp-comments-post.php wp-mail.php
googlec3erferfer228fc075b.html wp-commentsrss2.php wp-pass.php
index.php wp-config-sample.php wp-rdf.php
license.txt wp-config.php wp-register.php
php.ini wp-content wp-rss.php
readme.html wp-cron.php wp-rss2.php
wp-activate.php wp-feed.php wp-settings.php
wp-admin wp-includes wp-signup.php
wp-app.php wp-links-opml.php wp-trackback.php
wp-atom.php wp-load.php xmlrpc.php
wp-blog-header.php wp-login.php
(uiserver):u45567318:~/wsb454434801 >
What I want to do is zip all the files within this folder and then download the archive to my computer. I am really new to SSH and this is a client's website, but I really want to start using the command line for speed. I have been looking at this reference http://ss64.com/bash/ to find the right commands, but would really appreciate some help from somebody, please.
Thanks
cd path/to/folder   (the parent directory of foldername)
zip -r foldername.zip foldername
The -r flag recurses into subdirectories, so the whole folder is included. Please try this; it should solve your problem.
If you are inside the directory itself:
zip -r zipfilename.zip *
Or go to the parent of the folder using cd and run:
zip -r foldername.zip foldername
Example: zip -r test-bkupname.zip test
Here test is the folder name.
tar zcvf ../my_directory.tar.gz .
will create the my_directory.tar.gz file one directory up.
scp ../my_directory.tar.gz username@your-ip:/path/to/place/file
will transfer the file to your computer (assuming your computer accepts SSH connections).
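If your own computer isn't reachable over SSH (the usual case for a machine behind a home router), run scp from your computer and pull the archive instead; a minimal sketch with placeholder names:
scp username@your-server:/path/to/my_directory.tar.gz .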
Looks like this is the webroot directory.
Why not zip the directory above (httpdocs / html / whatever) and then move this into the website space, and download from there?
i.e. go into the directory above the web root. For example, if your web root is /var/www/html/ go into /var/www/ and run the following commands:
zip -r allwebfiles.zip html
mv allwebfiles.zip html/allwebfiles.zip
Then in your web browser go to http://mydomain.com/allwebfiles.zip and just download that file.
When extracting, you'd just need to either extract into /var/www/ OR extract into webroot and move all files up one level.
mget isn't an SSH shell command; it comes from FTP-style clients such as ftp, lftp, or PuTTY's psftp. From one of those sessions you can download multiple files at once with:
mget ./*
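A minimal sketch of that from the Windows side using PuTTY's psftp client (host name and directory are placeholders based on the question):
psftp username@your-host
psftp> cd wsb454434801
psftp> mget *.zip
psftp> quit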

How do I call a local shell script from a web server?

I am running Ubuntu 11 and I would like to set up a simple web server that responds to an HTTP request by calling a local script with the GET or POST parameters. This script (already written) does some stuff and creates a file. This file should be made available at a URL, and the webserver should then make an http request to another server telling it to download the created file.
How would I go about setting this up? I'm not a total beginner with linux, but I wouldn't say I know it well either.
What web server should I use? How do I give the script permission to access local resources and create the file in question? I'm not too concerned with security or anything; this is for a personal experiment (I have control over all the computers involved). I've used Apache before, but I've never set it up.
Any help would be appreciated.
This tutorial looks good, but it's a bit brief.
I have Apache installed already. If you don't: sudo apt-get install apache2.
cd /usr/lib/cgi-bin
# Make a file and let everyone execute it
sudo touch test.sh && sudo chmod a+x test.sh
Then put some code in the file. For example:
#!/bin/bash
# get today's date
OUTPUT="$(date)"
# You must print the following two lines (the Content-type
# header and a blank line) before sending any other output
# to the browser from a shell script
echo "Content-type: text/html"
echo ""
echo "<html><head><title>Demo</title></head><body>"
echo "Today is $OUTPUT <br>"
echo "Current directory is $(pwd) <br>"
echo "Shell Script name is $0"
echo "</body></html>"
And finally open your browser and type http://localhost/cgi-bin/test.sh
If all goes well (as it did for me) you should see...
Today is Sun Dec 4 ...
Current directory is /usr/lib/cgi-bin
Shell Script name is /usr/lib/cgi-bin/test.sh
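Since the original question was about passing GET or POST parameters: with CGI, GET parameters arrive in the QUERY_STRING environment variable, so a hedged addition to the script above could be as simple as:
echo "Query string: $QUERY_STRING <br>"
Depending on your Apache version you may also need to enable CGI execution first; on newer Debian/Ubuntu releases that is roughly:
sudo a2enmod cgi
sudo service apache2 restart
On the Apache 2.2 that shipped with Ubuntu 11, /usr/lib/cgi-bin is typically already configured as a CGI directory, so the script should work out of the box.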