Download using gsutil

I was using gsutil to download a trace file from Google Cloud Storage.
The command I used was:
gsutil/gsutil cp gs://clusterdata-2011-1/task_usage/part-00499-of-00500.csv.gz ./
But I got an error:
GSResponseError: Status=404, code=NoSuchKey, reason=Not Found.
However, when I used the ls command in gsutil, the file existed.
Any suggestions are appreciated.

It finally works. The cause may have been the gsutil version, or the server may have been temporarily unavailable at the time.
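For anyone hitting the same 404, a minimal sanity-check sequence (assuming a standalone gsutil install) is to confirm the object key, check your gsutil version, and retry:
# Confirm the object really exists under this exact key
gsutil ls gs://clusterdata-2011-1/task_usage/part-00499-of-00500.csv.gz
# Check which gsutil release you are running
gsutil version
# Retry the download
gsutil cp gs://clusterdata-2011-1/task_usage/part-00499-of-00500.csv.gz ./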

Related

SSH opening file error - no idea why

Running Debian Linux - newest version.
cp /included/filename /usr/bin/
It gives me the error "cannot stat '/included/filename': No such file or directory".
I don't get why there should be an error. I am doing it as the superuser.
From your latest comment I conclude you got the paths mixed up. If you want to copy the file install.sh located under /usr/bin/included/, you would need to do
cp /usr/bin/included/install.sh /usr/bin/
To make something similar to your provided command work (I'd assume you are in /usr/bin), the first argument needs to be a relative path:
cd /usr/bin
cp ./included/install.sh /usr/bin/
Please provide more information on what you are trying to do, and provide real-world example code.
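As a quick sketch using the install.sh example from above, you can verify that the source path exists before copying:
# Fails with a clear message if the source path is wrong
ls -l /usr/bin/included/install.sh && cp /usr/bin/included/install.sh /usr/bin/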

gsutil ls returns error: "contains wildcard"

For some reason, we've got a folder which causes gsutil ls to error:
$ gsutil ls -lR gs://mybucket/proj103
...
...
...
gs://mybucket/proj103/delivery/161025_To_Viewport/app_icon/:
39219977 2016-11-17T10:44:08Z gs://mybucket/proj103/delivery/161025_To_Viewport/app_icon/App Ikon.psd
CommandException: Cloud folder gs://mybucket/proj103/delivery/161025_To_Viewport/app_icon/Client - VR [Squared]/ contains a wildcard; gsutil does not currently support objects with wildcards in their name.
When I look in the network share (from my Windows machine) from which the files originate (we upload them to the bucket nightly via gsutil rsync), I see this:
Directory: \\10.1.1.100\prod\proj103\delivery\161025_To_Viewport\app_icon
Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 10/25/2016 6:18 PM Client - VR [Squared]
-a---- 10/25/2016 5:29 PM 39219977 App Ikon.psd
Are those brackets causing some kind of issue?
I'm on gsutil version 4.22.
In addition to the answer by @mhouglum (thanks!), I'd like to add that there's a workaround:
gsutil ls -lR gs://mybucket/proj103/**
This workaround was also suggested by @mhouglum, here.
The short answer is: yes, unfortunately the brackets are what's causing the issue here.
This is a current limitation in gsutil, and it's being tracked in a GitHub issue (#290). I've added a reference to your Stack Overflow post there.

s3cmd get recursive equivalent command for s3Express (Windows)?

I need the equivalent of "s3cmd get --recursive" (on Linux or Mac) for s3Express (Windows).
Thanks in advance
The AWS Command-Line Interface (CLI) is available for Windows, Mac and Linux. It includes a recursive copy command and also a useful sync command that will only copy new or modified files.
See: AWS CLI S3 documentation
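As a quick sketch (bucket name and local path are placeholders):
# Recursive download of everything under a prefix
aws s3 cp s3://my-bucket/my-prefix/ ./local-dir/ --recursive
# Copy only new or modified files on subsequent runs
aws s3 sync s3://my-bucket/my-prefix/ ./local-dir/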
You can find more details from this pdf file: http://www.s3express.com/docs/s3express_backup.pdf
To summarize, the command syntax you are looking for is:
get "" -s -onlydiff
The -s is the equivalent of s3cmd --recursive.
The -onlydiff is the equivalent of s3cmd sync.
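As a purely hypothetical usage example (the bucket and folder names are placeholders; check the PDF above for the exact syntax):
get "mybucket/myfolder/*" -s -onlydiff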

Hive script not running in crontab with "hadoop must be in the path" error

After setting the Hadoop home and prefix paths in .bashrc and /etc/profile, I'm still getting the same error: Cannot find hadoop installation: $HADOOP_HOME or $HADOOP_PREFIX must be set or hadoop must be in the path
The error only occurs when I run the script from crontab; from the hive> prompt it works fine.
Please help me figure out how to solve this.
Set $HADOOP_HOME in $HIVE_HOME/conf/hive-env.sh
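For example (the install path is an assumption; adjust to your machine), add a line like this to $HIVE_HOME/conf/hive-env.sh:
# Point Hive at the Hadoop installation directory
export HADOOP_HOME=/usr/local/hadoop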
Try loading the user's bash profile in the script, as below:
. ~/.bash_profile
The bash profile contains user-specific configuration (such as PATH entries) that cron does not load by default.
See the similar question: Hbase commands not working in script executed via crontab
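Putting the two suggestions together, a cron-friendly wrapper script might look like this sketch (all paths are assumptions; adjust to your installation):
#!/bin/bash
# Load user-specific environment that cron does not provide
. ~/.bash_profile
# Make sure Hive can locate the Hadoop installation
export HADOOP_HOME=/usr/local/hadoop
export PATH="$HADOOP_HOME/bin:$PATH"
# Run the Hive script (filename is a placeholder)
hive -f /home/user/myscript.hql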

Setup Amazon S3 backup on QNAP using s3cmd

I own a QNAP-219P and I want to set this up manually using s3cmd.
I did quite a bit of research on this, and here are the references I got:
http://web.archive.org/web/20091120211330/http://codemonkeybrown.com/qnaps3.html
http://wiki.qnap.com/wiki/Running_Your_Own_Application_at_Startup
http://wiki.qnap.com/wiki/Add_items_to_crontab
http://blog.wingateuk.com/2013/03/cloud-backup-on-qnap-nas.html?showComment=1413660445187#c8935766892046800936
I'm trying to get the s3cmd to work on my TS-219P.
I got everything to work (on command line), even running the script file (s3-backup.sh) on command line:
#!/bin/bash <-- I also tried #!/bin/sh
/share/maintenance/s3cmd-1.5.0-rc1/s3cmd --rr sync -rv /share/all-shared-folders/emilie/ s3://kingjim-backup/kingjim-nas/emilie/ >> /share/maintenance/log/s3cmd/backup_`date "+%Y%m%d-%H-%M"`.log <-- I also tried running s3cmd via python by adding /usr/bin/python on the front.
If I run using the SSH command prompt, it seems to work perfectly.
The problem, though, is the cronjob. I can confirm the cronjob triggered and ran, because my log file (the one above) was generated, but the log is always empty, even though I'm sure some new files were created or modified.
This is my cronjob task:
14 3 * * * /share/maintenance/s3-backup.sh 2>&1 | logger
I've done a number of different variations on the above, but couldn't find out what was missing.
I feel like some dependency is missing when the crontab is running, as compared to when I run it on command prompt. But I don't know how to debug crontab.
I found out that the problem was that the s3cmd configuration file was not found when s3cmd ran from cron.
So the fix was simply to copy the s3cmd configuration file (by default ~/.s3cfg) to a safe shared folder, and then call s3cmd with the "--config" parameter followed by that file's path.
Like this:
/share/maintenance/s3-backup/s3cmd/s3cmd --config /share/maintenance/s3-backup/s3cmd.config --rr sync -rv /share/MD0_DATA/ s3://xxx-backup/xxx-nas/ >> /share/maintenance/s3-backup/logs/backup_`date "+%Y%m%d-%H-%M"`.log 2>&1
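Worth noting: s3cmd looks for its configuration in $HOME/.s3cfg, and cron jobs often run with a minimal environment. An alternative sketch (the home directory is an assumption) is to set HOME in the crontab entry instead of passing --config:
# Ensure s3cmd can find ~/.s3cfg when run from cron
14 3 * * * HOME=/root /share/maintenance/s3-backup.sh 2>&1 | logger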