Can't download a file with scp - no error, but the file doesn't download

I'm trying to download a file using scp.
I run
scp user@ip:/home/user/file.gzip $HOME/Desktop
I get
file.gzip 100% 156MB 155.8MB/s 00:01
but the file is not found.
I tried on Ubuntu and on Windows and the result is the same. I also tried another destination folder, but the file is still not found.
Please I appreciate any help

With that syntax, if $HOME/Desktop does not already exist as a directory, scp will save the file under the name 'Desktop'.
Instead, save it inside your home folder by specifying a filename explicitly:
scp user@ip:/home/user/file.gzip $HOME/file.gzip
Or, to preserve the original filename and save it inside your 'Desktop' folder, append a forward slash and period:
scp user@ip:/home/user/file.gzip $HOME/Desktop/.
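The destination rule scp applies here is the same one cp follows, so it can be sketched locally (hypothetical paths, used only for illustration): if the destination is an existing directory the file keeps its own name inside it, otherwise the destination is taken as the new file name.

```shell
work=$(mktemp -d)
mkdir "$work/Desktop"
echo data > "$work/file.gzip"
cp "$work/file.gzip" "$work/Desktop"     # Desktop exists as a directory: lands as Desktop/file.gzip
cp "$work/file.gzip" "$work/copy.gzip"   # no directory of that name: saved under the new name
```

If $HOME/Desktop had not existed, the first copy would have produced a plain file named Desktop, which is exactly the silent "missing file" the question describes.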

Related

Copy files from remote to local

I tried to copy a file from a remote machine to my local directory, but I am getting the error "No such file or directory".
Below is the command I used to copy:
scp username@remoteserver.xxx:/path to the file/filename /path to the local directory/
All the paths I copied using pwd.
Do you get "No such file or directory" for the
/path to the file/filename
or the
/path to the local directory/
part? If it's the first, then you might have an issue with the path itself. If it's the second, then specifying localuser@localmachine:/path to the local directory/ might fix it. If you're on the same LAN, you could also try localuser@localmachineip:/path to the local directory/
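One more thing to check, assuming the real paths contain spaces like the placeholders do: each side must be quoted, and with classic scp the remote path is re-parsed by the remote shell, so it often needs an extra layer of quoting, e.g. scp username@remoteserver.xxx:'"/path to the file/filename"' "/path to the local directory/". A minimal local illustration of why the quoting matters:

```shell
dir="$(mktemp -d)/path with spaces"
mkdir -p "$dir"
touch "$dir/filename"
cp "$dir/filename" "$dir/copy"   # quoted: the spaces stay part of one path
```

Without the quotes, the shell would split each path into several arguments and cp (or scp) would report "No such file or directory".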

How to tell to gsutil file upload command that the "dst_url" is actually a file, not a directory?

On a GCS bucket, I have uploaded from the web console a file with name fileName, and created a directory with the same fileName name. Hence, the GCS bucket contains both at its root a file with name fileName and a directory with the same name.
Now, when I attempt to update the fileName file via the gsutil command via
gsutil cp localFile gs://bucketname/fileName
where localFile is a file on my local machine and gs://bucketname/fileName is hence the dest_url parameter, instead of overwriting the GCS fileName file, a new file with name fileName is created inside the GCS directory named fileName, which yields the GCS file gs://bucketname/fileName/fileName.
I could not find an option in the gsutil command, which indicates that the provided dest_url should be interpreted as a file and not a directory.
Did I miss something? Any work-around, please?
A workaround to accomplish this is to use:
gsutil rsync -d . gs://bucketname
This command will keep the contents of the fileName directory but will change the contents of the fileName file. In fact, this command synchronizes the contents of your local path (in this case the "current" path, specified by the dot: .) with the contents of your bucket. Therefore, if you just change the contents of the fileName file on your local system, the synchronization will overwrite the fileName file in your GCS bucket without affecting your fileName directory. Be aware that -d also deletes any top-level bucket object that has no counterpart under your local path, so run it from a directory whose contents mirror the bucket's.
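A narrower variation on this workaround (a sketch, using the question's hypothetical fileName/bucketname): stage the one updated file alone in an empty directory and rsync that directory instead of the whole working tree. Without -d, nothing else in the bucket can be deleted, and the object name maps directly, so only gs://bucketname/fileName is overwritten.

```shell
work=$(mktemp -d)
echo 'new contents' > "$work/fileName"    # stand-in for the updated local file
stage=$(mktemp -d)                        # empty staging directory
cp "$work/fileName" "$stage/"             # stage exactly the one object to update
# gsutil rsync "$stage" gs://bucketname   # would overwrite only gs://bucketname/fileName
```

The gsutil line is commented out here since it needs real credentials and a real bucket; the point is that the staging directory holds exactly one entry, so the sync can touch nothing else.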

Copying all files from a directory using a pig command

Hey I need to copy all files from a local directory to the HDFS using pig.
In the pig script I am using the copyFromLocal command with a wildcard in the source-path
i.e. copyFromLocal /home/hive/Sample/* /user
It says the source path doesn't exist.
When I use copyFromLocal /home/hive/Sample/ /user, it makes another directory in HDFS by the name 'Sample', which I don't need.
But when I include the file name, i.e. /home/hive/Sample/sample_1.txt, it works.
I don't need a single file. I need to copy all the files in the directory without making a directory in HDFS.
PS: I've also tried *.txt, ?, ?.txt
No wildcards work.
Pig's copyFromLocal/copyToLocal commands work only on a single file or a directory; they will never take a series of files or a wildcard. Moreover, Pig concentrates on processing data from/to HDFS; to my knowledge you can't even loop over the files in a local directory with ls, because it lists the files in HDFS. So for this scenario I would suggest you write a shell script/action (i.e. an fs command) to copy the files from local to HDFS.
Check this link for more info:
http://pig.apache.org/docs/r0.7.0/piglatin_ref2.html#copyFromLocal
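The shell-script route suggested above can be sketched as follows (hypothetical paths from the question; the hadoop call is echoed rather than run, since it needs a live cluster). The key point is that the shell expands the wildcard before the command sees it, which is exactly what Pig's copyFromLocal cannot do.

```shell
src=$(mktemp -d)                          # stand-in for /home/hive/Sample
touch "$src/sample_1.txt" "$src/sample_2.txt"
for f in "$src"/*; do                     # the shell expands the * here
    # replace echo with the real call: hadoop fs -put "$f" /user
    echo "hadoop fs -put $f /user"
done
```

Because each file is put individually, no extra 'Sample' directory is created on the HDFS side.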

SCP is creating subdirectory... but I just want it to copy directly

I'm trying to use scp to copy recursively from a local directory to a remote directory.... I have created the folders on the remote side:
Remote Location (already created):
/usr/local/www/foosite
I am running scp from the local machine in directory:
/usr/local/web/www/foosite
But it's copying the "foosite" directory as a subdirectory... I just want the contents of the folder, not the folder itself...
Here is the command I'm using:
scp -r /usr/local/web/www/foosite scpuser@216.99.999.99:/usr/local/www/foosite
The problem is that if you don't use the asterisk (*) in the local part of the call, scp will create a new top level directory in the remote server. It should look like this:
scp -r /usr/local/web/www/foosite/* scpuser@216.99.999.99:/usr/local/www/foosite
This says "Copy the CONTENTS" (but not the directory itself) to the remote location.
Hope this helps... Took me an hour or so to figure this out!!!
Old question, but I think there is a better answer. The trick is to leave the foosite directory off of the destination:
scp -r /usr/local/web/www/foosite scpuser@216.99.999.99:/usr/local/www
This will create the foosite directory on the destination if it does not exist, but will just copy files into foosite if the directory already exists. Basically the -r option will copy the last directory in the path and anything under it. If that last directory already exists on the destination, it just doesn't do the mkdir.
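cp -r follows the same trailing-component convention, so both answers can be sketched locally (hypothetical directory names standing in for the local and remote trees):

```shell
work=$(mktemp -d)
mkdir -p "$work/local/foosite" "$work/remote/foosite"
touch "$work/local/foosite/index.html"
# First answer's approach: copy the CONTENTS into the existing directory.
cp -r "$work/local/foosite/"* "$work/remote/foosite"
# Second answer's approach: name the PARENT; foosite exists, so files merge in.
cp -r "$work/local/foosite" "$work/remote"
```

Either way the files end up directly under remote/foosite, with no extra nested foosite directory.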

Rename files without extension with AutoHotkey

I am trying to add .pdf to the filename of files without extension in a folder.
It is possible to rename for example txt files to pdf using the following command:
FileMove, %SourceFolder%\*.txt, %SourceFolder%\*.pdf
Also, I can add .pdf to all files by:
FileMove, %SourceFolder%\*, %SourceFolder%\*.pdf
But I only want to target only the files without extension. How to do this?
Example
As suggested by kasper and MCL
SetWorkingDir %A_ScriptDir%
Loop, files\*
{
    if !StrLen(A_LoopFileExt) ; if no file extension
    {
        FileMove, %A_LoopFileFullPath%, %A_LoopFileFullPath%.pdf ; rename the file
    }
}
see [Loop, FilePattern]
see [FileMove]
@kasper, you can use the Windows command prompt to rename the file as well as change the extension.
Just navigate to the file's directory and use the command:
ren abc.txt abc.pdf
Here abc.txt is the old file name and abc.pdf is the new name with the .pdf extension (note this only renames the file; it does not convert its contents).
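For bulk renames, the "no extension" filter from the AHK loop above can also be sketched in a POSIX shell (e.g. from WSL or Git Bash; file names here are hypothetical):

```shell
dir=$(mktemp -d)
touch "$dir/report" "$dir/notes.txt"
for f in "$dir"/*; do
    case ${f##*/} in
        *.*) ;;                    # has an extension: leave it alone
        *)   mv "$f" "$f.pdf" ;;   # no extension: append .pdf
    esac
done
```

The case pattern *.* plays the same role as the !StrLen(A_LoopFileExt) check: only names without a dot are renamed.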