I'm working on a research project in an ML lab, running jobs remotely on their machines over ssh. The machines are Linux, and my home laptop is a Mac. The machines' own storage is really small, so I'm supposed to use two specific directories that show up under df -i (the inode listing). But how do I direct files to be stored there?
Appending the directory when I use scp -r doesn't work. My command to send directories from my local computer to the server is scp -r /Local directory pwd/ username@server:/home/username, and sticking the inode directory on the back of it (so the new command would be scp -r /Local directory pwd/ username@server:/home/username/inode directory) doesn't work.
Would appreciate help
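One thing worth checking first: the unquoted spaces in both the local path and the inode directory name split them into separate arguments, so scp never sees the paths you intend. A minimal sketch of a quoted transfer, where /mnt/scratch stands in for whichever directory df -i showed (hypothetical name):

# Quote any path that contains spaces so scp sees it as one argument.
# /mnt/scratch is a placeholder for the lab directory shown by df -i.
scp -r "/Local directory pwd" username@server:/mnt/scratch/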
I recently set up a LAMP stack on Ubuntu 14.04 for my web server. I'm working through DigitalOcean. These are the steps I went through...
On my local machine I logged in to my web server with
sftp user@web_server_ip
Then
sftp> cd /var/www/html
How would I go about getting the files for the site from my local machine? And how would I transfer them?
I know that I have to use the [get] and [put] commands
I'm just confused about what's considered local and what's considered remote if I'm logged into the remote server from my local machine. Am I overthinking it?
This is the tutorial I'm trying to follow: How To Use SFTP to Securely Transfer Files with a Remote Server
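For what it's worth, a minimal sketch of a session (paths are placeholders): even while you are logged in to the remote server, "local" still means your own machine, so put uploads local to remote and get downloads remote to local.

sftp user@web_server_ip
sftp> cd /var/www/html
sftp> put /local/path/index.html     # upload: local machine -> server
sftp> get index.html /local/path/    # download: server -> local machine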
Edit:
So I tried moving a whole directory from my local machine and this is what I ended up doing
scp -r /path/directory_name name@ip_address:/var/www/html
scp: /var/www/html/portfolio.take7: Permission denied
Should I be changing permissions with sudo before running scp -r?
Edit2:
I have also tried
Where_directory_is$ scp -r /path/directory_name name@ip_address:/var/www/html
/var/www/html: No such file or directory
It might be easier to start with scp, which lets you copy files with a single command. For example, if you had a local file /path/filename.css and wanted to transfer it to your server, you could use the following command on your local machine:
scp /path/filename.css username@remote_hostname_or_IP:~
This command copies the local file to the home directory of username on the remote server over SSH. You can then SSH in (ssh username@remote_hostname_or_IP) and do what you need with the file sitting in your home directory, such as move it to the proper Apache directory.
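A minimal end-to-end sketch of that flow (hostname and the Apache path /var/www/html as used elsewhere in this answer; sudo is assumed because that directory is usually not writable by a regular user):

scp /path/filename.css username@remote_hostname_or_IP:~
ssh username@remote_hostname_or_IP
sudo mv ~/filename.css /var/www/html/   # move it into the Apache web root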
Once you start to get more comfortable, you can switch to sftp if you like.
Update
Here is how to set up your Apache permissions. Let's say you have an account named you on the Linux computer running Apache, and we'll say the IP is 192.168.1.100.
On your local machine, create this shell script, secure.sh, and remember that shell scripts need execute permission (chmod +x secure.sh). Fill it with the following contents:
#!/usr/bin/env bash
# Lock down the public web files
find /var/www -exec chown you:www-data {} \;
find /var/www -type d -exec chmod -v 750 {} \;
find /var/www -type f -exec chmod -v 640 {} \;
This shell script sets the permissions for everything under /var/www to 750 for directories and 640 for files. This gives your account (you) full read/write access and gives www-data (the account Apache runs as) read access. Run it any time you have uploaded files, to ensure the permissions are always set correctly.
Next, SSH into your remote computer and go to the /var/www/html directory. Check that the ownership is not set to root. If it is, scp the secure.sh file to the remote computer, become root, and run it there. This only needs to be done once; afterwards you can set the permissions remotely.
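That one-time step might look like this (IP from the example above; sudo assumed to be available, otherwise use su):

scp secure.sh you@192.168.1.100:~
ssh you@192.168.1.100
sudo /home/you/secure.sh    # run once as root to fix the initial ownership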
Now you can copy directly to /var/www/html with scp -r, run on your local machine from the top of the directory you wish to copy:
scp -r ./ you#192.168.1.100:/var/www/html/
Then run this command to remotely run the secure.sh shell script and send the output to out.txt:
ssh you@192.168.1.100 -p 23815 ./secure.sh > out.txt
Then cat out.txt to see that the file permissions changed accordingly.
If this is a public-facing computer, then you must add an SSH key to your scp connection. You can use this tutorial to find out more about generating your own keys; it is quite easy. To use the key, you only need to add -i private_key_file to your scp and ssh commands. Lastly, it would actually be safer to keep the /var/www files owned by root, SSH into the computer, su to become root, and then run secure.sh as root (with the owner changed to root in the shell script). It all depends on the level of security you need. If it is a development computer (which is what I am assuming), no worries.
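A sketch of the key-based variant (the key filename is a placeholder; pick your own):

ssh-keygen -t rsa -f ~/.ssh/private_key_file                    # generate the key pair
ssh-copy-id -i ~/.ssh/private_key_file.pub you@192.168.1.100    # install the public key on the server
scp -i ~/.ssh/private_key_file -r ./ you@192.168.1.100:/var/www/html/
ssh -i ~/.ssh/private_key_file you@192.168.1.100 ./secure.sh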
For directories use:
scp -r root@yourIp:/home/path/ /pathOfDirectory/
For a single file (no -r needed):
scp root@yourIp:/home/path/file /pathOfDirectory/fileNameCopied
I want to be able to transfer a directory and all its files from my local machine to my remote one. I don't use scp much, so I am a bit confused.
I am connected to my remote machine via ssh and I typed in the command
scp name@127.0.0.1:local/machine/path/to/directory filename
The local/machine/path/to/directory part is the value I got from running pwd in the desired directory on my local host.
I am currently getting the error
No such file or directory
Looks like you are trying to copy to a local machine with that command.
An example scp looks more like the command below:
Copy the file "foobar.txt" from the local host to a remote host
$ scp foobar.txt your_username@remotehost.edu:/some/remote/directory
scp "the_file" your_username@the_remote_host:the/path/to/the/directory
to send a directory:
Copy the directory "foo" from the local host to a remote host's directory "bar"
$ scp -r foo your_username@remotehost.edu:/some/remote/directory/bar
scp -r "the_directory_to_copy" your_username@the_remote_host:the/path/to/the/directory/to/copy/to
and to copy from remote host to local:
Copy the file "foobar.txt" from a remote host to the local host
$ scp your_username@remotehost.edu:foobar.txt /your/local/directory
scp your_username@the_remote_host:the_file /your/local/directory
and to include port number:
Copy the file "foobar.txt" from a remote host with port 8080 to the local host
$ scp -P 8080 your_username@remotehost.edu:foobar.txt /your/local/directory
scp -P port_number your_username@the_remote_host:the_file /your/local/directory
From a Windows machine to a Linux machine using PuTTY:
pscp -r <directory_to_copy> username@remotehost:/path/to/directory/on/remote/host
I had a similar problem. I tried to copy from a server to my desktop and always got the same message about the local path. The problem was that I was already logged in to my server via ssh, so scp was looking for the "local" path on the server.
Solution: I had to log out, run the command again from my own machine, and it worked.
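In other words (paths are placeholders), run the copy from a terminal on your own machine, not from inside the ssh session:

# From a terminal on your desktop, NOT inside the ssh session:
scp name@host:/remote/path/file /local/desktop/path/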
In my case I had to specify the port number:
scp -P 2222 username@hostip:/directory/ /localdirectory/
Your problem can be caused by different things. I will describe three possible scenarios in Linux:
The file location
When you write scp name, you are saying that the file name is in your home directory. When it is in your home directory but inside another folder, for example my_folder, you should write:
scp /home/my-username/my_folder/name my-username@127.0.0.1:/Path....
Your file permissions
You must know what permissions your file has. If it is read-only, you should change that.
To change the permissions:
as root, run sudo caja (the default file manager for the MATE desktop) or another file manager, then right-click the file name, select Properties > Permissions,
and change Group and Others to read and write.
Or use chmod.
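For example, a chmod equivalent of the change described above (file path from the earlier example):

chmod go+rw /home/my-username/my_folder/name   # give group and others read/write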
Your port number
Maybe your remote machine or server only accepts connections on a specific port, in which case you should pass -P and the port number:
scp -P 22 /home/my-username/my_folder/name my-username@127.0.0.1:/var/www/html
You also need to check what is in the user's .bashrc file.
I also got this ridiculous error because I had put cd and ls commands in there; they were meant to show the current files and directories when the user logs in over ssh.
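A common fix (a sketch; the cd and ls lines stand in for whatever you have in .bashrc) is to guard interactive-only commands so they don't run for non-interactive sessions like scp:

# In ~/.bashrc: only run these in interactive shells; unguarded output
# breaks non-interactive tools such as scp, sftp, and rsync.
case $- in
  *i*)
    cd ~/projects   # placeholder for your login-time commands
    ls
    ;;
esac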
The filename should go at the end of the path to the directory; that is, the source should be the full path to the file. You are doing this from a command line, and that command line has a working directory (on your local machine); this is the directory your file will be downloaded to. The final argument in your command is only what you want the file to be named. So, first change directory to where you want the file to land. I'm doing this from Git Bash on a Windows machine, so it looks like this:
cd C:\Users\myUserName\Downloads
Now that I have my working directory where I want the file to go:
scp -i 'c:\Users\myUserName\.ssh\AWSkeyfile.pem' ec2-user#xx.xxx.xxx.xxx:/home/ec2-user/IwantThisFile.tar IgotThisFile.tar
Or, in your case:
cd /local/path/where/you/want/the/file/to/land
scp name@127.0.0.1:/local/machine/path/to/directory/filename filename
Be sure the folder you are sending the file from does not contain a space!
I was trying to send a file to a remote server from the VS Code terminal on my Windows machine, and I got this error even though the file was there.
It was because the folder containing the file had a space in its name...
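If renaming the folder isn't an option, quoting the path works too (path is a placeholder):

scp "C:\Users\My Name\My Folder\file.txt" user@host:/remote/dir/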
If you want to copy everything in a folder and also need a specific port, use this one.
Works for me on Ubuntu 18.04 and a local machine with Mac OS X.
-r for recursive
-P for Port
scp -rP 1234 /Your_Directory/Source_Folder/ username@yourdomain.com:/target/folder
As @Astariul said, the path to the file might cause this bug.
In addition, any parent directory that contains a non-ASCII character, for example Chinese, will cause this.
In that case, you should rename the parent directory.
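For example (directory names are hypothetical):

mv "日志文件" logfiles    # rename the non-ASCII parent directory
scp user@host:/home/user/logfiles/data.txt .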
This happened to me and I solved it.
This problem can occur because the file you are trying to get does not exist (a typo in the file or folder name?) or because it is inaccessible to the user you connect as with scp.
The problem in my case was that the files I wanted to get from the remote machine were created by another user (root in my case), so they were inaccessible to my user.
To fix, I did:
ssh myuser@myserver
sudo chown myuser:myuser myfile
exit
scp myuser@myserver:/home/myuser/myfile /localfolder/myfile
For me on my Mac, I just had to run the command from my local Mac terminal:
scp -r root@ip_address:/root/source /Users/path/Desktop/others/destination
I need to rsync log files like this:
rsync --progress -rvze ssh name@host:/path/to/folder/*.log
When I run this command though, I get an error:
rsync: getcwd(): No such file or directory (2)
No such file or directory? That's odd. So I try to ssh directly:
ssh name@host
It prompts me to enter my name; I do, then I type
cd /path/to/folder
which works fine (log files are present).
I double-checked my ssh keys and everything seems to be in order there, but for some reason I can't ssh "into a subfolder" on this host, so there's no way I can get rsync working correctly.
EDIT:
Running the identical rsync command on my Mac works fine. Running it in my Ubuntu EC2 instance still fails.
Are you sure there are any log files at all? If not, this command will fail with 'No such file or directory'.
Rather use --include/--exclude filters instead of a remote glob (note that --include only has an effect when paired with a matching --exclude):
rsync --progress --include='*/' --include='*.log' --exclude='*' -rvze ssh name@host:/path/to/folder/ local_folder
The 'direct' ssh syntax you used in your second test is not supported:
ssh name@host:/path/to/folder/
because ssh will treat host:/path/to/folder/ as the hostname.
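If you want to inspect a remote directory in one step, pass the command to ssh instead of appending a path to the hostname:

ssh name@host 'ls -l /path/to/folder/'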
I uploaded AjaXplorer "pydio-core-5.0.4.zip" to my server, and after I extracted the files into a folder on the server I opened the folder in the browser to start the install, but I get this message:
"Impossible write into the AJXP_DATA_PATH folder: Make sure to grant write access to this folder for your webserver!"
I set the /data folder's permissions to 777 and it did not change anything.
Any solutions?
I had the same problem a few hours ago.
The problem:
You put full permissions (777) on the data folder, but the subfolders don't inherit them.
The solution:
sudo chmod -R 777 data
or
sudo mkdir -m 777 your_pydio_path/data/tmp/sessions
I know this is old, but I was having the same issue with pydio-core-6.0.8. I'll preface this by saying that I am a PHP noob, but I was able to resolve my issue without a chmod 777 command. Instead, I made the nginx user the owner of the data directory.
chown -R nginx /path/to/pydio-core-6.0.8/data
And then made sure that php-fpm was running as the nginx user with the two php-fpm.conf settings
listen.owner = nginx
user = nginx
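Then restart php-fpm so the new owner settings take effect (a sketch; the service name varies by distro and PHP version):

sudo systemctl restart php-fpm   # may be php5-fpm, php7.0-fpm, etc. on your system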
After restarting php-fpm, I was able to load the pydio page which went into the startup wizard.
That chmod -R 777 command is easy, but it's dangerous!
Go to /var/www/pydio for apache2 or /usr/share/nginx/html/pydio for nginx and try:
chmod ugo+x data
It's better protected!