How to use specific inodes/directories that come under df -i to store files - ssh

I'm working on a research project in an ML lab, running jobs on their machines remotely over ssh. The machines are Linux, and my home laptop is a Mac. The machines' main storage is really small, so I'm supposed to use two specific directories that show up under df -i (the inode listing). But how do I redirect files to be stored there?
Appending the directory to the destination when I use scp -r doesn't work. My command to send directories from my local computer to the remote machine is scp -r "/Local directory pwd/" username@server:/home/username, and, for example, sticking the inode directory onto the end of it (so the new command would be scp -r "/Local directory pwd/" username@server:/home/username/inode directory) doesn't work.
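For reference, a minimal sketch of the general shape such a transfer takes, assuming the large filesystem is mounted at a hypothetical path like /data/scratch (check the "Mounted on" column of df on the remote machine for the real path), and noting that a local path containing spaces must be quoted:
ssh username@server df -h
scp -r "/Local directory pwd" username@server:/data/scratch/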
Would appreciate help

Related

Trying to transfer local files to web server

I recently set up a LAMP stack on Ubuntu 14.04 for my web server. I'm working through DigitalOcean. These are the steps I went through...
On my local machine I logged in to my web server with
sftp user@web_server_ip
Then
sftp> cd /var/www/html
How would I go about getting to my local machine to grab the files for the site? And how would I transfer them?
I know that I have to use the [get] and [put] commands
I'm just confused about what's considered local/remote if I'm logged into the remote server from my local machine. Am I overthinking it?
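For reference, in an sftp session "local" is always the machine you launched sftp from: put uploads a file from the local working directory to the remote one, and get downloads the other way. A quick sketch with hypothetical file names:
sftp> put index.html
sftp> get style.css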
This is the tutorial I'm trying to follow: How To Use SFTP to Securely Transfer Files with a Remote Server
Edit:
So I tried moving a whole directory from my local machine and this is what I ended up doing
scp -r /path/directory_name name@ip_address:/var/www/html
scp: /var/www/html/portfolio.take7: Permission denied
Should I be changing the permissions using sudo prior to running scp -r?
Edit2:
I have also tried
Where_directory_is$ scp -r /path/directory_name name@ip_address:/var/www/html
/var/www/html: No such file or directory
It might be easier to start with SCP, which allows you to copy files with one command. So for example, if you had a local file /path/filename.css and wanted to transfer it to your server, you could use the following command on your local machine:
scp /path/filename.css username@remote_hostname_or_IP:~
This command copies the local file and transfers it over SSH to the home directory of that username on the remote server. You can then SSH in (ssh username@remote_hostname_or_IP) and do what you need with the file sitting in your home directory, such as moving it to the proper Apache directory.
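For example, assuming the web root is /var/www/html as in the tutorial (you may need sudo if that directory is owned by root):
ssh username@remote_hostname_or_IP
sudo mv ~/filename.css /var/www/html/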
Once you start to get more comfortable, you can switch to sftp if you like.
Update
Here is how to set up your Apache permissions. Let's say you have an account you on the Linux computer running Apache, and we'll say the IP is 192.168.1.100.
On your local machine, create this shell script, secure.sh, and remember shell scripts need to have execute privileges (chmod +x secure.sh). Fill it with the following contents:
#!/usr/bin/env bash
# Lock down the public web files
find /var/www -exec chown you:www-data {} \;
find /var/www -type d -exec chmod -v 750 {} \;
find /var/www -type f -exec chmod -v 640 {} \;
This shell script sets the permissions for everything in the /var/www/ directory to 750 for directories and 640 for files. This gives you complete read/write permissions for the files and gives www-data (which is the account for Apache) read permissions. Run this any time you have uploaded files to ensure the permissions are always set correctly.
Next, SSH into your remote computer and go to the /var/www/html directory. Check that the ownership is not set to root. If it is, scp the secure.sh file to your remote computer, become root, and run it. This only needs to be done once; after that you can set the permissions remotely.
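That one-time bootstrap might look something like this (a sketch, assuming your account can sudo on the server):
scp secure.sh you@192.168.1.100:~
ssh you@192.168.1.100
sudo ./secure.sh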
Now you can copy directly to /var/www/html with the scp -r command, run on your local machine from the top of the directory tree you wish to copy:
scp -r ./ you@192.168.1.100:/var/www/html/
Then run this command to remotely run the secure.sh shell script and send the output to out.txt:
ssh you@192.168.1.100 -p 23815 ./secure.sh > out.txt
Then cat out.txt to see that the file permissions changed accordingly.
If this is a public-facing computer, then you must add an SSH key to your scp connection. You can use this tutorial to find out more about generating your own keys; it is quite easy. To use the key, you only need to add -i private_key_file to your scp and ssh commands. Lastly, it would actually be safer to keep the /var/www files owned by root, SSH into the computer, su to become root, then run secure.sh as root (with the owner changed to root in the shell script). It all depends on the level of security you need to worry about. If it is a development computer (which is what I am assuming), no worries.
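A sketch of that key setup, assuming an ed25519 key with the default file name (both are just examples):
ssh-keygen -t ed25519
ssh-copy-id you@192.168.1.100
scp -i ~/.ssh/id_ed25519 -r ./ you@192.168.1.100:/var/www/html/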
For folders, use
scp -r root@yourIp:/home/path/ /pathOfDirectory/
For files
scp root@yourIp:/home/path/file /pathOfDirectory/fileNameCopied

scp command - transfer folder over ssh

I have an Arduino Yun and want to set up the server for the Yun.
So what I want is to copy a folder that contains a py file and an index.html to my Yun.
I used the Mac terminal to do this operation.
The command looks like this:
scp -r /Users/gudi/Desktop/LobsterHeartRate root@192.168.240.1:/mnt/sda1
and then the terminal asked for the password
after I typed it, it showed
scp: /mnt/sda1/LobsterHeartRate: Not a directory
I didn't type /mnt/sda1/LobsterHeartRate, so why does it show this error?
Your code
scp -r /Users/gudi/Desktop/LobsterHeartRate root@192.168.240.1:/mnt/sda1
requires that the remote directory /mnt/sda1 exists. It looks like that is not true in your case. Check it using ssh root@192.168.240.1 ls /mnt/sda1.
scp is a simple tool: it does not allow you to rename directories on the fly, and the target directory must exist. You might try
scp -r /Users/gudi/Desktop/LobsterHeartRate root@192.168.240.1:/mnt/
ssh root@192.168.240.1 mv /mnt/LobsterHeartRate /mnt/sda1
or something like that, if it suits your needs. But for copying more files, rsync is usually more suitable. Check its manual page and give it a try next time.
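For example, something like this copies the directory in one step and can resume an interrupted transfer (assuming rsync is installed on the Yun, which is not guaranteed on its OpenWrt side):
rsync -av /Users/gudi/Desktop/LobsterHeartRate root@192.168.240.1:/mnt/sda1/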
As @Jens Höpken notes, your post is a bit sparse. But trying to read between the lines of your post, I suspect that LobsterHeartRate is a DIRECTORY on your local system but a FILE named LobsterHeartRate on your target system. This might be happening right at the top of the directory tree, or perhaps you have directories/files of the same name further down the tree. scp -rv might help resolve any confusion here.
Beware: scp -r resolves symbolic links. If you want to preserve symlinks, you need to do something else. For historic reasons I use the following, though cpio with a find front-end opens up interesting possibilities for fine-grained file selection.
( cd /Users/gudi/Desktop && tar -cf - LobsterHeartRate ) |
ssh root@192.168.240.1 'cd /mnt/sda1 && tar -xf -'
For a safe "dry run" you could change the -xf to a -tf. The && chains are required to prevent bad things from happening if any prior command fails.
Disclaimer: any debugging is left as an exercise for the student.
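For completeness, a sketch of the cpio route mentioned above (assuming cpio is available on both ends):
( cd /Users/gudi/Desktop && find LobsterHeartRate -depth -print | cpio -o ) |
ssh root@192.168.240.1 'cd /mnt/sda1 && cpio -idm'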

scp files from local to remote machine error: no such file or directory

I want to be able to transfer a directory and all its files from my local machine to my remote one. I don't use scp much, so I am a bit confused.
I am connected to my remote machine via ssh and I typed in the command
scp name@127.0.0.1:local/machine/path/to/directory filename
the local/machine/path/to/directory is the value I got from using pwd in the desired directory on my local host.
I am currently getting the error
No such file or directory
Looks like you are trying to copy to a local machine with that command.
An example scp looks more like the command below:
Copy the file "foobar.txt" from the local host to a remote host
$ scp foobar.txt your_username@remotehost.edu:/some/remote/directory
scp "the_file" your_username@the_remote_host:the/path/to/the/directory
to send a directory:
Copy the directory "foo" from the local host to a remote host's directory "bar"
$ scp -r foo your_username@remotehost.edu:/some/remote/directory/bar
scp -r "the_directory_to_copy" your_username@the_remote_host:the/path/to/the/directory/to/copy/to
and to copy from a remote host to local:
Copy the file "foobar.txt" from a remote host to the local host
$ scp your_username@remotehost.edu:foobar.txt /your/local/directory
scp your_username@the_remote_host:the_file /your/local/directory
and to include a port number:
Copy the file "foobar.txt" from a remote host with port 8080 to the local host
$ scp -P 8080 your_username@remotehost.edu:foobar.txt /your/local/directory
scp -P port_number your_username@the_remote_host:the_file /your/local/directory
From a Windows machine to a Linux machine using PuTTY:
pscp -r <directory_to_copy> username@remotehost:/path/to/directory/on/remote/host
I had a similar problem. I tried to copy from a server to my desktop and always got the same message about the local path. The problem was that I was already logged in to my server via ssh, so the command was looking for the "local" path on the server.
Solution: I had to log out, run the command again from my local machine, and it worked.
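In other words (placeholder paths):
# wrong: run while logged in to the server; /local/path is resolved on the server
# right: run from a terminal on your own machine
scp user@server:/remote/path/file.txt /local/path/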
In my case I had to specify the port number:
scp -P 2222 username@hostip:/directory/ /localdirectory/
Your problem can be caused by different things. Here are three possible scenarios in Linux:
The file location
When you use scp name, you mean that your file name is in your home directory. When it is in your home directory but inside another folder, for example my_folder, you should write:
scp /home/my-username/my_folder/name my-username@127.0.0.1:/Path....
Your file permissions
You must know what permissions your file has. If it is read-only, you should change that.
To change the permissions:
as root, use caja (the default file manager for the MATE desktop) or another file manager, right-click the file name, select Properties, then Permissions,
and change Group and Others to Read and write.
Or with chmod.
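For example, to give the owner read/write and everyone else read access:
chmod 644 /home/my-username/my_folder/name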
Your port number
Maybe your remote machine or server can only communicate on a certain port, so you should pass -P and the port number:
scp -P 22 /home/my-username/my_folder/name my-username@127.0.0.1:/var/www/html
You also need to check what is in the .bashrc file of the remote user.
I also got this ridiculous error because I had put cd and ls commands in there, meant to show the current files and directories when the user logs in over ssh.
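The underlying issue is that scp starts a non-interactive shell on the remote side, and any output or directory change made by .bashrc can confuse it. A common guard (a sketch) is to keep such commands in an interactive-only block:
# in ~/.bashrc on the remote machine
case $- in
  *i*)
    cd ~/projects
    ls
    ;;
esac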
The filename should go at the end of the path to the directory; that is, the source should be the full path to the file. You are doing this from a command line, and that command line has a working directory (on your local machine): this is the directory your file will be downloaded to. The final argument in your command is only what you want the file to be named. So, first, change directory to where you want the file to land. I'm doing this from Git Bash on a Windows machine, so it looks like this:
cd C:\Users\myUserName\Downloads
Now that I have my working directory where I want the file to go:
scp -i 'c:\Users\myUserName\.ssh\AWSkeyfile.pem' ec2-user@xx.xxx.xxx.xxx:/home/ec2-user/IwantThisFile.tar IgotThisFile.tar
Or, in your case:
cd /local/path/where/you/want/the/file/to/land
scp name@127.0.0.1:/local/machine/path/to/directory/filename filename
Be sure the folder you are sending the file from does not contain a space!
I was trying to send a file to a remote server from my Windows machine from the VS Code terminal, and I got this error even though the file was there.
It was because the folder containing the file had a space in its name...
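Quoting or escaping the space also works, if you don't want to rename the folder (hypothetical paths):
scp "/path/with space/file.txt" user@server:/remote/dir/
scp /path/with\ space/file.txt user@server:/remote/dir/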
If you want to copy everything in a folder and also need a specific port, use this one.
Works for me on Ubuntu 18.04 and a local machine with Mac OS X.
-r for recursive
-P for Port
scp -rP 1234 /Your_Directory/Source_Folder/ username@yourdomain.com:/target/folder
As @Astariul said, the path to the file might cause this error.
In addition, any parent directory whose name contains a non-ASCII character, for example Chinese, will cause this.
In that case, you should rename the parent directory.
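For example (hypothetical names):
mv ~/"中文目录" ~/ascii_dir
scp -r ~/ascii_dir user@server:/remote/dir/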
This happened to me and I solved it.
This problem can occur because the file you are trying to get does not exist (a typo in the name of the file or folder?) or because it is not accessible to the user you connect as with scp.
The problem in my case was that the files I wanted to get from the remote machine had been created by another user (root in my case), so those files were not accessible to my user.
To fix, I did:
ssh myuser@myserver
sudo chown myuser:myuser myfile
exit
scp myuser@myserver:/home/myuser/myfile /localfolder/myfile
For me on my Mac, I just had to run the command from my Mac terminal:
scp -r root@ip_address:/root/source /Users/path/Desktop/others/destination

Is it possible to make SCP ignore symbolic links during copy?

I need to reinstall one of our servers, and as a precaution I want to move /home, /etc, /opt, and /Services to a backup server.
However, I have a problem: because of plenty of symbolic links a lot of files are copied multiple times.
Is it possible to make scp ignore the symbolic links (or actually to copy link as a link not as a directory or file)? If not, is there another way to do it?
I knew that it was possible, I had just picked the wrong tool. I did it with rsync:
rsync --progress -avhe ssh /usr/local/ XXX.XXX.XXX.XXX:/BackUp/usr/local/
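(The -a in -avhe implies -l, which re-creates symbolic links as links on the destination rather than following them; that is what avoids the duplicated copies.)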
I found that the rsync method did not work for me, however I found an alternative that did work on this website (www.docstore.mik.ua/orelly).
Specifically section 7.5.3 of "O'Reilly: SSH: The Secure Shell. The Definitive Guide".
7.5.3. Recursive Copy of Directories
...
Although scp can copy directories, it isn't necessarily the best method. If your directory contains hard links or soft links, they won't be duplicated. Links are copied as plain files (the link targets), and worse, circular directory links cause scp1 to loop indefinitely. (scp2 detects symbolic links and copies their targets instead.) Other types of special files, such as named pipes, also aren't copied correctly. A better solution is to use tar, which handles special files correctly, and send it to the remote machine to be untarred, via SSH:
$ tar cf - /usr/local/bin | ssh server.example.com tar xf -
Using tar over ssh as both sender and receiver does the trick as well:
cd $DEST_DIR
ssh user@remote-host "cd $REMOTE_SRC_DIR; tar cf - ./" | tar xvf -
One solution is to use a shell pipe. I have a situation where I have some *.gz files and symbolic links generated by some software that point to the same *.gz files with slightly shorter names. If I simply use scp, the symbolic links are copied as regular files, resulting in duplicates. I know rsync can ignore symbolic links, but my gz files are not compressed with rsyncable options, and rsync is very slow copying these gz files. So I simply use the following command to copy over the files:
find . -type f -exec scp {} target_host:/directory/name/data \;
The -type f option finds only regular files and ignores symbolic links. You need to run this command on the source host. Hope this helps someone in my situation. Let me know if I missed anything.
A one-liner that can be executed on the client to copy a folder from the server using tar + ssh:
ssh user@<Server IP/link> 'mkdir -p <remote source directory>; cd <remote source directory>; tar cf - ./' | tar xf - -C <local destination directory>
Note: the mkdir is a must; if the remote directory is not present, the cd fails and the command simply archives the entire home directory of the remote server and extracts it on the client.

rsync deploy and file/directories permissions

I'm trying to use rsync to deploy my website to a shared web host.
Phpsuexec is running on it, and that caused me problems with permissions on the files and directories I transferred via rsync: files should be set to 644 and directories to 755, otherwise I get a 500 error.
After several attempts, I came with this rsync command:
rsync -avz -e ssh --chmod=Du=rwx,go=rx,Fu=rw,og=r -p --exclude-from=/var/www/mylocalfolder/.rsyncignore /var/www/mylocalfolder/ user@mywebsite.net:~/
Unfortunately this command doesn't work as expected: all the transferred directories end up set to 744, while file permissions are correctly set to 644.
I can't understand what is wrong.
P.S. I use Linux on my local machine.
Try it like this:
--chmod=Du=rwx,Dg=rx,Do=rx,Fu=rw,Fg=r,Fo=r
It worked for me. (In your original command, the unprefixed og=r applies to directories as well as files, so it strips the execute bit that Du=rwx and go=rx had just set on your directories; prefixing every rule with D or F keeps the two cases separate.)
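Plugged into the command from the question, that would be something like:
rsync -avz -e ssh --chmod=Du=rwx,Dg=rx,Do=rx,Fu=rw,Fg=r,Fo=r --exclude-from=/var/www/mylocalfolder/.rsyncignore /var/www/mylocalfolder/ user@mywebsite.net:~/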