PuTTY SSH commands: zip all the files within this folder, then download - ssh

OK, so I cd into my folder:
ls
cgi-bin wp-comments-post.php wp-mail.php
googlec3erferfer228fc075b.html wp-commentsrss2.php wp-pass.php
index.php wp-config-sample.php wp-rdf.php
license.txt wp-config.php wp-register.php
php.ini wp-content wp-rss.php
readme.html wp-cron.php wp-rss2.php
wp-activate.php wp-feed.php wp-settings.php
wp-admin wp-includes wp-signup.php
wp-app.php wp-links-opml.php wp-trackback.php
wp-atom.php wp-load.php xmlrpc.php
wp-blog-header.php wp-login.php
(uiserver):u45567318:~/wsb454434801 >
What I want to do is zip all the files within this folder and then download the archive to my computer. I am really new to SSH, and this is a client's website, but I really want to start using the command line for speed. I have been looking at this reference http://ss64.com/bash/ to find the right commands, but I would really like some help from somebody, please.
Thanks

cd path/to/folder
zip -r foldername.zip foldername
(Use -r so that any subdirectories are included.)
Please try this; it should solve your problem.

If you are in the directory itself, then:
zip -r zipfilename.zip *

Go to the folder's parent directory using the cd command, then run:
zip -r foldername.zip foldername
Example: zip -r test-bkupname.zip test
Here test is the folder name.
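To double-check what ended up in the archive before downloading it, you can list its contents (a quick sketch; assumes the unzip tool is available on the server):
unzip -l test-bkupname.zip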

tar zcvf ../my_directory.tar.gz .
will create the my_directory.tar.gz file one level up.
scp ../my_directory.tar.gz username@your-ip:/path/to/place/file
will transfer the file to your computer.
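Note that this scp invocation pushes the archive from the server to your machine, which only works if your own computer is running an SSH server. More commonly you would pull the file by running scp (or PuTTY's pscp on Windows) from your local machine; a rough sketch, with the server name and local paths as placeholders and assuming the archive ended up in the remote home directory:
scp u45567318@your-server:my_directory.tar.gz .
pscp u45567318@your-server:my_directory.tar.gz C:\Users\you\Downloads\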

Looks like this is the webroot directory.
Why not zip the directory above (httpdocs / html / whatever) and then move this into the website space, and download from there?
i.e. go into the directory above the web root. For example, if your web root is /var/www/html/ go into /var/www/ and run the following commands:
zip -r allwebfiles.zip html
mv allwebfiles.zip html/allwebfiles.zip
Then in your web browser go to http://mydomain.com/allwebfiles.zip and just download that file.
When extracting, you'd just need to either extract into /var/www/ OR extract into webroot and move all files up one level.
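For example, to extract everything back in place later, a rough sketch (assuming the same /var/www/ layout as above):
unzip allwebfiles.zip -d /var/www/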

Use the following command in an sftp session to download multiple files at one time:
mget ./*
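mget is an FTP/SFTP client command rather than a plain SSH command, so a full session would look roughly like this (a sketch; the host is a placeholder, the username and folder are taken from the prompt above, and OpenSSH's sftp accepts mget as an alias for get with wildcards):
sftp u45567318@your-server
sftp> cd wsb454434801
sftp> mget *
sftp> quit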

Related

How to not use sudo in XAMPP?

I am using XAMPP on Ubuntu 20.04. It has been installed at the default location, /opt/lampp/.
As usual, before writing any code I save my PHP files within /opt/lampp/htdocs.
Now, every time I make changes to a file or want to save it, I have to use the command line with sudo and type my password again and again.
Could there be a way around this?
Or should I install XAMPP in some other directory that is not owned by root?
Thank you.
I have found a solution to the problem. Now I can create files within the htdocs folder.
My XAMPP installation was at the default location, /opt/lampp.
Go to the parent folder of htdocs, that is /opt/lampp, and type:
sudo chown -R $USER:$USER ./htdocs
Now go to the parent folder of lampp, that is /opt, and type:
sudo chmod -R 755 ./lampp/
What we basically did here was change the ownership and permissions of the directory where XAMPP is installed, so that our own user can create folders and modify files there.
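Equivalently, with absolute paths (a sketch, assuming the default /opt/lampp install location):
sudo chown -R $USER:$USER /opt/lampp/htdocs
sudo chmod -R 755 /opt/lampp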

How to delete all files with lftp except for cgi-bin and .ftpquota

I'm setting up a new CI/CD pipeline on GitLab. For the deployment I have to run npm run build and then copy the dist/ folder to the webserver via FTP (with lftp). To ensure a clean deployment, the script should first remove all files on the webserver except the cgi-bin folder and the .ftpquota file, and then copy the new files.
I've searched the web and haven't found a suitable solution. With the --delete flag, lftp deletes all files.
That's my script so far:
- lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; mirror -Rnev dist/ ./ --ignore-time --delete --parallel=10 --exclude-glob .git* --exclude .git/"
My current script removes all files, but I want it to remove everything except the cgi-bin folder and the .ftpquota file.
As seen on unix.stackexchange.com, you should add the -x option to the mirror command to exclude files and directories from deletion. Please check it; a sketch follows below.
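For example, adding the excludes to the mirror invocation from the question might look like this (an untested sketch; -x takes an extended regular expression matched against the relative path, and a trailing slash marks a directory):
- lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; mirror -Rnev dist/ ./ --ignore-time --delete --parallel=10 --exclude-glob .git* --exclude .git/ -x ^cgi-bin/ -x ^\.ftpquota$"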

wget: choose location for FTP download

I'm trying to use wget for an FTP download (with authentication).
This is the command I used to download the file bat.bat into the AppData directory:
wget -r --ftp-user="user" --ftp-password="pass" ftp://website.com/bat.bat -P %appdata%
However, I got my file inside a new folder created by wget, named after the host (website.com).
When I checked the AppData directory, I found my file here:
C:\Users\ev\AppData\Roaming\website.com\bat.bat
I don't need it to create a new directory; I need the file here:
C:\Users\ev\AppData\Roaming\bat.bat
Try the -nH parameter:
wget -r -nH --ftp-user="user" --ftp-password="pass" ftp://website.com/bat.bat -P %appdata%
From wget --help:
Directories:
  -nd, --no-directories            don't create directories.
  -x,  --force-directories         force creation of directories.
  -nH, --no-host-directories       don't create host directories.
       --protocol-directories      use protocol name in directories.
  -P,  --directory-prefix=PREFIX   save files to PREFIX/...
       --cut-dirs=NUMBER           ignore NUMBER remote directory components.
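Since the help above also lists -nd, a simpler variant (untested sketch) when you only want the single file and no directory structure at all would be:
wget -r -nd --ftp-user="user" --ftp-password="pass" ftp://website.com/bat.bat -P %appdata%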

scp not working, saying "Is a directory" error

I am trying to copy a file to a certain folder on a remote server.
It's an ADrive backup plan, but it comes with scp. I can copy the file if I don't specify a directory. Even if I give a directory that doesn't exist, it says it's a directory.
root@host1 [/usr/src]# scp ftpdelete.sh user@host@scp.adrive.com:/mysql-only/
scp: /mysql-only/: Is a directory
Amazingly enough, in my case it was that the directory didn't exist! :|
Is the error message a bug, or is it me? I'm tempted to say the latter.
scp doesn't automatically create a new directory for you when you copy a file (it creates directories only when you do a recursive copy). The error message is wrong; it should be "No such file or directory" or similar.
It is a known problem and there is an upstream Bugzilla report about it [1].
[1] https://bugzilla.mindrot.org/show_bug.cgi?id=1768
You are copying the .sh file to a directory on the server that is expected to exist but in fact does not (so the machine thinks you want the file itself to become a directory). Most probably the directory you specified is wrong.
-r  Recursively copy entire directories. Note that scp follows symbolic links encountered in the tree traversal.
But -r does not create the target directory; you can do the following instead:
ssh remote mkdir /directory
root@host1 [/usr/src]# scp -r ftpdelete.sh user@host@scp.adrive.com:/complete_path/mysql-only/
or
rsync can create the directory if it does not exist.
Its basic command syntax is similar to scp's:
$ rsync -r -e ssh ftpdelete.sh me@my-system:/complete_path/mysql-only/
I saw a similar error when I tried to scp to a path relative to the home directory. The error was fixed after removing the unnecessary leading / in the path:
# scp ftpdelete.sh user@host@scp.adrive.com:mysql-only/
rather than
# scp ftpdelete.sh user@host@scp.adrive.com:/mysql-only/
(note the leading / before mysql-only)
scp -r source_location user@servername:/target_location
Don't put a "/" after the directory's name.
Try downloading the file directly to your local root directory and then copying it from there:
root@host1 [/usr/src]# scp user@host:/root/Desktop/file.txt /root/home/

mysqldump. Location of the file?

I did a backup of my DB, but I can't find the file so I can download it. Where is the file located? Thanks in advance.
root@xxxx:~# mysqldump -u root -p cherio > myBackup.sql
Enter password:
root@xxxx:~# find myBackup.sql
myBackup.sql
EDIT:
I tried this:
root@xxxx:~# find / -name "myBackup.sql"
/root/myBackup.sql
OK, I had to refresh my FTP client for the file to show up.
It is in the same directory where you made the dump; look for ./myBackup.sql, not just myBackup.sql.
Do an ls -al.
It is in the same directory where you ran the command. To check this, run pwd; it will give you your current path. Then run the dump command and it will store the database file in that directory.
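If you want the dump in a predictable place, you can also give mysqldump an absolute output path (a sketch; the directory is just an example and must already exist):
mysqldump -u root -p cherio > /root/backups/myBackup.sql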
To get the generated file properly, run the following command:
mysqldump --user=root --password --databases your_db_name > export_into_db.sql
This will give you the export_into_db.sql dump file in the current directory.
Play with it. Enjoy...
It is either in /root or in the current directory where you ran the dump command.
Try this command in the terminal:
locate myBackup.sql
(Note that locate searches a prebuilt index, so a freshly created file may not appear until updatedb has run; the find command from the question's edit searches the live filesystem.)
I had the same problem.
It turned out that for some reason I could not create any file in the root directory.
Try going into some deeper folder, for example:
cd /var
cd www
and then execute:
mysqldump -uUSERNAME -pPASSWORD DATABASE > backup.sql