I have a folder that I'd like to delete that is located in httpdocs:
The folder is named: /content/
/var/www/vhosts/webiste/httpdocs/
If I use the cd command to change into /httpdocs/ so that I'm in that folder, can I use the command:
rm -rf /content/
Or do I need to use the full directory path,
e.g. rm -rf /var/www/vhosts/webiste/httpdocs/content/
I just wanted to clarify in case this would delete every folder on the server called content.
Don't use /content/; with the leading slash it's an absolute path, so it points at a content folder in the root of the filesystem rather than the one inside httpdocs.
Use just rm -rf content when the content folder is inside /var/www/vhosts/webiste/httpdocs/ and your current working directory is /var/www/vhosts/webiste/httpdocs/. A relative path like that only removes the folder under your current directory, so no other folder on the server called content is affected.
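For example, a careful sequence might look like this (paths taken from the question; pwd is just a sanity check before deleting):
cd /var/www/vhosts/webiste/httpdocs
pwd                    # confirm you are really inside httpdocs
rm -rf content         # removes only httpdocs/content, nothing else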
I am using XAMPP on Ubuntu 20.04. It has been installed at /opt/lampp/.
As usual, before writing any code I save my PHP files within /opt/lampp/htdocs.
Now, every time I make changes to a file or want to save it, I have to use the command line with sudo and type my password again and again.
Could there be a way around this?
Or should I not install XAMPP in a root-owned directory and instead use some other directory that isn't owned by root?
Thank you.
I have found a solution to the problem. Now I can create files within the specific folder of htdocs.
My XAMPP installation location was the default one, /opt/lampp.
Go to the parent folder of htdocs, that is /opt/lampp, and type
sudo chown -R $USER:$USER ./htdocs
Now go to the parent folder of lampp, that is /opt, and type
sudo chmod -R 755 ./lampp/
What we basically did here was change the ownership and permissions so that we can create folders and modify files in the location where XAMPP is installed.
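To confirm the change worked, something like this can be run (assuming the default /opt/lampp install described above; test.php is just a throwaway file name):
ls -ld /opt/lampp/htdocs           # owner should now be your user
touch /opt/lampp/htdocs/test.php   # creating a file should succeed without sudo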
I'm setting up a new CI/CD pipeline on GitLab. For the deployment I have to run npm run build and then copy the dist/ folder to the webserver via FTP (with lftp). To ensure a clean deployment, the script should first remove all files on the webserver except the cgi-bin folder and the .ftpquota file, and then copy the files over.
I've searched the web and haven't found a suitable solution. With the --delete flag, lftp deletes all files.
That's my script so far:
- lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; mirror -Rnev dist/ ./ --ignore-time --delete --parallel=10 --exclude-glob .git* --exclude .git/"
My current script removes all files, but I want it to remove everything except the cgi-bin folder and the .ftpquota file.
As seen on unix.stackexchange.com, you should add the -x option (mirror's --exclude, which takes a regular expression) for each entry you want to keep; excluded entries are neither transferred nor deleted by --delete.
Please check whether that does what you need; a sketch is shown below.
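The two -x patterns here are an assumption of how to match the asker's cgi-bin folder and .ftpquota file; everything else is the original script:
- lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; mirror -Rnev dist/ ./ --ignore-time --delete --parallel=10 -x ^cgi-bin/ -x ^\.ftpquota$ --exclude-glob .git* --exclude .git/"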
I'm trying to use wget for an FTP download (with authentication).
This is the command I used to download the file bat.bat into the AppData directory:
wget -r --ftp-user="user" --ftp-password="pass" ftp://website.com/bat.bat -P %appdata%
I got my file inside a new folder created by wget, named website.com.
When I checked the AppData directory I found my file here:
C:\Users\ev\AppData\Roaming\website.com\bat.bat
I don't need it to create a new directory; I need it here:
C:\Users\ev\AppData\Roaming\bat.bat
Try the -nH parameter:
wget -r -nH --ftp-user="user" --ftp-password="pass" ftp://website.com/bat.bat -P %appdata%
From wget --help:
Directories:
-nd, --no-directories don't create directories.
-x, --force-directories force creation of directories.
-nH, --no-host-directories don't create host directories.
--protocol-directories use protocol name in directories.
-P, --directory-prefix=PREFIX save files to PREFIX/...
--cut-dirs=NUMBER ignore NUMBER remote directory components.
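For reference, if the file lived in a remote subdirectory instead of the FTP root, --cut-dirs could strip those path components as well; the files/ directory below is purely hypothetical:
wget -r -nH --cut-dirs=1 --ftp-user="user" --ftp-password="pass" ftp://website.com/files/bat.bat -P %appdata%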
I had installed the CLion (2016.2.3) IDE from the CLion-2016.2.3.tar.gz file. I accidentally deleted the CLion-2016.2.3.tar.gz file and the CLion-2016.2.3 folder (which I got after extracting CLion-2016.2.3.tar.gz). Now CLion isn't working. When I ran dpkg --list from the terminal, CLion wasn't present in the output. I want to remove CLion completely: all its files, folders, dependencies, etc., even the configuration files. How do I remove it completely?
Run the following command in the terminal to find all the directories and files containing clion in their name (search from / so the whole filesystem is covered):
$ sudo find / -iname "*clion*"
Then delete the directories and files you have found.
To delete a directory or file, cd to the directory that contains it and run the following command:
$ sudo rm -rf DIRECTORY_NAME/FILE_NAME
The simple steps are:
Delete the clion folder you downloaded and extracted.
Remove the cache in your home directory using the command: sudo rm -r ~/.Clion
You also need to remove the settings in /home/user/.config/JetBrains.
You also need to remove the settings that are stored in hidden directories under ~/. That's it for Unix/Linux.
All of CLion's binaries are stored inside the folder you deleted.
But CLion sets up preferences at first launch, and you may have a menu icon that points nowhere.
I suggest you run something like find ~ -iname "*clion*" and investigate what is found. If you are using the GNOME 2 or MATE desktop you will certainly find .desktop files, which are the icons you are looking for.
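As a sketch of what that cleanup might look like (the exact dot-folder names vary by CLion version, so treat these paths as assumptions to adjust against what find reports):
rm -rf ~/.CLion2016.2                                                     # per-version settings/cache folder (name varies)
rm -rf ~/.config/JetBrains ~/.cache/JetBrains ~/.local/share/JetBrains    # newer JetBrains locations, if present
rm -f ~/.local/share/applications/jetbrains-clion.desktop                 # stale menu entry, if present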
If you used snap to install it, you can uninstall it with
sudo snap remove --purge clion
So I cd into my folder and run ls:
ls
cgi-bin wp-comments-post.php wp-mail.php
googlec3erferfer228fc075b.html wp-commentsrss2.php wp-pass.php
index.php wp-config-sample.php wp-rdf.php
license.txt wp-config.php wp-register.php
php.ini wp-content wp-rss.php
readme.html wp-cron.php wp-rss2.php
wp-activate.php wp-feed.php wp-settings.php
wp-admin wp-includes wp-signup.php
wp-app.php wp-links-opml.php wp-trackback.php
wp-atom.php wp-load.php xmlrpc.php
wp-blog-header.php wp-login.php
(uiserver):u45567318:~/wsb454434801 >
What I want to do is zip all the files within this folder and then download the archive to my computer. I am really new to SSH, and this is a client's website, but I really want to start using the command line for speed. I have been looking at this reference http://ss64.com/bash/ to find the right commands, but I would really like some help from somebody, please.
Thanks
cd path/to/folder/foldername
zip -r foldername.zip * (the -r flag makes zip recurse into any subdirectories)
Please try this; it should solve your problem.
If you are in the directory itself, then:
zip -r zipfilename.zip *
Go to the folder's parent directory using the cd command, then:
zip -r foldername.zip foldername
e.g. zip -r test-bkupname.zip test
Here test is the folder name.
tar zcvf ../my_directory.tar.gz .
will create the my_directory.tar.gz file in the parent directory.
scp ../my_directory.tar.gz username@your-ip:/path/to/place/file
will transfer the file to your computer.
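Note that running scp on the server like this pushes the archive out, which only works if your own machine accepts SSH connections. More commonly you would run scp from your own computer and pull the file down; server-ip and the remote path here are placeholders:
scp username@server-ip:/path/to/my_directory.tar.gz .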
Looks like this is the webroot directory.
Why not zip the web root (httpdocs / html / whatever) from the directory above it, then move the archive into the website space and download it from there?
i.e. go into the directory above the web root. For example, if your web root is /var/www/html/ go into /var/www/ and run the following commands:
zip -r allwebfiles.zip html
mv allwebfiles.zip html/allwebfiles.zip
Then in your web browser go to http://mydomain.com/allwebfiles.zip and just download that file.
When extracting, you'd just need to either extract into /var/www/ OR extract into webroot and move all files up one level.
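For instance, the first of those two restore options might look like this (assuming the same /var/www/html layout as above):
cd /var/www
unzip allwebfiles.zip        # recreates the html/ directory and everything inside it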
Use the following command in an FTP client session to download multiple files at one time:
mget ./*
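As a usage sketch with lftp (the host, credentials, and remote path are placeholders):
lftp -u "$USERNAME,$PASSWORD" "$HOST" -e "cd /path/on/server; mget ./*; bye"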