I'm trying to use wget for an FTP download (with authentication).
This is the command I used to download the file bat.bat into the AppData directory:
wget -r --ftp-user="user" --ftp-password="pass" ftp://website.com/bat.bat -P %appdata%
The file ended up inside a new folder created by wget and named after the host (website.com). When I checked the AppData directory, I found my file here:
C:\Users\ev\AppData\Roaming\website.com\bat.bat
I don't need it to create a new directory; I need the file here:
C:\Users\ev\AppData\Roaming\bat.bat
Try the -nH parameter:
wget -r -nH --ftp-user="user" --ftp-password="pass" ftp://website.com/bat.bat -P %appdata%
From wget --help:
Directories:
-nd, --no-directories don't create directories.
-x, --force-directories force creation of directories.
-nH, --no-host-directories don't create host directories.
--protocol-directories use protocol name in directories.
-P, --directory-prefix=PREFIX save files to PREFIX/...
--cut-dirs=NUMBER ignore NUMBER remote directory components.
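Since only a single file is being fetched here, another option (a sketch using the same credentials and path as in the question) is to drop -r entirely, or to keep -r but add -nd so no directories are created at all:
wget --ftp-user="user" --ftp-password="pass" ftp://website.com/bat.bat -P %appdata%
wget -r -nd --ftp-user="user" --ftp-password="pass" ftp://website.com/bat.bat -P %appdata%
Either way the file should land directly in %appdata% as bat.bat.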
On Windows it is usually stored in the %USERPROFILE%\ssh or
%USERPROFILE%\.ssh folder.
However, I do not see the .ssh folder when going to %USERPROFILE%.
Is it possible to create the .ssh folder and the known_hosts file myself?
Yes, this is expected.
In a CMD session you can do:
cd "%USERPROFILE%"
mkdir .ssh
From there, assuming you have ssh-keygen in your PATH (which is included in Git For Windows for example), you can type:
ssh-keygen -t rsa -P ""
That will generate a key in the default path ~/.ssh(/id_rsa[.pub]), with ~/.ssh being translated to %USERPROFILE%\.ssh.
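For the known_hosts part specifically, you normally don't have to create it by hand: the first ssh connection offers to add the host for you. If you want to pre-populate it, here is a sketch using ssh-keyscan (also shipped with Git for Windows; example.com stands in for your real host):
ssh-keyscan example.com >> "%USERPROFILE%\.ssh\known_hosts"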
I recently set up a LAMP stack on Ubuntu 14.04 for my web server, working through Digital Ocean. These are the steps I went through:
On my local machine I logged in to my web server with
sftp user@web_server_ip
Then
sftp> cd /var/www/html
How would I go about getting the files for the site from my local machine, and how would I transfer them?
I know that I have to use the get and put commands.
I'm just confused about what's considered local and what's considered remote if I'm logged into the remote server from my local machine. Am I overthinking it?
This is the tutorial I'm trying to follow: How To Use SFTP to Securely Transfer Files with a Remote Server
Edit:
So I tried moving a whole directory from my local machine, and this is what I ended up with:
scp -r /path/directory_name name@ip_address:/var/www/html
scp: /var/www/html/portfolio.take7: Permission denied
Should I be changing permissions with sudo before running scp -r?
Edit2:
I have also tried
Where_directory_is$ scp -r /path/directory_name name@ip_address:/var/www/html
/var/www/html: No such file or directory
It might be easier to start with SCP, which allows you to copy files with one command. For example, if you had a local file /path/filename.css and wanted to transfer it to your server, you could use the following command on your local machine:
scp /path/filename.css username@remote_hostname_or_IP:~
This command copies the local file and transfers it to the home directory of the username on the remote server using SSH. You can then SSH in (ssh username@remote_hostname_or_IP) and do what you need with the file sitting in your home directory, such as move it to the proper Apache directory.
Once you start to get more comfortable, you can switch to sftp if you like.
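To clear up the local/remote confusion directly: at the sftp> prompt, get copies a file from the server to your machine, put copies from your machine to the server, and lcd changes your local working directory. A sketch of a session (paths and names are examples only):
sftp user@web_server_ip
sftp> lcd /path/to/your/site      (local folder holding your files)
sftp> cd /var/www/html            (remote target directory)
sftp> put -r directory_name       (upload a whole directory; -r needs a reasonably recent OpenSSH sftp)
sftp> get filename.css            (download a single file)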
Update
Here is how to set up your Apache permissions. Let's say you have an account named you on the Linux computer running Apache, and we'll say its IP is 192.168.1.100.
On your local machine, create this shell script, secure.sh, and remember that shell scripts need execute permission (chmod +x secure.sh). Fill it with the following contents:
#!/usr/bin/env bash
# Lockdown the public web files
find /var/www -exec chown you:www-data {} \;
find /var/www -type d -exec chmod -v 750 {} \;
find /var/www -type f -exec chmod -v 640 {} \;
This shell script is setting the permissions for anything in the /var/www/ directory to be 750 for the directories and 640 for the files. This gives you complete read/write permissions for the files and www-data (which is the account for Apache) read permissions. Run this anytime you have uploaded files to ensure the permissions are always set correctly.
Next, SSH into your remote computer and go to the /var/www/html directory. Ensure that the ownership is not set to root. If it is, scp the secure.sh file into your remote computer, become root and run it. This only needs to be done once, so you can remotely set the permissions.
Now you can copy directly to /var/www/html with the scp -r command, run on your local computer from the top of the directory you wish to copy:
scp -r ./ you@192.168.1.100:/var/www/html/
Then run this command to remotely run the secure.sh shell script and send the output to out.txt:
ssh you@192.168.1.100 -p 23815 ./secure.sh > out.txt
Then cat out.txt to see that the file permissions changed accordingly.
If this is a public-facing computer, then you should add an SSH key to your scp connection. You can use this tutorial to find out more about generating your own keys; it is quite easy. To use the key, you only need to add -i private_key_file to your scp and ssh commands. Lastly, it would actually be safer to keep the /var/www files owned by root, SSH into the computer, su to become root, and then run secure.sh as root (with the owner changed to root in the shell script). It all depends on the level of security you need to worry about. If it is a development computer (which is what I am assuming), there are no worries.
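For example, the two commands above would then look roughly like this (a sketch; ~/.ssh/id_rsa is just the default key name, use whatever you generated):
scp -i ~/.ssh/id_rsa -r ./ you@192.168.1.100:/var/www/html/
ssh -i ~/.ssh/id_rsa you@192.168.1.100 -p 23815 ./secure.sh > out.txt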
For folders use
scp -r root@yourIp:/home/path/ /pathOfDirectory/
For a single file (no -r needed)
scp root@yourIp:/home/path/fileName /pathOfDirectory/fileNameCopied
I have a problem with my Docker installation. When I launch docker-compose up I get this error:
front_1 | /var/lock/apache2 already exists but is not a directory owned by www-data.
front_1 | Please fix manually. Aborting.
I get this error because I added this line to my Dockerfile:
RUN usermod -u 1000 www-data
But if I delete this line, my Symfony project doesn't work with Docker.
Do you have any ideas to solve my problem?
Best regards
As I see it, you are trying to change the UID of the www-data user inside Docker to match your host machine user's UID (yours), so you can open the project files in your IDE.
This introduces file permission problems for the apache2 service, which can no longer read its own files (config, pid, ...), simply because it is not the same user anymore.
A quick and dirty solution is to change only the owner of the Symfony project files to UID 1000, but keep the group (GID) as www-data. This applies only to a dev machine; otherwise you don't need it. Run the command inside the container.
chown -R 1000:www-data /home/project
You can create a bash alias inside the container to have it at hand.
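For example (a sketch; the alias name fixperms and the /home/project path are only illustrations), add something like this to the container user's ~/.bashrc:
alias fixperms='chown -R 1000:www-data /home/project'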
The other option is to use ACLs, which set permissions on the existing files and folders and are inherited by newly created files under the given folder. This could be put in a bootstrap script inside the container, but only for DEV mode. This way you won't need to keep running chown.
chown -R 1000:www-data /home/project #set for existing files
/usr/bin/setfacl -R -m u:www-data:rwx -m u:0:rwx -m u:1000:rwx /home/project
/usr/bin/setfacl -dR -m u:www-data:rwx -m u:0:rwx -m u:1000:rwx /home/project
Each -m is for a different user: the first is www-data (apache2), the second is 0 (root) and the third is 1000 (you).
Remember that a UID can change at any time, so this could create a security hole if the mentioned users don't have the expected UIDs.
I used the second method only for folders where PHP (via apache2) sets permissions (uploaded files, cache, ...) but the host user still needs to access the files.
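A minimal sketch of such a dev-only bootstrap script (the file name bootstrap.sh and the final exec line are assumptions, adjust them to whatever your image normally runs):
#!/usr/bin/env bash
# dev only: fix ownership of existing files, then set default ACLs so new files inherit them
chown -R 1000:www-data /home/project
/usr/bin/setfacl -R -m u:www-data:rwx -m u:0:rwx -m u:1000:rwx /home/project
/usr/bin/setfacl -dR -m u:www-data:rwx -m u:0:rwx -m u:1000:rwx /home/project
# hand control back to the container's usual process (command name assumed)
exec apache2-foreground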
I am downloading a folder consisting of either .j2k or .png files using wget.
While downloading the folder, if the user requests a .j2k file and that .j2k file does not exist in the folder, I want to download the .png file by default.
i.e. I want: download the .j2k if present || otherwise download the .png file.
I have used this:
wget -d any.com -i /folder -r -l 1 -nc -A j2k,png
-d: download from this domain
-i: download from this folder
-r: recursive
-l 1: follow only 1 link deep
-nc: no clobber = download only if the file doesn't exist
-A: accept/download only *.j2k and *.png
But using this, it downloads both the .j2k and the .png files.
Any help will be appreciated.
Referred Links:
wget if else download condition
wget manual
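One common way to get the "j2k if present, otherwise png" behaviour is to rely on wget's exit code in a small shell one-liner, as a sketch (assuming the files can be requested by direct URL; the host and file names are placeholders taken from the question):
wget "http://any.com/folder/file.j2k" || wget "http://any.com/folder/file.png"
The second wget only runs if the first one fails (for example with a 404).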
So I cd into my folder and run:
ls
cgi-bin wp-comments-post.php wp-mail.php
googlec3erferfer228fc075b.html wp-commentsrss2.php wp-pass.php
index.php wp-config-sample.php wp-rdf.php
license.txt wp-config.php wp-register.php
php.ini wp-content wp-rss.php
readme.html wp-cron.php wp-rss2.php
wp-activate.php wp-feed.php wp-settings.php
wp-admin wp-includes wp-signup.php
wp-app.php wp-links-opml.php wp-trackback.php
wp-atom.php wp-load.php xmlrpc.php
wp-blog-header.php wp-login.php
(uiserver):u45567318:~/wsb454434801 >
What I want to do is zip all the files within this folder and then download the archive to my computer. I am really new to SSH and this is a client's website, but I really want to start using the command line for speed. I have been looking at this reference http://ss64.com/bash/ to find the right commands, but I would really like some help from somebody, please.
Thanks
cd path/to/folder/foldername
zip -r foldername.zip *
(-r recurses into any subdirectories, and * picks up everything in the current directory)
Please try this; it should solve your problem.
If you are in the directory itself:
zip -r zipfilename.zip *
Otherwise, go to the folder's parent directory using cd and run:
zip -r foldername.zip foldername
Ex: zip -r test-bkupname.zip test
Here test is the folder name.
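Once the zip exists on the server, you can also pull it down from your own machine with scp instead of the browser (a sketch; user and host are placeholders, and ~/wsb454434801 is the directory shown in the prompt above):
scp user@your-server:~/wsb454434801/foldername.zip .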
tar zcvf ../my_directory.tar.gz .
will create the my_directory.tar.gz file.
scp ../my_directory.tar.gz username@your-ip:/path/to/place/file
will transfer the file to your computer.
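On your computer you can then unpack it with tar as well (a sketch; point -C at wherever you want the files to end up):
tar zxvf my_directory.tar.gz -C /path/to/extract/into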
Looks like this is the webroot directory.
Why not zip the directory above (httpdocs / html / whatever), move the zip into the website space, and download it from there?
i.e. go into the directory above the web root. For example, if your web root is /var/www/html/, go into /var/www/ and run the following commands:
zip -r allwebfiles.zip html
mv allwebfiles.zip html/allwebfiles.zip
Then in your web browser go to http://mydomain.com/allwebfiles.zip and just download that file.
When extracting, you'd just need to either extract into /var/www/ OR extract into webroot and move all files up one level.
Use the following sftp command to download multiple files at one time:
mget ./*
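Note that mget works from inside an interactive sftp (or ftp) session, not straight from your shell, so the full sequence looks roughly like this (host and directories are placeholders):
sftp user@your-server
sftp> lcd /local/dir
sftp> cd /remote/dir
sftp> mget ./*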