How to execute a sudo chown command manually - SSH

I am trying to install SuiteCRM on shared hosting.
My GoDaddy hosting has limited SSH, which doesn't allow me to use sudo commands.
Is there any manual/other way for me to achieve the following permission setting?
sudo chown -R www-data:www-data .

Web hosting companies usually work using suPHP, which means that the user PHP runs as is the same user that owns the files.
For example, if you log in to the hosting account as userABC, all the files will be owned by userABC.
In other words, you don't need to change the ownership; it should work fine as long as you uploaded the files with that account.
Also, it is not possible for a non-root, non-sudo-enabled user to execute sudo commands.
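Since your hosting user already owns the uploaded files, you can verify ownership and adjust permissions without sudo. A minimal sketch (the SuiteCRM path is an assumption; adjust it to your account layout):
# check who owns the files (it should be your hosting user)
ls -la ~/public_html/suitecrm
# permissions on files you already own can be changed without sudo
find ~/public_html/suitecrm -type d -exec chmod 755 {} \;
find ~/public_html/suitecrm -type f -exec chmod 644 {} \;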

Related

Running docker commands with a user without root privileges (possibly with the www-data user of Apache)

I am developing a simple Flask application (configured with an Apache web server) which provides a web interface for Docker management. My Apache server runs as the 'www-data' user and uses the same user for all of its API operations.
But I get a 'Permission denied' error for the following,
docker images
docker run, etc.
as it doesn't allow the 'www-data' user to run the above commands.
Can you please provide a suggestion on using the 'www-data' user for Docker operations?
I don't want to add the 'www-data' user to the sudoers list.
Would adding the user to the docker group alone be a proper solution?
Or please suggest a best-practice solution for this.
Thanks
GuruPrasad
It would be easier, clearer, and no less dangerous to tell Apache to run your process as root.
Remember that, if you can run any Docker command at all, you can trivially get unrestricted root-level access to anything on the system. For example, if your tool decides it really does want www-data to be in the host's sudoers list, it can
docker run --rm -v /:/host busybox \
  sh -c 'echo www-data ALL = (ALL) NOPASSWD: ALL >> /host/etc/sudoers'
Depending on what your management tool does, it is potentially offering the same unprotected root-level access to the host to anyone who can reach the web page. Even if it isn't, you need to be extremely careful with how you invoke Docker (another SO answer I was looking at, for instance, had the potential to root the system if a user could create a directory with an arbitrary name and run the script from there).
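That said, if you accept that Docker access is effectively root access, the usual way to grant it is group membership rather than the sudoers list. A sketch, assuming a Debian/Ubuntu-style host where the Apache service is named apache2:
# make www-data a member of the docker group (this is root-equivalent!)
sudo usermod -aG docker www-data
# restart Apache so its processes pick up the new group membership
sudo systemctl restart apache2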

Create an SSH key for other account on Google Cloud Platform

I have installed the Cloud SDK for Google Cloud. I've logged in using auth, which redirected me to the Gmail login. I created the SSH key and even logged in over SFTP using Filezilla.
The problem is, when I log in using the Gmail auth, the SDK shell (or PuTTY?) logs me into an account that is not admin. It has created another SSH user account (named 'Acer', after my PC) and logs me into it. Because of this, SFTP starts in the /home/Acer folder. I want access to the /home/admin/web folder, but I don't have it now.
How can I create an SSH key for the admin account so that I can gain access to the folder mentioned above? Otherwise, is it possible to grant 'Acer' permission to access all the folders?
I have a few suggestions.
First a bit of background. If you run this command on your home workstation:
sudo find / -iname gcloud
You'll discover a gcloud configuration folder for each user on your home workstation. You'll probably see something like this:
/root/.config/gcloud
/home/Acer/.config/gcloud
If you change directory into /home/Acer/.config/gcloud/configurations you'll see a file named 'config_default'. This file will contain the default account to use for that user ('Acer').
Because you performed gcloud auth login as that user, and selected your Gmail account during that process, the config file for that user contains that Gmail ID/account. If you would like a user named 'admin' to log into your project, you could try adding a user named 'admin' to your home workstation and then, before attempting gcloud auth login, switching to user 'admin' on your home workstation. This will generate a gcloud configuration on your home workstation for user admin, and propagate SSH keys etc.
If you want to create ssh keys manually there's some useful info here.
(For what it's worth, if you decide to use gcloud compute ssh from your home workstation to log into your instance, you can specify in the command the user you would like to log in as. For example: gcloud compute ssh admin@INSTANCE_NAME.)
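A minimal sketch of that workflow (the local 'admin' user, INSTANCE_NAME, and the zone are assumptions; substitute your own):
# on your home workstation, create and switch to a local 'admin' user
sudo useradd -m admin
sudo -i -u admin
# authenticate gcloud as that user, then SSH in as 'admin'
gcloud auth login
gcloud compute ssh admin@INSTANCE_NAME --zone us-central1-a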
I want access to the /home/admin/web folder, but I don't have it now.
Even if you are logged into the machine as a different user (in this case 'Acer'), the folder /home/admin/web should still exist on the instance if it existed previously. If you land in the folder /home/Acer, have you tried changing directory to the folder above and then listing the folders to see if /home/admin/ exists?
For example, from /home/Acer run:
$ cd ..
then
$ ls
You should be able to see /home/admin/.
Otherwise, is it possible to grant 'Acer' the permissions to access all the folders?
Yes, this is also possible. Access the instance as the project owner (the easiest way is to log into the Console as the owner of the project and use the SSH functionality in the Console to access the instance), then run this command:
$ sudo chown -R Acer:Acer /home/admin/web
This will make user 'Acer' the owner of the directory /home/admin/web and all files/directories below it (thanks to the -R switch).
Now when you next access the instance as user 'Acer', you'll be able to access /home/admin/web by running the following, and you'll also have read/write capabilities:
$ cd /home/admin/web

What is the meaning of `jenkins ALL=(ALL) NOPASSWD: ALL` and does it create a security issue?

I saw this is a way to resolve an authentication issue when a Jenkins build script uploads to an AWS instance, but what exactly does this mean? How should I execute it? I can't execute it in a terminal since jenkins is "command not found", but I do have Jenkins running locally.
This:
jenkins ALL=(ALL) NOPASSWD: ALL
Means that the user named jenkins may run ALL commands, on any host, as any target user, without entering a password.
It is a configuration line for the program sudo, which allows users to run programs with the security privileges of another user (by default, the superuser).
It definitely creates a security issue, since it effectively gives "root" privileges to that user; you need to decide how you will deal with that.
You can read more about sudo (sudoers) here: https://www.sudo.ws/intro.html
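You don't execute the line itself; it belongs in the sudoers configuration. A sketch of how it is usually added (the drop-in file name is an assumption; always go through visudo so a syntax error can't lock you out of sudo):
# open a drop-in sudoers file with syntax checking
sudo visudo -f /etc/sudoers.d/jenkins
# add either the broad line from the question...
jenkins ALL=(ALL) NOPASSWD: ALL
# ...or, safer, limit it to the specific command Jenkins needs (path is hypothetical)
jenkins ALL=(ALL) NOPASSWD: /usr/bin/rsync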

Can't add files to the website using Filezilla

I've been working with the server for only 2 days, so I am sorry if this is a simple question. I looked everywhere, but didn't find an answer.
So I have a Google Compute Engine account and I have owner privileges. When I run
gcloud compute ssh instance --zone us-central1-a
it works, but it creates a key with a username that it takes from my computer account.
So when I am in the Google shell I can add or remove files using sudo. But when I go to Filezilla I have to use the SSH key file and the username from that key. And the only folder that is accessible with that username is its own folder. I am not sure what the problem is, so I gave all the facts I could.
I'm not entirely sure I'm answering the right question, but I'll take a stab at it. The SSH keys created by/used by gcloud are specific to a particular Linux user on your VM. As you note, you can use sudo when SSH'd in to edit files/directories owned by different users; the way this works is that you (roughly speaking) temporarily switch users to root when doing the file edit.
An SFTP client like Filezilla isn't going to be able to switch users that way, so you'll need a different technique to edit files with Filezilla.
I suggest SSH-ing in to your VM and using chmod or chown to change the ownership of the files/directories that you want to use with Filezilla. Alternatively, you could use usermod -aG to add your username to a group that can edit the files you care about.
Exactly what you'll do depends on the security policy you want to enforce for your files, but there are lots of decent options. The key test to run: can you get to a state where you can edit the files when logged in with SSH, without using sudo? If so, then you should be able to edit the files with Filezilla.
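A sketch of both options (the username 'acer' and the path /var/www/html are assumptions; substitute your own):
# option 1: hand ownership of the web directory to your SSH user
sudo chown -R acer:acer /var/www/html
# option 2: keep ownership, but give a shared group write access
sudo groupadd webedit
sudo usermod -aG webedit acer
sudo chgrp -R webedit /var/www/html
sudo chmod -R g+rw /var/www/html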

Phing runs under a user with limited permissions

I have not used Phing before, but would like to use it to automate my deployment process. Currently I log in as myuser, and Apache runs as www-data. All my application code is owned by myuser, but other (cache) files are generated by www-data.
I currently solve this by using sudo to remove these files. I would like to keep the application code owned by myuser, because it's easier to access the files via SSH. I wouldn't like Phing to run with superpowers, because at this moment I don't trust this automated tool yet.
What is the best practice to use phing with limited file permissions?
I had the same issue. Apache is running as www-data:www-data, and files created by the web server are 644 and directories 755.
I solved it by:
adding the user running Phing to the www-data group
adding umask 002 to the /etc/apache2/envvars file, see link
Phing is now able to remove the directories and files created by Apache; a sketch of those two steps follows.
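A sketch of those steps (the username and the Debian/Ubuntu Apache layout are assumptions):
# let the user running Phing modify group-owned files
sudo usermod -aG www-data myuser
# in /etc/apache2/envvars, add the following so Apache creates group-writable files:
umask 002
# restart Apache so the new umask takes effect
sudo systemctl restart apache2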
As I understand it, the problem is that Phing can't do anything with the cache files, since they were created by your Apache user (www-data) and you are running Phing as myuser. It sounds to me like you just need to change the permissions on the cache files so that myuser has full permissions. How to do this will depend on how your application is written, but it would be something along the lines of chmod/chown-ing the files after creation, or creating them with a umask that grants myuser permissions.