Local yum repo hosted on an SMB share

I currently have a few RHEL repositories saved on an SMB share.
I am authenticated to this share via a DOMAIN\user account.
An /etc/yum.repos.d/local.repo file points at the directory; in this case the line is:
baseurl=file:///run/user/1005/gvfs/smb-share:server=fileserver0,share=RepoShare/Repo
When I run a sudo yum command (presumably I am root and no longer the domain user?) I get the following error:
file:///run/user/1005/gvfs/smb-share:server=fileserver0%20share=RepoShare/Repo/repodata/repomd.xml:
[Errno 14] curl#37 - "Couldn't open file /run/user/1005/gvfs/smb-share:server=fileserver0%20share=RepoShare/Repo/repodata/repomd.xml"
Trying other mirror.
Does anyone have experience hosting a repo on an SMB share for Linux clients to pull from?
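(One likely culprit, for reference: /run/user/1005/gvfs is a per-user GVFS/FUSE mount, and FUSE mounts are by default not traversable by other users, including root, which is why sudo yum cannot open the file. A minimal sketch of a system-wide CIFS mount that root can read; the mount point, credentials file, and share details here are placeholders:)

# /etc/fstab entry for a system-wide mount (all paths hypothetical)
//fileserver0/RepoShare  /mnt/reposhare  cifs  credentials=/etc/repo-share.cred,ro  0  0

# /etc/yum.repos.d/local.repo pointing at the system-wide mount
[local-rhel]
name=Local RHEL repo on SMB share
baseurl=file:///mnt/reposhare/Repo
enabled=1
gpgcheck=0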

Related

RabbitMQ 3.6.10 does not allow guest access from remote

Recently I upgraded from Ubuntu 12.04 to Ubuntu 18.04. RabbitMQ 3.6.10 no longer allows guest access from remote hosts.
I searched online and tried this method:
1) Create a config file /etc/rabbitmq/rabbitmq.config with the contents
loopback_users = none
or
loopback_users.guest = false
2) Add the environment variable RABBITMQ_CONFIG_FILE pointing to /etc/rabbitmq/rabbitmq.conf.
3) Give administrator permission to guest.
It still fails with the error:
HTTP access denied: user 'guest' - User can only log in via localhost
It seems neither rabbitmq.config nor rabbitmq.conf is being read, and the RABBITMQ_CONFIG_FILE environment variable appears to be ignored as well: pointing it at a non-existent file changes nothing.
I can confirm rabbitmq-env.conf is used.
How should I allow remote access for guest?
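(For reference: the key = value syntax above is the new-style rabbitmq.conf format introduced in RabbitMQ 3.7, while a 3.6.x broker only reads the classic Erlang-term /etc/rabbitmq/rabbitmq.config. A sketch of the classic format that lifts the loopback restriction for all users, guest included:)

%% /etc/rabbitmq/rabbitmq.config -- classic Erlang terms, trailing dot required
[
  {rabbit, [
    {loopback_users, []}
  ]}
].
%% then restart the broker, e.g.: sudo systemctl restart rabbitmq-server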

Rundeck permission denied issue while executing a job on a remote host machine

Earlier I installed Rundeck on my local machine and everything was working fine. Recently I installed Rundeck on a remote host machine where the ssh and sudo users are different and are not in the same group.
When I try to run a job (Python scripts), it throws the permission denied message below. Do I need to change the user-level details somewhere in a file? Please let me know.
/bin/sh: /tmp/4-10-host-machine-dispatch-script.tmp.sh: Permission denied
Result: 126
Failed: NonZeroResultCode: Result code was 126
That means the /tmp directory is restricted on your remote node (some server setups restrict it for security reasons). You can define a custom copy script path in multiple ways:
1) Node level: define the file-copy-destination-dir attribute in the resources.xml file, for example:
<node name="freebsd11" description="FreeBSD 11 node" tags="unix,freebsd" hostname="192.168.33.41" osArch="amd64" osFamily="unix" osName="FreeBSD" osVersion="11.3-RELEASE" username="youruser" ssh-key-storage-path="keys/rundeck" file-copy-destination-dir="/home/youruser/scripts"/>
2) Project level: go to Edit Configuration (Rundeck sidebar) > Edit Configuration > Edit Configuration File (upper-right button) and add this line:
project.file-copy-destination-dir=/home/youruser/scripts
3) Globally: stop the Rundeck service, add the following line to the framework.properties file (in /etc/rundeck), and start the Rundeck service again:
framework.file-copy-destination-dir=/home/youruser/scripts
Just make sure that the custom path is reachable and writable by the remote SSH user. You can check the full documentation here.

Copy files from a remote server which requires superuser privileges to local machine [duplicate]

This question already has answers here:
WinSCP connect to Amazon AMI EC2 Instance changing user after login to "root"
(6 answers)
Closed last year.
I am trying to use WinSCP to transfer files over to a Linux Instance from Windows.
I'm using the private key for my instance to log in to the Amazon instance as ec2-user. However, ec2-user does not have permission to write to the Linux instance.
How do I sudo su - to access the root directory and write to the Linux box, using WinSCP or any other file transfer method?
I know this is old, but it is actually very possible.
Go to your WinSCP profile (Session > Sites > Site Manager)
Click on Edit > Advanced... > Environment > SFTP
Insert sudo su -c /usr/lib/sftp-server in "SFTP Server" (note this path might be different on your system)
Save and connect
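(Worth noting: this approach only works if the SSH user can run sudo without a password prompt, since the SFTP session has no terminal to type one into. A minimal sudoers sketch; the username and file name are placeholders:)

# create with: sudo visudo -f /etc/sudoers.d/winscp-sftp
ec2-user ALL=(ALL) NOPASSWD:ALL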
AWS Ubuntu 18.04: there is an option in WinSCP that does exactly what you are looking for: the "SFTP Server" setting described above.
AFAIK you can't do that.
What I did at my place of work is transfer the files to your home (~) folder (or really any folder you have full permissions in, i.e. chmod 777 or variants) via WinSCP, and then SSH to your Linux machine and sudo from there to the destination folder.
Another solution would be to change the permissions of the directories you plan to upload the files to, so your user (which is without sudo privileges) can write to those dirs.
I would also read about WinSCP Remote Commands for further detail.
Usually all users have write access to /tmp.
Place the file in /tmp, then log in with PuTTY; from there you can sudo and copy the file.
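(A minimal sketch of this two-step workflow from the command line; the key, host, and file names are placeholders:)

# 1) copy to a world-writable staging directory, no sudo needed
scp -i key_pair.pem ssl-bundle.crt ec2-user@example-host:/tmp/
# 2) move it into place with sudo over SSH (sudo must not prompt for a password here)
ssh -i key_pair.pem ec2-user@example-host "sudo mv /tmp/ssl-bundle.crt /etc/ssl/certs/"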
I just wanted to mention that for SUSE Linux Enterprise Server 15 SP2 on an EC2 instance, the command to add to WinSCP's "SFTP Server" setting is:
sudo su -c /usr/lib/ssh/sftp-server
I didn't have enough reputation points to add a comment to the original answer, but I had to fish this out, so I wanted to add it.
SSH to FreePBX and run the commands below in your terminal:
sudo visudo -f /etc/sudoers.d/my_config_file
and add this line (replacing YourUserName with your SSH user):
YourUserName ALL=(ALL) NOPASSWD:ALL
sudo systemctl restart sshd
In WinSCP:
under Session > Login > Advanced > SFTP,
change "SFTP Server" to:
sudo /usr/libexec/openssh/sftp-server
I had the same issue, and I am not sure whether it is possible or not; the solutions above did not work for me.
As a workaround, I move the files to my HOME directory, then edit and replace the files over SSH.
Adding this answer, which helped me; it might not answer the actual question.
If you are using a password instead of a private key, please refer to this answer for a tested working solution on Ubuntu 16.04.5 and 20.04.1:
https://stackoverflow.com/a/65466397/2457076

Warning: Identity file /home/user/.ssh/id_rsa not accessible: No such file or directory

I'm using Deployer for deploying my code to multiple servers. Today I got this error after starting a deployment:
[Deployer\Exception\RuntimeException (-1)]
The command "if hash command 2>/dev/null; then echo 'true'; fi" failed.
Exit Code: -1 (Unknown error)
Host Name: staging
================
Warning: Identity file /home/user/.ssh/id_rsa not accessible: No such file or directory.
Permission denied (publickey).
First I thought it probably had something to do with this server's configuration, since I had moved the complete installation to another hosting provider. I then triggered a deployment to a server I had deployed to just fine in the past few days, but got the same error. This quickly turned my suspicion from the server to my local setup.
Since I'm running PHP in Docker (Deployer is written in PHP), I thought it might have had something to do with my ssh-agent not being forwarded correctly from my host OS to Docker. I tested this by using a fresh PHP installation directly on my OS (Ubuntu, if that helps); the same warning kept popping up in the logs.
When logging in with the ssh command everything seems to be alright. I still have no clue what's going on here. Any ideas?
PS: I also created an issue in Deployer's Git repo: https://github.com/deployphp/deployer/issues/1507
I have no experience with the library you are talking about, but the issue starts here:
Warning: Identity file /home/user/.ssh/id_rsa not accessible: No such file or directory.
So let's focus on that. Potential things I can think of:
Is the username really user? The warning says the file lives at /home/user. Verify that that really is the correct path; for instance, just ls the file. If it doesn't exist, you will get an error:
$ ls /home/user/.ssh/id_rsa
That will throw a No such file or directory if it doesn't exist.
If 1. is not the issue, then most likely the permissions are wrong for the user in the Docker container. If that's the case, then INSIDE the Docker container, change the permissions on id_rsa before you use it:
$ chmod 600 /home/user/.ssh/id_rsa
Now do stuff with the key...
A lot of SSH agents won't work unless the key is read-write accessible only by the user who is trying to run the ssh agent. In this case, that is the user inside the Docker container.
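(Since the question mentions agent forwarding into Docker: a minimal sketch of handing the host's agent socket to a container, assuming a Linux host; the image is a placeholder:)

# mount the host's SSH agent socket and point SSH_AUTH_SOCK at it
docker run -it \
  -v "$SSH_AUTH_SOCK:/ssh-agent" \
  -e SSH_AUTH_SOCK=/ssh-agent \
  ubuntu:22.04 bash
# inside the container (after: apt-get update && apt-get install -y openssh-client),
# ssh-add -l should list the host's keys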

Copying a file from local machine to Ubuntu 12.04 returns permission denied

How do I grant myself permission to transfer a .crt file from my local machine to the AWS Ubuntu 12.04 server?
I am using the following command from my machine and receiving a permission denied response.
scp -i /Users/me/key_pair.pem /Users/me/ssl-bundle.crt ubuntu@ec2-50-150-63-20.eu-west-1.compute.amazonaws.com:/etc/ssl/certs/
I am following Comodo's instructions; refer to the heading "Configure your nginx Virtual Host" in the link. I have not set anything up with regard to user permissions. This is a little new to me, and I would appreciate further sources of information.
I changed the permissions of the path on the server and transferred the file!
With reference to File Permissions, I gave the /etc/ssl/certs/ path "other write & execute" permission with this chmod command while ssh'd into the Ubuntu server:
sudo chmod o+wx /etc/ssl/certs/
Then, on my local machine, the following command copied a file from my directory and transferred it to the destination:
scp -i /Users/me/key_pair.pem /Users/me/ssl-bundle.crt ubuntu@ec2-50-150-63-20.eu-west-1.compute.amazonaws.com:/etc/ssl/certs/
It is the write permission you need; depending on your use case, apply the appropriate chmod command.
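(A cautionary note, not part of the original answer: leaving /etc/ssl/certs/ world-writable is risky, so you may want to revert the permission once the file is in place:)

sudo chmod o-wx /etc/ssl/certs/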
The simplest way to transfer files from local to EC2 (or EC2 to local) is FileZilla.
You can connect to your instance with FileZilla, then transfer files from local to server and vice versa.