When I run 'composer update' I get this error:
Writing lock file
Generating autoload files
[ErrorException]
chmod(): Operation not permitted
It works just fine with sudo, but then I have to reset the owner and permissions, which is really annoying.
I also tried changing the owner of ~/.composer to www-data and setting its permissions to 777, with no effect.
I'm using Ubuntu 16.04 LTS with Apache/2.4.18 and PHP 7.0.26.
Any idea?
chmod will only work without sudo if the owner of the file is the same as the one running the composer update command.
The problem is that the error message doesn't tell you which file it's trying to chmod.
This depends on the project.
Running the command in verbose mode will give you more details:
composer update -v
In my case, it gave me a stack trace, showing which file called chmod(), and the line number.
However, it didn't include the path of the file passed to chmod().
I had to add a temporary echo right before the call to chmod() to see it (and remove it again afterwards).
Once you know which file/folder is responsible for the error message, change its owner with chown.
In my case (Magento 2.3), the culprit was the bin/magento file, which needs to be owned by the user running the composer commands.
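To make this concrete, a minimal sketch of the fix for that case (myuser is a placeholder for whatever account runs composer; adjust the path if a different file shows up in your trace):
composer update -v                       # the verbose stack trace shows where chmod() is called
sudo chown myuser:myuser bin/magento     # give the file back to the user running composer
composer update                          # should now complete without sudo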
So I'm running into this error every time I run a command in the terminal (inside Visual Studio Code) while doing anything within a git repository.
[Screenshot: terminal Git error message]
I did some digging and found out that the owner of the .config file is "root" and not my user "username" (see second screenshot).
[Screenshot: root is the owner of the .config file]
Do you know how to change the ownership to my user so I stop getting this warning message? I ran a command I found on here, "sudo chown -R $(username) .config", but it wasn't recognized; it then asked me for a password and wouldn't let me type anything, so I closed the terminal.
I'm new to all this and coming from a construction background and going back to school so layman's terms would be appreciated.
Thanks in advance!
sudo chown $USER path/to/.config
is the correct command. sudo asks for your password (the same one you use to log in to the $USER account); for security reasons the password isn't echoed, so just type it blindly and press Enter.
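If the whole directory and its contents are owned by root (not just one file), the recursive form may be what you need; a sketch assuming the directory in the warning is ~/.config in your home folder:
sudo chown -R $USER ~/.config    # recursively give the directory back to your account
ls -ld ~/.config                 # the owner column should now show your username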
I have a remote headless server (macOS Big Sur 11.3.1). When I log in via ssh (with either the root user or a regular user), I am unable to save to the crontab.
When I use the following command:
% crontab -e
I can see a cronjob that I saved when I was logged in locally (not via ssh). After editing and exiting the crontab, I get the following error:
crontab: installing new crontab
crontab: tmp/tmp.1028: Operation not permitted
crontab: edits left in /tmp/crontab.kKYx3tt4c1
While logged into ssh, I have instead tried to edit the crontab with this command:
% sudo crontab -e
To my surprise, the cronjob that I saved when logged in locally is not listed. It is as if it is a different crontab for a different user. In any case, I can't save to the crontab when using sudo either. It gives the exact same error as above.
I have followed the advice of a few internet posts suggesting allowing the cron and sshd executables "Full Disk Access" through the Mac System Preferences. However, the same error persists.
I'm not sure what to try next.
So the issue was solved by also giving sshd-keygen-wrapper Full Disk Access. Don't ask me why it needs that, but it is working now. I hope this helps anyone with the same issue.
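As a side note on the "it is as if it is a different crontab" observation above: crontab -e really does edit a per-user file, so root and your regular user each have their own. A quick way to compare them once saving works again (these commands only list, they change nothing):
crontab -l          # the crontab of the user you are logged in as
sudo crontab -l     # root's crontab, a separate file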
I am pretty new to PHP and Ubuntu. I have two servers set up, one for development and one for staging. On the dev machine I can use the at command without a problem, but on staging I get a permissions error. The at.deny (and at.allow) files are identical, so it must be another permissions issue.
Any clues?
I see that on the staging server I can only use the at command as root. How can I fix this so I can use the at command as www-data? Again, I checked the at.allow and at.deny files; they are not the problem here.
1) Check whether the file /etc/at.allow exists.
If it does, just add your user on a new line.
If it doesn't, look for your user in /etc/at.deny and remove it or comment it out.
2) Restart the at daemon:
sudo service atd restart
3) Check:
at -l
or
sudo -u myuser at -l
The permission error should no longer appear.
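Since the goal in this question is to let the web server user submit jobs, a concrete sketch for that case (assuming Apache runs as www-data, the Ubuntu default):
echo "www-data" | sudo tee -a /etc/at.allow   # append www-data (note: if this creates the file, only listed users may use at)
sudo service atd restart
sudo -u www-data at -l                        # should now list jobs (or nothing) without a permission error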
I own a QNAP TS-219P and I want to set up S3 backups on it manually using s3cmd.
I did quite a bit of research on this, and here are the references I got:
http://web.archive.org/web/20091120211330/http://codemonkeybrown.com/qnaps3.html
http://wiki.qnap.com/wiki/Running_Your_Own_Application_at_Startup
http://wiki.qnap.com/wiki/Add_items_to_crontab
http://blog.wingateuk.com/2013/03/cloud-backup-on-qnap-nas.html?showComment=1413660445187#c8935766892046800936
I'm trying to get the s3cmd to work on my TS-219P.
I got everything to work on the command line, including running the script file (s3-backup.sh):
#!/bin/bash
/share/maintenance/s3cmd-1.5.0-rc1/s3cmd --rr sync -rv /share/all-shared-folders/emilie/ s3://kingjim-backup/kingjim-nas/emilie/ >> /share/maintenance/log/s3cmd/backup_`date "+%Y%m%d-%H-%M"`.log
(I also tried #!/bin/sh as the shebang, and running s3cmd through Python by prefixing the command with /usr/bin/python.)
If I run using the SSH command prompt, it seems to work perfectly.
The problem, though, is the cron job. I can confirm the cron job triggered and ran, because my log file (the one above) was generated, but the log is always empty, even though I'm sure some new files were created or modified.
This is my cronjob task:
14 3 * * * /share/maintenance/s3-backup.sh 2>&1 | logger
I've tried a number of variations on the above, but couldn't figure out what was missing.
I feel like some dependency is missing when the script runs from cron, compared to when I run it from the command prompt. But I don't know how to debug crontab.
It turned out that the problem was that s3cmd could not find its configuration file when run from cron.
So the fix was simply to copy the .s3config file to a safe shared folder and then call s3cmd with the "--config" parameter followed by that file's path.
Like this:
/share/maintenance/s3-backup/s3cmd/s3cmd --config /share/maintenance/s3-backup/s3cmd.config --rr sync -rv /share/MD0_DATA/ s3://xxx-backup/xxx-nas/ >> /share/maintenance/s3-backup/logs/backup_`date "+%Y%m%d-%H-%M"`.log 2>&1
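For completeness, the updated s3-backup.sh from the question would then presumably look something like this (paths are the ones from this thread; adjust to your own share layout):
#!/bin/sh
# Point s3cmd at an explicit config file so the job does not depend on cron's environment
CONFIG=/share/maintenance/s3-backup/s3cmd.config
LOG=/share/maintenance/s3-backup/logs/backup_`date "+%Y%m%d-%H-%M"`.log
/share/maintenance/s3-backup/s3cmd/s3cmd --config "$CONFIG" --rr sync -rv /share/MD0_DATA/ s3://xxx-backup/xxx-nas/ >> "$LOG" 2>&1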
Though I have followed the usual steps for using the dotCloud CLI under Cygwin, dotcloud push fails in all cases: --rsync, --hg, and --git.
I am on Windows 8 and Cygwin.
How can I push successfully?
Sample output:
me@host /cygdrive/d/project
$ dotcloud push --rsync
==> Pushing code with rsync from "./" to application myapp
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at /home/lapo/package/rsync-3.0.9-1/src/rsync-3.0.9/io.c(605) [sender=3.0.9]
me@host /cygdrive/d/project
$ dotcloud push --git
==> Pushing code with git from "./" to application myapp
Permission denied (publickey,password).
fatal: The remote end hung up unexpectedly
me@host /cygdrive/d/project
$ dotcloud push --hg
==> Pushing code with mercurial from "./" to application myapp
abort: no suitable response from remote hg!
Error: Mercurial returned a fatal error
You may be running into a bug in Cygwin's group permissions. Vineet Gupta gives a workaround in his blog. The problem comes from the very strict permissions ssh expects on key files, and the solution is to set the permissions on the ssh key properly (to 600, rw by owner only). Cygwin seems to need the group ownership to be set manually as well (hence the chgrp step below).
Updating the steps to get the dotCloud CLI installed, including setting the permissions, leads to:
Start the Cygwin Setup.
Select default choices until you reach the package selection dialog.
Enable the following packages:
net/openssh
net/rsync
devel/git
devel/mercurial
python/python (make sure it’s at least 2.6!)
web/wget
After the installation, you should have a Cygwin icon on your desktop. Start it: you will get a command-line shell.
Download easy_install
wget http://peak.telecommunity.com/dist/ez_setup.py
Install easy_install
python ez_setup.py
You now have easy_install; let’s use it to install pip:
easy_install pip
Now install dotcloud (the CLI)
pip install dotcloud
Set up the CLI with your credentials. This will also download the ssh key.
dotcloud setup
New step: update the permissions on your dotCloud key:
chgrp Users ~/.dotcloud_cli/dotcloud.key
chmod 600 ~/.dotcloud_cli/dotcloud.key
Now you should be able to dotcloud push
If you have multiple dotCloud accounts, then you will need to repeat this process for each account, since each account has its own key. Also note that you shouldn't have to set these permissions manually, but it seems like the group ownership is sometimes the wrong default in Cygwin. Linux and OSX don't seem to show this problem, though the permissions must be 600 for all OSes, so it is worth checking.
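One quick way to do that check (a simple sanity check, not an extra required step):
ls -l ~/.dotcloud_cli/dotcloud.key    # should show -rw------- and, on Cygwin, group Users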