Got an odd ServerPilot query, not specifically Craft related.
My client wants FTP access to a subfolder on the site. I'm loath to give them full access, so I created a user with access to /home/FTPUSER and symlinked to it from my /srv/users/serverpilot/apps/APPNAME/public/ folder, so that they cannot access the site's core system files, etc. from a script.
PHP files in this folder work fine, but the client now wants to be able to run CGI/Perl scripts from it. I have tried following the instructions at https://serverpilot.io/community/articles/how-to-create-a-cgi-bin-directory.html (updating the document root), but I can't get CGI or Perl scripts to run; they are returned as plain text instead.
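For reference, the Apache-side pieces that generally need to be in place for CGI execution look roughly like this; this is a generic sketch, not ServerPilot's actual configuration, and the path is just my app's:

# Generic Apache CGI setup (sketch only, not ServerPilot's config).
# ExecCGI allows CGI execution; FollowSymLinks matters here because
# the folder is reached through a symlink.
<Directory /srv/users/serverpilot/apps/APPNAME/public/cgi-bin>
    Options +ExecCGI +FollowSymLinks
    AddHandler cgi-script .cgi .pl
</Directory>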
Any thoughts?
I've launched WordPress on Google Compute Engine (via their automated launcher process). It installs quickly and easily, and by visiting the external IP displayed in my Compute Engine VM Instances Dashboard, I am able to access the fresh installation of WordPress.
However, when I scp an existing WordPress installation oldWPsite into /var/www/ and then replace my html directory
mv html htmlFRESH
mv oldWPsite html
my site returns a 'failed to open' error. Directory permissions and user:group ownership are identical.
What's more, when I return the directories to their original configuration
mv html oldWPsite
mv htmlFRESH html
Still, the error persists.
I am familiar with other hosting paradigms where I can easily switch the publicly served files by simply renaming directories. Is there something unique about Google Compute Engine? What is the best way to import existing sites, files, etc. into the Google Cloud environment?
Replicate
Install WordPress via Google Launcher on a micro VM.
Visit public IP of the VM instance.
SCP a fresh installation of WordPress to /var/www.
Replace the Google-installed html directory with the newly copied WordPress directory using mv commands (condensed in the shell sketch after these steps).
Visit public IP of the VM instance.
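Condensed into shell commands, and assuming the paths above, the swap looks roughly like this:

# Sketch, assuming the paths above. Run the scp from the local machine,
# the rest on the VM itself.
scp -r oldWPsite USER@EXTERNAL_IP:/var/www/
cd /var/www
sudo mv html htmlFRESH
sudo mv oldWPsite html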
===
Referenced Questions:
after replacing /var/www/html directory, apache does not work anymore
permission for var/www/html directory - a2enmod command unrecognized on new G-compute VM
The imported .htaccess file had an https redirect, which caused the failure, since https is not set up in a fresh launch of WordPress through GCE. Compounding the issue, the browser had cached that redirect, so the error persisted even after the previous site was moved back to its initial configuration.
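The kind of rule that causes this looks roughly like the following (a hypothetical example, not the actual file):

# Hypothetical .htaccess rule of the kind described: force https site-wide.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

Note that browsers cache 301 redirects aggressively, which is consistent with the error persisting after the directories were moved back.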
As usual, the solution involved investigating user error.
I have a Rails 3 application which has an attachment model and uses the Paperclip gem. Everything works fine in the development environment, but on the production server we cannot access any of the uploaded images. The images are in the folder where they are supposed to be, but when I try to reach them in a browser I simply get the 404 page.
The upload folder is located under the public folder and is called "uploads".
I can access this: "app_url/uploads/test.html", which I manually created to see if it works.
But I cannot access this: "app_url/uploads/test.jpg", which I uploaded within the application via Paperclip.
I can only guess this has something to do with the server configuration, but I'm not an expert and need some help with it.
Thanks
UPDATE
I've just realised that uploaded files belong to "nobody", and when I manually change the owner to "root" it seems to work fine. So I need to find a way to tell Paperclip to make the files belong to "root".
It's not a good idea for a web application to be able to write files as root. File permissions are derived from the process writing the files. In case you're using Passenger, there's the concept of user switching:
http://www.modrails.com/documentation/Users%20guide%20Apache.html#PassengerDefaultUser
Upon startup of your app, Passenger tries to figure out which user owns the application files and tries to switch its application process to that user. In case it fails, "nobody" is the default.
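In the Apache configuration, the relevant Passenger directives look roughly like this (the directive names are real; the user name is an example):

# Passenger user switching in the Apache config (values are examples).
PassengerUserSwitching on
# Fallback user if Passenger cannot determine the owner; defaults to nobody.
PassengerDefaultUser myappuser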
Check your application permissions on the file level. You should have one user account per application on your server. The application (the directory and contents above the public directory) should be owned by this user. Files under public should be readable by others, so the webserver can pick them up, too.
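For example, assuming a dedicated user myapp and the application living in /var/www/myapp, the ownership fix might look like this sketch:

# Sketch: give the application user ownership, and let others read
# (and traverse directories, via the capital X) everything under public.
sudo chown -R myapp:myapp /var/www/myapp
sudo chmod -R o+rX /var/www/myapp/public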
Are you using Capistrano for deployment?
I'm trying to evaluate Symfony 2 (2.1.7). I'm installing it following the download instructions on an EC2 instance that is already running PHP 5.3.20 on Apache.
I'm stuck on the second step of the README.md: "Access the config.php script from a browser". The readme assumes a local installation and provides a sample localhost URL: http://localhost/path/to/symfony/app/web/config.php.
Since I'm on a remote server, I try to access the config.php file using the relevant URL: http://mysite.com/Symfony/app/check.php, which returns this message:
Forbidden
You don't have permission to access /Symfony/app/check.php on this server.
I tried to apply the answer from How do I access to symfony config.php remotely? by adding what PHP reports back as my REMOTE_ADDR, but that doesn't change the message.
What do I do now?
In Symfony, the web folder is supposed to be your webroot. So if you want to access project/web/config.php, you should point your browser to http://www.example.com/config.php.
If that doesn't work, Apache is probably configured incorrectly. Make sure it is pointed at your web directory, not your project directory.
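For an Apache 2.2-era setup like this one (PHP 5.3), the relevant vhost directives might look like the following sketch; the paths are assumptions:

# Sketch only; adjust the paths to where Symfony actually lives.
<VirtualHost *:80>
    ServerName mysite.com
    DocumentRoot /var/www/Symfony/web
    <Directory /var/www/Symfony/web>
        AllowOverride All
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>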
edit As you mention in your question, you will also need to edit the config.php file to allow remote access. You can comment those lines out, or add your IP to the whitelist.
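The guard at the top of web/config.php looks roughly like this in Symfony 2.1 (quoted from memory, so treat it as a sketch); the added IP is a placeholder:

// Roughly the guard in web/config.php; add your own REMOTE_ADDR
// to the array, or comment the whole block out.
if (!in_array(@$_SERVER['REMOTE_ADDR'], array(
    '127.0.0.1',
    '::1',
    '203.0.113.10', // placeholder: your workstation's public IP
))) {
    header('HTTP/1.0 403 Forbidden');
    exit('This script is only accessible from localhost.');
}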
edit2 Many webhosts don't allow you to specify your webroot. In that situation, you can put the Symfony files in a different directory and create a symlink between the Symfony web directory and your webroot.
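For instance, on a host that serves public_html, the symlink approach might look like this (paths are hypothetical):

# Keep the old webroot around, then point public_html at Symfony's web dir.
mv ~/public_html ~/public_html.bak
ln -s ~/Symfony/web ~/public_html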
I am working on a Drupal 7 project. The requirement is to upload an HTML file to the server, pass the HTML file as a parameter to a Perl script, and capture the return code given by the Perl program. The weird problem is: if I FTP the HTML file to the server, it works fine, but if I upload it using Drupal's file upload, it gets saved in /tmp and Perl is not able to process it. The permissions are the same, but Perl still cannot access the file. Has anyone run into this problem?
Sirish
You need to consider the user you upload as via FTP, the user Perl runs as when it executes against the file, and the user Drupal runs as.
You mentioned permissions are the same, but is ownership the same too?
Usually Drupal will upload files as the Apache user (apache, www-data, or whatever user Apache is set up to run as). If Perl is running as a different user, then the permissions on the uploaded file will need to be set so that the Perl user can access the Apache-owned file.
If that is the case and the Perl script needs to execute that file, then you can use PHP's chmod function after the upload to set the file as executable (maybe 755).
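In Drupal 7 that could look roughly like this sketch, assuming $file is the object returned by file_save_upload():

// Sketch: loosen permissions after the upload so another user can
// read/execute the file. $file is assumed to come from file_save_upload().
$path = drupal_realpath($file->uri);
if ($path !== FALSE) {
  chmod($path, 0755);
}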
I'm working with CMS Made Simple. My problem is the permissions on my template folder. In this CMS, when a template is uploaded, a folder is created (with the same name as that template, for example 'TEMP1') and its permissions are set to 0755. When I try to change the permissions of the template folder I get this error:
FileOp Failure on: /home/visamast/public_html/uploads/arty1: Operation not permitted
Also, when I try to upload files to this folder via FTP or cPanel, nothing happens; the upload process completes, but no files are actually uploaded.
How can I fix this problem?
It sounds like you are having an ownership problem rather than a permission problem. If your server is set up to run PHP as a module, files and directories created by PHP will be owned by the generic Apache user. Generally that means you will not be able to change permissions on the file/directory. Most likely you will need to have your hosting company do a recursive chown on the entire directory tree your site is in, to make you the owner of all of the files and directories.
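That fix is typically a one-liner run as root; the account name below is taken from the error message above:

# Run as root (usually by the hosting company's support):
chown -R visamast:visamast /home/visamast/public_html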