I am working on a Drupal 7 project. The requirement is to upload an HTML file to the server, pass that HTML file as a parameter to a Perl script, and capture the return code from the Perl program. The weird problem is that if I FTP the HTML file to the server, it works fine, but if I upload it using Drupal's file upload, it gets saved in /tmp and the Perl script cannot process it. The permissions are the same, yet Perl still cannot access the file. Has anyone run into this problem?
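For reference, the kind of check involved looks roughly like this on the command line (the file and script paths are placeholders, not the real ones):

    # Run the Perl script against the uploaded file and print its exit code
    perl /path/to/script.pl /tmp/uploaded.html
    echo $?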
Sirish
You need to consider the user you upload as via FTP, the user Perl runs as when it executes the file, and the user Drupal runs as.
You mentioned permissions are the same, but is ownership the same too?
Usually Drupal will upload files as the Apache user (apache, www-data, or whatever user Apache is set up to run as). If Perl is running as a different user, then the permissions on the uploaded file will need to be set so that the Perl user can execute the Apache-owned file.
If that is the case and the Perl script needs to execute that file, you can use PHP's chmod() function after the upload to make the file executable (755, for example).
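As a quick sanity check from the shell (the equivalent of what that chmod() call would do; the file paths below are placeholders):

    # Compare owner, group and mode of the FTP'd file vs. the Drupal-uploaded one
    ls -l /path/to/ftp_uploaded.html /tmp/drupal_uploaded.html
    # Loosen the Drupal-uploaded file so the Perl user can read/execute it
    chmod 755 /tmp/drupal_uploaded.html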
I'm developing an automated backup system for a server using PostgreSQL and Tomcat. The environment is CentOS 7 minimal. Long story short, a VM will download the .sql dumps and a .tar.gz archive containing Tomcat via FTP.
No problems setting up vsftpd; I can access the server via FTP with a custom user (ftpuser) which currently has access to a specific folder (/home/ftpuser/backups/). I can compress Tomcat there, so my script can fetch the backups/ folder and download it, but I can't figure out how to dump the PostgreSQL DB into /home/ftpuser/backups/ without doing something clumsy with sudo.
The postgres user doesn't have permission to write there, and I can't grant it even with chown or chmod. I added postgres to sudoers, and if I dump the DB and then "sudo cp" it to that folder it works, but that way I can't script it, because sudo asks for a password.
The question is: is there a way to get pg_dump to write the .sql dump into the /home/ftpuser/backups/ folder?
Thanks for the replies.
pg_dump does not need to be run as the postgres user.
Run it as a user that can write to the desired folder, and pass the --username=database_user parameter to specify the database user to connect as. You'll probably need a .pgpass file for that user's password (unless it is set to trust in pg_hba.conf).
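A minimal sketch, run as ftpuser, assuming the database is called mydb and the database user is dbuser (both names are placeholders):

    # Store the password once so pg_dump doesn't prompt for it
    echo "localhost:5432:mydb:dbuser:secret" >> ~/.pgpass
    chmod 600 ~/.pgpass
    # Dump straight into the FTP-accessible folder
    pg_dump --host=localhost --username=dbuser mydb > /home/ftpuser/backups/mydb.sql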
Got an odd Serverpilot query, not specifically Craft related.
My client wants FTP access to a subfolder on the site. I'm loath to let them have full access, so I created a user with access to /home/FTPUSER and a symlink to it from my /srv/users/serverpilot/apps/APPNAME/public/ folder, so that they cannot get at the site's core system files, etc., from a script.
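For context, the setup was roughly this (the symlink name client-files is just an illustrative placeholder):

    # Restricted FTP user with its own home directory
    sudo useradd -m -d /home/FTPUSER FTPUSER
    # Expose that directory inside the app's public folder via a symlink
    sudo ln -s /home/FTPUSER /srv/users/serverpilot/apps/APPNAME/public/client-files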
PHP files in this folder work fine, but the client now wants to be able to run CGI/Perl scripts from it. I have tried following the instructions at https://serverpilot.io/community/articles/how-to-create-a-cgi-bin-directory.html (updating the document root), but I can't get CGI or Perl scripts to run; they are returned as plain text instead.
Any thoughts?
I have a text file, "data.txt", and based on input to an HTML form I want to display a single line from that file. My result is delivered by a CGI script which needs to access data.txt, but I don't want a user to be able to type "data.txt" into their web browser and see the whole file. Is there a simple way to make "data.txt" readable by the CGI script but not accessible by loading it in the browser?
I'm using standard Apache on Ubuntu. I believe the suexec module can do this, but I'm hoping for a simpler solution using just permissions, chowns, etc. Thanks.
Store your data file outside of the web server's file tree (for Apache, check the DocumentRoot).
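A minimal sketch, assuming the DocumentRoot is /var/www/html and the CGI script runs as www-data (both are assumptions; adjust for your setup):

    # Keep the data file outside anything Apache serves
    sudo mkdir -p /var/www/data
    sudo mv /var/www/html/data.txt /var/www/data/data.txt
    # Readable by the CGI process, not world-writable
    sudo chown www-data:www-data /var/www/data/data.txt
    sudo chmod 640 /var/www/data/data.txt
    # The CGI script then opens /var/www/data/data.txt by absolute path;
    # nothing under /var/www/data can be requested by URL.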
Whenever I create an .htaccess file in a directory it disappears. I am running a VPS at Digital Ocean (I have full control over the server). So I can't see why my .htaccess files are automatically deleted upon creation. I even tried to make the file on my computer and just transfer it to the directory via FTP but as soon as it transfers, it disappears. I checked the log of the FTP transfer and the file transferred successfully. I can't figure this out.
It's because files whose names start with a dot are hidden by default on Unix-like systems; the files aren't being deleted. Either enable the option to show hidden files if you're using a GUI, or run "ls -a" in a terminal, and you should see them. Your .htaccess files fall into that category.
If you want to view the .htaccess file on the server, make sure you are logged in as the root user or a user with access to that directory.
Then navigate to /home/username/public_html (in my case), and if you have a .htaccess file uploaded, it should be shown there.
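For example, from a shell on the server (the path is the one mentioned above):

    cd /home/username/public_html
    # Dotfiles such as .htaccess only appear with the -a flag
    ls -la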
I'm working with CMS Made Simple. My problem is with the template folder permissions. In this CMS, when a template is uploaded, a folder (with the same name as the template, for example 'TEMP1') is created and its permissions are set to 0755. When I try to change the permissions of the template folder, I get this error:
FileOp Failure on: /home/visamast/public_html/uploads/arty1: Operation not permitted
Also, when I try to upload files into this folder via FTP or cPanel, nothing happens: the upload process completes, but no files actually appear.
How can I fix this problem?
It sounds like you are having an ownership problem rather than a permission problem. If your server is set up to run PHP as a module, files and directories created by PHP will be owned by the generic Apache user. Generally that means you will not be able to change permissions on the file/directory yourself. Most likely you will need to have your hosting company do a recursive chown on the entire directory tree your site is in, to make you the owner of all of the files and directories.
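If they agree, the command on their end would be something along these lines (run as root; the account name is taken from the error path above, so adjust as needed):

    # Recursively hand ownership of the site tree back to the account user
    chown -R visamast:visamast /home/visamast/public_html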