run perl scripts as different user without mod_perl - apache

I'm using web-console in my project. It's a console in the browser where I can type commands and have them run on the server as the www-data user.
The installation instructions say
If your web server configured to execute Perl scripts under specific
user account, please make sure that this user has write permissions
for recently created directory.
Is it possible to run Perl scripts on the server as a user other than www-data? I can't find it in the Apache docs. I'm using Apache 2 with this configuration (without mod_perl):
AddHandler cgi-script .cgi .pl
<Directory /var/www/myproject/public_html/webconsole>
Options Indexes FollowSymLinks MultiViews ExecCGI
AllowOverride None
Order allow,deny
allow from all
</Directory>

Would this be of any help: "The suEXEC feature provides Apache users the ability to run CGI and SSI programs under user IDs different from the user ID of the calling web server. Normally, when a CGI or SSI program executes, it runs as the same user who is running the web server."
Source: https://httpd.apache.org/docs/2.2/suexec.html
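With suEXEC set up (mod_suexec loaded and the suexec binary installed), the switch is a single per-virtual-host directive. A minimal sketch; the user and group names here are placeholders, and suEXEC additionally enforces ownership, permission, and document-root checks on the scripts it runs:

```apacheconf
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/myproject/public_html
    # CGI scripts under this vhost run as this user/group
    # instead of www-data (placeholder names)
    SuexecUserGroup webconsole webconsole
</VirtualHost>
```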

Related

HTTPS works after commenting out Deny from all, but will there be any security problem?

I'm working on installing a certificate to move our website to HTTPS. I tried for a few days until I found a forum post pointing out that Deny from all blocks the access. So I commented out Deny from all and now it works, but will there be any issue on the security side? Below is the configuration used; is there any website I can refer to for the related directives?
<Directory "${INSTALL_DIR}/www/abc">
SSLOptions +StdEnvVars
Options Indexes FollowSymLinks MultiViews
AllowOverride All
Order Deny,Allow
Deny from all
Allow from 127.0.0.1 localhost ::1
</Directory>
The Deny from all directive does exactly what it says: it blocks all requests, regardless of their origin. Ironically, the next line permits access if and only if the request originates from the server itself, so this might be the safest configuration you can have, provided you don't mind having the most useless server of all time.
You only want to use Deny from all to lock down the filesystem as a whole; otherwise it blocks all incoming requests, as you noticed. Then you specifically allow access only to the directories you plan on serving files from, like so:
# Make the server filesystem completely off-limits
<Directory "/">
# Do not permit .htaccess files to override this setting
AllowOverride None
# Deny all requests
Require all denied
</Directory>
<Directory "${INSTALL_DIR}/www/abc">
# If you want directories to be allowed to override settings
AllowOverride All
# Let people actually access the server content
Require all granted
</Directory>
<Files ".ht*">
# Make sure .htaccess file (which contain server configurations and
# settings) are completely off-limits to anyone accessing the server,
# even if they are in a directory that is otherwise accessible.
Require all denied
</Files>
As far as the security of the server is concerned, the best advice I can give is to make sure sensitive files and passwords are not stored in a directory accessible by the server. Even passwords in PHP files are not safe, because if a malicious actor manages to disable the PHP engine somehow, the file will be served as plain text, with all of the sensitive information right there.
The best way around this is to create a configuration file outside the server root directory and use SetEnv directives to define the variables:
SetEnv DATABASE_USERNAME "KobeBryantIsBetterThanJordan24"
SetEnv DATABASE_PASSWORD "LebronJamesIsAlsoPrettyGood107"
Then you can use something like this to get the variables into your PHP scripts without ever exposing the information in plaintext:
// Read the SetEnv values from the server environment
$username = filter_input(INPUT_SERVER, 'DATABASE_USERNAME', FILTER_SANITIZE_STRING);
$password = filter_input(INPUT_SERVER, 'DATABASE_PASSWORD', FILTER_SANITIZE_STRING);
// Expose them to the rest of the application as constants
define('DATABASE_USERNAME', $username);
define('DATABASE_PASSWORD', $password);
Last but not least, make sure you add phpinfo to the disable_functions setting in your php.ini file, as a call to phpinfo() would otherwise immediately expose those environment variables.
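A minimal php.ini fragment for that; note the setting takes a comma-separated list, so append phpinfo to whatever is already listed there:

```ini
; Blocking phpinfo() keeps it from dumping the server environment,
; which would include the SetEnv credentials above
disable_functions = phpinfo
```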

Disable Indexes option in apache2 for specific hosts

Is there a way, in the apache2 site configuration, to disable indexes within a directory for a single host while allowing all other hosts to list the directory? I know how to allow Indexes and how to deny a host access to a directory, but I want the host in question to still be able to execute items within the directory, just not list them in a web browser.
This is an example of what I do not want:
<Directory /path/to/dir>
Options Indexes
Order allow,deny
allow from all
deny from 10.0.0.10/32
</Directory>
The block above allows indexes in /path/to/dir for everyone except anyone connecting from 10.0.0.10. So 10.0.0.10 is successfully denied, but that denial covers every type of access, not just viewing the directory structure in a web browser.
This is another example of what I do not want:
<Directory /path/to/dir>
Options -Indexes
Order allow,deny
allow from all
</Directory>
The block directly above disables indexes for everyone no matter what host they are connecting from.
TL;DR: How do I disable indexes for a single host while allowing indexes for all other hosts?

How to create an Alias in Apache to a network shared directory?

I'm running Apache 2.2 (on OS X 10.9 Mavericks) and have a directory on my NAS (My Cloud EX2100) that I would like to set up as an aliased web site.
To do so, I've created a .conf file (I called it aliases.conf) in /private/etc/apache2/other (note that httpd.conf has Include /private/etc/apache2/other/*.conf added to it).
In my aliases.conf I have
Alias /foo /Volumes/bar/
<Directory "/Volumes/bar">
Options FollowSymLinks
AllowOverride None
Order allow,deny
Allow from all
</Directory>
I then restart apache and open a browser to go to http://localhost/foo, but I get the error message
Forbidden
You don't have permission to access /foo on this server.
How do I give Apache access to the shared/aliased directory that is on the NAS?
Make sure that the Apache user has read permissions on your NAS folder.
Furthermore, switch the order of allow and deny to Order deny,allow.
I don't know if you have any index files, but if you would like to browse through your directories you have to change your Options entry to: Options FollowSymLinks Indexes
Then restart Apache and try again.
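Putting those suggestions together, aliases.conf would look something like this (Apache 2.2 syntax, with the paths from the question):

```apacheconf
Alias /foo "/Volumes/bar"
<Directory "/Volumes/bar">
    # Indexes enables directory listings when no index file is present
    Options Indexes FollowSymLinks
    AllowOverride None
    Order deny,allow
    Allow from all
</Directory>
```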

How can I make Flask and Apache (mod_wsgi) update my database queries on each visit to a page?

In my Flask application I have a def that queries a database. When I changed the SQL in that file, the new results did not show up on the webpage. When I stopped and started Apache with service apache2 restart (on Debian 7), the new query results showed up.
I am running my WSGI process in daemon mode using mod_wsgi, v. 3.3, Apache 2.2.
I am not using SQLAlchemy or any other ORM, straight up SQL with a pymssql connect statement.
I am using Blueprints.
If I touch the .wsgi file, Apache will load the results as expected.
I am not sure how Flask-Cache can help me (or any other Flask module).
WSGIDaemonProcess myapp python-path=/var/www/intranet/application/flask:/var/www/intranet/application/flask/lib/python2.7/site-packages
WSGIProcessGroup myapp
WSGIScriptAlias /myapp /var/www/intranet/intranet.wsgi
<Directory /var/www/intranet>
WSGIApplicationGroup %{GLOBAL}
Order allow,deny
Allow from all
</Directory>
<Location />
Options FollowSymLinks
AllowOverride None
order allow,deny
allow from all
AuthType Basic
AuthName "Subversion Repository"
Require valid-user
AuthUserFile /etc/apache2/dav_svn.passwd
<IfModule mod_php4.c>
php_flag magic_quotes_gpc Off
php_flag track_vars On
</IfModule>
</Location>
I have read much of this, https://code.google.com/p/modwsgi/wiki/ReloadingSourceCode, but I do not know if this is something Flask may already have built in for production.
How can I make a code change take effect without restarting Apache?
Edit: My query is not in the .wsgi file.
What I ended up doing was to use a post-receive hook in my --bare repository.
I started from here:
http://krisjordan.com/essays/setting-up-push-to-deploy-with-git
and added a touch to the end of it. Here is what I did:
#!/usr/bin/ruby
# Changed the shebang a little from the website version for mine (Debian 7).
# post-receive
# johnny
require 'fileutils'

# 1. Read STDIN (format: "from_commit to_commit branch_name")
from, to, branch = ARGF.read.split " "

# 2. Only deploy if the master branch was pushed
if (branch =~ /master$/) == nil
  puts "Received branch #{branch}, not deploying."
  exit
end

# 3. Copy files to the deploy directory
deploy_to_dir = File.expand_path('../deploy')
`GIT_WORK_TREE="#{deploy_to_dir}" git checkout -f master`
puts "DEPLOY: master(#{to}) copied to '#{deploy_to_dir}'"

# 4. TODO: deployment tasks
# i.e.: run Puppet apply, restart daemons, etc.
# johnny: touch the .wsgi file so mod_wsgi reloads the daemon process
FileUtils.touch('/path/to/my/file.wsgi')
I commit:
git commit -a -m'my commit message'
then,
git push production master
After much reading, most people do not seem to like automatic reloading. Where I work, they need to see things immediately. Most things are database reads or static templates, so I don't mind using the "auto" touch for this particular application.

How can I create read-only FTP access for user on Apache server?

I have a web site with lots of pages of photography. In order to allow visitors to download groups of photos without having to save each one individually, I want to create a read-only FTP user that will be publicly available.
Via the control panel for the host, I can create "regular" FTP user accounts, but they have write access, which is unacceptable.
Since there are several domains and subdomains hosted on the same server I don't want to use anonymous FTP -- the read-only FTP account should be restricted to a specific directory/sub-directories.
If possible, I would also like to know how to exclude specific directories from the read-only FTP access I grant to this new user.
I've looked all over the server to find where the user account info is stored, to no avail. Specifically, I looked in httpd.conf and found LoadModule proxy_ftp_module modules/mod_proxy_ftp.so, but I don't know how to go about working with it (or whether it's even relevant).
It seems like your reason for using FTP is to let people download many photographs at once.
You could just serve links to zip files instead, using standard Apache HTTP access control. Using plain HTTP eliminates the specific risk you mentioned, of people deleting or overwriting your files.
You can make one directory provide an index of the zip files to download:
<Directory /var/www/photos/>
Order allow,deny
Allow from all
Options Indexes
</Directory>
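Creating the bundles themselves is a one-liner. A quick sketch using tar for portability (zip -r behaves the same way), with throwaway temp directories standing in for a real photo group and for the served directory:

```shell
# Sketch: bundle a photo group into one archive so visitors can grab a
# whole group in a single download. Temp directories stand in for a real
# photo group and for the directory Apache indexes (/var/www/photos).
set -e
group=$(mktemp -d)                  # stand-in for a photo directory
touch "$group/a.jpg" "$group/b.jpg"
served=$(mktemp -d)                 # stand-in for /var/www/photos
tar -czf "$served/group.tar.gz" -C "$group" .   # zip -r works similarly
tar -tzf "$served/group.tar.gz"                 # list the bundled files
```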
And apply standard permissions to the rest of your directories
# your file system is off limits
<Directory />
Options None
AllowOverride None
Order deny,allow
Deny from all
</Directory>
DocumentRoot /var/www/
# the rest of your content.
<Directory /var/www/>
<LimitExcept GET POST>
deny from all
</LimitExcept>
Order allow,deny
Allow from all
Options None
</Directory>