Disable Indexes option in apache2 for specific hosts - apache

Is there a way to disable indexes within a directory for a single host in an apache2 site configuration, while allowing all other hosts to list the directory? I know how to allow Indexes and to deny a host access to a directory entirely, but I want the host in question to still be able to request and execute items within the directory, just not list them in a web browser.
This is an example of what I do not want:
<Directory /path/to/dir>
Options Indexes
Order allow,deny
allow from all
deny from 10.0.0.10/32
</Directory>
The block above allows indexes in /path/to/dir for everyone except clients connecting from 10.0.0.10. So 10.0.0.10 is successfully denied, but the denial covers every type of access, not just viewing the directory listing in a web browser.
This is another example of what I do not want:
<Directory /path/to/dir>
Options -Indexes
Order allow,deny
allow from all
</Directory>
The block directly above disables indexes for everyone no matter what host they are connecting from.
TL;DR: How do I disable indexes for a single host while allowing indexes for all other hosts?
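One possible approach (an untested sketch): Options cannot be set per-client, so leave Indexes on for everyone and use mod_rewrite to refuse requests for the directory itself (the only requests that trigger an index listing) from that one host. This assumes mod_rewrite is loaded; the address matches the question's example:

```apache
<Directory /path/to/dir>
    Options Indexes
    Order allow,deny
    Allow from all
    RewriteEngine On
    # 10.0.0.10 may still fetch and execute files, but a request that
    # resolves to a directory (which would produce the index) is forbidden
    RewriteCond %{REMOTE_ADDR} ^10\.0\.0\.10$
    RewriteCond %{REQUEST_FILENAME} -d
    RewriteRule ^ - [F]
</Directory>
```

Requests from 10.0.0.10 for /path/to/dir/file.html would still succeed; only /path/to/dir/ itself returns 403 for that host.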

Related

HTTPS works after commenting out "deny from all", but will there be any security problem?

I'm working on moving our website to HTTPS with an SSL certificate. After a few days of trying, I found a forum post pointing out that Deny from all blocks access. I commented out Deny from all and it now works, but will this cause any security issues? The configuration I'm using is below; is there a website I can refer to for the related directives?
<Directory "${INSTALL_DIR}/www/abc">
SSLOptions +StdEnvVars
Options Indexes FollowSymLinks MultiViews
AllowOverride All
Order Deny,Allow
Deny from all
Allow from 127.0.0.1 localhost ::1
</Directory>
The Deny from all directive does exactly what it says it does: it blocks all requests, regardless of their origin. Ironically, the next line permits access if and only if the request originated from the same IP address, so this might be the safest configuration you can have, provided you don't mind having the most useless server of all time.
You only want to use the Deny from all to prevent access to the filesystem, otherwise it blocks all incoming requests, as you noticed. Then you specifically allow access only to the directories where you plan on serving files from, like so:
# Make the server filesystem completely off-limits
<Directory "/">
# Do not permit .htaccess files to override this setting
AllowOverride None
# Deny all requests
Require all denied
</Directory>
<Directory "${INSTALL_DIR}/www/abc">
# If you want directories to be allowed to override settings
AllowOverride All
# Let people actually access the server content
Require all granted
</Directory>
<Files ".ht*">
# Make sure .htaccess file (which contain server configurations and
# settings) are completely off-limits to anyone accessing the server,
# even if they are in a directory that is otherwise accessible.
Require all denied
</Files>
As far as the security of the server is concerned, the best advice I can give you is to make sure sensitive files and passwords are not stored in a directory accessible by the server. Even passwords in PHP files are not safe, because if a malicious actor is somehow able to disable the PHP engine, the file will be served in plain text, with all of the sensitive information right there.
The best way to circumvent this is to create a configuration file outside the server root directory and use a SetEnv directive to define the variables.
SetEnv DATABASE_USERNAME "KobeBryantIsBetterThanJordan24"
SetEnv DATABASE_PASSWORD "LebronJamesIsAlsoPrettyGood107"
Then you can use something like this to get the variables into your PHP scripts without ever exposing the information in plain text (note that FILTER_SANITIZE_STRING is deprecated as of PHP 8.1):
$username = filter_input(INPUT_SERVER, 'DATABASE_USERNAME', FILTER_SANITIZE_STRING);
$password = filter_input(INPUT_SERVER, 'DATABASE_PASSWORD', FILTER_SANITIZE_STRING);
define('DATABASE_USERNAME', $username);
define('DATABASE_PASSWORD', $password);
Last but not least, make sure you add phpinfo to the disable_functions setting in your php.ini file, since a phpinfo() page would immediately expose those environment variables.
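For instance, in php.ini (merging phpinfo into whatever that directive already lists on your system):

```ini
; phpinfo() dumps the whole environment, including any credentials
; defined via SetEnv, so disable it outright
disable_functions = phpinfo
```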

How to put a XAMPP server online for remote access from outside

I want to allow remote access to my XAMPP server.
I'm working on web services, and I want to be able to access the site from another system on the same network using its IP address, so I need to allow outside access.
I tried making some changes in xampp/apache/conf/extra/httpd-xampp.conf, but it's not working.
I might need to make some changes here:
Alias /phpmyadmin "C:/xampp/phpMyAdmin/"
<Directory "C:/xampp/phpMyAdmin">
AllowOverride AuthConfig
Require local
</Directory>
But I'm not sure what to change.
Thanks in advance for any help.
Replace the block with:
Alias /phpmyadmin "C:/xampp/phpMyAdmin/"
<Directory "C:/xampp/phpMyAdmin">
AllowOverride AuthConfig
Require all granted
</Directory>
It's working for me.
Reference: https://www.apachefriends.org/faq_windows.html
The relevant question in that reference:
How do I restrict access to phpMyAdmin from the outside?
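To answer that FAQ question in config terms: rather than Require all granted, which opens phpMyAdmin to anyone who can reach the server, you could grant access only to localhost plus your own LAN. The subnet below is an example; adjust it to your network:

```apache
Alias /phpmyadmin "C:/xampp/phpMyAdmin/"
<Directory "C:/xampp/phpMyAdmin">
    AllowOverride AuthConfig
    # local requests, plus one example LAN subnet
    Require local
    Require ip 192.168.1.0/24
</Directory>
```

Multiple Require directives at the same level are combined as "any", so a request passing either test is allowed.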

How to create an Alias in Apache to a network shared directory?

I'm running Apache 2.2 (on OS X 10.9 Mavericks) and have a directory on my NAS (My Cloud EX2100) that I would like to set up as an aliased web site.
To do so, I've created a .conf file (I called it aliases.conf) in /private/etc/apache2/other (Note that the httpd.conf has Include /private/etc/apache2/other/*.conf added to it).
In my aliases.conf I have
Alias /foo /Volumes/bar/
<Directory "/Volumes/bar">
Options FollowSymLinks
AllowOverride None
Order allow,deny
Allow from all
</Directory>
I then restart apache and open a browser to go to http://localhost/foo, but I get the error message
Forbidden
You don't have permission to access /foo on this server.
How do I give Apache access to the shared/aliased directory that is on the NAS?
Make sure that the apache user has read permissions to your NAS folder.
Also, switch the order of allow and deny to Order deny,allow.
I don't know whether you have any index files, but if you would like to browse through your directories you have to change your Options entry to: Options FollowSymLinks Indexes
Then restart your apache and try again.
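Putting both suggestions together, the whole aliases.conf block might look like this (Apache 2.2 syntax, matching your server):

```apache
Alias /foo "/Volumes/bar"
<Directory "/Volumes/bar">
    Options FollowSymLinks Indexes
    AllowOverride None
    Order deny,allow
    Allow from all
</Directory>
```

If Apache still reports Forbidden, check that the Apache user (typically _www on OS X) can read the mounted volume, e.g. with ls -l /Volumes/bar; every parent directory also needs execute (search) permission.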

/etc/httpd/conf/httpd.conf versus /etc/httpd/conf.d/owncloud.conf <Directory> directives

Below is a segment from the owncloud.conf file in /etc/httpd/conf.d. The intent is to lock out all access except the 10.0 intranet and a limited set of external addresses in xx.yy.0.0. However, the configuration is not locking out other access: all external addresses are being allowed. Is there something obviously wrong with this configuration?
<Directory /var/www/http/owncloud/>
Options Indexes FollowSymLinks MultiViews
AllowOverride none
Require all denied
Order Deny,Allow
Deny from all
Allow from 10.0.0.0/16
Allow from xx.yy.0.0/16
</Directory>
It's either being overridden in a different configuration section (like Location or LocationMatch) or your clients are coming through proxies that make them appear to match your rules.
Also, the block mixes Apache 2.2-style access control (Order / Deny / Allow, supplied by mod_access_compat on 2.4) with the 2.4-style Require all denied in the same section, which is strongly discouraged and can give confusing results. On Apache 2.4, drop the old-style lines and express the whole policy with Require directives:
Require ip 10.0.0.0/16
Require ip xx.yy.0.0/16

How can I create read-only FTP access for user on Apache server?

I have a web site with lots of pages of photography. In order to allow visitors to download groups of photos without having to save each one individually, I want to create a read-only FTP user that will be publicly available.
Via the control panel for the host, I can create "regular" FTP user accounts, but they have write access, which is unacceptable.
Since there are several domains and subdomains hosted on the same server I don't want to use anonymous FTP -- the read-only FTP account should be restricted to a specific directory/sub-directories.
If possible, I would also like to know how to exclude specific directories from the read-only FTP access I grant to this new user.
I've looked all over on the server to find where user account info is stored to no avail. Specifically I looked in httpd.conf, and found LoadModule proxy_ftp_module modules/mod_proxy_ftp.so, but I don't know how to go about working with it (or if it's even relevant).
It seems like your reason for using FTP is to let people download many photographs at once.
You can just serve links to zip files instead, using standard Apache HTTP access control. With plain HTTP, the specific risk you mentioned, of people deleting or overwriting your files, is eliminated.
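One way to produce those zip files is a small script run whenever the photos change; here is a minimal Python sketch (the function name and directory layout are just examples):

```python
import zipfile
from pathlib import Path

def zip_album(album_dir: str, out_dir: str) -> str:
    """Bundle every file under album_dir into <out_dir>/<album name>.zip."""
    album = Path(album_dir)
    out_path = Path(out_dir) / f"{album.name}.zip"
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for photo in sorted(album.rglob("*")):
            if photo.is_file():
                # store paths relative to the album directory so the
                # archive unpacks cleanly into a single folder
                zf.write(photo, photo.relative_to(album))
    return str(out_path)
```

Run it against each album directory and drop the resulting zips into the directory Apache indexes, e.g. zip_album("/var/www/albums/vacation", "/var/www/photos").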
You can make one directory to provide an index of the zip files to download
<Directory /var/www/photos/>
Order allow,deny
Allow from all
Options Indexes
</Directory>
And apply standard permissions to the rest of your directories
# your file system is off limits
<Directory />
Options None
AllowOverride None
Order deny,allow
Deny from all
</Directory>
DocumentRoot /var/www/
# the rest of your content.
<Directory /var/www/>
<LimitExcept GET POST>
deny from all
</LimitExcept>
Order allow,deny
Allow from all
Options None
</Directory>