I have a web site for which I am using Apache authentication (basic auth for testing). It works just fine in all browsers. Now I am trying to add the API that the web site uses to the same authentication realm (using the same AuthName), and I am noticing the following behaviors:
On Safari it authenticates twice, once when going to the web site, and then again when it makes the API call. (I would prefer only to have to authenticate the first time.)
On both Chrome and Firefox it authenticates when I first go to the web site, but then it returns a 401 error when the web site makes the API call.
Everything is on the same domain and port, so I do not see this being a CORS issue, especially since it works when I remove the authentication requirement from the API (which of course leaves the API unprotected and is therefore not desirable). I do use the same AuthName, FWIW, but that seems to have no effect.
My API config in httpd.conf is:
WSGIDaemonProcess rest_api user=gms threads=5
WSGIScriptAlias /api /var/www/extjs/rest_api/rest_api.wsgi
<Location /api>
Options +FollowSymLinks +Multiviews +Indexes
AllowOverride None
Order allow,deny
Deny from all
AuthType basic
Satisfy Any
AuthName "PrivateRepository"
AuthUserFile /var/www/extjs/.htpasswd
Require valid-user
</Location>
While the web site's is:
<VirtualHost *:80>
ServerName cardiocatalogqt
Alias /cardiocatalogqt /var/www/extjs/cardiocatalogqt
<Location /cardiocatalogqt>
Options +FollowSymLinks +Multiviews +Indexes
AllowOverride None
Order allow,deny
Deny from all
AuthType basic
Satisfy Any
AuthName "PrivateRepository"
AuthUserFile /var/www/extjs/.htpasswd
Require valid-user
</Location>
</VirtualHost>
Browsers only send Basic auth credentials preemptively to URLs at or below the path where they previously authenticated, so you'll need to rearrange the URLs to share a common prefix if you want the site's credentials reused for the API.
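For example, a minimal sketch (paths mirror the config above; the exact layout is an assumption): mounting the WSGI application underneath the site's path gives both the site and the API the /cardiocatalogqt prefix, so a browser that has authenticated for the site will send the same Authorization header to the API without a second prompt.

WSGIDaemonProcess rest_api user=gms threads=5
# Serve the API under the site's prefix instead of a separate /api path
WSGIScriptAlias /cardiocatalogqt/api /var/www/extjs/rest_api/rest_api.wsgi

<Location /cardiocatalogqt/api>
    AuthType Basic
    AuthName "PrivateRepository"
    AuthUserFile /var/www/extjs/.htpasswd
    Require valid-user
</Location>

The front-end code would then call /cardiocatalogqt/api/... instead of /api/....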
Related
I have a virtual host configured with basic auth, but I want to whitelist just one URL, because it will be called from a third-party API where I can't configure authentication. I read the other questions here but I couldn't get it to work; this is what it looks like now:
<Location "^/this/url">
AuthType None
Order Allow,Deny
Allow from all
Satisfy any
</Location>
<Location />
AuthUserFile "/srv/.htpasswd"
AuthName authorization
AuthType Basic
require valid-user
Order Allow,Deny
Deny from all
Satisfy any
</Location>
So I want http://www.example.com to require authentication but http://www.example.com/this/url not to.
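One way this is often handled, as a sketch only (it assumes Apache 2.4 with mod_authz_core; on 2.2 the equivalent is Satisfy Any plus Allow from all): use a plain path rather than a regex inside <Location>, and declare the exempted path after the protected <Location /> so that its authorization settings are merged last.

<Location "/">
    AuthType Basic
    AuthName "authorization"
    AuthUserFile "/srv/.htpasswd"
    Require valid-user
</Location>

# Declared after the protected block so this section's authorization wins
<Location "/this/url">
    Require all granted
</Location>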
I'm trying to configure a WebDAV environment. However, I keep getting this error:
htaccess: require valid-user not given a valid session, are you using lazy sessions?
Looking at Fiddler, I see HTTP Code 500.
All Google searches seem to turn up references to Shibboleth, which I do have installed, but I am not invoking it for this path structure.
<Directory "/path/to/webdav">
Options Indexes MultiViews FollowSymLinks
AllowOverride None
Order allow,deny
Allow from all
</Directory>
<VirtualHost *:443>
ServerName my.domain.com
DocumentRoot "/path/to/root"
...
Alias /aaa/bbb /path/to/webdav/aaa/bbb
<Location /aaa/bbb>
Options Indexes
DAV On
AuthType Basic
AuthName "webdav"
AuthUserFile /path/to/webdav.pwd
Require valid-user
</Location>
</VirtualHost>
Solution below...
Essentially, there's a blanket requirement for a Shibboleth session in the last lines of my Host configuration.
<PathRegex regex=".*" authType="shibboleth"
requireSession="true" requireSessionWith="Intranet" />
I simply had to add an exception for the webdav folder before those lines.
<Path name="webdav" authType="shibboleth" requireSession="false" />
I am trying to allow Amazon CDN to access the resources on my password-protected staging site (HTTP Basic Authentication).
This is the code I have in the httpd.conf file for it:
NameVirtualHost *:80
<VirtualHost *:80>
ServerName staging.domain.com
DocumentRoot /var/www/html
<Directory "/var/www/html/">
Options Indexes MultiViews FollowSymLinks
AllowOverride all
AuthName "Development Access"
AuthType Basic
AuthUserFile /path/to/password.htpasswd
Require valid-user
SetEnvIf User-Agent "^Amazon.*" cdn
Order allow,deny
Allow from env=cdn
</Directory>
</VirtualHost>
I'm using SetEnvIf to assign a variable if the user agent is Amazon and then just allowing it, but this is not working. Can somebody please help me out with this one?
The problem is that a valid user is required to get to the content, regardless of the user agent used.
Give this article in the Apache Manual a read; specifically, take a look at the RequireAny bit. That lets you set up rules with whatever complexity you require. Your config would look something like this:
SetEnvIf User-Agent "^Amazon.*" cdn
# Grant access if the client authenticates OR the cdn variable is set
<RequireAny>
    Require valid-user
    Require env cdn
</RequireAny>
This only works on Apache 2.4 and later. On 2.2 you can look at this article in the Apache Wiki, specifically at the Satisfy Any directive. Hope this helps.
If you are on Apache 2 and may still need to access the resources with HTTP auth as well, this has worked for me:
<Directory /var/www/yourwebdirectory>
SetEnvIf User-Agent "^Amazon.*" cdn
AuthUserFile /etc/apache2/.htpasswd.forthissite
AuthType Basic
AuthName "My Files"
Require valid-user
Order allow,deny
Allow from env=cdn
Satisfy Any
</Directory>
I am having trouble understanding how I can edit files on a WebDAV setup. I have set up the auth correctly, as verified against loads of online tutorials, yet there are some files, like .htaccess, that I can't edit.
The contents of the VirtualHost setup are
<VirtualHost *:80>
ServerAdmin xxx
ServerName xxx
DocumentRoot /data/www/vhosts/xxx
<Directory /data/www/vhosts/xxx>
Options Indexes MultiViews
AllowOverride None
Order allow,deny
allow from all
</Directory>
<Location />
DAV On
AuthType Basic
AuthName "WebDAV Access"
AuthUserFile /data/www/.htpasswd-webdav
Require valid-user
</Location>
</VirtualHost>
I've generated the correct username in the file too, and I can log in successfully and see all the files. Like I say, the problem is that certain files are unreadable and unwritable, the main culprits being .htaccess and .gitignore.
I have set the permissions on all files to 664 and all folders to 775, with a user:group of xxx:www-data, because this allows PHP to read/write the files and our remote login user xxx to do the same without permission issues.
Is there something specific I need to do to allow reading and writing of these hidden dot files? I'm completely stumped, as most tutorials I've read tell me that if I don't set the ownership on dot files to root:root then they will be writable. I am using a Mac to connect to the WebDAV service, which runs on Ubuntu, if that makes any difference.
Just for clarity, all of the xxx in this question is to hide info.
So it seems that access to specific files is controlled with a block like the one below, which is what was blocking my dot files:
<Files .htaccess>
order allow,deny
deny from all
</Files>
I forgot that .htaccess files are blocked over HTTP by default.
EDIT:
The final working setup, which makes all files writable in the WebDAV environment with Digest authentication, is:
<VirtualHost *:80>
ServerAdmin xxx
ServerName www.domain.name
DocumentRoot xxx
<Directory xxx>
Options Indexes MultiViews
AllowOverride None
Order allow,deny
allow from all
</Directory>
<FilesMatch "\.(htaccess|php)$">
Order allow,deny
allow from all
ForceType text/plain
</FilesMatch>
<Location />
DAV On
AuthType Digest
AuthName "Webdav Access"
AuthDigestDomain / http://www.domain.name/
AuthDigestProvider file
AuthUserFile /data/www/digest.users
Require valid-user
php_value engine off
</Location>
</VirtualHost>
I hope this helps someone else. It took days to find all this info out on the web.
Also check the permissions of /data/www/ itself. It should be writable by the Apache user.
I was wondering if it is possible to set up a conditional HTTP basic auth requirement based on the virtual host URL in an .htaccess file.
For example, what I want to do is have mysite.com and test.mysite.com run off the same code base in the same directory, but password-protect test.mysite.com. It would be set up this way so that I wouldn't need to branch my code, since my app code can see which vhost/URL it's being served from and pick the database to serve content from.
You can sort of kludge this by using mod_setenvif along with the mod_auth modules. Use the SetEnvIfNoCase directive to set which host is password protected. You'll need a couple of extra directives to satisfy access:
# Check for the hostname here
SetEnvIfNoCase HOST ^test\.mysite\.com\.?(:80)?$ PROTECTED_HOST
Then inside the Directory block (or just out in the open) you have your auth stuff set up, something like this:
AuthUserFile /var/www/test.mysite.com/htpasswd
AuthType Basic
AuthName "Password Protected"
Now for the require/satisfy stuff:
Order Deny,Allow
Satisfy any
Deny from all
Require valid-user
Allow from env=!PROTECTED_HOST
This makes it so any host that doesn't match ^test\.mysite\.com\.?(:80)?$ gets access without needing auth (Allow from env=!PROTECTED_HOST), but otherwise a valid user is required (Require valid-user). The Satisfy any ensures that only one of the two is needed, either the Allow or the Require.
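Put together, the whole .htaccess would look roughly like this (a sketch assembled from the snippets above; adjust the hostname pattern and htpasswd path to your setup):

# Flag requests whose Host header is the protected vhost
SetEnvIfNoCase HOST ^test\.mysite\.com\.?(:80)?$ PROTECTED_HOST

AuthUserFile /var/www/test.mysite.com/htpasswd
AuthType Basic
AuthName "Password Protected"

Order Deny,Allow
Satisfy any
Deny from all
# Requests for any other host are allowed straight through
Allow from env=!PROTECTED_HOST
# Requests for the protected host must supply valid credentials
Require valid-user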
I had problems implementing Jon's solution:
Although I am quite familiar with Apache configuration and regular expressions, the authentication always fired. From a quick analysis it looked like the Allow from env=!PROTECTED_HOST line did not kick in.
But I found another solution that actually looks safer to me:
I created two virtual hosts for the two domains, both pointing to the same document root (which is perfectly allowed, by the way). In one of the vhosts I added the directives for basic auth, directly inside the vhost block.
Works like a charm. And I have a better feeling that this is really safe: there is no risk of overlooking some detail in a regex pattern that would open the gates to intruders.
<VirtualHost *:80>
ServerName www.mysite.com
DocumentRoot "/path/to/common/doc/root"
<Directory "/path/to/common/doc/root">
Options Indexes FollowSymLinks
AllowOverride All
Order allow,deny
Allow from all
</Directory>
</VirtualHost>
<VirtualHost *:80>
ServerName protected.mysite.com
DocumentRoot "/path/to/common/doc/root"
<Directory "/path/to/common/doc/root">
Options Indexes FollowSymLinks
AllowOverride All
Order allow,deny
Allow from all
AuthUserFile /path/to/htpasswd
AuthName "Password please"
AuthType Basic
Require valid-user
</Directory>
</VirtualHost>
Here's a solution similar to what Jon Lin proposed, but using RewriteCond to check the host name:
RewriteEngine On
RewriteCond %{HTTP_HOST} =protected.hostname.com
RewriteRule ^.*$ - [E=DENY:1]
AuthUserFile /path/to/htpasswd
AuthName "Password please"
AuthType Basic
Order Deny,Allow
Satisfy any
Deny from all
Require valid-user
Allow from env=!DENY
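On Apache 2.4, where Order/Deny/Satisfy are deprecated, a roughly equivalent sketch (my own assumption, not taken from the answers above) uses mod_authz_core's Require expr to test the host name instead of an environment variable:

AuthUserFile /path/to/htpasswd
AuthName "Password please"
AuthType Basic
<RequireAny>
    # Let every host other than the protected one through without credentials
    Require expr "%{HTTP_HOST} != 'protected.hostname.com'"
    # Otherwise demand a valid user
    Require valid-user
</RequireAny>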