Proper .htpasswd usage - Apache

Assuming a small (pages < 5) site, what is the proper usage of .htaccess and .htpasswd? I recently watched a tutorial from Nettuts+ where this sample code was given:
.htaccess
AuthName "Login title"
AuthType Basic
AuthUserFile /path/to/.htpasswd
require valid-user
.htpasswd (created using the htpasswd -c <file> <username> command)
username:encrypted-version-of-password
I am also curious as to the actual level of security this provides: can it be bypassed easily? If Apache by default does not allow users to access either of the two files directly, do they need to be outside the public directory? Are there any speed implications?

What level of security does this provide?
.htpasswd does not provide much security by itself. That is, it provides a login mechanism, and Apache will not respond without the proper credentials, but unless separately configured, nothing about the exchange is encrypted (or even obfuscated). For example, listening to the GET request with Wireshark gives you a nice view of all the headers being sent by the client, including:
Authorization: Basic d3BhbG1lcjp0ZXN0dGVzdA==
"d3BhbG1lcjp0ZXN0dGVzdA==" being just the base64 encoded form of "wpalmer:testtest". These days, a hacker (or more likely, a virus) can sit on a public WiFi connection and log any requests containing the Authorization: for later perusal. In general, sending any authentication information over an unencrypted HTTP connection is considered a bad idea, even if you're over a wire or secure WiFi end-to-end. It would not, for example, meet the requirements of PCI Compliance if you were storing customer data, payment information, etc behind the .htpasswd lock.
Add https to the mix, and you completely eliminate that particular problem, however...
.htpasswd authentication as implemented by Apache httpd does not provide any form of rate-limiting or brute-force protection. You can make as many simultaneous attempts at guessing a password as Apache is willing to serve simultaneous pages, and Apache will respond with success/failure as soon as it possibly can. You can use something like Fail2Ban to limit the number of failed attempts that can be made before the client is blocked from talking to the server, but that will not necessarily provide any useful protection against a botnet, which may automatically target your server from thousands of unique addresses. This can lead to the decision of "do I leave myself vulnerable to password attempts from botnets, or do I leave myself vulnerable to denial-of-service attacks, when the entire account is locked down due to failures from multiple clients?"
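As a rough illustration only (the log path is a Debian-style assumption and your Fail2Ban version and filters may differ), a minimal jail.local entry using the apache-auth filter that ships with Fail2Ban might look like:
[apache-auth]
enabled  = true
port     = http,https
logpath  = /var/log/apache2/error.log
maxretry = 5
bantime  = 600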
These angles of attack can be limited by adding IP-based restrictions to your .htaccess file, allowing connections only from certain addresses. Depending on how you operate, this may be inconvenient, but it also severely limits the types of threats which you would be vulnerable to. You would still be at risk from someone specifically targeting your site, or an infection on part of the network infrastructure itself. Depending on the type of content you are protecting, this may be "good enough". An example of this type of restriction is:
Order deny,allow
Deny from all
Allow from 127.0.0.1
This means, in short, "only allow connections from the local host". Line-by-line, it means:
Order deny,allow: defines the order in which rules are processed, with the last match taking precedence.
Deny from all: begin by assuming that all clients are denied.
Allow from 127.0.0.1: if the client has the IP 127.0.0.1, then it is allowed.
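Note that Order/Deny/Allow is the Apache 2.2 syntax. If you are on Apache 2.4 (mod_authz_core), the equivalent, as a sketch, would be:
# Apache 2.4 equivalent: only allow the local host
Require ip 127.0.0.1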
To some extent, IP-based restrictions will also protect you to the point where HTTPS may be considered optional. Attackers / viruses can still see your credentials, but it is harder for them to use those credentials on the page itself. Again, this would not be PCI compliant, and it would not be suitable for "important" information, but there are some situations for which this may be considered "good enough". Be aware that many people re-use credentials across multiple sites, so failing to protect login details is generally considered to be very dangerous to the user, even if the site itself is protected.
Finally, the .htaccess file itself is a bit of a liability. See the response to "do they need to be outside the public directory?" for more details on that.
Can it be bypassed easily?
No. There is no reason to expect that the server, when properly configured, would ever fail to require login details to access the protected content. While HTTP Basic authentication has its flaws, Apache httpd is very robust and is one of the most thoroughly tested pieces of software in the world. If you tell Apache that HTTP Basic authentication is required to access certain content, it will be required.
If Apache by default does not allow users to access either of the two files directly, do they need to be outside the public directory?
There are a couple of points to this. First, Apache does not default to preventing access to either of these files. Many distributions of Apache httpd include initial configuration which prevents access (using "Deny from all" rules) to, depending on the distribution, .htaccess/.htpasswd files, .ht* files, or .* files. It is very common, but there are plenty of reasons why this may not be the case. You can add a rule yourself to block these files, if they are not already blocked:
<FilesMatch "^.(htaccess|htpasswd)$">
Order Allow,Deny
Deny from all
</FilesMatch>
Secondly, it should be pointed out that the way .htaccess files work, they are processed when the directory they are in is matched. That is to say: .htpasswd may be elsewhere, but .htaccess needs to be in the same directory. That said, see the "speed implications" section for a bit more detail.
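As a concrete sketch (the paths below are illustrative assumptions, not requirements), the .htaccess stays with the protected content while the password file it points to lives outside the DocumentRoot:
# /var/www/html/protected/.htaccess (inside the public directory)
AuthName "Login title"
AuthType Basic
# the password file itself lives outside the DocumentRoot
AuthUserFile /var/www/private/.htpasswd
Require valid-user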
So, as they can be blocked so easily, why keep .htpasswd outside of the public directory? Because mistakes happen, and the .htpasswd file is a big liability. Even if you're using HTTPS, exposure of your .htpasswd file means that your passwords can be easily cracked via brute-force attacks. These days, consumer-grade GPUs can make millions of password guesses per second. This can make even "strong" passwords fall in comparatively little time. Again, this argument generally only applies to a targeted attack, but the fact remains that if an attacker has your .htpasswd file and wants access to your system, these days, they may be able to do so easily. See Speed Hashing on Coding Horror for a relatively-recent (April 2012) overview of the state of things.
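One partial mitigation, assuming a reasonably recent htpasswd (the bcrypt option arrived in the 2.4 era), is to use a deliberately slow hash when creating entries, which makes offline cracking of an exposed file more expensive:
# -B selects bcrypt, -C raises the cost factor (slower to brute-force)
htpasswd -c -B -C 12 /path/to/.htpasswd username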
With that in mind, the possibility of accidentally (temporarily) exposing your .htpasswd file makes it worth moving it somewhere that should never even be looked at when httpd is looking for content to serve. Yes, there are still configuration changes which could expose it if it's "one level up" instead of "in the public directory", but those changes are much less likely to happen accidentally.
Are there any speed implications?
Some.
First off, the use of .htaccess files does slow down things somewhat. More specifically, the AllowOverride all directive causes a lot of potential slow-down. This causes Apache to look for .htaccess files in every directory, and every parent of a directory, that is accessed (up to and including the DocumentRoot). This means querying the filesystem for a file (or updates to the file), for every request. Compared to the alternative of potentially never hitting the filesystem, this is quite a difference.
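If you do not need per-directory overrides at all, the usual way to avoid that filesystem churn is to turn them off in the main configuration; a sketch, with an assumed DocumentRoot:
<Directory "/var/www/html">
    # never look for .htaccess files under this tree
    AllowOverride None
</Directory>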
So, why does .htaccess exist at all? There are many reasons which might make it "worth it":
depending on your server load, you may never notice the difference. Does your server really need to squeeze every last millisecond out of every request? If not, then don't worry about it. As always, don't worry about estimates and projections. Profile your real-world situations and see if it makes a difference.
.htaccess can be modified without restarting the server. In fact, this is what makes it so slow: Apache checks for changes or the presence of an .htaccess file on every request, so changes are applied immediately.
An error in .htaccess will take down the directory, not the server. This makes it much less of a liability than changing the httpd.conf file.
.htaccess can be modified even if you only have write-access to a single directory. This makes it ideal for shared hosting environments. No need to have access to httpd.conf at all, or access to restart the server.
.htaccess can keep the rules for access next to the files they are meant to affect. This can make them a lot easier to find, and just keeps things more organised.
Don't want to use .htaccess, even considering all of the above? Any rule that applies to .htaccess can be added directly to httpd.conf or an included file.
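For example, the Basic authentication rules from the question could be moved into httpd.conf (or an included file) as something like this; the directory path here is an assumption:
<Directory "/var/www/html/admin">
    AuthName "Login title"
    AuthType Basic
    AuthUserFile /path/to/.htpasswd
    Require valid-user
</Directory>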
What about .htpasswd? That depends on how many users you have. It is file-based, and the bare minimum in terms of implementation. From the docs for httpd 2.2:
Because of the way that Basic authentication is specified, your username and password must be verified every time you request a document from the server. This is even if you're reloading the same page, and for every image on the page (if they come from a protected directory). As you can imagine, this slows things down a little. The amount that it slows things down is proportional to the size of the password file, because it has to open up that file, and go down the list of users until it gets to your name. And it has to do this every time a page is loaded.
A consequence of this is that there's a practical limit to how many users you can put in one password file. This limit will vary depending on the performance of your particular server machine, but you can expect to see slowdowns once you get above a few hundred entries, and may wish to consider a different authentication method at that time.
In short, .htpasswd is slow. If you only have a handful of users who need to authenticate, you'll never notice, but it is yet another consideration.
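If you ever reach that point, mod_authn_dbm is one such alternative; a minimal sketch (paths assumed, and the module must be enabled):
# create or update the DBM password file with the htdbm utility:
#   htdbm -c /path/to/passwords.dbm username
AuthName "Login title"
AuthType Basic
AuthBasicProvider dbm
AuthDBMUserFile /path/to/passwords.dbm
Require valid-user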
Summary
Securing an admin section with .htpasswd is not ideal for all situations. Given its simplicity, it may be worth the risks and problems where security and performance are not the highest of priorities. For many situations, with a little bit of tweaking, it can be considered to be "good enough". What constitutes "good enough" is a judgement call for you to make.

Related

Is it possible to find out the version of Apache HTTPD when ServerSignature is off?

I have a question. Can I find out the version of Apache when the full signature is disabled? Is it even possible? If it is, how? I think it must be possible, because blackhats hack big corporate servers, and knowledge of the version of the victim's services is essential. What do you think? Thanks.
Well for a start there are two (or even three) things to hide:
The Server header - which shows the version in the Server response header field. This cannot be turned off in Apache config but can be reduced to just "Apache".
ServerSignature - which displays the server version in the footer of error pages.
X-Powered-By - which is not set by Apache itself but by back-end servers and services it might send requests to (e.g. PHP, J2EE servers, etc.).
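In the Apache configuration the first two are controlled with ServerTokens and ServerSignature; the X-Powered-By header has to be dealt with in the backend (e.g. expose_php = Off for PHP) or stripped with mod_headers. A sketch:
# reduce the Server header to just "Apache"
ServerTokens Prod
# remove the version footer from server-generated pages
ServerSignature Off
# strip a backend's X-Powered-By header (requires mod_headers)
Header unset X-Powered-By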
Now servers do show some information due to differences in how they operate or how they interpret the spec. For example, the order of response headers, capitalisation, and how they respond to certain requests all give clues as to what server software might be being used to answer HTTP requests. However, using this to fingerprint a specific version of that software is more tricky unless there was an obvious, observable change from the client side.
Other options include looking at the server-status page - though you would hope any administrator clever enough to reduce the default server header would also restrict access to the server-status page. Or going through another security hole (e.g. being able to upload executable scripts or the like).
I would guess most hackers would more be aware of bugs and exploits in some versions of Apache or other web servers and try to see if any of those can be exploited rather than trying to guess the specific version first.
In fact, as an interesting aside, Apache themselves have long been of the opinion that hiding server header information is pointless "security through obscurity" (a point I, and many others, disagree with them on), even putting this in their documentation:
Setting ServerTokens to less than minimal is not recommended because it makes it more difficult to debug interoperational problems. Also note that disabling the Server: header does nothing at all to make your server more secure. The idea of "security through obscurity" is a myth and leads to a false sense of safety.
And even allowing open access to their server-status page.

Why is .htaccess insecure by default to prevent unauthorized access?

I was browsing the web and came across the following:
Source code, including config files, are stored in publicly accessible directories along with files that are meant to be downloaded (such as static assets). [...] You can use .htaccess to limit access. This is not ideal, because it is insecure by default, but there is no other alternative.
Source: owasp.org
Sometimes I use the following code to prevent access from a specific directory:
# contents of .htaccess
order deny,allow
deny from all
allow from none
On servers where there is access outside of the webroot there is obviously less need to prevent access to folders/files with .htaccess.
Can someone explain why they write ".htaccess is insecure by default" and what are alternative ways to prevent access to certain files on a regular LAMP-stack?
.htaccess is not a complete security solution. It doesn't protect you from DDoS, sniffing, or man-in-the-middle attacks (when using auth) without SSL.
As far as denying access to specific files, it's generally fine. The scenarios under which it would fail to do so are scenarios where there has already been a successful exploit somewhere else. Since any files in the directory have to be readable by the process owner, the files are only superficially secured by .htaccess.

Encrypting a file that stores SQL passwords

What is the most secure way to store username/password combinations for databases that are used by Apache?
There must be something more secure than just storing cleartext passwords in a single file and putting it in a folder that only root and Apache have access to.
Well, let me ask this: who are you protecting from? You want to be "more secure", but you haven't identified any attack vector.
External users
If you're trying to protect them from external users, you have two options:
Move the file outside of the webroot. Therefore, the attacker cannot get the file as Apache won't serve it.
Protect the file via permissions or DENY in Apache. That way Apache won't serve it.
The first option is better, as a misconfiguration can't expose the credentials.
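If the file has to stay inside the webroot for some reason, the second option could look like this (the filename is just an example, and the syntax shown is the 2.2 style used elsewhere in this thread):
<Files "db-credentials.ini">
    Order Allow,Deny
    Deny from all
</Files>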
Internal users (people who have access to the server)
You can't protect against them. If the attacker can get on the server, they can (in general at least) get access to the credentials.
Yes, you can encrypt the file. But Apache must know how to decrypt it, so the attacker can figure that out.
Your only defense is to set permissions on the file properly. Make it so that it can only be read by Apache (nobody).
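A rough sketch of that, assuming the file lives outside the webroot and the web server runs as www-data (the path and user/group names vary by system):
# readable by root and the web server's group, and nobody else
chown root:www-data /etc/myapp/db-credentials.ini
chmod 640 /etc/myapp/db-credentials.ini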
In short, there's little to no gain to encrypting credentials, as if your server is properly configured there's no chance of them leaking to someone that they wouldn't leak to already.
Protect your application from other attack vectors (SQLi, Code Injection, XSS, etc) as they are more likely to be the ways that an attacker is going to get in...

Security problems regarding +FollowSymLinks and -SymLinksIfOwnerMatch?

I'm using this for one of my applications:
Options +FollowSymLinks -SymLinksIfOwnerMatch
And I worry about the security problems this may bring. Any idea what measures I can take to make this approach as secure as possible?
There's nothing specific you can do to make using those options more secure. The risk in using them is that a user, or a process running under a user, can disclose information or even hijack content by creating symlinks. For example, if an unprivileged user (who may have been compromised) wants to read a file that they normally can't, they can sort of escalate it by creating a symlink from their public_html directory to it, and if Apache can read it, they can then just access their webpage and read the file. There's nothing specific you can do to prevent something like that from happening except to make sure your system is properly patched and configured.
Note that this threat isn't just from users on your system. If you are running a webapp in, say php, and it got compromised somehow, an attacker can upload a php file browser and create symlinks to content outside of your document root (like to /etc/passwd or some other file you don't want exposed to the web).
If you're worried about stuff like that, it's better not to use these options.
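If you do need symlinks for your own content, the more restrictive combination is the usual compromise; a sketch:
# only follow a symlink when its target is owned by the same user as the link itself
Options -FollowSymLinks +SymLinksIfOwnerMatch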

How do I react when somebody tries to guess admin directories on my website?

I've been getting these messages in apache error.log for quite a while:
[client 217.197.152.228] File does not exist: /var/www/phpmyadmin
[client 217.197.152.228] File does not exist: /var/www/pma
[client 217.197.152.228] File does not exist: /var/www/admin
[client 217.197.152.228] File does not exist: /var/www/dbadmin
[client 217.197.152.228] File does not exist: /var/www/myadmin
[client 217.197.152.228] File does not exist: /var/www/PHPMYADMIN
[client 217.197.152.228] File does not exist: /var/www/phpMyAdmin
And many more different addresses. Looks like somebody is trying to guess where my admin applications are located. What should I fear in this situation, and what can knowledge of my admin addresses give to an attacker, if everything is password protected?
If everything is locked down well, fear nothing. These are just automated attacks that happen to every URL in existence. Same thing happens to me, and I don't even run PHP on my server.
If you don't have the latest patches (like on say, WordPress), then yes this is a big problem, but one that's relatively easy to fix.
If you have admin or restricted folders, you could restrict access to just your IP or IP range with a block like this in your Apache configuration (note that <Directory> sections belong in the server config, not inside .htaccess itself):
<Directory /var/www/AdminFolder/>
Options FollowSymLinks
Order Deny,Allow
Deny from all
# your IP only
Allow from 128.98.2.4
</Directory>
It will only be a good solution if you have a static IP, but then you will be completely sure that you'll be the only one able to get inside the admin folder.
If they find a login page they could try a brute-force attack or another password-cracking approach.
In these cases, if there is an IP that is consistently displaying such behaviour, we block it with DenyHosts and ModSecurity.
Firstly... Never install in a default folder.
Secondly... If you "must" use a prefab program, rename the admin folders to something less tasty, like "homework". No-one will ever look there for anything important. (Due to the many poor coding techniques of prefab programs, they do not operate willingly when you relocate and rename folders. You would think security would be their primary goal, but having an admin folder at www/home level, and no ability to select its location or name, is your first sign of poor programming.)
Thirdly... Move all your INCLUDES above www/home. (Move above = move back one level, to the folder that contains the www/home folder.) Again, expect to get your hands wet with code, as the programmers most likely did not follow that simple security practice. You will have to tell your code to look in the new includes path, which is now above the www/home folder.
Fourthly... Where possible, set up a LOCK-OUT on your admin folder. Use your FTP client or cPanel to unlock the folder only when it is needed. You should not be logged in daily, or with a common login. For all you know, you have a virus on your computer and it is watching you type in your password every time, or capturing your cookies, or injecting worms onto your server once you have logged in. The better alternative is to find programs with external admin controls, i.e. no software on the server. The software stays on your PC, and it only accesses your server to update changes, briefly.
Fifthly... Get a blacklist plugin for your server, or request one. Your HOST should be blocking those types of obvious scans at the router level, sending repeated requests to a black hole. Hosts that don't provide that lowest level of security should not be used; that is what the hackers use, since such hosts don't block them when they attack. (Expect that your NETWORK NEIGHBORS on a shared server are potentially hackers. They will be fishing inside your shared temp files for sessions, cookie data, backup code, SQL/RAM data, etc., looking for anything important. Usually your clients' e-mails and passwords, CC info, PayPal info, telephone numbers, addresses, and anything else not nailed down, to sell.)
Sixthly... Find a host which does not have prefab programs available. Why? Because they are all poorly secured, freeware, yesteryear's versions: unpatchable, poorly configured, and your neighbors are using them too. If you have no neighbors, great... but that does not make you any safer. The installers have decreased your server security so that they can install the programs, even if you never install one of them. Hackers exploit that also, installing things they know they can hack that exist in your cPanel or server control panel, and then they hack in through those exploited installed programs.
LOL, just print books... J/K, that is hackable too!
You know what is not hackable... Pure HTML, on a server without PHP, ASP, MySQL, FTP, e-mailers, and all the other things we all love to play with so much. Oh, but the HTML has to be on a CD, or a hard-drive with the erase-heads unwired. Hehe...
It seems he's looking for phpMyAdmin installations, probably to automatically try and use known exploits on old versions.
If you're not using phpMyAdmin you should be fine. If you do, make sure it's updated to the latest version, and maybe move it to a non-guessable URL.
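If you do keep phpMyAdmin, one way to move it off the default URL is a simple Alias in the Apache configuration; the filesystem path below is the Debian package default and just an assumption, and you may also need a matching <Directory> section permitting access:
# serve phpMyAdmin from a non-obvious URL instead of /phpmyadmin
Alias /db-tools-8c1f /usr/share/phpmyadmin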
If you have protected everything it's no real big deal.
http://217.197.152.228/phpmyadmin/ <- that's where your phpMyAdmin is running. Seems it's password protected etc., so don't worry too much!
There are some exploits that will reveal info; in fact, your phpMyAdmin is vulnerable to some attacks:
http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2007-0204
Maybe you should check for exploit docs on your phpMyAdmin version.