Securely host a password-protected static website, without creating security vulnerabilities, alongside other IIS websites - apache

I would like to host a password protected static website on a server, and meet the following 2 requirements:
The static website credentials MUST NOT give any additional access to the hosting server.
The hosting must play nicely with other IIS-hosted websites
The hosting server is running Windows 10 Pro.
I've identified 4 options:
Host it in IIS with Basic Authentication enabled
Host it in Apache, separate port, secure with .htpasswd file
Host it in Apache in a VM, use a bridged network, secure with .htpasswd file
Develop a middleware/route request authentication application
Option 1:
Evidently, this option requires a whole new user account on the computer.
I do not understand the limitations of a new user's access.
When I hit Windows key + R and run netplwiz, I can configure the user to belong to one of these groups:
Users (default): Users are prevented from making accidental or intentional system-wide changes and can run most applications.
Guest: Guests have the same access as members of the Users group by default, except for the Guest account, which is further restricted as described earlier.
IIS_IUSRS: Built-in group used by Internet Information Services.
I can not find the following information in any Microsoft docs:
How IIS_IUSRS is "used" by IIS
If any of these groups restrict all access, other than viewing the Basic Auth website
An exhaustive list of permissions granted by the user login credentials, and each group
This method seems confusing and annoying at best, and a complete security failure at worst.
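For reference, a minimal sketch of what Option 1 involves, assuming a dedicated local account (StaticSiteUser is a hypothetical name) and a site rooted at C:\inetpub\staticsite. Basic Authentication is enabled in the site's web.config (the authentication sections may need to be unlocked in applicationHost.config first), and NTFS permissions limit the account to the site folder:

<configuration>
  <system.webServer>
    <security>
      <authentication>
        <!-- require a username/password for every request -->
        <anonymousAuthentication enabled="false" />
        <basicAuthentication enabled="true" />
      </authentication>
    </security>
  </system.webServer>
</configuration>

rem grant the dedicated account read-only access to the site folder only
icacls "C:\inetpub\staticsite" /grant "StaticSiteUser:(OI)(CI)R"

Pairing this with "Deny log on locally" for that account in Local Security Policy keeps the credentials from being usable for an interactive sign-in.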
Option 2:
This seems more secure to me, because I can better understand the limits of the access it grants.
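As a sketch of Option 2 (the paths and port below are assumptions, not taken from the question): the credentials live only in an .htpasswd file, so they are not Windows accounts and grant no access to the host itself.

rem create the password file and its first user (prompts for the password)
htpasswd -c "C:/Apache24/conf/.htpasswd" staticuser

# httpd.conf: serve the static site on its own port with Basic Auth
Listen 8081
<VirtualHost *:8081>
    DocumentRoot "C:/Apache24/htdocs/static"
    <Directory "C:/Apache24/htdocs/static">
        AuthType Basic
        AuthName "Restricted static site"
        AuthUserFile "C:/Apache24/conf/.htpasswd"
        Require valid-user
    </Directory>
</VirtualHost>

Port 8081 is chosen only to stay clear of IIS on port 80; any free port works.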
Option 3:
This seems even more secure, because the hosting server is not directly accessed.
I do not know whether this creates other security vulnerabilities, though.
Option 4:
This one seems the most secure, because I have full understanding and control over the website's access.
This could take a lot of work though.

An organization can adopt the following policies to protect itself against web server attacks.
Patch management – this involves installing patches to help secure the server. A patch is an update that fixes a bug in the software. Patches can be applied to the operating system and to the web server software.
Secure installation and configuration of the operating system
Secure installation and configuration of the web server software
Vulnerability scanning and intrusion detection – this involves tools such as Nmap (network and port scanning) and Snort (intrusion detection)
Firewalls can be used to stop simple DoS attacks by blocking all traffic from the identified source IP addresses of the attacker (see the sketch after this list)
Antivirus software can be used to remove malicious software on the server
Disabling Remote Administration
Default accounts and unused accounts must be removed from the system
Default ports & settings (like FTP at port 21) should be changed to custom port & settings (FTP port at 5069

Related

How to access On-premise TFS from untrusted domain when NTLM is disabled

Our organisation is disabling NTLM due to concerns with its pass the hash weakness. Our team includes 3rd party developers who use PCs that are in an untrusted domain. When they attempt to access our on-premise TFS instance from Visual Studio the authentication fails. My research shows that this is due to Kerberos only working when a trust exists.
Does anyone know of a work-around? The security team are simply suggesting we set up a VDI environment!
Set up a VPN connection. This will allow you to create a tunnel between the networks so that the machines in the outside domain can reach your server. Another way is to expose TFS to the outside world by allowing the appropriate ports through your firewall, which may run counter to the security requirements of your organisation.
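If you do open the firewall instead, the rule itself is small; a sketch assuming TFS is listening on its default HTTP port 8080 (check your own instance's binding):

netsh advfirewall firewall add rule name="TFS web access" dir=in action=allow protocol=TCP localport=8080

A VPN is still the safer option, because nothing is exposed directly to the internet.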

Symfony permission recommendation: same user cli and webserver

I read this recommendation in the installation guidelines from Symfony:
1. Use the same user for the CLI and the web server
In development environments, it is a common practice to use the same UNIX user for the CLI and the web server because it avoids any of these permissions issues when setting up new projects. This can be done by editing your web server configuration (e.g. commonly httpd.conf or apache2.conf for Apache) and setting its user to be the same as your CLI user (e.g. for Apache, update the User and Group values).
Is this only good practice for local development environments, or should I do this on my public test & prod servers as well? To me this doesn't seem like a very secure configuration.
Questions: Can I safely follow this recommendation on a prod server? What are the risks, if there are any?
This recommendation gives an easy way to avoid the common permissions problems.
I would prefer to set up the web server permissions correctly once and keep the default webserver group/user.
The documentation has a good guide for achieving this.
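For example, the Symfony file permissions guide takes the ACL route, so both the CLI user and the web server user can write to var/ without sharing an account; a sketch assuming Apache runs as www-data and the project lives in /var/www/project (adjust both to your setup):

# let the web server user and the current CLI user write to var/
HTTPDUSER=www-data
sudo setfacl -dR -m u:"$HTTPDUSER":rwX -m u:"$(whoami)":rwX /var/www/project/var
sudo setfacl -R  -m u:"$HTTPDUSER":rwX -m u:"$(whoami)":rwX /var/www/project/var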
EDIT
You shouldn't make your CLI user as your webserver user, especially in production because it opens you up to all kinds of potential abuse.
The whole point of the www-data user is that it is an unprivileged user, by default not able to write to any file.
Your CLI user is most often root; keeping www-data as the web server user also protects you from bad manipulations that could cause a lot of problems and potential security issues.
Plus, if your web server is under attack, other services that depend on the same user can also be compromised.
Server daemons accessible from the outside network (such as the web server) typically run as an unprivileged user so that, in the event they are hacked through a vulnerability, the damage an attacker can do is minimal.

Security Risks Associated With Local Web Servers

If I set up a local server using, say, Apache or WAMP are there any associated security risks? I'm not planning on hosting or making any content "publicly accessible," I just want to set up an environment where I can learn PHP and develop using an HTML5 game engine. Sorry if this is a completely naive question; I'm just a bit confused about how server security works.
If you don't open up any ports in your router to allow for public access to your web server, then it won't add any security risks. Just installing the local web server won't do this.
On a side note, WAMP is a collection of tools that includes Apache as the web server; they are not two different web servers.
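If you want extra assurance that the development server is reachable only from your own machine, Apache can also be bound to the loopback interface; a one-line sketch for httpd.conf (port 80 assumed):

# accept connections from this machine only
Listen 127.0.0.1:80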

Authentication issue with IIS

Using IIS 6.
I have the default web site that works and can authenticate users to the domain when they connect.
I have created a second website, siteb, and put a host (A) record into DNS. I can browse to it as long as I use anonymous access, but when I select Windows Authentication, it fails...
Not sure what I'm missing here...
Thanks.
This goes beyond just IIS if you're using Integrated Windows Authentication. You've created "siteb" in DNS which allows your users to connect to it so this is good. However, when their browser requests a Kerberos ticket for "siteb" from Active Directory, AD is probably responding that it cannot find "siteb". You can verify this with Wireshark.
The fix is to add "siteb" (and any other permutations with which you expect users to access the site) as an additional servicePrincipalName for the server's machine account in AD. You can accomplish this with the "setspn.exe" utility. It should be available on your domain controller. If not, you can install it from the Windows 2003 Support Tools.
Some examples of adding an SPN alias with setspn on the DC are:
setspn.exe -A HTTP/siteb <server hostname>
setspn.exe -A HTTP/siteb.acme.com <server hostname>
This should take effect immediately. The final step is ensuring that the browser "trusts" the new website name. In Internet Explorer, for IWA to occur automatically, the server name should be listed in either the Trusted Sites or Intranet zone.
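To confirm the SPNs were actually registered, you can list the machine account's SPNs on the DC, for example:

setspn.exe -L <server hostname>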
Of course, you could avoid all this hoopla by having the 2nd website just run on a different port under the same name, e.g.: http://sitea:81

Using ldap locally to share login info with webapps - Do I need Kerberos too?

So I'm setting up a dedicated server using Debian 5 Lenny. I will be using some Atlassian Tools (JIRA, Confluence, Bamboo, and Fisheye). I want to use a local LDAP server to store information for the users that will be accessing these software titles, so that they can use one set of credentials to log in.
I also want webmail users to be configured using LDAP.
However, this is a small operation. Three people. That's why all of the software, including the ldap server, will all be on the same machine.
That said, is it safe to store user credentials (including passwords) in LDAP without using Kerberos? I'm confused as to when Kerberos should be used.
Hypothetically, let's say I had two servers on a subnet. Server A receives requests from the outside world for the Atlassian tools, and communicates with the LDAP server (internally) on Server B. In that case, would I use Kerberos?
When do I use Kerberos? When do I not?
I am not setting anything like "Active Directory" up. No Samba either. Users do not need to log in to a domain (with access to files on the domain); they just need to log in to webapps. But if I was doing LDAP on its own dedicated machine, then I might want Kerberos?
:confuzzled: :(
-Sam
The simplest possible answer is yes, it is possible to store user names, user ids, and passwords without using Kerberos, and in fact directory services accessed via LDAP are an excellent tool for storing this sort of authentication and authorization information.
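As a sketch of what such an entry might look like (the DN, attribute values, and hash below are placeholders; store passwords hashed, e.g. with SSHA, rather than in clear text):

dn: uid=sam,ou=people,dc=example,dc=com
objectClass: inetOrgPerson
uid: sam
cn: Sam Example
sn: Example
mail: sam@example.com
userPassword: {SSHA}BASE64HASHPLACEHOLDER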
Update:
In my opinion, if you do choose an open source server, you will find OpenDS to be superior to OpenLDAP or Apache Directory.
Basically, if you already have Kerberos, you do not need a directory server. If you aren't in a corporate environment and are just looking for an identity management store, you should definitely go for a directory server like OpenLDAP or Apache Directory. Kerberos requires a correctly set up DNS and NTP server, which might be far more than you need. Even if you did set it up, Atlassian still has not implemented Kerberos support in its products, so you couldn't go that route anyway.
I just noticed that there are only three of you, maybe a simple database setup with MySQL would suffice instead of running a full-blown directory server?