I have a server and I created a tunnel with Cloudflare to be able to access it from outside, creating an application in Cloudflare Zero Trust, and I am trying to disable all the policies so that it lets me access the application without authentication.
But nothing works: whenever I try to open my linkstream.domain.org application, it asks me for authentication via email.
The reason is that this application lets me play streams, so I don't want any authentication in front of it.
I've searched and configured everything, but I can't remove this authentication from Cloudflare:
Cloudflare Access is a product that can be used to add authentication to an application. If you want your application to be public (i.e. no authentication), I'd recommend not adding it to Access at all. You can set up a Cloudflare Tunnel without adding any Access application, for example to expose a webserver to the public.
I'd also recommend looking at the Allow policies.
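If you're not sure whether the hostname is still covered by an Access application, a quick check from any machine is to look for the redirect to the Access login page. This is just a sketch using Python's requests package; linkstream.domain.org is the hostname from the question.

import requests

# An app still protected by Access answers anonymous requests with a
# redirect to a *.cloudflareaccess.com login page.
resp = requests.get("https://linkstream.domain.org", allow_redirects=False, timeout=10)
location = resp.headers.get("Location", "")

if "cloudflareaccess.com" in location:
    print("Still behind Access:", location)
else:
    print("No Access redirect detected; status", resp.status_code)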
The Sitecore security hardening guide instructs you to restrict access to the /sitecore/admin folder by disabling anonymous access. However, after I do that, I get an IIS error when I try to visit pages like /sitecore/admin/cache.aspx.
HTTP Error 401.2 - Unauthorized. You are not authorized to view this page due to invalid authentication headers.
Should anonymous access only be disabled if I don't want to access admin pages?
My Sitecore version is 6.6.0 (rev. 130404).
In addition to disabling anonymous access, you should make sure some other authentication method is enabled. By default, IIS7+ doesn't have any other authentication methods available, so all traffic will get an "unauthorized" error. With another means of authentication enabled, IIS will let you access the /sitecore/admin path (at which point, Sitecore's authentication may kick in).
I've done this in the past by creating a local user on the machine and enabling basic auth. Keep in mind, basic auth is not too secure since credentials are passed over the wire as cleartext, but in this case we forced traffic over SSL.
Though not spelled out in the hardening guide, you could also look at limiting access to that directory by IP address. For example, on a production content delivery server, restrict access to only localhost, meaning you cannot browse that directory without being RDP'd to the server directly.
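As a rough way to confirm the behaviour described above, here is a small sketch using Python's requests package; the host name and the local account are placeholders, not values from the question.

import requests
from requests.auth import HTTPBasicAuth

url = "https://www.example.com/sitecore/admin/cache.aspx"  # placeholder host

# With anonymous access disabled, an unauthenticated request should get 401.
print(requests.get(url).status_code)

# With basic auth enabled and a local Windows account created for this purpose,
# the page should load (Sitecore's own login may still apply after that).
print(requests.get(url, auth=HTTPBasicAuth("sitecoreadmin", "s3cret")).status_code)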
I think you should remove extranet/anonymous access, but make sure that sitecore/everybody (or other role) has access.
That way you can only access it while logged into Sitecore.
Use the Access Viewer to check that users have access to it.
And I think those pages have a Sitecore login now. I know /sitecore/admin/dbbrowser.aspx has one.
I would not disable anonymous access unless it is the production environment. I am not sure how you have your environments set up, but ideally cache clearing should happen on your stage/UAT environment.
I have a web-based application which is used to find information about various assets in a facility. It provides only search capability; no CRUD operations are allowed from the application (except for READ). The application is always kept open on a touchscreen device (i.e. a workstation) and could be used by any of the facility staff. The users do not want to log in and out for each search operation.
We are planning on deploying the web application to the cloud. Although there is no need to authenticate the individual user who is accessing the web application, we still need to ensure that information about assets in the facility is not accessible to others. How do I build this authentication layer? The options I can think of are:
1. Include a userid/password in the URL as parameters. I could create a userid/password for each facility. Simple, but the userid/password is always visible.
2. Certificate-based approach. Certificates are created for each of these workstations and deployed to them. Quite secure, but it has the challenge of managing the certificate life-cycle, as well as the challenge of configuring the web servers with certificates from different facilities.
Any suggestions?
Thanks,
Prasanna
A simple but not very secure option: do an IP check, and if the IP is from your facility, grant access.
The second, more secure method is to verify a password once when the application starts and store a session, so that you know that people from your facility are accessing the site.
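A rough sketch of both ideas together, assuming a Flask front end; the facility network range, the shared password, and the secret key are placeholders.

import ipaddress
from flask import Flask, request, session, redirect, abort

app = Flask(__name__)
app.secret_key = "change-me"

FACILITY_NETWORK = ipaddress.ip_network("203.0.113.0/24")  # placeholder range
FACILITY_PASSWORD = "shared-facility-password"             # placeholder

@app.before_request
def restrict_access():
    # Option 1: only accept requests that originate from the facility.
    if ipaddress.ip_address(request.remote_addr) not in FACILITY_NETWORK:
        abort(403)
    # Option 2: ask for a password once per browser session.
    if request.endpoint != "login" and not session.get("unlocked"):
        return redirect("/login")

@app.route("/login", methods=["GET", "POST"])
def login():
    if request.method == "POST" and request.form.get("password") == FACILITY_PASSWORD:
        session["unlocked"] = True
        return redirect("/")
    return '<form method="post"><input type="password" name="password"><button>Unlock</button></form>'

@app.route("/")
def search():
    return "Asset search page"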
We have a SharePoint 2010 site and are using the same URL to access it from inside and outside the network.
Our issue is that we don't want users inside our network to be asked for credentials when accessing the site.
If, for example, the URL we wanted to use was https://sp.domain.com, how would we set this up?
You can do this via Alternate Access Mappings (AAM). Have two zones for your SharePoint site, Intranet and Extranet. In the AAM settings:
Intranet: http://sp.domain.com
Extranet: http://extranet.domain.com
You did not specify which authentication scheme you are using for the outside network. If it's forms authentication, you can set Windows authentication for the Intranet zone and forms authentication for the Extranet zone.
However, if it's AD only for both, you will have to have sp.domain.com configured as an intranet URL and extranet.domain.com as an internet URL on each client computer. This can be done using Group Policy.
Your proxy server will have to do the work of translating sp.domain.com from the external network to extranet.domain.com internally.
Good to read:
http://sharepoint.microsoft.com/blog/Pages/BlogPost.aspx?pID=804
I'm after some theoretical help on the best approach for allowing a SaaS product to authenticate users against a tenant's internal Active Directory (or other LDAP) server.
The application is hosted, but a requirement exists that tenants can delegate authentication to their existing user management provider, such as AD or OpenLDAP. Tools such as Microsoft Online's hosted Exchange support corporate AD sync.
Assuming the client doesn't want to forward port 389 to their domain controller, what is the best approach for this?
After doing some research and talking to a few system admins who would be managing this, we've settled on two options, which should satisfy most people. I'll describe them here for those who are also interested in the outcome.
Authentication Service installed in the organisation's DMZ
If users wish to utilise authentication with an on-premises Active Directory server, they will be required to install an agent in their DMZ and open port 443 to it. Our service will be configured to hit this service to perform authentication.
This service will sit in the DMZ and receive authentication requests from the SaaS application. The service will attempt to bind to Active Directory with these credentials and return a status to indicate success or failure.
In this instance the application's forms-based authentication will not change, and the user will not be aware of the authentication happening behind the scenes.
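A rough sketch of what that DMZ agent could look like, assuming Python with the flask and ldap3 packages; the domain controller hostname, UPN suffix, and certificate paths are placeholders.

from flask import Flask, request, jsonify
from ldap3 import Server, Connection, ALL
from ldap3.core.exceptions import LDAPException

app = Flask(__name__)
DC = Server("dc01.corp.example.com", use_ssl=True, get_info=ALL)  # placeholder DC

@app.route("/authenticate", methods=["POST"])
def authenticate():
    username = request.json["username"]  # e.g. jsmith
    password = request.json["password"]
    try:
        # A successful simple bind means the credentials are valid in AD.
        conn = Connection(DC, user=f"{username}@corp.example.com",
                          password=password, auto_bind=True)
        conn.unbind()
        return jsonify({"authenticated": True})
    except LDAPException:
        return jsonify({"authenticated": False}), 401

if __name__ == "__main__":
    # In practice this sits in the DMZ behind TLS on port 443.
    app.run(host="0.0.0.0", port=443, ssl_context=("dmz-agent.pem", "dmz-agent.key"))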
OpenId
Similar to the first approach, a service will be installed in the client's DMZ, and port 443 will be opened. This will be an OpenId provider.
The SaaS application will be an OpenId consumer (already is for Facebook, Twitter, Google etc login).
When a user wishes to log in, the OpenId provider will be presented, asking them to enter their user name and password. This login screen would be served from the client's DMZ. The user would never enter their username or password into the SaaS application.
In this instance, the existing forms-based authentication is replaced with the OpenId authentication from the service in the client's DMZ.
A third option that we're investigating is Active Directory Federation Services, but this is proprietary to Active Directory. The other two solutions support any LDAP-based authentication across the internet.
Perhaps this might help…
This vendor, Stormpath, offers a service providing user authentication and user account management, with hookups to your customers' on-premises directories.
What about an LDAPS connection to the customer's user directory? They can firewall this off so that only your servers have access if they're concerned about it being public. Since it's SSL, it's secure end to end. All you need from them is the certificate from their issuing CA (if it's not a public one). I struggled to get this working for an internal web project in the DMZ, and there's a real lack of guides online, so I wrote one up when I'd got it working:
http://pcloadletter.co.uk/2011/06/27/active-directory-authentication-using-ldaps/
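For what it's worth, here is a sketch of such an LDAPS bind using Python's ldap3 package; the hostname, service account, password, and CA certificate path are placeholders, and the write-up linked above covers the server-side details.

import ssl
from ldap3 import Server, Connection, Tls

# Trust the customer's issuing CA explicitly (the certificate they supply).
tls = Tls(validate=ssl.CERT_REQUIRED, ca_certs_file="customer-issuing-ca.pem")

server = Server("ldaps.customer.example.com", port=636, use_ssl=True, tls=tls)

conn = Connection(server,
                  user="CN=svc-saas,OU=Service Accounts,DC=customer,DC=example,DC=com",
                  password="service-account-password",
                  auto_bind=True)

# Look a user up; to verify an end user's credentials you would instead bind
# directly as user@customer.example.com with the password they entered.
conn.search("DC=customer,DC=example,DC=com",
            "(sAMAccountName=jsmith)",
            attributes=["distinguishedName"])
print(conn.entries)
conn.unbind()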
Your best bet is to implement SAML authentication for your SaaS application and then sign up with identity providers like Okta or OneLogin. Once that's done, you can also connect it with ADFS to provide single sign-on for your web application through Active Directory.
I'm just doing this research myself and this is what I've come across; I'll have more updates once the implementation is done. Hope this gives you enough keywords to do another Google search.
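As a very rough illustration of the service-provider side, here is a sketch using the python3-saml package with Flask; the ./saml settings directory and all endpoints are placeholders, and the metadata exchange with Okta/OneLogin/ADFS is left out.

from flask import Flask, request, redirect, session
from onelogin.saml2.auth import OneLogin_Saml2_Auth

app = Flask(__name__)
app.secret_key = "change-me"

def saml_auth(req):
    # Translate the Flask request into the shape python3-saml expects.
    request_data = {
        "https": "on" if req.scheme == "https" else "off",
        "http_host": req.host,
        "script_name": req.path,
        "get_data": req.args.copy(),
        "post_data": req.form.copy(),
    }
    return OneLogin_Saml2_Auth(request_data, custom_base_path="./saml")

@app.route("/login")
def login():
    # Redirect the user to the identity provider (Okta/OneLogin/ADFS).
    return redirect(saml_auth(request).login())

@app.route("/acs", methods=["POST"])
def acs():
    # The IdP posts the signed assertion back here.
    auth = saml_auth(request)
    auth.process_response()
    if auth.get_errors() or not auth.is_authenticated():
        return "SAML authentication failed", 401
    session["user"] = auth.get_nameid()
    return redirect("/")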
My understanding is that there are three possible solutions:
Install something on the domain controller to capture all user changes (additions, deletions, password changes) and send updates to the remote server. Unfortunately there's no way for the website to know the initial user passwords - only new ones once they are changed.
Provide access for the web server to connect to your domain controller via LDAP/WIF/ADFS. This would probably mean opening incoming ports in the company's firewall to allow a specific IP.
Otherwise, bypass usernames/passwords and use email-based authentication instead. Users would just have to authenticate via email once every 3-6 months for each device.
I have to begin implementing this for an upcoming project and I'm seriously leaning towards option #3 for simplicity; a rough sketch of that email-token flow is below.
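For option #3, the sketch below shows one way to issue an expiring, signed login link by email, using Python's itsdangerous package; the secret key, the URL, and the mail-sending step are placeholders.

from itsdangerous import URLSafeTimedSerializer, BadSignature, SignatureExpired

serializer = URLSafeTimedSerializer("change-me")  # placeholder secret
NINETY_DAYS = 90 * 24 * 3600

def make_login_link(email):
    token = serializer.dumps(email)
    # Send this link by email (delivery is left out of the sketch).
    return f"https://app.example.com/auth?token={token}"

def verify_token(token):
    try:
        # Returns the email address if the token is genuine and not expired.
        return serializer.loads(token, max_age=NINETY_DAYS)
    except (BadSignature, SignatureExpired):
        return None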
I have a project whose purpose is to expose multiple web applications over the internet. These applications are built using IIS/.NET and Apache/PHP.
The internet user should log in in only one place and then be able to access any application.
What are the possible solutions to this scenario? One requirement is that changes to existing applications be minimal, and another is to use Active Directory for user management.
So far I have found the following solutions:
use a reverse proxy (COTS product) to publish web applications to the internet, and the proxy should take care of authentication/SSO
use forms authentication and a domain-wide cookie; this solution requires changes to the existing applications and a manual login against AD (a rough sketch follows after this list)
create a new application using forms authentication; after the user enters credentials into this application, use those credentials to send an XMLHttpRequest to the other applications (this will log the user in)
use client certificates, so that when a user connects to an application, their certificate handles the login process; this approach has a problem when more than one certificate is installed in the client browser, because the browser will ask the user to choose a certificate (and this will happen for every app)
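As a rough sketch of the domain-wide cookie option (the second one above), assuming a small Flask login portal and applications that share one signing secret; every hostname and secret here is a placeholder, and the AD check is stubbed out.

from itsdangerous import URLSafeTimedSerializer, BadSignature
from flask import Flask, request, make_response, redirect

app = Flask(__name__)
signer = URLSafeTimedSerializer("shared-secret-known-to-all-apps")  # placeholder

def valid_ad_credentials(username, password):
    # Placeholder: in practice this would be an LDAP bind against AD,
    # as in the ldap3 sketches earlier in the thread.
    return bool(username and password)

@app.route("/login", methods=["POST"])
def login():
    username = request.form["username"]
    if not valid_ad_credentials(username, request.form["password"]):
        return "Invalid credentials", 401
    response = make_response(redirect(request.args.get("next", "/")))
    # domain=".example.com" makes the cookie visible to every app under it.
    response.set_cookie("sso_user", signer.dumps(username),
                        domain=".example.com", secure=True, httponly=True)
    return response

# In each existing application, a small filter reads and verifies the cookie:
def current_user(cookie_value):
    try:
        return signer.loads(cookie_value, max_age=8 * 3600)  # 8-hour sessions
    except BadSignature:
        return None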