servicestack auth breaks at 4.0.21 - authentication

I ran into a problem when I upgraded ServiceStack recently. I stepped through the different versions and found that the problem starts at v4.0.21: all earlier versions work and all later versions do not. It only happens with my call to authenticate (with Basic Auth), and only when the authenticate call comes from our iPad app. Since we have not made any changes to the iPad app, I know that something changed in ServiceStack that is causing our problem. I looked through the release notes and found there were a lot of changes to the auth capability due to the addition of a Windows Auth provider and a couple of other new OAuth providers.
Are there changes to the settings that I need to make that I am not seeing in the release notes? Has anyone else encountered this problem?
Here is our code for registering for ss-auth in global.asax.cs:
public override void Configure(Funq.Container container)
{
    SetConfig(new HostConfig {
        EnableFeatures = Feature.All.Remove(Feature.Metadata),
        AllowJsonpRequests = false,
        HandlerFactoryPath = "api"
    });

    var authFeature = new AuthFeature(
        () => new AuthUserSession(),
        new IAuthProvider[] {
            new MyBasicAuthProvider() // override of BasicAuthProvider
        }
    );
    authFeature.HtmlRedirect = null;
    authFeature.IncludeAssignRoleServices = false;
    Plugins.Add(authFeature);

    container.Register<ICacheClient>(new AzureCacheClient("default"));

    var userRepository = new InMemoryAuthRepository();
    container.Register<IUserAuthRepository>(userRepository);
}
Here is the request URL:
POST https://vh.azurewebsites.net/api/auth?format=json
And here are the headers that went with the HTTP Request:
Host: vh.azurewebsites.net
Authorization: Basic dXN2dONjb3R0OnBhc3N3b3Jk
Accept-Encoding: gzip, deflate
Accept: application/json
Cookie: ss-opt=perm
Accept-Language: en;q=1, fr;q=0.9, de;q=0.8, ja;q=0.7, nl;q=0.6, it;q=0.5
Content-Length: 0
Connection: keep-alive
Proxy-Connection: keep-alive
User-Agent: mHealth/QA (iPad; iOS 7.1.1; Scale/1.00)

The auth provider was changed so that the implicit default /auth route can now be used to tell whether a user is authenticated: it returns the Session Info if the user is already authenticated, or a 401 Unauthorized if they're not.
You now need to call the explicit /auth/{provider} auth provider route for the provider you wish to authenticate against, which for Basic Auth is /auth/basic.
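For example, the authenticate request shown above would change to target the basic provider route explicitly; the host, headers and format stay the same, only the path changes:
POST https://vh.azurewebsites.net/api/auth/basic?format=json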

Related

xero api fails with unauthorized (401 or 403) after calling auth, refresh and gettenants when calling getinvoices

I'm a rank noob at this, so excuse my ignorance. I've got an MVC web application that can log in, get the access and refresh tokens, and get the tenant list OK. I can even get it to refresh the refresh token. No problems.
When I try to call the GetInvoices endpoint, either directly or via the SDK, I get a 403 (SDK) or a 401 from the direct API call.
From the latest run with a direct call I get this response:
{StatusCode: 401, ReasonPhrase: 'Unauthorized', Version: 1.1, Content:
System.Net.Http.HttpConnectionResponseContent, Headers:
{
Server: nginx
Strict-Transport-Security: max-age=31536000
WWW-Authenticate: OAuth Realm="api.xero.com"
Cache-Control: no-store, no-cache, max-age=0
Pragma: no-cache
Date: Wed, 21 Jul 2021 11:19:56 GMT
Connection: close
X-Client-TLS-ver: tls1.2
Content-Type: text/html; charset=utf-8
Content-Length: 95
Expires: Wed, 21 Jul 2021 11:19:56 GMT
}, Trailing Headers:
{
}}
I know that the access token and tenant id used in the GetInvoices step are correct because I checked them against the values pulled in from the auth steps character by character.
The app is being run in Visual Studio 2019, using the self-signed development SSL certificate.
Why is it rejecting the request?
My controllers have the following:
private static readonly string Scopes = "openid offline_access profile email accounting.transactions accounting.contacts accounting.attachments";
string[] tenant = (string[])TempData.Peek("tenant");
var client = new HttpClient();
var formContent = new FormUrlEncodedContent(new[]
{
new KeyValuePair<string, string>("summaryOnly", "true"),
});
client.DefaultRequestHeaders.Add("Authorization", (string)TempData.Peek("accessToken"));
client.DefaultRequestHeaders.Add("Xero-Tenant-Id", tenant[0]);
client.DefaultRequestHeaders.Add("Accept", "application/json");
var response = await client.PostAsync("https://api.xero.com/api.xro/2.0/Invoices", formContent);
The SDK should handle this for you in the helper methods for the client and OAuth flow, but I've included what looks like it is missing from the raw API call below.
Core API call - it looks like you need to prefix the token with the Bearer string:
Authorization: "Bearer " + access_token
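In raw HTTP, the request headers would then look roughly like this (the token and tenant id values are placeholders):
Authorization: Bearer <access_token>
Xero-Tenant-Id: <tenant id>
Accept: application/json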
If you want to use the SDK, note that there is a sub NuGet package with OAuth helpers that will help you obtain an access token, which you then pass to the core API calls.
https://github.com/XeroAPI/Xero-NetStandard/tree/master/Xero.NetStandard.OAuth2Client
(DOH!) The Tenant returns an Id and a TenantId. I was using the Id.
Thanks to SerKnight and droopsnoot for helping.
I've added code from the OAuth2 client. The help does not mention that you have to get and cast the return type of RequestAccessTokenAsync:
XeroOAuth2Token authToken = (XeroOAuth2Token)await client.RequestAccessTokenAsync(authorisationCode);
Also, to check the state on return, you must set it in xconfig. Mine reads:
XeroConfiguration xconfig = new()
{
    ClientId = Global.XeroClientID,
    ClientSecret = Global.XeroClientSecret,
    CallbackUri = new Uri(Global.RedirectURI),
    Scope = Global.XeroScopes,
    State = Global.XeroState
};
var client = new XeroClient(xconfig);
return Redirect(client.BuildLoginUri());

How to fix SLO with passport-saml that works on first logout but not on subsequent logouts

I have to connect an application to my company's ADFS server. I am using passport-saml for SSO and SLO. SSO works, and SLO works on the first logout only. I am trying to make SLO work every time a user logs out.
I have been searching high and low for a solution to this problem, but it evades me. Here are the details:
I clear the cookies in the browser to start with a clean slate.
I log in to my application, which redirects to ADFS' login page
I enter user credentials and then ADFS redirects back to my app's homepage
I log out of my app and a request is sent to the ADFS server, killing my session locally and on ADFS; I am then redirected back to my app's homepage
I log in again and this works as intended
I log out, but this time I am sent to my ADFS server's logout page.
Further inspection shows that ADFS is not clearing its cookies, so the ADFS session stays live.
I have used Firefox's SAML viewer plugin to watch what is happening and here are my findings:
On a successful logout:
HTTP:
GET https://myadfs.org/adfs/ls/?wa=wsignout1.0 HTTP/1.1
Host: myadfs.org
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14; rv:66.0) Gecko/20100101 Firefox/66.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Referer: https://example.com/dashboard/data
Connection: keep-alive
Cookie: MSISAuth=AAEAAMVBaN7qo03wm/4jDH9e/tZ6ih6HN++2S7c7c0aXHK1RYIZ5++4Y7pf3g4v+OdRUzcJgOROfZkXx0tSEeCOfJFMluodJiSYsESiJnidVcR7Os/iHkNqIp88qGG7UZj+l8NYyvsO/7soTyQGkbMqoI0Z+0z+xXz2CZgOxsqWcjJ3FmTR32bsMR8Lra77XI2KyKycFiNYdYJ2dSKC7yBdxBRKHB7LAs4DOJKAtOt//IWspe9zPbju+x6chgP0dKToyfqX6m4EwlQnbHG4hmCImtXrEDytx1rbuLiBC7N56Y9WmGBTht5vgYvVEoA2cRqBbNYK+HoonL6+oBIJdba6+XZ2lBQsO/yJowvaHxPM8wgwLBknSt39RswaSdGjrI18CcgABAAB/eeLBPuQ9dk6ItCeTem38XttX/PQPLi52Ts+ZQGYHxs4VsO1EMe7EgMGYThPGlMCDcmS9ouXOSh6yW/LiL1jTuhc2/jhq3X0jWY+XPOSXtp81mineHeNv8SWsFjggzh5AymLtPPrPUYT6ihj9fcbJymqatsZMI5B5h0gxS2LaUUWjJyRxpMIyQXEpLSx1mxU5psQrj5/nGpOiq98uy8HE4kJp+Ey9uugSZQXhn9NwY+EqqmWxf6LDrCaeMLFDIX6mlgqu2eTLrUA9gNIJ4kSOC/5Rtw4JQVJpSeQuMom6kCHFEvZo/57BIhGkgWR8vNNCguHzZeB+as0xxfxmmb9SgAMAAMVFqaMXn0uG8+IGJIfxdIIoJ7EsLqV7so7WnFT/4OxfLzsXlO2flq0vcEbasLuLoqhGFaOuy1dkq/ft9se6Pv6rQfH7Esk/aMey/cKObBUPkcZAUFtQxXD7MSLScsiVnq3hHjrpZzEnMTToVkA9Zjv3i72Wv20tdE658+7O1olibavPPIT7Z5syoQNa1rjOAaXcPlM5hbbjXm7BiXx37ZEnvxwpY1Mf4Yocvgd9kMoApciDB2csbTf4GEic7MKeAI2G5KpwArY7g+zt4BJud+F/xnyuwVPpwPVEiNbHQnAogh5NoMDwRx+macTdkHku4AdNvruS/4L/aUHcEhPlhu3j/7r9kP1EnRso12NP1AWipsGlmpdAjoIXfK0+NBqJnDq0KwSEcvJ38OI6Z1FVkRWySi8br8pjtcytFhdh5RTkpD8FVQZ/RnGC1XE4q4IJhxMBlE1Kd8PNh3p85qpoX6r2I36a3knwK2dkm7pb0XNVwhxhC5DGpaB2iNo86CGi+BX4rICBGkNgyrOW/aWKpIhLu0bo1IDVQJw7MORdROJJk/o81E15HuC2g4r3ch+IvZOXKfAenGYM2mYrgnSRHLD0p7KsDN0vuU3IdLXAL5/D5ezr3WQFDFXPpRJyQ+qfx8kyUCe/vtvEVaNezHzOKosQsNGwSvp+lHrEGA9LLYM8RkU/Vwshgkeq2H8MoyuDRaxgOoudNGOmvwNfMp9BoOsz8OCDA5R2BB+JXzsEkSpNYebJK+VWm5wOcYnJ2j9y1OKjRU1ICRtsSPG5kLWmYUt8hHsswzrj4UAxpks+Dn2S09YzeOudC5ss5hmTM/UeVG3r3kJ9+Ad7716V9g7016u+XGhfSWty8EPxVAg0qV9wwAIk+FliWFdF1OLY1RODcsS3swqYfMrBWWdULVNl5d36ycFGucaP893o4Q/im7tx2+588lfvPbZO+DkP40MHP9Hwe++ra6kDiQx5si4M16zYIMmxa4nq6XVcr2hFlqbsLQjhIqkiFOCkt9LNRdKNZlghQkspUH44qLBq4sTHK0iD13FFmBs5rEE1CWa89oCELhea/Z9hPEtjPpC3Q52cAXBgbOJCTr6OYFYfQKbATqHdTU09/nJOafMK5ID1pf7pmBL+ZTH7Kl64lxhyO/9F84t47TctQhhFqxgsIxmv+ZVHajanNl4E0gXqJ0ULsY2h; SamlSession=aHR0cHMlM2ElMmYlMmZmcGNkcmRldi5tb2ZmaXR0Lm9yZyZGYWxzZSZDdWtyYXNTRCYmJiYmXzFkZjY4M2RhLTM4NTktNDVjNS04ODNkLTA3NmRiYTdiMjk3Yg==; MSISAuthenticated=NC8xNi8yMDE5IDExOjI2OjI4IEFN; MSISLoopDetectionCookie=MjAxOS0wNC0xNjoxMToyNjoyOFpcMQ==
Upgrade-Insecure-Requests: 1
HTTP/1.1 302 Found
Content-Length: 0
Content-Type: text/html; charset=utf-8
Location: https://example.com:443/login?SAMLRequest=lZLfa4MwEMf%2fFcl71KjxR7BCqS9C18I69rCXEjXpZJq4XCz982crY6yMwh7vuO9973N3OfChH9lWn%2fRkn8XnJMA6VblCR%2bpzQqWUOKWE4igMUlz7nGKexHUYJdSnKUXOqzDQabVCgesjpwKYRKXAcmXnlE8y7EeYxC%2bEsCBmYeamCX1DTjm7dIrbm%2fLd2hGY58mxaU0rzu6gpeysdbU5eb0%2bdQo5G61AXHtORjHNoQOm%2bCCA2YYd1k9bNtuzZilik4JRNJ3sRIucnbZ7tTdraYW5HykkPyNdhl4Bu23jsctotNWN7lGR33DNIn0s4gDCXHFRccWdac04Auh7XN5K8ObSc9cI8KyZwObeYlPku7ltVf7TbjN9GA6HMvcWeZEvFz8IuB6uUq24FEfSyjgNW47DlGY4og3F6RxjP4nbmid1kCV17v2h%2fE7%2beqDiCw%3d%3d&Signature=pT%2fSUpslARJlvOCah5VzZk4stZLIREyHmUFOO4siHUbkL5eJG4QsfYj9Pq%2bwxnOaPaevYkmiXq0rft3drTzJHspns9UbucyYQvEaSAZVmRTTyfPC3Z0EgVGSvtr0JL3nuDPsq2IfbToseuQQtJFsA%2b94D8KtaLjtUJxiMcQMHyg2yR00Ac3NGt9AsRg1X73X%2frt0XZDN9bSt4R8t%2bt2Yl2UsZsL4GHTGk7RbN3AUrYHsLtKeuN07umXqX3otVtHo%2f9tx2w2h1glYycYbFCk%2bWjox8Mej%2fiLLkpAhw9EXlhiTGrEJ2%2bcYvnQxGokOsz2vXEOoc3%2fhle27LuTPFMN9yw%3d%3d&SigAlg=http%3a%2f%2fwww.w3.org%2f2001%2f04%2fxmldsig-more%23rsa-sha256
Server: Microsoft-HTTPAPI/2.0
P3P: ADFS doesn't have P3P policy, please contact your site's admin for more details
Set-Cookie: SamlSession=; expires=Mon, 15 Apr 2019 11:26:39 GMT; path=/adfs
SamlLogout=aHR0cCUzYSUyZiUyZnJwcHNzb2Rldi5tb2ZmaXR0Lm9yZyUyZmFkZnMlMmZzZXJ2aWNlcyUyZnRydXN0Pz8/aHR0cHMlM2ElMmYlMmZmcGNkcmRldi5tb2ZmaXR0Lm9yZyZGYWxzZSZDdWtyYXNTRCYmJiYmXzFkZjY4M2RhLTM4NTktNDVjNS04ODNkLTA3NmRiYTdiMjk3Yj9fNTBhMTVmZmYtODUxNS00MzI4LWIwYTUtYTc2YjM0NzUwNTg1P3VybiUzYW9hc2lzJTNhbmFtZXMlM2F0YyUzYVNBTUwlM2EyLjAlM2FzdGF0dXMlM2FTdWNjZXNz; path=/adfs; HttpOnly; Secure
MSISAuthenticated=; expires=Mon, 15 Apr 2019 11:26:39 GMT; path=/adfs
MSISAuth=; expires=Mon, 15 Apr 2019 11:26:39 GMT; path=/adfs
ReturnUrl=aHR0cHM6Ly9ycHBzc29kZXYubW9mZml0dC5vcmc6NDQzL2FkZnMvbHMvP3dhPXdzaWdub3V0MS4w; path=/adfs; HttpOnly; Secure
MSISSignoutProtocol=U2FtbA==; expires=Tue, 16 Apr 2019 11:36:39 GMT; path=/adfs; HttpOnly; Secure
Date: Tue, 16 Apr 2019 11:26:39 GMT
SAML:
<samlp:LogoutRequest ID="_50a15fff-8515-4328-b0a5-a76b34750585"
Version="2.0"
IssueInstant="2019-04-16T11:26:39.875Z"
Destination="https://example.com/login"
Consent="urn:oasis:names:tc:SAML:2.0:consent:unspecified"
NotOnOrAfter="2019-04-16T11:31:39.875Z"
xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol"
>
  <Issuer xmlns="urn:oasis:names:tc:SAML:2.0:assertion">http://myadfs.org/adfs/services/trust</Issuer>
  <NameID xmlns="urn:oasis:names:tc:SAML:2.0:assertion">USERNAME</NameID>
  <samlp:SessionIndex>_1df683da-3859-45c5-883d-076dba7b297b</samlp:SessionIndex>
</samlp:LogoutRequest>
On subsequent, unsuccessful logouts:
HTTP:
GET https://myadfs.org/adfs/ls/?wa=wsignout1.0 HTTP/1.1
Host: myadfs.org
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14; rv:66.0) Gecko/20100101 Firefox/66.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Referer: https://example.com/dashboard/data
Connection: keep-alive
Cookie: MSISLoopDetectionCookie=MjAxOS0wNC0xNjoxMToyODoyNlpcMQ==; SamlLogout=aHR0cCUzYSUyZiUyZnJwcHNzb2Rldi5tb2ZmaXR0Lm9yZyUyZmFkZnMlMmZzZXJ2aWNlcyUyZnRydXN0Pz8/aHR0cHMlM2ElMmYlMmZmcGNkcmRldi5tb2ZmaXR0Lm9yZyZGYWxzZSZDdWtyYXNTRCYmJiYmXzFkZjY4M2RhLTM4NTktNDVjNS04ODNkLTA3NmRiYTdiMjk3Yj9fNTBhMTVmZmYtODUxNS00MzI4LWIwYTUtYTc2YjM0NzUwNTg1P3VybiUzYW9hc2lzJTNhbmFtZXMlM2F0YyUzYVNBTUwlM2EyLjAlM2FzdGF0dXMlM2FTdWNjZXNz; ReturnUrl=aHR0cHM6Ly9ycHBzc29kZXYubW9mZml0dC5vcmc6NDQzL2FkZnMvbHMvP3dhPXdzaWdub3V0MS4w; MSISSignoutProtocol=U2FtbA==; MSISAuth=AAEAAFOnxdlEvO8Le/Gti39Bx6BFj1cEJ39/A6ogocbLbXlBnq07uT1v+MuAzZs0NqyB1Wmqx3O8oTwPancFPCEFrQbngzsvsWI/oAXmuDih8uBG9MVPfstAu/cFPXL95V2IIUjX6r3Tv08FqipxW/1CHa7QM8XvXU5a516zFsZTaxke+ITD3B+nGPsuQY+oVG47NhtoMHmCrbShjOBd9Wn6Q5FzDqbHlxD/5czDUXixYf8gg+MTNq9W+oT5J7TF6NaBb7o1QojY7c8UoJ4fQONwlMNE17TgGVomqN4N9qVPTShGSaTlM8C+er9SOWQiALfZHvH2sv8N0AIn9qpivuCzw9WlBQsO/yJowvaHxPM8wgwLBknSt39RswaSdGjrI18CcgABAAAAz9AfrV1onudL+YY+0zL4vWeCboTECwksETafeI44/o0n0DEBx8kVGELmmPqSKD216OFB+p4k0K//HTW+YnRiuFpk1dAnN+dmwirgwzohFU1A3lWq0pQcHFyui1xs1UHnzDZokvK+7r859oZP0XZ4pGGTZsjWyc2B32FgwfvpiKYKDsWALpajW9FRDnt1VnGyDSzsN3V6vQHmKIEBZn5wb3+b3DtB9hV/ZssxiE7Xf8V8l+144wE71YH4ETNbcX0VXKNlkL9x5R+EThMlzyNl2tAcGWSk+3xM3lhfTm3+8y5GEP3rtJjLQGZSPKUljPcZM/MU3EX3YRrCkYsAyhgpgAMAAKGsYkEEca74go1dVexUCjdky1zUJMng5a/ZmKCRWTYsPT2DCjR579a0Hr69s8nl36p8EgyqnyXPm/uiFp+LPp1CuCCuXe/QYFoySixCOEcJsnRbikBEAP/Bpj5UUifnqgyO7MHH1GQiXeOlw2llsPu7rdNiEqB4X6Hqhnn6xaasl+5iqvNkZSTi8DSQc/24MRT4VsAcJcO7eqxjQBluWr2cyvdr9pn4GigQ05WaXWfogo3BwPJzLUo+NngvLHfxyn1wDmUYghc+oxS+vJwTadiiSDDzrcTVTuVxw2xj6OVi8DXbyRii5+VTKolRK0qCa/4C4BCzOOGUkooktX/GecV6eNuk8xOdLsiybY9Ah5Z2WVgraDntw/w/pP/ij4v0jDLvDQjU+BIfGOpeV1jcG9VDObir5GYGfOm59DtlRpoy/kpjiDLWI8EE75DEFlhomeae0v4xBQ6XqgVd5lEcA2DTm/3Ophg31FA2M5J65yE4t7W7inIC4XjMWFOu3GCMse7ERYyFbq59vf+iSs6eyev7wXidvAekALmq6Gk2Ths2JR1TbV27E2+kgGhmvlgiShx67E9s2wrBfPKvV7+IMS9Xe1YPKpZAlfCwnkbQNonqAMQH5LsHq1K7DWrNTcon10TiOtlMbzin8FtNphcnChHYmBbDxpqrf5xwwYXbyznQnMfeDnjN7aPo909gwhfUGNltLTOZ81m6k9c3Z0C8ugvL61bbw3Ku42OZiOnoVcEYjf50bMWZQl/hUMlRp+uHVNhK41z6U2O9Ph7S4ZI4wg7z33Z+VCp+08HpMRqrX155atJYVX73mnr3+J4rKvyJvjglb9aA333MUOC7iGMDDNImibvofyhbqK3VO+zqyPYj0R4OvhnA9RlvV10MWDhn5qnVevA5Oo1MQNPGnTLtfRZXpB8oa2bZZMh62XO4a5gZ/ioNsigiDAFKbQnx0wvBTb0uqYSZpfxoA4K2o87swOYB81FTkQNBnNZG171szH89jijOuEAI7hAWdAnM2LjagGZwWpuF2yHbJqQqsGzjvnqbQ6yMTvaEbkooSelFEBeRW2Gg5rGAjj5Pvs+T0ljhVlby6FfFKJ71NDBvn/7PGIglARSZqUZcAuthlhr8pta11WnhsfnyumvLfWvOZHZZjWslKMLBpGEBe1WgcYBUBYUrUeHmCqDRy5Zc4KJXwGrY; SamlSession=aHR0cHMlM2ElMmYlMmZmcGNkcmRldi5tb2ZmaXR0Lm9yZyZGYWxzZSZDdWtyYXNTRCYmJiYmX2NlNDAwODQxLTA2ZDItNDI3Ni05MTRlLWU5N2ExYWRlZmQzZQ==; MSISAuthenticated=NC8xNi8yMDE5IDExOjI4OjI2IEFN
Upgrade-Insecure-Requests: 1
HTTP/1.1 200 OK
Cache-Control: no-cache,no-store
Pragma: no-cache
Content-Length: 8957
Content-Type: text/html; charset=utf-8
Expires: -1
Server: Microsoft-HTTPAPI/2.0
Date: Tue, 16 Apr 2019 11:28:45 GMT
SAML:
NO SAML SENT
You will see that on a successful logout ADFS sets the cookies to clear them while an unsuccessful logout does not. Also, the unsuccessful logout does not send the SAML logout request.
Lastly, when I clear the cookies in the browser, the first login/logout session will work as intended again, and all subsequent logouts will not. I can see the cookies are retained on subsequent logouts as ADFS is not getting the SAML logout request. I just don't understand how this works on the first logout but not the following logouts. I have looked in and out of passport-saml's code but can't seem to find the issue.
Any assistance would be great.
Here is my passport.js setup:
const fs = require('fs');
const passport = require('passport');
const SamlStrategy = require('passport-saml').Strategy;
require('dotenv').config();
passport.serializeUser((user, done) => {
  done(null, user);
});

passport.deserializeUser((user, done) => {
  done(null, user);
});

passport.use(new SamlStrategy({
    entryPoint: 'https://myadfs.org/adfs/ls',
    issuer: 'https://example.com',
    callbackUrl: process.env.NODESERVERURL + ':' + process.env.PORT + '/authenticate/adfs/postResponse',
    privateCert: fs.readFileSync(__dirname + '/private/keys/fpcdr.key', 'utf-8'),
    logoutUrl: 'https://myadfs.org/adfs/ls/?wa=wsignout1.0',
    signatureAlgorithm: 'sha256'
  },
  function(profile, done) {
    const username = profile.nameID.toLowerCase();
    const email = profile['http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress'].toLowerCase();
    const sessionIndex = profile.sessionIndex;
    return done(null, {
      username,
      email,
      sessionIndex
    });
  })
);
module.exports = passport;
passport callbackUrl:
module.exports.adfsAuthenticate = function(req, res) {
  const email = req.user.email;
  const username = req.user.username;
  if (process.env.UAT === 'true') {
    res.status(302).redirect(LANDING_PAGE_REDIRECT_DEV);
  } else {
    res.status(302).redirect(LANDING_PAGE_REDIRECT_PROD);
  }
};
adfs logout:
module.exports.logout = function(req, res) {
  req.logout();
  req.session.destroy(function (err) {
    if (!err) {
      res.status(200).clearCookie('connect.sid', {path: '/'}).json({status: "Success"});
    } else { alert(err); }
  });
};
I have observed that there is one cookie stored in the browser called "MSISSignoutProtocol"; if this cookie exists, logout does not work as expected on subsequent requests.
To get it working, you might need to implement proper SAML logout, which most IdPs support.
Here is my solution, shared here
I found out why SLO doesn't work, and I found a solution to make it work properly, in my case with Microsoft Azure (SAML/ADFS), using database session storage. I log in from my app, then I log out from Microsoft Azure (not from my app), and it logs out nicely!
It is important to mention that there is a huge difference between the IdP logout request and all the others. Most of the time, passport routes handle requests or redirects initiated by the user's device. That means the device's cookies come along with these requests, which allows passport-saml to successfully retrieve the device's user information.
When the IdP broadcasts a logout request to the relevant service provider(s), it is not always a device request (neither initiated nor redirected by the device). It can be a simple HTTP GET request that contains a SAMLRequest, and this SAMLRequest contains what the IdP uses to identify a SAML user.
A SAML user is identified by the nameID and nameIDFormat profile properties.
Therefore, when a user logs in through an IdP, the app must store both values, nameID and nameIDFormat. You can achieve that first in your SAML strategy authentication handler:
export const samlStrategy = new SamlStrategy({
  ...samlStrategyOptionsHere,
  logoutCallbackUrl: 'https://your.isp.com/saml/logout'
},
(profile: Profile | null | undefined, done: VerifiedCallback) => {
  // do the login process. On successful login, don't forget
  // to pass along the nameID and nameIDFormat profile values!
  done(null, {
    userId: user.id, // in my case, my app just needs to store the user id
    nameID: profile?.nameID,
    nameIDFormat: profile?.nameIDFormat
  })
});
Then store nameID and nameIDFormat in the passport user's session (within the serialization process):
interface SessionUserData {
  userId: number,
  nameID?: string,
  nameIDFormat?: string,
}

passport.serializeUser<SessionUserData>((user: any, done) => {
  // do serialization stuff
  done(null, {
    userId: user.id,
    nameID: user?.nameID,
    nameIDFormat: user?.nameIDFormat
  })
});
SLO requests can then be handled properly by first retrieving the passport sessions that hold the specified SAML identification and destroying them manually, like this:
interface SAMLLogoutExpectedResponse {
  profile?: Profile | null | undefined;
  loggedOut?: boolean | undefined;
}

authRouter.get('/saml/logout', async (req, res) => {
  // req.user may not exist here, since this is a single HTTP GET request that
  // contains a SAMLRequest initiated by the IDP, not a user device redirection.
  // Use passport-saml to validate this request.
  const response: SAMLLogoutExpectedResponse = await (samlStrategy._saml as SAML).validateRedirectAsync(req.query, url.parse(req.url).query)
  // check the « SAML USER » profile
  if (response?.profile &&
      response.profile?.nameID &&
      response.profile?.nameIDFormat &&
      response.loggedOut === true) {
    // get the related session(s), if any exist
    const sessionsToDestroy: string[] = ((await knex.from('sessions').where(true).select('*')) ?? []).reduce((acc: any[], session: any) => {
      const userData: any = JSON.parse(session?.data ?? '{}')?.passport?.user
      if (userData &&
          userData.nameID &&
          userData.nameIDFormat &&
          userData.nameID == response?.profile?.nameID &&
          userData.nameIDFormat == response?.profile?.nameIDFormat
      ) {
        acc.push(session.session_id)
      }
      return acc;
    }, [])
    if (sessionsToDestroy.length) {
      // destroy the database sessions --> which invalidates any corresponding cookie on any device (if one still exists)
      await knex.from('sessions').whereIn('session_id', sessionsToDestroy).delete();
    }
  }
  // if this is an IDP request, everything ends here
  // if the user initiated the logout request, the IDP will redirect the device here; send it somewhere coherent after a successful logout
  res.redirect('/')
});
Voilà. I know this is a specific case and doesn't fit everyone, but the most important point is that an SLO callback must handle the nameID and nameIDFormat values. It should not rely on a device session.

Set cookies for cross origin requests

How to share cookies cross origin? More specifically, how to use the Set-Cookie header in combination with the header Access-Control-Allow-Origin?
Here's an explanation of my situation:
I am attempting to set a cookie for an API that is running on localhost:4000 in a web app that is hosted on localhost:3000.
It seems I'm receiving the right response headers in the browser, but unfortunately they have no effect. These are the response headers:
HTTP/1.1 200 OK
Access-Control-Allow-Origin: http://localhost:3000
Vary: Origin, Accept-Encoding
Set-Cookie: token=0d522ba17e130d6d19eb9c25b7ac58387b798639f81ffe75bd449afbc3cc715d6b038e426adeac3316f0511dc7fae3f7; Max-Age=86400; Domain=localhost:4000; Path=/; Expires=Tue, 19 Sep 2017 21:11:36 GMT; HttpOnly
Content-Type: application/json; charset=utf-8
Content-Length: 180
ETag: W/"b4-VNrmF4xNeHGeLrGehNZTQNwAaUQ"
Date: Mon, 18 Sep 2017 21:11:36 GMT
Connection: keep-alive
Furthermore, I can see the cookie under Response Cookies when I inspect the traffic using the Network tab of Chrome's developer tools. Yet, I can't see a cookie being set in the Application tab under Storage/Cookies. I don't see any CORS errors, so I assume I'm missing something else.
Any suggestions?
Update I:
I'm using the request module in a React-Redux app to issue a request to a /signin endpoint on the server. For the server I use express.
Express server:
res.cookie('token', 'xxx-xxx-xxx', { maxAge: 86400000, httpOnly: true, domain: 'localhost:3000' })
Request in browser:
request.post({ uri: '/signin', json: { userName: 'userOne', password: '123456'}}, (err, response, body) => {
// doing stuff
})
Update II:
I am now setting request and response headers like crazy, making sure that they are present in both the request and the response. Below is a screenshot. Notice the headers Access-Control-Allow-Credentials, Access-Control-Allow-Headers, Access-Control-Allow-Methods and Access-Control-Allow-Origin. Looking at the issue I found on Axios's GitHub, I'm under the impression that all required headers are now set. Yet, there's still no luck...
Cross site approach
To allow receiving & sending cookies by a CORS request successfully, do the following.
Back-end (server) HTTP header settings:
Set the HTTP header Access-Control-Allow-Credentials value to true.
Make sure the HTTP headers Access-Control-Allow-Origin and Access-Control-Allow-Headers are set. Don't use a wildcard *. When you set the allowed origin, make sure to use the entire origin including the scheme, i.e. http is not the same as https in CORS.
For more info on setting CORS in Express.js, read the docs here (a minimal Express sketch follows below).
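As a minimal sketch of the back-end side using the Express cors middleware (the origin value is a placeholder for your web app's exact origin):
// Express + cors middleware: emits the Access-Control-Allow-Origin and
// Access-Control-Allow-Credentials headers described above.
const cors = require('cors');
app.use(cors({
  origin: 'http://localhost:3000', // full origin incl. scheme, never a wildcard
  credentials: true                // adds Access-Control-Allow-Credentials: true
}));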
Cookie settings:
Cookie settings per Chrome and Firefox update in 2021:
SameSite=None
Secure
When doing SameSite=None, setting Secure is a requirement. See docs on SameSite and on requirement of Secure. Also note that Chrome devtools now have improved filtering and highlighting of problems with cookies in the Network tab and Application tab.
Front-end (client): Set the XMLHttpRequest.withCredentials flag to true. This can be achieved in different ways depending on the request-response library used (a fetch() sketch follows this list):
ES6 fetch() This is the preferred method for HTTP. Use credentials: 'include'.
jQuery 1.5.1 Mentioned for legacy purposes. Use xhrFields: { withCredentials: true }.
axios As an example of a popular NPM library. Use withCredentials: true.
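A minimal sketch of the fetch() option, reusing the /signin call and credentials from the question (assuming the API runs on localhost:4000 as described there):
// Cross-origin sign-in with fetch(): sends cookies and lets the browser
// store the Set-Cookie from the response.
async function signIn() {
  const response = await fetch('http://localhost:4000/signin', {
    method: 'POST',
    credentials: 'include', // the fetch() equivalent of withCredentials: true
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ userName: 'userOne', password: '123456' })
  });
  return response.json();
}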
Proxy approach
Avoid having to do cross site (CORS) stuff altogether. You can achieve this with a proxy. Simply send all traffic to the same top level domain name and route using DNS (subdomain) and/or load balancing. With Nginx this is relatively little effort.
This approach is a perfect marriage with JAMStack. JAMStack dictates API and Webapp code to be completely decoupled by design. More and more users block 3rd party cookies. If API and Webapp can easily be served on the same host, the 3rd party problem (cross site / CORS) dissolves. Read about JAMStack here or here.
Sidenote
It turned out that Chrome won't set the cookie if the domain contains a port. Setting it for localhost (without port) is not a problem. Many thanks to Erwin for this tip!
Note for Chrome browsers released in 2020.
A future release of Chrome will only deliver cookies with cross-site
requests if they are set with SameSite=None and Secure.
So if your backend server does not set SameSite=None, Chrome will use SameSite=Lax by default and will not use this cookie with { withCredentials: true } requests.
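In other words, the Set-Cookie header issued by the backend has to look something like this (the token value is a placeholder):
Set-Cookie: token=xxx-xxx-xxx; Path=/; HttpOnly; SameSite=None; Secure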
More info https://www.chromium.org/updates/same-site.
Firefox and Edge developers also want to release this feature in the future.
Spec found here: https://datatracker.ietf.org/doc/html/draft-west-cookie-incrementalism-01#page-8
In order for the client to be able to read cookies from cross-origin requests, you need two things:
All responses from the server need to have the following in their header:
Access-Control-Allow-Credentials: true
The client needs to send all requests with the withCredentials: true option.
In my implementation with Angular 7 and Spring Boot, I achieved that with the following:
Server-side:
@CrossOrigin(origins = "http://my-cross-origin-url.com", allowCredentials = "true")
@Controller
@RequestMapping(path = "/something")
public class SomethingController {
    ...
}
The origins = "http://my-cross-origin-url.com" part will add Access-Control-Allow-Origin: http://my-cross-origin-url.com to every server's response header
The allowCredentials = "true" part will add Access-Control-Allow-Credentials: true to every server's response header, which is what we need in order for the client to read the cookies
Client-side:
import { HttpInterceptor, HttpXsrfTokenExtractor, HttpRequest, HttpHandler, HttpEvent } from "@angular/common/http";
import { Injectable } from "@angular/core";
import { Observable } from 'rxjs';

@Injectable()
export class CustomHttpInterceptor implements HttpInterceptor {
    constructor(private tokenExtractor: HttpXsrfTokenExtractor) {
    }

    intercept(req: HttpRequest<any>, next: HttpHandler): Observable<HttpEvent<any>> {
        // send request with credential options in order to be able to read cross-origin cookies
        req = req.clone({ withCredentials: true });

        // return XSRF-TOKEN in each request's header (anti-CSRF security)
        const headerName = 'X-XSRF-TOKEN';
        let token = this.tokenExtractor.getToken() as string;
        if (token !== null && !req.headers.has(headerName)) {
            req = req.clone({ headers: req.headers.set(headerName, token) });
        }
        return next.handle(req);
    }
}
With this class you actually inject additional options into all your requests.
The first part, req = req.clone({ withCredentials: true });, is what you need in order to send each request with the withCredentials: true option. In practice this means that an OPTIONS request will be sent first, so that you get your cookies, and the authorization token among them, before sending the actual POST/PUT/DELETE requests, which need this token attached to them (in the header) in order for the server to verify and execute the request.
The second part is the one that specifically handles an anti-CSRF token for all requests: it reads it from the cookie when needed and writes it into the header of every request.
For Express, upgrade your express library to 4.17.1, which is the latest stable version. Then:
In corsOptions, set origin to your localhost URL or your frontend production URL, and set credentials to true, e.g.
const corsOptions = {
origin: config.get("origin"),
credentials: true,
};
I set my origin dynamically using config npm module.
Then, in res.cookie:
For localhost, you do not need to set the sameSite and secure options at all; you can set httpOnly to true for an HTTP-only cookie to prevent XSS attacks, plus other useful options depending on your use case.
For a production environment, you need to set sameSite to "none" for cross-origin requests and secure to true. Remember that sameSite only works with the latest Express version as of now, and the latest Chrome version only sets the cookie over HTTPS, hence the need for the secure option.
Here is how I made mine dynamic:
res
  .cookie("access_token", token, {
    httpOnly: true,
    sameSite: app.get("env") === "development" ? true : "none",
    secure: app.get("env") === "development" ? false : true,
  })
Pim's answer is very helpful. In my case, I have to use:
Expires / Max-Age: "Session"
If it is a DateTime, even if it is not expired, the browser still won't send the cookie to the backend:
Expires / Max-Age: "Thu, 21 May 2020 09:00:34 GMT"
Hope this is helpful for future people who run into the same issue.
Under the latest Chrome standard, if a CORS request is to carry cookies, the cookie must be set with SameSite=None and Secure, and the back-end domain must be served over HTTPS.
frontend
await axios.post(`your api`, data, {
  withCredentials: true,
});
await axios.get(`your api`, {
  withCredentials: true,
});
backend
var corsOptions = {
  origin: 'http://localhost:3000', // frontend url
  credentials: true
};
app.use(cors(corsOptions));
const token = jwt.sign({ _id: user_id }, process.env.JWT_SECRET, { expiresIn: "7d" });
res.cookie("token", token, { httpOnly: true });
hope it will work.
After more than a day of trying all your suggestions and many more, I surrender.
Chrome just does not accept my cross-domain cookies on localhost.
No errors, they are just silently ignored.
I want to have HTTP-only cookies to store a token more safely.
So for localhost a proxy sounds like the best way around this. I haven't really tried that.
Here is what I ended up doing; maybe it helps someone.
Backend (node/express/typescript)
set cookie as you normally would
res.status(200).cookie("token", token, cookieOptions)
make a workaround for localhost
// if origin localhost
response.setHeader("X-Set-Cookie", response.getHeader("set-cookie") ?? "");
Allow x-set-cookie header in cors
app.use(cors({
  //...
  exposedHeaders: [
    "X-Set-Cookie",
    //...
  ]
}));
Frontend (Axios)
On the Axios response
remove the domain= so it's defaulted.
split multiple cookies and store them locally.
// Localhost cookie work around
const xcookies = response.headers?.["x-set-cookie"];
if (xcookies !== undefined) {
  xcookies
    .replace(/\s+Domain=[^=\s;]+;/g, "")
    .split(/,\s+(?=[^=\s]+=[^=\s]+)/)
    .forEach((cookie: string) => {
      document.cookie = cookie.trim();
    });
}
Not ideal, but I can move on with my life again.
In general I think this has just been made too complicated :-(
Update: here is my use case, maybe we can resolve it?
It's a Heroku server with a custom domain.
According to this article that should be okay:
https://devcenter.heroku.com/articles/cookies-and-herokuapp-com
I made an isolated test case but still no joy.
I'm pretty sure I've seen it work in Firefox before, but currently nothing seems to work besides my nasty workaround.
Server Side
app.set("trust proxy", 1);
app.get("/cors-cookie", (request: Request, response: Response) => {
  // http://localhost:3000
  console.log("origin", request.headers?.["origin"]);

  const headers = response.getHeaders();
  Object.keys(headers).forEach(x => {
    response.removeHeader(x);
    console.log("remove header ", x, headers[x]);
  });
  console.log("headers", response.getHeaders());

  const expiryOffset = 1 * 24 * 60 * 60 * 1000; // +1 day
  const cookieOptions: CookieOptions = {
    path: "/",
    httpOnly: true,
    sameSite: "none",
    secure: true,
    domain: "api.xxxx.nl",
    expires: new Date(Date.now() + expiryOffset)
  }

  return response
    .status(200)
    .header("Access-Control-Allow-Credentials", "true")
    .header("Access-Control-Allow-Origin", "http://localhost:3000")
    .header("Access-Control-Allow-Methods", "GET,HEAD,OPTIONS,POST,PUT")
    .header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept")
    .cookie("test-1", "_1_", cookieOptions)
    .cookie("test-2", "_2_", {...cookieOptions, ...{ httpOnly: false }})
    .cookie("test-3", "_3_", {...cookieOptions, ...{ domain: undefined }})
    .cookie("test-4", "_4_", {...cookieOptions, ...{ domain: undefined, httpOnly: false }})
    .cookie("test-5", "_5_", {...cookieOptions, ...{ domain: undefined, sameSite: "lax" }})
    .cookie("test-6", "_6_", {...cookieOptions, ...{ domain: undefined, httpOnly: false, sameSite: "lax" }})
    .cookie("test-7", "_7_", {...cookieOptions, ...{ domain: "localhost" }}) // Invalid domain
    .cookie("test-8", "_8_", {...cookieOptions, ...{ domain: ".localhost" }}) // Invalid domain
    .cookie("test-9", "_9_", {...cookieOptions, ...{ domain: "http://localhost:3000" }}) // Invalid domain
    .json({
      message: "cookie"
    });
});
Client side
const response = await axios("https://api.xxxx.nl/cors-cookie", {
  method: "get",
  withCredentials: true,
  headers: {
    "Accept": "application/json",
    "Content-Type": "application/json",
  }
});
Which yields the following response.
I see the cookies in the Network > request > Cookies tab,
but no cookies under Application > Storage > Cookies, nor in document.cookie.
Pim's answer is very helpful, but here is an edge case I went through.
In my case, even though I had set Access-Control-Allow-Origin to specific origins in the back end, in the front end I received it as *, which is not allowed.
The problem was that some other person had handled the web server setup, and in it there was a config setting the Access-Control-* headers, which was overriding the headers set from my back-end application.
Phew... it took a while to figure out.
So, if there are mismatches between what you set and what you receive, check your web server configs as well.
Hope this helps.
For me, regarding the sameSite property, after enabling CORS I also had to add CookieSameSite = SameSiteMode.None to the CookieAuthenticationOptions in the Startup file:
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    .....
    CookieSameSite = SameSiteMode.None,
    .....
});
This is an answer to "Lode Michels" above regarding the CORS cookie with the Heroku server (and for other cloud providers, like AWS).
The reason your CORS cookie can't be set is that Heroku terminates SSL at the load balancer, so when you try to set the "secure" cookie at the server, it fails since the connection is no longer seen as secure.
You can explicitly specify whether the connection is secure, rather than having the cookie module examine the request:
https://github.com/pillarjs/cookies
With koa, add this:
ctx.cookies.secure = true;
Edit: I can't comment on that answer directly because my reputation is lower than 50.
This code worked for me
In the backend
Set credentials to true in your corsOptions:
const corsOptions = {
credentials: true,
};
Set cookies before sending requests:
res.cookie('token', 'xxx-xxx-xxx', {
  maxAge: 24 * 60 * 60 * 1000,
  httpOnly: true,
  sameSite: "none", // express expects the lowercase sameSite option
  secure: true      // required by browsers when sameSite is "none"
});
In the frontend
Request in browser (using axios):
axios.post('uri/signin',
  JSON.stringify({ username: 'userOne', password: '123456' }),
  { withCredentials: true })
  .then(result => console.log(result?.data))
  .catch(err => console.log(err))

ember-simple-auth oauth2 authorizer issue

I am trying to set up authorization on an Ember App running on a Node.js server.
I am using the oauth2 Authenticator, which is requesting a token from the server. This is working fine. I am able to provide the app with a token, which it saves in the local-storage.
However, when I make subsequent requests, the authorizer is not adding the token to the header. I have initialized the authorizer using the method described in the documentation (http://ember-simple-auth.simplabs.com/ember-simple-auth-oauth2-api-docs.html):
Ember.Application.initializer({
  name: 'authentication',
  initialize: function(container, application) {
    Ember.SimpleAuth.setup(container, application, {
      authorizerFactory: 'authorizer:oauth2-bearer'
    });
  }
});
var App = Ember.Application.create();
And I have added an init method to the Authorizer, to log a message to the server when it is initialized, so I know that it is being loaded. The only thing is, the authorize method of the authorizer is never called.
It feels like I am missing a fundamental concept of the library.
I have a users route which I have protected using the AuthenticatedRouteMixin like so:
App.UsersRoute = Ember.Route.extend(Ember.SimpleAuth.AuthenticatedRouteMixin, {
  model: function() {
    return this.get('store').find('user');
  }
});
Which fetches the data fine, and redirects to /login if no token is in the session, but the request headers do not include the token:
GET /users HTTP/1.1
Host: *****
Connection: keep-alive
Cache-Control: no-cache
Pragma: no-cache
Accept: application/json, text/javascript, */*; q=0.01
Origin: *****
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.116 Safari/537.36
Referer: *****
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Any help you could give me would be greatly appreciated.
Is your REST API perhaps served on a different origin than the one the app is loaded from? Ember.SimpleAuth does not authorize cross-origin requests by default (see here: https://github.com/simplabs/ember-simple-auth#cross-origin-authorization)
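If that's the case, the README linked above describes whitelisting additional origins when setting up the library. Applied to the initializer from the question it would look roughly like the sketch below; the API origin is a placeholder, and the crossOriginWhitelist option name is taken from that README, so double-check it against the version you're using:
Ember.Application.initializer({
  name: 'authentication',
  initialize: function(container, application) {
    Ember.SimpleAuth.setup(container, application, {
      authorizerFactory: 'authorizer:oauth2-bearer',
      // whitelist the API's origin so the authorizer also runs for requests to it
      crossOriginWhitelist: ['http://api.example.com:8080'] // placeholder origin
    });
  }
});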

Metro client hangs when calling WCF webserver with wsHttpBinding

I have generated a webservice client with a local wsdl using Metro 1.2 this way:
./wsimport.sh -extension -verbose -wsdllocation service.wsdl -s src -d target service.wsdl -Xendorsed
The wsdl uses SOAP 1.2 and wsHttpBinding. It's supposed to connect to a WCF server that's using NTLM as authentication method.
I have created an Authenticator to handle the NTLM authentication:
public class NtlmAuthenticator extends Authenticator
{
    private String username = "";
    private String password = "";

    public NtlmAuthenticator(String username, String password) {
        this.username = username;
        this.password = password;
    }

    @Override
    protected PasswordAuthentication getPasswordAuthentication() {
        return new PasswordAuthentication(username, password.toCharArray());
    }
}
Which I set before each webservice method is called:
@WebEndpoint(name = "WSHttpBinding_ICustomerService")
public ICustomerService getWSHttpBindingICustomerService() {
    ICustomerService service =
        super.getPort(new QName("http://xmlns.example.com/services/Customer",
            "WSHttpBinding_ICustomerService"), ICustomerService.class);
    NtlmAuthenticator auth = new NtlmAuthenticator(username, password);
    Authenticator.setDefault(auth);
    return service;
}
If I use the wrong username/password, I get a 401 Unauthorized back, which is well and all, but when I use the correct username/password, the call hangs and I never get a response!
The request looks like this (captured it with netcat, so host is different, and no https):
POST / HTTP/1.1
Content-type: application/soap+xml;charset="utf-8";action="http://xmlns.example.com/services/ICustomerService/GetCustomer"
Password: [password]
Authorization: Basic [auth]
Username: [username]
Accept: application/soap+xml, multipart/related, text/html, image/gif, image/jpeg, *; q=.2, */*; q=.2
User-Agent: JAX-WS RI 2.1.7-b01-
Cache-Control: no-cache
Pragma: no-cache
Host: localhost:5500
Connection: keep-alive
Content-Length: 603
[xml follows]
I have also tried with wget 1.12 (I heard that 1.11 had problems with NTLM), but it too never yields a response, it just waits.
[...]
---request end---
[writing POST file customerRequest.xml ... done]
HTTP request sent, awaiting response...
I've seen that others have gotten this behaviour before, but I have not been able to find out why. Can anyone shed some light on this? JDK 1.6 on linux.
I found that I was missing a line in my generated client code that enables WS-Addressing and passes it to the getPort super method:
WebServiceFeature wsAddressing = new AddressingFeature(true);
ICustomerService service =
    super.getPort(new QName("http://xmlns.example.com/services/Customer",
        "WSHttpBinding_ICustomerService"), ICustomerService.class,
        wsAddressing);
Why Metro didn't generate this is beyond me. The method looked like this in the end:
@WebEndpoint(name = "WSHttpBinding_ICustomerService")
public ICustomerService getWSHttpBindingICustomerService() {
    WebServiceFeature wsAddressing = new AddressingFeature(true);
    ICustomerService service =
        super.getPort(new QName("http://xmlns.example.com/services/Customer",
            "WSHttpBinding_ICustomerService"), ICustomerService.class,
            wsAddressing);
    NtlmAuthenticator auth = new NtlmAuthenticator(username, password);
    Authenticator.setDefault(auth);
    return service;
}
This in turn added a SOAP header to the message:
<S:Header>
<To xmlns="http://www.w3.org/2005/08/addressing">https://services.example.com/CustomerService.svc</To>
<Action xmlns="http://www.w3.org/2005/08/addressing">http://xmlns.example.com/services/ICustomerService/GetCustomer</Action>
<ReplyTo xmlns="http://www.w3.org/2005/08/addressing">
<Address>http://www.w3.org/2005/08/addressing/anonymous</Address>
</ReplyTo>
<MessageID xmlns="http://www.w3.org/2005/08/addressing">uuid:d33c2888-abfa-474d-8729-95d2bcd17a96</MessageID>
</S:Header>