How can I authenticate a WebSocket connection?

What is a common approach to authenticating a user session for a WebSocket connection?
As I understand it, a WebSocket message contains only data and no headers, so the authorization cookie is not available to the server backend. How should the application distinguish messages from different clients?

Which WebSocket server are you using?
If your web server and WebSocket server are the same, you could send the session ID over the WebSocket and force-disconnect any client that does not send a valid session ID in its first message.
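For example, here is a minimal sketch of that first-message check, assuming a Node.js server using the ws package; mySessionId and isValidSession() are placeholders for however your application exposes the session ID and looks it up:
// Client: send the session ID as the very first message after connecting.
var socket = new WebSocket('wss://example.com/whiteboard');
socket.onopen = function () {
  // mySessionId: however your app makes the session ID available to the page
  socket.send(JSON.stringify({ type: 'auth', sessionId: mySessionId }));
};

// Server (Node.js, ws package): drop any client whose first message is not a valid session ID.
const { WebSocketServer } = require('ws');
const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (ws) => {
  let authenticated = false;
  // Give the client a short window to authenticate before disconnecting it.
  const timer = setTimeout(() => { if (!authenticated) ws.terminate(); }, 5000);

  ws.on('message', (data) => {
    if (!authenticated) {
      let msg;
      try { msg = JSON.parse(data); } catch (e) { ws.terminate(); return; }
      // isValidSession() is a placeholder for your own session-store lookup.
      if (msg.type === 'auth' && isValidSession(msg.sessionId)) {
        authenticated = true;
        clearTimeout(timer);
      } else {
        ws.terminate(); // invalid or missing session ID: force-disconnect
      }
      return;
    }
    // ...handle application messages from an authenticated client...
  });
});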
If your WebSocket server parses the HTTP headers sent in the HTTP upgrade request properly, it can also read any cookies. This is what a request from my Firefox (version 35) looks like:
GET /whiteboard HTTP/1.1
Host: *:*
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:35.0) Gecko/20100101 Firefox/35.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: de,en-US;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate
DNT: 1
Sec-WebSocket-Version: 13
Origin: *
Sec-WebSocket-Protocol: whiteboard
Sec-WebSocket-Key: iGPS0jjbNiGAYrIyC/YCzw==
Cookie: PHPSESSID=9fli75enklqmv1a30hbdmg1461
Connection: keep-alive, Upgrade
Pragma: no-cache
Cache-Control: no-cache
Upgrade: websocket
As you can see, the PHPSESSID cookie is transmitted fine. Check the documentation of your WebSocket server to see whether it parses the HTTP headers. Remember that cookies will not be sent if the WebSocket's domain differs from the web server's domain.
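If your server does expose those headers, the session check can happen during the handshake itself. Below is a sketch assuming a Node.js server with the ws package; the cookie parsing is deliberately naive and lookupSession() is a placeholder for your own session-store check:
// Validate the session cookie from the HTTP upgrade request and reject the
// handshake if it is missing or invalid.
const http = require('http');
const { WebSocketServer } = require('ws');

const server = http.createServer();
const wss = new WebSocketServer({ noServer: true });

server.on('upgrade', (req, socket, head) => {
  // Naive parse of the Cookie header, e.g. "PHPSESSID=9fli75enklqmv1a30hbdmg1461".
  const cookies = {};
  (req.headers.cookie || '').split(';').forEach((pair) => {
    const idx = pair.indexOf('=');
    if (idx > 0) cookies[pair.slice(0, idx).trim()] = pair.slice(idx + 1).trim();
  });

  // lookupSession() is a placeholder for your own session-store check.
  const session = lookupSession(cookies.PHPSESSID);
  if (!session) {
    socket.write('HTTP/1.1 401 Unauthorized\r\n\r\n'); // refuse the upgrade
    socket.destroy();
    return;
  }

  wss.handleUpgrade(req, socket, head, (ws) => {
    wss.emit('connection', ws, req, session); // the session is now tied to this socket
  });
});

server.listen(8080);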

Related

How to store a JWT refreshToken cookie response

I'm trying to authenticate a user with JWT using GraphQL. Once I log the user in, I receive the token as a JSON response and an HttpOnly cookie storing the refresh token. (The server side is using Saleor Core.)
From the Saleor documentation and some other blog posts, I assume that this response cookie should now be stored in the browser, and whenever I need to refresh a token the refreshToken cookie is used to authenticate my request. However, when I switch to the "Application" tab in my dev tools, it's just empty.
What is the normal behaviour of the browser after receiving a cookie response? Do I need some extra code to somehow "save" that response cookie?
I did not really find anyone else having this problem, so I think the mistake must be somewhere else.
UPDATE
I read somewhere that the issue might be a missing "Secure" flag, which resulted from the server running in debug mode. I turned debug mode off, but the cookie is still not being set.
Response Headers:
HTTP/1.1 200 OK
Connection: keep-alive
Date: Thu, 23 Sep 2021 13:32:33 GMT
Server: uvicorn
Content-Type: application/json
Access-Control-Allow-Origin: https://rewhite-86006--beta-duoa0dwg.web.app
Access-Control-Allow-Methods: POST, OPTIONS
Access-Control-Allow-Headers: Origin, Content-Type, Accept, Authorization, Authorization-Bearer
Access-Control-Allow-Credentials: true
Content-Length: 912
X-Content-Type-Options: nosniff
Referrer-Policy: same-origin
Set-Cookie: refreshToken=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpYXQiOjE2MzI0MDM5NTQsIm93bmVyIjoic2FsZW9yIiwiZXhwIjoxNjM0OTk1OTU0LCJ0b2tlbiI6Ijd2b0VmMm1DNlZZSyIsImVtYWlsIjoiSnVsaWFuLkZpbmtlQGdtYWlsLmNvbSIsInR5cGUiOiJyZWZyZXNoIiwidXNlcl9pZCI6IlZYTmxjam8zTmc9PSIsImlzX3N0YWZmIjpmYWxzZSwiY3NyZlRva2VuIjoiWm55ek9xVG9rOU9GYXlDZXY0cjFxMUxnaktnTXRRR0VNUVJEalR1eTJDZ1IyOW1GSVBxQ1B1T1hZcTFQNk92cyJ9.Cl6PmoLkO9Hlh36tDOuyNLQCib4FVBwn32hhnmd7Q4E; expires=Sat, 23 Oct 2021 13:32:34 GMT; HttpOnly; Max-Age=2592000; Path=/; Secure
Via: 1.1 vegur
Request Headers:
POST /graphql/ HTTP/1.1
Host: rewhite-saleor-engine.herokuapp.com
Connection: keep-alive
Content-Length: 318
Pragma: no-cache
Cache-Control: no-cache
sec-ch-ua: "Google Chrome";v="93", " Not;A Brand";v="99", "Chromium";v="93"
sec-ch-ua-mobile: ?0
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/93.0.4577.82 Safari/537.36
sec-ch-ua-platform: "macOS"
content-type: application/json
Accept: */*
Origin: https://rewhite-86006--beta-duoa0dwg.web.app
Sec-Fetch-Site: cross-site
Sec-Fetch-Mode: cors
Sec-Fetch-Dest: empty
Referer: https://rewhite-86006--beta-duoa0dwg.web.app/
Accept-Encoding: gzip, deflate, br
Accept-Language: de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7
Thanks for your help!
The domain of your cookie seems to be different from the origin of your request. You're making a cross-site request and receiving a Set-Cookie response from a server on a different domain.
We normally run into this issue when running the backend and frontend on different domains (e.g. localhost:3000 and localhost:8080).
Solution:
Recent Chrome versions (from 2020) will only set cookies received from cross-site requests if the cookie has the SameSite=None and Secure attributes set. With Secure set, a cookie will only be sent to the server over HTTPS (you need to implement SSL).
As of now, your Set-Cookie header has Secure but no SameSite attribute, and SameSite defaults to Lax, not None. You need to set it explicitly.
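A sketch of what this first option needs on both ends; the cookie attributes are set by the Saleor server, not by this snippet, and the GraphQL body is elided:
// Target shape of the login response header for a cross-site cookie:
//   Set-Cookie: refreshToken=...; Path=/; HttpOnly; Secure; SameSite=None
//
// The browser also only stores and sends cross-site cookies when the request
// opts into credentials (your Access-Control-Allow-Credentials: true already allows this):
fetch('https://rewhite-saleor-engine.herokuapp.com/graphql/', {
  method: 'POST',
  credentials: 'include', // required for cross-site cookies
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: '...' }), // your login / tokenRefresh mutation
});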
OR
You need to implement a proxy so that you request your web app at https://rewhite-86006--beta-duoa0dwg.web.app and your web app proxies the request to your Saleor engine domain, rewhite-saleor-engine.herokuapp.com. How you do that depends on the framework you're using to serve your web app. You haven't mentioned it in your question, but I notice you've tagged it vue.js, so I'll assume you're using the Vue CLI dev server.
It's very simple to set up a proxy with Vue CLI. Just look for the vue.config.js file in your root directory. If it's not there, create it and paste the code below:
module.exports = {
  devServer: {
    proxy: {
      '^/graphql': {
        target: 'https://rewhite-saleor-engine.herokuapp.com',
        changeOrigin: true,
        logLevel: 'debug',
      },
    },
  },
}
Now, instead of fetching the refreshToken from rewhite-saleor-engine.herokuapp.com/graphql, you should send the request to your web app at https://rewhite-86006--beta-duoa0dwg.web.app/graphql, and your web app's local server will forward the request to your Saleor backend on Heroku. To your browser it will appear as though the response came from the web app itself, so it won't be a cross-site request anymore.
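For illustration, the request from the app would then target its own origin with a relative path, and the proxy handles the rest (GraphQL payload elided):
// Same-origin request; the Vue dev server forwards it to the Heroku backend,
// so the Set-Cookie response is no longer cross-site.
fetch('/graphql', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: '...' }),
});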

Configuring Burp Suite to intercept data between web browser and proxy server

I need to configure Burp Suite to intercept data between the web browser and a proxy server. The proxy server requires basic authentication (username & password) when connecting for the first time in each session. I have tried the 'Redirect to host' option in Burp Suite (entering the proxy server address and port in the fields):
Proxy >> Options >> Proxy Listeners >> Request Handling
But I can't see an option to supply the authentication that is required when connecting to this proxy server.
While accessing google.com, the request headers are:
GET / HTTP/1.1
Host: google.com
User-Agent: Mozilla/5.0 (X11; Linux i686) KHTML/4.13.3 (like Gecko) Konqueror/4.13
Accept: text/html, text/*;q=0.9, image/jpeg;q=0.9, image/png;q=0.9, image/*;q=0.9, */*;q=0.8
Accept-Encoding: gzip, deflate, x-gzip, x-deflate
Accept-Charset: utf-8,*;q=0.5
Accept-Language: en-US,en;q=0.9
Connection: close
And the response is:
HTTP/1.1 400 Bad Request
Server: squid/3.3.8
Mime-Version: 1.0
Date: Thu, 10 Mar 2016 15:14:12 GMT
Content-Type: text/html
Content-Length: 3163
X-Squid-Error: ERR_INVALID_URL 0
Vary: Accept-Language
Content-Language: en
X-Cache: MISS from proxy.abc.in
X-Cache-Lookup: NONE from proxy.abc.in:3343
Via: 1.1 proxy.abc.in (squid/3.3.8)
Connection: close
You were on the right track, just in the wrong place. You need to set up an upstream proxy at:
Options >> Connections >> Upstream proxy
There you can also set up the authentication:
Options >> Connections >> Platform authentication
Here you can create different auth configurations, which will be applied if the server requests authentication.

First request from BizTalk WCF-Custom adapter not pre-authenticated

The first request (or batch of requests) I send from a WCF-Custom adapter using the wsHttpBinding (I also tried basicHttp) does not include the Authorization header.
Request 1 Headers
POST https://axis2service.com/HttpSoap12Endpoint/ HTTP/1.1
Content-Type: application/soap+xml; charset=utf-8
Host: axis2service.com
Content-Length: 556
Expect: 100-continue
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
The service returns a 401, to which I respond with the request below.
Request 2 Headers
POST https://axis2service.com/HttpSoap12Endpoint/ HTTP/1.1
Content-Type: application/soap+xml; charset=utf-8
Authorization: Basic XXXXXX
Host: axis2service.com
Content-Length: 556
Expect: 100-continue
Accept-Encoding: gzip, deflate
The Axis2-on-Apache service then responds with a 504 and Connection: close.
This seems to be a known issue; how can I make request 1 include the Authorization header every time?
Note: The request 1 headers are only sent on the first request after a Host Instance restart. All subsequent requests from the adapter use the request 2 headers, thereby bypassing the handshake stage.

POST data not sent in IE 11 using the WebBrowser control

I have the following JavaScript code being executed inside of an onclick event handler on my website.
var newRequest = new window.XMLHttpRequest();
newRequest.open('POST', '/index.cfm', true);
newRequest.send('q');
If I use IE 11 and open my web page, which is hosted on a testing server, I can see in Fiddler that the request is sent as expected, with a Content-Length of 1 and 'q' in the POST data. However, if I open my application that hosts the WebBrowser control, navigate to the same website on my test server, and have it execute the above code, I can see that the request is made, but the Content-Length header is 0 and 'q' is not sent with the request. Here is the failing request as it appears in Fiddler:
POST http://test.mycompany.com/index.cfm HTTP/1.1
Accept: */*
Referer: http://test.mycompany.com/Curtis/BrowserTests/BrowserEventTests.html
Accept-Language: en-US
Content-Type: text/plain;charset=UTF-8
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Host: test.mycompany.com
Content-Length: 0
Connection: Keep-Alive
Pragma: no-cache
If I then set the HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION registry value for my executable to 9000, it works and sends the POST data as expected.
Here is the correct request as reported by Fiddler.
POST http://test.mycompany.com/index.cfm HTTP/1.1
Accept: */*
Referer: http://test.mycompany.com/Curtis/BrowserTests/BrowserEventTests.html
Accept-Language: en-us
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Host: test.mycompany.com
Content-Length: 1
Connection: Keep-Alive
Pragma: no-cache
q
If I change that value back to 10000 or 11000, it does not work. Does anyone have any idea why the POST data is not sent properly when using the WebBrowser control? I have reset my IE settings to factory defaults with no change in behavior.
Update: If I change the JavaScript to the following instead, it works with the emulation mode set to 10000.
var newRequest = new ActiveXObject("Msxml2.XMLHTTP");
newRequest.open('POST', '/index.cfm', true);
newRequest.send('q');
Is this just a bug that needs to be reported?
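Until that is confirmed, a hypothetical workaround implied by the update above is to prefer the MSXML object when the page runs inside the embedded control; this is only a sketch, not a confirmed fix:
// Hypothetical fallback: try the MSXML ActiveX object first (it sends the body
// correctly inside the WebBrowser control, per the update above), and fall back
// to the native XMLHttpRequest everywhere else.
function createXhr() {
  try {
    return new ActiveXObject('Msxml2.XMLHTTP');
  } catch (e) {
    return new window.XMLHttpRequest(); // ActiveX unavailable (non-IE or blocked)
  }
}

var newRequest = createXhr();
newRequest.open('POST', '/index.cfm', true);
newRequest.send('q');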

NSURLConnection and Authenticating to webservices behind ssl?

I'm currently trying to connect to a web service hosted at https://xxx.xxx.xx/myapp
It has anonymous access and SSL enabled for testing purposes at the moment.
While trying to connect from the 3G network, I get status 403: Access denied. You do not have permission to view this directory or page using the credentials that you supplied.
I get these headers while trying to connect to the web service locally:
Headers
Request URL:https://xxx.xxx.xx/myapp
Request Method:GET
Status Code:200 OK
Request Headers
GET /myapp/ HTTP/1.1
Host: xxx.xxx.xxx
Connection: keep-alive
Authorization: Basic amViZTAyOlE3ZSVNNHNB
User-Agent: Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: gzip,deflate,sdch
Accept-Language: sv-SE,sv;q=0.8,en-US;q=0.6,en;q=0.4
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Response Headers
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Server: Microsoft-IIS/7.0
X-Powered-By: ASP.NET
Date: Thu, 16 Feb 2012 12:26:13 GMT
Content-Length: 622
But when accessing from outside the local network, we get the big old 403, which in turn wants credentials to grant the user access to the web service.
I've tried using the ASIHTTPRequest library without success, and that project has been abandoned; its authors suggest going back to NSURLConnection.
And I have no clue where to start, not even which direction to take.
- (void)connection:(NSURLConnection *)connection didReceiveAuthenticationChallenge:(NSURLAuthenticationChallenge *)challenge
The above NSURLConnection delegate method doesn't even trigger, so I have no idea whatsoever how to authenticate myself.
All I get are the parsed XML elements of the 403 page.
I needs dem seriouz helps! plx.
This was all just a major f-up.
The site had SSL required and enabled, and setting 'Require SSL' on the virtual directories does some kind of super-duper meta-blocking.
So, by disabling 'Require SSL' for the virtual directories, it runs over SSL and no longer blocks 3G access.