I need to configure Burp Suite to intercept data between the web browser and a proxy server. The proxy server requires basic authentication (username & password) when connecting for the first time in each session. I have tried the 'Redirect to host' option in Burp Suite (I entered the proxy server address and port in the fields):
Proxy >> Options >> Proxy Listeners >> Request Handling
But I can't find an option to supply the authentication that is required when connecting to this proxy server.
While accessing google.com, the request headers are:
GET / HTTP/1.1
Host: google.com
User-Agent: Mozilla/5.0 (X11; Linux i686) KHTML/4.13.3 (like Gecko) Konqueror/4.13
Accept: text/html, text/*;q=0.9, image/jpeg;q=0.9, image/png;q=0.9, image/*;q=0.9, */*;q=0.8
Accept-Encoding: gzip, deflate, x-gzip, x-deflate
Accept-Charset: utf-8,*;q=0.5
Accept-Language: en-US,en;q=0.9
Connection: close
And the response is:
HTTP/1.1 400 Bad Request
Server: squid/3.3.8
Mime-Version: 1.0
Date: Thu, 10 Mar 2016 15:14:12 GMT
Content-Type: text/html
Content-Length: 3163
X-Squid-Error: ERR_INVALID_URL 0
Vary: Accept-Language
Content-Language: en
X-Cache: MISS from proxy.abc.in
X-Cache-Lookup: NONE from proxy.abc.in:3343
Via: 1.1 proxy.abc.in (squid/3.3.8)
Connection: close
You were on the right track, just in the wrong place. You need to set up an upstream proxy at:
Options >> Connections >> Upstream proxy
There you can also set up the authentication at:
Options >> Connections >> Platform authentication
Here you can create different authentication configurations, which will be used whenever the server requests them.
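For reference, once the upstream proxy and its credentials are configured, Burp answers the proxy's authentication challenge on your behalf. At the HTTP level the exchange looks roughly like this (the realm and the base64 value are placeholders):

GET http://google.com/ HTTP/1.1
Host: google.com

HTTP/1.1 407 Proxy Authentication Required
Proxy-Authenticate: Basic realm="proxy"

GET http://google.com/ HTTP/1.1
Host: google.com
Proxy-Authorization: Basic <base64 of "username:password">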
I'm trying to authenticate a user with JWT using GraphQL. Once I log the user in, I receive the token as a JSON response and an HttpOnly cookie storing the refresh token. (The server side is using Saleor-core.)
From the Saleor documentation and some other blog posts I assume that this response cookie should now be stored in the browser, and whenever I need to refresh a token the refreshToken cookie is used to authenticate my request. However, when I switch to the "Application" tab in my dev tools, it's just empty.
What is the normal behaviour of the browser after receiving a cookie in a response? Do I need some extra code to somehow "save" that response cookie?
I didn't really find anyone else having this problem, so I think the mistake must be somewhere else.
UPDATE
I read somewhere that the issue might be a missing "Secure" flag, which resulted from the server running in debug mode. I turned debug mode off, but the cookie is still not being set.
Response Headers:
HTTP/1.1 200 OK
Connection: keep-alive
Date: Thu, 23 Sep 2021 13:32:33 GMT
Server: uvicorn
Content-Type: application/json
Access-Control-Allow-Origin: https://rewhite-86006--beta-duoa0dwg.web.app
Access-Control-Allow-Methods: POST, OPTIONS
Access-Control-Allow-Headers: Origin, Content-Type, Accept, Authorization, Authorization-Bearer
Access-Control-Allow-Credentials: true
Content-Length: 912
X-Content-Type-Options: nosniff
Referrer-Policy: same-origin
Set-Cookie: refreshToken=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpYXQiOjE2MzI0MDM5NTQsIm93bmVyIjoic2FsZW9yIiwiZXhwIjoxNjM0OTk1OTU0LCJ0b2tlbiI6Ijd2b0VmMm1DNlZZSyIsImVtYWlsIjoiSnVsaWFuLkZpbmtlQGdtYWlsLmNvbSIsInR5cGUiOiJyZWZyZXNoIiwidXNlcl9pZCI6IlZYTmxjam8zTmc9PSIsImlzX3N0YWZmIjpmYWxzZSwiY3NyZlRva2VuIjoiWm55ek9xVG9rOU9GYXlDZXY0cjFxMUxnaktnTXRRR0VNUVJEalR1eTJDZ1IyOW1GSVBxQ1B1T1hZcTFQNk92cyJ9.Cl6PmoLkO9Hlh36tDOuyNLQCib4FVBwn32hhnmd7Q4E; expires=Sat, 23 Oct 2021 13:32:34 GMT; HttpOnly; Max-Age=2592000; Path=/; Secure
Via: 1.1 vegur
Request Headers:
POST /graphql/ HTTP/1.1
Host: rewhite-saleor-engine.herokuapp.com
Connection: keep-alive
Content-Length: 318
Pragma: no-cache
Cache-Control: no-cache
sec-ch-ua: "Google Chrome";v="93", " Not;A Brand";v="99", "Chromium";v="93"
sec-ch-ua-mobile: ?0
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/93.0.4577.82 Safari/537.36
sec-ch-ua-platform: "macOS"
content-type: application/json
Accept: */*
Origin: https://rewhite-86006--beta-duoa0dwg.web.app
Sec-Fetch-Site: cross-site
Sec-Fetch-Mode: cors
Sec-Fetch-Dest: empty
Referer: https://rewhite-86006--beta-duoa0dwg.web.app/
Accept-Encoding: gzip, deflate, br
Accept-Language: de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7
Thanks for your help!
The domain setting your cookie seems to be different from the origin of your request: you're making a cross-site request and receiving a Set-Cookie response from a server on a different domain.
Normally we run into this issue when running the backend and frontend on different domains (e.g. localhost:3000 and localhost:8080).
Solution:
Recent Chrome versions (from 2020 onwards) will only set cookies received from cross-site requests if the cookie has both the SameSite=None and Secure attributes. With Secure set, the cookie will only be sent to the server over HTTPS (so you need SSL in place).
As of now, the SameSite attribute is not set on your cookie; it defaults to Lax, not None, so you need to set it explicitly.
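For illustration, this is what a cookie that a cross-site response is allowed to set looks like (the token value is shortened to a placeholder):

Set-Cookie: refreshToken=<JWT>; Path=/; Max-Age=2592000; HttpOnly; Secure; SameSite=None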
OR
You can implement a proxy so that you request your web app at https://rewhite-86006--beta-duoa0dwg.web.app and the web app forwards the request to your Saleor engine domain, rewhite-saleor-engine.herokuapp.com. How you do that depends on the framework you're using to serve your web app. You haven't mentioned it in your question, but I notice you've tagged it vue.js, so I'll assume you're using the Vue CLI to serve a Vue app.
It's very simple to set up a proxy with the Vue CLI. Just look for a vue.config.js file in your root directory; if it's not there, create it and paste in the code below:
module.exports = {
  devServer: {
    proxy: {
      // Forward every request whose path starts with /graphql to the Saleor backend
      '^/graphql': {
        target: 'https://rewhite-saleor-engine.herokuapp.com',
        changeOrigin: true, // rewrite the Host header to match the target
        logLevel: 'debug',  // log proxied requests to the dev-server console
      },
    },
  },
}
Now, instead of fetching the refreshToken from rewhite-saleor-engine.herokuapp.com/graphql, you send the request to your web app at https://rewhite-86006--beta-duoa0dwg.web.app/graphql, and your web app's local server forwards the request to your Saleor backend on Heroku. To your browser it looks as though the response came from the web app itself, so it is no longer a cross-site request.
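A minimal sketch of what the client-side call might then look like; the use of fetch and the exact mutation are assumptions, the point is the relative /graphql path:

// Use the relative /graphql path so the request stays same-origin and is
// forwarded by the proxy configured above; the refreshToken cookie is then a
// first-party cookie from the browser's point of view.
fetch('/graphql', {
  method: 'POST',
  credentials: 'include', // send and accept cookies with the request
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    // Hypothetical refresh call; use whatever query or mutation you already send
    query: 'mutation { tokenRefresh { token } }',
  }),
})
  .then((res) => res.json())
  .then((data) => console.log(data));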
I'm running Apache 2.4 as a reverse proxy in front of Tomcat 9 on Ubuntu 18.04.
The Tomcat application is deployed under /apachetest and uses form-based authentication.
When calling "http://10.10.50.20/apachetest" (without proxy)
the login-page is comming up
I put in the credentials
and than "index.html" is delivered
So far ...
On Apache I have configured a virtual host for SSL:
ProxyPass / http://localhost:8087/apachetest/
ProxyPassReverse / http://localhost:8087/apachetest/
ProxyPassReverseCookiePath / /apachetest
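For context, these directives sit inside the SSL virtual host roughly as sketched below (the certificate paths are placeholders, not my real ones):

<VirtualHost *:443>
    ServerName apachetest.localdomain

    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/apachetest.crt    # placeholder path
    SSLCertificateKeyFile /etc/ssl/private/apachetest.key  # placeholder path

    # Forward everything to the Tomcat context and adjust the cookie path on the way back
    ProxyPass / http://localhost:8087/apachetest/
    ProxyPassReverse / http://localhost:8087/apachetest/
    ProxyPassReverseCookiePath / /apachetest
</VirtualHost>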
When calling https://apachetest.localdomain/:
- the login page comes up
- I put in the credentials
- and then I receive "HTTP Status 408 – Request Timeout" from Tomcat
Using Chrome's developer tools I can see the following headers for the "j_security_check" request:
General:
- Request URL: https://apachetest.localdomain/j_security_check
- Request Method: POST
- Status Code: 408
- Remote Address: 10.10.50.20:443
- Referrer Policy: no-referrer-when-downgrade
Response Headers:
- Connection: close
- Content-Language: de
- Content-Length: 1239
- Content-Type: text/html;charset=utf-8
- Date: Mon, 09 Dec 2019 10:36:28 GMT
- Server: Apache/2.4.29 (Ubuntu)
- X-Content-Type-Options: nosniff
- X-Frame-Options: DENY
Request Headers:
- Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3
- Accept-Encoding: gzip, deflate, br
- Accept-Language: de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7
- Cache-Control: max-age=0
- Connection: keep-alive
- Content-Length: 43
- Content-Type: application/x-www-form-urlencoded
- Cookie: JSESSIONID=B859EE1F208D4D1C26C7B5714A41B03D
- Host: apachetest.localdomain
- Origin: https://apachetest.localdomain
- Referer: https://apachetest.localdomain/
- Sec-Fetch-Mode: navigate
- Sec-Fetch-Site: same-origin
- Upgrade-Insecure-Requests: 1
- User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.108 Safari/537.36
Thank you for your interest in my question.
OK, I will try again.
I'm looking for a configuration that runs a Tomcat web application behind an Apache reverse proxy.
The Tomcat web application has form-based authentication implemented.
When calling the Tomcat web application through the proxy, I get the expected login page.
But after putting in the right credentials and submitting, I receive the following message:
"HTTP Status 408 – Request Timeout".
The expected "index.html" page is not delivered.
I'm trying to create a user using the SonarQube API (version 6.2 or later).
I have set up a SoapUI project that contains a few test scripts. One of them logs in and creates a user; it returns a 401 when the user creation call is made.
The login is used for other calls as well and proves to work, except for the create user call. The account used to log in to SonarQube is a member of the System Administrators group.
Below is the raw request.
POST http://localhost:9000/api/users/create HTTP/1.1
Accept-Encoding: gzip,deflate
Content-Type: application/x-www-form-urlencoded
Content-Length: 47
Host: localhost:9000
Connection: Keep-Alive
User-Agent: Apache-HttpClient/4.1.1 (java 1.5)
Cookie: JWT-SESSION=eyJhbGciOiJIUzI1NiJ9.eyJqdGkiOiJBV0ExaGFtX2hnNWdHUWtNNVRHSiIsInN1YiI6ImFkbWluIiwiaWF0IjoxNTEyNzI2NDQwLCJleHAiOjE1MTI5ODU2NDAsImxhc3RSZWZyZXNoVGltZSI6MTUxMjcyNjQ0MDM4MywieHNyZlRva2VuIjoicHRwcXRlYmtzYTR2MTlhaTk3anV0bnVlZW8ifQ.waHqOsMJ9P6FyIOUWuVODl5QcW-IJp10G6oUAvy1DWk; XSRF-TOKEN=ptpqtebksa4v19ai97jutnueeo
Cookie2: $Version=1
login=user01&name=name01&password=%21P%40ssw0rd
Below is the raw response.
HTTP/1.1 401 Unauthorized
Server: Apache-Coyote/1.1
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
X-Content-Type-Options: nosniff
Content-Length: 0
Date: Fri, 08 Dec 2017 09:47:20 GMT
Any suggestions are welcome.
BTW: I can create the user with the same values through the UI, so there is no issue with the user information, or at least it seems so.
Update 1:
Added the raw request with query string parameters.
POST http://localhost:9000/api/users/create?login=user01&name=name01&password=%21P%40ssw0rd HTTP/1.1
Accept-Encoding: gzip,deflate
Content-Type: application/x-www-form-urlencoded
Content-Length: 0
Host: localhost:9000
Connection: Keep-Alive
User-Agent: Apache-HttpClient/4.1.1 (java 1.5)
Cookie: JWT-SESSION=eyJhbGciOiJIUzI1NiJ9.eyJqdGkiOiJBV0JHZkVGY0h3bW5UZ0V5QklJNyIsInN1YiI6ImFkbWluIiwiaWF0IjoxNTEzMDExMDM2LCJleHAiOjE1MTMyNzAyMzYsImxhc3RSZWZyZXNoVGltZSI6MTUxMzAxMTAzNjQyNCwieHNyZlRva2VuIjoibmIzdmlpcjAyZmZ1ODJnMzNtdW1hYWdkN3QifQ.ur8eZkW1CwNinx4tInFsbkGLQTHQ6yFjheRfup8Z4fQ; XSRF-TOKEN=nb3viir02ffu82g33mumaagd7t
Cookie2: $Version=1
It's not possible to use the cookie generated by a web request in a console request (it could be considered an attack).
You need to either:
- specify a user token (the recommended way), or
- specify a login and password.
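For illustration, the same create call with HTTP basic authentication; with a user token, the token goes in the username position and the password is left empty (the token and its base64 encoding are placeholders):

POST http://localhost:9000/api/users/create HTTP/1.1
Authorization: Basic <base64 of "my-generated-token:">
Content-Type: application/x-www-form-urlencoded
Content-Length: 47
Host: localhost:9000
Connection: Keep-Alive

login=user01&name=name01&password=%21P%40ssw0rd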
What is a common approach to authenticating a user session for a websocket connection?
As I understand it, a websocket message contains only data and no headers, so the authorization cookie is not available to the server backend. How should the application distinguish messages from different clients?
Which websocket server are you using?
If your web server and websocket server are the same, you could send the session ID over the websocket and force-disconnect any client that does not send a valid session ID in its first message.
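A minimal browser-side sketch of that idea (the endpoint URL, the message format and the cookie name are assumptions for illustration, and this only works if the session cookie is not HttpOnly):

// Send the session id as the very first websocket message; the server is
// expected to close the connection if the id is missing or invalid.
var match = document.cookie.match(/PHPSESSID=([^;]+)/); // cookie name as in the example below
var sessionId = match ? match[1] : null;

var ws = new WebSocket('wss://example.com/whiteboard'); // hypothetical URL
ws.onopen = function () {
  ws.send(JSON.stringify({ type: 'auth', sessionId: sessionId }));
};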
If your websocket server properly parses the HTTP headers sent in the HTTP upgrade request, it may also save any cookies. This is what a request from my Firefox (version 35) looks like:
GET /whiteboard HTTP/1.1
Host: *:*
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:35.0) Gecko/20100101 Firefox/35.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: de,en-US;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate
DNT: 1
Sec-WebSocket-Version: 13
Origin: *
Sec-WebSocket-Protocol: whiteboard
Sec-WebSocket-Key: iGPS0jjbNiGAYrIyC/YCzw==
Cookie: PHPSESSID=9fli75enklqmv1a30hbdmg1461
Connection: keep-alive, Upgrade
Pragma: no-cache
Cache-Control: no-cache
Upgrade: websocket
As you can see, the PHP session ID is transmitted fine. Check the documentation of your websocket server to see whether it parses the HTTP headers. Remember that cookies will not be sent if the websocket's domain differs from the web server's domain.
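For example, a server-side sketch using Node's ws package (the choice of library, the port and the session check are assumptions; the question doesn't say which server is used):

// Read the session cookie from the HTTP upgrade request and reject
// connections that don't carry a valid session id.
const WebSocket = require('ws');

// Hypothetical session check; a real application would look the id up in its session store.
function isValidSession(id) {
  return typeof id === 'string' && id.length > 0;
}

const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (socket, request) => {
  // The upgrade request is a normal HTTP request, so its Cookie header is available here.
  const cookies = request.headers.cookie || '';
  const match = cookies.match(/PHPSESSID=([^;]+)/);
  const sessionId = match ? match[1] : null;

  if (!isValidSession(sessionId)) {
    socket.close(); // force-disconnect unauthenticated clients
    return;
  }

  // ...handle authenticated messages here
});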
I'm currently trying to connect to a web service located at https://xxx.xxx.xx/myapp
It has anonymous access and SSL enabled for testing purposes at the moment.
When trying to connect from the 3G network, I get "Status 403: Access denied. You do not have permission to view this directory or page using the credentials that you supplied."
I get these headers when connecting to the web service locally:
Headers
Request URL:https://xxx.xxx.xx/myapp
Request Method:GET
Status Code:200 OK
Request Headers
GET /myapp/ HTTP/1.1
Host: xxx.xxx.xxx
Connection: keep-alive
Authorization: Basic amViZTAyOlE3ZSVNNHNB
User-Agent: Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: gzip,deflate,sdch
Accept-Language: sv-SE,sv;q=0.8,en-US;q=0.6,en;q=0.4
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Response Headers
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Server: Microsoft-IIS/7.0
X-Powered-By: ASP.NET
Date: Thu, 16 Feb 2012 12:26:13 GMT
Content-Length: 622
But when accessing it from outside the local network, we get the big ol' 403, which in turn wants credentials to grant the user access to the web service.
However, I've tried using the ASIHTTPRequest library without success, and that project has been abandoned; its authors suggest going back to NSURLConnection.
And I have no clue where to start, not even which direction to take.
- (void)connection:(NSURLConnection *)connection didReceiveAuthenticationChallenge:(NSURLAuthenticationChallenge *)challenge
The above delegate method of NSURLConnection doesn't even trigger, so I have no idea whatsoever how to authenticate myself.
All I get is the parsed results of the XML elements of the 403 page.
I needs dem seriouz helps! plx.
This was all just a major f-up.
The site had SSL required and enabled, and setting "Require SSL" on the virtual directories does some kind of super-duper meta-blocking.
So, by disabling "Require SSL" for the virtual directories, the service still runs over SSL but no longer blocks 3G access.