I am using BrowserMob Proxy 2.0-beta-8 in a test automation project with Selenium. The page I'm testing against uses HTTPS, and I need to rewrite the User-Agent header. For plain HTTP requests everything works fine: the request interceptor is called and I can rewrite the header. For HTTPS requests, however, the interceptor is not called at all.
Does BrowserMob currently not support intercepting HTTPS requests, or am I missing something here?
You did not mention which browser you are using:
1) If you are using Firefox or Chrome with Selenium 2 and BrowserMob >= 2.0-beta-8, then HTTPS interception should work out of the box.
2) If you are using another browser, check how to install the BrowserMob CA certificate in that browser, since the proxy has to man-in-the-middle the TLS connection to see HTTPS requests.
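For case 1), here is a minimal sketch of wiring a User-Agent-rewriting filter into Selenium. Note the assumptions: it uses the LittleProxy-based BrowserMobProxyServer API with addRequestFilter (available from 2.1 on); if I remember correctly, the 2.0-beta ProxyServer API exposes addRequestInterceptor instead, and in either case the browser must trust the BrowserMob CA certificate (or you call setTrustAllServers) for HTTPS traffic to be intercepted.

    import net.lightbody.bmp.BrowserMobProxyServer;
    import net.lightbody.bmp.client.ClientUtil;
    import org.openqa.selenium.Proxy;
    import org.openqa.selenium.firefox.FirefoxDriver;
    import org.openqa.selenium.remote.CapabilityType;
    import org.openqa.selenium.remote.DesiredCapabilities;

    public class UserAgentRewriteExample {
        public static void main(String[] args) {
            BrowserMobProxyServer proxy = new BrowserMobProxyServer();
            // Needed for HTTPS: the proxy MITMs TLS, so either trust all upstream
            // servers here or install the BrowserMob CA in the browser.
            proxy.setTrustAllServers(true);
            proxy.start(0);

            // Rewrite the User-Agent header on every outgoing request (HTTP and HTTPS).
            proxy.addRequestFilter((request, contents, messageInfo) -> {
                request.headers().set("User-Agent", "MyCustomAgent/1.0");
                return null; // null = let the request continue to the server
            });

            Proxy seleniumProxy = ClientUtil.createSeleniumProxy(proxy);
            DesiredCapabilities capabilities = new DesiredCapabilities();
            capabilities.setCapability(CapabilityType.PROXY, seleniumProxy);
            FirefoxDriver driver = new FirefoxDriver(capabilities);

            driver.get("https://example.com");
            driver.quit();
            proxy.stop();
        }
    }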
I am trying to optimize my site for all-HTTPS. I know that Twitter is all-HTTPS, and I noticed that they don't redirect HTTP to HTTPS, but instead just initiate an HTTPS connection.
Here is a screenshot of Google Chrome's Network Activity: notice there is no redirect (301/302); the HTTP request (first line) just hangs as pending, and the second line is the HTTPS page. Note that I have cleared all my browser cache, so HTTP Strict Transport Security (HSTS) shouldn't matter.
Here is another screenshot of the request/response for the HTTPS page. Notice that Twitter seems to insert some fields into the request, such as :scheme.
How do they do this? I would assume it's faster: if a user types twitter.com into their browser, instead of a redirect (think extra network round trip), Twitter seems to move seamlessly to SSL (HTTPS).
A follow-on question would be: does this work in all browsers?
Twitter has been added to the list of preloaded HSTS sites that ships with Chrome and Mozilla Firefox, so the browser upgrades even the very first request to HTTPS without sending anything over plain HTTP.
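For sites that are not preloaded, the opt-in mechanism is the Strict-Transport-Security response header served over HTTPS; the preload directive signals that the site wants to be included in the browsers' built-in lists. A typical value looks like:

    Strict-Transport-Security: max-age=31536000; includeSubDomains; preload

Once a browser knows a host is HSTS (from this header or from the preload list), it upgrades http:// URLs for that host internally before anything goes out on the wire, which is why no server-side 301/302 shows up in the network log.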
I am developing a Rails application that uses an SSL connection. I am currently using third-party resources (JS and CSS files) to implement a map (OpenStreetMap). I have already tried to import these resources (JS and CSS) into my application, but the JavaScript code tries to access an external WMS via HTTP.
The problem is that Google Chrome blocks access to third-party HTTP resources when the application is served over HTTPS.
So I disabled SSL on certain pages of the application and tried to force HTTP or HTTPS as I wanted.
I followed this blog post, and it works: http://www.simonecarletti.com/blog/2011/05/configuring-rails-3-https-ssl/
But when I force HTTP for the page where these resources are used, Google Chrome forces an HTTPS connection, causing an infinite loop.
If I clear the Chrome cache (having already accessed the same page via HTTPS), accessing it via HTTP works. But once I have accessed an HTTPS page and then try to access it via HTTP, Chrome forces the HTTPS connection, resulting in an infinite loop.
The question is: is there something I can set in the request that makes Chrome accept the plain HTTP connection?
Regards
I've been doing some research on this, and it turns out that setting config.force_ssl = true in Rails 3 causes the app to send an HSTS header. There's a bit of information about it here: How to disable HTTP Strict Transport Security?
Essentially, the HSTS header tells Chrome (and Firefox) to access your site only through HTTPS for a specific amount of time.
So... the answer I have for you for now is that you can clear your own HSTS setting by going to chrome://net-internals/#hsts within your Chrome browser and deleting the HSTS state for your domain.
I think the answers here can help you: Rails: activating SSL support gets Chrome confused
Running a Rails application locally, I am able to configure Charles Proxy to show all the request/response details for the app accessible at lvh.me:3000.
However, I haven't been able to capture the Rails app's internal HTTP calls to external URLs. For example, when I use rest-client/HTTParty to make an external call to, say, http://www.google.com from within my Rails app, Charles Proxy does not show the server-initiated HTTP requests to google.com.
Can someone suggest what configuration I am missing? Under Recording Settings > Include I have added http://www.google.com.
You need to configure the HTTP client library itself to use your proxy. If you are using rest-client, you only need to point it at the Charles proxy address (8888 is the Charles default), e.g. RestClient.proxy = "http://127.0.0.1:8888"
Or if it's httpclient:
http_client = HTTPClient.new(proxy: "http://127.0.0.1:8888")
...
I have an application deployed on Tomcat on my localhost. I want to intercept and modify the requests that the application makes and the responses that it receives. Is there a tool to do this? I have tried out Burp, but I've only been able to intercept traffic to and from the Firefox browser with it.
You could try using the OWASP Zed Attack Proxy.
It will be able to intercept any request from a browser that supports proxies (Firefox, IE, Chrome, Opera, ...).
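If you also need to capture the requests the application itself makes (rather than browser traffic), one common approach, assuming the app uses the JVM's standard HTTP stack (HttpURLConnection and friends), is to start Tomcat with the JVM proxy properties pointing at the intercepting proxy; ZAP listens on 127.0.0.1:8080 by default:

    JAVA_OPTS="$JAVA_OPTS -Dhttp.proxyHost=127.0.0.1 -Dhttp.proxyPort=8080 -Dhttps.proxyHost=127.0.0.1 -Dhttps.proxyPort=8080"

HTTP client libraries that ignore these JVM properties have to be pointed at the proxy through their own configuration.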
I think you are talking about Servlet Filters that intercept the requests and responses to servlets (and are placed in a FilterChain).
As Vikdor said, Servlet Filters should do the trick. You need to modify the web.xml of each application running on the Tomcat instance and write your filter code in Java as a Filter.
If you want to do a simple task, like redirecting a URL or adding a header, you can use UrlRewriteFilter; for a more complex/custom task you should write your own code, for example along the lines of the sketch below.
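This is a minimal sketch, with made-up names like HeaderRewriteFilter, of a javax.servlet Filter that overrides a request header before the request reaches the servlet and adds a header to the response; it would be registered in web.xml like any other filter:

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletRequestWrapper;
    import javax.servlet.http.HttpServletResponse;

    public class HeaderRewriteFilter implements Filter {

        @Override
        public void init(FilterConfig config) throws ServletException { }

        @Override
        public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
                throws IOException, ServletException {
            // Wrap the request so the servlet sees a modified User-Agent header.
            // (A complete implementation would also override getHeaders/getHeaderNames.)
            HttpServletRequest wrapped = new HttpServletRequestWrapper((HttpServletRequest) request) {
                @Override
                public String getHeader(String name) {
                    if ("User-Agent".equalsIgnoreCase(name)) {
                        return "MyCustomAgent/1.0";
                    }
                    return super.getHeader(name);
                }
            };

            // Add a marker header to the outgoing response as well.
            ((HttpServletResponse) response).setHeader("X-Intercepted", "true");

            chain.doFilter(wrapped, response);
        }

        @Override
        public void destroy() { }
    }

And mapped in web.xml (package name is hypothetical):

    <filter>
        <filter-name>headerRewriteFilter</filter-name>
        <filter-class>com.example.HeaderRewriteFilter</filter-class>
    </filter>
    <filter-mapping>
        <filter-name>headerRewriteFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>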
I have successfully configured my SWT Browser application to use the proxy by setting the VM arguments -Dnetwork.proxy_host and -Dnetwork.proxy_port to the appropriate values.
However, the proxy needs authentication, but the username/password prompt does not open. Furthermore, when I register an authentication listener, the listener is never triggered.
The problems occurred on a 64-bit Debian Linux distribution. When compiling the same application for Windows, everything works fine, i.e. the password prompt opens. The SWT Browser is configured to use MOZILLA, not WEBKIT. Unfortunately, I cannot test with WEBKIT as I am limited to a given environment.
Temporary workaround: when I start the Linux Mozilla browser, the prompt comes up. If I enter the correct values there and afterwards start the SWT Browser application, then no authentication is needed at all and internet access is possible. But this is not a good solution.
When I register a location listener with addLocationListener to see what's going on with URL calls, I can see that the initial URL (for example www.google.de) results in a call to a certain HTTP page of the proxy server. That HTTP page is a redirect to an HTTPS page of the proxy. The HTTPS page then results in calling the HTTP redirect page again. This is an endless loop.
I would guess that somewhere in the Java code of the SWT Browser class there is a routine that calls setUrl with those pages (which results in an endless loop) and skips calling any authentication listener for some reason.
Maybe someone has an idea what's going wrong in this authentication process?
I have no solution, but a hint: I'm not sure what you mean by "Linux Mozilla Browser" - I know Firefox and XULRunner. But your workaround suggests that profile information is shared somehow, and that shouldn't happen.
I tried to find some information on how to define the profile (where the web browser keeps its cache, config, SSL certificates, plugins, ...), but to no avail.
This entry in the FAQ shows how to set the proxy host: How do I set a proxy for the Browser to use?
Try to find a way to add the user/password information into the request sent to the proxy server. If that fails, create a local proxy which connects to the real proxy as upstream and which can authenticate itself.
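One sketch of the first idea, assuming the SWT build in use supports the setUrl(String, String, String[]) overload (available since SWT 3.6) and that the Mozilla backend forwards custom headers to the proxy, is to attach a Proxy-Authorization header yourself. The credentials below are placeholders, and whether this header actually reaches the proxy for the initial CONNECT on HTTPS pages is not guaranteed:

    import java.util.Base64;
    import org.eclipse.swt.SWT;
    import org.eclipse.swt.browser.Browser;
    import org.eclipse.swt.layout.FillLayout;
    import org.eclipse.swt.widgets.Display;
    import org.eclipse.swt.widgets.Shell;

    public class ProxyAuthBrowser {
        public static void main(String[] args) {
            // Hypothetical credentials for the authenticating proxy.
            String credentials = Base64.getEncoder()
                    .encodeToString("proxyUser:proxyPassword".getBytes());

            Display display = new Display();
            Shell shell = new Shell(display);
            shell.setLayout(new FillLayout());

            Browser browser = new Browser(shell, SWT.MOZILLA);
            // setUrl(url, postData, headers) lets us attach extra request headers.
            browser.setUrl("http://www.google.de", null,
                    new String[] { "Proxy-Authorization: Basic " + credentials });

            shell.open();
            while (!shell.isDisposed()) {
                if (!display.readAndDispatch()) {
                    display.sleep();
                }
            }
            display.dispose();
        }
    }

If the header is stripped before the proxy handshake, the fallback is the second suggestion: run a small local forwarding proxy that adds the credentials itself and uses the real proxy as its upstream.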
Looking at the bug database, there is no support for Browser profiles: Flexible Mozilla profile support - new API request