HTTP Basic Auth in Selenium Grid - selenium

I want to implement HTTP Basic Authentication in Selenium Grid. How do I do that? For example, I want requests to the grid to be rejected unless the URL carries credentials, something like http://username:password@mygrid.com:4444/wd/hub, on our internal Selenium Grid. How do I do that?

OK, I was able to achieve what I needed. I installed nginx and proxied the Selenium Grid endpoint through it, then added
auth_basic "Grid's Area";
auth_basic_user_file /etc/apache2/.htpasswd;
to nginx.conf. That's it.
Please remember the grid exposes multiple URIs and has no single root URI (in nginx terms). So if you proxy, say, /grid to http://localhost:4444/grid/console, the static content cannot be served, because it comes from a different URI: in our case /grid/resources/org/openqa/grid/images/, which is different from /grid/console. In this case we need to proxy / to http://localhost:4001 instead.
As far as getting SSL working, I followed this guide and it was super easy.
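Putting the pieces together, a minimal nginx server block for this kind of setup might look like the following sketch (the listen port, server name, and htpasswd path are examples; the Grid is assumed to be on localhost:4444):

```nginx
server {
    listen 4445;
    server_name mygrid.example.com;            # example hostname

    # Require basic auth for everything proxied below
    auth_basic           "Grid's Area";
    auth_basic_user_file /etc/nginx/.htpasswd;

    # Proxy the root, so the Grid's static resources
    # (/grid/resources/...) are reachable as well
    location / {
        proxy_pass http://localhost:4444;
        proxy_set_header Host $host;
    }
}
```

Clients would then point at http://username:password@mygrid.example.com:4445/wd/hub.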

Related

Acumatica behind reverse proxy causes issues with GetFile.ashx

I am running my dev instances of Acumatica behind a reverse proxy consisting of IIS with Application Request Routing 3.0.
For the most part things run and behave as expected; however, I have issues with images, e.g. logos, inventory pics, etc. The issue is that on first load the URL delivered to the client is an absolute URL. If I move between branches, the logo URL switches to a relative URL and the image displays properly.
if you would like an example here is a url to a test instance.
https://2019r2.acumatica.govelocit.com/test20r1
user: admin
pass: P#ssword1
when you login the logo will have a broken link icon
Image with Broken Link
if you switch to a new branch the logo shows.
Working Image
if you switch back to the branch you started with the logo still displays fine. It is just an initial load issue.
Thoughts?
The issue here is that the absolute URL is built using not the current URL scheme but the scheme the site was originally called with. Since you are calling the site from your reverse proxy via HTTP, the link generated for images is also HTTP and therefore cannot be loaded. Additionally, you get the security warning because you are loading HTTP content on a page served over HTTPS.
If you just edit the URL scheme in the browser, the image will appear.
There are at least 2 good solutions to suggest:
Point your reverse proxy at the HTTPS site. This is quite a straightforward solution, though it might cause a little configuration headache if your reverse proxy does not like the self-signed IIS certificate. It also would not let you analyze the requests, as all traffic will be encrypted.
Another solution is a little more sophisticated: keep calling the HTTP site but make it think it is being called over HTTPS. For this you set the X-Forwarded-Proto header to https in your reverse proxy config.
Unfortunately I am not familiar with Application Request Routing 3.0, but for illustration, the equivalent nginx proxy location would look like this:
location ~ ^/(MySite) {
    proxy_pass http://localhost:82;            # note: the backend is still called over plain http
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-Proto https;  # here you are tricking the site
}
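For IIS/ARR, a comparable approach (only a sketch; the rule name and backend URL are examples) is a URL Rewrite rule that sets the server variable. Note that HTTP_X_FORWARDED_PROTO must first be added to URL Rewrite's allowed server variables list before a rule may set it:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="ProxyToAcumatica" stopProcessing="true">
        <match url="(.*)" />
        <serverVariables>
          <!-- tells the backend it was called over https -->
          <set name="HTTP_X_FORWARDED_PROTO" value="https" />
        </serverVariables>
        <action type="Rewrite" url="http://localhost:82/{R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```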

Call APIs over Http from Webpage served over Https

We have a Java/Jetty server. The servlets on this server are called by some of our internal applications over http.
I have been asked to create a webapp/website which will use many of these servlets/APIs.
However, this is an external customer-facing website and needs to be served over https/ssl. The servlet URLs look like
http://internalServer:9999?parameters.
Now my webapp is ready and has been deployed on Apache on Debian. Everything works fine, but as soon as I enable
https/ssl the backend calls do not go through. On Chrome I get "Mixed content. Page was loaded on https but is requesting resource over http...". On Safari I get "could not load resource due to access control checks".
I understand the reasons for these errors but I would like to know ways to solve this.
I have full control over apache server and website code.
I have very limited control over the internal Jetty server and no control over the servlet code (I don't want to mess with existing apps).
Is there something I can do just with the Apache configuration? Can I use it as a reverse proxy for the Jetty (http) server?
Thanks for your help.
"Mixed content. Page was loaded on https but is requesting resource over http..."
That error message means your HTML references resources that are being requested over http://... specifically.
You'll need to fix your HTML (and any references in JavaScript and CSS) so that those resources are requested over https://... as well.
If you try to call an http service from an https site you will have Mixed content error.
You can avoid that error by using apache2 proxy settings inside your example.org.conf, which you can find in the folder /etc/apache2/sites-enabled.
Add some code:
<VirtualHost *:443>
...
ProxyPass /service1 http://internalServer:9999
ProxyPassReverse /service1 http://internalServer:9999
</VirtualHost>
From your https site you then fetch the URL
https://example.org/service1
to reach the service.
That way you can call your http services from an https site.
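One caveat, assuming a stock Debian/Ubuntu Apache install: the proxy modules are not enabled by default, so ProxyPass will cause a configuration error until you enable them:

```shell
# Enable the proxy modules and reload Apache (Debian/Ubuntu layout)
a2enmod proxy proxy_http
systemctl reload apache2
```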

Intercept and modify traffic to and from tomcat

I have an application deployed on Tomcat on my localhost. I want to intercept and modify the requests that the application makes and the responses that it receives. Is there a tool to do this? I have tried Burp, but I've only been able to intercept traffic to and from the Firefox browser with it.
You could try using the OWASP Zed Attack Proxy.
It will be able to intercept requests from any browser that supports proxies (Firefox, IE, Chrome, Opera...).
I think you are talking about servlet Filters, which intercept the requests and responses to servlets (and are placed in a FilterChain).
As Vikdor said, servlet Filters should do the trick. You need to modify the web.xml of each application running on Tomcat and write your filter code in Java as a Filter.
If you want to do a simple task, like redirecting a URL or adding a header, you can use UrlRewriteFilter; for a more complex/custom task you should write your own code.
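For reference, wiring a custom filter into an application's web.xml looks roughly like this (com.example.TrafficFilter is a placeholder for your own Filter implementation):

```xml
<filter>
    <filter-name>trafficFilter</filter-name>
    <filter-class>com.example.TrafficFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>trafficFilter</filter-name>
    <!-- intercept every request to this webapp -->
    <url-pattern>/*</url-pattern>
</filter-mapping>
```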

Set request header and User Agent in Geb

When using Geb, is it possible to set custom request headers and user agent when using the Browser API (and not the Direct Download API)?
While this is possible with the FirefoxDriver (see here), I am looking for a way of doing this with the WebKitDriver.
A possible solution is via a proxy.
BrowserMob has a standalone mode with a REST API, or it can be embedded in your test programmatically: https://github.com/webmetrics/browsermob-proxy . This is useful when there are a lot of custom headers you want to test.
If you already have Apache, you can create another VirtualHost on a different port that sets that particular request header, and point your browser at that port before the test. This works given that your header doesn't change between tests.
This might not be a direct solution to your question (modifying request headers directly in the Browser API), but it achieves the same end result.
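As a sketch of the Apache approach (port, backend host, and header values are examples; mod_headers and mod_proxy_http must be enabled):

```apache
Listen 8081
<VirtualHost *:8081>
    # Force the headers the test expects before proxying
    RequestHeader set User-Agent "MyTestAgent/1.0"
    RequestHeader set X-Custom-Header "test-value"

    ProxyPass        / http://app-under-test.example.com/
    ProxyPassReverse / http://app-under-test.example.com/
</VirtualHost>
```

Pointing the browser at http://localhost:8081 then exercises the app with those headers attached.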

How to modify web content using apache proxy?

I'm using an Apache server to test my browser. I want to insert some JavaScript into a web page, such as google.com, to analyze the page content.
Other debugging methods are limited by the browser I have to use, so I want to modify the page contents at the proxy end.
Thank you very much!
Squid is a solid web proxy with easy hooks for modifying pages on the fly. You can start with this and adapt the script and hooks to modify HTML instead of images.
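Note that Squid does not rewrite HTML by itself; the usual hook is an ICAP (or eCAP) adaptation service that receives each response and returns a modified body. A minimal squid.conf sketch, assuming you run your own ICAP service on port 1344 (service name and URL are examples):

```
# Send all responses through a local ICAP service for modification
icap_enable on
icap_service js_inject respmod_precache icap://127.0.0.1:1344/respmod
adaptation_access js_inject allow all
```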