Not able to do JMeter UI recording due to CORS error - jmeter-4.0

We are facing a CORS issue in JMeter UI recording:
CORS request did not succeed
We have a website, uat.torchmarkets.com, which internally calls tradeuat.torchmarkets.com, which has a security check for Cross-Origin Resource Sharing. We want to do UI recording for uat.torchmarkets.com.
Below are the JMeter UI recorder settings.
Below are the proxy settings in the browser. The subdomain loadtesting.torchmarkets.com is whitelisted.
Below is the CORS error:

Any help to solve this issue will be much appreciated!
Thanks in advance!!

In my case the CORS error vanished after importing JMeter's certificate.
In JMeter's bin folder, you will find a .crt file. This one needs to be imported into the browser. I have tested it only with Firefox.
In Firefox, go to the Privacy & Security settings section, where you will find View Certificates...
Select the Authorities tab.
Click Import... and browse for JMeter's certificate.
Check "This certificate can identify web sites." and confirm.
From that point on you should see no further CORS errors until the certificate expires, which by default happens after 7 days.
Optionally, if you need the certificate for longer than the default 7 days, find the file jmeter.properties in JMeter's bin folder, change the value of proxy.cert.validity, and let the certificate regenerate.
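For example, to extend the validity to 30 days, the change would look roughly like this (the property name is from jmeter.properties; the value of 30 is just an illustration):

# in JMeter's bin/jmeter.properties; remove the old certificate so it regenerates
proxy.cert.validity=30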

Related

Jmeter Recorder - can't access github website

I'm new to Jmeter and I have the following case:
I'm trying to record with JMeter by going to github.com/login. I set localhost and the port in JMeter, and also in Mozilla. The certificate created by JMeter has been added to Mozilla.
If I access, for example, Stack Overflow, it works, but if I want to go to GitHub I don't get the "Accept the risk and continue" button as I did for Stack Overflow, so GitHub can't be accessed. I even installed a clean Mozilla in order not to have any old certificate for GitHub.
Does anyone know why this is happening?
jmeter configuration
github certificate message
firefox proxy config
Later edit: This is how it looks, for example, for
stackoverflow
Besides GitHub, I'm not able to access Facebook or Google either.
It is a matter of certificate installation. Once you install the certificate, there shouldn't be an "Accept the risk and continue" prompt; rather, you should be able to navigate directly to the website.
Please try the steps below to install it appropriately in Firefox:
Launch Firefox -> Navigate to Options -> Search Certificates -> Click View Certificates -> Click on "Authorities" tab -> Click Import -> Choose certificate file types to include ".crt" files in the File Browser Pop-up -> Choose "ApacheJMeterTemporaryRootCA.crt"
A pop-up such as this should appear if you have followed the steps correctly:
Check Trust this CA to identify Websites -> Click OK -> Navigate to Authorities tab once again -> Ensure JMeter certificate is available
If all the above steps are followed, you should be able to record https://github.com/login without any hiccups
Why this error appears in the first place
Basically, when a browser tries to access a website via HTTPS, it tries to establish a secure connection after validating the SSL certificate. In our case, since JMeter is the proxy with which Firefox interacts, Firefox believes JMeter is the actual server, and hence it tries its best to ensure your safety by preventing you from sending the request, since JMeter's SSL certificate is not one of the approved ones (not part of the trusted CA certificates).
How the certificate helps
When you add JMeter's root CA certificate to the trusted CA store, you let Firefox know that the SSL certificate provided by JMeter is trustworthy, and hence things are alright.
Hope this helps!
You need to import JMeter's certificate into the Firefox browser in order to be able to intercept and decrypt secure traffic; see the HTTPS recording and certificates chapter of the HTTP(S) Test Script Recorder documentation for more details.
Remember that JMeter's MITM certificate has a limited lifetime (7 days as of JMeter 5.3), so it might be the case that you need to re-import the certificate once it has expired. The lifetime of the certificate is controlled by the proxy.cert.validity property.
If you still experience problems, try cleaning your browser history and removing everything from the beginning of time; this will remove cached certificates as well.
More information: Recording HTTPS Traffic with JMeter's Proxy Server
Also be aware that you can use an alternative way of recording a JMeter test - the JMeter Chrome Extension; in this case you won't have to worry about proxies and SSL certificates.

Safari 9 disallowed running of insecure content?

After upgrading to Safari 9 I'm getting this error in the browser:
[Warning] [blocked] The page at https://localhost:8443/login was not allowed to run insecure content from http://localhost:8080/assets/static/script.js.
Does anyone know how to enable the running of insecure content in the new Safari?
According to the Apple support forums Safari does not allow you to disable the block on mixed content.
Though this is frustrating for usability in legitimate cases like yours, it seems to be part of their effort to force secure content serving / content serving best practices.
As a solution, you can either upgrade the HTTP connection to HTTPS (which it seems you have done) or proxy your content through an HTTPS connection with an HTTPS-enabled service (or, in your case, port).
You can fix the HTTPS problem by using HTTPS locally with a self-signed SSL certificate. Heroku has a great how-to article about generating one.
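For reference, a self-signed certificate for localhost can typically be generated with an openssl one-liner along these lines (the file names and validity period are placeholders, not from the answer):

openssl req -x509 -newkey rsa:2048 -nodes -days 365 -subj "/CN=localhost" -keyout localhost.key -out localhost.crt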
After setting up SSL on all of your development servers, you will still get an error loading the resource in Safari, since an untrusted certificate is being used (self-signed SSL certificates are not trusted by browsers by default because they cannot be verified with a trusted authority). To fix this, you can load the problematic URL in a new tab in Safari and the browser will prompt you to allow access. If you click "Show Certificate" in the prompt, there will be a checkbox in the certificate details view to "Always allow content from localhost". Checking this before allowing access will store the setting in Safari for the future. After allowing access, just reload the page originally exhibiting the problem and you should be good to go.
This is a valid use case as a developer but please make sure you fully understand the security implications and risks you are adding to your system by making this change!
If, like me, you have
frontend on port1
backend on port2
and want to load the script http://localhost:port1/app.js from http://localhost:port2/backendPage,
I have found an easy workaround: simply redirect, with an HTTP response, all http://localhost:port2/localFrontend/*path requests to http://localhost:port1/*path in your backend server configuration (see the sketch below).
Then you could load your script directly from http://localhost:port2/localFrontend/app.js instead of direct frontend url. (or you could configure a base url for all your resources)
This way, Safari will be able to load content from another domain/port without needing any https setup.
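A minimal sketch of that redirect, assuming an Express backend (the ports and route names are placeholders, not from the answer):

// Hypothetical Express sketch of the redirect workaround described above.
const express = require('express');
const app = express();
const FRONTEND_PORT = 3000; // "port1": where app.js is actually served

// Redirect /localFrontend/<path> on the backend to the frontend server.
app.get('/localFrontend/*', (req, res) => {
  res.redirect('http://localhost:' + FRONTEND_PORT + '/' + req.params[0]);
});

app.listen(8080); // "port2": the backend port

The script tag on the backend page can then point at http://localhost:port2/localFrontend/app.js, which is same-origin from Safari's point of view until the redirect is followed.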
For me, disabling website tracking, i.e. unchecking "Prevent cross-site tracking", worked.

Internet Explorer: SCRIPT7002: XMLHttpRequest: Network Error 0x2f7d, Could not complete the operation due to error 00002f7d

This problem is driving me nuts. Our web app uses HTTP POST to log in users, and now IE 10 is aborting the connection and saying:
SCRIPT7002: XMLHttpRequest: Network Error 0x2f7d, Could not complete the operation due to error 00002f7d.
Here are all the details I have
IE version 10.0.9.16618, update version 10.0.6. I've also reproduced this on IE version 10.0.9200.16635, update version 10.0.7.
The domain is using HTTPS. The problem doesn't occur on HTTP connections
I've read that for some reason IE needs to get a certificate before it can do an HTTP POST, so I have HTTP GETs running before my POST request, but now the GET is erroring out. See the network flow screenshot. The GET is super simple, just a PING page that returns "I'm up."
Async is turned off: $.ajax({type: 'POST', url: url, async: false, ...}); I've read in other posts that this matters.
The certificate is good, see the screenshot.
The problem goes away if the site is added as a "trusted site" but that's not really the user experience we're shooting for.
This just started about a month ago. Did Microsoft push some new updates recently?
I've already read: http://social.msdn.microsoft.com/Forums/windowsapps/en-US/dd5d2762-7643-420e-880a-9bf75554e383/intermittent-xmlhttprequest-network-error-0x2f7d-could-not-complete-the-operation-due-to-error. It doesn't help.
Screen shots:
Network flow:
Cert is good:
Any help is greatly appreciated. I've spent a lot of hours on this with no luck. As you would expect this works fine in Chrome and Firefox. If you need any more detail about what's happening please let me know.
Thanks,
Certificate revocation checks may block the initial JSON POST, but allow subsequent requests after the GET callback
We recently determined that URLMon's code (Win8, Win7, and probably earlier) to ignore certificate revocation check failures is not applied for content uploads (e.g. HTTP POST). Hence, if a Certificate Revocation check fails, that is fatal to the upload (e.g. IE will show a Page Cannot Be Displayed error message; other clients would show a different error). However, this rarely matters in the real world because in most cases, the user first performs a download (HTTP GET) from the target HTTPS site, and as a result the server's certificate is cached with the "ignore revocation check failures" exemption for the lifetime of the process and thus a subsequent POST inherits that flag and succeeds. The upload fails if the very first request to the HTTPS site in the current process was for an upload (e.g. as in a cross-origin POST request).
Here is how it works:
A little background: When a web browser initiates a HTTPS handshake with a web server, the server immediately sends down a digital certificate. The hostname of the server is listed inside the digital certificate, and the browser compares it to the hostname it was attempting to reach. If these hostnames do not match, the browser raises an error.
The matching-hostnames requirement causes a problem if a single-IP is configured to host multiple sites (sometimes known as “virtual-hosting”). Ordinarily, a virtual-hosting server examines the HTTP Host request header to determine what HTTP content to return. However, in the HTTPS case, the server must provide a digital certificate before it receives the HTTP headers from the browser. SNI resolves this problem by listing the target server’s hostname in the SNI extension field of the initial client handshake with the secure server. A virtual-hosting server may examine the SNI extension to determine which digital certificate to send back to the client.
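In practice, the GET-before-POST mitigation described above might look like this hypothetical jQuery sketch (the URLs and payload are placeholders):

// Hypothetical sketch: warm up the HTTPS origin with a simple GET so the
// server certificate is cached (with the revocation-failure exemption)
// before the cross-origin POST is attempted.
$.get('https://example.com/ping').always(function () {
  $.ajax({
    type: 'POST',
    url: 'https://example.com/login',
    data: JSON.stringify({ user: 'alice', password: 'secret' }),
    contentType: 'application/json'
  });
});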
The GET may be a victim of the "operation aborted" scenario:
The HTML file is being parsed, and encounters a script block. The script block contains inline script which creates a new element and attempts to add it to the BODY element before the closing BODY tag has been encountered by the parser.
<body>
<div>
<script>
var newElem = document.createElement('p'); // create the element first
document.body.appendChild(newElem); // BODY is not yet closed when this runs
</script>
</div>
</body>
Note that if I removed the <div> element, then this problem would not occur because the script block's immediate parent would be BODY, and the script block's immediate parent is immune to this problem.
References
Understanding Certificate Revocation Checks
Client Certificates vs Server Certificates
Understanding and Managing the Certificate Stores
Preventing Operation Aborted Scenarios
HTTPS Improvements in IE
Online Certificate Status Protocol - OCSP
[SOLVED]
I only observed this error today; for me the error code was different, though.
SCRIPT7002: XMLHttpRequest: Network Error 0x2efd, Could not complete
the operation due to error 00002efd.
It was occurring randomly and not all the time, but what I noticed is that when it comes, it comes for subsequent ajax calls. So I put a delay of 5 seconds between the ajax calls and that resolved it (a rough sketch follows).
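A rough sketch of that delay workaround (the URLs are placeholders; note the caveat in the next answer that this is not a proper fix):

// Hypothetical sketch: space out subsequent ajax calls by 5 seconds.
$.get('/api/first').always(function () {
  setTimeout(function () {
    $.get('/api/second'); // fire the next call only after a 5 s pause
  }, 5000);
});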
Also, CORS must be configured on your web server.
I had the same exact issue and I just finally resolved it. For some reason I got the same error that you were receiving in IE when connecting to an API using OWIN middleware that was used to receive login credentials. It seemed to work fine when connecting to any other sort of API, though. For some reason IE didn't like the cross-domain request even though I had CORS enabled server-side on the API.
Anyway, I was able to resolve the issue using the xdomain library. Make sure you load this script before loading any other JavaScript.
First create a proxy.html page on the root of your API server and add this code, replacing the placeholder URL.
<!DOCTYPE HTML>
<script src="//cdn.rawgit.com/jpillora/xdomain/0.7.3/dist/xdomain.min.js" master="http://insert_client_url_here.com"></script>
Now simply add this to your client, replacing the placeholder URL so it points to the proxy.html page on your API server.
<script src="//cdn.rawgit.com/jpillora/xdomain/0.7.3/dist/xdomain.min.js" slave="http://Insert_Api_Url_Here.com/proxy.html"></script>
Adding a delay is not a proper solution.
This can happen because IE treats a request with an empty body as a network error.
Try adding an empty class as the parameter on the server, and IE should start working.
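A client-side variant of the same idea (a hypothetical sketch, not from the answer) is to ensure the body is never empty by always sending an explicit empty JSON object:

// Hypothetical sketch: send an explicit empty JSON object instead of no body.
$.ajax({
  type: 'POST',
  url: '/api/login', // placeholder URL
  data: JSON.stringify({}), // empty object rather than an empty body
  contentType: 'application/json'
});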

How do I get placemark icons to load over ssl?

I'm working on a web application that uses the Google Earth plugin. Recently, a new requirement to have non-public users log on was added, which meant that some users were now using the site over https. Among the things that broke in testing were the custom placemark icons (they were working over http).
The icons are hosted on the same server which serves the page.
Here are the urls for each of the protocols.
http - http://localhost/Images/yellow.png
https - https://localhost/Images/yellow.png
I can follow that link and the image will appear as you would expect.
The image hrefs are declared in icon styles in the dynamically generated KML.
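For reference, a minimal IconStyle declaration of that sort (the style id here is illustrative; the href matches the URLs above):

<Style id="yellowIcon">
  <IconStyle>
    <Icon>
      <href>https://localhost/Images/yellow.png</href>
    </Icon>
  </IconStyle>
</Style>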
I want to avoid loading the images over http because I think that will cause Internet Explorer to present the user with a mixed content warning.
How do I get the images to load properly while using https?
I have been wrestling with this myself -- the short answer is that this won't work. If the content is served off of an HTTPS site that generates any kind of error/prompt (authentication, invalid certificate, etc.), the plugin will simply not load the content.
Interestingly, the desktop client works fine and prompts the user for credentials if necessary. However, neither client will allow content to be served off of a site with an untrusted certificate.
The only workaround that I have found is:
Use a trusted HTTPS certificate on the server hosting the content (either trust the certificate on the client systems or just use a real certificate).
Do not use HTTPS basic auth, as that will always generate 401/challenge responses, which the web browser client will simply ignore.
If authentication is a requirement, use NTLM authentication and common (e.g., domain) logins. If you load the plugin in Internet Explorer (or in a .NET WebBrowserControl) the authentication will be handled properly and the images will show up.
I was at a Google Earth administrator's training last week and the trainer confirmed this "bug". It is supposed to be fixed in the next version of the plugin (it may actually be fixed already -- what version of the plugin are you using?)

SWT Browser Plugin does not prompt for proxy authentication

I have successfully configured my SWT Browser application to use the proxy by setting the VM arguments -Dnetwork.proxy_host and -Dnetwork.proxy_port to the appropriate values.
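For reference, the launch looks roughly like this (host, port, and jar name are placeholders):

java -Dnetwork.proxy_host=proxy.example.com -Dnetwork.proxy_port=8080 -jar myApplication.jar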
However, the proxy requires authentication, but the username/password prompt does not open. Furthermore, when registering an authentication listener, the listener is never triggered.
The problems occurred on a Linux Debian 64-bit distribution. When compiling the same application for Windows, everything works fine, i.e. the password prompt opens. The SWT Browser is configured to use MOZILLA, not WEBKIT. Unfortunately I cannot test with WEBKIT as I am limited to a given environment.
Temporary solution: when starting the Linux Mozilla browser, the prompt comes up. If I enter the correct values there and afterwards start the SWT Browser application, then no authentication is needed at all and internet access is possible. But this is not a good solution.
When I register a location listener with addLocationListener to see what's going on with the URL calls, I can see that the initial URL (for example www.google.de) results in a call to a certain http page of the proxy server. That http page is a redirect to an https page of the proxy. The https page then results in calling the http redirect page again. This is an endless loop.
I would guess that somewhere in the Java code of the SWT Browser class there is a routine that calls setUrl with those pages (which results in an endless loop) and skips calling any authentication listener for some reason.
Maybe someone has an idea of what's going wrong in this authentication process?
I have no solution but a hint: I'm not sure what you mean by "Linux Mozilla Browser" - I know Firefox and XULRunner. But your workaround suggests that profile information is shared somehow, and that shouldn't happen.
I tried to find some information on how to define the profile (where the web browser keeps its cache, config, SSL certificates, plugins, ...) but to no avail.
This entry in the FAQ shows how to set the proxy host: How do I set a proxy for the Browser to use?
Try to find a way to add the user/password information to the request sent to the proxy server (see the sketch below). If that fails, create a local proxy which connects to the real proxy as upstream and which can authenticate itself.
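For basic proxy authentication, the credentials travel in a Proxy-Authorization header; a hypothetical Node sketch of building that header value (the credentials are placeholders):

// Hypothetical sketch: build a Basic Proxy-Authorization header value.
const creds = Buffer.from('user:password').toString('base64');
console.log('Proxy-Authorization: Basic ' + creds);
// -> Proxy-Authorization: Basic dXNlcjpwYXNzd29yZA==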
Looking at the bug database, there is no support for Browser profiles: Flexible Mozilla profile support - new API request