Each run of casperjs test closes with the following output:
Unsafe JavaScript attempt to access frame with URL about:blank from
frame with URL file:///usr/lib/node_modules/casperjs/bin/bootstrap.js.
Domains, protocols and ports must match.
My tests pass and everything else looks okay, but is this block of text important? Does it suggest that I set something up incorrectly?
Even running a simple command like the one below shows the same message.
$ casperjs --version
1.1.0-beta3
Unsafe JavaScript attempt to access frame with URL about:blank from frame with URL file:///usr/lib/node_modules/casperjs/bin/bootstrap.js. Domains, protocols and ports must match
Richard,
Apparently, the problem is related to the security of this request when opening about:blank.
Many of the problems I have faced when opening a specific URL, along with other potential security problems, I managed to solve by adding the following flags to ALL of my script invocations:
casperjs --web-security=no --ssl-protocol=any --ignore-ssl-errors=yes test C:\path_test\file_test.js
Why on every execution? Most of the tests I run are against https URLs, and even for the ones over plain http the flags above caused no problems; they met my goal for the project and resolved these security issues.
Give it a try and see if it resolves your problem. If it still does not work, maybe we can debug whether something is missing in your CasperJS or PhantomJS setup, or whether it is something related to machine configuration, permissions, the user, etc.
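As a side note, if you would rather not pass --web-security=no on every invocation, CasperJS can apply the equivalent page setting from inside the script. A minimal sketch, assuming CasperJS 1.1.x, where pageSettings is forwarded to PhantomJS's page.settings (the SSL flags, as far as I know, remain command-line only):

var casper = require('casper').create({
    pageSettings: {
        // equivalent of the --web-security=no command-line flag;
        // forwarded to PhantomJS's page.settings.webSecurityEnabled
        webSecurityEnabled: false
    }
});

casper.start('https://example.com/', function () {
    this.echo(this.getTitle());
});

casper.run();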
From the comments to the original question, I downgraded phantomjs
sudo npm -g install phantomjs@1.9.7-15
This remedied that output. However, I still don't have enough knowledge to be able to explain why. If anyone does, I'm happy to mark that as the correct answer.
Related
I'm trying to scrape my own banking information by automating the process using Selenium in Ruby.
I'm running into a bizarre situation where performing the exact same sequence in the browser (whether just the normal browser or private/incognito) works fine, but when I try to log in under a Selenium-controlled browser I get back a strange 500 error from the server.
I've noticed the browser console logs also look different in terms of certain logging messages related to cookies, JS errors, libraries being loaded, etc.
I have found an answer on SO mentioning one possible difference in Chrome being a specific "cdc" string that might be detectable, but is there some kind of corresponding difference in Firefox/Geckodriver that could be used to detect the fact that I'm trying to automate the browser?
I'm not really sure where to look, because my understanding was that running via Selenium should behave essentially identically to running the browser itself.
Would love some guidance on what mechanisms may be in play to explain the difference in behaviour!
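For what it's worth, one documented mechanism, as an illustration of what such detection can look like (not necessarily what this particular site uses): the WebDriver specification requires automated sessions to expose navigator.webdriver as true, and Firefox under geckodriver complies. A page could check for it like this:

// In a geckodriver-controlled session navigator.webdriver is true;
// in a normal browsing session it is false or undefined.
if (navigator.webdriver) {
    console.log('automated browser detected');
}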
I wrote a small Bash script which runs a few PhantomJS tasks.
For example:
./node_modules/.bin/phantomjs "phantomjs/snapshot.js" "$url" >file.html
As you can see, I take what PhantomJS logs and write it to a file.
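For reference, here is a minimal sketch of what a snapshot.js like this could look like (hypothetical, since the actual script isn't shown): it opens the URL passed as the first argument and prints the rendered HTML to stdout, which the shell redirect captures.

// Hypothetical snapshot.js: open the URL given on the command line
// and print the rendered page source to stdout.
var system = require('system');
var page = require('webpage').create();

page.open(system.args[1], function (status) {
    if (status === 'success') {
        console.log(page.content); // captured by the >file.html redirect
    }
    phantom.exit(status === 'success' ? 0 : 1);
});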
I ran the command on a URL with JavaScript errors. For some unknown reason, every time I now run that command with the same URL I still get the errors... although everything is fine if I do the same as a different Linux user. Also, if I use a different URL everything is fine (but I need to use the one which throws errors).
Could PhantomJS be using a cached response? I tried calling clearCookies() and clearMemoryCache() before opening the page, and also tried adding a random parameter on each call, but nothing helped...
PhantomJS is the latest version.
The problem was that the broken call wrote an entry to localStorage, which then caused all the other errors. Apparently PhantomJS persists localStorage to files on disk, so removing the appropriate entry from /home/USER/.local/share/Ofi Labs/PhantomJS cleared the localStorage and the problem is gone now.
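If deleting the files by hand is awkward, clearing localStorage from inside the script is another option. A minimal sketch, assuming the bad entries belong to the page's own origin (localStorage is per-origin, so the page has to be loaded before it can be cleared):

var system = require('system');
var page = require('webpage').create();
var url = system.args[1];

page.open(url, function () {
    // wipe the persisted localStorage entries for this origin...
    page.evaluate(function () {
        localStorage.clear();
    });
    // ...then reload so the page's scripts start from a clean slate
    page.open(url, function () {
        console.log(page.content);
        phantom.exit();
    });
});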
I have found a strange issue which I do not completely understand. When I run LoadRunner with just a single protocol selected, the browser (when recording starts) launches but says "page not found" (as if the proxy were not set).
How come? The protocols specify what traffic will be captured, and I assumed LoadRunner simply does not record the traffic not specified. So why can the browser not find the page in single-protocol mode when it can in multiple-protocol mode?
I've found that the single protocol mode (I assume web here) is somewhat erratic and does not work all the time. The workaround is to use the multiple protocol mode, but select only Web (HTTP/HTML). This works much better.
The actual reasons why this is the case are unknown to me, but at least give it a try!
As for other issues:
Check that your PROXY settings are correct when you invoke IE for recording. Your issue sounds a little like a proxy issue, but please post more details if none of the above works.
Over 90% of recording issues can be traced to environment items. Specifically: do you have the right match-up between your version of LoadRunner and the version/manufacturer of your browser? Are you signed in with the proper credentials? Do you have any conflicting software packages loaded, such as antivirus, which could be impacting the recording mechanism?
Where to start?
Make sure you are signed in with Administrative credentials
Disable any antivirus running locally
Validate your browser manufacturer and version with the requirements for your version of LoadRunner
I am using this webkitdotnet in my C# project. It all went well until I had to access a site over https.
I've searched their forum and found a few posts about this, but none of them solves my problem, so please shed some light on this one. Thx!
edit: Also, as mentioned in their threads (also without an answer), I get a "Peer certificate cannot be authenticated with known CA certificates" error when trying to access my server, but https://www.google.com works fine.
They also mention the "apple" build which worked fine with ssl (at least so they say), but I can't find it anywhere...
This is a bit of a hack, but you can make WebKitDotNet ignore peer SSL errors. WebKitDotNet uses WebKit, which in turn uses curl, which is responsible for your wonderful SSL error there. curl exposes an option to ignore SSL errors, but neither WebKit nor WebKitDotNet seems to expose this functionality in its API. However, if you check out the WebKit source code, WebKit sets the curl option (CURLOPT_SSL_VERIFYPEER) to false if the environment variable WEBKIT_IGNORE_SSL_ERRORS is set.
What this all boils down to is that if you set the environment variable in code before initializing either webkit or webkitdotnet components, webkit will ignore the bad certificate and allow you to navigate to the site (sort of like clicking Proceed Anyway on IE9's Bad Certificate Warning page).
C++:
setvar("WEBKIT_IGNORE_SSL_ERRORS", "1");
C#:
Environment.SetEnvironmentVariable("WEBKIT_IGNORE_SSL_ERRORS", "1");
If anyone is interested, the webkit source code referenced is in file webkit\Source\WebCore\platform\network\curl\ResourceHandleManager.cpp at lines 65 and 681, currently.
After long googling I finally ended up purchasing an SSL certificate for my domain, and now all is fine. Also good to note: WebKit is the easiest to work with and allows DOM access and manipulation.
I tried the code below and it works for me.
webkitBrowser.Preferences.IgnoreSSLErrors = true;
The question is really simple: I am unable to find out what is wrong with this page:
http://www.ezotour.sk/page/poznavacie-zajazdy-taliansko-sicilia.html
I have tried Firebug with no errors, I have tried Chrome with no errors, and I have tried IE9 with no errors. Using IETester it returns errors in the code, but I am not able to find out what they are.
I don't want you to debug my page for me; I need some advice on what you use for IE6/7 debugging.
IETester is buggy, don't trust it.
The best way to test across multiple browsers is to create different virtual machine configurations using VMWare.
See: http://civicactions.com/blog/2009/may/24/building_ultimate_cross_browser_testing_system
IE6 is buggy, but it won't be around for much longer!
http://ie6countdown.com/