I wrote a small Bash script which runs a few PhantomJS tasks.
For example:
./node_modules/.bin/phantomjs "phantomjs/snapshot.js" "$url" >file.html
As you can see, I redirect the output that PhantomJS logs into a file.
I ran the command on a URL with JavaScript errors. For some unknown reason, each time I now run that command with the same URL I still get the errors, although everything is fine if I do the same as a different Linux user. If I use a different URL, everything is also fine (but I need to use the one that throws errors).
Could PhantomJS be using a cached response somehow? I tried adding the clearCookies() and clearMemoryCache() calls before opening the page, and also tried adding a random parameter on each call, but nothing helped...
PhantomJS is the latest version.
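For reference, this is roughly how I invoke them (a simplified sketch, not the full snapshot.js):

var system = require('system');
var page = require('webpage').create();
var url = system.args[1]; // the URL passed in from the Bash script

phantom.clearCookies();  // drop any persisted cookies
page.clearMemoryCache(); // drop the in-memory HTTP cache (PhantomJS 2.x)

page.open(url, function (status) {
    console.log(page.content); // this is what gets redirected to file.html
    phantom.exit(status === 'success' ? 0 : 1);
});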
The problem was that the broken call wrote an entry to localStorage, which then caused all the other errors. Apparently PhantomJS persists localStorage to a file, so removing the appropriate entry from /home/USER/.local/share/Ofi Labs/PhantomJS cleared the localStorage, and the problem is gone now.
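If you would rather not delete files on disk, here is a minimal sketch (my own, not from the answer above) of clearing the persisted localStorage from inside the script; the URL is a placeholder. The first load runs in the page's origin so it can wipe the storage, and the second load starts clean:

var page = require('webpage').create();
var url = 'http://example.com/broken-page';

page.open(url, function () {
    // Runs in the page's origin, so this wipes its persisted entries.
    page.evaluate(function () {
        localStorage.clear();
    });
    // Re-open the page now that localStorage is empty.
    page.open(url, function (status) {
        console.log(page.content);
        phantom.exit(status === 'success' ? 0 : 1);
    });
});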
I'm having trouble developing with Nuxt 3. The problem is that sometimes edits to the code are not visible; it seems that Nitro caches something and I need to restart the server in order to see the updates. The clearest example is when I console.log something and then remove the line, but the log still gets printed after I reload the page. Any tips?
I am testing a scenario in which I have to run Chrome with a cache. How can I do that, given that by default it launches the browser without any cache?
This is not a normal requirement, but there is an option to set userDataDir, which the documentation describes as the path to a user data directory.
So if you know the path of a directory that exists, you can pass that. That said, perhaps the correct testing strategy is to make the same call twice and verify that the second response is served from the cache, though some investigation may be required.
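A minimal sketch, assuming Puppeteer (the profile path and URL are placeholders):

const puppeteer = require('puppeteer');

(async () => {
    // Point Chrome at a persistent profile directory; the HTTP cache lives
    // inside it, so it survives across launches.
    const browser = await puppeteer.launch({
        userDataDir: '/tmp/chrome-profile'
    });
    const page = await browser.newPage();
    await page.goto('https://example.com/');
    await browser.close();
})();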
I am running an Apache server, and I have placed a script that generates a report in the CGI-BIN, which I can then run from a link in a simple webpage I put together.
The script works, no problem. What I have noticed, though, is that if I attempt to run two or more instances of the same script through the browser, they are queued, i.e. the second will not run until the first is done. I was wondering why this happens and whether it is possible to turn it off if necessary.
Thanks
It turns out that it was actually Chrome queueing the requests rather than Apache.
I tried it on different browsers and it worked fine.
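If anyone needs the same CGI script to run in parallel from one browser, one possible workaround (an assumption on my part, not something from the original answer) is to make each request URL unique so Chrome does not queue it behind an identical in-flight request:

// Append a throwaway query parameter so each click is a distinct URL.
// '/cgi-bin/report.cgi' is a placeholder for the real script path.
function runReport() {
    window.open('/cgi-bin/report.cgi?nocache=' + Date.now());
}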
On each test run of casperjs test, the output closes with the following:
Unsafe JavaScript attempt to access frame with URL about:blank from
frame with URL file:///usr/lib/node_modules/casperjs/bin/bootstrap.js.
Domains, protocols and ports must match.
My tests pass and everything else looks okay, but is this block of text important? Does it suggest that I set something up incorrectly?
Even running a command as simple as the one below shows the same message.
$ casperjs --version
1.1.0-beta3
Unsafe JavaScript attempt to access frame with URL about:blank from frame with URL file:///usr/lib/node_modules/casperjs/bin/bootstrap.js. Domains, protocols and ports must match
Richard,
Apparently, the problem is related to the security of the request when opening about:blank.
Many of the problems I faced when opening a specific URL, including potential security problems, I managed to solve by adding the following flags to ALL of my script invocations:
casperjs --web-security=no --ssl-protocol=any --ignore-ssl-errors=yes test C:\path_test\file_test.js
Why on every execution? Most of my tests run against https URLs, and even for the ones over plain http, the flags above caused no problems; they solved these security issues and I had no further trouble reaching my goal in the project.
Give it a try and see if it resolves your problem. If it still does not work, maybe we can "debug" whether something is missing related to CasperJS or PhantomJS, or something related to machine configuration, permissions, the user, etc.
From the comments to the original question, I downgraded phantomjs
sudo npm -g install phantomjs@1.9.7-15
This remedied that output. However, I still don't have enough knowledge to explain why. If anyone does, I'm happy to mark that as the correct answer.
Is there a reason server-side or client-side why uploading even a tiny file via the iframe method can take such a long time?
I'm just trying to upload a file via an iframe so the user doesn't have to leave the page. It all works but it's incredibly slow. Oddly enough, one time in umpteen, it will actually go through quickly. I'm not sure what's going on.
Browser is Firefox 3.6. Server is CentOS 4 with HTTPd 2.0.
My bad. It turns out the code I got from elsewhere was submitting using the same form, which in this case was very large. It made the browser serialize everything for submission just for the upload, and the server-side process had to parse all of that on its end, too. Combined, I guess that's why it was slow.
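For anyone hitting the same thing, a minimal sketch of the fix (my own illustration; the iframe name and CGI path are placeholders): give the upload its own tiny form targeting the hidden iframe, so only the file field gets serialized:

// Move the file input into a dedicated form so the browser only has to
// submit that one field, not the page's large form.
function uploadViaIframe(fileInput) {
    var form = document.createElement('form');
    form.action = '/cgi-bin/upload.cgi';
    form.method = 'post';
    form.enctype = 'multipart/form-data';
    form.target = 'upload_frame'; // name of the hidden iframe on the page
    form.appendChild(fileInput);  // moves the input out of the big form
    document.body.appendChild(form);
    form.submit();
}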