Using Cypress 6.4: wait for interception but skip first canceled XHR call

I am using cy.intercept() to wait for the XHR call and make sure the app grid is loaded before starting my tests:
cy.intercept('POST', '**/contacts/datatable').as('getContacts')
cy.visit('/contacts')
cy.wait('@getContacts')
cy.log('now I can start...')
The problem is that, due to some issue with the API, I first get a canceled XHR and then a 200 status with the correct response, and Cypress gets stuck in the cy.wait.
I have tried to add some kind of condition, like:
cy.wait('@getContacts')
  .should('have.property', 'response.statusCode', 200)
  .then(() => cy.log('now I can start...'))
But it's still the same.
Any ideas on how to skip the XHR calls until the response is 200?

Try chaining wait with should(), i.e.:
cy.wait('@getContacts').its('response.statusCode').should('eq', 200)
According to the Cypress documentation, cy.wait() yields the matched interception once the request completes, and the chained assertion verifies that the status code is 200.
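Another way to think about "skip until a 200 arrives" is a small retry loop. This is a minimal sketch independent of Cypress: waitForOk and nextResponse are hypothetical names, and nextResponse() stands in for pulling each intercepted response in order (in a real test it would wrap a repeated cy.wait('@getContacts')).

```javascript
// Keep pulling intercepted responses until one has statusCode 200,
// skipping canceled/failed calls along the way.
async function waitForOk(nextResponse, maxTries = 5) {
  for (let i = 0; i < maxTries; i += 1) {
    const res = await nextResponse();
    if (res && res.statusCode === 200) {
      return res; // first successful response wins
    }
  }
  throw new Error('no 200 response seen');
}

// Stubbed interception stream: first call canceled, second succeeds.
const queue = [{ statusCode: 0 }, { statusCode: 200, body: 'contacts' }];
const next = () => Promise.resolve(queue.shift());

waitForOk(next).then((res) => console.log(res.statusCode)); // prints 200
```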

Related

(Karate) How to intercept the XHR request response code?

I am testing login functionality on a 3rd-party website. I have this URL, example.com/login. When I copy and paste it into the browser (Chrome), the page sometimes loads, but sometimes it does not (an empty blank white page).
The problem is that I have to run a script on this page to click one of the elements (all the elements are embedded inside #shadow-root). If the page loads, no problem, the script is evaluated successfully. But sometimes the page does not load and returns a 404 in response to an XHR request, and as a result my * eval(script("script")) step returns "js eval failed...".
So I found that the solution is to refresh the page, and to do that I am considering capturing the XHR request's response. If the status code is 404, refresh the page; if not, continue with the following steps.
Now, I think this may work, but I do not know how to implement Karate's Intercepting HTTP Requests. And first of all, is this even doable?
I have looked into the documentation here, but could not understand the examples:
https://github.com/karatelabs/karate/tree/master/karate-netty
Meanwhile, if there is another way of refreshing the page conditionally, I will be more than happy to hear about it. Thanks in advance.
First, using JavaScript you should be able to handle shadow roots: https://stackoverflow.com/a/60618233/143475
The above answer links to advanced examples of executing JS in the context of the current page. I suggest you do some research into that, and try to get help from someone who knows JS, the DOM, and HTML well. You should be able to find a way to know whether the XHR has been made successfully, e.g. based on whether some element on the page has changed.
Finally, here is how you can do interception: https://stackoverflow.com/a/61372471/143475

Wait for all the ajax requests to finish before returning html in Splash using Lua Script

I am using Splash along with Scrapy to execute a script on the page before scraping it.
Basically, a few elements are loaded via AJAX on the click of a button.
There are multiple AJAX requests happening per page. Below is the Lua script I am using.
function main(splash)
  assert(splash:go(splash.args.url))
  splash:wait(1)
  local btns = splash:select_all('.buttonShowCo')
  for _, btn in ipairs(btns) do
    btn:mouse_click()
  end
  splash:wait(12)
  return splash:html()
end
The issue is that the script misses a few dynamic elements. I am assuming the script returns before all the AJAX calls finish.
I added a wait time to let all the AJAX calls finish, but it is not working.
Is there a way to wait until all the AJAX calls finish?
You can create a variable inside each AJAX request, initially set to false, and set it to true when that request finishes. In another function, create a loop that checks whether all the variables are true before doing whatever you want to do.
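A simpler variant of the same idea is a single counter of in-flight requests rather than one boolean per request. This is a plain-JavaScript sketch; requestStarted, requestFinished, and allDone are hypothetical helpers you would call from each request's start and completion callbacks.

```javascript
// Track in-flight AJAX calls with one counter instead of
// a separate flag for each request.
let pending = 0;

function requestStarted() { pending += 1; }
function requestFinished() { pending -= 1; }
function allDone() { return pending === 0; }

// Simulate three overlapping requests finishing one by one:
requestStarted(); requestStarted(); requestStarted();
requestFinished();
console.log(allDone()); // false
requestFinished(); requestFinished();
console.log(allDone()); // true
```

In the Splash context, the page-side JavaScript would update such a counter, and the Lua script would poll it instead of using a fixed splash:wait().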

Can scrapy-splash Ignore 504 HTTP Status?

I want to scrape JavaScript-loaded web pages, so I use scrapy-splash, but some pages take a lot of loading time.
Like this:
I think the [processUser..] requests are what makes it slower.
Is there any way to ignore those 504 pages? When I set the timeout to less than 90, it causes a 504 gateway error in the scrapy shell or spiders.
And can I get the resulting HTML (only the 200 responses) when the time I set runs out?
There's a mechanism in Splash to abort a request before it starts loading the body, which you can leverage using the splash:on_response_headers hook. However, in your case this hook will only be able to catch and abort the page once the status and the headers are in, and that is after it has finished waiting for the gateway timeout (504). So instead you may want the splash:on_request hook, which aborts the request before it is even sent, like so:
function main(splash, args)
  splash:on_request(function(request)
    if request.url:find('processUser') then
      request:abort()
    end
  end)
  assert(splash:go(args.url))
  assert(splash:wait(.5))
  return {
    har = splash:har(),
  }
end
UPD: Another, and perhaps better, way to go about this is to set splash.resource_timeout before any requests take place:
function main(splash, args)
  splash.resource_timeout = 3
  ...
When you are using Splash to render a webpage, you are basically using a web browser.
When you ask Splash to render http://example.com:
1. Splash goes to http://example.com
2. Splash executes all of the JavaScript
   2.1 the JavaScript makes some requests
   2.2 some of those requests return 50x codes
3. Splash returns the page data
Unfortunately, Splash right now does not support custom rules for blocking JavaScript requests - it just takes the page and does everything your browser would do without any addons: load everything, no questions asked.
All that being said, it's highly unlikely that those 50x requests are slowing down your page load, and if they are, it shouldn't be by a significant amount.

GET request firing 2 or 3 times when trying to load 1 page

In my router
Router.get('/login', IndexController.login);
In my controller
exports.login = (req, res, next) => {
  console.log('login get');
  res.render('main/login', { pageTitle: 'Login' });
};
The console logs "login get" twice, meaning this handler is called twice. If I remove the render call, it is logged only once. I have been trying to debug this for several days now but still can't figure it out. When using a curl request from another terminal, the log is seen only once as well, but using Chrome/Firefox/IE yields the double (or sometimes triple) log. I don't see this behaviour with POST calls, however.
- EJS
- Express
- Node
UPDATE
In Chrome dev tools, after inspecting the network tab, I only see 1 GET request being made for the page. It seems that using res.send() fires the log once, however res.render(....) fires it two or three times.

Kill jquery ajax form submit

I am remotely submitting a form, potentially several times in close succession. For this particular case, debouncing is not an option. I am looking for the jQuery equivalent of .abort() for remote forms submitted using .submit(), so I can cancel all previous submissions when a new one is made.
Thanks in advance,
Garrett
What I did was attach to the beforeSend event of the form, store the xhr object as part of the form's data, and abort it if a new request is enqueued:
$('form').bind("ajax:beforeSend", function(evt, xhr) {
  console.log('Enqueued new xhr request');
  var prevXhr = $(this).data('current-xhr');
  if (prevXhr) {
    prevXhr.abort();
    console.log('Aborting previous xhr request');
  }
  $(this).data('current-xhr', xhr);
});
And you can still use the :remote => true option on the form as usual.
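The "latest wins" logic in the answer above can be sketched without jQuery or Rails UJS at all. Here the xhr objects are stubs (makeXhr is a hypothetical factory) just to show the abort-the-previous-request bookkeeping:

```javascript
// Keep a handle to the in-flight request and abort it
// whenever a new submission starts (latest request wins).
let currentXhr = null;

function submit(xhr) {
  if (currentXhr) currentXhr.abort();
  currentXhr = xhr;
}

// Stub xhr objects standing in for real XMLHttpRequests:
function makeXhr(id) {
  return { id, aborted: false, abort() { this.aborted = true; } };
}

const first = makeXhr(1);
submit(first);
const second = makeXhr(2);
submit(second);
console.log(first.aborted);  // true
console.log(second.aborted); // false
```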
Maybe you could use jQuery's .queue() to queue your submits and .clearQueue() to clear the queue when needed. Just an idea...
I don't understand your problem. If the form is submitted, a request is sent to the server... so, how can you abort that?
If by abort you mean cancelling the processing of the response, just use a control variable that you increment/decrement to keep track of whether there are pending requests.
Going by the API (http://api.jquery.com/category/ajax/), you can't do that.