I have a number of ajax calls with content being inserted by dojo.place. I have dojo widgets in the new content so I have to execute dojo.parser.parse() after the content is "placed".
The trouble is I cannot find a way of executing this. If I put the code on the next line, it gets executed too soon. I have had to put it in a setInterval call, but that is a rubbish solution.
Is there an oncomplete event on dojo.place, anyone? Help really appreciated.
dojo.place() is synchronous, so calling the parser right after the place call shouldn't be an issue. It sounds like the problem is that you are calling the parser before your XHR completes.
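For illustration, here is a minimal sketch of that idea, assuming dojo 1.x's dojo.xhrGet; the URL and node id are placeholders, not from the original question:
// Hypothetical sketch: place the returned markup, then parse it, inside the XHR callback.
dojo.xhrGet({
    url: "/some/fragment.html",            // placeholder endpoint returning widget markup
    load: function (html) {
        var target = dojo.byId("target");  // placeholder container node
        dojo.place(html, target, "last");  // dojo.place is synchronous
        dojo.parser.parse(target);         // the markup is now in the DOM, so parsing is safe
    }
});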
Unfortunately, you need that setTimeout, because browsers don't render DOM changes synchronously. As explained in the Opera blog post Timing and Synchronization in JavaScript:
Here is an example in pseudo-code:
headlineElement.innerHTML = "Please wait...";
performLongRunningCalculation();
headlineElement.innerHTML = "Finished!";
In Internet Explorer and Mozilla, the text "Please wait..." will never be shown to the user, as the changes are rendered only after the whole script has finished. In Opera, on the other hand, the "Please wait..." text is displayed while the calculation is running.
There are a few similar questions on SO:
Ways to increase performance when set big value to innerHTML
Is it normal to have a short delay after .innerHTML = xmlhttp.responseText; ?
If you create a DOM node from the HTML returned by your XHR, say
var d = document.createElement("div");
d.innerHTML = returned_html;
Then you can parse the new div, creating the widgets before using dojo.place to add the new node to the document.
dojo.parser.parse(d);
dojo.place(d, ref_node, "last");
There's a single button element on the page and the following click stream:
let submitClick$ = sources.DOM.select(buttonSel)
  .events("click")
  .mapTo(true)
  .debug(console.log)
Once I click on the button, true is logged, which is correct.
However, when I map the stream, the code inside runs twice:
let submitDeal$ = submitClick$.map(() => {
  console.log("Clicked")
  // ...
})
No other event handlers should be attached to the button, and the element itself sits inside a div:
button(".btn--add", "Submit")
The usual event.preventDefault() and event.stopPropagation() don't make a difference, and inspecting the event does show that it is fired from the same element, button.btn--add.
Not really sure what's going on, any ideas are appreciated! Thanks!
Versions:
"#cycle/dom": "^12.2.5"
"#cycle/http": "^11.0.1"
"#cycle/xstream-run": "^3.1.0"
"xstream": "^6.4.0"
Update 1: I triple-checked and no JS files are loaded twice. I'm using Webpack, which bundles a single app.js file that's loaded on the page (Elixir/Phoenix app). Also, when inspecting the button in the Event Listeners tab of Chrome's Developer Tools, it seems that only one event handler is attached.
Update 2: Gist with the code
Too little information is given to resolve this problem. However, some things come to mind:
You shouldn't use .debug(console.log) but .debug(x => console.log(x)) instead. In fact, .debug() alone is enough; it will use console.log internally.
Then, is the button inside a <form>? That may be affecting the events. In general this question needs more details.
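For what it's worth, here is a hypothetical sketch of the <form> angle using @cycle/dom's hyperscript helper (the selector comes from the snippet above; the attrs value is an assumption): an explicit type="button" stops an enclosing form from treating the click as a submit.
import { button } from "@cycle/dom";

// Hypothetical: type="button" prevents a surrounding <form> from running its
// default submit behaviour when the button is clicked.
const submitButton = button(".btn--add", { attrs: { type: "button" } }, "Submit");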
Turns out this was due to a bug in xstream, which was fixed in xstream@7.0.0.
I'm trying to get VoiceOver to work on Safari; however, it seems that when I tab through elements it doesn't read out the aria-label of the new input box in a certain scenario.
Scenario:
When tabbing onto the next element, if the onblur of the current element does something to the DOM, then it will not read out the aria-label of the next element.
Here is an example
http://plnkr.co/edit/x0c67oIl0wlQEguBIQVZ?p=preview
Notice that if you take out the onblur function below, it works fine.
<input id="test" onblur="blurer()" onfocus="focuser()"/>
In this case, the issue isn't the presence of a blurer, but rather the contents of your blurer and corresponding focuser functions. Together these two functions are toggling the hidden state of a nearby element. This is interrupting the announcement. There's a role announcement that also occurs. The full announcement (when text is populated in the edit text control) should be:
"The edited text" contents selected/unselected, "your aria label", edit text.
The quoted portions are parts you control, the other portions are parts controlled by the OS/VoiceOver's interaction with it, calculated automatically by the state of the control and other aria values.
The announcement we're getting is simply
"The edited text"
So, it's not an issue with the aria-label specifically. But rather, you are causing the entire announcement of the element to be interrupted.
When your blur and focus functions trigger, you muck with VoiceOver's response to (or the OS's communication of) these events. I'm not sure what exactly about your functions is causing this. Regardless, a trick that helps in these circumstances is to add a setTimeout to your code. By separating your function from the actual focus/blur event, you allow the accessibility APIs to do their thing before you muck with styles and such on the page. Here is an example that makes your little code snippet work. Just replace the contents of your JavaScript file with this:
function blurer() {
  window.setTimeout(function() {
    document.getElementById('myDiv').style.display = 'none';
  }, 0);
}
function focuser() {
  window.setTimeout(function() {
    document.getElementById('myDiv').style.display = 'block';
  }, 0);
}
In general I like to avoid setTimeouts because they create race conditions. However, setTimeouts of 0 are acceptable, because there is no race condition. You're just decoupling the firing event and the execution of your code by pushing your code to the end of the queue. When hacking around VoiceOver, setTimeout(someFunction, 0) works quite well for a lot of cases.
I am writing Selenium test scripts using the industry-standard WebDriver waits before interacting with elements, but I still frequently find that my tests are failing, and it seems to be due to a race condition.
Here's the example I have been running into lately:
1. Go to the product catalog page
2. Apply a filter
3. Wait for the filter to be applied
4. Click the save button on the product which loads after the filter is applied
Step 4 only works if I place a Thread.Sleep() in front of it - using WebDriverWait is not enough. I'm guessing this is because WebDriverWait only waits until the element is attached to the DOM, even though the relevant JavaScript click handler has not yet been added to the element.
How do you get around this issue? Is there an industry standard for dealing with this race condition?
EDIT: This was resolved by upgrading to the latest version of Firefox. Thanks everyone!
As we discovered in comments, updating Firefox to the latest version did the trick.
The code looks really good to me and makes total sense.
What I would try is to move to the element before making a click:
Actions builder = new Actions(WebDriver);
IWebElement saveButton = wait.Until(ExpectedConditions.ElementIsVisible(By.CssSelector(".button-wishlist")));
Actions hoverClick = builder.MoveToElement(saveButton).Click();
hoverClick.Build().Perform();
As we've discovered in comments, the issue is related to the size of the window (the test passed without a Thread.sleep() if the browser window is maximized). This makes me think that if you scroll to the element before making a click it could be enough to make it work:
IWebElement saveButton = wait.Until(ExpectedConditions.ElementIsVisible(By.CssSelector(".button-wishlist")));
((IJavaScriptExecutor)driver).ExecuteScript("arguments[0].scrollIntoView(true);", saveButton);
Actions hoverClick = builder.MoveToElement(saveButton).Click();
hoverClick.Build().Perform();
Take a look at this SO post for a custom wait method. It sounds like element presence is not enough of a check in your case, because the button may be present in the DOM at all times. What you need is something along the lines of ExpectedConditions.elementToBeClickable().
I am not familiar with the C# API, but it looks like there is no built-in method to do the same thing as in Java. So you could write a custom wait function that has checks according to your needs.
The webpage that I'm testing is using knockout. On other pages on our site that are not currently using knockout I'm not having the same problem. The scenario I have is where the page opens, I enter in various required fields and click the save button. At some point between when it enters a value in the last text field and when it clicks the save button the fields that previously had values become cleared out, and thus the script can't continue. Here is an example of the code that I'm running:
driver.findElement(By.id("sku")).clear();
driver.findElement(By.id("sku")).sendKeys(itemNo);
driver.findElement(By.id("desktopThankYouPage")).clear();
driver.findElement(By.id("desktopThankYouPage")).sendKeys(downloadUrl);
driver.findElement(By.id("mobileThankYouPage")).clear();
driver.findElement(By.id("mobileThankYouPage")).sendKeys(mobileDownloadUrl);
driver.findElement(By.id("initialPrice")).clear();
driver.findElement(By.id("initialPrice")).sendKeys(initialPrice);
driver.findElement(By.id("submitSiteChanges")).click();
Like I said, between the time it enters text in the last field and the time it clicks save the fields that previously had text in them get cleared out, and thus my test fails. The problem is it doesn't always happen. Sometimes the test runs fine, other times it doesn't.
I've tried putting Thread.sleep(x); all over the place to see if pausing at certain points would fix the problem. I also have tried using JavaScript to wait in the background for any AJAX that might be running. I also have the implicit wait of driver.manage().timeouts().implicitlyWait(10, TimeUnit.SECONDS). None of it has seemingly made any difference.
I'm running version 2.13 of selenium server and all my tests run on Firefox 8.
Any help on this would be greatly appreciated!
Firefox has a bug which prevents some events from being executed while the browser window is out of focus. This could be an issue when you're running your automation tests - which might be typing even if the window is out of focus.
The point is that knockout model updates are triggered (by default) by the change event. If it's not being executed, its underlying model won't be up to date.
To fix this issue I triggered the change event "manually", injecting JavaScript into my tests:
//suppose "element" is an input field
element.sendKeys("value");
JavascriptExecutor jsExecutor = (JavascriptExecutor) driver;
jsExecutor.executeScript("$(arguments[0]).change();", element);
As you might have noticed, I'm using jQuery to trigger the change event. If you're not using jQuery in your app, you can check here how to trigger it using vanilla JavaScript.
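In case it helps, here is a sketch (not taken from the linked article) of the vanilla JavaScript you could pass to jsExecutor.executeScript(...) instead of the jQuery version above; the element is assumed to arrive as arguments[0], and the older createEvent/initEvent form is used since this setup targets Firefox 8:
// Hypothetical sketch: dispatch a "change" event without jQuery.
var el = arguments[0];                        // the element passed in from the test code
var evt = document.createEvent("HTMLEvents"); // older-browser friendly event creation
evt.initEvent("change", true, false);         // type, bubbles, cancelable
el.dispatchEvent(evt);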
Hope that helps somebody.
I had the exact same problem. I would also guess that your code works fine in Chrome but not Firefox, and that it always works when you do it manually?
Anyway, the problem is likely to be that Selenium doesn't really behave the same way as a real user and doesn't trigger the same events. When you "submit" the form, it sometimes won't have fired the "change" event on a text area, meaning that the underlying model won't have changed.
I had the same problem testing Backbone.modelbinding, which uses the "change" event to update the model from the form. Knockout also uses the "change" event, but fortunately it can also use the "keyup" event. See valueUpdate in the docs:
<input data-bind="value: someValue, valueUpdate: 'keyup'" />
Anyway, that reproducibly solved it for me, and I didn't need any sleeps once I had that done. The downside is that you'd be running the event more than is necessary in production, in order to make the tests work. Another downside is that if you want to run some code when a value changes, you'll now get one event per keypress instead of one per field change, which sucks sometimes.
There is another solution, which is to make Selenium fire the change event yourself, for example: Selenium IE change event not fired. It's also suboptimal, but what can you do.
You could also try putting the focus on a button before you click it. Don't know if that will work, I haven't tried it.
I was facing the same issue while using JavascriptExecutor to send keys to text fields.
I was using the code below in IE (the same code works with Chrome):
((JavascriptExecutor) driver).executeScript("arguments[0].value = '" + value + "';", element);
After updating the code to the simple sendKeys() method, my issue was resolved:
element.sendKeys("some text");
The situation is that I have a page that uses some AJAX calls to retrieve content from the server, then puts those results into a chunk of HTML generated by another script. The problem is, I can't select any of the elements of this new piece of HTML with WatiN. It can be viewed in the browser, and comes up when I hit F12 and scan through the code, but WatiN still can't see it.
Is this because WatiN only scans through the HTML source of the page, and not the current version of the HTML? I think a similar situation would be:
html -
<script type="text/javascript">
$('#foo').html("gak");
</script>
...
<div id="foo">bar</div>
then when I try and assert -
Assert.IsTrue(browser.Div("foo").ContainsText("gak"));
it will return false.
Any ideas on this? or is my best option to just write a bunch of jQuery, and browser.Eval() it?
I test AJAX pages quite a bit. The key is to wait until the async postback has completed. If you have
Assert.IsFalse(browser.Div("foo").ContainsText("gak"));
browser_action_that_changes_bar_to_gak
>> Here you need to wait <<
Assert.IsTrue(browser.Div("foo").ContainsText("gak"));
In the "wait" section you can do a System.Threading.Thread.Sleep(numberOfMilliseconds) <- this is not the best way, but it is really simple. Once you determine that waiting is what you need to do, a better way to wait is to poll the status rather than way numberOfMilliseconds each time. I believe different AJAX libraries do things differently, but what works for me is really similar to this: http://pushpontech.blogspot.com/2008/04/ajax-issues-with-watin.html
I put the JavaScript into an Eval() in a helper function in my base Page class, rather than having to inject it into every page like the article did.
My base Page class contains:
public bool IsInAsyncPostBack()
{
    const string isAsyncPostBackScript = "Sys.WebForms.PageRequestManager.getInstance().get_isInAsyncPostBack()";
    return bool.Parse(this.Document.Eval(isAsyncPostBackScript));
}
And then my WaitForAsyncPostback is basically the same as in the linked post, but I added a max wait time. Before going to Page classes (awesome; do it!) I made these static functions somewhere else and it worked too.
This is almost surely a timing issue. The jQuery update has not run when you test. Rather than introducing any artificial pause or wait, it's best to wait for something that shows your AJAX has worked as expected.
In this case a WaitUntil should do the job nicely:
browser.Div("foo").WaitUntil(c => c.Text.Contains("gak"));
This works for most updates and the like. Another common waiting pattern is on data loading, say, where you'd have a spinning wheel displayed. Then you could wait until this wheel is gone with something like:
WaitUntil(c => c.Style.Display == "none");