I've got a problem on a mobile site that I'm running directly in my PC's Firefox browser. Every time a button is clicked, the page reloads, resetting my variables. I've got this script:
// ==UserScript==
// @name          trada.net autoclick 55_1min_mobile
// @namespace     airtimeauction auto click
// @include       http://www.trada.net/Mobile/
// @version       0.1
// @description   Automatically click
// ==/UserScript==
var interval = 57000;
var bidClickTimer = setInterval(function () { BidClick(); }, interval);
var numBidClicks = 0;

function BidClick() {
    var bidBtn1 = document.getElementById("ctl00_mainContentPlaceholder_AirtimeAuctionItem7_btn_bidNow");
    numBidClicks++;
    if (numBidClicks > 500) {
        clearInterval(bidClickTimer);
        bidClickTimer = "";
    } else {
        bidBtn1.click();
    }
}
BidClick();
It should click the button every 57 seconds, but the moment it clicks the button, the page reloads and the variables are reset. How can I get Greasemonkey to "remember" the variables, carrying them over to the next page load when the script runs again? Does this involve GM_setValue? It will only be these few variables. The second question: the page takes a few seconds to reload; will that be subtracted from the 57 seconds, and if not, how do I compensate for it?
In addition to GM_setValue...
you can also use the newer JavaScript localStorage object, or a SQL JavaScript API.
The advantage of the SQL approach is that it is very meager in its resource consumption in a script: rather than concatenating a humongous string of results, you can tuck away each result and recall it when needed with a precise query. The downside is that you have to set up a SQL server, but with something like SQLite that's not a big deal these days; even Postgres or MySQL can be spun up quickly on a laptop.
Yes, I think you have to use GM_setValue/GM_getValue.
If you have to do something exactly every 57 seconds, then calculate the time at which the next action should take place after the reload, and store it using GM_setValue.
When your script starts, first check whether a next-action time is stored; if it is, use it to schedule the next action, then calculate and store the time for the action after that, and so on.
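A minimal sketch of that idea, assuming the classic GM_getValue/GM_setValue API (state is stored as strings, since older Greasemonkey versions only persist strings, 32-bit integers and booleans):

var interval = 57000;
var maxClicks = 500;

// Restore the state saved by the previous page load; the defaults cover the first run.
var numBidClicks = parseInt(GM_getValue("numBidClicks", "0"), 10);
var nextClickAt = parseInt(GM_getValue("nextClickAt", "0"), 10);

var now = Date.now();
if (!nextClickAt || nextClickAt <= now) {
    nextClickAt = now + interval;
}

// Waiting only until the stored target time compensates for the reload:
// however long the page took to come back, only the remaining delay is waited.
setTimeout(function () {
    numBidClicks++;
    GM_setValue("numBidClicks", String(numBidClicks));
    GM_setValue("nextClickAt", String(nextClickAt + interval));
    if (numBidClicks <= maxClicks) {
        var bidBtn = document.getElementById(
            "ctl00_mainContentPlaceholder_AirtimeAuctionItem7_btn_bidNow");
        if (bidBtn) {
            bidBtn.click(); // triggers the reload; the script re-runs afterwards
        }
    }
}, nextClickAt - now);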
GM.setValue will set a value indefinitely and is scoped to the script, but will work if your script runs across multiple domains.
window.localStorage will set a value indefinitely and is scoped to the domain of the page, so will not work across domains, but will work if you need several GreaseMonkey scripts to access the same value.
window.sessionStorage will set a value only while the window or tab is open and is scoped to only that window or tab for that domain.
document.cookie can set a value indefinitely or only while the browser is open, and can be scoped across subdomains, or a single domain, or a path, or a single page.
Those are the main client-side mechanisms for storing values across page loads. However, there is another method which is sometimes possible (if the page itself is not already using it) and can also be quite useful: window.name.
window.name is scoped to the window or tab, but will work across domains too. If you need to store several values, then they can be put into an object and you can store the object's JSON string. E.g. window.name = JSON.stringify(obj)
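For instance, a small sketch (the reloads counter is just an illustration):

var state = {};
try {
    // window.name survives reloads and navigation within the same tab
    state = JSON.parse(window.name || "{}");
} catch (e) {
    state = {}; // the page had put something non-JSON into window.name
}
state.reloads = (state.reloads || 0) + 1;
window.name = JSON.stringify(state);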
I am working on a website and trying to test it with Selenium and JUnit. I'm getting race conditions between the test and the site, despite my best efforts.
The front end of the site is HTML and jQuery. The back end (via AJAX) is PHP.
The site
I have two required text input fields (year and age), plus some others that I'm not changing in the tests that show the problem. As soon as both text inputs are non-empty, an AJAX call is made to the back end, which returns zero or more results. If 0 results are returned, a results div on the screen gets some text saying that there were no results. If more than 0 results are returned, a table showing the results is written to the results div.
I don't want the site to wait until e.g. 4 digits' worth of year is entered before doing the AJAX call as it could be looking at ancient history (yes, really). So, as soon as both are non-empty the call should be made. If you type slowly, this means that entering e.g. 2015 will trigger calls for year=2, year=20, year=201 and year=2015. (This is OK.)
The test
I'm using page objects - one for the inputs and one for the output. At the start of the test, I wait for a prompt to be present on the screen (please enter some data) as that is generated by JavaScript that checks the state of the input fields - so I know that the page has loaded and JavaScript has run.
The wait for a prompt is made immediately after the page object is created for the output. This is the relevant method in the page object:
// Wait until the prompt / help text is displayed.
// Assumes that the prompt text always contains the word "Please".
public void waitForText() {
    wait.until(ExpectedConditions.textToBePresentInElementLocated(
            By.id("resultContainer"), "Please"));
}
The method for setting the year is
public void setYear(String year) {
    WebElement yearField = driver.findElement(By.id(yearInputId));
    if (yearField == null) {
        // This should never happen
        Assert.fail("Can't find year input field using id " + yearInputId);
    } else {
        yearField.sendKeys(year);
        driver.findElement(By.id(ageInputId)).click(); // click somewhere else
    }
}
and there's a corresponding one for age.
I have a series of methods that wait for things to happen, which don't seem to have prevented the problem (below). These do things like wait for the current result values to be different from a previous snapshot of them, wait for a certain number of results to be returned etc.
I create a driver for Chrome as follows:
import org.openqa.selenium.chrome.ChromeDriver;
// ...
case CHROME: {
    System.setProperty("webdriver.chrome.driver", "C:\\path\\chromedriver.exe");
    result = new ChromeDriver();
    break;
}
The problem
Some of the time, things work OK. Some of the time, both inputs are filled in with sensible values by the test, but the "there are 0 results" message is displayed. Some of the time, the test hangs part-way through filling in the inputs. It seems to be fine when I'm testing with Firefox, but Chrome often fails.
The fact that there is unpredictable behaviour suggests that I'm not controlling all the things I need to (and / or my attempts to control things are wrong). I can't see that I'm doing anything particularly weird, so someone must have hit these kinds of issue before.
Is there a browser issue I'm not addressing?
Is there something I'm doing wrong in setting the values?
Is there something I'm doing wrong in my test choreography?
It could be that the script is still loading when you start typing, or that there's a pending AJAX call when you start handling the next field or validation.
You could try to synchronize the calls with a low-level script:
final String JS_WAIT_NO_AJAX =
    "var callback = arguments[0]; (function fn(){ " +
    "  if (window.$ && window.$.active == 0) " +
    "    return callback(); " +
    "  setTimeout(fn, 60); " +
    "})();";

JavascriptExecutor js = (JavascriptExecutor) driver;
driver.manage().timeouts().setScriptTimeout(20, TimeUnit.SECONDS);

// Wait for jQuery to report no active AJAX requests before each interaction.
js.executeAsyncScript(JS_WAIT_NO_AJAX);
driver.findElement(By.id("...")).sendKeys("...");

js.executeAsyncScript(JS_WAIT_NO_AJAX);
driver.findElement(By.id("...")).click();
I'm using Aurelia with http-server, as described in the Aurelia getting-started docs. I'm unable to see changes, because every browser seems to cache the entire page on first load. When I use F5, Ctrl+F5 or Ctrl+R, the page refreshes but nothing changes; none of my modifications are visible. I can then switch to another browser and the changes are visible on the first visit, but every subsequent visit shows the first version again. This happens in every browser I use (Chrome and Firefox, even in private mode). I'm certain it is not a bug in Aurelia itself.
I tried changing the port and using http-server with the -c parameter. Nothing changed. Any ideas?
If you want to force your Aurelia app to be constantly refreshed after you modify it, you can have a look at the following thread:
https://github.com/aurelia/framework/issues/94
Aaike commented on 8 May 2015:
change your index.html to add the snippet below right before you import the aurelia-bootstrapper:
<script>
    var systemLocate = System.locate;
    System.locate = function(load) {
        var System = this;
        return Promise.resolve(systemLocate.call(this, load)).then(function(address) {
            if (address.lastIndexOf("html.js") > -1) return address;
            if (address.lastIndexOf("css.js") > -1) return address;
            return address + System.cacheBust;
        });
    };
    System.cacheBust = '?bust=' + Date.now();
    System.import('aurelia-bootstrapper');
</script>
This is a caching issue that pops up sometimes with http-server. I don't know exactly what causes it, but the -c flag sets the cache-control lifetime in seconds, so I would set it to 0 or 1 (or use -c-1, which disables caching entirely) rather than passing a port number to it.
I have a Brother multifunction networked printer/scanner/fax (model MFC-9140CDN). I am trying to use the following code with WIA to retrieve items scanned with the document feeder:
const int FEEDER = 1;

var manager = new DeviceManager();
var deviceInfo = manager.DeviceInfos.Cast<DeviceInfo>().First();
var device = deviceInfo.Connect();
device.Properties["Pages"].set_Value(1);
device.Properties["Document Handling Select"].set_Value(1);

var morePages = true;
var counter = 0;
while (morePages) {
    counter++;
    var item = device.Items[1];
    item.Properties["Bits Per Pixel"].set_Value(1);
    item.Properties["Horizontal Resolution"].set_Value(300);
    item.Properties["Vertical Resolution"].set_Value(300);
    var img = (WIA.ImageFile)item.Transfer();
    var path = String.Format(@"C:\Users\user1\Documents\test_{0}.tiff", counter);
    img.SaveFile(path);
    var status = (int)device.Properties["Document Handling Status"].get_Value();
    morePages = (status & FEEDER) > 0; // FEEDER bit set => more pages waiting
}
When the Transfer method is reached for the first time, all the pages go through the document feeder. The first page gets saved with img.SaveFile to the passed-in path, but all the subsequent pages are not available - device.Items.Count is 1, and trying device.Items[2] raises an exception.
In the next iteration, calling Transfer raises an exception -- understandably, because there are now no pages in the feeder.
How can I get the subsequent images that have been scanned into the feeder?
(N.B. Iterating through all the device properties, there is an additional unnamed property with the id of 38922. I haven't been able to find any reference to this property.)
Update
I couldn't find a property on the device corresponding to WIA_IPS_SCAN_AHEAD or WIA_DPS_SCAN_AHEAD_PAGES, but that makes sense because this property is optional according to the documentation.
I tried using TWAIN (via the NTwain library, which I highly recommend) with the same problem.
I recently experienced a similar error with an HP MFC.
It seems that a property was being changed by the driver. The previous developer of the software I'm working on simply kept reinitialising the driver on each iteration of the loop.
In my case the property was 'Media Type' being set to FLATBED (0x02), even though I was doing a multi-page scan and needed it to be NEXT_PAGE (0x80).
The way I found this was by storing every property (both device and item properties) before I scanned, and again after scanning the first page. I then had my application print out any properties that had changed, and was able to identify my problem.
This is a networked scanner, and I was using the WSD driver.
Once I installed the manufacturer's driver, the behavior is as expected -- one page goes through the ADF, after which control is returned to the program.
(Even now, when I use WIA's CommonDialog.ShowSelectDevice method, the scanner is available twice, once using the Windows driver and once using the Brother driver; when I choose the WSD driver, I still see the issue.)
This bug cost me hours...
So thanks a lot, Zev.
I also had two scanners shown in the dialog for what is physically one machine. One driver scans only the first page and then empties the feeder without any chance to intercept; the other works as expected.
BTW: it is not necessary to initialise the scanner for each page. I call my initialisation routines once, before the Transfer() loop, and it works just fine.
Another hiccup I ran into was initialising the page sizes first and then the feeder. So if you cannot get it to work, try switching the order in which you change the properties of your WIA driver. As mentioned in the MSDN, some properties also influence others, potentially resetting your changes.
So praise to ZEV SPITZ for the answer on Aug. 09, 2015.
You should instantiate and set up the device inside the while loop. See:
const int FEEDER = 1;

var morePages = true;
var counter = 0;
while (morePages) {
    counter++;
    // Reconnect to the device for every page
    var manager = new DeviceManager();
    var deviceInfo = manager.DeviceInfos.Cast<DeviceInfo>().First();
    var device = deviceInfo.Connect();
    //device.Properties["Pages"].set_Value(1);
    device.Properties["Document Handling Select"].set_Value(1);
    var item = device.Items[1];
    item.Properties["Bits Per Pixel"].set_Value(1);
    item.Properties["Horizontal Resolution"].set_Value(300);
    item.Properties["Vertical Resolution"].set_Value(300);
    var img = (WIA.ImageFile)item.Transfer();
    var path = String.Format(@"C:\Users\user1\Documents\test_{0}.tiff", counter);
    img.SaveFile(path);
    var status = (int)device.Properties["Document Handling Status"].get_Value();
    morePages = (status & FEEDER) > 0;
}
I found this by looking into this free project, which I believe can help you too: adfwia.codeplex.com
My application works like this:
1. Upload an Excel file and convert it to a DataTable
2. Start a new thread
3. Loop through the DataTable, updating the UI (a Label) on each iteration to show "Processing row [i] of [n]"
4. End the loop
The UI update in step 3 is what I'm not able to do. I've looked around online for updating UI elements from worker threads, but all the results I can find are for Windows Forms rather than a web project. Is this possible?
Yes, you can do it, and it is actually not difficult: the AJAX Control Toolkit lets you do it easily. Simply use an UpdatePanel together with an UpdateProgress control.
Check http://ajaxcontroltoolkit.codeplex.com/
An example: http://www.asp.net/ajax/documentation/live/overview/updateprogressoverview.aspx
I found a workaround using jQuery AJAX, an ASP.NET WebMethod and a session variable.
I used a method from one of my previous questions: a WebMethod checks a Session variable that is updated by the worker thread.
Worker thread:
Session["progress"] = "{\"current\":" + (i + 1) + ", \"total\":" + dt.Rows.Count + "}"
WebMethod:
[WebMethod]
public static string GetProgress()
{
    if (HttpContext.Current.Session["progress"] == null) {
        return "{\"current\":1,\"total\":1}";
    } else {
        return (string)HttpContext.Current.Session["progress"];
    }
}
My jQuery basically loops, calling that AJAX WebMethod every second. It starts on page load; if current equals total, it displays "Completed" and stops polling, otherwise it shows "Processing row [current] of [total]". I even added a jQuery UI Progressbar.
This is a somewhat manual solution, but it solves my problem with little overhead. An unexpected but nice aspect: because it uses a Session variable, and the WebMethod checks it on page load, the progress bar shows even if you navigate away and come back to the page while the worker thread is active.
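For reference, a rough sketch of that polling loop (GetProgress is the WebMethod above; the page name, element id and interval are assumptions):

function pollProgress() {
    $.ajax({
        type: "POST",
        url: "Default.aspx/GetProgress", // the page name is an assumption
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function (response) {
            // ASP.NET WebMethods wrap the return value in a "d" property
            var progress = JSON.parse(response.d);
            if (progress.current >= progress.total) {
                $("#status").text("Completed");
            } else {
                $("#status").text("Processing row " + progress.current +
                    " of " + progress.total);
                setTimeout(pollProgress, 1000); // poll again in a second
            }
        }
    });
}
$(document).ready(pollProgress);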
Our app is set up in the standard way: a left frame with the tree, and a right frame with the main content (loaded by clicking the tree).
Our web app inconsistently displays a blank page in the main frame in Firefox. By inconsistent I mean every day for a few users, rarely for others, never for most. Once this happens, going to any other page through our tree results in a blank page. We found that deleting the "aTreeSaveStateCookie" cookie restores normal operation. "aTree" is the name of our div; I found "SaveStateCookie" strings in dijit/Tree.js.
This also happens in IE, except there I get a browser error page which I can't recall right now. I then delete the only cookie I can find for our app (I'm not sure how to do the Firefox steps in IE).
Any ideas on why this would happen?
Thanks
Dojo 1.3 through http://ajax.googleapis.com/ajax/libs/dojo/1.3/dojo/dojo.xd.js
Firefox 3.1x
IE 8
Windows XP
In my case, I don't recall ever changing browser settings around Private Data.
Please check whether the response code is 413 (Request Entity Too Large). This usually happens when the cookie(s) used to store the tree's expansion state (aTreeSaveStateCookie) exceed the maximum request size for your server.
You could try increasing the maximum request size (follow the instructions for your specific web/app server), or at least display a meaningful error message like "please clear your browser cookies" when the 413 error code is encountered.
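For the error-message suggestion, a hedged sketch of what the client side could look like with the Dojo 1.x XHR API the app already loads (the URL and the message are illustrative, not from the original app):

dojo.xhrGet({
    url: "/tree/content", // illustrative endpoint
    load: function (data) {
        // render the main-frame content as usual
    },
    error: function (err, ioArgs) {
        if (ioArgs.xhr.status == 413) {
            // the saved tree-state cookie has grown too large for the server
            alert("Please clear your cookies for this site and reload.");
        }
    }
});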
If the persist property is set to a truthy value, dijit.Tree persists its state to remember which nodes were expanded, and re-expands them after a page reload. If you need to persist the tree state for a very large data structure, I recommend overriding Tree to use localStorage instead of dojo.cookie.
This is for Dojo 1.9, but similar changes can be made to the non-AMD version 1.3:
_saveExpandedNodes: function(){
    if(this.persist && this.cookieName){
        var ary = [];
        for(var id in this._openedNodes){
            ary.push(id);
        }
        // Was:
        // cookie(this.cookieName, ary.join(","), {expires: 365});
        localStorage.setItem(this.cookieName, ary.join(","));
    }
},
And:
_initState: function(){
    // summary:
    //      Load in which nodes should be opened automatically
    this._openedNodes = {};
    if(this.persist && this.cookieName){
        // Was:
        // var oreo = cookie(this.cookieName);
        var oreo = localStorage.getItem(this.cookieName);
        if(oreo){
            array.forEach(oreo.split(','), function(item){
                this._openedNodes[item] = true;
            }, this);
        }
    }
},
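One way to wire those two overrides in without patching dijit itself, sketched under the assumption of a Dojo 1.9 AMD setup (the module and variable names are illustrative):

require(["dojo/_base/declare", "dojo/_base/array", "dijit/Tree"],
function(declare, array, Tree){
    // Subclass dijit.Tree so the stock widget stays untouched; the two
    // method bodies are the localStorage versions shown above.
    var LocalStorageTree = declare([Tree], {
        _saveExpandedNodes: function(){
            if(this.persist && this.cookieName){
                var ary = [];
                for(var id in this._openedNodes){
                    ary.push(id);
                }
                localStorage.setItem(this.cookieName, ary.join(","));
            }
        },
        _initState: function(){
            this._openedNodes = {};
            if(this.persist && this.cookieName){
                var oreo = localStorage.getItem(this.cookieName);
                if(oreo){
                    array.forEach(oreo.split(','), function(item){
                        this._openedNodes[item] = true;
                    }, this);
                }
            }
        }
    });
    // e.g. new LocalStorageTree({model: myModel, persist: true,
    //     cookieName: "aTreeSaveStateCookie"}, "aTree").startup();
});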