I'm calling Ext.create('Rally.data.WsapiDataStore', params), and looking for results with the load event.
I'm requesting a number of objects across programs that the user may or may not have read permission for.
This works fine for queries where the user has permission. But when the user does not have permission and presumably gets zero results back, the load event does not seem to fire at all. I would expect it to fire with the success flag set to false, or else to return with empty results.
Since I don't know that the request has failed, my program waits and waits. How can I tell when a request fails to return because of security?
BTW, looking at the network stats, I believe all my requests get a "200 OK" status back.
Here is the method I use to create the various data stores:
_createDataStore: function(params) {
    this.openRequests++;
    var createParams = {
        model: params.type,
        autoLoad: true,
        // So I can later determine which query type it is, and which program
        requestType: params.requestType == undefined ? params.type : params.requestType,
        program: this.program,
        listeners: {
            load: this._onDataLoaded,
            scope: this
        },
        filters: params.filters,
        fetch: params.fetch,
        context: {
            project: this.project,
            projectScopeUp: false,
            projectScopeDown: true
        },
        pageSize: 1 // We only need the count
    };
    console.log('_createDataStore', this.program, createParams.requestType);
    Ext.create('Rally.data.WsapiDataStore', createParams);
},
And here is the _onDataLoaded method:
_onDataLoaded: function(store, data, successB) {
    console.log('_onDataLoaded', this.program, successB);
    ...
I only see this function called for those queries for which the account has permissions.
Are you getting any request for Defect.js or HierarchicalRequirement.js? When I simulate the issue you are seeing the request for TypeDefinition.js fails when it is building the model because the user doesn't have access to the specified project. This seems like a little bug to me. You should be able to work around it by explicitly fetching the model for a type for a specified workspace and then using that in your store.
Rally.data.ModelFactory.getModels({
    types: ['Defect', 'UserStory'], // more types, etc...
    context: Rally.environment.getContext().getDataContext(), // use workspace
    success: function(models) {
        // your code here
    }
});
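Putting the two pieces together, here is a rough sketch (not verified against your app, and the load handler is just a placeholder) of how the pre-fetched model could replace the type string in the store config:
Rally.data.ModelFactory.getModels({
    types: ['Defect'],
    context: Rally.environment.getContext().getDataContext(),
    success: function(models) {
        // Pass the already-built model so the store does not have to
        // resolve the type definition itself (which is what fails here).
        Ext.create('Rally.data.WsapiDataStore', {
            model: models.Defect,
            autoLoad: true,
            pageSize: 1,
            listeners: {
                load: function(store, data, success) {
                    console.log('loaded', success);
                }
            }
        });
    }
});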
Related
I am trying to open my URL using Nightwatch and I wasn't able to close the browser afterwards.
I tried using timeouts, as well as browser.end(), or browser.closeWindow(). None of them seem to be working for my url.
module.exports = {
    'Demo test mywrkouts': function (browser) {
        browser.url('https://www.mywrkouts.com/workouts/search');
        browser.timeouts('script', 10000, function(result) {
            browser.end();
            console.log("Test result" + result);
        });
        // browser.closeWindow();
    }
};
It opens the page, but doesn't close the browser. I am using Chrome browser with chromedriver. I am expecting to close the window, but it doesn't work.
Any advice is appreciated.
LE: As I described extensively below, you don't need to explicitly close the browser at the end of the test (via browser.end()), as the Nightwatch test-runner does that for you at the end of each feature-file.
But, if you need to do some teardown operations and then explicitly close the session, do it in an after (or afterEach) hook. Try the following snippet:
module.exports = {
    before(browser) {
        browser.maximizeWindow();
    },

    'My Wrkouts Test': (browser) => {
        browser.url('https://www.mywrkouts.com/');
        // Check if the website logo is visible:
        browser.expect.element('#barbell-homepage-top-image-desktop img.app-bar-desktop-logo').to.be.visible;
        // Check the articles heading text:
        browser.expect.element('h3.blog-carousel-title.primary-blue-text.center').text.to.contain('Foundational Education Series');
    },

    after(browser, done) {
        browser.end(() => {
            console.info('*--*--*--*--*--*--*--*--*--*--*--*--*');
            console.info('*--  Closing session... Good bye! --*');
            console.info('*--*--*--*--*--*--*--*--*--*--*--*--*');
            done();
        });
    }
};
Anyway, I think you are confusing the way NightwatchJS/WebdriverIO/Protractor (or any other WebDriver-based test solution) handles a browser session.
First off, you need not worry about closing the active session. Nightwatch does it for you at the end of each test feature-file. Thus, running a suite of, say, three feature-files (login.js, register.js, forgot_password.js) will sequentially spawn and close three different browser sessions.
Also, browser.closeWindow() is only used for closing a window instance (assuming you have multiple windows associated with the same browser session). It won't close your main window unless you have switched to another window instance (one that was previously opened during your test run).
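For illustration only, here is a rough sketch of the multi-window case where closeWindow() does make sense. The selector is made up, and windowHandles()/switchWindow() are the Nightwatch commands I would reach for here:
module.exports = {
    'Close a secondary window': (browser) => {
        browser.url('https://www.mywrkouts.com/');
        // Hypothetical element that opens a second window/tab:
        browser.click('a.opens-new-window');
        browser.windowHandles((result) => {
            var handles = result.value;
            // Switch to the newly opened window, close it, then switch back.
            browser.switchWindow(handles[1]);
            browser.closeWindow();
            browser.switchWindow(handles[0]);
        });
    }
};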
If you use browser.end() in the middle of your test, then you basically kill the active session, nullifying the following logic from your feature-file:
INFO Request: DELETE /wd/hub/session/4a4bb4cb1b38409ee466b0fc8af78101
- data:
- headers: {"Content-Length":0,"Authorization":"Basic Z29wcm86YmM3MDk2MGYtZGE0Yy00OGUyLTk5MGMtMzA5MmNmZGJhZTMz"}
INFO Response 200 DELETE /wd/hub/session/4a4bb4cb1b38409ee466b0fc8af78101 (56ms) { sessionId: '4a4bb4cb1b38409ee466b0fc8af78101',
status: 0,
value: null }
LOG → Completed command end (57 ms)
Everything after will look like this:
INFO Response 404 POST /wd/hub/session/null/elements (11ms) { sessionId: 'null',
value:
{ error: 'invalid session id',
message: 'No active session with ID null',
stacktrace: '' },
status: 6 }
!Note: There is no support for doing what you are trying to do, nor is it a common use-case, hence the lack of support for it across all of these testing solutions.
I wrote the server part of the app in Parse Server, and the job keeps executing over and over again.
Here is the code:
var cloudRequest = {
    "U": "jjj",
    "T": "ssss",
    "D": "tttt"
};

Parse.Cloud.run('joinUTT', cloudRequest, {
    success: function(result) {
        console.log("Done with joinUTT");
    },
    error: function(error) {
        console.log("Error after joinUTT");
    }
});
Any idea how to make it run just once?
Thanks!
I ran into this problem before - really hard to track down! Here's what has helped me:
In your Cloud Code make sure to explicitly call response.success() and response.error().
If you have no results to return, still define your Cloud Code function with (request, response) and call response.success(""). It is key to include the empty string "".
My guess is that in the absence of an explicit success/error, Parse continues to retry until it gets one of these results.
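As a sketch (assuming the legacy Cloud Code API that the success/error callbacks in the question imply), the server-side function would look something like this:
Parse.Cloud.define('joinUTT', function(request, response) {
    // ... do the actual work with request.params.U / T / D ...
    // Always settle the response explicitly, even with an empty result,
    // so the client-side call is not left hanging or retried.
    response.success("");
    // and on failure:
    // response.error("something went wrong");
});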
I'm trying to use Checkit with Bookshelf, and after adding the Checkit rules and intentionally violating them, my promise #catch block doesn't seem to be catching the errors properly. (I could also be totally misunderstanding the use of catch here.)
var validationRules = new Checkit({
    email: 'required',
    password: 'required'
});

var User = bookshelf.Model.extend({
    tableName: 'users',
    initialize: function() {
        this.on('saving', this.validateSave);
    },
    validateSave: function() {
        validationRules.run(this.attributes);
    }
});

User.forge({}).save().then(function(validated) {
    console.log('this shouldnt trigger');
}).catch(function(err) { // this doesnt seem to be working the way I expect
    console.log(err.message);
});
When I create the empty user object, I get the following unhandled error stacktrace, and I also see a DB query being built (which may be a separate question for the bookshelf project about what happens when you hook into the 'saving' event):
Possibly unhandled Checkit Errors - email: The email is required; password: The password is required
at checkit/checkit.js:105:23
at tryCatch1 (bluebird/js/main/util.js:45:21)
at Promise._callHandler (bluebird/js/main/promise.js:597:13)
at Promise._settlePromiseFromHandler (bluebird/js/main/promise.js:607:18)
at Promise._settlePromiseAt (checkit/node_modules/bluebird/js/main/promise.js:769:18)
at Promise._settlePromises (checkit/node_modules/bluebird/js/main/promise.js:884:14)
at Async._drainQueue (checkit/node_modules/bluebird/js/main/async.js:98:12)
at Async._drainQueues (checkit/node_modules/bluebird/js/main/async.js:103:10)
at Async.drainQueues (checkit/node_modules/bluebird/js/main/async.js:37:14)
at process._tickCallback (node.js:415:13)
{ __cid: '__cid1',
method: 'insert',
options: undefined,
bindings: [],
sql: 'insert into `users` () values ()' }
ER_NO_DEFAULT_FOR_FIELD: Field 'email' doesn't have a default value
I have 2 questions about this:
Since I have debug: true turned on in my knex config, the block between the stacktrace and the ER_NO_DEFAULT_FOR_FIELD error seems to be the prepared SQL statement. Given that I introduced Checkit to catch validation errors at the model level, why is SQL still being executed?
Am I using the #catch block in the correct manner? If so, why do I still get unhandled error stacktraces? (It looks as though the error message coming out of the #catch handler is actually from MySQL rather than from Checkit.) If not, what is the correct way to handle errors more gracefully here?
My main sources of information so far have been the bookshelf.js docs (http://bookshelfjs.org/) and the Checkit repo (https://github.com/tgriesser/checkit).
Checkit returns promises, and promises chain through return values, so by not returning the result of checkit.run you're not letting Bookshelf know when the validation is complete.
Bluebird (the underlying promise library) is letting you know you might have a rejection you're not aware of. To correct the code you need to change:
validationRules.run(this.attributes);
To:
return validationRules.run(this.attributes);
In your validateSave function so the promise can chain.
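So the model ends up looking roughly like this (the same code as in the question, with only that one-line change applied):
var User = bookshelf.Model.extend({
    tableName: 'users',
    initialize: function() {
        this.on('saving', this.validateSave);
    },
    validateSave: function() {
        // Returning the promise lets Bookshelf wait for the Checkit result,
        // so a validation failure rejects save() and reaches your #catch.
        return validationRules.run(this.attributes);
    }
});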
Is there a way to output the JSON string read by my store in Sencha Touch 2?
My store is not reading the records, so I'm trying to see where it went wrong.
My store is defined as follows:
Ext.define("NotesApp.store.Online", {
extend: "Ext.data.Store",
config: {
model: 'NotesApp.model.Note',
storeId: 'Online',
proxy: {
type: 'jsonp',
url: 'http://xxxxxx.com/qa.php',
reader: {
type: 'json',
rootProperty: 'results'
}
},
autoLoad: false,
listeners: {
load: function() {
console.log("updating");
// Clear proxy from offline store
Ext.getStore('Notes').getProxy().clear();
console.log("updating1");
// Loop through records and fill the offline store
this.each(function(record) {
console.log("updating2");
Ext.getStore('Notes').add(record.data);
});
// Sync the offline store
Ext.getStore('Notes').sync();
console.log("updating3");
// Remove data from online store
this.removeAll();
console.log("updated");
}
},
fields: [
{
name: 'id'
},
{
name: 'dateCreated'
},
{
name: 'question'
},
{
name: 'answer'
},
{
name: 'type'
},
{
name: 'author'
}
]
}
});
You may get all the data returned by the server through the proxy, like this:
store.getProxy().getReader().rawData
You can get all the data (javascript objects) returned by the server through the proxy as lasaro suggests:
store.getProxy().getReader().rawData
To get the JSON string of the raw data (the reader should be a JSON reader) you can do:
Ext.encode(store.getProxy().getReader().rawData)
//or if you don't like 'shorthands':
Ext.JSON.encode(store.getProxy().getReader().rawData)
You can also get it by handling the store load event:
// add this in the store config
listeners: {
    load: function(store, records, successful, operation, eOpts) {
        console.log(operation.getResponse().responseText);
    }
}
As far as I know, there's no way to explicitly observe your response results if you are using a configured proxy (it's obviously easier if you manually send an Ext.Ajax.request or Ext.JsonP.request).
However, you can still watch your results from your browser's developer tools.
For Google Chrome:
Start your application and wait for the request to complete, then switch to the Network tab. The highlighted link on the left-side panel is the API URL from which the data was fetched. On the right panel, choose Response; the response body will appear there. If you see nothing, it's likely that you've triggered a bad request.
Hope this helps.
For an Ajax request, your response JSON should be in the following format:
{results:[{"id":"1", "name":"note 1"},{"id":"2", "name":"note 2"},{"id":"3", "name":"note 3"}]}
id and name are properties of your model Note.
For JSONP,
on your server side, read the value of the 'callback' parameter; it contains the name of the callback method. Then wrap your result string in a call to that method and write the response.
The JSON string should then be in the following format:
callbackmethod({results:[{"id":"1", "name":"note 1"},{"id":"2", "name":"note 2"},{"id":"3", "name":"note 3"}]});
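If it helps, here is a minimal sketch of that wrapping on the server side, written as a hypothetical Node/Express handler standing in for the qa.php endpoint in the question (purely illustrative):
// Hypothetical Express route standing in for qa.php
app.get('/qa', function(req, res) {
    var callback = req.query.callback; // callback name supplied by the JSONP proxy
    var payload = JSON.stringify({
        results: [{ id: '1', name: 'note 1' }]
    });
    // Wrap the JSON in a call to the supplied callback name.
    res.set('Content-Type', 'text/javascript');
    res.send(callback + '(' + payload + ');');
});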
I am trying to use the following to do a cross-domain get:
dojo.io.script.get({
    url: myUrl,
    callbackParamName: "callback",
    preventCache: true,
    load: dojo.hitch(this, loadFunction),
    error: dojo.hitch(this, function() {
        console.log('Error!!!');
    })
});
The load function runs fine, however, when the server returns a 404, the error function does not run. Can anyone tell me why?
EDIT
After some investigation, I found that a timeout and handler could be implemented in the following way:
dojo.io.script.get({
    url: myUrl,
    callbackParamName: "callback",
    timeout: 2000
}).then(function(data) {
    console.log(data);
}, function(error) {
    alert(error);
});
This uses functionality provided by the dojo.Deferred object.
When accessing a server with script tags (which is what dojo.io.script.get does), the status code and headers are not available.
You may try some other ways to detect a problem, like using a timeout or analyzing the content of the script. The latter is problematic for JSONP calls (like in your example).
I realize this is old but I thought I'd share a solution in case others, like I had, come across this thread.
dojo.io.script essentially adds a <script/> tag to your HTML page. So you can try this:
var script = document.createElement('script');
script.setAttribute('type', 'text/javascript');
script.setAttribute('src', myUrl);
script.onerror = function() {
    // the script failed to load (e.g. a 404)
    debugger;
};
script.onload = function() {
    // the script loaded successfully
    debugger;
};
document.getElementsByTagName('body')[0].appendChild(script);
That way if the script fails to load the onerror event is called.
*This may not work in every instance, but it's a good start.