How can I use a value written to a JSON file during the same test execution in TestCafe?

I have been trying to use a value from a JSON file that I write successfully using the fs.writeFile() function.
There are two test cases in the same fixture: one to create an ID and a 2nd to use that ID. I can write the ID to the JSON file successfully using fs.writeFile(), and I try to read it by importing the JSON file like var myid = require('../../resources/id.json')
The JSON file stores the correct ID of the current execution, but in the 2nd test I get the ID from the first execution.
For example, id:1234 is stored during the first test execution and id:4567 is stored in the 2nd test execution. During the 2nd test execution I need id:4567 but I get 1234. This is weird, isn't it?
I use it like this:
t.typeText(ele, myid.orid)
and my JSON file contains only the ID, like {"orid":"4567"}
I am new to JavaScript and TestCafe; any help would be really appreciated.
Write File class
const fs = require('fs')
const baseClass = require('../component/base')
class WriteIntoFile {
  constructor(orderID) {
    const OID = {
      orderid: orderID
    }
    const jsonString = JSON.stringify(OID)
    fs.writeFile(`resources\id.json`, jsonString, err => {
      if (err) {
        console.log('Error writing file', err)
      } else {
        console.log('Successfully wrote file')
      }
    })
  }
}
export default WriteIntoFile
I created 2 different classes in order to separate the create & update operations, and I call the create & update order functions in a single fixture in the test file.
Create Order class
class CreateOrder {
  ----
  ----
  ----
  async createNewOrder() {
    // get text of the created order and save the order id into the json file
    -----
    -----
    -----
    const orId = await baseclass.getOrderId();
    new WriteIntoFile(orId)
    console.log(orId)
    -----
    -----
    -----
  }
}
export default CreateOrder
Update Order class
var id = require('../../resources/id.json')
class UpdateOrder {
  async searchOrderToUpdate() {
    await t
      // Here, I get the old order id that was saved during the previous execution
      .typeText(baseClass.searchBox, id.orderid)
      .wait(2500)
      .click(baseClass.searchIcon)
      .doubleClick(baseClass.orderAGgrid)
    console.log(id.orderid)
    ----
    ----
  async updateOrder() {
    this.searchOrderToUpdate()
      .typeText(baseClass.phNo, '1234567890')
      .click(baseClass.saveBtn)
  }
}
export default UpdateOrder
Test file
const newOrder = new CreateOrder();
const update = new UpdateOrder();
const role = Role(`siteurl`, async t => {
  await t
  login('id')
  await t
    .wait(1500)
}, { preserveUrl: true })
test('Should be able to create an Order', async t => {
  await newOrder.createNewOrder();
});
test('Should be able to update an order', async t => {
  await update.updateOrder();
});

I'll reply to this, but you probably won't be happy with my answer, because I wouldn't go down the same path you propose in your code.
I can see a couple of problems. Some of them might not be problems right now, but in a month, you could struggle with this.
1/ You are creating separate test cases that are dependent on each other.
This is a problem because of these reasons:
what if Should be able to create an Order doesn't run? Or what if it fails? Then Should be able to update an order fails as well, and this information is useless, because it wasn't the update operation that failed; you simply didn't meet all the preconditions for the test case
how do you make sure Should be able to create an Order always runs before Should be able to update an order? There's no way! It may work right now, when one test comes right before the other, but the moment you move one test somewhere else you are in trouble, and you'll spend hours debugging it. You have prepared a trap for yourself. I wrote this answer on this very topic, you can read it.
you can't run the tests in parallel
when I read your test file, there's no visible hint that the tests are dependent on each other. As a stranger to your code, I could easily mess things up, because I have no way of knowing about the dependency without digging deeper into the code. This is a big trap for anyone who comes to your code after you. Don't do this to your colleagues.
2/ Working with files when all you need to do is pass a value around is too cumbersome.
I really don't see a reason why you need to save the id into a file. A slightly better approach (still violating 1/) could be:
const newOrder = new CreateOrder();
const update = new UpdateOrder();
// use a variable to pass the orderId around
// it's also visible that the tests are dependent on each other
let orderId = undefined;
const role = Role(`siteurl`, async t => {
  // some steps, I omit this for better readability
}, { preserveUrl: true })
test('Should be able to create an Order', async t => {
  orderId = await newOrder.createNewOrder();
});
test('Should be able to update an order', async t => {
  await update.updateOrder(orderId);
});
Doing it like this also slightly remedies the point from 1/ that the dependency between the tests isn't visible at first sight. Now, this is a bit improved.
Some other approaches how you can pass data around are mentioned here and here.
Perhaps an even better approach is to use the t.fixtureCtx object:
const newOrder = new CreateOrder();
const update = new UpdateOrder();
const role = Role(`siteurl`, async t => {
  // some steps, I omit this for better readability
}, { preserveUrl: true })
test('Should be able to create an Order', async t => {
  t.fixtureCtx.orderId = await newOrder.createNewOrder();
});
test('Should be able to update an order', async t => {
  await update.updateOrder(t.fixtureCtx.orderId);
});
Again, I can at least see the tests are dependent on each other. That's already a big victory.
Now back to your question:
During 2nd test execution I need the id:4567 but I get 1234 this is weird, isn't it?
No, it's not weird. You required the file:
var id = require('../../resources/id.json')
and so it's loaded only once; if you write into the file later, you won't see the new content unless you read the file again. require() is Node's mechanism for loading modules, and modules are cached after the first load, so requiring the same path again returns the cached value.
This demonstrates the problem:
const idFile = require('./id.json');
const fs = require('fs');
console.log(idFile); // { id: 5 }
const newId = {
  'id': 7
};
fs.writeFileSync('id.json', JSON.stringify(newId));
// it's been loaded once, you won't get any other value here
console.log(idFile); // { id: 5 }
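As an aside (not part of the original answer): the module cache lives in require.cache, so you could force a re-read by deleting the cached entry, although the fs.readFileSync() approach shown below is simpler:
// hacky alternative: evict the cached module, then require the file again
delete require.cache[require.resolve('./id.json')];
console.log(require('./id.json')); // { id: 7 }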
What can you do to solve the problem?
You can use fs.readFileSync():
const idFile = require('./id.json');
const fs = require('fs');
console.log(idFile); // { id: 5 }
const newId = {
  'id': 7
};
fs.writeFileSync('id.json', JSON.stringify(newId));
// you need to read the file again and parse its content
const newContent = JSON.parse(fs.readFileSync('id.json'));
console.log(newContent); // { id: 7 }
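Applied to the UpdateOrder class from the question, the fix would look roughly like this (a sketch that keeps your file-based approach; baseClass and t are assumed to be as in your code):
const fs = require('fs')
class UpdateOrder {
  async searchOrderToUpdate() {
    // read and parse the file on every call, so you always get the id
    // written by the current execution, not the value cached by require()
    // note: fs resolves relative paths against the process cwd, unlike require
    const id = JSON.parse(fs.readFileSync('../../resources/id.json'))
    await t
      .typeText(baseClass.searchBox, id.orderid)
      .wait(2500)
      .click(baseClass.searchIcon)
  }
}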
And this is what I warned you against in the comment section: this is too cumbersome and inefficient, because you write to a file and then read from the file just to pass one value around.
What you created is not very readable either:
const fs = require('fs')
const baseClass = require('../component/base')
class WriteIntoFile {
  constructor(orderID) {
    const OID = {
      orderid: orderID
    }
    const jsonString = JSON.stringify(OID)
    fs.writeFile(`resources\id.json`, jsonString, err => {
      if (err) {
        console.log('Error writing file', err)
      } else {
        console.log('Successfully wrote file')
      }
    })
  }
}
export default WriteIntoFile
All these operations for writing into a file are in a constructor, but a constructor is not the best place for them; ideally it contains only variable assignments. I also don't see much reason to create a new class when you are doing only two operations that easily fit on one line of code:
fs.writeFileSync('orderId.json', JSON.stringify({ orderid: orderId }));
Keep it as simple as possible. It's more readable like this than having to go to a separate file with the class and decipher what it does there.
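To put the recommended approach together, the whole flow without any file I/O could look roughly like this (a sketch; baseClass and the elided steps are assumed to be as in the question):
class CreateOrder {
  async createNewOrder() {
    // ...steps that create the order...
    const orId = await baseClass.getOrderId();
    return orId; // hand the id back to the test instead of writing it to a file
  }
}

class UpdateOrder {
  async updateOrder(orderId) {
    // the id now arrives as a parameter, so there is nothing to require() or read
    await t
      .typeText(baseClass.searchBox, orderId)
      .click(baseClass.searchIcon)
      .typeText(baseClass.phNo, '1234567890')
      .click(baseClass.saveBtn);
  }
}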

Related

How to use a Firestore query result in another query (Kotlin)

Tell me please. Here is the database: a collection "orders"; inside it the documents have fields like ID, number, address, and so on, and each order has a subcollection "carpets"; the carpet documents also have various fields, including a cost field (that is, the cost of cleaning that carpet).
Now the QUESTION:
how do I calculate the total cost and write the result into a field of the "orders" document?
In general it is interesting how to implement this: how do I make such queries so that the result is then written into a different field?
A good approach for a summary field is to define a property on the parent document (the "order", in the OP's case) and code a write trigger on the child collection ("orders/carpets").
The trigger's job is to determine what sort of write has taken place on the collection and update the parent doc's property accordingly.
Code something like (very roughly like) the following in your cloud functions folder...
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

// when carpets are written, update their parent order's "ordersTotal" prop
// (note the {carpetId} wildcard: the trigger must point at a document, not a collection)
exports.didUpdateCarpets = functions.firestore
  .document('orders/{orderId}/carpets/{carpetId}')
  .onWrite(async (change, context) => {
    const orderId = context.params.orderId;
    const ref = db.collection('orders').doc(orderId);
    try {
      await db.runTransaction(async (transaction) => {
        const doc = await transaction.get(ref);
        let ordersTotal = doc.data().ordersTotal;
        // modify ordersTotal based on the trigger params
        const before = change.before.exists ? change.before.data() : null;
        const after = change.after.exists ? change.after.data() : null;
        if (!before) ordersTotal += after.cost;       // created
        else if (!after) ordersTotal -= before.cost;  // deleted
        else ordersTotal += after.cost - before.cost; // modified
        transaction.update(ref, { ordersTotal });
      });
    } catch (e) {
      console.log("Transaction failed: ", e);
    }
  });

Attempting to check if s3 bucket item exists within nested asyncs

I have a Serverless Lambda function that, in response to an S3 s3:ObjectCreated event, tries to check whether a separate item exists in an S3 bucket, using the following bit of code with the AWS JavaScript SDK:
exports.somethingSomeSomething = async (event) => {
  event.Records.forEach(async (record) => {
    let tst = await s3.headObject({
      Bucket: "mybucket",
      Key: "something.gz"
    }).promise()
    console.log(tst)
  })
};
I'm quite rusty with promises in JS, so I'm not sure why this bit of code doesn't work. For reference, it just dies without outputting anything.
However, the following does work:
exports.somethingSomething = async (event) => {
  let tst = await s3.headObject({
    Bucket: "mybucket",
    Key: "something.gz"
  }).promise()
  console.log(tst)
  console.log("RED")
};
How can I get the initial bit of code working, and what am I doing wrong?
It's because your code is async, but the function passed to your forEach loop is also async, so you have an async function invoking another chunk of async code, and you lose control of the flow. Whatever is inside forEach will run (although anything after forEach will run before whatever is inside it), but it executes asynchronously and you have no way to keep track of its execution.
But if the code, as I said, will run, why don't you see the results?
Well, that's because Lambda terminates before that code gets a chance to execute. If you run the same piece of code locally, you'll see it runs just fine, but since the original code runs on top of Lambda, you don't control when it terminates.
You have two options here:
The easiest is to grab the first item in the Records array, because S3 sends one and only one event per invocation; it's an array only because AWS uses a common event interface for all services. Anyway, your forEach is not using anything from the record object, but if you wanted to use any of its properties, simply reference the 0th position, like so:
exports.somethingSomeSomething = async (event) => {
  const record = event.Records[0]
  // do something with record
  const tst = await s3.headObject({
    Bucket: "mybucket",
    Key: "something.gz"
  }).promise()
  console.log(tst)
};
If you still want to use a loop to iterate through the records (although, again, unnecessary for S3 events), use a for...of loop instead:
exports.somethingSomeSomething = async (event) => {
  for (const record of event.Records) {
    // do something with record
    const tst = await s3.headObject({
      Bucket: "mybucket",
      Key: "something.gz"
    }).promise()
    console.log(tst)
  }
};
Since for...of is just a regular loop, it runs inside the async function it's written in, so await is perfectly valid inside it.
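As a side note (not part of the answer above): if you ever do need to process many records and their order doesn't matter, a common alternative is to map the records to promises and await them all at once (s3 is assumed to be the SDK client from the question):
exports.somethingSomeSomething = async (event) => {
  // start all headObject calls in parallel and wait until every one settles
  const results = await Promise.all(
    event.Records.map(record =>
      s3.headObject({ Bucket: "mybucket", Key: "something.gz" }).promise()
    )
  );
  results.forEach(tst => console.log(tst));
};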
More on async/await and for..of

Using promises in Mongoose

I am new to the Promise approach for retrieving multiple database records at the same time, and I want to rewrite my existing code to use promises.
I have this piece of code in Express:
getController.getData = function(req, res, collection, pagerender) {
  var id = req.params.id;
  collection.find({}, function(err, docs) {
    if (err) res.json(err);
    else res.render(pagerender, { data: docs, ADusername: req.session.user_id, id: req.params.id });
    console.log(docs);
  });
};
Now I want to use promises here, so I can do more queries to the database. Anyone know how I can get this done?
First, check if collection.find({}) returns a promise. If it does, then you can call your code like:
collection.find({})
  .then(function(docs) {
    res.render(pagerender, { data: docs, ADusername: req.session.user_id, id: req.params.id });
  })
  .catch(function(err) {
    res.json(err);
  })
If you want more calls here, just create a new DB call and add another .then block, as sketched below.
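For illustration, chaining a second query could look something like this (othercollection is a hypothetical second collection, not from the original code):
collection.find({})
  .then(function(docs) {
    // start the second query only after the first one has resolved
    return othercollection.find({}).then(function(otherDocs) {
      res.render(pagerender, { data: docs, other: otherDocs });
    });
  })
  .catch(function(err) {
    res.json(err);
  });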
I suggest you read the documentation on promises, just to get a general feeling about them (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/then). You will also see how you can handle both success and rejection in the same function if you want.

Dexie.js table.name isn't working even though the table is under the tables property

I want to fetch all items from a table into a collection but am getting an error that the table name is undefined. Here is my store:
db.version(1).stores({
  users: '++id,',
  orgs: '++id,',
  applications: '++id'
})
Then later here is my call:
db.orgs.toCollection().count(function (count) {
  console.log(count)
})
It gives the following error:
TypeError: Cannot read property 'toCollection' of undefined
But when I stop the debugger at the call and type db.tables, sure enough:
1:Table {name: "orgs", schema: TableSchema, _tx: undefined, …}
_tx:undefined
hook:function rv(eventName, subscriber) { … }
name:"orgs"
Any help is appreciated - thanks.
UPDATE
I noticed that when I seeded the database on initial creation, I could fetch the data out. So I copied that code into my template. It still fails, however, so there must be something simple I'm missing. Here is that code:
import Dexie from '#/dexie.es.js'
export default {
  name: 'ListOrgs',
  data: () => {
    return {
      orgs: []
    }
  },
  methods: {
    populateOrgs: async function () {
      let db = await new Dexie('myDatabase').open()
      db.orgs.toCollection().count(function (count) {
        console.log(count)
      })
    }
  },
  mounted () {
    this.populateOrgs()
  }
}
Dexie has two modes:
Static - the most common one, used in most samples.
Dynamic - the schema is not specified in code.
Static Mode
//
// Static Mode
//
const db = new Dexie('myDatabase');
db.version(1).stores({myTable1: '++'});
db.version(2).stores({myTable1: '++, foo'});
db.myTable1.add({foo: 'bar'}); // OK - dexie knows about myTable1!
Dynamic Mode
//
// Dynamic Mode
//
const db = new Dexie('myDatabase');
// FAIL: db.myTable1.add({foo: 'bar'}); // myTable1 is unknown to the API.
// Here, you must wait for db to open, and then access tables using db.table() method:
db.open().then(db => {
  const myTable = db.table('myTable');
  if (myTable) {
    myTable.add({foo: 'bar'});
  }
}).catch(error => {
  console.error(error);
});
If you omit any version() specification, Dexie will just try to open an existing database with the same name, regardless of version or schema. But it won't create the implicit table properties on the db instance.
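Applied to the populateOrgs() method from the question, a dynamic-mode version would look roughly like this (a sketch; the rest of the component stays as it is):
populateOrgs: async function () {
  // wait for the existing database to open, then look the table up by name
  let db = await new Dexie('myDatabase').open()
  let count = await db.table('orgs').count()
  console.log(count)
}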
When Dynamic Mode is Useful
Dynamic mode can be useful when building an arbitrary database utility that should adapt to any indexedDB database, such as a DB explorer. It can also be useful when the JavaScript code is by design not aware of the schema (what tables are expected to be queried and what indexes there are).
Benefits with Static Mode
No need to wait for db.open() to complete.
Automatic DB creation when needed. No complex app code to deal with database versioning.
Automatic DB population when needed.
Design Patterns in Static Mode
db.js
import Dexie from 'dexie';
//
// Let this module do several things:
//
// * Create the singleton Dexie instance for your application.
// * Declare its schema (and version history / migrations)
// * (Populate default data http://dexie.org/docs/Dexie/Dexie.on.populate)
//
export const db = new Dexie('myDatabase');
db.version(1).stores({
  users: '++id,',
  orgs: '++id,',
  applications: '++id'
});
db.on('populate', () => {
  return db.orgs.bulkAdd([
    {'foo': 'bar'},
  ]);
});
app.js
import {db} from './db';
// Wherever you use the database, include your own db module
// instead of creating a new Dexie(). This way your code will
// always make sure to create or upgrade your database whichever
// of your modules that comes first in accessing the database.
//
// You will not have to take care of creation or upgrading scenarios.
//
// Let Dexie do that for you instead.
//
async function countOrgs() {
  return await db.orgs.count();
}

Multiple image attachments to couchdb (nano) with Node.js Express 4.0 & formidable

I am trying to insert multiple images into couchdb via nano, using express 4 and formidable. I can access and insert individual files without difficulty using formidable and nano; however, when I try to insert one file after another, I get conflict errors. I am very new to js and node, and I know my understanding of callbacks and asynchronous functions is limited. This is what I have at the moment. Any help would be greatly appreciated.
function uploadImage(req, res) {
  var form = new formidable.IncomingForm(),
    files = [],
    fields = [];
  uploadcount = 1;
  form.on('field', function(field, value) {
    fields.push([field, value]);
  })
  form.on('file', function(field, file) {
    files.push([field, file]);
    var docid = fields[0][1];
    getRevision();
    function getRevision() {
      dbn.get(docid, { revs_info: true }, function(err, body, file) {
        if (!err) {
          exrev = body._rev;
          insertImage(exrev);
        } else {
          console.log(err);
        }
      });
    }
    function insertImage(exrevision) {
      var exrev = exrevision;
      fs.readFile(file.path, function(err, data) {
        if (err) {
          console.log(err);
        } else {
          var imagename = docid + "_" + uploadcount + ".png";
          dbn.attachment.insert(docid, imagename, data, 'image/png',
            { rev: exrev }, function(err, body) {
              if (!err) {
                uploadcount++;
              } else {
                console.log(err);
              }
            });
        }
      });
    }
  });
  form.on('end', function() {
    console.log('done');
    res.redirect('/public/customise.html');
  });
  form.parse(req);
};
I found a solution by dumping the files first into a temporary directory, then inserting the files into couchdb via nano within a single function. I couldn't find a way to pause the file stream to wait for the couchdb response, so this sequential method seems adequate.
This is a problem of handling asynchronous calls. Because each attachment insert requires the doc's current rev number, you can't do the inserts in parallel. You must insert a new attachment only after you get a response from the previous one.
You could use promises and the deferred mechanism to do this, but I personally have solved a similar problem using a package called "async". In async, you can use async.eachSeries() to make these async calls in series, as sketched below.
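A rough sketch of what that could look like with the files collected by formidable (the variable names are assumptions based on the question's code, not a drop-in implementation; files is assumed to be an array of file objects):
const async = require('async');

// insert the attachments strictly one after another; each step fetches the
// doc's current rev and only then uploads the next attachment
async.eachSeries(files, function(file, done) {
  dbn.get(docid, function(err, body) {
    if (err) return done(err);
    fs.readFile(file.path, function(err, data) {
      if (err) return done(err);
      dbn.attachment.insert(docid, file.name, data, 'image/png',
        { rev: body._rev }, done);
    });
  });
}, function(err) {
  if (err) console.log(err);
  else console.log('all attachments inserted');
});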
Another point, about the revision number: you may just use the lighter-weight db.head() function instead of db.get(). The rev number is presented in the "etag" header. You can get the rev like this:
// get rev number
db.head(bookId, function(err, _, headers) {
  if (!err) {
    // the etag header holds a JSON-quoted string, e.g. '"4-59d0..."',
    // so parse it instead of using eval
    var rev = JSON.parse(headers.etag);
    // do whatever you need to do with the rev number
    ......
  }
});
In addition, after each attachment insert, the response from couchdb will look something like this:
{"ok":true,"id":"b2aba1ed809a4395d850e65c3ff2130c","rev":"4-59d043853f084c18530c2a94b9a2caed"}
The rev property gives the new rev number, which you may use when inserting the next attachment.