I have a large table (over 50 GB), and I want to create a new table from it using a query job.
I don't need the query result rows; I just want to wait until the job has completely finished.
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function run() {
  const [job] = await bigquery.createQueryJob({
    // Actually I'm going to use a complex query rather than a simple copy.
    query: 'SELECT * FROM `myproject.mydataset.mytable`',
    destinationTable: {
      ...
    }
  });
  // I want to wait until the destination table creation is completed.
  // But getQueryResults() causes a heap overflow.
  const [result] = await job.getQueryResults();
  // I'd like to do other things after the destination table is created.
  ...
}
run();
As in the code above, I tried using getQueryResults(), but it crashed with "JavaScript heap out of memory".
I also tried passing the {maxResults: 1} option to getQueryResults(), but then it returned before the destination table was created.
With the Node.js BigQuery client, is there any way to wait until the destination table has been created?
Maybe the job.on('complete', ...) event can help you. See: https://cloud.google.com/nodejs/docs/reference/bigquery/latest/bigquery/job
You can do something like:
const [job] = await bigquery.createQueryJob(...);
await new Promise((resolve, reject) =>
  job
    .on('complete', resolve)
    .on('error', reject)
);
job.removeAllListeners();
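Putting it together with the snippet from the question, a minimal sketch could look like this (the destination table options stay elided, as in the question):
const { BigQuery } = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function run() {
  const [job] = await bigquery.createQueryJob({
    query: 'SELECT * FROM `myproject.mydataset.mytable`',
    // destination options as in the question
  });

  // Wait for the job to finish without fetching any result rows,
  // so nothing large is buffered in memory.
  await new Promise((resolve, reject) =>
    job.on('complete', resolve).on('error', reject)
  );
  job.removeAllListeners();

  // The destination table now exists; continue with other work here.
}

run();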
I'm building a nestjs application that uses typeorm to communicate with postgres.
My tables are created dynamically, and data is also inserted dynamically, which is why I use raw queries instead of entities.
The problem is that some data in the tables is related, and I can't insert new data until the previous insert query has finished.
How do I check that query execution has finished?
Here is an example of the workflow I use. It works with small data sets but fails with big ones (10,000,000 entries and more):
export class Test {
  constructor(
    private readonly connection: Connection,
  ) {}

  public async insertData(table1, table2, arr1, arr2) {
    await this.insertInto(table1, arr1);
    // I want the second insertInto() to be executed only after I get
    // confirmation from the database that the insertInto() above has finished
    await this.insertInto(table2, arr2);
  }

  private async insertInto(table, data) {
    const queryRunner = this.connection.createQueryRunner();
    await queryRunner.connect();
    await queryRunner.startTransaction();

    const preparedData = [];
    // prepare data to be inserted as a raw query
    // ...

    try {
      await queryRunner.query(`INSERT INTO "${table}" VALUES ${preparedData}`);
      await queryRunner.commitTransaction();
    } catch (e) {
      await queryRunner.rollbackTransaction();
      throw new InternalServerErrorException(e, 'Error while executing custom query. Rollback transaction.');
    } finally {
      await queryRunner.release();
    }
  }
}
The desired result is to have some callback for queryRunner.query, like this: queryRunner.query('raw_sql', (err, res) => {}).
Is this possible with TypeORM?
Thanks
The way your code is written, the transaction commit only happens after the insert finishes, which means that at that point you can also execute your next query. You don't need a callback, because you're already using async/await syntax: each await resolves only once the database has responded.
However, it seems that with very large inserts something goes wrong (some sort of query/connection timeout, or a server resource failure). Try debugging/printing the error to see what really happened.
I suggest you try to split the insert into multiple batches (of something like 1,000 records each), as sketched below.
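A minimal sketch of that batching, reusing the queryRunner, table and preparedData names from the question, and assuming preparedData holds one SQL row tuple per element (the 1,000-row chunk size is just an example value):
const chunkSize = 1000;
for (let i = 0; i < preparedData.length; i += chunkSize) {
  const chunk = preparedData.slice(i, i + chunkSize);
  // each await resolves only after the database confirms this chunk,
  // so chunks are inserted strictly one after another
  await queryRunner.query(`INSERT INTO "${table}" VALUES ${chunk.join(', ')}`);
}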
I am using NestJS with Knex and am trying to build an error handler for my application.
I can update my data in the database using this code:
const updatedProject = await this.knex('projects')
  .update('name', body.name) // body.name is the new name coming from the frontend
  .where('id', id) // id also comes from the frontend and is compared against the id in the database
  .returning('*');
return updatedProject[0];
But if I try to update a project with an id that doesn't exist, I don't get any errors.
I tried to catch them this way, but it didn't work:
const updatedProject = await this.knex('projects')
  .update('name', body.name)
  .where('id', id)
  .returning('*')
  .catch((error) => console.error(error));
return updatedProject[0];
How can I raise and catch an error when the request is wrong?
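One thing worth knowing: with Postgres, .returning('*') resolves to an empty array when no row matches, so one way to turn that into an error is to check the result length. A minimal sketch (the NotFoundException from @nestjs/common is an assumption here, since the question doesn't show the surrounding service):
const updatedProject = await this.knex('projects')
  .update('name', body.name)
  .where('id', id)
  .returning('*');

// no row matched the id: knex resolves with [] instead of throwing
if (updatedProject.length === 0) {
  throw new NotFoundException(`Project with id ${id} not found`);
}
return updatedProject[0];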
I have been trying to use a value from a JSON file that I successfully wrote with the fs.writeFile() function.
There are two test cases in the same fixture: one creates an ID and the 2nd uses that id. I can write the id to the json file successfully using fs.writeFile(), and I try to use that id by importing the json file, like var myid = require('../../resources/id.json')
The json file stores the correct id of the current execution, but I get the id of the first test execution in the 2nd execution.
For example, id 1234 is stored during the first test execution and id 4567 is stored during the 2nd test execution. During the 2nd test execution I need id 4567, but I get 1234. This is weird, isn't it?
I use it like:
t.typeText(ele, myid.orderid)
My json file contains only the id, like {"orderid":"4567"}.
I am new to JavaScript and TestCafe; any help would really be appreciated.
Write File class
const fs = require('fs')
const baseClass = require('../component/base')

class WriteIntoFile {
  constructor(orderID) {
    const OID = {
      orderid: orderID
    }
    const jsonString = JSON.stringify(OID)
    fs.writeFile('resources/id.json', jsonString, err => {
      if (err) {
        console.log('Error writing file', err)
      } else {
        console.log('Successfully wrote file')
      }
    })
  }
}
export default WriteIntoFile
I created 2 different classes to separate the create and update operations, and I call the create and update functions in a single fixture in the test file.
Create Order class
class CreateOrder {
----
----
----
  async createNewOrder() {
    // get the text of the created order and save the order id into the json file
-----
-----
-----
    const orId = await baseClass.getOrderId();
    new WriteIntoFile(orId)
    console.log(orId)
-----
-----
-----
  }
}
export default CreateOrder
Update Order class
var id = require('../../resources/id.json')

class UpdateOrder {
  async searchOrderToUpdate() {
    await t
      // Here, I get the old order id that was saved during the previous execution
      .typeText(baseClass.searchBox, id.orderid)
      .wait(2500)
      .click(baseClass.searchIcon)
      .doubleClick(baseClass.orderAGgrid)
    console.log(id.orderid)
----
----
  }

  async updateOrder() {
    await this.searchOrderToUpdate()
    await t
      .typeText(baseClass.phNo, '1234567890')
      .click(baseClass.saveBtn)
  }
}
export default UpdateOrder
Test file
const newOrder = new CreateOrder();
const update = new UpdateOrder();
const role = Role(`siteurl`, async t => {
  await login('id') // custom login helper assumed from the question
  await t.wait(1500)
}, { preserveUrl: true })
test('Should be able to create an Order', async t => {
  await newOrder.createNewOrder();
});

test('Should be able to update an order', async t => {
  await update.updateOrder();
});
I'll reply to this, but you probably won't be happy with my answer, because I wouldn't go down the same path you proposed in your code.
I can see a couple of problems. Some of them might not be problems right now, but in a month you could struggle with them.
1/ You are creating separate test cases that are dependent on each other.
This is a problem for these reasons:
What if Should be able to create an Order doesn't run, or what if it fails? Then Should be able to update an order fails as well, and this information is useless, because it wasn't the update operation that failed, but the fact that you didn't meet all the preconditions for the test case.
How do you make sure Should be able to create an Order always runs before Should be able to update an order? There's no way to guarantee it. It may work while one comes right before the other, but at some point you'll move one test somewhere else, and then you're in trouble and will spend hours debugging it. You have prepared a trap for yourself. I wrote an answer on this very topic; you can read it.
You can't run the tests in parallel.
When I read your test file, there's no visible hint that the tests are dependent on each other. As a stranger to your code, I could therefore easily mess things up, because I have no way of knowing about the dependency without digging deeper into the code. This is a big trap for anyone who comes to your code after you. Don't do this to your colleagues.
2/ Working with files when all you need to do is pass a value around is too cumbersome.
I really don't see a reason why you need to save the id into a file. A slightly better approach (still violating 1/) could be:
const newOrder = new CreateOrder();
const update = new UpdateOrder();

// use a variable to pass the orderId around;
// it also makes it visible that the tests are dependent on each other
let orderId = undefined;

const role = Role(`siteurl`, async t => {
  // some steps, I omit this for better readability
}, { preserveUrl: true })

test('Should be able to create an Order', async t => {
  orderId = await newOrder.createNewOrder();
});

test('Should be able to update an order', async t => {
  await update.updateOrder(orderId);
});
Doing it like this also slightly remedies what I wrote in 1/, namely that it's not visible at first sight that the tests are dependent on each other. Now this is a bit improved.
Some other approaches to passing data around are mentioned here and here.
Perhaps an even better approach is to use the t.fixtureCtx object:
const newOrder = new CreateOrder();
const update = new UpdateOrder();

const role = Role(`siteurl`, async t => {
  // some steps, I omit this for better readability
}, { preserveUrl: true })

test('Should be able to create an Order', async t => {
  t.fixtureCtx.orderId = await newOrder.createNewOrder();
});

test('Should be able to update an order', async t => {
  await update.updateOrder(t.fixtureCtx.orderId);
});
Again, I can at least see the tests are dependent on each other. That's already a big victory.
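For this to work, createNewOrder() would need to return orId instead of writing it to a file, and UpdateOrder would take the id as a parameter. A sketch, reusing the names from the question:
import { t } from 'testcafe';
const baseClass = require('../component/base'); // as in the question

class UpdateOrder {
  async searchOrderToUpdate(orderId) {
    await t
      .typeText(baseClass.searchBox, orderId)
      .wait(2500)
      .click(baseClass.searchIcon)
      .doubleClick(baseClass.orderAGgrid);
  }

  async updateOrder(orderId) {
    await this.searchOrderToUpdate(orderId);
    await t
      .typeText(baseClass.phNo, '1234567890')
      .click(baseClass.saveBtn);
  }
}
export default UpdateOrder;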
Now back to your question:
During the 2nd test execution I need id 4567, but I get 1234. This is weird, isn't it?
No, it's not weird. You required the file:
var id = require('../../resources/id.json')
and so it was loaded once; if you write to the file later, you won't see the new content unless you read the file again. require() is Node's function for loading modules, and modules are cached after the first load, so requiring the same file again returns the cached value.
This demonstrates the problem:
const idFile = require('./id.json');
const fs = require('fs');
console.log(idFile); // { id: 5 }
const newId = {
  'id': 7
};
fs.writeFileSync('id.json', JSON.stringify(newId));
// it's been loaded once, you won't get any other value here
console.log(idFile); // { id: 5 }
What can you do to solve the problem?
You can use fs.readFileSync():
const idFile = require('./id.json');
const fs = require('fs');
console.log(idFile); // { id: 5 }
const newId = {
  'id': 7
};
fs.writeFileSync('id.json', JSON.stringify(newId));
// you need to read the file again and parse its content
const newContent = JSON.parse(fs.readFileSync('id.json'));
console.log(newContent); // { id: 7 }
And this is what I warned you against in the comment section: it's too cumbersome and inefficient, because you write to a file and then read from that file just to get one value back.
What you created is not very readable either:
const fs = require('fs')
const baseClass = require('../component/base')

class WriteIntoFile {
  constructor(orderID) {
    const OID = {
      orderid: orderID
    }
    const jsonString = JSON.stringify(OID)
    fs.writeFile('resources/id.json', jsonString, err => {
      if (err) {
        console.log('Error writing file', err)
      } else {
        console.log('Successfully wrote file')
      }
    })
  }
}
export default WriteIntoFile
All these operations for writing into a file are in a constructor, but a constructor is not the best place for all this. Ideally it contains only variable assignments. I also don't see much reason to create a new class when you're doing only two operations that easily fit on one line of code:
fs.writeFileSync('orderId.json', JSON.stringify({ orderid: orderId }));
Keep it as simple as possible. It's more readable this way than having to go to a separate file and decipher what the class does there.
I am new to using Promises to retrieve multiple database records at the same time, and I want to rewrite my existing code to use promises.
I have this piece of code in Express:
getController.getData = function(req, res, collection, pagerender) {
  var id = req.params.id;
  collection.find({}, function(err, docs) {
    if (err) res.json(err);
    else res.render(pagerender, {data: docs, ADusername: req.session.user_id, id: req.params.id});
    console.log(docs);
  });
};
Now I want to use promises here so I can run more queries against the database. Does anyone know how I can get this done?
First, check whether collection.find({}) returns a promise when called without a callback. If it does, you can write your code like this:
collection.find({})
  .then(function(docs) {
    res.render(pagerender, {data: docs, ADusername: req.session.user_id, id: req.params.id});
  })
  .catch(function(err) {
    res.json(err);
  });
If you want more calls here, just create a new DB call and add another .then block, or run the queries in parallel as sketched below.
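For example, with Promise.all() you can run several queries in parallel and render once they all resolve (the second users collection here is just an assumed example, since the question shows only one collection):
Promise.all([collection.find({}), users.find({})])
  .then(function([docs, userDocs]) {
    // both queries have resolved at this point
    res.render(pagerender, {data: docs, users: userDocs, ADusername: req.session.user_id, id: req.params.id});
  })
  .catch(function(err) {
    res.json(err);
  });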
I suggest you read the documentation on promises, just to get a general feeling about them (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/then). You will also see how you can handle both success and rejection in the same function if you want.
I am trying to insert multiple images into CouchDB via nano, using Express 4 and Formidable. I can access and insert individual files without difficulty using Formidable and nano; however, when I try to insert one file after another, I get conflict errors. I am very new to JS and Node, and I know my understanding of callbacks and asynchronous functions is limited. This is what I have at the moment. Any help would be greatly appreciated.
function uploadImage(req, res) {
  var form = new formidable.IncomingForm(),
      files = [],
      fields = [],
      uploadcount = 1;

  form.on('field', function(field, value) {
    fields.push([field, value]);
  });

  form.on('file', function(field, file) {
    files.push([field, file]);
    var docid = fields[0][1];
    getRevision();

    // fetch the doc's current revision before inserting the attachment
    function getRevision() {
      dbn.get(docid, { revs_info: true }, function(err, body) {
        if (!err) {
          insertImage(body._rev);
        } else {
          console.log(err);
        }
      });
    }

    function insertImage(exrev) {
      fs.readFile(file.path, function(err, data) {
        if (err) {
          console.log(err);
        } else {
          var imagename = docid + "_" + uploadcount + ".png";
          dbn.attachment.insert(docid, imagename, data, 'image/png',
            { rev: exrev }, function(err, body) {
              if (!err) {
                uploadcount++;
              } else {
                console.log(err);
              }
            });
        }
      });
    }
  });

  form.on('end', function() {
    console.log('done');
    res.redirect('/public/customise.html');
  });

  form.parse(req);
}
I found a solution: dump the files into a temporary directory first, then insert them into CouchDB via nano, all within a single function. I couldn't find a way to pause the file stream to wait for the CouchDB response, so this sequential method seems adequate.
This is a problem of handling asynchronous calls. Because each attachment insert requires the doc's current rev number, you can't do the inserts in parallel. You must insert a new attachment only after you get a response from the previous one.
You could use promises and deferreds to do this, but I personally have solved a similar problem using a package called "async". With async, you can use async.eachSeries() to make these asynchronous calls in series, as sketched below.
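A sketch of what that could look like, reusing the files, docid, dbn and uploadcount names from the posted code:
var async = require('async');

async.eachSeries(files, function(entry, done) {
  var file = entry[1];
  fs.readFile(file.path, function(err, data) {
    if (err) return done(err);
    // fetch the current rev just before each insert
    dbn.get(docid, function(err, body) {
      if (err) return done(err);
      var imagename = docid + '_' + uploadcount++ + '.png';
      dbn.attachment.insert(docid, imagename, data, 'image/png',
        { rev: body._rev }, done);
    });
  });
}, function(err) {
  if (err) console.log(err);
  else console.log('all attachments inserted');
});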
Another point is about the revision number: you may just use the lighter-weight db.head() function instead of db.get(). The rev number is presented in the "etag" header. You can get the rev like this:
// get the rev number from the etag header
db.head(bookId, function(err, _, headers) {
  if (!err) {
    // the etag value is a quoted string, so eval strips the quotes
    // (JSON.parse(headers.etag) would be a safer way to do the same)
    var rev = eval(headers.etag);
    // do whatever you need to do with the rev number
    // ......
  }
});
In addition, after each attachment insert, the response from CouchDB will look something like this:
{"ok":true,"id":"b2aba1ed809a4395d850e65c3ff2130c","rev":"4-59d043853f084c18530c2a94b9a2caed"}
The rev property gives the new rev number, which you can use when inserting the next attachment.
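So an alternative that avoids the extra get/head round trip is to thread the rev from each response into the next insert. A recursive sketch, again reusing fs, files, docid and dbn from the question:
function insertNext(rev, index) {
  if (index >= files.length) return console.log('done');
  var file = files[index][1];
  fs.readFile(file.path, function(err, data) {
    if (err) return console.log(err);
    var imagename = docid + '_' + (index + 1) + '.png';
    dbn.attachment.insert(docid, imagename, data, 'image/png',
      { rev: rev }, function(err, body) {
        if (err) return console.log(err);
        // body.rev is the new revision returned by CouchDB
        insertNext(body.rev, index + 1);
      });
  });
}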