I'm using Google Pub/Sub to receive a message and trigger a Cloud Function. That function runs an insert query in BigQuery with the message's data. The problem is that the message contains a UNIX timestamp, and I need to convert it to BigQuery's TIMESTAMP format, otherwise the function cannot run my query...
In this part of the function:
const { BigQuery } = require("@google-cloud/bigquery");
const bigquery = new BigQuery();

exports.insertBigQuery = async (message, context) => {
  // Decode the base64-encoded Pub/Sub message
  let logData = Buffer.from(message.data, "base64").toString();
  // Parse it as JSON
  let logMessage = JSON.parse(logData);

  const query = createQuery(logMessage);
  const options = {
    query: query,
    location: "US",
  };

  const [job] = await bigquery.createQueryJob(options);
  console.log(`Job ${job.id} started.`);

  // Only wait for the job to finish. There are no result rows; it's only an insert.
  await job.getQueryResults();
};
This gives me access to the message data. In this part of the function I build the query that runs against BigQuery:
function createQuery(logMessage) {
  const queryString = `INSERT INTO \`mytable\` (myTS, userTS, registerTS)
    VALUES (#myTS, #userTS, #registerTS);`;
My problem is that the message arrives with a UNIX timestamp, and when the function runs my query it gives me an error. I couldn't find any solution; any help is MUCH appreciated! Thanks in advance!
One way you can handle this is to use the TIMESTAMP_SECONDS function to wrap your values in the insert:
INSERT INTO \`mytable\`(myTS, userTS, registerTS)
VALUES ( TIMESTAMP_SECONDS(#myTS), TIMESTAMP_SECONDS(#userTS), TIMESTAMP_SECONDS(#registerTS));
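For completeness, here is a minimal sketch of what createQuery could look like with that conversion applied, using BigQuery's named query parameters (note that BigQuery itself expects @name placeholders, with values passed through the params option; it is assumed here that the timestamps arrive in the message as integer seconds):

function createQuery(logMessage) {
  return `INSERT INTO \`mytable\` (myTS, userTS, registerTS)
    VALUES (TIMESTAMP_SECONDS(@myTS), TIMESTAMP_SECONDS(@userTS), TIMESTAMP_SECONDS(@registerTS));`;
}

// In insertBigQuery, pass the values alongside the query:
const options = {
  query: createQuery(logMessage),
  location: "US",
  params: {
    myTS: Number(logMessage.myTS),       // TIMESTAMP_SECONDS expects an INT64,
    userTS: Number(logMessage.userTS),   // hence the Number() casts in case the
    registerTS: Number(logMessage.registerTS), // values arrive as strings
  },
};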
I am trying to bulk-upload hundreds of database records using the TypeORM library. The problem is that sometimes SQL returns a duplicate-key error during the bulk upload, which stops the whole operation. Is it possible to set up TypeORM so that duplicate entries are ignored and the insert is still performed?
The table uses a composite primary key made of two columns.
This is my insert command (TypeORM + Nestjs):
public async saveBulk(historicalPrices: IHistoricalPrice[]) {
  if (!historicalPrices.length) {
    return;
  }
  const repoPrices = historicalPrices.map((p) => this.historicalPricesRepository.create(p));
  await this.historicalPricesRepository.save(repoPrices, { chunk: 200 });
}
Thanks in advance
You will have to use InsertQueryBuilder to save the entities instead of the repository.save method. InsertQueryBuilder allows you to call an additional method, orIgnore(), which adds the IGNORE literal to your MySQL INSERT statement. From the official MySQL docs:
When INSERT IGNORE is used, the insert operation fails silently for rows containing the unmatched value, but inserts rows that are matched.
One drawback is that you'll now have to chunk the rows on your own, since InsertQueryBuilder doesn't provide any option to chunk the entities. Your code should look like this:
const targetEntity = this.historicalPricesRepository.target;
// Insert in chunks of 200, silently skipping rows that hit the duplicate key
for (let i = 0; i < historicalPrices.length; i += 200) {
  const chunk = historicalPrices.slice(i, i + 200);
  await this.historicalPricesRepository
    .createQueryBuilder()
    .insert()
    .into(targetEntity)
    .values(chunk)
    .orIgnore()
    .execute();
}
When I execute this code as a Google Sheets script, my first and subsequent attempts rarely retrieve the data from Binance. Occasionally it will work. Can anyone help?
function BINTickFetch() {
  var rows = [], obj_array = null;
  try {
    obj_array = JSON.parse(UrlFetchApp.fetch("https://api.binance.com/api/v3/ticker/price").getContentText());
  } catch (e) {
    obj_array = null;
  }
  if (obj_array == null) {
    Browser.msgBox("data not received from Binance. Try again");
    return false;
  } else {
    for (var r in obj_array) rows.push([obj_array[r].symbol, parseFloat(obj_array[r].price)]);
    var ss = SpreadsheetApp.getActiveSpreadsheet();
    var sheet = ss.getSheetByName('Binance24h');
    ss.getRange("Binance24h!A1").setValue(new Date());
    try {
      sheet.getRange(2, 1, sheet.getLastRow(), 2).clearContent();
    } catch (e) {
      Logger.log("error");
    }
    if (rows == null) {
      Browser.msgBox("incomplete symbol data from Binance. Try again");
      return false;
    }
    var range = sheet.getRange(2, 1, rows.length, 2);
    range.setValues(rows);
  }
}
(got the code off the internet somewhere)
I am not familiar with that site; I use Yahoo Finance. This will grab historical pricing for MSFT:
function importCSVFromWeb() {
// Provide the full URL of the CSV file.
var csvUrl = "https://query1.finance.yahoo.com/v7/finance/download/MSFT?period1=1577806579&period2=1609428979&interval=1d&events=history&includeAdjustedClose=true";
var csvContent = UrlFetchApp.fetch(csvUrl).getContentText();
var csvData = Utilities.parseCsv(csvContent);
var sheet = SpreadsheetApp.getActiveSheet();
sheet.getRange(1, 1, csvData.length, csvData[0].length).setValues(csvData);
}
Yahoo uses Unix timestamps for the beginning and ending periods (the period1 and period2 URL parameters).
This is one way to convert a Date to a Unix timestamp:
function getUnixDateStamp(stdDate) {
  // Format as yyyy-mm-dd first so the time-of-day component is dropped,
  // then convert: Date.parse() returns milliseconds, Yahoo expects seconds.
  var strDate = stdDate.toISOString().slice(0, 10);
  return Date.parse(strDate) / 1000;
}
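A quick usage sketch, assuming you want roughly the last year of MSFT data with the same URL format as importCSVFromWeb above (the function name importLastYearFromYahoo is just an illustration, not part of the original answer):

function importLastYearFromYahoo() {
  var end = new Date();
  var start = new Date(end.getTime() - 365 * 24 * 60 * 60 * 1000); // one year back
  var csvUrl = "https://query1.finance.yahoo.com/v7/finance/download/MSFT" +
    "?period1=" + getUnixDateStamp(start) +
    "&period2=" + getUnixDateStamp(end) +
    "&interval=1d&events=history&includeAdjustedClose=true";
  var csvData = Utilities.parseCsv(UrlFetchApp.fetch(csvUrl).getContentText());
  SpreadsheetApp.getActiveSheet().getRange(1, 1, csvData.length, csvData[0].length).setValues(csvData);
}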
I would like to know how I can run an Azure SQL stored procedure with multiple input parameters from Node.js.
For example, if I have a stored procedure FIND_USERS(activity_status, color, gender) which runs a query
select * from users where isActive = activity_status and bay_color = color and userGender = gender;
I should be able to call this stored procedure from Node.js with the input parameters. The key requirement is that I want to have a SQL transaction service that can take any CALL PROCEDURE-style command along with a set of input parameters and call the procedure with them, irrespective of the number of input parameters.
What I know is that for MySQL there is a mysql library which lets me run procedures with multiple parameters. I have encapsulated it in a MySQLConnector.js service:
var mysql = require('mysql');

exports.query = function(sql, values, next) {
  if (arguments.length === 2) {
    next = values;
    values = null;
  }
  var connection = mysql.createConnection({
    host: host,
    user: user,
    password: password,
    database: database
  });
  connection.connect(function(err) {
    if (err !== null) {
      console.log("[MYSQL] Error connecting to mysql:" + err + '\n');
      console.log(err == 'Error: ER_CON_COUNT_ERROR: Too many connections');
      if (err == 'Error: ER_CON_COUNT_ERROR: Too many connections') {
        connection.end();
      }
    }
  });
  connection.query(sql, values, function(err) {
    connection.end();
    if (err) {
      throw err;
    }
    next.apply(this, arguments);
  });
};
With this, I can call a stored procedure from nodejs with a function like
MySQLConnector.query('CALL FIND_USERS (?, ?, ?)', [1, 'blue', 'female'], function(err, userData) {
//do something with userData
});
How is it possible to do this for Azure MS SQL?
You can use the tedious driver to connect to SQL Server. It supports both input and output parameters for statements and stored procedures; you can find examples at http://tediousjs.github.io/tedious/parameters.html
Feel free to raise an issue on GitHub if you need more assistance.
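A minimal sketch of calling the FIND_USERS procedure from the question with tedious (the connection settings are placeholders, and the parameter types are assumptions about the table schema):

var Connection = require('tedious').Connection;
var Request = require('tedious').Request;
var TYPES = require('tedious').TYPES;

var connection = new Connection({
  server: 'yourserver.database.windows.net',
  authentication: {
    type: 'default',
    options: { userName: 'user', password: 'password' }
  },
  options: { database: 'yourdb', encrypt: true }
});

connection.on('connect', function(err) {
  if (err) { console.log(err); return; }

  var request = new Request('FIND_USERS', function(err, rowCount) {
    if (err) console.log(err);
    connection.close();
  });

  // One addParameter call per stored-procedure input parameter
  request.addParameter('activity_status', TYPES.Bit, 1);
  request.addParameter('color', TYPES.VarChar, 'blue');
  request.addParameter('gender', TYPES.VarChar, 'female');

  request.on('row', function(columns) {
    // do something with each row of user data
    console.log(columns.map(function(c) { return c.value; }));
  });

  connection.callProcedure(request);
});

connection.connect(); // required explicitly in recent tedious versions

Since the parameters are added one addParameter call at a time, this generalizes naturally to procedures with any number of input parameters, which is the generic service the question asks for.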
Make use of Edge.js: you create a function and send the parameters when you call it.
getHorarioFarmacia({pSede:'Sucursal Parrita'}, function (error, result){....
}
For more details, read the comments made by Luis Diego Pizarro here.
Good morning everyone,
I would be grateful if you could help me a little; I have been stuck on this issue for a long time.
I have a route handler after which my localhost never stops loading once it is triggered, and I have not figured out how to fix it.
Any help would be perfect.
This is my code:
// Copy scene
router.post('/copy', function(req,res,call) {
if( req.param('scene') !== undefined ){
db.serialize(function () {
db.run("CREATE TABLE temp_table as SELECT * FROM scene where id=?", req.param('scene'));
db.run("UPDATE temp_table SET id = NULL, user_id = (SELECT id FROM users WHERE email =?)",GLOBAL.email);
db.run("INSERT INTO scene SELECT * FROM temp_table");
db.run("DROP TABLE temp_table");
if(error) {
console.log(error);
}
});
db.close();
}
});
Thank you so much in advance
Whenever a browser sends a request to a server it expects a response; if it never gets one, the request hangs until it times out.
You need to send a response to terminate the request: use res.send() if this handler finishes the work, or hand off with call(error, result) if you expect the next middleware to continue processing.
router.post('/copy', function(req, res, call) {
  if (req.param('scene') !== undefined) {
    db.serialize(function() {
      db.run("CREATE TABLE temp_table as SELECT * FROM scene where id=?", req.param('scene'));
      db.run("UPDATE temp_table SET id = NULL, user_id = (SELECT id FROM users WHERE email =?)", GLOBAL.email);
      db.run("INSERT INTO scene SELECT * FROM temp_table");
      // The callback on the last statement reports any error from the run
      db.run("DROP TABLE temp_table", function(error) {
        db.close();
        if (error) {
          console.log(error);
          return res.send(error); // send a response if there is an error
          // or call(error);
        }
        res.send({ message: 'success' }); // send a response on success
        // or call(null, whatever you want to pass)
      });
    });
  } else {
    res.send({ message: 'scene parameter is missing' }); // respond here too, or the request hangs
  }
});
You must either invoke the callback call() to pass control on to the next middleware, or render/end the response (for example with res.send() or res.end()) once your middleware logic is done.
Please see: https://expressjs.com/en/guide/writing-middleware.html
I'm using Node.js 0.10.12 to perform queries against PostgreSQL 9.1.
I get the error invalid input syntax for integer: "{39}" (39 is an example number) when I try to perform an update query.
I cannot see what is going wrong. Any advice?
Here are snippets of my code. The front-end:
//this is global
var gid = 0;

//set websockets to search - works fine
var sd = new WebSocket("ws://localhost:0000");
sd.onmessage = function(evt) {
  //get the data and parse it (there is more than one var), then pass the id to gid
  var received_msg = evt.data;
  var packet = JSON.parse(received_msg);
  var tid = packet['tid'];
  gid = tid;
};

//when the user clicks the button, set websockets to send the id and other data, to perform the update query
var sa = new WebSocket("ws://localhost:0000");
sa.onopen = function() {
  sa.send(JSON.stringify({
    command: 'typesave',
    indi: gid,
    name: document.getElementById("typename").value
  }));
};
sa.onmessage = function(evt) {
  alert("Saved");
  sa.close();
  gid = 0; //reset gid to 0 for re-use
};
And the back-end (query):
var query = client.query("UPDATE type SET t_name=$1, t_color=$2 WHERE t_id = $3", [name, color, indi]);
query.on("row", function(row, result) {
  result.addRow(row);
});
query.on("end", function(result) {
  connection.send("o");
  client.end();
});
Why does this not work, and why is the number not recognized?
Thanks in advance
As the error message suggests, your database driver is sending an integer array with one member into a field expecting an integer. PostgreSQL rightly rejects the data and returns an error: '{39}' in PostgreSQL terms is exactly equivalent to ARRAY[39] using an array constructor, and to [39] in JSON.
Now, you can obviously just change your query call to pull the first item out of the JSON array and send that instead of the whole array, but I would be worried about what happens if things change and you get multiple values. You may want to separate out the handling of this data structure.
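A minimal sketch of the immediate fix on the back-end, assuming indi arrives as a one-element array such as [39] (the Array.isArray guard is an illustration, not part of the original answer):

// Unwrap the single-element array before binding the parameter
var id = Array.isArray(indi) ? indi[0] : indi;
var query = client.query("UPDATE type SET t_name=$1, t_color=$2 WHERE t_id = $3", [name, color, id]);

A more robust separation would validate on the front-end that indi is a plain integer before it is sent over the websocket.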