I need some help.
I'm receiving messages with data on a Pub/Sub topic, and I need to insert the data I get from each message into BigQuery with a background Cloud Function (Pub/Sub trigger)...
What I manage to do:
/**
 * Triggered from a message on a Cloud Pub/Sub topic.
 *
 * @param {!Object} message The Pub/Sub message payload.
 * @param {!Object} context Metadata for the event.
 */
exports.insertBigQuery = (message, context) => {
  extractQuery(message.data);
};
function extractQuery(pubSubMessage) {
  // Decode the base64 Pub/Sub message
  let logData = Buffer.from(pubSubMessage, 'base64').toString();
  // Convert it to JSON
  let logMessage = JSON.parse(logData);
  console.log(logMessage.customerToken);
  console.log(logMessage.fbclid);
  console.log(logMessage.fbc);
  console.log(logMessage.fbp);
  console.log(logMessage.firstHitTS);
  console.log(logMessage.consentFB);
  main();
  return logMessage;
}
"use strict";
function main() {
const { BigQuery } = require("#google-cloud/bigquery");
const bigquery = new BigQuery();
async function query() {
const query = `INSERT INTO MYTABLE( customerToken, fbclid, fbc, fbp, firstHitTS, consentFB)
VALUES ("customerTokenSCRIPTCLOUD","fbclidSCRIPT"," fbcSCRIPTCLOUD"," fbpSCRIPTCLOUD","2021-01-05",TRUE )`;
const options = {
query: query,
location: "US",
};
const [job] = await bigquery.createQueryJob(options);
console.log(`Job ${job.id} started.`);
const [rows] = await job.getQueryResults();
console.log("Rows:");
rows.forEach((row) => console.log(row));
}
query();
}
Now every time I receive a message I run a query in BigQuery, but my VALUES are hard-coded, as you can see here:
const query = `INSERT INTO devsensetestprojects.TestDataSet.fbSimpleData (customerToken, fbclid, fbc, fbp, firstHitTS, consentFB)
  VALUES ("customerTokenSCRIPTCLOUD", "fbclidSCRIPT", "fbcSCRIPTCLOUD", "fbpSCRIPTCLOUD", "2021-01-05", TRUE)`;
What I'm not able to do is get the values from function extractQuery(pubSubMessage) and use them in my query the same way I use them in that function (logMessage.SOMEVALUE), so that the query gets the correct values.
Thanks in advance!
As you said, you are a beginner in development. Here is more concise and efficient code. I didn't test it, but it is closer to what you want. Let me know if some parts are mysterious to you!
// Make them global to load them only when the Cloud Function instance is created.
// They will be reused in subsequent invocations, until the instance is deleted.
const { BigQuery } = require("@google-cloud/bigquery");
const bigquery = new BigQuery();
exports.insertBigQuery = async (message, context) => {
  // Decode the base64 Pub/Sub message
  let logData = Buffer.from(message.data, 'base64').toString();
  // Convert it to JSON
  let logMessage = JSON.parse(logData);
  const query = createQuery(logMessage);
  const options = {
    query: query,
    location: "US",
  };
  const [job] = await bigquery.createQueryJob(options);
  console.log(`Job ${job.id} started.`);
  // Only wait for the end of the job. There are no rows in the answer; it's only an insert.
  await job.getQueryResults();
};
function createQuery(logMessage) {
  // You may have to format logMessage.firstHitTS so that BigQuery accepts it as a date.
  return `INSERT INTO MYTABLE (customerToken, fbclid, fbc, fbp, firstHitTS, consentFB)
    VALUES ("${logMessage.customerToken}", "${logMessage.fbclid}", "${logMessage.fbc}", "${logMessage.fbp}",
    "${logMessage.firstHitTS}", ${logMessage.consentFB})`;
}
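Be aware that building the SQL string by interpolating values directly makes the function vulnerable to SQL injection and quoting problems. As a safer variant, here is a minimal sketch using the client's named query parameters (MYTABLE stays a placeholder, and it assumes all message fields are present with the right types):

// Sketch: same insert, but with named query parameters instead of string concatenation.
// Uses the global `bigquery` client declared above.
async function insertRow(logMessage) {
  const query = `INSERT INTO MYTABLE (customerToken, fbclid, fbc, fbp, firstHitTS, consentFB)
    VALUES (@customerToken, @fbclid, @fbc, @fbp, @firstHitTS, @consentFB)`;
  const options = {
    query: query,
    location: "US",
    // Each named parameter is bound from the decoded Pub/Sub message.
    params: {
      customerToken: logMessage.customerToken,
      fbclid: logMessage.fbclid,
      fbc: logMessage.fbc,
      fbp: logMessage.fbp,
      firstHitTS: logMessage.firstHitTS, // may need conversion to a valid DATE/TIMESTAMP value
      consentFB: logMessage.consentFB,
    },
  };
  const [job] = await bigquery.createQueryJob(options);
  await job.getQueryResults(); // just wait for the DML job to complete
}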
I want to collect all transactions for an NFT.
For example, you can display all transactions here:
https://explorer.solana.com/address/2Nzt8TYeAfgJDftKzkb7rgYShVvyXTR7cPVvpqaZ2a4V
or here:
https://solscan.io/token/2Nzt8TYeAfgJDftKzkb7rgYShVvyXTR7cPVvpqaZ2a4V#txs
But is there any way to do this with the API?
I checked
solana-py: https://michaelhly.github.io/solana-py/
and solscan api: https://public-api.solscan.io/docs/
But I could not find a way to do it.
You can use the getSignaturesForAddress RPC method on the mint address and walk backward to get all the transactions.
Here is an example in JS:
import {
  Connection,
  clusterApiUrl,
  ConfirmedSignatureInfo,
  PublicKey,
} from "@solana/web3.js";

const connection = new Connection(clusterApiUrl("mainnet-beta"));

export const getTxs = async (connection: Connection, pubkey: PublicKey) => {
  const txs: ConfirmedSignatureInfo[] = [];
  // Walk backward
  let lastTransactions = await connection.getConfirmedSignaturesForAddress2(
    pubkey
  );
  let before = lastTransactions[lastTransactions.length - 1].signature;
  txs.push(...lastTransactions);
  while (true) {
    const newTransactions = await connection.getConfirmedSignaturesForAddress2(
      pubkey,
      {
        before,
      }
    );
    if (newTransactions.length === 0) break;
    txs.push(...newTransactions);
    before = newTransactions[newTransactions.length - 1].signature;
  }
  return txs;
};

getTxs(
  connection,
  new PublicKey("2Nzt8TYeAfgJDftKzkb7rgYShVvyXTR7cPVvpqaZ2a4V")
);
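Note that more recent versions of @solana/web3.js deprecate getConfirmedSignaturesForAddress2 in favor of getSignaturesForAddress, which accepts the same before/limit options, so the loop above can be switched over with a one-line change. A minimal sketch, assuming a recent web3.js release:

import { Connection, clusterApiUrl, PublicKey } from "@solana/web3.js";

// Sketch: fetch one page of signatures (newest first) with the non-deprecated method.
async function getOnePage(mint: PublicKey, before?: string) {
  const connection = new Connection(clusterApiUrl("mainnet-beta"));
  return connection.getSignaturesForAddress(mint, { before, limit: 1000 });
}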
The equivalent method in solana-py is get_signatures_for_address: https://michaelhly.github.io/solana-py/rpc/api/#solana.rpc.api.Client.get_signatures_for_address
It seems quite new, but I'm just hoping someone here has been able to use Node.js to write directly to BigQuery storage using @google-cloud/bigquery-storage.
There is an explanation of how the overall backend API works and how to write a collection of rows atomically using the BigQuery Write API, but there is no such documentation for Node.js yet. A recent release, 2.7.0, documents the addition of this feature, but there is no example and the code is not easily understood.
There is an open issue requesting an example, but I thought I'd try my luck to see if anyone has been able to use this API yet.
Suppose you have a BigQuery table called student with three columns: id, name and age. The following steps will let you load data into the table with the Node.js Storage Write API.
Define the student.proto file as follows:
syntax = "proto2";
message Student {
required int64 id = 1;
optional string name = 2;
optional int64 age = 3;
}
Run the following at the command prompt:
protoc --js_out=import_style=commonjs,binary:. student.proto
It should generate a student_pb.js file in the current directory.
Write the following JS code in the current directory and run it:
const {BigQueryWriteClient} = require('@google-cloud/bigquery-storage').v1;
const st = require('./student_pb.js');
const type = require('@google-cloud/bigquery-storage').protos.google.protobuf.FieldDescriptorProto.Type;
const mode = require('@google-cloud/bigquery-storage').protos.google.cloud.bigquery.storage.v1.WriteStream.Type;

const storageClient = new BigQueryWriteClient();
const parent = `projects/${project}/datasets/${dataset}/tables/student`;
var writeStream = {type: mode.PENDING};
var student = new st.Student();
var protoDescriptor = {};
protoDescriptor.name = 'student';
protoDescriptor.field = [
  {'name': 'id', 'number': 1, 'type': type.TYPE_INT64},
  {'name': 'name', 'number': 2, 'type': type.TYPE_STRING},
  {'name': 'age', 'number': 3, 'type': type.TYPE_INT64}
];
async function run() {
  try {
    var request = {
      parent,
      writeStream
    };
    var response = await storageClient.createWriteStream(request);
    writeStream = response[0].name;
    var serializedRows = [];
    // Row 1
    student.setId(1);
    student.setName('st1');
    student.setAge(15);
    serializedRows.push(student.serializeBinary());
    // Row 2
    student.setId(2);
    student.setName('st2');
    student.setAge(15);
    serializedRows.push(student.serializeBinary());
    var protoRows = {
      serializedRows
    };
    var proto_data = {
      writerSchema: {protoDescriptor},
      rows: protoRows
    };
    // Construct request
    request = {
      writeStream,
      protoRows: proto_data
    };
    // Insert rows
    const stream = await storageClient.appendRows();
    stream.on('data', response => {
      console.log(response);
    });
    stream.on('error', err => {
      throw err;
    });
    stream.on('end', async () => {
      /* API call completed */
      try {
        var response = await storageClient.finalizeWriteStream({name: writeStream});
        response = await storageClient.batchCommitWriteStreams({parent, writeStreams: [writeStream]});
      } catch (err) {
        console.log(err);
      }
    });
    stream.write(request);
    stream.end();
  } catch (err) {
    console.log(err);
  }
}
run();
Make sure your environment variables are set correctly to point to the file containing your Google Cloud credentials.
Change the project and dataset values accordingly (see the sketch below).
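For example, the setup could look like this (all names and paths here are placeholders, not values from the answer above):

// Placeholder values: point GOOGLE_APPLICATION_CREDENTIALS at your service account key
// before BigQueryWriteClient is created, and fill in your own project and dataset.
process.env.GOOGLE_APPLICATION_CREDENTIALS = '/path/to/service-account-key.json';
const project = 'my-gcp-project';
const dataset = 'my_dataset';
const parent = `projects/${project}/datasets/${dataset}/tables/student`;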
This is the query I am using:
app.get("/items/:data", async (req, res) => {
const { data } = req.params;
query = `
SELECT items.discount
FROM items
WHERE items.discount #? '$[*] ? (#.discount[*].shift == $1)'
`
try {
const obj = await pool.query(query, [data]);
res.json(obj.rows[0])
} catch(err) {
console.error(err.message);
}
});
I get this error:
error: bind message supplies 1 parameters, but prepared statement "" requires 0
I am using the node-postgres package in Node.js.
How can I solve this issue?
Use bracket notation instead of dot notation. So instead of obj.key use obj[key]
Updated
All the database driver connectors come with their own methods to do what you're looking for. node-postgres also has its own:
Pool
import { Pool } from 'pg';

const pool = new Pool({
  host: 'localhost',
  user: 'database-user',
  max: 20,
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 2000,
});

/**
 * Execs the given sql statement.
 *
 * @param {string} sql - query to run.
 * @param {Array} params - an array with the parameters.
 * @example
 * runQuery("SELECT * FROM users WHERE id = $1", [1]).then(result => console.log(result))
 */
export async function runQuery (sql, params) {
  const connection = await pool.connect();
  try {
    await connection.query('BEGIN');
    const result = await connection.query(sql, params);
    // check what result has
    console.log(result);
    await connection.query('COMMIT');
    return result;
  } catch (e) {
    await connection.query('ROLLBACK');
    throw e;
  } finally {
    connection.release();
  }
}
Pool Config
config = {
  // all valid client config options are also valid here
  // in addition here are the pool specific configuration parameters:

  // number of milliseconds to wait before timing out when connecting a new client
  // by default this is 0 which means no timeout
  connectionTimeoutMillis?: int,

  // number of milliseconds a client must sit idle in the pool and not be checked out
  // before it is disconnected from the backend and discarded
  // default is 10000 (10 seconds) - set to 0 to disable auto-disconnection of idle clients
  idleTimeoutMillis?: int,

  // maximum number of clients the pool should contain
  // by default this is set to 10.
  max?: int,
}
Conclusion
So basically the structure of a query should be more or less like this:
const text = 'INSERT INTO users(name, email) VALUES($1, $2) RETURNING *';
const values = ['brianc', 'brian.m.carlson@gmail.com'];

connection
  .query(text, values)
  .then(res => {
    console.log(res.rows[0]);
    // { name: 'brianc', email: 'brian.m.carlson@gmail.com' }
  })
  .catch(e => console.error(e.stack));
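Tying this back to the jsonpath query in the question: the $1 inside the quoted path string is parsed by Postgres as a jsonpath variable, not as a bind parameter, which is why the prepared statement reports zero parameters. One possible workaround (a sketch, assuming Postgres 12+ and the same items table) is to pass the value as a jsonpath variable through jsonb_path_exists:

// Sketch: the shift value travels as the jsonpath variable $shift via the vars argument,
// while $1 stays a normal node-postgres bind parameter outside the path string.
const query = `
  SELECT items.discount
  FROM items
  WHERE jsonb_path_exists(
    items.discount,
    '$[*] ? (@.discount[*].shift == $shift)',
    jsonb_build_object('shift', $1::text)
  )
`;
const obj = await pool.query(query, [data]); // inside the async route handler, as in the question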
Hello, I'm trying to fetch data from a SQL table, but the data that I want to check is in an array, so I need to compare the data to check whether a user is in the group. The array only has the IDs of users, and the specific ID that I want comes to me through the login.
This code is in TypeScript.
If you need more information, let me know, please.
class CompanyController {
  async consultCompanys(req: Request, res: Response) {
    let response: ResponseModel = new ResponseModel(ECodeResponse.Ok, "", []);
    const { UserId } = req.body;
    try {
      const Companies: any = await pool.query(
        `SELECT (CompanyId) From Companies Where Members = '${UserId}'`
      );
      response.Code = ECodeResponse.Ok;
      response.Message = EWarningMessage.Error;
      return res.json(response);
    } catch (error) {
      response.Code = ECodeResponse.Warning;
      response.Message = EWarningMessage.Error;
      return res.json(response);
    }
  }
}
I'm a little rusty with this kind of query.
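A minimal sketch of the membership check being described, assuming node-postgres and that Members is a Postgres array column (table and column names are taken from the question; if Members is stored differently, e.g. as a comma-separated string, the predicate changes):

// Sketch: parameterized membership test against an array column.
// $1 = ANY(Members) is true when UserId equals one of the array's elements,
// and the bind parameter avoids interpolating UserId into the SQL string.
const Companies = await pool.query(
  `SELECT CompanyId FROM Companies WHERE $1 = ANY(Members)`,
  [UserId]
);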
I would like to know, if I have a context variable like t.ctx.data, whether there is a way to write the value of t.ctx.data to the TestCafe JSON reporter (or any reporter).
My code:
// Called within Express.js by a request coming from req
const testMySite = (req, res) => {
  process.env.PARAMS = JSON.stringify(req.body);
  let testcafe = null;
  console.log(`Running test on ports 1341 and 1342`);
  createTestCafe('localhost', 1341, 1342, void 0, true)
    .then(tc => {
      testcafe = tc;
      const runner = testcafe.createRunner();
      return runner
        .src(`${path.dirname(__filename)}/tests/gisTest.js`)
        .browsers('firefox:headless')
        .reporter('json', 'report.json')
        .run();
    })
    .then(failedCount => {
      testcafe.close();
    });
  res.json({message: `Success! Scraper has begun to process ${req.body}`});
};
My test code:
import { ClientFunction, Selector } from 'testcafe';

const doc = process.env.PARAMS;
const newDoc = JSON.parse(process.env.PARAMS);
console.log(`newDoc (from test)`, newDoc);
// const _id = newDoc._id
let data = newDoc.mydata;

fixture `My Fixture`
  .page('https://www.mysite.co')
  .afterEach(async t => {
    // how do I get t.ctx.myData into the reporter??
    console.log(`t.ctx.myData: `, t.ctx.myData);
  });

test(`My Test`, async t => {
  const photoIcon = Selector('div#sbtc div.LM8x9c > span');
  const photoFieldForPaste = Selector('input#Ycyxxc');
  const searchByImageButton = Selector('td#aoghAf > input');
  const targetElement = Selector('div#jHnbRc span:nth-child(2) > a');
  await t
    .wait(1000)
    .click(photoIcon)
    .typeText(photoFieldForPaste, data, {paste: true})
    .click(searchByImageButton);
  if (await targetElement.exists && await targetElement.visible) {
    t.ctx.finalData = await targetElement.innerText;
  } else {
    t.ctx.finalData = null;
  }
});
Please see the part // how do I get t.ctx.myData into the reporter??.
I am assuming this is the only place where I could potentially get the data from the test into the reporter, but I'm not sure exactly how.
If you know how to get the t.ctx.myData variable shown in the above code written to the JSON reporter, I would highly appreciate it.
Even better would be a way to send the t.ctx.myData value into the response.
At present, you can only add static metadata to tests and fixtures. This metadata is available in reports. Please refer to the following article for details: https://devexpress.github.io/testcafe/documentation/guides/basic-guides/organize-tests.html#specify-test-metadata
As for sending dynamic data to the reporter, we keep this feature in mind; however, we cannot give any estimates on this. Please track the following issue: https://github.com/DevExpress/testcafe/issues/3584
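For reference, static metadata is attached at declaration time; a minimal sketch (the metadata keys used here are only illustrative):

fixture `My Fixture`
  .page('https://www.mysite.co')
  .meta({ env: 'staging' }); // fixture-level metadata, visible to reporters

test
  .meta('scraperRun', 'true') // test-level metadata; must be static, not computed during the run
  (`My Test`, async t => {
    // ...
  });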