I currently have a Node application where I am trying to use the getMetricStatistics function of the AWS CloudWatch SDK to retrieve data from a metric into my application.
In order to troubleshoot this, I have run the listMetrics function as follows:
var params = {
  MetricName: 'Open',
  Namespace: 'AWS/SES',
};
cloudwatch.listMetrics(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data); // successful response
});
After running the above code, I get output written via console.log:
After this I try to run the getMetricStatistics function as follows:
var cloudwatch = new aws.CloudWatch({apiVersion: '2010-08-01'});
var params = {
  EndTime: new Date(2022,1,31), /* required */
  MetricName: 'Open', /* required */
  Namespace: 'AWS/SES', /* required */
  Period: '3600', /* required */
  StartTime: new Date(2022,1,27), /* required */
  Dimensions: [
    {
      Name: 'test-open-key', /* required */
      Value: 'test-open-value' /* required */
    },
    /* more items */
  ],
  Statistics: [
    'Average',
    /* more items */
  ]
};
cloudwatch.getMetricStatistics(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data); // successful response
});
However, the output of the above code is as follows, showing no datapoints:
I based my input parameters on what I got from listMetrics, and in my console I can see the following graph, meaning there should be at least one datapoint retrieved on the 29th of January.
Would anyone be able to advise on what I'm doing wrong/any further avenues for troubleshooting?
When creating the Date object, the month is 0-indexed (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date#several_ways_to_create_a_date_object).
Change your time values like this:
StartTime: new Date(2022,0,27),
EndTime: new Date(2022,0,31)
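For illustration, here is the rollover that makes the original values miss the data; a quick sketch you can run in the Node REPL (the dates in the comments are what these inputs actually produce):
// Months are 0-indexed: 0 = January, 1 = February, ...
new Date(2022, 1, 31).toDateString(); // 'Thu Mar 03 2022' (February 2022 has 28 days, so day 31 rolls over into March)
new Date(2022, 0, 31).toDateString(); // 'Mon Jan 31 2022'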
I have a table called Logs that contains a column called process, whose data I want to update by running a migration file. The values have the structure "worker + ID", or they may contain "master". What I want to do is replace only the word "worker" with "subordinate process" and keep the ID, or, when the value is "master", replace it with "leading process".
My code runs, but it completely replaces "worker + ID" with "subordinate process". Does anyone know how to implement the replace function, or can anyone help me refactor this?
async migrateTenant(tenantID: string): Promise<void> {
  const transaction: Transaction = HDBUtils.createTransaction(tenantID);
  // @ts-ignore
  const { Logs } = cds.entities(Constants.CDS_NAMESPACE);
  const substitutions = [
    { src: 'master', dst: Constants.LEADING_PROCESS },
    { src: 'worker', dst: Constants.SUBORDINATE_PROCESS }
  ];
  let updated = 0;
  for (const substitution of substitutions) {
    try {
      // Replaces the whole column value, not just the matched word
      await transaction.run(UPDATE.entity(Logs).set({ process: substitution.dst }).where({ process: { like: substitution.src } }));
      await transaction.commit();
      updated++;
    } catch (error) {
      await HDBUtils.handleHDBError(MODULE_NAME, 'migrateTenant', tenantID, transaction, 'Unable to migrate data', error);
    }
  }
}
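For reference, one way to get the partial replacement is to push it down to the database instead of setting a constant. A minimal, hypothetical sketch using native SQL through the transaction, assuming the underlying database (SAP HANA, given HDBUtils) provides REPLACE() and that the physical table and column names match the entity:
// Hypothetical sketch, not the original migration code: REPLACE() swaps only
// the matched word and leaves the rest of the value (the ID) intact.
await transaction.run(
  `UPDATE Logs SET process = REPLACE(process, 'worker', 'subordinate process') WHERE process LIKE 'worker%'`
);
await transaction.run(
  `UPDATE Logs SET process = 'leading process' WHERE process = 'master'`
);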
I have a workflow action script that is supposed to search for a string in an email message body (the string being the document number, which is stored in a field with the id 'custevent_case_creation') and return the transaction record id.
The script:
/**
 * @NApiVersion 2.x
 * @NScriptType WorkflowActionScript
 * @param {Object} context
 */
define(["N/search", "N/record"], function (search, record) {
  function onAction(context) {
    var recordObj = context.newRecord;
    var oc_number = recordObj.getValue({ fieldId: "custevent_case_creation" });
    var s = search
      .create({
        type: "salesorder",
        filters: [
          search.createFilter({
            name: "tranid",
            operator: search.Operator.IS,
            values: [oc_number],
          }),
        ],
        columns: ["internalid"],
      })
      .run()
      .getRange({
        start: 0,
        end: 1,
      });
    log.debug("result set", s[0].id);
    return s[0].id;
  }
  return {
    onAction: onAction,
  };
});
This works as expected when there is a valid document number used in the email message.
However, there are two scenarios where that won't be the case:
there is no document number referenced in the original email (and therefore, the field "custevent_case_creation" will be blank)
the document number referenced is incorrect and there is no transaction with that document number in the system
I am trying to add some form of error handling to deal with these two scenarios, though I can't find anything that works. Where should the error handling be in this script?
Should it be an if/else statement?
So far I have tried:
adding if (s.length > 0)
adding a condition in the workflow itself so that the custom action from the workflow action script doesn't occur if the field for custevent_case_creation is blank
The error message I am getting is:
org.mozilla.javascript.EcmaError: TypeError: Cannot read property "id" from undefined (/SuiteScripts/sdf_ignore/Workflow Action Lookup SO.js#41)
EDIT:
The working code
/**
 * @NApiVersion 2.x
 * @NScriptType WorkflowActionScript
 * @param {Object} context
 */
define(["N/search", "N/record"], function (search, record) {
  function onAction(context) {
    try {
      var recordObj = context.newRecord;
      var oc_number = recordObj.getValue({
        fieldId: "custevent_case_creation",
      });
      var s = search
        .create({
          type: "salesorder",
          filters: [
            search.createFilter({
              name: "tranid",
              operator: search.Operator.IS,
              values: [oc_number],
            }),
          ],
          columns: ["internalid"],
        })
        .run()
        .getRange({
          start: 0,
          end: 1,
        });
      log.debug("result set", s[0].id);
      return s[0].id;
    } catch (error) {
      log.debug(
        error.name,
        "recordObjId: " + recordObj.id + ", oc_number:" + oc_number + ", message: " + error.message
      );
    }
  }
  return {
    onAction: onAction,
  };
});
Try wrapping the contents of your onAction function in try/catch. More info on try/catch can be found on W3Schools.
try {
  // your working code for the onAction function
  var recordObj = context.newRecord;
  var oc_number = recordObj.getValue({ fieldId: "custevent_case_creation" });
  var s = search.create({
    type: "salesorder",
    filters: [
      search.createFilter({
        name: "tranid",
        operator: search.Operator.IS,
        values: [oc_number]
      })
    ],
    columns: ["internalid"]
  }).run().getRange({
    start: 0,
    end: 1,
  });
  log.debug("result set", s[0].id);
  return s[0].id;
} catch (e) {
  log.debug(e.name, 'recordObjId: ' + recordObj.id + ', oc_number:' + oc_number + ', message: ' + e.message); // if e.name is empty, try e.title
  // you can add additional steps here if desired, i.e. send an email, display an alert, etc.
}
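For completeness, the two failure scenarios can also be handled with explicit guards instead of (or alongside) try/catch. A minimal sketch based on the code above, assuming that returning an empty string is an acceptable "no result" value for the workflow:
function onAction(context) {
  var recordObj = context.newRecord;
  var oc_number = recordObj.getValue({ fieldId: "custevent_case_creation" });

  // Scenario 1: no document number was referenced in the original email
  if (!oc_number) {
    return '';
  }

  var s = search.create({
    type: "salesorder",
    filters: [
      search.createFilter({
        name: "tranid",
        operator: search.Operator.IS,
        values: [oc_number]
      })
    ],
    columns: ["internalid"]
  }).run().getRange({ start: 0, end: 1 });

  // Scenario 2: no transaction with that document number exists
  if (!s || s.length === 0) {
    return '';
  }

  log.debug("result set", s[0].id);
  return s[0].id;
}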
I'm using a Google Cloud Function to execute a query on BigQuery and store the result in Firestore.
My problem is that as soon as I try to use the Firestore batch object, the Cloud Function stops executing.
By bisecting the code, I think it's when I include the batch object code that the function suddenly stops working.
I've tried increasing the memory of the function to 1GB, without luck (currently it's using 128MB).
const {BigQuery} = require('@google-cloud/bigquery');
const {Firestore} = require('@google-cloud/firestore');
const bigquery = new BigQuery();
const firestore = new Firestore();
const fsCollectionName = 'ul_queteur_stats_per_year';
const queryStr = "the bigquery query";

function handleError(err){
  //skipped
}

/**
 * Triggered from a message on a Cloud Pub/Sub topic.
 *
 * @param {!Object} event Event payload.
 * @param {!Object} context Metadata for the event.
 */
exports.ULQueteurStatsPerYear = (event, context) => {
  const pubsubMessage = event.data;
  const parsedObject = JSON.parse(Buffer.from(pubsubMessage, 'base64').toString());
  console.log("Received Message : "+JSON.stringify(parsedObject));
  //{ ul_id:parsedObject.ul_id }
  const queryObj = {
    query: queryStr,
    params: {
      ul_id: parsedObject.ul_id
    }
  };
  bigquery
    .query(queryObj)
    .then((data) => {
      console.log("Query Successful, # rows : "+data.length+" data[0].length:"+data[0].length);
      //rows : [{"amount":367.63,"weight":2399.3,"time_spent_in_minutes":420}]
      const rows = data[0];
      console.log("Query Successful");
      const batch = firestore.batch();
      console.log("Batch Created");
      console.log("Getting Collection");
      const collection = firestore.collection(fsCollectionName);
      console.log("Collection '"+fsCollectionName+"' retrieved");
      //#####################################
      for (let i = 0; i < rows.length; i++)
      {
        console.log("getting a new DocId");
        const docRef = collection.doc();
        console.log("Adding to docRef='"+docRef.id+"' : "+JSON.stringify(rows[i]));
        batch.set(docRef, rows[i]);
        console.log("Added to batch");
      }
      console.log("Committing batch insert");
      batch.commit().then(() => {
        console.log('Successfully executed batch');
      });
      //#####################################
    })
    .catch(err => {
      handleError(err);
    });
};
Expected:
data inserted in Firestore
Actual result:
If I remove the code between the //##################################### markers, I get each log in Stackdriver (the first one saying there are 420 rows).
If I leave the code between the markers in place (or just the batch.commit() part, or just the for loop part), I only get the first log, and then nothing:
Query Successful, # rows : 1 data[0].length:420
Even if I put the whole code in a try/catch block with a console.log of the exception, I see no error in Stackdriver.
Solution
The solution is to return the BigQuery promise, so the above code should be changed to:
return bigquery
  .query(queryObj)
  .then(...);
Thanks Doug for the help!
You need to return a promise that resolves when all the asynchronous work is complete. Right now, you're returning nothing, which means the function will terminate and shut down almost immediately, before your query is done.
You'll need to pay attention to all the promises that your code is using, including the query, and all the batch commits. You can't ignore any promise returned by any API, else the work will be terminated before it's done.
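Applied to the code from the question, a sketch of the fully chained version (same variable names as above; note that the inner batch.commit() promise is returned as well, so it becomes part of the chain):
exports.ULQueteurStatsPerYear = (event, context) => {
  const parsedObject = JSON.parse(Buffer.from(event.data, 'base64').toString());
  const queryObj = { query: queryStr, params: { ul_id: parsedObject.ul_id } };

  // Returning the whole chain keeps the function alive until everything settles.
  return bigquery
    .query(queryObj)
    .then((data) => {
      const rows = data[0];
      const batch = firestore.batch();
      const collection = firestore.collection(fsCollectionName);
      rows.forEach((row) => batch.set(collection.doc(), row));
      return batch.commit(); // returned too, so the commit finishes before shutdown
    })
    .catch(handleError);
};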
I'm using the Google Cloud BigQuery service with the Node.js client, version 1.0.x. I created a job successfully with the createQueryJob() function. After that, I call getQueryResults() from the createQueryJob() callback, such as:
const options = {
  query: sqlQuery,
  useLegacySql: true,
  dryRun: true
};
// this.bigquery is a constructor instance:
// this.bigquery = new BigQuery({
//   projectId: this.projectId,
//   keyFilename: this.keyFile,
// });
this.bigquery.createQueryJob(options, function (err, job) {
  if (!err) {
    // job id such as 731bf23b-5044-4842-894b-4d9f77485d9b
    function manualPaginationCallback(err, rows, nextQuery, apiResponse) {
      if (nextQuery) {
        job.getQueryResults(nextQuery, manualPaginationCallback);
      } else {
        return Promise.resolve(rows);
      }
    }
    return job.getQueryResults({
      maxResults: 100000,
      autoPaginate: false,
      // timeoutMs: 60000
    }, manualPaginationCallback);
  }
});
But it throws an error:
{"error":{"code":404,"message":"Not found: Job
[myProjectId]:731bf23b-5044-4842-894b-4d9f77485d9b","errors":[{"message":"Not
found: Job
[myProjectId]:731bf23b-5044-4842-894b-4d9f77485d9b","domain":"global","reason":"notFound"}],"status":"NOT_FOUND"}}
References:
https://cloud.google.com/nodejs/docs/reference/bigquery/1.0.x/BigQuery#createQueryJob
https://cloud.google.com/nodejs/docs/reference/bigquery/1.0.x/Job#getQueryResults
What am I doing wrong? Any help appreciated. Thank you!
You're setting the dryRun option in your request, which only validates the job but doesn't actually run the query. Dry-run jobs don't persist, which is why you get "not found" on the subsequent request.
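In other words, the fix is simply to drop (or disable) that flag; a minimal sketch of the corrected options object from the question:
const options = {
  query: sqlQuery,
  useLegacySql: true
  // dryRun: true  // removed: a dry run only validates the query;
  //               // no job is persisted, so getQueryResults() finds nothing
};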
I'm having an issue with an array that seems to be getting populated by my Mongoose code by itself. This makes it impossible to populate the array with modified values.
Here's the code:
router.get('/in-progress', function(req, res) {
  console.log('exporting');
  var dataset = [];
  Intake.find({}, function(err, intakes) {
    if (err) {
      console.log(err);
    } else {
      /*intakes.forEach(function(intake) {
        dataset.push(
          {
            //requestName: intake.requestName,
            requestName: 'Request Name',
            status: intake.phase
          }
        )
      });*/
      return dataset;
    }
  }).then((dataset) => {
    console.log(dataset);
    const report = excel.buildExport(
      [
        {
          heading: inProgressHeading,
          specification: inProgressSpec,
          data: dataset
        }
      ]
    );
    res.attachment('requests-in-progress.xlsx');
    return res.send(report);
  });
});
As you can see, the logic to push data to "dataset" is commented out, but the console log is logging every Intake that I have in the MongoDB database. Does anyone know what I am doing wrong so that I can push my own values into "dataset"?
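For what it's worth, the dataset logged in .then() is most likely not the outer array at all: Intake.find({}) returns a promise that resolves with the found documents, and the (dataset) => parameter shadows the outer variable, so the log shows every Intake regardless of the commented-out push logic. A minimal sketch of one way to build the export from the resolved documents instead, keeping the same models and export helpers as assumptions from the question:
router.get('/in-progress', function(req, res) {
  Intake.find({}).then(function(intakes) {
    // Build the dataset explicitly from the documents the query resolved with
    var dataset = intakes.map(function(intake) {
      return {
        //requestName: intake.requestName,
        requestName: 'Request Name',
        status: intake.phase
      };
    });
    console.log(dataset);
    const report = excel.buildExport([
      {
        heading: inProgressHeading,
        specification: inProgressSpec,
        data: dataset
      }
    ]);
    res.attachment('requests-in-progress.xlsx');
    return res.send(report);
  }).catch(function(err) {
    console.log(err);
  });
});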