I have a log stream in CloudWatch, as below, and I am trying to extract a value from the log:
START RequestId: 5ee6cb52-06d3-4552-858e-fd76c46e0e08 Version: $LATEST
[INFO] 2020-07-02T12:50:11.142Z 5ee6cb52-06d3-4552-858e-fd76c46e0e08 { KPI:{ AR:5 } }
END RequestId: 5ee6cb52-06d3-4552-858e-fd76c46e0e08
I am trying to write a MetricFilter in AWS CDK to pull the value 5 out of the log. The code for the filter is below:
const METRIC_AR_JSON_PATTERN = '$.KPI.AR'
new MetricFilter(this, 'DemoMetricFilter', {
  metricName: METRIC_NAME,
  metricNamespace: METRIC_NAMESPACE,
  logGroup: logGroup,
  filterPattern: FilterPattern.numberValue(METRIC_AR_JSON_PATTERN, '==', 5),
  metricValue: '5'
})
The MetricFilter is created successfully, but no log data is pulled. Please advise on what changes I need to make to the FilterPattern.
The equality operator is =, not ==, in the metric filter: docs and cdk-docs.
const METRIC_AR_JSON_PATTERN = '$.KPI.AR'
new MetricFilter(this, 'DemoMetricFilter', {
  metricName: METRIC_NAME,
  metricNamespace: METRIC_NAMESPACE,
  logGroup: logGroup,
  filterPattern: FilterPattern.numberValue(METRIC_AR_JSON_PATTERN, '=', 5),
  metricValue: '5'
})
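Separately, note that metricValue: '5' publishes the constant 5 whenever a line matches, rather than the value extracted from the log. If the goal is to publish whatever AR actually holds, the metric value can also be a JSON pointer into the log event. A minimal sketch, assuming the same METRIC_NAME, METRIC_NAMESPACE, and logGroup as above:

new MetricFilter(this, 'DemoMetricFilter', {
  metricName: METRIC_NAME,
  metricNamespace: METRIC_NAMESPACE,
  logGroup: logGroup,
  // match any event that carries a KPI.AR field at all
  filterPattern: FilterPattern.exists('$.KPI.AR'),
  // publish the matched field's value as the metric data point
  metricValue: '$.KPI.AR'
})

This way the metric reports the actual AR value instead of a hard-coded 5.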
I am unable to get parameters in my Lambda function. If I hard-code the parameter values in the Lambda, it works fine; when I remove the parameter values from the Lambda function and invoke it from API Gateway or a Lambda test, it processes the default parameter values. Please help.
My Lambda function is :
import boto3
import time
import json

datetime = time.strftime("%Y%m%d%H%M%S")
stackname = 'myec2'
client = boto3.client('cloudformation')

response = client.create_stack(
    StackName=(stackname + '-' + datetime),
    TemplateURL='https://testnaeem.s3.amazonaws.com/ec2tags.yaml',
    Parameters=[
        {
            "ParameterKey": "MyInstanceName",
            "ParameterValue": " "
        },
        {
            "ParameterKey": "MyInstanceType",
            "ParameterValue": " "
        }
    ]
)

def lambda_handler(event, context):
    return(response)
My CloudFormation template is:
---
Parameters:
  MyInstanceType:
    Description: Instance type description
    Type: String
  MyInstanceName:
    Description: Instance type description
    Type: String
Resources:
  MyInstance:
    Type: AWS::EC2::Instance
    Properties:
      AvailabilityZone: us-east-1a
      ImageId: ami-047a51fa27710816e
      InstanceType: !Ref MyInstanceType
      KeyName: miankeyp
      Tags:
        - Key: Name
          Value: !Ref MyInstanceName
        - Key: app
          Value: demo
Please help with what changes are required in the Lambda function.
My test values are:
{
    "MyInstanceName": "demott",
    "MyInstanceType": "t2.micro"
}
I modified the code of your lambda function. Please check comments in the code for clarification:
import boto3
import time
import json

datetime = time.strftime("%Y%m%d%H%M%S")
stackname = 'myec2'
client = boto3.client('cloudformation')

def lambda_handler(event, context):
    # Print the event to check what it actually is; it will be
    # written to CloudWatch Logs for your function. You have to
    # check what the event actually looks like and adjust
    # event['MyInstanceName'] and event['MyInstanceType']
    # in the following code.
    print(event)
    response = client.create_stack(
        StackName=(stackname + '-' + datetime),
        TemplateURL='https://testnaeem.s3.amazonaws.com/ec2tags.yaml',
        Parameters=[
            {
                "ParameterKey": "MyInstanceName",
                "ParameterValue": event['MyInstanceName']
            },
            {
                "ParameterKey": "MyInstanceType",
                "ParameterValue": event['MyInstanceType']
            }
        ]
    )
    return(response)
By the way, such a function combined with API Gateway can spin up a lot of EC2 instances very quickly, so be aware of this.
I have an array of users, as below:
let usersarr = ["'SAC_XSA_HDB_USER_ABC','SAC_XSA_HDB_USER_DEF'"]
I want to fetch data about the above users (if they exist) from a HANA database. I am using the sap-hdbext-promisfied library in Node.js.
My database connection is working fine, so I am trying to execute a select query as below:
async function readUsers(xsaDbConn) {
    try {
        let usersarr = ["'SAC_XSA_HDB_USER_ABC','SAC_XSA_HDB_USER_DEF'"]
        const checkuserexiststatement = await xsaDbConn.preparePromisified("SELECT USER_NAME FROM USERS WHERE USER_NAME IN (?)")
        let checkuserexistresult = await xsaDbConn.statementExecPromisified(checkuserexiststatement, [usersarr])
        console.log(checkuserexistresult)
        return checkuserexistresult
    } catch (err) {
        console.log(err)
        return;
    }
}
Below is the output I get
PS C:\Users\Documents\XSA\SAC_POC\cap_njs> npm start
> cap_njs@1.0.0 start C:\Users\Documents\XSA\SAC_POC\cap_njs
> node server.js
myapp is using Node.js version: v12.18.3
myapp listening on port 3000
[]
I get an empty array as output. This is not the expected output; instead, it should provide details about the users, as they exist in the database.
The above code works when I provide a single user value instead of an array of multiple users, as shown below:
async function readUsers(xsaDbConn, tempxsahdbusers) {
    try {
        let usersarr = 'SAC_XSA_HDB_USER_ABC'
        const checkuserexiststatement = await xsaDbConn.preparePromisified("SELECT USER_NAME FROM USERS WHERE USER_NAME IN (?)")
        let checkuserexistresult = await xsaDbConn.statementExecPromisified(checkuserexiststatement, [usersarr])
        console.log(checkuserexistresult)
        return checkuserexistresult
    } catch (err) {
        console.log(err)
        return;
    }
}
Output of the above code:
PS C:\Users\Documents\XSA\SAC_POC\cap_njs> npm start
> cap_njs@1.0.0 start C:\Users\Documents\XSA\SAC_POC\cap_njs
> node server.js
myapp is using Node.js version: v12.18.3
myapp listening on port 3000
[ 'SAC_XSA_HDB_USER_ABC' ]
So, why does it give an empty array when I provide an array as a parameter instead of a single value? Is it possible to provide an array as a parameter to the statementExecPromisified(statement, []) function of the sap-hdbext-promisfied library in Node.js?
Your
let usersarr = ["'SAC_XSA_HDB_USER_ABC','SAC_XSA_HDB_USER_DEF'"]
has exactly one element, the string:
"'SAC_XSA_HDB_USER_ABC','SAC_XSA_HDB_USER_DEF'"
When you pass usersarr to the statementExecPromisified function as a parameter, you are therefore passing an array nested inside another array. You could either try
xsaDbConn.statementExecPromisified(checkuserexiststatement, [usersarr[0]])
or separate the values in usersarr, add one ? per value to the prepared statement, and bind each value individually; a sketch of that approach follows.
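A minimal sketch of the second option, assuming the same connection object and that each user name is its own array element (the names are illustrative):

async function readUsers(xsaDbConn) {
    // One element per user, not one comma-separated string
    let usersarr = ['SAC_XSA_HDB_USER_ABC', 'SAC_XSA_HDB_USER_DEF']
    // Build one ? placeholder per element, e.g. "?, ?"
    let placeholders = usersarr.map(() => '?').join(', ')
    const stmt = await xsaDbConn.preparePromisified(
        'SELECT USER_NAME FROM USERS WHERE USER_NAME IN (' + placeholders + ')'
    )
    // Pass the flat array so each element binds to one placeholder
    return await xsaDbConn.statementExecPromisified(stmt, usersarr)
}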
I am building a Community Connector between Google Data Studio and SpyFu.com, in order to funnel SEO information for a specific url into the GDS Dashboard.
However, my getData() request only contains the first two fields from my schema. As you can see, I have four listed in the code, yet only the first two are printed to GDS.
I've been through tutorials, official documentation, and YouTube videos, looked this issue up on Google, and checked out the community resources on GitHub.
//Step Two: Define getConfig()
function getConfig(request) {
  var cc = DataStudioApp.createCommunityConnector();
  var config = cc.getConfig();

  config.newInfo()
    .setId('instructions')
    .setText('Give me SpyFu information on the following domain:');

  config.newTextInput()
    .setId('domain')
    .setName('Enter the domain to search')
    .setHelpText('e.g. ebay.com')
    .setPlaceholder('ebay.com');

  config.newTextInput()
    .setId('SECRET_KEY')
    .setName('Enter your API Secret Key')
    .setHelpText('e.g. A1B2C3D4')
    .setPlaceholder('A1B2C3D4');

  config.setDateRangeRequired(false);

  return config.build();
}
//Step Three: Define getSchema()
function getFields(request) {
  var cc = DataStudioApp.createCommunityConnector();
  var fields = cc.getFields();
  var types = cc.FieldType;
  var aggregations = cc.AggregationType;

  fields.newDimension()
    .setId('Keyword')
    .setName('Keywords')
    .setDescription('The keywords most often attributed to this domain.')
    .setType(types.TEXT);

  fields.newMetric()
    .setId('Rank')
    .setName('Rankings')
    .setDescription('The ranking of the target site keyword on the Google Search Page.')
    .setType(types.NUMBER);

  fields.newMetric()
    .setId('Local_Monthly_Searches')
    .setName('Local Searches per Month')
    .setDescription('Number of times, locally, that people have searched for this term within the last month.')
    .setType(types.NUMBER);

  fields.newMetric()
    .setId('Global_Monthly_Searches')
    .setName('Global Searches per Month')
    .setDescription('Number of times, globally, that people have searched for this term within the last month.')
    .setType(types.NUMBER);

  return fields;
}

function getSchema(request) {
  var fields = getFields(request).build();
  return { schema: fields };
}
//Step Four: Define getData()
function responseToRows(requestedFields, response, domain) {
  // Transform parsed data and filter for requested fields
  return response.map(function(Array) {
    var row = [];
    requestedFields.asArray().forEach(function (field) {
      switch (field.getId()) {
        case 'Keyword':
          return row.push(Array.term);
        case 'Rank':
          return row.push(Array.position);
        case 'Local_Monthly_Searches':
          return row.push(Array.exact_local_monthly_search_volume);
        case 'Global_Monthly_Searches':
          return row.push(Array.exact_global_monthly_search_volume);
        case 'domain':
          return row.push(domain);
        default:
          return row.push('');
      }
    });
    return { values: row };
  });
}
function getData(request) {
  console.log("Request from Data Studio");
  console.log(request);

  var requestedFieldIds = request.fields.map(function(field) {
    return field.name;
  });
  var requestedFields = getFields().forIds(requestedFieldIds);

  // Fetch data from API
  var url = [
    'https://www.spyfu.com/apis/url_api/organic_kws?q='
    + request.configParams.domain
    + '&r=20'
    + '&p=[1 TO 10]'
    + '&api_key='
    + request.configParams.SECRET_KEY,
  ];

  try {
    var response = UrlFetchApp.fetch(url.join(''));
  } catch (e) {
    DataStudioApp.createCommunityConnector()
      .newUserError()
      .setDebugText('Failed URL Fetch Attempt. Exception details: ' + e)
      .setText('There was an error accessing this domain. Try again later, or file an issue if this error persists.')
      .throwException();
  }

  console.log("Response from API");
  console.log(response);

  //Parse data from the API
  try {
    var parsedResponse = JSON.parse(response);
  } catch (e) {
    DataStudioApp.createCommunityConnector()
      .newUserError()
      .setDebugText('Error parsing the JSON data. Exception details: ' + e)
      .setText('There was an error parsing the JSON data. Try again later, or file an issue if this error persists.')
      .throwException();
  }

  var rows = responseToRows(requestedFields, parsedResponse);

  return {
    schema: requestedFields.build(),
    rows: rows
  };
}
I need GDS to post four columns of data: "Keyword", "Rank", "Local Monthly Searches", and "Global Monthly Searches".
I cannot figure out how to create a "fixed schema" so that the system always prints these four columns of data at every request. The tutorials and various documentation say it's possible, but not how to do it. Please help!
The number of metrics initially called up by a Community Connector is handled from the front end, via Google Data Studio.
The back-end system (the connector) initially posts only the default dimension and default metric. Getting the remaining fields to appear is handled while building a report in Google Data Studio: click on the data set, select "Data" in the right-hand menu, scroll down to either Metrics or Dimensions, and pick the ones you wish to add to the current set.
Note that these are the fields you established earlier in the coding process, when you were setting up your schema.
Here, you're filtering your defined schema for fields that are present on the request object received by getData().
var requestedFieldIds = request.fields.map(function(field) {
  return field.name;
});
var requestedFields = getFields().forIds(requestedFieldIds);
The visualization in Google Data Studio that is the catalyst for the request will determine which fields are requested.
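For illustration, a getData() request coming from a table that plots only two of the four fields would carry something like the following (hypothetical values), so only those two survive the forIds() filter:

// Hypothetical getData() request from a chart that uses two fields
var request = {
  configParams: { domain: 'ebay.com', SECRET_KEY: 'A1B2C3D4' },
  fields: [
    { name: 'Keyword' },
    { name: 'Rank' }
  ]
};
// getFields().forIds(['Keyword', 'Rank']) then keeps just those two,
// which is why the other metrics never show up in the response.

Adding the other fields to the chart in Data Studio makes them appear in request.fields, and from there in the response.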
I followed this sample
https://cloud.google.com/bigquery/docs/exporting-data
public function exportDailyRecordsToCloudStorage($date, $tableId)
{
    $validTableIds = ['table1', 'table2'];
    if (!in_array($tableId, $validTableIds)) {
        die("Wrong TableId");
    }

    $date = date("Ymd", date(strtotime($date)));
    $datasetId = $date;
    $dataset = $this->bigQuery->dataset($datasetId);
    $table = $dataset->table($tableId);

    // load the storage object
    $storage = $this->storage;
    $bucketName = 'mybucket';
    $objectName = "daily_records/{$tableId}_" . $date;
    $destinationObject = $storage->bucket($bucketName)->object($objectName);

    // create the import job
    $format = 'NEWLINE_DELIMITED_JSON';
    $options = ['jobConfig' => ['destinationFormat' => $format]];
    $job = $table->export($destinationObject, $options);

    // poll the job until it is complete
    $backoff = new ExponentialBackoff(10);
    $backoff->execute(function () use ($job) {
        print('Waiting for job to complete' . PHP_EOL);
        $job->reload();
        if (!$job->isComplete()) {
            //throw new Exception('Job has not yet completed', 500);
        }
    });

    // check if the job has errors
    if (isset($job->info()['status']['errorResult'])) {
        $error = $job->info()['status']['errorResult']['message'];
        printf('Error running job: %s' . PHP_EOL, $error);
    } else {
        print('Data exported successfully' . PHP_EOL);
    }
}
I have 37670 rows in my table1, and the Cloud Storage file has 37671 lines.
And I have 388065 rows in my table2, and the Cloud Storage file has 388066 lines.
The last line in both Cloud Storage files is an empty line.
Is this a Google BigQuery feature improvement request, or did I do something wrong in my code above?
What you described seems like an unexpected outcome. The output file should generally have the same number of lines as the source table.
Your PHP code looks fine and shouldn't be the cause of the issue.
I'm trying to reproduce it but am unable to. Could you double-check whether the last empty line is somehow added by another tool, like a text editor? How are you counting the lines of the resulting output?
If you have ruled that out and are sure the newline is indeed added by BigQuery export feature, please consider opening a bug using the BigQuery Issue Tracker as suggested by xuejian and include your job ID so that we can investigate further.
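One counting pitfall worth ruling out (an assumption about how the lines were counted, not something confirmed here): a NEWLINE_DELIMITED_JSON export normally terminates every record, including the last one, with \n, and naively splitting on \n turns that final terminator into a phantom empty line. A quick sketch of the effect:

// Two newline-terminated records, as a NEWLINE_DELIMITED_JSON file stores them
var contents = '{"a":1}\n{"a":2}\n';
var parts = contents.split('\n');
console.log(parts.length);     // 3 -- the last element is '' (the "extra line")
var records = parts.filter(function (line) { return line.length > 0; });
console.log(records.length);   // 2 -- the actual record count

If your line counter behaves like this split, the file may contain exactly one \n per row and no extra data at all.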
The code:
this.sendOperations = function () {
    var operation = {
        deviceId: '12161',
        com_cumulocity_model_WebCamDevice: {
            name: 'take picture',
            parameters: {
                duration: '5s',
                quality: 'HD'
            }
        }
    };
    c8yDeviceControl.create(operation);
};
Result:
a new operation is created on the Cumulocity server, but in the meantime the Chrome browser on which the app is running reports the following errors, although it looks like the app is still running after that:
angular.js:9997 TypeError: Cannot read property 'match' of null
at k (deviceControl.js:267)
at wrappedCallback (angular.js:11498)
at wrappedCallback (angular.js:11498)
at angular.js:11584
at Scope.$eval (angular.js:12608)
at Scope.$digest (angular.js:12420)
at Scope.$apply (angular.js:12712)
at done (angular.js:8315)
at completeRequest (angular.js:8527)
at XMLHttpRequest.xhr.onreadystatechange (angular.js:8466)
Any suggestions? Thanks,
D. Chen