Update BigQuery scheduled query with notificationPubsubTopic fails - google-bigquery

I am using the DataTransferServiceClient API/SDK for Node to create scheduled queries in BigQuery with a notificationPubsubTopic. Creating them works fine, no issues. Updating them results in an error:
INVALID_ARGUMENT: notificationPubsubTopic cannot be updated.
How I'm calling it:
const config = {
  transferConfig: {
    /* other config options */
    notificationPubsubTopic: "projects/engineering/topics/test"
  },
  updateMask: {
    paths: [
      "params.query",
      "params.write_disposition",
      "params.destination_table_name_template",
      "schedule",
      "notificationPubsubTopic"
    ],
  },
}

dataTransferClient.updateTransferConfig(config)
Some other info:
The topics I've tested with do exist. I can update the scheduled query in the UI to these other topics with no issue.
It fails even when re-using the already associated topic.
Updates without notificationPubsubTopic succeed. By this I specifically mean I am not passing the notificationPubsubTopic property and have removed it from the updateMask.

The paths in the updateMask needed to be written in snake_case.
updateMask: {
  paths: [
    "params.query",
    "params.write_disposition",
    "params.destination_table_name_template",
    "schedule",
    "notification_pubsub_topic" // <--- here
  ],
},
The documentation even shows an example using camelCase:
https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.transferConfigs/patch#body.QUERY_PARAMETERS.update_mask
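For completeness, here is roughly what the corrected call looks like end to end. This is only a sketch assuming the @google-cloud/bigquery-data-transfer Node client; the config name, project, and topic below are placeholders, not values from the question.

const {DataTransferServiceClient} = require('@google-cloud/bigquery-data-transfer');

const dataTransferClient = new DataTransferServiceClient();

async function updateScheduledQuery() {
  const [updated] = await dataTransferClient.updateTransferConfig({
    transferConfig: {
      // Resource name of the existing scheduled query (placeholder).
      name: 'projects/engineering/locations/us/transferConfigs/1234567890',
      notificationPubsubTopic: 'projects/engineering/topics/test',
    },
    updateMask: {
      // Field paths must use the snake_case proto field names.
      paths: ['notification_pubsub_topic'],
    },
  });
  console.log('Updated scheduled query:', updated.name);
}

updateScheduledQuery().catch(console.error);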

Related

How to run a SQL query in Cloud Formation template to enable Delayed_Durability in AWS RDS

I have a CloudFormation template to create a SQL Server DB in RDS and want to enable the DELAYED_DURABILITY feature by default by running this query:
ALTER DATABASE dbname SET DELAYED_DURABILITY = FORCED;
Is there a way to run this query right after the DB instance is created through the CF template?
My CF template looks like this:
"Type":"AWS::RDS::DBInstance",
"Properties":{
"AllocatedStorage":"200",
"AutoMinorVersionUpgrade":"false",
"BackupRetentionPeriod":"1",
"DBInstanceClass":"db.m4.large",
"DBInstanceIdentifier":"mydb",
"DBParameterGroupName": {
"Ref": "MyDBParameterGroup"
},
"DBSubnetGroupName":{
"Ref":"dbSubnetGroup"
},
"Engine":"sqlserver-web",
"EngineVersion":"13.00.4422.0.v1",
"LicenseModel":"license-included",
"MasterUsername":"prod_user",
"MasterUserPassword":{ "Ref" : "dbpass" },
"MonitoringInterval":"60",
"MonitoringRoleArn": {
"Fn::GetAtt": [
"RdsMontioringRole",
"Arn"
]
},
"PreferredBackupWindow":"09:39-10:09",
"PreferredMaintenanceWindow":"Sun:08:58-Sun:09:28",
"PubliclyAccessible": false,
"StorageType":"gp2",
"StorageEncrypted": true,
"VPCSecurityGroups":[
{
"Fn::ImportValue":{
"Fn::Sub":"${NetworkStackName}-RDSSecGrp"
}
}
],
"Tags":[
{
"Key":"Name",
"Value":"my-db"
}
]
}
}
Is there a way to run this query right after the DB instance is created through the CF template?
It depends. If you want to do it from within CloudFormation (CFN), then sadly you can't do this using plain CFN. To do it from CFN, you would have to develop a custom resource. The resource would take the form of a Lambda function. You would pass the DB details to the function in your CFN, and it would run your query. It could also return any results you want to your CFN for further use.
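If you go the custom resource route, a rough sketch of such a function might look like this. It assumes a Node.js Lambda with the mssql npm package bundled; DbHost, DbUser, DbPassword, and DbName are hypothetical custom resource properties you would pass in from the template.

// Hypothetical Lambda backing a CloudFormation custom resource.
// Runs the ALTER DATABASE statement on stack creation, then reports back to CFN.
const https = require('https');
const sql = require('mssql'); // assumed to be bundled with the function

exports.handler = async (event, context) => {
  let status = 'SUCCESS';
  let reason = '';
  try {
    if (event.RequestType === 'Create') {
      // DB endpoint, credentials and database name arrive as custom resource properties.
      const props = event.ResourceProperties;
      const pool = await sql.connect({
        server: props.DbHost,
        user: props.DbUser,
        password: props.DbPassword,
        options: { encrypt: true },
      });
      await pool.request()
        .query(`ALTER DATABASE [${props.DbName}] SET DELAYED_DURABILITY = FORCED;`);
      await pool.close();
    }
  } catch (err) {
    status = 'FAILED';
    reason = err.message;
  }
  // Every invocation of a custom resource must report a result to CloudFormation.
  await sendResponse(event, context, status, reason);
};

function sendResponse(event, context, status, reason) {
  const body = JSON.stringify({
    Status: status,
    Reason: reason || 'See CloudWatch logs',
    PhysicalResourceId: context.logStreamName,
    StackId: event.StackId,
    RequestId: event.RequestId,
    LogicalResourceId: event.LogicalResourceId,
  });
  const parsed = new URL(event.ResponseURL);
  return new Promise((resolve, reject) => {
    const req = https.request(
      {
        hostname: parsed.hostname,
        path: parsed.pathname + parsed.search,
        method: 'PUT',
        headers: { 'Content-Type': '', 'Content-Length': Buffer.byteLength(body) },
      },
      resolve
    );
    req.on('error', reject);
    req.write(body);
    req.end();
  });
}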
In contrast, if you create your CFN stack using the AWS CLI or SDK, then once the create-stack call has completed, you can run your query from bash or whatever programming language you use to deploy your stack.
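For that route, a minimal sketch in Node.js might look like the following (assuming the AWS SDK for JavaScript v2 and the mssql package; the stack name, region, and endpoint are placeholders):

const AWS = require('aws-sdk');
const sql = require('mssql'); // assumed; any SQL Server client would do

const cfn = new AWS.CloudFormation({ region: 'us-east-1' });

async function configureAfterDeploy() {
  // Wait for the stack created via createStack / `aws cloudformation create-stack` to finish.
  await cfn.waitFor('stackCreateComplete', { StackName: 'my-db-stack' }).promise();

  // Then connect to the new instance and enable delayed durability.
  const pool = await sql.connect({
    server: 'mydb.xxxxxxxx.us-east-1.rds.amazonaws.com', // placeholder endpoint
    user: 'prod_user',
    password: process.env.DB_PASSWORD,
    options: { encrypt: true },
  });
  await pool.request().query('ALTER DATABASE dbname SET DELAYED_DURABILITY = FORCED;');
  await pool.close();
}

configureAfterDeploy().catch(console.error);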

AWS Cognito User Migration - Exception during user migration

I have created a user pool and am trying to migrate users from RDS, which invokes a Lambda function that returns the updated event object, but it's not working for me.
I have followed the provided solution by removing the two fields below, but it's still not working:
"desiredDeliveryMediums": "EMAIL",
"forceAliasCreation": "false"
Here is the response object that I am sending from the Lambda. I'm still facing the same issue - Exception during user migration.
Please let me know what I am missing here. Thanks in advance.
def lambda_handler(event, context):
    print event
    event["response"] = {
        "userAttributes": {
            "email": event["userName"],
            "email_verified": "true",
        },
        "finalUserStatus": "CONFIRMED",
        "messageAction": "SUPPRESS",
        "desiredDeliveryMediums": "EMAIL",
        "forceAliasCreation": "false"
    }
    print event
    return event
I was having this problem, and I overcame it by increasing the memory allocated to the Lambda from the default 128 MB to 1024 MB. I am using CDK to deploy, so I did this in the Lambda creation:
const nodeUserMigration = new NodejsFunction(this, 'myLambdaName', {
  entry: path.join(__dirname, 'userMigration.ts'),
  runtime: Runtime.NODEJS_18_X,
  timeout: Duration.minutes(5),
  memorySize: 1024, // This is what I added to overcome the `UserNotFoundException: Exception migrating user in app client (redactedClientId)`
  environment: {
    // redacted environment variables
  },
});
Instead of
return event
You need
context.succeed(event)
It is probably possible to use return event directly; however, there would be other properties required to get Cognito to recognize it (things such as isBase64Encoded), and I don't know what they might be. Nor does Amazon have any documentation on them.
Oh, and desiredDeliveryMediums should be an array of strings.
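Putting those two points together, here is a minimal sketch of the corrected response shape as a Node.js migration trigger (a hedged illustration rather than the poster's Python handler; the attribute values are placeholders):

// Minimal Cognito user migration trigger sketch (Node.js).
exports.handler = (event, context) => {
  event.response = {
    userAttributes: {
      email: event.userName,
      email_verified: 'true',
    },
    finalUserStatus: 'CONFIRMED',
    messageAction: 'SUPPRESS',
    desiredDeliveryMediums: ['EMAIL'], // an array of strings, not a single string
    forceAliasCreation: false,
  };
  // Hand the event back via the context rather than a bare return.
  context.succeed(event);
};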

Creating a titled Google Sheet results in a "Proto field" error when using the NodeJS client library

I am trying to create a Google Spreadsheet using a NodeJs backend and the Google Sheets v4 API.
I was following the spreadsheets.create tutorial in the documentation. However, when I create the file with some specified properties, I always get the following error:
Error: Invalid JSON payload received. Unknown name "title" at 'spreadsheet.properties': Proto field is not repeating, cannot start list.
Nothing is mentioned in the tutorial about a "Proto" field. Is this a bug or am I missing something?
Creating the file does work if I don't specify properties. However, the properties are used to set a name for the file and the sheets, so I do need a way to set this metadata.
Here is the request I am sending with the properties included:
const request = {
  auth,
  resource: {
    properties: {
      title: name,
    },
    sheets: [
      {
        properties: {
          title: 'General',
        },
      },
    ],
  },
};
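For context, a minimal sketch of how such a request is typically passed to the Node.js client (assuming the googleapis package; this only illustrates the call shape and does not by itself resolve the Proto field error):

const { google } = require('googleapis');

async function createSpreadsheet(auth, name) {
  const sheets = google.sheets({ version: 'v4', auth });
  // The spreadsheet body goes in the resource/requestBody of the create call.
  const res = await sheets.spreadsheets.create({
    resource: {
      properties: { title: name },
      sheets: [{ properties: { title: 'General' } }],
    },
  });
  return res.data.spreadsheetId;
}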

sails-rabbitmq adapter integration with mongodb issue

I have set up a Sails.js project and am trying to access RabbitMQ using the sails-rabbitmq adapter. I have followed https://www.npmjs.com/package/sails-rabbitmq .
I want to use MongoDB with RabbitMQ. The problem is that when I run 'sails lift' I get this error:
error: A hook (orm) failed to load!
error: Error: One of your models (message) refers to multiple datastores.
Please set its configured datastore to a string instead of an array in its model definition (.connection) or the app-wide default (sails.config.models.connection)
(this is conventionally set in your config/models.js file, or as part of your app's environment-specific config).
at constructError (C:\Users\demoapp\AppData\Roaming\npm\node_modules\sails\node_modules\sails-hook-orm\lib\construct-error.js:57:13)
at validateModelDef (C:\Users\demoapp\AppData\Roaming\npm\node_modules\sails\node_modules\sails-hook-orm\lib\validate-model-def.js:97:11)
at C:\Users\demoapp\AppData\Roaming\npm\node_modules\sails\node_modules\sails-hook-orm\lib\initialize.js:218:36
at arrayEach (C:\Users\demoapp\AppData\Roaming\npm\node_modules\sails\node_modules\lodash\index.js:1289:13)
at Function.<anonymous> (C:\Users\demoapp\AppData\Roaming\npm\node_modules\sails\node_modules\lodash\index.js:3345:13)
at Array.async.auto._normalizeModelDefs (C:\Users\demoapp\AppData\Roaming\npm\node_modules\sails\node_modules\sails-hook-orm\lib\initialize.js:216:11)
at listener (C:\Users\demoapp\AppData\Roaming\npm\node_modules\sails\node_modules\sails-hook-orm\node_modules\async\lib\async.js:605:42)
at C:\Users\demoapp\AppData\Roaming\npm\node_modules\sails\node_modules\sails-hook-orm\node_modules\async\lib\async.js:544:17
at _arrayEach (C:\Users\demoapp\AppData\Roaming\npm\node_modules\sails\node_modules\sails-hook-orm\node_modules\async\lib\async.js:85:13)
at Immediate.taskComplete (C:\Users\demoapp\AppData\Roaming\npm\node_modules\sails\node_modules\sails-hook-orm\node_modules\async\lib\async.js:543:13)
at processImmediate [as _immediateCallback] (timers.js:383:17)
I have connection: [ 'rabbitCluster', 'regularMongo' ] in my Message model. regularMongo is the MongoDB connection. Please let me know what other configuration I am missing.
With the following config I do not see any error. In sails.config.models, set:
module.exports.models = {
  connection: 'someMongodbServer',
  migrate: 'safe'
};
and in Message.js set:
module.exports = {
  connection: [ 'rabbitCluster', 'someMongodbServer' ],
  routingKey: [ 'parentMessage' ],
  attributes: {
    title: 'string',
    body: 'string',
    parentMessage: {
      model: 'message'
    }
  }
};

No result when Rally.data.WsapiDataStore lacks permissions

I'm calling Ext.create('Rally.data.WsapiDataStore', params), and looking for results with the load event.
I'm requesting a number of objects across programs that the user may or may not have read permission for.
This works fine for queries where the user has permissions. But in the case where the user does not have permission and presumably gets zero results back, the load event does not seem to fire at all. I would expect it to fire with the unsuccessful flag or else to return with empty results.
Since I don't know that the request has failed, my program waits and waits. How can I tell if this request fails to return because of security?
BTW, looking at the network stats, I believe all my requests get a "200 OK" status back.
Here is the method I use to create the various data stores:
_createDataStore: function(params) {
  this.openRequests++;
  var createParams = {
    model: params.type,
    autoLoad: true,
    // So I can later determine which query type it is, and which program
    requestType: params.requestType == undefined ? params.type : params.requestType,
    program: this.program,
    listeners: {
      load: this._onDataLoaded,
      scope: this
    },
    filters: params.filters,
    pageSize: params.pageSize,
    fetch: params.fetch,
    context: {
      project: this.project,
      projectScopeUp: false,
      projectScopeDown: true
    },
    pageSize: 1 // We only need the count
  };
  console.log('_createDataStore', this.program, createParams.requestType);
  Ext.create('Rally.data.WsapiDataStore', createParams);
},
And here is the _onDataLoaded method:
_onDataLoaded: function(store, data, successB) {
  console.log('_onDataLoaded', this.program, successB);
  ...
I only see this function called for those queries for which the account has permissions.
Are you getting any request for Defect.js or HierarchicalRequirement.js? When I simulate the issue you are seeing, the request for TypeDefinition.js fails while it is building the model because the user doesn't have access to the specified project. This seems like a small bug to me. You should be able to work around it by explicitly fetching the model for a type in a specified workspace and then using that in your store.
Rally.data.ModelFactory.getModels({
  types: ['Defect', 'UserStory'], // more types, etc...
  context: Rally.environment.getContext().getDataContext(), // use workspace
  success: function(models) {
    // your code here
  }
});
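For example, inside the success callback you could build the store from the pre-fetched model rather than a type name. This is only a sketch; the listener wiring mirrors the question's code, and passing the model object to the store's model config is an assumption about how WsapiDataStore accepts a pre-built model.

Rally.data.ModelFactory.getModels({
  types: ['Defect'],
  context: Rally.environment.getContext().getDataContext(), // use workspace
  success: function(models) {
    // Use the already-fetched model so the store doesn't have to resolve
    // TypeDefinition.js itself for a project the user can't read.
    Ext.create('Rally.data.WsapiDataStore', {
      model: models.Defect,
      autoLoad: true,
      pageSize: 1,
      listeners: {
        load: this._onDataLoaded,
        scope: this
      }
    });
  },
  scope: this
});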