NestJS microservice with RabbitMQ without a pattern

I have a RabbitMQ service with a queue whose messages carry plain JSON with no schema, like this:
{"customFieldOne": "foo", "another": "bar", "numbers": [3, 5, 8]}
But NestJS with RabbitMQ expects messages in its own envelope format, like this:
{"pattern": "my-pattern", "data": {"fieldOne": 1 ...}}
The message must have the pattern and data fields.
Is there any way to consume an arbitrary JSON message without a pattern or schema defined?
In my code I consume using a pattern, but I need to consume any data from RabbitMQ with a NestJS microservice.
@MessagePattern('my-pattern')
// @EventPattern('my-pattern')
getNotifications(@Payload() data: number[], @Ctx() context: RmqContext) {
  console.log(data);
  console.log(`Pattern: ${context.getPattern()}`);
}
How can I do that?

I know I'm late to the party, but I had a similar issue with MQTT in Nest.
To strip the pattern and transport only the data, I added this code to the client configuration in something.module.ts:
{
  name: 'HYPERVISOR_CLIENT',
  transport: Transport.MQTT,
  options: {
    url: process.env.MQTT_URL ?? 'mqtt://localhost:1883',
    protocolVersion: 5,
    serializer: {
      serialize(value, options?) {
        return value.data;
      },
    },
  },
}
This results in clean output: only the data field is published, without the pattern envelope.
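That serializer only covers publishing. For the original question, consuming pattern-less RabbitMQ messages, the counterpart is a custom inbound deserializer that wraps each raw payload in the { pattern, data } envelope NestJS routes on. The sketch below is an illustration under stated assumptions: the ConsumerDeserializer interface is declared inline so the snippet stands alone (a real app would import it from @nestjs/microservices), and the hard-coded pattern 'my-pattern' must match whatever your handler listens on.

```typescript
// Minimal interface mirroring the shape NestJS expects from a consumer
// deserializer; declared inline here so the sketch is self-contained.
interface IncomingRequest {
  pattern: string;
  data: unknown;
}

interface ConsumerDeserializer {
  deserialize(value: unknown, options?: Record<string, unknown>): IncomingRequest;
}

// Wraps every raw payload in the { pattern, data } envelope, so a handler
// decorated with @MessagePattern('my-pattern') receives the raw JSON as data.
// The pattern name is an assumption for this sketch.
class InboundMessageDeserializer implements ConsumerDeserializer {
  deserialize(value: unknown): IncomingRequest {
    return { pattern: 'my-pattern', data: value };
  }
}
```

It would then be plugged in via the transport options, e.g. options: { ..., deserializer: new InboundMessageDeserializer() }, assuming a @nestjs/microservices version that supports the deserializer option.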

How do I restructure JSON in YAML?

I would like to send data from an API to a BigQuery table with Google Workflows (YAML format).
But the API response that I want to send to the BigQuery table does not match the structure expected by the "insertAll" BigQuery connector.
main:
  params: [input]
  steps:
    - retrieveMatomoData:
        call: http.get
        args:
          url: https://.....
        result: matomoData
    - insertAll:
        call: googleapis.bigquery.v2.tabledata.insertAll
        args:
          datasetId: myDatasetId
          projectId: myProjectId
          tableId: myTableId
          body:
            "rows": [
              {
                json: should be the full "matomoData" response
              }
            ]
The response structure of the API I use is:
{
  "body": [
    {
      …
    },
    {
      …
    }
  ]
}
(an array that corresponds to several rows to insert)
It does not match the structure required to insert rows into BigQuery:
"rows": [
  {
    "json": …
  },
  {
    "json": …
  }
]
Do you have any idea how I can handle this?
While the Workflows syntax and standard library can perform simple data extraction and transformation, larger JSON transformations are likely unwieldy inside Workflows for now. I'd recommend using a Cloud Function with a JSON transformation library.
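To make the Cloud Function suggestion concrete, here is a minimal sketch of the reshaping step, assuming the API response shape shown above ({ "body": [...] }) and the { "rows": [{ "json": ... }] } body that tabledata.insertAll expects. The type and function names are illustrative, and the HTTP trigger plumbing is omitted.

```typescript
// Illustrative shapes: the API response described in the question and the
// insertAll request body it needs to become.
interface ApiResponse {
  body: Array<Record<string, unknown>>;
}

interface InsertAllBody {
  rows: Array<{ json: Record<string, unknown> }>;
}

// Reshape the API response into the insertAll body: every element of the
// response's body array becomes one { json: ... } row.
function toInsertAllRows(response: ApiResponse): InsertAllBody {
  return { rows: response.body.map((row) => ({ json: row })) };
}
```

The workflow would then call this function between the retrieve step and the insertAll step, passing its result as the body argument.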

iotedge: How to requeue message that could not be processed

There are publisher and consumer custom modules running on an Edge IoT device. The publisher module keeps producing messages at a constant rate whether or not the consumer module processes them. The consumer module POSTs each message to an external service; if there is no Internet connection, the consumer module would like to requeue the message so that it is not lost and can be retried later.
I would prefer not to write an infinite loop to keep retrying; also, if the module is restarted, the message would be lost. So I would rather requeue the message to edgeHub/RocksDB.
Where do I find documentation on the available responses that can be provided for IoTHubMessageDispositionResult? What is the response to send if a message needs to be requeued?
if message.processed():
    return IoTHubMessageDispositionResult.ACCEPTED
else:
    return IoTHubMessageDispositionResult.??
You don't have to implement your own requeuing of messages. IoT Edge provides offline functionality, as described in this blog post and on this documentation page.
The edgeHub will locally store messages on the edgeDevice if there is no connection to the IotHub. It will automatically start sending those messages (in the correct order) once the connection is established again.
You can configure how long edgeHub will buffer messages like this:
"$edgeHub": {
  "properties.desired": {
    "schemaVersion": "1.0",
    "routes": {},
    "storeAndForwardConfiguration": {
      "timeToLiveSecs": 7200
    }
  }
}
The 7200 seconds (2 hours) is also the default if you don't configure anything.
By default, the messages will be written to a folder within the edgeHub docker container. If you want to store them somewhere else you can do so with this configuration:
"edgeHub": {
  "type": "docker",
  "settings": {
    "image": "mcr.microsoft.com/azureiotedge-hub:1.0",
    "createOptions": {
      "HostConfig": {
        "Binds": ["<HostStoragePath>:<ModuleStoragePath>"],
        "PortBindings": {
          "8883/tcp": [{"HostPort": "8883"}],
          "443/tcp": [{"HostPort": "443"}],
          "5671/tcp": [{"HostPort": "5671"}]
        }
      }
    }
  },
  "env": {
    "storageFolder": {
      "value": "<ModuleStoragePath>"
    }
  },
  "status": "running",
  "restartPolicy": "always"
}
Replace HostStoragePath and ModuleStoragePath with the desired values. Example:
"createOptions": {
  "HostConfig": {
    "Binds": [
      "/etc/iotedge/storage/:/iotedge/storage/"
    ],
    ...
  }
}
},
"env": {
  "storageFolder": {
    "value": "/iotedge/storage/"
  },
  ...
Please note that you will probably have to manually grant the iotedge user (or all users) access to that folder (using chmod).
Update:
If you are just looking for the available values of IoTHubMessageDispositionResult you will find the answer here:
class IoTHubMessageDispositionResult(Enum):
    ACCEPTED = 0
    REJECTED = 1
    ABANDONED = 2
Update 2:
Messages that have been ACCEPTED are removed from the message queue because they have been successfully delivered.
Messages that are ABANDONED are added back to the message queue, and the module will try to send them again as defined in the retryPolicy. For more insight into the retryPolicy, you can read this thread.
Messages that are REJECTED are not added to the message queue again.
So ABANDONED is the disposition to return when a message should be requeued.

JSON RPC Documentation Tool

Are there any documentation tools for JSON-RPC APIs?
I found a lot that are perfect for RESTful APIs (Slate, Stoplight, Swagger), but sadly none suitable for JSON-RPC APIs.
Ideal would be a tool that can handle both.
Are there any?
Thanks a lot!
Found a couple:
https://github.com/mzernetsch/jrgen
https://github.com/contributte/anabelle
Looking at using jrgen for my project.
Check this one: https://jsight.io
JSight has HTTP REST and also JSON-RPC 2.0 support, look here for details: https://jsight.io/docs/jsight-api-0-3-quick-tutorial
This is the JSON-RPC definition example from the site:
JSIGHT 0.3

URL /api/rpc
  Protocol json-rpc-2.0

  Method createCat // Create a cat.
    Params
      {
        "cat": #cat
      }
    Result
      {
        "id": 1 // Cat's id.
      }

  Method getCat // Get a cat by its id.
    Params
      {
        "id": 1 // Cat's id.
      }
    Result
      #cat

TYPE #cat
{
  "id": 1,
  "name": "Tom"
}
The same example in the cloud: https://editor.jsight.io/r/qjxRR6a/1

Google BigQuery connector (Connect Data Studio to BigQuery tables): modifying the connector for custom requirements

I need to modify the Google Data Studio - Google BigQuery connector for customized requirements.
https://support.google.com/datastudio/answer/6370296
First question: how can I find the source code for this data connector?
Second question: according to the guide, https://developers.google.com/datastudio/connector/reference, getData()
"Returns the tabular data for the given request."
and the response is in this format:
{
  "schema": [
    {
      "name": "OpportunityName",
      "dataType": "STRING"
    },
    {
      "name": "IsVerified",
      "dataType": "BOOLEAN"
    },
    {
      "name": "Created",
      "dataType": "STRING"
    },
    {
      "name": "Amount",
      "dataType": "NUMBER"
    }
  ],
  "rows": [
    {
      "values": [
        "Interesting",
        true,
        "2017-05-23",
        "120453.65"
      ]
    },
    {
      "values": [
        "SF",
        false,
        "2017-03-03",
        "362705286.92"
      ]
    },
    {
      "values": [
        "Spring Sale",
        true,
        "2017-04-21",
        "870.12"
      ]
    }
  ],
  "cachedData": true
}
But a BigQuery table could hold 100 million records. Do we simply ignore the fact that there could be 100 million records and return the response in this format anyway?
Thanks!
The existing DS-BQ connector is not open source, so you won't be able to modify its behavior.
With that said:
The DS-BQ connector has a "smarter" API contract than the open one - queries and filters will be passed down.
Feel free to create your own DS-BQ connector with whatever logic you might require! The Community Connectors program would welcome your contributions.
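For anyone building their own connector, the response contract getData() has to satisfy can be sketched as a small builder. This is purely an illustration of the schema-plus-rows shape shown above; community connectors actually run in Apps Script, and every name here is hypothetical.

```typescript
// Hypothetical shapes matching the getData() response format: a schema
// array plus rows whose values are ordered like the schema.
interface Field {
  name: string;
  dataType: 'STRING' | 'BOOLEAN' | 'NUMBER';
}

interface GetDataResponse {
  schema: Field[];
  rows: Array<{ values: Array<string | number | boolean> }>;
  cachedData: boolean;
}

// Build the response: each record becomes one { values: [...] } row.
// A real connector would derive schema and records from request.fields
// and a BigQuery query.
function buildGetDataResponse(
  schema: Field[],
  records: Array<Array<string | number | boolean>>,
): GetDataResponse {
  return {
    schema,
    rows: records.map((values) => ({ values })),
    cachedData: false,
  };
}
```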

Sencha Touch 2 - show response (JSON string) on proxy load

Is there a way to output the JSON string read by my store in Sencha Touch 2?
My store is not reading the records, so I'm trying to see where things went wrong.
My store is defined as follows:
Ext.define("NotesApp.store.Online", {
    extend: "Ext.data.Store",
    config: {
        model: 'NotesApp.model.Note',
        storeId: 'Online',
        proxy: {
            type: 'jsonp',
            url: 'http://xxxxxx.com/qa.php',
            reader: {
                type: 'json',
                rootProperty: 'results'
            }
        },
        autoLoad: false,
        listeners: {
            load: function() {
                console.log("updating");
                // Clear proxy from offline store
                Ext.getStore('Notes').getProxy().clear();
                console.log("updating1");
                // Loop through records and fill the offline store
                this.each(function(record) {
                    console.log("updating2");
                    Ext.getStore('Notes').add(record.data);
                });
                // Sync the offline store
                Ext.getStore('Notes').sync();
                console.log("updating3");
                // Remove data from online store
                this.removeAll();
                console.log("updated");
            }
        },
        fields: [
            { name: 'id' },
            { name: 'dateCreated' },
            { name: 'question' },
            { name: 'answer' },
            { name: 'type' },
            { name: 'author' }
        ]
    }
});
You may get all the data returned by the server through the proxy, like this:
store.getProxy().getReader().rawData
You can get all the data (javascript objects) returned by the server through the proxy as lasaro suggests:
store.getProxy().getReader().rawData
To get the JSON string of the raw data (the reader should be a JSON reader) you can do:
Ext.encode(store.getProxy().getReader().rawData)
//or if you don't like 'shorthands':
Ext.JSON.encode(store.getProxy().getReader().rawData)
You can also get it by handling the store load event:
// add this in the store config
listeners: {
    load: function(store, records, successful, operation, eOpts) {
        console.log(operation.getResponse().responseText);
    }
}
As far as I know, there's no way to explicitly observe your response results if you are using a configured proxy (it's obviously easy if you manually send an Ext.Ajax.request or Ext.JsonP.request).
However, you can still watch your results in your browser's developer tools.
For Google Chrome:
Start your application and wait until your request has completed, then switch to the Network tab. The highlighted link on the left-side panel is the API URL from which the data was fetched. On the right panel, choose Response; the response body will appear there. If nothing appears, it's likely that you've triggered a bad request.
Hope this helps.
For an Ajax request, your response JSON should be in the following format:
{"results": [{"id": "1", "name": "note 1"}, {"id": "2", "name": "note 2"}, {"id": "3", "name": "note 3"}]}
id and name are properties of your model Note.
For JSONP:
On your server side, read the value of the 'callback' parameter; it contains the name of the callback method. Wrap your result string in a call to that method and write the response.
The JSON string should then be in the following format:
callbackmethod({"results": [{"id": "1", "name": "note 1"}, {"id": "2", "name": "note 2"}, {"id": "3", "name": "note 3"}]});
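The server-side wrapping step can be sketched as a small helper, assuming the client sent the callback name in a 'callback' query parameter; the function name is hypothetical.

```typescript
// Hypothetical helper: wrap a payload in a JSONP callback invocation.
// callbackName would be read from the request's 'callback' parameter;
// the returned string is what the server writes as the response body.
function wrapJsonp(callbackName: string, payload: unknown): string {
  return `${callbackName}(${JSON.stringify(payload)});`;
}
```

For the store above, wrapJsonp(callbackName, { results: [...] }) produces exactly the kind of response shown.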