I have deployed the following example in my tenant:
create context HourlyAvgMeasurementDeviceContext
partition measurement.source.value from MeasurementCreated;
#Name("Creating_hourly_measurement")
context HourlyAvgMeasurementDeviceContext
insert into CreateMeasurement
select
m.measurement.source as source,
current_timestamp().toDate() as time,
"c8y_AverageTemperatureMeasurement" as type,
{
"c8y_AverageTemperatureMeasurement.T.value", avg(cast(getNumber(m, "c8y_TemperatureMeasurement.T.value"), double)),
"c8y_AverageTemperatureMeasurement.T.unit", getString(m, "c8y_TemperatureMeasurement.T.unit")
} as fragments
from MeasurementCreated.win:time(1 hours) m
where getObject(m, "c8y_TemperatureMeasurement") is not null
output last every 1 hours;
Afterwards I want to get the average temperature values (ideally with unit) over the hour as follows:
Select event.c8y_AverageTemperatureMeasurement.T.value from CreateMeasurement event;
but I get an error while deploying the query:
Error starting statement: Failed to validate select-clause expression 'event.c8y_AverageTemperatureMeasure...(47 chars)': Failed to resolve property 'event.c8y_AverageTemperatureMeasurement.T.value' to a stream or nested property in a stream [Select event.c8y_AverageTemperatureMeasurement.T.value from CreateMeasurement event
Can you please assist with this?
Instead of
select event.c8y_AverageTemperatureMeasurement.T.value from CreateMeasurement event;
you need to use
select getNumber(event, "c8y_AverageTemperatureMeasurement.T.value") from CreateMeasurement event;
If you have custom properties (fragments) in an object, you cannot access them with dot syntax. You need to use functions such as getNumber or getString, passing the object and the JSON path of the property you want to grab. If the property is not found, these functions return null.
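For example, to read both the average value and its unit from the derived measurement (a small sketch reusing the fragment paths from the statement above), something like this should work:
select
    getNumber(event, "c8y_AverageTemperatureMeasurement.T.value") as avgTemperature,
    getString(event, "c8y_AverageTemperatureMeasurement.T.unit") as unit
from CreateMeasurement event;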
Sometimes we need to filter a form grid based on the status of the transaction reference ID. Assume that we want to show purchase orders with the Confirmed document state in the arrival overview form. The document state field is located in the PurchTable table. To achieve this, I tried to outer join WMSArrivalOverviewTmp to PurchTable and add a range. However, the results are not what I expect.
This is the code I have tried in the data source's Initialized event:
[FormDataSourceEventHandler(formDataSourceStr(WMSArrivalOverview, WMSArrivalOverviewTmp), FormDataSourceEventType::Initialized)]
public static void WMSArrivalOverviewTmp_OnInitialized(FormDataSource sender, FormDataSourceEventArgs e)
{
    QueryBuildDataSource qbds = sender.queryBuildDataSource();
    QueryBuildDataSource qbdsPO;

    qbdsPO = qbds.addDataSource(tableNum(PurchTable));
    qbdsPO.clearRange(fieldNum(PurchTable, DocumentState));
    qbdsPO.joinMode(JoinMode::OuterJoin);
    qbdsPO.fetchMode(QueryFetchMode::One2One);
    qbdsPO.addLink(fieldNum(WMSArrivalOverviewTmp, InventTransRefId), fieldNum(PurchTable, PurchId), qbds.name());
    qbdsPO.relations(false);
    qbdsPO.addRange(fieldNum(PurchTable, DocumentState)).value(SysQuery::value(VersioningDocumentState::Confirmed));

    info(sender.query().toString());
}
This is the query that is shown:
SELECT FIRSTFAST * FROM WMSArrivalOverviewTmp(WMSArrivalOverviewTmp)
OUTER JOIN FROM PurchTable(PurchTable_1) ON
WMSArrivalOverviewTmp.InventTransRefId = PurchTable.PurchId AND
((DocumentState = 40))
Also, I have changed the event type to QueryExecuting, but then I get these errors:
[Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Error converting
data type nvarchar to bigint.
Cannot select a record in Purchase orders (PurchTable). The SQL
database has issued an error.
P.S. I have noticed that the error occurs when I click the update button after filling in one of the fields in the filter section (arrival options), for example the account number or warehouse. I have also noticed that when I click the update button while the filter fields are empty, the join is not applied correctly, i.e., purchase orders with Draft status are still shown.
Your actual query looks correct to me. I would add that you may want to add a restriction for WMSArrivalOverviewTmp.InventTransType == InventTransType::Purch (or set qbdsPO.relations(true) so that the table's built-in relation is used), but I doubt that is the root of your problem. Of course, it wouldn't hurt to make sure the query behaves the way you expect by converting the output of your info() call into SQL and running it in SSMS to double-check the result set.
Also, I'm curious why you would put the event on the data source you are trying to manipulate. I would recommend investigating whether it can be done in the OnModified event of the "status of the transaction reference Id" field. Although perhaps I am misunderstanding: if this is more of a "it needs to happen every time the form loads" situation than the "sometimes we need" your question starts with, then the place to change the query is a handler for the data source's OnQueryExecuting event.
There are two ways of modifying the query when such an event occurs:
FormRun formRun = sender.formRun() as FormRun;
FormDataSource formDataSource = formRun.dataSource(formDataSourceStr(WMSArrivalOverview, WMSArrivalOverviewTmp));

// First way - get the base query, modify it, and call executeQuery() to reload the changes
Query query = formDataSource.query();
// modify query here
formDataSource.executeQuery();

// Second way - get the current queryRun's query, modify it, and call research() to reload the changes
Query queryRunQuery = formDataSource.queryRun().query();
// modify queryRunQuery here
formDataSource.research();
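If you go the OnQueryExecuting route, a rough sketch of such a handler (untested, reusing the logic from your Initialized handler) could look like this:
[FormDataSourceEventHandler(formDataSourceStr(WMSArrivalOverview, WMSArrivalOverviewTmp), FormDataSourceEventType::QueryExecuting)]
public static void WMSArrivalOverviewTmp_OnQueryExecuting(FormDataSource sender, FormDataSourceEventArgs e)
{
    QueryBuildDataSource qbds = sender.queryBuildDataSource();

    // Add the PurchTable join and the DocumentState range here, exactly as in
    // the Initialized handler above. Since this event fires on every query
    // execution, guard against adding the PurchTable data source more than once.
}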
We're using BigQuery with its new "standard SQL" dialect.
The new SQL supports inline functions written in SQL instead of JavaScript, so we created a function to handle date conversion.
CREATE TEMPORARY FUNCTION
STR_TO_TIMESTAMP(str STRING)
RETURNS TIMESTAMP AS (PARSE_TIMESTAMP('%Y-%m-%dT%H:%M:%E*SZ', str));
It must be a temporary function, as Google returns Error: Only temporary functions are currently supported; use CREATE TEMPORARY FUNCTION if you try to create a permanent function.
If you try to save a view whose query uses the function inline, you get the following error: Failed to save view. No support for CREATE TEMPORARY FUNCTION statements inside views.
If you try to outsmart it and remove the function (hoping to add it at query time), you'll receive this error: Failed to save view. Function not found: STR_TO_TIMESTAMP at [4:7].
Any suggestions on how to address this? We have more complex functions than the example shown.
The issue has since been marked as resolved: BigQuery now supports permanent registration of UDFs.
In order to use your UDF in a view, you'll first need to create it as a persistent function.
CREATE OR REPLACE FUNCTION `ACCOUNT-NAME11111.test.STR_TO_TIMESTAMP`
(str STRING)
RETURNS TIMESTAMP AS (PARSE_TIMESTAMP('%Y-%m-%dT%H:%M:%E*SZ', str));
Note that you must quote the function's name with backticks.
There's no TEMPORARY in the statement, as the function will be globally registered and persisted.
Due to the way BigQuery handles namespaces, you must include both the project name and the dataset name (test) in the function's name.
Once it's created and working successfully, you can use it in a view.
create view test.test_view as
select `ACCOUNT-NAME11111.test.STR_TO_TIMESTAMP`('2015-02-10T13:00:00Z') as ts
You can then query your view directly without explicitly specifying the UDF anywhere.
select * from test.test_view
As per the documentation (https://cloud.google.com/bigquery/docs/reference/standard-sql/data-definition-language#create_function_statement), the functionality is still in beta, but it works: the function is visible in the dataset it was created in, and the view can be created.
Please share whether that worked for you, or any findings that would be helpful for others.
Saving a view created with a temp function is still not supported, but what you can do is schedule the SQL query (scheduled queries have already rolled out in the latest UI) and then save the result as a table. This worked for me, but I guess it depends on the query parameters you want.
#standardSQL
# JS in SQL to extract multiple hit-level custom dimensions (h.CDs) at the same time.
CREATE TEMPORARY FUNCTION getCustomDimension(cd ARRAY<STRUCT< index INT64,
value STRING>>, index INT64)
RETURNS STRING
LANGUAGE js AS """
for(var i = 0; i < cd.length; i++) {
var item = cd[i];
if(item.index == index) {
return item.value
}
}
return '';
""";
SELECT DISTINCT
  h.page.pagePath,
  getCustomDimension(h.customDimensions, 20),
  fullVisitorId,
  h.page.pagePathLevel1,
  h.page.pagePathLevel2,
  h.page.pagePathLevel3,
  getCustomDimension(h.customDimensions, 3)
FROM
`XXX.ga_sessions_*`,
UNNEST(hits) AS h
WHERE
### rolling timeframe
_TABLE_SUFFIX = FORMAT_DATE('%Y%m%d',DATE_SUB(CURRENT_DATE(),INTERVAL YY DAY))
AND h.type='PAGE'
Credit for the solution goes to https://medium.com/@JustinCarmony/strategies-for-easier-google-analytics-bigquery-analysis-custom-dimensions-cad8afe7a153
I'm seeing an issue when creating a Spark streaming table using Kafka from the snappy shell.
The exception is: 'Invalid input 'C', expected dmlOperation, insert, withIdentifier, select or put (line 1, column 1):'
Reference: http://snappydatainc.github.io/snappydata/streamingWithSQL/#spark-streaming-overview
Here is my SQL:
CREATE STREAM TABLE if not exists sensor_data_stream
(sensor_id string, metric string)
using kafka_stream
options (
storagelevel 'MEMORY_AND_DISK_SER_2',
rowConverter 'io.snappydata.app.streaming.KafkaStreamToRowsConverter',
zkQuorum 'localhost:2181',
groupId 'streamConsumer',
topics 'test:01');
The shell does not seem to like the script, starting at the very first character 'C'. I'm attempting to execute the script using the following command:
snappy> run '/scripts/my_test_sensor_script.sql';
Any help appreciated!
There is some inconsistency between the documentation and the actual syntax. The correct syntax is:
CREATE STREAM TABLE sensor_data_stream if not exists (sensor_id string,
metric string) using kafka_stream
options (storagelevel 'MEMORY_AND_DISK_SER_2',
rowConverter 'io.snappydata.app.streaming.KafkaStreamToRowsConverter',
zkQuorum 'localhost:2181',
groupId 'streamConsumer', topics 'test:01');
One more thing you need to do is to write a row converter for your data.
Mike, you need to create your own rowConverter class by implementing the following trait:
trait StreamToRowsConverter extends Serializable {
  def toRows(message: Any): Seq[Row]
}
and then specify that rowConverter's fully qualified class name in the DDL.
The rowConverter is specific to a schema.
'io.snappydata.app.streaming.KafkaStreamToRowsConverter' is just a placeholder class name, which should be replaced by your own rowConverter class.
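For illustration only, a minimal converter matching the two-column schema above (sensor_id, metric) and assuming each Kafka message is a plain "sensorId,metric" string might look roughly like this:
import org.apache.spark.sql.Row

// Hypothetical example: parse "sensorId,metric" strings into Rows.
class SensorDataToRowsConverter extends StreamToRowsConverter {
  override def toRows(message: Any): Seq[Row] = {
    val fields = message.toString.split(",")
    Seq(Row.fromSeq(fields.toSeq))
  }
}
You would then put this class on the classpath and reference its fully qualified name in the rowConverter option instead of the placeholder.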
I am working on a SQL query. I have a pre-created SOQL query and I am looking for a way to convert it to a SQL query. The query is:
SELECT CronJobDetail.Name, Id, CreatedDate, State
FROM CronTrigger
WHERE CronjobDetail.JobType = 'bla bla'
AND CronJobDetail.Name LIKE '%bla bla2%'
But it does not run in the terminal when I try to create a monitoring script in Ruby. The error that I get:
(Got exception: INVALID_FIELD: No such relation 'CronJobDetail' on
entity 'CronTrigger'. If you are attempting to use a custom field, be
sure to append the '__c' after the custom field name. Please reference
your WSDL or the describe call for the appropriate names. in
/Users/gakdugan/.rvm/gems/ruby-1.9.3-p547/gems/restforce-2.2.0/lib/restforce/middleware/raise_error.rb:18:in
`on_complete'
Do you have any idea how I can fix it and make it run as SQL?
You are trying to access a relation without adding it to your FROM clause. Alternatively, if that's a custom field name, then do what the error message suggests (append __c to the custom field name).
You probably want to do something like this:
SELECT CronJobDetail.Name, CronTrigger.Id, CronTrigger.CreatedDate, CronTrigger.State
FROM CronTrigger
INNER JOIN CronJobDetail ON CronJobDetail.id = CronTrigger.foreign_id -- this you have to do yourself
WHERE CronjobDetail.JobType = 'bla bla'
AND CronJobDetail.Name LIKE '%bla bla2%'
I have an Event model with the following attributes (I quoted only the problem-related attributes); the model is filled periodically by an API call to an external service (Google Calendar):
colorid: number # (0-11)
event_start: datetime
event_end: datetime
I need to compute the total duration of grouped events, grouped by colorid. I have an Event instance method to calculate a single event's duration:
def event_duration
  ((event_end.to_datetime - event_start.to_datetime) * 24 * 60).to_i
end
Now, I need to do something like this:
event = Event.group(:colorid).sum(event_duration)
But this does not work for me; I get an error that the event_duration column does not exist. My idea is to add one more attribute, "event_duration", to the Event model and calculate and update it during Event record creation; in that case I would have a column called "event_duration" and might be able to use sum on it. But I am not sure this is a good, "systematic" solution, since I would like the model data to reflect the raw data received from the API call and do all the math and statistics on top of the model data.
event_duration is an instance method (not a column name). The error was raised because Event.sum only calculates the sum of a database column.
In your case, I think it would be easier to use Enumerable methods:
duration_by_color_id = {}
grouped_events = Event.all.group_by(&:colorid)
grouped_events.each do |colorid, events|
  duration_by_color_id[colorid] = events.collect(&:event_duration).sum
end
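A slightly more compact version of the same idea (assuming Ruby >= 2.4 for Hash#transform_values and Enumerable#sum):
duration_by_color_id = Event.all
  .group_by(&:colorid)
  .transform_values { |events| events.sum(&:event_duration) }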
Sources:
Enumerable's group_by
Enumerable's collect