Use reference data from a SQL database in Stream Analytics

In my Stream Analytics job, I am getting data from IoT Hub for an ID, and I want to use reference data to get data from SQL Server for that ID.
I use:
DECLARE @ID int = 1;
SELECT 1, 2, 3
FROM [table]
WHERE ID = @ID
and it says no data.
Even if I run
SELECT 1, 2, 3
FROM [table]
I can't see Sample Data on the SQL Database reference input to test why data is not coming through.
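For reference, the join between the stream and the SQL reference input in the Stream Analytics query itself looks roughly like this (the input aliases and the reference column name are assumptions, not from the question):

SELECT
    s.ID,
    r.SomeColumn        -- column from the SQL reference table (assumed name)
FROM iothubinput s      -- streaming input (assumed alias)
JOIN sqlrefinput r      -- SQL Database reference input (assumed alias)
    ON s.ID = r.ID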

Just to summarize here:
Let's say I have a column called Amount. I was using SUM(Amount), so SA gave the error: cannot have decimal value. So I cast it to float.
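A minimal sketch of that cast in a Stream Analytics query (the input alias, timestamp column, and window are assumptions):

SELECT SUM(CAST(Amount AS float)) AS TotalAmount
FROM iothubinput TIMESTAMP BY EventEnqueuedUtcTime   -- assumed input and timestamp column
GROUP BY TumblingWindow(minute, 1)                    -- assumed window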
Regarding my question:
"I can't see Sample Data on the SQL Database reference input to test
why data is not coming."
When you test the query, choose the "Select time range" option and pick a time range for which you believe data is present in the DB. When you run the SA job, also check the "Activity log" pane: when the job fails, it tells you the job failed, and you can click that link to see the complete error in a JSON file.

SQL Query in Azure Dataflow does not work when using parameter value in where clause

I use an Azure Data Factory pipeline.
Within that pipeline I use two activities:
1) A Lookup to get a date value. This is the output:
"firstRow": {
    "Date": "2022-10-26T00:00:00Z"
}
2) A dataflow, which gets the date from the lookup in step 1 and uses it in the source options SQL query in the where clause. This is the query:
"SELECT ProductID ,ProductName ,SupplierID,CategoryID ,QuantityPerUnit ,UnitPrice ,UnitsInStock,UnitsOnOrder,ReorderLevel,Discontinued,LastModifiedDate FROM Noordwind.Products where LastModifiedDate >= '{$DS_LastPipeLineRunDate}'"
When I fill the parameter by hand with, for example, '2022-10-26', it works great, but when I let the parameter get its value from the Lookup in step 1, the dataflow fails.
Error message:
{"message":"Job failed due to reason: Converting to a date or time failed due to an invalid character. Details:null","failureType":"UserError","target":"Products","errorCode":"DF-Executor-Conversion"}
This is the parameter in the pipeline view, as seen from the dataflow:
I have tried casting the date all kinds of ways, but haven't found the right thing.
Can you help me?
UPDATE:
After a question from Rakesh:
This is the activity parameter:
@activity('LookupLastPipelineRunDate').output.firstRow
I have reproduced the above and got the below results.
This is my source sample data from the SQL database.
For the demo, I used a Set variable activity for the date and gave it a sample date like below.
I created a string parameter and gave this variable's value to it. In your case, pass the lookup's firstRow output date.
I used the below dataflow expression in the query of the dataflow source and got the desired result:
concat('select * from dbo.table1 where d1 >=','\'',$date_value,'\'')
This is the result in a target SQL table.
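Applied to the original pipeline, the parameter value can be built directly from the lookup output with an expression along these lines (a sketch; the activity name and Date property are taken from the question, and formatDateTime trims the time portion that was breaking the conversion):

@formatDateTime(activity('LookupLastPipelineRunDate').output.firstRow.Date, 'yyyy-MM-dd')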
I have created a Set variable activity:
The first pipeline still returns the right date.
I even converted it to datetime, just to be sure.
I can create a variable with type string.
Code:
@activity('LookupLastPipelineRunDate').output.firstRow
Regardless of the Set variable activity that fails, it looks like the date enters nicely as an input to the Set variable activity.
And still I get an error:
When I read this error message, it says that you can't put a date in a string variable. But I can only choose string, boolean and array, so there is no better option.
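One likely culprit (an observation, not from the thread): @activity('LookupLastPipelineRunDate').output.firstRow is the whole row object, not the date itself, so assigning it to a string variable fails. Referencing the Date property and converting it explicitly gives the variable a plain string (the property name is taken from the lookup output above):

@string(activity('LookupLastPipelineRunDate').output.firstRow.Date)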
I also reviewed a website about this issue.
Therefore I altered the table that contains the source data I use in the dataflow.
I deleted the column LastModifiedDate because it had datatype datetime, and created the same column again with datatype datetime2.
I did this because I read that datetime2 has fewer problems with conversions.
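A sketch of that column swap in T-SQL (the table name is taken from the query above; note this discards the existing column values):

ALTER TABLE Noordwind.Products DROP COLUMN LastModifiedDate;
ALTER TABLE Noordwind.Products ADD LastModifiedDate datetime2;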

Unable to access the temp tables in azure sql database

Using the following code, I have created a temp table in an Azure SQL database.
CREATE TABLE ##UpsertTempTable (
eno varchar(25),
ename varchar(25)
);
and I want to check the data using the below query:
select * from ##UpsertTempTable
Ideally it should run without any issue, as in all of the Azure documentation it works without any issues, but unfortunately it is not working and gives the below error.
I tried looking for a solution all over the internet but could not find any relevant documentation for this issue.
Error : Failed to execute query. Error: Invalid object name '##UpsertTempTable'.
I tried the Query Editor (Preview) in the portal, and the create-temporary-table code doesn't work. I tried both ##UpsertTempTable and #UpsertTempTable.
When I run the CREATE TABLE code, no error happens.
But when you run select * from ##UpsertTempTable, the Query Editor gives the error:
I also tried SSMS v17.9 and SSMS v18.1, and there everything is OK.
What I think is that the Query Editor doesn't support creating temporary tables well.
I have asked Azure Support and am waiting for their reply; please wait for my update.
Update:
Azure Support replied:
"This is by design, the temp tables exists as long as the connection is open.
The current way portal query editor is designed, the connection is killed resulting in temp table being deleted.
"
Hope this helps.

SSIS Variable Date Failing between SQL Server and ORACLE

Good afternoon all,
I have spent about six hours trying to get formatting to work through SSIS, using a max-date variable in a WHERE clause, and have had no luck!
I have created a variable called my_date which fetches the MAX(Date) from a local SQL Server table to find the last load point for that table, using the code below:
SELECT CAST(FORMAT(MAX(Business_Date), 'dd-MMM-yyyy') AS varchar) AS my_date FROM Table
This fetches the date correctly as 17-Sep-2018.
I have then mapped my result set as my_date -> User::max_date.
I have set my max_date variable to a string data type at package scope.
I have tested my variable using breakpoints to ensure it runs all the way through in the correct format, and it works 100%.
I then have a Data Flow Task that fetches data from my Oracle DB to insert into my SQL Server table, which contains the following SQL command:
SELECT *
FROM Table2
WHERE (BUSINESS_DATE > to_date('@[User::max_date]', 'DD-MON-YYYY'))
However I get ORA-01858 - [TABLE] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E07.
An OLE DB record is available. Source: "Microsoft OLE DB Provider for Oracle" Hresult: 0x80040E07 Description: "ORA-01858: a non-numeric character was found where a numeric was expected".
If I go and replace the variable reference directly with the contents of the variable shown by the breakpoint in Locals, it works perfectly!
I have attempted multiple format types from the initial export through to the final where clause, and this seems to be the closest I have come to pushing it through, but it is still complaining about the format.
Please help! Images below show the setup.
Control Flow, displaying the Execute SQL Task and the Data Flow Task.
Locals window showing the variable is populated after the breakpoint is reached.
Managed to get it working!
I added an intermediary variable whose value is set by an expression containing the following:
"SELECT *
FROM TABLE
WHERE (BUSINESS_DATE > to_date('" + @[User::max_date] + "', 'DD-MON-YYYY'))"
I then amended my OLE DB source to use "SQL command from variable", selected the variable created above, and it worked perfectly!
Try mapping your user variable in the "OLE DB Source Editor", under "Parameters".
1) Change the SQL command text (change @[User::max_date] to ?), like this:
SELECT *
FROM Table2
WHERE (BUSINESS_DATE > to_date(?, 'DD-MON-YYYY'))
2) Then in the parameter editor, map parameter 1 to @[User::max_date].
https://learn.microsoft.com/en-us/sql/integration-services/data-flow/map-query-parameters-to-variables-in-a-data-flow-component?view=sql-server-2017
Also, the "Oracle Provider for OLE DB" behaves differently than the "Microsoft OLE DB Provider for Oracle", so it depends which you are using.

Error in SSIS - Parameters cannot be extracted when parsing query, syntax error

I am building an SSIS package to select some data as XML and export it to a flat file. I'm running it via a Foreach Loop: I have an Execute SQL Task to select all records in a table, and then I loop through each one to create a single XML file.
Here is the SQL statement I am trying to execute as part of the data flow in the OLE DB Source Editor. The select is wrapped inside another SELECT (...) AS XML_TEXT to make the output text:
SELECT
(SELECT SalesCode, ProductName, ManufacturingSite, Country, Language, Status, RevisionDate, ArchiveDate, PFFileName, XMLFileName, Version
FROM ir_cp_connect_data
WHERE (id = ?)
FOR XML RAW ('Data'), ROOT ('Sales'), ELEMENTS XSINIL)
AS XML_TEXT
I've tried entering it directly as a SQL command and also via a variable. When I enter it directly as a command, it errors saying it cannot parse the query, giving "syntax error, permission violation or other nonspecific error".
When I switch it to a variable in the data access mode, it errors with a long error message:
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error Code 0x80004005. An OLE DB record is available.
Again it then goes on to list "syntax error, permission violation or other nonspecific error".
If I run the select statement in SQL Server Management Studio it works, albeit I substitute the id = ? for id = @id and declare the @id and assign it a value.
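That SSMS test looks roughly like this (the sample @id value is an assumption):

DECLARE @id int = 1;  -- sample value, an assumption

SELECT
(SELECT SalesCode, ProductName, ManufacturingSite, Country, Language, Status, RevisionDate, ArchiveDate, PFFileName, XMLFileName, Version
FROM ir_cp_connect_data
WHERE (id = @id)
FOR XML RAW ('Data'), ROOT ('Sales'), ELEMENTS XSINIL)
AS XML_TEXT;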
I'm working with VS2008 and SQL Server 2008R2.
I feel I am missing something very obvious but I cannot put my finger on it. If I drop the opening SELECT (...) AS XML_TEXT, the query parses fine, but the output is System.Byte[], which is no good for the flat file.
Any help would be much appreciated.
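For what it's worth, one commonly suggested workaround (a sketch, not confirmed by this thread) is to CAST the FOR XML result to nvarchar(max), so the source produces text rather than System.Byte[]:

SELECT CAST(
(SELECT SalesCode, ProductName, ManufacturingSite, Country, Language, Status, RevisionDate, ArchiveDate, PFFileName, XMLFileName, Version
FROM ir_cp_connect_data
WHERE (id = ?)
FOR XML RAW ('Data'), ROOT ('Sales'), ELEMENTS XSINIL)
AS nvarchar(max)) AS XML_TEXT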

Pentaho Kettle: how to execute "insert into ... select from" with the SQL script step?

I am discovering Pentaho DI and I am stuck with this problem:
I want to insert data from a CSV file into a custom DB which does not support the "insert table" step. So I would like to use the SQL script step, with one statement:
INSERT INTO myTable
SELECT * FROM myInput
And my transformation would look like this:
I don't know how to get all my data from the CSV injected into the "myInput" field.
Could someone help me?
Thanks a lot :)
When you first edit the SQL script step, click the 'Get fields' button. This is going to load the parameters (fields from your CSV) into the box in the bottom left corner. Delete the parameters (fields) you don't want to insert.
In your SQL script, write your query something like this, where the question marks are your parameters in order:
insert into my_table (field1,field2,field3...) values ('?','?','?'...);
Mark the checkboxes "execute for each row" and "execute as a single statement". That's really about it. Let me know if you have any more questions, and if you provide sample data I'll make you a sample ktr file to look at.
I think you're going about it the wrong way. You should use a CSV file input step and a Table output step.
As rwilliams said, in the CSV file input step use Get fields. More importantly, in Table output there is a Database fields tab; Enter field mapping is the right choice there. The guess function is amazing.
What's more, the system can generate the CREATE TABLE SQL statement for the target table when it does not exist in the target connection's DB server.
Use the following code:
with cte as
(
    SELECT * FROM myInput
)
select *
into myTable
from cte;