Error while trying to pass a date value to a UDT from Mule Database connector - Mule

I have a requirement to call an Oracle stored procedure that takes a UDT parameter, from a Mule flow. To do this I am creating a JDBC array type like:
%dw 2.0
output application/java
---
{
    inParam: Db::createArray("Database_Config", "ADDRESS_TAB", [
        Db::createStruct("Database_Config", "ADDRESS_TAB_TYPE",
            ["TESt-Mule", "2015-07-04T21:01:01" as DateTime, "WB"])
    ])
}
The Oracle type ADDRESS_TAB is a TABLE OF ADDRESS_TAB_TYPE, where ADDRESS_TAB_TYPE is:
Name                Null?  Type
------------------  -----  -------------
IADDRESS_NAME              VARCHAR2(240)
IINACTIVE_DATE             DATE
ISTATE                     VARCHAR2(150)
The date column on the Oracle side is of type DATE.
After invoking the flow I get the error below:
ERROR 2021-03-18 23:14:06,971 [[MuleRuntime].uber.04: [playground].oracle-db-loc-testFlow.CPU_INTENSIVE #4d9589a8] [processor: ; event: 876ad840-8811-11eb-a282-a483e7749b4e] org.mule.runtime.core.internal.exception.OnErrorPropagateHandler:
********************************************************************************
Message : "org.mule.weave.v2.el.ExpressionFunctionCallException: Exception while executing createStruct("Database_Config","ADDRESS_TAB_TYPE",
["TESt-Mule",**("2015-07-04T21:01:01") as DateTime**,"WB"])
]) cause: An error occurred when trying to create JDBC Structure. Fail to convert to internal representation: 2015-07-04T21:01:01Z
Trace:
at callFunction (Unknown)
at createStruct (line: -1, column: -1)
at createArray (line: 5, column: 73)
at main (line: 5, column: 12)
I am trying to pass this date value to Oracle. Any help is appreciated. Thanks!

I used this transformation:
fun dateTimeFormat(inputDateTime) = inputDateTime as DateTime as String{format: "yyyy-MM-dd HH:mm:ss.SSS"}
and it worked.
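For anyone curious what that DataWeave format string does underneath, the same conversion can be sketched in plain Java with java.time (the class and method names here are hypothetical, for illustration only; the Mule/Oracle driver plumbing is assumed):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class OracleDateFormat {
    // Convert an ISO-8601 date-time string into the "yyyy-MM-dd HH:mm:ss.SSS"
    // form that the dateTimeFormat function in the answer produces, which the
    // Oracle driver can then map onto a DATE column.
    public static String toOracleDate(String isoDateTime) {
        LocalDateTime dt = LocalDateTime.parse(isoDateTime); // ISO format by default
        return dt.format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS"));
    }
}
```

With the value from the question, `toOracleDate("2015-07-04T21:01:01")` yields `"2015-07-04 21:01:01.000"`, which the driver accepts where the raw ISO string (with its trailing `Z` in the internal representation) did not.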
Thanks!

Would the suggestion here perhaps help you out?
https://help.mulesoft.com/s/question/0D52T00004mXXV4/how-to-insert-the-date-value-into-oracle-database-using-mule-4

Related

JPA Query not returning data while sending multiple values in request

I'm having an issue handling a null or empty list in my JPA query. It works when the request contains a single value or an empty selection, but the same query does not work if I pass multiple values.
@Query(
    value =
        "select sum(ORDER_PALLET_QTY) as PALLETS, " +
        "sum(ORDER_QTY) as UNITS, " +
        "PO_TYPE as POTYPE " +
        "from [ORDER] " +
        "where CAL_DT BETWEEN (:fromDate) AND (:endDate) AND " +
        "(:vendor IS NULL OR VENDOR_NBR IN (:vendor)) " +
        "group by PO_TYPE",
    nativeQuery = true)
Optional<List<OrderCubeModelHelper>> getOrders(
    @Param("vendor") List<String> vendorNbr,
    @Param("fromDate") LocalDate fromDate,
    @Param("endDate") LocalDate endDate);
This returns data when I send only one value in the list, as shown below:
Request:
{
"vendorNbr" :["294"],
"fromDate" : "2021-08-12",
"endDate" : "2021-08-31"
}
The same query throws an exception when I send multiple values in the request.
Sample request:
{
"vendorNbr" :["294","302"],
"fromDate" : "2021-08-12",
"endDate" : "2021-08-31"
}
Exception:
2021-08-19 11:46:18.520 ERROR 12404 --- [nio-8080-exec-1] o.a.c.c.C.[.[.[.[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.springframework.dao.InvalidDataAccessResourceUsageException: could not extract ResultSet; SQL [n/a]; nested exception is org.hibernate.exception.SQLGrammarException: could not extract ResultSet] with root cause
com.microsoft.sqlserver.jdbc.SQLServerException: An expression of non-boolean type specified in a context where a condition is expected, near ','.
This is a native query, and AFAIK Hibernate does not perform parameter-list expansion for native queries, as it is generally hard to support all SQL dialects through a single parser. If you want list expansion to happen, you either have to use some kind of table-valued function, e.g. STRING_SPLIT, or you use JPQL/HQL for this query, like this:
@Query("select new package.to.OrderCubeModelHelper(sum(e.palletQuantity), sum(e.quantity), e.poType) " +
       "from OrderEntity e " +
       "where e.date BETWEEN (:fromDate) AND (:endDate) AND " +
       "(:vendor IS NULL OR e.vendorNumber IN (:vendor)) " +
       "group by e.poType")
Optional<List<OrderCubeModelHelper>> getOrders(
    @Param("vendor") List<String> vendorNbr,
    @Param("fromDate") LocalDate fromDate,
    @Param("endDate") LocalDate endDate);
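If the query has to stay native, another option (a sketch, assuming you assemble the SQL string yourself, e.g. via JdbcTemplate, rather than through @Query) is to expand the placeholders manually, doing for yourself what Hibernate won't:

```java
import java.util.List;
import java.util.stream.Collectors;

public class InClauseBuilder {
    // Expand a list of N values into "(?, ?, ..., ?)" with N positional
    // placeholders, which is the expansion Hibernate does not perform
    // for parameter lists in native queries.
    public static String placeholders(List<String> values) {
        return values.stream()
                .map(v -> "?")
                .collect(Collectors.joining(", ", "(", ")"));
    }
}
```

For example, `placeholders(List.of("294", "302"))` yields `(?, ?)`, which can be spliced into the IN clause before binding the values one by one. The empty-list case still needs separate handling, since `IN ()` is not valid SQL.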

Failed to transfer data from GCS to Bigquery table

Need help with DTS (Data Transfer Service).
After creating a table "allorders" with an autodetected schema, I created a data transfer service. But when I ran the DTS I got an error; see the job below. The quantity field type is definitely set to integer, and all the data in that field are whole numbers.
Job bqts_602c3b1a-0000-24db-ba34-30fd38139ad0 (table allorders) failed
with error INVALID_ARGUMENT: Error while reading data, error message:
Could not parse 'quantity' as INT64 for field quantity (position 14)
starting at location 0 with message 'Unable to parse'; JobID:
956421367065:bqts_602c3b1a-0000-24db-ba34-30fd38139ad0
When I recreated the table and set all fields to type string, it worked fine; see the job below.
Job bqts_607cef13-0000-2791-8888-001a114b79a8 (table allorders)
completed successfully. Number of records: 56017, with errors: 0.
Try to find the unparseable values in the table with all string fields:
SELECT *
FROM dataset.table
WHERE SAFE_CAST(value AS INT64) IS NULL;
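The check SAFE_CAST performs can also be mimicked locally on the raw export before loading, to spot the offending rows up front (a sketch; the file handling around it is assumed):

```java
public class UnparseableFinder {
    // Mirror of SAFE_CAST(value AS INT64) IS NULL: returns true when the
    // field cannot be parsed as a 64-bit integer, e.g. "12.0", "", or "N/A".
    public static boolean isUnparseableInt(String value) {
        if (value == null) return true;
        try {
            Long.parseLong(value.trim());
            return false;
        } catch (NumberFormatException e) {
            return true;
        }
    }
}
```

A common culprit for errors like the one above is a value such as `"12.0"`: a whole number to the eye, but not parseable as INT64 without casting through float first.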

Unable to execute stored procedure using Mule 4

I am trying to execute a stored procedure but I am getting the error below. Can someone please help me with this?
Error:
ERROR 2020-01-17 18:13:55,573 [[MuleRuntime].cpuIntensive.01: [imf.org].imf-orgFlow1.CPU_INTENSIVE #d5e7f1] [event: 1ee63c30-3955-11ea-aeb0-000d3a9f109e] org.mule.runtime.core.internal.exception.OnErrorPropagateHandler:
********************************************************************************
Message : "You called the function 'Value Selector' with these arguments:
1: String ("{\r\n\t\"CountryCode\": \"IND\",\r\n\t\"CategoryIds\": \"1\",\r\n\t\"Years\"...)
2: Name ("CountryCode")
But it expects one of these combinations:
(Array, Name)
(Array, String)
(Date, Name)
(DateTime, Name)
(LocalDateTime, Name)
6| CountryCode: payload.CountryCode ,
^^^^^^^^^^^^^^^^^^^

Trying to convert long to ToDate format

My input is the long "20190503143744" and I want to convert it to a format like "2019-09-06 11:46:22".
I am trying the code below:
A = LOAD 'stp_master_subscriber_profile' using org.apache.hive.hcatalog.pig.HCatLoader() as (mdn:chararray, imei:chararray, imsi:chararray, subscriber_id:long, manufacturer:chararray, model:chararray, update_time:long, scenario:chararray, vz_customer:chararray, commit_time:long);
B = FOREACH A GENERATE ToString(ToDate((chararray)commit_time,'yyyyMMdd'),'yyyy-MM-dd HH:mm:ss') as event_date_gmt:chararray;
Getting error:
ERROR 1066: Unable to open iterator for alias B. Backend error : org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing [POUserFunc (Name: POUserFunc(org.apache.pig.builtin.ToDate2ARGS)[datetime] - scope-15 Operator Key: scope-15) children: null at []]: java.lang.IllegalArgumentException: Invalid format: "20190503143744" is malformed at "143744"
The issue is that you're specifying the format as yyyyMMdd but your original input is in yyyyMMddHHmmss format, so you get an error when Pig reaches 143744 instead of the end of your string. Try this:
B = FOREACH A GENERATE ToString(ToDate((chararray)commit_time,'yyyyMMddHHmmss'),
'yyyy-MM-dd HH:mm:ss') as event_date_gmt;
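The same fix, expressed outside Pig in plain Java for clarity (hypothetical class name), is just a matter of giving the parser the full fourteen-character pattern before reformatting:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class CommitTimeFormat {
    // Parse the compact yyyyMMddHHmmss form (the point where the original
    // Pig script failed: its yyyyMMdd pattern stopped before "143744"),
    // then re-emit it as yyyy-MM-dd HH:mm:ss.
    public static String reformat(String commitTime) {
        LocalDateTime dt = LocalDateTime.parse(commitTime,
                DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
        return dt.format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"));
    }
}
```

With the value from the question, `reformat("20190503143744")` yields `"2019-05-03 14:37:44"`.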

BigQuery: "Invalid function name"

In BigQuery, I ran this query
SELECT SEC_TO_TIMESTAMP(start_time), resource
FROM [logs.requestlogs_20140305]
and received the error
Error: Invalid function name: SEC_TO_TIMESTAMP
SEC_TO_TIMESTAMP is listed in the date functions reference, so why does this error come up?
It turns out my start_time column was of type float, not the integer that SEC_TO_TIMESTAMP expects. Changing the query to
SELECT SEC_TO_TIMESTAMP(INTEGER(start_time)), resource
FROM [logs.requestlogs_20140305]
fixed the problem.
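The equivalent of `SEC_TO_TIMESTAMP(INTEGER(start_time))` in plain Java (a sketch, assuming the column holds epoch seconds as a double) is a truncation followed by an epoch-second conversion:

```java
import java.time.Instant;

public class SecToTimestamp {
    // Mirror of INTEGER(start_time): truncate the float toward zero before
    // treating it as epoch seconds, since SEC_TO_TIMESTAMP requires an integer.
    public static Instant secToTimestamp(double startTime) {
        return Instant.ofEpochSecond((long) startTime);
    }
}
```

The cast is the whole fix: `secToTimestamp(1394000000.7)` and `secToTimestamp(1394000000.0)` both map to the same instant, just as `INTEGER(start_time)` discards the fractional seconds before the BigQuery function runs.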