Can't serialize object of type Clob in MuleSoft

I'm trying to receive information from a table with a Clob-type attribute, and this is how the object appears in the payload:
TypedValue[value: 'org.mule.runtime.core.internal.streaming.bytes.ManagedCursorStreamProvider#1c11df37', dataType: 'SimpleDataType{type=java.io.InputStream, mimeType='text/plain'}']
The BGQUEUE_EXEC_STATUS_VALUE and BGQUEUE_EXEC_VALUE attributes are the Clob columns, and they are used in the following flow:
<flow name="SYS_TBGQUEUEMigrationFlow" doc:id="1302cd81-6b21-424b-9bb8-b097d2ab0c3a" >
<db:select doc:name="Select SYS_TBGQUEUE from Oracle DB" doc:id="c7b33299-b772-4134-abdd-c7dcb3ad630a" config-ref="MYCAREER_DEV_DB" >
<db:sql >SELECT * FROM SYS_TBGQUEUE</db:sql>
</db:select>
<batch:job jobName="sysLogicBatch_Job" doc:id="08898cbe-62c5-4028-9f4f-dc8be9e6d0d9" >
<batch:process-records >
<batch:step name="Batch_Step" doc:id="836a7495-415c-4d5f-945a-03f1f2358cd8" >
<batch:aggregator doc:name="Batch Aggregator" doc:id="0425e88e-5e07-4f04-9016-872ccef025bd" streaming="true">
<foreach doc:name="For Each" doc:id="a2af7287-961e-4c1e-bd13-993d00e75370">
<db:stored-procedure doc:name="Insert into SYS_TBGQUEUE" doc:id="692971ea-d05c-430b-9ae4-3fe8b6d439dc" config-ref="Database_Config">
<db:sql>{call InsertIntoBgQueue (:BGQUEUE_CODE, :BGQUEUETYPE_CODE, :IDENTITY_CODE, :BGQUEUE_DATE_INSERT, :BGQUEUE_LOCK_WORKER_ID,
:BGQUEUE_EXEC_START_DATE, :BGQUEUE_EXEC_EXPIRE_DATE, :BGQUEUE_EXEC_VALUE, :BGQUEUE_EXEC_RETRIES,
:BGQUEUE_EXEC_HEARTBEAT_DATE, :BGQUEUE_EXEC_END_DATE, :BGQUEUE_EXEC_STATUS_ENUM, :BGQUEUE_EXEC_STATUS_VALUE, :BGQUEUE_STATUS)}</db:sql>
<db:input-parameters><![CDATA[#[{
BGQUEUE_CODE : payload.bgqueue_code,
BGQUEUETYPE_CODE : payload.bgqueuetype_code,
IDENTITY_CODE : payload.identity_code,
BGQUEUE_DATE_INSERT : payload.bgqueue_date_insert,
BGQUEUE_LOCK_WORKER_ID : payload.bgqueue_lock_worker_id,
BGQUEUE_EXEC_START_DATE : payload.bgqueue_exec_start_date,
BGQUEUE_EXEC_EXPIRE_DATE : payload.bgqueue_exec_expire_date,
BGQUEUE_EXEC_VALUE : payload.bgqueue_exec_value as String,
BGQUEUE_EXEC_RETRIES : payload.bgqueue_exec_retries,
BGQUEUE_EXEC_HEARTBEAT_DATE : payload.bgqueue_exec_heartbeat_date,
BGQUEUE_EXEC_END_DATE : payload.bgqueue_exec_end_date,
BGQUEUE_EXEC_STATUS_ENUM : payload.bgqueue_exec_status_enum,
BGQUEUE_EXEC_STATUS_VALUE : payload.bgqueue_exec_status_value as String,
BGQUEUE_STATUS : payload.bgqueue_status
}]]]></db:input-parameters>
</db:stored-procedure>
</foreach>
</batch:aggregator>
</batch:step>
</batch:process-records>
<batch:on-complete >
<logger level="INFO" doc:name="Logger" doc:id="a79350d8-65e9-44ac-a966-8a27036a5477" message="SYS_TBGQUEUE finished data migration." />
</batch:on-complete>
</batch:job>
</flow>
This is the error message:
Message : Could not dispatch records to batch queue BSQ-sysLogicBatch_Job-ab0fcad0-7b15-11ea-9403-3e6fc91c389b due to Serialization Exception
Error type : MULE:UNKNOWN
Element : SYS_TBGQUEUEMigrationFlow/processors/1 # DatabaseConnectorPOC:sysLogic.xml:114
Element XML : <batch:job jobName="sysLogicBatch_Job" doc:id="08898cbe-62c5-4028-9f4f-dc8be9e6d0d9">
<batch:process-records>
<batch:step name="Batch_Step" doc:id="836a7495-415c-4d5f-945a-03f1f2358cd8">
<batch:aggregator doc:name="Batch Aggregator" doc:id="0425e88e-5e07-4f04-9016-872ccef025bd" streaming="true">
<foreach doc:name="For Each" doc:id="a2af7287-961e-4c1e-bd13-993d00e75370">
<db:stored-procedure doc:name="Insert into SYS_TBGQUEUE" doc:id="692971ea-d05c-430b-9ae4-3fe8b6d439dc" config-ref="Database_Config">
<db:sql>{call InsertIntoBgQueue (:BGQUEUE_CODE, :BGQUEUETYPE_CODE, :IDENTITY_CODE, :BGQUEUE_DATE_INSERT, :BGQUEUE_LOCK_WORKER_ID,:BGQUEUE_EXEC_START_DATE, :BGQUEUE_EXEC_EXPIRE_DATE, :BGQUEUE_EXEC_VALUE, :BGQUEUE_EXEC_RETRIES,
:BGQUEUE_EXEC_HEARTBEAT_DATE, :BGQUEUE_EXEC_END_DATE, :BGQUEUE_EXEC_STATUS_ENUM, :BGQUEUE_EXEC_STATUS_VALUE, :BGQUEUE_STATUS)}</db:sql>
<db:input-parameters>#[output application/java
---
{
BGQUEUE_CODE : payload.bgqueue_code,
BGQUEUETYPE_CODE : payload.bgqueuetype_code,
IDENTITY_CODE : payload.identity_code,
BGQUEUE_DATE_INSERT : payload.bgqueue_date_insert,
BGQUEUE_LOCK_WORKER_ID : payload.bgqueue_lock_worker_id,
BGQUEUE_EXEC_START_DATE : payload.bgqueue_exec_start_date,
BGQUEUE_EXEC_EXPIRE_DATE : payload.bgqueue_exec_expire_date,
BGQUEUE_EXEC_VALUE : payload.bgqueue_exec_value,
BGQUEUE_EXEC_RETRIES : payload.bgqueue_exec_retries,
BGQUEUE_EXEC_HEARTBEAT_DATE : payload.bgqueue_exec_heartbeat_date,
BGQUEUE_EXEC_END_DATE : payload.bgqueue_exec_end_date,
BGQUEUE_EXEC_STATUS_ENUM : payload.bgqueue_exec_status_enum,
BGQUEUE_EXEC_STATUS_VALUE : payload.bgqueue_exec_status_value,
BGQUEUE_STATUS : payload.bgqueue_status
}]</db:input-parameters>
</db:stored-procedure>
</foreach>
</batch:aggregator>
</batch:step>
</batch:process-records>
<batch:on-complete>
<logger level="INFO" doc:name="Logger" doc:id="a79350d8-65e9-44ac-a966-8a27036a5477" message="SYS_TBGQUEUE finished data migration."></logger>
</batch:on-complete>
</batch:job>
Does anyone know how I can convert it into a String?
Thank you!

The problem appears to be that the Clob cannot be serialized when the records are loaded into the Batch Job queue.
You may need to convert the Clob into a String before entering the Batch Job, with a transform like this:
( payload map {
BGQUEUE_CODE : payload.bgqueue_code,
BGQUEUETYPE_CODE : payload.bgqueuetype_code,
IDENTITY_CODE : payload.identity_code,
BGQUEUE_DATE_INSERT : payload.bgqueue_date_insert,
BGQUEUE_LOCK_WORKER_ID : payload.bgqueue_lock_worker_id,
BGQUEUE_EXEC_START_DATE : payload.bgqueue_exec_start_date,
BGQUEUE_EXEC_EXPIRE_DATE : payload.bgqueue_exec_expire_date,
BGQUEUE_EXEC_VALUE : (payload.bgqueue_exec_value as String) default "",
BGQUEUE_EXEC_RETRIES : payload.bgqueue_exec_retries,
BGQUEUE_EXEC_HEARTBEAT_DATE : payload.bgqueue_exec_heartbeat_date,
BGQUEUE_EXEC_END_DATE : payload.bgqueue_exec_end_date,
BGQUEUE_EXEC_STATUS_ENUM : payload.bgqueue_exec_status_enum,
BGQUEUE_EXEC_STATUS_VALUE : (payload.bgqueue_exec_status_value as String) default "",
BGQUEUE_STATUS : payload.bgqueue_status
} ) as Iterator
The "as Iterator" part is just an optimization; it can be removed if you run into a problem related to it (such as a DB session error).
Then, in db:input-parameters, you only need to put "payload", since the keys will already match the parameter names you need.
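For reference, outside of DataWeave the Clob-to-String conversion is plain JDBC work. A minimal Java sketch, with an illustrative ClobUtil helper that is not part of the flow above:

```java
import java.io.Reader;
import java.io.StringWriter;
import java.sql.Clob;

public class ClobUtil {
    // Illustrative helper: drain a java.sql.Clob into a String so that only
    // serializable data is handed to the batch queue.
    public static String clobToString(Clob clob) throws Exception {
        if (clob == null) {
            return "";
        }
        try (Reader reader = clob.getCharacterStream();
             StringWriter out = new StringWriter()) {
            char[] buf = new char[4096];
            int n;
            while ((n = reader.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toString();
        }
    }
}
```

In a Mule flow the DataWeave transform above is usually enough; the Java version is only useful if you already do the conversion in a custom component.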

Related

Dynamic SELECT query which decides whether to use the WHERE clause in Mule 4

I am using Mule 4 and Anypoint 7, and I want to set up the database connector to SELECT all customers from my SQL Server database table. However, if the customerName query parameter is populated in the request, I want to add a WHERE clause so that only customers whose name matches the customerName query parameter are returned; otherwise it should just return all customers.
My code is below, but I am struggling to get the syntax correct.
<db:select doc:name="Select Customers" doc:id="98a4aa2f-b0b6-4fb5-ab27-d70489fd532d" config-ref="db-config">
<ee:repeatable-file-store-iterable />
<db:sql >SELECT TOP 10 * FROM MYDB.dbo.Customer $(if (attributes.queryParams.customerName != null and isEmpty(attributes.queryParams.customerName) == false) "WHERE Name = :customerName" else "")</db:sql>
<db:input-parameters ><![CDATA[#[{'customerName' : attributes.queryParams.customerName}]]]></db:input-parameters>
</db:select>
How can I do this?
Thanks
You were on the right path. I think you were only missing the expression tags (#[...]) around the SQL in the db:sql element.
<db:select doc:name="Select Customers" doc:id="98a4aa2f-b0b6-4fb5-ab27-d70489fd532d"
config-ref="db-config">
<ee:repeatable-file-store-iterable />
<db:sql>#["SELECT TOP 10 * FROM MYDB.dbo.Customer
$(if (isEmpty(attributes.queryParams.customerName) == false) "WHERE Name = :customerName" else "") "]</db:sql>
<db:input-parameters ><![CDATA[#[{'customerName' : attributes.queryParams.customerName}]]]></db:input-parameters>
</db:select>
It is easier to debug things like this with variables, so that you can see the individual values. FWIW, here's my test code:
<set-variable variableName="additionalWhereClause"
value='#[if ( isEmpty(attributes.queryParams.email) == false)
"WHERE Email = :emailParm"
else "" ]' />
<set-variable variableName="selectSql"
value="#['SELECT FirstName, LastName, Email
FROM User
$( vars.additionalWhereClause )
ORDER BY Email LIMIT 10']" />
<logger level="INFO" message="queryParams: #[attributes.queryParams]" doc:id="96c62f84-2c98-4df6-829c-e00c9fcec9ca" />
<logger level="INFO" message="additionalWhereClause #[vars.additionalWhereClause]" doc:id="0d3611b4-34ae-4ebb-b931-6d31ce3804c1" />
<logger level="INFO" message="selectSql #[vars.selectSql]" doc:id="5c56342d-9674-4891-9d7e-bb32319f4ad0" />
<db:select doc:name="MySQL Query" doc:id="e60be3e6-9b51-4b3b-9dfa-4ee0af65cb03"
config-ref="mysql-config">
<ee:repeatable-file-store-iterable />
<db:sql>#[ vars.selectSql ]</db:sql>
<db:input-parameters><![CDATA[#[{'emailParm' : attributes.queryParams.email}]]]></db:input-parameters>
</db:select>
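The underlying idea in both snippets is conditional concatenation of the WHERE fragment onto a base query. The same logic, sketched in plain Java with illustrative names:

```java
public class QueryBuilder {
    // Append the WHERE fragment only when a filter value is present,
    // mirroring the $(if ...) interpolation in the DataWeave expression.
    public static String customerQuery(String customerName) {
        String base = "SELECT TOP 10 * FROM MYDB.dbo.Customer";
        if (customerName != null && !customerName.trim().isEmpty()) {
            return base + " WHERE Name = :customerName";
        }
        return base;
    }
}
```

Keeping :customerName as a bound parameter (rather than concatenating the value itself into the SQL) is what keeps either variant safe from SQL injection.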

How to stop executing a flow reference until the complete batch is finished?

In my project I am reading a huge TSV file (around 820k) and processing the records into a DB. I have a Java file which does the validation; after validation I store the records in a list. Since I can't send the whole list to the VM, I split the list into chunks of 2000 and send them to the flow. Here is how I send 2000 records:
while(i < validRecords.size()){
int j = i + 2000;
if( j < validRecords.size())
{
muleClient.dispatch("vm://validRecordsEtoc", validRecords.subList(i, j), msgProperties);
}
else
{
muleClient.dispatch("vm://endRecordsEtoc", validRecords.subList(i,validRecords.size()-1), msgProperties);
}
i = i + 2000;
}
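Note that List.subList treats its second index as exclusive, so subList(i, validRecords.size() - 1) in the else branch above silently drops the last record. A corrected sketch of the chunking logic, with an illustrative helper name:

```java
import java.util.ArrayList;
import java.util.List;

public class ListChunker {
    // Split a list into fixed-size chunks. The upper bound passed to subList
    // is exclusive, so the final chunk must end at list.size(), not size() - 1.
    public static <T> List<List<T>> chunks(List<T> list, int size) {
        List<List<T>> out = new ArrayList<>();
        for (int i = 0; i < list.size(); i += size) {
            out.add(list.subList(i, Math.min(i + size, list.size())));
        }
        return out;
    }
}
```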
I have a flow which references a batch job and processes the records into the DB.
Here is the flow:
<flow name="csv-source-inputFlow_Clean_etoc">
<vm:inbound-endpoint exchange-pattern="one-way" path="validRecordsEtoc" doc:name="VM_TFO_ETOC_SUBS_SRC"/>
<enricher doc:name="Enrich inbound Message Properties">
<logger message="Etoc source message enricher : BatchId : #[message.inboundProperties.flow_batch_id] Total Records : #[message.inboundProperties.totalRecordCount] Successful Records : #[message.inboundProperties.successRecordCount]" level="INFO" doc:name="Logger"/>
<enrich source="#[message.inboundProperties.'flow_batch_id']" target="#[flowVars.flow_batch_id]"/>
<enrich source="#[message.inboundProperties.'flow_source_name']" target="#[flowVars.flow_source_name]"/>
<enrich source="#[message.inboundProperties.'successRecordCount']" target="#[flowVars.successRecordCount]"/>
<enrich source="#[message.inboundProperties.'totalRecordCount']" target="#[flowVars.totalRecords]"/>
<enrich source="#[message.inboundProperties.'input_file_name']" target="#[flowVars.input_file_name]"/>
</enricher>
<batch:execute name="csv-source-inputBatch_ETOC" doc:name="csv-source-inputBatch_ETOC"/>
</flow>
<batch:job name="csv-source-inputBatch_ETOC">
<batch:input>
<set-payload value="#[payload]" doc:name="Set Payload_TFO_ETOC"/>
</batch:input>
<batch:process-records>
<batch:step name="TFOETOC_CLEAN_Batch_Step">
<batch:commit size="2000" doc:name="TFO_ETOC Clean Batch Commit">
<db:insert config-ref="LocalhostPostgres" bulkMode="true" doc:name="Enrich_TFO_ETOC_Src_Data">
<db:parameterized-query><![CDATA[INSERT INTO public."csv_tfo_3.4.1 etoc subs"(
seq_id, batch_id, publication, code, first_name,
last_name,subscription_date, email, marketable ,
organization, department, address1 ,
address2, city, state,
zip_code, country, phone,
status, source)
values (DEFAULT,#[flowVars.flow_batch_id],#[payload.Publication] , #[payload.Code] , #[payload.First_Name],#[payload.Last_Name], to_timestamp(#[payload.Subscription_Date],'yyyy-mm-dd hh24:mi:ss'),#[payload.Email],#[payload.Marketable],#[payload.Organization],#[payload.Department],#[payload.Address1],#[payload.Address2],#[payload.City],#[payload.State],#[payload.Zip_Code],#[payload.Country],#[payload.Phone],'pending','TFO')]]></db:parameterized-query>
</db:insert>
</batch:commit>
</batch:step>
</batch:process-records>
</batch:job>
When I process the last list, I send it to another flow because I need to call the sub-flow in on-complete; however, the last VM message gets executed before the previous lists have finished processing.
I followed the link below:
How to read huge CSV file in Mule
Can anyone give me an idea of how to do this?

Replace a particular XML node element value with another value in Mule

<healthcare>
<plans>
<plan1>
<planid>100</planid>
<planname>medical</planname>
<desc>medical</desc>
<offerprice>500</offerprice>
<area>texas</area>
</plan1>
<plan2>
<planid>101</planid>
<planname>dental</planname>
<desc>dental</desc>
<offerprice>1000</offerprice>
<area>texas</area>
</plan2>
</plans>
</healthcare>
<splitter evaluator="xpath" expression="/healthcare/plans" doc:name="Splitter"/>
<transformer ref="domToXml" doc:name="Transformer Reference"/>
<logger level="INFO" doc:name="Logger" message=" plans detils...#[message.payload]" />
I want to replace the offerprice value with other values at runtime. I have tried various ways without success; any help is appreciated.
You could use XSLT with identity templates to replace the one element. Or, if you really want to do it with MEL, convert the payload to a DOM and use the Dom4j APIs to set the value, then convert back to XML if needed:
<expression-component><![CDATA[
node = message.payload.getRootElement().selectSingleNode('//plans/plan1/planid');
node.text = 'newvalue';
]]></expression-component>
<mulexml:dom-to-xml-transformer />
<logger level="ERROR" message=" #[payload]" />
EDIT
Here is an example if you want to update multiple nodes. If the transformation gets any more complex, I would really suggest taking a look at XSLT.
<mulexml:xml-to-dom-transformer returnClass="org.dom4j.Document" />
<expression-component><![CDATA[
plans = message.payload.getRootElement().selectNodes('//plans/*');
foreach (plan : plans){
plan.selectSingleNode('offerprice').text = '3000';
} ]]></expression-component>
<mulexml:dom-to-xml-transformer />
<logger level="ERROR" message=" #[payload]" />
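If you would rather avoid Dom4j, the same multi-node update can be sketched with only the JDK's built-in DOM and XPath APIs (the class and method names here are illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.StringWriter;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class PriceUpdater {
    // Set every offerprice element under /healthcare/plans to a new value
    // and serialize the document back to an XML string.
    public static String updateOfferPrices(String xml, String newPrice) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        XPath xpath = XPathFactory.newInstance().newXPath();
        NodeList prices = (NodeList) xpath.evaluate("//plans/*/offerprice", doc, XPathConstants.NODESET);
        for (int i = 0; i < prices.getLength(); i++) {
            prices.item(i).setTextContent(newPrice);
        }
        Transformer t = TransformerFactory.newInstance().newTransformer();
        StringWriter out = new StringWriter();
        t.transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }
}
```

This variant runs anywhere a JDK is available, so it is easy to unit-test outside of Mule.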

Mule JDBC endpoint causing exception while executing SQL query

I have run into a strange issue in Mule. I have a web service exposed in Mule that performs simple CRUD operations.
Now the issue is with this SQL query:
if not exists (select * from sysobjects where name='getData' and xtype='U')create table getData (ID int NOT NULL, NAME varchar(50) NULL,AGE int NULL,DESIGNATION varchar(50) NULL)
This query checks whether the table exists in the database: if it does, it leaves it alone, and if it doesn't, it creates a new table with that name and those fields.
Now I want to run this query before an insert DB operation: if the table exists, the flow will simply insert the data; if it doesn't exist, the flow will create the table first and then insert the data.
So my Mule flow is the following:
<jdbc-ee:connector name="Database_Global" dataSource-ref="DB_Source" validateConnections="true" queryTimeout="-1" pollingFrequency="0" doc:name="Database">
<jdbc-ee:query key="CheckTableExistsQuery" value="if not exists (select * from sysobjects where name='getData' and xtype='U')create table getData (ID int NOT NULL, NAME varchar(50) NULL,AGE int NULL,DESIGNATION varchar(50) NULL)"/>
<jdbc-ee:query key="InsertQuery" value="INSERT INTO getData(ID,NAME,AGE,DESIGNATION)VALUES(#[flowVars['id']],#[flowVars['name']],#[flowVars['age']],#[flowVars['designation']])"/>
</jdbc-ee:connector>
<flow name="MuleDbInsertFlow1" doc:name="MuleDbInsertFlow1">
<http:inbound-endpoint exchange-pattern="request-response" host="localhost" port="8082" path="mainData" doc:name="HTTP"/>
<cxf:jaxws-service service="MainData" serviceClass="com.test.services.schema.maindata.v1.MainData" doc:name="SOAPWithHeader" />
<component class="com.test.services.schema.maindata.v1.Impl.MainDataImpl" doc:name="JavaMain_ServiceImpl"/>
<mulexml:object-to-xml-transformer doc:name="Object to XML"/>
<choice doc:name="Choice">
<when expression="#[message.inboundProperties['SOAPAction'] contains 'insertDataOperation']">
<processor-chain doc:name="Processor Chain">
<logger message="INSERTDATA" level="INFO" doc:name="Logger"/>
<jdbc-ee:outbound-endpoint exchange-pattern="request-response" queryKey="CheckTableExistsQuery" queryTimeout="-1" connector-ref="Database_Global" doc:name="Database (JDBC)"/>
<jdbc-ee:outbound-endpoint exchange-pattern="request-response" queryKey="InsertQuery" queryTimeout="-1" connector-ref="Database_Global" doc:name="Database (JDBC)"/>
//remaining code ......
As you can see, I am trying to call CheckTableExistsQuery before InsertQuery so that it checks whether the table exists and then inserts the data, but I am getting the following exception:
ERROR 2014-09-21 14:03:48,424 [[test].connector.http.mule.default.receiver.02] org.mule.exception.CatchMessagingExceptionStrategy:
********************************************************************************
Message : Failed to route event via endpoint: DefaultOutboundEndpoint{endpointUri=jdbc://CheckTableExistsQuery, connector=EEJdbcConnector
{
name=Database_Global
lifecycle=start
this=79fcce6c
numberOfConcurrentTransactedReceivers=4
createMultipleTransactedReceivers=false
connected=true
supportedProtocols=[jdbc]
serviceOverrides=<none>
}
, name='endpoint.jdbc.CheckTableExistsQuery', mep=REQUEST_RESPONSE, properties={queryTimeout=-1}, transactionConfig=Transaction{factory=null, action=INDIFFERENT, timeout=0}, deleteUnacceptedMessages=false, initialState=started, responseTimeout=10000, endpointEncoding=UTF-8, disableTransportTransformer=false}. Message payload is of type: String
Code : MULE_ERROR--2
--------------------------------------------------------------------------------
Exception stack is:
1. No SQL Strategy found for SQL statement: {if not exists (select * from sysobjects where name='getData' and xtype='U')create table getData (ID int NOT NULL, NAME varchar(50) NULL,AGE int NULL,DESIGNATION varchar(50) NULL)} (java.lang.IllegalArgumentException)
com.mulesoft.mule.transport.jdbc.sqlstrategy.EESqlStatementStrategyFactory:105 (null)
2. Failed to route event via endpoint: DefaultOutboundEndpoint{endpointUri=jdbc://CheckTableExistsQuery, connector=EEJdbcConnector
{
name=Database_Global
lifecycle=start
this=79fcce6c
numberOfConcurrentTransactedReceivers=4
createMultipleTransactedReceivers=false
connected=true
supportedProtocols=[jdbc]
serviceOverrides=<none>
}
, name='endpoint.jdbc.CheckTableExistsQuery', mep=REQUEST_RESPONSE, properties={queryTimeout=-1}, transactionConfig=Transaction{factory=null, action=INDIFFERENT, timeout=0}, deleteUnacceptedMessages=false, initialState=started, responseTimeout=10000, endpointEncoding=UTF-8, disableTransportTransformer=false}. Message payload is of type: String (org.mule.api.transport.DispatchException)
org.mule.transport.AbstractMessageDispatcher:117 (http://www.mulesoft.org/docs/site/current3/apidocs/org/mule/api/transport/DispatchException.html)
--------------------------------------------------------------------------------
Root Exception stack trace:
java.lang.IllegalArgumentException: No SQL Strategy found for SQL statement: {if not exists (select * from sysobjects where name='getData' and xtype='U')create table getData (ID int NOT NULL, NAME varchar(50) NULL,AGE int NULL,DESIGNATION varchar(50) NULL)}
at com.mulesoft.mule.transport.jdbc.sqlstrategy.EESqlStatementStrategyFactory.create(EESqlStatementStrategyFactory.java:105)
at org.mule.transport.jdbc.JdbcMessageDispatcher.doSend(JdbcMessageDispatcher.java:65)
at org.mule.transport.AbstractMessageDispatcher.process(AbstractMessageDispatcher.java:84)
+ 3 more (set debug level logging or '-Dmule.verbose.exceptions=true' for everything)
********************************************************************************
But the strange fact is that if I implement the same thing in Java, it works fine. For example, in Java I use JdbcTemplate to execute the query:
/* Check table exists and create it */
String checkTableExists=getQueryByKey("CheckTableExistsQuery"); // Query for check existing table
jdbcTemplate.execute(checkTableExists); //Create Table If not exists
try {
String insertDataIntoDB = getQueryByKey("InsertQuery");
jdbcTemplate.update(insertDataIntoDB, ID, NAME, AGE,
DESIGNATION);
dataResponse.setResponse("Data inserted Successfully");
} catch (DataIntegrityViolationException e) {
SQLException sql = (SQLException) e.getCause();
e.printStackTrace();
throw sql;
} catch (Exception e) {
e.printStackTrace();
throw e;
}
Please let me know how to execute the query
if not exists (select * from sysobjects where name='getData' and xtype='U')create table getData (ID int NOT NULL, NAME varchar(50) NULL,AGE int NULL,DESIGNATION varchar(50) NULL)
successfully from Mule. Why does it fail from the Mule JDBC endpoint while it executes fine through JdbcTemplate in Java code?
Mule doesn't recognize the if not exists... query and thus doesn't know what to do with it.
To fix this you need to:
create your own org.mule.transport.jdbc.sqlstrategy.SqlStatementStrategyFactory by sub-classing the default one and adding extra behaviour to support this type of query,
Spring-inject it into the JdbcConnector.
So, as per David's suggestion, I ended up using the if not exists query in a Java component and a Groovy component in the Mule flow, and it is working for me.
I came across the exact same error. Indeed, Mule raises it when it doesn't know what to do with a statement, whether an unsupported SQL query or even a missing queryKey:
java.lang.IllegalArgumentException: No SQL Strategy found for SQL statement
In my case it was the latter: my test suite's jdbc:connector was missing from the classpath, so I added it.
In your case, try rewriting the query as follows. This one worked for me:
DROP TABLE if exists your_table;
CREATE TABLE your_table(...

Passing date parameter to jdbc query in Mule

I have a flow in Mule where I want to use a date parameter I get from one query as the input to another query.
<jdbc:connector name="myConnector" transactionPerMessage="false" dataSource-ref="myDataSource">
<jdbc:query key="getPollTimes" value="SELECT to_char(last_poll_start, 'YYYY-MM-DD HH24:MI:SS') as last_poll_start, to_char(last_poll_end, 'YYYY-MM-DD HH24:MI:SS') as last_poll_end FROM db_sources WHERE source_system = 'mySystem'" />
<jdbc:query key="getCustomerIds" value="SELECT id FROM customers WHERE updated < TO_DATE(#[variable:last_poll_end],'YYYY-MM-DD HH24:MI:SS')" />
</jdbc:connector>
<flow name="myFlow">
<enricher>
<jdbc:outbound-endpoint queryKey="getPollTimes" exchange-pattern="request-response" />
<enrich target="#[variable:last_poll_end]" source="#[groovy:payload.last_poll_end]"/>
</enricher>
<logger level="INFO" message="last_poll_end = #[variable:last_poll_end]" />
<jdbc:outbound-endpoint queryKey="getCustomerIds" exchange-pattern="request-response" />
</flow>
I cannot get this to work when running it (note that I am using an Oracle DB). I have included the exception below. Has anyone encountered this?
--------------------------------------------------------------------------------
Exception stack is:
1. Invalid column type(SQL Code: 17004, SQL State: + null) (java.sql.SQLException)
oracle.jdbc.driver.DatabaseError:113 (null)
2. Invalid column type Query: SELECT ID FROM CUSTOMERS WHERE UPDATED < TO_DATE(?,'YYYY-MM-DD HH24:MI:SS') Parameters: [[2000-01-01]](SQL Code: 17004, SQL State: + null) (java.sql.SQLException)
org.apache.commons.dbutils.QueryRunner:540 (null)
3. Failed to route event via endpoint: DefaultOutboundEndpoint{endpointUri=jdbc://getCustomerIds, connector=JdbcConnector
{
name=myConnector
lifecycle=start
this=668e94
numberOfConcurrentTransactedReceivers=4
createMultipleTransactedReceivers=false
connected=true
supportedProtocols=[jdbc]
serviceOverrides=<none>
}
, name='endpoint.jdbc.getCustomerIds', mep=REQUEST_RESPONSE, properties={queryTimeout=-1}, transactionConfig=Transaction{factory=null, action=INDIFFERENT, timeout=0}, deleteUnacceptedMessages=false, initialState=started, responseTimeout=10000, endpointEncoding=UTF-8, disableTransportTransformer=false}. Message payload is of type: ArrayList (org.mule.api.transport.DispatchException)
org.mule.transport.AbstractMessageDispatcher:106 (http://www.mulesoft.org/docs/site/current3/apidocs/org/mule/api/transport/DispatchException.html)
--------------------------------------------------------------------------------
The problem is solved. The issue was partly that the date value I got back from the first query was stored in an array; to resolve this I pick out the first element. Besides that, I removed the to_date() in the second SQL query.
This gets the first element of the array:
<enrich target="#[variable:last_poll_end]" source="#[groovy:payload.last_poll_end[0]]"/>
The updated SQL:
<jdbc:query key="getCustomerIds" value="SELECT id FROM customers WHERE updated < #[variable:last_poll_end]" />
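The unwrapping step matters because a Mule JDBC select returns its result set as a list of row maps; the value has to be pulled out of row 0 before it can bind as a single parameter. A plain-Java sketch of that step, with an illustrative helper name:

```java
import java.util.List;
import java.util.Map;

public class RowUnwrapper {
    // A select's payload is a list of row maps; to use one column value as a
    // single parameter, take it from the first row (or null if no rows).
    public static Object firstColumnValue(List<Map<String, Object>> rows, String column) {
        if (rows == null || rows.isEmpty()) {
            return null;
        }
        return rows.get(0).get(column);
    }
}
```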