MuleSoft: Insert into SQL Server fails because of null constraint even though payload has data - sql

I'm still new to MuleSoft's Mule ESB and I just cannot get past this issue.
I'm having trouble understanding why my Mule flow is failing with a SQL Server insert error.
It appears that the payload has data, but for some reason the query parameters are not being populated with that data. I can read from a table or file fine; it's just writing to a DB table that goes sideways.
What do I need to do to write my payload to a sql server database?
Below is the error message I get when I run my Mule application in Anypoint Studio. I've removed the bulk of the payload from the message due to privacy.
I've also posted my flow xml.
Any help would be appreciated. Thanks
********************************************************************************
Message : Cannot insert the value NULL into column 'AgentId', table 'GARDB1dev.src.AgentList'; column does not allow nulls. INSERT fails. (com.microsoft.sqlserver.jdbc.SQLServerException).
Payload : [{Agent_ID=10032, **REDACTED DATA** [..]]
Payload Type : java.util.ArrayList
Element : /Load-AgentList/processors/2/1/1 # gar-data-load:sharepoint.xml:51
Element XML : <db:insert config-ref="GAR-DB-Connection-SSPI" doc:name="Insert AgentList Data">
<db:parameterized-query>insert into src.AgentList (AgentId,PartyId,PartyType,AgentIdStatus,SalesOrg,AgentIdStatusStart,AgentIdStatusEnd,AgentName,SubOrganization,[Function],NewToOrg,NewToChannel,DaysAgentIdActive,Region,AgentPhone,CertificationSummary)Values (#[payload.Agent_ID],#[payload.Party_ID],#[payload.Party_Type],#[payload.Agent_ID_Status],#[payload.Sales_Organization],#[payload.Agent_ID_Status_Start],#[payload.Agent_ID_Status_End],#[payload.Agent_Name],#[payload.Sub_Organization],#[payload.Function],#[payload.New_To_Org],#[payload.New_To_Channel],#[payload.Days_Agent_ID_Active],#[payload.Region],#[payload.Agent_Phone],#[payload.Certification_Summary])</db:parameterized-query>
</db:insert>
--------------------------------------------------------------------------------
The XML description of my Mule Flow is the following:
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:db="http://www.mulesoft.org/schema/mule/db" xmlns:tracking="http://www.mulesoft.org/schema/mule/ee/tracking" xmlns:file="http://www.mulesoft.org/schema/mule/file" xmlns:dw="http://www.mulesoft.org/schema/mule/ee/dw" xmlns:metadata="http://www.mulesoft.org/schema/mule/metadata" xmlns:sharepoint2010="http://www.mulesoft.org/schema/mule/sharepoint2010" xmlns:sharepoint="http://www.mulesoft.org/schema/mule/sharepoint" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
xmlns:spring="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
http://www.mulesoft.org/schema/mule/sharepoint http://www.mulesoft.org/schema/mule/sharepoint/current/mule-sharepoint.xsd
http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/sharepoint2010 http://www.mulesoft.org/schema/mule/sharepoint2010/current/mule-sharepoint2010.xsd
http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.mulesoft.org/schema/mule/ee/dw http://www.mulesoft.org/schema/mule/ee/dw/current/dw.xsd
http://www.mulesoft.org/schema/mule/db http://www.mulesoft.org/schema/mule/db/current/mule-db.xsd
http://www.mulesoft.org/schema/mule/ee/tracking http://www.mulesoft.org/schema/mule/ee/tracking/current/mule-tracking-ee.xsd">
<file:connector name="input" autoDelete="false" streaming="true" validateConnections="true" doc:name="File" readFromDirectory="C:\Users\rpearso7\_gar-data\input" />
<flow name="Load-AgentList">
<file:inbound-endpoint path="C:\Users\rpearso7\_gar-data\input" responseTimeout="10000" doc:name="File" moveToDirectory="C:\Users\rpearso7\_gar-data\archive" connector-ref="input" moveToPattern="#[function:datestamp]-#[message.inboundProperties['originalFilename']]">
<file:filename-regex-filter pattern="agent_list\.csv" caseSensitive="false"/>
</file:inbound-endpoint>
<dw:transform-message doc:name="Transform Message" metadata:id="fdb1aea0-6e94-425f-9493-ee71983b8eb1">
<dw:input-payload mimeType="application/csv"/>
<dw:set-payload><![CDATA[%dw 1.0
%output application/java
---
payload]]></dw:set-payload>
</dw:transform-message>
<logger message="************** Payload Loaded" level="INFO" doc:name="Log Payload"/>
<scatter-gather doc:name="Scatter-Gather">
<threading-profile maxThreadsActive="1" poolExhaustedAction="WAIT"/>
<processor-chain>
<logger message="***************************** Truncate AgentList Table" level="INFO" doc:name="Logger"/>
<db:update config-ref="GAR-DB-Connection-SSPI" doc:name="Truncate table">
<db:parameterized-query><![CDATA[truncate table src.AgentList;]]></db:parameterized-query>
</db:update>
<logger message="************************************ AgentList table truncated" level="INFO" doc:name="Logger"/>
</processor-chain>
<processor-chain>
<logger message="********************************* Start Insert" level="INFO" doc:name="Logger"/>
<db:insert config-ref="GAR-DB-Connection-SSPI" doc:name="Insert AgentList Data">
<db:parameterized-query><![CDATA[insert into src.AgentList (
AgentId,
PartyId,
PartyType,
AgentIdStatus,
SalesOrg,
AgentIdStatusStart,
AgentIdStatusEnd,
AgentName,
SubOrganization,
[Function],
NewToOrg,
NewToChannel,
DaysAgentIdActive,
Region,
AgentPhone,
CertificationSummary)
Values (
#[payload.Agent_ID],
#[payload.Party_ID],
#[payload.Party_Type],
#[payload.Agent_ID_Status],
#[payload.Sales_Organization],
#[payload.Agent_ID_Status_Start],
#[payload.Agent_ID_Status_End],
#[payload.Agent_Name],
#[payload.Sub_Organization],
#[payload.Function],
#[payload.New_To_Org],
#[payload.New_To_Channel],
#[payload.Days_Agent_ID_Active],
#[payload.Region],
#[payload.Agent_Phone],
#[payload.Certification_Summary])]]></db:parameterized-query>
</db:insert>
</processor-chain>
</scatter-gather>
<logger message="**************** Complete" level="INFO" doc:name="Logger"/>
</flow>
</mule>

I have had a look at what happens in the debugger when I use your transformation on a CSV file.
I have set a breakpoint at the first logger, so I could stop the execution and watch how different Mule Expression Language (MEL) expressions behave at different steps of your flow.
When I evaluate #[payload] after the transformation, I get an ArrayList with the lines of the CSV file.
This seems to be correct.
However, since the scatter-gather only sends a copy of the same payload to each route, the same payload is visible in the debugger after the scatter-gather as well.
For details see the following blog post: Scatter-Gather in Mule ESB
Scatter-Gather is a routing message processor in Mule ESB runtime that
sends a request message to multiple targets concurrently. It then
collects the responses from all routes and aggregates them back into a
single response.
According to this, the DB INSERT processing step in your flow gets an ArrayList. The elements of the ArrayList can be indexed with a zero-based integer index; you cannot address them with the column names of the CSV file directly.
First you have to select an entry from the ArrayList, and then you can use the column headers from the CSV file on that entry, like in the following example:
#[payload[0].Agent_ID]
This MEL expression returns the Agent ID as expected. But if you try to execute #[payload.Agent_ID], you get null.
In my opinion this is exactly what happens in your INSERT: instead of inserting each entry of the ArrayList created from the CSV file, you try to access the CSV columns on the ArrayList itself and not on its elements.
A potential solution could be to add a foreach scope.
Using this you could iterate through the entries of the ArrayList and insert each of them into your database table.
Please also see the results of different MEL expressions at the different stages of a simplified version of your flow.
(Please note that the Mule Expression Language is case-sensitive: if the "ID" substring is replaced with "Id", then I get null back.)
I have been testing with a simple CSV file with the following contents:
Agent_ID,Agent_Name
7,james bond
1,Mr. 47
8,Dr. X
You could consider a solution similar to the following one:
I have just added a foreach scope.
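A minimal sketch of that change, reduced to the two columns of my simple test CSV (in your flow the full column list and the #[payload.*] parameters stay exactly as you wrote them, only wrapped in the foreach):
<foreach collection="#[payload]" doc:name="For Each">
    <!-- inside the foreach, #[payload] is a single row (a map), so the CSV
         column names resolve and the query parameters are no longer null -->
    <db:insert config-ref="GAR-DB-Connection-SSPI" doc:name="Insert AgentList Data">
        <db:parameterized-query><![CDATA[insert into src.AgentList (AgentId, AgentName)
Values (#[payload.Agent_ID], #[payload.Agent_Name])]]></db:parameterized-query>
    </db:insert>
</foreach>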
This flow has generated the following console-output:
INFO 2018-03-14 16:08:37,736 [[testproject01].connector.file.mule.default.receiver.01] org.mule.transport.file.FileMessageReceiver: Lock obtained on file: X:\SomePathToTheFile\agent_list.csv
INFO 2018-03-14 16:08:38,485 [[testproject01].Load-AgentList.stage1.02] org.mule.api.processor.LoggerMessageProcessor: ************** Payload Loaded
INFO 2018-03-14 16:08:38,488 [[testproject01].ScatterGatherWorkManager.01] org.mule.api.processor.LoggerMessageProcessor: ***************************** Truncate AgentList Table
INFO 2018-03-14 16:08:38,506 [[testproject01].ScatterGatherWorkManager.01] org.mule.api.processor.LoggerMessageProcessor: ********************************* Start Insert
INFO 2018-03-14 16:08:38,506 [[testproject01].ScatterGatherWorkManager.01] org.mule.api.processor.LoggerMessageProcessor: ********************************* Start Insert
INFO 2018-03-14 16:08:38,507 [[testproject01].ScatterGatherWorkManager.01] org.mule.api.processor.LoggerMessageProcessor: ********************************* Start Insert
I hope this is what you are trying to achieve.
If you have further questions or I misunderstood your goal, then please add a comment.
Thank you very much.

We have experienced deadlock issues in a similar scenario, when another process is running an INSERT statement against the same database table.
Scenario:
A TRUNCATE is being executed on a table while, at the same time, another process concurrently INSERTs data into the same table of the database.
First prioritize the TRUNCATE and acquire a lock on the table. Once it is done, then go for the INSERT.
You should request a table lock before you execute the TRUNCATE.
If you do this you can't get a deadlock -- the table lock won't be granted before the INSERT finishes and once you have the lock another INSERT can't occur.
Update from the comment:
You can take an exclusive table lock; note that SQL Server has no LOCK TABLE command, so this is done with a table hint such as TABLOCKX.
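A sketch of how the truncate step from the flow above could take that lock first. TABLOCKX and HOLDLOCK are standard SQL Server table hints; whether the db connector accepts a multi-statement batch like this depends on your driver settings, so a stored procedure may be a safer home for the same statements:
<db:update config-ref="GAR-DB-Connection-SSPI" doc:name="Truncate table (locked)">
    <db:parameterized-query><![CDATA[
BEGIN TRANSACTION;
-- TABLOCKX + HOLDLOCK take an exclusive table lock and hold it until COMMIT,
-- so no concurrent INSERT can interleave with the TRUNCATE
SELECT TOP (0) 1 FROM src.AgentList WITH (TABLOCKX, HOLDLOCK);
TRUNCATE TABLE src.AgentList;
COMMIT TRANSACTION;
]]></db:parameterized-query>
</db:update>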

Related

How to parse a flat file in MuleSoft

I am new to MuleSoft.
I have one flat file:
RHR001NTT PQR 2018090920180505
STR0010057830DFLT 74253J461000490
STR0020000000000000000000000000000000
I want to iterate over each line, and then from each row I want to take a substring from one position to another position. E.g. in row one I want the substring from the 6th position to the 12th position.
I have been trying different things to do it. I have separated each line using a splitter component with
#[StringUtils.split(message.payload, '\n\r')]
and now I want to take a substring from each line, from one position to another.
I have no idea what I should do now. Is there any other way? I have heard about the For-Each component, but I don't have any experience with the For-Each and Splitter components.
Please help me out. Thanks in advance!
This configuration might help. It will iterate over each line, and the transformer splits by " ". This will give you an array. Beware: the payload set inside the For-Each stays in the foreach and will not exist outside it.
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:file="http://www.mulesoft.org/schema/mule/file" xmlns:dw="http://www.mulesoft.org/schema/mule/ee/dw" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
xmlns:spring="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.mulesoft.org/schema/mule/ee/dw http://www.mulesoft.org/schema/mule/ee/dw/current/dw.xsd">
<flow name="xyzFlow">
<set-payload value="#[StringUtils.split(message.payload, '\n\r')]" doc:name="Set Payload"/>
<foreach collection="#[payload]" doc:name="For Each">
<dw:transform-message doc:name="Transform Message">
<dw:set-payload><![CDATA[%dw 1.0
%output application/java
---
payload splitBy " "]]></dw:set-payload>
</dw:transform-message>
</foreach>
</flow>
</mule>
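If you need fixed positions rather than a delimiter split, a set-payload inside the foreach using the plain java.lang.String substring method is enough. String indexes are zero-based and the end index is exclusive, so "from the 6th to the 12th position" becomes substring(5, 12):
<foreach collection="#[payload]" doc:name="For Each">
    <!-- message.payload is a single line of the file here; substring(5, 12)
         returns the characters at 1-based positions 6 through 12 -->
    <set-payload value="#[message.payload.substring(5, 12)]" doc:name="Extract fixed-width field"/>
</foreach>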

Splitting a comma-separated string and saving to database without DataWeave in Mule

My one concern with many tools is that difficult things become easy, but easy things become difficult. I'm currently stuck with such a problem.
I'm using the Community edition of Mule. This edition does not include DataWeave (formerly DataMapper).
Is there a simple way to write a flow that splits a comma-separated string into values and save them to a table in a database?
Try the flow config below. Basically you use MEL to split the string; after splitting, the payload will be a collection. Then use a collection splitter or, as in this example, a foreach. Inside it, put your database outbound connector and construct your insert SQL statement, since you don't have DataWeave or DataMapper where you could utilize DataSense.
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:json="http://www.mulesoft.org/schema/mule/json" xmlns:http="http://www.mulesoft.org/schema/mule/http" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
xmlns:spring="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
http://www.mulesoft.org/schema/mule/json http://www.mulesoft.org/schema/mule/json/current/mule-json.xsd">
<http:listener-config name="HTTP_Listener_Configuration" host="0.0.0.0" port="8081" doc:name="HTTP Listener Configuration"/>
<flow name="sampleFlow">
<http:listener config-ref="HTTP_Listener_Configuration" path="/inbound" doc:name="Inbound HTTP"/>
<set-payload value="one,two,three,four" doc:name="Set Sample Payload"/>
<expression-transformer expression="#[message.payload.split(',')]" doc:name="Split String"/>
<foreach collection="#[payload]" doc:name="For Each">
<logger message="INSERT INTO table(field_a) VALUES(#[payload]);" level="INFO" doc:name="SQL INSERT"/>
<logger message="INSERT TO DB" level="INFO" doc:name="YOUR DATABASE CONNECTOR"/>
</foreach>
</flow>
</mule>
LOG OUTPUT
org.mule.api.processor.LoggerMessageProcessor: INSERT INTO table(field_a) VALUES(one);
org.mule.api.processor.LoggerMessageProcessor: INSERT TO DB
org.mule.api.processor.LoggerMessageProcessor: INSERT INTO table(field_a) VALUES(two);
org.mule.api.processor.LoggerMessageProcessor: INSERT TO DB
org.mule.api.processor.LoggerMessageProcessor: INSERT INTO table(field_a) VALUES(three);
org.mule.api.processor.LoggerMessageProcessor: INSERT TO DB
org.mule.api.processor.LoggerMessageProcessor: INSERT INTO table(field_a) VALUES(four);
org.mule.api.processor.LoggerMessageProcessor: INSERT TO DB
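If you later replace the two placeholder loggers with a real database step, the insert inside the foreach might look like the sketch below. The config name Database_Configuration is an assumption (you would also need to declare the db namespace on the mule element); the table and column come from the logger message above:
<foreach collection="#[payload]" doc:name="For Each">
    <!-- inside the foreach the payload is one split value, e.g. "one" -->
    <db:insert config-ref="Database_Configuration" doc:name="Insert value">
        <db:parameterized-query><![CDATA[INSERT INTO table(field_a) VALUES (#[payload])]]></db:parameterized-query>
    </db:insert>
</foreach>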
Best way is:
After all trials, I found that this DataWeave transformation works fine, without any data splitting or any ambiguity.
Make sure UTF-8 encoding is mentioned in JSON format.
This is using Mule 3.8 and Anypoint Studio 6.0:
%dw 1.0
%output application/csv quoteValues=true,separator="|~" ,header=true ,escape="\""
---
payload
This works fine without any mapping; you can map directly just by specifying payload.

Why does the Batch scope behave strangely when trying to load huge records - Mule ESB

I'm facing issues in the Process Records phase of a batch job; kindly suggest. I'm trying to load a file of some KB (which has about 5000 records). For the success scenario it works.
But suppose an error happens in the input phase on the first run and the flow stops: the second time, when it tries to process the same records, Mule stops executing in the Process Records step. It does not run after the Loading phase. Please find the runtime logs below:
11:55:33 INFO info.org.mule.module.logging.DispatchingLogger - Starting loading phase for instance 'ae67601a-5fbe-11e4-bc4d-f0def1ed6871' of job 'test'
11:55:33 INFO info.org.mule.module.logging.DispatchingLogger - Finished loading phase for instance ae67601a-5fbe-11e4-bc4d-f0def1ed6871 of job order. 5000 records were loaded
11:55:33 INFO info.org.mule.module.logging.DispatchingLogger - **Started execution of instance 'ae67601a-5fbe-11e4-bc4d-f0def1ed6871' of job 'test**
It stopped processing after the instance starts; I'm not sure what is happening here.
When I stop the flow and delete the .mule folder from the workspace, it then works.
I suspect that in the loading phase Mule uses a temporary queue which is not deleted automatically when an exception happens in the input phase, but I am not sure this is the real cause.
I can't go and delete the .mule folder each time in a real environment.
Could anyone please suggest what causes this strange behavior and how I can get rid of this issue? Please find the config XML below:
<batch:job name="test">
<batch:threading-profile poolExhaustedAction="WAIT"/>
<batch:input>
<component class="com.ReadFile" doc:name="File Reader"/>
<mulexml:jaxb-xml-to-object-transformer returnClass="com.dto" jaxbContext-ref="JAXB_Context" doc:name="XML to JAXB Object"/>
<component class="com.Transformer" doc:name="Java"/>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step" accept-policy="ALL">
<batch:commit doc:name="Batch Commit" streaming="true">
<logger message="************after Data mapper" level="INFO" doc:name="Logger"/>
<data-mapper:transform config-ref="Orders_Pojo_To_XML" stream="true" doc:name="Transform_CanonicalToHybris"/>
<file:outbound-endpoint responseTimeout="10000" doc:name="File" path="#[sessionVars.uploadFilepath]">
</file:outbound-endpoint>
</batch:commit>
</batch:step>
</batch:process-records>
<batch:on-complete>
<set-payload value="BatchJobInstanceId:#[payload.batchJobInstanceId+'\n'], Number of TotalRecords: #[payload.totalRecords+'\n'], Number of loadedRecord: #[payload.loadedRecords+'\n'], ProcessedRecords: #[payload.processedRecords+'\n'], Number of sucessfull Records: #[payload.successfulRecords+'\n'], Number of failed Records: #[payload.failedRecords+'\n'], ElapsedTime: #[payload.elapsedTimeInMillis+'\n'], InpuPhaseException #[payload.inputPhaseException+'\n'], LoadingPhaseException: #[payload.loadingPhaseException+'\n'], CompletePhaseException: #[payload.onCompletePhaseException+'\n'] " doc:name="Set Batch Result"/>
<logger message="afterSetPayload: #[payload]" level="INFO" doc:name="Logger"/>
<flow-ref name="log" doc:name="Logger" />
</batch:on-complete>
</batch:job>
I've been stuck with this behavior for quite a few days. Your help will be much appreciated.
Version: 3.5.1
Thanks in advance.
Set max-failed-records to -1 so that the batch job will continue even though there is an exception:
<batch:job name="test" max-failed-records="-1">
In a real runtime environment you won't be in the situation of having to clean the .mule folder;
this happens only when you are working with Anypoint Studio.

Parsing multiple records after polling in Mule and pushing every single record to the queue

As of now, when I poll the database with a query, it fetches multiple records during a specified duration. The problem is that these multiple records are pushed into the queue as a single message. How do I push every record from the result set as an individual message?
As you have explained, the JDBC endpoint is fetching a collection of records and sending them as one single message to the queue. There are two options to solve this:
Using Mule's For-Each message processor. This helps in iterating through the collection object and processes each item as one message.
Using Mule's collection splitter to iterate through the collection of records (see the sketch after the option 1 flow below).
The solution for option 1 looks as shown in the image below.
The code for this flow looks like this:
<flow name="JDBC-For-Each-JMS-Flow" >
<jdbc-ee:inbound-endpoint queryKey="SelectAll" mimeType="text/javascript" queryTimeout="500000" pollingFrequency="1000" doc:name="Database">
<jdbc-ee:query key="SelectAll" value="select * from users"/>
</jdbc-ee:inbound-endpoint>
<foreach doc:name="For Each" collection="#[payload]" >
<jms:outbound-endpoint doc:name="JMS"/>
</foreach>
<catch-exception-strategy doc:name="Catch Exception Strategy"/>
</flow>
Note: This is a sample flow.
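For option 2, a sketch with the collection splitter instead of the foreach, reusing the same inbound endpoint (also just a sample flow):
<flow name="JDBC-Splitter-JMS-Flow">
    <jdbc-ee:inbound-endpoint queryKey="SelectAll" mimeType="text/javascript" queryTimeout="500000" pollingFrequency="1000" doc:name="Database">
        <jdbc-ee:query key="SelectAll" value="select * from users"/>
    </jdbc-ee:inbound-endpoint>
    <!-- the splitter turns the List of rows into one message per row,
         so each row reaches the JMS endpoint individually -->
    <collection-splitter doc:name="Collection Splitter"/>
    <jms:outbound-endpoint doc:name="JMS"/>
</flow>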
Hope this helps.

Mule MEL to read database result-set from a second 'database outbound endpoint'

I have a flow something like this
A 'Database inbound endpoint' which polls (every 5 mins) a MySQL database server and gets a result set from a select query (this automatically becomes the current payload, i.e. #[message.payload])
A 'For each' component with a 'Logger' component in it, using the expression #[message.payload]
Now the flow has one more 'Database outbound endpoint' component, which executes another select query and obtains a result set.
A 'For each' component with a 'Logger' component in it, using the expression #[message.payload]
Note: both loggers print the result set of the first DB query. I mean the second logger is also showing the result set of the first query, because that result set is stored as the payload.
So my questions are:
What is the MEL to read the result set of the second database query in the above scenario?
Is there another way to read the result set in the flow?
Here is the configuration XML
<jdbc-ee:connector name="oracle_database" dataSource-ref="Oracle_Data_Source" validateConnections="true" queryTimeout="-1" pollingFrequency="0" doc:name="Database"/>
<flow name="testFileSaveFlow3" doc:name="testFileSaveFlow3">
<poll frequency="1000" doc:name="Poll">
<jdbc-ee:outbound-endpoint exchange-pattern="one-way" queryKey="selectTable1" queryTimeout="-1" connector-ref="oracle_database" doc:name="get data from table 1">
<jdbc-ee:query key="selectTable1" value="SELECT * FROM TABLE1"/>
</jdbc-ee:outbound-endpoint>
</poll>
<foreach doc:name="For Each">
<logger message="#[message.payload]" level="INFO" doc:name="prints result-set of table1"/>
</foreach>
<jdbc-ee:outbound-endpoint exchange-pattern="one-way" queryKey="selectTable2" queryTimeout="-1" connector-ref="oracle_database" doc:name="get data from table 2">
<jdbc-ee:query key="selectTable2" value="SELECT * FROM TABLE2"/>
</jdbc-ee:outbound-endpoint>
<foreach doc:name="For Each">
<logger message="#[message.payload]" level="INFO" doc:name="prints result-set of table2"/>
</foreach>
</flow>
Thanks in advance.
This is not an issue with MEL; it is an issue with your flow logic.
The second result set is not available in the message.
The JDBC outbound endpoint is one-way, so the Mule flow will not wait for the reply (result set) from the second JDBC outbound endpoint in the middle of the flow. That is why the second logger also prints the first result set.
Type 1:
Try making your JDBC outbound endpoint request-response instead of one-way.
Type 2:
Try a Mule enricher to call the JDBC outbound endpoint, store the result set into a variable, and then loop over that variable, as in the following sketch.
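A sketch of the Type 2 approach against your flow. The flow variable name table2Rows is my own choice, and note that the endpoint inside the enricher must be request-response so a result actually comes back:
<enricher target="#[flowVars.table2Rows]" doc:name="Message Enricher">
    <jdbc-ee:outbound-endpoint exchange-pattern="request-response" queryKey="selectTable2" queryTimeout="-1" connector-ref="oracle_database" doc:name="get data from table 2">
        <jdbc-ee:query key="selectTable2" value="SELECT * FROM TABLE2"/>
    </jdbc-ee:outbound-endpoint>
</enricher>
<!-- the main payload still holds the first result set; iterate the variable instead -->
<foreach collection="#[flowVars.table2Rows]" doc:name="For Each">
    <logger message="#[message.payload]" level="INFO" doc:name="prints result-set of table2"/>
</foreach>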
Hope this helps.