Mule 4.4 Community Edition - how to parse file data which is coming as a stream

I am using Mule 4.4 Community Edition, so my understanding is that I cannot use the DataWeave Transform component.
Due to CE I also cannot use a 'Repeatable file store stream' (which is fine; I am using a repeatable in-memory stream).
Now my problem is: after I read in the file, how do I parse the contents?
The payload all shows up as
org.mule.runtime.core.internal.streaming.bytes.ManagedCursorStreamProvider
If I could use DataWeave this would be simple enough, i.e.:
<ee:transform doc:name="Transform Message">
    <ee:message>
        <ee:set-payload>
            <![CDATA[%dw 2.0
output application/json
---
payload map (value, index) ->
{
    id: value.column_0
}]]>
        </ee:set-payload>
    </ee:message>
</ee:transform>
But without using the Transform component (since I am using the Community Edition of Mule runtime 4.4), how do we handle a payload which is really a 'stream' of data?
Thanks
Please see the code above; I need to convert the file content (which is a stream) into JSON.
edit1:
Thanks to @aled, updating with more details: below is the file read operation where I am trying to read in a tab-delimited file. I was not sure what I should set the outputMimeType to, so I have set it to 'application/csv'.
<file:read doc:name="Read Products file" config-ref="File_Config" outputMimeType='application/csv; header=false; separator=|' path="/employee_master/employees.unl" outputEncoding="utf-8">
<repeatable-in-memory-stream />
</file:read>

You are thinking about it the wrong way. First, you are looking at an internal implementation class of Mule, which in Mule 4 you should not be looking at. No file is really a 'stream of data' when you are trying to process it. Is it a binary file? No: clearly, if you can use map(), DataWeave knows how to parse it. So what is the data format of the input file - JSON, CSV, XML, etc.? It looks like the correct format is already set, so DataWeave is able to parse the input. Really your problem is that you want to transform the data but you cannot use a Transform component in the Community Edition.
What you can do in the Community Edition is use a set-payload with an expression. Since the expression language is still DataWeave, it will work exactly the same. You will lose the graphical UI of Transform, and the editing experience in the UI will not be as good.
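For example, the transformation from the question could be expressed in CE as a Set Payload with an inline DataWeave expression (a sketch, carrying over the same script; the doc:name is illustrative):

```xml
<!-- Works in Community Edition: the DataWeave script moves into the expression -->
<set-payload doc:name="CSV to JSON"
             value="#[output application/json --- payload map (value, index) -> { id: value.column_0 }]"/>
```

The same pattern works for set-variable and any other attribute that accepts a DataWeave expression.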

Related

How to parse ISO8583 message from a text file & write it to a database

I have a few ISO8583 logs in a text file. I want to parse these logs from this text file and write them to a database with some descriptive information such as the class of the message, message function, message origin, processing code, response code, etc.
I am new to BASE24/ISO8583 and was trying to find a ready-made parser for this. Is there such a parser available? Does jPOS provide such functionality?
EDIT
I have the logs in ISO8583 format in a ".log" file as given below:
MTI : 0200
Field-3 : 201234
Field-4 : 000000010000
Field-7 : 0110722180
Field-11 : 123456
Field-44 : A5DFGR
Field-105 : ABCDEFGHIJ 1234567890
This is the same as the format given in the link you shared.
It also contains a hex dump, but I don't want to parse that.
The code given in the link does packing and unpacking of the message, whereas what I am trying to do is read these logs (in unpacked form) and write them into a database table.
I think I need to write my own code for this and use the jPOS packagers in it.
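Since the log above is already in unpacked, human-readable form, a plain text parser may be all that is needed before writing the values to the database. A minimal sketch (the "Name : value" layout is assumed from the sample above; the class name is illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class IsoLogParser {

    // Parses lines like "MTI : 0200" or "Field-3 : 201234" into a key/value map.
    public static Map<String, String> parse(String log) {
        Map<String, String> fields = new LinkedHashMap<>();
        for (String line : log.split("\\R")) {
            int sep = line.indexOf(':');
            if (sep < 0) continue;                       // skip lines without "name : value"
            String key = line.substring(0, sep).trim();  // e.g. "Field-3" or "MTI"
            String value = line.substring(sep + 1).trim();
            fields.put(key, value);
        }
        return fields;
    }
}
```

From the resulting map, each field can be bound to a column in an INSERT statement.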
It really depends on the format of the log file - are the ISO8583 messages hex strings, is the hex dump an XML representation of ISO8583, or is it some other application trace file?
Once you know the format (it might require some massaging), you will want to research the ISOMsg.unpack() methods using the appropriate jPOS packager. The packager defines the field structure of the various ISO8583 fields and the field construction (lengths, character set, etc.).
A good example is in the "Parse (unpack) ISO Message" section of the following blog post: http://jimmod.com/blog/2011/07/26/jimmys-blog-iso-8583-tutorial-build-and-parse-iso-message-using-jpos-library/
You mention BASE24 - jPOS does have a few packagers that might be a close starting point:
https://github.com/jpos/jPOS/blob/master/jpos/src/dist/cfg/packager/base24.xml
Those human-readable log formats are usually difficult to parse without losing information. Moreover, the logs are probably PCI compliant, so there is a lot of masked information in there. You want to ask for a hex dump of the messages.
What is displayed in the log file is already parsed ISO, hence you need not use jPOS. jPOS is only for packing and unpacking when you transmit the message.
Assign each field to a variable and write it to the DB.
For example, Field 39 is the response code.
Using jPOS is a good idea. You should design your own custom packager class for it.

CSV to JSON using DataWeave

Please refer to this DataWeave window image: dataweave window image
I am trying to transform CSV to JSON using DataWeave, but even for the simplest of transformations it automatically creates a null tag. I can see that in the preview window. When I run this application I am getting this error:
Exception stack is:
1. 452 (java.lang.ArrayIndexOutOfBoundsException) com.mulesoft.weave.reader.CharArraySourceReader:21 (null)
2. 452 (java.lang.ArrayIndexOutOfBoundsException). Message payload is of type: WeaveMessageProcessor$WeaveOutputHandler (org.mule.api.MessagingException)
   org.mule.execution.ExceptionToMessagingExceptionExecutionInterceptor:32

Root Exception stack trace:
java.lang.ArrayIndexOutOfBoundsException: 452
    at com.mulesoft.weave.reader.CharArraySourceReader.lookAheadAscii(CharArraySourceReader.scala:21)
    at com.mulesoft.weave.reader.csv.parser.CSVParser.parse(CSVParser.scala:132)
    at com.mulesoft.weave.reader.csv.parser.CSVParser.elementAt(CSVParser.scala:61)
    at com.mulesoft.weave.reader.csv.parser.CSVParser.contains(CSVParser.scala:38)
    at com.mulesoft.weave.reader.csv.CSVRecordsValue$$anon$1.hasNext(CSVReader.scala:58)
    at scala.collection.Iterator$class.toStream(Iterator.scala:1188)
    at com.mulesoft.weave.reader.csv.CSVRecordsValue$$anon$1.toStream(CSVReader.scala:56)
This is the sample CSV I am using:
SpreadsheetKeyEmployee,Position,EffectiveDate,BonusPlan,Amount,Currency,IgnorePlanAssignment
1,18211,2016-05-01,BONUS_PLAN1,150,USD
2,18212,2016-05-01,BONUS_PLAN2,150,USD
3,18213,2016-05-01,BONUS_PLAN3,150,USD
4,18214,2016-05-01,BONUS_PLAN4,150,USD
I think I might be making a mistake with the reader configuration (for CSV). There is a similar issue discussed in the forum, but it isn't helping either: https://forums.mulesoft.com/questions/36378/dataweave-example-of-csv-to-json.html
I have tried recreating your scenario but am unable to replicate it. Here is the DataWeave configuration that I have used, and it works:
<dw:transform-message metadata:id="e4e1b720-5d25-4b36-8406-cf7d6bfc7d6a" doc:name="CSV to JSON">
    <dw:set-payload><![CDATA[%dw 1.0
%output application/json
---
payload map ((payload01, indexOfPayload01) -> {
    SpreadsheetKeyEmployee: payload01.SpreadsheetKeyEmployee as :number,
    Position: payload01.Position,
    EffectiveDate: payload01.EffectiveDate,
    BonusPlan: payload01.BonusPlan,
    Amount: payload01.Amount,
    Currency: payload01.Currency,
    IgnorePlanAssignment: payload01.IgnorePlanAssignment
})]]></dw:set-payload>
</dw:transform-message>
It might be that DataWeave cannot handle Windows line endings correctly. Can you try replacing the Windows newline delimiter with \n only, as follows:
#[payload.replace("\r\n", "\n")]
You can place the above MEL expression in a Set Payload before the DataWeave transform.
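For illustration, the MEL expression above is plain Java underneath; the same normalization as a small Java sketch (the class and method names are illustrative):

```java
public class LineEndings {

    // Normalize Windows CRLF line endings to LF, mirroring the MEL
    // expression payload.replace("\r\n", "\n") suggested above.
    public static String normalize(String csv) {
        return csv.replace("\r\n", "\n");
    }
}
```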
You need to parse the CSV file and map it to a JSON object in the DataWeave component.
Check the link below:
https://dzone.com/articles/csv-xml-json-in-mulesoft

How to access the XML which comes out of DataWeave

I am trying to access the elements of the XML which comes out of DataWeave. It returns null values.
The DataWeave script is:
%dw 1.0
%namespace ns0 urn:abc:dbc:Components
%output text/xml
---
ItemFee: {
    product_id: flowVars."Dept_id",
    TotalFees: sum payload.ns0#ItemResponse.ns0#Fees.*ns0#Fee.ns0#Fee
}
Immediately after this DataWeave I have a Logger with the below message:
#[message.payload.ItemFee.TotalFees]
I am getting an error saying:
Execution of the expression "message.payload.ItemFee.TotalFees" failed. (org.mule.api.expression.ExpressionRuntimeException). Message payload is of type: WeaveMessageProcessor$WeaveOutputHandler
I would like to add one more point here. When I put the below text in the Logger immediately after the 'Transform Message', the message is printed in the console without issue, but I could not access the elements in the XML message: #[message.payloadAs(java.lang.String)]
That MEL syntax only works with Java objects. As the output is XML, you will have to use the xpath3 MEL function:
https://docs.mulesoft.com/mule-user-guide/v/3.7/xpath#the-xpath3-function
Something like:
#[xpath3('//ItemFee/TotalFees').text]
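As a plain-Java illustration of what that XPath evaluation does (a sketch; the XML shape is assumed from the DataWeave script in the question, and the class name is illustrative):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class XPathDemo {

    // Evaluates //ItemFee/TotalFees against an XML string, much like
    // the xpath3('//ItemFee/TotalFees') MEL function does on the payload.
    public static String totalFees(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        return XPathFactory.newInstance().newXPath()
                .evaluate("//ItemFee/TotalFees", doc);
    }
}
```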
Thanks Ryan, your answer helped me. I entered the below in my Logger and then I was able to fetch the items.
#[xpath3('//ItemFee/TotalFees')]
I tried this earlier and I am not sure why it did not work then. I might have overlooked the issue. :)

Count Number of Rows Processed by Mule DataMapper

I am using Mule's DataMapper to write data from a database to a CSV file. I am using the streaming option on the database, the DataMapper and the file output. I want to be able to log the number of records written by the DataMapper. Is there a way to get this data? I am running Mule server 3.5.2 and Anypoint Studio version 5.2.0.
Not out of the box. You can use an outputArgument and increase a counter if you are NOT using streaming.
If you are using streaming then you can pass a counter class as an input argument. From the DataMapper script component you can increment the counter and return it as part of the payload to get access to it:
<data-mapper:transform config-ref="Pojo_To_JSON_1" doc:name="Pojo To JSON" stream="true">
    <data-mapper:input-arguments>
        <data-mapper:input-argument key="counter">#[new Counter()]</data-mapper:input-argument>
    </data-mapper:input-arguments>
</data-mapper:transform>
DataMapper script:
//MEL
//START -> DO NOT REMOVE
output.__id = input.__id;
//END -> DO NOT REMOVE
output.text = inputArguments.counter.increment();
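The Counter passed in above is not a Mule built-in; a minimal version of such a class might look like this (a sketch; the class and method names are assumptions carried over from the answer):

```java
public class Counter {

    private int count = 0;

    // Called from the DataMapper script for every record processed;
    // returns the running total so it can be emitted with the payload.
    public int increment() {
        return ++count;
    }

    public int getCount() {
        return count;
    }
}
```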
I know this is an old thread, but the below could still help:
<byte-array-to-object-transformer doc:name="Byte Array to Object"/>
<set-variable variableName="Orig_Rows" value="#[payload.length]" doc:name="Variable"/>

message.inboundProperties always returns String - Mule - Anypoint Studio - APIkit

I'm developing an API using RAML + Mule Anypoint Studio (APIkit). In my RAML file I have defined a resource like this:
/player:
  post:
    queryParameters:
      year:
        type: integer
      place:
        type: string
Then, in Anypoint Studio, after importing my .raml file, I got a flow with the POST method associated. I use the MySQL connector to insert into a database. The problem resides in the query:
INSERT INTO Matches (day, place, max_players)
VALUES (#[message.inboundProperties.'day'], #[message.inboundProperties.'place'],
#[message.inboundProperties.'maxPlayers'])
When I call #[message.inboundProperties.day] it returns a String, but I want an integer value.
I'm new to Mule, so it would be great if you could explain how.
Ty
All query parameters are treated as Strings. You can use MEL to convert to an int, though. Here's an example using Java in a MEL expression to parse the parameter to an int:
#[Integer.parseInt(message.inboundProperties.day)]
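In plain Java terms, the conversion that MEL expression performs is just this (a sketch; the class and method names are illustrative):

```java
public class QueryParamConversion {

    // Query parameters always arrive as Strings; convert explicitly before
    // the insert, as #[Integer.parseInt(message.inboundProperties.day)] does.
    public static int toInt(String queryParam) {
        return Integer.parseInt(queryParam.trim());
    }
}
```

Note that Integer.parseInt throws NumberFormatException for non-numeric input, so invalid query parameters should be validated or caught.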