MUnit 2.1: mock a Map response

I am writing an MUnit test case and I need to mock the response normally returned by an external system. Therefore I have created a mock-when:
<munit-tools:mock-when doc:name="Mock when" doc:id="310d8979-9451-4767-a344-dfa190fb9c79" processor="dummy">
<munit-tools:then-return >
<munit-tools:payload value="#[['a':'1000013','b':'900154196']]" mediaType="application/java"/>
</munit-tools:then-return>
</munit-tools:mock-when>
I would like to achieve the goal of having a map with these key-value pairs:
key a having value 1000013
key b having value 900154196
as payload
Question:
How do I need to write this in the munit-tools:payload?
What I have currently allows my DataWeave to select the first value, but it's not picking up the second value. I have tested this by changing #[['a':'1000013','b':'900154196']] to #[['b':'900154196','a':'1000013']] in the munit-tools:payload...
I am using MUnit 2.1 in Anypoint Studio 7.

The issue might be your syntax. Have you tried this instead:
#[[{'a':'1000013'}, {'b':'900154196'}]]
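Embedded in the mock, that would look something like this (a sketch only, not verified against MUnit 2.1):
<munit-tools:mock-when doc:name="Mock when" processor="dummy">
<munit-tools:then-return>
<!-- the suggestion above: a list of single-key objects -->
<munit-tools:payload value="#[[{'a':'1000013'}, {'b':'900154196'}]]" mediaType="application/java"/>
<!-- alternative (untested): a single DataWeave object, if one Map holding both keys is wanted:
value="#[{'a':'1000013','b':'900154196'}]" -->
</munit-tools:then-return>
</munit-tools:mock-when>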

Related

Mule 4.4 Community Edition - how to parse file data which is coming as a stream

I am using Mule 4.4 Community Edition, so my understanding is that I cannot use DataWeave (the Transform component).
Also, due to CE I cannot use a 'Repeatable file store stream' (which is fine; I am using a repeatable in-memory stream).
Now my problem is: after I read in the file, how do I parse the contents?
The data all shows up as
org.mule.runtime.core.internal.streaming.bytes.ManagedCursorStreamProvider
If I could use DataWeave this would be simple enough, i.e.:
<ee:transform doc:name="Transform Message">
<ee:message>
<ee:set-payload>
<![CDATA[%dw 2.0
output application/json
---
payload map (value,index)->
{
id:value.column_0
}]]>
</ee:set-payload>
</ee:message>
</ee:transform>
But without using the Transform component (since I am using Community Edition Mule runtime 4.4), how do we handle a payload which is really a 'stream' of data?
Thanks
Please see the above code; I need to convert the file content (which is a stream) into JSON.
edit1:
Thanks to @aled, updating with more details: below is the file read operation where I am trying to read in a tab-delimited file. I was not sure what I should set the outputMimeType to, so I have set it as 'application/csv'.
<file:read doc:name="Read Products file" config-ref="File_Config" outputMimeType='application/csv; header=false; separator=|' path="/employee_master/employees.unl" outputEncoding="utf-8">
<repeatable-in-memory-stream />
</file:read>
You are thinking about it the wrong way. First, you are looking at an implementation class in Mule, which in Mule 4 you should not be doing. No file is really a 'stream of data' when you are trying to process it. Is it a binary file? Clearly not, since DataWeave knows how to parse it when you use map(). So what is the data format of the input file: JSON, CSV, XML, etc.? It looks like the correct format is already set, so DataWeave is able to parse the input. Really your problem is that you want to transform the data but you cannot use a Transform component in the Community Edition.
What you can do in the Community Edition is use a set-payload with an expression. Since the expression language is still DataWeave it will work exactly the same; you will just lose the graphical UI of Transform, and the editing experience in the plain XML will not be as good.
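For example, a minimal sketch of that approach (not from the original answer), reusing the same mapping from the Transform example in a Set Payload component:
<!-- the mimeType attribute marks the result as JSON for downstream components -->
<set-payload doc:name="Set Payload" mimeType="application/json" value="#[payload map (value, index) -> { id: value.column_0 }]"/>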

How to assert varying payload of connector call in for-each loop in Mule 3 (Munit 1.11)?

My Mule 3.9 app (MUnit 1.11) consists of a for-each loop in which an HTTP connector is called in each for-each iteration. The payload with which the connector is called varies in each loop. Now I would like to assert the payload for each for-each iteration in a corresponding MUnit test. The idea is to use something comparable to the JavaScript Jest function toHaveBeenNthCalledWith, so that an assertion can be defined for the payload of each nth connector call. Is there some built-in function available for this, or is there a workaround?

How to find current flow name in mule?

I need to capture current flow name into a variable.
I tried with #[flow.name] but no luck in Mule 3.8.0.
Can anybody please assist me?
Based on the answer in this post: How to get caller flow name in private flow in Mule
There is a simple way to get the flow name and put it into a variable:
<expression-component doc:name="Expression"><![CDATA[flowVars.flowName = flow.name;]]></expression-component>
Alternatively, you can directly use the expression #[mule:context.serviceName] in a variable:
<set-variable variableName="myFlowName" value="#[mule:context.serviceName]" doc:name="Variable"/>
<!-- Print the value of the variable in a logger -->
<logger message="#[flowVars.myFlowName]" level="INFO" doc:name="Logger"/>
This will set your current flow name directly in a variable.
In Mule 3.8.5, using a Groovy script component:
flowVars.currentFlowName = eventContext.getFlowConstruct().getName();
I have been using #[flow.name] in 3.7.3 and just tried it in 3.8.0 to make sure it had not been removed, and it worked fine for me in a logger and for setting a flowVars value. I suggest posting at least a snippet of your flow and maybe we can spot the issue you are having.
PS: not sure why flow.name is not in the standard forms or really documented by Mule, and as it is not, there continue to be worries that they will remove it. I have seen it stated more than just here that it is not accessible in MEL, but #[flow.name] is a MEL expression and does work. To use it for something like Parse Template in exception strategies, I use sulthony's form: set a flowVars value in an expression and refer to that flowVars in my template, roughly as sketched below.
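A sketch of that pattern (the file name is illustrative):
<!-- capture the flow name up front -->
<set-variable variableName="flowName" value="#[flow.name]" doc:name="Variable"/>
<!-- later, e.g. in an exception strategy -->
<parse-template location="error-template.txt" doc:name="Parse Template"/>
<!-- error-template.txt can then reference the variable with #[flowVars.flowName] -->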
You can access the flow name in a logger by using #[flow.name], but it's not accessible in MEL. Use flowConstruct for getting the flow name. Refer to this answer.
Hope this helps.

Count Number of Rows Processed by Mule DataMapper

I am using Mule's DataMapper to write data from a database to a CSV file. I am using the streaming option on the database, the DataMapper and the file output. I want to be able to log the number of records written by the DataMapper. Is there a way to get this data? I am running Mule server 3.5.2 and have Anypoint Studio version 5.2.0.
Not out of the box. You can use an outputArgument and increase a counter if you are NOT using streaming.
If you are using streaming, then you can pass an input argument of a counter class, and from the DataMapper script you can increment the counter and return it as part of the payload to get access to it:
<data-mapper:transform config-ref="Pojo_To_JSON_1" doc:name="Pojo To JSON" stream="true">
<data-mapper:input-arguments>
<data-mapper:input-argument key="counter">#[new Counter()]</data-mapper:input-argument>
</data-mapper:input-arguments>
</data-mapper:transform>
DataMapper script:
//MEL
//START -> DO NOT REMOVE
output.__id = input.__id;
//END -> DO NOT REMOVE
output.text = inputArguments.counter.increment();
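The Counter class referenced above is not shown in the answer; a minimal sketch of what it could look like (hypothetical, any mutable counter with an increment() method would do):
// simple mutable counter passed in as a DataMapper input argument
public class Counter {
    private int count = 0;
    public int increment() {
        return ++count;
    }
    public int getCount() {
        return count;
    }
}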
I know this is an old thread, but the below could still help:
<byte-array-to-object-transformer doc:name="Byte Array to Object"/>
<set-variable variableName="Orig_Rows" value="#[payload.length]" doc:name="Variable"/>

How to refer to session variables in a Groovy script in Mule Studio

I just started working with Mule.
Flow description:
I have an HTTP inbound endpoint that receives an XML message, and I have to update the database (Derby) using the XML payload.
For example: I will be receiving Emp Id, Emp Name and Exp in the request. I have to update the table with these values.
My Implementation:
After receiving the XML input I am using the Message Properties transformer to save the values in session scope.
<message-properties-transformer scope="session" doc:name="Message Properties">
<add-message-property key="EmpNum"
value="#[xpath:/CreateEmployee/EmpNum]" />
</message-properties-transformer>
like above. And then I have a Groovy script component to update the table.
My Query is:
r.update(conn, "INSERT INTO Employee values(#[header:session:EmpNum],#[header:session:EmpName],#[header:session:Experience],#[header:session:Role])");
But it is throwing error:
Lexical error at line 1, column 29. Encountered: "#" (35), after : "". (org.apache.derby.iapi.error.StandardException)
org.apache.derby.iapi.error.StandardException:-1 (null)
Lexical error at line 1, column 29. Encountered: "#" (35), after : "". Query: INSERT INTO Employee values(#[header:session:EmpNum],#[header:session:EmpName],#[header:session:Experience],#[header:session:Role]) Parameters: [](SQL Code: 30000, SQL State: + 42X02) (java.sql.SQLException)
org.apache.commons.dbutils.QueryRunner:540 (null)
I have used a logger component to display the values.
#[header:session:EmpNum]
is displaying the proper value.
Please help me: how do I refer to these session values in a Groovy script?
The following works for me when using Groovy script in Mule to read flow variables or session variables respectively.
For reading flow variables I'm using
message.getInvocationProperty('yourVarsName').toString()
For reading session variables I'm using
sessionVars['yoursVarsName'] or flowVars['yoursVarsName']
They work very well for me in the Groovy script in Mule 3.5.
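For completeness, a minimal sketch of how those reads sit inside a Groovy script component (the scripting namespace must be declared in the Mule config; variable names are just placeholders):
<scripting:component doc:name="Groovy">
<scripting:script engine="Groovy"><![CDATA[
// read a flow variable and a session variable
def myFlowVar = flowVars['yourVarsName']
def mySessionVar = sessionVars['yourVarsName']
return payload
]]></scripting:script>
</scripting:component>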
You cannot use Mule Expression Language (MEL) directly in a Groovy script.
If you're using Mule 3.3, replace #[header:session:EmpName] with sessionVars['EmpName'], and similarly for the other variables.
For previous versions, replace #[header:session:EmpName] with message.getProperty('EmpName', PropertyScope.SESSION).
You have to let the Groovy script evaluate the value before sending it as part of the SQL command; otherwise you are sending the literal text "message.getProperty('Experience',PropertyScope.SESSION)" straight into the SQL command.
qr.update(conn, "INSERT INTO Employee values("+message.getProperty('EmpNum',PropertyScope.SESSION)+","+message.getProperty('EmpName',PropertyScope.SESSION)+","+message.getProperty('Experience',PropertyScope.SESSION)+","+message.getProperty('Role',PropertyScope.SESSION)+")")
Also don't forget to import the PropertyScope class in the script:
import org.mule.api.transport.PropertyScope
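As an alternative sketch (not from the original answer): the commons-dbutils QueryRunner also accepts parameterized placeholders, which avoids the quoting and concatenation issues entirely:
// same qr and conn as above, but with ? placeholders
qr.update(conn, "INSERT INTO Employee VALUES(?, ?, ?, ?)",
    message.getProperty('EmpNum', PropertyScope.SESSION),
    message.getProperty('EmpName', PropertyScope.SESSION),
    message.getProperty('Experience', PropertyScope.SESSION),
    message.getProperty('Role', PropertyScope.SESSION))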
#[groovy:message.getSessionProperty('sesVarValue')]
Not sure if anyone still needs the answer, but for Mule 3.6+ I was able to access session or flow variables by simply doing sessionVars['sessVarName'] / flowVars['flowVarName'].
**Do NOT forget to use "+" to concatenate Strings if the values are used as strings.