java.io.NotSerializableException: org.apache.camel.component.file.GenericFile

I am trying to use a download logger that should log any kind of file transfer between two endpoints, as defined below in camel-context.xml:
<process ref="downloadLogger"/>
<to uri="file:src/main/resources/META-INF?noop=true"/>
<!-- Prepare the message for calling OFBiz service -->
<setHeader headerName="Ofbiz.ServiceName">
    <constant>DownLoadLogger</constant>
</setHeader>
<setHeader headerName="Ofbiz.Param.note">
    <simple>${in.body}</simple>
</setHeader>
<!-- Call the OFBiz service -->
<camel:process ref="ofbizDispatcher"/>
</camel:route>
But this gives rise to:
java.io.NotSerializableException: org.apache.camel.component.file.GenericFile
at org.apache.camel.util.ObjectHelper.wrapRuntimeCamelException(ObjectHelper.java:1196)[camel-core-2.9.0.jar:2.9.0]
at org.apache.camel.component.bean.BeanInvocation.invoke(BeanInvocation.java:87)[camel-core-2.9.0.jar:2.9.0]
at org.apache.camel.component.bean.BeanProcessor.process(BeanProcessor.java:128)[camel-core-2.9.0.jar:2.9.0]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:99)[camel-core-2.9.0.jar:2.9.0]
at org.apache.camel.component.bean.BeanProcessor.process(BeanProcessor.java:73)[camel-core-2.9.0.jar:2.9.0]
at org.apache.camel.component.rmi.RmiProducer.process(RmiProducer.java:45)[camel-rmi-2.9.0.jar:2.9.0]
I am using JDK 1.6 with the Camel 2.9 jars.
Please suggest if I am missing any configuration anywhere.
Thanks in advance
Padmalaya

Use the following between the 'from uri' and the 'to uri':
<convertBodyTo type="byte[]"/>

I got this working after converting the body to a String; conversion to byte[] did not really work! :(
.convertBodyTo(String.class)
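For reference, a minimal sketch of a complete route with the conversion in place. The from endpoint is hypothetical, since the question omits it; the rest reuses the names from the question:
<route>
    <!-- hypothetical consumer endpoint; the question does not show the actual from -->
    <from uri="file:inbox"/>
    <!-- GenericFile is not Serializable, so convert the body before it crosses RMI -->
    <convertBodyTo type="java.lang.String"/>
    <process ref="downloadLogger"/>
    <to uri="file:src/main/resources/META-INF?noop=true"/>
    <setHeader headerName="Ofbiz.ServiceName">
        <constant>DownLoadLogger</constant>
    </setHeader>
    <setHeader headerName="Ofbiz.Param.note">
        <simple>${in.body}</simple>
    </setHeader>
    <process ref="ofbizDispatcher"/>
</route>
With the body converted to a String (or byte[]), the payload that reaches the RMI-backed OFBiz dispatcher is Serializable, which is exactly what the NotSerializableException was complaining about.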

RTI DDS creating own data types

I am working on a .Net example where I define my own data type using RTI Connext DDS.
Instead of creating the application from the beginning, I started from the source code of the hello_world_xml_dynamic example in the rti_workspace directory. I made several changes to the USER_QOS_PROFILES.xml file to create my own data type, and changed its name to MY_PROFILES.xml.
But when I compile the application and run it from the command line, I get the following error:
DDS_DomainParticipantFactory_create_participant_from_config_w_paramsI:ERROR: Profile library 'MyParticipantLibrary::PublicationParticipant' not found
! Unable to create DDS domain participant
The code that catches the error:
if (this.participant == null)
{
    this.participant = DDS.DomainParticipantFactory.get_instance().
        create_participant_from_config(
            "MyParticipantLibrary::PublicationParticipant");
    if (this.participant == null)
    {
        Console.Error.WriteLine("! Unable to create DDS domain participant");
        return;
    }
}
This is the configuration file MY_PROFILES.xml:
<!--
RTI Data Distribution Service Deployment
-->
<dds xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="http://community.rti.com/schema/6.0.1/rti_dds_profiles.xsd">
<!-- Qos Library -->
<qos_library name="qosLibrary">
<qos_profile name="DefaultProfile">
</qos_profile>
</qos_library>
<!-- types -->
<types>
<struct name="FlightData">
<member name="Latitude" type="double"/>
<member name="Longitude" type="double"/>
<member name="Altitude" type="double"/>
</struct>
</types>
<!-- Domain Library -->
<domain_library name="MyDomainLibrary" >
<domain name="FlightDataDomain" domain_id="0">
<register_type name="FlightDataType"
type_ref="FlightData" />
<topic name="FlightDataTopic"
register_type_ref="FlightDataType">
<topic_qos name="FlightData_qos"
base_name="qosLibrary::DefaultProfile"/>
</topic>
</domain>
</domain_library>
<!-- Participant library -->
<domain_participant_library name="MyParticipantLibrary">
<domain_participant name="PublicationParticipant"
domain_ref="MyDomainLibrary::FlightDataDomain">
<publisher name="MyPublisher">
<data_writer name="FlightDataWriter"
topic_ref="FlightDataTopic"/>
</publisher>
</domain_participant>
<domain_participant name="SubscriptionParticipant"
domain_ref="MyDomainLibrary::FlightDataDomain">
<subscriber name="MySubscriber">
<data_reader name="FlightDataReader"
topic_ref="FlightDataTopic">
<datareader_qos name="FlightData_reader_qos"
base_name="qosLibrary::DefaultProfile"/>
</data_reader>
</subscriber>
</domain_participant>
</domain_participant_library>
</dds>
Where am I making a mistake?
Your XML file looks correct. From the 'not found' error message, it seems that you may not have taken the right steps to instruct your application to load that profiles-file MY_PROFILES.xml to actually learn about your desired Participant. You can easily verify that this is the case by introducing an error in your XML file (for example by incorrectly renaming one tag) and rerun your application. If it does not complain about the syntax or schema of the XML, then your file did not get loaded and this hypothesis is correct.
If that turns out to be your problem indeed, then you have several options to fix that. They are listed in the User's Manual section 18.5 How to Load XML-Specified QoS Settings.
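For example, one of those options is to point the DomainParticipantFactory at your file programmatically before creating the participant. A minimal C# sketch (assuming the RTI Connext .NET API; the relative file path is an assumption):
// Load MY_PROFILES.xml explicitly instead of relying on the default
// USER_QOS_PROFILES.xml lookup
DDS.DomainParticipantFactoryQos factory_qos = new DDS.DomainParticipantFactoryQos();
DDS.DomainParticipantFactory.get_instance().get_qos(factory_qos);
factory_qos.profile.url_profile.ensure_length(1, 1);
factory_qos.profile.url_profile.set_at(0, "MY_PROFILES.xml");
DDS.DomainParticipantFactory.get_instance().set_qos(factory_qos);

// Now the configuration library can be resolved
this.participant = DDS.DomainParticipantFactory.get_instance()
    .create_participant_from_config(
        "MyParticipantLibrary::PublicationParticipant");
Alternatively, without touching code, you can set the NDDS_QOS_PROFILES environment variable to the path of MY_PROFILES.xml, or simply keep the default file name USER_QOS_PROFILES.xml in the working directory.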

Apache Camel get empty response from jetty

I am facing a complex case.
What I'm doing follows these steps:
1) <from uri="jetty:http://0.0.0.0:30100/jetty/test"/>
2) <to uri="hazelcast-client:master-test-series" />
3) <to uri="bean:modelSeriesWrapperTest" />
4)
<split parallelProcessing="true" streaming="true">
<simple>${body}</simple> <to uri="direct:dw.model.test"/>
</split>
5) In another route:
<from uri="direct:dw.model.test"/>
<aggregate strategyRef="myAggregatorStrategy"
completionTimeout="1000">
<correlationExpression>
<constant>true</constant>
</correlationExpression>
<marshal ref="modelSeriesVariantColourGson" />
<camel:to uri="file:src/data/catask/output?fileName=output.xml"/>
</aggregate>
The problem is that the Jetty response is empty. I used a TCP trace to track the request and response: the Content-Length is 0. But the output.xml file has the correct JSON-formatted content.
Even after the <camel:to uri="file:src/data/catask/output?fileName=output.xml"/> has run, the Jetty response is still empty.
I tried the InOut pattern; it doesn't work either.
It seems Jetty returns immediately instead of waiting for the split to be done. I tried setting the In and Out body, which doesn't work either. I have Googled every case I can imagine; there is no helpful example.
Could you please help me? Thank you very much.
If you want the Jetty response to include information from your aggregator, then you must use the splitter-only approach, as documented at:
http://camel.apache.org/composed-message-processor.html
The splitter has built-in aggregation: when the splitter is done, it aggregates as well, and you can then use that result as the Jetty response.
When you use <aggregate>, it becomes a separate exchange. To understand this in more depth, read about the Aggregate EIP, other SO answers, and the various Camel books.
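A minimal sketch of that splitter-only approach, reusing the names from the question (the aggregation strategy and endpoints are the ones already defined above):
<route>
    <from uri="jetty:http://0.0.0.0:30100/jetty/test"/>
    <to uri="hazelcast-client:master-test-series"/>
    <to uri="bean:modelSeriesWrapperTest"/>
    <!-- the splitter aggregates its own results via strategyRef; when it is done,
         the aggregated body continues in this route and becomes the Jetty response -->
    <split strategyRef="myAggregatorStrategy" parallelProcessing="true" streaming="true">
        <simple>${body}</simple>
        <to uri="direct:dw.model.test"/>
    </split>
    <marshal ref="modelSeriesVariantColourGson"/>
</route>
The direct:dw.model.test route then only processes each split part and returns it; the separate <aggregate> block is no longer needed.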

Mule ESB: File outputpattern doesn't translate the pattern

I'm using Mule ESB CE 3.4. I have a requirement where I'm reading configuration information from a database and using it as the file name for the file outbound endpoint. Here is example code (it may not work as-is, as I have only given an outline):
<file:connector name="File-Data" autoDelete="false" streaming="true" validateConnections="true" doc:name="File" />
.....
<!-- Gets the configuration from database using a transformer. The transformer populates the configuration entries in a POJO and puts that in a session. -->
<custom-transformer class="com.test.DbGetConfigsTransformer" doc:name="Get Integration Configs"/>
....<!-- some code to process data -->
<logger message="$$$: #[sessionVars['currentFeed'].getFilePattern()]" doc:name="Set JSON File Name" />
<file:outbound-endpoint path="/temp" outputPattern="#[sessionVars['currentFeed'].getFilePattern()]" responseTimeout="10000" mimeType="text/plain" connector-ref="File-Data" doc:name="Save File"/>
The above code throws the following error:
1. The filename, directory name, or volume label syntax is incorrect (java.io.IOException)
java.io.WinNTFileSystem:-2 (null)
2. Unable to create a canonical file for /temp/Test_User_#[function:datestamp:YYYYMMddhhmmss.sss] (org.mule.api.MuleRuntimeException)
org.mule.util.FileUtils:354 (http://www.mulesoft.org/docs/site/current3/apidocs/org/mule/api/MuleRuntimeException.html)
3. Failed to route event via endpoint: DefaultOutboundEndpoint{endpointUri=file:///temp, connector=FileConnector
In the database table, the field is called FilePattern and it has the value 'Test_User_#[function:datestamp:YYYYMMddhhmmss.sss]'. If I hardcode the value or move it to the Mule configuration file
file.name=Test_User_#[function:datestamp:YYYYMMddhhmmss.sss]
and use the configuration property syntax (e.g. ${file.name}) in the outputPattern, it works. But if I read the same value from the DB and use it, it fails with the error above. The logger displays the value read from the DB as:
$$$: Test_#[function:datestamp:YYYYMMddhhmmss.sss]
Any help is much appreciated.
If your datestamp format does not vary, you should just store the environment prefix in your db and use something like:
outputPattern="#[sessionVars['prefix']+server.dateTime.format('YYYYMMddhhmmss.sss')]"
If you need to use your current database values, you can use basic Java string methods to find the correct substrings. For example:
#[sessionVars['currentFeed'].getFilePattern().substring(0,sessionVars['currentFeed'].getFilePattern().indexOf('function')-2)+server.dateTime.format('YYYYMMddhhmmss.sss')]
If you use different datestamp formats, you can find that part as well using similar String methods. However, I still suggest you come up with an implementation that only stores the environment prefix in the db.
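For instance, with only the prefix 'Test_User_' stored in the DB (getFilePrefix() is a hypothetical accessor, shown for illustration), the endpoint becomes:
<file:outbound-endpoint path="/temp"
    outputPattern="#[sessionVars['currentFeed'].getFilePrefix() + server.dateTime.format('YYYYMMddhhmmss.sss')]"
    responseTimeout="10000" mimeType="text/plain" connector-ref="File-Data" doc:name="Save File"/>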

How to read CSV file and insert data into PostgreSQL using Mule ESB, Mule Studio

I am very new to Mule Studio.
I am facing a problem: I have a requirement where I need to insert data from a CSV file into a PostgreSQL database using Mule Studio.
I am using Mule Studio CE (version 1.3.1). I checked on Google and found that we can use DataMapper to do this, but it works only in EE, so I cannot use it.
I also searched the net and found the article Using Mule Studio to read Data from PostgreSQL (Inbound) and write it to File (Outbound) - Step by Step approach.
That seems feasible, but my requirement is just the opposite of the article: I need File as the inbound component and the database as the outbound component.
What is the way to do so?
Any step-by-step help (like what components to use) and guidance will be greatly appreciated.
Here is an example that inserts a two-column CSV file:
<configuration>
<expression-language autoResolveVariables="true">
<import class="org.mule.util.StringUtils" />
<import class="org.mule.util.ArrayUtils" />
</expression-language>
</configuration>
<spring:beans>
<spring:bean id="jdbcDataSource" class=" ... your data source ... " />
</spring:beans>
<jdbc:connector name="jdbcConnector" dataSource-ref="jdbcDataSource">
<jdbc:query key="insertRow"
value="insert into my_table(col1, col2) values(#[message.payload[0]],#[message.payload[1]])" />
</jdbc:connector>
<flow name="csvFileToDatabase">
<file:inbound-endpoint path="/tmp/mule/inbox"
pollingFrequency="5000" moveToDirectory="/tmp/mule/processed">
<file:filename-wildcard-filter pattern="*.csv" />
</file:inbound-endpoint>
<!-- Load the whole file in RAM - won't work for big files! -->
<file:file-to-string-transformer />
<!-- Split each row, dropping the first one (header) -->
<splitter
expression="#[rows=StringUtils.split(message.payload, '\n\r');ArrayUtils.subarray(rows,1,rows.size())]" />
<!-- Transform CSV row in array -->
<expression-transformer expression="#[StringUtils.split(message.payload, ',')]" />
<jdbc:outbound-endpoint queryKey="insertRow" />
</flow>
In order to read a CSV file and insert its data into PostgreSQL using Mule, you need to follow these steps:
You need the following as prerequisites:
PostgreSQL
PostgreSQL JDBC driver
Anypoint Studio IDE and
A database to be created in PostgreSQL
Then configure the PostgreSQL JDBC driver in Global Element Properties inside Studio.
Create a Mule flow in Anypoint Studio as follows:
Step 1: Wrap CSV file source in File component
Step 2: Convert between object arrays and strings
Step 3: Split each row
Step 4: Transform CSV row in array
Step 5: Dump into the destination Database
I would like to suggest DataWeave.
Steps:
1. Read the file using the FTP connector/endpoint.
2. Transform it using DataWeave (a sketch of the transform follows below).
3. Use the Database connector to store the data in the DB.
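As a rough sketch of the transform step (DataWeave 1.0 syntax; the column names col1 and col2 are assumptions matching the earlier example):
%dw 1.0
%input payload application/csv
%output application/java
---
payload map {
    col1: $.col1,
    col2: $.col2
}
The resulting list of maps can then be inserted row by row with the Database connector, e.g. inside a <foreach> with a parameterized INSERT query.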

MULE expression-transformer not accepted

I'm trying to learn Mule ESB but am running into problems with the example projects. Why are these lines underlined in red and not represented in the message flow?
<expression-transformer name="returnAttachments">
<return-argument evaluator="attachments-list" expression="*.txt,*.ozb,*.xml" optional="false"/>
</expression-transformer>
I've cut and pasted these lines from mulesoft.org as part of a sample project.
@genjosanzo is right; the MEL equivalent would be:
<expression-transformer
expression="#[($.value in message.inboundAttachments.entrySet() if $.key ~= '(.*\\.txt|.*\\.ozb|.*\\.xml)')]" />
Mule Studio has problems rendering nested elements (bug reported here).
Instead you can use the compact version and replace it with the following:
<expression-transformer expression="#[attachments-list:*.txt,*.ozb,*.xml]" doc:name="Expression" />
On a side note: ever since Mule 3.3.0 there is the new Mule Expression Language (MEL), and it is recommended to rely on it whenever possible.