Jackson 2.0 compatibility with Jackson 1.x - jackson

First, my English is not very good; I will try to explain as well as I can.
I've created a service architecture that is used by several service applications. This architecture uses Jackson 2.5.2 for serialization/deserialization of
JSON objects. If a service application uses Jackson 1.8 to generate a JSON response, the architecture throws the following error when it tries to deserialize that object:
org.codehaus.jackson.JsonParseException: Unexpected end-of-String when at [Source: N/A; line: -1, column: -1]
I don't know whether the cause is that the JSON response contains fields of type String as well as one field of type JsonNode.
Is it possible to resolve this problem?

While Jackson 1.x and 2.x are not directly source- or binary-compatible, they produce and consume exactly the same JSON. So your error message does not suggest a version incompatibility, but rather some other kind of problem.
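Since the wire format is identical across versions, an "Unexpected end-of-String" error usually means the input was truncated before it reached the parser (for example, by a buffer or string-handling bug upstream). A minimal, library-free sketch for spotting a payload cut off inside a string literal — this is an illustration of the failure condition, not Jackson's own logic:

```java
// Sketch: detect a JSON payload truncated inside a string literal.
// This mirrors the condition behind "Unexpected end-of-String":
// the input ends while the parser is still inside an open quote.
public class TruncationCheck {

    // Returns true if the input ends with an unterminated string literal.
    static boolean endsInsideString(String json) {
        boolean inString = false;
        for (int i = 0; i < json.length(); i++) {
            char c = json.charAt(i);
            if (c == '\\' && inString) { i++; continue; } // skip escaped char
            if (c == '"') inString = !inString;
        }
        return inString;
    }

    public static void main(String[] args) {
        String full = "{\"name\":\"Alice\"}";
        String cut  = full.substring(0, 10);        // "{\"name\":\"A" -- truncated
        System.out.println(endsInsideString(full)); // false
        System.out.println(endsInsideString(cut));  // true
    }
}
```

If a check like this fires on the bytes handed to the deserializer, the problem is in transport or string handling, not in the Jackson version mismatch.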

Related

Jax-rs convert to object Mule

I am using RAML to design an API.
Then I convert the RAML to JAX-RS and get Java classes (https://github.com/mulesoft-labs/raml-for-jax-rs).
It generates two kinds of classes: interfaces and *Impl classes.
Then I import them into my project in Anypoint Studio and want to use them.
But the JsonToObject transformer cannot use the generated classes:
org.mule.api.transformer.TransformerMessagingException: Failed to transform from "json" to "classImpl".
If I use the generated classes without the interfaces, it works correctly.
How can I use the interface and *Impl classes to convert JSON to an object?
I solved the problem: 1. the RAML classes need to be generated with Jackson support; 2. in Anypoint, deserialize from the JSON to the object with an ObjectMapper.
You should make sure your *Impl classes support JSON binding via Jackson annotations. See the related Mule documentation on JSON support for details.
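For example, a generated interface can be bound to its *Impl class with Jackson's @JsonDeserialize annotation, so that deserializing against the interface type works. The Person/PersonImpl names below are hypothetical stand-ins for the raml-for-jax-rs generated classes:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;

// Hypothetical stand-ins for the generated interface and *Impl class.
@JsonDeserialize(as = PersonImpl.class) // tell Jackson which concrete class to build
interface Person {
    String getName();
}

class PersonImpl implements Person {
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

public class InterfaceBinding {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // Without the annotation, Jackson cannot instantiate the interface;
        // with it, readValue against Person.class succeeds.
        Person p = mapper.readValue("{\"name\":\"Ada\"}", Person.class);
        System.out.println(p.getName()); // Ada
    }
}
```

The same effect can be achieved without touching the generated sources by registering the mapping on the ObjectMapper, but the annotation route is the simplest when you control code generation.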

Spark kryo serialization register Datatype[]

To force Spark to use Kryo serialization, the option spark.kryo.registrationRequired can be set.
How can I register org.apache.spark.sql.types.DataType[]? Spark is throwing:
Class is not registered: org.apache.spark.sql.types.DataType[]
But trying to register it as classOf[org.apache.spark.sql.types.DataType[]] does not compile.
I was confusing Java and Scala array notation.
classOf[Array[org.apache.spark.sql.types.DataType]]
is the correct registration.
Nonetheless, this is a Spark-internal class that should already be registered by Spark.

Mule DataMapper MEL custom functions or calling custom transformer

I have a question about the DataMapper component and extending its behaviour. I have a scenario where I'm converting one payload to another using the DataMapper. Some of the elements in my source request arrive as strings (e.g. Male, Female) and these values need to be mapped to ID elements, known as enums in the target system. A DBLookup would suffice, but because of the structure of the enums (a.k.a. lookup tables) in the target system I'd need to define multiple DBLookups for the values which need to be changed. So I'm looking to develop a more generic way of performing the mapping. I have two proposals, which I'm currently exploring:
1) Use the invokeTransformer default function to call a custom transformer, i.e.
output.gender = invokeTransformer("EnumTransformer",input.gender);
However, even though my transformer is defined in my flow
<custom-transformer name="EnumTransformer" class="com.abc.mule.EnumTransformer" />
Running a Preview in the DataMapper fails with the following error (in Studio Error Log)
Caused by: java.lang.IllegalArgumentException: Invalid transformer name 'EnumTransformer'
at com.mulesoft.datamapper.transform.function.InvokeTransformerFunction.call(InvokeTransformerFunction.java:35)
at org.mule.el.mvel.MVELFunctionAdaptor.call(MVELFunctionAdaptor.java:38)
at org.mvel2.optimizers.impl.refl.ReflectiveAccessorOptimizer.getMethod(ReflectiveAccessorOptimizer.java:1011)
at org.mvel2.optimizers.impl.refl.ReflectiveAccessorOptimizer.getMethod(ReflectiveAccessorOptimizer.java:987)
at org.mvel2.optimizers.impl.refl.ReflectiveAccessorOptimizer.compileGetChain(ReflectiveAccessorOptimizer.java:377)
... 18 more
As the transformer is scoped to my flow and the DataMapper is outside this scope, should I assume it is not possible to invoke a custom transformer in a DataMapper? Or do I require additional setup?
2) The alternative approach would be to use a "global function". I've found the documentation in this area to be quite weak. The functionality is referenced in the cheat sheet and there is a [jira](https://www.mulesoft.org/jira/browse/MULE-6438) to improve the documentation.
Again, perhaps this functionality suffers from a scope issue. My question on this approach: can anyone provide a HOWTO on calling some Java code via MEL from a DataMapper script? This blog suggests DataMapper MEL can call Java but limits its example to string functions. Is there any example of calling a custom Java class / static method?
In general, I'm questioning whether I am approaching this wrong. Should I use a Flow Ref and call a Java component?
Update
It is perfectly acceptable to use a custom transformer from the DataMapper component. The issue I was encountering was a Mule Studio issue: Preview of a data mapping which contains a transformer does not work, because the Mule registry is not populated on the Mule context when Mule is not running.
In terms of the general approach, now that I have realized the DBLookup can accept multiple input parameters, I can use it to address my mapping scenario.
Thanks
Rich
Try providing the complete class name.
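For the mapping itself, the core of such a custom transformer can be as simple as a table lookup. A minimal sketch — the gender labels and IDs below are illustrative, and the Mule transformer wrapper (extending AbstractTransformer) is omitted:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a generic string-to-enum-ID lookup that a custom
// transformer (or global MEL function) could delegate to.
public class EnumLookup {
    private final Map<String, Integer> table = new HashMap<>();

    public EnumLookup() {
        // Illustrative values; in practice these would be loaded
        // from the target system's lookup tables (or a DBLookup).
        table.put("Male", 1);
        table.put("Female", 2);
    }

    public Integer idFor(String label) {
        Integer id = table.get(label);
        if (id == null) {
            throw new IllegalArgumentException("Unknown enum label: " + label);
        }
        return id;
    }

    public static void main(String[] args) {
        System.out.println(new EnumLookup().idFor("Female")); // 2
    }
}
```

Failing fast on an unknown label is deliberate: a silent null from the lookup would surface later as a confusing mapping error in the target system.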

Do we have KnownTypeResolver (DataContractResolver)-like functionality in JSON.NET?

Is there any easy way to load and pass/set the known types for the serializer, without 1) adding the [KnownType] attribute to the types or 2) passing a Type[]? Any help would be greatly appreciated.
After searching and checking the other available .NET JSON serializers/deserializers, I liked the way fastJSON and ServiceStack handle this. Like DataContractResolver in DataContractSerializer(), fastJSON preserves the type, and ServiceStack is nice and clean at serialization time.
So for both libraries there is no need to add the [KnownType] attribute to the classes/types or pass a Type[] to the conversion method.
Here is a link about fastJSON on CodeProject: http://www.codeproject.com/Articles/159450/fastJSON
For ServiceStack : http://www.servicestack.net/
I have also checked some articles benchmarking the .NET JSON serializers. Here are two of them:
1). http://theburningmonk.com/2012/11/json-serializers-benchmark-updated-including-mongodb-driver/
2). http://www.servicestack.net/benchmarks/

.NET to Java serialization/deserialization compatibility

Are there any compatibility issues to take care of when serializing an object in .NET and then deserializing it in Java?
I am facing problems deserializing an object in Java which was serialized in .NET.
Here is the detailed problem statement:
On the .NET platform I have a cookie:
1. The cookie is serialized.
2. Then it is encrypted using the Triple DES algorithm.
3. It is sent across to the Java application.
On the Java platform:
1. Decrypt the cookie using Triple DES, which gives some bytes.
2. Deserialize the bytes using something like
new ObjectInputStream(new ByteArrayInputStream(decryptedCookie)).readObject();
The exception stack trace I get is:
java.io.StreamCorruptedException: invalid stream header: 2F774555
at java.io.ObjectInputStream.readStreamHeader(Unknown Source)
at java.io.ObjectInputStream.(Unknown Source)
The WOX serializer provides interoperable serialization for .NET and Java.
If you serialize to XML, then you shouldn't face any problems deserializing in Java, since at worst you would have to write your own bit of code to reconstruct the objects.
The way Java and .NET serialize to binary differs. How would one runtime know the object types of the other, e.g. that .NET has Dictionaries where Java has Maps? (Plus the binary representation of a string might differ.)
You have to use some data format that both understand, and code to do the object mapping; hence the answers above mentioning XML and WOX. I have worked with internal company products as well.
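The stream-header error above is exactly what the format mismatch produces: Java's ObjectInputStream requires its own magic bytes (0xACED, then version 0x0005) at the start of the stream, which a .NET serializer never writes. A JDK-only sketch of that header check:

```java
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;

public class StreamHeaderDemo {
    public static void main(String[] args) throws Exception {
        // Java serialization always starts with the magic 0xACED
        // followed by version 0x0005 -- any other leading bytes trigger
        // StreamCorruptedException("invalid stream header: ...").
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new ObjectOutputStream(bos).writeObject("cookie");
        byte[] bytes = bos.toByteArray();
        System.out.printf("%02X%02X %02X%02X%n",
                bytes[0], bytes[1], bytes[2], bytes[3]); // ACED 0005
    }
}
```

Notably, the failing header 2F774555 is the printable ASCII text "/wEU", which looks like Base64 rather than binary; the Java side may be reading Base64-encoded bytes without decoding them first, on top of the fundamental format mismatch.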