Hazelcast: NotSerializableException - serialization

I am using a map with Hazelcast. When I do:
map.put(gen.newId(), myObject);
myObject is a very complex object and does not implement Serializable.
I thought that the configuration below would be enough to avoid having to implement Serializable:
<map name="myMap">
<in-memory-format>OBJECT</in-memory-format>
</map>
The Hazelcast doc says :
http://docs.hazelcast.org/docs/3.5/manual/html/entryprocessor.html
"When it is stored as an object (OBJECT format), then the entry processor is applied directly on the object. In that case, no serialization or deserialization is performed"
Thanks for any suggestion.

Unfortunately the object will always be serialized when map.put is called, no matter the in-memory format. This is because there are normally backups, and they need to receive a copy as well. So in this case your only way out is to make your object 'serializable'. You can use Java serialization, but you can also rely on something like Kryo to deal with complex object graphs.
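For example, a plain Kryo round trip looks roughly like this (MyComplexObject is a hypothetical stand-in for your type; wiring Kryo into Hazelcast itself would additionally need a custom serializer registered in the Hazelcast config):
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.List;

// Hypothetical complex type; note it does not implement Serializable.
class MyComplexObject {
    String name;
    List<String> tags = new ArrayList<>();
}

public class KryoRoundTrip {
    public static void main(String[] args) {
        Kryo kryo = new Kryo();
        kryo.register(MyComplexObject.class); // registration also makes the output smaller
        kryo.register(ArrayList.class);

        MyComplexObject obj = new MyComplexObject();
        obj.name = "example";
        obj.tags.add("a");

        // Serialize to bytes...
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        Output output = new Output(baos);
        kryo.writeObject(output, obj);
        output.close();

        // ...and read an identical copy back.
        Input input = new Input(baos.toByteArray());
        MyComplexObject copy = kryo.readObject(input, MyComplexObject.class);
        System.out.println(copy.name + " " + copy.tags);
    }
}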

I think you could also use more efficient Hazelcast-specific solutions. Here is a comparison table of the solutions. Portable has been working for me, but it is a pain in the ass to implement and maintain for big objects. DataSerializable is an easier solution and looks a lot like Parcelable from Android.
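As a rough illustration, a minimal DataSerializable sketch might look like this (MyObject and its fields are hypothetical stand-ins for your complex type):
import java.io.IOException;
import com.hazelcast.nio.ObjectDataInput;
import com.hazelcast.nio.ObjectDataOutput;
import com.hazelcast.nio.serialization.DataSerializable;

public class MyObject implements DataSerializable {
    private String name;
    private int count;

    public MyObject() {} // Hazelcast needs a no-arg constructor to deserialize

    @Override
    public void writeData(ObjectDataOutput out) throws IOException {
        out.writeUTF(name);   // fields are written in a fixed order...
        out.writeInt(count);
    }

    @Override
    public void readData(ObjectDataInput in) throws IOException {
        name = in.readUTF();  // ...and must be read back in the same order
        count = in.readInt();
    }
}
Nothing here is reflective, which is why it is fast but manual: nested complex fields would implement DataSerializable the same way and be written/read recursively.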

Related

Jackson JSON: Polymorphic deserialization when subclasses are unknown

I'm trying to do some polymorphic deserialization of JSON using Jackson, however the list of subclasses is unknown at compile time, so I can't use a @JsonSubTypes annotation on the base class.
Instead I want to use a TypeIdResolver class to perform the conversion to and from a property value.
The list of possible subclasses I might encounter will be dynamic, but they are all registered at run time with a registry. So I would appear to need my TypeIdResolver object to have a reference to that registry class. It has to operate in what is essentially a dependency injection environment (i.e. I can't have a singleton class that the TypeIdResolver consults), so I think I need to inject the registry class into the TypeIdResolver. The kind of code I think I want to write is:
ObjectMapper mapper = new ObjectMapper();
mapper.something(new MyTypeIdResolver(subclassRegistry));
mapper.readValue(...)
However, I can't find a way of doing the bit in the middle. The only methods I can find use Java annotations to specify what the TypeIdResolver is going to be.
This question, Is there a way to specify @JsonTypeIdResolver on mapper config instead of annotation?, is the same, though the motivation is different, and the answer is to use an annotation mixin, which won't work here.
SimpleModule has the method registerSubtypes(), with which you can register subtypes. If you only pass Classes, the simple class name is used as the type id, but you can also pass NamedType to define the type id to use for a subclass.
So, if you do know the full set, just build a SimpleModule and register it with the mapper.
Otherwise, if this does not work, you may need to resort to sharing data via a static singleton instance (if applicable), or even a ThreadLocal.
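A minimal sketch of that approach (Animal, Dog, and the "dog" type id are made up for illustration; in practice you would loop over whatever your runtime registry contains):
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.jsontype.NamedType;
import com.fasterxml.jackson.databind.module.SimpleModule;

// Hypothetical base class; the type id travels in a "type" property.
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "type")
abstract class Animal { public String name; }

class Dog extends Animal { public int barkVolume; }

public class SubtypeDemo {
    public static void main(String[] args) throws Exception {
        SimpleModule module = new SimpleModule();
        // Register each subtype discovered at run time, with its type id.
        module.registerSubtypes(new NamedType(Dog.class, "dog"));

        ObjectMapper mapper = new ObjectMapper();
        mapper.registerModule(module);

        Animal a = mapper.readValue(
                "{\"type\":\"dog\",\"name\":\"Rex\",\"barkVolume\":9}", Animal.class);
        System.out.println(a.getClass().getSimpleName()); // Dog
    }
}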
Note that in the end what I did was abandon Jackson and write my own much simpler framework based on javax.json that just did the kinds of serialisation I wanted in a much more straightforward fashion. I was only dealing with simple DTO (data transfer object) classes, so rolling my own was the simplest option.

How to store complex objects into hadoop Hbase?

I have complex objects with collection fields that need to be stored in Hadoop. I don't want to go through the whole object tree and explicitly store each field, so I am thinking about serializing the complex fields and storing each as one big piece, then deserializing it when reading the object back. What is the best way to do it? I thought about using some kind of serialization for that, but I hope Hadoop has means to handle this situation.
Sample object's class to store:
class ComplexClass {
<simple fields>
List<AnotherComplexClassWithCollectionFields> collection;
}
HBase only deals with byte arrays, so you can serialize your object in any way you see fit.
The standard Hadoop way of serializing objects is to implement the org.apache.hadoop.io.Writable interface. Then you can serialize your object into a byte array using org.apache.hadoop.io.WritableUtils.toByteArray(Writable... writables).
Also, there are other serialization frameworks that people in the Hadoop community use, like Avro, Protocol Buffers, and Thrift. All have their specific use cases, so do your research. If you're doing something simple, implementing Hadoop's Writable should be good enough.
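As a rough sketch of the Writable route (ComplexClassWritable and its fields are simplified stand-ins for the ComplexClass above; a nested complex type would implement Writable the same way and be delegated to):
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.io.WritableUtils;

public class ComplexClassWritable implements Writable {
    private String simpleField;
    private List<Integer> collection = new ArrayList<>();

    @Override
    public void write(DataOutput out) throws IOException {
        WritableUtils.writeString(out, simpleField);
        WritableUtils.writeVInt(out, collection.size()); // length prefix first
        for (int v : collection) {
            WritableUtils.writeVInt(out, v);
        }
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        simpleField = WritableUtils.readString(in);
        int size = WritableUtils.readVInt(in);
        collection = new ArrayList<>(size);
        for (int i = 0; i < size; i++) {
            collection.add(WritableUtils.readVInt(in));
        }
    }
}
WritableUtils.toByteArray(instance) then gives you the byte[] to put into an HBase cell.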

Cannot serialize Object to ViewState only Session

I have a class that is marked as serializable and have no problem storing it in the Session but when I attempt to save it in the ViewState I get:
Sys.WebForms.PageRequestManagerServerErrorException: Error serializing value
The reason is that view state serialization is done by the LosFormatter class while session serialization is done by the BinaryFormatter class. The two are subtly different and one of these subtle differences is probably causing your problem.
Take a look at this article and the documentation for LosFormatter to see if you can find some clues about what is causing your problem.
Well, it also depends on what kind of session you use. If it's in-proc, serialization doesn't take place at all; your objects are simply stored in memory.

Serialization of Objects

How does serialization of objects work? How does an object get deserialized, and an instance created from the serialized data, without a call to any constructor?
I've kept this answer language agnostic since a language wasn't given.
When the object is serialized, all the information required to rebuild it is encoded in a way that can be retrieved later. This typically includes the type of the object, as well as the values of all the instance variables.
When the object is deserialized, an area in memory of the correct size is allocated and is populated using the serialized information such that the new object is identical to the serialized one.
The running program can then refer to this new object in memory without having to actually call the constructor.
There are lots of little details which this doesn't explain, but this is the general idea of serialization/deserialization.
Are you talking about Java? If so, serialization is an extralingual object creation mechanism. It's a backdoor that uses native code to create the object without calling any constructors. Therefore, when designing a class for serializability, you need to make sure that a class created through deserialization maintains the same invariants (key fields being initialized) as you would through the constructor path. A third way to create objects in Java is through cloning, and similar issues apply.
Cloning and serialization don't interact well with the use of final fields if you need to set the value of that field to something different than what is returned by clone or the deserialization process.
Josh Bloch's "Effective Java" has some chapters that explain these issues in more depth.
(this answer may apply to other languages too, but I've only used serialization in Java)
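To make the Java case concrete, here is a small sketch (the Point class is made up for the demo). The constructor runs when the object is first built, but not during deserialization:
import java.io.*;

class Point implements Serializable {
    private static final long serialVersionUID = 1L;
    int x, y;

    Point(int x, int y) {
        this.x = x;
        this.y = y;
        System.out.println("constructor called");
    }
}

public class DeserializationDemo {
    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Point(1, 2)); // prints "constructor called" once
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Point p = (Point) in.readObject(); // constructor is NOT called again
            System.out.println(p.x + "," + p.y); // 1,2
        }
    }
}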
Regarding .NET: this isn't a definitive or textbook answer, and I might be all-out wrong...
.NET serialization needs to be separated out into binary vs. others (XML or an XML derivative, typically). Binary serialization is mostly a black box to me, but it allows objects to be serialized and restored in their current state. XML serialization typically only serializes the public fields/properties of an object, unless overridden by adding a custom ISerializable implementation.
In the case of XML serialization I believe .NET uses Reflection to determine which fields and properties get converted to their equivalent Elements. Adding an [XMLSerializable] attribute will implement a default behavior which can be adjusted by applying other attributes at the field level (such as [XMLAttribute]).
The metadata (which Reflection depends on) stores all the object members as well as their attributes and addresses, which allows the serializer to determine how it should build the output.

Easiest way to convert CArchive to use SQL database for serialization?

I have an existing application that uses CArchive to serialize an object structure to a file. I am wondering if I can replace the usage of CArchive with some custom class that writes to a database the same way things are currently serialized. Before I go about figuring out whether this is doable, I was wondering if other people have done the same: what is the best approach to this problem? I would like to create a drop-in replacement for CArchive so that the existing object structure would simply read/write to/from a database rather than a serialized file. Is it as simple as overriding the Serialize method for each class?
Short answer: forget it.
Longer answer:
CArchive doesn't have a single virtual member. Even its destructor isn't virtual, which means you're not supposed to derive from that class (C# programmers would say sealed).
There's only one possibility that I can think of to customize CArchive's work (without rewriting the whole serialization code in CDocument): construct your CArchive object by passing it a pointer to a CFile-derived class that handles the data connection for you.
From there on, how you control your database by simply overriding CFile's Read() and Write() is beyond my imagination :-(