This is what I want to achieve:
Model an object using Google Protocol Buffers in Proc #1.
Serialize that object using protobuf and send it over a POSIX message queue.
Read and deserialize the stream into an equivalent model in Proc #2, also using Protocol Buffers.
In other words:
Object in Proc #1 --> Serialize --> Send to Posix MQ --> Receive from Posix MQ --> Deserialize --> Object in Proc #2
The catch is Proc #1 and Proc #2 may be completely different language platforms. Proc #1 will usually be C++ compiled with g++. But Proc #2 can be anything: Python, Java, etc. (only limited by support for Protobuf).
Now I want to ascertain whether Protocol Buffers' serialization format is universal enough for deserialization to work in any language codebase.
YES, that is guaranteed. Protobuf defines how to serialize and deserialize the data, and Protobuf libs of any language should implement the same serialization protocol.
That's also why gRPC works across many languages, e.g. you can have a C++ gRPC server and a Java gRPC client.
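For a concrete sketch of the receiving end in Java (the SensorReading message and com.example package are invented for illustration; in practice both processes compile the same shared .proto with protoc), the consumer only needs the generated parseFrom, and the bytes written by the C++ producer's SerializeToArray parse unchanged:

import com.example.telemetry.SensorReading;  // hypothetical protoc-generated class,
                                             // e.g. from: message SensorReading { int32 id = 1; double value = 2; }

public class MqConsumer {
    // Called with the raw bytes read from the POSIX message queue.
    public static void handleMessage(byte[] payload) throws Exception {
        // parseFrom understands the same wire format the C++ writer produced.
        SensorReading reading = SensorReading.parseFrom(payload);
        System.out.println(reading.getId() + " -> " + reading.getValue());
    }
}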
I have two different Java 8 projects that will live on different servers and which will both use Akka (specifically Akka Remoting) to talk to each other.
For instance, one app might send a Fizzbuzz message to the other app:
public class Fizzbuzz {
private int foo;
private String bar;
// Getters, setters & ctor omitted for brevity
}
I've never used Akka Remoting before. I assume I need to create a 3rd project, a library/jar for holding the shared messages (such as Fizzbuzz and others), and then pull that library into both projects as a dependency.
Is it that simple? Are there any serialization (or other Akka and/or networking) considerations that affect the design of these "shared" messages? Thanks in advance!
A shared library is the way to go for sure, except there are indeed serialization concerns:
Akka-remoting docs:
When using remoting for actors you must ensure that the props and messages used for those actors are serializable. Failing to do so will cause the system to behave in an unintended way.
For more information please see Serialization.
Basically, you'll need to provide and configure the serialization for actor props and messages sent (including all the nested classes, of course). If I'm not mistaken, default settings will get you up and running without any configuration on your side, provided that everything you send over the wire is java-serializable.
However, the default config uses default Java serialization, which is known to be quite inefficient - so you might want to switch to protobuf, kryo, or maybe even JSON. In that case, it would make sense to provide the serialization implementation and bindings as a shared library - either a dedicated one or part of the "shared models" one that you mentioned in the question - depending on whether you want to reuse it elsewhere and whether you mind having serialization-related transitive dependencies popping up all over the place.
Finally, if you'll allow some personal opinion, I would suggest trying protobuf first - it's a binary format (read: efficient) and is widely supported (there are bindings for other languages). Kryo works well too (I have a few closed-source akka-cluster apps with kryo serialization in production), but it has a few quirks with regard to collection/map handling.
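For illustration, here is a rough sketch of what binding messages to Akka's bundled protobuf serializer could look like; it assumes classic remoting with akka-remote on the classpath, and normally the HOCON below would live in application.conf rather than in code:

import akka.actor.ActorSystem;
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

public class RemotingBootstrap {
    public static void main(String[] args) {
        // Register the protobuf serializer and bind all generated protobuf
        // message types to it (plain POJOs like Fizzbuzz would instead need
        // their own binding, e.g. to a kryo or JSON serializer).
        Config config = ConfigFactory.parseString(
                "akka.actor.provider = remote\n"
              + "akka.actor.serializers.proto = \"akka.remote.serialization.ProtobufSerializer\"\n"
              + "akka.actor.serialization-bindings.\"com.google.protobuf.Message\" = proto\n")
              .withFallback(ConfigFactory.load());

        ActorSystem system = ActorSystem.create("fizzbuzz-system", config);
        // ... start actors and exchange messages as usual.
    }
}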
TL;DR: How do you encode and decode an MTLSharedTextureHandle and MTLSharedEventHandler such that it can be transported across an XPC connection inside an xpc_dictionary?
A macOS application I'm working on makes extensive use of XPC services and was implemented using the C-based API. (i.e.: xpc_main, xpc_connection, xpc_dictionary...) This made sense at the time because certain objects, like IOSurfaces, did not support NSCoding/NSSecureCoding and had to be passed using IOSurfaceCreateXPCObject.
In macOS 10.14, Apple introduced new classes for sharing Metal textures and events between processes: MTLSharedTextureHandle and MTLSharedEventHandle. These classes support NSSecureCoding but they don't appear to have a counter-part in the C-XPC interface for encoding/decoding them.
I thought I could use something like [NSKeyedArchiver archivedDataWithRootObject:requiringSecureCoding:error:] to just convert them to NSData objects, which can then be stored in an xpc_dictionary, but when I try to do that, I get the following exception:
Caught exception during archival:
This object may only be encoded by an NSXPCCoder.
(NSXPCCoder is a private class.)
This happens for both MTLSharedTextureHandle and MTLSharedEventHandle. I could switch over to using the new NSXPCConnection API but I've already got an extensive amount of code built on the C-interface, so I'd rather not have to make the switch.
Is there any way to archive either of those two classes into a payload that can be stored in an xpc_dictionary for transfer between the service and the client?
MTLSharedTextureHandle only works with NSXPCConnection. If you're creating the texture from an IOSurface, you can share the surface instead, which is effectively the same thing. Make sure you are using the same GPU (same id<MTLDevice>) in both processes.
There is no workaround for MTLSharedEventHandle using public API.
I recommend switching to NSXPCConnection if you can. Unfortunately there isn't a good story for partially changing over using public API; you'll have to do it all at once or split your XPC service into two separate services.
I would like to have a C++ client application that maintains a cache of objects that come from a Java server. The objects need to be compatible. I understand that Gemfire maintains them in a serializable format. This means the Java class needs to be equivalent to the C++ class.
Is there a common practice for defining the class structure in a common place, in a language-independent specification, and generating the equivalent Java and C++ classes that are serializable to PDX or any other form that Gemfire uses?
Regards,
Yash
Before PDX I used to create a language-neutral representation of my domain and simultaneously generate Java, C++ and .Net classes using DataSerializable. However, PDX makes this unnecessary for the most part. I enclose the sample config below.
If you encounter types you are using that Java does not support, you still do not have to resort to generating serializers; you can focus on serializing just that one type (see page 564 of http://gemfire.docs.pivotal.io/pdf/pivotal-gemfire-ug.pdf).
Consider generating your own serializers when you have an insane need for speed, since the auto-serializer can be a drag on performance. This is usually not needed, but if you do, here are the instructions: http://data-docs-samples.cfapps.io/docs-gemfire/latest/javadocs/japi/com/gemstone/gemfire/DataSerializer.html
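As a rough sketch of the hand-written route (the DomainObject fields here are invented; it uses the DataSerializable interface from the classic com.gemstone.gemfire API rather than a registered DataSerializer):

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import com.gemstone.gemfire.DataSerializable;

public class DomainObject implements DataSerializable {
    private int id;
    private String name;

    public DomainObject() {}  // no-arg constructor needed for deserialization

    public DomainObject(int id, String name) {
        this.id = id;
        this.name = name;
    }

    // Write the fields in a fixed order...
    public void toData(DataOutput out) throws IOException {
        out.writeInt(id);
        out.writeUTF(name);
    }

    // ...and read them back in exactly the same order.
    public void fromData(DataInput in) throws IOException, ClassNotFoundException {
        id = in.readInt();
        name = in.readUTF();
    }
}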
Here is the configuration for using the pdx auto serializer:
<!-- Cache configuration configuring auto serialization behavior -->
<cache>
  <pdx>
    <pdx-serializer>
      <class-name>com.gemstone.gemfire.pdx.ReflectionBasedAutoSerializer</class-name>
      <parameter name="classes">
        <string>com.company.domain.DomainObject</string>
      </parameter>
    </pdx-serializer>
  </pdx>
  ...
</cache>
If I answered your question, please mark it as "Answered". Thanks.
I'm new to building CORBA applications. Presently I'm developing a CORBA application in Java. The problem I have is that I need to write a method that receives the name of the class, the method, and the arguments to pass to the CORBA server as a string.
Before invoking the remote method, I have to parse the string and obtain all the necessary information (class, method, arguments).
There is no problem here. But concerning the arguments, I do not know their types in advance, so I need to be able to convert an argument by getting its type and inserting it into an Any object to be sent. Is this possible?
If I know the type in advance, something like seq.insert_string("bum") works, but I want to do it dynamically.
Use the DynAny interfaces, if your ORB supports them. They can do exactly what you want. From CORBA Explained Simply:
If an application wants to manipulate data embedded inside an any without being compiled with the relevant stub code then the application must convert the any into a DynAny. There are sub-types of DynAny for each IDL construct. For example, there are types called DynStruct, DynUnion, DynSequence and so on.
The operations on the DynAny interfaces allow a programmer to recursively drill down into a compound data-structure that is contained within the DynAny and, in so doing, decompose the compound type into its individual components that are built-in types. Operations on the DynAny interface can also be used to recursively build up a compound data-structure from built-in types.
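As a minimal sketch (using the standard org.omg.DynamicAny API, which ships with the JDK up to Java 10 or with a standalone ORB such as JacORB; only two TypeCode kinds are handled here for brevity):

import org.omg.CORBA.Any;
import org.omg.CORBA.ORB;
import org.omg.CORBA.TCKind;
import org.omg.CORBA.TypeCode;
import org.omg.DynamicAny.DynAny;
import org.omg.DynamicAny.DynAnyFactory;
import org.omg.DynamicAny.DynAnyFactoryHelper;

public class AnyBuilder {
    // Build an Any from a string argument and a TypeCode discovered at run time,
    // without any compiled stub code for the target type.
    public static Any buildAny(ORB orb, TypeCode tc, String rawValue) throws Exception {
        DynAnyFactory factory = DynAnyFactoryHelper.narrow(
                orb.resolve_initial_references("DynAnyFactory"));
        DynAny dynAny = factory.create_dyn_any_from_type_code(tc);
        switch (tc.kind().value()) {
            case TCKind._tk_string:
                dynAny.insert_string(rawValue);
                break;
            case TCKind._tk_long:
                dynAny.insert_long(Integer.parseInt(rawValue));
                break;
            default:
                throw new IllegalArgumentException("unhandled TypeCode kind: " + tc.kind().value());
        }
        Any any = dynAny.to_any();
        dynAny.destroy();
        return any;
    }
}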
Are there any compatibility issues to take care of when serializing an object in .NET and then deserializing it in Java?
I am facing problems deserializing an object in Java that has been serialized in .NET.
Here is the detailed problem statement:
On the .NET platform I have a cookie.
1. The cookie is serialized.
2. Then it is encrypted using the Triple DES algorithm.
3. It is sent across to the Java application.
On the Java platform:
1. Decrypt the cookie using Triple DES, which gives some bytes.
2. Deserialize the bytes using something like:
new ObjectInputStream(new ByteArrayInputStream(decryptedCookie)).readObject();
The exception stack trace I get is:
java.io.StreamCorruptedException: invalid stream header: 2F774555
at java.io.ObjectInputStream.readStreamHeader(Unknown Source)
at java.io.ObjectInputStream.<init>(Unknown Source)
The WOX serializer provides interoperable serialization for .Net and Java.
If you serialize to XML then you shouldn't face any problems deserializing in Java, since at worst you have to write your own bit of code to reconstruct the objects.
The way Java and .NET serialize to binary differs.
How would one know the objects of the other? E.g. .NET will have Dictionaries and Java Maps (plus the binary representation of a string might differ).
You have to use some data format that both understand, plus code to do the object mappings. Thus the above answers mentioning XML and WOX. I have worked with internal company products as well.
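For instance, here is a rough sketch of the Java side once both platforms agree on XML (the Cookie fields and element name are invented, and it assumes a JAXB implementation is available, as bundled with Java 8):

import java.io.StringReader;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.annotation.XmlRootElement;

// Hypothetical cookie payload; field names must match the XML the .NET side emits.
@XmlRootElement(name = "cookie")
class Cookie {
    public String userName;
    public String token;
}

public class CookieReader {
    // Rebuild the object from the decrypted XML text instead of .NET binary bytes.
    public static Cookie fromXml(String xml) throws Exception {
        JAXBContext ctx = JAXBContext.newInstance(Cookie.class);
        return (Cookie) ctx.createUnmarshaller().unmarshal(new StringReader(xml));
    }
}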