I would like to have a C++ client application that maintains a cache of objects coming from a Java server. The objects need to be compatible between the two languages. I understand that GemFire keeps them in a serialized format, which means the Java class needs to be equivalent to the C++ class.
Is there a common practice for defining the class structure in one common place, in a language-independent specification, and generating the equivalent Java and C++ classes that are serializable to PDX or whatever other form GemFire uses?
Regards,
Yash
Before PDX I used to create a language-neutral representation of my domain and simultaneously generate Java, C++ and .NET classes using DataSerializable. However, PDX makes this unnecessary for the most part. I enclose the sample config below.
If you encounter types you are using that Java does not support, you still do not have to resort to generating serializers; you can focus on serializing just that one type (see page 564 of http://gemfire.docs.pivotal.io/pdf/pivotal-gemfire-ug.pdf).
Consider writing your own serializers only when you have an extreme need for speed, since the auto-serializer adds some overhead. This is usually not needed, but if you do, here are the instructions: http://data-docs-samples.cfapps.io/docs-gemfire/latest/javadocs/japi/com/gemstone/gemfire/DataSerializer.html
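For illustration, here is a minimal sketch of that hand-written approach. The class name and fields (SpeedCriticalObject, name, quantity) are made up; it implements GemFire's DataSerializable interface and uses the DataSerializer helpers from the javadoc linked above:
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import com.gemstone.gemfire.DataSerializable;
import com.gemstone.gemfire.DataSerializer;

// Hypothetical speed-critical domain class with a hand-written
// serialization path instead of the PDX auto-serializer.
public class SpeedCriticalObject implements DataSerializable {
    private String name;
    private int quantity;

    // GemFire needs a zero-argument constructor to deserialize.
    public SpeedCriticalObject() {
    }

    public SpeedCriticalObject(String name, int quantity) {
        this.name = name;
        this.quantity = quantity;
    }

    @Override
    public void toData(DataOutput out) throws IOException {
        DataSerializer.writeString(name, out);
        out.writeInt(quantity);
    }

    @Override
    public void fromData(DataInput in) throws IOException, ClassNotFoundException {
        name = DataSerializer.readString(in);
        quantity = in.readInt();
    }
}
The equivalent C++ class must write and read the same fields in the same order; that is the part a shared, language-neutral definition helps you keep in sync.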
Here is the configuration for using the pdx auto serializer:
<!-- Cache configuration configuring auto serialization behavior -->
<cache>
  <pdx>
    <pdx-serializer>
      <class-name>
        com.gemstone.gemfire.pdx.ReflectionBasedAutoSerializer
      </class-name>
      <parameter name="classes">
        <string>com.company.domain.DomainObject</string>
      </parameter>
    </pdx-serializer>
  </pdx>
  ...
</cache>
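And for completeness, a minimal sketch of what the com.company.domain.DomainObject named in that config might look like (the id and description fields are made up). It is just a plain Java class; the ReflectionBasedAutoSerializer discovers its fields via reflection, so no serialization code is needed:
package com.company.domain;

// Plain Java object; with the cache.xml above, the
// ReflectionBasedAutoSerializer writes its fields as PDX so a
// C++ (or .NET) client can read them back by field name.
public class DomainObject {
    private long id;
    private String description;

    // Zero-argument constructor for deserialization.
    public DomainObject() {
    }

    public DomainObject(long id, String description) {
        this.id = id;
        this.description = description;
    }

    public long getId() {
        return id;
    }

    public String getDescription() {
        return description;
    }
}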
If I answered your question, please mark it as "Answered". Thanks.
First, I do not speak English very well; I will try to explain as best I can.
I've created a service architecture that is used by several service applications. This architecture uses Jackson 2.5.2 for serialization/deserialization of JSON objects. If a service application uses Jackson 1.8 to generate the JSON response, the architecture throws the following error when it tries to deserialize that object:
org.codehaus.jackson.JsonParseException: Unexpected end-of-String when at [Source: N/A; line: -1, column: -1]
I don't know if the reason for the error is that the JSON response contains fields of type String as well as one field of JSONNode type.
Is it possible to resolve this problem?
While Jackson 1.x and 2.x are not directly source or binary compatible, they produce and consume exactly the same JSON. So your error message does not suggest a version incompatibility, but rather some other type of problem.
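As a quick illustration (the Response class, its fields and the sample JSON below are made up), JSON text written by a Jackson 1.8 service parses fine with a Jackson 2.x ObjectMapper as long as the text itself is complete and valid:
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class VersionCompatSketch {

    // Hypothetical DTO mirroring a response with String fields
    // plus one field bound to a JSON tree.
    public static class Response {
        public String name;
        public JsonNode payload;
    }

    public static void main(String[] args) throws Exception {
        // JSON exactly as a Jackson 1.8 service could have written it.
        String json = "{\"name\":\"demo\",\"payload\":{\"a\":1}}";

        ObjectMapper mapper = new ObjectMapper(); // Jackson 2.x
        Response r = mapper.readValue(json, Response.class);
        System.out.println(r.name + " -> " + r.payload);
    }
}
If the text parses in a test like this, the issue is more likely with how the response text reaches the deserializer than with the Jackson version that produced it.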
Is there any easy way to load and pass/set the known types for the serializer? Meaning without 1) adding the [KnownType] attribute to the types, or 2) passing a Type[]. Any help would be greatly appreciated.
After searching and checking the other available .NET JSON serializers/deserializers, I liked the way fastJSON and ServiceStack helped. Like the DataContractResolver in DataContractSerializer(), fastJSON preserves the type, and ServiceStack is nice and clean at serialization time.
So for both libraries there is no need to set the [KnownType] attribute on the classes/types or to pass a Type[] to the conversion method.
Here is a link about fastJSON in CodeProject: http://www.codeproject.com/Articles/159450/fastJSON
For ServiceStack: http://www.servicestack.net/
I have also checked some other articles benchmarking the .NET JSON serializers. Here I have listed only two of them:
1). http://theburningmonk.com/2012/11/json-serializers-benchmark-updated-including-mongodb-driver/
2). http://www.servicestack.net/benchmarks/
In my VB6 application I make several calls to a COM server my team created from an Ada project (using GNATCOM). There are basically two methods available on the COM server. Their prototypes in VB are:
Sub PutParam(Param As Parameter_Type, Value)
Function GetParam(Param As Parameter_Type)
where Parameter_Type is an enumerated type which distinguishes the many parameters I can put to/get from the COM server, and 'Value' is a Variant. PutParam() receives a Variant and GetParam() returns a Variant. (I don't really know why the VB6 Object Browser shows no reference to the Variant type on the COM server interface...)
The product of this project has been used continuously this way for years without any problems in this interface, on computers running Windows XP SP2. On computers with Windows XP SP3 we get the error 0x800706F7 ("The stub received bad data") when trying to put parameters of the 'Long' type.
Does anybody have any clue about what could be causing this? The COM server is still being built on a system with SP2. Should it make any difference to build it on a system with SP3 (like when we build for X64 on X64 systems)?
One of the calls that is causing the problem is the following (some variable names changed):
Dim StructData As StructData_Type
StructData.FirstLong = 1234567
StructData.SecondLong = 8901234
StructData.Status = True
Call ComServer.PutParam(StructDataParamType, StructData)
Where the definition of StructData_Type is:
Type StructData_Type
    FirstLong As Long
    SecondLong As Long
    Status As Boolean
End Type
(the following has been added after the question was first posted)
The definitions of the primitive calls on the COM server interface, in IDL, are presented below:
// Service to receive data
HRESULT PutParam([in] Parameter_Type Param, [in] VARIANT *Value);
// Service to send requested data
HRESULT GetParam([in] Parameter_Type Param, [out, retval] VARIANT *Value);
The definition of the structure I'm trying to pass is:
struct StructData_Type
{
    int FirstLong;
    int SecondLong;
    VARIANT_BOOL Status;
} StructData_Type;
I found it strange that this definition uses 'int' as the type of FirstLong and SecondLong, while in the VB6 Object Browser they are typed 'Long'. By the way, when I extract the IDL from the COM server (using a specific utility), those parameters are defined as Long.
Update:
I have tested the same code with a version of my COM server compiled for Windows 7 (different GNAT version, same GNATCOM version) and it works! I don't really know what happened here. I'll keep trying to identify the problem on WinXP SP3, but it is good to know that it works on Windows 7. If you have a similar problem, it may be worth trying to migrate to Windows 7.
I'll focus on explaining what the error means, there are too few hints in the question to provide a simple answer.
A "stub" is used in COM when you make calls across an execution boundary. It wasn't stated explicitly in the question but your Ada program is probably an EXE and implements an out-of-process COM server. Crossing the boundary between processes in Windows is difficult due to their strong isolation. This is done in Windows by RPC, Remote Procedure Call, a protocol for making calls across such boundaries, a network being the typical case.
To make an RPC call, the arguments of a function must be serialized into a network packet. COM doesn't know how to do this on its own because it doesn't know enough about the actual arguments of a function; it needs the help of a proxy, a piece of code that does know what the argument types are. On the receiving end is a very similar piece of code that does the exact opposite of what the proxy does: it deserializes the arguments and makes the internal call. This is the stub.
One way this can fail is when the stub receives a network packet and it contains more or less data than required for the function argument values. Clearly it won't know what to do with that packet, there is no sensible way to turn that into a StructData_Type value, and it will fail with "The stub received bad data" error.
So the very first explanation for this error to consider is a DLL Hell problem. A mismatch between the proxy and the stub. If this app has been stable for a long time then this is not a happy explanation.
There's another aspect of your code snippet that is likely to induce this problem. Structures are very troublesome beasts in software: their members are aligned to their natural storage boundary, and the alignment rules are subject to interpretation by the respective compilers. This can certainly be the case for the structure you quoted. It needs 10 bytes to store the fields, 4 + 4 + 2, and they align naturally. But the structure is actually 12 bytes long: two bytes are padded at the end to ensure that the ints still align when the structure is stored in an array. This also makes COM's job very difficult, since COM hides implementation detail and structure alignment is a massive detail. It needs help to copy a structure, the job of the IRecordInfo interface. The stub will also fail when it cannot find an implementation of that interface.
I'll talk a bit about the proxy, stub and IRecordInfo. There are two basic ways a proxy/stub pair are generated. One way is by describing the interfaces in a language called IDL, Interface Description Language, and compile that with MIDL. That compiler is capable of auto-generating the proxy/stub code, since it knows the function argument types. You'll get a DLL that needs to be registered on both the client and the server. Your server might be using that, I don't know.
The second way is what VB6 uses, it takes advantage of a universal proxy that's built into Windows. Called FactoryBuffer, its CLSID is {00000320-0000-0000-C000-000000000046}. It works by using a type library. A type library is a machine readable description of the functions in a COM server, good enough for FactoryBuffer to figure out how to serialize the function arguments. This type library is also the one that provides the info that IRecordInfo needs to figure out how the members of a structure are aligned. I don't know how it is done on the server side, never heard of GNATCOM before.
So a strong explanation for this problem is that you are having a problem with the type library. Especially tricky in VB6 because you cannot directly control the guids that it uses. It likes to generate new ones when you make trivial changes, the only way to avoid it is by selecting the binary compatibility option. Which uses an old copy of the type library and tries to keep the new one as compatible as possible. If you don't have that option turned on then do expect trouble, especially for the guid of the structure. Kaboom if it changed and the other end is still using the old guid.
Just some hints on where to start looking. Do not assume it is a problem caused by SP3; this COM infrastructure hasn't changed for a very long time. But certainly expect this kind of problem due to a new operating system version being installed and having to re-register everything. SysInternals' ProcMon is a good utility to see how the programs use the registry to find the proxy, stub and type library. And you'd certainly get help from a COM Spy kind of utility, albeit that they are very hard to find these days.
If it suddenly stopped working happily on XP, the first culprit I'd look for is type mismatches. It is possible that "long" on such systems is now 64 bits, while your Ada COM code (and/or perhaps your C ints) is expecting 32 bits. With a traditionally-compiled system this would have been checked for you by your compiler, but the extra indirection you have with COM makes that difficult.
The bit you wrote in there about "when we compile for 64-bit systems" makes me particularly leery. 64-bit compiles may change the size of many C types, you know.
This related post suggests you need padding in your struct, as the marshalling code may expect more data than you actually send (which is a bug, of course). Your struct contains 10 bytes (4 bytes for each of the ints/longs and 2 for the boolean). Try to add padding so that your struct contains a multiple of 4 bytes (or, failing that, a multiple of 8, as the post isn't clear on the expected size).
I am also suggesting that the problem is due to a padding issue in your structure. I don't know whether you can control this using a #pragma, but it might be worth looking at your documentation.
I think it would be a good idea to try and patch your struct so that the resulting type library struct is a multiple of four (or eight). Your Status member takes up 2 bytes, so maybe you should insert a dummy value of the same type either before or after Status - which should bring it up to 12 bytes (if packing to eight bytes, this would have to be three dummy variables).
I have been trying to use protobuf-net with MonoTouch, but I have no idea how, and despite having heard that it is possible, I haven't been able to find any tutorial or example that actually works.
It was confirmed by Marc Gravell on his blog that it does work on MonoTouch. I have also looked through the blogs of the two people he mentions in that article, but I haven't found anything related to protobuf.
Having no lead on the subject, I decided to download protobuf-net and try it out anyway. So I created the following object for testing purposes:
[ProtoContract]
public class ProtoObject
{
    public ProtoObject()
    {
    }

    [ProtoMember(1)]
    public byte[] Bytes { get; set; }
}
and I tried to send it through WCF from a service running on Windows, using a [ServiceContract] interface with
[OperationContract]
ProtoObject GetObject();
but the instance of ProtoObject received on the device is always null. This is not really unexpected, since I have read that to make protobuf-net work with WCF you need to modify the app.config/web.config.
That is a little hard to accomplish since a MonoTouch project has no app.config, but I did not give up yet. To replace the app.config, I tried to add the ProtoEndpointBehavior to the client endpoint's behaviors programmatically, and there I hit a wall: ProtoBuf.ServiceModel.ProtoEndpointBehavior, available in the .NET 3.0 implementation of protobuf-net, is not available in the iOS release.
How would I go about using protobuf-net to deserialize objects received from a Windows-based WCF endpoint that uses protobuf-net serialization?
It is actually pretty much the same as described in this blog entry by Friction Point Studios. Since meta-programming on the device is not really an option, the trick is to pre-generate a serialization dll. This can be done by creating a small console exe (this is just a tool - it isn't designed to be pretty) that configures a RuntimeTypeModel (by adding the types you are interested in) and then calls .Compile(...):
var model = TypeModel.Create();
model.Add(typeof (ProtoObject), true);
model.Compile("MySerializer", "MySerializer.dll");
This generates a serializer dll; simply reference this dll (along with the iOS version of protobuf-net), and use the serializer type in the dll to interact with your model:
var ser = new MySerializer();
ser.Serialize(dest, obj); // etc
Just to bring this up to date: there are a few issues with using WCF + protobuf-net on MonoTouch. As you have observed, the current releases of System.ServiceModel and the protobuf-net "light" build for iOS don't include all the necessary bits.
However, if you get the full System.ServiceModel from the Mono repository on GitHub and build it against the full protobuf-net source, you can get it to work; I have done so.
You need to generate a serialisation assembly using the precompile tool, then edit the ProtoOperationBehavior attribute to give it some way to reference your serialisation assembly. All the changes are too extensive to document here, but it can be done, and it is a lot faster than DataContractSerializer, which is pretty awful on iOS.