Windows Phone - Serialization for tombstoning

I'm looking for a serializer for Windows Phone 8. The DataContractSerializer doesn't work for me because the data members need to be public and they need public setters.
I don't need a huge library for tombstoning; I think a smart serializer would fit my needs. It would be nice if the serializer returned a serialization string that can be stored in PhoneApplicationPage.State, because I don't want to use IsolatedStorage.

Well, you could use Json.NET, which you can install with NuGet.
Not sure if that works for you though, when DataContractSerializer doesn't.
Serialize example:
String result = await JsonConvert.SerializeObjectAsync(yourobject);
or
String result = JsonConvert.SerializeObject(yourobject);
and deserialize example:
T data = Newtonsoft.Json.JsonConvert.DeserializeObject<T>(yourstringobject);
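If you go down this route, tombstoning itself could look like the following minimal sketch (assumptions: a hypothetical User object held in a user field; State is the PhoneApplicationPage.State dictionary mentioned in the question):
protected override void OnNavigatedFrom(NavigationEventArgs e)
{
    base.OnNavigatedFrom(e);
    // Store the object as a JSON string in the page state dictionary.
    State["user"] = JsonConvert.SerializeObject(user);
}

protected override void OnNavigatedTo(NavigationEventArgs e)
{
    base.OnNavigatedTo(e);
    object json;
    if (State.TryGetValue("user", out json))
        user = JsonConvert.DeserializeObject<User>((string)json);
}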

Related

How can I get one property from a large JSON object in .net core?

I have used JsonSerializer to deserialize JSON before.
Now I have a large JSON document to deserialize, and I only need one of its properties (such as "address": "PK road").
I cannot make the service return only that one property, because it is a third-party API.
As far as I know, using JsonSerializer means writing a large model class, most of which would be useless.
I don't want to do that useless work just to get one property.
Is there any faster way to achieve this? Thank you.
You can achieve this with JObject in Newtonsoft.Json:
var jObj = JObject.Parse(myJsonString);
var myObj = jObj.SelectToken("address").Value<string>();
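If you are on System.Text.Json instead, a minimal sketch with JsonDocument achieves the same without writing a model class (myJsonString as above):
using (JsonDocument doc = JsonDocument.Parse(myJsonString))
{
    // Read just the one property; nothing else is materialized.
    string address = doc.RootElement.GetProperty("address").GetString();
}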

How to convert existing POCO classes in C# to google Protobuf standard POCO

I have POCO classes, and I use Newtonsoft Json for serialization. Now I want to migrate to Google Protocol Buffers. Is there any way I can migrate all my classes (not manually) so that I can use Protocol Buffers for serialization and deserialization?
Do you just want it to work? The absolute simplest way to do this would be to use protobuf-net and add [ProtoContract(ImplicitFields = ImplicitFields.AllPublic)]. What this does is tell protobuf-net to make up the field numbers, which it does by taking all the public members, sorting them alphabetically, and just counting upwards. Then you can use your type with ProtoBuf.Serializer and it should behave in the way you expect.
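For illustration, a minimal sketch of that implicit-fields approach (the Person class is hypothetical; requires using ProtoBuf;):
[ProtoContract(ImplicitFields = ImplicitFields.AllPublic)]
public class Person
{
    // Field numbers are inferred: public members are sorted alphabetically
    // and numbered upwards, so Age becomes 1 and Name becomes 2.
    public int Age { get; set; }
    public string Name { get; set; }
}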
This is simple, but it isn't very robust. If you add, remove or rename members it can all get out of sync. The problem here is that the protocol buffers format doesn't include names - just field numbers, and it is much harder to guarantee numbers over time. If your type is likely to change, you probably want to define field numbers explicitly. For example:
[ProtoContract]
public class Foo {
    [ProtoMember(1)]
    public int Id { get; set; }
    [ProtoMember(2)]
    public List<string> Names { get; } = new List<string>();
}
One other thing to watch out for would be non-zero default values. By default protobuf-net assumes certain things about implicit default values. If you are routinely using non-zero default values without doing it very carefully, protobuf-net may misunderstand you. You can turn that off globally if you desire:
RuntimeTypeModel.Default.UseImplicitZeroDefaults = false;
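Either way, serialization itself is then a plain round-trip through ProtoBuf.Serializer; a minimal sketch, assuming the Foo class above and an existing foo instance:
using (var ms = new MemoryStream())
{
    // Write the object in protocol buffers format...
    Serializer.Serialize(ms, foo);
    ms.Position = 0;
    // ...and read it back.
    Foo copy = Serializer.Deserialize<Foo>(ms);
}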

F#, Json.NET 6.0 and WebApi - serialization of record types

Json.NET 6.0.1 adds F# support for records and discriminated unions. When serializing an F# record type using Json.NET I now get nicely formatted JSON.
The serialization is done as follow:
let converters = [| (new StringEnumConverter() :> JsonConverter) |]
JsonConvert.SerializeObject(questionSet, Formatting.Indented, converters)
However, when I try to expose my F# types through an ASP.NET WebApi 5.0 service, written in C#, the serialized JSON includes a #-sign in front of all properties. The #-sign comes from the internal backing field for the record type (this used to be a known problem with Json.NET and F#).
But since I'm using the updated version of Json.NET, shouldn't the result be the same as when calling JsonConvert? Or does JsonConvert behave differently from JsonTextWriter and JsonTextReader?
As far as I can tell from reading the JsonMediaTypeFormatter in the WebApi source, JsonTextWriter and JsonTextReader are what WebApi uses.
You can adorn your records with the [<CLIMutable>] attribute:
[<CLIMutable>]
type MyDtr = {
    Message : string
    Time : string }
That's what I do.
For nice XML formatting, you can use:
GlobalConfiguration.Configuration.Formatters.XmlFormatter.UseXmlSerializer <- true
For nice JSON formatting, you can use:
config.Formatters.JsonFormatter.SerializerSettings.ContractResolver <-
Newtonsoft.Json.Serialization.CamelCasePropertyNamesContractResolver()
I believe it's because the backing fields that are emitted by F# records don't follow the same naming convention as C# property backing fields.
The easiest way I've found to get around this is to change the ContractResolver at the startup of your web application from the System.Net.Http.Formatting.JsonContractResolver to the Newtonsoft.Json.Serialization.DefaultContractResolver instead:
Formatters.JsonFormatter.SerializerSettings.ContractResolver <- DefaultContractResolver()
You'll then get all JSON formatting done via Newtonsoft's JSON formatter rather than the default .NET one.

Complex objects in settings - serialization issue

The default settings serializer supports only simple types. How should I save complex classes? For example:
public class User
{
    public string Name { get; set; }
    public int Age { get; set; }
}
Now I have to save each field of the complex object as a separate setting to make it work.
Please advise.
The easiest approach is to serialize your settings object and store it as a string. I would recommend Json.NET for doing this.
string json = Newtonsoft.Json.JsonConvert.SerializeObject(mySettings);
// do something with this string
You can then create a new object from the JSON:
MySettingsObject mySettings = Newtonsoft.Json.JsonConvert.DeserializeObject<MySettingsObject>(json);
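For example, in a WinRT/Windows Phone app the string can go straight into ApplicationData.Current.LocalSettings; a sketch (the "user" key is illustrative, MySettingsObject as above):
var settings = Windows.Storage.ApplicationData.Current.LocalSettings;

// Save: the whole object becomes a single string setting.
settings.Values["user"] = Newtonsoft.Json.JsonConvert.SerializeObject(mySettings);

// Load: rebuild the object from the stored string.
object json;
if (settings.Values.TryGetValue("user", out json))
{
    var restored = Newtonsoft.Json.JsonConvert.DeserializeObject<MySettingsObject>((string)json);
}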
You can also take a look at the Generic Object Storage Helper for WinRT, available at http://winrtstoragehelper.codeplex.com.
This library serializes your objects using XML format.

An alternative way to use Azure Table Storage?

I'd like to use an entity like this for table storage:
public class MyEntity
{
    public String Text { get; private set; }
    public Int32 SomeValue { get; private set; }

    public MyEntity(String text, Int32 someValue)
    {
        Text = text;
        SomeValue = someValue;
    }
}
But it's not possible, because ATS needs:
1. a parameterless constructor;
2. all properties public and read/write;
3. to inherit from TableServiceEntity.
The first two are things I don't want to do. Why should I want anybody to be able to change data that should be read-only, or to create objects of this kind in an inconsistent way (what are .ctors for, then?), or, even worse, to alter the PartitionKey or the RowKey? Why are we still constrained by these deserialization requirements?
I don't like developing software that way. How can I use the table storage library in a way that lets me serialize and deserialize the objects myself? I think that as long as the objects inherit from TableServiceEntity it shouldn't be a problem.
So far I have managed to save an object, but I don't know how to retrieve it:
Message m = new Message("message XXXXXXXXXXXXX");
CloudTableClient tableClient = account.CreateCloudTableClient();
tableClient.CreateTableIfNotExist("Messages");
TableServiceContext tcontext = new TableServiceContext(account.TableEndpoint.AbsoluteUri, account.Credentials);
var list = tableClient.ListTables().ToArray();
tcontext.AddObject("Messages", m);
tcontext.SaveChanges();
Is there any way to avoid those deserialization requirements or get the raw object?
Cheers.
If you want to use the Storage Client Library, then yes, there are restrictions on what you can and can't do with your objects that you want to store. Point 1 is correct. I'd expand point 2 to say "All properties that you want to store must be public and read/write" (for integer properties you can get away with having read only properties and it won't try to save them) but you don't actually have to inherit from TableServiceEntity.
TableServiceEntity is just a very light class that has the properties PartitionKey, RowKey, Timestamp and is decorated with the DataServiceKey attribute (take a look with Reflector). All of these things you can do to a class that you create yourself and doesn't inherit from TableServiceEntity (note that the casing of these properties is important).
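For illustration, a minimal sketch of such a self-made entity (the Text property is hypothetical; note the exact casing of PartitionKey, RowKey and Timestamp):
[System.Data.Services.Common.DataServiceKey("PartitionKey", "RowKey")]
public class MyOwnEntity
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public DateTime Timestamp { get; set; }
    public string Text { get; set; }
}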
If this still doesn't give you enough control over how you build your classes, you can always ignore the Storage Client Library and just use the REST API directly. This will give you the ability to serialize and deserialize the XML any way you like. You will lose all of the nice things that come with using the library, like the ability to create queries in LINQ.
The constraints around that ADO.NET wrapper for the Table Storage are indeed somewhat painful. You can also adopt a Fat Entity approach as implemented in Lokad.Cloud. This will give you much more flexibility concerning the serialization of your entities.
Just don't use inheritance.
If you want to use your own POCOs, create your class as you want it and create a separate table entity wrapper/container class that holds the PartitionKey and RowKey and carries your class as a serialized byte array, as sketched below.
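A minimal sketch of that wrapper idea (the PocoEntity and Payload names are illustrative, not from the original answer; JSON over UTF-8 is one possible serialization choice):
public class PocoEntity : TableServiceEntity
{
    public PocoEntity() { } // parameterless ctor required by the storage library

    public PocoEntity(string partitionKey, string rowKey, object poco)
        : base(partitionKey, rowKey)
    {
        Payload = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(poco));
    }

    // The wrapped POCO, stored as a serialized byte array.
    public byte[] Payload { get; set; }

    public T GetPoco<T>()
    {
        return JsonConvert.DeserializeObject<T>(Encoding.UTF8.GetString(Payload, 0, Payload.Length));
    }
}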
You can use composition to achieve what you want.
Create your Table Entities as you need to for storage and create your POCOs as wrappers on those providing the API you want the rest of your application code to see.
You can even mix in some interfaces for better code.
How about generating the POCO wrappers at runtime using System.Reflection.Emit? See http://blog.kloud.com.au/2012/09/30/a-better-dynamic-tableserviceentity/