Per query JSON serializer in Spring Data Rest

Is there any way to define a JSON serializer per query? I would like to be able to define different JSON output for some queries, something similar to this:
@RepositoryRestResource(collectionResourceRel = "people", path = "person")
public interface PersonJpaRepository extends JpaRepository<Person, Long> {

    @JsonSerialize(using = SimplePersonSerializer.class)
    List<Person> findAll();

    @JsonSerialize(using = FullPersonSerializer.class)
    List<Person> findByNameOrderByCreationDateDesc(String name);
}
In this scenario, SimplePersonSerializer would be used to serialize a huge list of results, and FullPersonSerializer only a few results.

Without any further information, it looks like you want projections. Projections define a subset of an entity's properties. The feature is not mentioned in the official documentation, but it is described in the release notes for Spring Data REST 2.1.
You just need to define an interface that contains the subset of properties:
@Projection(name = "simple", types = Person.class)
interface SimplePerson {

    String getFirstName();

    String getLastName();
}
You don't have to change your repository. The only thing that changes is the URL you are calling: http://myapp/people?projection=simple.

Related

What is the difference between Provider and Resolver

I often need a class/service that will give me some data by fetching it from a DB, by transforming an existing data structure, or by doing both internally, but I sometimes have difficulty naming these classes properly.
I am currently working with Sylius, which uses classes/services with suffixes such as Checker, Applicator, Processor... I have a clear understanding of those names and their implications as to what the class does and how. But there are also the suffixes Provider and Resolver, and I have difficulty differentiating between them; I don't understand the exact difference in their naming.
What I observed is:
Provider: fetching data that are not yet available (internally fetching data from DB or external API)
Resolver: I already have a bunch of data (and I don't need any additional data) and I need to filter, transform or get some subset of it.
Is there some convention or design pattern behind the names Resolver and Provider? Am I somewhat right here? Or is there more nuance to this naming?
In my view, patterns do not depend on technology or language, so this article can be applied here:
Content Providers provide an interface, e.g. for publishing and consuming data
and:
The Content Resolver resolves a request to publish or consume data to a specific Content Provider.
The Content Resolver includes the CRUD (create, read, update, delete) methods corresponding to the abstract methods (insert, query, update, delete) in the Content Provider class.
UPDATE
A Provider is an abstraction that can be implemented by concrete providers. E.g., IDataProvider below is such an abstraction, and SqlServerProvider, PostgreProvider, and OracleProvider are the concrete implementations we want.
Let me show an example via C#:
public interface IDataProvider
{
    string GetById();
}

public class SqlServerProvider : IDataProvider
{
    public string GetById()
    {
        return "Data retrieved with SqlServerProvider";
    }
}

public class PostgreProvider : IDataProvider
{
    public string GetById()
    {
        return "Data retrieved with PostgreProvider";
    }
}

public class OracleProvider : IDataProvider
{
    public string GetById()
    {
        return "Data retrieved with OracleProvider";
    }
}
Then we need to resolve the above dependencies to use them. But how? We can create a DataResolver:
public enum DataProviderType
{
    SqlServer, Postgre, Oracle
}

public class DataResolver
{
    private Dictionary<DataProviderType, IDataProvider> _dataProviderByType =
        new Dictionary<DataProviderType, IDataProvider>()
        {
            { DataProviderType.SqlServer, new SqlServerProvider() },
            { DataProviderType.Postgre, new PostgreProvider() },
            { DataProviderType.Oracle, new OracleProvider() },
        };

    public IDataProvider Resolve(DataProviderType dataProviderType)
    {
        return _dataProviderByType[dataProviderType];
    }
}
and then we can run the above code like this:
DataResolver dataResolver = new DataResolver();
string someValue = dataResolver.Resolve(DataProviderType.SqlServer).GetById();
Console.WriteLine(someValue); // OUTPUT: Data retrieved with SqlServerProvider
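The same lookup can also be written as a C# 8 switch expression; a minimal equivalent sketch, using the same types as above:
public IDataProvider Resolve(DataProviderType dataProviderType) =>
    dataProviderType switch
    {
        DataProviderType.SqlServer => new SqlServerProvider(),
        DataProviderType.Postgre => new PostgreProvider(),
        DataProviderType.Oracle => new OracleProvider(),
        // unreachable unless the enum grows; fail loudly in that case
        _ => throw new ArgumentOutOfRangeException(nameof(dataProviderType))
    };
Either way, the Resolver's only job is to pick a concrete Provider; the Providers do the actual data access.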
See more examples of code here

How to support C# dynamic types in a gRPC proto file

We have a POST action in our ASP.NET Core application that accepts a dynamic object.
[HttpPost]
public Task<ActionResult> SubmitAsync(dynamic unitOfWork)
We'd like to transform this POST action to a gRPC server and we'd like to continue receiving dynamic objects in the gRPC service. What is the equivalent of C# dynamic definition in gRPC protobuf file definition? Or if that cannot be achieved what's the best way to receive a dynamic object?
That isn't really a thing right now. In protobuf terms, Any is the closest thing, but I have not yet implemented that in protobuf-net (it is on my short term additions list). The legacy "dynamic types" feature in protobuf-net (that sends type metadata) is actively being phased out, with Any being the preferred route since it allows cross-platform usage and doesn't have the same metadata dependencies.
Frankly, though, I'd probably say "just don't do this"; instead, prefer oneof; it isn't likely that you actually mean "anything" - you probably just mean "one of these things that I expect, but I don't know which", and oneof expresses that intent. More: protobuf-net implements inheritance via oneof, so a good option is something like:
[ProtoContract]
[ProtoInclude(1, typeof(FooRequest))]
[ProtoInclude(2, typeof(BarRequest))]
public abstract class RequestBase {}

[ProtoContract]
public class FooRequest : RequestBase {}

[ProtoContract]
public class BarRequest : RequestBase {}
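For completeness, here is a minimal sketch of how such a base type might be exposed from a code-first protobuf-net.Grpc service contract; the service name, method, and reply type are hypothetical, not part of the original answer:
using System.ServiceModel;
using System.Threading.Tasks;

[ServiceContract]
public interface IWorkService
{
    // Accepts the abstract base; protobuf-net picks the concrete subtype
    // (FooRequest or BarRequest) via the ProtoInclude field numbers.
    [OperationContract]
    Task<SubmitReply> SubmitAsync(RequestBase request);
}

[ProtoContract]
public class SubmitReply
{
    [ProtoMember(1)]
    public bool Accepted { get; set; }
}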
You can pass messages with fields whose type was not known in advance. You can also pass messages with fields that are not typed, such as dynamic objects that can take any scalar value or collection; null values are allowed.
To do so, import the proto file "google/protobuf/struct.proto" and declare the dynamic type as google.protobuf.Value.
So, first add the line below at the top of your proto file:
import "google/protobuf/struct.proto";
Here is my sample message with two dynamic fields:
message BranchResponse {
  google.protobuf.Value BranchId = 1;
  google.protobuf.Value BranchLevel = 2;
}
Note that the generated type in C# is Value, in the Google.Protobuf.WellKnownTypes namespace, which belongs to the Google.Protobuf assembly. This type implements the IMessage, IMessage<Value>, IEquatable<Value>, IDeepCloneable<Value>, and IBufferMessage interfaces, all of which belong to the Google.Protobuf assembly, except for IEquatable<Value>, which comes from the .NET System.Runtime assembly. To write dynamic values, there is a set of static Value.For* factory methods, shown below:
We can fill a BranchResponse model like this:
var branch = new BranchResponse();
branch.BranchId = Value.ForNumber(1);
branch.BranchLevel = Value.ForStruct(new Struct
{
    Fields =
    {
        ["LevelId"] = Value.ForNumber(1),
        ["LevelName"] = Value.ForString("Gold"),
        ["IsProfessional"] = Value.ForBool(true)
    }
});
Reading the Value type is straightforward: it has a set of properties that expose its content as the wanted type (NumberValue, StringValue, BoolValue, StructValue, and so on; these are the read counterparts of the write helpers).
At the end, you need to read the data from your response model. Here are the C# classes that my response model is supposed to bind to:
public class BranchModel
{
    public int BranchId { get; set; }
    public LevelModel Level { get; set; }
}

public class LevelModel
{
    public int LevelId { get; set; }
    public string LevelName { get; set; }
    public bool IsProfessional { get; set; }
}
Finally:
var branch = new BranchResponse(); // received filled from a gRPC call

// Read from the response (not from the model being initialized)
var branchModel = new BranchModel
{
    BranchId = Convert.ToInt32(branch.BranchId.NumberValue),
    Level = new LevelModel
    {
        LevelId = Convert.ToInt32(branch.BranchLevel.StructValue.Fields["LevelId"].NumberValue),
        LevelName = branch.BranchLevel.StructValue.Fields["LevelName"].StringValue,
        IsProfessional = branch.BranchLevel.StructValue.Fields["IsProfessional"].BoolValue,
    }
};
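Since a Value can carry any kind of payload, it can be worth checking its KindCase before reading; a small defensive sketch, assuming the same BranchResponse as above:
// Value exposes which member of its kind oneof is set via KindCase,
// so we can verify the payload before converting it.
if (branch.BranchId.KindCase == Value.KindOneofCase.NumberValue)
{
    int branchId = Convert.ToInt32(branch.BranchId.NumberValue);
}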

Fluent nHibernate SubclassMap and AddFromAssemblyOf

I created a generic user repository base class that provides reusable user management functionality.
public class UserRepository<TUser> where TUser : IUser, new()
{
}
I have a concrete implementation of IUser called UserImpl, and a corresponding mapping class UserImplMap : ClassMap<UserImpl> (they are all in the same namespace and assembly). I add the mappings using AddFromAssemblyOf, and I also use this to create/generate the schema.
So far so good and things work as expected.
Now, in a different project, I needed a few additional properties in my IUser implementation class, so I implemented a new class UserImplEx : UserImpl. This class has the additional properties that I needed. I also created a new mapping class UserImplExMap : SubclassMap<UserImplEx>.
Now when I create the schema using this approach, I get two tables, one for UserImpl and one for UserImplEx.
Is it possible to configure / code the Fluent mapping in some way so that all the properties (its own, plus inherited) of UserImplEx get mapped in a single table UserImplEx instead of being split across two tables?
Alternatively, if I provide a full mapping in UserImplExMap : ClassMap<UserImplEx>, then I do get the schema as desired, but I also get an additional table for UserImpl (because the corresponding mapping is present in the UserRepository assembly). If I follow this approach, is there a way to tell AddFromAssemblyOf to exclude specific mapping classes?
Option 1
Since you have inheritance here and want the correct type back, NH has to store the type somewhere, either through the table the data is in or through a discriminator.
If a discriminator column in the table does not matter, then add DiscriminateSubClassesOnColumn("userType", "user") in UserImplMap and DiscriminatorValue("userEx") in UserImplExMap.
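A minimal sketch of the two mappings under that approach (the mapped properties are hypothetical; DiscriminateSubClassesOnColumn is the Fluent NHibernate call that emits the discriminator):
public class UserImplMap : ClassMap<UserImpl>
{
    public UserImplMap()
    {
        // "userType" is the discriminator column; "user" marks base-class rows
        DiscriminateSubClassesOnColumn("userType", "user");
        Id(x => x.Id);               // hypothetical property
        Map(x => x.UserName);        // hypothetical property
    }
}

public class UserImplExMap : SubclassMap<UserImplEx>
{
    public UserImplExMap()
    {
        // subclass rows live in the same table, tagged with this value
        DiscriminatorValue("userEx");
        Map(x => x.ExtraProperty);   // hypothetical property
    }
}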
Option 2
class MyTypeSource : ITypeSource
{
    private ITypeSource _inner = new AssemblyTypeSource(typeof(UserImplMap).Assembly);

    public IEnumerable<Type> GetTypes()
    {
        // exclude the base mapping and add the subclass mapping instead
        return _inner.GetTypes()
            .Where(t => t != typeof(UserImplMap))
            .Concat(new[] { typeof(UserImplExMap) });
    }

    public void LogSource(IDiagnosticLogger logger)
    {
        _inner.LogSource(logger);
    }

    public string GetIdentifier()
    {
        return _inner.GetIdentifier();
    }
}
and when configuring
.Mappings(m =>
{
    var model = new PersistenceModel();
    model.AddMappingsFromSource(new MyTypeSource());
    m.UsePersistenceModel(model);
})

WCF, LINQ Error: cannot implicitly convert type System.Linq.IOrderedQueryable<> to System.Collections.Generic.List<>

I am getting an error; I am using Entity Framework and WCF.
Error: cannot implicitly convert type 'System.Linq.IOrderedQueryable<xDataModel.Info>' to 'System.Collections.Generic.List<xServiceLibrary.Info>'
Below is my code:
WCF Service:
namespace xServiceLibrary
{
    public List<Info> GetScenario()
    {
        xEntities db = new xEntities();
        var query = from qinfo in db.Infoes
                    select qinfo;
        //return query.Cast<Info>().ToList(); (not working)
        //return query.ToList(); (not working)
        return query;
    }
}
Interface:
namespace xServiceLibrary
{
    [OperationContract]
    List<Info> GetScenario();
}
Class:
namespace xServiceLibrary
{
    [DataContract]
    public class Info
    {
        [DataMember]
        public int Scenario_Id;

        [DataMember]
        public string Scenario_Name { get; set; }

        [DataMember]
        public string Company_Name { get; set; }
    }
}
Update (2):
I have two class library projects.
One is the xDataModel namespace, in which I created the xmodel.edmx file.
The second is the xServiceLibrary namespace, where I am implementing the WCF service.
I have referenced the xDataModel.dll in my xServiceLibrary so that I can query my EF model.
I am not able to understand the concept. Any help would be appreciated.
The problem is that you have two different types named Info: DataModel.Info and ServiceLibrary.Info - because these are different types you cannot cast one into the other.
If there is no strong reason for both being there I would eliminate one of them. Otherwise as a workaround you could project DataModel.Info to ServiceLibrary.Info by copying the relevant properties one by one:
var results = (from qinfo in db.Infoes
               select new ServiceLibrary.Info()
               {
                   Scenario_Id = qinfo.Scenario_Id,
                   //and so on
               }).ToList();
The problem is that you have two different classes, both called Info, both in scope at the time you run your query. This is a very very bad thing, especially if you thought they were the same class.
If DataModel.Info and ServiceLibrary.Info are the same class, you need to figure out why they are both in scope at the same time and fix that.
If they are different classes, you need to be explicit about which one you are trying to return. Assuming that your EF model includes a set of DataModel.Info objects, your options there are:
1. Return a List<DataModel.Info>, which you can get by calling query.ToList().
2. Return a List<ServiceLibrary.Info>, which you can get by copying the fields from your DataModel.Info objects:
var query = from qinfo in db.Infoes
            select new ServiceLibrary.Info
            {
                Scenario_Id = qinfo.Scenario_Id,
                Scenario_Name = qinfo.Scenario_Name,
                Company_Name = qinfo.Company_Name
            };
3. Return something else, such as a custom DTO object, similar to #2 but with only the specific fields you need (e.g. if ServiceLibrary.Info is a heavy object you don't want to pass around).
In general, though, your problem is centered around the fact that the compiler is interpreting List<Info> as List<ServiceLibrary.Info> and you probably don't want it to.

Returning datasets from LINQ to SQL in a REST/WCF service

I have a WCF/REST web service, and I'm considering using LINQ to SQL to return database info from it.
It's easy enough to do basic queries against tables and return rows, for example:
[WebGet(UriTemplate = "")]
public List<User> GetUsers()
{
    List<User> ret = new List<User>();
    using (MyDataContext context = new MyDataContext())
    {
        var userResults = from u in context.Users select u;
        ret = userResults.ToList<User>();
    }
    return ret;
}
But what if I want to return data from multiple tables or that doesn't exactly match the schema of the table? I can't figure out how to return the results from this query, for example:
var userResults = from u in context.Users
                  select new { u.userID, u.userName, u.userType,
                               u.Person.personFirstname, u.Person.personLastname };
Obviously the resulting rowset doesn't adhere to the "User" schema, so I can't just convert to a list of User objects.
I tried making a new entity in my object model that related to the result set, but it doesn't want to do the conversion.
What am I missing?
Edit: related question: what about results returned from stored procedures? Same issue, what's the best way to package them up for returning via the service?
Generally speaking, you shouldn't return domain objects from a service because if you do you'll run into issues like those you're finding. Domain objects are intended to describe a particular entity in the problem domain, and will often not fit nicely with providing a particular set of data to return from a service call.
You're best off decoupling your domain entities from the service by creating data transfer objects to represent them which contain only the information you need to transfer. The DTOs would have constructors which take domain object(s) and copy whatever property values are needed (you'll also need a parameterless constructor so they can be serialized), or you can use an object-object mapper like AutoMapper. They'll also have service-specific features like IExtensibleDataObject and DataMemberAttributes which aren't appropriate for domain objects. This frees your domain objects to vary independently of objects you send from the service.
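As a concrete illustration, a hand-rolled DTO for the User entity above might look like the following sketch (the property names are assumptions based on the query in the question):
using System.Runtime.Serialization;

[DataContract]
public class UserDto : IExtensibleDataObject
{
    public UserDto() { } // parameterless constructor for serialization

    // copy only what the service should expose
    public UserDto(User user)
    {
        UserId = user.userID;
        UserName = user.userName;
        FirstName = user.Person.personFirstname;
        LastName = user.Person.personLastname;
    }

    [DataMember] public int UserId { get; set; }
    [DataMember] public string UserName { get; set; }
    [DataMember] public string FirstName { get; set; }
    [DataMember] public string LastName { get; set; }

    // lets unrecognized data round-trip across service versions
    public ExtensionDataObject ExtensionData { get; set; }
}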
You can create a Complex Type and, instead of returning an anonymous object, return the Complex Type. When you map stored procedures using a function import, you have the option to automatically create a complex type.
Create a custom class with the properties that you need:
public class MyTimesheet
{
    public int Id { get; set; }
    public string Data { get; set; }
}
Then create it from your LINQ query:
using (linkDataContext link = new linkDataContext())
{
    var data = (from t in link.TimesheetDetails
                select new MyTimesheet
                {
                    Id = t.Id,
                    Data = t.EmployeeId.ToString()
                }).ToList();
}