spring-data-rest: return multiple projections at once (compose projections)

Is there a possibility to compose multiple projections? Is that the best approach for my use case?
For example, I have these projections for the CarInBazar class:
SimpleCarInList
WidgetForHotSale
NumberOfItemViews
FullCarData
CarMainImage
CarMainImageIconSize
CarAdditionalImages
And the frontend is now building a UI page which requires several of these projections.
Should I:
have the frontend make multiple requests for the same resource with different projections,
implement a projection for each frontend screen (duplicating things like the NumberOfItemViews calculation, ...), or
use inheritance and compose screen-specific projections with the extends keyword, i.e.:
#Projection(name = "screen-dashboardHome", types = {CarInBazar.class})
public interface DashboardHomeProjectionForCarInBazar extends SimpleCarInList,
WidgetForHotSale, CarMainImageIconSize {
}
Is there any possibility to request more projections at once? They could then be rendered in the UI using provided profiles, perhaps.
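As far as I can tell, only a single named projection can be requested per call, for example (assuming the default resource path for CarInBazar):
GET /carInBazars/42?projection=screen-dashboardHome
so a composed projection per screen would at least keep it to one request.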
EDIT: as requested, here is an example projection definition:
import org.springframework.data.rest.core.config.Projection;
import java.awt.*;
import java.util.Date;
#Projection(name = "CarMainImage", types = {CarInBazar.class})
public interface CarMainImage {
Date getLastUpdateDate();
Image getMainImage();
default String getMainImageAdditionalInformation() {
final var updated = this.getLastUpdateDate().getTime();
final var created = this.getCreatedAtDate().getTime();
if (created >= (updated - 10 minutes)) {
return "some business logic on not published fields";
} else {
return "could happen not only in spel";
}
}
}
Many projections do not contain any business logic and only filter fields.

Using multiple projections at once seems not to be supported (or at least not widely supported or easy). Following the suggestion from @Aivaras' comment, I have used this approach:
Repository code with custom JPQL query:
@Repository
@RepositoryRestResource
@Transactional(readOnly = true)
public interface SomeRepository extends PagingAndSortingRepository<Some, Long> {

    Page<Some> findByNameContaining(String namePart, Pageable pageable);

    @Query("select new sk.qpp.documents.projections.SomeCustomViewByQuery(s.name, s.startDate, s.endDate, s.goLiveDate, 42) from Some s where s.id = :id")
    Optional<SomeCustomViewByQuery> getByIdProjectedForSpecialScreen(Long id);
}
And the SomeCustomViewByQuery class is just a simple DTO-like thing. Using Lombok, it can look like this:
@Value
public class SomeCustomViewByQuery {
    private String name;
    private Date startDate;
    private Date endDate;
    private Date goLiveDate;
    private int unicornsCount;

    // TODO make SomeHealth an enum with the specific logic behind it.
    public String getSomeHealth() {
        final var start = this.getStartDate().getTime();
        final var end = this.getEndDate().getTime();
        final var goLive = this.getGoLiveDate().getTime();
        final var now = System.currentTimeMillis();
        if (now < start) {
            return "not started yet";
        } else if (now < end) {
            return "work in progress";
        } else if (now < goLive) {
            return "passed end, but before goLive";
        } else {
            return "something after goLive time";
        }
    }
}
This way, I can write a hand-crafted query (JPQL) and also create a custom DTO instance. It is handy where I need to join other tables with aggregation (count, avg, max, min, ...) and other things that are better done on the database side.
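For illustration, a hedged sketch of that aggregation case; the reviews association (and its createdAt field), the SomeStatsView DTO and the method name are assumptions, not part of the original code:
// Hypothetical DTO filled by the constructor expression below (field order must match the constructor call).
@Value
public class SomeStatsView {
    private String name;
    private Long reviewCount;
    private Date newestReviewDate;
}

// Hypothetical repository method doing the join and aggregation on the database side.
@Query("select new sk.qpp.documents.projections.SomeStatsView(s.name, count(r), max(r.createdAt)) "
        + "from Some s left join s.reviews r where s.id = :id group by s.name")
Optional<SomeStatsView> getByIdWithReviewStats(@Param("id") Long id);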

JPA - AttributeConverter not working with native query

I have created an AttributeConverter class which converts an enum to its DB value and overridden the necessary methods.
If I use a derived JPA query like the one below, the converter is called and I get the correct result.
public List<Driver> findByStatus(DriverStatus status);
BUT if I use the @Query annotation, the AttributeConverter is not called. I have a more complex query where I need to use a native query together with the AttributeConverter, but it is not working for me.
@Query(value = "select * from driver where status=:status", nativeQuery = true)
public List<Driver> findByStatus1(DriverStatus status);
Is there any way to handle this requirement?
Update 1 - below is the Converter code
@Converter(autoApply = true)
public class DriverStatusConverter implements AttributeConverter<DriverStatus, String> {

    @Override
    public String convertToDatabaseColumn(DriverStatus driverStatus) {
        if (driverStatus == null) {
            return null;
        }
        System.err.println("from converter: " + driverStatus.getCode());
        return driverStatus.getCode();
    }

    @Override
    public DriverStatus convertToEntityAttribute(String code) {
        if (code == null) {
            return null;
        }
        return Stream.of(DriverStatus.values())
                .filter(c -> c.getCode().equals(code))
                .findFirst()
                .orElseThrow(IllegalArgumentException::new);
    }
}
I faced a similar issue and wasn't able to find a satisfying solution, so I had to resort to an ugly hack... The repository method signature was changed like so:
public List<Driver> findByStatus1(String status);
...and was called like so:
repository.findByStatus1(DriverStatus.YOUR_DESIRED_STATUS.getCode());
Yes, it's ugly and kind of defeats the purpose of the AttributeConverter, but at least it works. In my particular case I had to resort to such measures for just one native query; the AttributeConverter still works for all the other JPA queries.
I just came across the same problem and used the following to work around it.
In DriverStatusConverter, move the logic from convertToEntityAttribute into a public static method:
@Override
public DriverStatus convertToEntityAttribute(String code) {
    return convertStringToEnum(code);
}

public static DriverStatus convertStringToEnum(String code) {
    if (code == null) {
        return null;
    }
    return Stream.of(DriverStatus.values())
            .filter(c -> c.getCode().equals(code))
            .findFirst()
            .orElseThrow(IllegalArgumentException::new);
}
Create a new DriverDTO interface for the native query result:
public interface DriverDTO {
    Integer getId();
    String getName();
    String getStatusCode();

    default DriverStatus getStatus() {
        return DriverStatusConverter.convertStringToEnum(getStatusCode());
    }
}
In DriverRepository, have findByStatusCode() select the needed columns in the query and return the DriverDTO interface:
#Query(value = "select id, name, status as statusCode from driver WHERE status=:statusCode", nativeQuery = true)
public List<DriverDTO> findByStatusCode(String statusCode);
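Usage could then look roughly like this (the driverRepository variable and the ACTIVE constant are assumptions):
// Query by the raw status code; the DTO's default method maps it back to the enum.
List<DriverDTO> drivers = driverRepository.findByStatusCode(DriverStatus.ACTIVE.getCode());
drivers.forEach(d -> System.out.println(d.getName() + " -> " + d.getStatus()));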
Remove the nativeQuery = true and try.
#Query(value = "select * from driver where status=:status")
public List<Driver> findByStatus1(DriverStatus status);
Alternatively, you can try using the EntityManager like so (this is just sample code):
TypedQuery<Trip> q = entityManager.createQuery("SELECT t FROM Trip t WHERE t.vehicle = :v", Trip.class);
q.setParameter("v", Vehicle.PLANE);
List<Trip> trips = q.getResultList();
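Adapted to the driver case from the question, that could look roughly like this (DriverStatus.ACTIVE and the status field name are assumptions); for a JPQL query the converter is applied to the parameter as usual:
TypedQuery<Driver> q = entityManager.createQuery("SELECT d FROM Driver d WHERE d.status = :status", Driver.class);
q.setParameter("status", DriverStatus.ACTIVE);
List<Driver> drivers = q.getResultList();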
Reference -
https://thorben-janssen.com/jpa-21-type-converter-better-way-to/

Why do I receive an error when I query data in my Micronaut GORM application?

I have a simple application in Micronaut with three entities: Customer, Contact and Loan.
Customer has a one-to-many relationship with Contact and Loan. I tested with Grails/GORM and it works fine.
I have a DataLoader class that works well and creates all the entities with their relationships.
/****** Contact.groovy *******/
package com.gnc.demo.domain
import grails.gorm.annotation.Entity
@Entity
class Contact {
Long id
Long version
Customer customer
static belongsTo = Customer
String email
String phone
String cellPhone
String address
}
/****** Customer.groovy *******/
package com.gnc.demo.domain
import grails.gorm.annotation.Entity
@Entity
class Customer {
Long id
Long version
String driverId
String name
String lastName
static hasMany = [contacts: Contact, loans: Loan]
static constraints = {
contacts nullable: true
loans nullable: true
}
static mapping = {
contacts lazy: false
loans lazy: false
}
}
/****** Loan.groovy *******/
package com.gnc.demo.domain
import grails.gorm.annotation.Entity
@Entity
class Loan {
Long id
Long version
Customer customer
static belongsTo = Customer
BigDecimal amount
long term
BigDecimal rate
}
/******* CustomerController.groovy *******/
package com.gnc.demo.controllers
import com.gnc.demo.domain.Customer
import com.gnc.demo.services.ContactService
import com.gnc.demo.services.CustomerService
import com.gnc.demo.services.LoanService
import io.micronaut.http.annotation.Controller
import io.micronaut.http.annotation.Get
import org.slf4j.Logger
import org.slf4j.LoggerFactory
#Controller("/customer")
class CustomerController {
private static final Logger LOG = LoggerFactory.getLogger(CustomerController.class);
final CustomerService customerService
final LoanService loanService
final ContactService contactService
CustomerController(CustomerService customerService, LoanService loanService, ContactService contactService) {
this.customerService = customerService
this.loanService = loanService
this.contactService = contactService
}
#Get("/")
String index() {
return "Hola ! " + new Date()
}
#Get("/all/{offset}/{max}")
List<Customer> getCustomers(String offset, String max) {
List<Customer> customers = customerService.findAll([offset: offset, max: max])
try {
customers.each { customer ->
// LOG.info(">>> Loans :" +customer.loans.size())
customer.contacts = []
customer.loans = []
}
} catch (Exception e) {
LOG.info(">>> Error :" + e)
}
return customers
}
#Get("/{id}")
Customer getCustomers(String id) {
Customer customer = customerService.get(id)
customer?.contacts = []
customer?.loans = []
customer?.contacts = contactService.findAllByCustomer(customer)
customer?.loans = loanService.findAllByCustomer(customer)
return customer
}
}
All the code is available in: https://github.com/gnpitty/com-gnc-demo
But when I test in Micronaut with my browser: http://localhost:9020/customer/10
I receive this error:
{"message":"Internal Server Error: Error encoding object
[com.gnc.demo.domain.Customer : 10] to JSON: could not initialize proxy - no
Session (through reference chain: com.gnc.demo.domain.Customer[\"contacts\"]-
>java.util.LinkedHashSet[0]->com.gnc.demo.domain.Contact[\"customer\"]-
>com.gnc.demo.domain.Customer_$$_jvst110_0[\"driverId\"])"}
As one comment said, you should make sure @Transactional or withTransaction {} is used when reading the record.
Also, if you want to reference the proxied elements (like the Customer reference), you need to force the proxies to be read. I know of two ways: 1) do an eager fetch on them, or 2) resolve the proxies explicitly.
I chose option 2) since I did not want to force eager fetching when it wasn't needed. I only use this in controllers where I am returning a JSON-encoded domain object, which is usually just in my REST API methods.
Example:
Loan.withTransaction {
def loan = Loan.findByXYZ()
resolveProxies(loan)
}
This converts the proxies into real objects so you can access them outside of the withTransaction {} closure, which is usually when Jackson converts them to JSON.
I use this method to resolve any proxies in lists or as simple references to another domain object:
/**
 * Resolves all proxies for the given domain class. This allows the domain to be used outside of a Hibernate session
 * if needed. This will check all fields and sub-objects for proxies.
 * <p>
 * <b>Note:</b> This will usually force a read of all referenced objects.
 * @param object The object.
 */
def resolveProxies(Object object) {
if (object == null) {
return
}
for (property in object.class.gormPersistentEntity.persistentProperties) {
def value = object[property.name]
if (Collection.isAssignableFrom(property.type) && value) {
for (item in value) {
if (item != null) {
// Resolved any sub-objects too.
resolveProxies(item)
}
}
} else if (value instanceof HibernateProxy) {
// A simple reference, so unproxy it the GORM way.
object[property.name] = value.getClass().get(value.id)
}
}
}
Feel free to use this code anywhere you need it.

Spring Data Rest ResourceProcessor not applied on Projections

I am using a ResourceProcessor to add additional links to my resource object when it is listed in a collection or fetched individually. However, when I apply a projection (or an excerpt projection) to my repository, the ResourceProcessor does not get run and thus my links for that resource do not get created. Is there a way to have my custom resource links added to a resource regardless of how the resource content is projected?
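For context, a minimal sketch of the kind of processor meant here (the MyEntity type, its getId() accessor and the extra link are made-up placeholders):
@Component
public class MyEntityResourceProcessor implements ResourceProcessor<Resource<MyEntity>> {

    @Override
    public Resource<MyEntity> process(Resource<MyEntity> resource) {
        // Hypothetical extra link; this runs when the content is a MyEntity,
        // but not when the content is a projection of MyEntity.
        resource.add(new Link("/myEntities/" + resource.getContent().getId() + "/custom", "custom"));
        return resource;
    }
}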
I think this issue is describing your case:
https://jira.spring.io/browse/DATAREST-713
Currently, spring-data-rest does not offer functionality to solve your problem.
We are using a little workaround that still needs a separate ResourceProcessor for each projection but we do not need to duplicate the link logic:
We have a base class that gets the underlying entity for a projection, invokes the entity's ResourceProcessor and applies the resulting links to the projection.
Entity is a common interface for all our JPA entities - but I think you could also use org.springframework.data.domain.Persistable or org.springframework.hateoas.Identifiable.
/**
 * Projections need their own resource processors in spring-data-rest.
 * To avoid code duplication the ProjectionResourceProcessor delegates the link creation to
 * the resource processor of the underlying entity.
 *
 * @param <E> entity type the projection is associated with
 * @param <T> the resource type that this ResourceProcessor is for
 */
public class ProjectionResourceProcessor<E extends Entity, T> implements ResourceProcessor<Resource<T>> {

    private final ResourceProcessor<Resource<E>> entityResourceProcessor;

    public ProjectionResourceProcessor(ResourceProcessor<Resource<E>> entityResourceProcessor) {
        this.entityResourceProcessor = entityResourceProcessor;
    }

    @SuppressWarnings("unchecked")
    @Override
    public Resource<T> process(Resource<T> resource) {
        if (resource.getContent() instanceof TargetAware) {
            TargetAware targetAware = (TargetAware) resource.getContent();
            if (targetAware != null
                    && targetAware.getTarget() != null
                    && targetAware.getTarget() instanceof Entity) {
                E target = (E) targetAware.getTarget();
                resource.add(entityResourceProcessor.process(new Resource<>(target)).getLinks());
            }
        }
        return resource;
    }
}
An implementation of such a resource processor would look like this:
@Component
public class MyProjectionResourceProcessor extends ProjectionResourceProcessor<MyEntity, MyProjection> {

    @Autowired
    public MyProjectionResourceProcessor(EntityResourceProcessor resourceProcessor) {
        super(resourceProcessor);
    }
}
The implementation itself just takes the ResourceProcessor that can handle the entity class and passes it to our ProjectionResourceProcessor. It does not contain any link creation logic.
Here is a generic solution:
@Component
public class ProjectionProcessor implements RepresentationModelProcessor<EntityModel<TargetAware>> {

    private final RepresentationModelProcessorInvoker processorInvoker;

    public ProjectionProcessor(@Lazy RepresentationModelProcessorInvoker processorInvoker) {
        this.processorInvoker = processorInvoker;
    }

    @Override
    public EntityModel<TargetAware> process(EntityModel<TargetAware> entityModel) {
        TargetAware content = entityModel.getContent();
        if (content != null) {
            entityModel.add(processorInvoker.invokeProcessorsFor(EntityModel.of(content.getTarget())).getLinks());
        }
        return entityModel;
    }
}
It gets the links for the original entities and adds them to the corresponding projections.

How can I use MEF to manage interdependent modules?

I found this question difficult to express (particularly in title form), so please bear with me.
I have an application that I am continually modifying to do different things. It seems like MEF might be a good way to manage the different pieces of functionality. Broadly speaking, there are three sections of the application that form a pipeline of sorts:
Acquisition
Transformation
Expression
In its simplest form, I can express each of these stages as an interface (IAcquisition etc.). The problems start when I want to use acquisition components that provide richer data than standard. I want to design modules that use this richer data, but I can't rely on it being there.
I could, of course, add all of the data to the interface specification. I could deal with poorer data sources by throwing an exception or returning a null value. This seems a long way from ideal.
I'd prefer to do the MEF binding in three stages, such that modules are offered to the user only if they are compatible with those selected previously.
So my question: Can I specify metadata which restricts the set of available imports?
An example:
Acquisition1 offers BasicData only
Acquisition2 offers BasicData and AdvancedData
Transformation1 requires BasicData
Transformation2 requires BasicData and AdvancedData
Acquisition module is selected first.
If Acquisition1 is selected, don't offer Transformation2; otherwise offer both.
Is this possible? If so, how?
Your question suggests a structure like this:
public class BasicData
{
public string Basic { get; set; } // example data
}
public class AdvancedData : BasicData
{
public string Advanced { get; set; } // example data
}
Now you have your acquisition, transformation and expression components. You want to be able to deal with different kinds of data, so they're generic:
public interface IAcquisition<out TDataKind>
{
TDataKind Acquire();
}
public interface ITransformation<TDataKind>
{
TDataKind Transform(TDataKind data);
}
public interface IExpression<in TDataKind>
{
void Express(TDataKind data);
}
And now you want to build a pipeline out of them that looks like this:
IExpression.Express(ITransformation.Transform(IAcquisition.Acquire()));
So let's start building a pipeline builder:
using System;
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.ComponentModel.Composition.Primitives;
using System.Linq;
using System.Linq.Expressions;
// namespace ...
public static class PipelineBuilder
{
private static readonly string AcquisitionIdentity =
AttributedModelServices.GetTypeIdentity(typeof(IAcquisition<>));
private static readonly string TransformationIdentity =
AttributedModelServices.GetTypeIdentity(typeof(ITransformation<>));
private static readonly string ExpressionIdentity =
AttributedModelServices.GetTypeIdentity(typeof(IExpression<>));
public static Action BuildPipeline(ComposablePartCatalog catalog,
Func<IEnumerable<string>, int> acquisitionSelector,
Func<IEnumerable<string>, int> transformationSelector,
Func<IEnumerable<string>, int> expressionSelector)
{
var container = new CompositionContainer(catalog);
The class holds MEF type identities for your three contract interfaces. We'll need those later to identify the correct exports. Our BuildPipeline method returns an Action. That is going to be the pipeline, so we can just do pipeline(). It takes a ComposablePartCatalog and three Funcs (to select an export). That way, we can keep all the dirty work inside this class. Then we start by creating a CompositionContainer.
Now we have to build ImportDefinitions, first for the acquisition component:
var aImportDef = new ImportDefinition(def => (def.ContractName == AcquisitionIdentity), null, ImportCardinality.ZeroOrMore, true, false);
This ImportDefinition simply selects all exports of the IAcquisition<> interface. Now we can give it to the container:
var aExports = container.GetExports(aImportDef).ToArray();
aExports now holds all IAcquisition<> exports in the catalog. So let's run the selector on this:
var selectedAExport = aExports[acquisitionSelector(aExports.Select(export => export.Metadata["Name"] as string))];
And there we have our acquisition component:
var acquisition = selectedAExport.Value;
var acquisitionDataKind = (Type)selectedAExport.Metadata["DataKind"];
Now we're going to do the same for the transformation and the expression components, but with one slight difference: The ImportDefinition is going to ensure that each component can handle the output of the previous component.
var tImportDef = new ImportDefinition(def => (def.ContractName == TransformationIdentity) && ((Type)def.Metadata["DataKind"]).IsAssignableFrom(acquisitionDataKind),
null, ImportCardinality.ZeroOrMore, true, false);
var tExports = container.GetExports(tImportDef).ToArray();
var selectedTExport = tExports[transformationSelector(tExports.Select(export => export.Metadata["Name"] as string))];
var transformation = selectedTExport.Value;
var transformationDataKind = (Type)selectedTExport.Metadata["DataKind"];
var eImportDef = new ImportDefinition(def => (def.ContractName == ExpressionIdentity) && ((Type)def.Metadata["DataKind"]).IsAssignableFrom(transformationDataKind),
null, ImportCardinality.ZeroOrMore, true, false);
var eExports = container.GetExports(eImportDef).ToArray();
var selectedEExport = eExports[expressionSelector(eExports.Select(export => export.Metadata["Name"] as string))];
var expression = selectedEExport.Value;
var expressionDataKind = (Type)selectedEExport.Metadata["DataKind"];
And now we can wire it all up in an expression tree:
var acquired = Expression.Call(Expression.Constant(acquisition), typeof(IAcquisition<>).MakeGenericType(acquisitionDataKind).GetMethod("Acquire"));
var transformed = Expression.Call(Expression.Constant(transformation), typeof(ITransformation<>).MakeGenericType(transformationDataKind).GetMethod("Transform"), acquired);
var expressed = Expression.Call(Expression.Constant(expression), typeof(IExpression<>).MakeGenericType(expressionDataKind).GetMethod("Express"), transformed);
return Expression.Lambda<Action>(expressed).Compile();
}
}
And that's it! A simple example application would look like this:
[Export(typeof(IAcquisition<>))]
[ExportMetadata("DataKind", typeof(BasicData))]
[ExportMetadata("Name", "Basic acquisition")]
public class Acquisition1 : IAcquisition<BasicData>
{
public BasicData Acquire()
{
return new BasicData { Basic = "Acquisition1" };
}
}
[Export(typeof(IAcquisition<>))]
[ExportMetadata("DataKind", typeof(AdvancedData))]
[ExportMetadata("Name", "Advanced acquisition")]
public class Acquisition2 : IAcquisition<AdvancedData>
{
public AdvancedData Acquire()
{
return new AdvancedData { Advanced = "Acquisition2A", Basic = "Acquisition2B" };
}
}
[Export(typeof(ITransformation<>))]
[ExportMetadata("DataKind", typeof(BasicData))]
[ExportMetadata("Name", "Basic transformation")]
public class Transformation1 : ITransformation<BasicData>
{
public BasicData Transform(BasicData data)
{
data.Basic += " - Transformed1";
return data;
}
}
[Export(typeof(ITransformation<>))]
[ExportMetadata("DataKind", typeof(AdvancedData))]
[ExportMetadata("Name", "Advanced transformation")]
public class Transformation2 : ITransformation<AdvancedData>
{
public AdvancedData Transform(AdvancedData data)
{
data.Basic += " - Transformed2";
data.Advanced += " - Transformed2";
return data;
}
}
[Export(typeof(IExpression<>))]
[ExportMetadata("DataKind", typeof(BasicData))]
[ExportMetadata("Name", "Basic expression")]
public class Expression1 : IExpression<BasicData>
{
public void Express(BasicData data)
{
Console.WriteLine("Expression1: {0}", data.Basic);
}
}
[Export(typeof(IExpression<>))]
[ExportMetadata("DataKind", typeof(AdvancedData))]
[ExportMetadata("Name", "Advanced expression")]
public class Expression2 : IExpression<AdvancedData>
{
public void Express(AdvancedData data)
{
Console.WriteLine("Expression2: ({0}) - ({1})", data.Basic, data.Advanced);
}
}
class Program
{
static void Main(string[] args)
{
var pipeline = PipelineBuilder.BuildPipeline(new AssemblyCatalog(typeof(Program).Assembly), StringSelector, StringSelector, StringSelector);
pipeline();
}
static int StringSelector(IEnumerable<string> strings)
{
int i = 0;
foreach (var item in strings)
Console.WriteLine("[{0}] {1}", i++, item);
return int.Parse(Console.ReadLine());
}
}

Refactoring code using Strategy Pattern

I have a GiftCouponPayment class. It has business strategy logic that can change frequently: GetCouponValue(). At present the logic is "the coupon value should be considered as zero when the coupon number is less than 2000". A future business strategy may change it to "the coupon value should be considered as zero when the coupon issued date is before 1/1/2000". It can change to any such strategy, depending on the managing department of the company.
How can we refactor the GiftCouponPayment class using the Strategy pattern so that the class need not be changed when the strategy for the GetCouponValue method changes?
UPDATE: After analyzing the responsibilities, I feel "GiftCoupon" would be a better name for the "GiftCouponPayment" class.
C# CODE
public int GetCouponValue()
{
int effectiveValue = -1;
if (CouponNumber < 2000)
{
effectiveValue = 0;
}
else
{
effectiveValue = CouponValue;
}
return effectiveValue;
}
READING
Strategy Pattern - multiple return types/values
The GiftCouponPayment class should pass a GiftCoupon to the different strategy classes. So your strategy interface (CouponValueStrategy) should contain a method:
int getCouponValue(GiftCoupon giftCoupon)
Since each concrete strategy implementing CouponValueStrategy has access to the GiftCoupon, each can implement an algorithm based on the coupon number, the coupon date, etc.
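A minimal sketch of how that could look, keeping the Java-style signature above (the GiftCoupon accessors and the issued-date rule are assumptions):
import java.time.LocalDate;

public interface CouponValueStrategy {
    int getCouponValue(GiftCoupon giftCoupon);
}

// Current rule: coupons numbered below 2000 are worth nothing.
class NumberBasedCouponValueStrategy implements CouponValueStrategy {
    @Override
    public int getCouponValue(GiftCoupon giftCoupon) {
        return giftCoupon.getCouponNumber() < 2000 ? 0 : giftCoupon.getCouponValue();
    }
}

// Possible future rule: coupons issued before 1 Jan 2000 are worth nothing.
class IssueDateBasedCouponValueStrategy implements CouponValueStrategy {
    private static final LocalDate CUTOFF = LocalDate.of(2000, 1, 1);

    @Override
    public int getCouponValue(GiftCoupon giftCoupon) {
        return giftCoupon.getIssuedDate().isBefore(CUTOFF) ? 0 : giftCoupon.getCouponValue();
    }
}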
You can inject a "coupon value policy" into the coupon object itself and call upon it to compute the coupon value. In such cases, it is acceptable to pass this into the policy so that the policy can ask the coupon for its required attributes (such as coupon number):
public interface ICouponValuePolicy
{
int ComputeCouponValue(GiftCouponPayment couponPayment);
}
public class GiftCouponPayment
{
public ICouponValuePolicy CouponValuePolicy {
get;
set;
}
public int GetCouponValue()
{
return CouponValuePolicy.ComputeCouponValue(this);
}
}
Also, it seems like your GiftCouponPayment is really responsible for two things (the payment and the gift coupon). It might make sense to extract a GiftCoupon class that contains CouponNumber, CouponValue and GetCouponValue(), and refer to this from the GiftCouponPayment.
When your business logic changes, it's quite natural that your code will have to change as well.
You could perhaps opt to move the expiration-detection logic into a specification class:
public class CouponIsExpiredBasedOnNumber : ICouponIsExpiredSpecification
{
public bool IsExpired( Coupon c )
{
if( c.CouponNumber < 2000 )
return true;
else
return false;
}
}
public class CouponIsExpiredBasedOnDate : ICouponIsExpiredSpecification
{
public readonly DateTime expirationDate = new DateTime (2000, 1, 1);
public bool IsExpired( Coupon c )
{
if( c.Date < expirationDate )
return true;
else
return false;
}
}
public class Coupon
{
public int GetCouponValue()
{
ICouponIsExpiredSpecification expirationRule = GetExpirationRule();
if( expirationRule.IsExpired(this) )
return 0;
else
return this.Value;
}
}
The question you should ask yourself: is it necessary to make it this complex right now? Can't you make it as simple as possible to satisfy current needs, and refactor it later, when the expiration rule actually changes?
The behavior that you wish to be dynamic is the coupon calculation, which can depend on any number of things: coupon date, coupon number, etc. I think that a provider pattern would be more appropriate, injecting a service class which calculates the coupon value.
The essence of this is moving the business logic outside of the GiftCouponPayment class, and using a class I'll call "CouponCalculator" to encapsulate the business logic. This class uses an interface.
interface ICouponCalculator
{
int Calculate (GiftCouponPayment payment);
}
public class CouponCalculator : ICouponCalculator
{
public int Calculate (GiftCouponPayment payment)
{
if (payment.CouponNumber < 2000)
{
return 0;
}
else
{
return payment.CouponValue;
}
}
}
Now that you have this interface and class, add a property to the GiftCouponPayment class, then modify your original GetCouponValue() method:
public class GiftCouponPayment
{
public int CouponNumber;
public int CouponValue;
public ICouponCalculator Calculator { get; set; }
public int GetCouponValue()
{
return Calculator.Calculate(this);
}
}
When you construct the GiftCouponPayment class, you will assign the Calculator property:
var payment = new GiftCouponPayment() { Calculator = new CouponCalculator() };
var val = payment.GetCouponValue(); // uses CouponCalculator class to get value
If this seems like a lot of work just to move the calculation logic outside of the GiftCouponPayment class, well, it is! But if this is your requirement, it does provide several things:
1. You won't need to change the GiftCouponPayment class to adjust the calculation logic.
2. You could create additional classes that implement ICouponCalculator, and a factory pattern to decide which class to inject into GiftCouponPayment when it is constructed. This speaks more to your original desire for a "strategy" pattern, as it would be useful if the logic becomes very complex.