Wicket Deployment mode map resources wrong way - wicketstuff

I have a page mounted with
getRootRequestMapperAsCompound().add(new NoVersionMapper("/card/${cardId}", CardPage.class));
On this page there is a TinyMCE4 editor, which tries to load images using the relative path "images/1.jpg".
I've added a resource mapping so that the images load successfully:
mountResource("/card/image/${imageId}", imageResourceReference);
In DEVELOPMENT mode everything works fine and the images are loaded into the editor, but in DEPLOYMENT mode the page gets called twice: first for /card/1 and then for /card/image/1.jpg.
How do I correctly mount resources for DEPLOYMENT mode?
UPDATE: it looks like I found the reason:
public int getCompatibilityScore(Request request)
{
    return 0; // pages always have priority over resources
}
But then the question is: why does it work fine in DEVELOPMENT mode?
Update 2: I haven't found a better solution than adding my own resource mapper with an overridden getCompatibilityScore():
public class ImageResourceMapper extends ResourceMapper {

    private String[] mountSegments;

    public ImageResourceMapper(String path, ResourceReference resourceReference) {
        super(path, resourceReference);
        mountSegments = getMountSegments(path);
    }

    public ImageResourceMapper(String path, ResourceReference resourceReference, IPageParametersEncoder encoder) {
        super(path, resourceReference, encoder);
        mountSegments = getMountSegments(path);
    }

    @Override
    public int getCompatibilityScore(Request request) {
        if (urlStartsWith(request.getUrl(), mountSegments)) {
            return 10;
        }
        return 0;
    }
}
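For reference, registering the custom mapper instead of the plain mountResource() call looks roughly like this (a minimal sketch inside WebApplication#init(), reusing the imageResourceReference and the mounts from the question):
@Override
public void init() {
    super.init();
    getRootRequestMapperAsCompound().add(
            new NoVersionMapper("/card/${cardId}", CardPage.class));
    // Register the mapper with the raised compatibility score instead of
    // calling mountResource(), so the /card/image/${imageId} mount wins over
    // the /card/${cardId} page mount in DEPLOYMENT mode as well.
    getRootRequestMapperAsCompound().add(
            new ImageResourceMapper("/card/image/${imageId}", imageResourceReference));
}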

Related

symfony 4 Upload

How do I upload a file in Symfony 4? I followed the Symfony documentation, but I don't know where I have missed something. It throws an error while uploading the file; please give me some clues.
REFERENCE LINK:
https://symfony.com/doc/current/controller/upload_file.html
ERROR:
The file "" does not exist
Entity
public function getBrochure()
{
    return $this->brochure;
}

public function setBrochure($brochure)
{
    $this->brochure = $brochure;

    return $this;
}
File upload Listener
// src/Service/FileUploader.php
namespace App\Service;

use Symfony\Component\HttpFoundation\File\UploadedFile;

class FileUploader
{
    private $targetDirectory;

    public function __construct($targetDirectory)
    {
        $this->targetDirectory = $targetDirectory;
    }

    public function upload(UploadedFile $file)
    {
        $fileName = md5(uniqid()).'.'.$file->guessExtension();
        $file->move($this->getTargetDirectory(), $fileName);

        return $fileName;
    }

    public function getTargetDirectory()
    {
        return $this->targetDirectory;
    }
}
This Symfony tutorial works fine for me, so I'll try to explain how; perhaps it will help you or people still looking for an answer, since this post is getting a bit old.
First you have to create the FileUploader service in App\Service for better reusability (chapter: Creating an Uploader Service). You can basically copy/paste what they've done there; it works like a charm. Then you need to open services.yaml in the config folder and declare your brochures directory:
parameters:
    brochures_directory: '%kernel.project_dir%/public/uploads/brochures'
    # ...

services:
    # ...
    App\Service\FileUploader:
        arguments:
            $targetDirectory: '%brochures_directory%'
Now everything should be ready for using your FileUploader service.
In your controller (for example), I guess you want to use it with a form, so you just have to do something like this (don't forget the use statement for your service in your controller):
public function myController(FileUploader $fileUploader)
{
    // Create your form and handle the request
    if ($form->isSubmitted() && $form->isValid()) {
        $file = $myEntity->getBrochure();
        $fileName = $fileUploader->upload($file);
        $myEntity->setBrochure($fileName);
        // Form validation and redirection
    }
    // Render your template
}
One important point I forgot to mention: in your FormType, you need to declare that the brochure field is a FileType:
$builder->add('brochure', FileType::class);
But in your entity you have to specify that the brochure is stored as a "string":
/**
 * @MongoDB\Field(type="string")
 */
protected $brochure;
The reason is that your file gets uploaded and saved under public/uploads/brochures, while your database only stores the string path needed to reach it.
I hope this will help!

Using a custom RazorViewEngine AND RazorGenerator precompiled views

I am trying to use a custom (derived) RazorViewEngine AND precompiled views using RazorGenerator.
Some context:
We have a base product that we use for multiple client implementations. With that we have a core set of base views. Most of the views work most of the time. Right now we end up copying existing views for each new solution and modifying as needed. This ends up with 95% of the views being the same between clients and 5% changed.
What I want to do is take a base set of views, compile them into a DLL and re-use it across clients. So far I have that working well using RazorGenerator.
Now the next step is to allow for customization (overrides) of views. There is a caveat though. Our application has two "modes" that a user is in. The mode they are in could require a different view.
I have created a derived class from the RazorGeneratorView. This view basically inspects the "OrderingMode" from a UserProfile object that Autofac resolves. Based on the mode - the Path Locator is replaced for the view resolution.
The idea is that individual client applications will attempt to resolve a view first in the traditional Views folder, except that I am adding a sub-directory: Views/{OrderingMode}/{Controller}/{View}.cshtml.
If the view is not found - then it will look in the compiled library (the core views).
This allows me to override individual views / partials as need be for clients.
public PosViewEngine() : base()
{
    //{0} = View Name
    //{1} = Controller Name
    //{2} = Area Name
    AreaViewLocationFormats = new[]
    {
        //First look in the hosting application area folder / Views / ordering type
        //Areas/{AreaName}/Views/{OrderType}/{ControllerName}/{ViewName}.cshtml
        "Areas/{2}/Views/%1/{1}/{0}.cshtml",
        //Next look in the hosting application area folder / Views / ordering type / Shared
        //Areas/{AreaName}/Views/{OrderType}/Shared/{ViewName}.cshtml
        "Areas/{2}/Views/%1/Shared/{0}.cshtml",
        //Finally look in the IMS.POS.Web.Views.Core assembly
        "Areas/{2}/Views/{1}/{0}.cshtml"
    };
    //Same format logic
    AreaMasterLocationFormats = AreaViewLocationFormats;
    AreaPartialViewLocationFormats = new[]
    {
        //First look in the hosting application area folder / Views / ordering type
        //Areas/{AreaName}/Views/{OrderType}/{ControllerName}/Partials/{PartialViewName}.cshtml
        "Areas/{2}/Views/%1/{1}/Partials/{0}.cshtml",
        //Next look in the hosting application area folder / Views / ordering type / Shared
        //Areas/{AreaName}/Views/{OrderType}/Shared/{ViewName}.cshtml
        "Areas/{2}/Views/%1/Shared/{0}.cshtml",
        //Finally look in the IMS.POS.Web.Views.Core assembly
        "Areas/{2}/Views/{1}/{0}.cshtml"
    };
    ViewLocationFormats = new[]
    {
        "Views/%1/{1}/{0}.cshtml",
        "Views/%1/Shared/{0}.cshtml",
        "Views/{1}/{0}.cshtml",
        "Views/Shared/{0}.cshtml"
    };
    MasterLocationFormats = ViewLocationFormats;
    PartialViewLocationFormats = new[]
    {
        "Views/%1/{1}/Partials/{0}.cshtml",
        "Views/%1/Shared/{0}.cshtml",
        "Views/{1}/Partials/{0}.cshtml",
        "Views/Shared/{0}.cshtml"
    };
}
protected override IView CreatePartialView(ControllerContext controllerContext, string partialPath)
{
    return base.CreatePartialView(controllerContext, partialPath.ReplaceOrderType(CurrentOrderingMode()));
}

protected override IView CreateView(ControllerContext controllerContext, string viewPath, string masterPath)
{
    OrderType orderType = CurrentOrderingMode();
    return base.CreateView(controllerContext, viewPath.ReplaceOrderType(orderType), masterPath.ReplaceOrderType(orderType));
}

protected override bool FileExists(ControllerContext controllerContext, string virtualPath)
{
    return base.FileExists(controllerContext, virtualPath.Replace("%1/", string.Empty));
}

private OrderType CurrentOrderingMode()
{
    OrderType result;
    _profileService = DependencyResolver.Current.GetService<IUserProfileService>();
    if (_profileService == null || _profileService.OrderingType == 0)
    {
        IApplicationSettingService settingService =
            DependencyResolver.Current.GetService<IApplicationSettingService>();
        result =
            settingService.GetApplicationSetting(ApplicationSettings.DefaultOrderingMode)
                .ToEnumTypeOf<OrderType>();
    }
    else
    {
        result = _profileService.OrderingType;
    }
    return result;
}
}
Here is the start-up class RazorGenerator uses to register the view engine.
public static class RazorGeneratorMvcStart
{
    public static void Start()
    {
        var engine = new PrecompiledMvcEngine(typeof(RazorGeneratorMvcStart).Assembly)
        {
            UsePhysicalViewsIfNewer = HttpContext.Current.Request.IsLocal
        };

        ViewEngines.Engines.Insert(0, engine);

        // StartPage lookups are done by WebPages.
        VirtualPathFactoryManager.RegisterVirtualPathFactory(engine);
    }
}
The problem is:
This code is executed last (after I register the PosViewEngine) and it inserts the engine at the first position (meaning it is the engine consulted first when serving responses). It ends up finding a view, but it is always the core view.
If I change the start-up code to register my custom view engine first and then the RazorGenerator engine:
public static void Start()
{
    var engine = new PrecompiledMvcEngine(typeof(RazorGeneratorMvcStart).Assembly)
    {
        UsePhysicalViewsIfNewer = HttpContext.Current.Request.IsLocal
    };

    ViewEngines.Engines.Clear();
    ViewEngines.Engines.Insert(0, new PosViewEngine());
    ViewEngines.Engines.Insert(1, engine);

    // StartPage lookups are done by WebPages.
    VirtualPathFactoryManager.RegisterVirtualPathFactory(engine);
}
I end up with an exception on the FileExists(ControllerContext controllerContext, string virtualPath) method - "The relative virtual path 'Views/Account/LogOn.cshtml' is not allowed here."
It obviously has something to do with both physical and virtual paths being mixed together.
It looks like someone else was trying to do the same thing here but I didn't see an answer on this.
For anyone else wanting to try this approach I'll post the answer. Basically you need to implement a custom view engine that derives from the PrecompiledMvcEngine found in the RazorGenerator assembly.
public class PosPrecompileEngine : PrecompiledMvcEngine
{
    private IUserProfileService _profileService;

    public PosPrecompileEngine(Assembly assembly) : base(assembly)
    {
        LocatorConfig();
    }

    public PosPrecompileEngine(Assembly assembly, string baseVirtualPath) : base(assembly, baseVirtualPath)
    {
        LocatorConfig();
    }

    public PosPrecompileEngine(Assembly assembly, string baseVirtualPath, IViewPageActivator viewPageActivator) : base(assembly, baseVirtualPath, viewPageActivator)
    {
        LocatorConfig();
    }

    protected override IView CreatePartialView(ControllerContext controllerContext, string partialPath)
    {
        return base.CreatePartialView(controllerContext, partialPath.ReplaceOrderType(CurrentOrderingMode()));
    }

    protected override IView CreateView(ControllerContext controllerContext, string viewPath, string masterPath)
    {
        OrderType orderType = CurrentOrderingMode();
        return base.CreateView(controllerContext, viewPath.ReplaceOrderType(orderType), masterPath.ReplaceOrderType(orderType));
    }

    protected override bool FileExists(ControllerContext controllerContext, string virtualPath)
    {
        return base.FileExists(controllerContext, virtualPath.ReplaceOrderType(CurrentOrderingMode()));
    }
}
In this class I override the locator paths. Because the "base" compiled views live in an assembly separate from the web application, we implemented a convention where the view engine first looks in a PosViews/{ordering mode}/{controller}/{view} path in the web application. If a view is not located there, it then looks in the traditional /Views/{controller}/{view}. The trick here is that the latter is a virtual path located in another class library.
This allowed us to "override" an existing view for the application.
private void LocatorConfig()
{
    //{0} = View Name
    //{1} = Controller Name
    //{2} = Area Name
    AreaViewLocationFormats = new[]
    {
        //First look in the hosting application area folder / Views / ordering type
        //PosAreas/{AreaName}/Views/{OrderType}/{ControllerName}/{ViewName}.cshtml
        "PosAreas/{2}/Views/%1/{1}/{0}.cshtml",
        //Next look in the hosting application area folder / Views / ordering type / Shared
        //PosAreas/{AreaName}/Views/{OrderType}/Shared/{ViewName}.cshtml
        "PosAreas/{2}/Views/%1/Shared/{0}.cshtml",
        //Next look in the POS Areas Shared folder
        "PosAreas/{2}/Views/Shared/{0}.cshtml",
        //Finally look in the IMS.POS.Web.Views.Core assembly
        "Areas/{2}/Views/{1}/{0}.cshtml"
    };
    //Same format logic
    AreaMasterLocationFormats = AreaViewLocationFormats;
    AreaPartialViewLocationFormats = new[]
    {
        //First look in the hosting application area folder / Views / ordering type
        //PosAreas/{AreaName}/Views/{OrderType}/{ControllerName}/Partials/{PartialViewName}.cshtml
        "PosAreas/{2}/Views/%1/{1}/Partials/{0}.cshtml",
        //Next look in the hosting application area folder / Views / ordering type / Shared
        //PosAreas/{AreaName}/Views/{OrderType}/Shared/{ViewName}.cshtml
        "PosAreas/{2}/Views/%1/Shared/{0}.cshtml",
        //Next look in the hosting application shared folder
        "PosAreas/{2}/Views/Shared/{0}.cshtml",
        //Finally look in the IMS.POS.Web.Views.Core assembly
        "Areas/{2}/Views/{1}/{0}.cshtml"
    };
    ViewLocationFormats = new[]
    {
        "~/PosViews/%1/{1}/{0}.cshtml",
        "~/PosViews/%1/Shared/{0}.cshtml",
        "~/PosViews/Shared/{0}.cshtml",
        "~/Views/{1}/{0}.cshtml",
        "~/Views/Shared/{0}.cshtml"
    };
    MasterLocationFormats = ViewLocationFormats;
    PartialViewLocationFormats = new[]
    {
        "~/PosViews/%1/{1}/{0}.cshtml",
        "~/PosViews/%1/Shared/{0}.cshtml",
        "~/PosViews/Shared/{0}.cshtml",
        "~/Views/{1}/{0}.cshtml",
        "~/Views/Shared/{0}.cshtml"
    };
}
Register this engine in your application start up events.
public static void Configure()
{
    var engine = new PosPrecompileEngine(typeof(ViewEngineConfig).Assembly)
    {
        UsePhysicalViewsIfNewer = true,
        PreemptPhysicalFiles = true
    };

    ViewEngines.Engines.Add(engine);

    // StartPage lookups are done by WebPages.
    VirtualPathFactoryManager.RegisterVirtualPathFactory(engine);
}
Here is the final key. When RazorGenerator is installed via NuGet, you end up with this start-up class that runs on startup:
[assembly: WebActivatorEx.PostApplicationStartMethod(typeof(Views.Core.RazorGeneratorMvcStart), "Start")]
public static class RazorGeneratorMvcStart
{
    public static void Start()
    {
        var engine = new PrecompiledMvcEngine(typeof(RazorGeneratorMvcStart).Assembly)
        {
            UsePhysicalViewsIfNewer = true,
            PreemptPhysicalFiles = true
        };

        ViewEngines.Engines.Add(engine);

        // StartPage lookups are done by WebPages.
        VirtualPathFactoryManager.RegisterVirtualPathFactory(engine);
    }
}
By default, RazorGenerator inserts its view engine at the first position in the collection:
ViewEngines.Engines.Insert(0, engine);
You need to change that to an Add:
ViewEngines.Engines.Add(engine);
so that it is added to the engines last; this way your custom view engine is consulted first when locating views.
This approach allows you to reuse views in multiple applications while allowing a means to override that view.
This may be overkill for most applications, but as I mentioned in the question, this is a base product that we use to develop multiple client applications. Achieving reuse while maintaining a level of flexibility on a per-client basis is what we were after.

play 2.5 render view test, access to flash messages

Here is a simple Play render-view test. In the view template I am trying to access session information through flash.get(), but the test fails with the message "There is no HTTP Context available from here." How do I add fake session data to the application under test in a JUnit test context?
public class ApplicationTest extends WithServer {

    private FormFactory formFactory() {
        return app.injector().instanceOf(FormFactory.class);
    }

    @Test
    public void renderTemplate() {
        Content html;
        session().put("session", "123");
        html = index.render(formFactory().form(Auth.Login.class));
        assertTrue(contentAsString(html).contains("Hello"));
    }
}
Test ApplicationTest.renderTemplate failed: java.lang.RuntimeException: There is no HTTP Context available from here., took 0.544 sec
at play.mvc.Http$Context.current(Http.java:57)
at play.mvc.Http$Context$Implicit.flash(Http.java:307)
at views.html.index_Scope0$index$$anonfun$apply$1.apply(index.template.scala:39)
at views.html.index_Scope0$index$$anonfun$apply$1.apply(index.template.scala:38)
at views.html.helper.form_Scope0$form.apply(form.template.scala:35)
at views.html.index_Scope0$index.apply(index.template.scala:38)
at views.html.index_Scope0$index.render(index.template.scala:141)
at views.html.index.render(index.template.scala)
at ApplicationTest.renderTemplate(ApplicationTest.java:37)
Using WithServer starts up an application that you can make requests to. For the tests you describe here, you need to use WithApplication.
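For example, here is a minimal sketch (not part of the original answer) of the same test moved over to WithApplication; on its own it will still hit the missing-context error until the context is set up as shown next:
// Sketch only: the test from the question, but extending WithApplication so
// an Application (and its injector) is started without an HTTP server.
// The flash()/session() calls in the template still need an Http.Context,
// which is what the startPlay() override below provides.
public class ApplicationTest extends WithApplication {

    private FormFactory formFactory() {
        return app.injector().instanceOf(FormFactory.class);
    }

    @Test
    public void renderTemplate() {
        Content html = index.render(formFactory().form(Auth.Login.class));
        assertTrue(contentAsString(html).contains("Hello"));
    }
}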
To manually set a context, you can override the startPlay method.
@Override
public void startPlay()
{
    super.startPlay();
    Http.Context.current.set(new Http.Context(1L,
            Mockito.mock(RequestHeader.class),
            Mockito.mock(Http.Request.class),
            Collections.<String, String>emptyMap(),
            Collections.<String, String>emptyMap(),
            Collections.<String, Object>emptyMap()));
}
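If the template reads values via flash.get() or session(), you can seed fake data through those two maps instead of passing empty ones. A sketch under that assumption (the keys and values below are just placeholders):
@Override
public void startPlay()
{
    super.startPlay();
    // Placeholder entries so flash.get("success") and session().get("session")
    // resolve when the template is rendered.
    Map<String, String> sessionData = new HashMap<>();
    sessionData.put("session", "123");
    Map<String, String> flashData = new HashMap<>();
    flashData.put("success", "Hello");
    Http.Context.current.set(new Http.Context(1L,
            Mockito.mock(RequestHeader.class),
            Mockito.mock(Http.Request.class),
            sessionData,
            flashData,
            Collections.<String, Object>emptyMap()));
}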

Can the KnowledgeAgent be used to automatically write the KnowledgeBase to a file so it can be used externally?

I'm working on a little Drools project and I have the following problem:
When I read the knowledge packages from Drools via the KnowledgeAgent, it takes a long time to load (I know that building the KnowledgeBase is generally expensive, especially when loading packages from Guvnor).
So I'm trying to serialize the KnowledgeBase to a file located locally on the system, on the one hand because loading the KnowledgeBase from a local file is much, much faster, and on the other hand so that I can use the KnowledgeBase in other applications. The problem with this is that after the KnowledgeAgent loads the KnowledgeBase the first time, the base is updated by the agent automatically,
but while the base is updated, my local file is not updated with it.
So I'm wondering how to handle/get the change notification from my KnowledgeAgent so I can call a method to serialize my KnowledgeBase.
Is this somehow possible? Basically I just want to update my local KnowledgeBase file every time someone edits a rule in Guvnor, so that my local file is always up to date.
If it isn't possible, or is a really bad solution to begin with, what is the recommended/best way to go about it?
Please excuse my English and the question itself if you can't quite make out what I want to accomplish, if my request is actually not a good solution, or if the question is redundant; I'm rather new to Java and a total noob when it comes to Drools.
Down below is the code:
public class DroolsConnection {

    private static KnowledgeAgent kAgent;
    private static KnowledgeBase kAgentBase;

    public DroolsConnection() {
        ResourceFactory.getResourceChangeNotifierService().start();
        ResourceFactory.getResourceChangeScannerService().start();
    }

    public KnowledgeBase readKnowledgeBase() throws Exception {
        kAgent = KnowledgeAgentFactory.newKnowledgeAgent("guvnorAgent");
        kAgent.applyChangeSet(ResourceFactory.newFileResource(CHANGESET_PATH));
        kAgent.monitorResourceChangeEvents(true);
        kAgentBase = kAgent.getKnowledgeBase();
        serializeKnowledgeBase(kAgentBase);
        return kAgentBase;
    }

    public List<EvaluationObject> runAgainstRules(List<EvaluationObject> objectsToEvaluate,
            KnowledgeBase kBase) throws Exception {
        StatefulKnowledgeSession knowSession = kBase.newStatefulKnowledgeSession();
        KnowledgeRuntimeLogger knowLogger = KnowledgeRuntimeLoggerFactory.newFileLogger(knowSession, "logger");
        for (EvaluationObject o : objectsToEvaluate) {
            knowSession.insert(o);
        }
        knowSession.fireAllRules();
        knowLogger.close();
        knowSession.dispose();
        return objectsToEvaluate;
    }

    public KnowledgeBase serializeKnowledgeBase(KnowledgeBase kBase) throws IOException {
        OutputStream outStream = new FileOutputStream(SERIALIZE_BASE_PATH);
        ObjectOutputStream oos = new ObjectOutputStream(outStream);
        oos.writeObject(kBase);
        oos.close();
        return kBase;
    }

    public KnowledgeBase loadFromSerializedKnowledgeBase() throws Exception {
        KnowledgeBase kBase = KnowledgeBaseFactory.newKnowledgeBase();
        InputStream is = new FileInputStream(SERIALIZE_BASE_PATH);
        ObjectInputStream ois = new ObjectInputStream(is);
        kBase = (KnowledgeBase) ois.readObject();
        ois.close();
        return kBase;
    }
}
thanks for your help in advance!
best regards,
Marenko
In order to keep your local kbase updated you could use a KnowledgeAgentEventListener to know when its internal kbase gets updated:
kagent.addEventListener(new KnowledgeAgentEventListener() {
    public void beforeChangeSetApplied(BeforeChangeSetAppliedEvent event) {
    }

    public synchronized void afterChangeSetApplied(AfterChangeSetAppliedEvent event) {
    }

    public void beforeChangeSetProcessed(BeforeChangeSetProcessedEvent event) {
    }

    public void afterChangeSetProcessed(AfterChangeSetProcessedEvent event) {
    }

    public void beforeResourceProcessed(BeforeResourceProcessedEvent event) {
    }

    public void afterResourceProcessed(AfterResourceProcessedEvent event) {
    }

    public void knowledgeBaseUpdated(KnowledgeBaseUpdatedEvent event) {
        //THIS IS THE EVENT YOU ARE INTERESTED IN
    }

    public void resourceCompilationFailed(ResourceCompilationFailedEvent event) {
    }
});
You still need to handle concurrent accesses on your local kbase though.
By the way, since you are not using the 'newInstance' configuration option, the agent will create a new instance of the kbase each time a change-set is applied. So make sure you serialize the kagent's internal kbase (kagent.getKnowledgeBase()) instead of the reference you have in your app.
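For example, the knowledgeBaseUpdated callback in the listener above could be filled in roughly like this (a sketch, assuming the listener is registered from inside DroolsConnection so the question's serializeKnowledgeBase() method and the kAgent/kAgentBase fields are in scope):
public void knowledgeBaseUpdated(KnowledgeBaseUpdatedEvent event) {
    // The agent builds a new KnowledgeBase instance for each applied change
    // set (unless the 'newInstance' option is disabled), so re-fetch it here.
    synchronized (DroolsConnection.class) {
        try {
            kAgentBase = kAgent.getKnowledgeBase();
            serializeKnowledgeBase(kAgentBase); // rewrite the local file
        } catch (IOException e) {
            e.printStackTrace(); // or proper logging / error handling
        }
    }
}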
Hope it helps,

Use MEF to compose parts but postpone the creation of the parts

As explained in these questions I'm trying to build an application that consists of a host and multiple task processing clients. With some help I have figured out how to discover and serialize part definitions so that I could store those definitions without having to have the actual runtime type loaded.
The next step I want to achieve (or next two steps really) is that I want to split the composition of parts from the actual creation and connection of the objects (represented by those parts). So if I have a set of parts then I would like to be able to do the following thing (in pseudo-code):
public sealed class Host
{
    public CreationScript Compose()
    {
        CreationScript result;
        var container = new DelayLoadCompositionContainer(
            s => result = s);
        container.Compose();
        return result;
    }

    public static void Main()
    {
        var script = Compose();

        // Send the script to the client application
        SendToClient(script);
    }
}

// Lives inside the other application
public sealed class Client
{
    public void Load(CreationScript script)
    {
        var container = new ScriptLoader(script);
        container.Load();
    }

    public static void Main(string scriptText)
    {
        var script = new CreationScript(scriptText);
        Load(script);
    }
}
So that way I can compose the parts in the host application, but actually load the code and execute it in the client application. The goal is to put all the smarts of deciding what to load in one location (the host) while the actual work can be done anywhere (by the clients).
Essentially what I'm looking for is some way of getting the ComposablePart graph that MEF implicitly creates.
Now my question is if there are any bits in MEF that would allow me to implement this kind of behaviour? I suspect that the provider model may help me with this but that is a rather large and complex part of MEF so any guidelines would be helpful.
From lots of investigation it seems that it is not possible to separate the composition process from the instantiation process in MEF, so I have had to create my own approach for this problem. The solution assumes that the scanning of plugins results in the type, import and export data being stored somehow.
In order to compose parts you need to keep track of each part instance and how it is connected to other part instances. The simplest way to do this is to make use of a graph data structure that keeps track of which import is connected to which export.
public sealed class CompositionCollection
{
    private readonly Dictionary<PartId, PartDefinition> m_Parts;
    private readonly Graph<PartId, PartEdge> m_PartConnections;

    public PartId Add(PartDefinition definition)
    {
        var id = new PartId();
        m_Parts.Add(id, definition);
        m_PartConnections.AddVertex(id);
        return id;
    }

    public void Connect(
        PartId importingPart,
        MyImportDefinition import,
        PartId exportingPart,
        MyExportDefinition export)
    {
        // Assume that edges point from the export to the import
        m_PartConnections.AddEdge(
            new PartEdge(
                exportingPart,
                export,
                importingPart,
                import));
    }
}
Note that before connecting two parts it is necessary to check whether the import can be connected to the export. Normally MEF does that for us, but in this case we'll need to do it ourselves. An example of how to approach that:
public bool Accepts(
    MyImportDefinition importDefinition,
    MyExportDefinition exportDefinition)
{
    if (!string.Equals(
        importDefinition.ContractName,
        exportDefinition.ContractName,
        StringComparison.OrdinalIgnoreCase))
    {
        return false;
    }

    // Determine what the actual type is we're importing. MEF provides us with
    // that information through the RequiredTypeIdentity property. We'll
    // get the type identity first (e.g. System.String)
    var importRequiredType = importDefinition.RequiredTypeIdentity;

    // Once we have the type identity we need to get the type information
    // (still in serialized format of course)
    var importRequiredTypeDef =
        m_Repository.TypeByIdentity(importRequiredType);

    // Now find the type we're exporting
    var exportType = ExportedType(exportDefinition);
    if (AvailableTypeMatchesRequiredType(importRequiredType, exportType))
    {
        return true;
    }

    // The import and export can't directly be mapped so maybe the import is a
    // special case. Try those
    Func<TypeIdentity, TypeDefinition> toDefinition =
        t => m_Repository.TypeByIdentity(t);

    if (ImportIsCollection(importRequiredTypeDef, toDefinition)
        && ExportMatchesCollectionImport(
            importRequiredType,
            exportType,
            toDefinition))
    {
        return true;
    }

    if (ImportIsLazy(importRequiredTypeDef, toDefinition)
        && ExportMatchesLazyImport(importRequiredType, exportType))
    {
        return true;
    }

    if (ImportIsFunc(importRequiredTypeDef, toDefinition)
        && ExportMatchesFuncImport(
            importRequiredType,
            exportType,
            exportDefinition))
    {
        return true;
    }

    if (ImportIsAction(importRequiredTypeDef, toDefinition)
        && ExportMatchesActionImport(importRequiredType, exportDefinition))
    {
        return true;
    }

    return false;
}
Note that the special cases (like IEnumerable<T>, Lazy<T> etc.) require determining if the importing type is based on a generic type which can be a bit tricky.
Once all the composition information is stored it is possible to do the instantiation of the parts at any point in time because all the required information is available. Instantiation requires a generous helping of reflection combined with the use of the trusty Activator class and will be left as an exercise to the reader.