ASP.NET Core Dependency Injection: Evaluating the Dependency Tree - asp.net-core

Is there an easy way to verify that the dependencies for a given service, or for all services, are available within the DI container, specifically at build time?
I have a couple of Azure Functions for which I build the DI container manually to keep the registrations to a minimum. When the dependency tree changes, I have to remember to add the new dependency to the function's container; if I don't, I get an error the first time the function executes rather than when I build it to republish.
Is there a best practice to follow with this? Can it be achieved with some sort of unit test?

I don't think there is a way to validate this at build time, but as you suggest, you could verify it with a unit test if you follow the usual extension-method approach to registering your services: arrange the list of types that should be registered, call your registration extensions, and validate the contents. The only thing I don't think you can do is confirm they're registered with the correct lifetime.
public static IServiceCollection AddFunctionsServices(this IServiceCollection services)
{
    return services
        .AddTransient<Foo>()
        .AddTransient<Bar>()
        .AddTransient<Baz>();
}
[Fact]
public void RegisteredServices()
{
    var types = new List<Type> { typeof(Foo), typeof(Bar), typeof(Baz) };

    var provider = new ServiceCollection()
        .AddFunctionsServices()
        .AddSomeOtherServices()
        .BuildServiceProvider();

    // GetService takes a Type argument here; a runtime Type variable
    // can't be used as a generic type parameter.
    foreach (var t in types)
        Assert.NotNull(provider.GetService(t));
}
If registration happens in a separate library used by different apps (I actually do this myself with a library that supports Functions, web apps, and command-line utilities), you can easily set up a separate test for the list of services each library consumer requires, as sketched below.
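A minimal sketch of that per-consumer idea, assuming xUnit; AddFunctionsServices is the extension from above, and the required-type list would differ per consumer:
public class FunctionsRegistrationTests
{
    // One entry per service the Functions host must be able to resolve.
    public static IEnumerable<object[]> RequiredServices => new List<object[]>
    {
        new object[] { typeof(Foo) },
        new object[] { typeof(Bar) },
        new object[] { typeof(Baz) },
    };

    [Theory]
    [MemberData(nameof(RequiredServices))]
    public void Functions_host_can_resolve(Type serviceType)
    {
        var provider = new ServiceCollection()
            .AddFunctionsServices()
            .BuildServiceProvider();

        Assert.NotNull(provider.GetService(serviceType));
    }
}
A web app or CLI consumer would get its own test class with its own required-type list, so a missing registration fails in the test run rather than on the first invocation.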

How to generate a workflow at runtime with Elsa Workflow

With the Elsa workflow designer it is possible to define a workflow and publish it, and you can also build a workflow programmatically by implementing the IWorkflow interface.
I need to create a workflow programmatically at runtime, save it to the database, and run it occasionally.
In my ASP.NET Core project's controller, I resolve IWorkflowBuilder as a dependency, build a workflow with the WorkflowBuilder, and get back a WorkflowBlueprint object, but I don't know how to store it or how to run it.
I also have the Elsa dashboard in my project and use Entity Framework persistence for it.
Is there a way to convert a WorkflowBlueprint to a WorkflowDefinition, or to generate a WorkflowDefinition from scratch programmatically?
Does anyone have any ideas?
Although it might theoretically be possible to store an IWorkflow implementation in the database, there are some caveats that make this tricky to say the least. Here is why:
A workflow definition created by the designer consists purely of a list of activities and connections between them. Because of that, everything is easily serialized into JSON and stored in the database.
However, when you write a C# class, you can do fancier things, such as configuring activities using C# lambda expressions and implementing "inline" activity code. When you try to serialize this to JSON, those C# expressions can only be serialized by their type names.
Although there might be ways to somehow store a programmatic workflow into the database, perhaps even by storing a compiled assembly in the DB, I don't think it's worth the trouble because there are better ways.
You said that you need a programmatic workflow that you only run sometimes.
To achieve that, you do not need to store a workflow in the database.
The way Elsa works is that all workflow sources are converted into a thing called a Workflow Blueprint.
A workflow blueprint represents an executable workflow, carrying all the details the workflow invoker needs.
There are different "sources" to establish these workflow blueprints by means of classes that implement IWorkflowProvider, of which there are three:
Programmatic Workflow Provider
Database Workflow Provider
Blob Storage Workflow Provider
The programmatic provider is what turns IWorkflow implementations into workflow blueprints, while the database provider turns workflow definitions into blueprints. The blob storage provider is similar, except it turns JSON files into blueprints.
The bottom line is that the origin of a workflow blueprint doesn't matter for the workflow engine.
All workflow blueprints are accessed through a service called the workflow registry, which you can use to load & execute a given workflow.
For example, if you have a programmatic workflow called MyWorkflow, you can execute it whenever you want like this:
public class MyWorkflow : IWorkflow
{
    public void Build(IWorkflowBuilder builder)
    {
        builder.WriteLine("Hello World!");
    }
}

[ApiController]
[Route("my-workflow")]
public class MyWorkflowController : Controller
{
    private readonly IWorkflowRegistry _workflowRegistry;
    private readonly IStartsWorkflow _workflowStarter;

    public MyWorkflowController(IWorkflowRegistry workflowRegistry, IStartsWorkflow workflowStarter)
    {
        _workflowRegistry = workflowRegistry;
        _workflowStarter = workflowStarter;
    }

    [HttpGet("run")]
    public async Task<IActionResult> RunMyWorkflow(CancellationToken cancellationToken)
    {
        // 1. Get my workflow blueprint.
        var myWorkflowBlueprint = (await _workflowRegistry.GetWorkflowAsync<MyWorkflow>(cancellationToken))!;

        // 2. Run the workflow.
        await _workflowStarter.StartWorkflowAsync(myWorkflowBlueprint, cancellationToken: cancellationToken);

        return Ok();
    }
}
Invoking this controller will execute MyWorkflow.
As you can see, there is no need to store the workflow in the database in order to be able to execute it on demand. Even if you did store the workflow in the database, the code would be the same, provided the name of the workflow remains "MyWorkflow". Under the covers, GetWorkflowAsync<TWorkflow> is simply an extension method that uses the type name to find the workflow by name. If you wanted to load a workflow by name for which there's no workflow class defined, you would simply use FindByNameAsync, or FindAsync if all you had was a workflow definition ID.
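For completeness, a hedged sketch of the by-name route mentioned above; FindByNameAsync lives on the same IWorkflowRegistry, but the VersionOptions argument and the exact parameter order are assumptions based on Elsa 2.x and may differ in your version:
[HttpGet("run-by-name")]
public async Task<IActionResult> RunByName(CancellationToken cancellationToken)
{
    // Look up the published blueprint by its name instead of its CLR type.
    var blueprint = await _workflowRegistry.FindByNameAsync("MyWorkflow", VersionOptions.Published, cancellationToken: cancellationToken);

    if (blueprint == null)
        return NotFound();

    await _workflowStarter.StartWorkflowAsync(blueprint, cancellationToken: cancellationToken);
    return Ok();
}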

Correct way to clean up a "Pre" instance of ServiceCollection and ServiceProvider?

I am implementing a Custom Configuration Provider in my application.
In that provider, I have to make a REST API call. That call needs a valid OAuth 2 token to succeed. To get that token I need a semi-complicated tree of class dependencies.
For the rest of my application, I just use dependency injection to get the needed instances. But a custom configuration provider is called well before dependency injection is set up.
I have thought about making a "Pre" instance of dependency injection. Like this:
IServiceCollection services = new ServiceCollection();
// Setup the DI here
IServiceProvider serviceProvider = services.BuildServiceProvider();
var myTokenGenerator = serviceProvider.GetService<IMyTokenGenerator>();
But I have read that when you make another ServiceCollection, it can cause problems. I would like to know the way to avoid those problems.
How do I correctly cleanup a "pre-DI" instance of ServiceCollection and ServiceProvider?
(Note: Neither one seems to implement IDisposable.)
Hm, I don't quite see why you want to do it that way.
I'd probably build the service provider fully.
To avoid retrieved services affecting each other, I'd use nested containers/scopes, which means that if you retrieve the same service you get a different instance per container/scope.
Hopefully I understood what you want to achieve.
See
.NET Core IServiceScopeFactory.CreateScope() vs IServiceProvider.CreateScope() extension
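A minimal sketch of the scoped approach under the asker's constraints; MyTokenGenerator and GetTokenAsync are hypothetical stand-ins for the asker's real implementation:
public static class PreDiTokenFetcher
{
    // Builds a throwaway provider, resolves the token generator from a scope,
    // and disposes both once the token has been fetched.
    public static async Task<string> FetchTokenAsync()
    {
        var services = new ServiceCollection();
        services.AddScoped<IMyTokenGenerator, MyTokenGenerator>(); // plus whatever its dependency tree needs

        using var provider = services.BuildServiceProvider();
        using var scope = provider.CreateScope();

        var tokenGenerator = scope.ServiceProvider.GetRequiredService<IMyTokenGenerator>();
        return await tokenGenerator.GetTokenAsync(); // hypothetical method name
    }
}
Note that the concrete ServiceProvider returned by BuildServiceProvider() does implement IDisposable (the IServiceProvider interface just doesn't expose it), so the using statements above are enough to tear the temporary container down once the configuration has been loaded.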

How to configure AutoMapper 9.0 in an IIS-hosted WCF application

I want to use AutoMapper 9.0 in a WCF project containing several services that will be hosted in IIS. I've only found one other related SO question, but it's dealing with a 10-year-old version of AutoMapper and is not asking the same thing. Its answer is similar to the top hits on Google, which suggest using a ServiceBehavior, but that doesn't seem applicable when I want multiple services to use the same mapper.
In a web project, you might create a static MapperConfiguration in the Global.asax when the application starts, but WCF doesn't have a Global.asax. It looks like there are a few options for executing initialization code in WCF:
1. Include an AppInitialize() method in the App_Code folder. This is dynamically compiled at runtime, and people have complained that it can have missing-reference issues in IIS, so I'm not confident AutoMapper or its dependencies will be found once deployed to IIS.
2. Create a custom ServiceHost. This seems like it would execute once when the application starts, but it also looks like it ignores the web.config configuration, which I don't want.
3. Use the Configure method per service. This has the same drawback as #2, and I also become concerned about thread safety (as in the ServiceBehavior approach), since two services could try to initialize the MapperConfiguration at once.
I considered just creating a class with a static property that would create a static MapperConfiguration or IMapper instance if it was not already created, but as in #3, I'm worried this may not be thread safe. Maybe if I did something like this?
public static class MapperConfig
{
    private static IMapper _modelMapper;
    private static readonly object _mapperLocker = new object();

    public static IMapper ModelMapper
    {
        get
        {
            lock (_mapperLocker)
            {
                if (_modelMapper == null)
                {
                    var config = new MapperConfiguration(cfg => cfg.AddProfile(new MappingProfile1()));
                    _modelMapper = config.CreateMapper();
                }
            }
            return _modelMapper;
        }
    }
}
Here two services may call ModelMapper simultaneously. Another downside is that the first request to any service has to wait for the mappings to compile, but I'm not sure I can get away from that. I definitely don't want it compiling the mappings per call, and I would prefer not to have to do it per service either. Can you advise on the thread safety of MapperConfiguration and the best way to use it in IIS-hosted WCF?
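A hedged variant of the asker's idea using Lazy<T>: the default LazyThreadSafetyMode.ExecutionAndPublication guarantees the factory runs exactly once even under concurrent access, so the explicit lock goes away. MappingProfile1 is the profile from the snippet above and the AutoMapper 9.0 API is assumed:
public static class MapperConfig
{
    // Lazy<T> with its default thread-safety mode runs the factory once,
    // even if two services request the mapper at the same time.
    private static readonly Lazy<IMapper> _modelMapper = new Lazy<IMapper>(() =>
        new MapperConfiguration(cfg => cfg.AddProfile(new MappingProfile1())).CreateMapper());

    public static IMapper ModelMapper => _modelMapper.Value;
}
The first request to any service still pays the configuration cost, which seems hard to avoid, but subsequent calls just read the already-built IMapper.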

Autofac Multitenant Database Configuration

I have a base abstract context which has a couple hundred shared objects, and then two "implementation" contexts which both inherit from the base and are designed to be used by different tenants in a .NET Core application. A tenant object is injected into the constructor so that OnConfiguring can pick which connection string to use.
public abstract class BaseContext : DbContext
{
    protected readonly AppTenant Tenant;

    protected BaseContext(AppTenant tenant)
    {
        Tenant = tenant;
    }
}

public class TenantOneContext : BaseContext
{
    public TenantOneContext(AppTenant tenant)
        : base(tenant)
    {
    }
}
In startup.cs, I register the DbContexts like this:
services.AddDbContext<TenantOneContext>();
services.AddDbContext<TenantTwoContext>();
Then, using the Autofac container and the Multitenant package, I register tenant-specific contexts like this:
IContainer container = builder.Build();
MultitenantContainer mtc = new MultitenantContainer(container.Resolve<ITenantIdentificationStrategy>(), container);

mtc.ConfigureTenant("1", config =>
{
    config.RegisterType<TenantOneContext>().AsSelf().As<BaseContext>();
});

mtc.ConfigureTenant("2", config =>
{
    config.RegisterType<TenantTwoContext>().AsSelf().As<BaseContext>();
});

Startup.ApplicationContainer = mtc;
return new AutofacServiceProvider(mtc);
My service layers are designed around the BaseContext being injected for reuse where possible, and then services which require specific functionality use the TenantContexts.
public class BusinessService
{
    private readonly BaseContext _baseContext;

    public BusinessService(BaseContext context)
    {
        _baseContext = context;
    }
}
In the above service, at runtime I get the exception "No constructors on type 'BaseContext' can be found with the constructor finder 'Autofac.Core.Activators.Reflection.DefaultConstructorFinder'". I'm not sure why this is broken... the AppTenant is definitely created, as I can inject it successfully elsewhere. I can make it work if I add an extra registration:
builder.RegisterType<TenantOneContext>().AsSelf().As<BaseContext>();
I don't understand why the above registration is required for the tenant container registrations to work. This seems broken to me; in StructureMap (SaasKit) I was able to do this without adding an extra registration, and I assumed the built-in AddDbContext registrations would create a default registration for the tenant containers to override. Am I missing something here, or is this possibly a bug in the multitenant functionality of Autofac?
UPDATE:
Here is a fully runnable repo of the question: https://github.com/danjohnso/testapp
Why is line 66 of Startup.cs needed if I have lines 53/54 and lines 82-90?
As I expected, your problem has nothing to do with multitenancy as such. You've implemented it almost entirely correctly, and you're right: you do not need that additional registration, nor, by the way, these two (below), because you register the contexts in tenant scopes a bit later:
services.AddDbContext<TenantOneContext>();
services.AddDbContext<TenantTwoContext>();
So, you've made only one very small but very important mistake in your TenantIdentificationStrategy implementation. Let's walk through how you create the container - this is mainly for other people who may run into this problem as well. I'll mention only the relevant parts.
First, TenantIdentificationStrategy gets registered in the container along with everything else. Since there's no explicit lifetime specification, it is registered as InstancePerDependency() by default - but that does not really matter, as you'll see.
Next, the "standard" IContainer gets created by Autofac's builder.Build(). The next step is to create the MultitenantContainer, which takes an instance of ITenantIdentificationStrategy. This means that the MultitenantContainer and its captive dependency - the ITenantIdentificationStrategy - will be singletons regardless of how ITenantIdentificationStrategy is registered in the container. In your case it gets resolved from that standard "root" container so its own dependencies can be supplied - which is what Autofac is for, after all. Everything is fine with this approach in general, but this is where your problem actually begins.
When Autofac resolves this instance, it does exactly what it is expected to do - it injects all the dependencies into TenantIdentificationStrategy's constructor, including IHttpContextAccessor. Right there in the constructor you grab an instance of HttpContext from that accessor and store it for use in the tenant resolution process - and this is the fatal mistake: there is no HTTP request at that time, and since TenantIdentificationStrategy is a singleton, there never will be one for it. It holds a null request context for the whole application lifespan. This effectively means that TenantIdentificationStrategy can never resolve a tenant identifier based on HTTP requests - because it never actually sees them. Consequently, the MultitenantContainer cannot resolve any tenant-specific services.
Now that the problem is clear, the solution is obvious and trivial - just move the fetching of the request context (context = _httpContextAccessor.HttpContext) into the TryIdentifyTenant() method. That method gets called in the proper context and will be able to access the request and analyze it.
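A hedged sketch of that fix; the "tenant" query-string lookup is just an illustration, and the real strategy in the linked repo may identify tenants differently:
public class TenantIdentificationStrategy : ITenantIdentificationStrategy
{
    private readonly IHttpContextAccessor _httpContextAccessor;

    public TenantIdentificationStrategy(IHttpContextAccessor httpContextAccessor)
    {
        // Store only the accessor here; the HttpContext itself does not exist yet.
        _httpContextAccessor = httpContextAccessor;
    }

    public bool TryIdentifyTenant(out object tenantId)
    {
        tenantId = null;

        // Fetch the request context per call, not in the constructor.
        var context = _httpContextAccessor.HttpContext;
        if (context == null)
            return false;

        var tenant = context.Request.Query["tenant"].ToString();
        if (string.IsNullOrEmpty(tenant))
            return false;

        tenantId = tenant;
        return true;
    }
}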
PS. This digging has been highly educational for me, since I had absolutely no idea about Autofac's multi-tenant concept, so thank you very much for such an interesting question! :)
PPS. And one more thing: this question is a perfect example of how important a well-prepared example is. You provided a very good one. Without it, no one would have been able to figure out what the problem was, since the most important part was not presented in the question - and sometimes you just don't know where that part actually is...

Using MEF in Service layer (WCF)

So far I have found that MEF works well in the presentation layer, with the following benefits:
a. DI (Dependency Injection)
b. Third party extensibility (Note that all parties involved should use MEF or need wrappers)
c. Auto discovery of Parts (Extensions)
d. MEF allows tagging extensions with additional metadata which facilitates rich querying and filtering
e. Can be used to resolve Versioning issues together with “DLR and c# dynamic references” or “type embedding”
Please correct me if I'm wrong.
I'm researching whether to use MEF in the service layer with WCF. Please share your experience using the two together and how MEF is helping you.
Thanks,
Nils
Update
Here is the result of my research so far. Thanks to Matthew for helping with it.
MEF for the core services - the cost of the changes does not justify the benefits. This is also a big decision that could affect the service layer for good or ill, so it needs a lot of study. MEF v2 might be a better fit here (waiting for a stable version); I'm a little worried about using MEF v1 for this.
MEF for the functions the service performs - MEF might add value, but it is very specific to each service function. We need to dig into the service's requirements to make that decision.
The study is an ongoing process, so please keep sharing your thoughts and experience.
I think any situation that would benefit from separation of concerns would benefit from IoC. The question here is how you want MEF to be used within your service: for the core service itself, or for some function the service performs.
As an example, if you want to inject services into your WCF services, you could use something similar to the MEF for WCF example on CodePlex. I haven't looked too much into it, but essentially it wraps service location via an IInstanceProvider, allowing you to customise how your service type is created. I'm not sure whether it supports constructor injection (which would be my preference), though.
If the WCF service component isn't where you want to use MEF, you can still take advantage of MEF for creating subsets of components used by the service. Recently, for the company I work for, we've been rebuilding our quotation process, and I've built a flexible workflow calculation model whereby the workflow units are MEF-composed parts that can be plugged in where needed. The important part here is managing how your CompositionContainer is used in relation to the lifetime of your WCF service (e.g. singleton behaviour, etc.). This is quite important if you decide to create a new container each time (container creation is quite cheap, whereas catalog creation can be expensive); see the sketch below.
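A minimal sketch of that catalog/container split, assuming MEF v1 (System.ComponentModel.Composition); the directory path and the composed part types are placeholders:
public static class MefBootstrap
{
    // Catalog creation is the expensive part, so build it once and reuse it.
    private static readonly ComposablePartCatalog Catalog =
        new DirectoryCatalog(@".\bin"); // placeholder path to the part assemblies

    // Containers are comparatively cheap: create one per composition for isolation,
    // or hold a single instance for singleton-like behaviour.
    public static CompositionContainer CreateContainer()
    {
        return new CompositionContainer(Catalog);
    }
}
A caller would then do something like using (var container = MefBootstrap.CreateContainer()) { container.ComposeParts(serviceInstance); } and decide, per the point above, whether that happens per call or once per host.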
Hope that helps.
I'm working on a solution where the MEF parts that I want to use across WCF calls are stored in a singleton at the application level. This is all hosted in IIS. The services are decorated to be compatible with ASP.NET.
[AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
In Global.asax, I import the parts.
[ImportMany(typeof(IOption))]
public IEnumerable<IOption> AvailableOptions { get; set; }
After initializing the catalog and container, I copy the imported objects to my singleton class.
container.ComposeParts(this);

foreach (var option in AvailableOptions)
    OptionRegistry.AddOption(option);
EDIT:
My registry class:
public static class OptionRegistry
{
    private static List<IOption> _availableOptions = new List<IOption>();

    public static void AddOption(IOption option)
    {
        if (!_availableOptions.Contains(option))
            _availableOptions.Add(option);
    }

    public static List<IOption> GetOptions()
    {
        return _availableOptions;
    }
}
This works but I want to make it thread safe so I'll post that version once it's done.
Thread-safe Registry:
public sealed class OptionRegistry
{
    private List<IOptionDescription> _availableOptions;

    static readonly OptionRegistry _instance = new OptionRegistry();

    public static OptionRegistry Instance
    {
        get { return _instance; }
    }

    private OptionRegistry()
    {
        _availableOptions = new List<IOptionDescription>();
    }

    public void AddOption(IOptionDescription option)
    {
        lock (_availableOptions)
        {
            if (!_availableOptions.Contains(option))
                _availableOptions.Add(option);
        }
    }

    public List<IOptionDescription> GetOptions()
    {
        return _availableOptions;
    }
}
A little while ago I was wondering how I could create a WCF web service that would get all of its dependencies wired up by MEF, without my having to write a single line of that wire-up code inside the service class.
I also wanted it to be completely configuration-based, so I could take my generic solution to the next project without making code changes.
Another requirement was that I should be able to unit-test the service and mock out its different dependencies in an easy way.
I came up with a solution that I've blogged about here: Unit Testing, WCF and MEF
Hopefully it will help people trying to do the same thing.