How to use a shared library in ASP.NET Core MVC running on IIS

I'm looking into using ASP.NET Core MVC for some of my new projects. I work on a team of developers for a very large organization, and we each individually write a lot of small web apps. Due to the size of our organization, we have a lot of rules to follow, and sometimes those rules change, completely out of our control. So this is what we have used in past projects, all running on IIS:
ASP Classic - Each IIS root folder has a shared folder, containing a lot of commonly used .asp files. These files are mostly the same on each server, but can point to different databases for dev/test/prod environments. These library files are used for common things like authentication, authorization, encryption, sending emails, etc... Each application would be in a sibling folder to the shared folder, and include files like "..\shared\library.asp"
ASP.NET / MVC - The closest thing we could find was the GAC. Everybody says not to use the GAC, but for our purposes it does exactly what we need. We built a DLL library and store it in the GAC of each web server. We then put local configuration information (dev/test/prod environment-specific stuff) in the global web.config of each IIS server. Application-specific information is stored in that application's local web.config file.
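For illustration, reading those machine-level settings from application code is plain ConfigurationManager access; the key names here are just examples, not our actual setup:
// Classic ASP.NET: appSettings defined in the server-wide web.config
// (%WINDIR%\Microsoft.NET\Framework64\<version>\Config\web.config) are
// visible to every application on that server. Requires a reference to
// System.Configuration.
using System.Configuration;

public static class GlobalSettings
{
    // e.g. "dev", "test", or "prod"
    public static string Environment =>
        ConfigurationManager.AppSettings["environment"];

    // e.g. "devsql"
    public static string AuthDatabase =>
        ConfigurationManager.AppSettings["authdatabase"];
}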
The beauty of these two systems is that when things change, we can simply update the global libraries, and every application that depends on them adapts to the new code without needing a recompile. We have many applications running on many web servers. This may not be ideal, but for our needs it works perfectly, considering the rules can change at a moment's notice and recompiling every application would be a huge ordeal. We just have to be sure never to introduce breaking changes into our libraries, which is simple enough. We have zero problems with how it works.
Now, on to ASP.NET Core. Is there an elegant way to do this? It seems like Core doesn't support the GAC, nor does it support web.config; everything wants to use appsettings.json. Is there a way to create an appsettings.json at the root level of IIS and have it set global values like environment="dev", authdatabase="devsql", etc.? And can we store a .NET Core/Standard DLL in a shared folder and have every app load it with a path like "..\shared\library.dll"? The closest thing I could find to do this with .NET Framework was the GAC, but I'm not really finding any answers for this with Core. I appreciate any help, thanks!

sometimes things change, and we can simply go update the global libraries, and every application that depends on them will adapt to the new code without needing a recompile
Note that this is exactly one of the reasons why GAC deployment is usually avoided. If you update a dependency and it happens to contain a breaking change (of whatever kind), applications will start to break randomly, without you having any control over it.
Usually, if you update a dependency, you have to retest every application that depends on it before deploying the update. That is why dependency updates (e.g. via NuGet) are deliberate choices you need to make.
.NET Core avoids this in general by never sharing assemblies between applications and by allowing different versions side-by-side. That way, you can update applications one by one without affecting others.
This is actually a primary reason why .NET Core was made in the first place: The .NET Framework is shipped with Windows, and is a global thing. All applications will always use the same framework version. So whenever Microsoft ships an update to the .NET Framework, they have to be incredibly careful not to break applications. And that is incredibly difficult because countless applications depend on all kinds of things in the framework. Even fixing a seemingly obvious bug can break stuff.
With .NET Core and side-by-side dependencies, this is no longer a problem because updates will not automatically break applications that still depend on older versions. It is a developer’s explicit choice to update an application, shipping newer dependencies.
So you should actually embrace this and start to develop your applications independently. If you have common dependencies, consider creating (private) NuGet packages for those, so that applications can depend on them and so that you have a good way to update them properly.
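The configuration half of the question does have a supported answer, though: ASP.NET Core configuration is composable, so an app can explicitly layer in a JSON file from a shared folder. A minimal sketch, assuming .NET 6+ minimal hosting and a hypothetical ..\shared\globalsettings.json (the relative path resolves against the app's content root, and since this source is added after the defaults, its values override the per-app appsettings.json):
// Program.cs: layer a shared, machine-level JSON file on top of the
// app's own appsettings.json. The shared path and key names are hypothetical.
var builder = WebApplication.CreateBuilder(args);

builder.Configuration.AddJsonFile(
    Path.Combine("..", "shared", "globalsettings.json"),
    optional: true,        // don't fail if the shared file is absent
    reloadOnChange: true); // pick up edits without restarting the app

var app = builder.Build();

// Values from the shared file read like any other configuration key.
app.MapGet("/", (IConfiguration config) =>
    $"environment={config["environment"]}, authdatabase={config["authdatabase"]}");

app.Run();
Just be aware that this recreates exactly the shared-mutable-state coupling described above, so it is best treated as a transition aid rather than the end state.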

Related

How to make use of common files across projects using an IIS7 virtual directory

The scenario:
I'm very new to ASP.NET MVC programming, and I keep running into a wall trying to make use of common files (.js, .css) across multiple projects.
The idea is to have these generic files in one location, which allows easy future updates and avoids the copy-and-paste dilemma across all the projects. I've set this folder up in IIS7 as a virtual directory in the default website with the alias "CommonFiles".
The problem:
With MVC 4 I'm trying to add the .js files to a script bundle, but when I run the application it's not picking the files up at all (I checked the page source and also added a JS function as a test).
Code snippet in BundleConfig.cs:
bundles.Add(new ScriptBundle("~/bundles/test").Include("~/CommonFiles/test.js"));
Rendering in _Layout.cshtml:
@Scripts.Render("~/bundles/test")
I've read quite a few posts (Script Bundling in WebForms with Virtual Directories (ASP WebForms, though), How to add reference to System.Web.Optimization for MVC-3-converted-to-4 app, and ScriptBundle not rendering scripts that are in a VirtualDirectory), but I'm afraid my lack of knowledge of MVC is limiting my path forward, and I'm really hoping to get some insight into how MVC handles IIS virtual directories, and whether this is even an easy possibility given the last post above.
Can this be done in MVC 4, and if not, what is the next best alternative for reusing common code across projects?
A post by kev (Using ServerManager to create Application within Application) put me on the right path, and the issue I had is actually embarrassing.
For the sake of other devs landing on this post with a similar issue in Visual Studio, this is what fixed my issue:
Problem:
I make use of a separate project which contains files that are used across multiple other projects. I created a virtual folder in IIS7 referencing these files. This means if a change is needed to the common files, it's updated once and all the other projects will automatically "see" the change.
My other individual projects use script bundling to include files relevant only to that project, but also to reference the common files in the virtual folder as defined in IIS.
My MVC 4 web application wasn't picking up the common files with the syntax above, in either Debug or Release.
Solution:
When developing in VS2012, under the project's properties there's a setting on the Web tab where you can specify whether to use the local IIS web server or IIS Express to test your application. IIS Express runs the site on a random port, so that multiple sites can run at once (on different ports). This seems to throw off the virtual-directory include in the bundling.
Choosing to use the local IIS server is closer to what the "live" environment would be in my opinion. Just un-tick the "Use IIS Express" setting.
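One more thing worth knowing when testing bundles: with <compilation debug="true">, @Scripts.Render emits the individual file references instead of the combined bundle, which can make it look like bundling is broken. A small sketch of the standard System.Web.Optimization switch that forces bundling while you verify (the bundle paths are the ones from the question):
// BundleConfig.cs: force bundling/minification even in debug builds,
// handy when checking that a bundle actually resolves its include paths.
using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        bundles.Add(new ScriptBundle("~/bundles/test")
            .Include("~/CommonFiles/test.js"));

        // Overrides the debug/release default; remove once verified.
        BundleTable.EnableOptimizations = true;
    }
}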
As a side note, for more info on the difference between IIS and IIS Express and whether it's suitable for your environment (as it was for mine), see this link:
http://msdn.microsoft.com/en-us/library/58wxa9w5.aspx
Hope this helps someone in future and saves them the amount of time I wasted on this!

Is it possible to run ASP.NET MVC 4 from within a folder of a main website?

I have successfully set up an API using ASP.NET MVC 4 on IIS6 (I used Phil's tutorial). When testing, we had it as the "Default website", so there was no conflict with anything else. I am now being asked to set this up within a FOLDER of an existing website (the existing website is in ASP 1.0...and I cannot modify this...so I would need some sort of virtual...something?). So basically, if we have https://www.ourcompany.com, they want the API to be available through https://www.ourcompany.com/api/.
Is this even possible? Phil's tutorial talks about setting up a Virtual Application, but I don't have that option in IIS (and if I had, I'm not knowledgeable enough about IIS to know if that would even allow me to access the API that way). I don't want anything that I set up to mess up the current website either, and there are a couple steps in the tutorial that I'll freely admit I don't fully understand.
If you're curious as to WHY, the only advantage (besides being "neat") is that the same SSL cert can be used.
Yes, that's definitely possible. At my work we had a similar setup: IIS6, a .NET 3.5 web with a .NET 4.0 web nested underneath.
You would just set it up as a virtual directory underneath the parent website, point it to your folder, ensure the "Execute Permissions" dropdown is set to "Scripts Only" or above, and make sure the correct .NET Framework version is selected on the ASP.NET tab.
There may be additional values you need to override in your child web.config file; alternatively, wrap the parent web.config settings in a <location> element (e.g. with inheritInChildApplications="false") so they are not inherited by the child application.
Forgot to mention: you may need to add manual script mappings for the child web if it doesn't work out of the box (this installs the .NET 4.0 script mappings for a specific web), though again I'm not sure whether this is required by default. See: http://msdn.microsoft.com/en-us/library/k6h9cz8h.aspx
One more thing: if you're using REST (or extensionless URL mapping, which I believe an MVC 4 web will use), you'll need to add a "wildcard" script mapping, which basically tells IIS to serve requests with no extension through the .NET 4.0 framework. See here; however, where they reference the .NET 2.0 folders, you'll obviously want to reference the same files in the .NET 4.0 folders :)
Thanks

Is it possible to make ASP.NET model binding work in .NET Framework 4.0 Web Forms?

We have a couple of relatively large Web Forms web application projects, but we are limited to .NET 4.0 because some of our clients are still using Windows Server 2003, and .NET 4.5 is not compatible with that OS.
Would it somehow be possible to make the model binding framework introduced in .NET 4.5 work with .NET 4.0 Web Forms? Maybe something along the lines of extension methods on .NET 2.0 (although that is obviously almost 100% compile-time stuff) or LinqBridge.
If that were possible to some extent, I think I would take the time to do it. Maybe the code could be extracted from the original sources (I'm downloading them right now to see how it works) and plugged in, as an extension or inheritance of sorts, to our current page life cycle.
Does that mechanism have some external dependency that would make this prohibitive?
The Web Forms model binding feature required changes to the framework that are only available in 4.5.
That said, if you require model binding in some form, you could always try using the ASP.NET MVC or WebAPI frameworks for the particular part of your site in which you require model binding, leaving the rest as WebForms. They both currently only require .NET 4.0. And you get the benefit that both of those are supported products.
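To illustrate that suggestion, here is a minimal Web API sketch of the kind of model binding you'd get on .NET 4.0 (ASP.NET Web API 1 targets .NET 4.0; the type and route here are hypothetical, and the usual WebApiConfig route registration is assumed):
using System.Net;
using System.Net.Http;
using System.Web.Http;

// A plain model; Web API binds it from the request body (JSON or XML).
public class Order
{
    public int Id { get; set; }
    public string Customer { get; set; }
}

public class OrdersController : ApiController
{
    // POST api/orders: the Order parameter is model-bound automatically.
    public HttpResponseMessage Post(Order order)
    {
        if (!ModelState.IsValid)
            return Request.CreateErrorResponse(HttpStatusCode.BadRequest, ModelState);

        // ... persist the order here ...
        return Request.CreateResponse(HttpStatusCode.Created, order);
    }
}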

Should I use a single framework codebase for multiple sites or one for each site?

Here's the situation:
Multiple sites are built using a certain framework.
All sites are hosted on the same server.
Each site has its own database
Since all the sites are hosted in the same location and they all have the framework code in common, I could easily install the framework once on the server and have each site use the same files as its framework code.
This way I only have 1 framework installation that is used by all the websites.
The second option is to have each site work with its own installation of the framework.
Pros and cons of option 1:
Only have to maintain 1 codebase for the framework of multiple websites
Framework updates instantly apply for all the websites
Should one site have different needs of the framework, or code that's no longer compatible with the latest framework version, custom framework-compatibility patches become required (there is not always time or budget to keep legacy projects compatible with the latest framework version).
Pros and cons of option 2:
A separate framework installation to maintain for each site
Framework updates have to be applied separately for each site
Should one site have different needs of the framework, or no budget to be made compatible with the latest framework update, we simply don't update that site's framework installation.
If it's really necessary, a site could quickly modify its framework to match the needs without interfering with other sites on the server.
So option 1 seems easier to maintain, while option 2 is much more flexible. I don't know what's most important.
Which of the 2 options is the overall the best choice? Or are there more options possible?
I approach this slightly differently.
I'd have a dir structure like this:
/sites/
    /site1/
    /site2/
    [etc]
/framework/
    /1.0/
    /1.1/
    /1.x/
    [etc]
Then within each site have a /framework mapping which points to the version of the framework that site is using. That way each site's framework version is independent of the others: if one site needs a different framework version than the others (stuck on an old version, needs a new version before the other sites have been tested with it), then you have control over that sort of granularity. Equally, changing the codebase in /framework/nightly/ (for example) will "automatically" update all sites whose /framework mapping points to that bleeding-edge version of the codebase.
Personally, I will always have each site using its own framework codebase, or at least a shared codebase frozen at a set version. The problem with sharing the framework is that with each update you'd have to test every site using that shared codebase to ensure it still works as expected. And what happens when the framework deprecates a feature you use in one of your sites? It could prevent you from updating, leaving you open to security issues.
Option 2 does give you more of a maintenance overhead but it's the safest approach in my opinion.
It depends on a couple of factors.
How many developers are working on the sites? Is the framework well supported and documented? Is there a better framework available (better in terms of documentation, overhead (memory footprint), and community support)? Which choice would look better 12 months or longer down the road?
Choice 1 is appealing for its consistency and familiarity.
Choice 2 is appealing for its potential, learning and future growth.

Organising frameworks for client, server, and plugins

I am working on a large project that comprises a server, some plugins for controlling devices that can be loaded into the server process, and also clients that can connect to the server. I am looking for the best practice for structuring the framework(s) that will ultimately need to exist.
There are header files that are shared between client, server, and plugins, and some headers that are specific to each aspect of the system. Sometimes headers are only shared between say client and server, or server and plugin. Similarly, there is common code that can be shared between all three aspects of the project, and also code that would only be needed by one particular aspect.
When the project is finished, we will need to release a client application and plugin developer API for third parties to develop against.
I am not sure how to properly structure the Frameworks that would be needed to support this.
Do I need to have 2 separate frameworks? Or can I have 1 framework that includes all the headers and provides 2 separate dylibs?
If I need to have 2 separate frameworks, what do I do with header files that are shared between all aspects of the system? I do not want to copy them into each framework to avoid possible problems with versioning.
Would a third, headers-only framework be a reasonable option?
Can anyone recommend a best practice for structuring this kind of thing with Frameworks on OS X?
Framework = library + headers for that library
Each framework only needs to include header files for the interfaces you want to expose. Even if common headers were used to build all three frameworks, you are not obliged to bundle them.
A three-framework approach would be fine, even if one of the frameworks bundles only common headers and no library at all. For example, if you install Qt on your Mac, you will see it split across many frameworks, yet headers are never duplicated among them. There are also some frameworks that contain only headers and no code (QtScript.framework, for instance).