.NET Core 2.1: How to provide NuGet packages to a heavily modularized deployment with a shared runtime - asp.net-core

Setup:
a single offline server (blocked from the internet)
multiple applications
the apps load .NET Core assemblies (plugins with their own assembly or NuGet dependencies) at runtime through reflection
Problem: What is the most efficient way to deploy the application set?
Currently I publish each application separately, so that all required NuGet packages and assemblies are available. However, this means the complete .NET Core and ASP.NET Core assembly set is copied over multiple times.
If I switch to a shared deployment with an installed .NET Core runtime or SDK, there does not seem to be an easy way to make the required NuGet packages available on an offline machine.
Any suggestions on the best-practice setup for this kind of deployment?
Cheers.

Sounds like you could use the global packages folder.
If your projects use PackageReference, they consume their dependencies directly out of that folder instead of copying them locally. If disk space is what you're really worried about, that is a way to avoid the duplication.
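As a rough sketch of that setup (the paths and source names below are placeholders, not anything your projects already define), a NuGet.config placed next to the solutions can point every app at one shared global packages folder and at a local folder feed, so restore never needs internet access and each package lives on disk only once:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- placeholder path: one shared packages folder used by every app on the server -->
    <add key="globalPackagesFolder" value="D:\nuget-cache" />
  </config>
  <packageSources>
    <!-- placeholder path: an offline feed, i.e. a plain folder of .nupkg files copied to the server -->
    <clear />
    <add key="offline-feed" value="D:\offline-packages" />
  </packageSources>
</configuration>
The offline folder (or the global packages folder itself) can be populated on a machine that does have internet access and then copied over to the server.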

Related

.NET Core Project Referencing .NET Framework Projects: Problems?

I have a question about a .NET Core project.
I have a .NET Core project referencing other projects.
The problem is that a few projects show a warning saying "Package 'XXXXX' was restored using .NETFramework,Version=v4.6.1... instead of the project target framework .NETCoreApp".
What kind of problems could I have?
Also, can I deploy this on Linux, for instance, and have it still work fine?
Thanks guys
Look at this thread: For a .NET Core 2.1 project, why does NuGet restore .NET 4.6.1 packages?
What it basically means is that the package you have loaded is not suitable for .NET Core and was restored using a different version of the .NET Framework.
Check whether the package exists for .NET Core (search through the NuGet Package Manager).
Regarding whether it will work on Linux or not: it depends on the package's dependencies (if it depends on WinForms, for example, it probably won't work on Linux).
Even if it does work, I suggest finding a package suitable for .NET Core.
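Assuming this is NuGet's NU1701 warning (which is what that message usually corresponds to on the 2.x SDKs), it comes from the SDK's AssetTargetFallback behaviour: a netcoreapp project is allowed to consume net461 assets, but NuGet warns because compatibility isn't guaranteed. A hypothetical csproj fragment, with a made-up package name, just to illustrate:
<PropertyGroup>
  <TargetFramework>netcoreapp2.1</TargetFramework>
</PropertyGroup>
<ItemGroup>
  <!-- hypothetical .NET Framework-only package; restoring it triggers the
       "restored using .NETFramework,Version=v4.6.1" warning (NU1701).
       NoWarn hides the warning but does not make the package any more compatible. -->
  <PackageReference Include="Some.NetFx.Only.Package" Version="1.0.0" NoWarn="NU1701" />
</ItemGroup>
As the answer says, the real fix is a package that targets netstandard or netcoreapp, which matters even more if you intend to run on Linux.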

.NET Core Runtime without installing

Can anyone help as to how to use the runtime binaries found in the link below?
https://www.microsoft.com/net/download/windows
Basically, we are moving to .NET Core (we have been using .NET Framework 4.0 for many years, so it is a big shift, as you can guess). I am kind of nervous about installing .NET Core on the production server (Windows Server 2012). Is it safe to install .NET Core on a server running .NET Framework 4.0? If not, is there any way I can get the .NET Core runtime onto the server without installing it (something like copying portable libraries), so that I can start with the beta testing of the app? Any help would be much appreciated. Thanks!
EDIT:
It is not a duplicate. One of the main questions I had was whether I can use portable binaries on the server to run my app without actually installing them (got the answer below, thanks again). Not sure why someone downvoted this without any reason; it makes developers nervous to ask questions on Stack Overflow. If they could mention the reason, that would be great!
There are no issues with installing .NET Core and .NET Framework on the same machine; they are designed to be installed side by side.
However, you do need to install the .NET Core runtime onto the server to allow the code to run, because the OS has native dependencies that need to be present. See this link for more information.
With .NET Core you can do framework-dependent deployments (FDD) and self-contained deployments (SCD). An FDD requires the shared assemblies (System.* assemblies, etc.) to already be present on the server, whereas an SCD only requires the basic runtime/native dependencies, because an SCD includes the .NET Core shared assemblies in its own deployment package.
You can read more about FDD and SCD here.
There is also more information about the native dependencies on different OS platforms here.
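As a sketch of the difference in practice (the target framework and runtime identifier below are only examples), the choice is made when you publish:
<PropertyGroup>
  <TargetFramework>netcoreapp2.1</TargetFramework>
  <!-- publishing with a runtime identifier ("dotnet publish -c Release -r win-x64")
       produces a self-contained deployment that carries the runtime and shared
       assemblies with it; publishing without one ("dotnet publish -c Release")
       produces a framework-dependent deployment that relies on the runtime
       already installed on the server. -->
  <RuntimeIdentifiers>win-x64</RuntimeIdentifiers>
</PropertyGroup>
The SCD output folder can then be copied to the server and run without installing the .NET Core runtime there, at the cost of a much larger deployment per app.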

Why does .NET Core not add the reference DLLs from the NuGet packages to the bin folder

.NET Core projects do not put the reference DLLs from the NuGet packages in the bin folder. Is there a property (or any other way) that helps in doing that?
This is needed for some third-party tools that have to see the referenced DLLs.
.NET Core, unlike .NET Framework, can resolve assemblies from half a dozen locations, including the NuGet cache, the servicing cache, the runtime store, the local app directory, and the shared framework folder. During development, dependencies are typically found in the NuGet cache (%USERPROFILE%\.nuget\packages). This makes it unnecessary to copy referenced assemblies to the build output folder (bin) until you publish your application. For more details on how that works, see https://github.com/dotnet/core-setup/blob/master/Documentation/design-docs/corehost.md
You can force the SDK to copy assemblies to your build folder by setting the property below, but it increases disk use and build time.
<CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
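For context, that property just sits in a PropertyGroup of the project file (a sketch; the target framework shown is only an example):
<PropertyGroup>
  <TargetFramework>netcoreapp2.1</TargetFramework>
  <!-- copy the assemblies resolved from the NuGet cache into bin\ on every build,
       so tools that only look at the output folder can see them -->
  <CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
</PropertyGroup>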
Alternatively, you can use the deps.json and runtimeconfig.json files to locate the required assemblies.

Reference third-party class libraries

I am working with .NET Core 1.0 (running under the .NET Framework 4.6.1, non-portable).
I need to include some DLLs from a locally built GitHub project. When I build those projects and then attempt to "Add Reference" to the resulting DLLs, I get a message saying I can't add them to a Core project directly.
After more research, I found a lot of information regarding "private" NuGet packages. However, those seem overly complex / over-engineered.
Is there any way I can do the following:
Without having to go through the headache of creating a private NuGet repository, can I just "Add Reference" to the built assemblies that are sitting in the bin folders of the NuGet projects I pulled?
I really don't want to have to build a local-only NuGet package, mostly because I've already wasted too much time on this issue, and because I read that this entire concept is about to be scrapped and turned into something else (sounds familiar by now)... such as the Roslyn-based build system on GitHub.
My current state:
Visual Studio Professional 2015
.Net Core 1.0.1
.Net Core 1.0.1 Tooling Preview 2
No, for now you have to create a NuGet package first and restore it via NuGet. You can use a simple folder as a NuGet source, so if you put your compiled NuGet package in C:\packages, you can add that folder as a source in NuGet (in the NuGet UI, click the settings icon and add the folder as a new source).
This may change with the next release of ASP.NET Core (1.1), as the .NET/ASP.NET Core team is working to move from *.xproj to *.csproj files.
One of the reasons you need to use NuGet is that a package can contain multiple targets, and project.json allows you to target multiple platforms (i.e. net452 and netcoreapp1.0).
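If you prefer not to click through the UI, the same folder source can be declared in a NuGet.config next to the solution (a sketch; C:\packages is the folder from the answer above, and the source name is arbitrary):
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- a plain folder containing the .nupkg files built from the GitHub projects -->
    <add key="local-packages" value="C:\packages" />
  </packageSources>
</configuration>
The package itself can be produced with dotnet pack (or nuget pack) against the class library before dropping the resulting .nupkg into that folder.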

Interop with unmanaged code in ASP.net vNext

What's the story going to be (if any) around interop with unmanaged code for ASP.net vNext / Core CLR?
The key bits (DllImport and friends) appear to be present to allow for unmanaged code interop, but how would things such as packaging and deployment work in this context? The basic build artifact in vNext / CoreFX no longer appears to be an assembly but a NuGet package. So, in that case, how would we make the new project.json system work so that the unmanaged DLLs we're P/Invoking into are also included in the resulting NuGet package?
Or am I talking about scenarios that have not been considered yet (or, more disappointingly, are not going to happen)?
The story is yet to be fully fleshed out, but there are already examples of how to do this. Ultimately we (the Microsoft teams working on this) are working on some scenarios to enable NuGet packages to better support native content.
To see one example of this, the Kestrel web server has some of its own managed code, plus it includes libuv in its NuGet package for an efficient async IO implementation that is cross-platform.
Because there isn't yet a built-in general solution in NuGet, the build scripts for Kestrel use some custom actions to include the native content in the NuGet package. Then to load libuv there's some code that dynamically figures out which native libuv to load based on the environment in which it is running.
So, yeah, it's a bit messy, but it does work, and this is definitely high on the team's priority list to improve.
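To make the managed side concrete, here is a minimal sketch; the library name and exported function are hypothetical, and the matching native binary for each platform still has to be shipped with the app, which is exactly the packaging problem described above:
using System;
using System.Runtime.InteropServices;

internal static class NativeMethods
{
    // Hypothetical native library: the loader probes for "nativecalc.dll" on
    // Windows and "libnativecalc.so" on Linux, so the right binary for the
    // current platform must have been deployed alongside the application.
    [DllImport("nativecalc")]
    internal static extern int add_numbers(int a, int b);
}

internal static class Program
{
    private static void Main()
    {
        Console.WriteLine(NativeMethods.add_numbers(2, 3));
    }
}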