The package is installed on local machines, with a reference to the packages folder added in the project. Now, if I publish to the server, it causes a problem because Glimpse is not installed on the server. Please guide me on the best way to install it on the server.
Glimpse just needs its DLLs to be in the bin folder on the server.
Some deployment techniques (like on Azure Websites) leverage NuGet package restore to download dependencies and build your site right on the server. With these techniques you don't have to do anything.
For simpler techniques, like xcopy or FTP of files, just make sure to include the DLLs - they are already copied to the bin folder when you build your site.
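For example, a plain file-copy deployment might look something like this (just a sketch; the local path and server share are hypothetical):
rem copy the whole built site, including bin\ where the Glimpse DLLs land, to the web server
xcopy /E /I /Y C:\Projects\MySite \\webserver\wwwroot\MySite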
Short Question
Is there a way to programmatically download all of the NuGet packages needed to build/publish a solution and output them into a single directory that can be moved to a local NuGet repository?
Some Background
Where I work, most development is done on an air-gapped network (no internet). On a recent project, I was able to develop on our internet-enabled network. This happens to be the first application developed on the internet-enabled network and the first ASP.NET Core app we've ever developed. The solution builds, runs, and publishes just fine on the internet-facing network. I am now trying to move the solution to the air-gapped network, but I am having issues getting all of the dependencies moved over.
At first, the solution wouldn't build because of missing ASP.NET Core NuGet packages, so I copied ALL of the NuGet packages from the local cache on the machine I used to develop the application to the local NuGet repository on the air-gapped network. Now the application builds, but I can't seem to publish the web project (ASP.NET Core). I'm getting 25+ errors along the lines of:
Unable to find package runtime.any.System.Diagnostics.Tools. No packages exist with this id in source(s)...
Unable to find Microsoft.NETCore.App with version (>= 2.1.6)...
Unable to find...
I'm also getting a runtime error ("This page cannot be displayed") when I try to run from Visual Studio (using IIS Express), but I'm not sure if that's a related issue. The unit/integration tests run fine.
I could try manually downloading each NuGet package from NuGet.org and moving them over to the air-gapped network, but it takes hours to get things moved from one network to another. Is there any way I can automate the retrieval of all the NuGet packages required to build/publish a solution, so that I can make a single transfer from one network to the other instead of moving what I have and waiting to see what breaks? Preferably, I'd like an exe or a PowerShell script that could look at a .sln file and drop all the necessary NuGet packages into a specified directory.
On your internet-enabled dev environment, you can browse to the root directory of your project and use:
dotnet restore --packages .\packages\
This will use the directory called 'packages' as the local cache for the solution, and all NuGet packages will be copied into it.
You can then include the same switch in your build, to ensure that the local cache is used.
dotnet build --packages .\packages\
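On the air-gapped side, the transferred folder can then be used as the only package source. A sketch, run from the solution root (folder name matches the one above):
dotnet restore --source .\packages\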
I answered two very similar questions a few weeks ago. Have a look at these answers to see if they help:
Command to download a Nuget package with all dependencies to a folder
Is it possible to create a cache of nuget packages for computers without internet?
I have a .NET Standard project in a solution and I want MSBuild to build it on our build server. If I do not run "nuget restore" first, I get the error "'project.assets.json' not found. Run a NuGet package restore". I already have all of the necessary NuGet packages in the "packages" folder at the solution level. I would like to indicate somehow that MSBuild should just use the files in the local "packages" folder instead of trying to re-download them. How can I do this? Thank you.
.NET Standard projects do not use the packages folder in the solution. This is the "old" way of referencing NuGet packages through packages.config. The new way - PackageReference items in the project - uses a shared global packages folder in the user's home directory. The project.assets.json needs to be regenerated on the build machine with the resolved paths to this shared cache even if no packages need to be downloaded. In fact, a NuGet restore might not even need to hit the network if all packages are already on the machine.
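As a rough sketch of what that looks like on a build server (MySolution.sln is a placeholder), the restore target is run before the build:
msbuild MySolution.sln /t:Restore
msbuild MySolution.sln /t:Build
On MSBuild 15.5 or later, msbuild MySolution.sln -restore runs the restore and the build in a single invocation.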
The best solution I have come up with still involves a NuGet restore, but I added the local packages folder as one of the NuGet repositories in nuget.config and now I don't have to reach out to any remote repositories. This is not quite what I wanted, but it is allowing me to progress.
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="local" value=".\packages" />
  </packageSources>
</configuration>
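With that saved as a nuget.config next to the solution, a restore on the build server resolves everything from the local folder; nuget.exe picks up the config automatically, or you can point at it explicitly (MySolution.sln is a placeholder):
nuget restore MySolution.sln -ConfigFile .\nuget.config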
I've just published (via GitHub) a VB.NET Azure Website that works fine on local machines but not on Azure:
Compiler Error Message: BC30451: 'Newtonsoft' is not declared. It may be inaccessible due to its protection level.
Dim category As Category = Newtonsoft.Json.JsonConvert.DeserializeObject(Of Category)(json)
The Newtonsoft.Json package is installed via NuGet: Newtonsoft.Json.5.0.5.
It's the only 3rd party dll in the project right now.
I used the Azure ftp access to browse to /site/wwwroot/ and noticed that there is no /bin directory.
Now, my .gitignore excludes [Bb]in and [Oo]bj folders, but it's the same .gitignore I've used successfully with C# projects, and I always assumed that Azure just fetches the missing NuGet dlls from /packages.
This is my first VB.NET > GitHub > Azure Websites deployment. What have I missed?
edit: I can confirm that if I upload /Bin/Newtonsoft.Json.dll via Azure ftp the site works. Or at least it will until it's re-imaged...
Sounds like you've not enabled "NuGet Package Restore".
Doing so creates a .nuget folder at the root of your solution and a packages.config file in the application; the build process in Azure will pick these up and load the required references.
Right click on the solution and it should be an option in there.
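For reference, a rough sketch of what the old MSBuild-integrated restore adds to each project file (here a .vbproj) once the option is enabled - the import is what lets the Azure build fetch missing packages before compiling:
<PropertyGroup>
  <RestorePackages>true</RestorePackages>
</PropertyGroup>
<Import Project="$(SolutionDir)\.nuget\NuGet.targets" Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />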
Background
I have an MVC3 project to which I added the Twitter.Bootstrap.Less Nuget package.
To my knowledge, all this did was copy the JavaScript / LESS files into their appropriate directories.
However, when I now run this through my build/deploy process onto my dev server, MSBuild doesn't copy the /Content/less folder to my production server as part of the deployment package.
My build server doesn't have an internet connection, so unfortunately using NuGet without committing packages to source control isn't an option.
Question
How do I get MSBuild to deploy these files? Or do I need to copy the files, uninstall the NuGet package, and manually copy them back in?
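As background, the web deployment package only includes files the project file marks as Content, so one common fix is to make sure the LESS files are included that way. A minimal sketch (the path is assumed from the question):
<ItemGroup>
  <Content Include="Content\less\**\*.less" />
</ItemGroup>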
You could set up your own internal NuGet package source. That would be internal to your network and mean the build server could pick up the NuGet packages without an internet connection.
E.g. on the server, copy the packages you need to a folder and set up a package feed to that folder for the build server to use.
See:
http://docs.nuget.org/docs/creating-packages/hosting-your-own-nuget-feeds
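A plain folder or share is enough for such a feed. A sketch, assuming the .nupkg files are copied to \\buildserver\NuGetPackages (the share, feed name, and solution file are placeholders):
nuget sources add -Name InternalFeed -Source \\buildserver\NuGetPackages
nuget restore MySolution.sln -Source \\buildserver\NuGetPackages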
We have recently added AppFabric as a caching option in our project (Windows Server version, not Azure cloud). The project thus now has a dependency on Microsoft.ApplicationServer.Caching.Client.dll and .Core.dll.
Our build server is Windows 2003, so AppFabric Cache cannot be installed onto it, and these assemblies are not available.
Short of including the aforementioned assemblies as explicit binaries in our SVN repository and referencing them directly (yuck), are there any suggestions as to how to build the project?
Secondly, if we checked in the binaries, performed the build then deleted them from the output folder, is there a way to force .NET to search the %windir%\system32\AppFabric\ folder for the assemblies?
MS do not register them in the GAC...
If your application is using an assembly that can't be installed on your build server, then the solution is to upgrade your build server to Windows 2008.
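On the second question (getting the runtime to load the assemblies from %windir%\system32\AppFabric\), one config-only mechanism is a codeBase hint in the application's .config file. This is only a rough sketch: the version and publicKeyToken below are placeholders that must be checked against the installed AppFabric client DLLs, and a codeBase outside the application base only works for strong-named assemblies.
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- placeholder version/token: verify against the actual Microsoft.ApplicationServer.Caching.Client.dll -->
        <assemblyIdentity name="Microsoft.ApplicationServer.Caching.Client"
                          publicKeyToken="0000000000000000" culture="neutral" />
        <codeBase version="1.0.0.0"
                  href="file:///C:/Windows/System32/AppFabric/Microsoft.ApplicationServer.Caching.Client.dll" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>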