At the company I work for, we do the following when we need references to third-party DLLs in our projects:
- Use NuGet to get the package
- Pull the DLLs out into a "lib" folder and add the references from there
- Add the lib folder to Git so other team members have all references when they pull
- Reference the DLLs stored in the lib folder in our project (see the sketch below)
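For context, such a reference in the .csproj typically looks something like the following; the assembly name and relative path are placeholders for illustration:
<ItemGroup>
  <!-- Hypothetical third-party assembly checked into the lib folder -->
  <Reference Include="ThirdParty.Library">
    <HintPath>..\lib\ThirdParty.Library.dll</HintPath>
  </Reference>
</ItemGroup>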
We do this to have full control and know exactly what references we are using.
My question is: how is this achieved when using vNext, and can we continue to do it this way?
I have watched "INTRODUCING: The Future of .NET on the Server", and it seems you list all dependencies in the project.json file; when you run k restore it downloads everything based on the feeds in the NuGet config file.
You'll make use of the project.json file. As you mentioned, you list all your dependencies there, and the K Package Manager will deal with resolving the missing packages for you.
You'll notice that in the JSON file you specify each package as a key-value pair of package:version. Most examples show a version of *, which means "get me the latest", but there's nothing stopping you from specifying a specific version, or a specific part of a version. For instance, the project.json file in the Autofac container of the DI project specifies a specific version of Autofac:
"dependencies": {
"Autofac": "3.3.0",
"Microsoft.Framework.DependencyInjection": ""
},
The main DI project specifies a sort-of-specific version of Microsoft.Framework.ConfigurationModel:
"dependencies": {
"Microsoft.Framework.ConfigurationModel": "1.0.0-*"
},
That says: get me the most recent build of 1.0.0.
This system allows you to automatically get the latest and greatest if you want, but also to pin a specific version for safety. There's no reason to copy DLLs into a custom lib folder.
EDIT: You inspired me to blog about it: http://davidzych.com/2014/08/13/specifying-package-dependency-versions-in-asp-net-vnext/
Just noting that "-*" does not necessarily return the latest version. In my simple testing, it always returned the lowest available version. Per this documentation, the calculation is more complex and returns the lowest version that "works".
EDIT: added link to documentation
I have a repository that contains two solutions. One solution (solution A) contains a web project that references a project in the second solution (solution B) in the same repository.
When I build the web project in VSTS, I pull the repository, build solution B, and then build solution A.
Building solution B works, but the build of solution A fails because the referenced DLL from the project in solution B cannot be found.
You have a few options:
1) Use project references. You don't need to depend on an assembly.
2) Use NuGet packages -- the shared piece is built via a CI process, turned into a NuGet package, and then published to a Packages feed. The dependent projects can reference the NuGet package and restore an appropriate version on build.
Which approach you should take depends on a lot of factors. If you're not worried about versioning, just use project references.
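For option 1, a project reference in the web project's .csproj looks roughly like the following; the relative path and project name are assumptions about your repository layout (SDK-style projects need only the Include attribute, older-style projects typically also carry a Project GUID and Name):
<ItemGroup>
  <!-- Hypothetical relative path from the web project to the shared project in solution B -->
  <ProjectReference Include="..\..\SolutionB\SharedLibrary\SharedLibrary.csproj" />
</ItemGroup>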
As Daniel said, it's better to use NuGet packages.
If you want to reference the assembly file directly, follow these steps:
- Open your web project file in a text editor such as Notepad.
- Find the related reference and check the HintPath value; it should be a relative path.
- Add a Copy Files task to your build definition (before the task that builds solution A) to copy the corresponding assembly files into the folder that relative path points at (see the sketch below).
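For example, after step 2 the reference in the web project file might look like this (the assembly name and folder are placeholders); the Copy Files task then has to drop the DLL into the folder that this relative path points at before solution A builds:
<Reference Include="SolutionB.SharedLibrary">
  <!-- Hypothetical relative path; the Copy Files task must place the DLL here -->
  <HintPath>..\lib\SolutionB.SharedLibrary.dll</HintPath>
</Reference>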
I have a CMake project to build multiple shared libraries and tools, most of these under subdirectories:
add_subdirectory(libFirst)
add_subdirectory(libSecond)
add_subdirectory(myTool)
# etc...
The install(TARGET "someTarget" COMPONENT "someTarget" ...) rules are in the respective subdirectory/CMakeLists.txt files.
I would like to generate Debian packages for all of these using a make package command from the build directory. I have CPACK_DEB_COMPONENT_INSTALL set to ON.
The problem I'm facing is that not all of the targets have the same VERSION and/or SOVERSION. For example, libFirst is at version 1.0.0.0 and libSecond is at version 4.3.0.0. This means that the generated packages should also have different versions, but the only way I've found to specify the version is through the CPACK_PACKAGE_VERSION_MAJOR, CPACK_PACKAGE_VERSION_MINOR and CPACK_PACKAGE_VERSION_PATCH variables (and perhaps the internal CPACK_PACKAGE_VERSION variable), which set the version for all generated packages.
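For reference, this is roughly what that global setting looks like today (the version numbers are just taken from the example above); it applies to every component package at once, which is exactly the limitation:
# Sets one version for every package CPack generates; there is no obvious per-component equivalent
set(CPACK_PACKAGE_VERSION_MAJOR 1)
set(CPACK_PACKAGE_VERSION_MINOR 0)
set(CPACK_PACKAGE_VERSION_PATCH 0)
set(CPACK_DEB_COMPONENT_INSTALL ON)
include(CPack)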
Is there a way to set package versions per-component, for example by setting some variables similarly to the other CPACK_COMPONENT_<COMPONENT>_* or CPACK_DEBIAN_<COMPONENT>_* variables?
I don't think this is possible at the moment, but I have created a merge request (https://gitlab.kitware.com/cmake/cmake/merge_requests/2305) which provides this functionality.
I hope it will get approved, but in the meantime you can locally change your CPackDeb.cmake as shown in this diff: https://gitlab.kitware.com/cmake/cmake/merge_requests/2305/diffs. The default location of that file is /usr/share/cmake/Modules/CPackDeb.cmake.
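Once the change is available, I would expect per-component versions to be set following the existing CPACK_DEBIAN_<COMPONENT>_* naming pattern, along these lines; the exact variable name below is an assumption, so check the merged documentation:
# Hypothetical per-component overrides, assuming the merge request keeps the
# usual per-component naming scheme (component names upper-cased)
set(CPACK_DEBIAN_LIBFIRST_PACKAGE_VERSION "1.0.0")
set(CPACK_DEBIAN_LIBSECOND_PACKAGE_VERSION "4.3.0")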
I have a solution with the following projects:
MySolution.sln
- MySolution.Client.csproj
- MySolution.Service.csproj
- MySolution.Models.csproj
- MySolution.Server.xproj
MySolution.Models is a simple class library which contains shared code that is referenced by MySolution.Client and MySolution.Service - and I would like to reference it in MySolution.Server.
The GUI in VS 2015 RC1 lets me add the reference by right-clicking References -> Add Reference. I then see all my projects under Projects -> Solution.
I select MySolution.Models and click OK, after which I receive the following error in the output log:
Errors in ...PathToSolution\MySolution.Server\project.json
Unable to locate MySolution.Models >= 1.0.0-*
It really feels like this should work, since the GUI allows me to add the reference without any hiccups.
So the first thing to understand is that DNX projects have no understanding of traditional .NET projects. They don't read or parse csproj files. This is done to keep them cross-platform and cross-IDE compatible (csproj is a distinctly Windows- and VS-specific thing).
When you add a reference to a "legacy" project (I use legacy to mean a .NET 4.x csproj-based project), behind the scenes the IDE runs dnu wrap, but it looks like in your case something broke.
The following should be done automatically.
- In the solution-root global.json, a "wrap" folder should be added to the projects property.
- A folder named "wrap" will be created off the root if it doesn't already exist.
- A /wrap/project.json will be created or updated with a path to the assembly (DLL).
- A reference to the assembly and version is added to the referencing project's project.json file.
So the first thing to check is that you have a "wrap" folder and a wrap entry in the projects property of global.json. If you don't, then likely something "broke". Try removing the reference, rebuilding, and adding the reference back. Check the build output window for any errors (VS is still RC, so there are some errors that probably should halt the build but don't).
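The solution-level global.json should end up looking something like this (the "src" entry is just the default folder name from the templates, and any sdk section your solution already has stays as-is):
{
    "projects": [ "src", "wrap" ]
}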
Look for a project.json in the wrap folder. It should look something like this:
{
    "version": "1.0.0-*",
    "frameworks": {
        "net452": {
            "wrappedProject": "../../LegacyClassLibrary/LegacyClassLibrary.csproj",
            "bin": {
                "assembly": "../../LegacyClassLibrary/obj/{configuration}/LegacyClassLibrary.dll",
                "pdb": "../../LegacyClassLibrary/obj/{configuration}/LegacyClassLibrary.pdb"
            }
        }
    }
}
Note the framework version. If there is a mismatch, resolving the dependencies will fail. For example, if your MySolution.Models targets .NET 4.6, and thus when wrapped has a dnx46 framework reference, but your MySolution.Server project targets dnx452 (in the project.json for MySolution.Server), then it will fail when resolving the dependency on MySolution.Models.
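As an illustration, the relevant part of MySolution.Server's project.json would need to target a framework compatible with the wrapped assembly; a minimal sketch, assuming you move the Server project to dnx46 to match a .NET 4.6 Models build (your real file will list more frameworks and dependencies):
{
    "frameworks": {
        "dnx46": { }
    }
}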
The error you quoted could probably be improved. It means that it could not resolve the dependency, due to one of the following reasons:
1) It could not find a MySolution.Models assembly (either source code or a compiled DLL) based on the paths it uses (starting from the projects property in global.json).
2) It found a MySolution.Models assembly (either source code or compiled) BUT it was an invalid version. Check the version in the Models project against the reference in the Server project.json.
3) It found a MySolution.Models assembly but it can't resolve the framework dependencies (i.e. Models requires dnx46 but Server only targets dnx452).
In my experience the third one is the most common. For the DNX templates in VS 2015 RC, the default full framework being targeted is dnx452 (or is it dnx451?). New csproj projects will be 4.6 (dnx46) by default, and existing projects could be just about anything.
An alternative solution:
I have found the following alternative to result in easier dependency management. If MySolution.Models will only be used by DNX projects, then just convert it to a DNX project, move it into the source folder, and reference it directly. It will be part of the source compilation and you gain the benefits of dynamic compilation.
If MySolution.Models will be referenced by both DNX and legacy (csproj) projects, then you can create side-by-side xproj and project.json files for Models. They will be ignored by the legacy project. In essence you have both a legacy and a DNX project using the same source files. You can then, just as above, reference it directly. Keep in mind the folder structure: if the Models folder is not under /src (and it probably isn't if this was an existing project), then you will either need to move it or add a reference to its folder in global.json. That sounds more confusing than it really is. Just keep in mind that for a DNX project, global.json defines the relative paths where DNX can find source code. DNX can also resolve dependencies via NuGet or by searching the GAC, but that is beyond what you are trying to do.
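As a sketch of that last point, if the shared code sits in a sibling folder (here hypothetically named "shared"), global.json would list that folder as well; note the entries are folders that contain project folders, not the project folders themselves:
{
    "projects": [ "src", "shared" ]
}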
Q: Is it possible/feasible to have multiple solutions stored in a single 'Solutions' directory, and multiple NuGet packages stored in another single 'Packages' directory, and for everything to work nicely with different versions?
Further details...
For example: I have 2 projects. ProjectA requires Newtonsoft.Json 4.5.11, and ProjectB requires Newtonsoft.Json 5.0.6.
For the sake of example I have a solution file for both. I need all my solution files in the same directory (this is just the process that is followed: all the solutions in a directory are built in turn).
By default NuGet will create a packages directory alongside each solution file.
I have created a nuget.config file to allow me to store packages in a single directory, called 'SharedPackages', following this answer: Nu-Get & issue with project level dependences for projects referenced by multiple solutions
<settings>
  <repositoryPath>..\SharedPackages</repositoryPath>
</settings>
This works great so far, so my structure is:
\Projects\ProjectA
\Projects\ProjectB
\Solutions
\SharedPackages
If I create ProjectB, it has Json.NET 4.5.11 by default. If I go to Manage NuGet Packages for Solution I have the option to update it to version 5.0.6. This is great as ProjectB needs the newer version. What is even better is now in my Shared Packages directory I have a directory for both versions of Json.NET side-by-side, so ProjectA can use the older version.
However, now I want to create ProjectC as a full MVC4 Web Application. For JQuery, you get version 1.8.2 currently when creating an ASP.NET MVC4 application in VS2012. I also get Knockout 2.2.0.
My process is, I delete the default packages directory, move the new solution to the Solutions directory alongside the existing nuget.config and edit the new solution file to update the relative path to the new .csproj file. Then when I build, NuGet Package Manager restores the extra packages I need (that weren't in use by ProjectA and ProjectB) to the Shared Packages directory. However... I get build errors, it cannot resolve some references including DotNetOpenAuth, WebGrease, System.Spatial... the references are pointing to the packages directory, not the SharedPackages directory...
As an aside: if I Enable Package Restore for solution, then it also tries to restore them to a packages folder within the Solutions directory by default, instead of restoring them to the SharedPackages directory.
Around about this point I realise that just creating the nuget.config file wasn't enough for ProjectA and ProjectB either; although they appeared to be working originally, the references in the .csproj file are pointing to the bin folder beneath the project file, instead of my SharedPackages directory.
So I manually 'Find and Replace' ..\packages with ..\..\SharedPackages for all the references. I have to do this for ProjectA, ProjectB and ProjectC. Now everything builds and seems to work OK, and new packages go into the right place.
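To make the edit concrete, each reference's HintPath changes along these lines (the package name, version and framework folder are placeholders):
<!-- before -->
<HintPath>..\packages\SomePackage.1.2.3\lib\net45\SomePackage.dll</HintPath>
<!-- after -->
<HintPath>..\..\SharedPackages\SomePackage.1.2.3\lib\net45\SomePackage.dll</HintPath>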
Now, if I go back to ProjectA, and add the Knockout package, this is version 2.3.0. This installs happily alongside the other Knockout package in use by Project C which is version 2.2.0. Doing this also installs JQuery 2.0.3, alongside JQuery 1.8.2. So far so good.
Just as a sanity check, I create another Web Application - ProjectD, move the files around, and update the references in the solution and the project. This time everything builds first time. I try to update WebGrease in ProjectD to see if it will retain the older version for ProjectC. This results in more issues: it installs it to the packages directory instead. WebGrease seems to have a separate config setting as well (<WebGreaseLibPath>...); it won't seem to restore...
I then go back to ProjectB and try 'Update All' - it looks like the files that already exist are updated in SharedPackages, with new version directories alongside the existing ones, but any new dependencies (e.g. now I have a reference to Owin.dll) get placed in the packages folder :( If I delete the packages folder, and the bin folder within ProjectB, then build the ProjectB solution, understandably I get build errors, the packages aren't automatically restored to the SharedPackages directory at any point.
Is it even possible to set NuGet to update packages in a common directory other than packages alongside the solution?
Would it be easier to just use the default packages folder, instead of SharedPackages, or would I still have problems?
This is turning into way too many questions. To try and keep it in scope, has anyone attempted a similar setup, what obstacles did they overcome and how did they manage it, or did they give up altogether? If you gave up, how did you end up using NuGet to manage packages in a massive code base?
I appreciate this is close to this question, which was well answered for that particular question, however the use case here is slightly different: NuGet and multiple solutions. It is also pretty much identical to this question: Setting up a common nuget packages folder for all solutions when some projects are included in multiple solutions, but I have decided to add this anyway as that question is more focused on the having different configurations for different solutions, whereas here I want all the packages in one place, I just want to implement it and see if it is possible. Also I think the troubleshooting and research time may be useful to someone.
Often times a developer on my team will create a new Visual Studio project and reference a DLL somewhere on their local machine (e.g., C:\mydlls\homersimpson\test.dll). Then, when I get the project from the source control repository, I cannot build the project because I do not have the referenced dll in the exact same location on my machine.
What is the best practice for storing and referencing shared libraries?
I typically create a lib folder in my project where I put the referenced DLLs. Then I point the reference to the DLL in the lib folder. This way, every developer can build the project after retrieving it from source control.
If it's a project that was built in house, you could also add that project to your solution.
If the assembly is not in the GAC, create a directory called dependencies and add all assemblies there. The folder and the assemblies are added to source control. The rule is that given any project in source control, all that is required to build is to do a checkout and build the project (or run some tool that is also checked into the project).
If you add a folder to the solution and add the assemblies to the solution folder, this also provides a visual cue to the devs that indicates what external dependencies are present... all dependencies are in that directory. Relative paths ensure that Visual Studio can locate the references without a problem.
For large solutions, with 20+ projects, this makes life much easier!
Best practice, I would expect, would have your source control repository include and enforce the relative locations of referenced objects for you (usually via a shared path), so you aren't dealing with this issue directly. The original developer should check in this information.
If you check in the actual DLLs into source control, then you can reference them by relative path and all developers will automatically get any dependencies when they next update the project.
Adding a DLL reference by full path would be a developer error just as adding a source file by full path would be an error.
Rule of thumb: if the project isn't part of the solution, reference released DLLs from a source-controlled /binshare or /lib directory that is under your solution's source control tree. All external dependencies should have versioned DLLs that go in this /binshare directory.
I understand what your co-worker is doing with regard to convenience. However, that developer's approach is diametrically opposed to proper configuration/build management.
Example: If you use the MS Data Application Block as a dependency in your application, you should reference a properly released binary, instead of getting latest from MS's dev source trunk.
I think this approach is quite the opposite of what I would consider best practice. I think it would be a much better approach to keep the third-party binaries out of the source repository and reference them through something like a Maven repository in your build process. Putting the DLLs in the source repository unnecessarily bloats the repository and makes getting the projects take considerably longer than necessary. It also obfuscates the independent management of the third-party binaries' versions, because the version is not referenced by name but only implied by the particular DLL stored in the project's lib folder.
Why not set up a private NuGet feed? This way, there is only a single copy of a dependency (in the NuGet repository) and multiple projects can reference it. Multiple versions of the dependency can coexist, and each project can reference a different version if necessary. Also, TFS Build can restore the packages at build time.
Configuring VS: https://www.visualstudio.com/en-us/docs/package/nuget/consume
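As a sketch, registering such a feed in a NuGet.config is just an extra package source; the feed name and URL below are placeholders for your internal server:
<configuration>
  <packageSources>
    <!-- Hypothetical internal feed -->
    <add key="InternalFeed" value="https://your-server/nuget/v3/index.json" />
  </packageSources>
</configuration>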