Background:
We have several ASP.NET projects that share a common Core project. Static content from Core is copied to all the other projects. We've added TypeScript to all our projects.
Here is how the TypeScript build looks in the csproj:
<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)\TypeScript\Microsoft.TypeScript.Default.props" />
When TypeScript files are compiled, all referenced files are compiled too. Some TypeScript files reference files from the Core project, so files from Core are sometimes compiled many times (whenever several files from other projects reference them).
Simple example:
Core.csproj
-> Common.ts
A.csproj
-> ScriptA.ts
B.csproj
-> ScriptB.ts
ScriptA.ts:
/// <reference path="../Core/Common.ts" />
...
ScriptB.ts:
/// <reference path="../Core/Common.ts" />
...
Building project A or project B causes Common.ts from Core to be compiled as well.
Problem:
It's not a problem in itself that some files are built several times. BUT - if we build projects in parallel (and this is the default VS behaviour!) - the build sometimes crashes with an exception:
[VsTsc] VSTSC error TS5033: Build: Could not write file '...'
The reason is that two or more projects try to build their TypeScript files, including the referenced files from a common project. One project starts compiling a .ts file to a .js file and locks the .js file; another project then tries to lock the same file and crashes.
So, the question is: how do we avoid these parallel builds/locks? The referenced project must already be compiled, so perhaps there is a way to tell the TypeScript compiler not to build files from other projects?
I handle this by shipping shared libraries as a NuGet package. Not only does this make your builds independent, it lets you control your dependencies (i.e. you can upgrade when you decide to, not just because someone edited a shared file).
I suggest you treat project Common as a library, not as a collection of individual files.
When you refer to Common using the line /// <reference path="../Core/Common.ts" />, this pulls Common.ts -- and possibly the files it references -- into projects A and B, duplicating it and compiling it several times.
Instead, what you need is
/// <reference path="../Core/Common.d.ts" />
That is, you use only the declarations from the Common project. Declarations are not generated by default; check the "Generate declaration files" option on the TypeScript page of the project configuration, or, if you prefer, manually add the line
<TypeScriptGeneratesDeclarations>True</TypeScriptGeneratesDeclarations>
in your .csproj file. A small downside is that you have to load more than one .js file into your page or node.js project. However, it is a small price to pay for the ability to compose your code into modules. For example, one day you might want to load both A.js and B.js into the same page, and you would have had a mess, because the copies of Common.ts would clash and override each other. Referencing by type declarations solves this problem, along with your specific problem of build breaks.
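If it helps, here is a minimal sketch of the Core.csproj side (where exactly the PropertyGroup sits is an assumption about your project layout):

<PropertyGroup>
  <!-- emit Common.d.ts next to the compiled Common.js -->
  <TypeScriptGeneratesDeclarations>True</TypeScriptGeneratesDeclarations>
</PropertyGroup>

Projects A and B then reference ../Core/Common.d.ts as shown above, and load Core's compiled Common.js at runtime alongside their own scripts.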
Related
We have an internal JavaScript library that we'd like to share between multiple projects. Actually we are already sharing it via file copying, but this has (predictably) resulted in multiple forks of the code.
The consuming projects are a mix of "full" ASP.NET (MVC and Web Forms) and ASP.NET Core MVC. (I'm planning on creating two separate packages.)
Installing into ASP.NET projects seems to work fine, but I'm having problems with ASP.NET Core.
Initially I had all the artifacts within a files element, and nothing at all was showing up in the consuming project. After re-reading the docs, I realized that ASP.NET Core projects would use a PackageReference ... so I would have to use a contentFiles element instead of (or in addition to) a files element.
I created a contentFiles folder and a script to copy the requisite files from the source project folder structure into contentFiles/any/any/wwwroot/lib/ourAwesomeWidget, and modified the package manifest accordingly.
This works. Sort of. The package appears to get built correctly. The files do get added to the consuming project, but they get added as links; the actual files (the link targets) reside in my local package cache.
The relevant portion of the package manifest is:
<metadata minClientVersion="3.3">
  ...
  <contentFiles>
    <files include="**/*" buildAction="Content" copyToOutput="true" flatten="false" />
  </contentFiles>
</metadata>
<files>
  <file src="contentFiles\**" target="contentFiles" />
</files>
Part of the issue is that I don't find the docs very clear concerning contentFiles. All the examples show a single file element ... but the include attribute on the files element is required, so it's not clear what the individual file elements would even do.
Is there a way to get the actual files (not links) added to the consuming project? Or, alternatively, is there a way to get the package to install as a "normal" package (rather than a PackageReference)?
Update:
I did some further digging and found this answer by @Martin to a similar question -- but he answered this one before I had a chance to update it.
It appears this behavior (adding files as links) is by design.
I find this highly unsatisfactory because, as @Martin points out, our JavaScript library will not be available during development in consuming projects.
But part 2 of my question still stands. According to the docs,
By default, PackageReference is used for .NET Core projects, .NET Standard projects, and UWP projects targeting Windows 10 Build 15063 (Creators Update) and later.
Is there a way to trigger the non-default behavior, i.e. allow .NET Core projects to consume packages other than via PackageReference?
contentFiles are supposed to be added as links. The contentFiles section controls the MSBuild items that are generated for these files in the obj\projectname.csproj.nuget.g.props file.
The copyToOutput="true" will cause the items to be copied to the output and publish directory. However that does not help you when running the application during development, since it will be run from the project directory, not the output directory.
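To illustrate why the files show up as links (this is a hand-written approximation, not the exact output NuGet generates, and widget.js is a placeholder file name), the generated .props file contains Content items along these lines, with Include pointing into the package cache and Link controlling where the file appears in the project tree:

<ItemGroup>
  <Content Include="$(NuGetPackageRoot)ourawesomewidget\1.0.0\contentFiles\any\any\wwwroot\lib\ourAwesomeWidget\widget.js">
    <Link>wwwroot\lib\ourAwesomeWidget\widget.js</Link>
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>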
Consider consuming client libraries via npm (since bower is deprecated).
I have a solution that consists of 10 projects. Each project has a test assembly (making 20 projects in total).
Currently, my build script builds all the test assemblies, then runs all the tests, great.
Except that each test assembly references 2 or more of the core assemblies (directly and indirectly), which means there is lots of redundant building going on.
How can I simplify things (without reducing the number of assemblies) to speed up the build?
I guess I could build each project directly without resolving the inter-project references and put everything in a single output directory, but how do I still resolve the references projects have to 3rd-party DLLs etc.?
Other suggestions?
Thanks.
I am working on a tool to automate the build process. It is still in development and it's open source; here is the link:
https://github.com/jupaol/NCastor
To speed up your build, you could try building your projects in parallel:
http://geekswithblogs.net/deadlydog/archive/2012/03/30/parallel-msbuild-ftwndashbuild-faster-in-parallel.aspx
http://www.hanselman.com/blog/FasterBuildsWithMSBuildUsingParallelBuildsAndMulticoreCPUs.aspx
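In short, parallel building comes down to passing /m (or /maxcpucount) to msbuild.exe and setting BuildInParallel on the MSBuild task. A sketch, where @(ProjectsToBuild) is a hypothetical item group holding your project files:

<MSBuild Projects="@(ProjectsToBuild)"
         BuildInParallel="true"
         Properties="Configuration=$(Configuration);Platform=$(Platform)" />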
To force MSBuild to use a single output directory:
<!-- declared at project level (outside any target) -->
<PropertyGroup>
  <BuildProperties>
    Configuration=$(Configuration);
    Platform=$(Platform);
    OutputPath=$(BuildingPath);
    $(BuildProperties);
  </BuildProperties>
</PropertyGroup>

<!-- invoked from inside a target -->
<MSBuild Projects="$(FullSolutionFilePath)" Properties="$(BuildProperties)" Targets="Rebuild" />
Can you build the referenced assemblies first, copy them to a "Common" folder, and have the "Common" folder assemblies referenced in the using projects as "Referenced Libraries"?
We do this with our CompanyName.Enterprise libraries and it works fine. They get built once or twice a year and the projects using them build daily.
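For example, the consuming .csproj would swap its ProjectReference for a plain file reference (the assembly name and relative path below are placeholders):

<ItemGroup>
  <Reference Include="CompanyName.Enterprise.Core">
    <HintPath>..\Common\CompanyName.Enterprise.Core.dll</HintPath>
    <!-- copy the dll to the consuming project's output folder -->
    <Private>True</Private>
  </Reference>
</ItemGroup>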
I'm not able to build my solution incrementally. I checked the diagnostic log and found that every project containing workflows is always rebuilt because of this:
Input file ".NETFramework,Version=v3.5" does not exist.
Workflows are always recompiled, new temporary files are created, and the project is built again.
Building target "WorkflowCompilation" completely.
Input file ".NETFramework,Version=v3.5" does not exist.
Using "CompileWorkflowTask" task from assembly "System.Workflow.ComponentModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35".
Task "CompileWorkflowTask"
No files found with '.xoml' extension in the set of input files.
Generated temporary code file: C:\Users\Ludwo\AppData\Local\Temp\uwdnm5th.cs
Workflow markup validations completed with 0 errors and 0 warnings.
Done executing task "CompileWorkflowTask".
Done building target "WorkflowCompilation" in project "Delta.Workflow.Common.Merged.csproj".
Target "CoreCompile" in file "C:\Windows\Microsoft.NET\Framework\v4.0.30319\Microsoft.CSharp.targets" from project "h:\Prj\R4x\M\CountrySystems\Delta\Common\Delta.Workflow.Common\Delta.Workflow.Common.Merged.csproj" (target "Compile" depends on it):
Building target "CoreCompile" completely.
Input file "C:\Users\Ludwo\AppData\Local\Temp\uwdnm5th.cs" is newer than output file "obj\Debug\Delta.Workflow.Common.pdb".
I'm building my projects using MSBuild 4.0. My projects are set to build with TargetFrameworkVersion v3.5; the unit test projects are built with TargetFrameworkVersion set to v4.0. I tried building on a different PC, but the result is the same. I also played with the references in my projects. It looks like a v4.0/v3.5 conflict, but I don't know how to fix it. Any ideas?
I found it. The root cause was the wrong version of the Workflow.targets file being imported in my workflow (.csproj) projects: Workflow.targets for .NET v4.0 was imported instead of v3.5. It is probably related to the upgrade of the projects from VS2008 to VS2010 that I did some time ago.
I changed Workflow.targets Import from
<Import Project="$(MSBuildToolsPath)\Workflow.targets"/>
to
<Import Project="$(MSBuildExtensionsPath)\Microsoft\Windows Workflow Foundation\v3.5\Workflow.targets" />
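If your solution mixes target frameworks, an alternative (an untested sketch) is to make the import conditional on the project's framework version:

<Import Project="$(MSBuildExtensionsPath)\Microsoft\Windows Workflow Foundation\v3.5\Workflow.targets" Condition="'$(TargetFrameworkVersion)' == 'v3.5'" />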
Hope it helps someone...
I need to collect into a single folder all test assemblies, with their dependencies, and configuration files. The process should preserve the directory structure from the output of each test project. We have a solution that requires manually attaching test projects to a master project, but our solution has far too many projects for this to be maintainable. These should be located automatically based on naming convention (x.UnitTest.csproj, y.IntegrationTest.csproj).
For background, we are working with a build system that passes artifacts (binaries, etc) between agents. We are compiling on one agent, and testing on other agents. The massive duplication of assemblies between test projects is slowing the build process down.
What I have done:
1) I have a csproj that references most of the test projects. This gets binaries and dependencies into one folder.
2) I am able to identify all the files to copy using this:
<CreateItem Include="%(ProjectReference.RootDir)%(ProjectReference.Directory)$(OutDir)*.config">
  <!-- collect the *.config files from each referenced project's output folder -->
  <Output TaskParameter="Include" ItemName="TestConfigurationFiles" />
</CreateItem>
<Copy SourceFiles="@(TestConfigurationFiles)" DestinationFolder="$(OutDir)" />
I've attempted the most obvious things, such as:
- The MSBuild task's RebaseOutputs attribute and overriding the OutDir property. I can provide the MSBuild task with a dynamically generated set of projects, but can only build them in their default folders.
- Hooking into the TargetOutputs of the MSBuild task, which gives only the primary output assembly (without dependencies).
- "Copy Always" for configuration files. This puts them in the output directory of the dependent project as "app.config", not "dllname.config", and not in the final project.
Solutions that could make this better might include:
- An example of adding to the ProjectReference item array dynamically, before compilation.
- Using the MSBuild task's TargetOutputs to create a list of all files in each output folder (instead of just the primary output) and copying them to a destination folder - see the sketch after this list.
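A rough sketch of that second idea (untested; @(TestProjects), $(TestDropDir), and the target name are hypothetical, and the pattern works in MSBuild 3.5):

<Target Name="CollectTestOutputs">
  <!-- build the test projects first; TargetOutputs alone yields only the primary assemblies -->
  <MSBuild Projects="@(TestProjects)" Targets="Build">
    <Output TaskParameter="TargetOutputs" ItemName="PrimaryTestAssemblies" />
  </MSBuild>
  <!-- so glob each project's entire output folder instead -->
  <CreateItem Include="%(TestProjects.RootDir)%(TestProjects.Directory)$(OutDir)**\*.*">
    <Output TaskParameter="Include" ItemName="AllTestOutputs" />
  </CreateItem>
  <!-- %(RecursiveDir) preserves the directory structure below each output folder -->
  <Copy SourceFiles="@(AllTestOutputs)" DestinationFolder="$(TestDropDir)\%(RecursiveDir)" />
</Target>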
Today I'm using MSBuild 3.5. Ideally the solution would work with MSBuild 3.5. We are transitioning to .NET 4 / MSBuild 4 soon, so if it must be done in .NET 4, that is fine.
Have you considered flattening the folder structure when you export your artifacts?
Something like:
src/*.UnitTest*/bin/**/*.* -> /testlibs
src/*.IntegrationTest*/bin/**/*.* -> /testlibs
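If you would rather do this with MSBuild itself than with the build server's artifact rules, a small convention-based target could look like this (a sketch; $(SourceRoot) and the testlibs location are assumptions, the naming convention is the one from the question):

<Target Name="ExportTestLibs">
  <!-- pick up the output of every project matching the test naming convention -->
  <CreateItem Include="$(SourceRoot)\src\*.UnitTest*\bin\**\*.*;$(SourceRoot)\src\*.IntegrationTest*\bin\**\*.*">
    <Output TaskParameter="Include" ItemName="TestLibFiles" />
  </CreateItem>
  <Copy SourceFiles="@(TestLibFiles)" DestinationFolder="$(SourceRoot)\testlibs\%(RecursiveDir)" />
</Target>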
The situation:
2 Team Projects.
Developers of Team Project A added project references to Team Project B's projects.
To speed up the build, I want to replace the project references with direct references to the DLLs.
My Idea:
in the csproj of Team Project A:
<ProjectReference Condition="'$(IsDesktopBuild)' == 'true'" Include="[Project Reference]">...
in the TFSBuild.proj:
<AdditionalReferencePath Include="[buildoutputOfTeamProjectB]" />
OR
Disable SolutionToBuild and use the csproj files directly.
Thanks for your suggestions.
I would suggest that each project have a dependencies folder containing the DLLs it requires. When a project that is depended upon is built, it would be up to your build process (CruiseControl/NAnt/MSBuild?) to update - or not update - the DLL in the dependent projects' dependencies folders. I would also give some consideration to versioning the deployed DLLs, in case an update breaks a dependent project's usage of them. It would suck for someone to update the depended-upon project, kick off a build, and deploy their build output to the dependent project, only to break the project that relies on their code. Relying on always-latest binaries is a fragile way of managing dependencies.
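As a sketch of the hand-off (the paths are hypothetical, and AfterBuild is the classic extension point that works in older MSBuild versions):

<Target Name="AfterBuild">
  <!-- push the freshly built assembly into the dependent project's dependencies folder -->
  <Copy SourceFiles="$(TargetPath)" DestinationFolder="$(SolutionDir)..\TeamProjectA\dependencies" />
</Target>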