Latest:
This is definitely a bug in MSBuild; there cannot be any other explanation. It could be happening only on Linux, or possibly on a wider range of platforms.
So I decided to build just one project, with absolutely no dependencies on any other project in the solution.
Looking at the captured diagnostics, I see these lines which are very promising:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
Input files: Annotations.cs;Auth.cs;AuthorizationConfig.cs;Backend.cs;Billing.cs;Code.cs;...
Output files: .obj/TheAgent.dll;.obj/TheAgent.pdb
Set Property: NoWarn=;1701;1702
15:23:27.396 1>Done building target "CoreCompile" in project "TheAgent.csproj".: (TargetId:40)
It looks like my DLL and my PDB weren't rebuilt, which is what I expected.
However, something must be happening before or after that changes the timestamp (to this build's time rather than the last build's).
The timestamp of the DLL is updated both in the intermediate object folder (.obj/) and in the output folder.
Is there a known way of stopping MSBuild right after its CoreCompile target?
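I haven't found a supported switch for that, so as a stopgap here is a sketch I could drop into the .csproj just to log the output timestamp the instant CoreCompile finishes (the target name is mine; IntermediateAssembly is the standard item pointing at the assembly under .obj/, and ModifiedTime is well-known item metadata):
<Target Name="LogTimestampAfterCoreCompile" AfterTargets="CoreCompile">
  <!-- Print the intermediate assembly path and its timestamp right after CoreCompile -->
  <Message Importance="high"
           Text="After CoreCompile: @(IntermediateAssembly->'%(FullPath) modified %(ModifiedTime)')" />
</Target>
If the timestamp is still old at this point but new by the end of the build, then whatever touches the file runs after CoreCompile.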
Update:
I decided to search for "is newer" this time and found instances like the one below. I don't know how these relate to the solution/project files though:
Input file ".obj/Common.csproj.CoreCompileInputs.cache" is newer than output file ".obj/Common.pdb".
Further to the above, I came across this:
https://github.com/dotnet/project-system/issues/4736
Thinking that this was the issue, I upgraded to dotnet sdk version 2.2.402.
The end result is still the same :(
Original:
I need some pointers on how to troubleshoot this issue. I am using /t:build to build a solution file.
The resulting executable keeps getting refreshed each time.
First I thought the package restore was causing this. I removed that step, however it didn't make a difference.
Then I looked at this:
https://oz-code.com/blog/visual-studio-keeps-rebuilding-projects-no-good-reason/
I'm basically looking for some text in the diagnostics output which tells me that a target or a file is out of date and needs to be rebuilt. The above link talks about "project 'B' is not up to date". I don't have any "not up to date" text in my MSBuild output.
I already had two resources set to CopyAlways, which I changed to CopyIfNewer.
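For reference, in the project file that change amounts to switching CopyToOutputDirectory from Always to PreserveNewest on those items (the item path below is a placeholder):
<ItemGroup>
  <None Include="Resources\settings.xml">
    <!-- Always = "Copy always"; PreserveNewest = "Copy if newer" -->
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>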
The above article also talks about circular dependencies. I am checking everything manually. And yes, the references to dependent projects are actually references to the project outputs (DLLs/EXEs), so finding a circular dependency just by looking for that pattern seems a little odd.
There were more problems in the dotnet platform and/or MSBuild causing this to fail.
One of those was this https://github.com/dotnet/project-system/issues/4736
Installing SDK 3.0.100-preview7-012821 or later solved the problem.
Related
Let's make an obscure question simple...
We have a solution which consists of many projects, and some of them have Custom Build events set up, using third-party tools for some dark-magic compilation, that look similar to this:
<CustomBuild Include="..\folder\somestuff.xyz">
  <FileType>Document</FileType>
  <Command Condition="'$(Configuration)|$(Platform)'=='Release|x64'">C:\Tcl\bin\tclsh.exe $(APP_PATH)\modules\APP\bin\generator.tcl -o %(RelativeDir)%(Filename) %(RelativeDir)%(Filename).xyz</Command>
  <Message Condition="'$(Configuration)|$(Platform)'=='Release|x64'">APPGEN %(RelativeDir)%(Filename)</Message>
  <Outputs Condition="'$(Configuration)|$(Platform)'=='Release|x64'">%(RelativeDir)%(Filename).cpp;%(RelativeDir)%(Filename).h;%(RelativeDir)%(Outputs)</Outputs>
</CustomBuild>
This was working properly until we switched from VS2015 to VS2019; now during compilation it reports:
Project is not up-to-date: build output 'd:\projects\program\app\src\plugins\shared\' is missing. This would be more or less OK, but it forces the compiler to also recompile the dependencies of this project, and this starts to be really annoying as you need to rebuild several projects every time even when no changes were made.
I found out that the problem originates from this line of Custom build:
<Outputs Condition="'$(Configuration)|$(Platform)'=='Release|x64'">%(RelativeDir)%(Filename).cpp;%(RelativeDir)%(Filename).h;%(RelativeDir)%(Outputs)</Outputs>
More precisely, from this part: %(RelativeDir)%(Outputs), since the checks for the .cpp and .h files in the same tag do not generate any issues. So I removed this directory check. With that chunk removed, the project compiles properly and does not re-compile all day long.
So why does the Custom Build's Outputs check now work properly only with files, while directories generate this kind of issue?
And yes, the examined dir exists and it refers to the existing correct path.
The real problem is that your project is always rebuilt because of the Outputs metadata.
The key point is that you have to make sure the value of Outputs is valid.
The problem is the %(RelativeDir) in %(RelativeDir)%(Outputs). When you add it, the Outputs list ends with a folder path rather than a file, so the up-to-date check always sees that folder entry as a missing output, which causes the project to always rebuild.
Let me describe it in detail.
When MSBuild reads the Outputs metadata, after it has read %(RelativeDir)%(Filename).cpp;%(RelativeDir)%(Filename).h;%(RelativeDir), the value of Outputs is this:
..\folder\somestuff.cpp;..\folder\somestuff.h;..\folder\
Then it reads %(Outputs) (it reads itself), which effectively appends the value above once more:
..\folder\somestuff.cpp;..\folder\somestuff.h;..\folder\..\folder\somestuff.cpp;..\folder\somestuff.h;..\folder\
You will find that the last part, ..\folder\, is not a file; it is a folder path, which is illegal for Outputs.
That is the reason.
And that matches your problem: the folder d:\projects\program\app\src\plugins\shared\ is reported as missing.
Suggestion
So you should not append the outputs again:
<Outputs Condition="'$(Configuration)|$(Platform)'=='Release|x64'">%(RelativeDir)%(Filename).cpp;%(RelativeDir)%(Filename).h;</Outputs>
I'm a developer on a big system (>100 projects in the solution, >100,000 LOC, >10 services, ...) and in the past I built the installer for this system with WiX, and it worked fine. Now I need a way to patch (Minor Upgrade) parts of the system and have run into several issues.
My current WiX setup is as follows:
I have VS2010, the WiX 3.6 toolset, and TFS2012 to build the whole thing and get an installer.
I'm using a Setup Library Project Type per Service
I'm using exactly one Setup Project to bundle things together and get one installer for the whole system.
It's not possible to change this setup.
The Setup Library Projects are set up as follows:
I use the HeatDirectory MSBuild task to generate the components and files, and I'm using preprocessor variables to modify the file paths (a sketch of such a harvesting step follows below).
I need to modify the file paths because it must be possible to build the installer both on a local developer system and on the TFS build system, which has a different folder structure.
TFS always uses the same directory to compile subsequent versions of the software and moves the output to a unique folder structure after a successful compilation.
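To illustrate, this is roughly the shape of the harvesting step in such a setup library project (a sketch from memory of the WiX 3.x targets; the property, group, and file names are placeholders and the exact attribute names should be verified):
<Target Name="Harvest" BeforeTargets="BeforeBuild">
  <!-- $(ServiceBinDir) is defined differently on a developer box and on the TFS build machine -->
  <HeatDirectory Directory="$(ServiceBinDir)"
                 PreprocessorVariable="var.ServiceBinDir"
                 ComponentGroupName="MyServiceComponents"
                 DirectoryRefId="INSTALLFOLDER"
                 AutogenerateGuids="true"
                 SuppressRootDirectory="true"
                 ToolPath="$(WixToolPath)"
                 OutputFile="MyServiceFiles.wxs" />
</Target>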
Now I need a patch.
I created the Patch.wxs and ran candle and light on it. I ran torch to get the difference file. And finally I want to create the patch with pyro.
Everything worked fine with a simple test project, but on the big system pyro has the problem that it can't find the files to install.
Through my setup (see above), I must use preprocessor variables and end up with fully qualified paths in my WiX output (for example: C:\builds\myproduct\product.exe as the file source). After the TFS output has been moved to another location this path is no longer valid. I tried to use the -bt and -bu switches for pyro, but these only work for relative paths or for named bindpaths.
Now I wanted to change my WiX project setup to use named bindpaths rather than preprocessor variables, but it seems that this is not possible.
heat can only use preprocessor variables or WiX variables; it seems not to be possible to use bindpath variables. heat provides a switch, -wixvar, which should create binder variables instead of preprocessor variables, but it does exactly nothing.
Next I tried using no WiX and no preprocessor variables in heat, and telling light via the -bu and -bt switches where to find the files. But if I do not set a preprocessor variable, the resulting file paths look like Sources\product.exe, and I can't get rid of this Sources. I know that I can transform all the XML with XSLT and remove the Sources, but that's a workaround which I would only implement if no other solution is possible. It would also mean that there is a problem in the WiX toolchain.
It looks like pyro only supports bindpath variables and heat only supports preprocessor and WiX variables. This seems really crazy, because how are they supposed to work together?
How can I create a patch if I use lit, light, candle, heat, torch and pyro, the original build paths have changed (which is very common on a build system), and the file paths are created with heat and are therefore fixed, preprocessor, or WiX variables?
As you've found, heat wasn't designed to be used in the patching scenario. It was only in recent versions of the WiX toolset that the generated GUIDs got to a point where there was even a chance that heat could successfully build output that would be patchable. We still need to do work there to make patching work well where heat is used.
Ultimately, I believe the answer is to simplify the "original source" problem. It is challenging to get all the bindpaths set up correctly and that makes patching, which is a hard problem, even harder. We've kicked around a few ideas but nothing has come together yet.
You could always use admin image based patching. It's slower but can be easier to get the "original source" and "target" laid out. That path does lose filtering though.
Basically, we need to do a bit more work in patching scenarios to make it much easier.
PS: "Source" in the path for a File/#Source attribute is an alias for the "default bindpath". You can use bindpaths there.
First, I have a base assumption from watching Visual Studio compile things with its default .*proj files that, if you build the same solution twice in a row, it detects that nothing has changed and seems to fly through the solution build. Does this mean it knows that nothing was changed in a project and doesn't have to make a new DLL output?
If that's the case, I have a question. Say I have a solution with multiple class libraries, and an MSBuild task in each project that automatically increments the build's version by modifying AssemblyInfo.cs. Thing is (if my previous assumption is correct) it does this every time and triggers a new rebuild of each class library. Is there a target or property in MSBuild that can tell whether the project needs recompilation, so I can skip my versioning step when it doesn't?
I ask because let's say I update project A, but not project B in a solution. If I run a build on the solution, I want it to update the version on project A, but since project B hasn't changed, I want to leave it alone.
Found something: http://msdn.microsoft.com/en-us/library/ms171483.aspx
MSBuild can compare the timestamps of the input files with the timestamps of the output files and determine whether to skip, build, or partially rebuild a target. In the following example, if any file in the @(CSFile) item collection is newer than the hello.exe file, MSBuild will run the target; otherwise it will be skipped:
<Target Name="Build" Inputs="@(CSFile)" Outputs="hello.exe">
  <Csc Sources="@(CSFile)" OutputAssembly="hello.exe" />
</Target>
...that worked. But then it got me thinking: what if someone pulls the code down from source control without the assemblies (which is how we do it)? Since there is no output to compare against, it will compile and increment the version anyway. I think the complexities might lead me to abandon this approach.
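For what it's worth, a minimal sketch of how that same Inputs/Outputs trick can wrap the versioning step (the target name and the placeholder step are mine; BeforeTargets assumes MSBuild 4.0+): the version bump only runs when a source file is newer than the built assembly, otherwise MSBuild skips the whole target.
<Target Name="IncrementVersion"
        BeforeTargets="CoreCompile"
        Inputs="@(Compile)"
        Outputs="$(TargetPath)">
  <!-- Skipped entirely when $(TargetPath) is newer than every file in @(Compile) -->
  <Message Importance="high" Text="Bumping version for $(MSBuildProjectName)" />
  <!-- the actual AssemblyInfo.cs rewrite would go here -->
</Target>
One wrinkle: the rewrite itself touches AssemblyInfo.cs, which is one of the inputs, so the target fires again on the next build unless AssemblyInfo.cs is excluded from the Inputs list.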
It doesn't really matter if you increment on a developer's box - what's important is that your daily/CI build is only incremented when needed. So, what I've done in the past is have a small XML file contain the next build number, and have an MSBuild task take this XML file and create a file called Version.cs (containing the versioning attributes you'd usually find in AssemblyInfo.cs).
Version.cs is never checked into your source control - it's generated by the build.
Developers will sync the current XML file, build their binaries, and get the current version number. The continuous integration build may also do the same thing. But a daily/official build will check out the XML file, increment the version information, and then check it in. From that moment on, the version number has officially changed.
There are variations on this theme, but the general idea works.
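A minimal sketch of that idea (the file names, the XML schema, and the hook point are assumptions, and XmlPeek needs MSBuild 4.0+):
<Target Name="GenerateVersionCs" BeforeTargets="BeforeCompile">
  <!-- Read the next build number from the checked-in XML file -->
  <XmlPeek XmlInputPath="NextVersion.xml" Query="/Version/text()">
    <Output TaskParameter="Result" PropertyName="NextVersion" />
  </XmlPeek>
  <ItemGroup>
    <VersionLine Include="using System.Reflection%3B" />
    <VersionLine Include="[assembly: AssemblyVersion(&quot;$(NextVersion)&quot;)]" />
    <VersionLine Include="[assembly: AssemblyFileVersion(&quot;$(NextVersion)&quot;)]" />
  </ItemGroup>
  <!-- Version.cs is regenerated on every build and never checked in -->
  <WriteLinesToFile File="Properties\Version.cs" Lines="@(VersionLine)" Overwrite="true" />
</Target>
Each project still lists Version.cs as a Compile item; only its contents are machine-generated.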
I want to remove AssemblyInfo.cpp, because of some metadata errors that sometimes come up.
Is AssemblyInfo.cpp useful for anything? Or can it be removed without any problem?
I've discovered one distinction for this file: it has to do with values reported under calls to Assembly.GetReferencedAssemblies. I was working on tracking version numbers of our binaries from our SVN repository by embedding the revision numbers into them. Initially, I too was updating AssemblyInfo.cpp and found nothing reported in the file property details tab for the binary. It seemed this file did nothing for me in terms of updating those details, which was not the case with similar updates to a csproj's AssemblyInfo.cs. Why the difference, right?
Now in one such csproj we happen to reference a vcxproj and that csproj dumps to a log the versions of all its referenced assemblies using the .NET Assembly.GetReferencedAssemblies method. What I discovered was that the number that was being reported in that log was not the vcxproj's version as given by the VS_VERSIONINFO resource I added (which does get the version details into the file properties details tab). Instead the number reported was actually matching that defined in the AssemblyInfo.cpp.
So for vcxproj files it looks like VS_VERSIONINFO is capable of updating the contents you find under the file properties details tab but AssemblyInfo.cpp is capable of exposing the version to GetReferencedAssemblies. In C# these two areas of reporting seem to be unified. Maybe there's a way to direct AssemblyInfo.cpp to propagate into the file details in some fashion, but what I'm going to wind up doing is duplicating the build info to both locations in a prebuild step. Maybe someone can find a better approach.
So far I have never had an AssemblyInfo.cpp in my managed C++ DLLs, so I don't think it is necessary.
(I just added the file to have version information for my C++ DLLs.)
Why not just fix the errors? On that note, what errors are you getting?
This file provides information such as a version number which is definitely needed in order to use the assembly you have built.
Is there a mode, a switch, or a programmatic way that I can ask MSBuild to display or output its calculated dependencies for a given build file?
Some background -
I have a large project that requires splitting up to speed up the build time, and I want to move the slow-changing infrastructure code into its own release area. Not all of the information is contained in the build file itself, as some sub-projects are referenced by their vcproj or csproj files.
I'd really like to see what MSBuild thinks needs doing (either worst-case [rebuild all], or perhaps for a make-style incremental build) without actually doing the rebuild.
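The closest cheap trick I could come up with myself is a target that just prints each project's project-to-project references (a sketch; the target name is mine), dropped into a common .targets file that every project imports and run with /t:ShowReferences:
<Target Name="ShowReferences">
  <!-- Lists the ProjectReference items MSBuild sees for this project -->
  <Message Importance="high"
           Text="$(MSBuildProjectName) -> @(ProjectReference->'%(Filename)', ', ')" />
</Target>
That shows the declared dependency edges, but not what MSBuild would actually rebuild, which is why I'm asking.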
The MSBuild Profiler project should be able to help you in seeing where time is being taken on the build. It doesn't directly show dependencies. With or without build dependencies, just profiling the builds can probably give some insight and help speed up the process.
I did just come across an application, though I have not used it myself yet: Dependency Visualizer, which looks to have just added support for MSBuild-compatible project files. There have also been posts about doing this previously, but no code (see A, B).
Whilst I asked the original question quite a long time ago, I have moved on in jobs and surprisingly encountered the same need. In this case I was more successful in my pursuit of a tool and discovered Microsoft Build Sidekick which offers:
view
edit
build
debug
of Microsoft Visual Studio® 2005, 2008 and 2010 project files.
As well as debugging and logging features I haven't yet used, it has a diagramming mode where you can select the "Target" and it shows all of the dependent Targets and steps within them. Apparently this diagram can be viewed when stepping through the build process (debugging)!