Is AssemblyInfo.cpp necessary? - c++-cli

I want to remove AssemblyInfo.cpp because of some metadata errors that sometimes come up.
Is AssemblyInfo.cpp useful for anything, or can it be removed without any problems?

I've discovered one distinction for this file: it affects the values reported by calls to Assembly.GetReferencedAssemblies. I was tracking the version numbers of our binaries from our SVN repository by embedding the revision numbers into them. Initially I too was updating AssemblyInfo.cpp, and found nothing reported in the file-properties Details tab for the binary. The file seemed to do nothing for me in terms of updating those details, which was not the case with similar updates to a csproj's AssemblyInfo.cs. Why the difference?
Now, one such csproj happens to reference a vcxproj, and that csproj dumps to a log the versions of all its referenced assemblies using the .NET Assembly.GetReferencedAssemblies method. What I discovered was that the number reported in that log was not the vcxproj's version as given by the VS_VERSIONINFO resource I added (which does get the version details into the file-properties Details tab). Instead, the number reported matched the one defined in AssemblyInfo.cpp.
So for vcxproj files it looks like VS_VERSIONINFO populates what you see under the file-properties Details tab, while AssemblyInfo.cpp is what exposes the version to GetReferencedAssemblies. In C# these two areas of reporting seem to be unified. Maybe there's a way to make AssemblyInfo.cpp propagate into the file details somehow, but what I'm going to end up doing is duplicating the build info to both locations in a prebuild step. Maybe someone can find a better approach.
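For reference, the managed half of that duplication is just the usual assembly attributes in AssemblyInfo.cpp. A minimal sketch (the version value is a placeholder for whatever the prebuild step injects):

// AssemblyInfo.cpp (sketch): these attributes are what
// Assembly.GetReferencedAssemblies reports; the VS_VERSIONINFO resource in the
// .rc file is what feeds the file-properties Details tab, so a prebuild step
// has to write the revision number into both places.
using namespace System::Reflection;

[assembly:AssemblyVersionAttribute("1.0.123.0")];      // placeholder revision
[assembly:AssemblyFileVersionAttribute("1.0.123.0")];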

So far I have never had an AssemblyInfo.cpp in my managed C++ DLLs, so I don't think it is necessary.
(I just added the file to have version information for my C++ DLLs.)

Why not just fix the errors? On that note, what errors are you getting?
This file provides information such as the version number, which is definitely needed in order to use the assembly you have built.


My VS project keeps getting rebuilt using msbuild

Latest:
This is definitely a bug in msbuild; I can't see any other explanation. It may only be happening on Linux, or possibly more widely.
So I decided to build just one project, with absolutely no dependencies on other projects in the solution.
Looking at the captured diagnostics, I see these lines which are very promising:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
Input files: Annotations.cs;Auth.cs;AuthorizationConfig.cs;Backend.cs;Billing.cs;Code.cs;...
Output files: .obj/TheAgent.dll;.obj/TheAgent.pdb
Set Property: NoWarn=;1701;1702
15:23:27.396 1>Done building target "CoreCompile" in project "TheAgent.csproj".: (TargetId:40)
It looks like my dll and my pdb weren't rebuilt, which is what I expected.
However, something before or after must be causing the timestamp to change (to this build's time rather than the last one's).
The timestamp of the dll is updated both in the intermediate object folder (.obj/) and in the output folder.
Is there a known way of stopping msbuild right after its CoreCompile task?
Update:
I decided to search for "is newer" this time and found instances of these. I don't know how they got into the solution/project files though:
Input file ".obj/Common.csproj.CoreCompileInputs.cache" is newer than output file ".obj/Common.pdb".
Further to the above, I came across this:
https://github.com/dotnet/project-system/issues/4736
Thinking that this was the issue, I upgraded to dotnet sdk version 2.2.402.
The end result is still the same :(
Original:
I need some pointers on how to troubleshoot this issue. I am using /t:build to build a solution file.
The resulting executable keeps getting refreshed each time.
First I thought the package restore was causing this. I removed that step; however, it didn't make a difference.
Then I looked at this:
https://oz-code.com/blog/visual-studio-keeps-rebuilding-projects-no-good-reason/
I'm basically looking for some text in the diagnostics output which tells me whether a target or a file is out of date and needs to be rebuilt. The above link talks about "project 'B' is not up to date". I don't have a "not up to date" anywhere in my msbuild output.
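For reference, this is roughly how I'm capturing and searching the log (the solution name is just a placeholder):
msbuild MySolution.sln /t:Build /v:diag > build.log
grep -iE "is newer than|not up to date" build.log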
I already had two resources with CopyAlways which I changed to CopyIfNewer.
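(In the csproj those settings look roughly like this; "Settings.xml" is just an example item name.)
<ItemGroup>
  <!-- "Copy always" (Always) re-copies the file on every build, refreshing its
       timestamp; "Copy if newer" (PreserveNewest) only copies when the source
       file actually changed. -->
  <Content Include="Settings.xml">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>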
The above article also talks about circular dependencies. I am checking everything manually, and yes, the references to dependent projects are actually references to the project outputs (dll's/exe's). So finding a circular dependency by just checking for that pattern seems a little odd.
There were more problems in the dotnet platform and/or msbuild causing this to fail.
One of those was this: https://github.com/dotnet/project-system/issues/4736
Installing SDK 3.0.100-preview7-012821 or later solved the problem.
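For anyone who needs the build to pick up a specific SDK while testing this, a global.json next to the solution should pin it (a sketch; use whichever version works for you):
{
  "sdk": {
    "version": "3.0.100-preview7-012821"
  }
}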

C#, Gendarme, Sonar and Jenkins : Exclude generated files from Gendarme

I'm working with Gendarme for .NET, called by Sonar (launched by Jenkins).
I have a lot of AvoidVisibleFieldsRule violations. The main violations are found in the generated files. As I can't do anything about those, I would like to exclude *.designer.cs from the scan.
I can't find a way to do that. There is a property in Sonar to exclude generated files, but it doesn't seem to be applied for Gendarme.
Is there a way to do such a thing?
Thanks for all
Gendarme expects you to provide an ignore list; see:
http://www.mono-project.com/Gendarme.FAQ
https://github.com/mono/mono-tools/blob/master/gendarme/self-test.ignore
The ignore file format is a bit weird, but you can learn it by experimenting.
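Roughly (treat the exact syntax as an assumption and check the links above), a rule is named on an "R:" line and the assemblies/types/methods to skip it on follow on "A:"/"T:"/"M:" lines, for example (placeholder names):
R: Gendarme.Rules.Design.AvoidVisibleFieldsRule
T: MyApp.MainForm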
Indeed, that is not normal at all. Generated code is excluded by the plugin with the standard configuration. What version of the C# plugins are you using?
Anyway, the configuration property you can try is "sonar.exclusions" (see http://docs.codehaus.org/display/SONAR/Advanced+parameters).
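For example, in sonar-project.properties (the pattern is just a guess at what matches your generated files):
sonar.exclusions=**/*.designer.cs,**/*.Designer.cs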
If you do not solve your problem right away, the best thing would be to drop a mail to the user mailing list (see http://www.sonarsource.org/support/support/) and send the verbose output of your build. To get this output, simply add "-X" to the command line.
Hope it helps.

Xcode search paths for public headers in dependencies

I am trying to clean up some of my projects, and one of the things puzzling me is how to deal with header files in static libraries that I have added as "project dependencies" (by adding the project file itself). The basic structure is like this:
MyProject.xcodeproj
  Contrib
    thirdPartyLibrary.xcodeproj
  Classes
    MyClass1.h
    MyClass1.m
    ...
Now, the dependencies are all set up and built correctly, but how can I specify the public headers for thirdPartyLibrary.xcodeproj so that they are on the search path when building MyProject.xcodeproj? Right now, I have hard-coded the include directory in thirdPartyLibrary.xcodeproj, but obviously this is clumsy and non-portable. I assume that, since the headers are public and already built to some temporary location in ~/Library (where the .a file goes as well), there is a neat way to reference this directory. Only... how? An hour of Googling turned up blank, so any help is greatly appreciated!
If I understand correctly, I believe you want to add a path containing $(BUILT_PRODUCTS_DIR) to the HEADER_SEARCH_PATHS in your projects build settings.
As an example, I took an existing iOS project which contains a static library included just in the way you describe, and set the library's header files to public. I also noted that PUBLIC_HEADERS_FOLDER_PATH for this project was set to "/usr/local/include", and these files are copied to $(BUILT_PRODUCTS_DIR)/usr/local/include when the parent project builds the dependent project. So the solution was to add $(BUILT_PRODUCTS_DIR)/usr/local/include to HEADER_SEARCH_PATHS in my project's build settings.
HEADER_SEARCH_PATHS = $(BUILT_PRODUCTS_DIR)/usr/local/include
Your situation may be slightly different, but the exact path you're looking for can probably be found in Xcode's build settings. You may also find it helpful to add a Run Script build phase to your target and note the values of various settings at build time with something like:
echo "BUILT_PRODUCTS_DIR " $BUILT_PRODUCTS_DIR
echo "HEADER_SEARCH_PATHS " $HEADER_SEARCH_PATHS
echo "PUBLIC_HEADERS_FOLDER_PATH " $PUBLIC_HEADERS_FOLDER_PATH
# ...and so on for any other settings of interest
I think that your solution is sufficient and generally accepted. One alternative would be to have all header files located under an umbrella directory that describes the interface for using the depended-on libraries, and put that in your include path; I see this as being similar to /usr/include. Another alternative, which I have never personally tried but think would work, would be to create references to all the headers of thirdPartyLibrary from MyProject so that they appear to be members of MyProject. You would do this by dragging them from some location into MyProject and then deselecting the checkbox that says to copy them into the project's top-level directory. From one perspective this seems feasible to me because it is as if you are explicitly declaring that your project depends on those specific classes, but is not directly responsible for compiling them.
One thing to be wary of when addressing this issue is depending on implementation-specific details of Xcode to locate libraries automatically. Doing so may seem innocuous in the meantime, but the workflows Xcode uses to build projects are subject to change with updates and could potentially break your project in subtle and confusing ways. If they are not well defined in some documentation, I would treat any effect as coincidental and not worth leveraging in your project when you can enforce the desired behavior by some other means. In the end, you may have to define a convention that you follow, or find one that you adopt from someone else. By doing so, you can rest assured that if your solution is documented and reproducible, any developer (including yourself in the future) can pick it up and proceed without tripping over it, and that it will stand the test of time.
The way we do it is to go into build target settings for the main project and add:
User Header Search Path = "Contrib"
and check that it searches recursively. We don't see performance problems with searching recursively even with many (10-15 in some projects) dependencies.

MSBuild - Assemblies differ slightly after each clean+build

I'm trying to work with an existing home-grown implementation of click-once. Currently we manually update the manifest for assemblies that we actually changed. I'm attempting to make it automatic based on a binary comparison of the existing assemblies and the newly built assemblies. Unfortunately, it seems that each time I run clean + build (automated build script) there are small differences in the assemblies, essentially invalidating our click-once solution altogether. I'm guessing that these differences are caused by some sort of guid generation or something along those lines. Is there any way to prevent the differences in the assemblies?
And unfortunately, due to our branching/CI strategy I don't have the option of not cleaning because each release is from a new branch.
Otherwise, any suggestions on how I can compare two assemblies to see if any code has changed, without having access to the source code, would be appreciated.
Thanks,
David
Typically, autobuild systems check the filesystem timestamps of the binary vs. the source files (or object files vs. source files, depending on the language). If the source is newer than the binary/object, a rebuild is triggered. This strategy may work better for you than actually diffing binaries.
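MSBuild itself exposes that strategy through a target's Inputs/Outputs attributes, so a manifest-update step could be skipped whenever no input is newer than its output. A rough sketch (the target name, tool and paths are all made up, not part of your build):
<Target Name="UpdateClickOnceManifest"
        Inputs="@(Compile)"
        Outputs="$(OutDir)MyApp.exe.manifest">
  <!-- MSBuild skips this target when the manifest is newer than every input,
       i.e. a pure timestamp comparison rather than a binary diff -->
  <Exec Command="UpdateManifest.exe &quot;$(OutDir)MyApp.exe.manifest&quot;" />
</Target>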
I found BitDiffer, a tool from www.BitWidgets.com, that compares what has changed in an assembly. While this runs slower than a binary comparison, it removes the need to have MSBuild create an identical assembly.
Thanks,
David

WIX MSBuild automation help - solution best practices

I know there are many questions out there regarding this same information. I have read them all, but my brain is all turned around and I don't know which way to go. Plus the lack of documentation really hurts.
Here is my scenario. We are trying to use WiX to create an installer for the application that goes out to our dealers with our product information. The app includes about 2000 images and documents of our products and a SQL CE database that are updated via Microsoft Sync Framework. The data changes so often that keeping these 2000 files as content files in the app's project is very undesirable. The app relies on .NET Framework 3.5 SP1, SQL Server CE 3.5, Microsoft Sync Framework 1.0 and ADO.NET Sync Services 2.0.
Here are the requirements for the app:
The dealers will be given the app on a CD every year for any updates (app or data updates).
The app must update itself from the internet to get any new images, documents or data.
The prerequisites must be installed if they do not exist on the client machine.
The complete installer should be generated from an MSBuild script with as little human interaction as possible (we don't want to be manually updating the 2000+ file list).
What we have accomplished so far: we have a Votive project in our solution, and we have manually specified the binaries in a .wxs file. We have modified the .wixproj file to use the HeatDirectory task to gather our data (images, documents and the database) from a specified location (this is broken and giving an ICE38 error). This seems all right, but it is still a lot of work: we have to manually update our data by running the program in release mode and copying it to the specified directory.
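For context, the relevant piece of the .wixproj looks roughly like this (a sketch from memory of WiX 3.x; the group/directory names and paths are placeholders, so treat the details as assumptions):
<Target Name="BeforeBuild">
  <!-- harvest the data directory into a component group -->
  <HeatDirectory Directory="..\Data"
                 DirectoryRefId="INSTALLFOLDER"
                 ComponentGroupName="DataFiles"
                 AutogenerateGuids="true"
                 SuppressRootDirectory="true"
                 SuppressFragments="true"
                 PreprocessorVariable="var.DataDir"
                 OutputFile="DataFiles.wxs"
                 ToolPath="$(WixToolPath)" />
</Target>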
I am looking to see what other people would do in this situation.
How would you arrange your solution with regards to the 2000+ data files? Would you create a custom build script that gets the current data from the server or would you include them as content files in the main project?
How would you get WIX to include all of the project output (including the referenced assemblies) and all of the data files? If you have any complete samples, that would be great. All I have found are little clips here and there and not an entire example from start to finish.
How would you deal with the version numbers? Would you put them as a constant in the build script and reference them through $(var.VersionNumberName)? Would you have the version number automatically picked up from the project being deployed? If so, how?
If there is any better information than what I am finding, please include it. I have read numerous articles, blogs, Stack Overflow questions, the tutorial, the wiki, etc. Everything seems to be in bits and pieces. The tutorial is nice, but doesn't explain anything about MSBuild and Votive. I would like to see a start-to-finish tutorial on using MSBuild and Votive and all the WiX MSBuild targets. If no one knows of a tutorial like this I may put one together. I have already spent the entire week gathering info and reading. I'm new to MSBuild as well, so if anyone has any great articles on MSBuild, please include them.
The key is to isolate the different types of complexity into separate merge modules and put them all together into an MSI as part of the build. That way, things that change often can change without impacting things that hardly change at all.
1) For the data files:
We use Paraffin to generate the WiX and hence the merge modules for an html + Flash based help system consisting of thousands of files (I can't convince the customer to go to CHM).
Compile these into a merge module all by themselves.
2) Assemblies: assuming this is a set that changes less often, just make a merge module by hand or with WixEdit, with the correct files and dependencies.
3) For the version number, there are a lot of ways to manage this depending on your build system. The AssemblyInfoTask is a pretty straightforward way to make sure all your assemblies are versioned appropriately. The MSBuild Extension Pack has some versioning stuff if you are using TFS.
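On the $(var.VersionNumberName) part of the question specifically, one way (a sketch; "BuildVersion" and "ProductVersion" are made-up names) is to define a WiX preprocessor constant from an MSBuild property in the .wixproj and reference it in the .wxs:
<!-- .wixproj -->
<PropertyGroup>
  <DefineConstants>$(DefineConstants);ProductVersion=$(BuildVersion)</DefineConstants>
</PropertyGroup>
<!-- Product.wxs -->
<!-- <Product Id="*" Name="DealerCatalog" Version="$(var.ProductVersion)" ...> -->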
I had a similar scenario and was unable to find a drop-in solution, so I ended up with the following:
I wrote a custom command-line program called wixgen.exe for generating wxs manifest files. It is pretty specific to our implementation in that it only knows how to create two types of wxs files: one for IIS website/virtual directory deployments and another for Windows service deployments.
Each time a build is triggered by our continuous integration server, a post-build task runs wixgen with the right args to generate a new manifest.wxs for the project being changed. It automatically includes all the files needed for the deployment. These builds also version the dlls using a variation of the technique at http://richardsbraindump.blogspot.com/2007/07/versioning-builds-with-tfs-and-msbuild.html
A separate, manually triggered build is then used to build the wixproj projects containing the generated wxs files and produce the MSIs.
I would ditch the CD delivery (so 90's) and go with ClickOnce. This solution seems to fit well since you already use the .NET Framework. With ClickOnce you should be able to just keep updating the content of your solution and make updates available to your heart's content. Let me know if you need sample ClickOnce deployment code.
You can find more ClickOnce information here.
Similar to dkackman's answer, you should separate your build into several components, isolating pieces that can be built separately.
I come from a mainly Java background; however, for building MSIs and .NET executables we use Maven, with the maven-wix-plugin for building the installers and the NMaven plugin for compiling any .NET code. As we're only performing very basic development in .NET, with most development in Java, we don't need too much complexity from the NMaven plugin (which is probably a 'good thing' (TM), as it's only at version 0.17).
If you're a purely .NET house, you could also look into Byldan (http://www.codeplex.com/byldan), which seems to be the focus of development there at the moment (it's the same team behind NMaven and Byldan).
If you do want more information on NMaven or Byldan, raise another question and I'll give as much info as I can (which is not a huge amount; as stated, I only do very limited .NET development).