Best way to access DLLs during development

I'm creating a development environment that should be easy to install on new systems. To that end I created library packages (OpenCV, Boost, Qt, etc.), all placed in a single folder (environment_folder) that Visual Studio projects/solutions access through an environment variable, so for a given compiler the whole folder can be copied and only the variable has to be set.
Now, to execute the binary, it needs access to the DLLs of those libraries.
I typically copy all needed DLLs to the binary directory, but that's neither nice nor practical.
Another way would be to add all the libraries' paths to the PATH variable, but that's a lot of work to perform while/after installing the library packages, so I would like to avoid it. It is also error-prone when, for example, testing new versions of libraries.
A third way that comes to mind would be to copy all the DLLs to a single package_bin folder, so only a single path has to be added to the PATH variable (one that is additionally relative to the environment_folder), but I don't like that idea either.
Is there any better way to do it?

cmake: package config for installing arbitrary file dependencies for a target

I am creating a cmake package config file (a Foo-config.cmake) for a pre-existing .dll not created by cmake.
The annoying thing is that the .dll depends on some data files.
When a user consumes my package in his own cmake project, I want the INSTALL target to install both the .dll and data files to his specified install location. I don't want him to have to write extra install() rules to do that.
Is it good practice to write the install() rules directly in my Foo-config.cmake? Or is there a better way to do this, maybe with set_target_properties()? I just couldn't find the appropriate property for associating arbitrary file dependencies to a target.
In an alternate universe where this .dll didn't already exist and I had to create it myself using cmake, would I need to create a custom Foo-config.cmake, or is there something in cmake that can automatically generate it for me to achieve the same thing?
FWIW, the .dll is an internal legacy library; it is normally built by Visual Studio and uploaded in a .zip file to our internal Artifactory. I want us to migrate away from manually pulling down .zip files from Artifactory and manually integrating the files into Visual Studio projects.
I've since found that there are a couple of different ways to do this:
In the config file, simply create one or more variables for the files/dirs you want to install, then install those using install(FILES) and/or install(DIRECTORY); see the sketch after this list. More info: https://stackoverflow.com/a/46361538/189341
Use file(GET_RUNTIME_DEPENDENCIES). More info:
https://discourse.cmake.org/t/installing-a-pre-built-module-and-its-various-dependencies/5227
How to use cmake file( GET_RUNTIME_DEPENDENCIES in an install statement?
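For the first of these approaches, a minimal sketch might look like the following; Foo.dll, the data directory, and the FOO_* variable names are hypothetical, and the consumer keeps control over where things land:

# Foo-config.cmake: expose the pre-built .dll and its data files
# through variables, resolved relative to this config file.
set(FOO_RUNTIME_DLL "${CMAKE_CURRENT_LIST_DIR}/bin/Foo.dll")
set(FOO_DATA_DIR    "${CMAKE_CURRENT_LIST_DIR}/data")

# Consumer's CMakeLists.txt (assumes find_package can already locate
# Foo-config.cmake, e.g. via CMAKE_PREFIX_PATH or Foo_DIR):
find_package(Foo REQUIRED)
install(FILES "${FOO_RUNTIME_DLL}" DESTINATION bin)
install(DIRECTORY "${FOO_DATA_DIR}/" DESTINATION bin/data)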
Is it good practice to write the install() rules directly in my Foo-config.cmake?
No.
Of the 480 *-config.cmake and *Config.cmake files on my system, none calls install().
Or is there a better way to do this, maybe with set_target_properties()?
No.
In an alternate universe where this .dll didn't already exist and I had to create it myself using cmake, would I need to create a custom Foo-config.cmake
No. This is unrelated to whether you create the .dll yourself or not. If the .dll exists, there is no need to create a Foo-config.cmake at all; it is your choice to use find_package (or to make your users use it).
is there something in cmake that can automatically generate it for me
No.
If you don't intend to support find_package features - VERSION, OPTIONAL_COMPONENTS, PATHS, HINTS, CONFIGS, etc. - then just go with include(); find_package is essentially just include() with some extra options.
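For instance, assuming the config file sits in a hypothetical external/foo directory of the consumer's tree, these two variants do roughly the same thing:

# Variant 1: plain include(), no find_package machinery at all.
include("${CMAKE_SOURCE_DIR}/external/foo/Foo-config.cmake")

# Variant 2: point Foo_DIR at the directory containing Foo-config.cmake,
# then let find_package() locate and include it.
set(Foo_DIR "${CMAKE_SOURCE_DIR}/external/foo")
find_package(Foo REQUIRED CONFIG)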
If you do want to have install() in your config file, then just protect it with a variable, like the FOO_DO_INSTALL guard sketched below.
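Concretely, the guard might look like this inside Foo-config.cmake, where FOO_DO_INSTALL and the file names are hypothetical; a consumer opts in with set(FOO_DO_INSTALL ON) before calling find_package(Foo):

# Only register install rules when the consumer explicitly asked for them.
if(FOO_DO_INSTALL)
  install(FILES "${CMAKE_CURRENT_LIST_DIR}/bin/Foo.dll" DESTINATION bin)
  install(DIRECTORY "${CMAKE_CURRENT_LIST_DIR}/data/" DESTINATION bin/data)
endif()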

Is File.Open directory behavior different in x86 vs x64

I am working on an application built in VB.Net that allows a document to be uploaded and saved into a database. I did not build this application, but I maintain it and add enhancements here and there. The target framework is .NET 4.
As part of uploading and saving the document, the application uses File.Open() to access the file and then runs other methods to compress it. The method that calls File.Open() takes a parameter containing just the filename, not the file's full path.
When this application runs on an x64 machine, I receive a System.IO.FileNotFoundException when the code hits File.Open(), complaining that it cannot find the file. It expects the file to be in the program's executing directory, which makes sense, because it is only given the filename to go on, not the directory it came from.
What's getting to me is that this exact same application (the exact same built assemblies) runs fine on an x86 system. It does not fail on File.Open(); it still passes just the filename, but somehow it knows the directory information.
How is this possible?
It's worth noting that the method containing the File.Open() call is in a different project in the same solution; it's a referenced DLL. E.g. MyApp.exe (a Windows Forms application) references MyUtil.dll (a class library). I have built against the x86, x64, and AnyCPU configurations.
I understand that the fix would be to just pass the full path to the method, but what I need to know is how this is even possible. I want to better understand why this happens, and hopefully this will help someone else understand how the same assemblies may behave differently in different system environments.
EDIT: Using an absolute path did fix the underlying issue. See the comments below for some good information on this scenario.
Windows has special handling for certain folder names on 64-bit systems, depending on whether you have a 32-bit or 64-bit process. Notably, the Program Files and System32 folders map differently depending on what kind of process you have: for example, a 32-bit process that asks for C:\Windows\System32 is transparently redirected to C:\Windows\SysWOW64, and its %ProgramFiles% resolves to C:\Program Files (x86).
Note that this is a difference in Windows itself; it's not behavior unique to .NET or Visual Basic. Any programming platform that uses Windows' native file handling will give you these results.
This is why you should use appropriate relative paths or the Environment.SpecialFolder enumeration rather than hard-coding full path names, and be careful about where you put things you expect to reference later; you might find they end up in a different location than you expected. Often the AppData or ProgramData folders are the more correct location, rather than the Windows or Program Files folders.

MSBuild shared .targets file

I have several .csproj files that I will be importing a common .targets file into, to extend the build process. The projects are in different directories. The .targets file is in the solution directory. How do I refer to the location of the .targets file to import it? There's a solution directory property, but this doesn't work if the developer just builds a project. What do I do? I am using .NET 4.5 and Visual Studio 2015.
As you figured, a project doesn't know about the solution it's contained in, and arguably it shouldn't. So there's not much you can do to programmatically figure out where, from the project's point of view, a totally unrelated file is located, apart from scanning the entire filesystem for it. There are some alternatives:
rely on a proper directory structure. You do this already anyway, since you use a solution, which also needs to find its projects in a fixed location. So suppose you have a main project dir with projectA/a.vcxproj, projectB/b.vcxproj, solutionDir/ab.sln and solutionDir/my.targets; then in a and b just use <Import Project="$(MSBuildProjectDirectory)\..\solutionDir\my.targets"/>
require a property (or environment variable) which is set to the location of the targets file and then use <Import Project="$(SomeDir)\my.targets"/>
put your targets file in a 'known' MSBuild location like the ImportBefore/ImportAfter directories, mentioned here for instance.
I've used all of these at one point, and in the end the first is, in my opinion, the best one: you just have to stick to a directory convention - you need that anyway for projects spanning multiple directories or with common shared stuff - and that's it. For example, we have a ton of common MSBuild files and they're in a single repository. Starting a new project always comes down to creating a directory, cloning the common files dir, and adding a new project dir. That can in turn easily be automated, and it also works well on typical CI servers. The second option is also doable, but it relies on a properly set-up environment, which is less 'self-contained' and gets really messy if developers start entering the variable in the machines' global environment variable settings, and in the local ones, and so on. Similar problems with the third one, but worse, since now there's only one correct location.

How to add custom content to a CMake project?

We recently started switching over from plain Visual Studio projects to proper CMake files. Previously we would have a "Content" folder in the solution root folder so that our executables could access content from it using a relative path like "../Tiles/tileset1.png".
How could we have CMake copy the files correctly, or otherwise make sure that our executables can find the content folder while debugging from Visual Studio, without manually setting the working directory?
I can think of a few different options:
Have CMake put all your executables in the same folder, as described in this question. Then you can use ../Tiles or ../../Tiles or whatever as you've been doing. Note, however, that you might want to consider setting this on a per-target basis instead of globally, e.g., using:
# Send this target's executable to a common bin/ folder in the build tree.
set_target_properties(my_target PROPERTIES
    RUNTIME_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/bin"
)
Setting CMAKE_RUNTIME_OUTPUT_DIRECTORY works fine, but some people consider it to be the 'old' way of doing it. (Depending on your needs, you might also want to set LIBRARY_OUTPUT_DIRECTORY, and possibly ARCHIVE_OUTPUT_DIRECTORY.)
Use an environment variable (e.g., CONTENT_ROOT or some such) to locate the resources. Hard-code a default that makes sense for production, but let developers override it for their particular workflow.
Look into cross-platform resource libraries (something like Qt's QRC files, but perhaps not tied to Qt).
Try the CMake modules listed in this FAQ answer to change Visual Studio's working/debugging directory.
Actually, a combination of options 1 and 2 is probably your best bet, as sketched below...
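Here is a rough sketch of that combination, where my_target, the CONTENT_ROOT environment variable, and the MY_CONTENT_ROOT define are all hypothetical names:

# Option 1: put this target's executable into a common bin/ folder.
set_target_properties(my_target PROPERTIES
    RUNTIME_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/bin")

# Option 2: pick a default content root, but let developers override
# it through the CONTENT_ROOT environment variable when CMake runs.
if(DEFINED ENV{CONTENT_ROOT})
    set(content_root "$ENV{CONTENT_ROOT}")
else()
    set(content_root "${CMAKE_SOURCE_DIR}/Content")
endif()

# Bake the chosen location into the build as a preprocessor define.
target_compile_definitions(my_target PRIVATE MY_CONTENT_ROOT="${content_root}")

The executable can then fall back on the MY_CONTENT_ROOT define instead of relying on a relative path like ../Tiles resolving against an unpredictable working directory.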

Best practice for storing and referencing DLL libraries?

Oftentimes a developer on my team will create a new Visual Studio project and reference a DLL somewhere on their local machine (e.g., C:\mydlls\homersimpson\test.dll). Then, when I get the project from the source control repository, I cannot build it, because I do not have the referenced DLL in the exact same location on my machine.
What is the best practice for storing and referencing shared libraries?
I typically create a lib folder in my project, where I put the referenced DLLs. Then I point the reference to the DLL in the lib folder. This way, every developer can build the project after retrieving it from source control.
If it's a project that was built in house, you could also add that project to your solution.
If the assembly is not in the GAC, create a directory called dependencies and add all assemblies there. The folder and the assemblies are added to source control. The rule is that given any project in source control, all that is required to build is to do a checkout and build the project (or run some tool that is also checked into the project).
If you add a folder to the solution and add the assemblies to the solution folder, this also provides a visual cue to the devs that indicates what external dependencies are present... all dependencies are in that directory. Relative paths ensure that Visual Studio can locate the references without a problem.
For large solutions, with 20+ projects, this makes life much easier!
Best practice, I would expect, is to have your source control repository include and enforce the relative locations of referenced objects for you (usually via a shared path), so you aren't dealing with this issue directly. The original developer should check in this information.
If you check the actual DLLs into source control, then you can reference them by relative path, and all developers will automatically get any dependencies when they next update the project.
Adding a DLL reference by full path would be a developer error just as adding a source file by full path would be an error.
Rule of thumb: if the project isn't part of the solution, reference released DLLs from a source-controlled /binshare or /lib directory under your solution's source control tree. All external dependencies should have versioned DLLs that go in this /binshare directory.
I understand what your co-worker is doing for the sake of convenience. However, that approach is diametrically opposed to proper configuration/build management.
Example: if you use the MS Data Application Block as a dependency in your application, you should reference a properly released binary instead of getting the latest from Microsoft's development source trunk.
I think this approach is quite the opposite of what I would consider best practice. It would be much better to keep third-party binaries out of the source repository and reference them through something like a Maven repository in your build process. Putting the DLLs in the source repository unnecessarily bloats the repository and makes checkouts of projects take considerably longer than necessary. It also obscures the independent management of third-party binary versions: the version is never referenced by name, only implied by whichever DLL of a particular version happens to be stored in the project's lib folder.
Why not set up a private NuGet feed? That way there is only a single copy of each dependency (in the NuGet repository), and multiple projects can reference it. Multiple versions of a dependency can coexist, and each project can reference a different version if necessary. Also, TFS Build can restore the packages at build time.
Configuring VS: https://www.visualstudio.com/en-us/docs/package/nuget/consume