I'm using the old g77 compiler (http://people.tamu.edu/~matthewmccleskey/g77.html) but can't figure out how to call external DLLs from my code. Is it even possible, or would I have to get a newer compiler?
I have both the DLL and the LIB file. The function is named GetDBI (_GetDBI#32).
Is it even possible to use DLLs in Fortran 77?
You seem to be using Windows, where one problem is that there's a plethora of ABIs to choose from. AFAIK g77 supports only the default one that the accompanying gcc supports (cdecl?). There are also, as far as I know, some issues with COMMON (static) data in DLLs on Windows.
The successor of g77, gfortran, has some support for different calling conventions as well as for handling COMMON and module variables in DLLs; see http://gcc.gnu.org/onlinedocs/gfortran/GNU-Fortran-Compiler-Directives.html
I am trying to create a single DLL for librsvg that does not require any other DLLs and can be used within an MSVC application. Using MinGW/MSYS I have compiled librsvg (and its 20 dependencies) and produced a DLL, but all the libraries are shared, so ultimately I need 21 DLLs.

I have read many dozens of articles and tried many different scenarios with the linker flag -static. I have used --enable-static on all the dependencies and produced the .a static libraries for each. However, I cannot seem to reach the end goal: when compiling librsvg I can either produce the DLL with shared libraries, or I can produce a shared library for it without creating a DLL. I cannot get a single self-contained DLL.

I have also gone down the path of manually changing the makefile for librsvg, replacing all the -l{library} lines with the corresponding library.a, but I never got a clean compile due to issues with libtool. If this is the correct path I will continue; however, it seems the entire purpose of libtool is to accomplish this goal without the need to modify the makefile. Is there a better way? Is this even possible?
I'm trying to link my project to an external library that I also developed and which also uses CMake to build. When I try to find the RelWithDebInfo or MinSizeRel builds like this:
FIND_LIBRARY(PCM_LIBRARY_DEBUG pcm
    PATHS ${CMAKE_LIBRARY_OUTPUT_DIRECTORY}
          ${CMAKE_LIBRARY_OUTPUT_DIRECTORY}/Debug
    NO_DEFAULT_PATH
)
FIND_LIBRARY(PCM_LIBRARY_RELEASE pcm
    PATHS ${CMAKE_LIBRARY_OUTPUT_DIRECTORY}
          ${CMAKE_LIBRARY_OUTPUT_DIRECTORY}/Release
          ${CMAKE_LIBRARY_OUTPUT_DIRECTORY}/MinSizeRel
          ${CMAKE_LIBRARY_OUTPUT_DIRECTORY}/RelWithDebInfo
    NO_DEFAULT_PATH
)
SET(PCM_LIBRARIES debug ${PCM_LIBRARY_DEBUG} optimized ${PCM_LIBRARY_RELEASE})
It does not search in the other directories that are not Release or Debug. I also tried creating PCM_LIBRARY_RELWITHDEBINFO and PCM_LIBRARY_MINSIZEREL, but the same thing happens because there are only debug and optimized keywords in the SET line. Does anyone know how I can link the correct libraries?
This is unfortunately one of the shortcomings of using find_library. There is no easy way around this without introducing tons of boilerplate code.
The problem here is that when passing files as dependencies to target_link_libraries, you can only distinguish between debug and optimized. If you need more fine-grained control, you will have to manipulate the respective target properties like LINK_INTERFACE_LIBRARIES directly. This is not only quite cumbersome, it also requires detailed knowledge about the inner workings of CMake's property system.
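If you do need per-configuration control while staying with plain binary files, one way to express it (sketched here with the made-up names PCM_ROOT and my_exe, and Windows-style .lib files assumed) is to wrap the prebuilt binaries in an IMPORTED target with per-configuration IMPORTED_LOCATION_<CONFIG> properties:

# Wrap the prebuilt library in an IMPORTED target; adjust paths/extensions for your platform.
add_library(pcm_prebuilt UNKNOWN IMPORTED)
set_target_properties(pcm_prebuilt PROPERTIES
    IMPORTED_LOCATION_DEBUG          "${PCM_ROOT}/Debug/pcm.lib"
    IMPORTED_LOCATION_RELEASE        "${PCM_ROOT}/Release/pcm.lib"
    IMPORTED_LOCATION_RELWITHDEBINFO "${PCM_ROOT}/RelWithDebInfo/pcm.lib"
    IMPORTED_LOCATION_MINSIZEREL     "${PCM_ROOT}/MinSizeRel/pcm.lib"
)
# Linking the target picks the right file for each configuration.
target_link_libraries(my_exe pcm_prebuilt)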
Fortunately, there is another way: The aforementioned limitation only applies when specifying dependencies via filenames. When specifying them as targets, this problem does not occur. The most obvious example is if a library and the executable that depends on it are built from the same source:
add_library(foo_lib some_files.cpp)
add_executable(bar_exe more_files.cpp)
target_link_libraries(bar_exe PUBLIC foo_lib)
This 'just works': the correct library will be chosen for each build configuration. Things get a little more complicated if the library and the executable live in different, independent projects. In that case the library has to provide a config file with an exported target in addition to the binary files.
Instead of calling find_library to locate the binaries, the dependent executable now just loads that config file and can then use the imported target as if it were a target from the same project.
Many modern libraries already use this approach instead of the classical find_library technique (Qt5 is a prominent example). So if you are at liberty to change the CMakeLists of your dependency and you do not need to support very old CMake versions (<2.6), this is probably the way to go.
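A minimal sketch of that arrangement, reusing 'pcm' from the question (all names and install paths are illustrative):

# In the library project: build, install, and export the target.
add_library(pcm SHARED pcm.cpp)
install(TARGETS pcm EXPORT pcm-targets
    RUNTIME DESTINATION bin
    LIBRARY DESTINATION lib
    ARCHIVE DESTINATION lib)
install(EXPORT pcm-targets FILE pcm-config.cmake DESTINATION lib/cmake/pcm)

# In the dependent project: load the config file (the install prefix must be on
# CMAKE_PREFIX_PATH) and use the imported target like a local one.
find_package(pcm REQUIRED)
target_link_libraries(bar_exe PUBLIC pcm)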
CMake is awesome, especially with its many Find modules (FindXXX). However, when it comes to writing a FindXXX module for a library XXX that your project depends on, it's not that easy for a non-CMake expert. I sometimes encounter a library without CMake support and would like to write a module for it. I'm wondering: is there any interactive shell for writing/testing CMake modules?
Are you writing FindXXX for project "XXX" or is "XXX" a dependency of your project that you're trying to find? If the former, you should instead write a file called XXX-config.cmake (or XXXConfig.cmake) and install it into one of the directories mentioned in the docs for find_package. In general, XXX-config.cmake files are for projects which are expected to be found by CMake (and installed by the project itself) and FindXXX.cmake files are for projects which don't support CMake (and usually have to support finding any version of XXX).
That said, for FindXXX.cmake, usually you just need a few find_path calls (e.g., for headers), some find_library calls, or even just a single pkg_check_modules from FindPkgConfig.cmake, followed by a find_package_handle_standard_args call (use include(FindPackageHandleStandardArgs) to get it). FPHSA makes writing proper Find modules a breeze.
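For illustration, a minimal FindXXX.cmake might look like the following sketch; the header and library names (xxx.h, xxx) are made up:

# FindXXX.cmake - locate the header and library, then let FPHSA set XXX_FOUND.
find_path(XXX_INCLUDE_DIR NAMES xxx.h)
find_library(XXX_LIBRARY NAMES xxx)

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(XXX DEFAULT_MSG XXX_LIBRARY XXX_INCLUDE_DIR)

if(XXX_FOUND)
    set(XXX_LIBRARIES ${XXX_LIBRARY})
    set(XXX_INCLUDE_DIRS ${XXX_INCLUDE_DIR})
endif()

mark_as_advanced(XXX_INCLUDE_DIR XXX_LIBRARY)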
For XXX-config.cmake files, I have typically used configure_file to generate two versions: one for the install tree (which includes your install(EXPORT) file) and one for the build tree (generated by export() calls). Using this, other useful variables can be accurately set, such as "which exact version of Boost was used" or "was Python support compiled in", so that dependent projects can get a better picture of what the dependency looks like.
I have also recently discovered that CMake ships with the CMakePackageConfigHelpers module which is supposed to help with making these files. There looks to be quite a bit of documentation for it.
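A short sketch of how those helpers are typically used (the version number, template name, and install destination are placeholders):

include(CMakePackageConfigHelpers)

# Version file so that find_package(XXX 1.2) can check compatibility.
write_basic_package_version_file(
    "${CMAKE_CURRENT_BINARY_DIR}/XXXConfigVersion.cmake"
    VERSION 1.2.3
    COMPATIBILITY SameMajorVersion)

# Expand an XXXConfig.cmake.in template into a relocatable config file.
configure_package_config_file(
    XXXConfig.cmake.in
    "${CMAKE_CURRENT_BINARY_DIR}/XXXConfig.cmake"
    INSTALL_DESTINATION lib/cmake/XXX)

install(FILES
    "${CMAKE_CURRENT_BINARY_DIR}/XXXConfig.cmake"
    "${CMAKE_CURRENT_BINARY_DIR}/XXXConfigVersion.cmake"
    DESTINATION lib/cmake/XXX)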
I have built a server binary using CMake (and make) for ARM and x86 targets. I am able to run my server on ARM using correct linking paths for RPATH, for example by populating CMAKE_INSTALL_RPATH. However, when I try to run my x86 server it complains about not being able to find my databases. Would I be right in saying that CMAKE_INSTALL_RPATH is used for libraries only and not for finding files or databases? Is there another CMake variable that is used to find files or databases at run time, or, by correctly populating CMAKE_INSTALL_RPATH, should it find files and databases as well as libraries?
Thanks Paul.
You are correct that the CMAKE_INSTALL_RPATH only deals with finding shared libraries. Specifically, setting the RPATH just gives the dynamic linker a list of directories to search for shared libraries.
If you want to find a file or database at runtime from within your application, you have to get the paths into your application some other way. This could be via a config file or hardcoded constants that are different for each platform.
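One common pattern for the "config file or hardcoded constant" route is to let CMake bake the path into a generated header with configure_file. A sketch, assuming a target called myserver and a template db_config.h.in (both made-up names):

# Bake the per-install database location into a generated header.
set(SERVER_DB_DIR "${CMAKE_INSTALL_PREFIX}/share/myserver/db"
    CACHE PATH "Directory the server searches for its databases at runtime")
configure_file(db_config.h.in "${CMAKE_CURRENT_BINARY_DIR}/db_config.h" @ONLY)
target_include_directories(myserver PRIVATE "${CMAKE_CURRENT_BINARY_DIR}")

# db_config.h.in would contain a line such as:
#   #define SERVER_DB_DIR "@SERVER_DB_DIR@"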
The title mostly covers it: what is the difference between a module and a shared library? I just found this distinction in CMake's add_library command, whose documentation says:
SHARED libraries are linked dynamically and loaded at runtime. MODULE libraries are plugins that are not linked into other targets but may be loaded dynamically at runtime using dlopen-like functionality.
But I can load a shared object using dlopen(), can't I?
The difference is that you can link to a SHARED library with the linker, whereas on some platforms you cannot link to a MODULE with the linker at all.
So... to be fully cross-platform and work everywhere CMake works, you should never do this:
# This is a big NO-NO:
add_library(mylib MODULE ${srcs})
target_link_libraries(myexe mylib)
To be fair, on Windows they're both just DLLs, so this code might actually work. But when you take it to a platform where it's impossible to link to the MODULE, you'll encounter an error.
Bottom line: if you need to link to the library, use SHARED. If you are guaranteed that the library will only be loaded dynamically, then it's safe to use a MODULE. (And perhaps even preferable to help detect if somebody does try to link to it...)
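Put differently, reusing the names from the snippet above (the plugin sources are a placeholder), the two intended usages look like this:

# Linked against at build time -> SHARED:
add_library(mylib SHARED ${srcs})
target_link_libraries(myexe mylib)

# Only ever loaded via dlopen()/LoadLibrary at runtime -> MODULE, never linked:
add_library(myplugin MODULE ${plugin_srcs})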
I think the distinction being made is that shared libraries are specified by the developer at compile time and must be present for the application to run, even though their functions are loaded at runtime. A module, i.e. a plugin, adds additional support at runtime but isn't required. Yes, you can dlopen() a shared library, but in that case it would not have been specified as a required part of the program and it functions instead as a module.
Another difference is in how ..._OUTPUT_DIRECTORY and ..._OUTPUT_NAME are handled:
Module libraries are always treated as library targets. For non-DLL platforms shared libraries are treated as library targets. For DLL platforms the DLL part of a shared library is treated as a runtime target and the corresponding import library is treated as an archive target. All Windows-based systems including Cygwin are DLL platforms.
For example, this means that if you compile a SHARED library on Windows, LIBRARY_OUTPUT_DIRECTORY will be ignored, because CMake looks at ARCHIVE_OUTPUT_DIRECTORY and RUNTIME_OUTPUT_DIRECTORY instead.
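A sketch of how that plays out in practice (the directory choices here are arbitrary):

add_library(mylib SHARED ${srcs})
set_target_properties(mylib PROPERTIES
    RUNTIME_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/bin"   # mylib.dll goes here on Windows
    ARCHIVE_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/lib"   # the import library mylib.lib goes here
    LIBRARY_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/lib")  # used for the .so on non-DLL platforms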