How to handle a library outside the project's root?
Currently I want to use CMake to link my project against a shared library that is not located in the project's root directory. The library lives in lib and is shared only between the projects located in projects\project_n. Also, each project needs to build the lib independently. The layout looks like this:
.
├── lib
│   ├── CMakeLists.txt
│   ├── src1.c
│   ├── hed1.h
│   ├── src2.c
│   ├── hed2.h
│   ...
│   ├── src_n.c
│   └── hed_n.h
└── projects
    ├── project_1
    │   ├── Src
    │   ├── Inc
    │   ├── build
    │   └── CMakeLists.txt
    ├── project_2
    │   ├── Src
    │   ├── Inc
    │   ├── build
    │   └── CMakeLists.txt
    ...
    └── project_n
        ├── Src
        ├── Inc
        ├── build
        └── CMakeLists.txt
Currently my lib\CMakeLists.txt looks like this:
cmake_minimum_required(VERSION 3.20)
file(GLOB_RECURSE SRCS *.c)
add_library(lib ${SRCS})
The part of projects\project_1\CMakeLists.txt responsible for linking the lib looks like this:
add_subdirectory(../../lib build)
target_include_directories(${PROJECT_NAME} PUBLIC ../../lib)
target_link_directories(${PROJECT_NAME} PRIVATE ../../lib)
target_link_libraries(${PROJECT_NAME} lib)
Currently I'm getting the errors below:
CMake Error at CMakeLists.txt:65 (target_include_directories):
Cannot specify include directories for target "project_1" which is
not built by this project.
CMake Error at CMakeLists.txt:66 (target_link_directories):
Cannot specify link directories for target "project_1" which is not
built by this project.
CMake Error at CMakeLists.txt:67 (target_link_libraries):
Cannot specify link libraries for target "project_1" which is not
built by this project.
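For reference, here is a minimal sketch of the usual pattern for this layout. The executable's source file (Src/main.c) and the lib_build binary directory are illustrative assumptions, not taken from the question, and the target_* commands have to run after the add_executable()/add_library() call that creates the target they modify.

# lib/CMakeLists.txt -- sources listed explicitly instead of GLOB
cmake_minimum_required(VERSION 3.20)
project(lib C)
add_library(lib SHARED
    src1.c
    src2.c
    # ... remaining sources
)
# Consumers that link against "lib" inherit this include path automatically.
target_include_directories(lib PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})

# projects/project_1/CMakeLists.txt
cmake_minimum_required(VERSION 3.20)
project(project_1 C)
add_executable(${PROJECT_NAME} Src/main.c)          # assumed source file
target_include_directories(${PROJECT_NAME} PUBLIC Inc)
# A binary directory must be named because ../../lib lies outside this source tree.
add_subdirectory(../../lib lib_build)
target_link_libraries(${PROJECT_NAME} PRIVATE lib)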
When I attempt to execute my CTest script, it complains that "Memory checker (MemoryCheckCommand) not set, or cannot find the specified program", even though DartConfiguration.tcl has been created and MemoryCheckCommand is correctly set to /usr/bin/valgrind.
This is a simplified layout of my project:
.
├── SOURCES
│   ├── build
│   │   ├── CMakeCache.txt
│   │   [...]
│   ├── CMakeLists.txt
│   └── src
│       ├── CMakeLists.txt
│       [..]
├── gcc.cmake
└── TESTS
    ├── build
    │   ├── x86
    │   │   ├── CMakeCache.txt
    │   │   ├── DartConfiguration.tcl
    │   │   [...]
    │   └── x86_64
    │       ├── CMakeCache.txt
    │       ├── DartConfiguration.tcl
    │       [...]
    ├── CMakeLists.txt
    ├── CTestConfig.cmake
    ├── ctest_scripts
    │   ├── TestValgrindJob64.cmake
    │   └── TestValgrindJob.cmake
    ├── fakes
    │   ├── CMakeLists.txt
    │   [..]
    ├── libs
    │   ├── googletest
    │   │   ├── CMakeLists.txt
    │   │   [..]
    │   └── linux-9.3.0
    │       ├── gmock
    │       │   [..]
    │       ├── gtest
    │       │   [..]
    │       ├── x86
    │       │   ├── libgmock.a
    │       │   └── libgtest.a
    │       └── x86_64
    │           ├── libgmock.a
    │           └── libgtest.a
    ├── mocks
    │   ├── CMakeLists.txt
    │   [..]
    ├── stubs
    │   ├── CMakeLists.txt
    │   [..]
    ├── unittests
    │   ├── CMakeLists.txt
    │   [..]
    ├── x86_64.cmake
    └── x86.cmake
Relevant bits of TESTS/CMakeLists.txt:
include(CTestConfig.cmake)
include(CTest)
#include(CTestUseLaunchers)
#enable_testing()
add_test(NAME Test1 COMMAND ${PROJECT_NAME})
CTestConfig.cmake:
set(CTEST_PROJECT_NAME "ProjectTest1")
set(CTEST_USE_LAUNCHERS YES)
CTest script TestValgrindJob64.cmake -- note 4 messages printing out MEMORYCHECK_COMMAND, CTEST_SCRIPT_DIRECTORY, CTEST_SOURCE_DIRECTORY and CTEST_BINARY_DIRECTORY:
include(${CTEST_SCRIPT_DIRECTORY}/../CTestConfig.cmake)
message("MEMORYCHECK_COMMAND ${MEMORYCHECK_COMMAND}")
message("CTEST_SCRIPT_DIRECTORY ${CTEST_SCRIPT_DIRECTORY}")
site_name(CTEST_SITE)
set(CTEST_BUILD_CONFIGURATION "Valgrind64")
set(CTEST_BUILD_NAME "${CMAKE_HOST_SYSTEM_NAME}-Val64")
set(CTEST_SOURCE_DIRECTORY "${CTEST_SCRIPT_DIRECTORY}/..")
message("CTEST_SOURCE_DIRECTORY ${CTEST_SOURCE_DIRECTORY}")
set(CTEST_BINARY_DIRECTORY "${CTEST_SCRIPT_DIRECTORY}/../build/x86_64")
message("CTEST_BINARY_DIRECTORY ${CTEST_BINARY_DIRECTORY}")
set(CTEST_CMAKE_GENERATOR "Unix Makefiles")
set(CTEST_CONFIGURATION_TYPE RelWithDebInfo)
#set(CTEST_MEMORYCHECK_COMMAND "/usr/bin/valgrind")
set(CTEST_MEMORYCHECK_COMMAND_OPTIONS "--tool=memcheck --leak-check=full --show-reachable=yes --num-callers=20 --track-fds=yes --track-origins=yes --error-exitcode=1")
set(configureOpts "-DCXXFLAGS=-m64")
ctest_empty_binary_directory(${CTEST_BINARY_DIRECTORY})
ctest_start(Experimental)
ctest_configure(OPTIONS "${configureOpts}")
ctest_build()
ctest_memcheck()
If I execute the script from TESTS/ctest_scripts with ctest -S TestValgrindJob64.cmake, I get the following output:
MEMORYCHECK_COMMAND
CTEST_SCRIPT_DIRECTORY /home/user1/project/TESTS/ctest_scripts
CTEST_SOURCE_DIRECTORY /home/user1/project/TESTS/ctest_scripts/..
CTEST_BINARY_DIRECTORY /home/user1/project/TESTS/ctest_scripts/../build/x86_64
Each . represents 1024 bytes of output
. Size of output: 0K
Each symbol represents 1024 bytes of output.
.. Size of output: 1K
Error(s) when building project
Memory checker (MemoryCheckCommand) not set, or cannot find the specified program.
The DartConfiguration.tcl file exists before and after the script is executed and, as previously mentioned, MemoryCheckCommand is set:
MemoryCheckCommand: /usr/bin/valgrind
whereis finds valgrind in /usr/bin/valgrind.
Any idea what I'm doing wrong here? Is CMake still ignoring the DartConfiguration.tcl file?
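For comparison, here is a minimal memcheck driver-script sketch, not a verified fix for the problem above: the valgrind path is the one shown in DartConfiguration.tcl, the source and binary directories mirror the script in the question, and the build name and trimmed valgrind options are assumptions.

# Minimal ctest -S memcheck script (sketch)
site_name(CTEST_SITE)
set(CTEST_BUILD_NAME         "Linux-Val64")                       # assumption
set(CTEST_SOURCE_DIRECTORY   "${CTEST_SCRIPT_DIRECTORY}/..")
set(CTEST_BINARY_DIRECTORY   "${CTEST_SCRIPT_DIRECTORY}/../build/x86_64")
set(CTEST_CMAKE_GENERATOR    "Unix Makefiles")
set(CTEST_CONFIGURATION_TYPE RelWithDebInfo)

# CTEST_MEMORYCHECK_COMMAND is the script-level variable consumed by ctest_memcheck().
set(CTEST_MEMORYCHECK_COMMAND         "/usr/bin/valgrind")
set(CTEST_MEMORYCHECK_COMMAND_OPTIONS "--tool=memcheck --leak-check=full --error-exitcode=1")

ctest_start(Experimental)
ctest_configure()
ctest_build()
ctest_memcheck()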
My folder structure looks like this:
├── SubLibA
│   ├── CMakeLists.txt
│   ├── include
│   │   └── SubLibA.h
│   └── SubLibA.cpp
├── SubLibB
│   ├── CMakeLists.txt
│   ├── include
│   │   └── structs.h
│   └── SubLibB.cpp
└── SharedLib
    ├── CMakeLists.txt
    ├── include
    │   └── SharedLib.h
    ├── SharedLib.cpp
    └── SharedLib.h
My global CMakeLists.txt looks like this:
add_subdirectory(SubLibA)
add_subdirectory(SubLibB)
add_subdirectory(SharedLib)
They all compile as static by default.
SharedLib depends on SubLibB that depends on SubLibA.
The dependent libraries SharedLib and SubLibB have:
# SubLibB
target_link_libraries(${PROJECT_NAME}
    SubLibA::SubLibA
)

# SharedLib
target_link_libraries(${PROJECT_NAME}
    SubLibB::SubLibB
)
Running cmake .. -DBUILD_SHARED_LIBS=ON compiles all three libs as shared libraries...
Since they are tightly coupled, I'd like to keep them in the same repository with a single top-level CMakeLists.txt that builds them all at once. I want to use the power of modern CMake, with as few hard-coded paths and custom files as possible, to keep maintenance straightforward.
Try setting the variable within CMake:
set(BUILD_SHARED_LIBS OFF)
add_subdirectory(SubLibA)
add_subdirectory(SubLibB)
set(BUILD_SHARED_LIBS ON)
add_subdirectory(SharedLib)
set(BUILD_SHARED_LIBS OFF)
If you want SubLibA and SubLibB to always be static libraries, you can use the STATIC keyword in the add_library command, e.g. add_library(SubLibA STATIC ${SOURCES}). By omitting the keyword for SharedLib, you are still free to build it as a static or shared library by setting -DBUILD_SHARED_LIBS=ON on the CMake command line.
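A minimal sketch of that approach, using the source files from the tree in the question; the SubLibA::SubLibA-style ALIAS targets and the PUBLIC/PRIVATE visibility are assumptions.

# SubLibA/CMakeLists.txt -- always static
add_library(SubLibA STATIC SubLibA.cpp)
add_library(SubLibA::SubLibA ALIAS SubLibA)
target_include_directories(SubLibA PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include)

# SubLibB/CMakeLists.txt -- always static, links SubLibA
add_library(SubLibB STATIC SubLibB.cpp)
add_library(SubLibB::SubLibB ALIAS SubLibB)
target_include_directories(SubLibB PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include)
target_link_libraries(SubLibB PUBLIC SubLibA::SubLibA)

# SharedLib/CMakeLists.txt -- no keyword, so BUILD_SHARED_LIBS decides
add_library(SharedLib SharedLib.cpp)
add_library(SharedLib::SharedLib ALIAS SharedLib)
target_include_directories(SharedLib PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include)
target_link_libraries(SharedLib PRIVATE SubLibB::SubLibB)

Note that if SharedLib is built as a shared library, the static sub-libraries linked into it generally need POSITION_INDEPENDENT_CODE enabled as well.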
I have set up a configuration in wdio.conf.js for the @rpii/wdio-html-reporter, but it's not generating a master report for all suites.
const { ReportAggregator, HtmlReporter } = require('@rpii/wdio-html-reporter');

exports.config = {
    reporters: ['spec', [HtmlReporter, {
        debug: true,
        outputDir: './reports/html-reports/',
        filename: 'report.html',
        reportTitle: 'Test Report Title',
        showInBrowser: true
    }]],

    onPrepare: function (config, capabilities) {
        let reportAggregator = new ReportAggregator({
            outputDir: './reports/html-reports/',
            filename: 'master-report.html',
            reportTitle: 'Master Report'
        });
        reportAggregator.clean();
        global.reportAggregator = reportAggregator;
    },

    onComplete: function (exitCode, config, capabilities, results) {
        (async () => {
            await global.reportAggregator.createReport({
                config: config,
                capabilities: capabilities,
                results: results
            });
        })();
    }
};
I expect a single report containing all test cases, but I'm getting a separate report for each test case.
The topic is pretty old at this point, but I just ran into a similar issue in my project: I could not generate the report at all. In most cases it is just a matter of configuration, but there is no solid documentation or guideline for this painful wdio reporter setup. So here I am, after a whole week of research and testing; this is the configuration you, and anyone else facing the same issue, will need.
First, let's assume your project structure is something like the tree below:
.
├── some_folder1
│ ├── some_sub_folder1
│ ├── some_sub_folder2
├── some_folder2
├── #report
│ ├── html-reports
│ ├── template
│ │ ├── sanity-mobile-report-template.hbs
│ │ ├── wdio-html-template.hbs
├── specs
│ ├── test1
│ │ ├── test1.doSuccess.spec.js
│ │ ├── test1.doFail.spec.js
│ ├── test2
│ │ ├── test2.doSuccess.spec.js
│ │ ├── test2.doFail.spec.js
├── node-modules
├── package.json
Second, you should have templates for your reports. In my case they are located in #report/template: wdio-html-template.hbs for the HtmlReporter and sanity-mobile-report-template.hbs for the ReportAggregator. As Rich Peters has noted above:
Each suite is executed individually and an html and json file are generated. wdio does not aggregate the suites, so this is done by the report aggregator collecting all the files and creating an aggregate file when complete.
The HtmlReporter needs to find its template to generate the content for each .spec file, and a second template is required by the ReportAggregator.
Third, you need a correct specs and suites declaration in your wdio config: generic patterns for specs, and specific files for suites.
Finally, run your tests using the --suite parameter; see the wdio guideline for reference.
My final project structure looks like this; notice the changes:
.
├── some_folder1
│ ├── some_sub_folder1
│ ├── some_sub_folder2
├── some_folder2
├── #report
│ ├── html-reports
│ ├── ├── screenshots
│ ├── ├── suite-0-0
│ ├── ├── ├── 0-0
│ ├── ├── ├── ├── report.html
│ ├── ├── ├── ├── report.json
│ ├── ├── ├── 0-1
│ ├── ├── ├── ├── report.html
│ ├── ├── ├── ├── report.json
│ ├── ├── master-report.html
│ ├── ├── master-report.json
│ ├── template
│ │ ├── sanity-mobile-report-template.hbs
│ │ ├── wdio-html-template.hbs
├── specs
│ ├── test1
│ │ ├── test1.doSuccess.spec.js
│ │ ├── test1.doFail.spec.js
│ ├── test2
│ │ ├── test2.doSuccess.spec.js
│ │ ├── test2.doFail.spec.js
├── node-modules
├── package.json
Each suite is executed individually and an html and json file are generated. wdio does not aggregate the suites, so this is done by the report aggregator collecting all the files and creating an aggregate file when complete.
I'm trying to build ROOT on my MacBook using the terminal; I'm a novice when it comes to programming and installing these things, and I haven't been able to find anything that explains what I need to do. This is what I've done so far: downloaded and unpacked ROOT, installed CMake, and installed Emacs. I've just been following the instructions CERN has on their website, Building ROOT.
I made a directory to contain the build, but now I'm on the step which says "Execute the cmake command on the shell replacing path/to/source with the path to the top of your ROOT source tree." However, I have no idea what the path is to the top of my ROOT source tree, nor do I even know what that is to be honest. I'm trying to use ROOT with Xcode because it's really the only compiler I'm familiar with.
How can I find what the path is to the top of my ROOT source tree?
Tree here means the directory tree, so the "directory" or "folder" to which you unpacked ROOT.
So, if your directory structure looks like this:
Downloads
├── PlaneTicket
├── oldThings
│ ├── Pictures
│ ├── Movies
│ ├── PDF-Documents
├── backup
│ ├── data
│ └── dataset
├── backup2
│ ├── data
│ └── dataset
├── build (<- I assume your build directory is there)
├── ROOT
│ ├── bindings
│ │ ├── doc
│ │ ├── pyroot
│ │ ├── r
│ │ └── ruby
│ ├── build
│ │ ├── misc
│ │ ├── package
│ │ ├── rmkdepend
│ │ ├── unix
│ │ └── win
│ ├── cmake
│ │ ├── modules
│ │ ├── patches
│ │ └── scripts
│ ├── config
│ ├── core
│ │ ...
│ ├── doc
│ │ ...
│ ├── documentation
│ │ ...
│ ...
└── very_old_files
Then your cmake command should look like this:
cmake ../ROOT
If you've followed CERN's instructions that far, you can simply run 'ls' (or 'dir') and the name of the source directory will be printed to the screen. Type 'cmake ' and then copy/paste that path after it to configure and build ROOT.
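Concretely, with the example layout above (the Downloads location and the -j4 job count are just illustrative assumptions), the commands would look roughly like this:

cd ~/Downloads/build        # the empty build directory
ls ..                       # ROOT shows up next to it -- that is the source tree
cmake ../ROOT               # path to the top of the unpacked ROOT sources
cmake --build . -- -j4      # or simply: make -j4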
I've worked with Silex a few times now, and I like it, but sometimes the documentation confuses me, simply because they use a different folder structure.
Who can tell me which folder structure they use exactly in Silex 2.0?
Documentation
├── composer.json
├── composer.lock
├── vendor
│   └── ...
└── web
    └── index.php
Where are the views, controllers etcetera stored?
Silex is not a "convention over configuration" framework: it does not prescribe nor care what the structure of your file system or application organisation is; that is why there's no mention of such things in the docs.
Just organise things the way that best suits your own needs.
Just for example, a directory structure I usually use.
├── config
│   ├── dev.php
│   ├── test.php
│   └── ...
├── src                        PSR-4 compatible directory structure
│   ├── Component              Customized components (Symfony's components or any other)
│   │   ├── Security
│   │   │   └── ...
│   │   ├── Validator
│   │   │   └── ...
│   │   └── ...
│   ├── Controller
│   ├── DataFixtures
│   ├── Migrations
│   ├── Provider               My service providers
│   ├── Service                My services
│   │   ├── Auth
│   │   │   └── ...
│   │   ├── Registration
│   │   │   └── ...
│   │   └── ...
│   └── Application.php        Customized Silex application class
├── tests
├── var
│   ├── cache
│   ├── log
│   └── ...
├── vendor
│   └── ...
├── web
│   ├── index.php
│   └── index-test.php
├── composer.json
└── composer.lock
And my implementation is on GitHub; it's currently a work in progress, but you can use it as a boilerplate for your Silex application.