I am working on a ROS package to be deployed on our lab robots. One feature in my package requires a third-party ROS package. I don't think this package is released yet; at least I couldn't find it on the ROS wiki documentation site. The dependency is called ros_msg_parser, for subscribing to topics without knowing their message type beforehand. Here is the link to the repo: https://github.com/facontidavide/ros_msg_parser
I should mention that we use Ubuntu 16.04 on all our devices, and we program with ROS and C++.
My intention is to deploy my own ROS package to the robot without worrying about whether the ros_msg_parser package is installed on the device or not.
I know a couple of ways to do it:
Use a .so library file. (We don't think this approach is the ideal way to proceed for us, since the .so library would be a black box for other colleagues in the lab in the future, with no way to know its version and so on.)
Release the ros_msg_parser repo and use it as a package from the ROS ecosystem, such as std_msgs.
And lastly (not what we want), we could build/install this ros_msg_parser package on all our devices.
I have also researched ExternalProject_Add, to build/install ros_msg_parser as a third-party library. Then I realized that my dependency is a ROS package, not really a standard cmake && make && install project. Correct me if I am wrong.
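For context, this is roughly what that route would look like in CMake (the target name and install prefix below are my own examples, not from the repo):

include(ExternalProject)
# Sketch only: fetch and build ros_msg_parser as an external CMake project.
# Note: as a catkin package it still expects a ROS environment at build time,
# which is exactly the mismatch described above.
ExternalProject_Add(ros_msg_parser_external
  GIT_REPOSITORY https://github.com/facontidavide/ros_msg_parser.git
  CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/third_party
)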
I have the desired package working all right now, by building the ros_msg_parser package together with mine via catkin_make in my workspace.
I am just wondering if anyone can suggest an approach I could take, or point me to somewhere I can research on my own, to fulfill my goal.
Thanks in advance.
Fortunately, I have got some help from my team to solve this problem. It is rather simpler than I imagined.
Here are the steps we took to implement it:
git clone the ROS package source and copy only the source files into a folder called your_third_party_folder/, parallel to your_main_work_space/src. Remember to remove the cloned .git history etc.; you only need the source files, otherwise your main workspace won't work well with your own repo: because of the dirty-repo prompt from the nested clone, you will not be able to push the third-party project to your repo. Maybe there are other ways to solve this, but it is just simpler to copy the source files to a folder where you want them.
Work on your two CMakeLists.txt files; make sure to include, link, and target the necessary libraries so that catkin_make passes.
And don't forget to add_subdirectory(YOUR_THIRD_PARTY_PACKAGE) in your main workspace's CMakeLists.txt file.
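For illustration, a minimal sketch of that wiring, assuming the sources were copied to third_party/ros_msg_parser (the folder, library target, and node names here are examples, not taken from the repo):

# Top-level CMakeLists.txt of your package (sketch only).
add_subdirectory(third_party/ros_msg_parser)
include_directories(third_party/ros_msg_parser/include ${catkin_INCLUDE_DIRS})
add_executable(my_node src/my_node.cpp)
# Link against the library target defined by the third-party CMakeLists.
target_link_libraries(my_node ros_msg_parser ${catkin_LIBRARIES})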
Note that it took me quite some time to fix the build process, but finally the third-party project is compiled in, with no .so file and no local library installation.
I am trying to build the latest GNURadio package on my development system. Unfortunately this system configuration is tightly controlled and I can't just install new packages of software on it as it is used to develop a product and all development systems are kept in lockstep. We are currently on an older version of RedHat.
While I cannot modify the system includes I can download and use newer versions of packages locally (in non-system directories) as long as that doesn't affect the product build/debug environment. Normally this isn't a problem.
However, when building GNURadio I found that our development platforms use an older version of the Boost libraries than GNURadio requires. So I got the latest version of Boost and extracted it into my local (home) directory. I found several sets of directions that, I thought, would instruct CMake to use additional include directories. Unfortunately, this hasn't worked with the Boost libraries: CMake keeps complaining that it finds the older version of Boost and not the newer one I have extracted locally.
I have tried using
-DCMAKE_CXX_STANDARD_INCLUDE_DIRECTORIES=<dir>
and
-DCMAKE_CXX_STANDARD_INCLUDE_DIRECTORIES_BEFORE=<dir>
and this had no effect. I then tried adding the following to the top-level CMakeLists.txt file:
SET(CMAKE_INCLUDE_DIRECTORIES_BEFORE ON)
SET(CMAKE_CXX_STANDARD_INCLUDE_DIRECTORIES <dir>)
or, even
include_directories(BEFORE <dir>)
Again, no joy.
I did a bit of digging and found that there is a GrBoost.cmake module and it had an additional configuration for the boost directory so I added this:
list(PREPEND BOOST_LIBRARYDIR "<dir>")
to the top of the file. Again, no luck.
I've never used CMake before (and I'm not really keen on learning yet another build system if I don't have to - our company just switched to bazel and I am coming up to speed on that) so I am flying blind here.
What do I have to do to get CMake to look in my local directory to find the Boost stuff I downloaded?
Ok. As it often happens, just after asking the question I was able to find an answer.
It turns out that there is a variable you can set on the CMake command line (-DCMAKE_PREFIX_PATH=<dir>) to specify additional base paths to search for CMake config files. I just added this to the command line and the newer Boost was found just fine.
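For example (the Boost location is simply wherever you extracted it; the path and source directory below are illustrative):

# Point CMake at the locally extracted Boost in addition to the system paths.
cmake -DCMAKE_PREFIX_PATH=$HOME/boost_local ../gnuradio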
I wasn't even aware that Boost came with such config files. Live and learn.
@vre's comment would have probably worked just as well (maybe better, in fact).
Does anyone know how to relocate a Qt5 installation on Linux? Especially all the paths set in the files in the mkspecs directory?
Any tool or script that lets Qt5 recreate these files would also be OK.
I'd like to deploy the files of Qt5's "lib/bin/mkspecs..." via a central repository to other computers, to be able to compile seamlessly.
And no, I really don't want to use the system's Qt5 from some package manager.
Thanks for your help!
I have attempted this a few times. It gets really hairy really fast, and my only recommendation is that you re-install to the new location.
The reason is that the path of the installation is hard-coded into so many files in so many locations that even search-replacing them all is really error prone.
1) Download source and make your custom Qt build - http://doc.qt.io/qt-5/build-sources.html
2) Make your own package/installer of your built Qt libraries.
If you want your own Qt build to run a particular application, then make a proper installer for that application that includes all the necessary files.
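For reference, a custom build from source looks roughly like this (the prefix and flags are examples; see the linked build-sources page for the full set of options):

# Configure with an explicit prefix, then build and install there.
./configure -prefix /opt/qt5-custom -opensource -confirm-license -nomake examples -nomake tests
make -j8
make install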
I'm the maintainer of a program that I might like to propose for inclusion in the Cygwin distribution.
We use CMake, so there is a packager available, and it's easy to create a .bz2 package.
Once I've created the package, how can I try it locally? On Linux this can easily be done, but is there a way to make the Cygwin package installer pick up a local package?
I've read the package contribution documentation and related pages but can't find an answer.
The CMake Cygwin package generator seems extremely out of date; Cygwin hasn't used .bz2 for some years. This is from a Cygwin mailing-list answer by Adam Dinwoodie:
Cygwin packages generally use Cygport to define the build process and so forth. It's more-or-less the equivalent of rpmbuild for RPM packages, and similar tools for other distribution systems. The documentation for Cygport is at http://cygwinports.github.io/cygport/; if you're using make in a reasonably standard way, most things should Just Work™.
In particular, if you're using Cygport, it'll automatically do things like creating setup.hint files for you.
For testing locally, I find it's simplest to just do tar -xaC/ -f <tarball> on the compiled tarballs that Cygport generates. That doesn't test the dependency management or anything that requires post-install scripts, but it's fine for checking that the installation itself works.
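In practice that workflow is roughly the following (the package name, version, and tarball file name here are examples, not from the thread; cygport prints the actual output paths when it finishes):

# Build the package through cygport...
cygport myprog.cygport download prep compile install package
# ...then unpack the generated binary tarball straight into / to test it.
tar -xaC/ -f myprog-1.0-1.tar.xz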
I am looking at NuGet to improve the automatic handling of dependencies (both internal and third-party) during development.
As long as you develop through the CI build server, all is good:
get latest source for A and B, where B depends on A
fix bug in A
build A
check into source control
CI Build Server initiated
new nuget package is created and placed in corporate repository
build B (which will get the updated A package)
run B to verify that the bug in A was fixed
...repeat n times
However, I'm wondering if it is possible to work locally as a single developer, without having to wait for the CI Build Server to produce a new package?
NuGet has a feature called Package Restore, which will download all dependencies automatically on build. You can also configure the order in which Package Restore searches repositories for packages.
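For example, a minimal NuGet.config that puts a local folder feed ahead of the corporate one (the feed names and paths here are illustrative):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Searched in order: local developer feed first, then the corporate share. -->
    <add key="LocalDev" value="C:\LocalNuGetRepo" />
    <add key="Corporate" value="\\server\NuGetPackages" />
  </packageSources>
</configuration>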
If the workflow could become:
get latest source for A and B, where B depends on A
fix bug in A
build A
(building creates a local nuget package)
run B to test the (resolved) bug in A (B should now pick up our local NuGet package rather than one from the corporate repository)
...repeat n times
check into source control
CI Build Server initiated
new nuget package created in corporate repository
Is this possible using Visual Studio, MSBuild, a CI build server, and NuGet? I'm especially interested in the creation of local packages while developing locally.
Note that I have native projects; except for the post-build generation of the NuGet package, this is a workflow that I hope should work for both C# and C++ projects.
The solution I have now, though far from ideal, is the best I could figure out. Oh, and it is a work in progress, so it WILL change in the coming weeks/months as I figure out how to get around the kinks.
I mostly have to deal with managed DLLs right now, but I do have some native code and, worse, multi-platform native code to deal with eventually.
Create a local repository (basically just a folder) and configure it in your list of NuGet feeds.
Then I created an MSBuild task that packages the project and outputs it in the local repository's root folder. Make sure the version of your package is always increasing; presently I do this manually by editing the assembly version.
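As a sketch, such a task can hang off the normal build (the target name and output path below are examples):

<!-- In the .csproj/.vcxproj: pack after every build and drop the result in the local feed. -->
<Target Name="PackToLocalRepo" AfterTargets="Build">
  <Exec Command="nuget pack &quot;$(MSBuildProjectFullPath)&quot; -OutputDirectory C:\LocalNuGetRepo -Properties Configuration=$(Configuration)" />
</Target>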
Once built, update the other projects that reference it; I usually do this through the Package Manager Console (update-package).
For each project that was updated, bump its version; rinse, lather, and repeat until you get to your top-most project (the actual program).
Once everything is nice and good and you are ready to commit, the build system should do its own packaging and send it to your official repository.
The Good
No clogging of the repository and build system with intermediate development versions; that garbage remains (as it should) local.
Local repos are super easy to set up, and it can even be done without changes to VS, through the global NuGet config.
This is friendly to both paradigms of package restore and checking in packages with the project. That said, I would recommend not checking in the packages you built locally, but rather ones that were committed to your local repository, ideally through the build system. What's built locally should remain local.
The Bad
Still much more complicated than just adding projects to a solution.
The deeper (or wider) your dependency tree the bigger the pain.
The Ugly
Makes some native NuGet behaviors quite quirky and annoying:
The update operation takes forever if your VS is connected to a version-control system (Perforce, for me). I hear they "solved" the problem; I'd hate to see how it was before if it was worse than it is now!
Having NuGet change non-code references back to "never copy" is a major pain.
If Only
Configure the desired state of a content dependency (copy always, never, or newer) directly from the nuspec and be done with it! (And the same story with ClickOnce content status: include, exclude, etc.)
Make the update operation quick; two minutes for a dozen projects is just insane, especially if the ultimate goal is to manage 500+.
Perhaps a hybrid mode where locally we work with project inclusion, but the build system works with NuGet dependencies (and builds them if necessary).
If you are going to parse the project files, do follow MSBuild parsing rules and honor the conditional statements.
There are still issues I have yet to figure out, like how to manage multiple branches of the code in the repository, or how to handle version conflicts further up the food chain. In a large project (ultimately we have to bring 500+ separate projects together into a single application executable), conflicts are expected.
I would love to bring in all the goodness of sane dependency management à la Maven, but thus far I have not found NuGet to be mature enough to even think of proposing it to the dev team.
Certainly. In our solutions, NuGet parks the libraries in the "packages" directory of the solution's hierarchy, which is ultimately kept in TFS. This allows for complete solution check-outs that include the required libraries. If it's your intention to update the libraries normally provided by NuGet, you'll need to update the dependent projects' references to point to the project containing the updated code normally provided by the NuGet process.
Prior to checking in your regular solution work (not the NuGet-related libs), make sure the solution's NuGet libs are up to date and that the references in the solution point back to the NuGet-installed libs. Of course, you'll check in and fetch the NuGet-related libs beforehand.
We are big users of NuGet, we've got 25-30 packages which we make available on a network share.
We'd like to be able to test new packages in the consuming applications before they're built and released. Ideally, this would work something like Maven's snapshot functionality, with a specific development package.
Has anyone else come up with a way of doing this, ideally one that's reasonably non-hacky?
Our favoured method is to generate the package assemblies and then manually overwrite the assemblies in the packages/ directory, i.e. to replace the actual project references, but that doesn't seem particularly clean.
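Concretely, the overwrite amounts to something like this (the package id, version, and framework folder are examples):

# Replace the assembly that NuGet restored with the freshly built one.
Copy-Item .\MyLib\bin\Release\MyLib.dll .\packages\MyLib.1.2.3\lib\net45\ -Force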
Update:
We use a CI build server which creates builds on every commit and has a specific manually triggered NuGet build which works off specifically tagged versions of the codebase. We don't want to create a NuGet build off every commit, but we would like to be able to test a likely candidate in the wild before we trigger the manual NuGet package build.
I ended up writing a unit/integration testing framework to solve a similar problem. Basically, I needed to verify the content of the package, the versions and info, what would happen when I installed and uninstalled the package, what versions the assemblies in lib were, what bitness the assemblies were built for (x86 or x64), and so on; and I needed it all to run without Visual Studio installed, on my (headless) build machine, as a quality gate.
Standing on the shoulders of giants like Pester, PETools, and SharpDevelop's package-management module, I put together nuget-test.
Clone the project into your package directory (where your .nuspec file and package files are). If for whatever reason you want to keep the nuget-test project as a git repo, simply remove "remove-item nuget-test/.git -Recurse -Force" from the command below.
git clone https://github.com/nickfloyd/nuget-test.git; remove-item nuget-test/.git -Recurse -Force
Run Setup.ps1 in the root of the nuget-test directory in an x86 instance of PowerShell.
PS> .\setup.ps1
Write tests and place them in the nuget-test/test directory using the Pester syntax (a minimal sketch follows after these steps).
Run the tests.
PS> Invoke-Pester
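For a flavour of what such a test can look like, here is a minimal Pester-style sketch (the package name and assembly path are examples; see the project's own tests for its exact conventions):

# Verifies that the unpacked package ships the expected assembly.
Describe "MyPackage" {
    It "ships the expected assembly in lib" {
        Test-Path ".\lib\net45\MyLib.dll" | Should Be $true
    }
}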
Project page: nuget-test
On github: https://github.com/nickfloyd/nuget-test
I hope this helps you get closer to what you're trying to get done.
If you're using NuGet packages to distribute your libraries, you should not limit yourself to testing only the libraries; you should test the packages themselves as well (if your binaries are OK but incorrectly installed, consumers will still have issues). The whole point is to improve this experience.
One way could be to have an additional CI or QA repository. The one you currently have is actually your "production" repository containing consumable releases, considered finished high-quality products.
Going further, you could have a logical package promotion flow (based on Continuous Integration or even using a Continuous Delivery approach), where:
- each check-in produces a package on your CI repository
- testers pick up a CI package for QA and if found OK promote it to either a QA feed, or to the Production feed (whatever you prefer, depends on the quality of your testing and how well it is automated)
There are various ways of implementing this scenario, using simple network shares, internal NuGet.Server or Gallery implementations, or simply use http://myget.org to give it a try with minimal cost and zero effort.
Hope that helps!
Cheers,
Xavier