Rust Workspace: Is it possible to use a binary crate for integration tests in a lib crate?

I have the following workspace structure:
[workspace]
members = [
"skserver", # binary
"skclient", # binary
"skcommon", # lib
"skintegrationtests" # lib
]
The intention was to have an extra lib crate for integration testing of the client/server functionality. The Cargo.toml of skintegrationtests is as follows:
# for integration tests of own programs etc.
skcommon = {path = "../skcommon"}
skclient = {path = "../skclient"}
skserver = {path = "../skserver"}
skcommon can be referenced, but not skclient (I haven't tried skserver). Is that intentional on Rust's part? And if so, why?
I started doing integration tests with skcommon. I want to avoid circular dependencies with skclient and skserver, and so I created skintegrationtests.

If you want to run the skclient binary from skintegrationtests, then you're looking for RFC 3028 binary dependencies, which are not yet implemented. There isn't a clean way to do this yet other than a build script separate from Cargo that makes sure the binary is built and then runs the test.
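A minimal sketch of that approach, assuming a wrapper script has already run cargo build -p skclient so the binary exists under the workspace's shared target/debug directory (the path and the --help flag are assumptions, not part of the original setup):
// skintegrationtests/tests/run_client.rs (hypothetical test file)
use std::process::Command;

#[test]
fn skclient_binary_runs() {
    // Assumes the workspace layout from the question, so the shared target
    // directory sits one level above this crate's manifest.
    // On Windows the binary would additionally need the .exe suffix.
    let exe = concat!(env!("CARGO_MANIFEST_DIR"), "/../target/debug/skclient");
    let status = Command::new(exe)
        .arg("--help") // hypothetical flag; replace with real arguments
        .status()
        .expect("failed to spawn skclient");
    assert!(status.success());
}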
If you want to call functions defined in the skclient package's code, then you need to modify skclient so it is a library package (one that has a lib.rs), with all of the functions you want to call defined there rather than in main.rs. This does not prevent the package from also having a binary, which can refer to the library as use skclient::whatever;.
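A minimal sketch of that split, with hypothetical function names; skintegrationtests can then depend on skclient as an ordinary path dependency and call into it directly:
// skclient/src/lib.rs -- the reusable client logic lives in the library
pub fn run_client(addr: &str) -> std::io::Result<()> {
    // ... real connection logic goes here ...
    let _ = addr;
    Ok(())
}

// skclient/src/main.rs -- the binary becomes a thin wrapper over the library
fn main() -> std::io::Result<()> {
    skclient::run_client("127.0.0.1:4000") // hypothetical address
}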

Related

How to combine STM32 HAL library with CMake, as it needs a file from the project it is linked to [duplicate]

I have recently switched my STM32 project to CMake to be independent of the IDE. The root repository (application) contains multiple submodules (HAL, FreeRTOS, etc.), and its CMakeLists.txt explicitly lists every single file used:
set(EXECUTABLE ${PROJECT_NAME}.elf)
add_executable(${EXECUTABLE}
# Own sources
src/main.c
src/SEGGER_SYSVIEW_Config_FreeRTOS.c
src/startup_stm32h723zgtx.s
src/stm32h7xx_hal_timebase_tim.c
src/system_stm32h7xx.c
# Base CMSIS and HAL library
lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_tim.c
lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_tim_ex.c
lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_uart.c
lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_rcc.c
lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_rcc_ex.c
#long list of HAL c files there...
# FreeRTOS library
lib-freertos/croutine.c
lib-freertos/event_groups.c
lib-freertos/list.c
lib-freertos/queue.c
lib-freertos/stream_buffer.c
lib-freertos/tasks.c
lib-freertos/timers.c
lib-freertos/portable/GCC/ARM_CM7/r0p1/port.c
lib-freertos/trace/Sample/FreeRTOSV10/SEGGER_SYSVIEW_FreeRTOS.c
lib-freertos/trace/SEGGER/Syscalls/SEGGER_RTT_Syscalls_GCC.c
lib-freertos/trace/SEGGER/SEGGER_RTT_ASM_ARMv7M.S
lib-freertos/trace/SEGGER/SEGGER_RTT_printf.c
lib-freertos/trace/SEGGER/SEGGER_RTT.c
lib-freertos/trace/SEGGER/SEGGER_SYSVIEW.c
)
target_include_directories(${EXECUTABLE}
PRIVATE
include
src
lib-hal/stm32h7xx/CMSIS/Include
lib-hal/stm32h7xx/CMSIS/Device/ST/STM32H7xx/Include
lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Inc
lib-freertos/include
lib-freertos/trace/Config
lib-freertos/trace/SEGGER
lib-freertos/trace/Sample/FreeRTOSV10/
lib-freertos/portable/GCC/ARM_CM7/r0p1
)
This solution works, but I know it is not a sustainable approach. So I tried to create libraries in the lib-hal and lib-freertos submodules, specifying their sources and includes:
add_library(lib-hal-stm32h7xx)
target_include_directories(lib-hal-stm32h7xx
PUBLIC
CMSIS/Include
CMSIS/Device/ST/STM32H7xx/Include
STM32H7xx_HAL_Driver/Inc
PRIVATE
STM32H7xx_HAL_Driver/Src
)
target_sources(lib-hal-stm32h7xx
PRIVATE
STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_tim.c
STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_tim_ex.c
STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_uart.c
STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_rcc.c
STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_rcc_ex.c
#long list of HAL c files there...
)
and then using
add_subdirectory(lib-hal/stm32h7xx)
add_subdirectory(lib-freertos)
and
target_link_libraries(${EXECUTABLE} lib-freertos lib-hal-stm32h7xx)
to "import" submodules into application project. But when building the executable, gcc cannot access files stm32h7xx_hal_conf.h and FreeRTOSConfig.h which are located in root directory include. I do not want to put configuration headers into submodules because they are used in multiple projects with different configurations. Is it possible to somehow extend already specified directory search scope for library after adding it into parent project?
File structure of project:
- src
- include (configuration for lib-hal and lib-freertos included there)
- lib-hal
  - includes...
  - sources...
- lib-freertos
  - includes...
  - sources...
Thanks in advance for any response.
As Tsyvarev mentioned in the comments, you can modify the properties of the target in your project. To keep things clean, I usually create a function for this and place it in a separate file.
Tip: you can also add source files to the target. In the case of FreeRTOS, you could add the architecture-specific files, in case not all your projects run on the same MCU family.
function(configure_freertos target_name)
target_sources(${target_name}
PRIVATE
lib-freertos/portable/GCC/ARM_CM7/r0p1/port.c
)
target_include_directories(${target_name}
PUBLIC
include
lib-freertos/portable/GCC/ARM_CM7/r0p1
)
endfunction()
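For completeness, a sketch of how the root CMakeLists.txt could consume this; the target names lib-hal-stm32h7xx and lib-freertos are taken from the snippets above, and it is assumed the file defining configure_freertos() has been include()d:
# Root CMakeLists.txt (sketch)
add_subdirectory(lib-hal/stm32h7xx)   # defines lib-hal-stm32h7xx
add_subdirectory(lib-freertos)        # assumed to define the target lib-freertos

# Point the library targets at this project's configuration headers
# (stm32h7xx_hal_conf.h, FreeRTOSConfig.h) without touching the submodules.
target_include_directories(lib-hal-stm32h7xx PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include)
configure_freertos(lib-freertos)

target_link_libraries(${EXECUTABLE} PRIVATE lib-freertos lib-hal-stm32h7xx)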

How to fix third party dll include not being staged correctly by Unreal Build Tool?

I am using a pre-built C++ library in my Unreal project via a dynamic library file (let's say it's called MyPluginLib.dll). The library is contained in a plugin; let's call it MyPlugin.
Building, packaging, playing in editor works fine. However, a packaged build doesn’t start, giving the following error: Code execution cannot proceed, MyPluginLib.dll was not found.
The packaging process places MyPluginLib.dll file in MyGame\Plugins\MyPlugin\Binaries. However, the execution process is seemingly looking for it in MyGame\Binaries – moving the library there manually solves this issue.
Why is the OS unable to find the dll in the first folder? Is there something wrong in the build.cs, or my folder structure?
The folder structure of the plugin folder is as follows:
Includes in Plugins\MyPlugin\Source\ThirdParty\MyPluginLib\
Binaries in Plugins\MyPlugin\Binaries\(PLATFORM)\
The plugin’s Build.cs looks like this:
using System.IO;
using UnrealBuildTool;

public class MyPlugin : ModuleRules
{
    public MyPlugin(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = ModuleRules.PCHUsageMode.UseExplicitOrSharedPCHs;

        string PluginRoot = Path.GetFullPath(Path.Combine(ModuleDirectory, "..", ".."));
        string PlatformString = Target.Platform.ToString();
        string LibraryDirectory = Path.Combine(PluginRoot, "Binaries", PlatformString);

        PublicIncludePaths.Add(Path.Combine(PluginRoot, "Source", "ThirdParty", "MyPluginLib"));

        if (Target.Platform == UnrealTargetPlatform.Win64)
        {
            PublicAdditionalLibraries.Add(Path.Combine(LibraryDirectory, "MyPluginLib.lib"));
            RuntimeDependencies.Add(Path.Combine(LibraryDirectory, "MyPluginLib.dll"), StagedFileType.NonUFS);
        }
        else if (Target.Platform == UnrealTargetPlatform.Linux)
        {
            // linux binaries...
        }
    }
}
Would appreciate any help.
Check your packaged game's files; Unreal loves to not include certain plugin files in packaged builds.
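One hedged option, assuming the goal is to make the packaged game find the DLL next to its own executable: stage the DLL into the game's Binaries folder, or delay-load it and load it explicitly from the plugin folder. The "$(BinaryOutputDir)" variable and the exact RuntimeDependencies overload should be verified against your engine version; this is a sketch, not the confirmed fix:
// MyPlugin.Build.cs (sketch) -- Win64 branch only
if (Target.Platform == UnrealTargetPlatform.Win64)
{
    PublicAdditionalLibraries.Add(Path.Combine(LibraryDirectory, "MyPluginLib.lib"));

    // Stage the DLL next to the game executable instead of only under the plugin's Binaries.
    RuntimeDependencies.Add("$(BinaryOutputDir)/MyPluginLib.dll",
                            Path.Combine(LibraryDirectory, "MyPluginLib.dll"),
                            StagedFileType.NonUFS);

    // Alternative: delay-load the DLL so it can be loaded explicitly at runtime
    // (e.g. with FPlatformProcess::GetDllHandle) from the plugin directory.
    PublicDelayLoadDLLs.Add("MyPluginLib.dll");
}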

Avro generated class: Cannot access class 'Builder'. Check your module classpath for missing or conflicting dependencies

Running
val myAvroObject = MyAvroObject.newBuilder()
results in a compilation error:
Cannot access class 'MyAvroObject.Builder'. Check your module classpath for missing or conflicting dependencies
I am able to access other MyAvroObject members. More precisely, methods such as
val schema = MyAvroObject.getClassSchema()
val decoder = MyAvroObject.getDecoder()
compile fine. What makes it even stranger is that I can access newBuilder() in my test/ folder, but not in my src/ folder.
Why do I get a compile error when using newBuilder()? Is the namespace of the avro-schema used to generate MyAvroObject of importance?
"Check your module classpath" generally means that your dependencies (which you didn't provide) are messed up. One of them should probably read implementation instead of testImplementation, in order to have the method available in the main source set instead of only the test source set - but this may well have to do with the input classes, the output location of the generated classes, or annotations like @VisibleForTesting (just see what actually gets generated). The gradlew command can also list the dependencies per configuration. The builder might look like org.apache.avro.SchemaBuilder, but only avro-1.11.0.jar and avro-tools-1.11.0.jar are on the classpath here; with the "builder" design pattern, .newBuilder() returns the generated inner class Builder.
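For example, the resolved dependencies of a given configuration can be listed like this (the configuration names are assumptions and depend on the plugins applied):
./gradlew dependencies --configuration compileClasspath
./gradlew dependencies --configuration testCompileClasspath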
I had the same problem today and was able to solve it by adding the following additional source folder
<sourceDir>${project.basedir}/target/generated-sources/avro</sourceDir>
to the kotlin-maven-plugin.
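For context, a sketch of where that line sits in the pom.xml; the surrounding plugin configuration is assumed to already exist, and only the generated-sources entry is the actual fix:
<plugin>
  <groupId>org.jetbrains.kotlin</groupId>
  <artifactId>kotlin-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>compile</id>
      <goals><goal>compile</goal></goals>
      <configuration>
        <sourceDirs>
          <sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
          <sourceDir>${project.basedir}/target/generated-sources/avro</sourceDir>
        </sourceDirs>
      </configuration>
    </execution>
  </executions>
</plugin>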

How to disable default gradle buildType suffix (-release, -debug)

I migrated a 3rd-party tool's build.gradle configs, so it now uses Android Gradle plugin 3.5.3 and Gradle 5.4.1.
The build goes smoothly, but when I try to make an .aab archive, things break because the toolchain expects the output .aab file to be named MyApplicationId.aab, while the new Gradle defaults to outputting MyApplicationId-release.aab, with a buildType suffix that wasn't there before.
I tried to search for a solution, but documentation about product flavors is mostly about adding a suffix. How do I prevent the default "-release" suffix from being added? There weren't any product flavor blocks in the toolchain's Gradle config files.
I realized that I have to create custom tasks after reading other questions and answers:
How to change the generated filename for App Bundles with Gradle?
Renaming applicationVariants.outputs' outputFileName does not work because those are for .apks.
I'm using Gradle 5.4.1 so my Copy task syntax reference is here.
I don't quite understand where the "app.aab" name string came from, so I defined my own aabFile name string to match my toolchain's output.
I don't care about the source file so it's not deleted by another delete task.
Also my toolchain seems to be removing unknown variables surrounded by "${}" so I had to work around ${buildDir} and ${flavor} by omitting the brackets and using concatenation for proper delimiting.
tasks.whenTaskAdded { task ->
    if (task.name.startsWith("bundle")) { // e.g.: bundleRelease
        def renameTaskName = "rename${task.name.capitalize()}Aab" // renameBundleReleaseAab
        def flavorSuffix = task.name.substring("bundle".length()).uncapitalize() // "release"
        tasks.create(renameTaskName, Copy) {
            def path = "$buildDir/outputs/bundle/" + "$flavorSuffix/"
            def aabFile = "${android.defaultConfig.applicationId}-" + "$flavorSuffix" + ".aab"
            from(path) {
                include aabFile
                rename aabFile, "${android.defaultConfig.applicationId}.aab"
            }
            into path
        }
        task.finalizedBy(renameTaskName)
    }
}
As the original answer said: This will add more tasks than necessary, but those tasks will be skipped since they don't match any folder.
e.g.
Task :app:renameBundleReleaseResourcesAab NO-SOURCE

How to calculate a module's dist hash

I have Perl 6 installed in ~/.rakudo-star/rakudo-star-2018.04, using LoneStar. When zef installs a module, it gets installed into a subdirectory of the Rakudo Perl 6 directory. In there is a directory named perl6/site/resources, which seems to hold all the installed files. How can I figure out which module is contained in which file, using Perl 6?
If you want to get the source of a namespace that would get loaded you can do:
my $module-name = 'Test';
# Get a Distribution object which provides an IO interface to its contents
my $compunit = $*REPO.resolve(CompUnit::DependencySpecification.new(:short-name($module-name)));
my $distribution = $compunit.distribution;
my $handle-from-name = $distribution.content($distribution.meta<provides>{$module-name}.keys[0]).open;
say $handle-from-name.slurp(:close);
# Or if we already know the name-path:
my $handle-from-path = $distribution.content("lib/Test.pm6").open;
say $handle-from-path.slurp(:close);
Note that $compunit.distribution will only work if resolve returned a CompUnit from a CompUnit::Repository::Installation repository.
rakudo#1812 is the framework to improve this further, allowing individual repositories to be queried ( $*REPO.resolve iterates the linked list of repos to give a result ) and unifying behavior for resolve/candidates/etc between CompUnit::Repository::Installation and CompUnit::Repository::FileSystem.
If I remember correctly, you shouldn't. It's zef that must take care of it. But if you positively have to, use the SHA1 signatures in the directory with zef locate:
zef --sha1 locate 5417D0588AE3C30CF7F84DA87D27D4521713522A
will output (in my system)
===> From Distribution: zef:ver<0.4.4>:auth<github:ugexe>:api<>
lib/Zef/Service/Shell/PowerShell/download.pm6 => /home/jmerelo/.rakudobrew/moar-2018.06/install/share/perl6/site/sources/5417D0588AE3C30CF7F84DA87D27D4521713522A
From your question, it's not too clear if what you want to do is the opposite, that is, find out which SHA1 corresponds to which file; in that case, try this:
zef locate bin/lwp-download.pl
which will return
===> From Distribution: LWP::Simple:ver<0.103>:auth<Cosimo Streppone>:api<>
bin/lwp-download.pl => /home/jmerelo/.rakudobrew/moar-2018.06/install/share/perl6/site/resources/059BD7DBF74D1598B0ACDB48574CC351A3AD16BC