How to copy a whole directory recursively with an npm script on Windows 10 PowerShell?

Right now I have the following tree:
test
├───1
│       package.json
│
└───2
    └───src
        │   asd.txt
        │
        └───asd
                asd - Copy (2).txt
                asd - Copy.txt
                asd.txt
What I want is a script that, when run in dir 1, goes to dir 2 and copies the whole src dir recursively from there to dir 1, so that in the end dir 1 contains the same src as dir 2.
When I cd to directory 1 and run npm run build:ui, which is defined in package.json as
"scripts": {
"build:ui": "cd ..\\2 && copy src ..\\1"
}
it does roughly what I want, but not quite: it copies files from directory 2 to 1, but it doesn't copy the whole directory with all of its subdirectories and contents; it only copies the files directly inside 2/src/. In other words, here's what the tree looks like after the operation:
test
├───1
│       asd.txt
│       package.json
│
└───2
    └───src
        │   asd.txt
        │
        └───asd
                asd - Copy (2).txt
                asd - Copy.txt
                asd.txt
So only the file asd.txt got copied.
Other configurations I have tried without success include:
"scripts": {
"build:ui": "cd ..\\2 && copy -r src ..\\1"
}
"scripts": {
"build:ui": "cd ..\\2 && Copy-Item -Recursive src ..\\1"
}
"scripts": {
"build:ui": "cd ..\\2 && cp -r src ..\\1"
}
...none of which are even valid.

Consider using the xcopy command instead of copy, as it better suits your requirement.
Redefine your build:ui script in the scripts section of your package.json file as follows:
"scripts": {
"build:ui": "xcopy /e/h/y/q \"../2/src\" \"./src\\\" > nul 2>&1"
}
Running:
When you cd to the directory named 1 (i.e. the directory that contains the package.json with the aforementioned build:ui script defined in it) and then run:
npm run build:ui
it will produce the following directory structure:
test
├── 1
│   ├── package.json
│   └── src
│       ├── asd
│       │   ├── asd - Copy (2).txt
│       │   ├── asd - Copy.txt
│       │   └── asd.txt
│       └── asd.txt
└── 2
    └── src
        ├── asd
        │   ├── asd - Copy (2).txt
        │   ├── asd - Copy.txt
        │   └── asd.txt
        └── asd.txt
As you can see, the src folder inside folder 2, along with all of its contents, has been copied to folder 1.
Explanation:
The following provides a detailed breakdown of the aforementioned xcopy command:
Options:
/e - Copy folders and subfolders, including empty folders.
/h - Copy hidden and system files and folders.
/y - Suppress prompt to confirm overwriting a file.
/q - Do not display file names while copying.
Notes:
Each pathname has been encased in JSON-escaped double quotes, i.e. \"...\"
The ./src\\ part has a trailing backslash (\), which has been JSON-escaped (\\), to inform xcopy that the destination is a directory. This also ensures the src directory is created if it doesn't already exist.
The > nul 2>&1 part suppresses the confirmation log that states how many files were copied.
Related information:
It's worth noting that on Windows, npm uses cmd.exe as the default shell for running npm scripts - regardless of the CLI tool you're using, e.g. PowerShell. You can verify this by using the npm config command to check the script-shell setting. For instance, run the following command:
npm config get script-shell
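If you actually wanted npm to run scripts with PowerShell instead (not required for the xcopy approach above, but it would make cmdlets like Copy-Item usable in scripts), the same setting can be changed - for example, assuming powershell.exe is on your PATH:
npm config set script-shell "powershell.exe"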
Edit:
If you want your resultant directory structure to be like this:
test
├── 1
│   ├── asd
│   │   ├── asd - Copy (2).txt
│   │   ├── asd - Copy.txt
│   │   └── asd.txt
│   ├── asd.txt
│   └── package.json
└── 2
    └── src
        ├── asd
        │   ├── asd - Copy (2).txt
        │   ├── asd - Copy.txt
        │   └── asd.txt
        └── asd.txt
This time the contents of the src folder inside the folder named 2 have been copied to folder 1 - but not the containing src folder itself.
Then you need to define your npm script as follows:
"scripts": {
"build:ui": "xcopy /e/h/y/q \"../2/src\" \".\" > nul 2>&1"
}
Note: the destination path has been changed from \"./src\\\" to \".\".
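Note also that > nul 2>&1 sends both standard output and standard error to nul, so any xcopy error messages are hidden as well. If you'd rather keep errors visible while still suppressing the normal copy output, dropping the 2>&1 part should be enough:
"scripts": {
  "build:ui": "xcopy /e/h/y/q \"../2/src\" \"./src\\\" > nul"
}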

For something like this, I might use an approach similar to the one below.
Modify your npm script (build:ui) to call a PowerShell script (build.ui.ps1) that is located in the same dir as the package.json file.
"scripts": {
"build:ui": "#powershell -NoProfile -ExecutionPolicy Unrestricted -Command ./build.ui.ps1"
},
Create the aforementioned PowerShell script with the following contents.
param(
  $srcParentDir = '2',
  $srcDir = 'src',
  $srcDestDir = '1'
)
# Move to the parent of the folder containing this script (i.e. the test dir)
Set-Location (Get-Item $PSScriptRoot).Parent.FullName
# Copy 2\src and everything inside it into 1
Copy-Item -Path "$srcParentDir\$srcDir" -Destination $srcDestDir -Recurse
Run the npm script
npm run build:ui
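If you want to test the script outside of npm, or override the defaults defined in its param block, you could also invoke it directly from directory 1 - something like the following (the parameter values shown are just the defaults):
powershell -NoProfile -ExecutionPolicy Unrestricted -Command "./build.ui.ps1 -srcParentDir 2 -srcDir src -srcDestDir 1"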

Related

How to properly use CMake to create a complex project with dependencies on custom libraries

I'm trying to create a complex project that builds into a single executable file and uses the following libraries: two libraries, BHV and HAL, which both use one interface library.
I have this project structure:
.
├── BHV
│   ├── CMakeCache.txt
│   ├── CMakeFiles
│   ├── cmake_install.cmake
│   ├── CMakeLists.txt
│   ├── include
│   ├── libBHV_Library.so
│   ├── Makefile
│   └── sources
├── HAL
│   ├── check_libraries.cmake
│   ├── CMakeCache.txt
│   ├── CMakeFiles
│   ├── cmake_install.cmake
│   ├── CMakeLists.txt
│   ├── include
│   ├── libHAL_Library.so
│   ├── Makefile
│   └── sources
├── Interface
│   ├── CMakeCache.txt
│   ├── CMakeFiles
│   ├── cmake_install.cmake
│   ├── CMakeLists.txt
│   ├── include
│   ├── libInterface_Library.a
│   ├── Makefile
│   └── sources
├── CMakeCache.txt
├── CMakeFiles
├── cmake_install.cmake
├── CMakeLists.txt
├── main.cpp
├── Makefile
├── README.md
Unfortunately, I can't connect the individual libraries to each other.
In Interface_lib's CMakeLists.txt I have this:
cmake_minimum_required(VERSION 3.10)
project(Interface_Library)
#requires at least C++17
set(CMAKE_CXX_STANDARD 17)
# Add all .cpp files from sources folder
file(GLOB SOURCES "sources/*.cpp")
# Add all .h files from include folder
file(GLOB HEADERS "include/*.h")
# Add main.cpp to the project
add_library(Interface_Library STATIC ${SOURCES} ${HEADERS})
In HAL_lib's CMakeLists.txt I have this:
cmake_minimum_required(VERSION 3.10)
project(HAL_Library)
# requires at least C++17
set(CMAKE_CXX_STANDARD 17)
################################### DIR_MANAGMENT #########################################
# Get the parent directory
get_filename_component(PARENT_DIR ${CMAKE_CURRENT_LIST_DIR} DIRECTORY)
#Set the directory of the parent file as the current directory
set(CMAKE_CURRENT_LIST_DIR ${PARENT_DIR})
message("MYPROJECT_DIR directory: ${CMAKE_CURRENT_LIST_DIR}")
################################### HAL_LIB #########################################
# Add all .cpp files from sources folder
file(GLOB SOURCES "sources/*.cpp")
# Add all .h files from include folder
file(GLOB HEADERS "include/*.h")
# Add main.cpp to the project
add_library(HAL_Library SHARED ${SOURCES} ${HEADERS})
################################### INTERFACE_LIB #########################################
target_link_directories(${PROJECT_NAME} PUBLIC ${CMAKE_CURRENT_LIST_DIR}/Interface)
# Link the Interface_Library into the HAL_Library
target_link_libraries(HAL_Library Interface_Library)
# check if libraries were included
set(TARGET_NAME HAL_Library)
include(check_libraries.cmake)
This is the code I use in check_libraries.cmake (found on the internet):
# Get a list of referenced libraries
get_target_property(LINK_LIBS ${TARGET_NAME} LINK_LIBRARIES)
# Print the list of referenced libraries
message("Referenced libraries: ${LINK_LIBS}")
# Verify that libraries are available on the system
foreach(LIB ${LINK_LIBS})
  execute_process(COMMAND ldd $<TARGET_FILE:${TARGET_NAME}> | grep ${LIB}
                  RESULT_VARIABLE res
                  OUTPUT_QUIET ERROR_QUIET)
  if(res EQUAL "0")
    message("Library ${LIB} was successfully linked with ${TARGET_NAME}")
  else()
    message(FATAL_ERROR "Error: Library ${LIB} not found.")
  endif()
endforeach()
As an output I keep getting the library not found. What am I doing wrong?
And is my approach to project structure correct?
Thank you.
Typically you do:
# ./CMakeLists.txt
cmake_minimum_required(VERSION 3.10)
project(Interface_Library)
add_subdirectory(HAL)
add_subdirectory(Interface)

# HAL/CMakeLists.txt
file(GLOB srcs sources/*.cpp include/*.h)
add_library(HAL ${srcs})
target_include_directories(HAL PUBLIC include)
target_link_libraries(HAL PUBLIC Interface)

# Interface/CMakeLists.txt
file(GLOB srcs sources/*.cpp include/*.h)
add_library(Interface ${srcs})
target_include_directories(Interface PUBLIC include)
That's all. Only one project() call, in the root. Note the target_include_directories calls, which are missing from your code, and the root CMakeLists.txt that pulls everything in. I find collecting HEADERS and SOURCES separately confusing, and I would also use lowercase for everything. No target_link_directories is needed; CMake will find everything. Also, everything is compiled as one big project, not separately.
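Since the question also asks for a single executable built from main.cpp, the root CMakeLists.txt could additionally contain something like the following (the target name app is just a placeholder, not from the original project):
# ./CMakeLists.txt (continued)
add_executable(app main.cpp)            # build the final executable from main.cpp
target_link_libraries(app PRIVATE HAL)  # Interface comes in transitively via HAL's PUBLIC link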

Can I make a CMake target dependent upon a target in another CMake project?

I have two separate projects, but one of them must now incorporate aspects of the other, including the generation of some code, which is done by a Python script called by CMake.
Here is my project structure:
repo/
├── project_top/
│   ├── stuff_and_things.cpp
│   └── CMakeLists.txt
│
└── submods/
    └── project_bottom/
        ├── CMakeLists.txt
        └── tools/
            ├── build_scripts
            │   └── cmake_bits.cmake
            └── generator
                └── gen_code.py
In repo/submods/project_bottom/tools/build_scripts/cmake_bits.cmake there is a macro set_up_additional_targets(), which defines a custom target that runs repo/submods/project_bottom/tools/generator/gen_code.py in that directory. This is based on project_bottom being its own project.
add_custom_target(gen_code
  COMMAND echo "Generating code"
  COMMAND python3 gen_code.py args
  WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/tools/generator
)
Now, I need to make a new target in project_top dependent upon the gen_code target in project_bottom. How do I do this? The gen_code target needs to be run as part of the project_top build, but within the context of project_bottom, because for that target, ${CMAKE_CURRENT_SOURCE_DIR} needs to be repo/submods/project_bottom, not repo/project_top.
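One common way to wire this up - assuming project_top's CMakeLists.txt pulls project_bottom in via add_subdirectory, and that project_bottom's own CMakeLists.txt is what calls set_up_additional_targets(), so ${CMAKE_CURRENT_SOURCE_DIR} still resolves to repo/submods/project_bottom when gen_code is defined - is to reference the custom target with add_dependencies. A rough sketch (my_top_target is a placeholder name):
# project_top/CMakeLists.txt (illustrative)
add_subdirectory(../submods/project_bottom project_bottom_build)
add_executable(my_top_target stuff_and_things.cpp)
# ensure the code generation runs before my_top_target is built
add_dependencies(my_top_target gen_code)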

Combine the outputs of different rules in Snakemake

I would like to use Snakemake to first merge some files and then later process other files based on that merge. (Less abstract: I want to combine control IgG bam files of two different sets and then use those to perform peak calling on other files.)
In a minimal example, the folder structure would look like this.
├── data
│   ├── toBeMerged
│   │   ├── singleA
│   │   ├── singleB
│   │   ├── singleC
│   │   └── singleD
│   └── toBeProcessed
│       ├── NotProcess1
│       ├── NotProcess2
│       ├── NotProcess3
│       ├── NotProcess4
│       └── NotProcess5
├── merge.cfg
├── output
│   ├── mergeAB_merge
│   ├── mergeCD_merge
│   ├── NotProcess1_processed
│   ├── NotProcess2_processed
│   ├── NotProcess3_processed
│   ├── NotProcess4_processed
│   └── NotProcess5_processed
├── process.cfg
└── Snakefile
Which files are combined and which are processed are defined in two config files.
merge.cfg
singlePath controlName
data/toBeMerged/singleA output/controlAB
data/toBeMerged/singleB output/controlAB
data/toBeMerged/singleC output/controlCD
data/toBeMerged/singleD output/controlCD
and process.cfg
controlName inName
output/controlAB data/toBeProcessed/NotProcess1
output/controlAB data/toBeProcessed/NotProcess2
output/controlCD data/toBeProcessed/NotProcess3
output/controlCD data/toBeProcessed/NotProcess4
output/controlAB data/toBeProcessed/NotProcess5
I am currently stuck with a Snakefile like this, which itself does not work and gives me an error that the two rules are ambiguous. And even if I got it to work, I suspect that this is not the "correct" way, since the process rule should have {mergeName} as input to build its DAG. But that does not work either, since then I would need two wildcards in one rule.
import pandas as pd

cfgMerge = pd.read_table("merge.cfg").set_index("controlName", drop=False)
cfgProc = pd.read_table("process.cfg").set_index("inName", drop=False)

rule all:
    input:
        expand('{mergeName}', mergeName=cfgMerge.controlName),
        expand('{rawName}_processed', rawName=cfgProc.inName)

rule merge:
    input:
        lambda wc: cfgMerge[cfgMerge.controlName == wc.mergeName].singlePath
    output:
        "{mergeName}"
    shell:
        "cat {input} > {output}"

rule process:
    input:
        inMerge=lambda wc: cfgProc[cfgProc.inName == wc.rawName].controlName.iloc[0],
        Name=lambda wc: cfgProc[cfgProc.inName == wc.rawName].inName.iloc[0]
    output:
        '{rawName}_processed'
    shell:
        "cat {input.inMerge} {input.Name} > {output}"
I guess the key problem is how to use the output of one rule as the input for another when it does not depend on the same wildcard, or includes another wildcard.
For future reference:
The problem did not actually seem to be "using the output of one rule as the input for another when it does not depend on the same wildcard, or includes another wildcard."
It seems that the inputs for rule all and the outputs of the other two rules were ambiguous. The simple solution was to put every output in a different directory, and it worked (see below).
import pandas as pd

cfgMerge = pd.read_table("merge.cfg").set_index("controlName", drop=False)
cfgProc = pd.read_table("process.cfg").set_index("inName", drop=False)

#ruleorder: merge > process

rule all:
    input:
        expand('output/bam/{rawName}_processed', rawName=cfgProc.inName),
        expand('output/control/{controlNameSnake}', controlNameSnake=cfgMerge.controlName.unique())

rule merge:
    input:
        lambda wc: cfgMerge[cfgMerge.controlName == wc.controlNameSnake].singlePath.unique()
    output:
        'output/control/{controlNameSnake}'
    shell:
        'echo {input} > {output}'

rule process:
    input:
        in1="data/toBeProcessed/{rawName}",
        in2=lambda wc: "output/control/" + "".join(cfgProc[cfgProc.inName == wc.rawName].controlName.unique())
    output:
        'output/bam/{rawName}_processed'
    shell:
        'echo {input} > {output}'
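With this layout in place, a quick sanity check is to do a dry run before executing the workflow for real, e.g.:
snakemake -n          # dry run: list the merge/process jobs that would be executed
snakemake --cores 1   # actually run the workflow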

lessc.cmd cannot find @import file, path resolves fine?

Compiling my style.less with lessc displays the following error:
cmd.exe /D /C call C:/Users/<user>/AppData/Roaming/npm/lessc.cmd --no-color style.less
FileError: '/styles/vars.less' wasn't found. Tried - /styles/vars.less,D:\projects\ui\themes\default\styles\vars.less in D:\projects\ui\themes\default\style.less on line 1, column 1:
1 @import "/styles/vars.less";
2
Process finished with exit code 1
But if I change the path to something invalid, I see "Cannot resolve file" as well as the unresolved variable. Is this a bug or something I'm doing wrong?
Here is the folder structure of the relevant files:
(root)/
├── styles/
│   └── vars.less
└── themes/
    └── default/
        └── style.less (the file with the error)

NSIS - check if process exists (nsProcess not working)

For my NSIS uninstaller, I want to check if a process is running. FindProcDLL is not working under Windows 7 x64, so I tried nsProcess.
I've downloaded version 1.6 from the website: http://nsis.sourceforge.net/NsProcess_plugin
If I start the nsProcessTest.nsi in the Example folder, I get the following errors:
Section: "Find process" ->(FindProcess)
!insertmacro: nsProcess::FindProcess
Invalid command: nsProcess::_FindProcess
Error in macro nsProcess::FindProcess on macroline 1
Error in script "C:\Users\Sebastian\Desktop\nsProcess_1_6\Example\nsProcessTest.nsi" on line 14 -- aborting creation process
This is line 14 of the example script:
${nsProcess::FindProcess} "Calc.exe" $R0
Does somebody know what is wrong? How can I check if a process is running with NSIS?
NSIS does not find the plug-in, so make sure you copied its files to the correct folder.
NSIS 2.x:
NSIS/
├── Include/
│   └── nsProcess.nsh
└── Plugins/
    └── nsProcess.dll
NSIS 3.x:
NSIS/
├── Include/
│   └── nsProcess.nsh
└── Plugins/
    ├── x86-ansi/
    │   └── nsProcess.dll
    └── x86-unicode/
        └── nsProcess.dll
The file inside Plugins\x86-unicode is nsProcessW.dll renamed to nsProcess.dll (blame the author for making it overly complicated!)
More generally, refer to How can I install a plugin? on the NSIS Wiki.
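Once the files are in place, using the plugin from a script looks something like the sketch below; the macro call is the same one as in the example script, while the return-code check is my assumption (0 appears to indicate the process was found - verify against the nsProcess readme for your version):
!include "nsProcess.nsh"

Section "Find process"
  ${nsProcess::FindProcess} "Calc.exe" $R0   ; sets $R0 to the plugin's result code
  StrCmp $R0 0 0 done                        ; assumption: 0 means the process was found
  MessageBox MB_OK "Calc.exe is running"
done:
SectionEnd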