I was wondering if there is a way (such as a command) to move a directory filled with, say, image files, to the build directory using CMake 2.8.
Thanks in advance!
The file() command can do what you want.
From the cmake manual:
The file() command also provides COPY and INSTALL signatures:
file(<COPY|INSTALL> files... DESTINATION <dir>
[FILE_PERMISSIONS permissions...]
[DIRECTORY_PERMISSIONS permissions...]
[NO_SOURCE_PERMISSIONS] [USE_SOURCE_PERMISSIONS]
[FILES_MATCHING]
[[PATTERN <pattern> | REGEX <regex>]
[EXCLUDE] [PERMISSIONS permissions...]] [...])
The COPY signature copies files, directories, and symlinks to a destination folder. Relative input paths are evaluated with respect to the current source directory, and a relative destination is evaluated with respect to the current build directory. Copying preserves input file timestamps, and optimizes out a file if it exists at the destination with the same timestamp. Copying preserves input permissions unless explicit permissions or NO_SOURCE_PERMISSIONS are given (default is USE_SOURCE_PERMISSIONS). See the install(DIRECTORY) command for documentation of permissions, PATTERN, REGEX, and EXCLUDE options.
So you would have something like (tested):
file(COPY ${YOUR_SRC_IMAGE_DIR} DESTINATION ${CMAKE_CURRENT_BINARY_DIR}/YourPreferredDestination)
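If you only need certain file types out of that directory, the FILES_MATCHING/PATTERN options from the signature quoted above can be added to the same call (untested sketch; adjust the patterns to your image formats):
file(COPY ${YOUR_SRC_IMAGE_DIR}
     DESTINATION ${CMAKE_CURRENT_BINARY_DIR}/YourPreferredDestination
     FILES_MATCHING PATTERN "*.png" PATTERN "*.jpg")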
To move, you can use the RENAME form:
file(RENAME ${YOUR_SRC_IMAGE_DIR} ${CMAKE_CURRENT_BINARY_DIR}/PreferredDestination)
But I am not sure you would want that, because the source would no longer be available to reproduce the build, hence the copy command above.
Context: I'm trying to come up with a fix for https://github.com/tensorflow/tensorflow/issues/37861, where the header files of an external dependency are manually listed, but that list is version-specific and hence impossible to keep up to date.
What is happening:
tf_http_archive(name = "com_google_protobuf", system_build_file = clean_dep("//third_party/systemlibs:protobuf.BUILD") ...) is invoked
tf_http_archive is a repository_rule with effectively nothing but ctx.template("BUILD.bazel", ctx.attr.system_build_file, {...}, False)
In the protobuf.BUILD there is a list HEADERS = ["google/protobuf/any.pb.h", ...] which is passed to the hdrs argument of cc_library calls
a genrule apparently symlinks those headers from $(INCLUDEDIR) into $(@D) (I'm not really familiar with Bazel, but IIUC the latter is some internal build directory used later)
As I'm unfamiliar with Bazel in general, I'll just assume the list of headers is required and that a $(INCLUDEDIR)/google/protobuf folder exists somewhere (else) on the system, e.g. /usr/local/include.
Is there any way to get all *.h and *.inc files in that format (i.e. relative to $(INCLUDEDIR)) via a glob or similar? The Bazel glob function doesn't work for absolute paths, so it can't be used.
I found https://github.com/bazelbuild/bazel/issues/8846, which suggests using new_local_repository with a build_file and a path set to (in this case) $(INCLUDEDIR), but I don't see how that could be applied to the tf_http_archive (which has some conditions to either download the dependency or just use the system_build_file). This also seems to avoid the symlinking, which I'm highly suspicious of anyway because that folder is added via -iquote while the include style is #include <...>; see my comments in https://github.com/tensorflow/tensorflow/issues/37861.
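For reference, the suggestion from that Bazel issue would look roughly like the following as a standalone WORKSPACE rule (untested sketch: the repository name is made up, /usr/local/include stands in for $(INCLUDEDIR), and how to wire this into tf_http_archive's download-or-system-lib logic is exactly the open question):
new_local_repository(
    name = "system_protobuf_headers",  # made-up name
    path = "/usr/local/include",       # stand-in for $(INCLUDEDIR)
    build_file_content = """
cc_library(
    name = "headers",
    # glob works here because the path above is mapped into this repository
    hdrs = glob(["google/protobuf/**/*.h", "google/protobuf/**/*.inc"]),
    includes = ["."],
    visibility = ["//visibility:public"],
)
""",
)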
Bonus points for contributing to the issue, or for ideas on why action_env environment variables seem to be ignored in a native.cc_library call.
Imagine a code generator which reads an input file (say a UML class diagram) and produces an arbitrary number of source files which I want handled in my project. (To draw a simple picture, let's assume the code generator just produces .cpp files.)
The problem is that the number of generated files depends on the input file and thus is not known when writing the CMakeLists.txt file, or even during CMake's configure step. E.g.:
>>> code-gen uml.xml
generate class1.cpp..
generate class2.cpp..
generate class3.cpp..
What's the recommended way to handle generated files in such a case? You could use FILE(GLOB ...) to collect the file names after running code-gen the first time, but this is discouraged because CMake would not know any files on the first run and would not notice later when the number of files changes.
I could think of some approaches but I don't know if CMake covers them, e.g.:
(somehow) define a dependency from an input file (uml.xml in my example) to a variable (list with generated file names)
in case the code generator can be convinced to tell which files it generates, the output of code-gen could be used to create the list of generated file names (this would lead to similar problems, but at least I would not have to use GLOB, which might collect old files)
just define a custom target which runs the code generator and handles the output files without CMake (don't like this option)
Update: This question targets a similar problem, but it only asks how to glob generated files, which does not address how to re-run the configure step when the input file changes.
Together with Tsyvarev's answer and some more googling, I came up with the following CMakeLists.txt, which does what I want:
cmake_minimum_required(VERSION 3.6)
project(generated)

set(IN_FILE "${CMAKE_SOURCE_DIR}/input.txt")

# re-run CMake when the input file changes
set_property(DIRECTORY APPEND PROPERTY CMAKE_CONFIGURE_DEPENDS "${IN_FILE}")

# run the generator at configure time; it prints the generated file names
execute_process(
    COMMAND python3 "${CMAKE_SOURCE_DIR}/code-gen" "${IN_FILE}"
    WORKING_DIRECTORY ${PROJECT_BINARY_DIR}
    INPUT_FILE "${IN_FILE}"
    OUTPUT_VARIABLE GENERATED_FILES
    OUTPUT_STRIP_TRAILING_WHITESPACE
)
add_executable(generated main.cpp ${GENERATED_FILES})
It turns an input file (input.txt) into output files using code-gen and compiles them.
execute_process runs during the configure step, and the set_property() command makes sure CMake is re-run when the input file changes.
Note: in this example the code generator must print a CMake-friendly (i.e. semicolon-separated) list of the generated files on stdout, which is nice if you can modify the code generator. FILE(GLOB ...) would do the trick too, but that would for sure lead to problems (e.g. old generated files being compiled too, colleagues complaining about your code, etc.).
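For illustration, a hypothetical code-gen honoring that contract could look like this (all names are made up; it writes one .cpp per non-empty input line into the current directory, which is PROJECT_BINARY_DIR thanks to WORKING_DIRECTORY, and prints the semicolon-separated list CMake expects):
#!/usr/bin/env python3
import sys

# read class names from the input file passed as the first argument
names = [line.strip() for line in open(sys.argv[1]) if line.strip()]

generated = []
for name in names:
    filename = name + ".cpp"
    with open(filename, "w") as out:  # cwd is the build directory
        out.write("// generated from " + name + "\n")
    generated.append(filename)

# the semicolon-separated output becomes a CMake list in GENERATED_FILES
print(";".join(generated), end="")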
PS: I don't like to answer my own questions - If you come up with a nicer or cleaner solution in the next couple of days I'll take yours!
[Question] Does Session::RemoveFiles() remove files in subdirectories of the source directory? If not, how can this ability be implemented?
(Please do not ask me why I have the remote directory as /C/testTransfer/. The code is just for testing purposes.)
I have an SFTP program using the WinSCP .NET assembly. The program language is C++/CLI. It opens a work file which contains many lines of FTP instructions.
One type of instruction I have to handle is to transfer *.txt files from a source directory. The source directory may contain subdirectories, which may contain .txt files as well. Once the transfer is successful, the source files are to be deleted.
I use Session::GetFiles() for the transfer. It correctly transfers all .txt files in the source (/C/testTransfer/*.txt), even those in subdirectories (/C/testTransfer/sub/*.txt), to the destination.
transferOptions->FileMask = "*.txt";
session->GetFiles("/C/testTransfer", "C:\\temp\\win", false, transferOptions);
Now to remove, I use session->RemoveFiles("/C/testTransfer/*.txt"). Only the *.txt files directly in the source (/C/testTransfer/*.txt) are removed, not those in the subdirectory (/C/testTransfer/sub/*.txt).
In general, Session::RemoveFiles can remove files in subdirectories. But not this way with a wildcard, because WinSCP does not descend into subdirectories that do not match the wildcard (*.txt). Also note that even if you did not need the wildcard, Session::RemoveFiles would remove the subdirectories themselves, which I'm not sure you want.
You have other (and better, as in safer) options, though:
Use the remove parameter of the Session::GetFiles method to instruct it to remove each source file after its successful transfer.
If you need to delete the source files transactionally (i.e. only after the download of all files succeeded), iterate the TransferOperationResult::Transfers returned by Session::GetFiles and call Session::RemoveFiles for each transfer whose TransferEventArgs::Error is null.
Use the TransferEventArgs::FileName to get the file path to pass to Session::RemoveFiles, and use RemotePath::EscapeFileMask to escape the file name before passing it on (see the sketch after the next paragraph).
There's a similar full example available for Moving local files to different location after successful upload.
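For illustration, here is a minimal C++/CLI sketch of the transactional variant (names follow the WinSCP .NET assembly; the session setup is omitted and the snippet is untested):
TransferOptions^ transferOptions = gcnew TransferOptions();
transferOptions->FileMask = "*.txt";

// download first; remove = false, so nothing is deleted yet
TransferOperationResult^ result =
    session->GetFiles("/C/testTransfer", "C:\\temp\\win", false, transferOptions);

if (result->IsSuccess)  // delete only if every download succeeded
{
    for each (TransferEventArgs^ transfer in result->Transfers)
    {
        // escape the name so special characters are not treated as a file mask
        session->RemoveFiles(
            RemotePath::EscapeFileMask(transfer->FileName))->Check();
    }
}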
To recursively delete files matching a wildcard in a standalone operation (not after downloading the same files), use Session::EnumerateRemoteFiles. Pass your wildcard to its mask argument and use the EnumerationOptions::AllDirectories option for recursion.
Call Session::RemoveFiles for each returned file, again using RemotePath::EscapeFileMask to escape the file name first.
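A corresponding sketch for the standalone recursive delete (same assumptions as above):
// enumerate *.txt files in all subdirectories and remove them one by one
for each (RemoteFileInfo^ fileInfo in
    session->EnumerateRemoteFiles(
        "/C/testTransfer", "*.txt", EnumerationOptions::AllDirectories))
{
    session->RemoveFiles(
        RemotePath::EscapeFileMask(fileInfo->FullName))->Check();
}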
I have put the following commands in my Findglm.cmake file to point to the headers of this header-only library.
find_path(glm_INCLUDE_DIR NAMES glm.hpp matrix_transform.hpp type_ptr.hpp
          PATHS ${CMAKE_SOURCE_DIR}/libs/glm-0.9.3.2/glm
                ${CMAKE_SOURCE_DIR}/libs/glm-0.9.3.2/glm/gtc
                ${CMAKE_SOURCE_DIR}/libs/glm-0.9.3.2/glm/gtx
                ${CMAKE_SOURCE_DIR}/libs/glm-0.9.3.2/glm/core)
set(glm_INCLUDE_DIRS ${glm_INCLUDE_DIR})
However, when I generate my Xcode project, it says that it cannot locate matrix_transform.hpp and type_ptr.hpp.
I have played around with this some more and it appears to only find the first name. Am I using find_path wrong?
I am using CMake 2.8.8 from DarwinPorts.
The find_path() command returns a single directory. In your case, that is the first listed directory, which contains the first file.
If glm will always be located in your source dir, it would be sufficient to do:
include_directories(${CMAKE_SOURCE_DIR}/libs/glm-0.9.3.2/glm
${CMAKE_SOURCE_DIR}/libs/glm-0.9.3.2/glm/gtc
${CMAKE_SOURCE_DIR}/libs/glm-0.9.3.2/glm/gtx
${CMAKE_SOURCE_DIR}/libs/glm-0.9.3.2/glm/core)
find_path() is meant for locating directories somewhere outside of your project.
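If you do want to keep using find_path(), a hedged alternative is a single call that locates the glm root via a relative header name and then derives the include dirs from it (untested sketch; glm_ROOT_DIR is a made-up variable):
find_path(glm_ROOT_DIR NAMES glm/glm.hpp
          PATHS ${CMAKE_SOURCE_DIR}/libs/glm-0.9.3.2)
if(glm_ROOT_DIR)
  set(glm_INCLUDE_DIRS
      ${glm_ROOT_DIR}/glm
      ${glm_ROOT_DIR}/glm/gtc
      ${glm_ROOT_DIR}/glm/gtx
      ${glm_ROOT_DIR}/glm/core)
endif()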
I got a RAR file which contains several files with the same name.
For example,
index.txt 40 Text Document 04/01/2010 4:40PM
index.txt 22 Text Document 04/01/2010 4:42PM
index.txt 10 Text Document 04/01/2010 4:45PM
index.txt 13 Text Document 04/01/2010 4:50PM
Why?
As said before, the files could be in separate paths, but as I'll show further down, this isn't always the case.
If you use WinRAR to list the archive contents and your options are set as follows, then it only appears that you have files with the same name; they are actually in different paths:
Options -> File list -> Flat folders view (Ctrl+H)
Options -> File list -> Details
After the CRC32 column there is one called Path. If it differs between the entries, extraction shouldn't be a problem if:
Extract -> Extraction path and options -> Advanced -> Extract relative paths is set.
If it is set to Do not extract paths, WinRAR will have to ask you to rename the files because of file system limitations.
I assume the command-line unrar won't be a problem in this case, because you would need to specify additional parameters to change its default behavior.
It is possible for a RAR archive to have multiple files with the same name in the same directory. If you use Windows, use "C:\Program Files\WinRAR\Rar.exe" instead of rar on the command line in the following examples.
Create a new file and add it to a RAR archive. You can also check the changes by listing its contents.
rar a rarfile.rar testfile.txt
rar l rarfile.rar
If you try to re-add this file, rar will replace the already archived file with the same name:
rar a rarfile.rar testfile.txt
Updating archive rarfile.rar
Updating testfile.txt OK
Done
Create another file, or rename the first one, and add it to the RAR file.
move testfile.txt second.txt (new file)
rar a rarfile.rar second.txt (add it)
rar lb rarfile.rar (list archive, bare info)
Rename the second file to the first one's name.
rar rn rarfile.rar second.txt testfile.txt
This is how you create a RAR file with multiple files of the same name in the same path. The steps are similar in WinRAR. Note that if you try to rename the file again, the names of all files with that name in that directory will change too.
Why would someone want to do this?
The only explanation I can think of is that the person who created this archive wanted to imitate a version control/backup system. But if you want to extract only one specific version and it isn't the first one, WinRAR extracts the wrong file. It seems I've found a very obscure WinRAR bug :-)
Edit: this seems to be a bad explanation after finding the following in the RAR documentation:
-ver[n] File version control
Forces RAR to keep previous file versions when updating
files in the already existing archive. Old versions are
renamed to 'filename;n', where 'n' is the version number.
By default, when unpacking an archive without the switch
-ver, RAR extracts only the last added file version, the name
of which does not include a numeric suffix. But if you specify
a file name exactly, including a version, it will be also
unpacked. For example, 'rar x arcname' will unpack only
last versions, when 'rar x arcname file.txt;5' will unpack
'file.txt;5', if it is present in the archive.
If you specify -ver switch without a parameter when unpacking,
RAR will extract all versions of all files that match
the entered file mask. In this case a version number is
not removed from unpacked file names. You may also extract
a concrete file version specifying its number as -ver parameter.
It will tell RAR to unpack only this version and remove
a version number from file names. For example,
'rar x -ver5 arcname' will unpack only 5th file versions.
If you specify 'n' parameter when archiving, it will limit
the maximum number of file versions stored in the archive.
Old file versions exceeding this threshold will be removed.
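Going by that documentation, the archive in question could have been produced with the -ver switch, along the lines of (untested sketch, following the syntax of the examples above):
rar a -ver rarfile.rar index.txt   (update; the previous version is kept)
rar a -ver rarfile.rar index.txt   (versions keep accumulating)
rar x -ver rarfile.rar             (extract all versions, numeric suffixes kept)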
They are in different paths, most likely.
Try outputting the full path, or see what happens when you extract them.
You'll probably see something like:
index.txt
path1/index.txt
path2/index.txt
etc.