Sparx Systems Enterprise Architect Script: Clone a Subpackage to another Package?

I am trying to write an automation script in JScript. Once I have finished creating the structure of 'Package b', I need the subpackages of 'Package b' inside 'Package a' in order to create a diagram.
Is it possible to clone a subpackage of, for example, 'Package b' into 'Package a' with the API? The subpackage is itself a package containing many more subpackages; I need not only the top-level subpackage but the complete structure of the package tree beneath it:
Package a
Package b
--> subpackage b.1
    --> subpackage b.1.1
    --> subpackage b.1.2
    --> subpackage b.1.3
        --> subpackage b.1.3.1
        --> and so on...

Use Package.Clone(), which inserts a copy of the package into the same parent as the original package and returns the newly created package.
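Since Clone() creates the copy under the original parent, you still have to move the result under 'Package a' afterwards. A minimal JScript sketch of the idea; the package names and the re-parenting via ParentID are assumptions, so check it against your model and the EA automation documentation:

var root = Repository.Models.GetAt(0);
var packageA = root.Packages.GetByName("Package a");
var packageB = root.Packages.GetByName("Package b");
var sub = packageB.Packages.GetByName("subpackage b.1");

// Clone() inserts the copy next to the original, i.e. under Package b
var copy = sub.Clone();

// Re-parent the clone so it lands in Package a (assumed mechanism)
copy.ParentID = packageA.PackageID;
copy.Update();
Repository.RefreshModelView(packageA.PackageID);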

Related

Why doesn't cmake find_package(VTK) find VTK_INCLUDE_DIRS after building VTK from source and installing?

I'm running CMake version 3.23.0-rc1 on Ubuntu 20.04.
I built VTK 8.2 from source: cmake, make, then 'make install'. Now I am trying to find the VTK package for my own application, using CMake's find_package(VTK). The application's CMakeLists.txt contains this:
find_package(VTK)
message("VTK_FOUND: ${VTK_FOUND}")
message("VTK_INCLUDE_DIRS: ${VTK_INCLUDE_DIRS}")
message("VTK_LIBRARIES: ${VTK_LIBRARIES}")
The result is that VTK_FOUND=1 and VTK_LIBRARIES contains many entries, but VTK_INCLUDE_DIRS is blank/empty. Why would this be?
I do see the directory /usr/local/lib/cmake/vtk-8.2, which contains many *.cmake files. But I don't see a corresponding /usr/local/include/cmake directory, despite the presence of /usr/local/include/vtk-8.2. Is that expected? Here is the output:
VTK_FOUND: 1
VTK_INCLUDE_DIRS:
VTK_LIBRARIES: VTK::WrappingTools;VTK::ViewsQt;VTK::ViewsInfovis;VTK::CommonColor;VTK::ViewsContext2D;VTK::loguru;VTK::TestingRendering;VTK::TestingCore;VTK::vtksys;VTK::RenderingQt;VTK::PythonContext2D;VTK::RenderingVolumeOpenGL2;VTK::RenderingOpenGL2;VTK::glew;VTK::opengl;VTK::PythonInterpreter;VTK::Python;VTK::RenderingLabel;VTK::octree;VTK::RenderingLOD;VTK::RenderingImage;VTK::RenderingContextOpenGL2;VTK::IOVeraOut;VTK::hdf5;VTK::IOTecplotTable;VTK::IOSegY;VTK::IOParallelXML;VTK::IOPLY;VTK::IOOggTheora;VTK::theora;VTK::ogg;VTK::IONetCDF;VTK::netcdf;VTK::IOMotionFX;VTK::pegtl;VTK::IOParallel;VTK::jsoncpp;VTK::IOMINC;VTK::IOLSDyna;VTK::IOInfovis;VTK::libxml2;VTK::zlib;VTK::IOImport;VTK::IOGeometry;VTK::IOVideo;VTK::IOMovie;VTK::IOExportPDF;VTK::libharu;VTK::IOExportGL2PS;VTK::RenderingGL2PSOpenGL2;VTK::gl2ps;VTK::png;VTK::IOExport;VTK::RenderingVtkJS;VTK::RenderingSceneGraph;VTK::IOExodus;VTK::exodusII;VTK::IOEnSight;VTK::IOCityGML;VTK::pugixml;VTK::IOAsynchronous;VTK::IOAMR;VTK::InteractionImage;VTK::ImagingStencil;VTK::ImagingStatistics;VTK::ImagingMorphological;VTK::ImagingMath;VTK::GUISupportQtSQL;VTK::IOSQL;VTK::sqlite;VTK::GUISupportQt;VTK::GeovisCore;VTK::libproj;VTK::InfovisLayout;VTK::ViewsCore;VTK::InteractionWidgets;VTK::RenderingVolume;VTK::RenderingAnnotation;VTK::ImagingHybrid;VTK::ImagingColor;VTK::InteractionStyle;VTK::FiltersTopology;VTK::FiltersSelection;VTK::FiltersSMP;VTK::FiltersPython;VTK::FiltersProgrammable;VTK::FiltersPoints;VTK::FiltersVerdict;VTK::verdict;VTK::FiltersParallelImaging;VTK::FiltersImaging;VTK::ImagingGeneral;VTK::FiltersHyperTree;VTK::FiltersGeneric;VTK::FiltersFlowPaths;VTK::FiltersAMR;VTK::FiltersParallel;VTK::FiltersTexture;VTK::FiltersModeling;VTK::FiltersHybrid;VTK::RenderingUI;VTK::DomainsChemistry;VTK::CommonPython;VTK::WrappingPythonCore;VTK::ChartsCore;VTK::InfovisCore;VTK::FiltersExtraction;VTK::ParallelDIY;VTK::diy2;VTK::IOXML;VTK::IOXMLParser;VTK::expat;VTK::ParallelCore;VTK::IOLegacy;VTK::IOCore;VTK::doubleconversion;VTK::lz4;VTK::lzma;VTK::utf8;VTK::FiltersStatistics;VTK::eigen;VTK::ImagingFourier;VTK::ImagingSources;VTK::IOImage;VTK::DICOMParser;VTK::jpeg;VTK::metaio;VTK::tiff;VTK::RenderingContext2D;VTK::RenderingFreeType;VTK::freetype;VTK::kwiml;VTK::RenderingCore;VTK::FiltersSources;VTK::ImagingCore;VTK::FiltersGeometry;VTK::FiltersGeneral;VTK::CommonComputationalGeometry;VTK::FiltersCore;VTK::CommonExecutionModel;VTK::CommonDataModel;VTK::CommonSystem;VTK::CommonMisc;VTK::CommonTransforms;VTK::CommonMath;VTK::CommonCore
find_package(VTK) no longer sets the VTK_INCLUDE_DIRS variable. If you look at the description part of vtk-config.cmake (the script CMake/vtk-config.cmake.in contains the template of that file), you will find no mention of VTK_INCLUDE_DIRS.
Since the VTK_LIBRARIES variable contains IMPORTED targets (of the form VTK::foo), linking against the contents of that variable with target_link_libraries will automatically provide the include directories.
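A minimal sketch of a consuming CMakeLists.txt; the project and source file names are placeholders:

cmake_minimum_required(VERSION 3.12)
project(myapp)

find_package(VTK REQUIRED)

add_executable(myapp main.cpp)
# The imported VTK::* targets carry their include directories and compile
# definitions with them, so no include_directories() call is needed.
target_link_libraries(myapp PRIVATE ${VTK_LIBRARIES})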

i18n.merge_file can't translate desktop file

I'm trying to convert an autotools project to Meson and I'm stuck on the translation of the desktop file.
Getting all the .mo files created is no problem.
POTFILES and LINGUAS are in the 'po' folder, as per the manual.
The only problem is that i18n.merge_file is not generating the file with translations.
My meson.build looks like this:
...
package = meson.project_name()
i18n = import('i18n')
add_project_arguments('-DGETTEXT_PACKAGE="@0@"'.format(package), language: 'c')
subdir('po')
i18n.merge_file(
  input: 'data/clipit.desktop.in',
  output: 'clipit.desktop',
  type: 'desktop',
  po_dir: 'po',
  install: true,
  install_dir: '/usr/share/applications'
)
...
po/meson.build
i18n.gettext(package, preset: 'glib')
clipit.desktop.in
[Desktop Entry]
_Name=ClipIt
_Comment=Clipboard Manager
Icon=clipit-trayicon-offline
Exec=clipit
Terminal=false
Type=Application
Categories=GTK;GNOME;Application;Utility;
After ninja install, the output is:
[Desktop Entry]
Icon=clipit-trayicon-offline
Exec=clipit
Terminal=false
Type=Application
Categories=GTK;GNOME;Application;Utility;
It is based on https://mesonbuild.com/Porting-from-autotools.html, but I also tried to follow the 'eye of gnome' meson.build. No luck.
The current version of the code is on GitHub.
Edit:
Leaving a snippet that can be used, since the Meson documentation doesn't cover using intltool.
custom_target('clipit.desktop',
  input: 'data/clipit.desktop.in',
  output: 'clipit.desktop',
  command: [intltool_merge, '-d', '-u', join_paths(meson.source_root(), 'po'), '@INPUT@', '@OUTPUT@'],
  install: true,
  install_dir: get_option('datadir') / 'applications')
The reason this doesn't work is that this isn't valid input to gettext. :)
The underscores at the start of the _Name and _Comment fields come from intltool, another translation tool similar to gettext. To solve this, just remove the underscores from those fields. This will work for .desktop files. For more information, you can also take a few hints from https://wiki.gnome.org/MigratingFromIntltoolToGettext
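With the underscores removed, the clipit.desktop.in from the question becomes:

[Desktop Entry]
Name=ClipIt
Comment=Clipboard Manager
Icon=clipit-trayicon-offline
Exec=clipit
Terminal=false
Type=Application
Categories=GTK;GNOME;Application;Utility;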
On a side note, you shouldn't install directly to '/usr/share/applications', since someone might want to choose a custom prefix or datadir (see Meson's built-in options for more info). It's better to use install_dir: get_option('datadir') / 'applications'.

Using extended classes in gst (GNU smalltalk)?

This is a bit of a follow-up question to this one.
Say I've managed to extend the Integer class with a new method 'square'. Now I want to use it.
Calling the new method from within the file is easy:
Integer extend [
    square [
        | r |
        r := self * self.
        ^r
    ]
]
x := 5 square.
x printNl.
Here, I can just run $ gst myprogram.st in bash and it'll print 25. But what if I want to use the method from inside the GNU Smalltalk application? Like this:
$ gst
st> 5 square
25
st>
This may have to do with images, I'm not sure. This tutorial says I can edit the ~/.st/kernel/Builtins.st file to change which files are loaded into the kernel, but I have no such file.
I would not edit what's loaded into the kernel. To elaborate on my comment: one way of loading previously created files into the GNU Smalltalk environment, without resorting to image files, is to use packages.
A sample package.xml file, which defines the package per the documentation, would look like:
<package>
  <name>MyPackage</name>
  <!-- Include any prerequisite packages here, if you need them -->
  <prereq>PrerequisitePackageName</prereq>
  <filein>Foo.st</filein>
  <filein>Bar.st</filein>
</package>
A sample Makefile for building the package might look like:
# MyPackage makefile
#
PACKAGE_DIR  = ~/.st
PACKAGE_SPEC = package.xml
PACKAGE_FILE = $(PACKAGE_DIR)/MyPackage.star
PACKAGE_SRC = \
    Foo.st \
    Bar.st

$(PACKAGE_FILE): $(PACKAGE_SRC) $(PACKAGE_SPEC)
	gst-package -t ~/.st $(PACKAGE_SPEC)
With the above files in your working directory alongside Foo.st and Bar.st, you can run make and it will build the .star package file and put it in ~/.st (the first place gst looks for packages). When you run your environment, you can then use PackageLoader to load it in:
$ gst
GNU Smalltalk ready
st> PackageLoader fileInPackage: 'MyPackage'
Loading package PrerequisitePackage
Loading package MyPackage
PackageLoader
st>
Then you're ready to rock and roll... :)
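For the original question: if Foo.st contains the Integer extend [ ... ] code from above and is packaged as shown, the new method is available at the prompt once the package is loaded. A session sketch, with the loading output abridged:

$ gst
GNU Smalltalk ready

st> PackageLoader fileInPackage: 'MyPackage'
st> 5 square
25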

How can I conditionally include large scripts in my ssdt post deployment script?

In our SSDT project we have a script that is huge and contains a lot of INSERT statements for importing data from an old system. Using sqlcmd variables, I'd like to be able to conditionally include the file into the post deployment script.
We're currently using the :r syntax which includes the script inline:
IF '$(ImportData)' = 'true'
BEGIN
:r .\Import\OldSystem.sql
END
This is a problem because the script is being included inline regardless of whether $(ImportData) is true or false and the file is so big that it's slowing the build down by about 15 minutes.
Is there another way to conditionally include this script file so it doesn't slow down the build?
Rather than muddy up my prior answer with another one: there is a special case with a VERY simple option.
Create separate SQLCMD input files for each execution possibility.
The key here is to name the execution input files using the value of your control variable.
So, for example, your publish script defines variable 'Config' which may have one of these values: 'Dev','QA', or 'Prod'.
Create 3 post deployment scripts named 'DevPostDeploy.sql', 'QAPostDeploy.sql' and 'ProdPostDeploy.sql'.
Code your actual post deploy file like this:
:r ."\"$(Config)PostDeploy.sql
This is very much like the build event mechanism where you overwrite scripts with appropriate ones except you don't need a build event. But you are dependent upon naming your scripts very specifically.
The scripts referenced using :r are always included. You have a couple of options, but I would first verify that taking the script out actually improves the performance to where you want it.
The simplest approach is to keep the script outside of the whole build process and change your deploy process into a two-step affair (deploy the dacpac, then run the script). The positive is that you can do things outside of the SSDT process; the negative is that you lose things like the automatic disabling of constraints on tables that change in the deployment.
The second way is to not include the script in the deploy when you build, but to create an AfterBuild MSBuild task that adds the script as a post-deploy script in the dacpac. The dacpac is a zip file, so you can use the .NET packaging API to add a part called postdeploy.sql, which will then be included in the deployment process.
Both of these ways mean you lose verification, so you might want to keep the script in a separate SSDT project that has a "same database" reference to your main project; it will slow down the build when it changes, but should be quick the rest of the time.
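For the second option, here is a rough C# sketch of adding the part with System.IO.Packaging (it needs a reference to WindowsBase; the part name /postdeploy.sql, its content type, and all file names are assumptions, so verify them against a dacpac that was built with a post-deploy script):

using System;
using System.IO;
using System.IO.Packaging;

class AddPostDeploy
{
    static void Main()
    {
        // A dacpac is an OPC zip package; open it and append the big
        // import script as the post-deploy part.
        using (var package = Package.Open(@"bin\Release\MyDb.dacpac",
                                          FileMode.Open, FileAccess.ReadWrite))
        {
            var part = package.CreatePart(
                new Uri("/postdeploy.sql", UriKind.Relative), "text/plain");
            using (var src = File.OpenRead(@"Import\OldSystem.sql"))
            using (var dst = part.GetStream())
            {
                src.CopyTo(dst);
            }
        }
    }
}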
Here is the way I had to do it.
1) Create a dummy post-deploy script.
2) Create build configurations in your project for each deploy scenario.
3) Use a pre-build event to determine which post deploy configuration to use.
You can either create separate scripts for each configuration or dynamically build the post-deploy script in your pre-build event. Either way, you base what you do on the value of $(Configuration), which always exists in a build event.
If you use separate static scripts, your build event only needs to copy the appropriate static file, overwriting the dummy post-deploy with whichever script is useful in that deploy scenario.
In my case I had to use dynamic generation because the decision about which scripts to include required knowing the current state of the database being deployed to. So I used the configuration variable to tell me which environment was being deployed to and then used an SQLCMD script with :OUT set to my Post-Deploy script location. Thus my pre-build script would then write the post-deploy script dynamically.
Either way, once build completed and the normal deploy process started the Post-Deploy script contained exactly the :r commands that I wanted.
Here's an example of the SQLCMD script I invoke in pre-build.
:OUT .\Script.DynamicPostDeployment.sql
PRINT ' /*';
PRINT ' DO NOT MANUALLY MODIFY THIS SCRIPT. ';
PRINT ' ';
PRINT ' It is overwritten during build. ';
PRINT ' Content IS based on the Configuration variable (Debug, Dev, Sit, UAT, Release...) ';
PRINT ' ';
PRINT ' Modify Script.PostDeployment.sql to effect changes in executable content. ';
PRINT ' */';
PRINT 'PRINT ''PostDeployment script starting at''+CAST(GETDATE() AS nvarchar)+'' with Configuration = $(Configuration)'';';
PRINT 'GO';
IF '$(Configuration)' IN ('Debug','Dev','Sit')
BEGIN
IF (SELECT IsNeeded FROM rESxStage.StageRebuildNeeded)=1
BEGIN
-- These get a GO statement after every file because most are really HUGE
PRINT 'PRINT ''ETL data was needed and started at''+CAST(GETDATE() AS nvarchar);';
PRINT ' ';
PRINT 'EXEC iESxETL.DeleteAllSchemaData ''pExternalETL'';';
PRINT 'GO';
PRINT ':r .\PopulateExternalData.sql ';
....
I ended up using a mixture of our build tool (Jenkins) and SSDT to accomplish this. This is what I did:
Added a build step to each environment-specific Jenkins job that writes to a text file. Depending on the build parameters the user chooses, I either write a SQLCMD command that includes the import file, or I leave the file blank.
Include the new text file in the post-deployment script via :r.
That's it! I also use this same approach to choose which pre and post deploy scripts to include in the project based on the application version, except that I grab the version number from the code and write it to the file using a pre-build event in VS instead of in the build tool. (I also added the text file name to .gitignore so it doesn't get committed)
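As a sketch of that setup (the file names are hypothetical): the Jenkins step writes either nothing or a single :r line into the text file, and the post-deployment script includes that file unconditionally.

-- Written into Import\ConditionalImport.sql by the Jenkins build step
-- (left empty when the user opts out):
:r .\Import\OldSystem.sql

-- Script.PostDeployment.sql always contains:
:r .\Import\ConditionalImport.sql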

How to use sqlite3 in C without the dev library installed?

I'm trying to load a local DB with SQLite on a Red Hat Linux server. I have C code that loads the database from a very large file, splitting the columns. The bad news is that sqlite3 is not installed on the machine (fatal error: sqlite3.h: No such file or directory) and I won't be able to get permissions to install libsqlite3-dev (according to this), so I can only use it through bash or Python:
[dhernandez#zl1:~]$ locate sqlite3
/opt/xiv/host_attach/xpyv/lib/libsqlite3.so
/opt/xiv/host_attach/xpyv/lib/libsqlite3.so.0
/opt/xiv/host_attach/xpyv/lib/libsqlite3.so.0.8.6
/opt/xiv/host_attach/xpyv/lib/python2.7/sqlite3
/opt/xiv/host_attach/xpyv/lib/python2.7/lib-dynload/_sqlite3.so
/opt/xiv/host_attach/xpyv/lib/python2.7/sqlite3/__init__.py
/opt/xiv/host_attach/xpyv/lib/python2.7/sqlite3/dbapi2.py
/opt/xiv/host_attach/xpyv/lib/python2.7/sqlite3/dump.py
/usr/bin/sqlite3
/usr/lib64/libsqlite3.so.0
/usr/lib64/libsqlite3.so.0.8.6
/usr/lib64/python2.6/sqlite3
/usr/lib64/python2.6/lib-dynload/_sqlite3.so
/usr/lib64/python2.6/sqlite3/__init__.py
/usr/lib64/python2.6/sqlite3/__init__.pyc
/usr/lib64/python2.6/sqlite3/__init__.pyo
/usr/lib64/python2.6/sqlite3/dbapi2.py
/usr/lib64/python2.6/sqlite3/dbapi2.pyc
/usr/lib64/python2.6/sqlite3/dbapi2.pyo
/usr/lib64/python2.6/sqlite3/dump.py
/usr/lib64/python2.6/sqlite3/dump.pyc
/usr/lib64/python2.6/sqlite3/dump.pyo
/usr/lib64/xulrunner/libmozsqlite3.so
/usr/share/man/man1/sqlite3.1.gz
/usr/share/mime/application/x-kexiproject-sqlite3.xml
/usr/share/mime/application/x-sqlite3.xml
Which of the following options would be faster?
Split the columns in my C program, and then execute each insert like
this:
system("echo 'insert into t values(1,2);' | sqlite3 mydb.db");
Split the columns in my C program, save them to a temp file, and when
I've got 500,000 rows execute the script like this (and then empty
the temp file to continue loading rows):
system("sqlite3 mydb.db < temp.sql");
Split the columns in my C program adding a delimiter between them, save it to a temp file and import it like this:
.delimiter '#'
.import temp.txt t
You can use the amalgamation version. It is a single .c file you can include in your project, and all of SQLite is available. No need for dynamic linking.
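A minimal sketch of that approach, assuming you have unpacked sqlite3.c and sqlite3.h from the amalgamation zip (available on sqlite.org) next to your source; the table name t comes from the question:

/* Build without any dev package installed:
 *   gcc loader.c sqlite3.c -lpthread -ldl -o loader
 */
#include <stdio.h>
#include "sqlite3.h"

int main(void)
{
    sqlite3 *db;
    char *err = NULL;

    if (sqlite3_open("mydb.db", &db) != SQLITE_OK) {
        fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
        return 1;
    }
    if (sqlite3_exec(db, "insert into t values(1,2);",
                     NULL, NULL, &err) != SQLITE_OK) {
        fprintf(stderr, "insert failed: %s\n", err);
        sqlite3_free(err);
    }
    sqlite3_close(db);
    return 0;
}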
You could probably try to dynamically load the sqlite3 library at runtime.
There is quite a bit to learn about it, but it is a powerful facility and I am quite sure it would solve your problem.
Here is a link describing how you can do it: http://tldp.org/HOWTO/Program-Library-HOWTO/dl-libraries.html
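A rough sketch of that idea, using the libsqlite3.so.0 that locate found in /usr/lib64. The function-pointer typedefs are written by hand because the header is not available, so treat the signatures as assumptions to be checked against the SQLite documentation:

/* Build with: gcc dynload.c -ldl -o dynload */
#include <dlfcn.h>
#include <stdio.h>

typedef struct sqlite3 sqlite3;
typedef int (*sqlite3_open_fn)(const char *, sqlite3 **);
typedef int (*sqlite3_exec_fn)(sqlite3 *, const char *,
                               int (*)(void *, int, char **, char **),
                               void *, char **);
typedef int (*sqlite3_close_fn)(sqlite3 *);

int main(void)
{
    void *lib = dlopen("libsqlite3.so.0", RTLD_NOW);
    if (!lib) { fprintf(stderr, "%s\n", dlerror()); return 1; }

    sqlite3_open_fn  p_open  = (sqlite3_open_fn)dlsym(lib, "sqlite3_open");
    sqlite3_exec_fn  p_exec  = (sqlite3_exec_fn)dlsym(lib, "sqlite3_exec");
    sqlite3_close_fn p_close = (sqlite3_close_fn)dlsym(lib, "sqlite3_close");

    sqlite3 *db;
    p_open("mydb.db", &db);
    p_exec(db, "insert into t values(1,2);", NULL, NULL, NULL);
    p_close(db);
    dlclose(lib);
    return 0;
}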
Download the devel package and use it from your project directory. You only need it for the compilation.