i18n.merge_file can't translate desktop file - meson-build

I'm trying to convert an autotools project to Meson and am stuck on the translation of the desktop file.
There is no problem getting all the .mo files created.
POTFILES and LINGUAS are in the 'po' folder as per the manual.
The only problem is that i18n.merge_file is not generating the file with translations.
My meson.build looks like:
...
package = meson.project_name()
i18n = import('i18n')
add_project_arguments('-DGETTEXT_PACKAGE="@0@"'.format(package), language: 'c')
subdir('po')
i18n.merge_file(
  input: 'data/clipit.desktop.in',
  output: 'clipit.desktop',
  type: 'desktop',
  po_dir: 'po',
  install: true,
  install_dir: '/usr/share/applications'
)
...
po/meson.build
i18n.gettext(package, preset: 'glib')
clipit.desktop.in
[Desktop Entry]
_Name=ClipIt
_Comment=Clipboard Manager
Icon=clipit-trayicon-offline
Exec=clipit
Terminal=false
Type=Application
Categories=GTK;GNOME;Application;Utility;
After ninja install the output is:
[Desktop Entry]
Icon=clipit-trayicon-offline
Exec=clipit
Terminal=false
Type=Application
Categories=GTK;GNOME;Application;Utility;
It is based on the porting guide (https://mesonbuild.com/Porting-from-autotools.html), but I also tried to follow the 'eye of gnome' meson.build. No luck.
The current version of the code is on GitHub.
Edit:
Leaving a snippet that can be used, as the Meson documentation doesn't cover using intltool.
custom_target('clipit.desktop',
  input : 'data/clipit.desktop.in',
  output : 'clipit.desktop',
  command: [intltool_merge, '-d', '-u', join_paths(meson.source_root(), 'po'), '@INPUT@', '@OUTPUT@'],
  install : true,
  install_dir : get_option('datadir') / 'applications')
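For this snippet to work, intltool_merge has to be looked up first; a minimal sketch, assuming intltool-merge is installed and on PATH:

# locate the intltool-merge executable so custom_target() can invoke it
intltool_merge = find_program('intltool-merge')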

The reason this doesn't work is that this isn't valid input to gettext :)
The underscores at the start of the _Name and _Comment fields come from intltool, another translation tool similar to gettext. To solve this, just remove the underscores from those fields. This will work for .desktop files. For more information, you can also take a few hints from https://wiki.gnome.org/MigratingFromIntltoolToGettext
On a side note, you shouldn't install directly to '/usr/share/applications', since someone might want to choose a custom prefix or datadir (see Meson - Built-in options for more info). It's better to use install_dir: get_option('datadir') / 'applications'.
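For reference, a fixed clipit.desktop.in would simply drop the underscores and keep everything else as it was:

[Desktop Entry]
Name=ClipIt
Comment=Clipboard Manager
Icon=clipit-trayicon-offline
Exec=clipit
Terminal=false
Type=Application
Categories=GTK;GNOME;Application;Utility;

With that in place, i18n.merge_file() with type: 'desktop' should merge the Name and Comment translations from the .po files in po_dir into the installed clipit.desktop.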

Related

meson and git information

I need to provide the binary built with the Meson build system with some git information about the branch and version used:
git describe --tags
git describe --help
The problem I have is how to retrieve this information with Meson.
With the make build I use the following instructions:
GITREF = $(shell git describe --all)
LIB1_VER = $(shell cd ../../lib1;git describe --tags;cd - &>NULL)
So in Meson, for GITREF, I've tried:
info_dep = vcs_tag(command : ['git describe --all'],
                   input : 'infoBuild.h.in',
                   output : 'infoBuild.h',
                   replace_string : 'BRANCHNAME')
where infoBuild.h.in is:
#define GITREF "BRANCHNAME"
but when I go to compile with ninja I get:
/usr/local/bin/meson --internal vcstagger ../../src/prog1/info/infoBuild.h.in src/prog1/info/infoBuild.h 1.1.0 /home/mariano/clonesIntel/projMes/src/prog1/info BRANCHNAME '(.*)' '/home/mariano/clonesIntel/ProjMes/src/prog1/info/git describe --all'
but I don't find any infoBuild.h.
Moreover, LIB1_VER is more difficult because it lives in an external folder.
I could overcome this issue with a bash script, but is there a way to retrieve both pieces of information within the Meson build?
I see an immediate problem: it's going to try to run a command literally named 'git describe --all', which is not what you want. Meson will escape the spaces for the shell, so it treats the whole string as a single filename; you want ['git', 'describe', '--all']. Of course, that could just be a typo in your example.
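With that fixed, the vcs_tag() call from the question could look like this (a minimal sketch; the fallback value here is just an assumption):

info_dep = vcs_tag(command : ['git', 'describe', '--all'],
                   input : 'infoBuild.h.in',
                   output : 'infoBuild.h',
                   replace_string : 'BRANCHNAME',
                   # fallback is optional; the value is arbitrary
                   fallback : 'unknown')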
One option you might consider is run_command() plus configure_file(): the command runs at configure time and produces a result object that you can get string values from. The disadvantage of this compared to vcs_tag (or a custom_target) is that it happens at configure time, as opposed to build time, so you need to reconfigure to update your tags:
res = run_command(['git', 'describe', '--all'], capture : true, check : true)
describe = res.stdout()
version_h = configure_file(
  input : 'version.h.in',
  output : 'version.h',
  configuration : {'PLACEHOLDER' : describe}
)
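For completeness, the matching version.h.in would use the configure_file() substitution token rather than a bare string; a sketch, assuming the PLACEHOLDER name used above:

#define GITREF "@PLACEHOLDER@"

It may also be worth using res.stdout().strip() so the trailing newline from git does not end up inside the header.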

How can I concatenate multiple files into one in Meson?

I'm having trouble with a basic task in Meson where I need multiple files concatenated into one during the build; basically:
cat *.txt > compiled.txt
or
cat foo.txt bar.txt baz.txt > compiled.txt
However, whether I use custom_target(), generator() or any other function, Meson either can't find compiled.txt or can't handle going from multiple input files to a single output file.
Is there an easy way to achieve this?
Update:
Using run_command() I've managed to build compiled.txt and have it appear in the source directory. Ultimately I want compiled.txt (which I've listed in the gresource.xml) to be compiled by gnome.compile_resources(). Is there a way I can run this command and pass the file directly to that function to process?
Use custom_target() and pass its output to the dependencies of gnome.compile_resources(). Note you will need a fairly recent glib for it to work.
See also: http://mesonbuild.com/Gnome-module.html#gnomecompile_resources
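A rough sketch of that wiring (the resource target name and .gresource.xml file are made up; concat_parts is the custom_target() from the solution below):

gnome = import('gnome')
resources = gnome.compile_resources(
  'app-resources',
  'app.gresource.xml',
  # compiled.txt is generated at build time, so the resource target must depend on it
  dependencies : concat_parts
)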
Moved solution from question to answer:
Solution:
I ended up not using gresources, but still needed this solution to concatenate files
cat_prog = find_program('cat')
parts_of_the_whole = files(
  'part1.txt',
  'part2.txt'
)
concat_parts = custom_target(
  'concat-parts',
  command: [ cat_prog, '@INPUT@' ],
  capture: true,
  input: parts_of_the_whole,
  output: 'compiled.txt',
  install_dir: appdatadir,
  install: true,
  build_by_default: true
)

rpm spec file skeleton to real spec file

The aim is to have a skeleton spec file, fun.spec.skel, which contains placeholders for Version, Release and that kind of thing.
For the sake of simplicity I try to make a build target which updates those variables so that fun.spec.skel is transformed into fun.spec, which I can then commit to my GitHub repo. This is done so that rpmbuild -ta fun.tar works nicely and no manual modifications of fun.spec.skel are required (people tend to forget to bump the version in the spec file, but not in the build system).
Assuming the implied question is "How would I do this?", the common answer is to put placeholders in the file like ##VERSION## and then sed the file, or get more complicated and have autotools do it.
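A minimal sketch of that placeholder-and-sed approach (the ##VERSION## token and the version value are just illustrative):

# fun.spec.skel contains a line like "Version: ##VERSION##"
sed -e "s/##VERSION##/1.2.3/" fun.spec.skel > fun.spec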
We place a version.mk file in our project directories which defines environment variables. Sample content includes:
RELPKG=foopackage
RELFULLVERS=1.0.0
As part of a script which builds the RPM, we can source this file:
#!/bin/bash
. $(pwd)/Version.mk
export RELPKG RELFULLVERS
if [ -z "${RELPKG}" ]; then exit 1; fi
if [ -z "${RELFULLVERS}" ]; then exit 1; fi
This leaves us a couple of options to access the values which were set:
We can define macros on the rpmbuild command line:
% rpmbuild -ba --define "relpkg ${RELPKG}" --define "relfullvers ${RELFULLVERS}" foopackage.spec
We can access the environment variables using %{getenv:...} in the spec file itself (though this can make error handling harder...):
%define relpkg %{getenv:RELPKG}
%define relfullvers %{getenv:RELFULLVERS}
From here, you simply use the macros in your spec file:
Name: %{relpkg}
Version: %{relfullvers}
We have similar values (provided via environment variables set by Jenkins) which supply the build number that plugs into the "Release" tag.
I found two ways:
a) use something like
Version: %(./waf version)
where version is a custom waf target
from waflib.Context import Context

# VERSION is assumed to be defined at the top of the wscript
def version_fun(ctx):
    print(VERSION)

class version(Context):
    """Print out the version and only the version"""
    cmd = 'version'
    fun = 'version_fun'
This checks the version at rpm build time.
b) create a target that modifies the specfile itself
from waflib.Context import Context
from waflib import Logs
import re

def bumprpmver_fun(ctx):
    spec = ctx.path.find_node('oregano.spec')
    data = None
    with open(spec.abspath()) as f:
        data = f.read()
    if data:
        data = re.sub(r'^(\s*Version\s*:\s*)[\w.]+\s*', r'\1 {0}\n'.format(VERSION), data, flags=re.MULTILINE)
        with open(spec.abspath(), 'w') as f:
            f.write(data)
    else:
        Logs.warn("Didn't find that spec file: '{0}'".format(spec.abspath()))

class bumprpmver(Context):
    """Bump version"""
    cmd = 'bumprpmver'
    fun = 'bumprpmver_fun'
The latter is used in my pet project oregano @ GitHub.

How to include .iuml path to generate PlantUML diagram in Doxygen

I'm working on the documentation of a component using Doxygen and I want to include UML diagrams in between the text.
I know how to do most of it, as I simply need to copy the .tuml source into my .dox file and run doxygen. However, one of my diagrams is a class diagram that includes other .iuml files, as explained on the PlantUML site.
So, basically, I do:
@mainpage main_page MyDoxygen

...

@startuml

!include iuml_files/Class01.iuml
!include iuml_files/Class02.iuml

MainClass <|-- Class01
MainClass <|-- Class02

@enduml
Long story short, I don't know how to make Doxygen understand it must look for the .iuml files in the directory (relative path) I'm giving as an argument to the include directive.
If I wasn't clear enough as to what I need, please let me know and I will try to make it clearer.
Can I please get some help?
I had a similar problem (I own the Word Add-in for PlantUML).
You can specify the Java property "plantuml.include.path" on the command line:
java -Dplantuml.include.path="c:/mydir" -jar plantuml.jar atest1.txt
(see http://plantuml.sourceforge.net/preprocessing.html)
I expect it'll work when you modify the batch file for calling PlantUML:
http://plantuml.sourceforge.net/doxygen.html
I had a similar request for my Word Add-in for PlantUML and there it worked.
The Real Answer
Use the PLANTUML_INCLUDE_PATH = ./someRelativeDir configuration, visible in the Doxygen wizard's DOT panel.
The include path is relative to your Doxygen config, i.e. the starting directory from which the doxygen config is taken.
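For instance, with the layout from the question, one plausible setting (assuming doxygen is run from the directory that contains iuml_files) would be:

PLANTUML_INCLUDE_PATH = .

so that the !include iuml_files/Class01.iuml lines resolve; alternatively, point it at ./iuml_files and drop the directory prefix from the !include lines.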
A Red Herring
I'm leaving the rest of this answer here in case anyone found it previously.
I wrongly reported a bug because I needed new reading glasses and didn't notice a stray character in my path.
This was resolved as not a Doxygen bug.
For any interested parties, this is what I saw.
Running PlantUML on generated file /Users/andydent/dev/touchgramdesign/doxygeneratedTG4IM/html/inline_umlgraph_1.pu
Preprocessor Error: Cannot include /Users/andydent/dev/touchgramdesign/doxygeneratedTG4IM/html/handDrawnStyle.iuml
Error line 2 in file: /Users/andydent/dev/touchgramdesign/doxygeneratedTG4IM/html/inline_umlgraph_1.pu
Some diagram description contains errors
error: Problems running PlantUML. Verify that the command 'java -jar "/Library/Java/Extensions/plantuml.jar" -h' works from the command line. Exit code: 1
This is using the configuration setting
PLANTUML_INCLUDE_PATH = ./iumltToCopy
Sharper eyes than mine (at the time) noticed the extra character in the path: the stray 't' in iumltToCopy.

How to document Visual Basic with Doxygen

I am trying to use a Doxygen filter for Visual Basic on Windows.
I started with Vsevolod Kukol's filter, based on gawk.
There are not many instructions.
So I started from his own commented VB code, VB6Module.bas, and, by means of his vbfilter.awk, I issued:
gawk -f vbfilter.awk VB6Module.bas
This outputs C-like code on stdout. Therefore I redirected it to a file with:
gawk -f vbfilter.awk VB6Module.bas>awkout.txt
I created this Doxygen test.cfg file:
PROJECT_NAME = "Test"
OUTPUT_DIRECTORY = test
GENERATE_LATEX = NO
GENERATE_MAN = NO
GENERATE_RTF = NO
CASE_SENSE_NAMES = NO
INPUT = awkout.txt
QUIET = NO
JAVADOC_AUTOBRIEF = NO
SEARCHENGINE = NO
To produce the documentation I issued:
doxygen test.cfg
Doxygen complains that the "name 'VB6Module.bas' supplied as the second argument in the \file statement is not an input file." I removed the \file VB6Module.bas comment from awkout.txt. The warning stopped, but in both cases the documentation produced was just a single page with the project name.
I also tried the alternative filter by Basti Grembowietz, written in Python (vbfilter.py). Again no documentation was produced, again with errors and no useful output.
After some trial and error I solved the problem.
I was unable to convert a .bas file into a format that I could pass to Doxygen as input.
Anyway, following suggestions from Doxygen users, I was able to create a Doxygen config file such that it interprets the .bas file comments properly.
Given the file VB6Module.bas (by the Doxygen-VB-Filter author, Vsevolod Kukol), commented in Doxygen style adapted for Visual Basic, I wrote the Doxygen config file test.cfg as follows:
PROJECT_NAME = "Test"
OUTPUT_DIRECTORY = test
GENERATE_LATEX = NO
GENERATE_MAN = NO
GENERATE_RTF = NO
CASE_SENSE_NAMES = NO
INPUT = readme.md VB6Module.bas
QUIET = YES
JAVADOC_AUTOBRIEF = NO
SEARCHENGINE = NO
FILTER_PATTERNS = "*.bas=vbfilter.bat"
where:
readme.md is any Markdown file that can be used as the main documentation page.
vbfilter.bat contains:
@echo off
gawk.exe -f vbfilter.awk "%1%"
vbfilter.awk, by the filter author, is assumed to be in the same folder as the input files to be documented, and obviously gawk should be in the path.
Running:
doxygen test.cfg
everything goes smoothly, apart from two apparently innocuous warnings:
gawk: vbfilter.awk:528: warning: escape sequence `\[' treated as plain `['
gawk: vbfilter.awk:528: warning: escape sequence `\]' treated as plain `]'
Now test\html\index.html contains the proper documentation, as extracted from the .bas and Markdown files.
Alright I did some work:
You can download this .zip file. It contains:
MakeDoxy.bas: the macro that makes it all happen
makedoxy.cmd: a shell script that will be executed by MakeDoxy
configuration: folder that contains the doxygen and gawk binaries which are needed to create the Doxygen documentation, as well as some additional filtering files which were already used by the OP
source: folder that contains example source code for doxygen
How To Use:
Note: I tested it with Excel 2010
Extract VBADoxy.zip somewhere (referenced as <root> from now on).
Import MakeDoxy.bas into your VBA project. You can also import the files from source or use your own doxygen-documented VBA code files, but you'll need at least one documented file in the same VBA project.
Add "Microsoft Visual Basic for Applications Extensibility 5.3" or higher to your VBA project references (I did not test it with lower versions). It's needed for the export part (VBProject, VBComponent).
Run macro MakeDoxy
What is going to happen:
You will be asked for the <root> folder.
You will be asked if you want to delete <root>\source afterwards. It is okay to delete those files; they will not be removed from your VBA project.
MakeDoxy will export all .bas, .cls and .frm files to the location <root>\source\<modulename>\<modulename>(.bas|.cls|.frm).
cmd.exe will be told to run makedoxy.cmd and to delete <root>\source if you chose that, which altogether will result in your desired documentation.
A log file, MakeDoxy.bas.log, will be re-created each time MakeDoxy is executed.
You can play with configuration\vbdoxy.cfg a little if you want to change Doxygen's behavior.
There is still some room for improvement, but I guess this is something one can work with.