Package PyGObject Python 3 program with pynsist? - packaging

I would like to package a Python3-PyGObject program with pynsist. The repository has an example for PyGTK and it made me think that it shouldn't be too hard to change the example.
The example can be found here:
https://github.com/takluyver/pynsist/tree/master/examples/pygtk
In this file (https://github.com/takluyver/pynsist/blob/master/examples/pygtk/grab_files.sh) I think one just has to grab the files targeting GTK 3 (http://www.gtk.org/download/win32.php):
wget -O gtkbundle.zip http://win32builder.gnome.org/gtk+-bundle_3.6.4-20130921_win32.zip
wget -O pygobject.exe http://sourceforge.net/projects/pygobjectwin32/files/pygi-aio-3.14.0_rev12-setup.exe/download
wget -O pycairo.zip http://ftp.gnome.org/pub/gnome/binaries/win32/dependencies/cairo_1.10.2-2_win32.zip
I am not sure what to do with the fourth line, because it is my current understanding that those bindings should already be inside the gtk or pygobject bundle:
wget -O pygtk.exe http://ftp.gnome.org/pub/GNOME/binaries/win32/pygtk/2.24/pygtk-2.24.0.win32-py2.7.exe
I then tried to customize this file (https://github.com/takluyver/pynsist/blob/master/examples/pygtk/installer.cfg) to include (use gi instead of gi.repository):
[Include]
packages=gi
This resulting error is:
raise ExtensionModuleMismatch(extensionmod_errmsg % ('Windows', path))
nsist.copymodules.ExtensionModuleMismatch: Found an extension module that will not be usable on Windows:
/usr/lib/python3/dist-packages/gi/_gi.cpython-34m-x86_64-linux-gnu.so
Put Windows packages in pynsist_pkgs/ to avoid this.
Does anyone know what the correct approach for a program (like e.g. one of these: https://python-gtk-3-tutorial.readthedocs.org) would be?
Edit 1
After packaging and installing the program on Windows, starting the test-program produces the following traceback:
Traceback (most recent call last):
File "C:\Program Files (x86)\hellogtk\hellogtk.launch.pyw", line 31, in <module>
from gtk_test import main
File "C:\Program Files (x86)\hellogtk\pkgs\gtk_test.py", line 3, in <module>
from gi.repository import Gtk
File "C:\Program Files (x86)\hellogtk\pkgs\gi\__init__.py", line 42, in <module>
from . import _gi
ImportError: DLL load failed: The specified module could not be found.
It is odd that this ImportError occurs, because there is a _gi.pyd file in the same directory (gi) as __init__.py.
This is the current layout:
- directory
|- pynsist_pkgs
|-- cairo
|--- _cairo.pyd
|--- __init__.py
|-- gi
|--- _gobject
|--- overrides
|--- repository
|--- __init__.py
|--- _gi.pyd
|--- ...
|-- gtk
|--- bin
|--- etc
|--- lib
|--- manifest
|--- share
|-- dbus
|--- __init__.py
|--- ...
|-- gnome
|--- ...
|-- pygtkcompat
|--- ...
|-- _dbus_bindings.pyd
|-- _dbus_glib_bindings.pyd
|-- ...
|- gtk_test.py
|- grab_files.sh
|- installer.cfg
|- gtk_preamble.py
And I used the py-3.4-64 folder of the PyGObject bindings. The Linux machine I am creating the package on is 64-bit, and the Windows machine I am running the program on is also 64-bit.
Edit 2:
Using Dependency-Walker I can see that 2 DLLs are missing: GPSVC.DLL and IESHIMS.DLL.
Edit 3:
I found those two DLLs on the system and copied them into different directories of the test program, but it didn't work.
Edit 4:
This might be useful for the import-error:
import gtk/glib produces ImportError: DLL load failed

I worked together with Thomas K, the author of pynsist, to solve this. And I do want to say that it is a great tool with very good support, and in my opinion it makes packaging orders of magnitude easier.
There were a number of mistakes in my approach (see question), so it might be easier to just describe the correct approach:
Download dependencies
The only dependency needed for a program that only imports:
from gi.repository import Gtk
is the most recent pygi-aio bundle (currently pygi-aio-3.14), which can be downloaded here (the example in the pynsist repository has a download script, but it might need to be updated for newer releases):
https://sourceforge.net/projects/pygobjectwin32/files/
Extract dependencies
The PyGObject/PyGI example that has now been merged into the pynsist repository comes with a script that extracts the necessary dependencies from the bundle (see: https://github.com/takluyver/pynsist/tree/master/examples/pygi_mpl_numpy).
Most importantly, it extracts the contents of the bindings zip file (modify the script for the targeted Python version and bitness) and copies them into the pynsist_pkgs folder:
- cairo
- dbus
- gi
- gnome
- pygtkcompat
Then it extracts and copies the subdependencies into the pynsist_pkgs/gnome/ folder. As lazka pointed out, the minimum requirements for a typical minimal Gtk-program are (each library has a pygi/noarch and pygi/[TargetedArchitecture] zip file):
- ATK
- Base
- Gdk
- GDKPixbuf
- GTK
- JPEG
- Pango
- WebP
- TIFF
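The extraction of these subdependencies can be sketched in Python. Note that the archive names, the noarch/architecture split into one zip each, and the pygi-aio directory layout below are assumptions based on the description above; check the layout of your actual pygi-aio release.

```python
import os
import zipfile

# Subdependencies for a typical minimal Gtk program, as listed above.
DEPS = ["ATK", "Base", "Gdk", "GDKPixbuf", "GTK", "JPEG", "Pango", "WebP", "TIFF"]

def archive_paths(deps, arch, root="pygi-aio"):
    """Build the (assumed) list of zip files to extract: one noarch
    and one architecture-specific archive per subdependency."""
    paths = []
    for dep in deps:
        for variant in ("noarch", arch):
            paths.append(os.path.join(root, variant, dep + ".zip"))
    return paths

def extract_all(paths, target=os.path.join("pynsist_pkgs", "gnome")):
    """Unpack every archive into pynsist_pkgs/gnome."""
    os.makedirs(target, exist_ok=True)
    for path in paths:
        with zipfile.ZipFile(path) as zf:
            zf.extractall(target)

print(len(archive_paths(DEPS, "64")))  # 18: one noarch + one arch zip per dependency
```

The real extraction script in the pynsist example repository is the authoritative version; this sketch only illustrates the two-archives-per-dependency pattern.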
Build the installer
The installer was then built, in my case using:
python3 -m nsist installer.cfg
The installer.cfg is also in the repository's example folder. It only requires gnome to be listed (the subdependencies in the gnome folder behave as one unit).
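For reference, a minimal installer.cfg for this setup might look like the sketch below. The application name, script name, Python version, and bitness are placeholders; the example in the pynsist repository is authoritative.

```ini
[Application]
name=HelloGtk
version=1.0
script=gtk_test.py

[Python]
version=3.4.3
bitness=64

[Include]
packages=gnome
```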
Note about the pygi-aio bundle
When the pygi-aio bundle is installed on a Windows machine, the installer performs some post-installation compilation steps. This might become an issue with the approach described here, because it only extracts the dependencies. In some cases you might need to run an exe file (which comes with the bundle) and copy the compiled files back into your build directory. I describe the only problem I had here:
https://github.com/tobias47n9e/innsbruck-stereographic/issues/1
And there is a bug report with more information here:
https://sourceforge.net/p/pygobjectwin32/tickets/12/
Working example
You can get the example here:
https://github.com/takluyver/pynsist/tree/master/examples/pygi_mpl_numpy

Related

lessc command cannot find import inside another import

I have an issue with running lessc command when there is hierarchical imports. My folder structure is like this:
(root)/
|--- website
|    |--- styles
|    |    |--- master.less
|--- library
|    |--- testDir
|    |    |--- child.less
|    |--- test.less
The short version of master.less:
@import url('../../library/test.less');
The short version of test.less:
@import url('/testDir/child.less');
When I run lessc website/styles/master.less website/styles/master.css, I am given FileError: '/testDir/child.less' wasn't found. Tried - /testDir/child.less, etc.
Note: It is not viable for me to use --include-path=./library/testDir option, because I have many instances of these hierarchical imports in my less files.
Any other option that I am missing, or a workaround?

Package a pre-built python extension

I am working on a C library (using cmake as a build system) and a corresponding python extension written in cython.
The build process is conducted by cmake, which calls the cython executable to generate a C file. The file is compiled into a python_library.so, which links against the native library.so and other dependencies.
The library works as expected, I can set the PYTHONPATH to the build directory, run python and import and execute the wrapped python code.
What remains is the question about how to install / package the python module.
As far as I know, the recommended method to create python packages is to use setuptools / distutils inside a setup.py file.
It is of course possible to define a C Extension (optionally using cython) inside the setup.py file. However, I want the compilation to be handled by cmake (it involves some dependent libraries etc.)
So basically, I would like to tell python that the whole package is defined by an existing python_library.so file. Is that at all possible?
Note: there is a related question. But the OP has already figured out how to package the extension.
Obviously, this is not the most robust way to distribute Python packages, as it will not work across different OSes and may lead to strange results if there is a Python-version mismatch - but it is nevertheless possible.
Let's consider the following folder structure:
/
|--- setup.py
|--- my_package
|------- __init__.py
|------- impl.pyx [needed only for creation of impl.so]
|------- impl-XXX.so [created via "cythonize -i impl.pyx"]
With the following content:
__init__.py:
from .impl import foo
impl.pyx:
def foo():
    print("I'm foo from impl")
setup.py:
from setuptools import setup, find_packages
kwargs = {
    'name': 'my_package',
    'version': '0.1.0',
    'packages': find_packages(),
    # ensure so-files are copied to the installation:
    'package_data': {'my_package': ['*.so']},
    'include_package_data': True,
    'zip_safe': False,
}
setup(**kwargs)
Now after calling python setup.py install, the package is installed and can be used:
$ python -c "import my_package; my_package.foo()"
I'm foo from impl
NB: Don't run the test from the folder containing the setup file, because then the local version of my_package, not the installed one, would be used.
You might want to have different so-binaries for different Python versions. It is possible to have the same extension compiled for different Python versions - you have to add the right suffix to the resulting shared library, for example:
impl.cpython-36m-x86_64-linux-gnu.so for Python3.6 on my linux machine
impl.cpython-37m-x86_64-linux-gnu.so for Python3.7
impl.cp36-win_amd64.pyd on windows
One can get the suffixes for extensions on the current machine using:
>>> import importlib
>>> importlib.machinery.EXTENSION_SUFFIXES
['.cp36-win_amd64.pyd', '.pyd']
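Building on that, a small sketch can compute the exact file name the current interpreter expects, which is what the prebuilt binary would be renamed to. The build/impl.so source path in the comment below is hypothetical.

```python
import importlib.machinery

# The first entry is the most specific suffix for this interpreter,
# e.g. ".cpython-36m-x86_64-linux-gnu.so" or ".cp36-win_amd64.pyd".
suffix = importlib.machinery.EXTENSION_SUFFIXES[0]
expected_name = "impl" + suffix
print(expected_name)

# A prebuilt binary would then be copied into the package, e.g.:
# shutil.copy("build/impl.so", os.path.join("my_package", expected_name))
```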

How do I use my own modules in a JuliaBox notebook?

I've recently started using JuliaBox for programming in Julia, and I want to use my own modules that I've previously written using the Juno-Atom IDE. I've uploaded the relevant modules to JuliaBox, but I am unable to call them from a JuliaBox notebook. The error message I get is as follows:
using MyModule
ArgumentError: Module MyModule not found in current path.
Run `Pkg.add("MyModule")` to install the MyModule package.
Stacktrace:
[1] _require(::Symbol) at ./loading.jl:435
[2] require(::Symbol) at ./loading.jl:405
[3] include_string(::String, ::String) at ./loading.jl:522
I originally had the module in a separate folder called 'modules', but even after moving it to the main folder (same location as the notebook), I still get the same error message.
I have ascertained the working directory:
pwd()
"/mnt/juliabox"
..and that seems to be the folder where my module is currently stored. At least, that's the directory which is displayed when I try to move the module file on the main JuliaBox screen.
I did try installing the module as an unregistered package under Package Builder (I was getting desperate!), but that didn't work either.
So I'm wondering whether I need to add something to the JULIA_LOAD_PATH in Environment Variables; however, that would seem to rather defeat the purpose of using an online version of Jupyter notebooks, which presumably is to allow easy access anywhere.
Anyway, I've run out of ideas, so if anyone could give me a clue as to where I am going wrong it would be very much appreciated.
If your module file is in the main folder, add it to the LOAD_PATH (it is not added by default). Customize the path if you put your file somewhere else.
@everywhere push!(LOAD_PATH, homedir())
import MyModule
or
include("MyModule.jl") # if it is already in pwd()
import MyModule
The issue is not related to JuliaBox or IJulia; that is simply how you import a module. You either put the folder in LOAD_PATH or include the file containing the module.
https://docs.julialang.org/en/stable/manual/modules/#Relative-and-absolute-module-paths-1
I believe this issue on GitHub addresses the problem you are facing: https://github.com/JuliaLang/julia/issues/4600
I did try installing the module as an unregistered package under Package Builder (I was getting desperate!), but that didn't work either.
I think the package builder functionality is working properly. Just try creating a dummy module with the following structure and the contents:
~/MyModule.jl> tree
.
├── REQUIRE
└── src
    ├── functions
    │   └── myfunc.jl
    └── MyModule.jl
2 directories, 3 files
~/MyModule.jl> cat REQUIRE
julia 0.6
~/MyModule.jl> cat src/functions/myfunc.jl
myfunc(x) = 2x
~/MyModule.jl> cat src/MyModule.jl
module MyModule
export myfunc
include(joinpath("functions", "myfunc.jl"))
end
Then, git init a repository inside the directory, git add and git commit all the files, add a remote repository (like on GitHub or GitLab) with git remote add, and git push your local repository to the newly added remote repository. You should see that the unregistered package option is working as intended.
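Spelled out, those git steps might look like the following self-contained sketch. It recreates the dummy layout in a temporary directory; the remote URL is a placeholder, and the push is left commented out because it needs a real remote repository.

```shell
# Recreate the dummy module layout in a scratch directory.
dir="$(mktemp -d)/MyModule.jl"
mkdir -p "$dir/src/functions"
cd "$dir"
echo "julia 0.6" > REQUIRE
echo "myfunc(x) = 2x" > src/functions/myfunc.jl
printf 'module MyModule\nexport myfunc\ninclude(joinpath("functions", "myfunc.jl"))\nend\n' > src/MyModule.jl

# Version the files, then add a remote and push (placeholder URL).
git init -q
git add .
git -c user.name=demo -c user.email=demo@example.com commit -qm "Initial commit of MyModule"
# git remote add origin https://github.com/<user>/MyModule.jl.git
# git push -u origin master
```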
All that remains is to call
julia> using MyModule
julia> myfunc(10)
20
EDIT. You can try adding https://github.com/aytekinar/MyModule.jl as an unregistered package to your JuliaBox. That repository hosts the above-mentioned dummy module.

Why are `__init__.py` and `BUILD` needed inside TensorFlow's `models/tutorials/rnn/translate`?

Inside the tensorflow/models/tutorials/rnn/translate folder, we have a few files including __init__.py and BUILD.
Without __init__.py and BUILD files, the translate script can still manage to run.
What is the purpose of __init__.py and BUILD here? Are we supposed to install or build it using these two files?
The BUILD file supports using Bazel for hermetic building and testing of the model code. In particular, a BUILD file is present in that directory to define the integration test translate_test.py and its dependencies, so that we can run it on a continuous integration system (e.g. Jenkins).
The __init__.py file causes Python to treat that directory as a package. See this question for a discussion of why __init__.py is often present in a Python source directory. While this file is not strictly necessary to invoke translate.py directly from that directory, it is necessary if we want to import the code from translate.py into a different module.
(Note that when you run a Python binary through Bazel, the build system will automatically generate __init__.py files if they are missing. However, TensorFlow's repositories often have explicit __init__.py files in Python source directories so that you can run the code without invoking Bazel.)
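The packaging effect of __init__.py can be shown with a throwaway package built at runtime. The translate_demo name below is made up for the demo; it is not part of the TensorFlow repository.

```python
import os
import sys
import tempfile

# A directory containing an __init__.py is a regular Python package,
# so its modules can be imported from other code.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "translate_demo")
os.makedirs(pkg)
with open(os.path.join(pkg, "translate.py"), "w") as f:
    f.write("GREETING = 'hello from translate'\n")
open(os.path.join(pkg, "__init__.py"), "w").close()  # empty marker file is enough

sys.path.insert(0, root)
from translate_demo import translate
print(translate.GREETING)  # hello from translate
```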

Debugging externally compiled Typescript in IntelliJ

I was trying to debug TypeScript with IntelliJ but I cannot get it working. I use webpack to build the TypeScript files and source maps, and only the compiled JS is used in the page. I also use an external web server, so I cannot use the built-in IDEA web server.
My structure looks as follows:
root
|-- compiled
|   |-- compiled.js
|   |-- compiled.js.map
|-- src
|   |-- file1.ts
|   |-- some_subfolder
|   |   |-- file2.ts
|   |-- ....
I set up a debug configuration for JavaScript, installed the Chrome extension, and did the path mappings. If I put a breakpoint into compiled.js, the breakpoint gets hit and I can debug; breakpoints in my ts files are ignored though. I did mark the compiled folder as excluded, as per the documentation (it says the IDE will then autoload map files from these folders). As far as I can see, there is no option to manually set a map file for the script file in the debug configuration.
Any ideas what I might be missing?