How to build Falcon with Cython using pipenv - falconframework

In the Falcon docs it is mentioned that, in order to build Falcon with Cython, the following commands need to be issued:
$ pip install cython
$ pip install --no-binary :all: falcon
The question is how to do this with pipenv.
First of all, I want to declare in the Pipfile that I have a build dependency on Cython.
I also want to pass the parameters --no-binary :all: falcon to pip.
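One possible sketch (an assumption, not a verified recipe: it relies on pipenv forwarding PIP_* environment variables to the pip process it drives) is to declare cython alongside falcon in the Pipfile and force the source build through pip's PIP_NO_BINARY variable:
# Pipfile (sketch) - pipenv has no dedicated build-dependency section,
# so cython is declared as a regular package next to falcon
[packages]
cython = "*"
falcon = "*"
# shell - force falcon to be built from its sdist instead of a wheel;
# roughly equivalent to pip's --no-binary falcon
$ export PIP_NO_BINARY=falcon
$ pipenv install
Installing cython first (pipenv install cython) and only then falcon keeps Cython importable when falcon's setup.py runs.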

Related

Install Numpy Requirement in a Dockerfile. Results in error

I am attempting to install a numpy dependency inside a Docker container (my code heavily uses it). On building the container, the numpy library simply does not install and the build fails. This is on Raspbian Buster/Stretch. It does, however, work when building the container on macOS.
I suspect some kind of Python-related issue, but cannot for the life of me figure out how to make it work.
I should point out that removing the pip install numpy from the requirements file and using it in its own RUN statement in the Dockerfile does not solve the issue.
The Dockerfile:
FROM python:3.6
ENV PYTHONUNBUFFERED 1
ENV APP /app
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN mkdir $APP
WORKDIR $APP
ADD requirements.txt .
RUN pip install -r requirements.txt
COPY . .
The requirements.txt contains all the project requirements, among which is numpy. The build output:
Step 6/15 : RUN pip install numpy==1.14.3
---> Running in 266a2132b078
Collecting numpy==1.14.3
Downloading https://files.pythonhosted.org/packages/b0/2b/497c2bb7c660b2606d4a96e2035e92554429e139c6c71cdff67af66b58d2/numpy-1.14.3.zip (4.9MB)
Building wheels for collected packages: numpy
Building wheel for numpy (setup.py): started
Building wheel for numpy (setup.py): still running...
Building wheel for numpy (setup.py): still running...
EDIT:
So after the comment by skybunk, the suggestion to head to the official docs, and some more debugging on my part, the solution wound up being pretty simple. Thanks skybunk, to you goes all the glory. Yay.
Solution:
Use Alpine, install the Python package build dependencies, and upgrade pip before doing a pip install of the requirements.
This is my edited Dockerfile - working obviously...
FROM python:3.6-alpine3.7
RUN apk add --no-cache --update \
python3 python3-dev gcc \
gfortran musl-dev \
libffi-dev openssl-dev
RUN pip install --upgrade pip
ENV PYTHONUNBUFFERED 1
ENV APP /app
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN mkdir $APP
WORKDIR $APP
ADD requirements.txt .
RUN pip install -r requirements.txt
COPY . .
To use NumPy on Python 3 here, we first head over to the official documentation to find out what dependencies are required to build NumPy.
Mainly, these 5 packages plus their dependencies must be installed:
Python3 - 70 MB
Python3-dev - 25 MB
gfortran - 20 MB
gcc - 70 MB
musl-dev - 10 MB (used for tracking unexpected behaviour/debugging)
A POC setup would look something like this:
Dockerfile:
FROM gliderlabs/alpine
ADD repositories.txt /etc/apk/repositories
RUN apk add --no-cache --update \
python3 python3-dev gcc \
gfortran musl-dev
ADD requirements-pip.txt .
RUN pip3 install --upgrade pip setuptools && \
pip3 install -r requirements-pip.txt
ADD . /app
WORKDIR /app
ENV PYTHONPATH=/app/
ENTRYPOINT python3 testscript.py
repositories.txt
http://dl-5.alpinelinux.org/alpine/v3.4/main
requirements-pip.txt
numpy
testscript.py
import numpy as np
def random_array(a, b):
    return np.random.random((a, b))
a = random_array(2,2)
b = random_array(2,2)
print(np.dot(a,b))
To run this, clone the alpine repository and build it using "docker build -t gliderlabs/alpine ."
Build and Run your Dockerfile
docker build -t minidocker .
docker run minidocker
Output should be something like this:
[[ 0.03573961 0.45351115]
[ 0.28302967 0.62914049]]
Here's the git link, if you want to test it out
From the error logs, it does not seem that the failure comes from numpy, but you can install numpy before the requirements.txt and verify whether it's working.
FROM python:3.6
RUN pip install numpy==1.14.3
Build
docker build -t numpy .
Run and Test
docker run numpy bash -c "echo import numpy as np > test.py ; python test.py"
So you will see no error on import.
Or you can try numpy as an Alpine package:
FROM python:3-alpine3.9
RUN apk add --no-cache py3-numpy
Or, better, post the requirements.txt.
I had a lot of trouble with this issue using FROM python:3.9-buster and pandas.
My requirements.txt had python-dev-tools, numpy and pandas, along with other packages.
I always got a long chain of build errors when attempting to build.
Following hints by Adiii in this thread, I did some debugging and found out that this actually works and builds a perfectly running container:
RUN pip3 install NumPy==1.18.0
RUN pip3 install python-dev-tools
RUN pip3 install pandas
RUN pip3 install -r requirements.txt
So, giving the pip3 install of pandas its own RUN layer solved the problem!
Another method is to install from the 'slim' distribution of python (based on debian):
FROM python:slim
RUN pip install numpy
123 MB
This results in a smaller image than that of alpine:
FROM python:3-alpine3.9
RUN apk add --no-cache py3-numpy
187 MB
Plus it gives better support for other wheel (.whl) libraries, because slim is based on glibc (against which all the wheels are built), while Alpine uses musl (incompatible with those wheels), so on Alpine every package has to be either apk-added or compiled from source.

How to install a package in Colaboratory using `python setup.py install`

Right now, I have
!wget myPackage
!tar -xvjf myPackage
!cd myPackage && python setup.py install
All of which complete, but then
import myPackage
fails. That seems likely to be because the python which I call with !python setup.py is different than the python which executes each cell. How do I install a package using !python setup.py install such that I'm able to import the package?
I tried !which python which gave me /usr/local/bin/python but /usr/local/bin/python setup.py install had the same issue, which seems reasonable.
Also see: https://colab.research.google.com/notebooks/snippets/importing_libraries.ipynb
Restart your Python runtime after the install using the Runtime -> Restart runtime... menu.
Here's a worked example:
https://colab.research.google.com/drive/1Z1d5IGStWkqGXMk_txdxTngDXMkxcDOh
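Putting the answer together with the commands from the question, the flow in a notebook would look roughly like this (myPackage is the placeholder name from the question):
!cd myPackage && python setup.py install
# now use Runtime -> Restart runtime..., then in a fresh cell:
import myPackage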

Installing seaborn on Docker Alpine

I am trying to install seaborn with this Dockerfile:
FROM alpine:latest
RUN apk add --update python py-pip python-dev
RUN pip install seaborn
CMD python
The error I get is related to numpy and scipy (required by seaborn). It starts with:
/tmp/easy_install-nvj61E/numpy-1.11.1/setup.py:327: UserWarning:
Unrecognized setuptools command, proceeding with generating Cython
sources and expanding templates
and ends with
File "numpy/core/setup.py", line 654, in get_mathlib_info
RuntimeError: Broken toolchain: cannot link a simple C program
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-DZ4cXr/scipy/
The command '/bin/sh -c pip install seaborn' returned a non-zero code: 1
Any idea how I can fix this?
To fix this error, you need to install gcc: apk add gcc.
But you will see that you will hit new errors, as numpy, matplotlib and scipy have several dependencies. You also need to install gfortran, musl-dev, freetype-dev, etc.
Here is a Dockerfile based on your initial one that will install those dependencies as well as seaborn:
FROM alpine:latest
# install dependencies
# the lapack package is only in the community repository
RUN echo "http://dl-4.alpinelinux.org/alpine/edge/community" >> /etc/apk/repositories
RUN apk --update add --no-cache \
lapack-dev \
gcc \
freetype-dev
RUN apk add python py-pip python-dev
# Install dependencies
RUN apk add --no-cache --virtual .build-deps \
gfortran \
musl-dev \
g++
RUN ln -s /usr/include/locale.h /usr/include/xlocale.h
RUN pip install seaborn
# removing dependencies
RUN apk del .build-deps
CMD python
You'll notice that I'm removing the dependencies using apk del .build-deps to limit the size of your image (http://www.sandtable.com/reduce-docker-image-sizes-using-alpine/).
Personally I also had to install ca-certificates but it seems you didn't have this issue.
Note: You could also build your image FROM the python:2.7-alpine image to avoid installing python and pip yourself.
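That variant would look roughly like this (an untested sketch derived from the Dockerfile above, with the python/py-pip installation dropped because the base image already provides them):
FROM python:2.7-alpine
# the lapack package is only in the community repository
RUN echo "http://dl-4.alpinelinux.org/alpine/edge/community" >> /etc/apk/repositories
RUN apk --update add --no-cache lapack-dev gcc freetype-dev
RUN apk add --no-cache --virtual .build-deps gfortran musl-dev g++
RUN ln -s /usr/include/locale.h /usr/include/xlocale.h
RUN pip install seaborn
RUN apk del .build-deps
CMD python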

Python 3.4: using pip

I've read in the documentation that Python 3.4 ships with pip installed. When I try to make a call, however, I get an error. What am I missing?
U:\>py -V
Python 3.4.2
U:\>py -3.4 -m pip install matplotlib
C:\Python34\python.exe: No module named pip
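A likely fix (an assumption on my part: pip may simply not have been bootstrapped for this interpreter) is to run the ensurepip module that ships with Python 3.4's standard library, then retry the install:
U:\>py -3.4 -m ensurepip --upgrade
U:\>py -3.4 -m pip install matplotlib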

lxml won't install under pypy using easy_install

When doing:
$ sudo pypy -m easy_install lxml
The response is:
Searching for lxml
[...snip...]
ERROR: /bin/sh: 1: xslt-config: not found
** make sure the development packages of libxml2 and libxslt are installed **
Using build configuration of libxslt
/usr/lib/pypy/lib-python/2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
warnings.warn(msg)
warning: no files found matching '*.txt' under directory 'src/lxml/tests'
src/lxml/lxml.etree.c:8:22: fatal error: pyconfig.h: No such file or directory
compilation terminated.
error: Setup script exited with error: command 'cc' failed with exit status 1
At the same time, sudo pip install lxml works fine.
What's going on?
Thanks.
sudo apt-get install python-dev fixed it for me on ubuntu 13.04
$ yum install python-lxml or apt-get install python-lxml
This solved mine.
I've stumbled over this trouble a couple of times.
Short answer
Python2: $ python2.7 setup.py clean build --with-cython install
Python3: $ pip-3.3 install lxml
Long answer
The hypothesis is that pip install lxml should work in every environment, regardless of whether you are using Python 2 or Python 3.
There's also Cython to be considered: you will certainly enjoy lxml compiled with Cython due to the relevant performance gains.
For reasons unknown to me, the compilation on Python 2 does not find Cython.
To be more precise and absolutely explicit about this matter, both commands below DO NOT employ Cython:
# DO NOT use these commands. I repeat: DO NOT use these commands.
$ pip-2.7 install lxml
$ easy_install-2.7 install lxml
So, when using Python2 you have only one alternative, as far as I know, which is: compile from sources, Luke!
# install build environment and dependencies
$ kernel_release=$( uname -r )
$ sudo apt-get install linux-headers-${kernel_release} build-essential -y
$ sudo apt-get install libxml2-dev libxslt1-dev -y
# Download from github and compile from sources
$ git clone --branch lxml-3.2.4 https://github.com/lxml/lxml
$ python2.7 setup.py clean build --with-cython install
I've handled this problem by installing the Ubuntu package pypy-dev.
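On Ubuntu that amounts to something like the following (the libxml2/libxslt development packages come from the error message above; exact package names may differ by release):
$ sudo apt-get install pypy-dev libxml2-dev libxslt1-dev
$ sudo pypy -m easy_install lxml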