Install "Ifcopenshell" in google-colaboratory

I've tried with: import ifcopenshell
After that I tried: !pip install -q ifcopenshell
And later: !apt-get -qq install -y ifcopenshell
In all three cases I got the error:
Could not find a version that satisfies the requirement ifcopenshell (from versions: )
No matching distribution found for ifcopenshell
How can I install "ifcopenshell" in google-colaboratory?
Thanks in advance

Use conda:
!wget -c https://repo.continuum.io/archive/Anaconda3-5.1.0-Linux-x86_64.sh
!chmod +x Anaconda3-5.1.0-Linux-x86_64.sh
!bash ./Anaconda3-5.1.0-Linux-x86_64.sh -b -f -p /usr/local
!conda install -c conda-forge -c oce -c dlr-sc -c ifcopenshell ifcopenshell
import sys
sys.path.append('/usr/local/lib/python3.6/site-packages/')
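The sys.path tweak above is needed because conda installs into its own site-packages directory, which the Colab kernel does not search by default. A guarded version of that tweak (a sketch; the directory is whatever path your conda install actually used) avoids adding the same entry on repeated cell runs:

```python
import sys

def ensure_on_path(site_packages):
    """Add a site-packages directory to sys.path once, so repeated
    cell runs don't accumulate duplicate entries."""
    if site_packages not in sys.path:
        sys.path.append(site_packages)
    return site_packages in sys.path

ensure_on_path('/usr/local/lib/python3.6/site-packages/')
```

After this, import ifcopenshell should resolve against the conda-installed package.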

I believe we must leave it as Linux, since we are cloning a repo inside Google Colab.
Just open a notebook in Google Colab, copy and paste the exact lines previously suggested, and run them.


How can I update Google Colab's Python version?

The current default version of Python running on Google Colab is 3.7, but I need 3.9 for my notebooks to work.
How can I update Google Colab's Python version to 3.9 (or greater)?
In Google Colab you have a Debian-based Linux, and you can do whatever you can on a Debian Linux. Upgrading Python is as easy as upgrading it on your own Linux system.
Detect the current python version in Colab:
!python --version
#Python 3.8.16
Install new python version
Let's first install and upgrade to Python 3.9:
#install python 3.9
!sudo apt-get update -y
!sudo apt-get install python3.9
#change alternatives
!sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.8 1
!sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.9 2
#check python version
!python --version
#3.9.16
Port Colab kernel to the new installed python
As mentioned in the comments, the above commands just add a new Python version to your Google Colab and update the default Python for command-line usage. But the notebook kernel (what sys.version reports) is still running on the previous Python version. The following commands need to be executed as well to update it.
# install pip for new python
!sudo apt-get install python3.9-distutils
!wget https://bootstrap.pypa.io/get-pip.py
!python get-pip.py
# credit for these last two commands belongs to Erik
# install colab's dependencies
!python -m pip install ipython ipython_genutils ipykernel jupyter_console prompt_toolkit httplib2 astor
# link to the old google package
!ln -s /usr/local/lib/python3.8/dist-packages/google \
/usr/local/lib/python3.9/dist-packages/google
Now you can restart the runtime and check the sys version. Note that in the new Python version you have to install every package, such as pandas, tensorflow, etc., from scratch.
Also, note that you can see a list of installed Python versions and switch between them at any time with this command:
(If nothing changed after installation, use this command to select python version manually)
!sudo update-alternatives --config python3
#after running, enter the row number of the python version you want.
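The kernel-versus-shell mismatch described above can be observed directly: sys.version comes from the kernel's interpreter, while a ! command asks the shell's default python3. A small sketch (generic Python, runs anywhere, not just Colab):

```python
import shutil
import subprocess
import sys

def interpreter_report():
    """Compare the kernel's Python with the shell's default python3."""
    python3 = shutil.which("python3") or sys.executable
    shell = subprocess.run([python3, "--version"],
                           capture_output=True, text=True)
    return {
        "kernel": sys.version.split()[0],
        "shell_default": (shell.stdout or shell.stderr).strip(),
    }

print(interpreter_report())
```

Until the runtime is restarted, the two entries can disagree.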
It's also possible to update the kernel without going through ngrok or conda with some creative package installation.
Raha's answer, suggesting a link between the default google package and the newly installed Python version, is the trick that makes this work, because, at least with Python 3.9, the version of pandas (0.24.0) that the google package requires fails to build.
Here's the code I used to install and switch my Colab kernel to Python 3.9:
#install python 3.9 and dev utils
#you may not need all the dev libraries, but I haven't tested which aren't necessary.
!sudo apt-get update -y
!sudo apt-get install python3.9 python3.9-dev python3.9-distutils libpython3.9-dev
#change alternatives
!sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.8 1
!sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.9 2
#Check that it points at the right location
!python3 --version
# install pip
!curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
!python3 get-pip.py --force-reinstall
#install colab's dependencies
!python3 -m pip install ipython ipython_genutils ipykernel jupyter_console prompt_toolkit httplib2 astor
# link to the old google package
!ln -s /usr/local/lib/python3.8/dist-packages/google \
/usr/local/lib/python3.9/dist-packages/google
# There has got to be a better way to do this...but there's a bad import in some of the colab files
# IPython no longer exposes traitlets like this, it's a separate package now
!sed -i "s/from IPython.utils import traitlets as _traitlets/import traitlets as _traitlets/" /usr/local/lib/python3.9/dist-packages/google/colab/*.py
!sed -i "s/from IPython.utils import traitlets/import traitlets/" /usr/local/lib/python3.9/dist-packages/google/colab/*.py
If Google updates from Python 3.8, you'll have to change the path to the default package.
Then go to the Runtime menu and select Restart runtime. It should reconnect and choose the updated version of Python as the default kernel. You can check that it worked with:
#check python version
import sys
print(sys.version)
!python3 --version
!python --version
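The version check above can also be made to fail fast at the top of a notebook, so cells written for the new interpreter don't silently run on the old one. A minimal guard (the required version tuple is an assumption for illustration):

```python
import sys

def meets_min_version(required=(3, 9)):
    """Return True when the running interpreter is at least `required`."""
    return sys.version_info[:2] >= required

if not meets_min_version((3, 9)):
    print(f"Kernel still on {sys.version.split()[0]}; restart the runtime.")
```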
To use another Python version in Google Colab, you need to:
1- Install Anaconda.
2- Add a (fake) google.colab library.
3- Start JupyterLab.
4- Access it with ngrok.
# install Anaconda3
!wget -qO ac.sh https://repo.anaconda.com/archive/Anaconda3-2020.07-Linux-x86_64.sh
!bash ./ac.sh -b
# a fake google.colab library
!ln -s /usr/local/lib/python3.6/dist-packages/google \
/root/anaconda3/lib/python3.8/site-packages/google
# start jupyterlab, which now has Python3 = 3.8
!nohup /root/anaconda3/bin/jupyter-lab --ip=0.0.0.0&
# access through ngrok, click the link
!pip install pyngrok -q
from pyngrok import ngrok
print(ngrok.connect(8888))
you can also use:
# Install the python version
!apt-get install python3.9
# Select the version
!python3.9 setup.py
another way is to use a virtual environment with your desired python version:
virtualenv env --python=python3.9
Update 24.12.2022 - Unfortunately, the method does not work anymore.
This worked for me (copied from GitHub), I successfully installed Python 3.10.
#The code below installs 3.10 (assuming you now have 3.8) and restarts the environment, so you can run your cells.
import sys #for version check
import os  #for restart routine
if '3.10' in sys.version:
  print('You already have 3.10')
else:
  #install python 3.10 and dev utils
  #you may not need all the dev libraries, but I haven't tested which aren't necessary.
  !sudo apt-get update -y
  !sudo apt-get install python3.10 python3.10-dev python3.10-distutils libpython3.10-dev
  !sudo apt-get install python3.10-venv binfmt-support #recommended in install logs of the command above
  #change alternatives
  !sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.8 1
  !sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.10 2
  # install pip for the new version (piping get-pip.py straight into python3.10)
  !curl -sS https://bootstrap.pypa.io/get-pip.py | python3.10
  #install colab's dependencies
  !python3 -m pip install setuptools ipython ipython_genutils ipykernel jupyter_console prompt_toolkit httplib2 astor
  #minor cleanup
  !sudo apt autoremove
  #link to the old google package
  !ln -s /usr/local/lib/python3.8/dist-packages/google /usr/local/lib/python3.10/dist-packages/google
  #this is just to verify the 3.10 folder was indeed created
  !ls /usr/local/lib/python3.10/
  #restart the environment so you don't have to do it manually
  os.kill(os.getpid(), 9)
In addition to Kaveh's answer, I added the following code. (This Colab runtime's Python version was 3.8 and I tried to downgrade to Python 3.7.)
!pip install google-colab==1.0.0
# install colab's dependencies
!python -m pip install ipython==7.9.0 ipython_genutils==0.2.0 ipykernel==5.3.4 jupyter_console==6.1.0 prompt_toolkit==2.0.10 httplib2==0.17.4 astor==0.8.1 traitlets==5.7.1 google==2.0.3
This way, I solved the crashing runtime error.
Simple as that:
!wget -O mini.sh https://repo.anaconda.com/miniconda/Miniconda3-py39_4.9.2-Linux-x86_64.sh
!chmod +x mini.sh
!bash ./mini.sh -b -f -p /usr/local
!conda install -q -y jupyter
!conda install -q -y google-colab -c conda-forge
!python -m ipykernel install --name "py39" --user
Source: https://colab.research.google.com/drive/1m47aWKayWTwqJG--x94zJMXolCEcfyPS?usp=sharing#scrollTo=r3sLiMIs8If3

How to install Tensorflow federated directly from GitHub or local download?

I want to have access to features from TensorFlow federated (tff.python.research) which aren't present with the pip3 install method.
I'm working on a remote server that does not have bazel, thus I cannot build from source. Are there other ways to get and install the latest working version of TFF from its GitHub REPO?
(https://github.com/tensorflow/federated)
To install the latest TensorFlow Federated (for TensorFlow 2.0), you may follow the steps below.
Install TensorFlow Federated using pip
Install the Python development environment
On Ubuntu:
$ sudo apt update
$ sudo apt install python3-dev python3-pip # Python 3
$ sudo pip3 install --upgrade virtualenv # system-wide install
On macOS:
$ /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
$ export PATH="/usr/local/bin:/usr/local/sbin:$PATH"
$ brew update
$ brew install python # Python 3
$ sudo pip3 install --upgrade virtualenv # system-wide install
Create a virtual environment
$ virtualenv --python python3 "venv"
$ source "venv/bin/activate"
(venv) $ pip install --upgrade pip
Note: To exit the virtual environment, run deactivate.
Install the TensorFlow Federated pip package.
(venv) $ pip install --upgrade tensorflow_federated
(Optional) Test Tensorflow Federated.
(venv) $ python -c "import tensorflow_federated as tff; print(tff.federated_computation(lambda: 'Hello World')())"
Build the TensorFlow Federated pip package
Install the Python development environment.
On Ubuntu:
$ sudo apt update
$ sudo apt install python3-dev python3-pip # Python 3
$ sudo pip3 install --upgrade virtualenv # system-wide install
On macOS:
$ /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
$ export PATH="/usr/local/bin:/usr/local/sbin:$PATH"
$ brew update
$ brew install python # Python 3
$ sudo pip3 install --upgrade virtualenv # system-wide install
Install Bazel
Install Bazel, the build tool used to compile Tensorflow Federated.
Clone the Tensorflow Federated repository.
$ git clone https://github.com/tensorflow/federated.git
$ cd "federated"
Create a virtual environment.
$ virtualenv --python python3 "venv"
$ source "venv/bin/activate"
(venv) $ pip install --upgrade pip
Note: To exit the virtual environment, run deactivate.
Install Tensorflow Federated dependencies.
(venv) $ pip install --requirement "requirements.txt"
(Optional) Test Tensorflow Federated.
(venv) $ bazel test //tensorflow_federated/...
Create a new project.
$ mkdir "/tmp/project"
$ cd "/tmp/project"
$ virtualenv --python python3 "venv"
$ source "venv/bin/activate"
(venv) $ pip install --upgrade pip
Note: To exit the virtual environment run deactivate.
Build the TensorFlow Federated pip package into /tmp/tensorflow_federated (the repository's install guide lists the exact bazel command for this step).
Install the pip package.
(venv) $ pip install --upgrade "/tmp/tensorflow_federated/tensorflow_federated-"*".whl"
Test Tensorflow Federated.
(venv) $ python -c "import tensorflow_federated as tff; print(tff.federated_computation(lambda: 'Hello World')())"
Reference: https://www.tensorflow.org/federated/install
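The guide above creates and activates several virtualenvs; a quick programmatic check that a script really is running inside one uses the standard sys.prefix / base_prefix comparison (a generic sketch, not specific to TFF):

```python
import sys

def in_virtualenv():
    """True when running inside a venv/virtualenv: sys.prefix then
    differs from the base interpreter's prefix."""
    base = getattr(sys, "base_prefix", None) or getattr(sys, "real_prefix", sys.prefix)
    return sys.prefix != base

print("inside virtualenv:", in_virtualenv())
```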

osmNX in Google Colab

For my purposes I require osmNX in Google Colab
Has anyone done this before? I use the following commands:
!wget https://repo.anaconda.com/archive/Anaconda3-2019.07-Linux-x86_64.sh && bash Anaconda3-2019.07-Linux-x86_64.sh -bfp /usr/local
import sys
sys.path.append('/usr/local/lib/python3.6/site-packages')
!conda config --prepend channels conda-forge
The command:
!conda info --envs
Shows that the environment is created successfully.
When I run the command:
!conda activate ox
The error is displayed:
CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'.
To initialize your shell, run
$ conda init <SHELL_NAME>
Currently supported shells are:
- bash
- fish
- tcsh
- xonsh
- zsh
- powershell
See 'conda init --help' for more information and options.
IMPORTANT: You may need to close and restart your shell after running 'conda init'.
The command
!conda init bash
has no effect.
Thanks for the help
!apt-get -qq install -y libspatialindex-dev && pip install -q -U osmnx
import osmnx as ox
ox.config(use_cache=True, log_console=True)
You can also use these commands:
!pip install geopandas==0.10.0
!pip install matplotlib==3.4
!pip install networkx==2.6
!pip install numpy==1.21
!pip install pandas==1.3
!pip install pyproj==3.2
!pip install requests==2.26
!pip install Rtree==0.9
!pip install Shapely==1.7
!pip install osmnx
I installed the respective packages based on the requirements listed at https://github.com/gboeing/osmnx/blob/main/requirements.txt. It has worked in my application so far; I hope it works for you too.
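Pinning each dependency by hand, as above, can also be derived from the requirements file itself. A small helper (illustrative only, not part of osmnx) that turns requirement-specifier lines into individual pip commands:

```python
def pip_commands(requirements_text):
    """Turn requirement-specifier lines into individual `pip install`
    commands, skipping comments and blank lines."""
    commands = []
    for line in requirements_text.splitlines():
        spec = line.split("#", 1)[0].strip()
        if spec:
            commands.append(f"pip install '{spec}'")
    return commands

print(pip_commands("networkx>=2.6  # graph backend\n\npandas>=1.3"))
```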
Alternatively, similar to another answer, you can use the code below, found in https://stackoverflow.com/a/65378540/18403512:
!apt install libspatialindex-dev
!pip install osmnx
The answer would be similar to running osmnx on any docker or external server.
I tried it and almost got there; maybe someone can help make it complete.
So let's start with the basic osmnx installation:
conda config --prepend channels conda-forge
conda create -n ox --strict-channel-priority osmnx
Then, let's look at how this can be done on a remote docker, e.g. Travis CI (a working sample .travis.yml from one of my repos):
- bash miniconda.sh -b -p $HOME/miniconda
- source "$HOME/miniconda/etc/profile.d/conda.sh"
- hash -r
- conda config --set always_yes yes --set changeps1 no
- conda update -q conda
# Useful for debugging any issues with conda
- conda info -a
- conda config --prepend channels conda-forge
- conda create -n ox --strict-channel-priority osmnx
- conda activate ox
Then we may take a look at how to have conda in colab and use this snippet:
%%bash
MINICONDA_INSTALLER_SCRIPT=Miniconda3-4.5.4-Linux-x86_64.sh
MINICONDA_PREFIX=/usr/local
wget https://repo.continuum.io/miniconda/$MINICONDA_INSTALLER_SCRIPT
chmod +x $MINICONDA_INSTALLER_SCRIPT
./$MINICONDA_INSTALLER_SCRIPT -b -f -p $MINICONDA_PREFIX
which then finally boils down to this almost working notebook, based on this post.
What is not working is switching between environments: !conda env list returns ox as one of the environments, yet activating it fails:
!conda activate ox
raises:
CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'.
To initialize your shell, run
$ conda init <SHELL_NAME>
Currently supported shells are:
- bash
- fish
- tcsh
- xonsh
- zsh
- powershell
See 'conda init --help' for more information and options.
IMPORTANT: You may need to close and restart your shell after running 'conda init'.

Install Numpy Requirement in a Dockerfile. Results in error

I am attempting to install a numpy dependency inside a docker container (my code heavily uses it). On building the container, the numpy library simply does not install and the build fails. This is on Raspbian buster/stretch. It does, however, work when building the container on macOS.
I suspect some kind of python related issue, but can not for the life of me figure out how to make it work.
I should point out that removing the pip install numpy from the requirements file and using it in its own RUN statement in the dockerfile does not solve the issue.
The Dockerfile:
FROM python:3.6
ENV PYTHONUNBUFFERED 1
ENV APP /app
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN mkdir $APP
WORKDIR $APP
ADD requirements.txt .
RUN pip install -r requirements.txt
COPY . .
The requirements.txt contains all the project requirements, among which is numpy.
Step 6/15 : RUN pip install numpy==1.14.3
---> Running in 266a2132b078
Collecting numpy==1.14.3
Downloading https://files.pythonhosted.org/packages/b0/2b/497c2bb7c660b2606d4a96e2035e92554429e139c6c71cdff67af66b58d2/numpy-1.14.3.zip (4.9MB)
Building wheels for collected packages: numpy
Building wheel for numpy (setup.py): started
Building wheel for numpy (setup.py): still running...
Building wheel for numpy (setup.py): still running...
EDIT:
After the comment by skybunk, the suggestion to check the official docs, and some more debugging on my part, the solution wound up being pretty simple. Thanks skybunk, to you goes all the glory. Yay.
Solution:
Use Alpine, install the packages numpy needs to build, and upgrade pip before doing a pip install of the requirements.
This is my edited Dockerfile - working obviously...
FROM python:3.6-alpine3.7
RUN apk add --no-cache --update \
python3 python3-dev gcc \
gfortran musl-dev \
libffi-dev openssl-dev
RUN pip install --upgrade pip
ENV PYTHONUNBUFFERED 1
ENV APP /app
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN mkdir $APP
WORKDIR $APP
ADD requirements.txt .
RUN pip install -r requirements.txt
COPY . .
To use Numpy on python3 here, we first head over to the official documentation to find what dependencies are required to build Numpy.
Mainly these 5 packages + their dependencies must be installed:
Python3 - 70 mb
Python3-dev - 25 mb
gfortran - 20 mb
gcc - 70 mb
musl-dev -10 mb (used for tracking unexpected behaviour/debugging)
A POC setup would look something like this:
Dockerfile:
FROM gliderlabs/alpine
ADD repositories.txt /etc/apk/repositories
RUN apk add --no-cache --update \
python3 python3-dev gcc \
gfortran musl-dev
ADD requirements-pip.txt .
RUN pip3 install --upgrade pip setuptools && \
pip3 install -r requirements-pip.txt
ADD . /app
WORKDIR /app
ENV PYTHONPATH=/app/
ENTRYPOINT python3 testscript.py
repositories.txt
http://dl-5.alpinelinux.org/alpine/v3.4/main
requirements-pip.txt
numpy
testscript.py
import numpy as np
def random_array(a, b):
return np.random.random((a, b))
a = random_array(2,2)
b = random_array(2,2)
print(np.dot(a,b))
To run this, clone the alpine repo and build it using "docker build -t gliderlabs/alpine ."
Build and Run your Dockerfile
docker build -t minidocker .
docker run minidocker
Output should be something like this:
[[ 0.03573961 0.45351115]
[ 0.28302967 0.62914049]]
Here's the git link, if you want to test it out
From the error logs, it does not seem that the failure comes from numpy, but you can install numpy before requirements.txt and verify whether it works.
FROM python:3.6
RUN pip install numpy==1.14.3
Build
docker build -t numpy .
Run and Test
docker run numpy bash -c "echo import numpy as np > test.py ; python test.py"
So you will see there is no error on import.
Or you can try numpy as an Alpine package:
FROM python:3-alpine3.9
RUN apk add --no-cache py3-numpy
Or better, post the requirements.txt.
I had a lot of trouble with this issue using FROM python:3.9-buster and pandas.
My requirements.txt had the python-dev-tools, numpy and pandas, along with other packages.
I always got a chain of build errors when attempting to build the image.
Following hints by Adiii in this thread, I did some debugging and found out that this actually works and builds a perfectly running container:
RUN pip3 install NumPy==1.18.0
RUN pip3 install python-dev-tools
RUN pip3 install pandas
RUN pip3 install -r requirements.txt
So, giving a specific RUN layer to the pip3 installing pandas solved the problem!
Another method is to install from the 'slim' distribution of Python (based on Debian):
FROM python:slim
RUN pip install numpy
(image size: 123 MB)
This results in a smaller image than the Alpine one:
FROM python:3-alpine3.9
RUN apk add --no-cache py3-numpy
(image size: 187 MB)
Plus it gives better support for other wheel libraries, since slim is based on a glibc library (against which all the wheels are built) while Alpine uses musl (incompatible with the wheels), so all packages will have to be either apk added or compiled from source.
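The glibc-versus-musl point can be checked from inside the container: on glibc-based images (debian/slim), platform.libc_ver() reports glibc, while on musl-based Alpine it typically reports an empty string, which is why manylinux wheels don't apply there. A quick sketch:

```python
import platform
import sysconfig

# libc_ver() identifies the C library the interpreter was linked against;
# get_platform() is part of the tag that wheels are matched against.
lib, version = platform.libc_ver()
print("libc:", lib or "(not glibc, likely musl)", version)
print("platform:", sysconfig.get_platform())
```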

How can I install the latest versions of NumPy/Scipy/Matplotlib/IPython/Pandas on Ubuntu

Users sometimes need to know how to install a newer version of Pandas than their OS package manager offers. Pandas requires NumPy, and works best with SciPy, Matplotlib and IPython.
How can I install the latest versions of NumPy/Scipy/Matplotlib/IPython/Pandas?
Using Ubuntu, here is how to install the entire NumPy/SciPy/Matplotlib/IPython/Pandas stack from GitHub in a virtualenv using Python 2.7:
Note: The instructions below install the latest development version of each package. If you wish to install the latest tagged version, then after git clone, inspect the tags available with
git tag
and select the version you wish to install with
git checkout tag-name
Install virtualenv and virtualenvwrapper:
sudo apt-get install python-virtualenv
sudo pip install virtualenvwrapper
# edit ~/.bashrc to include
source /usr/share/virtualenvwrapper/virtualenvwrapper.sh
# edit ~/.profile to include
export WORKON_HOME=$HOME/.virtualenvs
# You may have to log out then log back in to make the change effective
Make a virtualenv
mkvirtualenv --system-site-packages dev
workon dev
# If you want to make this virtual environment your default Python,
# edit ~/.bashrc to include
workon dev
Add site-packages to sys.path:
add2virtualenv $HOME/.virtualenvs/dev/lib/python2.7/site-packages
Install Cython
pip install -U Cython
Install git
sudo apt-get install git
Install NumPy
cd ~/src
git clone https://github.com/numpy/numpy.git
sudo apt-get install python-dev build-essential
sudo apt-get install libatlas-base-dev libatlas3gf-base
# ensure clean build
# this is not necessary the first time, but useful when upgrading
cd ~/src/numpy
/bin/rm -rf ~/src/numpy/build
cdsitepackages && /bin/rm -rf numpy numpy-*-py2.7.egg-info
cd ~/src/numpy
python setup.py build --fcompiler=gnu95
python setup.py install
Install SciPy
cd ~/src
git clone https://github.com/scipy/scipy.git
# ensure clean build
cd ~/src/scipy
/bin/rm -rf ~/src/scipy/build
cdsitepackages && /bin/rm -rf scipy scipy-*-py2.7.egg-info
cd ~/src/scipy
git clean -xdf
python setup.py install
Install Matplotlib dependencies
pip install -U pyparsing
pip install -U six
pip install -U python-dateutil
pip install -U pytz
sudo apt-get install libzmq-dev
pip install -U tornado pygments pyzmq
pip install -U nose
sudo apt-get install python-qt4 python-qt4-doc python-pyside python-cairo python-wxgtk2.8 python-gtk2 dvipng
apt-cache depends python-matplotlib | awk '/Depends:/{print $2}' | xargs dpkg --get-selections
sudo apt-get build-dep python-matplotlib
Install Matplotlib
cd ~/src/
git clone https://github.com/matplotlib/matplotlib
# ensure clean build
cd ~/src/matplotlib
/bin/rm -rf ~/src/matplotlib/build
cdsitepackages && /bin/rm -rf matplotlib* mpl_toolkits
# compile and install
cd ~/src/matplotlib
python setup.py build
python setup.py install
Install IPython
cd ~/src
git clone https://github.com/ipython/ipython.git
# ensure clean build
cd ~/src/ipython
/bin/rm -rf ~/src/ipython/build
cdsitepackages && /bin/rm -rf ipython-*-py2.7.egg
cd ~/src/ipython
python setupegg.py install
Install Pandas
cd ~/src
git clone https://github.com/pydata/pandas.git
cd ~/src/pandas
# update
git fetch origin
git rebase --interactive origin/master
# ensure clean build and install
/bin/rm -rf ~/src/pandas/{build,dist}
cdsitepackages && /bin/rm -rf pandas*
cd ~/src/pandas
python setup.py build_ext --inplace
python setup.py install
Updating:
The advantage of the git approach is that it provides a way to always keep these packages up-to-date:
cd ~/src/package-name
git fetch origin
git rebase --interactive origin/master
Follow the instructions above to ensure a clean build, and then rebuild and
reinstall the package.
Shorthand for using pip with GitHub directly
The above steps to clone and install packages can be automated to an extent with pip. For example, we can also install NumPy like this:
pip install git+https://github.com/numpy/numpy.git
The updating would then be just
pip install numpy --upgrade --force-reinstall
The --force-reinstall flag may be needed because pip checks the version from PyPI and doesn't update if the current version isn't smaller.
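The reason --force-reinstall is needed is that pip compares version strings, and a rebuilt git checkout often reports the same version number, which doesn't sort as newer. The comparison can be mimicked for plain X.Y.Z strings with a naive stdlib-only parser (a sketch; real pip uses the packaging library's much richer rules for dev/rc versions):

```python
def naive_version(v):
    """Parse a plain X.Y.Z version string into a comparable tuple
    (no dev/rc/local-segment handling)."""
    return tuple(int(part) for part in v.split("."))

# A rebuilt checkout with the same version number does not compare
# as newer, so pip skips it unless forced.
print(naive_version("1.14.3") < naive_version("1.15.0"))  # True
print(naive_version("1.14.3") < naive_version("1.14.3"))  # False
```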
Via the Anaconda distribution:
Download and install
wget http://repo.continuum.io/miniconda/Miniconda-latest-Linux-x86_64.sh -O miniconda.sh
chmod +x miniconda.sh
./miniconda.sh -b
export PATH=$HOME/miniconda/bin:$PATH
conda update conda --yes
Install just the packages in the title in their own environment:
conda create --name myenv --yes python=3.4 pandas matplotlib ipython-notebook
source activate myenv
Note: I believe anaconda supports Python versions 2.6, 2.7, 3.3, and 3.4.