The current default version of Python running on Google Colab is 3.7, but I need 3.9 for my notebooks to work.
How can I update Google Colab's Python version to 3.9 (or greater)?
In Google Colab you have a Debian-based Linux, so you can do anything there that you would do on your own Debian system. Upgrading Python is as easy as upgrading it on your own Linux machine.
Check the current Python version in Colab:
!python --version
#Python 3.8.16
Install the new Python version
Let's first install Python 3.9 and make it the default:
#install python 3.9
!sudo apt-get update -y
!sudo apt-get install python3.9
#change alternatives
!sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.8 1
!sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.9 2
#check python version
!python --version
#Python 3.9.16
Point the Colab kernel to the newly installed Python
As mentioned in the comments, the commands above only add a new Python version to your Colab VM and change the default Python for command-line usage. The notebook runtime itself (what sys.version reports) is still running on the previous Python version. The following commands also need to be executed to switch the kernel over.
# install pip for new python
!sudo apt-get install python3.9-distutils
!wget https://bootstrap.pypa.io/get-pip.py
!python get-pip.py
# credit for these last two commands belongs to #Erik
# install colab's dependencies
!python -m pip install ipython ipython_genutils ipykernel jupyter_console prompt_toolkit httplib2 astor
# link to the old google package
!ln -s /usr/local/lib/python3.8/dist-packages/google \
/usr/local/lib/python3.9/dist-packages/google
Now you can restart the runtime and check the sys version. Note that under the new Python version you have to install every package (pandas, tensorflow, etc.) from scratch.
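For example, to reinstall a couple of them with the new interpreter (pandas and tensorflow are just illustrative names here, not a required list):
!python -m pip install pandas tensorflow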
Also note that you can list the installed Python versions and switch between them at any time with the command below (if nothing changed after installation, use it to select the Python version manually):
!sudo update-alternatives --config python3
#after running, enter the row number of the python version you want.
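If you prefer a non-interactive switch (handy inside a notebook cell), update-alternatives also accepts --set with an explicit path; the path below assumes Python 3.9 was installed as shown above:
!sudo update-alternatives --set python3 /usr/bin/python3.9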
It's also possible to update the kernel without going through ngrok or conda with some creative package installation.
Raha's answer, which suggests making a link between the default google package and the newly installed Python version, is the trick that makes this work, because (at least with Python 3.9) the version of pandas (0.24.0) that the google package requires fails to build.
Here's the code I used to install and switch my Colab kernel to Python 3.9:
#install python 3.9 and dev utils
#you may not need all the dev libraries, but I haven't tested which aren't necessary.
!sudo apt-get update -y
!sudo apt-get install python3.9 python3.9-dev python3.9-distutils libpython3.9-dev
#change alternatives
!sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.8 1
!sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.9 2
#Check that it points at the right location
!python3 --version
# install pip
!curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
!python3 get-pip.py --force-reinstall
#install colab's dependencies
!python3 -m pip install ipython ipython_genutils ipykernel jupyter_console prompt_toolkit httplib2 astor
# link to the old google package
!ln -s /usr/local/lib/python3.8/dist-packages/google \
/usr/local/lib/python3.9/dist-packages/google
# There has got to be a better way to do this...but there's a bad import in some of the colab files
# IPython no longer exposes traitlets like this, it's a separate package now
!sed -i "s/from IPython.utils import traitlets as _traitlets/import traitlets as _traitlets/" /usr/local/lib/python3.9/dist-packages/google/colab/*.py
!sed -i "s/from IPython.utils import traitlets/import traitlets/" /usr/local/lib/python3.9/dist-packages/google/colab/*.py
If Google updates the preinstalled Python beyond 3.8, you'll have to change the path to the default google package accordingly.
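For example, you can check which preinstalled Python the google package currently lives under before creating the symlink (a quick check; the exact path may differ between Colab images):
!ls -d /usr/local/lib/python3.*/dist-packages/google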
Then go to the Runtime menu and select Restart runtime. It should reconnect and use the updated version of Python as the default kernel. You can check that it worked with:
#check python version
import sys
print(sys.version)
!python3 --version
!python --version
To use another Python version in Google Colab, you need to:
1- Install Anaconda.
2- Add a (fake) google.colab library.
3- Start JupyterLab.
4- Access it through ngrok.
# install Anaconda3
!wget -qO ac.sh https://repo.anaconda.com/archive/Anaconda3-2020.07-Linux-x86_64.sh
!bash ./ac.sh -b
# a fake google.colab library
!ln -s /usr/local/lib/python3.6/dist-packages/google \
/root/anaconda3/lib/python3.8/site-packages/google
# start jupyterlab, which now has Python3 = 3.8
!nohup /root/anaconda3/bin/jupyter-lab --ip=0.0.0.0&
# access through ngrok, click the link
!pip install pyngrok -q
from pyngrok import ngrok
print(ngrok.connect(8888))
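JupyterLab will ask for a token when you open the ngrok link. Because the server was started with nohup, its startup log (including the tokenized URL) ends up in nohup.out, so one way to retrieve it is:
!cat nohup.out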
You can also use:
# Install the python version
!apt-get install python3.9
# Select the version
!python3.9 setup.py
Another way is to use a virtual environment with your desired Python version:
virtualenv env --python=python3.9
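Keep in mind that in a Colab notebook each ! command runs in its own shell, so source env/bin/activate will not persist between lines. A simple workaround (a sketch, assuming the environment was created as above and named env) is to call the environment's own interpreter and pip directly:
!env/bin/python --version
!env/bin/pip install requests  # example package; install whatever you need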
Update 24.12.2022 - Unfortunately, the method does not work anymore.
This worked for me (copied from GitHub); I successfully installed Python 3.10.
#The code below installs 3.10 (assuming you now have 3.8) and restarts the environment, so you can run your cells.
import sys  # for version check
import os   # for restart routine

if '3.10' in sys.version:
    print('You already have 3.10')
else:
    #install python 3.10 and dev utils
    #you may not need all the dev libraries, but I haven't tested which aren't necessary.
    !sudo apt-get update -y
    !sudo apt-get install python3.10 python3.10-dev python3.10-distutils libpython3.10-dev
    !sudo apt-get install python3.10-venv binfmt-support #recommended in the install logs of the command above
    #change alternatives
    !sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.8 1
    !sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.10 2
    # install pip for python 3.10 (download the script to a file so the next line can run it)
    !curl -sS https://bootstrap.pypa.io/get-pip.py -o get-pip.py
    !python3 get-pip.py --force-reinstall
    #install colab's dependencies
    !python3 -m pip install setuptools ipython ipython_genutils ipykernel jupyter_console prompt_toolkit httplib2 astor
    #minor cleanup
    !sudo apt autoremove
    #link to the old google package
    !ln -s /usr/local/lib/python3.8/dist-packages/google /usr/local/lib/python3.10/dist-packages/google
    #this is just to verify that the 3.10 folder was indeed created
    !ls /usr/local/lib/python3.10/
    #restart the environment so you don't have to do it manually
    os.kill(os.getpid(), 9)
In addition to Kaveh's answer, I added the following code. (This Colab's Python version was 3.8, and I was trying to downgrade to Python 3.7.)
!pip install google-colab==1.0.0
# install colab's dependencies
!python -m pip install ipython==7.9.0 ipython_genutils==0.2.0 ipykernel==5.3.4 jupyter_console==6.1.0 prompt_toolkit==2.0.10 httplib2==0.17.4 astor==0.8.1 traitlets==5.7.1 google==2.0.3
This way, I solved the crashing runtime error.
Simple as that:
!wget -O mini.sh https://repo.anaconda.com/miniconda/Miniconda3-py39_4.9.2-Linux-x86_64.sh
!chmod +x mini.sh
!bash ./mini.sh -b -f -p /usr/local
!conda install -q -y jupyter
!conda install -q -y google-colab -c conda-forge
!python -m ipykernel install --name "py39" --user
Source: https://colab.research.google.com/drive/1m47aWKayWTwqJG--x94zJMXolCEcfyPS?usp=sharing#scrollTo=r3sLiMIs8If3
I started installing everything I need according to the React Native Get Started guide.
I installed Watchman according to their guide.
I got the following error while running the ./configure command:
arafath@dell-pc:~/watchman$ ./configure
bash: ./configure: No such file or directory
OS: Ubuntu 16.04
I have succeeded in installing the following:
sudo apt-get install -y autoconf automake build-essential python-dev libssl-dev libtool
I tried installing scrapy on another server.
When I run pip install scrapy, I get:
error in cryptography setup command: Invalid environment marker: python_version < '3'
Complete output from command python setup.py egg_info:
error in cryptography setup command: Invalid environment marker: python_version < '3'
----------------------------------------
Cleaning up...
Command python setup.py egg_info failed with error code 1 in /tmp/pip_build_root/cryptography
Storing debug log for failure in /root/.pip/pip.log
Any ideas? Please help.
Python 2.7.6
I did
sudo apt-get install python-dev python-pip libxml2-dev libxslt1-dev zlib1g-dev libffi-dev libssl-dev
I also did a sudo apt-get update before that.
The same happens on Debian 8. You can pin the cryptography library to version 2.0.3 and then retry:
pip install cryptography==2.0.3
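After pinning, retry the install from the question:
pip install scrapy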
I just encountered this same issue and solved it via:
pip install --upgrade setuptools pip
Apparently the default versions of setuptools and pip on Ubuntu 14.04 are too old for the cryptography package.
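If you want to see which versions you currently have (a quick sanity check, not part of the original fix), you can run:
pip --version
python -c "import setuptools; print(setuptools.__version__)"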
I've created a virtualenv for PyPy with:
virtualenv test -p `which pypy`
source test/bin/activate
I installed the following dependencies:
sudo apt-get install python-dev libxml2 libxml2-dev libxslt-dev
And then I run:
pip install --upgrade lxml
As a result I get a lot of errors looking like this:
src/lxml/lxml.etree.c:234038:22: error: 'PyThreadState' {aka 'struct _ts'} has no member named 'use_tracing'
How do I properly install lxml for PyPy 2.6.0?
I used the following fork of lxml for PyPy instead:
https://github.com/aglyzov/lxml/tree/cffi
It can be installed with:
pip install -e git+git://github.com/aglyzov/lxml.git@cffi#egg=lxml-cffi
When doing:
$ sudo pypy -m easy_install lxml
The response is:
Searching for lxml
[...snip...]
ERROR: /bin/sh: 1: xslt-config: not found
** make sure the development packages of libxml2 and libxslt are installed **
Using build configuration of libxslt
/usr/lib/pypy/lib-python/2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
warnings.warn(msg)
warning: no files found matching '*.txt' under directory 'src/lxml/tests'
src/lxml/lxml.etree.c:8:22: fatal error: pyconfig.h: No such file or directory
compilation terminated.
error: Setup script exited with error: command 'cc' failed with exit status 1
At the same time, sudo pip install lxml works fine.
What's going on?
Thanks.
sudo apt-get install python-dev fixed it for me on ubuntu 13.04
$ yum install python-lxml or $ apt-get install python-lxml
This solved it for me.
I've stumbled with this trouble a couple of times.
Short answer
Python2: $ python2.7 setup.py clean build --with-cython install
Python3: $ pip-3.3 install lxml
Long answer
The hypothesis is that pip install lxml should work in every environment, regardless of whether you are using Python2 or Python3.
There's also Cython to be considered: you will certainly enjoy lxml compiled with Cython because of its significant performance gains.
For reasons unknown to me, the compilation on Python2 does not find Cython.
To be more precise and absolutely explicit about this matter, both commands below DO NOT employ Cython:
# DO NOT use these commands. I repeat: DO NOT use these commands.
$ pip-2.7 install lxml
$ easy_install-2.7 install lxml
So, when using Python2 you have only one alternative, as far as I know, which is: compile from sources, Luke!
# install build environment and dependencies
$ kernel_release=$( uname -r )
$ sudo apt-get install linux-headers-${kernel_release} build-essential -y
$ sudo apt-get install libxml2-dev libxslt1-dev -y
# Download from github and compile from sources
$ git clone --branch lxml-3.2.4 https://github.com/lxml/lxml
$ python2.7 setup.py clean build --with-cython install
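To verify the build, you can import lxml and print the library version it was compiled against (a quick check, not part of the original recipe; LXML_VERSION is lxml's own version tuple):
$ python2.7 -c "from lxml import etree; print etree.LXML_VERSION"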
I've handled this problem by installing the Ubuntu package pypy-dev.