Pip always fails with an SSL error, even when I run pip install dedupe or pip install --trusted-host pypi.python.org dedupe.
The output is always the same no matter what:
Collecting dedupe
Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)'),)': /simple/dedupe/
Retrying...
skipping
Could not find a version that satisfies the requirement dedupe (from versions: )
No matching distribution found for dedupe
So I uninstalled anaconda and reinstalled it. Same thing.
Do you think the problem is that my _ssl.c file (which I have no idea where it is) must be corrupt or something? Why would pip need to reference that if I'm telling it to bypass ssl verification anyway?
It may be related to the 2018 change of PyPI domains.
Please ensure your firewall/proxy allows access to/from:
pypi.org
files.pythonhosted.org
So you could try something like:
$ python -m pip install --trusted-host files.pythonhosted.org --trusted-host pypi.org --trusted-host pypi.python.org [--proxy ...] [--user] <packagename>
Please see $ pip help install for the --user option description (omit if in a virtualenv).
The --trusted-host option doesn't actually bypass SSL/TLS; it only lets you mark a host as trusted when (and only when) it does not have valid (or any) HTTPS. It shouldn't really matter with PyPI, because pypi.org (formerly pypi.python.org) does use HTTPS, and there is a CDN in front of it which always enforces a TLSv1.2 handshake regardless of the connecting pip client's options. But if you had your own local mirror of pypi.org with HTTP-only access, then --trusted-host could be handy. Oh, and if you are behind a proxy, please also make sure to specify: --proxy [user:passwd@]proxyserver:port
Some corporate proxies may even go as far as to replace the certificates of HTTPS connections on the fly. And if your system clock is out of sync, it could break the SSL verification process as well.
If the firewall / proxy / clock isn't the problem, then check the SSL certificates being used in pip's SSL handshake. In fact, you could just get a current cacert.pem (Mozilla's CA bundle from curl) and try it using the pip option --cert:
$ pip --cert ~/cacert.pem install --user <packagename>
where the --cert argument is the system path to your alternate CA bundle in PEM format (regarding the --user option, please see below).
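As a quick, hedged sketch of that step (the curl project publishes Mozilla's CA bundle at the URL below; the package name is just the OP's example):
$ curl -fsSL https://curl.se/ca/cacert.pem -o ~/cacert.pem
$ pip --cert ~/cacert.pem install --user dedupe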
Or, it's possible to create a custom config ~/.pip/pip.conf and point the option at a valid system cert (or your cacert.pem) as a workaround, for example:
[global]
cert = /etc/pki/tls/external-roots/ca_bundle.pem
(or another pem file)
It's even possible to manually replace the original cacert.pem found in pip with your trusty CA bundle (if your pip is very old, for example). Older pip versions knew to fall back between pip/_vendor/requests/cacert.pem and system stores like /etc/ssl/certs/ca-certificates.crt or /etc/pki/tls/certs/ca-bundle.crt in case of cert issues, but in recent pip this is no longer the case, as it seems to rely solely on pip/_vendor/certifi/cacert.pem.
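To see which CA bundle your pip is actually relying on, you can ask the vendored certifi directly (this assumes a pip recent enough to vendor certifi, i.e. 9.0.2+):
$ python -c "from pip._vendor import certifi; print(certifi.where())"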
Basically, the pip package uses requests, which uses urllib3, which (among other things) verifies SSL certificates; all of them are shipped (vendored) within pip, along with the certifi package (also vendored, since pip 9.0.2) that provides the current CA bundle (cacert.pem file) required for TLS verification. Requests itself uses urllib3 and certifi internally, and before 9.0.2 pip used cacert.pem from requests or from the system. What it all means is that actually updating pip may help fix the CERTIFICATE_VERIFY_FAILED error, particularly if the OS and pip were deployed long ago:
The OP used anaconda, so they could try:
$ conda update pip - because issues can arise if conda and pip are used together in the same environment. If there's no pip version update available, they could try:
$ conda config --add channels conda-forge; conda update pip
Alternatively, it's possible to use conda alone to directly install / manage python packages: it is a tool completely separate from pip, but provides similar features in terms of package and venv management. Its packages come not from PyPI, but from anaconda's own repositories.
The problem is, if you mix both and run conda after pip, conda can overwrite and break packages (and their dependencies) installed via pip, and render it all unusable. So it's recommended to only use one or the other, or, if you have to, use pip only after conda (and no conda after pip), and only in isolated conda environments.
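As a rough sketch of that isolated-environment workflow (the environment name and Python version below are just examples):
$ conda create -n dedupe-env python=3.9 pip
$ conda activate dedupe-env
(dedupe-env)$ python -m pip install dedupe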
On normal Linux Python installations without conda:
If you are using a version of pip supplied by your OS distribution, then use vendor-supplied upgrades for a system-wide pip update:
$ sudo apt-get install python-pip or: $ sudo yum install python27-pip
Some updates may not be readily available because distros usually lag behind PyPI. In this case, it's possible to upgrade pip at your user level (right in your $HOME dir), or inside a virtualenv, like:
$ python -m pip install --user --trusted-host files.pythonhosted.org --trusted-host pypi.org --trusted-host pypi.python.org --upgrade pip
(omit --user if in a virtualenv)
The --user switch will upgrade pip only for the current user (in your home ~/.local/lib/) rather than for the whole OS, which is a good practice to avoid interfering with the system's python packages. It's enabled by default in the pip distributed in recent Ubuntu/Fedora versions. Be aware of how to solve the ImportError if you don't use this option and happen to overwrite the OS-level system pip.
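If you're not sure which pip (and which Python) a bare pip command resolves to before upgrading, a quick check like this can help:
$ which -a pip
$ python -m pip --version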
Alternatively (also at a user level) you could try:
$ curl -LO https://bootstrap.pypa.io/get-pip.py && python get-pip.py --user
The PyPA script contains a wrapper that extracts the .pem SSL bundle from pip._vendor.certifi.
Otherwise, if still no-go, try running pip with -vvv option to add verbosity to the output and check if there is now another SSLError caused by tlsv1 alert protocol version.
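For example (the package name is just the OP's):
$ python -m pip install -vvv dedupe
The extra verbosity shows the index URLs, retries and connection errors in detail, which usually makes the underlying TLS problem visible.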
This worked for me, try this:
pip install --trusted-host=pypi.org --trusted-host=files.pythonhosted.org --user {name of whatever I'm installing}
My way is a simplification of @Alex C's answer:
python -m pip install --trusted-host pypi.python.org --trusted-host files.pythonhosted.org --trusted-host pypi.org --upgrade pip
I experienced the same issue because I have Zscaler (a cloud security software) installed, which was causing:
URL host for python packages being blocked
invalid SSL certificate warnings popping up
SSL inspection certificate not trusted
As mentioned by others, the command below will fix individual package installations. pypi.python.org is not required since it has been replaced by pypi.org.
pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org <package to install>
I permanently fixed the issue by creating a pip.ini file (pip.conf on Unix) and adding the following:
[global]
trusted-host = pypi.python.org
               pypi.org
               files.pythonhosted.org
See pip configuration files for how to locate your pip.ini, or where to put it if you need to create one.
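If you'd rather not locate the file by hand, newer pip versions can also report which configuration files they consult (this assumes pip 10+ where the pip config subcommand exists; exact output varies by pip version and platform):
$ pip config list -v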
The error above (or one like it) was caused by the virtual machine (VM) not being time synchronized; my guest Ubuntu VM was several days in the past.
I ran this command to get the VM to pick up the correct network time:
sudo timedatectl set-ntp on
This makes the Ubuntu guest OS get the network time. (You may have to provide a network time source... I used this article: Digital Ocean - How to set time on Ubuntu)
Check the time is correct:
timedatectl
Re-run the failing pip command.
Related
I am trying to compile a simple program into an APK for Android with Buildozer but have run into the following problem. Can you please help? I tried upgrading pip, but that didn't help: python3 -m pip install --upgrade pip
So, I reverted back to the original pip version.
Installed Cython separately: pip3 install Cython
But the same issue persists. I am at a loss. :-(
Command: buildozer android debug
RAN: /bin/bash -c 'venv/bin/pip install Cython'
STDOUT:
WARNING: pip is configured with locations that require TLS/SSL, however the ssl module in Python is not available.
Could not fetch URL https://pypi.org/simple/cython/: There was a problem confirming the ssl certificate: HTTPSConnectionPool(host='pypi.org', port=443): Max retries exceeded with url: /simple/cython/ (Caused by SSLError("Can't connect to HTTPS URL because the SSL module is not available.")) - skipping
ERROR: Could not find a version that satisfies the requirement Cython (from versions: none)
ERROR: No matching distribution found for Cython
WARNING: pip is configured with locations that require TLS/SSL, however the ssl module in Python is not available.
How can I fix this?
This is a recent bug; you need to install libssl-dev using apt install libssl-dev
Then you might also want to clean your buildozer directory by running rm -rf .buildozer in the directory that contains your buildozer.spec file.
That should do it!
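If you want to confirm that the Python pip runs under has a working ssl module again after rebuilding, a quick check along these lines should do (the venv path is just an example matching the output above):
$ venv/bin/python -c "import ssl; print(ssl.OPENSSL_VERSION)"
If that prints an OpenSSL version string instead of raising an ImportError, TLS downloads should work again.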
I installed spacy in the Anaconda prompt using conda install -c conda-forge spacy. But when I tried to download en_core_web_sm using python -m spacy download en_core_web_sm, I got an SSL: CERTIFICATE_VERIFY_FAILED error.
With HTTPS, trying to download something from a remote host can produce an SSL connection error in some cases, for example if your computer is behind a proxy which does not let you make SSL connections freely. For those cases, a download manager like pip or conda for Python, or apt-get or yum for Linux, provides options to specify a certificate for such connections or to allow untrusted communication with the remote host.
However, downloading a model via spaCy with python -m spacy download does not provide such options: you cannot add SSL certificates or specify a trusted host for the download.
Fortunately, there's a workaround in two separate steps: downloading and installing. That is, download the model with any other client whose SSL behaviour you can control (browser, curl, wget...), then install the downloaded model with pip install.
Find the model you need on https://github.com/explosion/spacy-models/releases and download the tar.gz file, for example:
wget https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.2.5/en_core_web_sm-2.2.5.tar.gz
Then install it like this:
python -m pip install ./en_core_web_sm-2.2.5.tar.gz
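To confirm the manually installed model actually loads (the model name follows the example above):
$ python -c "import spacy; spacy.load('en_core_web_sm')"
If that exits without an error, the pipeline installed correctly.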
Just download the direct version.
python -m spacy download en_core_web_sm-2.2.0 --direct
I had the same error as you, gave this a try, and it worked. For more information here are some additional details from the model page:
https://spacy.io/usage/models
The answer provided by K. Symbol is helpful. As an alternative, the download and installation can be done in one statement with pip: pip accepts --trusted-host options, and the install target can be a URL, so:
pip --trusted-host github.com --trusted-host objects.githubusercontent.com install https://github.com/explosion/spacy-models/releases/download/en_core_web_md-3.4.0/en_core_web_md-3.4.0.tar.gz
For me, the issue was that I was running the command "python -m spacy download en" from a location other than "C:\WINDOWS\system32". When I ran the command from "C:\WINDOWS\system32" with "Run as Admin", it worked like a charm. It seems that from other locations it is not able to load the correct SSL config.
If you are unable to download it because you cannot verify the certificate as you are behind a company proxy, you can also do the following: first download the file via requests, specifying that you don't want to check certificates, then install it via pip:
import os, subprocess, sys
import requests

lang = 'en'
# note: English pipelines are published as en_core_web_*; the *_core_news_* names are used for most other languages
url = (f'https://github.com/explosion/spacy-models/releases/download/'
       f'{lang}_core_news_sm-3.0.0/{lang}_core_news_sm-3.0.0-py3-none-any.whl')
file = f'{lang}_core_news_sm-3.0.0-py3-none-any.whl'
r = requests.get(url, verify=False)  # verify=False to skip checking of certificate
with open(file, 'wb') as output_file:
    output_file.write(r.content)  # save the wheel locally
# then install it via pip (in a notebook you could instead run: !pip install {file} --user)
subprocess.run([sys.executable, '-m', 'pip', 'install', file, '--user'], check=True)
os.remove(file)  # remove the file
First, uninstall spacy and clean the directories. Then install with the following command:
pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org spacy
Use pip3 for Python 3 and run the following in a terminal:
python -m spacy download en_core_web_sm
Let me know if you still get errors. Follow https://spacy.io/usage/models
I'm trying to follow this guide to test this new algorithm: https://github.com/lalonderodney/SegCaps
I can't do it on my PC, so I'm using another server via PuTTY. Now I'm connected to the other server.
First of all, I installed TensorFlow as indicated in the guide with:
pip install -r requirements.txt
Then I ran this command: ./main.py segcaps.png
where segcaps.png is the image that I want to use.
Finally I ran python main.py --data_root_dir data
which is the only required parameter: the directory containing the imgs and masks folders.
Now it gives me an error:
ModuleNotFoundError: No module named 'tensorflow.python.framework'
I looked for the directory tensorflow/python/framework and it exists.
So I don't know how to solve it. Ideas?
If you have multiple Python versions installed, then you'll (most likely) have multiple pip versions installed too. Make sure that the pip command you use installs the package(s) into the Python version you want it to. It may so happen that the package got installed into python2 but you wanted it in python3.
Since using pip did not install the packages into python3, pip3 is most likely the pip tied to your python3 installation. Try
pip3 install -r requirements.txt
and that should work.
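One quick way to confirm which interpreter each pip command targets before re-running the install (each command reports the Python version and path it belongs to):
$ pip --version
$ pip3 --version
$ python3 -m pip --version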
In case you have an EnvironmentError you can try this (bad idea):
pip3 install -r requirements.txt --user
This solves the problem most of the time on standalone machines. I'm not sure about the server; insufficient permissions might block this.
Why is the --user flag a bad idea? Read: What is the purpose “pip install --user …”?
You can use pip show tensorflow to see if it is installed or not.
As for the ModuleNotFoundError, try uninstalling keras and reinstalling an earlier version with pip install keras==2.1.6
I updated my Mac to High Sierra, and now I can't install pycurl. It fails with this message: Curl is configured to use SSL, but we have not been able to determine which SSL backend it is using. Please see PycURL documentation for how to specify the SSL backend manually.
I searched the documentation and the web and found some solutions that did not fix my problem. The most popular is this one:
pip uninstall pycurl
export PYCURL_SSL_LIBRARY=openssl
pip install pycurl
here is the complete error
A solution similar to the one you found worked for me when issued from within my virtualenv. I use Homebrew as a package manager on macOS High Sierra, and Pipenv to manage my project dependencies and virtualenv. The error emerged after adding the PyVimeo API Library, which has PycURL as a dependency, to my project.
The generated errors were, first,
src/pycurl.c:137:4: warning: #warning "libcurl was compiled with SSL support, but configure could not determine which library was used; thus no SSL crypto locking callbacks will be set, which may cause random crashes on SSL requests" [-Wcpp]
then,
ImportError: pycurl: libcurl link-time ssl backend (openssl) is different from compile-time ssl backend (none/other)
As mentioned in the PycURL docs, the solution was to "tell [PycURL's] setup.py what SSL backend is used." Setting the environment variables recommended in the output of brew info openssl, alone, did not solve the problem.
Then I found a tangentially related Github issue comment and tried the following from within my project's virtualenv:
(env)$ pip uninstall pycurl
(env)$ pip install --upgrade pip
(env)$ export LDFLAGS=-L/usr/local/opt/openssl/lib
(env)$ export CPPFLAGS=-I/usr/local/opt/openssl/include
(env)$ export PYCURL_SSL_LIBRARY=openssl
(env)$ pip install pycurl
The install command gave this output:
Collecting pycurl
  Using cached https://files.pythonhosted.org/packages/e8/e4/0dbb8735407189f00b33d84122b9be52c790c7c3b25286826f4e1bdb7bde/pycurl-7.43.0.2.tar.gz
Building wheels for collected packages: pycurl
  Running setup.py bdist_wheel for pycurl ... done
  Stored in directory: /Users/me/Library/Caches/pip/wheels/d2/85/ae/ebf5ff0f1378a69d082b4863df492bf54c661bf6306a2bd
Successfully built pycurl
tuspy 0.2.1 has requirement pycurl==7.43.0, but you'll have pycurl 7.43.0.2 which is incompatible.
Installing collected packages: pycurl
Successfully installed pycurl-7.43.0.2
I noted the (somewhat petty?) tuspy error and trudged on. This time, my script ran without PycURL complaining.
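If you want to double-check which SSL backend the rebuilt PycURL ended up linked against, a one-liner like this shows it (run inside the same virtualenv):
(env)$ python -c "import pycurl; print(pycurl.version)"
The string it prints includes the libcurl version and the SSL library (e.g. OpenSSL) that were compiled in.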
Hi everyone!
I have a Debian device and I want it to upgrade automatically from my repository.
To do that I just call apt-get from cron:
apt-get --assume-yes --force-yes install mypackage
But in this case it will install the package even if it can't check the signature. How do I check the signature before installing it?
There are two types of GPG signatures:
GPG signatures on the APT repository metadata, and
GPG signatures on Debian packages.
In order to verify the APT repository metadata, you need to import the public GPG key of the signer with something like this:
wget -O - https://url/key | sudo apt-key add -
Verifying Debian package signatures is much more complicated, and most package providers (like Ubuntu and Debian) don't sign individual packages. Most likely, the package you are trying to install is not signed.
However, if the package is signed and you'd like to verify it, you'll need to:
Ensure you have debsig-verify installed.
Create an XML policy document for verifying package signatures.
Modify /etc/dpkg/dpkg.cfg to enable package signature verification (see the sketch after this list). CAUTION: you should ensure that enabling this option does not break installation of unsigned packages (like the ones provided by Ubuntu and Debian).
Package signatures will be verified when installed with apt-get.
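As a hedged illustration of the dpkg.cfg change: a stock Debian/Ubuntu system typically ships /etc/dpkg/dpkg.cfg with signature checking disabled via a no-debsig line, so enabling verification amounts to commenting that line out (or removing it):
# /etc/dpkg/dpkg.cfg
# no-debsig    <- commented out so dpkg runs debsig-verify on every package
Keep the caution above in mind: once enabled, unsigned packages (including most official Debian/Ubuntu ones) will refuse to install.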
Check out this blog post I wrote called GPG sign and verify deb packages and APT repositories which explains everything you need to know about verifying debian packages and APT repositories and includes some example configurations for debsig-verify.