Upload package to PyPI complains "must use HTTPS" - pypi

When executing this from the command line within my package directory:
python setup.py sdist bdist_egg upload
I get:
Server response (403): Must access using HTTPS instead of HTTP
This used to work many times until now. Searching for the error message didn't turn up anything helpful. Does anyone have a clue what's going on?

Update: Use twine for uploading distributions to PyPI.
Are you using a .pypirc file?
If so, change the URLs to point to the HTTPS endpoints:
[distutils]
index-servers =
    pypi
    pypitest

[pypi]
repository=https://pypi.python.org/pypi
username=your_username
password=your_password

[pypitest]
repository=https://testpypi.python.org/pypi
username=your_username
password=your_password
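As a quick sanity check, you can parse your .pypirc and flag any index that still points at plain HTTP (the cause of the 403 above). This is a minimal sketch; `insecure_repos` is a hypothetical helper, not part of any library:

```python
import configparser

def insecure_repos(pypirc_text):
    """Return the names of index servers whose repository URL is plain HTTP."""
    cfg = configparser.ConfigParser()
    cfg.read_string(pypirc_text)
    bad = []
    for section in cfg.sections():
        url = cfg.get(section, "repository", fallback=None)
        if url and url.startswith("http://"):
            bad.append(section)
    return bad

example = """
[pypi]
repository=http://pypi.python.org/pypi
username=your_username
"""
print(insecure_repos(example))  # ['pypi']
```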

Updating setuptools makes the error disappear:
pip install setuptools -U
Then running the upload-command ends with:
Submitting dist/my.packagename-1.3.tar.gz to https://upload.pypi.org/legacy/
error: None
But still, no new version shows up on PyPI.
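For reference, the modern replacement for `setup.py upload` is twine, which always uploads over HTTPS. A sketch of that flow (assumes twine is installed and a valid .pypirc or interactive login):

```shell
pip install --upgrade twine        # one-time setup
python setup.py sdist bdist_wheel  # build source and wheel distributions
twine upload dist/*                # uploads over HTTPS to upload.pypi.org
```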

Related

How to install SASL with Python 3.8?

I am trying to install the sasl3-0.2.11 Python package on a Windows 10 machine (64-bit).
It is failing with a C1083 fatal error.
Because of some proxies I cannot avoid, I am installing it by downloading the tar.gz from PyPI, entering the uncompressed folder, and running python setup.py install.
This approach worked for every module except sasl.
I then read this useful comment, but the .whl from Cyrus SASL did not work either: they only support Python up to 3.7, not 3.8.
I am really wondering how I can get past this issue, or whether I can avoid sasl altogether and still use PyHive.
Thanks in advance.
Nourou
In the end, I just uninstalled Python 3.8 and installed 3.7.
Then I was able to install sasl via the wheel file here.
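Installing such a wheel is a single pip command. The filename below is hypothetical; pick the cp37/win_amd64 build that actually matches your interpreter and architecture:

```shell
# Hypothetical filename - use the wheel matching your Python build
pip install sasl-0.2.1-cp37-cp37m-win_amd64.whl
```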
You just need to install the following packages on Ubuntu:
apt-get install libsasl2-dev libsasl2-2 libsasl2-modules-gssapi-mit

Scrapy Installation Error: [CondaEnvironmentNotFoundError] : could not find environment: base

I was trying to install scrapy when I encountered my first error:
ERROR conda.core.link:_execute_actions(337): An error occurred while installing package 'conda-forge::automat-0.7.0-py_1'.
CondaError: Cannot link a source that does not exist. D:\ProgramFiles\Python\Scripts\conda.exe
Running conda clean --packages may resolve your problem.
Attempting to roll back.
CondaError: Cannot link a source that does not exist. D:\ProgramFiles\Python\Scripts\conda.exe
Running conda clean --packages may resolve your problem.
I researched this error and followed the advice on this link:
My issues were largely similar to his until I reached the comment which advised me to run conda update -n base conda.
When I ran this code, I encountered my next error:
CondaEnvironmentNotFoundError: Could not find environment: base .
You can list all discoverable environments with conda info --envs.
Kindly advise whether the steps I took were appropriate and how I can fix this issue.
The weird thing is I installed scrapy before, and these errors occurred after I recently re-installed Anaconda.
I'm not sure what other info you might require to better understand the situation. Do let me know and I will assist promptly.
Thank You
Try conda install scrapy using the default channel instead of the conda-forge channel.
To understand the difference between these two channels, please read the answer to the following question: Should conda, or conda-forge be used for Python environments?
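A sketch of the suggested commands, combined with the cache cleanup the error message itself recommends (assumes the defaults channel carries scrapy):

```shell
conda clean --packages   # clear the broken package cache first
conda install scrapy     # pulls from the defaults channel
# rather than:  conda install -c conda-forge scrapy
```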

Install PycURL after Mac update to High Sierra - SSL error

I updated my Mac to High Sierra, and now I can't install PycURL. It fails with this message: Curl is configured to use SSL, but we have not been able to determine which SSL backend it is using. Please see the PycURL documentation for how to specify the SSL backend manually.
I searched the documentation and the web and found some solutions that did not fix my problem. The most popular is this one:
pip uninstall pycurl
export PYCURL_SSL_LIBRARY=openssl
pip install pycurl
Here is the complete error:
A solution similar to the one you found worked for me when issued from within my virtualenv. I use Homebrew as a package manager on macOS High Sierra, and Pipenv to manage my project dependencies and virtualenv. The error emerged after adding the PyVimeo API Library, which has PycURL as a dependency, to my project.
The generated errors were, first,
src/pycurl.c:137:4: warning: #warning "libcurl was compiled with SSL
support, but configure could not determine which library was used;
thus no SSL crypto locking callbacks will be set, which may cause
random crashes on SSL requests" [-Wcpp]
then,
ImportError: pycurl: libcurl link-time ssl backend (openssl) is
different from compile-time ssl backend (none/other)
As mentioned in the PycURL docs, the solution was to "tell [PycURL's] setup.py what SSL backend is used." Setting the environment variables recommended in the output of brew info openssl, alone, did not solve the problem.
Then I found a tangentially related Github issue comment and tried the following from within my project's virtualenv:
(env)$ pip uninstall pycurl
(env)$ pip install --upgrade pip
(env)$ export LDFLAGS=-L/usr/local/opt/openssl/lib
(env)$ export CPPFLAGS=-I/usr/local/opt/openssl/include
(env)$ export PYCURL_SSL_LIBRARY=openssl
(env)$ pip install pycurl
The install command gave this output:
Collecting pycurl
  Using cached https://files.pythonhosted.org/packages/e8/e4/0dbb8735407189f00b33d84122b9be52c790c7c3b25286826f4e1bdb7bde/pycurl-7.43.0.2.tar.gz
Building wheels for collected packages: pycurl
  Running setup.py bdist_wheel for pycurl ... done
  Stored in directory: /Users/me/Library/Caches/pip/wheels/d2/85/ae/ebf5ff0f1378a69d082b4863df492bf54c661bf6306a2bd
Successfully built pycurl
tuspy 0.2.1 has requirement pycurl==7.43.0, but you'll have pycurl 7.43.0.2 which is incompatible.
Installing collected packages: pycurl
Successfully installed pycurl-7.43.0.2
I noted the (somewhat petty?) tuspy error and trudged on. This time, my script ran without PycURL complaining.

django-registration module latest version on openshift

I am hosting django-1.5 app on openshift. I need django-registration module which I have specified in requirements.txt file.
The problem is that OpenShift cannot find the latest version, django-registration-1.0, only django-registration-0.8, which is not compatible with django-1.5. Any idea how to resolve this, or how to add a manual link to the latest version in requirements.txt?
I don't get why it can't find the package when it is available on PyPI.
remote: Searching for django-registration==1.0
remote: Reading http://mirror1.ops.rhcloud.com/mirror/python/web/simple/django-registration/
remote: Reading http://www.bitbucket.org/ubernostrum/django-registration/wiki/
remote: Reading <some other link>
remote: Reading <some other link>
remote: Reading <Some Other link>
remote: No local packages or download links found for django-registration==1.0
remote: Best match: None
I made it work using setuptools by specifying a dependency link, though why the PyPI package is not found is still not clear to me.
from setuptools import setup, find_packages

setup(
    ...
    ...
    packages=find_packages(),
    include_package_data=True,
    install_requires=['django-registration==1.0'],
    dependency_links=[
        "http://pypi.python.org/pypi/django-registration"
    ],
)
How about directly installing the package by logging into the application gear via SSH and running:
source ~/python-2.6/virtenv/bin/activate
pip install --log $OPENSHIFT_DATA_DIR/inst.log https://URL_TO_CUSTOM_PACKAGE
OR
source ~/python-2.6/virtenv/bin/activate
pip install --log $OPENSHIFT_DATA_DIR/inst.log -E $VIRTUAL_ENV $path_to/package
Since the issue is still alive (argh!) and I couldn't install the latest security release for Django, I had to find a workaround for this problem.
Inserting the following line at the top of requirements.txt magically solved the problem:
--index-url https://pypi.python.org/simple
It just sets the base URL used for finding packages.
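Putting it together, the requirements.txt for this case might look like the fragment below (the django-registration pin is from the question above; the Django pin is an assumption matching the app's stated version):

```text
--index-url https://pypi.python.org/simple
Django==1.5
django-registration==1.0
```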
I know the question is a bit old, but I had a similar problem with OpenShift. On PyPI the package wagtail had a latest version of 1.4.1, but on OpenShift only 1.3.1 was found. After git push, the output showed a URL which seemed to point to a mirror instead of pypi.python.org.
I logged in the app and:
env | grep -i pypi
OPENSHIFT_PYPI_MIRROR_URL=http://mirror1.ops.rhcloud.com/mirror/python/web/simple
It seems that OpenShift by default uses its own mirror for Python packages - a mirror that is a bit out of date. I don't know why. I can't really say whether it is better to do as tomako suggests, or to change the env variable OPENSHIFT_PYPI_MIRROR_URL, or how often the mirror is updated.

Getting easy_install to install something for Python 2.5 when the default is Python 2.7

Since I'm using Google App Engine, I have to use Python 2.5.
Because of this, I need to install an older version of BeautifulSoup that works with Python 2.5 (I think bs 3.0.7a will work).
To do that, as far as I can tell, I need to get easy_install to put BeautifulSoup in the Python 2.5 folder rather than in the Python 2.7 folder, which it does by default.
the docu for easy_install said the following:
"Also, if you’re working with Python version 2.4 or higher, you can run Python with -m easy_install to run that particular Python version’s easy_install command."
But how exactly do I do that?
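Concretely, the docs' suggestion for this case would look like the line below (assumes python2.5 is on your PATH and has setuptools installed; the BeautifulSoup pin is the version guessed above):

```shell
# Run the easy_install that belongs to the 2.5 interpreter
python2.5 -m easy_install "BeautifulSoup==3.0.7a"
```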
This guide helped me get it right:
http://achinghead.com/installing-multiple-versions-python.html