Python pip install requires server_hostname

I have just installed pip on Linux, and the pip list command works. But when I run pip install, I get the following error:
Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/basecommand.py", line 232, in main
status = self.run(options, args)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/commands/install.py", line 339, in run
requirement_set.prepare_files(finder)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/req/req_set.py", line 333, in prepare_files
upgrade=self.upgrade,
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/index.py", line 305, in find_requirement
page = self._get_page(main_index_url, req)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/index.py", line 783, in _get_page
return HTMLPage.get_page(link, req, session=self.session)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/index.py", line 872, in get_page
"Cache-Control": "max-age=600",
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/_vendor/requests/sessions.py", line 473, in get
return self.request('GET', url, **kwargs)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/download.py", line 365, in request
return super(PipSession, self).request(method, url, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/_vendor/requests/sessions.py", line 461, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/_vendor/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/_vendor/cachecontrol/adapter.py", line 43, in send
resp = super(CacheControlAdapter, self).send(request, **kw)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/_vendor/requests/adapters.py", line 370, in send
timeout=timeout
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/_vendor/requests/packages/urllib3/connectionpool.py", line 518, in urlopen
body=body, headers=headers)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/_vendor/requests/packages/urllib3/connectionpool.py", line 322, in _make_request
self._validate_conn(conn)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/_vendor/requests/packages/urllib3/connectionpool.py", line 727, in _validate_conn
conn.connect()
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/_vendor/requests/packages/urllib3/connection.py", line 238, in connect
ssl_version=resolved_ssl_version)
File "/usr/local/lib/python2.7/site-packages/pip-6.0.7-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ssl_.py", line 254, in ssl_wrap_socket
return context.wrap_socket(sock)
File "/usr/local/lib/python2.7/ssl.py", line 350, in wrap_socket
_context=self)
File "/usr/local/lib/python2.7/ssl.py", line 537, in __init__
raise ValueError("check_hostname requires server_hostname")
ValueError: check_hostname requires server_hostname
How can I fix this?

pip 6.1.0 has been released, fixing this issue. You can self-upgrade with:
pip --trusted-host pypi.python.org install -U pip
Original answer:
This is caused by a change in Python 2.7.9 that urllib3 needs to account for; see issue #543 for that project. Your OpenSSL libraries do not support SNI, which means urllib3 won't pass the host name to the SSL socket wrapper, but Python 2.7.9's ssl module expects the hostname to be passed in anyway when certificate hostname checking is enabled.
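A quick way to check whether your build is affected is to look at the SNI flag Python exposes (a small diagnostic sketch, not part of pip):
import ssl
print(ssl.HAS_SNI)          # False here means the OpenSSL that Python was built against lacks SNI
print(ssl.OPENSSL_VERSION)  # which OpenSSL build that is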
urllib3 is indirectly used by requests (see requests issue 2435), which in turn is being used by pip.
I've opened a ticket to track this from pip's perspective.
The underlying issues have been fixed by the project maintainers and are awaiting a new release. You could install the current development version of pip if you are impatient:
pip install --trusted-host=github.com -U https://github.com/pypa/pip/archive/develop.zip
This will install pip-6.1.0.dev0; once 6.1.0 is fully released you can upgrade again with pip install -U pip to get the final release from PyPI.

I got the same issue and found that, in my case (pip 6.0.8), it can be avoided as follows:
pip --trusted-host pypi.python.org install <thing>

It is related to urllib3.
You can resolve it with urllib3 version 1.25.8.
Download that version of urllib3 manually and install it.
Even though you install this version, pip will still use its own vendored copy, so you have to remove that copy and replace it.
Usually, installed modules live in PythonXX/Lib/site-packages.
Delete urllib3 in PythonXX/Lib/site-packages/pip/_vendor.
Move "PythonXX/Lib/site-packages/urllib3" to "PythonXX/Lib/site-packages/pip/_vendor".
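Roughly the same replacement in Python, as a hedged sketch using shutil (PythonXX is a placeholder for your actual site-packages path; back the directories up first):
import os
import shutil
site = "PythonXX/Lib/site-packages"         # placeholder, adjust to your install
vendored = os.path.join(site, "pip", "_vendor", "urllib3")
standalone = os.path.join(site, "urllib3")  # the separately installed 1.25.8 copy
shutil.rmtree(vendored)                     # remove pip's bundled urllib3
shutil.move(standalone, vendored)           # drop the standalone copy into pip/_vendor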

I encountered this problem and tried the methods above, but they did not work. I finally found it was because I had turned on my VPN. When I turn the VPN off, I can download packages successfully.

Related

AttributeError: decoding error while running Python tweepy script in Ubuntu

I have a Python 3.5 script which is essentially a Twitter scraper that collects tweets using the tweepy package and its StreamListener class.
My script runs perfectly in my local command shell, but when I try to run it in the Ubuntu environment on my server I receive a decoding error.
File "/usr/lib/python3/dist-packages/tweepy/streaming.py", line 445, in filter
self._start(async)
File "/usr/lib/python3/dist-packages/tweepy/streaming.py", line 361, in _start
self._run()
File "/usr/lib/python3/dist-packages/tweepy/streaming.py", line 294, in _run
raise exception
File "/usr/lib/python3/dist-packages/tweepy/streaming.py", line 247, in _run
verify=self.verify)
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 454, in request
prep = self.prepare_request(req)
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 388, in prepare_request
hooks=merge_hooks(request.hooks, self.hooks),
File "/usr/lib/python3/dist-packages/requests/models.py", line 297, in prepare
self.prepare_auth(auth, url)
File "/usr/lib/python3/dist-packages/requests/models.py", line 490, in prepare_auth
r = auth(self)
File "/usr/lib/python3/dist-packages/requests_oauthlib/oauth1_auth.py", line 71, in __call__
r.url = to_native_str(r.url)
File "/usr/lib/python3/dist-packages/requests_oauthlib/oauth1_auth.py", line 14, in to_native_str
return string.decode('utf-8')
AttributeError: 'str' object has no attribute 'decode'
This makes sense, because you can't decode a string that is already a string. But my question is how I can solve this issue, since the error occurs inside a package.
It looks like the problem comes from requests-oauthlib: you are running an old version of it, and the problem seems to have been fixed since 4.0.1.
Upgrade your system (the packaged version with Ubuntu 16.10 is 7.0):
sudo apt update && sudo apt upgrade
Or install using pip:
sudo pip3 install -U requests-oauthlib
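The underlying difference is just Python 2 bytes versus Python 3 str; fixed versions guard the decode, roughly like this minimal sketch (the general 2/3-compatible pattern, not the library's exact code):
def to_native_str(value, encoding='utf-8'):
    # Only bytes need decoding; a str is already text on Python 3.
    if isinstance(value, bytes):
        return value.decode(encoding)
    return value

print(to_native_str(b'https://example.com'))  # bytes -> str
print(to_native_str('https://example.com'))   # already str, returned unchanged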

error while installing graphlab in ubuntu

I am getting the following error while installing GraphLab on Ubuntu 14.04. I have already spent a lot of time trying to solve this by googling it. Can anyone help me solve this problem?
Error:
ramy@ramy-Aspire-4739Z:~$ sudo pip install --upgrade https://get.graphlab.com/GraphLab-Create/2.1/mohangtrichy@gmail.com/<KEY>/GraphLab-Create-License.tar.gz
Downloading/unpacking https://get.graphlab.com/GraphLab-Create/2.1/mohangtrichy@gmail.com/<KEY>/GraphLab-Create-License.tar.gz
Cleaning up...
Exception:
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/pip/basecommand.py", line 122, in main
status = self.run(options, args)
File "/usr/lib/python2.7/dist-packages/pip/commands/install.py", line 278, in run
requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
File "/usr/lib/python2.7/dist-packages/pip/req.py", line 1198, in prepare_files
do_download,
File "/usr/lib/python2.7/dist-packages/pip/req.py", line 1376, in unpack_url
self.session,
File "/usr/lib/python2.7/dist-packages/pip/download.py", line 546, in unpack_http_url
resp = session.get(target_url, stream=True)
File "/usr/share/python-wheels/requests-2.2.1-py2.py3-none-any.whl/requests/sessions.py", line 467, in get
return self.request('GET', url, **kwargs)
File "/usr/lib/python2.7/dist-packages/pip/download.py", line 237, in request
return super(PipSession, self).request(method, url, *args, **kwargs)
File "/usr/share/python-wheels/requests-2.2.1-py2.py3-none-any.whl/requests/sessions.py", line 455, in request
resp = self.send(prep, **send_kwargs)
File "/usr/share/python-wheels/requests-2.2.1-py2.py3-none-any.whl/requests/sessions.py", line 558, in send
r = adapter.send(request, **kwargs)
File "/usr/share/python-wheels/requests-2.2.1-py2.py3-none-any.whl/requests/adapters.py", line 385, in send
raise SSLError(e)
SSLError: [Errno 1] _ssl.c:510: error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert internal error
Storing debug log for failure in /home/ramy/.pip/pip.log
ramy@ramy-Aspire-4739Z:~$
Requests can verify SSL certificates for HTTPS requests, just like a
web browser. To check a host’s SSL certificate, you can use the verify
argument:
And if you want to avoid the SSLError, you can find the requests call and pass verify=False:
requests.get('https://google.com', verify=False)
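If you would rather keep verification on, the same verify argument also accepts a path to a CA bundle, for example (the path below is the usual Debian/Ubuntu location and may differ on your system):
requests.get('https://google.com', verify='/etc/ssl/certs/ca-certificates.crt')
Disabling verification with verify=False silences the error but leaves the connection unauthenticated, so prefer pointing verify at a valid bundle when you can.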
Another way is to reinstall the certifi module:
sudo pip uninstall -y certifi
sudo pip install certifi==2015.04.28
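For context, certifi just provides the CA bundle that requests uses by default; a quick way to see which bundle file is being picked up (assuming certifi imports cleanly):
import certifi
print(certifi.where())  # path to the CA bundle requests verifies against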
Have a look at these questions:
Python Requests throwing up SSLError
SSLError: bad handshake, Python requests

pip install doesn't work, InvalidSchema: Missing dependencies for SOCKS support

I am freaking out over this error, which I have been stuck on for two days. Two days ago I activated my virtualenv and tried to install some dependencies, but pip install keeps throwing this error.
pip install django
Traceback (most recent call last):
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/basecommand.py", line 215, in main
status = self.run(options, args)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/commands/install.py", line 335, in run
wb.build(autobuilding=True)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/wheel.py", line 749, in build
self.requirement_set.prepare_files(self.finder)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/req/req_set.py", line 380, in prepare_files
ignore_dependencies=self.ignore_dependencies))
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/req/req_set.py", line 554, in _prepare_file
require_hashes
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/req/req_install.py", line 278, in populate_link
self.link = finder.find_requirement(self, upgrade)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/index.py", line 465, in find_requirement
all_candidates = self.find_all_candidates(req.name)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/index.py", line 423, in find_all_candidates
for page in self._get_pages(url_locations, project_name):
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/index.py", line 568, in _get_pages
page = self._get_page(location)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/index.py", line 683, in _get_page
return HTMLPage.get_page(link, session=self.session)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/index.py", line 792, in get_page
"Cache-Control": "max-age=600",
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/_vendor/requests/sessions.py", line 488, in get
return self.request('GET', url, **kwargs)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/download.py", line 386, in request
return super(PipSession, self).request(method, url, *args, **kwargs)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/_vendor/requests/sessions.py", line 475, in request
resp = self.send(prep, **send_kwargs)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/_vendor/requests/sessions.py", line 596, in send
r = adapter.send(request, **kwargs)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/_vendor/cachecontrol/adapter.py", line 47, in send
resp = super(CacheControlAdapter, self).send(request, **kw)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/_vendor/requests/adapters.py", line 390, in send
conn = self.get_connection(request.url, proxies)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/_vendor/requests/adapters.py", line 290, in get_connection
proxy_manager = self.proxy_manager_for(proxy)
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/_vendor/requests/adapters.py", line 184, in proxy_manager_for
**proxy_kwargs
File "/home/blackpython/.local/lib/python2.7/site-packages/pip/_vendor/requests/adapters.py", line 43, in SOCKSProxyManager
raise InvalidSchema("Missing dependencies for SOCKS support.")
InvalidSchema: Missing dependencies for SOCKS support.
I tried installing requests[socks], but then I got an error saying it could not determine the SOCKS version.
printenv | grep -i proxy
NO_PROXY=localhost,127.0.0.0/8,::1
http_proxy=http://proxy.iiit.ac.in:8080/
FTP_PROXY=http://proxy.iiit.ac.in:8080/
ftp_proxy=http://proxy.iiit.ac.in:8080/
all_proxy=socks://proxy.iiit.ac.in:8080/
ALL_PROXY=socks://proxy.iiit.ac.in:8080/
https_proxy=http://proxy.iiit.ac.in:8080/
HTTPS_PROXY=http://proxy.iiit.ac.in:8080/
no_proxy=localhost,127.0.0.0/8,::1
HTTP_PROXY=http://proxy.iiit.ac.in:8080/
These are the proxy settings on my system. Can anyone help me with this?
Thanks
Edit: I tried reinstalling Ubuntu, and I am still facing this error.
Unset the SOCKS proxy; in your case:
unset all_proxy
unset ALL_PROXY
Install the missing dependency:
pip install pysocks
Then reset the proxy (source .bashrc), and pip install works again with the SOCKS proxy.
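You can reproduce the same failure outside pip with requests directly (hedged sketch; the proxy address is a placeholder, and the exception is only raised while PySocks is missing):
import requests
try:
    # With no PySocks installed, a socks proxy scheme makes requests raise
    # InvalidSchema: Missing dependencies for SOCKS support.
    requests.get('https://pypi.org/simple/',
                 proxies={'https': 'socks5://proxy.example.com:1080'})
except requests.exceptions.InvalidSchema as exc:
    print(exc)
Once pysocks is installed, the same call is routed through the SOCKS proxy instead of raising InvalidSchema.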
I was also not able to install Django and other packages after activating the venv, until I finally did this [1]:
Type: sudo gedit .bashrc  # or any other text editor if gedit is not there
Add this line at the end of your .bashrc file:
export all_proxy="https://your_proxy:your_port/"
Save the changes and it should work. Thanks.
Ref. [1] https://stackoverflow.com/a/39959360/4706745
This is what worked for me: pip install --user --proxy https://user:pass@proxyaddress:port django
Uninstall the installed package using
pip uninstall packagename
and set the proxy
http_proxy=http://username:password@host:port/
https_proxy=https://username:password@host:port/
and try to install using
pip install packagename

Pip install from requirements file is failing, but installing one by one works

I am trying to install a bunch of python dependencies using a requirements.txt file with the following command:
pip install -r requirements.txt
The requirements.txt file has the following packages:
pep8
selenium
paramiko
soappy
nose
wmi
mock
python-keystoneclient
python-novaclient
python-cinderclient
python-swiftclient
python-glanceclient
python-heatclient
python-neutronclient
But when running the pip install command I am getting this error:
Downloading/unpacking PrettyTable>=0.7,<0.8 (from python-keystoneclient->-r requirements.txt (line 9))
Cleaning up...
Exception:
Traceback (most recent call last):
File "C:\Python27\VirtualEnvs\test\lib\site-packages\pip\basecommand.py", line 122, in main
status = self.run(options, args)
File "C:\Python27\VirtualEnvs\test\lib\site-packages\pip\commands\install.py", line 278, in run
requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
File "C:\Python27\VirtualEnvs\test\lib\site-packages\pip\req.py", line 1197, in prepare_files
do_download,
File "C:\Python27\VirtualEnvs\test\lib\site-packages\pip\req.py", line 1375, in unpack_url
self.session,
File "C:\Python27\VirtualEnvs\test\lib\site-packages\pip\download.py", line 546, in unpack_http_url
resp = session.get(target_url, stream=True)
File "C:\Python27\VirtualEnvs\test\lib\site-packages\pip\_vendor\requests\sessions.py", line 395, in get
return self.request('GET', url, **kwargs)
File "C:\Python27\VirtualEnvs\test\lib\site-packages\pip\download.py", line 237, in request
return super(PipSession, self).request(method, url, *args, **kwargs)
File "C:\Python27\VirtualEnvs\test\lib\site-packages\pip\_vendor\requests\sessions.py", line 383, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\VirtualEnvs\test\lib\site-packages\pip\_vendor\requests\sessions.py", line 486, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\VirtualEnvs\test\lib\site-packages\pip\_vendor\requests\adapters.py", line 381, in send
raise ProxyError(e)
ProxyError: Cannot connect to proxy. Socket error: [Errno 10054] An existing connection was forcibly closed by the remote host.
Storing debug log for failure in C:\Users\cjmarti2\pip\pip.log
But for some reason, if I install the packages one by one with pip install <package>, they all get installed correctly.
A couple of things to consider:
1) Since I am on my company's intranet, I am setting a proxy using set https_proxy=https://company-proxy.com:port.
2) Installing the exact same packages from requirements.txt on a Linux machine works fine. Any ideas?
Update:
I was using the 64-bit build of Python 2.7. I uninstalled it and installed the 32-bit build, and I no longer have this problem. It looks like the Python build was the problem; the 64-bit one caused problems for some reason.
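If you are unsure which build you are running, a quick check (independent of the fix itself):
import platform
import struct
print(platform.python_version())  # e.g. 2.7.x
print(struct.calcsize("P") * 8)   # 64 on a 64-bit build, 32 on a 32-bit build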

pip installed but throwing error while installing any package

pip is installed on my system, but whenever I try to install any package it throws the exception below. The same exception comes up for every package. I could not find this anywhere on the web. Please help.
pip install django-social-auth
Downloading/unpacking django-social-auth
Cleaning up...
Exception:
Traceback (most recent call last):
File "/Library/Python/2.7/site-packages/pip-1.5.2-py2.7.egg/pip/basecommand.py", line 122, in main
status = self.run(options, args)
File "/Library/Python/2.7/site-packages/pip-1.5.2-py2.7.egg/pip/commands/install.py", line 274, in run
requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
File "/Library/Python/2.7/site-packages/pip-1.5.2-py2.7.egg/pip/req.py", line 1166, in prepare_files
url = finder.find_requirement(req_to_install, upgrade=self.upgrade)
File "/Library/Python/2.7/site-packages/pip-1.5.2-py2.7.egg/pip/index.py", line 194, in find_requirement
page = self._get_page(main_index_url, req)
File "/Library/Python/2.7/site-packages/pip-1.5.2-py2.7.egg/pip/index.py", line 568, in _get_page
session=self.session,
File "/Library/Python/2.7/site-packages/pip-1.5.2-py2.7.egg/pip/index.py", line 670, in get_page
resp = session.get(url, headers={"Accept": "text/html"})
File "/Library/Python/2.7/site-packages/pip-1.5.2-py2.7.egg/pip/_vendor/requests/sessions.py", line 395, in get
return self.request('GET', url, **kwargs)
File "/Library/Python/2.7/site-packages/pip-1.5.2-py2.7.egg/pip/download.py", line 237, in request
return super(PipSession, self).request(method, url, *args, **kwargs)
File "/Library/Python/2.7/site-packages/pip-1.5.2-py2.7.egg/pip/_vendor/requests/sessions.py", line 356, in request
env_proxies = get_environ_proxies(url) or {}
File "/Library/Python/2.7/site-packages/pip-1.5.2-py2.7.egg/pip/_vendor/requests/utils.py", line 504, in get_environ_proxies
bypass = proxy_bypass(netloc)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib.py", line 1433, in proxy_bypass
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib.py", line 1413, in proxy_bypass_macosx_sysconf
ValueError: negative shift count
The urllib module used by pip will automatically load proxy configuration from your OS.
In this case, this fails because you appear to have some malformed proxy configuration, in the proxy-bypass field. Verify that your proxy configuration is correct. On Mac, do so under System Preferences > Network.
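You can reproduce the failure outside pip to confirm it is the system proxy configuration rather than pip itself (Python 2, matching the interpreter in the traceback; the host name is just an example):
import urllib
print(urllib.getproxies())  # proxy settings read from System Preferences
# With the malformed bypass list, this raises ValueError: negative shift count:
print(urllib.proxy_bypass('pypi.python.org'))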
Check the "Bypass proxy settings" hosts section; I removed the entries there and it's working fine now. (This worked for me as well.)
If @Martijn Pieters' answer doesn't fix it for you, be sure to check the "Bypass proxy settings hosts" section.
Try removing the addresses in that field, click apply changes and try again.
