I have a RHEL 8 server which has no internet connection, and the server has Jupyter Notebook installed. I need to install the exchangelib module there. Since the server has no internet connection, I couldn't do that directly, so I started by setting up a proxy like below.
import requests

http_proxy = "http://10.11.111.11:3128"
https_proxy = "https://10.11.111.11:3128"
ftp_proxy = "ftp://10.11.111.11:3128"

proxyDict = {
    "http": http_proxy,
    "https": https_proxy,
    "ftp": ftp_proxy,
}
# setting up the URL and checking the connection by printing the status
url = 'https://www.google.lk'
page = requests.get(url, proxies=proxyDict)
print(page.status_code)
print(page.url)
The output of the above code is as follows.
200
https://www.google.lk
So I was able to connect to the internet that way, but I couldn't figure out how to install pip packages after that. Can anyone guide me on that?
You shouldn't use pip as a library. The pip project recommends invoking it via a subprocess call:
import subprocess
import sys

subprocess.check_call([sys.executable, '-m', 'pip', 'install', 'my_package'])
Then, for the proxy, you can add the --proxy flag. This Stack Overflow answer shows it well, but to complete the answer, this is how it should look:
subprocess.check_call([
    sys.executable,
    '-m',
    'pip',
    'install',
    '--proxy',
    'http://10.11.111.11:3128',
    'my_package'
])
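Since the question mentions Jupyter, here is an alternative sketch (assuming the same proxy address as above) that exports the proxy through environment variables, which pip and most other network tooling honour:

import os
import subprocess
import sys

# Assumption: the same proxy as in the question; adjust host/port as needed.
os.environ["HTTP_PROXY"] = "http://10.11.111.11:3128"
os.environ["HTTPS_PROXY"] = "http://10.11.111.11:3128"

# pip picks the proxy up from the inherited environment, so no --proxy flag is needed.
subprocess.check_call([sys.executable, "-m", "pip", "install", "exchangelib"])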
I'm trying to make an HTTPS request from within a Docker container. Here's Python code that runs fine on my Windows 10 host:
import certifi
import ssl
import urllib.request

tmp_filename = "penguin.jpg"
pingu_link = "https://i.pinimg.com/originals/cc/3a/1a/cc3a1ae4beafdd5ac2293824f1fb0437.jpg"
print(certifi.where())

https_handler = urllib.request.HTTPSHandler(context=ssl.create_default_context())
opener = urllib.request.build_opener(https_handler)
# add a User-Agent header to avoid a 403 response
opener.addheaders = [
    (
        "User-agent",
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:91.0) Gecko/20100101 Firefox/91.0",
    )
]
urllib.request.install_opener(opener)
r = urllib.request.urlretrieve(pingu_link, tmp_filename)
If I understand correctly, certifi comes with its own set of CA certificates, which are contained in the .pem file you can find by calling certifi.where(). However, if I convert this file to .crt and tell urllib to use it by calling
https_handler = urllib.request.HTTPSHandler(context=ssl.create_default_context(cafile="cacert.crt"))
the verification fails: ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129). As this post explains, certifi also automatically imports certificates from the Windows cert store.
Now I'm a little confused about what this means if you want to verify SSL certificates in a Docker container. It seems like there are two options:
Just install the ca-certificates package. It should provide the necessary public keys for most CAs.
Install your own (possibly self-signed) certificate: copy it into your Docker container and tell the ca-certificates package about it by calling update-ca-certificates. (You could also install it in the Windows global certificate store, and it should work with Docker out of the box according to this issue on GitHub.)
Unfortunately, the first approach does not seem to work for me. It raises the same verification error as above. Even worse, since I don't know which .crt file is used to verify certificates without Docker, the second option is not a possibility either. Here's the Dockerfile:
# start with a python env that already has urllib3 installed
FROM company_harbor/base_py3_container
ENV HTTP_PROXY="my_company_proxy"
ENV HTTPS_PROXY="my_company_proxy"
# install ca certificates
RUN apt-get update && \
    apt-get install ca-certificates -y && \
    apt-get clean
RUN pip install --upgrade certifi --trusted-host=pypi.org --trusted-host=files.pythonhosted.org
# what I would do if I found the right .crt file
# COPY cacert.crt /usr/share/ca-certificates/cacert.crt
# RUN chmod 644 /usr/share/ca-certificates/cacert.crt
RUN update-ca-certificates
COPY ./download_penguin.py ./download_penguin.py
CMD [ "python", "download_penguin.py" ]
What do you need to do in order to verify SSL certificates with Python in Docker?
Turns out company proxies can swap SSL certificates in a man-in-the-middle manner.
The standard certificates from apt-get install ca-certificates or Python's certifi package are not going to include these company certificates. Additionally, this is not specifically a Docker-related question but a question of "How to install a root certificate on Linux". Debian, to be more precise, because that's what the container in question runs.
This was not as straightforward as expected. Here's what worked in the end:
Use the company's certificates in .pem format to begin with.
Rename them so they end with .crt. Do NOT use any openssl .pem-to-.crt conversion. In my case, every .crt file I found online was encoded in a way that made it unreadable for Notepad++, Vim, and the like; .pem files, on the other hand, looked fine.
Copy the renamed certificates to the proper ca-certificate location on your OS.
Install the certificates via update-ca-certificates.
Translated into a Dockerfile, here's the important part:
COPY root.pem /usr/local/share/ca-certificates/root.crt
COPY proxy.pem /usr/local/share/ca-certificates/proxy.crt
RUN update-ca-certificates
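On Debian-based images, update-ca-certificates merges everything into /etc/ssl/certs/ca-certificates.crt. As a minimal sketch (the path is the Debian default; verify it exists in your image), you can point the Python code from the question at that merged bundle:

import ssl
import urllib.request

# Debian's merged CA bundle, produced by update-ca-certificates
ctx = ssl.create_default_context(cafile="/etc/ssl/certs/ca-certificates.crt")
https_handler = urllib.request.HTTPSHandler(context=ctx)
opener = urllib.request.build_opener(https_handler)
urllib.request.install_opener(opener)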
I'm getting the error:
urllib3.exceptions.ProxySchemeUnknown: Proxy URL had no scheme, should start with http:// or https://
but the proxies are fine and so is the URL.
URL = f"https://google.com/search?q={query2}&num=100"
mysite = self.listbox.get(0)
headers = {"user-agent": USER_AGENT}
while True:
    proxy = next(proxy_cycle)
    print(proxy)
    proxies = {"http": proxy, "https": proxy}
    print(proxies)
    resp = requests.get(URL, proxies=proxies, headers=headers)
    if resp.status_code == 200:
        break
Print results:
41.139.253.91:8080
{'http': '41.139.253.91:8080', 'https': '41.139.253.91:8080'}
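For reference, the printed proxy strings above have no scheme, which is exactly what the exception complains about. A minimal sketch of a fix (keeping the same proxy_cycle) would prepend one before building the dict:

proxy = next(proxy_cycle)
# give the proxy URL an explicit scheme so urllib3 can parse it
proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}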
On Linux, unset http_proxy and https_proxy in the terminal at the current location of your project:
unset http_proxy
unset https_proxy
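If stray environment variables are the culprit, a related in-code sketch is to tell requests to ignore environment proxy settings entirely (trust_env is a standard requests Session attribute):

import requests

session = requests.Session()
session.trust_env = False  # ignore http_proxy/https_proxy from the environment
resp = session.get("https://google.com")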
I had the same problem, and setting the https_proxy variable in my terminal really helped me. You can set it as follows:
set HTTPS_PROXY=http://username:password@proxy.example.com:8080
set https_proxy=http://username:password@proxy.example.com:8080
Where proxy.example.com is the proxy address (in my case it is "localhost") and 8080 is my port.
You can figure out your username by typing echo %username% in your command line. As for the proxy server, on Windows, you need to go to "Internet Options" -> "Connections" -> LAN Settings and tick "Use a proxy server for your LAN". There, you can find your proxy address and port.
An important note here: if you're using PyCharm, try first running your script from the terminal. I say this because you may get the same error if you just run the file by pressing the Run button, whereas running it from the terminal may help you get rid of this error.
P.S. You can also try downgrading pip to 20.2.3, as that may help too.
I was having the same issue. I resolved it by upgrading the requests library in Python 3:
pip3 install --upgrade requests
I think it is related to a lower version of the requests library conflicting with a higher version of Python 3.
Trying to connect to the Azure CosmosDB Mongo server results in an SSL handshake error.
I am using Python 3 and PyMongo to connect to my Azure CosmosDB. The connection works fine if I run the code with Python 2.7 but causes the below error when using Python 3:
import pymongo
from pymongo import MongoClient
import json
import sys

def check_server_status(client, data):
    '''check the server status of the connected endpoint'''
    db = client.result_DB
    server_status = db.command('serverStatus')
    print('Database server status:')
    print(json.dumps(server_status, sort_keys=False, indent=2, separators=(',', ': ')))
    coll = db.file_result
    print(coll)
    coll.insert_one(data)

def main():
    uri = "mongodb://KEY123@backend.documents.azure.com:10255/?ssl=true&replicaSet=globaldb"
    client = pymongo.MongoClient(uri)
    emp_rec1 = {
        "name": "Mr.Geek",
        "eid": 24,
        "location": "delhi"
    }
    check_server_status(client, emp_rec1)

if __name__ == "__main__":
    main()
Running this on Python 3 results in the error below:
pymongo.errors.ServerSelectionTimeoutError: SSL handshake failed: backendstore.documents.azure.com:10255: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:749)
Here is my successful output when I run the same code with Python 2.7:
Database server status: { "_t": "OKMongoResponse", "ok": 1 }
Collection(Database(MongoClient(host=['backend.documents.azure.com:10255'], document_class=dict, tz_aware=False, connect=True, ssl=True, replicaset='globaldb'), u'result_DB'), u'file_result')
On Windows you can do it like this:
pip install certifi
Then use it in code:
import certifi

ca = certifi.where()
client = pymongo.MongoClient(
    "mongodb+srv://username:password@cluster0.xxxxx.mongodb.net/xyzdb?retryWrites=true&w=majority",
    tlsCAFile=ca)
Solved the problem with this change:
import ssl

# CERT_NONE disables certificate verification entirely; use it only if you accept that risk
client = pymongo.MongoClient(uri, ssl_cert_reqs=ssl.CERT_NONE)
The section Troubleshooting TLS Errors of the official PyMongo document TLS/SSL and PyMongo introduces the issue as below.
TLS errors often fall into two categories, certificate verification failure or protocol version mismatch. An error message similar to the following means that OpenSSL was not able to verify the server’s certificate:
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed
This often occurs because OpenSSL does not have access to the system’s root certificates or the certificates are out of date. Linux users should ensure that they have the latest root certificate updates installed from their Linux vendor. macOS users using Python 3.6.0 or newer downloaded from python.org may have to run a script included with python to install root certificates:
open "/Applications/Python <YOUR PYTHON VERSION>/Install Certificates.command"
Users of older PyPy and PyPy3 portable versions may have to set an environment variable to tell OpenSSL where to find root certificates. This is easily done using the certifi module from pypi:
$ pypy -m pip install certifi
$ export SSL_CERT_FILE=$(pypy -c "import certifi; print(certifi.where())")
You can try to follow the description above to fix your issue, which seems to be aimed at Linux and macOS users. On Windows, I cannot reproduce your issue in Python 3.7 and 3.6. If you have any concerns, please feel free to let me know.
Faced the same issue when trying to connect to MongoDB from DigitalOcean. Solved it by using this function with params in MongoClient:
import ssl
from pymongo import MongoClient

def get_client(host, port, username, password, db):
    return MongoClient('mongodb://{}:{}/'.format(host, port),
                       username=username,
                       password=password,
                       authSource=db,
                       ssl=True, ssl_cert_reqs=ssl.CERT_NONE)

client = get_client("host-ip", "port", "username", "password", "db-name")
On Mac Mojave 10.14.6, I used PyMongo 3.10 and Python 3.7. To solve
flask pymongo pymongo.errors.ServerSelectionTimeoutError [SSL: CERTIFICATE_VERIFY_FAILED]
execute in a terminal:
sudo /Applications/Python\ 3.7/Install\ Certificates.command
If you use another Python version, only change the version number (in my case, I have Python 3.7).
cluster = MongoClient(
    "url",
    ssl=True,
    ssl_cert_reqs=ssl.CERT_NONE,
)
By default PyMongo relies on the operating system's root certificates.
It could be that Atlas itself updated its certificates, or it could be that something on your OS changed. "certificate verify failed" often occurs because OpenSSL does not have access to the system's root certificates or the certificates are out of date. For how to troubleshoot, see TLS/SSL and PyMongo in the PyMongo 3.12.0 documentation.
Please try:
import certifi

client = pymongo.MongoClient(connection, tlsCAFile=certifi.where())
and don't forget to install certifi.
On macOS Monterey, I used PyMongo 3.12.1 and a virtual environment.
To solve it, use
ssl_cert_reqs=ssl.CERT_NONE
with the MongoDB URL.
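Put together, a minimal sketch of that (the connection string is a placeholder, and note that CERT_NONE disables certificate verification):

import ssl
from pymongo import MongoClient

# "mongodb-url" stands in for your real connection string
client = MongoClient("mongodb-url", ssl=True, ssl_cert_reqs=ssl.CERT_NONE)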
Is there any alternative to using proxies in Scrapy? The source site has blocked the server which I'm using for running spiders. I've added a ProxyMiddleware to the project and randomized the proxy, but the problem is the proxies are also being blocked by the source site. I've also set DOWNLOAD_DELAY to 5, but the problem is still alive. Is there any other way to access the site without using proxies, other than shifting to a new server?
Using Tor with Polipo solved my problem of blocking.
Install Tor:
$ sudo apt-get install tor
Install Polipo:
$ sudo apt-get install polipo
Configure Polipo to use the Tor SOCKS proxy:
$ sudo nano /etc/polipo/config
Add the following lines at the end of the file:
socksParentProxy = localhost:9050
diskCacheRoot=""
disableLocalInterface=""
Add the proxy middleware in middlewares.py:
class ProxyMiddleware(object):
    def process_request(self, request, spider):
        request.meta['proxy'] = 'http://localhost:8123'
        spider.log('Proxy : %s' % request.meta['proxy'])
Activate the ProxyMiddleware in the project settings:
DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110,
    'project_name.middlewares.ProxyMiddleware': 100
}
You may want Squid.
It can screen out failing proxies, prefer faster ones, rotate them automatically, retry and forward requests automatically, and apply rules you set.
Just point your spider at Squid as its single upstream proxy.
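As a rough illustration, a squid.conf sketch that rotates requests across two hypothetical upstream proxies could look like this (the directives are standard Squid; the hosts are placeholders):

# round-robin across two parent proxies instead of going direct
cache_peer proxy1.example.com parent 3128 0 no-query round-robin
cache_peer proxy2.example.com parent 3128 0 no-query round-robin
never_direct allow all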
I'm trying to use pip behind a proxy server which requires authentication. I've installed cntlm and filled in the hashed passwords. When I run this:
cntlm -c cntlm.ini -I -M http://www.google.co.uk
I enter my password and then get this as a result:
Config profile 1/4... Auth not required (HTTP code: 200)
Config profile 2/4... Auth not required (HTTP code: 200)
Config profile 3/4... Auth not required (HTTP code: 200)
Config profile 4/4... Auth not required (HTTP code: 200)
Your proxy is open, you don't need another proxy.
However, pip doesn't work; it still gives me a timeout. Knowing that I don't need another proxy is all fine and dandy, but pip still times out. Port 3128 is working because I can telnet on that port, and it shows as listening under netstat. So what should I do from here?
Thank you.
I have had the exact same issue.
Cntlm is used for authenticating proxy servers; these statements mean that your server does not require authentication.
The pip command does have a --proxy option. Try using something like:
pip install --proxy=10.0.0.1:80 package_name
If this works, you know that you don't need authentication to access the web. If it still fails try:
pip install --proxy=user:password@10.0.0.1:80 package_name
This works to get around authentication. I have written a small cmd script to get around this on Windows:
@echo off
:: GetPwd.cmd - Get password with no echo.
setlocal
<nul: set /p passwd=
for /f "delims=" %%i in ('python -c "from getpass import getpass; pwd = getpass(); print(pwd)"') do set passwd=%%i
echo.

:: Prompt for the package name
set /p package=What package would you like to get:

:: Get the package with pip
pip install --proxy="admin:%passwd%@PROXY_ADDRESS:80" %package%
endlocal
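A cross-platform Python sketch of the same idea (PROXY_ADDRESS and the admin username are placeholders, exactly as in the script above):

import subprocess
import sys
from getpass import getpass

# prompt for the password without echoing it
password = getpass("Proxy password: ")
package = input("What package would you like to get: ")

# hand the authenticated proxy URL to pip
subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "--proxy", f"admin:{password}@PROXY_ADDRESS:80",
    package,
])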