Python SSL client certificate

I am trying to use client side certificates for authentication in Python, but I just can't figure it out.
I am using this for my testing:
https://blog.cloudflare.com/introducing-tls-client-auth/
(Skip to "TLS Client Authentication On The Edge" for the pem file and example curl call).
It works as expected in curl, but when I try it in Python I get a 403 back from the server.
Here is what I have tried in Python:
import ssl
import socket
s = socket.socket()
addr_info = socket.getaddrinfo('auth.pizza',443)
addr = addr_info[0][-1]
ss = ssl.wrap_socket(s, certfile='pizza.pem')
ss.connect(addr)
ss.write(b'GET / HTTP/1.0\r\n\r\n')
print(ss.read(4096))
I'm at a loss for how to even debug this further.
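One debugging avenue worth trying (a sketch, not verified against that endpoint): build an explicit ssl.SSLContext so that both SNI and the client certificate are definitely sent, and include a Host header in the request. The assumption here is that pizza.pem contains both the certificate and the private key, as in the code above.
import socket
import ssl

# Sketch: explicit SSLContext so SNI (server_hostname) and the client
# certificate are both sent; 'pizza.pem' is assumed to hold cert + key.
context = ssl.create_default_context()
context.load_cert_chain(certfile='pizza.pem')

with socket.create_connection(('auth.pizza', 443)) as sock:
    with context.wrap_socket(sock, server_hostname='auth.pizza') as ssock:
        ssock.sendall(b'GET / HTTP/1.0\r\nHost: auth.pizza\r\n\r\n')
        print(ssock.recv(4096))
The equivalent with requests would be requests.get('https://auth.pizza/', cert='pizza.pem'), which also makes the HTTP status and headers easier to inspect.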

Related

MQRC_SSL_INITIALIZATION_ERROR with PyMQI (however it connects successfully with the C application amqssslc)

I am using the same local Windows 7 computer with MQ Client 9.0.4.0. When I connect to the server with amqssslc I can successfully connect to the QMGR, but when I try to connect with PyMQI I get the following error:
MQI Error. Comp: 2, Reason 2393: FAILED: MQRC_SSL_INITIALIZATION_ERROR
The code that I am using is the following:
import pymqi
import logging
import sys, codecs, locale
logging.basicConfig(level=logging.INFO)
queue_manager = 'QMGR'
channel = 'channel'
host = 'server.com'
port = '1414'
conn_info = '%s(%s)' % (host, port)
ssl_cipher_spec = str.encode('TLS_RSA_WITH_AES_256_CBC_SHA256')
key_repo_location = str.encode('T:/Desktop/certificates/key')
message = 'TEST Message'
channel = str.encode(channel)
host = str.encode(host)
conn_info = str.encode(conn_info)
cd = pymqi.CD(Version=pymqi.CMQXC.MQCD_VERSION_11)
cd.ChannelName = channel
cd.ConnectionName = conn_info
cd.ChannelType = pymqi.CMQC.MQCHT_CLNTCONN
cd.TransportType = pymqi.CMQC.MQXPT_TCP
cd.SSLCipherSpec = ssl_cipher_spec
cd.CertificateLabel = 'edgar'.encode()
sco = pymqi.SCO()
sco.KeyRepository = key_repo_location
qmgr = pymqi.QueueManager(None)
qmgr.connect_with_options(queue_manager, cd, sco)
However, amqssslc, which is installed along with the MQ Client on my machine, works without any errors and connects successfully.
The error from the AMQERR01 log file says the following:
AMQ9716E: Remote SSL certificate revocation status check failed for channel
'channel_name'.
EXPLANATION:
IBM MQ failed to determine the revocation status of the remote SSL certificate
for one of the following reasons:
(a) The channel was unable to contact any of the CRL servers or OCSP responders
for the certificate.
(b) None of the OCSP responders contacted knows the revocation status of the
certificate.
(c) An OCSP response was received, but the digital signature of the response
could not be verified.
I can't change my mqclient.ini config file, since it is locked by my not having admin rights (company policy). What I find weird is that amqssslc works even though both programs use the same mqclient.ini file. I have also tried pointing MQCLNTCF at another folder containing a different config file, without success.
Any help would be truly appreciated!
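For what it's worth, if the MQCLNTCF route can be made to work, one idea (a sketch based on the documented SSL stanza options of the MQ client configuration file, not verified in this environment) is to point the client at an alternative mqclient.ini that relaxes the OCSP check before connecting:
import os
import pymqi

# Sketch: point the MQ client at an alternative configuration file whose SSL
# stanza relaxes OCSP checking, e.g.
#
#   SSL:
#     OCSPAuthentication=OPTIONAL
#     OCSPCheckExtensions=NO
#
# The path below is a hypothetical example, not a real location.
os.environ['MQCLNTCF'] = 'T:/Desktop/certificates/mqclient_noocsp.ini'

# ... then build cd and sco and connect exactly as in the code above
qmgr = pymqi.QueueManager(None)
qmgr.connect_with_options(queue_manager, cd, sco)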

Set CA bundle for requests going through HTTPS tunnel

I'm trying to send an HTTPS request through an HTTPS tunnel. That is, my proxy expects HTTPS for the CONNECT. It also expects a client certificate.
I'm using Requests' proxy features.
import requests
url = "https://some.external.com/endpoint"
with requests.Session() as session:
    response = session.get(
        url,
        proxies={"https": "https://proxy.host:4443"},
        # client certificates expected by proxy
        cert=(cert_path, key_path),
        verify="/home/savior/proxy-ca-bundle.pem",
    )
    with response:
        ...
This works, but with some limitations:
I can only set client certificates for the TLS connection with the proxy, not for the external endpoint.
The proxy-ca-bundle.pem only verifies the server certificates in the TLS connection with the proxy. The server certificates from the external endpoint are seemingly ignored.
Is there any way to use requests to address these two issues? I'd like to set a different set of CAs for the external endpoint.
I also tried using http.client and HTTPSConnection.set_tunnel but, as far as I can tell, its tunnel is done through HTTP and I need HTTPS.
Looking at the source code, it doesn't seem like requests currently supports this "TLS in TLS", i.e. providing two sets of client certificates/CA bundles for a proxied request.
We can use PycURL, which simply wraps libcurl:
from io import BytesIO
import pycurl
url = "https://some.external.com/endpoint"
buffer = BytesIO()
curl = pycurl.Curl()
curl.setopt(curl.URL, url)
curl.setopt(curl.WRITEDATA, buffer)
# proxy settings
curl.setopt(curl.HTTPPROXYTUNNEL, 1)
curl.setopt(curl.PROXY, "https://proxy.host")
curl.setopt(curl.PROXYPORT, 4443)
curl.setopt(curl.PROXY_SSLCERT, cert_path)
curl.setopt(curl.PROXY_SSLKEY, key_path)
curl.setopt(curl.PROXY_CAINFO, "/home/savior/proxy-ca-bundle.pem")
# endpoint verification
curl.setopt(curl.CAINFO, "/home/savior/external-ca-bundle.pem")
try:
    curl.perform()
except pycurl.error:
    pass  # log or re-raise
else:
    status_code = curl.getinfo(curl.RESPONSE_CODE)
PycURL will use the PROXY_ settings to establish a TLS connection to the proxy and send it an HTTP CONNECT request. It then establishes a new TLS session through the proxy connection to the external endpoint and uses the CAINFO bundle to verify that server's certificates.
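For completeness, the response body ends up in the BytesIO buffer passed as WRITEDATA; decoding it is left to the caller (UTF-8 is an assumption here):
body = buffer.getvalue().decode("utf-8")  # raw response bytes collected via WRITEDATA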

How to execute remote commands via SSH through authenticated HTTP Proxy?

I am posting both the question and the answer I found, in case it helps someone. The following were my minimum requirements:
1. Client machine is Windows 10 and remote server is Linux
2. Connect to remote server via SSH through HTTP Proxy
3. HTTP Proxy uses Basic Authentication
4. Run commands on remote server and display output
The purpose of the script was to login to the remote server, run a bash script (check.sh) present on the server and display the result. The Bash script simply runs a list of commands displaying the overall health of the server.
There have been numerous discussions here on how to use an HTTP proxy or how to run remote commands using Paramiko, but I could not find the combination of both.
from urllib.parse import urlparse
from http.client import HTTPConnection
import paramiko
from base64 import b64encode

# host details
host = "remote-server-IP"
port = 22

# proxy connection & socket definition
# (note the '@' between the credentials and the proxy host, so urlparse can
# split out username, password, hostname and port)
proxy_url = "http://uname001:passw0rd123@HTTP-proxy-server-IP:proxy-port"
url = urlparse(proxy_url)
http_con = HTTPConnection(url.hostname, url.port)
auth = b64encode(bytes(url.username + ':' + url.password, "utf-8")).decode("ascii")
headers = {'Proxy-Authorization': 'Basic %s' % auth}
http_con.set_tunnel(host, port, headers)
http_con.connect()
sock = http_con.sock

# ssh connection
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
try:
    ssh.connect(hostname=host, port=port, username='remote-server-uname',
                password='remote-server-pwd', sock=sock)
except paramiko.SSHException:
    print("Connection Failed")
    quit()

stdin, stdout, stderr = ssh.exec_command("./check.sh")
for line in stdout.readlines():
    print(line.strip())
ssh.close()
I would welcome any suggestions to the code as I am a network analyst and not a coder but keen to learn and improve.
I do not think that your proxy code is correct.
For working proxy code, see How to ssh over http proxy in Python?, particularly the answer by @tintin.
As it seems that you need to authenticate to the proxy, after the CONNECT command, add a Proxy-Authorization header like:
Proxy-Authorization: Basic <credentials>
where the <credentials> is base-64 encoded string username:password.
cmd_connect = "CONNECT {}:{} HTTP/1.1\r\nProxy-Authorization: Basic <credentials>\r\n\r\n".format(*target)
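For illustration, the <credentials> placeholder could be built like this, reusing the placeholder username and password from the question; target is the destination (host, port) tuple used in the linked answer:
from base64 import b64encode

# base64-encode "username:password" for the Basic scheme (placeholders from the question)
credentials = b64encode(b"uname001:passw0rd123").decode("ascii")
cmd_connect = ("CONNECT {}:{} HTTP/1.1\r\n"
               "Proxy-Authorization: Basic {}\r\n\r\n").format(target[0], target[1], credentials)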

Creating an SSL socket with python

I'm using Python 2.4.4 and OpenSSL 0.9.8k (not by choice)
I've referred to the documentation: https://docs.python.org/release/2.4.4/lib/module-socket.html
and to pretty much every mention of "openSSL" and "python" on the internet, and I haven't found a solution to my problem.
I'm simply writing a test program to initiate an SSL connection. Here is the code:
server
#!/usr/bin/python
import socket
import _ssl
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('', 4433))
s.listen(5)
while True:
    client, address = s.accept()
    ssl_client = socket.ssl(client,
                            keyfile='keyfile',
                            certfile='certfile')
    print "Connection: ", address
    data = ssl_client.read(1024)
    if data:
        print "received data: ", data
    ssl_client.write(data + " Hello, World!")
    del ssl_client
    client.close()
client
#!/usr/bin/python
import socket
import _ssl
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect(('host', 4433))
ssl_s = socket.ssl(s,
                   keyfile='keyfile',
                   certfile='certfile')
print 'writing ', ssl_s.write("Hello, World!"), ' bytes to ssl stream'
data = ssl_s.read(1024)
del ssl_s
s.close()
print "received data: ", data
Some notes about this code: keyfile and certfile are paths to my actual key and cert files; those arguments are not the issue, and neither are the hostnames. I'm aware that the port used is 4433 - per our requirements we're meant to use a generic port, not 443. I was unaware that it was possible to use SSL over a different port, but regardless, even when I use 443 I get the exact same error.
I can run the server fine, and then when I run the client, I get the following error from the socket.ssl() calls on both the client and the server:
error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol
I've read it's due to using a non-443 port, but again, using 443 didn't fix things. I've read it could be a protocol mismatch, but the client and the server both default to SSLv23 (auto-negotiation). We're meant to use TLS 1.2 as per our requirements, but the docs don't seem to have any information on how to set the SSL protocol version; I'm unsure if that's related to my issue. Please keep in mind I'm not here to open a dialogue regarding the use of outdated SSL and Python versions.
socket.ssl can only initiate an SSL connection as a client, and the optional cert and key arguments are for client certificates. socket.ssl cannot be used on the server side, and it looks like Python 2.4.4 does not offer this feature in any of the core modules at all. In later versions of Python you can use the ssl module for this, but 2.4.4 does not seem to have it.
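For comparison only, since the question is pinned to 2.4.4: a minimal sketch of the server side using the Python 3 ssl module (3.6+), with the same port and placeholder file names as above.
import socket
import ssl

# Minimal Python 3 sketch of the server side; 'certfile'/'keyfile' are the
# same placeholders as in the question.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile='certfile', keyfile='keyfile')

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.bind(('', 4433))
    s.listen(5)
    with context.wrap_socket(s, server_side=True) as ssock:
        client, address = ssock.accept()
        print("Connection:", address)
        data = client.recv(1024)
        if data:
            print("received data:", data)
        client.sendall(data + b" Hello, World!")
        client.close()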

Send mail with smtplib using proxy

I have a very basic piece of Python code:
import smtplib
server = smtplib.SMTP(host, port)
problems = server.sendmail(from_addr, to_addr, message)
Is there a solution to run it behind an HTTP proxy? I am using Python 3.4.1 on Linux with the http_proxy environment variable set.
Now I am getting a timeout from SMTP, but if I run this code from a proxy-free network, it works OK.
Is there a solution to run it behind an HTTP proxy?
No, HTTP is a different protocol from SMTP, and the proxy is for HTTP only. If you are very lucky you might be able to create a tunnel using the CONNECT command to the outside SMTP server, but usually the ports allowed for CONNECT are restricted, so you will not be able to create a tunnel to an outside host on port 25 (i.e. SMTP).
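If the proxy does happen to allow CONNECT to the SMTP port, a rough sketch of such a tunnel looks like this. The host names and ports are placeholders, and handing the tunnelled socket to smtplib this way relies on smtplib internals rather than a documented API, so treat it as an experiment.
import smtplib
from http.client import HTTPConnection

# Placeholders; the proxy must allow CONNECT to this port for any of this to work.
proxy_host, proxy_port = "proxy.example.com", 3128
smtp_host, smtp_port = "smtp.example.com", 25

# Ask the proxy to open a raw tunnel to the SMTP server.
tunnel = HTTPConnection(proxy_host, proxy_port)
tunnel.set_tunnel(smtp_host, smtp_port)
tunnel.connect()

# Hand the tunnelled socket to smtplib and read the server greeting through it.
server = smtplib.SMTP()
server.sock = tunnel.sock
print(server.getreply())
server.ehlo()
# server.sendmail(from_addr, to_addr, message)
server.quit()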
