I'm extracting the SSL certificate from a website using the socket + ssl libraries in Python. My understanding is that it connects using the protocol version preferred by the server.
Using this method I can identify which SSL/TLS version was used for the connection, but I also need to determine whether the website supports SSLv3 in the case where the default connection is TLS.
Is there a way to identify this information without manually testing multiple SSL connections?
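For reference, this is roughly what the extraction described above looks like; a minimal sketch, where example.com is a placeholder host:

import socket
import ssl

hostname = "example.com"
context = ssl.create_default_context()

# Connect with the client defaults, then report the negotiated protocol
# version and the peer certificate.
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print("Negotiated protocol:", tls.version())   # e.g. 'TLSv1.3'
        print("Peer certificate:", tls.getpeercert())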
I don't think sites advertise what they support. Rather, it's negotiated between client and server.
You could use the excellent server tester at www.ssllabs.com. It will try lots of configurations and report what the server in question supports. (Hopefully the site doesn't support SSL v3!)
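If you do end up probing yourself, the only reliable check is to attempt a handshake with that single version pinned. A minimal sketch, assuming your local OpenSSL build still includes SSLv3 (many modern builds do not, in which case ssl.PROTOCOL_SSLv3 is simply absent):

import socket
import ssl

def supports_sslv3(hostname, port=443):
    # PROTOCOL_SSLv3 only exists if the interpreter's OpenSSL was built
    # with SSLv3 enabled; on most current systems it has been removed.
    if not hasattr(ssl, "PROTOCOL_SSLv3"):
        raise RuntimeError("local OpenSSL build has no SSLv3 support")
    context = ssl.SSLContext(ssl.PROTOCOL_SSLv3)
    context.check_hostname = False
    context.verify_mode = ssl.CERT_NONE
    try:
        with socket.create_connection((hostname, port), timeout=5) as sock:
            with context.wrap_socket(sock):
                return True
    except (ssl.SSLError, ConnectionResetError):
        return False

print(supports_sslv3("example.com"))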
The situation is that a desktop application needs to run in the background (an application that would be "hanging around" in the system tray) and expose an API. For simplicity, I chose Flask to build the API and Python overall to build the desktop part of it. Is this a practical or reasonable way to create a desktop application? The application itself will not be large scale; it will only hold several Python scripts.
Basically, a Microsoft PowerApp will be communicating with this API on the desktop. When a call is made from the PowerApp to the API, it will target a public static IP address on a specific port, which will then be forwarded to the local IP of the Flask application. I understand that PowerApps requires SSL to communicate with applications. I can figure out how to build the API and the desktop part, but I cannot figure out the SSL certificates. When I try to generate a certificate through Certbot, it requires me to supply a domain. This setup will not use a domain, only the public static IP. Does this seem at all logical, or should a different approach be taken?
Though some SSL certificate providers support issuing certs for IP addresses, do yourself a favor and get one issued to a hostname instead. Just use your organization's domain to create a hostname you like.
Alternatively, try entering the IP address instead of a domain when ordering a certificate.
You can read more about IP-based certs here: Is it possible to have SSL certificate for IP address, not domain name?
I'm trying to establish a TLS channel between a client and a web server that are under my control. Both the client and the server authenticate themselves using certificates that I've created under a private PKI scheme. The client key and certificate are stored on a USB-dongle-type HSM. Python is the main application language.
I'm able to do all the crypto operations required for my project using the python-pkcs11 package, such as AES encryption, HMAC signing, RSA signing, etc. However, I couldn't find a way to "bind" PKCS#11 to any TLS library. What I mean is a "Pythonic" way of calling a function that handles the PKCS#11 layer and establishes a TLS channel. Requests does not support PKCS#11; libcurl supports it, but that support is exposed neither in pycurl nor in pyOpenSSL.
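For reference, the direct crypto side already works for me; a minimal sketch of RSA signing with python-pkcs11, where the module path, token label, PIN, and key label are placeholders for my setup:

import pkcs11
from pkcs11 import Mechanism, ObjectClass

# Placeholders: adjust the PKCS#11 module path, token label, PIN and key label.
lib = pkcs11.lib("/usr/lib/your-pkcs11-module.so")
token = lib.get_token(token_label="my-token")

with token.open(user_pin="1234") as session:
    key = session.get_key(object_class=ObjectClass.PRIVATE_KEY, label="rsa")
    signature = key.sign(b"message to sign", mechanism=Mechanism.SHA256_RSA_PKCS)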
I'm able to do it with OpenSSL's s_client CLI tool using the engine API:
openssl s_client -engine pkcs11 -verify 2 -CAfile path/to/CA.pem -keyform engine -key "pkcs11:...;object=rsa;type=private" -cert path/to/client-cert.pem -connect localhost:8443
An example of what I’m looking for:
do_tls_with_pkcs(key='pkcs11:URL', cert='cert.pem', verify='CA-cert.pem')
As far as I can tell from searching around, no such library exists yet, so now I'm looking for a workaround.
I have read that if OpenSSL, libp11, and Python are compiled in the right way, it is possible to abstract all of this away, so that plain requests calls would go through the HSM transparently to the application code. However, I couldn't find any material on how to do that.
I faced a similar problem, as I wanted to use a PKCS#11 token (YubiKey, PIV applet) alongside Python requests.
I came up with https://github.com/cedric-dufour/scriptisms/blob/master/misc/m2requests.py
It's imperfect, in the sense that it does not use connection pools and does not support HTTP streaming or proxying - like requests's stock HTTPS adapter does - but it does the job for simple connections to backends that require mTLS.
So, the situation is: I want to know which path a program is sending its requests to. With Wireshark, I can only see that it is sending HTTPS requests and the corresponding domain, but not the path.
I think there could be a way to at least inspect the outbound HTTPS traffic even without hacking the program.
Let's say I run a fake website and redirect the connection meant for the real site to my local fake site. The requests will then be sent to my fake site, for which I can create a self-signed key pair: install the private key on the fake site and install the certificate (public key) as trusted on my local machine. Then the handshake should be accepted.
But I have several problems:
What is the simplest way to launch a fake HTTPS server? Nginx? Or is there a simple solution in Python? (See the sketch below.)
How can I install the certificate (public key) as trusted on my local machine? I'm using Linux Mint 19, which is based on Ubuntu 18.04.
Any help is appreciated!
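For the first question (a simple fake HTTPS server in Python), a minimal sketch using only the standard library, assuming you have already created a self-signed pair fake-cert.pem / fake-key.pem for the site you are impersonating; the request path then shows up in the handler's log output:

import http.server
import ssl

# Serve the current directory over HTTPS; the default handler logs every
# request line, including the path. Port 443 usually requires root.
server = http.server.HTTPServer(("0.0.0.0", 443), http.server.SimpleHTTPRequestHandler)

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="fake-cert.pem", keyfile="fake-key.pem")
server.socket = context.wrap_socket(server.socket, server_side=True)

server.serve_forever()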
You may want to check out Charles Proxy. It is a proxy with which you can inspect outbound traffic (including HTTPS).
In order to inspect HTTPS traffic, you need to enable SSL Proxying, which means that Charles will dynamically generate a certificate and act as a man-in-the-middle for HTTPS connections.
Charles signs these dynamic certificates with its own certificate, which has to be added to the trust store of the application you use. Various instructions are available here.
I am writing a Flask web application and use eventlet as the networking library for that application (eventlet is wrapped by Flask-SocketIO to allow asynchronous operation).
Following this guide, I successfully created SSL key and certificate files which I pass to the WSGI server:
socket_io.run(app,
host=APP_HOST,
port=APP_PORT,
keyfile='ia.key',
certfile='ia.crt')
This works fine, but unfortunately Safari / Chrome say that my SSL certificate is not trustworthy when I access the page for the first time.
The Chrome error is the following:
NET::ERR_CERT_COMMON_NAME_INVALID
How do I generate a valid SSL certificate so that browsers don't show that error when a user connects to the web application for the first time?
That is because you are using what is called a "self-signed certificate", which is not issued by any trusted certificate authority, so any modern browser flags the site as untrusted (the NET::ERR_CERT_COMMON_NAME_INVALID message additionally indicates that the certificate's common name / subject alternative name does not match the hostname you are visiting). If you are using a UNIX-based operating system (Linux, macOS, Fedora, and more), you can use what I am using: generate a new certificate from a trusted authority.
This is what I use to get a TRUSTED certificate that most browsers accept: https://certbot.eff.org/instructions.
I have a Python server using BaseHTTPServer and SimpleHTTPServer to respond to clients over SSL. I am able to use a generated SSL cert to secure connections, but am unable to choose which security protocol to use (e.g. SSLv3, TLS 1.0, TLS 1.2).
Is there any way to specify the security protocol within these modules, or is there a Python module I could use instead that provides similar functionality and lets me specify the security protocol?
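In case it helps, a minimal sketch of pinning the protocol version with the ssl module, shown against Python 3's http.server (the successor to BaseHTTPServer/SimpleHTTPServer); server.pem / server.key are placeholders for your generated cert and key:

import http.server
import ssl

httpd = http.server.HTTPServer(("0.0.0.0", 4443), http.server.SimpleHTTPRequestHandler)

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
# Restrict negotiation to a single version (attributes available on Python 3.7+).
context.minimum_version = ssl.TLSVersion.TLSv1_2
context.maximum_version = ssl.TLSVersion.TLSv1_2
context.load_cert_chain(certfile="server.pem", keyfile="server.key")

httpd.socket = context.wrap_socket(httpd.socket, server_side=True)
httpd.serve_forever()

On Python 2, the older module-level ssl.wrap_socket accepts an ssl_version argument (e.g. ssl.PROTOCOL_TLSv1_2) for the same purpose.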