Using Kubernetes secrets in SSLContext - Python

I am doing a POC to check if we can connect to an API; for that, I use the code below.
import http.client
import ssl

# Define the client certificate settings for the https connection
context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
print(type(context))
context.load_cert_chain(certfile=CERT, keyfile=KEY, password=PW)
# Create a connection to submit HTTP requests
connection = http.client.HTTPSConnection(host, port=443, context=context)
# Use the connection to submit an HTTP GET request
connection.request(method="GET", url=request_url, headers=request_headers)
# Print the HTTP response from the IoT service endpoint
response = connection.getresponse()
print(response.status, response.reason)
data = response.read()
print(data)
For these two variables (CERT and KEY), I get the secrets via Kubernetes files and convert them to strings. Is there an alternate way to load the downloaded secrets into the context object instead of using the load_cert_chain method (since that one needs file paths)? I know this is not ideal, but since I am only doing a POC, I just want to see if this is doable.
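One workaround, shown below as a minimal sketch rather than anything from the original post, is to spill the in-memory secret strings into short-lived temporary files and hand those paths to load_cert_chain, since the standard-library method only accepts file paths. The names cert_pem, key_pem, and key_password are assumed placeholders for the strings read from the mounted Kubernetes secret.

import ssl
import tempfile

def context_from_strings(cert_pem: str, key_pem: str, key_password: str) -> ssl.SSLContext:
    # Build a client-side TLS context from PEM strings held in memory.
    context = ssl.create_default_context()
    # load_cert_chain only takes file paths, so write the secrets to
    # temporary files that are deleted as soon as the with-block exits.
    with tempfile.NamedTemporaryFile(mode="w", suffix=".pem") as cert_file, \
         tempfile.NamedTemporaryFile(mode="w", suffix=".pem") as key_file:
        cert_file.write(cert_pem)
        cert_file.flush()
        key_file.write(key_pem)
        key_file.flush()
        context.load_cert_chain(certfile=cert_file.name,
                                keyfile=key_file.name,
                                password=key_password)
    return context

The context returned here could then be passed to http.client.HTTPSConnection exactly as in the snippet above.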

Related

Python - Requests Library - How to ensure HTTPS requests

This is probably a dumb question, but I just want to make sure with the below.
I am currently using the requests library in Python. I am using it to call an external API hosted on Azure cloud.
If I use the requests library from a virtual machine and it sends a request to the URL https://api-management-example/run, does that mean my communication with this API, as well as the entire payload I send through, is secure? I have seen that in the Python site-packages of my virtual environment there is a cacert.pem file. Do I need to update that at all? Do I need to do anything else on my end to ensure the communication is secure, or does the fact that I am calling the HTTPS URL mean it is secure?
Any information/guidance would be much appreciated.
Thanks,
HTTPS is secure when the server presents a valid, signed certificate. Some people use a self-signed certificate to provide HTTPS. In the requests library, you can explicitly verify the certificate. If the server uses a self-signed certificate, you need to pass that certificate so requests can verify it against your local copy.
verify = True
import requests
response = requests.get("https://api-management-example/run", verify=True)
Self Signed Certificate
import requests
response = requests.get("https://api-management-example/run", verify="/path/to/local/certificate/file/")
POST requests carry data in the message body rather than appending parameters to the URL, so the parameters do not end up in the browser history or server logs. Over SSL/TLS (HTTPS) connections, GET parameters are encrypted in transit as well. If you are not using HTTPS (SSL/TLS), then POST requests are the preferable choice for sensitive data.
A dictionary can be passed as the second parameter to the post method to send the data as key-value pairs.
The HTTPS protocol is safe provided you have a valid SSL certificate on your API. If you want to be extra safe, you can implement end-to-end encryption/cryptography on top of it: the plaintext payload is converted into scrambled text, called ciphertext, before it is sent.
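As a rough illustration of that idea, and not something from the original answer, the sketch below encrypts the payload with the third-party cryptography package before posting it over HTTPS; the shared-key handling is a placeholder and would normally come from a secret store agreed on by both ends.

import requests
from cryptography.fernet import Fernet

# Placeholder: both ends must share this key out of band (e.g. via a secret store).
shared_key = Fernet.generate_key()
fernet = Fernet(shared_key)

# Encrypt the payload before it leaves the process...
ciphertext = fernet.encrypt(b'{"bar": "baz"}')

# ...then send only the ciphertext over the already-encrypted HTTPS channel.
response = requests.post(
    "https://api-management-example/run",  # URL taken from the question
    data=ciphertext,
    verify=True,
)

# The receiving side decrypts with the same key:
plaintext = Fernet(shared_key).decrypt(ciphertext)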
You can explicitly enable verification in the requests library:
import requests
session = requests.Session()
session.verify = True
session.post(url='https://api-management-example/run', data={'bar':'baz'})
This is enabled by default. You can also verify the certificate per request:
requests.get('https://github.com', verify='/path/to/certfile')
Or per session:
s = requests.Session()
s.verify = '/path/to/certfile'
Read the docs.

Bottle-WebSocket: How to ensure an HTTP request is from the same session as ws connection?

I built a web application using the Python Bottle framework.
I used the bottle-websocket plugin for WebSocket communication with clients.
Here is a part of my code.
from bottle import Bottle, request, run
from bottle.ext.websocket import GeventWebSocketServer, websocket

class MyHandler():
    ...

class MyServer(Bottle):
    ...
    def _serve_websocket(self, ws):
        handler = MyHandler()
        some_data = request.cookies.get('some_key')  # READ SOME DATA FROM HTTP REQUEST
        while True:
            msg = ws.receive()
            handler.do_sth_on(msg, some_data)  # USE THE DATA FROM HTTP REQUEST
            ws.send(msg)
        del(handler)

if __name__ == '__main__':
    run(app=MyServer(), server=GeventWebSocketServer, host=HOST, port=PORT)
As the code shows, I need to read some data from the browser (cookies or anything in the HTTP request headers) and use it for WebSocket message processing.
How can I ensure the request is from the same browser session as the one the WebSocket connection comes from?
NOTE
As I do not have much knowledge of HTTP and WebSocket, I'd love an answer that is as detailed as possible.
How can I ensure the request is from the same browser session as the one the WebSocket connection comes from?
A browser session is a bit abstract, since HTTP does not have a concept of sessions. HTTP and RESTful APIs are designed to be stateless, but there are options.
Usually, what you want to know is which user the request comes from. This is typically solved by authentication, e.g. using OpenID Connect and letting the user send their JWT token in the Authorization: header; this works for all HTTP requests, including the one that sets up a WebSocket connection.
bottle-oauthlib seems to be a library for authenticating end-users using OAuth2 / OpenID Connect.
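A minimal sketch of that token check in a Bottle handler, assuming the PyJWT package; the public key, algorithm, and claim names are illustrative and not taken from the original answer.

import jwt  # PyJWT, assumed to be installed
from bottle import abort, request

PUBLIC_KEY = '-----BEGIN PUBLIC KEY-----...'  # placeholder for the issuer's key

def current_user():
    # Expect "Authorization: Bearer <token>" on every request,
    # including the one that upgrades to a WebSocket.
    auth = request.get_header('Authorization', '')
    if not auth.startswith('Bearer '):
        abort(401, 'Missing bearer token')
    token = auth[len('Bearer '):]
    try:
        claims = jwt.decode(token, PUBLIC_KEY, algorithms=['RS256'])
    except jwt.InvalidTokenError:
        abort(401, 'Invalid token')
    return claims['sub']  # which user the request comes from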
Another option is to identify the "browser session" using cookies, but this depends on state kept somewhere on the server side and is harder to implement on cloud-native platforms such as Kubernetes that prefer stateless workloads.
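For completeness, here is a minimal sketch of the cookie approach with bottle-websocket; the cookie name, signing secret, and in-memory session store are illustrative assumptions, not part of the original answer.

import uuid
from bottle import Bottle, request, response, run
from bottle.ext.websocket import GeventWebSocketServer, websocket

COOKIE_SECRET = 'change-me'   # signs the cookie so clients cannot forge it
app = Bottle()
sessions = {}                 # server-side state: session id -> session data

@app.route('/login')
def login():
    session_id = str(uuid.uuid4())
    sessions[session_id] = {'some_key': 'some_value'}
    response.set_cookie('session_id', session_id, secret=COOKIE_SECRET)
    return 'session created'

@app.route('/websocket', apply=[websocket])
def serve_websocket(ws):
    # The browser sends the same signed cookie on the WebSocket handshake,
    # so the connection can be tied back to the earlier HTTP session.
    session_id = request.get_cookie('session_id', secret=COOKIE_SECRET)
    session = sessions.get(session_id)
    if session is None:
        ws.close()
        return
    while True:
        msg = ws.receive()
        if msg is None:
            break
        ws.send(msg)

if __name__ == '__main__':
    run(app=app, server=GeventWebSocketServer, host='localhost', port=8080)

Note that the sessions dictionary is exactly the kind of server-side state the previous paragraph warns about; on Kubernetes it would have to live in something shared, e.g. Redis.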

flask proxy for ttyd

I am looking for a way to write a simple proxy in flask for ttyd, which is an open-source web terminal (https://github.com/tsl0922/ttyd). The most immediate approach is to read the client request and relay it to the ttyd server. However, it fails when the websocket is connecting.
My view function is as follows:
from contextlib import closing

import requests
from flask import Flask, Response, request

app = Flask(__name__)

@app.route('/')
@app.route('/auth_token.js')
@app.route('/ws')
def ttyd():
    if request.path == '/ws':
        url = 'ws://192.168.123.172:7681' + request.path
    else:
        url = 'http://192.168.123.172:7681' + request.path
    method = request.method
    data = request.data or request.form or None
    cookies = request.cookies
    headers = request.headers
    with closing(
        requests.request(method, url, headers=headers, data=data, cookies=cookies)
    ) as r:
        resp_headers = []
        for name, value in r.headers.items():
            resp_headers.append((name, value))
        return Response(r, status=r.status_code, headers=resp_headers)
As you can see, the view function handles 3 URL requests; the first two succeed with status code 200, while the third fails with status code 500. The error on the server side is as follows:
requests.exceptions.InvalidSchema: No connection adapters were found for 'ws://192.168.123.172:7681/ws'
I also checked the network in two cases (with/without the proxy). The picture 'Without proxy' means typing 'http://192.168.123.172:7681' directly, which succeeds. The picture 'With proxy' means accessing the ttyd server through the flask proxy, which fails.
Without proxy
With proxy
Since I am new to flask and websocket, I am confused by the result. The flask proxy can handle any other HTTP request (e.g. accessing google.com) but fails on the WebSocket connection.
Can you tell me why this happens and how I can fix it?
According to Websockets in Flask there is a flask-sockets project at https://github.com/heroku-python/flask-sockets to serve a websocket endpoint in flask. To make the backend websocket connection to the ttyd server you can't use requests; use websocket-client instead, see How do I format a websocket request?.
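A rough sketch of how those two pieces could fit together; the flask-sockets route and websocket-client call are real APIs, but the relay logic below is illustrative and untested against ttyd.

from flask import Flask
from flask_sockets import Sockets   # flask-sockets project
import websocket                    # websocket-client package

app = Flask(__name__)
sockets = Sockets(app)

TTYD_WS = 'ws://192.168.123.172:7681/ws'   # ttyd address from the question

@sockets.route('/ws')
def ws_proxy(client_ws):
    # Open a second WebSocket to ttyd (which expects the 'tty' subprotocol)
    # and shuttle messages between the browser and ttyd.
    upstream = websocket.create_connection(TTYD_WS, subprotocols=['tty'])
    try:
        while not client_ws.closed:
            msg = client_ws.receive()
            if msg is None:
                break
            upstream.send(msg)
            client_ws.send(upstream.recv())
    finally:
        upstream.close()

# flask-sockets apps need a websocket-capable server, e.g.
# gunicorn -k flask_sockets.worker proxy:app

A real terminal proxy would pump both directions concurrently rather than using the send-then-receive loop shown here.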
When I had this problem I solved it using the autobahn-python project, see https://github.com/arska/stringreplacingwebsocketproxy/
Cheers,
Aarno

Python zeep. Force not to use a Proxy

How can I use the Python zeep module and configure the connection so that it connects with no proxy?
I need to access an internal WSDL, which means no proxy is needed.
I have tried to create the client:
from zeep import Client
client = Client("myURL")
But I am getting an error because it is trying to connect through a default proxy.
Regards.
Using the information in the links provided by Dima, the following worked for me:
import requests
from zeep import Client
from zeep.transports import Transport

session = requests.Session()
session.trust_env = False  # ignore HTTP_PROXY / HTTPS_PROXY environment variables
transport = Transport(timeout=10)
transport.session = session
client = Client("your url", transport=transport)

Meanings of parameters when generating pre-signed URL using boto S3

I am using boto to create pre-signed URLs that let users upload directly to S3.
I know that we can use the generate_url method (available both for the Connection and Bucket classes) for this, but it's not clear to me what some of the available parameters, such as headers, response_headers, and force_http, specifically mean in that method.
I guess that headers are the headers sent with the request to the generated URL? And response_headers are the ones that will be in the response when the file is downloaded?
As for force_http, which connection is it for? Is it the connection between my application and AWS, or the connection between the uploading client and AWS?
force_http
will force a non-SSL HTTP connection (i.e. not HTTPS). This applies to the resulting URL.
headers (e.g. generate_url(headers={'X-Forwarded-For': '1.2.3.4'}))
will set the headers for the request, maybe passing an AWS token for auth or something.
response_headers
will override the default response headers returned with the S3 object, maybe because you want a header in the response that your receiving client will interpret.
Details can be seen here: http://boto.readthedocs.org/en/latest/ref/gs.html?highlight=generate_url#boto.gs.key.Key.generate_url
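A small sketch tying the three parameters together, assuming boto 2 with credentials already configured; the bucket name, key name, and header values are purely illustrative.

import boto

# Assumes credentials are available to boto (e.g. ~/.boto or environment variables).
conn = boto.connect_s3()
bucket = conn.get_bucket('my-bucket')          # placeholder bucket
key = bucket.new_key('uploads/report.csv')     # placeholder object key

url = key.generate_url(
    expires_in=3600,                  # URL is valid for one hour
    method='PUT',                     # pre-signed upload
    headers={'Content-Type': 'text/csv'},        # headers the uploader must send
    response_headers={                # headers S3 adds when the object is later fetched
        'response-content-disposition': 'attachment; filename=report.csv',
    },
    force_http=False,                 # keep the resulting URL on https
)
print(url)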
