How to use an AWS certificate with the Python requests library?

I have an Elastic Load Balancer running on AWS. I have attached an HTTPS listener to it on port 443, and it uses a certificate from AWS Certificate Manager. I want to send HTTPS requests from a Python script to the load balancer, but I can't figure out the exact API call I should make to implement this.

You can make an HTTPS GET request to the DNS name of the ELB.
To generate the HTTPS request, use the following script, quoted from this answer:
import urllib.request
r = urllib.request.urlopen('<DNS OF ELB>')
print(r.read())
If you really want to use http.client, you must call endheaders
after you send the request headers:
import http.client
conn = http.client.HTTPSConnection('DNS OF ELB', 443)
conn.putrequest('GET', '/')
conn.endheaders() # <---
r = conn.getresponse()
print(r.read())
As a shortcut to putrequest/endheaders, you can also use the request
method, like this:
import http.client
conn = http.client.HTTPSConnection('DOMAIN', 443)
conn.request('GET', '/') # <---
r = conn.getresponse()
print(r.read())
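Since the question asks about the requests library specifically, here is a hedged sketch of the same call with requests; the make_session helper is illustrative, not part of the original answer. A public ACM certificate chains to a trusted root CA, so requests should verify it out of the box with no certificate file on the client:

```python
import requests

def make_session(ca_bundle=None):
    """Return a requests Session for the ELB endpoint.

    With a public ACM certificate no extra setup is needed: requests
    verifies the server certificate against its bundled CAs. Pass a
    PEM bundle path only if the listener uses a private CA.
    """
    session = requests.Session()
    if ca_bundle:
        session.verify = ca_bundle  # path to a PEM file with the CA chain
    return session

# Usage (the hostname placeholder is from the answer above):
# r = make_session().get('https://<DNS OF ELB>/')
# print(r.status_code, r.text)
```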
Update 1
For Python 2.7 you can use httplib or urllib2.
If you are using httplib, HTTPS is supported only if the socket module was compiled with SSL support.
For urllib2, refer to this article.

Related

Getting a 401 response while using Requests package

I am trying to access a server on my internal network at https://prodserver.de/info.
My code is structured as below:
import requests
from requests.auth import *
username = 'User'
password = 'Hello#123'
resp = requests.get('https://prodserver.de/info/', auth=HTTPBasicAuth(username,password))
print(resp.status_code)
While trying to access this server via browser, it works perfectly fine.
What am I doing wrong?
By default, the requests library verifies the SSL certificate for HTTPS requests. If the certificate cannot be verified, it raises an SSLError. You can check whether this is the issue by disabling certificate verification, passing verify=False as an argument to the get method (for testing only):
import requests
from requests.auth import *
username = 'User'
password = 'Hello#123'
resp = requests.get('https://prodserver.de/info/', auth=HTTPBasicAuth(username,password), verify=False)
print(resp.status_code)
Try using requests' generic auth, like this:
resp = requests.get('https://prodserver.de/info/', auth=(username, password))
What am I doing wrong?
I cannot be sure without investigating your server, but I suggest checking your assumption that the server uses Basic authentication. There are various authentication schemes; it is also possible that your server uses a cookie-based solution rather than a header-based one.
While trying to access this server via browser, it works perfectly
fine.
You might then use the browser's developer tools to see what is actually sent with the request that succeeds.
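One way to compare what the script sends against what the browser sends, without hitting the server at all, is to prepare the request but not send it; the URL and credentials below are the ones from the question:

```python
import requests
from requests.auth import HTTPBasicAuth

# Build and prepare the request without sending it, so the Authorization
# header can be compared against what the browser's developer tools show.
req = requests.Request('GET', 'https://prodserver.de/info/',
                       auth=HTTPBasicAuth('User', 'Hello#123'))
prepared = req.prepare()
print(prepared.headers['Authorization'])  # Basic VXNlcjpIZWxsbyMxMjM=
```

If the browser's successful request carries a different Authorization header (or a Cookie instead), that points at the real scheme the server expects.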

Using Kubernetes secrets in SSLContext

I am doing a POC to check whether we can connect to an API; for that I use the code below.
# Define the client certificate settings for https connection
context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
print(type(context))
context.load_cert_chain(certfile=CERT, keyfile=KEY, password=PW)
# Create a connection to submit HTTP requests
connection = http.client.HTTPSConnection(host, port=443, context=context)
# Use the connection to submit an HTTP GET request
connection.request(method="GET", url=request_url, headers=request_headers)
# Print the HTTP response from the IOT service endpoint
response = connection.getresponse()
print(response.status, response.reason)
data = response.read()
print(data)
For these two variables (CERT and KEY), I get the secrets via Kubernetes files and convert them to strings. Are there alternate ways to load the downloaded secrets into the context object instead of using the load_cert_chain method (since that one needs files)? I know this is not ideal, but since I am only doing a POC I just want to see whether this is doable.
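One common workaround, since load_cert_chain only accepts file paths, is to write the in-memory PEM strings to temporary files just long enough to load them. This is a sketch under that assumption; context_from_pem_strings is a hypothetical helper name, and PROTOCOL_TLS_CLIENT is used as the modern replacement for the deprecated PROTOCOL_SSLv23:

```python
import ssl
import tempfile

def context_from_pem_strings(cert_pem, key_pem, password=None):
    """Build an SSLContext from PEM strings already held in memory
    (e.g. read from mounted Kubernetes secret files)."""
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # load_cert_chain needs real paths, so round-trip through temp files;
    # they are deleted as soon as the `with` block exits.
    with tempfile.NamedTemporaryFile(mode='w', suffix='.pem') as cert_f, \
         tempfile.NamedTemporaryFile(mode='w', suffix='.pem') as key_f:
        cert_f.write(cert_pem)
        cert_f.flush()
        key_f.write(key_pem)
        key_f.flush()
        context.load_cert_chain(certfile=cert_f.name, keyfile=key_f.name,
                                password=password)
    return context

# connection = http.client.HTTPSConnection(
#     host, port=443, context=context_from_pem_strings(CERT, KEY, PW))
```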

How do I check my public IP using Python, or use DDNS with Cloudflare?

Is there any way to check my public IP address using Python? I have an account on Cloudflare and a VPS at home (but with a dynamic IP), and I need to update the VPN IPs. Before, I had an account on OVH and DDNS worked; after the migration it does not.
requests:
import requests
response = requests.get('http://ifconfig.me')
print(response.text)
python 2:
import urllib2
response = urllib2.urlopen('http://ifconfig.me')
print response.read()
python 3:
from urllib import request
response = request.urlopen('http://ifconfig.me')
print(response.read())
There's a bunch of websites online that offer simple APIs to check the IP address you're connecting from.
As an example:
➜ ~ curl 'https://api.ipify.org?format=json'
{"ip":"1.2.3.4"}
You can use a python library (like Requests) to call the API in python code.
# download requests with `pip install requests`
import requests
res = requests.get("https://api.ipify.org?format=json")
your_ip = res.json() # {"ip" : "1.2.3.4"}
Recently I created a simple Docker image that keeps your local IP in sync with Cloudflare, AKA Cloudflare Dynamic DNS (DDNS).
Check it out here:
https://github.com/marcelowa/cloudflare-dynamic-dns
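To close the DDNS loop by hand, a sketch like the following could work against the Cloudflare v4 API. The zone ID, record ID, token, and record name are placeholders you would fetch from the Cloudflare dashboard; the endpoint shape follows the public v4 DNS-records API, and ttl=1 asks Cloudflare for an automatic TTL:

```python
import requests

API = 'https://api.cloudflare.com/client/v4'

def build_ddns_update(zone_id, record_id, name, ip, token):
    """Assemble the url, headers, and payload for updating an A record
    to `ip`. All identifiers here are placeholders from your dashboard."""
    url = f'{API}/zones/{zone_id}/dns_records/{record_id}'
    headers = {'Authorization': f'Bearer {token}',
               'Content-Type': 'application/json'}
    payload = {'type': 'A', 'name': name, 'content': ip, 'ttl': 1}
    return url, headers, payload

# current_ip = requests.get('https://api.ipify.org').text
# url, headers, payload = build_ddns_update(ZONE, RECORD,
#                                           'home.example.com',
#                                           current_ip, TOKEN)
# requests.put(url, headers=headers, json=payload)
```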

Python SOAP client with Zeep - authentication

I am trying to use Zeep to implement a SOAP client, as it seems to be the only maintained library at the moment:
ZSI looked very good, but its latest version on PyPI dates from 2006.
suds seemed to be a popular alternative, but the master branch has been unmaintained since 2011, and of the many forks out there none seems "official" and "recent" enough to be used in a large project.
So, trying to use Zeep, I am stuck with the authentication required by the server to access the WSDL.
Such an operation was quite easy with ZSI:
from ZSI.client import Binding
from ZSI.auth import AUTH
b = Binding(url='http://mysite.dom/services/MyWebServices?WSDL')
b.SetAuth(AUTH.httpbasic, 'userid', 'password')
and I can find something similar in __main__.py of Zeep:
from six.moves.urllib.parse import urlparse
from zeep.cache import InMemoryCache, SqliteCache
from zeep.client import Client
from zeep.transports import Transport
cache = SqliteCache() if args.cache else InMemoryCache()
transport_kwargs = {'cache': cache}
result = urlparse(args.wsdl_file)
if result.username or result.password:
    transport_kwargs['http_auth'] = (result.username, result.password)
transport = Transport(**transport_kwargs)
client = Client(args.wsdl_file, transport=transport)
but that does not work in my case, I get an error:
Exception: HTTPConnectionPool(host='schemas.xmlsoap.org', port=80): Max retries exceeded with url: /soap/encoding/ (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f3dab9d30b8>: Failed to establish a new connection: [Errno 110] Connection timed out',))
Probably the older solution no longer works with newer versions of Zeep. Here is the new way:
from requests.auth import HTTPBasicAuth # or HTTPDigestAuth, or OAuth1, etc.
from requests import Session
from zeep import Client
from zeep.transports import Transport
session = Session()
session.auth = HTTPBasicAuth(user, password)
client = Client('http://my-endpoint.com/production.svc?wsdl',
                transport=Transport(session=session))
For Basic Access Authentication you can use the HTTPBasicAuth class from the requests module, as explained in the Zeep documentation http://docs.python-zeep.org/en/master/transport.html:
from requests.auth import HTTPBasicAuth # or HTTPDigestAuth, or OAuth1, etc.
from zeep import Client
from zeep.transports import Transport
client = Client('http://my-endpoint.com/production.svc?wsdl',
                transport=Transport(http_auth=HTTPBasicAuth(user, password)))
In my case the API I was working with required WS-Security (WSSE) rather than HTTP authentication.
from zeep import Client
from zeep.wsse.username import UsernameToken
client = Client(<wsdl_url>, wsse=UsernameToken(<username>, <password>))

Access HTTPS / :443 through VPN in this Python urllib.request library, when blocked

I am using gh-issues-import to migrate issues between GitHub and a GitHub Enterprise server. The problem is that our GHE requires going through a VPN proxy, while GitHub's API requires an HTTPS route. I can get one or the other, but I'm having a hell of a time finding a way to access both from the same Python project using urllib.request. Here is a scaled-down script exercising the library call that is failing in gh-issues-import...
import urllib.request
# works through the VPN (note: plain http); requires VPN
GitHubEnterpriseurl = "http://xxxxx/api/v3/"
req = urllib.request.Request(GitHubEnterpriseurl)
response = urllib.request.urlopen(req)
json_data = response.read()
print(json_data)
# does not work on VPN due to https path, but fine outside of VPN
req = urllib.request.Request("https://api.github.com")
response = urllib.request.urlopen(req)
json_data = response.read()
print(json_data)
I have tried other HTTP libraries, and it comes down to the VPN blocking access to https://api.github.com. What are some solutions? Could I create a script on another server that my VPN has access to, and simply relay the requests and route the data?
* I am able to connect to https://api.github.com over the VPN through a browser (Chrome / Firefox), but running any command-line tool or this script to access it fails.
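The browser-works-but-CLI-fails symptom often means the browser is picking up the system proxy settings while urllib is not. If the VPN only allows outbound HTTPS through an internal proxy, one thing to try is pointing urllib.request at that proxy explicitly; the proxy address below is a made-up placeholder you would replace with whatever the browser is configured to use:

```python
import urllib.request

# Route https:// traffic through an internal proxy the VPN permits.
# 'proxy.internal.example:8080' is a placeholder; check the browser's
# proxy settings or ask your network team for the real address.
proxy = urllib.request.ProxyHandler(
    {'https': 'http://proxy.internal.example:8080'})
opener = urllib.request.build_opener(proxy)
urllib.request.install_opener(opener)

# req = urllib.request.Request('https://api.github.com')
# response = urllib.request.urlopen(req)  # now goes via the proxy
```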
