This is probably a dumb question, but I just want to make sure with the below.
I am currently using the requests library in python. I am using this to call an external API hosted on Azure cloud.
If I use the requests library from a virtual machine, and the requests library sends to URL: https://api-management-example/run, does that mean my communication to this API, as well as the entire payload I send through is secure? I have seen in my Python site-packages in my virtual environment, there is a cacert.pem file. Do I need to update that at all? Do I need to do anything else on my end to ensure the communication is secure, or the fact that I am calling the HTTPS URL means it is secure?
Any information/guidance would be much appreciated.
Thanks,
HTTPS is secure when the server presents a valid, signed certificate. Some people use a self-signed certificate to provide HTTPS. In the requests library you control certificate verification explicitly: if the server uses a self-signed certificate, you need to pass that certificate so requests can verify the connection against your local copy of it.
verify = True
import requests
response = requests.get("https://api-management-example/run", verify=True)
Self-Signed Certificate
import requests
response = requests.get("https://api-management-example/run", verify="/path/to/local/certificate/file/")
POST requests keep data out of sight because they carry it in the message body, whereas GET requests append the parameters to the URL, where they are also visible in the browser history. Note that SSL/TLS and HTTPS connections encrypt the GET parameters in transit as well. If you are not using HTTPS or SSL/TLS connections, then POST requests are the preferred option for security.
A dictionary object can be used to send the data, as key-value pairs, as the second parameter to the post method.
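A rough sketch of such a call, reusing the example URL from the question (the payload keys are made up for illustration):
import requests

# requests form-encodes the dict into the POST body; the keys here are placeholders.
payload = {'key1': 'value1', 'key2': 'value2'}
response = requests.post('https://api-management-example/run', data=payload, verify=True)
print(response.status_code)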
The HTTPS protocol is safe provided you have a valid SSL certificate on your API. If you want to be extra safe, you can implement end-to-end encryption/cryptography on top of it: basically, you take your so-called plaintext and convert it into scrambled text, called ciphertext, before sending it.
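As a rough illustration of that idea, here is a sketch using the cryptography package's Fernet API (an assumption on my part; the question doesn't name a library, and the API on the other end would need the same key to decrypt):
import requests
from cryptography.fernet import Fernet

# Hypothetical shared key; in practice both ends must already hold it securely.
key = Fernet.generate_key()
cipher = Fernet(key)

ciphertext = cipher.encrypt(b'my sensitive payload')  # plaintext -> ciphertext
response = requests.post('https://api-management-example/run', data=ciphertext, verify=True)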
You can explicitly enable verification in the requests library:
import requests
session = requests.Session()
session.verify = True
session.post(url='https://api-management-example/run', data={'bar':'baz'})
This is enabled by default. You can also verify the certificate per request:
requests.get('https://github.com', verify='/path/to/certfile')
Or per session:
s = requests.Session()
s.verify = '/path/to/certfile'
Read the docs.
Related
I am trying to make a script that uploads files to a nextcloud server via webdav. The documentation says this can be done using a PUT HTTP request, like so:
import requests
url = "https://example.com/path/to/file.txt"
file = open("file/to/upload.txt")
response = requests.put(url, auth=(user, passwd), data=file)
This works just fine, and I get a 201 status code as a response, which means the file has been created, but I feel like it's quite unsafe to do it this way.
I can add the parameter verify=True to make sure the file is sent encrypted if the site has a valid SSL certificate, but I don't know if the authentication credentials are encrypted too.
In case they're not, how would I securely make the request without revealing my password?
You're PUTting to a URL that starts with https, so you're making the request over TLS, which is secure if the server has a good certificate. Because the Basic Auth credentials travel inside that same TLS connection, your username and password are encrypted in transit too.
To find out if your certificate is good, try running openssl s_client -showcerts -connect example.com:443 from the command line and check that it reports Verify return code: 0 (ok), or open a browser, go to https://example.com, and see if the browser complains.
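If you'd rather do the same check from Python, here is a small sketch with the standard library ssl module (example.com is just the placeholder host from above):
import socket
import ssl

context = ssl.create_default_context()  # verifies the chain and the hostname by default
with socket.create_connection(('example.com', 443)) as sock:
    # The handshake below raises ssl.SSLCertVerificationError if the certificate is not good.
    with context.wrap_socket(sock, server_hostname='example.com') as tls:
        print(tls.getpeercert()['subject'])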
I'm trying to connect to one of my internal services at https://myservice.my-alternative-domain.com through Python Requests. I'm using Python 3.6.
I'm using a custom CA bundle to verify the request, and I'm getting the following error:
SSLError: hostname 'myservice.my-domain.com' doesn't match either of 'my-domain.com', 'my-alternative-domain.com'
The SSL certificate that the internal service uses has as CN: my-domain.com, and as SAN (Subject Alternative Names): 'my-domain.com', 'my-alternative-domain.com'
So, I'm trying to access the service through one of the alternative names (this has to be like this and it's not under my control)
I think the error is correct, and that the certificate would also need '*.my-alternative-domain.com' as a SAN in order for the request to work.
The only thing that puzzles me is that I can access the service through the browser.
Can somebody confirm the behavior of Python Requests is correct?
This is how I call the service:
response = requests.get('https://myservice.my-alternative-domain.com', params=params, headers=headers, verify=ca_bundle)
Thanks
Passing verify=False might work, but note that it disables certificate verification entirely:
x = requests.get('https://myservice.my-alternative-domain.com', verify=False)
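Disabling verification is risky, though. If you first want to see which names the certificate actually presents, here is a diagnostic sketch using the standard library (the host is taken from the question; the CA bundle path is a placeholder, and only hostname checking is relaxed for the inspection):
import socket
import ssl

context = ssl.create_default_context(cafile='/path/to/ca_bundle.pem')
context.check_hostname = False  # the chain is still verified against the CA bundle
with socket.create_connection(('myservice.my-alternative-domain.com', 443)) as sock:
    with context.wrap_socket(sock, server_hostname='myservice.my-alternative-domain.com') as tls:
        print(tls.getpeercert().get('subjectAltName'))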
Is there a difference between those two bs4 objects?
from urllib2 import urlopen, Request
from bs4 import BeautifulSoup
req1 = Request("https://stackoverflow.com/") # HTTPS
html1 = urlopen(req1).read()
req2 = Request("http://stackoverflow.com/") # HTTP
html2 = urlopen(req2).read()
bsObj1 = BeautifulSoup(html1, "html.parser")
bsObj2 = BeautifulSoup(html2, "html.parser")
Do you really need to specify an HTTP protocol?
Here's my limited understanding: There isn't a practical difference in this case.
My understanding is that most websites that have https will redirect http URLs to https, as is the case here. It's possible for a site to have an http version and an https version up simultaneously, in which case they might not redirect. This would be bad practice, but nothing is stopping someone from doing it.
I would still explicitly use https whenever possible, just as a best practice.
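One way to actually see that redirect, using the requests library rather than urllib2 (an assumption on my part, since the question uses urllib2):
import requests

response = requests.get('http://stackoverflow.com/')
print(response.url)                               # final URL after redirects, e.g. https://stackoverflow.com/
print([r.status_code for r in response.history])  # the redirect chain, e.g. [301]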
All communication over the HTTP protocol happens using HTTP verbs GET, POST, PUT, DELETE. Specifying the protocol has two purposes:
1) It specifies the scheme for data communication.
A general URI is of the form:
scheme:[//[user[:password]@]host[:port]][/path][?query][#fragment] and common schemes are http(s), ftp, mailto, file, data, and irc.
2) It specifies whether the scheme supports SSL/TLS encryption:
With the http scheme, the added 's' in https ensures SSL/TLS encryption of the data.
According to urllib3 Python docs:
It is highly recommended to always use SSL certificate verification. In order to enable verification you will need a set of root certificates. The easiest and most reliable method is to use the certifi package, which provides Mozilla’s root certificate bundle:
pip install certifi
>>> import certifi
>>> import urllib3
>>> http = urllib3.PoolManager(
... cert_reqs='CERT_REQUIRED',
... ca_certs=certifi.where())
The PoolManager will automatically handle certificate verification and will raise SSLError if verification fails:
>>> http.request('GET', 'https://google.com')
(No exception)
>>> http.request('GET', 'https://expired.badssl.com')
urllib3.exceptions.SSLError ...
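The same idea with the requests library from the original question: recent versions of requests use certifi's bundle for verify=True by default, so passing it explicitly is equivalent and just makes that choice visible:
import certifi
import requests

response = requests.get('https://google.com', verify=certifi.where())
print(response.status_code)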
I need to access the Jenkins JSON API from a Python script. The problem is that our Jenkins installation is secured, so to log in, users have to select a certificate. Sadly, the Jenkins Remote Access Documentation doesn't mention a thing about certificates, and I tried using the API Token without success.
How can I get to authenticate from a Python script to use their JSON API?
Thanks in advance!
You have to authenticate to the JSON API using HTTP Basic Auth.
To make scripted clients (such as wget) invoke operations that require authorization (such as scheduling a build), use HTTP BASIC authentication to specify the user name and the API token. This is often more convenient than emulating the form-based authentication
https://wiki.jenkins-ci.org/display/JENKINS/Authenticating+scripted+clients
Here is a sample of using Basic Auth with Python.
http://docs.python-requests.org/en/master/user/authentication/
Keep in mind that if you are using a self-signed certificate on an internal Jenkins server, you'll need to turn off certificate validation OR get the certificate from the server and add it to the HTTP request
http://docs.python-requests.org/en/master/user/advanced/
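A minimal sketch of what that looks like with requests (the URL, user name, API token, and certificate path below are placeholders, not values from your setup):
import requests

response = requests.get(
    'https://jenkins_server/api/json',
    auth=('my user name', 'my API token'),      # HTTP Basic Auth with the API token as the password
    verify='/path/to/jenkins-server-cert.pem',  # or verify=False for a self-signed cert (not recommended)
)
print(response.json())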
I finally found out how to authenticate to Jenkins using certs and wget. I had to convert my pfx certificates into pem ones, with the cert and keys in separate files. For more info about that, come here. In the end this is the command I used:
wget --certificate=/home/B/cert.pem --private-key=/home/B/key.pem --no-check-certificate --output-document=jenkins.json https:<URL>
I'm not completely sure it covers your certificate use case, but since it took me some time to find out, I still want to share this snippet, which retrieves the email address for a given user name in Python without special Jenkins libraries. It uses an API token and "supports" (actually ignores) https:
import base64
import json
import ssl
import urllib.request

def _get_email_address(user):
    request = urllib.request.Request("https://jenkins_server/user/" + user + "/api/json")
    # according to https://stackoverflow.com/a/28052583/4609258 the following is ugly
    context = ssl._create_unverified_context()
    base64string = base64.b64encode(bytes('%s:%s' % ('my user name', 'my API token'), 'ascii'))
    request.add_header("Authorization", "Basic %s" % base64string.decode('utf-8'))
    with urllib.request.urlopen(request, context=context) as url:
        user_data = json.loads(url.read().decode())
        for property in user_data['property']:
            if property["_class"] == "hudson.tasks.Mailer$UserProperty":
                return property["address"]
Using python-requests, how can I pin a self-signed .pem certificate for a specific server directly, without using CA root bundles?
Is this currently possible? If yes, can you please provide an example?
I read https://2.python-requests.org/en/v2.8.1/user/advanced/#ssl-cert-verification but am not sure if this applies to what I'm trying to do:
You can also specify a local cert to use as client side certificate, as a single file (containing the private key and the certificate) or as a tuple of both files' paths:
>>> requests.get('https://kennethreitz.com', cert=('/path/server.crt', '/path/key'))
<Response [200]>
Because the certificate file is self-signed, it is effectively its own root, so this works just as it would normally with requests: you point verify at that file. Below is a step-by-step procedure:
Obtain the self-signed certificate, ideally in some secure, out-of-band manner. For example, I run a webserver that offers HTTPS access via a self-signed certificate, so I downloaded the certificate using scp:
scp <username>@<server>:/path/to/certfile.crt .
Because I use nginx this is already in PEM format, but if it's not you'll need to convert it. That's outside the scope of this answer.
Use the certificate file from inside requests:
r = requests.get('https://yoursite.com/', verify='certfile.crt')
That's all you need to do.
If you can't obtain the certificate in an out-of-band manner you trust, you can obtain the certificate using your browser. All browsers will let you export the certificate via their UIs. This is less secure: if someone is going to MITM you then they may well have already started, and can offer you their MITM root CA instead of your self-signed cert.
You can also verify certificates against their fingerprints.
For this you need a custom transport adapter for requests.
An example for a simple one can be found here:
https://github.com/untitaker/vdirsyncer/blob/9d3a9611b2db2e92f933df30dd98c341a50c6211/vdirsyncer/utils/init.py#L198
import requests
from requests.packages.urllib3.poolmanager import PoolManager

class _FingerprintAdapter(requests.adapters.HTTPAdapter):
    """Transport adapter that pins the server certificate by its fingerprint."""

    def __init__(self, fingerprint=None, **kwargs):
        self.fingerprint = str(fingerprint)
        super(_FingerprintAdapter, self).__init__(**kwargs)

    def init_poolmanager(self, connections, maxsize, block=False):
        # Pass the pinned fingerprint down to urllib3, which compares it
        # against the certificate presented by the server.
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       assert_fingerprint=self.fingerprint)
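A minimal usage sketch (the URL and fingerprint are made-up placeholders; urllib3 compares the pinned fingerprint against the certificate the server presents):
session = requests.Session()
session.mount('https://yoursite.com/', _FingerprintAdapter(fingerprint='AA:BB:CC:...'))
response = session.get('https://yoursite.com/')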