Has anyone combined soap.py or suds with python-ntlm? - python

I'd like to replace an app's current (badly busted and crufty) cURL-based (cURL command-line based!) SOAP client with suds or soap.py. Trouble is, we have to contact an MS CRM service, and therefore must use NTLM. For a variety of reasons the NTLM proxy is a bit of a pain to use, so I'm looking into python-ntlm to provide that support.
Can suds or soap.py be made to use this authentication method? If so, how? If not, any other suggestions would be fantastic.
Edit
As noted below, suds already supports python-ntlm out of the box.

Suds has supported it out of the box since 0.3.8.
The source of python-suds-0.3.9\suds\transport\https.py reads:
class WindowsHttpAuthenticated(HttpAuthenticated):
    """
    Provides Windows (NTLM) http authentication.
    @ivar pm: The password manager.
    @ivar handler: The authentication handler.
    """
    def u2handlers(self):
        # try to import ntlm support
        try:
            from ntlm import HTTPNtlmAuthHandler
        except ImportError:
            raise Exception("Cannot import python-ntlm module")
        handlers = HttpTransport.u2handlers(self)
        handlers.append(HTTPNtlmAuthHandler.HTTPNtlmAuthHandler(self.pm))
        return handlers
Try the following snippet, as described here:
from suds.client import Client
from suds.transport.https import WindowsHttpAuthenticated

ntlm = WindowsHttpAuthenticated(username='xx', password='xx')
client = Client(url, transport=ntlm)  # url is your service's WSDL address
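Putting it together, a minimal end-to-end sketch might look like the following. The WSDL URL, domain and credentials are placeholders, and the domain-qualified username (DOMAIN\user) is an assumption that typically applies to NTLM-protected MS CRM/SharePoint services:
from suds.client import Client
from suds.transport.https import WindowsHttpAuthenticated

# Placeholder values -- substitute your own WSDL URL, domain and credentials.
url = 'https://crm.example.com/service.asmx?WSDL'
ntlm = WindowsHttpAuthenticated(username=r'MYDOMAIN\myuser', password='secret')
client = Client(url, transport=ntlm)

# List the generated service methods before calling anything.
print client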

Another approach would be to shell out to your curl command when a SOAP exception occurs and then retry.
Something like:
curl -x websenseproxy:8080 --ntlm -U domain\user:password --insecure https://blah.com/prod/webservice.asmx?WSDL
# --insecure is needed for self-signed certs
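A rough sketch of that fallback, assuming the suds client from above and a curl binary on the PATH; the proxy, credentials and URL mirror the command above, and SomeMethod is a hypothetical service call, not a tested recipe:
import subprocess
from suds import WebFault

def call_via_curl():
    # Shell out to curl as a last resort; the arguments mirror the command above.
    return subprocess.check_output([
        'curl', '-x', 'websenseproxy:8080', '--ntlm',
        '-U', r'domain\user:password', '--insecure',
        'https://blah.com/prod/webservice.asmx?WSDL',
    ])

try:
    result = client.service.SomeMethod()   # hypothetical suds call
except WebFault:
    result = call_via_curl()               # fall back to curl and retry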

You can use CNTLM as a local proxy service which handles all the NTLM authentication for you. Then you just point SOAPpy, urllib2, or whatever at the local CNTLM proxy IP and port, with no authentication needed, as sketched below.
I've yet to find a Python library that deals with complex proxies well.
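For example, with CNTLM listening on its local port (commonly 3128; check your cntlm.conf), the client needs no credentials at all. A minimal urllib2 sketch, with the target URL borrowed from the curl example above:
import urllib2

# CNTLM does the NTLM handshake against the upstream proxy; locally no auth is needed.
proxy = urllib2.ProxyHandler({
    'http': 'http://127.0.0.1:3128',
    'https': 'http://127.0.0.1:3128',
})
opener = urllib2.build_opener(proxy)
print opener.open('https://blah.com/prod/webservice.asmx?WSDL').read()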

Related

access ADFS/OIDC protected webapi with curl or python

I would like to access a web API from a script (bash or Python); the API is protected by mod_openidc/apache2 and a self-hosted ADFS.
For the authentication, a certificate from a smartcard or a locally stored certificate is required.
I have already tried several approaches with Python and curl, but haven't gotten anything close to a working script.
My approach in Python:
from oauthlib.oauth2 import BackendApplicationClient
from requests_oauthlib import OAuth2Session
client_id="abcdef-abcd-abcd-abcd-abcdefghijk"
client = BackendApplicationClient(client_id=client_id)
#client = BackendApplicationClient()
oauth = OAuth2Session(client=client)
protected_url="https://protectedurl/page/"
oauth.fetch_token(token_url='https://sts.myserver.net/adfs/oauth2/token/', include_client_id=True, cert=('/home/user/cert.pem', '/home/user/server.key'))
which leads to: "oauthlib.oauth2.rfc6749.errors.InvalidClientError: (invalid_client) MSIS9627: Received invalid OAuth client credentials request. Client credentials are missing or found empty"
curl:
curl --cert /home/user/cert.pem --key /home/user/server.key "https://sts.example.net/adfs/oauth2/authorize/?response_type=code&scope=openid%20email%20profile%20allatclaims&client_id=XXX&state=XXXredirect_uri=https%3A%2F%2Fexample.net%2Fpage%2Fredirect_uri&nonceXXX"
which returns the STS page as HTML.
So I don't think I have a small bug; I think the approach itself is wrong.
Since it works in a browser, I don't suspect an issue on the server side.
Any approaches and examples are warmly welcome.

How to authenticate in Jenkins while remotely accessing its JSON API?

I need to access the Jenkins JSON API from a Python script. The problem is that our Jenkins installation is secured, so to log in users have to select a certificate. Sadly, the Jenkins Remote Access Documentation doesn't mention a thing about certificates, and I tried using the API token without success.
How can I authenticate from a Python script to use their JSON API?
Thanks in advance!
You have to authenticate to the JSON API using HTTP Basic Auth.
To make scripted clients (such as wget) invoke operations that require authorization (such as scheduling a build), use HTTP BASIC authentication to specify the user name and the API token. This is often more convenient than emulating the form-based authentication
https://wiki.jenkins-ci.org/display/JENKINS/Authenticating+scripted+clients
Here is a sample of using Basic Auth with Python.
http://docs.python-requests.org/en/master/user/authentication/
Keep in mind that if you are using a self-signed certificate on an internal Jenkins server, you'll need to turn off certificate validation OR get the certificate from the server and add it to the HTTP request:
http://docs.python-requests.org/en/master/user/advanced/
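A minimal sketch with the requests library; the Jenkins URL, user name and API token are placeholders, and verify=False mirrors the self-signed-certificate case above (point verify at a CA bundle instead if you have one):
import requests

# Placeholders: substitute your Jenkins URL, user name and API token.
response = requests.get('https://jenkins.example.com/api/json',
                        auth=('my_user', 'my_api_token'),  # HTTP Basic Auth with the API token
                        verify=False)                      # self-signed certificate
response.raise_for_status()
print(response.json()['jobs'])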
I finally found out how to authenticate to Jenkins using certs and wget. I had to convert my pfx certificates into pem ones, with the cert and key in separate files (for more info about that, see here). In the end this is the command I used.
wget --certificate=/home/B/cert.pem --private-key=/home/B/key.pem --no-check-certificate --output-document=jenkins.json https:<URL>
I'm not completely sure it covers your certificate use case, but since it took me some time to find out, I still want to share this snippet, which retrieves the email address for a given user name in Python without special Jenkins libraries. It uses an API token and "supports" (actually ignores) https:
import base64
import json
import ssl
import urllib.request

def _get_email_adress(user):
    request = urllib.request.Request("https://jenkins_server/user/" + user + "/api/json")
    # according to https://stackoverflow.com/a/28052583/4609258 the following is ugly
    context = ssl._create_unverified_context()
    base64string = base64.b64encode(bytes('%s:%s' % ('my user name', 'my API token'), 'ascii'))
    request.add_header("Authorization", "Basic %s" % base64string.decode('utf-8'))
    with urllib.request.urlopen(request, context=context) as url:
        user_data = json.loads(url.read().decode())
        for property in user_data['property']:
            if property["_class"] == "hudson.tasks.Mailer$UserProperty":
                return property["address"]

Authenticating connection in PySolr

This is the first time I am using Python and Solr. I have my Solr instance set up within Tomcat on GCE. I am trying to connect to it from my Python code using PySolr, but I am not sure how to send authentication parameters via PySolr.
This is the exception I get:
solr = pysolr.Solr('http://MY INSTANCE IP/solr/News', timeout=10)
Apache Tomcat/7.0.28 - Error report: HTTP Status 401 - This request requires HTTP authentication.
Please advise.
solr = pysolr.Solr('http://user:pass@IP:8983/solr/')
That's all you need ...
You can pass Solr authentication as part of the Solr connection parameters.
pySolr doesn't really document how to carry out authentication, but since it internally uses requests, you can follow the authentication documentation for requests.
Here is a small example of custom authentication as well.
In the case of Basic Authentication, you can use it as
solr = pysolr.Solr('http://IP:8983/solr/collection', auth=('username', 'password'))
or
from requests.auth import HTTPBasicAuth
solr = pysolr.Solr('http://IP:8983/solr/collection', auth=HTTPBasicAuth('username', 'password'))
This is the proper way to authenticate. Passing the username and password as part of the URL is not recommended, because characters such as # or ' in either value can break authentication. Refer to this GitHub issue.
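Putting that together, a small usage sketch (host, collection name and credentials are placeholders):
import pysolr
from requests.auth import HTTPBasicAuth

# Placeholder host, collection and credentials.
solr = pysolr.Solr('http://IP:8983/solr/collection',
                   auth=HTTPBasicAuth('username', 'password'),
                   timeout=10)
results = solr.search('*:*', rows=5)   # simple match-all query to confirm auth works
for doc in results:
    print(doc)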

403 Forbidden Error for Python-Suds contacting Sharepoint

I'm using Python's Suds library to access SharePoint web services.
I followed the standard docs from the Suds website.
For the past two days, no matter which service I access, the remote service always returns 403 Forbidden.
I'm using Suds 0.4, so it has built-in support for NTLM via python-ntlm.
Let me know if anyone has a clue about this.
from suds import transport
from suds import client
from suds.transport.https import WindowsHttpAuthenticated
import logging
logging.basicConfig(level=logging.INFO)
logging.getLogger('suds.client').setLevel(logging.DEBUG)
ntlm = WindowsHttpAuthenticated(username='USER_ID', password='PASS')
c_lists = client.Client(url='https://SHAREPOINT_URL/_vti_bin/Lists.asmx?WSDL', transport=ntlm)
#c_lists = client.Client(url='https://SHAREPOINT_URL/_vti_bin/spsearch.asmx?WSDL')
#print c_lists
listsCollection = c_lists.service.GetListCollection()
Are you specifying the username as DOMAIN\USER_ID as indicated in examples for the python-ntlm library? (Also see this answer).
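In other words, a likely fix is simply a domain-qualified username; a sketch reusing the code from the question (MYDOMAIN is a placeholder):
# Raw string so the backslash in DOMAIN\USER_ID is not treated as an escape.
ntlm = WindowsHttpAuthenticated(username=r'MYDOMAIN\USER_ID', password='PASS')
c_lists = client.Client(url='https://SHAREPOINT_URL/_vti_bin/Lists.asmx?WSDL', transport=ntlm)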

Does urllib2 in Python 2.6.1 support proxy via https

Does urllib2 in Python 2.6.1 support proxy via https?
I've found the following at http://www.voidspace.org.uk/python/articles/urllib2.shtml:
NOTE: Currently urllib2 does not support fetching of https locations through a proxy. This can be a problem.
I'm trying to automate logging in to a web site and downloading a document; I have a valid username/password.
import urllib2

headers = {'User-Agent': 'Mozilla/5.0'}  # placeholder; the real headers are not shown here

proxy_info = {
    'host': "axxx",  # commented out the real data
    'port': "1234"   # commented out the real data
}
proxy_handler = urllib2.ProxyHandler(
    {"http": "http://%(host)s:%(port)s" % proxy_info})
opener = urllib2.build_opener(proxy_handler,
                              urllib2.HTTPHandler(debuglevel=1),
                              urllib2.HTTPCookieProcessor())
urllib2.install_opener(opener)

fullurl = 'https://correct.url.to.login.page.com/user=a&pswd=b'  # example
req1 = urllib2.Request(url=fullurl, headers=headers)
response = urllib2.urlopen(req1)
I've had this working for similar pages, but not over HTTPS, and I suspect it does not get through the proxy; it just gets stuck in the same way as when I did not specify a proxy at all. I need to go out through the proxy.
I need to authenticate, but not using basic authentication. Will urllib2 figure out the authentication when going to an https site (I supply the username/password to the site via the URL)?
EDIT:
Nope, I tested with
proxies = {
    "http": "http://%(host)s:%(port)s" % proxy_info,
    "https": "https://%(host)s:%(port)s" % proxy_info
}
proxy_handler = urllib2.ProxyHandler(proxies)
And I get the error:
urllib2.URLError: <urlopen error [Errno 8] _ssl.c:480: EOF occurred in violation of protocol>
Fixed in Python 2.6.3 and several other branches:
http://bugs.python.org/issue1424152
http://www.python.org/download/releases/2.6.3/NEWS.txt
Issue #1424152: Fix for httplib, urllib2 to support SSL while working through
proxy. Original patch by Christopher Li, changes made by Senthil Kumaran.
I'm not sure Michael Foord's article that you quote is up to date for Python 2.6.1 -- why not give it a try? Instead of telling ProxyHandler that the proxy is only good for http, as you're doing now, register it for https too (of course you should format it into a variable just once before you call ProxyHandler, and then reuse that variable in the dict). That may or may not work, but you're not even trying, and that's sure not to work!-)
In case anyone else has this issue in the future, I'd like to point out that urllib2 does support https proxying now. Make sure the proxy supports it too, or you risk running into a bug that puts the Python library into an infinite loop (this happened to me).
See the unittest in the python source that is testing https proxying support for further information:
http://svn.python.org/view/python/branches/release26-maint/Lib/test/test_urllib2.py?r1=74203&r2=74202&pathrev=74203
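For reference, once on a fixed Python (2.6.3+) the working pattern is just to register the same proxy for both schemes; note that the proxy URL itself normally stays http:// even for the https entry, since the client issues a CONNECT through it. A minimal sketch with the placeholder proxy values from the question:
import urllib2

proxy_info = {'host': "axxx", 'port': "1234"}        # placeholders, as in the question
proxy_url = "http://%(host)s:%(port)s" % proxy_info  # build once, reuse for both schemes
opener = urllib2.build_opener(
    urllib2.ProxyHandler({"http": proxy_url, "https": proxy_url}),
    urllib2.HTTPCookieProcessor())
urllib2.install_opener(opener)
print urllib2.urlopen('https://correct.url.to.login.page.com/').read()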
