I need to use Python to access data from a RESTful web service that requires certificate-based client authentication (PKI) over SSL/HTTPS. What is the recommended way of doing this?
The suggestion by stribika using httplib.HTTPSConnection should work for you provided that you do not need to verify the server's certificate. If you do want/need to verify the server, you'll need to look at a 3rd party module such as pyOpenSSL (which is a Python wrapper around a subset of the OpenSSL library).
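For reference, here is a minimal sketch of that approach on current Python, where httplib lives on as http.client and the standard-library ssl module carries both the client certificate and (by default) server verification; the host name and file paths are placeholders.

import http.client
import ssl

# Build an SSL context that presents our client certificate during the handshake.
# create_default_context() also verifies the server's certificate against the
# system CA store by default.
context = ssl.create_default_context()
context.load_cert_chain(certfile="client.pem", keyfile="client.key")  # placeholder paths

conn = http.client.HTTPSConnection("api.example.com", context=context)  # placeholder host
conn.request("GET", "/resource")
response = conn.getresponse()
print(response.status, response.read())
conn.close()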
I found this: http://code.activestate.com/recipes/117004/
I did not try it so it may not work.
I would recommend using M2Crypto. If you are a Twisted guy, M2Crypto integrates with Twisted so you can let Twisted handle the networking stuff and M2Crypto the SSL/verification/validation stuff.
I would like to load-test a SignalR service using Locust. I found that the following library can send and receive SignalR requests: https://pypi.org/project/signalrcore/
Now, according to the Locust docs, the next step would be to write a custom client for Locust that can send SignalR requests. But there is the following warning:
Any protocol libraries that you use must be gevent-friendly (use the Python socket module or some other standard library function like subprocess), or your calls are likely to block the whole Locust/Python process.

Some C libraries cannot be monkey patched by gevent, but allow for other workarounds. For example, if you want to use psycopg2 to performance test PostgreSQL, you can use psycogreen.
I am a beginner in Python, so I don't understand exactly what this means. The library "signalrcore" I am using is 100% synchronous. Does that mean I can't use it with Locust?
I found a fork of signalrcore that uses asyncio. Should I use that fork instead and just make sure all my SignalR calls are non-blocking?
Thanks!
SignalRCore seems to use requests and websocket-client under the hood, both of which are gevent-friendly. I can't say for sure, but I'd give it a 90% probability that it will work "out of the box" :)
If you do use the asyncio one you’d need to do some magic yourself. At least I have never combined that with gevent.
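If it does work out of the box, a custom Locust user wrapping signalrcore could look roughly like the sketch below. The hub URL and method names are placeholders, the signalrcore calls (HubConnectionBuilder / start / send) follow its README, and the request-event reporting assumes Locust 2.x; treat it as a starting point, not a tested client.

import time

from locust import User, task, between
from signalrcore.hub_connection_builder import HubConnectionBuilder


class SignalRUser(User):
    wait_time = between(1, 3)
    host = "https://example.com"  # placeholder

    def on_start(self):
        # Build and open the hub connection once per simulated user.
        self.hub = (
            HubConnectionBuilder()
            .with_url(self.host + "/chathub")  # placeholder hub endpoint
            .build()
        )
        self.hub.start()

    def on_stop(self):
        self.hub.stop()

    @task
    def send_message(self):
        start = time.perf_counter()
        exception = None
        try:
            # Invoke a hub method; "SendMessage" and its arguments are placeholders.
            self.hub.send("SendMessage", ["locust", "hello"])
        except Exception as exc:
            exception = exc
        # Report the call to Locust's statistics.
        self.environment.events.request.fire(
            request_type="SignalR",
            name="SendMessage",
            response_time=(time.perf_counter() - start) * 1000,
            response_length=0,
            exception=exception,
        )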
I'd like to expose a simple TCP server written in Python to the internet. To authenticate clients, I'd like to rely on both client and server certificates. Does socketserver.TCPServer support this mode by default? If not, can you suggest how to extend the server to implement mutual authentication?
The standard socketserver library doesn't handle secure sockets (SSL/TLS) by itself. Assuming you want to use that specific library no matter what, here's another discussion that shows a way to do it using the OpenSSL libraries.
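For completeness, here is a rough sketch of that idea using the standard-library ssl module instead of pyOpenSSL: each accepted socket is wrapped with a server-side SSL context that requires a valid client certificate. The certificate/key paths and the port are placeholders.

import socketserver
import ssl


class EchoHandler(socketserver.StreamRequestHandler):
    def handle(self):
        data = self.rfile.readline()
        self.wfile.write(data)  # echo back whatever the client sent


class TLSTCPServer(socketserver.TCPServer):
    def __init__(self, server_address, handler_class, context):
        self.context = context
        super().__init__(server_address, handler_class)

    def get_request(self):
        # Wrap each accepted socket in TLS; the handshake enforces the
        # client-certificate requirement configured on the context below.
        newsocket, fromaddr = self.socket.accept()
        return self.context.wrap_socket(newsocket, server_side=True), fromaddr


if __name__ == "__main__":
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain(certfile="server.pem", keyfile="server.key")
    context.load_verify_locations(cafile="clients-ca.pem")
    context.verify_mode = ssl.CERT_REQUIRED  # reject clients without a valid certificate

    with TLSTCPServer(("0.0.0.0", 8443), EchoHandler, context) as server:
        server.serve_forever()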
If you want to write a server application, you might want to use Twisted, an event-oriented framework for writing network applications in Python. Here's the relevant documentation on how to enable SSL for a TCP server.
Is there any way to do DNS resolution in Python with asyncore?
I can't install adns, and I don't want to use the gevent library.
(For URL downloading, gevent gives me slower performance than asyncore.loop.)
I asked google your question, and it told me about http://subvert-rpki.hactrn.net/trunk/rpkid/rpki/adns.py which seems to be a DNS client implementation based on asyncore. So, there you have it, there is a way.
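To give an idea of the shape of such a client, here is a very rough sketch of a non-blocking DNS query driven by asyncore over UDP. The query packet is built by hand and the raw response is only printed, not parsed; the resolver address and name are placeholders, and asyncore itself was removed in Python 3.12, so this assumes an older interpreter.

import asyncore
import socket
import struct


def build_query(name, query_id=0x1234):
    # Header: ID, flags (recursion desired), 1 question, 0 answer/authority/additional records.
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    qname = b"".join(bytes([len(label)]) + label.encode("ascii")
                     for label in name.split("."))
    question = qname + b"\x00" + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question


class DNSLookup(asyncore.dispatcher):
    def __init__(self, server, name):
        asyncore.dispatcher.__init__(self)
        self.create_socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.connect((server, 53))  # for UDP this just fixes the peer address
        self.query = build_query(name)

    def handle_connect(self):
        pass

    def writable(self):
        return self.query is not None

    def handle_write(self):
        self.send(self.query)
        self.query = None

    def handle_read(self):
        response = self.recv(512)
        print("got %d bytes of DNS response" % len(response))
        self.close()


DNSLookup("8.8.8.8", "example.com")
asyncore.loop()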
Recently, AWS announced ElastiCache's auto-discovery feature, although they only officially released a client for Java. Does anyone know of a Python Memcached library with support for this feature?
As far as I know, an ElastiCache cluster is just a bunch of memcached servers, so you need to give your memcached client the list of all of your servers and have the client do the relevant load balancing (a small sketch follows this answer).
For Python, you have a couple of options:
pylibmc - which is a wrapper around libmemcached - one of the best and fastest memcached clients there is
python-memcached - a native Python client - very basic, but easy to work with, install and use
Unfortunately, they haven't yet provided a Python client that handles the new auto-discovery feature.
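A minimal sketch of that manual approach with python-memcached: hand the client the full node list yourself (the node hostnames below are placeholders) and let it handle key distribution.

import memcache

# List every cache node explicitly; with auto discovery unavailable, this list
# has to be updated whenever nodes are added or removed.
nodes = [
    "node-0001.example.cache.amazonaws.com:11211",
    "node-0002.example.cache.amazonaws.com:11211",
]
mc = memcache.Client(nodes)
mc.set("foo", "bar")
print(mc.get("foo"))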
There is a library for Django: django-elasticache.
You can find more details in my answer to the same question here.
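For reference, wiring it up is just a CACHES entry pointing at the cluster's configuration endpoint. The backend dotted path below follows the django-elasticache README and the endpoint is a placeholder; verify both against the version you install.

# settings.py
CACHES = {
    "default": {
        "BACKEND": "django_elasticache.memcached.ElastiCache",
        "LOCATION": "mycluster.abcdef.cfg.use1.cache.amazonaws.com:11211",
    }
}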
I wrote a package that can do ElastiCache auto discovery:
https://github.com/yupeng820921/elasticache_pyclient
Yes, elasticache_pyclient supports AWS ElastiCache auto discovery.
It uses python-memcached to implement the memcache commands and hash_ring to implement consistent hashing.
# Usage (from documentation)
>>> from elasticache_pyclient import MemcacheClient
>>> mc = MemcacheClient('test.lwgyhw.cfg.usw2.cache.amazonaws.com:11211')
>>> mc.set('foo', 'bar')
True
>>> mc.get('foo')
'bar'
I have worked with Django for a while but I am new to XML-RPC. I have two Django servers running, and the first needs to call functions from some modules of the second server. I find XML-RPC the easiest way to do this, but I don't want to run a separate server just for that.
What options do I have? Can I run Django's web server and an XML-RPC server with a single manage.py runserver command?
Easily: we use http://code.djangoproject.com/wiki/XML-RPC to add an XML-RPC server to our Django server.
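That wiki page boils down to dispatching XML-RPC requests from an ordinary Django view, so no separate server process is needed. A condensed sketch (Python 3 imports; the view name, registered function, and URL routing are up to you, and _marshaled_dispatch is technically a private method):

# views.py - expose an XML-RPC endpoint from a plain Django view.
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt
from xmlrpc.server import SimpleXMLRPCDispatcher

dispatcher = SimpleXMLRPCDispatcher(allow_none=False, encoding=None)


def multiply(a, b):
    """Example RPC method; register whatever functions you need the same way."""
    return a * b


dispatcher.register_function(multiply, "multiply")


@csrf_exempt  # XML-RPC clients will not send a CSRF token
def rpc_handler(request):
    # Hand the raw POST body to the dispatcher and return the marshalled reply.
    response = HttpResponse(content_type="text/xml")
    response.write(dispatcher._marshaled_dispatch(request.body))
    return response

Point a URL pattern at rpc_handler in urls.py and the same manage.py runserver process answers both regular HTTP requests and XML-RPC calls.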
You may also consider David Fisher's rpc4django, which supports both XMLRPC and JSONRPC within a single package (a short usage sketch follows the feature list). Features include:
Detects request type (JSONRPC or XMLRPC) based on content
Easy identification of RPC methods via a decorator
Pure python and requires no external modules except Django
Customizable RPC method documentation including reST
Supports XMLRPC and JSONRPC introspection
Supports method signatures (unlike SimpleXMLRPCServer)
Easy installation and integration with existing Django projects
Ties in with Django’s authentication and authorization
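Based on the feature list above, exposing a method comes down to decorating it; a small sketch follows (the rpcmethod import and keyword names follow rpc4django's documentation, but double-check them against the version you install).

from rpc4django import rpcmethod


@rpcmethod(name="calc.add", signature=["int", "int", "int"])
def add(a, b):
    """Adds two integers and returns the result."""
    return a + b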
Try: http://pypi.python.org/pypi/django-xmlrpc