I have an XML-RPC server that looks like the following:
from SimpleXMLRPCServer import SimpleXMLRPCServer

def add(x, y):
    return x + y

server = SimpleXMLRPCServer(("localhost", 8000))
server.register_function(add, 'add.numbers')
server.serve_forever()
which is used within the following code:
import xmlrpclib

class DeviceProxy(object):
    def __init__(self, uri):
        self.rpc = xmlrpclib.ServerProxy(uri)

    def __getattr__(self, attr):
        return getattr(self.rpc, attr)

original = DeviceProxy.__getattr__

def mygetattr(device, attr):
    def wrapper(*args, **kw):
        print('called with %r and %r' % (args, kw))
        return original(device, attr)(*args, **kw)
    return wrapper

DeviceProxy.__getattr__ = mygetattr

dev = DeviceProxy("http://localhost:8000/RPC2")
print dev.add.numbers(4, 6)
As you can see, the Proxy class wraps the XML-RPC proxy for reasons outside the scope of this question, forwarding arbitrary calls via the __getattr__ method. For further reasons outside the scope of this question, I need to wrap/replace this __getattr__ method with a different method, e.g. to print out the name of the function called, the arguments, etc. (see related question here).
But this approach does not work; it gives the following error:
AttributeError: 'function' object has no attribute 'numbers'
The example works as expected when I:
- do not replace DeviceProxy.__getattr__ with something else
- replace DeviceProxy.__getattr__ with the function

      def dummy(device, attr):
          return original(device, attr)

- replace the name of the XML-RPC function with an undotted name (e.g. just add instead of add.numbers)
You can verify for yourself that the following direct call via the xmlrpclib proxy works as expected:
dev = xmlrpclib.ServerProxy("http://localhost:8000/RPC2")
print dev.add.numbers(4,6)
My question: how can I wrap/overwrite DeviceProxy.__getattr__ correctly so that I can see the function called, all arguments, etc., WITHOUT making changes to the XML-RPC server or the DeviceProxy class?
I can see two problems here:
1. Are all attributes of a DeviceProxy functions? If they're not, then you're sometimes returning a function where an object is expected.
2. When you wrap the function, you're not copying across members; use functools.wraps to achieve that.
This ought to work
from functools import wraps

@wraps(original)  # probably not needed, but sensible
def mygetattr(device, key):
    attr = original(device, key)
    if callable(attr):
        @wraps(attr)  # copy across __name__, __dict__ etc.
        def wrapper(*args, **kw):
            print('called with %r and %r' % (args, kw))
            return attr(*args, **kw)
        return wrapper
    else:  # handle (or rather, don't) non-callable attributes
        return attr
Related
I'm implementing a RESTful web service in Python and would like to add some QoS logging functionality by intercepting function calls and logging their execution time and so on.
Basically I thought of a class from which all other services can inherit, which automatically overrides the default method implementations and wraps them in a logger function. What's the best way to achieve this?
Something like this? This implicitly adds a decorator to your method (you can also make an explicit decorator based on this if you prefer that):
class Foo(object):
    def __getattribute__(self, name):
        attr = object.__getattribute__(self, name)
        if hasattr(attr, '__call__'):
            def newfunc(*args, **kwargs):
                print('before calling %s' % attr.__name__)
                result = attr(*args, **kwargs)
                print('done calling %s' % attr.__name__)
                return result
            return newfunc
        else:
            return attr
when you now try something like:
class Bar(Foo):
    def myFunc(self, data):
        print("myFunc: %s" % data)

bar = Bar()
bar.myFunc(5)
You'll get:
before calling myFunc
myFunc: 5
done calling myFunc
What if you write a decorator on each function? Here is an example on Python's wiki.
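For illustration, a minimal per-function logging decorator along those lines might look like this (names such as log_timing and handle_request are made up for the example):

    import time
    from functools import wraps

    def log_timing(func):
        # log each call and its execution time
        @wraps(func)
        def wrapper(*args, **kwargs):
            start = time.time()
            result = func(*args, **kwargs)
            print('%s(%r, %r) took %.3f s' % (func.__name__, args, kwargs, time.time() - start))
            return result
        return wrapper

    @log_timing
    def handle_request(data):
        return data * 2

    handle_request(21)   # prints something like: handle_request((21,), {}) took 0.000 s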
Do you use any web framework for your web service, or are you doing everything by hand?
Assume I have to unit test methodA, defined in the following class:
class SomeClass(object):
    def wrapper(fun):
        def _fun(self, *args, **kwargs):
            self.b = 'Original'
            fun(self, *args, **kwargs)
        return _fun

    @wrapper
    def methodA(self):
        pass
My test class is as follows:
from mock import patch

class TestSomeClass(object):
    def testMethodA(self):
        def mockDecorator(f):
            def _f(self, *args, **kwargs):
                self.b = 'Mocked'
                f(self, *args, **kwargs)
            return _f

        with patch('some_class.SomeClass.wrapper', mockDecorator):
            from some_class import SomeClass
            s = SomeClass()
            s.methodA()
            assert s.b == 'Mocked', 's.b is equal to %s' % s.b
If I run the test, I hit the assertion:
File "/home/klinden/workinprogress/mockdecorators/test_some_class.py", line 17, in testMethodA
assert s.b == 'Mocked', 's.b is equal to %s' % s.b
AssertionError: s.b is equal to Original
If I stick a breakpoint in the test after patching, I can see that wrapper has been mocked out just fine, but that methodA still references the old wrapper:
(Pdb) p s.wrapper
<bound method SomeClass.mockDecorator of <some_class.SomeClass object at 0x7f9ed1bf60d0>>
(Pdb) p s.methodA
<bound method SomeClass._fun of <some_class.SomeClass object at 0x7f9ed1bf60d0>>
Any idea of what the problem is here?
After mulling it over, I've found a solution.
Since monkey patching seems not to be effective (and I've also tried a few other solutions), I dug into the function internals, and that proved to be fruitful.
Python 3
You're lucky - just use the wraps decorator, which creates a __wrapped__ attribute, which in turn contains the wrapped function. See the linked answers above for more details.
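A quick, generic illustration of that (Python 3, names made up for the example):

    from functools import wraps

    def wrapper(fun):
        @wraps(fun)                      # on Python 3 this also sets inner.__wrapped__ = fun
        def inner(*args, **kwargs):
            return fun(*args, **kwargs)
        return inner

    @wrapper
    def method_a():
        return 'original'

    assert method_a.__wrapped__() == 'original'   # the undecorated function is reachable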
Python 2
Even if you use @wraps, no fancy attribute is created.
However, you just need to realise that the wrapper method does nothing but create a closure: so you'll be able to find your wrapped function in its func_closure attribute.
In the original example, the wrapped function would be at: s.methodA.im_func.func_closure[0].cell_contents
Wrapping up (ha!)
I created a getWrappedFunction helper along these lines to ease my testing:
@staticmethod
def getWrappedFunction(wrapper):
    return wrapper.im_func.func_closure[0].cell_contents
YMMV, especially if you do fancy stuff and include other objects in the closure.
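For example, in the test from the question (Python 2, assuming the helper lives on TestSomeClass), the decorator can then be bypassed entirely:

    from some_class import SomeClass

    wrapped = TestSomeClass.getWrappedFunction(SomeClass.methodA)
    s = SomeClass()
    wrapped(s)                     # calls the original, undecorated methodA
    assert not hasattr(s, 'b')     # the wrapper never ran, so s.b was never set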
I'm trying to write a library that will register an arbitrary list of service calls from multiple service endpoints to a container. I intend to implement the service calls in classes, written one per service. Is there a way to maintain the boundedness of the methods from the service classes when registering them to the container (so they will still have access to the instance data of their owning object instance), or must I register the whole object and then write some sort of pass-through in the container class with __getattr__ or some such to access the methods within instance context?
container:
class ServiceCalls(object):
    def __init__(self):
        self._service_calls = {}

    def register_call(self, name, call):
        if name not in self._service_calls:
            self._service_calls[name] = call

    def __getattr__(self, name):
        if name in self._service_calls:
            return self._service_calls[name]
services:
class FooSvc(object):
    def __init__(self, endpoint):
        self.endpoint = endpoint

    def fooize(self, *args, **kwargs):
        # call fooize service call with args/kwargs utilizing self.endpoint
        pass

    def fooify(self, *args, **kwargs):
        # call fooify service call with args/kwargs utilizing self.endpoint
        pass

class BarSvc(object):
    def __init__(self, endpoint):
        self.endpoint = endpoint

    def barize(self, *args, **kwargs):
        # call barize service call with args/kwargs utilizing self.endpoint
        pass

    def barify(self, *args, **kwargs):
        # call barify service call with args/kwargs utilizing self.endpoint
        pass
implementation code:
foosvc = FooSvc('fooendpoint')
barsvc = BarSvc('barendpoint')
calls = ServiceCalls()
calls.register_call('fooize', foosvc.fooize)
calls.register_call('fooify', foosvc.fooify)
calls.register_call('barize', barsvc.barize)
calls.register_call('barify', barsvc.barify)

calls.fooize(args)
I think this answers your question:
In [2]: f = 1 .__add__
In [3]: f(3)
Out[3]: 4
You won't need the staticmethod function when adding these functions to classes, because they are effectively already "staticed".
What you are trying to do will work fine, as you can see by running your own code. :)
The object foosvc.fooize is called a "bound method" in Python, and it contains both a reference to foosvc and a reference to the function FooSvc.fooize. If you call the bound method, the reference to self will be passed implicitly as the first parameter.
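A tiny demonstration of that, assuming the FooSvc class from the question (with the pass bodies above, so the call itself just returns None):

    foosvc = FooSvc('fooendpoint')
    bound = foosvc.fooize           # no instance needed at call time
    bound('x', key='y')             # self is passed implicitly as foosvc
    print bound.im_self is foosvc   # True on Python 2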
On a side note, __getattr__() shouldn't silently return None for invalid attribute names. Better to use this:
def __getattr__(self, name):
    try:
        return self._service_calls[name]
    except KeyError:
        raise AttributeError
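Putting it together, a compact sketch of the whole flow might look like this (it assumes the ServiceCalls container from the question with the __getattr__ fix above, and uses a variant of FooSvc whose fooize body returns its inputs purely for demonstration):

    class FooSvc(object):
        def __init__(self, endpoint):
            self.endpoint = endpoint

        def fooize(self, *args, **kwargs):
            # demonstration body: echo the endpoint and the arguments
            return (self.endpoint, args, kwargs)

    foosvc = FooSvc('fooendpoint')
    calls = ServiceCalls()
    calls.register_call('fooize', foosvc.fooize)

    print calls.fooize(1, two=2)   # ('fooendpoint', (1,), {'two': 2})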
I don't understand the use case for this -- it seems to me that the easy, simple, idiomatic way to accomplish this is to just pass in an object.
But: program to the interface, not the implementation. Only assume that the object has the method you need -- don't touch the internals or any other methods.
I have a class that I wish to expose as a remote service using Python's SimpleXMLRPCServer. The server startup looks like this:
server = SimpleXMLRPCServer((serverSettings.LISTEN_IP,serverSettings.LISTEN_PORT))
service = Service()
server.register_instance(service)
server.serve_forever()
I then have a ServiceRemote class that looks like this:
class ServiceRemote(object):
    def __init__(self, ip, port):
        self.rpcClient = xmlrpclib.Server('http://%s:%d' % (ip, port))

    def __getattr__(self, name):
        # forward all calls to the rpc client
        return getattr(self.rpcClient, name)
So all calls on the ServiceRemote object will be forwarded to xmlrpclib.Server, which then forwards it to the remote server. The problem is a method in the service that takes named varargs:
@useDb
def select(self, db, fields, **kwargs):
    pass
The @useDb decorator wraps the function, creating the db and opening it before the call, then closing it after the call is done, before returning the result.
When I call this method, I get the error "__call__() got an unexpected keyword argument 'name'". So, is it possible to call methods taking variable named arguments remotely, or will I have to create an override for each method variation I need?
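For context, a @useDb decorator of the shape described above might look roughly like this (purely hypothetical; the real database helpers are not shown in the question):

    from functools import wraps

    class _FakeDb(object):
        """Stand-in for whatever database handle the real decorator opens."""
        def close(self):
            pass

    def useDb(fun):
        # hypothetical reconstruction: open a db, inject it as the second
        # argument, and close it once the wrapped call has returned
        @wraps(fun)
        def wrapper(self, *args, **kwargs):
            db = _FakeDb()
            try:
                return fun(self, db, *args, **kwargs)
            finally:
                db.close()
        return wrapper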
Thanks for the responses. I changed my code around a bit, so the question is no longer an issue. However, now I know this for future reference if I do need to implement positional arguments and support remote invocation. I think a combination of Thomas's and praptak's approaches would be good: turning kwargs into positional args on the client through xmlrpclib, and having a wrapper on methods server-side to unpack positional arguments.
You can't do this with plain xmlrpc since it has no notion of keyword arguments. However, you can superimpose this as a protocol on top of xmlrpc that always passes a list as the first argument and a dictionary as the second, and then provide the proper support code so this becomes transparent for your usage; example below:
Server
from SimpleXMLRPCServer import SimpleXMLRPCServer

class Server(object):
    def __init__(self, hostport):
        self.server = SimpleXMLRPCServer(hostport)

    def register_function(self, function, name=None):
        def _function(args, kwargs):
            return function(*args, **kwargs)
        _function.__name__ = function.__name__
        self.server.register_function(_function, name)

    def serve_forever(self):
        self.server.serve_forever()

# example usage
server = Server(('localhost', 8000))

def test(arg1, arg2):
    print 'arg1: %s arg2: %s' % (arg1, arg2)
    return 0

server.register_function(test)
server.serve_forever()
Client
import xmlrpclib

class ServerProxy(object):
    def __init__(self, url):
        self._xmlrpc_server_proxy = xmlrpclib.ServerProxy(url)

    def __getattr__(self, name):
        call_proxy = getattr(self._xmlrpc_server_proxy, name)
        def _call(*args, **kwargs):
            return call_proxy(args, kwargs)
        return _call

# example usage
server = ServerProxy('http://localhost:8000')
server.test(1, 2)
server.test(arg2=2, arg1=1)
server.test(1, arg2=2)
server.test(*[1, 2])
server.test(**{'arg1': 1, 'arg2': 2})
XML-RPC doesn't really have a concept of 'keyword arguments', so xmlrpclib doesn't try to support them. You would need to pick a convention, then modify xmlrpclib._Method to accept keyword arguments and pass them along using that convention.
For instance, I used to work with an XML-RPC server that passed keyword arguments as two arguments, '-KEYWORD' followed by the actual argument, in a flat list. I no longer have access to the code I wrote to access that XML-RPC server from Python, but it was fairly simple, along the lines of:
import xmlrpclib

_orig_Method = xmlrpclib._Method

class KeywordArgMethod(_orig_Method):
    def __call__(self, *args, **kwargs):
        if args and kwargs:
            raise TypeError, "Can't pass both positional and keyword args"
        args = list(args)
        for key in kwargs:
            args.append('-%s' % key.upper())
            args.append(kwargs[key])
        return _orig_Method.__call__(self, *args)

xmlrpclib._Method = KeywordArgMethod
It uses monkeypatching because that's by far the easiest way to do this, given some clunky uses of module globals and name-mangled attributes (__request, for instance) in the ServerProxy class.
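For completeness, a hypothetical server-side helper that undoes this '-KEYWORD' convention might look like the following (the original server code is not available, so this is only a sketch of the described convention):

    def split_keyword_args(flat_args):
        # split a flat ['-KEY', value, ...] argument list back into
        # positional arguments and a kwargs dict (keys lowercased)
        positional, kwargs = [], {}
        i = 0
        while i < len(flat_args):
            arg = flat_args[i]
            if isinstance(arg, str) and arg.startswith('-'):
                kwargs[arg[1:].lower()] = flat_args[i + 1]
                i += 2
            else:
                positional.append(arg)
                i += 1
        return positional, kwargs

    # e.g. split_keyword_args([1, '-NAME', 'foo']) -> ([1], {'name': 'foo'})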
As far as I know, the underlying protocol doesn't support named varargs (or any named args, for that matter). The workaround for this is to create a wrapper that will take the **kwargs and pass it as an ordinary dictionary to the method you want to call. Something like this:
Server side:
def select_wrapper(self, db, fields, kwargs):
    """accepts an ordinary dict which can pass through xmlrpc"""
    return select(self, db, fields, **kwargs)
On the client side:
def select(self, db, fields, **kwargs):
    """you can call it with keyword arguments and they will be packed into a dict"""
    return self.rpcClient.select_wrapper(self, db, fields, kwargs)
Disclaimer: the code shows the general idea; you can do it a bit more cleanly (for example, by writing a decorator to do that).
Using the above advice, I created some working code.
Server method wrapper:
def unwrap_kwargs(func):
    def wrapper(*args, **kwargs):
        print args
        if args and isinstance(args[-1], list) and len(args[-1]) == 2 and "kwargs" == args[-1][0]:
            return func(*args[:-1], **args[-1][1])
        else:
            return func(*args, **kwargs)
    return wrapper
Client setup (do once):
import xmlrpclib

_orig_Method = xmlrpclib._Method

class KeywordArgMethod(_orig_Method):
    def __call__(self, *args, **kwargs):
        args = list(args)
        if kwargs:
            args.append(("kwargs", kwargs))
        return _orig_Method.__call__(self, *args)

xmlrpclib._Method = KeywordArgMethod
I tested this, and it supports methods with fixed, positional, and keyword arguments.
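For reference, the server-side wiring for this might look like the sketch below (select here is just a stand-in for the real service method):

    from SimpleXMLRPCServer import SimpleXMLRPCServer

    def select(db, fields, **kwargs):
        # stand-in implementation that just echoes its inputs
        return 'db=%s fields=%s kwargs=%s' % (db, fields, sorted(kwargs.items()))

    server = SimpleXMLRPCServer(("localhost", 8000))
    server.register_function(unwrap_kwargs(select), 'select')
    server.serve_forever()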
As Thomas Wouters said, XML-RPC does not have keyword arguments. Only the order of arguments matters as far as the protocol is concerned and they can be called anything in XML: arg0, arg1, arg2 is perfectly fine, as is cheese, candy and bacon for the same arguments.
Perhaps you should simply rethink your use of the protocol? Using something like document/literal SOAP would be much better than a workaround such as the ones presented in other answers here. Of course, this may not be feasible.