Python extension methods

OK, in C# we have something like:
public static string Destroy(this string s) {
    return "";
}
So basically, when you have a string you can do:
str = "This is my string to be destroyed";
newstr = str.Destroy()
# instead of
newstr = Destroy(str)
Now this is cool because, in my opinion, it's more readable. Does Python have something similar? I mean, instead of writing this:
x = SomeClass()
div = x.getMyDiv()
span = x.FirstChild(x.FirstChild(div)) # so instead of this
I'd like to write:
span = div.FirstChild().FirstChild() # which is more readable to me
Any suggestion?

You can just modify the class directly; this is sometimes known as monkey patching.
def MyMethod(self):
    return self + self

MyClass.MyMethod = MyMethod
del MyMethod  # clean up the namespace
I'm not 100% sure you can do this on a special class like str, but it's fine for your user-defined classes.
Update
You confirmed in a comment my suspicion that this is not possible for a built-in like str, in which case I believe there is no analogue to C# extension methods for such classes.
Finally, the convenience of these methods, in both C# and Python, comes with an associated risk. Using these techniques can make code more complex to understand and maintain.
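If you only need the nicer call syntax in a few places, one common workaround for built-ins (a hedged sketch, not part of the answer above) is to subclass str and construct your strings through the subclass:

class MyStr(str):
    def destroy(self):
        return ""

s = MyStr("This is my string to be destroyed")
print(s.destroy())  # prints an empty string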

You can do what you have asked for like this:
def extension_method(self):
    # do stuff
    ...

SomeClass.extension_method = extension_method
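For instance, a hedged concrete version of this, using a made-up Vector class:

class Vector:
    def __init__(self, x, y):
        self.x, self.y = x, y

def length_squared(self):
    return self.x ** 2 + self.y ** 2

Vector.length_squared = length_squared  # attach the function as a method

print(Vector(3, 4).length_squared())  # 25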

I would use the Adapter pattern here. So, let's say we have a Person class and in one specific place we would like to add some health-related methods.
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    height: float  # in meters
    mass: float  # in kg

class PersonMedicalAdapter:
    person: Person

    def __init__(self, person: Person):
        self.person = person

    def __getattr__(self, item):
        return getattr(self.person, item)

    def get_body_mass_index(self) -> float:
        return self.person.mass / self.person.height ** 2

if __name__ == '__main__':
    person = Person('John', height=1.7, mass=76)
    person_adapter = PersonMedicalAdapter(person)

    print(person_adapter.name)  # Call to a Person object field
    print(person_adapter.get_body_mass_index())  # Call to a wrapper object method
I consider it to be an easy-to-read, yet flexible and pythonic solution.

You can change the built-in classes by monkey-patching with the help of forbidden fruit.
But installing forbidden fruit requires a C compiler and an unrestricted environment, so it probably will not work, or will take hard effort to get running, on Google App Engine, Heroku, etc.
I changed the behaviour of the unicode class in Python 2.7 for the Turkish i/I uppercase/lowercase problem with this library.
# -*- coding: utf8 -*-
# Redesigned by @guneysus
import __builtin__
from forbiddenfruit import curse

lcase_table = tuple(u'abcçdefgğhıijklmnoöprsştuüvyz')
ucase_table = tuple(u'ABCÇDEFGĞHIİJKLMNOÖPRSŞTUÜVYZ')

def upper(data):
    data = data.replace('i', u'İ')
    data = data.replace(u'ı', u'I')
    result = ''
    for char in data:
        try:
            char_index = lcase_table.index(char)
            ucase_char = ucase_table[char_index]
        except ValueError:
            ucase_char = char
        result += ucase_char
    return result

curse(__builtin__.unicode, 'upper', upper)

class unicode_tr(unicode):
    """For backward compatibility"""
    def __init__(self, *args, **kwargs):
        super(unicode_tr, self).__init__(*args, **kwargs)

if __name__ == '__main__':
    print u'istanbul'.upper()

You can achieve this nicely with the following context manager that adds the method to the class or object inside the context block and removes it afterwards:
class extension_method:

    def __init__(self, obj, method):
        method_name = method.__name__
        setattr(obj, method_name, method)
        self.obj = obj
        self.method_name = method_name

    def __enter__(self):
        return self.obj

    def __exit__(self, type, value, traceback):
        # remove this if you want to keep the extension method after context exit
        delattr(self.obj, self.method_name)
Usage is as follows:
class C:
    pass

def get_class_name(self):
    return self.__class__.__name__

with extension_method(C, get_class_name):
    assert hasattr(C, 'get_class_name')  # the method is added to C
    c = C()
    print(c.get_class_name())  # prints 'C'

assert not hasattr(C, 'get_class_name')  # the method is gone from C

I'd like to think that extension methods in C# are pretty much the same as a normal method call where you pass the instance and then the arguments:
instance.method(*args, **kwargs)
method(instance, *args, **kwargs)  # pretty much the same as above
so I don't see much benefit in getting this implemented in Python.
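A small sketch of that equivalence, with a throwaway class:

class Greeter:
    def greet(self, name):
        return "hello " + name

g = Greeter()
assert g.greet("world") == Greeter.greet(g, "world")  # same call, two spellings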

After a week, I have a solution that is closest to what I was seeking. It relies on getattr and __getattr__. Here is an example for those who are interested.
class myClass:
    def __init__(self): pass

    def __getattr__(self, attr):
        try:
            methodToCall = getattr(myClass, attr)
            return methodToCall(myClass(), self)
        except:
            pass

    def firstChild(self, node):
        # bla bla bla
        pass

    def lastChild(self, node):
        # bla bla bla
        pass

x = myClass()
div = x.getMyDiv()
y = div.firstChild.lastChild
I haven't tested this example; I just give it to convey the idea to whoever might be interested. Hope that helps.

C# implemented extension methods because it lacks first-class functions; Python has them, and they are the preferred way of "wrapping" common functionality across disparate classes in Python.
There are good reasons to believe Python will never have extension methods; simply look at the available built-ins:
len(o) calls o.__len__
iter(o) calls o.__iter__
next(o) calls o.__next__ (o.next in Python 2)
format(o, s) calls o.__format__(s)
Basically, Python likes functions.
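As a minimal illustration of that protocol style (a sketch, not part of the answer): implement the dunder method and the built-in function works on your class, with no extension method needed.

class Playlist:
    def __init__(self, tracks):
        self.tracks = tracks

    def __len__(self):
        return len(self.tracks)

print(len(Playlist(["a", "b", "c"])))  # 3, via Playlist.__len__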

Related

How to initialize a python class variable outside of functions?

I have a Python class:
class MyClass:
    my_class_variable: str = Optional[None]

    @classmethod
    def initialize(cls):
        cls.my_class_variable = cls.some_function()
I plan to use it like:
x = MyClass.my_class_variable
How can I guarantee that my_class_variable has been initialized with a value, e.g. how can I force initialize() to be called?
You could do something like:
def dec(cls):
    cls.my_class_var = cls.some_func()
    return cls

@dec
class MyClass:
    my_class_var = ""

    @classmethod
    def some_func(cls):
        return "Cool :)"

print(MyClass.my_class_var)  # --> Cool :)
Another option would be to use metaprogramming (a metaclass), but as long as there is only one simple thing to do, I would use a decorator :)
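For completeness, a hedged sketch of that metaclass alternative (the metaclass name and the eager-initialization rule are assumptions):

class InitMeta(type):
    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        if "some_func" in namespace:
            # initialize the class variable eagerly, at class-creation time
            cls.my_class_var = cls.some_func()
        return cls

class MyClass(metaclass=InitMeta):
    @classmethod
    def some_func(cls):
        return "Cool :)"

print(MyClass.my_class_var)  # Cool :)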

Method overloading for different argument type in python

I'm writing a preprocessor in python, part of which works with an AST.
There is a render() method that takes care of converting various statements to source code.
Now, I have it like this (shortened):
def render(self, s):
    """ Render a statement by type. """
    # code block (used in structures)
    if isinstance(s, S_Block):
        # delegate to a private method that does the work
        return self._render_block(s)
    # empty statement
    if isinstance(s, S_Empty):
        return self._render_empty(s)
    # a function declaration
    if isinstance(s, S_Function):
        return self._render_function(s)
    # ...
As you can see, it's tedious, prone to errors and the code is quite long (I have many more kinds of statements).
The ideal solution would be (in Java syntax):
String render(S_Block s)
{
    // render block
}

String render(S_Empty s)
{
    // render empty statement
}

String render(S_Function s)
{
    // render function statement
}
// ...
Of course, Python can't do this, because it has dynamic typing. When I searched for how to mimic method overloading, all the answers just said "You don't want to do that in Python". I guess that is true in some cases, but here kwargs is really not useful at all.
How would I do this in Python, without the hideous kilometre-long sequence of type-checking ifs shown above? Also, preferably in a "pythonic" way?
Note: There can be multiple "Renderer" implementations, which render the statements in different manners. I can't therefore move the rendering code to the statements and just call s.render(). It must be done in the renderer class.
(I've found some interesting "visitor" code, but I'm not sure if it's really the thing I want).
If you're using Python 3.4 (or are willing to install the backport for Python 2.6+), you can use functools.singledispatch for this*:
from functools import singledispatch

class S_Block(object): pass
class S_Empty(object): pass
class S_Function(object): pass

class Test(object):
    def __init__(self):
        self.render = singledispatch(self.render)
        self.render.register(S_Block, self._render_block)
        self.render.register(S_Empty, self._render_empty)
        self.render.register(S_Function, self._render_function)

    def render(self, s):
        raise TypeError("This type isn't supported: {}".format(type(s)))

    def _render_block(self, s):
        print("render block")

    def _render_empty(self, s):
        print("render empty")

    def _render_function(self, s):
        print("render function")

if __name__ == "__main__":
    t = Test()
    b = S_Block()
    f = S_Function()
    e = S_Empty()
    t.render(b)
    t.render(f)
    t.render(e)
Output:
render block
render function
render empty
*Code based on this gist.
Would something like this work?
self.map = {
    S_Block: self._render_block,
    S_Empty: self._render_empty,
    S_Function: self._render_function,
}

def render(self, s):
    return self.map[type(s)](s)
Keeping a reference to a class object as a key in a dictionary, with its value being the function object you want to call, will make your code shorter and less error-prone. The only place an error could occur here is in the definition of the dictionary. Or in one of your internal functions, of course.
The overloading syntax you are looking for can be achieved using Guido van Rossum's multimethod decorator.
Here is a variant of the multimethod decorator which can decorate class methods (the original decorates plain functions). I've named the variant multidispatch to disambiguate it from the original:
import functools

def multidispatch(*types):
    def register(function):
        name = function.__name__
        mm = multidispatch.registry.get(name)
        if mm is None:
            @functools.wraps(function)
            def wrapper(self, *args):
                types = tuple(arg.__class__ for arg in args)
                function = wrapper.typemap.get(types)
                if function is None:
                    raise TypeError("no match")
                return function(self, *args)
            wrapper.typemap = {}
            mm = multidispatch.registry[name] = wrapper
        if types in mm.typemap:
            raise TypeError("duplicate registration")
        mm.typemap[types] = function
        return mm
    return register

multidispatch.registry = {}
and it can be used like this:
class Foo(object):
    @multidispatch(str)
    def render(self, s):
        print('string: {}'.format(s))

    @multidispatch(float)
    def render(self, s):
        print('float: {}'.format(s))

    @multidispatch(float, int)
    def render(self, s, t):
        print('float, int: {}, {}'.format(s, t))

foo = Foo()
foo.render('text')
# string: text
foo.render(1.234)
# float: 1.234
foo.render(1.234, 2)
# float, int: 1.234, 2
The demo code above shows how to overload the Foo.render method based on the types of its arguments.
This code searches for exact matching types as opposed to checking for isinstance relationships. It could be modified to handle that (at the expense of making the lookups O(n) instead of O(1)) but since it sounds like you don't need this anyway, I'll leave the code in this simpler form.
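For reference, a hedged sketch of that isinstance-aware fallback; the dispatch helper below is an assumption (it is not part of the original decorator) and could replace the lookup inside wrapper above:

def dispatch(typemap, self, *args):
    # Try an exact match first, then fall back to an O(n) isinstance scan.
    types = tuple(arg.__class__ for arg in args)
    function = typemap.get(types)
    if function is None:
        for registered, candidate in typemap.items():
            if len(registered) == len(types) and all(
                    isinstance(arg, t) for arg, t in zip(args, registered)):
                function = candidate
                break
    if function is None:
        raise TypeError("no match")
    return function(self, *args)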
An alternate implementation with functools.singledispatch, using the decorators as defined in PEP-443:
from functools import singledispatch

class S_Unknown: pass
class S_Block: pass
class S_Empty: pass
class S_Function: pass
class S_SpecialBlock(S_Block): pass

@singledispatch
def render(s, **kwargs):
    print('Rendering an unknown type')

@render.register(S_Block)
def _(s, **kwargs):
    print('Rendering an S_Block')

@render.register(S_Empty)
def _(s, **kwargs):
    print('Rendering an S_Empty')

@render.register(S_Function)
def _(s, **kwargs):
    print('Rendering an S_Function')

if __name__ == '__main__':
    for t in [S_Unknown, S_Block, S_Empty, S_Function, S_SpecialBlock]:
        print(f'Passing an {t.__name__}')
        render(t())
This outputs
Passing an S_Unknown
Rendering an unknown type
Passing an S_Block
Rendering an S_Block
Passing an S_Empty
Rendering an S_Empty
Passing an S_Function
Rendering an S_Function
Passing an S_SpecialBlock
Rendering an S_Block
I like this version better than the one with the map because it has the same behavior as the implementation that uses isinstance(): when you pass an S_SpecialBlock, it passes it to the renderer that takes an S_Block.
Availability
As mentioned by dano in another answer, this works in Python 3.4+ and there is a backport for Python 2.6+.
If you have Python 3.7+, the register() attribute supports using type annotations:
@render.register
def _(s: S_Block, **kwargs):
    print('Rendering an S_Block')
Note
The one problem I can see is that you have to pass s as a positional argument, which means you can't do render(s=S_Block()).
Since singledispatch uses the type of the first positional argument to figure out which version of render() to call, that would result in a TypeError: "render requires at least 1 positional argument" (cf. the source code).
Actually, I think it should be possible to use the keyword argument if there is only one... If you really need that then you can do something similar to this answer, which creates a custom decorator with a different wrapper.
It would be a nice feature of Python as well.
To add some performance measurements to @unutbu's answer:
@multimethod(float)
def foo(bar: float) -> str:
    return 'float: {}'.format(bar)

def foo_simple(bar):
    return 'string: {}'.format(bar)

import time

string_type = "test"
iterations = 10000000

start_time1 = time.time()
for i in range(iterations):
    foo(string_type)
end_time1 = time.time() - start_time1

start_time2 = time.time()
for i in range(iterations):
    foo_simple(string_type)
end_time2 = time.time() - start_time2

print("multimethod: " + str(end_time1))
print("standard: " + str(end_time2))
Returns:
> multimethod: 16.846999883651733
> standard: 4.509999990463257

How to automatically expose and decorate function versions of methods in Python?

I would like to expose the methods of a class as functions (after decoration) in my local scope. For example if I had a class and decorator:
def some_decorator(f):
    # ...transform f...
    return decorated_f

class C(object):
    def __init__(self, x):
        self.x = x

    def f(self, y):
        "Some doc string"
        return self.x + y

    def g(self, y, z):
        "Some other doc string"
        return self.x + y + z
and if I didn't care about automating the process, I could add the following code to my module:
@some_decorator
def f(C_instance, x):
    "Some doc string."
    return C_instance.f(x)

@some_decorator
def g(C_instance, x, y):
    "Some other doc string."
    return C_instance.g(x, y)
to the effect that the following evaluates to True:
c = C(0)
f(c,1) == c.f(1)
But I would like to be able to do this automatically. Something like:
my_funs = expose_methods(MyClass)
for funname, fun in my_funs.iteritems():
    locals()[funname] = some_decorator(fun)
foo = MyClass(data)
some_functionality(foo,*args) == foo.some_functionality(*args)
would do the trick (although it feels a little wrong declaring local variables this way). How can I do this in a way so that all the relevant attributes of the method correctly transform into the function versions? I would appreciate any suggestions.
P.S.
I am aware that I can decorate methods of class instances, but this is not really what I am after. It is more about (1) a preference for the function-call syntax and (2) the fact that my decorators make functions map over collections of objects in fancy and optimized ways. Getting behavior (2) by decorating methods would require my collection classes to inherit attributes from the objects they contain, which is orthogonal to the collection semantics.
Are you aware that you can use the unbound methods directly?
obj.foo(arg) is equivalent to ObjClass.foo(obj, arg)
class MyClass(object):
    def foo(self, arg):
        ...

obj = MyClass()
print obj.foo(3) == MyClass.foo(obj, 3) # True
See also Class method differences in Python: bound, unbound and static and the documentation.
You say you have a preference for the function syntax. You could just define all your methods outside the class instead, and they would work exactly as you desire.
class C(object):
    def __init__(self, x):
        self.x = x

def f(c, y):
    "Some doc string"
    return c.x + y

def g(c, y, z):
    "Some other doc string"
    return c.x + y + z
If you want them on the class as well, you can always:
for func in f, g:
    setattr(C, func.__name__, func)
Or with locals introspection instead of function name introspection:
for name in 'f', 'g':
    setattr(C, name, locals()[name])
Or with no introspection, which is arguably a lot simpler and easier to manage unless you have quite a lot of these methods/functions:
C.f = f
C.g = g
This also avoids the potential issue mentioned in the comments on codeape's answer about Python checking that the first argument of an unbound method is an instance of the class.
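Putting the pieces together, a hedged Python 3 sketch of the expose_methods helper from the question, using inspect.getmembers (in Python 3, methods looked up on the class are plain functions; the underscore filter is an assumption):

import inspect

def expose_methods(cls):
    # Return a {name: function} mapping of the class's public methods.
    return {
        name: func
        for name, func in inspect.getmembers(cls, predicate=inspect.isfunction)
        if not name.startswith("_")
    }

# Following the question's outline:
# for funname, fun in expose_methods(C).items():
#     globals()[funname] = some_decorator(fun)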

String construction using OOP and Proxy pattern

I find it very interesting how SQLAlchemy constructs query strings, e.g.:
(Session.query(model.User)
    .filter(model.User.age > 18)
    .order_by(model.User.age)
    .all())
As far as I can see, some kind of Proxy pattern is applied there. In my small project I need to do similar string construction using an OOP approach, so I tried to reproduce this behaviour.
Firstly, some kind of object, one of many similar objects:
class SomeObject(object):
    items = None

    def __init__(self):
        self.items = []

    def __call__(self):
        return ' '.join(self.items) if self.items is not None else ''

    def a(self):
        self.items.append('a')
        return self

    def b(self):
        self.items.append('b')
        return self
All methods of this object return self, so I can call them in any order and an unlimited number of times.
Secondly, a proxy object that forwards to the subject's methods whenever the requested attribute is not perform, the method which calls the object to produce the resulting string.
import operator

class Proxy(object):
    def __init__(self, some_object):
        self.some_object = some_object

    def __getattr__(self, name):
        self.method = operator.methodcaller(name)
        return self

    def __call__(self, *args, **kw):
        self.some_object = self.method(self.some_object, *args, **kw)
        return self

    def perform(self):
        return self.some_object()
And finally:
>>> obj = SomeObject()
>>> p = Proxy(obj)
>>> print p.a().a().b().perform()
a a b
What can you say about this implementation? Are there better ways to build the desired set of classes so that such string construction works with the same syntax?
PS: Sorry for my English, it's not my primary language.
Actually, what you are looking at is not the Proxy pattern but the Builder pattern, and yes, your implementation is IMHO the classic one (using the Fluent interface pattern).
I don't know what SQLAlchemy does, but I would implement the interface by having the Session.query() method return a Query object with methods like filter(), order_by(), all() etc. Each of these methods simply returns a new Query object taking into account the applied changes. This allows for method chaining as in your first example.
Your own code example has numerous problems. One example:
obj = SomeObject()
p = Proxy(obj)
a = p.a
b = p.b
print a().perform() # prints b
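One hedged way to avoid that aliasing problem is to have __getattr__ return a fresh callable instead of storing the pending method on the proxy; a minimal sketch:

import operator

class Proxy(object):
    def __init__(self, some_object):
        self.some_object = some_object

    def __getattr__(self, name):
        def call(*args, **kw):
            # apply the named method to the wrapped object and keep chaining
            self.some_object = operator.methodcaller(name, *args, **kw)(self.some_object)
            return self
        return call

    def perform(self):
        return self.some_object()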

Virtual classes: doing it right?

I have been reading documentation describing class inheritance, abstract base classes and even Python interfaces, but nothing seems to be exactly what I want: namely, a simple way of building virtual classes. When the virtual class gets called, I would like it to instantiate some more specific class based on the parameters it is given and hand that back to the calling function. For now I have a summary way of rerouting calls to the virtual class down to the underlying class.
The idea is the following:
class Shape:
    def __init__(self, description):
        if description == "It's flat":
            self.underlying_class = Line(description)
        elif description == "It's spiky":
            self.underlying_class = Triangle(description)
        elif description == "It's big":
            self.underlying_class = Rectangle(description)

    def number_of_edges(self, parameters):
        return self.underlying_class.number_of_edges(parameters)

class Line:
    def __init__(self, description):
        self.desc = description

    def number_of_edges(self, parameters):
        return 1

class Triangle:
    def __init__(self, description):
        self.desc = description

    def number_of_edges(self, parameters):
        return 3

class Rectangle:
    def __init__(self, description):
        self.desc = description

    def number_of_edges(self, parameters):
        return 4

shape_dont_know_what_it_is = Shape("It's big")
shape_dont_know_what_it_is.number_of_edges(parameters)
My rerouting is far from optimal, as only calls to the number_of_edges() function get passed on. Adding something like this to Shape doesn't seem to do the trick either:
def __getattr__(self, *args):
    return self.underlying_class.__getattr__(*args)
What am I doing wrong? Is the whole idea badly implemented? Any help greatly appreciated.
I agree with TooAngel, but I'd use the __new__ method.
class Shape(object):
    def __new__(cls, *args, **kwargs):
        if cls is Shape:  # <-- required because Line's __new__ method is the same as Shape's
            description, args = args[0], args[1:]
            if description == "It's flat":
                new_cls = Line
            else:
                raise ValueError("Invalid description: {}.".format(description))
        else:
            new_cls = cls
        return super(Shape, cls).__new__(new_cls, *args, **kwargs)

    def number_of_edges(self):
        return "A shape can have many edges…"
class Line(Shape):
    def number_of_edges(self):
        return 1

class SomeShape(Shape):
    pass
>>> l1 = Shape("It's flat")
>>> l1.number_of_edges()
1
>>> l2 = Line()
>>> l2.number_of_edges()
1
>>> u = SomeShape()
>>> u.number_of_edges()
'A shape can have many edges…'
>>> s = Shape("Hexagon")
ValueError: Invalid description: Hexagon.
I would prefer doing it with a factory:
def factory(description):
    if description == "It's flat":
        return Line(description)
    elif description == "It's spiky":
        return Triangle(description)
    elif description == "It's big":
        return Rectangle(description)
or:
def factory(description):
    classDict = {
        "It's flat": Line("It's flat"),
        "It's spiky": Triangle("It's spiky"),
        "It's big": Rectangle("It's big"),
    }
    return classDict[description]
and inherit the classes from Shape
class Line(Shape):
    def __init__(self, description):
        self.desc = description

    def number_of_edges(self, parameters):
        return 1
Python doesn't have virtual classes out of the box. You will have to implement them yourself (it should be possible, Python's reflection capabilities should be powerful enough to let you do this).
However, if you need virtual classes, then why don't you just use a programming language which does have virtual classes like Beta, gBeta or Newspeak? (BTW: are there any others?)
In this particular case, though, I don't really see how virtual classes would simplify your solution, at least not in the example you have given. Maybe you could elaborate why you think you need virtual classes?
Don't get me wrong: I like virtual classes, but the fact that only three languages have ever implemented them, only one of those three is still alive and exactly 0 of those three are actually used by anybody is somewhat telling …
You can change the class with object.__class__, but it's much better to just make a function that returns an instance of an arbitrary class.
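For illustration, a minimal and hedged sketch of that object.__class__ approach (the function-based factory really is the cleaner option):

class Shape(object):
    def __init__(self, description):
        self.desc = description
        if description == "It's flat":
            # rebind this instance to the more specific class
            self.__class__ = Line

class Line(Shape):
    def number_of_edges(self):
        return 1

print(Shape("It's flat").number_of_edges())  # 1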
On another note, unless you are using Python 3, all classes should inherit from object, like this, otherwise you end up with an old-style class:
class A(object):
    pass
