How to do Obj-C Categories in Python?

Obj-C (which I have not used for a long time) has something called categories to extend classes. If you declare a category with new methods and compile it into your program, all instances of the class suddenly have the new methods.
Python has mixin possibilities, which I use, but mixins must be used from the bottom of the program: the class has to declare it itself.
Foreseen category use-case: Say you have a big class hierarchy that describes different ways of interacting with data, declaring polymorphic ways to get at different attributes. Now a category can help the consumer of these describing classes by implementing a convenient interface to access these methods in one place. (A category method could, for example, try two different methods and return the first defined (non-None) return value.)
Any way to do this in Python?
Illustrative code
I hope this clarifies what I mean. The point is that the Category is like an aggregate interface that the consumer of AppObj can change in its own code.
class AppObj (object):
    """This is the top of a big hierarchy of subclasses that describe different data"""
    def get_resource_name(self):
        pass
    def get_resource_location(self):
        pass

# dreaming up class decorator syntax
@category(AppObj)
class AppObjCategory (object):
    """this is a category on AppObj, not a subclass"""
    def get_resource(self):
        name = self.get_resource_name()
        if name:
            return library.load_resource_name(name)
        else:
            return library.load_resource(self.get_resource_location())

Why not just add methods dynamically?
>>> class Foo(object):
...     pass
...
>>> def newmethod(instance):
...     print 'Called:', instance
...
>>> Foo.newmethod = newmethod
>>> f = Foo()
>>> f.newmethod()
Called: <__main__.Foo object at 0xb7c54e0c>
I know Objective-C and this looks just like categories. The only drawback is that you can't do that to built-in or extension types.
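For illustration, here is a quick sketch of that drawback (the function name is made up; the exact error message is CPython's and may vary by version): assigning an attribute on a built-in type fails outright, so built-ins cannot be extended this way.
def shout(self):
    return self.upper() + "!"

try:
    str.shout = shout          # monkey-patching a built-in type
except TypeError as exc:
    print(exc)                 # e.g. can't set attributes of built-in/extension type 'str'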

I came up with this implementation of a class decorator. I'm using Python 2.5 so I haven't actually tested it with decorator syntax (which would be nice), and I'm not sure that what it does is really correct. But it looks like this:
pycategories.py
"""
This module implements Obj-C-style categories for classes for Python
Copyright 2009 Ulrik Sverdrup <ulrik.sverdrup#gmail.com>
License: Public domain
"""
def Category(toclass, clobber=False):
"""Return a class decorator that implements the decorated class'
methods as a Category on the class #toclass
if #clobber is not allowed, AttributeError will be raised when
the decorated class already contains the same attribute.
"""
def decorator(cls):
skip = set(("__dict__", "__module__", "__weakref__", "__doc__"))
for attr in cls.__dict__:
if attr in toclass.__dict__:
if attr in skip:
continue
if not clobber:
raise AttributeError("Category cannot override %s" % attr)
setattr(toclass, attr, cls.__dict__[attr])
return cls
return decorator
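For completeness, a hypothetical usage sketch (the Greeter classes are toy examples, not from the original post), assuming the decorator above is importable from pycategories and you are on Python 2.6 or 2.7 where class decorator syntax exists; on Python 3 the decorator would also need to skip attributes such as __qualname__.
from pycategories import Category

class Greeter(object):
    def name(self):
        return "world"

@Category(Greeter)
class GreeterCategory(object):
    def greet(self):
        return "Hello, %s" % self.name()

print(Greeter().greet())   # -> Hello, world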

Python's setattr function makes this easy.
# categories.py
class category(object):
    def __init__(self, mainModule, override=True):
        self.mainModule = mainModule
        self.override = override

    def __call__(self, function):
        if self.override or function.__name__ not in dir(self.mainModule):
            setattr(self.mainModule, function.__name__, function)

# categories_test.py
import this
from categories import category

@category(this)
def all():
    print "all things are this"

this.all()
# output: all things are this

Related

How should I create properties using a closure in python?

I'm doing some coding for Maya with PyMel and I'm trying to create some properties in my rig class to wrap some PyMel code. The code for all of the properties was pretty similar, so I figured this would be a good place to use a closure.
import pymel.core as pm
import riggermortis.utils as utils

class RigModule(object):
    def __init__(self):
        # blah blah irrelevant code here
        pass

    def createRootProperty(attrName):
        def getter(self):
            return pm.getAttr(self.root+'.'+attrName)
        def setter(self, nodeToLink):
            if self.root.hasAttr(attrName):
                pm.setAttr(self.root+'.'+attrName, nodeToLink)
            else:
                utils.linkNodes(self.root, nodeToLink, attrName)
        return property(fget=getter, fset=setter)

    hookConstraint = createRootProperty('hookConstraint')
    unhookTarget = createRootProperty('unhookTarget')
    moduleGrp = createRootProperty('moduleGrp')
    hookGrp = createRootProperty('hookGrp')
Functionally it works, but Eclipse/PyDev is telling me that my 'createRootProperty' function needs 'self' as its first argument so I'm wondering if what I'm doing is incorrect.
For what you're doing, a closure is not really needed except for cleanliness. The linter thinks it's an incorrectly formatted member function, even though it's doing what you want.
You can just move the function out of the class scope and the linter will stop complaining -- you can rename the function with a leading underscore so nobody accidentally thinks it is a tool rather than a piece of infrastructure.
If you expect to do this a lot, you could automate it into a metaclass that reads a list of names from a class field and creates properties as appropriate. There's a more detailed example of that strategy here, but in essence the metaclass will get a copy of the class dictionary when the class is defined and it has the opportunity to mess with the definition before it gets compiled. You can create a property easily at that step:
def createRootProperty(name):
    # this is a dummy, but as long as
    # the return from this function
    # is a property descriptor you're good
    @property
    def me(self):
        return name, self.__class__
    return me

class PropertyMeta(type):
    # this gets called when a class using this meta is
    # first compiled. It gives you a chance to intervene
    # in the class creation process
    def __new__(cls, name, bases, properties):
        # if the class has a 'PROPS' member, it's a list
        # of properties to add
        roots = properties.get('PROPS', [])
        for r in roots:
            properties[r] = createRootProperty(r)
            print ("# added property '{}' to {}".format(r, name))
        return type.__new__(cls, name, bases, properties)

class RigModule(object):
    __metaclass__ = PropertyMeta
    PROPS = ['arm', 'head', 'leg']

    def __init__(self):
        pass

test = RigModule()
print test.arm

class Submodule(RigModule):
    # metaclass-added properties are inheritable
    pass

test2 = Submodule()
print test2.leg

class NewProperties(RigModule):
    # they can also be augmented in derived classes
    PROPS = ['nose', 'mouth']

print NewProperties().nose
print NewProperties().arm

# added property 'arm' to RigModule
# added property 'head' to RigModule
# added property 'leg' to RigModule
# ('arm', <class '__main__.RigModule'>)
# ('leg', <class '__main__.Submodule'>)
# added property 'nose' to NewProperties
# added property 'mouth' to NewProperties
# ('nose', <class '__main__.NewProperties'>)
# ('arm', <class '__main__.NewProperties'>)
Metaclasses get a bad rep -- sometimes deservedly so -- for adding complexity. Don't use them when a simpler approach will do. But for boilerplate reduction in cases like this they are a great tool.

python - register all subclasses

I have a class Step, from which I want to derive many sub-classes. I want every class deriving from Step to be "registered" under a name I choose for it (not the class's name), so I can later call Step.getStepTypeByName().
Something like this, only working :):
class Step(object):
    _STEPS_BY_NAME = {}

    @staticmethod
    def REGISTER(cls, name):
        _STEPS_BY_NAME[name] = cls

class Derive1(Step):
    REGISTER(Derive1, "CustomDerive1Name")
    ...

class Derive2(Step):
    REGISTER(Derive2, "CustomDerive2Name")
    ...
Your solution does not work, for three reasons.
The first one is that _STEPS_BY_NAME only exists as an attribute of the Step class, so Step.REGISTER cannot access _STEPS_BY_NAME without a reference to the Step class. IOW you have to make it a classmethod (cf. below).
The second one is that you need to explicitly use Step.REGISTER(cls) - the name REGISTER does not exist outside the Step class.
The third reason is that within a class statement's body, the class object has not yet been created nor bound to its name, so you cannot reference the class itself at this point.
IOW, you'd want this instead:
class Step(object):
    _STEPS_BY_NAME = {}
    # NB: by convention, "ALL_UPPER" names denote pseudo-constants

    @classmethod
    def register(cls, stepclass, name):
        # here `cls` is the class register() is called on (Step)
        cls._STEPS_BY_NAME[name] = stepclass

class Derive1(Step):
    ...

Step.register(Derive1, "CustomDerive1Name")

class Derive2(Step):
    ...

Step.register(Derive2, "CustomDerive2Name")
Now with a minor modification to Step.register you could use it as a class decorator, making things much clearer:
class Step(object):
    _STEPS_BY_NAME = {}

    @classmethod
    def register(cls, name):
        def _register(stepclass):
            cls._STEPS_BY_NAME[name] = stepclass
            return stepclass
        return _register

@Step.register("CustomDerive1Name")
class Derive1(Step):
    ...

@Step.register("CustomDerive2Name")
class Derive2(Step):
    ...
As a last note: unless you have a compelling reason to register your subclasses in the base class itself, it might be better to use module-level variables and functions (a Python module is actually a kind of singleton):
# steps.py
class Step(object):
    # ....
    pass

_STEPS_BY_NAME = {}

def register(name):
    def _register(cls):
        _STEPS_BY_NAME[name] = cls
        return cls
    return _register

def get_step_class(name):
    return _STEPS_BY_NAME[name]
And in your other modules
import steps

@steps.register("CustomDerive1Name")
class Derive1(steps.Step):
    # ...
The point here is to avoid giving too many responsibilities to your Step class. I don't know your concrete use case so I can't tell which design best fits your needs, but I've been using this last one on quite a few projects and it has always worked fine so far.
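A hypothetical lookup sketch for the module-level variant, assuming the steps.py module above and that the module defining Derive1 has already been imported so the register decorator has run:
import steps

step_cls = steps.get_step_class("CustomDerive1Name")   # -> the Derive1 class
step = step_cls()                                       # instantiate it by its registered name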
You are close. Use this:
class Step(object):
    pass

class Derive1(Step):
    pass

class Derive2(Step):
    pass

_STEPS_BY_NAME = {
    'foo': Step,
    'bar': Derive1,
    'baz': Derive2,
}

def get_step_by_name(name):
    return _STEPS_BY_NAME[name]
Warning: there might be better approaches depending on what you are trying to achieve. Such a mapping from strings to classes is a maintenance nightmare. If you want to rename a class, you would have to remember to change it in multiple places. You won't get any autocomplete help from your IDE either.

How can I refer to the currently being defined class? [duplicate]

For a recursive function we can do:
def f(i):
    if i < 0: return
    print i
    f(i-1)

f(10)
However, is there a way to do the following?
class A:
    # do something
    some_func(A)
    # ...
If I understand your question correctly, you should be able to reference class A within class A by putting the type annotation in quotes. This is called a forward reference.
class A:
    # do something
    def some_func(self, a: 'A'):
        # ...
See ref below
https://github.com/python/mypy/issues/3661
https://www.youtube.com/watch?v=AJsrxBkV3kc
In Python you cannot reference the class in the class body, although in languages like Ruby you can do it.
In Python you can instead use a class decorator, but that will only be called once the class has been created. Another way could be to use a metaclass, but it depends on what you are trying to achieve.
You can't, with the specific syntax you're describing, because of the time at which names are evaluated. The example function works because the call to f(i-1) within the function body is not resolved until the function is actually called; at that point f exists in the enclosing scope, since the function has already been defined. In the class example, the reference to the class name is looked up while the class definition is still being evaluated, so it does not yet exist in the local scope.
Alternatively, the desired behavior can be accomplished using a metaclass, like so:
class MetaA(type):
    def __init__(cls, name, bases, dct):
        super(MetaA, cls).__init__(name, bases, dct)
        some_func(cls)

class A(object):
    __metaclass__ = MetaA
    # do something
    # ...
Using this approach you can perform arbitrary operations on the class object at the time that the class is evaluated.
Maybe you could try calling __class__.
Right now I'm writing a code that calls a class method from within the same class.
It is working well so far.
I'm creating the class methods using something like:
@classmethod
def my_class_method(cls):
    return None
And calling them by using:
x = __class__.my_class_method()
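To make that concrete, here is a small self-contained sketch (Python 3 only; the Example class is made up): inside any method body the compiler provides a __class__ cell referring to the enclosing class, the same mechanism that makes zero-argument super() work.
class Example:
    @classmethod
    def my_class_method(cls):
        return cls.__name__

    def which_class(self):
        # __class__ refers to the class this method is defined in
        return __class__.my_class_method()

print(Example().which_class())   # -> Example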
It seems most of the answers here are outdated. From Python 3.7:
from __future__ import annotations
Example:
$ cat rec.py
from __future__ import annotations

class MyList:
    def __init__(self, e):
        self.data = [e]

    def add(self, e):
        self.data.append(e)
        return self

    def score(self, other: MyList):
        return len([e
                    for e in self.data
                    if e in other.data])

print(MyList(8).add(3).add(4).score(MyList(4).add(9).add(3)))

$ python3.7 rec.py
2
Nope. It works in a function because the function contents are executed at call-time. But the class contents are executed at define-time, at which point the class doesn't exist yet.
It's not normally a problem because you can hack further members into the class after defining it, so you can split up a class definition into multiple parts:
class A(object):
    spam = 1

some_func(A)

A.eggs = 2

def _A_scramble(self):
    self.spam = self.eggs = 0
A.scramble = _A_scramble
It is, however, pretty unusual to want to call a function on the class in the middle of its own definition. It's not clear what you're trying to do, but chances are you'd be better off with decorators (or the relatively new class decorators).
There isn't a way to do that within the class scope, not unless A was defined to be something else first (and then some_func(A) will do something entirely different from what you expect)
Unless you're doing some sort of stack inspection to add bits to the class, it seems odd that you'd want to do that. Why not just:
class A:
    # do something
    pass

some_func(A)
That is, run some_func on A after it's been made. Alternately, you could use a class decorator (syntax for it was added in 2.6) or metaclass if you wanted to modify class A somehow. Could you clarify your use case?
If you want to do just a little hacky thing do
class A(object):
    ...

some_func(A)
If you want to do something more sophisticated you can use a metaclass. A metaclass is responsible for manipulating the class object before it gets fully created. A template would be:
class AType(type):
    def __new__(meta, name, bases, dct):
        cls = super(AType, meta).__new__(meta, name, bases, dct)
        some_func(cls)
        return cls

class A(object):
    __metaclass__ = AType
    ...
type is the default metaclass. Instances of metaclasses are classes, so __new__ returns (in this case) the class A itself, suitably modified.
For more on metaclasses, see http://docs.python.org/reference/datamodel.html#customizing-class-creation.
If the goal is to call a function some_func with the class as an argument, one answer is to declare some_func as a class decorator. Note that the class decorator is called after the class is initialized. It will be passed the class that is being decorated as an argument.
def some_func(cls):
    # Do something
    print(f"The answer is {cls.x}")
    return cls  # Don't forget to return the class

@some_func
class A:
    x = 1
If you want to pass additional arguments to some_func you have to return a function from the decorator:
def some_other_func(prefix, suffix):
    def inner(cls):
        print(f"{prefix} {cls.__name__} {suffix}")
        return cls
    return inner

@some_other_func("Hello", "and goodbye!")
class B:
    x = 2
Class decorators can be composed, which results in them being called in the reverse order they are declared:
@some_func
@some_other_func("Hello", "and goodbye!")
class C:
    x = 42
The result of which is:
# Hello C and goodbye!
# The answer is 42
What do you want to achieve? It's possible to access a class to tweak its definition using a metaclass, but it's not recommended.
Your code sample can be written simply as:
class A(object):
    pass

some_func(A)
If you want to refer to the same object, just use 'self':
class A:
    def some_func(self):
        another_func(self)
If you want to create a new object of the same class, just do it:
class A:
    def some_func(self):
        foo = A()
If you want to have access to the class object itself (most likely not what you want), again, just do it:
class A:
    def some_func(self):
        another_func(A)  # note that it reads A, not A()
Do remember that in Python, type hinting is mostly for auto-completion: it helps the IDE infer types and warn the user before runtime. At runtime, type hints are almost never used (except in some cases), so you can do something like this:
from typing import Any, Optional, NewType

LinkListType = NewType("LinkList", object)

class LinkList:
    value: Any
    _next: LinkListType

    def set_next(self, ll: LinkListType):
        self._next = ll

if __name__ == '__main__':
    r = LinkList()
    r.value = 1
    r.set_next(ll=LinkList())
    print(r.value)
And as you can see, the IDE successfully infers its type as LinkList.
Note: since _next can be None, hinting this in the type would be better; I just didn't want to confuse the OP.
class LinkList:
    value: Any
    next: Optional[LinkListType]
It's ok to reference the name of the class inside its body (like inside method definitions) if it's actually in scope... Which it will be if it's defined at top level. (In other cases probably not, due to Python scoping quirks!).
For an illustration of the scoping gotcha, try to instantiate Foo:
class Foo(object):
    class Bar(object):
        def __init__(self):
            self.baz = Bar.baz
        baz = 15

    def __init__(self):
        self.bar = Foo.Bar()
(It's going to complain about the global name 'Bar' not being defined.)
Also, something tells me you may want to look into class methods: docs on the classmethod function (to be used as a decorator), a relevant SO question. Edit: Ok, so this suggestion may not be appropriate at all... It's just that the first thing I thought about when reading your question was stuff like alternative constructors etc. If something simpler suits your needs, steer clear of @classmethod weirdness. :-)
Most code in the class will be inside method definitions, in which case you can simply use the name A.
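A small illustration of that point, with a hypothetical class: by the time a method body runs, the class name is already bound in the enclosing module scope, so an alternative constructor or copy method can refer to it directly.
class A(object):
    def __init__(self, value):
        self.value = value

    def copy(self):
        # A is resolved at call time, after the class statement has finished
        return A(self.value)

b = A(1).copy()
print(b.value)   # -> 1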

How to Inherit multiple classes in python dynamically [duplicate]

This article has a snippet showing usage of __bases__ to dynamically change the inheritance hierarchy of some Python code, by adding a class to an existing class's collection of classes from which it inherits. OK, that's hard to read; code is probably clearer:
class Friendly:
    def hello(self):
        print 'Hello'

class Person: pass

p = Person()
Person.__bases__ = (Friendly,)
p.hello()  # prints "Hello"
That is, Person doesn't inherit from Friendly at the source level, but rather this inheritance relation is added dynamically at runtime by modification of the __bases__ attribute of the Person class. However, if you change Friendly and Person to be new style classes (by inheriting from object), you get the following error:
TypeError: __bases__ assignment: 'Friendly' deallocator differs from 'object'
A bit of Googling on this seems to indicate some incompatibilities between new-style and old style classes in regards to changing the inheritance hierarchy at runtime. Specifically: "New-style class objects don't support assignment to their bases attribute".
My question is: is it possible to make the above Friendly/Person example work using new-style classes in Python 2.7+, possibly by use of the __mro__ attribute?
Disclaimer: I fully realize that this is obscure code. I fully realize that in real production code tricks like this tend to border on unreadable. This is purely a thought experiment, and for funzies, to learn something about how Python deals with issues related to multiple inheritance.
Ok, again, this is not something you should normally do, this is for informational purposes only.
Where Python looks for a method on an instance object is determined by the __mro__ attribute of the class which defines that object (the Method Resolution Order attribute). Thus, if we could modify the __mro__ of Person, we'd get the desired behaviour. Something like:
setattr(Person, '__mro__', (Person, Friendly, object))
The problem is that __mro__ is a readonly attribute, and thus setattr won't work. Maybe if you're a Python guru there's a way around that, but clearly I fall short of guru status as I cannot think of one.
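For the record, this is roughly what the failed attempt looks like in CPython (the exact exception type and message may differ between versions, so treat this as an illustration):
class Friendly(object):
    def hello(self):
        print('Hello')

class Person(object):
    pass

try:
    setattr(Person, '__mro__', (Person, Friendly, object))
except (AttributeError, TypeError) as exc:
    print(exc)   # e.g. "readonly attribute"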
A possible workaround is to simply redefine the class:
def modify_Person_to_be_friendly():
    # so that we're modifying the global identifier 'Person'
    global Person
    # now just redefine the class using type(), specifying that the new
    # class should inherit from Friendly and have all attributes from
    # our old Person class
    Person = type('Person', (Friendly,), dict(Person.__dict__))
def main():
    modify_Person_to_be_friendly()
    p = Person()
    p.hello()  # works!
What this doesn't do is modify any previously created Person instances to have the hello() method. For example (just modifying main()):
def main():
    oldperson = Person()
    modify_Person_to_be_friendly()
    p = Person()
    p.hello()
    # works! But:
    oldperson.hello()
    # does not
If the details of the type call aren't clear, then read e-satis' excellent answer on 'What is a metaclass in Python?'.
I've been struggling with this too, and was intrigued by your solution, but Python 3 takes it away from us:
AttributeError: attribute '__dict__' of 'type' objects is not writable
I actually have a legitimate need for a decorator that replaces the (single) superclass of the decorated class. It would require too lengthy a description to include here (I tried, but couldn't get it to a reasonable length and limited complexity -- it came up in the context of the use by many Python applications of a Python-based enterprise server where different applications needed slightly different variations of some of the code.)
The discussion on this page and others like it provided hints that the problem of assigning to __bases__ only occurs for classes with no superclass defined (i.e., whose only superclass is object). I was able to solve this problem (for both Python 2.7 and 3.2) by defining the classes whose superclass I needed to replace as being subclasses of a trivial class:
## T is used so that the other classes are not direct subclasses of object,
## since classes whose base is object don't allow assignment to their __bases__ attribute.
class T: pass

class A(T):
    def __init__(self):
        print('Creating instance of {}'.format(self.__class__.__name__))

## ordinary inheritance
class B(A): pass

## dynamically specified inheritance
class C(T): pass

A()  # -> Creating instance of A
B()  # -> Creating instance of B
C.__bases__ = (A,)
C()  # -> Creating instance of C

## attempt at dynamically specified inheritance starting with a direct subclass
## of object doesn't work
class D: pass
D.__bases__ = (A,)
D()

## Result is:
## TypeError: __bases__ assignment: 'A' deallocator differs from 'object'
I cannot vouch for the consequences, but this code does what you want on Python 2.7.2.
class Friendly(object):
    def hello(self):
        print 'Hello'

class Person(object): pass

# we can't change the original classes, so we replace them
class newFriendly: pass
newFriendly.__dict__ = dict(Friendly.__dict__)
Friendly = newFriendly

class newPerson: pass
newPerson.__dict__ = dict(Person.__dict__)
Person = newPerson

p = Person()
Person.__bases__ = (Friendly,)
p.hello()  # prints "Hello"
We know that this is possible. Cool. But we'll never use it!
Right off the bat, all the caveats of messing with the class hierarchy dynamically are in effect.
But if it has to be done then, apparently, there is a hack that gets around the "deallocator differs from 'object'" issue when modifying the __bases__ attribute for new-style classes.
You can define an intermediate class
class Object(object): pass
which is an ordinary class created by the built-in metaclass type, and derive your own classes from it instead of from object directly.
That's it; now your new-style classes can modify __bases__ without any problem.
In my tests this actually worked very well: all existing instances of it and its derived classes (created before changing the inheritance) felt the effect of the change, including their MRO getting updated.
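A minimal sketch of the workaround this answer describes, in the spirit of the class T example above (CPython-specific behaviour; treat it as an illustration rather than a guarantee):
class Object(object):
    pass

class Friendly(Object):
    def hello(self):
        print('Hello')

class Person(Object):
    pass

p = Person()
Person.__bases__ = (Friendly,)   # no "deallocator differs" TypeError this time
p.hello()                        # existing instances pick up the new base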
I needed a solution for this which:
Works with both Python 2 (>= 2.7) and Python 3 (>= 3.2).
Lets the class bases be changed after dynamically importing a dependency.
Lets the class bases be changed from unit test code.
Works with types that have a custom metaclass.
Still allows unittest.mock.patch to function as expected.
Here's what I came up with:
def ensure_class_bases_begin_with(namespace, class_name, base_class):
    """ Ensure the named class's bases start with the base class.

        :param namespace: The namespace containing the class name.
        :param class_name: The name of the class to alter.
        :param base_class: The type to be the first base class for the
            newly created type.
        :return: ``None``.

        Call this function after ensuring `base_class` is
        available, before using the class named by `class_name`.
        """
    existing_class = namespace[class_name]
    assert isinstance(existing_class, type)

    bases = list(existing_class.__bases__)
    if base_class is bases[0]:
        # Already bound to a type with the right bases.
        return
    bases.insert(0, base_class)

    new_class_namespace = existing_class.__dict__.copy()
    # Type creation will assign the correct ‘__dict__’ attribute.
    del new_class_namespace['__dict__']

    metaclass = existing_class.__metaclass__
    new_class = metaclass(class_name, tuple(bases), new_class_namespace)

    namespace[class_name] = new_class
Used like this within the application:
# foo.py

# Type `Bar` is not available at first, so can't inherit from it yet.
class Foo(object):
    __metaclass__ = type

    def __init__(self):
        self.frob = "spam"

    def __unicode__(self): return "Foo"

# … later …

import bar

ensure_class_bases_begin_with(
        namespace=globals(),
        class_name=str('Foo'),   # `str` type differs on Python 2 vs. 3.
        base_class=bar.Bar)
Use like this from within unit test code:
# test_foo.py

""" Unit test for `foo` module. """

import unittest
import mock

import foo
import bar

ensure_class_bases_begin_with(
        namespace=foo.__dict__,
        class_name=str('Foo'),   # `str` type differs on Python 2 vs. 3.
        base_class=bar.Bar)

class Foo_TestCase(unittest.TestCase):
    """ Test cases for `Foo` class. """

    def setUp(self):
        patcher_unicode = mock.patch.object(
                foo.Foo, '__unicode__')
        patcher_unicode.start()
        self.addCleanup(patcher_unicode.stop)

        self.test_instance = foo.Foo()

        patcher_frob = mock.patch.object(
                self.test_instance, 'frob')
        patcher_frob.start()
        self.addCleanup(patcher_frob.stop)

    def test_instantiate(self):
        """ Should create an instance of `Foo`. """
        instance = foo.Foo()
The above answers are good if you need to change an existing class at runtime. However, if you are just looking to create a new class that inherits from some other class, there is a much cleaner solution. I got this idea from https://stackoverflow.com/a/21060094/3533440, but I think the example below better illustrates a legitimate use case.
def make_default(Map, default_default=None):
    """Returns a class which behaves identically to the given
    Map class, except it gives a default value for unknown keys."""
    class DefaultMap(Map):
        def __init__(self, default=default_default, **kwargs):
            self._default = default
            super().__init__(**kwargs)

        def __missing__(self, key):
            return self._default

    return DefaultMap

DefaultDict = make_default(dict, default_default='wug')

d = DefaultDict(a=1, b=2)
assert d['a'] == 1
assert d['b'] == 2
assert d['c'] == 'wug'
Correct me if I'm wrong, but this strategy seems very readable to me, and I would use it in production code. This is very similar to functors in OCaml.
This method isn't technically inheriting during runtime, since __mro__ can't be changed. But what I'm doing here is using __getattr__ to be able to access any attributes or methods from a certain class. (Read the comments in the order of the numbers placed before them; it makes more sense.)
class Sub:
    def __init__(self, f, cls):
        self.f = f
        self.cls = cls

    # 6) this method will pass the self parameter
    # (which is the original class object we passed)
    # and then it will fill in the rest of the arguments
    # using *args and **kwargs
    def __call__(self, *args, **kwargs):
        # 7) the multiple try / except statements
        # are for making sure if an attribute was
        # accessed instead of a function, the __call__
        # method will just return the attribute
        try:
            return self.f(self.cls, *args, **kwargs)
        except TypeError:
            try:
                return self.f(*args, **kwargs)
            except TypeError:
                return self.f

# 1) our base class
class S:
    def __init__(self, func):
        self.cls = func

    def __getattr__(self, item):
        # 5) we are wrapping the attribute we get in the Sub class
        # so we can implement the __call__ method there
        # to be able to pass the parameters in the correct order
        return Sub(getattr(self.cls, item), self.cls)

# 2) class we want to inherit from
class L:
    def run(self, s):
        print("run" + s)

# 3) we create an instance of our base class
# and then pass an instance (or just the class object)
# as a parameter to this instance
s = S(L)  # 4) in this case, I'm using the class object

s.run("1")
So this sort of substitution and redirection will simulate the inheritance of the class we wanted to inherit from. And it even works with attributes or methods that don't take any parameters.

How do I get a reference for the current class object?

In Python, how do I get a reference to the current class object within a class statement? Example:
def setup_class_members(cls, prefix):
    setattr(cls, prefix+"_var1", "hello")
    setattr(cls, prefix+"_var2", "goodbye")

class myclass(object):
    setup_class_members(cls, "coffee")   # How to get "cls"?

    def mytest(self):
        print(self.coffee_var1)
        print(self.coffee_var2)

x = myclass()
x.mytest()
# output:
# hello
# goodbye
Alternatives that I've written off are:
Use locals(): This gives a dict in a class statement that can be written to. This seems to work for classes; however, the documentation tells you not to do this. (I might be tempted to go with this alternative if someone can assure me that this will continue to work for some time.) A sketch of this approach appears after the PyQt4 example below.
Add members to the class object after the class statement: My actual application is to derive a PyQt4 QWidget class with dynamically created pyqtProperty class attributes. QWidget is unusual in that it has a custom metaclass. Very roughly, the metaclass compiles a list of pyqtProperties and stores it as an additional member. For this reason, properties that are added to the class after creation have no effect. An example to clear this up:
from PyQt4 import QtCore, QtGui

# works
class MyWidget1(QtGui.QWidget):
    myproperty = QtCore.pyqtProperty(int)

# doesn't work because QWidget's metaclass doesn't get to "compile" myproperty
class MyWidget2(QtGui.QWidget):
    pass
MyWidget2.myproperty = QtCore.pyqtProperty(int)
Please note that the above will work for most programming cases; my case just happens to be one of those unusual corner cases.
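For reference, here is a minimal sketch of the locals() approach written off above (the setup_members helper name is made up). It is CPython-specific and explicitly discouraged by the documentation, but it illustrates what "a dict in a class statement that can be written to" means:
def setup_members(namespace, prefix):
    # hypothetical variant that writes into a namespace dict instead of a class
    namespace[prefix + "_var1"] = "hello"
    namespace[prefix + "_var2"] = "goodbye"

class myclass(object):
    # locals() here is the namespace dict of the class being built (CPython detail)
    setup_members(locals(), "coffee")

    def mytest(self):
        print(self.coffee_var1)
        print(self.coffee_var2)

myclass().mytest()   # prints "hello" then "goodbye"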
For Python 3, the class must be declared as
class myclass(object, metaclass=Meta):
    prefix = "coffee"
    ...
A few other points:
The metaclass may be a callable, not just a class (Python 2&3)
If the base class of your class already has a non-standard metaclass, you have to make sure you call its __init__() and __new__() methods instead of type's.
The class statement accepts keyword parameters that are passed on to the metaclass (Python 3 only)
A rewrite of mouad's solution in Python 3 using all of the above is...
def MetaFun(name, bases, attr, prefix=None):
    if prefix:
        attr[prefix+"_var1"] = "hello"
        attr[prefix+"_var2"] = "goodbye"
    return object.__class__(name, bases, attr)

class myclass(object, metaclass=MetaFun, prefix="coffee"):
    def mytest(self):
        print(self.coffee_var1)
        print(self.coffee_var2)
AFAIK there are two ways to do what you want:
Using a metaclass; this will create your two variables at class creation time (which I think is what you want):
class Meta(type):
    def __new__(mcs, name, bases, attr):
        prefix = attr.get("prefix")
        if prefix:
            attr[prefix+"_var1"] = "hello"
            attr[prefix+"_var2"] = "goodbye"
        return type.__new__(mcs, name, bases, attr)

class myclass(object):
    __metaclass__ = Meta
    prefix = "coffee"

    def mytest(self):
        print(self.coffee_var1)
        print(self.coffee_var2)
Create your two class variables at instantiation time:
class myclass(object):
    prefix = "coffee"

    def __init__(self):
        setattr(self.__class__, self.prefix+"_var1", "hello")
        setattr(self.__class__, self.prefix+"_var2", "goodbye")

    def mytest(self):
        print(self.coffee_var1)
        print(self.coffee_var2)
N.B.: I'm not sure what you want to achieve, because if you want to create dynamic variables depending on the prefix variable, why are you accessing them the way you do in your mytest method? I hope it was just an example.
Two more approaches you might use:
A class decorator.
def setup_class_members(prefix):
    def decorator(cls):
        setattr(cls, prefix+"_var1", "hello")
        setattr(cls, prefix+"_var2", "goodbye")
        return cls
    return decorator

@setup_class_members("coffee")
class myclass(object):
    # ... etc
Especially if you need to add attributes in various combinations, the decorator approach is nice because it does not have any effect on inheritance.
If you are dealing with a small set of attributes that you wish to combine in various ways, you can use mixin classes. A mixin class is a regular class; it's just intended to "mix in" various attributes to some other class.
class coffee_mixin(object):
    coffee_var1 = "hello"
    coffee_var2 = "goodbye"

class tea_mixin(object):
    tea_var1 = "good morning old bean"
    tea_var2 = "pip pip cheerio"

class myclass(coffee_mixin, tea_mixin):
    # ... etc
See zope.interface.declarations._implements for an example of doing this kind of magic. Just be warned that it's a serious maintainability and portability risk.
