unit testing: patching an object's __base__ - python

Following on from this question, I attempted to patch class A() with Mock() so that when B() was initialised, the Mock was used as a base e.g.:
class A(object): ...
class B(A): ...

def setUp(self):
    with patch('A', new_callable=Mock) as MockObject:
        self.b = B()
        self.b.__class__.__base__ = MockObject
This doesn't work because __base__ is read-only. What's the correct way to go about doing this?
update:
>>> from mock import Mock
>>> class A(object):
...     pass
...
>>> class B(A):
...     pass
...
>>> b = B()
>>> b.__class__.__bases__ = (Mock,)
>>> b.__class__.__bases__
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/envs/myenv/local/lib/python2.7/site-packages/mock.py", line 656, in __getattr__
    elif self._mock_methods is not None:
  File "/opt/envs/myenv/local/lib/python2.7/site-packages/mock.py", line 655, in __getattr__
    raise AttributeError(name)
AttributeError: _mock_methods
To be clear, I'm not convinced this is the best way to achieve what I want to do; I'm half hoping someone else will come up with another way.

It's __bases__, which is a tuple.
Corrected version:
class A(object): ...
class B(A): ...

def setUp(self):
    with patch('A', new_callable=Mock) as MockObject:
        self.b = B()
        self.b.__class__.__bases__ = (MockObject,)
See:
>>> class Foo(object):
...     pass
...
>>> Foo.__bases__
(<type 'object'>,)
Tuples are immutable, but the __bases__ attribute is most certainly not read-only.
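For what it's worth, the __bases__ assignment itself does work (at least in CPython) when the replacement base is an ordinary class; it's the Mock class specifically that trips over it here. A minimal sketch with a plain stand-in class (FakeA is a made-up name):

class A(object):
    def who(self):
        return 'A'

class FakeA(object):
    def who(self):
        return 'FakeA'

class B(A):
    pass

B.__bases__ = (FakeA,)  # legal: old and new bases have compatible layouts
print(B().who())        # FakeA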

Related

When should one inherit from ABC?

I always thought one should inherit from abc.ABC when one does not want the class to be instantiated. But I've just realized that if a class has an @abstractmethod then one can also not instantiate it.
Is there any other reason to inherit from ABC?
Unless you use abc.ABCMeta as the metaclass for your class (either explicitly or by inheriting from abc.ABC), using abstractmethod doesn't really do anything.
>>> from abc import abstractmethod, ABC
>>> class Foo:
...     @abstractmethod
...     def bar(self):
...         pass
...
>>> f = Foo()
>>>
Likewise, using ABCMeta doesn't mean much unless you mark at least one method as abstract:
>>> class Bar(ABC):
...     pass
...
>>> b = Bar()
>>>
It's the combination of the two that allows a class to be (nominally) uninstantiable:
>>> class Baz(ABC):
...     @abstractmethod
...     def m(self):
...         pass
...
>>> b = Baz()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class Baz with abstract methods m
>>>
(Even then, note that all @abstractmethod does is add the decorated method to a set which the metaclass machinery consults when trying to instantiate the class. It is trivial to defeat that machinery:
>>> Baz.__abstractmethods__
frozenset({'m'})
>>> Baz.__abstractmethods__ = set()
>>> b = Baz()
>>>
)
Note that ABC itself is a trivial class that uses ABCMeta as its metaclass, which makes any of its descendants use it as well.
# Docstring omitted; see
# https://github.com/python/cpython/blob/3.7/Lib/abc.py#L166
# for the original
class ABC(metaclass=ABCMeta):
    __slots__ = ()
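A quick sketch of that equivalence (ViaABC and ViaMeta are made-up names): inheriting from ABC and naming ABCMeta as the metaclass produce the same kind of class.

from abc import ABC, ABCMeta, abstractmethod

class ViaABC(ABC):                 # gets ABCMeta by inheritance
    @abstractmethod
    def f(self): pass

class ViaMeta(metaclass=ABCMeta):  # names the metaclass explicitly
    @abstractmethod
    def f(self): pass

assert type(ViaABC) is ABCMeta and type(ViaMeta) is ABCMeta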
What chepner said, and also readability. Inheriting from ABC makes it clear to your readers what you're up to.
>>> from abc import ABC, abstractmethod
>>>
>>> class Foo:
...     @abstractmethod
...     def f(self):
...         pass
...
>>> class Bar(Foo):
...     pass
...
>>> Bar().f()
>>>
>>> class Baz(ABC):
...     @abstractmethod
...     def f(self):
...         pass
...
>>> class Quux(Baz):
...     pass
...
>>> Quux().f()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class Quux with abstract methods f

In Python, how to enforce an abstract method to be static on the child class?

This is the setup I want:
A should be an abstract base class with a static & abstract method f(). B should inherit from A. Requirements:
1. You should not be able to instantiate A
2. You should not be able to instantiate B, unless it implements a static f()
Taking inspiration from this question, I've tried a couple of approaches. With these definitions:
class abstractstatic(staticmethod):
    __slots__ = ()
    def __init__(self, function):
        super(abstractstatic, self).__init__(function)
        function.__isabstractmethod__ = True
    __isabstractmethod__ = True

class A:
    __metaclass__ = abc.ABCMeta
    @abstractstatic
    def f():
        pass

class B(A):
    def f(self):
        print 'f'

class A2:
    __metaclass__ = abc.ABCMeta
    @staticmethod
    @abc.abstractmethod
    def f():
        pass

class B2(A2):
    def f(self):
        print 'f'
Here A2 and B2 are defined using the usual Python conventions, while A and B are defined the way suggested in this answer. Below are some operations I tried, with their undesired results.
With classes A/B:
>>> B().f()
f
# This should have thrown, since B doesn't implement a static f()
With classes A2/B2:
>>> A2()
<__main__.A2 object at 0x105beea90>
# This should have thrown, since A2 should be an uninstantiable abstract class
>>> B2().f()
f
# This should have thrown, since B2 doesn't implement a static f()
Since neither of these approaches give me the output I want, how do I achieve what I want?
You can't do what you want with just ABCMeta. ABC enforcement doesn't do any type checking; only the presence of an attribute with the correct name is enforced.
Take for example:
>>> from abc import ABCMeta, abstractmethod, abstractproperty
>>> class Abstract(object):
...     __metaclass__ = ABCMeta
...     @abstractmethod
...     def foo(self): pass
...     @abstractproperty
...     def bar(self): pass
...
>>> class Concrete(Abstract):
...     foo = 'bar'
...     bar = 'baz'
...
>>> Concrete()
<__main__.Concrete object at 0x104b4df90>
I was able to construct Concrete() even though both foo and bar are simple attributes.
The ABCMeta metaclass only tracks how many objects are left with the __isabstractmethod__ attribute being true; when creating a class from the metaclass (ABCMeta.__new__ is called) the cls.__abstractmethods__ attribute is then set to a frozenset object with all the names that are still abstract.
type.__new__ then tests for that frozenset and throws a TypeError if you try to create an instance.
You'd have to produce your own __new__ method here; subclass ABCMeta and add type checking in a new __new__ method. That method should look for __abstractmethods__ sets on the base classes, find the corresponding objects with the __isabstractmethod__ attribute in the MRO, then do type checking on the current class attributes.
This'd mean that you'd throw the exception when defining the class, not when creating an instance, however. To raise at instantiation time instead, you'd add a __call__ method to your ABCMeta subclass and have it throw the exception based on information gathered by your own __new__ method about which types were wrong; a similar two-stage process to what ABCMeta and type.__new__ do at the moment. Alternatively, update the __abstractmethods__ set on the class to add any names that were implemented but with the wrong type, and leave it to type.__new__ to throw the exception.
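For illustration, a rough skeleton of that first, __call__-based alternative might look like this (ABCMetaCallCheck and __wrongtypes__ are invented names; the actual type bookkeeping described above is elided):

from abc import ABCMeta

class ABCMetaCallCheck(ABCMeta):
    def __new__(mcls, name, bases, namespace):
        cls = super(ABCMetaCallCheck, mcls).__new__(mcls, name, bases, namespace)
        # ... inspect the bases' __abstractmethods__ here and record any
        # names whose implementation in `namespace` has the wrong type ...
        cls.__wrongtypes__ = frozenset()  # placeholder for that bookkeeping
        return cls

    def __call__(cls, *args, **kwargs):
        # second stage: refuse to create instances of badly-typed classes
        if cls.__wrongtypes__:
            raise TypeError("wrongly typed implementations: %s"
                            % ", ".join(sorted(cls.__wrongtypes__)))
        return super(ABCMetaCallCheck, cls).__call__(*args, **kwargs)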
The following implementation takes that last tack, adding names back to __abstractmethods__ if the implemented type doesn't match (using a mapping):
from types import FunctionType

class ABCMetaTypeCheck(ABCMeta):
    _typemap = {  # map abstract type to expected implementation type
        abstractproperty: property,
        abstractstatic: staticmethod,
        # abstractmethods return function objects
        FunctionType: FunctionType,
    }
    def __new__(mcls, name, bases, namespace):
        cls = super(ABCMetaTypeCheck, mcls).__new__(mcls, name, bases, namespace)
        wrong_type = set()
        seen = set()
        abstractmethods = cls.__abstractmethods__
        for base in bases:
            for name in getattr(base, "__abstractmethods__", set()):
                if name in seen or name in abstractmethods:
                    continue  # still abstract or later overridden
                value = base.__dict__.get(name)  # bypass descriptors
                if getattr(value, "__isabstractmethod__", False):
                    seen.add(name)
                    expected = mcls._typemap[type(value)]
                    if not isinstance(namespace[name], expected):
                        wrong_type.add(name)
        if wrong_type:
            cls.__abstractmethods__ = abstractmethods | frozenset(wrong_type)
        return cls
With this metaclass you get your expected output:
>>> class Abstract(object):
...     __metaclass__ = ABCMetaTypeCheck
...     @abstractmethod
...     def foo(self): pass
...     @abstractproperty
...     def bar(self): pass
...     @abstractstatic
...     def baz(): pass
...
>>> class ConcreteWrong(Abstract):
...     foo = 'bar'
...     bar = 'baz'
...     baz = 'spam'
...
>>> ConcreteWrong()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class ConcreteWrong with abstract methods bar, baz, foo
>>>
>>> class ConcreteCorrect(Abstract):
...     def foo(self): return 'bar'
...     @property
...     def bar(self): return 'baz'
...     @staticmethod
...     def baz(): return 'spam'
...
>>> ConcreteCorrect()
<__main__.ConcreteCorrect object at 0x104ce1d10>

What is the correct way to derive a classmethod in python?

Recently, I encountered a problem with a metaclass calling a derived classmethod.
For example, here is a simple base class testA, which has a classmethod do1(a):
class testA(object):
    @classmethod
    def do1(cls, a):
        print "in testA:", cls, a
Then I build a metaclass which does nothing but print cls:
class testMetaA(type):
    def __init__(cls, cname, bases, cdict):
        print "in testMetaA: %s" % cls
Then I could use the metaclass to build a subclass testB, which works as expected:
class testB(testA):
    @classmethod
    def do1(cls, a):
        print "in testB: %s" % cls
        super(testB, cls).do1(a)
    __metaclass__ = testMetaA
It will print in testMetaA: <class '__main__.testB'>, and testB.do1(a) works as expected:
>>> testB.do1('hello')
in testB: <class '__main__.testB'>
in testA: <class '__main__.testB'> hello
However, if I try to call the classmethod inside the metaclass, where the classmethod contains a "super" call as in the following testMetaB, it will raise an error: NameError: global name 'testC' is not defined.
class testMetaB(type):
    def __init__(cls, cname, bases, cdict):
        print "in testMetaB: %s" % cls
        cls.do1("hello")

class testC(testA):
    @classmethod
    def do1(cls, a):
        print "in testC: %s" % cls
        super(testC, cls).do1(a)
    __metaclass__ = testMetaB
I finally found a way to solve it: use super(cls, cls) instead of super(testC, cls):
class testD(testA):
    @classmethod
    def do1(cls, a):
        print "in testD: %s" % cls
        super(cls, cls).do1(a)
    __metaclass__ = testMetaB
It will print as:
in testMetaB: <class '__main__.testD'>
in testD: <class '__main__.testD'>
in testA: <class '__main__.testD'> hello
The testD.do1(a) also works as expected:
>>> testD.do1('Well done')
in testD: <class '__main__.testD'>
in testA: <class '__main__.testD'> Well done
Now I am wondering which is the most correct way to use super in a classmethod. Should one always use super(cls, cls) instead of explicitly writing the current class name?
Thanks!
@jsbueno:
If some piece of code resorts to tricks like dynamically creating derived classes, that is important - one should not use the class name as the first parameter to super if that name may be bound to an object other than the class itself. Instead, cls for class methods, or self.__class__ for instance methods, can be passed to super.
Does this mean it is a bad idea to pass the class name to super in general?
Myself, I usually use super(type(self), self) instead of super(self.__class__, self) for normal methods. I do not know if there is any major advantage to using self.__class__.
I repeated @jsbueno's example like this; here C uses super(type(self), self), so D2() keeps working even when the class C gets redefined.
>>> class A(object):
...     def do(self):
...         print "in class A"
...
>>> class B(A):
...     def do(self):
...         super(B, self).do()
...
>>> class C(A):
...     def do(self):
...         super(type(self), self).do()
...
>>> D1 = B
>>> D2 = C
>>> D1().do()
in class A
>>> D2().do()
in class A
>>> class B(A):
...     def do(self):
...         print "in new class B"
...
>>> D1().do()
Traceback (most recent call last):
  File "<pyshell#52>", line 1, in <module>
    D1().do()
  File "<pyshell#37>", line 3, in do
    super(B, self).do()
TypeError: super(type, obj): obj must be an instance or subtype of type
>>> class C(A):
...     def do(self):
...         print "in new class C"
...
>>> D2().do()
in class A
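One caveat with super(type(self), self), continuing the session above (E is a new name): if a subclass inherits do() unchanged, type(self) is the subclass itself, so super resolves back to the defining class and the call recurses forever.
>>> class E(D2):
...     pass
...
>>> E().do()
Traceback (most recent call last):
  ...
RuntimeError: maximum recursion depth exceeded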
According to @Don Question's suggestion, I put the Python version here: sys.version = 2.7.2+ (default, Oct 4 2011, 20:06:09) [GCC 4.6.1]
However, if I try to call the classmethod inside the metaclass which
contains a "super" as following testMetaB, it will raise an error:
NameError: global name 'testC' is not defined.
The name testC will only be bound to the new class after the metaclass has finished its work - that is, after returning from its __init__ (and, before that, its __new__) method.
When we make the "super" call using the class name as the first parameter, the class name does not appear there magically: it is a (module) global variable, to which the class itself is assigned - in normal circumstances.
In this case, the name has not been assigned yet - however, as it is a classmethod, you have a reference to the class in the cls variable - that is why it works.
If some piece of code resorts to tricks like dynamically creating derived classes, that is important - one should not use the class name as the first parameter to super if that name may be bound to an object other than the class itself. Instead, cls for class methods, or self.__class__ for instance methods, can be passed to super.
Here is a snippet showing that what super takes is the global name binding for the class:
>>> class A(object):
...     def do(self):
...         print "In class A"
...
>>> class B(A):
...     def do(self):
...         super(B, self).do()
...
>>> C = B
>>> C().do()
In class A
>>> class B(object):
...     def do(self):
...         print "in new class B"
...
>>> C().do()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 3, in do
TypeError: super(type, obj): obj must be an instance or subtype of type
>>>
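(For completeness: on Python 3 this whole dilemma goes away, because the zero-argument form of super() binds through the compiler-provided __class__ cell rather than through the global name. A quick sketch:)

class A:
    @classmethod
    def do1(cls, a):
        print("in A:", cls, a)

class B(A):
    @classmethod
    def do1(cls, a):
        super().do1(a)  # no class name needed; uses the __class__ cell

C = B
B = None        # rebinding the global name is now harmless
C.do1("hello")  # in A: <class '__main__.B'> hello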

Python - Testing an abstract base class

I am looking for ways / best practices for testing methods defined in an abstract base class. One thing I can think of directly is performing the test on all concrete subclasses of the base class, but that seems excessive at times.
Consider this example:
import abc

class Abstract(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractproperty
    def id(self):
        return

    @abc.abstractmethod
    def foo(self):
        print "foo"

    def bar(self):
        print "bar"
Is it possible to test bar without doing any subclassing?
In newer versions of Python you can use unittest.mock.patch()
class MyAbcClassTest(unittest.TestCase):
    @patch.multiple(MyAbcClass, __abstractmethods__=set())
    def test(self):
        self.instance = MyAbcClass()  # Ha!
Here is what I have found: if you set the __abstractmethods__ attribute to an empty set, you'll be able to instantiate the abstract class. This behaviour is specified in PEP 3119:
If the resulting __abstractmethods__ set is non-empty, the class is considered abstract, and attempts to instantiate it will raise TypeError.
So you just need to clear this attribute for the duration of tests.
>>> import abc
>>> class A(metaclass=abc.ABCMeta):
...     @abc.abstractmethod
...     def foo(self): pass
You can't instantiate A:
>>> A()
Traceback (most recent call last):
TypeError: Can't instantiate abstract class A with abstract methods foo
If you override __abstractmethods__ you can:
>>> A.__abstractmethods__=set()
>>> A() #doctest: +ELLIPSIS
<....A object at 0x...>
It works both ways:
>>> class B(object): pass
>>> B() #doctest: +ELLIPSIS
<....B object at 0x...>
>>> B.__abstractmethods__={"foo"}
>>> B()
Traceback (most recent call last):
TypeError: Can't instantiate abstract class B with abstract methods foo
You can also use unittest.mock (from 3.3) to temporarily override ABC behaviour.
>>> class A(metaclass=abc.ABCMeta):
...     @abc.abstractmethod
...     def foo(self): pass
>>> from unittest.mock import patch
>>> p = patch.multiple(A, __abstractmethods__=set())
>>> p.start()
{}
>>> A() #doctest: +ELLIPSIS
<....A object at 0x...>
>>> p.stop()
>>> A()
Traceback (most recent call last):
TypeError: Can't instantiate abstract class A with abstract methods foo
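(The same patch also works as a context manager, which keeps the override scoped to a single test. A minimal self-contained sketch, with a made-up Abstract class standing in for the ABC under test:)

import abc
import unittest
from unittest.mock import patch

class Abstract(abc.ABC):
    @abc.abstractmethod
    def foo(self): ...

    def bar(self):
        return "bar"

class BarTest(unittest.TestCase):
    def test_bar(self):
        # the override lasts only for the duration of the with-block
        with patch.multiple(Abstract, __abstractmethods__=set()):
            self.assertEqual(Abstract().bar(), "bar")

if __name__ == "__main__":
    unittest.main()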
As properly put by lunaryon, it is not possible. The very purpose of ABCs containing abstract methods is that they are not instantiable as declared.
However, it is possible to create a utility function that introspects an ABC and creates a dummy, non-abstract class on the fly. This function can be called directly inside your test method/function and spare you having to write boilerplate code in the test file just for testing a few methods.
def concreter(abclass):
    """
    >>> import abc
    >>> class Abstract(metaclass=abc.ABCMeta):
    ...     @abc.abstractmethod
    ...     def bar(self):
    ...         return None
    >>> c = concreter(Abstract)
    >>> c.__name__
    'dummy_concrete_Abstract'
    >>> c().bar()  # doctest: +ELLIPSIS
    (<abc_utils.Abstract object at 0x...>, (), {})
    """
    if not "__abstractmethods__" in abclass.__dict__:
        return abclass
    new_dict = abclass.__dict__.copy()
    for abstractmethod in abclass.__abstractmethods__:
        # replace each abc method or property with an identity function:
        new_dict[abstractmethod] = lambda x, *args, **kw: (x, args, kw)
    # create a new class, with the overridden ABCs:
    return type("dummy_concrete_%s" % abclass.__name__, (abclass,), new_dict)
You can use multiple inheritance to get access to the implemented methods of the abstract class. Obviously, following such a design decision depends on the structure of the abstract class, since you need to implement its abstract methods (at least as stubs) in your test case.
Here is the example for your case:
class Abstract(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractproperty
    def id(self):
        return

    @abc.abstractmethod
    def foo(self):
        print("foo")

    def bar(self):
        print("bar")

class AbstractTest(unittest.TestCase, Abstract):
    def foo(self):
        pass

    def test_bar(self):
        self.bar()
        self.assertTrue(1 == 1)
No, it's not. The very purpose of abc is to create classes that cannot be instantiated unless all abstract attributes are overridden with concrete implementations. Hence you need to derive from the abstract base class and override all abstract methods and properties.
Perhaps a more compact version of the concreter proposed by @jsbueno could be:
def concreter(abclass):
    class concreteCls(abclass):
        pass
    concreteCls.__abstractmethods__ = frozenset()
    return type('DummyConcrete' + abclass.__name__, (concreteCls,), {})
The resulting class still has all original abstract methods (which can be now called, even if this is not likely to be useful...) and can be mocked as needed.
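(Hypothetical usage of this compact concreter, in the Python 2 style of the rest of this thread:)

import abc

class Abstract(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def foo(self):
        pass

    def bar(self):
        return 'bar'

Concrete = concreter(Abstract)
assert Concrete().bar() == 'bar'  # the inherited concrete method works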

Class-level read-only properties in Python

Is there some way to make a class-level read-only property in Python? For instance, if I have a class Foo, I want to say:
x = Foo.CLASS_PROPERTY
but prevent anyone from saying:
Foo.CLASS_PROPERTY = y
EDIT:
I like the simplicity of Alex Martelli's solution, but not the syntax that it requires. Both his and unutbu's answers inspired the following solution, which is closer to the spirit of what I was looking for:
class const_value(object):
    def __init__(self, value):
        self.__value = value

    def make_property(self):
        return property(lambda cls: self.__value)

class ROType(type):
    def __new__(mcl, classname, bases, classdict):
        class UniqueROType(mcl):
            pass
        for attr, value in classdict.items():
            if isinstance(value, const_value):
                setattr(UniqueROType, attr, value.make_property())
                classdict[attr] = value.make_property()
        return type.__new__(UniqueROType, classname, bases, classdict)

class Foo(object):
    __metaclass__ = ROType
    BAR = const_value(1)
    BAZ = 2

class Bit(object):
    __metaclass__ = ROType
    BOO = const_value(3)
    BAN = 4
Now, I get:
Foo.BAR
# 1
Foo.BAZ
# 2
Foo.BAR = 2
# Traceback (most recent call last):
#   File "<stdin>", line 1, in <module>
# AttributeError: can't set attribute
Foo.BAZ = 3
#
I prefer this solution because:
The members get declared inline instead of after the fact, as with type(X).foo = ...
The members' values are set in the actual class's code as opposed to in the metaclass's code.
It's still not ideal because:
I have to set the __metaclass__ in order for const_value objects to be interpreted correctly.
The const_values don't "behave" like plain values. For example, I couldn't use one as a default value for a parameter to a method in the class (see the sketch below).
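For instance (Foo2 is a hypothetical variant of Foo above):

class Foo2(object):
    __metaclass__ = ROType
    BAR = const_value(1)

    # BAR is evaluated while the class body runs, before the metaclass
    # swaps it for a property, so the default here is the wrapper, not 1:
    def get_bar(self, default=BAR):
        return default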
The existing solutions are a bit complex -- what about just ensuring that each class in a certain group has a unique metaclass, then setting a normal read-only property on the custom metaclass. Namely:
>>> class Meta(type):
...     def __new__(mcl, *a, **k):
...         uniquemcl = type('Uniq', (mcl,), {})
...         return type.__new__(uniquemcl, *a, **k)
...
>>> class X: __metaclass__ = Meta
...
>>> class Y: __metaclass__ = Meta
...
>>> type(X).foo = property(lambda *_: 23)
>>> type(Y).foo = property(lambda *_: 45)
>>> X.foo
23
>>> Y.foo
45
>>>
This is really much simpler, because it's based on nothing more than the fact that when you get an instance's attribute, descriptors are looked up on the class (so of course when you get a class's attribute, descriptors are looked up on the metaclass), and making the class/metaclass unique isn't terribly hard.
Oh, and of course:
>>> X.foo = 67
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: can't set attribute
just to confirm it IS indeed read-only!
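(If you need the same trick on Python 3, where the __metaclass__ attribute is ignored, the only change is the metaclass keyword; a sketch:)

class Meta(type):
    def __new__(mcl, *a, **k):
        uniquemcl = type('Uniq', (mcl,), {})
        return type.__new__(uniquemcl, *a, **k)

class X(metaclass=Meta):
    pass

type(X).foo = property(lambda *_: 23)
print(X.foo)  # 23
X.foo = 67    # AttributeError: can't set attribute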
The ActiveState solution that Pynt references makes instances of ROClass have read-only attributes. Your question seems to ask if the class itself can have read-only attributes.
Here is one way, based on Raymond Hettinger's comment:
#!/usr/bin/env python
def readonly(value):
    return property(lambda self: value)

class ROType(type):
    CLASS_PROPERTY = readonly(1)

class Foo(object):
    __metaclass__ = ROType

print(Foo.CLASS_PROPERTY)
# 1

Foo.CLASS_PROPERTY = 2
# AttributeError: can't set attribute
The idea is this: Consider first Raymond Hettinger's solution:
class Bar(object):
    CLASS_PROPERTY = property(lambda self: 1)

bar = Bar()
bar.CLASS_PROPERTY = 2
It shows a relatively simple way to give bar a read-only property.
Notice that you have to add the CLASS_PROPERTY = property(lambda self: 1)
line to the definition of the class of bar, not to bar itself.
So, if you want the class Foo to have a read-only property, then the parent class of Foo has to have CLASS_PROPERTY = property(lambda self: 1) defined.
The class of a class is its metaclass. Hence we define ROType as the metaclass:
class ROType(type):
    CLASS_PROPERTY = readonly(1)
Then we make ROType be Foo's metaclass:
class Foo(object):
    __metaclass__ = ROType
Found this on ActiveState:
# simple read only attributes with meta-class programming

# method factory for an attribute get method
def getmethod(attrname):
    def _getmethod(self):
        return self.__readonly__[attrname]
    return _getmethod

class metaClass(type):
    def __new__(cls, classname, bases, classdict):
        readonly = classdict.get('__readonly__', {})
        for name, default in readonly.items():
            classdict[name] = property(getmethod(name))
        return type.__new__(cls, classname, bases, classdict)

class ROClass(object):
    __metaclass__ = metaClass
    __readonly__ = {'a': 1, 'b': 'text'}

if __name__ == '__main__':
    def test1():
        t = ROClass()
        print t.a
        print t.b

    def test2():
        t = ROClass()
        t.a = 2

    test1()
Note that if you try to set a read-only attribute (t.a = 2), Python will raise an AttributeError.
