How to make a Python subclass uncallable - python

How do you "disable" the __call__ method on a subclass so the following would be true:
class Parent(object):
    def __call__(self):
        return

class Child(Parent):
    def __init__(self):
        super(Child, self).__init__()
        object.__setattr__(self, '__call__', None)
>>> c = Child()
>>> callable(c)
False
This and other ways of trying to set __call__ to some non-callable value still result in the child appearing as callable.

You can't. As jonrsharpe points out, there's no way to make Child appear not to have the attribute, and that's what callable(Child()) relies on to produce its answer. Even making it a descriptor that raises AttributeError won't work, per this bug report: https://bugs.python.org/issue23990. A Python 2 example:
>>> class Parent(object):
...     def __call__(self): pass
...
>>> class Child(Parent):
...     __call__ = property()
...
>>> c = Child()
>>> c()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: unreadable attribute
>>> c.__call__
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: unreadable attribute
>>> callable(c)
True
This is because callable(...) doesn't go through the descriptor protocol. Actually calling the object, or accessing a __call__ attribute, retrieves the method even if it's behind a property, via the normal descriptor protocol. But callable(...) doesn't bother going that far: if it finds anything at all it is satisfied, and every subclass of Parent will have something for __call__, either an attribute defined in the subclass or the definition inherited from Parent.
So while you can make actually calling the instance fail with any exception you want, you can't ever make callable(some_instance_of_parent) return False.
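Here is a small illustration of that point (a sketch in Python 3, not from the original answer): callable() is satisfied as soon as the name __call__ exists on the type, without ever running the descriptor protocol on it.
class Parent:
    def __call__(self):
        return

class Child(Parent):
    __call__ = property()  # reading this from an instance raises AttributeError

c = Child()
print(hasattr(type(c), '__call__'))  # True -- the name exists on the class
print(callable(c))                   # True -- that alone satisfies callable()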

It's a bad idea to change the public interface of a class so radically from the parent to the child.
As pointed out elsewhere, you can't uninherit __call__. If you really need to mix callable and non-callable classes, you should use another test (such as a class attribute; a sketch of that appears at the end of this answer) or simply make it safe to call the variants that have no functionality.
To do the latter, you could override __call__ to raise NotImplementedError (or better, a custom exception of your own) if for some reason you wanted to mix a non-callable class in with the callable variants:
class Parent(object):
    def __call__(self):
        print "called"

class Child(Parent):
    def __call__(self):
        raise NotACallableInstanceException()

for child_or_parent in list_of_children_and_parents():
    try:
        child_or_parent()
    except NotACallableInstanceException:
        pass
Or, just override __call__ with pass:
class Parent(object):
    def __call__(self):
        print "called"

class Child(Parent):
    def __call__(self):
        pass
Which will still be callable but will just be a no-op.
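The other option mentioned above, testing a class attribute instead of relying on callable(), could look like this (a hedged sketch in Python 3; the is_callable flag is an invented name, not part of the original answer):
class Parent:
    is_callable = True

    def __call__(self):
        print("called")

class Child(Parent):
    is_callable = False  # advertise "don't call me" instead of trying to hide __call__

for obj in (Parent(), Child()):
    if obj.is_callable:
        obj()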

Related

python child class method calling overridden classmethod with non-classmethod

I am trying to do the following in python3:
class Parent:
    @classmethod
    def show(cls, message):
        print(f'{message}')

    @classmethod
    def ask(cls, message):
        cls.show(f'{message}???')

class Child(Parent):
    @property
    def name(self):
        return 'John'

    def show(self, message):
        print(f'{self.name}: {message}')

instance = Child()
instance.ask('what')
But it then complains
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 7, in ask
TypeError: Child.show() missing 1 required positional argument: 'message'
even though child.show works as expected. So it seems that child.ask is calling Parent.show... I tried to mark Child.show as a classmethod too, but then cls.name does not show the expected output:
class Child2(Parent):
    @property
    def name(self):
        return 'John'

    @classmethod
    def show(cls, message):
        print(f'{cls.name}: {message}')

instance2 = Child2()
instance2.ask('what')
This shows:
<property object at 0xfc7b90>: what???
Is there a way to override a parent classmethod with a non-classmethod, while keeping the other parent classmethods calling the overridden one?
The second half of the question was hard to follow, but there is an issue I noticed that might help you solve your problem.
When you said "even though child.show works as expected, so it seems that child.ask is calling Parent.show", that's not what is happening.
When you called instance.ask("what"), it called the @classmethod-decorated ask of the Child class (inherited from the parent). This ask method passes the class Child as the first argument, not the instance you created. This means the line
cls.show(f'{message}???')
is equivalent to
Child.show(f'{message}???')  # because cls is the class, not the instance
The show method inside the Child class is an instance method and expects its first argument to be the actual instance (self), but the string f'{message}???' is passed in that position and a second message string is still expected, which is why it throws that error.
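One possible way out (a sketch of my own, not from the original answer) is to make ask an instance method as well, so that the instance rather than the class is what gets passed down to show:
class Parent:
    def show(self, message):
        print(f'{message}')

    def ask(self, message):
        self.show(f'{message}???')  # dispatches to the subclass override

class Child(Parent):
    @property
    def name(self):
        return 'John'

    def show(self, message):
        print(f'{self.name}: {message}')

Child().ask('what')  # prints: John: what???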
Hope this helped

python linting and subclass properties

I have a superclass that uses some properties from the child class, but unless I also define those properties in my superclass, the linter throws errors.
What's a Pythonic way to get around this?
# parent
class DigItem:
    def fetch_bq(self):
        query = f'''select * from {self.table_id}'''

# subclass
class ChatLog(DigItem):
    def __init__(self, set_name):
        super().__init__(set_name)
        self.table_id = biglib.make_table_id('chat_logs')
The above code errors with:
Instance of 'DigItem' has no 'table_id' member pylint(no-member)
Now, I can add the property to the superclass, but that's pretty redundant and also risks overwriting the subclass:
class DigItem:
    def __init__(self, set_name):
        self.table_id = None  # set by child
This is down to the linter not being able to know ahead of time that this is a superclass, so it's fair enough as an error on a standalone instance.
But I'd prefer clean linting, Pythonic code, and not writing special hacky stuff just to quiet the linter.
In your example, DigItem has no __init__ at all (so it will use object's), so passing an argument to super().__init__() will fail:
>>> class A: pass
...
>>> class B(A):
...     def __init__(self):
...         super().__init__("something")
...
>>> B()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 3, in __init__
TypeError: object.__init__() takes exactly one argument (the instance to initialize)
Further, you should (really, must) create the missing attribute in your parent in order for it to meaningfully make use of it in a method (otherwise other inheriting classes will not be able to make use of that method):
>>> class A:
...     def foo(self):
...         return self.bar
...
>>> class B(A):
...     def __init__(self):
...         self.bar = "baz"
...
>>> class C(A): pass  # NOTE: .bar is never defined!
...
>>> C().foo()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 3, in foo
AttributeError: 'C' object has no attribute 'bar'
If the base class is not intended to be directly instantiable, consider making it an Abstract Base Class
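A sketch of that suggestion, reusing the names from the question (the biglib call is replaced with a literal string here so the example stands alone; treat the details as an assumption, not the original code): declaring table_id as an abstract property tells both the linter and readers that every subclass must provide it.
import abc

class DigItem(abc.ABC):
    @property
    @abc.abstractmethod
    def table_id(self):
        """Subclasses must supply the table id used by fetch_bq."""

    def fetch_bq(self):
        return f'''select * from {self.table_id}'''

class ChatLog(DigItem):
    @property
    def table_id(self):
        return 'chat_logs'  # the real code built this with biglib.make_table_id

print(ChatLog().fetch_bq())  # select * from chat_logs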

exceptions.AttributeError: class KeyAgent has no attribute 'delele_customer_node()'

I've got 2 files: customer.py and agent.py. They look like this:
customer.py:
from agent import KeyAgent

class CustomerController(object):
    def __init__(self, container):
        self.key = container.agent

    def delete(self, path):
        KeyAgent.delele_customer_node()

agent.py:
class KeyAgent(service.Service):
    def __init__(self):
        pass

    def delele_customer_node():
        ....
Python is throwing this exception while running:
exceptions.AttributeError: class KeyAgent has no attribute 'delele_customer_node()'
Even though I've imported the KeyAgent class from agent.py, why isn't the method delele_customer_node() accessible from delete() in customer.py?
You must have misspelled the method name (delele? or delete?). The KeyAgent class does have a method delete_customer_node (I will assume that was a typo).
>>> class KeyAgent(object):
...     def delete_customer():
...         pass
...
>>> KeyAgent.delete_customer
<unbound method KeyAgent.delete_customer>
That means the method is there. However, your code is quite broken. Unless you use the staticmethod or classmethod decorators, the first argument of a method must be self, and you need to instantiate the class to call it. See what happens if you try to call this method directly on the class:
>>> KeyAgent.delete_customer()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unbound method delete_customer() must be called with KeyAgent instance as first argument (got nothing instead)
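A sketch of the fix (my own, with service.Service and the container wiring from the question omitted so the example stands alone): give the method a self parameter and call it on an instance rather than on the class.
class KeyAgent(object):
    def delete_customer_node(self):
        print('deleting customer node')

class CustomerController(object):
    def __init__(self, agent):
        self.key = agent

    def delete(self, path):
        self.key.delete_customer_node()  # call through the instance, not the class

CustomerController(KeyAgent()).delete('/some/path')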

Higher order classes in Python

Can anyone explain why the following code doesn't work? I'm trying to make a class decorator that provides new __repr__ and __init__ methods, but if I decorate a class with it, only the __repr__ method seems to get defined. I managed to fix the original problem by making the decorator modify the original class destructively instead of creating a new class (i.e. it defines the new methods and then just uses cl.__init__ = __init__ to overwrite them). Now I'm just curious why the subclassing-based attempt didn't work.
def higherorderclass(cl):
    @functools.wraps(cl)
    class wrapped(cl):
        def __init__(self, *args, **kwds):
            print 'in wrapped init'
            super(wrapped, self).__init__(*args, **kwds)

        def __repr__(self):
            return 'in wrapped repr'
    return wrapped
The first problem is that you're using old-style classes. (That is, classes that don't inherit from object, another built-in type, or another new-style class.) Special method lookup works differently in old-style classes. Really, you don't want to learn how it works; just use new-style classes instead.
But then you run into the next problem: functools.wraps doesn't work on classes in the first place. With new-style classes, you will get some kind of AttributeError; with old-style classes, things just silently fail in various ways. And you can't just use update_wrapper explicitly either. The problem is that you're trying to replace attributes of the class that aren't writeable, and there's no (direct) way around that.
If you use new-style classes, and don't try to apply functools.wraps to them, everything works fine.
Remove the @functools.wraps() decorator; it only applies to function decorators. With a new-style class, your decorator fails with:
>>> @higherorderclass
... class Foo(object):
...     def __init__(self):
...         print 'in foo init'
...
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
  File "<stdin>", line 3, in higherorderclass
  File "/Users/mj/Development/Library/buildout.python/parts/opt/lib/python2.7/functools.py", line 33, in update_wrapper
    setattr(wrapper, attr, getattr(wrapped, attr))
AttributeError: attribute '__doc__' of 'type' objects is not writable
Without the @functools.wraps() line your decorator works just fine:
>>> def higherorderclass(cl):
...     class wrapped(cl):
...         def __init__(self, *args, **kwds):
...             print 'in wrapped init'
...             super(wrapped, self).__init__(*args, **kwds)
...         def __repr__(self):
...             return 'in wrapped repr'
...     return wrapped
...
>>> @higherorderclass
... class Foo(object):
...     def __init__(self):
...         print 'in foo init'
...
>>> Foo()
in wrapped init
in foo init
in wrapped repr

Polymorphism with callables in Python

I have an interface class called iResource, and a number of subclasses, each of which implement the "request" method. The request functions use socket I/O to other machines, so it makes sense to run them asynchronously, so those other machines can work in parallel.
The problem is that when I start a thread with iResource.request and give it a subclass instance as the first argument, it'll call the superclass method. If I try to start it with "type(a).request" and "a" as the first argument, I get "<type 'instance'>" for the value of type(a). Any ideas what that means and how to get the true type of the method? Can I formally declare an abstract method in Python somehow?
EDIT: Including code.
def getSocialResults(self, query=''):
    # for a in self.types["social"]: print type(a)
    tasks = [type(a).request for a in self.types["social"]]
    argss = [(a, query, 0) for a in self.types["social"]]
    grabbers = executeChainResults(tasks, argss)
    return igrabber.cycleGrabber(grabbers)
"executeChainResults" takes a list "tasks" of callables and a list "argss" of args-tuples, and assumes each returns a list. It then executes each in a separate thread, and concatenates the lists of results. I can post that code if necessary, but I haven't had any problems with it so I'll leave it out for now.
The objects "a" are DEFINITELY not of type iResource, since it has a single constructor that just throws an exception. However, replacing "type(a).request" with "iResource.request" invokes the base class method. Furthermore, calling "self.types["social"][0].request" directly works fine, but the above code gives me: "type object 'instance' has no attribute 'request'".
Uncommenting the commented line prints <type 'instance'> several times.
You can just use the bound method object itself:
tasks = [a.request for a in self.types["social"]]
#        ^^^^^^^^^
grabbers = executeChainResults(tasks, [(query, 0)] * len(tasks))
#                                     ^^^^^^^^^^^^^^^^^^^^^^^^^
If you insist on calling your methods through the base class you could also do it like this:
from abc import ABCMeta
from functools import wraps

def virtualmethod(method):
    method.__isabstractmethod__ = True
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        return getattr(self, method.__name__)(*args, **kwargs)
    return wrapper

class IBase(object):
    __metaclass__ = ABCMeta
    @virtualmethod
    def my_method(self, x, y):
        pass

class AddImpl(IBase):
    def my_method(self, x, y):
        return x + y

class MulImpl(IBase):
    def my_method(self, x, y):
        return x * y

items = [AddImpl(), MulImpl()]
for each in items:
    print IBase.my_method(each, 3, 4)

b = IBase()  # <-- crash
Result:
7
12
Traceback (most recent call last):
  File "testvirtual.py", line 30, in <module>
    b = IBase()
TypeError: Can't instantiate abstract class IBase with abstract methods my_method
Python doesn't support interfaces the way e.g. Java does. But with the abc module you can ensure that certain methods must be implemented in subclasses. Normally you would do this with the abc.abstractmethod() decorator, but you still could not call the subclass's method through the base class, as you intend. I had a similar question once and came up with the virtualmethod() decorator. It's quite simple: it essentially does the same thing as abc.abstractmethod(), but also redirects the call to the subclass's method. The specifics of the abc module can be found in the docs and in PEP 3119.
BTW: I assume you're using Python >= 2.6.
The "<type 'instance'>" you see is what you get when you are using an "old-style class" in Python, i.e. a class not derived from the "object" type hierarchy. Old-style classes don't work with several of the newer features of the language, including descriptors and others. And, among other things, you can't retrieve an attribute (or method) from the class of an old-style instance the way you are doing:
>>> class C(object):
...     def c(self): pass
...
>>> c = C()
>>> type(c)
<class '__main__.C'>
>>> type(c).c
<unbound method C.c>
>>> class D:  # not inheriting from object: old-style class
...     def d(self): pass
...
>>> d = D()
>>> type(d)
<type 'instance'>
>>> type(d).d
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: type object 'instance' has no attribute 'd'
>>>
Therefore, just make your base class inherit from "object" instead of nothing, and check whether you still get the error message when requesting the "request" method from type(a).
As for your other observation:
"The problem is that when I start a thread with iResource.request and give it a subclass as the first argument, it'll call the superclass method."
It seems that the "right" thing for it to do is exactly that:
>>> class A(object):
...     def b(self):
...         print "super"
...
>>> class B(A):
...     def b(self):
...         print "child"
...
>>> b = B()
>>> A.b(b)
super
>>>
Here, I call a method on class "A", giving it a specialized instance of "A", and the method that runs is still the one defined in class "A".
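For completeness, a small sketch (Python 3, where every class is new-style; IResource and request here are stand-ins for the question's names, not the original code) of how the lookups behave once the base class derives from object:
class IResource:
    def request(self):
        return 'base request'

class SocialResource(IResource):
    def request(self):
        return 'social request'

a = SocialResource()
print(type(a).request(a))    # social request -- type(a) now finds the override
print(IResource.request(a))  # base request -- an explicit base-class call stays on the base
print(a.request())           # social request -- the bound method needs no extra argument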
