First day learning Python, please excuse the basic question.
Assuming I have been given an object which contains an unimplemented method that I need to implement, e.g.:
class myclass():
    def __init__(self):
        self.unimplementedmethod = False
What is the correct way to implement this in an instantiated object? I do not want to alter the base class in any way.
I have experimented and found the following code seems to work, but is it correct/good style?
def methodimplementation():
    print("method called")

myobject = myclass()
myobject.unimplementedmethod = methodimplementation
Is this the right path? Or should I be doing something different like perhaps creating a derived class first, implementing the methods in it, and then instantiating an object based on the derived class? What is best practice?
You need to subclass the base class:
class myclass():
    def some_method(self):
        raise NotImplementedError

class my_subclass(myclass):
    def some_method(self):
        print("method called")
You want to create an abstract base class. For that, either inherit from abc.ABC or set abc.ABCMeta as the metaclass of your base class. Then, to mark a method as abstract, decorate it with @abstractmethod. For example:
from abc import ABCMeta, abstractmethod

class BaseClass(metaclass=ABCMeta):
    @abstractmethod
    def my_method(self):
        pass
Then you may create the child class as:
class MyChildClass(BaseClass):
    def my_method(self):
        print('my method')
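A short usage sketch (assuming the classes above): instantiating the abstract base fails, while the child works:

MyChildClass().my_method()   # prints 'my method'
BaseClass()                  # raises TypeError: can't instantiate abstract class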
The good way is to use subclasses, but if you can't, here is a way to access self from a plain function that is not defined in a class:
class Bar:
    def __init__(self):
        pass

    def foo(self):
        try:
            self._foo(self)
        except AttributeError:
            raise NotImplementedError

    def set_foo(self, function):
        setattr(self, '_foo', function)

    def another_method(self):
        print("Another method from {}".format(self))


def foo(self):
    self.another_method()


bar = Bar()
bar.set_foo(foo)
bar.foo()
So, def foo(self) defines a plain function with a single argument, self, just like a method. This function calls the instance method another_method.
Bar.set_foo creates a new attribute _foo on the instance of Bar.
Finally, Bar.foo tries to call self._foo, passing self as the argument. If _foo does not exist, Bar.foo raises NotImplementedError as expected.
This way you can access self from foo without subclassing.
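As an aside, a more standard way to bind a plain function to an existing instance is types.MethodType; here is a minimal sketch reusing the names from the question (this is an alternative, not what the answer above does):

import types

class myclass():
    def __init__(self):
        self.unimplementedmethod = False

def methodimplementation(self):
    print("method called on", self)

myobject = myclass()
myobject.unimplementedmethod = types.MethodType(methodimplementation, myobject)
myobject.unimplementedmethod()   # self is bound automatically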
I read that instance methods can only be called by creating an instance (an object) of the class. But it appears that I can call one without doing so. Check the code below:
class Test:
    def func(self):  # instance method
        print(6)

Test.func(Test)  # Here I am calling an instance method without creating an instance of the class. How?
Please let me know what is happening behind the scenes.
Your code works because you pass the class itself as the self argument.
It will keep working only as long as func treats self as the class rather than as an instance, which is very bad practice.
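For comparison, here is what normally happens behind the scenes, using the Test class from the question:

t = Test()
t.func()        # Python passes t as self automatically
Test.func(t)    # the equivalent explicit call; self is a real instance here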
I suggest using staticmethods for such purposes:
class Test:
    @staticmethod
    def func():
        print(6)

Test.func()
or @classmethod:

class Test:
    @classmethod
    def func(cls):
        print(6)

Test.func()
Output:
6
I've got some code where I need to refer to a superclass when defining stuff in a derived class:
class Base:
    def foo(self):
        print('foo')

    def bar(self):
        print('bar')

class Derived_A(Base):
    meth = Base.foo

class Derived_B(Base):
    meth = Base.bar

Derived_A().meth()
Derived_B().meth()
This works, but I don't like verbatim references to Base in derived classes. Is there a way to use super or something similar for this?
You can't do that.
The class keyword in Python is used to create classes, which are instances of type type. In its simplified version, it does the following:
Python creates a namespace and executes the body of the class in that namespace, so that it gets populated with all the methods, attributes, and so on.
Then it calls the three-argument form of type(). The result of this call is your class, which is then assigned to a symbol, the name of your class.
The point is that while the body of the class is being executed, it doesn't know anything about the bases. Those bases are passed to type() only afterwards.
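As a rough sketch (reusing the Base class from the question), the Derived_A definition is roughly equivalent to:

# The body runs in a fresh namespace first; the bases are not visible here,
# which is why the verbatim reference to Base is needed.
namespace = {'meth': Base.foo}

# Only then is the three-argument form of type() called with the bases.
Derived_A = type('Derived_A', (Base,), namespace)

Derived_A().meth()   # prints 'foo'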
That is also the reason why you can't use super() here.
Does this work for you?
class Base:
    def foo(self):
        print('foo')

    def bar(self):
        print('bar')

class Derived_A(Base):
    def __init__(self):
        self.meth = super().foo

class Derived_B(Base):
    def __init__(self):
        self.meth = super().bar

a = Derived_A().meth()
b = Derived_B().meth()
You'll need to look up the method on the base class after the new type is created. In the body of the class definition, the type and base classes are not accessible.
Something like:
class Derived_A(Base):
    def meth(self):
        return super().foo()
Now, it is possible to do some magic behind the scenes to expose Base to the scope of the class definition as it's being executed, but that's much dirtier, and it would mean you'd need to supply a metaclass in your class definition.
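Purely for illustration, one way that "dirty" trick could look (the ExposeBase metaclass and the base name it injects are inventions of this sketch, not anything standard):

class ExposeBase(type):
    @classmethod
    def __prepare__(mcls, name, bases, **kwargs):
        ns = super().__prepare__(name, bases, **kwargs)
        if bases:
            ns['base'] = bases[0]   # make the first base visible inside the class body
        return ns

    def __new__(mcls, name, bases, ns, **kwargs):
        ns.pop('base', None)        # don't leave the helper name on the finished class
        return super().__new__(mcls, name, bases, ns, **kwargs)

class Derived_A(Base, metaclass=ExposeBase):
    meth = base.foo   # 'base' only exists here thanks to __prepare__

Derived_A().meth()    # prints 'foo'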
Since you want "magic", there is still one sane option we can take before diving into metaclasses. It requires Python 3.9+ (note that chaining classmethod with property in this way was deprecated in Python 3.11 and removed in 3.13).
def alias(name):
    def inner(cls):
        return getattr(cls, name).__get__(cls)
    return classmethod(property(inner))

class Base:
    def foo(self):
        ...

class Derived_A(Base):
    meth = alias("foo")

Derived_A().meth()  # works
Derived_A.meth()    # also works
Yes, this does require passing the method name as a string, which destroys your IDE and typechecker's ability to reason about it. But there isn't a good way to get what you are wanting without some compromises like that.
Really, a bit of redundancy for readability is probably worth it here.
I have a base class that looks something like this:
class myBaseClass:
    def __init__(self):
        self.name = None  # All subclasses must define this

    def foo(self):  # All subclasses must define this
        raise NotImplementedError

    def bar(self):  # Optional -- not all subclasses will define this
        raise NotImplementedError
My API specification stipulates that anyone creating a subclass of myBaseClass must provide a meaningful value for .name, and for the function .foo(). However, .bar() is optional and calling code should be able to handle the case where that results in a NotImplementedError.
When and how should I check that subclasses contributed by third parties meet these requirements?
The options seem to be:
Build subclasses exclusively via metaclasses. However, this approach will be unfamiliar and potentially confusing to most of the contributors to my project, who tend not to be expert developers.
Add an __init_subclass__ method to the base class and use this to infer whether the subclass has overridden everything it is supposed to override (see the sketch after this list). Seems to work, but feels a bit 'kludgy'.
Write build-time tests to instantiate each subclass, call each 'required' method, and verify that they do not raise a NotImplementedError. Seems like an excessive computational effort to answer such a simple question (calling .foo() may be expensive).
Ignore the issue. Deal with it if and when it causes something else to break.
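For reference, a minimal sketch of the __init_subclass__ option (only the foo check; name is set in __init__, so it can't be checked at class-creation time):

class myBaseClass:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Reject subclasses that do not override the required method.
        if cls.foo is myBaseClass.foo:
            raise TypeError(f"{cls.__name__} must override foo()")

    def foo(self):
        raise NotImplementedError

    def bar(self):  # optional
        raise NotImplementedError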
I'm sure I'm not the only person who needs to deal with this issue - is there a 'correct' approach here?
Here's how I would structure it.
First off, what you're looking for here is an abstract base class. Using the built-in abc module you can define it as such and force the required methods to be implemented; otherwise the class will raise an exception when instantiated.
If the name attribute needs to be set always, then you should make it part of the constructor arguments.
Because bar is not always required I wouldn't define it as a method in the base class you have. Instead I would make a child class that is also abstract and define it there as required. When checking to see if the method is available you can use isinstance.
This is what my final code would look like:
from abc import ABC, abstractmethod

class FooBaseClass(ABC):
    def __init__(self, name):
        self.name = name

    @abstractmethod
    def foo(self):
        """Some useful docs for foo"""

class FooBarBaseClass(FooBaseClass, ABC):
    @abstractmethod
    def bar(self):
        """Some useful docs for bar"""
When creating instances you can pick the base class you want and will be forced to define the methods.
class FooClass(FooBaseClass):
    def __init__(self):
        super().__init__("foo")

    def foo(self):
        print("Calling foo from FooClass")

class FooBarClass(FooBarBaseClass):
    def __init__(self):
        super().__init__("foobar")

    def foo(self):
        print("Calling foo from FooBarClass")

    def bar(self):
        print("Calling bar from FooBarClass")
Example checking if bar is callable:
def do_operation(obj: FooBaseClass):
    obj.foo()
    if isinstance(obj, FooBarBaseClass):
        obj.bar()
Example:

do_operation(FooClass())
do_operation(FooBarClass())

Output:

Calling foo from FooClass
Calling foo from FooBarClass
Calling bar from FooBarClass
An example of invalid code:

class InvalidClass(FooBaseClass):
    def __init__(self):
        super().__init__("foo")

InvalidClass()
Traceback (most recent call last):
File "C:\workspace\so\test.py", line 52, in <module>
InvalidClass()
TypeError: Can't instantiate abstract class InvalidClass with abstract method foo
I am given a designated factory of A-type objects. I would like to make a new version of A-type objects that also have the methods in a Mixin class. For reasons that are too long to explain here, I can't use class A(Mixin), I have to use the A_factory. Below I try to give a bare bones example.
I thought naively that it would be sufficient to inherit from Mixin to endow A-type objects with the mixin methods, but the attempts below don't work:
class A: pass

class A_factory:
    def __new__(self):
        return A()

class Mixin:
    def method(self):
        print('aha!')

class A_v2(Mixin):  # attempt 1
    def __new__(cls):
        return A_factory()

class A_v3(Mixin):  # attempt 2
    def __new__(cls):
        self = A_factory()
        super().__init__(self)
        return self
In fact, A_v2().method() and A_v3().method() raise AttributeError: 'A' object has no attribute 'method'.
What is the correct way of using A_factory within class A_vn(Mixin) so that A-type objects created by the factory inherit the mixin methods?
There's no obvious reason why you should need __new__ for what you're showing here. There's a nice discussion here on the subject: Why is __init__() always called after __new__()?
If you try the below it should work:
class Mixin:
    def method(self):
        print('aha!')

class A(Mixin):
    def __init__(self):
        super().__init__()

test = A()
test.method()
If you need to use a factory method, it should be a function rather than a class. There's a very good discussion of how to use factory methods here: https://realpython.com/factory-method-python/
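To illustrate that last point, here is a minimal sketch of a factory as a plain function, reusing the Mixin name from the question (the a_factory name is just an example):

class Mixin:
    def method(self):
        print('aha!')

class A(Mixin):
    pass

def a_factory():
    # A plain function that builds and returns the object; the returned
    # instance inherits the mixin methods because A itself does.
    return A()

obj = a_factory()
obj.method()   # prints 'aha!'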
I have the following class hierarchy:
from abc import ABCMeta, abstractmethod

class AbstractClass(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def foo(self):
        pass

class A(AbstractClass):
    def __init__(self):
        super().__init__()

    def foo(self):
        # Logic
        pass

class B(A):
    def __init__(self):
        super().__init__()
I want to use foo as it is implemented in A, so I don't want to have to override it in B.
Using B.foo() works, but I still get the warning from PyCharm:
"Class B must implement all abstract methods"
Do I have to override a method that already overrides an abstract method? How do I override it without losing the implementation? Just copy the method to the sub class?
I was just going to ask this question when I suddenly had an idea how it could work. I thought, "how can I call a method after I have just overridden it?"
After some thought I finally figured it out.
Call the overridden method from the super class while overriding it in the sub class:
class B(A):
    def __init__(self):
        super().__init__()

    def foo(self):
        super().foo()
This works because a supertype's method must work for its subtypes without any further implementation being provided. After I figured it out it just seemed so logical.
This might be useful for people who are just figuring out how inheritance works.