When we subclass threading.Thread, is calling the original threading.Thread.__init__ method within our new class's __init__ method essentially just wiping the slate clean?
Or are we inheriting the original __init__ method's attributes?
This is how the original __init__ looks for the Thread class (abridged):
def __init__(self, group=None, target=None, name=None,
             args=(), kwargs=None, *, daemon=None):
    if kwargs is None:
        kwargs = {}
    self._target = target
    self._name = str(name or _newname())
    self._args = args
    self._kwargs = kwargs
So now, when I create a subclass and define my __init__ like this:
class MyThread(threading.Thread):
    def __init__(self, number):
        threading.Thread.__init__(self)
        self.number = number
        print(number)
Does this mean I am overwriting the original Thread class's __init__ attributes, such as
group=None, target=None, name=None,
args=(), kwargs=None, *, daemon=None
and thus only have access to the one attribute, number, that I created within my new __init__ method?
If so, is there a way to still have access to the original __init__ attributes and simply add on top of them when I create my new subclass?
Your current subclass can only instantiate a Thread with its default arguments.
To avoid having to rewrite the arguments, a form like this can be used:
def __init__(self,
             subclass_positional_arg, *args,
             subclass_kw_arg=None, other_arg=None, **kwargs):
    super(MyThread, self).__init__(*args, **kwargs)
    # Do stuff with `subclass_positional_arg`,
    # `subclass_kw_arg` and `other_arg`
And you could instantiate it like this:
MyThread(positional, subclass_kw_arg=value)
MyThread(positional)
In your specific situation, you could do one of these two things:
def __init__(self, *args, number, **kwargs):
    super(MyThread, self).__init__(*args, **kwargs)
    self.number = number

MyThread(number=<number>)  # You could also give a default value

def __init__(self, number, *args, **kwargs):
    super(MyThread, self).__init__(*args, **kwargs)
    self.number = number

MyThread(<number>)
# or
MyThread(number=<number>)
# Though you could no longer give a default value
In Python 2, keyword-only arguments are not allowed. You can achieve a similar effect by popping from **kwargs:
def __init__(self, *args, **kwargs):
    # Use `kwargs.pop('number', <default value>)` to be optional
    number = kwargs.pop('number')
    super(MyThread, self).__init__(*args, **kwargs)
    self.number = number
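To make the keyword-only (Python 3) version concrete, here is a minimal runnable sketch; the work function and the argument values are placeholders chosen purely for illustration. It shows that the subclass still accepts Thread's original arguments (target, args, daemon, ...) while adding number on top:

import threading

class MyThread(threading.Thread):
    def __init__(self, *args, number, **kwargs):
        # Everything except `number` is forwarded to Thread.__init__
        super(MyThread, self).__init__(*args, **kwargs)
        self.number = number

def work(n):
    print('working on', n)

t = MyThread(target=work, args=(42,), daemon=True, number=7)
t.start()
t.join()
print(t.name, t.number)  # Thread still assigns a name; number is our own attribute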
Related
I'm writing some report generation code, where a class CoverageReporter is doing all of the heavy lifting, with just a few attributes changing between different instances of CoverageReporter. Let's say we want to alias
SubsystemCoverageReporter(*args, **kwargs)
to
CoverageReporter('subsystem', *args, **kwargs)
in a concise way. Currently, I accomplish it like so:
class SubsystemCoverageReporter(object):
    def __init__(self, *args, **kwargs):
        self.reporter = CoverageReporter('subsystem', *args, **kwargs)

    def __getattr__(self, name):
        return getattr(self.reporter, name)
Another one, for subsystem 2, has almost exactly the same boilerplate, e.g.:
class SubsystemTwoCoverageReporter(object):
    def __init__(self, *args, **kwargs):
        self.reporter = CoverageReporter('subsystem2', *args, **kwargs)

    def __getattr__(self, name):
        return getattr(self.reporter, name)
Is there a more concise and Pythonic way to express this? I'd prefer not to rely on inheritance, but if there's no better way I'm open to that.
Say you had your class CoverageReporter like so:
class CoverageReporter:
    def __init__(self, _type, _name):
        self.type = _type
        self.name = _name

    def do_something(self):
        print(f"{self.type}:{self.name} did something")

    def do_something_else(self):
        print(f"{self.type}:{self.name} did something else")
You could define a function which creates the CoverageReporter and returns that object.
def SubsystemCoverageReporter(*args, **kwargs):
    return CoverageReporter('subsystem', *args, **kwargs)

def SubsystemTwoCoverageReporter(*args, **kwargs):
    return CoverageReporter('subsystem2', *args, **kwargs)

s1 = SubsystemCoverageReporter("Foo")
s2 = SubsystemTwoCoverageReporter("Bar")

s1.do_something()       # subsystem:Foo did something
s2.do_something_else()  # subsystem2:Bar did something else
Note that this isn't exactly equivalent to what you have, since SubsystemCoverageReporter isn't a class, so you won't be able to inherit from it. If you want that option available, then off the top of my head I think inheritance is the best solution you have.
class SubsystemCoverageReporter(CoverageReporter):
    def __init__(self, *args, **kwargs):
        super().__init__('subsystem', *args, **kwargs)

class SubsystemTwoCoverageReporter(CoverageReporter):
    def __init__(self, *args, **kwargs):
        super().__init__('subsystem2', *args, **kwargs)

s1 = SubsystemCoverageReporter("Foo")
s2 = SubsystemTwoCoverageReporter("Bar")

s1.do_something()       # subsystem:Foo did something
s2.do_something_else()  # subsystem2:Bar did something else
If you want to avoid repeating yourself, use the same solution you always would: wrap the repeated code in a function and parameterize the parts that can change:
def make_klass(subsystem):
    class SubsystemCoverageReporter(object):
        def __init__(self, *args, **kwargs):
            self.reporter = CoverageReporter(subsystem, *args, **kwargs)

        def __getattr__(self, name):
            return getattr(self.reporter, name)

    return SubsystemCoverageReporter
SubsystemCoverageReporter = make_klass("subsystem")
SubsystemTwoCoverageReporter = make_klass("subsystem2")
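For example, assuming the CoverageReporter class shown earlier in this thread, the generated classes behave just like the hand-written wrappers:

s1 = SubsystemCoverageReporter("Foo")
s2 = SubsystemTwoCoverageReporter("Bar")

s1.do_something()       # delegated through __getattr__: subsystem:Foo did something
s2.do_something_else()  # subsystem2:Bar did something else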
Note, the classes you create will have the same value for their __name__ attribute. That is hopefully not important, but you can get around it, e.g.:
SubsystemCoverageReporter.__name__ = "SubsystemCoverageReporter"
SubsystemTwoCoverageReporter.__name__ = "SubsystemTwoCoverageReporter"
Although that is ugly.
You could use the type constructor directly, or if metaclasses are an issue, use types.new_class instead and it will handle it correctly. But in the simplest case,
def make_klass(name, subsystem):
    def __init__(self, *args, **kwargs):
        self.reporter = CoverageReporter(subsystem, *args, **kwargs)

    def __getattr__(self, name):
        return getattr(self.reporter, name)

    klass = type(name, (object,), dict(__init__=__init__, __getattr__=__getattr__))
    return klass
SubsystemCoverageReporter = make_klass("SubsystemCoverageReporter", "subsystem")
SubsystemTwoCoverageReporter = make_klass("SubsystemTwoCoverageReporter", "subsytem2")
Although note, you are repeating yourself here a little. But again, normally you shouldn't care about the __name__ attribute.
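A quick check (again assuming the CoverageReporter defined above) shows the generated classes now carry distinct names while delegating as before:

print(SubsystemCoverageReporter.__name__)     # SubsystemCoverageReporter
print(SubsystemTwoCoverageReporter.__name__)  # SubsystemTwoCoverageReporter

s1 = SubsystemCoverageReporter("Foo")
s1.do_something()  # subsystem:Foo did something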
When designing a class, I find that there are repeated steps that have to be performed each time a class method is invoked. For example:
class Queue(object):
    def __init__(self):
        self.connection = Connection()

    def create(self, name):
        self.probe = self.connection.plug()
        self.probe.create(name)
        self.probe.unplug()

    def delete(self, name):
        self.probe = self.connection.plug()
        self.probe.delete(name)
        self.probe.unplug()
And many methods require the same steps of 'plugging' and 'unplugging' the 'probe'. With this design we have to 'plug' and 'unplug' the 'probe' each time we perform an action.
Thus I am thinking about wrapping those functions with a decorator to make the code less repetitive.
class Queue(object):
    def __init__(self):
        self.connection = Connection()

    def _with_plug(self, fn):
        def wrapper(*args, **kwargs):
            self.probe = self.connection.plug()
            fn(*args, **kwargs)
            self.probe.unplug()

    @_with_plug
    def create(self, name):
        self.probe.create(name)

    @_with_plug
    def delete(self, name):
        self.probe.delete(name)
But this strategy is not working. How could I use a method in the class to decorate other methods to perform such actions before and after calling a method?
The arguments seem a bit muddled to me. In a file deco.py, say:
def _with_plug(fn):  # decorator takes exactly one argument, the function to wrap
    print("wrapping", fn.__name__)
    def wrapper(self, *args, **kwds):
        print("wrapper called")
        self.probe = [self.connection, ".plug()"]
        fn(self, *args, **kwds)
        self.probe.append(".unplug()")
    return wrapper  # decorator must return the wrapped function

class Queue(object):
    def __init__(self):
        self.connection = "Connection()"

    @_with_plug
    def create(self, name):
        self.probe.append("create(name)")

    @_with_plug
    def delete(self, name):
        self.probe.append("delete(name)")
Check:
>>> import deco
wrapping create
wrapping delete
>>> q = deco.Queue()
>>> q.create("name")
wrapper called
>>> q.probe
['Connection()', '.plug()', 'create(name)', '.unplug()']
Observe that the decorator function is called at definition time of the to-be-wrapped function, i.e. before the class definition is completed and long before the first instance is created. Therefore you can't reference self in the way you tried.
You should define your decorator function outside of the class body, and it should return the wrapped function in order for it to work. Something like:
def _with_plug(fn):
    def wrapper(self, *args, **kwargs):
        self.probe = self.connection.plug()
        fn(self, *args, **kwargs)
        self.probe.unplug()
    return wrapper

class Queue(object):
    def __init__(self):
        self.connection = Connection()

    @_with_plug
    def create(self, name):
        self.probe.create(name)

    @_with_plug
    def delete(self, name):
        self.probe.delete(name)
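Assuming Connection().plug() returns a probe object with create(), delete() and unplug() methods, as in the question, usage then looks like:

q = Queue()
q.create("jobs")   # probe is plugged, probe.create("jobs") runs, probe is unplugged
q.delete("jobs")   # same plug/unplug pattern, with no repetition inside the method bodies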
I am confused about why the code below does not work:
class ComparativeAnnotatorConfiguration(HashableNamespace):
    """
    Takes the initial configuration from the main driver script and builds paths to all files that will be produced
    by these tasks.
    """
    def __init__(self, args, gene_set, query_genome_files, target_genome_files, annot_files, transmap):
        self.work_dir = os.path.join(args.workDir, 'comparativeAnnotator', gene_set.sourceGenome, gene_set.geneSet)
        self.metrics_dir = os.path.join(args.outputDir, 'metrics')
        self.tx_set_dir = os.path.join(args.outputDir, 'tm_transcript_set')
        self.reference = self.Reference(args, query_genome_files, annot_files, self.work_dir)
        self.transmap = self.TransMap(args, query_genome_files, target_genome_files, annot_files, transmap, self.work_dir)

    class Reference(HashableNamespace):
        """
        The args object that will be passed directly to jobTree
        """
        def __init__(self, args, query_genome_files, annot_files, out_dir):
            self.__dict__.update(vars(args.jobTreeOptions))
            self.outDir = out_dir
            self.refGenome = query_genome_files.genome
            self.refFasta = query_genome_files.genome_fasta
            self.sizes = query_genome_files.chrom_sizes
            self.annotationGp = annot_files.gp
            self.gencodeAttributes = annot_files.attributes
            self.mode = 'reference'

    class TransMap(Reference):
        """
        The args object that will be passed directly to jobTree
        """
        def __init__(self, args, query_genome_files, target_genome_files, annot_files, transmap, out_dir):
            super(self.__class__, self).Reference.__init__(self, args, query_genome_files, annot_files, out_dir)
            self.genome = target_genome_files.genome
            self.psl = transmap.psl
            self.refPsl = annot_files.psl
            self.targetGp = transmap.gp
            self.fasta = target_genome_files.fasta
            self.mode = 'transMap'
Attempting to instantiate leads to the error:
AttributeError: 'super' object has no attribute 'Reference'
I have tried different versions, such as super(TransMap, self).Reference.__init__ and Reference.__init__, but all give different versions of a NameError. How is this different from the simple case outlined here:
Using super() in nested classes
You want this:
super(ComparativeAnnotatorConfiguration.TransMap, self).__init__(...)
This is a consequence of Python's class scoping rules: class variables are not in scope inside methods. This does not change just because your "variable" is itself a class. As far as Python is concerned, that's exactly the same thing.
In Python 3, you can write the far simpler:
super().__init__(...)
This is yet another reason to upgrade.
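As a minimal sketch of that scoping rule (the classes here are invented purely for illustration), a nested subclass has to name its sibling through the outer class when passing itself to super:

class Outer(object):
    class Base(object):
        def __init__(self, x):
            self.x = x

    class Child(Base):
        def __init__(self, x, y):
            # Base is not in scope inside this method, so the class is
            # qualified through Outer when handed to super().
            super(Outer.Child, self).__init__(x)
            self.y = y

c = Outer.Child(1, 2)
print(c.x, c.y)  # 1 2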
You could use super(ChildClass, self).__init__()
class BaseClass(object):
    def __init__(self, *args, **kwargs):
        pass

class ChildClass(BaseClass):
    def __init__(self, *args, **kwargs):
        super(ChildClass, self).__init__(*args, **kwargs)
Sample code for inheritance and initializing the parent constructor:
class Car(object):
    condition = "new"

    def __init__(self, model, color, mpg):
        self.model = model
        self.color = color
        self.mpg = mpg

class ElectricCar(Car):
    def __init__(self, battery_type, model, color, mpg):
        self.battery_type = battery_type
        super(ElectricCar, self).__init__(model, color, mpg)

car = ElectricCar('battery', 'ford', 'golden', 10)
print car.__dict__
Here's the output:
{'color': 'golden', 'mpg': 10, 'model': 'ford', 'battery_type': 'battery'}
super(self.__class__, self).__init__()
will call the __init__ method of the parent class. (Be aware, though, that self.__class__ here is fragile: if the class is ever subclassed further, the expression resolves to the subclass again and recurses. Naming the class explicitly, as in the examples above, avoids that.)
If I write an inheritance relationship as follows:
class Pet(object):
    def __init__(self, n):
        print n

class Dog(Pet):
    def __init__(self, **kwargs):
        Pet.__init__(self, 5)
Then the output is 5. If, however, I wanted to do:
class Pet(object):
    def __init__(self, **kwargs):
        if not "n" in kwargs:
            raise ValueError("Please specify the number I am to print")
        print kwargs["n"]

class Dog(Pet):
    def __init__(self, **kwargs):
        Pet.__init__(self, kwargs)
Then I get the error TypeError: __init__() takes exactly one argument (two given)
How can I pass the extra arguments up the inheritance chain in this way?
Simply pass the arguments down as keyword arguments:
class Pet(object):
    def __init__(self, **kwargs):
        if not "n" in kwargs:
            raise ValueError("Please specify the number I am to print")
        print kwargs["n"]

class Dog(Pet):
    def __init__(self, **kwargs):
        Pet.__init__(self, **kwargs)
However, you should use super rather than hardcoding the superclass.
Change your definition of Dog to (Python 2.X):
class Dog(Pet):
    def __init__(self, **kwargs):
        super(Dog, self).__init__(**kwargs)
And in Python 3.X it's nicer, just:
class Dog(Pet):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
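With either version the keyword arguments flow up the chain unchanged, for example:

Dog(n=5)   # prints 5
Dog()      # raises ValueError: Please specify the number I am to print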
Figured it out: use Pet.__init__(self, **kwargs).
It's called "unpacking", and it transforms the dictionary kwargs into a set of argument=value keyword pairs.
Then **kwargs in the parent constructor is able to handle them. Just passing the dictionary would not work, because **kwargs in the constructor expects a bunch of argument=value pairs, whereas I was passing it a single value: the dictionary kwargs from the child.
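A small standalone illustration of the difference (show is just an invented example function):

def show(**kwargs):
    print(kwargs)

opts = {"n": 5}
show(**opts)   # unpacked into n=5, so the function sees {'n': 5}
# show(opts)   # TypeError: the dict would arrive as a single positional argument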
I have a class:
class A(object):
    def __init__(self, *args):
        # impl
Also a "mixin", basically another class with some data and methods:
class Mixin(object):
    def __init__(self):
        self.data = []

    def a_method(self):
        # do something
Now I create a subclass of A with the mixin:
class AWithMixin(A, Mixin):
    pass
My problem is that I want the constructors of A and Mixin both called. I considered giving AWithMixin a constructor of its own in which super was called, but the constructors of the superclasses have different argument lists. What is the best resolution?
class A_1(object):
    def __init__(self, *args, **kwargs):
        print 'A_1 constructor'
        super(A_1, self).__init__(*args, **kwargs)

class A_2(object):
    def __init__(self, *args, **kwargs):
        print 'A_2 constructor'
        super(A_2, self).__init__(*args, **kwargs)

class B(A_1, A_2):
    def __init__(self, *args, **kwargs):
        super(B, self).__init__(*args, **kwargs)
        print 'B constructor'

def main():
    b = B()
    return 0

if __name__ == '__main__':
    main()
A_1 constructor
A_2 constructor
B constructor
I'm fairly new to OOP too, but what is the problem with this code:
class AWithMixin(A, Mixin):
    def __init__(self, *args):
        A.__init__(self, *args)
        Mixin.__init__(self)
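With this, both initializers run explicitly, so an instance gets whatever state A.__init__ sets up plus the mixin's data list, for example (assuming the abridged A.__init__ and a_method from the question do something sensible with their arguments):

awm = AWithMixin(1, 2)
print(awm.data)  # [] -- created by Mixin.__init__
awm.a_method()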