Consider the case where we have a simple metaclass that generates the __init__ method for a class:
class TestType(type):
    def __new__(cls, cname, bases, attrs):
        # Dynamically create the __init__ function
        def init(self, message):
            self.message = message
        # Assign the created function as the __init__ method.
        attrs['__init__'] = init
        # Create the class.
        return super().__new__(cls, cname, bases, attrs)

class Test(metaclass=TestType):
    def get_message(self):
        return self.message
Now this is all well and good to use:
test = Test('hello')
assert test.get_message() == 'hello'
But we have problems when subclassing: if the subclass defines its own __init__ method, it of course just gets overwritten by the generated one.
class SubTest(Test):
    def __init__(self, first, second):
        self.first = first
        self.second = second
        super().__init__(first + ' ' + second)

subtest = SubTest('hello', 'there')
This will obviously give the error:
TypeError: init() takes 2 positional arguments but 3 were given
The only way I can think of to solve this is to create an intermediate class in the __new__ method of the metaclass and make that the base for the class being created. But I can't get this to work; I tried something like this:
class TestType(type):
    def __new__(cls, cname, bases, attrs):
        # Dynamically create the __init__ function
        def init(self, message):
            self.message = message
        # If the __init__ method is being subclassed
        if '__init__' in attrs:
            # Store the subclass __init__
            sub_init = attrs.pop('__init__')
            # Assign the created function as the __init__ method.
            attrs['__init__'] = init
            # Create an intermediate class to become the base.
            interm_base = type(cname + 'Intermediate', bases, attrs)
            # Add the intermediate class as our base.
            bases = (interm_base,)
            # Assign the subclass __init__ as the __init__ method.
            attrs['__init__'] = sub_init
        else:
            # Assign the created function as the __init__ method.
            attrs['__init__'] = init
        # Create the class.
        return super().__new__(cls, cname, bases, attrs)
But this gives me a recursion error:
RecursionError: maximum recursion depth exceeded while calling a Python object
The infinite recursion is caused by the fact that the type constructor can return an instance of your metaclass.
In this line here:
interm_base = type(cname + 'Intermediate', bases, attrs)
If any of the base classes in bases is an instance of TestType, then the subclass will also be an instance of TestType. That is why Test can be created with no problems, but SubTest causes infinite recursion.
The fix is simple: Create the intermediate class without an __init__ attribute. That way if '__init__' in attrs: will be False, and the endless recursion is avoided.
class TestType(type):
    def __new__(cls, cname, bases, attrs):
        # Dynamically create the __init__ function
        def init(self, message):
            self.message = message
        # If the __init__ method is being subclassed
        if '__init__' in attrs:
            # Create an intermediate class to become the base.
            interm_base = type(cname + 'Intermediate', bases, {})
            # Add the intermediate class as our base.
            bases = (interm_base,)
        else:
            # Assign the created function as the __init__ method.
            attrs['__init__'] = init
        # Create the class.
        return super().__new__(cls, cname, bases, attrs)
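To confirm the fix, here is the full example run end to end (restating the fixed metaclass so the snippet is self-contained):

```python
class TestType(type):
    def __new__(cls, cname, bases, attrs):
        def init(self, message):
            self.message = message
        if '__init__' in attrs:
            # Push the generated __init__ into an intermediate base; since the
            # intermediate's attrs dict has no '__init__', recursion stops there.
            interm_base = type(cname + 'Intermediate', bases, {})
            bases = (interm_base,)
        else:
            attrs['__init__'] = init
        return super().__new__(cls, cname, bases, attrs)

class Test(metaclass=TestType):
    def get_message(self):
        return self.message

class SubTest(Test):
    def __init__(self, first, second):
        self.first = first
        self.second = second
        super().__init__(first + ' ' + second)

subtest = SubTest('hello', 'there')
print(subtest.get_message())  # hello there
```

The subclass's __init__ survives, and its super().__init__ call lands on the generated init that now lives on SubTestIntermediate.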
So I have super(MyMeta, cls).__new__(cls, name, bases, attr) to call the default __new__ function. Can I do something similar for __init__, to call the default __init__ function of a metaclass?
class MyMeta(type):
    def __new__(cls, name, bases, attr):
        print("MyMeta.__new__ called")
        return super(MyMeta, cls).__new__(cls, name, bases, attr)

    def __init__(cls, name, bases, namespace, **kwargs):
        print("MyMeta.__init__ called")
        # how to call the default __init__()?
The correct way is super().__init__(name, bases, namespace, **kwargs) - super() will include the first argument (cls) automatically.
Also, there is no need to pass any parameters to super() (as there is no such need inside __new__): these are filled in automatically by the runtime.
When doing super().__new__(...) one has to explicitly include the first argument (the metaclass itself), because __new__ is a static method, and super does not fill in any arguments automatically.
For the sake of completeness: the default metaclass implementation of __init__ (type.__init__) does nothing, and might just not be called at all. If you do include an __init__ method in a metaclass, though, it is important to call it via super() so that your metaclass can be used cooperatively with other metaclasses when needed.
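A minimal sketch of that cooperative pattern (class names here are illustrative, not from the question):

```python
class ReportingMeta(type):
    def __new__(cls, name, bases, attrs):
        # __new__ is a static method: pass cls explicitly.
        return super().__new__(cls, name, bases, attrs)

    def __init__(cls, name, bases, attrs, **kwargs):
        # Cooperative call: type.__init__ does nothing here, but a sibling
        # metaclass in a larger hierarchy might, so always delegate.
        super().__init__(name, bases, attrs, **kwargs)
        cls.initialized_by = 'ReportingMeta'

class Widget(metaclass=ReportingMeta):
    pass

print(Widget.initialized_by)  # ReportingMeta
```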
class MetaClass(type):
    """
    A class is an object: by default, `type` creates the in-memory class object.
    An instance is an object: the class creates the in-memory instance object.
    """
    meta_counter = 0

    def __new__(cls, name, bases, attr):
        print("***MetaClass.__new__(): control class object creation")
        cls_object = super(MetaClass, cls).__new__(cls, name, bases, attr)
        return cls_object

    def __init__(cls, name, bases, namespace, **kwargs):
        print("***MetaClass.__init__(): class-wide initializer")
        cls.cls_order = MetaClass.meta_counter
        MetaClass.meta_counter += 1

class AClass(metaclass=MetaClass):
    def __init__(self):
        print("AClass.__init__()")

class BClass(metaclass=MetaClass):
    pass

class XClass:
    def __new__(cls, name, *args, **kwargs):
        print("XClass.__new__(): constructor")
        instance = super(XClass, cls).__new__(cls, *args, **kwargs)
        return instance

    def __init__(self, *args, **kwargs):
        print("XClass.__init__(): instance-wide initializer")
        self.id = 1

print("\n\n---> ")
a = AClass()
print(a.cls_order)
b = BClass()
print(b.cls_order)
print(MetaClass.meta_counter)

print("\n\n***> ")
XClass('my-name')
class Remote:
    aa = 7
    def __init__(self):
        self.name = "Lenovo"
        self.b = self.Battery()
        print("this is outer", self.b.t)

    class Battery:
        def __init__(self):
            self.name = "Hp"
            self.t = "df"
            self.c = self.Cover()

        class Cover:
            def __init__(self):
                self.name = "Arplastic"

c1 = Remote()
I learned about inner classes today, but I don't know how to access the properties and methods of the outer class from inside the inner class. Can anyone let me know?
Change the constructor(s) of the inner class(es) to accept a parent argument and have the creating instance pass itself to it:
class Remote:
    aa = 7
    def __init__(self):
        self.name = "Lenovo"
        self.b = self.Battery(self)
        print("this is outer", self.b.t)

    class Battery:
        def __init__(self, parent):
            self.name = "Hp"
            self.t = "df"
            self.c = self.Cover(self)
            self.parent = parent

        class Cover:
            def __init__(self, parent):
                self.name = "Arplastic"
                self.parent = parent

c1 = Remote()
print(c1.b.c.parent.parent.name)  # prints 'Lenovo'
One approach is to make a metaclass that automatically creates self.parent attributes for nested classes. Note that there is a trade-off between readability and boilerplate here - many programmers would rather you just manually pass parents as arguments and add them to __init__ methods. This is more fun though, and there is something to be said for having less cluttered code.
Here is the code:
import inspect

def inner_class(cls):
    cls.__is_inner_class__ = True
    return cls

class NestedClass(type):
    def __new__(metacls, name, bases, attrs, parent=None):
        attrs = dict(attrs.items())
        super_getattribute = attrs.get('__getattribute__', object.__getattribute__)
        inner_class_cache = {}
        def __getattribute__(self, attr):
            val = super_getattribute(self, attr)
            if inspect.isclass(val) and getattr(val, '__is_inner_class__', False):
                if (self, val) not in inner_class_cache:
                    inner_class_cache[self, val] = NestedClass(val.__name__, val.__bases__, val.__dict__, parent=self)
                return inner_class_cache[self, val]
            else:
                return val
        attrs['__getattribute__'] = __getattribute__
        attrs['parent'] = parent
        return type(name, bases, attrs)

class Remote(metaclass=NestedClass):
    aa = 7
    def __init__(self):
        self.name = "Lenovo"
        self.b = self.Battery()
        print("this is outer", self.b.t)

    @inner_class
    class Battery:
        def __init__(self):
            self.name = "Hp"
            self.t = "df"
            self.c = self.Cover()

        @inner_class
        class Cover:
            def __init__(self):
                self.name = "Arplastic"
                print(f'{self.parent=}, {self.parent.parent=}')

c1 = Remote()
print(f'{c1.b.c.parent.parent is c1=}')
print(f'{isinstance(c1.b, c1.Battery)=}')
Output:
self.parent=<__main__.Battery object at 0x7f11e74936a0>, self.parent.parent=<__main__.Remote object at 0x7f11e7493730>
this is outer df
c1.b.c.parent.parent is c1=True
isinstance(c1.b, c1.Battery)=True
The way this works is by storing the parent as a class attribute (which is None by default), and replacing the __getattribute__ method so that all inner classes are replaced with NestedClasses with the parent attribute correctly filled in.
The inner_class decorator is used to mark a class as an inner class by setting the __is_inner_class__ attribute.
def inner_class(cls):
    cls.__is_inner_class__ = True
    return cls
This is not strictly necessary if all attributes that are classes should be treated as inner classes, but it's good practice to do something like this to prevent Bar.foo being treated as an inner class in this example:
class Foo:
    pass

class Bar(metaclass=NestedClass):
    foo = Foo
All the NestedClass metaclass does is take the description of the class and modify it, adding the parent attribute:
class NestedClass(type):
    def __new__(metacls, name, bases, attrs, parent=None):
        attrs = dict(attrs.items())
        ...
        attrs['parent'] = parent
        return type(name, bases, attrs)
...and modifying the __getattribute__ method. The __getattribute__ method is a special method that gets called every time an attribute is accessed. For example:
class Foo:
    def __init__(self):
        self.bar = "baz"
    def __getattribute__(self, item):
        return 1

foo = Foo()
# These assertions pass: even though `foo.bar` is set to "baz" and `foo.remote`
# doesn't exist, accessing either is the same as calling `Foo.__getattribute__(foo, ...)`
assert foo.bar == 1
assert foo.remote == 1
So, by modifying the __getattribute__ method, you can make accessing self.Battery return a class that has its parent attribute equal to self, and also make it into a nested class:
class NestedClass(type):
    def __new__(metacls, name, bases, attrs, parent=None):
        attrs = dict(attrs.items())
        # get the previous __getattribute__ in case it was not the default one
        super_getattribute = attrs.get('__getattribute__', object.__getattribute__)
        inner_class_cache = {}
        def __getattribute__(self, attr):
            # get the attribute
            val = super_getattribute(self, attr)
            if inspect.isclass(val) and getattr(val, '__is_inner_class__', False):
                # if it is an inner class, then make a new version of it using the
                # NestedClass metaclass, setting the parent attribute
                if (self, val) not in inner_class_cache:
                    inner_class_cache[self, val] = NestedClass(val.__name__, val.__bases__, val.__dict__, parent=self)
                return inner_class_cache[self, val]
            else:
                return val
        attrs['__getattribute__'] = __getattribute__
        attrs['parent'] = parent
        return type(name, bases, attrs)
Note that a cache is used to ensure that self.Battery will always return the same object every time rather than re-making the class every time it is called. This ensures that checks like isinstance(c1.b, c1.Battery) work correctly, since otherwise c1.Battery would return a different object to the one used to create c1.b, causing this to return False, when it should return True.
And that's it! You can now enjoy nested classes without boilerplate!
Is there a way to wrap all methods of a superclass, if I can't change its code?
As a minimal working example, consider this base class Base, which has many methods that return a new instance of itself, and the descendant class Child:
class Base:
    def __init__(self, val):
        self.val = val
    def newinst_addseven(self):
        return Base(self.val + 7)
    def newinst_timestwo(self):
        return Base(self.val * 2)
    # ...

class Child(Base):
    @property
    def sqrt(self):
        return math.sqrt(self.val)
The issue here is that calling childinstance.newinst_addseven() returns an instance of Base, instead of Child.
Is there a way to wrap the Base class's methods to force a return value of the type Child?
With something like this wrapper:
def force_child_i(result):
    """Turn Base instance into Child instance."""
    if type(result) is Base:
        return Child(result.val)
    return result

def force_child_f(fun):
    """Turn from Base- to Child-instance-returning function."""
    def wrapper(*args, **kwargs):
        result = fun(*args, **kwargs)
        return force_child_i(result)
    return wrapper
Many thanks!
PS: What I currently do is look at Base's source code and add the methods to Child directly, which is not very maintainable:
Child.newinst_addseven = force_child_f(Base.newinst_addseven)
Child.newinst_timestwo = force_child_f(Base.newinst_timestwo)
One option is to use a metaclass:
class ChildMeta(type):
    def __new__(cls, name, bases, dct):
        child = super().__new__(cls, name, bases, dct)
        for base in bases:
            for field_name, field in base.__dict__.items():
                if callable(field):
                    setattr(child, field_name, force_child(field))
        return child

class Child(Base, metaclass=ChildMeta):
    pass
It will automatically wrap all of Base's methods with your force_child decorator.
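Putting the pieces together (a sketch: force_child here is the question's force_child_f and force_child_i inlined under the name the answer uses):

```python
import math

class Base:
    def __init__(self, val):
        self.val = val
    def newinst_addseven(self):
        return Base(self.val + 7)
    def newinst_timestwo(self):
        return Base(self.val * 2)

def force_child(fun):
    """Wrap fun so that Base return values are converted to Child."""
    def wrapper(*args, **kwargs):
        result = fun(*args, **kwargs)
        if type(result) is Base:
            return Child(result.val)
        return result
    return wrapper

class ChildMeta(type):
    def __new__(cls, name, bases, dct):
        child = super().__new__(cls, name, bases, dct)
        for base in bases:
            for field_name, field in base.__dict__.items():
                if callable(field):
                    setattr(child, field_name, force_child(field))
        return child

class Child(Base, metaclass=ChildMeta):
    @property
    def sqrt(self):
        return math.sqrt(self.val)

c = Child(9).newinst_addseven()
print(type(c).__name__, c.val)  # Child 16
```

Note that force_child refers to Child lazily (only when a wrapped method runs), so defining the wrapper before the Child class exists is fine.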
I want to redefine a __metaclass__ but I want to fall back to the metaclass which would have been used if I hadn't redefined.
class ComponentMetaClass(type):
    def __new__(cls, name, bases, dct):
        return <insert_prev_here>.__new__(cls, name, bases, dct)

class Component(OtherObjects):
    __metaclass__ = ComponentMetaClass
From what I understand, the __metaclass__ used by default goes through the process of checking for a definition in the scope of the class, then in the bases and then in global. Normally you would use type in the redefinition and that is usually the global one, however, my OtherObjects, may have redefined the __metaclass__. So in using type, I would ignore their definition and they wouldn't run, right?
edit: note that I don't know what OtherObjects are until runtime
As @unutbu puts it: "Within one class hierarchy, metaclasses must be subclasses of each other. That is, the metaclass of Component must be a subclass of the metaclass of OtherObjects."
Which means your problem is a bit more complicated than you first thought: not only do you have to call the proper metaclass from the base classes, but your current metaclass also has to inherit from them properly.
(hack some code, confront strange behavior, come back 90 min later)
It was tricky indeed - I had to create a class that receives the desired metaclass as a parameter, and whose __call__ method dynamically generates a new metaclass, modifying its bases and adding a __superclass attribute to it.
But this should do what you want and some more - you just have to inherit all your metaclasses from BaseComponableMeta and call the superclasses in the hierarchy through the metaclass's "__superclass" attribute:
from itertools import chain

class Meta1(type):
    def __new__(metacls, name, bases, dct):
        print name
        return type.__new__(metacls, name, bases, dct)

class BaseComponableMeta(type):
    def __new__(metacls, *args, **kw):
        return metacls.__superclass.__new__(metacls, *args, **kw)

class ComponentMeta(object):
    def __init__(self, metaclass):
        self.metaclass = metaclass

    def __call__(self, name, bases, dct):
        # retrieves the deepest previous metaclass in the object hierarchy
        bases_list = sorted((cls for cls in chain(*(base.mro() for base in bases))),
                            key=lambda s: len(type.mro(s.__class__)))
        previous_metaclass = bases_list[-1].__class__
        # Adds the "__superclass" attribute to the metaclass, so that it can call
        # its bases:
        metaclass_dict = dict(self.metaclass.__dict__).copy()
        new_metaclass_name = self.metaclass.__name__
        metaclass_dict["_%s__superclass" % new_metaclass_name] = previous_metaclass
        # dynamically generates a new metaclass for this class:
        new_metaclass = type(new_metaclass_name, (previous_metaclass,), metaclass_dict)
        return new_metaclass(name, bases, dct)

# From here on, example usage:
class ComponableMeta(BaseComponableMeta):
    pass

class NewComponableMeta_1(BaseComponableMeta):
    def __new__(metacls, *args):
        print "Overriding the previous metaclass part 1"
        return metacls.__superclass.__new__(metacls, *args)

class NewComponableMeta_2(BaseComponableMeta):
    def __new__(metacls, *args):
        print "Overriding the previous metaclass part 2"
        return metacls.__superclass.__new__(metacls, *args)

class A(object):
    __metaclass__ = Meta1

class B(A):
    __metaclass__ = ComponentMeta(ComponableMeta)

# trying multiple inheritance, and subclassing the metaclass once:
class C(B, A):
    __metaclass__ = ComponentMeta(NewComponableMeta_1)

# Adding a third metaclass to the chain:
class D(C):
    __metaclass__ = ComponentMeta(NewComponableMeta_2)

# class with a "do nothing" metaclass, which calls its bases' metaclasses:
class E(D):
    __metaclass__ = ComponentMeta(ComponableMeta)
Within one class hierarchy, metaclasses must be subclasses of each other. That is, the metaclass of Component must be a subclass of the metaclass of OtherObjects.
If you don't name a __metaclass__ for Component then the metaclass of OtherObjects will be used by default.
If ComponentMetaClass and OtherObjectsMeta both inherit (independently) from type:
class OtherObjectsMeta(type): pass
class ComponentMetaClass(type): pass

class OtherObjects(object):
    __metaclass__ = OtherObjectsMeta

class Component(OtherObjects):
    __metaclass__ = ComponentMetaClass
then you get this error:
TypeError: Error when calling the metaclass bases
metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
but if you make ComponentMetaClass a subclass of OtherObjectsMeta:
class ComponentMetaClass(OtherObjectsMeta): pass
then the error goes away.
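For reference, the same situation in Python 3 syntax (where the metaclass is passed as a keyword argument instead of a __metaclass__ attribute) shows the same subclassing requirement:

```python
class OtherObjectsMeta(type):
    pass

class OtherObjects(metaclass=OtherObjectsMeta):
    pass

# The derived class's metaclass must be a (non-strict) subclass of the
# metaclasses of all its bases, so inherit from OtherObjectsMeta:
class ComponentMetaClass(OtherObjectsMeta):
    pass

class Component(OtherObjects, metaclass=ComponentMetaClass):
    pass

print(type(Component).__name__)  # ComponentMetaClass
```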
Perhaps I misread your question. If you want ComponentMetaClass.__new__ to call OtherObjectsMeta.__new__, then use super:
class OtherObjectsMeta(type):
    def __new__(meta, name, bases, dct):
        print('OtherObjectsMeta')
        return super(OtherObjectsMeta, meta).__new__(meta, name, bases, dct)

class ComponentMetaClass(OtherObjectsMeta):
    def __new__(meta, name, bases, dct):
        print('ComponentMetaClass')
        return super(ComponentMetaClass, meta).__new__(meta, name, bases, dct)
Regarding an alternative to using metaclasses, mentioned in the comments. Use super:
class Base(object):
    def method(self): pass

class Base1(Base):
    def method(self):
        print('Base1')
        super(Base1, self).method()

class Base2(Base):
    def method(self):
        print('Base2')
        super(Base2, self).method()

class Component(Base1, Base2):
    pass

c = Component()
c.method()
Is it possible to chain metaclasses?
I have class Model which uses __metaclass__=ModelBase to process its namespace dict. I'm going to inherit from it and "bind" another metaclass so it won't shade the original one.
First approach is to subclass:

class MyModelBase(ModelBase):
    pass

class MyModel(Model):
    __metaclass__ = MyModelBase  # inherits from `ModelBase`
But is it possible just to chain them like mixins, without explicit subclassing? Something like:

class MyModel(Model):
    __metaclass__ = (MyMixin, super(Model).__metaclass__)
... or even better: create a MixIn that will use __metaclass__ from the direct parent of the class that uses it:

class MyModel(Model):
    __metaclass__ = MyMetaMixin,  # Automagically uses `Model.__metaclass__`
The reason: For more flexibility in extending existing apps, I want to create a global mechanism for hooking into the process of Model, Form, ... definitions in Django so it can be changed at runtime.
A common mechanism would be much better than implementing multiple metaclasses with callback mixins.
With your help I finally managed to come up with a solution: metaclass MetaProxy.
The idea is: create a metaclass that invokes a callback to modify the namespace of the class being created, and then, with the help of __new__, mutates into the metaclass of one of the parents.
#!/usr/bin/env python
#-*- coding: utf-8 -*-

# Magical metaclass
class MetaProxy(type):
    """ Decorate the class being created & preserve __metaclass__ of the parent

        It executes two callbacks: before & after creation of a class,
        that allows you to decorate them.

        Between the two callbacks, it tries to locate any `__metaclass__`
        in the parents (sorted in MRO).
        If found - with the help of the `__new__` method - it
        mutates to the found base metaclass.
        If not found - it just instantiates the given class.
    """

    @classmethod
    def pre_new(cls, name, bases, attrs):
        """ Decorate a class before creation """
        return (name, bases, attrs)

    @classmethod
    def post_new(cls, newclass):
        """ Decorate a class after creation """
        return newclass

    @classmethod
    def _mrobases(cls, bases):
        """ Expand the tuple of base classes ``bases`` in MRO """
        mrobases = []
        for base in bases:
            if base is not None:  # We don't like `None` :)
                mrobases.extend(base.mro())
        return mrobases

    @classmethod
    def _find_parent_metaclass(cls, mrobases):
        """ Find any __metaclass__ callable in ``mrobases`` """
        for base in mrobases:
            if hasattr(base, '__metaclass__'):
                metacls = base.__metaclass__
                if metacls and not issubclass(metacls, cls):  # don't call self again
                    return metacls  # (name, bases, attrs)
        # Not found: use `type`
        return lambda name, bases, attrs: type.__new__(type, name, bases, attrs)

    def __new__(cls, name, bases, attrs):
        mrobases = cls._mrobases(bases)
        name, bases, attrs = cls.pre_new(name, bases, attrs)  # Decorate, pre-creation
        newclass = cls._find_parent_metaclass(mrobases)(name, bases, attrs)
        return cls.post_new(newclass)  # Decorate, post-creation

# Testing
if __name__ == '__main__':
    # Original classes. We won't touch them
    class ModelMeta(type):
        def __new__(cls, name, bases, attrs):
            attrs['parentmeta'] = name
            return super(ModelMeta, cls).__new__(cls, name, bases, attrs)

    class Model(object):
        __metaclass__ = ModelMeta
        # Try to subclass me but don't forget about `ModelMeta`

    # Decorator metaclass
    class MyMeta(MetaProxy):
        """ Decorate a class

            Being a subclass of `MetaProxy`,
            it will call base metaclasses after decorating
        """
        @classmethod
        def pre_new(cls, name, bases, attrs):
            """ Set `washere` to classname """
            attrs['washere'] = name
            return super(MyMeta, cls).pre_new(name, bases, attrs)

        @classmethod
        def post_new(cls, newclass):
            """ Append '!' to `.washere` """
            newclass.washere += '!'
            return super(MyMeta, cls).post_new(newclass)

    # Here goes the inheritance...
    class MyModel(Model):
        __metaclass__ = MyMeta
        a = 1

    class MyNewModel(MyModel):
        __metaclass__ = MyMeta  # Still have to declare it: __metaclass__ does not inherit
        a = 2

    class MyNewNewModel(MyNewModel):
        # Will use the original ModelMeta
        a = 3

    class A(object):
        __metaclass__ = MyMeta  # No __metaclass__ in parents: just instantiate
        a = 4

    class B(A):
        pass  # MyMeta is not called until specified explicitly

    # Make sure we did everything right
    assert MyModel.a == 1
    assert MyNewModel.a == 2
    assert MyNewNewModel.a == 3
    assert A.a == 4

    # Make sure the callbacks worked
    assert hasattr(MyModel, 'washere')
    assert hasattr(MyNewModel, 'washere')
    assert hasattr(MyNewNewModel, 'washere')  # inherited
    assert hasattr(A, 'washere')
    assert MyModel.washere == 'MyModel!'
    assert MyNewModel.washere == 'MyNewModel!'
    assert MyNewNewModel.washere == 'MyNewModel!'  # inherited, so unchanged
    assert A.washere == 'A!'
A type can have only one metaclass, because a metaclass simply states what the class statement does - having more than one would make no sense. For the same reason "chaining" makes no sense: the first metaclass creates the type, so what is the 2nd supposed to do?
You will have to merge the two metaclasses (just like with any other class). But that can be tricky, especially if you don't really know what they do.
class MyModelBase(type):
    def __new__(cls, name, bases, attr):
        attr['MyModelBase'] = 'was here'
        return type.__new__(cls, name, bases, attr)

class MyMixin(type):
    def __new__(cls, name, bases, attr):
        attr['MyMixin'] = 'was here'
        return type.__new__(cls, name, bases, attr)

class ChainedMeta(MyModelBase, MyMixin):
    def __init__(cls, name, bases, attr):
        # call both parents
        MyModelBase.__init__(cls, name, bases, attr)
        MyMixin.__init__(cls, name, bases, attr)

    def __new__(cls, name, bases, attr):
        # so, how is the new type supposed to look?
        # maybe create the first
        t1 = MyModelBase.__new__(cls, name, bases, attr)
        # and pass its data on to the next?
        name = t1.__name__
        bases = tuple(t1.mro())
        attr = t1.__dict__.copy()
        t2 = MyMixin.__new__(cls, name, bases, attr)
        return t2

class Model(object):
    __metaclass__ = MyModelBase

class MyModel(Model):
    __metaclass__ = ChainedMeta

print MyModel.MyModelBase
print MyModel.MyMixin
As you can see, this already involves some guesswork, since you don't really know what the other metaclasses do. If both metaclasses are really simple this might work, but I wouldn't have too much confidence in a solution like this.
Writing a metaclass for metaclasses that merges multiple bases is left as an exercise to the reader ;-P
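That said, if both metaclasses are written cooperatively (each deferring via super().__new__ rather than calling type.__new__ directly), merging often reduces to plain multiple inheritance; a sketch in Python 3 syntax:

```python
class MyModelBase(type):
    def __new__(cls, name, bases, attrs):
        attrs['MyModelBase'] = 'was here'
        return super().__new__(cls, name, bases, attrs)

class MyMixin(type):
    def __new__(cls, name, bases, attrs):
        attrs['MyMixin'] = 'was here'
        return super().__new__(cls, name, bases, attrs)

class ChainedMeta(MyModelBase, MyMixin):
    # Nothing to write: the MRO (MyModelBase -> MyMixin -> type)
    # chains the two __new__ implementations automatically.
    pass

class MyModel(metaclass=ChainedMeta):
    pass

print(MyModel.MyModelBase, MyModel.MyMixin)  # was here was here
```

This only works because each metaclass passes control along with super(); metaclasses that hard-code type.__new__ (as in the example above) break the chain.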
I don't know any way to "mix" metaclasses, but you can inherit and override them just like you would normal classes.
Say I've got a BaseModel:
class BaseModel(object):
    __metaclass__ = Blah
and now you want to inherit this in a new class called MyModel, but you want to insert some additional functionality into the metaclass while otherwise leaving the original functionality intact. To do that, you'd do something like:
class MyModelMetaClass(BaseModel.__metaclass__):
    def __init__(cls, *args, **kwargs):
        do_custom_stuff()
        super(MyModelMetaClass, cls).__init__(*args, **kwargs)
        do_more_custom_stuff()

class MyModel(BaseModel):
    __metaclass__ = MyModelMetaClass
I don't think you can chain them like that, and I don't know how that would work either.
But you can make new metaclasses during runtime and use them. But that's a horrid hack. :)
zope.interface does something similar: it has an advisor metaclass that just does some things to the class after construction. If there was a metaclass already, one of the things it does is set that previous metaclass back as the metaclass once it's finished.
(However, avoid doing these kinds of things unless you have to, or think it's fun.)
Adding to the answer by @jochenritzel, the following simplifies the combining step:
def combine_classes(*args):
    name = "".join(a.__name__ for a in args)
    return type(name, args, {})

class ABCSomething(object, metaclass=combine_classes(SomethingMeta, ABCMeta)):
    pass
Here, type(name, bases, dict) works like a dynamic class statement (see docs). Surprisingly, there doesn't seem to be a way to use the dict argument for setting the metaclass in the second step. Otherwise one could simplify the whole process down to a single function call.