Set all classes within a package as class attributes - Python

Given the following package structure:
master.py
models/
    __init__.py
    model1.py
    model2.py
    model3.py
where each model*.py file contains one or more classes, how can I create a class in master.py that dynamically imports each of those classes and sets it as an attribute of itself? Currently I have the following:
class DB:
    def __init__(self):
        from models.model1 import foo, bar
        from models.model2 import foo2
        from models.model3 import bar2
        self.foo = foo
        self.bar = bar
        self.foo2 = foo2
        self.bar2 = bar2
but this requires me to state each imported module and class explicitly. I'd like the process to be automatic/soft-coded, so that if I add or remove models later, I won't need to update the DB class.

Well... I don't think it's the best way to do this — I would personally prefer to handle all the imports manually. But I'll assume you have your reasons and answer your question anyway.
First of all, you'll need models to be a package, so make sure there's a models/__init__.py file before proceeding.
import inspect
import os
import sys

class DB(object):
    def __init__(self):
        # Get current file (`master.py`)'s directory and the `models` directory.
        selfdir = os.path.dirname(os.path.abspath(__file__))
        models_dir = os.path.join(selfdir, 'models')
        # Get all files in `models` that are Python files.
        models_files = [file for file in os.listdir(models_dir)
                        if os.path.isfile(os.path.join(models_dir, file)) and file.endswith('.py')]
        model_names = [model[:-3] for model in models_files]
        for model_name in model_names:
            # Exploit Python's ability to execute arbitrary code.
            module_name = 'models.{name}'.format(name=model_name)
            exec("import {module}".format(module=module_name))
            module = sys.modules[module_name]
            classes = inspect.getmembers(module, inspect.isclass)
            for module_class in classes:
                setattr(self, module_class[0], module_class[1])
I tested it (in 2.7.11) and it works great, so let me know if something doesn't work for you. There are a few assumptions here that you'll just have to enforce in your codebase, probably via documentation. I didn't handle the IO exceptions I probably should have around the models directory, but you can add that. Additionally, somebody could probably craft a clever file name and force arbitrary code execution, so be careful.
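If you'd rather avoid exec altogether, here's a minimal sketch of the same idea using pkgutil and importlib; the models package name comes from the question, while the check that a class was actually defined in the model file (rather than merely imported into it) is my own assumption:

import importlib
import inspect
import pkgutil

import models  # the package from the question

class DB(object):
    def __init__(self):
        # Walk every module inside the `models` package.
        for _, module_name, _ in pkgutil.iter_modules(models.__path__):
            module = importlib.import_module('models.' + module_name)
            # Keep only classes defined in that module, not ones it imports.
            for name, cls in inspect.getmembers(module, inspect.isclass):
                if cls.__module__ == module.__name__:
                    setattr(self, name, cls)

This behaves like the exec version but never evaluates a string, so an oddly named file can at worst fail to import.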


get abstract class info declared in another module in base class module

UPDATE: I am having trouble getting the names of derived classes declared in other modules from within the base class module.
Use case: I would like to create a common interface to call the subclasses from the base class.
Example:
#basemodule.py
class BaseClass(object):
    #abstractmethod
    def method(self, inputs=None):
        pass

def get_subclasses():
    #I want derived class information here
    for cls in list(BaseClass.__subclasses__):
        # call subclasses here

# module1.py
import BaseClass
class derivedClass1(BaseClass):
    def __init__(self):
        super().__init__()
    def method(self, inputs):
        # method implemented in derivedClass1

# module2.py
import BaseClass
class derivedClass2(BaseClass):
    def __init__(self):
        super().__init__()
    def method(self, inputs):
        # method implemented in derivedClass2
Any suggestions are appreciated! Thanks :)
Your question is a little bit vague, and lacks important details, like the error you're getting, and a bit more context regarding your actual goals here. Having said that, I've noticed a couple of things that might be the cause of the problems you're having:
BaseClass.__subclasses__ is a method, so you need to call it rather than access it the way you would a class property or attribute. To do so, use BaseClass.__subclasses__() instead.
For your get_subclasses() function to work, you first need to import the subclasses. Otherwise, Python won't know which classes inherit from BaseClass.
Corrected code
Here's the correct implementation of get_subclasses() function, as mentioned above:
from basemodule import BaseClass

def get_subclasses():
    """Get subclasses from `basemodule.BaseClass`."""
    for cls in list(BaseClass.__subclasses__()):
        # call subclasses here
        print(cls.__name__)  # Added print statement to test the solution.
Example
Without importing module1 and module2
If I call get_subclasses() without importing the modules that host the subclasses, BaseClass.__subclasses__() returns an empty list, so nothing is printed.
Importing module1 and module2
When I import both modules that host the subclasses, the call prints derivedClass1 and derivedClass2, which is the output I think you're expecting.
Full-code
Here's the full code of the examples:
# my_pckg/basemodule.py
from abc import ABCMeta, abstractmethod

class BaseClass(object):
    @abstractmethod
    def method(self, inputs=None):
        pass

# ================================================

# my_pckg/module1.py
from my_pckg.basemodule import BaseClass

class derivedClass1(BaseClass):
    def __init__(self):
        super().__init__()
    def method(self, inputs):
        # method implemented in derivedClass1
        pass

# ================================================

# my_pckg/module2.py
from my_pckg.basemodule import BaseClass

class derivedClass2(BaseClass):
    def __init__(self):
        super().__init__()
    def method(self, inputs):
        # method implemented in derivedClass2
        pass

# ================================================

# my_pckg/test.ipynb
from basemodule import BaseClass
from module1 import *
from module2 import *

def get_subclasses():
    """Get subclasses from `basemodule.BaseClass`."""
    for cls in list(BaseClass.__subclasses__()):
        # call subclasses here
        print(cls.__name__)  # Added print statement to test the solution.

get_subclasses()
# Prints:
# derivedClass1
# derivedClass2
Important notes
The imports shown in the test example above won't work if you're trying to use them from outside the parent package. In the example I just gave, here's the complete tree view of the entire package structure:
my_pckg
|____ __init__.py   # <-- Needed to make my_pckg submodules "importable".
|____ basemodule.py # <-- Hosts the BaseClass class.
|____ module1.py    # <-- Hosts the derivedClass1 subclass.
|____ module2.py    # <-- Hosts the derivedClass2 subclass.
|____ Test.ipynb    # <-- Where the test above was run.
If you want to import these modules from outside the package, you have two options:
Create a setup for your package and pip install it (use the -e flag to install it in development mode); see the setup.py sketch after the next example.
Import sys, and add the my_pckg path to the known paths:
import sys
sys.path.insert(0, './my_pckg')

from basemodule import BaseClass
from module1 import *
from module2 import *

def get_subclasses():
    """Get subclasses from `basemodule.BaseClass`."""
    for cls in list(BaseClass.__subclasses__()):
        # call subclasses here
        print(cls.__name__)  # Added print statement to test the solution.

get_subclasses()
# Prints:
# derivedClass1
# derivedClass2
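For the first option, a minimal setup.py sketch could look like the following; the name and version values are placeholders, and the file is assumed to sit next to the my_pckg directory:

from setuptools import setup, find_packages

setup(
    name='my_pckg',
    version='0.1.0',
    packages=find_packages(),
)

With that in place, run pip install -e . from the directory containing setup.py, and the my_pckg modules become importable from anywhere.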
Circular Imports
Do NOT import module1 and module2 inside basemodule, as this leads to a circular import. When you import basemodule, Python sees that it needs to import module1 and module2 and goes to those modules. There, it finds out that both of them require basemodule themselves, so it goes back to basemodule. This becomes a circular dependency in which no module can finish importing. To overcome this, place the get_subclasses() function in a separate module, alongside all the necessary imports, as in the test example above and the sketch below.
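A minimal sketch of that layout follows; the module name registry.py is just an assumed example, any module outside basemodule works:

# my_pckg/registry.py -- lives alongside basemodule, module1 and module2
from my_pckg.basemodule import BaseClass
# Importing these modules is what registers the subclasses;
# basemodule itself never imports them, so there is no cycle.
from my_pckg import module1, module2

def get_subclasses():
    """Return the currently known subclasses of BaseClass."""
    return list(BaseClass.__subclasses__())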

How to set up python package/module directory and class definition for intuitive naming

I've been searching for this for ages and can't find anything. I'm an experienced programmer, but recently switched to Python.
In short, I want to expand/evolve an object-oriented library of related classes into a namespace spread over multiple separate files (due to complexity and the desire for increased modularity):
Currently, I might have:
# file lib/NameSpace.py
class Foo(object):
    def __init__(self):
        ....

class Bar(object):
    def __init__(self):
        ....

class Baz(object):
    def __init__(self):
        ....
And with this my program can do this:
import NameSpace
a = NameSpace.Foo()
b = NameSpace.Bar()
Now, these classes are getting complicated and Foo and Bar are functionally different though related in the conceptual NameSpace, so I want to move them to separate files within the namespace but otherwise keep my code library the same.
So, I think I want a file structure like this:
lib/NameSpace
lib/NameSpace/__init__.py
lib/NameSpace/Foo.py
lib/NameSpace/Bar.py
lib/NameSpace/Baz.py
But this would require me to change all the runtime code to initialize these as so:
import NameSpace.Foo
a = NameSpace.Foo.Foo()
# ***Boo.**** Why u not like 'a = NameSpace.Foo()'?
So, how do I structure these things so I don't have to add the 'Foo' class name to the module 'Foo'? I could accomplish this by editing __init__.py to be a factory, like so:
#lib/NameSpace/__init__.py
import NameSpace.Foo as NSF

def Foo(*args, **kwargs):
    return NSF.Foo(*args, **kwargs)
But that just seems more inelegant than I expect from Python. Is there a better way to do it?
In your __init__.py
from .Foo import Foo
from .Bar import Bar
from .Baz import Baz
Naming the files differently from your classes prevents the overwrite (the class binding replacing the submodule binding in the package namespace).
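With those re-exports in place, the calls from the question keep working unchanged; a quick check:

import NameSpace

a = NameSpace.Foo()  # NameSpace.Foo is the class re-exported by __init__.py
b = NameSpace.Bar()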
For simplicity, you could import the Foo class from the module just like this:
from NameSpace.Foo import Foo
After that, you can create an object of Foo class like this:
a = Foo()
If you like, you can provide a different name when you import things too:
from NameSpace.Foo import Foo as F
a = F()

Can I build an automatic class-factory that works on import?

I have a class-factory F that generates classes. It takes no arguments other than a name. I'd like to be able to wrap this method and use it like this:
from myproject.myfactory.virtualmodule import Foo
"myfactory" is a real module in the project, but I want virtualmodule to be something that pretends to be a module.
Whenever I import something from virtualmodule I want it to build a new class using my factory method and make it look as if that was imported.
Can this be done? Is there a pattern that will allow me to wrap a class-factory as a module?
Thanks!
--
UPDATE0: Why do I need this? It's actually to test a process that will be run on a grid which requires that all classes should be importable.
An instance of the auto-generated class will be serialized on my PC and then unserialized on each of the grid nodes. If the class cannot be imported on the grid node the unserialize will fail.
If I hijack the import mechanism as an interface for making my test-classes, then I know for sure that anything I can import on my PC can be re-created exactly the same on the grid. That will satisfy my test's requirements.
You can stuff an arbitrary object into the sys.modules structure:
import sys

class VirtualModule(object):
    def __init__(self, name):
        self.__name__ = name.rsplit('.', 1)[-1]
        self.__package__ = name
        self.__loader__ = None

    def __getattr__(self, name):
        if name is valid:
            # Return dynamic classes here
            return VirtualClass(name)
        raise AttributeError(name)

virtual_module_name = 'myproject.myfactory.virtualmodule'
sys.modules[virtual_module_name] = VirtualModule(virtual_module_name)
The Python import machinery will look up objects using attribute access, triggering the __getattr__ method on your VirtualModule instance.
Do this in the myproject/__init__.py or myproject/myfactory/__init__.py file and you are all set to go. The myproject.myfactory package does need to exist for the import machinery to find the myproject.myfactory.virtualmodule object.
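Here's a self-contained sketch of the same idea that actually runs; the factory F and the name check are stand-ins for whatever your real factory and validity rules are, and the parent packages are faked in sys.modules only so the snippet works on its own (in a real project they would be the packages on disk, as noted above):

import sys
import types

def F(name):
    """Stand-in for the real class factory from the question."""
    return type(name, (object,), {})

class VirtualModule(object):
    def __init__(self, fullname):
        self.__name__ = fullname
        self.__package__ = fullname
        self.__loader__ = None

    def __getattr__(self, name):
        if not name.startswith('_'):  # assumed stand-in for "name is valid"
            return F(name)
        raise AttributeError(name)

# Fake parent packages so the example is runnable in isolation.
sys.modules.setdefault('myproject', types.ModuleType('myproject'))
sys.modules.setdefault('myproject.myfactory', types.ModuleType('myproject.myfactory'))
sys.modules['myproject.myfactory.virtualmodule'] = VirtualModule(
    'myproject.myfactory.virtualmodule')

from myproject.myfactory.virtualmodule import Foo
print(Foo)  # a brand-new class built by the factory at import time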

Python Module Level Constants

I am creating my first rather complex module, and like the wx module, I wish to provide a variety of constants to ease the programmer's task of setting styles.
assuming the following directory structure:
Module
+- __init__.py
+- Frame.py
and the contents of __init__.py:
__ALL__=["Frame.py"]
FRAME_DEFAULT_SIZE = (640, 480)
FRAME_DEFAULT_TITLE = "Some Simple Title"
then in Frame.py
class Frame(object):
    """Some docstrings go here
    """
    def __init__(self, parent, ID=-1, title=HOW DO I REFERENCE MY CONSTANT,
                 size=HOW DO I REFERENCE THIS CONSTANT):
        etc ...
If I were to import my module for use in a piece of program code the constants would be available as Module.FRAME_DEFAULT_SIZE, and Module.FRAME_DEFAULT_TITLE respectively.
But how do i reference them as part of a class definition which is supposed to be in the modules name-space, when they are defined in a separate file?
I realize that constants (if named well and used well) should apply only to a parent class and its children, so I could define them in the Frame.py file instead. But how does one do it this way?
Any help appreciated here.
Import Module and access it off that. That part doesn't change just because you're in a submodule.
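A minimal sketch of what that looks like in Frame.py, assuming the constants are defined in Module/__init__.py exactly as in the question:

# Module/Frame.py
import Module  # the package whose __init__.py defines the constants

class Frame(object):
    """A frame whose defaults come from the package-level constants."""
    def __init__(self, parent, ID=-1,
                 title=Module.FRAME_DEFAULT_TITLE,
                 size=Module.FRAME_DEFAULT_SIZE):
        self.parent = parent
        self.ID = ID
        self.title = title
        self.size = size

Note that the default arguments are evaluated when Frame.py is imported, so this relies on the constants being defined in __init__.py before anything there imports Frame.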

Laying out MVC classes in Python

I'm working on a project in Python, and I'm trying to follow a somewhat-strict MVC pattern for practice. My idea was to separate things out into packages named model, view and controller, plus have a Model, View and Controller class for each. Each package would have the supporting files for each piece.
My problem is that I want to be able to use them like:
from controller import Controller
And then in the file with the Controller class I can:
from controller.someclass import SomeClass
But if I put them in packages with the same name, I get problems. I read up about how modules work, and realized I needed to name them controller/__init__.py, model/__init__.py and view/__init__.py, but it seems weird to put them in that file, and it's kind of annoying that all of them show up gedit as __init__.py
Is there any better way of doing this? Am I going about this the right way?
I've seen some black magic in the Django source that pulls classes from a base.py file into the __init__.py namespace. However, I'm not sure how that's done. (See the comments for how to do this.)
From what I do know, you can do one of two things.
A -
inside bar/controller/__init__.py
import os, sys
# Make sure the interpreter knows where your files are.
sys.path.append(os.path.join(os.path.dirname(__file__), '../'))

from bar.model import Model
from bar.view import View

class Controller(object):
    model = Model()
    view = View()
And now you make bar/model/__init__.py and bar/view/__init__.py
B -
inside bar/controller/__init__.py
class Model(object):
    pass

class View(object):
    pass

class Controller(object):
    model = Model()
    view = View()
Edit:...
After reading your comment, a third option comes to mind. A package doesn't literally translate into a module in Python. I think your desired result is to create a directory structure like this:
bar/
    __init__.py
    controller.py
    model.py
    view.py
Then inside controller.py
from bar.model import Model
from bar.view import View

class Controller(object):
    model = Model()
    view = View()
This was a huge hurdle for me to get over, coming from Java. Your class file names do not have to match the class names. Think of it as steps: you step into the folder (the package), then into the file (the .py module), and then you import your class (Model, for example).
If I understand correctly, all you're interested in doing here is having this happen:
from controller import Controller
without having the Controller class defined in controller/__init__.py, is that right?
If so, then just do this:
In controller/base.py (note that the file can be called base.py or anything else):
class Controller(BaseClass):
    # define Controller here
In controller/__init__.py:
from base import Controller
Now you can have the exact syntax you are looking for.
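As a quick check, the import from your question now works as written:

from controller import Controller

c = Controller()  # resolves through controller/__init__.py to base.Controller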
