Intercepting module calls? - Python

I'm trying to 'intercept' all calls to a specific module, and reroute them to another object. I'd like to do this so that I can have a fairly simple plugin architecture.
For example, in main.py
import renderer
renderer.draw('circle')
In renderer.py
specificRenderer = OpenGLRenderer()
# Then, I'd like to route all calls from main.py so that
# specificRenderer.methodName(methodArgs) is called,
# i.e. the above example would call specificRenderer.draw('circle')
This means that any function can just import renderer and use it, without worrying about the details. It also means that I can completely change the renderer just by creating another object and assigning it to the 'specificRenderer' value in renderer.py
Any ideas?

In renderer.py:
import sys

if __name__ != "__main__":
    sys.modules[__name__] = OpenGLRenderer()
The module name is now mapped to the OpenGLRenderer instance, and import renderer in other modules will get the same instance.
Actually, you don't even need the separate module. You can just do:
import sys
sys.modules["renderer"] = OpenGLRenderer()
import renderer # gives current module access to the "module"
... first thing in your main module. Imports of renderer in other modules, once again, will refer to the same instance.
Are you sure you really want to do this in the first place? It isn't really how people expect modules to behave.
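For reference, here is the whole trick as one runnable sketch; the OpenGLRenderer class is a stub standing in for a real rendering class:

```python
import sys

class OpenGLRenderer:  # stub standing in for the real renderer
    def draw(self, shape):
        return "OpenGL drew a %s" % shape

# Install an *instance* under the module name before anyone imports it.
sys.modules["renderer"] = OpenGLRenderer()

import renderer  # binds "renderer" to the instance, not to a module

print(renderer.draw("circle"))  # -> OpenGL drew a circle
```

Because the import machinery consults sys.modules first, the instance is handed out to every importer without the loader ever looking for a renderer.py file.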

The simplest way to do that is to have main.py do
from renderer import renderer
instead, then just name specificRenderer renderer.
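A minimal sketch of that arrangement (the OpenGLRenderer class here is a stand-in):

```python
# renderer.py (sketch) -- the instance is simply named "renderer"
class OpenGLRenderer:  # stand-in for the real renderer class
    def draw(self, shape):
        return "drew a %s" % shape

renderer = OpenGLRenderer()

# main.py would then do:
#   from renderer import renderer
#   renderer.draw('circle')
print(renderer.draw("circle"))  # -> drew a circle
```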

My answer is very similar to @kindall's, although I got the idea elsewhere. It goes a step further in the sense that it replaces the module object that's usually stored in sys.modules with an instance of a class of your own design. At a minimum, such a class would need to look something like this:
File renderer.py:
class _renderer(object):
    def __init__(self, specificRenderer):
        self.specificRenderer = specificRenderer

    def __getattr__(self, name):
        return getattr(self.specificRenderer, name)

if __name__ != '__main__':
    import sys
    # from some_module import OpenGLRenderer
    sys.modules[__name__] = _renderer(OpenGLRenderer())
The __getattr__() method simply forwards attribute accesses on to the real renderer object. The advantage of this level of indirection is that you can add your own attributes to the private _renderer class and access them through the imported renderer object just as though they were part of an OpenGLRenderer object. If you give them the same names as attributes that already exist on an OpenGLRenderer object, they will be called instead, and are free to forward, log, ignore, and/or modify the call before passing it along -- which can sometimes be very handy.
Class instances placed in sys.modules are effectively singletons, so if the module is imported in other scripts in the application, they will all share the single instance created by the first one.
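Here is the wrapper as a self-contained runnable sketch; OpenGLRenderer is a stub, and draw_logged is a made-up example of an attribute the wrapper adds on top of the wrapped object:

```python
import sys

class OpenGLRenderer:  # stub for the real class
    def draw(self, shape):
        return "drew a %s" % shape

class _renderer(object):
    def __init__(self, specificRenderer):
        self.specificRenderer = specificRenderer

    def __getattr__(self, name):
        # Only called when normal lookup fails, so the wrapper's own
        # attributes take precedence over the wrapped renderer's.
        return getattr(self.specificRenderer, name)

    def draw_logged(self, shape):
        # An attribute the wrapped object doesn't have: intercept, tag,
        # then forward to the real renderer.
        return "log: " + self.specificRenderer.draw(shape)

sys.modules["renderer"] = _renderer(OpenGLRenderer())
import renderer

print(renderer.draw("circle"))         # forwarded -> drew a circle
print(renderer.draw_logged("square"))  # wrapper's own -> log: drew a square
```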

If you don't mind that import renderer results in an object rather than a module, then see kindall's brilliant solution.
If you want @property to work (i.e. each time you fetch renderer.mytime, you want the function corresponding to OpenGLRenderer.mytime to get called) and you want to keep renderer as a module, then it's impossible. Example:
import time

class OpenGLRenderer(object):
    @property
    def mytime(self):
        return time.time()
If you don't care about properties, i.e. it's OK for you that mytime gets called only once (at module load time), and it will keep returning the same timestamp, then it's possible to do it by copying all symbols from the object to the module:
# renderer.py
specificRenderer = OpenGLRenderer()
for name in dir(specificRenderer):
    globals()[name] = getattr(specificRenderer, name)
However, this is a one-time copy. If you add methods or other attributes to specificRenderer later dynamically, or change some attributes later, then they won't be automatically copied to the renderer module. This can be fixed, however, by some ugly __setattr__ hacking.
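A small demonstration of that staleness, using a plain dict to stand in for the module's globals():

```python
class OpenGLRenderer:  # stub for illustration
    def draw(self, shape):
        return "drew a %s" % shape

specificRenderer = OpenGLRenderer()

# One-time copy, as in renderer.py above (a dict stands in for globals()):
copied = {}
for name in dir(specificRenderer):
    if not name.startswith("_"):
        copied[name] = getattr(specificRenderer, name)

print(copied["draw"]("circle"))  # -> drew a circle

# Attributes added to the instance afterwards are NOT picked up:
specificRenderer.extra = lambda: "new"
print("extra" in copied)  # -> False
```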

Edit: This answer does not do what the OP wants; it doesn't instantiate an object and then let calls to a module be redirected to that same object. This answer is about changing which rendering module is being used.
Easiest might be to import the OpenGLRenderer in the main.py program like this:
import OpenGLRenderer as renderer
That's code in just one place, and in the rest of your module OpenGLRenderer can be referred to as renderer.
If you have several modules like main.py, you could have your renderer.py file be just the same line:
import OpenGLRenderer as renderer
and then other modules can use
from renderer import renderer
If OpenGLRenderer doesn't quite quack right yet, you can monkeypatch it to work as you need in the renderer.py module.
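A runnable sketch of that approach; the OpenGLRenderer module is faked in memory here since we don't have the real one, and the monkeypatched helper name (draw_twice) is made up for illustration:

```python
import sys
import types

# Fake an OpenGLRenderer module so the sketch is self-contained.
ogl = types.ModuleType("OpenGLRenderer")
exec("def draw(shape):\n    return 'drew a %s' % shape", ogl.__dict__)
sys.modules["OpenGLRenderer"] = ogl

# renderer.py would contain just this line:
import OpenGLRenderer as renderer

# ...plus whatever monkeypatching is needed to make it quack right:
def draw_twice(shape):
    return [renderer.draw(shape), renderer.draw(shape)]

renderer.draw_twice = draw_twice

print(renderer.draw_twice("circle"))  # -> ['drew a circle', 'drew a circle']
```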

Related

Does Python import copy all the code into the file

When we import a module in a Python script, does this copy all the required code into the script, or does it just let the script know where to find it?
What happens if we don't use the module then in the code, does it get optimized out somehow, like in C/C++?
None of those things are the case.
An import does two things. First, if the requested module has not previously been loaded, the import loads the module. This mostly boils down to creating a new global scope and executing the module's code in that scope to initialize the module. The new global scope is used as the module's attributes, as well as for global variable lookup for any code in the module.
Second, the import binds whatever names were requested. import whatever binds the whatever name to the whatever module object. import whatever.thing also binds the whatever name to the whatever module object. from whatever import somefunc looks up the somefunc attribute on the whatever module object and binds the somefunc name to whatever the attribute lookup finds.
Unused imports cannot be optimized out, because both the module loading and the name binding have effects that some other code might be relying on.
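Both effects can be seen with a throwaway module built in memory: the module code runs once, and later imports just bind names to the cached object.

```python
import sys
import types

# Build a tiny module without a .py file, then register it.
mod = types.ModuleType("whatever")
exec("counter = 0\n"
     "def bump():\n"
     "    global counter\n"
     "    counter += 1\n"
     "    return counter\n", mod.__dict__)
sys.modules["whatever"] = mod

import whatever            # served from sys.modules, nothing re-runs
from whatever import bump  # binds the attribute "bump" locally

print(whatever is mod)     # -> True
print(bump())              # -> 1
print(whatever.counter)    # -> 1  (shared module state)
```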

How to convert a "custom class"-based singleton object programmatically into a python module?

I would like to convert a singleton object programmatically into a Python module so that I can use the methods of this singleton object directly by importing them via the module instead of accessing them as object attributes. By "programmatically" I mean that I do not want to have to copy-paste the class methods explicitly into a module file. I need some sort of a workaround that allows me to import the object methods into the global scope of another module.
I would really appreciate if someone could help me on this one.
Here is a basic example that should illustrate my problem:
mymodule.py
class MyClass:
    """This is my custom class"""

    def my_method(self):
        return "myValue"

singleton = MyClass()
main_as_is.py
from mymodule import MyClass
myobject = MyClass()
print(myobject.my_method())
main_to_be.py
from mymodule import my_method # or from mymodule.singleton import my_method
print(my_method())
You can use the same strategy that the standard random module uses. All the functions in that module are actually methods of a "private" instance of the Random class. That's convenient for most common uses of the module, although sometimes it's useful to create your own instances of Random so that you can have multiple independent random streams.
I've adapted your code to illustrate that technique. I named the class and its instance with a single leading underscore, since that's the usual convention in Python to signify a private name, but bear in mind that it's simply a convention; Python doesn't do anything to enforce this privacy.
mymodule.py
class _MyClass:
    """ This is my custom class """

    def my_method(self):
        return "myValue"

_myclass = _MyClass()
my_method = _myclass.my_method
main_to_be.py
from mymodule import my_method
print(my_method())
output
myValue
BTW, the from mymodule import method1, method2 syntax is fine if you only import a small number of names, if it's clear from the name which module each comes from (like math module functions and constants), and if you don't import from many modules. Otherwise it's better to use this sort of syntax:
import mymodule as mm
# Call a method from the module
mm.method1()
That way it's obvious which names are local, and which ones are imported and where they're imported from. Sure, it's a little more typing, but it makes the code a whole lot more readable. And it eliminates the possibility of name collisions.
FWIW, here's a way to automate adding all of the _myclass methods without explicitly listing them (but remember "explicit is better than implicit"). At the end of "mymodule.py", in place of my_method = _myclass.my_method, add this:
globals().update({k: getattr(_myclass, k) for k in _MyClass.__dict__
                  if not k.startswith('__')})
I'm not comfortable with recommending this, since it directly injects items into the globals() dict. Note that that code will add all class attributes, not just methods.
In your question you talk about singleton objects. We don't normally use singletons in Python, and many programmers in various OOP languages consider them to be an anti-pattern. See https://stackoverflow.com/questions/12755539/why-is-singleton-considered-an-anti-pattern for details. For this application there is absolutely no need at all to use a singleton. If you only want a single instance of _MyClass then simply don't create another instance of it, just use the instance that mymodule creates for you. But if your boss insists that you must use a singleton, please see the example code here.

How to make a user config available to all classes of a module

I'm writing a Python library which is meant to be used as a third party library.
great_library/__init__.py:
class ClassA(object):
    @cache()
    def foo(self):
        pass

class ClassB(object):
    @cache()
    def bar(self):
        pass

class GreatLibrary(object):
    @classmethod
    def great_api(cls):
        # uses ClassA and ClassB
        pass
this library is used as:
from great_library import GreatLibrary
GreatLibrary.great_api()
Now the problem is, I'd like the user to configure the cache expiration time, which should be passed to @cache() as @cache(seconds).
How should I design this module structure so that the user can easily pass in the config and have it used by ClassA and ClassB? Thanks.
The basic problem is that the argument passed to the decorator is read when the module is loaded, so there is no way to change it afterwards (at least not without reloading the module through some hackery, and even that cannot change the already-decorated objects). So you need some hook from which great_library can get the cache time and to which the user can write the desired value.
The simplest and most widely used method is to set an environment variable. At the top of your great_library module you can check the variable and load the default cache time:
import os
default_time = os.getenv("GREAT_LIBRARY_CACHE_TIME", None)
In your code, use @cache(default_time). I'm not sure whether the cache() API takes None as a default argument; if not, it is simple to modify the recipe to adapt it to your problem.
Now the users of great_library can set it either by assigning to os.environ in development (before importing the module; note that os.putenv() alone does not update os.environ, which os.getenv() reads) or via the OS environment in production.
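A hypothetical sketch of the whole arrangement: a cache decorator whose expiry defaults to the GREAT_LIBRARY_CACHE_TIME environment variable. Both the variable name and the decorator body are illustrative, not a real library's API.

```python
import functools
import os
import time

# Illustrative env-var hook; "60" seconds is an assumed fallback.
default_time = float(os.getenv("GREAT_LIBRARY_CACHE_TIME", "60"))

def cache(seconds):
    def decorator(func):
        stored = {}  # args -> (timestamp, value)

        @functools.wraps(func)
        def wrapper(*args):
            now = time.time()
            if args in stored and now - stored[args][0] < seconds:
                return stored[args][1]  # still fresh: serve cached value
            value = func(*args)
            stored[args] = (now, value)
            return value
        return wrapper
    return decorator

@cache(default_time)
def slow_square(x):
    return x * x

print(slow_square(4))  # -> 16 (computed)
print(slow_square(4))  # -> 16 (cached, within the expiry window)
```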
Another way to add a hook is to use a configuration module that you import. IMHO this method is useful only if you have a bunch of properties to set. If you follow that path, your great_library module should implement something like this:
try:
    from great_library_config import *
except ImportError:
    # Default configurations like...
    default_time = None
Personally I try to avoid solutions like that for a module, but they can be useful for applications or frameworks with a high degree of configurability. In that case, too, the user can use a config module for production and override it with a development/testing one.

Does Python import instantiate a mystery class?

I thought about this for a while and can't think of a better title, sorry.
I'm new-ish to Python, and (like many others, it seems) I just can't get my head around import.
I think I understand 'modules' and 'packages', classes and attributes and all that. It's one specific behavior I need clarified.
Say I have a file, foo.py. It has one line it:
x = 1
If, in another file, I import foo, I can reference x. And, wonderfully, in yet another file I can import foo and now those two files share x. Leaving classes out of the discussion for simplicity, I believe this is the Pythonic way to share attributes between files.
Here's the question: Is it fair to say, when I import foo, that foo.py itself is (for lack of a better metaphor) secretly instantiated by the interpreter?
I realize if I define a class in a module, it follows traditional rules and only becomes instantiated if I explicitly do so. But the Python interpreter (via the import statement) instantiating an instance of my module in the global namespace is the only way to explain the attribute-sharing behavior.
Is this true? Semi-true? Or am I wandering with the Sleestaks in the Land of the Lost?
When you import a module:
- if the module has not been previously imported, the file is parsed into a module object, which is added to sys.modules under a key that is the import path from the pythonpath to your module
- that module object (or some member thereof) is aliased in the importing namespace, with the alias and the object being referenced determined by the specific form of import you used
So when you import foo, the interpreter checks sys.modules for something registered with the name foo. If it finds it, it provides a label foo in the local namespace for the foo module. If it doesn't, it searches down the pythonpath until it finds a foo module, parses that to a module object, adds that object to sys.modules, and adds a label in the local namespace for that module object.
import foo as foof does the same thing, only the local namespace label created is foof. from foo import x follows the same process up to the point of creating a label and reference in the local namespace, instead providing a label x in the namespace for the attribute x from the foo module. from foo import x as foox just combines the 2 ideas.
With classes, you can actually poke around this whole system by crawling up and down the tree using the __module__ attribute.
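The four import forms described above can be exercised against a throwaway "foo" module built in memory:

```python
import sys
import types

# A throwaway module registered under the name "foo".
foo_mod = types.ModuleType("foo")
foo_mod.x = 42
sys.modules["foo"] = foo_mod

import foo                 # label "foo" -> the module object
import foo as foof         # label "foof" -> the same module object
from foo import x          # label "x" -> the attribute foo.x
from foo import x as foox  # label "foox" -> the same attribute

print(foo is foof)      # -> True
print(x == foox == 42)  # -> True
```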
The import creates an instance of a "module" object. It is worth knowing that this is created only the first time the module is imported; subsequent imports get a reference to the original. You can create your own module objects on the fly with a bit of introspection.
import glob # Import any python module
moduleType = type(glob)
onTheFly = moduleType("OnTheFly", "Docstring for this module")
Although there isn't much benefit to creating these.
Yes, indeed it's true. If you execute import foo, a module object foo is instantiated, and the contents of your file, e.g. a class bar, are added as members of that object.

How can I set up global for every imported file in Python automatically?

I have a large Python codebase with many modules and classes. I have a special class whose single instance is needed everywhere throughout the code (it's a threaded application, and that instance also holds thread-local storage, locks, etc.). It's a bit uncomfortable to always "populate" that instance in every imported module. I know using globals is not the best practice, but anyway: is there any "import hook" in Python that I can use to have my instance available in every module without extra work? It should work for normal imports, for "from mod import ..." and for other import constructs too. If this is not possible, can you suggest a better solution? Certainly it's not fun to pass that instance to the constructors of every class... Inheritance also does not help, since I have modules without classes, and I also need a single instance, not the class itself...
class master():
    def import_module(self, name):
        mod = __import__(name)
        mod.m = self
        return mod
    [...]

m = master()
Currently I am thinking of something like this: but then I have to use m.import_module() to import modules; other modules will then have the instance of the master class available under the name "m", so they can use m.import_module() too, etc. But then I have to give up using "normal" import statements and write this:
example_mod = m.import_module("example_mod")
instead of just this:
import example_mod
(though I could certainly live with this too, and assign "m" to example_mod.m afterwards)
Certainly it's not fun to pass that instance to the constructors of every class
You don't have to do this. Set up your global object in a module, like config, and import it:
# /myapp/enviroment/__init__.py
class ThatSingleInstanceClass:
    pass

# create the singleton object directly or have a function init the module
singleton = ThatSingleInstanceClass()

# /myapp/somewhere.py
# all you need to use the object is importing it
from myapp.enviroment import singleton

class SomeClass:
    def __init__(self):  # no need to pass that object
        print("Always the same object:", singleton)
What's wrong with having each module import the needed object? Explicit is better than implicit.
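The sharing can be verified with a runnable sketch of the same pattern, building the "enviroment" module in memory instead of as a package on disk:

```python
import sys
import types

# Fake the enviroment module so the sketch is self-contained.
env = types.ModuleType("enviroment")
exec("class ThatSingleInstanceClass:\n"
     "    pass\n"
     "singleton = ThatSingleInstanceClass()\n", env.__dict__)
sys.modules["enviroment"] = env

# Two separate importers still see one shared instance:
from enviroment import singleton as seen_by_module_a
from enviroment import singleton as seen_by_module_b

print(seen_by_module_a is seen_by_module_b)  # -> True
```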
