How to change docstring of TestCase Class in Python?

In Python 2.5 (Jython, actually), the unittest TestCase class has no setUpClass method, and __init__ is not really acceptable (no reference to self).
When I try to change docstring inside the TestCase:
import os
fileName = os.path.split(__file__)[1]
testCaseName = os.path.splitext(fileName)[0]
setattr(__name__, '__doc__', testCaseName)
I'm getting:
setattr(__name__, '__doc__', testCaseName)
TypeError: readonly attribute
I tried to change the docstring by setting it on an instance (where self.__doc__ is writable).
UPDATE: but I want to avoid additional coding in the sub-class (i.e. the sub-class should inherit a super-class method that sets the sub-class's docstring), for example:
File DynamicTestCase.py includes:
class DynamicTestCase(unittest.TestCase):
    def setDocstring(self, testCaseDocstring=None):
        if not testCaseDocstring:
            fileName = os.path.split(__file__)[1]
            testCaseDocstring = os.path.splitext(fileName)[0]
        setattr(self, '__doc__', testCaseDocstring)
File MyTestCase.py includes:
class MyTestCase(DynamicTestCase):
    def test_print_docstring(self):
        self.setDocstring()
        print 'MyTestCase Docstring = ', self.__doc__
But still, the unittest run result is:
MyTestCase Docstring = DynamicTestCase
When I expected MyTestCase Docstring = MyTestCase

Updated - __file__ is the path name from which the current module was loaded, so naturally using __file__ inside DynamicTestCase.py will result in the path DynamicTestCase.py. However, you can just pass the path into setDocstring() from subclasses like this:
DynamicTestCase.py:
class DynamicTestCase(unittest.TestCase):
    def setDocstring(self, docstring=None):
        if docstring is None:
            docstring = __file__
        if os.path.exists(docstring):
            name = os.path.split(docstring)[1]
            docstring = os.path.splitext(name)[0]
        setattr(self, '__doc__', docstring)
MyTestCase.py:
class MyTestCase(DynamicTestCase):
    def __init__(self, *args, **kwargs):
        DynamicTestCase.__init__(self, *args, **kwargs)
        self.setDocstring(__file__)

    def test_print_docstring(self):
        print 'MyTestCase Docstring = ', self.__doc__

    def test_new_docstring(self):
        self.setDocstring('hello')
        print 'MyTestCase Docstring = ', self.__doc__
Output:
MyTestCase Docstring = MyTestCase
MyTestCase Docstring = hello
Rest of answer
In your original code above __name__ is a string, not a class. Jython rightly rejects altering the __doc__ attribute on the str type.
Could you explain a bit about why you want to change TestCase's docstring? For example, you could subclass TestCase and give your own docstring:
class MyTestCase(unittest.TestCase):
    "Docstring of MyTestCase"
Not sure if you've tried it yet, but the unittest2 package's TestCase has setUpClass, tearDownClass class methods. It's a backport of Python 2.7's improvements to work with Python 2.6 and prior.
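For instance, a minimal sketch (assuming unittest2 is importable; the description attribute is just an illustrative name, not part of the unittest API) that uses setUpClass to compute a per-class description from the file name once:
import os
import unittest2

class MyTestCase(unittest2.TestCase):
    @classmethod
    def setUpClass(cls):
        # Derive a description from the module's file name once per class,
        # instead of fighting the read-only __doc__ attribute.
        fileName = os.path.split(__file__)[1]
        cls.description = os.path.splitext(fileName)[0]

    def test_show_description(self):
        print 'Description = ', self.description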
Jython allows you to set the __doc__ of new-style classes, but CPython does not. For that reason you might want to find another way to accomplish your goal if you want your code to be portable:
Jython 2.2.1 on java1.6.0_24
>>> unittest.TestCase.__doc__ = 'foo bar'
>>> unittest.TestCase.__doc__
'foo bar'
Python 2.6.6 (r266:84292, Feb 12 2011, 01:07:21)
>>> unittest.TestCase.__doc__ = 'foo bar'
AttributeError: attribute '__doc__' of 'type' objects is not writable
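If portability matters, one option (a sketch, not taken from the answers above) is to supply the docstring when the class object is created, since __doc__ can be set through the class namespace at creation time on both implementations:
import os
import unittest

testCaseName = os.path.splitext(os.path.split(__file__)[1])[0]
# Build the TestCase subclass with the desired docstring already in place.
MyTestCase = type('MyTestCase', (unittest.TestCase,), {'__doc__': testCaseName})
print MyTestCase.__doc__   # prints the module's base file name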

You are grabbing the filename of the DynamicTestCase file, not the file that is calling the function. In order to get that you have to go into its stack frame:
import inspect

class DynamicTestCase(unittest.TestCase):
    def setDocstring(self, testCaseDocstring=None):
        if not testCaseDocstring:
            fileName = 'unknown.py'
            # Go up one stack frame and grab the caller's file name
            stack = inspect.stack()
            try:
                frame = stack[1][0]
                fileName = frame.f_code.co_filename
            finally:
                del stack
            testCaseDocstring = os.path.splitext(fileName)[0]
        setattr(self, '__doc__', testCaseDocstring)

Related

How to load a method from a file into an existing class (a 'plugin' method)

Call me weird if you like, many have before, but I have a large class which I'd like to make extensible with methods loaded from a plugin directory. Essentially, I'm monkey patching the class. What I have almost works, but the loaded method doesn't 'see' the globals defined in __main__. Ideally I'd like a way to tell globals() (or whatever mechanism is actually used to locate global variables) to use the ones that exist in __main__. Here is the code I have (trimmed for the sake of brevity):
#!/usr/bin/env python3
import importlib
import os
import types

main_global = "Hi, I'm in main"

class MyClass:
    def __init__(self, plugin_dir=None):
        if plugin_dir:
            self.load_plugins(plugin_dir, ext="plugin")

    def load_plugins(self, plugin_dir, ext):
        """ Load plugins
        Plugins are files in 'plugin_dir' that have the given extension.
        The functions defined within are imported as methods of this class.
        """
        cls = self.__class__
        # First check that we're not importing the same extension twice into
        # the same class.
        try:
            plugins = getattr(cls, "_plugins")
        except AttributeError:
            plugins = set()
            setattr(cls, "_plugins", plugins)
        if ext in plugins:
            return
        plugins.add(ext)
        for file in os.listdir(plugin_dir):
            if not file.endswith(ext):
                continue
            filename = os.path.join(plugin_dir, file)
            loader = importlib.machinery.SourceFileLoader("bar", filename)
            module = types.ModuleType(loader.name)
            loader.exec_module(module)
            for name in dir(module):
                if name.startswith("__"):
                    continue
                obj = getattr(module, name)
                if callable(obj):
                    obj = obj.__get__(self, cls)
                    setattr(cls, name, obj)

z = MyClass(plugin_dir="plugins")
z.foo("Hello")
And this is 'foo.plugin' from the plugins directory:
#!/usr/bin/env python3
foo_global = "I am global within foo"

def foo(self, value):
    print(f"I am foo, called with {self} and {value}")
    print(f"foo_global = {foo_global}")
    print(f"main_global = {main_global}")
The output is...
I am foo, called with <__main__.MyClass object at 0x7fd4680bfac8> and Hello
foo_global = I am global within foo
Traceback (most recent call last):
File "./plugged", line 55, in <module>
z.foo("Hello")
File "plugins/foo.plugin", line 8, in foo
print(f"main_global = {main_global}")
NameError: name 'main_global' is not defined
I know it all feels a bit 'hacky', but it's become a challenge so please don't flame me on style etc. If there's another way to achieve this aim, I'm all ears.
Thoughts, learned friends?
You can do what you want with a variation of the technique shown in Martijn Pieters' answer to the question How to inject variable into scope with a decorator?, tweaked to inject multiple values into a class method.
from functools import wraps
import importlib
import os
from pathlib import Path
import types

main_global = "Hi, I'm in main"

class MyClass:
    def __init__(self, plugin_dir=None):
        if plugin_dir:
            self.load_plugins(plugin_dir, ext="plugin")

    def load_plugins(self, plugin_dir, ext):
        """ Load plugins
        Plugins are files in 'plugin_dir' that have the given extension.
        The functions defined within are imported as methods of this class.
        """
        cls = self.__class__
        # First check that we're not importing the same extension twice into
        # the same class.
        try:
            plugins = getattr(cls, "_plugins")
        except AttributeError:
            plugins = set()
            setattr(cls, "_plugins", plugins)
        if ext in plugins:
            return
        plugins.add(ext)
        for file in Path(plugin_dir).glob(f'*.{ext}'):
            loader = importlib.machinery.SourceFileLoader("bar", str(file))
            module = types.ModuleType(loader.name)
            loader.exec_module(module)
            namespace = globals()
            for name in dir(module):
                if name.startswith("__"):
                    continue
                obj = getattr(module, name)
                if callable(obj):
                    obj = inject(obj.__get__(self, cls), namespace)
                    setattr(cls, name, obj)

def inject(method, namespace):
    @wraps(method)
    def wrapped(*args, **kwargs):
        method_globals = method.__globals__
        # Save copies of any of method's global values replaced by the namespace.
        replaced = {key: method_globals[key] for key in namespace if key in method_globals}
        method_globals.update(namespace)
        try:
            method(*args[1:], **kwargs)
        finally:
            method_globals.update(replaced)  # Restore any replaced globals.
    return wrapped

z = MyClass(plugin_dir="plugins")
z.foo("Hello")
Example output:
I am foo, called with <__main__.MyClass object at 0x0056F670> and Hello
foo_global = I am global within foo
main_global = Hi, I'm in main
You can approach the problem with a factory function and inheritance. Assuming each of your plugins is something like this, defined in a separate importable file:
class MyPlugin:
    foo = 'bar'

    def extra_method(self):
        print(self.foo)
You can use a factory like this:
def MyClassFactory(plugin_dir):
    def search_and_import_plugins(plugin_dir):
        # Look for all possible plugins and import them
        return plugin_list  # a list of plugin classes, like [MyPlugin]

    plugin_list = search_and_import_plugins(plugin_dir)

    class MyClass(*plugin_list):
        pass

    return MyClass()
z = MyClassFactory('/home/me/plugins')
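The search_and_import_plugins step is left abstract above; a rough sketch of it (the .plugin extension and the "every class defined in the file is a plugin" rule are assumptions) could reuse the SourceFileLoader pattern from the question:
import importlib.machinery
import types
from pathlib import Path

def search_and_import_plugins(plugin_dir, ext="plugin"):
    # Import every '*.<ext>' file and collect the classes defined in it.
    plugin_list = []
    for path in Path(plugin_dir).glob(f"*.{ext}"):
        loader = importlib.machinery.SourceFileLoader(path.stem, str(path))
        module = types.ModuleType(loader.name)
        loader.exec_module(module)
        for obj in vars(module).values():
            if isinstance(obj, type) and obj.__module__ == module.__name__:
                plugin_list.append(obj)
    return plugin_list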

How do you invoke a method on a class instance by string name?

I am trying to invoke a method on a class dynamically using a String for the class name and a String for the method name. I am using getattr then invoking a method on the class. You'll have to forgive me if I am way off, I am kind of new to Python.
class mock:
    def __init__(self):
        pass

    def create(self):
        print('hello world?')
        return 'hello world'
then creating the instance and invoking via:
module = importlib.import_module('xyz.module')
instance = getattr(module, 'mock')
invoke = getattr(instance, 'create')
result = invoke()
print(result)
The result is something like <object object at 0x10943ccd0>. "hello world?" is never printed. What am I doing wrong?
You have missed a step. The instance variable you have isn't actually an instance, it's the class mock itself. You need to call it to get an instance. Try something like this:
module = importlib.import_module('xyz.module')
klass = getattr(module, 'mock') # rename this variable (avoiding keywords)
instance = klass() # and call the class to create an instance
method = getattr(instance, 'create') # also renamed here, for clarity
result = method() # previously, this would have been an error (missing self argument)
print(result) # now you should get "hello world" printed twice (once with a ?)
As a note, PEP 8 naming conventions would have helped you avoid the issue here. If you'd used the name Mock instead of mock for the class, it might have been a bit more obvious what kind of thing you had, after importing and getattring it.
You don't need getattr at all; the module object returned by import_module is the same thing the name xyz.module would refer to had you used an import statement.
module = importlib.import_module('xyz.module')
result = module.mock().create()
assert result == "hello world"
Note that the above creates an instance of mock on which to call create, rather than accessing mock.create directly. If you have variables containing the name of the class and the method, you still need to do that, only using getattr this time.
module = importlib.import_module('xyz.module')
cls_name = 'mock'
method_name = 'create'
cls = getattr(module, cls_name)
instance = cls()
invoke = getattr(instance, method_name)
result = invoke()
assert result == "hello world"

__getattr__ on a module

How can I implement the equivalent of a class's __getattr__ on a module?
Example
When calling a function that does not exist in a module's statically defined attributes, I wish to create an instance of a class in that module, and invoke the method on it with the same name as failed in the attribute lookup on the module.
class A(object):
    def salutation(self, accusative):
        print "hello", accusative

# note this function is intentionally on the module, and not the class above
def __getattr__(mod, name):
    return getattr(A(), name)

if __name__ == "__main__":
    # i hope here to have my __getattr__ function above invoked, since
    # salutation does not exist in the current namespace
    salutation("world")
Which gives:
matt#stanley:~/Desktop$ python getattrmod.py
Traceback (most recent call last):
File "getattrmod.py", line 9, in <module>
salutation("world")
NameError: name 'salutation' is not defined
There are two basic problems you are running into here:
(1) __xxx__ methods are only looked up on the class
(2) TypeError: can't set attributes of built-in/extension type 'module'
(1) means any solution would have to also keep track of which module was being examined, otherwise every module would then have the instance-substitution behavior; and (2) means that (1) isn't even possible... at least not directly.
Fortunately, sys.modules is not picky about what goes there, so a wrapper will work, but only for module access (i.e. import somemodule; somemodule.salutation('world')). For same-module access you pretty much have to yank the methods from the substitution class and add them to globals(), either with a custom method on the class (I like using .export()) or with a generic function (such as those already listed as answers). One thing to keep in mind: if the wrapper is creating a new instance each time, and the globals solution is not, you end up with subtly different behavior. Oh, and you don't get to use both at the same time -- it's one or the other.
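A rough sketch of the .export() idea (class and method names here are illustrative, not from any particular library):
import sys

class Substitute(object):
    def salutation(self, accusative):
        print("hello", accusative)

    def export(self, namespace=None):
        # Copy the public bound methods into the given namespace (default:
        # this module's globals) so same-module code can call them directly.
        if namespace is None:
            namespace = sys.modules[__name__].__dict__
        for name in dir(self):
            if not name.startswith('_') and name != 'export':
                namespace[name] = getattr(self, name)

_instance = Substitute()
_instance.export()
salutation("world")   # resolved through the module globals just populated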
Update
From Guido van Rossum:
There is actually a hack that is occasionally used and recommended: a
module can define a class with the desired functionality, and then at
the end, replace itself in sys.modules with an instance of that class
(or with the class, if you insist, but that's generally less useful).
E.g.:
# module foo.py
import sys

class Foo:
    def funct1(self, <args>): <code>
    def funct2(self, <args>): <code>

sys.modules[__name__] = Foo()
This works because the import machinery is actively enabling this
hack, and as its final step pulls the actual module out of
sys.modules, after loading it. (This is no accident. The hack was
proposed long ago and we decided we liked it enough to support it in the
import machinery.)
So the established way to accomplish what you want is to create a single class in your module, and as the last act of the module replace sys.modules[__name__] with an instance of your class -- and now you can play with __getattr__/__setattr__/__getattribute__ as needed.
Note 1: If you use this functionality then anything else in the module, such as globals, other functions, etc., will be lost when the sys.modules assignment is made -- so make sure everything needed is inside the replacement class.
Note 2: To support from module import * you must have __all__ defined in the class; for example:
class Foo:
    def funct1(self, <args>): <code>
    def funct2(self, <args>): <code>
    __all__ = list(set(vars().keys()) - {'__module__', '__qualname__'})
Depending on your Python version, there may be other names to omit from __all__. The set() can be omitted if Python 2 compatibility is not needed.
A while ago, Guido declared that all special method lookups on
new-style classes bypass __getattr__ and __getattribute__. Dunder methods had previously worked on modules - you could, for example, use a module as a context manager simply by defining __enter__ and __exit__, before those tricks broke.
Recently some historical features have made a comeback, the module __getattr__ among them, so the existing hack (a module replacing itself with a class instance in sys.modules at import time) should no longer be necessary.
In Python 3.7+, you just use the one obvious way. To customize attribute access on a module, define a __getattr__ function at the module level which should accept one argument (name of attribute), and return the computed value or raise an AttributeError:
# my_module.py
from typing import Any

def __getattr__(name: str) -> Any:
    ...
This will also allow hooks into "from" imports, i.e. you can return dynamically generated objects for statements such as from my_module import whatever.
On a related note, along with the module getattr you may also define a __dir__ function at module level to respond to dir(my_module). See PEP 562 for details.
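A slightly fuller sketch (the plugin_ prefix and the _load_plugin helper are made up for illustration):
# my_module.py
import types

def _load_plugin(name):
    # Stand-in for whatever dynamic or expensive lookup you actually need.
    return types.SimpleNamespace(name=name)

def __getattr__(name):
    if name.startswith("plugin_"):
        return _load_plugin(name)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

def __dir__():
    # Advertise the dynamic names alongside the real module contents.
    return sorted(list(globals()) + ["plugin_example"])
With this in place, both my_module.plugin_example and from my_module import plugin_example go through the hook.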
This is a hack, but you can wrap the module with a class:
import sys

class Wrapper(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __getattr__(self, name):
        # Perform custom logic here
        try:
            return getattr(self.wrapped, name)
        except AttributeError:
            return 'default'  # Some sensible default

sys.modules[__name__] = Wrapper(sys.modules[__name__])
We don't usually do it that way.
What we do is this.
class A(object):
    ....

# The implicit global instance
a = A()

def salutation(*arg, **kw):
    a.salutation(*arg, **kw)
Why? So that the implicit global instance is visible.
For examples, look at the random module, which creates an implicit global instance to slightly simplify the use cases where you want a "simple" random number generator.
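Reduced to a sketch, the pattern looks like this (the real random module binds many more methods this way):
# mymodule.py
class Random(object):
    def random(self):
        return 0.42   # placeholder implementation

# The implicit global instance, with module-level names bound to its methods.
_inst = Random()
random = _inst.random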
Similar to what Håvard S proposed, in a case where I needed to implement some magic on a module (like __getattr__), I would define a new class that inherits from types.ModuleType and put an instance of it in sys.modules (probably replacing the module where my custom ModuleType subclass was defined).
See the main __init__.py file of Werkzeug for a fairly robust implementation of this.
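A sketch of that approach (the fallback behaviour here is made up; Werkzeug's real implementation is considerably more involved):
import sys
import types

class MagicModule(types.ModuleType):
    def __getattr__(self, name):
        # Only reached when normal lookup fails; let dunder probes keep failing.
        if name.startswith('__'):
            raise AttributeError(name)
        return 'default'   # custom fallback logic goes here

# Swap this module for an instance of the subclass, keeping its namespace.
_magic = MagicModule(__name__)
_magic.__dict__.update(sys.modules[__name__].__dict__)
sys.modules[__name__] = _magic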
This is hackish, but...
# Python 2.7
import types

class A(object):
    def salutation(self, accusative):
        print("hello", accusative)

    def farewell(self, greeting, accusative):
        print(greeting, accusative)

def AddGlobalAttribute(classname, methodname):
    print("Adding " + classname + "." + methodname + "()")
    def genericFunction(*args):
        return globals()[classname]().__getattribute__(methodname)(*args)
    globals()[methodname] = genericFunction

# set up the global namespace
x = 0  # X and Y are here to add them implicitly to globals, so
y = 0  # globals does not change as we iterate over it.

toAdd = []

def isCallableMethod(classname, methodname):
    someclass = globals()[classname]()
    something = someclass.__getattribute__(methodname)
    return callable(something)

for x in globals():
    print("Looking at", x)
    if isinstance(globals()[x], (types.ClassType, type)):
        print("Found Class:", x)
        for y in dir(globals()[x]):
            if y.find("__") == -1:  # hack to ignore default methods
                if isCallableMethod(x, y):
                    if y not in globals():  # don't override existing global names
                        toAdd.append((x, y))

# Returns:
# ('Looking at', 'A')
# ('Found Class:', 'A')
# ('Looking at', 'toAdd')
# ('Looking at', '__builtins__')
# ('Looking at', 'AddGlobalAttribute')
# ('Looking at', 'register')
# ('Looking at', '__package__')
# ('Looking at', 'salutation')
# ('Looking at', 'farewell')
# ('Looking at', 'types')
# ('Looking at', 'x')
# ('Looking at', 'y')
# ('Looking at', '__name__')
# ('Looking at', 'isCallableMethod')
# ('Looking at', '__doc__')
# ('Looking at', 'codecs')

for x in toAdd:
    AddGlobalAttribute(*x)

if __name__ == "__main__":
    salutation("world")
    farewell("goodbye", "world")

# Returns:
# hello world
# goodbye world
This works by iterating over all the objects in the global namespace. If the item is a class, it iterates over the class attributes. If the attribute is callable it adds it to the global namespace as a function.
It ignores all attributes which contain "__".
I wouldn't use this in production code, but it should get you started.
Here's my own humble contribution -- a slight embellishment of Håvard S's highly rated answer, but a bit more explicit (so it might be acceptable to S.Lott, even though probably not good enough for the OP):
import sys

class A(object):
    def salutation(self, accusative):
        print "hello", accusative

class Wrapper(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __getattr__(self, name):
        try:
            return getattr(self.wrapped, name)
        except AttributeError:
            return getattr(A(), name)

_globals = sys.modules[__name__] = Wrapper(sys.modules[__name__])

if __name__ == "__main__":
    _globals.salutation("world")
Create your module file that has your classes. Import the module. Run getattr on the module you just imported. You can do a dynamic import using __import__ and pull the module from sys.modules.
Here's your module some_module.py:
class Foo(object):
    pass

class Bar(object):
    pass
And in another module:
import some_module
Foo = getattr(some_module, 'Foo')
Doing this dynamically:
import sys
__import__('some_module')
mod = sys.modules['some_module']
Foo = getattr(mod, 'Foo')

Get fully qualified class name of an object in Python

For logging purposes I want to retrieve the fully qualified class name of a Python object. (With fully qualified I mean the class name including the package and module name.)
I know about x.__class__.__name__, but is there a simple method to get the package and module?
With the following program
#!/usr/bin/env python
import foo

def fullname(o):
    klass = o.__class__
    module = klass.__module__
    if module == 'builtins':
        return klass.__qualname__  # avoid outputs like 'builtins.str'
    return module + '.' + klass.__qualname__

bar = foo.Bar()
print(fullname(bar))
and Bar defined as
class Bar(object):
    def __init__(self, v=42):
        self.val = v
the output is
$ ./prog.py
foo.Bar
If you're still stuck on Python 2, you'll have to use __name__ instead of __qualname__, which is less informative for nested classes - a class Bar nested in a class Foo will show up as Bar instead of Foo.Bar:
def fullname(o):
    klass = o.__class__
    module = klass.__module__
    if module == '__builtin__':
        return klass.__name__  # avoid outputs like '__builtin__.str'
    return module + '.' + klass.__name__
The provided answers don't deal with nested classes.
Since Python 3.3 (PEP 3155), you can use __qualname__ of the class instead of the __name__. Otherwise, a class like
class Foo:
    class Bar:  # this one
        pass
will show up as just Bar instead of Foo.Bar.
(You'll still need to attach the __module__ to the qualname separately - __qualname__ is not intended to include module names.)
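For example (a small sketch; the module prefix depends on where Foo is actually defined):
def fullname(klass):
    # __qualname__ preserves the Foo.Bar nesting; __module__ supplies the prefix.
    return klass.__module__ + '.' + klass.__qualname__

fullname(Foo.Bar)   # e.g. 'mypackage.mymodule.Foo.Bar'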
Here's one based on Greg Bacon's excellent answer, but with a couple of extra checks:
__module__ can be None (according to the docs), and also for a type like str it can be __builtin__ (which you might not want appearing in logs or whatever). The following checks for both those possibilities:
def fullname(o):
    module = o.__class__.__module__
    if module is None or module == str.__class__.__module__:
        return o.__class__.__name__
    return module + '.' + o.__class__.__name__
(There might be a better way to check for __builtin__. The above just relies on the fact that str is always available, and its module is always __builtin__)
For Python 3.7 I use:
".".join([obj.__module__, obj.__name__])
Getting:
package.subpackage.ClassName
Consider using the inspect module, which has functions like getmodule which might be what you are looking for:
>>>import inspect
>>>import xml.etree.ElementTree
>>>et = xml.etree.ElementTree.ElementTree()
>>>inspect.getmodule(et)
<module 'xml.etree.ElementTree' from
'D:\tools\python2.5.2\lib\xml\etree\ElementTree.pyc'>
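To turn that into a fully qualified name, you could combine getmodule with the qualified class name (a Python 3 sketch; getmodule can return None for some objects, hence the guard):
import inspect
import xml.etree.ElementTree

def fullname(obj):
    cls = type(obj)
    module = inspect.getmodule(cls)
    prefix = module.__name__ + '.' if module is not None else ''
    return prefix + cls.__qualname__

et = xml.etree.ElementTree.ElementTree()
print(fullname(et))   # xml.etree.ElementTree.ElementTree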
Some people (e.g. https://stackoverflow.com/a/16763814/5766934) argue that __qualname__ is better than __name__.
Here is an example that shows the difference:
$ cat dummy.py
class One:
class Two:
pass
$ python3.6
>>> import dummy
>>> print(dummy.One)
<class 'dummy.One'>
>>> print(dummy.One.Two)
<class 'dummy.One.Two'>
>>> def full_name_with_name(klass):
... return f'{klass.__module__}.{klass.__name__}'
>>> def full_name_with_qualname(klass):
... return f'{klass.__module__}.{klass.__qualname__}'
>>> print(full_name_with_name(dummy.One)) # Correct
dummy.One
>>> print(full_name_with_name(dummy.One.Two)) # Wrong
dummy.Two
>>> print(full_name_with_qualname(dummy.One)) # Correct
dummy.One
>>> print(full_name_with_qualname(dummy.One.Two)) # Correct
dummy.One.Two
Note, it also works correctly for builtins:
>>> print(full_name_with_qualname(print))
builtins.print
>>> import builtins
>>> builtins.print
<built-in function print>
__module__ would do the trick.
Try:
>>> import re
>>> print re.compile.__module__
re
This site suggests that __package__ might work for Python 3.0; however, the examples given there won't work under my Python 2.5.2 console.
This is a hack but I'm supporting 2.6 and just need something simple:
>>> from logging.handlers import MemoryHandler as MH
>>> str(MH).split("'")[1]
'logging.handlers.MemoryHandler'
Since the interest of this topic is to get fully qualified names, here is a pitfall that occurs when using relative imports along with the main module existing in the same package. E.g., with the below module setup:
$ cat /tmp/fqname/foo/__init__.py
$ cat /tmp/fqname/foo/bar.py
from baz import Baz
print Baz.__module__
$ cat /tmp/fqname/foo/baz.py
class Baz: pass
$ cat /tmp/fqname/main.py
import foo.bar
from foo.baz import Baz
print Baz.__module__
$ cat /tmp/fqname/foo/hum.py
import bar
import foo.bar
Here is the output showing the result of importing the same module differently:
$ export PYTHONPATH=/tmp/fqname
$ python /tmp/fqname/main.py
foo.baz
foo.baz
$ python /tmp/fqname/foo/bar.py
baz
$ python /tmp/fqname/foo/hum.py
baz
foo.baz
When hum imports bar using relative path, bar sees Baz.__module__ as just "baz", but in the second import that uses full name, bar sees the same as "foo.baz".
If you are persisting the fully-qualified names somewhere, it is better to avoid relative imports for those classes.
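For example, switching bar.py to an absolute import keeps the name stable no matter how bar itself is imported (assuming the package root stays on sys.path, as with the PYTHONPATH above):
$ cat /tmp/fqname/foo/bar.py
from foo.baz import Baz
print Baz.__module__   # now always prints foo.baz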
Below is just an improvement of Greg Bacon's answer, tested for classes, instances, methods, and functions, both builtin and user defined.
def fullname(o):
    try:
        # if o is a class or function, get module directly
        module = o.__module__
    except AttributeError:
        # then get module from o's class
        module = o.__class__.__module__
    try:
        # if o is a class or function, get name directly
        name = o.__qualname__
    except AttributeError:
        # then get o's class name
        name = o.__class__.__qualname__
    # if o is a method of builtin class, then module will be None
    if module == 'builtins' or module is None:
        return name
    return module + '.' + name
This is an adaption of the answers by Greg Bacon and MB to use the qualified class name. Note that the question did ask for the qualified class name. It was tested with Python 3.8.
def fullname(obj: object) -> str:
    """Return the full name of the given object using its module and qualified class names."""
    # Ref: https://stackoverflow.com/a/66508248/
    module_name, class_name = obj.__class__.__module__, obj.__class__.__qualname__
    if module_name in (None, str.__class__.__module__):
        return class_name
    return module_name + "." + class_name
None of the answers here worked for me. In my case, I was using Python 2.7 and knew that I would only be working with new-style classes.
def get_qualified_python_name_from_class(model):
    c = model.__class__.__mro__[0]
    name = c.__module__ + "." + c.__name__
    return name
My solution is:
def fullname(obj) -> str:
    if type(obj).__qualname__ != "type":
        # obj is instance
        return ".".join(
            [
                obj.__class__.__module__,
                obj.__class__.__qualname__,
            ]
        )
    # obj is not instance
    return ".".join([obj.__module__, obj.__qualname__])

# not instance
>>> print(fullname(datetime))
"datetime.datetime"
# instance
>>> print(fullname(datetime.now()))
"datetime.datetime"
# instance
>>> print(fullname(3))
"builtins.int"

python modify __metaclass__ for whole program

EDIT: Note that this is a REALLY BAD idea to do in production code. This was just an interesting thing for me. Don't do this at home!
Is it possible to modify __metaclass__ variable for whole program (interpreter) in Python?
This simple example is working:
class ChattyType(type):
    def __init__(cls, name, bases, dct):
        print "Class init", name
        super(ChattyType, cls).__init__(name, bases, dct)

__metaclass__ = ChattyType

class Data:
    pass

data = Data()  # prints "Class init Data"
print data
but I would love the __metaclass__ change to take effect even in submodules. So for example (file m1.py):
class A:
    pass

a = A()
print a
file main.py:
class ChattyType(type):
    def __init__(cls, name, bases, dct):
        print "Class init", name
        super(ChattyType, cls).__init__(name, bases, dct)

__metaclass__ = ChattyType

import m1  # and now prints "Class init A"

class Data:
    pass

data = Data()  # prints "Class init Data"
print data
I understand that a global __metaclass__ no longer works in Python 3.x, but that is not my concern (my code is a proof of concept). So: is there any way to accomplish this in Python 2.x?
The "global __metaclass__" feature of Python 2 is designed to work per-module, only (just think what havoc it would wreak, otherwise, by forcing your own metaclass on all library and third-party modules that you imported from that point onwards -- shudder!). If it's very important to you to "secretly" alter the behavior of all modules you're importing from a certain point onwards, for whatever cloak-and-dagger reason, you could play very very dirty tricks with an import hook (at worst by first copying the sources to a temporary location while altering them...) but the effort would be proportionate to the enormity of the deed, which seems appropriate;-)
Okay; IMO this is gross, hairy, dark magic. You shouldn't use it, perhaps ever, but especially not in production code. It is kind of interesting just for curiosity's sake, however.
You can write a custom importer using the mechanisms described in PEP 302, and further discussed in Doug Hellmann's PyMOTW: Modules and Imports. That gives you the tools to accomplish the task you contemplated.
I implemented such an importer, just because I was curious. Essentially, for the modules you specify by means of the class variable __chatty_for__, it will insert a custom type as a __metaclass__ variable in the imported module's __dict__, before the code is evaluated. If the code in question defines its own __metaclass__, that will replace the one pre-inserted by the importer. It would be inadvisable to apply this importer to any modules before carefully considering what it would do to them.
I haven't written many importers, so I may have done one or more silly things while writing this one. If anyone notices flaws / corner cases I missed in the implementation, please leave a comment.
source file 1:
# foo.py
class Foo: pass
source file 2:
# bar.py
class Bar: pass
source file 3:
# baaz.py
class Baaz: pass
and the main event:
# chattyimport.py
import imp
import sys
import types

class ChattyType(type):
    def __init__(cls, name, bases, dct):
        print "Class init", name
        super(ChattyType, cls).__init__(name, bases, dct)

class ChattyImporter(object):
    __chatty_for__ = []

    def __init__(self, path_entry):
        pass

    def find_module(self, fullname, path=None):
        if fullname not in self.__chatty_for__:
            return None
        try:
            if path is None:
                self.find_results = imp.find_module(fullname)
            else:
                self.find_results = imp.find_module(fullname, path)
        except ImportError:
            return None
        (f, fn, (suf, mode, typ)) = self.find_results
        if typ == imp.PY_SOURCE:
            return self
        return None

    def load_module(self, fullname):
        #print '%s loading module %s' % (type(self).__name__, fullname)
        (f, fn, (suf, mode, typ)) = self.find_results
        data = f.read()
        if fullname in sys.modules:
            module = sys.modules[fullname]
        else:
            sys.modules[fullname] = module = types.ModuleType(fullname)
        module.__metaclass__ = ChattyType
        module.__file__ = fn
        module.__name__ = fullname
        codeobj = compile(data, fn, 'exec')
        exec codeobj in module.__dict__
        return module

class ChattyImportSomeModules(ChattyImporter):
    __chatty_for__ = 'foo bar'.split()

sys.meta_path.append(ChattyImportSomeModules(''))

import foo   # prints 'Class init Foo'
import bar   # prints 'Class init Bar'
import baaz
Nope. (This is a feature!)
