I'm trying to bypass importing from a module, so in my __init__.py I can inject code like this:
globals().update(
    {
        "foo": lambda: print("Hello stackoverflow!")
    }
)
so that after import mymodule I can call mymodule.foo. On its own that is a trivial example, and useless for my purpose, because you could just define foo directly.
The real idea is to modify the module's globals dictionary so that, in case it doesn't find the function foo, my code gets a chance to load and inject it. For that I tried:
from importer import load  # a load function to search for the code
from functools import wraps

def global_get_wrapper(f):
    @wraps(f)
    def wrapper(*args):
        module_name, default = args
        res = f(*args)
        if res is None:
            return load(module_name)
        return res
    return wrapper

globals().get = global_get_wrapper(globals().get)  # trying to substitute get method
But it gives me an error:
AttributeError: 'dict' object attribute 'get' is read-only
The other idea I had is to preload the available function, class, etc names into the module dictionary and lazily load them later.
I ran out of ideas for accomplishing this, and I don't know if it is even possible.
Should I write my own Python importer, or is there another possibility I haven't thought of?
Thanks in advance.
Instead of hacking globals() it would be better to define __getattr__ for your module as follows:
module_name.py
foo = 'foo'

def bar():
    return 'bar'
my_module.py
import sys

import module_name

class MyModule(object):
    def foobar(self):
        return 'foobar'

    def __getattr__(self, item):
        return getattr(module_name, item)

sys.modules[__name__] = MyModule()
and then:
>>> import my_module
>>> my_module.foo
'foo'
>>> my_module.bar()
'bar'
>>> my_module.foobar()
'foobar'
PEP 562, accepted for Python 3.7, introduces __getattr__ for modules. In the rationale it also describes workarounds for previous Python versions.
It is sometimes convenient to customize or otherwise have control over access to module attributes. A typical example is managing deprecation warnings. Typical workarounds are assigning __class__ of a module object to a custom subclass of types.ModuleType or replacing the sys.modules item with a custom wrapper instance. It would be convenient to simplify this procedure by recognizing __getattr__ defined directly in a module that would act like a normal __getattr__ method, except that it will be defined on module instances.
So your mymodule can look like:
foo = 'bar'

def __getattr__(name):
    print('load your custom module and return it')
Here's how it behaves:
>>> import mymodule
>>> mymodule.foo
'bar'
>>> mymodule.baz
load your custom module and return it
I don't quite understand. Would this work for you?
try:
    mymodule.foo()
except AttributeError:
    print("whatever you wanted to do")
I have an existing Python (v2.7) application that imports external py files on the fly, which contain specifically named classes to process data. The external py file loaded is chosen based on the type of post-processing of the data that is needed.
So I have this collection of classes, each in their own file. The files are named in a specific fashion based on the type of processing so that the main program knows what file to import from the upstream request.
Keep in mind that I and others are always tweaking these class files, but we can not change the code on the main application.
What I would like to do is to import a "template" of the common functions into the class scope, which can provide the standard set of controls that the main program expects without needing to copy/paste them into each file. I hate it when I find a bug and make a correction in one of these main class i/o functions, which I then have to replicate in thirty-some other files.
Now, I understand from googling that my import here is bad... I get the message:
TestClassFile.py:5: SyntaxWarning: import * only allowed at module level
But this method is the only way I have found to import the functions so that they come into the namespace of the class itself. I have an example below...
What method (if any) is the appropriate way to do this in Python?
Example
main.py
import TestClassFile

print "New TestClass - Init"
oTest = TestClassFile.TestClass("foo")

print "Should run... Function A"
oTest.funcA()

print "Should run... Function b"
oTest.funcB()
TestClassFile.py
class TestClass:
    from TestClassImport import *

    def __init__(self, str):
        print "INIT! and be ... ", str

    def funcA(self):
        print "Function A"
TestClassImport.py
def funcB(self):
    print "Function B"
Much appreciated!
Update
Many thanks to everyone for the contributions. From researching mixins, they appear to be the proper Python way to extend a class.
TestClassImport.py
class ImportClass:
    def funcB(self):
        print "Function B"
TestClassFile.py
from TestClassImport import ImportClass

class TestClass(ImportClass):
    def __init__(self, str):
        print "INIT! and be ... ", str

    def funcA(self):
        print "Function A"
It sounds like you should make the imported functions into mixins, which you can inherit from. So:
TestClassImport.py
class ClassB(object):
    def funcB(self):
        print "Function B"
TestClassFile.py
from TestClassImport import ClassB
from OtherClassImport import ClassX

class TestClass(ClassB, ClassX):
    ...
This appears to work:
import types

from TestClassImport import funcB

class TestClass:
    def __init__(self, str):
        print "INIT! and be ... ", str
        setattr(self, 'funcB', types.MethodType(funcB, self, TestClass))

    def funcA(self):
        print "Function A"
When I run it I get the following output:
INIT! and be ... foo
Should run... Function A
Function A
Should run... Function b
Function B
I don't know if this is by any means a good solution, but you can write a function to construct a metaclass to dynamically add properties to your classes.
def make_meta(*import_classes):
    class TestMeta(type):
        def __new__(meta, name, bases, dct):
            new_class = super(TestMeta, meta).__new__(meta, name, bases, dct)
            for import_class in import_classes:
                for name in vars(import_class):
                    if not name.startswith('__'):
                        prop = getattr(import_class, name)
                        setattr(new_class, name, prop)
            return new_class
    return TestMeta

class TestClass:
    import TestClassImport
    __metaclass__ = make_meta(TestClassImport)

    # other functions defined as normal...
This will add everything in the global scope of TestClassImport.py that doesn't start with '__' as a property on TestClass.
Or, you can use a class decorator to add properties dynamically in the same fashion.
def add_imports(*import_classes):
    def augment_class(cls):
        for import_class in import_classes:
            for name in vars(import_class):
                if not name.startswith('__'):
                    prop = getattr(import_class, name)
                    setattr(cls, name, prop)
        return cls
    return augment_class

import TestClassImport

@add_imports(TestClassImport)
class TestClass:
    # normal class body
But mixins do seem like a better approach.
You can use importlib for this, e.g.:
import importlib

class TestClass:
    def __init__(self, module_name):
        _tmp = importlib.import_module(module_name)
        for elem in dir(_tmp):
            if not elem.startswith('_'):
                prop = getattr(_tmp, elem)
                setattr(self, elem, prop)

    def funcA(self):
        print("function A")

tc = TestClass('some_module')
tc.funcB()  # prints "function B"
With this approach, you can move the logic into a load_module(module_name) method instead of __init__(), so modules can be loaded independently of each other (e.g. to avoid name collisions).
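A sketch of that variant, using the stdlib math module for illustration since some_module is hypothetical:

```python
import importlib

class TestClass:
    def load_module(self, module_name):
        """Copy the named module's public attributes onto this instance."""
        mod = importlib.import_module(module_name)
        for name in dir(mod):
            if not name.startswith('_'):
                setattr(self, name, getattr(mod, name))

tc = TestClass()
tc.load_module('math')   # stands in for your own module
print(tc.sqrt(9.0))      # → 3.0; the instance now exposes math's functions
```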
Suppose I have this snippet inside a module
def func(params):
    class MyClass(object):
        pass
How can I pickle an instance of the class MyClass?
You can't, because pickle requires that a picklable object's class definition reside at an importable module's top level. Just put your class at module scope and you are good to go.
That said, in Python there is very little that can't be achieved with a bit of hacking the insides of the machinery (sys.modules in this case), but I wouldn't recommend that.
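For illustration, here is the same idea with the class moved out to module scope; the class and attribute names are arbitrary:

```python
import pickle

class MyClass(object):   # module-level, so pickle can find it by name
    def __init__(self, value):
        self.value = value

def func():
    # Creating instances inside a function is fine; only the class
    # definition itself needs to live at module scope.
    return MyClass(42)

obj = func()
restored = pickle.loads(pickle.dumps(obj))
print(restored.value)    # → 42
```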
The MyClass definition is local to the func function. You cannot directly create an instance of it from outside, but you can map its functions onto a new class and then use the new class as if it were the original one. Here's an example:
def func(params):
    class MyClass(object):
        some_param = 100

        def __init__(self, *args):
            print "args:", args

        def blabla(self):
            self.x = 123
            print self.some_param

        def getme(self):
            print self.x
func.func_code is the code of the func function, and func.func_code.co_consts[2] contains the bytecode of the MyClass definition:
In : func.func_code.co_consts
Out:
(None,
'MyClass',
<code object MyClass at 0x164dcb0, file "<ipython-input-35-f53bebe124be>", line 2>)
So we need the bytecode for the MyClass functions:
In : eval(func.func_code.co_consts[2])
Out:
{'blabla': <function blabla at 0x24689b0>,
'__module__': '__main__',
'getme': <function getme at 0x2468938>,
'some_param': 100,
'__init__': <function __init__ at 0x219e398>}
And finally we create a new class with a metaclass that assigns the MyClass functions to the new class:
def map_functions(name, bases, dict):
    dict.update(eval(func.func_code.co_consts[2]))
    return type(name, bases, dict)

class NewMyClass(object):
    __metaclass__ = map_functions

n = NewMyClass(1, 2, 3, 4, 5)
>> args: (1, 2, 3, 4, 5)

n.blabla()
>> 100

n.getme()
>> 123
This is somewhat tough to do because of the way pickle handles objects of user-defined classes by default: it creates a new instance of the class, using the object's __class__.__name__ attribute to retrieve its type from the object's original module. Which means: pickling and unpickling only work (by default) for classes that have well-defined names in the module in which they are defined.
When one defines a class inside a function, there usually won't be a module-level (i.e. global) variable holding the name of each class that was created inside the function.
The behavior of pickling and unpickling can be customized through the __getstate__ and __setstate__ methods on the class - check the docs - but even then, doing it right for a dynamic class can be tricky. Still, I managed to create a working implementation of it for another S.O. question - check my answer here:
Pickle a dynamically parameterized sub-class
You can work around pickle's requirement that class definitions be importable by including the class definition as a string in the data pickled for the instance and exec()uting it yourself when unpickling: add a __reduce__() method that passes the class definition to a callable. Here's a trivial example illustrating what I mean:
from textwrap import dedent

# Scaffolding
definition = dedent('''
    class MyClass(object):
        def __init__(self, attribute):
            self.attribute = attribute
        def __repr__(self):
            return '{}({!r})'.format(self.__class__.__name__, self.attribute)
        def __reduce__(self):
            return instantiator, (definition, self.attribute)
''')

def instantiator(class_def, init_arg):
    """ Create class and return an instance of it. """
    exec(class_def)
    TheClass = locals()['MyClass']
    return TheClass(init_arg)

# Sample usage
import pickle
from io import BytesIO

stream = BytesIO()  # use a memory-backed file for testing
obj = instantiator(definition, 'Foo')  # create instance of class from definition
print('obj: {}'.format(obj))
pickle.dump(obj, stream)

stream.seek(0)  # rewind
obj2 = pickle.load(stream)
print('obj2: {}'.format(obj2))
Output:
obj: MyClass('Foo')
obj2: MyClass('Foo')
Obviously it's inefficient to include the class definition string with every class instance pickled, so that redundancy may make it impractical, depending on the number of class instances involved.
How can I implement the equivalent of a class's __getattr__, on a module?
Example
When calling a function that does not exist in a module's statically defined attributes, I wish to create an instance of a class in that module, and invoke the method on it with the same name as failed in the attribute lookup on the module.
class A(object):
    def salutation(self, accusative):
        print "hello", accusative

# note this function is intentionally on the module, and not the class above
def __getattr__(mod, name):
    return getattr(A(), name)

if __name__ == "__main__":
    # i hope here to have my __getattr__ function above invoked, since
    # salutation does not exist in the current namespace
    salutation("world")
Which gives:
matt@stanley:~/Desktop$ python getattrmod.py
Traceback (most recent call last):
File "getattrmod.py", line 9, in <module>
salutation("world")
NameError: name 'salutation' is not defined
There are two basic problems you are running into here:
1. __xxx__ methods are only looked up on the class
2. TypeError: can't set attributes of built-in/extension type 'module'
(1) means any solution would have to also keep track of which module was being examined, otherwise every module would then have the instance-substitution behavior; and (2) means that (1) isn't even possible... at least not directly.
Fortunately, sys.modules is not picky about what goes there, so a wrapper will work - but only for module access (i.e. import somemodule; somemodule.salutation('world')). For same-module access you pretty much have to yank the methods from the substitution class and add them to globals(), either with a custom method on the class (I like using .export()) or with a generic function (such as those already listed as answers). One thing to keep in mind: if the wrapper is creating a new instance each time, and the globals solution is not, you end up with subtly different behavior. Oh, and you don't get to use both at the same time - it's one or the other.
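The globals()-export idea for same-module access can be sketched with a generic function like this (the helper name export_methods is my own invention):

```python
def export_methods(instance, namespace):
    """Copy the instance's public bound methods into a namespace
    (typically globals()), so same-module code can call them as bare names."""
    for name in dir(type(instance)):
        if not name.startswith('_'):
            namespace[name] = getattr(instance, name)

class Greeter(object):
    def salutation(self, accusative):
        return "hello " + accusative

export_methods(Greeter(), globals())
print(salutation("world"))   # → hello world
```

Note that every exported name stays bound to this one shared instance, which is exactly the "subtly different behavior" compared to a wrapper that creates a fresh instance per lookup.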
Update
From Guido van Rossum:
There is actually a hack that is occasionally used and recommended: a
module can define a class with the desired functionality, and then at
the end, replace itself in sys.modules with an instance of that class
(or with the class, if you insist, but that's generally less useful).
E.g.:
# module foo.py
import sys

class Foo:
    def funct1(self, <args>): <code>
    def funct2(self, <args>): <code>

sys.modules[__name__] = Foo()
This works because the import machinery is actively enabling this
hack, and as its final step pulls the actual module out of
sys.modules, after loading it. (This is no accident. The hack was
proposed long ago and we decided we liked it enough to support it in
the import machinery.)
So the established way to accomplish what you want is to create a single class in your module, and as the last act of the module replace sys.modules[__name__] with an instance of your class -- and now you can play with __getattr__/__setattr__/__getattribute__ as needed.
Note 1: If you use this functionality then anything else in the module, such as globals, other functions, etc., will be lost when the sys.modules assignment is made -- so make sure everything needed is inside the replacement class.
Note 2: To support from module import * you must have __all__ defined in the class; for example:
class Foo:
    def funct1(self, <args>): <code>
    def funct2(self, <args>): <code>
    __all__ = list(set(vars().keys()) - {'__module__', '__qualname__'})
Depending on your Python version, there may be other names to omit from __all__. The set() can be omitted if Python 2 compatibility is not needed.
A while ago, Guido declared that all special method lookups on
new-style classes bypass __getattr__ and __getattribute__. Dunder methods had previously worked on modules - you could, for example, use a module as a context manager simply by defining __enter__ and __exit__, before those tricks broke.
Recently some historical features have made a comeback, the module __getattr__ among them, so the existing hack (a module replacing itself with a class in sys.modules at import time) should no longer be necessary.
In Python 3.7+, you just use the one obvious way. To customize attribute access on a module, define a __getattr__ function at the module level which should accept one argument (name of attribute), and return the computed value or raise an AttributeError:
# my_module.py
from typing import Any

def __getattr__(name: str) -> Any:
    ...
This will also allow hooks into "from" imports, i.e. you can return dynamically generated objects for statements such as from my_module import whatever.
On a related note, along with the module getattr you may also define a __dir__ function at module level to respond to dir(my_module). See PEP 562 for details.
This is a hack, but you can wrap the module with a class:
import sys

class Wrapper(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __getattr__(self, name):
        # Perform custom logic here
        try:
            return getattr(self.wrapped, name)
        except AttributeError:
            return 'default'  # Some sensible default

sys.modules[__name__] = Wrapper(sys.modules[__name__])
We don't usually do it that way.
What we do is this.
class A(object):
    ....

# The implicit global instance
a = A()

def salutation(*arg, **kw):
    a.salutation(*arg, **kw)
Why? So that the implicit global instance is visible.
For examples, look at the random module, which creates an implicit global instance to slightly simplify the use cases where you want a "simple" random number generator.
Similar to what @Håvard S proposed, in a case where I needed to implement some magic on a module (like __getattr__), I would define a new class that inherits from types.ModuleType and put that in sys.modules (probably replacing the module where my custom ModuleType was defined).
See the main __init__.py file of Werkzeug for a fairly robust implementation of this.
This is hackish, but...
# Python 2.7
import types

class A(object):
    def salutation(self, accusative):
        print("hello", accusative)

    def farewell(self, greeting, accusative):
        print(greeting, accusative)

def AddGlobalAttribute(classname, methodname):
    print("Adding " + classname + "." + methodname + "()")
    def genericFunction(*args):
        return globals()[classname]().__getattribute__(methodname)(*args)
    globals()[methodname] = genericFunction
# set up the global namespace

x = 0   # X and Y are here to add them implicitly to globals, so
y = 0   # globals does not change as we iterate over it.

toAdd = []

def isCallableMethod(classname, methodname):
    someclass = globals()[classname]()
    something = someclass.__getattribute__(methodname)
    return callable(something)

for x in globals():
    print("Looking at", x)
    if isinstance(globals()[x], (types.ClassType, type)):
        print("Found Class:", x)
        for y in dir(globals()[x]):
            if y.find("__") == -1:  # hack to ignore default methods
                if isCallableMethod(x, y):
                    if y not in globals():  # don't override existing global names
                        toAdd.append((x, y))
# Returns:
# ('Looking at', 'A')
# ('Found Class:', 'A')
# ('Looking at', 'toAdd')
# ('Looking at', '__builtins__')
# ('Looking at', 'AddGlobalAttribute')
# ('Looking at', 'register')
# ('Looking at', '__package__')
# ('Looking at', 'salutation')
# ('Looking at', 'farewell')
# ('Looking at', 'types')
# ('Looking at', 'x')
# ('Looking at', 'y')
# ('Looking at', '__name__')
# ('Looking at', 'isCallableMethod')
# ('Looking at', '__doc__')
# ('Looking at', 'codecs')
for x in toAdd:
    AddGlobalAttribute(*x)

if __name__ == "__main__":
    salutation("world")
    farewell("goodbye", "world")
# Returns:
# hello world
# goodbye world
This works by iterating over all the objects in the global namespace. If the item is a class, it iterates over the class attributes. If an attribute is callable, it adds it to the global namespace as a function.
It ignores all attributes which contain "__".
I wouldn't use this in production code, but it should get you started.
Here's my own humble contribution - a slight embellishment of @Håvard S's highly rated answer, but a bit more explicit (so it might be acceptable to @S.Lott, even though probably not good enough for the OP):
import sys

class A(object):
    def salutation(self, accusative):
        print "hello", accusative

class Wrapper(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __getattr__(self, name):
        try:
            return getattr(self.wrapped, name)
        except AttributeError:
            return getattr(A(), name)

_globals = sys.modules[__name__] = Wrapper(sys.modules[__name__])

if __name__ == "__main__":
    _globals.salutation("world")
Create your module file that has your classes. Import the module. Run getattr on the module you just imported. You can do a dynamic import using __import__ and pull the module from sys.modules.
Here's your module some_module.py:
class Foo(object):
    pass

class Bar(object):
    pass
And in another module:
import some_module
Foo = getattr(some_module, 'Foo')
Doing this dynamically:
import sys
__import__('some_module')
mod = sys.modules['some_module']
Foo = getattr(mod, 'Foo')
EDIT: Note that this is a REALLY BAD idea to do in production code. This was just an interesting thing for me. Don't do this at home!
Is it possible to modify __metaclass__ variable for whole program (interpreter) in Python?
This simple example is working:
class ChattyType(type):
    def __init__(cls, name, bases, dct):
        print "Class init", name
        super(ChattyType, cls).__init__(name, bases, dct)

__metaclass__ = ChattyType

class Data:
    pass

data = Data()  # prints "Class init Data"
print data
but I would love to be able to change __metaclass__ so that it works even in submodules. So for example (file m1.py):
class A:
    pass

a = A()
print a
file main.py:
class ChattyType(type):
    def __init__(cls, name, bases, dct):
        print "Class init", name
        super(ChattyType, cls).__init__(name, bases, dct)

__metaclass__ = ChattyType

import m1  # and now prints "Class init A"

class Data:
    pass

data = Data()  # prints "Class init Data"
print data
I understand that a global __metaclass__ no longer works in Python 3.x, but that is not my concern (my code is proof of concept). So: is there any way to accomplish this in Python 2.x?
The "global __metaclass__" feature of Python 2 is designed to work per-module, only (just think what havoc it would wreak, otherwise, by forcing your own metaclass on all library and third-party modules that you imported from that point onwards -- shudder!). If it's very important to you to "secretly" alter the behavior of all modules you're importing from a certain point onwards, for whatever cloak-and-dagger reason, you could play very very dirty tricks with an import hook (at worst by first copying the sources to a temporary location while altering them...) but the effort would be proportionate to the enormity of the deed, which seems appropriate;-)
Okay; IMO this is gross, hairy, dark magic. You shouldn't use it, perhaps ever, but especially not in production code. It is kind of interesting just for curiosity's sake, however.
You can write a custom importer using the mechanisms described in PEP 302, and further discussed in Doug Hellmann's PyMOTW: Modules and Imports. That gives you the tools to accomplish the task you contemplated.
I implemented such an importer, just because I was curious. Essentially, for the modules you specify by means of the class variable __chatty_for__, it will insert a custom type as a __metaclass__ variable in the imported module's __dict__, before the code is evaluated. If the code in question defines its own __metaclass__, that will replace the one pre-inserted by the importer. It would be inadvisable to apply this importer to any modules before carefully considering what it would do to them.
I haven't written many importers, so I may have done one or more silly things while writing this one. If anyone notices flaws / corner cases I missed in the implementation, please leave a comment.
source file 1:
# foo.py
class Foo: pass
source file 2:
# bar.py
class Bar: pass
source file 3:
# baaz.py
class Baaz: pass
and the main event:
# chattyimport.py
import imp
import sys
import types

class ChattyType(type):
    def __init__(cls, name, bases, dct):
        print "Class init", name
        super(ChattyType, cls).__init__(name, bases, dct)

class ChattyImporter(object):
    __chatty_for__ = []

    def __init__(self, path_entry):
        pass

    def find_module(self, fullname, path=None):
        if fullname not in self.__chatty_for__:
            return None
        try:
            if path is None:
                self.find_results = imp.find_module(fullname)
            else:
                self.find_results = imp.find_module(fullname, path)
        except ImportError:
            return None
        (f, fn, (suf, mode, typ)) = self.find_results
        if typ == imp.PY_SOURCE:
            return self
        return None

    def load_module(self, fullname):
        #print '%s loading module %s' % (type(self).__name__, fullname)
        (f, fn, (suf, mode, typ)) = self.find_results
        data = f.read()
        if fullname in sys.modules:
            module = sys.modules[fullname]
        else:
            sys.modules[fullname] = module = types.ModuleType(fullname)
        module.__metaclass__ = ChattyType
        module.__file__ = fn
        module.__name__ = fullname
        codeobj = compile(data, fn, 'exec')
        exec codeobj in module.__dict__
        return module

class ChattyImportSomeModules(ChattyImporter):
    __chatty_for__ = 'foo bar'.split()

sys.meta_path.append(ChattyImportSomeModules(''))

import foo   # prints 'Class init Foo'
import bar   # prints 'Class init Bar'
import baaz
Nope. (This is a feature!)