I have a module which contains a lot of functions (more than 25). I want to add a common decorator to each of these functions. The normal way to do this is to add an @decorator line above each function, but I was wondering if there is a better way to do it. Could I perhaps declare a global decorator at the top of the module, or something similar?
Note that since I am using someone else's code, I want to minimize the number of lines changed, so modifying the module is not ideal for me.
Thanks.
If your decorator is called my_decorator:
# Decorate all the above functions (place this at the bottom of the module)
import types

for k, v in list(globals().items()):
    if isinstance(v, types.FunctionType) and v is not my_decorator:  # skip the decorator itself
        globals()[k] = my_decorator(v)
You could also apply this to the module after importing it:
import othermodule
import types

for k, v in list(vars(othermodule).items()):
    if isinstance(v, types.FunctionType):
        setattr(othermodule, k, my_decorator(v))
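For completeness, a minimal my_decorator might look like this (a hedged sketch; the printed message is just an illustrative side effect):
import functools

def my_decorator(func):
    @functools.wraps(func)  # preserve the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        print('calling', func.__name__)  # illustrative side effect
        return func(*args, **kwargs)
    return wrapper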
I think applying a decorator en masse, so that it isn't obvious at the place you'd naturally look (the function's definition) that it has been applied, is generally a bad idea. Explicit is better than implicit, and all that.
If you want to apply the decorator to some third party module's functions, without modifying the third-party code, here is how I would do it:
# my_wrapper_module.py
import some_module
import functools

def some_decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        ...  # your wrapping logic, presumably calling func(*args, **kwargs)
    return wrapper
FUNCTION_NAMES = [
    'some_func_1',
    'some_func_2',
    'some_func_3',
    ...
]

for name in FUNCTION_NAMES:
    globals()[name] = some_decorator(getattr(some_module, name))
And then use these functions elsewhere by doing from my_wrapper_module import some_func_2, etc.
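Client code then looks like this (a hypothetical sketch; some_func_1 stands in for whatever some_module actually exports):
# client.py (hypothetical)
from my_wrapper_module import some_func_1

result = some_func_1()  # the decorated version, not some_module.some_func_1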
For me, this has the following advantages:
- No need to modify the third-party source file
- It is clear from the call site that I should go look at my_wrapper_module to see what I'm calling, and that I'm not using the undecorated versions of the functions
- It is clear from my_wrapper_module what functions are being exported, that they originally come from some_module, and that they all have the same decorator applied
- Any code that imports some_module directly isn't silently and inexplicably affected; this could be particularly important if the third-party code is more than one module
But if what you're trying to do is hack a third-party library so that internal calls are affected, then this is not what you want.
Related
I would like to convert a singleton object programmatically into a Python module so that I can use the methods of this singleton object directly by importing them via the module instead of accessing them as object attributes. By "programmatically" I mean that I do not want to have to copy-paste the class methods explicitly into a module file. I need some sort of workaround that allows me to import the object methods into the global scope of another module.
I would really appreciate it if someone could help me with this one.
Here is a basic example that should illustrate my problem:
mymodule.py
class MyClass:
    """This is my custom class"""

    def my_method(self):
        return "myValue"

singleton = MyClass()
main_as_is.py
from mymodule import MyClass
myobject = MyClass()
print(myobject.my_method())
main_to_be.py
from mymodule import my_method # or from mymodule.singleton import my_method
print(my_method())
You can use the same strategy that the standard random module uses. All the functions in that module are actually methods of a "private" instance of the Random class. That's convenient for most common uses of the module, although sometimes it's useful to create your own instances of Random so that you can have multiple independent random streams.
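You can see the two styles side by side (the seed value 42 is arbitrary):
import random

rng = random.Random(42)   # your own instance: an independent random stream
print(rng.random())
print(random.random())    # module-level function: a method of the hidden shared instance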
I've adapted your code to illustrate that technique. I named the class and its instance with a single leading underscore, since that's the usual convention in Python to signify a private name, but bear in mind it's simply a convention, Python doesn't do anything to enforce this privacy.
mymodule.py
class _MyClass:
    """ This is my custom class """

    def my_method(self):
        return "myValue"

_myclass = _MyClass()
my_method = _myclass.my_method
main_to_be.py
from mymodule import my_method
print(my_method())
output
myValue
BTW, the from mymodule import method1, method2 syntax is fine if you only import a small number of names, or if it's clear from the name which module it's from (like math module functions and constants), and you don't import from many modules. Otherwise it's better to use this sort of syntax:
import mymodule as mm
# Call a method from the module
mm.method1()
That way it's obvious which names are local, and which ones are imported and where they're imported from. Sure, it's a little more typing, but it makes the code a whole lot more readable. And it eliminates the possibility of name collisions.
FWIW, here's a way to automate adding all of the _myclass methods without explicitly listing them (but remember "explicit is better than implicit"). At the end of "mymodule.py", in place of my_method = _myclass.my_method, add this:
globals().update({k: getattr(_myclass, k) for k in _MyClass.__dict__
                  if not k.startswith('__')})
I'm not comfortable with recommending this, since it directly injects items into the globals() dict. Note that that code will add all class attributes, not just methods.
In your question you talk about singleton objects. We don't normally use singletons in Python, and many programmers in various OOP languages consider them to be an anti-pattern. See https://stackoverflow.com/questions/12755539/why-is-singleton-considered-an-anti-pattern for details. For this application there is absolutely no need at all to use a singleton. If you only want a single instance of _MyClass then simply don't create another instance of it, just use the instance that mymodule creates for you. But if your boss insists that you must use a singleton, please see the example code here.
I have created many functions that are spread across different files, and now I would like to apply the same decorator to all of them without modifying the files and without applying the decorator one by one.
I have tried to use this explanation written by delnan, but I had no success with imported functions.
As for the decorator: it must update a list every time a function within a class is executed, recording the function's arguments and values, just like in this other question I asked.
Any suggestions to help me with this issue?
Thanks
A little bit of introspection (dir()) and dynamic look-up with getattr() and setattr().
First we iterate over all names found in the module and check for objects that look like functions. Then we simply rebind each name to the decorated version of the original function.
main.py:
import types
import functools

def decorate_all_in_module(module, decorator):
    for name in dir(module):
        obj = getattr(module, name)
        if isinstance(obj, types.FunctionType):
            setattr(module, name, decorator(obj))

def my_decorator(f):
    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        print(f)
        return f(*args, **kwargs)
    return wrapper

import mymod1
decorate_all_in_module(mymod1, my_decorator)

# Calls that produce the output below
mymod1.f(2)
mymod1.g(3, 4)
mymod1.py:
def f(x):
    print(x)

def g(x, y):
    print(x + y)
Output:
<function f at 0x101e309d8>
2
<function g at 0x101e30a60>
7
The process does not go as smoothly if you use star imports (from mymod import *). The reason is simple: because all the names end up in one huge bag with no record of where they came from, you need a lot of extra tricks to find exactly what you want to patch. But, well, that's why we use namespaces: because they are one honking great idea.
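To see why, compare a directly imported name with the patched module (a small sketch reusing decorate_all_in_module and my_decorator from above):
from mymod1 import f      # binds a local name to the *undecorated* function
import mymod1

decorate_all_in_module(mymod1, my_decorator)

f(2)         # still undecorated: the local binding was made before patching
mymod1.f(2)  # decorated: attribute lookup goes through the patched module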
I have a number of modules that need a database connection instance, and I would prefer that they share the same instance rather than each creating their own. My current way of doing this is to explicitly pass the instance to each function in every module, like so:
def func(arg1, arg2, database_connection):
    pass
This becomes quite ugly and somewhat redundant. It seems there should be a better way: importing a separate module that contains the instance. But I'm not quite sure how to guarantee that it's actually one single instance and not multiple instances.
That is, I'm looking for a way to do something like this:
import db_module

def func(arg1, arg2):
    database_connection = db_module.get_db_instance()
The solution you've described:
import db_module

def func(arg1, arg2):
    database_connection = db_module.get_db_instance()
is perfectly viable because Python imports each module exactly once. If multiple import statements are executed, they each refer to a single instance of the module.
You can read more about modules and importing in the Python Tutorial and the Python Language Reference.
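As a minimal sketch of that idea (create_connection() here is a hypothetical factory standing in for your real connection setup):
# db_module.py
_connection = None

def get_db_instance():
    # The module object is created once per process, so this state is shared
    # by every importer; the connection is created lazily on first access.
    global _connection
    if _connection is None:
        _connection = create_connection()  # hypothetical factory
    return _connection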
Is this what you mean? Create a module that builds the connection once, then import it wherever it's needed?
db_module.py
my_connection = func(x, y)
elsewhere
from db_module import my_connection
I've decorated a method in Python, and when I import the module that contains the method, the decorator runs automatically.
I realize that this is how decorators were designed to work; however, is there a way to have decorators NOT do this?
It sounds like what you want to do is to choose what decorator to apply at run time. Something like this might work:
to_decorate = []

def decorate_later(func):
    to_decorate.append(func)
    return func

@decorate_later
def do_stuff(*args, **kw):
    print('I am doing stuff')

@decorate_later
def do_more_stuff(*args, **kw):
    print('Even more stuff')

def apply_decorator(decorator):
    for func in to_decorate:
        globals()[func.__name__] = decorator(func)
Then you can import the module and all the functions will be defined as normal. decorate_later returns the original function unmodified. You can call apply_decorator() to apply a specified decorator to all of the functions in the module that were registered by #decorate_later
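Usage might look like this (a sketch, assuming the code above lives in a module named stuff.py):
import stuff

def log_calls(func):
    def wrapper(*args, **kw):
        print('calling', func.__name__)
        return func(*args, **kw)
    return wrapper

stuff.do_stuff()                  # undecorated: prints 'I am doing stuff'
stuff.apply_decorator(log_calls)  # rebinds every registered function
stuff.do_stuff()                  # now prints 'calling do_stuff' first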
This is exactly what the venusian library does; you define your decorators according to their API, but the actual behavior isn't triggered until you do a "scan" of the containing module or package.
You don't even need to have a global app object to use venusian decorators; you can pass in the app object as part of the scan, and it'll get passed along to the decorator implementations. So, for example, the same functions can be shared among multiple owners with only a single decorator, just by doing more than one scan.
This is what the Pyramid web framework uses for e.g. event registration, so that merely importing a module doesn't expect to need an app instance. A good example is their event subscriber.
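A registration decorator in venusian's style looks roughly like this (a sketch based on venusian's attach/Scanner API; the registry dict and the names registered/mymodule are illustrative):
import venusian

def registered(wrapped):
    def callback(scanner, name, ob):
        # Runs only when scanner.scan() is called, not at import time
        scanner.registry[name] = ob
    venusian.attach(wrapped, callback)
    return wrapped

# Later, in application setup code:
#   import mymodule
#   scanner = venusian.Scanner(registry={})
#   scanner.scan(mymodule)  # callbacks fire now, with access to the registry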
Use
if __name__ == "__main__":
    # code
in the file, where # code stands for everything at module level (outside any function or class) that currently runs when you import it.
I've gotten myself in trouble a few times now by accidentally (unintentionally) referencing global variables in a function or method definition.
My question is: is there any way to disallow python from letting me reference a global variable? Or at least warn me that I am referencing a global variable?
x = 123

def myfunc():
    print x  # throw a warning or something!!!
Let me add that the typical situation where this arises for me is using IPython as an interactive shell. I use execfile to execute a script that defines a class. In the interpreter, I access the class variable directly to do something useful, then decide I want to add that as a method in my class. When I was in the interpreter, I was referencing the class variable. However, when it becomes a method, it needs to reference self. Here's an example.
class MyClass:
    a = 1
    b = 2
    def add(self):
        return a + b

m = MyClass()
Now in my interpreter I run the script with execfile('script.py'), inspect my class, and type m.a * m.b. I decide that would be a useful method to have, so I modify my code, with an unintentional copy/paste error:
class MyClass:
    a = 1
    b = 2
    def add(self):
        return a + b
    def mult(self):
        return m.a * m.b  # I really meant this to be self.a * self.b
This of course still executes in IPython, but it can really confuse me since it is now referencing the previously defined global variable!
Maybe someone has a suggestion given my typical IPython workflow.
First, you probably don't want to do this. As Martijn Pieters points out, many things, like top-level functions and classes, are globals.
You could filter this for only non-callable globals. Functions, classes, builtin-function-or-methods that you import from a C extension module, etc. are callable. You might also want to filter out modules (anything you import is a global). That still won't catch cases where you, say, assign a function to another name after the def. You could add some kind of whitelisting for that (which would also allow you to create global "constants" that you can use without warnings). Really, anything you come up with will be a very rough guide at best, not something you want to treat as an absolute warning.
Also, no matter how you do it, trying to detect implicit global access, but not explicit access (with a global statement) is going to be very hard, so hopefully that isn't important.
There is no obvious way to detect all implicit uses of global variables at the source level.
However, it's pretty easy to do with reflection from inside the interpreter.
The documentation for the inspect module has a nice chart that shows you the standard members of various types. Note that some of them have different names in Python 2.x and Python 3.x.
This function will get you a list of all the global names accessed by a bound method, unbound method, function, or code object in both versions:
def get_globals(thing):
    thing = getattr(thing, 'im_func', thing)
    thing = getattr(thing, '__func__', thing)
    thing = getattr(thing, 'func_code', thing)
    thing = getattr(thing, '__code__', thing)
    return thing.co_names
If you want to only handle non-callables, you can filter it:
def get_callable_globals(thing):
    thing = getattr(thing, 'im_func', thing)
    func_globals = getattr(thing, 'func_globals', {})
    thing = getattr(thing, 'func_code', thing)
    return [name for name in thing.co_names
            if callable(func_globals.get(name))]
This isn't perfect (e.g., if a function's globals have a custom builtins replacement, we won't look it up properly), but it's probably good enough.
A simple example of using it:
>>> def foo(myparam):
...     myglobal
...     mylocal = 1
>>> print get_globals(foo)
('myglobal',)
And you can pretty easily import a module and recursively walk its callables and call get_globals() on each one, which will work for the major cases (top-level functions, and methods of top-level and nested classes), although it won't work for anything defined dynamically (e.g., functions or classes defined inside functions).
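Such a walk might look like this (a rough sketch building on get_globals() above; it covers top-level functions and methods of top-level classes only):
import inspect

def report_module_globals(module):
    # Report the global names referenced by each top-level function
    # and by the methods of each top-level class.
    for name, obj in vars(module).items():
        if inspect.isfunction(obj):
            print('%s: %s' % (name, get_globals(obj)))
        elif inspect.isclass(obj):
            for mname, meth in vars(obj).items():
                if inspect.isfunction(meth):
                    print('%s.%s: %s' % (name, mname, get_globals(meth)))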
If you only care about CPython, another option is to use the dis module to scan all the bytecode in a module, or .pyc file (or class, or whatever), and log each LOAD_GLOBAL op.
One major advantage of this over the inspect method is that it will find functions that have been compiled, even if they haven't been created yet.
The disadvantage is that there is no way to look up the names (how could there be, if some of them haven't even been created yet?), so you can't easily filter out callables. You can try to do something fancy, like connecting up LOAD_GLOBAL ops to corresponding CALL_FUNCTION (and related) ops, but… that's starting to get pretty complicated.
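In Python 3 the dis module makes the scan itself quite short (a minimal sketch; it reports names but, as noted, cannot filter out callables):
import dis

def global_loads(func):
    # Return every name the compiled function loads via LOAD_GLOBAL (CPython only)
    return [ins.argval for ins in dis.get_instructions(func)
            if ins.opname == 'LOAD_GLOBAL']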
Finally, if you want to hook things dynamically, you can always replace globals with a wrapper that warns every time you access it. For example:
import collections
import sys

class GlobalsWrapper(collections.MutableMapping):
    def __init__(self, globaldict):
        self.globaldict = globaldict
    # ... implement at least __setitem__, __delitem__, __iter__, __len__
    #     in the obvious way, by delegating to self.globaldict
    def __getitem__(self, key):
        print >>sys.stderr, 'Warning: accessing global "{}"'.format(key)
        return self.globaldict[key]

globals_wrapper = GlobalsWrapper(globals())
Again, you can filter on non-callables pretty easily:
def __getitem__(self, key):
    value = self.globaldict[key]
    if not callable(value):
        print >>sys.stderr, 'Warning: accessing global "{}"'.format(key)
    return value
Obviously for Python 3 you'd need to change the print statement to a print function call.
You can also raise an exception instead of warning pretty easily. Or you might want to consider using the warnings module.
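With warnings, the __getitem__ body might become something like this (a sketch; RuntimeWarning is an arbitrary choice of category):
import warnings

def __getitem__(self, key):
    warnings.warn('accessing global "{}"'.format(key), RuntimeWarning)
    return self.globaldict[key]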
You can hook this into your code in various ways. The most obvious one is an import hook that gives each new module a GlobalsWrapper around its normally built globals. I'm not sure how that will interact with C extension modules, but my guess is that it will either work or be harmlessly ignored, either of which is probably fine. The only problem is that this won't affect your top-level script. If that's important, you can write a wrapper script that execfiles the main script with a GlobalsWrapper, or something like that.
I've been struggling with a similar challenge (especially in Jupyter notebooks) and created a small package to limit the scope of functions.
>>> from localscope import localscope
>>> a = 'hello world'
>>> @localscope
... def print_a():
...     print(a)
Traceback (most recent call last):
...
ValueError: `a` is not a permitted global
The @localscope decorator uses Python's disassembler to find all the instructions in the decorated function that load a variable with LOAD_GLOBAL (global variable access) or LOAD_DEREF (closure access). If the variable to be loaded is a builtin, is explicitly listed as an exception, or satisfies a predicate, the access is permitted. Otherwise, an exception is raised.
Note that the decorator analyses the code statically. Consequently, it does not have access to the values of variables accessed by closure.