In Python, how to store 'constants' for functions only once?

Some functions need 'constant' values (i.e. not meant to be redefined later) that should not be parametrized. While default arguments are stored only once per function, some values are just not meaningful as parameters (i.e. as part of the signature). For a (not very useful) example:
def foo(bar):
    my_map = {"rab": barType, "oof": fooType}
    return my_map.get(bar, defaultType)()
It wastes CPU time and RAM to re-create such a constant on every call. Alternatives are to store such constants as module-level globals, or to make the function a callable class; but maybe there are other ways?
When using the module-level global approach, I prefix my (meant-as-a-)constant variable with an underscore to signal that it is not part of the public interface. Still, the module namespace feels slightly "polluted", not to speak of the shame of using something as discouraged as globals at all:
_my_map = {"rab": barType, "oof": fooType}

def foo(bar):
    return _my_map.get(bar, defaultType)()
Or there is the transform-it-into-a-class way. I make __call__ a classmethod, to avoid having to create instances:
class foo:
    my_map = {"rab": barType, "oof": fooType}

    @classmethod
    def __call__(cls, bar):
        return cls.my_map.get(bar, defaultType)()
Are these solutions pythonic enough?
Are there other ways to do this?
Is it even ok as a practice to use such 'constants'?
Note that the objects in my examples are not necessarily actual constants; they are merely used (and can be thought of) as such, given their purpose.

Set it as an attribute on the function:
def foo(bar):
    return foo.my_map.get(bar, defaultType)()

foo.my_map = {"rab": barType, "oof": fooType}
A callable class or a closure is not simple enough IMO.
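A self-contained sketch of the function-attribute approach: the type names here (barType, fooType, defaultType) are hypothetical stand-ins for whatever types the original code used, added only so the example runs.

```python
# Stand-in types for the demo; the originals are whatever your code uses.
class barType: pass
class fooType: pass
class defaultType: pass

def foo(bar):
    # the dict is looked up on the function object itself;
    # it is built exactly once, in the assignment below
    return foo.my_map.get(bar, defaultType)()

foo.my_map = {"rab": barType, "oof": fooType}

print(type(foo("rab")).__name__)   # barType
print(type(foo("nope")).__name__)  # defaultType
```

Since the dict hangs off the function object, there is no module-level name to "pollute" the namespace, at the cost of the slightly odd-looking self-reference inside the body.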

IMHO, there is nothing wrong with module level constants.
Note that according to PEP 8, constants should be all upper case, like this:
_MY_MAP = {"rab": barType, "oof": fooType}

def foo(bar):
    return _MY_MAP.get(bar, defaultType)()
The regular expression module in the standard library uses this style and many established third-party libraries do as well. If you are not convinced, just go to your site-packages directory and grep:
egrep "^_?[A-Z]+ =" *

You could also use closures:
def make_foo():
    my_map = {"rab": barType, "oof": fooType}
    def foo(bar):
        return my_map.get(bar, defaultType)()
    return foo

foo = make_foo()
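A quick way to convince yourself that the closed-over dict is built only once is to peek at the returned function's closure cell (a runnable sketch in Python 3 syntax; the types are stand-ins added for the demo):

```python
# Stand-in types so the sketch is self-contained.
class barType: pass
class fooType: pass
class defaultType: pass

def make_foo():
    my_map = {"rab": barType, "oof": fooType}  # built exactly once
    def foo(bar):
        return my_map.get(bar, defaultType)()
    return foo

foo = make_foo()

# my_map is the only free variable of foo, so it lives in the
# single closure cell and is shared across all calls
cell_dict = foo.__closure__[0].cell_contents
print(cell_dict is foo.__closure__[0].cell_contents)  # True
print(type(foo("oof")).__name__)                      # fooType
```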

To make something as self-contained as possible, you could create a function object (a.k.a. functor) class and give it a __call__() method or a classmethod (but probably not both):
class bazType(object): pass
class barType(object): pass
class fooType(object): pass

class Foo(object):
    _DEFAULT_TYPE = bazType
    _MY_MAP = {"rab": barType, "oof": fooType}

    def __call__(self, bar):
        return self._MY_MAP.get(bar, self._DEFAULT_TYPE)()

    @classmethod
    def foo(cls, bar):
        return cls._MY_MAP.get(bar, cls._DEFAULT_TYPE)()

# using the classmethod
print(Foo.foo("rab"))

# using the __call__ method
foo = Foo()
print(foo("baz"))

# alternative way to use the classmethod
foo = Foo.foo
print(foo("oof"))
Yet another alternative would be to define a staticmethod, which I won't illustrate because it's so similar to the other two -- but you get the idea I hope.

Related

Defer variable initialization until evaluation

I need a way to defer the initialization of a global variable until the first access to it. The overall idea is expressed in the following Python pseudocode:
FOO = bar
FOO.some_method_on_bar() # Init bar: bar = Bar(); bar.some_method_on_bar()
FOO.some_method_on_bar() # Use cached bar: bar.some_method_on_bar()
So far I'm thinking of somehow telling Python to call a special class method every time its instance is evaluated, but I can't seem to google it up:
class LazyGetter:
    def __init__(self, get_value) -> None:
        self.get_value = get_value

    def __class__instance__access__(self):
        return self.get_value()

FOO = LazyGetter(get_value=lambda: Bar())
FOO  # = LazyGetter.__class__instance__access__()
FOO.some_method_on_bar()  # = LazyGetter.__class__instance__access__().some_method_on_bar()
So, basically I need to know if there's something equivalent to the madeup __class__instance__access__ method.
If you have to defer initialization, you may be doing too much in the __init__ method. But if you don't control that code, then you seem to need something like a proxy class, so you can do:
proxied_bar = Proxy(Bar)
...
proxied_bar.some_bar_method()  # initializes Bar if it isn't initialized yet, then calls some_bar_method
For one way to do this, see: Python proxy class
In that answer an instantiated object is proxied (rather than the class), so you would have to make some modifications if you want to defer the __init__ call.
Since Python 3.7 (PEP 562), one can define a module-level __getattr__ function to provide "global" attributes programmatically. Earlier, and more generally, one can define a custom module type to provide such a method.
Assuming that Bar() is needed to initialise the global FOO, the following __getattr__ at module scope can be used.
# can type-annotate to "hint" that FOO will exist at some point
FOO: Bar

# called if module.<item> fails
def __getattr__(item: str):
    if item == "FOO":
        global FOO  # add FOO to the global scope
        FOO = Bar()
        return FOO
    raise AttributeError(f"module {__name__!r} has no attribute {item!r}")
This makes FOO available programmatically when accessed as an attribute, i.e. as module.FOO or an import. It is only available in the global scope after the first such access.
If the access to FOO is expected to happen inside the module first, it is easier to provide a "getter" function instead.
def get_FOO() -> Bar:
    global _FOO
    try:
        return _FOO
    except NameError:
        _FOO = Bar()
        return _FOO
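A minimal, runnable sketch of the getter approach; the Bar class here is a hypothetical stand-in that counts its constructions, just to show the caching works:

```python
class Bar:
    instances = 0  # track constructions for the demo

    def __init__(self):
        Bar.instances += 1

    def some_method_on_bar(self):
        return "ok"

def get_FOO() -> Bar:
    global _FOO
    try:
        return _FOO      # cached after the first call
    except NameError:
        _FOO = Bar()     # first access: build and cache
        return _FOO

get_FOO().some_method_on_bar()
get_FOO().some_method_on_bar()
print(Bar.instances)  # 1 -- Bar was constructed only once
```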
You might want to consider just having an actual global variable and accessing it with global <variable> but I can't say if that fits the use-case. It should work fine if you're just looking for some caching logic.
You might be able to do this with metaclasses, which are a way of modifying a class at the time it is created. Whether this is useful depends on what you're trying to achieve.
If you control the class code, you can use __getattribute__ to delay initialization until the first time you access an attribute.
class Bar:
    def __init__(self, *args):
        self._args = args

    def __getattribute__(self, name):
        args = super().__getattribute__('_args')
        if args is not None:
            # Initialize the object here, on first attribute access.
            self.data = args[0]
            self._args = None  # clear the flag so this runs only once
        return super().__getattribute__(name)

    def some_method_on_bar(self):
        return self.data

Class decorators vs function decorators [duplicate]

This question already has answers here:
Difference between decorator classes and decorator functions
In Python there are two ways to declare decorators:
Class based:
class mydecorator(object):
    def __init__(self, f):
        self.f = f
    def __call__(self, *k, **kw):
        # before f actions
        self.f(*k, **kw)
        # after f actions
Function based:
def mydecorator(f):
    def decorator(*k, **kw):
        # before f actions
        f(*k, **kw)
        # after f actions
    return decorator
Is there any difference between these declarations?
In which cases each of them should be used?
If you want to keep state in the decorator, you should use a class.
For example, this does not work:
def mydecorator(f):
    x = 0
    def decorator():
        x += 1  # x is a nonlocal name and can't be rebound here
        return f(x)
    return decorator
There are many workarounds for this, but the simplest way is to use a class:
class mydecorator(object):
    def __init__(self, f):
        self.f = f
        self.x = 0
    def __call__(self, *k, **kw):
        self.x += 1
        return self.f(self.x)
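For completeness, in Python 3 the function-based version can keep state too, via the nonlocal statement; this is a sketch and not part of the original answer:

```python
def mydecorator(f):
    x = 0
    def decorator():
        nonlocal x  # allow rebinding the enclosing x
        x += 1
        return f(x)
    return decorator

@mydecorator
def show(n):
    return f"call #{n}"

print(show())  # call #1
print(show())  # call #2
```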
When you're creating a callable returning another callable, the function approach is easier and cheaper. There are two main differences:
The function approach works automatically with methods, while if you're using your class approach, you'd have to read on descriptors and define a __get__ method.
The class approach makes keeping state easier. You could use a closure (especially in Python 3, which has nonlocal), but a class is generally clearer for this.
Additionally, the function approach allows you to return the original function, after modifying it or storing it.
However, a decorator can return something other than a callable or something more than a callable. With a class, you can:
Add methods and properties to the decorated callable object, or implement operations on them (uh-oh).
Create descriptors that act in a special way when placed in classes (e.g. classmethod, property)
Use inheritance to implement similar but different decorators.
If you have any doubt, ask yourself: Do you want your decorator to return a function that acts exactly like a function should? Use a function returning a function. Do you want your decorator to return a custom object that does something more or something different to what a function does? Create a class and use it as a decorator.
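The descriptor point above can be sketched concretely: for a class-based decorator to work on methods, it needs a __get__ that binds the instance, much like a plain function's descriptor does. A minimal sketch (Python 3; all names here are invented for the illustration):

```python
import functools

class logged:
    """Class-based decorator; __get__ makes it usable on methods too."""

    def __init__(self, f):
        self.f = f
        self.calls = 0

    def __call__(self, *args, **kw):
        self.calls += 1
        return self.f(*args, **kw)

    def __get__(self, obj, objtype=None):
        # bind like a function would: prepend the instance
        if obj is None:
            return self
        return functools.partial(self.__call__, obj)

class C:
    @logged
    def double(self, x):
        return 2 * x

c = C()
print(c.double(3))  # 6
```

Without __get__, `c.double(3)` would fail because the logged instance would be returned unbound and `self` would never be supplied.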
In fact there are no 'two ways'. There is only one way (define a callable object) or as many ways as there are in python to make a callable object (it could be a method of other object, a result of lambda expression, a 'partial' object, anything that is callable).
Function definition is the easiest way to make a callable object and, as the simplest one, is probably the best in most cases. Using a class gives you more possibilities to cleanly code more complicated cases (even in the simplest cases it looks quite elegant), but it is not that obvious what it does.
No, there are (more than) two ways to make callable objects. One is to def a function, which is obviously callable. Another is to define a __call__ method in a class, which will make instances of it callable. And classes themselves are callable objects.
A decorator is nothing more than a callable object, which is intended to accept a function as its sole argument and return something callable. The following syntax:
@decorate
def some_function(...):
    ...
is just a slightly nicer way of writing:
def some_function(...):
    ...
some_function = decorate(some_function)
The class-based example you give isn't a function that takes a function and returns a function, which is the bog-standard vanilla decorator; it's a class that is initialised with a function and whose instances are callable. To me, this is a little weird if you're not actually using it as a class (does it have other methods? does its state change? do you make several instances of it that have common behaviour encapsulated by the class?). But normal use of your decorated function will not tell the difference (unless it's a particularly invasive decorator), so do whatever feels more natural to you.
Let's just test it!
test_class = """
class mydecorator_class(object):
    def __init__(self, f):
        self.f = f
    def __call__(self, *k, **kw):
        # before f actions
        print('hi class')
        self.f(*k, **kw)
        print('goodbye class')
        # after f actions

@mydecorator_class
def cls():
    print('class')

cls()
"""

test_deco = """
def mydecorator_func(f):
    def decorator(*k, **kw):
        # before f actions
        print('hi function')
        f(*k, **kw)
        print('goodbye function')
        # after f actions
    return decorator

@mydecorator_func
def fun():
    print('func')

fun()
"""

if __name__ == "__main__":
    import timeit
    r = timeit.Timer(test_class).timeit(1000)
    r2 = timeit.Timer(test_deco).timeit(1000)
    print(r, r2)
I got results like this: 0.0499339103699 0.0824959278107
Does this mean the class-based decorator is about twice as fast?

How is this called and how can be done ( `function_name.decorator` )?

Really sorry for the extremely silly title, but if I knew what this was called, I wouldn't be asking here (:
def some_decorator(func):
    # ..

class A:
    @some_decorator
    def func():
        pass

    @func.some_decorator  # this one here - func.some_decorator ?
    def func():
        pass
some_decorator decorates func - that's OK. But what is func.some_decorator, and how does some_decorator become a member (or something else?) of func?
P.S. I'm 90% sure there's already such a question here (as this seems like something basic), but I don't know how to search for it. If there is an exact duplicate, I'll delete this question.
Note: it's not a typo, nor an accident, that both member functions are named func. The decorator is for overloading; the question is related to: Decorating method (class methods overloading)
Remember that the function definition with decorator is equivalent to this:
def func():
    pass
func = some_decorator(func)
So in the following lines, func doesn't refer to the function you defined but to what the decorator turned it into. Also note that decorators can return any object, not just functions. So some_decorator returns an object with a method (it's unfortunate that the names some_decorator and func are reused in the example; it's confusing, but doesn't change anything about the concept) that is itself a decorator. Since the expression after the @ is evaluated first, you still have a reference to the first decorator's method after you define another plain function func. That decorator is then applied to the new function. The full example is therefore equivalent to this:
class A:
    def func():
        pass
    func = some_decorator(func)

    _decorator = func.some_decorator
    def func():
        pass
    func = _decorator(func)
One way to clarify this is to demonstrate it with a concrete example that behaves like this, the builtin property descriptor:
class C(object):
    @property
    def x(self):
        "This is a property object, not a function"
        return self._x

    @x.setter
    def x(self, val):
        self._x = val
>>> c = C()
>>> c.x = 1
>>> c.x
1
>>> C.x
<property object at 0x2396100>
>>> C.x.__doc__
'This is a property object, not a function'
>>> C.x.getter.__doc__
'Descriptor to change the getter on a property.'
>>> C.x.setter.__doc__
'Descriptor to change the setter on a property.'
>>> C.x.deleter.__doc__
'Descriptor to change the deleter on a property.'
The first invocation of property (as a decorator) means that x is not a function - it is a property descriptor. A feature of properties is that they allow you to initially define just the fget method, and then provide fset and fdel later by using the property.setter and property.deleter decorators (although since each of these creates a new property object, you do need to make sure to use the same name each time).
Something similar will usually be the case whenever you see code using this kind of pattern. Ideally, the naming of the decorators involved will make it reasonably clear what is going on (e.g. most people seem to grasp the idiom for defining property attributes reasonably easily).
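To mirror the property pattern with a hand-rolled decorator: the first decorator returns an object that carries a second decorator as an attribute. Everything in this sketch (the names overload_first, also, and the argument-count dispatch) is invented for the illustration; it is not the actual decorator from the linked question:

```python
class overload_first:
    """Hypothetical decorator mimicking the property/x.setter pattern."""

    def __init__(self, func):
        self._impls = [func]

    def also(self, func):
        # `also` is itself a decorator, reachable as <name>.also
        self._impls.append(func)
        return self

    def __call__(self, *args):
        # naive dispatch on argument count, just for illustration
        for impl in self._impls:
            if impl.__code__.co_argcount == len(args):
                return impl(*args)
        raise TypeError("no matching overload")

@overload_first
def func(a):
    return "one arg"

@func.also  # evaluated before the new `def func` below replaces the name
def func(a, b):
    return "two args"

print(func(1))     # one arg
print(func(1, 2))  # two args
```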

How do I dynamically create a function with the same signature as another function?

I'm busy creating a metaclass that replaces a stub function on a class with a new one with a proper implementation. The original function could use any signature. My problem is that I can't figure out how to create a new function with the same signature as the old one. How would I do this?
Update
This has nothing to do with the actual question which is "How do I dynamically create a function with the same signature as another function?" but I'm adding this to show why I can't use subclasses.
I'm trying to implement something like Scala Case Classes in Python. (Not the pattern matching aspect just the automatically generated properties, eq, hash and str methods.)
I want something like this:
>>> class MyCaseClass():
...     __metaclass__ = CaseMetaClass
...     def __init__(self, a, b):
...         pass
>>> instance = MyCaseClass(1, 'x')
>>> instance.a
1
>>> instance.b
'x'
>>> str(instance)
MyCaseClass(1, 'x')
As far as I can see, there is no way to do that with subclasses.
I believe functools.wraps does not reproduce the original call signature. However, Michele Simionato's decorator module does:
import decorator

class FooType(type):
    def __init__(cls, name, bases, clsdict):
        @decorator.decorator
        def modify_stub(func, *args, **kw):
            return func(*args, **kw) + ' + new'
        setattr(cls, 'stub', modify_stub(clsdict['stub']))

class Foo(object):
    __metaclass__ = FooType
    def stub(self, a, b, c):
        return 'original'

foo = Foo()
help(foo.stub)
# Help on method stub in module __main__:
# stub(self, a, b, c) method of __main__.Foo instance

print(foo.stub(1, 2, 3))
# original + new
Use functools.wraps:
>>> from functools import wraps
>>> def f(a, b):
...     return a + b
...
>>> @wraps(f)
... def f2(*args):
...     print(args)
...     return f(*args)
...
>>> f2(2, 5)
(2, 5)
7
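As a side note to the claim above: in Python 3, functools.wraps sets __wrapped__ on the wrapper, and inspect.signature follows it, so introspection reports the original signature even though the wrapper is defined with *args. A quick check:

```python
import inspect
from functools import wraps

def f(a, b):
    return a + b

@wraps(f)
def f2(*args):
    return f(*args)

# wraps copies metadata and sets __wrapped__, so the original
# signature is what introspection reports
print(inspect.signature(f2))  # (a, b)
print(f2.__name__)            # f
```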
It is possible to do this using inspect.getargspec (deprecated in Python 3 in favour of inspect.signature). There's even a PEP in place to make it easier.
BUT -- this is not a good thing to do. Can you imagine how much of a debugging/maintenance nightmare it would be to have your functions dynamically created at runtime -- and not only that, but done so by a metaclass?! I don't understand why you have to replace the stub dynamically; can't you just change the code when you want to change the function? I mean, suppose you have a class
class Spam(object):
    def ham(self, a, b):
        return NotImplemented
Since you don't know what it's meant to do, the metaclass can't actually implement any functionality. If you knew what ham were meant to do, you could do it in ham or one of its parent classes, instead of returning NotImplemented.

How to differentiate between method and function in a decorator?

I want to write a decorator that acts differently depending on whether it is applied to a function or to a method.
def some_decorator(func):
    if the_magic_happens_here(func):  # <---- Point of interest
        print('Yay, found a method ^_^ (unbound yet)')
    else:
        print('Meh, just an ordinary function :/')
    return func

class MyClass(object):
    @some_decorator
    def method(self):
        pass

@some_decorator
def function():
    pass
I tried inspect.ismethod(), inspect.ismethoddescriptor() and inspect.isfunction(), but no luck. The problem is that at decoration time a method is neither a bound nor an unbound method; it is an ordinary function as long as it is accessed from within the class body.
What I really want to do is delay the actions of the decorator until the class is actually instantiated, because I need the methods to be callable in their instance scope. For this, I want to mark methods with an attribute and later search for these attributes when the .__new__() method of MyClass is called. The classes for which this decorator should work are required to inherit from a class that is under my control. You can use that fact for your solution.
In the case of a normal function the delay is not necessary and the decorator should take action immediately. That is why I want to differentiate between these two cases.
I would rely on the convention that functions that will become methods have a first argument named self, and other functions don't. Fragile, but then, there's no really solid way.
So (pseudocode as I have comments in lieu of what you want to do in either case...):
import inspect
import functools

def decorator(f):
    argnames = inspect.getargspec(f).args  # inspect.signature in modern Python
    if argnames and argnames[0] == 'self':
        # looks like a (future) method...
    else:
        # looks like a "real" function
    @functools.wraps(f)
    def wrapper(...):  # etc etc
One way to make it a bit more solid, as you say all classes involved inherit from a class under your control, is to have that class provide a metaclass (which will also of course be inherited by said classes) which checks things at the end of the class body. Make the wrapped function accessible e.g. by wrapper._f = f and the metaclass's __init__ can check that all wrapped methods did indeed have self as the first argument.
Unfortunately there's no easy way to check that other functions (non-future-methods) being wrapped didn't have such a first argument, since you're not in control of the environment in that case. The decorator might check for "top-level" functions (ones whose def is a top-level statement in their module), via the f_globals (globals dict, i.e., module's dict) and f_name attributes of the function -- if the function's such a global presumably it won't later be assigned as an attribute of the class (thereby becoming a future-method anyway;-) so the self named first arg, if there, can be diagnosed as wrong and warned about (while still treating the function as a real function;-).
One alternative would be to do the decoration in the decorator itself under the hypothesis of a real function, but also make available the original function object as wrapper._f. Then, the metaclass's __init__ can re-do the decoration for all functions in the class body that it sees have been marked this way. This approach is much more solid than the convention-relying one I just sketched, even with the extra checks. Still, something like
class Foo(Bar): ...  # no decorations

@decorator
def f(*a, **k): ...

Foo.f = f  # "a killer"... function becomes method!
would still be problematic -- you could try intercepting this with a __setattr__ in your metaclass (but then other assignments to class attributes after the class statement can become problematic).
The more the user's code has freedom to do funky things (and Python generally leaves the programmer a lot of such freedom), the harder time your "framework-y" code has keeping things under tight control instead, of course;-).
From Python 3.3 onwards by using PEP 3155:
def some_decorator(func):
    if func.__name__ != func.__qualname__:
        print('Yay, found a method ^_^ (unbound yet)')
    else:
        print('Meh, just an ordinary function :/')
    return func
A method x of class A will have a __qualname__ that is A.x while a function x will have a __qualname__ of x.
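A quick, runnable check of the __qualname__ trick (Python 3; the helper name is invented for the demo):

```python
def is_future_method(func):
    # inside a class body, a method's qualified name is "Class.name",
    # while a module-level function's qualified name equals its name
    return func.__name__ != func.__qualname__

def plain():
    pass

class MyClass:
    def method(self):
        pass

print(is_future_method(plain))           # False
print(is_future_method(MyClass.method))  # True
```

One caveat: nested functions also get dotted qualified names (e.g. `outer.<locals>.inner`), so this heuristic can misfire on functions defined inside other functions.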
Do you need to have the magic happen where you choose which wrapper to return, or can you defer the magic until the function is actually called?
You could always try a parameter to your decorator to indicate which of the two wrappers it should use, like
def some_decorator(clams):
    def _mydecor(func):
        @wraps(func)
        def wrapping(*args, ...):
            ...
        return wrapping

    def _myclassdecor(func):
        @wraps(func)
        .....

    return _mydecor if clams else _myclassdecor
The other thing I might suggest is to create a metaclass and define its __init__ method to look for methods decorated with your decorator and revise them accordingly, as Alex hinted at. Use this metaclass with your base class; since all the classes that will use the decorator inherit from the base class, they will also get the metaclass as their type and use its __init__ as well.
You just need to check to see if the function being decorated has an im_func attribute. If it does, then it is a method. If it doesn't then it is a function.
Note that the code sample below does the detection at call time but you can do it at decoration time as well. Just move the hasattr check to the outer decorator generator.
Python 2.6.4 (r264:75706, Dec 7 2009, 18:45:15)
[GCC 4.4.1] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> def deco(f):
...     def _wrapper(*args, **kwargs):
...         if hasattr(f, 'im_func'):
...             print 'method'
...         else:
...             print 'function'
...     return _wrapper
...
>>> deco(lambda x: None)()
function
>>> def f(x):
...     return x + 5
...
>>> deco(f)()
function
>>> class A:
...     def f(self, x):
...         return x + 5
...
>>> a = A()
>>> deco(a.f)()
method
>>> deco(A.f)()
method
>>>
Edit
Oh snap! I got it totally wrong. I really should have read Alex's post more thoroughly.
>>> class B:
...     @deco
...     def f(self, x):
...         return x + 5
...
>>> b = B()
>>> b.f()
function
