Python inspect: Get arguments of specific decorator

I need a script that given a function returns the arguments of a specific decorator.
Imagine the following function:
@decorator_a
@decorator_b(41, 42, 43)
@decorator_c(45)
def foo(self):
    return 'bar'
I need a function that given foo returns the arguments of decorator_b - something like [41,42,43]. Is there a way to achieve this?

After a few hours of trying out different stuff I figured out a feasible solution:
inspect.getclosurevars(foo.__wrapped__).nonlocals
If you know the parameter names of the decorator you are trying to inspect, you can check for their presence in the nonlocals dict. If they're not there, check one __wrapped__ layer deeper, and so on.
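As a small sketch of that idea, assuming the decorators use functools.wraps (so __wrapped__ is set) and that you know the parameter names the decorator closes over (the helper name and the example names below are hypothetical, not from the original answer):
import inspect

def find_decorator_args(func, arg_names):
    # Walk down the __wrapped__ chain and return the closure values
    # matching arg_names, or None if no layer has them.
    while func is not None:
        try:
            nonlocals = inspect.getclosurevars(func).nonlocals
        except TypeError:
            break
        if all(name in nonlocals for name in arg_names):
            return [nonlocals[name] for name in arg_names]
        func = getattr(func, '__wrapped__', None)
    return None

# e.g. if decorator_b's parameters happen to be named x, y, z:
# find_decorator_args(foo, ['x', 'y', 'z'])  # -> [41, 42, 43]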

Related

I am trying to make my own module and I want to show users help for the module's functions

I am creating a module in which I'm defining multiple functions, and I want to give users some help for them, just like the help() function does. I'm not quite sure how to do that, so can someone help me? For example, if I want help for this function, what do I have to do?
def shanky_calculate_average(*args):
    my_average = sum(args) / len(args)
    return my_average
I just want to know if I can get help in the same way as you can on things like help(pandas.read_excel)
What help() actually does is print the function's name with its arguments, followed by its docstring. Docstrings are string literals (not comments) placed directly after the function or class definition and are written using either '''triple single quotes''' or """triple double quotes""".
So doing something like this:
def shanky_calculate_average(*args):
    '''
    Computes the average of the arguments.
    Example: shanky_calculate_average(1,2,3,4)
    '''
    my_average = sum(args) / len(args)
    return my_average
will result in help(shanky_calculate_average) printing:
shanky_calculate_average(*args)
    Computes the average of the arguments.
    Example: shanky_calculate_average(1,2,3,4)
As a side note, you can also access docstrings via the __doc__ attribute of the object, like this: shanky_calculate_average.__doc__.
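For example:
print(shanky_calculate_average.__doc__)
# prints the raw docstring text, with its leading indentation intact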

A decorator that returns multiple functions

I'd like to write a decorator that places multiple functions into the module namespace. Consider the following module:
# my_module.py
from scipy import signal

@desired_decorator(new_size=(8, 16, 32))
def resample(x, new_size):
    return signal.resample(x, new_size)
I'd like to now be able to import resample_8, resample_16, and resample_32 from my_module. I can write the decorator and have it return a list of functions, but how can those functions be made available in the module namespace?
Since you can assign to the globals() dictionary without any sneaky hacks, this is just about possible.
EDIT: OK, maybe it's a little bit sneaky. Don't try this at home without a supervising Pythonista. (martineau)
EDIT 2: It is possible to get the caller's globals by using stack introspection, which avoids the import problem, but it won't work when invoked in a non-global namespace, and it won't make the code any less confusing in six months. (user2357112)
globals() returns a dictionary of the module's global variables. Assigning new entries to it makes it possible for a user to import the generated functions.
functools.partial is a great way to make partial functions: it essentially represents a 'half-completed' function call. A partial object remembers the positional and keyword arguments you give it, and calling it invokes the original function with those arguments filled in. Read more about it in the functools documentation.
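For instance, a tiny illustration of partial (the power/square names here are just for the example):
from functools import partial

def power(base, exponent):
    return base ** exponent

square = partial(power, exponent=2)  # remembers exponent=2
print(square(5))  # 25, same as power(5, exponent=2)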
Here's the decorator you want, though I would strongly suggest against using this.
from functools import partial

def desired_decorator(**kwargs):
    # make sure there's only one keyword argument
    assert len(kwargs) == 1
    # unpack the single keyword and its values
    keyword, values = (*kwargs.items(),)[0]

    # this is the actual decorator that gets called
    def _make_variants(func):
        for value in values:
            # assign to the globals dictionary
            globals()[f"{func.__name__}_{value}"] = partial(func, **{keyword: value})
        # keep the original function available
        return func

    return _make_variants
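For example, applied inside my_module.py itself (a sketch; note that globals() refers to the module where the decorator is defined, so defining and using it in the same module is the simplest case):
# my_module.py
from scipy import signal
# ... desired_decorator defined as above ...

@desired_decorator(new_size=(8, 16, 32))
def resample(x, new_size):
    return signal.resample(x, new_size)

# from another module:
# from my_module import resample_8, resample_16, resample_32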
My alternative would be to do what Chris suggested, since generating many functions from a decorator is not good for maintainability or clarity.
Here's the code I suggest, but you can use the one above if you want.
from functools import partial
# assign your function things here
resample_8 = partial(resample, new_size=8)
# repeat for other names

Python: call function with default arguments that come before positional arguments

For example, I'd like to do something like: greet(,'hola'), where greet is:
def greet(person='stranger', greeting='hello')
This would help greatly for testing while writing code
When calling a function, you can use the parameter names to make it clear which value goes to which parameter. At the same time, if defaults are provided in the function definition, skipping parameters when calling the function does not raise any errors. So, in short, you can just do this:
def greet(person='stranger', greeting='hello'):
    print('{} {}'.format(greeting, person))
    return

greet(greeting='hola')  # same as greet(person='stranger', greeting='hola')
# prints 'hola stranger'
Note that, as I said above, this would not work if, for example, your function definition was like this:
def greet(person, greeting):
    print('{} {}'.format(greeting, person))
    return
since in this case Python would complain that it does not know what to do with person, because no default is supplied.
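For example, in Python 3 the call fails like this:
def greet(person, greeting):
    print('{} {}'.format(greeting, person))

greet(greeting='hola')
# TypeError: greet() missing 1 required positional argument: 'person'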
And by the way, the problem you are describing is most likely the very reason defaults are used in the first place
Without knowing the other parameters, and knowing only that the parameter you want to change is in the second position, you could use the inspect module to get the function signature and its associated default values.
Then make a copy of the defaults list and change the entry at the index you want:
import inspect

def greet(person='stranger', greeting='hello'):
    print(person, greeting)

# getargspec was removed in Python 3.11; getfullargspec is the drop-in replacement here
argspec = inspect.getfullargspec(greet)
defaults = list(argspec.defaults)
defaults[1] = "hola"  # change the second default parameter
greet(**dict(zip(argspec.args, defaults)))
Assuming that all parameters have default values (otherwise the zip shifts the lists and this fails), that prints:
stranger hola
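If not all parameters have defaults, a variation is to look up the parameter names via inspect.signature and pass only the overridden positions as keyword arguments (a sketch; the helper call_with_overrides is made up here):
import inspect

def call_with_overrides(func, overrides):
    # Call func, overriding parameters by position index while
    # keeping the declared defaults for everything else.
    params = list(inspect.signature(func).parameters)
    return func(**{params[i]: value for i, value in overrides.items()})

def greet(person='stranger', greeting='hello'):
    print(person, greeting)

call_with_overrides(greet, {1: 'hola'})  # prints: stranger hola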

How to pass parameters in a Python Dispatch Table

I am trying to construct a dispatch table the following way:
def run_nn(type=None):
    print(type, 'nn')
    return

def run_svm(type=None):
    print(type, 'svm')
    return

action = {'nn': run_nn(type=None),
          'svm': run_svm(type=None)}
I want the function to be executed only when called with something like:
action.get('nn',type='foo')
With expectation it to print:
foo nn
But it breaks giving:
TypeError: get() takes no keyword arguments
What's the right way to do it?
Furthermore, the two functions run_nn() and run_svm() were executed while building the dictionary, even though I never explicitly called them. I don't want that. How can I avoid it?
You're calling the functions while building the dictionary. You should instead put the function objects in the dict without calling them. And afterwards, get the appropriate function from the dict and call it with the keyword argument.
What you want is:
action = {'nn': run_nn,
          'svm': run_svm}
...
action.get('nn')(type='foo')  # get the function object from the dict, then call it
I'd suggest you use action['nn'] over action.get('nn'), since you're not specifying any default callable in the get call; get returns None when the key is missing and no default is given. A KeyError is much more intuitive than a TypeError: 'NoneType' object is not callable in this scenario.
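For example, with the dict above:
action['missing'](type='foo')       # raises KeyError: 'missing'
action.get('missing')(type='foo')   # raises TypeError: 'NoneType' object is not callable

# or pass a default callable to get():
action.get('missing', run_nn)(type='foo')  # falls back to run_nn, prints: foo nn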
On another note, you can drop those return statements as you aren't actually returning anything. Your function will still return without them.
BTW, I have the feeling your function(s) want to change behavior depending on type (although your type is counter-intuitive as it is always a string). In any case, you may have a look at functools.singledispatch. That'll transform your function(s) into a single-dispatch generic function with the possibility to create several overloaded implementations.
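A minimal sketch of what singledispatch looks like, in case that route fits (the run function and its overloads here are purely illustrative):
from functools import singledispatch

@singledispatch
def run(arg):
    # fallback when no registered implementation matches the argument type
    raise NotImplementedError(f"no handler for {type(arg).__name__}")

@run.register
def _(arg: int):
    print(arg, 'handled as an int')

@run.register
def _(arg: str):
    print(arg, 'handled as a str')

run(42)      # 42 handled as an int
run('foo')   # foo handled as a str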
Finally, although type may read well as an argument name, it shadows the builtin type, so you will run into problems whenever you need to use the builtin inside your function.

Can I use same argument names when passing arguments in Python

Can you please help me, guys? I believe these are pretty easy questions, but I don't want to mess up my assignment. I'm going to have a class in my module, and this class will have a few functions.
I just want to be sure it works all right and isn't an ugly coding practice.
I.e. my first function test_info accepts one parameter, test_code, and returns something, and the second function check_class accepts two parameters, one of which is also called test_code.
Can I use the same argument name, test_code? Is that normal coding practice?
def test_info(self, test_code):
    my_test_code = test_code
    # here we'll be using my_test_code to get info from a txt file and return other info

def check_class(self, test_code, other_arg):
    my_test_code = test_code
    # here some code goes
Also, is it fine to use my_test_code in both functions to hold the argument value, or is it better to use different names like my_test_code_g etc.?
Many thanks
Yes you may.
The two test_code parameters are defined only in the scope of their respective functions and therefore will not interfere with one another.
Same goes for my_test_code.
Read up on how variable scope works in Python for a good start.
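A quick sketch of why the names don't clash (illustrative values only):
def test_info(test_code):
    my_test_code = test_code + '-info'
    return my_test_code

def check_class(test_code, other_arg):
    my_test_code = test_code + '-' + other_arg
    return my_test_code

print(test_info('T1'))         # T1-info
print(check_class('T1', 'A'))  # T1-A  (each function's test_code and my_test_code are separate locals)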
There is no technical reason to resolve this one way or another. But if the variables don't serve exactly the same purpose in both functions, it's confusing for a human reader if they have the same name.
