Don't know where to start with these parameters - python

I'm creating a custom module to help me write Python faster, with less code and cleaner syntax.
At the moment I am creating a function which is a cleaner version of wx.AcceleratorTable.
By the end of this, I expect:
accel_tbl = wx.AcceleratorTable([(wx.ACCEL_CTRL, ord('O'), PUT_ID_HERE),
                                 (wx.ACCEL_CTRL, ord('S'), PUT_ID_HERE)])
self.SetAcceleratorTable(accel_tbl)
to become:
accelerate((wx.ACCEL_CTRL, 'O', PUT_ID_HERE),
           (wx.ACCEL_CTRL, 'S', PUT_ID_HERE))
The only problem is... I don't know where to start. I know how to process information through parameters, but I've never learned how to process multiple tuples with dynamic information inside parameters.
Can someone provide insight? Thank you.
EDIT:
Current code:
## Create a cleaner accelerator
def accelerate(*args):
    accel_tbl = wx.AcceleratorTable(list(args))
    wx.SetAcceleratorTable(accel_tbl)
Current call:
import Sky
Sky.accelerate((wx.ACCEL_CTRL, 's', wx.ID_ANY),
               (wx.ACCEL_CTRL, 't', wx.ID_ANY))

Add this method to your custom wx.Window class:
def accelerate(self, *args):
    accel_tbl = wx.AcceleratorTable(args)
    self.SetAcceleratorTable(accel_tbl)
Then invoke it as follows:
win = YourCustomWindowClass(PUT_WINDOW_ARGUMENTS_HERE)
win.accelerate((wx.ACCEL_CTRL,'O',PUT_ID_HERE),(wx.ACCEL_CTRL,'S',PUT_ID_HERE))
Alternatively, you can define it as a function taking a wx.Window argument as follows:
def accelerate(win, *args):
    accel_tbl = wx.AcceleratorTable(args)
    win.SetAcceleratorTable(accel_tbl)
The *args parameter collects all the positional arguments into a tuple. If AcceleratorTable really requires a list, you can pass list(args) instead (i.e. accel_tbl = wx.AcceleratorTable(list(args))).
You can learn about *args here.
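Note that the original table uses ord('O') while the desired call passes 'O', so the wrapper may also want to convert characters to keycodes itself. A minimal sketch of that tuple handling, with the wx calls replaced by a plain return so it runs anywhere (modifier and ID values below are placeholders):

```python
def accelerate(*entries):
    # Each entry is (modifier, key, command_id); convert a one-character
    # string key to its keycode, leaving integer keycodes untouched.
    table = [(mod, ord(key) if isinstance(key, str) else key, cmd)
             for mod, key, cmd in entries]
    # In the real wrapper this would become:
    #   self.SetAcceleratorTable(wx.AcceleratorTable(table))
    return table

rows = accelerate((1, 'O', 100), (1, 'S', 101))
```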

Related

A partial wrapper of a python function

Given the following code,
def myfunc(a=None, b=None, c=None, **kw):
    func(arga=a, argb=b, **kw)
    # do something with c

def func(arga=None, argb=None, argc=None):
    ...
Can I replicate part of the signature of func, namely the missing args, without imitating every missing arg of func manually?
To put it more simply, I want argc to appear in the keywords of myfunc, so that myfunc? would be different: it would show myfunc(a=None, b=None, c=None, argc=None).
functools.wraps allows wrapping a complete function, and functools.partial can subtract arguments, but I don't know how to add them.
Yes, it is possible, though not trivial.
Python's introspection capabilities allow you to check all the parameters the target function declares, and it is possible to build a new function programmatically that includes those parameters automatically.
I have written this for a project of mine, and exposed the relevant code in my answer here: Signature-changing decorator: properly documenting additional argument
I will not mark this as a duplicate, since the other question is more concerned with documenting the new function.
If you want to give it a try with your code, maybe with something simpler, check the inspect.signature call from the standard library, which lets you discover everything about the parameters and default arguments of the target function.
Building a new function from this information is a bit trickier, but possible: one can always resort to an exec call, which can create a new function from a string template. The answer linked above follows this line.
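For instance, inspect.signature makes the target's parameter names and defaults available directly:

```python
import inspect

def func(arga=None, argb=None, argc=None):
    pass

sig = inspect.signature(func)
names = list(sig.parameters)                  # parameter names in declaration order
argc_default = sig.parameters['argc'].default
```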
I'm not sure what is being asked here either, but I have an alternative to functools.partial that might be adapted.
Edit: The difference here from partial is that the mkcall argument is a string rather than a series of arguments. This string can then be formatted and analysed according to whatever requirements apply before the target function is called.
def mkcall(fs, globals=None, locals=None):
    class func:
        def __init__(f, fcnm=None, params=None, globals=None, locals=None):
            f.nm = fcnm
            f.pm = params
            f.globals = globals
            f.locals = locals
        def __call__(f):
            s = f.nm + f.pm
            eval(s, f.globals, f.locals)
    if '(' in fs:
        funcn, lbr, r = fs.partition('(')
        tp = lbr + r
        newf = func(funcn, tp, globals, locals)
        callf = newf.__call__
    else:
        callf = eval(fs, globals, locals)
    return callf
# call examples:
# mkcall("func(arg)")
# mkcall("func")
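For comparison, functools.partial (which the question mentions) achieves the same deferral when you already have the arguments as values rather than as a string:

```python
from functools import partial

def greet(name):
    return 'hello ' + name

# partial binds the argument now but defers the call until invoked
deferred = partial(greet, 'world')
result = deferred()
```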

How might I use the same step in the same scenario, but with different parameters in pytest-bdd?

Assume I have a scenario similar to this:
Scenario Outline: Example scenario
    Given the subprocess is running
    When I generate the input
    And I add <argument1> to the input
    And I add <argument2> to the input
    And this input is passed to the subprocess
    Then the output should match the <output> for <argument1> and <argument2>
I'd very much like to reuse the 'when' step as, e.g., And I add <argument> to the input, but I don't want to use an Examples table, as I wish the fixtures to be dynamically generated in the step definition/conftest file. I'm currently using @pytest.mark.parametrize to parametrize the scenario outlines like so:
import pytest
from pytest_bdd import scenario
from functools import partial
from some_lib import test_data, utils

@pytest.fixture(scope='module')
def context():
    return {}

scenario = partial(scenario, '../features/example.feature')

@pytest.mark.parametrize(
    ['argument1', 'argument2'],
    [(test_data.TEST_ARGUMENT[1], test_data.TEST_ARGUMENT[2])],
)
@scenario('Example scenario')
def test_example_scenario(context, argument1, argument2):
    pass
I would like to be able to reuse the same step definition in the same scenario with the different arguments somehow, e.g.
@when('I add <argument> to the input')
def add_argument(context, argument):
    context['input'] = utils.add_argument(context['input'], argument)
rather than having to have two step definitions, e.g.
@when('I add <argument1> to the input')
def add_argument(context, argument1):
    context['input'] = utils.add_argument(context['input'], argument1)

@when('I add <argument2> to the input')
def add_argument(context, argument2):
    context['input'] = utils.add_argument(context['input'], argument2)
The pytest-bdd documentation seems to suggest this is possible, but I can't quite wrap my head around how I might accomplish this without using example tables.
Often it’s possible to reuse steps giving them a parameter(s). This allows to have single implementation and multiple use, so less code. Also opens the possibility to use same step twice in single scenario and with different arguments! [sic] (Emphasis my own)
Does anyone have any ideas on how I might accomplish this?
Thank you for your time as always!
I think the pytest-bdd documentation is rather suggesting re-use of a step due to a variable in the step definition instead of a hard-coded value... so I think the documentation does not give you a solution for your problem.
Anyway, there is a solution that I use, which is getting the value of the step variable dynamically. pytest-bdd will create a pytest fixture for every variable you define in your steps, so you can obtain the value of a fixture by calling request.getfixturevalue(name_of_fixture), as long as you know the name of the fixture.
For your case I would use parsers.parse() for the step definitions, so that the variables argument1 and argument2 hold the names of the fixtures instead of their values.
Example
@when(parsers.parse('I add {argument1} to the input'))
def add_argument(request, context, argument1):
    # Remove angle brackets, because they are not part of the fixture name
    argument1 = argument1.replace('<', '').replace('>', '')
    argument_value = request.getfixturevalue(argument1)
    context['input'] = utils.add_argument(context['input'], argument_value)
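Outside of pytest, the lookup in that step boils down to: strip the angle brackets, then resolve the remaining name against a registry. A sketch with a plain dict standing in for pytest's fixture store (the dict and its contents are illustrative, not part of pytest-bdd):

```python
# hypothetical stand-in for pytest's fixture registry
fixtures = {'argument1': 42}

def resolve(step_variable):
    # Remove angle brackets, because they are not part of the fixture name
    name = step_variable.replace('<', '').replace('>', '')
    return fixtures[name]

value = resolve('<argument1>')
```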

Suspending function calls in Python for passing later (functional paradigm)

I'm writing a Python command-line program which has some interdependent options; I would like the user to be able to enter the options in whichever order they please.
Currently I am using the getopt module to parse the command-line options; unfortunately, it parses them in order. I've thrown together a system of boolean flags to postpone the processing of certain command-line arguments until the ones they depend on have been processed, but then I had the idea of using a priority queue of function calls which would execute after all the command-line options are parsed.
I know that Python can store functions in variables, but my attempt seems to call the function at assignment time.
For example:
help = obj.PrintHelp()
heapq.heappush(commandQ, (0, help))
Will print the help dialog immediately. How would I go about implementing my code so that it won't call PrintHelp() immediately upon assigning it a name?
EDIT:
Oh, I just realized I was pushing into a queue called help; that's my mistake.
Thanks for the tip on removing the () after PrintHelp.
What if I now want to call a function that requires more than the self argument?
myFun = obj.parseFile(path)
heapq.heappush(commandQ, (1, myFun))
Would I just make the tuple bigger and take the command line argument?
If you heappush like this:
myFun = obj.parseFile
heapq.heappush(commandQ, (1, myFun, path))
then to later call the function, you could do this:
while commandQ:
    x = heapq.heappop(commandQ)
    func = x[1]
    args = x[2:]
    func(*args)
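A self-contained version of that queue-then-drain pattern, with two stand-in handlers appending to a list so the effect is visible (the handler names are illustrative):

```python
import heapq

results = []

def set_verbosity(level):
    results.append('verbosity=%d' % level)

def load_config(path):
    results.append('loading ' + path)

commandQ = []
# the first tuple item is the priority; the rest are the function and its args
heapq.heappush(commandQ, (1, load_config, 'app.cfg'))
heapq.heappush(commandQ, (0, set_verbosity, 2))

while commandQ:
    x = heapq.heappop(commandQ)
    func = x[1]
    args = x[2:]
    func(*args)
```

Note that if two entries share a priority, heapq will try to compare the functions themselves, which raises TypeError in Python 3; distinct priorities (or an extra tie-breaking counter in the tuple) avoid that.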
Use
help = obj.PrintHelp
without the parentheses. This makes help reference the function.
Later, you can call the function with help().
Note also (if I understand your situation correctly), you could just use the optparse or (if you have Python2.7 or better) argparse modules in the standard library to handle the command-line options in any order.
PS. help is a built-in function in Python. Naming a variable help overrides the built-in, making it difficult (though not impossible) to access the built-in. Generally, it's a good idea not to overwrite the names of built-ins.
Instead of using getopt, I would suggest optparse (or argparse, if you are using a newer Python version): most probably, you will find everything you need already implemented.
That said, in your example code you are actually calling the function, whereas you should simply store a reference to it:
help = obj.PrintHelp
heapq.heappush(commandQ, (0, help))
If you want to store a complete function call in Python, you can do it one of two ways:
# option 1: hold the parameters separately
# I've also skipped saving the function in a 'help' variable
heapq.heappush(commandQ, (0, obj.PrintHelp, param1, param2))

# later:
command = heapq.heappop(commandQ)
command[1](*command[2:])  # call the function (second item) with args (remainder of items)
Alternatively, you can use a helper to package the arguments up via lambda:
# option 2: build a no-argument anonymous function that knows what arguments
# to give the real one

# module scope
def makeCall(func, *args):
    return lambda: func(*args)

# now you can:
help = makeCall(obj.PrintHelp, param1, param2)
heapq.heappush(commandQ, (0, help))
If you need keyword arguments, let me know and I'll edit to take care of those too.
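A keyword-aware variant of makeCall is a one-line change; the lambda just captures **kwargs as well (the resize function below is only an illustration):

```python
def makeCall(func, *args, **kwargs):
    return lambda: func(*args, **kwargs)

def resize(image, width=None, height=None):
    return (image, width, height)

deferred = makeCall(resize, 'photo.png', width=640, height=480)
result = deferred()  # nothing runs until this line
```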

Python - Call a function in a module dynamically

I'm pretty new to Python and I have a situation where I have a variable representing a function inside of a module and I'm wondering how to call it dynamically. I have filters.py:
def scale(image, width, height):
    pass
And then in another script I have something like:
import filters

def process_images(method='scale', options):
    filters[method](**options)
... but that doesn't work obviously. If someone could fill me in on the proper way to do this, or let me know if there is a better way to pass around functions as parameters that would be awesome.
You need the built-in getattr:
getattr(filters, method)(**options)
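getattr works on any module object; a quick illustration with the standard-library math module standing in for filters:

```python
import math

method = 'sqrt'             # function name as a string
func = getattr(math, method)  # look the function up on the module
result = func(9)
```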
To avoid the problem, you could pass the function directly, instead of "by name":
def process_images(method=filters.scale, **options):
    method(**options)
If you have a special reason to use a string instead, you can use getattr as suggested by SilentGhost.
import filters

def process_images(function=filters.scale, **options):
    function(**options)
This can then be called like, for example:
process_images(filters.rotate, **rotate_options)
Note that having a default function arg doesn't seem like a good idea -- it's not intuitively obvious why filters.scale is the default out of several possible image-processing operations.

How to avoid excessive parameter passing?

I am developing a medium-size program in Python spread across 5 modules. The program accepts command-line arguments using OptionParser in the main module, e.g. main.py. These options are later used to determine how methods in other modules behave (e.g. a.py, b.py). As I extend the ability for the user to customise the behaviour of the program, I find that I end up requiring a user-defined parameter in a method in a.py that is not directly called by main.py, but is instead called by another method in a.py:
main.py:
import a
p = some_command_line_argument_value
a.meth1(p)
a.py:

def meth1(p):
    # some code
    res = meth2(p)
    # some more code w/ res

def meth2(p):
    # do something with p
This excessive parameter passing seems wasteful and wrong, but as hard as I try I cannot think of a design pattern that solves this problem. While I had some formal CS education (a minor in CS during my B.Sc.), I've only really come to appreciate good coding practices since I started using Python. Please help me become a better programmer!
Create objects of types relevant to your program, and store the command line options relevant to each in them. Example:
import WidgetFrobnosticator
f = WidgetFrobnosticator()
f.allow_concave_widgets = option_allow_concave_widgets
f.respect_weasel_pins = option_respect_weasel_pins
# Now the methods of WidgetFrobnosticator have access to your command-line parameters,
# in a way that's not dependent on the input format.
import PlatypusFactory
p = PlatypusFactory()
p.allow_parthenogenesis = option_allow_parthenogenesis
p.max_population = option_max_population
# The platypus factory knows about its own options, but not those of the WidgetFrobnosticator
# or vice versa. This makes each class easier to read and implement.
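The same idea with real command-line parsing: argparse fills a namespace, and each object copies only the options it cares about (the class and option names here are illustrative):

```python
import argparse

class Exporter:
    # Knows only about its own options; methods read self.max_rows directly.
    def __init__(self):
        self.max_rows = None

parser = argparse.ArgumentParser()
parser.add_argument('--max-rows', type=int, default=100)
opts = parser.parse_args(['--max-rows', '50'])  # stand-in for sys.argv

e = Exporter()
e.max_rows = opts.max_rows
# methods of Exporter now see the option without it being threaded through calls
```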
Maybe you should organize your code more into classes and objects? As I was writing this, Jimmy posted a class-instance-based answer, so here is a pure class-based one. This would be most useful if you only ever want a single behaviour; if there is any chance you might want different defaults some of the time, you should use ordinary object-oriented programming in Python, i.e. pass around class instances with the property p set on the instance, not the class.
class Aclass(object):
    p = None

    @classmethod
    def init_p(cls, value):
        cls.p = value

    @classmethod
    def meth1(cls):
        # some code
        res = cls.meth2()
        # some more code w/ res

    @classmethod
    def meth2(cls):
        # do something with cls.p
        pass
from a import Aclass as ac
ac.init_p(some_command_line_argument_value)
ac.meth1()
ac.meth2()
If "a" is a real object and not just a set of independent helper methods, you can create an "p" member variable in "a" and set it when you instantiate an "a" object. Then your main class will not need to pass "p" into meth1 and meth2 once "a" has been instantiated.
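A minimal sketch of that instance-based approach (class and method names are illustrative):

```python
class A:
    def __init__(self, p):
        self.p = p  # set once when the object is created

    def meth1(self):
        # no need to pass p along; meth2 reads it from self
        return self.meth2() * 2

    def meth2(self):
        return self.p + 1

a = A(10)
result = a.meth1()
```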
[Caution: my answer isn't specific to python.]
I remember that Code Complete called this kind of parameter a "tramp parameter". Googling for "tramp parameter" doesn't return many results, however.
Some alternatives to tramp parameters might include:
Put the data in a global variable
Put the data in a static variable of a class (similar to global data)
Put the data in an instance variable of a class
Pseudo-global variable: hidden behind a singleton, or some dependency injection mechanism
Personally, I don't mind a tramp parameter as long as there's no more than one; i.e. your example is OK for me, but I wouldn't like ...
import a
p1 = some_command_line_argument_value
p2 = another_command_line_argument_value
p3 = a_further_command_line_argument_value
a.meth1(p1, p2, p3)
... instead I'd prefer ...
import a
p = several_command_line_argument_values
a.meth1(p)
... because if meth2 decides that it wants more data than before, I'd prefer if it could extract this extra data from the original parameter which it's already being passed, so that I don't need to edit meth1.
With objects, parameter lists should normally be very small, since most appropriate information is a property of the object itself. The standard way to handle this is to configure the object properties and then call the appropriate methods of that object. In this case set p as an attribute of a. Your meth2 should also complain if p is not set.
Your example is reminiscent of the code smell Message Chains. You may find the corresponding refactoring, Hide Delegate, informative.
