I am using Python RPyC to communicate between two machines. Since the link may be prone to errors, I would like a generic wrapper function which takes a remote function name plus that function's parameters as its input, does some status checking, calls the function with the parameters, does a little more status checking, and then returns the result of the function call. The wrapper should have no knowledge of the function, its parameters/parameter types or the number of them, or the return value for that matter; the user has to get that right. It should just pass them through transparently.
I understand the getattr(conn.root, function)() pattern for calling the function, but my Python expertise runs out at populating the parameters. I have read various posts on the use of *args and **kwargs, in particular this one, which suggests that it is either difficult or impossible to do what I want. Is that correct and, if so, might there be a scheme which would work if I, say, ensured that all the function parameters were keyword parameters?
I do own both ends of this interface (the caller and the called) so I could arrange to dictionary-ise all the function parameters but I'd rather not make my API too peculiar if I could possibly avoid it.
Edit: the thing being called, at the remote end of the link, is a class with very ordinary methods, e.g.:
def exposed_a(self)
def exposed_b(self, thing1)
def exposed_c(self, thing1=None)
def exposed_d(self, thing1=DEFAULT_VALUE1, thing2=None)
def exposed_e(self, thing1, thing2, thing3=DEFAULT_VALUE1, thing4=None)
def exposed_f(self, thing1=None, thing2=None)
...where the types of each argument (and the return values) could be string, dict, number or list.
And it is indeed trivial; my Google fu had simply failed me in finding the answer. In the hope of helping anyone else who is inexperienced in Python and is having a Google bad day:
One simply takes *args and **kwargs as parameters and passes them directly on, with the asterisks attached. So in my case, to do my RPyC pass-through, where conn is the RPyC connection:
def my_passthru(conn, function_name, *function_args, **function_kwargs):
    # Do a check of something or other here
    return_value = getattr(conn.root, function_name)(*function_args, **function_kwargs)
    # Do another check here
    return return_value
Then, for example, a call to my exposed_e() method above might be:
return_value = my_passthru(conn, "e", thing1, thing2, thing3)
(the exposed_ prefix being added automagically by RPyC in this case).
And of course one could put a try: / except ConnectionRefusedError: around the getattr() call in my_passthru() to generically catch the case where the connection has dropped underneath RPyC, which was my main purpose.
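For example, a minimal sketch of that combination (RemoteCallError is a made-up exception name here; how you report the failure is up to you):

class RemoteCallError(Exception):
    """Hypothetical exception used to signal that the remote call failed."""
    pass


def my_passthru(conn, function_name, *function_args, **function_kwargs):
    # Do a check of something or other here
    try:
        return_value = getattr(conn.root, function_name)(*function_args, **function_kwargs)
    except ConnectionRefusedError:
        # The connection has dropped underneath RPyC; depending on your setup,
        # other exceptions (e.g. EOFError) may also be worth catching here.
        raise RemoteCallError("remote call to %s failed" % function_name)
    # Do another check here
    return return_value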
This is my first time building out unit tests, and I'm not quite sure how to proceed here. Here's the function I'd like to test; it's a method in a class that accepts one argument, url, and returns one string, task_id:
def url_request(self, url):
    conn = self.endpoint_request()
    authorization = conn.authorization
    response = requests.get(url, authorization)
    return response["task_id"]
The method starts out by calling another method within the same class to obtain a token to connect to an API endpoint. Should I be mocking the output of that call (self.endpoint_request())?
If I do have to mock it, and my test function looks like this, how do I pass a fake token/auth endpoint_request response?
@patch("common.DataGetter.endpoint_request")
def test_url_request(mock_endpoint_request):
    mock_endpoint_request.return_value = {"Auth": "123456"}
    # How do I pass the fake token/auth to this?
    task_id = DataGetter.url_request(url)
The code you have shown is strongly dominated by interactions, which means that there will most likely be no bugs to find with unit testing: the potential bugs are on the interaction level. You access conn.authorization - but is this the proper member? And does it already have the proper representation for the way you need it further on? Is requests.get the right method for the job? Is the argument order as you expect it? Is the return value as you expect it? Is task_id spelled correctly?
These are (some of) the potential bugs in your code. But with unit testing you will not be able to find them: when you replace the depended-on components with mocks (which you create or configure yourself), your unit tests will just succeed. Let's assume you have a misconception about the return value of requests.get, namely that task_id is spelled wrongly and should really be taskId. If you mock requests.get, you would implement the mock based on your own misconception; that is, your mock would return a map with the (misspelled) key task_id. The unit test would then succeed despite the bug.
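As an illustration, a hypothetical version of such a test (the patch targets and the class usage are assumptions based on the snippet above): the mock simply repeats the author's assumption about the key name, so the assertion passes whether or not the real API actually spells it task_id.

from unittest.mock import Mock, patch

from common import DataGetter  # hypothetical module layout


@patch("common.requests.get")
@patch("common.DataGetter.endpoint_request")
def test_url_request(mock_endpoint_request, mock_get):
    mock_endpoint_request.return_value = Mock(authorization={"Auth": "123456"})
    # The mock encodes the same assumption as the code under test:
    mock_get.return_value = {"task_id": "42"}

    task_id = DataGetter().url_request("http://example.com/resource")

    assert task_id == "42"  # passes even if the real service returns "taskId"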
You will only find that bug with integration testing, where you bring your component and the depended-on components together. Only then can you test the assumptions made in your component against the reality of the other components.
I am looking for a way in Python to skip certain parts of the code inside a function, but only when the output of the function is assigned to a variable. If the function is run without any assignment, then it should run everything inside it.
Something like this:
def function():
    print('a')
    return 'a'

function()
A = function()
The first time I call function() it should display a on the screen, while the second time nothing should be printed and the returned value should just be stored in A.
I have not tried anything since I am kind of new to Python, but I was imagining it would be something like the if __name__=='__main__': way of checking if a script is being used as a module or run directly.
I don't think such a behaviour could be achieved in Python, because within the scope of the function call there is no indication of what you will do with the returned value.
You will have to give the function an argument that tells it to skip/stop, with a default value to ease the call.
def call_and_skip(skip_instructions=False):
    if not skip_instructions:
        call_stuff_or_not()
    call_everytime()

call_and_skip()
# will not skip inside instructions
a_variable = call_and_skip(skip_instructions=True)
# will skip inside instructions
As already mentioned in the comments, what you're asking for is not technically possible - a function has no (and cannot have any) knowledge of what the calling code will do with the return value.
For a simple case like your example snippet, the obvious solution is to just remove the print call from within the function and leave it to the caller, i.e.:
def fun():
    return 'a'

print(fun())
Now I assume your real code is a bit more complex than this, so such a simple solution would not work. If that's the case, the solution is to split the original function into several distinct ones and let the caller choose which parts it wants to call. If you have complex state (local variables) that needs to be shared between the different parts, you can wrap the whole thing in a class, turning the sub-functions into methods and storing those variables as instance attributes.
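A minimal sketch of that approach (all names here are made up for illustration):

class Task:
    def __init__(self):
        # state shared between the parts lives on the instance
        self.value = 'a'

    def report(self):
        # the part that prints
        print(self.value)

    def compute(self):
        # the part that produces the result
        return self.value


task = Task()
task.report()        # caller explicitly asks for the printing part
A = task.compute()   # caller only wants the value, nothing is printed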
I am trying to make a python library that allows me to make custom tkinter widgets that are more aesthetically pleasing than the built-in ones. However, I have run into a problem while defining a few functions.
The problem stems from the difference between functions like append() and str(). While the append function works as follows...
somelist = ['a', 'b', 'c']
somelist.append('d')
The str() function works like this...
somenumber = 99
somenumber_text = str(somenumber)
You 'call upon' the append function by (1) stating the list that you are modifying (somelist), (2) adding a period, and (3) actually naming the append function itself (append()). Meanwhile, you 'call upon' the str function by placing a positional argument (somenumber) within its argument area. I have no idea why there is this difference, and more importantly, whether there is a way to specify which of these two ways is used to 'call upon' a function that I define myself?
Thanks...
In Python, a function is a group of related statements that perform a specific task.
Functions help break our program into smaller and modular chunks. As our program grows larger and larger, functions make it more organized and manageable.
Furthermore, it avoids repetition and makes code reusable.
Syntax of Function
def function_name(parameters):
    """docstring"""
    statement(s)
Shown above is a function definition, which consists of the following components.
The keyword def marks the start of the function header.
A function name to uniquely identify it. Function naming follows the same rules as writing identifiers in Python.
Parameters (arguments) through which we pass values to a function. They are optional.
A colon (:) to mark the end of the function header.
An optional documentation string (docstring) to describe what the function does.
One or more valid Python statements that make up the function body. Statements must have the same indentation level (usually 4 spaces).
An optional return statement to return a value from the function.
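For instance, a small function that follows this syntax, plus the same logic attached to a class (all names here are made up), which also illustrates the difference asked about above: a plain function is called as double(x), while a method is called with dot notation on an instance:

def double(number):
    """Return twice the given number."""
    return number * 2


class Doubler:
    def double(self, number):
        """The same logic, but as a method of a class."""
        return number * 2


print(double(21))      # 42 - plain function call
d = Doubler()
print(d.double(21))    # 42 - method call on an instance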
You really don't need to create a class, or any methods. You can make a plain-old function that's similar to bind, and just take the widget to bind as a normal parameter. For example:
def bind_multi(widget, callback, *events):
    for event in events:
        widget.bind(event, callback)
That means you have to call this function as bind_multi(mybutton, callback, event1, event2) instead of mybutton.bind_multi(callback, event1, event2), but there's nothing wrong with that.
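For example, a rough usage sketch of the bind_multi function above with a real widget (the widget, callback and event names are just placeholders):

import tkinter as tk


def on_activate(event):
    # placeholder callback; a real one would do something useful
    print("activated by", event)


root = tk.Tk()
mybutton = tk.Button(root, text="Click me")
mybutton.pack()

# one call registers the same callback for several events
bind_multi(mybutton, on_activate, "<Button-1>", "<Return>")

root.mainloop()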
I have a setting where event handlers are always functions taking a single event argument.
But more often than not, I find myself writing handlers that don't use any of the event information. So I constantly write handlers of the form:
def handler(_):
    # react
In order to discard the argument.
But I wish I didn't have to, as sometimes I want to reuse handlers as general action functions that take no arguments, and sometimes I have existing zero-argument functions that I want to use as handlers.
My current solution is wrapping the function using a lambda:
def handler():
    # react

event.addHandler(lambda _: handler())
But that seems wrong for other reasons.
My intuitive understanding of a lambda is that it is first and foremost a description of a return value, and event handlers return nothing. I feel lambdas are meant to express pure functions, and here I'm using them only to cause side effects.
Another solution would be a general decorator discarding all arguments.
def discardArgs(func):
    def f(*args):
        return func()
    return f
But I need this in a great many places, and it seems silly having to import such a utility to every script for something so simple.
Is there a particularly standard or "pythonic" way of wrapping a function to discard all arguments?
Use *args:
def handler(*args):
    # react
Then handler can take 0 arguments, or any number of positional arguments.
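A tiny self-contained illustration of that (the body is just a placeholder):

def refresh(*args):
    # react; any event argument passed in is simply absorbed and ignored
    print("refreshing")


refresh()              # used directly as a zero-argument action
refresh("some event")  # used as a handler by an event system that passes one argument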
I was working with generator functions and private functions of a class. I am wondering:
Why, when yielding (which in my case was by accident) in __someFunc, does this function just appear not to be called from within __someGenerator? Also, what is the terminology I should use when referring to these aspects of the language?
Can the python interpreter warn of such instances?
Below is an example snippet of my scenario.
class someClass():
    def __init__(self):
        pass

    # Copy and paste mistake where yield ended up in a regular function
    def __someFunc(self):
        print "hello"
        #yield True #if yielding in this function it isn't called

    def __someGenerator(self):
        for i in range(0, 10):
            self.__someFunc()
            yield True
        yield False

    def someMethod(self):
        func = self.__someGenerator()
        while func.next():
            print "next"

sc = someClass()
sc.someMethod()
I got burned on this and spent some time trying to figure out why a function just wasn't getting called. I finally discovered I was yielding in a function I didn't mean to.
A "generator" isn't so much a language feature, as a name for functions that "yield." Yielding is pretty much always legal. There's not really any way for Python to know that you didn't "mean" to yield from some function.
This PEP http://www.python.org/dev/peps/pep-0255/ talks about generators, and may help you understand the background better.
I sympathize with your experience, but compilers can't figure out what you "meant for them to do", only what you actually told them to do.
I'll try to answer the first of your questions.
A regular function, when called like this:
val = func()
executes the statements in its body until it ends or a return statement is reached. Then the return value of the function is assigned to val.
If the compiler recognizes the function as a generator rather than a regular function (it does that by looking for yield statements inside the function -- if there's at least one, it's a generator), calling it the same way as above has different consequences. Upon calling func(), no code inside the function is executed, and a special <generator> value is assigned to val. Then, the first time you call val.next(), the actual statements of func are executed until a yield or return is encountered, at which point execution of the function stops, the yielded value is returned, and the generator waits for another call to val.next().
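A quick interactive illustration of that difference (Python 2 syntax, to match the snippet above; the function is made up):

>>> def gen():
...     print "inside gen"
...     yield 1
...     yield 2
...
>>> val = gen()   # nothing is printed yet; val is just a generator object
>>> val
<generator object gen at 0x...>
>>> val.next()    # the body now runs up to the first yield
inside gen
1
>>> val.next()    # resumes after the first yield
2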
That's why, in your example, the function __someFunc didn't print "hello" -- its statements were not executed, because you hadn't called self.__someFunc().next(), only self.__someFunc().
Unfortunately, I'm pretty sure there's no built-in warning mechanism for programming errors like yours.
Python doesn't know whether you want to create a generator object for later iteration or call a function. But Python isn't your only tool for seeing what's going on with your code. If you're using an editor or IDE that allows customized syntax highlighting, you can tell it to give the yield keyword a different color, or even a bright background, which will at least help you find your errors more quickly. In vim, for example, you might do:
:syntax keyword Yield yield
:highlight Yield ctermbg=yellow guibg=yellow ctermfg=blue guifg=blue
Those are horrendous colors, by the way. I recommend picking something better. Another option, if your editor or IDE won't cooperate, is to set up a custom rule in a code checker like pylint. An example from pylint's source tarball:
from pylint.interfaces import IRawChecker
from pylint.checkers import BaseChecker


class MyRawChecker(BaseChecker):
    """check for line continuations with '\' instead of using triple
    quoted string or parenthesis
    """

    __implements__ = IRawChecker

    name = 'custom_raw'
    msgs = {'W9901': ('use \\ for line continuation',
                      ('Used when a \\ is used for a line continuation instead'
                       ' of using triple quoted string or parenthesis.')),
            }
    options = ()

    def process_module(self, stream):
        """process a module
        the module's content is accessible via the stream object
        """
        for (lineno, line) in enumerate(stream):
            if line.rstrip().endswith('\\'):
                self.add_message('W9901', line=lineno)


def register(linter):
    """required method to auto register this checker"""
    linter.register_checker(MyRawChecker(linter))
The pylint manual is available here: http://www.logilab.org/card/pylint_manual
And vim's syntax documentation is here: http://www.vim.org/htmldoc/syntax.html
Because the return keyword is applicable in both generator functions and regular functions, there's nothing you could possibly check (as @Christopher mentions). The return keyword in a generator indicates that a StopIteration exception should be raised.
If you try to return with a value from within a generator (which doesn't make sense, since return just means "stop iteration"), the compiler will complain at compile-time -- this may catch some copy-and-paste mistakes:
>>> def foo():
... yield 12
... return 15
...
File "<stdin>", line 3
SyntaxError: 'return' with argument inside generator
I personally just advise against copy and paste programming. :-)
From the PEP:
Note that return means "I'm done, and have nothing interesting to
return", for both generator functions and non-generator functions.
We do this.
Generators have names with "generate" or "gen" in their name. They will have a yield statement in the body. Pretty easy to check visually, since no method is much over 20 lines of code.
Other methods don't have "gen" in their name.
Also, we do not ever use __ (double underscore) names under any circumstances. 32,000 lines of code. No __ names.
The "generator vs. non-generator" method function is entirely a design question. What did the programmer "intend" to happen. The compiler can't easily validate your intent, it can only validate what you actually typed.