Python3 variable passing issue

Example code; try to ignore that it seems unnecessarily overcomplicated. This is way dumbed down from the actual code, but it mimics the flow exactly.
def setup():
    print("Setting up...")
    do_something()

def do_something():
    task = str(input("Enter a task to do: "))
    try:
        print("Doing {}...".format(task))
    except:
        print("Failed to do {}...".format(task))
    finally:
        return task
def menu_2(choice):
    print("You chose {}".format(choice))

def menu_1():
    choice = int(input("Choose 1 or 2: "))
    if choice == 1:
        setup()
        menu_2(task)

menu_1()
However, the program raises "UnboundLocalError: local variable 'task' referenced before assignment".
Why is do_something() not returning the variable task to the if statement within menu_1()? Once setup() (and subsequently do_something()) finishes running, shouldn't do_something()'s returned value still be available inside the if statement, since that block isn't done yet?

The flow is:
menu_1() => menu_2(task)
task has never been assigned in the scope of menu_1(), so there is no value for menu_2(task) to use.
You may have intended to do this instead:
def setup():
    print("Setting up...")
    return do_something()

.....

# in menu_1():
menu_2(setup())
Notice that because setup() now RETURNS something, the caller can actually use that return value.

setup() and menu_1() should be changed like this:
def setup():
    print("Setting up...")
    return do_something()

def menu_1():
    choice = int(input("Choose 1 or 2: "))
    if choice == 1:
        task = setup()
        menu_2(task)
Explanation:
menu_1() calls setup(), and setup() calls do_something(). do_something() returns the value of task, but your setup() does not pass that value on to menu_1(); setup() has to return it, and menu_1() then has to store the returned value in a variable named task before passing it to menu_2().
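To make the chain concrete, here is a minimal, self-contained sketch (with made-up function names, not your actual ones) showing that a return value only reaches the top if every intermediate function returns it and the caller stores it:
def inner():
    return "some task"   # produces the value

def outer():
    return inner()       # must pass the value on, otherwise outer() returns None

def caller():
    task = outer()       # the caller has to capture the returned value
    print("Got:", task)  # prints: Got: some task

caller()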


Function prints none instead of printing what I want it to print in decorators in Python

I am studying decorators in Python. I was trying to use decorators with arguments. I'm having a problem with the decorators: I defined two inner functions in the outer decorator function, and it returns None when I use it as below:
from threading import Thread

def prefix(write: bool = False):
    def thread(func):
        def wrapper(*args, **kwargs):
            t1 = Thread(target=func, args=args, kwargs=kwargs)
            t1.start()
            if write:
                print("write parameter is true.")
        return wrapper
    return thread

@prefix(write=True)
def something(x):
    return x + x

print(something(5))
As you see, I defined two different functions named prefix and something. If the write parameter is true, it prints the string. But the decorated something function prints None instead of the result of 5 + 5.
What's wrong?
Well, your wrapper() function doesn't have a return statement, so it will always return None.
Furthermore, how would you expect it to print 5 + 5 (or rather, the result thereof) when that may not have been computed yet, considering you're starting a new thread to do that and never do anything with the return value of func at all?
IOW, if we expand your example a bit:
import time
from threading import Thread

def prefix(write: bool = False):
    def thread(func):  # <- this function replaces `something`
        def wrapper(*args, **kwargs):
            t1 = Thread(target=func, args=args, kwargs=kwargs)
            t1.start()
            if write:
                print("write parameter is true.")
            return "hernekeitto"
        return wrapper
    return thread

@prefix(write=True)
def something(x):
    print("Computing, computing...")
    time.sleep(0.5)
    print("Hmm, hmm, hmm...")
    time.sleep(0.5)
    print("Okay, got it!")
    return x + x

value = something(9)
print("The value is:", value)
This will print out
Computing, computing...
write parameter is true.
The value is: hernekeitto
Hmm, hmm, hmm...
Okay, got it!
As you can see, the thread's first print() happens first, then the write print, then the value print, and then the rest of what happens in the thread. And as you can see, we only know what x + x is after "Okay, got it!", so there's no way you could have returned that out of wrapper() where "hernekeitto" is returned.
See futures (or the equivalent JavaScript concept promises) for a "value that's not yet ready":
import time
from concurrent.futures import Future
from threading import Thread

def in_future(func):
    def wrapper(*args, **kwargs):
        fut = Future()

        def func_wrapper():
            # Wraps the threaded function to resolve the future.
            try:
                fut.set_result(func(*args, **kwargs))
            except Exception as e:
                fut.set_exception(e)

        t1 = Thread(target=func_wrapper)
        t1.start()
        return fut
    return wrapper

@in_future
def something(x):
    print("Computing, computing...")
    time.sleep(0.5)
    print("Hmm, hmm, hmm...")
    time.sleep(0.5)
    print("Okay, got it!")
    return x + x

value_fut = something(9)
print("The value is:", value_fut)
print("Waiting for it to be done...")
print("Here it is!", value_fut.result())
This prints out
Computing, computing...
The value is: <Future at 0x... state=pending>
Waiting for it to be done...
Hmm, hmm, hmm...
Okay, got it!
Here it is! 18
so you can see the future is just a "box" that you have to wait on until the actual value is ready (or an error occurs while computing it).
Normally you'd use futures with the executors in concurrent.futures, but the above is an example of how to do it by hand.
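For reference, here is a minimal sketch of the executor-based version (plain concurrent.futures.ThreadPoolExecutor, not the decorator above); submit() hands back a Future just like the hand-rolled wrapper does:
import time
from concurrent.futures import ThreadPoolExecutor

def something(x):
    time.sleep(1)
    return x + x

with ThreadPoolExecutor() as executor:
    value_fut = executor.submit(something, 9)  # returns a Future immediately
    print("The value is:", value_fut)          # most likely still pending here
    print("Here it is!", value_fut.result())   # blocks until the worker is done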

How to detect an error condition in functions called from the main function?

I have a small program that calls multiple functions from the main() function, and in each of the functions there is a small probability that an error occurs. In order to be compatible with the existing monitoring system, I need to create a file in the /var/tmp/error_triggered/ directory if there was at least one error, or remove the file from that directory if there were no errors. One way to do this is to use a variable with global scope:
#!/usr/bin/env python3

import os
import random
from pathlib import Path

def f1():
    global error
    for i in range(1, 5):
        if random.randint(0, 9) == 0:
            error = 1

def f2():
    global error
    for i in range(1, 5):
        if random.randint(0, 9) == 0:
            error = 1

def main():
    f1()
    f2()

if __name__ == '__main__':
    error = 0
    main()
    if error == 1:
        Path('/var/tmp/error_triggered/' + os.path.splitext(os.path.basename(__file__))[0]).touch()
    else:
        Path('/var/tmp/error_triggered/' + os.path.splitext(os.path.basename(__file__))[0]).unlink(missing_ok=True)
It is usually a bad practice to modify a global variable from inside functions. Is it justified here? Or is there a better approach to solve this problem?
The standard way of handling errors in Python is raising exceptions:
def f1():
    for i in range(1, 5):
        if random.randint(0, 9) == 0:
            raise ValueError()

def f2():
    for i in range(1, 5):
        if random.randint(0, 9) == 0:
            raise ValueError()

def main():
    f = Path('/var/tmp/error_triggered/' + os.path.splitext(os.path.basename(__file__))[0])
    try:
        f1()
        f2()
        f.unlink(missing_ok=True)
    except ValueError:
        f.touch()
There's a subtle difference between this and what you were doing: if f1 raises an exception, f2 won't be called (that is, raising an exception immediately stops the try block and goes straight to the except). This is desirable behavior more often than not, since usually if an error happens you don't want to keep going with what you were doing.
If for some reason you want f2 to be called even if f1 has already produced an error, then it might be more appropriate to return the error the same way you'd return any other value:
def f1():
    for i in range(1, 5):
        if random.randint(0, 9) == 0:
            return 1
    return 0

def f2():
    for i in range(1, 5):
        if random.randint(0, 9) == 0:
            return 1
    return 0

def main():
    f = Path('/var/tmp/error_triggered/' + os.path.splitext(os.path.basename(__file__))[0])
    if f1() + f2():
        f.touch()
    else:
        f.unlink(missing_ok=True)
Using a global is almost never the best way to pass information between functions. For anything beyond the simplest cases it will just make your code very difficult to debug.
I'd suggest creating an ERROR_COUNTER int variable or a dictionary which stores the counts, something like:
{ functionname: count, functionname2: count }
which is updated on every occurrence of an error (see the sketch below).
Making variables global from inside functions is not great.
I don't know your use case/context specifically, but you could also return the number of errors from each function call if all they do is perform an action. This would make the caller responsible for handling the count of errors rather than the individual functions.
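A minimal sketch of that counter idea (the record_error helper and the exact structure are illustrative, not from the original code):
import random

ERROR_COUNTER = {}  # {function_name: error_count}

def record_error(name):
    # Mutating the dict needs no `global` statement, since nothing is rebound.
    ERROR_COUNTER[name] = ERROR_COUNTER.get(name, 0) + 1

def f1():
    for i in range(1, 5):
        if random.randint(0, 9) == 0:
            record_error('f1')

def f2():
    for i in range(1, 5):
        if random.randint(0, 9) == 0:
            record_error('f2')

def main():
    f1()
    f2()
    if ERROR_COUNTER:
        print("errors per function:", ERROR_COUNTER)

main()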
I'd echo what some of the others have raised here, that raising exceptions is the appropriate response to errors, but add that you can wrap the calls in a try/except block if you need specific error behaviour:
def a():
    if random.randint(0, 9) == 0:
        raise Exception("error!")

def main():
    f = Path('/var/tmp/error_triggered/' + os.path.splitext(os.path.basename(__file__))[0])
    try:
        for i in range(0, 5):
            a()
        f.unlink(missing_ok=True)
    except Exception as e:
        print(e)
        f.touch()
I'm assuming your code is a bit more complicated than raising errors at random, but the principle is the same: if something hits an error, it should raise an exception, and if you need to handle the exception in a way that isn't just leaving the shell, use a try/except block.

Attempting to Understand Functional Arguments

I recognize this may be a very 101 type question, but I'm still having trouble understanding functional programming in general, and have a particular code snippet that I can't make sense of:
Full code, but leaving out most of the function definitions:
import blpapi
import sys

SESSION_STARTED = blpapi.Name("SessionStarted")
SESSION_STARTUP_FAILURE = blpapi.Name("SessionStartupFailure")
SERVICE_OPENED = blpapi.Name("ServiceOpened")
SERVICE_OPEN_FAILURE = blpapi.Name("ServiceOpenFailure")
ERROR_INFO = blpapi.Name("ErrorInfo")
GET_FILLS_RESPONSE = blpapi.Name("GetFillsResponse")

d_service = "//blp/emsx.history"
d_host = "localhost"
d_port = 8194
bEnd = False

class SessionEventHandler():
    def processEvent(self, event, session):
        try:
            if event.eventType() == blpapi.Event.SESSION_STATUS:
                self.processSessionStatusEvent(event, session)
            elif event.eventType() == blpapi.Event.SERVICE_STATUS:
                self.processServiceStatusEvent(event, session)
            elif event.eventType() == blpapi.Event.RESPONSE:
                self.processResponseEvent(event)
            else:
                self.processMiscEvents(event)
        except:
            print("Exception: %s" % sys.exc_info()[0])
        return False

    def processSessionStatusEvent(self, event, session):
        print("Processing SESSION_STATUS event")
        for msg in event:
            pass

    def processServiceStatusEvent(self, event, session):
        print("Processing SERVICE_STATUS event")
        for msg in event:
            pass

    def processResponseEvent(self, event):
        print("Processing RESPONSE event")
        for msg in event:
            global bEnd
            bEnd = True

    def processMiscEvents(self, event):
        print("Processing " + event.eventType() + " event")
        for msg in event:
            print("MESSAGE: %s" % (msg.tostring()))

def main():
    sessionOptions = blpapi.SessionOptions()
    sessionOptions.setServerHost(d_host)
    sessionOptions.setServerPort(d_port)
    print("Connecting to %s:%d" % (d_host, d_port))
    eventHandler = SessionEventHandler()
    session = blpapi.Session(sessionOptions, eventHandler.processEvent)
    if not session.startAsync():
        print("Failed to start session.")
        return
    global bEnd
    while bEnd == False:
        pass
    session.stop()
I can follow the code up to here:
session = blpapi.Session(sessionOptions, eventHandler.processEvent)
Here, I see I'm calling "Session" from the blpapi library, and passing it some options as well as my eventHandler.processEvent. Here is where I get lost. I look at that particular function, and see:
def processEvent(self, event, session):
    try:
        if event.eventType() == blpapi.Event.SESSION_STATUS:
            self.processSessionStatusEvent(event, session)
        elif event.eventType() == blpapi.Event.SERVICE_STATUS:
            self.processServiceStatusEvent(event, session)
        elif event.eventType() == blpapi.Event.RESPONSE:
            self.processResponseEvent(event)
        else:
            self.processMiscEvents(event)
    except:
        print("Exception: %s" % sys.exc_info()[0])
    return False
I see that the function is attempting to discern what type of event has been passed in, and will execute a different function within the class depending on that event type. The trouble is, I can't figure out where the event is ever specified! Where does "event" come from? I see it as an argument in that particular function, but no event argument was passed to:
session = blpapi.Session(sessionOptions, eventHandler.processEvent)
So how does it know what to do at this point? How did this "event" object magically appear?
Thanks for entertaining my dumb questions
session = blpapi.Session(sessionOptions, eventHandler.processEvent)
Note that processEvent here lacks parentheses () after it. This means you are passing the function itself as a parameter to the Session class. This class will later call processEvent with appropriate parameters.
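As a toy illustration of the same pattern (this is not blpapi; FakeSession and the event value are made up), a class can store the function object it was given and call it later with whatever arguments it chooses:
class FakeSession:
    def __init__(self, options, event_handler):
        self.options = options
        self.event_handler = event_handler  # store the function object; nothing is called yet

    def simulate(self):
        # The class decides when to call the handler and what to pass to it.
        self.event_handler("SESSION_STATUS", self)

def process_event(event, session):
    print("Got event:", event)

session = FakeSession({"host": "localhost"}, process_event)  # note: no parentheses after process_event
session.simulate()  # prints: Got event: SESSION_STATUS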
Side Note:
I'm still having trouble understanding functional programming
"Functional programming" has a very specific definition and this example isn't it. If you are interested, you can google "functional programming" or read the Wikipedia article to find out more. However, this isn't really important at this stage in your learning process.

Python - Returning a break statement

If you call a function to check exit conditions, can you have it return a break statement? Something like:
def check():
    return break

def myLoop:
    while myLoop:
        check()
Is there anything like this allowed? I know the syntax as written isn't valid.
No, it doesn't work like that unfortunately. You would have to check the return value and then decide to break out of the loop in the caller.
while myLoop:
    result = check()
    if result == 'oh no':
        break
Of course, depending on what you are trying to do, it may just be as simple as:
result = check()
while result != 'oh no':
    result = check()
break is a keyword, not an object, so the interpreter treats it differently; a Python function can only return objects (values).
If you want to break out of a loop when deep inside functions, one way is to use an exception:
class BreakException(Exception):
    pass
Raise the exception somewhere in a function:
def some_func():
    raise BreakException()
And you can break out of a loop like this:
try:
    while True:
        some_func()
except BreakException:
    pass
I do not feel this is good practice, but I have seen some languages use this approach; Scala, for example.
def check():
    # define your conditions here with an if..else statement
    return True

def myLoop():
    while True:
        if check() == True: break

How to access (and edit) variables from a callback function?

I use Boto to access Amazon S3, and for file uploading I can assign a callback function. The problem is that I cannot access the needed variables from that callback function unless I make them global. On the other hand, if I make them global, they are global for all other Celery tasks, too (until I restart Celery), as the file uploading is executed from a Celery task.
Here is a function that uploads a JSON file with information about video conversion progress.
def upload_json():
    global current_frame
    global path_to_progress_file
    global bucket
    json_file = Key(bucket)
    json_file.key = path_to_progress_file
    json_file.set_contents_from_string('{"progress": "%s"}' % current_frame,
                                       cb=json_upload_callback, num_cb=2, policy="public-read")
And here are 2 callback functions for uploading frames generated by ffmpeg during the video conversion and a JSON file with the progress information.
# Callback functions that are called by get_contents_to_filename.
# The first argument is representing the number of bytes that have
# been successfully transmitted from S3 and the second is representing
# the total number of bytes that need to be transmitted.
def frame_upload_callback(transmitted, to_transmit):
    if transmitted == to_transmit:
        upload_json()

def json_upload_callback(transmitted, to_transmit):
    global uploading_frame
    if transmitted == to_transmit:
        print "Frame uploading finished"
        uploading_frame = False
Theoretically, I could pass the uploading_frame variable to the upload_json function, but it wouldn’t get to json_upload_callback as it’s executed by Boto.
In fact, I could write something like this.
In [1]: def make_function(message):
   ...:     def function():
   ...:         print message
   ...:     return function
   ...:

In [2]: hello_function = make_function("hello")

In [3]: hello_function
Out[3]: <function function at 0x19f4c08>

In [4]: hello_function()
hello
This, however, doesn't let you edit the value from the inner function; it only lets you read the value.
def myfunc():
    stuff = 17
    def lfun(arg):
        print "got arg", arg, "and stuff is", stuff
    return lfun

my_function = myfunc()
my_function("hello")
This works.
def myfunc():
    stuff = 17
    def lfun(arg):
        print "got arg", arg, "and stuff is", stuff
        stuff += 1
    return lfun

my_function = myfunc()
my_function("hello")
And this gives an UnboundLocalError: local variable 'stuff' referenced before assignment.
Thanks.
In Python 2.x closed-over variables are read-only (not because of the Python VM, but simply because the syntax provides no way to assign to a variable that is neither local nor global).
You can however use a closure over a mutable value... i.e.
def myfunc():
    stuff = [17]  # <<---- this is a mutable object
    def lfun(arg):
        print "got arg", arg, "and stuff[0] is", stuff[0]
        stuff[0] += 1
    return lfun

my_function = myfunc()
my_function("hello")
my_function("hello")
If you are instead using Python 3.x, the keyword nonlocal can be used to specify that a variable that is read and written in a closure is not a local, but should be captured from the enclosing scope:
def myfunc():
    stuff = 17
    def lfun(arg):
        nonlocal stuff
        print("got arg", arg, "and stuff is", stuff)
        stuff += 1
    return lfun

my_function = myfunc()
my_function("hello")
my_function("hello")
You could create a partial function via functools.partial. This is a way to call a function with some variables pre-baked into the call. However, to make that work you'd need to pass a mutable value - e.g. a list or dict - into the function, rather than just a bool.
from functools import partial

def callback(arg1, arg2, arg3):
    arg1[:] = [False]
    print arg1, arg2, arg3

local_var = [True]
partial_func = partial(callback, local_var)
partial_func(2, 1)
print local_var  # prints [False]
A simple way to do these things is to use a local function
def myfunc():
    stuff = [17]  # mutable, so the closure can update it (same trick as above)
    def lfun(arg):
        print "got arg", arg, "and stuff[0] is", stuff[0]
        stuff[0] += 1
    register_callback(lfun)  # hand the closure over as the callback
This will create a new closure every time you call myfunc, and that closure will be able to read and update its own local stuff.
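To tie this back to the original Boto code, here is a hedged sketch (reusing Key, set_contents_from_string and the variable names from the question) of how a closure over a mutable dict could replace the globals; the dict is owned by the Celery task, so nothing leaks between tasks:
from boto.s3.key import Key

def make_json_upload_callback(state):
    # state is a mutable dict owned by the caller, e.g. {"uploading_frame": True}
    def json_upload_callback(transmitted, to_transmit):
        if transmitted == to_transmit:
            print("Frame uploading finished")
            state["uploading_frame"] = False  # visible to the caller, no globals needed
    return json_upload_callback

def upload_json(bucket, path_to_progress_file, current_frame, state):
    json_file = Key(bucket)
    json_file.key = path_to_progress_file
    json_file.set_contents_from_string('{"progress": "%s"}' % current_frame,
                                       cb=make_json_upload_callback(state),
                                       num_cb=2, policy="public-read")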
