I use Boto to access Amazon S3. For file uploads I can assign a callback function. The problem is that I cannot access the variables I need from that callback function unless I make them global. On the other hand, if I make them global, they are global for all other Celery tasks too (until I restart Celery), because the upload runs inside a Celery task.
Here is a function that uploads a JSON file with information about video conversion progress.
def upload_json():
    global current_frame
    global path_to_progress_file
    global bucket
    json_file = Key(bucket)
    json_file.key = path_to_progress_file
    json_file.set_contents_from_string('{"progress": "%s"}' % current_frame,
                                       cb=json_upload_callback, num_cb=2,
                                       policy="public-read")
And here are the two callback functions: one for the frames generated by ffmpeg during the video conversion, and one for the JSON file with the progress information.
# Callback functions that are passed to set_contents_from_string via cb=.
# The first argument is the number of bytes that have been successfully
# transmitted to S3 and the second is the total number of bytes that
# need to be transmitted.
def frame_upload_callback(transmitted, to_transmit):
    if transmitted == to_transmit:
        upload_json()

def json_upload_callback(transmitted, to_transmit):
    global uploading_frame
    if transmitted == to_transmit:
        print "Frame uploading finished"
        uploading_frame = False
Theoretically, I could pass the uploading_frame variable to the upload_json function, but it wouldn't make it into json_upload_callback, since that is invoked by Boto.
In fact, I could write something like this.
In [1]: def make_function(message):
   ...:     def function():
   ...:         print message
   ...:     return function
   ...:

In [2]: hello_function = make_function("hello")

In [3]: hello_function
Out[3]: <function function at 0x19f4c08>

In [4]: hello_function()
hello
This, however, only lets the inner function read the value; it doesn't let it rebind it.
def myfunc():
    stuff = 17
    def lfun(arg):
        print "got arg", arg, "and stuff is", stuff
    return lfun

my_function = myfunc()
my_function("hello")
This works.
def myfunc():
    stuff = 17
    def lfun(arg):
        print "got arg", arg, "and stuff is", stuff
        stuff += 1
    return lfun

my_function = myfunc()
my_function("hello")
And this gives an UnboundLocalError: local variable 'stuff' referenced before assignment.
Thanks.
In Python 2.x closed-over variables are read-only (not because of the Python VM, but simply because the syntax provides no way to assign to a variable that is neither local nor global).
You can, however, close over a mutable value, e.g.:
def myfunc():
    stuff = [17]  # <<---- this is a mutable object
    def lfun(arg):
        print "got arg", arg, "and stuff[0] is", stuff[0]
        stuff[0] += 1
    return lfun

my_function = myfunc()
my_function("hello")
my_function("hello")
If you are instead using Python 3.x, the keyword nonlocal can be used to declare that a variable that is read and written in a closure is not a local, but should be captured from the enclosing scope:
def myfunc():
    stuff = 17
    def lfun(arg):
        nonlocal stuff
        print("got arg", arg, "and stuff is", stuff)  # Python 3 print function
        stuff += 1
    return lfun

my_function = myfunc()
my_function("hello")
my_function("hello")
You could create a partial function via functools.partial. This is a way to call a function with some variables pre-baked into the call. However, to make that work you'd need to pass a mutable value, e.g. a list or dict, into the function, rather than just a bool.
from functools import partial

def callback(arg1, arg2, arg3):
    arg1[:] = [False]
    print arg1, arg2, arg3

local_var = [True]
partial_func = partial(callback, local_var)
partial_func(2, 1)
print local_var  # prints [False]
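Applied back to the S3 upload from the question, the same idea could look roughly like this (an untested sketch; the state dict is an assumed stand-in for the uploading_frame global, while the boto calls are taken from the question):

from functools import partial
from boto.s3.key import Key

def json_upload_callback(state, transmitted, to_transmit):
    # state is a mutable dict pre-baked via partial; boto only supplies
    # the two byte counts when it invokes the callback.
    if transmitted == to_transmit:
        print "Frame uploading finished"
        state["uploading_frame"] = False

def upload_json(bucket, path_to_progress_file, current_frame, state):
    json_file = Key(bucket)
    json_file.key = path_to_progress_file
    json_file.set_contents_from_string(
        '{"progress": "%s"}' % current_frame,
        cb=partial(json_upload_callback, state),
        num_cb=2, policy="public-read")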
A simple way to do these things is to use a local function:
def myfunc():
    stuff = 17
    def lfun(arg):
        nonlocal stuff  # needed to rebind stuff (Python 3); in Python 2 use the mutable-container trick above
        print("got arg", arg, "and stuff is", stuff)
        stuff += 1
    register_callback(lfun)  # hand the local function to whatever expects the callback
This will create a new function every time you call myfunc, and it will be able to use the local "stuff" copy.
I am trying to create a cache array that my function can modify and access multiple times.
I can achieve this the ugly way:
def func():
    global cache
    try: cache
    except NameError: cache = {"stuff"}
    ### does stuff with cache
    ...
but I can't do this with the nonlocal keyword, I would really like to know why...
UPDATE: user @quamrana asked for an example:
def main():
    def func1():
        global x
        x = 3.14

    def func2():
        nonlocal y
        y = 3.14

    func1()
    print(x)
    func2()
    print(y)

if __name__ == "__main__":
    main()
I can also pass an outside variable to the function and use it with some methods, but I would like to avoid creating one manually:
def func(cache):
    cache.update({"stuff": 42})
    my_stuff = cache.get("stuff")

def main():
    func_cache = {}
    func(func_cache)

if __name__ == "__main__":
    main()
Are there good ways to do this kind of thing?
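One pattern consistent with the closure/nonlocal discussion above is to keep the cache inside an enclosing function and hand out the worker as a closure, so nothing lives at module scope. A minimal sketch (the names are illustrative, not from the question):

def make_func():
    cache = {}  # lives in the enclosing scope instead of being a global

    def func():
        # Mutating the dict needs no nonlocal; only rebinding cache would.
        if "stuff" not in cache:
            cache["stuff"] = 42
        return cache["stuff"]

    return func

func = make_func()
print(func())  # 42, computed once and then served from the cache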
I am studying decorators in Python and trying to use decorators with arguments, but I'm having a problem. I defined two inner functions in the outer decorator function, and it returns None when I use it as below:
from threading import Thread

def prefix(write: bool = False):
    def thread(func):
        def wrapper(*args, **kwargs):
            t1 = Thread(target=func, args=args, kwargs=kwargs)
            t1.start()
            if write:
                print("write parameter is true.")
        return wrapper
    return thread

@prefix(write=True)
def something(x):
    return x + x

print(something(5))
As you see, I defined two different functions named prefix and something. If the write parameter is true, it prints the string. But the something function prints "None" instead of printing 5 + 5.
What's wrong?
Well, your wrapper() function doesn't have a return statement, so it will always return None.
Furthermore, how would you expect it to print 5 + 5 (or rather, the result thereof) when that may not have been computed yet, considering you're starting a new thread to do that and never do anything with the return value of func at all?
IOW, if we expand your example a bit:
import time
from threading import Thread

def prefix(write: bool = False):
    def thread(func):  # <- this function replaces `something`
        def wrapper(*args, **kwargs):
            t1 = Thread(target=func, args=args, kwargs=kwargs)
            t1.start()
            if write:
                print("write parameter is true.")
            return "hernekeitto"
        return wrapper
    return thread
@prefix(write=True)
def something(x):
    print("Computing, computing...")
    time.sleep(0.5)
    print("Hmm, hmm, hmm...")
    time.sleep(0.5)
    print("Okay, got it!")
    return x + x

value = something(9)
print("The value is:", value)
This will print out
Computing, computing...
write parameter is true.
The value is: hernekeitto
Hmm, hmm, hmm...
Okay, got it!
As you can see, the thread's first print() happens first, then the write print, then the value print, and then the rest of what happens in the thread. And as you can see, we only know what x + x is after "Okay, got it!", so there's no way you could have returned that out of wrapper() where "hernekeitto" is returned.
See futures (or the equivalent JavaScript concept promises) for a "value that's not yet ready":
import time
from concurrent.futures import Future
from threading import Thread

def in_future(func):
    def wrapper(*args, **kwargs):
        fut = Future()

        def func_wrapper():
            # Wraps the threaded function to resolve the future.
            try:
                fut.set_result(func(*args, **kwargs))
            except Exception as e:
                fut.set_exception(e)

        t1 = Thread(target=func_wrapper)
        t1.start()
        return fut
    return wrapper

@in_future
def something(x):
    print("Computing, computing...")
    time.sleep(0.5)
    print("Hmm, hmm, hmm...")
    time.sleep(0.5)
    print("Okay, got it!")
    return x + x

value_fut = something(9)
print("The value is:", value_fut)
print("Waiting for it to be done...")
print("Here it is!", value_fut.result())
This prints out
Computing, computing...
The value is: <Future at 0x... state=pending>
Waiting for it to be done...
Hmm, hmm, hmm...
Okay, got it!
Here it is! 18
So you can see the future is just a "box": you need to wait for the actual value to be ready (or for an error to occur while computing it).
Normally you'd use futures with the executors in concurrent.futures, but the above is an example of how to do it by hand.
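For comparison, a rough sketch of what the executor-based version could look like (this one uses a module-level ThreadPoolExecutor; how you size and shut it down is up to you):

import time
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=4)

def in_future(func):
    def wrapper(*args, **kwargs):
        # submit() schedules func on the pool and returns a Future immediately.
        return executor.submit(func, *args, **kwargs)
    return wrapper

@in_future
def something(x):
    time.sleep(0.5)
    return x + x

value_fut = something(9)
print("Here it is!", value_fut.result())  # blocks until the result is ready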
Say I have a flag --debug/--no-debug defined for the base command. This flag will affect the behavior of many operations in my program. Right now I find myself passing this flag as a function parameter all over the place, which doesn't seem elegant. Especially when I need to access this flag in a deep call stack, I'll have to add this parameter to every single function on the stack.
I can instead create a global variable is_debug and set its value at the beginning of the command function that receives the value of this flag. But this doesn't seem elegant to me either.
Is there a better way to make some option values globally accessible using the Click library?
There are two ways to do so, depending on your needs. Both of them end up using the click Context.
Personally, I'm a fan of Option 2 because then I don't have to modify function signatures (and I rarely write multi-threaded programs). It also sounds more like what you're looking for.
Option 1: Pass the Context to the function
Use the click.pass_context decorator to pass the click context to the function.
Docs:
Usage: https://click.palletsprojects.com/en/7.x/commands/#nested-handling-and-contexts
API: https://click.palletsprojects.com/en/7.x/api/#click.pass_context
# test1.py
import click

@click.pass_context
def some_func(ctx, bar):
    foo = ctx.params["foo"]
    print(f"The value of foo is: {foo}")

@click.command()
@click.option("--foo")
@click.option("--bar")
def main(foo, bar):
    some_func(bar)

if __name__ == "__main__":
    main()
$ python test1.py --foo 1 --bar "bbb"
The value of foo is: 1
Option 2: click.get_current_context()
Pull the context directly from the current thread via click.get_current_context(). Available starting in Click 5.0.
Docs:
Usage: https://click.palletsprojects.com/en/7.x/advanced/#global-context-access
API: https://click.palletsprojects.com/en/7.x/api/#click.get_current_context
Note: This only works if you're in the current thread (the same thread as what was used to set up the click commands originally).
# test2.py
import click

def some_func(bar):
    c = click.get_current_context()
    foo = c.params["foo"]
    print(f"The value of foo is: {foo}")

@click.command()
@click.option("--foo")
@click.option("--bar")
def main(foo, bar):
    some_func(bar)

if __name__ == "__main__":
    main()
$ python test2.py --foo 1 --bar "bbb"
The value of foo is: 1
To build on top of Option 2 given by @dthor, I wanted to make this more seamless, so I combined it with the trick of modifying a function's global scope and came up with the decorator below:
import functools
import click

def with_click_params(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        g = func.__globals__
        sentinel = object()
        ctx = click.get_current_context()
        oldvalues = {}
        for param in ctx.params:
            oldvalues[param] = g.get(param, sentinel)
            g[param] = ctx.params[param]
        try:
            return func(*args, **kwargs)
        finally:
            for param in ctx.params:
                if oldvalues[param] is sentinel:
                    del g[param]
                else:
                    g[param] = oldvalues[param]
    return wrapper
You would use it like this (borrowing the sample from @dthor's answer):
@with_click_params
def some_func():
    print(f"The value of foo is: {foo}")
    print(f"The value of bar is: {bar}")

@click.command()
@click.option("--foo")
@click.option("--bar")
def main(foo, bar):
    some_func()

if __name__ == "__main__":
    main()
Here is it in action:
$ python test2.py --foo 1 --bar "bbb"
The value of foo is: 1
The value of bar is: bbb
Caveats:
The function can only be called from a click-originated call stack, but this is a conscious choice (i.e., you are making assumptions about the variable injection). The click unit testing guide should be useful here (see the sketch after these caveats).
The function is no longer thread safe.
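For example, a unit test could drive the decorated function through a real click invocation with CliRunner (a sketch, assuming the main/some_func example above lives in a module named test2):

from click.testing import CliRunner

from test2 import main  # the @click.command defined above

def test_some_func_sees_click_params():
    runner = CliRunner()
    # Invoking through the command gives some_func a click context to read from.
    result = runner.invoke(main, ["--foo", "1", "--bar", "bbb"])
    assert result.exit_code == 0
    assert "The value of foo is: 1" in result.output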
It is also possible to be explicit about the names of the params to inject:
def with_click_params(*params):
    def wrapper(func):
        @functools.wraps(func)
        def inner_wrapper(*args, **kwargs):
            g = func.__globals__
            sentinel = object()
            ctx = click.get_current_context()
            oldvalues = {}
            for param in params:
                oldvalues[param] = g.get(param, sentinel)
                g[param] = ctx.params[param]
            try:
                return func(*args, **kwargs)
            finally:
                for param in params:
                    if oldvalues[param] is sentinel:
                        del g[param]
                    else:
                        g[param] = oldvalues[param]
        return inner_wrapper
    return wrapper
#with_click_params("foo")
def some_func():
print(f"The value of foo is: {foo}")
#click.command()
#click.option("--foo")
#click.option("--bar")
def main(foo, bar):
some_func()
if __name__ == "__main__":
main()
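Another option the answers above don't show, but which is also part of Click's Context API, is to stash shared state on ctx.obj (typically created with ctx.ensure_object(dict)) and read it back wherever the context is available. A minimal sketch using the --debug flag from the question:

import click

@click.group()
@click.option("--debug/--no-debug", default=False)
@click.pass_context
def cli(ctx, debug):
    ctx.ensure_object(dict)   # creates ctx.obj if nothing set it yet
    ctx.obj["debug"] = debug

@cli.command()
@click.pass_context
def build(ctx):
    if ctx.obj["debug"]:
        click.echo("debug is on")
    click.echo("building...")

if __name__ == "__main__":
    cli(obj={})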
I am printing to a console in Python. I am looking for a one-off piece of code so that all print statements after a certain line of code have 4 spaces at the start. E.g.:
print('Computer: Hello world')
print.setStart('    ')
print('receiving...')
print('received!')
print.setStart('')
print('World: Hi!')
Output:
Computer: Hello world
    receiving...
    received!
World: Hi!
This would be helpful for indenting all of the output produced inside a function, and for controlling when a function's output is indented. Is this possible?
You can define a print function which first prints your prefix and then internally calls the built-in print function. You can even make your custom print() function look at the call stack and determine accordingly how many spaces to use as a prefix:
import builtins
import traceback

def print(*objs, **kwargs):
    my_prefix = len(traceback.format_stack()) * " "
    builtins.print(my_prefix, *objs, **kwargs)
Test it out:
def func_f():
    print("Printing from func_f")
    func_g()

def func_g():
    print("Printing from func_g")

func_f()
Output (the exact amount of leading space depends on the call-stack depth; the call inside func_g is one level deeper, so its line is indented one space more):
     Printing from func_f
      Printing from func_g
Reverting back to the built-in print() function:
When you are done with your custom printing, and want to start using the built-in print() function, just use del to "delete" your own definition of print:
del print
Why not define your own custom function and use that when needed:
def tprint(*args):
    print('   ', *args)  # three spaces plus the default separator give a 4-space indent
It would be used like so:
print('Computer: Hello world')
tprint('receiving...')
tprint('received!')
print('World: Hi!')
Output:
Computer: Hello world
    receiving...
    received!
World: Hi!
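If a single tprint call can contain newlines, textwrap.indent is one way to keep every line of the message indented (a small sketch, not part of the original answer):

import textwrap

def tprint(*args):
    text = ' '.join(str(a) for a in args)
    # indent() prefixes every line of the message, not just the first one.
    print(textwrap.indent(text, '    '))

tprint('receiving...\nstill receiving...')
#     receiving...
#     still receiving...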
You might want to use specific prefixes only in specific places:
import sys
from contextlib import contextmanager

@contextmanager
def add_prefix(prefix):
    global is_new_line
    orig_write = sys.stdout.write
    is_new_line = True

    def new_write(*args, **kwargs):
        global is_new_line
        if args[0] == "\n":
            is_new_line = True
        elif is_new_line:
            orig_write("[" + str(prefix) + "]: ")
            is_new_line = False
        orig_write(*args, **kwargs)

    sys.stdout.write = new_write
    try:
        yield
    finally:
        sys.stdout.write = orig_write  # restore even if the block raises
with add_prefix("Computer 1"):
print("Do something", "cool")
print("Do more stuffs")
with add_prefix("Computer 2"):
print("Do further stuffs")
print("Done")
#[Computer 1]: Do something cool
#[Computer 1]: Do more stuffs
#[Computer 2]: Do further stuffs
#Done
The advantage is that it's a utility function: you just have to import it to use it, without having to redefine it every time you write a new script.
I have defined the TaskTimer class below. I would like to execute multiple functions, which may or may not have arguments, when the event is triggered, and I would like to come up with a generic way of doing this. My functions are not being executed and I do not understand why. Are my arguments in t.start() incorrect?
import System
from System.Timers import (Timer, ElapsedEventArgs)

class TaskTimer(object):
    def __init__(self):
        self.timer = Timer()
        self.timer.Enabled = False
        self.handlers = []

    def On_Timed_Event(self, source, event):
        print 'Event fired', event.SignalTime
        for handler in self.handlers:
            handler(*self.my_args, **self.kwargs)

    def start(self, interval, repeat, *args, **kwargs):
        self.repeat = repeat
        self.run = True  # True if timer is running
        self.timer.Interval = interval
        self.timer.Enabled = True
        self.my_args = args
        self.kwargs = kwargs
        for x in self.my_args:
            self.handlers.append(x)
        self.timer.Elapsed += self.On_Timed_Event

def func1(a, b):
    print 'function 1. this function does task1'
    print '1...1...1...'
    return None

def func2(d):
    print 'Function 2. This function does something'
    print '2...2...2...'
    return None

def func3(a, b, c):
    print 'function 3. This function does something else'
    return None

def main():
    t = TaskTimer()
    D = {'fun2': 'func2', 'arg2': '3'}
    t.start(5000, False, func1, func2, func3, a=1, b=3, c=4, d=D)

if __name__ == '__main__':
    main()
I am experimenting, so I edited the On_Timed_Event function and func1, func2 and func3 as shown below. I also added print statements to the functions as suggested by @Ewan. Does Python automatically substitute function parameters from **self.kwargs?
def On_Timed_Event(self, source, event):
    print '\nEvent fired', event.SignalTime
    for handler in self.handlers:
        print 'length of self.handlers', len(self.handlers)
        print 'args', self.my_args
        print 'kwargs', self.kwargs
        print 'handler', handler
        handler(**self.kwargs)
        self.handler[handler](**self.kwargs)

def func1(a, b):
    print 'function 1. this function does task1'
    print 'func1', a, b
    print '1...1...1...'
    return None

def func2(d):
    print 'Function 2. This function does something'
    print 'func2', d
    print '2...2...2...'
    return None

def func3(a, b, c):
    print 'function 3. This function does something else'
    print 'func3', a, b, c
    return None
The code runs inside the IronPython console.
(Screenshot of the IronPython console: http://i.stack.imgur.com/vYW0S.jpg)
First of all I can see you having trouble with this line:
handler(*self.my_args,**self.kwargs)
As you aren't accepting any extra kwargs in func1, func2 or func3, I would expect to see the following if the call were reaching them:
In [1]: def func1(a, b):
   ...:     print(a, b)
   ...:

In [2]: kwg = {'a': 1, 'b': 3, 'c': 4, 'd': {'fun2': 'func2', 'arg2': '3'}}

In [3]: func1(**kwg)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-3-12557b6441ce> in <module>()
----> 1 func1(**kwg)

TypeError: func1() got an unexpected keyword argument 'c'
Personally I'd put some decent logging/debug statements in to find out where your program is getting in your stack.
Is the program completing successfully or are you getting a TraceBack?
What is this designed to do: self.timer.Elapsed += self.On_Timed_Event?
It looks like it's supposed to be adding a time onto your timer.Elapsed value; however, nothing is returned by self.On_Timed_Event.
On_Timed_Event takes 2 parameters (source, event); however, you don't seem to be passing them in, which should also cause the program to fail.
Are you seeing Event fired in your stdout or is the program not getting to On_Timed_Event at all? This may show a problem in your System.Timers code.
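Not part of the original answer, but one generic way around the "every handler receives every argument" problem is to register each callback together with its own arguments, e.g. as (func, args, kwargs) tuples. A rough sketch with the System.Timers wiring stripped out to focus on the dispatch (func1, func2 and func3 are the functions from the question):

class TaskTimer(object):
    def __init__(self):
        self.handlers = []

    def add_handler(self, func, *args, **kwargs):
        # Each callback remembers exactly the arguments it was registered with.
        self.handlers.append((func, args, kwargs))

    def On_Timed_Event(self, source, event):
        for func, args, kwargs in self.handlers:
            func(*args, **kwargs)

t = TaskTimer()
t.add_handler(func1, 1, 3)                            # func1(a, b)
t.add_handler(func2, {'fun2': 'func2', 'arg2': '3'})  # func2(d)
t.add_handler(func3, 1, 3, 4)                         # func3(a, b, c)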