I have a function for sending an email which is used in a Celery task. I need to make the code sleep for a second after that email function has run 20 times. How can I make this happen?
@app.task(bind=True)
def send_email_reminder_due_date(self):
    send_email_from_template(
        subject_template_path='emails/0.txt',
        body_template_path='emails/1.html',
        template_data=template_data,
        to_email_list=email_list,
        fail_silently=False,
        content_subtype='html'
    )
The above code is a Celery periodic task which runs daily. I filter on today's date and send the email for all the records whose date is today. If the number of records is more than 20, say, then for every 20 emails sent we need to make the code sleep for a second.
send_email_from_template is the function used for sending an email.
I may be missing a vital point here about what you can and can't do because you are using Celery, but I'll post in case it's useful. You can track function state by assigning attributes to functions. In the code below I assign an attribute times_without_sleep to a dummy function and track its value to sleep every 20 calls.
# ====================
def my_function():
    if not hasattr(my_function, 'times_without_sleep'):
        my_function.times_without_sleep = 0
    print('Do the stuff')
    if my_function.times_without_sleep >= 20:
        print('Sleep')
        my_function.times_without_sleep = 0
    else:
        my_function.times_without_sleep += 1
# ====================

if __name__ == "__main__":
    for i in range(0, 100):
        my_function()
You can also set the attribute value from outside the function if you need to, e.g. my_function.times_without_sleep = 0 at the end of a round of emails.
Couldn't you just do something like this with send_email_from_template? You can also make it a decorator, as in this answer, to avoid cluttering the function code.
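If the task itself loops over today's records, the same counting idea can live right inside the loop, without function attributes: count the sends and sleep after every 20th. A minimal sketch, assuming the records come from a Django queryset (Reminder, due_date, record.email and build_template_data are placeholder names, not from the original code):

import time
from django.utils import timezone

@app.task(bind=True)
def send_email_reminder_due_date(self):
    # Placeholder queryset for "all records whose date is today".
    records = Reminder.objects.filter(due_date=timezone.now().date())

    for count, record in enumerate(records, start=1):
        send_email_from_template(
            subject_template_path='emails/0.txt',
            body_template_path='emails/1.html',
            template_data=build_template_data(record),  # hypothetical helper
            to_email_list=[record.email],               # assumed field name
            fail_silently=False,
            content_subtype='html'
        )
        if count % 20 == 0:
            time.sleep(1)  # pause for a second after every 20 emails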
Related
I am trying to run scheduled tasks only once at a specific time, but I couldn't figure out how to do that. All I understand is that every 10 seconds one copy of the task is stored, and then when the time comes the job runs them all (I shared the result below). For example, the test_1 task is scheduled for 11:02. I started the code at 11:01, so over those 60 seconds I got 6 results.
For the second scheduled task, 120 seconds had passed since the beginning, so I got 12 results.
Sorry for the misinformation; I updated my question. I also shared my Excel date list, and I hope this gives more information. If today is a work day (1) then run the scheduled tasks, else do nothing. This code will run nonstop, so I have to check the current date. This is why I placed the schedule inside the while loop.
import pandas as pd
from datetime import datetime
import time
import schedule

def test_1():
    print("do task 1")

def test_2():
    print("do task 2")

while(1 != 2):
    today = datetime.today().strftime('%Y/%m/%d')
    df_Calender = pd.read_excel('Calender.xlsx', sheet_name=0)
    date = df_Calender['Date']  # Date column
    filter_1 = date == today
    df1_Calender = df_Calender.loc[filter_1]
    week_day = df1_Calender['WeekDays']
    if (week_day.item() != 1):
        print("do nothing")
    else:
        schedule.every().day.at("11:02").do(test_1)
        schedule.every().day.at("11:03").do(test_2)
    time.sleep(10)
    schedule.run_pending()
Here is the date list (Calender.xlsx).
This is the result:
do task 1
do task 1
do task 1
do task 1
do task 1
do task 1
do task 2
do task 2
do task 2
do task 2
do task 2
do task 2
do task 2
do task 2
do task 2
do task 2
do task 2
do task 2
You would want to move your schedule calls out of the while loop; otherwise, they will keep scheduling the task every 10 seconds due to the time.sleep(10) in the endless loop.
You can try something like:
import time
import schedule

def task1():
    # Do something
    pass

schedule.every().day.at("11:02").do(task1)

# If you only want it to run once, then no loop is required.
# If you want it to keep running once per day, add the while True loop:
while True:
    schedule.run_pending()
    time.sleep(1)
Not sure if this is what you want, but it's what I understood.
You have a list of times at which you want a certain function to be called.
In this example you want task1 to be executed at 3 different times of the day and task2 at 2 different times of the day:
task1_schedule = ["11:02", "13:30", "17:00"]
task2_schedule = ["11:03", "14:20"]
Then you just iterate over the list and set a schedule
for start_time in task1_schedule:
    schedule.every().day.at(start_time).do(test_1)

for start_time in task2_schedule:
    schedule.every().day.at(start_time).do(test_2)
In this case, since there are only 2 different tasks, I made 2 different lists.
This code will create the schedule for each task once per element in its schedule list.
Since your calls to schedule are inside the while loop, the jobs get placed on the schedule multiple times. You need to place items on the schedule once and only check whether they are due inside the loop. Try moving both schedule lines in front of the while loop.
from datetime import datetime
import time
import schedule

def test_1():
    print("do task 1")

def test_2():
    print("do task 2")

schedule.every().day.at("11:02").do(test_1)
schedule.every().day.at("11:03").do(test_2)

while(1 != 2):
    time.sleep(10)
    schedule.run_pending()
See this example in the documentation.
Based on your comment, you also need to check the date before running a task. I advise not scheduling conditionally, but rather checking the date either inside the test_1 and test_2 functions, or specifying more day/week conditions (for example .every(3).days for every third day).
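For example, the calendar check from the question could move into the task itself, so each job is registered exactly once and the work-day test happens when the job fires. A rough sketch reusing the question's Calender.xlsx columns (and assuming the Date column holds the same 'YYYY/MM/DD' strings the question compares against):

import pandas as pd
from datetime import datetime
import time
import schedule

def is_work_day():
    # Look up today's row in the calendar sheet and check the WeekDays flag.
    today = datetime.today().strftime('%Y/%m/%d')
    df_calender = pd.read_excel('Calender.xlsx', sheet_name=0)
    row = df_calender.loc[df_calender['Date'] == today]
    return (not row.empty) and row['WeekDays'].item() == 1

def test_1():
    if not is_work_day():
        return  # non-work day: do nothing
    print("do task 1")

schedule.every().day.at("11:02").do(test_1)

while True:
    schedule.run_pending()
    time.sleep(10)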
I have found the solution. Just adding schedule.clear() after schedule.run_pending() clears all stored tasks and leaves only one scheduled job. Thank you for your help.
I want to loop over tasks, again and again, until reaching a certain condition before continuing the rest of the workflow.
What I have so far is this:
import prefect
from prefect import Task
from prefect.engine.signals import LOOP

# Loop task
class MyLoop(Task):
    def run(self):
        loop_res = prefect.context.get("task_loop_result", 1)
        print(loop_res)
        if loop_res >= 10:
            return loop_res
        raise LOOP(result=loop_res + 1)
But as far as I understand, this does not work for multiple tasks.
Is there a way to go back further and loop over several tasks at a time?
The solution is simply to create a single task that itself creates a new flow with one or more parameters and calls flow.run(). For example:
class MultipleTaskLoop(Task):
    def run(self):
        # Get previous value
        loop_res = prefect.context.get("task_loop_result", 1)

        # Create subflow
        with Flow('Subflow', executor=LocalDaskExecutor()) as flow:
            x = Parameter('x', default=1)
            loop1 = print_loop()
            add = add_value(x)
            loop2 = print_loop()
            loop1.set_downstream(add)
            add.set_downstream(loop2)

        # Run subflow and extract result
        subflow_res = flow.run(parameters={'x': loop_res})
        new_res = subflow_res.result[add]._result.value

        # Loop
        if new_res >= 10:
            return new_res
        raise LOOP(result=new_res)
where print_loop simply prints "loop" in the output and add_value adds one to the value it receives.
Unless I'm missing something, the answer is no.
Prefect flows are DAGs, and what you are describing (looping over multiple tasks in order again and again until some condition is met) would make a cycle, so you can't do it.
This may or may not be helpful, but you could try and make all of the tasks you want to loop into one task, and loop within that task until your exit condition has been met.
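A rough sketch of that single-task approach with the Prefect 1.x API, folding the question's print/add steps into one task that raises LOOP until the exit condition is met (task and flow names here are placeholders):

import prefect
from prefect import Flow, Task
from prefect.engine.signals import LOOP

class LoopBody(Task):
    def run(self):
        # Value carried over from the previous iteration (1 on the first pass).
        value = prefect.context.get("task_loop_result", 1)

        # Everything that used to be separate tasks happens here, in order.
        print("loop")       # stand-in for the question's print_loop task
        value = value + 1   # stand-in for add_value
        print("loop")

        if value >= 10:
            return value          # condition met: stop looping
        raise LOOP(result=value)  # otherwise run this task again

loop_body = LoopBody()

with Flow("loop-flow") as flow:
    final = loop_body()

# flow.run() would then execute the single task repeatedly until value reaches 10.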
I'm having issues trying to get threading working in Python using the awesome appJar package.
The following program needs to count through a list and update a progress bar simultaneously. I've followed the appJar documentation for threading, but it's raising NameError: name 'percent_complete' is not defined on the app.thread call (line 35), in which you're meant to pass the function's params. My code is below:
from appJar import gui
import time

# define a method that counts through a list of numbers and updates the progress meter
def press(btn):
    objects = [1, 3, 6]
    total = len(objects)
    current_object = 0
    for i in objects:
        print(i)
        current_object += 1
        current_percent_complete = (current_object / total) * 100
        updateMeter(current_percent_complete)
        time.sleep(1)

def updateMeter(percent_complete):
    app.queueFunction(app.setMeter, "progress", percent_complete)

# create a GUI variable called app
app = gui("Login Window")
app.setBg("orange")
app.setFont(18)

# add GUI elements: a label, a meter, & a button
app.addLabel("title", "COUNTER")
app.setLabelBg("title", "blue")
app.setLabelFg("title", "orange")
app.addMeter("progress")
app.setMeterFill("progress", "green")
app.addButton("START COUNTING", press)

# put the updateMeter function in its own thread
app.thread(updateMeter, percent_complete)

# start the GUI
app.go()
I can get rid of the error by defining percent_complete like so:
from appJar import gui
import time

# define a method that counts through a list of numbers and updates the progress meter
percent_complete = 0

def press(btn):
    ...
However, when the GUI loads and the button is pressed, it doesn't thread. Instead it iterates through the list, then updates the progress bar afterwards.
Has anyone come across the same issue? Any insight would be awesomely appreciated!
Thanks!
There are a couple of issues here:
First, I'm not sure your maths result in good percentages to update the meter with, so you might not see much change - should you be using i?
Second, the GUI won't be updated until the loop (and the sleeps inside it) all complete. Instead, you should try counting how many items to process, and iterating through them with an after() function, see here: http://appjar.info/pythonLoopsAndSleeps/#conditional-loops
Third, the call to app.thread() at the end doesn't achieve much - it calls the updateMeter() function with a parameter that doesn't exist, so it can be removed.
Fourth, the updateMeter() function itself isn't necessary, as you're not really using a thread - that can be removed as well...
Give this a try, once you've had a look at the maths:
current_object = 0

def press(btn):
    global current_object
    current_object = 0
    processList()

def processList():
    global current_object
    objects = [1, 3, 6]
    total = len(objects)
    if current_object < total:
        i = objects[current_object]
        print(i)
        current_object += 1
        current_percent_complete = (current_object / total) * 100
        app.setMeter("progress", current_percent_complete)
        app.after(1000, processList)
UPDATE: just to clarify the maths issue, you're dividing one integer by another: 0/3, 1/3, 2/3, 3/3 and so on. In Python 2 this will result in 0; in Python 3 you'll get fractions.
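If the code also has to run on Python 2, one way to get real percentages is to force float division, e.g.:

# Either convert one operand explicitly...
current_percent_complete = (float(current_object) / total) * 100

# ...or enable Python 3 style division for the whole module:
# from __future__ import division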
I am looking to reset a counter every day using Redis. I am new to Redis, so I want to make sure I have understood correctly how transactions and pipes work.
Does the following code ensure that I will always get a unique (date, number) couple while working in a multi-process environment, or do I need to use a Redis lock?
import datetime
import redis

r = redis.Redis(...)

def get_utc_date_now():
    return datetime.datetime.utcnow().date()

def get_daily_counter(r, dt_key='dt', counter_key='counter'):
    def incr_daily_number(pipe):
        dt_now = get_utc_date_now().isoformat()  # e.g.: "2014-10-18"
        dt = pipe.get(dt_key)
        pipe.multi()
        if dt != dt_now:
            pipe.set(dt_key, dt_now)
            pipe.set(counter_key, 0)
        pipe.get(dt_key)
        pipe.incr(counter_key)
    result = r.transaction(incr_daily_number, dt_key)
    return result[-2:]

# Get the (dt, number) couple
# 2014-10-18, 1
# 2014-10-18, 2
# etc.
dt, number = get_daily_counter(r)
UPDATE
Try with a Lua script:
r = redis.Redis(...)

incr_with_reset_on_change_lua_script = """
local dt = redis.call('GET', KEYS[2])
if dt ~= ARGV[2] then
    redis.call('MSET', KEYS[1], ARGV[1], KEYS[2], ARGV[2])
end
return redis.call('INCR', KEYS[1])
"""

# Incr KEYS[1], but reset it first if KEYS[2] has changed.
incr_with_reset_on_change = r.register_script(incr_with_reset_on_change_lua_script)

counter_key = 'dcounterA'
watch_key = 'dcounterA_dt'
watch_value = get_utc_date_now().isoformat()
reset_value = 0  # value the counter is reset to when the date changes
number = incr_with_reset_on_change(keys=[counter_key, watch_key], args=[reset_value, watch_value])
Consider two concurrent transactions occurring at midnight. Both can execute get(dt_key), but one will execute its MULTI/EXEC block first. It will reset the counter, set the new date, and increment the counter. The second one will also enter its MULTI/EXEC block, but because the value of 'dt' has changed, the execution will fail and incr_daily_number will be called again. This time get(dt_key) will return the new date, so when the MULTI/EXEC block is executed, the counter will be incremented without any reset. The two transactions will return the new date with different counter values.
So I believe there is no race condition here, and the (date, number) couples will be unique.
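For reference, the retry behaviour described above is roughly what redis-py's transaction() helper does for you; a simplified sketch (not the actual client code), using the same names as the question:

from redis import WatchError

with r.pipeline() as pipe:
    while True:
        try:
            pipe.watch('dt')              # watch the date key
            incr_daily_number(pipe)       # runs GET, then queues the MULTI'd commands
            result = pipe.execute()       # fails if 'dt' changed since WATCH
            break
        except WatchError:
            continue                      # another client touched 'dt': retry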
You could also have implemented this using a server-side Lua script (whose execution is always atomic). It is usually more convenient.
Note that actually, there is no such thing as a Redis lock. The locking mechanism available in the API is provided by the Python client - not by the Redis server. If you look at its implementation, you will realize it is also based on SETNX + WATCH/MULTI/EXEC blocks or Lua scripting.
I have an IRC bot that I made for automating stuff.
Here's a snippet of it:
def analyseIRCText(connection, event):
    global adminList, userList, commandPat, flood
    userName = extractUserName(event.source())
    userCommand = event.arguments()[0]
    escapedChannel = cleanUserCommand(config.channel).replace('\\.', '\\\\.')
    escapedUserCommand = cleanUserCommand(event.arguments()[0])
    # print userName, userCommand, escapedChannel, escapedUserCommand
    if flood.has_key(userName):
        flood[userName] += 1
    else:
        flood[userName] = 1
    ... (if flood[userName] > certain number do...)
So the idea is that flood is a dictionary that keeps track of which users have entered a command to the bot within some recent time window, and how many times they've done so within that period.
Here's where I run into trouble. There has to be SOMETHING that resets this dictionary so that users can say stuff again every once in a while, no? I think a little thing like this would do the trick:
def floodClear():
    global flood
    while 1:
        flood = {}  # Clear the list
        time.sleep(4)
But what would be the best way to do this?
At the end of the program, I have a little line:
thread.start_new_thread(floodClear, ())
so that floodClear runs in its own thread and its infinite loop doesn't halt everything else. Would this be a good solution, or is there something better that I could do?
Your logic should be enough. If you have something like:

if flood.has_key(userName):
    flood[userName] += 1
else:
    flood[userName] = 1

if flood[userName] > 8:  # say, 8
    return 0
That should make your bot ignore the user if he has spammed too many times within your given time period. What you have there should also work to clear up your flood dictionary.
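If you'd rather use the higher-level threading module than thread.start_new_thread, the same clearing loop can run as a daemon thread, which also means it won't keep the program alive on exit; a sketch of the same idea:

import threading
import time

def floodClear():
    global flood
    while True:
        flood = {}       # wipe everyone's counters
        time.sleep(4)    # every few seconds

clearer = threading.Thread(target=floodClear)
clearer.daemon = True    # dies together with the main program
clearer.start()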