Running methods at the same time - python

I opened with a very general title: I know about threads and processes, but I don't know which of the two would be better for my case.
OK, so... the code:
class Proces(object):
    [...]
    def Obsluz(self):
        proces = LRU(self.sekwencja, int(self.przydzielone_ramki))
        proces.Symulacja("T")
        # .thread.join()

    def Threads(self):
        thread = Thread(target=self.Obsluz)
        thread.start()
        thread.join()
and the code that runs it:
for lru in self.lru_procesy:
    lru.Threads()
What I want to achieve is to run the Obsluz method several times at the same time, each time with different parameters (taken from the Proces attributes). The number of Proces objects is random; it can be 10, 20, 30, etc.
My code above does not run the way I want, because each thread finishes one by one (because of the .join()). Is it possible to run them all at the same time?
Thank you!

You are just starting one worker and immediately waiting for it to finish.
To spawn several worker threads and wait for them all to finish use something like this:
workers = []
for wid in range(nworkers):
    w = Thread(target=dowork, args=...)
    w.start()
    workers.append(w)

# join all of the workers
for w in workers:
    w.join()
print("All done!")

Related

Is there a way to dynamically specify the number of threads in a python script?

On several occasions, I have a list of tasks that need to be executed via Python. Typically these tasks take a few seconds each, but there are hundreds of thousands of tasks, and threading significantly improves execution time. Is there a way to dynamically specify the number of threads a Python script should utilize in order to solve a stack of tasks?
I have had success running threads when executed in the body of Python code, but I have never been able to run threads correctly when they are within a function (I assume this is because of scoping). Below is my approach to dynamically define a list of threads which should be used to execute several tasks.
The problem is that this approach waits for a single thread to complete before continuing through the for loop.
import threading
import sys
import time

def null_thread():
    """ used to instantiate threads """
    pass

def instantiate_threads(number_of_threads):
    """ returns a list containing the number of threads specified """
    threads_str = []
    threads = []
    index = 0
    while index < number_of_threads:
        exec("threads_str.append(f't{index}')")
        index += 1
    for t in threads_str:
        t = threading.Thread(target = null_thread())
        t.start()
        threads.append(t)
    return threads
def sample_task():
    """ dummy task """
    print("task start")
    time.sleep(10)

def main():
    number_of_threads = int(sys.argv[1])
    threads = instantiate_threads(number_of_threads)
    # a routine that assigns work to the array of threads
    index = 0
    while index < 100:
        task_assigned = False
        while not task_assigned:
            for thread in threads:
                if not thread.is_alive():
                    thread = threading.Thread(target = sample_task())
                    thread.start()
                    # script seems to wait until thread is complete before moving on...
                    print(f'index: {index}')
                    task_assigned = True
                    index += 1
    # wait for threads to finish before terminating
    for thread in threads:
        while thread.is_alive():
            pass

if __name__ == '__main__':
    main()
Solved:
You could convert to using concurrent.futures' ThreadPoolExecutor, where you can set the number of threads to spawn using max_workers=<amount of threads>. – user56700
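A minimal sketch of that suggestion, keeping the question's dynamically chosen thread count; sample_task here is just a placeholder for the asker's real work:

```python
import sys
import time
from concurrent.futures import ThreadPoolExecutor

def sample_task(index):
    """Dummy task standing in for the real per-task work."""
    time.sleep(0.01)
    return index

def run_tasks(number_of_threads, n_tasks=100):
    # the pool runs at most number_of_threads tasks concurrently;
    # submitting more than that simply queues them internally
    with ThreadPoolExecutor(max_workers=number_of_threads) as executor:
        futures = [executor.submit(sample_task, i) for i in range(n_tasks)]
        return [f.result() for f in futures]

if __name__ == '__main__':
    # thread count chosen dynamically at runtime, as in the question
    number_of_threads = int(sys.argv[1]) if len(sys.argv) > 1 else 4
    print(f"completed {len(run_tasks(number_of_threads))} tasks")
```

Note that there is no manual is_alive() polling at all: the executor hands each idle thread its next task, and the `with` block waits for everything to finish.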

Starting thread after thread is finished

Let's say I want to run 10 threads at the same time, and as soon as one finishes, immediately start a new one. How can I do that?
I know that with thread.join() I can wait for a thread to finish, but then all 10 threads need to be finished first; I want to start a new thread immediately after each one finishes.
Well, what I understand is that you need to execute 10 threads at the same time.
I suggest you use threading.BoundedSemaphore().
A code sample using it is given below:
import threading
from typing import List

sema4 = threading.BoundedSemaphore(10)
# 10 is given as the parameter since your requirement stated that just 10 threads should execute in parallel

def do_something():
    with sema4:  # each thread blocks here until one of the 10 slots is free
        print("I hope this cleared your doubt :)")

threads_list: List[threading.Thread] = []
# Above variable is used to save the threads

for i in range(100):
    thread = threading.Thread(target=do_something)
    threads_list.append(thread)  # saving the thread in order to join it later
    thread.start()  # starting the thread

for thread in threads_list:
    thread.join()  # else the parent program terminates without waiting for the child threads

Managing thread execution in Python

I am currently using worker threads in Python to get tasks from a Queue and execute them, as follows:
from queue import Queue
from threading import Thread

def run_job(item):
    # runs an independent job...
    pass

def workingThread():
    while True:
        item = q.get()
        run_job(item)
        q.task_done()

q = Queue()
num_worker_threads = 2
for i in range(num_worker_threads):
    t = Thread(target=workingThread)
    t.daemon = True
    t.start()

for item in listOfJobs:
    q.put(item)

q.join()
This is functional, but there is an issue: some of the jobs executed by the run_job function are very memory-demanding and can only be run on their own. Given that I can identify these at runtime, how could I make the other worker threads halt their execution until such a job is taken care of?
Edit: This has been flagged as a possible duplicate of Python - Thread that I can pause and resume. I referred to that question before asking, and it surely is a reference that has to be cited. However, I don't think it addresses this situation specifically, as it considers neither the jobs being inside a Queue, nor how to specifically point at the other objects that have to be halted.
I would pause/resume the threads so that they run individually.
The following thread, Python - Thread that I can pause and resume, indicates how to do that.
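One way to flesh that out for the queue-based workers in the question is a small hand-rolled gate object (a sketch, not a stdlib class): normal jobs may overlap, but a job flagged as memory-heavy first waits for in-flight jobs to drain and then runs alone. The `(item, is_heavy)` tuple convention on the queue is an assumption about how the asker would mark such jobs at runtime:

```python
import threading
from queue import Queue

class ExclusiveGate:
    """Many normal jobs may run at once; a heavy job runs strictly alone."""

    def __init__(self):
        self._cond = threading.Condition()
        self._running = 0      # normal jobs currently executing
        self._heavy = False    # a heavy job holds the gate

    def start_normal(self):
        with self._cond:
            while self._heavy:          # wait while a heavy job owns the gate
                self._cond.wait()
            self._running += 1

    def end_normal(self):
        with self._cond:
            self._running -= 1
            self._cond.notify_all()     # a waiting heavy job may proceed now

    def start_heavy(self):
        with self._cond:
            while self._heavy or self._running > 0:  # drain in-flight jobs
                self._cond.wait()
            self._heavy = True

    def end_heavy(self):
        with self._cond:
            self._heavy = False
            self._cond.notify_all()     # wake blocked normal jobs

def run_job(item):
    pass  # placeholder for the question's real job

gate = ExclusiveGate()
q = Queue()

def workingThread():
    while True:
        item, is_heavy = q.get()  # assumed (job, is_heavy) convention
        if is_heavy:
            gate.start_heavy()
            try:
                run_job(item)
            finally:
                gate.end_heavy()
        else:
            gate.start_normal()
            try:
                run_job(item)
            finally:
                gate.end_normal()
        q.task_done()
```

This is essentially a writer-preference readers/writers lock built from a Condition; the worker loop is otherwise identical to the question's.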

Python: threads can only be started once

I want to do threading in Python. I have 100 words and want to feed them to 6 different links. As soon as one of the links is done, it should get the next word, while the other threads are still working on their first word. My program should only be allowed to run further code once all 100 keywords are done. I have the following code:
threads = []

def getresults(seed):
    for link in links:
        t = threading.Thread(target=getLinkResult, args=(suggestengine, seed))
        threads.append(t)
    for thread in threads:
        thread.start()

for seed in tqdm:
    getresults(seed + a)
    getresults(seed + b)

for thread in threads:
    thread.join()
# code that should happen after
At the moment I get the following error:
threads can only be started once
You are calling getresults twice, and both calls reference the same global threads list. This means that the first call to getresults starts all the threads in the list; the second call then invokes .start() again on threads that are already running.
You should create the threads as a local list inside getresults, start those, and then append them to the global threads list.
Although you could do the following:
for thread in threads:
    if not thread.is_alive():
        thread.start()
it does not solve the problem, since one or more threads might have already finished and would therefore be started again, causing the same error.
You should start only the new threads in your getresults:
threads = []

def getresults(seed):
    local_threads = []
    for link in links:
        t = threading.Thread(target=getLinkResult, args=(suggestengine, seed))
        local_threads.append(t)
        threads.append(t)
    for thread in local_threads:
        thread.start()

for seed in tqdm:
    getresults(seed + a)
    getresults(seed + b)

for thread in threads:
    thread.join()
Fastest way, but not the brightest (general problem):
from tkinter import *
import threading, time

def execute_script():
    def sub_execute():
        print("Wait 5 seconds")
        time.sleep(5)
        print("5 seconds passed by")
    threading.Thread(target=sub_execute).start()

root = Tk()
button_1 = Button(master=root, text="Execute Script", command=execute_script)
button_1.pack()
root.mainloop()
The error is explicit: you start your threads twice, while you shouldn't.
getresults(seed + a)
getresults(seed + b)
Sequencing these calls runs the thread-starting loop twice. To properly do what you want, you need to make a thread pool and a task queue. Basically, you need a second list of words to process and a mutex: each thread locks the mutex, dequeues a word, unlocks it, and then processes the word.
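A sketch of that pool, where process_word and the generated word list are stand-ins for the asker's getLinkResult and 100 keywords:

```python
import threading

words = [f"word{i}" for i in range(100)]   # stand-in for the 100 keywords
results = []
lock = threading.Lock()

def process_word(word):
    """Placeholder for the real per-word work (e.g. getLinkResult)."""
    return word.upper()

def worker():
    while True:
        with lock:                 # only one thread dequeues at a time
            if not words:
                return             # no words left: this worker is done
            word = words.pop()
        processed = process_word(word)  # the slow work happens outside the lock
        with lock:
            results.append(processed)

threads = [threading.Thread(target=worker) for _ in range(6)]  # 6 links -> 6 workers
for t in threads:
    t.start()
for t in threads:
    t.join()
# code that should happen after: all 100 words have been processed
```

Each of the 6 threads immediately pulls its next word as soon as it finishes one, and the final join() loop guarantees the code after it only runs once all 100 words are done.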

Execute Thread 5 by 5

I have a huge list of items and my program should analyze each one of them. To speed things up I want to use threads, but limited to 5. So I need a loop with 5 threads, and when one finishes its job it should grab a new one until the end of the list.
But I don't have a clue how to do that. Should I use a queue? For now I am just running 5 threads in the simplest way:
Thank you!
for thread_number in range(5):
    thread = Th(thread_number)
    thread.start()
Separate the idea of a worker thread from that of a task -- do not have one worker handle one task and then terminate the thread. Instead, spawn 5 threads and let them all get tasks from a common queue. Let each thread iterate until it receives a sentinel from the queue telling it to quit.
This is more efficient than continually spawning and terminating threads after each completes just one task.
import logging
import queue
import threading

logger = logging.getLogger(__name__)

N = 100
sentinel = object()

def worker(jobs):
    for task in iter(jobs.get, sentinel):
        logger.info(task)
    logger.info('Done')

def main():
    logging.basicConfig(level=logging.DEBUG,
                        format='[%(asctime)s %(threadName)s] %(message)s',
                        datefmt='%H:%M:%S')
    jobs = queue.Queue()
    # put tasks in the jobs Queue
    for task in range(N):
        jobs.put(task)
    threads = [threading.Thread(target=worker, args=(jobs,))
               for thread_number in range(5)]
    for t in threads:
        t.start()
    for t in threads:
        jobs.put(sentinel)  # one sentinel per worker, so every worker terminates
    for t in threads:
        t.join()

if __name__ == '__main__':
    main()
It seems that you want a thread pool. If you're using Python 3, you're in luck: there is a ThreadPoolExecutor class in concurrent.futures.
Otherwise, from this SO question, you can find various solutions, either handcrafted or using hidden modules from the Python library.
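For comparison with the sentinel approach above, here is the same 5-worker setup expressed with ThreadPoolExecutor; the analyze function and the item list are placeholders for the asker's real per-item work:

```python
from concurrent.futures import ThreadPoolExecutor

items = list(range(100))  # stand-in for the huge list of items to analyze

def analyze(item):
    """Placeholder for the real per-item analysis."""
    return item * 2

# at most 5 analyses run concurrently; executor.map yields results
# in input order, and the with-block waits for all work to finish
with ThreadPoolExecutor(max_workers=5) as executor:
    results = list(executor.map(analyze, items))

print(f"analyzed {len(results)} items")
```

No queue, sentinels, or joins are needed: the pool owns its 5 threads and feeds them items as they become free.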
