Python threading global variable issue

I have a few scripts that I want to run simultaneously. They read a CSV file, and I'm trying the following:
import sys
import csv

out = open("C:\PYDUMP\PYDUMPINST.csv", "r")
dataf = csv.reader(out)
for row in dataf:
    take = row[0]
    give = row[1]

def example():
    try:
        lfo = int(take)
        if lfo > 0:
            #code
    except Exception, e:
        pass

example()
This is saved as takefile1.py. I have 20 scripts with similar structures that I want to run simultaneously, so I'm using the following (which I have been using to run other batches of scripts trouble-free):
import csv
import sys
from threading import Thread

def firstsend_lot():
    execfile("C:\Users\takefile1.py")
    execfile("C:\Users\takefile2.py")

def secondsend_lot():
    execfile("C:\Users\takefile3.py")
    execfile("C:\Users\takefile4.py")

if __name__ == '__main__':
    Thread(target=firstsend_lot).start()
    Thread(target=secondsend_lot).start()
So I am getting the error "global name 'take' is not defined". Anyone got any suggestions? I'm pretty hopeless at Python, so pretend you are talking to an idiot.

Your function example() does not have access to take. Try adding a line to it:
def example():
    global take
    try:
        lfo = int(take)
        if lfo > 0:
            #code
    except Exception, e:
        pass
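A different way to avoid the shared-global problem entirely is to keep the CSV reading and the per-row work inside functions and hand the values over as arguments, then start one thread per file instead of calling execfile() from inside a thread. This is only a minimal sketch: the process_file/process_row names and the list of paths are illustrative, not part of the original scripts.
import csv
from threading import Thread

def process_row(take, give):
    try:
        lfo = int(take)
        if lfo > 0:
            pass  # per-row code goes here
    except Exception:
        pass

def process_file(path):
    with open(path, "r") as f:
        for row in csv.reader(f):
            process_row(row[0], row[1])

if __name__ == '__main__':
    paths = [r"C:\PYDUMP\PYDUMPINST.csv"]  # list every takefile's CSV here
    threads = [Thread(target=process_file, args=(p,)) for p in paths]
    for t in threads:
        t.start()
    for t in threads:
        t.join()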

Related

Python: How to resume the python script as soon as vpn network is up?

I have a python script (xyz.py) that I run through the command prompt. My question is: is there any method that resumes the Python code automatically from the point where the VPN connection was lost, without any manual intervention? This would avoid having to monitor the code frequently. Below is my code, but it starts reading from the beginning again if there is any disconnection. Please suggest.
import subprocess

filename = 'xyz.py'
while True:
    p = subprocess.Popen('python ' + filename, shell=True).wait()
    # If 'xyz.py' exits with an error, the while loop repeats;
    # otherwise the program breaks out of the loop.
    if p != 0:
        continue
    else:
        break
If it were me, I would use time.sleep:
import os
import time
from datetime import datetime
import requests

script = 'xyz.py'

def main():
    network_check_url = 'http://8.8.8.8'
    while True:
        try:
            requests.get(network_check_url)
        except Exception as e:
            print(datetime.now(), e)
            time.sleep(1)
        else:
            print(f'Network is ok. {datetime.now():%Y-%m-%d_%H:%M:%S}')
            os.system(f'python {script}')
            return

if __name__ == '__main__':
    main()
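Note that both loops above only restart xyz.py from scratch. If the goal is to truly resume from where it stopped, xyz.py itself has to persist its progress. Here is a minimal checkpointing sketch, under the assumption that xyz.py works through a list of items; checkpoint.txt, get_work_items() and process_item() are placeholders, not part of the original script.
import os

CHECKPOINT = 'checkpoint.txt'

def get_work_items():
    # placeholder: replace with however xyz.py builds its work list
    return list(range(100))

def process_item(item):
    # placeholder: replace with the real per-item work in xyz.py
    pass

def load_checkpoint():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return int(f.read().strip() or 0)
    return 0

def save_checkpoint(index):
    with open(CHECKPOINT, 'w') as f:
        f.write(str(index))

def main():
    items = get_work_items()
    for i in range(load_checkpoint(), len(items)):
        process_item(items[i])
        save_checkpoint(i + 1)      # the next run resumes from here
    os.remove(CHECKPOINT)           # finished cleanly, start fresh next time

if __name__ == '__main__':
    main()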

Packaging python modules having MultiProcessing Code

I am trying to package my python project into an executable using pyinstaller. The main module contains code for multiprocessing. When I run the executable, only the lines of code prior to the multiprocessing part get executed, again and again; it neither throws an exception nor exits the program.
Code in main module:
from Framework.ExcelUtility import ExcelUtility
from Framework.TestRunner import TestRunner
import concurrent.futures

class Initiator:

    def __init__(self):
        self.exec_config_dict = {}
        self.test_list = []
        self.test_names = []
        self.current_test_set = []

    def set_first_execution_order(self):
        # Code

    def set_subsequent_execution_order(self):
        # Code

    def kick_off_tests(self):
        '''Method to do Multi process execution'''
        if __name__ == "__main__":
            with concurrent.futures.ProcessPoolExecutor(max_workers=int(self.exec_config_dict.get('Parallel'))) as executor:
                for test in self.current_test_set:
                    executor.submit(TestRunner().runner, test)  # *** This line is not being executed from the exe file.

initiator = Initiator()
initiator.get_run_info()
initiator.set_first_execution_order()
initiator.kick_off_tests()
while len(initiator.test_list) > 0:
    initiator.set_subsequent_execution_order()
    try:
        initiator.kick_off_tests()
    except BaseException as exception:
        print(exception)
From the problem definition I'm assuming you are using ms-windows, and that the main module is not named __main__.py.
In that case, multiprocessing has some special guidelines:
Make sure that the main module can be safely imported by a new Python interpreter without causing unintended side effects (such as starting a new process).
and
Instead one should protect the “entry point” of the program by using if __name__ == '__main__'
So, change the last part of your main module like this:
from multiprocessing import freeze_support

    # kick_off_tests stays a method of Initiator; only the __name__ check moves out of it
    def kick_off_tests(self):
        '''Method to do Multi process execution'''
        with concurrent.futures.ProcessPoolExecutor(max_workers=int(self.exec_config_dict.get('Parallel'))) as executor:
            for test in self.current_test_set:
                executor.submit(TestRunner().runner, test)

if __name__ == '__main__':
    freeze_support()
    initiator = Initiator()
    initiator.get_run_info()
    initiator.set_first_execution_order()
    initiator.kick_off_tests()
    while len(initiator.test_list) > 0:
        initiator.set_subsequent_execution_order()
        try:
            initiator.kick_off_tests()
        except BaseException as exception:
            print(exception)
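For reference, the same pattern stripped down to a self-contained sketch (the worker and inputs are illustrative): every process-spawning call sits behind the __name__ guard, and freeze_support() is called first. freeze_support() is a no-op except inside a frozen Windows executable such as one built with pyinstaller --onefile.
import concurrent.futures
from multiprocessing import freeze_support

def worker(n):
    return n * n

def main():
    with concurrent.futures.ProcessPoolExecutor(max_workers=2) as executor:
        results = list(executor.map(worker, range(10)))
    print(results)

if __name__ == '__main__':
    freeze_support()   # no-op unless running from a frozen Windows executable
    main()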

Get live value of variable from another script

I have two scripts, new.py and test.py.
Test.py
import time

while True:
    x = "hello"
    time.sleep(1)
    x = "world"
    time.sleep(1)
new.py
import time

while True:
    import test
    x = test.x
    print(x)
    time.sleep(1)
Now, from my understanding, this should print "hello" and a second later "world", over and over, when executing new.py.
It does not print anything; how can I fix that?
Thanks
I think the code below captures what you are asking. Here I simulate two scripts running independently (by using threads), then show how you can use shelve to communicate between them. Note, there are likely much better ways to get to what you are after -- but if you absolutely must run the scripts independently, this will work for you.
Incidentally, any persistent source would do (such as a database).
import shelve
import time
import threading

def script1():
    while True:
        with shelve.open('my_store') as holder3:
            if holder3['flag'] is not None:
                break
        print('waiting')
        time.sleep(1)
    print("Done")

def script2():
    print("writing")
    with shelve.open('my_store') as holder2:
        holder2['flag'] = 1

if __name__ == "__main__":
    with shelve.open('my_store') as holder1:
        holder1['flag'] = None
    t = threading.Thread(target=script1)
    t.start()
    time.sleep(5)
    script2()
    t.join()
Yields:
waiting
waiting
waiting
waiting
waiting
writing
Done
Test.py
import time

def hello():
    callList = ['hello', 'world']
    for item in callList:
        print(item)
        time.sleep(1)

hello()
new.py
from test import hello

while True:
    hello()
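The reason the original new.py prints nothing, by the way, is that import test executes test.py's top-level while True loop and never returns, so the import blocks forever (and repeated import statements would not re-run the module anyway, since imported modules are cached). If both loops can live in one process, a background thread updating a shared holder is a simpler alternative to shelve; this is just a sketch, and the latest holder and producer() name are illustrative.
import threading
import time

latest = {"x": None}            # shared holder read by the main loop

def producer():
    while True:
        latest["x"] = "hello"
        time.sleep(1)
        latest["x"] = "world"
        time.sleep(1)

if __name__ == "__main__":
    threading.Thread(target=producer, daemon=True).start()
    while True:
        print(latest["x"])
        time.sleep(1)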

Run multiple servers in python at same time (Threading)

I have 2 servers in Python that I want to combine into one single .py file and run together:
Server.py:
import logging, time, os, sys

from yowsup.layers import YowLayerEvent, YowParallelLayer
from yowsup.layers.auth import AuthError
from yowsup.layers.network import YowNetworkLayer
from yowsup.stacks.yowstack import YowStackBuilder
from layers.notifications.notification_layer import NotificationsLayer
from router import RouteLayer

class YowsupEchoStack(object):
    def __init__(self, credentials):
        "Creates the stacks of the Yowsup Server,"
        self.credentials = credentials
        stack_builder = YowStackBuilder().pushDefaultLayers(True)
        stack_builder.push(YowParallelLayer([RouteLayer, NotificationsLayer]))
        self.stack = stack_builder.build()
        self.stack.setCredentials(credentials)

    def start(self):
        self.stack.broadcastEvent(YowLayerEvent(YowNetworkLayer.EVENT_STATE_CONNECT))
        try:
            logging.info("#" * 50)
            logging.info("\tServer started. Phone number: %s" % self.credentials[0])
            logging.info("#" * 50)
            self.stack.loop(timeout=0.5, discrete=0.5)
        except AuthError as e:
            logging.exception("Authentication Error: %s" % e.message)
            if "<xml-not-well-formed>" in str(e):
                os.execl(sys.executable, sys.executable, *sys.argv)
        except Exception as e:
            logging.exception("Unexpected Exception: %s" % e.message)

if __name__ == "__main__":
    import sys
    import config

    logging.basicConfig(stream=sys.stdout, level=config.logging_level, format=config.log_format)
    server = YowsupEchoStack(config.auth)
    while True:
        # In case of disconnect, keeps connecting...
        server.start()
        logging.info("Restarting..")
App.py:
import web

urls = (
    '/', 'index'
)
app = web.application(urls, globals())

class index:
    def GET(self):
        greeting = "Hello World"
        return greeting

if __name__ == "__main__":
    app.run()
I want to run both together from a single .py file.
If I try to run them from one file, one of them starts, and the other one only starts when the first one is done working.
How can I run 2 servers in Python at the same time?
import thread

def run_app1():
    pass  # something goes here

def run_app2():
    pass  # something goes here

if __name__ == '__main__':
    thread.start_new_thread(run_app1, ())   # an args tuple is required, even if empty
    thread.start_new_thread(run_app2, ())
If you need to pass args to the functions, you can do:
thread.start_new_thread(run_app1, (arg1,arg2,....))
If you want more control over your threads, you could go:
import threading

def app1():
    pass  # something here

def app2():
    pass  # something here

if __name__ == '__main__':
    t1 = threading.Thread(target=app1)
    t2 = threading.Thread(target=app2)
    t1.start()
    t2.start()
If you need to pass args, you can go:
t1 = threading.Thread(target=app1, args=(arg1,arg2,arg3.....))
What's the difference between thread and threading? threading is a higher-level module than thread, and in 3.x thread got renamed to _thread... more info here: http://docs.python.org/library/threading.html but that's for another question I guess.
So in your case, just make one function that runs the first server and another that runs the second, and spawn threads to run them.
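Applied to the two servers above, assuming both snippets are merged into the one combined file (so config, YowsupEchoStack, app and logging are all in scope), the glue could look roughly like this; run_whatsapp_server and run_web_app are illustrative names, not from the original code:
import threading

def run_whatsapp_server():
    server = YowsupEchoStack(config.auth)
    while True:
        server.start()              # reconnect loop, as in Server.py
        logging.info("Restarting..")

def run_web_app():
    app.run()                       # web.py's blocking server, as in App.py

if __name__ == '__main__':
    t1 = threading.Thread(target=run_whatsapp_server)
    t2 = threading.Thread(target=run_web_app)
    t1.start()
    t2.start()
    t1.join()
    t2.join()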

atexit not writing to file

I'm testing whether the atexit module runs properly by attempting to print to a file. The actual purpose of atexit will be to close the open port; however, this is to test whether atexit is working in the first place. atexit is not writing to the file, any ideas why not? Any help is appreciated.
import atexit
import scala5
from scala5 import sharedvars
scalavars = sharedvars()
import serial

myPort = serial.Serial()
myPort.baudrate = 9600
myPort.port = "COM4"
myPort.parity = serial.PARITY_NONE
myPort.stopbits = serial.STOPBITS_ONE
myPort.bytesize = serial.EIGHTBITS
myPort.timeout = 2
myPort.writeTimeout = 2

try:
    myPort.close()
except:
    print("didn't need to close")

def port_exit():
    fo = open("test.txt", "w")
    fo.write("This works!")
    fo.close()

myPort.open()

while True:
    x = myPort.readline()
    if x == "1\r\n":
        scalavars.door = x
        scala5.ScalaPlayer.Sleep(10)

atexit.register(port_exit)
port_exit()
You are registering the function after your while loop, which I am assuming you are exiting by killing the script (meaning the function port_exit never gets registered). If you register the function before the while loop, it should trigger and write the file when the script exits.
atexit.register(port_exit)

while True:
    x = myPort.readline()
    if x == "1\r\n":
        scalavars.door = x
        scala5.ScalaPlayer.Sleep(10)
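As a self-contained sanity check of the same idea (the file name and sleep loop are illustrative): the handler is registered before the blocking loop, and atexit only fires on a normal interpreter shutdown (for example an unhandled KeyboardInterrupt or sys.exit()), not when the process is killed outright.
import atexit
import time

def on_exit():
    with open("atexit_test.txt", "w") as fo:
        fo.write("This works!")

atexit.register(on_exit)        # register first, before anything that blocks

while True:
    time.sleep(1)               # stand-in for myPort.readline(); Ctrl+C to exit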
