Python: scheduling jobs with the schedule module alongside the socket module is not working

I am trying to create a socket server (TCP/IP), and inside it, based on some data from the client, I schedule a few background jobs.
The following code works:
import schedule
import time

def test1():
    print('hi from 1')

def test2():
    print('hi from test2')

schedule.every(1).minutes.do(test1)
schedule.every(2).minutes.do(test2)

while True:
    schedule.run_pending()
    time.sleep(1)
Then I tried the same thing together with a socket server, and now it does not execute the jobs/functions. Can someone help me understand what is happening here?
Code that is not working:
import schedule
import time
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_address = ('localhost', 8009)
print('starting up on {} port {}'.format(*server_address))
sock.bind(server_address)
# Listen for incoming connections
sock.listen(1)

def test1():
    print('hi from 1')

def test2():
    print('hi from test2')

schedule.every(1).minutes.do(test1)
schedule.every(2).minutes.do(test2)

while True:
    schedule.run_pending()
    time.sleep(1)
    print('waiting for a connection')
    connection, client_address = sock.accept()
    data = connection.recv(1024)
    result = data.decode('utf-8')
    print('data received from client:', result)
What I am trying to achieve: I want to create a Python socket server that accepts requests from Node.js clients and, based on the clients' data, schedules a few jobs in Python. For this I am using the socket and schedule modules in Python to create the socket server and schedule the jobs, and the net module in the Node.js client to send data to the Python server.

Please explain your problem in more detail. sock.accept is blocking, so the whole loop blocks on it; is that your problem?
To prevent the program from blocking, you can run the scheduler loop in one thread and the accept loop in another. Create a main thread to manage your child threads. Have a look at the threading module.
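For illustration, a minimal sketch of that two-thread layout, assuming the same port and job functions as in the question (the thread and handler names are made up):

import schedule
import socket
import threading
import time

def test1():
    print('hi from 1')

def scheduler_loop():
    # this thread only runs pending jobs; it never touches the socket
    while True:
        schedule.run_pending()
        time.sleep(1)

def accept_loop(sock):
    # this loop may block on accept() without stopping the scheduler thread
    while True:
        connection, client_address = sock.accept()
        data = connection.recv(1024)
        print('data received from client:', data.decode('utf-8'))
        # register a job based on the client's data, for example:
        schedule.every(1).minutes.do(test1)
        connection.close()

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(('localhost', 8009))
sock.listen(1)

threading.Thread(target=scheduler_loop, daemon=True).start()
accept_loop(sock)  # the main thread handles incoming connections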
It may also make sense to use another scheduler library that can handle threading, see here.
Disclosure: I'm one of the authors of the scheduler library

Related

Python multithreading for simultaneous input and socket connection

A friend of mine and I have started to dig deeper into network programming. Concurrency and parallelism are a big part of this.
We have created a server and a client and connected them, and this works fine. Now we want to create a thread in the server that checks for keyboard input while listening for connections on the socket. Maybe we got something totally wrong, but we tried it with the code below and a ThreadPoolExecutor, and the program gets stuck at the first await call:
i = await ainput.asyncInput()
We thought that after the await the thread waits for input and the main thread continues execution, but that seems to be wrong.
Here is the server module:
import socket
import asyncio
import asyncron_Input as ainput

def closeServer():
    exit()

server_address = ('localhost', 6969)

async def main():
    # create TCP socket
    serverSock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # print the address and the port of the server
    print("starting up on: ")
    print(server_address[0])
    print("port: ")
    print(server_address[1])
    # bind the socket to the given address
    serverSock.bind(server_address)
    # listen for connections
    serverSock.listen(1)
    print("End server with 1")
    while True:
        # close server via async input
        i = await ainput.asyncInput()
        if i == "1":
            closeServer()
        # wait for a connection
        print("waiting for connection")
        conn, client_address = serverSock.accept()
        try:
            print("connected to", client_address)
            while True:
                data = conn.recv(16)
                if data:
                    print("received:", data)
                    data = "successful"
                else:
                    print("no data")
                    break
        finally:
            # close connection
            conn.close()

asyncio.run(main())
Here is the async input:
import asyncio
from concurrent.futures import ThreadPoolExecutor

async def asyncInput():
    with ThreadPoolExecutor(1, 'Async Input') as executor:
        return await asyncio.get_event_loop().run_in_executor(executor, input)
Thanks for your help in advance
There are two problems with your code:
You wait for input before accepting any socket connections. You can't use await if you want the code following it to run concurrently; you need to use a Task.
You're using blocking sockets. sock.accept and sock.recv are blocking by default, and they halt execution of your event loop. You need to use them in an await expression, which means making your server socket non-blocking and then using the asyncio-specific socket methods.
To fix this, you'll need to wrap listening for input in a task, make your server socket non-blocking, get the running event loop, and then use the sock_accept and sock_recv methods of the event loop. Putting this all together, your code will look something like this:
import asyncio
import socket
from concurrent.futures import ThreadPoolExecutor

async def asyncInput():
    with ThreadPoolExecutor(1, 'Async Input') as executor:
        return await asyncio.get_event_loop().run_in_executor(executor, input)

def closeServer():
    exit()

server_address = ('localhost', 8000)

async def loop_for_input():
    while True:
        # close server via async input
        i = await asyncInput()
        if i == "1":
            closeServer()

async def main():
    # create TCP socket
    serverSock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    serverSock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    # print the address and the port of the server
    print("starting up on: ")
    print(server_address[0])
    print("port: ")
    print(server_address[1])
    # bind the socket to the given address
    serverSock.bind(server_address)
    serverSock.setblocking(False)  # make your socket non-blocking
    # listen for connections
    serverSock.listen(1)
    print("End server with 1")
    loop = asyncio.get_running_loop()  # get the running event loop
    input_task = asyncio.create_task(loop_for_input())  # create a task to run your input loop
    while True:
        # wait for a connection
        print("waiting for connection")
        conn, client_address = await loop.sock_accept(serverSock)  # use the sock_accept coroutine to asynchronously accept connections
        try:
            print("connected to", client_address)
            while True:  # you may also want to create a task for this loop
                data = await loop.sock_recv(conn, 16)  # use the sock_recv coroutine to asynchronously receive data
                if data:
                    print("received:", data)
                    data = "successful"
                else:
                    print("no data")
                    break
        finally:
            # close connection
            conn.close()

asyncio.run(main())
There is potentially a third problem: your code can only handle one client at a time, since you enter an infinite loop for the first connection that comes in. This means any additional clients that connect will be blocked. If you want to solve that problem, then any time a client connects, create a new Task to listen for data from that client, similar to what the code above does with asyncInput().
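A rough sketch of that idea, assuming a handle_client coroutine (the name is made up) that replaces the inner while-loop in main() above:

async def handle_client(loop, conn):
    # one task per client, so accept() can keep running for new clients
    try:
        while True:
            data = await loop.sock_recv(conn, 16)
            if not data:
                break
            print("received:", data)
    finally:
        conn.close()

# inside main(), instead of the inner while-loop:
#     conn, client_address = await loop.sock_accept(serverSock)
#     asyncio.create_task(handle_client(loop, conn))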

Blender API (bpy) and socket server

I am running Blender with a Python file via the shell like this:
./blender visualizer.blend -P Visualizer.py
In my Python file I have a socket server that receives a list and loops over the list to create meshes accordingly.
The problem is that I have to use threading so that Blender doesn't freeze, because otherwise Blender's window won't even show.
But from the socket thread, it appears I cannot create meshes: it crashes my Blender without throwing any exception. I have tried multiprocessing as well, and it freezes Blender.
Does anybody have an idea how to have the socket server receive data and create meshes without freezing Blender?
import socket
import time
from threading import Thread

def socket_server(*args):
    HOST = '127.0.0.1'
    PORT = 12345
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, True)
    s.bind((HOST, PORT))
    s.listen(5)
    conn, addr = s.accept()
    while 1:
        data = conn.recv(16384)
        if not data:
            break
        # creates mesh here
        conn.send(b'ok')
        time.sleep(1.0)
    conn.close()

if __name__ == '__main__':
    try:
        t = Thread(None, socket_server)  # crashes
        t.start()
        # socket_server()  # freezes
        # worker = mp.Process(target=socket_server())  # freezes
        # worker.daemon = True
        # worker.start()
    except Exception as e:
        print(e)
The Blender API docs clearly warn you not to use threading at all.
You can create an Operator and run it with a timer event.
With the timer event you can poll network messages from the socket and perform any action.
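As an illustration only, a rough sketch of that approach based on Blender's standard modal timer operator template (the class name, identifier, and polling interval are made up, and the Blender 2.8+ API is assumed; a complete operator would also remove the timer in cancel()):

import bpy
import socket

class SocketPollOperator(bpy.types.Operator):
    """Poll a non-blocking socket from a timer instead of a thread."""
    bl_idname = "wm.socket_poll_operator"
    bl_label = "Socket Poll Operator"

    _timer = None
    _sock = None

    def modal(self, context, event):
        if event.type == 'TIMER':
            try:
                conn, addr = self._sock.accept()
            except BlockingIOError:
                return {'PASS_THROUGH'}  # no client waiting on this tick
            conn.settimeout(0.2)  # don't let a slow client stall the UI for long
            try:
                data = conn.recv(16384)
                if data:
                    pass  # create meshes here, safely on Blender's main thread
                conn.send(b'ok')
            except socket.timeout:
                pass
            finally:
                conn.close()
        return {'PASS_THROUGH'}

    def execute(self, context):
        self._sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self._sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, True)
        self._sock.bind(('127.0.0.1', 12345))
        self._sock.listen(5)
        self._sock.setblocking(False)  # accept() must not block the UI
        wm = context.window_manager
        self._timer = wm.event_timer_add(1.0, window=context.window)
        wm.modal_handler_add(self)
        return {'RUNNING_MODAL'}

bpy.utils.register_class(SocketPollOperator)
bpy.ops.wm.socket_poll_operator()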
I came to realize that Blender needs time to finish drawing an object before being asked to draw another one, so all I had to do was put time.sleep(1) in my loop that creates the objects.

Python asyncio - starting coroutines in an infinite loop

I am making a simple server/client chat program in Python. This program should allow for multiple users to connect at once, and then execute their requests concurrently. For this, I am using the asyncio module and sockets.
async def accept_new_connections(socket):
    socket.listen(1)
    while True:
        connection, client_address = sock.accept()
        print("accepted conn")
        asyncio.create_task(accept_commands(socket, connection))

async def accept_commands(socket, connection):
    print("accept cmd started")
    while True:
        ...  # get and execute commands

def main():
    asyncio.run(accept_new_connections(socket))

main()
What I would hope to do is run accept_commands for each of the connections, which would then execute commands concurrently. However, the current code only starts accept_commands for the first connection and then blocks the while loop (the one in accept_new_connections).
Any idea what I need to change to have accept_commands started for each of the connections instead?
It is tough to tell because your example does not include the implementation of accept_commands, but based on your issue it is likely that you need to use the async socket methods on the event loop itself, so that your coroutine can yield execution and let something else happen.
The example below shows how to do this. It starts a socket on port 8080 and sends any data it receives back to the client. You can see it working concurrently by connecting two clients with netcat or telnet and sending data.
import asyncio
import socket

socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
socket.bind(('localhost', 8080))
socket.listen(1)
socket.setblocking(False)

loop = asyncio.new_event_loop()

async def main():
    while True:
        connection, client_address = await loop.sock_accept(socket)
        print('connected')
        loop.create_task(accept_commands(connection))

async def accept_commands(connection):
    while True:
        request = await loop.sock_recv(connection, 16)
        await loop.sock_sendall(connection, request)

loop.run_until_complete(main())

Python code not continuing execution after thread started

I am writing a threaded Python script for the first time and running into some trouble. The general idea is that a Raspberry Pi receives data from a Bluetooth connection, this data is then used to create a thread that calls the start_laps method. Once this thread is started, I need to continue listening for new data to determine if the thread should be killed. However, my code is not continuing execution after the thread is started. What would cause this?
import json
import bluetooth
import threading
import timed_LEDs
import subprocess
import ast

def start_laps(delay, lap_times):
    timed_LEDs.start_LEDs(delay, lap_times)

# put the Pi in discoverable mode
subprocess.call(['sudo', 'hciconfig', 'hci0', 'piscan'])

server_socket = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
port = 1
server_socket.bind(("", port))
server_socket.listen(1)

client_socket, address = server_socket.accept()
print("Accepted connection from ", address)

threads = []
while True:
    print("RECEIVING")
    data = client_socket.recv(1024)
    data = json.loads(data.decode())
    print(data)
    if data["lap_times"]:
        print("STARTING THREAD")
        t = threading.Thread(target=start_laps(int(data["delay"]), ast.literal_eval(data["lap_times"])))
        threads.append(t)
        t.start()
    elif data == "stop":
        print("Stop dat lap")
    else:
        print(data)

client_socket.close()
You are using the threading module incorrectly.
This line
threading.Thread(target=start_laps(int(data["delay"]), ast.literal_eval(data["lap_times"])))
calls start_laps immediately in the main thread and passes its return value as target, which is why the program blocks. What you want is the following:
threading.Thread(target=start_laps, args=(int(data["delay"]), ast.literal_eval(data["lap_times"])))
This runs the function in the created thread with the given args.
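A tiny self-contained illustration of the difference (the work function is made up for the example):

import threading
import time

def work(n):
    time.sleep(1)
    print("done", n)

# Wrong: work(5) runs immediately in the main thread and blocks for a second;
# the Thread then receives target=None.
# threading.Thread(target=work(5)).start()

# Right: the new thread calls work(5) itself, and the main thread keeps going.
threading.Thread(target=work, args=(5,)).start()
print("main thread keeps going")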

I need the server to send messages to all clients (Python, sockets)

This is my server program. How can I send the data received from each client to every other client?
import socket
import os
from threading import Thread
import thread

def listener(client, address):
    print "Accepted connection from: ", address
    while True:
        data = client.recv(1024)
        if not data:
            break
        else:
            print repr(data)
            client.send(data)
    client.close()

host = socket.gethostname()
port = 10016

s = socket.socket()
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind((host, port))
s.listen(3)

th = []
while True:
    print "Server is listening for connections..."
    client, address = s.accept()
    th.append(Thread(target=listener, args=(client, address)).start())

s.close()
If you need to send a message to all clients, you need to keep a collection of all clients in some way. For example:
import threading

clients = set()
clients_lock = threading.Lock()

def listener(client, address):
    print "Accepted connection from: ", address
    with clients_lock:
        clients.add(client)
    try:
        while True:
            data = client.recv(1024)
            if not data:
                break
            else:
                print repr(data)
                with clients_lock:
                    for c in clients:
                        c.sendall(data)
    finally:
        with clients_lock:
            clients.remove(client)
        client.close()
It would probably be clearer to factor parts of this out into separate functions, like a broadcast function that did all the sends.
Anyway, this is the simplest way to do it, but it has problems:
If one client has a slow connection, everyone else could bog down writing to it. And while they're blocking on their turn to write, they're not reading anything, so you could overflow the buffers and start disconnecting everyone.
If one client has an error, the client whose thread is writing to that client could get the exception, meaning you'll end up disconnecting the wrong user.
So, a better solution is to give each client a queue, and a writer thread servicing that queue, alongside the reader thread. (You can then extend this in all kinds of ways—put limits on the queue so that people stop trying to talk to someone who's too far behind, etc.)
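A rough sketch of that queue-per-client idea (Python 2, to match the code above; the client_queues dict and helper names are made up, and each queue is assumed to be a bounded Queue.Queue(maxsize=...)):

import Queue
import threading

client_queues = {}             # maps each client socket to its bounded outgoing queue
clients_lock = threading.Lock()

def writer(client, q):
    # one writer thread per client drains that client's queue
    while True:
        msg = q.get()
        if msg is None:        # sentinel: shut this writer down
            break
        client.sendall(msg)

def broadcast(data):
    # reader threads never block on a slow peer; they just enqueue
    with clients_lock:
        for q in client_queues.values():
            try:
                q.put_nowait(data)
            except Queue.Full:
                pass           # peer too far behind: drop, or disconnect it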
As Anzel points out, there's a different way to design servers besides using a thread (or two) per client: using a reactor that multiplexes all of the clients' events.
Python 3.x has some great libraries for this built in, but 2.7 only has the clunky and out-of-date asyncore/asynchat and the low-level select.
As Anzel says, Python SocketServer: sending to multiple clients has an answer using asyncore, which is worth reading. But I wouldn't actually use that. If you want to write a reactor-based server in Python 2.x, I'd either use a better third-party framework like Twisted, or find or write a very simple one that sits directly on select.
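For completeness, a very small select-based sketch of the same broadcast server (not production-grade: a blocking sendall to a slow peer still stalls the loop, and partial sends are not handled):

import select
import socket

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind((socket.gethostname(), 10016))
server.listen(3)

sockets = [server]
while True:
    readable, _, _ = select.select(sockets, [], [])
    for s in readable:
        if s is server:
            # new client connecting
            client, address = server.accept()
            sockets.append(client)
        else:
            data = s.recv(1024)
            if data:
                # broadcast to everyone except the sender
                for c in sockets:
                    if c is not server and c is not s:
                        c.sendall(data)
            else:
                # client disconnected
                sockets.remove(s)
                s.close()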
