I am trying to set up a program with Twisted to emulate serial devices that send/receive AT commands. Different devices open a different number of serial ports and use these ports for different things.
I would like my application to be able to open as many SerialPorts as needed and to know which serial device is writing to dataReceived. I did not want to run each port on a different reactor or thread.
Is there any way to do this?
class VirtualDeviceBase(LineReceiver):

    def __init__(self, reactor, serial_address):
        [...]

    def open_port(self):
        self.serial_device = SerialPort(self, serial_address, reactor)
        self.serial_device2 = SerialPort(?, serial_address, reactor)

    def dataReceived(self, data):
        [...]
I have tried this:
class VirtualDeviceBase(LineReceiver):

    class Protocol(LineReceiver):
        def __init__(self, reactor, address):
            [...]

    def open_port(self):
        new_protocol = self.Protocol()
        self.serial_device = SerialPort(self, serial_address, reactor)
        self.serial_device2 = SerialPort(new_protocol, serial_address, reactor)
and it does not throw any errors, but then neither of them calls dataReceived any more.
Just instantiate two SerialPorts. Give them whatever protocols you like. They can share a reactor. A single reactor can handle many different event sources.
from twisted.internet.defer import Deferred
from twisted.internet.protocol import Protocol
from twisted.internet.serialport import SerialPort
from twisted.internet.task import react

class Echo(Protocol):
    def dataReceived(self, data):
        print("Received: {}".format(data))

def main(reactor):
    SerialPort(Echo(), "com0", reactor)
    SerialPort(Echo(), "com1", reactor)
    # react() expects main to return a Deferred; a Deferred that never
    # fires keeps the reactor (and both ports) running until interrupted.
    return Deferred()

react(main, [])
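If you also need to know which device the data came from (the other part of the question), one option is to give each protocol instance a label. This is only a sketch building on the example above; LabelledEcho and the device names are illustrative:

from twisted.internet.defer import Deferred
from twisted.internet.protocol import Protocol
from twisted.internet.serialport import SerialPort
from twisted.internet.task import react

class LabelledEcho(Protocol):
    def __init__(self, label):
        # Identifies which serial device this protocol instance is attached to.
        self.label = label

    def dataReceived(self, data):
        print("{} received: {}".format(self.label, data))

def main(reactor):
    SerialPort(LabelledEcho("device-A"), "com0", reactor)
    SerialPort(LabelledEcho("device-B"), "com1", reactor)
    return Deferred()

react(main, [])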
I would like to use Twisted as a client/server manager that is part of regular Python objects.
The solution I am trying to implement is to isolate Twisted in its own process using multiprocessing.Process, and communicate with this process through multiprocessing.Pipe.
I have coded the client/server logic with Twisted already, but now I am stuck at interfacing the multiprocessing.Pipe communication with the reactor.
I am a beginner with Twisted, so I may be missing something obvious, but from what I understand about how reactors work, I guess the reactor is somehow supposed to poll my multiprocessing.Pipe along with the sockets it already seems to handle nicely. So my question is: how can I make the reactor listen to my multiprocessing.Pipe on top of what it is already doing?
Thus far my code looks something like this:
from multiprocessing import Pipe, Process

from twisted.internet.endpoints import TCP4ServerEndpoint
from twisted.internet.protocol import Factory, Protocol


class ServerProtocol(Protocol):
    def __init__(self, server):
        self._server = server

    def connectionMade(self):
        pass

    def connectionLost(self, reason):
        pass

    def dataReceived(self, data):
        pass


class ServerProtocolFactory(Factory):
    protocol = ServerProtocol

    def __init__(self, server):
        self.server = server

    def buildProtocol(self, addr):
        return ServerProtocol(self.server)


class Server:
    def __init__(self):
        pass

    def run(self, pipe):
        """
        This is called in its own process
        """
        from twisted.internet import reactor
        endpoint = TCP4ServerEndpoint(reactor, self._port)
        endpoint.listen(ServerProtocolFactory(self))
        reactor.run()  # main Twisted reactor loop


class MyObject:
    def __init__(self):
        self._pipe = Pipe()
        self._server = Server()
        self._p = Process(target=self._server.run, args=(self._pipe,))
        self._p.start()

    def stop(self):
        # I want to send some stop command through the Pipe here
        self._p.join()


if __name__ == "__main__":
    obj = MyObject()
    # do stuff here
    obj.stop()
I don't know if Twisted will work when run this way (i.e., as the target of a multiprocessing.Process). Let's assume it will, though.
multiprocessing.Pipe is documented as returning a two-tuple of multiprocessing.Connection objects. multiprocessing.Connection is documented as having a fileno method returning a file descriptor (or handle) used by the Connection.
If it is a file descriptor then there is probably a very easy path to integrating it with a Twisted reactor. Most Twisted reactors implement IReactorFDSet which has an addReader method which accepts an IReadDescriptor value.
Connection is not quite an IReadDescriptor but it is easily adapted to be one:
from attrs import define
from multiprocessing.connection import Connection

from twisted.python.failure import Failure


@define
class ConnectionToDescriptor:
    _conn: Connection

    def fileno(self) -> int:
        return self._conn.fileno()

    def doRead(self) -> None:
        # Called by the reactor when the pipe has data ready.
        some_data = self._conn.recv()
        # Process some_data how you like

    def connectionLost(self, reason: Failure) -> None:
        self._conn.close()
If you wrap this around your read Connection and then pass the result to reactor.addReader the reactor will use fileno to figure out what to monitor for readiness and call doRead when there is something to read.
You could apply similar treatment to the write end of the pipe if you also want reactor-friendly support for sending bytes back to the parent process.
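A minimal usage sketch inside the child process, assuming read_conn is the receiving Connection from multiprocessing.Pipe() that was handed to the server process, and ConnectionToDescriptor is the adapter above:

from twisted.internet import reactor

# The wrapped pipe end is now monitored by the same reactor that services
# the listening TCP socket; doRead() runs whenever the parent sends data.
reactor.addReader(ConnectionToDescriptor(read_conn))
reactor.run()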
I'm writing a small web server in Python, using BaseHTTPServer and a custom subclass of BaseHTTPServer.BaseHTTPRequestHandler. Is it possible to make this listen on more than one port?
What I'm doing now:
class MyRequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_GET(self):
        [...]

class ThreadingHTTPServer(ThreadingMixIn, HTTPServer):
    pass

server = ThreadingHTTPServer(('localhost', 80), MyRequestHandler)
server.serve_forever()
Sure; just start two different servers on two different ports in two different threads that each use the same handler. Here's a complete, working example that I just wrote and tested. If you run this code then you'll be able to get a Hello World webpage at both http://localhost:1111/ and http://localhost:2222/
from threading import Thread
from SocketServer import ThreadingMixIn
from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-type", "text/plain")
        self.end_headers()
        self.wfile.write("Hello World!")

class ThreadingHTTPServer(ThreadingMixIn, HTTPServer):
    daemon_threads = True

def serve_on_port(port):
    server = ThreadingHTTPServer(("localhost", port), Handler)
    server.serve_forever()

Thread(target=serve_on_port, args=[1111]).start()
serve_on_port(2222)
Update: this also works with Python 3, but three lines need to be slightly changed:
from socketserver import ThreadingMixIn
from http.server import HTTPServer, BaseHTTPRequestHandler
and
self.wfile.write(bytes("Hello World!", "utf-8"))
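For convenience, here is the full example again with those three substitutions applied (Python 3):

from threading import Thread
from socketserver import ThreadingMixIn
from http.server import HTTPServer, BaseHTTPRequestHandler

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-type", "text/plain")
        self.end_headers()
        self.wfile.write(bytes("Hello World!", "utf-8"))

class ThreadingHTTPServer(ThreadingMixIn, HTTPServer):
    daemon_threads = True

def serve_on_port(port):
    server = ThreadingHTTPServer(("localhost", port), Handler)
    server.serve_forever()

Thread(target=serve_on_port, args=[1111]).start()
serve_on_port(2222)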
Not easily. You could have two ThreadingHTTPServer instances and write your own serve_forever() function (don't worry, it's not a complicated function).
The existing function:
def serve_forever(self, poll_interval=0.5):
    """Handle one request at a time until shutdown.

    Polls for shutdown every poll_interval seconds. Ignores
    self.timeout. If you need to do periodic tasks, do them in
    another thread.
    """
    self.__serving = True
    self.__is_shut_down.clear()
    while self.__serving:
        # XXX: Consider using another file descriptor or
        # connecting to the socket to wake this up instead of
        # polling. Polling reduces our responsiveness to a
        # shutdown request and wastes cpu at all other times.
        r, w, e = select.select([self], [], [], poll_interval)
        if r:
            self._handle_request_noblock()
    self.__is_shut_down.set()
So our replacement would be something like:
import select

def serve_forever(server1, server2):
    while True:
        # A small timeout stops this loop from busy-waiting at 100% CPU.
        r, w, e = select.select([server1, server2], [], [], 0.5)
        if server1 in r:
            server1.handle_request()
        if server2 in r:
            server2.handle_request()
I would say that threading for something this simple is overkill. You're better off using some form of asynchronous programming.
Here is an example using Twisted:
from twisted.internet import reactor
from twisted.web import resource, server

class MyResource(resource.Resource):
    isLeaf = True

    def render_GET(self, request):
        return 'gotten'

site = server.Site(MyResource())
reactor.listenTCP(8000, site)
reactor.listenTCP(8001, site)
reactor.run()
I also think it looks a lot cleaner to have each port be handled in the same way, instead of having the main thread handle one port and an additional thread handle the other. Arguably that can be fixed in the thread example, but then you're using three threads.
I need to receive connections on sockets, read the input data, do hard and long calculations, and then send back an answer. There may be many queries at the same time (e.g. 100).
I understood that because of the GIL I can't use normal threads, so I tried to use C++ with boost::threads and boost::python, running a Python subinterpreter in each thread. But even so it did not utilise all cores 100% at the same time.
So I decided to use multiprocessing, but with a static-size pool of workers serving these requests from a queue. This way, we don't waste time forking a process, and we won't have 100 or more processes at the same time, only a fixed number.
I am new to Python; I have mostly used C++.
So now I have this code, but it is not working. The connection opens and immediately closes, and I don't know why:
#!/usr/bin/env python
import os
import sys
import SocketServer
import Queue
import time
import socket
import multiprocessing
from multiprocessing.reduction import reduce_handle
from multiprocessing.reduction import rebuild_handle

class MultiprocessWorker(multiprocessing.Process):

    def __init__(self, sq):
        self.SLEEP_INTERVAL = 1

        # base class initialization
        multiprocessing.Process.__init__(self)

        # job management stuff
        self.socket_queue = sq
        self.kill_received = False

    def run(self):
        while not self.kill_received:
            try:
                h = self.socket_queue.get_nowait()
                fd = rebuild_handle(h)
                client_socket = socket.fromfd(fd, socket.AF_INET, socket.SOCK_STREAM)
                #client_socket.send("hello from the worker process\r\n")
                received = client_socket.recv(1024)
                print "Received on client: ", received
                client_socket.close()
            except Queue.Empty:
                pass

            # Dummy timer
            time.sleep(self.SLEEP_INTERVAL)

class MyTCPHandler(SocketServer.BaseRequestHandler):
    """
    The RequestHandler class for our server.

    It is instantiated once per connection to the server, and must
    override the handle() method to implement communication to the
    client.
    """

    def handle(self):
        # self.request is the TCP socket connected to the client
        #self.data = self.request.recv(1024).strip()
        #print "{} wrote:".format(self.client_address[0])
        #print self.data

        # just send back the same data, but upper-cased
        #self.request.sendall(self.data.upper())

        # Either pipe it to a worker directly like this
        #pipe_to_worker.send(h)  # an instance of multiprocessing.Pipe
        # or use a Queue :)
        h = reduce_handle(self.request.fileno())
        socket_queue.put(h)

if __name__ == "__main__":

    # Main process
    address = ('localhost', 8082)
    server = SocketServer.TCPServer(address, MyTCPHandler)

    socket_queue = multiprocessing.Queue()

    for i in range(5):
        worker = MultiprocessWorker(socket_queue)
        worker.start()

    try:
        server.serve_forever()
    except KeyboardInterrupt:
        sys.exit(0)
Is there a reason why you do not use
def reduce_socket(s):
    ...

def rebuild_socket(ds):
    ...
?
It seems like you could do this:
import copy_reg
copy_reg.pickle(socket.socket, reduce_socket, rebuild_socket)
and then pass the socket to the queue.
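A rough sketch of that approach, assuming the registration above runs in the server process before anything is put on the queue (the names match the question's code):

# In MyTCPHandler.handle(): put the socket object itself on the queue;
# the copy_reg registration is what makes it picklable.
def handle(self):
    socket_queue.put(self.request)

# In MultiprocessWorker.run(): get back a ready-to-use socket object.
def run(self):
    while not self.kill_received:
        try:
            client_socket = self.socket_queue.get_nowait()
            print "Received on client: ", client_socket.recv(1024)
            client_socket.close()
        except Queue.Empty:
            time.sleep(self.SLEEP_INTERVAL)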
These are suggestions. Do they help?
try this:
def handle(self):
    h = reduce_handle(self.request.fileno())
    socket_queue.put(h)
    self.request.close()
note the self.request.close() addition.
I started working with the Twisted framework. I wrote a TCP server and I connect to it through Telnet; it works fine. Now I want to manage connections and connected clients (sending data, cutting connections, etc.) using a GUI like PyUI or GTK.
This is my code:
import sys
import os

from twisted.internet import reactor, protocol
from twisted.python import log

class Server(protocol.Protocol):

    def dataReceived(self, data):
        log.msg("data received: %s" % data)
        self.transport.write("you sent: %s" % data)

    def connectionMade(self):
        self.client_host = self.transport.getPeer().host
        self.client_port = self.transport.getPeer().port
        if len(self.factory.clients) >= self.factory.clients_max:
            log.msg("Too many connections !!")
            self.transport.write("Too many connections, sorry\n")
            self.transport.loseConnection()
        else:
            self.factory.clients.append((self.client_host, self.client_port))
            log.msg("connection from %s:%s\n" % (self.client_host, str(self.client_port)))
            self.transport.write(
                "Welcome %s:%s\n" % (self.client_host, str(self.client_port)))

    def connectionLost(self, reason):
        log.msg('Connection lost from %s:%s. Reason: %s\n' % (self.client_host, str(self.client_port), reason.getErrorMessage()))
        if (self.client_host, self.client_port) in self.factory.clients:
            self.factory.clients.remove((self.client_host, self.client_port))

class MyFactory(protocol.ServerFactory):

    protocol = Server

    def __init__(self, clients_max=10):
        self.clients_max = clients_max
        self.clients = []

def main():
    """This runs the protocol on port 8000"""
    log.startLogging(sys.stdout)
    # Note: listenTCP needs a factory instance, not the class itself.
    reactor.listenTCP(8000, MyFactory())
    reactor.run()

if __name__ == '__main__':
    main()
Thanks.
If you want to write a single Python program (process) that runs both your UI and your networking, you will first need to choose an appropriate Twisted reactor that integrates with the UI toolkit's event loop. See here.
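For example, with GTK the reactor setup looks roughly like this (gtk2reactor is the classic choice for PyGTK 2; other toolkits have their own reactor modules):

# Install the GUI-aware reactor *before* anything imports twisted.internet.reactor.
from twisted.internet import gtk2reactor
gtk2reactor.install()

from twisted.internet import reactor
# Build your GTK widgets here; reactor.run() then drives both the GUI and the network.
reactor.run()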
Next, you might start with something simple, like have a button that when pressed will send a text message to all currently connected clients.
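As a sketch of that idea, assuming you change MyFactory.clients to store the protocol instances themselves (append self in connectionMade, remove it in connectionLost) instead of (host, port) tuples:

def broadcast(factory, message):
    # Send the same text to every currently connected client.
    for client in factory.clients:
        client.transport.write(message)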
Another thing: what clients will connect? Browsers (also)? If so, you might contemplate about using WebSocket instead of raw TCP.
I am trying to write an application with the twistd library for Python. The application file ends like the following:
factory = protocol.ServerFactory()
factory.protocol = EchoServer
application = service.Application("Echo")
internet.TCPServer(8001, factory).setServiceParent(application)
I want to run something before my application terminates (e.g. close a file). Does anyone know how to do that? Because everything here is event-driven, I don't know where the clean-up function should be called.
You need to add code to the startService and stopService methods of the Service.
One way would be something like this:
from twisted.application import service
from twisted.internet import protocol

class MyService(service.Service):

    def __init__(self, port=8001):
        self.port = port

    def startService(self):
        self.factory = protocol.ServerFactory()
        self.factory.protocol = EchoServer
        from twisted.internet import reactor
        reactor.callWhenRunning(self.startListening)

    def startListening(self):
        from twisted.internet import reactor
        self.listener = reactor.listenTCP(self.port, self.factory)
        print "Started listening"

    def stopService(self):
        self.listener.stopListening()
        # Any other tidying
application = service.Application("Echo")
MyService().setServiceParent(application)
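For the clean-up in the original question (e.g. closing a file), note that stopService may return a Deferred, and shutdown will wait for it to fire. A sketch, where my_log_file is a hypothetical file opened in startService:

def stopService(self):
    # Hypothetical file handle opened in startService.
    self.my_log_file.close()
    # stopListening() returns a Deferred; returning it makes shutdown wait
    # until the port has actually stopped listening.
    return self.listener.stopListening()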