Why does my HTTP response using Python sockets fail?

Code:
from socket import *

sP = 14000
servSock = socket(AF_INET, SOCK_STREAM)
servSock.bind(('', sP))
servSock.listen(1)
while 1:
    connSock, addr = servSock.accept()
    connSock.send('HTTP/1.0 200 OK\nContent-Type:text/html\nConnection:close\n<html>...</html>')
    connSock.close()
When I go to the browser and type in localhost:14000, I get "Error 101 (ERR_CONNECTION_RESET): The connection was reset." Not sure why! What am I doing wrong?

There are several bugs, some more severe than others. As #IanWetherbee already noted, you need an empty line before the body. You should also send \r\n, not just \n. You should use sendall to avoid short sends. Lastly, you need to close the connection once you're done sending.
Here's a slightly modified version of the above:
from socket import *

sP = 14000
servSock = socket(AF_INET, SOCK_STREAM)
servSock.bind(('', sP))
servSock.listen(1)
while 1:
    connSock, addr = servSock.accept()
    connSock.sendall(b'HTTP/1.0 200 OK\r\nContent-Type: text/html\r\nConnection: close\r\n\r\n<html><head>foo</head></html>\r\n')
    connSock.close()

Running your code, I see similar errors and am unsure of their origin too. However, rather than rolling your own HTTP server, have you considered a built-in one? Check out the sample below; it can also support POST (you'd have to add a do_POST method).
Simple HTTP Server
from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler

class customHTTPServer(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.end_headers()
        self.wfile.write('<HTML><body>Hello World!</body></HTML>')
        return

def main():
    try:
        server = HTTPServer(('', 14000), customHTTPServer)
        print 'server started at port 14000'
        server.serve_forever()
    except KeyboardInterrupt:
        server.socket.close()

if __name__ == '__main__':
    main()
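Note that BaseHTTPServer exists only in Python 2; in Python 3 it was merged into http.server, and wfile expects bytes rather than str. A sketch of the equivalent server for Python 3 (same port and markup as above):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class CustomHTTPHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.end_headers()
        # wfile.write() requires bytes in Python 3
        self.wfile.write(b'<HTML><body>Hello World!</body></HTML>')

def main(port=14000):
    server = HTTPServer(('', port), CustomHTTPHandler)
    print('server started at port %d' % port)
    try:
        server.serve_forever()
    except KeyboardInterrupt:
        server.server_close()
```

Call main() to serve; Ctrl-C shuts the listening socket down cleanly.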

Related

http.server rfile read blocking in python

I am trying to build a simple Python 3 HTTP server with http.server:
import http.server
from http.server import HTTPServer, BaseHTTPRequestHandler
import socketserver

class Handler(BaseHTTPRequestHandler):
    def do_PATCH(self):
        print("do_Patch called")
        contentLength = int(self.headers['content-length'])
        res = self.rfile.read(contentLength)
        self.send_response(200)
        self.send_header('Content-type', 'text/plain')
        self.end_headers()
        self.wfile.write('Something brilliant'.encode())

httpd = HTTPServer(("127.0.0.1", 33181), Handler)
httpd.serve_forever()
The problem is that self.rfile.read() is blocking, so if I don't get the content-length exactly right, I either get too little data or I hang. The docs I've found say it's supposed to throw an exception in non-blocking mode, but I've not found out how to set non-blocking mode.
I can get around this by setting the content-length as above, but I can also corrupt the content-length, which hangs the thread:
import http.client

myValueableString = "Shorter than 300"
for mylen in [len(myValueableString), 300]:
    conn = http.client.HTTPConnection("127.0.0.1", 33181)
    conn.request("PATCH", "/test", myValueableString, {'Content-length': mylen})
    print(conn.getresponse().read().decode())
Is there any good way to 1) force finding the "true" length of the rfile stream, 2) force a timeout, or 3) enable "non-blocking" mode? I realize the code isn't recommended for production, but this is the only real stumbling block I've encountered.
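One option for (2), sketched here rather than taken from the thread: socketserver.StreamRequestHandler (which BaseHTTPRequestHandler inherits from) honors a class-level timeout attribute, applying it to the underlying socket via settimeout() in setup(). A stalled self.rfile.read() then raises socket.timeout, which the handler can turn into an error response instead of hanging:

```python
import socket
from http.server import HTTPServer, BaseHTTPRequestHandler

class TimeoutHandler(BaseHTTPRequestHandler):
    timeout = 2  # seconds; applied to the socket by StreamRequestHandler.setup()

    def do_PATCH(self):
        content_length = int(self.headers.get('Content-Length', 0))
        try:
            body = self.rfile.read(content_length)
        except socket.timeout:
            # The client promised more bytes than it sent; give up cleanly.
            self.send_error(408)  # Request Timeout
            self.close_connection = True
            return
        self.send_response(200)
        self.send_header('Content-type', 'text/plain')
        self.end_headers()
        self.wfile.write(b'received %d bytes' % len(body))
```

With this handler, the corrupted-content-length request from the client snippet above gets a 408 after two seconds instead of hanging the thread.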

Respond HTTP 200 and continue processing

I have a scenario where I need to first respond with HTTP 200 to a server request (due to a time limit) and then continue processing with the actual work.
I also cannot use threads, processes, tasks, queues, or any other method that would let me do this by starting a parallel "process".
My approach is to use the built-in "Simple HTTP" server, and I am looking for a way to force the server to respond with HTTP 200 and then continue processing.
The current code receives a POST request and prints its content after 3 seconds. I put a placeholder where I would like to send the response.
from http.server import BaseHTTPRequestHandler, HTTPServer
import time

class MyWebServer(BaseHTTPRequestHandler):
    def do_POST(self):
        content_length = int(self.headers['Content-Length'])
        post_data = self.rfile.read(content_length)
        self.send_response_only(200)
        self.end_headers()
        # force server to send the response ???
        time.sleep(3)
        print(post_data)

def run(server_class=HTTPServer, handler_class=MyWebServer, port=8000):
    server_address = ('', port)
    httpd = server_class(server_address, handler_class)
    print('Starting httpd...')
    httpd.serve_forever()

if __name__ == "__main__":
    run()
I figured out a workaround. You can force the server to send the 200 OK and continue processing afterwards with these two commands:
self.finish()
self.connection.close()
This solution is from this SO question: SimpleHTTPRequestHandler close connection before returning from do_POST method
However, this will apparently close the internal IO buffer that the server uses, and it won't be able to serve any additional requests after that.
To avoid running into an exception, I terminate the program afterwards (which works for me). However, this is just a workaround, and I am still looking for a solution that lets the server keep processing new requests.
from http.server import BaseHTTPRequestHandler, HTTPServer
import time

class MyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        content_length = int(self.headers['Content-Length'])
        post_data = self.rfile.read(content_length)
        self.send_response_only(200)
        self.end_headers()
        self.finish()
        self.connection.close()
        time.sleep(3)
        print(post_data)
        quit()

def run(server_class=HTTPServer, handler_class=MyHandler, port=8000):
    server_address = ('', port)
    httpd = server_class(server_address, handler_class)
    print('Starting httpd...')
    httpd.serve_forever()

if __name__ == "__main__":
    run()
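An alternative sketch (not from the thread) that avoids killing the server: send a complete response, including Content-Length: 0 so the client knows nothing more is coming, flush the writer, and only then do the slow work. The handler still occupies the single server thread while it processes, but the server survives and accepts the next request afterwards:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import time

class EarlyReplyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        content_length = int(self.headers.get('Content-Length', 0))
        post_data = self.rfile.read(content_length)
        # A complete, self-delimiting response: with an explicit
        # Content-Length: 0 the client does not have to wait for the
        # connection to close before treating the response as finished.
        self.send_response_only(200)
        self.send_header('Content-Length', '0')
        self.end_headers()
        self.wfile.flush()
        # ... continue processing after the client already has its 200
        time.sleep(1)
        print(post_data)
```

The connection is still torn down normally when the handler returns, so no internal buffers are closed behind the server's back.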

Web Server Issue with Concurrency

I am a new Python learner and I want to implement a simple web server that can deal with multiple threads. If I do not comment out the line conn.close(), everything works well. Problems occur if I comment out conn.close(): a client successfully gets the response after the first request, but when I refresh the web page, the browser fails to receive the response. Can anyone tell me how to fix this? Is there something wrong with my code?
import socket
import threading
import time

class MyServer:
    def __init__(self, port, doc_root):
        self.port = port
        self.doc_root = doc_root
        self.host = "localhost"

    def run(self):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as soc:
            soc.bind((self.host, self.port))
            soc.listen(5)
            while True:
                conn, addr = soc.accept()
                threading.Thread(target=self.handle_connection, args=(conn, addr)).start()

    def handle_connection(self, conn, addr):
        assert isinstance(conn, socket.socket)
        req = b''
        while b'\r\n\r\n' not in req:
            req += conn.recv(1024)
        print(addr)
        print(req.decode())
        time.sleep(0.5)
        conn.sendall(b'HTTP/1.1 200 OK\r\n\r\nHello World')
        print(addr, 'response sent')
        # conn.close()

if __name__ == '__main__':
    input_port = 8006
    input_doc_root = r'/'
    server = MyServer(input_port, input_doc_root)
    server.run()
You need the conn.close() because refreshing the browser without the server closing the connection will lead to the request being sent down that very same connection. Your handler ends after sending one response; there's simply nothing reading from or writing to that socket after the first request-response conversation in your code.
On a side note, I also had a few issues with omitting conn.close() on the client side, at least with the browser I tested. Since you don't tell your clients when they can stop reading from their side of the socket, Firefox kept reading its endpoint, even after rendering the full response.
If you put your recv and sendall into some loop and give the clients some hint as to when the data has been fully received, it will work:
def handle_connection(self, conn, addr):
    assert isinstance(conn, socket.socket)
    while True:
        req = b''
        while b'\r\n\r\n' not in req:
            data = conn.recv(1024)
            if not data:
                # the client closed its end of the connection
                return
            req += data
        print(addr)
        print(req.decode())
        time.sleep(0.5)
        # Note Content-Length in the 'header' below
        # to let the clients know when to stop reading
        conn.sendall(b'HTTP/1.1 200 OK\r\nContent-Length: 11\r\n\r\nHello World')
        print(addr, 'response sent')
        # conn.close()
However, closing the connection after sending the response is the right thing to do, IMHO. You cannot rely on the clients always closing the connections properly, so you risk wasting resources on what effectively are stale sessions. With the solution outlined above, the threads tend to stick around for some time, basically waiting for nothing.
In the end, HTTP is stateless, so there is no real point in maintaining the connection beyond sending the response.

python socketserver opening files from webpage

I am trying to make a small server that runs on local machines, gets a request from a webpage, and opens a file in OpenOffice. This approach works so far. However, there are times when requests do not come through right away. When this happens, I wait at least 5 seconds, then try to hit it a couple more times, and then all of the requests come in at the same time. I would really like this to be reliable. Is there something I am missing that would stop this from happening? Any help would be greatly appreciated. I am also aware that the way I am doing things may not be the safest; I am trying to make it functional first and will work on making it more secure afterwards.
import SocketServer
import subprocess
import time

class MyTCPHandler(SocketServer.StreamRequestHandler):
    def handle(self):
        # self.request is the TCP socket connected to the client
        self.data = self.rfile.readline().strip()
        print self.data
        try:
            if self.data != '':
                st = self.data.split('\n', 1)[0]
                #print st
                st = st.split(' ')[1]
                print st
                if ".odt" in st:
                    p = subprocess.Popen('C:\\openoffice\\program\\swriter.exe "' + st[1:] + '"')
                    time.sleep(1)
                    p.terminate()
        except Exception as err:
            print err
        # just send back the same data, but upper-cased
        self.wfile.write(self.data.upper())

PORT = 8081
httpd = SocketServer.TCPServer(("", PORT), MyTCPHandler)
print "serving at port", PORT
httpd.serve_forever()

Setting up an HTTP server that listens over a file-socket

How can I use HTTPServer (or some other class) to set up an HTTP server that listens to a filesystem socket instead of an actual network socket? By "filesystem socket" I mean sockets of the AF_UNIX type.
HTTPServer inherits from SocketServer.TCPServer, so I think it's fair to say that it isn't intended for that use-case, and even if you try to work around it, you may run into problems since you are kind of "abusing" it.
That being said, however, it would be possible per se to define a subclass of HTTPServer that creates and binds Unix sockets quite simply, as such:
import socket
import SocketServer
from BaseHTTPServer import HTTPServer

class UnixHTTPServer(HTTPServer):
    address_family = socket.AF_UNIX

    def server_bind(self):
        SocketServer.TCPServer.server_bind(self)
        self.server_name = "foo"
        self.server_port = 0
Then, just pass the path you want to bind to as the server_address argument to the constructor:
server = UnixHTTPServer("/tmp/http.socket", ...)
Again, though, I can't guarantee that it will actually work well. You may have to implement your own HTTP server instead.
I followed the example from #Dolda2000 above in Python 3.5 and ran into an issue with the HTTP handler falling over with an invalid client address. You don't have a client address with Unix sockets the way you do with TCP, so the code below fakes one.
import socketserver

...

class UnixSocketHttpServer(socketserver.UnixStreamServer):
    def get_request(self):
        request, client_address = super(UnixSocketHttpServer, self).get_request()
        return (request, ["local", 0])

...

server = UnixSocketHttpServer(sock_file, YourHttpHandler)
server.serve_forever()
With these changes, you can perform an HTTP request against the Unix socket with tools such as cURL.
curl --unix-socket /run/test.sock http://localhost/test
Overview
In case it helps anyone else, here is a complete example (made for Python 3.8) based on Roger Lucas's example:
Server
import socketserver
from http.server import BaseHTTPRequestHandler

class myHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.end_headers()
        self.wfile.write(b"Hello world!")
        return

class UnixSocketHttpServer(socketserver.UnixStreamServer):
    def get_request(self):
        request, client_address = super(UnixSocketHttpServer, self).get_request()
        return (request, ["local", 0])

server = UnixSocketHttpServer("/tmp/http.socket", myHandler)
server.serve_forever()
This will listen on the Unix socket and respond with "Hello world!" to all GET requests.
Client Request
You can send a request with:
curl --unix-socket /tmp/http.socket http://any_path/abc/123
Troubleshooting
If you run into this error:
OSError: [Errno 98] Address already in use
Then delete the socket file:
rm /tmp/http.socket
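If you'd rather make the request from Python than cURL, one approach (a sketch, not from the thread) is to subclass http.client.HTTPConnection and swap in an AF_UNIX socket in connect(); the host name passed to the parent constructor is only used for the Host: header:

```python
import http.client
import socket

class UnixSocketHttpConnection(http.client.HTTPConnection):
    def __init__(self, socket_path):
        # 'localhost' is a placeholder used only for the Host: header
        super().__init__('localhost')
        self.socket_path = socket_path

    def connect(self):
        # replace the TCP connect with a connect to the socket file
        self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.sock.connect(self.socket_path)
```

Usage mirrors a normal HTTPConnection: create it with the socket path, then call request() and getresponse() as usual.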
