Prevent Flask Server From Including FIN bit in Response to Client [duplicate] - python

I have a jQuery Ajax call, like so:
$("#tags").keyup(function(event) {
$.ajax({url: "/terms",
type: "POST",
contentType: "application/json",
data: JSON.stringify({"prefix": $("#tags").val() }),
dataType: "json",
success: function(response) { display_terms(response.terms); },
});
I have a Flask method like so:
@app.route("/terms", methods=["POST"])
def terms_by_prefix():
    req = flask.request.json
    tlist = terms.find_by_prefix(req["prefix"])
    return flask.jsonify({'terms': tlist})
tcpdump shows the HTTP dialog:
POST /terms HTTP/1.1
Host: 127.0.0.1:5000
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:12.0) Gecko/20100101 Firefox/12.0
Accept: application/json, text/javascript, */*; q=0.01
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
Content-Type: application/json; charset=UTF-8
X-Requested-With: XMLHttpRequest
Referer: http://127.0.0.1:5000/
Content-Length: 27
Pragma: no-cache
Cache-Control: no-cache
{"prefix":"foo"}
However, Flask replies without keep-alive.
HTTP/1.0 200 OK
Content-Type: application/json
Content-Length: 445
Server: Werkzeug/0.8.3 Python/2.7.2+
Date: Wed, 09 May 2012 17:55:04 GMT
{"terms": [...]}
Is it really the case that keep-alive is not implemented?

The default request handler is WSGIRequestHandler. Before app.run(), add one line:
WSGIRequestHandler.protocol_version = "HTTP/1.1"
Don't forget to import it first: from werkzeug.serving import WSGIRequestHandler.
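Put together, a minimal sketch following the answer above (untested here) looks like this:
from flask import Flask
from werkzeug.serving import WSGIRequestHandler

app = Flask(__name__)

@app.route("/")
def index():
    return "ok"

if __name__ == "__main__":
    # Switch Werkzeug's development server to HTTP/1.1 so it can keep connections alive.
    WSGIRequestHandler.protocol_version = "HTTP/1.1"
    app.run()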

Werkzeug's integrated web server builds on BaseHTTPServer from Python's standard library. BaseHTTPServer seems to support Keep-Alives if you set its HTTP protocol version to 1.1.
Werkzeug doesn't do it, but if you're ready to hack into the machinery that Flask uses to instantiate Werkzeug's BaseWSGIServer, you can do it yourself. See Flask.run(), which calls werkzeug.serving.run_simple(). What you have to do boils down to BaseWSGIServer.protocol_version = "HTTP/1.1".
I haven't tested the solution. I suppose you do know that Flask's web server ought to be used for development only.

Related

Don't receive 302 Status Code with Python's Requests

Similar to a question asked here: Http Redirection code 3XX in python requests. I also do not receive a redirect when I try to post a form with Python's requests.
To bypass the same-origin policy, my goal is to proxy (redirect) an internal site through my Flask application with the following code:
method_requests_mapping = {
    'GET': requests.get,
    'HEAD': requests.head,
    'POST': requests.post,
    'PUT': requests.put,
    'DELETE': requests.delete,
    'PATCH': requests.patch,
    'OPTIONS': requests.options,
}

@bp.route('/<path:url>', methods=method_requests_mapping.keys())
def proxy(url):
    url = 'https://intern.something.com/' + url
    username = session['username']
    password = session['password']
    requests_function = method_requests_mapping[flask.request.method]
    request = requests_function(url, stream=True, params=flask.request.args,
                                auth=(username, password), allow_redirects=False)
    response = flask.Response(flask.stream_with_context(request.iter_content()),
                              content_type=request.headers['content-type'],
                              status=request.status_code)
    response.headers['Access-Control-Allow-Origin'] = '*'
    print(request.history)
    print(request.cookies)
    print(request.status_code)
    return response
If I use the site without my Flask proxy, network analysis shows me this:
Request:
Host: intern.something.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: de,en-US;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate, br
Referer: https://intern.something.com/contract_config_edit.php4?Contract_ID=1463234
Content-Type: application/x-www-form-urlencoded
Content-Length: 4024
Authorization: Basic YWhvZWhuZTpLYXR6ZTc0MzYh
Connection: keep-alive
Cookie: PHPSESSID=kr9am6tpid67ikct3up67f03h0
Upgrade-Insecure-Requests: 1
Answer:
HTTP/1.1 302 Found
Date: Wed, 02 Jan 2019 07:50:31 GMT
Server: Apache/2.2.3 (Red Hat)
X-Powered-By: PHP/5.1.6
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Location: https://intern.something.com/contract_show.php4?Contract_ID=1463234
Content-Length: 0
Connection: close
Content-Type: text/html
But if I do it through the proxy, it does not seem to work correctly:
Request:
Host: 10.146.177.18:7000
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: de,en-US;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate
Referer: http://10.146.177.18:7000/backoffice/contract/contract_config_edit.php4?Contract_ID=1463234
Content-Type: application/x-www-form-urlencoded
Content-Length: 4024
Authorization: Basic RWluaG9ybjpGZXVlcnphbmdlbmJvaGxlNTU0ISE/
Connection: keep-alive
Cookie: _pk_id.7.1c19=5f552d1eb2170bab.1546180080.2.1546185355.1546184002.; session=.eJwtj1FKxTAQRddivt9Hkk5mJm8LLqJMJjdUxFbaPgTFvVvRz3PhwD1fYR47jiXcz_2BW5hfergHjTrIMlHxOrgSWh-NxNU0e67iEch5SpqaQaRxSz4oo1dzcRLNXcQ5Ugd4yMhVS8m9oVMt3pJpacw2UUEtrUfXaNQ7CDJaEw234Mc-5nN7xXr9YWdTBpJAY-KRMBVCKYYqrPEyJFav-fLe7Tg-tv234tnOTwhN_HTtjwP7X1z6p9XecKEtG5YV4fsHxkJOZg.Dw34rg.p2bNxLLF26aIXxth9VN7BHA5x4U
Upgrade-Insecure-Requests: 1
Answer:
HTTP/1.0 200 OK
Content-Type: text/html
Access-Control-Allow-Origin: *
Vary: Cookie
Connection: close
Server: Werkzeug/0.14.1 Python/3.5.2
Date: Wed, 02 Jan 2019 08:15:38 GMT
Maybe it is a problem with the cookies, though the console seems to show that the correct cookie is being sent:
10.146.177.49 - - [02/Jan/2019 09:15:38] "POST /backoffice/contract/contract_config_edit.php4?Contract_ID=1463234 HTTP/1.1" 200 -
<RequestsCookieJar[<Cookie PHPSESSID=saqjj7n6m61aee19k3pe6moaf4 for intern.something.com/>]>
Does anyone know what the problem is here?
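One detail that may be worth checking (an assumption, not something confirmed in the post): the proxy above forwards the query string but not the POST body or the browser's cookies, so the internal site may never see the submitted form. A minimal sketch of forwarding both might look like:
@bp.route('/<path:url>', methods=method_requests_mapping.keys())
def proxy(url):
    url = 'https://intern.something.com/' + url
    requests_function = method_requests_mapping[flask.request.method]
    upstream = requests_function(
        url,
        stream=True,
        params=flask.request.args,
        data=flask.request.get_data(),   # forward the raw request body
        cookies=flask.request.cookies,   # forward the browser's cookies
        auth=(session['username'], session['password']),
        allow_redirects=False,
    )
    response = flask.Response(
        flask.stream_with_context(upstream.iter_content()),
        content_type=upstream.headers['content-type'],
        status=upstream.status_code,
    )
    response.headers['Access-Control-Allow-Origin'] = '*'
    return response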

Enable Webiopi CORS request

I'd like to call the WebIOPi REST API from my Angular application in a browser running on the Raspberry Pi. As the WebIOPi HTTP server doesn't allow CORS requests, I have created a proxy with Apache that adds the Access-Control-Allow-Origin "*" header (via Header add "Access-Control-Allow-Origin" "*").
This works fine; however, the call to the REST API throws many errors, mainly because the browser sends an OPTIONS request to the server in the case of a CORS request to check whether it is allowed or not. But the WebIOPi HTTP handler doesn't handle the OPTIONS verb at all.
So I started to write it into the code myself, with zero Python experience. In the file python/webiopi/protocols/http.py I added the following at the end:
def do_OPTIONS(self):
    self.send_response(200, "ok")
    self.send_header("Access-Control-Allow-Origin", "*")
    self.send_header("Access-Control-Allow-Methods", "POST,GET,OPTIONS")
    self.send_header("Access-Control-Allow-Headers", "Authorization")
    self.send_header("Access-Control-Allow-Headers", "Content-Type")
    self.end_headers()
Now it doesn't throw any errors, but it doesn't give me the proper response to my GET request. It just stops after the OPTIONS. The request and response look like this:
Request headers:
OPTIONS /GPIO/1/value HTTP/1.1
Host: localhost:8000
Connection: keep-alive
Access-Control-Request-Method: GET
Origin: http://192.168.1.108:51443
User-Agent: Mozilla/5.0 (X11; Linux armv7l) AppleWebKit/537.36 (KHTML, like Gecko) Raspbian Chromium/65.0.3325.181 Chrome/65.0.3325.181 Safari/537.36
Access-Control-Request-Headers: authorization
Accept: */*
Accept-Encoding: gzip, deflate, br
Accept-Language: hu-HU,hu;q=0.9,en-US;q=0.8,en;q=0.7
Response headers:
HTTP/1.1 200 OK
Date: Fri, 23 Nov 2018 22:06:28 GMT
Server: WebIOPi/0.7.1/Python3.5
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: POST,GET,OPTIONS
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: GET,POST,OPTIONS
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Transfer-Encoding: chunked
General (from chrome network tab):
Request URL: http://localhost:8000/GPIO/1/value
Request Method: OPTIONS
Status Code: 200 OK
Remote Address: [::1]:8000
Referrer Policy: no-referrer-when-downgrade
Where is my GET request? Why do I see only the OPTIONS request, which, by the way, I'm not initiating at all?
The request from angular:
this.http.get<number>(this.route + 'GPIO/' + gpio + '/value').subscribe(result => {
    resolve(result);
});
I had to allow all methods and headers on the HTTP server:
def do_OPTIONS(self):
    self.send_response(200, "ok")
    self.send_header("Access-Control-Allow-Origin", "*")
    self.send_header("Access-Control-Allow-Methods", "*")
    self.send_header("Access-Control-Allow-Headers", "*")
    self.end_headers()
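To check the preflight response outside the browser, a quick sketch along these lines (using the requests library; the URL and origin are taken from the headers above) can help:
import requests

# Simulate the browser's CORS preflight against the WebIOPi endpoint.
resp = requests.options(
    "http://localhost:8000/GPIO/1/value",
    headers={
        "Origin": "http://192.168.1.108:51443",
        "Access-Control-Request-Method": "GET",
        "Access-Control-Request-Headers": "authorization",
    },
)
print(resp.status_code)
for name, value in resp.headers.items():
    print(name, ":", value)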

Python - Send HTTP GET string - Receive 301 Moved Permanently - What's next?

I'm trying to use Python 2 to send my own HTTP GET message to a web server, retrieve html text, and write it to an html file (no urllib, urllib2, httplib, requests, etc. allowed).
import socket
tcpSocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcpSocket.connect(('python.org', 80))
http_get = """GET / HTTP/1.1\r
Host: www.python.org/\r
Connection: keep-alive\r
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8\r
Upgrade-Insecure-Requests: 1\r
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36\r
Accept-Encoding: gzip, deflate, sdch\r
Accept-Language: en-US,en;q=0.8\r\n\r\n"""
tcpSocket.send(http_get)
m = tcpSocket.recv(4096)
tcpSocket.close()
print m
Output:
HTTP/1.1 301 Moved Permanently
Location: https://www.python.org//
Connection: Keep-Alive
Content-length: 0
Why does it return 301 when the location is apparently still the same? What message and to where should I send next to get the html content?
Thank you very much!
Your problem is that the URL you are requesting doesn't serve over http://, but rather redirects to https://. To show that your code fundamentally works with a proper target, I have changed your GET request to
import socket
tcpSocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcpSocket.connect(('www.cnn.com', 80))
http_get = """GET / HTTP/1.1\r
Host: www.cnn.com/\r
Connection: keep-alive\r
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8\r
Upgrade-Insecure-Requests: 1\r
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36\r
Accept-Encoding: gzip, deflate, sdch\r
Accept-Language: en-US,en;q=0.8\r\n\r\n"""
http_get_minimum = """GET / HTTP/1.1\r\nHost: www.cnn.com\r\nConnection: close\r\n\r\n"""
tcpSocket.send(http_get_minimum)
m = tcpSocket.recv(4096)
tcpSocket.close()
and received
HTTP/1.1 200 OK
x-servedByHost: prd-10-60-168-42.nodes.56m.dmtio.net
Cache-Control: max-age=60
X-XSS-Protection: 1; mode=block
Content-Security-Policy: default-src 'self' http://.cnn.com: https://.cnn.com: .cnn.net: .turner.com: .ugdturner.com: .vgtf.net:; script-src 'unsafe-inline' 'unsafe-eval' 'self' *; style-src 'unsafe-inline' 'self' *; frame-src 'self' *; object-src 'self' *; img-src 'self' * data:; media-src 'self' *; font-src 'self' *; connect-src 'self' *;
Content-Type: text/html; charset=utf-8
Via: 1.1 varnish
Content-Length: 74864
Accept-Ranges: bytes
Date: Mon, 05 Oct 2015 00:39:54 GMT
Via: 1.1 varnish
Age: 170
Connection: close
X-Served-By: cache-iad2144-IAD, cache-sjc3129-SJC
X-Cache: HIT, HIT
X-Cache-Hits: 2, 95
X-Timer: S1444005594.675567,VS0,VE0
Vary: Accept-Encoding
UPDATE: Yes, there is extra functionality required beyond what you have presented to be able to request over HTTPS. There are some primary differences between HTTP and HTTPS, beginning with the default port, which is 80 for HTTP and 443 for HTTPS. HTTPS works by transmitting normal HTTP interactions through an encrypted system, so that in theory the information cannot be accessed by any party other than the client and end server. There are two common types of encryption layers: Transport Layer Security (TLS) and Secure Sockets Layer (SSL), both of which encode the data records being exchanged.
When using an https connection, the server responds to the initial connection by offering a list of encryption methods it supports. In response, the client selects a connection method, and the client and server exchange certificates to authenticate their identities. After this is done, both parties exchange the encrypted information after ensuring that both are using the same key, and the connection is closed. In order to host https connections, a server must have a public key certificate, which embeds key information with a verification of the key owner's identity. Most certificates are verified by a third party so that clients are assured that the key is secure.
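As an illustration of the HTTPS case (a sketch using Python's standard ssl module, not part of the original answer; requires Python 2.7.9+ or Python 3), the same minimal GET can be wrapped in TLS and sent to port 443:
import socket
import ssl

# Wrap a plain TCP socket in TLS; the handshake happens when connect() is called.
context = ssl.create_default_context()
raw_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tls_sock = context.wrap_socket(raw_sock, server_hostname='www.python.org')
tls_sock.connect(('www.python.org', 443))

http_get_minimum = "GET / HTTP/1.1\r\nHost: www.python.org\r\nConnection: close\r\n\r\n"
tls_sock.sendall(http_get_minimum.encode('ascii'))

# Read until the server closes the connection.
response = b""
while True:
    chunk = tls_sock.recv(4096)
    if not chunk:
        break
    response += chunk
tls_sock.close()
print(response[:200])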
I had the same problem and changing port from 80 to 443 solved it.

download files with python (REST URL)

I am trying to write a script that will download a bunch of files from a website that has REST URLs.
Here is the GET request:
GET /test/download/id/5774/format/testTitle HTTP/1.1
Host: testServer.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:23.0) Gecko/20100101 Firefox/23.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Cookie: __utma=11863783.1459862770.1379789243.1379789243.1379789243.1; __utmb=11863783.28.9.1379790533699; __utmc=11863783; __utmz=11863783.1379789243.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); PHPSESSID=fa844952890e9091d968c541caa6965f; loginremember=Qraoz3j%2BoWXxwqcJkgW9%2BfGFR0SDFLi1FLS7YVAfvbcd9GhX8zjw4u6plYFTACsRruZM4n%2FpX50%2BsjXW5v8vykKw2XNL0Vqo5syZKSDFSSX9mTFNd5KLpJV%2FFlYkCY4oi7Qyw%3D%3D; ma-refresh-storage=1; ma-pref=KLSFKJSJSD897897; skipPostLogin=0; pp-sid=hlh6hs1pnvuh571arl59t5pao0; __utmv=11863783.|1=MemberType=Yearly=1; nats_cookie=http%253A%252F%252Fwww.testServer.com%252F; nats=NDc1NzAzOjQ5MzoyNA%2C74%2C0%2C0%2C0; nats_sess=fe3f77e6e326eb8d18ef0111ab6f322e; __utma=163815075.1459708390.1379790355.1379790355.1379790355.1; __utmb=163815075.1.9.1379790485255; __utmc=163815075; __utmz=163815075.1379790355.1.1.utmcsr=ppp.contentdef.com|utmccn=(referral)|utmcmd=referral|utmcct=/postlogin; unlockedNetworks=%5B%22rk%22%2C%22bz%22%2C%22wkd%22%5D
Connection: close
If the request is good, it will return a 302 response such as this one:
HTTP/1.1 302 Found
Date: Sat, 21 Sep 2013 19:32:37 GMT
Server: Apache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
location: http://downloads.test.stuff.com/5774/stuff/picture.jpg?wed=20130921152237&wer=20130922153237&hash=0f20f4a6d0c9f1720b0b6
Vary: User-Agent,Accept-Encoding
Content-Length: 0
Connection: close
Content-Type: text/html; charset=UTF-8
What I need the script to do is check whether the response was a 302. If it is not, it will "pass"; if it is, it will need to parse out the location header shown here:
location: http://downloads.test.stuff.com/5774/stuff/picture.jpg?wed=20130921152237&wer=20130922153237&hash=0f20f4a6d0c9f1720b0b6
Once I have the location parameter, I will have to make another GET request to download that file. I will also have to maintain the cookie for my session in order to download the file.
Can someone point me in the right direction as to which library is best to use for this? I am having trouble finding out how to parse the 302 response and how to add a cookie value like the one shown in my GET request above. I am sure there must be some library that can do all of this.
Any help would be much appreciated.
import urllib.request as ur
import urllib.error as ue

'''
Note that http.client.HTTPResponse.read([amt]) reads and returns the response body, or up to
the next amt bytes. This is because there is no way for urlopen() to automatically determine
the encoding of the byte stream it receives from the http server.
'''

url = "http://www.example.org/images/{}.jpg"
dst = ""
arr = ["01","02","03","04","05","06","07","08","09"]
# arr = range(10,20)

try:
    for x in arr:
        print(str(x)+"). ".ljust(4), end="")
        hrio = ur.urlopen(url.format(x))  # HTTPResponse iterable object (returns the response header and body, together, as bytes)
        fh = open(dst+str(x)+".jpg", "b+w")
        fh.write(hrio.read())
        fh.close()
        print("\t[REQUEST COMPLETE]\t\t<Error ~ [None]>")
except ue.URLError as e:
    print("\t[REQUEST INCOMPLETE]\t", end="")
    print("<Error ~ [{}]>".format(e))

how to create .ASPXAUTH cookie on python

I need to create an .ASPXAUTH cookie in Python. I am programming a desktop client. The first request does not need the .ASPXAUTH cookie, but the second request does.
My First Request Headers:
User-Agent: WebPolicy
Host: xxx.host
Cache-Control: no-cache
My First Response Headers:
reply: 'HTTP/1.1 200 OK\r\n'
header: Cache-Control: private
header: Transfer-Encoding: chunked
header: Content-Type: text/xml; charset=utf-8
header: Server: Microsoft-IIS/7.5
header: Set-Cookie: tivi_=3tnihi55ezuk50zyrrpuwv45; path=/; HttpOnly
header: X-AspNet-Version: 2.0.50727
header: X-Powered-By: ASP.NET
header: Date: Thu, 14 Oct 2010 13:05:50 GMT
And i need second send headers :
Accept: */*
Content-Length: 259
Content-Type: text/xml; charset=utf-8
SOAPAction: "http://tempuri.org/IMiddlewareServices/Login"
UA-CPU: x86
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 2.0.50727; .NET CLR 3.0.04506.648; .NET CLR 3.5.21022; .NET CLR 1.1.4322)
Host: mw.webtv.ttnet.com.tr
Connection: Keep-Alive
Cache-Control: no-cache
Cookie: .ASPXAUTH=090F2718E32AF3F9B1C9E5A15BA54CFD8D4430C44A91029719953D4A6D38DD1D9164D86D772E2645C0C0545A71C12EA80AE5A8F725FD6037BD00DB291A863DD577735E16D8745E2833979F337935F29A37C509FB0350F1180DA0D2C1C44F97D0F081B13D33984C198ECD695C34B2E79A3E7CFBDD2D67D630C019714C3A70280E; tivi_=nqkngs45drnsh4z4y4b30g55
Please help me: how do I create the ".ASPXAUTH" cookie?
The Python requests module does the job for me with .ASPXAUTH cookie-based authentication. In the simple example below, the .ASPXAUTH cookie is received after the first session.get call and used for the second call:
import requests
session = requests.Session()
session.get('https://host/user/login?username=###&password=###')
print session.cookies
session.get('https://host/apicall')
print session.cookies
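If the cookie needs to be attached to a request explicitly rather than reused through the session, it can be pulled out of the cookie jar (a sketch; the host and credentials are placeholders as above):
import requests

session = requests.Session()
session.get('https://host/user/login?username=###&password=###')

# The cookie jar behaves like a dict; .get() returns None if the cookie was not set.
aspxauth = session.cookies.get('.ASPXAUTH')

# Attach the cookie explicitly to a request made outside the session.
resp = requests.get('https://host/apicall', cookies={'.ASPXAUTH': aspxauth})
print(resp.status_code)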
