Python microframeworks and the requests library

I'm working on a side project in Python that both initiates HTTP requests and runs a small web server. It got me thinking: it seems like every Python microframework has its own model for accessing properties of the HTTP request (i.e. the underlying request object you use to get at the query string parameters, headers, etc., and the underlying response object for setting the status code, response headers, etc.). They all give you access to the same data, and they've all kind of re-invented the wheel.
Are there any microframeworks that use the Request and Response objects from the popular requests library instead of having their own implementations? The requests library seems to be becoming the canonical way to make HTTP requests in Python, so this would make a framework especially minimal. It would also be handy when building apps that essentially glue other services together, because forwarding/wrapping requests would be trivial: you could just change the .url attribute on the incoming request and call .prepare() to forward it (yes, for simple forwarding it makes more sense to do this at the web server level, but you get the idea).
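For illustration, here is a rough sketch of that forwarding idea using nothing but the requests library. The incoming request is constructed by hand below, since no framework actually hands you a requests.Request today, and the URLs are placeholders.

import requests

# Pretend the framework gave us the inbound call as a requests.Request
# (hypothetical; today you have to build it yourself like this).
incoming = requests.Request(
    method="GET",
    url="http://my-service.example/search",
    params={"q": "python"},
    headers={"Accept": "application/json"},
)

# Forward it to another service by swapping the URL and re-preparing it.
incoming.url = "http://backend.example/search"
prepared = incoming.prepare()

with requests.Session() as session:
    response = session.send(prepared)
    print(response.status_code, response.headers.get("Content-Type"))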
Or, if there aren't any frameworks that do this in particular, are there any with similar benefits, i.e. where the incoming HTTP request object also works as an HTTP client?
Edit: I wanted to point out that this is how the HTTP Request object works in Go, which is partly what inspired my question. From the net/http library: "A Request represents an HTTP request received by a server or to be sent by a client."

Flask is based on Werkzeug. Werkzeug has its own Request class, which in turn extends BaseRequest, but this is not the Requests library.
Note that there was a plan to create an httpcore library combining Requests and Werkzeug, but that effort seems to have stalled. That said, both projects are alive and kicking.
Some people do use Requests alongside Flask in their apps, though.
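As a minimal sketch of how the two usually coexist (Werkzeug's request wrapper on the inbound side, the requests library as the outbound client; the URL below is a placeholder):

import requests
from flask import Flask, request, jsonify  # `request` here is Werkzeug-based

app = Flask(__name__)

@app.route("/lookup")
def lookup():
    # Inbound side: Flask/Werkzeug's own request object (not the Requests library).
    term = request.args.get("q", "")

    # Outbound side: the requests library acting as an HTTP client.
    upstream = requests.get("http://example.com/search", params={"q": term})

    return jsonify(upstream_status=upstream.status_code)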

Related

Using Django for non-HTTP requests

Can I use Django to handle non-HTTP requests and responses? I have a Django web application serving up web pages, and I would like to use it to also communicate with other devices (hand-held GPS units sending in status reports and receiving acks) over TCP, but Django rejects the requests with "code 400, message Bad HTTP/0.9 request type":
[28/Sep/2015 15:14:26] code 400, message Bad HTTP/0.9 request type ('[V1.0.0,244565434376396,1,abcd,2015-09-28')
[28/Sep/2015 15:14:26] "[V1.0.0,244565434376396,1,abcd,2015-09-28 14:14:12,1-2,865456543459367,2,T1]" 400 -
The message from the device is sent as plain text over TCP, with no HTTP framing at all.
I haven't found any information on how to do this with Django, but it would make my life easier if it were possible.
Thanks!
Not that I know of.
Django is a web framework, so it's designed around a certain paradigm if not a certain protocol.
The design is heavily informed, if not by HTTP itself, then by the notions of URLs, requests, a stateless protocol, et cetera.
If the template system and the routing system were taken away, you would be left with a glorified ORM and some useless bits of code.
However, unless you are dealing with existing devices with their own protocol, you can use Django to build a RESTful service to successfully exchange information with something other than bipeds in front of a web browser.
This article on Dr. Dobb's is very informative.
Django REST, although by no means necessary, can help you.
If you are really stuck with legacy devices and protocols, you could write an adapter/proxy that receives your devices' requests and translates them into RESTful calls, provided your protocol resembles HTTP semantically, if not syntactically (as in, if you just have to translate QUUX aaa:bbb:ccc into GET /xx/yy/zz).
If it does not share the slightest bit of HTTP's semantics, I'd say Django can't help you much.
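A very rough sketch of such an adapter, purely for illustration: it assumes the device sends one line of comma-separated text per TCP connection, and that the Django side exposes a hypothetical /api/reports/ endpoint (both the endpoint and the ACK/ERR replies are made up).

import socketserver  # Python 3; the Python 2 module is SocketServer
import requests

DJANGO_API = "http://localhost:8000/api/reports/"  # hypothetical REST endpoint

class DeviceReportHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Read one line of the device's plain-text protocol, e.g.
        # "[V1.0.0,244565434376396,1,abcd,2015-09-28 14:14:12,...]"
        raw = self.rfile.readline().decode("ascii", errors="replace").strip()

        # Translate it into a RESTful call the Django app can understand.
        response = requests.post(DJANGO_API, json={"raw_report": raw})

        # Ack the device in whatever form it expects (made up here).
        self.wfile.write(b"ACK\n" if response.ok else b"ERR\n")

if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 9000), DeviceReportHandler) as server:
        server.serve_forever()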
I second the suggestion that non-HTTP traffic is better handled by other means, but I do have a suggestion for how to structure a Django app that could do it. HTTP processing takes place in middleware, so you could put your app at the top of that stack and either pre-empt the other middleware by returning a response yourself instead of passing the request down the stack, or prepare a mock request to pass down to the other handlers and grab the response on the way back up, post-processing it for your receiver.
This feels hacky and might require a bunch of unorthodox tricks, but that's how I would approach the problem as stated.
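Purely to illustrate that short-circuit idea, and assuming the traffic has already been massaged into something Django's request parser will accept (the path prefix below is made up), a middleware that pre-empts the rest of the stack might look roughly like this:

from django.http import HttpResponse

class DeviceProtocolMiddleware:
    """Sits at the top of MIDDLEWARE and short-circuits device traffic."""

    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # Hypothetical marker: pretend device traffic is recognisable by a
        # path prefix that an upstream adapter rewrites requests to.
        if request.path.startswith("/device-report/"):
            report = request.body.decode("ascii", errors="replace")
            # ... store or process the report here ...
            return HttpResponse("ACK", content_type="text/plain")

        # Everything else continues down the middleware stack as usual.
        return self.get_response(request)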

Python Eve with custom webserver

I would like to use Python Eve's features, but I have a custom web environment with its own Request objects and a Router (which can be disabled).
I know Python Eve is built on top of Flask and those features are already there, but I would like to somehow wrap/adapt my custom requests into Python Eve / Flask ones.
I have a process acting as a web server (it receives and sends messages in protocols other than HTTP). I was searching for a standard way to interface it with Eve or Flask, and I found WSGI.
To further clarify: imagine you have an ESB that is able to carry HTTP requests.
If you want to handle those requests with Eve, you should build a gateway/bridge.
That means implementing something that does the following (a minimal sketch is shown after the link below):
Receives the proprietary or non-standard protocol message containing the request
Extracts the most important parameters from it, such as the URL, QUERY_STRING, HTTP method and so on
Fills a WSGI environment with those parameters, following the PEP (PEP 3333)
Runs the WSGI application (in our case an Eve instance)
Gets the response back from the WSGI application
Packs the response into your proprietary or custom protocol
Sends it back to the requestor
A really simple example can be found at http://ivory.idyll.org/articles/wsgi-intro/what-is-wsgi.html
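Here is a minimal sketch of steps 3 to 5, assuming your bridge has already pulled the method, path, query string and body out of the proprietary message. The helper name and the usage comment at the end are just for illustration; a real Eve instance also needs its usual settings file.

import io
import sys

def call_wsgi_app(app, method, path, query_string="", body=b"", content_type="text/plain"):
    # Build a PEP 3333 environ from the parameters extracted from the
    # proprietary protocol, then run one request through the WSGI app.
    environ = {
        "REQUEST_METHOD": method,
        "SCRIPT_NAME": "",
        "PATH_INFO": path,
        "QUERY_STRING": query_string,
        "CONTENT_TYPE": content_type,
        "CONTENT_LENGTH": str(len(body)),
        "SERVER_NAME": "localhost",
        "SERVER_PORT": "80",
        "SERVER_PROTOCOL": "HTTP/1.1",
        "wsgi.version": (1, 0),
        "wsgi.url_scheme": "http",
        "wsgi.input": io.BytesIO(body),
        "wsgi.errors": sys.stderr,
        "wsgi.multithread": False,
        "wsgi.multiprocess": False,
        "wsgi.run_once": False,
    }

    captured = {}

    def start_response(status, headers, exc_info=None):
        captured["status"] = status
        captured["headers"] = headers

    chunks = app(environ, start_response)
    try:
        response_body = b"".join(chunks)
    finally:
        if hasattr(chunks, "close"):
            chunks.close()
    return captured["status"], captured["headers"], response_body

# Usage (hypothetical): wrap your Eve instance, then pack the result back
# into your custom protocol and send it to the requestor.
#   from eve import Eve
#   app = Eve()
#   status, headers, body = call_wsgi_app(app, "GET", "/people")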
I am not sure what you mean. Do you want to use Eve with a different (custom) framework instead of Flask? That is going to be really hard without an almost total rewrite, as Eve is, in fact, a Flask application (a subclass, actually).

Python: What's the difference between httplib2 and urllib2?

I'm trying to implement an OAuth2 authentication server, and for the client part I want to send a JSON request to the server (from a Django view). I found several libraries for doing that, though the most common are httplib2 and urllib2. I was wondering what the difference is between them and which is the best library for this purpose.
Thanks in advance.
Edit:
After searching, I found an extremely useful library called Requests and have used it ever since. (http://docs.python-requests.org/en/latest/)
urllib2 handles opening and reading URLs. It also handles extra stuff like storing cookies.
httplib handles the HTTP protocol itself; it's what happens behind the curtain when you open a URL.
You can send a JSON request with urllib2, so you should use that.
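As a rough sketch of the urllib2 approach (Python 2-era code, since that is the library being discussed; the token endpoint and payload are hypothetical):

import json
import urllib2

payload = json.dumps({"grant_type": "authorization_code", "code": "abc123"})

req = urllib2.Request(
    "http://localhost:8000/oauth/token",  # hypothetical OAuth2 token endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)

response = urllib2.urlopen(req)  # becomes a POST because data is set
print(response.getcode())
print(json.loads(response.read()))  # assumes the server replies with JSON

With the Requests library mentioned in the edit above, the same call collapses to roughly requests.post(url, json={...}).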

How can I send a GET request from my flask app to another site?

Originally, I tried to make an AJAX request from my client side to a third-party URL, but it seems the browser has security issues with that. I thought about sending an AJAX request to my server side instead, sending a GET request from there to the third party, getting the response and sending it back to the client side. How can I do that with Flask?
Install the requests module (much nicer than using urllib2) and then define a route which makes the necessary request - something like:
import requests
from flask import Flask

app = Flask(__name__)

@app.route('/some-url')
def get_data():
    return requests.get('http://example.com').content
Depending on your setup, though, it may be better to configure your web server to reverse-proxy the target site under a certain URL.
Flask alone does not have this capability, but it is a simple matter to write a request handler that makes a request to another server using an HTTP client library and then returns that response.
# third-party HTTP client library
import requests

# assume that "app" below is your Flask app, and that
# "Response" is imported from flask
@app.route("/proxy-example")
def proxy_example():
    r = requests.get("http://example.com/other-endpoint")
    return Response(
        r.text,
        status=r.status_code,
        content_type=r.headers['content-type'],
    )
However, this will not achieve exactly the same result as you might expect from a client-side request. Since your server cannot "see" any cookies that the client browser has stored for the target site, your proxied request will be effectively anonymous and so, depending on the target site, may fail or give you a different response than you'd get requesting that resource in the browser.
If you have a relationship with the third-party URL (that is, if you control it or are able to work with the people who do) they can give access for cross-domain requests in the browser using CORS (which is only supported in modern browsers) or JSON-P (an older workaround that predates CORS).
The third-party provider could also give you access to the data you want at an endpoint that is designed to accept requests from other servers and that provides a mechanism for you to authenticate your app. The most popular protocol for this is OAuth.
As the other answers have stated, using the requests module for Python is the best way to tackle this from the coding perspective. However, as the comments mentioned (and the reason I came to this question), this can give an error saying that the request was denied. That error is likely caused by SELinux.
To check if this is the issue first make sure SELinux is enabled with this command:
sestatus
If 'Current Mode' is 'enforcing' then SELinux is enabled.
Next, get the current boolean values that apply directly to Apache with this command:
getsebool -a | grep httpd
Look for the setting 'httpd_can_network_connect'; this determines whether Apache is allowed to make outbound TCP connections to the network. If it is on, then all Apache TCP connections are allowed. To turn it on, run the following as root:
setsebool -P httpd_can_network_connect 1
If you only need database access (I had this problem before, which is why I suspected SELinux here), then it would probably be better to only turn on 'httpd_can_network_connect_db'.
The purpose of this policy is that if a hacker were to hijack your Apache server, they would not be able to use it to reach out into the rest of your internal network.
This probably would've been better as a comment, but I don't have enough rep.
Sources:
https://tag1consulting.com/blog/stop-disabling-selinux
https://wiki.centos.org/TipsAndTricks/SelinuxBooleans

Python (Flask): redirect a request via the client

I have been trying to do something that I think should be pretty simple. The situation is as follows: the client requests a resource on my web server, and my Flask application processes the request and determines that the resource is actually located on another web server, so the client should make a request to that server instead.
I know I can use the redirect function to tell the client to send a request to the remote location, but my problem is that the remote location is the Amazon Glacier servers. These servers require a request to be made in a certain way, with a special signature (see http://docs.aws.amazon.com/amazonglacier/latest/dev/amazon-glacier-signing-requests.html). My Flask application knows how to make these requests in the required way. I essentially want to know whether it's possible to send a response to my client saying: send this request (generated by my application, with all the required signing) to the Amazon server.
Any ideas?
If the request can be encoded with GET params, like
http://www.redirecturl.com/?param1=bla&param2=blub
then it should work with no problem. Just construct the full URL as a string and pass it to redirect().
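As a small sketch of that approach in Flask (the route name is made up, and the parameter names and target URL are just the placeholders from above; note this only helps when the target needs no special headers):

try:
    from urllib.parse import urlencode  # Python 3
except ImportError:
    from urllib import urlencode        # Python 2

from flask import Flask, redirect

app = Flask(__name__)

@app.route("/fetch-elsewhere")
def fetch_elsewhere():
    # Encode everything the remote server needs as query parameters,
    # then send the client there with an HTTP redirect.
    params = urlencode({"param1": "bla", "param2": "blub"})
    return redirect("http://www.redirecturl.com/?" + params)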
As far as I know, you can't tell a client to send specific headers to an HTTP redirect URL.
Hitting the Glacier URL server-side would be easiest. Using JavaScript on the client side would only work if Glacier implemented CORS.
