I work on a Django-based app, and I want to know if there's a way to tell whether my server uses HTTP or HTTPS connections.
I know that using

import socket
if socket.gethostname().startswith('****'):

I can get the hostname. Is it possible to do something similar to find out whether the host uses an SSL certificate?
PS: I'm a rookie here, so I'm asking whether it's possible and, if it is, how I should do it.
Thanks
It's completely possible:
def some_request_function(request):
    if request.is_secure():
        # You are safe!
        ...
    else:
        # You are NOT safe!
        ...
More details:
https://docs.djangoproject.com/en/2.0/ref/request-response/#django.http.HttpRequest.is_secure
There simply is an is_secure() method on the request object, returning True if the connection is secure.
Depending on your specific server configuration you may also need to set SECURE_PROXY_SSL_HEADER in your settings.
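For example, with a proxy in front that sets X-Forwarded-Proto, a commonly used value is the following (check which header your particular proxy sends; trusting it is only safe if the proxy always overwrites it):

# settings.py - trust the proxy to report the original protocol
SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')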
Django request objects (HttpRequest) have an is_secure() method:
https://docs.djangoproject.com/en/dev/ref/request-response/#django.http.HttpRequest.is_secure
I just deployed a Flask app on Webfaction and I've noticed that request.remote_addr is always 127.0.0.1, which of course isn't of much use.
How can I get the real IP address of the user in Flask on Webfaction?
Thanks!
If there is a proxy in front of Flask, then something like this will get the real IP in Flask:
if request.headers.getlist("X-Forwarded-For"):
    ip = request.headers.getlist("X-Forwarded-For")[0]
else:
    ip = request.remote_addr
Update: Eli makes a very good point in his comment. There could be security issues if you simply use this. Read Eli's post for more details.
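As a minimal sketch of a safer variant, assuming exactly one trusted proxy that appends the client address to the header:

headers_list = request.headers.getlist("X-Forwarded-For")
if headers_list:
    # The right-most address is the one our own proxy appended;
    # anything before it can be forged by the client.
    ip = headers_list[-1].split(",")[-1].strip()
else:
    ip = request.remote_addr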
Werkzeug middleware
Flask's documentation is pretty specific about recommended reverse proxy server setup:
If you deploy your application using one of these [WSGI] servers behind an HTTP [reverse] proxy you will need to rewrite a few headers in order for the application to work [properly]. The two problematic values in the WSGI environment usually are REMOTE_ADDR and HTTP_HOST... Werkzeug ships a fixer that will solve some common setups, but you might want to write your own WSGI middleware for specific setups.
And also about security consideration:
Please keep in mind that it is a security issue to use such a middleware in a non-proxy setup because it will blindly trust the incoming headers which might be forged by malicious clients.
The suggested code (that installs the middleware) that will make request.remote_addr return client IP address is:
from werkzeug.contrib.fixers import ProxyFix
app.wsgi_app = ProxyFix(app.wsgi_app, num_proxies=1)
Note num_proxies which is 1 by default. It's the number of proxy servers in front of the app.
The actual code is as follows (latest werkzeug==0.14.1 at the time of writing):
def get_remote_addr(self, forwarded_for):
    if len(forwarded_for) >= self.num_proxies:
        return forwarded_for[-self.num_proxies]
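To illustrate the selection logic with hypothetical addresses:

# Hypothetical: the client forged the first entry, the trusted proxy
# appended the address it actually saw.
forwarded_for = ['10.0.0.1', '203.0.113.7']
num_proxies = 1
assert forwarded_for[-num_proxies] == '203.0.113.7'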
Webfaction
Webfaction's documentation about Accessing REMOTE_ADDR says:
...the IP address is available as the first IP address in the comma separated list in the HTTP_X_FORWARDED_FOR header.
They don't say what they do when a client request already contains an X-Forwarded-For header, but following common sense I would assume they replace it. Thus for Webfaction num_proxies should be set to 0: since forwarded_for[-0] is just forwarded_for[0], ProxyFix then returns the first address in the list, which is exactly what Webfaction documents as the client address.
Nginx
Nginx is more explicit about its $proxy_add_x_forwarded_for:
the “X-Forwarded-For” client request header field with the $remote_addr variable appended to it, separated by a comma. If the “X-Forwarded-For” field is not present in the client request header, the $proxy_add_x_forwarded_for variable is equal to the $remote_addr variable.
For Nginx in front of the app num_proxies should be left at default 1.
Rewriting Ignas's answer:
headers_list = request.headers.getlist("X-Forwarded-For")
user_ip = headers_list[0] if headers_list else request.remote_addr
Remember to read Eli's post about spoofing considerations.
You can use request.access_route to access the list of IPs:
if len(request.access_route) > 1:
    return request.access_route[-1]
else:
    return request.access_route[0]
Update:
You can just write this:
return request.access_route[-1]
The problem is there's probably some kind of proxy in front of Flask. In this case the "real" IP address can often be found in request.headers['X-Forwarded-For'].
I'm trying to use the salesforce-python-toolkit to make web services calls to the Salesforce API, however I'm having trouble getting the client to go through a proxy. Since the toolkit is based on top of suds, I tried going down to use just suds itself to see if I could get it to respect the proxy setting there, but it didn't work either.
This is tested with suds 0.3.9 on both OS X 10.7 (Python 2.7) and Ubuntu 12.04.
An example request I've made that did not end up going through the proxy (just Burp or Charles Proxy running locally):
import suds
ws = suds.client.Client('file://sandbox.xml', proxy={'http': 'http://localhost:8888'})
ws.service.login('user','pass')
I've tried various things with the proxy - dropping http://, using an IP, using an FQDN. I've stepped through the code in pdb and seen it setting the proxy option. I've also tried instantiating the client without the proxy and then setting it with:
ws.set_options(proxy={'http':'http://localhost:8888'})
Is proxy no longer used by suds? I don't see it listed directly here: http://jortel.fedorapeople.org/suds/doc/suds.options.Options-class.html, but I do see it under transport. Do I need to set it differently, through a transport? When I stepped through in pdb it did look like it was using a transport, but I'm not sure how.
Thank you!
I went into #suds on freenode and Xelnor/rbarrois provided a great answer! Apparently the custom proxy mapping in suds overrides urllib2's default behavior of using the system proxy environment variables. This solution now relies on having the http_proxy/https_proxy/no_proxy environment variables set accordingly.
I hope this helps anyone else running into issues with proxies and suds (or other libraries that use suds). https://gist.github.com/3721801
from suds.transport.http import HttpTransport as SudsHttpTransport

class WellBehavedHttpTransport(SudsHttpTransport):
    """HttpTransport which properly obeys the ``*_proxy`` environment variables."""

    def u2handlers(self):
        """Return a list of specific handlers to add.

        The urllib2 logic regarding ``build_opener(*handlers)`` is:
        - It has a list of default handlers to use
        - If a subclass or an instance of one of those default handlers is given
          in ``*handlers``, it overrides the default one.

        Suds uses a custom {'protocol': 'proxy'} mapping in self.proxy, and adds
        a ProxyHandler(self.proxy) to that list of handlers.
        This overrides the default behaviour of urllib2, which would otherwise
        use the system configuration (environment variables on Linux, System
        Configuration on Mac OS, ...) to determine which proxies to use for
        the current protocol, and when not to use a proxy (no_proxy).

        Thus, passing an empty list will use the default ProxyHandler which
        behaves correctly.
        """
        return []

client = suds.client.Client(my_wsdl, transport=WellBehavedHttpTransport())
I think you can do it by using a urllib2 opener, like below.
import urllib2
import suds

t = suds.transport.http.HttpTransport()
proxy = urllib2.ProxyHandler({'http': 'http://localhost:8888'})
opener = urllib2.build_opener(proxy)
t.urlopener = opener
ws = suds.client.Client('file://sandbox.xml', transport=t)
I was actually able to get it working by doing two things:
making sure there were keys in the proxy dict for http as well as https.
setting the proxy using set_options AFTER creation of the client.
So, my relevant code looks like this:
self.suds_client = suds.client.Client(wsdl)
self.suds_client.set_options(proxy={'http': 'http://localhost:8888', 'https': 'http://localhost:8888'})
I had multiple issues using suds; even though my proxy was configured properly, I could not connect to the endpoint WSDL. After spending significant time attempting to formulate a workaround, I decided to give soap2py a shot - https://code.google.com/p/pysimplesoap/wiki/SoapClient
Worked straight off the bat.
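For reference, a minimal pysimplesoap sketch (the WSDL URL and method name are hypothetical; its default urllib2-based transport picks up the standard http_proxy/https_proxy environment variables):

from pysimplesoap.client import SoapClient

# Hypothetical service; remote operations are called as attributes.
client = SoapClient(wsdl='https://example.com/service?wsdl')
result = client.login(username='user', password='pass')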
For anyone who's attempting cji's solution over HTTPS, you actually need to keep one of the handlers for basic authentication. I'm also using Python 3.7, so urllib2 has been replaced with urllib.request.
from suds.transport.https import HttpAuthenticated as SudsHttpsTransport
from urllib.request import HTTPBasicAuthHandler

class WellBehavedHttpsTransport(SudsHttpsTransport):
    """HttpsTransport which properly obeys the ``*_proxy`` environment variables."""

    def u2handlers(self):
        """Return a list of specific handlers to add.

        The urllib2 logic regarding ``build_opener(*handlers)`` is:
        - It has a list of default handlers to use
        - If a subclass or an instance of one of those default handlers is given
          in ``*handlers``, it overrides the default one.

        Suds uses a custom {'protocol': 'proxy'} mapping in self.proxy, and adds
        a ProxyHandler(self.proxy) to that list of handlers.
        This overrides the default behaviour of urllib2, which would otherwise
        use the system configuration (environment variables on Linux, System
        Configuration on Mac OS, ...) to determine which proxies to use for
        the current protocol, and when not to use a proxy (no_proxy).

        Thus, passing an empty list (aside from the BasicAuthHandler)
        will use the default ProxyHandler which behaves correctly.
        """
        return [HTTPBasicAuthHandler(self.pm)]
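Usage then mirrors the HTTP variant above:

client = suds.client.Client(my_wsdl, transport=WellBehavedHttpsTransport())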
I want to write a script to support my clients online. I decided to use the ICQ protocol for this (a kind of ICQ bot).
I have 25 ICQ UINs. I need something that will be able to:
Make them all online
If some UINs get disconnected, reconnect them.
Use a proxy to log in, because the ICQ server won't accept that many connections from one IP.
Receive some messages and answer them.
What should I use to do this?
Thanks a lot.
I suggest using XMPP (a.k.a. Jabber) instead of ICQ, really.
It's a free protocol, and there are Python APIs for it, like jabber.py and xmpppy.
xmpppy is as easy as:
import xmpp

jid = xmpp.protocol.JID('your id')
cl = xmpp.Client(jid.getDomain(), debug=[])
cl.connect()
cl.auth(jid.getNode(), 'your password')
cl.send(xmpp.protocol.Message('receiver id', 'your text'))
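Receiving and answering messages works the same way with a handler callback and a processing loop; a minimal sketch (the handler and reply text are illustrative):

# Register a callback for incoming messages and poll the connection.
def on_message(conn, msg):
    conn.send(xmpp.protocol.Message(msg.getFrom(), 'Re: ' + str(msg.getBody())))

cl.RegisterHandler('message', on_message)
cl.sendInitPresence()  # announce availability so the account shows as online
while True:
    cl.Process(1)  # handle incoming stanzas, 1-second timeout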
Also, you can use so-called transports to transparently send/receive messages over other protocols like ICQ, MSN, AOL, etc., which may be what you need if you really need ICQ.
Otherwise, I only know about NanoICQ, which claims to be a Python-based ICQ client; I don't know if the project is still active...
I am trying to add authentication to an XML-RPC server (which will be running on nodes of a P2P network) without using user:password@host, as this would reveal the password to all attackers. The point of the authentication is basically to create a private network, preventing unauthorised users from accessing it.
My solution to this was to create a challenge-response system very similar to this, but I have no clue how to add it to the XML-RPC server code.
I found a similar question (Where custom authentication was needed) here.
So I tried creating a module that would be called whenever a client connected to the server. This would connect to a challenge-response server running on the client and, if the client responded correctly, would return True. The only problem was that I could only call the module once before getting a "reactor cannot be restarted" error. So is there some way of having a class whose check() function connects and does this every time it is called?
Would the simplest thing be to connect using SSL? Would that protect the password? That solution would not be optimal, though, as I am trying to avoid having to generate SSL certificates for all the nodes.
Don't invent your own authentication scheme. There are plenty of great schemes already, and you don't want to become responsible for doing the security research into what vulnerabilities exist in your invention.
There are two very widely supported authentication mechanisms for HTTP (over which XML-RPC runs, therefore they apply to XML-RPC). One is "Basic" and the other is "Digest". "Basic" is fine if you decide to run over SSL. Digest is more appropriate if you really can't use SSL.
Both are supported by Twisted Web via twisted.web.guard.HTTPAuthSessionWrapper, with copious documentation.
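A minimal sketch of that wiring for an XML-RPC resource (the realm and the in-memory checker are illustrative; use a real credentials checker in production):

from zope.interface import implementer
from twisted.cred.checkers import InMemoryUsernamePasswordDatabaseDontUse
from twisted.cred.portal import IRealm, Portal
from twisted.internet import reactor
from twisted.web import guard, server
from twisted.web.resource import IResource
from twisted.web.xmlrpc import XMLRPC

class Api(XMLRPC):
    def xmlrpc_echo(self, value):
        return value

@implementer(IRealm)
class XmlRpcRealm(object):
    # Hand every authenticated user the same XML-RPC resource.
    def requestAvatar(self, avatarId, mind, *interfaces):
        if IResource in interfaces:
            return (IResource, Api(), lambda: None)
        raise NotImplementedError()

checker = InMemoryUsernamePasswordDatabaseDontUse(alice='secret')
portal = Portal(XmlRpcRealm(), [checker])
wrapper = guard.HTTPAuthSessionWrapper(
    portal, [guard.DigestCredentialFactory('md5', 'example.org')])
reactor.listenTCP(8080, server.Site(wrapper))
reactor.run()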
Based on your problem description, it sounds like the Secure Remote Password Protocol might be what you're looking for. It's a password-based mechanism that provides strong, mutual authentication without the complexity of SSL certificate management. It may not be quite as flexible as SSL certificates but it's easy to use and understand (the full protocol description fits on a single page). I've often found it a useful tool for situations where a trusted third party (aka Kerberos/CA authorities) isn't appropriate.
For anyone who was looking for a full example, below is mine (thanks to Rakis for pointing me in the right direction). In this example the user and password are stored in a file called 'passwd' (see the first useful link for more details and how to change that).
Server:
#!/usr/bin/env python
import bjsonrpc
from SRPSocket import SRPSocket
import SocketServer
from bjsonrpc.handlers import BaseHandler
import time

class handler(BaseHandler):
    def time(self):
        return time.time()

class SecureServer(SRPSocket.SRPHost):
    def auth_socket(self, socket):
        server = bjsonrpc.server.Server(socket, handler_factory=handler)
        server.serve()

s = SocketServer.ForkingTCPServer(('', 1337), SecureServer)
s.serve_forever()
Client:
#!/usr/bin/env python
import bjsonrpc
from bjsonrpc.handlers import BaseHandler
from SRPSocket import SRPSocket
import time

class handler(BaseHandler):
    def time(self):
        return time.time()

socket, key = SRPSocket.SRPSocket('localhost', 1337, 'dht', 'testpass')
connection = bjsonrpc.connection.Connection(socket, handler_factory=handler)
test = connection.call.time()
print test
time.sleep(1)
Some useful links:
http://members.tripod.com/professor_tom/archives/srpsocket.html
http://packages.python.org/bjsonrpc/tutorial1/index.html
I have a server that has to respond to HTTP and XML-RPC requests. Right now I have an instance of SimpleXMLRPCServer, and an instance of BaseHTTPServer.HTTPServer with a custom request handler, running on different ports. I'd like to run both services on a single port.
I think it should be possible to modify the CGIXMLRPCRequestHandler class to also serve custom HTTP requests on some paths, or alternately, to use multiple request handlers based on what path is requested. I'm not really sure what the cleanest way to do this would be, though.
Use the SimpleXMLRPCDispatcher class directly from your own request handler.
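A rough sketch of that approach (Python 2 module names; the GET body is a placeholder):

import BaseHTTPServer
from SimpleXMLRPCServer import SimpleXMLRPCDispatcher

dispatcher = SimpleXMLRPCDispatcher(allow_none=False, encoding=None)
dispatcher.register_function(pow)

class MixedHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_POST(self):
        # Treat POSTs as XML-RPC calls; the dispatcher marshals the reply.
        data = self.rfile.read(int(self.headers["Content-Length"]))
        response = dispatcher._marshaled_dispatch(data)
        self.send_response(200)
        self.send_header("Content-Type", "text/xml")
        self.send_header("Content-Length", str(len(response)))
        self.end_headers()
        self.wfile.write(response)

    def do_GET(self):
        # Serve plain HTTP for everything else.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write("regular HTTP response")

BaseHTTPServer.HTTPServer(("localhost", 8000), MixedHandler).serve_forever()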
Is there a reason not to run a real webserver out front with URL rewrites to the two ports you are using now? It's going to make life much easier in the long run.
The simplest way would be (tested on Python 3.3 but should work on 2.x with modified imports):
from http.server import SimpleHTTPRequestHandler
from xmlrpc.server import SimpleXMLRPCRequestHandler, SimpleXMLRPCServer

class MixRequestHandler(SimpleHTTPRequestHandler, SimpleXMLRPCRequestHandler):
    pass

srv = SimpleXMLRPCServer(("localhost", 8080), MixRequestHandler)
# normal stuff for SimpleXMLRPCServer
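For instance (continuing the sketch), registering functions and serving work as with any SimpleXMLRPCServer:

srv.register_function(pow)
srv.register_introspection_functions()
srv.serve_forever()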