Find Host and Port in a Django Application - python

I'm connecting to the Twitter Streaming API and am setting up the OAuth handshake. I need to request a token and send a callback_url in a params dictionary along with the POST request.
I've hardcoded the URL for development (http://localhost:8000/oauth), but when I deploy this will change. I want to set up something that will find the host and port and keep a reference to it, ideally looking like "http://%s/oauth" % (domain_name).
I've tried using both the os and socket modules; my code is below:
import socket

class OAuth:
    def request_oauthtoken(self):
        # Sequence attempts to find the current domain name, so calling the
        # API doesn't require hard-coding it. It's not returning what I expect.
        name = socket.gethostname()
        ip = socket.gethostbyname(name)
        domain = socket.gethostbyaddr(ip)
        payload = {'oauth_callback': 'http://localhost:8000/oauth'}
        print(domain)
        return payload
domain returns ('justins-mbp-2.local.tld', ['145.15.168.192.in-addr.arpa'], ['192.168.15.145'])
name returns the first element of that tuple, and ip returns the last item unwrapped from its list.
I'm looking for a return value of localhost or localhost:8000. I can work with either one.

Call request.build_absolute_uri(), then extract the domain.
From the docs:
HttpRequest.build_absolute_uri(location) Returns the absolute URI form
of location. If no location is provided, the location will be set to
request.get_full_path().
If the location is already an absolute URI, it will not be altered.
Otherwise the absolute URI is built using the server variables
available in this request.
Example: "http://example.com/music/bands/the_beatles/?print=true"

Related

How do you check different API calls until one is successful in Python

I have a server list:
hosts=['server1','server2','server3','server4']
There are 4 monitoring tools accessed by API calls. The monitoring environment URLs are basically the same, except each environment has a unique ID that is part of the URL. These servers can be on any of the 4 monitoring tools. I need to find out which URL each server belongs to.
For example, these are the monitoring tools' URLs:
production_env="https://example.com/e/envid123"
dev_env="https://example.com/e/envid678"
test_env="https://example.com/e/envid567"
uat_env="https://example.com/e/envid1000"
Given a server name, for example "server1", I need to find out which env it belongs to.
The API URL would become https://example.com/e/envid123&serverName="server1"; this URL tells me whether server1 exists in production_env or not. I need to check each env URL until I find the given server.
I am trying something like this:
envId = ['envid123', 'envid678', 'envid567', 'envid1000']
for server in hosts:
    for id in envId:
        url = "https://example.com/e/" + id + &serverName=server
        resp = request.get(url)
Any ideas how I could best do this?
You need quotes around &serverName= to concatenate it into the string, but it's simpler to use a formatting method such as f-strings.
To find the desired URL, use resp.json() to get the decoded JSON and check the value of the appropriate dictionary element.
import requests

found = False
for server in hosts:
    for id in envId:
        url = f"https://example.com/e/{id}&serverName={server}"
        resp = requests.get(url)
        # totalCount is the field your monitoring API reports matches in
        if resp.json()['totalCount'] > 0:
            found = True
            print(f"Success at host = {server} id = {id}")
            break
    if found:
        break
else:
    print("No server found")

Issue with url_for mapping variables to URL if multiple routes are used

I've hit an issue with url_for, where it won't automatically remap the variable straight into the URL because there are two routes.
My use case is an API, where creating an object will return the same data as if a GET command was run on it.
Here's an example of the code:
@app.route('/test', methods=['POST'])
@app.route('/test/<string:name>', methods=['GET'])
def test(name=None):
    if request.method == 'POST':
        return redirect(url_for('test', name='xyz'))
    return name
If the first app.route is removed, then url_for('test', name='xyz') will correctly return "test/xyz".
However, with both app.route lines, it instead returns "test?name=xyz". This then causes name to be None, with the value actually located in request.args['name'].
I don't want to do if name is None: name = request.args.get('name'), so is there any way I can force it to only look at routes with a GET method? My case right now is simple enough that I could just do url_for('test') + '/xyz', but it seems like there should be a better way of doing this.
According to the Flask docs, you can specify which method to map against by using the _method argument.
flask.url_for(endpoint, **values)
And the values you can pass are:
endpoint – the endpoint of the URL (name of the function)
values – the variable arguments of the URL rule
_external – if set to True, an absolute URL is generated. Server address can be changed via SERVER_NAME configuration variable which falls back to the Host header, then to the IP and port of the request.
_scheme – a string specifying the desired URL scheme. The _external parameter must be set to True or a ValueError is raised. The default behavior uses the same scheme as the current request, or PREFERRED_URL_SCHEME from the app configuration if no request context is available. As of Werkzeug 0.10, this also can be set to an empty string to build protocol-relative URLs.
_anchor – if provided this is added as anchor to the URL.
_method – if provided this explicitly specifies an HTTP method. <---- This one
Specify the _method argument in url_for like this:
url_for('test', name='xyz', _method='GET')
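Put together with the routes from the question, a minimal sketch might look like this:
from flask import Flask, redirect, request, url_for

app = Flask(__name__)

@app.route('/test', methods=['POST'])
@app.route('/test/<string:name>', methods=['GET'])
def test(name=None):
    if request.method == 'POST':
        # _method='GET' makes url_for build against the GET rule,
        # producing "/test/xyz" instead of "/test?name=xyz".
        return redirect(url_for('test', name='xyz', _method='GET'))
    return name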

How to validate DNSSEC with Python? [duplicate]

As the title says, I want to programmatically check if a DNS response for a domain is protected with DNSSEC.
How could I do this?
It would be great, if there is a pythonic solution for this.
UPDATE:
changed request to response, sorry for the confusion
Using a DNS resolver (e.g. dnspython), you can query the domain for its DNSKEY RRset and turn on the DO (dnssec OK) query flag. If the query succeeds, the answer will have the AD (authenticated data) flag set and will contain the RRSIG signatures for the zone (if it is signed).
Update: a basic example using dnspython
import dns.name
import dns.query
import dns.dnssec
import dns.message
import dns.resolver
import dns.rdatatype

# get nameservers for target domain
response = dns.resolver.query('example.com.', dns.rdatatype.NS)

# we'll use the first nameserver in this example
nsname = response.rrset[0].to_text()  # name
response = dns.resolver.query(nsname, dns.rdatatype.A)
nsaddr = response.rrset[0].to_text()  # IPv4

# get DNSKEY for zone, with the DO flag set
request = dns.message.make_query('example.com.',
                                 dns.rdatatype.DNSKEY,
                                 want_dnssec=True)

# send the query
response = dns.query.udp(request, nsaddr)
if response.rcode() != 0:
    # query failed (server error or no DNSKEY record)
    raise Exception('query failed')

# the answer should contain two RRSETs: DNSKEY and RRSIG(DNSKEY)
answer = response.answer
if len(answer) != 2:
    # something went wrong
    raise Exception('unexpected answer')

# the DNSKEY should be self-signed; validate it
name = dns.name.from_text('example.com.')
try:
    dns.dnssec.validate(answer[0], answer[1], {name: answer[0]})
except dns.dnssec.ValidationFailure:
    # be suspicious: validation failed
    raise
else:
    # we're good: there's a valid DNSSEC self-signed key for example.com
    print('DNSKEY for example.com validated')
To see if a particular request is protected, look at the DO flag in the request packet. Whatever language and library you use to interface to DNS should have an accessor for it (it may be called something else, like "dnssec").
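With dnspython, for instance, a small helper for that check might look like this (the helper name is just illustrative):
import dns.flags

def dnssec_requested(msg):
    # msg is a dns.message.Message (query or response); the DO bit
    # lives in the EDNS flags.
    return bool(msg.ednsflags & dns.flags.DO)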
The first answer is correct but incomplete if you want to know if a certain zone is protected. The described procedure will tell you if the zone's own data is signed. In order to check that the delegation to the zone is protected, you need to ask the parent zone's name servers for a (correctly signed) DS record for the zone you're interested in.
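A minimal sketch of that DS check with dnspython (reusing example.com from the earlier answer and assuming the resolver can return the parent zone's data):
import dns.resolver
import dns.rdatatype

try:
    # The DS record for a zone is served by its parent; if the delegation
    # is signed, this query returns one or more DS records.
    ds_answer = dns.resolver.query('example.com.', dns.rdatatype.DS)
    for rr in ds_answer.rrset:
        print(rr.to_text())
except dns.resolver.NoAnswer:
    # No DS record at the parent: the delegation is not protected.
    print('No DS record found; the delegation is not signed')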

How to configure a single flask application to handle multiple domains?

Currently, my flask application (that uses sessions) does the following to handle ONE domain:
app.config.from_object(settings)
and in the settings object:
SESSION_COOKIE_DOMAIN = ".first.com"
What I'd like to do now is to dynamically set the session cookie domain to handle, for example, requests from www.first.com and www.second.com. Please note that I'm talking about domains but not subdomains. Thank you.
Grepping for SESSION_COOKIE_DOMAIN in Flask's GitHub repo, one can see that it is used like this:
def get_cookie_domain(self, app):
    """Helpful helper method that returns the cookie domain that should
    be used for the session cookie if session cookies are used.
    """
    if app.config['SESSION_COOKIE_DOMAIN'] is not None:
        return app.config['SESSION_COOKIE_DOMAIN']
    if app.config['SERVER_NAME'] is not None:
        # chop off the port which is usually not supported by browsers
        rv = '.' + app.config['SERVER_NAME'].rsplit(':', 1)[0]
        # Google chrome does not like cookies set to .localhost, so
        # we just go with no domain then. Flask documents anyways that
        # cross domain cookies need a fully qualified domain name
        if rv == '.localhost':
            rv = None
        # If we infer the cookie domain from the server name we need
        # to check if we are in a subpath. In that case we can't
        # set a cross domain cookie.
        if rv is not None:
            path = self.get_cookie_path(app)
            if path != '/':
                rv = rv.lstrip('.')
        return rv
Grepping in the same way for get_cookie_domain(, you'll see:
def save_session(self, app, session, response):
    domain = self.get_cookie_domain(app)
    path = self.get_cookie_path(app)
    ...
OK. Now we only need to find out what domain name to use. Digging through the docs or code, you'll see that save_session() is called in a request context, so you just need to import the request object from the flask module:
from flask import request
and use it inside save_session() to determine the domain name for the cookies (e.g. from the Host header), like this:
def save_session(self, app, session, response):
    domain = '.' + request.headers['Host']
    path = self.get_cookie_path(app)
    # the rest of the method is intact
The only time you need to specify the cookie domain is when you send cookies back with the response object.
Also bear in mind that the Host header might be absent.
To wire up the whole thing, you'll need to specify your own version (a subclass) of SecureCookieSessionInterface:
app = Flask(__name__)
app.session_interface = MySessionInterface()
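where the subclass itself might look roughly like this (a sketch only: it overrides get_cookie_domain instead of save_session to avoid copying the rest of that method, and the class name matches the snippet above):
from flask import request
from flask.sessions import SecureCookieSessionInterface

class MySessionInterface(SecureCookieSessionInterface):
    def get_cookie_domain(self, app):
        host = request.headers.get('Host')
        if not host:
            # Fall back to Flask's default behaviour if Host is absent.
            return super(MySessionInterface, self).get_cookie_domain(app)
        # Strip the port and prefix with a dot, mirroring Flask's own logic.
        return '.' + host.rsplit(':', 1)[0]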
More doc links:
Response Object
Session Interface

pysimplesoap web service return connection refused

I've created some web services using pysimplesoap, as in this documentation:
https://code.google.com/p/pysimplesoap/wiki/SoapServer
When I tested it, I called it like this:
from SOAPpy import SOAPProxy
from SOAPpy import Types
namespace = "http://localhost:8008"
url = "http://localhost:8008"
proxy = SOAPProxy(url, namespace)
response = proxy.dummy(times=5, name="test")
print response
It worked for all of my web services, but when I try to call it using a library that needs the WSDL specified, it returns "Could not connect to host".
To solve my problem, I used the ".wsdl()" object to generate the correct WSDL and saved it into a file; the WSDL generated by default wasn't correct: it was missing variable types and the correct server address...
The server name localhost is only meaningful on your computer. Once outside, other computers won't be able to see it.
1) Find out your external IP, with http://www.whatismyip.com/ or another service. Note that IPs change over time.
2) Plug the IP into http://www.soapclient.com/soaptest.html
If your local service is answering IP requests as well as from localhost, you're done!
