Python requests: get website using custom DNS

I need to access a specific server that only responds to connections resolved through a specific DNS server. So before connecting to that website I have to set my system DNS servers to custom IPs. That works, but now I'm writing a Python script with the requests module and I want to access that server from it. How can I set custom DNS IPs on a requests session so that a GET request uses those DNS servers?
I should say that I just need a JSON file from that server, so it's exhausting to change my system DNS servers every time.
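One workaround, since requests has no DNS setting of its own: resolve the name yourself (for example with dnspython against your chosen nameserver) and short-circuit the system resolver by patching socket.getaddrinfo, which requests uses for lookups. A sketch with an illustrative hostname and a pre-resolved address:

```python
import socket

# Hypothetical mapping from hostname to the address your custom DNS
# server returns (resolve it beforehand, e.g. with dnspython).
DNS_OVERRIDES = {"api.example.test": "127.0.0.1"}

_system_getaddrinfo = socket.getaddrinfo

def _patched_getaddrinfo(host, *args, **kwargs):
    # Swap in the override before the normal lookup runs; unknown
    # hosts fall through to the system resolver untouched.
    return _system_getaddrinfo(DNS_OVERRIDES.get(host, host), *args, **kwargs)

socket.getaddrinfo = _patched_getaddrinfo

# After the patch, requests (and anything else using the socket module)
# resolves api.example.test to 127.0.0.1:
# requests.get("http://api.example.test/data.json")
```

Hostname and IP here are placeholders; the patch is process-wide, so restore the original function if only some requests should use the custom resolution.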

Related

Can I intercept HTTP requests that are destined for another application and port using Python?

I am currently thinking about a project that automatically executes defensive actions, such as adding the IP of a DoS attacker to the iptables list to drop their requests permanently.
My question is: can I intercept, using Python, HTTP requests that are destined for another application? For example, can I count how many HTTP POST requests an Apache server running on port 80 received, extract their senders, and so on?
I tried looking into the requests documentation but couldn't find anything relevant.
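requests is a client library and cannot observe traffic addressed to another process; packet capture (for example scapy's sniff() on port 80, or a raw socket, both requiring sufficient privileges) is needed for that. A minimal sketch of the counting step only, assuming a capture layer hands over (source IP, TCP payload) pairs:

```python
def post_senders(payloads):
    """Return the source IPs of HTTP POST requests among captured TCP
    payloads. `payloads` is an iterable of (src_ip, raw_bytes) pairs,
    e.g. extracted from packets captured by a sniffer (illustrative).
    """
    senders = []
    for src_ip, raw in payloads:
        # An HTTP request line starts the payload, e.g. b"POST /path HTTP/1.1"
        if raw.startswith(b"POST "):
            senders.append(src_ip)
    return senders
```

The interesting part of the project then becomes the capture layer and reassembling TCP payloads, which a tool like scapy handles for you.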

Getting Client IP address while using development server in Flask

What I am doing
I have a Flask website and I am making it accessible to a client using ngrok tunneling.
What I want
I am trying to get the IP address of the client.
What I have done so far
I have tried these so far:
request.environ.get('HTTP_X_REAL_IP', request.remote_addr)
and
request.environ['REMOTE_ADDR']
But both of them return 127.0.0.1. I have also checked out this question, but it didn't help, since the answers written there are for getting client IPs in a production server environment, whereas I am looking for a method to get the client's IP address in development mode with the server tunneled through ngrok.
I have two possible methods in mind:
If I can get the IP address of the connection requester from ngrok. I don't know whether there is any way to do that, but it would solve my problem.
Or I add something to my JavaScript code so that whenever the index page loads, it sends an AJAX request to the server telling it the client's IP address. (Correct me if I'm wrong.)
In the case of Flask, you cannot get the real client IP address directly on the server side behind the tunnel, but you can get it if your web app grabs the client IP and then sends it back to the server in an AJAX request so that you can log it.
That's the only practical way I think you can do it in Flask.
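It's also worth checking the forwarding headers: ngrok typically passes the original client address through in X-Forwarded-For, which Flask exposes via request.environ. A stdlib-only sketch of reading it (header values here are illustrative):

```python
def extract_client_ip(environ):
    """Prefer the proxy-supplied X-Forwarded-For header (which tunnels
    such as ngrok typically add) over the raw socket peer address,
    which behind a tunnel is always the local tunnel endpoint."""
    forwarded = environ.get("HTTP_X_FORWARDED_FOR", "")
    if forwarded:
        # The header may list several hops; the original client is first.
        return forwarded.split(",")[0].strip()
    return environ.get("REMOTE_ADDR", "")

# In a Flask view this would be: extract_client_ip(request.environ)
```

Note that X-Forwarded-For is client-controllable in general, so treat it as informational unless the only route to the app is through the tunnel.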

How to bind a Python socket to a specific domain?

I have a Heroku application with the domain moarcatz.tk. It listens for non-HTTP requests using Python's socket module.
The documentation states that if I bind a socket to an empty string as the IP address, it will listen on all available interfaces. I kept getting empty requests from various IP addresses, so I assumed that setting the socket to listen only for connections to moarcatz.tk would fix the problem. But I don't know how to bind a socket to a domain name.
I tried 'moarcatz.tk' and gethostbyname('moarcatz.tk'), but both give me this error:
OSError: [Errno 99] Cannot assign requested address
What's up with that?
You can't control this via your code, but you can control this via Heroku.
Heroku has a pretty nifty DNS CNAME tool you can use to ensure your app ONLY listens to incoming requests for specific domains -- it's part of the core Heroku platform.
What you do is this:
heroku domains:add www.moarcatz.tk
Then, go to your DNS provider for moarcatz.tk and add a CNAME record for:
www <heroku-app-name>.herokuapp.com
This will do two things:
Point your DNS to Heroku.
Make Heroku filter the incoming traffic and ALLOW it for that specific domain.

Using IP authenticated proxies in a distributed crawler

I'm working on a distributed web crawler in Python running on a cluster of CentOS 6.3 servers. The crawler uses many proxies from different proxy providers, and everything works like a charm for providers with username/password authentication. But now we have bought some proxies that use IP-based authentication, which means that when I want to crawl a webpage through one of these proxies, I need to make the request from a specific subset of our servers.
The question is: is there a way in Python (using a library or other software) to make a request to a domain passing through two proxies? (One proxy is from the subset needed for the IP authentication, and the second is the actual proxy from the provider.) Or is there another way to do this without setting up that subset of our servers as proxies?
The code I'm using now to make the request through a proxy uses the requests library:
import requests
from requests.auth import HTTPProxyAuth

proxy_obj = {
    'http': proxy['ip']
}
auth = HTTPProxyAuth(proxy['username'], proxy['password'])
data = requests.get(url, proxies=proxy_obj, auth=auth)
Thanks in advance!
is there a way in Python (using a library/software) to make a request
to a domain passing through 2 proxies?
If you need to go through two proxies, it looks like you'll have to use HTTP tunneling: any host which isn't on the authorized list would have to connect to an HTTP proxy server on one of the hosts which is, and use the HTTP CONNECT method to create a tunnel to the remote proxy. It may not be possible to achieve that with the requests library alone, though.
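A minimal sketch of the CONNECT step (host names are illustrative): open a socket to the first proxy, send a CONNECT request naming the second proxy, and after a 200 response everything written on the socket flows through the tunnel.

```python
import base64

def build_connect_request(host, port, username=None, password=None):
    """Build the HTTP CONNECT request sent to the first proxy to open
    a tunnel toward the second (remote) proxy. Hypothetical helper."""
    lines = [f"CONNECT {host}:{port} HTTP/1.1", f"Host: {host}:{port}"]
    if username is not None:
        # Basic auth for the first proxy, if it requires credentials.
        token = base64.b64encode(f"{username}:{password}".encode()).decode()
        lines.append(f"Proxy-Authorization: Basic {token}")
    return ("\r\n".join(lines) + "\r\n\r\n").encode()

# Usage sketch (not run here):
#   sock = socket.create_connection((first_proxy_ip, first_proxy_port))
#   sock.sendall(build_connect_request("remote.proxy.example", 8080))
#   ...check for "HTTP/1.1 200" in the reply, then speak HTTP
#   (with its own Proxy-Authorization header) to the remote proxy.
```

The remote proxy sees the connection coming from the authorized host, which is exactly what IP-based authentication requires.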
Or is there another way to do this without setting up this subset of
our servers as proxies?
Assuming that the remote proxies which use IP address-based authentication are all expecting the same IP address, then you could instead configure a NAT router, between your cluster and the remote proxies, to translate all outbound HTTP requests to come from that single IP address.
But, before you look into implementing either of these unnecessarily complicated options, and given that you're paying for this service, can't you just ask the provider to allow requests from the entire range of IP addresses which you're currently using?

Get the IP address of a client of a SOAP service

I'm writing a SOAP service using Python and soaplib. I need to get the IP addresses of all clients of the service so I can store them in a log file. How can I do that?
One way to do this is to implement a "hook" which is called at different stages of the WSGI execution. See the section "Hooks" in the soaplib README file for details, and the example hook.py in that distribution.
For example, you could implement onMethodExec and then use the WSGI environ.get('REMOTE_ADDR') to obtain the client's IP address and log it.
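The same idea can also be done one layer lower, without touching soaplib's hook API: wrap the WSGI app in a small middleware that logs REMOTE_ADDR on every request. A sketch (logger and names are illustrative):

```python
import logging

class ClientIPLoggingMiddleware:
    """WSGI middleware that logs the caller's address for every request
    before handing off to the wrapped application."""

    def __init__(self, app, logger=None):
        self.app = app
        self.logger = logger or logging.getLogger("soap.access")

    def __call__(self, environ, start_response):
        # REMOTE_ADDR is the peer address as seen by the WSGI server.
        self.logger.info("SOAP call from %s",
                         environ.get("REMOTE_ADDR", "unknown"))
        return self.app(environ, start_response)

# Usage sketch: wsgi_app = ClientIPLoggingMiddleware(soap_wsgi_app)
```

This logs every request, including non-SOAP ones, whereas the onMethodExec hook fires only for actual method calls; pick whichever granularity you need.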
