How to make an insecure HTTPS request with an Authorization header in Python - python

I want to make a request to list my pods in Kubernetes, and this curl command works for me:
curl -X GET https://203.0.113.106:6443/api --header "Authorization: Bearer $TOKEN" --insecure
{
  "kind": "APIVersions",
  "versions": [
    "v1"
  ],
  "serverAddressByClientCIDRs": [
    {
      "clientCIDR": "0.0.0.0/0",
      "serverAddress": "203.0.113.106:6443"
    }
  ]
}
I want to do the same in Python on Windows, so I wrote this code:
import requests
auth_token='eyJhbGciOiJSUzI1NiIsImtpZCI6IiJ9.eyJpc3MiOiJrdWJlcm5ldGVzL3NlcnZpY2VhY2NvdW50Iiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9uYW1lc3BhY2UiOiJkZWZhdWx0Iiwia3ViZXJuZXRlcy5pby9zZXJ2aWNlYWNjb3VudC9zZWNyZXQubmFtZSI6ImRlZmF1bHQtdG9rZW4tN2Y2NGYiLCJrdWJlcm5ldGVzLmlvL3NlcnZpY2VhY2NvdW50L3NlcnZpY2UtYWNjb3VudC5uYW1lIjoiZGVmYXVsdCIsImt1YmVybmV0ZXMuaW8vc2VydmljZWFjY291bnQvc2VydmljZS1hY2NvdW50LnVpZCI6IjQ5OThhYzJkLTcwZGMtMTFlOS1hNzQ2LWZhMTYzZTBmNmI5MSIsInN1YiI6InN5c3RlbTpzZXJ2aWNlYWNjb3VudDpkZWZhdWx0OmRlZmF1bHQifQ.nhvHvzSkM_cy84mxra5i12jUNNrsWOf3XSoZtKvMXkIW07Ftto-ce8tr_gceAExbTYdVY5lmhxptHIosevfVCAafceLwE8wN3gXsaguaU8nZjXSR_fX5lFSK5J1s19hfh2vl2lKkb-A2_Yu2j3RdFn70LPL6dRKg9GmJIyIREICe3jq1ZATQj6V9rRjXg1wHc9qdnESmlb5qc9V9_ZJuiT_WbSXwzpgUmwm1YuwajxmV7rbFSFd-TKXsotGIwijoCztxbRRgy_8m_xoinC9UnUtLV-TrRNrSBhuuZe0Wl6ZoItjOSOfMj0NkE5EHPGqqvPjgRcSwMWvUc-pZ6UjoNw'
hed = {'Authorization': 'Bearer ' + auth_token}
url = 'https://203.0.113.106:6443/api'
requests.packages.urllib3.disable_warnings()
response = requests.get(url, headers=hed, verify=False)
print(response)
print(response.json())
When I run it in PyCharm, I get this error:
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='203.0.113.106', port=6443): Max retries exceeded with url: /api (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x06152BB0>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond'))
And if I run it on Ubuntu, I get the correct response:
{u'serverAddressByClientCIDRs': [{u'clientCIDR': u'0.0.0.0/0', u'serverAddress': u'203.0.113.106:6443'}], u'kind': u'APIVersions', u'versions': [u'v1']}
What should I do?
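The Python code itself looks fine; it is the same code that works on Ubuntu. [WinError 10060] is a network-level timeout, which means the Windows machine never reached 203.0.113.106:6443 at all, typically because of a firewall, VPN, or a mandatory HTTP proxy on that network. A minimal sketch to isolate the problem, assuming the Windows box might need to go through a proxy (the proxy address is a placeholder and YOUR_TOKEN stands in for the real bearer token):
import socket
import requests

API_HOST, API_PORT = "203.0.113.106", 6443

# 1) Can Windows reach the API server at the TCP level at all?
#    WinError 10060 is a network timeout, so this check usually fails too.
try:
    socket.create_connection((API_HOST, API_PORT), timeout=5).close()
    print("TCP connection OK")
except OSError as exc:
    print("Cannot reach the API server:", exc)

# 2) If the Windows machine has to go through a corporate proxy,
#    pass it explicitly (hypothetical proxy address).
requests.packages.urllib3.disable_warnings()
response = requests.get(
    f"https://{API_HOST}:{API_PORT}/api",
    headers={"Authorization": "Bearer " + "YOUR_TOKEN"},
    proxies={"https": "http://proxy.example.com:3128"},  # omit if no proxy is needed
    verify=False,
    timeout=10,  # fail fast instead of hanging
)
print(response.json())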

Related

Proxy works fine with HTTP but not HTTPS

I wanted to use proxies with Python requests, but when I run code like this
req = requests.get("https://httpbin.org/ip", proxies={'https': 'user:pass#host:port',
'http': 'user:pass#host:port'})
print(req.content)
I get this error
HTTPSConnectionPool(host='httpbin.org', port=443): Max retries exceeded with url: /ip (Caused by ProxyError('Cannot connect to proxy.', TimeoutError(10060, 'A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond', None, 10060, None
but if I use "http://httpbin.org/ip" instead of "https://httpbin.org/ip", it works fine.
Similarly, if I run this code
proxies = { 'http' : 'user:pass#host:port' }
req =requests.get("https://lumtest.com/myip.json",proxies =proxies )
print(req.content)
I get my own IP address, which means the proxy is not being used. But if I use the same URL over HTTP (just without the s in https):
proxies = { 'http' : 'user:pass#host:port' }
req =requests.get("http://lumtest.com/myip.json",proxies =proxies )
print(req.content)
I get the IP of the proxy, which means it's working fine.
Switching between HTTP and HTTPS doesn't bother me, but on some websites, when I use the proxies over HTTP, I get a different response:
b''
instead of the response I want. Those sites work fine without proxies over either HTTP or HTTPS, but with the proxies they only respond over HTTP, and even then not with a valid response.
I hope someone can help me because I have been trying to solve this forever.
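requests picks the proxy entry by the scheme of the URL being requested, so a proxies dict that only has an 'http' key leaves https:// requests going out directly, which is why you only see the proxy's IP over HTTP. Both keys usually point at the same proxy, and the value normally keeps the http:// scheme because the proxy itself is reached over plain HTTP and tunnels HTTPS via CONNECT. A minimal sketch, with the credentials and host left as placeholders:
import requests

proxy = "http://user:pass@host:port"  # placeholder credentials and address

# requests selects the proxy by the scheme of the target URL,
# so both keys are needed for https:// requests to use the proxy.
proxies = {"http": proxy, "https": proxy}

req = requests.get("https://lumtest.com/myip.json", proxies=proxies, timeout=15)
print(req.content)  # should now show the proxy's IP, not yours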

Python requests: Max retries exceeded with Dynatrace SaaS URL

I'm trying to get user session data from Dynatrace SaaS using a Python 3 script. The GET request gives me the error "Max retries exceeded with url: ... Failed to establish a new connection".
I'm not sure if I'm passing the token or the proxy wrong.
import requests
from requests.exceptions import ConnectionError

try:
    response = requests.get(
        'https://jbu0001.live.dynatrace.com/api/v1/userSessionQueryLanguage/table?SELECT%20*%20from%20usersession&explain=true',
        headers={'Authorization': 'Api-Token XXXXXXXX'},
        proxies={'http': 'http://proxy.com:PORT'},
        verify=False)
    if response.status_code == 200:
        response.text
except ConnectionError as e:
    print(e)
ERROR
HTTPSConnectionPool(host='jbu0000.live.dynatrace.com', port=443): Max retries exceeded with url: /api/v1/userSessionQueryLanguage/table?SELECT%20*%20from%20usersession&explain=true (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x00000160212D5B38>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed',))
But I'm able to get data using curl with the proxy from the same machine:
curl -X GET "https://jbu00XXX.live.dynatrace.com/api/v1/userSessionQueryLanguage/table?query=select%20*%20from%20usersession&explain=true" -H "accept: application/json" -H "Authorization: Api-Token XXXXXXXXX" --proxy http://proxy.com:PORT
Thanks in advance!
Are you sure the proxy is HTTP and not HTTPS? If so, try sending the API token as:
headers={'Authorization': r'Api-Token XXXXXXXX'}
or
headers={'Authorization': r'XXXXXXXX'}
The r prefix ensures you are sending raw string data.
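Another thing worth checking: the proxies dict only maps 'http', while the Dynatrace URL is https://, so requests bypasses the proxy and tries to resolve the host itself, which is exactly where [Errno 11001] getaddrinfo failed comes from when direct DNS is blocked behind a corporate proxy. A sketch, assuming the same proxy also handles HTTPS traffic (proxy.com:PORT and the token are kept as placeholders, and the query parameter matches the working curl command):
import requests

proxy = 'http://proxy.com:PORT'  # placeholder, as in the question

response = requests.get(
    'https://jbu0001.live.dynatrace.com/api/v1/userSessionQueryLanguage/table'
    '?query=select%20*%20from%20usersession&explain=true',
    headers={'Authorization': 'Api-Token XXXXXXXX'},
    # map both schemes so the https:// request actually goes through the proxy
    proxies={'http': proxy, 'https': proxy},
    verify=False,
    timeout=30)
print(response.status_code, response.text[:200])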

"Failed to establish a new connection" error using Python requests ([Errno -2] Name or service not known)

I am trying to make a request to an API with Python. I am able to make the request with curl without issue, but something is wrong with my Python request.
Why does this code
import requests
from requests.auth import HTTPBasicAuth

emailadd = 'user123#example.com'
domain = 'example.com'
spantoken = 'omitted'

def checkUserLicensed(useremail):
    url = ('https://api.spannigbackup.com/v1/users/' + useremail)
    print(url)
    response = requests.get(url, auth=(domain, spantoken))
    print(response)

checkUserLicensed(emailadd)
return this error:
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='api.spannigbackup.com', port=443): Max retries exceeded with url: /v1/users/user123#example.com
(Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7f73ca323748>: Failed to establish a new connection: [Errno -2] Name or service not known'))
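[Errno -2] Name or service not known means the hostname api.spannigbackup.com never resolved, so the failure happens before any HTTP request is sent. A minimal check from the same machine (nothing requests-specific):
import socket

host = 'api.spannigbackup.com'  # hostname taken from the traceback
try:
    print(socket.getaddrinfo(host, 443))
except socket.gaierror as exc:
    # The same resolver failure requests reports as
    # "[Errno -2] Name or service not known": the name does not resolve
    # from this machine, e.g. a typo in the domain or a DNS difference vs. curl.
    print('DNS lookup failed:', exc)
If curl succeeds from the same machine, compare the exact hostname it uses. Separately, a literal # in the email address is treated as a URL fragment by requests and never sent to the server, so it may need to be URL-encoded (assuming it is not just an anonymized @).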

Easiest way to use HTTPS through a proxy [duplicate]

I'm trying to use an HTTPS proxy in Python like this:
proxiesDict = {
    'http': 'http://' + proxy_line,
    'https': 'https://' + proxy_line
}
response = requests.get('https://api.ipify.org/?format=json', proxies=proxiesDict, allow_redirects=False)
proxy_line is a proxy read from a file in the format ip:port. I checked this HTTPS proxy in a browser and it works. But in Python this code hangs for a few seconds and then I get an exception:
HTTPSConnectionPool(host='api.ipify.org', port=443): Max retries exceeded with url: /?format=json (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x0425E450>: Failed to establish a new connection: [WinError 10060]
I tried to use a SOCKS5 proxy, and it works on SOCKS5 proxies with PySocks installed. But for HTTPS I get this exception. Can someone help me?
When specifying a proxy dict for requests, the key is the protocol and the value is the domain/IP. You don't need to specify http:// or https:// again in the value.
So, your proxiesDict will be:
proxiesDict = {
    'http': proxy_line,
    'https': proxy_line
}
You can also configure proxies by setting the environment variables:
$ export HTTP_PROXY="http://proxyIP:PORT"
$ export HTTPS_PROXY="http://proxyIP:PORT"
Then you only need to run your Python script without passing proxies to the request.
You can also include credentials in the proxy URL, in the form http://user:password@host.
For more information see this documentation: http://docs.python-requests.org/en/master/user/advanced/
Try using pycurl; this function may help:
import pycurl
from io import BytesIO

def pycurl_downloader(url, proxy_url, proxy_usr):
    """
    Download files with pycurl.
    Proxy configuration, for example:
        proxy_url = 'http://10.0.0.0:3128'
        proxy_usr = 'user:password'
    """
    c = pycurl.Curl()
    c.setopt(pycurl.FOLLOWLOCATION, 1)
    c.setopt(pycurl.MAXREDIRS, 5)
    c.setopt(pycurl.CONNECTTIMEOUT, 30)
    c.setopt(pycurl.AUTOREFERER, 1)
    if proxy_url:
        c.setopt(pycurl.PROXY, proxy_url)
    if proxy_usr:
        c.setopt(pycurl.PROXYUSERPWD, proxy_usr)
    content = BytesIO()
    c.setopt(pycurl.URL, url)
    c.setopt(c.WRITEFUNCTION, content.write)
    try:
        c.perform()
        c.close()
    except pycurl.error as error:
        # pycurl.error carries (errno, message)
        errno, errstr = error.args
        print('An error occurred:', errstr)
    return content.getvalue()
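For example, a call with placeholder values (the proxy address and credentials are just the ones from the docstring, not real):
data = pycurl_downloader('https://api.ipify.org/?format=json',
                         'http://10.0.0.0:3128',  # proxy URL (placeholder)
                         'user:password')         # proxy credentials (placeholder)
print(data)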

Fetching a .onion domain with requests

I'm trying to access the domain nzxj65x32vh2fkhk.onion using requests.
I have Tor running and I configured the session object's proxies correctly.
import requests

session = requests.session()
session.proxies = {'http': 'socks5://localhost:9050',
                   'https': 'socks5://localhost:9050'}

print(session.get('http://httpbin.org/ip').text)   # prints {"origin": "67.205.146.164"}
print(requests.get('http://httpbin.org/ip').text)  # prints {"origin": "5.102.254.76"}
However when I try to access the URL with the .onion domain I get the following error:
session.get('http://nzxj65x32vh2fkhk.onion/all')
ConnectionError: SOCKSHTTPConnectionPool(host='nzxj65x32vh2fkhk.onion', port=80): Max retries exceeded with url: /all (Caused by NewConnectionError('<requests.packages.urllib3.contrib.socks.SOCKSConnection object at 0x7f5e8c2dbbd0>: Failed to establish a new connection: [Errno -2] Name or service not known',))
I also tried to replace localhost with 127.0.0.1 as suggested in one of the answers. The result is the same unfortunately.
Performing the same request using urllib2 works just fine.
import socks, socket, urllib2

def create_connection(address, timeout=None, source_address=None):
    sock = socks.socksocket()
    sock.connect(address)
    return sock

socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5, '127.0.0.1', 9050)
socket.socket = socks.socksocket
socket.create_connection = create_connection

print(urllib2.urlopen('http://nzxj65x32vh2fkhk.onion/all').read())  # Prints the URL's contents
cURL also retrieves the contents of the page correctly.
I'm using Python 2.7.13, requests 2.13.0 and PySocks 1.6.7. Tor is running in a Docker container with the following command:
sudo docker run -it -p 8118:8118 -p 9050:9050 -d dperson/torproxy
What am I doing wrong here? What do I need to do to make requests recognize the .onion URLs?
The solution is to use the socks5h protocol, which makes the SOCKS proxy (Tor) resolve hostnames remotely instead of using the local resolver, which cannot resolve .onion addresses. See https://github.com/kennethreitz/requests/blob/e3f89bf23c53b98593e4248054661472aacac820/requests/packages/urllib3/contrib/socks.py#L158
The following code works as expected:
import requests

session = requests.session()
session.proxies = {'http': 'socks5h://localhost:9050',
                   'https': 'socks5h://localhost:9050'}

print(session.get('http://httpbin.org/ip').text)   # prints {"origin": "67.205.146.164"}
print(requests.get('http://httpbin.org/ip').text)  # prints {"origin": "5.102.254.76"}
print(session.get('http://nzxj65x32vh2fkhk.onion/all').text)  # Prints the contents of the page
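Note that SOCKS support in requests requires PySocks (installable via pip install requests[socks], already satisfied here since PySocks 1.6.7 is installed); the socks5h:// scheme then delegates DNS resolution to Tor, which is what makes .onion names resolvable at all.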
