cURL to Python: Connection error when using requests module - python

I want to port a bash script that uses a cURL command to Python 2.7.
The working cURL command is:
$ curl --data "vm_id='52e4130d-ffe0-495a-87c0-fc84200252ed'&gpu_ip='10.2.0.22'&gpu_port='8308'&mock_ip='10.254.254.254'&mock_port='8308'" http://rodvr-services:8080/rodvr-assign_gpu
And my Python script contains this:
import requests
import requests.packages.urllib3
requests.packages.urllib3.disable_warnings()
payload = {'vm_id': '52e4130d-ffe0-495a-87c0-fc84200252ed', 'gpu_ip': '10.2.0.22', 'gpu_port': '8308', 'mock_ip': '10.254.254.254', 'mock_port': '8308'}
r = requests.get('http://rodvr-services:8080/rodvr-assign_gpu', params=payload)
When I execute the script, I get the following error:
$ python exec.py
Traceback (most recent call last):
File "exec.py", line 9, in <module>
r = requests.post('http://rodvr-services:8080/rodvr-assign_gpu', params=payload)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 112, in post
return request('post', url, data=data, json=json, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 58, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 502, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 612, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 490, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', BadStatusLine('\n',))
Just in case, I checked what would happen using Python 3, and this is the output:
HTTPConnectionPool(host='rodvr-services', port=8080): Max retries exceeded with url: /rodvr-assign_gpu?mock_ip=10.254.254.254&vm_id=52e4130d-ffe0-495a-87c0-fc84200252ed&gpu_ip=10.2.0.22&mock_port=8308&gpu_port=8308 (Caused by <class 'http.client.BadStatusLine'>:
However, using the urllib2 library, it works:
data = "vm_id='52e4130d-ffe0-495a-87c0-fc84200252ed'&gpu_ip='10.2.0.22'&gpu_port='8308'&mock_ip='10.254.254.254'&mock_port='8308'"
r = urllib2.Request(url='http://rodvr-services:8080/rodvr-assign_gpu', data=data)
f = urllib2.urlopen(r)
print f.read()

Try r = requests.post('http://rodvr-services:8080/rodvr-assign_gpu', data=payload)
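The underlying difference: curl's --data flag sends a POST request with a form-encoded body, while params= only appends a query string to the URL (and the script used get, not post). A small sketch, preparing the request without sending it since rodvr-services is only reachable inside that network, shows what data= actually puts on the wire:

```python
import requests

payload = {
    'vm_id': '52e4130d-ffe0-495a-87c0-fc84200252ed',
    'gpu_ip': '10.2.0.22',
    'gpu_port': '8308',
    'mock_ip': '10.254.254.254',
    'mock_port': '8308',
}

# Preparing the request (without sending it) exposes the method and body.
# data= produces a form-encoded POST body -- the same thing curl --data sends.
prep = requests.Request(
    'POST', 'http://rodvr-services:8080/rodvr-assign_gpu', data=payload
).prepare()
print(prep.method)  # POST
print(prep.body)    # vm_id=52e4130d-...&gpu_ip=10.2.0.22&...
```

Note that the original cURL command wraps each value in literal single quotes (vm_id='52e...'); drop them unless the server actually expects quoted values.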

There is a website that converts a curl command to Python code.
You can see the code it suggests below:
import requests

data = [
    ('vm_id', '\'52e4130d-ffe0-495a-87c0-fc84200252ed\''),
    ('gpu_ip', '\'10.2.0.22\''),
    ('gpu_port', '\'8308\''),
    ('mock_ip', '\'10.254.254.254\''),
    ('mock_port', '\'8308\''),
]
requests.post('http://rodvr-services:8080/rodvr-assign_gpu', data=data)
# it is slightly different from your code
Due to problems with my laptop I can't test this code right now, but I hope it works for you.

Related

Python requests module timeouts with every https proxy and uses my real ip with http proxies

Basically I get this error with every single HTTPS proxy I've tried, on every website.
Code:
import requests
endpoint = 'https://ipinfo.io/json'
proxies = {'http':'http://45.167.95.184:8085','https':'https://45.167.95.184:8085'}
r = requests.get(endpoint,proxies=proxies,timeout=10)
Error:
Traceback (most recent call last):
File "<pyshell#5>", line 1, in <module>
r = requests.get(endpoint,proxies=proxies,timeout=10)
File "C:\Users\Utente\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "C:\Users\Utente\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Users\Utente\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\sessions.py", line 529, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\Utente\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\sessions.py", line 645, in send
r = adapter.send(request, **kwargs)
File "C:\Users\Utente\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\adapters.py", line 507, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: HTTPSConnectionPool(host='ipinfo.io', port=443): Max retries exceeded with url: /json (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x000001FDCFB9E6A0>, 'Connection to 45.167.95.184 timed out. (connect timeout=10)'))
And when I only use http
import requests
endpoint = 'https://ipinfo.io/json'
proxies = {'http':'http://45.167.95.184:8085'}
r = requests.get(endpoint,proxies=proxies,timeout=10)
The request is sent, but websites that return public IPs show my real IP. Both requests (2.27.1) and urllib3 (1.26.8) are updated to their latest versions; what could the issue be?
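One likely explanation for the second snippet: requests chooses a proxy by the scheme of the target URL, so a proxies dict with only an 'http' key is ignored entirely for an https:// endpoint and the request goes direct, which is why the site sees the real IP. A sketch of the scheme matching, reusing the question's proxy address:

```python
import requests
from requests.utils import select_proxy

endpoint = 'https://ipinfo.io/json'

# Only an 'http' key: HTTPS requests bypass the proxy completely.
proxies = {'http': 'http://45.167.95.184:8085'}
print(select_proxy(endpoint, proxies))  # None -> no proxy is used

# An 'https' key is required to tunnel HTTPS.  The proxy URL itself
# normally stays http:// -- requests opens a CONNECT tunnel through it.
# An https:// proxy URL means "speak TLS to the proxy itself", which
# most proxies don't support and which commonly ends in a ConnectTimeout.
proxies = {
    'http': 'http://45.167.95.184:8085',
    'https': 'http://45.167.95.184:8085',
}
print(select_proxy(endpoint, proxies))  # http://45.167.95.184:8085
```

If the request still times out with the corrected dict, the proxy itself is most likely dead or unreachable.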

requests.get crashes on certain urls

import requests
r = requests.get('https://www.whosampled.com/search/?q=marvin+gaye')
This returns the following error:
Traceback (most recent call last):
File "C:\Users\thoma\Downloads\RealisticMellowProfile\Python\New folder\Term project demo.py", line 8, in <module>
r = requests.get('https://www.whosampled.com/search/?q=marvin+gaye')
File "c:\users\thoma\miniconda3\lib\site-packages\requests\api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "c:\users\thoma\miniconda3\lib\site-packages\requests\api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "c:\users\thoma\miniconda3\lib\site-packages\requests\sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "c:\users\thoma\miniconda3\lib\site-packages\requests\sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "c:\users\thoma\miniconda3\lib\site-packages\requests\adapters.py", line 498, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
You can change the user agent so the server does not close the connection:
import requests
headers = {"User-Agent": "Mozilla/5.0"}
r = requests.get('https://www.whosampled.com/search/?q=marvin+gaye', headers=headers)
The URL is broken (or the server serving this URL is).
Try to get it with
wget https://www.whosampled.com/search/?q=marvin+gaye
or with
curl https://www.whosampled.com/search/?q=marvin+gaye
Use try / except to handle such situations.
However, you won't be able to get data from it (same as with wget or curl):
import requests

try:
    r = requests.get('https://www.whosampled.com/search/?q=marvin+gaye')
except requests.exceptions.ConnectionError:
    print("can't get data from this server")
    r = None

if r is not None:
    pass  # handle successful request
else:
    pass  # handle error situation

Imitating curl command with python requests

I run curl command, something like this:
curl --tlsv1.2 -k -i -X POST -d 'payload={<json-payload>}' https://url.com:/handles/handle1
It was working perfectly. Now I need to imitate this in Python. Referring to this solution, I tried running the following in the Python console:
>>> import requests
>>> data = 'payload={<json-payload>}'
>>> headers = {'Content-type':'application/json'}
>>> response = requests.post('https://url.com:/handles/handle',headers=headers,data=data)
But getting following error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/site-packages/requests/api.py", line 116, in post
return request('post', url, data=data, json=json, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='url.com', port=443): Max retries exceeded with url: /handles/handle (Caused by SSLError(SSLError(1, u'[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:618)'),))
How can I resolve this?
For ignoring TLS errors, like -k (--insecure) in curl, you need to use the verify=False parameter.
And to pass the POST data, use a dict:
data = {'payload': <json-payload>}
Now your request becomes:
requests.post('https://url.com:/handles/handle', headers=headers, data=data, verify=False)
If you want your POST data to be JSON serialized, use the json parameter instead of data:
requests.post('https://url.com:/handles/handle', headers=headers, json=data, verify=False)
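A note on verify=False: urllib3 then emits an InsecureRequestWarning on every request. A minimal sketch of silencing it and using json= (the payload and URL are placeholders, as in the question):

```python
import requests
import urllib3

# Silence the InsecureRequestWarning that verify=False triggers --
# the equivalent of curl -k staying quiet about the certificate.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

payload = {'key': 'value'}  # placeholder for the question's JSON payload

# json= serializes the dict and sets Content-Type: application/json
# automatically, so the explicit headers dict is no longer needed:
# r = requests.post('https://url.com/handles/handle1', json=payload, verify=False)
```

Keep in mind that verify=False disables certificate checking entirely; for production use, point verify= at the server's CA bundle instead.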

Http Request through proxy in python having # in password

How do I escape the # character in the password of a proxy so that Python can create the request correctly? I have tried \\ but still can't hit the URL correctly.
proxy = {
    "http": "http://UserName:PassWord#X.X.X.X:Port_No"
}
Updated question:
I am using the Python requests module for the HTTP request. It splits the string (to get the host) at the first occurrence of #, whereas it was supposed to split at the second #.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 335, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 438, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 327, in send
raise ConnectionError(e)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='XXXXXXXX#X.X.X.X', port=XXXXX): Max retries exceeded with url: http:/URL (Caused by <class 'socket.gaierror'>: [Errno -2] Name or service not known)
You have to URL-encode it, as in this post:
Escaping username characters in basic auth URLs
This way the # in the password becomes %23.
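The percent-encoding can be done with the standard library (urllib.parse.quote on Python 3, urllib.quote on Python 2); # encodes to %23. A sketch with a hypothetical password, keeping the question's host/port placeholders:

```python
from urllib.parse import quote  # Python 2: from urllib import quote

password = 'PassWord#123'  # hypothetical password containing '#'
encoded = quote(password, safe='')
print(encoded)  # PassWord%23123

# Standard proxy-URL form is user:password@host:port, with the
# password percent-encoded so '#' can't be mistaken for a delimiter:
proxy = {'http': 'http://UserName:' + encoded + '@X.X.X.X:Port_No'}
```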
You don't mention which library you are using to perform your HTTP requests, so you should consider using requests, not only to solve this problem, but because it is a great library.
Here is how to use a proxy with basic authentication:
import requests
proxy = {'http': 'http://UserName:PassWord#X.X.X.X:Port_No'}
r = requests.get("http://whereever.com", proxies=proxy)
Update
Successfully tested with requests and proxy URLs:
http://UserName:PassWord#127.0.0.1:1234
http://UserName:PassWord##127.0.0.1:1234
http://User#Name:PassWord#1234#127.0.0.1:1234
If, instead, you need to use Python's urllib2 library, you can do this:
import urllib2
handler = urllib2.ProxyHandler({'http': 'http://UserName:PassWord#X.X.X.X:Port_No'})
opener = urllib2.build_opener(handler)
r = opener.open('http://whereever.com')
Note that in neither case is it necessary to escape the #.
A third option is to set environment variables HTTP_PROXY and/or HTTPS_PROXY (in *nix).
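That environment-variable route can also be driven from Python itself; requests reads HTTP_PROXY/HTTPS_PROXY whenever no proxies= argument is passed. A sketch (address and credentials are placeholders):

```python
import os
from requests.utils import get_environ_proxies

# requests falls back to these variables when no proxies= is given.
# The '#' in the password is percent-encoded as %23.
os.environ['HTTP_PROXY'] = 'http://UserName:PassWord%23@1.2.3.4:8080'

# get_environ_proxies shows what requests would pick up for a URL:
print(get_environ_proxies('http://example.com'))
```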

Get content type of requested url using python

I am trying to get the headers of a URL in Python, following the tutorial at http://docs.python-requests.org/en/latest/. When I run the following code in Python IDLE, I get this error:
>>> import requests
>>> r = requests.get('https://api.github.com/user')
Traceback (most recent call last):
File "<pyshell#32>", line 1, in <module>
r = requests.get('https://api.github.com/user')
File "C:\Python27\lib\site-packages\requests-2.3.0-py2.7.egg\requests\api.py", line 55, in get
return request('get', url, **kwargs)
File "C:\Python27\lib\site-packages\requests-2.3.0-py2.7.egg\requests\api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests-2.3.0-py2.7.egg\requests\sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\site-packages\requests-2.3.0-py2.7.egg\requests\sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\lib\site-packages\requests-2.3.0-py2.7.egg\requests\adapters.py", line 375, in send
raise ConnectionError(e, request=request)
ConnectionError: HTTPSConnectionPool(host='api.github.com', port=443): Max retries exceeded with url: /user (Caused by <class 'socket.error'>: [Errno 10013] An attempt was made to access a socket in a way forbidden by its access permissions)
Note that [Errno 10013] is a Windows socket-permission error, so a local firewall or antivirus is likely blocking the connection before it ever reaches GitHub. Separately, the /user endpoint requires authentication: before requesting pages in Python, try typing them into a browser to see what is returned. When I did this, I got back some JSON stating
{
    "message": "Requires authentication",
    "documentation_url": "https://developer.github.com/v3"
}
If you want to test your code and inspect the headers of a webpage, try a publicly accessible webpage before delving into APIs.
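Once the connection issue is resolved, the content type from the title question can be read from r.headers, a case-insensitive mapping of the response headers. A sketch against a public page (any reachable URL works):

```python
import requests

r = requests.get('https://www.example.com')  # any publicly accessible URL
print(r.status_code)

# r.headers is case-insensitive, so either spelling works:
print(r.headers.get('Content-Type'))
print(r.headers.get('content-type'))
```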
