I'm attempting to write a program that grabs data from my password protected gradebook and analyzes it for me because my university's gradebook doesn't automatically calculate averages. I'm running into issues at the very beginning of my program and it's growing more and more frustrating. I'm running on Python 2.7.9.
This is the code:
import logging
import requests
import re
url = "https://learn.ou.edu/d2l/m/login"
s = requests.session()
r = s.get(url,verify = False)
This is the error that is occurring.
Traceback (most recent call last):
File "/Users/jackson/Desktop/untitled folder/Grade Calculator.py", line 7, in <module>
r = s.get(url,verify = False)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 473, in get
return self.request('GET', url, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 461, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 431, in send
raise SSLError(e, request=request)
SSLError: EOF occurred in violation of protocol (_ssl.c:581)
Even weirder, this only happens with the gradebook URL. When I use a different URL such as "http://login.live.com", I only get this warning:
Warning (from warnings module):
File "/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 734
InsecureRequestWarning)
InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html
Does anybody know what I could do to fix this issue? Thanks, Jackson.
Requests does not expose a way to force the SSL protocol version directly, so you need to subclass HTTPAdapter:
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager
import ssl
class MyAdapter(HTTPAdapter):
    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       ssl_version=ssl.PROTOCOL_TLSv1)
import logging
import requests
import re
url = "https://learn.ou.edu/d2l/m/login"
s = requests.session()
s.mount('https://', MyAdapter())
r = s.get(url,verify = False)
print r.status_code
Gives status code:
200
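Since verify=False is still being passed, urllib3 will keep emitting the InsecureRequestWarning shown in the question. If you want to silence it, a one-liner works (a sketch; this only hides the warning, it does not add verification):
import requests

# Suppress the InsecureRequestWarning raised for unverified HTTPS requests.
requests.packages.urllib3.disable_warnings()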
This is answered here
Basically I get this error with every single HTTPS proxy I've tried, on every website.
Code:
import requests
endpoint = 'https://ipinfo.io/json'
proxies = {'http':'http://45.167.95.184:8085','https':'https://45.167.95.184:8085'}
r = requests.get(endpoint,proxies=proxies,timeout=10)
Error:
Traceback (most recent call last):
File "<pyshell#5>", line 1, in <module>
r = requests.get(endpoint,proxies=proxies,timeout=10)
File "C:\Users\Utente\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "C:\Users\Utente\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Users\Utente\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\sessions.py", line 529, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\Utente\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\sessions.py", line 645, in send
r = adapter.send(request, **kwargs)
File "C:\Users\Utente\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\adapters.py", line 507, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: HTTPSConnectionPool(host='ipinfo.io', port=443): Max retries exceeded with url: /json (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x000001FDCFB9E6A0>, 'Connection to 45.167.95.184 timed out. (connect timeout=10)'))
And when I only use an http proxy entry:
import requests
endpoint = 'https://ipinfo.io/json'
proxies = {'http':'http://45.167.95.184:8085'}
r = requests.get(endpoint,proxies=proxies,timeout=10)
The request is sent, but websites that return public IPs show my real IP. Both requests (2.27.1) and urllib3 (1.26.8) are updated to their latest versions. What could the issue be?
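For reference, a sketch of how the proxies mapping is interpreted (the proxy address is the one from the question; whether that particular proxy actually works is a separate matter):
import requests

endpoint = 'https://ipinfo.io/json'

# The 'http'/'https' keys refer to the scheme of the *target* URL, not of the
# proxy. An https:// endpoint is only sent through the 'https' entry; with an
# 'http'-only mapping the request goes out directly, which is why the site
# still sees your real IP.
proxies = {
    'http': 'http://45.167.95.184:8085',
    # Most forwarding proxies are still addressed with an http:// proxy URL for
    # HTTPS targets (the proxy then tunnels via CONNECT); an https:// proxy URL
    # means "speak TLS to the proxy itself", which many proxies do not support.
    'https': 'http://45.167.95.184:8085',
}

r = requests.get(endpoint, proxies=proxies, timeout=10)
print(r.json())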
I have built a new OpenStack deployment based on the Queens release and am now trying to learn the Python SDK API with a script, but I'm stuck here and not sure what is wrong.
My script:
from keystoneclient.auth.identity import v3
from keystoneclient import session
from keystoneclient.v3 import client
auth_url = 'http://172.28.0.9:5000/v3'
username = 'dev'
user_domain_name = 'Default'
project_name = 'dev'
project_domain_name = 'Default'
password = 'Password123'
auth = v3.Password(auth_url=auth_url,
                   username=username,
                   password=password,
                   project_id='344506541fd94f068d25990af3eff4b8',
                   user_domain_name=user_domain_name)
sess = session.Session(auth=auth)
keystone = client.Client(session=sess)
#keystone.projects.list()
from novaclient import client
nova = client.Client(2, session=keystone.session)
nova.flavors.list()
I'm getting this error:
[dev@openstack ~]$ python /tmp/myscript.py
/usr/lib/python2.7/site-packages/keystoneauth1/adapter.py:200: UserWarning: Using keystoneclient sessions has been deprecated. Please update your software to use keystoneauth1.
warnings.warn('Using keystoneclient sessions has been deprecated. '
Traceback (most recent call last):
File "/tmp/o.py", line 22, in <module>
nova.flavors.list()
File "/usr/lib/python2.7/site-packages/novaclient/v2/flavors.py", line 145, in list
return self._list("/flavors%s" % detail, "flavors", filters=qparams)
File "/usr/lib/python2.7/site-packages/novaclient/base.py", line 257, in _list
resp, body = self.api.client.get(url)
File "/usr/lib/python2.7/site-packages/keystoneauth1/adapter.py", line 328, in get
return self.request(url, 'GET', **kwargs)
File "/usr/lib/python2.7/site-packages/novaclient/client.py", line 77, in request
**kwargs)
File "/usr/lib/python2.7/site-packages/keystoneauth1/adapter.py", line 487, in request
resp = super(LegacyJsonAdapter, self).request(*args, **kwargs)
File "/usr/lib/python2.7/site-packages/keystoneauth1/adapter.py", line 213, in request
return self.session.request(url, method, **kwargs)
File "/usr/lib/python2.7/site-packages/keystoneclient/session.py", line 428, in request
resp = send(**kwargs)
File "/usr/lib/python2.7/site-packages/keystoneclient/session.py", line 466, in _send_request
raise exceptions.SSLError(msg)
keystoneauth1.exceptions.connection.SSLError: SSL exception connecting to https://10.30.2.9:8774/v2.1/flavors/detail: HTTPSConnectionPool(host='10.30.2.9', port=8774): Max retries exceeded with url: /v2.1/flavors/detail (Caused by SSLError(SSLError("bad handshake: SysCallError(104, 'ECONNRESET')",),))
10.30.2.9 is my F5 load balancer, and I did configure port 8774 to route correctly to the nova API server.
My Horizon GUI works fine, and all OpenStack CLI commands also work without error.
verify=False helps:
keystone = client.Client(session=sess, verify=False)
You need to include the certificate for SSL.
sess = session.Session(auth=auth, verify=path_to_certificate)
keystone = keystoneclient.Client(session=sess)
Perhaps there is also an option to disable SSL verification.
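If you simply want verification turned off (for example, for a self-signed certificate on the F5), the verify flag can also be set on the keystoneclient Session itself; since nova is built from keystone.session in the question's script, it reuses the same session and therefore the same setting. A minimal sketch, assuming the rest of the script is unchanged:
from keystoneclient import session

# Disables certificate checking for every request made through this session,
# including the nova client created with session=keystone.session.
sess = session.Session(auth=auth, verify=False)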
I am using Python 2.7.3 on a shared FreeBSD host.
The installed version of python requests module is 2.11.1.
import json
import requests
from requests.auth import HTTPBasicAuth
requests.packages.urllib3.disable_warnings()
s = requests.Session()
s.server = "dns-api.company.net"
s.auth = HTTPBasicAuth('user', 'pass')
s.headers = {'User-Agent':'DNS-Client'}
s.verify = False
r = s.get('https://dns-api.company.net/query')  # not the actual URL
As you can see, I am setting verify to False. Yet I get the following SSL error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 477, in get
return self.request('GET', url, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 431, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: [Errno bad handshake] [('SSL routines', 'SSL23_GET_SERVER_HELLO', '')]
I have tried the following variations but to no avail as I end up with the same error.
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
r = requests.get("https://dns-api.company.net/query", verify=False, auth=('user','pass')) // Dummy URL
I don't care about SSL verification. What am I doing wrong here?
How do I escape the @ character in a proxy password so that Python can create the request correctly? I have tried \\ but I am still not able to hit the URL correctly.
proxy = {
    "http": "http://UserName:PassWord@X.X.X.X:Port_No"
}
Update:
I am using the Python requests module for the HTTP request. It splits the string (to get the host) at the first occurrence of @, whereas it was supposed to split at the second @.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 335, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 438, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 327, in send
raise ConnectionError(e)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='XXXXXXXX@X.X.X.X', port=XXXXX): Max retries exceeded with url: http:/URL (Caused by <class 'socket.gaierror'>: [Errno -2] Name or service not known)
You have to URL-encode it, as in this post:
Escaping username characters in basic auth URLs
This way the @ in the password becomes %40.
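A minimal sketch of that encoding step in Python 2 (matching the question's environment); the password value is a made-up placeholder containing an @, and the proxy address is the question's placeholder:
from urllib import quote  # on Python 3: from urllib.parse import quote

password = "Pass@Word"              # placeholder password containing '@'
encoded = quote(password, safe='')  # -> 'Pass%40Word'

proxy = {
    "http": "http://UserName:%s@X.X.X.X:Port_No" % encoded
}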
You don't mention which library you are using to perform your HTTP requests, so you should consider using requests, not only to solve this problem, but because it is a great library.
Here is how to use a proxy with basic authentication:
import requests
proxy = {'http': 'http://UserName:PassWord@X.X.X.X:Port_No'}
r = requests.get("http://whereever.com", proxies=proxy)
Update
Successfully tested with requests and proxy URLs:
http://UserName:PassWord@127.0.0.1:1234
http://UserName:PassWord@@127.0.0.1:1234
http://User@Name:PassWord@1234@127.0.0.1:1234
If, instead, you need to use Python's urllib2 library, you can do this:
import urllib2
handler = urllib2.ProxyHandler({'http': 'http://UserName:PassWord@X.X.X.X:Port_No'})
opener = urllib2.build_opener(handler)
r = opener.open('http://whereever.com')
Note that in neither case is it necessary to escape the @.
A third option is to set environment variables HTTP_PROXY and/or HTTPS_PROXY (in *nix).
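If you take the environment-variable route, the same user:password@host form is used. A short sketch (the address and credentials are the question's placeholders; requests only honours these variables because trust_env is on by default):
import os
import requests

# requests picks these up automatically when no explicit proxies= is passed.
os.environ['HTTP_PROXY'] = 'http://UserName:PassWord@X.X.X.X:Port_No'
os.environ['HTTPS_PROXY'] = 'http://UserName:PassWord@X.X.X.X:Port_No'

r = requests.get('http://whereever.com')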
I want to send some HTTP requests to Twitter in Python in order to implement sign-in with Twitter for my app's users. I am using urllib, and following this link: https://dev.twitter.com/web/sign-in/implementing.
But I am unable to do this. I guess I need to authenticate before requesting a token but I don't know how to do that.
Code:
import urllib.request

req = urllib.request.Request("https://api.twitter.com/oauth/authenticate",
                             headers={'User-Agent': 'Mozilla/5.0'})
html = urllib.request.urlopen(req).read()  # after this statement I'm getting the error
Error:
Traceback (most recent call last):
File "<pyshell#5>", line 1, in <module>
html = urllib.request.urlopen(req).read()
File "C:\Python34\lib\urllib\request.py", line 161, in urlopen
return opener.open(url, data, timeout)
File "C:\Python34\lib\urllib\request.py", line 469, in open
response = meth(req, response)
File "C:\Python34\lib\urllib\request.py", line 579, in http_response
'http', request, response, code, msg, hdrs)
File "C:\Python34\lib\urllib\request.py", line 507, in error
return self._call_chain(*args)
File "C:\Python34\lib\urllib\request.py", line 441, in _call_chain
result = func(*args)
File "C:\Python34\lib\urllib\request.py", line 587, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
If you go to the URL in a browser, it shows you that you need a key:
Whoa there!
There is no request token for this page. That's the special key we need from applications asking to use your Twitter account. Please go back to the site or application that sent you here and try again; it was probably just a mistake.
If you go to this link, it will let you choose one of your apps and bring you to a signature generator that shows you the request settings.
To get a request_token you can use requests_oauthlib:
import requests
from requests_oauthlib import OAuth1
REQUEST_TOKEN_URL = "https://api.twitter.com/oauth/request_token"
CONSUMER_KEY = "xxxxxxxx"
CONSUMER_SECRET = "xxxxxxxxxxxxxxxxx"
oauth = OAuth1(CONSUMER_KEY, client_secret=CONSUMER_SECRET)
r = requests.post(url=REQUEST_TOKEN_URL, auth=oauth)
print(r.content)
oauth_token=xxxxxxxxxxxxxx&oauth_token_secret=xxxxxxxxxxx&oauth_callback_confirmed=true
You then need to extract the oauth_token and oauth_token_secret:
from urlparse import parse_qs
import webbrowser
data = parse_qs(r.content)
oauth_token = data['oauth_token'][0]
oauth_token_secret = data['oauth_token_secret'][0]
AUTH = "https://api.twitter.com/oauth/authorize?oauth_token={}"
auth = AUTH.format(oauth_token)
webbrowser.open(auth)
A webpage will open asking "Authorize your_app to use your account?"
For Python 3, use:
from urllib.parse import parse_qs
data = parse_qs(r.text)
oauth_token = data['oauth_token'][0]
oauth_token_secret = data['oauth_token_secret'][0]
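Not part of the original answer, but for completeness: once the user authorizes the app, Twitter returns an oauth_verifier (through your callback URL, or as a PIN for out-of-band apps), which you exchange for the final access token. A sketch using the same requests_oauthlib objects and variable names as above; reading the verifier from stdin is just a stand-in for however your app actually receives it:
ACCESS_TOKEN_URL = "https://api.twitter.com/oauth/access_token"

verifier = raw_input("oauth_verifier from the callback or PIN: ")  # input() on Python 3

oauth = OAuth1(CONSUMER_KEY,
               client_secret=CONSUMER_SECRET,
               resource_owner_key=oauth_token,
               resource_owner_secret=oauth_token_secret,
               verifier=verifier)
r = requests.post(url=ACCESS_TOKEN_URL, auth=oauth)
print(r.content)  # oauth_token and oauth_token_secret for the signed-in user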