Using the Apache Libcloud docs and valid credentials, I get the below error trying to list domains on GoDaddy. Does libcloud no longer support GoDaddy?
>>> from libcloud.dns.types import Provider
>>> from libcloud.dns.providers import get_driver
>>> cls = get_driver(Provider.GODADDY)
>>> driver = cls('twst', 'adfadf', 'dsdfsdf')
>>> zones = driver.list_zones()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/libcloud/dns/drivers/godaddy.py", line 146, in list_zones
'/v1/domains/').object
File "/usr/local/lib/python2.7/dist-packages/libcloud/common/base.py", line 782, in request
headers=headers)
File "/usr/lib/python2.7/httplib.py", line 979, in request
self._send_request(method, url, body, headers)
File "/usr/lib/python2.7/httplib.py", line 1013, in _send_request
self.endheaders(body)
File "/usr/lib/python2.7/httplib.py", line 975, in endheaders
self._send_output(message_body)
File "/usr/lib/python2.7/httplib.py", line 835, in _send_output
self.send(msg)
File "/usr/lib/python2.7/httplib.py", line 797, in send
self.connect()
File "/usr/local/lib/python2.7/dist-packages/libcloud/httplib_ssl.py", line 266, in connect
self.timeout)
File "/usr/lib/python2.7/socket.py", line 553, in create_connection
for res in getaddrinfo(host, port, 0, SOCK_STREAM):
socket.gaierror: [Errno -2] Name or service not known
>>>
It looks like there was a bug in the driver. The "host" attribute on the connection class was incorrectly set to a URL instead of a hostname.
I pushed a fix for that - https://github.com/apache/libcloud/commit/a3ba6a4751623224f16175df9175ec06b29cdc1a
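For context, the failure mode is visible at the bottom of the traceback: getaddrinfo() gets handed the full URL as if it were a hostname, which can never resolve. A minimal illustration (the hostname here is my assumption; the actual values are in the commit above):

import socket

# A URL is not a hostname, so DNS resolution fails:
socket.getaddrinfo('https://api.godaddy.com', 443)  # socket.gaierror: Name or service not known

# A bare hostname resolves fine:
socket.getaddrinfo('api.godaddy.com', 443)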
You can test this change by installing the latest development version from git using pip - pip install git+https://git-wip-us.apache.org/repos/asf/libcloud.git@trunk#egg=apache-libcloud
I have confirmed the change is working locally, but if you encounter any more issues, please let us know.
from libcloud.dns.types import Provider
from libcloud.dns.providers import get_driver
cls = get_driver(Provider.GODADDY)
driver = cls('twst', 'adfadf', 'dsdfsdf')
print driver.list_zones()
...
libcloud.dns.drivers.godaddy.GoDaddyDNSException: <GoDaddyDNSException in MALFORMED_API_KEY: Malformed API key>
In addition to that, I will also push a change so that a friendlier exception is thrown in case the "host" attribute is set to a value which is not a hostname.
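A sketch of the kind of guard described above (hypothetical; not the actual patch that landed):

def validate_host(host):
    # Reject values that look like URLs rather than bare hostnames,
    # so the mistake surfaces here instead of as an opaque gaierror.
    if '://' in host or '/' in host:
        raise ValueError('"host" must be a bare hostname such as '
                         '"api.godaddy.com", not a URL: %r' % (host,))
    return host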
Related
UPDATE: I managed to make a request with urllib2, but I'm still wondering what is happening here.
I would like to make an HTTPS request with Python.
This works fine with the requests module, but I don't want to use external dependencies, so I'd like to use the standard library.
httplib
When I follow this example I don't get a response. I get a timeout instead. I'm out of ideas as to what would cause this.
Code:
import requests
print requests.get('https://python.org')
from httplib import HTTPSConnection
conn = HTTPSConnection('www.python.org')
conn.request('GET', '/index.html')
print conn.getresponse()
Output:
<Response [200]>
Traceback (most recent call last):
File "test.py", line 6, in <module>
conn.request('GET', '/index.html')
File "C:\Python27\lib\httplib.py", line 1069, in request
self._send_request(method, url, body, headers)
File "C:\Python27\lib\httplib.py", line 1109, in _send_request
self.endheaders(body)
File "C:\Python27\lib\httplib.py", line 1065, in endheaders
self._send_output(message_body)
File "C:\Python27\lib\httplib.py", line 892, in _send_output
self.send(msg)
File "C:\Python27\lib\httplib.py", line 854, in send
self.connect()
File "C:\Python27\lib\httplib.py", line 1282, in connect
HTTPConnection.connect(self)
File "C:\Python27\lib\httplib.py", line 831, in connect
self.timeout, self.source_address)
File "C:\Python27\lib\socket.py", line 575, in create_connection
raise err
socket.error: [Errno 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
urllib
This fails for a different (but possibly related) reason. Code:
import urllib
print urllib.urlopen("https://python.org")
Output:
Traceback (most recent call last):
File "test.py", line 10, in <module>
print urllib.urlopen("https://python.org")
File "C:\Python27\lib\urllib.py", line 87, in urlopen
return opener.open(url)
File "C:\Python27\lib\urllib.py", line 215, in open
return getattr(self, name)(url)
File "C:\Python27\lib\urllib.py", line 445, in open_https
h.endheaders(data)
File "C:\Python27\lib\httplib.py", line 1065, in endheaders
self._send_output(message_body)
File "C:\Python27\lib\httplib.py", line 892, in _send_output
self.send(msg)
File "C:\Python27\lib\httplib.py", line 854, in send
self.connect()
File "C:\Python27\lib\httplib.py", line 1290, in connect
server_hostname=server_hostname)
File "C:\Python27\lib\ssl.py", line 369, in wrap_socket
_context=self)
File "C:\Python27\lib\ssl.py", line 599, in __init__
self.do_handshake()
File "C:\Python27\lib\ssl.py", line 828, in do_handshake
self._sslobj.do_handshake()
IOError: [Errno socket error] [SSL: UNKNOWN_PROTOCOL] unknown protocol (_ssl.c:727)
What is requests doing that makes it succeed where both of these libraries fail?
requests.get without a timeout parameter means no timeout at all.
httplib.HTTPSConnection accepts a timeout parameter in Python 2.6 and newer, according to the httplib docs. If your problem was caused by a timeout, setting a high enough timeout should help. Try replacing:
conn = HTTPSConnection('www.python.org')
with:
conn = HTTPSConnection('www.python.org', timeout=300)
which will give 300 seconds (5 minutes) for processing.
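Putting it together, a minimal sketch with the longer timeout and an explicit catch (assuming the timeout really is the culprit):

import socket
from httplib import HTTPSConnection

conn = HTTPSConnection('www.python.org', timeout=300)
try:
    conn.request('GET', '/index.html')
    resp = conn.getresponse()
    print resp.status, resp.reason
except socket.timeout:
    print 'request timed out after 300 seconds'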
I am aware there is a patch for using the REST API with a proxy, and it works for me. But the Streaming API uses an HTTPConnection which cannot be emulated by urllib and urllib2 (as far as I know). Is there any fix for this?
I tried using a proxy with a port, but it did not work.
In the streaming.py file, at line 153:
if self.scheme == "http":
conn = httplib.HTTPConnection(self.api.proxy_host,self.api.proxy_port, timeout=self.timeout)
else:
conn = httplib.HTTPSConnection(self.api.proxy_host,self.api.proxy_port,timeout=self.timeout)
self.auth.apply_auth(url, 'POST', self.headers, self.parameters)
print conn.host
conn.connect()
conn.request('POST', self.scheme+'://'+self.host+self.url, self.body, headers=self.headers)
resp = conn.getresponse()
And "self.scheme + '://' + self.host + self.url" corresponds to https://stream.twitter.com/1.1/statuses/filter.json?delimited=length
I get this error in return -
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/IPython/core/interactiveshell.py", line 2538, in run_code
exec code_obj in self.user_global_ns, self.user_ns
File "<ipython-input-3-4798d330f7cd>", line 1, in <module>
execfile('main.py')
File "main.py", line 130, in <module>
streamer.filter(track = ['AAP'])
File "/usr/local/lib/python2.7/dist-packages/tweepy/streaming.py", line 305, in filter
self._start(async)
File "/usr/local/lib/python2.7/dist-packages/tweepy/streaming.py", line 242, in _start
self._run()
File "/usr/local/lib/python2.7/dist-packages/tweepy/streaming.py", line 159, in _run
conn.connect()
File "/usr/lib/python2.7/httplib.py", line 1161, in connect
self.sock = ssl.wrap_socket(sock, self.key_file, self.cert_file)
File "/usr/lib/python2.7/ssl.py", line 381, in wrap_socket
ciphers=ciphers)
File "/usr/lib/python2.7/ssl.py", line 143, in __init__
self.do_handshake()
File "/usr/lib/python2.7/ssl.py", line 305, in do_handshake
self._sslobj.do_handshake()
SSLError: [Errno 1] _ssl.c:504: error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol
Tweepy currently does not support using a proxy. However, support for proxies should be coming in the next few days. I'll update this answer with more information.
Finally found a GitHub patch that enables the streaming API via a proxy. It works just fine!
https://github.com/shogo82148/tweepy/blob/64c6266018920e0e36c6d8d1600adb6caa0840de/tweepy/streaming.py
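For what it's worth, one plausible reason the snippet above fails with "SSL23_GET_SERVER_HELLO:unknown protocol" is that HTTPSConnection, pointed directly at the proxy, attempts a TLS handshake with the proxy itself, which answers in plain HTTP. Sending HTTPS through an HTTP proxy needs a CONNECT tunnel, which Python 2.7's httplib exposes as set_tunnel. A minimal sketch (the proxy host and port are placeholders, and real Streaming API calls also need OAuth headers):

from httplib import HTTPSConnection

# Connect to the proxy, then tunnel TLS through it to the real host.
conn = HTTPSConnection('proxy.example.com', 8080, timeout=300)
conn.set_tunnel('stream.twitter.com', 443)
conn.request('GET', '/1.1/statuses/sample.json')
print conn.getresponse().status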
I have a script which gets the HTTP headers of a lot of pages on the Internet with httplib in Python.
My problem is that on a specific domain (and probably others), httplib raises an exception, and I don't understand why.
>>> import httplib
>>> http = httplib.HTTPConnection('iswtc.la')
>>> http.request('GET', '/a')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib64/python2.6/httplib.py", line 914, in request
self._send_request(method, url, body, headers)
File "/usr/lib64/python2.6/httplib.py", line 951, in _send_request
self.endheaders()
File "/usr/lib64/python2.6/httplib.py", line 908, in endheaders
self._send_output()
File "/usr/lib64/python2.6/httplib.py", line 780, in _send_output
self.send(msg)
File "/usr/lib64/python2.6/httplib.py", line 739, in send
self.connect()
File "/usr/lib64/python2.6/httplib.py", line 720, in connect
self.timeout)
File "/usr/lib64/python2.6/socket.py", line 553, in create_connection
for res in getaddrinfo(host, port, 0, SOCK_STREAM):
socket.gaierror: [Errno -2] Name or service not known
What is different about this specific domain, and how can I handle this?
PS: It's not really my code that's at fault, because this works fine:
>>> http = httplib.HTTPConnection('bit.ly')
>>> http.request('GET', '/a')
bit.ly exists, whereas iswtc.la doesn't:
$ nslookup bit.ly
Non-authoritative answer:
Name: bit.ly
Address: 69.58.188.39
Name: bit.ly
Address: 69.58.188.40
$ nslookup iswtc.la
** server can't find iswtc.la: NXDOMAIN
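If the script needs to keep going when a domain no longer resolves, catching socket.gaierror is one way to handle it; a minimal sketch:

import httplib
import socket

try:
    http = httplib.HTTPConnection('iswtc.la', timeout=10)
    http.request('GET', '/a')
    print http.getresponse().status
except socket.gaierror:
    # DNS lookup failed (expired domain, typo, NXDOMAIN, ...)
    print 'domain does not resolve, skipping'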
I'm using gdata to map YouTube URLs to video titles, using the following code:
import gdata.youtube.service as youtube
import re
import queue
import urlparse
ytservice = youtube.YouTubeService()
ytservice.ssl = True
ytservice.developer_key = '' # snip
class youtube(mediaplugin):
def __init__(self, parsed_url):
self.url = parsed_url
self.video_id = urlparse.parse_qs(parsed_url.query)['v'][0]
self.ytdata = ytservice.GetYouTubeVideoEntry(self.video_id)
print self.ytdata
I get the following socket exception when calling service.GetYouTubeVideoEntry():
File "/Users/haldean/Documents/qpi/qpi/media.py", line 21, in __init__
self.ytdata = ytservice.GetYouTubeVideoEntry(self.video_id)
File "/Users/haldean/Documents/qpi/lib/python2.7/site-packages/gdata/youtube/service.py", line 210, in GetYouTubeVideoEntry
return self.Get(uri, converter=gdata.youtube.YouTubeVideoEntryFromString)
File "/Users/haldean/Documents/qpi/lib/python2.7/site-packages/gdata/service.py", line 1069, in Get
headers=extra_headers)
File "/Users/haldean/Documents/qpi/lib/python2.7/site-packages/atom/__init__.py", line 93, in optional_warn_function
return f(*args, **kwargs)
File "/Users/haldean/Documents/qpi/lib/python2.7/site-packages/atom/service.py", line 186, in request
data=data, headers=all_headers)
File "/Users/haldean/Documents/qpi/lib/python2.7/site-packages/atom/http_interface.py", line 148, in perform_request
return http_client.request(operation, url, data=data, headers=headers)
File "/Users/haldean/Documents/qpi/lib/python2.7/site-packages/atom/http.py", line 163, in request
connection.endheaders()
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py", line 937, in endheaders
self._send_output(message_body)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py", line 797, in _send_output
self.send(msg)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py", line 759, in send
self.connect()
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py", line 1140, in connect
self.timeout, self.source_address)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/socket.py", line 553, in create_connection
for res in getaddrinfo(host, port, 0, SOCK_STREAM):
gaierror: [Errno 8] nodename nor servname provided, or not known
I'm at a loss as to how to even begin debugging this. Any ideas appreciated. Thanks!
Edit:
In response to a question asked in comments, video_id is qh-mwjF-OMo and parsed_url is:
ParseResult(scheme=u'http', netloc=u'www.youtube.com', path=u'/watch', params='', query=u'v=qh-mwjF-OMo&feature=g-user-u', fragment='')
My mistake was that the video_id should be passed as a keyword parameter, like so:
self.ytdata = ytservice.GetYouTubeVideoEntry(video_id=self.video_id)
It seems the socket error is the only exception gdata will throw: it blindly builds a request URL from its arguments, and nothing fails until the URL fetch itself does.
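Given that, a defensive sketch for surfacing this kind of misuse more clearly (names follow the snippet above; the wrapper is my own, not part of gdata):

import socket

try:
    ytdata = ytservice.GetYouTubeVideoEntry(video_id=video_id)
except socket.gaierror as e:
    # gdata built a bogus request URL (e.g. because the id was passed
    # positionally as a URI), and it only surfaces here at fetch time.
    raise ValueError('bad video id or malformed request URL: %s' % e)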
I'm getting an error whenever I try to pull down a web page with urllib.urlopen. I've disabled Windows Firewall and my antivirus, so it's not that. I can access the pages in my browser. I even reinstalled Python to rule out a broken urllib. Any help would be greatly appreciated.
>>> import urllib
>>> h = urllib.urlopen("http://www.google.com").read()
Traceback (most recent call last):
File "<pyshell#1>", line 1, in <module>
h = urllib.urlopen("http://www.google.com").read()
File "C:\Python26\lib\urllib.py", line 86, in urlopen
return opener.open(url)
File "C:\Python26\lib\urllib.py", line 205, in open
return getattr(self, name)(url)
File "C:\Python26\lib\urllib.py", line 344, in open_http
h.endheaders()
File "C:\Python26\lib\httplib.py", line 904, in endheaders
self._send_output()
File "C:\Python26\lib\httplib.py", line 776, in _send_output
self.send(msg)
File "C:\Python26\lib\httplib.py", line 735, in send
self.connect()
File "C:\Python26\lib\httplib.py", line 716, in connect
self.timeout)
File "C:\Python26\lib\socket.py", line 514, in create_connection
raise error, msg
IOError: [Errno socket error] [Errno 10061] No connection could be made because the target machine actively refused it
>>>
This could be the case:
Just found the problem. I had set a proxy through Internet Options; that proxy went offline, and so did my Python shell. urllib is working just fine.
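If that is what happened here, you can check what proxies urllib picked up from Internet Options, or bypass them entirely; a small sketch:

import urllib

# Shows the proxy settings urllib inherited (on Windows, from Internet Options).
print urllib.getproxies()

# An empty proxies dict makes urlopen go direct, ignoring any configured proxy.
h = urllib.urlopen("http://www.google.com", proxies={}).read()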
Try using Ethereal (or some similar network sniffer) on your box to determine whether the denial is coming from your machine or a machine beyond.