I am trying to use Python to get a JSON file from the Web. If I open the URL in my browser (Mozilla or Chromium) I do see the JSON. But when I do the following with Python:
response = urllib2.urlopen(url)
data = json.loads(response.read())
I get an error message that (translated into English) says the following: Errno 10060: the connection attempt failed because the server did not respond after a certain period of time, or the connection was faulty, or the host did not respond.
ADDED
It looks like many people have faced the described problem, and there are some answers to similar (or the same) questions. For example, here we can see the following solution:
import requests
r = requests.get("http://www.google.com", proxies={"http": "http://61.233.25.166:80"})
print(r.text)
It is already a step forward for me (I think it is very likely that the proxy is the reason for the problem). However, I still have not got it working, since I do not know the URL of my proxy and I will probably need a user name and password. How can I find them? And how is it that my browsers have them but I do not?
ADDED 2
I think I am now one step further. I used this site to find out what my proxy is: http://www.whatismyproxy.com/
Then I used the following code:
proxies = {'http':'my_proxy.blabla.com/'}
r = requests.get(url, proxies = proxies)
print r
As a result I get:
<Response [404]>
That does not look so good, but at least I think my proxy address is correct, because when I randomly change the address of the proxy I get a different error:
Cannot connect to proxy
So, I can connect to the proxy, but something is not found.
I think there might be something wrong with the way you're trying to get the JSON from the online source (URL). Just to make things clear, here is a small code snippet:
#!/usr/bin/env python
try:
    # For Python 3+
    from urllib.request import urlopen
except ImportError:
    # For Python 2
    from urllib2 import urlopen

import json

def get_jsonparsed_data(url):
    # Decode the raw bytes before parsing; wrapping them in str() would
    # produce "b'...'" on Python 3 and break json.loads.
    response = urlopen(url)
    data = response.read().decode('utf-8')
    return json.loads(data)
If you still get a connection error, you can try a couple of steps:
Try to urlopen() a random site from the interpreter (interactive mode), as shown in the snippet after this list. If you are able to grab the source code, you're good. If not, check your internet connection or try the requests module. Check here.
Check whether the JSON at the URL is syntactically correct. For sample JSON syntax, check here.
Try the simplejson module.
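For the first step, a quick connectivity check from the interactive interpreter might look like this (google.com is just an arbitrary example site):
try:
    from urllib.request import urlopen  # Python 3
except ImportError:
    from urllib2 import urlopen  # Python 2

# If this prints some HTML, basic connectivity works and the problem
# is more likely with the proxy or the target URL.
print(urlopen('http://www.google.com').read()[:100])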
Edit 1:
If you want to access websites through a system-wide proxy, you will have to use a proxy handler that connects to that proxy via loopback (localhost). Sample code is shown below.
proxy = urllib2.ProxyHandler({
    'http': '127.0.0.1',
    'https': '127.0.0.1'
})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)

# this way you can send both http and https requests using proxies
urllib2.urlopen('http://www.google.com')
urllib2.urlopen('https://www.google.com')
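If the proxy also needs credentials (as the question above suspects), urllib2 accepts them embedded in the proxy URL. A minimal sketch, where every value is a placeholder:
import urllib2

# All values below are placeholders -- substitute your real proxy
# host, port, user name and password.
proxy = urllib2.ProxyHandler({
    'http': 'http://user:password@proxy.example.com:8080',
    'https': 'http://user:password@proxy.example.com:8080',
})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)
print(urllib2.urlopen('http://www.google.com').read()[:100])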
I have not worked a lot with ProxyHandler; I just know the theory and the code. I am sure there are better ways to access websites through proxies, ones that do not involve installing the opener every time you run the program, but hopefully this will point you in the right direction.
I'm trying to connect to TestLink via the xmlrpc API. I've set the following in TestLink's config.inc.php:
$tlCfg->api->enabled = TRUE;
$tlCfg->exec_cfg->enabled_test_automation = ENABLED;
and restarted the Apache server. I then tried to connect to the TestLink server via the Python package TestLink-API-Python-client (https://github.com/orenault/TestLink-API-Python-client):
from testlink import TestlinkAPIClient, TestLinkHelper
import sys
URL = 'http://MYSERVER/testlink/lib/api/xmlrpc.php'
DevKey = 'MYKEY'
tl_helper = TestLinkHelper()
myTestLink = tl_helper.connect(TestlinkAPIClient)
myTestLink.__init__(URL, DevKey)
myTestLink.checkDevKey()
And then I receive a TLConnectionError stating my URL and "404 Not Found"...
Does anyone have any idea?
Thanks.
I didn't solve it.
I reverted to working on the TestLink DB directly. I'm sure it's more fragile than using the API, but it works...
If you are still looking for help, this code worked for me:
set TESTLINK_API_PYTHON_SERVER_URL=http://[YOURSERVER]/testlink/lib/api/xmlrpc/v1/xmlrpc.php
set TESTLINK_API_PYTHON_DEVKEY=[Users devKey generated by TestLink]
python
import testlink
tls = testlink.TestLinkHelper().connect(testlink.TestlinkAPIClient)
tls.countProjects()
Check out the TestLink API documentation to learn more.
At a glance, your XML-RPC URL seems wrong. It should be:
http://YOURSERVER/testlink/lib/api/xmlrpc/v1/xmlrpc.php
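With the corrected URL, a connection sketch could look like the following (assuming I recall the TestLinkHelper constructor correctly; the server name and dev key are placeholders):
from testlink import TestlinkAPIClient, TestLinkHelper

URL = 'http://YOURSERVER/testlink/lib/api/xmlrpc/v1/xmlrpc.php'  # placeholder server
DEVKEY = 'MYKEY'  # placeholder dev key

tl_helper = TestLinkHelper(server_url=URL, devkey=DEVKEY)
myTestLink = tl_helper.connect(TestlinkAPIClient)
print(myTestLink.checkDevKey())  # should return True for a valid key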
I'm trying to get rid of an exception, HTTPException('ApplicationError: 5',), that I get when using httplib in a python27 API running on Google App Engine; it is further detailed in this post: ApplicationError2 and ApplicationError5 when communicating with external api from AppEngine. I thought that I could perhaps instead try using httplib2. The only part of the API which makes a call to httplib that I can see is:
def _get_conn(self):
    return httplib.HTTPConnection(str(self.host), str(self.port), timeout=120)
Is there a direct equivalent to httplib.HTTPConnection() in httplib2? I've had a search but cannot find anything.
It seems there is; see AppEngineHttpConnection in the httplib2 source code.
However, AFAIK those are not part of the official httplib2 API as shown in their documentation, you'd rather do something like:
import httplib2
h = httplib2.Http()
resp, content = h.request("http://bitworking.org/")
assert resp.status == 200
assert resp['content-type'] == 'text/html'
Have you considered using the Requests library? It has been getting a lot of good press recently.
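If you go that route, a rough Requests equivalent of the httplib2 snippet above (with a timeout similar to the original HTTPConnection call) would be:
import requests

resp = requests.get("http://bitworking.org/", timeout=120)
assert resp.status_code == 200
assert resp.headers['content-type'].startswith('text/html')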
Looking for a better way to get a machine's current external IP... The snippet below works, but I would rather not rely on an outside site to gather the information. I am restricted to using the standard Python 2.5.1 libraries bundled with Mac OS X 10.5.x.
import os
import urllib2
def check_in():
    fqn = os.uname()[1]
    ext_ip = urllib2.urlopen('http://whatismyip.org').read()
    print("Asset: %s " % fqn, "Checking in from IP#: %s " % ext_ip)
I liked http://ipify.org. They even provide Python code for using their API:
# This example requires the requests library be installed. You can learn more
# about the Requests library here: http://docs.python-requests.org/en/latest/
from requests import get
ip = get('https://api.ipify.org').content.decode('utf8')
print('My public IP address is: {}'.format(ip))
Python 3, using nothing but the standard library
As mentioned before, one can use an external service like ident.me to discover the external IP address of your router.
Here is how it is done with Python 3, using nothing but the standard library:
import urllib.request
external_ip = urllib.request.urlopen('https://ident.me').read().decode('utf8')
print(external_ip)
Both IPv4 and IPv6 addresses can be returned, based on availability and client preference; use https://v4.ident.me/ for IPv4 only, or https://v6.ident.me/ for IPv6 only.
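For example, a small sketch that queries both protocol-specific endpoints and tolerates one of them being unavailable:
import urllib.request

# Either lookup may fail if that protocol is not available on your connection.
for label, endpoint in [('IPv4', 'https://v4.ident.me/'), ('IPv6', 'https://v6.ident.me/')]:
    try:
        addr = urllib.request.urlopen(endpoint, timeout=10).read().decode('utf8')
        print(label, addr)
    except OSError as exc:
        print(label, 'lookup failed:', exc)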
If you are behind a router which obtains the external IP, I'm afraid you have no option but to use an external service, as you do. If the router itself has some query interface, you can use it, but the solution will be very environment-specific and unreliable.
You should use the UPnP protocol to query your router for this information. Most importantly, this does not rely on an external service, which all the other answers to this question seem to suggest.
There's a Python library called miniupnpc which can do this; see e.g. miniupnpc/testupnpigd.py.
pip install miniupnpc
Based on their example you should be able to do something like this:
import miniupnpc
u = miniupnpc.UPnP()
u.discoverdelay = 200
u.discover()
u.selectigd()
print('external ip address: {}'.format(u.externalipaddress()))
I prefer this Amazon AWS endpoint:
import requests
ip = requests.get('https://checkip.amazonaws.com').text.strip()
In my opinion, the simplest solution is:
import requests
f = requests.request('GET', 'http://myip.dnsomatic.com')
ip = f.text
That's all.
Use the requests module:
import requests
myip = requests.get('https://www.wikipedia.org').headers['X-Client-IP']
print("\n[+] Public IP: "+myip)
As simple as running this in Python 3:
import os
externalIP = os.popen('curl -s ifconfig.me').readline()
print(externalIP)
Try:
import requests
ip = requests.get('http://ipinfo.io/json').json()['ip']
Hope this is helpful.
If you don't want to use external services (IP websites, etc.), you can use the UPnP protocol.
To do that, we use a simple UPnP client library (https://github.com/flyte/upnpclient).
Install:
pip install upnpclient
Simple Code:
import upnpclient

devices = upnpclient.discover()
if len(devices) > 0:
    # GetExternalIPAddress() returns a dict (see the full example below)
    externalIP = devices[0].WANIPConn1.GetExternalIPAddress()['NewExternalIPAddress']
    print(externalIP)
else:
    print('No connected network interface detected')
Full code (to get more information, as mentioned in the GitHub readme):
In [1]: import upnpclient
In [2]: devices = upnpclient.discover()
In [3]: devices
Out[3]:
[<Device 'OpenWRT router'>,
<Device 'Harmony Hub'>,
<Device 'walternate: root'>]
In [4]: d = devices[0]
In [5]: d.WANIPConn1.GetStatusInfo()
Out[5]:
{'NewConnectionStatus': 'Connected',
'NewLastConnectionError': 'ERROR_NONE',
'NewUptime': 14851479}
In [6]: d.WANIPConn1.GetNATRSIPStatus()
Out[6]: {'NewNATEnabled': True, 'NewRSIPAvailable': False}
In [7]: d.WANIPConn1.GetExternalIPAddress()
Out[7]: {'NewExternalIPAddress': '123.123.123.123'}
If you think an external source is too unreliable, you could pool a few different services. Most IP lookup pages require you to scrape HTML, but a few of them have created lean pages for scripts like yours, partly so they can reduce the hits on their sites:
automation.whatismyip.com/n09230945.asp (Update: whatismyip has taken this service down)
whatismyip.org
I use IPGrab because it's easy to remember:
# This example requires the requests library be installed. You can learn more
# about the Requests library here: http://docs.python-requests.org/en/latest/
from requests import get
ip = get('http://ipgrab.io').text
print('My public IP address is: {}'.format(ip))
There are a few other ways that do not rely on Python itself checking an external web site; however, the OS can do it. Your primary issue here is that, even if you were not using Python, there are no built-in shell commands that can simply tell you the external (WAN) IP. Commands such as "ip addr show" and "ifconfig -a" show you the server's IP addresses within the network; only the router actually holds the external IP. However, there are ways to find the external IP address (WAN IP) from the command line.
These examples are:
http://ipecho.net/plain ; echo
curl ipinfo.io/ip
dig +short myip.opendns.com @resolver1.opendns.com
dig TXT +short o-o.myaddr.l.google.com @ns1.google.com
Therefore, the Python code would be:
import os
ip = os.popen('wget -qO- http://ipecho.net/plain ; echo').readlines(-1)[0].strip()
print ip
OR
import os
iN, out, err = os.popen3('curl ipinfo.io/ip')
iN.close() ; err.close()
ip = out.read().strip()
print ip
OR
import os
ip = os.popen('dig +short myip.opendns.com @resolver1.opendns.com').readlines(-1)[0].strip()
print ip
Or plug any of the other examples above into a call like os.popen, os.popen2, or os.popen3 (note that os.system only returns the exit status, so it cannot capture the output).
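A subprocess-based variant of the same OpenDNS lookup (subprocess being the more modern replacement for the os.popen family, available since Python 2.7) might look like:
import subprocess

ip = subprocess.check_output(
    ['dig', '+short', 'myip.opendns.com', '@resolver1.opendns.com']).strip()
print ip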
P.S. You can run "pip3 install pytis" and use, or take a look at, the "getip" program, written in Python.
You can also find its code here: https://github.com/PyTis/PyTis/blob/development/src/pytis/getip.py
I tried most of the other answers to this question and found that most of the services used were defunct, except one.
Here is a script that should do the trick and download only a minimal amount of information:
#!/usr/bin/env python
import urllib
import re

def get_external_ip():
    site = urllib.urlopen("http://checkip.dyndns.org/").read()
    grab = re.findall('([0-9]+\.[0-9]+\.[0-9]+\.[0-9]+)', site)
    address = grab[0]
    return address

if __name__ == '__main__':
    print(get_external_ip())
import requests
import re

def getMyExtIp():
    try:
        res = requests.get("http://whatismyip.org")
        myIp = re.compile('(\d{1,3}\.){3}\d{1,3}').search(res.text).group()
        if myIp != "":
            return myIp
    except:
        pass
    return "n/a"
Unfortunately, there is no way to get your external IP address without consulting a computer on the internet. At best, you can get the local network IP address of your network card (which is likely a 192.168.x.x address).
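For completeness, a common sketch for obtaining that local address is the UDP socket trick: connecting a datagram socket selects the outbound interface without sending any packets.
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(('8.8.8.8', 80))  # no traffic is actually sent for a UDP connect
print(s.getsockname()[0])   # the LAN address, e.g. 192.168.1.42
s.close()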
You can use the whatismyip module to get the external IP address. It has no dependencies outside the Python 3 standard library. It connects to public STUN servers and what-is-my-ip websites to find the IPv4 or IPv6 address. Run pip install whatismyip.
Example:
>>> import whatismyip
>>> whatismyip.amionline()
True
>>> whatismyip.whatismyip() # Prefers IPv4 addresses, but can return either IPv4 or IPv6.
'69.89.31.226'
>>> whatismyip.whatismyipv4()
'69.89.31.226'
>>> whatismyip.whatismyipv6()
'2345:0425:2CA1:0000:0000:0567:5673:23b5'
If the machine is behind a firewall, then your solution is a very sensible one: the alternative, querying the firewall itself, ends up being very dependent on the type of firewall (if it is possible at all).
The simplest (non-Python) working solution I can think of is:
wget -q -O- icanhazip.com
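If you want to call that same one-liner from Python anyway, a minimal subprocess wrapper (assuming wget is installed) could be:
import subprocess

ip = subprocess.check_output(['wget', '-q', '-O-', 'icanhazip.com']).decode().strip()
print(ip)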
I'd like to add a very short Python 3 solution which makes use of the JSON API of http://hostip.info.
from urllib.request import urlopen
import json
url = 'http://api.hostip.info/get_json.php'
info = json.loads(urlopen(url).read().decode('utf-8'))
print(info['ip'])
You can of course add some error checking, a timeout condition and some convenience:
#!/usr/bin/env python3
from urllib.request import urlopen
from urllib.error import URLError
import json

try:
    url = 'http://api.hostip.info/get_json.php'
    info = json.loads(urlopen(url, timeout=15).read().decode('utf-8'))
    print(info['ip'])
except URLError as e:
    print(e.reason, end=' ')  # e.g. 'timed out'
    print('(are you connected to the internet?)')
except KeyboardInterrupt:
    pass
In [1]: import stun

In [2]: stun.get_ip_info()
Out[2]: ('Restric NAT', 'xx.xx.xx.xx', 55320)
Working with Python 2.7.6 and 2.7.13:
import urllib2
req = urllib2.Request('http://icanhazip.com', data=None)
response = urllib2.urlopen(req, timeout=5)
print(response.read())
Linux-only solution.
On Linux systems, you can use Python to execute a command in the shell. I think it might help someone.
Something like this (assuming 'dig'/'drill' works on the OS):
import os

# os.system() would only return the exit status; os.popen() captures the output
command = "dig TXT +short o-o.myaddr.l.google.com @ns1.google.com | awk -F'\"' '{print $2}'"
ip = os.popen(command).read().strip()
For Arch users, please replace 'dig' with 'drill'.
I liked Sergiy Ostrovsky's answer, but I think there is an even tidier way to do this now.
Install the ipify library:
pip install ipify
Import and use the library in your Python program.
import ipify
ip = ipify.get_ip()
import urllib.request

ipWebCode = urllib.request.urlopen("http://ip.nefsc.noaa.gov").read().decode("utf8")
ipWebCode = ipWebCode.split("color=red> ")
ipWebCode = ipWebCode[1]
ipWebCode = ipWebCode.split("</font>")
externalIp = ipWebCode[0]
This is a short snippet I had written for another program. The trick was finding a simple enough website so that dissecting the HTML wasn't a pain.
Here's another alternative script.
import httplib
import json

def track_ip():
    """
    Returns a dict with the following keys:
    - ip
    - latlong
    - country
    - city
    - user-agent
    """
    conn = httplib.HTTPConnection("www.trackip.net")
    conn.request("GET", "/ip?json")
    resp = conn.getresponse()
    print resp.status, resp.reason
    ip = None  # stays None if the request fails
    if resp.status == 200:
        ip = json.loads(resp.read())
    else:
        print 'Connection Error: %s' % resp.reason
    conn.close()
    return ip
If you're just writing for yourself and not for a generalized application, you might be able to find the address on the setup page for your router and then scrape it from that page's HTML. This worked fine for me with my SMC router: one read and one simple RE search and I had found it.
My particular interest in doing this was to know my home IP address when I was away from home, so I could get back in via VNC. A few more lines of Python store the address in Dropbox for outside access and even email me if they see a change. I've scheduled it to happen on boot and once an hour thereafter.
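A sketch of that approach; note that the router address, status page path, and page layout here are all hypothetical and differ per router model:
import re
import urllib2

html = urllib2.urlopen('http://192.168.0.1/status.html').read()
match = re.search(r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}', html)
if match:
    print match.group()  # first IP-like string on the page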
Use this script:
import urllib, json
data = json.loads(urllib.urlopen("http://ip.jsontest.com/").read())
print data["ip"]
Without json:
import urllib, re
data = re.search('"([0-9.]*)"', urllib.urlopen("http://ip.jsontest.com/").read()).group(1)
print data
import os

# os.system() only returns the exit status; use os.popen() to capture the output
public_ip = os.popen("inxi -i | grep 'WAN IP'").read().strip()
print(public_ip)
If you are not interested in hitting any URL to get the public IP, I think the following code can help you get the public IP of your machine using Python:
import os
externalIP = os.popen("ifconfig | grep 'inet' | cut -d: -f2 | awk '{print $2}' | sed -n 3p").readline()
print externalIP
The sed -n 3p line varies according to the network you are using to connect the device.
I was facing the same issue: I needed the public IP of an IoT device that was hitting my server, but the public IP shown by the ifconfig command was totally different from the IP my server saw in the request object. After this, I added an extra parameter to my request to send the device's IP to my server.
Hope this is helpful.
import os
externalIp = os.popen("ipconfig").read().split(":")[15][1:14]
Some numbers may need to be changed, but this works for me.
Does urllib2 in Python 2.6.1 support proxy via https?
I've found the following at http://www.voidspace.org.uk/python/articles/urllib2.shtml:
NOTE
Currently urllib2 does not support fetching of https locations through a proxy. This can be a problem.
I'm trying to automate logging in to a web site and downloading a document. I have a valid username/password.
proxy_info = {
    'host': "axxx",  # commented out the real data
    'port': "1234"   # commented out the real data
}
proxy_handler = urllib2.ProxyHandler(
    {"http": "http://%(host)s:%(port)s" % proxy_info})
opener = urllib2.build_opener(proxy_handler,
                              urllib2.HTTPHandler(debuglevel=1),
                              urllib2.HTTPCookieProcessor())
urllib2.install_opener(opener)

fullurl = 'https://correct.url.to.login.page.com/user=a&pswd=b'  # example
req1 = urllib2.Request(url=fullurl, headers=headers)  # headers defined elsewhere in the original script
response = urllib2.urlopen(req1)
I've had this working for similar pages but not using HTTPS, and I suspect it does not get through the proxy; it just gets stuck in the same way as when I did not specify a proxy. I need to go out through the proxy.
I need to authenticate, but not using basic authentication. Will urllib2 figure out the authentication when going to the https site (I supply the username/password to the site via the URL)?
EDIT:
Nope, I tested with
proxies = {
    "http": "http://%(host)s:%(port)s" % proxy_info,
    "https": "https://%(host)s:%(port)s" % proxy_info
}
proxy_handler = urllib2.ProxyHandler(proxies)
And I get the error:
urllib2.URLError: urlopen error [Errno 8] _ssl.c:480: EOF occurred in violation of protocol
Fixed in Python 2.6.3 and several other branches:
http://bugs.python.org/issue1424152
http://www.python.org/download/releases/2.6.3/NEWS.txt
Issue #1424152: Fix for httplib, urllib2 to support SSL while working through
proxy. Original patch by Christopher Li, changes made by Senthil Kumaran.
I'm not sure Michael Foord's article, which you quote, has been updated for Python 2.6.1, so why not give it a try? Instead of telling ProxyHandler that the proxy is only good for http, as you're doing now, register it for https too (of course, you should format it into a variable just once before you call ProxyHandler and repeatedly use that variable in the dict). That may or may not work, but if you're not even trying, it's sure not to work!
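A sketch of that suggestion, reusing the proxy_info dict from the question (host and port are placeholders):
import urllib2

proxy_info = {'host': 'proxy.example.com', 'port': '1234'}  # placeholders
proxy_url = "http://%(host)s:%(port)s" % proxy_info  # format it just once
proxy_handler = urllib2.ProxyHandler({'http': proxy_url, 'https': proxy_url})
opener = urllib2.build_opener(proxy_handler)
urllib2.install_opener(opener)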
In case anyone else has this issue in the future, I'd like to point out that urllib2 does support https proxying now. Make sure the proxy supports it too, or you risk running into a bug that puts the Python library into an infinite loop (this happened to me).
See the unit test in the Python source that exercises https proxying support for further information:
http://svn.python.org/view/python/branches/release26-maint/Lib/test/test_urllib2.py?r1=74203&r2=74202&pathrev=74203