Recently I started getting
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://api.soundcloud.com/oauth2/token
using the soundcloud (0.5.0) Python library.
It happens in
client = soundcloud.Client(client_id='id',
                           client_secret='secret',
                           username='user@mail.com',
                           password='passwd')
I double-checked my credentials to make sure they are not the cause. I tried to get a Client instance from different IPs and different machines. At random times during the day I can get a Client instance, but 99.99% of the time I get the error.
Does the error mean I was banned for some reason?
This may be helpful: I solved a similar problem by reading the username and password from a configuration file.
Try:
# config.ini
[my_app]
CLIENT_ID = enter_your_id
CLIENT_SECRET = enter_your_secret
USERNAME = enter_username
PASSWORD = enter_password
For Python 2.6+, install the configparser backport (of the Python 3.8 module): pip install configparser.
See the documentation for more details.
import configparser
config = configparser.RawConfigParser()
config.read('config.ini')
my_id = config.get('my_app', 'CLIENT_ID')
my_secret = config.get('my_app', 'CLIENT_SECRET')
my_user = config.get('my_app', 'USERNAME')
my_pass = config.get('my_app', 'PASSWORD')
client = soundcloud.Client(client_id=my_id,
                           client_secret=my_secret,
                           username=my_user,
                           password=my_pass)
You are probably using a third-party app to access your account. If you are using a Gmail account for this, consider allowing third-party apps to access it. Here is how:
Go to https://myaccount.google.com/security
Enable Two-Step Verification.
Create an App Password: under Select app choose Mail, and under Select device choose Windows Computer.
Then use the generated password.
And that should solve your issue.
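For reference, a minimal sketch of logging in with such an app password via Python's smtplib; the address and the 16-character password below are placeholders for your own values:
import smtplib

GMAIL_USER = 'you@gmail.com'        # placeholder address
APP_PASSWORD = 'abcdefghijklmnop'   # placeholder 16-character app password

server = smtplib.SMTP('smtp.gmail.com', 587)
server.ehlo()
server.starttls()  # Gmail requires TLS before login on port 587
server.login(GMAIL_USER, APP_PASSWORD)
server.quit()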
I've created some web services using pysimplesoap like on this documentation:
https://code.google.com/p/pysimplesoap/wiki/SoapServer
When I tested it, I called it like this:
from SOAPpy import SOAPProxy
from SOAPpy import Types
namespace = "http://localhost:8008"
url = "http://localhost:8008"
proxy = SOAPProxy(url, namespace)
response = proxy.dummy(times=5, name="test")
print response
And it worked for all of my web services, but when I try to call them using a library that requires a WSDL, it returns "Could not connect to host".
To solve my problem, I used the dispatcher's .wsdl() method to generate a correct WSDL and saved it into a file; the WSDL generated by default wasn't correct, as it was missing variable types and the correct server address.
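As a rough sketch of that dump (the dispatcher name, location, and namespace below are placeholders for your own configuration, and I'm relying on the dispatcher's wsdl() method returning the WSDL as a string):
from pysimplesoap.server import SoapDispatcher

# Placeholders: substitute the dispatcher you already configured
# for your actual services.
dispatcher = SoapDispatcher(
    'my_dispatcher',
    location='http://localhost:8008/',
    action='http://localhost:8008/',
    namespace='http://localhost:8008/')

# Dump the generated WSDL to a file so it can be inspected and corrected.
with open('service.wsdl', 'w') as f:
    f.write(dispatcher.wsdl())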
The server name localhost is only meaningful on your own computer; once outside, other machines won't be able to resolve it.
1) Find out your external IP, with http://www.whatismyip.com/ or another service. Note that IPs change over time.
2) Plug that IP into http://www.soapclient.com/soaptest.html
If your local service answers requests to that IP as well as to localhost, you're done!
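Also worth checking: bind the server to all interfaces rather than localhost, so other machines can reach it at all. A sketch, assuming the Python 2 HTTPServer setup from the pysimplesoap wiki page (the placeholder dispatcher stands in for your own):
from BaseHTTPServer import HTTPServer
from pysimplesoap.server import SoapDispatcher, SOAPHandler

# Minimal placeholder dispatcher; substitute the one configured
# for your actual services.
dispatcher = SoapDispatcher(
    'test_dispatcher',
    location='http://0.0.0.0:8008/',
    namespace='http://localhost:8008')

# Binding to "" (all interfaces) instead of "localhost" makes the
# service reachable from other machines, not just this one.
httpd = HTTPServer(("", 8008), SOAPHandler)
httpd.dispatcher = dispatcher
httpd.serve_forever()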
I have SVN running on Ubuntu 11.04 (32-bit) and now want notifications via my Gmail account for every commit.
I've committed a few things but don't actually receive the commit emails for them. No errors are displayed, and I have looked through the logs but haven't found much useful information yet.
I've read quite a lot of posts about this and edited the files below (their current contents are included). I've tried sendmail and postfix but have had no luck with them, which is why I am using Google's mail server. I would be grateful if someone could point me in the right direction or suggest an alternative approach.
The links I found and have used.
http://sadomovalex.blogspot.com/2009/12/use-gmail-smtp-server-for-post-commit.html
http://iffee.wordpress.com/2009/04/08/svn-commit-to-google-apps-email-notification/
post-commit.tmpl
REPOS="$1"
REV="$2"
/home/megaz/svn/repos/ya/hooks/mailer.py commit "$REPOS" \
"$REV" /home/megaz/svn/repos/ya/hooks/mailer.conf
mailer.conf
[general]
smtp_hostname = smtp.gmail.com:587
smtp_username = #mygmailaddress
smtp_password = #mygmailpassword
smtp_use_ssl = true
smtp_use_tls = 1
[defaults]
diff = /usr/bin/diff -u -L %(label_from)s -L %(label_to)s %(from)s %(to)s
commit_subject_prefix = [SVN-Commit]
propchange_subject_prefix =
lock_subject_prefix =
unlock_subject_prefix =
from_addr = #my from address
to_addr = #my to address
reply_to = #my replyto address
generate_diffs = none
show_nonmatching_paths = yes
[maps]
mailer.py
class SMTPOutput(MailedOutput):
  def start(self, group, params):
    MailedOutput.start(self, group, params)
    self.buffer = StringIO()
    self.write = self.buffer.write
    self.write(self.mail_headers(group, params))

  def finish(self):
    server = smtplib.SMTP(self.cfg.general.smtp_hostname)
    # 2009-12-13 asadomov: add ssl configuration (e.g. for gmail smtp server)
    if self.cfg.is_set('general.smtp_use_ssl') and self.cfg.general.smtp_use_ssl.lower() == "true":
      server.ehlo()
      server.starttls()
      server.ehlo()
    if self.cfg.is_set('general.smtp_username'):
      server.login(self.cfg.general.smtp_username,
                   self.cfg.general.smtp_password)
    server.sendmail(self.from_addr, self.to_addrs, self.buffer.getvalue())
    server.quit()
I see you have not really read the instructions. The code you copy/pasted needs to replace a snippet in a larger file which you haven't downloaded. Also, the file name of the post-commit script should not have the .tmpl suffix; that's what the distribution uses for inactive example/template files.
Perhaps this also explains why you couldn't get sendmail to work. At this point I'd recommend going back to that, as it's simpler.
rename your "post-commit.tmpl" to "post-commit" (see the example below)
make sure you give exec rights (such as 755) to "post-commit"
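For example, using the hooks directory from the question:
mv /home/megaz/svn/repos/ya/hooks/post-commit.tmpl /home/megaz/svn/repos/ya/hooks/post-commit
chmod 755 /home/megaz/svn/repos/ya/hooks/post-commit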
Looking for a better way to get a machine's current external IP address... The code below works, but I would rather not rely on an outside site to gather the information. I am restricted to using the standard Python 2.5.1 libraries bundled with Mac OS X 10.5.x.
import os
import urllib2
def check_in():
    fqn = os.uname()[1]
    ext_ip = urllib2.urlopen('http://whatismyip.org').read()
    print ("Asset: %s " % fqn, "Checking in from IP#: %s " % ext_ip)
I like http://ipify.org. They even provide Python code for using their API.
# This example requires the requests library be installed. You can learn more
# about the Requests library here: http://docs.python-requests.org/en/latest/
from requests import get
ip = get('https://api.ipify.org').content.decode('utf8')
print('My public IP address is: {}'.format(ip))
Python 3, using nothing but the standard library
As mentioned before, one can use an external service like ident.me in order to discover the external IP address of your router.
Here is how it is done with Python 3, using nothing but the standard library:
import urllib.request
external_ip = urllib.request.urlopen('https://ident.me').read().decode('utf8')
print(external_ip)
Both IPv4 and IPv6 addresses can be returned, based on availability and client preference; use https://v4.ident.me/ for IPv4 only, or https://v6.ident.me/ for IPv6 only.
If you are behind a router which obtains the external IP, I'm afraid you have no other option but to use external service like you do. If the router itself has some query interface, you can use it, but the solution will be very environment-specific and unreliable.
You should use the UPnP protocol to query your router for this information. Most importantly, this does not rely on an external service, which all the other answers to this question seem to suggest.
There's a Python library called miniupnpc which can do this; see e.g. miniupnpc/testupnpigd.py.
pip install miniupnpc
Based on their example you should be able to do something like this:
import miniupnpc
u = miniupnpc.UPnP()
u.discoverdelay = 200
u.discover()
u.selectigd()
print('external ip address: {}'.format(u.externalipaddress()))
I prefer this Amazon AWS endpoint:
import requests
ip = requests.get('https://checkip.amazonaws.com').text.strip()
In my opinion the simplest solution is
import requests
f = requests.request('GET', 'http://myip.dnsomatic.com')
ip = f.text
That's all.
Use requests module:
import requests
myip = requests.get('https://www.wikipedia.org').headers['X-Client-IP']
print("\n[+] Public IP: "+myip)
As simple as running this in Python3:
import os
externalIP = os.popen('curl -s ifconfig.me').readline()
print(externalIP)
Try:
import requests
ip = requests.get('http://ipinfo.io/json').json()['ip']
Hope this is helpful
If you don't want to use external services (IP websites, etc.), you can use the UPnP protocol.
To do that, we use a simple UPnP client library (https://github.com/flyte/upnpclient).
Install:
pip install upnpclient
Simple Code:
import upnpclient

devices = upnpclient.discover()
if len(devices) > 0:
    externalIP = devices[0].WANIPConn1.GetExternalIPAddress()
    print(externalIP)
else:
    print('No Connected network interface detected')
Full Code (to get more information as mentioned in the github readme)
In [1]: import upnpclient
In [2]: devices = upnpclient.discover()
In [3]: devices
Out[3]:
[<Device 'OpenWRT router'>,
<Device 'Harmony Hub'>,
<Device 'walternate: root'>]
In [4]: d = devices[0]
In [5]: d.WANIPConn1.GetStatusInfo()
Out[5]:
{'NewConnectionStatus': 'Connected',
'NewLastConnectionError': 'ERROR_NONE',
'NewUptime': 14851479}
In [6]: d.WANIPConn1.GetNATRSIPStatus()
Out[6]: {'NewNATEnabled': True, 'NewRSIPAvailable': False}
In [7]: d.WANIPConn1.GetExternalIPAddress()
Out[7]: {'NewExternalIPAddress': '123.123.123.123'}
If you think an external source is too unreliable, you could poll a few different services. Most IP-lookup pages require you to scrape HTML, but a few of them have created lean pages for scripts like yours, which also reduces the hits on their sites:
automation.whatismyip.com/n09230945.asp (Update: whatismyip has taken this service down)
whatismyip.org
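A minimal sketch of that pooling idea, trying several plain-text services in turn; the list is illustrative, built from services mentioned elsewhere in this thread:
import urllib.request

# Illustrative fallback list; any lean plain-text services will do.
SERVICES = [
    'https://api.ipify.org',
    'https://checkip.amazonaws.com',
    'https://ident.me',
]

def external_ip():
    for url in SERVICES:
        try:
            return urllib.request.urlopen(url, timeout=5).read().decode('utf8').strip()
        except OSError:
            continue  # service down or unreachable; try the next one
    return None

print(external_ip())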
I use IPGrab because it's easy to remember:
# This example requires the requests library be installed. You can learn more
# about the Requests library here: http://docs.python-requests.org/en/latest/
from requests import get
ip = get('http://ipgrab.io').text
print('My public IP address is: {}'.format(ip))
There are a few other ways that do not rely on Python checking an external web site; the OS, however, can. Your primary issue here is that, even if you were not using Python, there are no "built-in" commands that can simply tell you the external (WAN) IP. Commands such as "ip addr show" and "ifconfig -a" show you the server's IP addresses within the network. Only the router actually holds the external IP. However, there are ways to find the external IP address (WAN IP) from the command line.
These examples are:
curl http://ipecho.net/plain ; echo
curl ipinfo.io/ip
dig +short myip.opendns.com @resolver1.opendns.com
dig TXT +short o-o.myaddr.l.google.com @ns1.google.com
Therefore, the python code would be:
import os
ip = os.popen('wget -qO- http://ipecho.net/plain ; echo').readlines(-1)[0].strip()
print ip
OR
import os
iN, out, err = os.popen3('curl ipinfo.io/ip')
iN.close() ; err.close()
ip = out.read().strip()
print ip
OR
import os
ip = os.popen('dig +short myip.opendns.com @resolver1.opendns.com').readlines(-1)[0].strip()
print ip
Or plug any of the other examples above into a call such as os.popen, os.popen2, os.popen3, or os.system.
P.S. You can "pip3 install pytis" and use (or take a look at) the "getip" program, written in Python.
You can also find its code here: https://github.com/PyTis/PyTis/blob/development/src/pytis/getip.py
I tried most of the other answers on this question here and came to find that most of the services used were defunct except one.
Here is a script that should do the trick and download only a minimal amount of information:
#!/usr/bin/env python
import urllib
import re
def get_external_ip():
    site = urllib.urlopen("http://checkip.dyndns.org/").read()
    grab = re.findall('([0-9]+\.[0-9]+\.[0-9]+\.[0-9]+)', site)
    address = grab[0]
    return address

if __name__ == '__main__':
    print( get_external_ip() )
import requests
import re
def getMyExtIp():
    try:
        res = requests.get("http://whatismyip.org")
        myIp = re.compile(r'(\d{1,3}\.){3}\d{1,3}').search(res.text).group()
        if myIp != "":
            return myIp
    except Exception:
        pass
    return "n/a"
Unfortunately, there is no way to get your external IP address without consulting a computer on the internet. At best, you can get the local network IP address of your network card (which is likely a 192.168.x.x address).
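For completeness, a standard-library sketch of getting that local address. The "connection" to 8.8.8.8 is a UDP socket, so no packets are actually sent; it merely makes the OS pick the outbound interface:
import socket

def local_ip():
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # Connecting a UDP socket sends nothing; it only forces the OS
        # to choose the interface it would route external traffic through.
        s.connect(('8.8.8.8', 80))
        return s.getsockname()[0]
    finally:
        s.close()

print(local_ip())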
You can use the whatismyip module to get the external IP address. It has no dependencies outside the Python 3 standard library. It connects to public STUN servers and what-is-my-ip websites to find the IPv4 or IPv6 address. Run pip install whatismyip
Example:
>>> import whatismyip
>>> whatismyip.amionline()
True
>>> whatismyip.whatismyip() # Prefers IPv4 addresses, but can return either IPv4 or IPv6.
'69.89.31.226'
>>> whatismyip.whatismyipv4()
'69.89.31.226'
>>> whatismyip.whatismyipv6()
'2345:0425:2CA1:0000:0000:0567:5673:23b5'
If the machine is behind a firewall then your solution is a very sensible one: the alternative, querying the firewall itself, ends up being very dependent on the type of firewall (if it is possible at all).
The simplest (non-Python) working solution I can think of is
wget -q -O- icanhazip.com
I'd like to add a very short Python3 solution which makes use of the JSON API of http://hostip.info.
from urllib.request import urlopen
import json
url = 'http://api.hostip.info/get_json.php'
info = json.loads(urlopen(url).read().decode('utf-8'))
print(info['ip'])
You can of course add some error checking, a timeout condition and some convenience:
#!/usr/bin/env python3
from urllib.request import urlopen
from urllib.error import URLError
import json
try:
    url = 'http://api.hostip.info/get_json.php'
    info = json.loads(urlopen(url, timeout=15).read().decode('utf-8'))
    print(info['ip'])
except URLError as e:
    print(e.reason, end=' ')  # e.g. 'timed out'
    print('(are you connected to the internet?)')
except KeyboardInterrupt:
    pass
Using the pystun package (pip install pystun):
In [1]: import stun
In [2]: stun.get_ip_info()
Out[2]: ('Restric NAT', 'xx.xx.xx.xx', 55320)
Works with Python 2.7.6 and 2.7.13:
import urllib2
req = urllib2.Request('http://icanhazip.com', data=None)
response = urllib2.urlopen(req, timeout=5)
print(response.read())
Linux-only solution.
On Linux systems, you can use Python to execute a command in the shell. I think it might help someone.
Something like this (assuming 'dig'/'drill' works on the OS):
import os
command = "dig TXT +short o-o.myaddr.l.google.com @ns1.google.com | awk -F\'\"\' '{print $2}' "
# os.popen captures the command's output (os.system would only return its exit status)
ip = os.popen(command).read().strip()
For Arch users, please replace 'dig' with 'drill'.
I liked Sergiy Ostrovsky's answer, but I think there is an even tidier way to do this now.
Install ipify library.
pip install ipify
Import and use the library in your Python program.
import ipify
ip = ipify.get_ip()
import urllib.request

ipWebCode = urllib.request.urlopen("http://ip.nefsc.noaa.gov").read().decode("utf8")
ipWebCode = ipWebCode.split("color=red> ")
ipWebCode = ipWebCode[1]
ipWebCode = ipWebCode.split("</font>")
externalIp = ipWebCode[0]
This is a short snippet I wrote for another program. The trick was finding a website simple enough that dissecting the HTML wasn't a pain.
Here's another alternative script.
import httplib
import json

def track_ip():
    """
    Returns a dict with the following keys:
    - ip
    - latlong
    - country
    - city
    - user-agent
    """
    ip = None
    conn = httplib.HTTPConnection("www.trackip.net")
    conn.request("GET", "/ip?json")
    resp = conn.getresponse()
    print resp.status, resp.reason
    if resp.status == 200:
        ip = json.loads(resp.read())
    else:
        print 'Connection Error: %s' % resp.reason
    conn.close()
    return ip
If you're just writing for yourself and not for a generalized application, you might be able to find the address on the setup page for your router and then scrape it from that page's html. This worked fine for me with my SMC router. One read and one simple RE search and I've found it.
My particular interest in doing this was to let me know my home IP address when I was away from home, so I could get back in via VNC. A few more lines of Python stores the address in Dropbox for outside access, and even emails me if it sees a change. I've scheduled it to happen on boot and once an hour thereafter.
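A sketch of that watcher, with the lookup URL, state file, and mail settings as placeholders (I scrape my router's page, but any of the plain-text services in this thread works the same way):
import os
import smtplib
import urllib.request
from email.mime.text import MIMEText

STATE_FILE = os.path.expanduser('~/.last_home_ip')  # placeholder path

def notify(ip):
    # Placeholder addresses and SMTP settings; adjust for your mail setup.
    msg = MIMEText('Home IP changed to %s' % ip)
    msg['Subject'] = 'Home IP change'
    msg['From'] = 'me@example.com'
    msg['To'] = 'me@example.com'
    server = smtplib.SMTP('smtp.example.com', 587)
    server.starttls()
    server.login('me@example.com', 'app-password')
    server.sendmail(msg['From'], [msg['To']], msg.as_string())
    server.quit()

ip = urllib.request.urlopen('https://ident.me').read().decode('utf8').strip()
last = open(STATE_FILE).read().strip() if os.path.exists(STATE_FILE) else None
if ip != last:
    open(STATE_FILE, 'w').write(ip)
    notify(ip)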
Use this script:
import urllib, json
data = json.loads(urllib.urlopen("http://ip.jsontest.com/").read())
print data["ip"]
Without json:
import urllib, re
data = re.search('"([0-9.]*)"', urllib.urlopen("http://ip.jsontest.com/").read()).group(1)
print data
import os
# os.popen captures the output; os.system would only return the exit status.
public_ip = os.popen("inxi -i | grep 'WAN IP'").read()
print(public_ip)
If you are not interested in hitting any URL to get the public IP, I think the following code can help you get an IP address of your machine (as reported by ifconfig) using Python:
import os
externalIP = os.popen("ifconfig | grep 'inet' | cut -d: -f2 | awk '{print $2}' | sed -n 3p").readline()
print externalIP
The sed -n 3p line varies according to the network you are using to connect the device.
I was facing the same issue: I needed the public IP of an IoT device that was hitting my server, but the IP shown by ifconfig and the IP my server got from the request object were totally different. In the end I added an extra param to my request to send the device's IP to my server (a sketch follows below).
Hope this is helpful.
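A sketch of that workaround; the server URL and the device_ip parameter name are placeholders for your own API, and the ifconfig pipeline is the one shown above:
import os
import requests

# Same ifconfig pipeline as above: picks out the device's own address.
device_ip = os.popen(
    "ifconfig | grep 'inet' | cut -d: -f2 | awk '{print $2}' | sed -n 3p"
).readline().strip()

# Placeholder URL and parameter name; the server can now read the
# device-reported address alongside the source IP it sees itself.
requests.get('https://myserver.example.com/checkin',
             params={'device_ip': device_ip})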
import os
externalIp = os.popen("ipconfig").read().split(":")[15][1:14]
Some numbers may need to be changed, but this works for me.
From Python, I would like to retrieve content from a web site via HTTPS with basic authentication. I need the content on disk. I am on an intranet and trust the HTTPS server. Platform is Python 2.6.2 on Windows.
I have been playing around with urllib2, but have not succeeded so far.
I have a solution running that calls wget via os.system():
wget_cmd = r'\path\to\wget.exe -q -e "https_proxy = http://fqdn.to.proxy:port" --no-check-certificate --http-user="username" --http-password="password" -O path\to\output https://fqdn.to.site/content'
I would like to get rid of the os.system(). Is that possible in Python?
Proxy and HTTPS weren't working together in urllib2 for a long time. It will be fixed in the next released version of Python 2.6 (v2.6.3).
In the meantime you can reimplement the correct support; that's what we did for Mercurial: http://hg.intevation.org/mercurial/crew/rev/59acb9c7d90f
Try this (note that you'll also have to fill in the realm of your server):
import urllib2
authinfo = urllib2.HTTPBasicAuthHandler()
authinfo.add_password(realm='Fill In Realm Here',
                      uri='https://fqdn.to.site/content',
                      user='username',
                      passwd='password')
proxy_support = urllib2.ProxyHandler({"https" : "http://fqdn.to.proxy:port"})
opener = urllib2.build_opener(proxy_support, authinfo)
fp = opener.open("https://fqdn.to.site/content")
open(r"path\to\output", "wb").write(fp.read())
You could try this too:
http://code.google.com/p/python-httpclient/
(It also supports the verification of the server certificate.)