401 - Unauthorized: Access is denied due to invalid credentials

import urllib2

proxy = urllib2.ProxyHandler({"http": "http://proxyIp1", "https": "http://proxyIp2"})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)
try:
    a = urllib2.urlopen("corporate internal link")
    b = a.read()
except urllib2.HTTPError as e:
    error = e.read()  # this will be your error message
    print error
This gives an error:
401 - Unauthorized: Access is denied due to invalid credentials.
I know that I have to supply a username and password, but can anyone tell me how to pass the credentials in this case?
Thanks

Source: http://docs.python.org/2/library/urllib2.html. Check "Use of Basic HTTP Authentication" and read about the add_password method there:
proxy_handler = urllib2.ProxyHandler({'http': 'http://www.example.com:3128/'})
proxy_auth_handler = urllib2.ProxyBasicAuthHandler()
proxy_auth_handler.add_password('realm', 'host', 'username', 'password')
opener = urllib2.build_opener(proxy_handler, proxy_auth_handler)
# This time, rather than install the OpenerDirector, we use it directly:
opener.open('http://www.example.com/login.html')
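Since the 401 here comes from the target server rather than from the proxy (a proxy challenge would be a 407), here is a minimal sketch, assuming placeholder proxy addresses and a placeholder internal URL, that combines the ProxyHandler with an HTTPBasicAuthHandler for the site itself:
import urllib2

# Placeholder proxy addresses and credentials -- adjust for your environment.
proxy_handler = urllib2.ProxyHandler({"http": "http://proxyIp1", "https": "http://proxyIp2"})

# None means "match any realm"; the URL and credentials here are the ones the
# target server expects, not the proxy's.
password_mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, "http://corporate.internal.link/", "username", "password")
auth_handler = urllib2.HTTPBasicAuthHandler(password_mgr)

opener = urllib2.build_opener(proxy_handler, auth_handler)
urllib2.install_opener(opener)

try:
    body = urllib2.urlopen("http://corporate.internal.link/").read()
except urllib2.HTTPError as e:
    print e.read()
If the proxy itself also challenges you, add a ProxyBasicAuthHandler with its own add_password call to the same build_opener.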

Related

HTTP Error 401: Unauthorized using urllib.request.urlopen

I'm using urllib.request in Python to try and download some build information from TeamCity. This request used to work without a username and password; however, a recent security change means I must now use them. So I have tried each of the two solutions below:
Attempt 1:
url = 'http://<domain>/httpAuth/app/rest/buildTypes/<buildlabel>/builds/running:false?count=1&start=0'
# create a password manager
password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
# Add the username and password.
top_level_url = "http://<domain>/httpAuth/app/rest/buildTypes/id:<buildlabel>/builds/running:false?count=1&start=0"
password_mgr.add_password(None, top_level_url, username, password)
handler = urllib.request.HTTPBasicAuthHandler(password_mgr)
# create "opener" (OpenerDirector instance)
opener = urllib.request.build_opener(handler)
# use the opener to fetch a URL
opener.open(url)
Attempt 2:
url = 'http://<username>:<password>@<domain>/httpAuth/app/rest/buildTypes/id:buildlabel/builds/running:false?count=1&start=0'
rest_api = urllib.request.urlopen(url)
Both of these return "HTTP Error 401: Unauthorized". However, if I print url and copy the output into a browser, the link works perfectly; it is only when used through Python that I get the above error.
I use something very similar in another Perl script and that works perfectly as well.
* SOLVED BELOW *
Solved this using:
credentials(url, username, password)
rest_api = urllib2.urlopen(url)
latest_build_info = rest_api.read()
latest_build_info = latest_build_info.decode("UTF-8")
# Then parse this xml for the information I want.

def credentials(url, username, password):
    p = urllib2.HTTPPasswordMgrWithDefaultRealm()
    p.add_password(None, url, username, password)
    handler = urllib2.HTTPBasicAuthHandler(p)
    opener = urllib2.build_opener(handler)
    urllib2.install_opener(opener)
As a side note, I then want to download a file:
credentials(url, username, password)
urllib.urlretrieve(url, downloaded_file)
Where url is:
http://<teamcityServer>/repository/download/<build Label>/<BuildID>:id/Filename.zip
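Note that urlretrieve lives in urllib, not urllib2, and it bypasses the opener installed by credentials(). A minimal sketch that reuses the helper above and streams the download through the authenticated opener instead (downloaded_file is whatever local path you want to write to):
import shutil
import urllib2

credentials(url, username, password)        # installs the auth-aware opener, as above
response = urllib2.urlopen(url)             # this request goes through the installed opener
with open(downloaded_file, 'wb') as out_file:
    shutil.copyfileobj(response, out_file)  # stream the zip to disk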

Proxy settings in requests library Python

I have a Python script used to connect to Parse.com (remote server) and upload a file. The script runs off a server that sits behind a corporate firewall.
import env
import json
import requests
from requests.auth import HTTPProxyAuth
def uploadFile(fileFullPath):
    print "Attempting to upload file: " + fileFullPath
    proxies = {
        "http": "http://10.128.198.14",
        "https": "http://10.128.198.14"
    }
    auth = HTTPProxyAuth('MyDomain\\MyUsername', 'MyPassword')
    headers = {
        "X-Parse-Application-Id": env.X_Parse_APP_ID,
        "X-Parse-REST-API-Key": env.X_Parse_REST_API_Key,
        "Content-Type": "application/pdf"
    }
    f = open(fileFullPath, 'rb')  # open in binary mode for a PDF upload
    files = {'file': f}
    r = requests.post(env.PARSE_HOSTNAME + env.PARSE_FILES_ENDPOINT + "/" + env.PARSE_FILE_NAME,
                      files=files, headers=headers, timeout=10, verify=False, proxies=proxies)
    print r.text
When I used this module from the command prompt, I got the following message:
ConnectionError thrown. Details: Cannot connect to proxy. Socket error: Tunnel connection failed: 407 Proxy Authentication Required.
I am pretty sure the username and password are both correct.
Any solution? Thanks!
The reason for the 407 error is that the proxy itself needs to be authenticated. So for your proxies dict, do the following:
proxies = {
    "http": "http://user:pass@10.128.198.14",
    "https": "http://user:pass@10.128.198.14"
}
Fill in the user and pass values in the proxy URLs. Here is a link to the relevant requests documentation on how to build proxy objects and have them authenticated.
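One thing to watch out for with this style: credentials containing special characters (the domain backslash in MyDomain\MyUsername, or @, :, /) need to be percent-encoded before they are embedded in the proxy URL, otherwise the URL parser misreads them. A minimal sketch, assuming the same placeholder proxy address as above:
import urllib
import requests

# Percent-encode the userinfo part; recent versions of requests unquote it
# again when building the Proxy-Authorization header.
user = urllib.quote('MyDomain\\MyUsername', safe='')   # -> MyDomain%5CMyUsername
password = urllib.quote('MyPassword', safe='')

proxies = {
    "http":  "http://%s:%s@10.128.198.14" % (user, password),
    "https": "http://%s:%s@10.128.198.14" % (user, password),
}
# then pass proxies=proxies to requests.post as in the question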

Can't send HTTPS request through proxy using urllib2

I'm trying to create a Python script that sends an HTTPS request through a proxy (Burp, to be exact), but it keeps failing with:
ssl.CertificateError: hostname 'example.com:443' doesn't match u'example.com'
Here's an abbreviated version of my code:
proxy = urllib2.ProxyHandler({'https': '127.0.0.1:8080'})
opener = urllib2.build_opener(proxy)
opener.addheaders = [("Host", "example.com"),
                     ...
                     ]
urllib2.install_opener(opener)
try:
    req = opener.open('https://example.com/service', 'data').read()
except urllib2.URLError, e:
    print e
So it looks like Python (ssl.CertificateError is, I believe, a Python error, not an OpenSSL one) has a problem with either the port or the fact that one of the hostnames is a Unicode string. Neither makes sense to me. Any suggestions?
Try this code. I got it working with Burp.
test.py
import urllib2

opener = urllib2.build_opener(
    urllib2.HTTPHandler(),
    urllib2.HTTPSHandler(),
    urllib2.ProxyHandler({'https': 'localhost:8080'}))
urllib2.install_opener(opener)
print opener.open('https://example.com', 'data').read()
(Screenshots: Burp configuration, demo)
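As an alternative when you are deliberately intercepting through Burp, a minimal sketch (Python 2.7.9 or later, and only reasonable because you trust the local proxy) simply turns off certificate and hostname verification for the HTTPS handler, which also avoids the hostname-mismatch error:
import ssl
import urllib2

ctx = ssl._create_unverified_context()          # no certificate or hostname checks
opener = urllib2.build_opener(
    urllib2.HTTPSHandler(context=ctx),
    urllib2.ProxyHandler({'https': 'localhost:8080'}))
urllib2.install_opener(opener)

print opener.open('https://example.com/service', 'data').read()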

HTTP login to router using Python and requests module getting an exception

When I try to log in to my router it works fine, but when the password or username is wrong I get an exception. How do I handle this exception?
from requests.auth import HTTPBasicAuth
import requests

def hts():
    url = 'http://192.168.1.1/'
    name = 'username'
    passw = 'password'
    auth = HTTPBasicAuth(name, passw)
    r = requests.get(url, auth=auth)
    try:
        if r:
            print(r.text)
        else:
            print("not found")
    except requests.exceptions.ContentDecodingError as e:
        print('wrong password')

hts()
Here is the error:
raise ContentDecodingError(e)
requests.exceptions.ContentDecodingError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect header check',))
try:
    r = requests.get(url, auth=auth)
except requests.exceptions.ContentDecodingError as e:  # handle the ContentDecoding exception
    print('wrong password')
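The snippet above catches the decoding error that this particular router happens to raise on a bad login. As a minimal alternative sketch, assuming the router answers a failed login with a plain 401 (many do), you could check the status code instead of relying on the broken gzip body:
import requests
from requests.auth import HTTPBasicAuth

r = requests.get('http://192.168.1.1/', auth=HTTPBasicAuth('username', 'password'))
if r.status_code == 401:
    print('wrong username or password')
elif r.ok:
    print(r.text)
else:
    print('unexpected response: %d' % r.status_code)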

How to use the HTTPPasswordMgrWithDefaultRealm() in Python

I need to write some Python FTP code that uses an FTP proxy. The proxy doesn't require authentication, but the FTP server I am connecting to does. I have the following code, but I am getting an "I/O error(ftp error): 501 USER format: proxy-user:auth-method@destination. Closing connection." error. My code is:
import urllib2

proxies = {'ftp': 'ftp://proxy_server:21'}
ftp_server = 'ftp.somecompany.com'
ftp_port = '21'
username = 'aaaa'
password = 'secretPW'
password_mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
top_level_url = ftp_server
password_mgr.add_password(None, top_level_url, username, password)
proxy_support = urllib2.ProxyHandler(proxies)
handler = urllib2.HTTPBasicAuthHandler(password_mgr)
opener = urllib2.build_opener(proxy_support)
opener = urllib2.build_opener(handler)
a_url = 'ftp://' + ftp_server + ':' + ftp_port + '/'
print a_url
try:
    data = opener.open(a_url)
    print data
except IOError, (errno, strerror):
    print "I/O error(%s): %s" % (errno, strerror)
I would be grateful for any assistance I can get.
I use the following code block, which seems similar, except that I include the protocol in the top_level_url I use (and of course it's HTTP). You might also try calling install_opener after each build_opener call and then using urllib2.urlopen:
auth_handler = urllib2.HTTPBasicAuthHandler()
auth_handler.add_password(realm='RESTRICTED ACCESS',
                          uri='http://website.com',
                          user='username',
                          passwd='password')
opener = urllib2.build_opener(auth_handler)
urllib2.install_opener(opener)
urllib2.urlopen('http://website.com/....')
I think you need to change this:
opener = urllib2.build_opener(proxy_support)
opener = urllib2.build_opener(handler)
to this (build_opener takes the handlers as separate arguments, not a list):
opener = urllib2.build_opener(proxy_support, handler)
That gives you one opener that has both your authentication and your proxy support. You only need to use install_opener if you want the custom opener to be used whenever urllib2.urlopen is called.
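For completeness, a minimal sketch of that combined opener using the placeholder values from the question (whether a basic-auth handler actually helps for an FTP URL is a separate question, but this is how the two handlers are passed together):
import urllib2

proxies = {'ftp': 'ftp://proxy_server:21'}
password_mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, 'ftp.somecompany.com', 'aaaa', 'secretPW')

opener = urllib2.build_opener(
    urllib2.ProxyHandler(proxies),
    urllib2.HTTPBasicAuthHandler(password_mgr))
urllib2.install_opener(opener)      # optional: makes plain urllib2.urlopen use it too

print opener.open('ftp://ftp.somecompany.com:21/').read()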
