I am trying to use the Google Drive API to download publicly available files, but whenever I proceed I get an error.
For reference, I have successfully set up OAuth2, so I have a client ID, a client secret, and a redirect URI. However, when I try the token exchange I get an error saying the object has no attribute 'urlencode':
>>> from apiclient.discovery import build
>>> from oauth2client.client import OAuth2WebServerFlow
>>> flow = OAuth2WebServerFlow(client_id='not_showing_client_id', client_secret='not_showing_secret_id', scope='https://www.googleapis.com/auth/drive', redirect_uri='https://www.example.com/oauth2callback')
>>> auth_uri = flow.step1_get_authorize_url()
>>> code = '4/E4h7XYQXXbVNMfOqA5QzF-7gGMagHSWm__KIH6GSSU4#'
>>> credentials = flow.step2_exchange(code)
And then I get the error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Python/2.7/site-packages/oauth2client/util.py", line 137, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/Library/Python/2.7/site-packages/oauth2client/client.py", line 1980, in step2_exchange
    body = urllib.parse.urlencode(post_data)
AttributeError: 'Module_six_moves_urllib_parse' object has no attribute 'urlencode'
Any help would be appreciated. Also, would someone mind enlightening me as to how I instantiate a drive_file? According to https://developers.google.com/drive/web/manage-downloads, I need to instantiate one, and I am unsure of how to do so.
Edit: So I figured out why I was getting the error above. If anyone else is having the same problem, try running:
sudo pip install -I google-api-python-client==1.3.2
However, I am still unclear about the drive_file instance, so any help with that would be appreciated.
Edit 2: Okay, so I figured out the answer to my whole question. The drive_file instance is just the metadata returned when we use the API to look up a file by its ID.
So, as I said in my edits: try the sudo pip install, and a file instance is just a dictionary of metadata.
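For anyone else stuck on the same point, here is a minimal sketch of that lookup with the Drive v2 API of that era. It reuses the credentials from step2_exchange above; 'FILE_ID' is a placeholder, not a real ID:

import httplib2
from apiclient.discovery import build

# Authorize an HTTP object with the credentials from step2_exchange.
http = credentials.authorize(httplib2.Http())
drive_service = build('drive', 'v2', http=http)

# The "drive_file" is just this metadata dictionary returned by files().get().
drive_file = drive_service.files().get(fileId='FILE_ID').execute()

# Publicly downloadable files expose a downloadUrl in that metadata.
download_url = drive_file.get('downloadUrl')
if download_url:
    resp, content = http.request(download_url)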
I am currently working on authenticating my users. I checked out this blog - RESTful Authentication with Flask - and followed its steps to produce a piece of code I believed would serve my purpose. I wanted to use the TimedJSONWebSignatureSerializer class for my particular use case. Below, I create an object of that class, generate a token, and try to load the data with it.
from itsdangerous import TimedJSONWebSignatureSerializer

user_id = 'fake1'
# parser_app is my Flask application; its config holds the secret key.
s = TimedJSONWebSignatureSerializer(parser_app.config['SECRET_KEY'], expires_in=3600)
token = s.dumps({'user_id': user_id})
print(token)
print(s.loads(token))
I get the following traceback:
eyJhbGciOiJIUzI1NiIsImV4cCI6MTQ2ODI3MjU3MSwiaWF0IjoxNDY4MjY4OTcxfQ.eyJ1c2VyX2lkIjoiZmFrZTEifQ.Ch8y6BDMIIBdIGM0lmjdAimINvP3PnUmBpOp-jDW18w
Traceback (most recent call last):
  File "C:/Users/vaibhav/PycharmProjects/Coding/Coding.py", line 6, in <module>
    print (s.loads(token))
  File "C:\Users\vaibhav\Anaconda\lib\site-packages\itsdangerous.py", line 798, in loads
    self, s, salt, return_header=True)
  File "C:\Users\vaibhav\Anaconda\lib\site-packages\itsdangerous.py", line 752, in loads
    self.make_signer(salt, self.algorithm).unsign(want_bytes(s)),
  File "C:\Users\vaibhav\Anaconda\lib\site-packages\itsdangerous.py", line 377, in unsign
    payload=value)
itsdangerous.BadSignature: Signature 'Ch8y6BDMIIBdIGM0lmjdAimINvP3PnUmBpOp-jDW18w' does not match
I have just created the token with an expiry of an hour, yet it gives me a BadSignature, which indicates the token does not match. The desired output would be:
{"user_id" : "fake1"}
Please help me out.
I ended up having to uninstall and reinstall the itsdangerous package with pip. The commands used were:
pip uninstall itsdangerous
followed by:
pip install itsdangerous
Apparently, the file had somehow been corrupted, causing it not to work properly.
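After the reinstall, the round trip behaves as expected. A minimal sketch (the secret key is a placeholder), including the exception handling you will want for expired or tampered tokens:

from itsdangerous import (TimedJSONWebSignatureSerializer,
                          BadSignature, SignatureExpired)

s = TimedJSONWebSignatureSerializer('SECRET_KEY', expires_in=3600)
token = s.dumps({'user_id': 'fake1'})

try:
    data = s.loads(token)
    print(data)                    # {'user_id': 'fake1'}
except SignatureExpired:
    print('token is valid but has expired')
except BadSignature:
    print('token does not match')

Note that SignatureExpired subclasses BadSignature, so it must be caught first.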
From an Ubuntu machine I use the beatbox Python package to connect to Salesforce and get object descriptions,
but I can't manage to get access to the ForecastingItem object.
I checked my privileges to access the object, and I have full access as admin.
The script I'm using is below; when I replace ForecastingItem with Account, it does pull all of that object's fields.
#!/usr/bin/python
# coding=utf8
import beatbox

sf_service = beatbox.PythonClient()
sf_service.login('email#hotmail.com', 'password$numbersletter')

# Describe the object and list every field name it exposes.
desc_obj = sf_service.describeSObjects('ForecastingItem')
forcat_item = desc_obj[0]
forItem_fields = forcat_item.fields
for sf_field_key, sf_field_value in forItem_fields.items():
    print sf_field_key
I heard that I should upgrade beatbox to something newer than version 21 to be able to access the ForecastingItem object, so I tried upgrading with apt-get, but the traceback below still shows the beatbox-20.0 egg being used, and I still get the error:
Traceback (most recent call last):
  File "./fields_associated_with_an_object.py", line 14, in <module>
    desc_obj = sf_service.describeSObjects('ForecastingItem')
  File "/usr/local/lib/python2.7/dist-packages/beatbox-20.0-py2.7.egg/beatbox/python_client.py", line 131, in describeSObjects
    res = BaseClient.describeSObjects(self, sObjectTypes)
  File "/usr/local/lib/python2.7/dist-packages/beatbox-20.0-py2.7.egg/beatbox/_beatbox.py", line 108, in describeSObjects
    return DescribeSObjectsRequest(self.__serverUrl, self.sessionId, sObjectTypes).post(self.__conn)
  File "/usr/local/lib/python2.7/dist-packages/beatbox-20.0-py2.7.egg/beatbox/_beatbox.py", line 332, in post
    raise SoapFaultError(faultCode, faultString)
beatbox._beatbox.SoapFaultError: 'INVALID_TYPE' "INVALID_TYPE: sObject type 'ForecastingItem' is not supported. If you are attempting to use a custom object, be sure to append the '__c' after the entity name. Please reference your WSDL or the describe call for the appropriate names."
Thanks in advance!
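A hedged suggestion, since the traceback still shows the beatbox-20.0 egg: upgrade through pip rather than apt-get, then confirm which copy Python actually imports before retrying the describe call:

# First upgrade the package itself:
#   sudo pip install --upgrade beatbox

import beatbox

# This should no longer point at the beatbox-20.0-py2.7.egg directory.
print beatbox.__file__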
I installed everything as described on the FlickrAPI homepage, but when I try to run:
import flickrapi
api_key = '1a4c975fa83048436a2086bcab7d2290'
api_password = '5e069eae20e60297'
flickrclient = flickrapi.FlickAPI(api_key, api_password)
favourites = flickrClient.favorites_getPublicList(user_id='userid')
photos = flickr.photos_search(user_id='73509078#N00', per_page='10')
sets = flickr.photosets_getList(user_id='73509078#N00')
for photo in favourites.photos[0].photo:
    print photo['title']
I get this message from the command prompt:
C:\Users\Desktop>python api.py
Traceback (most recent call last):
  File "api.py", line 4, in <module>
    flickrclient = flickrapi.FlickAPI(api_key, api_password)
AttributeError: 'module' object has no attribute 'FlickAPI'
Any ideas? I have tried almost everything.
FlickAPI is not the same as FlickrAPI. You're missing an r.
The file C:\Users\XXXXXX\Desktop\FLICKR API\flickrapi.py is not part of the flickrapi package. Please rename it; it is masking the real library. Right now it is being imported instead of the installed package.
The flickrapi package itself consists of a directory with a __init__.py file inside of it. Printing flickrapi.__file__ should result in a path ending in flickrapi\__init__.py.
In your "flickrclient = flickrapi.FlickAPI" line, you're missing an 'r' in FlickAPI.
Also, on the next line, your "user_id='userid'" argument needs an actual user ID, such as '999999#N99'.
Hopefully you found that & got this working a few months ago! :)
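Pulling the answers together, here is a hedged sketch of the corrected script: the missing 'r' restored, one consistent variable name throughout, and the sample IDs from the question kept as placeholders:

import flickrapi

api_key = '1a4c975fa83048436a2086bcab7d2290'
api_secret = '5e069eae20e60297'

# FlickrAPI, not FlickAPI; use a single variable name for the client.
flickr = flickrapi.FlickrAPI(api_key, api_secret)

favourites = flickr.favorites_getPublicList(user_id='73509078#N00')
photos = flickr.photos_search(user_id='73509078#N00', per_page='10')
sets = flickr.photosets_getList(user_id='73509078#N00')

for photo in favourites.photos[0].photo:
    print photo['title']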
I set up the required environment for Google Cloud Storage according to the manual.
I have installed gsutil and set up all the paths.
gsutil works perfectly; however, when I try to run the code below,
#!/usr/bin/python
import StringIO
import os
import shutil
import tempfile
import time
from oauth2_plugin import oauth2_plugin
import boto

# URI scheme for Google Cloud Storage.
GOOGLE_STORAGE = 'gs'
# URI scheme for accessing local files.
LOCAL_FILE = 'file'

uri = boto.storage_uri('sangin2', GOOGLE_STORAGE)
try:
    uri.create_bucket()
    print "done!"
except boto.exception.StorageCreateError, e:
    print "failed"
It gives "403 Access denied" error.
Traceback (most recent call last):
  File "/Volumes/WingIDE-101-4.0.0/WingIDE.app/Contents/MacOS/src/debug/tserver/_sandbox.py", line 23, in <module>
  File "/Users/lsangin/gsutil/boto/boto/storage_uri.py", line 349, in create_bucket
    return conn.create_bucket(self.bucket_name, headers, location, policy)
  File "/Users/lsangin/gsutil/boto/boto/gs/connection.py", line 91, in create_bucket
    response.status, response.reason, body)
boto.exception.GSResponseError: GSResponseError: 403 Forbidden
<?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message></Error>
Since I am new to this, it is kinda hard for me to figure out why.
Can someone help me?
Thank you.
The boto library should automatically find and use your $HOME/.boto file. One thing to check: make sure the project you're using is set as your default project for legacy access (at the API console, click on "Storage Access" and verify that it says "This is your default project for legacy access"). When I have that set incorrectly and follow the create-bucket example you referenced, I also get a 403 error. However, it doesn't make sense that this would work for you in gsutil but not with direct use of boto.
Try adding "debug=2" when you instantiate the storage_uri object, like this:
uri = boto.storage_uri(name, GOOGLE_STORAGE, debug=2)
That will generate some additional debugging information on stdout, which you can then compare with the debug output from an analogous, working gsutil example (via gsutil -D mb).
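For reference, a hedged sketch of the relevant $HOME/.boto sections from that era (all values are placeholders; gsutil config normally writes this file for you):

[Credentials]
# Legacy-access (HMAC) keys from the API console's "Storage Access" tab.
gs_access_key_id = GOOG0123456789ABCDEF
gs_secret_access_key = your-secret-key-here

[GSUtil]
default_project_id = 123456789012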
Hey guys, I am a little lost on how to get the auth token. Here is the code I am using on the return from authorizing my app:
import logging
import gdata.auth
import gdata.service
import gdata.alt.appengine

# This runs inside a webapp RequestHandler's get(), hence self.request.uri.
client = gdata.service.GDataService()
gdata.alt.appengine.run_on_appengine(client)
sessionToken = gdata.auth.extract_auth_sub_token_from_url(self.request.uri)
client.UpgradeToSessionToken(sessionToken)
logging.info(client.GetAuthSubToken())
what gets logged is "None", so that does not seem right :-(
if I use this:
temp = client.upgrade_to_session_token(sessionToken)
logging.info(dump(temp))
I get this:
{'scopes': ['http://www.google.com/calendar/feeds/'], 'auth_header': 'AuthSub token=CNKe7drpFRDzp8uVARjD-s-wAg'}
so I can see that I am getting an AuthSub token, and I guess I could just parse that and grab the token, but that doesn't seem like the way things should work.
If I try to use AuthSubTokenInfo I get this:
Traceback (most recent call last):
  File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/webapp/__init__.py", line 507, in __call__
    handler.get(*groups)
  File "controllers/indexController.py", line 47, in get
    logging.info(client.AuthSubTokenInfo())
  File "/Users/matthusby/Dropbox/appengine/projects/FBCal/gdata/service.py", line 938, in AuthSubTokenInfo
    token = self.token_store.find_token(scopes[0])
TypeError: 'NoneType' object is unsubscriptable
So it looks like my token_store is not getting filled in correctly. Is that something I should be doing myself?
Also I am using gdata 2.0.9
Thanks
Matt
To answer my own question:
When you get the token, just call:
client.token_store.add_token(sessionToken)
and App Engine will store it in a new entity type for you. Then, when making calls to the calendar service, just don't set the authsubtoken, as it will take care of that for you too.
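To illustrate that last point, a hedged sketch of a later request (the calendar calls are the standard gdata 2.x client API, not code from the question):

import logging
import gdata.calendar.service
import gdata.alt.appengine

# On a subsequent request: build the service and re-attach the App Engine
# token store; there is no need to pass authsubtoken explicitly.
calendar_client = gdata.calendar.service.CalendarService()
gdata.alt.appengine.run_on_appengine(calendar_client)

# The stored session token is looked up from the datastore-backed token_store.
feed = calendar_client.GetCalendarListFeed()
for entry in feed.entry:
    logging.info(entry.title.text)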