I am trying to access Google Drive using the Drive API version 3 (Python). Listing files works fine, but I get an Insufficient Permission error when I try to upload a file.
I changed my scope to give my script full permission:
SCOPES = 'https://www.googleapis.com/auth/drive'
Below is the block that I use to create the file:
file_metadata = {
    'name': 'Contents.pdf',
    'mimeType': 'application/vnd.google-apps.file'
}
media = MediaFileUpload('Contents.pdf',
                        mimetype='application/vnd.google-apps.file',
                        resumable=True)
file = service.files().create(body=file_metadata,
                              media_body=media,
                              fields='id').execute()
print('File ID: %s' % file.get('id'))
I get this error message:
ResumableUploadError: HttpError 403 "Insufficient Permission"
I am not sure what is wrong here.
I think that your script works fine. From the error you show, I suspect the access token and refresh token need to be reauthorized. So please try the following flow.
When you authorize using client_secret.json, a credential JSON file is created. With the default Quickstart, it is created in the .credentials directory of your home directory.
For your current situation, please delete the current credential JSON file (the one that is not client_secret.json) and reauthorize by launching your script. The default file name from the Quickstart is drive-python-quickstart.json.
By doing this, the https://www.googleapis.com/auth/drive scope is reflected in the access token and refresh token, and they are used for the upload. If the error occurs even after this flow, please confirm again that the Drive API is enabled in the API console.
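A minimal sketch of that flow, assuming the oauth2client-based Quickstart (the paths and file names below are the Quickstart defaults):
import os
from oauth2client import client, tools
from oauth2client.file import Storage

SCOPES = 'https://www.googleapis.com/auth/drive'
credential_path = os.path.join(os.path.expanduser('~'),
                               '.credentials', 'drive-python-quickstart.json')

# Delete the stale credential so the new scope is picked up on reauthorization.
if os.path.exists(credential_path):
    os.remove(credential_path)

store = Storage(credential_path)
flow = client.flow_from_clientsecrets('client_secret.json', SCOPES)
credentials = tools.run_flow(flow, store)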
If this was not useful for you, I'm sorry.
Maybe you already have a file with the same name there?
I am trying to create a CSV file in a Google Cloud Storage bucket using Python webapp2, with the code below:
full_filename = '/' + TEST_BUCKET + "/" + DATA + "/" + 'employee.csv'
logging.info("full_filename is %s", full_filename)
gcs_file = cloudstorage.open(full_filename,
                             'w',
                             content_type='text/plain',
                             retry_params=cloudstorage.RetryParams(backoff_factor=1.1))
gcs_file.write(file_obj.getvalue())
gcs_file.close()
logging.info("done writing into cloud storage !!")
The file is created successfully, and developers who are part of the GAE console can see its contents.
But employees who are not part of the GAE console can't see it and get 403 Forbidden.
The idea is that employees who are part of the same org (let's take example.com as the Google Workspace domain) should be able to access this file regardless of whether they are part of the GAE console or not.
So for that I tried bucket-level permissions (uniform access control) and added example.com as a new principal with the role Storage Legacy Bucket Reader, but they are still getting the same 403 Forbidden.
Resources:
https://cloud.google.com/iam/docs/overview#g_suite_domain
https://cloud.google.com/storage/docs/access-control
GSuite Permissions on Google Cloud Storage
https://cloud.google.com/storage/docs/access-control/lists
This error (403) indicates that the user was not authorized by Google Cloud Storage to make the request.
The various possible causes for this error are listed in the Google Cloud Storage error documentation for 403-Forbidden.
A common source of this error is that the bucket permissions (bucket ACL) are not set properly to allow your app access.
Since you mentioned that the developers who are a part of GAE are able to access the bucket contents, we can rule out the ACL scenario mentioned above.
However, you may try out the following:
Add the domain users to a Group.
In the Google Cloud Platform Console go to "Storage -> Browser", and on your bucket open the menu on the right and select "Edit bucket permissions".
Under "Add members", enter the group and assign the role "Storage -> Storage Object Viewer" to give the whole group read-only access when authenticated, or any other permission combination you need.
Alternatively, you may have a look at this documentation for more details.
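The same group binding can also be added programmatically. A minimal sketch using the google-cloud-storage client (the bucket name and group address are placeholders):
from google.cloud import storage

BUCKET_NAME = "my-bucket"               # placeholder
GROUP = "group:employees@example.com"   # placeholder

client = storage.Client()
bucket = client.bucket(BUCKET_NAME)

# Append a read-only binding for the whole group and save the policy back.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectViewer",
    "members": {GROUP},
})
bucket.set_iam_policy(policy)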
blob.upload_from_filename(source) gives the error:
raise exceptions.from_http_status(response.status_code, message, response=response)
google.api_core.exceptions.Forbidden: 403 POST https://www.googleapis.com/upload/storage/v1/b/bucket1-newsdata-bluetechsoft/o?uploadType=multipart: ('Request failed with status code', 403, 'Expected one of', )
I am following the example of google cloud written in python here!
from google.cloud import storage

def upload_blob(bucket, source, des):
    # Note: this builds a client from the service-account key, then shadows it
    # with a second client using default credentials; only the second is used.
    client = storage.Client.from_service_account_json('/path')
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket)
    blob = bucket.blob(des)
    blob.upload_from_filename(source)
I used gsutil to upload files, which is working fine.
Listing the bucket names using a Python script also works fine.
I have the necessary permissions and GOOGLE_APPLICATION_CREDENTIALS set.
This whole thing wasn't working because the service account I am using in GCP didn't have the Storage Admin permission.
Granting Storage Admin to my service account solved my problem.
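For reference, a minimal sketch of the corrected helper, assuming the service-account key (now granted Storage Admin) is the credential you actually want to use; the key path is a placeholder:
from google.cloud import storage

def upload_blob(bucket_name, source, des):
    # Build the client from the service-account key and use it consistently,
    # rather than creating a second default-credentials client.
    client = storage.Client.from_service_account_json('/path/to/key.json')  # placeholder
    bucket = client.get_bucket(bucket_name)
    bucket.blob(des).upload_from_filename(source)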
As other answers have indicated, this is related to permissions. I have found the following command to be a useful way to create default application credentials for the currently logged-in user.
Assuming you got this error while running the code on some machine, the following steps should be sufficient:
SSH into the VM where the code is running or will be running. Make sure you are logged in as a user who has permission to upload to Google Storage.
Run the following command:
gcloud auth application-default login
The command above will ask you to create a token by clicking on a URL. Generate the token and paste it into the SSH console.
That's it. Any Python application started as that user will use this as the default credential for storage bucket interactions.
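For example, once that command has run, a client created without an explicit key picks up the application-default credentials automatically. A minimal sketch (the bucket and file names are placeholders):
from google.cloud import storage

# No key file needed: the client resolves application-default credentials.
client = storage.Client()
bucket = client.get_bucket("my-bucket")                      # placeholder
bucket.blob("remote.txt").upload_from_filename("local.txt")  # placeholder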
Happy GCP'ing :)
This question is more appropriate for a support case.
As you are getting a 403, most likely you are missing a permission in IAM; the Google Cloud Platform support team will be able to inspect your resources and configurations.
This is what worked for me when the Google documentation didn't. I was getting the same error even with the appropriate permissions.
import pathlib
import google.cloud.storage as gcs

client = gcs.Client()

# Set target file to write to.
target = pathlib.Path("local_file.txt")
# Set file to download.
FULL_FILE_PATH = "gs://bucket_name/folder_name/file_name.txt"
# Open file stream with write permissions.
with target.open(mode="wb") as downloaded_file:
    # Download and write the file locally.
    client.download_blob_to_file(FULL_FILE_PATH, downloaded_file)
I've built the following script:
import boto
import sys
import gcs_oauth2_boto_plugin

def check_size_lzo(ds):
    CLIENT_ID = 'myclientid'
    CLIENT_SECRET = 'mysecret'
    # URI scheme for Cloud Storage.
    GOOGLE_STORAGE = 'gs'
    dir_file = 'date_id={ds}/apollo_export_{ds}.lzo'.format(ds=ds)
    gcs_oauth2_boto_plugin.SetFallbackClientIdAndSecret(CLIENT_ID, CLIENT_SECRET)
    uri = boto.storage_uri('my_bucket/data/apollo/prod/' + dir_file, GOOGLE_STORAGE)
    key = uri.get_key()
    if key.size < 45379959:
        raise ValueError('umg lzo file is too small, investigate')
    else:
        print('umg lzo file is %sMB' % round((key.size / 1e6), 2))

if __name__ == "__main__":
    check_size_lzo(sys.argv[1])
It works fine locally but when I try and run on kubernetes cluster I get the following error:
boto.exception.GSResponseError: GSResponseError: 403 Access denied to 'gs://my_bucket/data/apollo/prod/date_id=20180628/apollo_export_20180628.lzo'
I have updated the .boto file on my cluster and added my OAuth client ID and secret, but I am still having the same issue.
Would really appreciate help resolving this issue.
Many thanks!
If it works in one environment and fails in another, I assume that you're getting your auth from a .boto file (or possibly from the OAUTH2_CLIENT_ID environment variable), but your kubernetes instance is lacking such a file. That you got a 403 instead of a 401 says that your remote server is correctly authenticating as somebody, but that somebody is not authorized to access the object, so presumably you're making the call as a different user.
Unless you've changed something, I'm guessing that you're getting the default Kubernetes Engine auth, which means a service account associated with your project. That service account probably hasn't been granted read permission for your object, which is why you're getting a 403. Grant it read/write permission for your GCS resources, and that should solve the problem.
Also note that by default the default credentials aren't scoped to include GCS, so you'll need to add that as well and then restart the instance.
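To check which identity and scopes the cluster's default credentials actually carry, you can query the metadata server from a pod. A minimal sketch (assumes the requests package is available):
import requests

# The GCE/GKE metadata server reports the node's default service account.
META = ('http://metadata.google.internal/computeMetadata/v1/'
        'instance/service-accounts/default/')
headers = {'Metadata-Flavor': 'Google'}
print(requests.get(META + 'email', headers=headers).text)   # who you run as
print(requests.get(META + 'scopes', headers=headers).text)  # granted scopes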
I am trying to run watch() on my inbox and send it to a pub/sub.
However, I keep getting this error:
googleapiclient.errors.HttpError: <HttpError 400 when requesting https://www.googleapis.com/gmail/v1/users/me.com/watch?alt=json returned "Invalid topicName does not match projects/western-oarlock/topics/*">
The code I am sending is:
request = {
    'labelIds': ['INBOX'],
    'topicName': 'projects/flask-app/topics/myTopic'
}
service.users().watch(userId='me', body=request).execute()
Why is it attempting to contact western-oarlock instead of flask-app?
Check if the access token you are using is the right one.
Check if the .p12 key you are using is from the same project, or try using a new key.
I had the same problem; in my case the cause was the access token I used for Google Cloud API OAuth2 authentication, which was generated with the wrong service account. However, I've also read somewhere on the Internet that a wrong .p12 key can cause this issue.
It ended up having to do with the JSON Secrets file. I was authenticating on the wrong project.
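A quick way to confirm which project a key belongs to is to read the fields straight out of the JSON secrets file. A minimal sketch (the file name is a placeholder):
import json

# The project_id here must match the project that owns the Pub/Sub topic.
with open('service_account.json') as f:  # placeholder file name
    info = json.load(f)
print(info.get('project_id'), info.get('client_email'))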
We are using Google Drive API in our Google App Engine application.
This weekend we noticed that it has problems updating a spreadsheet's title. We are getting the following error:
HttpError: <HttpError 403 when requesting https://www.googleapis.com/drive/v2/files/1_X51WMK0U12rfPKc2x60E_EuyqtQ8koW-NSRZq7Eqdw?quotaUser=5660071165952000&fields=title&alt=json returned "The authenticated user has not granted the app 593604285024 write access to the file 1_X51WMK0U12rfPKc2x60E_EuyqtQ8koW-NSRZq7Eqdw">
Other calls to the Google Drive API succeed; we only have a problem with this one. This functionality also worked properly for a long time. Is it possible that some update on Google's side has broken it?
The minimal code to reproduce the issue is:
class TestDriveUpdate(webapp2.RequestHandler):
    def get(self):
        credentials = StorageByKeyName(Credentials,
                                       '103005000283606027776',
                                       'credentials').get()
        spreadsheet_key = '1_X51WMK0U12rfPKc2x60E_EuyqtQ8koW-NSRZq7Eqdw'
        quota_user = '5660071165952000'
        body = {"title": 'Test'}
        fields = "title"
        http = httplib2.Http(timeout=60)
        credentials.authorize(http)
        gdrive = apiclient.discovery.build('drive', 'v2', http=http)
        response = gdrive.files().update(
            fileId=spreadsheet_key,
            body=body,
            fields=fields,
            quotaUser=quota_user
        ).execute()
        self.response.write("OK")
Based on this documentation, the error occurs when the requesting app is not on the ACL for the file and the user never explicitly opened the file with this Drive app. This SO question states that the scope strings must match exactly between your code and the Admin Console, including trailing slashes, etc. Also make sure that Drive apps are allowed on the domain ("Allow users to install Google Drive apps").
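To verify which scopes the stored credential actually carries, something like this sketch should work (assuming the oauth2client credentials from the handler above; retrieve_scopes queries Google's tokeninfo endpoint):
import httplib2

credentials = StorageByKeyName(Credentials,
                               '103005000283606027776',
                               'credentials').get()
# The result must include a scope granting write access,
# e.g. https://www.googleapis.com/auth/drive.
print(credentials.retrieve_scopes(httplib2.Http()))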