Google Documents List API v3 (Python): Update document

I'm trying to update a document on Google Docs/Drive after creating it, using the GData helpers for Python.
The new version of the API lacks Python documentation.
client = gdata.docs.client.DocsClient(source=PluginConfig.APP_NAME)
client.http_client.debug = PluginConfig.DEBUG
client.client_login(
    PluginConfig.EMAIL,
    PluginConfig.PASSWORD,
    source=PluginConfig.APP_NAME,
    service=client.auth_service
)
[...]
# Upload the text file
ms = gdata.data.MediaSource()
ms.SetFileHandle(file_path, content_type)
doc = gdata.docs.data.Resource(type='document', title=title)
doc.description = gdata.docs.data.Description(description)
doc = client.CreateResource(doc, media=ms)
doc = client.UpdateResource(doc, media=ms, new_revision=True)
Login and document creation work fine, but UpdateResource() fails with 400 Bad Request:
Traceback (most recent call last):
File "coll.py", line 301, in <module>
main()
File "coll.py", line 293, in main
doc = client.UpdateResource(doc, media=ms, new_revision=True)
File "/usr/lib/python2.7/dist-packages/gdata/docs/client.py", line 344, in update_resource
uri_params=uri_params, **kwargs)
File "/usr/lib/python2.7/dist-packages/gdata/client.py", line 1151, in update_file
auth_token=auth_token, method='PUT')
File "/usr/lib/python2.7/dist-packages/gdata/client.py", line 1085, in upload_file
start_byte, self.file_handle.read(self.chunk_size))
File "/usr/lib/python2.7/dist-packages/gdata/client.py", line 1044, in upload_chunk
raise error
gdata.client.RequestError: Server responded with: 400, Invalid Request
More output: http://pastebin.com/LZL3qV0N
Any help is appreciated.

Try using the newer Drive API; its documentation includes Python samples in the reference guide, a Python quickstart, and a complete sample application written in Python on App Engine.
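For illustration, here is a minimal sketch of the equivalent create-then-update flow against the Drive API with google-api-python-client; the service-account key, file names, and MIME types below are assumptions made for the example, not values taken from the question:

from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload
from google.oauth2 import service_account

# Hypothetical credentials and file names, purely for illustration.
creds = service_account.Credentials.from_service_account_file(
    'service-account.json',
    scopes=['https://www.googleapis.com/auth/drive'])
service = build('drive', 'v3', credentials=creds)

# Create the document once, converting the uploaded text file to a Google Doc.
media = MediaFileUpload('report.txt', mimetype='text/plain', resumable=True)
created = service.files().create(
    body={'name': 'My report',
          'mimeType': 'application/vnd.google-apps.document'},
    media_body=media,
    fields='id').execute()

# On later runs, push new content to the same file ID instead of creating again.
media = MediaFileUpload('report.txt', mimetype='text/plain', resumable=True)
updated = service.files().update(
    fileId=created['id'],
    media_body=media).execute()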

Related

How to use the official Google Translate API on a local runtime via Python

After reading the official Google Translate API documentation, which provides the following sample code:
from google.cloud import translate

def translate_text(text="Hello, world!", project_id="weighty-site-333613"):
    client = translate.TranslationServiceClient().from_service_account_json('key.json')
    location = "global"
    parent = f"projects/{project_id}/locations/{location}"
    response = client.translate_text(
        request={
            "parent": parent,
            "contents": [text],
            "mime_type": "text/plain",
            "source_language_code": "en-US",
            "target_language_code": "zh-CN",
        }
    )
    for translation in response.translations:
        print("Translated text: {}".format(translation.translated_text))

translate_text()
This code worked properly in the Google Cloud terminal.
However, even though I put the "key.json" file in the same folder, an error like this is shown:
/usr/local/bin/python3.6 /Users/jiajunmao/Documents/GitHub/translator_of_excel/google_trans.py
Traceback (most recent call last):
File "/Users/jiajunmao/Documents/GitHub/translator_of_excel/google_trans.py", line 37, in <module>
translate_text()
File "/Users/jiajunmao/Documents/GitHub/translator_of_excel/google_trans.py", line 22, in translate_text
client = translate.TranslationServiceClient().from_service_account_json('key.json')
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/cloud/translate_v3/services/translation_service/client.py", line 354, in __init__
always_use_jwt_access=True,
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/cloud/translate_v3/services/translation_service/transports/grpc.py", line 158, in __init__
always_use_jwt_access=always_use_jwt_access,
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/cloud/translate_v3/services/translation_service/transports/base.py", line 110, in __init__
**scopes_kwargs, quota_project_id=quota_project_id
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/auth/_default.py", line 488, in default
raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
Process finished with exit code 1
Can someone tell me what I should do at this step? Thank you so much.
You need a service account JSON key file with the correct permissions, created in GCP under IAM & Service Accounts.
Then you need to point your environment at it before running the script:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/your/service_account.json"
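For example, a minimal sketch of two ways to wire this up in Python (the key path is an assumption); note that from_service_account_json is a classmethod, so calling it on the class rather than on an already-constructed instance avoids the DefaultCredentialsError shown in the traceback above:

import os
from google.cloud import translate

# Option 1: point Application Default Credentials at the key file
# before any client is constructed (example path).
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/key.json"
client = translate.TranslationServiceClient()

# Option 2: load the key file directly. This is a classmethod, so it is
# called on the class; constructing an instance first (as in the question)
# is what triggers the credentials lookup that fails.
client = translate.TranslationServiceClient.from_service_account_json("key.json")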

Error while calling a Google API from the Python command line

I am very new to Python. I am trying to call the Google Monitoring API through the Python command line from Google Cloud Shell. I am using Python 2.7. Before that, I had already installed the client library:
pip install --upgrade google-api-python-client
This is the code for calling the Google Monitoring API:
from googleapiclient.discovery import build

service = build('monitoring', 'v3', cache_discovery=True)
timeseries = service.projects().timeSeries().list(
    name="my-project",
    filter='metric.type="compute.googleapis.com/instance/cpu/utilization"',
    aggregation_alignmentPeriod="86400s",
    # aggregation_crossSeriesReducer=api_args["crossSeriesReducer"],
    aggregation_perSeriesAligner="ALIGN_MEAN",
    aggregation_groupByFields="metric.labels.key",
    interval_endTime="2020-08-23T17:30:04.707Z",
    interval_startTime="2020-08-22T17:30:05.603000Z",
    pageSize=100
).execute()
I pass the appropriate values in the Python command line, and when I execute this line:
service = build('monitoring', 'v3', cache_discovery=True)
I get the following error:
>>>from googleapiclient.discovery import build
>>> service = build('monitoring', 'v3', cache_discovery=True)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/mugham/.local/lib/python2.7/site-packages/googleapiclient/_helpers.py", line 134, in positional_wrapper
return wrapped(*args, **kwargs)
File "/home/mugham/.local/lib/python2.7/site-packages/googleapiclient/discovery.py", line 282, in build
adc_key_path=adc_key_path,
File "/home/mugham/.local/lib/python2.7/site-packages/googleapiclient/_helpers.py", line 134, in positional_wrapper
return wrapped(*args, **kwargs)
File "/home/mugham/.local/lib/python2.7/site-packages/googleapiclient/discovery.py", line 489, in build_from_document
credentials = _auth.default_credentials()
File "/home/mugham/.local/lib/python2.7/site-packages/googleapiclient/_auth.py", line 44, in default_credentials
credentials, _ = google.auth.default()
File "/usr/local/lib/python2.7/dist-packages/google/auth/_default.py", line 338, in default
credentials, project_id = checker()
File "/usr/local/lib/python2.7/dist-packages/google/auth/_default.py", line 208, in _get_gae_credentials
project_id = app_engine.get_project_id()
File "/usr/local/lib/python2.7/dist-packages/google/auth/app_engine.py", line 77, in get_project_id
return app_identity.get_application_id()
File "/usr/lib/google-cloud-sdk/platform/google_appengine/google/appengine/api/app_identity/app_identity.py", line 455, in get_application_id
_, domain_name, display_app_id = _ParseFullAppId(full_app_id)
File "/usr/lib/google-cloud-sdk/platform/google_appengine/google/appengine/api/app_identity/app_identity.py", line 436, in _ParseFullAppId
psep = app_id.find(_PARTITION_SEPARATOR)
AttributeError: 'NoneType' object has no attribute 'find'
I have no idea why this is happening. Can anyone please help me? Thanks.
I had been doing this from Google Cloud Shell, so I thought it would work without setting GOOGLE_APPLICATION_CREDENTIALS, as it had for other APIs like Compute Engine. It turns out I needed to create a key for a service account that has the privilege to call the Monitoring API, with this command:
gcloud iam service-accounts keys create outputkey.json --iam-account=<service-account-email>
Once that was done, I exported the path to the key file in the GOOGLE_APPLICATION_CREDENTIALS environment variable and it worked.
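Alternatively, the key can be passed to build() explicitly instead of relying on the environment variable; a minimal sketch, assuming the key file created by the command above:

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Illustrative path: the key file created with the gcloud command above.
creds = service_account.Credentials.from_service_account_file(
    'outputkey.json',
    scopes=['https://www.googleapis.com/auth/monitoring.read'])

# Passing credentials directly means no ADC / metadata lookup is attempted.
service = build('monitoring', 'v3', credentials=creds, cache_discovery=False)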

Accessing a Google Cloud Storage bucket from Cloud Functions throws a 500 error

I'm trying to access a Google Cloud Storage bucket from a Cloud Functions (Python) instance and it's throwing a cryptic 500 error.
I have given the service account the Editor role too. It didn't make any difference.
I also checked whether any quota was being exceeded. The limits were not even close.
Can anyone help me find the cause of this error?
Here is the code:
from google.cloud import storage
import os
import base64

storage_client = storage.Client()

def init_analysis(event, context):
    print("event", event)
    pubsub_message = base64.b64decode(event['data']).decode('utf-8')
    print(pubsub_message)
    bucket_name = 'my-bucket'
    bucket = storage_client.get_bucket(bucket_name)
    blobs = bucket.list_blobs()
    for blob in blobs:
        print(blob.name)
Error:
Traceback (most recent call last):
  File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/credentials.py", line 99, in refresh
    service_account=self._service_account_email)
  File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/_metadata.py", line 208, in get_service_account_token
    'instance/service-accounts/{0}/token'.format(service_account))
  File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/_metadata.py", line 140, in get
    url, response.status, response.data), response)
google.auth.exceptions.TransportError: ("Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/my-project@appspot.gserviceaccount.com/token from the Google Compute Engine metadata service. Status: 500 Response:\nb'Could not fetch URI /computeMetadata/v1/instance/service-accounts/my-project@appspot.gserviceaccount.com/token\\n'", <google.auth.transport.requests._Response object at 0x2b0ef9edf438>)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 383, in run_background_function
    _function_handler.invoke_user_function(event_object)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 217, in invoke_user_function
    return call_user_function(request_or_event)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 214, in call_user_function
    event_context.Context(**request_or_event.context))
  File "/user_code/main.py", line 21, in init_analysis
    bucket = storage_client.get_bucket(bucket_name)
  File "/env/local/lib/python3.7/site-packages/google/cloud/storage/client.py", line 227, in get_bucket
    bucket.reload(client=self)
  File "/env/local/lib/python3.7/site-packages/google/cloud/storage/_helpers.py", line 130, in reload
    _target_object=self,
  File "/env/local/lib/python3.7/site-packages/google/cloud/_http.py", line 315, in api_request
    target_object=_target_object,
  File "/env/local/lib/python3.7/site-packages/google/cloud/_http.py", line 192, in _make_request
    return self._do_request(method, url, headers, data, target_object)
  File "/env/local/lib/python3.7/site-packages/google/cloud/_http.py", line 221, in _do_request
    return self.http.request(url=url, method=method, headers=headers, data=data)
  File "/env/local/lib/python3.7/site-packages/google/auth/transport/requests.py", line 205, in request
    self._auth_request, method, url, request_headers)
  File "/env/local/lib/python3.7/site-packages/google/auth/credentials.py", line 122, in before_request
    self.refresh(request)
  File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/credentials.py", line 102, in refresh
    six.raise_from(new_exc, caught_exc)
  File "<string>", line 3, in raise_from
google.auth.exceptions.RefreshError: ("Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/my-project@appspot.gserviceaccount.com/token from the Google Compute Engine metadata service. Status: 500 Response:\nb'Could not fetch URI /computeMetadata/v1/instance/service-accounts/my-project@appspot.gserviceaccount.com/token\\n'", <google.auth.transport.requests._Response object at 0x2b0ef9edf438>)
The error you are getting is because your Cloud Functions service account doesn't have the cloudfunctions.serviceAgent role. As the documentation says:
Authenticating as the runtime service account from inside your function may fail if you change the Cloud Functions service account's permissions.
However, I found that sometimes you cannot add this role back because it doesn't show up as an option. I have reported this issue to the Google Cloud Functions engineering team and they are working to solve it.
Nevertheless, you can add the role again using this gcloud command:
gcloud projects add-iam-policy-binding <project_name> --role=roles/cloudfunctions.serviceAgent --member=serviceAccount:service-<project_number>@gcf-admin-robot.iam.gserviceaccount.com
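As a stopgap while the role is missing (this is an assumption on my part, not part of the fix above), the function can also construct the storage client from a key file deployed alongside the code, so it never has to consult the Compute Engine metadata server:

from google.cloud import storage

# Hypothetical key file bundled with the function deployment.
storage_client = storage.Client.from_service_account_json('service-account.json')

# Same bucket access as in the question, now using the explicit credentials.
bucket = storage_client.get_bucket('my-bucket')
for blob in bucket.list_blobs():
    print(blob.name)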

Instagram API Python error

I'm trying to familiarize myself with the python-instagram library.
I've written this code:
from instagram.client import InstagramAPI
clientid = '***'
clientsecret = '***'
api = InstagramAPI(client_id=clientid, client_secret=clientsecret)
tag_name = raw_input("Write the word that you want")
filtered_media = api.tag_recent_media(count=20, max_id=1, tag_name=tag_name)
for media in filtered_media:
    print media.images['standard_resolution'].url
and I get the following error on the command line (I have a Mac):
Traceback (most recent call last):
File "test.py", line 7, in <module>
filtered_media = api.tag_recent_media(count=20, max_id=1, tag_name=tag_name)
File "build/bdist.macosx-10.6-intel/egg/instagram/bind.py", line 197, in _call
File "build/bdist.macosx-10.6-intel/egg/instagram/bind.py", line 189, in execute
File "build/bdist.macosx-10.6-intel/egg/instagram/bind.py", line 163, in _do_api_request
instagram.bind.InstagramAPIError: (400) OAuthAccessTokenException-The access_token provided is invalid.
Does anyone know what is happening? Thanks.
I would suggest you take a look at this sample app. It shows how to use the python-instagram library.
In your code, you are passing the client ID to InstagramAPI(), but as per their sample app the required argument is access_token.
Code snippet from sample_app.py:
api = client.InstagramAPI(access_token=access_token, client_secret=CONFIG['client_secret'])
tag_search, next_tag = api.tag_search(q="backclimateaction")
tag_recent_media, next = api.tag_recent_media(tag_name=tag_search[0].name)
To get tag_search results you must have an access_token. You can read this documentation to learn how to generate an access token for your app.
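Applied to the code in the question, a minimal sketch might look like this (the access_token value is a placeholder obtained through the OAuth flow, and the two-value return follows the sample snippet above):

from instagram.client import InstagramAPI

# Placeholder token obtained via the OAuth flow described in the docs.
access_token = '***'
client_secret = '***'

api = InstagramAPI(access_token=access_token, client_secret=client_secret)

tag_name = raw_input("Write the word that you want")
# tag_recent_media returns the media list plus a pagination cursor.
filtered_media, next_page = api.tag_recent_media(count=20, tag_name=tag_name)
for media in filtered_media:
    print media.images['standard_resolution'].url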

Duplicity backup to OneDrive: client error

I'm trying to back up files from my computer to OneDrive with duplicity.
I have installed all dependencies. When running duplicity, an auth link is generated which I must open in a browser; then, after granting permissions to the app, I paste the return URL back into duplicity.
I do all these steps, but duplicity returns:
Traceback (most recent call last):
File "/usr/bin/duplicity", line 1532, in <module>
with_tempdir(main)
File "/usr/bin/duplicity", line 1526, in with_tempdir
fn()
File "/usr/bin/duplicity", line 1364, in main
action = commandline.ProcessCommandLine(sys.argv[1:])
File "/usr/lib/python2.7/site-packages/duplicity/commandline.py", line 1116, in ProcessCommandLine
backup, local_pathname = set_backend(args[0], args[1])
File "/usr/lib/python2.7/site-packages/duplicity/commandline.py", line 1005, in set_backend
globals.backend = backend.get_backend(bend)
File "/usr/lib/python2.7/site-packages/duplicity/backend.py", line 223, in get_backend
obj = get_backend_object(url_string)
File "/usr/lib/python2.7/site-packages/duplicity/backend.py", line 209, in get_backend_object
return factory(pu)
File "/usr/lib/python2.7/site-packages/duplicity/backends/onedrivebackend.py", line 90, in __init__
self.initialize_oauth2_session()
File "/usr/lib/python2.7/site-packages/duplicity/backends/onedrivebackend.py", line 153, in initialize_oauth2_session
authorization_response=redirected_to)
File "/usr/lib/python2.7/site-packages/requests_oauthlib/oauth2_session.py", line 232, in fetch_token
self._client.parse_request_body_response(r.text, scope=self.scope)
File "/usr/lib/python2.7/site-packages/oauthlib/oauth2/rfc6749/clients/base.py", line 409, in parse_request_body_response
self.token = parse_token_response(body, scope=scope)
File "/usr/lib/python2.7/site-packages/oauthlib/oauth2/rfc6749/parameters.py", line 376, in parse_token_response
validate_token_parameters(params)
File "/usr/lib/python2.7/site-packages/oauthlib/oauth2/rfc6749/parameters.py", line 383, in validate_token_parameters
raise_from_error(params.get('error'), params)
File "/usr/lib/python2.7/site-packages/oauthlib/oauth2/rfc6749/errors.py", line 271, in raise_from_error
raise cls(**kwargs)
InvalidClientError: (invalid_client) The client does not exist. If you are the application developer, configure a new application through the application management site at https://manage.dev.live.com/.
It looks like there is no app with the ID that duplicity generates the auth link with.
But when I go to the link provided by duplicity, I see that "Duplicity is asking for permissions".
So should I register my own app and somehow provide its ID to duplicity? (I searched for how to do this but found nothing.) Or is it a duplicity bug?
All programmatic interaction with Windows Live requires a client ID,
which uniquely identifies your application to Windows Live. Your
application must include the client ID in every request that it sends
to the Messenger Connect API Service.
You have to register your application as shown in this official Windows Live tutorial:
https://msdn.microsoft.com/en-us/library/ff751474.aspx
Then pass your client ID to the application so it can authenticate with Windows Live at run time when calling the API.
You can use the code in
https://github.com/fkalis/bash-onedrive-upload
which also provides full support for uploading files larger than 100 MB.
