I'm trying to have a custom message be sent when adding permissions to a file. While the invitation gets sent, the custom message does not appear. This is my snippet; if someone can tell me what I am doing wrong, I'd appreciate it =)
(Python 3.7)
from googleapiclient.discovery import build

def set_permissions(file_id):
    permissions = {
        "type": "user",
        "role": "writer",
        "emailAddress": "my-email@domain.com",
        "sendNotificationEmail": True,
        "emailMessage": "some message with URL string"
    }
    service = build('drive', 'v3', credentials=Auth(), cache_discovery=False)
    request = service.permissions().create(
        fileId=file_id,
        body=permissions,
        fields='id'
    )
    return request.execute()
I've tried going through the docs here: https://developers.google.com/drive/api/v3/reference/permissions/create but don't see any typos in the field names or anything.
You want to create a permission using the Drive API v3.
You want to add emailMessage.
You want to achieve this using googleapis with Python.
You have already been able to use the Drive API.
In the Drive API v3, sendNotificationEmail and emailMessage are query parameters of permissions.create, not fields of the permission resource, so they are ignored when placed in the request body.
Modified script:
Please modify your script as follows.
def set_permissions(file_id):
    permissions = {  # Modified
        "type": "user",
        "role": "writer",
        "emailAddress": "my-email@domain.com",
    }
    service = build('drive', 'v3', credentials=Auth(), cache_discovery=False)
    request = service.permissions().create(
        fileId=file_id,
        body=permissions,
        fields='id',
        sendNotificationEmail=True,  # Added
        emailMessage="some message with URL string"  # Added
    )
    return request.execute()
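As a quick usage sketch (FILE_ID below is a placeholder for a real file ID that the authorized account can share):
FILE_ID = 'your-file-id'  # hypothetical file ID
result = set_permissions(FILE_ID)
print(result)  # with fields='id', this is e.g. {'id': '...'}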
References:
Permissions: create
Drive API . permissions
I am trying to use pyrebase to download a file from Firebase Storage. This file is located in customers/BoomMZrVfrOOOYa4JVibOJQoroT2/example.txt, and I'm using this code to download it:
import pyrebase

firebaseConfig = {
    'apiKey': "example",
    'authDomain': "example",
    'databaseURL': "example",
    'projectId': "example",
    'storageBucket': "example",
    'messagingSenderId': "example",
    'appId': "example",
    'measurementId': "example"
}

firebase = pyrebase.initialize_app(firebaseConfig)
db = firebase.database()
auth = firebase.auth()

email = 'example@gmail.com'
password = 'password123'
login = auth.sign_in_with_email_and_password(email, password)

storage = firebase.storage()
storage.child('customers/DNOgkWW8yRUaEZXko2S2EPEIiBR2/k.txt').download(
    path="C:\\Users\\username\\Desktop\\test", filename='k.txt')
This works when I allow read and write access to anyone in Firebase Storage. I'm using these Firebase Storage rules:
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /customers/{userId}/{allPaths=**} {
      allow read: if request.auth.uid == userId;
    }
  }
}
This only allows users to download files from a folder named with their user ID. The problem is that I cannot download files from Firebase Storage if my rules require authentication; if authentication is not required, my code works.
EDIT:
I have changed my Firebase rules to allow all authenticated users to read and write, so that this is easier to debug. Here are my new rules:
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /customers/{userId}/{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}
I have also edited the code to a minimal example as was requested.
The only information you can pass to the request is the path. Since you want to secure access on the UID of the user, that UID has to be part of the path to the file.
So you will have to:
Ensure that the user is signed in with Firebase Authentication
Pass the UID of the currently signed-in user in the path of the request
Since you hard-coded the UID of the user, I assume the user may not be signed in or you may be passing a UID value that doesn't match the current user.
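A rough sketch of that flow with pyrebase (assuming pyrebase4, whose download() takes a token keyword; passing the Auth ID token there is an assumption, and localId/idToken are the fields the Firebase Auth REST API returns on sign-in):
import pyrebase

firebase = pyrebase.initialize_app(firebaseConfig)  # same config as in the question
auth = firebase.auth()
storage = firebase.storage()

# 1. Ensure the user is signed in with Firebase Authentication.
login = auth.sign_in_with_email_and_password(email, password)
uid = login['localId']       # UID of the currently signed-in user
id_token = login['idToken']  # proof of authentication

# 2. Pass the UID of the signed-in user in the path, so it matches
#    the {userId} wildcard in the storage rules.
storage.child(f'customers/{uid}/k.txt').download(
    path="C:\\Users\\username\\Desktop\\test",
    filename='k.txt',
    token=id_token)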
Problem:
I need to get a list of certificates of apps registered under Azure AD and renew the ones which are expiring.
I was able to get the app details through the Microsoft Graph API > applications endpoint. But the issue is that the bearer token expires every hour; since I want this task to be automated, I always need to create a fresh token.
I found some references to the Azure SDK for identity-based authentication, but the package function returns a credential, not a token (the bearer token to be used inside the REST API Authorization header).
Code:
from azure.identity import DefaultAzureCredential
default_credential = DefaultAzureCredential()
References:
Azure api or sdk to get list of app registrations and the certificates associated with them
OK, after a lot of debugging and surfing the internet, I was able to find the REST API way to get the bearer token.
import requests

tenant_id = "add your tenant id"

data = {
    "client_id": "add your client id",
    "scope": "add scope ex: User.Read Directory.Read.All",
    "grant_type": "password",  # don't modify this one since you are providing the password
    "username": "your username",
    "password": "your password",
    "client_secret": "client secret"
}
headers = {
    "Host": "login.microsoftonline.com",
    "Content-Type": "application/x-www-form-urlencoded"
}
response = requests.post(f'https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token',
                         data=data, headers=headers)
You will receive a JSON response containing the access token and related details.
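For example, the token can then go straight into the Authorization header (a sketch; the applications endpoint assumes the app was granted a Directory/Application read permission):
# Extract the bearer token and call Microsoft Graph with it.
access_token = response.json()["access_token"]
apps = requests.get("https://graph.microsoft.com/v1.0/applications",
                    headers={"Authorization": "Bearer " + access_token})
print(apps.json())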
Do remember to grant the permissions in the Azure portal > Azure AD > App registrations > your app > API permissions (grant consent).
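As an aside, the azure.identity credential from the question can also yield a bearer token directly: its get_token() method returns an object whose token attribute is the raw string (a sketch, assuming the environment supports one of DefaultAzureCredential's sign-in methods):
from azure.identity import DefaultAzureCredential

default_credential = DefaultAzureCredential()
# Request a token scoped to Microsoft Graph; .token is the bearer string.
access_token = default_credential.get_token(
    "https://graph.microsoft.com/.default").token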
: )
I am deploying a Google Cloud Function from another Cloud Function with Python. See my code below:
import requests
import json

def make_func(request):
    # Get the access token from the metadata server
    metadata_server_token_url = 'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token?scopes=https://www.googleapis.com/auth/cloud-platform'
    token_request_headers = {'Metadata-Flavor': 'Google'}
    token_response = requests.get(metadata_server_token_url, headers=token_request_headers)
    token_response_decoded = token_response.content.decode("utf-8")
    jwt = json.loads(token_response_decoded)['access_token']

    # Use the API to create the function
    response = requests.post(
        'https://cloudfunctions.googleapis.com/v1/projects/myproject/locations/us-central1/functions',
        json={"name": "projects/my-project/locations/us-central1/functions/funct",
              "runtime": "python37",
              "sourceArchiveUrl": "gs://bucket/main.zip",
              "entryPoint": "hello_world",
              "httpsTrigger": {}},
        headers={'Accept': 'application/json',
                 'Content-Type': 'application/json',
                 'Authorization': 'Bearer {}'.format(jwt)})

    if response:
        return 'Success! Function Created'
    else:
        return str(response.json())
However, this function does not have "allow unauthenticated" turned on automatically, so no requests from outside are allowed. How can I change my Python code to add this functionality when deploying the new function?
Thanks
You'll need to additionally give the allUsers member the Cloud Functions Invoker role:
from googleapiclient.discovery import build
service = build('cloudfunctions', 'v1')
project_id = ...
location_id = ...
function_id = ...
resource = f'projects/{project_id}/locations/{location_id}/functions/{function_id}'
set_iam_policy_request_body = {
    'policy': {
        "bindings": [
            {
                "role": "roles/cloudfunctions.invoker",
                "members": ["allUsers"],
            },
        ],
    },
}
request = service.projects().locations().functions().setIamPolicy(
    resource=resource,
    body=set_iam_policy_request_body,
)
response = request.execute()
print(response)
This uses the google-api-python-client package.
In addition to Dustin's answer, you should know that --allow-unauthenticated exists for developer convenience. Under the hood it performs 2 things:
Deploys your function in private mode
Adds allUsers as a member with the Cloud Functions Invoker role
gcloud functions add-iam-policy-binding --member=allUsers --role=roles/cloudfunctions.invoker function-1
So, indeed, use the google-cloud-iam library for doing this.
In addition, your current code doesn't work because you use an access token to reach the Cloud Function. With that header you get a 401 error: you present an Authorization header, but it isn't authorized. Without the header, you would get a 403 error: unauthenticated.
Anyway, you need a signed identity token. You can find a description and a Python code snippet here
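A minimal sketch of that, assuming the caller runs on GCP (so the metadata server is reachable) and that FUNCTION_URL is a placeholder for the private function's HTTPS trigger URL:
import requests

FUNCTION_URL = 'https://us-central1-myproject.cloudfunctions.net/funct'  # hypothetical

# Ask the metadata server for an identity token with the function URL as audience.
metadata_url = ('http://metadata.google.internal/computeMetadata/v1/'
                'instance/service-accounts/default/identity?audience=' + FUNCTION_URL)
id_token = requests.get(metadata_url, headers={'Metadata-Flavor': 'Google'}).text

# Invoke the private function with the identity token (not an access token).
response = requests.get(FUNCTION_URL,
                        headers={'Authorization': 'Bearer {}'.format(id_token)})
print(response.status_code)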
I need to get the instance list of my Google Cloud project. So I tried this:
requests.get('https://compute.googleapis.com/compute/v1/projects/clouddeployment-265711/zones/europe-west3-a/instances')
How do I get authorization in Python? This is the error I get:
{
  "error": {
    "code": 401,
    "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
    "errors": [
      {
        "message": "Login Required.",
        "domain": "global",
        "reason": "required",
        "location": "Authorization",
        "locationType": "header"
      }
    ],
    "status": "UNAUTHENTICATED"
  }
}
How do I get my "OAuth 2 access token" for my Google Cloud project?
Here is the full documentation on server-to-server authentication, which also includes sample code for every supported method.
In this GCP GitHub code you can see multiple ways of authenticating that you can choose from, depending on your use case.
For example with this code sample you can use a service account JSON key for authentication:
# [START auth_api_explicit]
def explicit(project):
    from google.oauth2 import service_account
    import googleapiclient.discovery

    # Construct service account credentials using the service account key file.
    credentials = service_account.Credentials.from_service_account_file(
        'service_account.json')

    # Explicitly pass the credentials to the client library.
    storage_client = googleapiclient.discovery.build(
        'storage', 'v1', credentials=credentials)

    # Make an authenticated API request
    buckets = storage_client.buckets().list(project=project).execute()
    print(buckets)
# [END auth_api_explicit]
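Usage is then simply (assuming service_account.json is in the working directory and the service account can list Cloud Storage buckets in the project):
explicit('your-project-id')  # hypothetical project ID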
UPDATE:
If what you want is simply getting the bearer token and storing it in a Python variable to make a simple GET request:
import subprocess

# os.system() only returns the exit code, so use subprocess to capture the output.
your_key = subprocess.check_output(
    ['gcloud', 'auth', 'print-access-token'], text=True).strip()
so your_key will now hold the bearer token that you need to include in your request header.
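Putting it together with the request from the question (a sketch; the project and zone are the ones shown above):
import subprocess
import requests

token = subprocess.check_output(
    ['gcloud', 'auth', 'print-access-token'], text=True).strip()

# Authenticate the instances.list call with the bearer token.
response = requests.get(
    'https://compute.googleapis.com/compute/v1/projects/clouddeployment-265711'
    '/zones/europe-west3-a/instances',
    headers={'Authorization': 'Bearer {}'.format(token)})
print(response.json())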
Otherwise, please read through this documentation which explains how to authenticate as an end-user.
When you create an instance A in Google Compute Engine, it gets the predefined "default" service account attached to it (this basically means that you can query Google APIs from A, authenticated as the 'default' service account).
What I'd like to do is set up a GCE instance with a service account that's different from the default one. This should be conceptually possible given the GCE API, but it fails with this exception:
{
  "name": "operation-1400060483459-4f958fbc7d7b9-cd817778-b80d1cad",
  "operationType": "insert",
  "status": "DONE",
  "user": "some_name@developer.gserviceaccount.com",
  "error": {
    "errors": [
      {
        "code": "SERVICE_ACCOUNT_ACCESS_DENIED",
        "message": "The user does not have access to service account 'some_name@developer.gserviceaccount.com'"
      }
    ]
  }
}
Here's my code in Python, which sets up the instance:
import httplib2
from apiclient import discovery
from oauth2client.client import SignedJwtAssertionCredentials

discovery_service = discovery.build(
    'compute',
    config['compute_api_version'],
    http=SignedJwtAssertionCredentials(
        service_account_name="some_name@developer.gserviceaccount.com",
        private_key=key_data,
        scope='https://www.googleapis.com/auth/compute')
    .authorize(httplib2.Http()))

instance = {}
# sets instance configuration details here
# ...
# ...
instance['serviceAccounts'] = [{
    'email': "some_name@developer.gserviceaccount.com",
    'scopes': ['https://www.googleapis.com/auth/devstorage.full_control',
               'https://www.googleapis.com/auth/compute',
               'https://www.googleapis.com/auth/userinfo.email']
}]

discovery_service.instances().insert(project=project, zone=zone, body=instance).execute()
The weirdest part is that the exception says "The user does not have access to service account 'some_name@developer.gserviceaccount.com'", but the "user" it refers to is 'some_name@developer.gserviceaccount.com' itself! That means 'some_name@developer.gserviceaccount.com' does not have access to 'some_name@developer.gserviceaccount.com', which makes no sense.
I believe you'll need to create a new service account to use the API from a non-GCE instance. The service account you're referencing works from within a GCE instance only.
To do that, go to the Cloud Console > Project > APIs & Auth > Credentials:
Create new Client ID
Service Account
Download the .p12 file and load that as the private key. (See the example below.)
You'll also need to create an instance from a boot disk, which is typically created from one of the GCE-supplied images.
Here's an example using JSON Web Tokens that worked for me. It was adapted from the docs located here: https://cloud.google.com/compute/docs/tutorials/python-guide#addinganinstance.
from apiclient import discovery
from oauth2client.file import Storage
from oauth2client.client import SignedJwtAssertionCredentials
import httplib2
import os.path

INSTANCE_NAME = 'my-instance'
API_VERSION = 'v1'
GCE_URL = 'https://www.googleapis.com/compute/%s/projects/' % (API_VERSION)
PROJECT_ID = '***'
SERVICE_ACCOUNT_CLIENT_ID = '***.apps.googleusercontent.com'
SERVICE_ACCOUNT_EMAIL_ADDRESS = '***@developer.gserviceaccount.com'
GCE_SCOPE = 'https://www.googleapis.com/auth/compute'
ZONE = 'us-central1-a'
DEFAULT_SERVICE_EMAIL = 'default'
DEFAULT_SCOPES = ['https://www.googleapis.com/auth/devstorage.full_control',
                  'https://www.googleapis.com/auth/compute']
SOURCE_IMAGE_URL = 'projects/ubuntu-os-cloud/global/images/ubuntu-1410-utopic-v20141217'

def main():
    # Load the service account's private key.
    f = file('private-key.p12', 'rb')
    oauth_key_data = f.read()
    f.close()

    http = httplib2.Http()
    oauth_storage = Storage('compute-creds.dat')
    oauth_credentials = oauth_storage.get()

    if oauth_credentials is None or oauth_credentials.invalid:
        oauth_credentials = SignedJwtAssertionCredentials(
            service_account_name=SERVICE_ACCOUNT_EMAIL_ADDRESS,
            private_key=oauth_key_data,
            scope=GCE_SCOPE)
        oauth_storage.put(oauth_credentials)
    else:
        oauth_credentials.refresh(http)

    http = oauth_credentials.authorize(http)
    gce_service = discovery.build('compute', 'v1', http=http)

    project_url = '%s%s' % (GCE_URL, PROJECT_ID)
    image_url = '%s%s/global/images/%s' % (
        GCE_URL, 'ubuntu-os-cloud', 'ubuntu-1410-utopic-v20141217')
    machine_type_url = '%s/zones/%s/machineTypes/%s' % (
        project_url, ZONE, 'n1-standard-1')
    network_url = '%s/global/networks/%s' % (project_url, 'default')

    instance = {
        'name': INSTANCE_NAME,
        'machineType': machine_type_url,
        'disks': [{
            'autoDelete': 'true',
            'boot': 'true',
            'type': 'PERSISTENT',
            'initializeParams': {
                'diskName': INSTANCE_NAME,
                'sourceImage': SOURCE_IMAGE_URL
            }
        }],
        'networkInterfaces': [{
            'accessConfigs': [{
                'type': 'ONE_TO_ONE_NAT',
                'name': 'External NAT'
            }],
            'network': network_url
        }],
        'serviceAccounts': [{
            'email': DEFAULT_SERVICE_EMAIL,
            'scopes': DEFAULT_SCOPES
        }]
    }

    # Create the instance
    request = gce_service.instances().insert(
        project=PROJECT_ID, body=instance, zone=ZONE)
    response = request.execute(http=http)
    response = _blocking_call(gce_service, http, response)
    print response

def _blocking_call(gce_service, auth_http, response):
    """Blocks until the operation status is done for the given operation."""
    status = response['status']
    while status != 'DONE' and response:
        operation_id = response['name']
        # Identify if this is a per-zone resource
        if 'zone' in response:
            zone_name = response['zone'].split('/')[-1]
            request = gce_service.zoneOperations().get(
                project=PROJECT_ID,
                operation=operation_id,
                zone=zone_name)
        else:
            request = gce_service.globalOperations().get(
                project=PROJECT_ID, operation=operation_id)
        response = request.execute(http=auth_http)
        if response:
            status = response['status']
    return response

main()
FYI: in GCE you usually get two default service accounts:
<number>-compute@developer.gserviceaccount.com
<number>@cloudservices.gserviceaccount.com
Note the different email suffix (developer.gserviceaccount.com vs. cloudservices.gserviceaccount.com). It appears that using your own service account, EVEN if it has the Owner role, does not grant you access to the <number>@cloudservices.gserviceaccount.com account, only to the 1st one (<number>-compute@developer.gserviceaccount.com).
In my case, I got the aforementioned error when trying to create an instance with my own service account while specifying that the instance should use the 2nd service account from above. Once I fixed the call to request that the instance use the 1st account, it worked.
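In code terms, the fix was roughly this (a sketch; <number> stands for your numeric project number):
# Request the 1st (Compute Engine) default service account for the new instance.
instance['serviceAccounts'] = [{
    'email': '<number>-compute@developer.gserviceaccount.com',
    'scopes': ['https://www.googleapis.com/auth/devstorage.full_control',
               'https://www.googleapis.com/auth/compute']
}]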