How to disable GCP services using Python (client libraries)?

I can disable GCP services via gcloud by using gcloud services disable storage.googleapis.com, but I need to achieve the same via the Python client libraries (reference). I searched but had no luck. For authentication I have a credentials.json file. Is there a way to do this? Can anyone suggest the code, or point me to a reference document or site?

I believe you referenced the wrong documentation; here is the Python client library documentation you should look at: https://googleapis.github.io/google-api-python-client/docs/dyn/serviceusage_v1.services.html#disable
Something similar to the snippet below should work (I haven't tested it):
from google.oauth2 import service_account
from googleapiclient import discovery

# Build the Service Usage client from the service account keyfile
credentials = service_account.Credentials.from_service_account_file('credentials.json')
client = discovery.build('serviceusage', 'v1', credentials=credentials)
svc_name = "projects/123/services/serviceusage.googleapis.com"
operation = client.services().disable(name=svc_name).execute()
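Note that disable returns a long-running operation rather than completing immediately. If you need to wait for it, a sketch like the following should work (also untested; I'm assuming the response carries the usual name and done fields of an Operation):
import time

# Poll the long-running operation until the API reports it as done
while not operation.get('done', False):
    time.sleep(1)
    operation = client.operations().get(name=operation['name']).execute()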

Related

How does the `gcloud` CLI use OAuth2 to authenticate a user?

I'm trying to mimic the flow for authenticating with GCP using the gcloud CLI in a Go project. In this case, I can't just shell out to gcloud using the os package because I have to assume it's not installed on the system. I also need to avoid having the user go in and set an OAuth2 client_id and client_secret in the developer console since the gcloud CLI doesn't seem to require it.
I was trying to look through the Python code for gcloud and I can sort of see what it's doing but it's a bit difficult to follow since I'm not super experienced in Python and it seems like there are many layers of abstraction in the authentication code that are a bit hard to break through.
I can see the GCP URLs (the ones that you open in the browser when doing gcloud auth login) have a client_id but they don't have a client_secret. They also look like they do PKCE but I'm not super familiar with how that works. I also can't figure out exactly where they get the client_id from. If it's somehow bundled with the gcloud CLI I can't find it anywhere.
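No answer was posted here, but for what it's worth, the PKCE part is standardized in RFC 7636: the client generates a random code verifier, sends its SHA-256 hash (the code challenge) with the authorization request, and later presents the verifier itself when exchanging the code, which is why no client_secret is needed. A minimal Python sketch of generating the verifier/challenge pair (the variable names are my own):
import base64
import hashlib
import secrets

# Random code verifier: 43-128 URL-safe characters per RFC 7636
code_verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b'=').decode()

# S256 code challenge derived from the verifier, sent with the auth request
digest = hashlib.sha256(code_verifier.encode('ascii')).digest()
code_challenge = base64.urlsafe_b64encode(digest).rstrip(b'=').decode()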

Connect to Azure sql in Python Using Service Principal

I have a service principal with a client id and client secret, and it has permissions on the Azure SQL DB. I want to use Python to generate an access token and use it to authenticate to my SQL server. Could someone guide me?
I am new to Python and would appreciate it if someone could specify whether I need any supporting libraries for this to work.
I want to use python to generate an access token.
The Azure Active Directory Authentication Library (ADAL) for Python can be used to access SQL Server in Azure.
Refer this GitHub repository to know more details and how to implement the same.
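As a sketch of the overall flow, here is roughly what it looks like with the newer azure-identity library instead of ADAL (which Microsoft has since deprecated); the tenant, client, server, and database values are placeholders:
import struct
import pyodbc
from azure.identity import ClientSecretCredential

# Acquire an access token for Azure SQL using the service principal
credential = ClientSecretCredential('tenant-id', 'client-id', 'client-secret')
token = credential.get_token('https://database.windows.net/.default').token

# Pack the token the way the ODBC driver expects: UTF-16-LE, length-prefixed
token_bytes = token.encode('utf-16-le')
token_struct = struct.pack('<i', len(token_bytes)) + token_bytes

SQL_COPT_SS_ACCESS_TOKEN = 1256  # connection attribute defined by the driver
conn = pyodbc.connect(
    'DRIVER={ODBC Driver 17 for SQL Server};'
    'SERVER=myserver.database.windows.net;DATABASE=mydb',
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct},
)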

Authenticate Google Cloud Storage Python client with gsutil-generated boto file

I'm trying to automate report downloading from Google Play (through Cloud Storage) using the GC Python client library. From the docs, I found that it's possible to do it using gsutil. I found this question has been answered here, but I also found that the Client infers credentials from the environment, and I plan to do this on an automation platform with (assumed) no gcloud credentials set.
I've found that you can generate a gsutil boto file and then use it as a credential, but how can I load this into the client library?
This is not exactly a direct answer to your question, but the best way would be to create a service account in GCP, and then use the service account's JSON keyfile to interact with GCS. See this documentation on how to generate said keyfile.
NOTE: You should treat this keyfile as a password as it will have the access you give it in the step below. So no uploading to public github repos for example.
You'll also have to give the service account the permission Storage Object Viewer, or one with more permissions.
NOTE: Always grant the least access needed, for security reasons.
The code for this is extremely simple. Note that this is extremely similar to the methods mentioned in the link for generating the keyfile, the exception being the way the client is instantiated.
requirements.txt
google-cloud-storage
code
from google.cloud import storage

# Instantiate the client directly from the service account's JSON keyfile
cred_json_file_path = 'path/to/file/credentials.json'
client = storage.Client.from_service_account_json(cred_json_file_path)
If you want to use the general Google API Python client library, you can use this library to do a similar instantiation of a credentials object from the JSON keyfile. For GCS, though, the google-cloud-storage library is very much preferred, as it does some magic behind the scenes; the API client library is a very generic one that can (theoretically) be used with all Google APIs.
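With the client instantiated, fetching a report comes down to a couple of calls; the bucket and object names below are placeholders for your own:
# Download a report object from the bucket to a local file
bucket = client.bucket('my-reports-bucket')
blob = bucket.blob('reports/report.csv')
blob.download_to_filename('report.csv')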
gsutil will look for a .boto file in the home directory of the user invoking it, i.e. ~/.boto on Linux and macOS, and in %HOMEDRIVE%%HOMEPATH% on Windows.
Alternately, you can set the BOTO_CONFIG environment variable to the path of the .boto file you want to use. Here's an example:
BOTO_CONFIG=/path/to/your_generated_boto_file.boto gsutil -m cp files gs://bucket
You can generate a .boto file with a service account by using the "-e" flag with the config command: gsutil config -e.
Also note that if gsutil is installed with the gcloud command, gcloud will share its authentication config with gsutil unless you disable that behavior with this command: gcloud config set pass_credentials_to_gsutil false.
https://cloud.google.com/storage/docs/boto-gsutil

Python client for Google Container Engine API

I'm working on a project where I need to create and manage clusters, pods, services, and deployments on Google Container Engine. I have googled a lot to find an API for that; Google's Container Engine REST API is available, but is there a Python client for that API? That's exactly what I need.
Help me, please!
Thanks in advance!
On this page you can find information about using Python, including installation of the client library, along with this description:
Google Container Engine API: The Google Container Engine API is used
for building and managing container based applications, powered by the
open source Kubernetes technology.
This page contains information about getting started with the Google
Container Engine API using the Google API Client Library for Python.
In addition, you may be interested in the following documentation.
More generally there is this page about Google APIs and Python libraries and a getting started using Python in GCE example on Github.
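To give a concrete starting point, the same google-api-python-client used elsewhere on this page can build a client for the Container Engine API via service discovery. A minimal, untested sketch (the project and zone are placeholders, and I'm assuming application default credentials are configured):
from oauth2client.client import GoogleCredentials
from googleapiclient import discovery

# Build the Container Engine API client and list clusters in one zone
credentials = GoogleCredentials.get_application_default()
service = discovery.build('container', 'v1', credentials=credentials)
clusters = service.projects().zones().clusters().list(
    projectId='my-project', zone='us-central1-a').execute()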

How to store data in GCS while accessing it from GAE and 'GCE' locally

There's a GAE project using GCS to store/retrieve files. These files also need to be read by code that will run on GCE (it needs C++ libraries, so it can't run on GAE).
In production, deployed on the actual GAE and GCE with the real GCS in between, this setup works fine.
However, testing and developing locally is a different story that I'm trying to figure out.
As recommended, I'm running GAE's dev_appserver with GoogleAppEngineCloudStorageClient to access the (simulated) GCS. Files are put in the local blobstore. Great for testing GAE.
Since there is no GCE SDK to run a VM locally, whenever I refer to the local 'GCE' it's just my local development machine running Linux.
On the local GCE side I'm just using the default boto library (https://developers.google.com/storage/docs/gspythonlibrary) with a python 2.x runtime to interface with the C++ code and retrieving files from the GCS. However, in development, these files are inaccessible from boto because they're stored in the dev_appserver's blobstore.
Is there a way to properly connect the local GAE and GCE to a local GCS?
For now, I gave up on the local GCS part and tried using the real GCS. The GCE part with boto is easy. The GAE part can also be pointed at the real GCS instead of the local blobstore by setting an access_token:
cloudstorage.common.set_access_token(access_token)
According to the docs:
access_token: you can get one by run 'gsutil -d ls' and copy the
str after 'Bearer'.
That token works for a limited amount of time, so that's not ideal. Is there a way to set a more permanent access_token?
There is a convenient option for accessing Google Cloud Storage from a development environment: use the client library provided with the Google Cloud SDK. After executing gcloud init locally, you get access to your resources.
As shown in examples to Client library authentication:
from oauth2client.client import GoogleCredentials
from googleapiclient import discovery

# Get the application default credentials. When running locally, these are
# available after running `gcloud init`. When running on Compute Engine,
# these are available from the environment.
credentials = GoogleCredentials.get_application_default()

# Construct the service object for interacting with the Cloud Storage API -
# the 'storage' service, at version 'v1'.
# You can browse other available API services and versions here:
# https://developers.google.com/api-client-library/python/apis/
service = discovery.build('storage', 'v1', credentials=credentials)
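With that service object, listing the objects in a bucket looks something like this (the bucket name is a placeholder):
# List objects in a bucket via the Cloud Storage JSON API
response = service.objects().list(bucket='my-shared-bucket').execute()
for obj in response.get('items', []):
    print(obj['name'])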
Google libraries come and go like tourists in a train station. Today (2020) google-cloud-storage should work on GCE and GAE Standard Environment with Python 3.
On GAE and GCE it picks up access credentials from the environment, and locally you can provide them with a service account JSON file like this:
GOOGLE_APPLICATION_CREDENTIALS=../sa-b0af54dea5e.json
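With that variable set, instantiating a client needs no explicit credential handling. A minimal sketch:
from google.cloud import storage

# Credentials come from GOOGLE_APPLICATION_CREDENTIALS locally,
# or from the metadata server when running on GCE/GAE
client = storage.Client()
for bucket in client.list_buckets():
    print(bucket.name)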
If you're always using "real" remote GCS, the newer gcloud is probably the best library: http://googlecloudplatform.github.io/gcloud-python/
It's really confusing how many storage client libraries there are for Python. Some are for AE only, but they often force (or at least default to) using the local mock Blobstore when running with dev_appserver.py.
Seems like gcloud is always using the real GCS, which is what I want.
It also "magically" fixes authentication when running locally.
It looks like appengine-gcs-client for Python is now only useful for production App Engine and inside dev_appserver.py, and the local examples for it have been removed from the developer docs in favor of Boto :( If you are deciding not to use the local GCS emulation, it's probably best to stick with Boto for both local testing and GCE.
If you still want to use 'google.appengine.ext.cloudstorage' though, access tokens always expire, so you'll need to refresh them manually. Given your setup, honestly the easiest thing to do is just call 'gsutil -d ls' from Python and parse the output to get a new token from your local credentials. You could use the API Client Library to get a token in a more 'correct' fashion, but at that point things would be getting so roundabout you might as well just be using Boto.
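A rough sketch of that parsing approach (fragile by nature, since it scrapes gsutil's debug output; the regex is my own guess at the header format):
import re
import subprocess

# Run gsutil in debug mode and scrape the Bearer token from its output
output = subprocess.check_output(['gsutil', '-d', 'ls'], stderr=subprocess.STDOUT)
match = re.search(r'Bearer\s+(\S+)', output.decode())
access_token = match.group(1) if match else None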
There is a Google Cloud Storage local / development server for this purpose: https://developers.google.com/datastore/docs/tools/devserver
Once you have set it up, create a dataset and start the GCS development server
gcd.sh create [options] <dataset-directory>
gcd.sh start [options] <dataset-directory>
Export the environment variables
export DATASTORE_HOST=http://yourmachine:8080
export DATASTORE_DATASET=<dataset_id>
Then you should be able to use the datastore connection in your code, locally.
