Need help retrieving Google Cloud SQL metadata and logs using Python

I am new to Google Cloud and would like to know if there is a way to retrieve Cloud SQL (MySQL) instance metadata and error logs using Python.
I installed the Google Cloud SDK and ran the following command, which returned detailed metadata such as IP, region, disk, etc.:
gcloud sql instances describe my-instance-id
I need to check this data periodically for compliance. I have worked on AWS, where I use the boto3 Python package for this kind of task. I googled for a boto3 equivalent on Google Cloud, but the docs for the Google API client are really confusing to me.
I also need to fetch the MySQL error logs from the Cloud SQL instance (for alerting in case any errors are found).
Can anyone show me how to perform these operations using the Google API client for Python, or point me in the right direction?

Here is sample code for retrieving the Cloud SQL MySQL error logs using the Cloud Logging API. For testing, I logged in with a wrong password to generate error logs.
The filter used is a sample filter from the Cloud Logging docs.
from google.cloud.logging import Client

projectName = 'your-project-here'
# Sample filter from the Cloud Logging docs: MySQL error-log entries for Cloud SQL.
myFilter = 'resource.type = "cloudsql_database" AND log_id("cloudsql.googleapis.com/mysql.err")'

client = Client(project=projectName)
entries = client.list_entries(filter_=myFilter)
for entry in entries:
    print(entry)
Output snippet:

Here's how to get SQL instance metadata:
import json
from googleapiclient import discovery
service = discovery.build('sqladmin', 'v1beta4')
req = service.instances().list(project="project-name")
resp = req.execute()
print(json.dumps(resp, indent=2))
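For checking a single instance (the equivalent of gcloud sql instances describe my-instance-id from the question), here is a minimal sketch along the same lines; the project and instance names are placeholders:
import json
from googleapiclient import discovery

# Build the SQL Admin API client using application default credentials.
service = discovery.build('sqladmin', 'v1beta4')
# Equivalent of `gcloud sql instances describe my-instance-id`.
req = service.instances().get(project="project-name", instance="my-instance-id")
resp = req.execute()
print(json.dumps(resp, indent=2))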
Credit to @AKX, found the answer at cloud.google.com/sql/docs/mysql/admin-api/libraries#python
No luck on the 2nd part though, i.e. retrieving the MySQL error logs.

Related

Firestore emulator with anonymous credentials

I am trying to get Firestore working in emulator-mode with Python on my local Linux PC. Is it possible to use anonymous credentials for this so I don't have to create new credentials?
I have tried two methods of getting access to the Firestore database from within a Python Notebook, after having run firebase emulators:start from the command-line:
First method:
from firebase_admin import credentials, firestore, initialize_app
project_id = 'my_project_id'
cred = credentials.ApplicationDefault()
initialize_app(cred, {'projectId': project_id})
db = firestore.client()
This raises the following exception:
DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
Second method:
from google.auth.credentials import AnonymousCredentials
from google.cloud.firestore import Client
cred = AnonymousCredentials()
db = Client(project=project_id, credentials=cred)
This works, but then I try to access a document that I manually inserted into the database using the web interface, with the following Python code:
doc = db.collection('my_collection').document('my_doc_id').get()
And then I get the following error, which perhaps indicates that the anonymous credentials don't work:
PermissionDenied: 403 Missing or insufficient permissions.
Thoughts
It is a surprisingly complicated system, and although I have read numerous doc pages, watched tutorial videos, etc., there seems to be an endless labyrinth of configurations that need to be set up in order for it to work. It would be nice if I could get Firestore working on my local PC with minimal effort, to see if the database would work for my application.
Thanks!
Method 2 works if an environment variable is set. In the following, change localhost:8080 to the Firestore server address shown when the emulator is started using firebase emulators:start
import os
os.environ['FIRESTORE_EMULATOR_HOST'] = 'localhost:8080'
I don't know how to make it work with Method 1 above using the firebase_admin Python package. Perhaps someone else knows.
Also note that the emulated Firestore database will discard all its data when the server is shut down. To persist and reuse the data start the emulator with a command like this:
firebase emulators:start --import=./emulator_data --export-on-exit
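Putting the two pieces together, here is a minimal sketch of Method 2 against the emulator, assuming the emulator is listening on localhost:8080 and that a document my_collection/my_doc_id exists:
import os

# Must be set before the client is created so it talks to the emulator.
os.environ['FIRESTORE_EMULATOR_HOST'] = 'localhost:8080'

from google.auth.credentials import AnonymousCredentials
from google.cloud.firestore import Client

db = Client(project='my_project_id', credentials=AnonymousCredentials())
doc = db.collection('my_collection').document('my_doc_id').get()
print(doc.to_dict() if doc.exists else 'document not found')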

Google Cloud Function succeeds, but not showing expected output

I am testing out Cloud Functions and I have things set up, but the output is not populating correctly (the output is not being saved to Cloud Storage and my print statements are not appearing in the logs). Here are my code and my requirements below. I have set up the Cloud Function as an HTTP request trigger type with unauthenticated invocations, and with a runtime service account that has write access to Cloud Storage. I have verified that I am calling the correct entry point.
logs
2022-03-22T18:52:02.749482564Z test-example vczj9p85h5m2 Function execution started
2022-03-22T18:52:04.148507183Z test-example vczj9p85h5m2 Function execution took 1399 ms. Finished with status code: 200
main.py
import requests
from google.cloud import storage
import json
def upload_to_gsc(data):
    print("saving to cloud storage")
    client = storage.Client(project="my-project-id")
    bucket = client.bucket("my-bucket-name")
    blob = bucket.blob("subfolder/name_of_file")
    blob.upload_from_string(data)
    print("data uploaded to cloud storage")

def get_pokemon(request):
    url = "https://pokeapi.co/api/v2/pokemon?limit=100&offset=200"
    data = requests.get(url).json()
    output = [i.get("name") for i in data["results"]]
    data = json.dumps(output)
    upload_to_gsc(data=data)
    print("saved data!")
requirements.txt
google-cloud-storage
requests==2.26.0
As @JackWotherspoon mentioned, be sure to double-check your project ID, bucket name and entry point if you have a case like I did. For myself, I recreated the Cloud Function, tested it, and it worked again.
As @dko512 mentioned in comments, the issue was resolved by recreating and redeploying the Cloud Function.
Posting the answer as community wiki for the benefit of the community that might encounter this use case in the future.
Feel free to edit this answer for additional information.
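As a side note (not the fix in this case, but worth keeping in mind): HTTP-triggered functions are expected to return a response. Here is a hedged sketch of the same handler with an explicit return, reusing the placeholder project, bucket and object names from the question:
import json
import requests
from google.cloud import storage

def get_pokemon(request):
    # Fetch the Pokemon names from the same endpoint as in the question.
    url = "https://pokeapi.co/api/v2/pokemon?limit=100&offset=200"
    data = requests.get(url).json()
    names = [i.get("name") for i in data["results"]]

    # Upload the JSON payload to Cloud Storage (placeholder project/bucket/object).
    client = storage.Client(project="my-project-id")
    bucket = client.bucket("my-bucket-name")
    bucket.blob("subfolder/name_of_file").upload_from_string(json.dumps(names))

    # Return an explicit HTTP response body and status code.
    return "saved data!", 200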

Query from a BigQuery database via a Google Cloud Function (Python)

I have a BigQuery database connected to a Google Sheet, to which I have read-only access.
I want to get data from a table; the query works perfectly fine in the BigQuery editor, but I want to create a Google Cloud Function so that I have an API and can run this query directly from a URL.
I have created a service account using these commands:
gcloud iam service-accounts create connect-to-bigquery
gcloud projects add-iam-policy-binding test-24s --member="serviceAccount:connect-to-bigquery@test-24s.iam.gserviceaccount.com" --role="roles/owner"
and I have created a Google Cloud Function as follows:
Creating Cloud Function
Service account settings
Here is my code for the main.py file:
from google.cloud import bigquery

def hello_world(request):
    client = bigquery.Client()
    query = "SELECT order_id, MAX(status) AS last_status FROM `test-24s.Dataset.Order_Status` GROUP BY order_id"
    query_job = client.query(query)
    print("The query data:")
    for row in query_job:
        print("name={}, count={}".format(row[0], row["last_status"]))
    return 'The query ran successfully'
And for the requirements.txt file :
# Function dependencies, for example:
# package>=version
google-cloud-bigquery
The function deploys successfully; however, when I try to test it I get this error:
Error: function terminated. Recommended action: inspect logs for termination reason. Additional troubleshooting documentation can be found at https://cloud.google.com/functions/docs/troubleshooting#logging Details:
500 Internal Server Error: The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.
When reading the log file I found this error:
403 Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
Please help me solve this; I have already tried all the solutions I found on the net, without any success.
Based on this: "Permission denied while getting Drive credentials" - I would say that your service account's IAM permissions are not 'transient' => while that service account probably has the relevant access to BigQuery, it does not have access to the underlying spreadsheet maintained on Drive...
I would try either:
extend the scope of the service account's credentials (if possible, but that may not be very straightforward; see the sketch below). Here is an article by Zaar Hai with some details - Google Auth — Dispelling the Magic - and a comment from Guillaume - "Yes, my experience isn't the same";
or (preferably, from my point of view)
make a copy (maybe with regular updates) of the original spreadsheet-based table as a native BigQuery table, and use the latter in your cloud function. A side effect of this approach is a significant performance improvement (and cost savings).
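For the first option, here is a minimal sketch, assuming the function's service account has also been given access to the spreadsheet itself; google.auth.default with an explicit Drive scope is one way to obtain credentials that let BigQuery read the Sheets-backed table:
import google.auth
from google.cloud import bigquery

# Request credentials that carry both the BigQuery and Drive scopes so the
# client can query external tables backed by Google Sheets.
credentials, project = google.auth.default(
    scopes=[
        "https://www.googleapis.com/auth/bigquery",
        "https://www.googleapis.com/auth/drive",
    ]
)
client = bigquery.Client(credentials=credentials, project=project)
query_job = client.query(
    "SELECT order_id, MAX(status) AS last_status "
    "FROM `test-24s.Dataset.Order_Status` GROUP BY order_id")
for row in query_job:
    print(row)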

How to disable caching for Adwords API on App Engine with zeep?

I am trying to disable caching with zeep as is described here:
https://github.com/googleads/googleads-python-lib/blob/master/README.md#how-can-i-configure-or-disable-caching
adwords_client = adwords.AdWordsClient(
    developer_token, oauth2_client, user_agent,
    client_customer_id=client_customer_id,
    cache=googleads.common.ZeepServiceProxy.NO_CACHE)
But I lack understanding of what I should provide to AdWordsClient as the 'oauth2_client' argument.
I am trying to find the solution here http://googleads.github.io/googleads-python-lib/googleads.oauth2.GoogleOAuth2Client-class.html but without success so far.
For the OAuth2 process I am using google_auth_oauthlib, and I managed to retrieve a refresh token, but at this point I am rather lost, because since I am running it on GCP App Engine, I am not able to use a googleads.yaml file.
Can somebody enlighten me in a case of this oauth2_client?
Thanks sincerely!
A bit late but I found a solution to this and wanted to share with anyone who might hit this question.
Here's where I found the solution
You can do LoadFromStorage then disable the zeep cache:
from googleads import ad_manager, common
client = ad_manager.AdManagerClient.LoadFromStorage()
client.cache = common.ZeepServiceProxy.NO_CACHE
I spent hours trying to load using the credentials, and couldn't get it to work. This allowed me to run the module from an EC2 instance.
Here is an example of using the Google Ads Python client library in a Google App Engine app.
In this example it is explained how to do the whole authentication process given that there is no googleads.yaml file.
In particular, for the oauth2_client, check here - they generate it like this:
oauth2credentials = client.OAuth2Credentials(
    None, args.client_id, args.client_secret, args.refresh_token,
    datetime.datetime(1980, 1, 1, 12), GOOGLE_OAUTH2_ENDPOINT,
    USER_AGENT)
GOOGLE_OAUTH2_ENDPOINT is 'https://accounts.google.com/o/oauth2/token', and USER_AGENT is given to you in the AdWords API information.
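Putting this together with the question's goal, here is a hedged sketch of building the oauth2_client from a refresh token (no googleads.yaml) and passing it to AdWordsClient with zeep caching disabled; the client ID, secret, refresh token and developer token values are placeholders and are assumed to come from your own configuration (e.g. environment variables on App Engine):
from googleads import adwords, common, oauth2

client_id = 'your-client-id'            # from your Google API Console project
client_secret = 'your-client-secret'
refresh_token = 'your-refresh-token'    # e.g. obtained via google_auth_oauthlib
developer_token = 'your-developer-token'

# Build an OAuth2 client from an existing refresh token instead of googleads.yaml.
oauth2_client = oauth2.GoogleRefreshTokenClient(
    client_id, client_secret, refresh_token)

# Pass it as the oauth2_client argument and disable the zeep cache.
adwords_client = adwords.AdWordsClient(
    developer_token, oauth2_client, 'my-user-agent',
    cache=common.ZeepServiceProxy.NO_CACHE)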

How can I obtain suitable credentials in a cloud composer environment to make calls to the google sheets API?

I would like to be able to access data in a Google Sheet when running Python code via Cloud Composer; this is something I know how to do in several ways when running code locally, but moving to the cloud is proving challenging. In particular, I wish to authenticate as the Composer service account rather than stashing the contents of a client_secret.json file somewhere (be that the source code or some cloud location).
For essentially the same question but instead accessing Google Cloud Platform services, this has been relatively easy (even when running through Composer) thanks to the google-cloud-* libraries. For instance, I have verified that I can push data to BigQuery:
from google.cloud import bigquery
client = bigquery.Client()
client.project='test project'
dataset_id = 'test dataset'
table_id = 'test table'
dataset_ref = client.dataset(dataset_id)
table_ref = dataset_ref.table(table_id)
table = client.get_table(table_ref)
rows_to_insert = [{'some_column':'test string'}]
errors = client.insert_rows(table,rows_to_insert)
and the success or failure of this can be managed through sharing (or not) 'test dataset' with the composer service account.
Similarly, getting data from a cloud storage bucket works fine:
from google.cloud import storage
client = storage.Client()
bucket = client.get_bucket('test bucket')
name = 'test.txt'
data_blob = bucket.get_blob(name)
data_pre = data_blob.download_as_string()
and once again I have the ability to control access through IAM.
However, for working with Google Sheets it seems I must resort to the Google APIs Python client, and here I run into difficulties. Most documentation on this (which seems to be a moving target!) assumes local code execution, starting with the creation and storage of a client_secret.json file (example 1, example 2), which I understand locally but which doesn't make sense for a shared cloud environment with source control. So, here are a couple of approaches I've tried instead:
Trying to build credentials using discovery and oauth2
from googleapiclient.discovery import build
from httplib2 import Http
from oauth2client.contrib import gce

SAMPLE_SPREADSHEET_ID = 'key for test sheet'
SAMPLE_RANGE_NAME = 'test range'

creds = gce.AppAssertionCredentials(scope='https://www.googleapis.com/auth/spreadsheets')
service = build('sheets', 'v4', http=creds.authorize(Http()))
sheet = service.spreadsheets()
result = sheet.values().get(spreadsheetId=SAMPLE_SPREADSHEET_ID,
                            range=SAMPLE_RANGE_NAME).execute()
values = result.get('values', [])
Caveat: I know nothing about working with scopes to create credential objects via Http. But this seems closest to working: I get an HTTP 403 error of
'Request had insufficient authentication scopes.'
However, I don't know if that means I successfully presented myself as the service account, which was then deemed unsuitable for access (so I need to mess around with permissions some more); or didn't actually get that far (and need to fix this credentials creation process).
Getting a credential object with google.auth and passing to gspread
My (limited) understanding is that oauth2client is being deprecated and google.auth is now the way to go. This yields credentials objects in a similarly simple way to my successful examples above for Cloud Platform services, which I hoped I could just pass to gspread:
import gspread
from google.auth import compute_engine
credentials = compute_engine.Credentials()
client = gspread.authorize(credentials)
Sadly, gspread doesn't work with these objects, because they don't have the attributes it expects:
AttributeError: 'Credentials' object has no attribute 'access_token'
This is presumably because gspread expects oauth2 credentials and those chucked out by google.auth aren't sufficiently compatible. The gspread docs also go down the 'just get a client_secret file' route... but presumably if I can get the previous (oauth/http-based) approach to work, I could then use gspread for data retrieval. For now, though, a hybrid of these two approaches stumbles in the same way: a permission-denied response due to insufficient authentication scopes.
So, whether using google.auth, oauth2 (assuming that'll stick around for a while), or some other cloud-friendly approach (i.e. not one based on storing the secret key), how can I obtain suitable credentials in a Cloud Composer environment to make calls to the Google Sheets API? Bonus marks for a way that is compatible with gspread (and hence gspread_dataframe), but this is not essential. Also happy to hear that this is a PEBCAK error and I just need to configure IAM permissions differently for my current approach to work.
It looks like your Composer environment's oauthScopes config wasn't set up properly. If left unspecified, the default cloud-platform scope doesn't allow you to access the Google Sheets API. You may want to create a new Composer environment with oauthScopes = ["https://www.googleapis.com/auth/spreadsheets", "https://www.googleapis.com/auth/cloud-platform"].
Google Sheets API reference: https://developers.google.com/sheets/api/reference/rest/v4/spreadsheets/create
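With the extra scope on the environment, the discovery-based approach from the question should then work with the default service-account credentials (assuming the sheet has also been shared with the Composer service account). A minimal sketch, reusing the placeholder sheet ID and range from the question:
import google.auth
from googleapiclient.discovery import build

# Default credentials from the Composer environment, scoped for Sheets access.
credentials, _ = google.auth.default(
    scopes=['https://www.googleapis.com/auth/spreadsheets'])

service = build('sheets', 'v4', credentials=credentials)
result = service.spreadsheets().values().get(
    spreadsheetId='key for test sheet', range='test range').execute()
values = result.get('values', [])
print(values)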
