GCP cloud function - Could not find kubectl on the path - python

I'm writing this Google Cloud Function (Python):
import os
import subprocess

def create_kubeconfig(request):
    subprocess.check_output('curl https://sdk.cloud.google.com | bash | echo "" ', stdin=subprocess.PIPE, shell=True)
    os.system("./google-cloud-sdk/install.sh")
    os.system("gcloud init")
    os.system("curl -LO https://storage.googleapis.com/kubernetes-release/release/v1.17.0/bin/linux/amd64/kubectl")
    os.system("gcloud container clusters get-credentials **cluster name** --zone us-west2-a --project **project name**")
    os.system("gcloud container clusters get-credentials **cluster name** --zone us-west2-a --project **project name**")
    conf = KubeConfig()
    conf.use_context('**cluster name**')
When I run the code, it gives me this error:
'Invalid kube-config file. ' kubernetes.config.config_exception.ConfigException: Invalid kube-config file. No configuration found.
Help me solve it, please.

You have to reach the K8S API programmatically. You have the description of the API in the documentation.
It's neither simple nor easy to do, but here are some pointers for achieving what you want.
First, get the GKE master IP.
Then you can access the cluster easily. Here's an example for reading the deployments:
import google.auth
from google.auth.transport import requests
credentials, project_id = google.auth.default()
session = requests.AuthorizedSession(credentials)
response = session.get('https://34.76.28.194/apis/apps/v1/namespaces/default/deployments', verify=False)
response.raise_for_status()
print(response.json())
For creating one, you can do this
import google.auth
from google.auth.transport import requests
credentials, project_id = google.auth.default()
session = requests.AuthorizedSession(credentials)
with open("deployment.yaml", "r") as f:
data = f.read()
response = session.post('https://34.76.28.194/apis/apps/v1/namespaces/default/deployments', data=data,
headers={'content-type': 'application/yaml'}, verify=False)
response.raise_for_status()
print(response.json())
Depending on the object that you want to build, you have to use the correct file definition and the correct API endpoint. I don't know of a way to apply a whole YAML file with several definitions in only one API call.
Last thing: be sure to grant the correct GKE roles to the Cloud Function service account.
UPDATE
Another solution is to use Cloud Run. Indeed, with Cloud Run and thanks to its container capability, you have the ability to install and call system processes (it's not totally open, because your container runs in a gVisor sandbox, but most common usages are allowed).
The idea is the following: use a gcloud SDK base image and deploy your application on it. Then, code your app to perform system calls.
Here's a working example in Go.
Dockerfile:
FROM golang:1.13 as builder
# Copy local code to the container image.
WORKDIR /app/
COPY go.mod .
ENV GO111MODULE=on
RUN go mod download
COPY . .
# Perform test for building a clean package
RUN go test -v ./...
RUN CGO_ENABLED=0 GOOS=linux go build -v -o server
# Gcloud capable image
FROM google/cloud-sdk
COPY --from=builder /app/server /server
CMD ["/server"]
Note: the cloud-sdk base image is heavy: ~700MB.
The code example (happy path only; I removed the error management and the stderr/stdout feedback to simplify the code):
.......
// Example here: recover the yaml file from a bucket
client, _ := storage.NewClient(ctx)
reader, _ := client.Bucket("my_bucket").Object("deployment.yaml").NewReader(ctx)
content, _ := ioutil.ReadAll(reader)

// You can store the file locally in the /tmp directory. It's an in-memory file
// system; don't forget to purge it to avoid an out-of-memory crash.
ioutil.WriteFile("/tmp/file.yaml", content, 0644)

// Execute external commands.
// First, recover the kube authentication:
exec.Command("gcloud", "container", "clusters", "get-credentials", "cluster-1", "--zone=us-central1-c").Run()

// Then interact with the cluster with the kubectl tool and simply apply your description file:
exec.Command("kubectl", "apply", "-f", "/tmp/file.yaml").Run()
.......

Instead of using gcloud inside the Cloud Function (and attempting to install it on every request, which will significantly increase the runtime of your function), you should use the google-cloud-container client library to make the same API calls directly from Python, for example:
from google.cloud import container_v1
client = container_v1.ClusterManagerClient()
project_id = 'YOUR_PROJECT_ID'
zone = 'YOUR_PROJECT_ZONE'
response = client.list_clusters(project_id, zone)
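If, as in the question, you then need to talk to the cluster itself, one option is to combine the endpoint and CA certificate returned by this API with google-auth credentials and the official kubernetes client. A hedged sketch (untested; YOUR_PROJECT and YOUR_CLUSTER are placeholders, and newer versions of google-cloud-container take a single name argument as shown):
import base64
import tempfile

import google.auth
import google.auth.transport.requests
from google.cloud import container_v1
from kubernetes import client as k8s

# Token for the function's service account, obtained via ADC.
credentials, _ = google.auth.default()
credentials.refresh(google.auth.transport.requests.Request())

# Look up the cluster endpoint and CA certificate.
gke = container_v1.ClusterManagerClient()
cluster = gke.get_cluster(
    name="projects/YOUR_PROJECT/locations/us-west2-a/clusters/YOUR_CLUSTER")

# The kubernetes client wants the CA cert as a file path.
with tempfile.NamedTemporaryFile(delete=False, suffix=".crt") as ca:
    ca.write(base64.b64decode(cluster.master_auth.cluster_ca_certificate))

conf = k8s.Configuration()
conf.host = f"https://{cluster.endpoint}"
conf.ssl_ca_cert = ca.name
conf.api_key = {"authorization": "Bearer " + credentials.token}

# Same call as the AuthorizedSession example above, but typed.
apps = k8s.AppsV1Api(k8s.ApiClient(conf))
print(apps.list_namespaced_deployment("default"))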

Related

Flask web app on Cloud Run - google.auth.exceptions.DefaultCredentialsError:

I'm hosting a Flask web app on Cloud Run. I'm also using Secret Manager to store Service Account keys. (I previously downloaded a JSON file with the keys)
In my code, I'm accessing the payload and then using os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = payload to authenticate. When I deploy the app and try to visit the page, I get an Internal Server Error. Reviewing the logs, I see:
File "/usr/local/lib/python3.10/site-packages/google/auth/_default.py", line 121, in load_credentials_from_file
raise exceptions.DefaultCredentialsError(
google.auth.exceptions.DefaultCredentialsError: File {"
I can access the secret through gcloud just fine with: gcloud secrets versions access 1 --secret="<secret_id>" while acting as the Service Account.
Here is my Python code:
import os

from google.cloud import secretmanager

# Grabbing keys from Secret Manager
def access_secret_version():
    # Create the Secret Manager client.
    client = secretmanager.SecretManagerServiceClient()
    # Build the resource name of the secret version.
    name = "projects/{project_id}/secrets/{secret_id}/versions/1"
    # Access the secret version.
    response = client.access_secret_version(request={"name": name})
    payload = response.payload.data.decode("UTF-8")
    return payload
@app.route('/page/page_two')
def some_random_func():
    # New way
    payload = access_secret_version()  # <---- calling the payload
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = payload
    # Old way
    # os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "service-account-keys.json"
I'm not technically accessing a JSON file like I was before; the payload variable is storing the entire key. Is this why it's not working?
Your approach is incorrect.
When you run on a Google compute service like Cloud Run, the code runs under the identity of the compute service.
In this case, by default, Cloud Run uses the Compute Engine default service account but, it's good practice to create a Service Account for your service and specify it when you deploy it to Cloud Run (see Service accounts).
This mechanism is one of the "legs" of Application Default Credentials. When your code is running on Google Cloud, you don't specify the environment variable (you also don't need to create a key); the Cloud Run service acquires the credentials from the Metadata service:
import google.auth
credentials, project_id = google.auth.default()
See google.auth package
It is bad practice to set an environment variable within code. By their nature, environment variables should be provided by the environment. Doing this with GOOGLE_APPLICATION_CREDENTIALS means that your code always sets this value, when it should only do so when the code is running off Google Cloud.
For completeness: if you need to create credentials from a JSON string rather than from a file containing a JSON string, you can use from_service_account_info (see google.oauth2.service_account).
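A minimal sketch of that, assuming payload holds the JSON key string fetched from Secret Manager as in the question:
import json

from google.oauth2 import service_account

# Build a credentials object directly from the JSON string.
info = json.loads(payload)
credentials = service_account.Credentials.from_service_account_info(info)
# Pass credentials explicitly to any client that needs it.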

notebook to execute Databricks job

Is there an API or another way to programmatically run a Databricks job? Ideally, we would like to call a Databricks job from a notebook. The following just gives the currently running job ID, but that's not very useful:
dbutils.notebook.entry_point.getDbutils().notebook().getContext().currentRunId().toString()
To run a Databricks job, you can use the Jobs API. I have a Databricks job called for_repro, which I ran from a Databricks notebook in the two ways shown below.
Using the requests library:
You can create an access token by navigating to Settings -> User settings; under the Access tokens tab, click Generate token.
Use the generated token with the following code:
import requests
import json

my_json = {"job_id": <your_job-id>}
auth = {"Authorization": "Bearer <your_access-token>"}

response = requests.post('https://<databricks-instance>/api/2.0/jobs/run-now', json=my_json, headers=auth).json()
print(response)
The <databricks-instance> value from the above code can be extracted from your workspace URL.
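If you also want to track the run you just triggered: the run-now response contains a run_id that you can poll with the runs/get endpoint of the same Jobs API. A hedged sketch, reusing the token and instance from above:
# Poll the state of the run triggered above.
run_id = response['run_id']
status = requests.get(
    'https://<databricks-instance>/api/2.0/jobs/runs/get',
    params={'run_id': run_id},
    headers=auth,
).json()
print(status['state'])  # life_cycle_state / result_state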
Using the %sh magic command:
You can also use the %sh magic command in a Python notebook cell to run a Databricks job.
%sh
curl --netrc --request POST --header "Authorization: Bearer <access_token>" \
https://<databricks-instance>/api/2.0/jobs/run-now \
--data '{"job_id": <your job id>}'
Refer to this Microsoft documentation for all the other operations that can be performed with the Jobs API.

Google Cloud Run does not find os.environ['GOOGLE_APPLICATION_CREDENTIALS'] variable

I am trying to deploy a Python app in Google Cloud Run to perform some tasks automatically and these tasks require access to my BigQuery.
I have tested the implementation on localhost through Cloud Shell, and it worked just as expected. Then I created a Cloud Run service, and all functions that do not require access to BigQuery work normally; but when they do, I get the following error:
google.auth.exceptions.DefaultCredentialsError: File /XXXXXX/gbq.json was not found.
However, the file is there (the folders are correct, and I also tested adding copies of the file in other folders).
Any suggestions to solve the problem or a workaround I could use?
Thanks in advance
ADDITIONAL INFO:
main.py function:
(the bottom part of the code is used to test the app in localhost, which works perfectly)
from flask import Flask, request

from test_py import test as t

app = Flask(__name__)

@app.get("/")
def hello():
    """Return a friendly HTTP greeting."""
    chamado = request.args.get("chamado", default="test")
    print(chamado)
    if chamado == 'test':
        dados = f'chamado = test?\n{chamado == "test"}\n{t.show_data(chamado)}'
    elif chamado == 'bigqueer':
        dados = f'chamado = test?\n{chamado == "test"}\n{t.show_bq_data()}'
    else:
        dados = f'chamado = test?\n{chamado == "test"}\n{t.show_not_data(chamado)}'
    print(dados)
    return dados

if __name__ == "__main__":
    # Development only: run "python main.py" and open http://localhost:8080
    # When deploying to Cloud Run, a production-grade WSGI HTTP server,
    # such as Gunicorn, will serve the app.
    app.run(host="localhost", port=8080, debug=True)
BigQuery class:
import os

from google.cloud import bigquery as bq

class GoogleBigQuery:
    def __init__(self):
        os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/XXXXXX/gbq.json'
        self.client = bq.Client()

    def executar_query(self, query):
        client_query = self.client.query(query)
        result = client_query.result()
        return result
Cloud Run deploy:
gcloud run deploy pythontest \
--source . \
--platform managed \
--region $REGION \
--allow-unauthenticated
YOU DO NOT NEED THAT
Excuse my brutal first words, but it's extremely dangerous to do what you're doing. Let me explain.
In your container, you put a secret in plain text. Keep in mind that your container is like a zip file: nothing in it is secret or encrypted. You can convince yourself by using dive and exploring your container's layers and data.
Therefore: DO NOT DO THAT!
So now, what to do?
On Google Cloud, all the services can use the Metadata server to get credentials. The client libraries leverage it, and you can rely on the default credentials when you initialise your code. That mechanism is named ADC (Application Default Credentials).
In your code, simply remove that line: os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/XXXXXX/gbq.json'
Then, when you deploy your Cloud Run service, specify the runtime service account that you want to use. That's all! The Google Cloud environment and libraries will do the rest.
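For illustration, a minimal sketch of the fixed class under ADC (the --service-account flag and the SA_EMAIL placeholder are illustrative, not from the question):
from google.cloud import bigquery as bq

class GoogleBigQuery:
    def __init__(self):
        # No key file, no env variable: credentials come from the
        # Cloud Run metadata server via ADC.
        self.client = bq.Client()

    def executar_query(self, query):
        return self.client.query(query).result()
and deploy with, for example: gcloud run deploy pythontest --source . --region $REGION --service-account SA_EMAIL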

Uploading file with python returns Request failed with status code', 403, 'Expected one of', <HTTPStatus.OK: 200>

blob.upload_from_filename(source) gives the error:
raise exceptions.from_http_status(response.status_code, message, response=response)
google.api_core.exceptions.Forbidden: 403 POST https://www.googleapis.com/upload/storage/v1/b/bucket1-newsdata-bluetechsoft/o?uploadType=multipart: ('Request failed with status code', 403, 'Expected one of', <HTTPStatus.OK: 200>)
I am following the Google Cloud example written in Python here!
from google.cloud import storage

def upload_blob(bucket, source, des):
    client = storage.Client.from_service_account_json('/path')
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket)
    blob = bucket.blob(des)
    blob.upload_from_filename(source)
I used gsutil to upload files, which is working fine.
I also tried listing the bucket names using the Python script, which worked fine.
I have necessary permissions and GOOGLE_APPLICATION_CREDENTIALS set.
This whole thing wasn't working because I didn't have the Storage Admin permission on the service account I was using in GCP.
Granting Storage Admin to my service account solved the problem.
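For reference, a hedged example of granting that role from the CLI; PROJECT_ID and SA_EMAIL are placeholders for your own values:
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_EMAIL" \
  --role="roles/storage.admin"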
As other answers have indicated, this is a permissions issue. I have found the following command a useful way to create default application credentials for the currently logged-in user.
Assuming you got this error while running the code on some machine, the following steps are sufficient:
SSH into the VM where the code is running or will be running. Make sure you are logged in as a user who has permission to upload to Google Storage.
Run the following command:
gcloud auth application-default login
The command will ask you to create a token by clicking on a URL. Generate the token and paste it into the SSH console.
That's it. Every Python application started as that user will use this as the default credential for storage bucket interaction.
Happy GCP'ing :)
This question is more appropriate for a support case.
As you are getting a 403, you are most likely missing an IAM permission; the Google Cloud Platform support team will be able to inspect your resources and configurations.
This is what worked for me when the Google documentation didn't; I was getting the same error despite having the appropriate permissions.
import pathlib

import google.cloud.storage as gcs

client = gcs.Client()

# Set the target file to write to.
target = pathlib.Path("local_file.txt")

# Set the file to download.
FULL_FILE_PATH = "gs://bucket_name/folder_name/file_name.txt"

# Open a file stream with write permissions.
with target.open(mode="wb") as downloaded_file:
    # Download and write the file locally.
    client.download_blob_to_file(FULL_FILE_PATH, downloaded_file)

How to use Google API credentials json on Heroku?

I'm making an app using the Google Calendar API and planning to deploy it on Heroku.
I have a problem with authentication. Usually I use a credentials JSON file for that, but this time I don't want to upload it to Heroku for security reasons.
How can I handle authentication on Heroku?
For now, I put my JSON into an env variable and use oauth2client's from_json method.
def get_credentials():
    credentials_json = os.environ['GOOGLE_APPLICATION_CREDENTIALS']
    credentials = GoogleCredentials.from_json(credentials_json)
    if not credentials or credentials.invalid:
        flow = client.flow_from_clientsecrets(CLIENT_SECRET_FILE, SCOPES)
        flow.user_agent = APPLICATION_NAME
        if flags:
            credentials = tools.run_flow(flow, store, flags)
        else:  # Needed only for compatibility with Python 2.6
            credentials = tools.run(flow, store)
        print('Storing credentials to ' + credential_path)
    return credentials
But this code isn't perfect. If the credentials are invalid, I want the code to write the new credentials to the env variable, not to a new file.
Is there any better way?
I spent an entire day finding the solution, because it's tricky.
No matter your language, the solution is the same.
1 - Declare your env variables in the Heroku dashboard:
The GOOGLE_CREDENTIALS variable is the content of the service account credentials JSON file, as is.
The GOOGLE_APPLICATION_CREDENTIALS env variable is the string "google-credentials.json".
2 - Once the variables are declared, add the buildpack from the command line:
$ heroku buildpacks:add https://github.com/elishaterada/heroku-google-application-credentials-buildpack
3 - Make a push. Update a tiny thing and push.
4 - The buildpack will automatically generate a google-credentials.json file and fill it with the content of the GOOGLE_CREDENTIALS variable.
If something went wrong, it will not work; check the content of google-credentials.json with the Heroku bash.
The buildpack mentioned by Maxime Boué no longer works because of Heroku updates (stack 18+). However, below is a similar buildpack that works; it is actually a fork of the previous one.
Use the link below in the buildpack setting of your Heroku app:
https://github.com/gerywahyunugraha/heroku-google-application-credentials-buildpack
In Config Vars, define GOOGLE_CREDENTIALS as the key and the content of your credentials file as the value.
Define GOOGLE_APPLICATION_CREDENTIALS as the key and google-credentials.json as the value.
Redeploy the application; it should work!
If anyone is still looking for this, I've just managed to get this working for Google Cloud Storage by storing the JSON directly in an env variable (no extra buildpacks).
You'll need to place the JSON credentials data into your env vars and install google-auth.
Then, parse the JSON and pass the credentials to the storage client:
import os
import json

from google.cloud import storage
from google.oauth2 import service_account

# The JSON credentials stored as an env variable.
json_str = os.environ.get('GOOGLE_APPLICATION_CREDENTIALS')

# Project name.
gcp_project = os.environ.get('GCP_PROJECT')

# Parse the JSON - if there are errors here, remove newlines in .env.
json_data = json.loads(json_str)

# The private_key needs \n parsed as a string literal replaced with escaped newlines.
json_data['private_key'] = json_data['private_key'].replace('\\n', '\n')

# Use service_account to generate a credentials object.
credentials = service_account.Credentials.from_service_account_info(json_data)

# Pass credentials AND project name to the new client object
# (it did not work without the project name).
storage_client = storage.Client(project=gcp_project, credentials=credentials)
Hope this helps!
EDIT: Clarified that this was for Google Cloud Storage. These classes will differ for other services, but judging from other docs, the various Google client classes should all allow passing a credentials object.
The recommended buildpack doesn't work anymore. Here's a quick, direct way to do the same thing:
Set config variables:
heroku config:set GOOGLE_APPLICATION_CREDENTIALS=gcp_key.json
heroku config:set GOOGLE_CREDENTIALS=<CONTENTS OF YOU GCP KEY>
The GOOGLE_CREDENTIALS is easier to set in the Heroku dashboard.
Create a .profile file in your repo with a line to write the json file:
echo ${GOOGLE_CREDENTIALS} > /app/gcp_key.json
.profile is run every time the container starts.
Obviously, commit the changes to .profile and push to Heroku, which will trigger a rebuild and thus create that gcp_key.json file.
More official Heroku documentation on this topic: https://elements.heroku.com/buildpacks/buyersight/heroku-google-application-credentials-buildpack
I also used buyersight's buildpack, and it was the only one that worked for me.
As stated above, the official Heroku buildpack works (https://elements.heroku.com/buildpacks/buyersight/heroku-google-application-credentials-buildpack).
For PHP Laravel users, though, the config variable GOOGLE_APPLICATION_CREDENTIALS should be set to ../google-credentials.json. Otherwise, PHP will not find the file.
I know this is old, but there is another alternative: "split" the JSON file and store each important field as its own environment variable.
Something like:
PRIVATE_KEY_ID=qsd
PRIVATE_KEY="-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n"
CLIENT_EMAIL=blabla@lalala.iam.gserviceaccount.com
CLIENT_ID=7777
CLIENT_X509_CERT_URL=https://www.googleapis.com/robot/v1/metadata/x509/whatever.iam.gserviceaccount.com
These can be in a .env file locally, and put on Heroku using the UI or heroku config:set commands.
Then, in the Python file, you can initialize the ServiceAccount using a dict instead of a JSON file:
CREDENTIALS = {
    "type": "service_account",
    "project_id": "iospress",
    "private_key_id": os.environ["PRIVATE_KEY_ID"],
    "private_key": os.environ["PRIVATE_KEY"],
    "client_email": os.environ["CLIENT_EMAIL"],
    "client_id": os.environ["CLIENT_ID"],
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_x509_cert_url": os.environ["CLIENT_X509_CERT_URL"]
}
credentials = ServiceAccountCredentials.from_json_keyfile_dict(CREDENTIALS, SCOPES)
It's a bit more verbose than some of the options presented here, but it works without any buildpack or other extras.
In case you do not want to use a buildpack:
1 - Add env variables in Heroku via the dashboard or CLI:
The GOOGLE_CREDENTIALS variable is the content of the service account credentials JSON file.
GOOGLE_APPLICATION_CREDENTIALS = google-credentials.json
2 - Create a file called .profile at the root of your project with the following content:
echo ${GOOGLE_CREDENTIALS} > /app/google-credentials.json
3 - Push your code.
4 - During startup, the container starts a bash shell that runs any code in $HOME/.profile before executing the dyno’s command.
Note: For Laravel projects GOOGLE_APPLICATION_CREDENTIALS = ../google-credentials.json
You can use the Heroku Platform API to update Heroku env vars from within your app.
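For example, a hedged sketch in Python (the HEROKU_API_KEY env variable, the <your-app> placeholder, and new_credentials_json are assumptions; the config-vars endpoint is from the Platform API reference):
import os

import requests

# new_credentials_json: the refreshed JSON key string you want to store (hypothetical).
resp = requests.patch(
    'https://api.heroku.com/apps/<your-app>/config-vars',
    headers={
        'Accept': 'application/vnd.heroku+json; version=3',
        'Authorization': 'Bearer ' + os.environ['HEROKU_API_KEY'],
    },
    json={'GOOGLE_CREDENTIALS': new_credentials_json},
)
resp.raise_for_status()  # the response echoes the updated config vars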
It seems that the buildpacks where you can upload the credentials.json file are not working as expected. I finally managed with lepinsk's buildpack (https://github.com/lepinsk/heroku-google-cloud-buildpack.git), which requires all keys and values to be set as config vars in Heroku. It does do the job though, so many thanks for that!
I have done this as follows:
1 - Created two env variables:
CREDENTIALS - the base64-encoded value of the Google API credentials
CONFIG_FILE_PATH - /app/.gcp/key.json (we create this file at runtime, during the Heroku preinstall phase, as below)
2 - Created a preinstall script:
"heroku-prebuild": "bash preinstall.sh",
In preinstall.sh, decode CREDENTIALS and write it to the config file:
if [ "$CREDENTIALS" != "" ]; then
echo "Detected credentials. Adding credentials" >&1
echo "" >&1
# Ensure we have an gcp folder
if [ ! -d ./.gcp ]; then
mkdir -p ./.gcp
chmod 700 ./.gcp
fi
# Load the private key into a file.
echo $GCP_CREDENTIALS | base64 --decode > ./.gcp/key.json
# Change the permissions on the file to
# be read-only for this user.
chmod 400 ./.gcp/key.json
fi
If you're still having issues running the app after following the buildpack instructions mentioned above, try setting your Heroku environment variable GOOGLE_APPLICATION_CREDENTIALS to the full path instead.
GOOGLE_APPLICATION_CREDENTIALS = /app/google-credentials.json
This worked for me.
Previously, I could see that the google-credentials file was being generated by the buildpack (or .profile), but the Heroku app wasn't finding it and gave errors such as:
Error: Cannot find module './google-credentials.json'
Require stack:
- /app/server/src/config/firebase.js
- /app/server/src/middlewares/authFirebase.js
- /app/server/src/routes/v1/auth.route.js
- /app/server/src/routes/v1/index.js
- /app/server/src/app.js
- /app/server/src/index.js
at Function.Module._resolveFilename (internal/modules/cjs/loader.js:902:15)
at Module.Hook._require.Module.require (/app/server/node_modules/require-in-the-middle/index.js:61:29)
at require (internal/modules/cjs/helpers.js:93:18)
at Object.<anonymous> (/app/server/src/config/firebase.js:3:24)
at Module._compile (internal/modules/cjs/loader.js:1085:14)
One more tip: be careful when creating the GOOGLE_APPLICATION_CREDENTIALS variable in the Heroku dashboard; this cost me a day. If you use a path like server\credential.json for the credentials file, it will not work because of the backslash. Use a forward slash / instead.
This will work as the path (without quotes):
server/credential.json
The simplest way I've found is to:
Save the credentials as a string in a Heroku ENV variable
In your app, load them into a Ruby Tempfile
Then have GoogleDrive::Session.from_service_account_key load them from that temp file:
require "tempfile"
...
google_credentials_tempfile = Tempfile.new("credentials.json")
google_credentials_tempfile.write(ENV["GOOGLE_CREDENTIALS_JSON"])
google_credentials_tempfile.rewind
session = GoogleDrive::Session.from_service_account_key(google_credentials_tempfile)
I have this in a heroku app and it works flawlessly.
