I'm using spotipy to set Spotify API credentials in JupyterLab. I have generated the Spotify client secret using the client ID, and whenever I run this code it gives me a KeyError on both the client ID and the client secret keys.
sp = spotipy.Spotify(
    auth_manager=SpotifyClientCredentials(
        client_id=os.environ["SPOTIFY_CLIENT_ID"],
        client_secret=os.environ["SPOTIFY_CLIENT_SECRET"],
    )
)
PS: I have also run "import spotipy" and "from spotipy.oauth2 import SpotifyClientCredentials", and both imports succeed.
These are environment variables. You need to set them in your environment before running that code. You can set them within your Jupyter notebook with magic commands. Copying the below from [Michael's answer here][1]:
To set an env variable in a Jupyter notebook, just use a % magic command, either %env or %set_env, e.g., %env MY_VAR=MY_VALUE or %env MY_VAR MY_VALUE. (Use %env by itself to print out current environment variables.)
[1]: https://stackoverflow.com/questions/37890898/how-to-set-env-variable-in-jupyter-notebook
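For example, you can set the two variables from the question in a notebook cell before creating the client (placeholder values shown, not real credentials):
%env SPOTIFY_CLIENT_ID=your-client-id
%env SPOTIFY_CLIENT_SECRET=your-client-secret
After that cell runs, os.environ["SPOTIFY_CLIENT_ID"] and os.environ["SPOTIFY_CLIENT_SECRET"] will resolve without a KeyError.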
So I went a different way in my JupyterLab and it worked.
I did the following in order to achieve it:
Create a token first with this line of code:
token = SpotifyClientCredentials(
    client_id="client_id",          # replace with your actual client ID
    client_secret="client_secret",  # replace with your actual client secret
).get_access_token()
and pass the token to spotipy like this:
sp = spotipy.Spotify(token)
Use the sp variable in your later code and it works. No KeyError was thrown when executing the code above. Thanks :)
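One caveat: in newer spotipy releases, get_access_token() is deprecated and returns a dict by default, so you may need to extract the token string. A sketch, assuming spotipy 2.x:
# as_dict=False returns the bare token string instead of a dict (newer spotipy)
token = SpotifyClientCredentials(
    client_id="client_id", client_secret="client_secret"
).get_access_token(as_dict=False)
sp = spotipy.Spotify(auth=token)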
I am using an Azure Sentinel notebook for threat intelligence. While trying to configure msticpy to connect to Azure Sentinel, I am getting a ValueError. The following is the code that I am using:
from pathlib import Path
from msticpy.config import MpConfigEdit
import os

mp_conf = "msticpyconfig.yaml"

# check if MSTICPYCONFIG is already an env variable
mp_env = os.environ.get("MSTICPYCONFIG")
mp_conf = mp_env if mp_env and Path(mp_env).is_file() else mp_conf

if not Path(mp_conf).is_file():
    print(
        "No msticpyconfig.yaml was found!",
        "Please check that there is a config.json file in your workspace folder.",
        "If this is not there, go back to the Microsoft Sentinel portal and launch",
        "this notebook from there.",
        sep="\n",
    )
else:
    mpedit = MpConfigEdit(mp_conf)
    mpedit.set_tab("AzureSentinel")
    display(mpedit)
ValueError: File not found: 'None'.
In the Azure ML terminal, create the nbuser_settings.py file in the root of your user folder, which is the folder with your username.
In the nbuser_settings.py file, add the following lines:
import os
os.environ["MSTICPYCONFIG"] = "~/msticpyconfig.yaml"
https://learn.microsoft.com/en-us/Azure/sentinel/notebooks-msticpy-advanced?msclkid=e7cd84dfd05c11ecb0df15e0892300fc&tabs=azure-ml
Reference
Some elements of MSTICPy require configuration parameters. An example is the Threat Intelligence providers. Values for these and other parameters can be set in the msticpyconfig.yaml file.
The package has a default configuration file, which is stored in the package directory. You should not need to edit this file directly. Instead you can create a custom file with your own parameters - these settings will combine with or override the settings in the default file.
By default, the custom msticpyconfig.yaml is read from the current directory. You can specify an explicit location using an environment variable MSTICPYCONFIG.
You should also read the MSTICPy Settings Editor document to see how to configure settings using an interactive user interface from a Jupyter notebook.
!!! NOTE !!! For the Linux and Windows options, you'll need to restart your Jupyter server for it to pick up the environment variable that you defined.
https://msticpy.readthedocs.io/en/latest/getting_started/msticpyconfig.html?msclkid=96fde57dd04d11ec9e5406de243d7c67
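As a quick sanity check after setting the variable and restarting Jupyter, you can confirm from a notebook cell that MSTICPy will find the file (a minimal sketch; the printed path is whatever you pointed MSTICPYCONFIG at):
import os
from pathlib import Path

mp_conf = os.environ.get("MSTICPYCONFIG")
# None means the variable was not picked up; False means it points at a missing file
print(mp_conf, Path(mp_conf).is_file() if mp_conf else None)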
The author of msticpy has tracked this issue on GitHub, and we have to wait for the next release. Please follow the thread for more details:
https://github.com/microsoft/msticpy/issues/393
I want to run a notebook in databricks from another notebook using %run. Also I want to be able to send the path of the notebook that I'm running to the main notebook as a parameter.
The reason for not using dbutils.notebook.run is that I'm storing nested dictionaries in the notebook that's called, and I want to use them in the main notebook.
I'm looking for something like:
path = "/References/parameterDefinition/schemaRepository"
%run <path variable>
Magic commands such as %run and %fs do not allow variables to be passed in.
The workaround is to use dbutils.notebook.run instead.
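A sketch of that workaround using the path from the question (the 300-second timeout and the empty arguments dict are illustrative):
# runs the target notebook as a separate job and returns its exit value
result = dbutils.notebook.run("/References/parameterDefinition/schemaRepository", 300, {})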
You can pass arguments as documented on Databricks web site:
https://docs.databricks.com/notebooks/widgets.html#use-widgets-with-run
In the top notebook you can call
%run /path/to/notebook $X="10" $Y="1"
And then in the sub notebook, you can reference those arguments using the widgets API as in
x_value = dbutils.widgets.get("X")
y_value = dbutils.widgets.get("Y")
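If the sub notebook should also run on its own, you can declare the widgets there with defaults; values passed via %run then override them (a sketch, reusing the widget names above):
# In the sub notebook: declare widgets with default values for standalone runs
dbutils.widgets.text("X", "0")
dbutils.widgets.text("Y", "0")
x_value = dbutils.widgets.get("X")
y_value = dbutils.widgets.get("Y")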
To your specific question, it would look something like this where "path" is the variable to be referenced via the widgets API in the target notebook:
%run /path/to/notebook $path="/path/to/notebook"
Unfortunately it's impossible to pass the path to %run as a variable. You can pass a variable as a parameter only, and that works only in combination with widgets - you can see the example in this answer. In this case you can have all your definitions in one notebook and, depending on the passed variable, redefine the dictionary.
There is new functionality coming in the next few months (approximately - see the public roadmap webinar for more details) that will allow notebooks to be imported as libraries using the import statement. Until then, you can emulate the same functionality by exporting the notebook into a file on disk using the Export command of the Workspace API, decoding the data, and importing the file's content. For example, if you have a notebook called module1 with the content
my_cool_dict = {"key1": "abc", "key2": 123}
then you can import it as follows:
import requests
import base64
import os

api_url = dbutils.notebook.entry_point.getDbutils().notebook().getContext().apiUrl().get()
host_token = dbutils.notebook.entry_point.getDbutils().notebook().getContext().apiToken().get()
path = "/Users/..../module1"

# fetch the notebook source via the Workspace API
response = requests.get(f"{api_url}/api/2.0/workspace/export",
                        json={"format": "SOURCE", "path": path},
                        headers={"Authorization": f"Bearer {host_token}"}
                        ).json()

# decode the base64-encoded content
data = base64.b64decode(response["content"].encode("ascii"))

# write the file & __init__.py, so the directory will be considered a module
dir = os.path.join("/tmp", "my_modules")
if not os.path.exists(dir):
    os.mkdir(dir)
with open(os.path.join(dir, os.path.split(path)[-1] + ".py"), "wb") as f:
    f.write(data)
with open(os.path.join(dir, "__init__.py"), "wb") as f:
    f.write("\n".encode("ascii"))

# add our directory to the system path
import sys
sys.path.append(dir)

# import the notebook as a module
from module1 import my_cool_dict
and see that we got our variable:
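A quick check (the output assumes the module1 content shown above):
print(my_cool_dict)
# {'key1': 'abc', 'key2': 123}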
Problem
You can't pass it as a variable while running the notebook like this:
In notebook1:
path_var = "/some/path"
%run ./notebook2 $path=path_var
Solution
However, what you can do, and what I did, is access the dbutils object or the variables of notebook1 in notebook2:
In notebook1:
dbutils.widgets.text("path","", "")
path_var = "/some/path"
%run ./notebook2
Then in notebook2:
"""
No need to define path widget in Notebook2 like this:
dbutils.widgets.text("path","", "")
"""
path = dbutils.widgets.get("path")
print(path)
# Output: /some/path
"""
Or you can access the path_var of Notebook1 directly
without defining it anywhere in Notebook2 like this
"""
print(path_var)
# Output: /some/path
So this helps when you are using complicated variables such as heavily nested dictionaries in the notebook.
Benefit
What I love about this approach is that the environment of the notebooks gets shared when you call a notebook, meaning you can access the variables & methods of Notebook1 in some Notebookn and vice versa, such that:
Notebook1 is calling Notebook2
Notebook2 is calling Notebook3
.....
Notebookn-1 is calling Notebookn
I’m using Python 3.6 and Fabric 2.4. I’m using Fabric to SSH into a server and run some commands. I need to set an environment variable for the commands being run on the remote server. The documentation indicates that something like this should work:
from fabric import task

@task(hosts=["servername"])
def do_things(c):
    c.run("command_to_execute", env={"KEY": "VALUE"})
But that doesn’t work. Something like this should also be possible:
from fabric import task

@task(hosts=["servername"])
def do_things(c):
    c.config.run.env = {"KEY": "VALUE"}
    c.run("command_to_execute")
But that doesn’t work either. I feel like I’m missing something. Can anyone help?
I was able to do it by setting inline_ssh_env=True and then explicitly setting the env variable, e.g.:
from fabric import Connection

with Connection(host=hostname, user=username, inline_ssh_env=True) as c:
    c.config.run.env = {"MY_VAR": "this worked"}
    c.run('echo $MY_VAR')
As stated on the Fabric site:
The root cause of this is typically because the SSH server runs non-interactive commands via a very limited shell call: /path/to/shell -c "command" (for example, OpenSSH). Most shells, when run this way, are not considered to be either interactive or login shells; and this then impacts which startup files get loaded.
You can read more on this page: link
So what you're trying to do won't work, and the solution is to pass the environment variable you want to set explicitly:
from fabric import task

@task(hosts=["servername"])
def do_things(c):
    c.config.run.env = {"KEY": "VALUE"}
    c.run('echo export %s >> ~/.bashrc' % 'ENV_VAR=VALUE')
    c.run('source ~/.bashrc')
    c.run('echo $ENV_VAR')  # to verify whether it's set
    c.run("command_to_execute")
You can try this:
from fabric import task, Connection

@task
def qa(ctx):
    ctx.config.run.env['counter'] = 22
    ctx.config.run.env['conn'] = Connection('qa_host')

@task
def sign(ctx):
    print(ctx.config.run.env['counter'])
    conn = ctx.config.run.env['conn']
    conn.run('touch mike_was_here.txt')
And run:
fab2 qa sign
When creating the Connection object, try adding inline_ssh_env=True.
Quoting the documentation:
Whether to send environment variables “inline” as prefixes in front of command strings (export VARNAME=value && mycommand here), instead of trying to submit them through the SSH protocol itself (which is the default behavior). This is necessary if the remote server has a restricted AcceptEnv setting (which is the common default).
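Putting the two pieces together, a minimal sketch (the host name and variable are placeholders):
from fabric import Connection

# inline_ssh_env=True makes Fabric prefix the command with `export MY_VAR=... &&`
# rather than sending the variable through the SSH protocol (often blocked by AcceptEnv)
with Connection("servername", inline_ssh_env=True) as c:
    c.run("echo $MY_VAR", env={"MY_VAR": "hello"})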
According to that part of the official doc, the connect_kwargs attribute of the Connection object is intended to replace the env dict. I use it, and it works as expected.
I'm building a Python 3 web app using Flask which includes Google Maps.
Checking for the API key before loading index.html always raises a RuntimeError:
if not os.environ.get("key"):
    raise RuntimeError("key not set")
return render_template("index.html", key=os.environ.get("key"))
I also tried os.getenv - the same problem occurs. Changing the variable name does not solve the issue either.
I exported the variable to the environment via export key=value, and printenv returns the correct value of key.
Hardcoding the API Key works and returns the map successfully:
return render_template("index.html", key=value)
Any ideas how to solve this?
SOLVED: make sure to run the export command in the same terminal window as flask run.
ALTERNATIVE: create a websiteconfig.py file with key="value" and include import websiteconfig in your application. Source: link
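A minimal sketch of that alternative (file and variable names follow the description above; the key value is a placeholder):
# websiteconfig.py -- keep this file out of version control
key = "value"

# application.py
from flask import Flask, render_template
import websiteconfig

app = Flask(__name__)

@app.route("/")
def index():
    # read the key from the config module instead of the environment
    return render_template("index.html", key=websiteconfig.key)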
I'm trying to connect to Google BigQuery through the BigQuery API, using Python.
I'm following this page here:
https://cloud.google.com/bigquery/bigquery-api-quickstart
My code is as follows:
import os
import argparse

from apiclient.discovery import build
from apiclient.errors import HttpError
from oauth2client.client import GoogleCredentials

GOOGLE_APPLICATION_CREDENTIALS = './Peepl-cb1dac99bdc0.json'


def main(project_id):
    # Grab the application's default credentials from the environment.
    credentials = GoogleCredentials.get_application_default()
    print(credentials)
    # Construct the service object for interacting with the BigQuery API.
    bigquery_service = build('bigquery', 'v2', credentials=credentials)

    try:
        query_request = bigquery_service.jobs()
        query_data = {
            'query': (
                'SELECT TOP(corpus, 10) as title, '
                'COUNT(*) as unique_words '
                'FROM [publicdata:samples.shakespeare];')
        }

        query_response = query_request.query(
            projectId=project_id,
            body=query_data).execute()

        print('Query Results:')
        for row in query_response['rows']:
            print('\t'.join(field['v'] for field in row['f']))
    except HttpError as err:
        print('Error: {}'.format(err.content))
        raise err


if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('project_id', help='Your Google Cloud Project ID.')
    args = parser.parse_args()
    main(args.project_id)
However, when I run this code through the terminal, I get the following error:
oauth2client.client.ApplicationDefaultCredentialsError: The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
As you can see in the code, I've tried to set the GOOGLE_APPLICATION_CREDENTIALS as per the link in the error. However, the error persists. Does anyone know what the issue is?
Thank you in advance.
First - thanks for the code - this proved to be very useful.
I would also suggest setting the environment variable directly in your code, so as not to have to set it for every environment you work in.
You can use the following code:
import os
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path_to_your_.json_credential_file"
I found this useful when switching between different projects that require different credentials.
I'm not sure about BigQuery, but I'm using Google Datastore for saving. If you've installed the gcloud SDK on your Mac, you can try running this command:
gcloud auth application-default login
It's looking for the environment variable in your local UNIX (or other) environment, not a variable in your python script.
You'd set that by opening up your terminal or cygwin and doing one of the following:
export GOOGLE_APPLICATION_CREDENTIALS='/path/to/your/client_secret.json'
Type that into your terminal to set the variable for just this session
Open up your .bashrc file (on UNIX, by typing nano ~/.bashrc) and add this line underneath the user-specific aliases header if you see it:
export GOOGLE_APPLICATION_CREDENTIALS="/full/path/to/your/client_secret.json"
Then reload it by typing source ~/.bashrc and confirm that it's set by trying echo $GOOGLE_APPLICATION_CREDENTIALS. If it returns the path, you're good.
Note: oauth2client is deprecated; instead of GoogleCredentials.get_application_default() you can use google.auth.default(). Install the package first with:
pip install google-auth
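A minimal sketch of the replacement call; google.auth.default() reads GOOGLE_APPLICATION_CREDENTIALS (or other ambient credentials) and returns a credentials/project pair:
import google.auth

credentials, project_id = google.auth.default()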
In your specific example, I see you know where the JSON file is located from your code. Instead of default credentials (from environment variables), you can use a service account directly with the google.oauth2.service_account module.
import google.oauth2.service_account

credentials = google.oauth2.service_account.Credentials.from_service_account_file(
    './Peepl-cb1dac99bdc0.json',
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
You can use this credentials file just as you are currently doing so by passing them to googleapiclient.discovery.build or if you are using the google-cloud-bigquery library, pass the credentials to the google.cloud.bigquery.Client constructor.
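For the google-cloud-bigquery route, a short sketch (the key file name is the one from the question; the project ID is read off the credentials):
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    './Peepl-cb1dac99bdc0.json',
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
client = bigquery.Client(credentials=credentials, project=credentials.project_id)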
Here is a C# solution:
System.Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", @"C:\apikey.json");
string Pathsave = System.Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS");
It's looking for the environment variable. But I was able to solve this problem on the Windows platform by using Application Default Credentials.
Steps that I followed:
Installed Google SDK
Then execute the gcloud init steps to specify your default credentials and default project, which you can change as and when needed. The gcloud executable is located in the bin directory where you chose to install the Google SDK.
After successfully providing the credentials, you can check the location C:\Users\"yourusername"\AppData\Roaming\gcloud\legacy_credentials\"youremail". You will find the credentials stored there in JSON format.
It helped me to resolve the error.
Apart from using GOOGLE_APPLICATION_CREDENTIALS (which is already described in a bunch of answers) there is one more way to set generated json credentials as a default service account:
gcloud auth activate-service-account --key-file=<path to your generated json file>
That will activate a default account (and set credentials according to the provided json file) without explicitly setting GOOGLE_APPLICATION_CREDENTIALS, and it will be still activated after re-login or reboot without modifying .bashrc.
The link provided in the error message, https://developers.google.com/identity/protocols/application-default-credentials, says to set the environment variable to point at the file that contains the JSON service credentials. It looks like you set a Python variable instead. Try setting your terminal's environment variable to point at the correct file.
An alternative would be to explicitly use some other credentials when you aren't running in a GCE container, like oauth2client.client.SignedJwtAssertionCredentials, and point it directly at your client secret so you don't have to indirect through an environment variable.
There is another workaround that I don't think was mentioned here yet. The google.oauth2.service_account.Credentials object offers the from_service_account_info method (see here: https://github.com/googleapis/google-auth-library-python/blob/main/google/oauth2/service_account.py).
So you can set any variables you want in your environment, read them in, and pass them into the function, something like this:
import os

from google.oauth2 import service_account
from googleapiclient import discovery

your_data = {
    "type": os.environ.get('YOUR_ENV_VAR'),
    "project_id": os.environ.get('YOUR_ENV_VAR'),
    "private_key_id": os.environ.get('YOUR_ENV_VAR'),
    # ... and so on with all the required Google variables....
}

your_credentials = service_account.Credentials.from_service_account_info(your_data, scopes=your_scopes)
service = discovery.build(api_name, api_version, credentials=your_credentials)
I basically took all the data from my google keyfile.json and stored them in the env and did the above. That way you never need to keep your keyfile.json anywhere near your code or worse, upload it somewhere public. And that's basically it. Good luck!
PS: I forgot to mention this, and it might help someone running into the same issues as I did. While the above should work fine in development, in some production environments the \n will not be interpreted as a newline; instead it will remain inside the private key. If you wrap all of the above in a try statement and get the error 'no key could be detected', then this is most likely the problem. In that case you need to replace all \\n with \n, similar to what Sumit Agrawal suggested but the other way round. That's because in some environments a backslash is automatically added before a newline indicator such as \n in order to keep it as literal text, if that makes any sense. So you basically have to undo this.
You can simply do the following for one of the lines above:
"private_key": os.environ.get('YOUR_ENV_VAR').replace('\\n', '\n'),
But again try to print them to the log file / console to see how they actually look like. If you have any \n in the string you know you need to clean or convert them as explained. Good luck!
If you would like to use different credential files without setting the environment variable, you can use the following code:
from oauth2client import service_account
from apiclient.discovery import build
import json

client_credentials = json.load(open("<path to .json credentials>"))
credentials_token = service_account._JWTAccessCredentials.from_json_keyfile_dict(client_credentials)

bigquery_service = build('bigquery', 'v2', credentials=credentials_token)
query_request = bigquery_service.jobs()
query_data = {
    'query': (
        'SELECT TOP(corpus, 10) as title, '
        'COUNT(*) as unique_words '
        'FROM [publicdata:samples.shakespeare];')
}
query_response = query_request.query(
    projectId=project_id,
    body=query_data).execute()

print('Query Results:')
for row in query_response['rows']:
    print('\t'.join(field['v'] for field in row['f']))
Export the Google credential JSON from the command line: export GOOGLE_APPLICATION_CREDENTIALS='/path/key.json'
I hope it will work fine.
You can create a client with service account credentials using from_service_account_json():
from google.cloud import bigquery
bigqueryClient = bigquery.Client.from_service_account_json('/path/to/keyfile.json')
If there is a case where you cannot provide the credentials in a file, set GOOGLE_APPLICATION_CREDENTIALS to the JSON content itself:
As the service account key is JSON and contains double-quote characters, replace every double quote with \"
Wrap the complete JSON in double quotes
Replace every \n with \\n (on Linux) or \\\n (on Mac)
With the above changes to the service account key, if you export it as a variable, it should be recorded correctly.
Try echo $variable_name (or echo %variable_name% on Windows) to confirm it looks good.
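Note that the standard Google clients still expect GOOGLE_APPLICATION_CREDENTIALS to be a file path, so if you store the JSON content itself in a variable you will likely need to parse it yourself. A sketch (the variable name is illustrative):
import json
import os

from google.oauth2 import service_account

# hypothetical variable holding the full service-account JSON as a string
info = json.loads(os.environ["SERVICE_ACCOUNT_JSON"])
credentials = service_account.Credentials.from_service_account_info(info)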
In your project folder just type:
set GOOGLE_APPLICATION_CREDENTIALS='\path\key.json'
On Windows, update the 'environment variables for your account'. You will most likely already have a variable called GOOGLE_APPLICATION_CREDENTIALS. Simply update its path to /path/to/liquid-optics-xxxxxxxx.json (you'll most likely have that file somewhere on your machine). Then refresh your environment (cmd or whatever) to pick up the change.