environment variables not updating - python

I am using the dotenv package. I had a key saved in my .env file and updated it to a new value, but my script still outputs the old key. The ".env" file is in the root directory.
I thought that calling load_dotenv() would pick up whatever keys are in the file at that point in time and make them available to the script. What am I doing wrong?
import os
from dotenv import load_dotenv
import praw

load_dotenv()

reddit = praw.Reddit(client_id=os.getenv('reddit_personal_use'),
                     client_secret=os.getenv('reddit_api_key'),
                     user_agent=os.getenv('reddit_app_name'),
                     username=os.getenv('reddit_username'),
                     password=os.getenv('reddit_pw'))

I had to set override=True:
load_dotenv(override=True)
load_dotenv() does not override existing system environment variables. To override them, pass override=True to load_dotenv().
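A minimal sketch of the difference, assuming a .env file in the working directory that contains reddit_api_key=<new key>:

import os
from dotenv import load_dotenv

os.environ['reddit_api_key'] = 'stale-value'   # simulate a value already set in the environment

load_dotenv()                                  # default behaviour: the existing 'stale-value' is kept
print(os.getenv('reddit_api_key'))

load_dotenv(override=True)                     # the value from .env now replaces it
print(os.getenv('reddit_api_key'))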

Error setting / reading environment variables in Flask App | Access env variable in another file

In __init__.py, I set the environment variable.
import os
# set mkey environment variable
MKEY="xxxx"
os.environ["MKEY"] = MKEY
And then I try to read the environment variable in another file - file1.py
import os
token = os.environ["MKEY"]
When I attempt to run the app on localhost, I get an error:
Traceback:
token = os.environ["MKEY"]
File "/Applications/Anaconda/anaconda3/lib/python3.9/os.py", line 679, in __getitem__
raise KeyError(key) from None
KeyError: 'MKEY'
It looks like the environment variables are not being set properly, or are only accessible in the file where they are set. I don't see them here either:
for key in os.environ.keys():
    print(key + ": " + os.environ[key])
How do I set environment variables that can be accessed across all files in my Flask App? Is there a way to set this in one of the files versus using export in the root directory?
Maybe write a config file, say config.py, where you load all the environment variables into one object, e.g.
config = {'db': os.getenv('user', 'user_db')}
Set the environment variable there, as you are doing now:
os.environ['mike'] = mike
then add it to the object:
config['mike'] = os.environ['mike']
and use that config object in the other files:
from config import config
You can also write a function to set and update the configs; a sketch of this approach is shown below.
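A minimal sketch of that idea, using the MKEY variable from the question and a hypothetical set_config helper:

# config.py -- one central place to set and read environment-backed settings
import os

# set the variable once, here
os.environ["MKEY"] = "xxxx"

# collect everything the app needs into a single dict
config = {
    "db": os.getenv("user", "user_db"),   # falls back to 'user_db' if 'user' is unset
    "MKEY": os.environ["MKEY"],
}

def set_config(key, value):
    """Update both the process environment and the shared config dict."""
    os.environ[key] = value
    config[key] = value

In file1.py you would then read the value through the shared object instead of os.environ:

from config import config

token = config["MKEY"]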

How to load environment variables in Flask unit test

I'm trying to test an app which relies on several environment variables (API keys, mostly). I'd like to keep those as variables instead of putting them directly into a config file, but my unit tests (using unittest) won't run because the environment variables aren't loaded when the test mounts.
I've tried calling load_dotenv in setUp (see below) but that doesn't make a difference. How can I ensure that the test suite reads the environment variables correctly?
.flaskenv
FLASK_APP=myapp.py
BASE_URI='https://example.com'
OTHER_API_KEY='abc123itsasecret'
config.py
import os

basedir = os.path.abspath(os.path.dirname(__file__))

class Config(object):
    SQLALCHEMY_DATABASE_URI = "sqlite:///"
    API_URL = os.environ.get('BASE_URI') + "/api/v1"
    SECRET_KEY = os.environ.get('OTHER_API_KEY')
test_file.py
import os
import unittest

from dotenv import load_dotenv

import config
from myapp import app, db   # importing app here evaluates config.Config at import time

basedir = os.path.abspath(os.path.dirname(__file__))

class TestTheThing(unittest.TestCase):
    def setUp(self):
        load_dotenv(os.path.join(basedir, '.flaskenv'))
        app.config.from_object(config.Config)
        app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite://"
        db.create_all()
        self.client = app.test_client()
In the console after running python -m unittest myapp.test_file
File "/../../package/config.py", line 29, in Config
'API_URL': os.environ.get('BASE_URI') + 'api/v1/',
TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'
After reading more, I realized that I needed to load the variables in my config file, not in the test runner. Adding load_dotenv to the top of the config file allowed tests to run.
config.py
import os
from dotenv import load_dotenv
basedir = os.path.abspath(os.path.dirname(__file__))
load_dotenv(os.path.join(basedir, '.flaskenv'))
# rest of file
I spent time trying to figure out where the crash was happening. I added a breakpoint inside setUp that never got hit, so the failure had to occur earlier in the execution.
Reading the stack trace more carefully, the crash was caused by importing app at the top of the test file. When the app is imported, it evaluates the config (and therefore the environment variables) before setUp ever gets a chance to call load_dotenv. Loading the environment variables before the config object is loaded into the Flask object cleared it up.
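An equivalent sketch of the same reasoning, if you prefer to leave config.py untouched: call load_dotenv before the import that pulls in the app (the myapp module name is assumed here from FLASK_APP=myapp.py):

# test_file.py -- load the env file before anything imports the config
import os
from dotenv import load_dotenv

basedir = os.path.abspath(os.path.dirname(__file__))
load_dotenv(os.path.join(basedir, '.flaskenv'))

# only import the app afterwards, so config.Config sees BASE_URI and OTHER_API_KEY
from myapp import app, db
import config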

How to load environment variables from file using python-dotenv before loading pytest scripts?

I have tests which run inside a docker container with https://github.com/pytest-docker-compose/pytest-docker-compose, but the container takes too long to start up and shut down, so I would like docker-compose to run the tests only on the CI machine, or when needed.
For this, I define the tests like this in simple_test_runner.py:
import os

THIS_FOLDER = os.path.dirname(os.path.realpath(__file__))
RUN_TESTS_LOCALLY = os.environ.get('RUN_TESTS_LOCALLY')

if RUN_TESTS_LOCALLY:
    def test_simple(run_process, load_env):
        output, returncode = run_process("python3 " + THIS_FOLDER + "/simple_test.py")
        assert returncode == 0
else:
    def test_simple(function_scoped_container_getter):
        container = function_scoped_container_getter.get("container_name")
        exec = container.create_exec("python3 /simple_test.py")
        ...
This works fine if I export RUN_TESTS_LOCALLY=1 before calling pytest -vs ., but if I try to use https://github.com/theskumar/python-dotenv in my conftest.py:
import pytest
from dotenv import load_dotenv

@pytest.fixture(scope='session', autouse=True)
def load_env():
    print("Loading .env file")
    load_dotenv()
The environment variables are only loaded after pytest has already imported simple_test_runner.py, so my tests always run inside docker instead of outside it, even when RUN_TESTS_LOCALLY=1 is defined in the .env file.
How can I make pytest call load_dotenv() before the if RUN_TESTS_LOCALLY: switch in my tests is evaluated?
Or do you know an alternative to if RUN_TESTS_LOCALLY: that lets me use load_dotenv() and still switch between running my tests inside docker-compose or not?
Related:
https://github.com/quiqua/pytest-dotenv
How to load variables from .env file for pytests
pytest -- how do I use global / session-wide fixtures?
The code I originally posted is working:
import sys
import pytest
from dotenv import load_dotenv

@pytest.fixture(scope='session', autouse=True)
def load_env(request):
    file = THIS_FOLDER + '/.env'   # THIS_FOLDER defined as in simple_test_runner.py
    if request.config.getoption("verbose") > 0:
        print("Loading", file, file=sys.stderr)
    load_dotenv(dotenv_path=file)
The problem was that when I tested this, I had set my environment variable to RUN_TESTS_LOCALLY= (the empty string), which caused dotenv not to override the existing environment variable. Once I unset the variable with bash's unset RUN_TESTS_LOCALLY, dotenv finally loaded the environment file correctly.
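A small sketch of that behaviour (assuming a .env file that defines RUN_TESTS_LOCALLY=1); passing override=True would be an alternative to unsetting the variable:

import os
from dotenv import load_dotenv

os.environ['RUN_TESTS_LOCALLY'] = ''           # an empty export, as in the shell above

load_dotenv()                                  # default: the existing empty string is kept
print(repr(os.environ['RUN_TESTS_LOCALLY']))   # ''

load_dotenv(override=True)                     # the value from .env replaces the empty string
print(repr(os.environ['RUN_TESTS_LOCALLY']))   # '1'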

calling settings.py from another python script

I have a .env file where I have added environment settings. I wrote settings.py, which reads the .env file and stores the values of those settings. I want to import settings.py from other_script.py, but I am getting None as the value.
When I execute settings.py directly, it prints a value. When I execute other_script.py, which imports settings, the values become None.
settings.py:
import os
from dotenv import load_dotenv
from pathlib import Path
env_path = Path('.') / '.env'
load_dotenv(env_path)
MONGO_IP = os.getenv("MONGO_IP")
MONGO_PORT = os.getenv("MONGO_PORT")
MONGO_DB = os.getenv("MONGO_DB")
print(MONGO_DB)
other_script.py:
from pymongo import MongoClient
from settings import MONGO_IP, MONGO_PORT, MONGO_DB
print(MONGO_DB)
mongo_client = MongoClient(MONGO_IP, MONGO_PORT)[MONGO_DB]
So when I execute other_script.py, the keys should return a value. What am I missing?
Two things to check:
1. settings.py and other_script.py are in the same folder. Without this, other_script.py will not be able to find settings.py.
2. Whether load_dotenv(env_path) is actually loading your .env file. If the MONGO_* values are not set properly, you cannot read them (see the sketch after this answer).
If the files are not in the same folder, the issue may be that you don't have an __init__.py file in the folder you want to import from, since it is needed to make that folder a package. The __init__.py file can be empty.
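One likely reason the second check fails (my assumption, not stated in the answer): Path('.') resolves against the current working directory, so the .env file is only found when the script is launched from the folder that contains it. A sketch of settings.py that resolves the path relative to the file itself:

# settings.py -- locate .env next to this file, regardless of where
# other_script.py is launched from
import os
from pathlib import Path
from dotenv import load_dotenv

env_path = Path(__file__).resolve().parent / '.env'
load_dotenv(dotenv_path=env_path)

MONGO_IP = os.getenv("MONGO_IP")
MONGO_PORT = os.getenv("MONGO_PORT")
MONGO_DB = os.getenv("MONGO_DB")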

How to access an AWS Lambda environment variable from Python

Using the new environment variable support in AWS Lambda, I've added an env var via the web UI for my function.
How do I access this from Python? I tried:
import os
MY_ENV_VAR = os.environ['MY_ENV_VAR']
but my function stopped working (if I hard code the relevant value for MY_ENV_VAR it works fine).
AWS Lambda environment variables can be defined using the AWS Console, CLI, or SDKs. This is how you would define an AWS Lambda that uses an LD_LIBRARY_PATH environment variable using AWS CLI:
aws lambda create-function \
  --region us-east-1 \
  --function-name myTestFunction \
  --zip-file fileb://path/package.zip \
  --role role-arn \
  --environment Variables={LD_LIBRARY_PATH=/usr/bin/test/lib64} \
  --handler index.handler \
  --runtime nodejs4.3 \
  --profile default
Once created, environment variables can be read using the support your language provides for accessing the environment, e.g. using process.env for Node.js. When using Python, you would need to import the os library, like in the following example:
...
import os
...
print("environment variable: " + os.environ['variable'])
Resource Link:
AWS Lambda Now Supports Environment Variables
Assuming you have created the .env file alongside your settings module:
.
├── .env
└── settings.py
Add the following code to your settings.py
# settings.py
from os.path import join, dirname
from dotenv import load_dotenv
dotenv_path = join(dirname(__file__), '.env')
load_dotenv(dotenv_path)
Alternatively, you can use the find_dotenv() method, which will try to find a .env file by (a) guessing where to start, using __file__ or the working directory -- allowing this to work in non-file contexts such as IPython notebooks and the REPL -- and then (b) walking up the directory tree looking for the specified file, called .env by default.
from dotenv import load_dotenv, find_dotenv
load_dotenv(find_dotenv())
Now you can access the variables, whether they come from the system environment or were loaded from the .env file.
Resource Link:
https://github.com/theskumar/python-dotenv
gepoggio answered in this post: https://github.com/serverless/serverless/issues/577#issuecomment-192781002
A workaround is to use python-dotenv:
https://github.com/theskumar/python-dotenv
import os
import dotenv

here = os.path.dirname(os.path.realpath(__file__))   # directory containing this file

dotenv.load_dotenv(os.path.join(here, "../.env"))
dotenv.load_dotenv(os.path.join(here, "../../.env"))
It tries to load it twice because when run locally the file is in project/.env, and when running on Lambda the .env is located in project/component/.env.
Both
import os
os.getenv('MY_ENV_VAR')
And
os.environ['MY_ENV_VAR']
are feasible solutions, just make sure in the lambda GUI that the ENV variables are actually there.
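The practical difference between the two (general Python behaviour, not specific to Lambda): os.getenv() returns None, or a default you supply, when the variable is missing, while os.environ[...] raises KeyError:

import os

# os.getenv: returns a fallback instead of failing when the variable is missing
value = os.getenv('MY_ENV_VAR', 'fallback-value')

# os.environ[...]: raises KeyError when the variable is missing
try:
    value = os.environ['MY_ENV_VAR']
except KeyError:
    print("MY_ENV_VAR is not configured for this function")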
I used this code; it covers both cases: reading an environment variable from inside the handler and reading one from outside the handler.
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Trying new lambda stuff"""

import os
import configparser


class BqEnv(object):
    """Env and self variables settings"""

    def __init__(self, service_account, configfile=None):
        config = self.parseconfig(configfile)
        self.env = config
        self.service_account = service_account

    @staticmethod
    def parseconfig(configfile):
        """Connection and conf parser"""
        config = configparser.ConfigParser()
        config.read(configfile)
        env = config.get('BigQuery', 'env')
        return env

    def variable_tests(self):
        """Trying conf as a lambda variable"""
        my_env_var = os.environ['MY_ENV_VAR']
        print(my_env_var)
        print(self.env)
        return True


def lambda_handler(event, context):
    """Trying env variables."""
    print(event)
    configfile = os.environ['CONFIG_FILE']
    print(configfile)
    print(type(str(configfile)))
    bqm = BqEnv('some-json.json', configfile)
    bqm.variable_tests()
    return True
I tried this with a demo config file that has this:
[BigQuery]
env = prod
And the corresponding MY_ENV_VAR and CONFIG_FILE environment variables were set in the Lambda console configuration.
Hope this can help!
os.environ["variable_name"]
In the configuration section of AWS lambda, make sure you declare the variable with the same name that you're trying to access here. For this example, it should be variable_name
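A minimal handler sketch tying this together (variable_name is the placeholder key from the answer above and must match what is declared in the function's configuration):

import os

def lambda_handler(event, context):
    # reads the value declared under "Environment variables" in the Lambda configuration
    value = os.environ["variable_name"]
    return {"variable_name": value}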
