I am using MongoEngine (version 0.9.0) with Django (version 1.8).
This is my settings.py:

from mongoengine import connect

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.dummy'
    }
}

MONGO_DBNAME = "mydatabasename"
MONGO_HOSTNAME = "localhost"

connect(MONGO_DBNAME, host=MONGO_HOSTNAME)
I want to have fixtures for the application. I have created initial_data.json in the myapp/fixtures/ location.
When I run the command python manage.py dumpdata, I get the following error:
CommandError: Unable to serialize database: settings.DATABASES is improperly configured. Please supply the ENGINE value. Check settings documentation for more details.
Questions:
1) Is there any workaround for this problem?
2) Is there any other way to load the initial data?
References at this link
Thank you
MongoEngine isn't a backend (in Django terminology). It has its own models (schemas) and a document-object mapper (like an ORM, but for document databases), but it doesn't provide a Django database backend adapter.
You can still use it, but there are issues when working with out-of-the-box Django features like tests, fixtures, etc.
You need to write your own loader, sadly but true.
I see two options here:
You can try to use Django MongoDB Engine
You can write your own loader for MongoDB
I wrote my own fixture loader for tests.
I have a JSON file that maps each collection to the fixture file that should be loaded into the DB.
Here is a quick example:
import bson
import os

from django.conf import settings
from json import loads
from mongoengine.connection import get_db


def _get_db(self):
    self.db = get_db()


def _load_fixtures(self, clear_before_load=True):
    """
    Load fixtures from fixtures/{{DB_NAME}}/{{COLLECTION_NAME}} into the DB before each test.

    The file fixtures.json maps each collection name to its fixture file name.
    """
    fixture_path = lambda file_name: os.path.join(settings.FIXTURES_DIR, self.db.name, file_name)

    with open(settings.COLLECTION_FIXTURES_PATH) as file_object:
        db_collections = loads(file_object.read())

    for collection_name, filename in db_collections.items():
        collection = self.db[collection_name]
        if clear_before_load:
            collection.remove()

        path = fixture_path(filename)
        if os.path.exists(path) and os.path.isfile(path):
            with open(path, 'rb') as raw_data:  # BSON fixture files are binary data
                collection_data = bson.decode_all(raw_data.read())
                for document in collection_data:
                    collection.save(document)
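As a usage sketch, these two helpers could be wired into a test case roughly like this (MongoFixtureTestCase is a hypothetical name; it assumes FIXTURES_DIR and COLLECTION_FIXTURES_PATH exist in your Django settings):

import unittest

class MongoFixtureTestCase(unittest.TestCase):
    def setUp(self):
        _get_db(self)          # attach self.db via mongoengine's get_db()
        _load_fixtures(self)   # wipe and reload the mapped collections

    def test_db_is_ready(self):
        self.assertIsNotNone(self.db)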
There is no fixture support in MongoEngine, and I don't think the MongoEngine team is continuing the Django plugin as of version 0.9.0.
What I ended up doing to load initial data for MongoDB was to create a script called startup.py in my project folder.
startup.py:
from {{app}}.models import Sample

def init():
    if Sample.objects(name="test").count() == 0:  # a flag to prevent initial data repetition
        Sample(name="test").save()
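For context, a minimal Sample model assumed by this sketch (not shown in the original) could be a plain MongoEngine document:

# {{app}}/models.py - hypothetical model backing the startup.py snippet above
from mongoengine import Document, StringField

class Sample(Document):
    name = StringField(required=True)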
Next, run this script on Django's startup. The entry point of a Django project is wsgi.py, where DJANGO_SETTINGS_MODULE is first loaded:
import os

from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "{{project_name}}.settings")

import {{project_name}}.startup as startup
startup.init()

application = get_wsgi_application()
With this setup, when you run python manage.py runserver, the init() on startup.py will run and the data you set will be inserted to the DB.
Hope this helps.
Related
I'm a Node developer, but I need to create a Django app (I'm a total beginner in Django).
I need to read some data from an API, but of course I shouldn't hardcode the API URL.
So, having API_BASE_URL=api.domain.com in my .env file, in Node I would access the variable in my functions this way:
import 'dotenv/config';
import axios from 'axios';

const baseUrl = process.env.API_BASE_URL;

function getApiData() {
    return axios.get(baseUrl);
}
So how would be the Python/Django version of it?
Say I have the function below:

import ???

def get_api_data():
    url = ????
Using the django-environ package:

import environ

env = environ.Env()
# reading .env file
environ.Env.read_env()

def get_api_data():
    url = env('API_BASE_URL')
Let's say you have a .env file saved in the same directory as your manage.py file.
You can then go to settings.py and, using the python-decouple package, do:
from decouple import config
API_BASE_URL = config('API_BASE_URL')
Assuming your .env file looks like:
API_BASE_URL='some.url'
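Elsewhere in your code you can then read that value through Django's settings object. A minimal sketch of the function from the question, using the requests library here (an assumption, since the question doesn't say how the API is called):

import requests
from django.conf import settings

def get_api_data():
    # settings.API_BASE_URL comes from the .env file via decouple's config()
    return requests.get(settings.API_BASE_URL)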
Based on Flask's Configuration Handling documentation, the Configuring from Files section mentions the possibility of configuring the app from files, but it provides no example of files that are not Python files.
Is it possible to configure apps via files like config.yml or config.toml?
My current Flask app has configurations for two distinct databases, and since I am using flask-restplus there is additional configuration for the Swagger documentation.
Snippet:
from flask import Flask

app = Flask(__name__)

def configure_app(flask_app):
    # MongoDB settings
    flask_app.config['MONGO_URI'] = 'mongodb://user:password@mongo_db_endpoint:37018/myDB?authSource=admin'
    flask_app.config['MONGO_DBNAME'] = 'myDB'

    # InfluxDB settings
    flask_app.config['INFLUXDB_HOST'] = 'my_influxdb_endpoint'
    flask_app.config['INFLUXDB_PORT'] = 8086
    flask_app.config['INFLUXDB_USER'] = 'influx_user'
    flask_app.config['INFLUXDB_PASSWORD'] = 'influx_password'
    flask_app.config['INFLUXDB_SSL'] = True
    flask_app.config['INFLUXDB_VERIFY_SSL'] = False
    flask_app.config['INFLUXDB_DATABASE'] = 'IoTData'

    # Flask-RESTPlus Swagger configuration
    flask_app.config['RESTPLUS_SWAGGER_UI_DOC_EXPANSION'] = 'list'
    flask_app.config['RESTPLUS_VALIDATE'] = True
    flask_app.config['RESTPLUS_MASK_SWAGGER'] = False
    flask_app.config['ERROR_404_HELP'] = False

def main():
    configure_app(app)

if __name__ == "__main__":
    main()
I would like to avoid setting a large number of environment variables and wish to configure them using a config.toml file instead.
How is this achieved in Flask?
You can use a .cfg file and from_envvar to achieve this. Create a config file with all your settings (the file is parsed as Python, so values need to be valid Python literals).
my_config.cfg

MONGO_URI = 'mongodb://user:password@mongo_db_endpoint:37018'
..
..
ERROR_404_HELP = False

Then set the environment variable APP_ENVS=my_config.cfg. Now all you need to do is use from_envvar, provided by Flask:
def configure_app(flask_app):
    flask_app.config.from_envvar('APP_ENVS')
    # configure any other things
    # register blueprints if you have any
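For example, a hypothetical entry point could set APP_ENVS before configuring the app (in practice you would usually export the variable in your shell or service definition instead):

import os

# Reuses app and configure_app from the snippets above; the path is made up.
os.environ.setdefault('APP_ENVS', '/path/to/my_config.cfg')

configure_app(app)
app.run()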
Quoting from documentation:
Configuring from Data Files
It is also possible to load configuration from a file in a format of
your choice using from_file(). For example to load from a TOML file:
import toml
app.config.from_file("config.toml", load=toml.load)
Or from a JSON file:
import json
app.config.from_file("config.json", load=json.load)
EDIT: The above feature is new in Flask 2.0.
Link to the documentation reference:
Class Flask.config, method from_file(filename, load, silent=False)
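Applied to the settings from the question, a config.toml might look roughly like this (the values are the question's placeholders; note that from_file() only picks up uppercase keys):

MONGO_URI = "mongodb://user:password@mongo_db_endpoint:37018/myDB?authSource=admin"
MONGO_DBNAME = "myDB"
INFLUXDB_HOST = "my_influxdb_endpoint"
INFLUXDB_PORT = 8086
INFLUXDB_SSL = true
RESTPLUS_VALIDATE = true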
I would like certain parts of my code not to run while it is being run locally.
This is because I am having trouble installing certain dependencies locally that the code needs to run.
Specifically, memcache doesn't work for me locally.
@app.route('/some_url_route/')
@cache.cached(timeout=2000)  # ignore this locally
def show_a_page():
    ...
How would the app somehow ignore the cache section of the code above, when running locally?
In my code I follow a Django-esque model and keep all my settings in a main settings.py file.
In that file I put DEBUG = True for my local environment (and False for production), and then use:
from settings import DEBUG

if DEBUG:
    # Do this as it's development
    pass
else:
    # Do this as it's production
    pass
So in your cache decorator, include a similar check that only uses memcached when DEBUG is False.
You can then load all these settings into your Flask setup as detailed in the configuration documentation.
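A minimal sketch of that idea, assuming the Flask-Cache extension ('null' is its do-nothing backend, 'memcached' its memcached backend):

from flask import Flask
from flask_cache import Cache

from settings import DEBUG

app = Flask(__name__)
# Disable caching locally, use memcached in production.
app.config['CACHE_TYPE'] = 'null' if DEBUG else 'memcached'
cache = Cache(app)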
If you're using Flask-Cache, then just edit the settings:

if app.debug:
    app.config['CACHE_TYPE'] = 'null'  # the cache that doesn't cache
cache = Cache(app)
...
A better approach would be to have separate settings for production and development. I use a class-based approach:
class BaseSettings(object):
    ...

class DevelopmentSettings(BaseSettings):
    DEBUG = True
    CACHE_TYPE = 'null'
    ...

class ProductionSettings(BaseSettings):
    CACHE_TYPE = 'memcached'
    ...
And then import the appropriate object when you set up your app (config.py is the name of the file which contains the settings):
app.config.from_object('config.DevelopmentSettings')
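To avoid editing code per environment, the object path itself can come from an environment variable; a hypothetical variant (APP_SETTINGS is an assumed variable name):

import os
from flask import Flask

app = Flask(__name__)
# Fall back to development settings when APP_SETTINGS is not set.
app.config.from_object(os.environ.get('APP_SETTINGS', 'config.DevelopmentSettings'))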
As per the example provided in the SQLAlchemy documentation, to cache a SQLAlchemy query we are supposed to do this:
from caching_query import FromCache
# load Person objects. cache the result under the namespace "all_people".
print "loading people...."
people = Session.query(Person).options(FromCache("default", "all_people")).all()
I have the following configuration for beaker in development.ini
cache.regions = day, hour, minute, second
cache.type = file
cache.data_dir = %(here)s/cache/sess/data
cache.lock_dir = %(here)s/cache/sess/lock
cache.second.expire = 1
cache.minute.expire = 60
cache.hour.expire = 3600
cache.day.expire = 86400
When I use the above example code in my application, data is not cached in the cache folder, so I am assuming memory-based caching is the default. Is it possible to switch the SQLAlchemy cache type to file-based, or am I getting it wrong?
Your question is missing some details, but let me try:
the first parameter passed to FromCache() is the name of a Beaker cache region; it should match one of the configured regions, which is not the case here ("default" is not among your regions). Or perhaps you configure a default region in the code (I'd expect a BeakerException to be thrown if the region were unknown)?
you need the pyramid_beaker module installed and included in your Pyramid project's configuration. I suggest you follow the pyramid_beaker manual's Setup section.
you need some extra code in __init__.py of your application in order to propagate the .ini file settings to Beaker. This is described in the Beaker cache region support section of the manual.
And here's a working sample from my current project, configuring both Beaker-based sessions and caching (all irrelevant parts removed):
from pyramid.config import Configurator
from pyramid_beaker import set_cache_regions_from_settings
from pyramid_beaker import session_factory_from_settings

def main(global_config, **settings):
    # Configure Beaker caching/sessions
    set_cache_regions_from_settings(settings)
    session_factory = session_factory_from_settings(settings)

    config = Configurator(settings=settings)
    config.set_session_factory(session_factory)
    config.include('pyramid_beaker')

    config.add_static_view('static', 'static', cache_max_age=3600)
    config.add_route('home', '/')
    config.scan()
    return config.make_wsgi_app()
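Tying this back to the question: with the development.ini shown, FromCache("default", ...) refers to a region that was never configured, so nothing lands in the file cache. A minimal sketch of one fix is to point the query at a region you did configure (or add a default region to the .ini with its own expire setting):

# Sketch: use the "hour" region from development.ini so the file-based
# backend (cache.type = file) and its data_dir/lock_dir apply to this query.
people = (Session.query(Person)
          .options(FromCache("hour", "all_people"))
          .all())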
In trying to find a place to store and save settings beyond settings.py and the database, I used an environment.json for environment variables. I import these in settings.py.
My problem is that when I try to change or store new values in my environment, env, settings.py does not notice the change - perhaps because of when, and how many times, settings.py is read by Django.
Is there a way I would be able to use my environment variables the way I want, as attempted below?
# settings.py
import json

with open('/home/dotcloud/environment.json') as f:
    env = json.load(f)

EMAIL_PORT = env.get('EMAIL_PORT', '500')
# views.py
import json
import os

from django.shortcuts import render

def site_configuration(request):
    with open('/home/dotcloud/environment.json') as f:
        env = json.load(f)
    if request.method == 'POST':
        os.environ['EMAIL_PORT'] = request.POST['email_port']
    return render(request, ...)
# python manage.py shell demo
>>> import json
>>> with open('/home/dotcloud/environment.json') as f:
... env = json.load(f)
...
>>> project_settings.EMAIL_PORT
'500'
>>> env['EMAIL_PORT']
Traceback (most recent call last):
File "<console>", line 1, in <module>
KeyError: 'EMAIL_PORT'
>>> env['EMAIL_PORT'] = "123"
>>> env['EMAIL_PORT']
'123'
>>> project_settings.EMAIL_PORT
'500'
>>> project_settings.EMAIL_PORT == env['EMAIL_PORT']
False
And if not, how else could I store changeable settings that are retrieved by settings.py somewhere in my Django project?
You might want to look into foreman (GitHub) or honcho (GitHub). Both of these look for a .env file in your current directory from which to load local environment variables.
My .env looks like this for most projects (I use dj-database-url for database configuration):
DATABASE_URL=sqlite://localhost/local.db
SECRET_KEY=<a secret key>
DEBUG=True
In your settings.py file, you can load these settings from os.environ like this:
import os
DEBUG = os.environ.get('DEBUG', False)
If there are required settings, you can assert their presence before trying to set them:
assert 'SECRET_KEY' in os.environ, 'Set SECRET_KEY in your .env file!'
SECRET_KEY = os.environ['SECRET_KEY']
I've been using this method of handling local settings for the last few projects I've started and I think it works really well. One caveat is to never commit your .env to source control. These are local settings that exist only for the current configuration and should be recreated for a different environment.
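Since dj-database-url was mentioned above, this is roughly how the DATABASE_URL value from the .env file ends up in Django's settings (a sketch of the relevant settings.py lines only):

import dj_database_url

# Parses the DATABASE_URL environment variable that foreman/honcho loaded from .env
DATABASES = {'default': dj_database_url.config()}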
I see the question changed slightly, the original answers are still below but this one has a slightly different answer:
First, make sure you are using the right settings.py (print 'This file is being loaded' should do the trick).
Second, personally I would advise against using JSON files for config since they are less dynamic than Python files, but it should work regardless.
My recommended way of doing something like this:
create a base_settings.py file with your standard settings
create a settings.py which will be your default settings import. This file should have a from base_settings import * at the top to inherit the base settings.
If you want to have a custom settings file, dotcloud_settings.py for example, simply give it a from settings import * (or from base_settings import *) at the top and set the environment variable DJANGO_SETTINGS_MODULE to dotcloud_settings or your_project.dotcloud_settings, depending on your setup.
Do note that you should be very careful with importing Django modules from these settings files. If any module does a from django.conf import settings it will stop parsing your settings after that point.
As for using json files, roughly the same principle of course:
Once again, make sure you don't have anything that imports django.conf.settings here
Make all of the variables within your json file global to your settings file:
import json

with open('/home/dotcloud/environment.json') as f:
    env = json.load(f)

# A little hack to make all variables within our env global
globals().update(env)
Regardless though, I'd recommend turning this around and letting the settings file import this module instead.
Also, Django doesn't listen to environment variables by default (besides DJANGO_SETTINGS_MODULE), so that might be the problem too.