I have a situation that requires me to use a list of databases. One database for each tenant. This information is kept in a default database. So my DATABASE_APPS_MAPPING is currently hardcoded with the list of databases that I fetch manually from the default database and save it in the settings.py file.
But every time we upgrade our service, I need to manually copy over the database values into DATABASE_APPS_MAPPING. This is getting to be a pain and I'd like to build this set using code once when django initialises. Maybe have a module dbinit.py
list_of_db = []

def initialize(defaultDbName):
    # Build up list_of_db
Then in settings.py, I can just do :-
import dbinit
.
.
.
DATABASE_APPS_MAPPING=dbinit.list_of_db
My specific question is: where do I call dbinit.initialize()? Should that be urls.py, or wsgi.py? Thanks.
I believe I have the answer. I spent a couple of days referring to other Stack Overflow questions.
The code of dbinit.py needs to be like so:
list_of_db = []
initialized = False

def initialize(defaultDbName):
    global initialized
    if not initialized:
        build_up_list(list_of_db, defaultDbName)  # Build the list.
        initialized = True  # Without this, the guard never takes effect.

initialize("mydefaultDbContainingInfoOnOtherDBs")
Since imports are performed by the Python interpreter while holding the import lock, we are assured that only one thread will execute the code in the above module's initialize method. So even if the module is imported multiple times in different threads' execution contexts, initialization occurs only once.
The issue I currently have is how to keep track of an API key that is stored in a MySQL database. Each backend I have lives in its own file.
api.add_resource(SQL, "/get/<string:name>/<string:hash>/<string:id>/<string:key>/<string:dep>")
api.add_resource(Sync, "/sync/post")
api.add_resource(Update, "/update")
api.add_resource(Login, "/login")
SQL comes from sql.py, Sync from sync.py, etc. I'm not sure how I'm supposed to keep track of the API key. From my research, it seems I need to use something called a session, but I'm not sure how to get it to work. The documentation I looked at seems to assume everything is in one file. Following the website's instructions one-to-one doesn't work either, as from flask.ext.session import session gives an error along the lines of: module flask.ext.session does not exist, or unable to find module.
And I don't want to keep all my classes in one file, because each class is at least 50 lines long, with one of them over 150 lines. That's what made me split each backend into its own file in the first place.
I've also tried making a Host class which inherits from all my backends, calling SQL.__init__(self, key), Sync.__init__(self, key), and so on. My initial theory was that each backend's constructor could set a variable self.key = key. But that didn't work either, because of circular imports, which could only be fixed by putting all my backend classes into the same file. Which, again, is not what I want.
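One way around circular imports, sketched below with hypothetical names (SharedState is not from the question): keep the key in a tiny module that every backend imports, so no backend ever imports another backend. The separate files are simulated in one block to keep the sketch runnable:

```python
# shared_state.py -- holds only the key; imports nothing from the backends.
class SharedState:
    api_key = None

# sql.py would do `from shared_state import SharedState` instead of
# importing sync.py or a Host class.
class SQL:
    def get_key(self):
        return SharedState.api_key

# sync.py likewise imports only shared_state.
class Sync:
    def get_key(self):
        return SharedState.api_key

# At startup (in the real app, after reading the key from MySQL), set it once.
SharedState.api_key = "key-loaded-from-mysql"
```

Because the dependency arrows all point at the small state module and never between backends, the import cycle disappears while each resource class stays in its own file.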
This is my first django application and I looked all over the place to find an answer, to no avail.
I created my models and I now need to initialize the values of one of the classes. I could do it using the admin page, one by one, but I want anyone using my application to be able to just load it for the first time and have all the correct objects (and associated records in the database) created automatically.
Please help
If you want to populate the database, check the documentation for initial data. You can use JSON, XML or YAML (with PyYAML installed). I think this is what you are looking for, as your question is not that clear.
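For example, a minimal JSON fixture (the model and field names here are placeholders, not from the question) saved as myapp/fixtures/initial_data.json and loaded with manage.py loaddata initial_data:

```json
[
  {
    "model": "myapp.category",
    "pk": 1,
    "fields": { "name": "General" }
  }
]
```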
I'm using Django Python framework, and MySQL DBMS.
In the screenshot below, I'm creating the new_survey_draft object using SurveyDraft.objects.create() as shown, assuming that it should create a new row in the surveydraft DB table. But as also shown in the screenshot, and after debugging my code, the new_survey_draft object was created with id=pk=270, while the DB table shown in the other window to the right doesn't have a new row with id=270.
Even with a breakpoint set in publish_survey_draft(), called after the object instantiation, SurveyDraft.objects.get(pk=270) returned the object, yet there was still no row with id=270 in the DB table.
And finally, after resuming the code and returning from all definitions, the row was successfully added to the DB table with the id=270.
I'm wondering what's happening behind the scenes. Is it possible that Django stores data in objects without persisting to the DB in real time, and only persists the data all together at some later execution point?
I've been stuck in this for hours and couldn't find anything helpful online, so I really appreciate any advice regarding the issue.
After digging deep into this issue, I found that there is a concept called atomic requests, enabled in my Django project by setting ATOMIC_REQUESTS to True in the DATABASES dictionary in settings.py, as explained here:
It works like this. Before calling a view function, Django starts a transaction. If the response is produced without problems, Django commits the transaction. If the view produces an exception, Django rolls back the transaction.
That's why the changes were not persisting to the database while I was debugging with breakpoints: the changes are only committed to the DB once the successful response is returned.
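The relevant settings fragment looks like this (the engine and database name are placeholders; a per-view opt-out exists via Django's transaction.non_atomic_requests decorator):

```python
# settings.py (sketch): ATOMIC_REQUESTS wraps every view in one transaction
# that commits only after the view returns successfully.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "mydb",           # placeholder database name
        "ATOMIC_REQUESTS": True,  # commit on success, roll back on exception
    }
}
```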
I have a file with a bunch of data common between several projects. The data needs to be loaded into the Django database. The file doesn't change that much, so loading it once on server start is sufficient. Since the file is shared between multiple projects, I do not have full control over the format, so I cannot convert this into a fixture or something.
I tried loading it in ready(), but then I run into a problem when creating a new database or migrating an existing one, since apparently ready() is called before migrations are complete, and I get errors from using models whose underlying tables don't exist yet. I tried the class_prepared signal handler instead, but the loading process uses more than one model, so I can't really be sure all required model classes are prepared. Also, ready() doesn't seem to be called when running tests, so unit tests fail because the data is missing. What is the right place to do something like this?
It seems that what I am looking for doesn't exist. Django trusts the user to deal with migrations and such and doesn't check the database on load. So there is no place in the system where you can load some data on system start and be sure that you can actually load it. What I ended up doing is loading the data in ready(), but doing a sanity check first by calling MyModel.objects.exists() in a try/except block and returning if there was an exception. This is not ideal, but I haven't found any other way.
I need to populate my database with a bunch of dummy entries (around 200+) so that I can test the admin interface I've made, and I was wondering if there was a better way to do it. I spent the better part of yesterday trying to fill it in by hand (i.e. by wrapping stuff like my_model(title="asdfasdf", field2="laksdj"...) in a bunch of for x in range(0, 200): loops) and gave up because it didn't work the way I expected it to. I think this is what I need to use, but don't you need to have (existing) data in the database for this to work?
Check this app
https://github.com/aerosol/django-dilla/
Let's say you wrote your blog application (oh yeah, your favorite!) in Django. Unit tests went fine, and everything runs extremely fast, even those ORM-generated ultra-long queries. You've added several categorized posts and it's still stable as a rock. You're quite sure the app is efficient and ready for live deployment. Right? Wrong.
You can use fixtures for this purpose, and the loaddata management command.
One approach is to do it like this.
Prepare your test database.
Use dumpdata to create JSON export of the database.
Put this in the fixtures directory of your application.
Write your unit tests to load this "fixture": https://docs.djangoproject.com/en/2.2/topics/testing/tools/#django.test.TransactionTestCase.fixtures
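The steps above can be sketched as shell commands (the app and fixture names are placeholders):

```shell
# Step 2: export the prepared test database to JSON.
python manage.py dumpdata myapp --indent 2 > myapp/fixtures/test_data.json

# Step 4: load it back wherever needed; TransactionTestCase subclasses
# can instead list it in `fixtures = ["test_data"]`.
python manage.py loaddata test_data
```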
Django fixtures provide a mechanism for importing data on syncdb. However, doing this initial data propagation is often easier via Python code. The technique you outline should work, either via syncdb or a management command. For instance, via syncdb, in my_app/management.py:
from django.db.models import signals

from my_app.models import MyModel  # import added; adjust to your app path

def init_data(sender, **kwargs):
    for i in range(1000):
        MyModel(number=i).save()

signals.post_syncdb.connect(init_data)
Or, in a management command in myapp/management/commands/my_command.py:
from django.core.management.base import BaseCommand, CommandError

from myapp.models import MyModel  # absolute import; bare `from models import` breaks

class Command(BaseCommand):  # the class must be named Command for Django to find it
    def handle(self, *args, **options):
        if len(args) > 0:
            raise CommandError('need exactly zero arguments')
        for i in range(1000):
            MyModel(number=i).save()
You can then export this data to a fixture, or continue importing using the management command. If you choose to continue to use the syncdb signal, you'll want to conditionally run the init_data function to prevent the data getting imported on subsequent syncdb calls. When a fixture isn't sufficient, I personally like to do both: create a management command to import data, but have the first syncdb invocation do the import automatically. That way, deployment is more automated but I can still easily make modifications to the initial data and re-run the import.
I'm not sure why you require any serialization. As long as you have setup your Django settings.py file to point to your test database, populating a test database should be nothing more than saving models.
for x in range(0, 200):
    m = my_model(title=random_title(), field2=random_string(), ...)
    m.save()
There are better ways to do this, but if you want a quick test set, this is the way to go.
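The random_title / random_string helpers used above might look like this (names and lengths are arbitrary; plain dicts stand in for model instances so the sketch runs outside Django):

```python
import random
import string

def random_string(length=12):
    # Random lowercase string, e.g. for a CharField.
    return "".join(random.choice(string.ascii_lowercase) for _ in range(length))

def random_title(words=2):
    # A couple of capitalized pseudo-words.
    return " ".join(random_string(6).capitalize() for _ in range(words))

# Stand-in for: my_model(title=..., field2=...).save()
rows = [{"title": random_title(), "field2": random_string()} for _ in range(200)]
```

In a real Django shell you would replace the dict with the model constructor and call save(), exactly as in the loop above.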
The app recommended by the accepted answer is no longer maintained; however, django-seed can be used as a replacement:
https://github.com/brobin/django-seed
I would recommend django-autofixtures to you. I tried both django_seed and django-autofixtures, but django_seed has a lot of issues with unique keys.
django-autofixtures takes care of unique, primary and other DB constraints while filling up the database.