multiple sqlite databases creation - python

I have a Django application that uses MongoDB via MongoEngine. I need to create several SQLite databases: each SQLite database is associated with one document of a MongoDB collection (for example, MainDocument), and duplicates the collection documents that depend on that MainDocument through a one-to-many relation. MainDocument currently holds 1074 documents, so I need to create 1074 SQLite databases with different names.
Right now I fill one default SQLite database with the Django ORM, copy it under another name, then truncate all its tables and start again for the next document. It takes a long time. How can I do this faster?
Is it possible with the Django ORM without configuring 1074 database routers?
I've tried using ramfs, but it is still slow.
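Since an SQLite database is a single file, one way to avoid the truncate-and-refill cycle is to populate a template database once (for example with the Django ORM) and then copy the finished file for each MainDocument, inserting only the per-document rows afterwards. A minimal stdlib sketch under that assumption (the `item` table and helper names are illustrative, not from the original setup):

```python
import shutil
import sqlite3

def build_template(path):
    """Create and populate the shared template database exactly once."""
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE item (id INTEGER PRIMARY KEY, name TEXT)")
    conn.commit()
    conn.close()

def clone_for_document(template_path, target_path, names):
    """Copy the template file, then insert only the document-specific rows."""
    # A file copy is far cheaper than re-filling tables through the ORM.
    shutil.copyfile(template_path, target_path)
    conn = sqlite3.connect(target_path)
    conn.executemany("INSERT INTO item (name) VALUES (?)", [(n,) for n in names])
    conn.commit()
    conn.close()
```

With this pattern the expensive ORM work happens once; each of the 1074 databases costs one file copy plus its own inserts, and no truncation step is needed.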

Related

Create postgres database from pony ORM

Is it possible to create a new database from Pony ORM? We couldn't find it in the Pony ORM docs; it is always assumed that the database already exists, or that an SQLite file is being used.
We would like to create a testing database and drop it afterwards.
No. Per:
https://docs.ponyorm.org/api_reference.html#sqlite
Supported databases
If you look at the .bind() API for the various databases, SQLite is the only one with create_db. This is because in SQLite, creating a database is just creating a single file. The other engines need to go through their own tooling to initialize a database, so you will need an independent script that creates the database before Pony binds to it.
If you have your sqlite database file, you can try using pgloader.

Create an SQL inserting script from Django to populate a PostgreSQL database

I'm currently trying to insert into a Django model some complex data extracted from multiple parsed files. Because I have a huge amount of data to insert, I don't want to multiply the queries my Django script sends to the database.
Is there a way to generate an SQL script instead of executing all the object_to_insert.save() calls, and then insert the data by running psql -f my_script.sql?
If it's only one table, there is a bulk_create method on Django ORM querysets: https://docs.djangoproject.com/en/stable/ref/models/querysets/#bulk-create
This method takes a list of objects and inserts them into the database efficiently, generally in a single query if the database supports multi-row inserts.
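If you really do want a standalone SQL file for psql -f, you can also render the multi-row INSERT yourself. A minimal sketch with deliberately naive quoting (fine as an illustration for trusted, parsed data; prefer bulk_create or parameterized queries otherwise):

```python
def build_insert_script(table, columns, rows):
    """Render one multi-row INSERT statement suitable for `psql -f`.

    Quoting here is deliberately naive (illustration only); use
    bulk_create or a proper SQL library for untrusted data.
    """
    def quote(value):
        if value is None:
            return "NULL"
        if isinstance(value, (int, float)):
            return str(value)
        # Escape single quotes by doubling them, per SQL convention.
        return "'" + str(value).replace("'", "''") + "'"

    value_lines = ",\n".join(
        "(" + ", ".join(quote(v) for v in row) + ")" for row in rows
    )
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES\n{value_lines};"
```

Writing the returned string to a file and loading it with psql -f my_script.sql gives you one round trip per table instead of one per object.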

using Django to query a MySQL database using the same connection as the ORM database

I have a MySQL server providing access to both a database for the Django ORM and a separate database called "STATES" that I built. I would like to query tables in my STATES database and return results (typically a couple of rows) to Django for rendering, but I don't know the best way to do this.
One way would be to use Django directly. Maybe I could move the relevant tables into the Django ORM database? I'm nervous about doing this because the STATES database contains large tables (10 million rows × 100 columns), and I worry about deleting or otherwise corrupting that data (I'm not very experienced with Django). I also imagine I should avoid creating a separate connection for each query, so should I use the Django connection to query the STATES tables?
Alternatively, I could treat the STATES database as if it lived on a totally different server: import SQLAlchemy, create a connection, query STATES.table, return the result, and close the connection.
Which is better, or is there another path?
The docs describe how to connect to multiple databases by adding another entry ("state_db") to DATABASES in settings.py; I can then do the following:
from django.db import connections

def query(lname):
    c = connections['state_db'].cursor()
    c.execute("SELECT last_name FROM STATES.table WHERE last_name=%s;", [lname])
    rows = c.fetchall()
    ...
This is slower than I expected, but I'm guessing this is close to optimal because it uses the open connection and Django without adding extra complexity.
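For reference, the DATABASES entry that makes connections['state_db'] available might look like the following (all credentials and host details below are placeholders, not taken from the original setup):

```python
# settings.py -- every credential below is a placeholder
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "django_db",      # the Django ORM database
        "USER": "dbuser",
        "PASSWORD": "secret",
        "HOST": "localhost",
        "PORT": "3306",
    },
    "state_db": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "STATES",         # the separate hand-built database
        "USER": "dbuser",
        "PASSWORD": "secret",
        "HOST": "localhost",
        "PORT": "3306",
    },
}
```

Because both aliases point at the same MySQL server, Django maintains one persistent connection per alias rather than opening a fresh connection per query.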

Sqlalchemy create a database table manually

Is there a way to create a table without using Base.metadata.create_all()? I'm looking for something like MyTable.create(), which would create only its corresponding table.
The reason I want to do this is that I'm using Postgres schemas for a multi-tenant web app: I want to create the public tables (User, etc.) separately, and the user-specific tables (each user having a separate schema, e.g. Blog, Post) when the user signs up. However, all the definitions live in the same file, and create_all seems to create every table defined there.
Please read documentation Creating and Dropping Database Tables.
You can do user_table.create(engine), or if you are using declarative extension: User.__table__.create(engine).
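A self-contained sketch of the per-table call (the models and the in-memory SQLite URL are illustrative; the same call works against Postgres):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))

class Post(Base):
    __tablename__ = "posts"
    id = Column(Integer, primary_key=True)
    title = Column(String(200))

engine = create_engine("sqlite:///:memory:")

# Create only the users table; posts is left alone even though it is
# defined in the same Base.metadata.
User.__table__.create(engine)
```

For the multi-tenant case, pointing a tenant's models at their own schema via __table_args__ = {"schema": ...} and issuing the same per-table create at signup keeps the public and per-user tables independent.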

Flask-SQLAlchemy - When are the tables/databases created and destroyed?

I am a little confused with the topic alluded to in the title.
So, when a Flask app is started, does SQLAlchemy look up the SQLALCHEMY_DATABASE_URI to find the correct (in my case MySQL) database? Does it then create the tables if they do not already exist?
What if the database configured in the SQLALCHEMY_DATABASE_URI variable in the config.py file does not exist?
What if that database exists, but only some of the tables do (there are more tables defined in the SQLAlchemy code than exist in the actual MySQL database)? Does it erase those tables and then create new tables matching the current definitions?
And what if all of those tables do exist? Do they get erased and re-created?
I am trying to understand how the entire process works so that I (1) don't lose database information when changes are made to the schema, and (2) can write the code needed to fully manage how and when SQLAlchemy talks to the actual database.
Tables are not created automatically; you need to call the SQLAlchemy.create_all() method explicitly to have it create the tables for you:
db = SQLAlchemy(app)
db.create_all()
You can do this from a command-line utility, for example, or, if you deploy to a PaaS such as Google App Engine, from a dedicated admin-only view.
The same applies for database table destruction; use the SQLAlchemy.drop_all() method.
See the Creating and Dropping tables chapter of the documentation, or take a look at the database chapter of the Mega Flask Tutorial.
You can also delegate this task to Flask-Migrate or a similar schema-versioning tool. These help you record and edit schema creation and migration steps; the database schema of a real-life project is never static, and you will want to be able to move existing data between versions of the schema. Creating the initial schema is then just the first step.