Django / Postgres write-only database - python

For a specific security reason, a client has asked if we can integrate a 'write-only' DB into a Django web application. I have tried creating a DB then restricting access to one of its tables in psql via:
REVOKE SELECT ON TABLE testapp_testmodel FROM writeonlyuser;
But then trying to save a model in the Django shell...
p = TestModel(test_field="testvalue")
p.save(using="writeonlydb")
...generates this error:
ProgrammingError: permission denied for relation testapp_testmodel
Which I assume is because the ORM-generated SQL includes a RETURNING of the newly created object's id, which counts as a read:
INSERT INTO "testapp_testmodel" ("test_field") VALUES ('testvalue') RETURNING "testapp_testmodel"."id"
My question is therefore, is this basically impossible? Or is there perhaps some other way?
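One possible workaround, sketched below rather than anything the docs bless for this purpose: give the model a client-generated primary key, so the ORM already knows the pk and should have no reason to add the RETURNING clause (this assumes Django 1.8+ for UUIDField; the model mirrors the one in the question):

import uuid
from django.db import models

class TestModel(models.Model):
    # pk is generated in Python, so the INSERT has nothing to fetch back
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
    test_field = models.CharField(max_length=100)

# in the shell:
p = TestModel(test_field="testvalue")           # default fires here, pk is already set
p.save(using="writeonlydb", force_insert=True)  # plain INSERT, no RETURNING

Alternatively, a raw INSERT through connections['writeonlydb'].cursor() bypasses the ORM's RETURNING entirely.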

Related

UUIDs with django and mssql

I have an MSSQL schema used with the Django ORM via the pymssql extension. I have some classes built via the inspectdb function. A lot of the primary keys in the tables are UUID fields (MSSQL uniqueidentifier), which the ORM inspected as CharFields with length 36.
I am concerned now with possible duplicates for the primary keys since the tables are growing very fast.
The tables have a default constraint for any new primary key on the database side. So basically I have two (different) sources of UUID generation (the database server and the application server).
What's the most performant way to insert via the Django ORM?
Am I safe generating the UUIDs via Python's uuid module, or do I have to ask the database every time for a new UUID before creating an object with Django?
The primary key cannot be duplicated, so a collision would simply raise a duplicate-key (IntegrityError) exception. In addition, the odds of generating a duplicate UUID are vanishingly close to 0; you would win the lottery three times before you got a duplicate UUID.
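To make the client-side option concrete, here is a minimal sketch, assuming an unmanaged inspectdb-style model (the model name and table are hypothetical): let Python's uuid module fill the default, so no round trip to the database is needed for a key.

import uuid
from django.db import models

def make_uuid():
    # Generated on the application server; format matches the 36-char column.
    return str(uuid.uuid4())

class Document(models.Model):
    id = models.CharField(max_length=36, primary_key=True, default=make_uuid)

    class Meta:
        managed = False        # the table and its default constraint already exist
        db_table = 'document'

Since Django fills the default before the INSERT, the database-side default constraint only fires as a fallback for rows inserted outside the ORM.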

SQLAlchemy relation does not exist when importing db

I am migrating to a new database with SQLAlchemy and want to create the database and tables using db.create_all(). Now I am getting the following error
psycopg2.errors.UndefinedTable: relation "Image" does not exist
when I try to import db. The error occurs because a global variable that a service of the app depends on is created by executing the query Image.query.all(); practically, the variable is created when the app starts. Of course this does not work now, because the data structure is not yet in place.
How can I get around this issue without having to temporarily comment out the code? Basically I want to be able to run the database setup without this query being executed.
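One way around it, as a sketch: defer the query until the first time the value is actually needed, so importing the module no longer touches the Image table (get_images and the module-level cache are hypothetical names; Image is the model from the error):

_images = None

def get_images():
    # Runs Image.query.all() lazily, after db.create_all() has built the
    # schema, instead of at import time.
    global _images
    if _images is None:
        _images = Image.query.all()
    return _images

The service then calls get_images() instead of reading the global, and db.create_all() can run against an empty database.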

using Django to query a MySQL database using the same connection as the ORM database

I have a MySQL server providing access to both a database for the Django ORM and a separate database called "STATES" that I built. I would like to query tables in my STATES database and return results (typically a couple of rows) to Django for rendering, but I don't know the best way to do this.
One way would be to use Django directly. Maybe I could move the relevant tables into the Django ORM database? I'm nervous about doing this because the STATES database contains large tables (10 million rows x 100 columns), and I worry about deleting that data or messing it up in some other way (I'm not very experienced with Django). I also imagine I should avoid creating a separate connection for each query, so should I use the Django connection to query STATES tables?
Alternatively, I could treat the STATE database as existing on a totally different server. I could import SQLAlchemy, create a connection, query STATE.table, return the result, and close that connection.
Which is better, or is there another path?
The docs describe how to connect to multiple databases by adding another database ("state_db") to DATABASES in settings.py; I can then do the following:
from django.db import connections

def query(lname):
    # Reuse the connection Django already manages for the 'state_db' alias.
    c = connections['state_db'].cursor()
    c.execute("SELECT last_name FROM STATE.table WHERE last_name=%s;", [lname])
    rows = c.fetchall()
    ...
This is slower than I expected, but I'm guessing this is close to optimal because it uses the open connection and Django without adding extra complexity.
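If the overhead turns out to be connection setup rather than the query itself, persistent connections might help; a sketch of the settings.py entry, assuming Django 1.6+ (where CONN_MAX_AGE was introduced) and hypothetical credentials, alongside the existing default entry:

DATABASES = {
    # 'default': { ... },  # existing ORM database, unchanged
    'state_db': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'STATES',
        'USER': 'state_reader',   # hypothetical credentials
        'PASSWORD': 'secret',
        'HOST': 'localhost',
        'PORT': '3306',
        'CONN_MAX_AGE': 600,      # keep the connection alive for 10 minutes
    },
}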

Django use multiple databases dynamically without defining in settings

TL;DR: Besides my default Django database, I need data pulled in from two different user-selected databases. Not sure how to set up Django to access these besides just running manual queries using connection.cursor().execute("SQL").
Situation:
A process creates a SQLite DB. That database is imported into MySQL. I'm writing a Django app that interacts with that MySQL database (call it StreamDB), another MySQL database with additional info the user needs to see (call it SourceDB), and of course the default Django app MySQL DB (call it AppDB).
There will be two versions of SourceDB (prod and test); each imported StreamDB maps to one and only one of these SourceDBs.
I have a table/model in my AppDB that identifies these sources (the StreamDB name, which of the two SourceDBs it maps to, and some other data). Here's a sample record:
name: foo
path: /var/www/data/test/foo.sqlite
db_name: foo
source_db_name: bar
date_imported: 2014-05-03 10:20:30
These are managed through the Django admin and added manually (or dynamically via an external script).
Dilemma
Depending on which source is selected, my SQL needs to join tables from those two DBs. Example query with dynamic DB names:
SELECT a.image_id, a.image_name, b.title, b.begin_time
FROM <selected_streamdb>.image a JOIN <selected_sourcedb>.event b ON b.event_id = a.event_id
WHERE a.image_type = 'png'
Do I fill in <selected_streamdb> and <selected_sourcedb> with variables, perhaps?
Question
Is there any way to use Django's ORM models in a situation like this? Can Django pull DATABASES settings from a DB table? Do I create a model that manages this (i.e. the sources table above)?
I don't mind managing DB permissions on the backend (assume the app database user in the Django settings has access to all of the databases).
Hope all of the above makes sense.
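There is no supported setting for per-request databases, but a commonly used (unofficial) trick is to register a connection alias at runtime by mutating connections.databases; a sketch, where source is a row of the sources model described above (the model, function name, and credentials are hypothetical):

from django.db import connections

def cursor_for_source(source):
    alias = source.db_name
    if alias not in connections.databases:
        # Register a new connection alias on the fly; Django fills in
        # any missing settings with defaults on first use.
        connections.databases[alias] = {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': source.db_name,
            'USER': 'app_user',      # assumes one user with rights on all DBs
            'PASSWORD': 'secret',
            'HOST': 'localhost',
            'PORT': '3306',
        }
    return connections[alias].cursor()

Note that schema names cannot be passed as query parameters, so validate <selected_streamdb> and <selected_sourcedb> against the sources table before interpolating them into the SQL.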

django postgresql query not working

I have a PostgreSQL database with a table foo that I created outside of Django. I used manage.py inspectdb to build the model for table foo. This technique worked fine when I was using MySQL, but with PostgreSQL it is failing MISERABLY. The table is multiple gigabytes and I built it from a text file with PostgreSQL's COPY.
I can run raw queries on table foo and everything executes as expected.
For example
foo.objects.raw('bar_sql')
executes as expected.
But running queries like:
foo.objects.get(bar=bar)
throws
ProgrammingError column foo.id does not exist LINE 1: SELECT "foo"."id", "foo"."bar1", "all_...
foo doesn't innately have an id field. As I understand it, Django is supposed to create one. Have I somehow subverted this step by creating the tables outside of Django?
Queries run on models whose tables were populated through Django run as expected in all cases.
I'm missing something very basic here and any help would be appreciated.
I'm using Django 1.6 with PostgreSQL 9.3.
Django doesn't modify your existing database tables; it only creates new ones. If a table already exists, Django usually doesn't touch it at all.
"As I understand it, Django is supposed to create one." --> Django only adds a primary key to a table when it creates that table itself, which means you don't need to specify one explicitly in your model, but it won't do anything to an existing table.
So if, for example, you later decide to add fields to your models, you have to update your database manually.
What you need to do in your case is make sure, through manual database administration, that your table has a primary key, and that the name of the primary key column is "id" (I am not sure the name is strictly necessary, but it is safer). So use a database administration tool, modify your table, and add the primary key, naming it id. Then it should start working.
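If the table already has a natural key and you'd rather not alter it, the other direction also works: tell the model which column is the primary key, so Django stops selecting foo.id. A sketch, assuming an unmanaged inspectdb-style model with a hypothetical key column:

from django.db import models

class Foo(models.Model):
    # Declaring a primary_key field stops Django from assuming an 'id' column.
    bar = models.TextField(primary_key=True)

    class Meta:
        managed = False   # Django won't try to create or alter this table
        db_table = 'foo'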
