Web2py PostgreSQL database - Python

Recently I've been working with web2py and PostgreSQL. I added two new fields to one of my tables and set fake_migrate_all = True; this updated my .table file, but the two new fields were never actually added to the PostgreSQL table. I also tried fake_migrate_all = False and deleted my .table file, but that didn't alter the table either. Is it even possible to add fields to an existing table this way?
Is there a better solution that lets the fields be altered/added without dropping the table, so that my data isn't lost?

fake_migrate_all doesn't do any actual migration (hence the "fake") -- it just makes sure the metadata in the .table files matches the current set of table definitions (and therefore the actual database, assuming the table definitions in fact match the database).
If you want to do an actual migration of the database, then you need to make sure you do not have migrate_enabled=False in the call to DAL(), nor migrate=False in the relevant db.define_table() calls. Unless you explicitly set those to false, migrations are enabled by default.
Always a good idea to back up your database before doing a migration.
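For illustration, a minimal sketch of a model file with migrations left enabled (the connection string, table, and field names are placeholders, not from the question):
# In a web2py model file, DAL and Field are injected automatically;
# the import is only needed when using pydal standalone.
from pydal import DAL, Field

db = DAL('postgres://user:password@localhost/mydb',
         migrate_enabled=True)  # the default -- must not be False

db.define_table('mytable',
                Field('existing_field'),
                Field('new_field_1'),  # newly added fields; on the next
                Field('new_field_2'),  # request web2py issues the ALTER TABLE
                migrate=True)          # the default -- must not be False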

Related

Effective insert-only permissions for peewee tables

I'm wondering what the best strategy is for using insert-only permissions on a Postgres DB with Peewee. I'd like this in order to be certain that a specific user can't read any data back out of the database.
I granted INSERT permissions to my table, 'test', in postgres. But I've run into the problem that when I try to save new rows with something like:
thing = Test(value=1)
thing.save()
The SQL actually contains a RETURNING clause that needs more permissions (namely, SELECT) than just INSERT:
INSERT INTO "test" ("value") VALUES (1) RETURNING "test"."id"
The same SQL seems to be generated when I use query = Test.insert(value=1); query.execute() as well.
From looking around, it seems like you need to either grant SELECT privileges or use a more exotic feature like row-level security in Postgres. Is there any way to go about this with Peewee out of the box? Or another suggestion for how to add new rows with truly write-only permissions?
You can omit the returning clause by explicitly writing your INSERT query and supplying a blank RETURNING. Peewee uses RETURNING whenever possible so that the auto-generated PK can be recovered in a single operation, but it is possible to disable it:
# Empty call to returning will disable the RETURNING clause:
iq = Test.insert(value=1).returning()
iq.execute()
You can also override this for all INSERT operations by setting the returning_clause attribute on the DB to False:
db = PostgresqlDatabase(...)
db.returning_clause = False
This is not an officially supported approach, though, and may have unintended side-effects or weird behavior - caveat emptor.
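Putting the pieces together, a self-contained sketch under the question's setup (the database name, credentials, and field are illustrative):
from peewee import PostgresqlDatabase, Model, IntegerField

# Connect as the user that has only INSERT permission on "test".
db = PostgresqlDatabase('mydb', user='insert_only_user', password='secret')

class Test(Model):
    value = IntegerField()

    class Meta:
        database = db  # the table already exists; do not create it here

# The empty .returning() call suppresses the RETURNING clause, so the
# statement needs only INSERT permission; the auto-generated PK is not
# fetched back.
Test.insert(value=1).returning().execute()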

Advanced migration between Databases using Python, SQLAlchemy, and Alembic

--- UPDATED to be more clear ---
I have quite the task ahead of me, and I am hoping Alembic and SQLAlchemy can do what I need.
I want to develop a desktop application that helps migrate data from one SQLite database to a completely different model and back again. So, migrations, right? That's all well and good, but I need to ensure that if I haven't modeled a specific column/table, its data is still ported to a table that can be read later to reconstruct the original database.
Example:
DB1:
  Table names: ID, First Name, Last Name
  Table address: Street 1, Street 2, City, State
DB2:
  Table givenName: ID, name
  Table surname: ID, name
Say with Alembic I have mapped the following:
DB1 names.firstname => DB2 givenName.name
DB1 names.lastname => DB2 surname.name
But say I want to use a migration to port DB1 to DB2, store the unknown data somewhere, and then reconstruct it properly when I go from DB2 -> DB1.
So how I'd envision this is a joiner table of sorts:
DB2:
  Table joiner: table_name, column_name, data
The thing is, I want this all to be completely dynamic, so that no piece of information is ever lost.
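A minimal sketch of what such a joiner table might look like in SQLAlchemy (all names are illustrative; the row_key column is my own assumption, added so rows can be matched up again on the way back):
from sqlalchemy import Column, Integer, String, Text, create_engine
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Joiner(Base):
    """Catch-all for DB1 values that have no explicit mapping in DB2."""
    __tablename__ = 'joiner'

    id = Column(Integer, primary_key=True)
    table_name = Column(String)   # source table in DB1, e.g. 'address'
    column_name = Column(String)  # source column, e.g. 'street_2'
    row_key = Column(String)      # PK of the source row (assumption)
    data = Column(Text)           # raw value, stored as text

engine = create_engine('sqlite:///db2.sqlite')
Base.metadata.create_all(engine)

# During the DB1 -> DB2 migration, any unmapped value is preserved:
with Session(engine) as session:
    session.add(Joiner(table_name='address', column_name='street_2',
                       row_key='1', data='Apt 4B'))
    session.commit()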
Now let's add an extra complexity: I want to construct a generator of sorts, so I can simply pass down new XML/JSON declarations. These would define the mappings and any calls to translators already present in the code (date conversions, etc.).
For an example of two database formats that need to be migrated from one to the other see https://cloudup.com/cYzP2lCQjbo
My question is whether this is even possible or conceivable with SQLAlchemy and Alembic. How feasible is it? Any thoughts?

Is it possible to let users create and perform database migrations from a form?

Can you take form data and change database schema? Is it a good idea? Is there a downside to many migrations from a 'default' database?
I want users to be able to add/remove tables, columns, and rows. Making schema changes requires migrations, so adding that functionality would require writing a view that takes form data and passes it to a function that then uses Flask-Migrate.
If I manage to build this, don't migrations generate a separate script (and everything that goes with it) each time something is added or removed? Is that practical for something like this, where 10 or 20 new tables might be added to the starting database?
If I allow users to add columns to a table, it will have to modify the table's class. Is that possible, or a safe idea? If not, I'd appreciate it if someone could help me out, and at least get me pointed in the right direction.
In a typical web application, the deployed database does not change its schema at runtime. The schema is only changed during an upgrade, and only the developers make these changes. Operations that users perform on the application can add, remove or modify rows, but not modify the tables or columns themselves.
If you need to offer your users a way to add flexible data structures, then you should design your database schema in a way that this is possible. For example, if you wanted your users to add custom key/value pairs, you could have a table with columns user_id, key_name and value. You may also want to investigate if a schema-less database fits your needs better.
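A minimal sketch of that key/value approach with Flask-SQLAlchemy (the column names are the ones suggested above; everything else is assumed):
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///app.db'
db = SQLAlchemy(app)

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    attributes = db.relationship('UserAttribute', backref='user')

class UserAttribute(db.Model):
    """One custom key/value pair per row -- no schema change needed."""
    id = db.Column(db.Integer, primary_key=True)
    user_id = db.Column(db.Integer, db.ForeignKey('user.id'), nullable=False)
    key_name = db.Column(db.String(64), nullable=False)
    value = db.Column(db.Text)

# A user "adds a column" by inserting a row, not by running a migration:
# db.session.add(UserAttribute(user_id=1, key_name='favorite_color',
#                              value='teal'))
# db.session.commit()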

Is it possible to manually set a django foreignkey ID to an item that doesn't exist yet?

I'm designing a bulk import tool from an old system into a new Django based system.
I'd like to retain all of the current IDs of objects (they are just 5 digit strings), now due to the design in the current system there are lots of references between these objects.
To import, I can see two possible techniques: import a known object and carefully recurse through its relationships, making sure to import things in the right order and only setting relationships once I know both ends exist ...
... or start at item 00001, set the foreign keys to values I know will exist eventually, and just import everything in order, knowing that once we get to item 99999 all the relationships will exist.
So is there a way to set a foreignkey to the ID of an item that doesn't exist, but will, even for imports only?
To add further complexity, not all of these relationships are straightforward foreignkeys, some are ManyToMany relationships as well.
To be able to handle any database that Django supports and not have to deal with peculiarities of the backend, I'd export the old database in the format that Django loaddata can read, and then give this exported file to loaddata. This command has no issue importing the type of structure you are talking about.
Creating the file that loaddata will read could be done by writing your own converter that reads the old database and dumps an appropriate file. However, a way which might be easier would be to create a throwaway Django project with models that have the same structure as the tables in the old database, point the Django project to the old database, and use dumpdata to create the file. If table details between the old database and the new database have changed, you'd still have to modify the file but at least some of the conversion work would have already been done.
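As a rough sketch of the converter route (the old table, app label, and field names here are hypothetical), a script like this could dump the old data into the JSON fixture format that loaddata reads:
import json
import sqlite3

# Hypothetical old-system table 'items' with 5-digit string IDs and a
# self-referencing parent_id column.
conn = sqlite3.connect('old_system.db')
rows = conn.execute('SELECT id, name, parent_id FROM items')

fixture = []
for item_id, name, parent_id in rows:
    fixture.append({
        'model': 'myapp.item',    # hypothetical app.model label
        'pk': item_id,            # the original 5-digit ID is preserved
        'fields': {
            'name': name,
            'parent': parent_id,  # FK by pk; the target row may appear
        },                        # later in the file
    })

with open('items.json', 'w') as f:
    json.dump(fixture, f, indent=2)

# Then: python manage.py loaddata items.json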
A more direct way would be to bypass Django completely to do the import in SQL but turn off foreign key constraints for the time of the import. For MySQL this would be done by setting foreign_key_checks to 0 for the time of the import, and then back to 1 when done. For SQLite this would be done by using PRAGMA foreign_keys = OFF; and then ON when done.
PostgreSQL does not allow simply turning off these constraints, but Django creates foreign key constraints as DEFERRABLE INITIALLY DEFERRED, which means the constraints are not checked until the end of the transaction. So initiating a transaction, importing, and then committing should work, as sketched below. If something prevents this, then you have to drop the constraints before importing and add them back afterwards.
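For the PostgreSQL case, a minimal sketch (the model and field names are hypothetical); because the constraints are deferred, the checks only run at commit:
from django.db import transaction
from myapp.models import Item  # hypothetical model

def bulk_import(old_rows):
    # All constraint checks are deferred until the commit at the end of
    # the atomic block, so parent_id may reference a row that is only
    # created later in the loop.
    with transaction.atomic():
        for row in old_rows:
            Item.objects.create(
                pk=row['id'],
                name=row['name'],
                parent_id=row['parent_id'],  # may not exist yet
            )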
Sounds like you need a database migration tool like South, the standard for Django. Worth noting that Django 1.7 Beta 1 was released recently, and it provides built-in migrations.

Django PostgreSQL query not working

I have a PostgreSQL database with a table foo that I created outside of Django. I used manage.py inspectdb to build the model for table foo for me. This technique worked fine when I was using MySQL, but with PostgreSQL it is failing MISERABLY. The table is multiple gigabytes and I built it from a text file with PostgreSQL COPY.
I can run raw queries on table foo and everything executes as expected.
For example
foo.objects.raw('bar_sql')
executes as expected.
But running queries like:
foo.objects.get(bar=bar)
throw
ProgrammingError: column foo.id does not exist LINE 1: SELECT "foo"."id", "foo"."bar1", "all_...
foo doesn't innately have an id field. As I understand it, Django is supposed to create one. Have I somehow subverted this step by creating the table outside of Django?
Queries on models whose tables were populated through Django run as expected in all cases.
I'm missing something very basic here and any help would be appreciated.
I'm using Django 1.6 with PostgreSQL 9.3.
Django doesn't modify your existing database tables. It only creates new tables. If you have existing tables, it usually doesn't touch them at all.
"As I understand it django is suppose to create one." --> It only adds a primary key to a table when it creates it, which means you don't need to specify that explicitly in your model, but it won't do anything to an existing table.
So if for example you later on decide to add fields to your models, you have to update your databases manually.
What you need to do in your case is make sure, through manual database administration, that your table has a primary key, and also that the primary key column is named "id" (I am not sure the name is strictly necessary, but it is safer). So use a database administration tool to modify your table, add the primary key, and name it id. Then it should start working.
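If the table already has a unique column, an alternative to adding an id column (e.g. ALTER TABLE foo ADD COLUMN id SERIAL PRIMARY KEY; in PostgreSQL) is to declare that column as the primary key on the inspectdb-generated model, so the ORM stops assuming an implicit foo.id. A sketch with assumed field names:
from django.db import models

class Foo(models.Model):
    # Declare an existing unique column as the primary key so Django
    # does not look for an implicit "id" column on the table.
    bar = models.TextField(primary_key=True)
    bar1 = models.TextField(blank=True, null=True)

    class Meta:
        managed = False   # inspectdb default: Django won't alter the table
        db_table = 'foo'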
