I have launched this project; it uses Flask-SQLAlchemy and already has some classes (tables). Now I added a few more tables, i.e. classes that inherit from db.Model. When I run the application, I get this error:
sqlalchemy.exc.InvalidRequestError: Table 'table_name' is already defined for this MetaData instance. Specify 'extend_existing=True' to redefine options and columns on an existing Table object.
I also opened and inspected the database file, but there is no such table created, so the error seems a bit misleading.
How can I get the newly defined classes created as tables in the database?
The problem was in Python cache files. I deleted all __pycache__ folders that I found in my project directory and it helped.
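For anyone hitting the same thing, a throwaway cleanup sketch (nothing project-specific about it) that removes every cache folder under the current directory:

# remove every __pycache__ directory under the project root
import pathlib
import shutil

for cache_dir in list(pathlib.Path(".").rglob("__pycache__")):
    shutil.rmtree(cache_dir)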
Django 1.7, Python 3.4.
In my models I have several TextFields defined.
When I go to load a JSON fixture (which was generated from an SQLite3 dump), it fails on the second object, which has 515 characters for one of its fields.
The error printed is
psycopg2.DataError: value too long for type character varying(500)
I created a new database (not just a table drop, a whole new db), modified my settings.py file, ran manage.py syncdb on the new database, created a user, and tried to load the data again, getting the same error.
Upon opening pgAdmin3, all columns, whether defined as CharField or TextField, are listed as type character varying.
So it seems TextField is being ignored and CharFields are being created instead. The PostgreSQL documentation explicitly lists both text and character types, and defines text as being unlimited in length. Any idea why?
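For reference, a minimal models.py along these lines (the model and field names here are made up, not from the original project):

from django.db import models

class Article(models.Model):
    # CharField maps to character varying(max_length) in PostgreSQL
    title = models.CharField(max_length=500)
    # TextField should map to the unlimited-length text type
    body = models.TextField()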
I'm not sure what the exact cause was, but it seems to be related to django's migration tool storing migrations, even on a new database.
What I did to get this behavior:
Create django project, then apps, using CharField
syncdb, run the project's dev server
kill the devserver, modify fields to be TextField
Create a new Postgres database, modify settings.py
Run syncdb, attempt to load fixtures
See the error in question, examine db instance
What fixed the problem:
Create a new database, modify settings.py
delete all migrations in apps/migrations folders
after running syncdb, also run makemigrations and migrate
The last step generated a migration, even though there were none stored in the migrations folder, and there had been no changes to models or data since syncdb was run on the new database, which I found to be odd.
Somewhere in the last two steps this was fixed. Future people stumbling upon this: sorry, I'm not going to keep creating django projects to test the behavior further, but perhaps with this information you can fix your own database problems.
I have a Flask project with a MySQL database; it uses SQLAlchemy as the ORM and Flask-Migrate for migrations.
I wrote my models, and when I run the migrations the migration file is empty, because the existing tables are outside Flask-Migrate's control, so I would actually have to delete those tables to let the migration tool detect and create them again. But the problem is that I do not want to delete and recreate my tables.
So is there a way that I can sync my models with my existing tables?
EDIT:
I just found out that in the env.py file it's possible to specify tables that already exist, and then it will not create those tables:
metadata.reflect(engine, only=["table1", "table2"])
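Here is roughly how that reflect call fits together, as a sketch only; the connection URL and table names are placeholders:

from sqlalchemy import MetaData, create_engine

engine = create_engine("mysql://user:password@localhost/mydb")  # placeholder URL
metadata = MetaData()
# load the definitions of the tables that already exist in the database
metadata.reflect(engine, only=["table1", "table2"])
print(list(metadata.tables))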
Thanks for the answer.
Automatic migrations are by definition generated as a delta between your models and your database. If you already have a database that was created before you started using Flask-Migrate/Alembic then you can begin tracking migrations from that point on.
If you want to generate an initial migration that takes you to your current version the easiest way is to delete all the tables, as you suggested. To avoid losing your data I can suggest two ideas:
back up your database before deleting your tables, then restore it after the migration has been generated.
temporarily point your application at an empty database (a different one). Once the migration is generated, point it back to your database (a rough sketch of this second idea follows below).
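A rough sketch of the second idea, assuming the usual config-class style; the URLs and class names are placeholders:

class Config:
    # the real database the application normally uses
    SQLALCHEMY_DATABASE_URI = "mysql://user:password@localhost/real_db"

class MigrationScratchConfig(Config):
    # temporarily load this config while the initial migration is generated,
    # then switch the app back to Config
    SQLALCHEMY_DATABASE_URI = "mysql://user:password@localhost/empty_scratch_db"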
I hope this helps.
I'm trying to deal with a very puzzling error in a Django app. When DEBUG=False, trying to delete a user (via user.delete()) gives this database error:
DatabaseError: relation "social_auth_usersocialauth" does not exist
LINE 1: ...", "social_auth_usersocialauth"."extra_data" FROM "social_au...
However, I do not have social_auth or anything by a similar name in INSTALLED_APPS, nor are there any such tables in my database, nor does any of my code reference anything of the sort (I ran a text search on 'social' in the entire project folder) - and again, this works fine when DEBUG=True. social_auth is installed on my system and on my PYTHONPATH, but I cannot see where this app is getting the idea that it should have social_auth's tables in its database, let alone why it only thinks so when DEBUG=False.
What possible pathways could my app be getting this table from and how could I convince it it's not supposed to be there?
The problem could be caused by saved generic relations, implemented through Django content types. Relations in Django are not only static (defined by models and INSTALLED_APPS) but can also be dynamic, implemented via the django_content_type table, which maps a numeric id to an app_label + model. Examples of such dynamic relationships are permissions and comments: you can have or not have a permission on any table of any installed application, and you can attach a comment to anything, e.g. to an article, to a user, or to another comment, without changing any model. The relation is stored by saving the numeric id of the ContentType of the related model (table) together with the primary key of the related object (row).
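To illustrate, a generic relation looks roughly like this (the Comment model is only an example; in Django 1.6 and earlier GenericForeignKey lives in django.contrib.contenttypes.generic instead of .fields):

from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.db import models

class Comment(models.Model):
    # numeric id of the ContentType of the related model (table)
    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    # primary key of the related object (row)
    object_id = models.PositiveIntegerField()
    content_object = GenericForeignKey("content_type", "object_id")
    text = models.TextField()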
Django does not expect the database to be manipulated manually. If you use south and run syncdb after uninstalling an application, south asks whether you want to automatically remove the orphaned content types; the unused tables can then be removed safely without being referenced later.
(Possible hack: delete from django_content_type where app_label='social_auth', but letting south handle it is more reliable.)
Many parts of the question are still open.
Edit:
Why it was not the right way: all generic relations run from descendants to the parent, and all data about the relation is saved in the descendant. If the child app is removed from INSTALLED_APPS, the django.db code can no longer try to remove the descendants, because it cannot recognize which columns contain the relation data.
This table is created by the django-social-auth application.
It looks like you've added it to your project and haven't run migrate (or syncdb).
Django: If I added new tables to database, how can I query them?
Do I need to create the relevant models first? Or does django create them by itself?
More specifically, I installed another django app and it created several tables in the database, and now I want to get some specific data from them. What is the correct approach? Thank you very much!
I suppose the other django app has all the model files needed to access those tables; you should just try importing those packages and using that app's models.
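For example, if the installed app were called otherapp (a made-up name), something like:

from otherapp.models import SomeModel  # whatever model that app defines for the table

rows = SomeModel.objects.filter(name="example")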
Django doesn't follow the convention-over-configuration philosophy. You have to explicitly create the backing model for the table and, in its Meta, tell it the table name...
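If the other app doesn't ship usable models, a minimal sketch of an unmanaged model pointed at an existing table (the table and field names are placeholders):

from django.db import models

class ExternalRecord(models.Model):
    name = models.CharField(max_length=100)

    class Meta:
        db_table = "some_existing_table"  # the table that already exists
        managed = False  # don't let Django create, modify or delete this table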
I'm using SQLAlchemy and I can create tables that I have defined in /model/__init__.py but I have defined my classes, tables and their mappings in other files found in the /model directory.
For example I have a profile class and a profile table which are defined and mapped in /model/profile.py
To create the tables I run:
paster setup-app development.ini
But my problem is that the tables I have defined in /model/__init__.py are created properly, while the table definitions found in /model/profile.py are not. How can I execute the table definitions in /model/profile.py so that all my tables are created?
Thanks for the help!
I ran into the same problem with my first real Pylons project. The solution that worked for me was this:
Define tables and classes in your profile.py file
In your __init__.py add from profile import * after your def init_model
I then added all of my mapper definitions afterwards. Keeping them all in the init file solved some problems I was having relating between tables defined in different files.
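A rough sketch of what that __init__.py can end up looking like, assuming the classic non-declarative Pylons layout (the package, table and class names are placeholders):

from sqlalchemy import orm

from myapp.model import meta

def init_model(engine):
    meta.Session.configure(bind=engine)
    meta.engine = engine

from myapp.model.profile import *  # brings in profile_table and the Profile class

# mapper definitions kept together here, after init_model
orm.mapper(Profile, profile_table)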
Also, I've since created projects using the declarative method and didn't need to define the mapping in the init file.
Just import your other table modules in your __init__.py, and use the metadata object from model.meta in the other files. Pylons' default setup_app function creates all tables found in the metadata object from model.meta after importing it.
If you are using the declarative style, be sure to use Base.metadata for table generation.
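For the declarative case, a minimal sketch (the engine URL is a placeholder):

from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

# ... declarative model classes subclass Base and register with Base.metadata ...

engine = create_engine("sqlite:///development.db")  # placeholder URL
Base.metadata.create_all(engine)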