Flask-Admin deleting secondary mapping on update - python

I'm using Flask-Admin, and when I update my Content object, it deletes all the rows in its associated lookup table (content_products_table).
So if I have a Content object with 7 products mapped in its lookup table (content_products_table), all the mapping from the content to the products is deleted when it's updated.
I believe this is happening because my products list is empty while it's updating, but in reality my product list could be 1000 items long, so I wouldn't want to load it before updating; I'd rather just not update it at all. I feel like there is a way to configure Flask-Admin or SQLAlchemy to ignore the field on update.
Example of my model below:
content_products_table = db.Table(
    'content_products', db.Model.metadata,
    db.Column('content_id', db.Integer, db.ForeignKey('content.id')),
    db.Column('product_id', db.Integer, db.ForeignKey('product.id'))
)

class Content(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    products = db.relationship('Product', secondary=content_products_table)
Example of the code that's updating my object:
form.populate_obj(model)
self._on_model_change(form, model, False)
self.session.commit()
Any information would help since I'm new to both of these platforms. Thanks!
EDIT
I wanted to add that when my objects are updated, all relationships are deleted. So if I have a User table with a one-to-many to a Content table, the user_id in Content is set to NULL when I update the User.

form_excluded_columns = ('products',)
You need to explicitly exclude object properties. This seems very dangerous.

Related

Creating multiple tables from the same sqlalchemy model [duplicate]

I have data for a particular entity partitioned across multiple identical tables, often separated chronologically or by numeric range. For instance, I may have a table called mytable for current data, a mytable_2013 for last year's data, mytable_2012, and so on.
Only the current table is ever written to. The others are only consulted. With SQLAlchemy, is there any way I can specify the table to query from when using the declarative model?
Use mixins and set the table name per class. (Subclassing the mapped Node directly with only a new __tablename__ would make SQLAlchemy attempt joined-table inheritance and fail without a foreign key between the tables, so the shared columns belong on a non-mapped mixin.)
class NodeMixin:
    nid = Column(Integer, primary_key=True)
    uuid = Column(String(128))
    vid = Column(Integer)

class Node(NodeMixin, Base):
    __tablename__ = 'node'

class Node1(NodeMixin, Base):
    __tablename__ = 'node_1'

class Node2(NodeMixin, Base):
    __tablename__ = 'node_2'
As requested, re-posting as answer:
Please take a look at this answer to Mapping lots of similar tables in SQLAlchemy in the Concrete Table Inheritance section.
In your case you can query MyTable only when working with the current data, and do a polymorphic search on all tables when you need the whole history.
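A self-contained sketch of this pattern under illustrative names (the tables and data here are invented for the demo): shared columns live on a mixin, the current table is queried directly, and a whole-history search unions the identical tables.

```python
from sqlalchemy import Column, Integer, String, create_engine, select, union_all
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class MyTableMixin:
    # columns shared by every year-partitioned table
    id = Column(Integer, primary_key=True)
    payload = Column(String(64))

class MyTable(MyTableMixin, Base):        # current data: read/write
    __tablename__ = 'mytable'

class MyTable2013(MyTableMixin, Base):    # archive: read-only
    __tablename__ = 'mytable_2013'

engine = create_engine('sqlite://')       # in-memory database for the demo
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(MyTable(id=1, payload='current'))
    session.add(MyTable2013(id=1, payload='archived'))
    session.commit()
    # Current data only: query MyTable. Whole history: union the tables.
    history = session.execute(
        union_all(select(MyTable.payload), select(MyTable2013.payload))
    ).scalars().all()
```

UNION ALL works here because the partitioned tables are structurally identical; for large numbers of tables, SQLAlchemy's concrete-inheritance helpers build the same union automatically.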

how to change the data type of a field using peewee in SQLite

Please help me figure it out. I created this class for a table in a database.
class ProfileAdditionalField(peewee.Model):
    profile = peewee.ForeignKeyField(RpProfile, on_delete='cascade')
    item = peewee.ForeignKeyField(AdditionalField, on_delete='cascade')
    is_allowed = peewee.BooleanField(default=True)

    class Meta:
        database = db
        primary_key = peewee.CompositeKey('profile', 'item')
When I try to modify the RpProfile table, all the entries in the ProfileAdditionalField table get deleted. I think the problem is the on_delete='cascade' setting.
playhouse_migrate.migrate(
    migrator.add_column('RpProfile', 'show_link',
                        peewee.BooleanField(default=False)),
)
I use SQLite, and the migrator.alter_column_type command does not work in it. I can’t even change the setting so that the data is no longer deleted automatically.
How to add a new field to the RpProfile table without deleting data from the ProfileAdditionalField table?
SQLite has limited support for altering columns. This is well documented:
https://sqlite.org/lang_altertable.html
You might try disabling foreign-key constraints in SQLite (PRAGMA foreign_keys=0) before making any changes. To do anything beyond simply adding a new column in SQLite, peewee has to create a temp table with the desired schema, migrate everything over, then delete the old table and rename the temp table into place.

How to rename an existing table?

I want to create 200+ tables using declarative base on the fly. I learnt it's not possible, so my idea was to create a common table and rename it 200+ times.
class Movie(Base):
    __tablename__ = 'titanic'

    id = Column(Integer, primary_key=True)
    title = Column(String)
    release_date = Column(Date)
    name = Column(String)

    def __init__(self, newname, title, release_date):
        self.title = title
        self.release_date = release_date
What is the code to change the table name from "titanic" to "wild"?
In Postgresql it is
ALTER TABLE table_name
RENAME TO new_table_name;
I am not finding a solution in sqlalchemy.
There are no foreign keys to this table.
The objective of this question is to rename an existing table thru a solution (if) available in sqlalchemy, not in a purely python way (as mentioned in the other question).
The easiest way to "rename" a table is to create a new table and dump the data into it with an INSERT INTO ... SELECT statement.
More from the web:
You must issue the appropriate ALTER statements to your database to change the name of the table. As far as the Table metadata itself, you can attempt to set table.name = 'newname' and re-place the Table object within metadata.tables under its new name, but this may have lingering side effects regarding foreign keys that reference the old name. In general, the pattern is not supported: it's intended that a SQLAlchemy application runs with a fixed database structure (only new tables can be added on the fly).
(Source)
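A sketch of issuing that rename as plain DDL through SQLAlchemy (the table and engine here are illustrative); ALTER TABLE ... RENAME TO is accepted by both PostgreSQL and SQLite:

```python
from sqlalchemy import create_engine, inspect, text

engine = create_engine('sqlite://')  # in-memory database for the demo
with engine.begin() as conn:
    conn.execute(text('CREATE TABLE titanic (id INTEGER PRIMARY KEY, title TEXT)'))
    # the rename itself: raw DDL, outside SQLAlchemy's metadata
    conn.execute(text('ALTER TABLE titanic RENAME TO wild'))

tables = inspect(engine).get_table_names()  # now ['wild']
```

Any declarative model still pointing at __tablename__ = 'titanic' will break after this, which is exactly the "lingering side effects" the quote warns about.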


flask-mongoengine and document not accepting unique or primary_key arguments

I'm trying out flask-mongoengine and MongoHQ, and I'm having some difficulty getting it to declare my documents correctly.
I've declared a document like so:
class numbers(nodb.Document):
    numbers = nodb.StringField(required=True)
    simple_date = nodb.DateTimeField(required=True, unique=True, primary_key=True)
    date = nodb.DateTimeField(default=datetime.now, required=True)
Now when I add an entry to the document, it's not taking my _id or even acknowledging that I've put in the unique or primary_key requirement.
test = numbers(
    _id=datetime.strptime(currentdate, "%m/%d/%Y").date(),
    simple_date=datetime.strptime(currentdate, "%m/%d/%Y").date(),
    numbers='12345'
)
test.save()
Now if I run those lines again, it creates another identical entry in the db, and the requirements on simple_date appear to be ignored. Not sure if I'm hitting a bug here or just doing something wrong?
MongoEngine only creates indexes when the collection does not exist yet; it does not handle data migration. So if you first created the collection without an index and only later declared the index on the model, the index is not created automatically. In your case you must create the indexes manually, or drop your numbers collection (only in a development database where the data isn't needed) so it is recreated with the declared indexes.
