Accessing all columns from MySQL table using flask sqlalchemy - python

I'm trying to get the columns of a MySQL table, either as a string or as a list.
Since I defined the table through my main app, I could use dir(table_name) to get a list of attributes, but those include private attributes and other built-in attributes like "query" or "query_class". Filtering these out would be possible, but I'm looking for an easier way to get the columns without going through the attribute route.
Between the Flask-SQLAlchemy documentation and the SQLAlchemy documentation, I noticed that table_name.c works in SQLAlchemy. Is there an equivalent for Flask-SQLAlchemy?

Flask-SQLAlchemy does nothing to the SQLAlchemy side of things; it is just a wrapper that makes SQLAlchemy easier to use with Flask, along with some convenience features. You can inspect ORM objects the same way you normally would. For example, inspecting a model returns its mapper.
m = db.inspect(MyModel)
# all orm attributes
print(list(m.all_orm_descriptors.keys()))
# just the columns
print(list(m.columns.keys()))
# etc.
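As an aside that is not in the original answer: since a Flask-SQLAlchemy model exposes its underlying Table as __table__, the table_name.c access mentioned in the question works directly on the model as well (MyModel here is just an illustrative name):
# __table__ is the plain SQLAlchemy Table behind the model, so .c / .columns are available
print(list(MyModel.__table__.columns.keys()))
print([column.name for column in MyModel.__table__.c])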

Related

How to translate SQLAlchemy result rows into nested dicts

I am evaluating a potential setup for using SQLAlchemy in an async/await FastAPI app. I am currently composing models and queries using declarative_base classes, and then executing the queries with Databases (the syntax is much more readable and easy to write for model classes; working directly with SQLAlchemy core tables is not my favorite activity). This all works great.
At this point, I have SQLAlchemy result rows, but I need to convert them into generic dicts, potentially nested due to eagerly loaded relationships (only type I will support in this environment). I can't use SQLAlchemy's ORM because 1) I don't have an engine or session; and 2) the ORM assumes that it can just hit the database whenever it needs to load in objects, which is not the case in an async/await FastAPI app.
Does anyone have ideas or pointers for how to accomplish this? I'm struggling to figure out how to associate result rows with particular relationship keys, in particular. I've been poking around in the SQLAlchemy internals for ideas, but it's pretty opaque since a lot of it assumes an entire layer of object caching and session/engine management that just isn't present in my setup.
The two things I could use ideas about:
How to map column names like table_1_column_name to specific models and their properties
How to detect and map relationships (potentially more than one level deep)
Thanks for any help you can provide!
Update: You can find a runnable example here: https://gist.github.com/onecrayon/dd4803a5099061fa48d52f2d4bc2396b (see lines 92-109 for the relevant place where I need to figure out how to convert a RowProxy to a nested dict by mapping the query column names to the names on the SQLAlchemy model).
If you are database-first, SQLAlchemy's execute method returns a ResultProxy object; you can get its results with methods such as fetchone, first, or fetchall, and then cast them to a list or dict.
See also the docs.
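For illustration, a minimal sketch of that approach, assuming SQLAlchemy 1.3, an existing engine, and a core/reflected table (engine and users_table are hypothetical names):
# execute a core select and turn each returned row into a plain dict
with engine.connect() as connection:
    result = connection.execute(users_table.select())   # returns a ResultProxy
    rows = result.fetchall()                             # list of RowProxy objects
# each 1.3 RowProxy behaves like a mapping; on SQLAlchemy 1.4+ use dict(row._mapping) instead
data = [dict(row) for row in rows]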
Casting the object into a dict should work if the result is a raw SQLAlchemy row and not an ORM-mapped instance.
Judging from your comment on another answer, it looks like you need to map the result back onto an ORM instance. You can define declarative mappings so that your result gets translated back into a Python instance.
A SQLAlchemy result object has an option to return itself as a dict.
Here is the SQLAlchemy doc to help you:
https://docs.sqlalchemy.org/en/13/orm/query.html#sqlalchemy.util.KeyedTuple._asdict
Or you can go with https://pythonhosted.org/dictalchemy, which works as a wrapper on top of SQLAlchemy.
Hope that helps.
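A short sketch of the _asdict() route, assuming a 1.3-style ORM query that returns keyed tuples (the User model and session are only illustrative):
# querying individual columns yields keyed-tuple rows that expose _asdict()
rows = session.query(User.id, User.name).all()
dicts = [row._asdict() for row in rows]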

Is it a bad practice to iterate over a Flask SQLAlchemy object using __dict__?

To build a dynamic user update route in Flask, I iterate over the user Flask-SQLAlchemy object using the dunder __dict__:
parameters = ['name']  # valid parameters after the Regex filter was applied
for parameter in parameters:
    user.__dict__[parameter] = request.form.get(parameter)
I have done this to avoid a chain of ifs. To ensure that only valid parameters are present in parameters, I apply a Regex pattern that filters the valid parameters received in the request for the user route, and I have documented this aspect in the docstring.
I'm asking whether iterating over a Flask-SQLAlchemy object using __dict__ is bad practice, because when I print user.__dict__ I see all attributes, even those that aren't covered by the Regex filter, e.g. password, date created, etc., which should never be updated by this route.
I have found another approach that gets all columns in SQLAlchemy, but I think in the end it's similar to the approach I'm using...
Note: the implemented route can update specific attributes of the user, or all of them, using the same route.
I'd recommend looking into marshmallow-sqlalchemy to manage this sort of thing. I've found that there are very few use-cases where __dict__ is the best solution.
Here's an example application using it: How do I produce nested JSON from database query with joins? Using Python / SQLAlchemy
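As an aside that is not part of the original answer: if you keep the whitelist approach, assigning through setattr instead of writing into __dict__ keeps SQLAlchemy's attribute instrumentation (and therefore change tracking) in the loop. A sketch reusing the question's names, with db assumed to be the Flask-SQLAlchemy instance:
parameters = ['name']  # whitelist of fields this route may update
for parameter in parameters:
    value = request.form.get(parameter)
    if value is not None:
        setattr(user, parameter, value)  # goes through the instrumented attribute
db.session.commit()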

Flask and SQLAlchemy sort in display without new query?

I'm displaying the results from a SQLAlchemy (Flask-SQLAlchemy) query on a particular view. However, the sorting/order is only determined by what I originally passed into the query (order_by(desc(SelectedTable.date_changed))). I'm now trying to add functionality so that each displayed column can be selected to order the presentation.
Is there a way to alter how a returned query object is sorted once it's returned, to create this behavior? Or will I need to build custom queries for each possible column that could be sorted by, ascending or descending?
Is there a recipe for implementing something like this? I've tried Google, this site, and the Flask, Flask-SQLAlchemy, and SQLAlchemy docs for something along these lines, but I haven't seen anything that touches on the subject, and I'm beginning to think I'll need custom queries or, to avoid new queries, some JavaScript in the Jinja template to achieve this.
Thanks!
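For what it's worth, and purely as an illustration rather than an answer from the original thread: once the rows have been fetched they are plain Python objects, so an already-returned list of results can be re-sorted in Python without a new query (the column name here is a hypothetical request parameter):
results = query.all()          # query is the Flask-SQLAlchemy query from the question
sort_column = 'date_changed'   # e.g. chosen from a request argument
descending = True
results.sort(key=lambda row: getattr(row, sort_column), reverse=descending)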

Django query with joins

I need to write a complex query, which retrieves a lot of data from a bunch of tables. Basically I need to find all instances of the models
Customer
Payment
Invoice
where relationships intersect in a specific way. In SQLAlchemy, I would be able to do something like
for c, p, i in session.query(Customer, Payment, Invoice).\
        filter(Customer.id == Payment.customer_id).\
        filter(Invoice.id == Payment.invoice_id).\
        filter(Payment.date == ...).\
        filter(Customer.some_property == ...).\
        all():
    # Do stuff ...
This would allow me to set several constraints and retrieve it all at once. In Django, I currently do something stupid like
customers = Customer.objects.filter(...)
payments = Payment.objects.filter(customer=customer)
invoices = Invoice.objects.filter(customer=customer, payment_set=payments)
Now, we already have three different queries (some details are left out to keep it simple). Could I reduce it to one? Well, I could have done something like
customers = Customer.objects.filter(...).prefetch_related(
    'payments', 'payments__invoices'
)
but now I have to traverse a crazy tree of data instead of having it all laid out neatly in rows, like with SQLAlchemy. Is there any way Django can do something like that? Or would I have to drop down to custom SQL directly?
After reading up on different solutions, I have decided to use SQLAlchemy on top of my Django models. Some people try to completely replace the Django ORM with SQLAlchemy, but this almost completely defeats the purpose of using Django, since most of the framework relies on the ORM.
Instead, I use SQLAlchemy simply for querying the tables defined by the Django ORM. I follow a recipe similar to this:
# Setup sqlalchemy bindings
import sqlalchemy as s
from sqlalchemy.orm import sessionmaker
engine = s.create_engine('postgresql://<user>:<password>@<host>:<port>/<db_name>')
# Automatically read the database tables and create metadata
meta = s.MetaData()
meta.reflect(bind=engine)
Session = sessionmaker(bind=engine)
# Create a session, which can query the tables
session = Session()
# Build table instances without hardcoding table names
s_payment = meta.tables[models.Payment()._meta.db_table]
s_allocation = meta.tables[models.Allocation()._meta.db_table]
s_customer = meta.tables[models.Customer()._meta.db_table]
s_invoice = meta.tables[models.Invoice()._meta.db_table]
report = session.query(s_payment.c.amount, ...).all()
There is room for a few improvements to this recipe; for example, it is not very elegant to create an empty instance of each Django model just to find its table name. However, with a few lines of code I get the full flexibility of SQLAlchemy without compromising the Django ORM layer. This means both can live happily alongside each other.
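As a small aside that is not part of the original answer: Django exposes _meta on the model class itself, so the throwaway instances are not strictly necessary. A hedged sketch of that refinement:
# _meta is available on the class, so no empty instance is needed
s_payment = meta.tables[models.Payment._meta.db_table]
s_customer = meta.tables[models.Customer._meta.db_table]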
One caveat is that SQLAlchemy will not use the same connection as the Django ORM, which means the view of things may not appear consistent if I use both approaches in the same context. This won't be a problem for me, though, since I just want to read a bunch of data from the database.

Grabbing multiple documents with flask-mongoengine

I'm using Flask-MongoEngine in my Python application, and I'm trying to grab a list of documents WHERE a field equals some value. I know how to grab a single document based on the value of a field using get(name="chris"), but how would I do this when returning multiple documents? Nothing in the docs really stands out.
MongoEngine Document classes have an objects attribute, which is used for accessing the objects in the database associated with the class. For example:
uk_users = User.objects(country='uk')
For advanced queries you can use the filter attribute:
uk_female_users = User.objects(country='uk').filter(gender='f')
This is the related documentation: MongoEngine - Querying the database
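For completeness, a tiny usage sketch (the User document and its fields come from the answer's example and are only illustrative): the returned QuerySet is evaluated lazily and can be iterated or reduced to a single document.
uk_users = User.objects(country='uk')
for user in uk_users:           # iterating executes the query
    print(user.name)            # assumes the document defines a name field
first_match = uk_users.first()  # or grab just one matching document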
