Insert and update with core SQLAlchemy - python

I have an existing database for which I don't have metadata or ORM classes.
I managed to get selects working with:
from sqlalchemy.sql.expression import ColumnClause
from sqlalchemy.sql import table, column, select, update, insert
from sqlalchemy.ext.declarative import *
from sqlalchemy.orm import sessionmaker
from sqlalchemy import create_engine
import pyodbc
db = create_engine('mssql+pyodbc://pytest')
Session = sessionmaker(bind=db)
session = Session()
list = []
list.append (column("field1"))
list.append (column("field2"))
list.append (column("field3"))
s = select(list)
s.append_from('table')
s.append_whereclause("field1 = 'abc'")
s = s.limit(10)
result = session.execute(s)
out = result.fetchall()
print(out)
So far so good.
The only way I can get an update/insert working is by executing a raw query like:
session.execute(<Some sql>)
I would like to make it so I can make a class out of that like:
u = Update("table")
u.Set("file1","some value")
u.Where(<some condition>)
session.execute(u)
I tried (this is just one of the approaches):
i = insert("table")
v = i.values([{"name":"name1"}, {"name":"name2"}])
u = update("table")
u = u.values({"name": "test1"})
I can't get that to execute on:
session.execute(i)
or
session.execute(u)
Any suggestion how to construct an insert or update without writing ORM models?

As you can see from the SQLAlchemy Overview documentation, SQLAlchemy is built in two layers: the ORM and the Core. Currently you are using only a few Core constructs and building everything manually.
To use the Core, you should give SQLAlchemy some metadata about your database so that it can operate on it. Assuming you have a table mytable with columns field1, field2, field3 and a defined primary key, the code below should perform all the tasks you need:
from sqlalchemy import create_engine, MetaData, Table
from sqlalchemy.sql import select, update, insert
# define meta information by reflecting the existing table
engine = create_engine('mssql+pyodbc://pytest')
metadata = MetaData(bind=engine)
mytable = Table('mytable', metadata, autoload=True)
# select
s = mytable.select() # or:
#s = select([mytable]) # or (if only certain columns):
#s = select([mytable.c.field1, mytable.c.field2, mytable.c.field3])
s = s.where(mytable.c.field1 == 'abc')
result = session.execute(s)
out = result.fetchall()
print(out)
# insert
i = insert(mytable)
i = i.values({"field1": "value1", "field2": "value2"})
session.execute(i)
# update
u = update(mytable)
u = u.values({"field3": "new_value"})
u = u.where(mytable.c.id == 33)
session.execute(u)
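Two caveats worth noting: executing inserts and updates through a Session does not commit automatically, and the bound-MetaData/autoload=True reflection style above is deprecated in SQLAlchemy 1.4+. A rough equivalent sketch for newer versions (same table and column names assumed):
from sqlalchemy import MetaData, Table, insert, update
metadata = MetaData()
mytable = Table('mytable', metadata, autoload_with=engine)  # reflection, 1.4+ style
i = insert(mytable).values(field1="value1", field2="value2")
session.execute(i)
u = update(mytable).where(mytable.c.id == 33).values(field3="new_value")
session.execute(u)
session.commit()  # persist the changes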

Related

Using sqlalchemy with psycopg

I need to combine the results of a SQLAlchemy query and a psycopg query.
Currently I use psycopg to do most of my SQL selects in my code. This is done using a cursor and fetchall().
However, I have a separate microservice that returns some extra WHERE clauses I need for my statement, based on some variables. This is returned as a SQLAlchemy SELECT object. This is out of my control.
Example return:
select * from users where name = 'bar';
My current solution for this is to hardcode the results of the microservice (just the WHERE clauses) into an enum and add them to the psycopg statement with an f-string. This is a temporary solution.
Simplified example:
user_name = "bar"
sql_enum = {
    "foo": "name = 'foo'",
    "bar": "name = 'bar'"
}
with conn.cursor() as cur:
    cur.execute(f"select * from users where location = 'FOOBAR' and {sql_enum[user_name]}")
I am looking for a way to better join these two statements. Any suggestions are greatly appreciated!
Rather than mess with dynamic SQL (f-strings, etc.), I would just start with a SQLAlchemy Core select() statement and then add the whereclause from the statement returned by the microservice:
import sqlalchemy as sa
engine = sa.create_engine("postgresql://scott:tiger@192.168.0.199/test")
users = sa.Table(
    "users", sa.MetaData(),
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("name", sa.String(50)),
    sa.Column("location", sa.String(50))
)
users.drop(engine, checkfirst=True)
users.create(engine)
# mock return from microservice
from_ms = sa.select(sa.text("*")).select_from(users).where(users.c.name == "bar")
base_query = sa.select(users).where(users.c.location == "FOOBAR")
full_query = base_query.where(from_ms.whereclause)
engine.echo = True
with engine.begin() as conn:
    result = conn.execute(full_query)
"""SQL emitted:
SELECT users.id, users.name, users.location
FROM users
WHERE users.location = %(location_1)s AND users.name = %(name_1)s
[generated in 0.00077s] {'location_1': 'FOOBAR', 'name_1': 'bar'}
"""

Update a SQL table with data from itself using SQLAlchemy

I have a short SQL script which "copies" selected columns from a SQL table from one id (main_id=1) to two other ids (main_id=3 and 4) of the same table.
There are also some other ids which are part of the primary key of the table.
The script works fine using a PostgreSQL DB.
However, I would like to replace this using SQLAlchemy ORM, and I don't know how to do it.
UPDATE "MyTable" AS T
SET "Varname_1" = Cell."Varname_1",
"Varname_2" = Cell."Varname_2"
FROM "MyTable" AS Cell
WHERE T.id_A = Cell.id_A AND
T.id_B = Cell.id_B AND
Cell.main_id = 1 AND
T.main_id IN (3, 4);
Can anyone help me to "translate" this?
Not sure what you were having problems with, as I was able to do this by following the examples from Multiple Table Updates and Using Aliases and Subqueries sections of the tutorial:
import sqlalchemy
from sqlalchemy import create_engine
engine = create_engine('sqlite:///:memory:', echo=True)
from sqlalchemy import Table, Column, Integer, String, MetaData, ForeignKey
from sqlalchemy import alias
metadata = MetaData()
my_table = Table('MyTable', metadata,
                 Column('id_A', Integer),
                 Column('id_B', Integer),
                 Column('main_id', Integer),
                 Column('varname_1', String),
                 Column('varname_2', String),
                 )
cell = my_table.alias("cell")
stmt = my_table.update(). \
    where(my_table.c.id_A == cell.c.id_A). \
    where(my_table.c.id_B == cell.c.id_B). \
    where(cell.c.main_id == 1). \
    where(my_table.c.main_id.in_([3, 4])). \
    values(varname_1=cell.c.varname_1,
           varname_2=cell.c.varname_2)
print(str(stmt))
print(stmt.compile().params)
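The SQLite engine above is only used to print the compiled statement; to actually run the update against the PostgreSQL database from the question, something along these lines should work (the connection URL is a placeholder):
pg_engine = create_engine("postgresql://user:password@localhost/mydb")  # placeholder URL
with pg_engine.begin() as conn:
    conn.execute(stmt)  # the transaction commits when the block exits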

Flask-SQLAlchemy check if table exists in database

How can I check with Flask-SQLAlchemy whether a table exists in the database?
I have seen similar questions (e.g. Flask-SQLAlchemy check if row exists in table), but I have not managed to make them work.
I have created a table object, like this:
<class 'flask_sqlalchemy.XXX'>
and now I want to check whether that table exists in the database.
I have tried many things, e.g.:
for t in db.metadata.sorted_tables:
    print("tablename", t.name)
Some of the table objects were created earlier but do not exist in the database, yet they are all printed. For example, the output is:
tablename: table_1
tablename: table_2
tablename: table_3
but only table_1 exists in the database; table_2 and table_3 are created dynamically. For now I only want to use table_1.
Thanks very much.
I used these methods. Looking at the model like you did only tells you what SHOULD be in the database.
import sqlalchemy as sa
def database_is_empty():
    table_names = sa.inspect(engine).get_table_names()
    is_empty = table_names == []
    print('Db is empty: {}'.format(is_empty))
    return is_empty

def table_exists(name):
    ret = engine.dialect.has_table(engine, name)
    print('Table "{}" exists: {}'.format(name, ret))
    return ret
There may be a simpler method than this:
def model_exists(model_class):
    engine = db.get_engine(bind=model_class.__bind_key__)
    return model_class.metadata.tables[model_class.__tablename__].exists(engine)
The solution is simple: just add these two lines to your code and it should work fine for you:
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import inspect
...
inspector = inspect(db.engine)
print(inspector.has_table("user")) # output: Boolean
have a nice day
SQLAlchemy's recommended way to check for the presence of a table is to create an inspector object and use its has_table() method.
The following example was copied from sqlalchemy.engine.reflection.Inspector.has_table, with the addition of an SQLite engine (in memory) to make it reproducible:
In [17]: from sqlalchemy import create_engine, inspect
...: from sqlalchemy import MetaData, Table, Column, Text
...: engine = create_engine('sqlite://')
...: meta = MetaData()
...: meta.bind = engine
...: user_table = Table('user', meta, Column("first_name", Text))
...: user_table.create()
...: inspector = inspect(engine)
...: inspector.has_table('user')
Out[17]: True
You can also use the name attribute of the user_table metadata object to check whether the table exists, like so:
inspector.has_table(user_table.name)
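Putting the inspector approach together with a Flask-SQLAlchemy model, a small helper along these lines should work (model_table_exists is just an illustrative name):
from sqlalchemy import inspect
def model_table_exists(db, model_class):
    # check the live database rather than just the metadata
    return inspect(db.engine).has_table(model_class.__tablename__)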

Using window functions to LIMIT a query with SqlAlchemy on Postgres

I'm trying to write the following sql query with sqlalchemy ORM:
SELECT * FROM
(SELECT *, row_number() OVER(w)
FROM (select distinct on (grandma_id, author_id) * from contents) as c
WINDOW w AS (PARTITION BY grandma_id ORDER BY RANDOM())) AS v1
WHERE row_number <= 4;
This is what I've done so far:
s = Session()
unique_users_contents = (s.query(Content).distinct(Content.grandma_id,
                                                   Content.author_id)
                         .subquery())
windowed_contents = (s.query(Content,
                             func.row_number()
                                 .over(partition_by=Content.grandma_id,
                                       order_by=func.random()))
                     .select_from(unique_users_contents)).subquery()
contents = (s.query(Content).select_from(windowed_contents)
            .filter(row_number >= 4))  ## how can I reference the row_number() value?
result = contents
for content in result:
    print "%s\t%s\t%s" % (content.id, content.grandma_id,
                          content.author_id)
As you can see, it's pretty much modeled, but I have no idea how to reference the row_number() result of the subquery from the outer query's WHERE clause. I tried something like windowed_contents.c.row_number and adding a label() call on the window function, but it's not working, and I couldn't find any similar example in the official docs or on Stack Overflow.
How can this be accomplished? And also, could you suggest a better way to do this query?
Combining windowed_contents.c.row_number with a label() is how you'd do it, and it works for me (note that the select_entity_from() method is new in SQLA 0.8.2 and will be needed here in 0.9 instead of select_from()):
from sqlalchemy import *
from sqlalchemy.orm import *
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
class Content(Base):
    __tablename__ = 'contents'
    grandma_id = Column(Integer, primary_key=True)
    author_id = Column(Integer, primary_key=True)

s = Session()
unique_users_contents = s.query(Content).distinct(
        Content.grandma_id, Content.author_id).\
        subquery('c')
q = s.query(
        Content,
        func.row_number().over(
            partition_by=Content.grandma_id,
            order_by=func.random()).label("row_number")
    ).select_entity_from(unique_users_contents).subquery()
q = s.query(Content).select_entity_from(q).filter(q.c.row_number <= 4)
print q
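print q only shows the generated SQL; iterating the query actually executes it, e.g. (a sketch assuming a session bound to an engine and a populated contents table):
for content in q:
    print "%s\t%s" % (content.grandma_id, content.author_id)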

Converting SQL commands to Python's ORM

How would you convert the following code to a Python ORM such as SQLAlchemy?
#1 Putting data to Pg
import os, pg, sys, re, psycopg2
#conn = psycopg2.connect("dbname='tkk' host='localhost' port='5432' user='noa' password='123'")
conn = psycopg2.connect("dbname=tk user=naa password=123")
cur = conn.cursor()
cur.execute("""INSERT INTO courses (course_nro)
VALUES ( %(course_nro)s )""", dict(course_nro='abcd'))
conn.commit()
#2 Fetching
cur.execute("SELECT * FROM courses")
print cur.fetchall()
Examples of the two commands in SQLAlchemy:
insert
sqlalchemy.sql.expression.insert(table, values=None, inline=False, **kwargs)
select
sqlalchemy.sql.expression.select(columns=None, whereclause=None, from_obj=[], **kwargs)
After the initial declarations, you can do something like this:
o = Course(course_nro='abcd')
session.add(o)
session.commit()
and
print session.query(Course).all()
The declarations could look something like this:
from sqlalchemy import *
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
# create an engine, and a base class
engine = create_engine('postgresql://naa:123@localhost/tk')
DeclarativeBase = declarative_base(bind=engine)
metadata = DeclarativeBase.metadata
# create a session
Session = sessionmaker(bind=engine)
session = Session()
# declare the models
class Course(DeclarativeBase):
    __tablename__ = 'courses'
    course_nro = Column('course_nro', CHAR(12), primary_key=True)  # mapped classes need a primary key
This declarative method is just one way of using sqlalchemy.
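For completeness, the same insert and select can also be written with the Core constructs mentioned in the question, using the Table object behind the declarative model; a rough sketch:
courses = Course.__table__  # the Table object behind the declarative model
with engine.begin() as conn:
    conn.execute(courses.insert().values(course_nro='abcd'))
    print(conn.execute(courses.select()).fetchall())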
Even though this is old, more examples can't hurt, right? I thought I'd demonstrate how to do this with PyORMish.
from pyormish import Model

class Course(Model):
    _TABLE_NAME = 'courses'
    _PRIMARY_FIELD = 'id' # or whatever your primary field is
    _SELECT_FIELDS = ('id', 'course_nro')
    _COMMIT_FIELDS = ('course_nro',)

Model.db_config = dict(
    DB_TYPE='postgres',
    DB_CONN_STRING='postgre://naa:123@localhost/tk'
)
To create:
new_course = Course().create(course_nro='abcd')
To select:
# return the first row WHERE course_nro='abcd'
new_course = Course().get_by_fields(course_nro='abcd')
