sqlalchemy create_engine() if the db already exists - python

from pox.core import core
import pox.openflow.libopenflow_01 as of
import re
import datetime
from sqlalchemy import create_engine, ForeignKey
from sqlalchemy import Column, Date, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship, backref
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
log = core.getLogger()
engine = create_engine('sqlite:///nwtopology.db', echo=False)
Base = declarative_base()
Session = sessionmaker(bind=engine)
session = Session()
If I call the last four Python statements repeatedly by restarting the program, will it have a negative impact on the correct functioning of the database? Will it create the database again if one already exists?

As sberry wrote, calling create_engine and creating a session again each time you rerun the script just builds a new SQLAlchemy Engine object and opens a connection to the existing file.
So rerunning those statements won't recreate the SQLite database file and won't affect the data that is already in it.
I would also suggest calling session.commit() for any updates/inserts your script makes, and session.close() at the end of the script to release the connection cleanly.
Note that close() by itself does not commit: changes that were never committed are discarded when the session goes away.
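If the same script is also responsible for creating the schema, Base.metadata.create_all() is equally safe to rerun, since it only issues CREATE TABLE for tables that don't exist yet. A minimal sketch of an idempotent setup (the Node model is made up for illustration and is not part of the question's code):
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class Node(Base):
    __tablename__ = 'nodes'            # hypothetical table, for illustration only
    id = Column(Integer, primary_key=True)
    name = Column(String)

engine = create_engine('sqlite:///nwtopology.db', echo=False)
Base.metadata.create_all(engine)       # no-op for tables that already exist

Session = sessionmaker(bind=engine)
session = Session()
try:
    session.add(Node(name='switch-1'))
    session.commit()                   # persist changes explicitly
finally:
    session.close()                    # releases the connection; does not commit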

Related

sessionmaker object has no attribute add

I am trying to insert data into a PostgreSQL server. When I try to add the data to the SQLAlchemy session, I get the error "sessionmaker object has no attribute add":
from sqlalchemy.orm import Session

def create_new_user(user: UserCreate, db: Session):
    user = User(username=user.username,
                email=user.email,
                hashed_password=Hasher.get_password_hash(user.password),
                is_active=True,
                is_superuser=False)
    db.add(user)
    db.commit()
    db.refresh(user)
    return user
You should create a session object from Session, as shown in the example below, which uses a context manager. The error means that the db you passed in is the sessionmaker factory itself, not a Session instance.
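A minimal sketch of that fix (SQLAlchemy 1.4+; engine, User, Hasher and UserCreate are assumed to exist as in the question):
from sqlalchemy.orm import sessionmaker

SessionLocal = sessionmaker(bind=engine)      # a factory, not a session

def create_new_user(user_in: UserCreate):
    # calling the factory gives a real Session; the context manager closes it for us
    with SessionLocal() as db:
        user = User(username=user_in.username,
                    email=user_in.email,
                    hashed_password=Hasher.get_password_hash(user_in.password),
                    is_active=True,
                    is_superuser=False)
        db.add(user)
        db.commit()
        db.refresh(user)
        return user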
Currently, I use the scoped_session pattern (suitable for most web apps). Here is what my session initialization looks like:
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker
engine = create_engine("sqlite://")
Session = scoped_session(sessionmaker(bind=engine))
Session() # --> returns the same object in the same thread

pandas.read_sql Read uncommitted with SQLAlchemy

I am trying to use the pandas function pd.read_sql to read records that have been created, added, and flushed in a SQLAlchemy session, but not committed. So I want to create an object in a SQLAlchemy session and query it with pandas before calling commit. Using pandas 0.22.0 and SQLAlchemy 1.1.10.
I have tried setting the isolation_level on create_engine, and various other ways of setting the isolation level to 'READ UNCOMMITTED', but this does not seem to work. Minimal example below:
# Import packages
import pandas as pd
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import sessionmaker
# Set up an example ORM
Base = declarative_base()
class Record(Base):
    __tablename__ = 'records'
    id = Column(Integer, primary_key=True)
    foo = Column(String(255))
# Create a session and engine:
database = 'foobar'
user = ''
password = ''
host = 'localhost'
port = '5432'
connection_string = f"postgresql+psycopg2://{user}:{password}@{host}:{port}/{database}"
engine = create_engine(connection_string, encoding='utf8', convert_unicode=True,
                       isolation_level='READ UNCOMMITTED')
session = sessionmaker()
session.configure(bind=engine)
db = session()
# Set up the example record:
Record.__table__.create(bind=engine)
record = Record(foo='bar')
db.add(record)
db.flush()
# Attempt to query:
records = pd.read_sql('select * from records', db.get_bind())
assert records.empty
I am looking for a solution that will cause the above code to throw an AssertionError on the last line; records.empty currently evaluates to True.
And of course I figured it out as soon as I posted here. For posterity: use db.connection() instead of db.get_bind().
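The reason, as far as I understand it: db.get_bind() returns the Engine, so pandas checks out a fresh connection in a separate transaction that cannot see the flushed-but-uncommitted row, while db.connection() returns the Connection participating in the session's own transaction. Continuing the example above, the working query looks like this:
# query through the session's own connection so the flushed row is visible
records = pd.read_sql('select * from records', db.connection())
assert not records.empty   # the uncommitted record shows up now

db.rollback()              # nothing was committed, so the row disappears again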

sqlalchemy existing database query

I am using SQLAlchemy as the ORM for a Python project. I have created a few models/schemas and they work fine. Now I need to query an existing MySQL database, with no inserts/updates, just SELECT statements.
How can I create a wrapper around the tables of this existing database? I have briefly gone through the SQLAlchemy docs and SO but couldn't find anything relevant. Everything suggests the execute method, where I would need to write raw SQL queries, while I want to use the SQLAlchemy query method the same way I use it with SA models.
For example, if the existing db has a table named User, then I want to query it using the dbsession (only select operations, possibly with joins).
You seem to be under the impression that SQLAlchemy can only work with a database structure created by SQLAlchemy (probably using MetaData.create_all()) - this is not correct. SQLAlchemy works perfectly well with a pre-existing database; you just need to define your models to match the database tables. One way to do that is to use reflection, as Ilja Everilä suggests:
from sqlalchemy import Table
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class MyClass(Base):
    __table__ = Table('mytable', Base.metadata,
                      autoload=True, autoload_with=some_engine)
(which, in my opinion, would be totally fine for one-off scripts but may lead to incredibly frustrating bugs in a "real" application if there is a chance that the database structure changes over time)
Another way is to simply define your models as usual, taking care to match the database tables, which is not that difficult. The benefit of this approach is that you can map only a subset of the database tables to your models, and even only a subset of table columns to your model's fields. Suppose you have 10 tables in the database but are only interested in the users table, from which you only need the id, name and email fields:
import sqlalchemy as sa
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = 'users'
    id = sa.Column(sa.Integer, primary_key=True)
    name = sa.Column(sa.String)
    email = sa.Column(sa.String)
(note how we didn't need to define some details which are only needed to emit correct DDL, such as the length of the String fields or the fact that the email field has an index)
SQLAlchemy will not emit INSERT/UPDATE queries unless you create or modify model objects in your code. If you want to ensure that your queries are read-only, you may create a special user in the database and grant that user SELECT privileges only. Alternatively, or in addition, you may also experiment with rolling back the transaction in your application code, as sketched below.
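One possible shape for that rollback idea (my own sketch, reusing some_engine and the User model from above):
from contextlib import contextmanager
from sqlalchemy.orm import sessionmaker

Session = sessionmaker(bind=some_engine)

@contextmanager
def read_only_session():
    session = Session()
    try:
        yield session
    finally:
        session.rollback()   # throw away anything that was accidentally written
        session.close()

with read_only_session() as session:
    users = session.query(User).filter(User.name == 'alice').all()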
You can access an existing table using the automap extension:
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
Base = automap_base()
Base.prepare(engine, reflect=True)
Users = Base.classes.users
session = Session(engine)
res = session.query(Users).first()
Create a Table with autoload enabled and SQLAlchemy will inspect it for you. Some example code:
from sqlalchemy.sql import select
from sqlalchemy import create_engine, MetaData, Table

CONN_STR = '…'
engine = create_engine(CONN_STR, echo=True)
metadata = MetaData()
cookies = Table('cookies', metadata, autoload=True,
                autoload_with=engine)
cols = cookies.c

with engine.connect() as conn:
    query = (
        select([cols.created_at, cols.name])
        .order_by(cols.created_at)
        .limit(1)
    )
    for row in conn.execute(query):
        print(row)
Other answers don't mention what to do if you have a table with no primary key, so I thought I would address this. Assuming a table called Customers that has columns CustomerId, CustomerName and CustomerLocation, you could do:
from sqlalchemy.ext.automap import automap_base
from sqlalchemy import create_engine, MetaData, Column, String, Table
from sqlalchemy.orm import Session

conn_str = '...'
engine = create_engine(conn_str)
metadata = MetaData()

# you only need to define which column is the primary key; automap can map the rest of the columns
customers = Table('Customers', metadata,
                  Column('CustomerId', String, primary_key=True),
                  autoload=True, autoload_with=engine)

Base = automap_base(metadata=metadata)
Base.prepare()

Customers = Base.classes.Customers

session = Session(engine)
customer1 = session.query(Customers).first()
print(customer1.CustomerName)
Assume we have a PostgreSQL database named accounts, and it already has a table named users.
import sqlalchemy as sa
from sqlalchemy.ext.declarative import declarative_base

psw = "verysecret"
db = "accounts"

# create an engine
pengine = sa.create_engine('postgresql+psycopg2://postgres:' + psw + '@localhost/' + db)

# define the declarative base
Base = declarative_base()

# reflect the current database schema into the metadata
metadata = sa.MetaData(pengine)
metadata.reflect()

# build your User class on the existing `users` table
class User(Base):
    __table__ = sa.Table("users", metadata)

# call the session maker factory
Session = sa.orm.sessionmaker(pengine)
session = Session()

# filter a record
session.query(User).filter(User.id == 1).first()
Warning: your table should have a primary key defined; otherwise SQLAlchemy won't be able to map it and will refuse to build the class (a possible workaround is sketched below).
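If you do have to map such a table, one workaround (my own sketch, not part of the answer above) is to tell the mapper explicitly which column(s) to treat as the primary key when building the class on the reflected table:
users_table = sa.Table("users", metadata)   # the table reflected above

class User(Base):
    __table__ = users_table
    # treat `id` as the key even though the database declares no primary key
    __mapper_args__ = {"primary_key": [users_table.c.id]}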

Accessing same db.session across different modules in sqlalchemy

I am very new to SQLAlchemy and am trying to figure out how to structure things more cleanly and connect them properly.
I have created a /model base.py module where I create a session and define all my entities as tables (along with relationships, etc.). I want to create another module in which I perform CRUD operations on the entities (tables) from base.py. This file is called object.py; it has the class BaseAPI(object) with the functions "create", "read", "update" and "delete". I want to make sure that object.py connects to the tables in base.py and operates on the User entity. In this case, the entity (table) is Users.
This is what I have in the API module object.py:
#!/usr/bin/env python
from sqlalchemy import create_engine
from sqlalchemy.orm import relationship, backref, sessionmaker
from datetime import datetime, timedelta
import notssdb.model
from base import User  # importing from the module base.py -- doesn't work

engine = create_engine('sqlite:///./notssdb.db', echo=True)  # file-based SQLite engine

# create a Session
Session = sessionmaker(bind=engine)

class BaseAPI(object):
    # DBSession = scoped_session(sessionmaker(engine))
    # users = DBSession.query(User).all()

    def __init__(self):
        session = Session()

    # CREATE USER
    def create_user(self, username, password, fullname):
        new_user = User(username, password, fullname)
        self.session.commit(new_user)
        print(username, password, fullname)
Am I importing too many things? Do I need to import all the sqlalchemy tools? Does my init constructor under class BaseAPI need to instantiate the DB session?
1. Am I importing too many things? Do I need to import all the sqlalchemy tools?
SQLAlchemy doesn't have its own coding style; you just follow normal Python style. If you don't use a module, there is no point in importing it.
I don't see relationship or backref being used anywhere (from sqlalchemy.orm import relationship, backref); those are only needed while defining models, so you don't need to import them here.
2. Does my init constructor under class BaseAPI need to instantiate the DB session?
There is no hard rule that you have to initiate the session in your BaseAPI; you could even write your program like this:
#!/usr/bin/env python
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from base import User  # your User model from base.py

engine = create_engine('sqlite:///./notssdb.db', echo=True)

# create a Session
Session = sessionmaker(bind=engine)
session = Session()

class BaseAPI(object):
    # CREATE USER
    def create_user(self, username, password, fullname):
        new_user = User(username, password, fullname)
        session.add(new_user)
        session.commit()
        print(username, password, fullname)
But it is not good practice to mix your connection-handling code with your user manager, so I would suggest the following structure.
Note: this is just sample code and I didn't execute it; it only shows how to structure your code.
First create a separate module for connection management, maybe connection_manager.py, with the content below.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine('sqlite:///./notssdb.db', echo=True)

# create a Session
Session = sessionmaker(bind=engine)

class SessionManager(object):
    def __init__(self):
        self.session = Session()
Then create your user_manager.py and import SessionManager there.
from base import User  # this is your User model
from connection_manager import SessionManager

class UserManager(SessionManager):
    def create_user(self, username, password, fullname):
        new_user = User(username, password, fullname)
        self.session.add(new_user)
        self.session.commit()
        print(username, password, fullname)

    def delete_user(self, *args):
        pass  # your code
This way you can make your code cleaner.
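A short, hypothetical usage sketch of the two modules above (the User constructor signature is assumed to match the question's code):
from user_manager import UserManager

manager = UserManager()          # SessionManager.__init__ opens the session
manager.create_user('alice', 's3cret', 'Alice Example')
manager.session.close()          # release the connection when you are done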

Declarative SQLAlchemy CREATE SQLITE in memory tables

This is how I set up my database for an application (in Flask):
from sqlalchemy.engine import create_engine
from sqlalchemy.orm import scoped_session, create_session
from sqlalchemy.ext.declarative import declarative_base

engine = None
db_session = scoped_session(lambda: create_session(bind=engine,
                            autoflush=False, autocommit=False, expire_on_commit=True))
Base = declarative_base()
Base.query = db_session.query_property()

def init_engine(uri, **kwargs):
    global engine
    engine = create_engine(uri, **kwargs)
    Base.metadata.create_all(bind=engine)
    return engine
If I connect to a file database that has had tables created already, everything works fine, but using sqlite:///:memory: as a target database gives me:
OperationalError: (OperationalError) no such table: users u'DELETE FROM users' ()
when querying like so, for example:
UsersTable.query.delete()
db_session.commit()
I am accessing this code from a unit test. What is the problem?
Thanks
Edit:
Working setup of the application:
app = Flask(__name__)
app.config.from_object(__name__)
app.secret_key = 'XXX'
# presenters
from presenters.users import users
# register modules (presenters)
app.register_module(users)
# initialize the database
init_engine(db)
The code you posted doesn't contain any table/class declaration. Are you sure that the declaration is done before init_engine() is called?
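To illustrate that ordering: the model classes have to be imported (so they register themselves on Base.metadata) before init_engine() runs create_all(). With a file database an earlier run may already have created the tables, but an in-memory database starts empty on every run, which is why only the :memory: case fails. A hedged sketch with made-up module names:
# models.py (hypothetical) -- importing this registers the table on Base.metadata
from database import Base            # the module with the setup code above
from sqlalchemy import Column, Integer, String

class UsersTable(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    name = Column(String)

# test setup (hypothetical) -- import the models BEFORE calling init_engine(),
# otherwise Base.metadata is empty and create_all() creates nothing
from models import UsersTable
from database import init_engine, db_session

init_engine('sqlite:///:memory:')
UsersTable.query.delete()            # the table now exists, so no OperationalError
db_session.commit()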
