I am in the process of developing a Python3/tkinter application whose database features I want to base on a remote MySQL server. I have found SQLAlchemy and I am trying to understand the best practices for remote access and user authentication. There will be a client application that asks for a username and password and then gets, inserts and updates data in the remote database.
Right now I am starting with a clean slate, following the steps in a tutorial:
from sqlalchemy import create_engine, ForeignKey
from sqlalchemy import Column, Date, Integer, String
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('mysql+pymysql://DB_USER:DB_PASSWORD@DATABASE_URL:PORT/DATABASENAME',
                       pool_recycle=3600, echo=True)
Base = declarative_base()
connection = engine.connect()

class Process(Base):
    __tablename__ = "processes"

    id = Column(Integer, primary_key=True)
    name = Column(String(50))  # MySQL requires a length for VARCHAR columns

Base.metadata.create_all(engine)
Assuming this is the right way to do it, my first question about this code is:
Isn't there a potential security problem in sending an unencrypted username and password over the Internet? Should some kind of measure be taken to prevent password theft? If so, what should I be doing instead?
The other question I have right now is about users:
Should each application user correspond to a different MySQL database user, or is it more correct to have the client connect as a single database user and then keep my application users in a table (user id, password, ...), defining and managing them in the application code?
Isn't there a potential security problem in sending an unencrypted
username and password over the Internet? Should some kind of measure
be taken to prevent password theft? If so, what should I be doing
instead?
There is a fundamental issue with making a (MySQL) database directly available over the Internet. You can configure MySQL to require SSH tunnels or SSL/TLS certificates, both of which prevent sending passwords in clear text. Generally, though, you'll want to write both your client software and a piece of server software that sits close to the database (plus the protocol between client and server).
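For example, PyMySQL can be told to require TLS by passing an ssl dictionary through SQLAlchemy's connect_args. A minimal sketch, where the host, credentials and CA path are placeholders:

from sqlalchemy import create_engine

# Require TLS on the MySQL connection so credentials never travel in clear text.
engine = create_engine(
    'mysql+pymysql://DB_USER:DB_PASSWORD@DATABASE_URL:3306/DATABASENAME',
    connect_args={'ssl': {'ca': '/path/to/server-ca.pem'}},  # placeholder path
    pool_recycle=3600,
)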
Should each application user correspond to a different MySQL database
user, or is it more correct to have the client connect as a single
database user and then keep my application users in a table
(user id, password, ...), defining and managing them in the application code?
Neither is more correct than the other, but depending on your database (and your client machines) it might influence licensing costs.
Normally your client would authenticate users against the server software you'll be writing, and the server software would then use a single database login to talk to the database.
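As a minimal sketch of that second approach, the application could keep its own user table with salted password hashes. The model name and the PBKDF2 parameters below are illustrative, not taken from the question:

import hashlib
import os

from sqlalchemy import Column, Integer, LargeBinary, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class AppUser(Base):
    __tablename__ = 'app_users'

    id = Column(Integer, primary_key=True)
    username = Column(String(64), unique=True, nullable=False)
    salt = Column(LargeBinary(16), nullable=False)
    password_hash = Column(LargeBinary(32), nullable=False)

def hash_password(password, salt=None):
    # Derive a salted PBKDF2-SHA256 hash; never store the plain password.
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, 100_000)
    return salt, digest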
Related
I have a FastAPI app using SQLAlchemy. This application connects to our legacy database, and the architecture is a bit unusual.
For every client we have, we have a separate database. There is a table called "prefill_data" (I didn't name these) which is basically a representation of their employee data. So this table is completely dynamic from client to client.
So our application will receive the database name they are trying to connect to and build the connection string.
The issue we are facing is actually querying data from that table, given that it's completely dynamic. We have a somewhat working example using DeferredReflection. However, the issue we are seeing is this:
Customer A connects to the database and everything works fine.
Customer B connects to the database, then makes a request against this "prefill_data" table, and the query fails with AttributeError: type object 'DynamicPrefillData' has no attribute 'zone'.
I can reproduce this locally by connecting to one DB, then logging out and logging in as another user who connects to a different database. If I stop and restart the server each time, everything works as expected. So it seems DeferredReflection caches the metadata and doesn't reflect the table again.
This is problematic for us: we need to reflect the table each time the DB connection changes.
I'm a Ruby developer who got assigned to this project, so I have very minimal experience with SQLAlchemy. I'm praying someone can point me in the direction of a fix.
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

def build_session(user, password, host, port, database):  # helper name is illustrative
    database_url = f"mysql+mysqlconnector://{user}:{password}@{host}:{port}/{database}?auth_plugin=mysql_native_password"
    engine = create_engine(
        database_url, isolation_level="READ UNCOMMITTED", pool_recycle=300
    )
    Reflected.prepare(engine)
    return scoped_session(
        sessionmaker(
            autocommit=False, autoflush=False, bind=engine, expire_on_commit=False
        )
    )
class Reflected(DeferredReflection):
    __abstract__ = True

class DynamicPrefillData(Reflected, Base):
    __tablename__ = "prefill_data"
    __table_args__ = {"extend_existing": True}

    id = Column("sequence", Integer, primary_key=True)
Turns out the connection was closing, causing the unexpected behavior.
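For anyone hitting the same symptom, one guard against silently closed pooled connections (a sketch reusing database_url from the snippet above, not necessarily the exact fix applied here) is SQLAlchemy's pool_pre_ping option:

engine = create_engine(
    database_url,
    isolation_level="READ UNCOMMITTED",
    pool_recycle=300,
    pool_pre_ping=True,  # test each connection with a lightweight ping before use
)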
I developed a Django application where a user can change the database dynamically through the UI.
I tried using Django's integrated database configuration in settings.py, but I had to do some workarounds and still faced some weird errors.
Therefore I decided to use pyodbc with a connection string to connect to the database in my views.py.
The user inputs his database credentials (table, user, password, host, port), which then get saved in a database.
Every time a user needs some data, the following method gets invoked:
con = DatabaseConnection.objects.all().first()
conn_str = '{}/{}@{}:{}/{}'.format(con.user, con.password, con.host, con.port, con.name)
Everything works fine, but how exactly do I store the password (or the rest of the data) correctly?
I thought about encrypting and decrypting the password in my views.py, but that wouldn't make sense, would it?
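One common approach, sketched below with the cryptography package, is symmetric encryption with a key kept outside the database; the environment variable name is a placeholder, and con is the model instance from the snippet above:

import os

from cryptography.fernet import Fernet

# The key is generated once with Fernet.generate_key() and kept out of the DB.
fernet = Fernet(os.environ['DB_CRED_KEY'])

encrypted = fernet.encrypt(con.password.encode())  # store this instead of the plain text
plaintext = fernet.decrypt(encrypted).decode()     # decrypt when building conn_str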
Is it possible to use a username, password and db with Redis?
The reason for this question is that in the official pyramid_redis_sessions documentation (http://pyramid-redis-sessions.readthedocs.io/en/latest/gettingstarted.html) the parameter...
redis.sessions.url = redis://username:password@localhost:6379/0
... (to be used inside a Python/Pyramid production.ini, for example) suggests the use of a username, password and db.
However, I have not found anything on the internet that explains how to create a user and password linked to a db on Redis. The link https://stackoverflow.com/a/34781633/3223785 has some information about using a db (Redis).
There is the possibility of setting a password (https://stackoverflow.com/a/7548743/3223785), but that seems to be a different scope of use than the redis.sessions.url parameter.
NOTE: pyramid_redis_sessions provides an implementation of Pyramid's ISession interface, using Redis as its backend.
@Jacky
In Redis, the AUTH command is used to authenticate to the Redis server. Once a client has authenticated against a server, it can switch to any of the DBs configured on that server. There is no built-in authentication against a specific database.
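For illustration with redis-py, assuming the server was started with requirepass (in Redis versions before 6 the username portion of the URL is simply ignored):

import redis

# Authenticate with the server password, then work in keyspace 0.
r = redis.Redis(host='localhost', port=6379, db=0, password='your-password')
r.ping()  # raises an AuthenticationError if the password is wrong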
I'm working on a Flask project and I am using Flask-SQLAlchemy.
I need to work with multiple already existing databases.
I created the "app" object and the SQLAlchemy one:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
app = Flask(__name__)
db = SQLAlchemy(app)
In the configuration I set the default connection and the additional binds:
SQLALCHEMY_DATABASE_URI = 'postgresql://pg_user:pg_pwd@pg_server/pg_db'
SQLALCHEMY_BINDS = {
    'oracle_bind': 'oracle://oracle_user:oracle_pwd@oracle_server/oracle_schema',
    'mssql_bind': 'mssql+pyodbc://mssql_user:mssql_pwd@mssql_server/mssql_schema?driver=FreeTDS'
}
Then I created the table models using the declarative system and, where needed, I set the
__bind_key__ parameter to indicate in which database the table is located.
For example:
class MyTable(db.Model):
    __bind_key__ = 'mssql_bind'
    __tablename__ = 'my_table'

    id = db.Column(db.Integer, nullable=False, primary_key=True)
    val = db.Column(db.String(50), nullable=False)
In this way everything works correctly: when I run a query, it is executed against the right database.
Reading the SQLAlchemy documentation and the Flask-SQLAlchemy documentation, I understand these things
(I write them down to check that I have understood correctly):
You can handle the transactions through the session.
In SQLAlchemy you can bind a session with a specific engine.
Flask-SQLAlchemy automatically creates the session (a scoped_session) at the start of the request and destroys it at the end of the request
so I can do:
record = MyTable(id=1, val='some text')  # the declarative constructor accepts keyword arguments only
db.session.add(record)
db.session.commit()
What I cannot understand is what happens with the session in Flask-SQLAlchemy when we use multiple databases.
I verified that the system binds each table to the right database through the __bind_key__ parameter,
so I can insert data into different databases through db.session and, at commit, everything is saved.
I can't, however, understand whether Flask-SQLAlchemy creates multiple sessions (one for each engine) or manages it in some other way.
In both cases, how is it possible to refer to the session/transaction of a specific database?
If I use db.session.commit(), the system commits on all the databases involved, but what can I do if I want to commit only a single database?
I would do something like:
db.session('mssql_bind').commit()
but I cannot figure out how to do this.
I also saw a Flask-SQLAlchemy implementation which should ease the management of these situations:
Issue: https://github.com/mitsuhiko/flask-sqlalchemy/issues/107
Implementation: https://github.com/mitsuhiko/flask-sqlalchemy/pull/249
but I cannot figure out how to use it.
In Flask-SQLAlchemy, how can I manage sessions separately for each single engine?
Flask-SQLAlchemy uses a customized session that handles bind routing according to the __bind_key__ attribute given in a mapped class. Under the hood it actually adds that key to the info of the created table. In other words, Flask-SQLAlchemy does not create multiple sessions, one for each bind, but a single session that routes to the correct connectable (engine/connection) according to the bind key. Note that vanilla SQLAlchemy has similar functionality out of the box.
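For comparison, here is a minimal sketch of the vanilla SQLAlchemy equivalent, where a plain Session routes classes to engines via a binds mapping; the base-class names are illustrative and the URLs reuse the placeholders from the question:

from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

OracleBase = declarative_base()  # classes mapped to the Oracle database
MsSqlBase = declarative_base()   # classes mapped to the SQL Server database

# One session routes each class hierarchy to its own engine; anything
# not listed falls back to the session's default bind.
Session = sessionmaker(binds={
    OracleBase: create_engine('oracle://oracle_user:oracle_pwd@oracle_server/oracle_schema'),
    MsSqlBase: create_engine('mssql+pyodbc://mssql_user:mssql_pwd@mssql_server/mssql_schema?driver=FreeTDS'),
})
session = Session()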
In both cases, how is it possible to refer to the session/transaction of a specific database?
If I use db.session.commit(), the system commits on all the databases involved, but what can I do if I want to commit only a single database?
It might not be a good idea to subvert the session and issue commits to specific databases mid-session using the connections owned by the session. The session is a whole and keeps track of state for object instances, flushing changes to the databases when needed, etc. That means the transaction handled by the session is not just the database transactions, but the session's own transaction as well. All of that should commit and roll back as one.
You could on the other hand create new SQLAlchemy (or Flask-SQLAlchemy) sessions that possibly join the ongoing transaction in one of the binds:
session = db.create_scoped_session(
    options=dict(bind=db.get_engine(app, 'oracle_bind'),
                 binds={}))
This is what the pull request is about: it allows using an existing transactional connection as the bind for a new Flask-SQLAlchemy session. This is very useful, for example, in testing, as can be seen in the rationale for that pull request. That way you can have a "master" transaction that can, for example, roll back everything done during testing.
Note that the SignallingSession always consults the db.get_engine() method if a bind_key is present. This means that the example session cannot query tables that have no bind key and don't exist in your Oracle DB, but it would still work for tables with your mssql_bind key.
The issue you linked to on the other hand does list ways to issue SQL to specific binds:
rows = db.session.execute(query, params,
                          bind=db.get_engine(app, 'oracle_bind'))
There were other less explicit methods listed as well, but explicit is better than implicit.
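If you only need a one-off query against a single bind, you can also check a connection out of that engine directly, outside the session's transaction. A sketch using the names from this thread:

engine = db.get_engine(app, 'oracle_bind')
with engine.connect() as conn:
    # This runs on the Oracle engine only and does not touch db.session.
    rows = conn.execute(query, params).fetchall()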
I have an SQLAlchemy application that currently uses a local database. The code for the application is given below.
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

log = core.getLogger()  # application-specific logger from the surrounding framework

engine = create_engine('sqlite:///nwtopology.db', echo=False)
Base = declarative_base()
Session = sessionmaker(bind=engine)
session = Session()

class SourcetoPort(Base):
    __tablename__ = 'source_to_port'

    id = Column(Integer, primary_key=True)
    port_no = Column(Integer)
    src_address = Column(String, index=True)

    def __init__(self, src_address, port_no):
        self.src_address = src_address
        self.port_no = port_no
I want to create the database itself on a remote machine. I came across this document:
http://www.sqlalchemy.org/doc_pdfs/sqlalchemy_0_6_3.pdf
In the explanation they mention the line given below.
engine = create_engine('postgresql://scott:tiger@localhost:5432/mydatabase')
My questions are:
1) Does SQLite support remote database creation?
2) How do I keep the connection to the remote machine open at all times? I don't want to initiate an SSH connection every time I have to insert an entry or make a query.
These questions may sound stupid, but I am very new to Python and SQLAlchemy. Any help is appreciated.
Answering your questions:
1) SQLite doesn't support remote database connections. You would have to implement this yourself, for example by putting the SQLite database file on a network-shared filesystem, but that would make your solution less reliable.
My suggestion: do not try to use a remote SQLite database; switch to a traditional RDBMS instead. Please see below for more details.
2) It sounds like your application has outgrown SQLite, and it is a good time to switch to a traditional RDBMS like MySQL or PostgreSQL, where network connections are supported out of the box.
SQLite is a local database. SQLite has a page explaining when to use it. It says:
If you have many client programs accessing a common database over a
network, you should consider using a client/server database engine
instead of SQLite.
The good thing is that your application should be database agnostic, since you are using SQLAlchemy to generate the queries.
So I would do the following:
1) Install the database system on a machine (local or remote, it doesn't matter; you can always move your database to a remote machine later) and configure permissions for your user (create database, alter, select, update and insert).
2) Create the database schema and populate the data to clone your existing SQLite database. There are some tools available for doing so, e.g. Copying Databases across Platforms with SQLAlchemy.
3) Update the DB connection string in your application from SQLite to the remote database of your choice, as sketched below.
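Concretely, the last step is usually just swapping the connection URL; the credentials and host below are placeholders:

# Before: local SQLite file
engine = create_engine('sqlite:///nwtopology.db', echo=False)

# After: a client/server RDBMS on a remote host
engine = create_engine('postgresql://scott:tiger@dbhost:5432/nwtopology', echo=False)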
Times have changed.
If one wishes to make a SQLite database available over the web, one option would be to use CubeSQL as a server, and SQLiteManager for SQLite as its client. For details, see e.g. https://www.sqlabs.com/
Another option might be to use Valentina Server similarly: see https://www.valentina-db.com/en/valentina-server-overview
(These options will probably only be suitable if there is at most one client with write-access at a time.)
Are there any others?