Are there database testing tools for python (like sqlunit)?

Are there database testing tools for python (like sqlunit)? I want to test the DAL that is built using SQLAlchemy.

Follow the design pattern that Django uses:
1. Create a disposable copy of the database (SQLite3 in-memory, for example).
2. Create the database from the SQLAlchemy table and index definitions. This should be a fairly trivial exercise.
3. Load the test data fixture into the database.
4. Run your unit test case against a database with a known, defined state.
5. Dispose of the database.
If you use SQLite3 in-memory, this procedure can be reasonably fast.
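A minimal sketch of that cycle with unittest and plain SQLAlchemy (the Base, the User model, and the fixture rows are hypothetical stand-ins for your own DAL):

import unittest
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, sessionmaker  # SQLAlchemy 1.4+

Base = declarative_base()

class User(Base):
    # Hypothetical model standing in for your DAL's tables.
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String(50), nullable=False)

class DALTestCase(unittest.TestCase):
    def setUp(self):
        # Steps 1-3: disposable in-memory database, schema, and fixture, rebuilt per test.
        self.engine = create_engine("sqlite:///:memory:")
        Base.metadata.create_all(self.engine)
        self.Session = sessionmaker(bind=self.engine)
        session = self.Session()
        session.add_all([User(name="alice"), User(name="bob")])
        session.commit()
        session.close()

    def test_user_count(self):
        # Step 4: the test runs against a known, defined state.
        session = self.Session()
        self.assertEqual(session.query(User).count(), 2)
        session.close()

    def tearDown(self):
        # Step 5: dispose of the database.
        self.engine.dispose()

if __name__ == "__main__":
    unittest.main()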

Related

Architectural pattern for multiple Python applications with shared database

The project consists of two applications in Python with a single shared database:
Application A: performs preprocessing of raw data and stores it in the database;
Application B: performs data analysis techniques on the already populated database.
What should be the right architectural pattern for such a context?
Also, so far I have been using MongoDB and Flask to populate the database, but I don't know whether that is a suitable choice.
If you are asking about a database access pattern, you can always use an ORM (object-relational mapper): a code library that automates the transfer of data between relational database tables and the objects used in application code.
You can learn more about ORMs here: https://www.fullstackpython.com/object-relational-mappers-orms.html
I believe this is better suited to Python than the DAO pattern we use in Java: https://www.tutorialspoint.com/design_pattern/data_access_object_pattern.htm
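As a hedged sketch of what that can look like with SQLAlchemy (assuming a relational back end rather than MongoDB; the module name and the Measurement table are illustrative, not from the question), both applications import the same model module, so the schema is defined in exactly one place:

# models.py, shared by application A (preprocessing) and application B (analysis)
from sqlalchemy import Column, Float, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Measurement(Base):
    # Hypothetical table both applications agree on.
    __tablename__ = "measurements"
    id = Column(Integer, primary_key=True)
    source = Column(String(100))
    value = Column(Float)

def make_session(url="sqlite:///shared.db"):
    # SQLite here only so the sketch runs anywhere; in production this
    # would be the URL of the shared database both applications point at.
    engine = create_engine(url)
    Base.metadata.create_all(engine)
    return sessionmaker(bind=engine)()

# Application A writes:
#     session = make_session()
#     session.add(Measurement(source="sensor-1", value=3.2))
#     session.commit()
# Application B reads:
#     rows = make_session().query(Measurement).filter(Measurement.value > 1.0).all()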

Proper way to centralize the schema for a database

We have our infrastructure up in AWS, which includes a database.
Our transfer of data occurs in Python using the SQLAlchemy ORM, which we use to mirror the database schema. At this point it's very simple, so it's no big deal.
But if the schema changes or grows, a manual change has to be made in the code each time as well.
I was wondering: what is the proper way to centralize the database schema, so that there is a single source of truth for it?
Check out the AWS Glue Schema Registry; this is pretty much what it's made for.
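A hedged sketch of what using it from Python might look like with boto3 (the registry and schema names are made up, and this assumes AWS credentials and a region are already configured):

import json
import boto3

glue = boto3.client("glue")

# A tiny Avro schema; in practice this document is your single source of truth.
schema_definition = json.dumps({
    "type": "record",
    "name": "order",
    "fields": [
        {"name": "id", "type": "int"},
        {"name": "total", "type": "double"},
    ],
})

# One-time setup: create the schema inside a registry (names are hypothetical).
glue.create_schema(
    RegistryId={"RegistryName": "my-registry"},
    SchemaName="orders",
    DataFormat="AVRO",
    Compatibility="BACKWARD",
    SchemaDefinition=schema_definition,
)

# Later changes go in as new versions; the registry enforces compatibility,
# so every consumer (including your SQLAlchemy code) has one place to look.
glue.register_schema_version(
    SchemaId={"RegistryName": "my-registry", "SchemaName": "orders"},
    SchemaDefinition=schema_definition,
)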

Flask SQLAlchemy MySQL app with SQLite test db?

I've been working on a Flask app for a while, using SQLAlchemy to access a MySQL database. I've finally started looking into writing tests for this (I'm a strong believer in testing, but am new to Flask, SQLA, and Python for that matter, so I delayed this), and am having a problem getting my structure set up.
My production database isn't using any unusual MySQL features, and in other languages/frameworks I've been able to set up a test framework using an in-memory SQLite database. (For example, I have a Perl app using DBIx::Class to run a SQL Server db but with a test suite built on SQLite.) However, with SQLAlchemy I've needed to declare a few specific MySQL things in my model, and I'm not sure how to get around this. In particular, I use TINYINT and CHAR types for a few columns, and I seem to have to import these from sqlalchemy.dialects.mysql, since these aren't generic types in SQLA. Thus I'll have a class declaration like:
class Item(db.Model):
    ...
    size = db.Column(TINYINT, db.ForeignKey('size.size_id'), nullable=False)
So even though, if I were using raw SQL, TINYINT would work fine with both SQLite and MySQL, here it comes from the mysql dialect module.
I don't want to override my entire model class in order to cover seemingly trivial things like this. Is there some other solution? I've read what I could about using different databases for testing and production, but this issue hasn't been mentioned. It would be a lot easier to use an in-memory SQLite db for testing, instead of having to have a MySQL test database available for everything.
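For what it's worth, one hedged sketch of a way around this is SQLAlchemy's with_variant(), which lets a column use a generic type everywhere and the dialect-specific type only on MySQL (the fragment mirrors the question's model and assumes the same Flask-SQLAlchemy db object):

from sqlalchemy import Integer
from sqlalchemy.dialects import mysql

# Generic Integer by default; TINYINT only when the engine's dialect is mysql.
SizeType = Integer().with_variant(mysql.TINYINT(), "mysql")

class Item(db.Model):
    id = db.Column(db.Integer, primary_key=True)  # other columns as in the question
    size = db.Column(SizeType, db.ForeignKey('size.size_id'), nullable=False)

Under SQLite the column is created as a plain INTEGER, so the same model works for an in-memory test database and for MySQL in production.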

How do I handle migrations in sqlalchemy without any framework?

I have seen sqlalchemy-migrate and Alembic, but I do not want to use those frameworks. How can I write the migration script myself? Most migrations, as I understand it, revolve around altering or dropping existing tables. Additionally, I use SQLAlchemy mostly at the ORM level rather than the schema/core/engine level.
The reason I wish to do it myself is mostly learning: I want to understand how the Django ORM automatically generates a migration script.
You should just use Alembic to execute raw SQL to start. Then, if you decide to try more Alembic features, you'll be all set.
For example, after creating a new revision named "drop nick", you can execute raw SQL:
op.execute('ALTER TABLE users DROP COLUMN nickname')
This way Alembic handles the version numbers, but you can (or rather have to) do all the SQL manipulation manually.
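A hedged sketch of what the resulting revision file looks like (the identifiers below are placeholders; alembic revision -m "drop nick" generates real ones):

"""drop nick"""
from alembic import op

# Placeholder identifiers; Alembic fills these in when it creates the file.
revision = "1a2b3c4d5e6f"
down_revision = None

def upgrade():
    # Hand-written SQL; Alembic only records that this revision has run.
    op.execute("ALTER TABLE users DROP COLUMN nickname")

def downgrade():
    # Also hand-written: the inverse of upgrade().
    op.execute("ALTER TABLE users ADD COLUMN nickname VARCHAR(50)")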

The choice of using DAL and web2py to connect PostgreSQL database to a web server

I have a database in PostgreSQL and I have to connect it to a web server. I'm familiar with Python programming but not web programming. I have some information about the DAL (Database Abstraction Layer) and writing my queries from within Python, and I have to generalize my queries into functions. Is it a good idea to do this using the DAL and subsequently use web2py to connect it to a web server?
The web2py DAL is a great choice if being database agnostic is a requirement for you. The DAL dynamically generates the SQL in real time, using the specified dialect for the database back end, so you do not have to write SQL code or learn different SQL dialects; your application will therefore be portable among different types of databases. Since the SQL is generated dynamically by the DAL, it ensures that all inserted data is properly escaped, which prevents injection flaws and makes SQL injection impossible.
Additionally, although I could not find anything in the web2py book about automatically generating a model from an existing database, there is a way to quickly and easily create a database model from an existing PostgreSQL database, using a script available on GitHub created by Mariano Reingart, who based it on a script to "generate schemas from dbs" (MySQL) created by Alexandre Andrade. The script is used to create a web2py db model from a PostgreSQL database.
Since I was using an MSSQL database, I needed a similar script, but I couldn't find anything, so I made minor edits to the script, specifically around the SQL and datatypes, so that it was more in line with MSSQL, and it worked like a charm. I had a database model of about 150 tables created in seconds, and it saved me a lot of time.
https://github.com/phektus/cvstash/blob/master/scripts/extract_pgsql_models.py
I hope this helps others out there who are looking for the same.
The web2py DAL has support for Postgres, and you can use it within web2py, or you can take just the dal.py module and use it with your favorite project/framework.
For existing databases, I recommend reading chapter 6 of http://web2py.com/book
I have multiple apps running with Postgres and it works very nicely!
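A hedged sketch of using the DAL standalone with Postgres (nowadays it is packaged separately as pydal; the connection string and table are illustrative):

from pydal import DAL, Field  # inside web2py you'd get DAL and Field from gluon

db = DAL("postgres://user:password@localhost/mydb")  # hypothetical credentials

# define_table emits the dialect-specific DDL for you.
db.define_table("person", Field("name", "string"), Field("age", "integer"))

db.person.insert(name="Alice", age=30)   # values are escaped by the DAL
rows = db(db.person.age > 21).select()   # SQL generated in the Postgres dialect
for row in rows:
    print(row.name, row.age)

db.commit()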
