Python - how to run end-to-end test cases with a MySQL database?

Is there an in-memory DB for Python similar to HSQLDB? MySQL is the DB the application uses, and for running end-to-end test cases we currently bring up a clone of the actual DB, which causes some delay and a couple of manual steps.
I have tried SQLite3, but ran into trouble running the DDL queries generated for MySQL.
What are good options for bringing up a temporary DB to run all the test cases and shutting it down after test execution?
Thank you

MySQL has an in-memory engine (https://dev.mysql.com/doc/refman/5.5/en/memory-storage-engine.html). I've never used it, but I guess it will help you run the tests quickly.
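A minimal sketch of that idea, assuming an existing scratch MySQL server plus SQLAlchemy, PyMySQL, and pytest; the connection URL and the users table are placeholders, not anything from the question:

    # Hedged sketch: create throwaway MEMORY-engine tables on a scratch
    # MySQL server before the test session and drop them afterwards.
    import pytest
    import sqlalchemy

    @pytest.fixture(scope="session")
    def memory_db():
        engine = sqlalchemy.create_engine(
            "mysql+pymysql://test_user:test_pass@localhost/test_db"  # placeholder URL
        )
        with engine.begin() as conn:
            conn.execute(sqlalchemy.text(
                "CREATE TABLE IF NOT EXISTS users ("
                "  id INT PRIMARY KEY AUTO_INCREMENT,"
                "  name VARCHAR(100)"
                ") ENGINE=MEMORY"  # keep the table in RAM for speed
            ))
        yield engine
        with engine.begin() as conn:
            conn.execute(sqlalchemy.text("DROP TABLE IF EXISTS users"))
        engine.dispose()

One caveat: the MEMORY engine does not support BLOB or TEXT columns, so DDL generated for InnoDB tables may need small tweaks.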

Related

MySQL Table disappeared after error occurred in my API

I encountered a weird situation and I need your help.
I am developing a RESTful API using Python 3.7 with Flask and SQLAlchemy. The application is hosted on AWS EC2 and the database on AWS RDS (MySQL).
I also have an application hosted on a Raspberry Pi which calls the API and communicates with the EC2 server.
Sometimes I encounter a long transaction time between the Raspberry Pi and my API server. Most of the time I kill the process on the Raspberry Pi, restart it, and debug to see what went wrong. However, when I restart the process I see an error message related to my database. When I then check the database, I notice all my tables are gone; nothing is left. I am pretty sure there are no DROP TABLE statements in my code, and I have no idea why this occurred.
Has anyone encountered the same situation? If yes, please tell me the root cause and the solution for this issue.
By the way, there is no error message recorded in the MySQL log nor in my REST API.
Thank you and good day.
To my eye this looks like magic, and there is too much guessing involved to point a finger properly.
But there is an easy workaround so it does not happen in the future. A good practice is to separate the admin user (who can do anything, including schema migrations) from the connect user (who can insert, update, delete, and select, but may not run any DDL). Only the connect user should be used by the applications. In that case no table drop could be performed even if the application went berserk.
Enabling logging might help too: How to log PostgreSQL queries?
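If you want to set that separation up from Python, here is a minimal sketch; the user names, passwords, and database name are placeholders, and PyMySQL is assumed as the driver:

    # Hedged sketch: create a restricted application user so the app
    # cannot run DDL (no DROP/CREATE/ALTER). Run this as the admin user.
    import pymysql

    admin = pymysql.connect(host="localhost", user="admin",
                            password="admin_pass", autocommit=True)
    with admin.cursor() as cur:
        cur.execute("CREATE USER IF NOT EXISTS 'app_connect'@'%%' "
                    "IDENTIFIED BY 'app_pass'")
        # DML only -- deliberately no DROP, CREATE, or ALTER privileges.
        cur.execute("GRANT SELECT, INSERT, UPDATE, DELETE "
                    "ON mydb.* TO 'app_connect'@'%%'")
    admin.close()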

Is it possible to deploy Django with Sqlite?

I've built a Django app that uses SQLite (the default database), but I can't find anywhere that allows deployment with SQLite. Heroku only works with PostgreSQL, and I've spent two days trying to switch databases and can't figure it out, so I want to just deploy with SQLite. (This is just a small application.)
A few questions:
Is there anywhere I can deploy with SQLite?
If so, where/how?
SQLite is a database on disk. It is very useful for development purposes; however, services like Heroku expect your server-side code to be stateless, which effectively rules out databases such as SQLite. I guess you could make it work (provided you find a place on Heroku's disk to put your SQLite db), but you would constantly lose your database's contents every time you redeploy.
For Heroku specifically, I'll redirect you to this link, which explains how to use Django with PostgreSQL on Heroku.
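As a hedged sketch of what that usually looks like in settings.py, many Heroku deployments read the DATABASE_URL environment variable via the dj-database-url package (assumed installed here):

    import dj_database_url

    # Heroku injects DATABASE_URL; fall back to local SQLite in development.
    DATABASES = {
        "default": dj_database_url.config(default="sqlite:///db.sqlite3")
    }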
Don't use SQLite on Heroku. As stated in the docs, you will lose your data:
SQLite runs in memory, and backs up its data store in files on disk. While this strategy works well for development, Heroku’s Cedar stack has an ephemeral filesystem. You can write to it, and you can read from it, but the contents will be cleared periodically. If you were to use SQLite on Heroku, you would lose your entire database at least once every 24 hours.
Even if Heroku’s disks were persistent, running SQLite would still not be a good fit. Since SQLite does not run as a service, each dyno would run a separate running copy. Each of these copies needs its own disk-backed store. This would mean that each dyno powering your app would have a different set of data since the disks are not synchronized.
Instead of using SQLite on Heroku you can configure your app to run on Postgres.
Sure, you can deploy with SQLite. It's not really recommended, but it should work OK if you have low network traffic.
You set your database engine to sqlite3 in settings.py; just make sure you have write access to the path that you specify for your database.
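For reference, the relevant settings.py block looks roughly like this (the file location is a placeholder; the directory must be writable by the server process):

    from pathlib import Path

    BASE_DIR = Path(__file__).resolve().parent.parent

    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.sqlite3",
            # placeholder path -- the process needs write access to the
            # file and to its parent directory
            "NAME": BASE_DIR / "db.sqlite3",
        }
    }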

Rails app to work with a remote Heroku database

I have built an application in Python that is hosted on Heroku. It basically uses a Python script to store some results in a database (it runs as a scheduled task on a daily basis). I would have done this with Ruby/Rails to avoid this confusion, but the application partner did not support Ruby.
I would like to know if it is possible to build the front end with Ruby on Rails and use the same database.
My Rails application will need to use MVC and have its own tables in the database, but it will also use the database the Python script sends data to, just to retrieve some data from there.
Can I create the Rails app and reference the details of the database that my Python application uses?
How could I test this on my local machine?
What would be the best approach to this?
I don't see any problem with doing this, as long as Rails manages the database structure and the Python script populates it with data.
My advice, just to make it simpler, is to define the database schema through migrations in your Rails app and build it as if the Python script didn't exist.
Once you have completed it, simply start the Python script so it can begin populating the database (it might be necessary to rename some tables in the Python script, but no more than that).
If you want to test on your local machine you can do one of these:
run the Python script on your local machine
configure the database.yml in your Rails app to point to the remote DB (this can be difficult if you don't have administrative access to the host server, because of port forwarding etc.)
The only thing you should keep in mind is concurrent access.
Because you have two applications that both read and write to your DB, it would be better if the Python script did its job in a single atomic transaction, to avoid your Rails app finding the DB in a half-updated state; a sketch of that pattern follows below.
You can see the database as a shared box; it doesn't matter how many applications use it.
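A minimal sketch of the single-transaction idea on the Python side; the table, columns, rows, and connection details are placeholders, and PyMySQL is assumed as the driver:

    # Hedged sketch: do the daily update in one transaction so the Rails
    # app never sees a half-updated state.
    import pymysql

    conn = pymysql.connect(host="localhost", user="script_user",
                           password="secret", database="shared_db")
    try:
        with conn.cursor() as cur:
            cur.execute("DELETE FROM daily_results")  # placeholder table
            cur.executemany(
                "INSERT INTO daily_results (metric, value) VALUES (%s, %s)",
                [("visits", 1024), ("signups", 37)],  # placeholder rows
            )
        conn.commit()    # both statements become visible atomically
    except Exception:
        conn.rollback()  # leave yesterday's data intact on failure
        raise
    finally:
        conn.close()

This assumes InnoDB tables; MyISAM tables ignore transactions.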

Scripting the creation of a database from a backup in SQL Server

I currently have a tediously long process for creating new instances of a CMS we make.
I plan to script as much of the process as I can, using Python.
The first step is creating a database.
Currently it is a manual process where I create an empty database "MyNewSite" and then select restore from backup and restore it from the "master" DB file. Before the restore I change the data and log paths accordingly (so they don't overwrite the master's).
Is there any way to automate this? I'm not really sure where to begin so any help would be appreciated.
The CMS you make should have a deployment script. Your development process should only ever update upgrade scripts, never touch the database directly. Database updates should be deployed through source code (upgrade scripts) and version control: Version Control and your Database.
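That said, if you do end up scripting the restore itself, the manual steps described in the question map to a single RESTORE DATABASE ... WITH MOVE statement, which Python can drive via pyodbc. A minimal sketch follows; the connection string, paths, and logical file names ('master_data', 'master_log') are placeholder assumptions (running RESTORE FILELISTONLY against the .bak shows the real logical names):

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
        "UID=sa;PWD=placeholder",
        autocommit=True,  # RESTORE cannot run inside a transaction
    )
    cur = conn.cursor()
    site = "MyNewSite"
    # Move the data and log files so they don't overwrite the master's.
    cur.execute(f"""
        RESTORE DATABASE [{site}]
        FROM DISK = 'C:\\backups\\master.bak'
        WITH MOVE 'master_data' TO 'C:\\data\\{site}.mdf',
             MOVE 'master_log'  TO 'C:\\data\\{site}_log.ldf'
    """)
    cur.close()
    conn.close()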

Django syncdb locking up on table creation

I've added new models and pushed to our staging server, then run syncdb to create their tables, and it locks up. It gets as far as 'Create table photos_photousertag', and the Postgres output shows the notice for creation of 'photos_photousertag_id_seq', but otherwise I get nothing on either side. I can't Ctrl+C the syncdb process and I have no indication of what route to take from here. Has anyone else run into this?
We use Postgres, and while we've not run into this particular issue, there are some steps you may find helpful in debugging:
a. What version of Postgres and psycopg2 are you using? For that matter, what version of Django?
b. Try running the syncdb command with the "--verbosity=2" option to show all output.
c. Find the SQL that Django is generating by running the "manage.py sql <app_name>" command. Run the CREATE TABLE statements for your new models in the Postgres shell and see what develops.
d. Turn the error logging, statement logging, and server status logging on Postgres way up to see if you can catch any particular messages.
In the past, we've usually found that either option b or option c points out the problem; both can also be driven from Python, as in the sketch below.
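A hedged sketch of that, assuming a syncdb-era Django (1.7/1.8, where django.setup() and the sql command both exist) and using 'photos' as a stand-in app label:

    import django
    from django.core.management import call_command

    django.setup()  # assumes DJANGO_SETTINGS_MODULE is set in the environment
    call_command("syncdb", interactive=False, verbosity=2)  # step b: full output
    call_command("sql", "photos")  # step c: print the generated CREATE TABLE SQL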
I just experienced this as well, and it turned out to be a plain old lock on that particular table, unrelated to Django. Once that cleared, the sync went through just fine.
Try querying the table that the sync is getting stuck on and make sure that's working correctly first; you can also check pg_locks directly, as in the sketch below.
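A minimal way to do that lock check from Python with psycopg2 (the DSN is a placeholder; the table name comes from the question):

    import psycopg2

    conn = psycopg2.connect("dbname=mydb user=postgres")  # placeholder DSN
    with conn, conn.cursor() as cur:
        # List any lock currently held or requested on the stuck table.
        cur.execute("""
            SELECT l.pid, l.mode, l.granted
            FROM pg_locks l
            JOIN pg_class c ON c.oid = l.relation
            WHERE c.relname = %s
        """, ("photos_photousertag",))
        for pid, mode, granted in cur.fetchall():
            print(pid, mode, granted)
    conn.close()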
Strange here too, but simply restarting the PostgreSQL service (or server) solved it. I'd tried manually pasting the table creation code into psql too, but that wasn't solving it either (well, no way it could if it was a lock issue), so I just used the restart:
systemctl restart postgresql.service
That's on my SUSE box.
I am not sure whether reloading the service/server might lift existing table locks too.
