I tried a few different things but I could not succeed, so maybe it's just not possible.
When I create a sqlalchemy engine with
create_engine('mysql://user:password@host/database')
it looks like the database name is compulsory, whereas it isn't with the SQLite backend.
But since I have to manipulate many different databases on the same server, I would like to avoid creating multiple engines.
Is there a way?
Otherwise, what I could do is create a super-engine that creates all the needed engines and then redirects to the right one depending on the database name requested...
One engine pretty much equals one connectable database. See SQLAlchemy docs. You can have as many engines as you want, or you can throw away the engines that you do not need anymore.
Creating one engine for each of your databases is the way I would do it. Then you will be able to access them as needed instead of creating an engine, destroying it, recreating it, and so on.
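A minimal sketch of that approach, assuming placeholder credentials and database names, could look like this:

from sqlalchemy import create_engine, text

# One engine per database on the same server, kept in a dict keyed by
# database name (user, password, host and names are placeholders).
engines = {
    name: create_engine('mysql://user:password@host/' + name)
    for name in ('db_one', 'db_two', 'db_three')
}

# Pick the engine for whichever database is needed at the moment.
with engines['db_one'].connect() as conn:
    conn.execute(text('SELECT 1'))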
Related
I am an 11th grade student and I'm learning how to build a web app with my teammates. Currently we're making a website that shows the school schedule (and also students' marks) and helps users create their to-do lists; of course this web app serves students like me. In the backend we use Python as the main language, Flask as the framework and MySQL for our database. Now everything is OK, and we're trying to build something like an admin interface for the people who host the web app. Specifically, it is where teachers can enter their students' grades and maybe adjust the school timetable. The problem is that we're learning how to use Flask-Admin to build that feature, and we've found out that this tech is only compatible with SQL Server. However, we have a better understanding of MySQL and could therefore create multiple tasks with it; in contrast, we don't know how to use SQL Server to create those functions. Now I have 2 main questions:
Could we use 2 different SQL databases in the backend of our web app? It's the quickest way we know of, but we would have to learn how to use SQL Server.
Could we use 2 different Python backend frameworks in the backend of our web app? We haven't looked into which framework to use yet because we don't know if this is even possible.
We don't know any other way to solve this problem except getting rid of MySQL and using SQL Server instead. However, that is not what we prefer, and we hope to have those 2 questions answered. If anything in our understanding is wrong, please comment and let us know; we also greatly welcome any other solutions. Thanks for answering!!
Directly from the flask-admin docs https://flask-admin.readthedocs.io/en/latest/advanced/#using-different-database-backends is the following:
Using Different Database Backends
Other than SQLAlchemy… There are five different backends for you to choose from, depending on which database you would like to use for your application. If, however, you need to implement your own database backend, have a look at Adding A Model Backend.
If you don’t know where to start, but you’re familiar with relational databases, then you should probably look at using SQLAlchemy. It is a full-featured toolkit, with support for SQLite, PostgreSQL, MySQL, Oracle and MS-SQL amongst others. It really comes into its own once you have lots of data, and a fair amount of relations between your data models. If you want to track spatial data like latitude/longitude points, you should look into GeoAlchemy, as well.
Regarding the original question, it is possible to use two different frameworks in the backend of a web app. One way to do so would be to set up a reverse proxy server (see https://www.nginx.com/resources/glossary/reverse-proxy-server/#:~:text=A%20reverse%20proxy%20server%20is,traffic%20between%20clients%20and%20servers.), but I would recommend giving SQLAlchemy a try before doing so.
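For illustration, a minimal Flask-Admin setup on top of Flask-SQLAlchemy with a MySQL connection string might look like the sketch below (the Student model and the credentials are made up for the example):

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_admin import Admin
from flask_admin.contrib.sqla import ModelView

app = Flask(__name__)
# Placeholder MySQL credentials; any database URL SQLAlchemy supports works here.
app.config['SQLALCHEMY_DATABASE_URI'] = 'mysql+pymysql://user:password@localhost/school'
db = SQLAlchemy(app)

class Student(db.Model):
    # Hypothetical table, just to have something for the admin view.
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80))

admin = Admin(app, name='school-admin')
admin.add_view(ModelView(Student, db.session))  # CRUD screens for Student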
Why do you think that flask-admin is tied to SqlServer? Flask (and flask-admin) can handle different connections to various databases:
https://github.com/flask-admin/flask-admin#introduction
https://flask-admin.readthedocs.io/en/v1.0.9/db/
My guess is you are currently using SqlAlchemy. As explained here, you can use different backends:
The string form of the URL is dialect[+driver]://user:password@host/dbname[?key=value..], where dialect is a database name such as mysql, oracle, postgresql, etc., and driver the name of a DBAPI, such as psycopg2, pyodbc, cx_oracle, etc.
Alternatively, the URL can be an instance of URL.
(https://docs.sqlalchemy.org/en/13/core/engines.html#sqlalchemy.create_engine)
More on Engine here
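So, for example, the very same code can target MySQL or MS SQL Server just by swapping the URL (drivers and credentials below are placeholders):

from sqlalchemy import create_engine

# MySQL through the PyMySQL driver (placeholder credentials)
mysql_engine = create_engine('mysql+pymysql://user:password@localhost/school')

# Microsoft SQL Server through pyodbc, using an ODBC DSN (placeholder name)
mssql_engine = create_engine('mssql+pyodbc://user:password@my_dsn')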
Flask-Admin is an admin view on your database tables - you cannot run it on a "different" db than the tables it should modify. It needs access to the database tables you want it to modify, so you cannot run "admin" on MS SQL and "your data backend" on another database (connection).
Some other things to think about:
MS (T-)SQL and MySQL are infrastructural choices; language-wise they are very closely related, so adapting MySQL knowledge to T-SQL syntax should be possible. Choosing SQL Server may force you to license it - and afaik that comes with fees (either on premise or as an Azure subscription, which might or might not be free for schools (no idea - but you should check that)).
This sounds like an ambitious project for school - depending on where you live, private data protection laws come into play, especially considering you connect names with schedules and grades, which would require you to implement a lot more to comply.
For the first question, I checked the flask-admin documentation and found that the framework already includes several built-in ORM backends, namely SQLAlchemy, MongoEngine, pymongo and Peewee. This means you can directly use the corresponding integration from the flask-admin package to access your database. In your case, you should look at SQLAlchemy, since you are using a SQL database. Both SQL Server and MySQL are supported by SQLAlchemy.
The Adding Model Views section in their official docs also mentions it:
https://flask-admin.readthedocs.io/en/latest/introduction/#getting-started
For the second question, it is technically not possible to apply two different frameworks in one single backend application.
I've been working on a Flask app for a while, using SQLAlchemy to access a MySQL database. I've finally started looking into writing tests for this (I'm a strong believer in testing, but am new to Flask and SQLA and Python for that matter, so delayed this), and am having a problem getting my structure set up.
My production database isn't using any unusual MySQL features, and in other languages/frameworks I've been able to set up a test framework using an in-memory SQLite database. (For example, I have a Perl app using DBIx::Class to run a SQL Server db but with a test suite built on SQLite.) However, with SQLAlchemy I've needed to declare a few specific MySQL things in my model, and I'm not sure how to get around this. In particular, I use TINYINT and CHAR types for a few columns, and I seem to have to import these from sqlalchemy.dialects.mysql, since these aren't generic types in SQLA. Thus I'll have a class declaration like:
from sqlalchemy.dialects.mysql import TINYINT

class Item(db.Model):
    ...
    size = db.Column(TINYINT, db.ForeignKey('size.size_id'), nullable=False)
So even though, if I were using raw SQL, I could use TINYINT with either SQLite or MySQL and it would work fine, here it's coming from the mysql dialect module.
I don't want to override my entire model class in order to cover seemingly trivial things like this. Is there some other solution? I've read what I could about using different databases for testing and production, but this issue hasn't been mentioned. It would be a lot easier to use an in-memory SQLite db for testing, instead of having to have a MySQL test database available for everything.
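As an editorial aside (not something from the question above), one thing worth checking is SQLAlchemy's with_variant(), which lets a column use a generic type everywhere and the MySQL-specific type only when the engine dialect is MySQL; a rough sketch:

from sqlalchemy import SmallInteger
from sqlalchemy.dialects.mysql import TINYINT

class Item(db.Model):
    ...
    # SmallInteger on SQLite (and other backends), TINYINT when running on MySQL.
    size = db.Column(SmallInteger().with_variant(TINYINT(), 'mysql'),
                     db.ForeignKey('size.size_id'), nullable=False)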
I am new to Flask, and that's how I came to know about SQLAlchemy. Can someone explain to me how to use it properly?
Do we need to create a MySQL database first and then connect it to Flask, or is everything done through SQLAlchemy?
And do the tables that exist in MySQL have to be represented the same way in models.py?
I tried many tutorials but none of them explains this.
Thanks in advance
For your use case, to insert hundreds of values at once, use bulk operations like add_all (http://docs.sqlalchemy.org/en/rel_1_0/orm/persistence_techniques.html#bulk-operations).
If you don't know SQLAlchemy, go here: http://docs.sqlalchemy.org/en/latest/orm/tutorial.html
from app import session      # an already-configured Session instance
from models import User      # a mapped model class

# Build the objects in memory, then add and commit them in one round trip.
objects = [User(name="u1"), User(name="u2"), User(name="u3")]
session.add_all(objects)
session.commit()
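The bulk operations page linked above also covers Session.bulk_save_objects, which skips more of the ORM bookkeeping; a rough sketch of that variant:

# Same objects as above, saved with the lower-overhead bulk API.
session.bulk_save_objects(objects)
session.commit()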
I'm completely new to managing data using databases, so I hope my question is not too stupid, but I did not find anything related using the title keywords...
I want to setup a SQL database to store computation results; these are performed using a python library. My idea was to use a python ORM like SQLAlchemy or peewee to store the results to a database.
However, the computations are done by several people on many different machines, including some that are not directly connected to the internet: it is therefore impossible to simply use one common database.
What would be useful to me is a way of saving the data in the ORM's format so I can read it back in directly once I transfer it to a machine where the main database can be accessed.
To summarize, I want to do:
On the 1st machine: Python data -> ORM object -> ORM.fileformat
After transfer on a connected machine: ORM.fileformat -> ORM object -> SQL database
Would anyone know if existing ORMs offer that kind of feature?
Is there a reason why some of the machines cannot be connected to the internet?
If you really can't connect them, what I would do is set up a database and the Python app on each machine where data is collected/generated. Have each machine use the app to store into its own local database, and later you can create a dump of each database from each machine and import those results into one database.
Not the ideal solution but it will work.
OK,
thanks to MAhsan's and Padraic's answers I was able to find out how this can be done: the CSV format is indeed easy to use for import/export from a database.
Here are examples for SQLAlchemy (import 1, import 2, and export) and peewee
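For the record, a rough sketch of that CSV round trip with plain SQLAlchemy (table, column and connection names are placeholders):

import csv
from sqlalchemy import create_engine, text

# On the offline machine: dump the results table to a CSV file.
local = create_engine('sqlite:///local_results.db')
with local.connect() as conn, open('results.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    rows = conn.execute(text('SELECT id, value FROM results'))
    writer.writerow(rows.keys())
    for row in rows:
        writer.writerow(row)

# Later, on a connected machine: load the CSV into the main database.
main = create_engine('postgresql://user:password@dbhost/maindb')
with main.begin() as conn, open('results.csv', newline='') as f:
    for record in csv.DictReader(f):
        conn.execute(text('INSERT INTO results (id, value) VALUES (:id, :value)'),
                     record)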
Let's say I created a product database system for different departments of my company. Each department has its own PostgreSQL database instance for various reasons. The schemata of the databases are the same; however, the data in them is not. For each of these systems a Python application exists that does some business logic (irrelevant here). Each Python app accesses its, and only its, database through SQLAlchemy.
I want to create a supervisor system that can access all data in all of these databases (read-through functionality).
Can I do that with SQLAlchemy? If so, what is the best approach for that kind of problem?
Sure you can do that with SQLAlchemy.
All you need to do is create different connection engines, each with their own session maker. Nothing in SQLAlchemy limits you to only one database at a time.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engines = []
sessions = []
for dbconninfo in databases:   # databases: an iterable of connection URLs
    engine = create_engine(dbconninfo)
    engines.append(engine)
    sessions.append(sessionmaker(bind=engine)())
You can use each session to run queries; result objects are attached to the session that produced them, so changes flow back to the correct database. Do study the session documentation in detail to see what happens if you were to merge an object from one session into another, for example.
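A short usage sketch of that pattern (Product stands in for whatever mapped class the departments share):

# Read-through across all departments: run the same query on every session.
all_products = []
for session in sessions:
    all_products.extend(session.query(Product).all())
# Each result stays attached to the session (and therefore the database)
# it came from, so later changes are flushed back to the right place.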