Python ORM - save or read SQL data from/to files

I'm completely new to managing data using databases, so I hope my question is not too stupid, but I did not find anything related using the title keywords...
I want to set up a SQL database to store computation results; these are performed using a Python library. My idea was to use a Python ORM like SQLAlchemy or peewee to store the results to a database.
However, the computations are done by several people on many different machines, including some that are not directly connected to the internet: it is therefore impossible to simply use one common database.
What would be useful to me would be a way of saving the data in the ORM's format, so that I can read it again directly once I transfer it to a machine where the main database can be accessed.
To summarize, I want to do:
On the 1st machine: Python data -> ORM object -> ORM.fileformat
After transfer on a connected machine: ORM.fileformat -> ORM object -> SQL database
Would anyone know if existing ORMs offer that kind of feature?

Is there a reason why some of the machines cannot be connected to the internet?
If you really can't connect them, what I would do is set up a database and the Python app on each machine where data is collected/generated. Have each machine use the app to store results into its own local database, and then later you can create a dump of each database from each machine and import those results into one database.
Not the ideal solution but it will work.
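As a minimal sketch of that workflow (assuming each machine keeps its results in a local SQLite file; the file names and the merge behaviour are made-up details, not something from the question):

    import sqlite3

    # On each disconnected machine: dump the local results database to a plain-SQL file.
    local = sqlite3.connect("results_local.db")
    with open("results_dump.sql", "w") as f:
        for statement in local.iterdump():
            f.write(statement + "\n")
    local.close()

    # After transferring results_dump.sql to a connected machine: replay it into the central
    # database. The dump includes CREATE TABLE statements, so for repeated imports you would
    # filter those out or merge rows instead of replaying the whole script.
    central = sqlite3.connect("results_central.db")
    with open("results_dump.sql") as f:
        central.executescript(f.read())
    central.close()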

OK, thanks to MAhsan's and Padraic's answers I was able to find how this can be done: the CSV format is indeed easy to use for import/export from a database.
Here are examples for SQLAlchemy (import 1, import 2, and export) and peewee
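For example, a rough sketch of the CSV round trip with SQLAlchemy (the Result model, file names and connection URLs are made-up placeholders):

    import csv
    from sqlalchemy import create_engine, Column, Integer, Float, String
    from sqlalchemy.orm import declarative_base, Session

    Base = declarative_base()

    class Result(Base):  # hypothetical results table
        __tablename__ = "results"
        id = Column(Integer, primary_key=True)
        label = Column(String)
        value = Column(Float)

    # Export: on the disconnected machine, write the ORM rows out to a CSV file.
    local_engine = create_engine("sqlite:///local_results.db")
    Base.metadata.create_all(local_engine)
    with Session(local_engine) as session, open("results.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["label", "value"])
        for r in session.query(Result):
            writer.writerow([r.label, r.value])

    # Import: on the connected machine, read the CSV back into the main database.
    main_engine = create_engine("postgresql://user:password@dbhost/results")  # assumed URL
    Base.metadata.create_all(main_engine)
    with Session(main_engine) as session, open("results.csv", newline="") as f:
        for row in csv.DictReader(f):
            session.add(Result(label=row["label"], value=float(row["value"])))
        session.commit()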

Related

Is there a way to display db2 data in grafana

We use Db2 at our company. I would like to find a way to query Db2 for data and display that data in Grafana, for example to get the number of completed transactions.
I see Grafana supports MySQL natively but not Db2. Is there a way to just add the Db2 driver/libraries?
Worst case: is writing queries in Python and then simply displaying that recorded data with Grafana an effective solution?
Thanks
Don't know if you found what you need, but in case you didn't, you might consider using Db2 REST services and the Grafana plugin 'Simple JSON Datasource'.
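If you go the "write the queries in Python" route instead, a rough sketch could be a small Flask service that runs the Db2 query via ibm_db and answers in the format the Simple JSON Datasource plugin expects (the connection string, table and metric names below are invented placeholders):

    import time
    import ibm_db
    from flask import Flask, jsonify

    app = Flask(__name__)
    CONN_STR = "DATABASE=SAMPLE;HOSTNAME=db2host;PORT=50000;PROTOCOL=TCPIP;UID=user;PWD=secret"

    @app.route("/", methods=["GET"])
    def ok():
        return "ok"  # the plugin uses this route to test the connection

    @app.route("/search", methods=["POST"])
    def search():
        return jsonify(["completed_transactions"])  # metrics offered in Grafana's query editor

    @app.route("/query", methods=["POST"])
    def query():
        conn = ibm_db.connect(CONN_STR, "", "")
        stmt = ibm_db.exec_immediate(
            conn, "SELECT COUNT(*) FROM transactions WHERE status = 'COMPLETED'")
        row = ibm_db.fetch_tuple(stmt)
        ibm_db.close(conn)
        # Each datapoint is [value, timestamp in milliseconds].
        return jsonify([{"target": "completed_transactions",
                         "datapoints": [[row[0], int(time.time() * 1000)]]}])

    if __name__ == "__main__":
        app.run(port=8081)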
In the meantime, two suitable datasources have been developed
grafana-odbc-datasource
grafana-db2-datasource
Unfortunately, both require an enterprise license.
We're currently evaluating other approaches, like copying the data into a PostgreSQL database with an ETL-like job.
A generic ODBC/JDBC plugin is really needed.

Is it possible to use 2 different frameworks in 1 backend of a web app?

I am an 11th grade student and I'm learning how to build a web app with my teammates. Currently, we're making a website that shows the school schedule (and students' marks) and helps users create their to-do lists; of course this web app serves students like me. In the backend we use Python as the main language, Flask as the framework, and MySQL for our database. Everything is OK so far, and now we're trying to build something like an admin interface for the people who host the web app. Specifically, it is where teachers can insert their students' grades and maybe adjust the school timetable. The problem is, we're learning how to use Flask-Admin to code that function, and we've found out this tech is only compatible with SQL Server. However, we have a better understanding of MySQL, so we could create multiple tasks with it; in contrast, we don't know how to use SQL Server to create those functions. Now I have 2 main questions:
Could we use 2 different SQL databases in the backend of our web app? It is the quickest way we know; however, we would have to learn how to use SQL Server.
Could we use 2 different Python backend frameworks in the backend of our web app? We haven't searched which framework to use yet because we don't know if it's possible to do this.
We don't know any other way to solve this problem except getting rid of MySQL and using SQL Server instead. However, this is not the way we prefer, and we hope to have those 2 questions answered. If there is anything wrong in our knowledge please just comment to let us know, and we greatly welcome any other solutions. Thanks for answering!!
Directly from the flask-admin docs https://flask-admin.readthedocs.io/en/latest/advanced/#using-different-database-backends is the following:
Using Different Database Backends: Other than SQLAlchemy… there are five different backends for you to choose from, depending on which database you would like to use for your application. If, however, you need to implement your own database backend, have a look at Adding A Model Backend.
If you don't know where to start, but you're familiar with relational databases, then you should probably look at using SQLAlchemy. It is a full-featured toolkit, with support for SQLite, PostgreSQL, MySQL, Oracle and MS-SQL amongst others. It really comes into its own once you have lots of data, and a fair amount of relations between your data models. If you want to track spatial data like latitude/longitude points, you should look into GeoAlchemy, as well.
Regarding the original question, it is possible to use two different frameworks in the backend of a web app. One way to do so would be to set up a reverse proxy server (see https://www.nginx.com/resources/glossary/reverse-proxy-server/), but I would recommend giving SQLAlchemy a try before doing so.
Why do you think that flask-admin is tied to SqlServer? Flask (and flask-admin) can handle different connections to various databases:
https://github.com/flask-admin/flask-admin#introduction
https://flask-admin.readthedocs.io/en/v1.0.9/db/
My guess is you are currently using SqlAlchemy. As explained here, you can use different backends:
The string form of the URL is dialect[+driver]://user:password@host/dbname[?key=value..], where dialect is a database name such as mysql, oracle, postgresql, etc., and driver the name of a DBAPI, such as psycopg2, pyodbc, cx_oracle, etc.
Alternatively, the URL can be an instance of URL.
(https://docs.sqlalchemy.org/en/13/core/engines.html#sqlalchemy.create_engine)
More on Engine here
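For instance, a minimal sketch of pointing SQLAlchemy at your existing MySQL database (credentials and database name are placeholders; the mysql+pymysql dialect needs the PyMySQL driver installed):

    from sqlalchemy import create_engine, text

    # Placeholder credentials and database name - swap in your own.
    engine = create_engine("mysql+pymysql://user:password@localhost/school_db")

    with engine.connect() as conn:
        print(conn.execute(text("SELECT 1")).scalar())  # quick connectivity check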
Flask-admin is an admin view on your database tables - you cannot run it on a "different" db than the tables it should modify. It needs to have access to the database tables you want it to modify, so you cannot run "admin" on MS SQL and "your data backend" on another database(-connection).
Some other things to think about:
MS (T-)SQL and MySQL are infrastructural choices; language-wise they are very closely related, so adapting MySQL knowledge to T-SQL syntax should be possible. Choosing SQL Server may force you to license it - and afaik that comes with fees (either on premise or as an Azure subscription, which might or might not be free for schools - no idea, but you should check that).
This sounds like an ambitious project for school - depending on where you live, private data protection laws come into play, especially considering you connect names with schedules with grades, which would require you to implement a lot more to comply.
For the first question, I checked the flask-admin documentation and found that the framework already includes several built-in ORM backends, i.e. SQLAlchemy, MongoEngine, pymongo and Peewee. This means you can just directly import the ORM library from the flask-admin package and use it to access your database. For your case, you should look at SQLAlchemy as you are using a SQL database. Both SQL Server and MySQL are supported by SQLAlchemy.
The Adding Model Views section in their official doc also mentions it:
https://flask-admin.readthedocs.io/en/latest/introduction/#getting-started
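A rough sketch of what that looks like while keeping MySQL (the Grade model, credentials and secret key are placeholders, not taken from your project):

    from flask import Flask
    from flask_sqlalchemy import SQLAlchemy
    from flask_admin import Admin
    from flask_admin.contrib.sqla import ModelView

    app = Flask(__name__)
    app.config["SECRET_KEY"] = "change-me"
    app.config["SQLALCHEMY_DATABASE_URI"] = "mysql+pymysql://user:password@localhost/school_db"
    db = SQLAlchemy(app)

    class Grade(db.Model):  # hypothetical table for students' grades
        id = db.Column(db.Integer, primary_key=True)
        student = db.Column(db.String(80))
        subject = db.Column(db.String(80))
        mark = db.Column(db.Float)

    admin = Admin(app, name="school admin")
    admin.add_view(ModelView(Grade, db.session))  # teachers can edit grades at /admin

    if __name__ == "__main__":
        with app.app_context():
            db.create_all()
        app.run(debug=True)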
For the second question, it is technically not possible to apply two different frameworks in one single backend application.

Importing data from multiple related tables in mySQL to SQLite3 or postgreSQL

I'm updating from an ancient language to Django. I want to keep the data from the old project into the new.
But the old project is MySQL, and I'm currently using SQLite3 in dev mode. I read that PostgreSQL is the most capable, though. So the first question is: is it better to set up PostgreSQL while in development, or is it an easy transition from SQLite3 to PostgreSQL?
And for the data in the old project: I am bumping up the table structure from the old MySQL structure, since it has many related tables, and this is handled internally with ForeignKey and ManyToMany in SQLite3 (same in PostgreSQL I guess).
So I'm thinking about how to transfer the data. It's not really much data, maybe 3,000-5,000 rows.
The problem is that I don't want to keep the same table structure, so a plain import would be a terrible idea. I want to have the sweet functionality provided by SQLite3/PostgreSQL.
One idea I had was to join all the data and create a nested JSON for each post, and then define which table each part goes into so the relations are kept.
But this is just my guessing. So I'm asking you if there is a proper way to do this?
Thanks!
Better to create the Postgres database first, then write a Python script which takes the data from the MySQL database and imports it into the Postgres database.
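A rough sketch of such a script, with made-up table and column names and a remapped structure on the Postgres side (here via mysql-connector-python and psycopg2; you could equally create the new rows through the Django ORM):

    import mysql.connector
    import psycopg2

    old = mysql.connector.connect(host="localhost", user="olduser",
                                  password="secret", database="old_project")
    new = psycopg2.connect(host="localhost", user="newuser",
                           password="secret", dbname="new_project")
    old_cur = old.cursor(dictionary=True)
    new_cur = new.cursor()

    old_cur.execute("SELECT id, title, body, author_name FROM old_posts")
    for row in old_cur:
        # Example of remapping: authors get their own table, posts point at them via a FK.
        # Assumes a unique constraint on app_author.name so ON CONFLICT works.
        new_cur.execute(
            "INSERT INTO app_author (name) VALUES (%s) ON CONFLICT (name) DO NOTHING",
            (row["author_name"],))
        new_cur.execute("SELECT id FROM app_author WHERE name = %s", (row["author_name"],))
        author_id = new_cur.fetchone()[0]
        new_cur.execute(
            "INSERT INTO app_post (title, body, author_id) VALUES (%s, %s, %s)",
            (row["title"], row["body"], author_id))

    new.commit()
    new.close()
    old.close()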

Storing/Copying PostgreSQL Database to Another Server through SQLAlchemy

I know there are ways of storing data/tables from one server on another, such as the instructions provided here. However, since I use Python to scrape, create, and store data, I am wondering whether I could accomplish this directly using SQLAlchemy. More precisely, after I store the scraped data in the database I create through SQLAlchemy on my own computer, can I simultaneously store/copy those databases/tables to another computer/server directly through SQLAlchemy? Can anyone help? Thanks so much.
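There is no single built-in "copy this database to another server" call in SQLAlchemy, but a rough sketch of doing it by hand would be to reflect the locally created tables and replay their rows into an engine pointed at the other server (the connection URLs are placeholders; for large databases pg_dump/pg_restore is usually the simpler tool):

    from sqlalchemy import create_engine, MetaData, select

    src = create_engine("postgresql://user:password@localhost/scraped")    # local database
    dst = create_engine("postgresql://user:password@remote-host/scraped")  # other server

    meta = MetaData()
    meta.reflect(bind=src)      # discover the tables that were created locally
    meta.create_all(bind=dst)   # recreate the same schema on the remote server

    with src.connect() as s, dst.begin() as d:
        for table in meta.sorted_tables:   # parents before children, so FKs stay valid
            rows = [dict(r._mapping) for r in s.execute(select(table))]
            if rows:
                d.execute(table.insert(), rows)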

The choice of using DAL and web2py to connect PostgreSQL database to a web server

I have a database in PostgreSQL and I have to connect it to a web server. I'm familiar with Python programming but not web programming. I have some information about the DAL (Database Abstraction Layer), which lets me write my queries from within Python. I have to generalize my queries into functions. Is it a good idea to do this using the DAL and subsequently use web2py to connect it to a web server?
The web2py DAL is a great choice if being database agnostic is a requirement for you. This means the DAL dynamically generates the SQL in real time using the specified dialect for the database back end, so that you do not have to write SQL code or learn different SQL dialects; therefore, your application will be portable among different types of databases. Since SQL is generated dynamically by the DAL, it ensures that all inserted data is properly escaped, which prevents Injection Flaws and makes SQL Injection impossible.
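As a minimal sketch of what that looks like with the standalone pyDAL package (the same layer web2py ships with; the connection string and table are invented examples):

    from pydal import DAL, Field

    # Placeholder Postgres connection string; pyDAL builds the SQL for this dialect itself.
    db = DAL("postgres://user:password@localhost/mydb")

    db.define_table("measurement",
                    Field("sensor", "string"),
                    Field("value", "double"))

    db.measurement.insert(sensor="probe-1", value=3.14)   # DAL generates the INSERT
    for row in db(db.measurement.sensor == "probe-1").select():
        print(row.sensor, row.value)
    db.commit()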
Additionally, although I have not found anything in the web2py book about automatically generating a model from an existing database, there is a way to quickly and easily create a web2py db model from an existing PostgreSQL database, using a script available on GitHub created by Mariano Reingart, who based it on a script to "generate schemas from dbs" (MySQL) created by Alexandre Andrade.
Since I was using an MSSQL database, I needed a similar script, but I couldn't find anything, so I made minor edits to the script, specifically around the SQL and datatypes, so that it was more in line with MSSQL, and it worked like a charm. I had a database model of about 150 tables created in seconds, and it saved me a lot of time.
https://github.com/phektus/cvstash/blob/master/scripts/extract_pgsql_models.py
I hope this helps others out there who are looking for the same.
The web2py DAL has support for Postgres, and you can use it within web2py or take only dal.py and use it with your favorite project/framework.
For existing databases I recommend reading chapter 6 of http://web2py.com/book
I have multiple apps running with Postgres and it works very nice!
