How do I create a UI that shows the data in a presentable format and keeps it updated dynamically?
MySQL queries need to be executed automatically to pull data from a database. Currently this has to be done manually by running each query.
So how do I build a UI, and how do I link it to MySQL queries that run automatically?
Related
I successfully created a series of queries using PostgreSQL's great extension pgRouting. Now I'm trying to set up a Python script to run the whole process from Python. I connected to the PostgreSQL database using the SQLAlchemy library. All queries work well except pgr_nodeNetwork and pgr_createTopology, which are pgRouting functions. When I run the Python script, it finishes successfully and prints all the messages, exactly the same ones PostgreSQL does. pgr_nodeNetwork should create a new table, but it isn't created. Is there a different way to make these two functions work?
Update: I noticed in the pgAdmin dashboard that the state for this transaction is "idle in transaction".
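That "idle in transaction" state usually means a transaction was opened but never committed, so anything the pgRouting calls created gets rolled back when the connection closes. A minimal sketch of forcing a commit around the calls (the connection URL, table name, tolerance, and column names below are placeholders, not from the question):
# Sketch only: run the pgRouting functions inside an explicit transaction
# so the new table is committed instead of rolled back.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:secret@localhost/routing")  # placeholder URL

# engine.begin() commits automatically when the block exits without error
with engine.begin() as conn:
    conn.execute(text("SELECT pgr_nodeNetwork('edges', 0.001)"))
    conn.execute(text("SELECT pgr_createTopology('edges_noded', 0.001, 'the_geom', 'id')"))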
I'm building a web application which uses SQLAlchemy to store and retrieve data. My goal is to update the SQLite database on a scheduled daily basis in the background while the app is constantly running. My current approach works as follows:
The SQLite database is first initialized and built by the script initializedb.py, which reads through a series of text files and adds the proper information as rows to a table in the database
The Pyramid app is then run and is accessible via localhost:6543
The user is then able to access a list read from the SQLite database, rendered using a Jinja2 template
My app will be running constantly, 24/7, so that the user can access this list at any time. Because the text files I initialize the database from are constantly updating, I want to update the database every day as well. My main question is this:
How would I automatically update the database on a daily basis using SQLAlchemy and Pyramid?
Should the code that updates the database periodically live in a script running separately from the app, or in the Pyramid code itself, such as in views.py?
Use cron to schedule regular tasks
Just use cron. Run your initialisation code once per day to recreate the database.
If you need to be more sophisticated you can use Celery for more advanced scheduling, but I think cron is the best place to start.
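For example, a minimal sketch of the daily job (the paths, schedule, and log file are assumptions, not a drop-in config):
# rebuild_db.py -- sketch of the daily rebuild job that cron would run.
# Example crontab entry, rebuilding the database every day at 02:00:
#   0 2 * * * /usr/bin/python /path/to/rebuild_db.py >> /var/log/rebuild_db.log 2>&1
import subprocess
import sys

def main():
    # Re-run the same initialisation script that built the database originally.
    result = subprocess.run([sys.executable, "/path/to/initializedb.py"])
    sys.exit(result.returncode)

if __name__ == "__main__":
    main()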
Should you make the database primary?
You should try to have only one copy of your data. It sounds like you have text files and are 'importing' them into a database, but that the text files are updated regularly by some other process.
An alternative approach is to make the database the canonical version of the data. You could create an administrative interface in your app to update the database.
If the data comes in via automatic processes, then perhaps you could create an import script to ingest the new data.
This could be done via a command-line script. Just add this kind of thing to your setup.py:
entry_points = """\
[paste.app_factory]
main = myapp:main
[console_scripts]
some_script = myapp.scripts.script:main
another_script = myapp.any_module:some_function
"""
Is it possible to run a Python Flask app that pulls from a SQL database while also running a Python script that updates that database every few seconds?
Yes, databases are designed to handle this type of concurrent access. If the database is in the middle of an update, it will wait until the update is complete before handling the Flask app's query, and it will complete the query before starting the next incoming update.
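As an illustration, here is a sketch of the updating side (SQLite stands in for the database, and the table and column names are made up):
# updater.py -- sketch of a side script that writes every few seconds
# while a separate Flask app reads from the same database.
import time
import sqlite3  # stands in for any SQL database here

def run_updater(db_path="app.db"):
    while True:
        conn = sqlite3.connect(db_path, timeout=10)  # wait if a reader holds the lock
        with conn:  # commits on success, so readers never see half an update
            conn.execute(
                "UPDATE readings SET value = value + 1 WHERE sensor = ?", ("s1",)
            )
        conn.close()
        time.sleep(5)

if __name__ == "__main__":
    run_updater()
The Flask app simply queries the same database in its view functions; each query sees the last committed state.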
So I have a Google Sheet that maintains a lot of data. I also have a MySQL DB with a huge chunk of data. There is a vital piece of information present in both the Sheet and the DB, and the two need to be in sync. The information always enters the Sheet first. I had a Python script with MySQL queries to update my database separately.
Now the workflow has changed: data will enter the Sheet, and whenever that happens the database has to be updated automatically.
After some research, I found that using the onEdit function of Google Apps Script (I learned from here), I can pick up when the file has changed.
The next step is to fetch the data from the relevant cell, which I can do using this.
Now I need to connect to the DB and send some queries. This is where I am stuck.
Approach 1:
Have a Python web app running live and send the data to it via UrlFetchApp. I have yet to try this (a sketch of such a receiving endpoint follows below).
Approach 2:
Connect to MySQL remotely through Apps Script. But after 2-3 hours of reading the docs, I am not sure this is possible.
So this is my scenario. Is there any viable solution you can think of, or a better approach?
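For approach 1, the receiving end could look roughly like this (the route, payload fields, table, and credentials are all placeholders):
# webhook.py -- sketch of a web app that receives the edited cell from
# Apps Script's UrlFetchApp and writes it to MySQL.
from flask import Flask, request, jsonify
import pymysql

app = Flask(__name__)

@app.route("/sheet-update", methods=["POST"])
def sheet_update():
    payload = request.get_json()  # e.g. {"key": "...", "value": "..."}
    conn = pymysql.connect(host="localhost", user="user",
                           password="secret", database="mydb")
    try:
        with conn.cursor() as cur:
            cur.execute(
                "UPDATE records SET value = %s WHERE record_key = %s",
                (payload["value"], payload["key"]),
            )
        conn.commit()
    finally:
        conn.close()
    return jsonify({"status": "ok"})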
Connect directly to MySQL. You likely missed this part of the docs: https://developers.google.com/apps-script/guides/jdbc
Using JDBC within Apps Script will work if you have the time to build this yourself.
If you don't want to roll your own solution, check out SeekWell. It allows you to connect to databases and write SQL queries directly in Sheets. You can create a "Run Sheet" that will run multiple queries at once, and schedule those queries to run without you even opening the Sheet.
Disclaimer: I made this.
I am trying to load a huge file, roughly 5,900 lines of SQL CREATE, INSERT, and ALTER TABLE statements, into a MySQL database with Flask-SQLAlchemy.
I am parsing the file and separating the commands by splitting on ;.
This works as expected.
Here is what I have so far.
For the SQL query execution I am using the Engine API of SQLAlchemy.
When I execute the queries, the database seems to quit its job after about 5,400 lines of the file, but the application logs the full execution through line 5,900 without any error.
When I run the CREATEs and INSERTs separately it also works. So is there a way to split the batch execution, or to use pooling or something similar, so that the database does not get stuck?
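For reference, a sketch of the chunked-transaction approach I have in mind (the file name, chunk size, and connection URL are placeholders):
# load_sql.py -- sketch of chunked execution of a large SQL script
# with SQLAlchemy's Engine API, committing each chunk before the next starts.
from sqlalchemy import create_engine, text

engine = create_engine("mysql+pymysql://user:secret@localhost/mydb")  # placeholder URL

def load_sql_file(path="dump.sql", chunk_size=500):
    with open(path) as f:
        # naive split on ';' as described above; breaks if a ';' occurs inside a string literal
        statements = [s.strip() for s in f.read().split(";") if s.strip()]
    for i in range(0, len(statements), chunk_size):
        # engine.begin() opens a transaction and commits it when the block exits,
        # so each chunk is durable before the next one starts
        with engine.begin() as conn:
            for stmt in statements[i:i + chunk_size]:
                conn.execute(text(stmt))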
Thank you!