Writing to a stream on a PostgreSQL trigger using Python - python

I would like to write a trigger for a PostgreSQL database which, on insertions, would notify a node.js server which would send some data to connected clients.
My current thought is to write a Python row-insert trigger for the database that writes data to a file, which the node.js server would then read.
However, this would be slow, as disk access would be involved. What would be a better way to connect these two applications?

Have you looked at the LISTEN/NOTIFY functionality?
http://www.postgresql.org/docs/9.0/interactive/sql-notify.html
Also, you will want to test the different options against your needs instead of assuming one is too slow. Maybe your Python approach will work just fine.
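As a rough illustration of the LISTEN/NOTIFY route, here is a minimal Python listener sketch using psycopg2. The channel name "new_row" and the connection string are assumptions, and it assumes a row-insert trigger on the PostgreSQL side calls pg_notify('new_row', <payload>) so there is something to receive.

    import select
    import psycopg2
    import psycopg2.extensions

    # Hypothetical connection string; NOTIFY delivery needs autocommit mode.
    conn = psycopg2.connect("dbname=mydb user=me")
    conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)

    cur = conn.cursor()
    cur.execute("LISTEN new_row;")  # channel name is an assumption

    while True:
        # Wait until the connection's socket becomes readable (5 s timeout).
        if select.select([conn], [], [], 5) == ([], [], []):
            continue  # timed out; loop and wait again
        conn.poll()
        while conn.notifies:
            notify = conn.notifies.pop(0)
            # Forward notify.payload to the node.js server from here.
            print("NOTIFY on", notify.channel, "payload:", notify.payload)

A node.js process could also LISTEN on the same channel directly (e.g. with the pg module), skipping the intermediate file entirely.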

Related

Is there a way to use a PHP server as a database?

I am attempting to make a small mafia-style game, and I am using Replit. Would there be a way to use a PHP server (or an HTML server) as a database that I can connect to from a Python project?
Neither HTML nor PHP is a database. The "LAMP" stack uses PHP with MySQL / MariaDB for its database, which might be what you're referring to... However, the "P" there could also be Python ¯\_(ツ)_/¯
What you need in Python is a data-persistence layer, which could be as simple as a CSV / JSON file; the pickle module, however, is easier to work with for native Python types.
The sqlite3 module can be used if you want the data to be more portable to other frameworks/languages (a short sketch follows this answer).
And the final option is to actually run your own database server externally and expose it over a remote TCP / HTTP API connection (I don't think Repl.it supports running Docker containers).
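As a rough sketch of the sqlite3 option, the snippet below persists and reads back a bit of game state; the file name, table, and columns are made up for illustration:

    import sqlite3

    # The db file is created on first use; name and schema are hypothetical.
    conn = sqlite3.connect("game.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS players (name TEXT PRIMARY KEY, role TEXT)"
    )
    conn.execute(
        "INSERT OR REPLACE INTO players (name, role) VALUES (?, ?)",
        ("alice", "detective"),
    )
    conn.commit()

    for name, role in conn.execute("SELECT name, role FROM players"):
        print(name, role)
    conn.close()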
If you have access to the actual machine, you can run something like SQLite on it. You are running dangerously close to needing real security and so on, though.
Security is important, but if you just want to play around, something like SQLite should cover a first pass.

Sending Data using Python

I have a Python application which analyzes data from multiple sources in real time. Once the data is analyzed, the result of the analysis is stored in a database along with a timestamp of when it was analyzed.
I would like to access the most recent result of this program remotely from another computer.
I was thinking about using python sockets and having a server script running on the main computer which runs the application and then that way I can access the data using a client script on another computer.
Is there a better way of doing this? Or are there any other solutions out there that can address this need?
Your question is very broad.
Most DB servers will provide a method/API to access the data remotely. You can use Python as a client if there is a DB-API module for your DB that supports access over the network. For example, if you are using Postgres you could use the psycopg2 module (a short connection sketch follows this answer).
If you are using a simple DB such as SQLite then you might be able to use an ODBC driver. Some alternatives are here.
Edit
MongoDB provides a Python API, pymongo.
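As a minimal psycopg2 sketch of reading the latest result from another machine; the host, credentials, and "results" table are assumptions:

    import psycopg2

    # Hypothetical address of the main computer and hypothetical schema.
    conn = psycopg2.connect(
        host="192.0.2.10", dbname="analysis", user="reader", password="secret"
    )
    cur = conn.cursor()
    cur.execute(
        "SELECT value, analyzed_at FROM results ORDER BY analyzed_at DESC LIMIT 1"
    )
    print(cur.fetchone())  # the most recent analysis result
    conn.close()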
In the end, Redis was the best solution. Considering the original question, the goal was to be able to send data in real time from one computer to another, and solutions such as Redis or RabbitMQ accomplish this.
With Redis, a server can be set up that publishes messages to the network; clients can then subscribe to data channels and receive the messages in a queue.
This Python library was used as the Redis client:
https://pypi.python.org/pypi/redis
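As a rough sketch of the pub/sub pattern with that client; the channel name "results" is an assumption, and the two halves would normally run in separate processes on different machines:

    import redis

    # Publisher side (on the analysis machine): push the latest result.
    pub = redis.Redis(host="localhost", port=6379)
    pub.publish("results", "latest analysis result goes here")

    # Subscriber side (on the remote machine): block and consume messages.
    sub = redis.Redis(host="localhost", port=6379).pubsub()
    sub.subscribe("results")
    for message in sub.listen():
        if message["type"] == "message":  # skip subscribe confirmations
            print("Received:", message["data"].decode())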

How to sync data between a Google Sheet and a Mysql DB?

So I have a Google Sheet that maintains a lot of data. I also have a MySQL DB with a huge chunk of data. There is a vital piece of information in the Sheet that is also present in the DB, and both need to be in sync. The information always enters the Sheet first. I had a Python script with MySQL queries to update my database separately.
Now the workflow has changed. Data will enter the Sheet, and whenever that happens the database has to be updated automatically.
After some research, I found that using the onEdit function of Google Apps Script (I learned from here), I could pick up when the sheet has changed.
The next step is to fetch the data from the relevant cell, which I can do using this.
Now I need to connect to the DB and send some queries. This is where I am stuck.
Approach 1:
Have a Python web app running live and send the data to it via UrlFetchApp. This I have yet to try; a minimal sketch of such a web app is below.
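A sketch of what that web app could look like, assuming Flask and mysql-connector-python; the route, payload fields, credentials, and "records" table are all hypothetical:

    from flask import Flask, request
    import mysql.connector

    app = Flask(__name__)

    @app.route("/update", methods=["POST"])
    def update():
        data = request.get_json()  # UrlFetchApp would POST JSON here
        conn = mysql.connector.connect(
            host="localhost", user="app", password="secret", database="mydb"
        )
        cur = conn.cursor()
        cur.execute(
            "UPDATE records SET value = %s WHERE id = %s",
            (data["value"], data["id"]),
        )
        conn.commit()
        conn.close()
        return "ok"

    if __name__ == "__main__":
        app.run(port=8000)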
Approach 2:
Connect to MySQL remotely through Apps Script. But after 2-3 hours of reading the docs, I am not sure this is possible.
So this is my scenario. Any viable solution you can think of or a better approach?
Connect directly to MySQL. You likely missed this part of the docs: https://developers.google.com/apps-script/guides/jdbc
Using JDBC within Apps Script will work if you have the time to build this yourself.
If you don't want to roll your own solution, check out SeekWell. It allows you to connect to databases and write SQL queries directly in Sheets. You can create a "Run Sheet" that runs multiple queries at once, and schedule those queries to run without you even opening the Sheet.
Disclaimer: I made this.

SQLite over a network

I am creating a Python application that uses embedded SQLite databases. The programme creates the db files and they are on a shared network drive. At this point there will be no more than 5 computers on the network running the programme.
My initial thought was to ask the user on startup whether they are the server or a client. If they are the server, they create the database; if they are a client, they must find a server instance on the network. One way, I suppose, is to send all db commands from the client to the server and have the server apply them to the database. Will that solve the shared-db issue?
Alternatively, is there some way to create a SQLite "server"? I presume this would be the quicker option, if available.
Note: I can't use a server engine such as MySQL or PostgreSQL at this point but I am implementing using ORM and so when this becomes viable, it should be easy to change over.
Here's a "SQLite Server", http://sqliteserver.xhost.ro/, but it looks like not in maintain for years.
SQLite supports concurrency itself, multiple processes can read data at one time and only one can write data into it. Also, When some process is writing, it'll lock the whole database file for a few seconds and others have to wait in the mean time according official document.
I guess this is sufficient for 5 processes as yor scenario. Just you need to write codes to handle the waiting.
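A minimal sketch of that waiting logic, assuming a hypothetical UNC path on the shared drive; the timeout makes sqlite3 retry internally instead of failing as soon as another process holds the write lock:

    import sqlite3

    # Hypothetical path to the shared db file; wait up to 30 s for locks.
    conn = sqlite3.connect(r"\\server\share\app.db", timeout=30.0)
    try:
        conn.execute("INSERT INTO items (name) VALUES (?)", ("widget",))
        conn.commit()
    except sqlite3.OperationalError as exc:
        # Raised if the database stayed locked past the timeout.
        print("write failed, database busy:", exc)
    finally:
        conn.close()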

Common File on Server

I created an app. A copy of it will run on two different computers, but a single SQLite database file needs to be shared by the two copies; that is, both computers must be able to read and write this database file. For this purpose, I will put the file in a folder on our server, which both computers are connected to. How can I get the full path to this file in Python? Or can you suggest any other way, as easy as possible, to do this?
Sqlite over a network share [stackoverflow.com]
I'd recommend against database files on a network drive. A network filesystem usually isn't robust enough to handle the kind of random updates a DB makes.
As a previous answer suggested, you'd be better off creating a simple client/server model: a server process has sole access to the sqlite db, and clients send requests to it. Don't pass the sqlite db file back and forth.
You might want to use a full network DB such as MySQL or PostgreSQL.
I would have a Python server program running on the machine that holds the database file (using the socket library), then have the two clients connect to it (again, using the socket library) and receive the database file. You can find some examples for the socket library at http://www.prasannatech.net/2008/07/socket-programming-tutorial.html
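To make the client/server idea concrete, here is a minimal sketch in which one process has sole access to the SQLite file and answers lookups over a socket; the port, wire format, and "items" table are assumptions, and a real service would need framing, authentication, and error handling:

    import socket
    import sqlite3

    conn = sqlite3.connect("shared.db")  # only this process opens the file

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("0.0.0.0", 5000))  # hypothetical port
    server.listen()

    while True:
        client, addr = server.accept()
        with client:
            name = client.recv(1024).decode()  # client sends a key to look up
            row = conn.execute(
                "SELECT value FROM items WHERE name = ?", (name,)
            ).fetchone()
            client.sendall((row[0] if row else "not found").encode())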
