I am creating a Python application that uses embedded SQLite databases. The programme creates the db files and they are on a shared network drive. At this point there will be no more than 5 computers on the network running the programme.
My initial thought was to ask the user on startup whether they are the server or a client. If they are the server, they create the database. If they are a client, they must find a server instance on the network. One way, I suppose, is to send all DB commands from the clients to the server and have the server apply them to the database. Would that solve the shared-DB issue?
Alternatively, is there some way to create a SQLite "server". I presume this would be the quicker option if available?
Note: I can't use a server engine such as MySQL or PostgreSQL at this point but I am implementing using ORM and so when this becomes viable, it should be easy to change over.
Here's a "SQLite Server", http://sqliteserver.xhost.ro/, but it looks like not in maintain for years.
SQLite supports concurrency itself: multiple processes can read data at the same time, but only one can write to the database. Also, while a process is writing, it locks the whole database file for a few seconds and the others have to wait in the meantime, according to the official documentation.
I guess this is sufficient for the 5 processes in your scenario; you just need to write code to handle the waiting.
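For example, a minimal sketch of that waiting logic in Python; the network path, timeout, and retry count are placeholder choices, not requirements:

import time
import sqlite3

# timeout=30 makes sqlite3 itself wait up to 30 seconds for a lock
# to clear before raising "database is locked".
conn = sqlite3.connect(r'\\server\share\app.db', timeout=30)

def execute_with_retry(sql, params=(), retries=5):
    for attempt in range(retries):
        try:
            with conn:                       # commits, or rolls back on error
                return conn.execute(sql, params)
        except sqlite3.OperationalError:     # e.g. "database is locked"
            time.sleep(1)                    # back off, then try again
    raise RuntimeError('database still locked after %d attempts' % retries)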
I'm working on code written in Python 2.7 that connects to a MariaDB database to read data.
This database receives data from different external resources; my code only reads it.
My service reads the data once at startup and keeps everything in memory to avoid I/O.
I would like to know if there is some way to register a 'callback function' in my code to receive some kind of alert on each new update/insert, so I can reload my in-memory data from the database every time an external resource changes or saves new data.
I have thought of creating a SQL trigger on a new table that inserts a "flag" row there, and having my service check that table periodically for the flag (a rough sketch follows after the version list below).
If the flag is present, reload the data and delete the flag.
But it sounds like a wrong workaround...
I'm using:
Python 2.7
MariaDB Ver 15.1 Distrib 10.3.24-MariaDB
lib mysql-connector 2.1.6
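For reference, a rough sketch of that polling workaround; the change_flags table and the reload_data() helper are hypothetical names, not part of any real schema:

import time
import mysql.connector

conn = mysql.connector.connect(host='localhost', user='reader',
                               password='secret', database='mydb')

def poll_for_changes(interval=5):
    while True:
        cur = conn.cursor()
        cur.execute("SELECT id FROM change_flags LIMIT 1")
        if cur.fetchone() is not None:
            reload_data()                    # refresh the in-memory copy
            cur.execute("DELETE FROM change_flags")
            conn.commit()
        cur.close()
        time.sleep(interval)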
A better solution for MariaDB is streaming changes with the CDC API: https://mariadb.com/resources/blog/how-to-stream-change-data-through-mariadb-maxscale-using-cdc-api/
The plan you have now, using a flag table, means your client has to poll that table for the presence of the flag. You have to run a query against it at intervals, and keep doing so 24/7. Depending on how quickly your client needs to be notified of a change, you might need to run this polling query very frequently, which puts a burden on the MariaDB server just to answer the polls, even when there is no change to report.
The CDC solution is better because the client can just request to be notified the next time a change occurs, then the client waits. It does not put extra load on the MariaDB server, any more than if you had simply added a replica server.
I am trying to integrate alongside an existing application that uses Advantage Database Server (ADS) as its database.
When I connect my integration app using the code below, it connects fine until I try to run the original application at the same time. It seems to allow only one connection: my application holds the connection and blocks all others. Yet I can have multiple instances of the original application running concurrently with no issue, which leads me to believe it's the way in which I am trying to connect from my C# app. The error I'm getting when the original app is open and I then try to connect with my integration app is:
Error 7077: The Advantage Data Dictionary cannot be opened. axServerConnect
Does anyone have any suggestions? How can I allow multiple connections at the same time?
Python code:
import adsdb

# dbpath and config come from the surrounding application
conn = adsdb.connect(DataSource=str(dbpath[0]), ServerType='local',
                     UserID=config.ADS_USERNAME, password=config.ADS_PASS)
According to this page in the ADS documentation, you can use connection pooling by adding pooling=True to your client connection arguments.
I think using this approach, you will be able to open multiple connections at the same time.
Edit
After checking the adsdb Python script, I think it does not support connection pooling. You may be able to set up connection pooling in your C# application instead.
I have a Python application which analyses data from multiple sources in real time. Once the data is analysed, the result of the analysis is stored in a database along with a timestamp of when it was analysed.
I would like to access the most recent result of this program remotely from another computer.
I was thinking of using Python sockets: a server script would run on the main computer alongside the application, and I could then access the data with a client script on another computer.
Is there a better way of doing this? Or are there any other solutions out there that can address this need?
Your question is very broad.
Most DB servers will provide a method/API to access the data remotely. You can use Python as a client if there is a DBAPI module for your DB that supports remote access over the network. For example if you are using Postgres you could use the psycopg2 module.
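For example, a minimal psycopg2 sketch; the hostname, credentials, and the results table with its columns are placeholders:

import psycopg2

conn = psycopg2.connect(host='db.example.com', dbname='analysis',
                        user='reader', password='secret')
cur = conn.cursor()
# Fetch the most recent result over the network.
cur.execute("SELECT result, analyzed_at FROM results "
            "ORDER BY analyzed_at DESC LIMIT 1")
print(cur.fetchone())
conn.close()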
If you are using a simple DB such as SQLite then you might be able to use an ODBC driver. Some alternatives are here.
Edit
MongoDB provides a Python API, pymongo.
In the end, Redis was the best solution. Considering the original question, the goal was to be able to send data in real time from one computer to another, and solutions such as Redis or RabbitMQ accomplish this.
With Redis, a server can be set up to publish messages to the network; clients can then subscribe to data channels and receive the messages from a queue.
This Python library was used as the Redis client:
https://pypi.python.org/pypi/redis
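For example, a minimal pub/sub sketch with that library; the host address and the 'results' channel name are placeholders:

import redis

r = redis.Redis(host='192.168.1.10', port=6379)

# On the machine running the analysis: publish each new result.
r.publish('results', 'latest analysis result')

# On the remote machine: subscribe and consume messages as they arrive.
p = r.pubsub()
p.subscribe('results')
for message in p.listen():
    if message['type'] == 'message':
        print(message['data'])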
I created an app. A copy of it will be on two different computers, but a single SQLite database file needs to be shared by both copies. I mean, both computers will be able to read and write this database file. For this purpose, I will put the file in a folder on our server, which both computers are connected to. How can I get the full path to this file in Python? Or can you suggest any other way that is as easy as possible for doing this task?
Sqlite over a network share [stackoverflow.com]
I'd recommend against database files on a network drive. The network filesystem usually isn't robust enough to handle random updates like a DB.
As a previous answer suggested, you'd be better off creating a simple client/server model: a server process has sole access to the sqlite db, and clients send requests to that server. Don't pass the sqlite db file back and forth.
You might want to use a full network DB such as MySQL or PostgreSQL.
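A minimal sketch of that client/server model, where one process owns the SQLite file and runs queries on behalf of clients; the filename, port, and pickle wire format are arbitrary illustration choices (a real version needs error handling and must not unpickle untrusted input):

import socket
import sqlite3
import pickle

conn = sqlite3.connect('shared.db')          # only this process touches the file

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(('0.0.0.0', 5000))
server.listen(5)

while True:
    client, addr = server.accept()
    sql, params = pickle.loads(client.recv(65536))   # a (sql, params) tuple
    with conn:                               # writes are serialised here
        rows = conn.execute(sql, params).fetchall()
    client.sendall(pickle.dumps(rows))
    client.close()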
I would have a Python server program running on the server with the database file (using the socket module), have the two clients connect to it (again, using the socket module), and then have them receive the database file. You can find some examples for socket programming at http://www.prasannatech.net/2008/07/socket-programming-tutorial.html
I would like to write a trigger for a PostgreSQL database which, on insertions, would notify a node.js server which would send some data to connected clients.
Currently, my thought is to write a Python row insert trigger for the database which would write data to some file which would then be read by the node.js server.
However, this would be slow, as disk access would be involved. What would be a better way to connect these two applications?
Have you looked at the LISTEN/NOTIFY functionality?
http://www.postgresql.org/docs/9.0/interactive/sql-notify.html
Also, you will want to test the different options against your needs, instead of assuming one is not fast enough for what you need. Maybe your Python approach will work just fine.
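For illustration, a minimal psycopg2 listener following the pattern in the psycopg2 documentation; the channel name 'inserts' and the connection details are assumptions, and the trigger would have to call pg_notify('inserts', ...) on each insert:

import select
import psycopg2
import psycopg2.extensions

conn = psycopg2.connect(host='localhost', dbname='mydb',
                        user='app', password='secret')
conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)

cur = conn.cursor()
cur.execute("LISTEN inserts;")

while True:
    # Block until the connection's socket is readable, then drain any
    # pending notifications; no polling queries, no disk I/O.
    if select.select([conn], [], [], 60) != ([], [], []):
        conn.poll()
        while conn.notifies:
            notify = conn.notifies.pop(0)
            print(notify.pid, notify.channel, notify.payload)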