I have a Python script (script 1) that inserts rows into a table in a MySQL database using the mysql.connector module, and another Python script (script 2) that reads data from that table. I want to put a trigger on that table so that each time a new row is added, script 2 notices the new row and prints a message to the console.
I'm looking for something like the Node.js npm package "mysql-events", which watches a MySQL database and runs callbacks on matched events.
I know this is an old question, but I was searching for the same thing today for my work and came across this link: mysql - Run python script on Database event. Look at the answer submitted by the person who asked the question. It might help you or someone else.
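For a closer Python analogue of the mysql-events package, the python-mysql-replication library can stream the server's binary log and run code on each INSERT. A minimal sketch for script 2, assuming the server has binlog_format=ROW enabled and the watched table is named mytable (connection settings are placeholders):

from pymysqlreplication import BinLogStreamReader
from pymysqlreplication.row_event import WriteRowsEvent

stream = BinLogStreamReader(
    connection_settings={"host": "localhost", "port": 3306,
                         "user": "user", "passwd": "secret"},
    server_id=100,                 # any ID unique among replication clients
    only_events=[WriteRowsEvent],  # react to INSERTs only
    only_tables=["mytable"],       # assumption: the table script 1 writes to
    blocking=True,                 # block and wait for new events
)

for event in stream:
    for row in event.rows:
        print("New element added:", row["values"])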
I have a small Python script with two functions: the first sends data into a table, the second reads from the table.
When I call the function that triggers my insert query, the data is not saved in the database.
When I run an identical query directly in SQL Server, it works fine.
So my script is good and my query is good too. Firewall systems are properly configured.
So why is the data not saved?
The primary key of my table is an IDENTITY column. When I activate my insert function, the IDENTITY column still auto-increments even though no data is saved.
Here is my script:
Here is my SQL Server:
The SQL query for creating my table:
I have tried my best to find a solution; I need your help to understand my problem.
As a quick fix, add commit after execute:
...
cursor.execute(sql)
cursor.connection.commit()
But I would also advise you to keep as few connections as possible. In your current code you create a new connection for each operation.
After inserting with your cursor, you need to commit your inserts with connection.commit().
As qaziqarta pointed out, it is best practice not to open a new connection every time you are trying to insert or read something from the database.
You should initialize your connection once at the beginning and close it after you are done reading/writing.
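A minimal sketch of that pattern, assuming pyodbc (the driver is a guess, since the original script is not shown) and a hypothetical people table:

import pyodbc

# One connection for the lifetime of the script, not one per operation.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;UID=user;PWD=secret"
)

def insert_person(name):
    cursor = conn.cursor()
    cursor.execute("INSERT INTO people (name) VALUES (?)", name)
    conn.commit()  # without this the row is rolled back, though IDENTITY still advances

def read_people():
    cursor = conn.cursor()
    cursor.execute("SELECT id, name FROM people")
    return cursor.fetchall()

insert_person("Alice")
print(read_people())
conn.close()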
I'm currently trying to make a Twitter bot that will print out new entries from my database. I'm using Python to do this and can successfully post messages to Twitter. However, whenever a new entry comes in, the bot doesn't pick it up.
How would I go about implementing something like this, and what would I use? I'm not too experienced with this topic area. Any help or guidance would be appreciated.
Use a trigger to propagate the newly inserted rows from the original table to a record table that is under the surveillance of Python, and have Python post the new records (possibly removing the already-posted ones from the record table):
DELIMITER //
drop trigger if exists record_after_insert //
create trigger record_after_insert after insert on original_table for each row
begin
    insert into record_table (new_record) values (new.new_message);
end //
DELIMITER ;
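On the Python side, a minimal polling consumer for that record table, assuming MySQLdb, an auto-increment id column on record_table, and a hypothetical post_to_twitter helper standing in for your existing tweet code:

import time
import MySQLdb

def post_to_twitter(message):
    print(message)  # replace with your existing Twitter-posting code

conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="mydb")

while True:
    cursor = conn.cursor()
    cursor.execute("SELECT id, new_record FROM record_table ORDER BY id")
    for row_id, message in cursor.fetchall():
        post_to_twitter(message)
        # remove the already-posted record, as suggested above
        cursor.execute("DELETE FROM record_table WHERE id = %s", (row_id,))
    conn.commit()  # also ends the snapshot so the next poll sees new rows
    time.sleep(10)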
Good day to all,
I need to communicate with databases; here is a sample using Python & peewee.
First I need to put an entry in the types table of MainDb. I only have the Name & Type, and the ID is auto-incremented when the entry is made in the table. Then, using the auto-incremented ID (MainID), I need to put a data entry in the respective type table of the respective type DB.
Then I need to do some operations using SELECT queries with MainID in the Python code, and finally I need to delete the entries for that MainID in MainDb as well as in the respective type DB, whether the operations in the Python code execute successfully or not (i.e. even if they throw an error or exception).
Currently I am using a simple DB connection (MySQLdb) that I open and close around each cursor to execute the queries, and to get the last auto-incremented ID I am using the cursor's lastrowid (cursor.lastrowid).
I also referred to a related question on Stack Overflow; as I understand it, that approach is fine if I have a single DB (like MainDb alone), but what do I need to do in my situation?
OS: Windows 7 64-bit
DB: MySQL v5.7
Python: v2.7
peewee: v2.10
Thanks in advance
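A minimal sketch of that flow with plain MySQLdb (all table and column names below are made up for illustration; the try/finally ensures the MainID entries are deleted whether the operations succeed or raise):

import MySQLdb

main_db = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="MainDb")
type_db = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="TypeDb")
main_cur = main_db.cursor()
type_cur = type_db.cursor()

main_cur.execute("INSERT INTO types (name, type) VALUES (%s, %s)", ("foo", "bar"))
main_id = main_cur.lastrowid  # the auto-incremented ID (MainID)
main_db.commit()

try:
    type_cur.execute("INSERT INTO type_table (main_id, data) VALUES (%s, %s)",
                     (main_id, "payload"))
    type_db.commit()
    # ... SELECT queries and other operations using main_id ...
finally:
    # runs whether the block above succeeded or threw an exception
    type_cur.execute("DELETE FROM type_table WHERE main_id = %s", (main_id,))
    type_db.commit()
    main_cur.execute("DELETE FROM types WHERE id = %s", (main_id,))
    main_db.commit()
    main_db.close()
    type_db.close()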
Some devices are asynchronously storing values on a common remote MySQL database server.
I would like to write a supervisor app in Python (and possibly SQLAlchemy) to recognize the external INSERT events on the database and act upon the last rows' data. This is to avoid a long manual test to see if every table is being updated regularly or a logger crashed.
Can somebody tell me where to search online for this kind of information and, even better, point me to an example?
EDIT
I already read all tables periodically using a datetime primary key ({date_time}), loading the last row of each table, and comparing to the previous values:
SELECT * FROM table ORDER BY date_time DESC LIMIT 1
but it looks very cumbersome and doesn't guarantee that I don't lose some rows between successive database checks.
The engine is an old version of InnoDB that I cannot upgrade: I cannot use the UPDATE field in the schema because it simply doesn't work.
To reword my question:
How can I listen for any database event with a daemon-like Python application (a sleeping thread) and wake up only when something happens?
I also want to avoid SQL triggers because they would be too heavy to manage: there are hundreds of tables, and they are added/removed very often according to the active loggers.
I had a look at SQLAlchemy, but all the references I could find, if I haven't misunderstood them, are decorators that act on INSERTs made by SQLAlchemy itself. I didn't find anything about external changes to the database.
About the example request: I am not interested in a copy-and-paste, because first I want to understand how things work. I prefer (even incomplete) examples because the SQLAlchemy documentation is far too deep for my knowledge and I simply cannot put the pieces together.
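For what it's worth, here is a sketch of a sturdier variant of the polling above that cannot skip rows between checks, by selecting everything newer than the last date_time seen per table (table names and the handle_new_row callback are placeholders):

import time
import MySQLdb

def handle_new_row(table, row):
    print(table, row)  # act on the new data here

conn = MySQLdb.connect(host="dbhost", user="user", passwd="secret", db="loggers")
last_seen = {}  # table name -> last date_time processed

def poll(tables):
    cursor = conn.cursor()
    for table in tables:
        if table in last_seen:
            # every row since the last check, none skipped
            cursor.execute("SELECT * FROM %s WHERE date_time > %%s ORDER BY date_time"
                           % table, (last_seen[table],))
        else:
            cursor.execute("SELECT * FROM %s ORDER BY date_time DESC LIMIT 1" % table)
        for row in cursor.fetchall():
            handle_new_row(table, row)
            last_seen[table] = row[0]  # assumption: date_time is the first column
    conn.commit()  # end the snapshot so the next poll sees fresh rows

while True:
    poll(["logger_a", "logger_b"])
    time.sleep(5)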
I would like to copy the contents of a MySQL database from one server to another using a third server. This could be done from the shell prompt using this:
mysqldump --host=hostname1 --user=username --password="mypwd" acme | mysql --host=hostname2 --user=username --password="mypwd" acme
However, how do I do this from within a Python script without using os.system or any of the other subprocess methods? I've read through the MySQLdb docs, but don't see a way to do a bulk export/import. Thank you!
If you don't want to use mysqldump from the command line (via the os.system methods), you are pretty much tied to getting the data straight out of MySQL and then putting it into the other server. In that respect your question looks very similar to Get Insert Statement for existing row in MySQL.
You can use a query to get the schema creation SQL:
SHOW CREATE TABLE MyTable;
Then you need to implement a script that queries the data and inserts it into the other server, as sketched below.
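A rough sketch with MySQLdb, reusing the hosts from the question and the MyTable example above (it recreates one table on the destination and bulk-copies its rows):

import MySQLdb

src = MySQLdb.connect(host="hostname1", user="username", passwd="mypwd", db="acme")
dst = MySQLdb.connect(host="hostname2", user="username", passwd="mypwd", db="acme")
src_cur = src.cursor()
dst_cur = dst.cursor()

# Recreate the table on the destination from the source schema.
src_cur.execute("SHOW CREATE TABLE MyTable")
create_sql = src_cur.fetchone()[1]  # row is (table name, CREATE statement)
dst_cur.execute("DROP TABLE IF EXISTS MyTable")
dst_cur.execute(create_sql)

# Copy the rows in bulk.
src_cur.execute("SELECT * FROM MyTable")
rows = src_cur.fetchall()
if rows:
    placeholders = ", ".join(["%s"] * len(rows[0]))
    dst_cur.executemany("INSERT INTO MyTable VALUES (%s)" % placeholders, rows)

dst.commit()
src.close()
dst.close()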
You could also look into third party applications that allows you to copy data from one database to another.