I'm using python-mysql; attached below is the code snippet I'm using to insert into a database table. For some reason, the code is not populating any rows in the database. No exceptions are raised, and SELECT queries work fine. When I copy the statement inside execute and run it in phpMyAdmin, the table is populated fine.
import MySQLdb as mdb

try:
    con = mdb.connect(host='localhost', user='', passwd='', db='indoor')
    cur = con.cursor()
    cur.execute("INSERT INTO locationdata VALUES('1','1','1','1','1','1')")
    numrows = cur.execute("SELECT * FROM locationdata")
    print str(numrows) + " : total Rows"
    print cur.fetchone()
    if con.open:
        print "Hello DB"
except mdb.Error, e:
    print "Error " + e.args[0]
Any ideas what I'm missing?
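For reference, here is a minimal sketch of the same insert with an explicit commit, using the same connection parameters as above. MySQLdb follows the Python DB-API, which does not autocommit by default, so an INSERT that is never committed is discarded when the connection closes:

import MySQLdb as mdb

con = mdb.connect(host='localhost', user='', passwd='', db='indoor')
try:
    cur = con.cursor()
    cur.execute("INSERT INTO locationdata VALUES('1','1','1','1','1','1')")
    con.commit()  # persist the INSERT; without this the row never reaches the table
except mdb.Error, e:
    con.rollback()
    print "Error %s" % e
finally:
    con.close()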
I'm writing a Python script that inserts rows into a MySQL table from a dataframe using mysql.connector. I'm trying to log each successful request as well as the exceptions, and I'm running the program in a Jupyter Notebook. However, I don't see any logging output in the notebook; I have to manually go into the MySQL database and check what has gone in, and I have no idea which entries weren't inserted successfully. Here is my code:
import os
import pandas as pd
import mysql.connector
import logging
import math

def upload_to_db(host, database, user, password,
                 tbl_name, col_str, dataframe):
    try:
        conn = mysql.connector.connect(
            host=host, database=database, user=user, password=password)
        cursor = conn.cursor()
        cursor.execute("drop table if exists %s;" % (tbl_name))
        cursor.execute("create table %s (%s);" % (
            tbl_name, col_str))
        data_list = dataframe.to_numpy().tolist()
        for i in range(0, len(data_list)-1):
            row_values = convert_list(data_list[i])
            sql_statement = 'INSERT INTO %s VALUES (%s);' % (
                tbl_name, row_values)
            cursor.execute(sql_statement)
            logging.info("SQL statement [" + sql_statement + "] successful")
        conn.commit()
        cursor.close()
    except mysql.connector.Error as err:
        logging.info("Exception: {}".format(err))
Why doesn't the Python logging module show the exceptions or successes in the notebook?
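One thing worth noting: the root logger's default threshold is WARNING, so logging.info() calls emit nothing unless logging is configured first. A minimal sketch (the format string is just an example):

import logging

# INFO-level messages are dropped by the default WARNING threshold,
# so configure the root logger before calling logging.info().
# In notebooks where a handler is already attached, force=True (Python 3.8+) resets it.
logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

logging.info("this now appears in the notebook output")
logging.error("messages logged in the except branch appear as well")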
I am running PostgreSQL in a Docker container. I am trying to connect to Postgres via Python and display the tables. Below is the code I am using to connect to Postgres:
import psycopg2

conn_string = "host='192.168.99.100:15432' dbname='PREDICTIVE_DS_POSTGRESQL' user='ds_user' password='ds_user'"
print("Connecting to database\n ->%s" % (conn_string))
conn = psycopg2.connect(conn_string)
cursor = conn.cursor()
print("Connected!\n")
Then I use the Python code below to display the existing tables within Postgres:
def table_exists(con, table_str):
    exists = False
    try:
        cur = con.cursor()
        cur.execute("select exists(select relname from pg_class where relname='"
                    + table_str + "')")
        exists = cur.fetchone()[0]
        print("exists")
        cur.close()
    except psycopg2.Error as e:
        print(e)
    return exists

def get_table_col_names(con, table_str):
    col_names = []
    try:
        cur = con.cursor()
        cur.execute("select * from " + table_str + " LIMIT 0")
        for desc in cur.description:
            col_names.append(desc[0])
        cur.close()
    except psycopg2.Error as e:
        print(e)
    return col_names
However, it is not working at all. It fails with: could not translate host name "192.168.99.100:15432" to address: Unknown host. The container is up and running, and that is the host name. Additionally, I don't know whether the rest of the code will work once it connects.
Keep your database credentials in a separate file.
For example, have a file called database.ini and define it like this:
[creds]
host=192.168.99.100
port=15432
database=PREDICTIVE_DS_POSTGRESQL
user=ds_user
password=ds_user
Then have another file that parses this config; call it config.py:
#!/usr/bin/python
try:
    import configparser
except ImportError:
    from six.moves import configparser

def config(section, filename='database.ini'):
    parser = configparser.ConfigParser()
    parser.read(filename)
    db = {}
    if parser.has_section(section):
        params = parser.items(section)
        for param in params:
            db[param[0]] = param[1]
    else:
        raise Exception('Section {0} not found in the {1} file'.format(section, filename))
    return db
Now, in your main file, import your config function like this:
from config import config
and connect like this:
dbParams = config("creds")
con = psycopg2.connect(**dbParams)
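The key point for the original error is that psycopg2 expects the port separately from the host; "host:port" is not parsed as a single value. Equivalently, without the ini file, a minimal sketch using the values from the question:

import psycopg2

# host and port must be passed separately; "192.168.99.100:15432" is not a valid host name
con = psycopg2.connect(host='192.168.99.100',
                       port=15432,
                       dbname='PREDICTIVE_DS_POSTGRESQL',
                       user='ds_user',
                       password='ds_user')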
I'm having problems returning auto-incremented ID columns from a MySQL database using the MySQLdb Python library.
I have something like:
sql = """INSERT INTO %s (%s) VALUES (\"%s\")""" %(tbl, colsf, valsf)
try:
cursor.execute(sql)
id = cursor.lastrowid
db.close()
except:
print "Failed to add to MySQL database: \n%s" %sql
print sys.exc_info()
db.close()
exit()
However, lastrowid seems to return incorrect values. For instance, querying the various id columns from the MySQL command line shows them to be empty, yet the lastrowid value keeps increasing by 1 every time the Python script is run. Any ideas?
It turned out that the values weren't being committed to the MySQL database; adding a db.commit() call solved the problem.
sql = """INSERT INTO %s (%s) VALUES (\"%s\")""" %(tbl, colsf, valsf)
try:
cursor.execute(sql)
id = cursor.lastrowid
cursor.close()
db.commit()
db.close()
except:
print "Failed to add to MySQL database: \n%s" %sql
print sys.exc_info()
db.close()
exit()
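As an aside, here is a sketch of the same insert using DB-API parameter binding for the values, which avoids quoting problems and SQL injection. Table and column names cannot be bound, so they still have to be formatted in; the connection details, tbl, colsf, and the values below are placeholders standing in for the snippet above:

import MySQLdb
import sys

db = MySQLdb.connect(host='localhost', user='user', passwd='pass', db='test')  # placeholder credentials
cursor = db.cursor()

tbl = 'mytable'          # placeholder table name
colsf = 'name, value'    # placeholder column list
vals = ('foo', 'bar')    # placeholder values

# only values can be parameterized; the %%s placeholders are bound by execute()
sql = "INSERT INTO %s (%s) VALUES (%%s, %%s)" % (tbl, colsf)
try:
    cursor.execute(sql, vals)
    row_id = cursor.lastrowid
    db.commit()
except MySQLdb.Error:
    print "Failed to add to MySQL database: \n%s" % sql
    print sys.exc_info()
    db.rollback()
finally:
    db.close()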
The problem: I run a Python script that takes as user input the user associated with a PostgreSQL database. The script opens the database, creates the postgis extension, and then alters some tables. I'm connecting just fine, and no error messages are printed to the console when I run the script, but the postgis extension is never installed and I don't know whether the tables are altered correctly. This Python script is called from a bash script, and because it doesn't work, the bash script fails later on. Any help would be great!
import psycopg2
import sys

con = None
argument = sys.argv[1]
try:
    con = psycopg2.connect(database='gis', user=sys.argv[1])
    cur = con.cursor()
    cur.execute("CREATE EXTENSION postgis")
    cur.execute("ALTER TABLE geometry_columns OWNER TO %s" % argument)
    cur.execute("ALTER TABLE spatial_ref_sys OWNER TO %s" % argument)
    cur.execute('SELECT version()')
    ver = cur.fetchone()
    print ver
except psycopg2.DatabaseError, e:
    print 'Error %s' % e
    sys.exit(1)
finally:
    if con:
        con.close()
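For context, psycopg2 also follows the DB-API convention of opening a transaction implicitly and never autocommitting, so statements such as CREATE EXTENSION and ALTER TABLE are rolled back when the connection is closed without a commit. A minimal sketch of the same flow with an explicit commit, reusing the statements above:

import psycopg2
import sys

con = None
argument = sys.argv[1]
try:
    con = psycopg2.connect(database='gis', user=argument)
    cur = con.cursor()
    cur.execute("CREATE EXTENSION postgis")
    cur.execute("ALTER TABLE geometry_columns OWNER TO %s" % argument)
    cur.execute("ALTER TABLE spatial_ref_sys OWNER TO %s" % argument)
    con.commit()  # persist the CREATE EXTENSION and ALTER TABLE statements
    cur.execute('SELECT version()')
    print cur.fetchone()
except psycopg2.DatabaseError, e:
    print 'Error %s' % e
    if con:
        con.rollback()
    sys.exit(1)
finally:
    if con:
        con.close()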
I am trying to send two SQL statements (a SELECT and an UPDATE) from one Python file, but I am still getting an error.
cursor = cnx.cursor()
query = "select id, mail from `candidats`; UPDATE candidats SET statusmail=1"
results = cursor.execute(query, multi=True)
for cur in results:
    print('cursor:', cur)
    print('result:', cur.fetchall())
cursor.close()
cnx.close()
I'm getting this error:
mysql.connector.errors.InterfaceError: No result set to fetch from
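For what it's worth, the UPDATE statement produces no result set, so calling fetchall() on its result raises exactly this error. With mysql.connector's multi=True iteration you can check with_rows before fetching, and commit the UPDATE afterwards; a sketch assuming the same cnx connection and query as above:

cursor = cnx.cursor()
query = "select id, mail from `candidats`; UPDATE candidats SET statusmail=1"

for result in cursor.execute(query, multi=True):
    if result.with_rows:
        # the SELECT produces rows that can be fetched
        print('rows:', result.fetchall())
    else:
        # the UPDATE produces no result set, only an affected-row count
        print('affected rows:', result.rowcount)

cnx.commit()  # persist the UPDATE
cursor.close()
cnx.close()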