I want to insert data from a CSV file into a PostgreSQL table. The table structure is given below, but I am unable to insert INTEGER type values. It shows an error like:
DataError: invalid input syntax for integer: "vendor_phone"
LINE 1: ...vendor_phone,vendor_address)VALUES ('vendor_name','vendor_ph...
It works fine if I use the VARCHAR type, but I need to use integer values.
CREATE TABLE vendors (
    vendor_id SERIAL PRIMARY KEY,
    vendor_name VARCHAR(100) NOT NULL,
    vendor_phone INTEGER,
    vendor_address VARCHAR(255) NOT NULL
)
import psycopg2
import csv

database = psycopg2.connect(database="supplier", user="postgres", password="1234", host="localhost", port="5432")
cursor = database.cursor()

vendor_data = csv.reader(open('vendors.csv'), delimiter=',')
for row in vendor_data:
    cursor.execute("INSERT INTO vendors (vendor_name,vendor_phone,vendor_address)"
                   "VALUES (%s,%s,%s)",
                   row)
print("CSV data imported")

cursor.close()
database.commit()
database.close()
Instead of inserting row by row with a cursor, you can use the statement below to load the data directly from the CSV file into the table. The HEADER option skips the header row of the CSV file, which is what your loop is currently inserting as data and what produces the "vendor_phone" integer error:
COPY vendors (vendor_name,vendor_phone,vendor_address) FROM 'vendors.csv' CSV HEADER;
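Note that a server-side COPY needs the file to be readable by the PostgreSQL server process. If the file only exists on the client, psycopg2 can stream it through the same COPY machinery with copy_expert. A minimal sketch, reusing the connection and table from the question:

import psycopg2

database = psycopg2.connect(database="supplier", user="postgres", password="1234", host="localhost", port="5432")
cursor = database.cursor()

with open('vendors.csv') as f:
    # FROM STDIN makes the server read the stream we send,
    # so the file can live on the client machine.
    cursor.copy_expert(
        "COPY vendors (vendor_name,vendor_phone,vendor_address) FROM STDIN WITH CSV HEADER",
        f)

database.commit()
database.close()

If you want to keep the original row-by-row loop instead, calling next(vendor_data) once before the for loop would skip the header row as well.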
I'm using pymysql to load a large CSV file into a database. Because of memory limitations I'm using LOAD DATA INFILE rather than INSERT. However, after the code completes, when I query the server for the data in the table it returns an empty set.
import pymysql

conn = pymysql.connect(host='localhost', port=3306, user='root', passwd='', local_infile=True)
cur = conn.cursor()

cur.execute("CREATE SCHEMA IF NOT EXISTS `test` DEFAULT "
            "CHARACTER SET utf8 COLLATE utf8_unicode_ci;")
cur.execute("CREATE TABLE IF NOT EXISTS "
            "`test`.`scores` (`date` DATE NOT NULL, "
            "`name` VARCHAR(15) NOT NULL, "
            "`score` DECIMAL(10,3) NOT NULL);")
conn.commit()

def push(fileName='/home/pi/test.csv', tableName='`test`.`scores`'):
    push = """LOAD DATA LOCAL INFILE "%s" INTO TABLE %s
              FIELDS TERMINATED BY ','
              LINES TERMINATED BY '\r\n'
              IGNORE 1 LINES
              (date, name, score);""" % (fileName, tableName)
    cur.execute(push)
    conn.commit()

push()
I get some truncation warnings but no other errors or warnings to work from. Any ideas on how to fix this?
I did a few things to fix this. First, I changed my MySQL server's configuration to allow LOAD DATA LOCAL INFILE, following MySQL: Enable LOAD DATA LOCAL INFILE. Then the problem was with the line
LINES TERMINATED BY '\r\n'
the fix was to change it to
LINES TERMINATED BY '\n'
After that, the script runs fine and is significantly faster than inserting row by row.
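For reference, LOAD DATA LOCAL INFILE has to be enabled on both sides: the client (pymysql's local_infile=True, as in the question) and the server. Instead of editing the config file, the server-side switch can also be flipped at runtime if your account has the privilege; a minimal sketch:

import pymysql

conn = pymysql.connect(host='localhost', port=3306, user='root', passwd='', local_infile=True)
cur = conn.cursor()

# Runtime equivalent of putting local-infile=1 in the server config;
# requires a privileged account and lasts until the server restarts.
cur.execute("SET GLOBAL local_infile = 1")

The line-terminator fix matters because the file was presumably written with Unix line endings; with '\r\n' the server never finds a line boundary, so no complete rows are loaded.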
I am working on pushing data from DBF files on a UNC path into a SQL Server DB. There are about 50 DBF files, all with different schemas. I know I could write a program that lists all 50 tables and all 50 DBF files, but that would take forever. Is there a way to derive the field names from each DBF to do the insert, rather than going through every DBF and typing out every field name? Here's the code I have right now; it inserts records from two fields in one DBF file.
import pyodbc
from dbfread import DBF

# SQL Server connection test
cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=**********;DATABASE=TEST_DBFIMPORT;UID=test;PWD=test')
cursor = cnxn.cursor()

dir = 'E:\\Backups\\'
table = DBF('E:\\Backups\\test.dbf', lowernames=True)

for record in table.records:
    rec1 = record['field1']
    rec2 = record['field2']
    cursor.execute("insert into tblTest (column1,column2) values (?,?)", rec1, rec2)
    cnxn.commit()
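For what it's worth, the field names don't need to be typed out at all: dbfread itself exposes the schema. A minimal sketch using dbfread's documented field_names attribute, with the same file as above:

from dbfread import DBF

table = DBF('E:\\Backups\\test.dbf', lowernames=True)
print(table.field_names)        # column names parsed from the DBF header

for record in table:            # each record behaves like a dict keyed by field name
    print(list(record.keys()), list(record.values()))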
Some helpful hints using my dbf package:
import dbf
import os

for filename in os.listdir('e:/backups'):
    with dbf.Table('e:/backups/' + filename) as table:
        fields = dbf.field_names(table)
        for record in table:
            values = list(record)
            # insert fields, values using odbc
If you want to transfer all fields, then you'll need to calculate the table name, the field names, and the values; some examples:
sql_table = os.path.splitext(filename)[0]
field_list = ','.join(fields)
place_holders = ','.join(['?'] * len(fields))   # count the fields before joining them into one string
values = tuple(record)
sql = "insert into %s (%s) values (%s)" % (sql_table, field_list, place_holders)
cursor.execute(sql, *values)
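Putting those pieces together, here is a sketch of the whole transfer loop; the connection string and directory come from the question, and the per-file commit is an assumption:

import os
import dbf
import pyodbc

cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=**********;DATABASE=TEST_DBFIMPORT;UID=test;PWD=test')
cursor = cnxn.cursor()

for filename in os.listdir('e:/backups'):
    if not filename.lower().endswith('.dbf'):
        continue                                    # skip non-DBF files in the directory
    with dbf.Table('e:/backups/' + filename) as table:
        fields = dbf.field_names(table)
        sql = "insert into %s (%s) values (%s)" % (
                os.path.splitext(filename)[0],      # table name derived from the file name
                ','.join(fields),
                ','.join(['?'] * len(fields)))
        for record in table:
            cursor.execute(sql, *tuple(record))
    cnxn.commit()                                   # commit once per file

This assumes the target tables already exist with matching column names; creating them from the DBF schemas would need a separate CREATE TABLE step.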
I'm trying to read data from a smart meter that has an FTDI chip. I wrote a simple Python serial program to pass commands to the meter, and the meter replies back. The data from the meter is then converted to float and stored in a dictionary.
Now I want to store the dictionary in the DB; here is the code that puts the data in the table.
import psycopg2

conn = psycopg2.connect(database="smartmeter", user="postgres", password="12345", host="localhost", port="5432")
cur = conn.cursor()

cur.execute('''CREATE TABLE metertable (
    ID SERIAL PRIMARY KEY NOT NULL,
    meter TEXT NOT NULL,
    temperature TEXT NOT NULL,
    freq TEXT NOT NULL,
    penergy TEXT NOT NULL,
    qenergy TEXT NOT NULL,
    senergy TEXT NOT NULL,
    cospi TEXT NOT NULL,
    irms TEXT NOT NULL,
    ppower TEXT NOT NULL,
    qpower TEXT NOT NULL,
    spower TEXT NOT NULL);''')

while 1:
    data = dict()
    data = {
        'time': timestamp,
        'meter': m0_data,
        'temperature': m1_data,
        'freq': m2_data,
        'penergy': m3_data,
        'qenergy': m6_data,
        'senergy': m7_data,
        'cospi': m11_data,
        'irms': m15_data,
        'vrms': m16_data,
        'ppower': realpower,
        'qpower': reactivepower,
        'spower': apparentpower
    }
    cur.executemany("""INSERT INTO metertable(time,meter,temperature,freq,penergy,qenergy,senergy,cospi,irms,vrms,ppower,qpower,spower) VALUES (%(time)s, %(meter)s), %(temperature)s), %(freq)s), %(penergy)s), %(qenergy)s), %(senergy)s), %(cospi)s), %(irms)s), %(vrms)s), %(ppower)s) %(qpower)s) %(spower)s)""", data)
I get an error like this:
Traceback (most recent call last):
File "metercmd.py", line 97, in <module>
cur.executemany("""INSERT INTO metertable(time,meter,temperature,freq,penergy,qenergy,senergy,cospi,irms,vrms,ppower,qpower,spower) VALUES (%(time)s, %(meter)s), %(temperature)s), %(freq)s), %(penergy)s), %(qenergy)s), %(senergy)s), %(cospi)s), %(irms)s), %(vrms)s), %(ppower)s) %(qpower)s) %(spower)s)""", data)
TypeError: string indices must be integers, not str
Am I going in the right direction for entering data into the DB? Please suggest better approaches.
The error says that some of the fields you are trying to push into the database are string fields, but you are pushing integer values.
Make sure that your data values match your database table fields.
You have extra closing parentheses and missing commas:
cur.executemany ("""
INSERT INTO metertable(
time,meter,temperature,freq,penergy,qenergy,senergy,cospi,irms,vrms,ppower,qpower,spower
) VALUES (
%(time)s, %(meter)s, %(temperature)s, %(freq)s, %(penergy)s, %(qenergy)s, %(senergy)s,
%(cospi)s, %(irms)s, %(vrms)s, %(ppower)s, %(qpower)s, %(spower)s
)
""", data)
Thanks for the help. The error was with executemany; I changed that to cur.execute. I also used type(variable) to check the variable types and changed those in the table. But when I run the code, I see no data being saved in my database.
Here is the code:
import serial
import psycopg2

port = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=.1)
conn = psycopg2.connect(database="smartmeternode", user="postgres", password="amma", host="localhost", port="5432")
cur = conn.cursor

data2 = dict()
while 1:
    port.write('m0\r')
    meter = port.read()
    port.write('m2\r')
    temp = port.read()
    ph = meter * temp
    time = timestamp
    data = {'meter': meter, 'temp': temp, 'ph': ph, 'time': time}
    cur.execute("INSERT INTO METERDATATEST(meter,temp,ph,time) VALUES(%(meter)s,%(temp)s,%(ph)s,%(time)s;", data)
    cur.commit
cur.close
conn.close
The program runs for an infinite time and I stop it using Ctrl+Z. When I check the database, it is still empty. What am I missing? I want the data to be stored in the PostgreSQL database as soon as it is read from my smart meter.
psql smartmeternode -U postgres
smartmeternode=# SELECT * FROM meterdatatest;
id | meter | temp | ph | time
----+-------+------+----+------
(0 rows)
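Several things in the snippet above explain the empty table: conn.cursor, cur.commit, cur.close, and conn.close are all referenced but never called (the parentheses are missing), commits belong to the connection rather than the cursor, and the INSERT is missing its closing parenthesis before the semicolon. A corrected sketch of the loop, assuming (as in the earlier post) that the serial replies are converted to float, and using datetime as a stand-in for the undefined timestamp:

import datetime
import serial
import psycopg2

port = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=.1)
conn = psycopg2.connect(database="smartmeternode", user="postgres", password="amma", host="localhost", port="5432")
cur = conn.cursor()                  # note the (): cursor is a method

while 1:
    port.write('m0\r')
    meter = float(port.read())       # assumes the meter replies with a numeric string
    port.write('m2\r')
    temp = float(port.read())
    data = {'meter': meter, 'temp': temp, 'ph': meter * temp,
            'time': datetime.datetime.now()}   # stand-in for the undefined timestamp
    cur.execute("INSERT INTO METERDATATEST (meter, temp, ph, time) "
                "VALUES (%(meter)s, %(temp)s, %(ph)s, %(time)s)", data)
    conn.commit()                    # commit() on the connection, once per row

With the commit inside the loop, each row becomes visible in psql as soon as it is inserted, even while the program keeps running.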
I'm trying to insert some data into a local MySQL database using MySQL Connector/Python, which is apparently the only way to integrate MySQL into Python 3 without breaking out the C compiler.
I tried all the examples that come with the package; those run and insert data just fine. Unfortunately, my own attempts to write anything into my tables fail.
Here is my code:
import mysql.connector

def main(config):
    db = mysql.connector.Connect(**config)
    cursor = db.cursor()
    stmt_drop = "DROP TABLE IF EXISTS urls"
    cursor.execute(stmt_drop)
    stmt_create = """
        CREATE TABLE urls (
            id TINYINT UNSIGNED NOT NULL AUTO_INCREMENT,
            str VARCHAR(50) DEFAULT '' NOT NULL,
            PRIMARY KEY (id)
        ) CHARACTER SET 'utf8'"""
    cursor.execute(stmt_create)
    cursor.execute("""
        INSERT INTO urls (str)
        VALUES
            ('reptile'),
            ('amphibian'),
            ('fish'),
            ('mammal')
        """)
    print("Number of rows inserted: %d" % cursor.rowcount)
    db.close()

if __name__ == '__main__':
    import config
    config = config.Config.dbinfo().copy()
    main(config)
OUTPUT:
Number of rows inserted: 4
I based my code strictly on the examples that came with the package and can't, for the life of me, figure out what the problem is. What am I doing wrong here?
Fetching table data with the script works just fine, so I am not worried about the configuration files. I'm root on the database, so permissions shouldn't be a problem either.
You need to add a db.commit() to commit your changes before you call db.close()!
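For example, the tail end of main() above becomes a two-line change:

    print("Number of rows inserted: %d" % cursor.rowcount)
    db.commit()   # without this, the INSERT is rolled back when the connection closes
    db.close()

The DROP and CREATE statements appeared to work because DDL causes an implicit commit in MySQL; the INSERT, however, sits in an open transaction until it is committed. Alternatively, setting db.autocommit = True right after connecting makes every statement commit implicitly.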