No error, but no data written to SQLite DB - python

I have a python list called result that contains 7 items...
print result
returns
[u'2013:11:29', u'17:01:11', u'Apple', u'iPhone 5', -36.57033055555556, 174.68374722222222, '5095c554fef7d990a2c57e0e12b18854']
I have this code setting up my DB and writing the result into the DB:
conn = sqlite3.connect('geopic.sqlite')
cur = conn.cursor()
cur.execute('''DROP TABLE IF EXISTS Results''')
cur.execute('''CREATE TABLE Results (Date DATE, Time TIME, Make TEXT, Model TEXT, Latitude FLOAT, Longitude FLOAT, Hash VARCHAR(128))''')

for result in results:
    if result == None:
        continue
    else:
        print result  # this prints as expected
        cur.execute('''INSERT INTO Results (Date, Time, Make, Model, Latitude, Longitude, Hash) VALUES (?,?,?,?,?,?,?)''', result)
This runs with no errors, but nothing gets written into the database. I'm stumped as to why!

Append conn.commit() to the end of your code.
Example:
for result in results:
    if result == None:
        continue
    else:
        print result  # this prints as expected
        cur.execute('''INSERT INTO Results (Date, Time, Make, Model, Latitude, Longitude, Hash) VALUES (?,?,?,?,?,?,?)''', result)
conn.commit()
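Alternatively, the connection can be used as a context manager, which commits automatically when the block succeeds and rolls back if it raises. A minimal sketch, assuming the table and the results list from the question:

import sqlite3

conn = sqlite3.connect('geopic.sqlite')
cur = conn.cursor()

with conn:  # commits on success, rolls back on an exception
    for result in results:
        if result is None:
            continue
        cur.execute('''INSERT INTO Results (Date, Time, Make, Model, Latitude, Longitude, Hash) VALUES (?,?,?,?,?,?,?)''', result)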

OK - simple answer - I forgot to use
conn.commit()
to write the changes to the DB.
DOH!

Related

Store big integer values correctly in sqlite db

I'm trying to store very large int values in a sqlite3 db. The values are 100-115 digits long.
I've tried every possible combination - sending the input as a string/integer and storing it as int/text/blob - but the result is always the same: the value 7239589231682139...97853 becomes 7.239589231682139e+113 in the db.
My db schema is:
conn.execute('''CREATE TABLE DATA
    (RESULT TEXT NOT NULL)''')
and the query is:
def insert(result):
    conn.execute(f'INSERT INTO DATA (RESULT) VALUES ({result})')
    conn.commit()
I wrote a simple function to test the above case:
DB_NAME = 'test.db'
conn = sqlite3.connect(DB_NAME)
conn.execute('''CREATE TABLE TEST_TABLE
    (TYPE_INT INT,
     TYPE_REAL REAL,
     TYPE_TEXT TEXT,
     TYPE_BLOB BLOB);
''')
value1 = '123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890'
value2 = 123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890
conn.execute(f'INSERT INTO TEST_TABLE (TYPE_INT, TYPE_REAL, TYPE_TEXT, TYPE_BLOB) VALUES ({value1}, {value1}, {value1}, {value1})')
conn.execute(f'INSERT INTO TEST_TABLE (TYPE_INT, TYPE_REAL, TYPE_TEXT, TYPE_BLOB) VALUES ({value2}, {value2}, {value2}, {value2})')
conn.commit()

cursor = conn.execute('SELECT * from TEST_TABLE')
for col in cursor:
    print(f'{col[0]}, {col[1]}, {col[2]}, {col[3]}')
    print('--------------')
conn.close()
As you can see, I try all the possibilities, and the output is:
1.2345678901234568e+119, 1.2345678901234568e+119, 1.23456789012346e+119, 1.2345678901234568e+119
1.2345678901234568e+119, 1.2345678901234568e+119, 1.23456789012346e+119, 1.2345678901234568e+119
You are passing the value without single quotes, so it is treated as a numeric literal.
Pass it as a string like this:
value1 = "123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
conn.execute("INSERT INTO TEST_TABLE (TYPE_TEXT) VALUES (?)", (value1,))
The ? placeholder will be replaced by:
'123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890'
because the value is a string, and it will be stored properly as TEXT, which should be the data type of the column.
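A minimal sketch of the full round trip, using the DATA table from the question (the value here is a hypothetical stand-in for the asker's 100+ digit integer):

import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE DATA (RESULT TEXT NOT NULL)')

big = 7239589231682139 ** 7  # roughly 111 digits, stands in for the real value
conn.execute('INSERT INTO DATA (RESULT) VALUES (?)', (str(big),))
conn.commit()

stored = conn.execute('SELECT RESULT FROM DATA').fetchone()[0]
print(int(stored) == big)  # True: full precision survives the round trip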

How to generate item IDs automatically?

I'm trying to insert some data into a SQL database, and the problem is that I'm really green at this. The main question is: how can I number all the items in the table? I have 3 main columns: ID, CARNUM, TIME. But with this insertion I have to type the ID manually. How can I make the system create a numeric ID automatically?
Here's the insertion code:
postgres_insert_query = """ INSERT INTO Vartotojai (ID, CARNUM, TIME) VALUES (%s,%s,%s)"""
record_to_insert = (id, car_numb, Reg_Tikslus_Laikas)
cursor.execute(postgres_insert_query, record_to_insert)
connection.commit()
count = cursor.rowcount
print (count, "Record inserted successfully into mobile table")
You could change the datatype of ID to serial, which is an auto-incrementing integer. That means you don't have to enter an ID manually when inserting into the database.
Read more about the serial datatype: source
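A minimal sketch, assuming the table and variable names from the question and an open psycopg2 connection and cursor:

cursor.execute("""
    CREATE TABLE Vartotojai (
        ID     SERIAL PRIMARY KEY,  -- generated automatically by PostgreSQL
        CARNUM TEXT,
        TIME   TIMESTAMP
    )
""")

# Omit ID from the column list; PostgreSQL assigns the next value itself.
postgres_insert_query = "INSERT INTO Vartotojai (CARNUM, TIME) VALUES (%s, %s)"
cursor.execute(postgres_insert_query, (car_numb, Reg_Tikslus_Laikas))
connection.commit()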

Update SQLITE DB with multiple python lists

I'm attempting to update my sqlite db with 2 python lists. I have a sqlite db with three fields: name, number, date. I also have python lists with similar names. I'm trying to figure out a way to update my sqlite db with data from these 2 lists. I can get the db created, and even get a single column filled, but I can't seem to update it correctly or at all. Is there a way to INSERT both lists at once, rather than INSERT a single column and then UPDATE the db with the other?
Here is what I have so far:
name_list = []
number_list = []
date = now.date()
strDate = date.strftime("%B %Y")
tableName = strDate
sqlTable = 'CREATE TABLE IF NOT EXISTS ' + tableName + '(name text, number integer, date text)'
c.execute(sqlTable)
conn.commit()

for i in name_list:
    c.execute('INSERT INTO January2018(names) VALUES (?)', [i])
conn.commit()
I can't seem to get past this point. I still need to add another list of data (number_list) and attach the date to each row.
Here's what I have on that:
for i in number_list:
    c.execute('UPDATE myTable SET number = ? WHERE name', [i])
conn.commit()
Any help would be much appreciated. And if you need more information, please let me know.
You can use executemany with zip:
c.executemany('INSERT INTO January2018 (name, number) VALUES (?, ?)', zip(name_list, number_list))
conn.commit()
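For completeness, a minimal self-contained sketch (the filename and sample values are hypothetical) that also attaches the shared date string to every row:

import sqlite3
from datetime import datetime

conn = sqlite3.connect('example.sqlite')  # hypothetical filename
c = conn.cursor()
c.execute('CREATE TABLE IF NOT EXISTS January2018 (name text, number integer, date text)')

name_list = ['alice', 'bob']
number_list = [101, 102]
strDate = datetime.now().strftime("%B %Y")

# zip pairs the two lists row by row; the same date is repeated for every row
rows = [(name, number, strDate) for name, number in zip(name_list, number_list)]
c.executemany('INSERT INTO January2018 (name, number, date) VALUES (?, ?, ?)', rows)
conn.commit()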

PostGIS inserts become slow after some time

I have a location table with following structure:
CREATE TABLE location
(
    id BIGINT,
    location GEOMETRY,
    CONSTRAINT location_pkey PRIMARY KEY (id, location),
    CONSTRAINT enforce_dims_geom CHECK (st_ndims(location) = 2),
    CONSTRAINT enforce_geotype_geom CHECK (geometrytype(location) = 'POINT'::TEXT OR location IS NULL),
    CONSTRAINT enforce_srid_geom CHECK (st_srid(location) = 4326)
)
WITH (
    OIDS=FALSE
);

CREATE INDEX location_geom_gist ON location
    USING GIST (location);
I run the following query to insert data:
def insert_location_data(msisdn, lat, lon):
    if not (lat and lon):
        return
    query = "INSERT INTO location (id, location) VALUES ('%s', ST_GeomFromText('POINT(%s %s)', 4326))" % (str(id), str(lat), str(lon))
    try:
        cur = get_cursor()
        cur.execute(query)
        conn.commit()
    except:
        tb = traceback.format_exc()
        Logger.get_logger().error("Error while inserting location in sql: %s", str(tb))
        return False
    return True
I run this block of code 10,000,000 times in a loop, but somewhere after 1 million inserts the insert speed drops drastically. The speed returns to normal when I restart the script, but it drops again around a million inserts, and the same trend continues. I cannot figure out why.
Any help is appreciated.
Here are a few tips.
Watch out for str(id), which always returns the string '<built-in function id>': id is not defined as a variable in the question, so it refers to the built-in id() function.
The correct axis order for PostGIS is (X Y), i.e. (lon lat); the question passes (lat lon).
There are more efficient ways to insert points.
Don't format values into the SQL string; use query parameters instead.
This is how to insert one point:
cur.execute(
    "INSERT INTO location (id, location) "
    "VALUES (%s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))",
    (msisdn, lon, lat))
And see executemany if you want to insert more records at a time, where you would prepare a list of parameters to insert (i.e. [(msisdn, lon, lat), (msisdn, lon, lat), ..., (msisdn, lon, lat)]).
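A minimal sketch of the batched variant, assuming the location table from the question; the connection string and sample rows are hypothetical:

import psycopg2

conn = psycopg2.connect("dbname=gis")  # hypothetical connection settings
cur = conn.cursor()

# Batch of (msisdn, lon, lat) tuples; note the (X Y) = (lon lat) axis order.
rows = [
    (1001, 174.6837, -36.5703),
    (1002, 174.7633, -36.8485),
]
cur.executemany(
    "INSERT INTO location (id, location) "
    "VALUES (%s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))",
    rows)
conn.commit()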

Python: Write a number of columns to sqlite

I am working on a small project and I have created a helper function that will write a string of comma-separated values to a database as if they were values. I realise there are implications to doing it this way, but this is small and I need to get it going until I can do better.
def db_insert(table, data):
    """
    Insert data into a table. The data should be a tuple
    matching the number of columns, with null for any columns that
    have no value. False is returned on any error; the error is logged
    to the database log file."""
    if os.path.exists(database_name):
        con = lite.connect(database_name)
    else:
        error = "Database file does not exist."
        to_log(error)
        return False
    if con:
        try:
            cur = con.cursor()
            data = str(data)
            cur.execute('insert into %s values(%s)') % (table, data)
            con.commit()
            con.close()
        except Exception, e:
            pre_error = "Database insert raised an error;\n"
            thrown_error = pre_error + str(e)
            to_log(thrown_error)
        finally:
            con.close()
    else:
        error = "No connection to database"
        to_log(error)
        return False
database_name etc... are defined elsewhere in the script.
Barring any other obvious glaring errors, what I need to be able to do (by this method, or some other if there are suggestions) is allow somebody to create a list where each value represents a column value, as I will not know how many columns are being populated.
So somebody uses it as follows:
data = ["null", "foo", "bar"]
db_insert("foo_table", data)
This inserts that data into the table named foo_table. It is up to the user to know how many columns are in the table and to supply the correct number of elements to satisfy that.
I realise that it is better to use sqlite parameters, but there are two problems.
First, you cannot use a parameter to specify the table, only the values.
Second, you need to know how many values you are supplying. You have to do:
cur.execute('insert into table values(?,?,?)', (val1, val2, val3))
i.e. you need to be able to specify the three ?'s.
I am trying to write a general function that allows me to take an arbitrary number of values and insert them into an arbitrary table name.
Now, it was working relatively OK until I tried to pass in 'null' as a value.
One of the columns is the primary key and has autoincrement, so passing in null will allow it to autoincrement. There will also be other instances where nulls are required.
The problem is that Python keeps wrapping my null in single quotes, which sqlite complains about as a datatype mismatch, as the primary key is an integer field. If I try passing None as the Python null equivalent, the same thing happens.
So two problems.
How to insert an arbitrary number of columns.
How to pass a null.
Thank you for all your help on this and past questions.
Sorry, this looks like a duplicate of this:
Using Python quick insert many columns into Sqlite\Mysql
My apologies; I did not find it until after I wrote this.
That results in the following, which works:
def db_insert(table, data):
    """
    Insert data into a table. The data should be a tuple
    matching the number of columns, with null for any columns that
    have no value. False is returned on any error; the error is logged
    to the database log file."""
    if os.path.exists(database_name):
        con = lite.connect(database_name)
    else:
        error = "Database file does not exist."
        to_log(error)
        return False
    if con:
        try:
            tuple_len = len(data)
            holders = ','.join('?' * tuple_len)
            sql_query = 'insert into %s values({0})'.format(holders) % table
            cur = con.cursor()
            #data = str(data)
            #cur.execute('insert into readings values(%s)') % table
            cur.execute(sql_query, data)
            con.commit()
            con.close()
        except Exception, e:
            pre_error = "Database insert raised an error;\n"
            thrown_error = pre_error + str(e)
            to_log(thrown_error)
        finally:
            con.close()
    else:
        error = "No connection to database"
        to_log(error)
        return False
The second problem is a "works for me": when I pass None as a value, it is correctly converted back and forth to and from the db.
import sqlite3
conn = sqlite3.connect("test.sqlite")
data = ("a", None)
conn.execute('INSERT INTO "foo" VALUES(' + ','.join("?" * len(data)) + ')', data)
list(conn.execute("SELECT * FROM foo")) # -> [("a", None)]
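For the autoincrement case from the question specifically, passing None through a placeholder binds SQL NULL, so an INTEGER PRIMARY KEY column fills itself in. A minimal sketch with a hypothetical readings table:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, value TEXT)")

# None is bound as SQL NULL, so the INTEGER PRIMARY KEY assigns itself.
conn.execute("INSERT INTO readings VALUES (?, ?)", (None, "foo"))
conn.execute("INSERT INTO readings VALUES (?, ?)", (None, "bar"))
print(list(conn.execute("SELECT * FROM readings")))  # -> [(1, 'foo'), (2, 'bar')]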
