I'm trying to store very large int values in a sqlite3 db; the values are 100-115 digits long.
I've tried every possible combination - sending the input as string/integer and storing it as INT/TEXT/BLOB - but the result is always the same: the value 7239589231682139...97853 becomes 7.239589231682139e+113 in the db.
My db schema is:
conn.execute('''CREATE TABLE DATA
             (RESULT TEXT NOT NULL)''')
and the query is:
def insert(result):
    conn.execute(f'INSERT INTO DATA (RESULT) VALUES ({result})')
    conn.commit()
I wrote a simple function to test the above case:
DB_NAME = 'test.db'
conn = sqlite3.connect(DB_NAME)
conn.execute('''CREATE TABLE TEST_TABLE
                (TYPE_INT  INT,
                 TYPE_REAL REAL,
                 TYPE_TEXT TEXT,
                 TYPE_BLOB BLOB);''')
value1 = '123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890'
value2 = 123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890
conn.execute(f'INSERT INTO TEST_TABLE (TYPE_INT, TYPE_REAL, TYPE_TEXT, TYPE_BLOB) VALUES ({value1}, {value1}, {value1}, {value1})')
conn.execute(f'INSERT INTO TEST_TABLE (TYPE_INT, TYPE_REAL, TYPE_TEXT, TYPE_BLOB) VALUES ({value2}, {value2}, {value2}, {value2})')
conn.commit()
cursor = conn.execute('SELECT * from TEST_TABLE')
for col in cursor:
    print(f'{col[0]}, {col[1]}, {col[2]}, {col[3]}')
    print('--------------')
conn.close()
As you can see, I try all the possibilities, and the output is:
1.2345678901234568e+119, 1.2345678901234568e+119, 1.23456789012346e+119, 1.2345678901234568e+119
1.2345678901234568e+119, 1.2345678901234568e+119, 1.23456789012346e+119, 1.2345678901234568e+119
You are passing the value without single quotes, so it is treated as numeric.
Pass it as a string like this:
value1 = "123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
conn.execute("INSERT INTO TEST_TABLE (TYPE_TEXT) VALUES (?)", (value1,))
The ? placeholder will be replaced by:
'123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890'
because the value is a Python str, so it is stored properly as TEXT, which should be the data type of the column.
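A minimal round-trip sketch, assuming the TEST_TABLE from the test script above already exists: the value goes in as a string through a placeholder and comes back as TEXT that int() converts losslessly.
import sqlite3

conn = sqlite3.connect('test.db')

big_value = 123456789012345678901234567890123456789012345678901234567890  # any Python int, arbitrarily long
conn.execute("INSERT INTO TEST_TABLE (TYPE_TEXT) VALUES (?)", (str(big_value),))
conn.commit()

# Read it back: the TEXT column returns a str, which int() converts without loss
row = conn.execute("SELECT TYPE_TEXT FROM TEST_TABLE WHERE TYPE_TEXT = ?", (str(big_value),)).fetchone()
print(int(row[0]) == big_value)  # True
conn.close()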
Related
I am trying to add a new column to an existing table and populate it in the database. There is a predictions column that comes from a dataframe, and my code gives me an error. What am I doing wrong?
Code:
conn = create_connection()
cur = conn.cursor()
query = "ALTER TABLE STOCK_MARKET_FORECASTING ADD COLUMN predictions float"
cur.execute(query)
# Inserting predictions in database
def inserting_records(df):
    for i in range(0, len(df)):
        values = (df['Predicted_values_Hourly_Interval'][i])
        cur.execute("UPDATE STOCK_MARKET_FORECASTING SET (predictions) VALUES (%s)", values)
        conn.commit()
    print("Records created successfully")

inserting_records(predictions)
You're passing in a single value – cur.execute requires a tuple of values.
You're probably looking for INSERT, not UPDATE. UPDATE updates existing rows.
def inserting_records(df):
    series = df['Predicted_values_Hourly_Interval']
    for val in series:
        cur.execute("INSERT INTO STOCK_MARKET_FORECASTING (predictions) VALUES (%s)", (val, ))
    conn.commit()
might be what you're looking for.
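If the series is long, batching the inserts with executemany may be faster; a minimal sketch under the same assumptions (a %s-paramstyle DB-API driver and an already open conn/cur):
def inserting_records(df):
    series = df['Predicted_values_Hourly_Interval']
    # executemany takes an iterable of parameter tuples, one per row
    cur.executemany(
        "INSERT INTO STOCK_MARKET_FORECASTING (predictions) VALUES (%s)",
        [(val,) for val in series],
    )
    conn.commit()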
I'm trying to debug a SQL statement generated with sqlite3 python module...
c.execute("SELECT * FROM %s WHERE :column = :value" % Photo.DB_TABLE_NAME, {"column": column, "value": value})
It is returning no rows when I do a fetchall()
When I run this directly on the database
SELECT * FROM photos WHERE album_id = 10
I get the expected results.
Is there a way to see the constructed query to see what the issue is?
To actually answer your question, you can use the connection object's set_trace_callback to attach the print function; every statement will then be printed when it is executed. Here is an example in action:
# Import and connect to database
import sqlite3
conn = sqlite3.connect('example.db')
# This attaches the tracer
conn.set_trace_callback(print)
# Get the cursor, execute some statement as an example
c = conn.cursor()
c.execute("CREATE TABLE stocks (symbol text)")
t = ('RHAT',)
c.execute("INSERT INTO stocks VALUES (?)", t)
c.execute('SELECT * FROM stocks WHERE symbol=?', t)
print(c.fetchone())
This produces the output:
CREATE TABLE stocks (symbol text)
BEGIN
INSERT INTO stocks VALUES ('RHAT')
SELECT * FROM stocks WHERE symbol='RHAT'
('RHAT',)
The problem here is that string values are automatically wrapped in single quotes, so you cannot insert column names dynamically that way.
Concerning your question: I'm not sure about sqlite3, but in MySQLdb you can get the final query with something like (I'm not at a computer to check):
statement % conn.literal(query_params)
You can only use substitution parameters for row values, not for column or table names.
Thus, the :column in SELECT * FROM %s WHERE :column = :value is bound as the literal string 'album_id', so the WHERE clause compares two constants and never matches any rows.
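A common workaround is to check the column name against a fixed allowlist and interpolate only the name, while still binding the value with a placeholder; a minimal sketch (the ALLOWED_COLUMNS set and its entries are hypothetical, Photo.DB_TABLE_NAME is the constant from the question):
ALLOWED_COLUMNS = {"album_id", "filename"}  # hypothetical allowlist of real column names

def find_photos(c, column, value):
    if column not in ALLOWED_COLUMNS:
        raise ValueError(f"unexpected column: {column}")
    # Only the value is parameterised; the validated column name is interpolated into the SQL
    query = "SELECT * FROM %s WHERE %s = ?" % (Photo.DB_TABLE_NAME, column)
    return c.execute(query, (value,)).fetchall()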
I have a list that contains many lists in Python.
my_list = [['city', 'state'], ['tampa', 'florida'], ['miami','florida']]
The nested list at index 0 contains the column headers, and the rest of the nested lists contain the corresponding values. How would I insert this into SQL Server using pyodbc or sqlalchemy? I have been using pandas' pd.to_sql and want to make this a pure-Python process. Any help would be greatly appreciated.
expected output table would look like:
city |state
-------------
tampa|florida
miami|florida
Since the column names are coming from your list, you have to build a query string to insert the values. Column names and table names can't be parameterised with placeholders (?).
import pyodbc
conn = pyodbc.connect(my_connection_string)
cursor = conn.cursor()
my_list = [['city', 'state'], ['tampa', 'florida'], ['miami','florida']]
columns = ','.join(my_list[0]) #String of column names
values = ','.join(['?'] * len(my_list[0])) #Placeholders for values
query = "INSERT INTO mytable({0}) VALUES ({1})".format(columns, values)
#Loop through rest of list, inserting data
for l in my_list[1:]:
    cursor.execute(query, l)
conn.commit() #save changes
Update:
If you have a large number of records to insert you can do that in one go using executemany. Change the code like this:
columns = ','.join(my_list[0]) #String of column names
values = ','.join(['?'] * len(my_list[0])) #Placeholders for values
#Bulk insert
query = "INSERT INTO mytable({0}) VALUES ({1})".format(columns, values)
cursor.executemany(query, my_list[1:])
conn.commit() #save changes
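If the target is SQL Server through a recent pyodbc (4.0.19 or later), switching on fast_executemany before the bulk insert can speed it up considerably; a small sketch with the same query and my_list as above:
cursor.fast_executemany = True  # pyodbc-specific flag, available from 4.0.19
cursor.executemany(query, my_list[1:])
conn.commit()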
Assuming conn is an already-open connection to your database (note the slice, which skips the header row):
cursor = conn.cursor()
for row in my_list[1:]:
    cursor.execute('INSERT INTO my_table (city, state) VALUES (?, ?)', row)
cursor.commit()
Since the column names are the first element of the list, just do:
q = """CREATE TABLE IF NOT EXISTS stud_data (`{col1}` VARCHAR(250), `{col2}` VARCHAR(250));"""
sql_cmd = q.format(col1=my_list[0][0], col2=my_list[0][1])
mycursor.execute(sql_cmd)  # Create the table with the two columns
Now to add the values to the table, do:
for i in range(1, len(my_list)):
    sql = "INSERT IGNORE INTO stud_data (city, state) VALUES (%s, %s)"
    mycursor.execute(sql, (my_list[i][0], my_list[i][1]))
mydb.commit()  # commit on the connection object that created mycursor
print(mycursor.rowcount, "Record Inserted.")  # Row count of the last insert
This question already has answers here:
How can I get dict from sqlite query?
(16 answers)
Closed 4 years ago.
Issue:
Hi, right now I am making queries to sqlite and assigning the result to variables like this:
Table structure: rowid, name, something
cursor.execute("SELECT * FROM my_table WHERE my_condition = 'ExampleForSO'")
found_record = cursor.fetchone()
record_id = found_record[0]
record_name = found_record[1]
record_something = found_record[2]
print(record_name)
However, it's very possible that someday I will have to add a new column to the table. Let's take the example of adding that column:
Table structure: rowid, age, name, something
In that scenario, if we run the same code, name and something will be assigned incorrectly: the print will give me the age instead of the name, so I would have to edit the code manually to match the new indexes. I am now working with tables of more than 100 fields for a complex UI, and doing this by hand is tiresome.
Desired output:
I am wondering if there is a better way to catch results by using dicts or something like this:
Note for lurkers: The next snippet is made-up code that does not work; do not use it.
cursor.execute_to(my_dict,
'''SELECT rowid as my_dict["id"],
name as my_dict["name"],
something as my_dict["something"]
FROM my_table WHERE my_condition = "ExampleForSO"''')
print(my_dict['name'])
I am probably wrong with this approach, but that's close to what I want. That way I don't access the results by index, so if a new column is added, no matter where it is, the output stays the same.
What is the correct way to achieve it? Is there any other alternatives?
You can use a namedtuple and set it as the connection's row_factory in sqlite3. Example:
import sqlite3
from collections import namedtuple
# specify my row structure using namedtuple
MyRecord = namedtuple('MyRecord', 'record_id record_name record_something')
con = sqlite3.connect(":memory:")
con.isolation_level = None
con.row_factory = lambda cursor, row: MyRecord(*row)
cur = con.cursor()
cur.execute("CREATE TABLE my_table (record_id integer PRIMARY KEY, record_name text NOT NULL, record_something text NOT NULL)")
cur.execute("INSERT INTO my_table (record_name, record_something) VALUES (?, ?)", ('Andrej', 'This is something'))
cur.execute("INSERT INTO my_table (record_name, record_something) VALUES (?, ?)", ('Andrej', 'This is something too'))
cur.execute("INSERT INTO my_table (record_name, record_something) VALUES (?, ?)", ('Adrika', 'This is new!'))
for row in cur.execute("SELECT * FROM my_table WHERE record_name LIKE 'A%'"):
    print(f'ID={row.record_id} NAME={row.record_name} SOMETHING={row.record_something}')
con.close()
Prints:
ID=1 NAME=Andrej SOMETHING=This is something
ID=2 NAME=Andrej SOMETHING=This is something too
ID=3 NAME=Adrika SOMETHING=This is new!
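If you'd rather not declare the field names yourself, sqlite3 also ships a built-in sqlite3.Row row factory that allows access by column name; a minimal sketch against a similar in-memory table:
import sqlite3

con = sqlite3.connect(":memory:")
con.row_factory = sqlite3.Row  # rows become sqlite3.Row objects, indexable by column name
cur = con.cursor()

cur.execute("CREATE TABLE my_table (record_id integer PRIMARY KEY, record_name text, record_something text)")
cur.execute("INSERT INTO my_table (record_name, record_something) VALUES (?, ?)", ('Andrej', 'This is something'))

row = cur.execute("SELECT * FROM my_table").fetchone()
print(row['record_name'], row['record_something'])  # access by name, independent of column order
print(dict(row))                                     # a Row also converts cleanly to a dict
con.close()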
Is it possible for me to take data stored in a sqlite3 table and use it as a Python variable? I'm looking for something that might be similar to this pseudo-code:
import sqlite3
conn = sqlite3.connect(DATABASE)
cursor = conn.cursor()
variable = cursor.execute("fetch data from table")
To read a single value from a table, use a SELECT query that returns a result with a single row and a single column:
for row in cursor.execute("SELECT MyColumn FROM MyTable WHERE ID = ?", [123]):
    variable = row[0]
    break
else:
    variable = 0  # not found
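An equivalent sketch using fetchone(), which returns None when the query matches no rows (same hypothetical MyTable/MyColumn/ID names as above):
row = cursor.execute("SELECT MyColumn FROM MyTable WHERE ID = ?", [123]).fetchone()
variable = row[0] if row is not None else 0  # fall back to 0 when no row was found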