Inserting multiple strings into 1 table - python

I am trying to add 2 strings into a table.
My insert statement is:
cursor.execute("""INSERT INTO "State"
(state, relevant_id)
VALUES (%s, %s) """, state_values, relevant_id)
This does not work because I am supplying too many arguments. Relevant_id is a variable that holds an integer, while state_values are values pertaining to the relevant_id.
Is there a way to insert both strings coming from 2 different variables? I am coding in Python and using Postgres as the DB.

You should pass query parameters as a tuple in a second argument to execute:
cursor.execute(
    """INSERT INTO "State" (state, relevant_id)
       VALUES (%s, %s);""",
    (state_values, relevant_id))
If you do it this way, you'll also get escaping to prevent SQL injection for free.
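For completeness, a minimal end-to-end sketch (the connection parameters and the sample values are assumptions, not from your question):
import psycopg2

# Assumed connection details; adjust to your environment.
conn = psycopg2.connect(dbname="mydb", user="myuser", password="secret", host="localhost")

state_values = "CA"   # example value for the state column
relevant_id = 42      # example integer id

with conn, conn.cursor() as cursor:
    # Both parameters travel in a single tuple; psycopg2 does the quoting.
    cursor.execute(
        'INSERT INTO "State" (state, relevant_id) VALUES (%s, %s);',
        (state_values, relevant_id),
    )
# "with conn" commits on success and rolls back on error; the cursor is closed too.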
Hope that helps.

Related

PostGres - Insert list of tuples using execute instead of executemany

I am inserting thousands of rows, and timing and speed are very important. I have found through benchmarking that Postgres can ingest my rows faster using execute() instead of executemany().
This works well for me:
...
def insert(self, table, columns, values):
    conn = self.connectionPool.getconn()
    conn.autocommit = True
    try:
        with conn.cursor() as cursor:
            query = (
                f'INSERT INTO {table} ({columns}) '
                f'VALUES {values} '
                f'ON CONFLICT DO NOTHING;'
            ).replace('[', '').replace(']', '')  # Notice the replace x2 to get rid of the list brackets
            print(query)
            cursor.execute(query)
    finally:
        cursor.close()
        self.connectionPool.putconn(conn)
...
self.insert('types', 'name, created_at', rows)
After the double replace, printing query returns something like this and the rows are ingested:
INSERT INTO types (name, created_at) VALUES ('TIMER', '2022-04-09 03:19:49'), ('Sequence1', '2022-04-09 03:19:49') ON CONFLICT DO NOTHING;
Is my approach secure? Is there a more pythonic implementation using execute?
No, this isn’t secure or even reliable – Python repr isn’t compatible with PostgreSQL string syntax (try some strings with single quotes, newlines, or backslashes).
Consider passing array parameters instead and using UNNEST:
cursor.execute(
    "INSERT INTO types (name, created_at)"
    " SELECT name, created_at FROM UNNEST (%(names)s, %(created_ats)s) AS t (name, created_at)",
    {
        'names': ['TIMER', 'Sequence1', ...],
        'created_ats': ['2022-04-09 03:19:49', ...],
    })
This is the best solution, as the query doesn’t depend on the parameters (can be prepared and cached, statistics can be easily grouped, makes the absence of SQL injection vulnerability obvious, can easily log queries without data).
Failing that, build a query that's only dynamic in the number of parameters, like VALUES (%s, %s, ...), (%s, %s, ...), ... (a sketch of this approach follows below). Note that PostgreSQL has a parameter limit, so you might need to produce these in batches.
Failing that, use psycopg2.sql.Literal.
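A rough sketch of that dynamic-placeholder idea, assuming rows is a list of (name, created_at) tuples; the helper name insert_rows and the batch size are just illustrative choices:
def insert_rows(cursor, rows, batch_size=1000):
    # Build "(%s, %s), (%s, %s), ..." for each batch; the query shape only
    # depends on the batch size, never on the values themselves.
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        placeholders = ", ".join("(%s, %s)" for _ in batch)
        query = ("INSERT INTO types (name, created_at) VALUES "
                 + placeholders + " ON CONFLICT DO NOTHING;")
        # Flatten the batch so each value lines up with one %s placeholder.
        params = [value for row in batch for value in row]
        cursor.execute(query, params)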

2 questions: Importing data from MySQL database to Python

Q1. My database contains 3 columns: time, value A and value B. The time data is written in the form 00:00:00 and the increment is 1 minute.
When I try to import data ...
cursor.execute (f"SELECT * FROM trffc_int_data.{i};")
instead of getting (00:00:00, A, B), I get
(datetime.timedelta(0), 7, 2), (datetime.timedelta(seconds=60), 8, 5), .....
I suppose Python doesn't convert the time right. Any suggestions?
Q2. I have an initial database with the data mentioned above. I need to get the data from the initial database, convert it, and save it to another database.
I'm stuck at a point where data should be saved to a new table.
Here are the sections of the code...
# Creating new DB
NewDB = input(" :: Enter the Database name : ")
sqlsynt = f"CREATE DATABASE IF NOT EXISTS {NewDB}"
cursor.execute(sqlsynt,NewDB)
stdb.commit()
# Creating table and writing the data
cursor.execute (f"USE {NewDB}")
sqlsynt = f"CREATE TABLE {dayinweek} (time TIME, Vehicles INT(3), Pedestrians INT(3))"
cursor.execute (sqlsynt, NewDB, dayinweek)
#stdb.commit()
sqlsyntax = f"INSERT INTO {NewDB}.{dayinweek} (time, Vehicles, Pedestrians) VALUES (%s, %s, %s)"
cursor.executemany(sqlsyntax, temp_list_day)
The program gets stuck on the last line, saying that there is no table 1 in NewDB!
mysql.connector.errors.ProgrammingError: 1146 (42S02): Table 'test001.1' doesn't exist
What's wrong with the code? Maybe the problem is in mixing f-string and %s formatting?
Thanks in advance
If I am following this correctly, you are creating a table called 1. Digit-only identifiers are not allowed in MySQL unless the identifier is quoted, as explained in the documentation:
Identifiers may begin with a digit but unless quoted may not consist solely of digits.
Your create table statement did fail, but you did not notice that error until you tried to insert.
You could quote the table name, using backticks:
CREATE TABLE `{dayinweek}` (time TIME, Vehicles INT(3), Pedestrians INT(3))
And then:
INSERT INTO `{NewDB}`.`{dayinweek}` (time, Vehicles, Pedestrians) VALUES (%s, %s, %s)
Quoting the database name may also be a good idea: the same rules apply as for table names (and this is user input to start with).
But overall, changing the table name seems like a better option, as this makes for cleaner code: how about something like table1, for example, or better yet a table name that is more expressive about the kind of data it contains, such as customer1 or sales1.
Note: your code is open to SQL injection, as you are passing user input directly to the database in a create database statement. Obviously such information cannot be parameterized, however I would still recommend performing a minimal sanity check on application side beforehand.
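A minimal sanity check could look like the sketch below; the regular expression and the helper name safe_identifier are mine, and this is a gatekeeper rather than a complete defence:
import re

def safe_identifier(name):
    # Accept only names that start with a letter or underscore and contain
    # just letters, digits and underscores, so a bare "1" is rejected early.
    if not re.fullmatch(r'[A-Za-z_][A-Za-z0-9_]*', name):
        raise ValueError(f"invalid identifier: {name!r}")
    return name

table = safe_identifier(dayinweek)          # dayinweek comes from the question
cursor.execute(f"CREATE TABLE `{table}` (time TIME, Vehicles INT(3), Pedestrians INT(3))")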

INSERT INTO using sqlite with python [duplicate]

This question already has answers here:
How to insert variable into sqlite database in python?
(3 answers)
Closed 5 years ago.
c.execute("INSERT INTO REDDIT_USER (USERNAME, CREATED_DATE) VALUES ('PHILZEEY', '08-09-17')")
This top one works.
c.execute('INSERT INTO {tn} (USERNAME, CREATED_DATE) VALUES ({nm}, {dt})'.format(tn='REDDIT_USER', nm='John', dt='09-09-17'))
This bottom one doesn't.
Anything I'm missing?
You should be using prepared statements with positional parameters:
c.execute("INSERT INTO REDDIT_USER (USERNAME, CREATED_DATE) VALUES (?, ?)", ('John', '2017-09-09'))
With regard to making the table name a parameter, this isn't possible, even from a prepared statement. In general, most queries would not run on a different table. And allowing the table name to be changed like this could represent a security risk. If you need to query two different tables, then create two prepared statements for those tables.
By the way, you said that the top one works:
c.execute("INSERT INTO REDDIT_USER (USERNAME, CREATED_DATE) VALUES ('PHILZEEY', '08-09-17')")
Actually, while this might execute, it almost certainly is not date data you want to be inserting into your SQLite database. Dates are stored as regular text and should almost always be stored in an ISO format, with the year, followed by the month and day. For example, you could use '2017-09-08' as a date literal, but don't use '08-09-17'.
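Assuming '08-09-17' really means 8 September 2017 (the format is a guess), the conversion can be done once in Python before the insert:
from datetime import datetime

raw = '08-09-17'                                             # assumed day-month-year
iso = datetime.strptime(raw, '%d-%m-%y').date().isoformat()  # '2017-09-08'
c.execute("INSERT INTO REDDIT_USER (USERNAME, CREATED_DATE) VALUES (?, ?)",
          ('PHILZEEY', iso))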
Your 2nd one doesn't have quotes around the inserted values. Try this:
c.execute("INSERT INTO {tn} (USERNAME, CREATED_DATE) VALUES ('{nm}', '{dt}')".format(tn='REDDIT_USER', nm='John', dt='09-09-17'))

Given table and column name, how to test if INSERT needs quotes ('') around the values to be inserted?

I have a dictionary of column name / values, to insert into a table. I have a function that generates the INSERT statement. I'm stuck because the function always puts quotes around the values, and some are integers.
e.g. If column 1 is type integer then the statement should be INSERT INTO myTable (col1) VALUES 5; vs
INSERT INTO myTable (col1) VALUES '5'; second one causes an error saying column 5 does not exist.
EDIT: I found the problem (I think). the value was in double quotes not single, so it was "5".
In Python, given a table and column name, how can I test if the INSERT statement needs to have '' around the VALUES ?
This question was tagged with "psycopg2" -- you can prepare the statement using a format string and have psycopg2 infer types for you in many cases.
cur.execute('INSERT INTO myTable (col1, col2) VALUES (%s, %s);', (5, 'abc'))
psycopg2 will deal with it for you, because Python knows that 5 is an integer and 'abc' is a string.
http://initd.org/psycopg/docs/usage.html#passing-parameters-to-sql-queries
You certainly want to use a library function to decide whether or not to quote values you insert. If you are inserting anything input by a user, writing your own quoting function can lead to SQL Injection attacks.
It appears from your tags that you're using psycopg2 - I've found another response that may be able to answer your question, since I'm not familiar with that library. The main gist seems to be that you should use
cursor.execute("query with params %s %s", ("param1", "pa'ram2"))
Which will automatically handle any quoting needed for param1 and param2.
Although I personally don't like the idea, you can use single quotes around integers when you insert in Postgres.
Perhaps your problem is the lack of parentheses:
INSERT INTO myTable(col1)
VALUES('5');
Here is a SQL Fiddle illustrating this code.
As you note in the comments, double quotes do not work in Postgres.
You can always put the single quotes around the value (but be careful: if the value contains a quote you must double it: insert into example (value_t) values ('O''Hara');).
You can decide by checking the value that you want to insert, regardless of the type of the destination.
You can decide by checking the type of the target field.
As you can see in http://sqlfiddle.com/#!15/8bfbd/3 there is no problem inserting integers into a text field, or a string that represents an integer into a numeric field.
To check the field type you can use the information_schema:
select data_type from information_schema.columns
where table_schema='public'
and table_name='example'
and column_name='value_i';
http://sqlfiddle.com/#!15/8bfbd/7
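If you do want to branch on the column type from Python, a sketch with psycopg2 (reusing the example table from the fiddles; the helper name column_type is mine) might look like this:
def column_type(cursor, table, column, schema='public'):
    # Returns e.g. 'integer' or 'text', or None if the column is unknown.
    cursor.execute(
        """SELECT data_type FROM information_schema.columns
           WHERE table_schema = %s AND table_name = %s AND column_name = %s""",
        (schema, table, column),
    )
    row = cursor.fetchone()
    return row[0] if row else None

print(column_type(cur, 'example', 'value_i'))   # cur is an open psycopg2 cursor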

dynamic table mysqldb python string/int issue

I am receiving an error when trying to write data to a database table when using a variable for the table name, which I do not get when using a static name. On the line where I insert, if I pass an integer as the column value the code runs and the table is filled; however, if I try to use a string I get a SQL syntax error.
cursor = db.cursor()
cursor.execute('DROP TABLE IF EXISTS %s' % data[1])
sql = """CREATE TABLE %s (IP TEXT, AVAILIBILITY INT)""" % data[1]
cursor.execute(sql)
for key in data[0]:
    cur_ip = key.split(".")[3]
    cursor.execute("""INSERT INTO %s VALUES (%s,%s)""" % (data[1], key, data[0][key]))
db.commit()
The problem is where I have %(data[1], key, data[0][key]). Any ideas?
It's a little hard to analyse your problem when you don't post the actual error, and since we have to guess what your data actually is. But some general points as advice:
Using a dynamic table name is often not the way DB systems are meant to be used. Try thinking whether the problem could be solved by using a static table name and adding an additional key column to your table; into that column you can put what you are now using as a dynamic table name. This way the DB might be able to better optimize your queries, and your queries are less likely to cause errors: there is no need to create extra tables on the fly (which is not a cheap thing to do), and you would not need dynamic DROP TABLE queries, which can be a security risk.
So my advice would be to work around the problem by getting rid of dynamic table names altogether; a sketch of that alternative follows below.
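A sketch of what that could look like, built on the variables from your snippet (the table and column names availability and group_name are made up):
cursor.execute("""CREATE TABLE IF NOT EXISTS availability (
    group_name VARCHAR(64),   -- holds what used to be the dynamic table name (data[1])
    ip TEXT,
    availability INT
)""")
for key in data[0]:
    cursor.execute(
        "INSERT INTO availability (group_name, ip, availability) VALUES (%s, %s, %s)",
        (data[1], key, data[0][key]),
    )
db.commit()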
Another problem is that you are using Python string formatting instead of parameters to the query itself. That is a security problem in itself (SQL injection), and it is also the cause of your syntax error. When you use numbers, your expression evaluates to
INSERT INTO table_name VALUES (100, 200)
Which is valid SQL. But with strings you get
INSERT INTO table_name VALUES (Some Text, some more text)
which is not valid (since there are no quotes ' around the strings).
To get rid of both your syntax problem and the SQL injection problem, don't add the values to the string; pass them as a parameter sequence to execute():
cursor.execute("INSERT INTO table_name VALUES (%s,%s)", (key, data[0][key]))
If you must have a dynamic table name, put that in your query string first (e.g. with % formatting), and give the actual values for your query as parameters as above (since I cannot imagine that execute will accept the table name as a parameter).
To put it in some simple sample code: right now you are trying to do it like this:
# don't do this, this won't even work!
table_name = 'some_table'
user_name = 'Peter Smith'
user_age = 47
query = "INSERT INTO %s VALUES (%s, %s)" % (table_name, user_name, user_age)
cursor.execute(query)
That creates the query
INSERT INTO some_table VALUES (Peter Smith, 47)
Which cannot work, because of the unquoted string. So you would need to do:
# DON'T DO THIS, it's bad!
query = "INSERT INTO %s VALUES ('%s', %s)" % (table_name, user_name, user_age)
That's not a good idea, because you need to know where to put quotes and where not (which you will mess up at some point). Even worse, imagine a user named Connor O'Neal. You would get a syntax error:
INSERT INTO some_table VALUES ('Connor O'Neal', 47)
(This is also how SQL injections are used to crash your system or steal your data.) So you would also need to take care of escaping the values that are strings, which gets more and more complicated.
Leave those problems to Python and MySQL, by passing the data (not the table name) as arguments to execute!
table_name = 'some_table'
user_name = 'Peter Smith'
user_age = 47
query = "INSERT INTO " + table_name + " VALUES (%s, %s)"
cursor.execute(query, (user_name, user_age))
This way you can even pass datetime objects directly. There are other ways to pass the data than using %s; take a look at these examples: http://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-execute.html (Python 3 is used there; I don't know which version you use, but except for the print statements it should work with Python 2 as well, I think).
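For instance, a datetime value can go straight into the parameter tuple (the extra created_at column here is made up for the sake of the example):
from datetime import datetime

query = "INSERT INTO " + table_name + " (name, age, created_at) VALUES (%s, %s, %s)"
cursor.execute(query, (user_name, user_age, datetime.now()))
# The driver converts the datetime object into a proper SQL literal itself.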
