PostgreSQL database error: column does not exist - Python

I'm using PostgreSQL via Python, but I'm getting an error on the following insertion, claiming that 'column "\ufeff61356169" does not exist'.
c_id = "\ufeff61356169"
chunk = "engine"
query = f"""
INSERT INTO company_chunks(company_id, chunk)
VALUES(`{c_id}`, `{chunk}`);
"""
c.execute(query)
>>>
DatabaseError: {'S': 'ERROR', 'V': 'ERROR', 'C': '42703', 'M': 'column "\ufeff61356169" does not exist', 'P': '88', 'F': 'parse_relation.c', 'L': '3514', 'R': 'errorMissingColumn'}
One key note: "\ufeff61356169" is the value that is supposed to be inserted into the column, so the error confuses me: it's mistaking the insertion value for the column that should receive it. Any thoughts?
Just to verify that everything else is in working order I made sure to check that my table was successfully created.
query = """
SELECT column_name
FROM information_schema.columns
WHERE table_name = 'company_chunks';
"""
c.execute(query)
c.fetchall()
>>>
(['company_id'], ['chunk'])
So the table does exist, and it has the columns I'm trying to insert into. Where am I going wrong here?
Btw, I'm connecting to this database, which is hosted on GCP, via the Cloud SQL Python Connector. However, this connector was able to create the table, so I believe the problem is specific to the Python syntax and/or Postgres.
Edit: For the sake of understanding what this table looks like, here's the creation query.
query= """
CREATE TABLE company_chunks
(
company_id VARCHAR(25) NOT NULL,
chunk VARCHAR(100) NOT NULL
);
"""
c.execute(query)
conn.commit()

It's better to do it with %s placeholders and pass the values separately, so the driver handles the quoting itself (the backticks around the values seem to be getting parsed as identifier quoting rather than string literals, which is why PostgreSQL reads your value as a column name; string literals take single quotes):
sql = "INSERT INTO company_chunks(company_id, chunk) VALUES (%s, %s)"
var = (c_id,chunk)
mycursor.execute(sql,var)
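A minimal sketch of the full flow with a DB-API driver such as psycopg2 or pg8000 (both accept %s placeholders). Stripping the BOM ("\ufeff") from the id is an extra assumption here, in case that character was picked up accidentally from a BOM-prefixed input file and isn't meant to be stored:

```python
# Sketch: parameterized INSERT (psycopg2 / pg8000-style %s placeholders).
c_id = "\ufeff61356169"
chunk = "engine"

# The leading "\ufeff" is a UTF-8 byte-order mark; strip it if it
# isn't actually part of the id you want stored.
clean_id = c_id.lstrip("\ufeff")

query = "INSERT INTO company_chunks (company_id, chunk) VALUES (%s, %s)"
params = (clean_id, chunk)

# c.execute(query, params)  # the driver quotes the values safely
# conn.commit()
```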

Related

Issue while trying to select record in mysql using Python

Error Message
You have an error in your SQL syntax; check the manual that
corresponds to your MariaDB server version for the right syntax to use
near '%s' at line 1
MySQL Database Table
CREATE TABLE `tblorders` (
`order_id` int(11) NOT NULL,
`order_date` date NOT NULL,
`order_number` varchar(50) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
ALTER TABLE `tblorders`
ADD PRIMARY KEY (`order_id`),
ADD UNIQUE KEY `order_number` (`order_number`);
ALTER TABLE `tblorders`
MODIFY `order_id` int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=4;
Code
mydb = mysql.connector.connect(host = "localhost", user = "root", password = "", database = "mydb")
mycursor = mydb.cursor()
sql = "Select order_id from tblorders where order_number=%s"
val = ("1221212")
mycursor.execute(sql, val)
Am I missing anything?
You must pass a list or a tuple as the arguments, but ("1221212") is not a one-element tuple: parentheses around a single value are just grouping, so it's still a plain string.
Here are some workarounds to ensure that val is interpreted as a tuple or a list:
sql = "Select order_id from tblorders where order_number=%s"
val = ("1221212",)
mycursor.execute(sql, val)
sql = "Select order_id from tblorders where order_number=%s"
val = ["1221212"]
mycursor.execute(sql, val)
This is a thing about Python that I always find weird, but it makes a kind of sense.
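The distinction is easy to check in plain Python:

```python
val_scalar = ("1221212")    # parentheses only group: still a str
val_tuple = ("1221212",)    # the trailing comma makes a one-element tuple
val_list = ["1221212"]      # a list works as well

print(type(val_scalar).__name__)  # str
print(type(val_tuple).__name__)   # tuple
```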
In case you want to insert data, you have to modify your SQL: use INSERT instead of SELECT, like this:
INSERT INTO tblorders (order_number) VALUES ("122121");
That statement adds a new record to the table. Besides, with the MariaDB connector module you need to use ? as the placeholder, whereas %s works with the MySQL connector.
sql = "INSERT INTO tblorders (order_number) VALUES (?);"
val = "1231231"
mycursor.execute(sql, [val])

MySQL query inside python program returning unexpected 'b' inside the tuple fetched from the query execution

I'm writing a DBMS and am validating user inputs (for tables in the database) using the lengths and data types stored in the MySQL database's INFORMATION_SCHEMA.COLUMNS for the particular table they are entering data into. I'm using Python 3.8.4 (64-bit) with all the needed mysql.connector modules installed, and I have the MySQL server running on localhost on the same machine.
After executing the following query = "SELECT COLUMN_TYPE FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = 'my_schema' AND TABLE_NAME = 'the_specific_table';"
with the following python code:
NOTE: the particular table I'm using in this query is set out like below:
user_id - INT - PRIMARY KEY- NOT NULL
first_name - VARCHAR(45) - NOT NULL
last_name - VARCHAR(45) - NOT NULL
import mysql.connector
connection = mysql.connector.connect(
host = host,
user = user,
passwd = password,
database = db_name)
cursor = connection.cursor()
cursor.execute("SELECT COLUMN_TYPE FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = 'SCHEMA_HERE' AND TABLE_NAME = 'TABLE_HERE';")
result = cursor.fetchall()
print(result)
The result I get from this query is the following:
[(b'int',), (b'varchar(45)',), (b'varchar(45)',)]
This is completely unchanged from how it is returned when I print the cursor contents out.
Looping through the list gives each tuple as you would expect; however, when indexing into those tuples, instead of getting a character (the way string = 'hello' followed by print(string[0]) outputs 'h'), a seemingly random number is output instead. This means that at the moment I can't work out how to validate inputs against the length and data type of a column.
This weird 'b' inclusion only happens on the DATA_TYPE column of the query, so if the query
cursor.execute("""SELECT TABLE_NAME,COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'my_table'""")
is executed the result is
[('tblusers', 'user_id', b'int', None, 'NO'), ('tblusers', 'first_name', b'varchar', 45, 'NO'), ('tblusers', 'last_name', b'varchar', 45, 'NO')]
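The b'...' prefix means those values come back as Python bytes rather than str, and indexing a bytes object in Python 3 yields integer code points, which explains the "seemingly random numbers". A sketch of normalizing such rows, assuming the values are UTF-8 encoded:

```python
# Rows shaped like the driver output shown above (sketch data).
rows = [('tblusers', 'user_id', b'int', None, 'NO'),
        ('tblusers', 'first_name', b'varchar', 45, 'NO')]

def decode_row(row):
    """Decode any bytes values to str, leaving other values untouched."""
    return tuple(v.decode('utf-8') if isinstance(v, (bytes, bytearray)) else v
                 for v in row)

decoded = [decode_row(r) for r in rows]
print(decoded[0][2])  # int
```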
NOTE: this is my first time posting a question on here, so do forgive me if I have left out any crucial pieces of information needed for someone to help; just let me know and I will try to add them.
Any help is hugely appreciated :)

Compare data between MongoDB and MySQL using python script

I am working on a Django application that uses both MySQL and MongoDB to store its data. What I need to do is compare the data stored in a MongoDB collection with the data stored in a MySQL table.
For example, my MySQL database contains the table "relations", which is created using:
CREATE TABLE relations (service_id int, beneficiary_id int, PRIMARY KEY (service_id, beneficiary_id));
My MongoDB contains a collection called "relation", which is expected to store the same data as the relation table in MySQL. The following is one document of the collection "relation":
{'_id': 0, 'service_id': 1, 'beneficiary_id': 32}
I tried to create a python script that compares the data between the relation table in MySQL and relation collection in Mongo. The script works as the following:
mysql_relations = Relations.objects.values('beneficiary_id', 'service_id')
mongo_relations_not_in_mysql = relations_mongodb.find({'$nor':list(mysql_relations)})
mongo_relations = relations_mongodb.find({}, {'_id': 0, 'beneficiary_id':1, 'service_id': 1})
filter_list = Q()
for mongo_relation in mongo_relations:
filter_list &= Q(mongo_relation)
mysql_relations_not_in_mongo = Relations.objects.exclude(filter_list)
However, this code takes forever.
I think the main problem is because of the primary key that is composed of 2 columns, which required the usage of the Q() and the '$nor'.
What do you suggest?
Just in case someone is interested, I used the following solution to optimize the data comparison.
(The idea was to create a temporary MySQL table to store Mongo's data, then do the comparison between the two MySQL tables.) The code is below:
Get the relations from MongoDB:
mongo_relations = relations_mongodb.find({}, {'_id': 0, 'service_id': 1, 'beneficiary_id': 1})
Create a temporary MySQL table to store MongoDB's relations:
cursor = connection.cursor()
cursor.execute(
"CREATE TEMPORARY TABLE temp_relations (service_id int, beneficiary_id int, INDEX `id_related` (`service_id`, `beneficiary_id`) );"
)
Insert MongoDB's relations into the temporary table just created:
cursor.executemany(
'INSERT INTO temp_relations (service_id, beneficiary_id) values (%(service_id)s, %(beneficiary_id)s) ',
list(mongo_relations)
)
Get the MongoDB relations that do not exist in MySQL:
cursor.execute(
"SELECT service_id, beneficiary_id FROM temp_relations WHERE (service_id, beneficiary_id) NOT IN ("
"SELECT service_id, beneficiary_id FROM relations);"
)
mongo_relations_not_in_mysql = cursor.fetchall()
Get the MySQL relations that do not exist in MongoDB:
cursor.execute(
"SELECT id, service_id, beneficiary_id, date FROM relations WHERE (service_id, beneficiary_id) not IN ("
"SELECT service_id, beneficiary_id FROM temp_relations);"
)
mysql_relations_not_in_mongo = cursor.fetchall()
cursor.close() # Close the cursor
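For modest data sizes, the same two-way comparison can also be done in memory with Python sets keyed on the composite primary key (a sketch with made-up sample rows standing in for the two result iterables):

```python
# Sample rows standing in for Relations.objects.values(...) and the
# Mongo find(...) cursor; each dict carries the composite key fields.
mysql_relations = [{'service_id': 1, 'beneficiary_id': 32},
                   {'service_id': 2, 'beneficiary_id': 40}]
mongo_relations = [{'service_id': 1, 'beneficiary_id': 32},
                   {'service_id': 3, 'beneficiary_id': 7}]

# Build sets of (service_id, beneficiary_id) tuples for O(1) lookups.
mysql_keys = {(r['service_id'], r['beneficiary_id']) for r in mysql_relations}
mongo_keys = {(r['service_id'], r['beneficiary_id']) for r in mongo_relations}

mongo_not_in_mysql = mongo_keys - mysql_keys   # {(3, 7)}
mysql_not_in_mongo = mysql_keys - mongo_keys   # {(2, 40)}
```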

How do I work with IDs as a primary key in IBM's Db2

I have a flask server that needs to register an account. This account will have an ID as the primary key. I set the primary key to be
usr_id INTEGER NOT NULL PRIMARY KEY GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1)
when I try to save the info to the db (using python), I try
save_data = 'INSERT INTO myTable(name, email, psw) VALUES(?, ?, ?)'
param = name, email, psw #these are python vars
stmt = ibm_db.prepare(conn, save_data)#conn is the connection to db2 in python
ibm_db.execute(stmt, param)
My problem is, 'myTable' has that user id that auto-increments. When I tried registering for the first time, I got an error saying I was giving too few values. I figured the missing one would be the user id, so I specified which values I was giving, as shown in the code above. Now I get a new error. I don't understand what I should do to make Db2 generate the number, and I'm not sure whether this error still relates to the missing value I talked about.
the error message I get in the terminal is
Exception: Statement Execute Failed: [IBM][CLI Driver][DB2/LINUXX8664] SQL0302N The value of a host variable in the EXECUTE or OPEN statement is out of range for its corresponding use. SQLCODE=-302 SQLSTATE=22001
with a table defined like this:
create table so.my_table(id int not null primary key generated always as identity,
name varchar(10),
email varchar(20),
psw varchar(16))
this works just fine:
save_data_sql = 'insert into so.my_table(name, email, psw) values(?, ?, ?)'
params = 'kkuduk', 'kkuduk#ibm', 'strongPassword'
stmt = ibm_db.prepare(conn, save_data_sql)
ibm_db.execute(stmt, params)
i.e.
In [16]: ibm_db.execute(stmt, params)
In [17]: stmt = ibm_db.exec_immediate(conn,"select * from so.my_table")
In [18]: result = ibm_db.fetch_assoc(stmt)
In [19]: result
Out[19]: {'ID': 1, 'NAME': 'kkuduk', 'EMAIL': 'kkuduk#ibm', 'PSW': 'strongPassword'}
The other option is to bind the parameters one by one:
stmt = ibm_db.prepare(conn, save_data_sql)
ibm_db.bind_param(stmt, 1, 'kkuduk')
ibm_db.bind_param(stmt, 2, 'kkuduk#ibm')
ibm_db.bind_param(stmt, 3, 'strongPassword')
ibm_db.execute(stmt)
An error SQL0302N can be returned e.g. when a value is out of range for its target column (SQLSTATE 22001 indicates string data that is too long), so the issue is likely not your usr_id column but one of the 3 values you are actually trying to insert explicitly.
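Given the varchar sizes in the example table above, a quick way to spot an offending value is to check lengths before executing (a sketch; the limits mirror the example table definition, not your real schema):

```python
# varchar limits taken from the example table (name/email/psw).
limits = {'name': 10, 'email': 20, 'psw': 16}
params = {'name': 'kkuduk', 'email': 'kkuduk#ibm', 'psw': 'strongPassword'}

# Any value longer than its column would trigger SQL0302N / SQLSTATE 22001.
too_long = [col for col, val in params.items() if len(val) > limits[col]]
print(too_long)  # [] -> every value fits its column
```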

Python pymysql syntax error while inserting a JSON object

I get the following error while inserting into a table of a SQL database from Python:
pymysql.err.ProgrammingError: (1064, 'You have an error in your SQL
syntax; check the manual that corresponds to your MariaDB server
version for the right syntax to use near \'"Black": {"b": "125.98",
"a": "126.796", "L": "117.245"}, "Pink": {"b": "130.286\' at line 1')
SQL command is:
json1 = json.dumps(meanLAB_vals_dict) # convert python dict to json string
json2 = json.dumps(deltaE_dict)
sql_command = """INSERT INTO data_integrity_tool VALUES (%d, %d, %s, %s)""" %(i, image_id, json1, json2)
cursor.execute(sql_command)
connection.commit()
While meanLAB_vals_dict is:
{'Black': {'b': '125.98', 'a': '126.796', 'L': '117.245'}, 'Pink':
{'b': '130.286', 'a': '180.918', 'L': '169.0'}, 'Green': {'b':
'135.531', 'a': '103.51', 'L': '144.755'}, 'Violet': {'b': '109.878',
'a': '136.653', 'L': '122.02'}, 'Grey': {'b': '123.327', 'a':
'125.612', 'L': '139.429'}, 'Yellow': {'b': '195.571', 'a':
'112.612', 'L': '234.694'}, 'Red': {'b': '153.449', 'a': '177.918',
'L': '163.939'}, 'White': {'b': '128.02', 'a': '128.939', 'L':
'243.878'}, 'Blue': {'b': '84.7551', 'a': '122.98', 'L': '163.673'}}
and deltaE_dict is:
{'Black': '38.5187', 'Pink': '38.6975', 'mean delta E': '28.0643',
'Green': '42.6365', 'Violet': '35.5018', 'Grey': '19.8903', 'Yellow':
'24.5115', 'Red': '40.0078', 'White': '4.4993', 'Blue': '8.31544'}
While i and image_id are two integers (indices of the iterations).
Following is the data_integrity_tool table:
sql_command = """CREATE TABLE IF NOT EXISTS data_integrity_tool (
id_ INTEGER NOT NULL PRIMARY KEY AUTO_INCREMENT,
image_id INTEGER NOT NULL,
mean_lab_values TEXT,
delta_e_values TEXT);"""
I know some similar questions exist already; however, they are for PHP, which I have no idea about, and moreover I am totally new to SQL.
It seems that you haven't specified the field names into which the data is to be inserted in your SQL query. Also, use BLOB as the data type if you want to insert JSON data; that is much safer. And don't cast the numbers in the insert query.
If you follow my answer, your SQL statements should be something like this.
Table creation command:
sql_command = """CREATE TABLE IF NOT EXISTS data_integrity_tool (
id_ INTEGER NOT NULL PRIMARY KEY AUTO_INCREMENT,
image_id INTEGER NOT NULL,
mean_lab_values BLOB,
delta_e_values BLOB);"""
Insertion command:
sql_command = """INSERT INTO data_integrity_tool(id_,image_id,mean_lab_values,delta_e_values) VALUES (%d, %d, %s, %s)""" %(i, image_id, json1, json2)
Here is the answer (first, there is no problem in the JSON objects):
Create table:
sql_command = """CREATE TABLE IF NOT EXISTS data_integrity_tool (
id_ INTEGER NOT NULL PRIMARY KEY AUTO_INCREMENT,
image_id INTEGER NOT NULL,
mean_lab_values TEXT COLLATE utf8_unicode_ci,
delta_e_values TEXT COLLATE utf8_unicode_ci);"""
SQL insert query:
json1 = json.dumps(meanLAB_vals_dict)
json2 = json.dumps(deltaE_dict)
sql_command = """INSERT INTO data_integrity_tool(id_, image_id,
mean_lab_values, delta_e_values) VALUES (%s, %s, %s, %s)"""
cursor.execute(sql_command, (i, image_id, json1, json2))
NOTE:
In the create table, I set COLLATE to utf8_unicode_ci. If you don't specify it, the default is latin1_swedish_ci, which I don't want.
In the SQL insert query, sql_command doesn't contain the values, rather they are provided during the execution function. This avoids SQL Injection.
Also, in the SQL insert query, always use %s for all the fields.
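The original syntax error comes from %-formatting the JSON straight into the SQL string: json.dumps produces double-quoted keys and values, and those quotes land unescaped inside the statement text (a sketch with a trimmed-down dict):

```python
import json

mean_vals = {"Black": {"b": "125.98", "a": "126.796", "L": "117.245"}}
json1 = json.dumps(mean_vals)

# %-formatting pastes the raw JSON, double quotes and all, into the SQL,
# which is what the server then rejects as a syntax error.
bad_sql = "INSERT INTO data_integrity_tool VALUES (%d, %s)" % (1, json1)
print('"Black"' in bad_sql)  # True: unescaped quotes inside the statement

# With a parameterized query the driver escapes the value itself:
good_sql = "INSERT INTO data_integrity_tool VALUES (%s, %s)"
# cursor.execute(good_sql, (1, json1))
```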
Interesting links:
Insert Python List (JSON or otherwise) into MySQL databse
Inserting JSON into MySQL using Python
Python/MySQL query error: `Unknown column`
Python MySQLdb issues (TypeError: %d format: a number is required, not str)
