I have a Postgres table with a _text type (note the underscore) and am unable to determine how to insert the string [] into that table.
Here is my table definition:
CREATE TABLE public.newtable (
column1 _text NULL
);
I have the postgis extension enabled:
CREATE EXTENSION IF NOT EXISTS postgis;
And my python code:
conn = psycopg2.connect()
conn.autocommit = True
cur = conn.cursor(cursor_factory=psycopg2.extras.DictCursor)
rows = [("[]",)]
insert_query = f"INSERT INTO newtable (column1) values %s"
psycopg2.extras.execute_values(cur, insert_query, rows, template=None, page_size=100)
This returns the following error:
psycopg2.errors.InvalidTextRepresentation: malformed array literal: "[]"
LINE 1: INSERT INTO newtable (column1) values ('[]')
^
DETAIL: "[" must introduce explicitly-specified array dimensions.
How can I insert this data? What does this error mean? And what is a _text type in Postgres?
Pulling my comments together:
CREATE TABLE public.newtable (
column1 _text NULL
);
--_text gets transformed into text[]
\d newtable
Table "public.newtable"
Column | Type | Collation | Nullable | Default
---------+--------+-----------+----------+---------
column1 | text[] | | |
insert into newtable values ('{}');
select * from newtable ;
column1
---------
{}
In Python:
import psycopg2
con = psycopg2.connect(dbname="test", host='localhost', user='postgres')
cur = con.cursor()
cur.execute("insert into newtable values ('{}')")
con.commit()
cur.execute("select * from newtable")
cur.fetchone()
([],)
cur.execute("truncate newtable")
con.commit()
cur.execute("insert into newtable values (%s)", [[]])
con.commit()
cur.execute("select * from newtable")
cur.fetchone()
([],)
From the psycopg2 docs on type adaptation: Postgres arrays are adapted to Python lists and vice versa.
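Applying that to the original execute_values() call: a minimal sketch, assuming the goal is to store an empty array and that execute_values adapts list parameters the same way cursor.execute does, is to pass a Python list instead of the string "[]":

import psycopg2
import psycopg2.extras

# Sketch only: connection parameters omitted, as in the question.
conn = psycopg2.connect()
conn.autocommit = True
cur = conn.cursor(cursor_factory=psycopg2.extras.DictCursor)

# One row whose single column is an empty Python list; psycopg2 adapts it
# to an empty Postgres array rather than the malformed literal "[]".
rows = [([],)]
psycopg2.extras.execute_values(
    cur,
    "INSERT INTO newtable (column1) VALUES %s",
    rows,
    template=None,
    page_size=100,
)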
UPDATE
The _text type can be found in the Postgres system catalog pg_type. In psql:
\x
Expanded display is on.
select * from pg_type where typname = '_text';
-[ RECORD 1 ]--+-----------------
oid | 1009
typname | _text
typnamespace | 11
typowner | 10
typlen | -1
typbyval | f
typtype | b
typcategory | A
typispreferred | f
typisdefined | t
typdelim | ,
typrelid | 0
typelem | 25
typarray | 0
typinput | array_in
typoutput | array_out
typreceive | array_recv
typsend | array_send
typmodin | -
typmodout | -
typanalyze | array_typanalyze
typalign | i
typstorage | x
typnotnull | f
typbasetype | 0
typtypmod | -1
typndims | 0
typcollation | 100
typdefaultbin | NULL
typdefault | NULL
typacl | NULL
Refer to the pg_type documentation linked above for what each of these columns means. The typcategory of A, which "Table 52.63. typcategory Codes" maps to "Array types", is one clue, as are the typinput, typoutput, etc. values.
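As a quick cross-check (a sketch, assuming an open psycopg2 connection conn), the typelem value of 25 can be resolved back to the element type by casting it to regtype, confirming that _text is simply the array type whose elements are text:

cur = conn.cursor()
# typelem holds the OID of the element type; regtype renders it by name.
cur.execute("SELECT typname, typelem::regtype FROM pg_type WHERE typname = '_text'")
print(cur.fetchone())   # expected: ('_text', 'text')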
Related
I'm trying to execute a multi-statement SQL query using pyodbc, but I keep getting errors, either about there being no results or about scalar variables that must be declared. Since I need to declare SQL variables that are used at different places in the query, how can I get this to run in pyodbc?
Would this be feasible if I converted my SQL into a stored procedure?
It is not likely that I will be able to create the logic as a stored procedure as I do not have write access to the database.
Is there any possible way to get this type of query to run in python, or does it need to be modified in some way?
| ID | LNAME | FNAME | EMAIL |
| ----- | -------- | -------- | ------- |
| 1 | Smith | Bob | s#a.com |
| 2 | Davidson | Mike | d#a.com |
| 1 | Campbell | Brian | c#a.com |
This is what I tried so far but keep running into errors.
q = """
set ANSI_WARNINGS OFF;
declare @html varchar(MAX)
declare @dedupedemails varchar(MAX)
declare @esc_seq int
set @esc_seq = 5;
if object_id('tempdb.dbo.##dedupemail', 'U') is not null
drop table ##dedupemail;
with sub1 as (
select p.ID, p.LNAME, p.FNAME, p.EMAIL
from dbo.person p
),
sub2 as (
select
s1.*,
case when s1.ID = 1
then 'Yes'
else 'No'
end as IS_ADMIN
from sub1 s1
)
select distinct s2.* into ##dedupemail
from sub2 s2
where s2.IS_ADMIN = 'Yes'
set @html = 'abc';
select @dedupedemails = ltrim(stuff((
select '; ' + d.email
from ##dedupemail d
for xml path('')), 1,1,''));
select @dedupedemails as EMAIL_LIST, @html as EMAIL_BODY"""
try:
    cnxn = pyodbc.connect(cnxn_str)
    cursor = cnxn.cursor()
    cursor.execute(q)
    result = cursor.fetchall()
    del cnxn
except pyodbc.Error as e:
    print("Error caught: ", e)
The error:
Error caught: No results. Previous SQL was not a query.
Another error I get refers to scalar variables that must be declared, but either way it always errors out.
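For reference, the "No results. Previous SQL was not a query" error usually means the batch produced row-count messages (from the SET, DECLARE, and SELECT ... INTO statements) ahead of the final result set, so there is nothing for fetchall() to read. A minimal, hedged sketch of the common workaround, prefixing the batch with SET NOCOUNT ON (and assuming q and cnxn_str are defined as in the question), looks like this:

import pyodbc

# Suppress the intermediate row counts so the final SELECT is the only
# result set pyodbc sees.
q = "SET NOCOUNT ON;\n" + q

try:
    cnxn = pyodbc.connect(cnxn_str)
    cursor = cnxn.cursor()          # note: cnxn.cursor(), not cnxn()
    cursor.execute(q)
    result = cursor.fetchall()      # rows of the final SELECT
    cnxn.close()
except pyodbc.Error as e:
    print("Error caught: ", e)

An alternative, if SET NOCOUNT ON is not an option, is to call cursor.nextset() in a loop until the result set that has a cursor.description is reached.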
I have a table in mysql as below:
CREATE TABLE `province` (
`pid` int(2) unsigned zerofill NOT NULL,
`pname` varchar(255) CHARACTER SET utf8 COLLATE utf8_persian_ci DEFAULT NULL,
`family` int(12) DEFAULT NULL,
`population` int(11) DEFAULT NULL,
`male` int(11) DEFAULT NULL,
`female` int(11) DEFAULT NULL,
PRIMARY KEY (`pid`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_persian_ci
Note that the first column (pid) is a zerofill column,
and the data in the table (province) is as below:
=========================================
|pid|pname|family|population|male|female|
=========================================
|02 | 'A' | 12 | 20 | 8 | 5 |
=========================================
|03 | 'B' | 25 | 20 | 7 | 6 |
=========================================
|05 | 'c' | 34 | 5 | 7 | 9 |
=========================================
I want to retrieve the pid column via Python, so my Python code is:
import mysql.connector

if __name__ == '__main__':
    data = []
    res = []
    cnx = mysql.connector.connect(user='mehdi', password='mehdi', host='127.0.0.1', database='cra_db')
    cursor = cnx.cursor()
    qrystr = 'SELECT pid FROM province;'
    cursor.execute(qrystr)
    print(cursor.fetchall())
    cnx.close()
but when I run this Python code, this exception occurs:
returned a result with an error set
File "C:\Users\M_Parastar\Desktop\New folder\ttt.py", line 11, in
print(cursor.fetchall())
Do you have any idea how to retrieve a zerofill column via Python?
The cursor is an iterable Python object: replace print(cursor.fetchall()) with for row in cursor: print(row). Visit the Connector/Python API Reference.
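A short sketch of that suggestion, reusing the cursor and connection from the question:

cursor.execute('SELECT pid FROM province')
for row in cursor:
    # Each row is a tuple; the exact formatting of the zerofill values
    # depends on the connector's type conversion, e.g. (2,), (3,), (5,).
    print(row)
cnx.close()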
I'm currently using the mysql-connector-python package to execute database actions in Flask. It had been working well until, suddenly, the variables didn't seem to be working correctly anymore. My code is here:
@bp.route('/addcart', methods=('OPTIONS', 'POST'))
def addcart():
    ...
    userID = session.get("user_id")
    reqDict = request.get_json()
    itemCode = str(reqDict['itemCode'])
    itemAmt = reqDict['itemAmt']
    if userID is not None:
        db = get_db()
        cursor = db.cursor()
        query = ('SELECT %s FROM cartdata WHERE id = %s')
        cursor.execute(query, (itemCode, userID))
        currentNum = cursor.fetchone()[0]
        if currentNum is None:
            stmt = ('UPDATE cartdata SET %s = 1 WHERE id = %s')
            cursor.execute(stmt, (itemCode, userID))
        else:
            currentNum = int(currentNum) + int(itemAmt)
            stmt = ('UPDATE cartdata SET %s = %s WHERE id = %s')
            cursor.execute(stmt, (itemCode, currentNum, userID))
    ....
For some reason, I seem to be having trouble with the itemCode variable. When I pass it as a parameter, as in the execution of query or stmt above, it doesn't work. Typically I will get an error saying
" You have an error in your SQL syntax; check the manual that
corresponds to your MySQL server version for the right syntax to use
near ''p1' = 1 WHERE id = 21'".
However, if I do this:
query = ('SELECT ' + itemCode + ' FROM cartdata WHERE id = %s')
...
stmt = ('UPDATE cartdata SET '+ itemCode +' = 1 WHERE id = %s')
...
It works properly as intended.
EDIT: I've checked my backend, and apparently the UPDATE statement does not actually update anything. So now I'm at a complete loss.
I don't understand why the connector suddenly breaks for these variables. I've checked the variables and their types, and they were the expected types. Any insight would be helpful.
My table schema for 'cartdata' looks something like this:
+-------+---------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+-------+---------+------+-----+---------+-------+
| id | int(11) | NO | PRI | NULL | |
| p1 | int(8) | YES | | NULL | |
| p2 | int(8) | YES | | NULL | |
| p3 | int(8) | YES | | NULL | |
| p4 | int(8) | YES | | NULL | |
| p5 | int(8) | YES | | NULL | |
+-------+---------+------+-----+---------+-------+
That's because when MySQL connector injects your variables into the SQL statement, it formats them according to their type.
You can actually see it in the error message that you get:
"p1' = 1 WHERE id = 21'"
^
So your UPDATE statement probably ends up looking like this:
UPDATE cartdata SET 'p1' = 1 WHERE id = 21
which is syntactically invalid SQL (while the corresponding SELECT 'p1' FROM cartdata WHERE id = someId is valid, it quietly returns the literal string 'p1' rather than the column)...
Your second option, however, seems okay. By the way, it seems risky to pick the column you select based on the user's input; I'd highly recommend validating this value against a whitelist of allowed column names (see the sketch at the end of this answer).
Details
You cannot use %s for column names, since that injects a quoted string value into your SQL query, and the result is not valid SQL (column names are not string values).
As above:
SET %s = ...
Generates:
SET 'colName' = ...
which is not valid because you are attempting to assign a value to another value...
That would be the same as trying to do the following in python:
'foo' = 'bar'
or
'foo' = 4
You can use %s when setting values (using SET colName = %s) or filtering values (using WHERE colName = %s), because in those positions you are supplying a value, which is exactly what the placeholder produces.
As above:
WHERE colName = %s
Generates:
WHERE colName = 'fooBar'
which is valid because you filter on the values that are equal to the string fooBar.
By the way, you might want to check what
SELECT %s FROM cartdata WHERE id = %s
gives you as a result; that could cause subtle problems. MySQL won't report anything, but the result will simply be the value of itemCode itself (SELECT 'hello' is valid SQL; it just returns 'hello').
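Here is the whitelist sketch mentioned above. It is a non-authoritative example: ALLOWED_COLUMNS and add_to_cart are hypothetical names, and only the column names from the cartdata schema in the question are assumed.

ALLOWED_COLUMNS = {"p1", "p2", "p3", "p4", "p5"}

def add_to_cart(db, user_id, item_code, item_amt):
    # Reject anything that is not a known cartdata column before it ever
    # reaches the SQL string.
    if item_code not in ALLOWED_COLUMNS:
        raise ValueError("unknown item code: %r" % item_code)

    cursor = db.cursor()
    # The identifier is interpolated (it is now known to be safe); the
    # values are still bound with %s placeholders.
    cursor.execute(
        "SELECT `" + item_code + "` FROM cartdata WHERE id = %s", (user_id,))
    current = cursor.fetchone()[0]

    new_value = 1 if current is None else int(current) + int(item_amt)
    cursor.execute(
        "UPDATE cartdata SET `" + item_code + "` = %s WHERE id = %s",
        (new_value, user_id))
    db.commit()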
I'm trying to delete rows from a psql table on a condition.
I want all rows to be deleted if column "TagNaam" equals a variable var_tagnaam.
I've tried the following code and some variants but I can't get it to work.
There aren't any errors though.
cur.execute("DELETE FROM opc_actuelewaardentags WHERE 'TagNaam' = %s", (var_tagnaam,))
Is there something wrong with the syntax?
Edit:
Maybe it is clearer with the surrounding code; the error might be in the other code?
for i in range(len(taginhoud)):
    (var_tagnaam, var_tagwaarde, var_tagkwaliteit, var_tagtime) = taginhoud[i]
    print(var_tagnaam)
    cur.execute("DELETE FROM opc_actuelewaardentags WHERE 'TagNaam' = %s", (var_tagnaam,))
    conn.commit()
    cur.execute('INSERT INTO opc_actuelewaardentags ("TagNaam", "TagWaarde", "TagKwaliteit", create_date, write_date) VALUES (%s,%s,%s,now(),now())',
                (var_tagnaam, var_tagwaarde, var_tagkwaliteit))
    conn.commit()
So what I try to do here is:
Retrieve "var_tagnaam" from list "taginhoud".
Then in table opc_actuelewaardentags find all rows where column "Tagnaam" equals the value in "var_tagnaam". (Should be a string)
Then delete those rows where "Tagnaam" = "var_tagnaam". This part doesn't work.
Then insert new rows with data. This part works.
Could this code be wrong to do what I want?
I have tried many things already to solve the upper/lower case problem.
Edit 2: The query in pgAdmin worked; trying to do the same thing in Python:
I ran this query in pgadmin and it deleted the rows:
delete FROM opc_actuelewaardentags where "TagNaam" = 'Bakkerij.Device1.DB100INT8';
My attempt to make it as similar as possible in python:
var_tagnaam2 = "'"+var_tagnaam+"'"
cur.execute("DELETE FROM opc_actuelewaardentags WHERE \"TagNaam\" = %s", (var_tagnaam2,))
conn.commit()
I tried to escape the double quotes in an attempt to make it the same as in pgAdmin.
'TagNaam' in single quotes is not a valid column identifier in SQL; single quotes make it a string literal. Write the name bare, or use the database's identifier quoting: backticks (`) in MySQL, double quotes (") in PostgreSQL.
Invalid:
DELETE FROM opc_actuelewaardentags WHERE 'TagNaam' = 'test';
Valid:
DELETE FROM opc_actuelewaardentags WHERE TagNaam = 'test';
DELETE FROM opc_actuelewaardentags WHERE `TagNaam` = 'test';   -- MySQL only
DELETE FROM opc_actuelewaardentags WHERE "TagNaam" = 'test';   -- PostgreSQL
Update: According to the PostgreSQL docs, double quotes are the identifier quoting characters for table and column names; they are needed whenever the name is a key word or contains uppercase letters, as "TagNaam" does. So the following is valid:
DELETE FROM opc_actuelewaardentags WHERE "TagNaam" = 'test';
More is here...
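Applied to the code in the question, a minimal sketch (assuming the cur, conn, and var_tagnaam from the question) is to double-quote the identifier inside the SQL string, or build it with psycopg2.sql, and bind only the value:

from psycopg2 import sql

# Double-quote the identifier in the statement; bind the value with %s.
cur.execute('DELETE FROM opc_actuelewaardentags WHERE "TagNaam" = %s',
            (var_tagnaam,))
conn.commit()

# If the column name itself ever needs to be dynamic, compose it safely
# with psycopg2.sql instead of concatenating strings.
stmt = sql.SQL("DELETE FROM opc_actuelewaardentags WHERE {} = %s").format(
    sql.Identifier("TagNaam"))
cur.execute(stmt, (var_tagnaam,))
conn.commit()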
I don't have a PostgreSQL server, only a MySQL server.
For MySQL:
mysql> select * from user where '1' = '1';
+------+
| id |
+------+
| 2 |
| 1 |
+------+
2 rows in set (0.05 sec)
mysql> select * from user;
+------+
| id |
+------+
| 2 |
| 1 |
+------+
2 rows in set (0.00 sec)
mysql> select * from user where '1' = "1";
+------+
| id |
+------+
| 2 |
| 1 |
+------+
2 rows in set (0.00 sec)
mysql> select * from user where 'id' = "1";
Empty set (0.00 sec)
mysql> select * from user where 'id' = 1;
Empty set, 1 warning (0.02 sec)
mysql> select * from user where id = 1;
+------+
| id |
+------+
| 1 |
+------+
1 row in set (0.02 sec)
mysql> select * from user where 'id' = "id";
+------+
| id |
+------+
| 2 |
| 1 |
+------+
2 rows in set (0.00 sec)
The SQL grammar should be similar. Therefore,
cur.execute("DELETE FROM opc_actuelewaardentags WHERE 'TagNaam' = %s", (var_tagnaam,))
should be
cur.execute("DELETE FROM opc_actuelewaardentags WHERE TagNaam = %s", (var_tagnaam,))
or
cur.execute("DELETE FROM opc_actuelewaardentags WHERE `TagNaam` = %s", (var_tagnaam,))
Update: the above analysis is wrong for PostgreSQL. Simple Postgresql Statement - column name does not exists gives the answer.
RobbeM wrote: Edit 2: The query in pgAdmin worked; trying to do the same thing in Python.
I've had the same symptoms - I could delete table rows using pgAdmin or in an SQL console, but the Python code wouldn't work. The thing was, I was accidentally creating the cursor before establishing the connection with the PostgreSQL server:
c = db_conn.cursor()
db_conn = psycopg2.connect(conn_string)
So, the solution for me was to create the cursor after establishing the connection with the database:
db_conn = psycopg2.connect(conn_string)
c = db_conn.cursor()
I don't see why it's not working. I have created several databases and tables before with no problem, but I am stuck with this table, which was created from a Django data model. To clarify what I have done: when I create a new database and table from the MySQL console and insert from Python, it works. This one, however, is strange to me.
class Experiment(models.Model):
    user = models.CharField(max_length=25)
    filetype = models.CharField(max_length=10)
    createddate = models.DateField()
    uploaddate = models.DateField()
    time = models.CharField(max_length=20)
    size = models.CharField(max_length=20)
    located = models.CharField(max_length=50)
Here is the view in the MySQL console:
mysql> describe pmass_experiment;
+-------------+-------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+-------------+-------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| user | varchar(25) | NO | | NULL | |
| filetype | varchar(10) | NO | | NULL | |
| createddate | date | NO | | NULL | |
| uploaddate | date | NO | | NULL | |
| time | varchar(20) | NO | | NULL | |
| size | varchar(20) | NO | | NULL | |
| located | varchar(50) | NO | | NULL | |
+-------------+-------------+------+-----+---------+----------------+
8 rows in set (0.01 sec)
The pmass_experiment table above was created by the Django ORM after python manage.py syncdb.
Now I am trying to insert data into pmass_experiment through Python's MySQLdb:
import MySQLdb
import datetime, time
import sys

conn = MySQLdb.connect(
    host="localhost",
    user="root",
    passwd="root",
    db="experiment")
cursor = conn.cursor()

user = 'tchand'
ftype = 'mzml'
size = '10MB'
located = 'c:\\'
date = datetime.date.today()
time = str(datetime.datetime.now())[10:19]

# Insert into database
sql = """INSERT INTO pmass_experiment (user,filetype,createddate,uploaddate,time,size,located)
VALUES (user, ftype, date, date, time, size, located)"""
try:
    # Execute the SQL command
    cursor.execute(sql)
    # Commit your changes in the database
    conn.commit()
except:
    # Rollback in case there is any error
    conn.rollback()

# disconnect from server
conn.close()
But, unfortunately, nothing is being inserted. I am guessing it may be due to the primary key (id) in the table not incrementing automatically.
mysql> select * from pmass_experiment;
Empty set (0.00 sec)
Can you simply point out my mistake?
Thanks
sql = """INSERT INTO pmass_experiment (user,filetype,createddate,uploaddate,time,size,located)
VALUES (user, ftype, date, date, time, size, located)"""
Parametrize your sql and pass in the values as the second argument to cursor.execute:
sql = """INSERT INTO pmass_experiment (user,filetype,createddate,uploaddate,time,size,located)
VALUES (%s, %s, %s, %s, %s, %s, %s)"""
try:
# Execute the SQL command
cursor.execute(sql,(user, ftype, date, date, time, size, located))
# Commit your changes in the database
conn.commit()
except Exception as err:
# logger.error(err)
# Rollback in case there is any error
conn.rollback()
It is a good habit to always parametrize your sql since this will help prevent sql injection.
The original sql
INSERT INTO pmass_experiment (user,filetype,createddate,uploaddate,time,size,located)
VALUES (user, ftype, date, date, time, size, located)
seems to be valid. An experiment in the mysql shell shows it inserts a row of NULL values:
mysql> insert into foo (first,last,value) values (first,last,value);
Query OK, 1 row affected (0.00 sec)
mysql> select * from foo order by id desc;
+-----+-------+------+-------+
| id | first | last | value |
+-----+-------+------+-------+
| 802 | NULL | NULL | NULL |
+-----+-------+------+-------+
1 row in set (0.00 sec)
So I'm not sure why you are not seeing any rows committed to the database table; one likely reason is that every column in pmass_experiment is declared NOT NULL, so the all-NULL insert fails and the bare except silently rolls it back. In any case, the original SQL is probably not doing what you intend.
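Pulling the pieces together, here is a compact, hedged sketch of the corrected insert (connection details copied from the question; MySQLdb adapts Python date/datetime objects to SQL literals when they are passed as parameters):

import datetime
import MySQLdb

conn = MySQLdb.connect(host="localhost", user="root", passwd="root", db="experiment")
cursor = conn.cursor()

sql = """INSERT INTO pmass_experiment
             (user, filetype, createddate, uploaddate, time, size, located)
         VALUES (%s, %s, %s, %s, %s, %s, %s)"""
params = ("tchand", "mzml", datetime.date.today(), datetime.date.today(),
          str(datetime.datetime.now().time())[:8], "10MB", "c:\\")

try:
    cursor.execute(sql, params)      # the driver quotes and escapes each value
    conn.commit()
except MySQLdb.Error as err:
    print("insert failed:", err)     # surface the error instead of swallowing it
    conn.rollback()
finally:
    conn.close()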