Can't retrieve OUT parameters from stored procedures (MySQL) - python

I'm trying to call a stored procedure from MySQL and obtain an OUT parameter.
Here is my code.
Call procedure from Python
from django.db import connection
# ...
cursor = connection.cursor()
out_arg1 = ""
args = [in_arg1, in_arg2, out_arg1]
result = cursor.callproc('some_procedure', args)
print(args[2], result[2])
cursor.close()
# ...
MySQL procedure
CREATE DEFINER=`root`@`localhost` PROCEDURE `some_procedure`(IN `in_arg1` VARCHAR(255) CHARSET utf8, IN `in_arg2` VARCHAR(255) CHARSET utf8, OUT `out_arg1` VARCHAR(255) CHARSET utf8)
MODIFIES SQL DATA
BEGIN
proc:begin
set out_arg1="Result";
end;
END;
I've checked args and the result returned by cursor.callproc, but no data has changed.
Any ideas why this is happening?
Thanks in advance.
P.S. I've tried calling this procedure from the MySQL console and everything is OK.

Use this approach. Django's built-in MySQL backend uses MySQLdb (mysqlclient), whose callproc copies the procedure arguments into server-side session variables named @_procname_N (N being the argument's zero-based position), so the OUT value has to be read back with a separate SELECT:
from django.db import connection
# ...
cursor = connection.cursor()
out_arg1 = ""
args = [in_arg1, in_arg2, out_arg1]
result = cursor.callproc('some_procedure', args)
cursor.execute('SELECT @_some_procedure_2')
print(cursor.fetchall())
#print(args[2], result[2])
cursor.close()
# ...
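For completeness, a minimal sketch of the whole round trip as a standalone helper (the helper name and the in_arg1/in_arg2 placeholders are mine; the @_some_procedure_N variable names follow MySQLdb's callproc convention):

from django.db import connection

def call_some_procedure(in_arg1, in_arg2):
    with connection.cursor() as cursor:
        # The third slot is only a placeholder for the OUT argument.
        cursor.callproc('some_procedure', [in_arg1, in_arg2, ''])
        # MySQLdb leaves the arguments in @_some_procedure_0..2 on the server,
        # so the OUT parameter (index 2) is read back with a plain SELECT.
        cursor.execute('SELECT @_some_procedure_2')
        return cursor.fetchone()[0]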

Related

psycopg2 Insert into fails

I've got some Python code (psycopg2) which should insert data into a database:
def debug(self):
    try:
        self.connection.execute(
            "SELECT test();")
        res = self.connection.fetchall()
        print(res)
    except Exception as e:
        print(e)
        return
The test() function in pgsql is this:
CREATE OR REPLACE FUNCTION test(
) RETURNS setof varchar
AS $Body$
BEGIN
    INSERT INTO Linie(name) VALUES('3');
    RETURN QUERY(SELECT * FROM linie);
END;
$Body$ LANGUAGE plpgsql;
When I change the "name" value and execute the query in pgAdmin, there is a new entry in the database. However, when calling the function from Python, it always overrides the value.
The table is defined as follows:
CREATE TABLE Linie(
name varchar,
PRIMARY KEY (name)
);
For example, with pgAdmin I can insert 1, 2, 3, 4, 5.
With Python, after running 5 equivalent queries, only 5 is there.
Calling the test function with nodeJS works fine.
When calling the function once from Python, then changing the insert value and calling it from Python again, the values are not replaced but inserted.
Also, it does not throw any errors and returns the table as it should (except for the replaced value).
Why is this happening and what can I do about it?
Psycopg2 by default will not commit changes made to the database unless you explicitly call connection.commit() after executing your SQL. You could alter your code like so:
def debug(self):
    try:
        self.connection.execute(
            "SELECT test();")
        res = self.connection.fetchall()
        self.connection.commit()
        print(res)
    except Exception as e:
        print(e)
        return
However, please be careful doing this, as I have no information on what exactly self.connection is an instance of; I have therefore assumed it to be of type connection :)
Alternatively, when you set up your connection to the DB, set the property autocommit to True, as documented here. Example:
self.connection = psycopg2.connect(user='foo', password='bar', host='localhost', dbname='mydb')
self.connection.autocommit = True
If you are already using autocommit let me know and I'll have another look at your question.
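Another option, a minimal sketch assuming self.connection really is a psycopg2 connection object (not a cursor): use it as a context manager, which commits the transaction on a successful exit and rolls back if the block raises (psycopg2 2.5+):

def debug(self):
    # The connection context manager commits on success and rolls back on error;
    # the cursor context manager simply closes the cursor afterwards.
    with self.connection, self.connection.cursor() as cursor:
        cursor.execute("SELECT test();")
        res = cursor.fetchall()
        print(res)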

Psycopg2 : Create a table in a stored procedure Postgres

Stored Procedure :
CREATE OR REPLACE FUNCTION try_create() RETURNS INT AS $$
BEGIN
    CREATE TABLE hello(id SERIAL PRIMARY KEY, name TEXT);
    RETURN 1;
END;
$$ LANGUAGE plpgsql;
test.py
import psycopg2
conn = psycopg2.connect(user='a', password='a', dbname='a')
cur = conn.cursor()
cur.callproc('try_create', ())
print cur.fetchall()
I am trying to create a stored procedure which will create a table named hello, and I am invoking it from a Python script. Upon running the above script I see the following output:
[root@localhost partitioning]# python test.py
[(1,)]
But the table is not created in the db. Am I doing something wrong here? Thanks.
You should commit the transaction, add the commands:
...
conn.commit()
conn.close()
Alternatively, you can set the connection in autocommit mode:
conn = psycopg2.connect(user='a', password='a', dbname='a')
conn.autocommit = True
cur = conn.cursor()
cur.callproc('try_create', ())
conn.close()
Read more about transactions in psycopg2.
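Putting it together, a minimal sketch of the commit-based variant (credentials and procedure taken from the question):

import psycopg2

conn = psycopg2.connect(user='a', password='a', dbname='a')
cur = conn.cursor()
cur.callproc('try_create', ())
print(cur.fetchall())  # [(1,)]
# Persist the CREATE TABLE performed inside the function.
conn.commit()
cur.close()
conn.close()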

Executing SQL Server stored procedures with parameters in Python

I'm trying to execute a stored procedure to query a table, and am having trouble passing through a parameter successfully.
title = cursor.execute("SELECT titlequery(%s)", str(member_id))
titlequery() is created by this:
CREATE OR REPLACE FUNCTION public.titlequery(mid text)
RETURNS text AS
$BODY$
SELECT title FROM Member WHERE member_id=mid;
$BODY$
LANGUAGE sql
And the error I'm getting:
modules.pg8000.core.ProgrammingError: ('ERROR', '42P18', 'could not determine data type of parameter $2', 'postgres.c', '1356', 'exec_parse_message', '', '')
Does anyone know what's happening here?
PEP 249 specifies the API for database drivers, and pg8000 follows this API as well:
pg8000 is a DB-API 2.0 compatible pure-Python interface to the PostgreSQL database engine.
From the PEP 249 specification of the execute method:
Parameters may be provided as sequence or mapping and will be bound to variables in the operation.
We can see an example of how to pass parameters to a query in the pg8000 sources.
So you should pass a tuple/list of values, not the value itself.
Also, we should execute the query first and then fetch its results using fetchone, fetchmany, or fetchall, because execute itself returns None (more at the sources). I guess the OP needs one record, so we're going to use fetchone.
Note: the fetchone method returns the record as a tuple, so if we need its first element, we should get it with index zero.
In your case you should try:
parameters = (str(member_id),) # WARNING: don't miss the comma
cursor.execute("SELECT titlequery(%s)", parameters)
title = cursor.fetchone()[0]
or
parameters = [str(member_id)]
cursor.execute("SELECT titlequery(%s)", parameters)
title = cursor.fetchone()[0]
Example
This worked for me
import pg8000
table_definition = """
CREATE TABLE Member(
title VARCHAR(40) NOT NULL,
member_id VARCHAR(40) NOT NULL)
"""
procedure_definition = """
CREATE OR REPLACE FUNCTION public.titlequery(mid text)
RETURNS text AS
$BODY$
SELECT title FROM Member WHERE member_id=mid;
$BODY$
LANGUAGE sql
"""
connection = pg8000.connect(database='database',
user='username',
password='password',
host='hostname',
port=5432)
cursor = connection.cursor()
# Preparation
cursor.execute(table_definition)
cursor.execute(procedure_definition)
values = ('Information', 'A000042553')
cursor.execute('INSERT INTO Member (title, member_id) VALUES (%s, %s)',
values)
# Reading stored procedure result
parameters = ('A000042553',)
cursor.execute("SELECT titlequery(%s)", parameters)
title = cursor.fetchone()[0]
print(title)
# Cleanup
cursor.close()
connection.close()
gives us
Information

Python mysql: how do I loop through a table and regex-replace a field?

I am trying to iterate over a table, fetch the rows in which a field matches a pattern, and then update those rows with a match group.
The following code runs without error, and the two print lines before the update statement output the correct values. I have followed similar answers to come up with the update statement, and the logic seems right to me. However, the code does not work, i.e., no rows are updated. Where did I go wrong? Thanks,
# -*- coding: utf-8 -*-
import re
import MySQLdb
pattern = re.compile('#(.*)#.*$')
conn = MySQLdb.connect(
    host='localhost', user='root',
    passwd='password', db='j314', charset='utf8')
cursor = conn.cursor()
cursor.execute(
    """select `id`, `created_by_alias` from w0z9v_content where `catid` = 13 AND `created_by_alias` regexp "^#.*#.*$" limit 400""")
aliases = cursor.fetchall()
for alias in aliases:
    newalias = pattern.match(alias[1])
    if newalias.group(1) is not None:
        # print alias[0]
        # print newalias.group(1)
        cursor.execute("""
            update w0z9v_content set created_by_alias = %s where id = %s""", (newalias.group(1), alias[0]))
conn.close
autocommit is probably globally disabled on the server.
Execute either COMMIT after your updates, or SET autocommit=1 at the beginning of the session.
http://dev.mysql.com/doc/refman/5.0/en/commit.html
Also, you're not actually closing the connection, you forgot to call close:
conn.close()
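For reference, a minimal sketch of the committed version (table, credentials, and pattern taken from the question; the match guard is an extra safety check of mine):

import re
import MySQLdb

pattern = re.compile('#(.*)#.*$')
conn = MySQLdb.connect(
    host='localhost', user='root',
    passwd='password', db='j314', charset='utf8')
cursor = conn.cursor()
cursor.execute(
    'select `id`, `created_by_alias` from w0z9v_content '
    'where `catid` = 13 and `created_by_alias` regexp "^#.*#.*$" limit 400')
for row_id, old_alias in cursor.fetchall():
    match = pattern.match(old_alias)
    if match:  # skip rows that no longer match the pattern
        cursor.execute(
            'update w0z9v_content set created_by_alias = %s where id = %s',
            (match.group(1), row_id))
# Make the updates permanent, then actually close the connection.
conn.commit()
cursor.close()
conn.close()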

DB-API with Python

I'm trying to insert some data into a local MySQL database by using MySQL Connector/Python -- apparently the only way to integrate MySQL into Python 3 without breaking out the C compiler.
I tried all the examples that come with the package; the ones that execute statements enter data just fine. Unfortunately, my own attempts to write anything into my tables fail.
Here is my code:
import mysql.connector

def main(config):
    db = mysql.connector.Connect(**config)
    cursor = db.cursor()
    stmt_drop = "DROP TABLE IF EXISTS urls"
    cursor.execute(stmt_drop)
    stmt_create = """
        CREATE TABLE urls (
            id TINYINT UNSIGNED NOT NULL AUTO_INCREMENT,
            str VARCHAR(50) DEFAULT '' NOT NULL,
            PRIMARY KEY (id)
        ) CHARACTER SET 'utf8'"""
    cursor.execute(stmt_create)
    cursor.execute("""
        INSERT INTO urls (str)
        VALUES
            ('reptile'),
            ('amphibian'),
            ('fish'),
            ('mammal')
        """)
    print("Number of rows inserted: %d" % cursor.rowcount)
    db.close()

if __name__ == '__main__':
    import config
    config = config.Config.dbinfo().copy()
    main(config)
OUTPUT:
Number of rows inserted: 4
I based my code strictly on what was given in the examples and can't, for the life of me, figure out what the problem is. What am I doing wrong here?
Fetching table data with the script works just fine, so I am not worried about the configuration files. I'm root on the database, so rights shouldn't be a problem either.
You need to add a db.commit() to commit your changes before you db.close()!
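A minimal sketch of where the commit belongs, assuming the urls table from the question already exists and config holds the same connection settings:

import mysql.connector

def main(config):
    db = mysql.connector.Connect(**config)
    cursor = db.cursor()
    cursor.execute("INSERT INTO urls (str) VALUES ('reptile'), ('amphibian')")
    print("Number of rows inserted: %d" % cursor.rowcount)
    # Without this, the INSERT is rolled back when the connection closes
    # (Connector/Python turns autocommit off by default).
    db.commit()
    cursor.close()
    db.close()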
