I need to export some rows from a table in a PostgreSQL database to a .csv file using a Python script:
#!/usr/bin/python
# -*- coding: utf-8 -*-
import sys, psycopg2
...
conn = psycopg2.connect("dbname=dbname user=user password=password")
cur = conn.cursor()
sql = "\copy (SELECT * FROM table WHERE month=6) TO '/mnt/results/month/table.csv' WITH CSV DELIMITER ';';"
cur.execute(sql)
cur.close()
...
But when I run the script I get this:
Syntax error at or near «\»
LINE 1: \copy (SELECT * FROM TABLE WHERE month=6) TO '...
Does anyone know what could be wrong, or can give me a tip?
\copy is not an SQL command; it is specific to the PostgreSQL terminal client psql and cannot be used in this context.
Use copy_expert(sql, file, size=8192) instead, e.g.:
sql = "COPY (SELECT * FROM a_table WHERE month=6) TO STDOUT WITH CSV DELIMITER ';'"
with open("/mnt/results/month/table.csv", "w") as file:
    cur.copy_expert(sql, file)
Read more about the function in the documentation.
\copy is a psql command (client-side) and therefore not valid SQL. Try the same statement using plain COPY (without the backslash).
If you want the output file directly on the client, you may want to use COPY ... TO STDOUT; otherwise the file will be created on the database server, which requires filesystem access not everyone has.
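For illustration, a minimal psycopg2 sketch of both variants might look like this (connection parameters, table, and paths are taken from the question and are placeholders; the server-side form also needs write privileges on the server's filesystem):
import psycopg2

conn = psycopg2.connect("dbname=dbname user=user password=password")
cur = conn.cursor()

# Server-side: the file is written on the database server, which requires
# appropriate privileges there (e.g. superuser or pg_write_server_files).
cur.execute(
    "COPY (SELECT * FROM a_table WHERE month=6) "
    "TO '/mnt/results/month/table.csv' WITH CSV DELIMITER ';'"
)

# Client-side: COPY ... TO STDOUT is streamed into a local file via copy_expert.
with open("table.csv", "w") as f:
    cur.copy_expert(
        "COPY (SELECT * FROM a_table WHERE month=6) TO STDOUT WITH CSV DELIMITER ';'",
        f,
    )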
Yes, the statements above are correct: \copy is not an SQL command.
If you want to run it exactly as stated in the question, you can invoke it as a shell command from the Python script.
E.g.:
import os
cmd1 = " Your SQL to csv Command"
os.system(cmd1)
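For instance, a sketch of what that shell command might look like, reusing the \copy statement from the question (the psql flags and credentials are placeholders you would adjust):
import os

# psql's -c flag runs a single command; since psql is the client, the
# backslash \copy meta-command is available here and writes the CSV locally.
cmd1 = (
    "psql -U user -d dbname -c "
    "\"\\copy (SELECT * FROM table WHERE month=6) "
    "TO '/mnt/results/month/table.csv' WITH CSV DELIMITER ';'\""
)
os.system(cmd1)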
You can use this syntax:
query = """ select * from instalacje WHERE date > '2022-02-20'"""
outputquery = "COPY ({0}) TO STDOUT WITH CSV HEADER".format(query)
with open("C:/1/Wojciech_CCC.csv", "w") as f:
cur.copy_expert(outputquery, f)
For example, let's say I want to access a PostgreSQL database in the shell and run the command select *;. This would require me to run:
psql -U postgres
<insert the password> (optional)
select *;
and ideally get the intermediate output after each step. This is only a toy example; for doing this SQLAlchemy would be a good pick, but it should still be possible through Python's subprocess module.
What I have tried based on this post:
start = f"psql -U postgres"
fw = open("tmpout.txt", "wb")
fr = open("tmpout.txt", "r")
p = subprocess.Popen(start, stdin=subprocess.PIPE, stdout=fw, stderr=fw, bufsize=1,
                     shell=True)
p.stdin.write(bytes("select *;", 'utf-8'))
out = fr.read()  # Here I would expect the result of the select, but it doesn't terminate...
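One plausible fix (a sketch, not from the original post) is to send the statement and close stdin via communicate(), so psql runs it, prints the result, and exits; SELECT 1; stands in for the toy query, and this assumes psql can authenticate without an interactive password prompt:
import subprocess

# Pass the SQL on stdin; communicate() closes the pipe so psql terminates
# and flushes its output instead of waiting for more input.
p = subprocess.Popen(
    ["psql", "-U", "postgres"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    text=True,
)
out, err = p.communicate("SELECT 1;\n")
print(out)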
I've discovered Oracle's SQLCL and have been able to make it work in the terminal. I've also been able to make it work in Python, up through entering the actual SQL query.
My code in Python looks like this:
import subprocess
import time
import os
os.chdir("C:/sqlcl/bin")
subprocess.run(["sql", "username/password@//database-oracle.datamore.com/moreprod.more:1521"])
At this point, I get the "SQL>" prompt showing that Oracle is ready to take my query. What I'd like to do is enter the location to a script and have it be executed, something like:
@C:/Users/username/queries/test-query.sql;
Basically, I need a way to pass a SQL statement or a script location via Python to the SQL prompt.
Here is how to pass input to a subprocess:
import subprocess

args = ["grep", "beans"]
child_process = subprocess.Popen(args, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
child_process_output = child_process.communicate(b"I love beans \n I like cars")[0]
print(child_process_output)
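Applied to the SQLCL prompt from the question, the same communicate() pattern might look like the sketch below (the connect string and script path are the question's placeholders, and it assumes the sql client reads commands from stdin):
import subprocess

# Feed the script-run command to the sql prompt over stdin, then exit so
# communicate() returns once the captured output is complete.
proc = subprocess.Popen(
    ["sql", "username/password@//database-oracle.datamore.com/moreprod.more:1521"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
)
output = proc.communicate(b"@C:/Users/username/queries/test-query.sql\nexit\n")[0]
print(output)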
Here's what worked:
import subprocess
import os
os.chdir("C:/sqlcl/bin")
subprocess.run(["sql",
"username/password#//database-oracle.datamore.com/moreprod.more:1521",
"#",
"C:/sqlcl/bin/test-query.sql",
";"])
Note that the query is in the same directory as the SQLCL application.
import pexpect
from os import environ
import sys
environ['TNS_ADMIN'] = wallet
sqlcl_bin = '/sqlcl_source/sqlcl/bin/sql /nolog' # path to sqlcl bin
child = pexpect.spawn(sqlcl_bin)
child.logfile = sys.stdout.buffer
child.expect('SQL>', timeout=30)
user=
password=
service=
conn_string = f"conn {user}/{password}@{service};"
child.sendline(conn_string)
This is how I connect
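A possible continuation of this sketch (the prompt string and the query are assumptions) waits for the prompt to come back and then sends a statement:
child.expect('SQL>', timeout=30)       # wait until the connection is established
child.sendline('select 1 from dual;')  # send a test statement
child.expect('SQL>', timeout=30)       # wait for the prompt again
print(child.before.decode())           # output produced between the two prompts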
I have a Python program and I need to know what the following block of code does,
particularly the dbcon = sqlite3.connect command. What are the parameters in the parentheses?
dbcon = sqlite3.connect(sys.argv[1] + '.sqlite')
dbcurs = dbcon.cursor()
dbcurs.execute('''DROP TABLE IF EXISTS acc''')
dbcurs.execute('''CREATE TABLE acc (time REAL, x INTEGER, y INTEGER, z INTEGER)''')
dbcurs.execute('''CREATE INDEX time_hash ON acc (time)''')
sys.argv holds the command line arguments passed to the python script. So sys.argv[1] would be the first argument after the name of the script, which is sys.argv[0]. I assume that this argument is the location of a sqlite database, minus the .sqlite extension. See the docs for the sqlite3 and sys modules.
For example, if you typed python myscript.py /path/to/my/db into the command line, where myscript.py is the name of your script, the line dbcon = sqlite3.connect(sys.argv[1] + '.sqlite') would try to open a database connection for a database file located at "/path/to/my/db.sqlite".
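As a small sketch of that behavior (the print line is only added for illustration):
import sys
import sqlite3

# Invoked as: python myscript.py /path/to/my/db
print(sys.argv)                                   # ['myscript.py', '/path/to/my/db']
dbcon = sqlite3.connect(sys.argv[1] + '.sqlite')  # opens /path/to/my/db.sqlite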
I would like to execute the MySQL query source FileName.sql in a Python script on Linux.
I am able to execute other queries like SELECT * FROM table_name, but this one gives an error. I am executing this on a Linux server with a MySQL database using Python. The frontend I am using is PuTTY.
The Python script I have used is:
import MySQLdb
db = MySQLdb.connect("hostname","username","pswrd","dbname")
cursor = db.cursor()
cursor.execute("source FileName.sql")
db.close()
How can I execute the command source FileName.sql from the location where the file FileName.sql resides?
source is not an SQL command. It's a MySQL CLI command; it only exists in the console application mysql (and wherever else it is implemented). All it does is read the contents of FileName.sql and issue the SQL commands inside.
To do this in Python, you can use something like the following.
Edit: this assumes you have one query per line! If you have multi-line queries, you'll have to find a way to extract each query from the file.
import MySQLdb
db = MySQLdb.connect("hostname","user","pass","db")
cursor = db.cursor()
for line in open("FileName.sql"):
    cursor.execute(line)
db.close()
You can execute a shell command from Python and import your SQL file.
This example is for MySQL:
import subprocess

# Shell redirection ("< file.sql") is not available when passing an argument list,
# so open the file in Python and hand it to mysql as stdin instead.
command = "mysql -u username --password=p#55W0rD database_name".split()
with open("file.sql") as sql_file:
    p = subprocess.Popen(command, stdin=sql_file, stdout=subprocess.PIPE)
    p.communicate()  # you can see if errors are returned
If your SQL file creates a database, remove database_name.
sources:
https://docs.python.org/3/library/subprocess.html#popen-constructor
https://dev.mysql.com/doc/refman/8.0/en/mysql-batch-commands.html
Split the statements in the SQL file in Python using ";" as the delimiter, then execute each command iteratively.
awesome5team developed a nice solution in https://github.com/awesome5team/General-Resources-Box/issues/7
Code snippet from the same:
import mysql.connector
cnx = mysql.connector.connect(user='root',
password='YOUR-PASSWORD-FOR-MYSQL',
host='localhost',
database='YOUR-DATABASE-NAME')
cursor = cnx.cursor()

def executeScriptsFromFile(filename):
    fd = open(filename, 'r')
    sqlFile = fd.read()
    fd.close()
    sqlCommands = sqlFile.split(';')
    for command in sqlCommands:
        try:
            if command.strip() != '':
                cursor.execute(command)
        except IOError as msg:
            print("Command skipped: ", msg)

executeScriptsFromFile('SQL-FILE-LOCATION')
cnx.commit()
I'm trying to run this command through KennethReitz's Envoy package:
$ sqlite3 foo.db 'select * from sqlite_master'
I've tried this:
r = envoy.run("sqlite3 foo.db 'select * from sqlite_master'")
sqlite3: Error: too many options: "*"
and this:
r = envoy.run(['sqlite3', 'foo.db', 'select * from sqlite_master'])
AttributeError: 'NoneType' object has no attribute 'returncode'
Additional quoting and escaping doesn't seem to help. Any suggestions?
FYI: This is what I had to do for now:
cmd = "sqlite3 %(database)s 'select * from sqlite_master'" % locals()
os.system(cmd)
Note that this is a contrived example, and that most of the unix shell commands that I'd like to issue aren't just a simple select that could be easily done via SQLAlchemy.
This will not work in envoy because envoy splits the command and passes the pieces to subprocess. Even if you try subprocess.Popen(command, shell=False) you will end up in the sqlite3 terminal. Both subprocess and envoy fail to address this; I would be happy if you could open an issue on envoy, since I am contributing to it and will be thinking about this.
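For reference, a split that keeps the quoted query as a single argument, which is what the subprocess-based answers below rely on, can be produced with shlex; this only illustrates the quoting issue and says nothing about envoy's internal splitting:
import shlex

# The quoted SQL must survive as one argument for sqlite3 to treat it as a query.
print(shlex.split("sqlite3 foo.db 'select * from sqlite_master'"))
# ['sqlite3', 'foo.db', 'select * from sqlite_master']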
You could use subprocess:
from subprocess import check_output as qx
output = qx(['sqlite3', 'foo.db', 'select * from sqlite_master'])
print(output)
Or sqlite3 module:
import sqlite3
conn = sqlite3.connect('foo.db')
for row in conn.execute('select * from sqlite_master'):
    print(row)
If you still want to use envoy then you could fix it as:
import envoy
r = envoy.run([["sqlite3", "foo.db", "select * from sqlite_master"]])
print(r.std_out)