Returning Output of Python CGI MySQL Script

I'm very new to Python and MySQL and this is my first Stack question. So, apologies in advance if I'm missing something obvious. But, I really did try to research this before asking.
I'm trying to learn the basics of Python, MySQL, and CGI scripting. To that end, I've been reading tutorials at http://www.tutorialspoint.com/python/python_cgi_programming.htm and http://www.tutorialspoint.com/python/python_database_access.htm, among others.
I'm trying to have a curl GET or Python Requests GET call a Python CGI script on a test server. That CGI script would then perform a read on a local MySQL database and return the results to curl or to Python Requests.
The Python CGI script I've created outputs the MySQL data perfectly to the terminal on the remote test server. But it won't return that output to the curl or Python Requests call that triggered the CGI script.
This is the Python CGI script I've cobbled together:
#!/usr/bin/python
import MySQLdb

# Open database connection
db = MySQLdb.connect("localhost", "testuser", "test123", "TESTDB")

# prepare a cursor object using cursor() method
cursor = db.cursor()

# Prepare SQL query to SELECT records from the database.
sql = "SELECT * FROM EMPLOYEE \
       WHERE INCOME > '%d'" % (1000)

try:
    # Execute the SQL command
    cursor.execute(sql)
    # Fetch all the rows in a list of lists.
    results = cursor.fetchall()
    for row in results:
        fname = row[0]
        lname = row[1]
        age = row[2]
        sex = row[3]
        income = row[4]
        # Now print fetched result
        print "fname=%s,lname=%s,age=%d,sex=%s,income=%d" % \
              (fname, lname, age, sex, income)
except:
    print "Error: unable to fetch data"

# disconnect from server
db.close()
My guess is that I need to somehow pass the data from the SQL query back to Python or back to the requesting process through some sort of Return or Output statement. But, all my best efforts to research this have not helped. Can anyone point me in the right direction? Many thanks for any help!
Marc :-)

First, as per the CGI tutorial you link, you need to output the content type you are working with:
print "Content-type: text/plain\r\n\r\n",
If you don't at least have the newlines in there, HTTP clients will think your content is supposed to be part of the headers and get confused; they will probably assume the document you asked for is empty.
Next, you need a CGI server. Python comes with one. Place your script in a subdirectory called cgi-bin and run this command (from the parent directory):
python -m CGIHTTPServer
The url to call will look something like this:
curl http://localhost:8000/cgi-bin/cgitest.py
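Putting it all together, here is a minimal sketch of the corrected script, assuming the same credentials and table as in the question. Note that I've also swapped the hand-formatted SQL for a parameterized query, which is my own substitution, not part of the original:

#!/usr/bin/python
import MySQLdb

# The content-type header (with a trailing blank line) must be the very
# first thing the script outputs.
print "Content-type: text/plain\r\n\r\n",

db = MySQLdb.connect("localhost", "testuser", "test123", "TESTDB")
cursor = db.cursor()
try:
    # Let the driver quote the parameter instead of formatting it by hand.
    cursor.execute("SELECT * FROM EMPLOYEE WHERE INCOME > %s", (1000,))
    for (fname, lname, age, sex, income) in cursor.fetchall():
        print "fname=%s,lname=%s,age=%d,sex=%s,income=%d" % \
              (fname, lname, age, sex, income)
except MySQLdb.Error, e:
    print "Error: unable to fetch data: %s" % e
finally:
    db.close()

Run it under the CGI server above, and the curl call should now receive the rows as the response body.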

Related

flask sqlalchemy update sql raw command does not provide any response

I'm trying to perform an update using flask-sqlalchemy, but when it gets to the update statement it does not return anything; the script seems to hang or do nothing.
I tried wrapping a try/except around the code that does not complete, but there are no errors.
I gave it 10 minutes to complete the update statement, which only updates one record, and it still does nothing.
When I cancel the script, it reports the error Communication link failure (0) (SQLEndTran), but I don't think this is the root cause, because other SQL statements in the same script work fine, so the connection to the database is good.
What my script does is get a list of filenames that I need to process (I have no issues with this). Then, using the retrieved list of filenames, it looks in a directory to check whether each file exists; if it does not exist, it updates the database to tag the file as not found. This is where I get the issue: it does not perform the update, nor does it provide an error message of any sort.
I even tried creating a new engine just for the update statement, but I still get the same behavior.
I also tried printing out the SQL statement from Python before executing it. I ran the printed SQL command in my SQL browser and it worked fine.
The code is very simple; I'm not really sure why it's having this issue.
#!/usr/bin/env python3
from flask_sqlalchemy import sqlalchemy
import glob

files_directory = "/files_dir/"

sql_string = """
select *
from table
where status is null
"""

# omitted conn_string
engine1 = sqlalchemy.create_engine(conn_string)

result = engine1.execute(sql_string)
for r in result:
    engine2 = sqlalchemy.create_engine(conn_string)
    filename = r[11]
    match = glob.glob(f"{files_directory}/**/{filename}.wav")
    if not match:
        print('no match')
        script = "update table set status = 'not_found' where filename = '" + filename + "' "
        engine2.execute(script)
        engine2.dispose()
        continue
engine1.dispose()
It appears that if I try to loop through 26k records, the script doesn't work, but when I process batches of 2k records per run, the script works. So my SQL string becomes (adding top 2000 to the query):
sql_string = """
select top 2000 *
from table
where status is null
"""
It's manual, yeah, but it works for me since I just need to run this script once. (I mean 13 times.)
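For reference, a minimal sketch of a fully automated batched loop. This is an assumption-laden rewrite, not the asker's exact fix: it reuses a single engine instead of creating one per row, tags every row (adding a hypothetical 'found' status) so processed rows drop out of the next batch and the loop terminates, and uses sqlalchemy.text() with bound parameters instead of string concatenation. conn_string is omitted as in the question, and "top 2000" keeps the MSSQL-style syntax used above:

from flask_sqlalchemy import sqlalchemy
import glob

files_directory = "/files_dir/"
engine = sqlalchemy.create_engine(conn_string)  # conn_string omitted, as above

while True:
    # Each pass fetches the next batch of rows that have no status yet.
    rows = engine.execute(
        "select top 2000 * from table where status is null").fetchall()
    if not rows:
        break
    for r in rows:
        filename = r[11]
        # recursive=True makes ** actually descend into subdirectories.
        match = glob.glob(f"{files_directory}/**/{filename}.wav", recursive=True)
        # Tag every row ('found' is a made-up status) so it leaves the
        # result set; only missing files get 'not_found', as above.
        status = 'found' if match else 'not_found'
        # Bound parameters avoid quoting problems in filenames.
        engine.execute(
            sqlalchemy.text("update table set status = :s where filename = :f"),
            {"s": status, "f": filename})
engine.dispose()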

inserting data to postgres using python

Below is a sample of the code I am using to push data from one Postgres server to another. I am trying to move 28 million records. This worked perfectly from SQL Server to Postgres, but now that it's Postgres to Postgres, it hangs on the line
sourcecursor.execute('select * from "schema"."reallylargetable"; ')
It never reaches any of the other statements to get to the iterator.
I get this message:
psycopg2.DatabaseError: out of memory for query result at the select query statement.
#cursors for aiods and ili#
sourcecursor = sourceconn.cursor()
destcursor = destconn.cursor()

#name of temp csv file
filenme = 'filename.csv'

#definition that uses fetchmany to iterate through data in batches; default arraysize is 1000#
def ResultIterator(cursor, arraysize=1000):
    'iterator using fetchmany and consumes less memory'
    while True:
        results = cursor.fetchmany(arraysize)
        if not results:
            break
        for result in results:
            yield result

#set data for the cursor#
print("start get data")
#it is not going past the line below; it errors with "out of memory for query result"
sourcecursor.execute('select * from "schema"."reallylargetable"; ')

print("iterator")
dataresults = ResultIterator(sourcecursor)

# ... do something with dataresults ...
Please change this line:
sourcecursor = sourceconn.cursor()
to name your cursor (use whatever name pleases you):
sourcecursor = sourceconn.cursor('mysourcecursor')
This directs psycopg2 to open a PostgreSQL server-side named cursor for your query. Without a named cursor on the server side, psycopg2 attempts to grab all rows when executing the query, which is what runs your client out of memory.
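A minimal sketch of the named-cursor pattern (the DSN is a placeholder; the table name is taken from the question). The ResultIterator from the question works unchanged on top of such a cursor, since fetchmany then pulls rows from the server in batches:

import psycopg2

# Placeholder DSN; use your real connection parameters.
sourceconn = psycopg2.connect("dbname=sourcedb user=me host=localhost")

# Naming the cursor makes psycopg2 declare a server-side cursor, so rows
# are streamed in chunks instead of materialized all at once in memory.
sourcecursor = sourceconn.cursor('mysourcecursor')
sourcecursor.itersize = 10000  # rows fetched per network round trip

sourcecursor.execute('select * from "schema"."reallylargetable";')
for row in sourcecursor:
    pass  # ... do something with each row ...

sourcecursor.close()
sourceconn.close()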

Python script not being interpreted

I'm very new to Python, seriously, painful newb question here.
Uploaded domainname.com/test.py to server containing nothing but:
#output data
import sqlite3

connection = sqlite3.connect("company.db")
cursor = connection.cursor()

cursor.execute("SELECT * FROM employee")
print("fetchall:")
result = cursor.fetchall()
for r in result:
    print(r)

cursor.execute("SELECT * FROM employee")
print("\nfetch one:")
res = cursor.fetchone()
print(res)
It's literally just displaying the source as output:
#output data
import sqlite3

connection = sqlite3.connect("company.db")
cursor = connection.cursor()

cursor.execute("SELECT * FROM employee")
print("fetchall:")
result = cursor.fetchall()
for r in result:
    print(r)

cursor.execute("SELECT * FROM employee")
print("\nfetch one:")
res = cursor.fetchone()
print(res)
In PHP or ASP I would use <% %>; I feel like I'm missing something obvious here. In PHP I could easily output this to the screen, but I can't seem to find a basic explanation of how to do this in Python.
I tried domainname.com/test.cgi and moved it to various folders, etc.; still nothing.
The server runs a Linux operating system. The host recommends Python 2.7, which is enabled. I'm just using the Sublime Text editor. I do have the latest Python 3.8 on my own machine; presumably this simple code should still work.
What am I missing to enable Python to run server-side?
Add the shebang, with the Python interpreter location, at the first line of the file, for example:
#!/usr/local/bin/python
print('Ciao')
# Other code....
The interpreter could be installed in different locations depending on the server configuration; in my example it was /usr/local/bin/python.
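If the script runs as a CGI under the web server (as with test.cgi here), it also needs to print an HTTP header before any other output. A minimal sketch, assuming the same interpreter path as above:

#!/usr/local/bin/python
# The Content-Type header plus a blank line must come before all output.
print("Content-Type: text/plain")
print("")
print("Ciao")
# Other code....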
The reason this script did not execute was that the shared hosting server (once activated, since Python was not installed by default) took a while to switch over to supporting Python scripts; during this period the script did not appear to run properly. The script above now runs.

Export & Map CSV output to MySQL table using Python

I have a bidirectional iperf setup, with multiple clients reporting to a single server, for network monitoring. The iperf server runs well and displays output in CSV format based on the cron jobs written on the client end.
I wish to write a Python script to automate the process of mapping these CSV outputs to a MySQL database, which in turn would be updated and saved at regular intervals without the need for human intervention.
I am using an Ubuntu 13.10 machine as the iperf server. The following is a sample of the CSV output I get; it is not stored to a file, just displayed on screen.
s1:~$ iperf -s -y C
20140422105054,172.16.10.76,41065,172.16.10.65,5001,6,0.0-20.0,73138176,29215083
20140422105054,172.16.10.76,5001,172.16.10.65,56254,4,0.0-20.0,46350336,18502933
20140422105100,172.16.10.76,54550,172.16.10.50,5001,8,0.0-20.0,67895296,27129408
20140422105100,172.16.10.76,5001,172.16.10.50,58447,5,0.0-20.1,50937856,20292796
20140422105553,172.16.10.76,5001,172.16.10.65,47382,7,0.0-20.1,51118080,20358083
20140422105553,172.16.10.76,41067,172.16.10.65,5001,5,0.0-20.1,76677120,30524007
20140422105600,172.16.10.76,5001,172.16.10.50,40734,4,0.0-20.0,57606144,23001066
20140422105600,172.16.10.76,54552,172.16.10.50,5001,8,0.0-20.0,70123520,28019115
20140422110053,172.16.10.76,41070,172.16.10.65,5001,5,0.0-20.1,63438848,25284066
20140422110053,172.16.10.76,5001,172.16.10.65,46462,6,0.0-20.1,11321344,4497094
The fields I want to map them to are: timestamp, server_ip, server_port, client_ip, client_port, tag_id, interval, transferred, bandwidth
I want to map this CSV output periodically to a MySQL database, for which I understand I would have to write a Python script (run from a cron job) that queries the output and stores it in the MySQL database. I am a beginner at Python scripting and database queries.
I went through another discussion on Server Fault at https://serverfault.com/questions/566737/iperf-csv-output-format and would like to build my query based on this.
Generate SQL script, then run it
If you do not want to use complex solutions like sqlalchemy, the following approach is possible:
having your CSV data, convert it into an SQL script
use the mysql command-line tool to run this script
Before you do this the first time, be sure to create the needed database structure in the database (this I leave to you).
My sample below uses (just for my convenience) the docopt package, so you need to install it:
$ pip install docopt
CSV to SQL script conversion utility
csv2sql.py:
"""
Usage:
csv2sql.py [--table <tablename>] <csvfile>
Options:
--table <tablename> Name of table in database to import into [default: mytable]
Convert csv file with iperf data into sql script for importing
those data into MySQL database.
"""
from csv import DictReader
from docopt import docopt
if __name__ == "__main__":
args = docopt(__doc__)
fname = args["<csvfile>"]
tablename = args["--table"]
headers = ["timestamp",
"server_ip",
"server_port",
"client_ip",
"client_port",
"tag_id",
"interval",
"transferred",
"bandwidth"
]
sql = """insert into {tablename}
values ({timestamp},"{server_ip}",{server_port},"{client_ip}",{client_port},{tag_id},"{interval}",{transferred},{bandwidth});"""
with open(fname) as f:
reader = DictReader(f, headers, delimiter=",")
for rec in reader:
print(sql.format(tablename=tablename, **rec)) # python <= 2.6 will fail here
Convert CSV to SQL script
First, let the conversion utility introduce itself:
$ python csv2sql.py -h
Usage:
    csv2sql.py [--table <tablename>] <csvfile>

Options:
    --table <tablename>  Name of table in database to import into [default: mytable]

Convert csv file with iperf data into sql script for importing
those data into MySQL database.
Having your data in file data.csv:
$ python csv2sql.py data.csv
insert into mytable
values (20140422105054,"172.16.10.76",41065,"172.16.10.65",5001,6,"0.0-20.0",73138176,29215083);
insert into mytable
values (20140422105054,"172.16.10.76",5001,"172.16.10.65",56254,4,"0.0-20.0",46350336,18502933);
insert into mytable
values (20140422105100,"172.16.10.76",54550,"172.16.10.50",5001,8,"0.0-20.0",67895296,27129408);
insert into mytable
values (20140422105100,"172.16.10.76",5001,"172.16.10.50",58447,5,"0.0-20.1",50937856,20292796);
insert into mytable
values (20140422105553,"172.16.10.76",5001,"172.16.10.65",47382,7,"0.0-20.1",51118080,20358083);
insert into mytable
values (20140422105553,"172.16.10.76",41067,"172.16.10.65",5001,5,"0.0-20.1",76677120,30524007);
insert into mytable
values (20140422105600,"172.16.10.76",5001,"172.16.10.50",40734,4,"0.0-20.0",57606144,23001066);
insert into mytable
values (20140422105600,"172.16.10.76",54552,"172.16.10.50",5001,8,"0.0-20.0",70123520,28019115);
insert into mytable
values (20140422110053,"172.16.10.76",41070,"172.16.10.65",5001,5,"0.0-20.1",63438848,25284066);
insert into mytable
values (20140422110053,"172.16.10.76",5001,"172.16.10.65",46462,6,"0.0-20.1",11321344,4497094);
Put it all into file data.sql:
$ python csv2sql.py data.csv > data.sql
Apply data.sql to your MySQL database
And finally, use the mysql command (provided by MySQL) to do the import into the database:
$ mysql --user=username --password=password db_name < data.sql
If you plan on using Python, then I would recommend sqlalchemy.
The general approach is:
define a class which has all the attributes you want to store
map all the properties of the class to database columns and types
read your data from the csv (using e.g. the csv module); for each row, create a corresponding object of the prepared class and let it be stored
The sqlalchemy documentation shall provide you more details and instructions; your requirement seems rather easy, and a sketch of this approach follows below.
Another option is to find an existing csv import tool; some are already available with MySQL, and there are plenty of others too.
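For illustration, a minimal sketch of the sqlalchemy approach described above. The connection URL, table name, and column types are my assumptions; the columns follow the field list from the question:

import csv
from sqlalchemy import create_engine, Column, Integer, String, BigInteger
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class IperfRecord(Base):
    __tablename__ = "iperf_results"  # assumed table name
    id = Column(Integer, primary_key=True)
    timestamp = Column(String(14))
    server_ip = Column(String(15))
    server_port = Column(Integer)
    client_ip = Column(String(15))
    client_port = Column(Integer)
    tag_id = Column(Integer)
    interval = Column(String(16))
    transferred = Column(BigInteger)
    bandwidth = Column(BigInteger)

engine = create_engine("mysql://user:password@localhost/iperfdb")  # placeholder URL
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

session = Session()
with open("data.csv") as f:
    # Each CSV row carries the nine fields in the order listed above.
    for ts, sip, sport, cip, cport, tag, ival, xfer, bw in csv.reader(f):
        session.add(IperfRecord(
            timestamp=ts, server_ip=sip, server_port=int(sport),
            client_ip=cip, client_port=int(cport), tag_id=int(tag),
            interval=ival, transferred=int(xfer), bandwidth=int(bw)))
session.commit()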
This is probably not the kind of answer you are looking for, but if you learn a little sqlite3 (a module in Python's standard library: import sqlite3) by doing a basic tutorial online, you will realize that your problem is not at all difficult to solve. Then just use a standard timer, such as time.sleep(), to repeat the procedure.
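A rough sketch of that idea; the database file, table, and the read_new_lines() helper are all hypothetical placeholders:

import sqlite3
import time

def read_new_lines():
    """Hypothetical helper: return iperf CSV lines not yet imported."""
    return []

conn = sqlite3.connect("iperf.db")  # assumed database file
conn.execute("""create table if not exists iperf_results
                (timestamp text, server_ip text, server_port integer,
                 client_ip text, client_port integer, tag_id integer,
                 interval text, transferred integer, bandwidth integer)""")

while True:
    for line in read_new_lines():
        # Nine comma-separated fields, in the order listed in the question.
        conn.execute("insert into iperf_results values (?,?,?,?,?,?,?,?,?)",
                     line.strip().split(","))
    conn.commit()
    time.sleep(300)  # repeat every 5 minutes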

.cgi problem with web server

The code
#!/usr/bin/env python
import MySQLdb
print "Content-Type: text/html"
print
print "<html><head><title>Books</title></head>"
print "<body>" print "<h1>Books</h1>"
print "<ul>"
connection = MySQLdb.connect(user='me', passwd='letmein', db='my_db') cursor = connection.cursor() cursor.execute(“SELECT name FROM books ORDER BY pub_date DESC LIMIT 10”)
for row in cursor.fetchall():
print "<li>%s</li>" % row[0]
print "</ul>"
print "</body></html>"
connection.close()
I saved it as test.cgi on my web server. I tried to run it via www.mysite.com/test.cgi, unsuccessfully:
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
How can I solve the problem?
[edit] after the first answer
test.cgi is executable (I ran $ chmod +x test.cgi).
I use Apache.
I have this in .bashrc export PATH=${PATH}:~/bin
Python module MySQLdb is installed.
The code does not have smart quotes.
[edit] after the second answer
you're getting that error because you haven't installed the MySQLdb module that Python needs to talk to a MySQL database
I installed MySQLdb on my system. The module works, since I can import it.
However, I still get the same error when I go to www.[mysite].com/test.cgi.
[edit]
I am not sure about these questions:
Are the connect() parameters correct? Is MySQL running on localhost
at the default port?
I run MySQL on my server. Is the question about the connect() parameters relevant here?
Is the SELECT statement correct?
You mean whether I have my SQL statements, such as the SELECT statement, correct?
I have not used any SQL queries yet.
Do I need them here?
Any number of issues can cause the error you are seeing:
Is test.cgi executable (chmod 755) on the server?
Is the directory in which you placed test.cgi designated as a ScriptAlias location or have the ExecCGI option enabled (or equivalent if you're not using Apache)?
Is python in the system PATH or in the PATH in the Web server's startup environment?
Is the MySQLdb Python library installed?
Are the connect() parameters correct? Is MySQL running on localhost at the default port?
Is the SELECT statement correct?
If you're sure that python is found (test using the simplest possible script, or by logging into the Web server if you can and typing which python), then you can get much better debug output by adding the following to the top of your script, just below the shebang:
import cgitb
cgitb.enable()
More details: http://docs.python.org/library/cgitb.html
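With that in place, the top of test.cgi would look like this:

#!/usr/bin/env python
# Enable cgitb first so any later exception is rendered as a readable
# traceback in the browser instead of a bare Internal Server Error.
import cgitb
cgitb.enable()

import MySQLdb
# ... rest of the script as before ...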
Additionally, if you have shell access to the Web server, try running python and just typing:
>>> import MySQLdb
If the command returns with no error, you have your answer for #4 above. If an error is printed, you will need to get MySQLdb installed into the Web server's Python installation.
EDIT: Looking more closely at the top of your question, I see that the code was scraped from an illustrative example at the very beginning of the Django Book. As such, I might expand #5 above to include the caveat that, of course, the requisite database, tables, user, and permissions need to be set up on the MySQL installation available to the Web server.
I've tidied up the code a bit by inserting linebreaks where necessary and replacing smart quotes with " and '. Do you have any more luck with the following? Can you run it from a terminal just by typing python test.cgi?
#!/usr/bin/env python
import MySQLdb
print "Content-Type: text/html"
print
print "<html><head><title>Books</title></head>"
print "<body>"
print "<h1>Books</h1>"
print "<ul>"
connection = MySQLdb.connect(user='me', passwd='letmein', db='my_db')
cursor = connection.cursor()
cursor.execute("SELECT name FROM books ORDER BY pub_date DESC LIMIT 10")
for row in cursor.fetchall():
    print "<li>%s</li>" % row[0]
print "</ul>"
print "</body></html>"
connection.close()
The error in http://dpaste.com/8866/ is occurring because you are using "curly quotes" instead of standard ASCII quotation marks.
You'll want to replace the “ and ” with ". Just use find and replace in your text editor.
Make sure you're saving the file with the correct line endings for the server, i.e. LF only on Unix, or CR/LF for Windows. I just recently had the exact same problem...
