How to include external Python file in SQL request - python

I'm not experienced in Python.
How can I import various values from files outside this one and use them in a SQL request? I have the following Python code:
#!/usr/bin/env python
import MySQLdb
import Stamdata
from Stamdata import Varmekurve

K = Varmekurve
print K  # this works, and the value 1.5 from Varmekurve is printed

# Open database connection
db = MySQLdb.connect("localhost", "root", "Codename", "MyDvoDb")

# Prepare a cursor object using the cursor() method
cursor = db.cursor()

# Get SetTemp from SQL.
# Here I would like to use the value from Varmekurve instead of '1.5', and the reading
# from a DS18B20 temp. sensor instead of '15'.
# The DS18B20 sensor is located at '/sys/bus/w1/devices/28-0316007914ff/w1_slave'.
sql = "SELECT SetTemp FROM varmekurver WHERE kurvenummer = '1.5' AND TempSensor = '15'"

cursor.execute(sql)
results = cursor.fetchall()
for row in results:
    print row[0]

db.close()
Only the Stamdata file is in the same directory.
The script shall control a motor valve by looking up the SetTemp and opening/closing a mix valve if the temperature is too high or low (within 2-3 degrees).
But I haven't come that far yet :0)

To dynamically insert values into the string from variables, do:
sql = "SELECT SetTemp FROM varmekurver WHERE kurvenummer = '{}' AND TempSensor = '{}'".format(val1, val2)
If you want to import these values from an external source, like a flat file, you can do it in a number of ways, for example using Pandas.
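Putting it together for the original question, a minimal sketch using a parameterised query rather than .format (the driver then quotes the values and you are safe from SQL injection). The w1_slave parsing below assumes the sensor file's usual two-line format, and you may need to round the reading to match the TempSensor steps in your table:

import MySQLdb
from Stamdata import Varmekurve

def read_ds18b20(path='/sys/bus/w1/devices/28-0316007914ff/w1_slave'):
    # The second line of w1_slave normally ends in 't=<millidegrees>'.
    with open(path) as f:
        lines = f.readlines()
    return int(lines[1].split('t=')[1]) / 1000.0

db = MySQLdb.connect("localhost", "root", "Codename", "MyDvoDb")
cursor = db.cursor()

temp = int(round(read_ds18b20()))  # round to whole degrees to match TempSensor values like '15'
# %s placeholders let MySQLdb quote the values safely.
cursor.execute(
    "SELECT SetTemp FROM varmekurver WHERE kurvenummer = %s AND TempSensor = %s",
    (Varmekurve, temp),
)
for row in cursor.fetchall():
    print(row[0])
db.close()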

Related

Running select query on db for different variables using python

I am using Python to establish the db connection and read a csv file. For each line in the csv I want to run a PostgreSQL query and get the value corresponding to that line.
The DB connection and file reading work fine. Also, if I run the query for a hardcoded value it works fine, but if I try to run the query for each row of the csv file using a Python variable, I don't get the correct value.
cursor.execute("select team from users.teamdetails where p_id = '123abc'")
The above query works fine, but when I try it for multiple values fetched from the csv file, I don't get the correct value:
cursor.execute("select team from users.teamdetails where p_id = queryPID")
Complete code for Reference:
import psycopg2
import csv

conn = psycopg2.connect(dbname='', user='', password='', host='', port='')
cursor = conn.cursor()

with open('playerid.csv', 'r') as csv_file:
    csv_reader = csv.reader(csv_file)
    for line in csv_reader:
        queryPID = line[0]
        cursor.execute("select team from users.teamdetails where p_id = queryPID")
        team = cursor.fetchone()
        print (team[0])

conn.close()
DO NOT concatenate the csv data. Use a parameterised query.
Use %s inside your string, then pass the additional variable:
cursor.execute('select team from users.teamdetails where p_id = %s', (queryPID,))
Concatenation of text leaves your application vulnerable to SQL injection.
https://www.psycopg.org/docs/usage.html
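Applied to the loop above, a minimal sketch of the corrected version (same csv layout and table as in the question):

with open('playerid.csv', 'r') as csv_file:
    csv_reader = csv.reader(csv_file)
    for line in csv_reader:
        queryPID = line[0]
        # The placeholder hands the value to psycopg2, which quotes it safely.
        cursor.execute('select team from users.teamdetails where p_id = %s', (queryPID,))
        team = cursor.fetchone()
        print(team[0])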

LOAD DATA LOCAL INFILE with incremental field

I have multiple unstructured txt files in a directory and I want to insert all of them into MySQL; basically, the entire content of each text file should be placed into a row. In MySQL I have 2 columns: ID (auto increment) and LastName (nvarchar(45)). I used Python to connect to MySQL and LOAD DATA LOCAL INFILE to insert the whole content, but when I run the code I see a series of messages in the Python console.
Also, when I check MySQL, I see nothing but a bunch of empty rows with IDs being automatically generated.
Here is the code:
import MySQLdb
import sys
import os

result = os.listdir("C:\\Users\\msalimi\\Google Drive\\s\\Discharge_Summary")

for x in result:
    db = MySQLdb.connect("localhost", "root", "Pass", "myblog")
    cursor = db.cursor()
    file1 = os.path.join(r'C:\\Discharge_Summary\\' + x)
    cursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE clamp_test" % (file1,))
    db.commit()
    db.close()
Can someone please tell me what is wrong with the code? What is the right way to achieve my goal?
I edited my code with:
cursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE clamp_test LINES TERMINATED BY '\r' (Lastname) SET id = NULL" % (file1,))
and it worked :)
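Put together, a sketch of the working loop (the LINES TERMINATED BY '\r' clause presumably works because the files contain no bare carriage returns, so each whole file lands in Lastname as a single line, while SET id = NULL leaves the auto-increment column to MySQL; the directory path is the one from the question):

import MySQLdb
import os

src = "C:\\Users\\msalimi\\Google Drive\\s\\Discharge_Summary"
db = MySQLdb.connect("localhost", "root", "Pass", "myblog")
cursor = db.cursor()

for name in os.listdir(src):
    # MySQL unescapes backslashes inside the quoted filename, so double them.
    path = os.path.join(src, name).replace('\\', '\\\\')
    cursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE clamp_test "
                   "LINES TERMINATED BY '\r' (Lastname) SET id = NULL" % (path,))

db.commit()
db.close()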

Python Running SQL Query With Temp Tables

I am new to the Python-SQL connectivity world. My goal is to retrieve data from SQL into a pandas DataFrame by executing long SQL queries through my Python script.
Most of my SQL queries are long, with multiple interim temp tables before the final SELECT statement from the last temp table. When I run such a monolithic query in Python I get an error saying -
"pandas.io.sql.DatabaseError: Execution failed on sql"
though the queries run absolutely fine in MS SQL Management Studio.
I suspect this is due to the interim temp tables, because if I split my long query into two pieces (everything before the final SELECT in the 1st section and the final SELECT in the 2nd) and run the two sections sequentially, they run fine.
Can someone explain why this is, or alternatively what is the best way to run long queries with temp tables/views and retrieve the results in a pandas DataFrame?
Here is my sample Python code that should ideally take a file name as input and run the SQL to retrieve results in a DataFrame; however, it fails for a query with temp tables.
import pyodbc as db
import pandas as pd

filename = 'file.sql'
username = 'XXXX'
password = 'YYYYY'
driver = '{ODBC Driver 13 for SQL Server}'
database = 'DB'
server = 'local'

conn = db.connect('DRIVER=' + driver + ';PORT=1433;SERVER=' + server +
                  ';DATABASE=' + database + ';UID=' + username + ';PWD=' + password)

fd = open(filename, 'r')
sqlfile = fd.read()
fd.close()

sqlcommand1 = sqlfile
df_table = pd.read_sql(sqlcommand1, conn)
If I break my SQL query into two pieces (one with all the temp tables and the 2nd with the final SELECT), then it runs fine. Below is a modified script that splits the long query at '/**/', and it works fine:
"""
This Function Reads a SQL Script From an Extrenal File and Executes The
Script in SQL. If The SQL Script Has Bunch of Tem Tables/Views
Followed By a Select Statement to Retrieve Data From Those Views Then Input
SQL File Should Have '/**/' Immediately Before the Final
Select Statement. This is to Esnure Final Select Statement is Executed on
the Temporary Views Already Run by Python.
Input is a SQL File Name and Output is a DataFrame
"""
import pyodbc as db
import pandas as pd

filename = 'filename.sql'
username = 'XXXX'
password = 'YYYYY'
driver = '{ODBC Driver 13 for SQL Server}'
database = 'DB'
server = 'local'

conn = db.connect('DRIVER=' + driver + ';PORT=1433;SERVER=' + server +
                  ';DATABASE=' + database + ';UID=' + username + ';PWD=' + password)

fd = open(filename, 'r')
sqlfile = fd.read()
fd.close()

sql = sqlfile.split('/**/')
sqlcommand1 = sql[0]  # 1st section of query with temp tables
sqlcommand2 = sql[1]  # 2nd section of query with final SELECT statement

conn.execute(sqlcommand1)
df_table = pd.read_sql(sqlcommand2, conn)
Quick and dirty answer: if using T-SQL put the line SET NOCOUNT ON at the beginning of your query.
Like @Parfait mentioned above, the pandas read_sql method can only support one result set. However, when you generate a temp table in T-SQL you create a result set in the form "(XX row(s) affected)", which is what causes your original query to fail. By setting NOCOUNT you eliminate any early returns and only get the results from your final SELECT statement.
Alternatively, if using pyodbc cursor instead of pandas you can utilize nextset() to skip the result sets from the temp table(s). More info on pyodbc here.
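For instance, a minimal sketch of the NOCOUNT approach (prepending the line when reading the script is an assumption about how you assemble the batch):

with open('file.sql', 'r') as fd:
    sqlfile = fd.read()

# SET NOCOUNT ON suppresses the "(XX row(s) affected)" messages produced by the
# temp-table statements, so pandas only sees the final SELECT's result set.
df_table = pd.read_sql('SET NOCOUNT ON;\n' + sqlfile, conn)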

Add list of values to a blob field in firebird using Python

I have a list of items which I'd like to store in my Firebird database.
Thus far I have made the following code:
import fdb

Sens = 278.3
DSens = 1.2
Fc10 = 3.8
Bw10 = 60.0
Fc20 = 4.2
Bw20 = 90.0
ResultArray = (Sens, DSens, Fc10, Bw10, Fc20, Bw20, t6, t20, Nel, Nsub)

con = fdb.connect(dsn="192.168.0.2:/database/us-database/usdb.gdb", user="sysdba", password="#########")
cur = con.cursor()
InsertStatement = "insert into Tosh_Probe (TestResults) Values (?)"
cur.execute(InsertStatement, (ResultArray,))
con.commit()
Here the TestResults field is a blob field in my database.
This gives a TypeError (???)
What is the correct syntax to store these values into a blob?
Another option I tried is to write the list of items into a StringIO and store that in the database. Now a new entry is made in the database, but no data is added to the blob field.
Here is the code for adding the fields to the StringIO
ResultArray = StringIO.StringIO()
ResultArray.write = Sens
ResultArray.write = DSens
#ResultArray.close #tried with and without this line but with the same result
I've tested this with Python 3.5.1 and FDB 1.6. The following variants of writing all work (into a blob sub_type text):
import fdb
import io
con = fdb.connect(dsn='localhost:testdatabase', user='sysdba', password='masterkey')
cur = con.cursor()
statement = "insert into blob_test2 (text_blob) values (?)"
cur.execute(statement, ("test blob as string",))
cur.execute(statement, (io.StringIO("test blob as StringIO"),))
streamwrites = io.StringIO()
streamwrites.write("streamed write1,")
streamwrites.write("streamed write2,")
streamwrites.seek(0)
cur.execute(statement, (streamwrites,))
con.commit()
con.close()
The major differences from your code, in the case of the writes to StringIO, are:
Use of write(...) instead of write = ...
Use of seek(0) to position the stream at the start, otherwise you read nothing, as the stream is positioned after the last write.
I haven't tried binary IO, but I expect that to work in a similar fashion.
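For what it's worth, a sketch of the binary equivalent (an untested assumption, mirroring the streamed-write variant above with io.BytesIO, for a blob sub_type binary column):

stream = io.BytesIO()
stream.write(b"binary write1,")
stream.write(b"binary write2,")
stream.seek(0)  # rewind so the driver reads from the start, as with StringIO
cur.execute(statement, (stream,))
con.commit()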

Comparing a given variable to data in a database and checking to see if it exists

I have a sqlite db of API keys and I want to check whether a given key is in the database. I'm generating the API keys using another Python script named apikeygen.py. I'm using Python 2.7 and pattern 2.6. This is going to be a data scraping/mining/filtering application that I'm doing just for fun, and it may have a future use for malware analysis.
I need help getting the main piece of code, which we will call API.py, to check whether the given API key is in the database.
This is the code for the API.py file so far.
import os, sys; sys.path.insert(0, os.path.join(os.path.dirname(__file__), "..", ".."))
import sqlite3 as lite
from pattern.server import App
from pattern.server import MINUTE, HOUR, DAY

app = App("api")

def search_db(key=''):
    con = lite.connect('apikeys.db')
    with con:
        cur = con.cursor()
        cur.execute("SELECT * FROM keys")
        while True:
            row = cur.fetchone()
            if row is None:
                break
            print row[2]
I'm still not really clear what you are asking. Why don't you explicitly query for the key, rather than iterating over your whole table?
cur.execute("SELECT * FROM keys WHERE key = ?", (key,))
