this should be pretty simple.
I'm writing a program that pulls data from a database, stores it in a variable, and passes it on to another program. I have it connecting to the DB and running the query, and the data comes back with each column on a new line. I would like to parse this output and store only the columns I need in separate variables, to be imported by another Python program. Please note that the print(outbound) part is just there for testing purposes.
Here's the function:
import pyodbc

def pullData():
    cnxn = pyodbc.connect('UID=' + dbUser + ';PWD=' + dbPassword + ';DSN=' + dbHost)
    cursor = cnxn.cursor()
    # run query to pull the newest sms message
    cursor.execute("SELECT * FROM WKM_SMS_outbound ORDER BY id DESC")
    table = cursor.fetchone()
    for outbound in table:
        print(outbound)
    # close connection
    cnxn.close()
And here's the sample output from the query that I would like to parse, as it's currently being stored in the variable outbound. NOTE: this is not one column, it's one ROW. Each new line is a new column in the DB; this is just how it's being returned and formatted when I run the program.
I think this is the best way you can achieve this (considering that your table variable is a list):
# Let's say until here you've done your queries
collection = {}
for index, outbound in enumerate(table, start=1):  # start=1 so the keys match the output below
    key_name = "key{0}".format(index)
    collection[key_name] = outbound
print(collection)
OUTPUT Expected:
{
"key1" : 6932921,
"key2" : 303794,
...
...
...
}
Then, to access the data from another Python file, add return collection at the end of your pullData function, import the function, and call it:
from your_file_name import pullData  # considering they are in the same directory
collection = pullData()
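A variant worth considering: instead of synthetic key1/key2 names, you can key the dictionary by the real column names, which both pyodbc and sqlite3 expose through the DB-API's cursor.description. Below is a minimal runnable sketch using an in-memory sqlite3 table as a stand-in for the real database; the column names and values here are made up for illustration.

```python
import sqlite3

def row_as_dict(cursor, row):
    """Map column names from cursor.description onto a fetched row."""
    return {col[0]: value for col, value in zip(cursor.description, row)}

# In-memory stand-in for the real WKM_SMS_outbound table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE WKM_SMS_outbound (id INTEGER, phone TEXT, body TEXT)")
cur.execute("INSERT INTO WKM_SMS_outbound VALUES (6932921, '555-0101', 'hello')")
conn.commit()

cur.execute("SELECT * FROM WKM_SMS_outbound ORDER BY id DESC")
record = row_as_dict(cur, cur.fetchone())
print(record)  # {'id': 6932921, 'phone': '555-0101', 'body': 'hello'}
conn.close()
```

With real column names as keys, the importing program can ask for record['id'] instead of remembering which positional keyN holds which column.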
Hello everyone. What I am trying to do is: through a query, insert data into Table A; once the data is inserted into Table A, delete the newly inserted values from A; then write the response/output into Table B, which I created.
Here is my python code :
client = bigquery.Client()

# This is Table B: the table I am writing my deleted output to
table_id = "ntest.practice.btabletest"
job_config = bigquery.QueryJobConfig(destination=table_id)

sql2 = """
INSERT INTO `ntest.practice.atabletest` (%s) VALUES (%s);
DELETE FROM `ntest.practice.atabletest`
WHERE name = 'HEART'
""" % (columns_aaa, valueaaa)

query_job1 = client.query(sql2, job_config=job_config)  # Make an API request.
query_job1.result()  # Waits for the query to finish.
print("Query results loaded to the table {}".format(table_id))
Yet, I get an error code saying:
google.api_core.exceptions.BadRequest: 400
configuration.query.destinationTable cannot be set for scripts
Any thoughts on how to fix this error? I don't believe my query is wrong, nor are my tables or values incorrect.
Although BigQuery scripting doesn't support a destination table, it doesn't seem that you need one for your specific query.
A DELETE query never writes any data to a destination table. You can work around the error by sending the INSERT first and then the DELETE as separate jobs; that way the destination table will "work" (I mean BigQuery won't complain about it), but you'd still get an empty table, because the DELETE returns no rows.
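If the goal is for Table B to hold the rows that get deleted, another option is to copy those rows into Table B yourself before deleting them, as two separate statements with no destination table configured at all. Here is a sketch of that pattern translated to sqlite3 so it can run anywhere; the column names are made up, and in BigQuery each statement would be its own client.query() call instead.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE atabletest (name TEXT, score INTEGER);
CREATE TABLE btabletest (name TEXT, score INTEGER);
INSERT INTO atabletest VALUES ('HEART', 10), ('OTHER', 20);
""")

# Capture the rows that are about to be deleted into Table B...
cur.execute("INSERT INTO btabletest SELECT * FROM atabletest WHERE name = 'HEART'")
# ...then delete them from Table A as a separate statement.
cur.execute("DELETE FROM atabletest WHERE name = 'HEART'")
conn.commit()

print(cur.execute("SELECT * FROM btabletest").fetchall())  # [('HEART', 10)]
print(cur.execute("SELECT * FROM atabletest").fetchall())  # [('OTHER', 20)]
```

Because the copy happens in its own INSERT ... SELECT, there is no script-level destination table for BigQuery to reject.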
I'm using Python in TestComplete to run a DB query, but the results come back as empty strings and do not match the data in the table I queried. The file is an s3db file. Does that matter?
Using:
TestComplete Version 14
imported sqlite3 into python file
I've:
- Tried running the same query in SQLite; it returned the expected result
- Verified the connection is established with the correct DB
---python
import sqlite3

def getInfo():
    conn = sqlite3.connect(db)
    c = conn.cursor()
    try:
        c.execute('SELECT Column_Name FROM Table_Name')
        results = c.fetchall()
    except sqlite3.Error:
        Log.Error("Query execution failed")
        return
    for x in results:
        Log.Message(x)
        # Log.Message() works like a print statement in TestComplete.
---
Actual Output:
The program runs without errors, but the results come back as 15 lines of blank rows. 15 is the number of records within the table, so I know it's looking in the right place, but it seems like it's not identifying that there's information stored here.
Expected Output:
15 lines of data contained within the Column I specified in the query.
There is no error with sqlite3 and your DB operations. The issue is with Log.Message and what it expects as an argument. Within TestComplete, Log.Message requires variable arguments of type Variant, which can be any of the supported data types within TestComplete; String, Double/Real, Boolean, Date/Time, Object (i.e. TestComplete-recognised UI objects) and Integer.
Log.Message cannot accept arguments of the type returned by cursor.fetchall's rows.
So you'd need to convert each row into a String, e.g.
for x in results:
    msg = '{0} : {1}'.format(x[0], x[1])  # adjust the indices to however many columns you selected
    Log.Message(msg)
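To see the conversion in isolation, here's a runnable sketch with an in-memory sqlite3 table standing in for the real s3db file. Since TestComplete isn't available outside the tool, Log.Message is replaced by collecting the formatted strings into a list; the table and column names are made up.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE Table_Name (Column_Name TEXT)")
c.executemany("INSERT INTO Table_Name VALUES (?)", [("alpha",), ("beta",)])

c.execute("SELECT Column_Name FROM Table_Name ORDER BY Column_Name")
messages = []
for row in c.fetchall():
    # Each row is a tuple, even when only one column is selected; join its
    # fields into a plain str, which is what Log.Message can accept.
    messages.append(" : ".join(str(field) for field in row))

print(messages)  # ['alpha', 'beta']
conn.close()
```

The join works for any number of selected columns, so the same line serves one-column and multi-column queries.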
I'm relatively new to Python but have more experience in Java, so I understand most of the concepts. However, I keep having issues with MySQL and with passing information returned from a MySQL function back for use in another function later.
I need to make complex MySQL queries with multiple return fields, so I don't want to run a separate SQL query for each field, as this would hammer the database.
That said, below is a small example of what I'm trying to achieve.
I want a function (def connector_mysql()) that runs an SQL query, taking parameters from elsewhere (this part works), and then passes the output of the SQL query back to the main function to use.
The main function then needs to use the different columns of the SQL result for different parameters.
I can return the result and assign it to result1, which appears and looks like a dictionary when printed, but I'm unable to split or use the individual keys or data from result1 = connector_mysql(subSerialNum, ldev, today_date).
If I split out the keys inside the SQL function before returning, e.g. ldev_cap = result['ldev_cap'], I can print the individual elements within the SQL function. However, I can't seem to pass those values back to the main function and split them out there.
I must have missed something easy or am not understanding something; any assistance or help would be greatly appreciated.
...
result1 = connector_mysql(subSerialNum, ldev, today_date)
print(result1)  # this works and the result looks like a dictionary, but I can't split it
                # into its elements like:
ldev_cap = result1['ldev_cap']  # -> this doesn't work here if I return it as a dictionary,
                                # and I'm unsure how to split the values out when I only pass
                                # them back after splitting the dictionary in the SQL function
...
def connector_mysql(subSerialNum, ldev, today_date):
    import pymysql.cursors
    db_server = 'localhost'
    db_name = 'CBDB'
    db_pass = 'secure_password'
    db_user = 'user1'
    sql_query = (
        "SELECT ldev_cap, ldev_usdcap FROM Ldevs WHERE sub_serial=%(serial)s "
        "and ldev_id=%(lun)s and data_date=%(todayD)s")
    connection = pymysql.connect(host=db_server,
                                 user=db_user,
                                 password=db_pass,
                                 db=db_name,
                                 cursorclass=pymysql.cursors.DictCursor)
    try:
        with connection.cursor() as cursor:
            cursor.execute(sql_query, {'serial': subSerialNum, 'lun': ldev, 'todayD': today_date})
            result = cursor.fetchone()
            while result:
                ldev_cap = result['ldev_cap']  # here the dictionary acts as
                                               # expected and I can assign a value
                ldev_usdcap = result['ldev_usdcap']
                print(result)
                return ldev_cap, ldev_usdcap  # here I can return
    finally:
        connection.close()
Any help or assistance would be greatly appreciated...
Cheers
Graham
First of all, you should get familiar with the Python style guide (PEP 8) for writing Python code.
Based on your existing code, result1 is returned as a tuple containing the values (ldev_cap, ldev_usdcap); it is not a dictionary. You access the returned values as result1[0], corresponding to ldev_cap, or result1[1], corresponding to ldev_usdcap.
Alternatively, since you are returning two values, you can unpack them directly:
ldev_cap, ldev_usdcap = connector_mysql(subSerialNum, ldev, today_date)
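The packing/unpacking behavior is easy to see without a database at all. Here is a stub standing in for connector_mysql (the values 1024 and 512 are made up for illustration); any Python function that "returns two things" actually returns a single tuple.

```python
# Hypothetical stand-in for connector_mysql: a function that returns two values
# really returns one tuple.
def connector_stub(subSerialNum, ldev, today_date):
    ldev_cap = 1024
    ldev_usdcap = 512
    return ldev_cap, ldev_usdcap  # packed into the tuple (1024, 512)

result1 = connector_stub("S123", "00:AA", "2024-01-01")
print(type(result1).__name__)  # tuple
print(result1[0], result1[1])  # 1024 512

# Or unpack directly into two names:
ldev_cap, ldev_usdcap = connector_stub("S123", "00:AA", "2024-01-01")
print(ldev_cap, ldev_usdcap)   # 1024 512
```

If you'd rather get a dictionary back from the caller's side, return the DictCursor row itself (return result) instead of the two unpacked values.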
Just to explain what's going on here: I have a search function that runs a MySQL query using user input [givenLocation]. It is supposed to dump the contents of the query into the listbox [self.lookupList]. My issue is that it currently only dumps the first result, even though I am using the fetchall() function. I am a self-taught Python developer, but I haven't been able to find any information on this from other sources. Here is my code:
def searchL_button(self):
    i = 0
    givenLocation = self.top3.searchEntry1.get()
    searchLookup = ("SELECT Status, Serial, Product_Code, Location FROM Registers WHERE Location = %s")
    cursor9.execute(searchLookup, [givenLocation])
    locRes = cursor9.fetchall()[i]
    for i in locRes:
        self.lookupList.insert(END, locRes)
You are setting the variable locRes to contain only the first result of your query (the [i] index, with i = 0, picks one row out of fetchall()'s list). Change the last few lines to the following:
locRes = cursor9.fetchall()
for curRes in locRes:
    self.lookupList.insert(END, curRes)
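To see the difference without MySQL or Tkinter, here's a runnable sketch using an in-memory sqlite3 table (note sqlite3 uses ? placeholders where MySQL uses %s) and a plain list standing in for the Listbox; the table contents are made up.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Registers (Status TEXT, Serial TEXT, Product_Code TEXT, Location TEXT)")
cur.executemany("INSERT INTO Registers VALUES (?, ?, ?, ?)",
                [("OK", "S1", "P1", "Aisle 3"), ("OK", "S2", "P2", "Aisle 3")])

cur.execute("SELECT Status, Serial, Product_Code, Location FROM Registers WHERE Location = ?",
            ["Aisle 3"])
lookup_list = []  # stands in for the Tkinter Listbox
for curRes in cur.fetchall():  # iterate over every row, not just row [0]
    lookup_list.append(curRes)

print(len(lookup_list))  # 2
conn.close()
```

Indexing fetchall()[0] instead of looping gives you a single row, and iterating over that row yields its columns, which is why only the first record ever appeared.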
I am using Python to fetch data from an Oracle DB. All the rows have a column with XML data. When I print the data fetched from the Oracle DB using Python, the column with XML data is printed as <cx_Oracle.OBJECT object at 0x7fffe373b960> etc. I even converted the data to a pandas data frame, and the data for this column is still printed as <cx_Oracle.OBJECT object at 0x7fffe373b960>. I want to access the key-value data stored in this column (XML files).
Please read the inline comments.
cursor = connection.cursor()  # you know what it is for

# getClobVal() returns the whole XML. It won't work without the table alias; I don't know why.
query = """select a.columnName.getClobVal() from tablename a"""
cursor.execute(query)  # you know what it is for

# for a single record:
result = cursor.fetchone()[0].read()

# for all records:
results = cursor.fetchall()
for res in results:
    print(res[0].read())
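Once getClobVal().read() has given you the XML as a plain string, you can pull out the key-value data with the standard library's ElementTree. The XML text and tag names below are hypothetical, since the real document structure isn't shown in the question.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML text, as it might come back from getClobVal().read().
xml_text = """
<record>
    <key>order_id</key>
    <value>42</value>
</record>
"""

root = ET.fromstring(xml_text)
# Collect the key/value pair into a dictionary.
data = {root.findtext("key"): root.findtext("value")}
print(data)  # {'order_id': '42'}
```

For repeated key/value elements you'd loop over root.findall(...) instead of reading a single pair, but the parsing approach is the same.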