I just started learning Python and I got into pyodbc. I would like to print the datatypes of the columns just once, above the actual data. But right now it either prints the datatypes on every iteration of the loop or the program shuts down. I'm not sure what exactly my mistake is.
def selectPlace(dbc):
    cursor = dbc.cursor()
    try:
        cursor.execute('SELECT Nr, address from Place')
    except:
        print('Error')
        cursor.close()
        return
    print('\nPlaces:')
    i = 0
    for row in cursor:
        if i == 0:
            print('Datatypes: ', type(row[0]), type(row[1]))
            i = 1
        print(row[0], row[1])
        # print type(row[0])
    cursor.close()
    x = input('Input : ')
    return x
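If it helps, one way to print the column types a single time, before the row loop, is to read them from cursor.description instead of the first row (a minimal sketch; in pyodbc the second element of each description tuple is the column's Python type):

def selectPlace(dbc):
    cursor = dbc.cursor()
    try:
        cursor.execute('SELECT Nr, address FROM Place')
    except Exception:
        print('Error')
        cursor.close()
        return

    # cursor.description holds one tuple per result column; index 1 is the type.
    print('\nDatatypes:', cursor.description[0][1], cursor.description[1][1])

    print('Places:')
    for row in cursor:
        print(row[0], row[1])

    cursor.close()
    return input('Input : ')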
I have the code shown below, which reads four lines of data over a serial connection. The data is assigned to variables and then an attempt is made to insert it into a local database. However, once the code has run, there is no new data in the database.

I have inserted print statements to check that the data is definitely being received, and it is. I have also successfully inserted data into the database from the terminal, but those were static values such as 10.0, 10.0, 0, 10.
import MySQLdb
import serial
import time

ser = serial.Serial('/dev/ttyACM0', 115200)

conn = MySQLdb.connect(host="localhost", user="JP", passwd="password", db="serialdb")
cursor = conn.cursor()

while 1:
    print "waiting for data"
    print ""
    xs = ser.readline()
    print xs
    time.sleep(1)
    ys = ser.readline()
    print ys
    time.sleep(1)
    zs = ser.readline()
    print zs
    time.sleep(1)
    vs = ser.readline()
    print vs
    time.sleep(1)
    try:
        x = float(xs)
    except ValueError:
        pass
    try:
        y = float(xs)
    except ValueError:
        pass
    try:
        z = float(xs)
    except ValueError:
        pass
    v = int(vs)
    print "inserting into database"
    print ""
    time.sleep(1)
    sql = "INSERT INTO Arduino_Data(Temperature, Humidity, RPM, Distance) VALUES (%f, %f, %f, %d)" % (x, y, z, v)
    cursor.execute(sql)
    conn.commit
    break
commit is a function; you are not calling it :)

conn.commit()

That should do it.
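For reference, a minimal sketch of the corrected insert step, also switching to MySQLdb's %s parameter placeholders so the driver handles quoting (table and column names taken from the question):

sql = ("INSERT INTO Arduino_Data (Temperature, Humidity, RPM, Distance) "
       "VALUES (%s, %s, %s, %s)")
cursor.execute(sql, (x, y, z, v))
conn.commit()  # without the parentheses, nothing is actually written to the database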
I'm trying to insert some data from a text file (fields terminated by ";") into a table in MySQL using Python. Inserting the rows one by one takes too long, so I decided to write a loop that inserts a large block of rows at once, but not the whole file (there is not enough memory for that). The table has 21 columns.
import datetime
import mysql.connector

print(datetime.datetime.now())

with open("/backup/backup/backupDB/csv/file.txt", "r", encoding="latin-1") as data:
    dbconn = mysql.connector.connect(
        host="server", user="user", password="password", port=3306
    )
    cur = dbconn.cursor(prepared=True)
    cur.execute("SELECT COLUMN_NAME FROM information_schema.columns WHERE table_schema='schema' AND table_name='table'")
    iterColumn = cur.fetchall()
    columns = str(iterColumn).replace("(", "").replace(",)", "").replace("[", "(").replace("]", ")").replace("'", "")
    next(data)
    cur = dbconn.cursor()
    block = 5000
    y = []
    try:
        while True:
            for x in data:
                x = x.split(";")
                y.append(tuple(x))
                if len(y) == block:
                    break
            cur.executemany("insert ignore into schema.table " + columns + " values (%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s)", y)
            dbconn.commit()
            y = []
    except:
        print(datetime.datetime.now())
The code above works, but it caps the insert at exactly 60,000 rows, although I have 210,000+ rows in my file.

Where am I going wrong?
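For comparison, a minimal sketch of the same batching idea using itertools.islice, which pulls at most block lines per pass and stops cleanly when the file is exhausted (it reuses columns, data, cur and dbconn from the code above; the placeholder count assumes the 21 columns mentioned in the question):

from itertools import islice

block = 5000
insert_sql = ("insert ignore into schema.table " + columns +
              " values (" + ",".join(["%s"] * 21) + ")")

while True:
    # Read at most `block` lines from the open file object.
    chunk = [tuple(line.rstrip("\n").split(";")) for line in islice(data, block)]
    if not chunk:
        break  # nothing left to insert
    cur.executemany(insert_sql, chunk)
    dbconn.commit()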
I have created a database and need to check it for a matching user name and password. If a match is found, it should print "success"; otherwise it should not.

I have tried many ways; it only prints something when the username and password are correct, otherwise it shows nothing.

It does not print "can't log in" or an error. It just exits with code 0 without any output.
#!/usr/bin/python
import pymysql

# Open database connection
db = pymysql.connect("localhost", "root", "", "text")

# prepare a cursor object using cursor() method
cursor = db.cursor()

eg = input("What's your user name? ")
password = input("What's your user password? ")

sql = "SELECT * FROM login WHERE name = '%s' AND password = '%s' " % (eg, password)
try:
    # Execute the SQL command
    cursor.execute(sql)
    results = None
    # Fetch all the rows in a list of lists.
    results = cursor.fetchall()
    try:
        if results is not None:
            for row in results:
                id = row[0]
                name = row[1]
                password = row[2]
                print("login success")
    except:
        print("Error you are not in the planet dude sign up first")
except:
    print("Error you are not in the planet dude sign up first")

# disconnect from server
db.close()
Something seems off with the indentation; try the following segment:
try:
    # Execute the SQL command
    cursor.execute(sql)
    results = None
    # Fetch all the rows in a list of lists.
    results = cursor.fetchall()
    try:
        if results is not None:
            for row in results:
                id = row[0]
                name = row[1]
                password = row[2]
                print("login success")
    except:
        print("Error you are not in the planet dude sign up first")
except:
    print("Error you are not in the planet dude sign up first")
for row in results:
    id = row[0]
    name = row[1]
    password = row[2]
    print("login success")
Should be
success = False
for row in results:
    db_id = row[0]
    db_name = row[1]
    db_password = row[2]
    if db_name == eg and db_password == password:
        success = True
        print("login success")
The variables created from row[1] and row[2] need to be different from the eg and password variables created above. Once that is done, you just need to compare them to see whether they match.

You can check the success variable later on to perform an action only if the login was (or was not) successful.
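Putting the two points together, a minimal consolidated sketch of the check (assuming the login table has id, name and password columns, as in the question):

success = False
for row in results:
    db_name, db_password = row[1], row[2]
    if db_name == eg and db_password == password:
        success = True
        break

if success:
    print("login success")
else:
    print("can't log in")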
I have a Python loop which uses Selenium to get some data from a website and store it in a SQL database. At the beginning every iteration takes about one second, but after some time it slows down more and more. I think it is a memory problem, but I don't know how to solve it.
This is my code:
# Imports reconstructed for completeness; the `sql` module is assumed to be
# MySQLdb, based on the connect()/Error usage below.
import gc
import random
import sys
import time

import MySQLdb as sql
from selenium import webdriver
from unidecode import unidecode

anzahl = 0  # running count of successful inserts (was `count` in the original)
driver = webdriver.PhantomJS()
driver.set_window_size(1120, 550)

con = sql.connect(user="user", passwd="passwd", db="db", host="localhost")
cur = con.cursor()

def create():
    if random.random() < 0.5:
        driver.get('http://www.example.com/w')
    else:
        driver.get('http://www.example.com/p')
    name = driver.find_element_by_xpath("//div[@class='address']/h3").text
    name1 = name.split(" ")[0]
    name2 = name.split(" ")[1]
    test = driver.find_element_by_xpath("//div[@class='adr']").text
    test2 = test.replace("\n", " ")
    dd = driver.find_element_by_xpath("(//dl[@class='dl-horizontal'])[1]/dd").text
    dd2 = driver.find_element_by_xpath("(//dl[@class='dl-horizontal'])[2]/dd/a").text
    day = driver.find_element_by_xpath("(//dl[@class='dl-horizontal'])[5]/dd").text
    i = "','"
    try:
        values = unidecode("'" + name1 + i + name2 + i + dd + i + dd2 + i + day + i + test2 + "'")
        cur.execute("INSERT INTO accounts (name1,name2,dd,dd2,day,test2) VALUES (" + values + ")")
        con.commit()
        global anzahl
        anzahl += 1
        sys.stdout.write('.')
        sys.stdout.flush()
        gc.collect()
    except sql.Error as e:
        print("Error %d: %s" % (e.args[0], e.args[1]))
        gc.collect()

start = time.time()
for _ in range(200):
    create()
cur.close()
con.close()
end = time.time()
I don't see anything that would slow down the loop. I tried gc.collect(), but it doesn't change anything.

What can I do so that my loop does not slow down after some time?
Things that can slow down your code (a rough timing sketch follows this list):
- the web server, which may throttle bandwidth to prevent DoS;
- your driver object;
- the network (database) can be slow;
- I/O access (with sys.stdout.write and print), depending on the actual stream. Is it a console?
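To narrow down which of these is the culprit, you could time each phase of create() separately. Here is a rough, hypothetical helper, not part of the original code:

import time

def timed(label, fn, *args, **kwargs):
    # Run fn, print how long it took, and pass its result through.
    t0 = time.time()
    result = fn(*args, **kwargs)
    print("%s took %.3f s" % (label, time.time() - t0))
    return result

# Example usage inside create():
# timed("page load", driver.get, 'http://www.example.com/w')
# name = timed("xpath lookup", driver.find_element_by_xpath,
#              "//div[@class='address']/h3").text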
I'm using an open-source piece of Python code that pulls in the location of an entity and saves the details to a DB in real time; let's call it the scanner program. The DB file it saves to is an SQLite file: db.sqlite.

While this is happening, my code (the one in question) searches the DB file every 45 seconds, performing a SELECT statement to find a certain value. This works a couple of times, but after running for a couple of minutes concurrently with the scanner program, it runs into a DB lock error:

sqlite3.OperationalError: database is locked

So what can I do in my code to ensure this lock does not happen? I cannot change how the scanner program accesses the DB, only my program.

Any help here would be great. I've seen timeouts mentioned, along with threading, but I'm not sure about either.
from datetime import datetime
import sqlite3
import time
import json
import tweepy

def get_api(cfg):
    auth = tweepy.OAuthHandler(cfg['consumer_key'], cfg['consumer_secret'])
    auth.set_access_token(cfg['access_token'], cfg['access_token_secret'])
    return tweepy.API(auth)

# Fill in the values noted in previous step here
cfg = {
    "consumer_key": "X",
    "consumer_secret": "X",
    "access_token": "X",
    "access_token_secret": "X"
}

with open('locales/pokemon.en.json') as f:
    pokemon_names = json.load(f)

currentid = 1
pokemonid = 96  # test

while 1 == 1:
    conn = sqlite3.connect('db.sqlite')
    print "Opened database successfully"
    print "Scanning DB...."
    time.sleep(1)
    cur = conn.execute("SELECT * FROM sightings WHERE pokemon_id = ? and id > ?", (pokemonid, currentid))
    row = cur.fetchone()
    if row is None:
        print "No Pokemon Found \n "
        time.sleep(1)
    while row is not None:
        # get pokemon name
        name = pokemon_names[str(pokemonid)]
        # create expiry time
        datestr = datetime.fromtimestamp(row[3])
        dateoutput = datestr.strftime("%H:%M:%S")
        # create location
        location = "https://www.google.com/maps/place/%s,%s" % (row[5], row[6])
        # inform user
        print "%s found! - Building tweet! \n" % (name)
        time.sleep(1)
        # create tweet
        buildtweet = "a wild %s spawned in #Dublin - It will expire at %s. %s #PokemonGo \n " % (name, dateoutput, location)
        # print tweet
        # log
        print buildtweet
        currentid = row[0]
        time.sleep(1)
        # send tweet
        api = get_api(cfg)
        tweet = buildtweet
        try:
            status = api.update_status(status=tweet)
            print "sent!"
        except:
            pass
            print "this tweet failed \n"
        time.sleep(30)
        row = cur.fetchone()
    cur.close()
    conn.close()
    print "Waiting..... \n "
    time.sleep(45)

conn.close()
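One thing that may help (an assumption on my side, not something taken from the scanner program): sqlite3.connect() accepts a timeout argument, which makes your reads wait for the scanner's write lock to clear instead of raising immediately, and fetching everything up front lets you close the connection before the slow tweeting/sleeping part. A minimal sketch of that pattern, reusing pokemonid and currentid from the code above:

import sqlite3

# Wait up to 30 seconds for the scanner's lock to clear instead of
# failing straight away with "database is locked".
conn = sqlite3.connect('db.sqlite', timeout=30)
try:
    cur = conn.execute(
        "SELECT * FROM sightings WHERE pokemon_id = ? AND id > ?",
        (pokemonid, currentid),
    )
    rows = cur.fetchall()  # read everything now...
finally:
    conn.close()           # ...so the file is released before tweeting

for row in rows:
    # build and send the tweet here, with the database already closed,
    # so the connection is not held open across time.sleep() calls
    pass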