Difficulty inserting data into MySQL db using pymysql - python

I have written a little script using Python 3 that gets an RSS news feed using the feedparser library.
I then loop through the entries (a dictionary) and use a try/except block to insert the data into a MySQL db using pymysql (originally I tried to use MySQLdb, but read here and in other places that it does not work with Python 3 or above).
I originally followed the PyMySQL example on GitHub; however, this did not work for me and I had to use different syntax for pymysql, like they have here on DigitalOcean. That syntax did work for me when I tested out their example on their site.
But when I tried to incorporate it into my query, there was an error: it would not run the code in the try block and just ran the exception code each time.
Here is my code:
#! /usr/bin/python3
# web_scraper.py 1st part of the project, to get the data from the
# websites and store it in a mysql database

import cgitb
cgitb.enable()
import requests, feedparser, pprint, pymysql, datetime
from bs4 import BeautifulSoup

conn = pymysql.connect(host="localhost", user="root", password="pass", db="stories", charset="utf8mb4")
c = conn.cursor()

def adbNews():
    url = 'http://feeds.feedburner.com/adb_news'
    d = feedparser.parse(url)
    articles = d['entries']
    for article in articles:
        dt_obj = datetime.datetime.strptime(article.published, "%Y-%m-%d %H:%M:%S")
        try:
            sql = "INSERT INTO articles(article_title,article_desc,article_link,article_date) VALUES (%s,%s,%s,%s,%s)"
            c.execute(sql, (article.title, article.summary, article.link, dt_obj.strftime('%Y-%m-%d %H:%M:%S'),))
            conn.commit()
        except Exception:
            print("Not working")

adbNews()
I am not entirely sure what I am doing wrong. I have converted the string so that it is in the format for the MySQL DATETIME type (originally I did not have this), but each time I run the program nothing gets stored in the db and the exception gets printed.
EDIT:
After reading Daniel Roseman's comments I removed the try/except block and read the errors that Python gave me. It was to do with an extra argument in my SQL query.
Here is the edited working code:
#! /usr/bin/python3
# web_scraper.py 1st part of the project, to get the data from the
# websites and store it in a mysql database

import cgitb
cgitb.enable()
import requests, feedparser, pprint, pymysql, datetime
from bs4 import BeautifulSoup

conn = pymysql.connect(host="localhost", user="root", password="pass", db="stories", charset="utf8mb4")
c = conn.cursor()

def adbNews():
    url = 'http://feeds.feedburner.com/adb_news'
    d = feedparser.parse(url)
    articles = d['entries']
    for article in articles:
        dt_obj = datetime.datetime.strptime(article.published, "%Y-%m-%d %H:%M:%S")
        # extra argument was here, removed now
        sql = "INSERT INTO articles(article_title,article_desc,article_link,article_date) VALUES (%s,%s,%s,%s)"
        c.execute(sql, (article.title, article.summary, article.link, dt_obj.strftime('%Y-%m-%d %H:%M:%S'),))
        conn.commit()

adbNews()
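Looking back, the real lesson was that the bare except hid pymysql's actual error message (the extra %s placeholder). Here is a rough sketch, not my exact code, of the same loop that catches pymysql.Error and prints it, using the same feed, table, and columns as above:

#! /usr/bin/python3
# Sketch: keep the error handling, but report the driver's message so a
# placeholder mismatch (or any other SQL problem) is visible immediately.
import datetime
import feedparser
import pymysql

conn = pymysql.connect(host="localhost", user="root", password="pass",
                       db="stories", charset="utf8mb4")

def adbNews():
    d = feedparser.parse('http://feeds.feedburner.com/adb_news')
    with conn.cursor() as c:
        for article in d['entries']:
            dt_obj = datetime.datetime.strptime(article.published, "%Y-%m-%d %H:%M:%S")
            sql = ("INSERT INTO articles(article_title,article_desc,article_link,article_date) "
                   "VALUES (%s,%s,%s,%s)")
            try:
                c.execute(sql, (article.title, article.summary, article.link,
                                dt_obj.strftime('%Y-%m-%d %H:%M:%S')))
            except pymysql.Error as e:
                # Show which row failed and why, instead of a generic message
                print("Insert failed for", article.link, "->", e)
    conn.commit()

adbNews()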

Related

How can I change Chrome history with sqlite3 in Python?

This is my code
import sqlite3
conn = sqlite3.connect("path to chrome history")
c = conn.cursor()
c.executemany("UPDATE urls SET url = REPLACE(url,.foo.,.foo-bar.) WHERE url LIKE %foo%;")
conn.close()
It throws the following error:
c.executemany("UPDATE urls SET url = REPLACE(url,.foo.,.foo-bar.) WHERE url LIKE %foo%;")
TypeError: executemany expected 2 arguments, got 1
How can I change the history in Google Chrome using sqlite3 in Python?
I had some time to look at this after lunch and this is what I hobbled together. Use at your own risk (make a backup of the "History" file before running this).
import sqlite3

conn = sqlite3.connect(r"path to chrome history")
c = conn.cursor()

for url in c.execute("SELECT * FROM urls WHERE url LIKE '%foo%'"):
    # Execute returns a tuple. Need to convert to list to edit.
    # Then convert back to tuple for use in f-string
    url = list(url)
    url[1] = url[1].replace("foo", "foo-bar")  # String replace
    url = tuple(url)
    c.execute(f"REPLACE INTO urls VALUES {url}")

c.close()
conn.commit()
conn.close()
Note: Chrome has to be closed for this to run else the file will be locked.
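If you would rather not build SQL with an f-string, and not re-use the SELECT cursor while still iterating over it, a rough alternative sketch (assuming, as above, that Chrome's urls table has id and url columns) is to fetch the matches first and update by id with bound parameters:

import sqlite3

conn = sqlite3.connect(r"path to chrome history")
c = conn.cursor()

# Read the matching rows up front, then update each one by its id,
# letting sqlite3 bind the values instead of interpolating them.
rows = c.execute("SELECT id, url FROM urls WHERE url LIKE '%foo%'").fetchall()
for row_id, url in rows:
    c.execute("UPDATE urls SET url = ? WHERE id = ?",
              (url.replace("foo", "foo-bar"), row_id))

conn.commit()
conn.close()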

API call bombs in loop when object not found

My code is erroring out when the object "#odata.nextLink" is not found in the JSON. I thought the while loop was supposed to account for this? I apologize if this is rudimentary, but this is my first Python project, so I don't know the ropes yet.
Also, for what it is worth, the API results are quite limited; there is no "total pages" value I can extract.
# Import Python ODBC module
import pyodbc
import requests
import json
import sys

cnxn = pyodbc.connect(driver="{ODBC Driver 17 for SQL Server}", server="theplacewherethedatais", database="oneofthosedbthings", uid="u", pwd="pw")
cursor = cnxn.cursor()
storedProc = "exec GetGroupsAndCompanies"

for irow in cursor.execute(storedProc):
    strObj = str(irow[0])
    strGrp = str(irow[1])
    print(strObj + " " + strGrp)
    response = requests.get(irow[2], timeout=300000, auth=('u', 'pw')).json()
    data = response["value"]
    while response["#odata.nextLink"]:
        response = requests.get(response["#odata.nextLink"], timeout=300000, auth=('u', 'pw')).json()
        data.extend(response["value"])

cnxn.commit()
cnxn.close()
You can use the in keyword to test if a key is present:
while "#odata.nextLink" in response:

LOAD DATA LOCAL INFILE with incremental field

I have multiple unstructured txt files in a directory and I want to insert all of them into MySQL; basically, the entire content of each text file should be placed into a row. In MySQL, I have 2 columns: ID (auto increment) and LastName (nvarchar(45)). I used Python to connect to MySQL and used LOAD DATA LOCAL INFILE to insert the whole content. But when I run the code I see messages in the Python console (not reproduced here).
Also, when I check MySQL, I see nothing but a bunch of empty rows with IDs being automatically generated.
Here is the code:
import MySQLdb
import sys
import os

result = os.listdir("C:\\Users\\msalimi\\Google Drive\\s\\Discharge_Summary")

for x in result:
    db = MySQLdb.connect("localhost", "root", "Pass", "myblog")
    cursor = db.cursor()
    file1 = os.path.join(r'C:\\Discharge_Summary\\' + x)
    cursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE clamp_test" % (file1,))
    db.commit()
    db.close()
Can someone please tell me what is wrong with the code? What is the right way to achieve my goal?
I edited my code with:
.....cursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE clamp_test LINES TERMINATED BY '\r' (Lastname) SET id = NULL" %(file1,))
and it worked :)
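For completeness, here is a rough sketch of the whole loop with that fix applied (the directory, credentials, and table are the placeholders from my question; converting the path to forward slashes is just to avoid backslash escapes inside the SQL string literal):

import os
import MySQLdb

db = MySQLdb.connect("localhost", "root", "Pass", "myblog")
cursor = db.cursor()

src_dir = r"C:\Users\msalimi\Google Drive\s\Discharge_Summary"
for x in os.listdir(src_dir):
    # Forward slashes keep MySQL from treating backslashes as escapes
    file1 = os.path.join(src_dir, x).replace("\\", "/")
    cursor.execute(
        "LOAD DATA LOCAL INFILE '%s' INTO TABLE clamp_test "
        "LINES TERMINATED BY '\r' (Lastname) SET id = NULL" % (file1,)
    )

db.commit()
db.close()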

Comparing a given variable to data in a database and checking to see if it exists

I have a sqlite db of API keys and I want to make something that checks to see if a given key is in the database. I'm generating the API keys using another Python script named apikeygen.py. I'm using Python 2.7 and pattern 2.6. This is going to be a data scraping/mining/filtering application that I'm doing just for fun, and it may have a future use for malware analysis.
I need help getting the main piece of code that we will call API.py to check and see if the given API key is in the database.
This is the code for the API.py file so far.
import os, sys; sys.path.insert(0, os.path.join(os.path.dirname(__file__), "..", ".."))
import sqlite3 as lite
from pattern.server import App
from pattern.server import MINUTE, HOUR, DAY

app = App("api")

def search_db(key=''):
    con = lite.connect('apikeys.db')
    with con:
        cur = con.cursor()
        cur.execute("SELECT * FROM keys")
        while True:
            row = cur.fetchone()
            if row == None:
                break
            print row[2]
I'm still not really clear what you are asking. Why don't you explicitly query for the key, rather than iterating over your whole table?
cur.execute("SELECT * FROM keys WHERE key = ?", (key,))

MySQL connection/query makes file not work

I've got some test code I'm working on. In a separate HTML file, a button onclick event gets the URL of the page and passes it as a variable (jquery_input) to this Python script. Python then scrapes the URL and identifies two pieces of data, which it then formats and concatenates together (resulting in the variable lowerCaseJoined). This concatenated variable has a corresponding entry in a MySQL database. With each entry in the db, there is an associated .gif file.
From here, what I'm trying to do is open a connection to the MySQL server and query the concatenated variable against the db to get the associated .gif file.
Once this has been accomplished, I want to print the .gif file as an alert on the webpage.
If I take out the db section of the code (connection, querying), the code runs just fine. Also, I am successfully able to execute the db part of the code independently through the Python shell. However, when the entire code resides in one file, nothing happens when I click the button. I've systematically removed the lines of code related to the db connection, and my code begins stalling out at the first line (db = MySQLdb.connect...). So it looks like as soon as I start trying to connect to the db, the program goes kaput.
Here is the code:
#!/usr/bin/python
from bs4 import BeautifulSoup as Soup
import urllib
import re
import cgi, cgitb
import MySQLdb
cgitb.enable() # for troubleshooting
# the cgi library gets the var from the .html file
form = cgi.FieldStorage()
jquery_input = form.getvalue("stuff_for_python", "nothing sent")
# the next section scrapes the URL,
# finds the call no and location,
# formats them, and concatenates them
content = urllib.urlopen(jquery_input).read()
soup = Soup(content)
extracted = soup.find_all("tr", {"class": "bibItemsEntry"})
cleaned = str(extracted)
start = cleaned.find('browse') +8
end = cleaned.find('</a>', start)
callNo = cleaned[start:end]
noSpacesCallNo = callNo.replace(' ', '')
noSpacesCallNo2 = noSpacesCallNo.replace('.', '')
startLoc = cleaned.find('field 1') + 13
endLoc = cleaned.find('</td>', startLoc)
location = cleaned[startLoc:endLoc]
noSpacesLoc = location.replace(' ', '')
joined = (noSpacesCallNo2+noSpacesLoc)
lowerCaseJoined = joined.lower()
# the next section establishes a connection
# with the mySQL db and queries it
# using the call/loc code (lowerCaseJoined)
db = MySQLdb.connect(host="localhost", user="...", "passwd="...",
                     db="locations")
cur = db.cursor()
queryDb = """
SELECT URL FROM locations WHERE location = %s
"""
cur.execute(queryDb, lowerCaseJoined)
result = cur.fetchall()
cur.close()
db.close()
# the next 2 'print' statements are important for web
print "Content-type: text/html"
print
print result
Any ideas what I'm doing wrong?
I'm new at programming, so I'm sure there's a lot that can be improved upon here. But prior to refining it I just want to get the thing to work!
I figured out the problem. It seems that I had a quotation mark before the password portion of the db connection line. Things are all good now.
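For reference, a rough sketch of the fixed section (stray quote removed, and the single parameter passed as a one-element tuple, which is the usual form; the credentials are placeholders and lowerCaseJoined stands in for the value built by the scraping code above):

import MySQLdb

lowerCaseJoined = "qa7613mainlib"  # hypothetical call/location code; really built by the scraping code above

db = MySQLdb.connect(host="localhost", user="...", passwd="...",
                     db="locations")
cur = db.cursor()
queryDb = "SELECT URL FROM locations WHERE location = %s"
cur.execute(queryDb, (lowerCaseJoined,))
result = cur.fetchall()
cur.close()
db.close()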
