How can I change Chrome history with sqlite3 in Python?

This is my code
import sqlite3
conn = sqlite3.connect("path to chrome history")
c = conn.cursor()
c.executemany("UPDATE urls SET url = REPLACE(url,.foo.,.foo-bar.) WHERE url LIKE %foo%;")
conn.close()
It throws the following error:
c.executemany("UPDATE urls SET url = REPLACE(url,.foo.,.foo-bar.) WHERE url LIKE %foo%;")
TypeError: executemany expected 2 arguments, got 1
How can I change the history in Google Chrome using sqlite3 in Python?
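For context, executemany expects two arguments: the SQL statement and an iterable of parameter tuples. For a single statement, execute with bound parameters avoids both the TypeError and the broken quoting; a minimal sketch, with the path left as a placeholder:
import sqlite3

conn = sqlite3.connect("path to chrome history")  # placeholder path
c = conn.cursor()
# bound parameters (?) handle the quoting the original string lacked
c.execute(
    "UPDATE urls SET url = REPLACE(url, ?, ?) WHERE url LIKE ?",
    ("foo", "foo-bar", "%foo%"),
)
conn.commit()
conn.close()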

I had some time to look at this after lunch and this is what I hobbled together. Use at your own risk (make a backup of the "History" file before running this).
import sqlite3

conn = sqlite3.connect(r"path to chrome history")
c = conn.cursor()
# Fetch all matches first: executing on the same cursor while
# iterating over it would reset the iteration after the first row.
rows = c.execute("SELECT * FROM urls WHERE url LIKE '%foo%'").fetchall()
for url in rows:
    # Rows are tuples; convert to a list to edit the url column,
    # then bind the values back through ? placeholders rather than
    # an f-string, so quoting is handled safely.
    url = list(url)
    url[1] = url[1].replace("foo", "foo-bar")  # string replace
    placeholders = ",".join("?" * len(url))
    c.execute(f"REPLACE INTO urls VALUES ({placeholders})", tuple(url))
c.close()
conn.commit()
conn.close()
Note: Chrome has to be closed for this to run, otherwise the file will be locked.
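For the backup the answer recommends, a quick sketch using the standard library (the path is a placeholder):
import shutil

# copy the History file before editing it
shutil.copy2(r"path to chrome history", r"path to chrome history" + ".bak")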

Related

Can't store a PDF file in a MySQL table

I need to store a PDF file in MySQL. Whether I use escape_string or not, I always get the same error:
b_blob = open(dir + fname_only, "rb")
myblob = b_blob.read()  # <- b'%PDF-1.4\n%\xaa\xab\xac\xad\n4 0 obj\n<<\n/Producer (Apache FOP Version 0.94)\
try:
    conn = mysql.connector.connect( usual stuff )
    cursor = conn.cursor(buffered=True, dictionary=True)
    newblob = conn._cmysql.escape_string(myblob)
    query = """INSERT INTO `mytable` (`storing`) VALUES('%s')""" % (newblob)
    cursor.execute(query)
except Exception as exc:
    Functions.error_handler(exc)
    return
b_blob.close()
...MySQL server version for the right syntax to use near '\n%\xaa\xab\xac\xad\n4 0 obj\n<<\n/Producer (Apache FOP Version 0.94)\n/Creation' at line 1
So it looks like your problem is arising from the quotes at the start of your string. I would consider putting double quotes around the newblob variable. It should look like this:
query = """INSERT INTO `mytable` (`storing`) VALUES("%s")""" %(newblob)

API call bombs in loop when object not found

My code errors out when the object "#odata.nextLink" is not found in the JSON. I thought the while loop was supposed to account for this? I apologize if this is rudimentary, but this is my first Python project, so I don't know the ropes yet.
Also, for what it is worth, the API results are quite limited; there is no "total pages" value I can extract.
# Import Python ODBC module
import pyodbc
import requests
import json
import sys

cnxn = pyodbc.connect(driver="{ODBC Driver 17 for SQL Server}", server="theplacewherethedatais", database="oneofthosedbthings", uid="u", pwd="pw")
cursor = cnxn.cursor()
storedProc = "exec GetGroupsAndCompanies"
for irow in cursor.execute(storedProc):
    strObj = str(irow[0])
    strGrp = str(irow[1])
    print(strObj + " " + strGrp)
    response = requests.get(irow[2], timeout=300000, auth=('u', 'pw')).json()
    data = response["value"]
    while response["#odata.nextLink"]:
        response = requests.get(response["#odata.nextLink"], timeout=300000, auth=('u', 'pw')).json()
        data.extend(response["value"])
cnxn.commit()
cnxn.close()
You can use the in keyword to test if a key is present:
while "#odata.nextLink" in response:

PyMySQL with Python 3.5 - selecting into pandas dataframe with the LIKE clause fails due to escape characters?

I am using PyMySQL to fetch some data from the MySQL DB into a pandas dataframe. I need to run a select with the LIKE clause, but it seems like PyMySQL does something odd with the select statement and doesn't like it when the query contains %:
#connection to MySQL
engine = create_engine('mysql+pymysql://user:password@localhost:1234/mydb', echo=False)
#get descriptions we want
decriptions = pd.read_sql(sql=r"select content from listings where content not like '%The Estimate%'", con=engine)
I get error:
ValueError: unsupported format character 'T' (0x54) at index 54
Any advice on how to get around this?
Try doubling the % signs so they are not treated as format placeholders:
decriptions = pd.read_sql(sql=r"select content from listings where content not like '%%The Estimate%%'", con=engine)
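An alternative that sidesteps the escaping issue entirely is to bind the pattern as a parameter (a sketch using SQLAlchemy's text() construct; connection details as in the question):
from sqlalchemy import create_engine, text
import pandas as pd

engine = create_engine('mysql+pymysql://user:password@localhost:1234/mydb', echo=False)
# the LIKE pattern travels as a bound parameter, so % needs no escaping
query = text("select content from listings where content not like :pat")
decriptions = pd.read_sql(sql=query, con=engine, params={"pat": "%The Estimate%"})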

Difficulty inserting data into MySQL db using pymysql

I have written a little script using Python 3 that gets an RSS news feed using the feedparser library.
I then loop through the entries (a dictionary) and use a try/except block to insert the data into a MySQL db using pymysql (originally I tried to use MySQLdb, but read here and in other places that it does not work with Python 3 or above).
I originally followed the PyMySQL example on GitHub, however this did not work for me and I had to use different syntax for pymysql like they have here on DigitalOcean. That worked for me when I tested out their example on their site.
But when I tried to incorporate it into my query, there was an error: it would not run the code in the try block and just ran the exception code each time.
Here is my code:
#! /usr/bin/python3
# web_scraper.py 1st part of the project, to get the data from the
# websites and store it in a mysql database
import cgitb
cgitb.enable()
import requests, feedparser, pprint, pymysql, datetime
from bs4 import BeautifulSoup

conn = pymysql.connect(host="localhost", user="root", password="pass", db="stories", charset="utf8mb4")
c = conn.cursor()

def adbNews():
    url = 'http://feeds.feedburner.com/adb_news'
    d = feedparser.parse(url)
    articles = d['entries']
    for article in articles:
        dt_obj = datetime.datetime.strptime(article.published, "%Y-%m-%d %H:%M:%S")
        try:
            # note: five %s placeholders here, but only four values bound below
            sql = "INSERT INTO articles(article_title,article_desc,article_link,article_date) VALUES (%s,%s,%s,%s,%s)"
            c.execute(sql, (article.title, article.summary, article.link, dt_obj.strftime('%Y-%m-%d %H:%M:%S'),))
            conn.commit()
        except Exception:
            print("Not working")

adbNews()
I am not entirely sure what I am doing wrong. I have converted the string so that it matches the MySQL DATETIME format (I originally did not have this), but each time I run the program nothing gets stored in the db and the exception gets printed.
EDIT:
After reading Daniel Roseman's comments I removed the try/except block and read the errors that Python gave me. It was to do with an extra argument in my SQL query.
Here is the edited working code:
#! /usr/bin/python3
# web_scraper.py 1st part of the project, to get the data from the
# websites and store it in a mysql database
import cgitb
cgitb.enable()
import requests, feedparser, pprint, pymysql, datetime
from bs4 import BeautifulSoup

conn = pymysql.connect(host="localhost", user="root", password="pass", db="stories", charset="utf8mb4")
c = conn.cursor()

def adbNews():
    url = 'http://feeds.feedburner.com/adb_news'
    d = feedparser.parse(url)
    articles = d['entries']
    for article in articles:
        dt_obj = datetime.datetime.strptime(article.published, "%Y-%m-%d %H:%M:%S")
        # extra argument was here, removed now
        sql = "INSERT INTO articles(article_title,article_desc,article_link,article_date) VALUES (%s,%s,%s,%s)"
        c.execute(sql, (article.title, article.summary, article.link, dt_obj.strftime('%Y-%m-%d %H:%M:%S'),))
        conn.commit()

adbNews()
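If the try/except goes back in later, a sketch of printing the actual exception (reusing the question's names) so errors like the placeholder mismatch stay visible:
try:
    c.execute(sql, (article.title, article.summary, article.link, dt_obj.strftime('%Y-%m-%d %H:%M:%S')))
    conn.commit()
except Exception as exc:
    # show what actually failed instead of a fixed message
    print("Insert failed:", exc)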

MySQL connection/query makes file not work

I've got some test code I'm working on. In a separate HTML file, a button onclick event gets the URL of the page and passes it as a variable (jquery_input) to this Python script. Python then scrapes the URL and identifies two pieces of data, which it then formats and concatenates together (resulting in the variable lowerCaseJoined). This concatenated variable has a corresponding entry in a MySQL database, and each entry in the db has an associated .gif file.
From here, what I'm trying to do is open a connection to the MySQL server and query the concatenated variable against the db to get the associated .gif file.
Once this has been accomplished, I want to print the .gif file as an alert on the webpage.
If I take out the db section of the code (connection, querying), the code runs just fine. Also, I am successfully able to execute the db part of the code independently through the Python shell. However, when the entire code resides in one file, nothing happens when I click the button. I've systematically removed the lines of code related to the db connection, and my code begins stalling out at the first line (db = MySQLdb.connection...). So it looks like as soon as I start trying to connect to the db, the program goes kaput.
Here is the code:
#!/usr/bin/python
from bs4 import BeautifulSoup as Soup
import urllib
import re
import cgi, cgitb
import MySQLdb
cgitb.enable() # for troubleshooting
# the cgi library gets the var from the .html file
form = cgi.FieldStorage()
jquery_input = form.getvalue("stuff_for_python", "nothing sent")
# the next section scrapes the URL,
# finds the call no and location,
# formats them, and concatenates them
content = urllib.urlopen(jquery_input).read()
soup = Soup(content)
extracted = soup.find_all("tr", {"class": "bibItemsEntry"})
cleaned = str(extracted)
start = cleaned.find('browse') +8
end = cleaned.find('</a>', start)
callNo = cleaned[start:end]
noSpacesCallNo = callNo.replace(' ', '')
noSpacesCallNo2 = noSpacesCallNo.replace('.', '')
startLoc = cleaned.find('field 1') + 13
endLoc = cleaned.find('</td>', startLoc)
location = cleaned[startLoc:endLoc]
noSpacesLoc = location.replace(' ', '')
joined = (noSpacesCallNo2+noSpacesLoc)
lowerCaseJoined = joined.lower()
# the next section establishes a connection
# with the mySQL db and queries it
# using the call/loc code (lowerCaseJoined)
db = MySQLdb.connect(host="localhost", user="...", "passwd="...",
                     db="locations")
cur = db.cursor()
queryDb = """
SELECT URL FROM locations WHERE location = %s
"""
cur.execute(queryDb, lowerCaseJoined)
result = cur.fetchall()
cur.close()
db.close()
# the next 2 'print' statements are important for web
print "Content-type: text/html"
print
print result
Any ideas what I'm doing wrong?
I'm new at programming, so I'm sure there's a lot that can be improved upon here. But prior to refining it I just want to get the thing to work!
I figured out the problem. Seems that I had a stray quotation mark before the password portion of the db connection line. Things are all good now.
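For the record, the connect call with the stray quote removed (credentials elided as in the question):
db = MySQLdb.connect(host="localhost", user="...", passwd="...",
                     db="locations")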
