Unable to insert data into MySQL with Python/MySQL Connector

I'm new to Python (I learned how to code with it just two days ago). I'm trying to get feeds from a MySQL database and insert them into another table, but nothing gets inserted.
Here is my code:
cnx = MySQLConnection(**db_config)
if cnx.is_connected():
    print("Database connected successfully...")

cursor = cnx.cursor(dictionary=True)
cursor.execute("SELECT * from external_feeds WHERE discipline = 'ALL' AND actif = 1")
rows = cursor.fetchall()

insert_feed = ("INSERT INTO feeds "
               "(categorie, urlflux, titreflux, photonews, textnews, date, titrenews, liensnews, slug, photo)"
               "VALUES(%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)")

for row in rows:
    feed = feedparser.parse(row["url"])
    feed_link = row["url"]
    name = row["name"]
    image = row["photo"]
    category = row["discipline"]
    x = len(feed.entries)
    for i in range(x):
        feed_title = feed.entries[i].title
        print feed_title
        feed_url = feed.entries[i].link
        print feed_url
        feed_published = feed.entries[i].published
        dPubPretty = strftime(feed_published, gmtime())
        feed_description = feed.entries[i].description
        slug = re.sub('[^a-zA-Z0-9 \n\-]', '', feed_url)
        slug = slug.replace('httpwww', '')
        slug = slug.replace('http', '')
        # print insert_feed
        data_feed = (category, feed_link, name, None, feed_description, dPubPretty, feed_title, feed_url, slug, image)
        try:
            cursor.execute(insert_feed, data_feed)
            cursor.commit()
        except:
            cnx.rollback()

cursor.close()
Is there anyone who can help me figure out where the problem is? I'm completely new to this, so I'm totally lost.

I see that you are calling cursor.commit() after inserting the data, which is incorrect; commits belong to the connection, not the cursor, so use cnx.commit() instead.
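For reference, a minimal sketch of the corrected insert-and-commit pattern, assuming placeholder connection settings and sample row values (everything here besides the INSERT statement itself is illustrative):

from mysql.connector import MySQLConnection

# hypothetical connection settings for illustration
cnx = MySQLConnection(user='user', password='secret', host='localhost', database='mydb')
cursor = cnx.cursor(dictionary=True)

insert_feed = ("INSERT INTO feeds "
               "(categorie, urlflux, titreflux, photonews, textnews, date, titrenews, liensnews, slug, photo) "
               "VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)")

# sample values standing in for the fields parsed from the feed
data_feed = ('ALL', 'http://example.com/rss', 'Example feed', None, 'Some text',
             '2020-01-01 00:00:00', 'Some title', 'http://example.com/article',
             'examplecomarticle', None)

try:
    cursor.execute(insert_feed, data_feed)
    cnx.commit()              # commit on the connection object, not the cursor
except Exception as err:
    print(err)                # surface the real error instead of silently swallowing it
    cnx.rollback()
finally:
    cursor.close()
    cnx.close()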

Related

Inserting limited data into database using python

The API I'm using limits me to 1000 requests per day. How can I control which range of records gets inserted into my MySQL database, so that I can load 1000 in one session, then continue from record 1000 onward in the next session, and so on until everything is inserted?
mysql = mysql.connector.connect(
    host=host_input,
    user=user_input,
    passwd=passwd_input,
    database=database_input
)

with open('sadrzaj.json') as json_file:
    data = json.load(json_file)
    for p in data['filmovi']:
        naslov = p['naslov']
        iframe = p['iframe']
        opis = p['opis']
        movie = GetMovie(title=naslov, api_key='5f8abea5', plot='full')
        info = movie.get_data('Title', 'imdbRating', 'Genre', 'Year', 'Runtime', 'Country', 'Plot', 'Poster', 'Type', 'Language')
        imdbRating = info["imdbRating"]
        genre = info["Genre"]
        year = info["Year"]
        runtime = info["Runtime"]
        country = info["Country"]
        poster = info["Poster"]
        typee = info['Type']
        language = info["Language"]
        mycursor = mysql.cursor()
        sql = "INSERT INTO serije_filmovi(naslov, iframe, opis, imdbRating, genre, years, runtime, country, poster, typee, language) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)"
        val = (naslov, iframe, opis, imdbRating, genre, year, runtime, country, poster, typee, language)
        print(naslov)
        mycursor.execute(sql, val)
        mysql.commit()
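One way to stay under the daily limit is to process the JSON list in fixed-size slices and record how far you got, so the next run picks up where the last one stopped. A minimal sketch of that idea, reusing the sadrzaj.json structure above (the offset file name, batch size, and process() helper are hypothetical placeholders):

import json

BATCH_SIZE = 1000            # hypothetical daily API budget
OFFSET_FILE = 'offset.txt'   # hypothetical file remembering where the last run stopped

# read how many films were already processed (0 on the first run)
try:
    with open(OFFSET_FILE) as f:
        offset = int(f.read().strip())
except (IOError, ValueError):
    offset = 0

with open('sadrzaj.json') as json_file:
    filmovi = json.load(json_file)['filmovi']

# only this run's slice is looked up and inserted
batch = filmovi[offset:offset + BATCH_SIZE]
for p in batch:
    process(p)   # placeholder for the GetMovie lookup + INSERT shown above

# remember the new position for the next run
with open(OFFSET_FILE, 'w') as f:
    f.write(str(offset + len(batch)))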

How to solve pymysql.err.programmingError during upload using pymysql

I want to build a dataframe and upload it to MySQL. If a row with a duplicate key already exists it should be updated, and if there is no duplicate key it should be inserted.
user = 'test'
passw = '...'
host = '...'
port = '...'
database = '...'
conn = pymysql.connect(host=host,
                       port=port,
                       user=user,
                       password=passw,
                       database=database,
                       charset='utf8')
curs = conn.cursor()
data = list(dataframe.itertuples(index=False, name=None))
sql = "insert into naversbmapping(brand, startdate, enddate, cost, daycost) values (%s, %s, %s, %s, %s) on duplicate key update brand = %s, startdate = %s, enddate = %s, cost = %s, daycost = %s"
curs.executemany(sql, data)
conn.commit()
conn.close()
However, I get the following error. How do I fix it?
pymysql.err.ProgrammingError: (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '%s, startdate = %s, enddate = %s, cost = %s, daycost = %s' at line 1")
Use the following MySQL construct so that you don't need to send the data twice. Your original statement expects double the number of values per row (five for the INSERT and five more for the UPDATE), while this version consumes each value only once:
$sql = "INSERT INTO naversbmapping(brand, startdate, enddate, cost, daycost) VALUES (%s, %s, %s, %s, %s) ON DUPLICATE KEY UPDATE brand = VALUES(brand), startdate = VALUES(startdate), enddate = VALUES(enddate), cost = VALUES(cost), daycost = VALUES(daycost)")
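A minimal sketch of how that statement fits into the original executemany() call, assuming each dataframe row yields a 5-tuple (the connection parameters and sample rows are made up for illustration):

import pymysql

# hypothetical connection parameters
conn = pymysql.connect(host='localhost', port=3306, user='test',
                       password='...', database='mydb', charset='utf8')
curs = conn.cursor()

sql = ("INSERT INTO naversbmapping(brand, startdate, enddate, cost, daycost) "
       "VALUES (%s, %s, %s, %s, %s) "
       "ON DUPLICATE KEY UPDATE brand = VALUES(brand), startdate = VALUES(startdate), "
       "enddate = VALUES(enddate), cost = VALUES(cost), daycost = VALUES(daycost)")

# each tuple supplies the five column values exactly once
data = [
    ('brandA', '2021-01-01', '2021-01-31', 310.0, 10.0),
    ('brandB', '2021-02-01', '2021-02-28', 280.0, 10.0),
]
curs.executemany(sql, data)
conn.commit()
conn.close()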

Python query runs but doesn't insert into MySQL

I have a Python script that connects to a DB, gets information, and then uses a few variables to insert this information back into the DB. I have tweaked the code until it runs with no errors, but in the end it doesn't actually insert anything into the DB. Here is the insert code I am using:
company_name = input("what is the company name?: ")
host=("")
ts = time.time()
timestamp = datetime.datetime.fromtimestamp(ts).strftime('%Y-%m-%d %H:%M:%S')
#Start DB connection
conn = mysql.connector.connect (user=dbUser,password=dbPassword,host=host,buffered=True)
cursor = conn.cursor()
#Print Database Information
select_db = ("")
cursor.execute(select_db)
api_keys = ("select * from api_keys order by API_KEYS_ID desc limit 1")
cursor.execute(api_keys)
#print(api_keys)
#for (api_keys) in cursor:
# print(api_keys[0])
#api_key_id = api_keys[0] + 1
#print(api_key_id)
api_keys_id = 170
##
add_api_key = ("INSERT INTO api_keys(api_keys_id, api_key, status, email, ip_address, filter_query, create_date, description) \
VALUES (%s, %s, %s, %s, %s, %s, %s, %s)", (api_keys_id , api_token , 'A' , email , '*' , '(OwnerPartyID:' + ownerPartyID + ') AND (SalesStatus:' + salesStatus + ')' , timestamp , company_name + ' key'));
cursor.execute(*add_api_key)
conn.commit()
print(add_api_key)
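For comparison, a minimal sketch of the more conventional pattern of passing the statement and its parameters to cursor.execute() as two separate arguments instead of bundling them into one tuple and unpacking it; the connection settings and values below are placeholders, not taken from the original script:

import datetime
import time
import mysql.connector

# hypothetical connection settings for illustration
conn = mysql.connector.connect(user='dbUser', password='dbPassword',
                               host='127.0.0.1', database='mydb', buffered=True)
cursor = conn.cursor()

timestamp = datetime.datetime.fromtimestamp(time.time()).strftime('%Y-%m-%d %H:%M:%S')

add_api_key = ("INSERT INTO api_keys "
               "(api_keys_id, api_key, status, email, ip_address, filter_query, create_date, description) "
               "VALUES (%s, %s, %s, %s, %s, %s, %s, %s)")
params = (170, 'sample-token', 'A', 'user@example.com', '*',
          '(OwnerPartyID:123) AND (SalesStatus:OPEN)', timestamp, 'Example key')

cursor.execute(add_api_key, params)      # statement and parameters passed separately
conn.commit()
print(cursor.rowcount, "row inserted")   # quick sanity check that the INSERT happened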

How to get web scraped data from Python/Beautiful Soup into a MySQL database

Although I am getting over 10 items as results in Python, right now only the last product appears in my MySQL database (with an id of 12, along with its information like price, picture, etc.). I need to fix it so that they all appear, not just one product.
Python code is below.
import requests
from bs4 import BeautifulSoup
import mysql.connector

url = 'https://www.newegg.com/Video-Cards-Video-Devices/Category/ID-38?Tpk=graphics%20card'
source = requests.get(url).text
soup = BeautifulSoup(source, 'lxml')

conn = mysql.connector.connect(host='127.0.0.1', user='x', database='scrape', password="x")
cursor = conn.cursor()

item_container = soup.find_all('div', class_='item-container')

def get_data():
    lists = []

    for index, item_name in enumerate(item_container):
        name = item_name.find_all('a', class_='item-title')[0].text
        lists.append({'name': name})
        lists[index]['index'] = index

    for index, item_price in enumerate(item_container):
        price = item_price.find('li', class_='price-current').find('strong')
        if price == None:
            price == ('Not Available')
            lists[index]['price'] = price
        else:
            price = ('$' + price.text + '.99')
            prices = []
            lists[index]['price'] = price

    for index, item_picture in enumerate(item_container):
        picture = 'http:' + item_picture.find('img', class_='lazy-img')['data-src']
        lists[index]['picture'] = picture

    for index, item_shipping in enumerate(item_container):
        shipping = (item_shipping.find('li', class_='price-ship').text).strip()
        lists[index]['shipping'] = shipping

    def create_table():
        val_index = lists[index]['index']
        val_name = lists[index]['name']
        val_picture = lists[index]['picture']
        val_price = lists[index]['price']
        val_shipping = lists[index]['shipping']

        add_item = ("INSERT INTO newegg "
                    "(id, itemname, itempic, itemprice, itemshipping) "
                    "VALUES (%s, %s, %s, %s, %s)")
        data_item = (val_index, val_name, val_picture, val_price, val_shipping)

        cursor.execute("DELETE FROM newegg ")
        conn.commit()

        cursor.execute(add_item, data_item)
        conn.commit()

        cursor.close()
        conn.close()

    create_table()

get_data()
So the main thing that needs fixing is create_table(). We don't want it to delete the table contents right before inserting each item, and we need to loop over all of the items in your lists. I would do that this way:
def create_table():
    cursor.execute("DELETE FROM newegg ")
    conn.commit()

    for product in lists:
        val_index = product['index']
        val_name = product['name']
        val_picture = product['picture']
        val_price = product['price']
        val_shipping = product['shipping']

        add_item = ("INSERT INTO newegg "
                    "(id, itemname, itempic, itemprice, itemshipping) "
                    "VALUES (%s, %s, %s, %s, %s)")
        data_item = (val_index, val_name, val_picture, val_price, val_shipping)

        cursor.execute(add_item, data_item)
        conn.commit()
Notice that create_table() also no longer closes the connection for you. I would recommend closing the connection in the same scope where you initialized it (in this case, the global scope): create_table() doesn't "own" the connection resource, so it should not be the one to destroy it. It would, however, make perfect sense to both open and close the connection inside a single function.
Also, note that this will clear out your table every time you run the scrape. That might be fine, but if you want your ids to stay stable over time, don't delete at the beginning and make the id column auto-increment instead.
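If you go the auto-increment route, a minimal sketch of what that could look like, reusing conn, cursor, and lists from above (the column types are an assumption about the schema, not taken from the question):

# one-time setup: let MySQL assign ids instead of reusing the scrape index
cursor.execute(
    "CREATE TABLE IF NOT EXISTS newegg ("
    "  id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,"
    "  itemname VARCHAR(255),"
    "  itempic VARCHAR(512),"
    "  itemprice VARCHAR(32),"
    "  itemshipping VARCHAR(64))"
)

# insert without the id column; MySQL fills it in
add_item = ("INSERT INTO newegg (itemname, itempic, itemprice, itemshipping) "
            "VALUES (%s, %s, %s, %s)")
for product in lists:
    cursor.execute(add_item, (product['name'], product['picture'],
                              product['price'], product['shipping']))
conn.commit()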

Python to MySQLdb will not pass variables I think I have tried everything

I am trying to store some TV information in MySQL via MySQLdb. I have tried about everything and I cannot get the variables to insert. There is information in the variables, as I am able to print it.
My Code:
import pytvmaze
import MySQLdb

AddShow = pytvmaze.get_show(show_name='dexter')
MazeID = AddShow.maze_id
ShowName = "Show" + str(MazeID)
show = pytvmaze.get_show(MazeID, embed='episodes')

db = MySQLdb.connect("localhost", "root", "XXXXXXX", "TVshows")
cursor = db.cursor()

for episode in show.episodes:
    Show = show.name
    ShowStatus = show.status
    ShowSummary = show.summary
    Updated = show.updated
    Season = episode.season_number
    Episode = episode.episode_number
    Title = episode.title
    AirDate = episode.airdate
    ShowUpdate = show.updated
    EpisodeSummary = episode.summary
    try:
        sql = "INSERT INTO " + ShowName + " VALUES (%s,%s,%s,%s,%s,%s,%s,%s,%s,%s)""" (Show,ShowStatus,ShowSummary,Updated,Season,Episode,Title,AirDate,ShowUpdate,EpisodeSummary)
        cursor.execute(sql)
        db.commit()
    except:
        db.rollback()

db.close()
Any thoughts? Thanks in advance.
EDIT - WORKING CODE
import pytvmaze
import MySQLdb

AddShow = pytvmaze.get_show(show_name='dexter')
MazeID = AddShow.maze_id
ShowNameandID = "Show" + str(MazeID)
show = pytvmaze.get_show(MazeID, embed='episodes')

db = MySQLdb.connect("localhost", "root", "letmein", "TVshows")
cursor = db.cursor()

for episode in show.episodes:
    ShowName = show.name
    ShowStatus = show.status
    ShowSummary = show.summary
    Updated = show.updated
    Season = episode.season_number
    Episode = episode.episode_number
    Title = episode.title
    AirDate = episode.airdate
    ShowUpdate = show.updated
    EpisodeSummary = episode.summary
    sql = "INSERT INTO " + ShowNameandID + """ VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)"""
    cursor.execute(sql, (ShowName, ShowStatus, ShowSummary, Updated, Season, Episode, Title, AirDate, ShowUpdate, EpisodeSummary))
    db.commit()
    print sql  ## Great for debugging

db.close()
First of all, you've actually made things more difficult for yourself by catching all exceptions with a bare try/except and then silently rolling back. Temporarily remove the try/except and see what the real error is, or log the exception in the except block. I bet the error is a syntax error in the query, since the quotes around the column values would be missing.
Anyway, arguably the biggest problem you have is how you pass the variables into the query. Currently you are using string formatting, which is strongly discouraged because of the danger of SQL injection and problems with type conversion. Parameterize your query:
sql = """
INSERT INTO
{show}
VALUES
(%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
""".format(show=Show)
cursor.execute(sql, (ShowStatus, ShowSummary, Updated, Season, Episode, Title, AirDate, ShowUpdate, EpisodeSummary))
Note that it is not possible to parameterize the table name (ShowName in your case); string formatting is used for it instead, so make sure you either trust the source, escape it manually via MySQLdb.escape_string(), or validate it with separate custom code.
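For instance, a minimal sketch of validating the interpolated table name before formatting it into the statement; the whitelist rule below is an illustrative assumption, not part of the original answer:

import re

def safe_table_name(name):
    # allow only letters, digits and underscores so nothing SQL-ish can sneak in
    if not re.match(r'[A-Za-z0-9_]+\Z', name):
        raise ValueError("unexpected table name: %r" % name)
    return name

sql = """
    INSERT INTO
        {table}
    VALUES
        (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
""".format(table=safe_table_name(ShowName))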
