I am creating a Twitter bot that searches Twitter for ticker symbols and sends them to a database. The problem I have is that the list being sent never changes.
Maybe I have to keep reconnecting to the database, but I can't figure out where my problem is. Can anyone figure out how to make the list of tickers different every time?
def searchTwit():
    tweets = api.search("#stocks", count=100)
    return tweets

print("connecting to database")
# connecting to the database
conn = pyodbc.connect(
    "Driver={SQL Server};"
    "Server=..............;"
    "Database=master;"
    "Trusted_Connection=yes;")
cursor = conn.cursor()

tickList = []

def getTicker(tweets):
    for tweet in tweets:
        if "$" in tweet.text:
            x = tweet.text.split()
            for i in x:
                if i.startswith("$") and i[1].isalpha():
                    i.strip(".")
                    i.upper()
                    tickList.append(i)
    # print(var_string)

def retrieveTickers():
    for i in tickList:
        cursor.execute('INSERT INTO master.dbo.TickerTable (TickerName) VALUES (?);', (i))
        conn.commit()

# thing to run
print("about to do while ")
while True:
    sleep(60 - time() % 60)
    print("searchtwit")
    searchTwit()
    theTweets = searchTwit()
    getTicker(theTweets)
    print("getting Tickers")
    retrieveTickers()
    print("sending tickers")
    print(tickList)
    tickList = []
    print(tickList)
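One thing worth checking in getTicker: str.strip and str.upper return new strings rather than modifying i in place, so in the code above the cleaned values are discarded and the raw tokens get appended. A minimal sketch of the fix (the helper name clean_ticker is mine, not from the original code):

```python
def clean_ticker(i):
    # Strings are immutable: strip()/upper() return new values,
    # so the results must be reassigned before appending.
    i = i.strip(".")
    i = i.upper()
    return i
```

For example, clean_ticker("$tsla.") gives "$TSLA", whereas the original loop would append "$tsla." unchanged.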
You can connect to a remote database or one on your local machine. Define which database you want to use; if your database server is 127.0.0.1:PORT, the database is on your own machine (the PORT will change depending on which DBMS you use).
I am trying to update my SQLite database using a Flask webhook.
The commands work fine if I type them manually in the Python console, but my Flask webhook doesn't update the SQLite database. The app seems to fail at the "cursor.execute()" line.
Here is my webhook code:
@app.route('/trendanalyser', methods=['POST'])
def trendanalyser():
    data = json.loads(request.data)
    if data['passphrase'] == config.WEBHOOK_PASSPHRASE:
        # Init update variables
        tastate = data['TrendAnalyser']
        date_format = datetime.today()
        date_update = date_format.strftime("%d/%m/%Y %H:%M:%S")
        update_data = ((tastate), (date_update))
        # Database connection
        connection = sqlite3.connect('TAState15min.db')
        cursor = connection.cursor()
        # Database update
        update_query = """UPDATE TrendAnalyser SET state = ?, date = ? WHERE id = 1"""
        cursor.execute(update_query, update_data)
        connection.commit()
        return "Record updated successfully"
        cursor.close()
    else:
        return {"invalide passphrase"}
Can you please tell me what's wrong with my code?
If it helps, here is my database structure (my db creation):
#Database connection
conn = sqlite3.connect("TAState15min.db")
cursor = conn.cursor()
#Create table
sql_query = """ CREATE TABLE TrendAnalyser (
id integer PRIMARY KEY,
state text,
date text
)"""
cursor.execute(sql_query)
#Create empty row with ID at 1
insert_query = """INSERT INTO TrendAnalyser
(id, state, date)
VALUES (1, 'Null', 'Null');"""
cursor.execute(insert_query)
conn.commit()
#Close database connection
cursor.close()
I finally found the issue: webhooks need the full path to the SQLite database to work. I just started coding in Python; it was a noob issue...
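For anyone hitting the same thing: sqlite3.connect resolves a relative filename against the process's current working directory, which for a Flask app run under a WSGI server is usually not the script's folder. A sketch of anchoring the path (the filename matches the question; using os.path like this is my suggestion, not the original code):

```python
import os

# Resolve the database file relative to the script itself, not the
# current working directory (a WSGI server often sets the cwd elsewhere,
# so a bare "TAState15min.db" points at the wrong place).
# __file__ is undefined in a REPL, hence the fallback to the cwd.
BASE_DIR = (os.path.dirname(os.path.abspath(__file__))
            if "__file__" in globals() else os.getcwd())
DB_PATH = os.path.join(BASE_DIR, "TAState15min.db")

# then: connection = sqlite3.connect(DB_PATH)
```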
I have a personal project to create a Telegram bot using Python. I want it to reply to any question with a dynamic answer generated from a database query. I don't want to run a query for every bot request, so my idea is to generate a set of data (a data frame) that the bot can take answers from. To generate the data frame, I want to schedule/reload the querying part of the script every x minutes. My goal is a Python script that reloads only the data-querying part without reloading the whole script. Is there any way to do this?
Sample code:
tt = datetime.now()
dsn_tns = cx_Oracle.makedsn(----)
conn = cx_Oracle.connect(user=----, password=----, dsn=dsn_tns)
cursor = conn.cursor()
sql = ("""select *
from TABLE
WHERE REPORTDATE > to_date(:tt,'DD-MM-YYYY HH24:MI:SS')""")
param = {"tt": tt}
data = psql.read_sql(sql,conn)#,params = param)
conn.close()
x = 2314 #value from question via bot
answer = data[data['number'] == x]
The part I want to reload regularly is from tt until conn.close().
I'm not sure why you don't want to rerun the query for each bot request; that would make more sense. You could also end up serving stale data if you don't refresh for each request.
However, you can just wrap the code between tt and conn.close() in a function which you can set to run periodically.
def update_data():
    global data
    tt = datetime.now()
    dsn_tns = cx_Oracle.makedsn(----)
    conn = cx_Oracle.connect(user=----, password=----, dsn=dsn_tns)
    cursor = conn.cursor()
    sql = ("""select *
              from TABLE
              WHERE REPORTDATE > to_date(:tt,'DD-MM-YYYY HH24:MI:SS')""")
    param = {"tt": tt}
    data = psql.read_sql(sql, conn)  # ,params = param)
    conn.close()
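To actually run that function every x minutes, one lightweight option (my suggestion; any scheduler would do) is threading.Timer, re-armed after each refresh:

```python
import threading

REFRESH_SECONDS = 300  # "x minutes"; adjust to taste

def refresh_loop(fetch, interval=REFRESH_SECONDS):
    """Call fetch() now, then schedule the next call in `interval` seconds."""
    fetch()  # e.g. the update_data function above
    t = threading.Timer(interval, refresh_loop, args=(fetch, interval))
    t.daemon = True  # don't keep the interpreter alive just for the timer
    t.start()
    return t
```

Calling refresh_loop(update_data) once starts the cycle; the returned Timer can be cancel()led to stop the next scheduled run.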
For our semester project we need to insert data from a stream that updates every 3.5 s into 2 separate SQL tables, and this has to be done with a Python script.
The idea is to have 2 rooms (Office, Server) that each have a combined temperature and humidity sensor, while the server room also has a smoke detector.
The data is sent from an Arduino over USB, and the data stream looks like this:
Server:61.20,22.70,221.00Office:64.00,23.00
The Python script I've managed to cobble together looks like this:
import serial
import time
import mysql.connector

mydb = mysql.connector.connect(
    host="127.0.0.1",
    user="root",
    password="",
    database="messwerte"
)
mycursor = mydb.cursor()

device = 'COM3'
try:
    arduino = serial.Serial(device, 9600)
except serial.SerialException:
    print("Error: could not open {}".format(device))

while True:
    try:
        time.sleep(2)
        data = arduino.readline()
        print(data)
        pieces = data.split(" ")
        try:
            mycursor.execute("INSERT INTO dht11serial (humidity,temperature,CO2) VALUES (%s,%s,%s)", (pieces[0], pieces[1], pieces[2]))
            mydb.commit()
            mycursor.close()
        except mysql.connector.IntegrityError as err:
            print("Error: {}".format(err))
    except Exception as err:
        print("Error: {}".format(err))
Now I would need to insert the values for each room into an SQL table that corresponds to that room, but how can I manage that?
Please keep in mind that I have essentially no idea what I'm doing. It's my very first time doing anything with Python or SQL.
I created this script based on the data stream you posted, using the same configuration for the Arduino connection. The script generates the database file and the two tables if they do not exist, so there is no need to worry about losing data each time you run it.
It should work right away (I hope); if not, you only need to adjust the data stream. I put some comments in the code to help you out. Basically, you just need to create two tables inside the same database and save the data twice.
I changed the SQL engine to sqlite3, since the data will be stored locally.
Put the code inside a .py file and run it! Tell me if it works.
import serial
import time
import sqlite3

# Database file is stored in the same folder where the script is located.
# The file will create itself if it does not exist.
conn = sqlite3.connect("./data.sqlite3")
cursor = conn.cursor()

# Server table inside data.sqlite3
server = """CREATE TABLE IF NOT EXISTS server (
    ID INTEGER PRIMARY KEY AUTOINCREMENT,
    humidity VARCHAR(255),
    temperature VARCHAR(255),
    CO2 VARCHAR(255))"""

# Office table inside data.sqlite3
office = """CREATE TABLE IF NOT EXISTS office (
    ID INTEGER PRIMARY KEY AUTOINCREMENT,
    humidity VARCHAR(255),
    temperature VARCHAR(255),
    CO2 VARCHAR(255))"""

# Creates the tables inside the database; nothing happens if they already exist.
# You can comment these lines out; they only matter when creating a new file.
cursor.execute(server)
cursor.execute(office)
conn.commit()

# Arduino connection
# The app makes no sense without this, so let it crash if there is no connection.
arduino = serial.Serial('COM3', 9600)

# Forever
while True:
    # Rest 3.5 s each loop
    time.sleep(3.5)
    # Catching errors inside a try-except keeps the loop going,
    # even if the Arduino connection gets lost.
    try:
        # Server:61.20,22.70,221.00Office:64.00,23.00,220.00
        # readline() returns bytes in Python 3, so decode to str first
        data = arduino.readline().decode("utf-8")
        # Remove "Server:" and replace "Office:" with a comma so both
        # readings end up in one comma-separated string
        cleaned_data = data.replace("Server:", "").replace("Office:", ",")
        # Cleaned data looks like this: 61.20,22.70,221.00,64.00,23.00,220.00
        # We know the first 3 values are from the server, the last 3 from the office.
        # Split the string into elements using commas:
        pieces = cleaned_data.split(",")
        # [ 61.20, 22.70, 221.00, 64.00, 23.00, 220.00 ]
        print(pieces)
        # Send data to the server table
        cursor.execute("INSERT INTO server (humidity,temperature,CO2) VALUES (?,?,?)", (pieces[0], pieces[1], pieces[2]))
        # Send data to the office table
        cursor.execute("INSERT INTO office (humidity,temperature,CO2) VALUES (?,?,?)", (pieces[3], pieces[4], pieces[5]))
        # Write the rows to the database
        conn.commit()
        # There is no need to close the connection to the database
    # Prints out what caused the error
    except Exception as e:
        print(e)
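One caveat: the sample stream in the question has only two values after Office: (no CO2 reading there), so the six-piece indexing above would raise an IndexError on that input. A parsing sketch that tolerates either shape (the float conversion is my addition, not part of the original stream handling):

```python
def parse_stream(line):
    """Split 'Server:61.20,22.70,221.00Office:64.00,23.00' into two value lists."""
    # Turn the line into "server-part|office-part", then split once.
    server_part, office_part = (
        line.strip().replace("Office:", "|").replace("Server:", "").split("|")
    )
    server = [float(v) for v in server_part.split(",")]
    office = [float(v) for v in office_part.split(",")]
    return server, office
```

parse_stream("Server:61.20,22.70,221.00Office:64.00,23.00") returns ([61.2, 22.7, 221.0], [64.0, 23.0]), so each room's list can be inserted into its own table regardless of how many sensors it has.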
I have been trying to insert data into my MongoDB collection but it's not working:
try:
    client = MongoClient(uri,
                         connectTimeoutMS=30000,
                         socketTimeoutMS=None)
    print("Connection successful")
    print()
except:
    print("Unsuccessful")

print(client)
print()

db = client["<database>"]
collection = db["<collection>"]
print(db)
print()
print(collection)
print()

doc = {"test": "success"}
collection.insert_one(doc)
print("success")
The uri variable is my connection string copied from MongoDB.
Everything works fine, even the db and collection variables print out fine, until I get to the line collection.insert_one(doc).
When I run it, it just stops at that line and I get a timeout error after a while. I am using the latest versions of Python and PyMongo.
So I resolved the issue:
1) I needed to configure the whitelist entries (https://docs.atlas.mongodb.com/security-whitelist/).
2) I needed to get off my university's WiFi, because they block certain things.
There might be a few things at play:
check if the URL is correct
check if you have write rights on the DB
from pymongo import MongoClient

try:
    client = MongoClient(uri,
                         connectTimeoutMS=30000,
                         socketTimeoutMS=None)
    print("Connection successful")
except:
    print("Unsuccessful")

db = client["<database>"]
doc = {"test": "success"}
db[collectionName].insert_one(doc)
I'm using Visual Studio 2017 with a Python console environment. I have a MySQL database set up which I can connect to successfully, and I can also insert data into the DB. Now I'm trying to display/fetch data from it.
I connect fine, and it seems I'm fetching data from my database, but nothing is actually printing to the console.
How do I actually display the data I select?
#importing module Like Namespace in .Net
import pypyodbc
#creating connection Object which will contain SQL Server Connection
connection = pypyodbc.connect('Driver={SQL Server};Server=DESKTOP-NJR6F8V\SQLEXPRESS;Data Source=DESKTOP-NJR6F8V\SQLEXPRESS;Integrated Security=True;Connect Timeout=30;Encrypt=False;TrustServerCertificate=True;ApplicationIntent=ReadWrite;MultiSubnetFailover=False')
cursor = connection.cursor()
SQLCommand = ("SELECT ID FROM MyAI_DB.dbo.WordDefinitions WHERE ID > 117000")
#Processing Query
cursor.execute(SQLCommand)
#Committing any pending transaction to the database.
connection.commit()
#closing connection
#connection.close()
I figured it out. I had failed to include the right print statement, which was:
print(cursor.fetchone())
I also had the connection.commit statement in the wrong place (it ran before the print statement ever executed). The final code that worked was this:
#importing module Like Namespace in .Net
import pypyodbc
#creating connection Object which will contain SQL Server Connection
connection = pypyodbc.connect('Driver={SQL Server};Server=DESKTOP-NJR6F8V\SQLEXPRESS;Data Source=DESKTOP-NJR6F8V\SQLEXPRESS;Integrated Security=True;Connect Timeout=30;Encrypt=False;TrustServerCertificate=True;ApplicationIntent=ReadWrite;MultiSubnetFailover=False')
cursor = connection.cursor()
SQLCommand = ("SELECT * FROM MyAI_DB.dbo.WordDefinitions")
#Processing Query
cursor.execute(SQLCommand)
#Committing any pending transaction to the database.
print(cursor.fetchone())
connection.commit()
#closing connection
#connection.close()
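As a side note, the fetchone/fetchall semantics are easy to try out with the standard-library sqlite3 module (the table here is a stand-in with made-up rows, not the real WordDefinitions data):

```python
import sqlite3

# In-memory database: nothing is written to disk
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE WordDefinitions (ID INTEGER)")
cur.executemany("INSERT INTO WordDefinitions VALUES (?)", [(1,), (2,), (3,)])

cur.execute("SELECT ID FROM WordDefinitions ORDER BY ID")
first = cur.fetchone()   # one row as a tuple: (1,)
rest = cur.fetchall()    # every remaining row: [(2,), (3,)]
conn.close()
```

The same pattern applies to a pypyodbc cursor: each fetchone() advances through the result set, and fetchall() drains whatever is left.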