sqlalchemy, postgres, docker - python

I am struggling to understand why the query below returns the following error when I paste it in directly from postgres, where it works fine. I have read about putting the table name in quotes, but this does not work :/.
from sqlalchemy import create_engine
db_name = 'blahh'
db_user = 'blahhd'
db_pass = 'blhd'
db_host = 'localhost'
db_port = 5432
## Connect to the database
db_string = 'postgres://{}:{}@{}:{}/{}'.format(db_user, db_pass, db_host, db_port, db_name)
db = create_engine(db_string)
connection = db.connect()
connection.execute('select cake.name, cake.industry, cake.created_at from cake limit 10;')
connection.close()
error:
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedTable) relation "cake" does not exist
LINE 1: ... cake limi...
^

As explained in the SQLAlchemy documentation, you are missing the connection handling: create_engine simply represents a connection resource, so you need to connect to it and later close the connection. You may also need to call .fetchall() on the result to actually view the list of rows returned by your query.
If you try something like this:
from sqlalchemy import create_engine
db_name = 'blahh'
db_user = 'blahhd'
db_pass = 'blhd'
db_host = 'localhost'
db_port = 5432
## Connect to the database
db_string = 'postgres://{}:{}@{}:{}/{}'.format(db_user, db_pass, db_host, db_port, db_name)
db = create_engine(db_string)
connection = db.connect()
results = connection.execute('select cake.name, cake.industry, cake.created_at from cake limit 10;').fetchall()
print(results)
connection.close()
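If the error persists even with the connection handled this way, it usually means the engine is pointed at a different database or schema than the one you use in psql (easy to do with Docker port mappings). A quick, hedged way to check, reusing the db_string from above and the SQLAlchemy 1.x style of the question, is to list what the connected role can actually see:

from sqlalchemy import create_engine

db = create_engine(db_string)
connection = db.connect()
# Confirm which database and schemas this connection is actually using,
# then list the tables visible to it - "cake" should appear here.
print(connection.execute('select current_database(), current_schemas(true);').fetchall())
print(connection.execute("select table_schema, table_name from information_schema.tables where table_type = 'BASE TABLE';").fetchall())
connection.close()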

Related

Is there some way to solve this SQL connection problem (for MySQL) in Google Colab?

Way 1:
from sqlalchemy import create_engine
db = create_engine('mysql://root:' + 'password' + '@ip:3306/dbname')
Way 2:
import pymysql

def db_connecting(db_name):
    path = 'file_path'  # file containing id, pw, host
    f = open(path)
    id, pw, host = f.read().split()
    db = pymysql.connect(
        user = id,
        port = 3306,
        passwd = pw,
        host = host,
        db = db_name,
        charset = 'utf8',
        cursorclass = pymysql.cursors.DictCursor
    )
    return db
Code:
import pandas as pd
a = pd.read_sql('select * from comtbl limit 100', db)
print(a)
I tried these ways to connect to MySQL from my local machine and they work; they fail only in Colab.
When I run the same code on my laptop, which has a different IP from my local machine, it also works well.
Is there any way to solve this problem? I have to use Google Colab for this project.
When I try this in Colab I get an 'OperationalError', and I found that it occurs when the connection fails.
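Since the same code works outside Colab, the first thing to rule out is plain network reachability: the Colab VM runs on Google's infrastructure and cannot reach a MySQL server on your local or private network unless that server is publicly reachable (or exposed through a tunnel). A minimal sketch to test this from a Colab cell, with a hypothetical host name you would replace with your server's public address:

import socket

host, port = 'your-mysql-host.example.com', 3306  # hypothetical placeholder

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(5)
try:
    s.connect((host, port))
    print('TCP connection OK - the OperationalError is likely credentials or MySQL config')
except (socket.timeout, OSError) as exc:
    print('Colab cannot reach this server at all:', exc)
finally:
    s.close()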

Python querying Redshift fails on "Connection reset by peer" but works with dbeaver

I am trying to query my Redshift DB using Python. I tried both of the following:
with sqlalchemy:
import sqlalchemy as sa
from sqlalchemy.orm import sessionmaker

connection_string = "redshift+psycopg2://%s:%s@%s:%s/%s" % (USER, PASS, HOST, str(PORT), DATABASE)
engine = sa.create_engine(connection_string)
session = sessionmaker()
session.configure(bind=engine)
sess = session()
sess.execute('SELECT * FROM MY_TABLE LIMIT 1;')
with redshift_connector:
import redshift_connector

conn = redshift_connector.connect(
    host=HOST,
    port=PORT,
    database=DATABASE,
    user=USER,
    password=PASS)
cursor = conn.cursor()
cursor.execute('SELECT * FROM MY_TABLE LIMIT 1;')
all_results = cursor.fetchall()
conn.close()
Both are returning 'Connection reset by peer', while when I connect using DBeaver I am able to run this query without any problems. Is there anything I might be missing?
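Since DBeaver reaches the cluster fine, the usual suspects are SSL or idle-connection behaviour on the Python side rather than the credentials. One thing worth trying, as a sketch only, is forcing SSL and TCP keepalives through psycopg2's standard libpq options when building the engine (reusing the USER/PASS/HOST/PORT/DATABASE placeholders from the question); whether this fixes the reset depends on your network setup:

import sqlalchemy as sa

connection_string = "redshift+psycopg2://%s:%s@%s:%s/%s" % (USER, PASS, HOST, str(PORT), DATABASE)

# sslmode and the keepalive settings are standard libpq parameters that
# psycopg2 passes straight through to the server connection.
engine = sa.create_engine(
    connection_string,
    connect_args={
        "sslmode": "require",
        "keepalives": 1,
        "keepalives_idle": 60,
        "keepalives_interval": 10,
        "keepalives_count": 5,
    },
)

with engine.connect() as conn:
    print(conn.execute('SELECT * FROM MY_TABLE LIMIT 1;').fetchall())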

Is it possible to have the Postgres database credentials in a separate file and call them in another file using PyCharm?

I am using the community version of PyCharm. My hope was to put the sensitive database connection credentials in a separate file in my PyCharm project, so that if I shared the other files that contain the actual code, they wouldn't have my connection info. Here is what the "connect1.py" file contains:
import psycopg2
# Database Credentials
DB_HOST = "localhost"
DB_NAME = "movie_watchlist1"
DB_USER = "postgres"
DB_PASS = "postgres123"
def database_credentials():
    psycopg2.connect(dbname=DB_NAME, user=DB_USER, password=DB_PASS, host=DB_HOST)
    pass
Here are the lines in my database.py file that try to access this information:
import psycopg2
from connect1 import database_credentials
connection = psycopg2.connect(database_credentials())
And here is the error:
TypeError: missing dsn and no parameters
I think the problem is with the "connection = psycopg2.connect(database_credentials())" line, but I haven't been able to figure it out. Any help or suggestions would be greatly appreciated.
database_credentials() never returns the connection object, so database.py ends up calling psycopg2.connect(None), which is what raises the "missing dsn and no parameters" error. Return the connection from connect1.py and use it directly:
# connect1.py
import psycopg2

# Database Credentials
DB_HOST = "localhost"
DB_NAME = "movie_watchlist1"
DB_USER = "postgres"
DB_PASS = "postgres123"

def database_credentials():
    return psycopg2.connect(dbname=DB_NAME, user=DB_USER, password=DB_PASS, host=DB_HOST)

# database.py
from connect1 import database_credentials
connection = database_credentials()
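Since the whole point is to keep credentials out of the code you share, a common refinement is to read them from environment variables (or a config file that is not committed) instead of hard-coding them in connect1.py. A minimal sketch of that idea, where the variable names are just examples:

# connect1.py (alternative sketch using environment variables)
import os
import psycopg2

def database_credentials():
    # Set DB_NAME, DB_USER, DB_PASS (and optionally DB_HOST) in your shell
    # or in the PyCharm run configuration; they never appear in the code.
    return psycopg2.connect(
        dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
        host=os.environ.get("DB_HOST", "localhost"),
    )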

Connect to MySQL via SSH Tunnelling | MySQL Connection not available

I am using the following Python snippet to connect to my MySQL database on a shared hosting server.
import mysql.connector
import sshtunnel
with sshtunnel.SSHTunnelForwarder(
    ('server.web-hosting.com', 21098),
    ssh_username = 'ssh_username',
    ssh_password = 'ssh_pass!23',
    remote_bind_address = ('127.0.0.1', 3306)
) as tunnel:
    connection = mysql.connector.MySQLConnection(
        user = 'db_user',
        password = 'db_pass',
        host = '127.0.0.1',
        port = tunnel.local_bind_port,
        database = 'demo',
    )
    mycursor = connection.cursor()
    query = "SELECT * FROM sample_table"
    mycursor.execute(query)
I am getting the following error. I am able to connect to the database using DBeaver though.
MySQL Connection not available.
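One commonly suggested change, assuming the tunnel itself comes up (the fact that DBeaver works suggests it does): build the connection with mysql.connector.connect() instead of instantiating MySQLConnection directly, and keep all the database work inside the with block so the tunnel is still open while you query. A hedged sketch along those lines:

import mysql.connector
import sshtunnel

with sshtunnel.SSHTunnelForwarder(
    ('server.web-hosting.com', 21098),
    ssh_username='ssh_username',
    ssh_password='ssh_pass!23',
    remote_bind_address=('127.0.0.1', 3306)
) as tunnel:
    # connect() is the documented entry point and establishes the session
    # immediately, so a tunnelling problem surfaces as a clear error here.
    connection = mysql.connector.connect(
        user='db_user',
        password='db_pass',
        host='127.0.0.1',
        port=tunnel.local_bind_port,
        database='demo',
    )
    mycursor = connection.cursor()
    mycursor.execute("SELECT * FROM sample_table")
    print(mycursor.fetchall())
    mycursor.close()
    connection.close()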

general connection to a mysql server

Is there a way to make a general connection to a MySQL server and not specifically to any one of its databases? I found the following code snippet, but its connect call connects to a specific database called employees.
import mysql.connector
cnx = mysql.connector.connect(user='scott', password='tiger', host='127.0.0.1', database='employees')
cnx.close()
Yes, you can make the same connection without specifying the database name:
cnx = mysql.connector.connect(user='scott', password='tiger', host='127.0.0.1')
It would be the same as connecting from the terminal using:
mysql -h 127.0.0.1 -u scott -ptiger
Note: 127.0.0.1 is your localhost.
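With a server-level connection like this, you can inspect what is available and switch to a database later from the same connection. A short sketch, reusing the scott/tiger credentials from the question:

import mysql.connector

cnx = mysql.connector.connect(user='scott', password='tiger', host='127.0.0.1')
cursor = cnx.cursor()

# See which databases exist, then pick one when you actually need it.
cursor.execute('SHOW DATABASES')
print(cursor.fetchall())
cursor.execute('USE employees')  # or: cnx.database = 'employees'

cursor.close()
cnx.close()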
Also, I usually do not store the actual connection information in the script. I would do something more like this (if you can):
import getpass
import mysql.connector

def CloseConnection(cnxIn, cursorIn):
    cursorIn.close()
    cnxIn.close()
    return

user = input('Enter your user name: ')
user = user.strip()
password = getpass.getpass()
host = input('Enter the host name: ')
host = host.strip()

cnx = mysql.connector.connect(user=user, password=password, host=host)
cursor = cnx.cursor(buffered=True)
cursor.execute('select VERSION()')
row = cursor.fetchone()
print(row)
CloseConnection(cnx, cursor)
