Write pandas DataFrame to MySQL database with SSH - Python

The Problem
I would like to use pandas' to_sql to write a DataFrame to a MySQL table. However, my connection requires SSH.
What I have tried
I have a successful connection for executing queries with pymysql, but being able to use a function like to_sql to push the data directly would make my life a lot easier. See below for the code I'm working with.
from sshtunnel import SSHTunnelForwarder
import pymysql as db
import pandas as pd
import numpy as np

# SSH variables
host = 'host'
localhost = 'localhost'
ssh_username = 'ssh_username'
private_key = '/path/'
port = 3306  # MySQL port (not shown in the original snippet; assuming the default)

# database variables
user = 'user'
password = 'password'
database = 'database'

# query function that works for pulling from the database
def query(q):
    with SSHTunnelForwarder(
        (host, 22),
        ssh_username=ssh_username,
        ssh_private_key=private_key,
        ssh_private_key_password="password",
        remote_bind_address=(localhost, port)
    ) as server:
        conn = db.connect(host=localhost,
                          port=server.local_bind_port,
                          user=user,
                          passwd=password,
                          db=database)
        return pd.read_sql_query(q, conn)
# what you need for to_sql
conn = db.connect(host=host,
                  port=port,
                  user=user,
                  password=password,
                  db=database)
# test df
np.random.seed(0)
number_of_samples = 10
frame = pd.DataFrame({
    'feature1': np.random.random(number_of_samples),
    'feature2': np.random.random(number_of_samples),
    'class': np.random.binomial(2, 0.1, size=number_of_samples),
}, columns=['feature1', 'feature2', 'class'])

# to_sql
frame.to_sql(con=conn, name='test_table', if_exists='replace', flavor='mysql')
Maybe Something Else?
I'm also considering writing the DataFrame to a CSV file and then importing that into the database. Please let me know if you know how to use something like to_sql over SSH.

I ended up using local port forwarding to solve this problem.
This is what I used in the terminal for local port forwarding:
ssh -N -v SSH_user@SSH_host -L 3306:127.0.0.1:3306
I used sqlalchemy for the connection:
from sqlalchemy import create_engine
engine = create_engine("mysql://user:password#127.0.0.1:3306/db?charset=utf8"
df.to_sql(con=engine, name='test_table', if_exists='replace')
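If you would rather not keep a separate terminal window open for the tunnel, the same port forwarding can be done from Python with sshtunnel and the engine pointed at the forwarded port. A minimal sketch, reusing the placeholder host, key path, and credentials from the question:

from sshtunnel import SSHTunnelForwarder
from sqlalchemy import create_engine

# placeholder SSH and DB credentials -- substitute your own
with SSHTunnelForwarder(
        ('host', 22),
        ssh_username='ssh_username',
        ssh_pkey='/path/',
        remote_bind_address=('127.0.0.1', 3306)) as tunnel:
    engine = create_engine(
        'mysql+pymysql://user:password@127.0.0.1:{}/database?charset=utf8'
        .format(tunnel.local_bind_port))
    # pandas talks to the engine, which talks through the tunnel
    frame.to_sql(con=engine, name='test_table', if_exists='replace')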

Related

MySQL connection with Python is not working

I want to connect to MySQL through Python. I can connect successfully by writing the credentials (user name, passwd, etc.) directly, but I want to make it a little more dynamic. So I tried:
import mysql.connector
import pandas as pd

def connection():
    host = input('Enter host')
    user = input('Enter User Name')
    passwd = int(input('passwd'))
    database = input('database')
    mydb = mysql.connector.connect(host="host", user="user", passwd="passwd", database="database")
    return mydb

connection()
However, it gives an error in the end:
InterfaceError: 2003: Can't connect to MySQL server on 'host:3306' (11001 getaddrinfo failed)
Please help me out. And if you have a better solution for bringing in dynamic user-input credentials, I will be really grateful. Thank you.
You are passing string literals like "host" to the database, not the variables.
So remove the double quotes:
import mysql.connector
import pandas as pd

def connection():
    host = input('Enter host')
    user = input('Enter User Name')
    passwd = input('passwd')  # keep the password as a string
    database = input('database')
    mydb = mysql.connector.connect(host=host, user=user, passwd=passwd, database=database)
    return mydb

connection()
In the connect method of mysql.connector you are hardcoding the values by passing them as string literals.
Use mysql.connector.connect(host=host, user=user, passwd=passwd, database=database) instead.
You're also casting the password to an int; MySQL won't like that, just use it as a string.
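If you want the credentials to stay dynamic but not echo the password to the screen, the standard library's getpass module is a drop-in replacement for input here. A small sketch (the prompts are illustrative):

import getpass
import mysql.connector

def connection():
    host = input('Enter host: ')
    user = input('Enter user name: ')
    passwd = getpass.getpass('Password: ')  # read without echoing, kept as a string
    database = input('Database: ')
    return mysql.connector.connect(host=host, user=user, passwd=passwd, database=database)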

Is it possible to locally connect with python to an RDS instance in a VPC?

I have an RDS (Postgres) instance operational in a VPC. I have successfully configured Lambda functions to connect, but I would like to be able to connect locally from my computer. Is this something that is possible? I have read about using an EC2 bastion host, but I would really like to just be able to run it without any additional architecture.
I've tried modifying this code from a different SO answer to no avail:
import psycopg2
from sshtunnel import SSHTunnelForwarder
import time

with SSHTunnelForwarder(
        ('database host', 22),
        ssh_password="password",
        ssh_username="user",
        remote_bind_address=('127.0.0.1', 5432)) as server:
    conn = psycopg2.connect(database="db", port=server.local_bind_port)
    curs = conn.cursor()
    sql = "select * from tableA limit 10;"
    curs.execute(sql)
    rows = curs.fetchall()
    print(rows)

Python MySQL Connection with SSH

Hi, I have shared hosting that I bought, and it allows remote MySQL connections only over SSH.
As far as I know, it doesn't use any public or private keys.
Here is my connection setup in MySQL Workbench, which works when I try to connect (screenshot omitted).
I have looked at another Stack Overflow question, but none of the answers seem to work for me. :/ I'm really at a dead end and I need to get this to work. Can someone help me out, please?
So I figured it out after about a million rounds of trial and error:
import pymysql
import paramiko
import pandas as pd
from paramiko import SSHClient
from sshtunnel import SSHTunnelForwarder

ssh_host = '198.54.xx.xx'
ssh_host_port = 21098  # your SSH port
ssh_username = "sshuser123"  # change this
ssh_password = "sshpassword123"  # change this

db_user = 'db user'  # change this
db_password = 'password123'  # change this
db = 'main_db'  # the db that the user is linked to

with SSHTunnelForwarder(
        (ssh_host, ssh_host_port),
        ssh_username=ssh_username,
        ssh_password=ssh_password,
        remote_bind_address=('127.0.0.1', 3306)) as tunnel:
    conn = pymysql.connect(host='127.0.0.1', user=db_user,
                           passwd=db_password, db=db,
                           port=tunnel.local_bind_port)
    query = '''SELECT * from tablename;'''
    data = pd.read_sql_query(query, conn)
    print(data)
    conn.close()
This is the code you should use if your SSH connection to MySQL doesn't use a public/private key.
Hope this helps anyone facing the same issue!
Connect to the server 198.54.x.240:21098 via SSH with port forwarding, e.g.:
ssh -t -gL 33069:localhost:3306 -p 21098 198.54.x.240
On Windows use PuTTY; I like KiTTY (a PuTTY fork). Add the connection and the SSH tunnel in its settings (the original answer included screenshots).
Then connect to MySQL via localhost:33069 (the part you already know). MySQL Workbench does the same thing, but forwards 3306 to 3306. If you need more than one remote connection, best practice is to forward each one to a different local port.
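Once the tunnel is up, the Python side is an ordinary local connection to the forwarded port. A minimal sketch against the tunnel above (credentials and table name are placeholders):

import pymysql
import pandas as pd

# local port 33069 is forwarded to the server's MySQL on 3306
conn = pymysql.connect(host='127.0.0.1', port=33069,
                       user='db_user', passwd='db_password', db='main_db')
data = pd.read_sql_query('SELECT * FROM tablename;', conn)
conn.close()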

SQLAlchemy through Paramiko SSH

I have a database on a server which I need to access through SSH. Right now I deal with the DB by using the command line to get the data.
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname='XX.XX.XX', username='user', password='pass', port = YYY)
query = "mysql -u " + username_sql + " -p" + password_sql +" dbb -e \"" + sql_query + "\""
ssh.exec_command(query.decode('string_escape'))
ssh.close()
Is there a way to do this with SQLAlchemy to be more efficient and so I can work with pandas DataFrames directly?
from sqlalchemy import create_engine
engine = create_engine(
"mysql://username_sql:password_sql#localhost/dbb")
In case anyone is interested in connecting to a remote PostgreSQL database via SSH and loading data into a pandas DataFrame, here is how to do it.
Suppose we have a PostgreSQL database installed on a remote server, to which we can SSH with the following parameters.
SSH parameters:
Server's ip: 10.0.0.101
SSH port: 22 (default port for SSH)
Username: my_username
Password: my_password
Database parameters:
Port: 5432 (postgresql default port)
Database name: db
Database user: postgres_user (default username is postgres)
Database password: postgres_pswd (default password is an empty string)
Table with our data: MY_TABLE
Now, we want to connect to this database on our end and load data into a pandas DataFrame:
from sshtunnel import SSHTunnelForwarder
from sqlalchemy import create_engine
import pandas as pd

server = SSHTunnelForwarder(
    ('10.0.0.101', 22),
    ssh_username="my_username",
    ssh_password="my_password",
    remote_bind_address=('127.0.0.1', 5432)
)
server.start()

local_port = str(server.local_bind_port)
engine = create_engine('postgresql://{}:{}@{}:{}/{}'.format("postgres_user", "postgres_pswd", "127.0.0.1", local_port, "db"))
dataDF = pd.read_sql("SELECT * FROM \"{}\";".format("MY_TABLE"), engine)

server.stop()
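One caveat with the start()/stop() pattern: if anything in between raises, the tunnel stays open. SSHTunnelForwarder also works as a context manager, which guarantees cleanup; a sketch with the same placeholder parameters:

with SSHTunnelForwarder(
        ('10.0.0.101', 22),
        ssh_username="my_username",
        ssh_password="my_password",
        remote_bind_address=('127.0.0.1', 5432)) as server:
    engine = create_engine('postgresql://postgres_user:postgres_pswd@127.0.0.1:{}/db'
                           .format(server.local_bind_port))
    dataDF = pd.read_sql('SELECT * FROM "MY_TABLE";', engine)
# the tunnel closes automatically here, even on error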
The easiest way to do this would be to run an SSH tunnel to the mysql port on the remote host. For example:
ssh -f user@XX.XX.XX.XX -L 3307:mysql1.example.com:3306 -N
Then connect locally with SQLAlchemy:
engine = create_engine("mysql://username_sql:password_sql#localhost:3307/dbb")
If you really want to use Paramiko, try the forwarding demo code in the Paramiko repo, or the sshtunnel module. The ssh command is probably the easiest method, though, and you can use autossh to restart the tunnel if it goes down.
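For reference, an autossh invocation mirroring the tunnel above might look like the following (assuming autossh is installed; -M 0 disables autossh's monitor port in favor of SSH's own keepalives):
autossh -M 0 -f -N -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" -L 3307:mysql1.example.com:3306 user@XX.XX.XX.XX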
You could use the SSHTunnel library as follows:
from sshtunnel import SSHTunnelForwarder  # run pip install sshtunnel
from sqlalchemy import create_engine      # run pip install sqlalchemy
from sqlalchemy.orm import sessionmaker

with SSHTunnelForwarder(
        ('10.160.1.24', 22),  # remote server IP and SSH port
        ssh_username="<usr>",
        ssh_password="<pwd>",
        remote_bind_address=('127.0.0.1', 5432)
        ) as server:
    print('Server connected via SSH')  # the with block starts the tunnel for us

    # connect to PostgreSQL
    local_port = str(server.local_bind_port)
    engine = create_engine('postgresql://<db_user>:<db_pwd>@127.0.0.1:' + local_port + '/<db_name>')

    Session = sessionmaker(bind=engine)
    session = Session()
    print('Database session created')

    # test data retrieval
    test = session.execute("SELECT * FROM <table_name>")
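Since the question was about working with pandas DataFrames directly, the same engine can be handed straight to pandas while the tunnel is still up, i.e. inside the with block (placeholder table name as above):

    import pandas as pd
    # read query results into a DataFrame through the tunnel
    df = pd.read_sql("SELECT * FROM <table_name>", engine)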
Just swap in the (host, port) of the server running Postgres:
from sshtunnel import SSHTunnelForwarder  # run pip install sshtunnel
from sqlalchemy import create_engine

server = SSHTunnelForwarder(
    (<'your host'>, <host port>),
    ssh_username=<"os remote username">,
    ssh_pkey=<'path/to/key.pem'>,  # or ssh_password.
    remote_bind_address=(<'postgres db host'>, <'postgres db port'>))
server.start()

connection_data = 'postgresql://{user}:{password}@{host}:{port}/{db}'.format(user=<'postgres user'>,
                                                                             password=<'postgres password'>,
                                                                             host=server.local_bind_host,
                                                                             port=server.local_bind_port,
                                                                             db=<'postgres db name'>)
engine = create_engine(connection_data)

# Do your queries

server.stop()
I'm going to piggyback on @Matin Kh with a non-PostgreSQL database: MySQL on PythonAnywhere.com.
This code will take a table and convert it to an Excel file.
import sshtunnel
import sqlalchemy
import pymysql
import pandas as pd
from pandas import ExcelWriter
import datetime as dt
from sshtunnel import SSHTunnelForwarder

server = SSHTunnelForwarder(
    ('ssh.pythonanywhere.com'),
    ssh_username='username',
    ssh_password='password',
    remote_bind_address=('username.mysql.pythonanywhere-services.com', 3306))
server.start()

local_port = str(server.local_bind_port)
db = 'username$database'
engine = sqlalchemy.create_engine(f'mysql+pymysql://username:password@127.0.0.1:{local_port}/{db}')
print('Engine Created')

df_read = pd.read_sql_table('tablename', engine)
print('Grabbed Table')

writer = ExcelWriter('excelfile.xlsx')
print('writer created')
df_read.to_excel(writer, 'Sheet1')  # second argument is the sheet name
print('df to excel')
writer.save()
print('saved')

server.stop()
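A small note on the Excel step: newer pandas versions removed writer.save() in favor of close(), and ExcelWriter is most naturally used as a context manager; a sketch:

with pd.ExcelWriter('excelfile.xlsx') as writer:
    df_read.to_excel(writer, sheet_name='Sheet1')
# the file is written when the with block exits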

Connecting to remote PostgreSQL database over SSH tunnel using Python

I have a problem connecting to a remote database over an SSH tunnel (currently I'm trying with Paramiko). Here is my code:
#!/usr/bin/env python3
import psycopg2
import paramiko
import time
#ssh = paramiko.SSHClient()
#ssh.load_system_host_keys()
#ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
#ssh.connect('pluton.kt.agh.edu.pl', 22, username='aburban', password='pass')
t = paramiko.Transport(('pluton.kt.agh.edu.pl', 22))
t.connect(username="aburban", password='pass')
c = paramiko.Channel(t)
conn = psycopg2.connect(database="dotest")
curs = conn.cursor()
sql = "select * from tabelka"
curs.execute(sql)
rows = curs.fetchall()
print(rows)
The problem is that the program always tries to connect to the local database. I tried other SSH tunnels and had the same situation. The database on the remote server exists and works fine over a "classical" SSH connection via the terminal.
You can try the sshtunnel module, which uses Paramiko and is Python 3 compatible.
Hopefully it helps you... I scratched my head for a while too, trying to do this within Python code and avoid external SSH tunnels; we should thank the developers who wrap complex libraries into simple code!
It's simple: generate a tunnel from a local port to port 5432 on the remote server's localhost, then connect through localhost to the remote DB.
Here is sample working code for your needs:
#!/usr/bin/env python3
import psycopg2
from sshtunnel import SSHTunnelForwarder
import time

with SSHTunnelForwarder(
        ('pluton.kt.agh.edu.pl', 22),
        ssh_password="password",
        ssh_username="aburban",
        remote_bind_address=('127.0.0.1', 5432)) as server:
    conn = psycopg2.connect(database="dotest", port=server.local_bind_port)
    curs = conn.cursor()
    sql = "select * from tabelka"
    curs.execute(sql)
    rows = curs.fetchall()
    print(rows)
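If the server only accepts key authentication, the same call works with a key file instead of a password; a sketch with a hypothetical key path:

with SSHTunnelForwarder(
        ('pluton.kt.agh.edu.pl', 22),
        ssh_username="aburban",
        ssh_pkey="/home/aburban/.ssh/id_rsa",  # hypothetical key path
        remote_bind_address=('127.0.0.1', 5432)) as server:
    conn = psycopg2.connect(database="dotest", port=server.local_bind_port)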
