Trying to Connect Python With a PostgreSQL DB with SSH Tunnel - python

I'm trying to connect my script to a PostgreSQL database via sshtunnel and psycopg2. I can access the DB with my credentials through DBeaver. So, basically I have this:
SSH Configuration
Where the SSH parameters are similar to this:
Host/IP: 'someproxy.something.io'
User Name: 'User1'
Private key: Path to an id_rsa file
Passphrase: 'Password1'
This is my code:
try:
    print('Connecting to the postgre database...')
    with SSHTunnelForwarder(
        "someproxy.something.io",
        ssh_username = "User1",
        ssh_private_key = "C:/~ /.ssh/User1_id_rsa",
        ssh_private_key_password = "Password1",
        remote_bind_adress = ("mydatabase-postgres.something.io", 5432)
    ) as server:
        server.start()
And I'm getting this error:
ValueError: Unknown arguments: {'remote_bind_adress': ('mydatabase-postgres.something.io', 5432)}

You have a typo in the parameter name: remote_bind_adress -> remote_bind_address.
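For completeness, here is a minimal sketch of the corrected call with a psycopg2 connection through the tunnel's local bind port. The host names, key path, and credentials are the question's placeholders, and the database name and DB credentials are hypothetical; ssh_pkey is used instead of the deprecated ssh_private_key argument.
import psycopg2
from sshtunnel import SSHTunnelForwarder

# A minimal sketch, assuming the placeholder hosts/credentials from the question.
with SSHTunnelForwarder(
    "someproxy.something.io",
    ssh_username="User1",
    ssh_pkey="C:/~ /.ssh/User1_id_rsa",
    ssh_private_key_password="Password1",
    remote_bind_address=("mydatabase-postgres.something.io", 5432),  # note the fixed spelling
) as server:
    # The context manager already starts the tunnel; server.start() is not needed.
    conn = psycopg2.connect(
        host="127.0.0.1",
        port=server.local_bind_port,   # local end of the tunnel
        dbname="mydatabase",           # hypothetical database name
        user="db_user",                # hypothetical DB credentials
        password="db_password",
    )
    with conn.cursor() as cur:
        cur.execute("SELECT version();")
        print(cur.fetchone())
    conn.close()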

Related

connect to MongoDB through ssh tunnel using python

I'm new to Python. I was asked to connect to MongoDB through an SSH tunnel and then insert data from MongoDB into an Oracle DB. I've already sent my public key to the DB team and they sent me the information below:
ssh arash@34.58.115.11
DB info :
host: localhost
port: 27017
db_name: mymongodb
I already tried ssh_pymongo and sshtunnel, but neither worked. I would be thankful if you could help me in this regard.
from ssh_pymongo import MongoSession
session = MongoSession('34.58.115.11',user='arash')
db = session.connection['mymongodb']
#mycols = db["properties"]
#li = mycols.find_one()
#print(li)
list(db.collection.find({}))
I also tried sshtunnel:
from sshtunnel import SSHTunnelForwarder
server = SSHTunnelForwarder(
    'alfa.8iq.dev',
    ssh_username="arash",
    remote_bind_address=('34.58.115.11'))
server.start()
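Given only the details quoted in the question (SSH to arash@34.58.115.11, MongoDB listening on localhost:27017 on that host), a minimal sketch with sshtunnel and pymongo might look like the following. The key path is a hypothetical placeholder, and note that remote_bind_address must be a (host, port) tuple:
from pymongo import MongoClient
from sshtunnel import SSHTunnelForwarder

# A minimal sketch, assuming the SSH/DB details quoted in the question.
with SSHTunnelForwarder(
    ('34.58.115.11', 22),
    ssh_username='arash',
    ssh_pkey='~/.ssh/id_rsa',                   # hypothetical path to the private key
    remote_bind_address=('127.0.0.1', 27017),   # MongoDB listens on localhost on the server
) as tunnel:
    client = MongoClient('127.0.0.1', tunnel.local_bind_port)
    db = client['mymongodb']
    print(db.list_collection_names())
    client.close()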

Python MySQL Connection with SSH

Hi, I have shared hosting I bought, and it allows remote MySQL connections only over SSH.
As far as I know, it doesn't have any public or private keys.
And here's my connection setup in MySQL Workbench, which works when I try to connect:
I have looked at another Stack Overflow question here, but none of the answers seem to work for me. :/ I'm really at a dead end and I need to get this to work. Can someone help me out, please?
So I figured it out after about a million rounds of trial and error:
import pymysql
import paramiko
import pandas as pd
from paramiko import SSHClient
from sshtunnel import SSHTunnelForwarder

ssh_host = '198.54.xx.xx'
ssh_host_port = 21098 #Ur SSH port
ssh_username = "sshuser123" #Change this
ssh_password = "sshpassword123" #Change this

db_user = 'db user' #change this
db_password = 'password123' #change this
db = 'main_db' #The db that the user is linked to

with SSHTunnelForwarder(
        (ssh_host, ssh_host_port),
        ssh_username=ssh_username,
        ssh_password=ssh_password,
        remote_bind_address=('127.0.0.1', 3306)) as tunnel:
    conn = pymysql.connect(host='127.0.0.1', user=db_user,
                           passwd=db_password, db=db,
                           port=tunnel.local_bind_port)
    query = '''SELECT * from tablename;'''
    data = pd.read_sql_query(query, conn)
    print(data)
    conn.close()
This is the code you should use if your SSH access for MySQL doesn't use a public/private key.
Hope this helps anyone facing the same issue!!
Connect to the server 198.54.x.240:21098 via SSH with port forwarding, for example:
ssh -t -gL 33069:localhost:3306 -p 21098 198.54.x.240
On Windows use PuTTY; I like KiTTY (a PuTTY fork). Add the connection and the SSH tunnel (see the pictures).
Connect to MySQL via localhost:33069 (the rest of the answer you already know). Workbench does the same thing, just forwarding 3306 to 3306; if you need more than one remote connection, best practice is to forward different ports.
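Once that manual tunnel is up, the Python side is just a normal local connection to the forwarded port. A minimal sketch, assuming the forwarded local port 33069 from the command above; the DB credentials and table are placeholders:
import pymysql

# A minimal sketch, assuming the manual tunnel forwards local port 33069 to the server's 3306.
conn = pymysql.connect(
    host='127.0.0.1',
    port=33069,                 # the locally forwarded port
    user='db_user',             # placeholder DB credentials
    passwd='db_password',
    db='main_db',
)
with conn.cursor() as cur:
    cur.execute('SELECT VERSION();')
    print(cur.fetchone())
conn.close()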

PyGreSQL/pg hangs when connecting to DB over SSH tunnel

In my Python script, I want to be able to connect to a Postgres DB via an SSH tunnel.
I'm using sshtunnel package to create a tunnel, and using PyGreSQL to connect to the DB.
When I try to establish the database connection, the pg.connect call just hangs. I don't get any errors at all. When I use psql to connect to the DB using the tunnel created by sshtunnel, the connection is successful.
When I create the tunnel beforehand using ssh in shell, pg.connect call successfully connects to the database.
So to summarize:
Tunnel created in Python/sshtunnel -> pg.connect call hangs
Tunnel created in Python/sshtunnel -> psql works just fine
Tunnel created using ssh -> pg.connect call is successful
This seems to be a problem with PyGreSQL, since psql can access the DB through the sshtunnel-created tunnel just fine. However, there could be something different about the tunnel created by the sshtunnel package that I'm not seeing.
This is the command I'm using to create the tunnel using SSH:
ssh -g -L <local_bind_port>:localhost:<remote_bind_port> -f -N root@myip
The following is my code to connect to the DB in Python using an SSH tunnel and pg.connect:
import pg
from sshtunnel import SSHTunnelForwarder

dbasename = 'db'
username = 'admin'
password = 'admin'
portnum = 5432

tunnel = SSHTunnelForwarder(
    <ip_address>,
    ssh_username="admin",
    ssh_password="admin",
    remote_bind_address=('127.0.0.1', portnum)
)
tunnel.start()

# The line below hangs
db = pg.connect(host=tunnel.local_bind_host, port=tunnel.local_bind_port, dbname=dbasename, user=username, passwd=password)
Any ideas about what could cause this problem? Are there any logs, etc., that might help identify it?
Thanks.
EDIT:
It turns out that if I open a tunnel using Python/SSHTunnelForwarder in one Python shell, but use pg.connect to connect to that tunnel from a second Python shell, it connects successfully.
So if I copy-paste the following into the first shell:
from sshtunnel import SSHTunnelForwarder

dbasename = 'db'
username = 'admin'
password = 'admin'
portnum = 5432

tunnel = SSHTunnelForwarder(
    <ip_address>,
    ssh_username="admin",
    ssh_password="admin",
    remote_bind_address=('127.0.0.1', portnum)
)
tunnel.start()
then open another shell and connect to the tunnel created in the first shell:
import pg

# This works for some reason
db = pg.connect(host='127.0.0.1', port=<local bind port from the 1st shell>, dbname=dbasename, user=username, passwd=password)
the connection is successful

Connect to mongo database, given ssh key

I am trying to use pymongo to connect to a mongo database.
I have been given:
DB_name
DB_username
DB_password
DB_port
SSH_address
SSH_username
mongo RSA private key (.pem file)
I have tried running
from pymongo import MongoClient

client = MongoClient(host=SSH_address,
                     port=DB_port,
                     username=DB_username,
                     password=DB_password)
client.list_database_names()
but I get a timed-out error.
How can I pass the remaining information (such as the RSA private key) to MongoClient, so that I can successfully connect?
Using an SSH tunnel to connect the MongoDB client works for me:
server = SSHTunnelForwarder(
    (MONGO_HOST, MONGO_PORT),
    ssh_username=MONGO_USER,
    ssh_password=MONGO_PASS,
    remote_bind_address=('127.0.0.1', 27017)
)
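The answer stops at creating the forwarder. A minimal sketch of the full flow, adapted to the question's setup (private key file instead of a password); SSH_address, SSH_username, DB_port, DB_username and DB_password are the question's placeholders, and the .pem path is hypothetical:
from pymongo import MongoClient
from sshtunnel import SSHTunnelForwarder

# A minimal sketch, assuming the placeholders from the question
# (SSH_address, SSH_username, the .pem key, DB_* credentials).
server = SSHTunnelForwarder(
    (SSH_address, 22),
    ssh_username=SSH_username,
    ssh_pkey='path/to/key.pem',                  # the RSA private key file
    remote_bind_address=('127.0.0.1', DB_port),  # MongoDB on the remote host
)
server.start()

client = MongoClient(
    host='127.0.0.1',
    port=server.local_bind_port,   # local end of the tunnel
    username=DB_username,
    password=DB_password,
)
print(client.list_database_names())

client.close()
server.stop()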

SSH tunnel forwarding with jump host and remote database

I have a remote MySQL database hosted on Amazon RDS ("D"). For security purposes, it is only accessible through a remote server ("C"). C is accessible via SSH through a jump host ("B"). I therefore need a double SSH tunnel to reach the remote SQL host.
[A: local host] -> [B: jump host] -> [C: target host] => [D: RDS MySQL host]
I would like to access D through Python, using paramiko and/or sshtunnel. All of the information I can find involves:
a single ssh tunnel and a remote SQL host (ex. A -> C => D, no jump host)
ssh first with mysqldb in python
python mysql connectivity via ssh
a double ssh tunnel to an SQL host (ex. A -> B -> C, D is hosted on C).
Connecting to remote Postgresql database over ssh tunnel using python
Paramiko: Port Forwarding Around A NAT Router
Nested SSH session with Paramiko
So far, I'm using paramiko with a proxy command to get from A to C. I can access D by executing a command on C, but not by connecting with MySQLdb or SQLAlchemy (my ultimate goal).
My current code:
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
proxy = paramiko.ProxyCommand("ssh -A B_username@B_host -W C_host:12345")
ssh.connect("C_host", username="C_username", sock=proxy)
stdin, stdout, stderr = ssh.exec_command("mysql -u D_username -p D_password -h D_host_rds")
print("STDOUT:\n{}\n\nSTDERR:\n{}\n".format(stdout.read(), stderr.read()))
# successfully prints out MySQL welcome screen
I'm looking for something like this (modified from example 2 in the sshtunnel docs):
import paramiko
from sshtunnel import SSHTunnelForwarder

with SSHTunnelForwarder(
        intermediate = {
            ("B_host", 22),
            ssh_username = "B_username",
            ssh_password = "B_password"},
        remote = {
            ("C_host", 12345),
            ssh_username = "C_username",
            ssh_password = "C_password"},
        remote_bind_address=("D_host_rds", 3306),
        local_bind_address=("0.0.0.0", 3307)) as server:
    conn = MySQLdb.connect(
        user = "D_username",
        passwd = "D_password",
        db = "my_database",
        host = "127.0.0.1",
        port = 3307)
tl;dr: How do I forward a port through two ssh jumps in Python?
I figured it out. It works with a combination of ssh config settings and the SSHTunnelForwarder context manager from the sshtunnel library.
Using the following model and naming conventions:
[A: local host] -> [B: jump host] -> [C: target host] => [D: RDS MySQL host]
I set up my ~/.ssh/config to get from A to C through B:
Host C_ssh_shortcut
    HostName C_host
    User C_user
    Port 22
    ForwardAgent yes
    ProxyCommand ssh B_user@B_host -W %h:%p
I added the key/keys I used to log in to B and C to my ssh-agent:
ssh-add
And finally I set up SSHTunnelForwarder:
import sqlalchemy
from sshtunnel import SSHTunnelForwarder

with SSHTunnelForwarder(
    "C_ssh_shortcut",                    # The SSHTunnelForwarder "ssh_address_or_host" argument, which takes care of bypassing B through the ProxyCommand set up in ~/.ssh/config
    remote_bind_address=(D_host, 3306),  # Points to your desired destination, i.e. the database host on 3306, which is the MySQL port
    local_bind_address=('', 1111)        # Gives a local way to access this host and port on your machine. '' is localhost / 127.0.0.1, 1111 is an unused port
) as server:
    connection_string = "mysql+pymysql://D_user:D_password@localhost:1111/D_dbname"  # note that D_host and D_port were replaced by the host and port defined in "local_bind_address"
    engine = sqlalchemy.create_engine(connection_string)
    # do your thing
From here, I am able to use my engine as usual to interact with my database.
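As a quick illustration of that last step, here is a minimal usage sketch run while the tunnel is still open (i.e. inside the with block above); the table name is a hypothetical placeholder:
# A minimal usage sketch: run this while the tunnel above is open,
# e.g. inside the same "with SSHTunnelForwarder(...)" block.
# "some_table" is a hypothetical placeholder table name.
import pandas as pd

df = pd.read_sql_query("SELECT * FROM some_table LIMIT 10;", engine)
print(df.head())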
This code works for me:
import pymysql
import paramiko
from paramiko import SSHClient
from sshtunnel import SSHTunnelForwarder
from sqlalchemy import create_engine, text

#ssh config
mypkey = paramiko.RSAKey.from_private_key_file('your/user/location/.ssh/id_rsa')
ssh_host = 'your_ssh_host'
ssh_user = 'ssh_host_username'
ssh_port = 22

#mysql config
sql_hostname = 'your_mysql_host name'
sql_username = 'mysql_user'
sql_password = 'mysql_password'
sql_main_database = 'your_database_name'
sql_port = 3306
host = '127.0.0.1'

with SSHTunnelForwarder(
        (ssh_host, ssh_port),
        ssh_username=ssh_user,
        ssh_pkey=mypkey,
        remote_bind_address=(sql_hostname, sql_port)) as tunnel:
    engine = create_engine('mysql+pymysql://'+sql_username+':'+sql_password+'@'+host+':'+str(tunnel.local_bind_port)+'/'+sql_main_database)
    connection = engine.connect()
    print('engine creating...')
    sql = text(""" select * from nurse_profiles np limit 50""")
    nurseData = connection.execute(sql)

    nurseList = []
    for row in nurseData:
        nurseList.append(dict(row))

    connection.close()

    print('nurseList len: ', len(nurseList))
    print('nurseList: ', nurseList)
I use this code for a PostgreSQL database and it works. I am sure it will also work for a MySQL database; I have changed the PostgreSQL-specific part to MySQL here. This is the code:
import pymysql
import paramiko
import sqlalchemy
from sshtunnel import SSHTunnelForwarder
from sqlalchemy import create_engine, inspect
from sqlalchemy.orm import sessionmaker
import pandas as pd

#SSH config
mypkey = paramiko.RSAKey.from_private_key_file('id_rsa_file', password = 'id_rsa_password')
ssh_host = 'your_ssh_host'
ssh_user = 'your_ssh_host_username'
ssh_port = 22

#SQL config
sql_hostname = 'your_sql_host_name'
sql_username = 'sql_user'
sql_password = 'sql_password'
sql_main_database = 'your_database_name'
sql_port = 3306
host = '127.0.0.1'

with SSHTunnelForwarder((ssh_host, ssh_port),
        ssh_username=ssh_user,
        ssh_pkey=mypkey,
        remote_bind_address=(sql_hostname, sql_port)) as tunnel:

    #Connect to SQL
    local_port = str(tunnel.local_bind_port)
    engine = create_engine(f'mysql+pymysql://{sql_username}:{sql_password}@127.0.0.1:' + local_port + f'/{sql_main_database}')
    Session = sessionmaker(bind = engine)
    session = Session()
    print('Database session created!')

    #To inspect the schemas and tables in your database
    inspector = inspect(engine)
    schemas = inspector.get_schema_names()
    for schema in schemas:
        print(f'schema:{schema}')
        for table_name in inspector.get_table_names(schema = schema):
            print(f'table: {table_name}')

    query_code = "your query code from SQL here"

    #Execute query code
    exec_database = session.execute(query_code)
    df = pd.DataFrame(exec_database.fetchall())
    df.columns = exec_database.keys()
    print('Dataframe created from database!')

    session.close()
    engine.dispose()
You can also change the part below:
#Execute query code
exec_database = session.execute(query_code)
df = pd.DataFrame(exec_database.fetchall())
df.columns = exec_database.keys()
to read the SQL query directly with pandas, using the code below:
df = pd.read_sql_query(query_code, engine)
Additionally, the part of the code below:
#To inspect the schemas and tables in your database
inspector = inspect(engine)
schemas = inspector.get_schema_names()
for schema in schemas:
    print(f'schema:{schema}')
    for table_name in inspector.get_table_names(schema = schema):
        print(f'table: {table_name}')
is only necessary when you don't know what schemas and tables are in your database. You can use the code above to inspect and list them.
