pyodbc connection with Cloudera Impala fails on colab - python

I've installed pyodbc and configured system ODBC. Anything else I should configure?
pyodbc.autocommit=True
conn = pyodbc.connect("DSN=Cloudera Impala DSN", autocommit=True)
print("pass")
cursor = conn.cursor()

You can use - conn = pyodbc.connect(DSN="Cloudera Impala DSN", autocommit=True)
We use
cfg = {'DSN': 'Cloudera Impala DSN', 'host': 'xx.com', 'port': 1234,'username': 'uu', 'password': 'pp'}
conn_string = 'DSN=%s;database=default;AuthMech=3;UseSASL=1;UID=%s;PWD=%s;SSL=1;AllowSelfSignedServerCert=1;CAIssuedCertNamesMismatch=1' % (cfg['DSN'], cfg['username'], cfg['password'])
conn = pyodbc.connect(conn_string, autocommit=True)
cursor = conn.cursor()
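The % interpolation above is easy to misquote; the same string can also be assembled with str.format. A minimal sketch, assuming placeholder DSN and credentials:

```python
# Placeholder DSN and credentials; AuthMech=3 selects user-name/password
# authentication in the Cloudera Impala ODBC driver.
cfg = {"DSN": "Cloudera Impala DSN", "username": "uu", "password": "pp"}

conn_string = (
    "DSN={DSN};database=default;AuthMech=3;UseSASL=1;"
    "UID={username};PWD={password};"
    "SSL=1;AllowSelfSignedServerCert=1;CAIssuedCertNamesMismatch=1"
).format(**cfg)

# conn = pyodbc.connect(conn_string, autocommit=True)
```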

How to add my password to my pyodbc connection in python

I have the following code to connect to my SQL Server database. I am wondering where/how I can add my password so it automatically connects instead of asking for my password.
server = 'myserver'
database = 'mydatabase'
username = 'johndoe@xyz.com'
Authentication='ADI'
driver= '{ODBC Driver 17 for SQL Server}'
conn = pyodbc.connect('DRIVER='+driver+
';SERVER='+server+
';PORT=1433;DATABASE='+database+
';UID='+username+
';AUTHENTICATION='+Authentication
)
I tried this but it did not work.
server = 'myserver'
database = 'mydatabase'
username = 'johndoe@xyz.com'
Authentication='ADI'
driver= '{ODBC Driver 17 for SQL Server}'
conn = pyodbc.connect('DRIVER='+driver+
';SERVER='+server+
';PORT=1433;DATABASE='+database+
';UID='+username+
';AUTHENTICATION='+Authentication+
';PWD=MyPassword'
)
Secondarily, is there another way to have it read my password without putting it in the code itself? If so, I would love any information on that.
I'm not 100% sure about pyodbc, but this works for mysqldb & psycopg2.
If you are able to use environment variables, you can store the password there and read it with os in your script. Once it is stored, you can change your code to something like this:
import os
server = 'myserver'
database = 'mydatabase'
username = 'johndoe@xyz.com'
Authentication='ADI'
driver= '{ODBC Driver 17 for SQL Server}'
conn = pyodbc.connect(DRIVER=driver,
SERVER=server,
PORT=1433,
DATABASE=database,
UID=username,
AUTHENTICATION=Authentication,
PWD= os.environ['MYPASSWORD']
)
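To try the environment-variable route without editing your shell profile, os.environ can also be set and read in-process. A minimal sketch (the variable name MYPASSWORD is just this example's choice):

```python
import os

# Normally you would `export MYPASSWORD=...` in the shell instead;
# assigning here only affects the current process and its children.
os.environ["MYPASSWORD"] = "s3cret"

# This is exactly what the connect() call above reads.
password = os.environ["MYPASSWORD"]
```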
This is what ended up working for me:
conn = pyodbc.connect(DRIVER= '{ODBC Driver 17 for SQL Server}',
SERVER='ServerName',
DATABASE = 'DatabaseName',
PORT=PortNumber,
Trusted_Connection = 'Yes',
UID = 'MyUserName',
PWD = 'MyPass',
Authentication = 'ActiveDirectoryPassword'
)

Python querying Redshift fails on "Connection reset by peer" but works with dbeaver

I am trying to query my Redshift DB using python,
I tried both of the following:
with sqlalchemy:
connection_string = "redshift+psycopg2://%s:%s@%s:%s/%s" % (USER, PASS, HOST, str(PORT), DATABASE)
engine = sa.create_engine(connection_string)
session = sessionmaker()
session.configure(bind=engine)
sess = session()
sess.execute('SELECT * FROM MY_TABLE LIMIT 1;')
with redshift_connector:
conn = redshift_connector.connect(
host=HOST,
port=PORT,
database=DATABASE,
user=USER,
password=PASS)
cursor = conn.cursor()
cursor.execute('SELECT * FROM MY_TABLE LIMIT 1;')
all_results = cursor.fetchall()
conn.close()
Both return 'Connection reset by peer', while in DBeaver I am able to run this query without any problems.
Anything I might be missing?

sqlalchemy, postgres, docker

I am struggling to understand why my query below returns the following error, when the same query pasted directly into postgres works fine. I have read about putting the table name in quotes, but this does not work :/.
from sqlalchemy import create_engine
db_name = 'blahh'
db_user = 'blahhd'
db_pass = 'blhd'
db_host = 'localhost'
db_port = 5432
## Connect to the database
db_string = 'postgres://{}:{}@{}:{}/{}'.format(db_user, db_pass, db_host, db_port, db_name)
db = create_engine(db_string)
connection = db.connect()
connection.execute('select cake.name, cake.industry, cake.created_at from cake limit 10;')
connection.close()
error:
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedTable) relation "cake" does not exist
LINE 1: ... cake limi...
^
As explained in the SQLAlchemy documentation, create_engine only represents a connection resource; you still need to connect to it and later close the connection. You may also need to call .fetchall() to actually view the list of rows your query returns.
If you try something like this:
from sqlalchemy import create_engine
db_name = 'blahh'
db_user = 'blahhd'
db_pass = 'blhd'
db_host = 'localhost'
db_port = 5432
## Connect to the database
db_string = 'postgres://{}:{}@{}:{}/{}'.format(db_user, db_pass, db_host, db_port, db_name)
db = create_engine(db_string)
connection = db.connect()
connection.execute('select cake.name, cake.industry, cake.created_at from cake limit 10;').fetchall()
connection.close()
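If the query still fails, a useful diagnostic is to ask Postgres which tables it can actually see, since "relation \"cake\" does not exist" typically means the engine is pointed at a different database or schema than the psql session where the query worked (Docker port mappings make this easy to hit). The execute calls are commented out here because they need the live connection from the snippet above:

```python
# List every schema that contains a table named "cake", and show the
# active search_path; run both through the same `connection` as above.
tables_sql = (
    "select table_schema, table_name "
    "from information_schema.tables "
    "where table_name = 'cake';"
)
# print(connection.execute(tables_sql).fetchall())
# print(connection.execute('show search_path;').fetchall())
```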

Python: Connect to an Azure PostgreSQL instance through SSH Tunnel

I am trying to use Python to connect to a PostgreSQL instance, which is located on Azure through an SSH tunnel. I can connect to the database with DBeaver with no Problem.
Here is the code that I am using.
import psycopg2
from sshtunnel import SSHTunnelForwarder
server = SSHTunnelForwarder(
('160.**.**.**', 22),
ssh_username="*******",
ssh_password="*******",
remote_bind_address=('localhost', 5432))
server.start()
print("server connected")
params = {
'database': '*******',
'user': '*****postgresadmin@*****dev-postgres',
'password': '************',
'host': '**********-postgres.postgres.database.azure.com',
'port': server.local_bind_port
}
conn = psycopg2.connect(**params)
cur = conn.cursor()
text = "select * from table"
cur.execute(text)
However I get the following error:
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
psycopg2.OperationalError: could not translate host name "**********-postgres.postgres.database.azure.com" to address: Unknown host
I also tried sqlalchemy using this with the same result.
Any idea on what I am doing wrong? Do I maybe need the IP-address of the host instead of the domain?
SSHTunnelForwarder is used when you want to do something on the remote server itself.
A different code block is needed when you use the remote server as a bridge to connect to another server:
import sshtunnel
import psycopg2

with sshtunnel.open_tunnel(
    (REMOTE_SERVER_IP, 22),                # SSH endpoint
    ssh_username="*******",
    ssh_password="*******",
    # Bind to the PostgreSQL port on the far side, not the SSH port:
    remote_bind_address=(AZURE_SERVER_HOST, 5432),
    local_bind_address=('0.0.0.0', 10022)
) as tunnel:
    params = {
        'database': '*******',
        'user': '*****postgresadmin@*****dev-postgres',
        'password': '************',
        'host': '127.0.0.1',               # local end of the tunnel
        'port': 10022
    }
    conn = psycopg2.connect(**params)
    cur = conn.cursor()
    text = "select * from table"
    cur.execute(text)
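Applied to the question's own SSHTunnelForwarder code, the essential change is the same: remote_bind_address must point at the PostgreSQL port (5432) reachable from the SSH host, and psycopg2 then connects to the local end of the tunnel rather than the Azure hostname. A sketch with placeholder names (the library calls are commented out because they need live servers):

```python
# from sshtunnel import SSHTunnelForwarder
# import psycopg2
#
# tunnel = SSHTunnelForwarder(
#     ("160.0.0.1", 22),                  # SSH endpoint (placeholder IP)
#     ssh_username="user",
#     ssh_password="...",
#     remote_bind_address=("myserver.postgres.database.azure.com", 5432),
# )
# tunnel.start()

params = {
    "database": "mydb",
    "user": "admin@myserver-dev-postgres",  # Azure uses user@server logins
    "password": "...",
    "host": "127.0.0.1",                    # local end of the tunnel
    # "port": tunnel.local_bind_port,
}
# conn = psycopg2.connect(**params)
```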

Connecting to oracle DB through python

I'm trying to connect to an Oracle DB using Python through PyCharm; below is my code.
Code:
import cx_Oracle
try:
conn = cx_Oracle.connect('sys/123@//localhost:1521/XEPDB1')
except:
print("Connection Error")
exit()
Output
Connection Error
There are multiple ways to do that, either with SID or service name
SID:
import cx_Oracle
dsn_tns = cx_Oracle.makedsn('server', 'port', 'sid')
conn = cx_Oracle.connect(user='username', password='password', dsn=dsn_tns)
Service name:
import cx_Oracle
dsn_tns = cx_Oracle.makedsn('server', 'port', service_name='service_name')
conn = cx_Oracle.connect(user='username', password='password', dsn=dsn_tns)
You can refer to the cx_Oracle documentation for more details.
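One caveat the answer above does not mention: the question connects as SYS, and Oracle refuses SYS logins without SYSDBA (or SYSOPER) privileges (ORA-28009), so the plain connect string fails even with the right password. A minimal sketch, assuming the question's XEPDB1 service; the connect call is commented out because it needs a live instance:

```python
# EZConnect-style DSN matching the question's host, port and service.
dsn = "localhost:1521/XEPDB1"

# import cx_Oracle
# conn = cx_Oracle.connect(user="sys", password="123", dsn=dsn,
#                          mode=cx_Oracle.SYSDBA)
```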
