I'm building a microservice in Python 3.7 that connects to a Neo4j database. It's the first time I've connected Python to Neo4j, and I'm using py2neo version 4.3.0.
Everything works OK, but now, to adhere to the standard, I need to create a healthcheck that verifies the connection to the database.
I wanted to use

from py2neo import Graph, Database

and then

db = Database("bolt://localhost:7474", auth=("neo4j", "xxxx"))
db.kernel_version  # doesn't work

but this doesn't verify that the connection is up. Does anybody have any suggestions?
If checking the kernel version fails, then the connection is not OK. Below is a script that checks whether the connection from Python to Neo4j (via py2neo) is up and running.
from py2neo import Graph

graph = Graph("bolt://localhost:7687", auth=("neo4j", "xxxxx"))
try:
    graph.run("MATCH () RETURN 1 LIMIT 1")
    print('ok')
except Exception:
    print('not ok')
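If you need this as a reusable healthcheck inside the microservice, a minimal sketch (assuming the same bolt URL and credentials as above) could look like this:

from py2neo import Graph

def neo4j_is_healthy(uri="bolt://localhost:7687", user="neo4j", password="xxxxx"):
    """Return True if a trivial Cypher query succeeds, False otherwise."""
    try:
        graph = Graph(uri, auth=(user, password))
        graph.run("MATCH () RETURN 1 LIMIT 1")
        return True
    except Exception:
        return False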
I am trying to build a web service app that will let me connect to a database on Azure with Python code. I've tried using SQLAlchemy and pyodbc, and I can successfully connect to the database from my machine: on localhost I can perform all the actions I need. What I want is to set this code up so that AJAX calls can hit specific routes that perform certain actions on my database, like flipping a user's active flag to false. The problem is that when I upload the Python code to Azure, following this guide (https://learn.microsoft.com/en-us/azure/app-service/quickstart-python?tabs=bash&pivots=python-framework-flask), it just returns a 500 server error, and I can't find anything in the trace as to why it isn't working. I thought it might just be that my local machine is whitelisted in the database's allowed IP addresses, but even if I add the App Service's IP address to the allowed IPs it still returns a server error. Here is the setup of the code:
from flask import Flask
from sqlalchemy import create_engine
import pyodbc

app = Flask(__name__)

@app.route('/')
def connection():
    Driver = "{ODBC Driver 13 for SQL Server}"
    Server = "server string"
    Port = 1433
    Database = "dbname"
    Uid = "user"
    Pwd = "pass"
    try:
        cnxn = pyodbc.connect(f'DRIVER={Driver};SERVER={Server};DATABASE={Database};Uid={Uid};Pwd={Pwd};Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;')
        cursor = cnxn.cursor()
        cursor.execute(f"SELECT * FROM (Database Table) where id = 999999999;")
        for row in cursor:
            print('row = %r' % (row,))
        return "Connection to database successful"
    except Exception as e:
        return str(e)
I omitted certain details, but the syntax should remain intact. Again, on my local machine I can connect to the database and return data, but once the code runs in Azure it no longer works.
Excuse the try statement; I was trying to catch the error and send it to the client in hopes of gathering more information from the 500 error, but it didn't work.
Edit: it's worth mentioning that if I remove the actual connection string and anything to do with connecting to the database, the code will return "Connection to database successful". This leads me to believe that it isn't making a connection to the database at all, and that's why it's erroring out. However, the question remains: why can I connect in my local environment but not in Azure?
Have you heard about remote debugging in Azure? It's a great tool for tracing errors from Visual Studio or VS Code when your web app works fine locally but not once deployed, and it lets you step through the code line by line. Kindly check it out here:
https://learn.microsoft.com/en-us/visualstudio/debugger/remote-debugging?view=vs-2019
So the short answer to this is that the App Service was missing the driver {ODBC Driver 13 for SQL Server}. It wasn't erroring out locally because I have that driver installed locally, but the driver isn't available in the App Service. To get it to work there, I would have had to deploy to a Virtual Machine so I could install the driver myself.
Fortunately, Azure Web Apps built on a Linux base already come with {ODBC Driver 17 for SQL Server}, so flipping the driver from 13 to 17 allowed the app to successfully connect to my database.
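In practice the only change needed in the connection code from the question is the driver name; a sketch, assuming the same placeholder server, database, and credentials as above:

import pyodbc

# Same shape as the connection code in the question; only the driver name changes.
Driver = "{ODBC Driver 17 for SQL Server}"  # ships by default on Linux-based Azure Web Apps
Server = "server string"
Database = "dbname"
Uid = "user"
Pwd = "pass"
cnxn = pyodbc.connect(
    f'DRIVER={Driver};SERVER={Server};DATABASE={Database};Uid={Uid};Pwd={Pwd};'
    'Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;'
)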
I need to check Oracle connectivity using a Python script.
My Oracle connection string is in the format below:
jdbc:oracle:thin:@ldap://ovd.mycomp.com:38901/cn=Oraclecontext,o=eus,dc=mycomp,dc=com/pidev ldap://ovd-mwdc.mycomp.com:38901/cn=Oraclecontext,o=eus,dc=mycomp,dc=com/pidev
I tried https://dbajonblog.wordpress.com/2019/12/18/python-and-cx_oracle-for-oracle-database-connections/ but that did not help.
Could you please help me?
Thanks in advance.
The general cx_Oracle documentation on working with JDBC and Oracle SQL Developer connection strings has some info; however, if you're using LDAP you'll need to do some extra configuration. See https://stackoverflow.com/a/32151099/4799035 and https://github.com/oracle/node-oracledb/issues/1212#issuecomment-591940440 (the steps are the same for cx_Oracle). Also see Connect to DB using LDAP with python cx_Oracle.
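As a rough sketch of what that extra configuration can look like with cx_Oracle (the directory server names below are taken from the JDBC string in the question, so treat them as assumptions): create ldap.ora and sqlnet.ora files in a directory pointed to by TNS_ADMIN, then connect with the short service name.

# ldap.ora (in the TNS_ADMIN directory) -- values adapted from the JDBC string above:
#   DIRECTORY_SERVERS = (ovd.mycomp.com:38901, ovd-mwdc.mycomp.com:38901)
#   DEFAULT_ADMIN_CONTEXT = "o=eus,dc=mycomp,dc=com"
#   DIRECTORY_SERVER_TYPE = OID
#
# sqlnet.ora (same directory):
#   NAMES.DIRECTORY_PATH = (LDAP, TNSNAMES)

import os
import cx_Oracle

os.environ["TNS_ADMIN"] = "/path/to/tns_admin"  # must be set before the first connect
conn = cx_Oracle.connect("username", "password", "pidev")  # "pidev" is resolved via LDAP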
I created the following script using cx_Oracle, and it works fine.
The only restriction is that the DSN passed in must be a TNS entry (an alias from tnsnames.ora).
Script:
import cx_Oracle
import sys
from botocore.exceptions import ClientError

connection = None

def isOracleHealthy(dbname, username, password, dsn, log):
    try:
        sys.stderr.write(dsn)
        connection = cx_Oracle.connect("{}/{}@{}".format(username, password, dsn))
        cur = connection.cursor()
        for result in cur.execute("SELECT * FROM dual"):
            log.info(result)
        return True
    except Exception as e:
        sys.stderr.write(dbname + ' Oracle health check failed.\n')
        log.error(dbname + ' Oracle health check failed.')
        return False
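A quick usage sketch (the TNS alias and credentials here are hypothetical placeholders; the function only needs a logger and a DSN):

import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("oracle-healthcheck")

# "PIDEV" is a hypothetical tnsnames.ora alias; replace it with your own entry.
if isOracleHealthy("pidev", "username", "password", "PIDEV", log):
    print("Oracle connection is healthy")
else:
    print("Oracle connection is down")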
I have been having major trouble connecting my Python shell to my Postgres database. I am doing this on Windows. I have downloaded psycopg2 and everything needed for this to work, but it still is not working.
import psycopg2
conn = psycopg2.connect("dbname='test' user='postgres' host='localhost' password='mypassword'")
It gives me an error telling me that the database "test" does not exist, but it does! If you have any advice at all on what I should test out, that would be amazing. Thank you!
You can lay out the connection parameters as a single string and pass it to the connect() function, like this:
conn = psycopg2.connect("dbname=test user=postgres password=postgres")
Or you can use keyword arguments:
conn = psycopg2.connect(host="localhost", database="test", user="postgres", password="postgres")
If it still fails, check on the PostgreSQL side: try to connect to the database in question from the command line and see whether the error reappears. If it does, something is missing on the DB server side.
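To see exactly what the server is complaining about from Python, you can catch psycopg2's OperationalError and print it; a small sketch using the same credentials as above:

import psycopg2

try:
    conn = psycopg2.connect(host="localhost", database="test", user="postgres", password="postgres")
    print("connected:", conn.get_dsn_parameters())
except psycopg2.OperationalError as e:
    # The exception text contains the real server-side message,
    # including which database the client actually tried to reach.
    print("connection failed:", e)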
I'm on a Windows 8 machine, where I use Python (Anaconda distribution) to connect to Impala in our Hadoop cluster using the impyla package. Our Hadoop cluster is secured via Kerberos. I have followed the API reference on how to configure the connection.
from impala.dbapi import connect
conn = connect( host='localhost', port=21050, auth_mechanism='GSSAPI',
kerberos_service_name='impala')
We are using Kerberos GSSAPI with SASL
auth_mechanism='GSSAPI'
I have managed to install the python-sasl library for Win8, but I still encounter this error:
Could not start SASL: Error in sasl_client_start (-4) SASL(-4): no mechanism available: No worthy mechs found (code THRIFTTRANSPORT): TTransportException('Could not start SASL: Error in sasl_client_start (-4) SASL(-4): no mechanism available: No worthy mechs found',)
I wonder if I am still missing some dependencies.
Install the kerberos Python package; it will fix your issue.
I ran into the same issue, but I fixed it by installing the right versions of the required libraries.
Install the Python libraries below using pip:
six==1.12.0
bit_array==0.1.0
thrift==0.9.3
thrift_sasl==0.2.1
sasl==0.2.1
impyla==0.13.8
The code below works fine with Python 2.7 and 3.4.
import ssl
from impala.dbapi import connect
import os
os.system("kinit")
conn = connect(host='hostname.io', port=21050, use_ssl=True, database='default', user='urusername', kerberos_service_name='impala', auth_mechanism = 'GSSAPI')
cur = conn.cursor()
cur.execute('SHOW DATABASES;')
result=cur.fetchall()
for data in result:
    print(data)
Try this to get the tables of a Kerberized cluster (in my case CDH-5.14.2-1).
Make sure you have a valid ticket before running this code.
I'm using Python 2.7 with the packages below:
thrift-0.9.3
thriftpy-0.3.8
thrift_sasl-0.3.0
impyla==0.14.2.2
Working Code
from impala.dbapi import connect
from impala.util import as_pandas

# 21050 is the port the Impala daemon uses for HiveServer2-protocol clients such as impyla.
conn = connect(host='yourHost', port=21050, auth_mechanism='GSSAPI')
cursor = conn.cursor()
cursor.execute("SHOW TABLES")

# After running .execute(), Impala will store the result set on the server
# until it is fetched. Use the method .fetchall() to pull the entire result
# set over the network (only do this if you know the dataset is small).
tables = cursor.fetchall()

print("Displaying list of tables")
# the result is a list of tuples
for t in tables:
    # each row in the SHOW TABLES result should only contain one table name
    print(t[0])
    # exit()  # enable this to stop after the first table
print("eol >>>")
For me, installing this package fixed it: libsasl2-modules-gssapi-mit
For me, the following connection parameters worked. I did not have to install any additional packages in python.
connect(host="your_host", port=21050, auth_mechanism='GSSAPI', timeout=100000, use_ssl=False, ca_cert=None, ldap_user=None, ldap_password=None, kerberos_service_name='impala')
To connect to Impala using Python you can follow the steps below:
Install the Cloudera ODBC Driver for Impala.
Create a DSN (here called impala_con) using the 64-bit ODBC driver and fill in your server details.
Use the code snippet below for connectivity:
import pyodbc
import pandas as pd

# "impala_con" is the DSN name created in the ODBC Data Source Administrator.
with pyodbc.connect("DSN=impala_con", autocommit=True) as conn:
    df = pd.read_sql("SHOW TABLES", conn)  # example query; the original post omitted the SQL text
Python cannot connect to HiveServer2.
Make sure you install cyrus-sasl-devel and cyrus-sasl-gssapi.
I have recently installed Microsoft SQL Server 2014 on my PC as I want to create a database for a web application I am building (I have been learning Python for a year and have very basic experience with SQLite).
After installing SQL Server 2014 and creating a database called Users, I am just trying to run some very basic commands to my database but I am falling at the first hurdle over and over!
I have installed pymssql and pyodbc and tried running commands directly with these, but have failed (e.g. pymssql gives me a TypeError: argument of type 'NoneType' is not iterable when I set the variable conn = pymssql.connect(server, user, password, "tempdb")).
My latest attempt is to use SQLAlchemy to achieve my long-awaited connection to the SQL database. However, after installing it, I am failing on the following error:
"sqlalchemy.exc.OperationalError: (pymssql.OperationalError) (20009, 'DB-Lib error message 20009, severity 9:\nUnable to connect: Adaptive Server is unavailable or does not exist\nNet-Lib error during Unknown error (10035)\n')"
The question I need answering is, how do I start talking to my database using SQLalchemy?
The code I am using is as follows:
from sqlalchemy import *

engine = create_engine('mssql+pymssql://Han & Lew:@SlugarPlum:1433/Users')
m = MetaData()
t = Table('t', m,
          Column('id', Integer, primary_key=True),
          Column('x', Integer))
m.create_all(engine)
Yes, my PC is called SlugarPlum, the user is Han & Lew, and my server is called THELROYSERVER. DSN = 1433, no password. (I don't know if it's wise to give this information online, but the data I have is not sensitive, so I guess it's worth a shot.)
Also, if anyone can direct me to an ultra-beginner's resource for Python and SQL Server, that would be awesome, as I am getting beaten up by how complex this seems to be!
Here's a connect function and an example of connecting via pyodbc. Connecting via pymssql should be as easy as reformatting the connection string for pymssql. I've provided Windows and Linux options, but I've only tested on Linux. I hope it helps.
import pyodbc
from sqlalchemy import create_engine


def sqlalchemy_connect(connect_string):
    """ Connect to the database via ODBC, start SQL Alchemy engine. """
    def connect():
        return pyodbc.connect(connect_string, autocommit=True)

    db = create_engine('mssql://', creator=connect)
    db.echo = False
    return db


def main():
    global DBCONN

    # Linux with FreeTDS
    connect_string = "DRIVER={FreeTDS};SERVER=<server name>;PORT=<port num>;DATABASE=<db>;UID=<user>;PWD=<password>;TDS_Version=<version num>;"
    # Windows with SQL Server (this overrides the FreeTDS string above -- keep only the one you need)
    connect_string = "DRIVER={SQL Server};SERVER=<server name>;PORT=<port num>;DATABASE=<db>;UID=<user>;PWD=<password>;"

    DBCONN = sqlalchemy_connect(connect_string)


if __name__ == "__main__":
    main()
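Once main() has populated DBCONN, a quick connectivity probe might look like this (depending on your SQLAlchemy version, a plain string may also work in place of text()):

from sqlalchemy import text

# Assumes main() above has run and DBCONN holds the engine; "SELECT 1" is just a probe.
with DBCONN.connect() as conn:
    result = conn.execute(text("SELECT 1"))
    print(result.scalar())  # prints 1 if the connection works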