Unable to connect to Hive through Python

I have installed Hive on a Debian system using the steps provided in the link:
https://phoenixnap.com/kb/install-hive-on-ubuntu
I have followed all the steps and am able to create a database and a table in Hive. However, the property hive.metastore.uris is not set in hive-site.xml. When I try to connect to Hive using the pyhive module in Python, I get this error:
thrift.transport.TTransport.TTransportException: Could not connect to any of [('172.16.0.125', 10000)]
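For what it's worth, that error usually means nothing is accepting connections on port 10000 at all. hive.metastore.uris configures the metastore service, while port 10000 belongs to HiveServer2, which is started separately (for example with hive --service hiveserver2). A minimal client-side sketch, assuming HiveServer2 is up and the username is a placeholder:

from pyhive import hive

# Assumes HiveServer2 is running on its default port 10000; an unset
# hive.metastore.uris affects the metastore, not this port.
conn = hive.Connection(host='172.16.0.125', port=10000, username='hive')
cursor = conn.cursor()
cursor.execute('SHOW DATABASES')
print(cursor.fetchall())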

Related

Connect to DB2 via JayDeBeApi JDBC in Python M1 Mac

I am trying to connect to DB2 from Python on my Mac M1. Unfortunately, I have no control over the Python version (Python 3.10.6) installed on the Mac. I tried to follow the instructions in Connect to DB2 via JayDeBeApi JDBC in Python to set up the JDBC connection, but I am failing with this error:
FileNotFoundError: [Errno 2] JVM DLL not found: /Library/Java/JavaVirtualMachines/jdk-13.0.1.jdk/Contents/Home/lib/libjli.dylib
Though the file libjli.dylib exists in the above folder, the error still pops up. I have tried to find a solution online with no luck.
I can connect to the same DB2 database through DBeaver on my Mac, but not through Python.
Note, I don't have admin privileges to install/re-install but can only download packages from a repo.
Any pointers would be very helpful.
I tried the following code:
import jaydebeapi
import jpype

# Location of the JDBC driver jar that DBeaver uses
jar = '/Users/JARFILES/jcc-11.5.7.0.jar'
args = '-Djava.class.path=%s' % jar
jvm = jpype.getDefaultJVMPath()
jpype.startJVM(jvm, args)  # the FileNotFoundError above is raised here

conn = jaydebeapi.connect(
    'com.ibm.db2.jcc.DB2Driver',
    'jdbc:db2://server:port/database',
    ['myusername', 'mypassword'],
)
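Not an authoritative fix, but on an M1 the usual culprit is an architecture mismatch: an arm64 Python cannot load an x86_64 libjli.dylib, and jpype reports that as "JVM DLL not found" even though the file exists on disk. A quick check, where the Temurin path below is a hypothetical arm64 JDK install:

import os
import jpype

# Point JAVA_HOME at a JVM whose architecture matches the Python
# interpreter before starting it; this path is only an example.
os.environ['JAVA_HOME'] = '/Library/Java/JavaVirtualMachines/temurin-17.jdk/Contents/Home'
print(jpype.getDefaultJVMPath())  # should now resolve to a loadable libjli.dylib

Running file on both the dylib and the python3 binary will show whether the architectures actually match.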

ibm_db.connect string stating error with database name

I'm running Python on a server that will eventually do my ETL, and one of the databases I have to connect to is running Informix (I believe it's version 14). I have access to the Informix DB and have an active ODBC connection on a Windows 10 machine that connects with no problem. I'm trying to get a basic connection to the Informix server working and keep getting the following error:
Exception: [IBM][CLI Driver] SQL1013N The database alias name or database name "" could not be found. SQLSTATE=42705 SQLCODE=-1013
My code to connect:
import ibm_db
import pandas as pd
conn_str = ibm_db.connect('database=mydatabase;host=myservername;port=9088;protocol=onsoctcp;uid=myusername;pwd=mypassword','','')
df = pd.read_sql('SELECT * FROM schema.table', conn_str)
print(df)
With the error, I know the name is correct, and I've tried a few different variations on the DB name since I know it's case-sensitive with Informix. I've also tried connecting to some of the configuration databases that are available in the drop-down from the ODBC connection, and everything matches up. I've also tried running it in a virtual environment and treating the server as a strictly Python server. The OS is Ubuntu 20.04 LTS.
I tried the IfxPy install again and had an error about missing files/sources, plus an error on the use_2to3 setuptools option that was giving me issues. I managed to get it installed once I found this post (Cannot connect to Informix DB using python), which shows one more thing that has to be declared beyond what the GitHub page for IfxPy says:
export CSDK_HOME=$INFORMIXDIR
Once I declared that on my system it compiled with no problem, and I've been able to extract some data. I'm still playing with it, but I'm at least getting data out; now I just have to get it into a usable format. Appreciate all the help.
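In case it helps the next person, a rough sketch of what the working calls look like; the server name, database, and credentials below are placeholders echoing the connection string above, and the API is the ibm_db-style one the IfxPy GitHub page documents:

import IfxPy

# SERVER is the INFORMIXSERVER name; SERVICE is the port (9088 above)
ConStr = 'SERVER=myinformixserver;DATABASE=mydatabase;HOST=myservername;SERVICE=9088;UID=myusername;PWD=mypassword;'
conn = IfxPy.connect(ConStr, '', '')

stmt = IfxPy.exec_immediate(conn, 'SELECT * FROM schema.table')
row = IfxPy.fetch_assoc(stmt)  # one dict per row, False when exhausted
while row:
    print(row)
    row = IfxPy.fetch_assoc(stmt)
IfxPy.close(conn)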

HIVE not connecting with Python?

I have installed Hadoop and HIVE on Windows 10 by following these tutorials:
https://exitcondition.com/install-hadoop-windows/ & https://www.youtube.com/watch?v=npyRXkMhrgk respectively.
Both Hadoop and HIVE are running on my machine, and I have been able to put files in HDFS and run queries in HIVE, but when I try to connect to HIVE from Python I get various errors. For example, running
from pyhive import hive
hive.Connection(host='localhost', port=10000, auth='NOSASL')
gives the following error:
TTransportException: TSocket read 0 bytes
I have tried Impala as well, but it did not work.
How can I connect Python with Hive? Is it possible on Windows 10, or should I switch to Linux?
PyHive has had issues with auth='NOSASL' in the past; I'm not sure whether that has been fixed.
Try the hdfs3 Python library:
conda install hdfs3
from hdfs3 import HDFileSystem
hdfs = HDFileSystem(host='localhost', port=9000)
More info is available here:
https://medium.com/@arush.xtremelife/connecting-hadoop-hdfs-with-python-267234bb68a2
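One more thing worth trying on the PyHive side: "TSocket read 0 bytes" often just means the client and server disagree about authentication. With HiveServer2's default configuration (hive.server2.authentication set to NONE, which still uses plain SASL), the matching client setting is auth='NONE'; auth='NOSASL' only works if the server is explicitly configured for NOSASL too. A minimal sketch, with the username being a placeholder:

from pyhive import hive

# auth='NONE' matches HiveServer2's default plain-SASL setup; keep
# auth='NOSASL' only if hive.server2.authentication=NOSASL is set
# on the server side.
conn = hive.Connection(host='localhost', port=10000, username='hadoop', auth='NONE')
cursor = conn.cursor()
cursor.execute('SHOW DATABASES')
print(cursor.fetchall())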

Can we connect to oracle database using python without cx_Oracle in Linux

Can we connect to an Oracle database using Python on Linux without the cx_Oracle library? We are restricted from installing the Oracle Instant Client, which is mandatory for cx_Oracle.
We are getting below error while trying to connect to oracle database using "cx_Oracle"
Error: cx_Oracle.DatabaseError: DPI-1047: Cannot locate a 64-bit Oracle
Client library: "libclntsh.so: cannot open shared object file: No such
file or directory"
Please suggest if there is any way to connect to Oracle without cx_Oracle and without installing the Instant Client.
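One route that avoids the native client entirely is the JDBC approach from the DB2 question above: Oracle's thin JDBC driver needs only a JVM, not the Instant Client. A minimal sketch, assuming a downloaded ojdbc8.jar and placeholder host, service, and credential values:

import jaydebeapi

# The thin JDBC driver speaks to Oracle over the network directly,
# so no libclntsh.so / Instant Client is required, only a JVM.
conn = jaydebeapi.connect(
    'oracle.jdbc.OracleDriver',
    'jdbc:oracle:thin:@//dbhost:1521/myservice',
    ['myusername', 'mypassword'],
    '/path/to/ojdbc8.jar',
)
cursor = conn.cursor()
cursor.execute('SELECT 1 FROM dual')
print(cursor.fetchall())
conn.close()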

How can I connect with an Oracle 8i database through a Python 3.6 code?

I want to connect to an Oracle 8i database using Python 2.7 or Python 3.6; as I am not an Oracle guy, I need your help on this.
My scenario is as follows:
My database server is located at a remote location.
I have to connect to that database through either Python 2.7 or Python 3.6.
After connecting I just want to run normal queries.
Things I have already done:
Installed the cx_Oracle library, version 6.2.
Installed the Oracle Instant Client libraries; using these I am able to connect to everything from Oracle 9i to Oracle 12c.
Now I just want to make a connection to an Oracle 8i database.
Thank you.
Uh, Oracle 8 ... where did you manage to find that fossil?
Anyway: this page says that you should use "OJDBC and JayDeBeApi", which works with databases
supported by Oracle's JDBC drivers (currently 8.1.7 to 11.2.0.2.0)
There's some more info there, so have a look.
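To make that concrete, the call would mirror the DB2 snippet earlier on this page, with two 8i-specific caveats: a database that old is addressed by SID rather than service name, and you need an ojdbc jar old enough to still talk to it (the ojdbc14.jar name below is a hypothetical example):

import jaydebeapi

# Old-style thin URL for pre-service-name databases: host:port:SID
conn = jaydebeapi.connect(
    'oracle.jdbc.driver.OracleDriver',
    'jdbc:oracle:thin:@dbhost:1521:ORCL',
    ['myusername', 'mypassword'],
    '/path/to/ojdbc14.jar',
)
cursor = conn.cursor()
cursor.execute('SELECT * FROM dual')
print(cursor.fetchall())
conn.close()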
