Not able to connect Python with Hive on a Windows machine - python

I have installed Anaconda Navigator and am working in a Spyder workbook.
I am getting a TTransport exception.
I have installed all the packages like pyhive, sasl, thrift and thrift-sasl. While connecting Python to Hive, I got this error: "Could not start SASL: b'Error in sasl_client_start (-4) SASL(-4): no mechanism available: Unable to find a callback: 2'".
Are there any other packages I need to install? I am also working on Windows, so kindly help me with that.
from pyhive import hive
import sasl
import thrift
import thrift_sasl
conn1 = hive.Connection(host="xxxxx", port=00000, username="yyyy")
cur2 = conn1.cursor()
Errors:
conn1 = hive.Connection(host="xxxx", port=00000, username="yyyy")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
conn1 = hive.Connection(host="xxxx", port=00000, username="yyyy")
File "C:\Users\sgpbtp02\AppData\Local\Continuum\anaconda3\lib\site-packages\pyhive\hive.py", line 192, in __init__
self._transport.open()
File "C:\Users\sgpbtp02\AppData\Local\Continuum\anaconda3\lib\site-packages\thrift_sasl\__init__.py", line 79, in open
message=("Could not start SASL: %s" % self.sasl.getError()))
TTransportException: Could not start SASL: b'Error in sasl_client_start (-4) SASL(-4): no mechanism available: Unable to find a callback: 2'
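A frequent cause of this error on Windows is that the installed sasl build cannot find a usable PLAIN mechanism. One hedged workaround, assuming your HiveServer2 is configured with hive.server2.authentication=NOSASL (the helper names and placeholder host below are mine, not from the original post), is to skip SASL entirely:

```python
def make_conn_kwargs(host, port, username, auth="NOSASL"):
    """Collect pyhive connection arguments. auth="NOSASL" bypasses the
    broken SASL client, but only works if the server allows NOSASL."""
    return {"host": host, "port": port, "username": username, "auth": auth}

def connect_nosasl(host, port, username):
    """Open the connection; the pyhive import is deferred so this module
    loads even where pyhive is not installed."""
    from pyhive import hive  # pip install pyhive
    return hive.Connection(**make_conn_kwargs(host, port, username))
```

Usage would then be conn = connect_nosasl("xxxxx", 10000, "yyyy"). If the server insists on SASL, the real fix is a sasl installation with a working PLAIN plugin, which is notoriously difficult on Windows.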

Related

Stuck with a MySQLdb SSL connection error in Python2.7

I have to maintain an old website built using Python2.7 that needs to continue working until we've finished creating a completely new version with more modern tools. Now this old website needs access to a remote MySQL database (connection is set up and working correctly), which so far has worked using the following:
import MySQLdb
db = MySQLdb.connect(host=Host,user=User,passwd=Pass,db=DBse)
Now the server has been upgraded from Ubuntu 18.04 to Ubuntu 20.04, and while I managed to install pip and MySQLdb for Python 2.7, I now get the following error for the lines above:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/dist-packages/MySQLdb/__init__.py", line 86, in Connect
return Connection(*args, **kwargs)
File "/usr/lib/python2.7/dist-packages/MySQLdb/connections.py", line 204, in __init__
super(Connection, self).__init__(*args, **kwargs2)
_mysql_exceptions.OperationalError: (2026, 'SSL connection error: unknown error number')
The SSL connection works fine in Python3 or directly from the command line.
Is there anything I can do to make this work?
Locate /etc/mysql/my.cnf and add the line skip_ssl under the [mysqld] section (I have tried this on MySQL 8.0.29, Ubuntu 18).
This was the only thing that worked for me.
[mysqld]
skip_ssl
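After saving the change, the server must be restarted for the option to take effect. A quick way to confirm SSL is now off (commands assume a systemd-based Ubuntu and the standard mysql command-line client):

```shell
sudo systemctl restart mysql
# have_ssl should now report DISABLED
mysql -u root -p -e "SHOW VARIABLES LIKE 'have_ssl';"
```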

SASL error when trying to connect to Hive (Hue) from Python on my PC - Windows 10

Need your help!
I have read all the documentation that I found on the internet (Stack Overflow, GitHub, etc.), but nothing helped.
I am trying to connect to Hive (Hue) from Python on my PC; my script is:
When I run this code I get this error message:
Traceback (most recent call last):
File "C:/Users/myuser/Documents/Python/testing.py", line 6, in <module>
cursor = hive.connect('myconnect', port=10000, username='root').cursor()
File "C:\Users\myuser\AppData\Local\Continuum\anaconda3\lib\site-packages\pyhive\hive.py", line 94, in connect
return Connection(*args, **kwargs)
File "C:\Users\myuser\AppData\Local\Continuum\anaconda3\lib\site-packages\pyhive\hive.py", line 192, in __init__
self._transport.open()
File "C:\Users\myuser\AppData\Local\Continuum\anaconda3\lib\site-packages\thrift_sasl\__init__.py", line 79, in open
message=("Could not start SASL: %s" % self.sasl.getError()))
thrift.transport.TTransport.TTransportException: Could not start SASL: b'Error in sasl_client_start (-4) SASL(-4): no mechanism available: Unable to find a callback: 2'
My system details:
OS: Windows 10 Pro
Python version: Python 3.7.4
Distribution: Anaconda, Inc. on win32
I found a way to work around the problem.
Actions taken:
Installed VirtualBox with Ubuntu on my PC.
Ran Ubuntu as root.
Installed Python, pip3 and the relevant libraries (pyhive, sasl, thrift_sasl and thrift).
Ran my code - it works great!
My conclusion is that my problem was caused by security restrictions (I am not an admin on my workstation) and Windows issues.
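The install steps above can be sketched as shell commands (package names assumed from a stock Ubuntu; libsasl2-dev provides the headers that the sasl package needs to compile):

```shell
sudo apt-get update
sudo apt-get install -y python3 python3-pip libsasl2-dev libsasl2-modules
pip3 install pyhive sasl thrift thrift_sasl
```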

Python scapy "NameError: global name 'LOOPBACK_NAME' is not defined"

I have this very simple scapy program, which does an ARP ping of my subnet:
from scapy.all import srp, Ether, ARP, conf

def arp_ping(subnet):
    conf.verb = 1
    # broadcast an ARP who-has for every address in the subnet
    answered, unanswered = srp(Ether(dst="ff:ff:ff:ff:ff:ff")/ARP(pdst=subnet),
                               timeout=2, verbose=False, inter=0.1)
    return [rcv.sprintf(r"%Ether.src% - %ARP.psrc%") for snd, rcv in answered]

if __name__ == "__main__":
    subnet = '192.168.1.0/24'
    for i in arp_ping(subnet):
        print i
EDIT
After reinstalling at least PyWin32 and WinPcap with choco, I now get this NameError:
Traceback (most recent call last):
File "arp_sender.py", line 1, in <module>
from scapy.all import srp, Ether, ARP, conf
File "C:\Python27\lib\site-packages\scapy\all.py", line 16, in <module>
from scapy.arch import *
File "C:\Python27\lib\site-packages\scapy\arch\__init__.py", line 83, in <module>
from scapy.arch.windows import *
File "C:\Python27\lib\site-packages\scapy\arch\windows\__init__.py", line 465, in <module>
conf.iface = get_working_if()
File "C:\Python27\lib\site-packages\scapy\arch\windows\__init__.py", line 463, in get_working_if
return LOOPBACK_NAME
NameError: global name 'LOOPBACK_NAME' is not defined
Checking for dependency issues
Running scapy.bat to check for dependencies results in this message:
INFO: Can't load Python libreadline or completer
INFO: Can't import matplotlib. Won't be able to plot.
INFO: Can't import PyX. Won't be able to use psdump() or pdfdump().
WARNING: No match between your pcap and windows network interfaces found. You probably won't be able to send packets. Deactivating unneeded interfaces and restarting Scapy might help. Check your winpcap and powershell installation, and access rights.
INFO: Could not get readline console. Will not interpret ANSI color codes.
WARNING: No default IPv4 routes found. Your Windows release may no be supported and you have to enter your routes manually
Traceback (most recent call last):
File "C:\Python27\Scripts\\scapy", line 25, in <module>
interact()
File "C:\Python27\lib\site-packages\scapy\main.py", line 300, in interact
scapy_builtins = __import__("all",globals(),locals(),".").__dict__
File "C:\Python27\lib\site-packages\scapy\all.py", line 16, in <module>
from scapy.arch import *
File "C:\Python27\lib\site-packages\scapy\arch\__init__.py", line 83, in <module>
from scapy.arch.windows import *
File "C:\Python27\lib\site-packages\scapy\arch\windows\__init__.py", line 465, in <module>
conf.iface = get_working_if()
File "C:\Python27\lib\site-packages\scapy\arch\windows\__init__.py", line 463, in get_working_if
return LOOPBACK_NAME
NameError: global name 'LOOPBACK_NAME' is not defined
My guess is that:
WARNING: No match between your pcap and windows network interfaces found. You probably won't be able to send packets. Deactivating unneeded interfaces and restarting Scapy might help. Check your winpcap and powershell installation, and access rights.
is causing the issue, but I'm unsure how to resolve this.
You are using an old scapy version; update it via
pip install scapy --upgrade
to get 2.4.0. I can tell because recent versions do not need libreadline anymore.
In the latest version, the only dependencies you need to install are:
Winpcap or Npcap
IPython (if asked)
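To confirm the upgrade took effect, you can check the installed version at runtime. parse_version below is a small helper of my own, and the scapy import is deferred so nothing network-related runs until you call the check:

```python
def parse_version(version_string):
    """Turn '2.4.0' into (2, 4, 0); a non-numeric suffix such as 'rc1'
    ends the number for that component."""
    parts = []
    for piece in version_string.split("."):
        num = ""
        for ch in piece:
            if ch.isdigit():
                num += ch
            else:
                break
        parts.append(int(num) if num else 0)
    return tuple(parts)

def scapy_is_recent(minimum=(2, 4, 0)):
    """True if the installed scapy is at least `minimum`."""
    import scapy  # deferred: pip install scapy --upgrade
    return parse_version(scapy.__version__) >= minimum
```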
You may have run your Python code as a non-administrator.
I just encountered the same problem when I used the PyCharm IDE, which was not opened in administrator mode (the default).
After I restarted the IDE as administrator, the error message was gone.
Similarly, when you want to use the scapy library on Linux, you have to run it with sudo.

wfastcgi fails to import cx_Oracle, but `python -c "import cx_Oracle"` succeeds

I have IIS set up with FastCGI, serving a Flask app. So far so good. Next I wish to add some database connectivity, so I add the line import cx_Oracle to my app. Now this error is thrown:
Error occurred while reading WSGI handler:
Traceback (most recent call last):
File "D:\Anaconda2\lib\site-packages\wfastcgi.py", line 791, in main
env, handler = read_wsgi_handler(response.physical_path)
File "D:\Anaconda2\lib\site-packages\wfastcgi.py", line 633, in read_wsgi_handler
handler = get_wsgi_handler(os.getenv("WSGI_HANDLER"))
File "D:\Anaconda2\lib\site-packages\wfastcgi.py", line 616, in get_wsgi_handler
raise ValueError('"%s" could not be imported%s' % (handler_name, last_tb))
ValueError: "Bloomberg_server.app" could not be imported:
Traceback (most recent call last):
File "D:\Anaconda2\lib\site-packages\wfastcgi.py", line 600, in get_wsgi_handler
handler = __import__(module_name, fromlist=[name_list[0][0]])
File "D:\website\__init__.py", line 6, in <module>
import cx_Oracle
ImportError: DLL load failed: The specified module could not be found. StdOut: StdErr:
As the title suggests, I fail to reproduce the issue in a controlled environment. The very same import statement works fine in the conda environment, and moreover, I can run the Flask debug server just fine with pages that rely on a database connection.
I am at a loss. Who has a clue what's going on here? The path/ORACLE_HOME variables are pointing to the instant client, and I have only one Python environment installed.
I am too embarrassed to admit how long this has taken me, but I've found the answer.
FastCGI's core business is keeping subprocesses alive so that subsequent calls to the server do not require booting a Python environment. In other words, after installing a Python package it is advised to reboot. I solved my own question on SO by rebooting.
The answer to this question got me thinking in the right direction.
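If a full machine reboot is too heavy-handed, recycling just the IIS application pool restarts the wfastcgi worker processes, which should be enough for a newly installed package to be picked up; the pool name below is a placeholder:

```shell
:: run from an elevated command prompt; replace MyAppPool with your site's pool
%windir%\system32\inetsrv\appcmd recycle apppool /apppool.name:"MyAppPool"
```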

Python connect to Hadoop using Hive with Kerberos authentication

I am trying to connect Python to Hadoop via Hive using Kerberos. I tried various sources but failed to connect.
import pyhs2

conn_config = {'krb_host': 'hostname', 'krb_service': 'hive'}
pyhs2.connect(host='hostname',
              port=10000,
              authMechanism="KERBEROS",
              password="********",
              user='hostname#SCGLOBALUAT.ADUAT.SCOTIACAPITAL.COM')
Error Encountered:
authMechanism="KERBEROS") as conn:
File "build\bdist.win-amd64\egg\pyhs2\__init__.py", line 7, in connect
File "build\bdist.win-amd64\egg\pyhs2\connections.py", line 46, in __init__
File "build\bdist.win-amd64\egg\pyhs2\cloudera\thrift_sasl.py", line 66, in open
thrift.transport.TTransport.TTransportException: Could not start SASL: Error in sasl_client_start (-4) SASL(-4): no mechanism available: Unable to find a callback: 2
Can somebody please give me clear instructions to connect Python to Hadoop using Hive with a Kerberos ticket?
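pyhs2 is unmaintained; as a sketch of the same connection using pyhive, which supports Kerberos directly, assuming you already hold a valid ticket from kinit and that the HiveServer2 principal uses the service name hive (the helper names below are mine):

```python
def make_kerberos_kwargs(host, port=10000, service="hive"):
    """pyhive arguments for a Kerberized HiveServer2. With KERBEROS auth,
    credentials come from the ticket cache, so no password is passed."""
    return {
        "host": host,
        "port": port,
        "auth": "KERBEROS",
        "kerberos_service_name": service,
    }

def connect_with_kerberos(host, port=10000):
    """Deferred import keeps this sketch loadable without pyhive installed."""
    from pyhive import hive  # pip install pyhive sasl thrift thrift_sasl
    return hive.Connection(**make_kerberos_kwargs(host, port))
```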
