Issue with using snowflake-connector-python with Python 3.x

I've spent half a day trying to figure this out on my own, but now I've run out of ideas and Google searches.
Basically, what I want is to connect to our Snowflake database using the snowflake-connector-python package. I was able to install the package just fine (together with all the related packages that were installed automatically), and my current pip3 list shows this:
Package Version
-------------------------- ---------
asn1crypto 1.3.0
azure-common 1.1.25
azure-core 1.6.0
azure-storage-blob 12.3.2
boto3 1.13.26
botocore 1.16.26
certifi 2020.6.20
cffi 1.14.0
chardet 3.0.4
cryptography 2.9.2
docutils 0.15.2
gitdb 4.0.5
GitPython 3.1.3
idna 2.9
isodate 0.6.0
jmespath 0.10.0
msrest 0.6.17
oauthlib 3.1.0
oscrypto 1.2.0
pip 20.1.1
pyasn1 0.2.3
pyasn1-modules 0.0.9
pycparser 2.20
pycryptodomex 3.9.8
PyJWT 1.7.1
pyOpenSSL 19.1.0
python-dateutil 2.8.1
pytz 2020.1
requests 2.23.0
requests-oauthlib 1.3.0
s3transfer 0.3.3
setuptools 47.3.1
six 1.15.0
smmap 3.0.4
snowflake-connector-python 2.2.8
urllib3 1.25.9
wheel 0.34.2
Just to be clear, this is a clean Python venv, although I've tried it in my main environment, too.
When running the following code in VS Code:
#!/usr/bin/env python
import snowflake.connector

# Gets the version
ctx = snowflake.connector.connect(
    user='user',
    password='pass',
    account='acc')
I'm getting this error:
AttributeError: module 'snowflake' has no attribute 'connector'
Does anyone have any idea what could be the issue here?

AttributeError: module 'snowflake' has no attribute 'connector'
Your test code is likely in a file named snowflake.py, which causes a conflict in the import (the script ends up importing itself). Rename the file to some other name and it should allow you to import the right module and run the connector functions.
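As a quick sanity check, you can print what Python actually resolves for the name snowflake (a small diagnostic sketch):

import snowflake

# For the real connector, snowflake is typically a namespace package (often no __file__);
# if a path to a stray snowflake.py shows up here, that file is shadowing the package.
print(getattr(snowflake, "__file__", None))
print(getattr(snowflake, "__path__", None))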

Try importing 'connector' explicitly. I had the same error.
import pandas as pd
import snowflake as sf
from snowflake import connector

When I had this issue, I had both the snowflake and snowflake-connector-python packages installed in my environment, so Python was confused about which one to use. If you're trying to use the connector, just pip uninstall snowflake.

If you have installed both snowflake and snowflake-connector-python, uninstalling the snowflake package will resolve the issue.
To list the installed Python packages, use the command:
pip list
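If you prefer to check from inside Python itself, a minimal sketch (assuming Python 3.8+ for importlib.metadata) that lists any snowflake-related distributions looks like this:

from importlib.metadata import distributions

# Print every installed distribution whose name mentions "snowflake";
# seeing both "snowflake" and "snowflake-connector-python" indicates the conflict.
for dist in distributions():
    name = dist.metadata["Name"]
    if name and "snowflake" in name.lower():
        print(name, dist.version)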

I installed Python 3.6 again and removed this line from the code:
#!/usr/bin/env python
It worked.

Have you tried using this package instead in the Jupyter notebook?
pip install snowflake-connector-python

If anyone comes across this same issue and doesn't get relief from the fixes above: my install had a snowflake.py file saved to the site-packages folder.
This caused the import snowflake.connector statement to attempt to import from the snowflake.py file instead of the snowflake package folder, and of course it couldn't find the connector module, since the .py file isn't a package. Renaming or moving that file solved my problem.

I had a sub-folder named snowflake, and depending on how I ran the script, it either could or could not import snowflake.connector.

Related

import mysql.connector not working on Google Cloud Functions

I am attempting to run my Python script in Google Cloud Functions to access it via an HTTP request for automation purposes. The problem I am having occurs when deploying the following:
from datetime import datetime, date, timedelta
import time
import mysql.connector
import requests
from dotty_dict import dotty
from fake_useragent import UserAgent
from mysql.connector import Error
import smtplib
from email.message import EmailMessage
from fractions import Fraction
I get the following error: ModuleNotFoundError: No module named '_version'
I then change my import to only import mysql standalone, like the following:
import time
import mysql
import requests
from dotty_dict import dotty
from fake_useragent import UserAgent
import smtplib
from email.message import EmailMessage
from fractions import Fraction
I then naturally get an AttributeError: AttributeError: module 'mysql' has no attribute 'connector'
My connection within my function looks like this: connection = mysql.connector.connect
I have a requirements.txt file which contains the following:
fake_useragent==1.1.1
mysql_connector_repackaged==0.3.1
numpy==1.23.5
requests==2.28.1
Doing some general research, it was recommended to try upgrading the MySQL package, so I upgraded it via the Cloud Shell with the normal pip command: pip install mysql-connector-python --upgrade
I get the following issue:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
grpcio-status 1.51.1 requires protobuf>=4.21.6, but you have protobuf 3.20.1 which is incompatible.
googleapis-common-protos 1.57.0 requires protobuf!=3.20.0,!=3.20.1,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-cloud-vision 3.1.4 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-cloud-videointelligence 2.8.3 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-cloud-translate 3.8.4 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-cloud-spanner 3.24.0 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-cloud-logging 3.3.0 requires protobuf!=3.20.0,!=3.20.1,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-cloud-language 2.6.1 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-cloud-datastore 2.11.0 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-cloud-bigquery 3.4.0 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-cloud-bigquery-storage 2.16.2 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-cloud-appengine-logging 1.1.6 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
google-api-core 2.11.0 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.20.1 which is incompatible.
I then attempted to upgrade protobuf and I received this error:
mysql-connector-python 8.0.31 requires protobuf<=3.20.1,>=3.11.0, but you have protobuf 4.21.11 which is incompatible.
Successfully installed protobuf-4.21.11
Any ideas on what I need to do to get this working?
You can refer to this Stack Overflow link, which states that
sometimes the protobuf package might be installed without your involvement.
To resolve this issue, try downgrading the protobuf package using the command
pip install protobuf==3.20.*
Or add the pinned version to requirements.txt, which will overwrite the existing protobuf package.
The Stack Overflow link mentioned above includes several alternative solutions as well, which might be useful.
Meanwhile, check that the other packages installed in your Google Cloud Function are compatible with the MySQL connector package. You can also refer to this GitHub link and the documentation, which may help you.
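Once the protobuf conflict is resolved, and assuming mysql-connector-python (rather than the repackaged variant listed in the question) ends up in requirements.txt, a minimal connection sketch looks like the following; the host, user, password, and database values are placeholders:

import mysql.connector
from mysql.connector import Error

try:
    # Placeholder connection details; replace with your own.
    connection = mysql.connector.connect(
        host="localhost",
        user="user",
        password="password",
        database="mydb",
    )
    print("Connected:", connection.is_connected())
    connection.close()
except Error as exc:
    print("Connection failed:", exc)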

synology NAS, pip issue, python libraries

I recently acquired a NAS and I'm trying to run a Python script on it. The issue is that I cannot run the script as a normal user (python script.py); Python only works if I run it as a sudo user (e.g., sudo python script.py).
When I run it as a normal user, it does not detect libraries such as pandas. But I do not want to run it with sudo, because sudo requires a password.
I think the issue comes from the pip installation.
Do you know what the issue could be?
Greatly appreciated.
In the terminal, when I run pip list as a normal user, it states ModuleNotFoundError: No module named 'pip._internal':
user@NAS:~$ pip list
Traceback (most recent call last):
  File "/bin/pip", line 5, in <module>
    from pip._internal.cli.main import main
ModuleNotFoundError: No module named 'pip._internal'
but when I run it with sudo:
user@NAS:~$ sudo pip list
Password:
Package Version
certifi 2022.6.15
charset-normalizer 2.1.1
idna 3.3
numpy 1.23.2
pandas 1.5.1
pip 22.3.1
pyasn1 0.4.5
pysmb 1.2.2
python-dateutil 2.8.2
pytz 2022.2.1
requests 2.28.1
setuptools 56.0.0
six 1.16.0
thread6 0.2.0
urllib3 1.26.12
wheel 0.38.4
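Since the two pip lists clearly come from different environments, a small diagnostic sketch, run once as a normal user and once with sudo, can show which interpreter and which site-packages directories each invocation actually uses:

import sys
import site

# Which Python binary is running, and where it looks for installed packages.
print("interpreter:", sys.executable)
print("version:", sys.version)
print("site-packages:", site.getsitepackages())
print("user site-packages:", site.getusersitepackages())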

Import "google.oauth2.credentials" could not be resolved Visual Studio Code

I am trying to use google.oauth2, google_auth_oauthlib.flow and googleapiclient.discovery for my web app, but I'm getting these errors:
Import "google.auth.transport.requests" could not be resolved
Import "google.oauth2.credentials" could not be resolved
Import "googleapiclient.discovery" could not be resolved
And the output is:
from google.auth.transport.requests import Request
ModuleNotFoundError: No module named 'google'
Here's my pip list:
google 3.0.0
google-api-core 2.8.2
google-api-python-client 2.51.0
google-auth 2.8.0
google-auth-httplib2 0.1.0
google-auth-oauthlib 0.5.2
googleapis-common-protos 1.56.3
gspread 5.4.0
httplib2 0.20.4
oauth2client 4.1.3
oauthlib 3.2.0
requests 2.28.0
requests-oauthlib 1.3.1
I don't understand why I am getting these errors. I hope you can help me.
There may be more than one Python environment on your machine. Press Ctrl+Shift+P to open the command palette, then type and select Python: Select Interpreter (or click on the interpreter version displayed in the lower right corner).
Choose the correct Python interpreter and this will solve your problem.
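After switching interpreters, a quick check like this (a sketch, run with the newly selected interpreter) confirms that the google packages now resolve:

import sys

print(sys.executable)  # should point at the interpreter you just selected

try:
    import google.oauth2.credentials
    import googleapiclient.discovery
except ImportError as exc:
    print("still using the wrong environment:", exc)
else:
    print("google modules resolved")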

Why is the requests module not installed according to the google-auth library?

I'm using the Google Calendar API to fetch some events, and I'm using a venv to house all the required packages. I've run into a simple problem: google.auth.transport.requests can't function correctly, claiming that the requests package is not installed.
Code and error:
The problem can be observed using a single line of code:
from google.auth.transport.requests import Request
The following error is dumped to the console from google.auth.transport.requests (at the bottom of two tracebacks):
File "<string>", line 3, in raise_from
ImportError: The requests library is not installed, please install the requests package to use the requests transport.
Failed attempts
Deleting and remaking the venv.
Installing modules with --no-cache-dir and --ignore-installed.
Information
Executing import requests or from google.auth.transport.requests import Request from the console via env\Scripts\python works without a problem.
The same lines, when put in a file temp.py placed inside the following directories, execute in the following manner:
AutoMate\: Safe
AutoMate\src\: Safe
AutoMate\src\sources\: Error
AutoMate\src\sources\temp\: Safe (sources\temp only made for debugging)
AutoMate\src\sources\util\: Safe.
Note: All tests here and below have been run from AutoMate\ using env\Scripts\python and env\Scripts\pip.
None of the google auth modules have been installed in the Python installation outside of the venv.
The project structure is as follows:
AutoMate
| temp.py
├───env
│ ├───Include
│ ├───Lib
│ └───Scripts
└───src
│ AutoMate.pyw
│
└───sources
│ calendar.py --> Problematic file
│ whatsapp.py
│ __init__.py
│
└───util
Output of env\Scripts\pip list:
Package Version
------------------------ ---------
cachetools 4.2.1
certifi 2020.12.5
chardet 4.0.0
google-api-core 1.26.1
google-api-python-client 2.0.2
google-auth 1.27.1
google-auth-httplib2 0.1.0
google-auth-oauthlib 0.4.3
googleapis-common-protos 1.53.0
httplib2 0.19.0
idna 2.10
oauthlib 3.1.0
packaging 20.9
pip 21.0.1
protobuf 3.15.5
pyasn1 0.4.8
pyasn1-modules 0.2.8
pyparsing 2.4.7
pytz 2021.1
requests 2.25.1
requests-oauthlib 1.3.0
rsa 4.7.2
selenium 3.141.0
setuptools 49.2.1
six 1.15.0
uritemplate 3.0.1
urllib3 1.26.3
Solution
Any file that uses google.auth.transport.requests must not have a file named exactly calendar.py in the same directory as itself.
Almost any other filename works, even calendar.python and Calendar.py.
On testing, it seems that requests.py and datetime.py are also invalid names, while math.py and time.py seem to work fine; maybe the invalid names are used internally? Needs further investigation.
Reason
It looks like some internal interference to me; I'm hoping someone will respond with the reason for this problem as well.
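A likely mechanism: Python puts the script's own directory at the front of sys.path, so a local calendar.py shadows the standard-library calendar module that requests uses internally, and google-auth then reports the failed requests import as "requests is not installed". A quick way to check for such shadowing (a diagnostic sketch) is:

import sys
import calendar

# The stdlib calendar module should resolve to Python's own Lib directory;
# a path inside your project here means a local file is shadowing it.
print(calendar.__file__)
print(sys.path[0])  # the script's directory, searched before site-packages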
There is a try-except block within google.auth.transport.requests around the import of the requests package, which falls back to that error message even if the package is already installed. Code
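The guarded import is roughly of this shape (an illustrative sketch of the pattern, not the library's exact code):

try:
    import requests
except ImportError as caught_exc:
    # Any failure while importing requests (including a shadowed stdlib module)
    # is reported as if requests itself were missing.
    raise ImportError(
        "The requests library is not installed, please install the "
        "requests package to use the requests transport."
    ) from caught_exc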
Trying to import requests directly in the REPL revealed that there was an issue with from collections import Mapping (I was using Python 3.10 at the time). As suggested here, downgrading my Python version and creating a new venv resolved this problem for me.

ImportError: No module named azure.storage.blob

I am trying to download a file from Azure onto my local device via Python, using the following code:
from azure.storage.blob import BlockBlobService

block_blob_service = BlockBlobService(account_name='account_name',
                                      account_key='mykey')
block_blob_service.get_blob_to_path('container', 'file', 'out-test.csv')
However, when I run this code I get the following error:
ImportError: No module named azure.storage.blob
I have already installed the azure modules, as seen in this snippet of the output of pip list:
C:\Users\hpfc87>pip list
Package Version
-------------------- ---------
adal 0.6.0
asn1crypto 0.24.0
azure-common 1.1.11
azure-nspkg 2.0.0
azure-storage 0.36.0
azure-storage-blob 1.1.0
azure-storage-common 1.1.0
azure-storage-file 1.1.0
azure-storage-nspkg 3.0.0
azure-storage-queue 1.1.0
I have tried following various posts about this issue, like the ones below, but I have had no luck:
ImportError: No module named azure.storage.blob (when doing syncdb)
https://github.com/Azure/azure-storage-python/issues/262
Any advice?
Just to summarize, your issue results from a conflict between the Python 2.7 version you installed and the Python version that ships with Anaconda.
The packages you installed went into the Python 2.7 environment; however, Spyder uses Anaconda's own bundled Python environment.
The command line uses the system Python environment, which is determined by whichever Python appears first in the PATH environment variable.
