I am making a very simple API call to the Google Vision API, but it keeps failing with an error that the 'google.oauth2' module cannot be found. I've pip installed all the dependencies. To check this, I imported the google.oauth2 module in a command-line Python session and it works there. Please help me with this.
There are multiple possible reasons for this:
Check whether you have installed the dependencies in one place or in several. Try to install them only in the source library folder.
If the above doesn't solve it, uninstall all Google packages from your local machine, delete the lib folder in your app folder, recreate it, and then execute:
pip install -t lib google-auth google-auth-httplib2 google-api-python-client --upgrade
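If the import works in a command-line session but not in your app, the app may be running under a different interpreter or search path than the one you tested. A quick diagnostic using only the standard library, run from inside the failing app:
import sys

print(sys.executable)   # which interpreter is actually running
print(sys.path)         # where it looks for packages

import google.oauth2    # fails here if the package is not on this path
print(google.oauth2.__file__)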
Hope this solves your problem!
I want to create an airflow DAG to transfer files to cloud storage but I'm running into a problem importing Google Cloud libraries.
Libraries I want to use:
from airflow.providers.google.cloud.operators.gcs import GCSCreateBucketOperator, GCSDeleteBucketOperator
from airflow.providers.google.cloud.transfers.gcs_to_local import GCSToLocalFilesystemOperator
from airflow.providers.google.cloud.transfers.local_to_gcs import LocalFilesystemToGCSOperator
The error I got:
pkg_resources.ContextualVersionConflict: (protobuf 4.21.9 (/opt/anaconda3/lib/python3.9/site-packages), Requirement.parse('protobuf<4.0.0dev'), {'google-cloud-secret-manager'})
I tried pip install googleapis-common-protos --upgrade to fix the problem, but it persists.
In your virtual env, you can try installing the Apache Airflow package with the gcp extra to prevent dependency conflicts:
Example with pip:
requirements.txt file
apache-airflow[gcp]==2.4.2
pip command:
pip install -r requirements.txt
You can also use another package manager with Python, like pipenv with a Pipfile:
[packages]
apache-airflow = { version = "==2.4.2", extras = ["gcp"] }
In all cases, I really recommend using a virtual env to isolate the packages for your current project and to prevent conflicts between installed packages.
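Once the gcp extra is installed, the imports from the question should resolve. If it helps, here is a minimal DAG sketch using two of those operators; the dag id, bucket name, and file paths are placeholders, not from the question:
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.gcs import GCSCreateBucketOperator
from airflow.providers.google.cloud.transfers.local_to_gcs import LocalFilesystemToGCSOperator

with DAG(
    dag_id="gcs_transfer_example",        # placeholder dag id
    start_date=datetime(2022, 11, 1),
    schedule=None,
    catchup=False,
) as dag:
    create_bucket = GCSCreateBucketOperator(
        task_id="create_bucket",
        bucket_name="my-example-bucket",  # placeholder bucket
    )
    upload_file = LocalFilesystemToGCSOperator(
        task_id="upload_file",
        src="/tmp/data.csv",              # placeholder local file
        dst="data/data.csv",
        bucket="my-example-bucket",
    )
    create_bucket >> upload_file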
TL;DR How do I install the AWS X-Ray SDK? The pip install doesn't seem to get the full package.
Hello Folks,
I'm working on trying out AWS X-Ray for my Python Lambda. I know the library is large, but I wanted a POC before putting it into a layer. The issue is that when I install it, it doesn't seem to get fully installed.
When I run pip install -r requirements.txt I see that it installs two directories, aws_xray_sdk and aws_xray_sdk-2.9.0.dist-info. When I look into them I see that they are 740 kilobytes altogether (which makes me think this is a stub of some sort).
When I upload my Lambda and test it, I get the following error even though the directories are in my venv:
[ERROR] Runtime.ImportModuleError: Unable to import module 'users/main': No module named 'aws_xray_sdk' Traceback (most recent call last):
Any help is greatly appreciated.
To install the X-Ray SDK for Python, you simply do pip install aws-xray-sdk and that should work. This is explained in the documentation.
If this is not working, there is something else that's wrong with your setup.
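Once the import works, a minimal handler sketch to sanity-check the SDK could look like this (the subsegment name is just an illustration):
from aws_xray_sdk.core import xray_recorder, patch_all

patch_all()  # instruments supported libraries such as boto3 and requests

def handler(event, context):
    # Lambda creates the parent segment; we only open a subsegment here
    with xray_recorder.in_subsegment("poc-work"):  # illustrative name
        return {"statusCode": 200}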
The issue was that I was not putting my dependencies in the root of my .zip file as the docs clearly show ='(
4. Create a deployment package with the installed libraries at the root.
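Concretely, that means installing the dependencies next to your handler and zipping from inside that directory so they land at the archive root; the paths here are illustrative:
pip install -r requirements.txt -t package
cd package
zip -r ../function.zip .
cd ..
zip -g function.zip users/main.py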
I'm using import ldap in Python code on a Windows 10 machine.
I installed the python-ldap module:
pip3 install python-ldap
Installed the dependencies based on the instructions at Python Can't install packages
Also resolved all the pip deployment issues based on Installing python-ldap in a virtualenv on Windows
I'm now getting the following error when executing the import ldap statement. Am I missing something here? Any ideas on how to resolve it?
File "...\Python39\site-packages\ldap\__init__.py", line 34, in <module>
    import _ldap
ImportError: DLL load failed while importing _ldap: The specified module could not be found.
Visit the unofficial Python binaries page:
https://www.lfd.uci.edu/~gohlke/pythonlibs/#python-ldap
Download the appropriate WHL package for your system.
For example, if you're using Python 3.8 on an x64 system, download python_ldap-3.3.1-cp38-cp38-win_amd64.whl
(hint: do NOT download the +sasl version unless you have the Cyrus SASL code running on your system...)
Start the VirtualEnv for your project, if you're using one (C:\Users\youruser\.virtualenv\YourVirtualEnv\Scripts\activate.bat) -- if you're not, skip this step.
Then run pip3 install C:\Path\To\python_ldap_x.x.x-cpXX-cpXX-winXX.whl and this should install the Python DLL (.pyd) file for you.
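After the wheel installs, a quick check from the same interpreter confirms the C extension now loads:
import ldap
print(ldap.__version__)  # should print the version (e.g. 3.3.1) with no DLL error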
I want to run a Docker image in Azure as a container. In Visual Studio Code everything works fine, but Azure has problems with "from azure.servicebus import QueueClient". I get the error message:
"File "./main.py", line 11, in
from azure.servicebus import QueueClient, Message
ModuleNotFoundError: No module named 'azure' "
I want to install the package azure-servicebus.
How can I do that?
The azure-servicebus package can be installed in the Docker image using:
pip install azure-servicebus
More information about the package can be found at https://pypi.org/project/azure-servicebus/
However, how to install it via the Dockerfile depends on how you are installing the other dependent packages (maybe via a requirements.txt file, directly via pip, or using another package manager like conda). If you are able to post a sample Dockerfile, we could suggest changes which should work immediately.
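For example, assuming the dependencies live in a requirements.txt that lists azure-servicebus, a minimal Dockerfile sketch could look like this (base image and paths are placeholders):
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "main.py"]
One more note: QueueClient comes from the older azure-servicebus 0.50.x API; the current 7.x releases expose ServiceBusClient instead, so pin the package version that matches your imports.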
When I run:
from google.colab import auth
I get this error:
ModuleNotFoundError: No module named 'google.colab'
This module is required for accessing files on Google Drive from Python. How can I resolve this error?
You can simply install google-colab and use it locally:
pip install google-colab
If you want to run google-colab on your local machine and install it via conda, just type the following:
conda install -c conda-forge google-colab
For reference: https://anaconda.org/conda-forge/google-colab
AFAIK, you can execute the module 'google.colab' only from within the notebook environment of colab.research.google.com (it is not a publicly available package).
OFF-TOPIC:
Looking at the conda tag on your question, I assume that you are running the code on your local machine. Please use PyDrive to read from Google Drive on your local machine; see the sketch after the references.
References:
Google Colaboratory FAQ
PyDrive
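A minimal PyDrive sketch, assuming you have created OAuth credentials in the Google API Console and saved them as client_secrets.json in the working directory:
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
gauth.LocalWebserverAuth()  # opens a browser window for the OAuth consent flow
drive = GoogleDrive(gauth)

# list the files in the Drive root folder
for f in drive.ListFile({'q': "'root' in parents and trashed=false"}).GetList():
    print(f['title'], f['id'])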
You can use !python setup.py install to do that.
Colab is just like a Jupyter notebook, so we can use the ! operator here to install any package. What ! actually does is tell the notebook cell that the line is not Python code but a command-line script. So, to run any command-line script in Colab, just add a ! before the line.
For example: !pip install tensorflow. This treats that line (here pip install tensorflow) as a command-prompt line rather than Python code. However, if you run it without the leading !, it'll throw an error saying "invalid syntax".
But keep in mind that you'll have to upload the setup.py file to your Drive before doing this (preferably into the same folder as your notebook).
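Putting it together in a Colab cell (the Drive folder path is hypothetical):
from google.colab import drive
drive.mount('/content/drive')
%cd /content/drive/MyDrive/my_project   # hypothetical folder containing setup.py
!python setup.py install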
Hope this answers your question.