I have a Jupyter notebook in which I've built a script that extracts data from a Google Sheet using these two imports:
from googleapiclient.discovery import build
from google.oauth2 import service_account
I'm trying to copy it to AWS Lambda and I'm having trouble uploading these three libraries to a layer:
google-api-python-client
google-auth-httplib2
google-auth-oauthlib
I downloaded them from pypi.org. Each has only one download option and doesn't specify which versions of Python 3 it's compatible with, except google-api-python-client, whose description says "Python 3.7, 3.8, 3.9, 3.10 and 3.11 are fully supported and tested."
I just checked, and my Jupyter notebook is running Python 3.10. I've also copied the script into VS Code, where these libraries likewise appear to work only under Python 3.10. That's odd, since at least one of them should work across all supported versions, which makes me think I'm doing something wrong.
Also, it doesn't look like Lambda supports Python 3.10. Does that mean there's no way to run the Google libraries on it, or do I need to use older versions of the libraries?
If you don't have Python 3.9 locally, you can use Docker to run it inside a container and see which package versions you need.
FROM amazon/aws-lambda-python:3.9
RUN pip install google-api-python-client google-auth-httplib2 google-auth-oauthlib
Build it:
docker build . --progress=plain
See logs:
#5 24.65 Successfully installed cachetools-5.3.0 certifi-2022.12.7 charset-normalizer-3.0.1 google-api-core-2.11.0
google-api-python-client-2.77.0 google-auth-2.16.0
google-auth-httplib2-0.1.0 google-auth-oauthlib-1.0.0
googleapis-common-protos-1.58.0 httplib2-0.21.0 idna-3.4
oauthlib-3.2.2 protobuf-4.21.12 pyasn1-0.4.8 pyasn1-modules-0.2.8
pyparsing-3.0.9 requests-2.28.2 requests-oauthlib-1.3.1 rsa-4.9
six-1.16.0 uritemplate-4.1.1 urllib3-1.26.14
So your requirements.txt for Python 3.9 will look like:
google-api-python-client==2.77.0
google-auth-httplib2==0.1.0
google-auth-oauthlib==1.0.0
I recommend you work locally using the same version of Python and its packages. Docker is great for that!
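With the layer built from those pinned packages attached to your function, a handler along these lines should work (a minimal sketch; the key file path, the SPREADSHEET_ID environment variable, and the range are hypothetical placeholders):

# Minimal sketch of a Lambda handler reading a Google Sheet.
# service_account.json, SPREADSHEET_ID and the range are placeholders.
import os
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets.readonly"]

def handler(event, context):
    # Credentials JSON shipped with the deployment package (hypothetical path)
    creds = service_account.Credentials.from_service_account_file(
        "service_account.json", scopes=SCOPES)
    service = build("sheets", "v4", credentials=creds)
    result = service.spreadsheets().values().get(
        spreadsheetId=os.environ["SPREADSHEET_ID"],
        range="Sheet1!A1:C10").execute()
    return result.get("values", [])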
I am trying to install pycryptodome in an Azure Synapse notebook. Details below.
Scenario: I have created a notebook and an Apache Spark pool in Azure Synapse. I used the command below to list the packages installed on the pool, and I don't see my required package in the list. So I tried to install it using a requirement.txt file and a requirement.yml file in the Packages section of the Apache Spark pool.
Steps performed:
pip list: command to see the packages already installed.
Created the files below and uploaded them in the Packages section of the Apache Spark pool.
requirement.txt:
pycryptodome==3.16.0
requirement.yml code:
name: pycrypto_lib
channels:
  - defaults
dependencies:
  - pip:
      - pycryptodome
Error: see the attached screenshot.
Please share your suggestions. Thanks!
If you want to install pycryptodome, follow the steps below:
Go to Manage -> Apache Spark pools -> select Packages -> upload package_file.txt -> Apply.
Note: inside package_file.txt, make sure to specify your package name and version:
pycryptodome==3.16.0
I tested this in my environment and successfully installed pycryptodome==3.16.0.
To verify the package version, run the code below:
import pkg_resources
for d in pkg_resources.working_set:
    print(d)
You can check that the package installed successfully.
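If you only need to confirm a single package, you can also query it directly with the same pkg_resources API (a small sketch):

import pkg_resources

# Prints just the installed pycryptodome version, e.g. "3.16.0"
print(pkg_resources.get_distribution("pycryptodome").version)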
I am using the psutil library on my Databricks cluster, and it had been running fine for the last couple of weeks. When I started the cluster today, this specific library failed to install. I noticed that a different version of psutil had been published on the site.
Currently my Python script fails with 'No module named psutil'.
I tried installing the previous version of psutil using pip install, but my code still fails with the same error.
Is there any alternative to psutil, or is there a way to install it in Databricks?
As far as I know, there are two ways to install a Python package in an Azure Databricks cluster, as below.
As shown in the two figures below, go to the Libraries tab of your cluster, click the Install New button, type the name of the package you want to install, then wait for it to install successfully.
Open a notebook and run the shell command below to install a Python package via pip. Note: to install into the current Python environment of the Databricks cluster, rather than the system environment of Linux, you must use /databricks/python/bin/pip, not plain pip.
%sh
/databricks/python/bin/pip install psutil
Finally, I ran the code below; it works with both of the approaches above.
import psutil
for proc in psutil.process_iter(attrs=['pid', 'name']):
    print(proc.info)
psutil.pid_exists(<a pid number in the printed list above>)
In addition to Peter's response, you can also use "Library utilities" to install Python libraries.
Library utilities allow you to install Python libraries and create an environment scoped to a notebook session. The libraries are available both on the driver and on the executors, so you can reference them in UDFs. This enables:
- Library dependencies of a notebook to be organized within the notebook itself.
- Notebook users with different library dependencies to share a cluster without interference.
Example: to install the "psutil" library using library utilities:
dbutils.library.installPyPI("psutil")
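You can also pin an exact version and restart the Python interpreter so the freshly installed library becomes importable (a short sketch; the version number is only an example):

dbutils.library.installPyPI("psutil", version="5.6.3")  # version pin is only an example
dbutils.library.restartPython()  # restart so the notebook session picks up the library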
**Reference:** Databricks - library utilities
Hope this helps.
I am trying to connect Firebase with an AWS Lambda function. I am using their firebase-admin SDK. I have installed it and created the dependency package as described here, but I am getting this error on Lambda:
Unable to import module 'index':
Failed to import the Cloud Firestore library for Python.
Make sure to install the "google-cloud-firestore" module.
I previously tried setting up a similar function using Node.js, but I received an error message because gRPC was not configured. I think this error message might stem from that same problem. I don't know how to fix it. I have tried:
pip install grpcio -t path/to/...
and installing google-cloud-firestore, but neither fixed the problem. When I run the code from my terminal, I get no errors.
Part of the problem here is that grpcio compiles a platform-specific dynamic module: cygrpc.cpython-37m-darwin.so (in my case). According to this answer, you cannot import dynamic modules from a zip file: https://stackoverflow.com/a/58140801
Updating to Python 3.8 fixed this for me.
As Alex DeBrie mentioned in his article on serverless.com,
The plugins section registers the plugin with the Framework. In the custom section, we tell the plugin to use Docker when installing packages with pip. It will use a Docker container that's similar to the Lambda environment so the compiled extensions will be compatible. You will need Docker installed for this to work.
In other words, the local environment differs from the Lambda environment, so the compiled extensions differ too. If you use a container to install the packages with pip, the container mimics the Lambda environment, and everything runs well.
If you use the Serverless Framework to deploy your Python app to AWS Lambda, add these lines to your serverless.yml file:
...
plugins:
  - serverless-python-requirements
...
custom:
  pythonRequirements:
    dockerizePip: non-linux
    dockerImage: mlupin/docker-lambda:python3.9-build
...
Then serverless-python-requirements will automatically start a Docker container based on the mlupin/docker-lambda:python3.9-build image.
This container mimics the Lambda environment and lets pip install and compile everything inside it, so the compiled extensions will be compatible.
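Once the dependencies are packaged this way, a handler along these lines should import cleanly (a minimal sketch; the key file name and the "users" collection are hypothetical placeholders):

# Minimal sketch of a Lambda handler using firebase-admin and Firestore.
# serviceAccountKey.json and the "users" collection are placeholders.
import firebase_admin
from firebase_admin import credentials, firestore

cred = credentials.Certificate("serviceAccountKey.json")
firebase_admin.initialize_app(cred)
db = firestore.client()

def handler(event, context):
    # Read one document to confirm Firestore connectivity
    doc = db.collection("users").document("test").get()
    return {"exists": doc.exists}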
This worked in my case. Hope this helps.
I want to use the fst-pso package in Python, which needs the ANTLR3 Python runtime.
I downloaded antlr_python_runtime-3.1.3.tar.gz from http://www.antlr3.org/download/Python/ and ran the command sudo python setup.py install. The output of the command was:
Installed /path/to/python/packages/antlr_python_runtime-3.1.3-py2.7.egg
But after this, when I try to import the fstpso module in Python, it throws the error:
The ANTLR3 python runtime was not detected; pyfuzzy cannot import FST-PSO's FLC files
I am using Python 2.7.12 on Linux.
Did I do something wrong, or do I have to update a PATH in the environment?
Thanks for your help!!
I'm fst-pso's main developer. In the last few days I reimplemented the Sugeno reasoner from scratch, finally removing the pyfuzzy/ANTLR3 dependency. I just uploaded the new package to PyPI.
Now you can pip install the new version of fst-pso (v 1.4.0); please let me know if that works correctly.
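As a quick sanity check after upgrading, the import should now succeed without the ANTLR3 runtime installed (a tiny sketch; FuzzyPSO is the package's documented entry point):

# After: pip install --upgrade fst-pso
from fstpso import FuzzyPSO

FP = FuzzyPSO()  # instantiating no longer requires pyfuzzy/ANTLR3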
Problem:
I'd like to install Pmw 2.0.0 (project page here) so that I can use it with tkinter in Python 3. The package's setup script detects which version of Python you're using and installs the version appropriate for your system (Ubuntu 15 in my case). I can't find any switches to make it install 2.0.0 instead of 1.3.3 (the Python 2.7 version), nor have I been able to get the script to install into the Python 3 libraries.
What I've done so far:
I've changed the Python version detector in the setup script from
if sys.version_info[0]<3:
    version='2.0.0' # really '1.3.3'
    packages=['Pmw', 'Pmw.Pmw_1_3_3', 'Pmw.Pmw_1_3_3.lib',]
to
if sys.version_info[0]<2:
    version='2.0.0' # really '1.3.3'
    packages=['Pmw', 'Pmw.Pmw_1_3_3', 'Pmw.Pmw_1_3_3.lib',]
to attempt to force the installer to default to the Python 3 version, which it does, but it installs the files into the Python 2.7 libraries (/usr/local/lib/python2.7/dist-packages).
What I want to do:
I'm looking for a way to force the installer to put the 3.4-compatible package into the Python 3 libraries. If that means getting it to install both packages into their respective correct directories, that's fine too. I'm stumped about what to try next.
Answered by RazZiel on AskUbuntu:
Link here.
Instead of using the command sudo python setup.py build and then sudo python setup.py install, I should have been using python3 to execute the setup script. I've managed to outthink myself pretty badly on this one.