Python in AWS Elastic Beanstalk: Private package dependencies

I would like to deploy a Python Flask application on beanstalk.
The application depends on external packages (e.g. geopy) and internal packages (e.g. adam_geography).
The manual says:
Create a requirements.txt file and place it in the top-level directory
of your source bundle.
This would probably fetch geopy and its dependencies, but would not fetch adam_geography, which is available from a custom repo inside my VPC.
How do I specify/upload private, internal Python package dependencies in a Beanstalk application?

1) Copy the internal Python package to the server.
2) Use pip's "editable installs" feature to install the private package:
pip install -e path/to/SomeProject
http://pip.readthedocs.org/en/latest/reference/pip_install.html#editable-installs
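A minimal sketch of how this can tie into the requirements.txt that Beanstalk already reads (the vendor/ path is a placeholder for wherever the internal package gets copied inside the source bundle):
# requirements.txt: public packages come from PyPI, the internal one from the bundled copy
geopy
-e ./vendor/adam_geography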

Use ebextensions to specify custom commands to run on all your EC2 instances, for example to download files or to run pip as @shavenwarthog suggested in his answer.
Create a directory called .ebextensions in your app source root directory. Inside this directory, create a file with a .config extension, e.g. 01-custom-files.config.
This file can contain custom Unix commands you want to run on each EC2 instance.
You can run your own scripts here.
You can also use container_commands which are executed after unzipping your app source on the EC2 instance.
Read more about commands and container_commands here. You can also find examples here:
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/customize-containers-ec2.html#customize-containers-format-commands
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/customize-containers-ec2.html#customize-containers-format-container_commands
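A minimal sketch of such a config file (the index URL and package name below are placeholders, not from the original answer):
# .ebextensions/01-custom-files.config
container_commands:
  01_install_internal_package:
    # runs on each instance after the app source has been unzipped
    command: "pip install --index-url https://pypi.internal.example.com/simple adam_geography"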

Related

How to configure pip to look for modules in a local directory first

I have a local pip server in my company that hosts a lot of modules, some of those are outdated.
Since I don't have the ability to add/modify modules in the server, I have created a local directory and downloaded the latest versions for some modules I need.
Now I want to modify my local pip configuration (pip.conf) so that pip first looks for modules in my directory and, if it doesn't find them there, then looks them up on the local server.
How can I achieve that? Is it possible to do so without hosting a server locally?
You should be able to cache files locally, which is also useful if you need to work offline. See https://pip.pypa.io/en/latest/user_guide/#installing-from-local-packages
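A minimal pip.conf sketch along those lines (paths and the server URL are placeholders); note that pip considers every configured source and picks the best matching version, rather than strictly preferring one location over another:
# ~/.config/pip/pip.conf
[global]
# local directory holding the newer packages you downloaded
find-links = /home/me/local-packages
# the company pip server stays configured as the index
index-url = http://pip.internal.example.com/simple
trusted-host = pip.internal.example.com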

Cannot install packages for Python Azure Function

I have a Python Azure Function which executes locally. It is deployed to Azure, and I selected the 'free app plan'. The function depends on various modules, such as requests. The modules are not loaded into the deployed app like they are locally on my machine, and the function fails when triggered.
I have tried installing the dependencies using the Kudu console from my site; this hangs with the message "cleaning up >>" every time.
I have tried installing the dependencies using the SSH terminal from my site; the installations succeed, but I cannot see the modules when I run pip list in Kudu, and the app still fails. I also cannot navigate the directories; ls does nothing.
I tried to install extensions using the portal, but this option is greyed out under Development Tools.
You can find a requirements.txt in your local function folder.
If you want the function on Azure to install requests, your requirements.txt should look like this (Azure installs the packages based on this file):
azure-functions
requests
All these packages are bundled into a single package on Azure, so you cannot see which packages are installed using pip list. Also keep in mind that Kudu on Linux is limited, and you cannot install packages through it.
The problem seems to come from VS Code; you can use the command line to deploy your function app instead.
For example, my function app on Azure is named 423PythonBowman2, so this is my command:
func azure functionapp publish 423PythonBowman --build remote
I referenced requests in the code, and after deploying from the command line my function works fine on the portal with no errors.
Have a look at the official doc:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=macos%2Ccsharp%2Cbash#publish
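For reference, a minimal function body that uses the requests dependency might look like this (a sketch, not the asker's code):
# __init__.py of the function
import azure.functions as func
import requests

def main(req: func.HttpRequest) -> func.HttpResponse:
    # requests is importable once the remote build has installed requirements.txt
    r = requests.get("https://example.com")
    return func.HttpResponse(f"Upstream status: {r.status_code}")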

Pip install package from private github repo in google cloud appengine

I am using Google Cloud App Engine and deploying with gcloud app deploy and a standard app.yaml file. My requirements.txt file has one private package that is fetched from GitHub (git+ssh://git@github.com/...git). This install works locally, but when I run the deploy I get
Host key verification failed.
fatal: Could not read from remote repository.
This suggests there is no SSH key available when installing. Reading the docs (https://cloud.google.com/appengine/docs/standard/python3/specifying-dependencies), it appears that this just isn't an option?
Dependencies are installed in a Cloud Build environment that does not provide access to SSH keys. Packages hosted on repositories that require SSH-based authentication must be copied into your project directory and uploaded alongside your project's code using the pip package manager.
To me this seems severely suboptimal: the whole point of factoring code out into a package was to avoid duplication across repos. Now, if I want to use App Engine, you're telling me this isn't possible?
Is there really no workaround?
See:
https://cloud.google.com/appengine/docs/standard/python3/specifying-dependencies#private_dependencies
The App Engine service does not (and should not) have access to your private repo.
One alternative (that you don't want) is to upload your public key to the App Engine service.
The other -- as documented -- is that you must provide the content of your private repo to the service as part of your upload.
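One way that upload can look in practice (a sketch; the repository and directory names are placeholders, and it's worth verifying against the linked docs that a relative path in requirements.txt is accepted by the build):
# run locally, where the SSH key works, before gcloud app deploy:
#   git clone git@github.com:myorg/private-pkg.git vendor/private-pkg
# then reference the uploaded copy in requirements.txt:
Flask
./vendor/private-pkg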
I'm going through the same issue, deploying on gcloud a Python project whose requirements.txt contains some private repositories. As @DazWilkin wrote already, there's no way to deploy it like you normally would.
One option would be to create a Docker image of the whole project and its dependencies, save it in the gcloud Docker registry, and then pull it into the App Engine instance.
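A rough Dockerfile sketch of that idea (the private package is copied into the build context beforehand rather than fetched over SSH; names and versions are placeholders):
# Dockerfile (sketch): build locally, push to the registry, point App Engine at the image
FROM python:3.9-slim
WORKDIR /app
# public dependencies (gunicorn is assumed to be listed here)
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# the private package, cloned into vendor/ before building
COPY vendor/private-pkg /tmp/private-pkg
RUN pip install --no-cache-dir /tmp/private-pkg
# application code
COPY . .
CMD ["gunicorn", "-b", ":8080", "main:app"]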

How do you pip install libraries to the 'antenv' venv on an Azure Web App?

I am trying to deploy a Flask app to an Azure Web App (Linux, python3.7 runtime) using FTP.
I copied the "application.py" over and a "requirements.txt", but I can see in the logs that nothing is being installed.
The Web App is using an 'antenv' virtual environment but it won't install anything. How do I add libraries to this 'antenv' virtual environment?
Yes, I see that you have resolved the issue. You must use Git to deploy Python apps to App Service on Linux so that the dependencies in your requirements.txt (in the root folder) are installed.
To install Django and any other dependencies, you must provide a requirements.txt file and deploy to App Service using Git.
The antenv folder is where App Service creates a virtual environment with your dependencies. If you expand this node, you can verify that the packages you named in requirements.txt are installed in antenv/lib/python3.7/site-packages. Refer this document for more details.
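As a rough illustration of the Git-based deployment described above (the app and resource-group names are placeholders):
# configure local Git deployment and get the remote URL for the app
az webapp deployment source config-local-git --name <app-name> --resource-group <resource-group>

# add the returned URL as a remote and push; App Service installs requirements.txt during the build
git remote add azure https://<deployment-user>@<app-name>.scm.azurewebsites.net/<app-name>.git
git push azure master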
Additionally, although the container can run Django and Flask apps automatically, provided the app matches an expected structure, you can also provide a custom startup command file through which you have full control over the Gunicorn command line. A custom startup command is typically required for Flask apps, but not for Django apps.
Turns out I had to run these commands and do a git push while my local venv was activated. At that point I saw Azure start downloading all the libraries in my requirements.txt.

How do I connect to an external Oracle database using the Python cx_Oracle package on Google App Engine Flex?

My Python App Engine Flex application needs to connect to an external Oracle database. Currently I'm using the cx_Oracle Python package which requires me to install the Oracle Instant Client.
I have successfully run this locally (on macOS) by following the Instant Client installation steps. The steps required me to do the following:
Make a directory called /opt/oracle
Create a symlink from /opt/oracle/instantclient_12_2/libclntsh.dylib.12.1 to ~/lib/
However, I am confused about how to do the same thing in App Engine Flex (instructions). Specifically, here's what I'm confused about:
The instructions say I should run sudo yum install libaio to install the libaio package. How do I do this on GAE Flex? Or is this package already available?
I think I can add the Instant Client files to GAE (a whopping ~100MB!), then set the LD_LIBRARY_PATH environment variable in app.yaml to export LD_LIBRARY_PATH=/opt/oracle/instantclient_12_2:$LD_LIBRARY_PATH. Will this work?
Is this even feasible without using custom Docker containers on App Engine Flex?
Overall I'm not sure if I'm on the right track. Would love to hear from someone who has managed this before :)
If any of your dependencies are not available in the base GAE flex images provided by Google and cannot be installed via pip (because it's not a Python package, it's not available on PyPI, or for whatever other reason), then you can't use the requirements.txt file to get it installed in your GAE flex app.
The proper way to satisfy such dependencies would be to build your own custom runtime. From About Custom Runtimes:
Custom runtimes allow you to define new runtime environments, which might include additional components like language interpreters or application servers.
Yes, that means providing a custom Docker file. In your particular case you'd be installing the Instant Client and libaio inside this Dockerfile. See also Building Custom Runtimes.
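A rough sketch of what such a Dockerfile could look like (the base image, Instant Client version, and paths are placeholders to be checked against Oracle's instructions):
# Dockerfile (sketch) for a custom App Engine flex runtime (app.yaml: runtime: custom, env: flex)
FROM gcr.io/google-appengine/python
# Debian equivalent of the "yum install libaio" step
RUN apt-get update && apt-get install -y libaio1 && rm -rf /var/lib/apt/lists/*
# Instant Client files downloaded into the build context beforehand (~100MB)
COPY instantclient_12_2/ /opt/oracle/instantclient_12_2/
ENV LD_LIBRARY_PATH=/opt/oracle/instantclient_12_2
COPY . /app
WORKDIR /app
RUN pip install -r requirements.txt
CMD ["gunicorn", "-b", ":8080", "main:app"]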
Answering your first question, I think that the instructions on the Oracle website just show that you have to install that library for your application to work.
In the case of App Engine flex, the way to ensure that libraries are present in the deployment is the requirements.txt file. There is a documentation page which explains how to do so.
On the other hand, I will assume that the "Instant Client files" are not libraries, but data your app needs in order to run. You should use Google Cloud Storage to serve them, or any other storage alternative within Google Cloud.
I believe that, if this is all what you need for your App to work, pushing your own custom container should not be necessary.
