I need to deploy a Flask app to Google App Engine.
I used Docker, and these lines are in my Dockerfile:
ADD requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt
In the requirements.txt file:
Flask==0.12
gunicorn==19.6.0
boto==2.46.1
gcs-oauth2-boto-plugin==1.8
ffmpeg-normalize
It is supposed to install all dependencies, but somehow "ffmpeg-normalize" is not installed on the Google App Engine instances.
Can anyone help me with that?
If there is a better way of doing the package installation, I will be happy to go with that as well. Thanks!
This could be happening for a few reasons. Here are my guesses :)
How do you know the package isn't being installed? Can you share the Docker build output that happens when you run gcloud app deploy?
Another thing to try here, just to be sure is to run:
gcloud app instances list
Then...
gcloud beta app instances ssh [instance] \
  --service [svc] \
  --version [v] \
  --container gaeapp
From there, you can ls around in the container and see exactly what got installed.
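For example, ffmpeg-normalize needs both the pip package and the ffmpeg binary itself, so once you're inside the container you could check for both with something like:
pip show ffmpeg-normalize
which ffmpeg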
I would guess that the pip package is getting installed, but maybe you just didn't install the native dependency you need for ffmpeg. Here's an example of how to do this with Docker + App Engine:
https://github.com/JustinBeckwith/next17/blob/master/videobooth/Dockerfile
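The rough shape of it, as a sketch only (the base image here is just a placeholder, and it assumes a Debian-based image where ffmpeg is available via apt-get):
# Base image is only an example; keep whatever you're already using
FROM python:3.6
# Install the native ffmpeg binary that ffmpeg-normalize calls out to
RUN apt-get update && \
    apt-get install -y --no-install-recommends ffmpeg && \
    rm -rf /var/lib/apt/lists/*
# Then install the Python dependencies as before
ADD requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt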
Since you're already using Docker, what happens when you build this container locally? Have you tried:
docker build -t myapp .
docker run -it -p 8080:8080 myapp
Hopefully one of these gives you a clue to what's happening. Hope this helps!
I am trying to deploy a Flask app to an Azure Web App (Linux, python3.7 runtime) using FTP.
I copied the "application.py" over and a "requirements.txt", but I can see in the logs that nothing is being installed.
The Web App is using an 'antenv' virtual environment but it won't install anything. How do I add libraries to this 'antenv' virtual environment?
Yes, I see that you have resolved the issue. You must use Git to deploy Python apps to App Service on Linux so that your dependencies in requirements.txt (in the root folder) are installed.
To install Django and any other dependencies, you must provide a requirements.txt file and deploy to App Service using Git.
The antenv folder is where App Service creates a virtual environment with your dependencies. If you expand this node, you can verify that the packages you named in requirements.txt are installed in antenv/lib/python3.7/site-packages. Refer to this document for more details.
Additionally, although the container can run Django and Flask apps automatically, provided the app matches an expected structure, you can also provide a custom startup command file through which you have full control over the Gunicorn command line. A custom startup command is typically required for Flask apps, but not for Django apps.
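For example, a custom startup command for a Flask app is usually just a Gunicorn invocation against your app module; assuming the Flask instance is named app inside application.py, it would look something like:
gunicorn --bind=0.0.0.0 --timeout 600 application:app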
It turns out I had to run these commands and do a git push while my local venv was activated. At that point I saw Azure start downloading all the libraries in my requirements.txt.
I've developed and tested a dash app. It works as expected. Next step is to deploy the app to AWS Elastic Beanstalk using a preconfigured Docker container.
I am currently trying to set up a local docker environment for testing as described here
Running the command (via PowerShell):
docker build -t dash-app -f Dockerfile .
successfully downloads the preconfigured image, then proceeds to install python modules as specified in requirements.txt, until it gets to the cryptography module, where it throws a runtime error saying it requires setuptools version 18.5 or newer.
My Dockerfile has this line in it:
FROM amazon/aws-eb-python:3.4.2-onbuild-3.5.1
I've tried adding a line to the Dockerfile to force-upgrade pip and setuptools within the container, as suggested here and here, but nothing seems to work.
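For reference, the kind of change I tried looks roughly like this (paraphrased, not my exact Dockerfile):
FROM amazon/aws-eb-python:3.4.2-onbuild-3.5.1
# attempt to get a newer setuptools before cryptography is built
RUN pip install --upgrade pip setuptools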
I've been doing some maintenance on the server side and then deployed the app to GAE. Since I deployed the new version, it doesn't recognize the flask_sqlalchemy module. It worked perfectly before the commit. Is there something that I'm missing?
OK, for anyone getting this error: apparently I forgot to copy the libraries to the lib folder. I did that by using the command:
pip install -t lib -r requirements.txt
I have a small python flask app on a CentOS-7 VM that runs in docker, along with an nginx reverse proxy. The requirements.txt pulls in several external utilities using git+ssh such as:
git+ssh://path-to-our-repo/some-utility.git
I had to make a change to the utility, so I cloned it locally, and I need the app to use my local version.
Say the cloned and modified utility is in a local directory:
/var/work/some-utility
In the requirements.txt I changed the entry to:
git+file:///var/work/some-utility
But when I try to run the app with
sudo docker-compose up
I get the error message
Invalid requirement: 'git+file:///var/work/some-utility'
it looks like a path. Does it exist ?
How can I get it to use my local copy of "some-utility" ?
I also tried:
git+file:///var/work/some-utility#egg=someutility
but that produced the same error.
I looked at PIP install from local git repository.
This is related to this question:
https://stackoverflow.com/questions/7225900/how-to-pip-install-packages-according-to-requirements-txt-from-a-local-directory?rq=1
I suppose most people would say why not just check in a development branch of some-utility to the corporate git repo, but in my case I do not have privileges for that.
Or maybe my problem is related to docker, and I need to map the some-utility folder into the docker container, and then use that path? I am a docker noob.
--- Edit ---
Thank you larsks for your answer. I tried to add the some-utility folder to the docker-compose.yml:
volumes:
- ./some-utility:/usr/local/some-utility
and then changed the requirements.txt to
git+file:///usr/local/some-utility
but our local git repo just went down for maintenance, so I will have to wait a bit for it to come back up to try this.
=== Edit 2 ===
After I made the above changes, I get the following error when running docker-compose when it tries to build my endpoint app:
Cloning file:///usr/local/some-utility to /tmp/pip-yj9xxtae-build
fatal: '/usr/local/some-utility' does not appear to be a git repository
But the /usr/local/some-utility folder does contain the cloned some-utility repo, and I can go there and run git status.
If you're running pip install inside a container, then of course /var/work/some-utility needs to be available inside the container.
You can expose the directory inside your container using a host volume mount, like this:
docker run -v /var/work/some-utility:/var/work/some-utility ...
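With docker-compose, the equivalent is a volumes entry on the service that runs your app; as a sketch (the service name web is just a placeholder), it would look something like:
version: "2"
services:
  web:
    build: .
    volumes:
      - /var/work/some-utility:/var/work/some-utility
Note that, like the docker run -v form, this makes the directory available when the container runs, not during the image build.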
I've built a web app using Django, deployed on OpenShift. I'm trying to add the third-party reusable app markdown-deux. I've followed the install instructions (used pip), and it works fine on the localhost development server.
I've added 'markdown_deux' to my settings.py and tried it with and without a requirements.txt. However, I still get a 500 error, and from rhc tail the error "Import error: no module named markdown_deux".
I've tried restarting my app and resyncing the db but I'm still getting the same errors. I've RTFM but to no avail.
OpenShift has mechanisms to automatically check and add dependencies after each git push, depending on your application type, so you don't need to install dependencies manually.
For Python applications, modify the project's setup.py.
Python application owners should modify setup.py in the root of the git repository with the list of dependencies that will be installed using easy_install. The setup.py should look something like this:
from setuptools import setup

setup(name='YourAppName',
      version='1.0',
      description='OpenShift App',
      author='Your Name',
      author_email='example@example.com',
      url='http://www.python.org/sigs/distutils-sig/',
      install_requires=['Django>=1.3', 'CloudMade'],
)
Read all the details at the OpenShift Help Center.
You've used pip to install it locally, but you actually need to install it on your server as well. Usually you would do that by adding it to the requirements.txt file and ensuring that your deployment process includes running pip install -r requirements.txt on the server.