How to freeze requirements that can't be satisfied locally?

I'm writing a Python app to deploy on Heroku. Per Heroku's guide, I need to list package requirements in a Pip requirements.txt file. The guide instructs me to install the packages locally, then run pip freeze > requirements.txt to write the frozen requirements file.
However, one of the packages I want to use in deployment on Heroku can't be installed locally. It's incompatible with my operating system.
So how do I write a requirements.txt, suitable for Heroku, that includes this package?
The only way I can think of is to write it by hand - but this would be tedious, because the package in question has many dependencies of its own. Besides, this defeats the point of the package manager.
When deploying Ruby apps to Heroku, Bundler makes this easy. In my Gemfile I write
gem "pg", :group => :production
gem "sqlite3", :group => :development
The command bundle install then writes a frozen version list Gemfile.lock (analogous to requirements.txt). It doesn't install the packages listed under the 'production' group, but it still freezes a consistent list of versioned packages.
Example: Gemfile and Gemfile.lock

You can have more than one file and call them different things, but Heroku does expect a requirements.txt. For instance, for dev, you could maintain a dev_requirements.txt.
Locally you can run:
$ pip freeze > dev_requirements.txt
etc, and
$ pip install -r dev_requirements.txt
and Heroku will run:
$ pip install -r requirements.txt
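One way to lay the two files out, sketched here with placeholder package names and versions (pip's -r directive lets one requirements file include another):

# dev_requirements.txt - generated locally with pip freeze
Flask==2.0.1
Werkzeug==2.0.1

# requirements.txt - for Heroku: the frozen local list plus the hand-pinned package
-r dev_requirements.txt
psycopg2==2.9.1    # can't be installed locally, so pinned by hand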

It's not possible. This has been reported to pip as an issue: https://github.com/pypa/pip/issues/747

Related

With pip, should I create requirements.txt from pip freeze, or by manually editing the requirements.txt file?

For context, I don't know much about Python, let alone idiomatic Python. I'm working on a brownfield project. Everything I say about docker may be irrelevant to the question I'm asking, but I can't tell: Our code runs in a Docker container. Instead of using virtual environments, we hardcode the Python version and run this in the Dockerfile:
ADD requirements.txt /
RUN pip install -r /requirements.txt \
&& rm -rf /requirements.txt
At the moment, we have two ways of adding requirements to the requirements.txt:
By running this command (using twilio as an example):
docker-compose run --rm django bash -c "pip install twilio && pip freeze > requirements.txt"
By going to pypi.org, finding a dependency's name and current version, and manually adding that line to the host's requirements.txt.
Both seem to work, but my gut tells me there are latent downsides to one/both of these. What are the pros and cons of each choice and which one is considered idiomatic? If neither of these are considered idiomatic, what's the right way to add to the requirements.txt?
I've been googling, but a lot of the results are questionable because they are really old. For example, pip 20.3 (released in 2020) added a new dependency resolver, and I don't know what ripples that had on best practices.
The requirements.txt file indicates all the dependencies that must be installed for your application to run correctly.
Running pip freeze will dump every library actually installed in the environment (development tools, leftovers from other projects, deprecated packages, etc.) into a freshly created requirements.txt file.
Adding the dependencies manually is a more controlled way to list them.
I recommend adding dependencies to the file manually while building the project. If you find some are unnecessary, remove them. With pip freeze, stray secondary libraries may remain in the list.
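Concretely, the difference looks something like this (package names and versions are illustrative):

# Hand-maintained: only the packages your code imports directly
Django==3.2.5
twilio==6.62.1

# pip freeze output: the same two packages plus everything they pulled in
Django==3.2.5
PyJWT==2.1.0
asgiref==3.4.1
pytz==2021.1
requests==2.26.0
sqlparse==0.4.1
twilio==6.62.1
# ...plus anything else that happened to be installed in the environment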

Forcing Flask to only use libraries contained in requirements.txt?

I have a Flask app currently running on Google App Engine. Locally, the app is running inside a virtual environment and references the appropriate libraries installed in the venv/Lib/site-packages directory.
When the app is updated in GAE, requirements.txt is used to determine what libraries/dependencies need to be installed. I frequently get tedious errors like "Module not found" and have to remember to add said module in my requirements.txt, and then have to redeploy and check the error logs, which takes time.
I have a bunch of dependencies installed in my virtual environment, but only some of them need to be referenced in my requirements.txt file, since I only use a few in my Flask app. So, I am trying to figure out a way to test my app locally as if it were running on GAE, by forcing Flask to reference only those dependencies in my requirements.txt file. That way, if there is a "Module not found" error, I can catch it quickly on my own machine instead of repeating gcloud app deploy and scouring the logs all over again.
Hopefully that wasn't too convoluted, lol.
To be clear, not everything installed in your virtual env needs to be declared in your requirements.txt file. Some libraries are installed because they are dependencies of others. For example, just listing Flask will lead to Jinja also being installed.
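You can check this yourself with pip show, which lists what a package depends on (output abridged, version illustrative):

$ pip show Flask
Name: Flask
Version: 2.0.1
Requires: Werkzeug, Jinja2, itsdangerous, click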
To your specific issue, you're basically saying you did not narrow down the actual libraries you need for your project. This is usually due to copying over installed libraries from another project.
You can use pip3 freeze > requirements.txt or pip2 freeze > requirements.txt to auto-generate your requirements.txt file. The problem with this method is that it will include everything installed in your virtual env and it seems you don't want this.
Some suggest using pipreqs (see this Stackoverflow answer).
I normally do it the manual way: remove the existing venv, create a requirements.txt with just the basics (Python, Flask/Django), run your program, manually add each library it complains about to the requirements.txt file, and reinstall from the file. Rinse and repeat until you no longer get errors. Now you have your full requirements.
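A sketch of that loop (main.py stands in for whatever your entry point is):

$ rm -rf venv                        # throw away the old environment
$ python3 -m venv venv               # start from a clean one
$ source venv/bin/activate
$ pip install -r requirements.txt    # install only what is declared
$ python main.py                     # run until it fails with ModuleNotFoundError,
                                     # then add the missing package and repeat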
1. Install venv (for Python 3):
sudo apt install python3-venv
2. Create a virtual environment for your Flask server. Choose a directory and run:
python3 -m venv venv
3. Activate the venv:
source venv/bin/activate
4. In the Flask server's directory, run:
pip3 freeze > requirements.txt
5. Finally, run:
pip3 install -r requirements.txt
6. You can use this venv for any Flask server, and you can update it.

Upgrading a project from buildout to pip?

I am trying to upgrade a Pyramid project to Python 3. I am also exploring various options for making the build system more modern.
Currently we are using Python's buildout to setup an instance. The approach is simple:
All the required eggs (including app - which is my package) are stored in a buildout configuration file with their exact versions specified.
Run buildout to get the exact eggs (including third party stuff) from a local package server.
Run the instance using ./bin/pserve config.ini.
For my new app source code that has Python 3 changes, I am trying to get rid of everything and just use pip instead. This is what I have done now (in a docker container):
git clone git@github.com:org/app.git
cd app
# Our internal components are fetched using the `git` directive inside `requirements.txt`.
pip install -r requirements.txt # Mostly from PyPi.
pip install .
It works, but is this the correct way to deploy an application?
Will I be able to convert the entire installation to a simple: pip install app and run it using pserve config.ini if I do the following:
Upload the latest app egg to my package server.
Sync setup.py and requirements.txt so that pip install app effectively does a pip install -r requirements.txt (or its equivalent) internally?
pip install app.
Copy config.ini to the machine where I am going to install.
Run pserve config.ini
I wanted to know if the above approach can be made to work before proceeding with the egg creation, mocking a simple package server etc. I am not sure if I can really do pip install for a web application; and I think requirements.txt has some significance in this case.
I haven't explored wheels yet, but if the above works, I will try that as well.
Since I am really new to packaging, I would appreciate some suggestions on modernizing my build using the latest tools.
After reading some of the links, like requirements.txt vs setup.py, I think requirements.txt is needed for web apps, especially if you want consistent behaviour for deployment purposes. A project or an application seems to be different from a library, where a plain pip install suffices.
If that is the case, I think the ideal way is to do pip install -r requirements.txt and then pip install app from a local package server without git cloning?
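A sketch of that split, with placeholder names, versions, and URLs. In setup.py, keep the dependencies abstract:

install_requires=["pyramid", "sqlalchemy"],

In requirements.txt, pin the concrete deployment and point at the local package server:

--index-url https://pypi.internal.example.com/simple
pyramid==1.10.8
sqlalchemy==1.3.24
app==2.0.0    # your own package, uploaded to that server

Deployment then reduces to pip install -r requirements.txt followed by pserve config.ini.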
Resources: install_requires vs requirements files

Python - how to let users run script with libraries

I plan on sharing a Python program on GitHub.
However it uses additional libraries like http, Selenium, BeautifulSoup and the Google Calendar API.
How do I include these libraries in the directory I push to GitHub, so that all users have to do is run python script.py, instead of having to install the libraries?
I thought of generating an executable with pyinstaller but that didn't work :/
You use a pip requirements.txt file for this.
If you have done your work inside of a virtual environment then run on your command line/terminal:
pip freeze > requirements.txt
Then commit and push the file to your github repository.
If you have not done your script within a virtual environment then run:
pip freeze > requirements.txt
And edit the file so you only have the modules you require.
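For the libraries named in the question, the pruned file might look something like this (version pins are illustrative; http is in the standard library and needs no entry):

selenium==3.141.0
beautifulsoup4==4.9.3
google-api-python-client==2.15.0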
I'd recommend always using a virtual environment for this, since it makes your application easy to share. Employing virtualenvs is very common in the Django world.
After cloning your GitHub repo, your collaborators can install your dependencies using:
pip install -r requirements.txt
Usually you don't need to embed your dependencies in your project (not practical, especially when there are many). Instead, include a requirements.txt inside your project to list the modules (and version numbers) your application requires. Then when users need to run your script, they can run something like this:
pip install -r requirements.txt
read more about requirements files here:
https://pip.readthedocs.org/en/1.1/requirements.html#requirements-files

Does Django have an equivalent of Rails's "bundle install"?

One thing I like about Rails projects is that when deploying to a remote server, if everything is set up correctly you can just do:
$ bundle install
And the system will install the various dependencies (ruby gems) needed to run the project.
Is there something similar for Python/Django?
You can freeze requirements. This generates a list of all the Python modules that your project needs. I believe bundle is similar in concept.
For example:
virtualenv --no-site-packages myproject_env # create a blank Python virtual environment
source myproject_env/bin/activate # activate it
(myproject_env)$ pip install django # install django into the virtual environment
(myproject_env)$ pip install other_package # etc.
...
(myproject_env)$ pip freeze > requirements.txt
The last line generates a text file with all the packages that were installed in your custom environment. You can use that file to install the same requirements on other servers:
pip install -r requirements.txt
Of course you don't need to use pip freeze; you can create the requirements file by hand. It doesn't have any special syntax requirements: just a package name and (optionally) a version specifier on each line. Here is a sample from a typical Django project with some extra packages:
Django==1.4
South==0.7.4
Werkzeug==0.8.3
amqplib==1.0.2
anyjson==0.3.1
celery==2.5.1
django-celery==2.5.1
django-debug-toolbar==0.9.4
django-extensions==0.8
django-guardian==1.0.4
django-picklefield==0.2.0
kombu==2.1.4
psycopg2==2.4.5
python-dateutil==2.1
six==1.1.0
wsgiref==0.1.2
xlwt==0.7.3
The closest is probably virtualenv, pip and a requirements file.
With those three ingredients it is quite easy to write a simple bootstrap script.
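A minimal bootstrap script from those three ingredients might look like this (the environment name is a placeholder):

#!/bin/bash
# bootstrap.sh - create a fresh environment and install the pinned dependencies
set -e
virtualenv --no-site-packages myproject_env
source myproject_env/bin/activate
pip install -r requirements.txt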
More demanding and complex is buildout. But I would only go for it if virtualenv and pip are not sufficient.
And if you extend this approach with Fabric and optionally cuisine, you have your project deployment automated. Check out these links for more information:
http://www.caktusgroup.com/blog/2010/04/22/basic-django-deployment-with-virtualenv-fabric-pip-and-rsync/
http://www.saltycrane.com/blog/2009/05/notes-using-pip-and-virtualenv-django/
