I would like to understand the current state of Python build systems and requirements management.
Imagine that I checked out the sources of some project that uses Poetry (or Pipenv), and this project has a pyproject.toml file with the build system specified. Of course, I could look into pyproject.toml, see that this one uses Poetry, install Poetry and run poetry install, but I would like to avoid that.
Question: Is there a build-system-agnostic way to build Python project?
By "build" I mean install all the necessary requirements for the project to be run in-place.
With requirements.txt I would achieve that by running pip install -r requirements.txt.
To install the project (and its dependencies), recent versions of pip are perfectly capable of doing this:
path/to/python -m pip install path/to/project
or
path/to/python -m pip install --editable path/to/project
To build distributions of the project, currently build is the only build back-end agnostic tool I know of:
python -m build
For the content of the pyproject.toml itself, see:
https://stackoverflow.com/a/64151860
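For illustration, a Poetry-managed project's pyproject.toml typically declares its build backend like this (the project metadata and dependency below are just placeholders); pip and build only read the [build-system] table to decide which backend to invoke:
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

[tool.poetry]
name = "example-project"        # placeholder name
version = "0.1.0"
description = "Placeholder metadata"
authors = ["Example Author <author@example.com>"]

[tool.poetry.dependencies]
python = "^3.9"
requests = "^2.28"              # an example runtime dependency
Because pip builds in an isolated environment and installs the backend listed under requires by itself, pip install path/to/project works without you installing Poetry manually.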
Related
I've got a simple project that I want to be pip-installable, and where the 'development time' dependencies are the same as the 'use time' dependencies.
To avoid duplication between setup.py and requirements.txt, I'm using the 'trick' of having my requirements.txt simply contain . (to pick up dependencies from setup.py).
When working on the project, we can then clone it and pip install -r requirements.txt.
The problem I'm having with this is that this also seems to end up installing the actual project itself (presumably due to going through setup.py?)
Is there an option I can use with pip install to prevent this? Or some other way to work around it without ending up installing the project itself?
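For reference, the setup described above boils down to a one-line requirements file; installing from it pulls in the dependencies declared in setup.py (its install_requires list), but it also builds and installs the project itself:
# requirements.txt
.

pip install -r requirements.txt   # installs the dependencies, plus the project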
In NodeJS's npm you can create a package.json file to track your project dependencies. When you want to install them you just run npm install and it looks at your package file and installs them all with that single command.
When distributing my code, does python have an equivalent concept or do I need to tell people in my README to install each dependency like so:
pip install package1
pip install package2
Before they can use my code?
Once all the necessary packages are added,
pip freeze > requirements.txt
creates a requirements file.
pip install -r requirements.txt
installs those packages again, say during production.
The best way may be pipenv; I personally use it.
However, in this guide I'll explain how to do it with just Python and pip, without pipenv. That's the first part, and it will give us a good understanding of how pipenv works. There is a second part that covers pipenv. Check the section pipenv (the closest to npm).
Python and pip
To get this working well with plain Python and pip, here are the main elements:
virtual environment
requirements file (listing of packages)
pip freeze command
How to install packages from a requirements file
Virtual environment and why
Note that the venv module is what should be used for this. It is the official tool, and it ships with the Python 3 installation starting from 3.3+.
To understand what it is and why you need it, check this out:
https://docs.python.org/3/tutorial/venv.html
In short, a virtual environment gives us an isolated Python interpreter, and therefore an isolated set of installed packages. That way, different projects do not have to depend on the same package installations and conflict with each other. The link above explains and shows it well:
... This means it may not be possible for one Python installation to meet the requirements of every application. If application A needs version 1.0 of a particular module but application B needs version 2.0, then the requirements are in conflict and installing either version 1.0 or 2.0 will leave one application unable to run.
You may also like to check the explanation in the Flask framework docs:
https://flask.palletsprojects.com/en/1.1.x/installation/#virtual-environments
Why we care about this and should use it: to isolate the projects (each gets its own environment), so that the freeze command works on a per-project basis. Check the last section.
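A tiny illustration of that conflict (the package name somelib is hypothetical): with one virtual environment per project, each project can pin its own version without breaking the other.
# projectA/requirements.txt, installed into projectA's own env
somelib==1.0

# projectB/requirements.txt, installed into projectB's own env
somelib==2.0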
Usage
Here is a good guide on how to set things up and work with them:
https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/
Check the installation section first.
Then
To create a virtual environment you go to your project directory and run:
On macOS and Linux:
> python3 -m venv env
On Windows:
> py -m venv env
Note You should exclude your virtual environment directory from your version control system using .gitignore or similar.
To start using the environment in the console, you have to activate it
https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#activating-a-virtual-environment
On macOS and Linux:
> source env/bin/activate
On Windows:
> .\env\Scripts\activate
See the part on how to check that you are in the environment (using which on Linux/Unix or where on Windows).
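For example, on Linux/macOS the check might look like this (the path is illustrative):
(env) $ which python
/home/me/myproject/env/bin/python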
To deactivate you use
> deactivate
Requirements files
https://pip.pypa.io/en/latest/user_guide/#requirements-files
“Requirements files” are files containing a list of dependencies to be installed using pip install, like so
(how to install from a requirements file):
pip install -r requirements.txt
Requirements files are used to hold the result from pip freeze for the purpose of achieving repeatable installations. In this case, your requirement file contains a pinned version of everything that was installed when pip freeze was run.
python -m pip freeze > requirements.txt
python -m pip install -r requirements.txt
Some of the syntax:
pkg1
pkg2==2.1.0
pkg3>=1.0,<=2.0
== pins an exact version:
requests==2.18.4
google-auth==1.1.0
Force pip to accept earlier versions
ProjectA
ProjectB<1.3
Using git with a tag (for example, when you fix a bug yourself instead of waiting for a release):
git+https://myvcs.com/some_dependency@sometag#egg=SomeDependency
Again, check the link https://pip.pypa.io/en/latest/user_guide/#requirements-files
(I picked all the examples from there; it has the full explanations and details).
For the format details check: https://pip.pypa.io/en/latest/cli/pip_install/#requirements-file-format
Freeze command
Pip can export a list of all installed packages and their versions using the freeze command. When the command is run, it lists every package installed in the current environment:
pip freeze
Which will output something like:
cachetools==2.0.1
certifi==2017.7.27.1
chardet==3.0.4
google-auth==1.1.1
idna==2.6
pyasn1==0.3.6
pyasn1-modules==0.1.4
requests==2.18.4
rsa==3.4.2
six==1.11.0
urllib3==1.22
We can write that to a requirements file like so:
pip freeze > requirements.txt
https://pip.pypa.io/en/latest/cli/pip_freeze/#pip-freeze
Installing packages: summary
By using venv (a virtual environment) for each project, the projects are isolated, and the freeze command lists only the packages installed in that particular environment, which makes it work on a per-project basis. The freeze command captures the installed packages at the moment it runs, with their exact versions. We generate a requirements file from it (requirements.txt), which we can add to the project repo so the dependencies can be installed from it.
The whole can be done in this sense:
Linux/unix
python3 -m venv env
source env/bin/activate
pip3 install -r requirements.txt
Windows
py -m venv env
.\env\Scripts\activate
pip3 install -r requirements.txt
That's the first-time setup after cloning a repo: create the new env, activate it, then install the needed packages into it.
Otherwise, here is a complete guide about installing packages using requirements files and virtual environments, from the official docs: https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/
This second guide covers it all well too: https://docs.python.org/3/tutorial/venv.html
Links listing (already listed):
https://pip.pypa.io/en/latest/user_guide/#requirements-files
https://pip.pypa.io/en/latest/cli/pip_install/#requirements-file-format
https://pip.pypa.io/en/latest/cli/pip_freeze/#pip-freeze
pipenv (the closest to npm)
https://pipenv.pypa.io/en/latest/
pipenv is a tool that tries to be like npm for Python. It is a superset of pip.
pipenv creates the virtual environment for us and manages the dependencies.
Another good feature is the ability to write package.json-like files, with a scripts section in them too:
executing Pipfile scripts
running Python commands with an alias on the command line, like npm
Installation
https://pipenv.pypa.io/en/latest/install/
virtualenv-mapping-caveat
https://pipenv.pypa.io/en/latest/install/#virtualenv-mapping-caveat
For me, having the env created within the project (just like node_modules) should even be the default. Make sure to enable it by setting the environment variable (PIPENV_VENV_IN_PROJECT, covered below).
pipenv can simply seem more convenient.
Mainly, managing run scripts is too good to miss out on, and it is one tool that simplifies it all.
Basic usage and comparison to npm
https://pipenv.pypa.io/en/latest/basics/
(make sure to check the guide above to get familiar with the basics)
Note that the equivalent of npm's package.json is the Pipfile.
An example:
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
flask = "*"
simplejson = "*"
python-dotenv = "*"
[dev-packages]
watchdog = "*"
[scripts]
start = "python -m flask run"
[requires]
python_version = "3.9"
There is a Pipfile.lock, like package-lock.json.
To run the npm install equivalent, you run pipenv install.
To install a new package:
pipenv install <package>
This will create a Pipfile if one doesn’t exist. If one does exist, it will automatically be edited with the new package you provided.
Just like with npm!
$ pipenv install "requests>=1.4" # will install a version equal or larger than 1.4.0
$ pipenv install "requests<=2.13" # will install a version equal or lower than 2.13.0
$ pipenv install "requests>2.19" # will install 2.19.1 but not 2.19.0
If the PIPENV_VENV_IN_PROJECT=1 environment variable is set, pipenv creates the virtual environment within the project, in a directory named .venv (the equivalent of node_modules).
Also, running pipenv install in a directory with neither a Pipfile nor a virtual environment will create the virtual environment in the .venv directory (the node_modules equivalent) and generate a Pipfile and a Pipfile.lock.
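A minimal sketch of that, assuming a Linux/macOS shell (on Windows you would set the variable with set or $env: instead):
export PIPENV_VENV_IN_PROJECT=1   # keep the virtual environment inside the project
pipenv install                    # creates ./.venv plus Pipfile and Pipfile.lock if missing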
Installing flask example:
pipenv install flask
Installing as a dev dependency:
pipenv install watchdog -d
or
pipenv install watchdog --dev
just like with npm!
pipenv all commands (pipenv -h)
Commands:
check Checks for PyUp Safety security vulnerabilities and against PEP
508 markers provided in Pipfile.
clean Uninstalls all packages not specified in Pipfile.lock.
graph Displays currently-installed dependency graph information.
install Installs provided packages and adds them to Pipfile, or (if no
packages are given), installs all packages from Pipfile.
lock Generates Pipfile.lock.
open View a given module in your editor.
run Spawns a command installed into the virtualenv.
scripts Lists scripts in current environment config.
shell Spawns a shell within the virtualenv.
sync Installs all packages specified in Pipfile.lock.
uninstall Uninstalls a provided package and removes it from Pipfile.
update Runs lock, then sync.
Command help
pipenv install -h
importing from requirements.txt
https://pipenv.pypa.io/en/latest/basics/#importing-from-requirements-txt
environment management with pipenv
https://pipenv.pypa.io/en/latest/basics/#environment-management-with-pipenv
pipenv run
To run anything with the project's virtual environment you need to use pipenv run,
as in pipenv run python server.py.
Custom scripts shortcuts
(like scripts in npm)
https://pipenv.pypa.io/en/latest/advanced/#custom-script-shortcuts
[scripts]
start = "python -m flask run"
And to run
pipenv run start
Just like with npm!
If you’d like a requirements.txt output of the lockfile, run $ pipenv lock -r. This will include all hashes, however (which is great!). To get a requirements.txt without hashes, use $ pipenv run pip freeze.
Also worth mentioning: the pipenv CLI rendering is well done.
Make sure to read the basics guide, and you will see how rich pipenv is.
Yes, it's called the requirements file:
https://pip.pypa.io/en/stable/cli/pip_install/#requirement-specifiers
You can specify the package name & version number.
You can also specify a git url or a local path.
In the usual case, you would specify the package followed by the version number, e.g.
sqlalchemy==1.0.1
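To also illustrate the git and local-path forms, a requirements.txt mixing them could look like this (the repository URL and paths are hypothetical):
sqlalchemy==1.0.1
# straight from a git repository, pinned to a tag
git+https://github.com/example/somepackage.git@v1.2.3#egg=somepackage
# a package living in a local directory next to this file
./vendor/otherpackage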
You can install all the packages specified in a requirements.txt file through the command
pip install -r requirements.txt
Once all the packages have been installed, run
pip freeze > requirements.txt
This will save the package details in the file requirements.txt.
For installation, run
pip install -r requirements.txt
to install the packages specified by requirements.txt.
I would like to propose pipenv here. Managing packages with Pipenv is easier, as it manages the list and the versions of packages for you; with plain pip you have to run the pip freeze command each time you make changes to your packages.
It will need a Pipfile. This file will contain all of your required packages and their versions, just like package.json.
You can delete/update/add packages using pipenv install/uninstall/update <package>
It also generates a lock file (Pipfile.lock) with the resolved dependency tree for your project, just like package-lock.json.
Check out this post on Pipfiles
Learn more about Pipenv
I've been looking around for a package manager that can be used with Python. I want to list project dependencies in a file.
For example, Ruby uses a Gemfile, where you can run bundle install.
How can I achieve this in Python?
The pip tool has become the standard, the equivalent of Ruby's gems.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
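Put together, that combination looks roughly like this (the directory name env is just a convention):
virtualenv env                    # per-project environment
source env/bin/activate           # use it for the commands below
pip install -r requirements.txt   # reproduce the project's dependencies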
Pipenv
(I know it's an old question and it already has an answer, but this is for anyone coming here looking for a different answer, like me.)
I've found a very good equivalent of npm: it's called pipenv. It handles both virtualenv and pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
then you can make a new virtualenv with Python 3, as well as a Pipfile that will be filled with your project's requirements and other stuff:
pipenv install --three
using your created virtualenv:
pipenv shell
installing a new python package:
pipenv install requests
running your .py file looks like this:
pipenv run python somefile.py
you can find its docs here.
Python uses pip as its package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
This is how I restrict pip's scope to the current project. It feels like the opposite if you're coming from NodeJS's npm or PHP's composer where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up, just cd to your project's root and:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need -l or --local if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
This is an old question but things are constantly evolving.
Further to the other answer about pipenv: there is also a Python package manager called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also maps the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.
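Since poetry is only mentioned and not shown in this thread, here is a minimal sketch of its workflow (app.py is just an example file name):
poetry init                 # interactively create a pyproject.toml
poetry add requests         # add a dependency and update poetry.lock
poetry install              # install everything declared in pyproject.toml
poetry run python app.py    # run a command inside the managed environment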
One thing I like about Rails projects is that when deploying to a remote server, if everything is set up correctly you can just do:
$: bundle install
And the system will install the various dependencies (ruby gems) needed to run the project.
Is there something similar for Python/Django?
You can freeze requirements. This generates a list of all the Python modules that your project needs. I believe bundle is similar in concept.
For example:
virtualenv --no-site-packages myproject_env # create a blank Python virtual environment
source myproject_env/bin/activate # activate it
(myproject_env)$ pip install django # install django into the virtual environment
(myproject_env)$ pip install other_package # etc.
...
(myproject_env)$ pip freeze > requirements.txt
The last line generates a text file with all the packages that were installed in your custom environment. You can use that file to install the same requirements on other servers:
pip install -r requirements.txt
Of course you don't have to use pip freeze to create the file; you can write the requirements file by hand. It doesn't have any special syntax requirements: just a package name and (optionally) a version specifier on each line. Here is a sample from a typical Django project with some extra packages:
Django==1.4
South==0.7.4
Werkzeug==0.8.3
amqplib==1.0.2
anyjson==0.3.1
celery==2.5.1
django-celery==2.5.1
django-debug-toolbar==0.9.4
django-extensions==0.8
django-guardian==1.0.4
django-picklefield==0.2.0
kombu==2.1.4
psycopg2==2.4.5
python-dateutil==2.1
six==1.1.0
wsgiref==0.1.2
xlwt==0.7.3
The closest is probably virtualenv, pip and a requirements file.
With those 3 ingredients it is quite easy to write a simple bootstrap script (see the sketch below).
More demanding and complex is buildout. But I would only go for it if virtualenv and pip are not sufficient.
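A bootstrap script in that spirit can be as small as this (file and directory names are illustrative):
#!/bin/sh
# create the project environment and install the pinned requirements into it
virtualenv env
./env/bin/pip install -r requirements.txt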
And if you extend this approach with Fabric and, optionally, cuisine, you already have your project deployment automated. Check out these links for more information:
http://www.caktusgroup.com/blog/2010/04/22/basic-django-deployment-with-virtualenv-fabric-pip-and-rsync/
http://www.saltycrane.com/blog/2009/05/notes-using-pip-and-virtualenv-django/
I had one machine with my commonly used Python packages installed,
and I would like to install the same packages on another machine, or on the same machine with a different Python version. I would like to know whether pip, easy_install, or some other method can let me install those packages in a batch. When I use Perl, it has something like a bundle package; how do I do that in Python?
Pip has some great features for this.
It lets you save all requirements from an environment in a file using pip freeze > reqs.txt
You can then later do pip install -r reqs.txt and you'll get the exact same environment.
You can also bundle several libraries into a .pybundle file with the command pip bundle MyApp.pybundle -r reqs.txt, and later install it with pip install MyApp.pybundle.
I guess that's what you're looking for :)
I keep a requirements.txt file in one of my repositories that has all my basic Python requirements, and I use pip to install them on any new machine.
Each of my projects also has its own requirements.txt file that contains all of its dependencies, for use with virtualenv.