I have one machine with my commonly used Python packages installed,
and I would like to install the same packages on another machine (or on the same machine under a different Python version). Can pip, easy_install, or some other method install those packages in a batch? When I use Perl, it has something like a bundle package; how do I do that in Python?
Pip has some great features for this.
It lets you save all requirements from an environment to a file using pip freeze > reqs.txt
You can then later run pip install -r reqs.txt and you'll get the exact same environment.
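For example, a minimal round trip between two machines might look like this (assuming a compatible Python version on both; the env folder name is arbitrary):

# Machine A: capture the installed packages with exact versions.
pip freeze > reqs.txt

# Machine B: recreate the same set, ideally inside a fresh virtualenv.
python -m venv env
source env/bin/activate   # on Windows: env\Scripts\activate
pip install -r reqs.txt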
You can also bundle several libraries into a .pybundle file with the command pip bundle MyApp.pybundle -r reqs.txt, and later install it with pip install MyApp.pybundle (note that the bundle command was removed in pip 1.5, so this only works with older versions of pip).
I guess that's what you're looking for :)
I keep a requirements.txt file in one of my repositories that has all my basic Python requirements and use pip to install them on any new machine.
Each of my projects also has its own requirements.txt file that contains all of its dependencies for use with virtualenv.
In my requirements.txt I have packages defined in the following manner:
Django ~= 2.2.0
It means that when I run pip install -r requirements.txt, pip will find the latest available 2.2.x version and install it along with all its dependencies.
What I need is a requirements-formatted list of all the packages, with explicit versions, that would be installed, but without actually installing any packages. So example output would be something like:
Django==2.2.23
package1==0.2.1
package2==1.4.3
...
So, in other words, I'm looking for something like the output of pip freeze, but without installing anything.
pip-compile is what you need!
Docs: https://github.com/jazzband/pip-tools
python -m pip install pip-tools
pip-compile requirements.txt --output-file requirements-all.txt
The pip-compile command lets you compile a pinned requirements file from your dependencies; this way you can pip install your dependencies and always get the same environment.
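As a rough sketch, using the loose requirement from the question as input (the pinned versions below are illustrative, not authoritative):

# requirements.txt (input, loosely pinned)
Django ~= 2.2.0

pip-compile requirements.txt --output-file requirements-all.txt

# requirements-all.txt (output, fully pinned), roughly:
django==2.2.23
    # via -r requirements.txt
sqlparse==0.4.1
    # via django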
TL;DR
Try pipdeptree or pip-tree.
Explanation
pip, contrary to most package managers, doesn't have a big dependency graph to look up. What it does instead is let arbitrary setup code be executed, which automatically pulls in the dependencies. This means that, for example, a package could manage its dependencies in a way other than listing them in requirements.txt (see fastai for an example of a project that handles its dependencies differently).
So there is, theoretically, no way to see all the dependencies other than to actually run an install in an isolated environment, see what was pulled in, and then delete the environment (because it could be the very same code that performs the installation that brings in the dependencies). You could actually do that with venv.
In practice, tools like pipdeptree or pip-tree fetch the dependencies based on some standardization of the requirements (most packages separate their dependencies from the installation logic and actually let pip handle both).
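A rough sketch of that isolated-install approach, assuming a Unix shell (the path and package are just examples):

# Install into a throwaway venv, record what was pulled in, then discard it.
python3 -m venv /tmp/probe-env
/tmp/probe-env/bin/pip install "Django~=2.2.0"
/tmp/probe-env/bin/pip freeze > resolved-requirements.txt
rm -rf /tmp/probe-env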
I am learning how to use venv here: https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#installing-from-source
And it says I can install a source package by:
python3 -m pip install .
Which works, but now if I do pip freeze then I see:
my-package # file:///Users/joesmith/my-package
The problem is that if I export this to a requirements.txt and try to install the environment on another machine, it won't work, because the path to the source has obviously changed.
What is the proper way to use a source package locally like I did, but also export it afterwards so that another person can recreate the environment and run the code on another machine?
Pip has support for version control systems like git. You can upload your code to a git host (e.g. GitHub, GitLab, ...) and then reference it in your requirements.txt like this:
git+http://git.example.com/MyProject#egg=MyProject
https://pip.pypa.io/en/stable/cli/pip_install/#vcs-support
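You can also pin the requirement to a specific tag or commit for reproducibility; a hypothetical example (the URL and tag are placeholders for your own repository):

git+https://github.com/joesmith/my-package@v1.0.0#egg=my-package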
Alternatively, you would install the package from PyPI rather than from source,
i.e. pip install requests.
That way, other developers can also easily run your project.
I am currently working on a Python project, and I want users to automatically have access to the dependencies that I used. Do they have to download them (e.g. via pip install) manually? If so, is there an easy way to have them install all the necessary packages at once?
virtualenv is what you need. Install the packages your project needs while developing it. After coding, you can run pip freeze > requirements.txt to save all the packages to requirements.txt, and pip install -r requirements.txt will then install them all automatically.
Furthermore, Docker is even better for releasing projects to other machines.
You need to create a virtual environment (see the link for how to). Then you can use pip freeze > requirements.txt to store your environment's dependencies in a file, and your user can simply run pip install -r requirements.txt to install them in one go.
See the documentation for more details.
I'm trying to make a script to install python packages offline. Here's what I've done:
Uninstall Python and delete old Python 2.7 folder on C:
Install clean copy of Python 2.7.16
Install all required packages via pip install (with internet connection) like normal
In command prompt, navigate to the folder I am going to put my wheels into and do pip freeze > requirements.txt
Do pip download -r requirements.txt
Uninstall python, delete Python 2.7 again, install python, test script
My script is literally: python -m pip install --no-index --find-links . -r requirements.txt
It installs packages normally until it hits cffi. For some reason, it does not recognize this package even though it is in the folder. In the folder, its name is cffi-1.12.2-cp27-cp27m-win32.whl. In the requirements file, it's listed as cffi==1.12.2. They're the same version, so I'm not quite sure what is causing this, and I couldn't really find any questions with a similar problem where it was just one package and the versions and procedure were similar. If someone could steer me in the right direction, that would be great. Thank you! All I can think of is to install it explicitly at the beginning of the script, but I really don't want to do it that way if I can avoid it (in case I run into the same issue with other packages later).
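For reference, the full offline workflow I'm using is (run from the folder holding the wheels):

pip freeze > requirements.txt                  # on the machine with internet
pip download -r requirements.txt -d .          # download wheels into this folder
# later, on the offline machine:
python -m pip install --no-index --find-links . -r requirements.txt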
I've been looking around for a package manager that can be used with python. I want to list project dependencies in a file.
For example, Ruby uses a Gemfile, where you can run bundle install.
How can I achieve this in Python?
The pip tool is becoming the standard equivalent of Ruby's gems.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
Pipenv
(I know it's an old question, and it already has an answer but for anyone coming here looking for a different answer like me.)
I've found a very good equivalent of npm: it's called pipenv. It handles both virtualenv and pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
then you can make a new virtualenv with the third version of Python, as well as a Pipfile that will be filled with your project's requirements and other stuff:
pipenv install --three
using your created virtualenv:
pipenv shell
installing a new python package:
pipenv install requests
running your .py file is like:
pipenv run python somefile.py
you can find its docs here.
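For completeness, another developer recreating the environment from a committed Pipfile and Pipfile.lock would do roughly:

pipenv install     # rebuilds the environment from Pipfile.lock
pipenv shell       # then works inside the virtualenv as above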
Python uses pip as its package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
This is how I restrict pip's scope to the current project. It feels like the opposite if you're coming from Node.js's npm or PHP's composer, where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up: just cd to your project's root and run:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need -l or --local if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
This is an old question but things are constantly evolving.
Further to the other answer about pipenv, there is also a Python package manager called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also maps the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.
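For reference, a minimal poetry workflow looks roughly like this (the project and script names are placeholders):

pip install poetry              # or use poetry's official installer
poetry new my-project           # scaffolds a project with a pyproject.toml
cd my-project
poetry add requests             # adds a dependency and pins it in poetry.lock
poetry install                  # recreates the environment from the lock file
poetry run python my_script.py  # runs inside the managed virtualenv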