What is the true equivalent of package.json for pip?

I know about requirements.txt, but that only includes a list of dependencies.
But what about the other meta information like the package name, author, main function, etc.?
I also know about setup.py, but since I want to programmatically access the values inside, I need a standard configuration format like YAML/JSON, not Python code.
Did the Python community come up with something truly comparable to package.json?

1. Without 3rd Party Packages
pip freeze > requirements.txt
on the local machine. Then, on the server,
pip install -r requirements.txt
This installs all the dependencies.
2. With a 3rd Party Package
pipenv
I would use pipenv instead of pip. pipenv automatically generates Pipfile and Pipfile.lock files, which are far superior to requirements.txt.
Install pipenv and set it up for your project:
pip install --user pipenv
cd yourproject
pipenv install package1 package2 ...
Installing packages from a Pipfile is as simple as
pipenv install
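For reference, the Pipfile that pipenv generates looks roughly like this (the requests entry and Python version are placeholders):

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"

[dev-packages]

[requires]
python_version = "3.9"
```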
Read more: https://pipenv.pypa.io/en/latest/
poetry
I have recently moved from pipenv to poetry because poetry has everything pipenv offers and much more. It is end-to-end, as it includes building and publishing of your project to pypi.
Installing poetry:
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
and add .poetry/bin to your PATH.
poetry new yourproject
cd yourproject
poetry add packagename
Like pipenv, this generates a pyproject.toml file that contains all your requirements. And like pipenv, installing your dependencies is as simple as
poetry install
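For reference, the pyproject.toml that poetry generates looks roughly like this (names, versions, and the requests dependency are placeholders):

```toml
[tool.poetry]
name = "yourproject"
version = "0.1.0"
description = ""
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.9"
requests = "^2.31"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```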
See more: https://poetry.eustace.io/docs/
See Python packaging war: Pipenv vs. Poetry for short review of these awesome packages

As the comments on my question pointed out, the answer is pyproject.toml together with the poetry package management tool.

Related

Pipenv installed packages on environment (Pipenv file)

About the Pipfile that pipenv generates: when I run pipenv shell in a specific folder, the virtual environment works just fine, and everything I install in there works fine as well, but the Pipfile doesn't seem to be updating with the packages I'm installing. When I check the dependency tree with pipenv graph, it shows all the dependencies I've been using. Is there something I'm missing with how pipenv works?
Note:
Whenever I want to create a new env I follow these steps:
mkdir app
cd app
pipenv shell
pip install <somepackage>
touch main.py # (add my code)
python main.py
You have to install packages using the command pipenv install [package] in order for pipenv to create/update the Pipfile and Pipfile.lock files.
As you already installed the dependencies with pip, you can run pipenv run pip freeze > requirements.txt && pipenv install -r requirements.txt and it will create or update the aforementioned files for you. It is best, though, that you declare each package you want because this method also writes each package dependencies on these files.

Is it possible to update Pipfile after installing packages using pip install?

I just created a pipenv environment using pipenv --python 3.9. I then did pipenv shell and started installing packages with pip install. It turns out this doesn't seem to be the usual way of doing things with pipenv. Is there any command I can run to update the Pipfile with all the packages I installed with pip install? I searched but couldn't find anything.
When you have multiple packages you'd like to install, you usually have what's called a requirements.txt file, which contains all the packages you'd like to use for your project.
You can run
$ pipenv run pip freeze > requirements.txt
to generate the requirements file in the current directory while the virtual environment is active.
Initially, you're going to have to install all your packages manually, but afterwards you can run
$ pipenv install -r path/to/requirements.txt
to import all the packages in requirements.txt from the shell/virtual environment.
Instead of running pipenv shell and then pip install <package>, you should simply run pipenv install <package> (outside the virtual environment, from the same location you ran pipenv --python 3.9).
This will install the package in your virtual environment and will automatically update your Pipfile and Pipfile.lock files.
You may skip the Pipfile.lock update by using the --skip-lock flag: pipenv install --skip-lock <package>
You can use pipreqs which generates requirements.txt file based on imports.
pip install pipreqs
pipreqs
pipenv install -r requirements.txt

Can pip install packages recursively from requirements.txt

I have a projectA that depends on other projects. Some of them are also my own projects, hosted in a private git repository. I listed all of projectA's dependencies in requirements.txt.
Now projectB, which projectA depends on, has dependencies of its own (listed in requirements.txt and setup.py), but pip doesn't install them when I run
pip install -r requirements.txt for projectA.
snakebasket appears to be a wrapper of pip with exactly this design goal.

Installing non-pip library in Django [duplicate]

We'd like to use pip with github to install private packages to our production servers. This question concerns what needs to be in the github repo in order for the install to be successful.
Assuming the following command line (which authenticates just fine and tries to install):
pip install git+ssh://git@github.com/BlahCo/search/tree/prod_release_branch/ProductName
What needs to reside in ProductName? Is it the contents of what would normally be in the tar file after running setup.py with the sdist option, or is it the actual tar.gz file, or something else?
I'm asking here because I've tried several variations and can't make it work. Any help appreciated.
You need the whole python package, with a setup.py file in it.
A package named foo would be:
foo # the installable package
├── foo
│   ├── __init__.py
│   └── bar.py
└── setup.py
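A minimal setup.py for the layout above might look like this. It is pure packaging configuration; the name, version, and empty dependency list are placeholders:

```python
from setuptools import setup, find_packages

setup(
    name="foo",
    version="0.1.0",
    packages=find_packages(),  # discovers the inner foo/ package
    install_requires=[],       # runtime dependencies, if any
)
```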
And install from github like:
$ pip install git+ssh://git@github.com/myuser/foo.git
or
$ pip install git+https://github.com/myuser/foo.git#v123
or
$ pip install git+https://github.com/myuser/foo.git#newbranch
More info at https://pip.pypa.io/en/stable/reference/pip_install/#vcs-support
I had a similar issue when I had to install from a github repo but did not want to install git, etc.
The simple way to do it is using zip archive of the package. Add /zipball/master to the repo URL:
$ pip install https://github.com/hmarr/django-debug-toolbar-mongo/zipball/master
Downloading/unpacking https://github.com/hmarr/django-debug-toolbar-mongo/zipball/master
Downloading master
Running setup.py egg_info for package from https://github.com/hmarr/django-debug-toolbar-mongo/zipball/master
Installing collected packages: django-debug-toolbar-mongo
Running setup.py install for django-debug-toolbar-mongo
Successfully installed django-debug-toolbar-mongo
Cleaning up...
This way you will make pip work with github source repositories.
If you want to use a requirements.txt file, you will need git and an entry like the one below in your requirements.txt to anonymously fetch the master branch.
For regular install:
git+git://github.com/celery/django-celery.git
For "editable" install:
-e git://github.com/celery/django-celery.git#egg=django-celery
Editable mode downloads the project's source code into ./src in the current directory. It allows pip freeze to output the correct github location of the package.
Clone the target repository the same way you clone any other project:
git clone git@github.com:myuser/foo.git
Then install it in develop mode:
cd foo
pip install -e .
You can change anything you want, and every piece of code using the foo package will use the modified code.
There are 2 benefits to this solution:
You can install package in your home projects directory.
Package includes .git dir, so it's regular Git repository. You can push to your fork right away.
Here is the simple solution
With git
pip install git+https://github.com/jkbr/httpie.git
Without git
pip install https://github.com/jkbr/httpie/tarball/master
or
pip install https://github.com/jkbr/httpie/zipball/master
or
pip install https://github.com/jkbr/httpie/archive/master.zip
Note: You need a python package with the setup.py file in it.
The format below can be used to install python libraries via pip from GitHub:
pip install git+ssh://git@github.com/<username>/<LibName>.git#egg=<LibName>
You can try it this way in Colab:
!git clone https://github.com/UKPLab/sentence-transformers.git
!pip install -e /content/sentence-transformers
import sentence_transformers
A tested solution for Ubuntu, using terminal commands:
Step 1:
In a selected directory clone the git repo.
Example:
$ git clone https://github.com/httpie/httpie.git
Step 2:
Change into the cloned folder:
$ cd ClonedFolderName
Step 3:
Enter the following command to install the package:
$ pip install ./
pip install ./ installs the package from the current (cloned) directory.
Note: Make sure setup.py is inside the cloned repo (which it is by default).

How can I use a pip requirements file to uninstall as well as install packages?

I have a pip requirements file that changes during development.
Can pip be made to uninstall packages that do not appear in the requirements file as well as installing those that do appear? Is there a standard method?
This would allow the pip requirements file to be the canonical list of packages - an 'if and only if' approach.
Update: I suggested it as a new feature at https://github.com/pypa/pip/issues/716
This should uninstall anything not in requirements.txt:
pip freeze | grep -v -f requirements.txt - | grep -v '^#' | xargs pip uninstall -y
This won't work quite right with packages installed with -e, i.e. from a git repository or similar. To skip those, just filter out lines starting with the -e flag:
pip freeze | grep -v -f requirements.txt - | grep -v '^#' | grep -v '^-e ' | xargs pip uninstall -y
Then, obviously:
pip install -r requirements.txt
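To see what the filter above actually removes, you can simulate pip freeze output with a couple of throwaway files (the package names here are made up):

```shell
# Fake `pip freeze` output and a requirements file to filter against.
printf 'libA==1.0\nlibB==1.1\nlibD==1.3\n' > frozen.txt
printf 'libA==1.0\nlibB==1.1\n' > requirements-demo.txt

# Lines present in the env but absent from the requirements file
# are the ones that would be piped to `pip uninstall -y`:
grep -v -f requirements-demo.txt frozen.txt
# prints: libD==1.3
```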
Update for 2016:
You probably don't really want to actually use the above approach, though. Check out pip-tools and pip-sync which accomplish what you are probably looking to do in a much more robust way.
https://github.com/nvie/pip-tools
Update for May, 2016:
You can now also use pip uninstall -r requirements.txt, however this accomplishes basically the opposite - it uninstalls everything in requirements.txt
Update for May, 2019:
Check out pipenv or Poetry. A lot has happened in the world of package management that makes this sort of question a bit obsolete. I'm actually still quite happily using pip-tools, though.
You can now pass the -r requirements.txt argument to pip uninstall.
pip uninstall -r requirements.txt -y
At least as of pip 8.1.2, pip help uninstall shows:
...
Uninstall Options:
-r, --requirement <file> Uninstall all the packages listed in the given requirements file. This option can be
used multiple times.
...
It's not a feature of pip, no. If you really want such a thing, you could write a script to compare the output of pip freeze with your requirements.txt, but it would likely be more hassle than it's worth.
Using virtualenv, it is easier and more reliable to just create a clean environment and (re)install from requirements.txt, like:
deactivate
rm -rf venv/
virtualenv venv/
source venv/bin/activate
pip install -r requirements.txt
The short answer is no, you can't do that with pip.
Here's a simple solution that works:
pip uninstall $(pip freeze) -y
This is an old question (but a good one), and things have changed substantially since it was asked.
There's an offhand reference to pip-sync in another answer, but it deserves its own answer, because it solves precisely the OP's problem.
pip-sync takes a requirements.txt file as input, and "trues up" your current Python environment so that it matches exactly what's in that requirements.txt. This includes removing any packages that are present in your env but absent from requirements.txt.
Example: Suppose we want our env to contain (only) 3 libraries: libA, libB, and libC, like so:
> cat requirements.txt
libA==1.0
libB==1.1
libC==1.2
But our env currently contains libC and libD:
> pip freeze
libC==1.2
libD==1.3
Running pip-sync will result in this, which was our desired final state:
> pip-sync requirements.txt
> pip freeze
libA==1.0
libB==1.1
libC==1.2
Stephen's proposal is a nice idea, but unfortunately it doesn't work if you include only direct requirements in your file, which sounds cleaner to me: all dependencies will be uninstalled, including even distribute, breaking pip itself.
Maintaining a clean requirements file while version tracking a virtual environment
Here is how I try to version-track my virtual environment.
I try to maintain a minimal requirements.txt, including only
the direct requirements, and not even mentioning version constraints where
I'm not sure.
Besides that, I keep, and include in version tracking (say git),
the actual status of my virtualenv in a venv.pip file.
Here is a sample workflow:
setup virtualenv workspace, with version tracking:
mkdir /tmp/pip_uninstalling
cd /tmp/pip_uninstalling
virtualenv venv
. venv/bin/activate
initialize version tracking system:
git init
echo venv > .gitignore
pip freeze > venv.pip
git add .gitignore venv.pip
git commit -m "Python project with venv"
install a package with dependencies, include it in requirements file:
echo flask > requirements.txt
pip install -r requirements.txt
pip freeze > venv.pip
Now start building your app, then commit and start a new branch:
vim myapp.py
git commit -am "Simple flask application"
git checkout -b "experiments"
install an extra package:
echo flask-script >> requirements.txt
pip install -r requirements.txt
pip freeze > venv.pip
... play with it, and then come back to earlier version
vim manage.py
git commit -am "Playing with flask-script"
git checkout master
Now uninstall extraneous packages:
pip freeze | grep -v -f venv.pip | xargs pip uninstall -y
I suppose the process can be automated with git hooks, but let's not go off topic.
Of course, it then makes sense to use some package caching system
or local repository like pip2pi.
You can create a new file with all the installed packages:
pip freeze > uninstall.txt
and then uninstall all of them:
pip uninstall -r uninstall.txt -y
and finally re-install the packages from your original requirements.txt file:
pip install -r requirements.txt
Piggybacking off #stephen-j-fuhry, here is a PowerShell equivalent I use:
pip freeze | ? { $_ -notmatch ((gc req.txt) -join "|") }
While this doesn't directly answer the question, a better alternative to requirements.txt now is using a Pipfile. This functions similarly to a Ruby Gemfile. Currently, you need to make use of the pipenv tool but hopefully this will eventually be incorporated into pip. This provides the pipenv clean command which does what you want.
(Note that you can import an existing requirements.txt with pipenv install -r requirements.txt. After this you should have a Pipfile and the requirements.txt can be removed.)
It is possible now using:
pip uninstall -r requirements.txt
