I have a Python script that uses additional libraries. I want to write an additional script that downloads the needed packages, which the user would run before running the actual script. I am using a Linux environment, and all packages are installed with pip (or sudo) from the command line. What is the proper way to provide that kind of script? Is setup.py intended for that purpose?
There is a standard pip way to do that: use a requirements.txt file.
# requirements.txt
numpy==1.5.1
scipy==0.9.0
Then run:
pip install -r requirements.txt
You can use pip freeze to get the currently installed packages.
pip freeze | grep numpy
# numpy==1.5.1
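To answer the setup.py part of the question: setup.py (via setuptools) can also declare dependencies with install_requires, which pip resolves automatically when the package is installed. A minimal sketch, assuming your code lives in a module called myscript (the name and the version pins are illustrative):

# setup.py -- minimal sketch; "myscript" and the pins are illustrative
from setuptools import setup

setup(
    name="myscript",
    version="0.1",
    py_modules=["myscript"],  # assumes your code is in myscript.py
    install_requires=[
        "numpy==1.5.1",
        "scipy==0.9.0",
    ],
)

Running pip install . in the project directory then pulls in both dependencies. For a standalone script, though, requirements.txt is the simpler tool.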
I have created a program that uses some external libraries such as matplotlib, NumPy, pandas... I want to share my program with other users. The problem is that those users would need to install those libraries, and I want to avoid that. I have read that an executable file can be created; however, I also want the other users to be able to edit the program. How can I do this?
If you share the program as an executable file, the user won't have to install the libraries. But once the code is converted it cannot be edited, and every time you edit the code you will need to convert it again to update the version. If you want to share editable code, the best option is to share a requirements.txt file along with the program.
You can generate a requirements.txt file that lists all your current dependencies. Use the pipreqs module to list only the libraries actually used in your program.
Install pipreqs (without the $):
$ pip install pipreqs
Then go to your program directory and, in the terminal/PowerShell/command prompt, type (without the $):
$ pipreqs .
It will generate requirements.txt in the program directory. Zip the entire folder and share it. Users will need to install from this requirements.txt file; they can do so using the following command (without the $):
$ pip install -r requirements.txt
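The generated file will look something like this (the package names and versions are illustrative; pipreqs pins whatever versions it detects in your environment):

# requirements.txt (generated by pipreqs)
matplotlib==3.5.1
numpy==1.22.0
pandas==1.4.0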
What you need is to add a requirements.txt file. This is a file in which you specify the dependencies of a project. For example, your program may depend on a certain NumPy version; by adding it to the requirements.txt file, another user who needs to install the dependencies for that project can easily automate it.
If you are using a virtual environment, you can simply generate the requirements.txt by running pip freeze > requirements.txt. Otherwise you may need to add the used libraries to this file yourself.
To let other people install the dependencies in their environment, they just need to execute the following pip command instead of installing every single module:
$ pip install -r requirements.txt
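For example, a short requirements.txt might look like this (the names and version specifiers are illustrative):

# requirements.txt
numpy==1.21.0       # exact pin
pandas>=1.3,<2.0    # bounded range
matplotlib          # any version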
As mentioned in the comments, using the requirements file is the way to go and a common standard. You can create the requirements file using the following pip command:
$ cd <root directory>
$ pip freeze > requirements.txt
Is there a way to see all pip-installed packages in PyCharm?
Because I have this problem: I write in PyCharm and it works fine, but now I want to move the project to a server... and I don't know how I can quickly export this.
There is such a tool provided by PyCharm. You can find it in Tools -> Sync Python Requirements...
Edit:
If you receive an empty file after this, PyCharm will also suggest that you install certain plugins.
Type the following in the terminal:
pip list
This will show all the installed packages in the terminal.
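With a recent pip, the output looks roughly like this (the packages and versions are illustrative):

Package    Version
---------- -------
numpy      1.21.0
pandas     1.3.5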
Then try the following line to dump all the installed packages into a requirements.txt file:
pip freeze > requirements.txt
Then, on the server, after creating a virtualenv, run the following command to install all the requirements:
pip install -r requirements.txt
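A minimal server-side session might look like this (the environment name venv is illustrative):

python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt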
Use the command pip freeze > requirements.txt locally to export the environment you need into the file,
then use the command pip install -r requirements.txt on the server to install the required environment.
I am currently working on a Python project, and I want users to automatically have access to the dependencies that I used. Do they have to download them (e.g. with pip install) manually? If so, is there an easy way to let them download the necessary packages?
virtualenv is what you need. You install the packages your project needs while developing it. After coding, you can run pip freeze > requirements.txt to save all packages to requirements.txt. pip install -r requirements.txt will then install all of them automatically.
Furthermore, Docker is even better for releasing projects to other machines.
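For the Docker route, a minimal Dockerfile sketch might look like this (the base image tag and the entry-point file name are assumptions):

# Dockerfile -- minimal sketch; image tag and entry point are illustrative
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "main.py"]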
You need to create a virtual environment (see the link on how to). Then you can use pip freeze > requirements.txt to store your environment's dependencies in a file, and your users can simply run pip install -r requirements.txt to install them in one go.
See the documentation for more details.
I've been looking around for a package manager that can be used with Python. I want to list project dependencies in a file.
For example, Ruby uses a Gemfile, where you can run bundle install.
How can I achieve this in Python?
The pip tool is becoming the standard equivalent of Ruby's gem.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
Pipenv
(I know it's an old question and it already has an answer, but for anyone coming here looking for a different answer, like me.)
I've found a very good equivalent of npm; it's called pipenv. It handles both the virtualenv and the pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
Then you can make a new virtualenv with Python 3, as well as a Pipfile that will be filled with your project's requirements and other things:
pipenv install --three
Using your created virtualenv:
pipenv shell
Installing a new Python package:
pipenv install requests
Running your .py file:
pipenv run python somefile.py
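After the pipenv install requests step above, the generated Pipfile will look roughly like this (the python_version line depends on your environment):

[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"

[dev-packages]

[requires]
python_version = "3.6"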
You can find its documentation here.
Python uses pip for a package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
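Requirements files also support comments and can include other requirements files, which is handy for separating base and development dependencies (the file names are illustrative):

# requirements-dev.txt -- illustrative
-r requirements.txt   # pull in the base requirements
pytest>=6.0           # development-only dependency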
This is how I restrict pip's scope to the current project. It feels like the opposite if you're coming from Node.js's npm or PHP's Composer, where you explicitly request global installation with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up: just cd to your project's root and run:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need -l or --local if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
This is an old question but things are constantly evolving.
Further to the other answer about pipenv: there is also a Python package manager called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also links the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.
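For comparison, a minimal poetry workflow looks like this (the dependency name is illustrative):

pip install poetry
poetry init                    # interactively creates pyproject.toml
poetry add requests            # adds and installs a dependency
poetry install                 # installs everything from pyproject.toml / poetry.lock
poetry run python somefile.py  # runs inside poetry's virtualenv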
I have one machine with my commonly used Python packages installed,
and I would like to install the same packages on another machine, or on the same machine with a different Python version. I would like to know whether pip, easy_install, or some other method can let me install those packages in a batch. When I use Perl, it has something like a bundle package; how do I do that in Python?
Pip has some great features for this.
It lets you save all requirements from an environment in a file using pip freeze > reqs.txt
You can then later do pip install -r reqs.txt and you'll get the exact same environment.
You can also bundle several libraries into a .pybundle file with the command pip bundle MyApp.pybundle -r reqs.txt, and later install it with pip install MyApp.pybundle.
I guess that's what you're looking for :)
I keep a requirements.txt file in one of my repositories that has all my basic Python requirements, and I use pip to install them on any new machine.
Each of my projects also has its own requirements.txt file that contains all of its dependencies, for use with virtualenv.