I am currently using conda for this purpose.
After generating a local environment.yml, I run $ conda env create -f environment.yml on the remote server.
But this doesn't include global packages that my code references.
I can add a requirements.txt using pipreqs and then run pip install -r requirements.txt remotely, but this doesn't take into account dependencies like dlib or boost that a package may need in order to install.
Is there any solution for this?
You have two different types of dependencies: 1) system libraries that must be installed via apt-get, like boost or opencv, and 2) Python packages that are installed via pip.
You need to install the apt-get libraries manually on the server, and you can define the pip-installed libraries in the requirements.txt file, because apt-get libraries are environment-independent.
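A rough sketch of that two-step setup (the library names below are illustrative Debian/Ubuntu package names, not from the question):
sudo apt-get update
sudo apt-get install -y libboost-all-dev libopencv-dev   # system-level libraries
pip install -r requirements.txt                          # Python-level dependencies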
I am learning how to use venv here: https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#installing-from-source
And it says I can install a source package by:
python3 -m pip install .
Which works, but now if I do pip freeze then I see:
my-package @ file:///Users/joesmith/my-package
The problem is that if I export this to a requirements.txt and try to install the environment on another machine, it won't work, because the path to the source will obviously be different.
What is the proper way to use a source package locally like I did, but also export it afterwards so that another person can recreate the environment and run the code on another machine?
Pip has support for VCS such as git. You can upload your code to a git host (e.g. GitHub, GitLab) and then reference it in your requirements.txt like this:
git+http://git.example.com/MyProject#egg=MyProject
https://pip.pypa.io/en/stable/cli/pip_install/#vcs-support
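The same syntax also lets you pin a specific tag or commit for reproducibility (the URL and tag here are placeholders):
git+https://git.example.com/MyProject.git@v1.0#egg=MyProject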
Alternatively, you could publish the package on PyPI and install it from there rather than from source,
e.g. pip install requests
That way, other developers will also be able to run your project easily.
I am currently working on a Python project, and I want users to automatically have access to the dependencies that I used. Do they have to download (e.g. pip install) them manually? If so, is there an easy way to let them install the necessary packages?
virtualenv is what you need. You install the packages your project needs while developing it. After coding, you can run pip freeze > requirements.txt to save all the packages to requirements.txt. pip install -r requirements.txt will then install all of them automatically.
Furthermore, Docker is even better for releasing projects to other machines.
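As a minimal sketch of the Docker route (the base image and the main.py entry point are assumptions, not part of the answer above):
# write a minimal Dockerfile; main.py is a hypothetical entry point
cat > Dockerfile <<'EOF'
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "main.py"]
EOF
docker build -t myproject .   # build the image once
docker run --rm myproject     # run it anywhere Docker is available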
You need to create a virtual environment (see the link on how to). Then you can use pip freeze > requirements.txt to store your environment's dependencies in a file, and your user can simply run pip install -r requirements.txt to install them all in one go.
See the documentation for more details.
Is there a way to install Airflow without pip?
I am trying to install Airflow on an offline computer that does not have pip. I have downloaded the packages from the internet, but I am not sure how to run the installation without pip.
Does anybody know how to run an installation with setup.py?
#!/usr/bin/env bash
python setup.py build
python setup.py install
If you run into problems such as missing setuptools, you can install it (depending on your system, it is usually available as a package; to be sure you have everything, download and install the python-devel, sometimes called python-dev, package with your OS package manager). Alternatively, you could use Airflow via Docker: you build the image and then export it to the offline unit.
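The export step could look something like this (a sketch; the image name is illustrative):
# on a machine with internet access
docker save my-airflow-image | gzip > airflow-image.tar.gz
# on the offline unit, after copying the archive over
gunzip -c airflow-image.tar.gz | docker load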
To install it without pip:
download the Airflow zip from the git repo.
unzip the contents and follow the instructions in the INSTALL file. (The steps inside create a virtualenv and then start the Airflow installation.)
These are the steps you will find in the INSTALL file:
python -m venv my_env
source my_env/bin/activate
# [required] by default one of Apache Airflow's dependencies pulls in a GPL
# library. Airflow will not install (and upgrade) without an explicit choice.
#
# To make sure not to install the GPL dependency:
# export SLUGIFY_USES_TEXT_UNIDECODE=yes
# In case you do not mind:
# export AIRFLOW_GPL_UNIDECODE=yes
# [required] building and installing
# by pip (preferred)
pip install .
# or directly
python setup.py install
Alternatively, if you are facing any conflicts while installing through pip:
create a virtualenv
activate it with source my_env/bin/activate
follow the steps from Airflow to install and configure it.
One way or the other you will have to use pip, as there may be a number of modules you have to download using pip. Creating a virtualenv here will help.
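If pip itself is available offline, one common pattern (a sketch, not from the answer above) is to fetch everything on a connected machine and then install from a local directory:
# on a machine with internet access
pip download apache-airflow -d ./airflow-packages
# on the offline machine, after copying the directory over
pip install --no-index --find-links=./airflow-packages apache-airflow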
Suppose I have a Python interpreter with many modules installed on my local system, and it has been tuned to just work.
Now I want to create a virtualenv to freeze these, so that they won't be broken by future upgrades.
How can I do that? Thanks.
I can't use pip freeze, because the target is a cluster that has no pip, and I don't have the privileges to install it. I don't want to reinstall the modules either; I'm asking whether there's a way to clone the environment.
Run pip freeze to create a list of all modules currently installed on the system. Then make a virtualenv and install these modules.
pip freeze > env_modules.txt
virtualenv my_env && cd my_env && source bin/activate
pip install -r ../env_modules.txt
Virtualenv does not work here because it uses the local Python interpreter.
My solution is to use conda (Anaconda or Miniconda) to build the environment, so if you need some packages, you can just conda install them. Then copy the environment to the remote machine and run it.
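One concrete way to do the copy (my addition; the answer above doesn't name a tool) is conda-pack, which archives an environment for relocation:
# on the source machine
conda install -c conda-forge conda-pack
conda pack -n my_env -o my_env.tar.gz
# on the remote machine, after copying the archive over
mkdir -p my_env && tar -xzf my_env.tar.gz -C my_env
source my_env/bin/activate
conda-unpack    # fixes up path prefixes inside the unpacked environment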
I think the best option is to use cpvirtualenv, like this:
cpvirtualenv <name_of_virtualenv_to_be_copied> <name_of_new_virtualenv>
I've been looking around for a package manager that can be used with Python. I want to list project dependencies in a file.
For example, Ruby uses a Gemfile, against which you can run bundle install.
How can I achieve this in Python?
The pip tool is becoming the standard equivalent of Ruby's gems.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
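A minimal end-to-end sketch of that combination (the directory name env is just a convention):
virtualenv env                     # create an isolated environment
source env/bin/activate            # activate it
pip install -r requirements.txt    # install the project's dependencies into it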
Pipenv
(I know it's an old question and it already has an answer, but for anyone coming here looking for a different answer, like me.)
I've found a very good equivalent of npm: it's called pipenv. It handles both the virtualenv and the pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
then you can make a new virtualenv using Python 3, along with a Pipfile that will be filled with your project's requirements and other metadata:
pipenv install --three
using your created virtualenv:
pipenv shell
installing a new python package:
pipenv install requests
and running your .py file looks like:
pipenv run python somefile.py
you can find its docs here.
Python uses pip for a package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
This is how I restrict pip's scope to the current project. It feels like the opposite if you're coming from NodeJS's npm or PHP's composer, where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up: just cd to your project's root and run:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need -l or --local if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
This is an old question but things are constantly evolving.
Further to the other answer about pipenv, there is also a Python package manager called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also links the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.
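For a taste of the poetry workflow (the package name requests is only an example):
pip install poetry           # or use poetry's official installer
poetry new myproject         # scaffolds a project with a pyproject.toml
cd myproject
poetry add requests          # declares the dependency and installs it
poetry install               # recreates the environment from pyproject.toml / poetry.lock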