I have several Python apps (each in a separate GitHub repo) which have grown to the point that they all need access to some functions and classes that live inside one of the apps.
I have extracted the shared code into a separate repo and pushed it as a package to Gemfury.
I am now stuck on the steps required to make development with this package easy - the package is installed as a dependency via a Pipfile, and everything loads into my development Docker container as expected.
However, if I want to make changes to this package, it's a long-winded process of pushing a new version and installing that new version just to see how it behaves with my main app's code.
I now have a separate checkout of the shared repo loaded into the Docker container as a volume, but when using python setup.py develop, the develop version isn't used over the previously installed version (installed from Pipfile.lock when the container is created).
Does anybody else have this challenge and know a way around this?
With regular pip install you have:
src> git clone ...app1.git
src> git clone ...app2.git
src> pip install ./app1
src> pip install ./app2
This installs using the versions fetched from install_requires in setup.py (into the global environment). If your order is incorrect, it will fetch the app from PyPI rather than use your local version.
I don't know about pipenv, but with virtualenv you can keep your code and environment separate:
src> virtualenv dev
src> . dev/bin/activate
(dev) src> pip install -r app1/requirements.txt
(dev) src> pip install -r app2/requirements.txt
(dev) src> pip install -e app1
(dev) src> pip install -e app2
Then any changes to app1 will be immediately visible to app2, provided that app2's requirements.txt includes -e ../app1. (Removing the -e ../ when running a CI pipeline can be done with sed 's,-e ../,,g' requirements.txt > requirements-ci.txt; the pipeline will then pick up the latest version published to Gemfury.)
Your virtualenv is insulated from any globally installed packages.
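For the pipenv side of the question specifically, a hedged sketch: pipenv also supports editable path dependencies, so during development the Pipfile could point at the volume-mounted checkout instead of the Gemfury release (the path and package name here are illustrative):

[packages]
shared-lib = {path = "/src/shared-lib", editable = true}

or, equivalently, from the command line:

pipenv install -e /src/shared-lib

The editable install then takes the place of the pinned Gemfury version inside the container.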
I'm trying to set up the development environment for modifying a Python library. Currently, I have a fork of the library, I cloned it from remote and installed it with
pip install -e git+file:///work/projects/dev/git_project@branch#egg=git_project
However, it seems that instead of creating a symbolic link with pip install -e to the directory where I cloned my package, pip copies the package to src/git_project in my virtual environment, making it difficult to modify it from there and push changes to my fork at the same time. Am I missing something, or does pip install -e not actually make a symlink when installing from VCS?
I know that I can also do pip install -e git+git:// to install from my remote, but that makes it difficult to see the real-time changes I make without pushing my code to the fork all the time.
Is there a way I can clone the fork into my local development environment, pip install a specific branch from the cloned repo, and create a symlink to the actual git_project folder, so that I can modify the package there, push changes to my remote, and at the same time import the library anywhere in my environment to see real-time changes without committing anything yet?
Thanks for any help!
pip install -e git+URL means "clone the repository from the URL locally and install it". If you already have the repository cloned locally and simply want to install from it, just install without Git:
cd /work/projects/dev/git_project
git checkout branch
pip install -e .
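To confirm the editable install points at your working tree rather than a copy (assuming the importable module is also named git_project, as the #egg fragment suggests), a quick check:

$ python -c "import git_project; print(git_project.__file__)"

It should print a path under /work/projects/dev/git_project instead of your virtualenv's site-packages.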
I have a package that I am developing for a local server. I would like to have the current stable release importable in a Jupyter notebook using import my_package and the current development state importable (for end-to-end testing and stuff) with import my_package_dev, or something like that.
The package is version controlled with git. The master branch holds the stable release, and new development work is done in the develop branch.
I currently pulled these two branches into two different folders:
my_package/ # tracks master branch of repository
setup.py
requirements.txt
my_package/
__init__.py
# other stuff
my_package_dev/ # tracks develop branch of repository
setup.py
requirements.txt
my_package/
__init__.py
# other stuff for dev branch
My setup.py file looks like this:
from setuptools import setup

setup(
    name='my_package',  # or 'my_package_dev' for the dev version
    # metadata stuff...
)
I can pip install my_package just fine, but I have been unable to get anything to link to the name my_package_dev in Python.
Things I have tried
pip install my_package_dev
Doesn't seem to overwrite the existing my_package, but doesn't make my_package_dev available either, even though pip says it finishes OK.
pip install -e my_package_dev
makes an egg and puts the development package path in easy-install.pth, but I cannot import my_package_dev, and my_package is still the old content.
Adding a file my_package_dev.pth to the site-packages directory and filling it with /path/to/my_package_dev
causes no visible change. Still does not allow me to import my_package_dev.
Thoughts on a solution
It looks like the best approach is going to be to use virtual environments, as discussed in the answers.
With pip install you install packages by the name given in setup.py's name attribute. If you have installed both and execute pip freeze, you will see both packages listed. Which code is actually importable depends on how they appear on the Python path.
The issue is that both packages contain a Python package named my_package; that is why you cannot import my_package_dev (it does not exist).
I would suggest keeping a working copy for each version (without modifying the package name) and using virtualenv to keep the environments isolated (one virtualenv for the stable version and another for dev).
You could also use pip's editable install to keep each environment updated from its working copy.
Note: renaming my_package_dev's my_package module directory to my_package_dev will also work, but it will make it harder to merge changes from one version to the other.
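To make the name distinction concrete, a minimal sketch of what the dev checkout's setup.py would look like; the packages value is assumed from the directory layout in the question:

from setuptools import setup

setup(
    name='my_package_dev',    # distribution name: what pip install/uninstall/freeze sees
    packages=['my_package'],  # import name: Python still imports my_package
)

Both distributions therefore install a directory named my_package into site-packages, so the second install shadows or collides with the first.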
The answer provided by Gonzalo got me on the right track: use virtual environments to manage two different builds. I created the virtual environment for the master (stable) branch with:
$ cd my_package
$ virtualenv venv # make the virtual environment
$ source venv/bin/activate
(venv) $ pip install -r requirements.txt # install everything listed as a requirement
(venv) $ pip install -e . # install my_package dynamically so that any changes are visible right away
(venv) $ sudo venv/bin/python -m ipykernel install --name 'master' --display-name 'Python 3 (default)'
And for the develop branch, I followed the same procedure in my my_package_dev folder, giving it a different --name and --display-name value.
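Spelled out, that second pass looks like this (the 'develop' kernel name and display name are the ones implied above):

$ cd my_package_dev
$ virtualenv venv
$ source venv/bin/activate
(venv) $ pip install -r requirements.txt
(venv) $ pip install -e .
(venv) $ sudo venv/bin/python -m ipykernel install --name 'develop' --display-name 'Python 3 (develop)'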
Note that I needed to use sudo for the final ipykernel install command because I kept getting permission denied errors on my system. I would recommend trying without sudo first, but for this system it needed to be installed system-wide.
Finally, to switch between which version of the tools I am using, I just have to select Kernel -> Change kernel and choose Python 3 (default) or Python 3 (develop). The import stays the same (import my_package), so nothing in the notebook has to change.
This isn't quite my ideal scenario since it means that I will then have to re-run the whole notebook any time I change kernels, but it works!
In NodeJS's npm you can create a package.json file to track your project dependencies. When you want to install them you just run npm install and it looks at your package file and installs them all with that single command.
When distributing my code, does Python have an equivalent concept, or do I need to tell people in my README to install each dependency like so:
pip install package1
pip install package2
Before they can use my code?
Once all the necessary packages are installed,
pip freeze > requirements.txt
creates a requirements file.
pip install -r requirements.txt
installs those packages again, say during production.
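The generated file is just a list of pinned packages, one per line - for example (package names and versions illustrative):

Flask==2.0.1
requests==2.26.0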
The best way may be pipenv - I use it personally!
However, in this guide I'll first explain how to do it with just Python and pip, without pipenv. That's the first part, and it gives a good understanding of how pipenv works underneath. The second part covers pipenv; check the section pipenv (the closest to npm).
Python and pip
To get it all working well with plain Python, here are the main elements:
virtual environment
requirements file (listing of packages)
pip freeze command
How to install packages from a requirements file
Virtual environment and why
Note that the venv module is the one to use for this: it's the official tool, and it has shipped with Python 3 installations starting from 3.3+.
To understand what it is and why it matters, check this out:
https://docs.python.org/3/tutorial/venv.html
In short, a virtual environment helps us manage an isolated Python interpreter, and with it an isolated set of installed packages. That way, different projects don't have to depend on the same package installation and conflict with each other. The link above explains and shows it well:
... This means it may not be possible for one Python installation to meet the requirements of every application. If application A needs version 1.0 of a particular module but application B needs version 2.0, then the requirements are in conflict and installing either version 1.0 or 2.0 will leave one application unable to run.
You may also like to check the explanation in the Flask framework docs:
https://flask.palletsprojects.com/en/1.1.x/installation/#virtual-environments
Why we care about this and should use it: to isolate the projects (each has its own environment). The freeze command then works on a per-project basis; check the last section.
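A quick demonstration of that isolation, without even activating the envs (the package and versions are illustrative):

$ python3 -m venv envA && envA/bin/pip install "requests==2.18.4"
$ python3 -m venv envB && envB/bin/pip install "requests==2.26.0"
$ envA/bin/python -c "import requests; print(requests.__version__)"   # prints 2.18.4
$ envB/bin/python -c "import requests; print(requests.__version__)"   # prints 2.26.0

Each env resolves its own copy, so both versions coexist on the same machine.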
Usage
Here is a good guide on how to set up and work:
https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/
Check the installation section first!
Then
To create a virtual environment you go to your project directory and run:
On macOS and Linux:
> python3 -m venv env
On Windows:
> py -m venv env
Note: you should exclude your virtual environment directory from your version control system using .gitignore or similar.
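For example, the matching .gitignore entry for the directory created above would just be:

env/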
To start using the environment in the console, you have to activate it
https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#activating-a-virtual-environment
On macOS and Linux:
> source env/bin/activate
On Windows:
> .\env\Scripts\activate
See the part on how to check that you are in the environment (using which on Linux/Unix, or where on Windows).
To deactivate, you use
> deactivate
Requirements files
https://pip.pypa.io/en/latest/user_guide/#requirements-files
“Requirements files” are files containing a list of dependencies to be installed using pip install, like so (this is how to install from a requirements file):
pip install -r requirements.txt
Requirements files are used to hold the result from pip freeze for the purpose of achieving repeatable installations. In this case, your requirement file contains a pinned version of everything that was installed when pip freeze was run.
python -m pip freeze > requirements.txt
python -m pip install -r requirements.txt
Some of the syntax:
pkg1
pkg2==2.1.0
pkg3>=1.0,<=2.0
== pins an exact version:
requests==2.18.4
google-auth==1.1.0
Force pip to accept earlier versions
ProjectA
ProjectB<1.3
Using git with a tag (fixing a bug yourself and not waiting)
git+https://myvcs.com/some_dependency@sometag#egg=SomeDependency
Again, check the link https://pip.pypa.io/en/latest/user_guide/#requirements-files - I picked all the examples from there; it has the full explanations and details.
For the format details check: https://pip.pypa.io/en/latest/cli/pip_install/#requirements-file-format
Freeze command
Pip can export a list of all installed packages and their versions using the freeze command. When the command runs, all packages installed in the current environment get listed:
pip freeze
Which will output something like:
cachetools==2.0.1
certifi==2017.7.27.1
chardet==3.0.4
google-auth==1.1.1
idna==2.6
pyasn1==0.3.6
pyasn1-modules==0.1.4
requests==2.18.4
rsa==3.4.2
six==1.11.0
urllib3==1.22
We can write that to a requirements file like so:
pip freeze > requirements.txt
https://pip.pypa.io/en/latest/cli/pip_freeze/#pip-freeze
Installing packages: summary
By using venv (a virtual environment) for each project, the projects are isolated, and the freeze command will list only the packages installed in that particular environment - which makes it per-project. The freeze command captures the packages at the time it runs, with the exact matching versions. We generate a requirements file from it (requirements.txt), add it to the project repo, and the dependencies can then be installed from it.
The whole flow can be done like this:
Linux/unix
python3 -m venv env
source env/bin/activate
pip3 install -r requirements.txt
Windows
py -m venv env
.\env\Scripts\activate
pip3 install -r requirements.txt
First-time setup after cloning a repo: create the new env, then activate it, then install the needed packages into it.
Otherwise, here is a complete guide about installing packages using requirements files and virtual environments from the official docs: https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/
This second guide shows it all well too: https://docs.python.org/3/tutorial/venv.html
Links listing (already listed above):
https://pip.pypa.io/en/latest/user_guide/#requirements-files
https://pip.pypa.io/en/latest/cli/pip_install/#requirements-file-format
https://pip.pypa.io/en/latest/cli/pip_freeze/#pip-freeze
pipenv (the closest to npm)
https://pipenv.pypa.io/en/latest/
pipenv is a tool that tries to be like npm for Python; it's a superset of pip.
pipenv creates the virtual environment for us and manages the dependencies.
A good feature, too, is the ability to write package.json-like files, with a scripts section in them:
Executing Pipfile scripts
Running a Python command with an alias on the command line, like npm
Installation
https://pipenv.pypa.io/en/latest/install/
virtualenv-mapping-caveat
https://pipenv.pypa.io/en/latest/install/#virtualenv-mapping-caveat
For me, having the env created within the project (just like node_modules) should even be the default; make sure to enable that by setting the environment variable (see the caveat link above).
pipenv can simply seem more convenient!
Mainly, managing run scripts is too good to miss out on - and it's one tool that simplifies it all.
Basic usage and comparison with npm
https://pipenv.pypa.io/en/latest/basics/
(make sure to check the guide above to get familiar with the basics)
Note that the equivalent of npm's package.json is the Pipfile.
An example:
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
flask = "*"
simplejson = "*"
python-dotenv = "*"
[dev-packages]
watchdog = "*"
[scripts]
start = "python -m flask run"
[requires]
python_version = "3.9"
There is also Pipfile.lock, like package-lock.json.
The npm install equivalent is pipenv install.
To install a new package:
pipenv install <package>
This will create a Pipfile if one doesn’t exist. If one does exist, it will automatically be edited with the new package you provided.
Just like with npm!
$ pipenv install "requests>=1.4" # will install a version equal or larger than 1.4.0
$ pipenv install "requests<=2.13" # will install a version equal or lower than 2.13.0
$ pipenv install "requests>2.19" # will install 2.19.1 but not 2.19.0
If the PIPENV_VENV_IN_PROJECT=1 env variable is set, pipenv places the virtual environment within the project, in a directory named .venv (the node_modules equivalent).
Also, running pipenv install in a directory with neither a Pipfile nor a virtual environment will create the virtual environment (under .venv, given the variable above) and generate both a Pipfile and a Pipfile.lock.
Installing flask example:
pipenv install flask
Installing as a dev dependency:
pipenv install watchdog -d
or
pipenv install watchdog --dev
Just like with npm!
pipenv all commands (pipenv -h)
Commands:
check Checks for PyUp Safety security vulnerabilities and against PEP
508 markers provided in Pipfile.
clean Uninstalls all packages not specified in Pipfile.lock.
graph Displays currently-installed dependency graph information.
install Installs provided packages and adds them to Pipfile, or (if no
packages are given), installs all packages from Pipfile.
lock Generates Pipfile.lock.
open View a given module in your editor.
run Spawns a command installed into the virtualenv.
scripts Lists scripts in current environment config.
shell Spawns a shell within the virtualenv.
sync Installs all packages specified in Pipfile.lock.
uninstall Uninstalls a provided package and removes it from Pipfile.
update Runs lock, then sync.
Command help
pipenv install -h
importing from requirements.txt
https://pipenv.pypa.io/en/latest/basics/#importing-from-requirements-txt
environment management with pipenv
https://pipenv.pypa.io/en/latest/basics/#environment-management-with-pipenv
pipenv run
To run anything within the project's virtual environment, you need to use pipenv run.
For example: pipenv run python server.py
Custom script shortcuts (like scripts in npm)
https://pipenv.pypa.io/en/latest/advanced/#custom-script-shortcuts
[scripts]
start = "python -m flask run"
And to run it:
pipenv run start
Just like with npm!
If you’d like a requirements.txt output of the lockfile, run $ pipenv lock -r. This will include all hashes, however (which is great!). To get a requirements.txt without hashes, use $ pipenv run pip freeze.
Worth mentioning, too: the pipenv CLI rendering is well done.
Make sure to read the basics guide - you'll see how rich pipenv is!
Yes, it's called the requirements file:
https://pip.pypa.io/en/stable/cli/pip_install/#requirement-specifiers
You can specify the package name & version number.
You can also specify a git url or a local path.
In the usual case, you would specify the package followed by the version number, e.g.
sqlalchemy==1.0.1
You can install all the packages specified in a requirements.txt file through the command
pip install -r requirements.txt
Once all the packages have been installed, run
pip freeze > requirements.txt
This will save the package details in the file requirements.txt.
For installation, run
pip install -r requirements.txt
to install the packages specified by requirements.txt.
I would like to propose pipenv here. Managing packages with pipenv is easier, as it manages the list and the versions of packages for you - with plain pip you need to re-run pip freeze each time you change your packages.
It needs a Pipfile. This file contains all of your required packages and their versions, just like package.json.
You can remove/update/add packages using pipenv install/uninstall/update <package>.
It also generates a dependency graph for your project, just like package-lock.json.
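To see that graph, a quick illustration (output abbreviated; the package and versions are illustrative):

$ pipenv graph
requests==2.18.4
  - certifi [required: >=2017.4.17, installed: 2017.7.27.1]
  - idna [required: >=2.5,<2.7, installed: 2.6]
  - urllib3 [required: <1.23,>=1.21.1, installed: 1.22]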
Check out this post on Pipfiles
Learn more about Pipenv
I've got a virtualenv set up for a Django app. So far I've installed all my packages via pip with the virtualenv activated, but I now need to clone one from Bitbucket. Is there a special way to do this, or do I just need to open a terminal, go to venv/lib/python2.7/site-packages, and run the clone command?
Here's the repository I'm trying to clone: https://bitbucket.org/basti/python-amazon-product-api/src
Use the -e flag and specify a git repo:
pip install -e git+https://github.com/manojlds/mylib.git#egg=mylib
The URL above can be Bitbucket, GitHub, etc.
-e, --editable <VCS+REPOS_URL[#REV]#EGG=PACKAGE>
Install a package directly from a checkout. Source will be checked
out into src/PACKAGE (lower-case) and installed in-place (using
setup.py develop). You can run this on an existing directory/checkout
(like pip install -e src/mycheckout). This option may be provided
multiple times. Possible values for VCS are: svn, git, hg and bzr.
Clone the repository. If your app has a setup.py, run python setup.py install while the virtualenv is activated.
Otherwise, copy the app inside your Django project and add its name to INSTALLED_APPS in settings.py.
Or you can use pip install -e <repo_addr>; see the docs.
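For the repository in the question, the flow would look roughly like this (the clone URL scheme depends on how the repo is hosted, so treat it as illustrative):

git clone https://bitbucket.org/basti/python-amazon-product-api.git
cd python-amazon-product-api
pip install -e .    # with the virtualenv activated

The clone can live anywhere; the editable install links the active virtualenv to it, so there's no need to work inside venv/lib/python2.7/site-packages.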
I'm writing a Python app to deploy on Heroku. Per Heroku's guide, I need to list package requirements in a Pip requirements.txt file. The guide instructs me to install the packages locally, then run pip freeze > requirements.txt to write the frozen requirements file.
However, one of the packages I want to use in deployment on Heroku can't be installed locally. It's incompatible with my operating system.
So how do I write a requirements.txt including this package suitable for Heroku?
The only way I can think of is to write it by hand - but this would be tedious, because the package in question has many dependencies of its own. Besides, this defeats the point of the package manager.
When deploying Ruby apps to Heroku, Bundler makes this easy. In my Gemfile I write
gem "pg", :group => :production
gem "sqlite3", :group => :development
The command bundle install then writes a frozen version list, Gemfile.lock (analogous to requirements.txt). It doesn't install the packages listed under the 'production' group, but it still freezes a consistent list of versioned packages.
Example: Gemfile and Gemfile.lock
You can have more than one file and call them different things, but Heroku does expect a requirements.txt. For instance, for dev, you could maintain a dev_requirements.txt.
Locally you can run:
$ pip freeze > dev_requirements.txt
etc, and
$ pip install -r dev_requirements.txt
and Heroku will run:
$ pip install -r requirements.txt
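One common layout for this (file names and the production-only package are illustrative) is to keep the shared pins in one file and include it from the others, since pip supports -r includes inside requirements files:

# requirements.txt - used by Heroku
-r common_requirements.txt
psycopg2==2.9.3    # production-only package that won't install locally

# dev_requirements.txt - used locally
-r common_requirements.txt

Heroku then installs the production-only package plus everything in common_requirements.txt, while local installs skip it.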
It's not possible. The issue has been reported to pip: https://github.com/pypa/pip/issues/747