What I have:
local Python 3 files that I want to turn into a module test_module
a test_module folder containing an empty __init__.py, a setup.py file (see below), and subdirectories with several source files
What I want:
1. continuously work on and improve test_module locally
2. have an easy way to install test_module and all its dependencies locally in my own virtual environment (created using python3 -m venv my_environment)
3. run files that make use of the module via python myexample.py, without having to adapt my local PYTHONPATH variable each time I enter or exit my_environment
4. share my Python code with others via Git, and allow them to install the code locally on their machines using the same procedure (as simple as possible)
5. learn best practices for creating my own module
How I'm doing it at the moment:
pip freeze > requirements.txt and pip install -r requirements.txt for installing dependencies
adding export PYTHONPATH="${PYTHONPATH}:." to my_environment/bin/activate, to have my own module in the search path
(as found here: How do you set your pythonpath in an already-created virtualenv?)
I'd like to know if there are "cleaner" solutions based on setup.py, possibly involving something like pip install ./test_module or similar that takes care of 2.-3. automagically.
My current setup.py file looks as follows:
from setuptools import setup

setup(
    name='test_module',
    version='0.1',
    description='Some really good stuff, that I am still working on',
    author='Bud Spencer',
    author_email='bud.spencer#stackoverflow.com',
    packages=['test_module'],  # same as name
    install_requires=['numpy', 'scipy', 'scikit-learn'],  # external packages as dependencies; note 'argparse' is in the Python 3 standard library, and the PyPI name for sklearn is 'scikit-learn'
)
It sounds like you want to run pip install -e <path/url> from within your virtual env, which will install a package (with a setup.py file as you have) from either a local path or a Git repo. See https://pip.pypa.io/en/stable/reference/pip_install/#vcs-support for an explanation on the syntax of the latter.
Example:
pip install -e git+https://github.com/me/test_module/#egg=test-module
If you have already installed and want to pull the latest code from the repo, add an --upgrade switch to the above.
Related
I have some issues with a published package and wish to edit the code myself (may generate a pull request later to contribute). I am quite confused about how to do this since it seems there is a lack of step-by-step guidance. Could anybody give me a very detailed instruction about how this is done (or a link)? My understanding and also my questions about the workflow are:
1. Fork the package through git/GitHub and have a local synced copy (done!).
2. Create a new Anaconda environment (done!)?
3. Install the package as normal: conda install xxx or python setup.py develop?
4. Do I make changes to the package directly in the package folder in Anaconda if I use python setup.py develop? Or do I make changes to the local forked copy and install/update again, and what are the commands for this?
5. Do I need to update the setup.py file as well before running it either way?
You can simply git clone the package repo to your local computer and then install it in "development" or "editable" mode. This way you can easily make changes to the code while at the same time incorporating it into your own projects. Of course, this will also allow you to create pull requests later on.
Using Anaconda (or Miniconda) you have two equivalent options for this:
using conda (conda-develop):
conda develop <path_to_local_repo>
using pip (pip install options):
pip install --editable <path_to_local_repo>
What these commands basically do is create a link to the local repo folder inside the environment's site-packages folder.
Note that for "editable" pip installs you need a basic setup.py:
import setuptools

setuptools.setup(name="anything")  # the name string can be anything
On the other hand, the conda develop <path_to_local_repo> command unfortunately doesn't work in environment.yml files.
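The "link" mentioned above is literally a .pth text file inside site-packages: every line in such a file that names an existing directory gets appended to sys.path. A minimal sketch of that mechanism, using temporary stand-in directories rather than a real environment:

```python
import os
import site
import sys
import tempfile

repo = tempfile.mkdtemp()  # stand-in for <path_to_local_repo>
sp = tempfile.mkdtemp()    # stand-in for the env's site-packages folder

# An editable install drops a .pth file like this one into site-packages.
with open(os.path.join(sp, "my_local_pkg.pth"), "w") as f:
    f.write(repo + "\n")

site.addsitedir(sp)        # process the .pth files found in sp
print(repo in sys.path)    # → True: the repo folder is now on the import path
```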
Using Windows
Learning about virtualenv. Here is my understanding of it and a few questions that I have. Please correct me if my understanding is incorrect.
A virtualenv is an environment where the pip dependencies for a particular project, at their selected versions, are stored. A folder is made for your project and the dependencies live inside it.
I was told you would not want to save your .py scripts inside the virtualenv. If that's the case, how do I access the virtualenv when I want to run that project? Open a command line, run source ENV/bin/activate, then cd my way to where my script is stored?
Does running pip freeze create a requirements.txt file in that project folder that is just a text copy of that virtualenv's dependencies?
If I'm in a second virtualenv, how do I import another virtualenv's requirements? I've been to the documentation but I still don't get it.
$ env1/bin/pip freeze > requirements.txt
$ env2/bin/pip install -r requirements.txt
Guess I'm confused by the "requirements" description. Isn't it best practice to always name the requirements file requirements.txt? If that's the case, how does env2 know I want env1's requirements?
Thank you for any info or suggestions. Really appreciate the assistance.
I created a virtualenv:
C:\Users\admin\Documents\Enviorments>virtualenv django_1
Using base prefix 'c:\\users\\admin\\appdata\\local\\programs\\python\\python37-32'
New python executable in C:\Users\admin\Documents\Enviorments\django_1\Scripts\python.exe
Installing setuptools, pip, wheel...done.
How do I activate it? source django_1/bin/activate doesn't work?
I've tried: source C:\Users\admin\Documents\Enviorments\django_1/bin/activate Every time I get : 'source' is not recognized as an internal or external command, operable program or batch file.
*Disclaimer*: I mainly use conda environments instead of virtualenv, but I believe that most of this is the same across both of them and is true in your case.
You should be able to access your scripts from any environment you are in. If you have virtenvA and virtenvB then you can access your script from inside either of your environments. All you would do is activate one of them and then run python /path/to/my/script.py, but you need to make sure any dependent libraries are installed.
Correct, but for clarity: the requirements file contains a list of the dependencies by name and version only. It doesn't contain any actual code or packages. If you print out a requirements file, it is just a list of pinned package names, one per line, like pandas==1.0.1, numpy==1.0.1, scipy==1.0.1, etc.
In the lines of code you have here, you would export the dependency list of env1 and then install those dependencies into env2. If env2 was empty, it will now just be a copy of env1; otherwise it will be the same but with all the packages of env1 added, and if it had different versions of some of the same packages, those would be replaced by the pinned versions.
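The freeze/install round trip described above can be sketched with made-up file contents (the package pins here are illustrative, not a real environment):

```python
import os
import tempfile

# pip freeze emits one pinned requirement per line, e.g. "pandas==1.0.1";
# simulate redirecting that output into a requirements.txt for env1.
path = os.path.join(tempfile.mkdtemp(), "requirements.txt")
frozen = ["pandas==1.0.1", "numpy==1.0.1", "scipy==1.0.1"]
with open(path, "w") as f:
    f.write("\n".join(frozen) + "\n")

# This is essentially what `pip install -r` parses back out for env2:
reqs = dict(line.split("==") for line in open(path).read().split())
print(reqs["pandas"])  # → 1.0.1
```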
virtualenv simply creates a new Python environment for your project. Think of it as another copy of Python on your system. A virtual environment is helpful for development, especially if you need different versions of the same libraries across projects.
The answer to your first question is yes: for each project that uses a virtualenv, you need to activate it first. After activating, any Python script you run, not just your project's scripts, will use the dependencies and configuration of the active Python environment.
The answer to the second question: pip freeze > requirements.txt creates the requirements file in the current working directory, not necessarily in your project folder. So if in your cmd/terminal you are in C:\Desktop, the requirements file will be created there; if you're in the C:\Desktop\myproject folder, the file will be created there instead. The requirements file will contain the packages installed in the active virtualenv.
The answer to the third question is related to the second. Simply put, you need to give the full path of the other requirements file. So if you are in the first project and want to install the packages recorded from it into the second virtualenv, you run something like env2/bin/pip install -r /path/to/my/first/requirements.txt. If your terminal's current directory does not have a requirements.txt file, then a bare pip install -r requirements.txt will give you an error. The command does not guess which requirements file you want; you specify it.
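The "full path" point is just ordinary path resolution: pip resolves a bare -r requirements.txt against the current working directory. A small sketch of the difference, with temporary directories standing in for the two project folders:

```python
import os
import tempfile

proj = tempfile.mkdtemp()  # stand-in for the folder holding requirements.txt
req = os.path.join(proj, "requirements.txt")
with open(req, "w") as f:
    f.write("numpy==1.0.1\n")  # made-up pinned contents

os.chdir(tempfile.mkdtemp())               # now somewhere else entirely
print(os.path.exists("requirements.txt"))  # → False: bare name fails here
print(os.path.exists(req))                 # → True: the full path still works
```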
Yes, keeping the virtualenv separate from your project files is one common convention; virtualenvwrapper and pipenv work like that. But personally, if I use virtualenv in its simplest form, I just create a directory with the project's name inside the virtualenv's directory (next to bin/) and keep the project files there.
pip freeze prints to the console the packages (and their versions) that you've installed inside your virtualenv using pip. If you want to save those requirements to a file, you should do something like pip freeze > requirements.txt.
There are a few possibilities:
you can activate one virtualenv, then change directory (cd /path/to/venv2) into the other virtualenv.
you can copy the requirements.txt file from one virtualenv and install those requirements in your second virtualenv.
I had to manually build a package and copy it to the site-packages directory. When I type pip list into a console it isn't listed, though I can use it in python scripts. How can I make pip aware of the package?
Installing it via pip is not an option.
You say "Installing it via pip is not an option.", but I'm assuming installing it via pip using a local copy still is. If so, the way to do that is to clone your library into a directory (say /my/lib/dir), where the root of the source for the root package appears below /my/lib/dir (ex: if the package you want to install is imported as import foo, then you should have /my/lib/dir/foo). If there is no file named setup.py in your copy of the code, then you need to create a simple one. Something like
# in a file called setup.py above the `foo` directory
from distutils.core import setup

setup(name='foo',
      version='1.0',
      packages=['foo'],
      )
Finally, run pip install . from /my/lib/dir.
It's definitely a hack, but making pip aware of a package without installing it via pip is asking for a hack :-)
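One way to see the difference from Python: distributions installed through pip leave *.dist-info metadata next to the package, and that metadata, not the package files themselves, is what pip list enumerates. A hand-copied package has no such metadata, so it stays invisible. This sketch only lists whatever happens to be registered in the running environment:

```python
import importlib.metadata

# Every distribution pip knows about (i.e., everything `pip list` shows)
# is discoverable via its metadata; a hand-copied package won't appear.
installed = [d.metadata["Name"] for d in importlib.metadata.distributions()
             if d.metadata["Name"]]
print(len(installed) > 0)  # True in any env that at least has pip installed
```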
I have a package that I am developing for a local server. I would like to have the current stable release importable in a Jupyter notebook using import my_package and the current development state importable (for end-to-end testing and stuff) with import my_package_dev, or something like that.
The package is version controlled with git. The master branch holds the stable release, and new development work is done in the develop branch.
I currently pulled these two branches into two different folders:
my_package/           # tracks master branch of repository
    setup.py
    requirements.txt
    my_package/
        __init__.py
        # other stuff
my_package_dev/       # tracks develop branch of repository
    setup.py
    requirements.txt
    my_package/
        __init__.py
        # other stuff for dev branch
My setup.py file looks like this:
from setuptools import setup

setup(
    name='my_package',  # or 'my_package_dev' for the dev version
    # metadata stuff...
)
I can pip install my_package just fine, but I have been unable to get anything to link to the name my_package_dev in Python.
Things I have tried
pip install my_package_dev
Doesn't seem to overwrite the existing my_package, but doesn't seem to make my_package_dev available either, even though pip says it finishes OK.
pip install -e my_package_dev
makes an egg and puts the development package path in easy-install.pth, but I cannot import my_package_dev, and my_package is still the old content.
Adding a file my_package_dev.pth to site-packages directory and filling it with /path/to/my_package_dev
causes no visible change. Still does not allow me to import my_package_dev.
Thoughts on a solution
It looks like the best approach is going to be to use virtual environments, as discussed in the answers.
With pip install, you install a package under the name given in setup.py's name attribute. If you have installed both and execute pip freeze, you will see both packages listed. Which code is actually importable depends on how the modules end up on the Python path.
The issue is that both of those packages contain just a Python module named my_package; that is why you cannot import my_package_dev (no module by that name exists).
I would suggest keeping a working copy of each version (without modifying the package name) and using virtualenv to keep the environments isolated (one virtualenv for the stable version and another for dev).
You could also use pip's editable install to keep the environment updated with the working copies.
Note: renaming the my_package module directory inside my_package_dev to my_package_dev would also work, but it would make merging changes from one version into the other harder.
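The shadowing described above can be reproduced in a few lines: two folders each provide a module named my_package, and whichever comes first on sys.path wins, so nothing is ever importable as my_package_dev. (All paths and the VERSION attribute are made up for the demo.)

```python
import os
import sys
import tempfile

base = tempfile.mkdtemp()
for variant in ("stable", "dev"):
    pkg = os.path.join(base, variant, "my_package")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write(f"VERSION = '{variant}'\n")

sys.path.insert(0, os.path.join(base, "dev"))
sys.path.insert(0, os.path.join(base, "stable"))  # shadows the dev copy

import my_package
print(my_package.VERSION)  # → stable: the dev copy is unreachable by name
```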
The answer provided by Gonzalo got me on the right track: use virtual environments to manage two different builds. I created the virtual environment for the master (stable) branch with:
$ cd my_package
$ virtualenv venv # make the virtual environment
$ source venv/bin/activate
(venv) $ pip install -r requirements.txt # install everything listed as a requirement
(venv) $ pip install -e . # install my_package dynamically so that any changes are visible right away
(venv) $ sudo venv/bin/python -m ipykernel install --name 'master' --display-name 'Python 3 (default)'
And for the develop branch, I followed the same procedure in my my_package_dev folder, giving it a different --name and --display-name value.
Note that I needed to use sudo for the final ipykernel install command because I kept getting permission denied errors on my system. I would recommend trying without sudo first, but for this system it needed to be installed system-wide.
Finally, to switch between which version of the tools I am using, I just have to select Kernel -> Change kernel and choose Python 3 (default) or Python 3 (develop). The import stays the same (import my_package), so nothing in the notebook has to change.
This isn't quite my ideal scenario since it means that I will then have to re-run the whole notebook any time I change kernels, but it works!
Problem: the package I want to install is outdated on pip, and conda doesn't have it in the repo. So, when I install a python package from github using,
git clone package_url
cd package_name
python setup.py
should I download the package (run git clone) from within the directory where conda or pip would usually install my packages? For example, should I run git clone from within:
['/Users/home/anaconda/lib/python2.7/site-packages',
'/Users/home/anaconda/lib/site-python']
OR, can I just run git clone from whatever directory I happen to be in?
The concern is that if I download from git into somewhere like /Users/home/Downloads, then when I run the setup.py file, the package would only be installed within the /Users/home/Downloads directory, and when I import, I wouldn't be able to find the package.
Accepted answer: I can run the git clone command in a terminal from within any directory. Then I can change directory into the newly created directory for the package that I cloned and run the setup.py script. Running the setup.py script will "automatically install [the package] within the site-packages of whatever python [is being] used when python [is invoked]". I hope this helps someone overly anxious about running setup.py files.
Run it from the folder containing setup.py.
Doing:
python setup.py install
will install the package in the appropriate directory. The setup.py file already contains the logic that puts the package in the right installation directory, so you don't need to worry about how the package makes its way there.
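The "appropriate directory" is the interpreter's site-packages folder, which is already on sys.path; that is why the package is importable from anywhere afterwards. You can ask Python where that is (the exact path differs per machine and per virtualenv):

```python
import sysconfig

# Pure-Python packages installed by `python setup.py install` or by pip
# are copied into this directory.
target = sysconfig.get_paths()["purelib"]
print(target)  # e.g. a path ending in site-packages
```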
It can be simpler to use pip for this package as well, by pointing pip directly at the URL:
pip install git+http://....git
The git+ in front of the URL is required.
You can even go a step further and install a specific branch:
pip install git+http://....git@branchname
You can run the setup.py file as you stated, following it with the install command:
python setup.py install
Usually, this installs the package you want onto the Python path.