How to add my own Module to my Anaconda environment - python

Basically the question is already in the title. I created the environment for my project with Miniconda. Now I want to include some modules, bundled in a directory, in this environment. So what I did was put the directory into the environment's site-packages, i.e. /miniconda/env/..../site-packages/mymodule/. When I run the module from the command line with my current working directory set to that directory, it works. But as soon as I just activate this conda environment and work from a different directory, it tells me ModuleNotFoundError: No module named 'stdiio'
I hope the question is more or less clear. Any help would be really appreciated.

If your module is installable (e.g., you have a setup.py), then you can activate your Conda env and install using pip:
conda activate myenv
pip install /some/path/to/mymodule
If you are actively developing the module, then use pip install -e instead.
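For example, a minimal sketch using the same hypothetical /some/path/to/mymodule location as above:
# editable ("development") install: edits to the source take effect
# without reinstalling
pip install -e /some/path/to/mymodule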
If your module is not installable, but just some source folders with __init__.py files, then another option is to add the containing folder to PYTHONPATH. For example, if your module is in /some/path/to/mymodule, then you would use
export PYTHONPATH="/some/path/to:$PYTHONPATH"
Be careful with PYTHONPATH - you can run into confusing problems if conflicting outside modules "leak" into your Conda environment (e.g., by adding a site-packages directory from another Python install).
Installation should be the preferred option, and if you need to use PYTHONPATH, set it in an env-specific manner using activation hooks.
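For instance, a rough sketch of such an activation hook, assuming your modules live in the hypothetical /some/path/to folder used above (the hook files are ones you create yourself):
# run these once while the environment is active
mkdir -p "$CONDA_PREFIX/etc/conda/activate.d" "$CONDA_PREFIX/etc/conda/deactivate.d"

# contents of $CONDA_PREFIX/etc/conda/activate.d/env_vars.sh
export PYTHONPATH="/some/path/to:$PYTHONPATH"

# contents of $CONDA_PREFIX/etc/conda/deactivate.d/env_vars.sh
# (this simple sketch assumes PYTHONPATH is not needed outside the env)
unset PYTHONPATH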

If you're using Python on Windows, then it may be best to resolve this by adding PYTHONPATH to the system variables, or if it's already set, then you may want to append the folder path to the variable.
If you don't already know how to access the system variables, or are unsure how to adjust PYTHONPATH for your needs, then check out this StackOverflow question, which has several in-depth answers explaining how to set/update PYTHONPATH on Windows:
How to add to the PYTHONPATH in Windows, so it finds my modules/packages?

Related

pip Virtualenv vs. directory in project

Why does pip need to use virtual environments to isolate packages per-project, instead of just installing them in a default directory in the project? This seems like added complexity without benefit.
NPM, for example, installs packages in the <project_root>\node_modules by default. No virtual environment necessary, and packages are still installed in a project-independent way.
Edit: To be clear, I'm interested in the practical advantages to pip's use of virtual environments over package management systems like NPM, Nuget, and Webpack, which all use directories in the project. Otherwise, if this is just a limitation of Python's modules system, then I'd be interested to know that too.
Because Python's module system doesn't work that way. If pip were to install, say, requests by just downloading it to a python_modules directory, that wouldn't be enough for import requests to work; it would have to be import python_modules.requests, but then we'd still have problems whenever requests tried to import one of its dependencies, as that would need python_modules prepended too, and it'd just be a big mess. The solution that virtual environments use is to give each environment its own interpreter and site-packages directory on the module search path, plus some extra machinery to take care of executable scripts and not importing packages from outside the virtualenv.
I think maybe you don't know what a virtual environment actually is.
If you were to put some module in a project-specific directory, like myproj/modules, then you would have to add myproj/modules to the search path that Python uses so that your module can be found. One way to do that is to define or modify the environment variable PYTHONPATH. Any directories listed in that variable will be searched for modules, in addition to a hard-coded set of directories.
$ export PYTHONPATH=./myproj/modules
However, that's really all a virtual environment is. The directory contains the desired version of Python, along with whatever modules you want to use. The activate script you run to "enable" a virtual environment does little more than adjust PATH (and a few related variables) so that anytime you run python, the correct interpreter is used and your project-specific set of modules is found in place of any global library.
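You can verify that for yourself; the directory name myproj/venv below is only illustrative:
# create and activate a throwaway virtualenv
virtualenv myproj/venv
source myproj/venv/bin/activate

# python now resolves to the environment's own interpreter...
which python

# ...and the module search path points at the environment's site-packages
python -c "import sys; print(sys.prefix); print(sys.path)"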

Anaconda: Permanently include external packages (like in PYTHONPATH)

I know how to install packages in Anaconda using conda install and also how to install packages that are on PyPi which is described in the manual.
But how can I permanently include packages/folders into the PYTHONPATH of an Anaconda environment so that code that I am currently working on can be imported and is still available after a reboot?
My current approach is to use sys:
import sys
sys.path.append(r'/path/to/my/package')
which is not really convenient.
Any hints?
I found two answers to my question in the Anaconda forum:
1.) Put the modules into site-packages, i.e. the directory $HOME/path/to/anaconda/lib/pythonX.X/site-packages, which is always on sys.path. This should also work by creating a symbolic link.
2.) Add a .pth file to the directory $HOME/path/to/anaconda/lib/pythonX.X/site-packages. This can be named anything (it just must end with .pth). A .pth file is just a newline-separated listing of the full path-names of directories that will be added to your path on Python startup.
Alternatively, if you only want to link to a particular conda environment then add the .pth file to ~/anaconda3/envs/{NAME_OF_ENVIRONMENT}/lib/pythonX.X/site-packages/
Both approaches are straightforward, and I went for the second option as it is more flexible (a minimal sketch of it appears at the end of this answer).
*** UPDATE:
3.) Use conda develop, i.e. conda-develop /path/to/module/, to add the module; this creates a .pth file as described under option 2.).
4.) Create a setup.py in the folder of your package and install it using pip install -e /path/to/package, which is the cleanest option from my point of view because you can also see all installations using pip list. Note that the option -e allows you to edit the package code. See here for more information.
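For illustration, a minimal sketch of option 2.); the environment name, Python version, and package path are placeholders:
# each line of a .pth file in site-packages is added to sys.path at startup
echo "/path/to/my/package" > ~/anaconda3/envs/myenv/lib/python3.9/site-packages/extra_paths.pth

# check that the directory is now picked up
conda activate myenv
python -c "import sys; print('/path/to/my/package' in sys.path)"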
Thanks anyway!
I'm able to include local modules using the following:
conda-develop /path/to/module/
I hope it helps.
The way I do this, which I believe is the most native to conda, is by creating env_vars.sh files in my environment, as per the official documentation here.
For macOS and Linux users, the steps are as follows:
Go to your environment folder (e.g. /miniconda1/env/env_name). $CONDA_PREFIX is the environment variable holding your environment path.
cd $CONDA_PREFIX
Create the activate.d and deactivate.d directories.
mkdir -p ./etc/conda/activate.d
mkdir -p ./etc/conda/deactivate.d
Inside each respective directory, create an env_vars.sh file. The one in the activate.d directory will set (or export) your environment variables when you conda activate your environment. The file in the deactivate.d directory will unset the environment variables when you conda deactivate your environment.
touch ./etc/conda/activate.d/env_vars.sh
touch ./etc/conda/deactivate.d/env_vars.sh
First edit the $CONDA_PREFIX/etc/conda/activate.d/env_vars.sh to export the desired environment variables.
#!/bin/sh
export VAR_A='some-thing-here'
export VAR_B=/path/to/my/file/
Afterwards, edit $CONDA_PREFIX/etc/conda/deactivate.d/env_vars.sh to unset the environment variables when you conda deactivate, like so:
#!/bin/sh
unset VAR_A
unset VAR_B
Again, the source of my description comes straight from the conda docs here.
Just to add to Cord Kaldemeyer's answer above, for the 2nd option: if you only want to link to a particular conda environment, then add the .pth file to ~/anaconda3/envs/{NAME_OF_ENVIRONMENT}/lib/pythonX.X/site-packages/

What is "revirtual" in this answer?

In this question on S/O:
Can existing virtualenv be upgraded gracefully?
The accepted answer says that you can:
use the Python 2.6 virtualenv to "revirtual" the existing directory
I cannot seem to find any details on how to "revirtual" an existing virtualenv. I know how to manually install python, but I am looking very specifically for the "python / virtualenv" way to upgrade python inside a specific virtualenv.
In my very specific situation, I have become an administrator in someone else's stead. The system Python is 2.6.6, however the virtualenvs are all using 2.7.4 within the virtualenv path, which is something like:
/home/user_name/.virtualenvs/application_name/bin/python2.7
Yet there is no Python 2.7.x on the system. I cannot find any evidence of Python having been manually installed, and I cannot find any details online about using pip or apt-get/yum or anything else to install different versions of Python within a virtualenv.
So, my very specific questions are:
How does one "revirtual" a virtualenv?
How does one upgrade python within a virtualenv the "python" way?
Are there alternative ways to manage python versions within virtualenvs?
Thanks, and please let me know if I can clarify my questions in any way!
-S
Usual caveat: make backup copies first. However, the structure created by virtualenv is not that complex, so it should be possible to either find what you need or create a new one and migrate. The path with ~/.virtualenvs in it means it was probably created with virtualenvwrapper so you could read up on that too.
virtualenv makes copies of the executables in the bin directory, which means they don't have to exist elsewhere. In the lib and include directories, however, by default there will be symlinks to items from the "source" Python (unless someone changed that setting). You could do an ls -l in those directories, and maybe you will find where Python 2.7 is installed - or maybe some broken symlinks.
Under lib should be one or more python-<version> directories, with some symlinks (probably) to python standard library stuff and a site-packages directory, which is where your installed packages are.
Now, upgrading. If you try to update the virtualenv "in-place" (i.e. you just run virtualenv --python=python<version> <existing-directory>) then you'll probably run into two issues: (1) the bin/python symlink will not be replaced unless you delete/move it, and (2) the directories under lib are per Python version, so if you don't use 2.7 then the site-packages directory won't automatically carry over -- you'll have to reinstall packages, or go in and move it, which will probably work unless you have compiled binary extensions, or are migrating to 3.x, or something else weird happens.
An alternate, cleaner approach is to create a new, fresh virtualenv, and then reinstall the necessary packages. Find your own app's package(s) first and figure out whether they have a setup.py that works or are just code sitting in a directory. If they don't have a setup.py that correctly brings in all dependencies, then in your existing virtualenv folder you can run ./bin/pip freeze to get a listing of installed packages. You can save this to a file and use pip install -r <filename> on it later.
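A rough sketch of that migration, using the paths from the question (the target directory name and the Python version are assumptions, and python2.7 here assumes you have such an interpreter installed):
# record what the old virtualenv has installed
/home/user_name/.virtualenvs/application_name/bin/pip freeze > requirements.txt

# create a fresh virtualenv against the interpreter you actually want
virtualenv --python=python2.7 /home/user_name/.virtualenvs/application_name_new

# reinstall the recorded packages into the new environment
/home/user_name/.virtualenvs/application_name_new/bin/pip install -r requirements.txt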

How to get virtualenv to use dist-packages on Ubuntu?

I know that virtualenv, if not passed the --no-site-packages argument when creating a new virtual environment, will link the packages in /usr/local/lib/python2.7/site-packages (for Python 2.7) with a newly-created virtual environment. On Ubuntu 12.04 LTS, I have three locations where Python 2.7 packages can be installed (using the default, Ubuntu-supplied Python 2.7 installation):
/usr/lib/python2.7/dist-packages: this has my global installation of ipython, scipy, numpy, matplotlib – packages that I would find difficult and time-consuming to install individually (and all their dependences) if they were not available via the scipy stack.
/usr/local/lib/python2.7/site-packages: this is empty, and I think it will stay that way on Ubuntu unless I install a package from source.
/usr/local/lib/python2.7/dist-packages: this has very important local packages for astronomy, notably those related to PyRAF, STScI, etc., and they are extremely difficult and time-consuming to install individually.
Note that a global directory such as /usr/lib/python2.7/site-packages does not exist on my system. Note also that my global installation of ipython, scipy, etc. lets me use those packages on-the-fly without having to source/activate a virtual environment every time.
Naturally, I now want to use virtualenv to create one virtual environment in my user home directory which I will source/activate for my future projects. However, I would like this virtual environment, while being created, to link/copy all of my packages in locations (1) and (3) in the list above. The main reason for this is that I don't want to go through the pip install process (if it is even possible) to re-install ipython, scipy, the astro-packages, etc. for this (and maybe other) virtual environments.
Here are my questions:
Is there a way for me to specify to virtualenv that I would like it to link/copy packages in these two dist-packages directories for virtual environments that are created in the future?
When I eventually update my global installation of scipy, ipython, etc. in the two dist-packages directories, will this also update/change the packages that my virtual environment uses (and which it originally got during virtualenv creation)?
If I ever install a package from source on Ubuntu, will it go in /usr/local/lib/python2.7/dist-packages, or /usr/local/lib/python2.7/site-packages?
Thanks in advance for your help!
This might be a legitimate use of PYTHONPATH - an environment variable that virtualenv doesn't touch and that uses the same syntax as PATH. In bash, that means something like PYTHONPATH=/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages in a .bashrc or similar. If you followed this path, then taking your three questions in order:
You don't have to tell your virtual environment about this at all, it won't try to change it.
No relinking will be required, and
That will still go wherever it would have gone (pip install always uses /usr/local/lib/python2.7/dist-packages/ on my Ubuntu) if you install them outside of your virtual environment. If you install them from within your virtual environment (while it's activated), then of course they'll be put in the virtual environment.
I'm just getting my head around virtualenv, but there seems to be an easier way than mentioned so far.
Since virtualenv 1.7 --no-site-packages has been the default behavior.
Therefore using the --system-site-packages flag to virtualenv is all that is needed to get dist-packages in your path - if you use the tweaked virtualenv shipped by Ubuntu. (This answer and this one give some useful history). I've tested this and it does work.
$ virtualenv --system-site-packages .
I agree with Thomas here - I can't see any action required in virtualenv to see the effect of updates in dist-packages.
Having tested that with python setup.py install, it does (again as Thomas said) still go to dist-packages. You could change that by building your own python, but that's a bit extreme.
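A quick way to confirm that the global dist-packages are visible from such an environment; numpy is just one of the globally installed packages mentioned in the question, and the ~/venv path is illustrative:
# create an env that can see the system dist-packages, then check
virtualenv --system-site-packages ~/venv
source ~/venv/bin/activate
python -c "import numpy; print(numpy.__file__)"
# expected: a path under /usr/lib/python2.7/dist-packages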
PYTHONPATH works for me.
vim ~/.bashrc
add this line below:
export PYTHONPATH=$PYTHONPATH:/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages
source ~/.bashrc
In your virtualenv's site-packages directory, create a file dist.pth
In the file dist.pth, put the following:
../dist-packages
Now deactivate and activate your virtualenv. You should be set.
What you want to achieve here is essentially to add a specific folder (dist-packages) to the Python search path. You have a number of options for this:
Use a path configuration (.pth) file; its entries will be appended to the system path.
Modify PYTHONPATH (entries from it go to the beginning of the system path).
Modify sys.path directly from your Python script, i.e. append required folders to it.
I think that for this particular case (enabling the global dist-packages folder) the third option is better, because with the first option you have to create a .pth file for every virtualenv you'll be working in (with some external shell script?). It's easy to forget when you distribute your package. The second option requires run-time setup (adding an envvar), which is, again, easy to miss.
Only the third option doesn't require any prerequisites at configure or run time and can be distributed without issues (on the same type of system, of course).
You can use function like this:
def enable_global_distpackages():
    import sys
    sys.path.append('/usr/lib/python2.7/dist-packages')
    sys.path.append('/usr/local/lib/python2.7/dist-packages')
And then, in the __init__.py file of your package, call:
enable_global_distpackages()

Access a Python Package from local git repository

I have a local git repository on my machine, let's say under /develop/myPackage.
I'm currently developing it as a python package (a Django app) and I would like to access it from my local virtualenv. I've tried to include its path in my PYTHONPATH (I'm on a Mac)
export PATH="$PATH:/develop/myPackage"
The directory already contains a __init__.py within its root and within each subdirectory.
No matter what I do, I can't get it to work; Python won't see my package.
The alternatives are:
Push my local change to github and install the package within my virtualenv from there with pip
Activate my virtualenv and install the package manually with python setup.py install
Since I often need to make changes to my code, the last two solutions would require too much work every time, even for a small change.
Am I doing something wrong? Would you suggest a better solution?
Install it in editable mode from your local path:
pip install -e /develop/MyPackage
This actually symlinks the package within your virtualenv so you can keep on devving and testing.
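For example, assuming the project's setup.py declares a package importable as myPackage (the name from the question), you can check the editable install like this:
pip list
# the package should appear in the environment's package list
python -c "import myPackage; print(myPackage.__file__)"
# prints a path under /develop/MyPackage; edits there are picked up immediately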
The example you show above uses PATH, not PYTHONPATH. Generally, the search path used by Python is partially predicated on the PYTHONPATH environment variable (PATH is of little use for this case).
Try this:
export PYTHONPATH=$PYTHONPATH:/develop/myPackage
Though in reality, you likely want it to point to the directory that contains your package (so you can do 'import myPackage' rather than importing things from within the package). That being said, you likely want:
export PYTHONPATH=$PYTHONPATH:/develop/
Reference the python docs here for more information about Python's module/package search path: http://docs.python.org/2/tutorial/modules.html#the-module-search-path
By default, Python uses the packages that it was installed with as its default path, and as a result PYTHONPATH is unset in the environment.
