I've written a Python script that uses steampy.
I cloned that library to a local folder, but now I don't know how to make my script use the local copy instead of the installed one.
I'm coming from Angular, where this is achievable by linking the two libraries with npm link.
Also, in my local steampy, all imports referring to steampy error out, for example:
from steampy.exceptions import ApiException, ...
No name 'exceptions' in module 'steampy.exceptions' pylint(no-name-in-module)
Unable to import 'steampy.exceptions' pylint(import-error)
If you are working in a virtualenv, you can just try:
pip install -e <path to the lib>
The -e flag makes the install editable, meaning that any changes you make in the steampy repo are immediately available in the virtualenv.
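A quick way to confirm the editable install took effect is to check where the package resolves from (a small sketch; the printed path should be inside your clone):

import steampy
# After pip install -e, this should print a path inside your local clone,
# not inside site-packages.
print(steampy.__file__)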
Simply put the steampy folder in the same directory as your script:
steampy/
main.py
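This works because Python puts the script's own directory at the front of the module search path, so the adjacent steampy folder shadows the installed package. A quick way to see it:

import sys
# sys.path[0] is the directory containing main.py, which is searched
# before site-packages.
print(sys.path[0])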
Related
I would like to make changes to (and possibly contribute to, if it's any good) a public project on GitHub. I've forked and cloned the module, but I'm unclear how to get my program to import the local library instead of the 'official' installed module.
I tried cloning it into my project folder, but when I imported it and tried to use it, things got weird (I ended up having to call calmap.calmap.plot()).
I also tried doing sys.path.append with the folder location, but it seems to still import the official one instead of the fork.
I'm assuming that I could put my program inside the module folder so that the module would be found first, but I can't imagine that's the 'correct' way to do it.
My_Project_Folder/
    |-->Forked_Module/
        |-->docs/
        |-->Forked_Module/
            |-->__init__.py
If you're already using anaconda, then you can create a new environment just for the development of this feature.
First, create a new environment:
# develop_lib is the name of the environment.
# You can pick anything that is memorable instead.
# You can also use whatever python version you require ...
conda create -n develop_lib python=3.5
Once you have the environment, then you probably want to enter that environment in your current session:
source activate develop_lib
Ok, now that you have the environment set up, you'll probably need to install some requirements for whatever third-party library you're developing. I don't know what those dependencies are, but you can install them in your environment using conda install (if they're available) or pip.
Now you're ready to start working on the library you want to update. python setup.py develop should be available, assuming the package has a standard build process. After you've run that, things should be good to go: you can make changes, run tests, etc.
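Putting those steps together, a rough sketch (the dependency name and repo path are placeholders for whatever your library actually needs):

# inside the activated develop_lib environment
conda install requests        # hypothetical dependency; use pip if unavailable
cd /path/to/cloned/library    # the checkout containing setup.py
python setup.py develop       # links the checkout into the environment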
If you use sys.path.append(), the new path is only searched if none of the earlier entries contains the module you are importing. If you want the added path to take precedence over all the existing ones, you have to use
sys.path.insert(0, "path")
This way, if you print sys.path you will see that the added path is at the beginning of the list, and the module you are importing will be loaded from the path you specified.
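For example, a small sketch (the path is a placeholder for your fork's location):

import sys

# append() would put the fork at the end of sys.path, so an installed copy
# earlier in the list would still win; insert(0, ...) puts it first instead.
sys.path.insert(0, "/path/to/Forked_Module")

import calmap
print(calmap.__file__)  # confirms the fork is the copy actually loaded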
To import from the forked repo instead of the installed Python package, you should make a virtual environment for the cloned project and then activate it; that way the environment is isolated from the globally installed packages.
1- fork the repo;
2- create a virtual env and activate it;
3- clone your fork into it.
Now if you print an imported module, you will see the path of the forked repo:
import any_module
print(any_module)  # the repr includes the file it was loaded from
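You can also confirm that the virtual environment itself is active (a quick check):

import sys
print(sys.prefix)  # points inside the virtualenv while it is activated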
I have a cluster system at work with Python and some modules installed on it. However, I wanted to use the most up-to-date version of a module (it has several methods not present in older versions), so I built it and its dependencies locally in the area I have access to:
# From my home directory: /gpfs/env/yrq12edu
# Get the source I need for the up to date version of the module I want to install locally.
svn co svn://svn.code.sf.net/p/simupop/code/trunk simuPOP
# Install PCRE stuff...
cd pcre-8.34
./configure --prefix=/gpfs/env/yrq12edu/pcre_install
make
make install
export PATH=/gpfs/env/yrq12edu/pcre_install/bin:$PATH
export LD_LIBRARY_PATH=/gpfs/env/yrq12edu/pcre_install/lib:$LD_LIBRARY_PATH
cd ..
# Install Swig Stuff...
cd swig-3.0.0
./configure --prefix=/gpfs/env/yrq12edu/swig_install
make
make install
export PATH=/gpfs/env/yrq12edu/swig_install/bin:$PATH
cd ..
export PYTHONPATH=/gpfs/env/yrq12edu/PythonModules/lib/python2.7/site-packages
# Build the up to date simuPOP module I need locally...
cd simuPOP
python setup.py install --prefix=/gpfs/env/yrq12edu/PythonModules
How can I ensure that when I execute my Python scripts on the cluster, they will use my local module rather than the system one? I have obviously changed PYTHONPATH during the build process, which I know should allow modules to be loaded locally, but I wondered which one will be loaded when there is a choice between the system-installed old version and my new locally built version. Will Python just know to favour the local one, or do I have to specify some option to force it?
Thanks,
Ben W.
According to the docs, Python will load a built-in module if one exists with that name. If not, it then looks in each path in sys.path (which starts with the current directory).
However, if I'm reading it correctly, standard modules are different from built-in modules: standard modules are found by searching sys.path, so if you put your path at the start of sys.path, Python will load your module instead of the standard one.
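A quick check, using the simuPOP module from the question (the expected path is the local install prefix from the build above):

import simuPOP
# Should print a path under /gpfs/env/yrq12edu/PythonModules, confirming
# the local build shadows the system-wide install.
print(simuPOP.__file__)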
I have a local git repository on my machine, let's say under /develop/myPackage.
I'm currently developing it as a Python package (a Django app) and I would like to access it from my local virtualenv. I've tried to include its path in my PYTHONPATH (I'm on a Mac):
export PATH="$PATH:/develop/myPackage"
The directory already contains a __init__.py within its root and within each subdirectory.
No matter what I do, I can't get it to work; Python won't see my package.
The alternatives are:
Push my local change to github and install the package within my virtualenv from there with pip
Activate my virtualenv and install the package manually with python setup.py install
Since I often need to make changes to my code, those two solutions would require too much work every time, even for a small change.
Am I doing something wrong? Would you suggest a better solution?
Install it in editable mode from your local path:
pip install -e /develop/MyPackage
This effectively links the package into your virtualenv, so you can keep on devving and testing.
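If you want to double-check, pip show reports where the package resolves from; for an editable install, the Location field should point at your source tree:

pip show MyPackage  # Location: should point at /develop/MyPackage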
The example you show above uses PATH, not PYTHONPATH. The search path used by Python is partially built from the PYTHONPATH environment variable (PATH has little use in this case).
Try this:
export PYTHONPATH=$PYTHONPATH:/develop/myPackage
Though in reality, you likely want it pointing to the directory that contains your package (so you can do import myPackage rather than importing things from within the package). That being said, you likely want:
export PYTHONPATH=$PYTHONPATH:/develop/
Reference the python docs here for more information about Python's module/package search path: http://docs.python.org/2/tutorial/modules.html#the-module-search-path
By default, Python uses the packages it was installed with as its default path, and as a result PYTHONPATH is unset in the environment.
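To illustrate the difference between the two exports above (myPackage is the package from the question):

# With PYTHONPATH=$PYTHONPATH:/develop/ (the parent directory):
import myPackage  # works: Python finds the package inside /develop/

# With PYTHONPATH=$PYTHONPATH:/develop/myPackage (the package itself),
# Python would search *inside* the package for top-level modules, so
# "import myPackage" would still fail.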
Yesterday, I edited the bin/activate script of my virtualenv so that it sets the PYTHONPATH environment variable to include a development version of some external package. I had to do this because the setup.py of the package uses distutils and does not support the develop command à la setuptools. Setting PYTHONPATH works fine as far as using the Python interpreter in the terminal is concerned.
However, just now I opened the project settings in PyCharm and discovered that PyCharm is unaware of the external package in question - PyCharm lists neither the external package nor its path. Naturally, that's because PyCharm does not (and cannot reliably) parse or source the bin/activate script. I could manually add the path in the PyCharm project settings, but that means I have to repeat myself (once in bin/activate, and again in the PyCharm project settings). That's not DRY and that's bad.
Creating, in site-packages, a symlink that points to the external package is almost perfect. This way, at least the source editor of PyCharm can find the package and so does the Python interpreter in the terminal. However, somehow PyCharm still does not list the package in the project settings and I'm not sure if it's ok to leave it like that.
So how can I add the external package to my virtualenv/project in such a way that…
I don't have to repeat myself; and…
both the Python interpreter and PyCharm would be aware of it?
Even when a package is not using setuptools, pip monkeypatches setup.py to force it to use setuptools.
Maybe you can remove that PYTHONPATH hack and pip install -e /path/to/package.
One option is to add the path dynamically:
import sys

try:
    import foo
except ImportError:
    # Fall back to the local checkout if foo is not installed
    sys.path.insert(0, "/path/to/your/package/directory")
    import foo
But this is not the best solution, because it is very likely that this code will not make it into the final version of the application. Another (and more appropriate, imho) option is to make a simple setup.py file for the package and deploy it into the virtualenv with the develop command, or via pip with the -e parameter:
python setup.py develop
or:
pip install -e /path/to/your/package/directory
http://packages.python.org/distribute/setuptools.html#development-mode
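If the package has no setup.py yet, a minimal sketch is enough for develop/-e to work (name and version are placeholders; this assumes your code lives in a package directory with an __init__.py):

from setuptools import setup, find_packages

setup(
    name="mypackage",   # placeholder project name
    version="0.1.0",
    packages=find_packages(),
)

With that in place, running pip install -e . from the package directory gives you the editable behaviour described above.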
This is an improvement on ndpu's answer that will work regardless of where the real file is.
You can dereference the symlink and then set sys.path before importing local imports.
import os.path
import sys

# Ensure this file is dereferenced if it is a symlink
if __name__ == '__main__' and os.path.islink(__file__):
    try:
        sys.path.remove(os.path.dirname(__file__))
    except ValueError:
        pass
    sys.path.insert(0, os.path.dirname(os.path.realpath(__file__)))

# local imports go here
I'd like to start developing an existing Python module. It has a source folder and the setup.py script to build and install it. The build script just copies the source files since they're all python scripts.
Currently, I have put the source folder under version control, and whenever I make a change I re-build and re-install. This seems a little slow, and it doesn't sit well with me to "commit" my changes to my Python install each time I make a modification. How can I make my import statement resolve to my development directory?
Use a virtualenv and python setup.py develop to link your module into the virtual Python environment. This will make your project's Python packages/modules show up on sys.path without having to run install.
Example:
% virtualenv ~/virtenv
% . ~/virtenv/bin/activate
(virtenv)% cd ~/myproject
(virtenv)% python setup.py develop
Virtualenv was already mentioned.
And as your files are already under version control you could go one step further and use Pip to install your repo (or a specific branch or tag) into your working environment.
See the docs for Pip's editable option:
-e VCS+REPOS_URL[#REV]#egg=PACKAGE, --editable=VCS+REPOS_URL[#REV]#egg=PACKAGE
Install a package directly from a checkout. Source will be checked out into src/PACKAGE (lower-case) and installed in-place (using setup.py develop).
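For example, a hypothetical invocation (the repository URL and egg name are illustrative):

pip install -e git+https://github.com/youruser/yourproject.git#egg=yourproject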
Now you can work on the files that pip automatically checked out for you and when you feel like it, you commit your stuff and push it back to the originating repository.
To get a good, general overview concerning Pip and Virtualenv see this post: http://www.saltycrane.com/blog/2009/05/notes-using-pip-and-virtualenv-django
Install the distribute package, then use developer mode: just run python setup.py develop --user and that will place path pointers in your user site directory pointing to your workspace.
Change PYTHONPATH to point at your source directory. A good idea is to work with an IDE like Eclipse that lets you override the default PYTHONPATH.