I am using some custom modules, not available on PyPI. Is it possible to manage the dependency through virtualenv?
Yes. pip can install packages from:
- PyPI (and other indexes) using requirement specifiers.
- VCS project URLs.
- Local project directories.
- Local or remote source archives.
So all you have to do is provide the location of the module (a VCS URL or a local directory) in the requirements.txt file and run pip install -r requirements.txt after activating the virtualenv, and it'll work. More examples can be found in the pip documentation.
Just keep in mind that pip will build and install your custom module after downloading and extracting it (historically by running python setup.py install), so you must package your module to support that.
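For example, a requirements.txt mixing PyPI, VCS, and local sources might look like this (all names, URLs, and paths below are hypothetical):

```
# requirements.txt (hypothetical entries)
requests==2.7.0                                # from PyPI
git+https://github.com/user/custom-module.git  # from a VCS URL
./vendor/custom-module/                        # local project directory
./vendor/custom-module-1.0.tar.gz              # local source archive
```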
I've installed a Python project, and it imports modules (like almost every project). The problem is that I have to install them (because I don't have the modules). For example, the project imports a module called "a", but when I go and install "a" with pip install a, it says: ERROR: Could not find a version that satisfies the requirement a (from versions: none) ERROR: No matching distribution found for a. How can I find out the name of the package that provides a module imported by a Python project?
Edit:
By the way, I just found out that the module the project uses comes in the zip with the Python project itself. How can I install it so that it works?
All pip packages are listed on PyPI. If you want to import a module called a inside a Python script, the command to install it can sometimes be pip install b, because the name of the distributed package can differ from the Python import name. The best way to find how to install it is to get the PyPI URL of your package: if you google the error ModuleNotFoundError: No module named 'dgvd', the PyPI URL usually shows up in the top links.
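As a quick illustration of the mismatch, the mapping below lists a few well-known real examples; the helper function itself is just for illustration:

```python
# Import name on the left, PyPI distribution name (what you pip install)
# on the right -- these are well-known examples of the mismatch.
IMPORT_TO_PYPI = {
    "bs4": "beautifulsoup4",
    "PIL": "Pillow",
    "cv2": "opencv-python",
    "yaml": "PyYAML",
}

def pip_name(import_name):
    """Return the distribution name to `pip install` for an import name."""
    return IMPORT_TO_PYPI.get(import_name, import_name)

print(pip_name("bs4"))       # beautifulsoup4
print(pip_name("requests"))  # requests (the names often do match)
```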
A good practice in a project is to have a text file called requirements.txt that you create in bash using this command:
pip freeze > requirements.txt
Then install all packages at once using:
pip install -r requirements.txt
For installing from zip simply use:
pip install *.zip
or specify the path directly:
pip install <path to .zip>
pip install ./my-archive.zip
The same applies to a tarball or any other supported format. It can even be a folder. However, it has to include a proper setup.py (or another build mechanism) for pip to install it, and pip has to support the packaging format, be it an archive, a networking protocol, or a version control system (e.g. the git+ prefix).
pip install ./my-folder
pip install ./
pip install .
pip install ..
etc
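For the folder case, a minimal setup.py is enough for pip to install it. Here is a hypothetical sketch for a project consisting of a single module file (both the project name and the module name are made up):

```python
# setup.py (hypothetical) -- makes `pip install .` work for a project
# consisting of a single module file, mymodule.py, in the same folder.
from setuptools import setup

setup(
    name="mymodule",
    version="0.1.0",
    py_modules=["mymodule"],
)
```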
If, however, there is no setup.py present, you'll need to simply copy the files to wherever your project/module resides (or add that folder to PYTHONPATH or sys.path) to be able to import them. See the related questions on the module search path for more.
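The sys.path approach can be sketched as a runnable demo: drop a module's source into a folder, put that folder on sys.path, and import it. The same technique works for files copied out of a project archive; all names below are made up.

```python
import os
import sys
import tempfile

# Create a throwaway module in a temporary folder to stand in for
# source files copied out of a project archive.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "mymod.py"), "w") as f:
    f.write("def greet():\n    return 'hello'\n")

# Add the folder to the module search path, then import normally.
sys.path.insert(0, tmp)
import mymod

print(mymod.greet())  # hello
```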
I have some local packages hosted on my own machine, and I would like to include a copy of them in the distribution of other packages that depend on them. When installing a local package, pip freeze shows something like
public-package==3.0.1
local-package @ file:///home/user/local-package/dist/local-package-1.0.0.tar.gz
but if one tries to install that package on another computer, pip will raise an error because the local-package path does not exist. Can I extend the setup.py commands to process that requirements.txt file, extract the local package paths, copy the local packages into a deps folder of the dist archive, and rewrite requirements.txt like
public-package==3.0.1
local-package @ deps/local-package-1.0.0.tar.gz
and make pip treat deps/ as a relative path to the package archive itself?
I managed to do it with source distributions, overriding the sdist and egg_info commands to make setuptools bundle local dependencies together with the package and to make pip search for dependencies in that bundle when installing the built package later. But I later figured out that this makes the system vulnerable to dependency confusion attacks, because local packages installed from that bundle are visible in pip freeze: if for some reason the dependency location, like local-package @ file:///home/user/packages/local-package.tar.gz, is stripped to just local-package, pip will search for it on PyPI, which allows dependency confusion to happen.
The best solution to this problem is to vendor all local dependencies, i.e. copy their source code into the dependent package itself; pip itself vendors its dependencies using the vendoring tool.
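Vendoring can be sketched as a small runnable demo (all names here are hypothetical): the dependency's source is copied under the package itself (mypkg/_vendor/) and imported relatively, so it never shows up in pip freeze as a separate package.

```python
import os
import sys
import tempfile

# Build a tiny vendored layout on disk:
#   mypkg/__init__.py            -> imports the vendored dependency
#   mypkg/_vendor/__init__.py
#   mypkg/_vendor/local_package.py
root = tempfile.mkdtemp()
vendor = os.path.join(root, "mypkg", "_vendor")
os.makedirs(vendor)

with open(os.path.join(root, "mypkg", "__init__.py"), "w") as f:
    f.write("from ._vendor import local_package\n")
with open(os.path.join(vendor, "__init__.py"), "w") as f:
    f.write("")
with open(os.path.join(vendor, "local_package.py"), "w") as f:
    f.write("VERSION = '1.0.0'\n")

# The vendored copy is reached through a package-relative import,
# so pip never sees it as an installable requirement.
sys.path.insert(0, root)
import mypkg

print(mypkg.local_package.VERSION)  # 1.0.0
```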
I have a Python project hosted on PyPI. I've discovered I have some dependency conflicts that need to be pinned. I know pip looks at the install_requires key in setup.py for dependencies, but I've read it's best to place pinned dependencies in a requirements.txt file. I've included this file (see below), generated with pip freeze, but I am unsure whether pip install project is sufficient to install the dependencies as well.
# requirements.txt
numpy==1.9.2
pandas==0.16.2
I would like to make the simplest installation process for the user. For a package hosted on PyPI:
How do I set up the requirements so that the user can simply pip install the project and all of its pinned dependencies are installed automatically (similar to conda)?
Must install_requires=['numpy', 'pandas'] be included? If so, how do I best set it up to install the pinned versions only?
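For reference, dependencies that pip should install automatically on pip install project go into install_requires. A minimal sketch of such a setup.py, assuming the pins from the question (this is not the project's actual file), could look like:

```python
# setup.py (hypothetical sketch) -- pip reads install_requires from here,
# so `pip install project` pulls in these pinned versions automatically.
from setuptools import setup

setup(
    name="project",
    version="1.0.0",
    install_requires=[
        "numpy==1.9.2",
        "pandas==0.16.2",
    ],
)
```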
I'm trying to install the module mockupbase in order to import HTMLParser for my Flask web app. Mockupbase is not a package/module on the Python Package Index, so pip install doesn't work in my Visual Studio development environment. The only resource I could find online about installing third-party libraries was http://flask.pocoo.org/docs/0.10/extensiondev/, but that page says extensions must be registered on the Python Package Index. I feel like there should be an alternative route to installing third-party packages without registering them on the Python Package Index. I'm familiar with installing packages on my local computer, but am not sure how to do this in my Flask web project.
How do I install a third party python module/library not registered on Python Package Index for my flask web application
It seems that we cannot install the mockupbase module in VS using pip or easy_install.
However, I have installed a custom module using the following steps; you can try it.
For example, I create a hello.py file and store it in the C:/Python folder.
Then, I can use it via this method:
import sys
sys.path.append('C:/Python')  # add the folder containing hello.py to the module search path
import hello
hello.hello()  # hello, world
For this issue, I recommend you refer to this document: The Module Search Path. You can also see the related post for more details.
You can specify a URL or a local file path instead of a package name. Given a URL or a file path, pip will try to download it, unpack it, and install it.
According to Installing Packages - User Guide - pip documentation,
pip supports installing from PyPI, version control, local projects,
and directly from distribution files.
If you have multiple packages, you can follow Fast & Local Installs:
Often, you will want a fast install from local archives, without
probing PyPI.
First, download the archives that fulfill your requirements (note that recent pip versions replaced pip install --download with the pip download command):
$ pip download --dest <DIR> -r requirements.txt
Then, install using --find-links and --no-index:
$ pip install --no-index --find-links=[file://]<DIR> -r requirements.txt
I am working on a team and wrote some Python code that uses libraries that need to be installed separately (because they are not part of the standard Python distribution). How should I specify those? What is the right/correct/pythonic way to do this?
I personally use pip install -r requirements.txt
https://pip.pypa.io/en/latest/user_guide.html#requirements-files
Check out the tool called pip. It's what most Python projects use these days.
Typically, one would do the following (for example, we want to install the requests package for our new project):
pip install requests
and then
pip freeze > requirements.txt
Now, we have installed requests on our system and saved the dependency version to a file which we can distribute with our project.
At this point, requirements.txt contains:
requests==2.7.0
To install the same set of requirements (in our case only the requests package) on some other system, one would do the following:
pip install -r requirements.txt
You need to make a setup.py file for your package that specifies the required packages. You may need to reorganize your file structure and include your data files and a manifest (MANIFEST.in).
Then create a distribution of your package, e.g. a wheel file.
Then, when you run pip install your_package_distro.whl, pip will determine whether your package's dependencies are met and install any missing ones from PyPI, unless you specify another package source (e.g. https://pypi.anaconda.org/).
Read through the following references to distribute your code:
Python 2.7.10 documentation - Distributing Python Modules, Section 2: Writing the Setup Script
Setuptools - Building and Distributing Packages with Setuptools
Hitchhiker's Guide to Packaging
Hitchhiker's Guide to Python - Packaging your Code