I have a script that I would like to distribute to other machines, but it depends on a module that is not part of the Python standard library. More specifically, I want to use the pymysql library. On my local machine I can just run pip install pymysql, but I want to know whether there are other options for packaging/distributing my script so that others do not have to run pip install pymysql themselves.
My current error on other machines is:
ImportError: No module named pymysql
You can define dependencies inside your setup.py file. Try doing something like this:
setup(..., install_requires=['pymysql'])
That is, assuming you are using setuptools to package your library. Here is some documentation: https://python-packaging.readthedocs.io/en/latest/dependencies.html
This question may help you some more: How to specify dependencies when creating the setup.py file for a python package
By doing this, if anyone installs your library, it will automatically install all the dependencies too.
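For example, a minimal setup.py along these lines might look as follows (the project name, version and module name below are just placeholders):
from setuptools import setup

setup(
    name='myscript',               # placeholder project name
    version='0.1',
    py_modules=['myscript'],       # the single-file script being distributed
    install_requires=['pymysql'],  # installed automatically alongside your package
)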
If you need to define dependencies for a GitHub repository, you can define a requirements.txt file and keep it with the main code in the root of the repository (not inside any subfolder). This is how yours might look:
pymysql
It's just one line containing the requirement. If you have any more, simply list each one on its own line:
pymysql
numpy
pandas
If anyone clones your repository, they will need to execute the following command in the directory in which they have cloned the files:
python3 -m pip install -r requirements.txt
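If you want everyone to end up with exactly the same versions, you can also pin them in requirements.txt (the version numbers below are only illustrative):
pymysql==1.0.2
numpy==1.24.0
pandas==2.0.0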
I've installed a Python project, and it imports modules (like almost every project). The problem is when I want to install them (because I don't have the modules). For example, the project imports a module called "a", but when I go and install "a" with pip install a, it says: ERROR: Could not find a version that satisfies the requirement a (from versions: none) ERROR: No matching distribution found for a. How can I find out the name of the package that provides the module imported in that Python project?
Edit:
By the way, I just found out that the module the project uses comes in the zip file with the Python project. How can I install it so that it works?
All pip packages are listed on PyPI. If you want to import a module called a inside a Python script, the command to install it can sometimes be pip install b, because the name of the distributed package can differ from the Python import name. The best way to find out how to install it is to look up the PyPI URL of your package. If you google the Python error ModuleNotFoundError: No module named 'dgvd', the PyPI URL usually shows up in the top links.
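Two well-known examples of this mismatch (both are real packages on PyPI):
pip install beautifulsoup4   # imported as: import bs4
pip install Pillow           # imported as: import PIL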
Good practice in a project is to have a text file called requirements.txt, which you create from your shell using this command:
pip freeze > requirements.txt
Then install all the packages at once using:
pip install -r requirements.txt
For installing from zip simply use:
pip install *.zip
or specify the path directly:
pip install <path to .zip>
pip install ./my-archive.zip
The same applies to a tarball or any other format. It can even be a folder. However, it has to include a proper setup.py (or another mechanism) for pip to install it, and pip has to support the packaging format (be it an archive, a networking protocol, a version control system via the git+ prefix, etc.).
pip install ./my-folder
pip install ./
pip install .
pip install ..
etc
If, however, there is no setup.py present, you'll simply need to copy the files to a place where your project/module can find them (or point PYTHONPATH or sys.path at that folder) to be able to import them.
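As a rough sketch, adding such a folder to sys.path at runtime could look like this (the path and module name are hypothetical):
import sys
sys.path.insert(0, '/path/to/unzipped/project')   # hypothetical folder containing the copied files

import some_module   # hypothetical name of the module shipped with the project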
I had to manually build a package and copy it to the site-packages directory. When I type pip list into a console it isn't listed, though I can use it in Python scripts. How can I make pip aware of the package?
Installing it via pip is not an option.
You say "Installing it via pip is not an option.", but I'm assuming installing it via pip using a local copy still is. If so, the way to do that is to clone your library into a directory (say /my/lib/dir), where the root of the source for the root package appears below /my/lib/dir (ex: if the package you want to install is imported as import foo, then you should have /my/lib/dir/foo). If there is no file named setup.py in your copy of the code, then you need to create a simple one. Something like
# in a file called setup.py above the `foo` directory
from distutils.core import setup

setup(name='foo',
      version='1.0',
      packages=['foo'],
      )
Finally, run pip install . from /my/lib/dir.
It's definitely a hack, but making pip aware of a package without installing it via pip is asking for a hack :-)
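Once installed that way, pip should know about the package; you can verify it with something like:
pip show foo
pip list | grep foo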
What I have:
local Python 3 files that I want to turn into a module called test_module
a test_module folder containing an empty __init__.py, a setup.py file (see below) and subdirectories with several source files
What I want:
1. continuously work on and improve test_module locally
2. have an easy way to install test_module and all its dependencies locally in my own virtual environment (created using python3 -m venv my_environment)
3. run files that make use of the module via python myexample.py, without having to adapt my local PYTHONPATH variable each time I enter or exit my_environment
4. share my Python code with others via git, and allow them to install the code locally on their machines using the same procedure (as simple as possible)
5. learn best practices on how to create my own module
How I'm doing it at the moment:
pip freeze > requirements.txt and pip install -r requirements.txt for installing dependencies
adding export PYTHONPATH="${PYTHONPATH}:." to my_environment/bin/activate, to have my own module in the search path
(as found here: How do you set your pythonpath in an already-created virtualenv?)
I'd like to know if there are "cleaner" solutions based on setup.py, possibly involving something like pip install ./test_module or similar, that take care of points 2 and 3 automagically.
My current setup.py file looks as follows:
from setuptools import setup

setup(
    name='test_module',
    version='0.1',
    description='Some really good stuff, that I am still working on',
    author='Bud Spencer',
    author_email='bud.spencer@stackoverflow.com',
    packages=['test_module'],  # same as name
    install_requires=['numpy', 'scipy', 'sklearn', 'argparse'],  # external packages as dependencies
)
It sounds like you want to run pip install -e <path/url> from within your virtual env, which will install a package (with a setup.py file as you have) from either a local path or a Git repo. See https://pip.pypa.io/en/stable/reference/pip_install/#vcs-support for an explanation on the syntax of the latter.
Example:
pip install -e git+https://github.com/me/test_module/#egg=test-module
If you have already installed it and want to pull the latest code from the repo, add the --upgrade switch to the above.
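For the purely local workflow, the same editable mechanism works with a path instead of a URL; from inside your activated my_environment you could run (the path is a placeholder):
pip install -e /path/to/test_module
This installs test_module together with its install_requires dependencies and puts it on the module search path, so python myexample.py works without touching PYTHONPATH.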
I am working in a team and wrote some Python code that uses libraries that need to be installed separately (because they are not part of the standard Python distribution). How should I specify those? What is the right/correct/Pythonic way to do this?
I personally use pip install -r requirements.txt
https://pip.pypa.io/en/latest/user_guide.html#requirements-files
Check out the tool called pip. It's what most Python projects use these days.
Typically, one would do the following (for example, we want to install the requests package for our new project):
pip install requests
and then
pip freeze > requirements.txt
Now, we have installed requests on our system and saved the dependency version to a file which we can distribute with our project.
At this point, requirements.txt contains:
requests==2.7.0
To install the same set of requirements (in our case only the requests package) on some other system, one would do the following:
pip install -r requirements.txt
You need to make a setup.py file for your package that specifies required packages. You may need to reorganize your file structure and include data and a manifest.
Then create a distribution of your package, e.g. a wheel file.
Then, when someone runs pip install your_package_distro.whl, pip will determine whether your package's dependencies are met and install any missing ones from PyPI, unless you specify another package source (e.g. https://pypi.anaconda.org/).
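As a rough sketch, building and installing such a wheel could look like this (the package and file names are placeholders):
pip install wheel                      # the bdist_wheel command needs the wheel package
python setup.py bdist_wheel            # writes the wheel into ./dist/
pip install dist/your_package-0.1-py3-none-any.whl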
Read through the following references to distribute your code:
Python 2.7.10 documentation - Distributing Python Modules, Section 2: Writing the Setup Script
Setuptools - Building and Distributing Packages with Setuptools
Hitchhiker's Guide to Packaging
Hitchhiker's Guide to Python - Packaging your Code
I'm new to Python so this may sound silly.
I want to use a Python library I've found on GitHub, let's say https://github.com/praw-dev/praw, and I want to be able to do git pull in the future to pull the latest commits.
Question: Should I git clone <git url> into the project directory, delete everything except the praw directory, and then do an import praw in my Python script?
In IPython,
import praw
gives the error ImportError: No module named praw
Directory Structure
~\myProject\
    praw\
    myNotebook.ipynb
Actually, if a given package is not on PyPI (or you want a specific branch), you can still install it through pip from GitHub with:
pip install git+https://github.com/[repo owner]/[repo]#[branch name]
And for your problem it would be (although @pandita's answer is correct for the normal use case):
pip install git+https://github.com/praw-dev/praw.git
For more information, check pip's documentation on VCS support.
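The same Git URL syntax also works inside a requirements.txt file, optionally pinned to a branch, tag or commit, e.g.:
git+https://github.com/[repo owner]/[repo].git@[branch, tag or commit]#egg=[package name]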
There is an experimental Python module finder/loader that imports straight from GitHub, similar to how Go does it.
In Go we can import like this:
import "github.com/parnurzeal/gorequest"
But in Python we have to install the package by hand:
pip install requests
And import it like:
import requests
But with this magic package and the power of PEP 302 we can do it automatically:
from github_com.kennethreitz import requests
assert requests.get('https://github.com/nvbn/import_from_github_com').status_code == 200
Installation
You should have git, Python 3.2+ and pip:
pip install import_from_github_com
Reference: https://github.com/nvbn/import_from_github_com
Just clone the files into any directory on your Python path and then build the library, typically with python setup.py install from the command line.
I typically clone a library from git into my site-packages folder (the folder that holds all of your pip-installed packages). From there you can pull and then build the library from git just like any other git repo. Having the files there is nice because all of your libs are in one place on your Python path.
You might want to consider using pip instead of git to install and upgrade the package (that is unless you have a pressing reason to use git).
pip install praw
To update the package you can do
pip install --upgrade praw
Also have a look at the pip documentation for further information on how to use pip.