See when packages were installed / updated using pip - python

I know how to see installed Python packages using pip; just use pip freeze. But is there any way to see the date and time when a package was installed or updated with pip?

If it's not necessary to differentiate between updated and installed, you can use the change time of the package file.
Like this for Python 2 with pip < 10:
import pip, os, time
for package in pip.get_installed_distributions():
    print "%s: %s" % (package, time.ctime(os.path.getctime(package.location)))
or like this for newer versions (tested with Python 3.7 and setuptools 40.8, which brings pkg_resources):
import pkg_resources, os, time
for package in pkg_resources.working_set:
    print("%s: %s" % (package, time.ctime(os.path.getctime(package.location))))
The output will look like numpy 1.12.1: Tue Feb 12 21:36:37 2019 in both cases.
Btw: Instead of using pip freeze you can use pip list, which provides some more information, such as outdated packages via pip list -o.
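If you want that outdated-package information programmatically rather than on the console, pip can also emit JSON. A minimal sketch, assuming a recent pip that supports --format=json and reports a latest_version field for outdated packages:
import json
import subprocess
import sys

# Ask pip for outdated packages in machine-readable form.
out = subprocess.check_output(
    [sys.executable, "-m", "pip", "list", "--outdated", "--format=json"]
)
for pkg in json.loads(out):
    # Each entry is expected to carry "name", "version" and "latest_version".
    print(pkg["name"], pkg["version"], "->", pkg["latest_version"])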

Unfortunately, python packaging makes this a bit complicated since there is no consistent place that lists where the package files or module directories are placed.
Here's the best I've come up with:
#!/usr/bin/env python
# Prints when python packages were installed
from __future__ import print_function
from datetime import datetime
import os
import pip

if __name__ == "__main__":
    packages = []
    for package in pip.get_installed_distributions():
        package_name_version = str(package)
        try:
            module_dir = next(package._get_metadata('top_level.txt'))
            package_location = os.path.join(package.location, module_dir)
            os.stat(package_location)
        except (StopIteration, OSError):
            try:
                package_location = os.path.join(package.location, package.key)
                os.stat(package_location)
            except:
                package_location = package.location
        modification_time = os.path.getctime(package_location)
        modification_time = datetime.fromtimestamp(modification_time)
        packages.append([
            modification_time,
            package_name_version
        ])
    for modification_time, package_name_version in sorted(packages):
        print("{0} - {1}".format(modification_time, package_name_version))

Solution 1 : packages.date.py :
import os
import time
from pip._internal.utils.misc import get_installed_distributions

for package in get_installed_distributions():
    print(package, time.ctime(os.path.getctime(package.location)))
Solution 2 : packages.alt.date.py :
#!/usr/bin/env python
# Prints when python packages were installed
from __future__ import print_function
from datetime import datetime
from pip._internal.utils.misc import get_installed_distributions
import os

if __name__ == "__main__":
    packages = []
    for package in get_installed_distributions():
        package_name_version = str(package)
        try:
            module_dir = next(package._get_metadata('top_level.txt'))
            package_location = os.path.join(package.location, module_dir)
            os.stat(package_location)
        except (StopIteration, OSError):
            try:
                package_location = os.path.join(package.location, package.key)
                os.stat(package_location)
            except:
                package_location = package.location
        modification_time = os.path.getctime(package_location)
        modification_time = datetime.fromtimestamp(modification_time)
        packages.append([
            modification_time,
            package_name_version
        ])
    for modification_time, package_name_version in sorted(packages):
        print("{0} - {1}".format(modification_time, package_name_version))
Solution 1 & 2 compatibility:
updated solution for pip v10.x
Python v2, v2.7, v3, v3.5, v3.7

I was recently looking for this too. But although there are many good answers here, the real issue is that since pip does not keep logs by default, we have to resort to using the file creation and modification times, known as ctime and mtime, respectively. (See MAC times.) Unfortunately, using this method has two side effects:
Different OSes and file systems handle the ctime/mtime differently (if they are even available).
Python installations use many different directories, and some remain after installation while others are created on the fly when running, making it hard to know exactly which files to check the dates on.
However, there is a tool called pip-date that tries to combine a few different methods.
pip install pip-date

You could use the --log option:
--log <path> Path to a verbose appending log. This log is inactive by default.
E.g:
$ pip install --log ~/.pip/pip.append.log gunicorn
Or you can set it in your pip.conf to be enabled by default:
[global]
log = <path>
Then all the pip operations will be logged verbosely into the specified file along with a log separator and timestamp, e.g.:
$ pip install --log ~/.pip/pip.append.log gunicorn
$ pip install --log ~/.pip/pip.append.log --upgrade gunicorn
logs the following to ~/.pip/pip.append.log:
------------------------------------------------------------
/usr/bin/pip run on Mon Jul 14 14:35:36 2014
Downloading/unpacking gunicorn
...
Successfully installed gunicorn
Cleaning up...
------------------------------------------------------------
/usr/bin/pip run on Mon Jul 14 14:35:57 2014
Getting page https://pypi.python.org/simple/gunicorn/
URLs to search for versions for gunicorn in /usr/lib/python2.7/site-packages:
* https://pypi.python.org/simple/gunicorn/
...
Requirement already up-to-date: gunicorn in /usr/lib/python2.7/site-packages
Cleaning up...
You could parse out what you need from this log. While not the nicest format, it's a standard pip facility.
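For example, here is a minimal sketch of pulling the run timestamps out of such an append log. It assumes the separator and "run on ..." header lines shown above; the log format is not a stable interface, so treat this as illustrative only:
import os
import re

# The append log path used in the examples above; adjust if yours differs.
log_path = os.path.expanduser("~/.pip/pip.append.log")

with open(log_path) as fh:
    for line in fh:
        # Header lines look like "/usr/bin/pip run on Mon Jul 14 14:35:36 2014"
        match = re.search(r"pip.* run on (.+)$", line)
        if match:
            print(match.group(1).strip())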

I don't know all pip options, but for one module you can get a list of its files and then check their dates using Python or bash (a small sketch for the date-checking part follows after the example output below). For example, the list of files in the requests module:
pip show --files requests
result:
Name: requests
Version: 2.2.1
Location: /usr/local/lib/python2.7/dist-packages
Requires:
Files:
../requests/hooks.py
../requests/status_codes.py
../requests/auth.py
../requests/models.py
etc.
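As a rough sketch of the date-checking part, one could combine pip show --files with os.stat. This assumes pip show --files output with a Location: line and a Files: section; note that recent pip versions list the files relative to Location, while older versions (as in the output above) list them relative to the package metadata directory, so the path joining may need adjusting:
import os
import subprocess
import sys
import time

package = "requests"  # example package
output = subprocess.check_output(
    [sys.executable, "-m", "pip", "show", "--files", package]
).decode()

location = None
in_files = False
for line in output.splitlines():
    if line.startswith("Location:"):
        location = line[len("Location:"):].strip()
    elif line.startswith("Files:"):
        in_files = True
    elif in_files and line.strip():
        path = os.path.normpath(os.path.join(location, line.strip()))
        if os.path.exists(path):
            print(path, time.ctime(os.path.getmtime(path)))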
BTW: you can use --help to see more options for some functions
pip --help
pip list --help
pip show --help
etc.

pip freeze gives you all the installed packages. Assuming you know the folder:
time.ctime(os.path.getctime(file))
should give you the creation time of the file, i.e. the date when the package was installed or updated.

Related

How to get a list of version numbers for python packages released up until a specific date?

Consider having a python requirements.txt file with a list of (un-versioned) dependencies (python packages). After you install them (e.g. pip install -r requirements.txt) you can call pip freeze and get a (versioned) list of all installed python packages.
This will be a snapshot of the python package versions (and their dependencies) available at the time. What I need to generate is this same list, but for a date in the past (let's say 2018-06-12).
I guess technically, I only need to find the released versions for all packages contained in the requirements.txt file.
Ideally, there would be a command pip install -r requirements.txt --before 2018-06-21 and then just call pip freeze, but I didn't see anything like that in pip install --help. I did see a way to specify another --index-url and I could imagine if there was an archived index from that date, I could point pip to that and it should work?
There is also a --constraint option, which:
Constrain versions using the given constraints file
But I'm guessing I would already have to have the date-constraint versions in that case?
From your question, if I get it right, you want to install dependencies with the following command:
pip install -r requirements.txt --before 2018-06-21
which would require patching pip itself in order to add a --before option to supply the target date.
The code below is the second best thing. At the moment it is a rough sketch, but it does what you need, well, almost: instead of generating requirements.txt, it outputs to the console the packages with the latest version up until the supplied date, in the format:
$ pipenv run python <script_name>.py django click --before 2018-06-21
pip install django==2.0.6 click==6.7
It's not exactly what you had in mind, but very close to it. Feel free to change it for your needs, by adding (or not) a -r option and outputting every dependency on a new line; with output redirection it would look something like this:
$ pipenv run python <script_name>.py django click --before 2018-06-21 >> requirements.txt
Code (or just use the link to the gist):
import sys
from datetime import datetime

import click
import requests
from bs4 import BeautifulSoup

PYPI_URL = "https://pypi.org/project/{project_name}/#history"


def get_releases(request):
    soup = BeautifulSoup(request, 'html.parser')
    releases = list()
    for release in soup.find_all('div', class_='release'):
        release_version = release.find('p', class_='release__version').text.strip()
        if not is_numeric(release_version):
            continue
        release_date = try_parsing_date(release.find('time').text.strip())
        releases.append({'version': release_version, 'date': release_date})
    sorted_packages = sorted(releases, key=lambda s: list(map(int, s['version'].split('.'))))
    return sorted_packages


def is_numeric(s):
    for char in s:
        if not char.isdigit() and char not in [" ", ".", ","]:
            return False
    return True


def try_parsing_date(text):
    for fmt in ('%d.%m.%Y', '%d/%m/%Y', '%b %d, %Y', '%Y-%m-%d'):
        try:
            return datetime.strptime(text, fmt)
        except ValueError:
            pass
    click.echo('Not a valid date format. Try one of these: <31.12.2018>, <31/12/2019> or <2018-12-31>')
    sys.exit(0)


@click.command(context_settings=dict(help_option_names=['-h', '--help']))
@click.option('-b', '--before', help='Get latest package before specified date')
@click.argument('packages', nargs=-1, type=click.UNPROCESSED)
def cli(before, packages):
    target_date = try_parsing_date(before) if before else datetime.today()
    required_packages = list()
    not_found = list()
    for package in packages:
        project_url = PYPI_URL.format(project_name=package)
        r = requests.get(project_url)
        if r.status_code != 200:
            not_found.append(package)
            continue
        releases = get_releases(r.text)
        last_release = None
        for release in releases:
            release_date = release['date']
            if release_date > target_date:
                continue
            if last_release and last_release['date'] >= release_date:
                continue
            last_release = release
        if last_release is None:
            not_found.append(package)
            continue
        required_packages.append({'package': package,
                                  'release_date': last_release['date'],
                                  'release_version': last_release['version']})
    print('pip install ' + ' '.join('{}=={}'.format(p['package'], str(p['release_version'])) for p in required_packages))
    if len(not_found) > 0:
        print('\nCould not find the following packages: {}'.format(' '.join(p for p in not_found)))


if __name__ == '__main__':
    cli()
Required dependencies (Python3):
beautifulsoup4==4.7.1
Click==7.0
requests==2.21.0
Alright, one possible answer (although not a great one) is to just manually go through each dependency in the requirements.txt, look that package up on https://pypi.org and then visit the release history (e.g. https://pypi.org/project/requests/#history). From there it's easy enough to see which version had been released by what date (e.g. https://pypi.org/project/requests/2.19.0/ for requests when targeting 2018-06-12) and then just use that as the version (requests==2.19.0).
A slightly better answer might be to extract that info (maybe via curl or the PyPI JSON API) from PyPI programmatically, extract all version info (including the dates), sort it and pick the right one; a rough sketch of that idea follows.
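A minimal sketch of that idea, using the public PyPI JSON API (https://pypi.org/pypi/<name>/json). The field names used here (releases, upload_time) reflect that API, but the selection is deliberately naive: it picks the version with the newest upload date rather than doing a real version comparison, and the cutoff date is just an example:
import json
from datetime import datetime
from urllib.request import urlopen

def latest_before(package, cutoff):
    """Return the version of `package` with the newest upload on or before `cutoff`."""
    with urlopen("https://pypi.org/pypi/{}/json".format(package)) as resp:
        data = json.load(resp)
    candidates = []
    for version, files in data["releases"].items():
        if not files:
            continue  # versions with no uploaded files carry no date
        uploaded = min(datetime.strptime(f["upload_time"], "%Y-%m-%dT%H:%M:%S")
                       for f in files)
        if uploaded <= cutoff:
            candidates.append((uploaded, version))
    return max(candidates)[1] if candidates else None

print(latest_before("requests", datetime(2018, 6, 12)))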
I found a tool that seems to fulfill your needs (still alpha):
https://pypi.org/project/pypi-timemachine/
As I read from its README, it creates a proxy to pypi.org that applies the date filter.

Python extract imports and download them with pip

I want to get a list of all imports of a (self-written) module and fetch them via pip programmatically. Is there a way to do this?
I thought of analysing the file via open(model.py), extracting the import statements and then subprocessing pip, but is there a better way?
EDIT:
This helps out with PIP:
http://blog.ducky.io/python/2013/08/22/calling-pip-programmatically/
There are two options that I know of.
pigar
pipreqs
Both will pull imports from your project and give you a requirements.txt file that you can use with pip.
You could wrap it in a try/except block, something like:
import pip
while True:
    try:
        import mymodule
        break
    except ImportError as e:
        dependency = str(e).split(" ")[-1]
        if dependency == 'mymodule':
            break
        pip.main(['install', dependency])
my thinking:
try to import - if you don't have the dependencies installed you should get an ImportError
if it fails, the last word of the error message should be the name of the module you need; install it using pip as suggested in the page you linked
you could also get an ImportError if your module itself doesn't exist - so we test for that and break
I can imagine a problem, depending on the pip module (which I haven't used): if a module has a different import name from its pip name, e.g. MySQLdb, which is installed via $ pip install MySQL-python
Building on the answer from Rob, I came to the following solution:
import os
import time
from subprocess import Popen

def satisfy_dependencies(path_to_dir):
    # Generate requirements.txt using pipreqs and then use pip to fetch the requirements
    proc = Popen(["pipreqs", path_to_dir, "--savepath", os.path.join(path_to_dir, "requirements.txt"), "--force"])
    while proc.poll() is None:
        time.sleep(0.1)
    if os.path.exists(os.path.join(path_to_dir, "requirements.txt")):
        pip = Popen(["pip", "install", "-r", os.path.join(path_to_dir, "requirements.txt")])
        while pip.poll() is None:
            time.sleep(0.1)
        os.remove(os.path.join(path_to_dir, "requirements.txt"))
A lot of subprocessing, but it did the job in my case.
I'll be back.

How to tell Python to prefer module from $HOME/lib/python over /usr/lib/python?

In Python, I'm getting an error because it's loading a module from /usr/lib/python2.6/site-packages, but I'd like it to use my version in $HOME/python-modules/lib/python2.6/site-packages, which I installed using pip-python --install-option="--prefix=$HOME/python-modules" --ignore-installed.
How can I tell Python to use my version of the library? Setting PYTHONPATH to $HOME/python-modules/lib/python2.6/site-packages doesn't help, since /usr/lib/... apparently has precedence.
Take a look at the site module for ways to customize your environment.
One way to accomplish this is to add a file called usercustomize.py to a location currently on sys.path; when Python is starting up it will automatically import this file, and you can use it to modify sys.path.
First, set $PYTHONPATH to $HOME (or add $HOME if $PYTHONPATH has a value), then create the file $HOME/usercustomize.py with the following contents:
import sys, os
my_site = os.path.join(os.environ['HOME'],
                       'python-modules/lib/python2.6/site-packages')
sys.path.insert(0, my_site)
Now when you start Python you should see your custom site-packages directory before the system default on sys.path.
Newer Python versions now have built-in support for searching the per-user location:
$HOME/.local/lib/pythonX.Y/site-packages
If you put your local modules there you don't have to do any sys.path manipulation.
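To see where that per-user directory is on your machine, the site module can report it directly. A quick check (the path it prints is system-dependent):
import site

# Per-user site-packages directory (PEP 370), e.g. ~/.local/lib/python3.X/site-packages
print(site.getusersitepackages())
# Whether user site-packages are enabled for this interpreter
print(site.ENABLE_USER_SITE)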
If one has multiple versions of a package installed, say e.g. SciPy:
>>> import scipy; print(scipy.__version__); print(scipy.__file__)
0.17.0
/usr/lib/python3/dist-packages/scipy/__init__.py
and one would like the user-installed version (installed e.g. using pip install --user --upgrade scipy) to be preferred, one needs a usercustomize.py file in ~/.local/lib/python3.5/site-packages/ with e.g. this content:
import sys, os
my_site = os.path.join(
    os.environ['HOME'], '.local/lib/python%d.%d/site-packages' % (
        sys.version_info[0], sys.version_info[1]))
for idx, pth in enumerate(sys.path):
    if pth.startswith('/usr'):
        sys.path.insert(idx, my_site)
        break
else:
    raise ValueError("No path starting with /usr in sys.path")
(the for loop selecting the index ensures that packages installed in "develop mode" take precedence). Now we get our user-specific version of SciPy:
>>> import scipy; print(scipy.__version__); print(scipy.__file__)
0.18.1
/home/user/.local/lib/python3.5/site-packages/scipy/__init__.py
To prefer packages installed to the user base (e.g. pip install --user --upgrade cool_thing):
in ~/.bashrc, ~/.profile, or whatever the init file for your shell is, add
export PYTHONUSERBASE="$HOME/python-modules"
and in $PYTHONUSERBASE/usercustomize.py:
#!/usr/bin/env python
import sys, site
sys.path.insert(0, site.getusersitepackages())

Check if Python Package is installed

What's a good way to check if a package is installed while within a Python script? I know it's easy from the interpreter, but I need to do it within a script.
I guess I could check if there's a directory on the system that's created during the installation, but I feel like there's a better way. I'm trying to make sure the Skype4Py package is installed, and if not I'll install it.
My ideas for accomplishing the check
check for a directory in the typical install path
try to import the package and if an exception is thrown, then install the package
If you mean a Python script, just do something like this:
Python 3.3+: use sys.modules and find_spec:
import importlib.util
import sys

# For illustrative purposes.
name = 'itertools'

if name in sys.modules:
    print(f"{name!r} already in sys.modules")
elif (spec := importlib.util.find_spec(name)) is not None:
    # If you choose to perform the actual import ...
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    spec.loader.exec_module(module)
    print(f"{name!r} has been imported")
else:
    print(f"can't find the {name!r} module")
Python 3:
try:
    import mymodule
except ImportError as e:
    pass  # module doesn't exist, deal with it.
Python 2:
try:
    import mymodule
except ImportError, e:
    pass  # module doesn't exist, deal with it.
As of Python 3.3, you can use the find_spec() method
import importlib.util

# For illustrative purposes.
package_name = 'pandas'

spec = importlib.util.find_spec(package_name)
if spec is None:
    print(package_name + " is not installed")
Updated answer
A better way of doing this is:
import subprocess
import sys
reqs = subprocess.check_output([sys.executable, '-m', 'pip', 'freeze'])
installed_packages = [r.decode().split('==')[0] for r in reqs.split()]
The result:
print(installed_packages)
[
"Django",
"six",
"requests",
]
Check if requests is installed:
if 'requests' in installed_packages:
    # Do something
Why this way? Sometimes you have app name collisions. Importing from the app namespace doesn't give you the full picture of what's installed on the system.
Note that the proposed solution works:
When using pip to install from PyPI or from any other alternative source (like pip install http://some.site/package-name.zip or any other archive type).
When installing manually using python setup.py install.
When installing from system repositories, like sudo apt install python-requests.
Cases when it might not work:
When installing in development mode, like python setup.py develop.
When installing in development mode, like pip install -e /path/to/package/source/.
Old answer
A better way of doing this is:
import pip
installed_packages = pip.get_installed_distributions()
For pip>=10.x use:
from pip._internal.utils.misc import get_installed_distributions
Why this way? Sometimes you have app name collisions. Importing from the app namespace doesn't give you the full picture of what's installed on the system.
As a result, you get a list of pkg_resources.Distribution objects. See the following as an example:
print installed_packages
[
"Django 1.6.4 (/path-to-your-env/lib/python2.7/site-packages)",
"six 1.6.1 (/path-to-your-env/lib/python2.7/site-packages)",
"requests 2.5.0 (/path-to-your-env/lib/python2.7/site-packages)",
]
Make a list of it:
flat_installed_packages = [package.project_name for package in installed_packages]
[
"Django",
"six",
"requests",
]
Check if requests is installed:
if 'requests' in flat_installed_packages:
    # Do something
If you want to have the check from the terminal, you can run
pip3 show package_name
and if nothing is returned, the package is not installed.
If perhaps you want to automate this check, so that for example you can install it if missing, you can have the following in your bash script:
pip3 show package_name 1>/dev/null  # pip for Python 2
if [ $? == 0 ]; then
    echo "Installed"  # Replace with your actions
else
    echo "Not Installed"  # Replace with your actions, 'pip3 install --upgrade package_name' ?
fi
Open your command prompt and type:
pip3 list
As an extension of this answer:
For Python 2.*, pip show <package_name> will perform the same task.
For example pip show numpy will return the following or alike:
Name: numpy
Version: 1.11.1
Summary: NumPy: array processing for numbers, strings, records, and objects.
Home-page: http://www.numpy.org
Author: NumPy Developers
Author-email: numpy-discussion@scipy.org
License: BSD
Location: /home/***/anaconda2/lib/python2.7/site-packages
Requires:
Required-by: smop, pandas, tables, spectrum, seaborn, patsy, odo, numpy-stl, numba, nfft, netCDF4, MDAnalysis, matplotlib, h5py, GridDataFormats, dynd, datashape, Bottleneck, blaze, astropy
In the Terminal type
pip show some_package_name
Example
pip show matplotlib
You can use the pkg_resources module from setuptools. For example:
import pkg_resources

package_name = 'cool_package'
try:
    cool_package_dist_info = pkg_resources.get_distribution(package_name)
except pkg_resources.DistributionNotFound:
    print('{} not installed'.format(package_name))
else:
    print(cool_package_dist_info)
Note that there is a difference between a Python module and a Python package. A package can contain multiple modules, and a module's name might not match the package name.
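On newer interpreters you can map import names back to the distributions that provide them, which helps exactly when those names differ. A small sketch, assuming Python 3.10+ where importlib.metadata.packages_distributions() is available:
from importlib.metadata import packages_distributions

# Maps top-level import names to the distribution(s) that provide them,
# e.g. 'bs4' -> ['beautifulsoup4'], 'yaml' -> ['PyYAML'].
mapping = packages_distributions()
print(mapping.get("bs4"))
print(mapping.get("yaml"))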
if pip list | grep -q '^PACKAGENAME\s'; then
    # installed ...
else
    # not installed ...
fi
You can use this:
class myError(Exception):
    pass  # Or do something like this.

try:
    import mymodule
except ImportError as e:
    raise myError("an error occurred")
Method 1
To check whether a package exists or not, use the pip3 list command:
# pip3 list will display all the packages and grep will search for a particular package
pip3 list | grep your_package_name_here
Method 2
You can use ImportError
try:
    import your_package_name
except ImportError as error:
    print(error, ':( not found')
Method 3 (in a Jupyter notebook or IPython shell, where ! runs a shell command)
!pip install your_package_name
import your_package_name
...
...
I'd like to add some thoughts/findings of mine to this topic.
I'm writing a script that checks all requirements for a custom made program. There are many checks with python modules too.
There's a little issue with the
try:
    import ..
except:
    ..
solution.
In my case one of the Python modules is called python-nmap, but you import it with import nmap and, as you see, the names don't match. Therefore the test with the above solution returns a False result, and it also imports the module on a hit, but maybe there's no need to use a lot of memory for a simple test/check.
I also found that
import pip
installed_packages = pip.get_installed_distributions()
installed_packages will contain only the packages that have been installed with pip.
On my system pip freeze returns over 40 Python modules, while installed_packages has only one, the one I installed manually (python-nmap).
Another solution, below, that I know may not be relevant to the question, but I think it's good practice to keep the test function separate from the one that performs the install; it might be useful for some.
The solution that worked for me. It is based on this answer: How to check if a python module exists without importing it
from imp import find_module

def checkPythonmod(mod):
    try:
        op = find_module(mod)
        return True
    except ImportError:
        return False
NOTE: this solution can't find the module by the name python-nmap either; I have to use nmap instead (easy to live with), but in this case the module won't be loaded into memory at all.
I would like to comment on @ice.nicer's reply but I cannot, so ...
My observation is that packages with dashes are saved with underscores, not only with dots as pointed out by @dwich's comment.
For example, you do pip3 install sphinx-rtd-theme, but:
importlib.util.find_spec("sphinx_rtd_theme") returns an Object
importlib.util.find_spec("sphinx-rtd-theme") returns None
importlib.util.find_spec("sphinx.rtd.theme") raises ModuleNotFoundError
Moreover, some names are totally changed.
For example, you do pip3 install pyyaml but it is saved simply as yaml
I am using python3.8
If you'd like your script to install missing packages and continue, you could do something like this (on example of 'krbV' module in 'python-krbV' package):
import pip
import sys

for m, pkg in [('krbV', 'python-krbV')]:
    try:
        setattr(sys.modules[__name__], m, __import__(m))
    except ImportError:
        pip.main(['install', pkg])
        setattr(sys.modules[__name__], m, __import__(m))
A quick way is to use the Python command line tool.
Simply type import <your module name>.
You'll see an error if the module is missing.
$ python
Python 2.7.6 (default, Jun 22 2015, 17:58:13)
>>> import sys
>>> import jocker
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named jocker
$
Hmmm ... the closest I saw to a convenient answer was using the command line to try the import. But I prefer to even avoid that.
How about 'pip freeze | grep pkgname'? I tried it and it works well. It also shows you the version it has and whether it is installed under version control (install) or editable (develop).
I've always used pylibcheck to check if a lib is installed or not; simply download it by doing pip install pylibcheck and the code could look like this:
import pylibcheck

if not pylibcheck.checkPackage("mypackage"):
    # not installed
It also supports tuples and lists, so you can check multiple packages and whether they are installed or not:
import pylibcheck

packages = ["package1", "package2", "package3"]
if pylibcheck.checkPackage(packages):
    # not installed
You can also install libs with it if you want to do that; I recommend you check the official PyPI page.
The top-voted solutions, which use techniques like importlib.util.find_spec and sys.modules and catching import exceptions, work for most packages but fail in some edge cases (such as the beautifulsoup package) where the package name used in imports is somewhat different (bs4 in this case) from the one used in the setup file configuration. For these edge cases, those solutions don't work unless you pass the package name used in imports instead of the one used in requirements.txt or pip installations.
For my use case, I needed to write a package checker that checks installed packages based on requirements.txt, so those solutions didn't work. What I ended up using was subprocess.check_output to call the pip module explicitly to check for the package installation:
import subprocess

# packages: list of package names, e.g. parsed from requirements.txt
not_found = []
for pkg in packages:
    try:
        subprocess.check_output('py -m pip show ' + pkg)
    except subprocess.CalledProcessError as ex:
        not_found.append(pkg)
It's a bit slower than the other methods but more reliable and handles the edge cases.
Go option #2. If ImportError is thrown, then the package is not installed (or not in sys.path).
Is there any chance to use the snippets given below? When I run this code, it returns "module pandas is not installed"
a = "pandas"
try:
import a
print("module ",a," is installed")
except ModuleNotFoundError:
print("module ",a," is not installed")
But when I run the code given below:
try:
    import pandas
    print("module is installed")
except ModuleNotFoundError:
    print("module is not installed")
It returns "module pandas is installed".
What is the difference between them?
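As a brief note on the difference: import a always looks for a module literally named a; it does not read the string stored in the variable a. To import a module whose name is held in a string, something like importlib.import_module can be used. A minimal sketch:
import importlib

name = "pandas"
try:
    module = importlib.import_module(name)  # imports the module named by the string
    print("module", name, "is installed")
except ModuleNotFoundError:
    print("module", name, "is not installed")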

How do I find the location of my Python site-packages directory?

How do I find the location of my site-packages directory?
There are two types of site-packages directories, global and per user.
Global site-packages ("dist-packages") directories are listed in sys.path when you run:
python -m site
For a more concise list run getsitepackages from the site module in Python code:
python -c 'import site; print(site.getsitepackages())'
Caution: In virtual environments getsitepackages is not available with older versions of virtualenv; sys.path from above will list the virtualenv's site-packages directory correctly, though. In Python 3, you may use the sysconfig module instead:
python3 -c 'import sysconfig; print(sysconfig.get_paths()["purelib"])'
The per user site-packages directory (PEP 370) is where Python installs your local packages:
python -m site --user-site
If this points to a non-existing directory check the exit status of Python and see python -m site --help for explanations.
Hint: Running pip list --user or pip freeze --user gives you a list of all installed per user site-packages.
Practical Tips
<package>.__path__ lets you identify the location(s) of a specific package: (details)
$ python -c "import setuptools as _; print(_.__path__)"
['/usr/lib/python2.7/dist-packages/setuptools']
<module>.__file__ lets you identify the location of a specific module: (difference)
$ python3 -c "import os as _; print(_.__file__)"
/usr/lib/python3.6/os.py
Run pip show <package> to show Debian-style package information:
$ pip show pytest
Name: pytest
Version: 3.8.2
Summary: pytest: simple powerful testing with Python
Home-page: https://docs.pytest.org/en/latest/
Author: Holger Krekel, Bruno Oliveira, Ronny Pfannschmidt, Floris Bruynooghe, Brianna Laugher, Florian Bruhin and others
Author-email: None
License: MIT license
Location: /home/peter/.local/lib/python3.4/site-packages
Requires: more-itertools, atomicwrites, setuptools, attrs, pathlib2, six, py, pluggy
>>> import site; site.getsitepackages()
['/usr/local/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages']
(or just first item with site.getsitepackages()[0])
A solution that:
outside of a virtualenv - provides the path of the global site-packages,
inside a virtualenv - provides the virtualenv's site-packages
...is this one-liner:
python -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())"
Formatted for readability (rather than use as a one-liner), that looks like the following:
from distutils.sysconfig import get_python_lib
print(get_python_lib())
Source: a very old version of the "How to Install Django" documentation (though this is useful for more than just Django installation)
For Ubuntu,
python -c "from distutils.sysconfig import get_python_lib; print get_python_lib()"
...is not correct.
It will point you to /usr/lib/pythonX.X/dist-packages
This folder only contains packages your operating system has automatically installed for programs to run.
On Ubuntu, the site-packages folder that contains packages installed via setuptools / easy_install / pip will be in /usr/local/lib/pythonX.X/dist-packages
The second folder is probably the more useful one if the use case is related to installation or reading source code.
If you do not use Ubuntu, you are probably safe copy-pasting the first code box into the terminal.
This is what worked for me:
python -m site --user-site
A modern stdlib way is using sysconfig module, available in version 2.7 and 3.2+. Unlike the current accepted answer, this method still works regardless of whether or not you have a virtual environment active.
Note: sysconfig (source) is not to be confused with the distutils.sysconfig submodule (source) mentioned in several other answers here. The latter is an entirely different module and it's lacking the get_paths function discussed below. Additionally, distutils is deprecated in 3.10 and will be unavailable soon.
Python currently uses eight paths (docs):
stdlib: directory containing the standard Python library files that are not platform-specific.
platstdlib: directory containing the standard Python library files that are platform-specific.
platlib: directory for site-specific, platform-specific files.
purelib: directory for site-specific, non-platform-specific files.
include: directory for non-platform-specific header files.
platinclude: directory for platform-specific header files.
scripts: directory for script files.
data: directory for data files.
In most cases, users finding this question would be interested in the 'purelib' path (in some cases, you might be interested in 'platlib' too). The purelib path is where ordinary Python packages will be installed by tools like pip.
At system level, you'll see something like this:
# Linux
$ python3 -c "import sysconfig; print(sysconfig.get_path('purelib'))"
/usr/local/lib/python3.8/site-packages
# macOS (brew installed python3.8)
$ python3 -c "import sysconfig; print(sysconfig.get_path('purelib'))"
/usr/local/Cellar/python@3.8/3.8.3/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages
# Windows
C:\> py -c "import sysconfig; print(sysconfig.get_path('purelib'))"
C:\Users\wim\AppData\Local\Programs\Python\Python38\Lib\site-packages
With a venv, you'll get something like this
# Linux
/tmp/.venv/lib/python3.8/site-packages
# macOS
/private/tmp/.venv/lib/python3.8/site-packages
# Windows
C:\Users\wim\AppData\Local\Temp\.venv\Lib\site-packages
The function sysconfig.get_paths() returns a dict of all of the relevant installation paths, example on Linux:
>>> import sysconfig
>>> sysconfig.get_paths()
{'stdlib': '/usr/local/lib/python3.8',
'platstdlib': '/usr/local/lib/python3.8',
'purelib': '/usr/local/lib/python3.8/site-packages',
'platlib': '/usr/local/lib/python3.8/site-packages',
'include': '/usr/local/include/python3.8',
'platinclude': '/usr/local/include/python3.8',
'scripts': '/usr/local/bin',
'data': '/usr/local'}
A shell script is also available to display these details, which you can invoke by executing sysconfig as a module:
python -m sysconfig
Addendum: What about Debian / Ubuntu?
As some commenters point out, the sysconfig results for Debian systems (and Ubuntu, as a derivative) are not accurate. When a user pip installs a package it will go into dist-packages not site-packages, as per Debian policies on Python packaging.
The root cause of the discrepancy is that Debian patches the distutils install layout to correctly reflect their changes to the site, but they fail to patch the sysconfig module.
For example, on Ubuntu 20.04.4 LTS (Focal Fossa):
root@cb5e85f17c7f:/# python3 -m sysconfig | grep packages
platlib = "/usr/lib/python3.8/site-packages"
purelib = "/usr/lib/python3.8/site-packages"
root@cb5e85f17c7f:/# python3 -m site | grep packages
'/usr/local/lib/python3.8/dist-packages',
'/usr/lib/python3/dist-packages',
USER_SITE: '/root/.local/lib/python3.8/site-packages' (doesn't exist)
It looks like the patched Python installation that Debian/Ubuntu are distributing is a bit hacked up, and they will need to figure out a new plan for 3.12+ when distutils is completely unavailable. Probably, they will have to start patching sysconfig as well, since this is what pip will be using for install locations.
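To see this discrepancy on your own Debian/Ubuntu system, a quick comparison of what sysconfig and site report can be run (the paths shown earlier are examples; yours may differ):
import site
import sysconfig

# On Debian-patched interpreters these two frequently disagree.
print("sysconfig purelib:", sysconfig.get_path("purelib"))
print("site.getsitepackages():", site.getsitepackages())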
Let's say you have installed the package 'django'. Import it and type in dir(django). It will show you all the functions and attributes of that module. Type this in the Python interpreter -
>>> import django
>>> dir(django)
['VERSION', '__builtins__', '__doc__', '__file__', '__name__', '__package__', '__path__', 'get_version']
>>> print django.__path__
['/Library/Python/2.6/site-packages/django']
You can do the same thing if you have installed mercurial.
This is for Snow Leopard. But I think it should work in general as well.
As others have noted, distutils.sysconfig has the relevant settings:
import distutils.sysconfig
print distutils.sysconfig.get_python_lib()
...though the default site.py does something a bit more crude, paraphrased below:
import sys, os
print os.sep.join([sys.prefix, 'lib', 'python' + sys.version[:3], 'site-packages'])
(it also adds ${sys.prefix}/lib/site-python and adds both paths for sys.exec_prefix as well, should that constant be different).
That said, what's the context? You shouldn't be messing with your site-packages directly; setuptools/distutils will work for installation, and your program may be running in a virtualenv where your pythonpath is completely user-local, so it shouldn't assume use of the system site-packages directly either.
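Related to that last point, whether you are inside a virtual environment changes which site-packages applies. A quick way to check from Python 3, assuming a venv-style environment (which sets base_prefix):
import sys
import sysconfig

# In a venv, sys.prefix points at the environment
# while sys.base_prefix points at the base installation.
in_venv = sys.prefix != getattr(sys, "base_prefix", sys.prefix)
print("virtual environment active:", in_venv)
print("site-packages in use:", sysconfig.get_path("purelib"))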
The native system packages installed with the Python installation on Debian-based systems can be found at:
/usr/lib/python2.7/dist-packages/
On OSX - /Library/Python/2.7/site-packages
by using this small piece of code:
from distutils.sysconfig import get_python_lib
print get_python_lib()
However, the list of packages installed via pip can be found at :
/usr/local/bin/
Or one can simply write the following command to list all paths where python packages are.
>>> import site; site.getsitepackages()
['/usr/local/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages']
Note: the location might vary based on your OS, like in OSX
>>> import site; site.getsitepackages()
['/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages', '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/site-python', '/Library/Python/2.7/site-packages']
pip show will give all the details about a package:
https://pip.pypa.io/en/stable/reference/pip_show/
To get the location:
pip show <package_name> | grep Location
In Linux, you can go to site-packages folder by:
cd $(python -c "import site; print(site.getsitepackages()[0])")
All the answers (or: the same answer repeated over and over) are inadequate. What you want to do is this:
from setuptools.command.easy_install import easy_install

class easy_install_default(easy_install):
    """ class easy_install had problems with the first parameter not being
        an instance of Distribution, even though it was. This is due to
        some import-related mess.
    """
    def __init__(self):
        from distutils.dist import Distribution
        dist = Distribution()
        self.distribution = dist
        self.initialize_options()
        self._dry_run = None
        self.verbose = dist.verbose
        self.force = None
        self.help = 0
        self.finalized = 0

e = easy_install_default()
import distutils.errors
try:
    e.finalize_options()
except distutils.errors.DistutilsError:
    pass

print e.install_dir
The final line shows you the installation dir. Works on Ubuntu, whereas the above ones don't. Don't ask me about Windows or other distros, but since it's the exact same dir that easy_install uses by default, it's probably correct everywhere easy_install works (so, everywhere, even Macs). Have fun. Note: the original code has many swearwords in it.
An additional note to the get_python_lib function mentioned already: on some platforms different directories are used for platform specific modules (eg: modules that require compilation). If you pass plat_specific=True to the function you get the site packages for platform specific packages.
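A small illustration of that flag; get_python_lib accepts a plat_specific argument (this uses the legacy distutils module discussed elsewhere on this page, so treat it as a sketch for older setups):
from distutils.sysconfig import get_python_lib

# Pure-Python ("purelib") packages:
print(get_python_lib())
# Platform-specific ("platlib") packages, e.g. compiled extensions;
# on many systems this is the same directory:
print(get_python_lib(plat_specific=True))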
This works for me.
It will get you both dist-packages and site-packages folders.
If the folder is not on Python's path, it won't be
doing you much good anyway.
import sys
print [f for f in sys.path if f.endswith('packages')]
Output (Ubuntu installation):
['/home/username/.local/lib/python2.7/site-packages',
'/usr/local/lib/python2.7/dist-packages',
'/usr/lib/python2.7/dist-packages']
This should work on all distributions, in and out of virtual environments, due to its "low-tech" nature. The os module always resides in the parent directory of 'site-packages'.
import os; print(os.path.dirname(os.__file__) + '/site-packages')
To change dir to the site-packages dir I use the following alias (on *nix systems):
alias cdsp='cd $(python -c "import os; print(os.path.dirname(os.__file__))"); cd site-packages'
A side-note: The proposed solution (distutils.sysconfig.get_python_lib()) does not work when there is more than one site-packages directory (as recommended by this article). It will only return the main site-packages directory.
Alas, I have no better solution either. Python doesn't seem to keep track of site-packages directories, just the packages within them.
from distutils.sysconfig import get_python_lib
print get_python_lib()
You should try this command to determine pip's install location
Python 2
pip show six | grep "Location:" | cut -d " " -f2
Python 3
pip3 show six | grep "Location:" | cut -d " " -f2
Answer to old question. But use ipython for this.
pip install ipython
ipython
import imaplib
imaplib?
This will give the following output about imaplib package -
Type: module
String form: <module 'imaplib' from '/usr/lib/python2.7/imaplib.py'>
File: /usr/lib/python2.7/imaplib.py
Docstring:
IMAP4 client.
Based on RFC 2060.
Public class: IMAP4
Public variable: Debug
Public functions: Internaldate2tuple
Int2AP
ParseFlags
Time2Internaldate
For those who are using poetry, you can find your virtual environment path with poetry debug:
$ poetry debug
Poetry
Version: 1.1.4
Python: 3.8.2
Virtualenv
Python: 3.8.2
Implementation: CPython
Path: /Users/cglacet/.pyenv/versions/3.8.2/envs/my-virtualenv
Valid: True
System
Platform: darwin
OS: posix
Python: /Users/cglacet/.pyenv/versions/3.8.2
Using this information you can list site packages:
ls /Users/cglacet/.pyenv/versions/3.8.2/envs/my-virtualenv/lib/python3.8/site-packages/
I made a really simple function that gets the job done
import site

def get_site_packages_dir():
    return [p for p in site.getsitepackages()
            if p.endswith(("site-packages", "dist-packages"))][0]

get_site_packages_dir()
# '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages'
If you want to retrieve the results using the terminal:
python3 -c "import site;print([p for p in site.getsitepackages() if p.endswith(('site-packages', 'dist-packages')) ][0])"
/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages
I had to do something slightly different for a project I was working on: find the relative site-packages directory relative to the base install prefix. If the site-packages folder was in /usr/lib/python2.7/site-packages, I wanted the /lib/python2.7/site-packages part. I have, in fact, encountered systems where site-packages was in /usr/lib64, and the accepted answer did NOT work on those systems.
Similar to cheater's answer, my solution peeks deep into the guts of Distutils, to find the path that actually gets passed around inside setup.py. It was such a pain to figure out that I don't want anyone to ever have to figure this out again.
import sys
import os
from distutils.command.install import INSTALL_SCHEMES

if os.name == 'nt':
    scheme_key = 'nt'
else:
    scheme_key = 'unix_prefix'

print(INSTALL_SCHEMES[scheme_key]['purelib'].replace('$py_version_short', (str.split(sys.version))[0][0:3]).replace('$base', ''))
That should print something like /Lib/site-packages or /lib/python3.6/site-packages.
Something that has not been mentioned, and which I believe is useful: if you have two versions of Python installed, e.g. both 3.8 and 3.5, there might be two folders called site-packages on your machine. In that case you can specify the Python version by using the following:
py -3.5 -c "import site; print(site.getsitepackages()[1])"
