I created a Python library which depends on the pypiwin32 package. For some functionality, pypiwin32 uses the _winreg module. This works on Windows; however, the Read the Docs (RTD) virtualenv does not run on Windows, and _winreg is not available there. Since _winreg is part of Python itself and not on PyPI, there is no way I could make it a dependency.
Each time I build the docs from source, the build fails on the missing _winreg module.
I tried to remove dependency on pypiwin32 only for RTD with something like this in setup.py:
import os

if os.environ.get('READTHEDOCS') == 'True':
    REQUIRED = []
else:
    REQUIRED = [
        "pypiwin32"
    ]
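For completeness, a minimal sketch of how REQUIRED then feeds into the setup() call (the name and version are placeholders):
from setuptools import setup

setup(
    name='mypackage',           # placeholder
    version='0.1',              # placeholder
    install_requires=REQUIRED,  # empty on RTD, real dependencies elsewhere
)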
This works for all the plain .rst Sphinx files; on the other hand, no documentation is generated for my functions. On my local machine (Windows), everything is documented properly.
Note: the Read the Docs documentation is generated from the rtd branch of my GitHub project.
Is there any workaround for this?
Thank you.
I might have found a working solution. Since Read the Docs does not support _winreg, I disabled the whole dependency on pypiwin32 for Read the Docs.
# ... part of setup.py
if os.environ.get('READTHEDOCS') == 'True':
    REQUIRED = []
else:
    REQUIRED = [
        "pypiwin32"
    ]
READTHEDOCS is an environment variable that is set only during RTD builds.
This broke all calls from my library, so I created another file with mock functions just for Read the Docs:
# ... part of __init__.py
import os

if os.environ.get('READTHEDOCS') != 'True':
    from win32api import GetModuleHandle
    # ... import the rest of the win32api functions
else:
    from .read_the_docs import *
And the content of read_the_docs.py:
def GetModuleHandle(*args, **kwargs):
    pass

# ... rest of file
This way, local builds use the full pypiwin32 with _winreg, but on Read the Docs, Sphinx uses this kind of "mock" utility.
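As an aside, recent Sphinx versions (1.3+) can generate such mocks for you: listing the missing modules in conf.py's autodoc_mock_imports makes autodoc stub them out at build time, so no hand-written mock file is needed. A minimal sketch (the exact module list depends on what your code imports):
# conf.py
autodoc_mock_imports = ['win32api', '_winreg']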
My context is appengine_config.py, but this is really a general Python question.
Given that we've cloned a repo of an app that has an empty directory lib in it, and that we populate lib with packages by using the command pip install -r requirements.txt --target lib, then:
import os

dirname = 'lib'
dirpath = os.path.join(os.path.dirname(__file__), dirname)
For importing purposes, we can add such a filesystem path to the beginning of the Python path in the following way (we use index 1 because the first position should remain '.', the current directory):
sys.path.insert(1, dirpath)
However, that won't work if any of the packages in that directory are namespace packages.
To support namespace packages we can instead use:
site.addsitedir(dirpath)
But that appends the new directory to the end of the path, which we don't want in case we need to override a platform-supplied package (such as WebOb) with a newer version.
The solution I have so far is this bit of code which I'd really like to simplify:
sys.path, remainder = sys.path[:1], sys.path[1:]
site.addsitedir(dirpath)
sys.path.extend(remainder)
Is there a cleaner or more Pythonic way of accomplishing this?
For this answer I assume you know how to use setuptools and setup.py.
Assuming you would like to use the standard setuptools workflow for development, I recommend using this code snippet in your appengine_config.py:
import os
import sys

if os.environ.get('CURRENT_VERSION_ID') == 'testbed-version':
    # If we are unit testing, fake the non-existence of appengine_config.
    # The error message of the ImportError is handled by GAE and must
    # exactly match the expected string.
    raise ImportError('No module named appengine_config')

# Imports are done relatively because Google App Engine prohibits
# absolute imports.
lib_dir = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'libs')

# Add every library to sys.path.
if os.path.isdir(lib_dir):
    for lib in os.listdir(lib_dir):
        if lib.endswith('.egg'):
            lib = os.path.join(lib_dir, lib)
            # Insert to override default libraries such as webob 1.1.1.
            sys.path.insert(0, lib)
And this piece of code in setup.cfg:
[develop]
install-dir = libs
always-copy = true
If you run python setup.py develop, the libraries are downloaded as eggs into the libs directory, and appengine_config.py inserts them into your path.
We use this at work to include webob==1.3.1 and internal packages, which are all namespaced under our company namespace.
You may want to have a look at the answers in the Stack Overflow thread, "How do I manage third-party Python libraries with Google App Engine? (virtualenv? pip?)," but for your particular predicament with namespace packages, you're running up against a long-standing issue I filed against site.addsitedir's behavior of appending to sys.path instead of inserting after the first element. Please feel free to add to that discussion with a link to this use case.
I do want to address something else that you said that I think is misleading:
My context is appengine_config.py, but this is really a general Python question.
The question actually arises from the limitations of Google App Engine and its inability to install third-party packages the standard way, and hence from seeking a workaround by manually adjusting sys.path and using site.addsitedir. In general Python development, if your code uses these, you're Doing It Wrong.
The Python Packaging Authority (PyPA) describes the best practices for putting third-party libraries on your path, which I outline below:
1. Create a virtualenv.
2. Mark out your dependencies in your setup.py and/or requirements files (see PyPA's "Concepts and Analyses").
3. Install your dependencies into the virtualenv with pip.
4. Install your project, itself, into the virtualenv with pip and the -e/--editable flag.
Unfortunately, Google App Engine is incompatible with virtualenv and with pip. GAE chose to block this toolset in an attempt to sandbox the environment. Hence, one must use hacks to work around the limitations of GAE in order to use additional or newer third-party libraries.
If you dislike this limitation and want to use standard Python tooling for managing third-party package dependencies, other Platform as a Service providers out there eagerly await your business.
I am writing a utility in Python that needs to check for (and, if necessary, install and even upgrade) various other modules within a target project/virtualenv, based on user-supplied flags and/or input. I am currently trying to use pip directly/programmatically (because of its existing support for the various repo types I will need to access), but I am having difficulty finding examples or documentation on using it this way.
This seemed like the direction to go:
import pip
vcs = pip.vcs.VersionControl(url="http://path/to/repo/")
...but it gives no joy.
I apparently need help with some of the basics: how can I use pip to pull/export a copy of an SVN repo into a given local directory? Ultimately, I will also need to use it for Git and Mercurial checkouts as well as standard PyPI installs. Any links, docs, or pointers would be much appreciated.
Pip uses a particular format for VCS URLs:
vcsname+url@rev
The @rev part is optional; you can use it to reference a specific commit or tag.
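For illustration, some URLs in this format (hosts, paths, and revisions are placeholders):
git+https://github.com/user/project.git@v1.0.0
hg+https://hg.example.com/project@special_feature
svn+http://svn.example.com/svn/project/trunk@2019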
To use pip to retrieve a repository from a generic VCS into a local directory, you can do this:
from pip.vcs import VcsSupport

req_url = 'git+git://url/repo'
dest_path = '/this/is/the/destination'

# Registry of all the VCS backends pip knows about (git, hg, svn, bzr).
vcs = VcsSupport()

# Split 'git+git://url/repo' into the backend name and the bare URL.
vc_type, url = req_url.split('+', 1)

backend = vcs.get_backend(vc_type)
if backend:
    vcs_backend = backend(req_url)
    # Clone/check out the repository into dest_path.
    vcs_backend.obtain(dest_path)
else:
    print('Not a repository')
Check https://pip.pypa.io/en/stable/reference/pip_install/#id8 to see which VCSs are supported.
I'm looking for a way to include an optional feature in a Python (extension) module at installation time.
In practical terms:
I have a Python library with two implementations of the same function: one internal (slow) and one that depends on an external library (fast, written in C).
I want this dependency to be optional, activated at compile/install time using a flag like:
python setup.py install # (it doesn't include the fast library)
python setup.py --enable-fast install
I have to use distutils; however, all solutions are welcome!
The docs for distutils include a section on extending the standard functionality. The relevant suggestion seems to be to subclass the relevant classes from the distutils.command.* modules (such as build_py or install) and tell setup to use your new versions (through the cmdclass argument, which is a dictionary mapping commands to classes which are to be used to execute them). See the source of any of the command classes (e.g. the install command) to get a good idea of what one has to do to add a new option.
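As a minimal sketch of that approach (the flag name enable-fast, the package name, and the version are illustrative, not part of distutils):
from distutils.command.install import install
from distutils.core import setup

class InstallCommand(install):
    # Each entry is (long option, short option, help text); no trailing
    # '=' on the long option means it is a boolean flag.
    user_options = install.user_options + [
        ('enable-fast', None, 'also build the optional fast C extension'),
    ]
    boolean_options = install.boolean_options + ['enable-fast']

    def initialize_options(self):
        install.initialize_options(self)
        self.enable_fast = False

    def run(self):
        if self.enable_fast:
            # Here you would arrange for the C extension to be built.
            print('fast C extension enabled')
        install.run(self)

setup(
    name='mypackage',  # placeholder
    version='0.1',     # placeholder
    cmdclass={'install': InstallCommand},
)
With this in place, the flag goes after the command name: python setup.py install --enable-fast.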
An example of exactly what you want is SQLAlchemy's cextensions, which exist specifically for the same purpose: a faster C implementation. To see how SA implemented it, you need to look at two files:
1) setup.py. As you can see from the extract below, they handle both cases, with setuptools and with plain distutils:
try:
    from setuptools import setup, Extension, Feature
except ImportError:
    from distutils.core import setup, Extension
    Feature = None
Later there is a check if Feature:, and the extension is configured appropriately for each case using a variable extra, whose contents are later passed to the setup() function.
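The pattern looks roughly like this (a simplified sketch of SQLAlchemy's setup.py rather than a verbatim copy; the source path is illustrative):
if Feature:
    # setuptools is available: expose the C code as an optional feature,
    # toggled with --with-cextensions / --without-cextensions.
    extra = {
        'features': {
            'cextensions': Feature(
                "optional C speed-enhancements",
                standard=False,
                ext_modules=[
                    Extension('sqlalchemy.cresultproxy',
                              sources=['lib/sqlalchemy/cextension/resultproxy.c']),
                ],
            ),
        },
    }
else:
    # Plain distutils: no optional-feature support.
    extra = {}

setup(
    name='SQLAlchemy',
    # ... other arguments ...
    **extra
)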
2) base.py: here look at how BaseRowProxy is defined:
try:
    from sqlalchemy.cresultproxy import BaseRowProxy
except ImportError:
    class BaseRowProxy(object):
        # ... pure-Python implementation ...
So basically, once the C extensions are installed (using the --with-cextensions flag during setup), the C implementation is used; otherwise, the pure-Python implementation of the class/function is used.
With my Java projects at present, I have full version control by declaring each as a Maven project. However, I now have a Python project that I'm about to tag 0.2.0, and it has no version control. Therefore, should I come across this code at a later date, I won't know what version it is.
How do I add version control to a Python project, in the same way Maven does it for Java?
First, Maven is a build tool and has nothing to do with version control. You don't need a build tool with Python; there's nothing to "build".
Some folks like to create .egg files for distribution. That's as close to a "build" as you get with Python, and it requires only a simple setup.py file.
You can use SVN keyword replacement in your source like this. Remember to enable keyword replacement for the modules that will have this.
__version__ = "$Revision$"
That will assure that the version or revision strings are forced into your source by SVN.
You should also include version keywords in your setup.py file.
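Since the expanded keyword is a full string such as "$Revision: 1234 $", a small helper can extract a plain number for use in setup.py (a sketch; the fallback value is arbitrary):
import re

__version__ = "$Revision: 1234 $"  # expanded by SVN at commit time

# Pull out just the number; fall back to '0' if the keyword is unexpanded.
match = re.search(r'\d+', __version__)
version = match.group(0) if match else '0'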
Create a distutils setup.py file. This is the Python equivalent to maven pom.xml, it looks something like this:
from distutils.core import setup

setup(name='foo',
      version='1.0',
      py_modules=['foo'],
      )
If you want dependency management like maven, take a look at setuptools.
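For example, a minimal sketch with setuptools (the dependency name and version are placeholders):
from setuptools import setup

setup(name='foo',
      version='1.0',
      py_modules=['foo'],
      # Resolved from PyPI at install time, loosely analogous to how
      # Maven resolves artifacts from a repository.
      install_requires=['requests>=2.0'],
      )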
Ants's answer is correct, but I would like to add that your modules can define a __version__ variable, according to PEP 8, which can be populated manually or via Subversion or CVS, e.g. if you have a module thingy, with a file thingy/__init__.py:
__version__ = '0.2.0'
You can then import this version in setup.py:
from distutils.core import setup
import thingy

setup(name='thingy',
      version=thingy.__version__,
      packages=['thingy'],
      )
I've written a Python package that includes a bsddb database of pre-computed values for one of the more time-consuming computations. For simplicity, my setup script installs the database file in the same directory as the code which accesses the database (on Unix, something like /usr/lib/python2.5/site-packages/mypackage/).
How do I store the final location of the database file so my code can access it? Right now, I'm using a hack based on the __file__ variable in the module which accesses the database:
dbname = os.path.join(os.path.dirname(__file__), "database.dat")
It works, but it seems... hackish. Is there a better way to do this? I'd like to have the setup script just grab the final installation location from the distutils module and stuff it into a "dbconfig.py" file that gets installed alongside the code that accesses the database.
Try using pkg_resources, which is part of setuptools (and available on all of the pythons I have access to right now):
>>> import pkg_resources
>>> pkg_resources.resource_filename(__name__, "foo.config")
'foo.config'
>>> pkg_resources.resource_filename('tempfile', "foo.config")
'/usr/lib/python2.4/foo.config'
There's more discussion about using pkg_resources to get resources on the eggs page and the pkg_resources page.
Also note, where possible it's probably advisable to use pkg_resources.resource_stream or pkg_resources.resource_string because if the package is part of an egg, resource_filename will copy the file to a temporary directory.
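For example (the package name mypackage is illustrative):
import pkg_resources

# Read the whole resource into memory as a byte string:
db_bytes = pkg_resources.resource_string('mypackage', 'database.dat')

# Or get a file-like object, which also works from inside a zipped egg:
stream = pkg_resources.resource_stream('mypackage', 'database.dat')
header = stream.read(16)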
Use pkgutil.get_data. It's the cousin of pkg_resources.resource_string, but in the standard library, and it should work with flat filesystem installs as well as zipped packages and other importers.
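A minimal example (mypackage is again an illustrative name):
import pkgutil

# Returns the resource's contents as bytes, or None if it cannot be found.
data = pkgutil.get_data('mypackage', 'database.dat')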
Using __file__ as you already do is probably the way to go, without resorting to something more advanced like using setuptools to install the files where they belong.
Notice there's a problem with that approach, because on OSes with a real security framework (Unixes, etc.) the user running your script might not have the rights to access the DB in the system directory where it gets installed.
Use the standard library's importlib.resources module (Python 3.7+), which is more efficient than setuptools' pkg_resources (on previous Python versions, use the backported importlib_resources library).
Attention: for this to work, the folder where the data file resides must be a regular Python package. That means you must add an __init__.py file to it, if one is not already there.
Then you can access it like this:
try:
    import importlib.resources as importlib_resources
except ImportError:
    # On PY<3.7, fall back to the backported `importlib_resources`.
    import importlib_resources

## Note that the actual package could have been used,
#  not just its (string) name, with something like:
#      from XXX import YYY as data_pkg
data_pkg = 'mypackage'  # illustrative name of the package holding the file
fname = 'database.dat'

db_bytes = importlib_resources.read_binary(data_pkg, fname)

# Or, if a file-like stream is needed:
with importlib_resources.open_binary(data_pkg, fname) as db_file:
    ...