Buildout adding eggs to existing recipe - python

Edit2: added entire stderr
Edit1: deleted my supposed answer. Updated description of problem and added full buildout.cfg text
Edit0: fixed links
I am trying to port a Pyramid project I have to Google App Engine. I am attempting to leverage Tobias Rodaebel's (thanks!) buildout recipe that aids in this. I used the pyramid_appengine scaffold, essentially following the procedure described here.
Things went relatively smoothly until I tried to add additional eggs to the buildout. I attempt to add the pymongo package to the ./buildout.cfg file,
[buildout]
include-site-packages=false
find-links=http://dist.plone.org/thirdparty/
extends = versions.cfg
versions = versions
update-versions-file = versions.cfg
show-picked-versions = true
develop=src/bkk
parts =
    bkk
    tests
    service-bkk
ae-sdk-version=1.9.18
ae-runtime=2.7
ae-sdk-location = ${buildout:parts-directory}/google_appengine
ae-extra-paths =
    ${buildout:bin-directory}
    ${buildout:directory}/parts/bkk
    ${buildout:directory}/parts/google_appengine
    ${buildout:directory}/parts/google_appengine/lib/antlr3
    ${buildout:directory}/parts/google_appengine/lib/django
    ${buildout:directory}/parts/google_appengine/lib/fancy_urllib
    ${buildout:directory}/parts/google_appengine/lib/yaml/lib
unzip=true
supervisor-port = 9999
supervisor-conf-dir = ${buildout:directory}/conf
supervisor-log-dir = ${buildout:directory}/var/log
service-conf-templates = ${buildout:directory}/conf.tmpl
project-name=bkk
[bkk]
recipe=rod.recipe.appengine
packages =
    pyramid
    pymongo
    pyramid_jinja2
    repoze.lru
    zope.interface
    zope.deprecation
    venusian
    translationstring
    jinja2
    webob
src=src/bkk
server-script=devappserver
zip-packages=false
use_setuptools_pkg_resources=true
url=https://storage.googleapis.com/appengine-sdks/featured/google_appengine_${buildout:ae-sdk-version}.zip
[bootstrap]
recipe=zc.recipe.egg
eggs=pastescript
extra-paths=${buildout:ae-extra-paths}
[tests]
recipe = zc.recipe.egg
eggs =
    WebTest
    WebOb
    pytest
    pytest-cov
interpreter = python
extra-paths=${buildout:ae-extra-paths}
[supervisor]
recipe = zc.recipe.egg
[mk-supervisor-log-dir]
recipe = collective.recipe.cmd:py
on_install = true
cmds =
    >>> if not os.path.isdir('${buildout:supervisor-log-dir}'): os.makedirs('${buildout:supervisor-log-dir}')
[service-supervisor]
recipe = collective.recipe.template
input = ${buildout:service-conf-templates}/supervisord.conf_tmpl
output = ${buildout:directory}/etc/supervisord.conf
depends = ${mk-supervisor-log-dir:recipe} ${supervisor:recipe}
[service-bkk]
recipe = collective.recipe.template
input = ${buildout:service-conf-templates}/service-${buildout:project-name}.conf_tmpl
output = ${buildout:supervisor-conf-dir}/service-${buildout:project-name}.conf
port=8000
admin_port=8010
api_port=8020
depends = ${service-supervisor:recipe}
and also unzip the corresponding egg into the ./eggs directory. I end up getting the following error:
Develop: '/Users/npk1/Dev/bkk/bkk_gae/bkk/src/bkk'
/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'paster_plugins'
warnings.warn(msg)
warning: no files found matching '*.ini'
warning: no files found matching '*.rst'
warning: no files found matching '*.jpg' under directory 'bkk'
warning: no files found matching '*.pt' under directory 'bkk'
warning: no files found matching '*.txt' under directory 'bkk'
warning: no files found matching '*.mak' under directory 'bkk'
warning: no files found matching '*.mako' under directory 'bkk'
warning: no files found matching '*.js' under directory 'bkk'
warning: no files found matching '*.html' under directory 'bkk'
warning: no files found matching '*.xml' under directory 'bkk'
Uninstalling bkk.
Unused options for buildout: 'ae-runtime' 'include-site-packages' 'unzip'.
Installing bkk.
rod.recipe.appengine: Google App Engine distribution already downloaded.
While:
  Installing bkk.
An internal error occurred due to a bug in either zc.buildout or in a
recipe being used:
Traceback (most recent call last):
  File "/Users/npk1/Dev/bkk/bkk_gae/bkk/eggs/zc.buildout-2.3.1-py2.7.egg/zc/buildout/buildout.py", line 1946, in main
    getattr(buildout, command)(args)
  File "/Users/npk1/Dev/bkk/bkk_gae/bkk/eggs/zc.buildout-2.3.1-py2.7.egg/zc/buildout/buildout.py", line 626, in install
    installed_files = self[part]._call(recipe.install)
  File "/Users/npk1/Dev/bkk/bkk_gae/bkk/eggs/zc.buildout-2.3.1-py2.7.egg/zc/buildout/buildout.py", line 1370, in _call
    return f()
  File "/Users/npk1/Dev/bkk/bkk_gae/bkk/eggs/rod.recipe.appengine-2.0.6-py2.7.egg/rod/recipe/appengine/__init__.py", line 380, in install
    self.copy_packages(ws, temp_dir)
  File "/Users/npk1/Dev/bkk/bkk_gae/bkk/eggs/rod.recipe.appengine-2.0.6-py2.7.egg/rod/recipe/appengine/__init__.py", line 290, in copy_packages
    raise KeyError, '%s: package not found.' % p
KeyError: 'pyramid_jinja2: package not found.'
Walking through the code, it seems that the recipe assembles its own list of dependencies and checks the packages I listed against it, and pymongo does not show up, though perhaps I am wrong.
Does anyone know the process for adding new packages to an existing recipe? Or is that the wrong way to think about it? I will continue to try to digest buildout's documentation, as I am new to the tool.
Thanks

Pyramid has a cookbook of recipes.
These two recipes describe how to deploy Pyramid on GAE. Each uses a different method from the path you started down.
Pyramid on Google’s App Engine (using buildout)
Pyramid on Google’s App Engine (using appengine-monkey)
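One thing worth checking (an assumption based on reading rod.recipe.appengine's copy_packages, not something its docs confirm): the recipe resolves each name in the packages option against the working set built from your develop egg, so every listed package, including pymongo and pyramid_jinja2, may also need to be declared as a dependency of src/bkk. A sketch:

```python
# src/bkk/setup.py (sketch; everything except the dependency list is a
# placeholder for whatever the real file already contains)
from setuptools import setup, find_packages

setup(
    name='bkk',
    packages=find_packages(),
    install_requires=[
        'pyramid',
        'pyramid_jinja2',  # the package named in the KeyError
        'pymongo',         # the newly added package
    ],
)
```

After editing setup.py, rerun bin/buildout so the working set is rebuilt; unzipping eggs into ./eggs by hand should not be needed, since buildout fetches them via find-links or the index.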

Related

Toolchains with Bazel 0.5.4

Is there an import of rules_python that defines toolchains but is compatible with bazel release 0.5.4? If not, what's the minimum version of bazel that does implement toolchains?
$ bazel info release
release 0.5.4
I've inherited an application that builds with bazel 0.5.4, and would prefer not to destabilize the application with a significant upgrade. But, it does require toolchains in order to find python 3.
Suppose the application consists of the WORKSPACE and BUILD files from the most recent rules release README, plus a small python objective in the BUILD:
py_test(
    name = "sandbox_test",
    srcs = ["sandbox_test.py"],
    default_python_version = "PY3",
    srcs_version = "PY3",
)
What additional WORKSPACE definitions could enable this to run? With the latest rules_python provided:
git_repository(
    name = "rules_python",
    commit = "740825b7f74930c62f44af95c9a4c1bd428d2c53",
    remote = "https://github.com/bazelbuild/rules_python.git",
)
toolchain.bzl cannot be found:
ERROR: error loading package 'toolchain_demo': Extension file not found. Unable to load file '@bazel_tools//tools/python:toolchain.bzl': file doesn't exist or isn't a file
Is there a way to patch toolchain definitions into the WORKSPACE, or are toolchains with 0.5.4 impossible?

python setup.py install does not work with latest setuptools: no scripts in '<*>.egg-info'

Using setuptools==27.2.0, the Travis tests of our package picca (https://github.com/igmhub/picca) work well. This is no longer the case with the latest version, setuptools==41.0.0: https://github.com/igmhub/picca/issues/591 .
The issue seems to be linked to where setuptools tries to read the scripts.
I get the following error:
Traceback (most recent call last):
  File "/home/travis/virtualenv/python3.6.3/bin/picca_deltas.py", line 4, in <module>
    __import__('pkg_resources').run_script('picca==4.0', 'picca_deltas.py')
  File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/pkg_resources/__init__.py", line 666, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/pkg_resources/__init__.py", line 1437, in run_script
    .format(**locals()),
pkg_resources.ResolutionError: Script 'scripts/picca_deltas.py' not found in metadata at '/home/travis/build/igmhub/picca/py/picca.egg-info'
Looking at the path /home/travis/build/igmhub/picca/py/picca.egg-info/, there is indeed no scripts folder.
Our Python setup is the following; is there something we should change so that setuptools knows where to find the scripts?
#!/usr/bin/env python
import glob
from setuptools import setup

scripts = glob.glob('bin/*')
description = "Package for Igm Cosmological-Correlations Analyses"
version = "4.0"

setup(name="picca",
      version=version,
      description=description,
      url="https://github.com/igmhub/picca",
      author="<***>",
      author_email="<***>",
      packages=['picca', 'picca.fitter2'],
      package_dir={'': 'py'},
      package_data={'picca': ['fitter2/models/*/*.fits']},
      install_requires=['numpy', 'scipy', 'iminuit', 'healpy', 'fitsio',
                        'llvmlite', 'numba', 'h5py', 'future', 'setuptools'],
      test_suite='picca.test.test_cor',
      scripts=scripts
      )
The command /home/travis/virtualenv/python3.6.3/bin/picca_deltas.py looks like the following on my computer:
#!<where is python>/python/3.6.3/bin/python
# EASY-INSTALL-SCRIPT: 'picca==4.0','picca_deltas.py'
__requires__ = 'picca==4.0'
__import__('pkg_resources').run_script('picca==4.0', 'picca_deltas.py')
Thanks for the help.
This is a wild shot, but I just had the same issue.
Note that /home/travis/build/igmhub/picca/py/picca.egg-info is inside your build folder. pkg_resources should not be looking at the egg-info in the build folder, but at the one you installed.
Just change directory and you should be fine:
cd ..  # or cd anywhere outside your build folder
picca_deltas.py
It should then work fine.
I found a fix: I now run the tests using pytest instead of python setup.py test.
https://github.com/igmhub/picca/pull/698/files
The clue was in the following message:
WARNING: Testing via this command is deprecated and will be removed in a future version. Users looking for a generic test entry point independent of test runner are encouraged to use tox.

bazel can't build py_proto_library

My BUILD file is pretty simple, just
load("@protobuf_bzl//:protobuf.bzl", "py_proto_library")

py_proto_library(
    name = "struct_py_pb2",
    srcs = ["struct.proto"],
)
But bazel gives a bunch of baffling error messages like:
$ bazel build google/genomics/v1:all
ERROR: thomaswc//v1/BUILD:22:1: no such package '': BUILD file not found on package path and referenced by '//v1:struct_py_pb2'
ERROR: Analysis of target '//v1:struct_py_pb2' failed; build aborted: no such package '': BUILD file not found on package path
INFO: Elapsed time: 0.581s
FAILED: Build did NOT complete successfully (2 packages loaded)
currently loading: @protobuf_bzl//
I see other projects on github using bazel and py_proto_library, though, so I know it must be possible. Is there some WORKSPACE or .bzl magic that I need?
After a bunch of digging, I found a work-around: the default values of default_runtime and protoc are screwed up, so you need to override them:
py_proto_library(
    name = "struct_py_pb2",
    srcs = ["struct.proto"],
    default_runtime = "@com_google_protobuf//:protobuf_python",
    protoc = "@com_google_protobuf//:protoc",
)
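For those @com_google_protobuf labels to resolve, the WORKSPACE also needs a protobuf repository. A minimal sketch (the version pin is an assumption, use whatever your project already pins; on newer Bazel, load http_archive from @bazel_tools//tools/build_defs/repo:http.bzl first):

```python
# WORKSPACE sketch; defines @com_google_protobuf so the labels above resolve.
http_archive(
    name = "com_google_protobuf",
    strip_prefix = "protobuf-3.6.1",
    urls = ["https://github.com/protocolbuffers/protobuf/archive/v3.6.1.tar.gz"],
)
```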

Compile and use python-openzwave with open-zwave in non-standard location

I manually compiled python-openzwave to work with the C++ library.
I would like to use it as a Kodi addon (OpenELEC running on a Pi 3), so I cannot use a standard installation.
I've compiled everything, downloaded the missing six and louie libs, and now try to run hello_world.py.
My current dirs structure is the following:
- root
  - bin
  - .lib
    - config
    Alarm.o
    ...
    libopenzwave.a
    libopenzwave.so
    libopenzwave.so.1.4
    ...
  - libopenzwave
    driver.pxd
    group.pxd
    ...
  - louie
    __init__.py
    dispatcher.py
    ...
  - openzwave
    __init__.py
    command.py
    ...
  six.py
  hello_world.py
But when I run hello_world.py, I get the following error -
Traceback (most recent call last):
  File "hello_world.py", line 40, in <module>
    from openzwave.controller import ZWaveController
  File "/storage/.kodi/addons/service.multimedia.open-zwave/openzwave/controller.py", line 34, in <module>
    from libopenzwave import PyStatDriver, PyControllerState
ImportError: No module named libopenzwave
If I move libopenzwave.a and libopenzwave.so to root folder, then I get the following error:
Traceback (most recent call last):
  File "hello_world.py", line 40, in <module>
    from openzwave.controller import ZWaveController
  File "/storage/.kodi/addons/service.multimedia.open-zwave/openzwave/controller.py", line 34, in <module>
    from libopenzwave import PyStatDriver, PyControllerState
ImportError: dynamic module does not define init function (initlibopenzwave)
What is wrong with my setup?
In general, the steps required consist of calls to make build, which handles building the .cpp files for openzwave and downloading all dependencies (including Cython), and make install, which runs setup-api.py, setup-lib.py (this setup script also creates the C++ Python extension for openzwave), setup-web.py and setup-manager.py.
Since you cannot run make install as you specified and are instead using the archive they provide, the only other option for creating the Python extension, after building the openzwave library with make build, is generating the .so file for it without installing to standard locations.
Building the .so for the Cython extension in the same folder as the Cython scripts is done by running:
python setup.py build_ext --inplace
This should create a shared library in src-lib named libopenzwave.so (it is different from the libopenzwave.so contained in the bin/ directory) which contains all the functionality specified in the extension module. You could try adding that to the libopenzwave folder.
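If the import still fails after copying the .so, the interpreter may simply not have the right directory on its path. A minimal sketch of the sys.path fix, assuming the in-place build left libopenzwave.so in src-lib/ (the directory name is an assumption based on the python-openzwave layout):

```python
import os
import sys

# Directory assumed to hold the libopenzwave.so produced by
# `python setup.py build_ext --inplace`.
ext_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "src-lib")

# Put it at the front of the search path so the compiled extension is
# found before any folder of .pxd sources with the same name.
if ext_dir not in sys.path:
    sys.path.insert(0, ext_dir)

# After this, `from libopenzwave import PyStatDriver, PyControllerState`
# should resolve, provided the .so was built for this interpreter.
```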
If you pass special compiler flags during make build when building the openzwave library, you should specify the same ones when executing the setup-lib.py script. This can be done by setting CFLAGS before executing it (as specified here), or else you might run into issues like error adding symbols: File in wrong format.
Here's a description of the python-openzwave build from the question's perspective. Almost all of the steps correspond to targets in the root Makefile.
Prerequisites. There are several independent targets with little to no organization. Most use Debian-specific commands. Cython is not needed if building from an archive (details below).
openzwave C++ library (the openzwave and openzwave/.lib/ targets).
  - Build logic: openzwave/Makefile, invoked without parameters (but with an inherited environment).
  - Inputs: the openzwave/ subtree (includes libhidapi and libtinyxml, statically linked).
  - Outputs: openzwave/.lib/libopenzwave.{a,so}
  - Accepts PREFIX as an envvar (/usr/local by default). The only effect relevant here: $(PREFIX)/etc/openzwave/ is assigned to a macro which adds a search location for config files (Options.cpp): config/ -> /etc/openzwave/ -> <custom location>.
libopenzwave Python C extension module (the install-lib target; the stock Makefile cannot just build it, and the target doesn't even depend on the library).
  - Build logic: setup-lib.py
  - Inputs: src-lib/, openzwave/.lib/libopenzwave.a
  - Outputs: build/<...>/libopenzwave.so (the same name as openzwave's output, so avoid confusing them)
  - By default, openzwave is linked statically into the module, so you don't need to include the former in a deployment.
  - The module does, however, need the config folder from the library. It is included by the build script when making a package.
  - Contrary to what Jim says, Cython is not needed to build from an archive; the archive already includes the generated .cpp.
Now, the catch is: the module itself uses pkg_resources to locate its data. So you cannot just drop the .so and config into the current directory and call it a day. You need to make pkg_resources.get_distribution('libopenzwave') succeed.
pkg_resources claims to support "normal filesystem packages, .egg files, and unpacked .egg files."
In particular, I was able to pull this off: make an .egg (setup-lib.py bdist_egg), unpack it into the current directory and rename EGG-INFO to libopenzwave.egg-info (like it is in site-packages). A UserWarning is issued if I don't specifically add the .so's location to PYTHONPATH/sys.path before importing the module.
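The rename works because pkg_resources discovers distributions from <name>.egg-info directories sitting on its search path. A self-contained demonstration of that mechanism (the metadata values here are made up, not python-openzwave's real ones):

```python
import os
import sys
import tempfile

import pkg_resources

# Fabricate a minimal <name>.egg-info directory, like the renamed EGG-INFO.
root = tempfile.mkdtemp()
info_dir = os.path.join(root, "libopenzwave.egg-info")
os.makedirs(info_dir)
with open(os.path.join(info_dir, "PKG-INFO"), "w") as f:
    f.write("Metadata-Version: 1.0\nName: libopenzwave\nVersion: 0.3.0\n")

# Make the directory visible both to imports and to pkg_resources,
# which scans path entries as they are added to the working set.
sys.path.insert(0, root)
pkg_resources.working_set.add_entry(root)

dist = pkg_resources.get_distribution("libopenzwave")
print(dist.project_name, dist.version)
```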
openzwave, pyozwman and pyozwweb Python packages (the install target).
  - These are pure Python packages. The first one uses the C extension module; the others use the first one.
  - Build logic: setup-api.py, setup-manager.py, setup-web.py
  - Inputs: src-*/
  - Outputs: (pure Python)
  - They only use pkg_resources.declare_namespace(), so you'll be fine with just the proper files/dirs on sys.path, without any .egg-info's.

What index is buildout really using?

I'm trying to install couchdbkit using following buildout config:
[buildout]
parts = eggs
include-site-packages = false
versions = versions

[eggs]
recipe = zc.recipe.egg:eggs
eggs =
    couchdbkit

[versions]
couchdbkit = 0.6.3
It installs package successfully but I get numerous errors like this during setup on some machines:
Download error on http://hg.e-engura.org/couchdbkit/: [Errno -2] Name or service not known -- Some packages may not be found!
By default, buildout should find packages using this index. But I can't understand the source of this weird hostname. Nothing here points to that location.
How does it actually work?
The underlying setuptools code also scans for homepage and download links from the simple index and does this quite aggressively.
The couchdbkit setup.py file lists http://hg.e-engura.org/couchdbkit/ as the homepage, so all homepage links on the simple index link there.
You can prevent zc.buildout from trying to connect to that host by setting up a whitelist of hosts it can connect to:
[buildout]
# ...
allow-hosts =
    *.python.org
    *.google.com
    *.googlecode.com
    *.sourceforge.net
for example.
