Installing a package from private pypi in zc.buildout - python

I'm trying to install a Python package from the private ReportLab PyPI server using zc.buildout.
When I install following the instructions provided on their own site, it installs without a problem: http://www.reportlab.com/reportlabplus/installation/
If, however, I install using zc.buildout, I keep getting Couldn't find distributions for 'rlextra'. I added their PyPI repo to find-links, so I'm not sure what I'm missing.
My buildout config:
[buildout]
versions = versions
include-site-packages = false
extensions = mr.developer
unzip = true
find-links = https://[user]:[pass]@www.reportlab.com/pypi
parts =
    python
    django
    compass-config
auto-checkout = *
eggs =
    ...
    rlextra
    ...
... etc.
Edit: I should point out that I did in the end do a manual download of the package, and using it in my buildout as a develop package. Even though this solves the immediate issue, I would still like to know why my original setup is not working.

You are passing in the PyPI main link for the find-links URL, but find-links only works with the simple index style pages (which exist per package on PyPI).
For example, the beautifulsoup4 package has a simple index page at https://pypi.python.org/simple/beautifulsoup4/.
The ReportLab server also has simple pages; add the one for this package to your buildout:
find-links = https://[user]:[pass]@www.reportlab.com/pypi/simple/rlextra/
IIRC you can also add the top-level https://[user]:[pass]@www.reportlab.com/pypi/simple URL as a find-links entry, but being more specific saves a few URL round-trips.
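Note that per-package simple index pages are addressed by the PEP 503 normalized form of the project name (lowercased, with runs of "-", "_", and "." collapsed to a single "-"). A minimal sketch of how such a URL is built, assuming a simple-index base URL like the ones above:

```python
import re

def pep503_name(name):
    # PEP 503: lowercase, collapse runs of "-", "_", "." into a single "-"
    return re.sub(r"[-_.]+", "-", name).lower()

def simple_index_url(base, name):
    # joins a simple-index base URL with the normalized project name
    return "{}/{}/".format(base.rstrip("/"), pep503_name(name))
```

For example, simple_index_url("https://pypi.org/simple", "collective.addthis") gives https://pypi.org/simple/collective-addthis/, which is the page find-links actually needs.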

Related

How to make setup.py's install_requires section skip already installed dependencies

I am trying to create a python package that depends on another package hosted in a private repo. The setup.py looks like this:
setup(
    name="parasol",
    version="2.18.1",
    ...
    install_requires=[
        ...
        "pyGSO @ git+https://dev.azure.com/.../pyGSO@Main",
    ],
)
This works fine. Unfortunately, I now need to install it in an environment which has no access to the private repo. To work around that I am installing pyGSO manually in a separate build step, and then running this. But of course, as soon as it gets to the pyGSO requirement here it fails since it has no access to check the repo. What can I do?
Ideas I had, if anyone knows how to implement them:
Add a minimum version indicator to the requirement, so that if I manually install a newer version it won't try to access the unavailable repo
Somehow have it skip this dependency if it already is installed
You can tell setup.py to not add the dependency with an environment variable:
import os

NO_PYGSO = os.getenv("PARASOL_NO_PYGSO")
dependencies = [...]
if NO_PYGSO is None:
    dependencies.append("pyGSO @ git+https://dev.azure.com/.../pyGSO@Main")

setup(
    name="parasol",
    version="2.18.1",
    ...
    install_requires=dependencies,
)
Then before installing in your environment without access to the repo, you do export PARASOL_NO_PYGSO=true (or equivalent for your shell, in case it's not bash).
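The conditional is easy to factor out and check in isolation. A minimal sketch of the same pattern — the PARASOL_NO_PYGSO name comes from the answer, while the dependency strings here are placeholders:

```python
import os

def build_requires(environ=None):
    # reads the opt-out flag from the environment, as in the setup.py above
    environ = os.environ if environ is None else environ
    dependencies = ["requests"]  # placeholder for the public dependencies
    if environ.get("PARASOL_NO_PYGSO") is None:
        # placeholder for the private-repo requirement
        dependencies.append("pyGSO")
    return dependencies
```

With this, build_requires({}) includes the private requirement, while build_requires({"PARASOL_NO_PYGSO": "true"}) omits it.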

buildout can't find pip packages for Plone

I'm trying to add add-ons to Plone through buildout but it can't find the packages. I've tried it in a virtualenv and with the system-wide Python.
I followed the setup instructions on the Plone site.
Every add-on I try brings up the following error
Installing instance.
/home/a/Plone/zinstance/local/lib/python2.7/site-packages/pkg_resources/__init__.py:192: RuntimeWarning: You have iterated over the result of pkg_resources.parse_version. This is a legacy behavior which is inconsistent with the new version class introduced in setuptools 8.0. In most cases, conversion to a tuple is unnecessary. For comparison of versions, sort the Version instances directly. If you have another use case requiring the tuple, please file a bug with the setuptools project describing that need.
stacklevel=1,
Couldn't find index page for 'collective.addthis' (maybe misspelled?)
Getting distribution for 'collective.addthis'.
Couldn't find index page for 'collective.addthis' (maybe misspelled?)
While:
Installing instance.
Getting distribution for 'collective.addthis'.
Error: Couldn't find a distribution for 'collective.addthis'.
I add the packages to the buildout.cfg:
eggs =
    Plone
    Pillow
    collective.addthis
Plone and Pillow build fine but every add-on I try brings up the same error.
There have been a lot of similar problems reported on https://community.plone.org/
The problem most probably is the migration from pypi.python.org to pypi.org.
You can add:
index = https://pypi.org/simple/
and if you are using allow-hosts you need two new entries and can skip *.python.org:
allow-hosts =
    pypi.org
    files.pythonhosted.org
Alternatively, you can use current versions of setuptools and zc.buildout.
Attention: for setuptools > 38.7.0 you need to pin
plone.recipe.zope2instance = 4.4.0
(see https://github.com/plone/plone.recipe.zope2instance/blob/4.4.0/CHANGES.rst)
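Buildout's cfg syntax is close enough to INI that you can sanity-check the resulting section programmatically; a small sketch, assuming a [buildout] section with the options discussed:

```python
from configparser import ConfigParser

cfg = ConfigParser()
cfg.read_string("""
[buildout]
index = https://pypi.org/simple/
allow-hosts =
    pypi.org
    files.pythonhosted.org
""")

# multi-line buildout values come back whitespace-separated
hosts = cfg.get("buildout", "allow-hosts").split()
index = cfg.get("buildout", "index")
```

This is only a quick check that both new hosts and the pypi.org index made it into the config.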

buildout cfg requires modules that are already installed

I am trying to run:
sudo bin/buildout
And I get a message saying:
mr.developer: Queued 'django-accessibillity' for checkout.
mr.developer: Queued 'django-registration' for checkout.
mr.developer: Queued 'ivc-formutils' for checkout.
And it asks me for a password for a repository. The problem is that this repository no longer exists and I can't download files from there. I therefore got these modules from other sources and installed them. But when I run buildout, the installation still asks for the same modules.
Is it somehow possible to get buildout to recognize that I already have these modules, or is there some other workaround?
mr.developer will, by default, update your repositories when buildout runs. You can disable this by leaving the auto-checkout option in the [buildout] section empty:
[buildout]
extensions = mr.developer
# ...
auto-checkout =
However, if you have a checkout whose repository is no longer available, convert your sources to the fs type instead of git or svn or whatever repository type they were before.
Look for the [sources] section (unless a sources key is set in the [buildout] section, in which case that'll name the right section). It'll have entries like:
[sources]
django-accessibillity = git https://some.gitserver.com/some/repository.git
django-registration = svn https://some.svnserver.com/some/svn/repo/trunk
Change these to use fs <name-of-package> instead:
[sources]
django-accessibillity = fs django-accessibillity
django-registration = fs django-registration
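If you have many entries, the rewrite can be scripted; a sketch using ConfigParser (a [sources] section like the one above parses fine as INI), with the repository URLs here being the placeholder ones from the answer:

```python
from configparser import ConfigParser

cfg = ConfigParser()
cfg.read_string("""
[sources]
django-accessibillity = git https://some.gitserver.com/some/repository.git
django-registration = svn https://some.svnserver.com/some/svn/repo/trunk
""")

# replace every git/svn source with an fs entry named after the package
for name, value in cfg.items("sources"):
    if value.split()[0] in ("git", "svn"):
        cfg.set("sources", name, "fs " + name)
```

After running this, each entry reads "fs <name-of-package>", matching the hand-edited version above; write the parser back out to buildout.cfg to persist it.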

Python package dependency tree

I would like to analyze the dependency tree of Python packages. How can I obtain this data?
Things I already know
setup.py sometimes contains a requires field that lists package dependencies
PyPi is an online repository of Python packages
PyPi has an API
Things that I don't know
Very few projects (around 10%) on PyPi explicitly list dependencies in the requires field but pip/easy_install still manage to download the correct packages. What am I missing? For example the popular library for statistical computing, pandas, doesn't list requires but still manages to install numpy, pytz, etc.... Is there a better way to automatically collect the full list of dependencies?
Is there a pre-existing database somewhere? Am I repeating existing work?
Do similar, easily accessible, databases exist for other languages with distribution systems (R, Clojure, etc...?)
You should be looking at the install_requires field instead; see New and changed setup keywords.
requires is deemed too vague a field to rely on for dependency installation. In addition, there are setup_requires and tests_require keywords for dependencies needed to run setup.py itself and to run the test suite.
Certainly, the dependency graph has been analyzed before; from this blog article by Olivier Girardot comes this fantastic image:
The image is linked to the interactive version of the graph.
Using a tool like pip, you can list all requirements for each package.
The command is:
pip install --no-install package_name
You can reuse part of pip in your script; the part responsible for parsing requirements is the pip.req module.
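A caveat on the above: pip.req is a pip internal that has moved in later releases, and the --no-install flag was removed from modern pip. On Python 3.8+ the standard library can report a package's declared dependencies directly; a sketch using importlib.metadata:

```python
from importlib import metadata

def declared_deps(package_name):
    # Requires-Dist entries from the installed package's metadata,
    # or an empty list if the package declares none or is not installed
    try:
        return metadata.requires(package_name) or []
    except metadata.PackageNotFoundError:
        return []
```

declared_deps returns PEP 508 requirement strings for an installed distribution, which is the same data the JSON API exposes as requires_dist.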
Here is how you can do it programmatically using the pip package:
from pip._vendor import pkg_resources  # ensure pip's configured index-url points at the real PyPI index

# Get dependencies from pip
package_name = 'Django'
try:
    # raises KeyError if the package is not installed
    package_resources = pkg_resources.working_set.by_key[package_name.lower()]
    dependencies = list(package_resources._dep_map.keys()) + [str(r) for r in package_resources.requires()]
    dependencies = list(set(dependencies))
except KeyError:
    dependencies = []
And here is how you can get dependencies from the PyPI API:
import requests

package_name = 'Django'
# Package info URL
PYPI_API_URL = 'https://pypi.python.org/pypi/{package_name}/json'
package_details_url = PYPI_API_URL.format(package_name=package_name)
response = requests.get(package_details_url)
dependencies = []
if response.status_code == 200:
    data = response.json()
    # requires_dist is the canonical field; the others only appear
    # in the metadata of some older uploads
    for field in ('requires_dist', 'requires', 'setup_requires',
                  'test_requires', 'install_requires'):
        extra = data['info'].get(field)
        if extra:
            dependencies.extend(extra)
    dependencies = list(set(dependencies))
You can use recursion to call dependencies of dependencies to get the full tree. Cheers!
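That recursive walk can be sketched independently of where the dependency data comes from; the mapping below is hypothetical sample data standing in for either of the lookups above:

```python
def dep_tree(package, dep_map, seen=None):
    # walks a {name: [direct deps]} mapping, guarding against cycles
    if seen is None:
        seen = set()
    if package in seen:
        return {}
    seen.add(package)
    return {dep: dep_tree(dep, dep_map, seen)
            for dep in dep_map.get(package, [])}

# hypothetical dependency data for illustration
sample = {"pandas": ["numpy", "pytz"], "pytz": []}
```

dep_tree("pandas", sample) produces a nested dict, one level per dependency; the seen set stops the recursion when two packages depend on each other.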

Problem installing OpenERP server with buildout!

I'm trying to deploy OpenERP with a buildout and my own piece of code. In fact I would like to build a complete deployment structure allowing me to use OpenERP with custom modules and patches.
First of all, before adding any personal configuration, I was trying to create a buildout which will have the responsibility of configuring everything.
Buildout Configuration
My buildout.cfg configuration file looks like this:
[buildout]
parts = eggs
versions = versions
newest = false
extensions = lovely.buildouthttp
unzip = true
find-links =
    http://download.gna.org/pychart/

[versions]

[eggs]
recipe = zc.recipe.egg
interpreter = python
eggs =
    Paste
    PasteScript
    PasteDeploy
    psycopg2
    PyChart
    pydot
    openerp-server
Configuration problem
But when trying to launch the buildout I get a couple of errors while installing the last needed egg (openerp-server).
On my side it just cannot find these modules, even though they are in my eggs dir:
Error: python module psycopg2 (PostgreSQL module) is required
Error: python module libxslt (libxslt python bindings) is required
Error: python module pychart (pychart module) is required
Error: python module pydot (pydot module) is required
error: Setup script exited with 1
An error occured when trying to install openerp-server 5.0.0-3. Look above this message for any errors that were output by easy_install.
Is it possible that openerp hardcoded its search path somewhere?
Giving easy_install a try
I decided to give a try to a clean virtualenv without any relation to the main site-package. But when using easy_install on openerp-server:
$ source openerp-python/bin/activate
$ easy_install openerp-server
...
File "build/bdist.linux-i686/egg/pkg_resources.py", line 887, in extraction_error
pkg_resources.ExtractionError: Can't extract file(s) to egg cache
The following error occurred while trying to extract file(s) to the Python egg
cache:
SandboxViolation: mkdir('/home/mlhamel/.python-eggs/psycopg2-2.0.13-py2.5-linux-x86_64.egg-tmp', 511) {}
I always get this error message whether psycopg2 is installed on my machine or not.
System's Configuration
Ubuntu 9.10 x86-64
Tried on Python 2.5/Python 2.6
Ok I did this recently:
Don't try to install the egg; openerp is not really standard.
I used this buildout snippet:
# get the openerp-stuff as a distutils package
[openerp-server]
recipe = zerokspot.recipe.distutils
urls = http://www.openerp.com/download/stable/source/openerp-server-5.0.6.tar.gz
# similar idea for the web component
[openerp-web]
recipe = zc.recipe.egg:scripts
find-links = http://www.openerp.com/download/stable/source/openerp-web-5.0.6.tar.gz
# add some symlinks so you can run it out of bin
[server-symlinks]
recipe = cns.recipe.symlink
symlink = ${buildout:parts-directory}/openerp-server/bin/openerp-server = ${buildout:bin-directory}
The key, however, is that I did not use virtualenv. You don't need to with buildout. Buildout + virtualenv is like Trojan + Ramses... one is enough, unless you are ... well one is enough. ;)
Now for this particular project I had followed the Debian instructions and installed the required libs via aptitude. This was only because I was new to buildout at the time; one could just as easily install the psycopg2 module through buildout.
Here are some excellent instructions. Ignore the django stuff if you don't need it. Dan Fairs is both a great writer and great buildout tutor. Check it out. Disclaimer: I am a disciple of the man, based on his buildout usage.
I am certain you do not want to use the egg on pypi, it never worked for me, openerp is not eggified, it's a distutils package.
Good luck!
Just for the record: there is a buildout recipe for OpenERP available on PyPI.
I'm not familiar with buildout, but if I were going to try building an OpenERP installer, I'd start by looking at the nice one from Open Source Consulting. I've used it and been pretty happy with it.
Last time I checked, it doesn't set up the CRM e-mail gateway, but everything else I need was covered.
