I am trying to run:
sudo bin/buildout
And I get a message saying:
mr.developer: Queued 'django-accessibillity' for checkout.
mr.developer: Queued 'django-registration' for checkout.
mr.developer: Queued 'ivc-formutils' for checkout.
And then it asks me for a password for a repository. The problem is that this repository no longer exists, so I can't download files from there. I therefore got these modules from other sources and installed them, but when I run buildout, the installation still asks me for the same modules.
Is it somehow possible to get buildout to recognize that I already have these modules, or is there some other workaround?
mr.developer will, by default, update your repositories when buildout runs. You can disable this with the auto-checkout option in the [buildout] section:
[buildout]
extensions = mr.developer
# ...
auto-checkout = false
However, if you have a checkout that is no longer present in the repository, convert your sources to the fs type instead of git or svn or whatever repository type they were before.
Look for the [sources] section (unless a sources key is set in the [buildout] section, in which case that'll name the right section). It'll have entries like:
[sources]
django-accessibillity = git https://some.gitserver.com/some/repository.git
django-registration = svn https://some.svnserver.com/some/svn/repo/trunk
Change these to use fs <name-of-package> instead:
[sources]
django-accessibillity = fs django-accessibillity
django-registration = fs django-registration
Related
I saw this nice explanation video (link) about packaging using pip, and I have two questions:
The first one is:
I wrote some code that I want to share with my colleagues, but I do not aim to share it via PyPI. I want to share it internally, so everyone can install it within his/her environment.
I actually don't need to create a wheel file with python setup.py bdist_wheel, right? I create the setup.py file and can install it with pip install -e . (for editable use), and everyone else can do so as well after cloning the repository. Is this right?
My second question is more technical:
I create the setup.py file:
from setuptools import setup

setup(
    name='helloandbyemate',
    version='0.0.1',
    description="Say hello in slang",
    py_modules=['hellomate'],
    package_dir={"": "src"},
)
To test it, I write a file hellomate.py which contains a function that prints hello, mate!. I put this file in src/. In the setup.py file I put only this module in the py_modules list. In src/ there is another module called byemate.py. When I install the package, byemate.py is installed as well, although I only put hellomate in py_modules. Does anyone have an explanation for this behaviour?
I actually don't need to create a wheel file ... everyone else can do so as well, after cloning the repository. Is this right?
This is correct. However, the installation from source is slower, so you may want to publish wheels to an index anyway if you would like faster installs.
When I install the whole module, it installs the module byemate.py as well, although I only put hellomate in the list of py_modules. Does anyone have an explanation for this behaviour?
Yes, this is an artifact of the "editable" installation mode. It works by putting the src directory onto the sys.path, via a line in the path configuration file .../lib/pythonX.Y/site-packages/easy-install.pth. This means that the entire source directory is exposed and everything in there is available to import, whether it is packaged up into a release file or not.
The benefit is that the source code is "editable" without reinstallation (adding/removing/modifying files in src will be reflected in the package immediately).
The drawback is that an editable installation is not exactly the same as a "real" installation, where only the files specified in setup.py are copied into site-packages directly.
If you don't want other files such as byemate.py to be available to import, use a regular install pip install . without the -e option. However, local changes to hellomate.py won't be reflected until the installation step is repeated.
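The path-configuration mechanism is easy to reproduce by hand. A minimal sketch (module names hypothetical, mirroring the question's layout): once a directory is on sys.path, every module in it becomes importable, whether it was declared in py_modules or not.

```python
import os
import sys
import tempfile

# Simulate an "src" layout: two modules live in the directory, but only
# one of them would be declared in py_modules.
src = tempfile.mkdtemp()
with open(os.path.join(src, "hellomate.py"), "w") as f:
    f.write("def hello():\n    return 'hello, mate!'\n")
with open(os.path.join(src, "byemate.py"), "w") as f:
    f.write("def bye():\n    return 'bye, mate!'\n")

# This is effectively what the .pth line written by the editable
# install does: it puts the whole src directory on sys.path.
sys.path.insert(0, src)

import hellomate
import byemate  # importable too, despite never being "packaged"

print(hellomate.hello())
print(byemate.bye())
```

Nothing in this mechanism consults the py_modules list, which is why byemate shows up in an editable install.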
Strict editable installs
It is possible to get a mode of installation where byemate.py is not exposed at all, but live modifications to hellomate.py are still possible. This is the "strict" editable mode of setuptools. However, it is not possible using setup.py, you have to use a modern build system declaration in pyproject.toml:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "helloandbyemate"
version = "0.0.1"
description = "Say hello in slang"

[tool.setuptools]
py-modules = ["hellomate"]
include-package-data = false

[tool.setuptools.package-dir]
"" = "src"
Now you can perform a strict install with:
pip install -e . --config-settings editable_mode=strict
I'm trying to install a python package from the private reportlab pypi server using zc.buildout.
When I install using the instructions provided on their own site, it installs without problem: http://www.reportlab.com/reportlabplus/installation/
If however I install using zc.buildout, I keep getting Couldn't find distributions for 'rlextra'. I added their pypi repo to find-links, so I'm not sure what I'm missing.
My buildout config:
[buildout]
versions = versions
include-site-packages = false
extensions = mr.developer
unzip = true
find-links = https://[user]:[pass]@www.reportlab.com/pypi
parts =
    python
    django
    compass-config
auto-checkout = *
eggs =
    ...
    rlextra
    ...
... etc.
Edit: I should point out that I did in the end do a manual download of the package, and using it in my buildout as a develop package. Even though this solves the immediate issue, I would still like to know why my original setup is not working.
You are passing in the PyPI main link for the find-links URL, but find-links only works with the simple index style pages (which exist per package on PyPI).
For example, the beautifulsoup4 package has a simple index page at https://pypi.python.org/simple/beautifulsoup4/.
The ReportLab server also has simple pages; add the one for this package to your buildout:
find-links = https://[user]:[pass]@www.reportlab.com/pypi/simple/rlextra/
IIRC you can also add the top-level https://[user]:[pass]@www.reportlab.com/pypi/simple URL as a find-links entry, but being more specific saves on URL round-trips.
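To see what "simple index style" means, here is an illustrative sketch (the page contents below are made up, not fetched from the real server): a simple page is just plain HTML whose anchor tags point directly at distribution files, which is exactly what find-links consumers scrape.

```python
from html.parser import HTMLParser

# A fabricated example of a per-package simple index page.
SIMPLE_PAGE = """
<html><body>
<a href="rlextra-3.0.0.tar.gz">rlextra-3.0.0.tar.gz</a>
<a href="rlextra-3.1.0.tar.gz">rlextra-3.1.0.tar.gz</a>
</body></html>
"""

class LinkCollector(HTMLParser):
    """Collect href targets, roughly what a find-links scraper does."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

parser = LinkCollector()
parser.feed(SIMPLE_PAGE)
print(parser.links)
```

A normal PyPI project page contains no such direct file links, which is why pointing find-links at it finds no distributions.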
I would like certain user-defined files, such as cfg.ini, under package_data to be installed with write permissions. But by default write permissions are disabled, because I installed my package with sudo:
sudo python setup.py install
By the way, I am using setuptools to create my setup.py.
Any suggestions, please?
There are two solutions to this.
Install the package for the current user only (instead of system-wide) by using, without sudo:
python setup.py install --user
This would allow the user to edit the custom files.
As an alternative to having user-defined files under package_data, you could simply have user-specific files generated in user-specific directories, which your module reads and the user can edit. On Windows these would be stored under the user's AppData directory.
The appdirs package provides easy access to user directories for storing application data (it is MIT licensed).
To install appdirs from PyPI:
sudo pip install appdirs
Or you can download the source, and then install it with:
sudo python setup.py install
Here's an example:
>>> import appdirs
>>> user_dir = appdirs.user_data_dir("YourAppOrModule", "YourName")
>>> print(user_dir)
C:\Users\TheUser\AppData\Local\YourName\YourAppOrModule
And then you can generate the user files under user_dir and allow the user to edit them from there.
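If you would rather avoid the extra dependency, the same idea can be sketched with the standard library alone (the directory layout, app name, and default contents below are illustrative assumptions): generate a default cfg.ini in a per-user directory on first run, so the file is owned and writable by the user rather than root.

```python
import configparser
import os
import sys

def user_config_path(app_name):
    """Return a per-user path for cfg.ini (illustrative conventions)."""
    if sys.platform == "win32":
        base = os.environ.get("APPDATA", os.path.expanduser("~"))
    else:
        base = os.environ.get("XDG_CONFIG_HOME",
                              os.path.expanduser("~/.config"))
    return os.path.join(base, app_name, "cfg.ini")

def load_config(app_name):
    path = user_config_path(app_name)
    config = configparser.ConfigParser()
    if not os.path.exists(path):
        # First run: write editable defaults owned by the current user.
        os.makedirs(os.path.dirname(path), exist_ok=True)
        config["general"] = {"greeting": "hello"}
        with open(path, "w") as f:
            config.write(f)
    config.read(path)
    return config

cfg = load_config("YourAppOrModule")
print(cfg["general"]["greeting"])
```

Because the file is created under the user's home directory, no sudo is involved and the permissions problem never arises.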
The Twisted Plugin System is the preferred way to write extensible twisted applications.
However, due to the way the plugin system is structured (plugins go into a twisted/plugins directory which should not be a Python package), writing a proper setup.py for installing those plugins appears to be non-trivial.
I've seen some attempts that add 'twisted.plugins' to the 'packages' key of the distutils setup command, but since it is not really a package, bad things happen (for example, an __init__.py is helpfully added by some tools).
Other attempts seem to use 'package_data' instead (eg, http://bazaar.launchpad.net/~glyph/divmod.org/trunk/view/head:/Epsilon/epsilon/setuphelper.py), but that can also fail in weird ways.
The question is: has anyone successfully written a setup.py for installing twisted plugins which works in all cases?
I document a setup.py below that is needed only if you have users with pip < 1.2 (e.g. on Ubuntu 12.04). If everyone has pip 1.2 or newer, the only thing you need is packages=[..., 'twisted.plugins'].
By preventing pip from writing the line "twisted" to .egg-info/top_level.txt, you can keep using packages=[..., 'twisted.plugins'] and have a working pip uninstall that doesn't remove all of twisted/. This involves monkeypatching setuptools/distribute near the top of your setup.py. Here is a sample setup.py:
from distutils.core import setup
# When pip installs anything from packages, py_modules, or ext_modules that
# includes a twistd plugin (which are installed to twisted/plugins/),
# setuptools/distribute writes a Package.egg-info/top_level.txt that includes
# "twisted". If you later uninstall Package with `pip uninstall Package`,
# pip <1.2 removes all of twisted/ instead of just Package's twistd plugins.
# See https://github.com/pypa/pip/issues/355 (now fixed)
#
# To work around this problem, we monkeypatch
# setuptools.command.egg_info.write_toplevel_names to not write the line
# "twisted". This fixes the behavior of `pip uninstall Package`. Note that
# even with this workaround, `pip uninstall Package` still correctly uninstalls
# Package's twistd plugins from twisted/plugins/, since pip also uses
# Package.egg-info/installed-files.txt to determine what to uninstall,
# and the paths to the plugin files are indeed listed in installed-files.txt.
try:
    from setuptools.command import egg_info
    egg_info.write_toplevel_names
except (ImportError, AttributeError):
    pass
else:
    def _top_level_package(name):
        return name.split('.', 1)[0]

    def _hacked_write_toplevel_names(cmd, basename, filename):
        pkgs = dict.fromkeys(
            [_top_level_package(k)
             for k in cmd.distribution.iter_distribution_names()
             if _top_level_package(k) != "twisted"]
        )
        cmd.write_file("top-level names", filename, '\n'.join(pkgs) + '\n')

    egg_info.write_toplevel_names = _hacked_write_toplevel_names
setup(
    name='MyPackage',
    version='1.0',
    description="You can do anything with MyPackage, anything at all.",
    url="http://example.com/",
    author="John Doe",
    author_email="jdoe@example.com",
    packages=['mypackage', 'twisted.plugins'],
    # You may want more options here, including install_requires=,
    # package_data=, and classifiers=
)
# Make Twisted regenerate the dropin.cache, if possible. This is necessary
# because in a site-wide install, dropin.cache cannot be rewritten by
# normal users.
try:
    from twisted.plugin import IPlugin, getPlugins
except ImportError:
    pass
else:
    list(getPlugins(IPlugin))
I've tested this with pip install, pip install --user, and easy_install. With any install method, the above monkeypatch and pip uninstall work fine.
You might be wondering: do I need to clear the monkeypatch to avoid messing up the next install? (e.g. pip install --no-deps MyPackage Twisted; you wouldn't want to affect Twisted's top_level.txt.) The answer is no; the monkeypatch does not affect another install because pip spawns a new python for each install.
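The filtering the monkeypatch performs can be seen in isolation with a small sketch (the distribution names below are made up): each dotted name is reduced to its top-level package, and "twisted" is dropped so it never lands in top_level.txt.

```python
def top_level_package(name):
    # Same logic as _top_level_package in the setup.py above.
    return name.split('.', 1)[0]

# Hypothetical names as iter_distribution_names() might yield them.
dist_names = ['mypackage', 'mypackage.util', 'twisted.plugins']

pkgs = sorted({top_level_package(n) for n in dist_names
               if top_level_package(n) != 'twisted'})
print(pkgs)  # only 'mypackage' survives
```

With "twisted" excluded from the top-level names, pip uninstall no longer considers the whole twisted/ tree to belong to your package.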
Related: keep in mind that in your project, you must not have a file twisted/plugins/__init__.py. If you see this warning during installation:
package init file 'twisted/plugins/__init__.py' not found (or not a regular file)
it is completely normal and you should not try to fix it by adding an __init__.py.
Here is a blog entry which describes doing it with 'package_data':
http://chrismiles.livejournal.com/23399.html
In what weird ways can that fail? It could fail if the installation of the package doesn't put the package data into a directory which is on the sys.path. In that case the Twisted plugin loader wouldn't find it. However, all installations of Python packages that I know of will put it into the same directory where they are installing the Python modules or packages themselves, so that won't be a problem.
Maybe you could adapt the package_data idea to use data_files instead: it wouldn’t require you to list twisted.plugins as package, as it uses absolute paths. It would still be a kludge, though.
My tests with pure distutils have told me that it is possible to overwrite files from another distribution. I wanted to test poor man's namespace packages using pkgutil.extend_path and distutils, and it turns out that I can install spam/ham/__init__.py with spam.ham/setup.py and spam/eggs/__init__.py with spam.eggs/setup.py. Directories are not a problem, but files will be happily overwritten. I think this is actually undefined behavior in distutils which trickles up to setuptools and pip, so pip could IMO close this as wontfix.
What is the usual way to install Twisted plugins? Drop-it-here by hand?
I use this approach:
Put '.py' and '.pyc' versions of your file into the "twisted/plugins/" folder inside your package.
Note that the '.pyc' file can be empty; it just has to exist.
In setup.py, specify copying both files to the library folder (make sure that you do not overwrite existing plugins!). For example:
# setup.py
import os

from distutils import sysconfig
from distutils.core import setup

LIB_PATH = sysconfig.get_python_lib()

# ...

plugin_name = '<your_package>/twisted/plugins/<plugin_name>'

# The '.pyc' extension is necessary for correct plugin removal
data_files = [
    (os.path.join(LIB_PATH, 'twisted', 'plugins'),
     [''.join((plugin_name, extension)) for extension in ('.py', '.pyc')])
]

setup(
    # ...
    data_files=data_files,
)
I'm trying to deploy OpenERP with a buildout and my own piece of code. In fact I would like to build a complete deployment structure allowing me to use OpenERP with custom modules and patches.
First of all, before adding any personal configuration, I was trying to create a buildout which has the responsibility of configuring everything.
Buildout Configuration
My buildout.cfg configuration file looks like this:
[buildout]
parts = eggs
versions = versions
newest = false
extensions = lovely.buildouthttp
unzip = true
find-links =
    http://download.gna.org/pychart/

[versions]

[eggs]
recipe = zc.recipe.egg
interpreter = python
eggs =
    Paste
    PasteScript
    PasteDeploy
    psycopg2
    PyChart
    pydot
    openerp-server
Configuration problem
But when trying to launch the buildout I have a couples of errors when trying to install the last needed egg (openerp-server)
On my side it just cannot find these modules, but they are in my eggs dir:
Error: python module psycopg2 (PostgreSQL module) is required
Error: python module libxslt (libxslt python bindings) is required
Error: python module pychart (pychart module) is required
Error: python module pydot (pydot module) is required
error: Setup script exited with 1
An error occured when trying to install openerp-server 5.0.0-3. Look above this message for any errors that were output by easy_install.
Is it possible that openerp has hardcoded its search path somewhere?
easy_install, a try
I decided to give a try to a clean virtualenv without any relation to the main site-package. But when using easy_install on openerp-server:
$ source openerp-python/bin/activate
$ easy_install openerp-server
...
File "build/bdist.linux-i686/egg/pkg_resources.py", line 887, in extraction_error
pkg_resources.ExtractionError: Can't extract file(s) to egg cache
The following error occurred while trying to extract file(s) to the Python egg
cache:
SandboxViolation: mkdir('/home/mlhamel/.python-eggs/psycopg2-2.0.13-py2.5-linux-x86_64.egg-tmp', 511) {}
I always get this error message, whether psycopg2 is installed on my machine or not.
System's Configuration
Ubuntu 9.10 x86-64
Tried on Python 2.5/Python 2.6
Ok I did this recently:
Don't try to install the egg, openerp is not really standard.
I used this buildout snippet:
# get the openerp-stuff as a distutils package
[openerp-server]
recipe = zerokspot.recipe.distutils
urls = http://www.openerp.com/download/stable/source/openerp-server-5.0.6.tar.gz
# similar idea for the web component
[openerp-web]
recipe = zc.recipe.egg:scripts
find-links = http://www.openerp.com/download/stable/source/openerp-web-5.0.6.tar.gz
# add some symlinks so you can run it out of bin
[server-symlinks]
recipe = cns.recipe.symlink
symlink = ${buildout:parts-directory}/openerp-server/bin/openerp-server = ${buildout:bin-directory}
The key however, is that I did not use virtualenv. You don't need to with buildout. Buildout + virtualenv is like Trojan + Ramses... one is enough, unless you are ... well one is enough. ;)
Now for this particular project I had followed the Debian instructions and installed the required libs via aptitude. This was only because I was new to buildout at the time; one could just as easily install the psycopg2 module through buildout.
Here are some excellent instructions. Ignore the django stuff if you don't need it. Dan Fairs is both a great writer and great buildout tutor. Check it out. Disclaimer: I am a disciple of the man, based on his buildout usage.
I am certain you do not want to use the egg on pypi, it never worked for me, openerp is not eggified, it's a distutils package.
Good luck!
Just for the record: there is a buildout recipe for OpenERP available on PyPI.
I'm not familiar with buildout, but if I were going to try building an OpenERP installer, I'd start by looking at the nice one from Open Source Consulting. I've used it and been pretty happy with it.
Last time I checked, it doesn't set up the CRM e-mail gateway, but everything else I need was covered.