There don't seem to be any instructions on how to build this sucker. I downloaded it from http://benjamin.smedbergs.us/pymake/
The usual files are not in the top directory, just make.py and mkparse.py, and neither of them seems to do much. It seems to need a makefile, but there isn't one in any part of the distro.
> python make.py build
make.py[0]: Entering directory '/Users/ron/lib/pymake-default'
No makefile found
Any hints?
pymake is a make utility, and running make.py looks for a Makefile (that you've created, for your own project). There's no build step specifically required for pymake itself.
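For example, a trivial Makefile is enough to see it work (the contents here are just an illustration; note that the echo line must be indented with a real tab):

all:
	echo hello from pymake

> python make.py

Also note that in your original command, 'build' was being treated as the name of a target to look up in a Makefile, not as an instruction to build pymake itself.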
Related
Imagine that we are given the finished C++ source code of a library called MyAwesomeLib. The goal is to expose some of its power to Python, so we create a wrapper using SWIG and generate a Python package called PyMyAwesomeLib.
The directory structure now looks like
root_dir
|-src/
|-lib/
| |- libMyAwesomeLib.so
| |- _PyMyAwesomeLib.so
|-swig/
| |- PyMyAwesomeLib.py
|-python/
| |- Script_using_myawesomelib.py
So far so good. Ideally, all we want to do next is copy lib/*.so, swig/*.py, and python/*.py into the corresponding directories in site-packages in a Pythonic way, i.e. using
python setup.py install
However, I got very confused when trying to achieve this simple goal with setuptools and distutils. Both tools handle the compilation of Python extensions through an internal system, where the source files, compiler flags, etc. are passed using setup(ext_modules=[Extension(...)]). But this is ridiculous, since MyAwesomeLib has a fully functioning build system based on makefiles. Porting the logic embedded in the makefiles would be redundant and completely unnecessary work.
After some research, it seems there are two options left: I can either override setuptools.command.build and setuptools.command.install to use the existing makefile and copy the results directly (sketched after the list below), or I can somehow let setuptools know about these files and ask it to copy them during installation. The second way is more appealing, but it is what gives me the most headache. I have tried the following options without success:
package_data and include_package_data do not work, because the *.so files are not under version control and are not inside any package.
data_files does not seem to work, since the files only get included when running python setup.py sdist but are ignored by python setup.py install. This is the opposite of what I want: the .so files should not be included in the source distribution, but they should get copied during the installation step.
MANIFEST.in failed for the same reason as data_files.
eager_resources does not work either, but honestly I do not know the difference between eager_resources and data_files or MANIFEST.in.
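For reference, here is roughly what I mean by the first option (an untested sketch; 'make -C src' is only a guess at how the existing makefile would be invoked):

import subprocess
from setuptools import setup
from setuptools.command.build_py import build_py

class MakefileBuild(build_py):
    def run(self):
        # Hand the real compilation off to the existing build system.
        subprocess.check_call(['make', '-C', 'src'])
        build_py.run(self)

setup(
    name='PyMyAwesomeLib',
    cmdclass={'build_py': MakefileBuild},
)

The copying of the resulting .so files into site-packages would still have to be wired up by hand, which is exactly the part I would like to avoid.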
I think this is actually a common situation, and I hope there is a simple solution to it. Any help would be greatly appreciated.
Porting the logic embedded in the makefiles would be redundant and completely unnecessary work.
Unfortunately, that's exactly what I had to do. I've been struggling with this same issue for a while now.
Porting it over actually wasn't too bad. distutils does understand SWIG extensions, but this was implemented rather haphazardly on their part. Running SWIG creates Python files, and the current build order assumes that all Python files have been accounted for before build_ext runs. That one wasn't too hard to fix, but it's annoying that they would claim to support SWIG without mentioning this. distutils does attempt to be cross-platform when compiling things, so there is still an advantage to using it.
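For reference, declaring the SWIG extension through distutils looks roughly like this (file names are borrowed from the question's layout, so treat it as a sketch):

from setuptools import setup, Extension

setup(
    name='PyMyAwesomeLib',
    ext_modules=[
        Extension(
            '_PyMyAwesomeLib',
            sources=['swig/PyMyAwesomeLib.i', 'src/MyAwesomeLib.cpp'],
            swig_opts=['-c++'],   # have distutils run SWIG in C++ mode
            include_dirs=['src'],
        ),
    ],
)

The build-order problem I mentioned can be worked around by invoking the commands explicitly in the right order, e.g. python setup.py build_ext build_py install, so the SWIG-generated .py files exist before build_py collects them.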
If you don't want to port your entire build system over, use the system's package manager. Many complex libraries do this (while also trying their best with setup.py). For example, to get numpy and lxml on Ubuntu you'd just do:
sudo apt-get install python-numpy python-lxml
No pip needed.
I realize you'd rather write one setup file instead of dealing with every package manager ever, so this is probably not very helpful.
If you do try to go the setuptools route there is one fatal flaw I ran into: dependencies.
For instance, if you are distributing a SWIG-based project, it's going to need libpython. If they don't have it, an error like this happens:
#include <Python.h>
error: File not found
That's pretty unhelpful to the average user.
Even worse, if you require a shared library but the user's library is out of date, the user can get some crazy errors. You're at the mercy of their C++ compiler to output Google-friendly error messages so they can figure it out.
The long-term solution would be for setuptools/distutils to get better at detecting non-Python libraries, hopefully as good as Ruby's gem. I pretty much had to roll my own. For instance, in a setup.py I'm working on, there are a few functions at the top that I hacked together for dependency detection (it still doesn't work on all systems, and definitely not on Windows).
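The checks I mean are along these lines (a stripped-down sketch; the library name is hypothetical and the real thing has to cover more platforms):

import os
import sysconfig
from ctypes.util import find_library

def have_python_headers():
    # Look for Python.h in the interpreter's include directory.
    include_dir = sysconfig.get_paths()['include']
    return os.path.exists(os.path.join(include_dir, 'Python.h'))

def have_shared_library(name):
    # find_library searches the standard system locations, but it knows
    # nothing about versions, so this is a coarse check at best.
    return find_library(name) is not None

if not have_python_headers():
    raise SystemExit('Python.h not found; install the Python development '
                     'headers (e.g. python-dev on Debian/Ubuntu) first.')
if not have_shared_library('MyAwesomeLib'):   # hypothetical library name
    raise SystemExit('libMyAwesomeLib.so not found on this system.')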
So I created a setup.py script for my Python program with distutils, and I think it behaves a bit strangely. First off, it installs all data_files into /usr/local/my_directory by default, which is a bit weird, since that isn't a common place to store data, is it?
I changed the path to /usr/share/my_directory/. But now I'm not able to write to the database inside that directory, and I can't set the required permissions from within setup.py either, since the actual database file has not been created at the time the script runs.
Is my approach wrong? Should I use another tool for distributing?
Because at least for Linux, writing a simple setup sh script seems easier to me at the moment.
The immediate solution is to invoke setup.py with --prefix=/the/path/you/want.
A better approach would be to include the data as package_data. That way it will be installed alongside your Python package, and you'll find it much easier to manage (finding paths, etc.).
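A minimal sketch (the package and file names are placeholders):

from setuptools import setup

setup(
    name='mypackage',
    packages=['mypackage'],
    package_data={'mypackage': ['data/*.db']},  # paths relative to the package
)

At runtime you can then locate the files relative to the package itself, e.g. os.path.join(os.path.dirname(mypackage.__file__), 'data', 'app.db').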
I read in the Python documentation:
The build command is responsible for putting the files to install into a build directory.
I fear this documentation may be incomplete. Does python setup.py build do anything else? I expect this step to generate object files with Python bytecode, which will be interpreted at execution time by the Python VM.
Also, I'm building an automated code check in my source code repository. I want to know if there is any benefit of running setup.py build (does it do any checks?) or is a static code/PEP8 checker such as Pylint good enough?
Does python setup.py build do anything else?
If your package contains C extensions (or defines some custom compilation tasks), they will be compiled too. If you only have Python files in your package, copying is all build does.
I expect this step to generate object files with Python bytecode, which will be interpreted at execution time by the Python VM.
No, build does not do that. Byte-compilation happens at the install stage.
I want to know if there is any benefit of running setup.py build (Does it do any checks?) or is a static code/PEP8 checker such as Pylint good enough?
By all means, run pylint. build does not even check the syntax.
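An easy way to convince yourself is to drop a file with a blatant syntax error into the package (a throwaway example, deliberately invalid):

# broken.py -- deliberately invalid, just for the experiment
def oops(:
    pass

python setup.py build will happily copy broken.py into build/lib, while pylint (or even python -m py_compile broken.py) flags it immediately.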
I'd like to install Django into a custom location. I've read the distutils documentation, and it suggests that I should be able to do something like the following to install under my home directory (when run from an unpacked Django tarball):
> python setup.py install --home=~/code/packages/install --install-purelib=modules --install-platlib=modules --install-scripts=scripts --install-data=data
However, every time I run this, it doesn't seem to concatenate the home path with the separate element paths, so I simply end up with
modules/
scripts/
data/
in the unpacked tarball directory. That is, it seems to treat modules, scripts, etc. simply as paths relative to the local directory and not relative to the --home I specified.
I've tried setting the root with --prefix and using a setup.cfg, and nothing seems to work. --prefix and --home on their own with no other overrides work, but when either is combined with the --install-xxx overrides it doesn't.
I'm probably doing something stupid, or the documentation is wrong, or there is a bug. Any help much obliged.
I would strongly suggest that you look at Virtualenv and Pip for creating what are basically silos of Python packages.
The Pinax project uses this exclusively now for bundling requirements together for other people to use, and it's becoming more and more of a de facto standard in the reusable-apps space.
OK, so I've been looking at the distutils source code to see what is going on: distutils.command.install does all of the pathname manipulation.
It turns out that the documentation is actually incorrect. Whenever an --install-xxxx style option is provided, it completely overrides any value that might be derived from --home or --prefix; the code does not do any straightforward concatenation of paths.
However, what it does do is variable substitution on a set of special variables. The one of interest to me is $base. Using it on the command line, you can define the overrides, and distutils will replace every occurrence with what was specified for --home etc. Note that you must quote the values so that Bash does not try to expand $base as an environment variable.
So the command line that I had initially, becomes:
python setup.py install --home=/home/andre/code/packages/install --install-purelib='$base/modules' \
--install-platlib='$base/modules' --install-scripts='$base/scripts' --install-data='$base/data'
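The same overrides can also live in a setup.cfg next to setup.py. I haven't re-tested this route myself, but distutils reads the same options from its config files (dashes and underscores are interchangeable there), and the $base substitution is applied the same way:

[install]
home = /home/andre/code/packages/install
install-purelib = $base/modules
install-platlib = $base/modules
install-scripts = $base/scripts
install-data = $base/data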
Hope someone other than me finds that useful!
As a quick check, I'd suggest replacing
~/code/packages/install
with
/full_path_to_your_user/code/packages/install
If you just want it in your home directory, there's no need to install it at all. Just make sure that the containing directory is on your PYTHONPATH somewhere, and move the scripts in django/bin to somewhere on your main PATH (or add that directory to your PATH).
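For example (the paths here are only illustrative), in your ~/.bashrc:

export PYTHONPATH=$HOME/code/packages/django-src:$PYTHONPATH
export PATH=$HOME/code/packages/django-src/django/bin:$PATH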
I'm in the middle of reworking our build scripts to be based upon the wonderful Waf tool (I did use SCons for ages, but it's just way too slow).
Anyway, I've hit the following situation and I cannot find a resolution to it:
I have a product that depends on a number of previously built egg files.
I'm trying to package the product using PyInstaller as part of the build process.
I build the dependencies first.
Next I want to run PyInstaller to package the product that depends on the eggs I built. I need PyInstaller to be able to load those egg files as part of its packaging process.
This sounds easy: you work out what PYTHONPATH should be, construct a copy of os.environ with the variable set up correctly, and then invoke the PyInstaller script using subprocess.Popen, passing the previously configured environment as the env argument.
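In code, that plan looks something like the following (the egg paths and the PyInstaller entry point are placeholders for whatever your build actually produces):

import os
import subprocess

# Eggs built earlier in the Waf run; these paths are illustrative.
eggs = ['build/eggs/dependency_one.egg', 'build/eggs/dependency_two.egg']

env = os.environ.copy()
env['PYTHONPATH'] = os.pathsep.join(eggs)

# Invoke PyInstaller however you normally would, with the prepared environment.
subprocess.Popen(['python', 'pyinstaller.py', 'product.spec'], env=env).wait()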
The problem is that setting PYTHONPATH alone does not seem to be enough if the eggs you are adding are extension modules that are packaged as zip-safe. In that case, it turns out that the embedded libraries cannot be imported.
If I unzip the eggs (renaming the directories to .egg), I can import them with no further settings, but this is not what I want in this case.
I can also get the eggs to import from a subshell by doing the following:
Setting PYTHONPATH to the directory that contains the egg you want to import (not the path of the egg itself)
Loading a python shell and using pkg_resources.require to locate the egg.
Once this has been done, the egg loads as normal. Again, this is not practical, because I need to be able to run my Python shell in a manner where it is ready to import these eggs from the off.
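In code, that subshell experiment boils down to this (the distribution and module names are hypothetical):

import sys
sys.path.insert(0, '/path/to/directory/containing/eggs')  # the directory, not the egg

import pkg_resources
pkg_resources.require('MyDependency')  # locates the egg and adds it to sys.path

import mydependency  # now resolves from inside the egg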
The dirty option would be to output a wrapper script that takes the above actions before calling the real target script, but this seems like the wrong thing to do: there must be a better way.
Heh, I think this was my bad. The issue appears to have been that the zip_safe flag in the extension package's setup.py was set to False, which appears to prevent you from treating the egg as zip-safe at all.
Now that I've set it to True, I can import the egg files simply by adding each one to PYTHONPATH.
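For anyone who hits the same thing, the change is just the flag in the extension package's setup.py (the name is a placeholder and everything else is elided):

from setuptools import setup

setup(
    name='MyDependency',  # placeholder
    zip_safe=True,        # was False, which stopped the built egg importing from the zip
)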
I hope someone else finds this answer useful one day!
Although you have a solution, you could always try "virtualenv", which creates a virtual Python environment where you can install and test Python packages without messing with the core system Python:
http://pypi.python.org/pypi/virtualenv