I have an open source project (GridCal) and I tell users to install the package with pip install GridCal, or pip3 install GridCal on Unix systems.
The setup file is this:
from distutils.core import setup
import sys
import os
import platform
from GridCal.grid.CalculationEngine import __GridCal_VERSION__

name = "GridCal"
version = str(__GridCal_VERSION__)
description = "Research Oriented electrical simulation software."

# Python 3.5 or later needed
if sys.version_info < (3, 5, 0, 'final', 0):
    raise SystemExit('Python 3.5 or later is required!')

# Build a list of all project modules
packages = []
for dir_name, dir_names, file_names in os.walk(name):
    if '__init__.py' in file_names:
        packages.append(dir_name.replace('/', '.'))

package_dir = {name: name}

# Data_files (e.g. doc) needs (directory, files-in-this-directory) tuples
data_files = []
for dir_name, dir_names, file_names in os.walk('doc'):
    files_list = []
    for filename in file_names:
        fullname = os.path.join(dir_name, filename)
        files_list.append(fullname)
    data_files.append(('share/' + name + '/' + dir_name, files_list))

if platform.system() == 'Windows':
    # list the packages (on Windows, Anaconda is assumed)
    required_packages = ["numpy",
                         "scipy",
                         "networkx",
                         "pandas",
                         "xlwt",
                         "xlrd",
                         # "PyQt5",
                         "matplotlib",
                         "qtconsole",
                         "pysot",
                         "openpyxl",
                         "pulp"
                         ]
else:
    # make the desktop entry
    make_linux_desktop_file(version_=version, comment=description)

    # list the packages
    required_packages = ["numpy",
                         "scipy",
                         "networkx",
                         "pandas",
                         "xlwt",
                         "xlrd",
                         "PyQt5",
                         "matplotlib",
                         "qtconsole",
                         "pysot",
                         "openpyxl",
                         "pulp"
                         ]

# Read the license
with open('LICENSE.txt', 'r') as f:
    license_text = f.read()

setup(
    # Application name:
    name=name,

    # Version number (initial):
    version=version,

    # Application author details:
    author="Santiago Peñate Vera",
    author_email="santiago.penate.vera#gmail.com",

    # Packages
    packages=packages,
    data_files=data_files,

    # Include additional files into the package
    include_package_data=True,

    # Details
    url="http://pypi.python.org/pypi/GridCal/",

    # License file
    license=license_text,

    # description
    description=description,
    # long_description=open("README.txt").read(),

    # Dependent packages (distributions)
    install_requires=required_packages,
    setup_requires=required_packages
)
From time to time I get user reports saying that the program is missing modules: https://github.com/SanPen/GridCal/issues/12
I have specified the list of packages both in install_requires and setup_requires.
Is this a pip bug, or should I do something else?
Your setup.py imports GridCal.grid.CalculationEngine, which imports almost all of your dependencies. That is, your setup.py tries to import the dependencies before installing them.
Try installing it in a new, empty virtual env detached from your global site-packages; it reliably fails:
$ virtualenv --no-site-packages --python python3.4 test-gcal
Running virtualenv with interpreter /usr/bin/python3.4
Using base prefix '/usr'
New python executable in /home/phd/tmp/test-gcal/bin/python3.4
Also creating executable in /home/phd/tmp/test-gcal/bin/python
Installing setuptools, pip, wheel...done.
$ source test-gcal/bin/activate
$ pip install GridCal
Collecting GridCal
Using cached GridCal-1.85.tar.gz
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-build-c7q9pbep/GridCal/setup.py", line 5, in <module>
        from GridCal.grid.CalculationEngine import __GridCal_VERSION__
      File "/tmp/pip-build-c7q9pbep/GridCal/GridCal/grid/CalculationEngine.py", line 18, in <module>
        from GridCal.grid.JacobianBased import IwamotoNR, Jacobian, LevenbergMarquardtPF
      File "/tmp/pip-build-c7q9pbep/GridCal/GridCal/grid/JacobianBased.py", line 19, in <module>
        from numpy import array, angle, exp, linalg, r_, Inf, conj, diag, asmatrix, asarray, zeros_like, zeros, complex128, \
    ImportError: No module named 'numpy'

    ----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-c7q9pbep/GridCal/
The fix is relatively straightforward: you have to move __GridCal_VERSION__ from GridCal/grid/CalculationEngine.py to a separate GridCal/version.py (or __version__.py, or something like this) and do from GridCal.version import __GridCal_VERSION__ in setup.py.
Please remember that this import only works if your GridCal/__init__.py is empty or imports only built-in/standard modules. If that __init__.py directly or indirectly imports a (not yet installed) dependency, version.py cannot be imported either. There is a way to overcome this in setup.py, but I skip it for now; if you ever need the solution, ask again.
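For reference, a minimal sketch of that layout (the file contents below are an assumption following the suggestion above, not the actual GridCal source):

# GridCal/version.py -- keep this file free of third-party imports
__GridCal_VERSION__ = 1.85

# setup.py
from GridCal.version import __GridCal_VERSION__

version = str(__GridCal_VERSION__)

That way pip can run setup.py egg_info before numpy, scipy, etc. are installed.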
I have been working on a Python package which wraps some C++ libraries that need to be built from source. I build these with CMake, and I want the whole thing to be 'pip install'able in the end. I am almost there, however I am having problems getting the libraries built by CMake to end up in the final Python installation directory.
I managed to get them into the final wheel, oddly enough, but they aren't in my site-packages directory.
My setup.py file looks like this:
import os
import re
import sys
import sysconfig
import site
import platform
import subprocess
import pathlib
from distutils.version import LooseVersion
from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext as build_ext_orig


class CMakeExtension(Extension):
    def __init__(self, name, sourcedir=''):
        Extension.__init__(self, name, sources=[])
        self.sourcedir = os.path.abspath(sourcedir)


class CMakeBuild(build_ext_orig):
    def run(self):
        try:
            out = subprocess.check_output(['cmake', '--version'])
        except OSError:
            raise RuntimeError("CMake must be installed to build the following extensions: " +
                               ", ".join(e.name for e in self.extensions))

        if platform.system() == "Windows":
            raise RuntimeError("Sorry, pyScannerBit doesn't work on Windows platforms. Please use Linux or OSX.")

        for ext in self.extensions:
            self.build_extension(ext)

    def build_extension(self, ext):
        extdir = os.path.abspath(os.path.dirname(self.get_ext_fullpath(ext.name)))
        cmake_args = ['-DCMAKE_LIBRARY_OUTPUT_DIRECTORY=' + extdir,
                      '-DPYTHON_EXECUTABLE=' + sys.executable,
                      '-DCMAKE_VERBOSE_MAKEFILE:BOOL=OFF',
                      '-Wno-dev',
                      '-DCMAKE_RUNTIME_OUTPUT_DIRECTORY=' + extdir,
                      '-DSCANNERBIT_STANDALONE=True',
                      '-DCMAKE_INSTALL_RPATH=$ORIGIN',
                      '-DCMAKE_BUILD_WITH_INSTALL_RPATH:BOOL=ON',
                      '-DCMAKE_INSTALL_RPATH_USE_LINK_PATH:BOOL=ON',
                      '-DCMAKE_INSTALL_PREFIX:PATH=' + extdir,
                      ]

        cfg = 'Debug' if self.debug else 'Release'
        build_args = ['--config', cfg]

        if platform.system() == "Windows":
            cmake_args += ['-DCMAKE_LIBRARY_OUTPUT_DIRECTORY_{}={}'.format(cfg.upper(), extdir)]
            if sys.maxsize > 2**32:
                cmake_args += ['-A', 'x64']
            build_args += ['--', '/m']
        else:
            cmake_args += ['-DCMAKE_BUILD_TYPE=' + cfg]
            build_args += ['--', '-j2']

        env = os.environ.copy()
        env['CXXFLAGS'] = '{} -DVERSION_INFO=\\"{}\\"'.format(env.get('CXXFLAGS', ''),
                                                              self.distribution.get_version())
        if not os.path.exists(self.build_temp):
            os.makedirs(self.build_temp)

        # untar ScannerBit tarball
        subprocess.check_call(['tar', '-C', 'pyscannerbit/scannerbit/untar/ScannerBit',
                               '-xf', 'pyscannerbit/scannerbit/ScannerBit_stripped.tar',
                               '--strip-components=1'], cwd=ext.sourcedir, env=env)
        # First cmake
        subprocess.check_call(['cmake', ext.sourcedir] + cmake_args, cwd=self.build_temp, env=env)
        # Build all the scanners
        subprocess.check_call(['cmake', '--build', '.', '--target', 'multinest'] + build_args, cwd=self.build_temp)
        # Re-run cmake to detect built scanner plugins
        subprocess.check_call(['cmake', ext.sourcedir], cwd=self.build_temp)
        # Main build
        subprocess.check_call(['cmake', '--build', '.'] + build_args, cwd=self.build_temp)
        # Install
        #subprocess.check_call(['cmake', '--build', '.', '--target', 'install'], cwd=self.build_temp)


setup(
    name='pyscannerbit',
    version='0.0.8',
    author='Ben Farmer',
    # Add yourself if you contribute to this package
    author_email='ben.farmer#gmail.com',
    description='A python interface to the GAMBIT scanning module, ScannerBit',
    long_description='',
    ext_modules=[CMakeExtension('_interface')],
    cmdclass=dict(build_ext=CMakeBuild),
    zip_safe=False,
    packages=['pyscannerbit'],
)
As you can see, I am telling CMake to build the libraries in 'extdir', which it turns out is
/tmp/pip-req-build-d7mfvn1a/build/lib.linux-x86_64-3.6
I had assumed that the files would just be copied from here (or some other temporary directory?) into the final install path in bulk, but perhaps it doesn't work like that (though as I said earlier, these built files do end up in the wheel that is generated). Do these built files need to be added to MANIFEST.in or some 'package_data' entry or something like that? Currently they are not listed anywhere like that, since it was my understanding that those were for moving files around pre-build, not post-build. Currently I only use MANIFEST.in to make sure my sdist tarball gets filled correctly.
For completeness, I am building the package with pip as follows:
python setup.py sdist
pip install -v dist/pyscannerbit-0.0.8.tar.gz
This is just so I know that the build from the tarball works, for later use with PyPI.
The source is on github if you want to try it out: https://github.com/bjfar/pyscannerbit
OK, so it seems that I just had the paths a bit wrong. I was previously setting CMAKE_LIBRARY_OUTPUT_DIRECTORY to
extdir = os.path.abspath(os.path.dirname(self.get_ext_fullpath(ext.name)))
However, I needed to point it to
extdir + '/pyscannerbit'
where pyscannerbit is the name of the package. Otherwise the files end up in the parent directory where the build occurs, but not inside the package directory, so they don't subsequently get copied to the install path.
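A minimal sketch of that change inside build_extension (only the output-directory arguments are shown, the rest of the method stays as in the question; the variable name package_output_dir is my own):

# point all CMake output into the package directory inside the build dir
extdir = os.path.abspath(os.path.dirname(self.get_ext_fullpath(ext.name)))
package_output_dir = os.path.join(extdir, 'pyscannerbit')
cmake_args = ['-DCMAKE_LIBRARY_OUTPUT_DIRECTORY=' + package_output_dir,
              '-DCMAKE_RUNTIME_OUTPUT_DIRECTORY=' + package_output_dir,
              '-DCMAKE_INSTALL_PREFIX:PATH=' + package_output_dir,
              '-DPYTHON_EXECUTABLE=' + sys.executable]

That way the built libraries land in build/lib.linux-x86_64-3.6/pyscannerbit and get copied along with the rest of the package.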
I am making a Python package that has a C++ extension module and someone else's shared library that it requires. I want everything installable via pip. My current setup.py file works when I use pip install -e ., but when I don't use develop mode (i.e. omit the -e), I get "cannot open shared object file" when importing the module in Python. I believe the reason is that setuptools doesn't consider the shared library to be part of my package, so the relative link to the library is broken during installation when files are copied to the install directory.
Here is what my setup.py file looks like:
from setuptools import setup, Extension, Command
import setuptools.command.develop
import setuptools.command.build_ext
import setuptools.command.install
import distutils.command.build
import subprocess
import sys
import os


# This function downloads and builds the shared-library
def run_clib_install_script():
    build_clib_cmd = ['bash', 'clib_install.sh']
    if subprocess.call(build_clib_cmd) != 0:
        sys.exit("Failed to build C++ dependencies")


# I make a new command that will build the shared-library
class build_clib(Command):
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        run_clib_install_script()


# I subclass install so that it will call my new command
class install(setuptools.command.install.install):
    def run(self):
        self.run_command('build_clib')
        setuptools.command.install.install.run(self)


# I do the same for build...
class build(distutils.command.build.build):
    sub_commands = [
        ('build_clib', lambda self: True),
    ] + distutils.command.build.build.sub_commands


# ...and the same for develop
class develop(setuptools.command.develop.develop):
    def run(self):
        self.run_command('build_clib')
        setuptools.command.develop.develop.run(self)


# These are my includes...
# note that /clib/include only exists after calling clib_install.sh
cwd = os.path.dirname(os.path.abspath(__file__))
include_dirs = [
    cwd,
    cwd + '/clib/include',
    cwd + '/common',
]

# These are my arguments for the compiler to my shared-library
lib_path = os.path.join(cwd, "clib", "lib")
library_dirs = [lib_path]
link_args = [os.path.join(lib_path, "libclib.so")]

# My extension module gets these arguments so it can link to clib
mygen_module = Extension('mygen',
                         language="c++14",
                         sources=["common/mygen.cpp"],
                         libraries=['clib'],
                         extra_compile_args=['-std=c++14'],
                         include_dirs=include_dirs,
                         library_dirs=library_dirs,
                         extra_link_args=link_args
                                         + ['-Wl,-rpath,$ORIGIN/../clib/lib'])

# I use cmdclass to override the default setuptools commands
setup(name='mypack',
      cmdclass={'install': install,
                'build_clib': build_clib, 'build': build,
                'develop': develop},
      packages=['mypack'],
      ext_package='mypack',
      ext_modules=[mygen_module],
      # package_dir={'mypack': '.'},
      # package_data={'mypack': ['docs/*md']},
      include_package_data=True)
I subclass some of the setuptools commands in order to build the shared-library before it compiles the extension. clib_install.sh is a bash script that locally downloads and builds the shared library in /clib, creating the headers (in /clib/include) and .so file (in /clib/lib). To solve problems with linking to shared-library dependencies I used $ORIGIN/../clib/lib as a link argument so that the absolute path to clib isn't needed.
Unfortunately, the /clib directory doesn't get copied to the install location. I tried tinkering with package_data, but it didn't copy my directory over. In fact, I don't even know what pip/setuptools does with /clib after the script is called; I guess it is created in some temporary build directory and deleted afterwards. I am not sure how to get /clib to where it needs to be after it is made.
package_data={
    'mypack': [
        'clib/include/*.h',
        'clib/lib/*.so',
        'docs/*md',
    ]
},
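For context, a sketch of where that argument sits in the setup() call from the question (this assumes the clib directory ends up inside the mypack package directory; otherwise package_dir would need to be set accordingly):

setup(name='mypack',
      cmdclass={'install': install,
                'build_clib': build_clib, 'build': build,
                'develop': develop},
      packages=['mypack'],
      ext_package='mypack',
      ext_modules=[mygen_module],
      # glob patterns are relative to the mypack package directory
      package_data={
          'mypack': [
              'clib/include/*.h',
              'clib/lib/*.so',
              'docs/*md',
          ]
      },
      include_package_data=True)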
When generating an executable on Python 2.7 using py2exe (0.6.9) or cx_freeze (5.0.1) with APScheduler (3.3.1), I get the following error:
File "apscheduler\__init__.pyc", line 2, in <module>
File "pkg_resources\__init__.pyc", line 552, in get_distribution
File "pkg_resources\__init__.pyc", line 426, in get_provider
File "pkg_resources\__init__.pyc", line 968, in require
File "pkg_resources\__init__.pyc", line 854, in resolve
pkg_resources.DistributionNotFound: The 'APScheduler' distribution was not found and is required by the application
This is my cx_freeze setup.py file:
import sys
from cx_Freeze import setup, Executable

# Dependencies are automatically detected, but it might need fine tuning.
build_exe_options = {"includes": ["requests", "apscheduler"], "include_files": ["XXX"]}

# GUI applications require a different base on Windows (the default is for a
# console application).
base = None
if sys.platform == "win32":
    base = "Win32GUI"

setup(name="XXX",
      version="XXX",
      description="XXX",
      options={"build_exe": build_exe_options},
      executables=[Executable("XXX.py", base=base), Executable("XXX2.py", base=base)])
And this is my py2exe setup.py file:
from distutils.core import setup
import py2exe
data_files = ['XXX']
setup(
data_files=data_files,
windows=[
{'script': 'XXX.py'},
{'script': 'XXX2.py'},
],
options={'py2exe':{
'includes': ['requests', 'apscheduler'],
'bundle_files': 1,
}
},
)
I already tried to use the 'packages' option but with no success.
If I remove the code from APScheduler __init__.py (apscheduler/__init__.py), it works.
Below is the __init__.py from the APScheduler package:
# These will be removed in APScheduler 4.0.
release = __import__('pkg_resources').get_distribution('apscheduler').version.split('-')[0]
version_info = tuple(int(x) if x.isdigit() else x for x in release.split('.'))
version = __version__ = '.'.join(str(x) for x in version_info[:3])
Do I need to somehow include the dependencies in the py2exe/cx_freeze library bundle?
I have already done some research on the internet, but with no success.
Found the solution. The problem is that py2exe doesn't include the dist-info directories in library.zip. Each installed distribution has its own dist-info directory in site-packages.
Those directories are used by pkg_resources to locate distributions,
as we can see on line 2 of apscheduler's __init__.py.
All you have to do is add those dist-info directories to the library.zip file generated by py2exe.
Below is an example from Google that adds the (unrelated, but illustrative) zoneinfo directory to library.zip:
https://github.com/google/transitfeed/blob/master/setup.py#L96
To get the dist-info path of a module, use the following:
import os
import pkg_resources
dist_info_dir = pkg_resources.get_distribution('desired_module')._provider.egg_info
# get base name of directory
base_name = os.path.basename(dist_info_dir)
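With that path in hand, a minimal sketch of the idea (assuming py2exe has already produced dist/library.zip in the default location; the variable names are mine):

import os
import zipfile
import pkg_resources

# locate APScheduler's dist-info directory (same trick as above)
dist_info_dir = pkg_resources.get_distribution('APScheduler')._provider.egg_info
base_name = os.path.basename(dist_info_dir)

# append every metadata file to the archive that py2exe generated
with zipfile.ZipFile(os.path.join('dist', 'library.zip'), 'a') as zf:
    for filename in os.listdir(dist_info_dir):
        zf.write(os.path.join(dist_info_dir, filename),
                 base_name + '/' + filename)

After that, pkg_resources.get_distribution('APScheduler') inside the frozen application can find the metadata again.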
Note: if there is no dist-info directory but an egg-info one instead, you should probably look for a wheel of the library, or build one yourself with:
python setup.py bdist_wheel
Cheers!
I am trying to build and distribute an RPM package of a Python module for CentOS. I have followed these steps:
created a virtualenv and installed the requirements
in the module, added a setup.py with install_requires
then built the package using python2.7 from the virtualenv
../env/bin/python2.7 setup.py bdist_rpm
Now I have src, noarch and tar.gz files in the 'dist' folder:
foo-0.1-1.noarch.rpm, foo-0.1-1.src.rpm, foo-0.1.tar.gz
I tried to install the source RPM using 'sudo yum install foo-0.1-1.src.rpm' and got an error about a wrong architecture.
Then I tried to install the noarch package, 'sudo yum install foo-0.1-1.noarch.rpm', and it installed smoothly.
But when running the script, it gave an import error; here I expected the missing module to be downloaded automatically.
The last thing is that I am using some third-party libraries which are not on PyPI.
So I want to do the whole setup using a virtualenv with the required modules, so that after installing the RPM the user can run the script directly instead of installing the third-party libs separately and explicitly.
Some of the above steps may sound wrong, as I am new to this stuff.
The following is the code in setup.py:
from setuptools import setup, find_packages

setup(
    name="foo",
    version="0.1",
    packages=find_packages(),
    scripts=['foo/bar.py', ],

    # Project uses reStructuredText, so ensure that the docutils get
    # installed or upgraded on the target machine
    install_requires=['PyYAML', 'pyOpenSSL', 'pycrypto', 'privatelib1', 'privatelib2', 'zope.interface'],

    package_data={
        # If any package contains *.txt or *.rst files, include them:
        '': ['*.txt', '*.rst'],
        # And include any *.msg files found in the 'foo' package, too:
        'foo': ['*.msg'],
    },

    # metadata for upload to PyPI
    author="foo bar",
    description="foo bar",
    license="",
    keywords="foo bar",
    # could also include long_description, download_url, classifiers, etc.
)
Also, I am using a shebang in the script:
#!/usr/bin/env python2.7
Note:
I have multiple Python setups, 2.6 and 2.7.
By default the 'python' command gives 2.6,
while the command 'python2.7' gives Python 2.7.
Output of 'rpm -qp foo-0.1-1.noarch.rpm --requires':
/usr/bin/python
python(abi) = 2.6
rpmlib(CompressedFileNames) <= 3.0.4-1
rpmlib(PayloadFilesHavePrefix) <= 4.0-1
When I install the package, the script's shebang line (the script is now /usr/bin/bar.py) gets changed to '/usr/bin/python', but I explicitly want the script to run on python2.7.
Thanks in advance
I need to install a Python module in site-packages that will also be used as a command-line application. Suppose I have a module like:
app.py
def main():
    print 'Dummy message'

if __name__ == '__main__':
    main()
setup.py
import distutils
try:
    from setuptools import setup
except ImportError:
    from distutils.core import setup

if __name__ == '__main__':
    setup(name='dummy',
          version='1.0',
          packages=['dummy'],
          )
Creating the dist by:
setup.py sdist
Install:
setup.py install
And now I would like to use it as a command line application by opening the command window and typing just: dummy
Is it possible to create such an application under Windows without having to register system path variables and so on?
You can use the options in setup.py to declare command line scripts. Please refer to this article. On Windows, the script will be created in "C:\Python26\Scripts" (if you didn't change the path) - lots of tools store their scripts there (e.g. "easy_install", "hg", ...).
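A minimal sketch of that approach using a setuptools console_scripts entry point (assuming main() lives in dummy/app.py inside the dummy package):

from setuptools import setup

setup(name='dummy',
      version='1.0',
      packages=['dummy'],
      entry_points={
          'console_scripts': [
              # "dummy" on the command line calls dummy.app.main()
              'dummy = dummy.app:main',
          ],
      })

After reinstalling, setuptools generates a dummy script (and an .exe launcher on Windows) in the Scripts directory, so typing dummy at the prompt runs main().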
Put the following in dummy.cmd:
python.exe -m dummy
Or is it dummy.app...
Oh well, it's one of those.