How can setup.py install an npm module? - python

I implemented a Python web client that I would like to test.
The server is hosted in the npm registry. It is run locally with node before my functional tests execute.
How can I properly install the npm module from my setup.py script?
Here is my current solution, inspired by this post:
import subprocess

from setuptools import setup
from setuptools.command.install import install

class CustomInstallCommand(install):
    def run(self):
        arguments = [
            'npm', 'install',
            '--prefix', 'test/functional',
            'promisify',
        ]
        # Note: passing an argument list together with shell=True silently
        # drops the arguments on POSIX, so run the command directly instead.
        subprocess.call(arguments)
        install.run(self)

setup(
    cmdclass={'install': CustomInstallCommand},
    # ...
)

from setuptools.command.build_py import build_py

class NPMInstall(build_py):
    def run(self):
        # run_command() expects the name of a distutils command,
        # so use spawn() to invoke an external program instead.
        self.spawn(['npm', 'install', '--prefix', 'test/functional', 'promisify'])
        build_py.run(self)
OR
from distutils.command.build import build

class NPMInstall(build):
    def run(self):
        self.spawn(['npm', 'install', '--prefix', 'test/functional', 'promisify'])
        build.run(self)
finally:
setuptools.setup(
    cmdclass={
        'npm_install': NPMInstall,
    },
    # Usual setup() args.
    # ...
)

You are very close. Here is a simple function that does just that. You can remove the "--global" option if you want to install the package for the current project only. Keep in mind that running the command with shell=True could present security risks.
import subprocess

def npm_install(args=["npm", "install", "--global", "search-index"]):
    # An argument list should be passed without shell=True; check_call
    # also waits for npm to finish and raises on a non-zero exit code.
    subprocess.check_call(args)
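A possible usage sketch (my addition, not from the original answer): guard against npm being absent before calling the function.
import shutil

if shutil.which('npm') is None:  # Python 3.3+
    raise RuntimeError('npm is required to run the functional test server')
npm_install()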

Related

How do I generate python grpc code from within a setuptools installer (setup.py)?

We have some proto files for gRPC in a repo, and I read that it is not good to commit generated code. So I figured the generation should be part of the package installation (e.g. setuptools, setup.py).
However, to generate gRPC code you need to first install the package by running pip install grpcio-tools, according to the docs. But the purpose of setup.py is to automatically pull down dependencies like grpcio-tools.
So is there a best practice for doing this? That is, how do you generate code that depends on another Python package from within setuptools? Am I better off just creating a separate build.sh script that manually pip-installs and generates the code? Or should I expect users of the package to already have grpcio-tools installed?
As far as I know, the "current" best practice is:
pip manages dependencies
setup.py performs build
Executing "pip install ." is almost equivalent to performing "pip install -r requirements.txt" + "python setup.py build" + "python setup.py install".
This is a custom command that generates Python sources from proto files:
import pkg_resources
from setuptools import Command

class GrpcTool(Command):
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        # Import here: grpcio-tools may not be installed yet when
        # setup.py is first evaluated (see the note below).
        import grpc_tools.protoc
        proto_include = pkg_resources.resource_filename('grpc_tools', '_proto')
        grpc_tools.protoc.main([
            'grpc_tools.protoc',
            '-I{}'.format(proto_include),
            '--python_out=SOME_PATH/',
            '--grpc_python_out=SOME_PATH/',
            'SOME_PROTO.proto',
        ])
which is invoked by customizing the build_py command, like this:
from setuptools.command.build_py import build_py

class BuildPyCommand(build_py):
    def run(self):
        self.run_command('grpc')
        super(BuildPyCommand, self).run()
Note the import inside the run method. It seems that pip runs setup.py several times, both before and after installing the requirements, so if the import sits at the top of the file the build fails.
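The answer doesn't show the setup() wiring, but the run_command('grpc') call implies something like the following sketch (the setup_requires entry is my assumption for making grpc_tools importable at build time):
from setuptools import setup

setup(
    # ...
    setup_requires=['grpcio-tools'],
    cmdclass={
        'grpc': GrpcTool,
        'build_py': BuildPyCommand,
    },
)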
Along with makeroo's approach, an alternative is to execute the grpc_tools module as a subprocess.
The benefit of this approach is that you reliably receive a result code for each generation: 0 for success, 1 for error.
import subprocess

proto_files = ["proto/file1.proto", "proto/file2.proto"]

for file in proto_files:
    args = "--proto_path=. --python_out=. --grpc_python_out=. {0}".format(file)
    result = subprocess.call("python -m grpc_tools.protoc " + args, shell=True)
    print("grpc generation result for '{0}': code {1}".format(file, result))
The above code writes the generated Python files into the proto directory, where the .proto files reside.
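Since the result code is the whole point of this variant, a natural extension (my sketch, not part of the original answer) is to abort the build when generation fails:
import subprocess
import sys

proto_files = ["proto/file1.proto", "proto/file2.proto"]

for file in proto_files:
    args = "--proto_path=. --python_out=. --grpc_python_out=. {0}".format(file)
    if subprocess.call("python -m grpc_tools.protoc " + args, shell=True) != 0:
        sys.exit("grpc generation failed for '{0}'".format(file))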

PyPA setup.py test scripts

I'm attempting to follow the advice and structure written in the Python packaging docs. In the setup function you can specify dependencies for tests with tests_require, and you can run scripts on install just by specifying scripts. But can I have a script that is only run when setting up for testing?
Edit: the important parts of my setup.py:
import subprocess
# To use a consistent encoding
from codecs import open
from os import path

from setuptools import setup
from setuptools.command.test import test

class setupTestRequirements(test, object):
    def run_script(self):
        cmd = ['bash', 'bin/test_dnf_requirements.sh']
        p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
        ret = p.communicate()
        print(ret[0])

    def run(self):
        self.run_script()
        super(setupTestRequirements, self).run()

...

setup(
    ...
    scripts=['bin/functional_dnf_requirements.sh'],
    install_requires=['jira', 'PyYAML'],
    tests_require=['flake8', 'autopep8', 'mock'],
    cmdclass={'test': setupTestRequirements}
)
Listing a file in scripts does not mean it will be executed on package installation. The scripts keyword marks files in your package that are intended to run as standalone programs after the package is installed (maybe the name is a bit misleading). Example: you have a file spam with the content:
#!/usr/bin/env python
if __name__ == '__main__':
    print('eggs!')
If you mark this file as a script by adding it to scripts in your setup.py:
from setuptools import setup

setup(
    ...
    scripts=['spam'],
)
after the package installation you can run spam as a standalone program in your terminal:
$ spam
eggs!
See the packaging docs on command line scripts for more info.
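As a side note (my addition, not from the original answer), the commonly preferred alternative to scripts= is a console_scripts entry point; the module path below is hypothetical:
from setuptools import setup

setup(
    # ...
    entry_points={
        'console_scripts': [
            'spam = spam.cli:main',  # hypothetical package.module:function
        ],
    },
)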
Now, if you want to execute custom code on testing, you have to override the default test command. In your setup.py:
from setuptools.command.test import test

class MyCustomTest(test):
    def run(self):
        print('>>>> this is my custom test command <<<<')
        super().run()

setup(
    ...
    cmdclass={'test': MyCustomTest}
)
Now you will notice an additional print when running tests:
$ python setup.py test
running test
>>>> this is my custom test command <<<<
running egg_info
...
running build_ext
----------------------------------------------------------------------
Ran 0 tests in 0.000s
OK
Edit: if you want to run a custom bash script before executing tests, adapt the MyCustomTest.run() method. Example script script.sh:
#!/usr/bin/env bash
echo -n ">>> this is my custom bash script <<<"
Adapting the MyCustomTest class in setup.py:
import subprocess
from setuptools import setup
from setuptools.command.test import test

class MyCustomTest(test):
    def run_some_script(self):
        cmd = ['bash', 'script.sh']
        # python3.5 and above
        # ret = subprocess.run(cmd, stdout=subprocess.PIPE, universal_newlines=True)
        # print(ret.stdout)
        # old python2 versions
        p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
        ret = p.communicate()
        print(ret[0])

    def run(self):
        self.run_some_script()
        super().run()
Output:
$ python setup.py test
running test
>>> this is my custom bash script <<<
running egg_info
writing spam.egg-info/PKG-INFO
writing dependency_links to spam.egg-info/dependency_links.txt
writing top-level names to spam.egg-info/top_level.txt
reading manifest file 'spam.egg-info/SOURCES.txt'
writing manifest file 'spam.egg-info/SOURCES.txt'
running build_ext
----------------------------------------------------------------------
Ran 0 tests in 0.000s
OK

How to include (script-built) libraries with package installation?

I am making a Python package that has a C++-extension module and someone else's shared library that it requires. I want everything installable via pip. My current setup.py file works when I use pip install -e . but when I don't use develop mode (i.e. omit the -e) I get "cannot open shared object file" when importing the module in Python. I believe the reason is that setuptools doesn't consider the shared library to be part of my package, so the relative link to the library is broken during installation when files are copied to the install directory.
Here is what my setup.py file looks like:
from setuptools import setup, Extension, Command
import setuptools.command.develop
import setuptools.command.build_ext
import setuptools.command.install
import distutils.command.build
import subprocess
import sys
import os

# This function downloads and builds the shared-library
def run_clib_install_script():
    build_clib_cmd = ['bash', 'clib_install.sh']
    if subprocess.call(build_clib_cmd) != 0:
        sys.exit("Failed to build C++ dependencies")

# I make a new command that will build the shared-library
class build_clib(Command):
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        run_clib_install_script()

# I subclass install so that it will call my new command
class install(setuptools.command.install.install):
    def run(self):
        self.run_command('build_clib')
        setuptools.command.install.install.run(self)

# I do the same for build...
class build(distutils.command.build.build):
    sub_commands = [
        ('build_clib', lambda self: True),
    ] + distutils.command.build.build.sub_commands

# ...and the same for develop
class develop(setuptools.command.develop.develop):
    def run(self):
        self.run_command('build_clib')
        setuptools.command.develop.develop.run(self)

# These are my includes...
# note that /clib/include only exists after calling clib_install.sh
cwd = os.path.dirname(os.path.abspath(__file__))
include_dirs = [
    cwd,
    cwd + '/clib/include',
    cwd + '/common',
]

# These are my arguments for the compiler to my shared-library
lib_path = os.path.join(cwd, "clib", "lib")
library_dirs = [lib_path]
link_args = [os.path.join(lib_path, "libclib.so")]

# My extension module gets these arguments so it can link to clib
mygen_module = Extension('mygen',
                         language="c++14",
                         sources=["common/mygen.cpp"],
                         libraries=['clib'],
                         extra_compile_args=['-std=c++14'],
                         include_dirs=include_dirs,
                         library_dirs=library_dirs,
                         extra_link_args=link_args
                         + ['-Wl,-rpath,$ORIGIN/../clib/lib'])

# I use cmdclass to override the default setuptools commands
setup(name='mypack',
      cmdclass={'install': install,
                'build_clib': build_clib, 'build': build,
                'develop': develop},
      packages=['mypack'],
      ext_package='mypack',
      ext_modules=[mygen_module],
      # package_dir={'mypack': '.'},
      # package_data={'mypack': ['docs/*md']},
      include_package_data=True)
I subclass some of the setuptools commands in order to build the shared library before it compiles the extension. clib_install.sh is a bash script that locally downloads and builds the shared library in /clib, creating the headers (in /clib/include) and the .so file (in /clib/lib). To avoid problems locating shared-library dependencies at runtime, I pass $ORIGIN/../clib/lib as a link argument so that the absolute path to clib isn't needed.
Unfortunately, the /clib directory doesn't get copied to the install location. I tried tinkering with package_data, but it didn't copy my directory over. In fact, I don't even know what pip/setuptools does with /clib after the script is called; I guess it is made in some temporary build directory and deleted afterwards. I am not sure how to get /clib to where it needs to be after it is made.
The fix shown: declare the built files in package_data so setuptools copies them into the installed package. Note that package_data patterns are resolved relative to the mypack package directory, so the clib artifacts must end up inside it:
package_data={
    'mypack': [
        'clib/include/*.h',
        'clib/lib/*.so',
        'docs/*md',
    ]
},
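One way to satisfy that layout requirement (my sketch; it assumes clib_install.sh builds into ./clib at the repo root, as the question describes) is to copy the artifacts into the package from the build_clib command:
import os
import shutil

from setuptools import Command

class build_clib(Command):
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        run_clib_install_script()  # defined in the question's setup.py above
        # Copy the built tree into the package so package_data picks it up.
        dest = os.path.join('mypack', 'clib')
        if os.path.exists(dest):
            shutil.rmtree(dest)
        shutil.copytree('clib', dest)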

Running custom setuptools build during install

I've tried to implement Compass compilation during setuptools' build, but the following code runs the compilation during an explicit build command and doesn't run during install.
#!/usr/bin/env python
import os
import setuptools
from distutils.command.build import build

SETUP_DIR = os.path.dirname(os.path.abspath(__file__))

class BuildCSS(setuptools.Command):
    description = 'build CSS from SCSS'
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        os.chdir(os.path.join(SETUP_DIR, 'django_project_dir', 'compass_project_dir'))
        import platform
        if 'Windows' == platform.system():
            command = 'compass.bat compile'
        else:
            command = 'compass compile'
        import subprocess
        try:
            subprocess.check_call(command.split())
        except (subprocess.CalledProcessError, OSError):
            print 'ERROR: problems with compiling Sass. Is Compass installed?'
            raise SystemExit
        os.chdir(SETUP_DIR)

class Build(build):
    sub_commands = build.sub_commands + [('build_css', None)]

setuptools.setup(
    # Custom attrs here.
    cmdclass={
        'build': Build,
        'build_css': BuildCSS,
    },
)
Any custom instructions in Build.run (e.g. some printing) don't apply during install either, yet the dist instance's commands attribute contains only my build command implementation instances. Incredible! But I think the trouble lies in the complex relations between setuptools and distutils. Does anybody know how to make a custom build run during install on Python 2.7?
Update: Found that install definitely doesn't call the build command, but it calls bdist_egg, which runs build_ext. Seems like I should implement a "Compass" build extension.
Unfortunately, I haven't found the answer. It seems the ability to run post-install scripts correctly exists only in Distutils 2. For now you can use this workaround:
Update: Because of setuptools' stack checks, we should override install.do_egg_install, not the run method:
from setuptools.command.install import install

class Install(install):
    def do_egg_install(self):
        self.run_command('build_css')
        install.do_egg_install(self)
Update 2: easy_install runs exactly the bdist_egg command, which install uses too, so the most correct way (especially if you want easy_install to work) is to override the bdist_egg command. Whole code:
#!/usr/bin/env python
import setuptools
from distutils.command.build import build as _build
from setuptools.command.bdist_egg import bdist_egg as _bdist_egg

class bdist_egg(_bdist_egg):
    def run(self):
        self.run_command('build_css')
        _bdist_egg.run(self)

class build_css(setuptools.Command):
    description = 'build CSS from SCSS'
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        pass  # Here goes CSS compilation.

class build(_build):
    sub_commands = _build.sub_commands + [('build_css', None)]

setuptools.setup(
    # Here your setup args.
    cmdclass={
        'bdist_egg': bdist_egg,
        'build': build,
        'build_css': build_css,
    },
)

In setup.py or pip requirements file, how to control order of installing package dependencies?

I've got a Python package whose setup.py declares dependencies the usual way, in install_requires=[...]. One of the packages listed there, scikits.timeseries, has a setup.py that expects numpy to already be installed; thus, I'd like some way to have numpy installed first. For this case and in general, can the order of dependency installation be controlled? How?
Currently, the order in which setup.py pulls down dependencies (as listed in install_requires) seems practically random. Also, in setup(...) I tried using the arg:
extras_require={'scikits.timeseries': ['numpy']}
...without success; the order of installing dependencies was unaffected.
I also tried setting up a pip requirements file, but there too pip's order of installing dependencies didn't match the line order of the requirements file, so no luck.
Another possibility would be a system call near the top of setup.py to install numpy before the setup(...) call, but I hope there's a better way. Thanks in advance for any help.
If scikits.timeseries needs numpy, then it should declare it as a dependency. If it did, pip would handle things for you (I'm pretty sure setuptools would, too, but I haven't used it in a long while). If you control scikits.timeseries, then you should fix its dependency declarations.
Use the setup_requires parameter. For instance, to install numpy prior to scipy, put it into setup_requires and add a __builtins__.__NUMPY_SETUP__ = False hook to get numpy installed correctly:
setup(
    name='test',
    version='0.1',
    setup_requires=['numpy'],
    install_requires=['scipy']
)
The hook goes into a custom command's run method:
def run(self):
    __builtins__.__NUMPY_SETUP__ = False
    import numpy
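The answer doesn't say which command that run() belongs to. A common home for it (my sketch; the build_ext subclass and the get_include() call are my assumptions, not from the answer) is a custom build_ext that also adds numpy's headers:
from setuptools import setup
from setuptools.command.build_ext import build_ext as _build_ext

class build_ext(_build_ext):
    def run(self):
        # Clear the flag set by numpy's own setup.py so the import works.
        __builtins__.__NUMPY_SETUP__ = False
        import numpy
        self.include_dirs.append(numpy.get_include())
        _build_ext.run(self)

setup(
    name='test',
    version='0.1',
    setup_requires=['numpy'],
    install_requires=['scipy'],
    cmdclass={'build_ext': build_ext},
)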
Here's a solution that actually works. It's not an overly "pleasant" method to have to resort to, but "desperate times...".
Basically, you have to:
Override the setuptools "install" command class (plus the closely related analogs)
Execute pip from the script via command-line statements, for which you can enforce the order
The drawbacks to this are:
pip must be installed. You can't just execute setup.py in an environment without it.
The console output of the initial "prerequisite" installs doesn't appear, for some weird reason. (Perhaps I'll post an update here down the line fixing that...)
The code:
from setuptools import setup

# Override standard setuptools commands.
# Enforce the order of dependency installation.
#-------------------------------------------------
PREREQS = [ "ORDERED-INSTALL-PACKAGE" ]

from setuptools.command.install import install
from setuptools.command.develop import develop
from setuptools.command.egg_info import egg_info

def requires( packages ):
    from os import system
    from sys import executable as PYTHON_PATH
    from pkg_resources import require
    require( "pip" )
    CMD_TMPLT = '"' + PYTHON_PATH + '" -m pip install %s'
    for pkg in packages: system( CMD_TMPLT % (pkg,) )

class OrderedInstall( install ):
    def run( self ):
        requires( PREREQS )
        install.run( self )

class OrderedDevelop( develop ):
    def run( self ):
        requires( PREREQS )
        develop.run( self )

class OrderedEggInfo( egg_info ):
    def run( self ):
        requires( PREREQS )
        egg_info.run( self )

CMD_CLASSES = {
    "install" : OrderedInstall,
    "develop" : OrderedDevelop,
    "egg_info": OrderedEggInfo,
}
#-------------------------------------------------

setup (
    ...
    install_requires = [ "UNORDERED-INSTALL-PACKAGE" ],
    cmdclass = CMD_CLASSES
)
You can add numpy to the setup_requires section:
setup_requires=['numpy'],
