I want to run a bash shell script before setup() is called in a setup.py:
from numpy.distutils.core import setup, Extension
from subprocess import call
err = call('sh dependencies.sh',shell=True)
if err:
    raise Exception('The dependencies failed to compile.')
extensions = [...]
setup(name='Package',
      packages=['Package'],
      ext_modules=extensions)
When I run python -m pip install . -v, everything works. However, dependencies.sh is run twice, so the dependencies are compiled twice. How do I do this properly? Thanks!
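A possible workaround (a sketch, not an answer from the thread): pip runs setup.py more than once per install, once to collect metadata and once to build, so any top-level code executes each time. Guarding the call behind a marker file keeps dependencies.sh from being compiled twice, assuming both invocations see the same working directory; the file name .deps_built is an arbitrary choice here:
import os
from subprocess import call
from numpy.distutils.core import setup, Extension

# Compile the dependencies only once per source tree; the later setup.py
# invocation made by pip sees the marker file and skips the script.
if not os.path.exists('.deps_built'):
    err = call('sh dependencies.sh', shell=True)
    if err:
        raise Exception('The dependencies failed to compile.')
    open('.deps_built', 'w').close()

extensions = []  # the same Extension(...) objects as before

setup(name='Package',
      packages=['Package'],
      ext_modules=extensions)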
I am deploying a custom PyTorch model on AWS SageMaker, following this tutorial.
In my case I have a few extra modules to install as dependencies.
I need pycocotools in my inference.py script. I can easily install pycocotools inside a separate notebook using this bash command:
%%bash
pip install pycocotools
But when I create my endpoint for deployment, I get an error that pycocotools is not defined.
I need pycocotools inside my inference.py script. How can I install it from inside a .py file?
At the beginning of inference.py add these lines:
from subprocess import check_call, run, CalledProcessError
import sys
import os

# inference.py is likely to be run multiple times, so this guard avoids
# reinstalling the same package within the same process:
if not os.environ.get("INSTALL_SUCCESS"):
    try:
        # "-m pip" installs into the interpreter that is running inference.py
        check_call([sys.executable, "-m", "pip", "install", "pycocotools"])
    except CalledProcessError:
        # fall back to whatever "pip" is on the PATH
        run(["pip", "install", "pycocotools"])
    os.environ["INSTALL_SUCCESS"] = "True"
I want to make a distributable package, and my package depends on some OS package.
Here is what I want to install:
import subprocess
import sys

def install_libmagic():
    if sys.platform == 'darwin':
        subprocess.run(['brew', 'install', 'libmagic'])
    elif sys.platform == 'linux':
        subprocess.run(['apt-get', 'update'])
        subprocess.run(['apt-get', 'install', '-y', 'libmagic1'])
    else:
        raise Exception(f'Unknown system: {sys.platform}, can not install libmagic')
I want this code to be executed only when somebody calls:
pip install mypackage
I don't want it to be executed when I run: python setup.py bdist_wheel
How can I achieve this?
I tried this:
setup(
    ...
    install_requires=install_libmagic(),
)
I also tried to override the install command:
from setuptools.command.install import install

class MyInstall(install):
    def run(self):
        install_libmagic()
        install.run(self)

setup(
    ...
    cmdclass={'install': MyInstall}
)
But the function was executed on python setup.py bdist_wheel, which is not what I am trying to achieve.
I think you're mixing up the behaviors of built distributions (wheels) and source distributions.
If your goal is to run some subprocesses at install time, then you can't do this with a built distribution. A built distribution executes no Python code at install time. It only executes setup.py at build time, which is why you're seeing your functions executed when you call python setup.py bdist_wheel.
On the other hand, a source distribution (python setup.py sdist) does execute the setup.py file at both build time and install time (roughly the same as python setup.py install) and would give you the behavior you're looking for.
However, as the comments have already mentioned, this is going to be very fragile and not very user-friendly or portable. What you're describing is really a distro/OS package that contains some Python module, and you'd probably be better off with that instead.
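For illustration, a rough sketch of the sdist route described above; mypackage and the version string are placeholders, and install_libmagic is the function from the question. If only a source distribution is published, this file, including the module-level call, executes again on each user's machine when pip builds the package from source; it will also still run whenever you execute setup.py yourself, which is part of why this approach is fragile:
import subprocess
import sys
from setuptools import setup, find_packages

def install_libmagic():
    # OS-specific logic copied from the question
    if sys.platform == 'darwin':
        subprocess.run(['brew', 'install', 'libmagic'])
    elif sys.platform == 'linux':
        subprocess.run(['apt-get', 'update'])
        subprocess.run(['apt-get', 'install', '-y', 'libmagic1'])
    else:
        raise Exception(f'Unknown system: {sys.platform}, can not install libmagic')

# Module-level call: with a wheel this runs only on the build machine,
# with an sdist it runs wherever pip builds the package from source.
install_libmagic()

setup(
    name='mypackage',
    version='0.1',
    packages=find_packages(),
)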
My project structure seems to be correct.
setup.py
mypackage/
    __init__.py
    __main__.py
    main.py
    script1.py  # imports script2
    script2.py
tests/
    test_script2.py
File script1.py imports script2.py using 'import script2'.
I can run code without errors with following commands:
python mypackage
python mypackage/main.py
Unfortunately, when I try to execute the tests using pytest or python -m pytest, I get an error that there is no module named script2 (full message below). I installed my package in editable mode with pip install -e .
I'm able to fix this by using imports with the package name, like import mypackage.script2 as script2, but then everyone who clones my repository will have to install the package with pip before running it. Otherwise there will be an error that mypackage is not found.
I'd like to be able to run this code without pip install and have the option to run each script file separately. Could you suggest an alternative solution?
Repository: pytest-imports-demo
Error message from pytest:
(venv) lecho:~/pytest-imports-demo$ pytest
================================================= test session starts ==================================================
platform linux -- Python 3.6.7, pytest-4.4.1, py-1.8.0, pluggy-0.9.0
rootdir: /home/lecho/pytest-imports-demo
collected 0 items / 1 errors
======================================================== ERRORS ========================================================
________________________________________ ERROR collecting tests/test_script2.py ________________________________________
ImportError while importing test module '/home/lecho/pytest-imports-demo/tests/test_script2.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
tests/test_script2.py:2: in <module>
import mypackage.script1 as script1
mypackage/script1.py:1: in <module>
import script2
E ModuleNotFoundError: No module named 'script2'
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
=============================================== 1 error in 0.05 seconds ================================================
In the file pytest-imports-demo/mypackage/script1.py, importing script2 should be done either as:
from mypackage import script2
or
from . import script2
You also need to add an empty __init__.py file to the pytest-imports-demo/tests/ directory.
As far as "I'd like to be able to run this code without pip install and have the option to run each script file separately" goes, this can be done by making the scripts executable and providing the full path to them, or by putting the path to the directory containing these scripts into your $PATH environment variable. Alternatively it can be done via pip install (but additional settings are required in the setup.py file).
But tests can be run without having to pip install your package.
I opened PR: https://github.com/lecho/pytest-imports-demo/pull/1
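For reference, a minimal sketch of the two changes in file form:
# mypackage/script1.py -- package-qualified import instead of "import script2"
from mypackage import script2   # or equivalently: from . import script2

# tests/__init__.py -- added as an empty file so pytest resolves the test package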
I am installing a package using python2 setup.py install.
This package is a tkinter application which contains the traditional
if __name__ == '__main__':
    main()
I tried running python2 -m my_app, python2 -m my_app.__main__, and python2 -c "import my_app", but I either get an error or nothing happens.
I can run it with ./my_app.py from the console.
How can I run my python application after installation with setup.py?
Import the module with the main function and call it:
python -c "from some_module import main; main()"
But most modules/apps simply expose bin scripts; look in the bin directory of your virtualenv, or at setup.py.
More info about how to expose (scripts, entry_points):
http://python-packaging.readthedocs.org/en/latest/command-line-scripts.html
https://pythonhosted.org/setuptools/setuptools.html#automatic-script-creation
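For reference, a rough sketch of the console_scripts approach from those links; my_app and its main function are placeholders for the actual package layout:
from setuptools import setup, find_packages

setup(
    name='my_app',
    version='0.1',
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            # installs a "my-app" executable that imports my_app
            # and calls its main() function
            'my-app = my_app:main',
        ],
    },
)
After python2 setup.py install (or pip install .), running my-app on the command line calls main() directly.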
I'm having an issue with running setup.py/pip in a chroot environment.
Here's the scoop:
Normal directory location:
/local/my_dir/project/src/qa/libs
Chroot-ed location:
/src/qa/libs
Here's my setup.py file:
#!/usr/bin/env python
from __future__ import (unicode_literals, print_function, division,
                        absolute_import)
from setuptools import find_packages, setup

test = [
    'mock',
    'pytest',
    'pytest-cov',
]

setup(
    name='libs',
    version=0.1,
    description='Some desc',
    long_description=open('README').read(),
    author='insert_author_here',
    author_email='insert_email_here',
    packages=find_packages(),
    package_dir={},
    include_package_data=True,
    tests_require=test,
    install_requires=[],
    keywords=['qa', 'framework'],
    extras_require={
        'test': test,
    }
)
When running python setup.py develop in the libs directory everything goes swimmingly during the install until the very end.
Installed /src/qa/libs
Processing dependencies for libs==0.1
Finished processing dependencies for libs==0.1 # <-- It hangs here
This doesn't happen when I'm not currently in chroot (required for the environment), and it seems like setuptools/distribute is getting stuck in a recursive filesystem walk looking for things to clean up. Any idea how to fix this?
Installing a requirements.txt file with pip doesn't have any problems like this, so I think it might be the structure of the setup.py file.
It turns out the hang occurred during the bash script that created the virtualenv and installed this package. I figured this out by executing the script with bash -x my_script, which showed the actual command being executed when the hang occurred.
The setup.py file correctly installs the package and exits successfully.