I am trying to install a Python package on my Ubuntu machine through a setup script I wrote. The setup.py script looks like this:
try:
    from setuptools import setup
except ImportError:
    from distutils.core import setup
setup(
    name = 'pyduino',
    description = 'PyDuino project aims to make python interactive with hardware, particularly arduino.',
    url = '###',
    keywords = 'python arduino',
    author = '###',
    author_email = '###',
    version = '0.0.0',
    license = 'GNU',
    packages = ['pyduino'],
    install_requires = ['pyserial'],
    classifiers = [
        # How mature is this project? Common values are
        #   3 - Alpha
        #   4 - Beta
        #   5 - Production/Stable
        'Development Status :: 3 - Alpha',
        'Intended Audience :: Developers',
        'Topic :: Software Development :: Build Tools',
        'Programming Language :: Python :: 2',
        'Programming Language :: Python :: 2.6',
        'Programming Language :: Python :: 2.7',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.3',
        'Programming Language :: Python :: 3.4',
        'Programming Language :: Python :: 3.5',
    ],
    scripts=['pyduino/pyduino.py'],
)
The package installs into the /usr/local/bin directory, but when I import the modules from outside /usr/local/bin, an ImportError occurs. If I change directory to /usr/local/bin, the import works perfectly and no error occurs. How can I install the package so that I can import the modules from any directory? Thanks in advance...
Try installing your package with pip using
pip install --install-option="--prefix=$PREFIX_PATH" package_name
as described here: Install a Python package into a different directory using pip?
I'd also suggest reading up on what 1. pip and 2. virtualenv are.
Good luck :)
EDIT: I found the package is installed with pip like:
pip install --install-option="--prefix=/usr/local/bin" pyduino_mk
Currently, you're using the scripts argument to install your Python code. This puts your code in /usr/local/bin, which is not in PYTHONPATH.
According to the documentation, you use scripts when you want to install executable scripts (things you want to call from the command line). Otherwise, you need to use packages.
My approach would be like this:
install pyduino/pyduino.py as a library module with something like packages=['pyduino']
create a wrapper (shell or python) capable of calling your installed script and install that via scripts=[...]
Using the packages argument for your module will install it in /usr/local/lib/python..., which is in PYTHONPATH. This will allow you to import your module with something like from pyduino import pyduino.
For the wrapper script part:
A best practice is to isolate the code to be executed when the script is triggered from the command line in something like:
def main():
    # insert your code here
    pass

if __name__ == '__main__':
    main()
Assuming there is a def main() as above
create a directory scripts in your tree (at the same level with setup.py)
create a file scripts/pyduino
in scripts/pyduino:
#!/usr/bin/env python
from pyduino.pyduino import main

if __name__ == '__main__':
    main()
add scripts = ['scripts/pyduino'] to your setup.py code
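Putting the steps above together, the resulting setup.py would look something like this (a sketch following the question's layout, with the wrapper from the scripts/ directory created above):

```python
try:
    from setuptools import setup
except ImportError:
    from distutils.core import setup

setup(
    name = 'pyduino',
    version = '0.0.0',
    packages = ['pyduino'],          # installs the importable module under site-packages
    scripts = ['scripts/pyduino'],   # installs the thin wrapper into /usr/local/bin
    install_requires = ['pyserial'],
)
```

With this layout, `from pyduino import pyduino` works from any directory, and the `pyduino` command on the PATH still runs `main()`.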
Related
I have a package which I have written and installed onto a virtual environment in editable mode. I can import from this package only when the modules, and items within those modules, are imported using the 'from' syntax.
In a python file outside the package, I can import specific modules from the package using from package import module and import specific functions/objects from these modules via from package.module import x in external scripts/python interpreter.
However, when I try to import the whole package, I find that it has no accessible modules; i.e. if I were to write:
import package
x = package.module.x
Then I would receive the error: AttributeError: module 'package' has no attribute 'module'.
Intriguingly, if I use a from import and then attempt the same command again, the error does not occur and the attribute, object or function 'x' imports properly.
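For what it's worth, the same behaviour shows up with a standard-library package, so it may not be specific to my setup:

```python
# A plain `import package` only binds submodules that the package's own
# __init__.py imports; email's __init__ does not import email.mime.
import email
print(hasattr(email, "mime"))  # typically False in a fresh interpreter

import email.mime              # explicitly importing the submodule...
print(hasattr(email, "mime"))  # ...also binds it as an attribute: True
```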
I believe the problem has something to do with how the package is found in setup.py, but I don't know enough about Python packaging to understand what is going on.
I have this module installed in editable mode through pip in my Anaconda virtual environment, and it uses the standard cookiecutter Python format. There are no sub-packages, and the __init__.py contains only basic bibliographic information (__name__, __email__ and so on).
Here is my setup.py:
"""The setup script."""
from setuptools import setup, find_packages
with open('README.rst') as readme_file:
    readme = readme_file.read()

with open('HISTORY.rst') as history_file:
    history = history_file.read()
requirements = ['Click>=7.0', ]
test_requirements = ['pytest>=3', ]
setup(
    author="Alexander Pasha",
    author_email={email},
    python_requires='>=3.9',
    classifiers=[
        'Development Status :: 2 - Pre-Alpha',
        'Intended Audience :: Developers',
        'License :: OSI Approved :: MIT License',
        'Natural Language :: English',
        'Programming Language :: Python :: 3.9',
    ],
    description={description},
    install_requires=requirements,
    license="MIT license",
    long_description=readme + '\n\n' + history,
    include_package_data=True,
    keywords={name},
    name={name},
    packages=find_packages(include=['package', 'package.*']),
    test_suite='tests',
    tests_require=test_requirements,
    url={GitHub},
    version='0.1.0',
    zip_safe=False,
)
For reference, I'm using Python 3.9.12.
For many python libraries, the argument used with import is the same as the one used to install the library with pip.
E.g.
pip install numpy
pip install scipy
pip install pandas
correspond to
import numpy
import scipy
import pandas
but this pattern doesn't seem to work for all libraries. E.g. (found here):
pip install Pillow
is required to get this to succeed
import PIL
Based on the pattern in the first examples, I would have expected pip install PIL to install PIL, but instead we use pip install Pillow. Why is this and how does this work?
Basically, what you import is usually the module name. For example, your package might be developed in the following hierarchy:
MyLib
- __init__.py
- my_script1.py
- my_script2.py
However, when you make your library available as a "package" via pip, you usually need to prepare a setup.py file, which is run automatically when people install your package with pip install.
The setup.py can be something like this:
from distutils.core import setup
setup(
    name = 'YOURPACKAGENAME',       # How you named your package folder (MyLib)
    packages = ['YOURPACKAGENAME'], # Choose the same as "name"
    version = '0.1',                # Start with a small number and increase it with every change you make
    license = 'MIT',                # Choose a license from here: https://help.github.com/articles/licensing-a-repository
    description = 'TYPE YOUR DESCRIPTION HERE',  # Give a short description about your library
    author = 'YOUR NAME',           # Type in your name
    author_email = 'your.email@domain.com',      # Type in your e-mail
    url = 'https://github.com/user/reponame',    # Provide either the link to your github or to your website
    download_url = 'https://github.com/user/reponame/archive/v_01.tar.gz',  # I explain this later on
    keywords = ['SOME', 'MEANINGFUL', 'KEYWORDS'],  # Keywords that define your package best
    install_requires=[              # I get to this in a second
        'validators',
        'beautifulsoup4',
    ],
    classifiers=[
        'Development Status :: 3 - Alpha',  # Choose either "3 - Alpha", "4 - Beta" or "5 - Production/Stable" as the current state of your package
        'Intended Audience :: Developers',  # Define that your audience are developers
        'Topic :: Software Development :: Build Tools',
        'License :: OSI Approved :: MIT License',   # Again, pick a license
        'Programming Language :: Python :: 3',      # Specify which Python versions you want to support
        'Programming Language :: Python :: 3.4',
        'Programming Language :: Python :: 3.5',
        'Programming Language :: Python :: 3.6',
    ],
)
Therefore, in the above example, people who install your package via pip should run pip install YOURPACKAGENAME. After that, they need to run import MyLib in the code.
TL;DR:
What you import is a module name, but what you install via pip is the name of the package; they can be different. Usually, though, I'd encourage people to use the same name for both to avoid confusion.
Ref:
https://medium.com/@joel.barmettler/how-to-upload-your-python-package-to-pypi-65edc5fe9c56
I'm deploying my first python project, but having issues with installation. I've followed the practices outlined in https://packaging.python.org/guides/distributing-packages-using-setuptools/#uploading-your-project-to-pypi. My project is organized with a top-level executable script bin/gsat that imports other modules like so:
import gsat.input_validation as input_validation
The modules are in src/gsat/ , following the arrangement in the example project at https://github.com/pypa/sampleproject
If I install locally from the project source, using develop mode:
pip install -e .
... then I have no issues installing and the software works.
But if I install it from PyPI:
pip install "gsat"
... then it won't run because the import statements fail to find the modules. Error:
File "/Library/Frameworks/Python.framework/Versions/3.7/bin/gsat", line 10, in <module>
import gsat.input_validation as input_validation
ModuleNotFoundError: No module named 'gsat'
The full project is at https://github.com/MikeAxtell/gsat , commit c680172. The project is also on PyPI as "gsat". The distribution files are being made like:
python3 setup.py sdist bdist_wheel
... and the full setup.py file is below. I'm sure this is some noob issue; I am new to python packaging and python programming in general, so thanks in advance for help!
setup.py:
import setuptools

with open("README.md", "r") as fh:
    long_description = fh.read()

setuptools.setup(
    name="gsat",
    version="0.1a",
    author="Michael J. Axtell",
    author_email="mja18@psu.edu",
    description="General Small RNA-seq Analysis Tool",
    long_description=long_description,
    long_description_content_type="text/markdown",
    url="https://github.com/MikeAxtell/gsat",
    scripts=['bin/gsat'],
    package_dir={'': 'src'},
    packages=setuptools.find_packages(where='gsat'),
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
    ],
    python_requires='>3.5, <4',
    install_requires=['biopython', 'python-Levenshtein'],
)
Looks like there is an issue in this line:
packages=setuptools.find_packages(where='gsat'),
I believe it should read the following instead:
packages=setuptools.find_packages(where='src'),
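To see why, a quick sketch that recreates the question's src/gsat/ layout in a temporary directory: find_packages(where=...) expects the directory that contains the packages, which here is src, not the package name.

```python
import os
import tempfile
import setuptools

# Recreate the question's layout: src/gsat/__init__.py
tmp = tempfile.mkdtemp()
os.makedirs(os.path.join(tmp, "src", "gsat"))
open(os.path.join(tmp, "src", "gsat", "__init__.py"), "w").close()

# 'where' must point at the directory that contains the package...
print(setuptools.find_packages(where=os.path.join(tmp, "src")))   # ['gsat']
# ...pointing it at a non-existent 'gsat' directory finds nothing:
print(setuptools.find_packages(where=os.path.join(tmp, "gsat")))  # []
```

With an empty packages list in the wheel, nothing lands in site-packages, which is exactly why `import gsat` fails after a PyPI install but works with `pip install -e .` (develop mode resolves packages through package_dir directly).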
Recently I created a Python script for PyPI that you can download with pip install. The problem is that you can only execute the script when you are in the Scripts folder of your Python installation (your_python_location/Scripts/myscript.py).
But this would be a hassle for users. So I wanted to ask: how can I make it so the script can be executed from anywhere (like you can with pip, without specifying pip's location)? I also don't want every user to have to set the path to the script.
My setup.py (maybe it's helpful):
import setuptools

with open("README.md", "r") as fh:
    long_description = fh.read()

with open('requirements.txt') as f:
    requirements = f.read().splitlines()

setuptools.setup(
    name="boston-housing-prediction",
    version="0.2.0a0",
    author="xx",
    author_email="xxx@gmail.com",
    py_modules=["misc_libary", "polynomial_regression_libary", "linear_regression_libary"],
    scripts=["boston-housing-main.py"],
    description="Predict housing prices in boston.",
    long_description=long_description,
    long_description_content_type="text/markdown",
    url="https://github.com/XXX",
    packages=setuptools.find_packages(),
    classifiers=[
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
        'Programming Language :: Python :: 3.5',
        'Programming Language :: Python :: 3.6',
        'Programming Language :: Python :: 3.7'
    ],
    keywords="regression meachine-learning housing_prices_boston learning_purpose",
    license="MIT",
    install_requires=requirements,
    python_requires='>=3.5',
)
You can specify entry_points in setup.py, like
setuptools.setup(
    # ...
    entry_points = {
        'console_scripts': [
            'boston_housing = boston_housing_main:main'
        ]
    },
)
This will cause pip install to install a wrapper somewhere like /usr/local/bin/boston_housing which basically pulls in the module boston_housing_main and runs its main() function. (Note that the module reference in an entry point cannot contain hyphens, so boston-housing-main.py would need to be renamed to something importable, such as boston_housing_main.py.)
You probably want to replace the scripts entry with this, though there is no reason per se you could not have both.
One approach to making a globally accessible Python script is to have your users call the module itself. If your package is called 'boston-housing-prediction', your users will be able to call your script from anywhere using the following command:
python -m boston-housing-prediction
What this does is run a __main__.py file inside your package. This is just like any other Python script, so it can take arguments normally. All you have to do is rename your script to __main__.py and drop it into the package directory (not the folder containing setup.py, but the folder containing the package scripts), or create a new __main__.py that calls your script (you can simply import the script if they are in the same folder).
The benefit of this approach is that it is platform independent, relying only on the proper installation of the packages. It doesn't rely on the OS at all.
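A minimal sketch of such a __main__.py (the file path and the delegated import are illustrative, not taken from the question's code):

```python
# boston_housing_prediction/__main__.py  (hypothetical layout)
import sys


def main(argv=None):
    """Entry point used by `python -m boston_housing_prediction`."""
    argv = sys.argv[1:] if argv is None else argv
    # Here you would import and call the real package code, e.g.:
    # from boston_housing_prediction import boston_housing_main
    print("running boston-housing-prediction with args:", argv)
    return 0


if __name__ == "__main__":
    main()
```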
I'm new to python and writing an application which I want to package for debian. Therefore, I'd like to (and I have to) use pybuilder. My goal is to create a .deb package (currently using stdeb), which makes sure, that all required libraries are installed, too.
My application uses a third party library, that is only available via pip(3) install (no debian package).
My build.py looks like:
[...]
use_plugin("python.core")
use_plugin("python.unittest")
use_plugin("python.install_dependencies")
use_plugin("python.distutils")
use_plugin("copy_resources")
use_plugin("source_distribution")
use_plugin("python.flake8")
use_plugin("python.coverage")
use_plugin("python.stdeb")
@init
def initialize(project):
    project.build_depends_on('coverage')
    project.build_depends_on('flake8')
    project.build_depends_on('jsonmerge')
    project.build_depends_on('mock')
    project.build_depends_on('setuptools')
    project.build_depends_on('stdeb')
    project.build_depends_on('unittest-xml-reporting')
    project.build_depends_on('xmlrunner')
    project.depends_on('<pip-only-library>')
    project.set_property('coverage_threshold_warn', 50)
    project.set_property('flake8_break_build', False)
    project.set_property('flake8_ignore', 'E501,E402,E731')
    project.set_property('flake8_include_test_sources', True)
    project.set_property('flake8_verbose_output', True)
    project.set_property('verbose', True)
    project.set_property("copy_resources_target", "$dir_dist")
    project.set_property("coverage_break_build", False)
    project.set_property("coverage_reset_modules", True)
    project.set_property("dir_dist_scripts", 'scripts')
    project.set_property("distutils_classifiers", [
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.3',
        'Programming Language :: Python :: 3.4',
        'Development Status :: 4',
        'Environment :: Console',
        'Intended Audience :: Systems Administration',
        'License :: OSI Approved :: BSD License'])
    project.set_property('distutils_commands', ['bdist'])
    project.set_property('distutils_commands', ['sdist'])
Generation of debian package works in general. Calling pyb successfully creates a python3-mypackage_1.0-1_all.deb package.
The generated setup.py looks like:
# target/dist/mypackage-1.0/setup.py
[...]
if __name__ == '__main__':
    setup(
        [...]
        packages = [],
        [...]
        entry_points = {},
        data_files = [],
        package_data = {},
        install_requires = ['pip-only-library'],
        dependency_links = [],
        zip_safe=True,
        cmdclass={'install': install},
    )
Testing package installation using sudo dpkg -i python3-mypackage_1.0-1_all.deb fails as dpkg refers to a dependent package python3-<pip-only-library>, which is not available.
During build time, the library is present on the local machine.
So now comes the newbie question: how do I change build.py to make sure that the created Debian package provides my application and that its library requirements are met?
Maybe it is possible to bundle the pip-only library (e.g. taken from /usr/local/lib/python3.4/dist-packages at build time) with my application and ship everything as an 'application with all dependencies' package? Or is there an alternative approach?
I've already seen this answer but it doesn't help.