How do I import my modules streaming_capture, streaming_information, and report in another project?
I wrote a Python project called long_term_streaming_info_capture,
then installed it in a new project inside a virtualenv environment.
If I want to import StreamingCapture, I have to write from long_term_streaming_info_capture.scripts.streaming_capture import StreamingCapture.
Can I just use long_term_streaming_info_capture.streaming_capture, without the scripts folder in the import path?
(develop+-)$ tree -L 3 -P "*.py"
.
├── helpers
│   ├── __init__.py
│   └── animation
│       ├── __init__.py
│       ├── animation_helper.py
│       ├── dqa_file_io.py
│       ├── dqa_telnet.py
│       ├── file_io_helper.py
│       ├── shm_controller.py
│       ├── telnet_helper.py
│       ├── test_dqa_file_io.py
│       └── test_dqa_telnet.py
├── log
├── main.py
├── report.py
├── sandbox
├── streaming_capture.py
└── streaming_information.py
Project skeleton
.
├── HACKING.txt
├── MANIFEST.in
├── NEWS.txt
├── README.rst
├── bootstrap.py
├── buildout.cfg
├── setup.py
└── src
    ├── long_term_streaming_info_capture
    │   ├── __init__.py
    │   ├── __init__.pyc
    │   ├── docs
    │   ├── scripts
    │   └── tests
    └── long_term_streaming_info_capture.egg-info
        ├── PKG-INFO
        ├── SOURCES.txt
        ├── dependency_links.txt
        ├── entry_points.txt
        ├── not-zip-safe
        └── top_level.txt
Here is my setup.py:
from setuptools import setup, find_packages
import sys, os
here = os.path.abspath(os.path.dirname(__file__))
README = open(os.path.join(here, 'README.rst')).read()
NEWS = open(os.path.join(here, 'NEWS.txt')).read()
version = '0.1'
install_requires = [
    # List your project dependencies here.
    # For more details, see:
    # http://packages.python.org/distribute/setuptools.html#declaring-dependencies
]
setup(name='long_term_streaming_info_capture',
      version=version,
      description="for capturing fps framerate",
      long_description=README + '\n\n' + NEWS,
      classifiers=[
          # Get strings from http://pypi.python.org/pypi?%3Aaction=list_classifiers
      ],
      keywords='',
      license='',
      packages=find_packages('src'),
      package_dir={'': 'src'},
      include_package_data=True,
      zip_safe=False,
      install_requires=install_requires,
      entry_points={
          'console_scripts':
              ['long_term_streaming_info_capture=long_term_streaming_info_capture:main']
      }
)
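One way to get the shorter import path asked about above is to re-export the class from the package's top-level __init__.py. A minimal sketch, assuming scripts/ contains an __init__.py and streaming_capture.py defines StreamingCapture:
# src/long_term_streaming_info_capture/__init__.py
# hypothetical re-export, so callers don't need the scripts subpackage in the path
from .scripts.streaming_capture import StreamingCapture
__all__ = ['StreamingCapture']
After reinstalling, from long_term_streaming_info_capture import StreamingCapture works. Note that import long_term_streaming_info_capture.streaming_capture as a dotted module path would still fail, because streaming_capture remains a submodule of scripts.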
Related
I'm trying to use the basic example from the docs. I am able to successfully generate an sdist, but I want to be able to install from wheel. However, when installing from wheel instead of from the sdist, I don't have a proj directory that I can import, only pkg1 and pkg2.
Starting from this directory:
├── proj
│   ├── __init__.py
│   ├── additional
│   │   └── __init__.py
│   ├── pkg1
│   │   └── __init__.py
│   └── pkg2
│       └── __init__.py
├── pyproject.toml
└── setup.cfg
I tried using this setup.cfg:
[metadata]
name = p1
version = 0.0.1
[options]
packages =
    find:
package_dir =
    =proj
[options.packages.find]
where = proj
include = pkg*
exclude = additional
To generate this file structure, which is successfully generated in the source distribution:
├── PKG-INFO
├── README.md
├── proj
│   ├── p1.egg-info
│   │   ├── PKG-INFO
│   │   ├── SOURCES.txt
│   │   ├── dependency_links.txt
│   │   └── top_level.txt
│   ├── pkg1
│   │   └── __init__.py
│   └── pkg2
│       └── __init__.py
├── pyproject.toml
└── setup.cfg
But the wheel has this structure:
├── p1-0.0.1.dist-info
│   ├── METADATA
│   ├── RECORD
│   ├── WHEEL
│   └── top_level.txt
├── pkg1
│   └── __init__.py
└── pkg2
    └── __init__.py
This makes it impossible to use:
import proj
as there is no such module, only the packages pkg1 and pkg2.
I made a repo for this here: https://github.com/spensmith/setuptools-find.
Steps to recreate:
Clone the base repository here: https://github.com/spensmith/setuptools-find
cd setuptools-find
python -m build
mkdir dist/mywheel
unzip dist/p1-0.0.1-py3-none-any.whl -d dist/mywheel
tree dist/mywheel
I must be missing something basic, any help would be so appreciated!
Thank you, Spencer.
My question was answered directly on the setuptools discussion forum. https://github.com/pypa/setuptools/discussions/3185#discussioncomment-2411330.
Effectively, I shouldn't be using
package_dir =
    =proj
because that tells setuptools that all of my code lives inside that folder, so the packages end up installed without the proj prefix. Instead, I should remove it entirely and then change my exclude to look like:
[options.packages.find]
exclude = proj.additional
Finally, yielding this file:
[metadata]
name = p1
version = 0.0.1
[options]
packages =
    find:
[options.packages.find]
exclude = proj.additional
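As a quick sanity check after rebuilding, the wheel contents should now show the packages under proj/ instead of top-level pkg1 and pkg2. A minimal sketch, assuming the wheel filename from the build steps above:
# list the wheel contents; the filename is assumed from the build steps above
import zipfile
print(zipfile.ZipFile('dist/p1-0.0.1-py3-none-any.whl').namelist())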
I'm packaging a little Python package. I'm a complete newbie to Python packaging; my directory structure is as follows (up to the second level of nesting):
.
├── data
│   ├── images
│   ├── patches
│   └── train.csv
├── docker
│   ├── check_gpu.py
│   ├── Dockerfile.gcloud_base
│   ├── Dockerfile.gcloud_myproject
├── env.sh
├── gcloud_config_p100.yml
├── legacy
│   ├── __init__.py
│   ├── notebooks
│   └── mypackage
├── notebooks
│   ├── EDA.ipynb
│   ├── Inspect_patches.ipynb
├── README.md
├── requirements.txt
├── scripts
│   ├── create_patches_folds.py
│   └── create_patches.py
├── setup.py
└── mypackage
    ├── data
    ├── img
    ├── __init__.py
    ├── jupyter
    ├── keras_utils
    ├── models
    ├── train.py
    └── util.py
My setup.py:
import os
from setuptools import setup, find_packages
REQUIRED_PACKAGES = [
    "h5py==2.9.0",
    "numpy==1.16.4",
    "opencv-python==4.1.0.25",
    "pandas==0.24.2",
    "keras==2.2.4",
    "albumentations==0.3.1"
]
setup(
    name='mypackage',
    version='0.1',
    install_requires=REQUIRED_PACKAGES,
    packages=find_packages(include=["mypackage.*"]),
    include_package_data=False
)
The code I want to package corresponds only to the mypackage directory. That's why I passed "mypackage.*" to find_packages and used include_package_data=False.
If I run:
python setup.py sdist
the whole project structure gets packaged into the resulting tar.gz file.
Does anyone know how to package just the modules inside mypackage/ and the top-level README file? I'm not finding this in the setuptools docs.
The first thing to fix is
packages=find_packages(include=["mypackage"]),
But you also need to understand that the sdist is mostly controlled by the file MANIFEST or its template MANIFEST.in, not by setup.py. You can compare what sdist creates with what bdist_wheel or bdist_egg create; the content of bdist_* is controlled by setup.py.
So my advice is to create the following MANIFEST.in:
prune *
include README.md
recursive-include mypackage *.py
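To confirm that the MANIFEST.in behaves as expected, a minimal sketch for listing what actually ended up in the sdist (the archive name is assumed from name='mypackage' and version='0.1' in setup.py):
# print the members of the generated sdist; filename assumed from setup.py metadata
import tarfile
print(tarfile.open('dist/mypackage-0.1.tar.gz').getnames())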
My Python project installs via setup.py. The project structure looks like:
├── Makefile
├── README.rst
├── circle.yml
├── docs
│   ├── Makefile
│   ├── conf.py
│   ├── deps.txt
│   ├── guide_installation.rst
│   ├── guide_model.rst
│   ├── guide_transliteration.rst
│   ├── index.rst
│   ├── make.bat
│   └── module_trans.rst
├── indictrans
│   ├── __init__.py
│   ├── _decode
│   ├── _utils
│   ├── base.py
│   ├── iso_code_transformer.py
│   ├── libindic_
│   ├── mappings
│   ├── models
│   ├── polyglot_tokenizer
│   ├── script_transliterate.py
│   ├── test.py
│   ├── tests
│   ├── transliterator.py
│   ├── trunk
│   └── unicode_marks.py
├── requirements.txt
├── setup.cfg
├── setup.py
├── test-requirements.txt
└── tox.ini
where the subfolder indictrans/models looks like
├── ben-eng
│   ├── classes.npy
│   ├── coef.npy
│   ├── intercept_final.npy
│   ├── intercept_init.npy
│   ├── intercept_trans.npy
│   └── sparse.vec
├── ben-guj
│   ├── classes.npy
│   ├── coef.npy
│   ├── intercept_final.npy
│   ├── intercept_init.npy
│   ├── intercept_trans.npy
│   └── sparse.vec
so I have .npy and .vec files that need to be included in the project.
In my setup.py I'm trying to explicitly include this models folder via the include_package_data directive, like:
setup(
    setup_requires=['pbr'],
    pbr=True,
    packages=find_packages(),
    include_package_data=True,
    package_data={'models': ['*.npy', '*.vec']},
    ext_modules=cythonize(extensions)
)
and in the setup.cfg I have
[files]
packages =
    indictrans
but running python setup.py install does not copy the models folder to the installation folder /usr/local/lib/python2.7/dist-packages/indictrans/.
If I print the output of find_packages I get
['indictrans', 'indictrans.tests', 'indictrans.libindic_', 'indictrans._utils', 'indictrans._decode', 'indictrans.polyglot_tokenizer', 'indictrans.models', 'indictrans.trunk', 'indictrans.libindic_.utils', 'indictrans.libindic_.soundex', 'indictrans.libindic_.utils.tests', 'indictrans.libindic_.soundex.utils', 'indictrans.libindic_.soundex.tests', 'indictrans.libindic_.soundex.utils.tests', 'indictrans.polyglot_tokenizer.tests', 'indictrans.trunk.tests']
so I would assume that indictrans/models is included, but it is not.
Add include_package_data=True to your setup() call (you already did that).
Create a file MANIFEST.in in the same directory as setup.py.
MANIFEST.in can look as follows:
include indictrans/models/ben-eng/*
include indictrans/models/ben-guj/*
You don't need setup.cfg for doing this.
Source: This great writeup of Python packaging
EDIT about recursive-include:
According to the documentation this should also work:
recursive-include indictrans/models *.npy *.vec
include_package_data=True requires MANIFEST.in.
To include data for the package indictrans.models you have to provide the full name:
package_data={'indictrans.models': ['*.npy','*.vec']},
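Putting both answers together, a minimal sketch of the relevant part of the setup() call (all other arguments as in the question) could look like this:
setup(
    setup_requires=['pbr'],
    pbr=True,
    packages=find_packages(),
    include_package_data=True,
    # the key is the dotted package name, not just 'models'
    package_data={'indictrans.models': ['*.npy', '*.vec']},
    ext_modules=cythonize(extensions)
)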
I have created a package named aTask on my local machine and successfully installed it (no error occurred) using pip install -e ./tasks. To double-check, I ran pip list | grep aTask and got aTask (0.2.0, /Users/WorkPlace/projects/tasks).
But when I try to import it in my interpreter, I get an error:
In [1]: import aTask
ImportError Traceback (most recent call last)
<ipython-input-1-d76cc8326300> in <module>()
----> 1 import aTask
ImportError: No module named aTask
Would you please give me a hint to resolve my case? Below I put more details.
setup.py consists of
from setuptools import setup, find_packages
package_name = 'aTask'
version = '0.2.0'
install_requires = ['pandas==0.21.1', 'numpy==1.13.3', 'scikit-learn==0.19.1', 'scipy==1.0.0']
CLASSIFIERS = [
    'Operating System :: OS Independent',
    'Programming Language :: Python :: 2.7'
]
description = 'an example to install a package'
setup(name=package_name,
      description=description,
      author='XXXX',
      version=version,
      packages=find_packages(exclude=['test', 'test.*']),
      platforms=['Any'],
      install_requires=install_requires,
      classifiers=CLASSIFIERS)
My folder structure is as follows:
> tree .
.
├── DataPopulation
│   └── main.py
├── README.md
├── Strategies
│   ├── Exceptions
│   │   ├── __init__.py
│   │   └── __init__.pyc
│   ├── IStrategy.py
│   ├── IStrategy.pyc
│   ├── Algorithms
│   │   ├── SVM.py
│   │   ├── SVM.pyc
│   │   ├── __init__.py
│   │   ├── __init__.pyc
│   │   ├── miscellaneous.py
│   │   └── miscellaneous.pyc
│   ├── __init__.py
│   └── __init__.pyc
├── StrategyBasicContext.py
├── StrategyBasicContext.pyc
├── Ultils
│   ├── DataIO
│   │   └── __init__.py
│   ├── __init__.py
│   └── __init__.pyc
├── __init__.py
├── __init__.pyc
├── requirements.txt
└── setup.py
I want to make a Python package with C extensions. I want this to be done with Cython. My structure is:
.
├── build
│   ├── lib.linux-i686-2.7
│   │   └── pyA13SOM
│   │       ├── cython
│   │       │   └── spi.so
│   │       └── __init__.py
│   └── temp.linux-i686-2.7
│       └── pyA13SOM
│           └── cython
│               ├── my_test.o
│               └── spi.o
├── CHANGES.txt
├── Makefile
├── MANIFEST
├── pyA13SOM
│   ├── cython
│   │   ├── clibraries
│   │   │   └── spi_test.c
│   │   ├── __init__.py
│   │   ├── __init__.pyc
│   │   ├── spi.c
│   │   ├── spi.pyx
│   │   └── spi.so
│   ├── gpio
│   │   ├── gpio.c
│   │   ├── gpio_lib.c
│   │   ├── gpio_lib.h
│   │   ├── __init__.py
│   │   └── __init__.pyc
│   ├── i2c
│   │   ├── i2c.c
│   │   ├── i2c_lib.c
│   │   ├── i2c_lib.h
│   │   ├── __init__.py
│   │   └── __init__.pyc
│   ├── __init__.py
│   ├── __init__.pyc
│   ├── spi
│   │   ├── __init__.py
│   │   ├── __init__.pyc
│   │   ├── spi.c
│   │   ├── spi_lib.c
│   │   └── spi_lib.h
│   └── utilities
│       └── color.h
├── README.txt
└── setup.py
My setup file is:
from distutils.core import setup
from distutils.core import Extension
from Cython.Build import cythonize
from Cython.Distutils import build_ext
module_gpio = Extension('pyA13SOM.gpio',
                        sources=['pyA13SOM/gpio/gpio_lib.c', 'pyA13SOM/gpio/gpio.c'])
module_i2c = Extension('pyA13SOM.i2c',
                       sources=['pyA13SOM/i2c/i2c_lib.c', 'pyA13SOM/i2c/i2c.c'])
module_spi = Extension('pyA13SOM.spi',
                       define_macros=[('CYTHON_IN_USE', '1')],
                       sources=['pyA13SOM/spi/spi_lib.c', 'pyA13SOM/spi/spi.c'])
setup(
    name='pyA13SOM',
    version='0.2.0',
    packages=['pyA13SOM'],
    # ext_modules=[module_gpio, module_i2c, module_spi],
    cmdclass={'build_ext': build_ext},
    ext_modules=cythonize("pyA13SOM/cython/*.pyx"),
)
The tree is in ~/mydir/. I go to ~/mydir/ and do: python setup.py install.
Everything in the build process is OK. Next I try to test the import. When I import pyA13SOM.cython.spi, it should give me a "Hello world" message. And it does:
~/mydir/$ python -c "import pyA13SOM.cython.spi"
Test:
Hellowwwwwwwwwww!
But when I do this from another directory:
~/someotherdir/$ python -c "import pyA13SOM.cython.spi"
ImportError: No module named cython.spi
Any idea why this happens?
You might need to include the directory in which your newly built spi module is located in your $PYTHONPATH; otherwise Python cannot find the file to import it. While you are in ~/mydir/, Python searches the local path, if I am not mistaken...
Depending on the shell you are using, you can add the ~/mydir/ directory to the PYTHONPATH with the following:
for the bash and sh shells:
PYTHONPATH=$PYTHONPATH:~/mydir/
export PYTHONPATH
for the csh/tcsh environment:
set PYTHONPATH = ($PYTHONPATH ~/mydir/)
These two commands add the ~/mydir/ temporarily to your $PYTHONPATH. If you want to add the path permanently, you will have to add the above commands to your ~/.bashrc or ~/.tcshrc, respectively.
Hope this helps...