In Python, what are the differences between a module, sub-module, package, and sub-package?
package
|-- __init__.py
|-- module.py
|-- sub_package
|   |-- __init__.py
|   |-- sub_module.py
Think of packages and sub-packages as folders and sub-folders that contain an __init__.py file along with other Python files.
Modules are the Python files inside the package.
Sub-modules are the Python files inside the sub-package.
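As a quick illustration using the layout above (a minimal sketch; the names come from the example tree, not from a real project), the different pieces are imported like this:
import package                                # the package (runs package/__init__.py)
import package.module                         # a module inside the package
from package import module                    # the same module, bound to the name "module"
from package.sub_package import sub_module    # a sub-module inside the sub-package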
I have created a package with the following structure in the dev branch (not merging to main until I verify the package installs correctly):
mypackage
|-- __init__.py
|-- setup.py
|-- requirements.txt
|-- module.py
|-- subpackage_one
|   |-- __init__.py
|   |-- module_ab.py
|   |   |-- class_aba
|   |   |-- class_abb
|   |-- module_ac.py
|   |   |-- function_aca
|-- subpackage_two
|   |-- __init__.py
|   |-- module_ba.py
|   |   |-- function_baa
Additional information:
The __init__.py files at the root and in subpackage_two are both empty
The __init__.py file in subpackage_one contains some additional initialization in the form of from mypackage.subpackage_one.module_xx import class_xxx (or function_xxx); see the sketch after this list
I am installing the package via pip install git+https://github.com/organization/repo.git#dev
If I am in the root directory of the package, I can import the submodules as expected
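For concreteness, a sketch of what that subpackage_one/__init__.py might contain, using the module, class, and function names from the tree above (the exact combination of imports is an assumption based on the description):
# mypackage/subpackage_one/__init__.py (sketch; names taken from the tree above)
from mypackage.subpackage_one.module_ab import class_aba, class_abb
from mypackage.subpackage_one.module_ac import function_aca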
The setup.py file is:
import setuptools

with open("README.md", "r", encoding="utf-8") as fh:
    long_description = fh.read()

setuptools.setup(
    name='mypackage',
    version='0.0.2',
    author='author1, author2',
    author_email='author1_email, author2_email',
    description='My Package',
    long_description=long_description,
    long_description_content_type="text/markdown",
    url='https://github.com/organization/repo',
    packages=['mypackage'],
    install_requires=['requests'],
)
When I run the following snippet:
import pkgutil
import mypackage

for i in pkgutil.iter_modules(mypackage.__path__):
    print(i)
I see:
ModuleInfo(module_finder=FileFinder('/path/to/package/mypackage'), name='module', ispkg=False)
And indeed, the subpackages are not in the mypackage folder.
How can I get the subpackages to install along with the package?
Your issue might be the packages parameter: it needs to list every package and subpackage you want installed, not just the top-level one.
setuptools has a convenient function to find them for you; use it like this: packages=setuptools.find_namespace_packages(),
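Applied to the setup.py from the question, a minimal sketch (find_packages() would work just as well here, since every directory already has an __init__.py; the other arguments are unchanged and omitted for brevity):
import setuptools

setuptools.setup(
    name='mypackage',
    version='0.0.2',
    # discover mypackage and all of its subpackages instead of listing only 'mypackage'
    packages=setuptools.find_namespace_packages(),
    install_requires=['requests'],
)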
I use pants to manage a Python project that uses protocol buffers. Pants places the generated _pb2.py and _pb2.pyi files under a separate dist/codegen tree. Is it possible to get VS Code autocomplete to work when using the _pb2 modules?
The file tree looks like this:
.
|-- dist/
|   `-- codegen/
|       `-- src/
|           `-- project/
|               |-- data_pb2.py
|               `-- data_pb2.pyi
`-- src/
    `-- project/
        |-- __init__.py
        |-- code.py
        `-- data.proto
And in code.py I have import statements like this:
from project import data_pb2
I've tried setting python.analysis.extraPaths to ["dist/codegen/src"] in settings.json. This makes pylance stop complaining that data_pb2 is missing. But autocomplete still does not work, and pylance has no type information for members of data_pb2.
Replace your python.analysis.extraPaths with the following content:
"python.analysis.extraPaths": [
"./dist/codegen/src"
],
And add the following code to your code.py:
import sys
sys.path.append("./dist/codegen/src")
You can use Python implicit namespace packages (PEP 420) to make this work. Namespace packages allow modules within the same package to reside in different directories, which lets pylance and other tools work correctly when code is split between src and dist/codegen/src.
To use implicit namespace packages, you just need to remove src/project/__init__.py and leave "python.analysis.extraPaths" set to ["dist/codegen/src"].
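A minimal sketch of what this gives you at runtime, assuming you run from the repository root (the sys.path manipulation below only mimics what python.analysis.extraPaths does for pylance):
import sys
# With src/project/__init__.py removed, "project" becomes an implicit
# namespace package (PEP 420) spanning both source roots.
sys.path.extend(["src", "dist/codegen/src"])
import project
from project import code        # found under src/project/
from project import data_pb2    # found under dist/codegen/src/project/
print(project.__path__)         # lists both directories contributing to the namespace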
See also the GitHub issue microsoft/pylance-release#2855, which describes using implicit namespace packages to make pylance work correctly in a similar situation.
Here is my project hierarchy:
# __init__.py files are empty
folder
|-- globalFunctions.py
|-- __init__.py
|-- monitor
|   |-- file.py
|   |-- __init__.py
I'm trying to import functions from globalFunctions.py when I'm in file.py.
I have tried to import it with
from .. import globalFunctions
but I'm getting
ImportError: attempted relative import with no known parent package
Do you know how to make it work?
I'm trying to create a Python package with the following structure:
project/
|-- project/
| |-- __init__.py
| |-- templates/
| | |-- somefile.py
|-- setup.py
somefile.py is just a template file that is not syntactically correct.
My setup.py looks like this:
#!/usr/bin/env python
import os
from setuptools import setup, find_packages

setup(name="...",
      version="1.1",
      ...
      packages=find_packages(),
      package_data={
          'project': ['templates/*'],
      })
This works great for non-Python template files. But with the .py files, setuptools tries to byte-compile somefile.py, which results in a SyntaxError since the file is intentionally not syntactically correct. So, how can I include the template Python files in my package without compiling them?
I have a big project with the following structure. utilities is a collection of small modules that are reused in various places by the different components of big_project: project1, project2, etc.
big_project/
|-- __init__.py
|-- utilities/
|   |-- mod1.py
|   |-- mod2.py
|-- project1/
|   |-- setup.py
|   |-- __init__.py
|   |-- src/
|   |   |-- __init__.py
|   |   |-- mod1.py
|   |   |-- mod2.py
|   |-- examples/
|   |   |-- __init__.py
|   |   |-- mod.py
|-- project2/
|   |-- ...
|-- project3/
|   |-- ...
I want to distribute project1, including utilities (because I don't want to distribute utilities separately). The distributed package would have the following structure:
project1/
|-- utilities/
|-- src/
|-- examples/
and project1/setup.py looks like this:
setup(
    name = 'project1',
    packages = ['project1.utilities', 'project1.src', 'project1.examples'],
    package_dir = {'project1.utilities': '../utilities/',
                   'project1.src': 'src',
                   'project1.examples': 'examples'}
)
The problem: python setup.py bdist produces a distribution with the right structure, but python setup.py sdist doesn't:
bdist: content of project1-0.1.linux-x86_64.tar.gz:
/./usr/local/lib/python2.7/site-packages/
|-- project1/
|   |-- utilities
|   |-- src
|   |-- examples
sdist: content of project1-0.1.tar.gz:
project1/
|-- src/
|-- examples/
So sdist left out the utilities module, whereas bdist included it at the correct location. Why?
If anyone wants to look at the real project: https://testpypi.python.org/pypi/microscopy, where both the bdist and sdist archives are available.
Both setuptools and distutils produce the same result. Because the project is pure Python, I'd rather use sdist...
One way that seems to work is to use bdist_wheel, which despite the 'bdist' in its name produces a platform-agnostic wheel when the content is pure Python. And wheels are supposed to be the new standard.
setup.py also needs to be told about the root package project1, otherwise project1/__init__.py is missing:
setup(
    name = 'project1',
    packages = ['project1',
                'project1.utilities',
                'project1.src',
                'project1.examples'],
    package_dir = {'project1': '.',
                   'project1.utilities': '../utilities/',
                   'project1.src': 'src',
                   'project1.examples': 'examples'}
)
and then
python2.7 setup.py bdist_wheel
I suggest updating your MANIFEST.in file to include the utilities folder,
e.g. recursive-include ../utilities *