I have a Python project that I want to distribute. I read multiple tutorials on how to write my setup.py file and how to install the produced wheel: the sample project example, a setup.py tutorial, the wheel documentation, and two wheel installation guides.
The structure of my project is:
project_name
|_ lib
|  |_ project_folder
|     |_ py modules
|_ test
|_ setup.py
|_ README.rst
I build my wheel with python setup.py bdist_wheel, then copy the produced wheel into another folder outside my project and run pip install my_wheel. I also tried pip install --no-index --find-links=my_wheel project_name.
The problem is that when I look into my python site-packages folder, instead of having:
python folders
project_name
project_name-2.0.0.dist-info
the project_name folder is broken into lib and test:
python folders
lib
project_name-2.0.0.dist-info
test
I don't understand why my project_name package isn't grouped into a single folder like the other Python packages. Can someone help me understand what is going on?
setup.py:
from setuptools import setup, find_packages
from codecs import open
from os import path

root_folder = path.abspath(path.dirname(__file__))

with open(path.join(root_folder, "README.rst"), encoding="utf-8") as f:
    long_description = f.read()

setup(
    name = "project",
    version = "2.0.0",
    description = "My project is cool",
    long_description = long_description,
    packages = find_packages(),
    include_package_data = True
)
find_packages() discovers packages by looking for __init__.py files, and it looks like your lib and test directories contain __init__.py files, so they get picked up as top-level packages.
Neither your lib nor your test directory is really a package, so remove the __init__.py files from them. That way find_packages() will only include project_folder in the resulting distribution (source, binary, or wheel).
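If you would rather keep those __init__.py files (for example so the test suite stays importable), an alternative is to exclude the unwanted directories explicitly. A minimal sketch, assuming the directory names from the question:

from setuptools import setup, find_packages

setup(
    name="project",
    version="2.0.0",
    # keep only the real package; leave the test tree (and, if needed, lib)
    # out of the built distribution
    packages=find_packages(exclude=["test", "test.*"]),
)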
Related
I've restructured a project to the src directory structure. It looks like this:
root_dir/
    src/
        module1/
            __init__.py
            script1.py
            script2.py
        module2/
            __init__.py
            other_script1.py
            other_script2.py
    conftest.py
    setup.py
    tests/
        conftest.py
        some_tests/
            conftest.py
            test_some_parts.py
        some_other_tests/
            conftest.py
            test_these_other_parts.py
My setup.py looks like this:
from setuptools import setup, find_packages

setup(
    name='Project',
    version=0.0,
    author='Me',
    install_requires=['pyodbc'],
    tests_require=['pytest'],
    setup_requires=['pytest-runner'],
    test_suite='root_dir.Tests',
    entry_points={
        'console_scripts': ['load_data = module1.script1:main']
    },
    package_data={'Config': ['*.json']},
    packages=find_packages('src'),
    package_dir={'': 'src'})
I am running Anaconda3 on Windows 10. When I run python setup.py install, I can run the load_data script without any issue. However, from what I've been reading, it is preferable to use pip install . instead of python setup.py install. When I pip install the package and attempt to run load_data, I get ModuleNotFoundError: No module named 'module1.script1'. I've tried prefixing the module path with 'src', but that doesn't work either. I don't understand what the differences are or how to troubleshoot this.
When building the source distribution for the package, not all files are included. Try creating a MANIFEST.in file containing:
recursive-include src/module1 *
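As a quick sanity check (the exact archive name and format depend on your platform and on the name/version in setup.py, so treat them as a guess), rebuild the source distribution, inspect the archive under dist/ to confirm the src/module1 files are now included, and then reinstall:

python setup.py sdist
# inspect dist/Project-0.0.zip (or .tar.gz) and verify src/module1/*.py are listed
pip install .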
Following is the folder structure:
Utility/
    utils/
        __init__.py
        wrapper
        auditory.py
    setup.py
I have been trying to install utils as a site-package by running python setup.py install.
When I check site-packages, there is an egg file that contains my utils folder and the egg-info.
But it should create my utils folder directly inside site-packages, right?
Am I missing something here?
from setuptools import setup

setup(
    name='utils',
    version='0.1',
    packages=['utils'],
    license='Internal use only',
    zip_safe=False
)
Ideally it should place the utils folder directly inside site-packages (with just the egg-info metadata alongside it), so that the utils package would be importable just like pandas and other installed packages.
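One thing worth trying (an aside, since the question only shows python setup.py install): installing the same project with a recent pip avoids the egg layout and places the package as a plain directory plus a .dist-info folder in site-packages:

# run from the Utility/ directory that contains setup.py
pip install .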
In my project, I have a single setup.py file that builds multiple modules using the following namespace pattern:
from setuptools import setup

setup(name="testmoduleserver",
      packages=["testmodule.server", "testmodule.shared"],
      namespace_packages=["testmodule"])

setup(name="testmoduleclient",
      packages=["testmodule.client", "testmodule.shared"],
      namespace_packages=["testmodule"])
I am trying to build wheel files for both packages. However, when I do:
python -m pip wheel .
It only ever builds the package for one of the definitions.
Why does only one package get built?
You cannot call setuptools.setup() more than once in your setup.py, even if you want to create several packages out of one codebase.
Instead, you need to split everything into separate namespace packages and have one setup.py for each (they can all live in one Git repository!):
testmodule/
    testmodule-client/
        setup.py
        testmodule/
            client/
                __init__.py
    testmodule-server/
        setup.py
        testmodule/
            server/
                __init__.py
    testmodule-shared/
        setup.py
        testmodule/
            shared/
                __init__.py
And each setup.py contains something along the lines of:
from setuptools import setup

setup(
    name='testmodule-client',
    packages=['testmodule.client'],
    install_requires=['testmodule-shared'],
    ...
)

and

from setuptools import setup

setup(
    name='testmodule-server',
    packages=['testmodule.server'],
    install_requires=['testmodule-shared'],
    ...
)

and

from setuptools import setup

setup(
    name='testmodule-shared',
    packages=['testmodule.shared'],
    ...
)
To build all three wheels you then run
pip wheel testmodule-client
pip wheel testmodule-server
pip wheel testmodule-shared
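To then install one of the resulting wheels without contacting an index (a hedged example that assumes all three wheels ended up in the current directory), point pip at that directory; the testmodule-shared dependency is picked up from the same place via install_requires:

pip install --no-index --find-links . testmodule-client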
I'm trying to get the VERSION file in the root of my Python project installed so that I can read it once the package is installed. However, using MANIFEST.in, the file is placed in the top-level /usr/lib/python3.6/site-packages rather than inside /usr/lib/python3.6/site-packages/mypackage.
I'm doing this so that I can display the package's version at runtime and also easily make use of it inside the repo.
directory structure:
setup.py
MANIFEST.in
VERSION
mypackage/
- __init__.py
- __main__.py
- foo.py
MANIFEST.in:
include VERSION
setup.py:
#!/usr/bin/env python3
from setuptools import setup, find_packages

with open("VERSION", "r") as versionFile:
    version = versionFile.read().strip()

setup(
    name="mypackage",
    version=version,
    packages=find_packages(),
    include_package_data=True)
mypackage/__main__.py:
...
# resource_filename returns an on-disk path that can be opened
verFile = pkg_resources.resource_filename(__name__, "VERSION")
with open(verFile, "r") as fin:
    version = str(fin.read().strip())
print(version)
...
How can I get the VERSION file to install inside of the package directory?
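One common workaround (a sketch, and it assumes you are free to move the file) is to keep VERSION inside mypackage/ and declare it as package data, so it installs next to the modules that read it:

# setup.py -- assumes the file now lives at mypackage/VERSION
from setuptools import setup, find_packages

with open("mypackage/VERSION", "r") as versionFile:
    version = versionFile.read().strip()

setup(
    name="mypackage",
    version=version,
    packages=find_packages(),
    package_data={"mypackage": ["VERSION"]},  # ship the file inside the package
)

With the file inside the package, the pkg_resources lookup in __main__.py (e.g. pkg_resources.resource_filename("mypackage", "VERSION")) resolves to a path under site-packages/mypackage.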
I am designing a python project like this:
packages/
    __init__.py
    setup.py
    requirements.txt    # requires package1
    commons/
        __init__.py
        setup.py
        requirements.txt
        Common_module.py
    package1/
        __init__.py
        setup.py
        requirements.txt    # requires commons
        Package1_module.py
When I run pip install -r requirements.txt -t ./installation, I would like it to create an installation folder containing both package1 and commons, but it seems that pip doesn't resolve package1's dependencies, leaving the installation with package1 only.
How can I resolve the dependencies recursively?
After some research I found that requirements.txt should list all dependencies, but I really don't want that.
So I tried the following:
from distutils.core import setup

required = []
with open('requirements.txt') as f:
    for line in f:  # iterate over lines, not over the characters of readline()
        if not line.startswith('#'):
            required.append(line.rstrip())

setup(...
      install_requires=required)
But now pip looks for my dependencies on the Internet rather than in my local folders, even though required is a list of local paths.
This is a simplified view of my issue. I can change my project architecture a bit, but let's assume that the top-level requirements.txt cannot know the dependencies of the sub-packages (like the commons package).
Is there a nice way to resolve the dependencies recursively?
Thanks!
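One approach that keeps each package's dependencies declared in its own setup.py (a sketch; the wheelhouse directory name is my own, and it assumes the project names in setup.py match the requirements entries) is to build the sub-packages into a local directory first and then point pip at it with --find-links, so names like commons in install_requires are resolved locally instead of from the Internet:

# build wheels for the local sub-packages (no dependency resolution needed at this step)
pip wheel --no-deps ./commons ./package1 -w ./wheelhouse

# install only from that local directory, into ./installation
pip install --no-index --find-links ./wheelhouse -r requirements.txt -t ./installation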