I have the following directory structure:
lib/
    lib/
        pkg1/
            __init__.py
        pkg2/
            __init__.py
        data/   # has many subdirectories and files
    tests/
    .gitignore
    setup.py
The data folder (not a package) has files and other subfolders, and some of them are excluded from the git repo due to .gitignore.
I want to build a wheel for this project with python setup.py bdist_wheel and include the files in data, but I don't want any .gitignore'd file to end up in the final wheel. Does setuptools support that?
Not with setuptools alone; you need an additional plugin such as pbr or setuptools-scm on top of setuptools.
You can use setuptools_scm: its file finder only picks up files that are tracked by Git, so untracked files matched by .gitignore are left out of the build.
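A minimal sketch of what that looks like in setup.py (the package name is a placeholder, and it assumes the project is a Git checkout with setuptools_scm available):

from setuptools import setup, find_packages

setup(
    name="mypackage",                   # placeholder name
    use_scm_version=True,               # version derived from Git tags
    setup_requires=["setuptools_scm"],  # enables the Git-based file finder
    packages=find_packages(),
    include_package_data=True,          # with the SCM file finder, only Git-tracked data files are picked up
)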
I've restructured a project to the src directory structure. It looks like this:
root_dir/
    src/
        module1/
            __init__.py
            script1.py
            script2.py
        module2/
            __init__.py
            other_script1.py
            other_script2.py
        conftest.py
    setup.py
    tests/
        conftest.py
        some_tests/
            conftest.py
            test_some_parts.py
        some_other_tests/
            conftest.py
            test_these_other_parts.py
My setup.py looks like this:
setup(
    name='Project',
    version=0.0,
    author='Me',
    install_requires=['pyodbc'],
    tests_require=['pytest'],
    setup_requires=['pytest-runner'],
    test_suite='root_dir.Tests',
    entry_points={
        'console_scripts': ['load_data = module1.script1:main']
    },
    package_data={'Config': ['*.json']},
    packages=find_packages('src'),
    package_dir={'': 'src'})
I am running Anaconda3 on Windows 10. When I run python setup.py install, I am able to run the load_data script without any issue. However, from what I've been reading, it is preferable to use pip install . rather than python setup.py install. When I pip install the package and attempt to run load_data, I get ModuleNotFoundError: No module named 'module1.script1'. I've attempted adding 'src' to the front of this, but that doesn't work either. I don't understand what the differences are or how to troubleshoot this.
When the source distribution for the package is built, not all files are included. Try creating a MANIFEST.in file with:
recursive-include src/module1 *
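If module2 and the JSON files referenced by package_data should also make it into the distribution, the same pattern can be extended; the paths below are assumptions based on the layout above:

recursive-include src/module1 *
recursive-include src/module2 *
recursive-include src *.json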
I want to reuse some code for my internal team at work. My plan is to create a package and then have people install the package using pip straight out of our git repo. i.e. as shown here: https://pip.pypa.io/en/latest/reference/pip_install/#git
My question is, do I commit the dist folder to git? What is pip looking for?
Or is there a better way to share / reuse code internally for a team (across many different projects)?
I used a .gitignore file from here (is that GitHub's default Python .gitignore file?) and it ignores all the dist files:
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
but it seems wrong to exclude these from the repo when I'm trying to install from the repo.
You do not need to commit the dist folder. pip really just needs the repository to have a setup.py file along with the packages and/or modules you're installing.
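For example, teammates could then install straight from the repository with something like this (the URL, branch and egg name are placeholders):

pip install "git+https://git.example.com/team/mypackage.git@main#egg=mypackage"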
dist is a default name for a directory that contains the final build result: your project ready to be distributed, that is, packaged into a file which pip or other package managers know how to install:
$ python setup.py sdist --help
...
--dist-dir (-d) directory to put the source distribution archive(s) in
[default: dist]
So it is safe to ignore the directory and all of its contents in .gitignore. If you do not plan to upload your project's installation files to PyPI and intend to install it by passing a Git URL, you don't even need the dist directory and can safely delete it. It will be recreated anyway once you issue any dist command (sdist, bdist, bdist_wheel, bdist_rpm, etc.).
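For instance, after a fresh python setup.py sdist bdist_wheel the directory simply reappears with the build artifacts (file names here are illustrative):

dist/
    mypackage-1.0.tar.gz
    mypackage-1.0-py3-none-any.whl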
I am designing a python project like this:
packages/
    __init__.py
    setup.py
    requirement.txt          # requires package1
    commons/
        __init__.py
        setup.py
        requirement.txt
        Common_module.py
    package1/
        __init__.py
        setup.py
        requirement.txt      # requires commons
        Package1_module.py
When I do pip install -r requirement.txt -t ./installation, I would like it to create an installation folder containing both package1 and commons, but it seems that the dependencies of package1 are not resolved, leaving only package1 in the installation.
How can I resolve dependencies recursively?
After some research I found that requirement.txt should list all dependencies, but I really don't want to do that.
So I tried the following:
from distutils.core import setup

required = []
with open('requirements.txt') as f:
    for line in f.readlines():
        if not line.startswith('#'):
            required.append(line.rstrip())

setup(...
      install_requires=required)
But now it looks for my dependencies on the Internet and not in my folder, even though required is a list of local paths.
This is a simplified view of my issue. I can change my project architecture a bit, but let's assume that the top-level requirement.txt cannot know the dependencies of the sub-packages (like the commons package).
Is there a nice way to resolve the dependencies recursively?
Thanks!
I have a python project that I want to distribute. I read multiple tutorials on how to write my setup.py file and how to install the produced wheel: sample project example, setup.py tutorial, wheel doc, wheel install or wheel install.
The structure of my project is:
project_name
|_ lib
|_ project_folder
|_ py modules
|_ test
|_ setup.py
|_README.rst
I build my wheel with python setup.py bdist_wheel, then I copy the produced wheel into another folder outside my project and run pip install my_wheel. I also tried pip install --no-index --find-links=my_wheel project_name.
The problem is that when I look into my python site-packages folder, instead of having:
python folders
project_name
project_name-2.0.0.dist-info
the project_name folder is broken into lib and test:
python folders
lib
project_name-2.0.0.dist-info
test
I don't understand why my project_name isn't like the other python folders, grouped. Can someone help me understand better?
setup.py:
from setuptools import setup, find_packages
from codecs import open
from os import path

root_folder = path.abspath(path.dirname(__file__))
with open(path.join(root_folder, "README.rst"), encoding="utf-8") as f:
    long_description = f.read()

setup(
    name = "project",
    version = "2.0.0",
    description = "My project is cool",
    long_description = long_description,
    packages = find_packages(),
    include_package_data = True
)
find_packages() determines packages by looking for __init__.py files. It looks like your lib and test directories have __init__.py files in them.
Neither your lib nor your test directory is actually a package, so remove the __init__.py files from them. That way find_packages() will only include project_folder in the resulting distribution (source, binary or wheel).
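A quick way to sanity-check this (just a throwaway sketch, run from the directory containing setup.py) is to print what find_packages() discovers before rebuilding the wheel:

from setuptools import find_packages
print(find_packages())  # shows which directories setuptools will treat as packages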
When I install my software (XYZ) with the command python setup.py install, only the files in the parent directory are copied into the XYZ folder under site-packages. In setup.py I define all the packages and data_files I want to use. The software package structure is:
XYZ
    __init__.py
    main.py
    test1.py
    vector
        __init__.py
        vector1.py
        vector2.py
    exlib
        __init__.py
        lib1.py
        lib2.py
When I install with the setup.py install command, only main.py and test1.py are copied into the XYZ folder in site-packages. I want all the files under the XYZ folder to be copied when I run the install command. How can I modify the setup file, or is there another way to do this?
It sounds like your setup.py needs to have:
packages=['vector', 'exlib'],
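Alternatively (a hedged sketch, assuming setup.py sits next to the XYZ directory and every subdirectory keeps its __init__.py), find_packages() can discover the subpackages automatically instead of listing them by hand:

from setuptools import setup, find_packages

setup(
    name="XYZ",
    version="0.1",              # placeholder version
    packages=find_packages(),   # picks up XYZ, XYZ.vector and XYZ.exlib
)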