File structure:
.
|-- rdir
|   |-- __init__.py
|   |-- core
|   |   |-- __init__.py
|   |   |-- rdir_core.py
|   |   |-- rdir_node.py
|   |-- generateHTML
|   |   |-- __init__.py
|   |-- rdir.py
|-- setup.py
setup.py:
from setuptools import setup, find_packages

setup(
    name="rdir",
    version="0.40",
    description="....",
    author="lhfcws",
    author_email="lhfcws#gmail.com",
    url="...",
    license="MIT",
    packages=["rdir"],
    scripts=["rdir/rdir.py"],
    install_requires=['colorama', 'pyquery'],
)
Commands:
sudo python setup.py install        # local install
sudo python setup.py sdist upload   # PyPI upload
Trying from rdir import rdir from another path (such as the home directory) only gives:
ImportError: No module named core.rdir_core
Of course, it works well if I import rdir in the project directory.
I also looked into site-packages/rdir.egg-info/ and found that all the .py files had been collected into a flat structure:
EGG-INFO
├── PKG-INFO
├── SOURCES.txt
├── dependency_links.txt
├── not-zip-safe
├── requires.txt
├── scripts
│ ├── __init__.py
│ ├── generate_page.py
│ ├── rdir.py
│ ├── rdir_core.py
│ └── rdir_node.py
└── top_level.txt
I also tried importing just rdir_core from rdir.py, and that works correctly. So I guessed something is wrong with my setup.py. I read some demos, the setup.py files of well-known Python projects on GitHub, and the official manuals, and changed my setup.py according to those references, but everything failed. I have no idea, so I have to ask for help.
Is it something wrong with my setup.py? Or is there anything I've missed?
Or please show me a good example of a setup.py of multi file structure projects. Thank you!
BTW, if the above doesn't give you enough information, please look at rdir on GitHub.
It's a problem with the packages keyword in your setup.py: you should list the subpackages as well as the top-level package.
packages=['rdir', 'rdir.core', 'rdir.generateHTML'],
Or use find_packages, which you have already imported:
packages=find_packages(),
I didn't try the sdist step; maybe it's just collecting all the .py files as scripts.
P.S. You can use python setup.py build to test the result folder structure.
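To see what find_packages() would return for this layout, here is a small sketch that recreates the directory tree from the question in a scratch directory; the package names come from the question, nothing else is assumed:

```python
import os
import tempfile

from setuptools import find_packages

# Recreate the layout from the question in a scratch directory and see
# which packages find_packages() discovers there.
root = tempfile.mkdtemp()
for pkg in ("rdir", "rdir/core", "rdir/generateHTML"):
    os.makedirs(os.path.join(root, pkg), exist_ok=True)
    # find_packages() (unlike find_namespace_packages) only picks up
    # directories that contain an __init__.py
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

found = sorted(find_packages(root))
print(found)  # ['rdir', 'rdir.core', 'rdir.generateHTML']
```

All three names end up in the result, which is exactly what the packages keyword needs to contain.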
Inspired by Ray, I've solved it myself.
It was a problem with scripts in my setup.py.
The project had a flat structure before 0.30, so it used to work well. On my machine, however, the scripts option installs the scripts into /usr/local/bin and generates a flat structure in the egg.
If I don't remove the old scripts in /usr/local/bin, the Python interpreter checks those scripts under /usr/local/bin first, which causes the error.
So the solution is:
remove scripts from setup.py
set packages=find_packages()
sudo rm /usr/local/bin/rdir*
Thanks to you all.
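For reference, a minimal sketch of what the fixed setup.py can look like; a console_scripts entry point replaces the scripts option, and it assumes rdir/rdir.py exposes a main() function (hypothetical name):

```python
from setuptools import setup, find_packages

setup(
    name="rdir",
    version="0.40",
    license="MIT",
    # picks up rdir, rdir.core and rdir.generateHTML
    packages=find_packages(),
    # instead of scripts=[...], expose an entry point so no bare .py
    # files land in /usr/local/bin and shadow the installed package
    entry_points={
        "console_scripts": ["rdir = rdir.rdir:main"],  # main() is assumed
    },
    install_requires=["colorama", "pyquery"],
)
```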
Related
I have a Python C extension module which relies on static libraries. Below is my file tree; I haven't included all the files, to keep the problem simple.
folder/
├── src/
│   ├── main.c
│   └── other.c
├── include/
│   ├── glfw3native.h
│   └── glfw3.h
├── lib/
│   └── libglfw3.a
└── setup.py
Below is my setup.py file; I have removed some unnecessary lines.
import setuptools

setuptools.setup(
    ext_modules=[
        setuptools.Extension(
            "Module.__init__",
            ["src/main.c", "src/other.c"],
            include_dirs=["include"],
            library_dirs=["lib"],
            libraries=["glfw"],
        )
    ]
)
I can successfully compile my project with the following command.
python setup.py bdist_wheel
Now I want to use cibuildwheel to compile my project for multiple platforms.
cibuildwheel --platform linux
For some reason, the build fails when it tries to link the libraries. Even though the library path is specified, it shows the following error:
cannot find -lglfw
Why does this happen when compiling with cibuildwheel?
Because static binaries are different on every system, I need to compile my libraries on the corresponding platform. In the end, I used the CIBW_BEFORE_ALL variable to execute the build commands for my libraries.
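As an illustration, this is roughly what that can look like; the directory name and the build commands are hypothetical and depend on how the GLFW sources are vendored into the project:

```shell
# Build and install the static GLFW library inside each cibuildwheel
# container before any wheel is compiled (commands are illustrative).
CIBW_BEFORE_ALL="cmake -S glfw -B build && cmake --build build && cmake --install build" \
    cibuildwheel --platform linux
```

The same setting can also live in pyproject.toml under [tool.cibuildwheel] as before-all.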
This question already has answers here:
How to read a (static) file from inside a Python package?
(6 answers)
Closed 2 years ago.
I am writing a python module and need to install a directory along with the module. This is what my file tree looks like:
├── module
│   ├── temp
│   │   ├── __init__.py
│   │   ├── file2.yaml
│   │   └── file.yaml
│   ├── module.py
│   └── __init__.py
└── setup.py
As you can see, module.py is the main module with all my functions, but it needs to access what is in the temp directory. When I use setuptools to install my module locally using pip, it installs module.py perfectly, but it won't install the temp directory.
Here is my setup.py:
setup(
    name='module',
    packages=find_packages(),
    version=VERSION,
    description=DESCRIPTION,
    long_description=LONG_DESCRIPTION,
    license=LICENSE,
    author=AUTHORS,
    install_requires=[],
    include_package_data=False,
)
My theory is that I need to pass something to find_packages().
Side note:
I am running setup.py with python setup.py bdist_wheel. To clarify: everything else works fine; the directory is just not being installed with the package.
When I go to where the package is stored, __init__.py and module.py are the only things that are installed in my module package directory.
How can I make sure that the temp directory is installed as well?
Any help is appreciated.
EDIT:
My first file tree was wrong. The temp directory is inside the directory with the module.
When Python installs a package, the module directory is copied to a location like /usr/local/lib/python3.6/dist-packages/module. If you add the extra temp directory at the top level, it may conflict with other modules.
Consider moving the temp directory into the module directory, like this:
├── module
│   ├── temp
│   │   ├── __init__.py
│   │   ├── file2.yaml
│   │   └── file.yaml
│   ├── module.py
│   └── __init__.py
└── setup.py
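Even with temp inside the package, the wheel won't include the YAML files unless setuptools is told about them; a minimal sketch (assuming the layout above) is to declare them via package_data:

```python
from setuptools import setup, find_packages

setup(
    name='module',
    packages=find_packages(),
    # ship the data files that live in module/temp/ with the package;
    # the glob is relative to the package directory
    package_data={'module.temp': ['*.yaml']},
)
```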
I have the following folder structure:
PROJECT_DIR
|-- helpers
|   |-- utils.py
|-- stuff
|   |-- script.py
I need to run script.py as a script, and from it, I need to use a function from helpers/utils.py.
I tried a relative import, from ..helpers.utils import func, but it says
ImportError: attempted relative import with no known parent package
so I added an empty __init__.py file to each folder, including PROJECT_DIR.
Then I read that when a file is run as a script, Python treats it as the main module, so it doesn't see any packages above it and relative imports cannot be used.
But what should I do if I need to use that function? It's a fairly simple use case; I can't get my head around why it's so hard to import a function from a file outside the current directory. Though I'm not really interested in the whys, I'd just like to know how people solve this.
root_project
└── proj
├── __init__.py
├── helpers
│ ├── __init__.py
│ └── utils.py
└── stuff
├── __init__.py
└── script.py
With this structure, just cd to root_project and use this command:
python -m proj.stuff.script
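To double-check the mechanism, this sketch recreates that layout in a scratch directory and runs the script with python -m; the names come from the answer, while the body of func() is a stand-in:

```python
import os
import subprocess
import sys
import tempfile

# Recreate the answer's layout; func()'s body is a stand-in.
root = tempfile.mkdtemp()
files = {
    "proj/__init__.py": "",
    "proj/helpers/__init__.py": "",
    "proj/helpers/utils.py": "def func():\n    return 'hello from utils'\n",
    "proj/stuff/__init__.py": "",
    "proj/stuff/script.py": "from proj.helpers.utils import func\nprint(func())\n",
}
for path, body in files.items():
    full = os.path.join(root, path)
    os.makedirs(os.path.dirname(full), exist_ok=True)
    with open(full, "w") as f:
        f.write(body)

# `python -m` puts the current working directory on sys.path, so the
# absolute import in script.py resolves.
out = subprocess.run(
    [sys.executable, "-m", "proj.stuff.script"],
    cwd=root, capture_output=True, text=True,
)
print(out.stdout.strip())  # hello from utils
```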
In the last few days, I have been working on a Python module. I have used poetry as a package-management tool in many other projects, but this is my first time publishing a package to PyPI.
I was able to run the poetry build and poetry publish commands, and I was also able to install the published package:
$ pip3 install git-profiles
Collecting git-profiles
Using cached https://files.pythonhosted.org/packages/0e/e7/bac9027effd1e34a5b5718f2b35c0b28b3d67f3809e2f2981b6c7b58963e/git_profiles-1.1.0-py3-none-any.whl
Installing collected packages: git-profiles
Successfully installed git-profiles-1.1.0
However, right after the install, I am not able to run my package:
$ git-profiles --help
git-profiles: command not found
My project has the following structure:
git-profiles/
├── src/
│ ├── commands/
│ ├── executor/
│ ├── git_manager/
│ ├── profile/
│ ├── utils/
│ ├── __init__.py
│ └── git_profiles.py
└── tests
I tried different scripts configurations in the pyproject.toml file, but I've never been able to make them work after install:
[tool.poetry.scripts]
poetry = "src:git_profiles.py"
or
[tool.poetry.scripts]
git-profile = "src:git_profiles.py"
I don't know whether this is a Python/pip path/version problem or whether I need to change something in the configuration file.
If it is helpful, this is the GitHub repository I'm talking about. The package is also published on PyPI.
Poetry's scripts section wraps around the console-script entry points of setuptools. As such, the entry-point name and the call path you give it need to follow exactly the same rules.
In short, a console script does more or less this from the shell:
import my_lib # the module isn't called src, that's just a folder name
# the right name to import is whatever you put at [tool.poetry].name
my_lib.my_module.function()
Which, if given the name my-lib-call (the name can be the same as your module, but it doesn't need to be) would be written like this:
[tool.poetry.scripts]
my-lib-call = "my_lib.my_module:function"
Adapted to your project structure, the following should do the job:
[tool.poetry.scripts]
git-profile = "git_profiles:main"
For a directory structure like the following, I haven't been able to make xy an importable package.
xy
├── __init__.py
├── z
│ ├── __init__.py
│ └── stuff.py
└── setup.py
If the setup.py were a directory up, I could use
from setuptools import setup

setup(
    name='xy',
    packages=['xy'],
)
but short of that, no combination of package_dir and packages has let me import xy, only import z. Unfortunately, moving the setup.py a directory up isn't really an option due to an excessive number of hard-coded paths.
See the following answer for ideas how to use package_dir and packages to help with such projects: https://stackoverflow.com/a/58429242/11138259
In short for this case here:
#!/usr/bin/env python3
import setuptools

setuptools.setup(
    # ...
    packages=['xy', 'xy.z'],
    # packages=setuptools.find_packages('..')  # only if the parent directory is otherwise empty
    package_dir={
        'xy': '.',
    },
)