I have some Cython wrapped C++ code that I want to package up. The package directory is structured like so:
.
├── PackageA
│ ├── Mypackage.pyx
│ ├── MyPackageC.cpp
│ ├── HeaderFile.h
│ ├── __init__.py
│ └── setup.py
├── requirements.txt
└── setup.py
I have previously been building a shared-object file by running python setup.py build_ext --inplace with the setup.py inside the PackageA directory and then importing the resulting shared object, but I am unsure how to deal with this inside a package structure. How can I do this?
python setup.py install should do the right thing. You can check it by running import PackageA in a separate Python session outside of the project folder.
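A minimal sketch of what the top-level setup.py could look like, assuming the compiled extension should be importable as PackageA.Mypackage and that Cython is installed (the names are taken from the tree above; adjust them to your actual modules):

```python
# Top-level setup.py -- a sketch, not a definitive build configuration.
from setuptools import setup, Extension
from Cython.Build import cythonize

extensions = [
    Extension(
        "PackageA.Mypackage",                 # dotted import path of the built module
        sources=["PackageA/Mypackage.pyx",
                 "PackageA/MyPackageC.cpp"],  # Cython wrapper + C++ implementation
        language="c++",                       # compile the extension as C++
    )
]

setup(
    name="PackageA",
    packages=["PackageA"],
    ext_modules=cythonize(extensions),
)
```

With this in place, python setup.py install (or pip install .) compiles the extension and places it inside the installed PackageA package, so the in-place .so workflow is no longer needed.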
I have a local python project (A) that depends on another local python project (B).
I have added the project B as a git submodule to a dependencies folder.
When I pull the repo with all the submodules, I would like to have a setup.py in A such that when I run pip install -e . in A, it also installs B via its setup.py.
The repo looks like this:
A
├── A
│ └── foo.py
│
├── dependencies
│ └── B
│ ├── B
│ │ └── foo.py
│ │
│ └── setup.py
│
└── setup.py
I tried:
from setuptools import setup, find_packages

setup(
    name='A',
    packages=find_packages(),
)
But that only finds package A, which makes sense.
I also tried adding a pre-install command:
from subprocess import check_call

from setuptools import setup
from setuptools.command.install import install

class PreInstallCommand(install):
    """Pre-installation for installation mode."""
    def run(self):
        check_call("pip install -e dependencies/B".split())
        install.run(self)
setup(
    ...
    cmdclass={
        'install': PreInstallCommand,
    },
    ...
)
But that one is not triggered when I install the package in editable mode (pip install -e .).
Thanks in advance for your advice! :)
Cheers!
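pip historically bypasses a custom install cmdclass for editable installs, so one workaround is a small bootstrap script instead of a setup.py hook. This is a sketch under that assumption; the script name install_dev.py and the helper function are illustrative, not a standard tool:

```python
# install_dev.py -- hypothetical bootstrap script, run from A's root instead
# of relying on a setup.py hook (which pip install -e does not trigger).
import subprocess
import sys

def editable_install_cmd(path):
    """Build a 'pip install -e <path>' command bound to the current interpreter,
    so the install lands in the active environment."""
    return [sys.executable, "-m", "pip", "install", "-e", path]

# Usage, from A's root:
#   subprocess.check_call(editable_install_cmd("dependencies/B"))  # submodule B first
#   subprocess.check_call(editable_install_cmd("."))               # then project A itself
```

Installing B before A means that when A's own editable install runs, its import-time dependency on B is already satisfied.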
Hi guys, I have a problem with creating a Python package out of my project. I read some tutorials, but every one leads to uploading files to PyPI, which I don't want to do. I just want to pip install it locally on my machine using a tar.gz file.
Here's a structure of my project folder:
root
├── src
│ ├── __init__.py
│ ├── config.py
│ ├── sth_a
│ │ ├── __init__.py
│ │ └── a.py
│ └── sth_b
│   ├── __init__.py
│   └── b.py
└── setup.py
Here is what my setup.py file looks like:
from setuptools import setup, find_packages

setup(name="mypkg",
      version='0.0.1',
      author="whatever",
      packages=find_packages())
First I run the command:
python setup.py sdist bdist_wheel
This creates a dist directory with the tar.gz file and the wheel file, so I just run:
pip3 install dist/mypkg-0.0.1.tar.gz
After this, the first problem emerges. To import these files somewhere else I need to write
from src.sth_a.a import *
but I want to do it like this:
from mypkg.src.sth_a.a import *
or even, if I just want to 'publish' for example the functions from file a.py:
from mypkg.a import *
I was also having another issue; this answer helped me a bit, but it is still not what I want: pip install . creates only the dist-info, not the package.
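One way to get imports of the form from mypkg.sth_a.a import ..., sketched under the assumption that the intended package name is mypkg: keep the src folder on disk but map it onto the import name with package_dir. This is a sketch, not the only possible layout:

```python
# setup.py sketch: map the existing src/ tree onto the import name "mypkg",
# so "from mypkg.sth_a.a import ..." works after installation.
from setuptools import setup

if __name__ == "__main__":
    setup(name="mypkg",
          version="0.0.1",
          author="whatever",
          # package_dir maps the import name to the on-disk folder;
          # subpackages are then found relative to it.
          package_dir={"mypkg": "src"},
          packages=["mypkg", "mypkg.sth_a", "mypkg.sth_b"])
```

After rebuilding the sdist and reinstalling, src/sth_a/a.py is importable as mypkg.sth_a.a; nothing is importable under the name src anymore.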
We have a private git monorepo which hosts a number of Python packages. Poetry was the dependency-management tool initially chosen for the project. Due to this Poetry issue, a solution that involves creating new setup.py files would not be accepted.
A simplified version of the structure:
git-monorepo
├── pkg-1
│ ├── pkg
│ │ └── mod1.py
│ └── pyproject.toml
├── pkg-2
│ ├── pkg
│ │ └── mod2.py
│ └── pyproject.toml
└── lib
  ├── pkg
  │ └── lib.py
  └── pyproject.toml
The library distribution package lib is independent of any other package. However, pkg-1 depends on lib, and pkg-2 depends on both pkg-1 and lib.
So, the question is:
What is the proper way to use pip to install a package from this monorepo?
Let us consider as an example that we try to install pkg-1, where pkg-1/pyproject.toml includes the following lines:
...
[tool.poetry.dependencies]
lib = {path = "../lib/"}
...
The result from running pip, as explained in the VCS support documentation:
$ pip install -e git+https://gitlab.com/my-account/git-monorepo#"egg=pkg-1&subdirectory=pkg-1"
Traceback (most recent call last):
File "/home/hblanco/.local/lib/python3.8/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3101, in __init__
super(Requirement, self).__init__(requirement_string)
File "/home/hblanco/.local/lib/python3.8/site-packages/pip/_vendor/packaging/requirements.py", line 115, in __init__
raise InvalidRequirement("Invalid URL: {0}".format(req.url))
pip._vendor.packaging.requirements.InvalidRequirement: Invalid URL: ../lib
The problem with the above setup is that lib is specified as a path dependency, so at install time pip tries to resolve the relative path ../lib and rejects it as an invalid requirement.
I ran into the same issue with a Python monorepo where I wanted to share the packages with other projects as well.
I found two approaches that work for me:
1. In the CI/CD build pipeline, edit the pyproject.toml just before creating the wheel (which is published to a PyPI repo).
2. First create the wheel (or .tar.gz) artifact, and then modify it afterwards (by extracting it, replacing the path dependencies, and zipping it again).
The full approach is explained here.
However, it won't work with git+https://... URLs.
You'll need a (private) PyPI repo somewhere. GitLab provides one for each project, which I utilize in the demo project here.
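A sketch of approach 1, assuming a CI step is allowed to rewrite pyproject.toml just before the wheel is built and published: the path dependency is replaced by a version constraint that the private PyPI repo can resolve. The helper name and the regex are illustrative, not part of Poetry:

```python
import re

def replace_path_dependency(pyproject_text, name, version):
    """Rewrite a Poetry path dependency line such as
         lib = {path = "../lib/"}
    into an ordinary version constraint such as
         lib = "^0.1.0"
    """
    pattern = re.compile(
        r'^%s\s*=\s*\{\s*path\s*=\s*"[^"]*"[^}]*\}' % re.escape(name),
        re.MULTILINE)
    return pattern.sub('%s = "%s"' % (name, version), pyproject_text)

print(replace_path_dependency('lib = {path = "../lib/"}', "lib", "^0.1.0"))
```

A CI job would read pkg-1/pyproject.toml, apply this rewrite for each sibling package, write the file back, and only then run the build, so the published wheel never contains a path dependency.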
I have this project architecture and I want to make my_package pip-installable. The project doesn't only contain stuff to package, but also simple scripts (the quick-and-dirty kind) and other things that are important in my project but not for the package (external data, for example).
my_project
├── code
│ ├── data #<-- I don't want to package this
│ │ └── make_dataset.py
│ ├── script #<-- I don't want to package this
│ │ └── make_experiment.py
│ └── my_package #<-- This is the module I want to package
│ ├── core.py
│ ├── utils.py
│ └── __init__.py
├── data
│ └── some_data.txt
├── references
│ └── reference_paper.pdf
├── reports
│ └── report.tex
├── LICENSE
├── README.md
├── requirements.txt
└── setup.py
I would like the setup.py file to be in the top-level directory so that people can do the usual
git clone gitinstance.com/my_project
cd my_project
pip install .
and get the my_package module installed in their environment, so they can already run python -c "import my_package; print(my_package.__version__)" and it works.
The question is: How can I make my_package pip-installable without putting setup.py inside the code directory?
Usually, the setup.py would look like this:
from setuptools import find_packages, setup

setup(
    name='my_package',
    packages=find_packages(),
    version='0.1.0',
    description='Research project',
    author='Name',
    license='MIT',
)
But it wouldn't work here because setup.py can't find my_package.
I found an example in the setuptools documentation that more or less fits my use case.
The solution lies in the packages and package_dir arguments of the setup function, which let you specify where to find the packages to install. This is usually hidden because package_dir defaults to the current working directory.
In my simple case, the setup.py transforms to:
from setuptools import find_packages, setup

setup(
    name='my_package',
    packages=find_packages(where="code"),
    package_dir={'': "code"},
    version='0.1.0',
    description='Research project',
    author='Name',
    license='MIT',
)
My folder structure looks like
.
├── my_package
│ ├── A.py
│ └── __init__.py
├── my_package2
│ ├── B.py
│ └── __init__.py
└── setup.py
And my setup.py looks like
from setuptools import setup

if __name__ == '__main__':
    setup(name='my_packages',
          packages=['my_package'])
When I run
python3 setup.py develop
The egg file is created locally and an egg-link file is placed in my site-packages directory. The project folder is also added to easy-install.pth, which means that both my_package and my_package2 are importable. This differs from running python3 setup.py install, where only my_package would be available, per the packages argument passed to setup().
This same behavior occurs when installing with pip using the -e flag.
Is this intended behavior? Is there any way to replicate the functionality of install using develop?
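The behavior follows from the editable .pth entry pointing at the project root, so every top-level package there becomes importable. A common workaround, sketched here under the assumption that moving the code is acceptable, is the "src layout", which makes develop expose only the packages you map:

```python
# setup.py sketch for a "src layout". With the tree rearranged as
#   .
#   ├── src
#   │   └── my_package
#   │       ├── A.py
#   │       └── __init__.py
#   ├── my_package2
#   │   └── ...
#   └── setup.py
# the editable install puts src/ (not the project root) on sys.path,
# so my_package2 is no longer importable by accident.
from setuptools import setup

if __name__ == '__main__':
    setup(name='my_packages',
          packages=['my_package'],
          package_dir={'': 'src'})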