Python package with local dependencies

I have a local Python project (A) that depends on another local Python project (B).
I have added project B as a git submodule in a dependencies folder.
When I pull the repo with all its submodules, I would like to have a setup.py in A such that when I run pip install -e . in A, it also installs B via B's setup.py.
The repo looks like this:
A
├── A
│   └── foo.py
├── dependencies
│   └── B
│       ├── B
│       │   └── foo.py
│       └── setup.py
└── setup.py
I tried:
from setuptools import setup, find_packages

setup(
    name='A',
    packages=find_packages(),
)
But that only finds package A, which makes sense.
I also tried adding a pre-install command:
from subprocess import check_call

from setuptools import setup
from setuptools.command.install import install

class PreInstallCommand(install):
    """Pre-installation for installation mode."""
    def run(self):
        check_call("pip install -e dependencies/B".split())
        install.run(self)

setup(
    ...
    cmdclass={
        'install': PreInstallCommand,
    },
    ...
)
But that one is not triggered when I install it in editable mode (pip install -e).
Thanks in advance for your advice! :)
Cheers!
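A workaround sketch (my addition, not from the original post): with legacy setup.py-based editable installs, pip install -e . runs the develop command rather than install, so the same hook can be attached to both:

from subprocess import check_call

from setuptools import setup, find_packages
from setuptools.command.develop import develop
from setuptools.command.install import install

def install_b():
    # Install the vendored submodule first (path taken from the tree above)
    check_call("pip install -e dependencies/B".split())

class PreDevelopCommand(develop):
    """Pre-installation for editable mode (pip install -e)."""
    def run(self):
        install_b()
        develop.run(self)

class PreInstallCommand(install):
    """Pre-installation for regular installs."""
    def run(self):
        install_b()
        install.run(self)

setup(
    name='A',
    packages=find_packages(),
    cmdclass={
        'develop': PreDevelopCommand,
        'install': PreInstallCommand,
    },
)

Alternatively, a requirements.txt containing the two lines -e ./dependencies/B and -e . keeps the hook out of setup.py entirely.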

Related

import module after pip install wheel

I have a custom-built module, let's call it abc, and I pip install /local_path/abc-0.1-py3-none-any.whl. The installation succeeds:
>> pip install dist/abc-0.1-py3-none-any.whl
Processing ./dist/abc-0.1-py3-none-any.whl
Successfully installed abc-0.1
but I could not import the module.
After I ran pip freeze, I found that the name of the module in the list is abc # file:///local_path/abc-0.1-py3-none-any.whl.
My question is: how can I import the module? Thank you.
.
├── requirements.txt
├── setup.py
├── src
│   ├── bin
│   │   ├── __init__.py
│   │   ├── xyz1.py
│   │   ├── xyz2.py
│   │   └── xyz3.py
Here is my setup.py:
from setuptools import setup, find_packages

with open("requirements.txt") as f:
    install_requires = f.read()

setup(
    name="abc",
    version="0.1",
    author="galaxyan",
    author_email="galaxyan#123.com",
    description="test wheel framework",
    packages=find_packages(include=["src"]),
    zip_safe=False,
    install_requires=install_requires,
)
############ update ############
It still does not work even after changing setup.py:
from setuptools import setup, find_packages

with open("requirements.txt") as f:
    install_requires = f.read()

setup(
    name="abc",
    version="0.1",
    author="galaxyan",
    author_email="galaxyan#123.com",
    description="test wheel framework",
    packages=find_packages(where="src"),
    package_dir={"": "src"},
    zip_safe=False,
    install_requires=install_requires,
)
The setup.py is wrong: find_packages(include=["src"]) matches nothing here, so you're building a wheel with no packages actually inside.
Instead of:
setup(
    ...
    packages=find_packages(include=["src"]),
    ...
)
Try this:
setup(
    ...
    packages=find_packages(where="src"),
    package_dir={"": "src"},
    ...
)
See Testing & Packaging for more info.
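One consequence worth noting (my addition, based on the tree above): with package_dir={"": "src"}, the importable top-level package is the directory under src, i.e. bin, not the distribution name abc. A quick check after rebuilding and reinstalling the wheel:

import bin             # the top-level package found under src/
from bin import xyz1   # one of the modules listed in the tree
print(bin.__file__)    # should point into site-packages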

Creating python package without releasing it to pypi

Hi guys, I have a problem creating a Python package out of my project. I read some tutorials, but every one leads to uploading files to PyPI, which I don't want to do. I just want to pip install it locally on my machine using the tar.gz file.
Here's the structure of my project folder:
root
├── src
│   ├── __init__.py
│   ├── config.py
│   ├── sth_a
│   │   ├── __init__.py
│   │   └── a.py
│   └── sth_b
│       ├── __init__.py
│       └── b.py
└── setup.py
Here is what my setup.py file looks like:
from setuptools import setup, find_packages

setup(
    name="mypkg",
    version='0.0.1',
    author="whatever",
    packages=find_packages(),
)
First I run the command:
python setup.py sdist bdist_wheel
Then it creates a dist directory with a tar.gz file and a wheel file, so I just run:
pip3 install dist/mypkg-0.0.1.tar.gz
After this the first problem emerges. To import these files somewhere else I need to write:
from src.sth_a.a import *
but I want to do it like this:
from mypkg.src.sth_a.a import *
or, if I just want to 'publish' for example the functions from file a.py:
from mypkg.a import *
I was also having another issue, and the answer to pip install . creates only the dist-info not the package helped a bit, but it is still not what I want.
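One way to get imports of the form from mypkg.sth_a.a import * is to map the src directory onto the package name with package_dir. A sketch (my assumption, not from the original thread), keeping the layout above unchanged:

from setuptools import setup

setup(
    name="mypkg",
    version="0.0.1",
    author="whatever",
    # Map the import name "mypkg" onto the "src" directory, so that
    # src/sth_a installs as mypkg.sth_a, and so on.
    package_dir={"mypkg": "src"},
    packages=["mypkg", "mypkg.sth_a", "mypkg.sth_b"],
)

After rebuilding and reinstalling, from mypkg.sth_a.a import * works; re-exporting names in src/__init__.py (e.g. from .sth_a.a import *) would additionally make them available directly under mypkg.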

packaging a second-level sub-directory

I have this project architecture and I want to make my_package pip-installable. The project doesn't only contain stuff to package but also simple scripts (the quick-and-dirty kind) and other things that are important in my project but not for the package (external data, for example).
my_project
├── code
│   ├── data                 # <-- I don't want to package this
│   │   └── make_dataset.py
│   ├── script               # <-- I don't want to package this
│   │   └── make_experiment.py
│   └── my_package           # <-- This is the module I want to package
│       ├── core.py
│       ├── utils.py
│       └── __init__.py
├── data
│   └── some_data.txt
├── references
│   └── reference_paper.pdf
├── reports
│   └── report.tex
├── LICENSE
├── README.md
├── requirements.txt
└── setup.py
I would like the setup.py file to be in the top-level directory so that people can do the usual
git clone gitinstance.com/my_project
cd my_project
pip install .
and get the my_package module installed in their environment, so that python -c "import my_package; print(my_package.__version__)" already works.
The question is: How can I make my_package pip-installable without putting setup.py inside the code directory?
Usually, the setup.py would look like this:
from setuptools import find_packages, setup

setup(
    name='my_package',
    packages=find_packages(),
    version='0.1.0',
    description='Research project',
    author='Name',
    license='MIT',
)
But it wouldn't work here because setup.py can't find my_package.
I found an example in the documentation of setuptools that more or less fits my use case.
The solution is in the packages and package_dir arguments of the setup function, which allow you to specify where to find the packages to install. This is usually hidden because they default to the current working directory.
In my simple case, the setup.py transforms to:
from setuptools import find_packages, setup

setup(
    name='my_package',
    packages=find_packages(where="code"),
    package_dir={'': "code"},
    version='0.1.0',
    description='Research project',
    author='Name',
    license='MIT',
)
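One defensive note (my addition, not part of the original answer): find_packages only picks up directories containing __init__.py, so code/data and code/script are skipped as shown. If they ever gain __init__.py files, they can still be ruled out with the exclude parameter:

from setuptools import find_packages

# Sketch: explicitly exclude the non-package folders under "code"
packages = find_packages(where="code", exclude=["data", "data.*", "script", "script.*"])
print(packages)  # expect: ['my_package']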

Python can't find self-written module

I'm writing a Python module and want to install it. The structure is as follows:
look-up
├── look-up
│   ├── utilities
│   │   ├── __init__.py
│   │   └── validator.py
│   └── __init__.py
├── README.md
├── requirements.txt
├── setup.py
└── MANIFEST.in
setup.py looks like this:
from setuptools import setup, find_packages
from codecs import open
from os import path

here = path.abspath(path.dirname(__file__))
with open(path.join(here, 'README.md'), encoding='utf-8') as f:
    long_description = f.read()

setup(
    name = 'look_up',
    version = '0.0.1',
    packages = find_packages(),
)
After I run:
python setup.py develop
and try to import it, I get the error:
No module named 'look_up'
Can someone tell me what I'm doing wrong?
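A likely cause (my reading, not a confirmed answer from the thread): the inner directory is named look-up, and a hyphen is not valid in a Python identifier, so nothing importable as look_up is ever installed. Renaming the inner directory to look_up and leaving setup.py unchanged should fix it:

# After renaming the inner "look-up" directory to "look_up":
from setuptools import find_packages

print(find_packages())  # expect: ['look_up', 'look_up.utilities']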

How to package a cython module?

I have some Cython-wrapped C++ code that I want to package up. The package directory is structured like so:
.
├── PackageA
│   ├── Mypackage.pyx
│   ├── MyPackageC.cpp
│   ├── HeaderFile.h
│   ├── __init__.py
│   └── setup.py
├── requirements.txt
└── setup.py
I have previously been building a shared object file by running python setup.py build_ext --inplace with the setup.py inside the PackageA directory and importing the shared object file, but I am unsure how to deal with this inside a package structure. How can I do this?
python setup.py install should do the right thing. You can check it by running import PackageA from a separate Python session outside of the project folder.
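For that to work, the top-level setup.py has to declare the extension itself. A sketch (assuming the file names from the tree above and that Cython is available at build time):

from setuptools import setup, find_packages, Extension
from Cython.Build import cythonize

# Compile the Cython wrapper together with the C++ source it wraps
ext = Extension(
    "PackageA.Mypackage",  # import path of the compiled module
    sources=["PackageA/Mypackage.pyx", "PackageA/MyPackageC.cpp"],
    language="c++",
)

setup(
    name="PackageA",
    packages=find_packages(),
    ext_modules=cythonize([ext]),
)

After python setup.py install, the extension is importable as PackageA.Mypackage.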
