I have this project architecture, and I want to make my_package pip-installable. The project doesn't only contain stuff to package; it also holds simple scripts (the quick-and-dirty kind) and other things that are important to my project but not to the package (external data, for example).
my_project
├── code
│   ├── data            # <-- I don't want to package this
│   │   └── make_dataset.py
│   ├── script          # <-- I don't want to package this
│   │   └── make_experiment.py
│   └── my_package      # <-- This is the module I want to package
│       ├── core.py
│       ├── utils.py
│       └── __init__.py
├── data
│   └── some_data.txt
├── references
│   └── reference_paper.pdf
├── reports
│   └── report.tex
├── LICENSE
├── README.md
├── requirements.txt
└── setup.py
I would like the setup.py file to be in the top-level directory so that people can do the usual
git clone gitinstance.com/my_project
cd my_project
pip install .
and get the my_package module installed in their environment, so they can already do python -c "import my_package; print(my_package.__version__)" and it works.
The question is: How can I make my_package pip-installable without putting setup.py inside the code directory?
Usually, the setup.py would look like this:
from setuptools import find_packages, setup

setup(
    name='my_package',
    packages=find_packages(),
    version='0.1.0',
    description='Research project',
    author='Name',
    license='MIT',
)
But it wouldn't work here because setup.py can't find my_package.
I found an example in the setuptools documentation that more or less fits my use case.
The solution lies in the packages and package_dir arguments of the setup function, which let you specify where to find the packages to install. This is usually invisible because package_dir defaults to the current working directory.
In my simple case, the setup.py becomes:
from setuptools import find_packages, setup

setup(
    name='my_package',
    packages=find_packages(where="code"),
    package_dir={'': "code"},
    version='0.1.0',
    description='Research project',
    author='Name',
    license='MIT',
)
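With that mapping in place, code/my_package installs as plain my_package. A quick sanity check after pip install . (a minimal sketch; it assumes you define __version__ in my_package/__init__.py, since version= in setup.py does not create that attribute for you):

import my_package

# The module should resolve from site-packages, with no "code" prefix
print(my_package.__file__)
print(my_package.__version__)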
Related
I have a custom-built module, let's call it abc, and I pip install /local_path/abc-0.1-py3-none-any.whl. The installation succeeds,
$ pip install dist/abc-0.1-py3-none-any.whl
Processing ./dist/abc-0.1-py3-none-any.whl
Successfully installed abc-0.1
but I could not import the module.
After that I ran pip freeze and found that the name of the module in the list is abc @ file:///local_path/abc-0.1-py3-none-any.whl.
My question is: how can I import the module? Thank you.
.
├── requirements.txt
├── setup.py
├── src
│   ├── bin
│   │   ├── __init__.py
│   │   ├── xyz1.py
│   │   ├── xyz2.py
│   │   └── xyz3.py
Here is my setup.py:
from setuptools import find_packages, setup

with open("requirements.txt") as f:
    install_requires = f.read()

setup(
    name="abc",
    version="0.1",
    author="galaxyan",
    author_email="galaxyan@123.com",
    description="test wheel framework",
    packages=find_packages(include=["src"]),
    zip_safe=False,
    install_requires=install_requires,
)
############ update ############
It does not work even after changing setup.py:
from setuptools import find_packages, setup

with open("requirements.txt") as f:
    install_requires = f.read()

setup(
    name="abc",
    version="0.1",
    author="galaxyan",
    author_email="galaxyan@123.com",
    description="test wheel framework",
    packages=find_packages(where="src"),
    package_dir={"": "src"},
    zip_safe=False,
    install_requires=install_requires,
)
The setup.py is wrong, which means you're building a wheel with no packages actually inside.
Instead of
setup(
    ...
    packages=find_packages(include=["src"]),
    ...
)
Try this:
setup(
    ...
    packages=find_packages(where="src"),
    package_dir={"": "src"},
    ...
)
See Testing & Packaging for more info.
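To confirm the fix took, you can list what actually ended up inside the wheel (a minimal sketch; the wheel path assumes the layout above). Note that with package_dir={"": "src"}, the installed package is bin, not src.bin or abc, so import bin is the import that should work:

import zipfile

# List the files packaged into the wheel; you should see bin/__init__.py,
# bin/xyz1.py, and so on, rather than only the dist-info metadata.
with zipfile.ZipFile("dist/abc-0.1-py3-none-any.whl") as whl:
    for name in whl.namelist():
        print(name)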
Hi guys, I have a problem with creating a Python package out of my project. I read some tutorials, but every one leads to uploading files to PyPI, which I don't want to do. I just want to pip install it to my machine locally using the tar.gz file.
Here's the structure of my project folder:
root
├── src
│   ├── __init__.py
│   ├── config.py
│   ├── sth_a
│   │   ├── __init__.py
│   │   └── a.py
│   └── sth_b
│       ├── __init__.py
│       └── b.py
└── setup.py
Here is what my setup.py file looks like:
from setuptools import setup, find_packages

setup(
    name="mypkg",
    version='0.0.1',
    author="whatever",
    packages=find_packages(),
)
First I run the command:
python setup.py sdist bdist_wheel
Then it creates a dist directory with the tar.gz file and the wheel file, so I just run
pip3 install dist/mypkg-0.0.1.tar.gz
After this, the first problem emerges. To import these files somewhere else I need to write
from src.sth_a.a import *
but I want to do it like this:
from mypkg.src.sth_a.a import *
or even just 'publish', for example, the functions from file a.py:
from mypkg.a import *
Also, I was having other issues, and this answer helped me a bit, but it is still not what I want: pip install . creates only the dist-info, not the package.
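One way to get the mypkg.* import path is the same package_dir trick shown in the answers above, this time mapping the package name itself onto the src directory. A sketch, not from the original thread, assuming the layout stays as posted:

from setuptools import setup

setup(
    name="mypkg",
    version='0.0.1',
    author="whatever",
    # Map the package "mypkg" onto the src directory, so src/sth_a
    # installs as mypkg/sth_a and is importable as mypkg.sth_a.a
    package_dir={"mypkg": "src"},
    packages=["mypkg", "mypkg.sth_a", "mypkg.sth_b"],
)

After rebuilding the tar.gz and reinstalling it, from mypkg.sth_a.a import * should resolve.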
I have some Cython-wrapped C++ code that I want to package up. The package directory is structured like so:
.
├── PackageA
│   ├── Mypackage.pyx
│   ├── MyPackageC.cpp
│   ├── HeaderFile.h
│   ├── __init__.py
│   └── setup.py
├── requirements.txt
└── setup.py
I have previously been building a shared object file by running python setup.py build_ext --inplace with the setup.py inside the PackageA directory and importing the shared object file, but I am unsure how to deal with this inside a package structure. How can I do this?
python setup.py install should do the right thing. You can check it by doing import PackageA from a separate Python session outside of the project folder.
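For that to work, the top-level setup.py has to declare the extension so it gets compiled during installation. A minimal sketch, assuming Cython is available at build time and that Mypackage.pyx wraps the C++ source listed in the tree above:

from setuptools import Extension, find_packages, setup
from Cython.Build import cythonize

extensions = [
    Extension(
        "PackageA.Mypackage",  # importable as PackageA.Mypackage
        sources=["PackageA/Mypackage.pyx", "PackageA/MyPackageC.cpp"],
        language="c++",
    ),
]

setup(
    name="PackageA",
    packages=find_packages(),
    ext_modules=cythonize(extensions),
)

With the extension built at install time, the nested PackageA/setup.py and the --inplace step are no longer needed.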
EDIT: So, after installing Ruamel.yaml from a local path, uninstalling it and reinstalling it worked perfectly. I have no idea why re-installing it changed anything, but hey, it works.
Please close this question.
Original:
I wanted to install the ruamel.yaml module for Python 3.4 on
PythonAnywhere. However, whenever I tried to use pip3.4, I got the following result:
Could not find a version that satisfies the requirement ruamel.yaml (from versions: )
No matching distribution found for ruamel.yaml
Trying to work around that, I downloaded ruamel.yaml-0.11.6.tar.gz
(the file tagged as Source) and installed it with pip3.4 using
the -e flag. Apparently, pip3.4 told me it was a success, and
trying to re-install the package gives me the following message:
pip3.4 install --user ruamel.yaml
Requirement already satisfied (use --upgrade to upgrade): ruamel.yaml
in /home/<username>/dumpfolder_version3/ruamel.yaml-0.11.6
However, when I try to run the library I get the following error...
Traceback (most recent call last):
File "/home/<username>/mailgun/configurar_menu.py", line 3, in <module>
import ruamel.yaml as yaml
ImportError: No module named 'ruamel'
Do you have any idea what could be the problem?
If I try to reinstall another package Python already has, I get this message:
pip3.4 install --user pyyaml
Requirement already satisfied (use --upgrade to upgrade): pyyaml
in /usr/local/lib/python3.4/dist-packages
Could that difference be the problem?
There might be a number of problems, but unfortunately I don't have access to PythonAnywhere, so I cannot test them out. I do, however, have some experience with ruamel.yaml and its installation¹.
The main problem is that you try to install in editable mode, but ruamel is a namespace package, and pip install -e cannot properly handle that. Unfortunately, ruamel.yaml's setup.py currently doesn't catch that (it does if you try to use python setup.py to install), and because of that it doesn't warn you or correct its behaviour.
Your site-packages directory is probably already messed up beyond what pip can restore, but you can try pip uninstall -y ruamel.yaml. After that, check that everything starting with ruamel has been removed from your lib/python3.4/site-packages directory, then reinstall with pip install ruamel.yaml*tar.gz. The latter is also what you need to do if you start from scratch.
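Once it is reinstalled, a quick sanity check from a fresh interpreter (a sketch; it assumes this ruamel.yaml release exposes __version__, and the file path will differ per environment):

import ruamel.yaml

# If the namespace were still broken, the import itself would raise
# ImportError: No module named 'ruamel'
print(ruamel.yaml.__version__)
print(ruamel.yaml.__file__)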
After correct installation on 3.4 you should have the following if you do tree ruamel* in your site-packages directory:
ruamel
└── yaml
    ├── comments.py
    ├── compat.py
    ├── composer.py
    ├── configobjwalker.py
    ├── constructor.py
    ├── cyaml.py
    ├── dumper.py
    ├── emitter.py
    ├── error.py
    ├── events.py
    ├── __init__.py
    ├── loader.py
    ├── main.py
    ├── nodes.py
    ├── parser_.py
    ├── __pycache__
    │   ├── comments.cpython-34.pyc
    │   ├── compat.cpython-34.pyc
    │   ├── composer.cpython-34.pyc
    │   ├── configobjwalker.cpython-34.pyc
    │   ├── constructor.cpython-34.pyc
    │   ├── cyaml.cpython-34.pyc
    │   ├── dumper.cpython-34.pyc
    │   ├── emitter.cpython-34.pyc
    │   ├── error.cpython-34.pyc
    │   ├── events.cpython-34.pyc
    │   ├── __init__.cpython-34.pyc
    │   ├── loader.cpython-34.pyc
    │   ├── main.cpython-34.pyc
    │   ├── nodes.cpython-34.pyc
    │   ├── parser_.cpython-34.pyc
    │   ├── reader.cpython-34.pyc
    │   ├── representer.cpython-34.pyc
    │   ├── resolver.cpython-34.pyc
    │   ├── scalarstring.cpython-34.pyc
    │   ├── scanner.cpython-34.pyc
    │   ├── serializer.cpython-34.pyc
    │   ├── tokens.cpython-34.pyc
    │   └── util.cpython-34.pyc
    ├── reader.py
    ├── representer.py
    ├── resolver.py
    ├── scalarstring.py
    ├── scanner.py
    ├── serializer.py
    ├── tokens.py
    └── util.py
ruamel.yaml-0.11.6.dist-info
├── DESCRIPTION.rst
├── INSTALLER
├── METADATA
├── metadata.json
├── namespace_packages.txt
├── RECORD
├── top_level.txt
└── WHEEL
¹ I am the author
As edited above, this was already solved. According to ruamel.yaml's own author, it may have been because the -e flag was messing everything up.
My folder structure looks like
.
├── my_package
│   ├── A.py
│   └── __init__.py
├── my_package2
│   ├── B.py
│   └── __init__.py
└── setup.py
And my setup.py looks like
from setuptools import setup

if __name__ == '__main__':
    setup(name='my_packages',
          packages=['my_package'])
When I run
python3 setup.py develop
the egg file is created locally and an egg-link file is placed in my site-packages directory. Also, the project folder is added to easy-install.pth, which means that both my_package and my_package2 are importable. This is different from running python3 setup.py install (only my_package would be available, per the packages keyword argument passed to the setup function).
This same behavior occurs when installing with pip using the -e flag.
Is this intended behavior? Is there any way to replicate the functionality of install using develop?
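As the question itself notes, develop (and pip install -e) works by putting the project directory on the path rather than copying the listed packages, so everything at that level becomes importable. One common workaround (a sketch, not from the original thread, assuming the files can be moved) is the src layout used earlier on this page: only the mapped directory lands on the path, so my_package2 stays out of reach.

# Assumed layout for this sketch:
# .
# ├── src
# │   └── my_package
# │       ├── A.py
# │       └── __init__.py
# ├── my_package2        # left outside src, so develop will not expose it
# └── setup.py
from setuptools import setup

if __name__ == '__main__':
    setup(name='my_packages',
          packages=['my_package'],
          package_dir={'': 'src'})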