Ok, so I have been looking around and I have seen a couple of different options, although I am new to Python so I am a bit confused. Here is what I am looking for:
I have a project of multiple .py files. I have my files and a lib directory for the libraries I have created. My question is, how do I set up the project to have a version? I see a lot of articles saying that I put the version in setup.py. Where does setup.py go? What else needs to be in that setup.py file? Do I just import setup.py into my Python files? How can I check to see if it worked? How can I check the version of each .py file to make sure it imported correctly?
Read the Hitchhiker's Guide to Packaging to learn good practices for developing a Python project.
The setup.py file is at the heart of a Python project. It describes all of the metadata about your project. There are quite a few fields you can add to give a project a rich set of descriptive metadata. However, only three fields are required: name, version, and packages. The name field must be unique if you wish to publish your package on the Python Package Index (PyPI). The version field keeps track of different releases of the project. The packages field describes where you've put the Python source code within your project.
Our initial setup.py will also include information about the license and will re-use the README.txt file for the long_description field. This will look like:
from distutils.core import setup

setup(
    name='TowelStuff',
    version='0.1dev',
    packages=['towelstuff'],
    license='Creative Commons Attribution-Noncommercial-Share Alike license',
    long_description=open('README.txt').read(),
)
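Once this file is in place, you can sanity-check it from the project root; for example, these two standard distutils commands print the version field back and build a source tarball under dist/:
$ python setup.py --version
0.1dev
$ python setup.py sdist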
Typically a project will include its version as an __version__ attribute within its top-level namespace.
For example:
>>> import myproject
>>> myproject.__version__
'3.2.0'
See http://www.python.org/dev/peps/pep-0396/ for more info and ways to access __version__ from within your setup.py file.
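For instance, one common pattern for single-sourcing the version (a sketch, assuming the package lives in myproject/__init__.py) is to have setup.py read the __version__ attribute out with a regular expression instead of importing the package:
# inside setup.py -- sketch of reading __version__ without importing the package
import re

with open('myproject/__init__.py') as f:
    version = re.search(r"__version__\s*=\s*['\"]([^'\"]+)['\"]",
                        f.read()).group(1)

# ...and then pass version=version to the setup(...) call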
I'm writing a Python project which is published as a package to a PyPI-like repository (using setuptools and twine). I use type hints in my code.
The issue is, when importing the package from a different project and running mypy, I get the following error:
error: Skipping analyzing 'XXX': found module but no type hints or library stubs
As I understand it, I got this error because my package was not compliant with PEP 561 (https://www.python.org/dev/peps/pep-0561/).
After some searching online, I didn't find a non-manual way to add the required files to the package.
I resorted to writing my own code to:
Run stubgen to create stub files.
Create py.typed files in every directory.
Collect all the created files in a dict passed to the package_data field in the setup.py file.
This code solved the issue and mypy runs without errors. But this feels very wrong to me. Is there a standard tool for making a package PEP-561 compliant? Am I missing something else?
As mentioned before, you need to add a py.typed file in the package folder of the module.
You also need to add that file to package_data in setup.py; otherwise the file will not be part of the package when you deploy it.
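A minimal sketch of that setup.py, assuming a package named mypackage with inline annotations and no stub files:
from setuptools import setup, find_packages

setup(
    name='mypackage',
    version='1.0.0',
    packages=find_packages(),
    # ship the PEP 561 marker file alongside the code
    package_data={'mypackage': ['py.typed']},
    # mypy cannot read py.typed from inside a zipped egg
    zip_safe=False,
)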
I personally put the type annotations in the code and don't create extra stub files, but that is only possible from Python 3.5 upwards. If you want to write Python 2.7 compatible code, you cannot use inline type annotations; in that case you can use stub files.
If you want to type annotate a third-party library, you can write a *.pyi file for the functions you use from that library. That can be a bit tricky, because mypy must find that *.pyi file exactly once on its search path (MYPYPATH).
So I handle it this way:
for local testing, the MYPYPATH is set to a directory where I collect all the third-party stubs;
for testing on Travis, I have a subdirectory in the package with the stubs needed for that module, and set the MYPYPATH accordingly.
The solution was to add one py.typed file to the root of the main package. This forces mypy to analyze the types.
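In other words, the (empty) marker file sits next to the package's __init__.py; the module names here are illustrative:
mypackage/
    __init__.py
    core.py
    py.typed      # empty marker file, declared in package_data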
I have two Python projects; one includes useful packages for file manipulation and such. They are useful because they can be reused in any other kind of project. The second is one of these other projects, which needs to use my useful packages. Here is my projects' file structure:
Python-projects/
    Useful/
        package_parsing/
        package_file_manipulation/
        package_blabla/
    Super-Application/
        pkg1/
            __init__.py
            module1.py
            module2.py
        pkg2/
        setup.py
        MANIFEST.in
        README.rst
First off, I would like to use the Useful/package_parsing package in my module Super-Application/pkg1/module1.py. Is there a more convenient way to do it other than copying package_parsing into the Super-Application project?
Depending on the answer to the first question (that is, if there is a way to link a module from a different project), how could I include such an external module in a release package of my Super-Application project? I am not sure that making use of install_requires in the setup.py will do.
My main idea here is not to duplicate the Useful/package_parsing package in all of my other development projects, especially when I would like to do modifications to this useful package. I wouldn't like to update all the outdated copies in each project.
=============
EDIT 1
It appears the first part of my question can be dealt with by appending the path:
import sys
sys.path.insert(0, 'path/to/Useful/package_parsing')
Moreover I can simply check the available paths using:
for p in sys.path:
    print(p)
Now for the second part, how could I include such external module in a release package, possibly using the setup.py installation file?
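(For reference, a hedged sketch of the install_requires route mentioned above: give Useful its own setup.py so it can be installed once, e.g. with pip, and then declare it as a dependency instead of copying it. The names come from the tree above; the version numbers are assumptions.)
# Useful/setup.py -- package the shared code once
from setuptools import setup

setup(
    name='Useful',
    version='0.1',
    packages=['package_parsing', 'package_file_manipulation', 'package_blabla'],
)

# Super-Application/setup.py -- depend on it instead of copying it
from setuptools import setup

setup(
    name='Super-Application',
    version='0.1',
    packages=['pkg1', 'pkg2'],
    install_requires=['Useful>=0.1'],
)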
A project written in Python with some C extensions (not using SWIG etc.). I am trying to figure out how to structure my project so that:
Imports work, including importing the shared objects.
I don't need to change PYTHONPATH (tried figuring it out and failed).
In the future, distributing the project package will be easiest.
Current structure is, as suggested here:
Project\
    docs\       # mainly documentation, right?
    bin\        # empty
    setup.py    # setup for the project, as suggested in the above link
    project\
        __init__.py
        module.py
    tests\
        bla_test.py
    C\          # the package of C files
        file.c
        file.so
        other_c_stuff.c
        header.h
        setup.py    # setup to compile the C files and create .so files
    build\      # contains a bunch of (hopefully) irrelevant stuff
It worked from PyDev but not from a shell. An ideal answer would address the following:
A suggested structure for the project.
How an import would be performed (by, say, the modules in tests).
Should (can) I keep all the C files in a separate library?
Which of the setup.py files should build the C files (should I post them here?).
Is it possible to automatically build when necessary? How?
I tried relative imports - they don't work for me for some reason.
I saw the accepted answer to this question. He says: do whatever. But I can't get the imports to work. I read this answer, but have no clue what all the stuff he has (and I don't have) is. The accepted answer doesn't help me because, again, the imports fail. This blog post gives good advice but, again, the imports!
I don't want to go into detail for a general answer, since you linked to good ones already.
Some structure, which should work for you, could look like:
Project\
    build\      # directory used by setup.py
    docs\       # mainly documentation, right?
    setup.py    # setup for the project, as suggested in the above link
    project\
        __init__.py
        module.py
        c_package\
            __init__.py
            file.c
            file.so
            other_c_stuff.c
            header.h
        tests\
            __init__.py
            test_bla.py
So within the project package and its subpackages you can use relative imports, if you build the C extensions in place:
python setup.py build_ext --inplace
or create a setup.cfg containing
[build_ext]
inplace=True
but only use this for development and don't release it, since installation will fail.
Automating the build is possible, but I don't know of any way other than calling setup.py directly whenever the C sources have changed.
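For completeness, a hedged sketch of what the single top-level setup.py could contain; the Extension declaration is an assumption, since the thread never shows one:
from setuptools import setup, Extension

setup(
    name='project',
    version='0.1',
    packages=['project', 'project.c_package', 'project.tests'],
    ext_modules=[
        # with build_ext --inplace this drops file.so into project/c_package/
        Extension('project.c_package.file',
                  sources=['project/c_package/file.c',
                           'project/c_package/other_c_stuff.c']),
    ],
)
After python setup.py build_ext --inplace, project/module.py can then import the extension relatively with from .c_package import file.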
I'm currently experimenting with Python packages. I have a tiny project which I would like to share with some people. This project consists of exactly one Python file, so I thought it should not be too difficult to create a Python package for it.
I've managed to register the project with the following setup.py at PyPI:
from setuptools import setup

setup(
    name='LumixMaptool',
    version='1.0.4',
    author='Martin Thoma',
    author_email='info@martin-thoma.de',
    packages=['lumix-maptool'],
    scripts=['lumix-maptool/lumix-maptool.py'],
    url='http://pypi.python.org/pypi/LumixMaptool/',
    license='LICENSE',
    description='Manage GPS information for Panasonic Lumix cameras.',
    long_description="""Panasonic offers GPS metadata to add to an SD card. This metadata can contain
tourist information that might be useful for sightseeing. This maptool helps
to copy the data from the Lumix DVD to the SD card that is inserted into your
computer (the camera does not have to be connected).""",
    install_requires=[
        "argparse >= 1.2.1",
        "pyparsing >= 2.0.1",
    ],
    entry_points={
        'console_scripts':
            ['lumixmaptool = lumixmaptool:main']
    },
)
with the command
python setup.py register
and later updated with
python setup.py sdist upload
Now it's here: https://pypi.python.org/pypi/LumixMaptool
But I currently have problems with the following entries:
packages
scripts
entry_points
What do I have to fill in there? Do I have to have a certain project structure / some files?
I currently have:
Readme.txt
LICENSE.txt
setup.py
lumix-maptool.py
The project's GitHub site is here: https://github.com/MartinThoma/lumix_map_tool
Every package on PyPI needs to have a file called setup.py at the root of the directory. If you're using a Markdown-formatted README file, you'll also need a setup.cfg file. Also, you'll want a LICENSE.txt file describing what can be done with your code. So if I've been working on a library called mypackage, my directory structure would look like this:
root-dir/   # arbitrary working directory name
    setup.py
    setup.cfg
    LICENSE.txt
    README.md
    mypackage/
        __init__.py
        foo.py
        bar.py
        baz.py
Refer to this link to learn more about packaging.
Entry Point
EntryPoints provide a persistent, filesystem-based object name registration and name-based direct object import mechanism (implemented by the setuptools package).
They associate names of Python objects with free-form identifiers. So any other code using the same Python installation and knowing the identifier can access an object with the associated name, no matter where the object is defined. The associated names can be any names existing in a Python module; for example name of a class, function or variable. The entry point mechanism does not care what the name refers to, as long as it is importable.
An "entry point" is typically a function (or other callable function-like object) that a developer or user of your Python package might want to use, though a non-callable object can be supplied as an entry point as well (as correctly pointed out in the comments!).
The most popular kind of entry point is the "console_script" entry point, which points to a function that you want made available as a command-line tool to whoever installs your package.
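For example, a minimal sketch for a one-file project like the one in the question, assuming the file is renamed to lumixmaptool.py (hyphens are not valid in importable module names) and defines a main() function:
from setuptools import setup

setup(
    name='LumixMaptool',
    version='1.0.4',
    # a single-file project ships as a module rather than a package
    py_modules=['lumixmaptool'],
    entry_points={
        'console_scripts': [
            # installs a `lumixmaptool` command that calls main() in lumixmaptool.py
            'lumixmaptool = lumixmaptool:main',
        ],
    },
)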
Packages
The packages argument is used to include all the Python packages present in your root-dir. You can use find_packages().
For simple projects, it's usually easy enough to manually add packages to the packages argument of setup(). However, for very large projects (Twisted, PEAK, Zope, Chandler, etc.), it can be a big burden to keep the package list updated. That's what setuptools.find_packages() is for. Refer docs.
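A short sketch of the automatic variant (the package and directory names are illustrative):
from setuptools import setup, find_packages

setup(
    name='mypackage',
    version='0.1.0',
    # discovers mypackage/ and all of its subpackages automatically
    packages=find_packages(exclude=['tests', 'tests.*']),
)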
Scripts
Refer to these docs for scripts.
Following a (hopefully) common practice, I have a Python package that includes several modules and an executable script in a separate scripts directory, as can be seen here.
The documentation for the script, apart from the auto-generated help given by optparse, is together with the package documentation in a Sphinx subdirectory. I am trying to:
generate the man page for the script from the existing documentation
include the man page in the distribution
I can easily do #1 with Sphinx, the man_pages setting and sphinx-build -b man. So I can call python setup.py build_sphinx -b man and have the man page generated in the build/sphinx/man directory.
Now I would like to be able to have the generated man page included in the distribution tarball, so GNU/Linux packagers can find it and install it to the proper location. Various options like package_data do not seem to work here because the man page is not there until it is generated by Sphinx. This could also apply to i18n files (.mo vs .po files).
Including files that are not part of the source in MANIFEST.in doesn't seem right. The possibility of committing the generated files to the source repository looks like an awful thing to do and I would like to avoid it.
There should be one-- and preferably only one --obvious way to do it.
To add static man pages to your distribution, you can list them in the MANIFEST.in file.
recursive-include docs *.txt
recursive-include po *.po
recursive-include sample_data *
recursive-include data *.desktop *.svg *.png
include COPYING.txt
include README.txt
recursive-include man_pages *
Where man_pages is the directory containing the copies of generated man pages.
See also: http://linuxmanpages.com/man1/man.1.php
I would have setup.py generate the man pages, probably before calling distutils.core.setup(). Remember that setup.py is, at one level, Python code. You want to test it and make sure that it works even if Sphinx is not installed (unless you require Sphinx). So, if the man pages already exist and Sphinx is not available, do not fail. That way someone who unpacks your source distribution without Sphinx can still run setup.py build and other targets.
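A hedged sketch of that idea; the docs path and page name are assumptions:
# inside setup.py, before the setup(...) call
import os
import subprocess

def build_manpages():
    try:
        # equivalent to `python setup.py build_sphinx -b man`
        subprocess.check_call(
            ['sphinx-build', '-b', 'man', 'docs', 'build/sphinx/man'])
    except (OSError, subprocess.CalledProcessError):
        # Sphinx unavailable: fall back to a previously generated page, if any
        if not os.path.exists('build/sphinx/man/myscript.1'):
            print('warning: could not build man pages and none were found')

build_manpages()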
The other option is to check in the man pages, but like you, I find that ugly.
The thing that I have seen done before is to provide a build target for your docs and make it clear in the README file that the documentation includes man pages and can be built by running that build target. Package maintainers then build your docs and package them during the package creation process.
The Fedora 18 RPM for hawkey, for example, builds this way. I have also seen other RPMs follow the model of building documentation at the same time as the source is built, then packaging it.
This question deserves a better answer, and not only because this issue has been bothering me for a while. So here is my implementation.
Download build_manpage.py from my github project (here is a link to build_manpage)
Save it somewhere you can import it to your setup.py
# inside setup.py
from setuptools import setup
from build_manpage import BuildManPage

...
...

setup(
    ...
    ...
    cmdclass={
        'build_manpage': BuildManPage,
    },
)
Now you can invoke setup.py like this:
$ python setup.py build_manpage --output=prog.1 --parser=yourmodule:argparser