I had some issues with importing my custom package into a project.
I've created a very simple setup.py file.
It looks like this:
from setuptools import setup
import custom_package

setup(
    name='custom_package',
    version=custom_package.__version__,
    packages=['custom_package'],
)
Then I'm installing it into my system:
python setup.py install
custom_package becomes available, but there are a lot of additional folders in my project after this command:
build/
dist/
custom_package.egg-info/
Is this expected, or should I avoid them somehow?
Yes, that is to be expected, because you don't limit the files that should go into the package by using the setup argument package_dir. Without that, setup will take everything from the directory it is in, which includes the directories it creates for storing the structure that goes into the installable archive (build/), as well as the destination directory for the final .tar.gz (dist/).
This is one of the reasons why many package layouts have a src subdirectory to hold all the sources. You can list all the .py files that have to go in explicitly, or use some helper function to generate that list (which is easier if you already have everything under one directory that is "clean").
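For example, with the sources moved under src/, a sketch like this keeps the generated directories out of the package (find_packages and the hard-coded version are illustrative; importing custom_package for __version__ no longer works from the project root in this layout):

from setuptools import setup, find_packages

setup(
    name='custom_package',
    version='1.0',                   # hard-coded here instead of imported
    package_dir={'': 'src'},         # sources live under src/, not the project root
    packages=find_packages('src'),   # picks up src/custom_package and its subpackages
)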
Related
I have a Python package built from source code in the /Document/pythonpackage directory:
/Document/pythonpackage/> python setup.py install
This creates a folder in Python's site-packages directory:
import pythonpackage
print(pythonpackage.__file__)
>/anaconda3/lib/python3.7/site-packages/pythonpackage-x86_64.egg/pythonpackage/__init__.py
I am running a script in multiple environments, so the only path I know I will have is pythonpackage.__file__. However, /Document/pythonpackage has some data that is not in site-packages. Is there a way to automatically find the path to /Document/pythonpackage given that you only have access to the module in Python?
Working like that is discouraged. It's generally assumed that after installing a package the user can remove the installation directory (as most automated package managers would do). Instead, you'd make sure your setup.py copied any data files over into the relevant places, and your code would then pick them up from there.
Assuming you're using the standard setuptools, see the docs on Including Data Files, which say at the bottom:
In summary, the three options allow you to:
include_package_data
Accept all data files and directories matched by MANIFEST.in.
package_data
Specify additional patterns to match files that may or may not be matched by MANIFEST.in or found in source control.
exclude_package_data
Specify patterns for data files and directories that should not be included when a package is installed, even if they would otherwise have been included due to the use of the preceding options.
and then says:
Typically, existing programs manipulate a package’s __file__ attribute in order to find the location of data files. However, this manipulation isn’t compatible with PEP 302-based import hooks, including importing from zip files and Python Eggs. It is strongly recommended that, if you are using data files, you should use the ResourceManager API of pkg_resources to access them
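For example, a data file shipped inside the package can be read through that API like this (a minimal sketch; the file name defaults.cfg is made up):

from pkg_resources import resource_filename, resource_string

# read the raw bytes of a data file that lives inside the installed package,
# even when the package is zipped up as an egg
raw = resource_string('pythonpackage', 'defaults.cfg')

# or ask for a real filesystem path (pkg_resources extracts the file
# to a cache first if the package is zipped)
path = resource_filename('pythonpackage', 'defaults.cfg')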
Not sure, but you could create a repository for your module and use pip to install it. The egg folder would then have a file called PKG-INFO which would contain the URL of the repository you installed your module from.
In my office we have a quite complex directory structure when it comes to our code.
One of the things we have is a libs module to drop "common" things used by other parts of our big application (or set of applications... that are all living under a common directory).
The code in that libs/ directory requires certain packages installed in order for it to work. In said libs/ directory we have a requirements.txt file that supposedly lists the dependencies required for the things (things being Python code) in it to work. We have been filling that requirements.txt file pretty much manually, on the basis of "if this .py file uses this module, we should add it to the requirements file", so it's almost certain that by now we have forgotten to add some required modules.
Because of the complex structure we have (some parts use pipenv, some others have their own requirements.txt...) it is very hard to know whether a required module is going to end up installed or not.
So I would like to make sure that this libs/ directory (cough, cough... module) has all its dependencies listed in its libs/requirements.txt.
Is that possible? Ideally it'd be "run this command passing libs/ as an argument; it'll scan the directory and tell you what packages are needed by the .py files found in it".
Thank you in advance.
Unfortunately, Python does not know whether its dependencies are satisfied until runtime. requirements.txt is just a helper file for pip and similar tools, and you have to update it manually.
That said, you could
use the os module to recursively get a list of all *.py files in the folder
parse each one of them for lines having the format import aaa.bbb or from aaa import bbb
keep a set of the imports
However, even in that case, the name of the imported module is not necessarily the same as the name you need to pass to pip (e.g., import yaml requires pyyaml in requirements.txt), but at least it could be a hint of what's missing.
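A rough sketch of that approach, using the ast module instead of plain string matching so that multi-line and aliased imports are handled as well (the libs path is the one from the question):

import ast
import os

def find_imports(root):
    """Recursively collect the top-level names of everything imported under root."""
    found = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for filename in filenames:
            if not filename.endswith('.py'):
                continue
            path = os.path.join(dirpath, filename)
            with open(path) as f:
                tree = ast.parse(f.read(), filename=path)
            for node in ast.walk(tree):
                if isinstance(node, ast.Import):
                    found.update(alias.name.split('.')[0] for alias in node.names)
                elif isinstance(node, ast.ImportFrom) and node.module:
                    found.add(node.module.split('.')[0])
    return found

print(sorted(find_imports('libs')))

The resulting set still mixes standard-library, local, and third-party names, so treat it as a starting point for manual filtering rather than a finished requirements.txt.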
I have two Python projects. One includes useful packages for file manipulation and such; they are useful because they can be reused in any other kind of project. The second is one of those projects, requiring the use of my useful packages. Here is my projects' file structure:
Python-projects/
    Useful/
        package_parsing/
        package_file_manipulation/
        package_blabla/
    Super-Application/
        pkg1/
            __init__.py
            module1.py
            module2.py
        pkg2/
        setup.py
        MANIFEST.in
        README.rst
First off, I would like to use the Useful/package_parsing package in my module Super-Application/pkg1/module1.py. Is there a more convenient way to do it other than copying package_parsing into the Super-Application project?
Depending on the first answer, that is, if there is a way to link a module from a different project: how could I include such an external module in a release package of my Super-Application project? I am not sure that making use of install_requires in setup.py will do.
My main idea here is not to duplicate the Useful/package_parsing package in all of my other development projects, especially as I would like to make modifications to this useful package over time. I wouldn't like to have to update all the outdated copies in each project.
=============
EDIT 1
It appears the first part of my question can be dealt with by appending to the path:

import sys
sys.path.insert(0, 'path/to/Useful/package_parsing')
Moreover, I can simply check the available paths using:

for p in sys.path:
    print(p)
Now for the second part: how could I include such an external module in a release package, possibly using the setup.py installation file?
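One way to approach this (a sketch, not a confirmed answer: it assumes Useful is given its own setup.py, is packaged under the distribution name useful, and is installed into the same environment, e.g. with pip install -e Useful/):

# Super-Application/setup.py -- minimal sketch; the distribution name
# 'useful' is an assumption about how the Useful project gets packaged
from setuptools import setup

setup(
    name='Super-Application',
    packages=['pkg1', 'pkg2'],
    # resolved like any other dependency once 'useful' is installable
    # (installed locally in editable mode, or published to an index)
    install_requires=['useful'],
)

With Useful installed as its own package, the sys.path manipulation above becomes unnecessary and module1.py can simply import package_parsing.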
Currently I'm using the autotools to build/install and package a project of mine, but I would really like to move to something that feels more "Pythonic".
My project consists of two scripts, one module, two Glade GUI descriptions, and two .desktop files. It's currently a pure Python project, though that's likely to change soon-ish.
Looking at setuptools I can easily see how to deal with everything except the .desktop files; they have to end up in a specific directory so that Gnome can find them.
Is using distutils/setuptools a good idea to begin with?
I managed to get this to work, but it feels more like a workaround to me. I don't know the preferred way to handle this...
I used the following setup.py file (full version is here):
from setuptools import setup

setup(
    # ...
    data_files=[
        ('share/icons/hicolor/scalable/apps', ['data/mypackage.svg']),
        ('share/applications', ['data/mypackage.desktop'])
    ],
    entry_points={
        'console_scripts': ['startit=mypackage.cli:run']
    }
)
The starter script created through entry_points works. But the data_files were put in an egg file instead of the folders specified, so they can't be accessed by the desktop shell.
To work around this, I used the following setup.cfg file:
[install]
single-version-externally-managed=1
record=install.txt
This works. Both data files are created in the right place and the .desktop file is recognized by Gnome.
In general, yes - everything is better than autotools when building Python projects.
I have had good experiences with setuptools so far. However, installing files into fixed locations is not a strength of setuptools - after all, it's not meant to build installers for Python apps, but to distribute Python libraries.
For installing files which are not application data files (like images, UI files etc.) but provide integration into the operating system, you are better off using a real packaging format (like RPM or deb).
That said, nothing stops you from basing the build process on setuptools and keeping a small makefile for installing everything into its rightful place.
You can try to use python-distutils-extra. The DistUtilsExtra.auto module automatically supports .desktop files, as well as Glade/GtkBuilder .ui files, Python modules and scripts, misc data files, etc.
It should work both with Distutils and Setuptools.
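A minimal sketch of what that looks like (assuming python-distutils-extra is installed; its auto module replaces the usual setup import and picks up files on its own):

from DistUtilsExtra.auto import setup

setup(
    name='mypackage',
    version='0.1',
    # no data_files needed: DistUtilsExtra.auto scans the source tree
    # and handles *.desktop, *.ui and other recognized files itself
)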
I've created https://pypi.python.org/pypi/install-freedesktop. It creates .desktop files automatically for the gui_scripts entry points, which can be customized through a setup argument, and supports --user as well as system-wide installation. Compared to DistUtilsExtra, it's more narrow in scope and IMHO more pythonic (explicit is better than implicit).
I have a Python project that has the following structure:
package1/
    class.py
    class2.py
    ...
package2/
    otherClass.py
    otherClass2.py
    ...
config/
    dev_settings.ini
    prod_settings.ini
I wrote a setup.py file that converts this into an egg with the same file structure. (When I examine it using a zip program, the structure seems identical.) The funny thing is, when I run the Python code from my IDE it works fine and can access the config files; but when I try to run it from a different Python script using the egg, it can't seem to find the config files in the egg.

If I put the config files into a directory relative to the calling Python script (external to the egg), it works - but that sort of defeats the purpose of having a self-contained egg that has all the functionality of the program and can be called from anywhere. I can use any classes/modules and run any functions from the egg as long as they don't use the config files... but if they do, the egg can't find them, and so the functions don't work.
Any help would be really appreciated! We're kind of new to the egg thing here and don't really know where to start.
The problem is, the config files are not files anymore - they're packaged within the egg. It's not easy to find the answer in the docs, but it is there. From the setuptools developer's guide:
Typically, existing programs manipulate a package's __file__ attribute in order to find the location of data files. However, this manipulation isn't compatible with PEP 302-based import hooks, including importing from zip files and Python Eggs.
To access them, you need to follow the instructions for the Resource Management API.
In my own code, I had this problem with a logging configuration file. I used the API successfully like this:
import logging.config

from pkg_resources import resource_stream

_log_config_file = 'logging.conf'
_log_config_location = resource_stream(__name__, _log_config_file)
logging.config.fileConfig(_log_config_location)
_log = logging.getLogger('package.module')
See Setuptools' discussion of accessing packaged data files at runtime. You have to get at your configuration file a different way if you want the script to work inside an egg. Also, for that to work, you may need to make your config directory a Python package by tossing in an empty __init__.py file.
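Applied to the layout above, reading one of the .ini files from inside the egg would then look something like this (a sketch, assuming config has been turned into a package as described and that Python 3's configparser is available):

import configparser

from pkg_resources import resource_stream

parser = configparser.ConfigParser()
# load config/dev_settings.ini from inside the egg, without touching __file__
with resource_stream('config', 'dev_settings.ini') as stream:
    parser.read_string(stream.read().decode('utf-8'))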