GitHub Travis CI with pytest and package data - FileNotFoundError - python

I've got a repo on GitHub, for which I wanted to implement Travis CI with pytest for basic testing. Currently the Travis CI build fails when loading data tables within a module, raising a FileNotFoundError.
To keep it short, here is what I consider the most important information about the build:
- the directory of the data tables is included in MANIFEST.in with include mypkg/data_tables/* (see below for a detailed structure)
- the setuptools.setup method has the include_package_data=True parameter
- additionally, packages=setuptools.find_packages() is provided
- Travis CI installs the package with install: pip install -e .
- Travis CI invokes pytest with script: pytest --import-mode=importlib
- during testing the first tests succeed, but when it comes to loading the data tables, pytest raises the error FileNotFoundError: [Errno 2] No such file or directory: '/home/travis/build/myname/mypkg/mypkg/data_tables\\my_data.csv'
Interestingly, the separators before the file name are backslashes, while the others are not, even though the final path is constructed with os.path.abspath().
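That mixed-separator symptom usually means one component of the path contains a hard-coded Windows backslash, which os.path.abspath() will not rewrite on Linux. A minimal sketch of the suspected difference (the literal string is a hypothetical reconstruction, not code from the repo):

import os

# Hypothetical reconstruction of the bug: a literal backslash in the
# relative path survives os.path.abspath() on Linux, where '\\' is not
# a path separator.
bad = os.path.abspath('data_tables\\my_data.csv')

# Portable construction: let os.path.join pick the separator, anchored
# at the module's own location instead of the current working directory.
here = os.path.dirname(os.path.abspath(__file__))
good = os.path.join(here, 'data_tables', 'my_data.csv')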
Detailed description
Unfortunately the repo is private and I'm not allowed to share it, so I'll describe the package layout in as much detail as possible. Let's say my repo is built with a structure like this (general layout taken from this example):
setup.py
MANIFEST.in
mypkg/
    some_data_tables/
        my_data.csv
        my_other_data.pkl
    __init__.py
    view.py
    tests/
        test_view.py
My minimum MANIFEST.in looks like this:
include mypkg/data_tables/*
With the setup.py fully reduced to a minimum working example like this:
from setuptools import find_packages, setup

setup(
    name='Mypkg',
    version='123.456',
    description='some_text',
    python_requires='>=3.7.7',
    packages=find_packages(  # <---- this should be sufficient, right?
        exclude=["tests", "*.tests", "*.tests.*", "tests.*"]),
    include_package_data=True,  # <---- also this should work
)
And the .travis.yml file (omitting - pip install -r requirements.txt etc.):
language: python
python:
  - "3.7.7"
dist: xenial
install:
  - pip install -e .
script:
  - pytest --import-mode=importlib
Checking the contents of the .egg and .tar.gz files, the data tables are included. So I have no idea where the files are "getting lost".
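For reference, that inspection can be reproduced with tar; the archive name below is an assumption derived from the name and version in setup.py:

$ tar --list -f dist/Mypkg-123.456.tar.gz | grep data_tables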
Any idea how to solve this error?
If providing more information would help, e.g. on the class initialized in test_view, please tell me.
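For completeness, one fallback worth trying is to spell the files out with package_data alongside include_package_data; a sketch based on the setup.py above, where the glob is an assumption mirroring the MANIFEST.in entry:

from setuptools import find_packages, setup

setup(
    name='Mypkg',
    version='123.456',
    python_requires='>=3.7.7',
    packages=find_packages(exclude=["tests", "*.tests", "*.tests.*", "tests.*"]),
    include_package_data=True,
    # Explicit glob as a fallback; assumed to mirror the MANIFEST.in entry:
    package_data={'mypkg': ['data_tables/*']},
)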

Related

Python package-data not found on CI server

For one python project, I want to ship a data file with the package.
Following various (and partially contradictory) advice on the mess that is Python package data, I ended up trying different things and got it to work locally on my machine with the following setup.
My setup.cfg contains, among other things that shouldn't matter here,
[options]
include_package_data = True
and no package_data or other data-related keys. My MANIFEST.in states
recursive-include lexedata clics3-network.gml.zip
My setup.py is pretty bare, essentially
from setuptools import setup
readline = "readline"
setup(extras_require={"formatguesser": [readline]})
To load the file, I use
pkg_resources.resource_stream("lexedata", "data/clics3-network.gml.zip")
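For reference, the same file can be opened with the stdlib importlib.resources (Python 3.9+); a sketch of the assumed drop-in equivalent of the pkg_resources call above:

from importlib.resources import files

# Equivalent of pkg_resources.resource_stream("lexedata", "data/clics3-network.gml.zip"):
stream = files("lexedata").joinpath("data/clics3-network.gml.zip").open("rb")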
I test this using tox, configured with
[tox]
isolated_build = True
envlist = general

[testenv]
passenv = CI
deps =
    codecov
    pytest
    pytest-cov
commands =
    pytest --doctest-modules --cov=lexedata {envsitepackagesdir}/lexedata
    pytest --cov=lexedata --cov-append test/
    codecov
On my local machine, when I run pip install ., the data file lexedata/data/clics3-network.gml.zip is properly deposited inside the site-packages/lexedata/data directory of the corresponding virtual environment, and tox packages it inside .tox/dist/lexedata-1.0.0b3.tar.gz as well as in its venv site-packages directory .tox/general/lib/python3.8/site-packages/lexedata/data/.
However, continuous integration using GitHub Actions fails on all Python 3 versions I'm testing with
UNEXPECTED EXCEPTION: FileNotFoundError(2, 'No such file or directory')
FileNotFoundError: [Errno 2] No such file or directory: '/home/runner/work/lexedata/lexedata/.tox/general/lib/python3.10/site-packages/lexedata/data/clics3-network.gml.zip'
at the equivalent of that same tox venv path.
What could be going wrong here?
You almost did it right; try updating your MANIFEST.in slightly, to any of the following examples:
include src/lexedata/data/*.zip
recursive-include src/* *.zip
recursive-include **/data clics3-network.gml.zip
As you can find in the docs, the include command defines files as paths relative to the root of the project (that's why the first example starts from the src folder).
recursive-include expects its first argument to be a dir-pattern (glob-style), so it is better to include asterisks.

PyPI - folder inside the package folder not getting uploaded [duplicate]

When using setuptools, I cannot get the installer to pull in any package_data files. Everything I've read says that the following is the correct way to do it. Can someone please advise?
setup(
    name='myapp',
    packages=find_packages(),
    package_data={
        'myapp': ['data/*.txt'],
    },
    include_package_data=True,
    zip_safe=False,
    install_requires=['distribute'],
)
where myapp/data/ is the location of the data files.
I realize that this is an old question, but for people finding their way here via Google: package_data is a low-down, dirty lie. It is only used when building binary packages (python setup.py bdist ...) but not when building source packages (python setup.py sdist ...). This is, of course, ridiculous -- one would expect that building a source distribution would result in a collection of files that could be sent to someone else to build the binary distribution.
In any case, using MANIFEST.in will work both for binary and for source distributions.
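For the myapp layout in the question, the corresponding MANIFEST.in would presumably be a single line (assuming the data really lives at myapp/data/):

include myapp/data/*.txt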
I just had this same issue. The solution was simply to remove include_package_data=True.
After reading here, I realized that include_package_data aims to include files from version control, as opposed to merely "include package data" as the name implies. From the docs:
The data files [of include_package_data] must be under CVS or Subversion control
...
If you want finer-grained control over what files are included (for example, if
you have documentation files in your package directories and want to exclude
them from installation), then you can also use the package_data keyword.
Taking that argument out fixed it, which is coincidentally why it also worked when you switched to distutils, since it doesn't take that argument.
Following @Joe's recommendation to remove the include_package_data=True line also worked for me.
To elaborate a bit more, I have no MANIFEST.in file. I use Git and not CVS.
Repository takes this kind of shape:
/myrepo
    - .git/
    - setup.py
    - myproject
        - __init__.py
        - some_mod
            - __init__.py
            - animals.py
            - rocks.py
        - config
            - __init__.py
            - settings.py
            - other_settings.special
            - cool.huh
            - other_settings.xml
        - words
            - __init__.py
            - word_set.txt
setup.py:
from setuptools import setup, find_packages
import os.path

setup(
    name='myproject',
    version="4.19",
    packages=find_packages(),
    # package_dir={'mypkg': 'src/mypkg'},  # didn't use this.
    package_data={
        # If any package contains *.txt or *.rst files, include them:
        '': ['*.txt', '*.xml', '*.special', '*.huh'],
    },
    # Oddly enough, include_package_data=True prevented package_data from working.
    # include_package_data=True,  # Commented out.
    data_files=[
        # ('bitmaps', ['bm/b1.gif', 'bm/b2.gif']),
        ('/opt/local/myproject/etc', ['myproject/config/settings.py', 'myproject/config/other_settings.special']),
        ('/opt/local/myproject/etc', [os.path.join('myproject/config', 'cool.huh')]),
        ('/opt/local/myproject/etc', [os.path.join('myproject/config', 'other_settings.xml')]),
        ('/opt/local/myproject/data', [os.path.join('myproject/words', 'word_set.txt')]),
    ],
    install_requires=[
        'jsonschema',
        'logging',
    ],
    entry_points={
        'console_scripts': [
            # Blah...
        ],
    },
)
I run python setup.py sdist for a source distrib (haven't tried binary).
And inside a brand new virtual environment, I have a myproject-4.19.tar.gz file,
and I use
(venv) pip install ~/myproject-4.19.tar.gz
...
And besides everything getting installed into my virtual environment's site-packages, those special data files get installed to /opt/local/myproject/data and /opt/local/myproject/etc.
include_package_data=True worked for me.
If you use git, remember to include setuptools-git in install_requires. Far less boring than maintaining a MANIFEST.in or listing every path in package_data (in my case it's a Django app with all kinds of statics).
(Pasted from the comment I made, as k3-rnc mentioned it's actually helpful as-is.)
Using setup.cfg (setuptools ≥ 30.3.0)
Starting with setuptools 30.3.0 (released 2016-12-08), you can keep your setup.py very small and move the configuration to a setup.cfg file. With this approach, you could put your package data in an [options.package_data] section:
[options.package_data]
* = *.txt, *.rst
hello = *.msg
In this case, your setup.py can be as short as:
from setuptools import setup
setup()
For more information, see configuring setup using setup.cfg files.
There is some talk of deprecating setup.cfg in favour of pyproject.toml as proposed in PEP 518, but this is still provisional as of 2020-02-21.
Update: This answer is old and the information is no longer valid. All setup.py configs should use import setuptools. I've added a more complete answer at https://stackoverflow.com/a/49501350/64313
I solved this by switching to distutils. Looks like distribute is deprecated and/or broken.
from distutils.core import setup

setup(
    name='myapp',
    packages=['myapp'],
    package_data={
        'myapp': ['data/*.txt'],
    },
)
I had the same problem for a couple of days, but even this thread wasn't able to help me, as everything was confusing. So I did my own research and found the following solution.
Basically, in this case you should do:
from setuptools import setup

setup(
    name='myapp',
    packages=['myapp'],
    package_dir={'myapp': 'myapp'},  # the one line where all the magic happens
    package_data={
        'myapp': ['data/*.txt'],
    },
)
The full other Stack Overflow answer is here.
I found this post while stuck on the same problem.
My experience contradicts the experiences in the other answers.
include_package_data=True does include the data in the
bdist! The explanation in the setuptools
documentation
lacks context and troubleshooting tips, but
include_package_data works as advertised.
My setup:
- Windows / Cygwin
- git version 2.21.0
- Python 3.8.1 Windows distribution
- setuptools v47.3.1
- check-manifest v0.42
Here is my how-to guide.
How-to include package data
Here is the file structure for a project I published on PyPI.
(It installs the application in __main__.py).
├── LICENSE.md
├── MANIFEST.in
├── my_package
│   ├── __init__.py
│   ├── __main__.py
│   └── _my_data          <---- folder with data
│       ├── consola.ttf   <---- data file
│       └── icon.png      <---- data file
├── README.md
└── setup.py
Starting point
Here is a generic starting point for the setuptools.setup() in
setup.py.
setuptools.setup(
    ...
    packages=setuptools.find_packages(),
    ...
)
setuptools.find_packages() includes all of my packages in the
distribution. My only package is my_package.
The sub-folder with my data, _my_data, is not considered a
package by Python because it does not contain an __init__.py,
and so find_packages() does not find it.
A solution often-cited, but incorrect, is to put an empty
__init__.py file in the _my_data folder.
This does make it a package, so it does include the folder
_my_data in the distribution. But the data files inside
_my_data are not included.
So making _my_data into a package does not help.
The solution is:
- the sdist already contains the data files
- add include_package_data=True to include the data files in the bdist as well
Experiment (how to test the solution)
There are three steps to make this a repeatable experiment:
$ rm -fr build/ dist/ my_package.egg-info/
$ check-manifest
$ python setup.py sdist bdist_wheel
I will break these down step-by-step:
Clean out the old build:
$ rm -fr build/ dist/ my_package.egg-info/
Run check-manifest to be sure MANIFEST.in matches the
Git index of files under version control:
$ check-manifest
If MANIFEST.in does not exist yet, create it from the Git
index of files under version control:
$ check-manifest --create
Here is the MANIFEST.in that is created:
include *.md
recursive-include my_package *.png
recursive-include my_package *.ttf
There is no reason to manually edit this file.
As long as everything that should be under version control is
under version control (i.e., is part of the Git index),
check-manifest --create does the right thing.
Note: files are not part of the Git index if they are either:
- ignored in a .gitignore
- excluded in a .git/info/exclude
- or simply new files that have not been added to the index yet
And if any files are under version control that should not be
under version control, check-manifest issues a warning and
specifies which files it recommends removing from the Git index.
Build:
$ python setup.py sdist bdist_wheel
Now inspect the sdist (source distribution) and bdist_wheel
(build distribution) to see if they include the data files.
Look at the contents of the sdist (only the relevant lines are
shown below):
$ tar --list -f dist/my_package-0.0.1a6.tar.gz
my_package-0.0.1a6/
...
my_package-0.0.1a6/my_package/__init__.py
my_package-0.0.1a6/my_package/__main__.py
my_package-0.0.1a6/my_package/_my_data/
my_package-0.0.1a6/my_package/_my_data/consola.ttf <-- yay!
my_package-0.0.1a6/my_package/_my_data/icon.png <-- yay!
...
So the sdist already includes the data files because they are
listed in MANIFEST.in. There is nothing extra to do to include
the data files in the sdist.
Look at the contents of the bdist (it is a .zip file, parsed
with zipfile.ZipFile):
$ python check-whl.py
my_package/__init__.py
my_package/__main__.py
my_package-0.0.1a6.dist-info/LICENSE.md
my_package-0.0.1a6.dist-info/METADATA
my_package-0.0.1a6.dist-info/WHEEL
my_package-0.0.1a6.dist-info/entry_points.txt
my_package-0.0.1a6.dist-info/top_level.txt
my_package-0.0.1a6.dist-info/RECORD
Note: you need to create your own check-whl.py script to produce the
above output. It is just three lines:
from zipfile import ZipFile
path = "dist/my_package-0.0.1a6-py3-none-any.whl" # <-- CHANGE
print('\n'.join(ZipFile(path).namelist()))
As expected, the bdist is missing the data files.
The _my_data folder is completely missing.
What if I create a _my_data/__init__.py? I repeat the
experiment and I find the data files are still not there! The
_my_data/ folder is included but it does not contain the data
files!
Solution
Contrary to the experience of others, this does work:
setuptools.setup(
    ...
    packages=setuptools.find_packages(),
    include_package_data=True,  # <-- adds data files to bdist
    ...
)
With the fix in place, redo the experiment:
$ rm -fr build/ dist/ my_package.egg-info/
$ check-manifest
$ python.exe setup.py sdist bdist_wheel
Make sure the sdist still has the data files:
$ tar --list -f dist/my_package-0.0.1a6.tar.gz
my_package-0.0.1a6/
...
my_package-0.0.1a6/my_package/__init__.py
my_package-0.0.1a6/my_package/__main__.py
my_package-0.0.1a6/my_package/_my_data/
my_package-0.0.1a6/my_package/_my_data/consola.ttf <-- yay!
my_package-0.0.1a6/my_package/_my_data/icon.png <-- yay!
...
Look at the contents of the bdist:
$ python check-whl.py
my_package/__init__.py
my_package/__main__.py
my_package/_my_data/consola.ttf <--- yay!
my_package/_my_data/icon.png <--- yay!
my_package-0.0.1a6.dist-info/LICENSE.md
my_package-0.0.1a6.dist-info/METADATA
my_package-0.0.1a6.dist-info/WHEEL
my_package-0.0.1a6.dist-info/entry_points.txt
my_package-0.0.1a6.dist-info/top_level.txt
my_package-0.0.1a6.dist-info/RECORD
How not to test if data files are included
I recommend troubleshooting/testing using the approach outlined
above to inspect the sdist and bdist.
pip install in editable mode is not a valid test
Note: pip install -e . does not show if data files are
included in the bdist.
The symbolic link causes the installation to behave as if the
data files are included (because they already exist locally on
the developer's computer).
After pip install my_package, the data files are in the
virtual environment's lib/site-packages/my_package/ folder,
using the exact same file structure shown above in the list of
the whl contents.
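A quicker valid test is to install the built wheel into a scratch virtual environment and look for the files; a sketch using standard venv/pip commands (POSIX-style venv layout assumed), with the wheel name taken from the build above:

$ python -m venv /tmp/scratch
$ /tmp/scratch/bin/pip install dist/my_package-0.0.1a6-py3-none-any.whl
$ ls /tmp/scratch/lib/python3.8/site-packages/my_package/_my_data/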
Publishing to TestPyPI is a slow way to test
Publishing to TestPyPI and then installing and looking in
lib/site-packages/my_packages is a valid test, but it is too
time-consuming.
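As a side note, once the data files ship inside the package they can be read at runtime without hard-coding absolute paths; a minimal sketch using the stdlib pkgutil, with the file name taken from the tree above:

import pkgutil

# Returns the file contents as bytes, resolved relative to the installed
# my_package, wherever site-packages happens to live.
font_bytes = pkgutil.get_data("my_package", "_my_data/consola.ttf")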
Ancient question, and yet... Python package management really leaves a lot to be desired. I had the use case of installing with pip locally to a specified directory, and was surprised that both the package_data and data_files paths did not work out. I was not keen on adding yet another file to the repo, so I ended up leveraging data_files and the setup.py option --install-data; something like this:
pip install . --install-option="--install-data=$PWD/package" -t package
Moving the folder containing the package data into the module folder solved the problem for me.
See this question: MANIFEST.in ignored on "python setup.py install" - no data files installed?
Just remove the line:
include_package_data=True,
from your setup script, and it will work fine. (Tested just now with latest setuptools.)
Like others in this thread, I'm more than a little surprised at the combination of longevity and continued lack of clarity, BUT the best answer for me was using check-manifest as recommended in the answer from @mike-gazes.
So, using just a setup.cfg (no setup.py) with additional text and Python files required in the package, what worked for me was keeping this in setup.cfg:
[options]
packages = find:
include_package_data = true
and updating the MANIFEST.in based on the check-manifest output:
include *.in
include *.txt
include *.yml
include LICENSE
include tox.ini
recursive-include mypkg *.py
recursive-include mypkg *.txt
For a directory structure like:
foo/
├── foo
│   ├── __init__.py
│   ├── a.py
│   └── data.txt
└── setup.py
and setup.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from setuptools import setup

NAME = 'foo'
DESCRIPTION = 'Test library to check how setuptools works'
URL = 'https://none.com'
EMAIL = 'gzorp@bzorp.com'
AUTHOR = 'KT'
REQUIRES_PYTHON = '>=3.6.0'

setup(
    name=NAME,
    version='0.0.0',
    description=DESCRIPTION,
    author=AUTHOR,
    author_email=EMAIL,
    python_requires=REQUIRES_PYTHON,
    url=URL,
    license='MIT',
    classifiers=[
        'Programming Language :: Python',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.6',
    ],
    packages=['foo'],
    package_data={'foo': ['data.txt']},
    include_package_data=True,
    install_requires=[],
    extras_require={},
    cmdclass={},
)
python setup.py bdist_wheel works.
Starting with setuptools 62.3.0, you can use recursive wildcards ("**") to include a (sub)directory recursively. This way you can include whole folders with all their subfolders and files.
For example, when using a pyproject.toml file, this is how you include two folders recursively:
[tool.setuptools.package-data]
"ema_workbench.examples.data" = ["**"]
"ema_workbench.examples.models" = ["**"]
But you can also include only certain file types in a folder and all its subfolders. If you want to include all markdown (.md) files, for example:
[tool.setuptools.package-data]
"ema_workbench.examples.data" = ["**/*.md"]
It should also work when using setup.py or setup.cfg.
See https://github.com/pypa/setuptools/pull/3309 for the details.
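For completeness, the setup.cfg equivalent would presumably look like this (an untested sketch, assuming the same glob semantics carry over to [options.package_data]):

[options.package_data]
ema_workbench.examples.data = **
ema_workbench.examples.models = **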

How to use Travis CI with some files in gitignore?

I have a Flask app that has its configuration in a file called settings.py. I've put this file in .gitignore because the project is in a public repo. Travis CI was working before I added tests to my project, even though settings.py was in .gitignore. After adding tests, the build started failing with the following output:
Debugged import:
- 'settings' not found.
Original exception:
ImportError: No module named 'settings'
My .travis.yml file looks like this:
language: python
python:
  - "3.4"
  - "3.5"
# command to install dependencies
install:
  - pip install -r requirements.txt
# command to run tests
script: python tests.py
Does this mean that in order to use Travis CI, we have to include all necessary files in the repo? Or is there a workaround? The repo on GitHub can be found here.
@dirn's comment about using a default settings.py file and then overriding some settings with encrypted environment variables on Travis is a good idea, certainly worth it if there are only a couple of differences.
However, if you can't be bothered, or breaking up your settings is too complicated, you could install the Ruby Travis command-line client gem, which is useful for quite a few things.
With the client on your machine you can use Travis' file encryption feature to encrypt your whole settings.py file, and then commit the encrypted version (which will have an .enc file extension) to GitHub. Travis will then be able to decrypt the file during the CI run, as long as you add the right commands to the .travis.yml file, say in a before_install step. Detailed instructions are on the file encryption page.
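For reference, the usual sequence with the Travis CLI looks roughly like this (a sketch; --pro assumes travis-ci.com, and --add writes the openssl decryption command into .travis.yml for you):

$ gem install travis
$ travis login --pro
$ travis encrypt-file settings.py --add
$ git add settings.py.enc .travis.yml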
I used a trick in .travis.yml: after the commit and before the Travis build, create the ignored file like this:
before_install:
  - cp .ignored.file.copy ignored.file
This way, the build will succeed without the actual gitignored file.

Python import fails on travisCI but not locally

I'm trying to integrate TravisCI into my workflow, and realized I had some dependency problems because of my old directory structure (not having self-contained, virtualenv-able git repos).
When I try to run nosetests locally, it runs the tests just fine; when TravisCI tries to run them, it fails, with an import error. Specifically, I have, as one of the lines in my test script:
from myproject import something
My directory structure inside my git repo myproject is something like:
.travis.yml
requirements.txt
something.py
tests/
    test_something.py
I have tried getting this to fail locally (because then I'd understand the TravisCI issue, maybe), but cannot accomplish it.
I've tried running with regular python, and using a virtualenv which added nose to its requirements.txt, and the tests always pass locally.
I feel like I still haven't understood absolute-vs-relative imports, and I can't tell if that's coming into play here, or if I'm just doing something obvious and dumb in my project.
Desired outcome: figure out why TravisCI is failing, and fix my repo accordingly, so that I can commit and have things build correctly, both locally and on TravisCI. If that requires more drastic changes like "you should have a setup.py that does blah-blah to the environment" or similar - please let me know. I'm new to this aspect of Python, and find the current documentation overwhelmingly unclear.
As an FYI, I found this question; adding --exe doesn't help, and it doesn't seem to be the same issue.
I see there are no answers, and I encountered the same issue, so I am posting here in the hope of helping somebody:
Solution 1
The quick fix for me was to add the line export PYTHONPATH=$PYTHONPATH:$(pwd) to the .travis.yml:
before_install:
  - "pip install -U pip"
  - "export PYTHONPATH=$PYTHONPATH:$(pwd)"
Solution 2
Having a setup.py, which should be the default option as it is the most elegant one, configured like:
from setuptools import setup, find_packages

setup(
    name='MyPythonProject',
    version='0.0.1',
    description='What it does',
    author='',
    author_email='',
    url='',
    packages=find_packages(),
)
And then add these lines to .travis.yml:
before_install:
  - "pip install -U pip"
  - "python setup.py install"
Solution 3
Changing the layout of the project to have the tests folder under the application folder (the one with your core Python code), such as:
.travis.yml
requirements.txt
app
|_ tests
| |_ test_application.py
|_ application.py
And running the tests in Travis with coverage and nosetests like:
script:
  - "nosetests --with-coverage --cover-package app"

How to run tests without installing package?

I have some Python package and some tests. The files are laid out following http://pytest.org/latest/goodpractices.html#choosing-a-test-layout-import-rules
Putting tests into an extra directory outside your actual application
code, useful if you have many functional tests or for other reasons
want to keep tests separate from actual application code (often a good
idea):
setup.py    # your distutils/setuptools Python package metadata
mypkg/
    __init__.py
    appmodule.py
tests/
    test_app.py
My problem is that when I run the tests with py.test, I get an error:
ImportError: No module named 'mypkg'
I can solve this by installing the package with python setup.py install, but this means the tests run against the installed package, not the local one, which makes development very tedious. Whenever I make a change and want to run the tests, I need to reinstall; otherwise I am testing the old code.
What can I do?
I know this question has already been closed, but a simple way I often use is to call pytest via python -m from the root (the parent of the package).
$ python -m pytest tests
This works because the -m option adds the current directory to the Python path, and hence mypkg is detected as a local package (not the installed one).
See:
https://docs.pytest.org/en/latest/usage.html#calling-pytest-through-python-m-pytest
The normal approach for development is to use a virtualenv and run pip install -e . in the virtualenv (this is almost equivalent to python setup.py develop). Now your source directory is used as the installed package on sys.path.
There are of course a bunch of other ways to get your package on sys.path for testing, see Ensuring py.test includes the application directory in sys.path for a question with a more complete answer for this exact same problem.
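For concreteness, a minimal sketch of that virtualenv workflow (standard venv and pip commands, nothing project-specific):

$ python -m venv .venv
$ source .venv/bin/activate
(.venv) $ pip install -e .       # editable install; the source dir goes on sys.path
(.venv) $ pip install pytest
(.venv) $ pytest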
On my side, while developing, I prefer to run tests from the IDE (using a runner extension) rather than using the command line. However, before pushing my code or prior to a release, I like to use the command line.
Here is a way to deal with this issue, allowing you to run tests from both the test runner used by your IDE and the command line.
My setup:
- IDE: Visual Studio Code
- Testing: pytest
- Extension (test runner): https://marketplace.visualstudio.com/items?itemName=LittleFoxTeam.vscode-python-test-adapter
Work directory structure (my solution should be easily adaptable to your context):
project_folder/
    src/
        mypkg/
            __init__.py
            appmodule.py
    tests/
        mypkg/
            appmodule_test.py
    pytest.ini   <- used so pytest can locate pkgs from ./src
    .env         <- used so VS Code and its extension can locate pkgs from ./src
.env:
PYTHONPATH="${PYTHONPATH};./src;"
pytest.ini (tried with pytest 7.1.2):
[pytest]
pythonpath = . src
./src/mypkg/appmodule.py:

def i_hate_configuring_python():
    return "Finally..."

./tests/mypkg/appmodule_test.py:

from mypkg import appmodule

def test_demo():
    print(appmodule.i_hate_configuring_python())
This should do the trick
Import the package using from .. import mypkg. For this to work you will need to add (empty) __init__.py files to the tests directory and the containing directory. py.test should take care of the rest.
