How to use Travis CI with some files in gitignore? - python

I have a Flask app that keeps its configuration in a file called settings.py. I've put this file in .gitignore because the project is in a public repo. Travis CI was working before I added tests to my project, even though settings.py was in .gitignore. After adding tests, the build started failing with the following output:
Debugged import:
- 'settings' not found.
Original exception:
ImportError: No module named 'settings'
My .travis.yml file looks like this:
language: python
python:
- "3.4"
- "3.5"
# command to install dependencies
install:
- pip install -r requirements.txt
# command to run tests
script: python tests.py
Does this mean that, in order to use Travis CI, we have to include all necessary files in the repo? Or is there a workaround?

dirn's suggestion of using a default settings.py file and then overriding some settings with encrypted environment variables on Travis is a good idea, and certainly worth it if there are only a couple of differences.
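A minimal sketch of that approach (the setting names here are illustrative, not taken from the repo): commit a settings.py that holds safe defaults and reads the real values from environment variables, which Travis can supply encrypted:
import os

# Illustrative defaults only; real secrets come from (encrypted) Travis env vars.
SECRET_KEY = os.environ.get("SECRET_KEY", "insecure-dev-key")
DEBUG = os.environ.get("DEBUG", "0") == "1"
SQLALCHEMY_DATABASE_URI = os.environ.get("DATABASE_URL", "sqlite:///dev.db")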
However, if breaking up your settings is more trouble than it's worth, you can install the Travis command-line client (a Ruby gem), which is useful for quite a few things.
With the client on your machine you can use Travis' file encryption feature to encrypt your whole settings.py file, and then commit the encrypted version (which will have an .enc file extension) to GitHub. Travis will then be able to decrypt the file during the CI run, as long as you add the right commands to the .travis.yml file, say in a before_install step. Detailed instructions are on the file encryption page.
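Roughly, the workflow looks like this (the encrypted_... variable names are placeholders that the CLI generates per repository):
gem install travis
travis login
travis encrypt-file settings.py --add
# --add appends a decryption step to .travis.yml, roughly:
# before_install:
#   - openssl aes-256-cbc -K $encrypted_xxxxxxxxxxxx_key -iv $encrypted_xxxxxxxxxxxx_iv -in settings.py.enc -out settings.py -d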

I used a trick in .travis.yml: after the commit and before the Travis build, create the ignored file like this:
before_install:
- cp .ignored.file.copy ignored.file
This way, the build succeeds without the actual gitignored file.

Related

Google Cloud Buildpack custom source directory for Python app

I am experimenting with Google Cloud Platform buildpacks, specifically for Python. I started with the Sample Functions Framework Python example app, and got that running locally, with commands:
pack build --builder=gcr.io/buildpacks/builder sample-functions-framework-python
docker run -it -e PORT=8080 -p 8080:8080 sample-functions-framework-python
Great, let's see if I can apply this concept to a legacy project (Python 3.7, if that matters).
The legacy project has a structure similar to:
.gitignore
source/
    main.py
    lib/
        helper.py
    requirements.txt
tests/
    <test files here>
The Dockerfile that came with this project packaged the source directory contents without the "source" directory, like this:
COPY lib/ /app/lib
COPY main.py /app
WORKDIR /app
... rest of Dockerfile here ...
Is there a way to package just the contents of the source directory using the buildpack?
I tried to add this config to the project.toml file:
[[build.env]]
name = "GOOGLE_FUNCTION_SOURCE"
value = "./source/main.py"
But the Python modules/imports aren't set up correctly for that, as I get this error:
File "/workspace/source/main.py", line 2, in <module>
from source.lib.helper import mymethod
ModuleNotFoundError: No module named 'source'
Putting both main.py and /lib into the project root dir would make this work, but I'm wondering if there is a better way.
Related question: is there a way to see which project files are copied into the image by the buildpack? I tried verbose logging but didn't see anything useful.
Update:
The Python module error:
File "/workspace/source/main.py", line 2, in <module>
from source.lib.helper import mymethod
ModuleNotFoundError: No module named 'source'
was happening because I had moved the lib dir into source in my test project, and when I did, IntelliJ updated the import statement in main.py without me catching it. I fixed the import, then applied the solution listed below, and it worked.
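For reference, with lib sitting inside source and source acting as the app root, the corrected import presumably looks like this (module names as given in the question):
from lib.helper import mymethod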
I had been searching the buildpack and Google Cloud Functions documentation, but I found the option I needed on the pack build documentation page: the --path option.
This command captures only the source directory contents:
pack build --builder=gcr.io/buildpacks/builder --path source sample-functions-framework-python
If you change the path, the project.toml descriptor needs to be in that directory too (or you can specify its location with --descriptor on the command line).
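For example, a sketch combining both flags, assuming project.toml lives inside source:
pack build --builder=gcr.io/buildpacks/builder --path source --descriptor source/project.toml sample-functions-framework-python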

Python package-data not found on CI server

For one Python project, I want to ship a data file with the package.
Following various (and partially contradictory) advice on the mess that is Python package data, I ended up trying different things and got it to work locally on my machine with the following setup.
My setup.cfg contains, among other things that shouldn't matter here,
[options]
include_package_data = True
and no package_data or other data related keys. My MANIFEST.in states
recursive-include lexedata clics3-network.gml.zip
My setup.py is pretty bare, essentially
from setuptools import setup
readline = "readline"
setup(extras_require={"formatguesser": [readline]})
To load the file, I use
pkg_resources.resource_stream("lexedata", "data/clics3-network.gml.zip")
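(Aside: on Python 3.9+, the importlib.resources equivalent of this call would be roughly the following; the packaging behavior under test is the same.)
from importlib.resources import files

stream = (files("lexedata") / "data" / "clics3-network.gml.zip").open("rb")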
I test this using tox, configured with
[tox]
isolated_build = True
envlist = general
[testenv]
passenv = CI
deps =
codecov
pytest
pytest-cov
commands =
pytest --doctest-modules --cov=lexedata {envsitepackagesdir}/lexedata
pytest --cov=lexedata --cov-append test/
codecov
On my local machine, when I run pip install ., the data file lexedata/data/clics3-network.gml.zip is properly deposited inside the site-packages/lexedata/data directory of the corresponding virtual environment, and tox packages it inside .tox/dist/lexedata-1.0.0b3.tar.gz as well as in its venv site-packages directory .tox/general/lib/python3.8/site-packages/lexedata/data/.
However, continuous integration using GitHub Actions fails on all Python 3 versions I'm testing, with
UNEXPECTED EXCEPTION: FileNotFoundError(2, 'No such file or directory')
FileNotFoundError: [Errno 2] No such file or directory: '/home/runner/work/lexedata/lexedata/.tox/general/lib/python3.10/site-packages/lexedata/data/clics3-network.gml.zip'
at the equivalent of that same tox venv path.
What could be going wrong here?
You almost did it right; try updating your MANIFEST.in slightly, to any of the following examples:
include src/lexedata/data/*.zip
recursive-include src/* *.zip
recursive-include **/data clics3-network.gml.zip
As you can find in the docs, the include command defines files as paths relative to the root of the project (that's why the first example starts from the src folder).
recursive-include expects its first argument to be a directory pattern (glob-style), so it is better to include asterisks.
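A quick way to verify such a fix, as a sketch using the build package: build an sdist locally and check that the data file is actually listed in it:
python -m pip install build
python -m build --sdist
tar tzf dist/lexedata-*.tar.gz | grep clics3-network.gml.zip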

Github Travis CI with pytest and package data - FileNotFoundError

I've got a repo on GitHub, for which I wanted to implement Travis CI with pytest for basic testing. Currently the Travis CI build fails when loading data tables within a module, raising a FileNotFoundError.
To keep it short, here is (imho) the most important information on the build:
directory of the data tables is included in MANIFEST.in with include mypkg/data_tables/* (see below for a detailed structure)
setuptools.setup method has the include_package_data=True parameter
additionally packages=setuptools.find_packages() is provided
Travis CI installs the package with install: pip install -e .
Travis CI pytest is invoked with script: pytest --import-mode=importlib
during testing, the first tests succeed, but when it comes to loading the data tables, pytest raises the error FileNotFoundError: [Errno 2] No such file or directory: '/home/travis/build/myname/mypkg/mypkg/data_tables\\my_data.csv'
Interestingly, the slashes before the file name are backslashes, while the others are not, even though the final path is constructed with os.path.abspath().
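The mixed separators usually mean that one path component was concatenated with a hard-coded "\\" before being passed to os.path.abspath(). A minimal sketch of separator-safe construction, with purely illustrative names:
import os

# Hypothetical sketch: build the path relative to this module, letting
# os.path.join pick the correct separator for the platform.
base_dir = os.path.dirname(os.path.abspath(__file__))
csv_path = os.path.join(base_dir, "data_tables", "my_data.csv")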
Detailed description
Unfortunately, the repo is private and I'm not allowed to share it, so I'll try to describe the GitHub package layout in as much detail as possible. Let's say my repo is built with a structure like this (general layout taken from this example):
setup.py
MANIFEST.in
mypkg/
    data_tables/
        my_data.csv
        my_other_data.pkl
    __init__.py
    view.py
tests/
    test_view.py
My minimal MANIFEST.in looks like this:
include mypkg/data_tables/*
With the setup.py fully reduced to a minimum working example like this:
from setuptools import find_packages, setup

setup(
    name='Mypkg',
    version='123.456',
    description='some_text',
    python_requires='>=3.7.7',
    packages=find_packages(  # <---- this should be sufficient, right?
        exclude=["tests", "*.tests", "*.tests.*", "tests.*"]),
    include_package_data=True,  # <---- also this should work
)
And the .travis.yml file (omitting - pip install -r requirements.txt etc.):
language: python
python:
- "3.7.7"
dist: xenial
install:
- pip install -e .
script:
- pytest --import-mode=importlib
Checking the contents of the .egg or tar.gz files, the data tables are included. So I have no idea where the files are "getting lost".
Any idea how to solve this error?
If providing more information would help, e.g. on the class initialized in test_view, please tell me.

Travis.yml file not being found in Django project

I have a Django project that needs to be built with Travis CI. Originally, I put the travis.yml file in the root of the project (where the virtual environment would be) and ran the build, but Travis used its default settings because it couldn't find the yml file.
I then moved the file into the src directory and reran, but the build still didn't find the file. Where does the file need to be placed for Travis to find it?
Travis.yml:
language: python
python:
- "2.7"
# setup environment
env:
- DJANGO_VERSION=1.11.2
- DJANGO_SETTINGS_MODULE='di_photouploader.settings.production'
# install dependencies
install:
- pip install -r requirements.txt
# run test scripts
script:
- python manage.py test
It should be named
.travis.yml
and not
Travis.yml
Place the file in the root of the repository.
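A one-line fix, assuming the file is already tracked in git:
git mv Travis.yml .travis.yml
git commit -m "Rename Travis CI config so Travis can find it"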

Python import fails on Travis CI but not locally

I'm trying to integrate Travis CI into my workflow, and realized I had some hidden dependencies on my old directory structure (my git repos were not self-contained and virtualenv-able).
When I run nosetests locally, the tests pass just fine; when Travis CI runs them, it fails with an import error. Specifically, one of the lines in my test script is:
from myproject import something
The directory structure inside my git repo myproject is something like:
.travis.yml
requirements.txt
something.py
tests/
    test_something.py
I have tried getting this to fail locally (because then I'd understand the TravisCI issue, maybe), but cannot accomplish it.
I've tried running with regular python, and using a virtualenv which added nose to its requirements.txt, and the tests always pass locally.
I feel like I still haven't understood absolute-vs-relative imports, and I can't tell if that's coming into play here, or if I'm just doing something obvious and dumb in my project.
Desired outcome: figure out why Travis CI is failing, and fix my repo accordingly, so that I can commit and have things build correctly, both locally and on Travis CI. If that requires more drastic changes like "you should have a setup.py that does blah-blah to the environment" or similar, please let me know. I'm new to this aspect of Python, and find the current documentation overwhelmingly unclear.
As an FYI, I found this question, and adding --exe doesn't help; it doesn't seem to be the same issue.
I see there is no answer yet, and I encountered the same issue, so I am posting here in the hope of helping somebody:
Solution 1
The quick fix for me was to add the line export PYTHONPATH=$PYTHONPATH:$(pwd) to .travis.yml:
before_install:
- "pip install -U pip"
- "export PYTHONPATH=$PYTHONPATH:$(pwd)"
Solution 2
Having a setup.py, which should be the default option as it is the most elegant, configured like:
from setuptools import setup, find_packages

setup(name='MyPythonProject',
      version='0.0.1',
      description='What it does',
      author='',
      author_email='',
      url='',
      packages=find_packages(),
      )
And then add these lines to .travis.yml:
before_install:
- "pip install -U pip"
- "python setup.py install"
Solution 3
Changing the layout of the project so that the test folder sits under the application folder (the one with your core Python code), such as:
.travis.yml
requirements.txt
app
|_ tests
| |_ test_application.py
|_ application.py
And running the tests on Travis with coverage and nosetests, like:
script:
- "nosetests --with-coverage --cover-package app"
