Code coverage for a Python Google App Engine site?

I used to be able to get code coverage for unit testing a Google App Engine test via a commandline like:
coverage run --omit=/Applications --source=../mycode --branch /usr/local/bin/dev_appserver.py ...
[This uses Ned Batchelder's coverage.py.] But after recently updating to the latest SDK (following a long spell of not working on the code), I find that this no longer works. The server process apparently runs the application code in a subprocess or something along those lines.
I tried following the coverage.py documentation on measuring subprocesses: http://nedbatchelder.com/code/coverage/subprocess.html#subprocess
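In short, that recipe boils down to starting coverage in every child process via a sitecustomize.py (or a .pth file) plus an environment variable; a minimal sketch, assuming a .coveragerc with parallel = True under [run] and COVERAGE_PROCESS_START pointing at it:
# sitecustomize.py, placed somewhere on the subprocess's sys.path
import coverage
coverage.process_startup()
The per-process data files then get merged with coverage combine before reporting.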
I also see another semi-recent question about this, with a comment suggesting that coverage.py just won't work: "Getting coverage with dev_appserver.py excludes my project files".
I've spent a few frustrating hours googling around and trying some things with no luck. So...is this still impossible? Has anyone gotten code coverage to work in any manner? Is there some other tool that can figure out code coverage?

A short-term fix might be to run the old dev_appserver.py:
https://developers.google.com/appengine/docs/python/tools/old_devserver#Running_the_Old_Development_Web_Server

dragonx's suggestion to use old_dev_appserver.py worked well for me. More specifically, here's what I did using App Engine 1.9.6, coverage 3.7.1, and Python 2.7 on Mac OS X 10.9.3:
MyAppDir is the directory containing app.yaml.
--omit is optional. You may well not need it. I had already moved my test code out of MyAppDir because I did not want appcfg.py to upload it.
--branch is optional but useful.
old_dev_appserver.py ships (for now) with App Engine. There is no need to download or install a copy.
# One time:
sudo pip install coverage
# Start the server:
APP=MyAppDir
coverage run \
--source=$APP \
--omit="$APP/exclude/*" \
--branch \
/usr/local/bin/old_dev_appserver.py \
$APP
# Run your tests in a separate tab. In my case I use this command:
webdriver/system_tests.py
# Kill the server with Control-C once the tests are finished.
# Display a quick text summary:
coverage report -m
# Generate and open an HTML report linking to line by line coverage:
coverage html
open htmlcov/index.html
My relatively straightforward app (email, full-text search, ndb, urlfetch, webapp2) did not need any changes to work with old_dev_appserver. I did remove the flags I passed to dev_appserver, but I was able to live without them; --port is supported if you need it, as are a few others.
If you'd like to see code coverage support in future versions of dev_appserver.py, please vote up the issue "Add support for code coverage tests and some documentation", formerly https://code.google.com/p/googleappengine/issues/detail?id=4936.

Related

Coverage "No source for code" with pytest

I am trying to measure the code coverage of my pytest tests. I tried following the coverage quick start guide (https://coverage.readthedocs.io/en/6.4.1/).
When I run my tests with the following command, everything seems fine:
coverage run -m pytest tests/
===================================== test session starts ======================================
platform linux -- Python 3.10.4, pytest-7.1.2, pluggy-1.0.0
rootdir: /home/arnaud/Documents/Github/gotcha
collected 4 items
tests/preprocessing/test_preprocessing.py .... [100%]
====================================== 4 passed in 0.30s =======================================
However, when I try to access the report with either of those commands,
coverage report
coverage html
I get the following message:
No source for code: '<project_directory>/config-3.py'.
I have not found an appropriate solution to this problem so far.
It is possible to ignore errors using the command
coverage html -i
which solved my issue
This issue is usually caused by stale coverage result files, so you can either:
remove the old coverage result files (see the command below), or...
run the coverage command with the -i flag to ignore the errors; you can read more about that in the official coverage docs: https://coverage.readthedocs.io/en/6.4.1/cmd.html#reporting
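For the first option, coverage has a built-in command that deletes the previously collected data file; a minimal sketch of the sequence:
coverage erase
coverage run -m pytest tests/
coverage report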
Another possible solution is to specify the source attribute. In my case, rather than the whole project (source = .), I specified the actual source folder (e.g. src). This can either be done on the command line:
coverage run --source=src
or include it in your .coveragerc file:
[run]
source = src
...
I was getting this same issue because of a specific library I was importing*, but I never figured out why that library affected coverage while others didn't.
Though this might just be a workaround, it makes sense to only check your source folder, and ignoring all errors (with -i) isn't much better.
* The library uses opencv-python-headless, which I think is the root cause of this issue.

How to ensure that README.rst is valid?

There are two versions of my little tool:
https://pypi.python.org/pypi/tbzuploader/2017.11.0
https://pypi.python.org/pypi/tbzuploader/2017.12.0 (bug: the PyPI page looks ugly)
In the last update, a change in README.rst causes a warning:
user@host> rst2html.py README.rst > /tmp/foo.html
README.rst:18: (WARNING/2) Inline emphasis start-string without end-string.
README.rst:18: (WARNING/2) Inline emphasis start-string without end-string.
Now the PyPI page looks ugly :-(
I use this recipe to do CI, bumpversion, upload to pypi: https://github.com/guettli/github-travis-bumpversion-pypi
How can I ensure that no broken README.rst gets released any more? In other words, I want to avoid the PyPI page looking ugly.
Dear detail lovers: please don't look into the current particular error in the README.rst. That is not the question :-)
Update
As of Sep 21, 2018, the Python Packaging Authority recommends an alternative command, twine check. To install twine and check your distribution:
pip install twine
twine check dist/*
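twine check inspects a built distribution in dist/, so build one first if you have not already (a sketch, assuming a setup.py-based project):
python setup.py sdist
twine check dist/*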
Note that twine requires readme_renderer. You could still use readme_renderer on its own; you only need to install twine if you want its other features, which is a good idea anyway if you are releasing to PyPI.
From the official Python packaging docs, Uploading your Project to PyPI:
Tip: The reStructuredText parser used on PyPI is not Sphinx! Furthermore, to ensure safety of all users, certain kinds of URLs and directives are forbidden or stripped out (e.g., the .. raw:: directive). Before trying to upload your distribution, you should check to see if your brief / long descriptions provided in setup.py are valid. You can do this by following the instructions for the pypa/readme_renderer tool.
And from that tool's README.rst:
To check your long description locally, simply install the readme_renderer library using:
$ pip install readme_renderer
$ python setup.py check -r -s
Preamble
I had a README which would not render on PyPI beyond the first element on the page (an image). I ran the file against multiple validators and tested it against other renderers; it worked perfectly fine everywhere else! So, after a long, nasty fight with it, and numerous version bumps so I could test a PyPI revision, I tried reducing the file to a bare minimum, from which I'd build it back up. It turned out that the first line was always processed, and then nothing else was...
Solution
Discovering this clue regarding the first line, I then had an epiphany: all I had to do was change the line endings in the file! I was editing the file on Windows, with Windows line endings being tacked on implicitly. I changed that to Unix style and (poof!) PyPI fully rendered the doc!
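If you want to fix the file without relying on editor settings, a couple of lines of Python will do it (a sketch; dos2unix or your editor's line-ending option works just as well):
# rewrite README.rst with Unix (LF) line endings
with open('README.rst', 'rb') as f:
    data = f.read()
with open('README.rst', 'wb') as f:
    f.write(data.replace(b'\r\n', b'\n'))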
Rant...
I've encountered such things in the past, but I took it for granted that PyPI would handle cross-platform issues like this. I mean, one of the key features of Python is being cross-platform! Am I the first person working on Windows to encounter this?! I don't appreciate the hours of time this wasted.
You could check whether rstcheck catches the type of error in your README. If it does, run it after pytest in your script section (and add it to your requirements, of course).
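For example (a sketch; rstcheck exits with a non-zero status on errors, so a CI step like this fails the build when the README is broken):
pip install rstcheck
rstcheck README.rst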

How to bundle Python dependencies in IronWorker?

I'm writing a simple IronWorker in Python to do some work with the AWS API.
To do so I want to use the boto library, which is distributed via the PyPI repository. The boto library is not installed by default in the IronWorker runtime environment.
How can I bundle the boto library dependency with my IronWorker code?
Ideally I'm hoping I can use something like the gem dependency bundling available for Ruby IronWorkers, i.e. in myRuby.worker specify:
gemfile '../Gemfile', 'common', 'worker' # merges gems from common and worker groups
In the Python Loggly sample, I see that the hoover library is used:
#here we have to include hoover library with worker.
hoover_dir = os.path.dirname(hoover.__file__)
shutil.copytree(hoover_dir, worker_dir + '/loggly') #copy it to worker directory
However, I can't see where/how you specify which hoover library version you want, or where to download it from.
What is the official/correct way to use 3rd party libraries in Python IronWorkers?
Newer iron_worker versions have native support for the pip command.
So, you need:
runtime "python"
exec "something.py"
pip "boto"
pip "someotherpip"
full_remote_build true
[edit]We've worked on our toolset a bit since this answer was written and accepted. The answer from my colleague below is the recommended course moving forward.[/edit]
I wrote the Python client library for IronWorker. I'm also employed by Iron.io.
If you're using the Python client library, the easiest (and recommended) way to do this is to just copy over the library's installed folder, and include it when uploading the package. That's what the Python Loggly sample is doing above. As you said, that doesn't specify a version or where to download the library from, because it doesn't care. It just takes the one installed on your system and uses it. Whatever you get when you enter "import boto" on your local machine is what would be uploaded.
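Applied to boto, that approach is only a few lines in whatever script assembles your upload directory (a sketch; worker_dir is a hypothetical staging directory that later gets zipped and uploaded):
import os
import shutil
import boto

# copy the locally installed boto package next to your worker code
worker_dir = 'worker_package'  # hypothetical staging directory
shutil.copytree(os.path.dirname(boto.__file__), os.path.join(worker_dir, 'boto'))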
The other option is using our CLI to upload your worker, with a .worker file.
To do this, here's what you'd need to do:
Create a botoworker.worker file:
runtime "binary"
build 'pip install --install-option="--prefix=`pwd`/pips" boto'
file 'botoworker.py'
exec "botoworker.sh"
That second line is the pip command that will be run to install the dependency. You can modify it like you would any pip command run from the command line. It's going to execute that command on the worker during the "build" phase, so it's only executed once instead of every time you run a task.
The third line should be changed to the Python file you want to run--it's your Python worker file. Here's the one we used to test this:
import boto
If you save that as botoworker.py, the above should work without any modification. :)
The fourth line is a shell script that's going to actually run your worker. I've included the one we used below. Just save it as botoworker.sh, and you won't have to worry about modifying the .worker file above.
PYTHONPATH="$HOME/pips/lib/python2.7/site-packages:$PYTHONPATH" python botoworker.py "$@"
You'll notice it refers to your Python file--if you don't name your Python file botoworker.py, remember to change it here, too. All this does is set your PYTHONPATH to include the installed library, and then runs your Python file.
To upload this, just make sure you have the CLI installed (gem install iron_worker_ng, making sure your Ruby version is 1.9.3 or higher) and then run "iron_worker upload botoworker" in your shell, from the same directory your botoworker.worker file is in.
Hope this helps!

Web Hosting of Django Application

I found a nice project management app written in Django (busylissy.com). Unfortunately, the authors state at the top of the site that they plan to shut it down, and they have published it as open source for further development. I was thinking of hosting it on ixwebhosting.com with the basic Linux program, but I'm not sure whether this is even possible, so that's basically the question.
I only have access to the basic configuration, so I can't really install anything on that server. In requirements.txt, the app lists the following:
# **Django**
Django==1.1
# **Imaging**
http://effbot.org/downloads/Imaging-1.1.6.tar.gz
# **STDImage**
-e git+git://github.com/gearheart/django-stdimage.git#egg=stdimage
# **Django AuthOpenID**
-e hg+https://wunki@bitbucket.org/benoitc/django-authopenid#egg=django_authopenid
# **Django registration**
-e hg+https://wunki@bitbucket.org/ubernostrum/django-registration#egg=registration
# **Tagging**
-e svn+http://django-tagging.googlecode.com/svn/trunk#egg=tagging
# **Authority**
-e hg+https://wunki@bitbucket.org/jezdez/django-authority#egg=authority
# **Filebrowser**
-e svn+http://django-filebrowser.googlecode.com/svn/trunk#egg=filebrowser
# **Markdown**
-e git+git://gitorious.org/python-markdown/mainline.git#egg=markdown
# **Treebeard**
-e svn+http://django-treebeard.googlecode.com/svn/trunk/#egg=treebeard
# **Locale url**
-e svn+http://django-localeurl.googlecode.com/svn/trunk/#egg=localeurl
# **Thumbnail**
-e hg+https://sorl-thumbnail.googlecode.com/hg/#egg=sorl-thumbnail
# **DateUtil**
http://labix.org/download/python-dateutil/python-dateutil-1.4.1.tar.gz
Is there any chance to build a self-contained version with all these prerequisites included that doesn't require much more than mod_python, or should I rather start looking for some other tool?
You could use virtualenv (http://pypi.python.org/pypi/virtualenv)
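Roughly (a sketch; the environment name is arbitrary, and it assumes the host lets you run pip and compile PIL's C extension):
virtualenv busylissy-env
source busylissy-env/bin/activate
pip install -r requirements.txt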
It has dependencies that require compiled code (such as PIL). I'm not really sure what the 'basic linux program' is all about, or what you mean by 'self contained', but it would be trivial to install these dependencies on any normal linux machine. You would have trouble on some shared hosting platforms that do not have the compiled libs available and do not allow you to add your own, etc.
Also, don't use mod_python; use mod_wsgi.

How can I exclude South migrations from coverage reports using coverage.py?

I use coverage.py to check the test coverage of my django application. However since I use South for my database migrations, all those files show up with 0% and mess up the overall percentage.
I already tried using --omit=*migrations* with run, with report, and with both, but that didn't work.
I tried version 3.4 and the latest revision from Bitbucket (as of Dec 20th, 2010), with the same result.
Any ideas how I can get coverage.py to actually ignore the migrations folders?
The solution was:
[run]
omit = ../*migrations*
You should be able to match against the migrations directory to omit those files. Have you tried quoting the argument? Depending on your OS and shell, it may be expanding those asterisks prematurely. Try it like this:
--omit='*migrations*'
Alternately, you could put the switch into a .coveragerc file:
[run]
omit = *migrations*
The latest version of django-jenkins has a new option, COVERAGE_WITH_MIGRATIONS, that excludes migrations. It's not on PyPI yet, so you need to install it with pip/easy_install, specifying the git URL as the source.
Have you tried django_coverage? I think it handles this kind of problem.
This worked for me:
coverage run --source='.' --omit='*/migrations/*.py' manage.py test
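If you prefer not to repeat the flags, the same settings can live in a .coveragerc (a sketch combining the source and omit options shown above):
[run]
source = .
omit = */migrations/*
With that file in place, coverage run manage.py test picks the settings up automatically.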
Try:
coverage run --source=. manage.py test app_name
This ignores third-party code and fixes your percentage problem.
