Testing a Python Azure function app before deployment using GitHub Workflows

I have a Python function app that I am deploying to Azure using a standard GitHub workflow. I want to add a job before deployment that also runs all the unit tests for the function. Locally, I am using pytest to run the unit tests. The structure of the folders is as shown below.
When running locally I do 'python -m pytest tests' and it runs all the tests.
When I add the same command to the GitHub workflow, it gives me the error:
ERROR: file or directory not found: tests
The job is described as:
Is there a way to run the tests using the workflow?

I was able to solve this issue by using the command
python -m pytest ./function/tests
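The underlying issue is the job's working directory: the workflow runs from the repository root, so the tests have to be addressed relative to it (here, function/tests). If you want the same invocation to work both locally and in the workflow regardless of the working directory, one option is a tiny runner script that locates the tests relative to the repository layout. A minimal sketch, assuming the script sits at the repository root next to the function folder:

# run_tests.py - sketch of a location-independent test runner
# (assumes this file sits at the repo root and the tests live in function/tests)
import sys
from pathlib import Path

import pytest

if __name__ == "__main__":
    # Resolve the tests folder relative to this file, not the current directory.
    tests_dir = Path(__file__).resolve().parent / "function" / "tests"
    sys.exit(pytest.main([str(tests_dir)]))

The workflow step (and your local terminal) can then simply call python run_tests.py.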

Related

Selenium-Python code management (Git) + Jenkins integration: how to run the test project from Git on Jenkins?

Using Selenium WebDriver and Python scripting, I have created a project and a series of test cases in PyCharm that run as unit tests. I can run my test suite using the command below in the terminal:
python -m pytest /test/smoke/test_suite1.py --browser=chrome --baseURL=dev
The project has a directory which contains the WebDriver, the conftest.py file, page object models (POMs), test scripts, etc.
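As an aside, the --browser and --baseURL flags in the command above are custom pytest options; they are normally registered in that conftest.py via the pytest_addoption hook. A minimal sketch of how that usually looks (only the option names come from the command, everything else is assumed):

# conftest.py - sketch of registering the custom command-line options
import pytest

def pytest_addoption(parser):
    # Register the flags so pytest accepts --browser and --baseURL.
    parser.addoption("--browser", action="store", default="chrome")
    parser.addoption("--baseURL", action="store", default="dev")

@pytest.fixture
def browser_name(request):
    # A driver-setup fixture or the tests can read the values like this.
    return request.config.getoption("--browser")

@pytest.fixture
def base_url(request):
    return request.config.getoption("--baseURL")

Jenkins (or any other CI) can then run exactly the same python -m pytest command from the checked-out repository.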
The test suite currently runs when I am in PyCharm and issue that terminal command. I have now moved my project to a Git repository and would also like to run these test cases in Jenkins from that repository.
There are two ways I would prefer to do the CI:
Schedule a time (once per week) to run through the tests and send email of result.
Upload my test cases to a repository on GitHub and run the tests when there is a change to the repository and/or on a schedule (once per week).
I've honestly tried to look up tutorials (videos/documents), but they all seem very unclear.
Is it better to do this with a Jenkins plugin? If so, what do I need to add/change and configure in Jenkins to make this happen?

Google Cloud Dataflow Dependencies

I want to use Dataflow to process, in parallel, a bunch of video clips I have stored in Google Cloud Storage. My processing algorithm has non-Python dependencies and is expected to change over development iterations.
My preference would be to use a Docker container with the logic to process the clips, but it appears that custom containers were not supported (in 2017):
use docker for google cloud data flow dependencies
Although they may be supported now - since it was being worked on:
Posthoc connect FFMPEG to opencv-python binary for Google Cloud Dataflow job
According to this issue, a custom Docker image may be pulled, but I couldn't find any documentation on how to do it with Dataflow:
https://issues.apache.org/jira/browse/BEAM-6706?focusedCommentId=16773376&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16773376
Another option might be to use setup.py to install any dependencies as described in this dated example:
https://cloud.google.com/blog/products/gcp/how-to-do-distributed-processing-of-landsat-data-in-python
However, when running the example I get an error that there is no module named osgeo.gdal.
For pure Python dependencies, I have also tried passing the --requirements_file argument; however, I still get an error: Pip install failed for package: -r
I could find documentation for adding dependencies to apache_beam, but not to Dataflow, and it appears the apache_beam instructions do not work, based on my tests of --requirements_file and --setup_file.
This was answered in the comments, rewriting here for clarity:
In Apache Beam you can modify the setup.py file, which will be run once per container on start-up. This file allows you to perform arbitrary commands before the SDK harness starts to receive commands from the runner harness.
A complete example can be found in the Apache Beam repo.
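A simplified sketch of that pattern, modelled on the juliaset example in the Beam repo (the apt-get/ffmpeg commands and package names below are placeholders for your actual non-Python dependencies):

# setup.py - sketch of running custom commands when the package is built
# on the worker (placeholder package names; adapt to your dependencies)
import subprocess

import setuptools
from setuptools.command.build_py import build_py as _build_py

CUSTOM_COMMANDS = [
    ["apt-get", "update"],
    ["apt-get", "install", "-y", "ffmpeg"],  # placeholder non-Python dependency
]

class build_py(_build_py):
    def run(self):
        # Run the system-level install steps before the normal build.
        for command in CUSTOM_COMMANDS:
            subprocess.check_call(command)
        _build_py.run(self)

setuptools.setup(
    name="video-processing",
    version="0.0.1",
    packages=setuptools.find_packages(),
    install_requires=["opencv-python"],  # pure-Python/PyPI deps go here
    cmdclass={"build_py": build_py},
)

The pipeline is then launched with --setup_file=./setup.py so that each worker builds and installs the package, running these commands, at start-up.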
As of 2020, you can use Dataflow Flex Templates, which allow you to specify a custom Docker container in which to execute your pipeline.

How to deploy an AWS Python Lambda project locally?

I have an AWS Python Lambda function which contains a few Python files and also several dependencies.
The app is built using Chalice, so the function is exposed like any REST function.
Before deploying to the prod environment, I want to test it locally, so I need to package the whole project (Python files and dependencies). I have looked around the web for a solution but couldn't find one.
I managed to figure out how to deploy a single Python file, but I did not succeed with a whole project.
Take a look at Atlassian's LocalStack: https://github.com/atlassian/localstack
It's a full local copy of the AWS cloud stack.
I use Travis: I hooked it to my master branch in Git, so that when I push to this branch, Travis tests my Lambda with a script that uses pytest, after installing all its dependencies with pip install. If all the tests pass, it then deploys the Lambda to AWS in my prod env.
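Since the project is built with Chalice, the pytest script can also exercise the REST routes in-process using Chalice's test client (shipped with newer Chalice releases). A minimal sketch, assuming the Chalice app object lives in app.py:

# test_app.py - sketch using Chalice's built-in test client
# (assumes a Chalice version that provides chalice.test and an app object in app.py)
from chalice.test import Client

from app import app  # the Chalice app object (module and name assumed)

def test_index_returns_ok():
    with Client(app) as client:
        response = client.http.get("/")
        assert response.status_code == 200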

Django: reusable app testing

By following the official Django docs, I've extracted an app from my project and made it reusable and installable using pip (currently I still have to learn how to release it on PyPI, but that's another story)... so far so good... but now I have no idea how to run the tests I wrote for my app, since after installing it in my project using pip, Django stopped executing those tests (by default, in Django 1.7 only the project's own apps' tests are picked up)... so my question is: how can I run the tests for my app now that it has been extracted from the main project sources?
PS: of course I don't want to force potential users of my app to run the tests I wrote, but I do have to run them while working on the app on my machine.
OK, I'm an idiot... the only thing I have to do is pass the app name to the test command:
python manage.py test -v 2 -p "Test*.py" --noinput --settings=settings.test my_app
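Another common pattern for reusable apps is a small standalone runner that configures minimal settings and invokes the Django test runner directly, so the tests can be run from the app's own source checkout without a host project. A minimal sketch (the settings and the my_app name are assumed):

# runtests.py - standalone test runner sketch for the extracted app
# (minimal settings; app name "my_app" is a placeholder)
import sys

import django
from django.conf import settings
from django.test.utils import get_runner

settings.configure(
    DEBUG=True,
    DATABASES={"default": {"ENGINE": "django.db.backends.sqlite3", "NAME": ":memory:"}},
    INSTALLED_APPS=[
        "django.contrib.contenttypes",
        "django.contrib.auth",
        "my_app",  # the reusable app under test
    ],
)
django.setup()

TestRunner = get_runner(settings)
failures = TestRunner(verbosity=2).run_tests(["my_app"])
sys.exit(bool(failures))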

Run custom Python script before appcfg.py update runs

Is it possible to run some Python script every time I run the deployment process with appcfg.py? I need that to copy some files from an external source into my app folder before uploading it to GAE. Thanks!
I briefly checked the sources of appcfg.py, the script that deploys the application to Google App Engine, but I didn't find a place where a pre-deploy hook can be defined.
I believe that modifying appcfg.py itself would not be maintainable and would be a bit overkill.
You should create a simple deployment script and call your command from the script.
For example, you can create a simple Makefile with only one target that does what you want:
deploy:
	your-copy-command
	/path/to/gae-devkit/appcfg.py update .
Running the make command will execute the command to copy external files and call the Google App Engine deployment tool.
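If you prefer to stay in Python rather than use make, the same idea works as a small deploy script that copies the files and then shells out to appcfg.py. A minimal sketch (all paths are placeholders):

# deploy.py - sketch of a pre-deploy hook in plain Python
# (source, destination, and SDK paths are placeholders)
import glob
import os
import shutil
import subprocess

DEST = "./assets"  # destination inside the app folder

# Copy the external files into the app folder before uploading.
if not os.path.isdir(DEST):
    os.makedirs(DEST)
for path in glob.glob("/path/to/external/source/*"):
    shutil.copy(path, DEST)

# Then call the App Engine deployment tool, as in the Makefile above.
subprocess.check_call(["/path/to/gae-devkit/appcfg.py", "update", "."])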
