I am trying to implement DevOps (CI/CD pipelines) for a few Python projects. All of the Python projects are available in a Git repository.
I have implemented DevOps on each Python project; the build pipeline for one of the projects is shown below, and it was working successfully.
Now I have brought all of the Python projects into a single repository as a multi-module project, as shown below.
I want to implement DevOps by creating CI/CD pipelines for each of the modules separately.
For that, I have created build (CI) pipelines similar to the one above, modifying the requirement.txt and setup.py file paths to modulename/requirement.txt and modulename/setup.py.
The setup.py file is failing with an error that it is unable to find the package, and the test task is failing because it cannot find the packages that contain the test resource files.
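For context, a minimal sketch of what one module's setup.py could look like is below; the package name, options, and comments are illustrative assumptions, not the actual project files:

from setuptools import setup, find_packages

setup(
    name="modulename",                          # placeholder module name
    version="0.1.0",
    # find_packages() searches the current working directory by default, so if the
    # CI step invokes this file from the repository root instead of from inside
    # modulename/, setuptools may not find the package at all.
    packages=find_packages(exclude=["tests", "tests.*"]),
    include_package_data=True,                  # pick up data files declared in MANIFEST.in
    install_requires=[],                        # typically kept in sync with modulename/requirement.txt
)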
Below is the folder structure of the Python module.
Any idea how to resolve these errors?
Is there any reference for implementing DevOps (CI/CD) on Python multi-module projects?
Any leads much appreciated!
Related
Looking at the official Python Beam documentation page, it seems like the only way to do deployments is to have a setup.py that defines dependencies that exist within your repo or externally.
But this doesn't quite work with the Bazel way of managing Python dependencies (i.e., I have no setup.py file, nor a separate requirements.txt file for each pipeline in my repo).
How does one package and deploy jobs to runners using Bazel?
You can have a script that runs pip freeze > ... to generate the requirements files for your pipelines, then use them to deploy your pipelines.
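As a rough illustration (the script name, pipeline names, and output paths are assumptions, not part of the original answer), such a freeze script could look like this, with the generated file handed to Beam via --requirements_file at submission time:

import subprocess
from pathlib import Path

def write_requirements(pipeline_name, out_dir="generated_requirements"):
    """Run pip freeze in the current (Bazel-provisioned) environment and save the
    result as <out_dir>/<pipeline_name>-requirements.txt."""
    frozen = subprocess.run(
        ["pip", "freeze"], check=True, capture_output=True, text=True
    ).stdout
    out_path = Path(out_dir) / f"{pipeline_name}-requirements.txt"
    out_path.parent.mkdir(parents=True, exist_ok=True)
    out_path.write_text(frozen)
    return out_path

if __name__ == "__main__":
    # The resulting file can then be passed to the runner, e.g. with
    # --requirements_file=generated_requirements/my_pipeline-requirements.txt
    print(write_requirements("my_pipeline"))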
If multiple pipelines share the same set of Python dependencies and the superset of all their dependencies is not that big, you can build a custom container to submit with all the jobs.
According to the documentation here:
Dependency specification using the Pipfile/Pipfile.lock standard is currently not supported. Your project should not include these files.
I use Pipfile for managing my dependencies and create a requirements.txt file through
pipenv lock --requirements
So far everything works and my gcloud function is up and running. So why should a Python Google Cloud Function not contain a Pipfile?
If it shouldn't contain one, what is the preferred way to manage an isolated environment?
When you deploy your function, you deploy it in its own environment. You won't manage several environments, because the Cloud Function deployment is dedicated to one and only one piece of code.
That's why it's useless to have a virtual environment in a single-use environment. You could use Cloud Run to do that, because you can customize your build and runtime environment. But, here again, it's useless: you won't have concurrent environments in the same container, so it does not make sense.
I'm working on a serverless project on AWS Lambda that uses sklearn.neural_network.MLPClassifier.
AWS requires that all dependencies be uploaded with the project during deployment. Is there a way to install only the files needed to use a specific classifier, so I can save some bandwidth?
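For context, a handler for this kind of setup might look like the sketch below (the file name, event shape, and use of joblib are assumptions); the point is that even though only MLPClassifier is used, the deployment package still has to carry scikit-learn together with its NumPy/SciPy dependencies:

import json
import joblib  # assumes the trained MLPClassifier was persisted with joblib

model = joblib.load("model.joblib")  # an sklearn.neural_network.MLPClassifier

def handler(event, context):
    # Expects something like {"features": [[5.1, 3.5, 1.4, 0.2]]} in the event payload.
    features = event["features"]
    prediction = model.predict(features).tolist()
    return {"statusCode": 200, "body": json.dumps({"prediction": prediction})}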
I am working on a web project written in Python. Here are some facts:
Project hosted on GitHub
A fabric script (fabfile.py) for automated build and deployment (fabric is a Python library for build and deploy automation)
Jenkins for build automation
My question is: where is the conventional place to put fabfile.py?
I prefer to put fabfile.py in the root of the project repo so that I can configure the Jenkins job to grab the source code from Git and simply run fab build to get the compiled package.
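For concreteness, a minimal fabfile.py along these lines might look like the following (this assumes Fabric 1.x and purely local tasks; the task bodies and commands are illustrative, not from the original question):

from fabric.api import local, task

@task
def build():
    """Run the tests and produce a distributable package for Jenkins to archive."""
    local("python -m pytest")       # assumes pytest; substitute the project's test runner
    local("python setup.py sdist")  # writes the package into dist/

@task
def deploy():
    """Placeholder for pushing the built package to the target environment."""
    local("echo 'upload dist/*.tar.gz to your server or artifact store here'")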
Someone insists that fabfile.py should NOT be part of the project repo; it should be kept in an external repo instead. In that case, you need to configure Jenkins to clone the fabfile repo, invoke Git to clone the product repo, and then run the packaging.
I know this is probably a matter of personal taste, but are there any benefits to putting fabfile.py in a separate repo rather than keeping it with the product code?
Any suggestions are appreciated.
In my opinion, I can't see any benefits besides maybe preventing a junior dev from accidentally deploying some unwanted code.
On the other hand, it's nice to have everything in one repo so you don't have to maintain multiple repositories. In past experience, we have always included deployment scripts in the root of the project.
I am trying to use the SymPy module in Python, deployed on the Microsoft Azure platform, to no avail. This is the first of a few modules I want to use, but I am finding it very difficult to find a step-by-step guide on how to use any external Python modules on Microsoft Azure. Has anybody had any experience with this who could offer advice? I have tried downloading the source from GitHub and putting it in the same folder that contains my app folder. Incidentally, I have set up a Flask app within Azure and cloned the app to a local folder, where I saw the Flask folder, so I thought that I could put the SymPy folder in there and call it, but that hasn't worked. Help!
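For reference, the kind of usage being attempted is roughly the following (the route and expression are illustrative only; the actual app and folder layout are not shown in the question):

from flask import Flask
import sympy  # the external module that needs to be importable on Azure

app = Flask(__name__)

@app.route("/simplify")
def simplify_example():
    x = sympy.Symbol("x")
    expr = sympy.simplify((x**2 - 1) / (x - 1))
    return str(expr)  # returns "x + 1" once SymPy imports correctly

if __name__ == "__main__":
    app.run()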