How to run a Control-M job through a Python script?

I have a Control-M job. Instead of triggering it through the Control-M application, I want to trigger it through my Python script, but I don't know how to perform this action. Is there any module available in Python which can trigger a Control-M job? Also, how can I pass the user ID, DNS details, and password into it?

It can be done using Control-M Automation API.
You can use the following link as a reference for the python code:
https://github.com/controlm/automation-api-quickstart/tree/master/control-m/201-call-rest-api-using-python
You can use the following link to browse the Control-M Swagger documentation:
http://aapi-swagger-doc.s3-website-us-west-2.amazonaws.com/#/run/runNow
It seems like the /run/runNow endpoint will do the trick.
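A minimal sketch of such a call with the requests library is shown below. The host, Control-M server, folder, and job names are placeholders, the /run/order payload fields reflect my reading of the Automation API docs, and /run/runNow (which, to my understanding, re-runs a job already in the active environment by its job ID) follows the same pattern; verify the exact paths and payloads against the Swagger documentation linked above.

import requests

# Base URL of the Control-M Automation API; host and port are placeholders.
ENDPOINT = "https://my-controlm-host:8443/automation-api"

# 1. Log in with your user ID and password to obtain a session token.
login = requests.post(
    ENDPOINT + "/session/login",
    json={"username": "my_user", "password": "my_password"},
)
token = login.json()["token"]
headers = {"Authorization": "Bearer " + token}

# 2. Order the job so it runs now (server/folder/job names are placeholders).
run = requests.post(
    ENDPOINT + "/run/order",
    json={"ctm": "my_controlm_server", "folder": "MyFolder", "jobs": "MyJob"},
    headers=headers,
)
print(run.status_code, run.json())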


How can we automate a Python client application which is used as an interface for users to call APIs?

We have made a Python client which is used as an interface for users. Some functions are defined in the client which internally call the APIs and give output to the users.
My requirement is to automate the Python client functions and validate the output.
Please suggest tools to use.
There are several ways to do that:
You can write multiple tests for your application as test cases which are responsible for calling your functions, getting the results, and validating them. This is called a "feature test". To do that, you can use the Python "unittest" library and run the tests periodically (a minimal sketch follows these options).
If you have a web application, you can use "selenium" to build automated test flows. (You can also run it in a Docker container.)
The other solution is to write another Python application that calls your functions or sends requests wherever you want, gets the specific data, and validates it. (It's the same idea as the two other solutions, with a different implementation.)
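For the first option, a minimal "feature test" with unittest could look like the sketch below; my_client and get_user are hypothetical stand-ins for your client module and one of its functions.

import unittest

from my_client import get_user  # hypothetical client function that wraps an API call


class TestClientFunctions(unittest.TestCase):
    def test_get_user_returns_expected_fields(self):
        # Call the client function and validate the shape of its output.
        result = get_user(42)
        self.assertIn("name", result)
        self.assertIn("email", result)


if __name__ == "__main__":
    unittest.main()

You can then run the whole suite periodically with python -m unittest, for example from a cron job or a CI pipeline.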
The most straightforward way is to use Python for this; the simplest solution would be a library like pytest. A more comprehensive option would be something like Robot Framework.
Given that you have jmeter in your tags, I assume that at some point you will want to run a performance test; however, it might be easier to use Locust for this, as it's a pure-Python load testing framework (see the sketch below).
If you still want to use JMeter, it's possible to call Python programs using the OS Process Sampler.
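If you do try Locust, a minimal load test is plain Python; the /api/users path below is just a hypothetical endpoint of the application under test.

from locust import HttpUser, between, task


class ApiUser(HttpUser):
    # Each simulated user waits 1-3 seconds between tasks.
    wait_time = between(1, 3)

    @task
    def call_endpoint(self):
        # Hypothetical endpoint exposed by the application under test.
        self.client.get("/api/users")

Run it with locust -f locustfile.py --host https://your-app.example.com and set the number of simulated users in the web UI.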

How to get the error message in Robot Framework?

I have created a script to get the error output
*** Settings ***
Resource          importsLib.robot
Suite Setup       Run Keywords
...               Initialize Test    AND
...               Register Keyword To Run On Failure    Failure Callback
Variables         OMG.yaml
and the keywords
*** Keywords ***
Failure Callback
    Capture Page Screenshot
    Log Source    loglevel=WARN
But the point is that I need to get the HTML error message when the back end sometimes doesn't send a value to a front-end element, and I need to track the root causes of the faults or problems.
Can you suggest the best solution for this case?
You can use HttpLibrary, but you should also write back-end (BE) tests for it; there is no way to check this from your front-end (FE) tests unless you also check the BE HTTP methods in parallel.
You can create your own custom libraries or keywords for it.
Here is an example of how you can use custom libraries in Robot Framework:
How to create a custom Python code library for the Robot Framework
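As an illustration, a custom library for this case can be a plain Python class; the /api/value endpoint and the class name below are made up, and the sketch assumes the requests package is available.

import requests


class BackendChecks:
    """Custom Robot Framework keyword library for inspecting back-end responses."""

    def get_backend_error_message(self, base_url):
        """Call the back-end endpoint and return its error body, or None on success."""
        response = requests.get(base_url + "/api/value")
        if response.status_code >= 400:
            # Return the HTML error page the back end produced.
            return response.text
        return None

In the suite you would import it with Library    BackendChecks.py and call Get Backend Error Message    ${BASE_URL}, for example from the failure callback.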

How to add cron jobs in CloudFoundry

I have a couple of Python apps in CloudFoundry. Now I would like to schedule their execution; for example, a specific app has to be executed on the second day of each month.
I couldn't find anything on the internet. Is that even possible?
Cloud Foundry will deploy your application inside a container. You could use libraries to execute your code on a specific schedule, but either way you're paying to have that instance running the whole time.
What you're trying to do is a perfect candidate for "serverless computing" (also known as "event-driven" or "function as a service" computing).
These deployment technologies execute functions in response to a trigger, e.g. a REST API call, a certain timestamp, a new database insert, etc.
You could execute your Python Cloud Foundry apps using the OpenWhisk serverless compute platform.
IBM offers a hosted version of this running on their cloud platform, Bluemix.
I don't know what your code looks like, so I'll use this sample hello world function:
import sys

def main(dict):
    if 'message' in dict:
        name = dict['message']
    else:
        name = 'stranger'
    greeting = 'Hello ' + name + '!'
    print(greeting)
    return {'greeting': greeting}
You can upload your actions (functions) to OpenWhisk using either the online editor or the CLI.
Once you've uploaded your actions you can automate them on a specific schedule by using the Alarm Package. To do this in the online editor click "automate this process" and pick the alarm package.
To do this via the CLI we need to first create a trigger:
$ wsk trigger create regular_hello_world --feed /whisk.system/alarms/alarm -p cron '0 0 9 * * *'
ok: created trigger feed regular_hello_world
This will trigger every day at 9am. We then need to link this trigger to our action by creating a rule:
$ wsk rule create regular_hello_rule regular_hello_world hello_world
ok: created rule regular_hello_rule
For more info, see the docs on creating Python actions.
The CloudFoundry platform itself does not have a scheduler (at least not at this time), and the containers where your application runs do not have cron installed (and are unlikely to ever have it).
If you want to schedule code to periodically run, you have a few options.
You can deploy an application that includes a scheduler (a minimal sketch follows at the end of this answer). The scheduler can run your code directly in that container, or it can trigger the code to run elsewhere (ex: it sends an HTTP request to another application and that request triggers the code to run). If you trigger the code to run elsewhere, you can make the scheduler app run pretty lean (maybe with 64m of memory or less) to reduce costs.
You can look for a third-party scheduler service. The availability and cost of services like this will vary depending on your CF provider, but there are service offerings to handle scheduling. These typically function like the previous example, where an HTTP request is sent to your app at a specific time and that triggers your scheduled code. Many service providers offer free tiers, which give you a small number of triggers per month at no cost.
If you have a server outside of CF with cron installed, you can use cron there to schedule the tasks and trigger the code to run on CF. You can do this like the previous examples by sending HTTP requests to your app; however, this option also gives you the possibility to make use of CloudFoundry's task feature.
CloudFoundry has the concept of a task, which is a one-time execution of some code. With it, you can execute the cf run-task command to trigger the task to run. Ex: cf run-task <app-name> "python my-task.py". More on that in the docs, here. The nice part about using tasks is that your provider will only bill you while the task is running.
To see if your provider has tasks available, run cf feature-flags and look to see if task_creation is set to enabled.
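Going back to the first option, a lean scheduler app could be sketched with the APScheduler library (one choice among many, not something CF mandates); the target URL is a placeholder for whichever app you want to trigger.

import requests
from apscheduler.schedulers.blocking import BlockingScheduler

scheduler = BlockingScheduler()


# Matches the question: run at midnight on the second day of each month.
@scheduler.scheduled_job("cron", day=2, hour=0, minute=0)
def trigger_monthly_job():
    # Hypothetical endpoint on the worker app that kicks off the real work.
    requests.post("https://my-worker-app.example.com/run-job", timeout=30)


scheduler.start()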
Hope that helps!

Using Microsoft Azure to run "a bunch of Python scripts on demand"

I'm trying to define an architecture where multiple Python scripts need to be run in parallel and on demand. Imagine the following setup:
script requestors (web API) -> Service Bus queue -> script execution -> result posted back to script requestor
To this end, the script requestor places a script request message on the queue, together with an API endpoint where the result should be posted back to. The script request message also contains the input for the script to be run.
The Service Bus queue decouples producers and consumers. A generic set of "workers" simply look for new messages on the queue, take the input message, and call a Python script with that input. They then post the result back to the API endpoint. But what strategies could I use to "run the Python scripts"?
One possible strategy could be to use WebJobs. WebJobs can execute Python scripts and run on a schedule. Let's say that you run a WebJob every 5 minutes: the Python script can poll the queue, do some processing, and post the results back to your API.
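A rough sketch of such a polling script with the azure-servicebus (v7) SDK follows; the connection string, queue name, and callback_url field are placeholders for whatever your messages actually contain.

import json

import requests
from azure.servicebus import ServiceBusClient

CONNECTION_STRING = "<your Service Bus connection string>"

with ServiceBusClient.from_connection_string(CONNECTION_STRING) as client:
    with client.get_queue_receiver(queue_name="script-requests") as receiver:
        # Drain whatever is waiting, then exit so the WebJob can re-run on its schedule.
        for message in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            request_data = json.loads(str(message))
            result = {"output": "..."}  # run the requested script with the given input here
            # Post the result back to the API endpoint named in the message.
            requests.post(request_data["callback_url"], json=result)
            receiver.complete_message(message)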
In my experience, there are two strategies you could use.
Developing a Python script for Azure HDInsight. Azure HDInsight is a platform based on Hadoop that gives you parallel compute power; you can refer to the doc https://azure.microsoft.com/en-us/documentation/articles/hdinsight-hadoop-streaming-python/ to learn about it.
Developing a Python script based on a parallel compute framework like dispy or jug, running on Azure VMs.
Hope it helps. Best Regards.

Using Python and Node.js

I program in Python. I started a few months ago, so I am not the "guru" type of developer. I also know the basics of HTML and CSS.
I have seen a few tutorials about Node.js and I really like it. I cannot create those forms, bars, buttons, etc. with my knowledge of HTML and CSS.
Can I use Node.js to create what the user sees in the browser, and write in Python what will happen when someone pushes the "submit" button? For example, redirects, SQL writes and reads, etc.
Thank you
You can call Python scripts on the back end at the Node server, in response to a button click by the user. For that you can use the child_process package. It allows you to call programs installed on your machine.
For example, here is how to run your script when a user POSTs something to the /reg page:
app.post('/reg', function (request, response) {
    var spawn = require('child_process').spawn;
    var path = "location of your script";
    // create a child process for your script and pass two arguments from the request
    var backend = spawn('python', [path, request.body.name, request.body.email]);
    backend.on('exit', function (code) {
        console.log(path + ' exited with code ' + code);
        if (code === 0)
            response.render('success'); // show the success page if the script ran successfully
        else
            response.redirect('bad');
    });
});
Python has to be installed on your system, along with any other Python libraries you will need. The Python script cannot respond to or redirect requests itself; that stays in Node, otherwise why would you use Node at all? When in Rome, do as the Romans do: use JavaScript in Node where you can, because calling external programs is slower than using JS libraries directly.
Node.js is a server-side JavaScript environment (in the same way that Python is). It runs on the server, interacts with the database, and generates the HTML that the clients see; it isn't actually directly accessed by the browser.
Browsers, on the other hand, run clientside JavaScript directly.
If you want to use Python on the server, there are a bunch of frameworks that you can work with (a minimal example with one of them follows the list):
Django
Flask
Bottle
Web.py
CherryPy
many, many more...
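As a tiny illustration with Flask, one of the frameworks in the list, here is how a form submit could be handled on the Python side; the route and field names are made up.

from flask import Flask, redirect, request

app = Flask(__name__)


@app.route("/submit", methods=["POST"])
def submit():
    name = request.form.get("name", "stranger")
    # ... write to or read from the database here ...
    return redirect("/thanks")


if __name__ == "__main__":
    app.run()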
I think you're thinking about this problem backwards. Node.js lets you run browser-style JavaScript without a browser; you won't find it useful in your Python programming. If you want to stick with Python, you're better off using a framework such as Pyjamas to write JavaScript with Python, or another framework such as Flask or Twisted to integrate the JavaScript with Python.
