I have created an Ansible playbook to download PuTTY and install it on Windows.
- hosts: windows
  tasks:
    - name: Download Zip File
      win_get_url:
        url: "{{zipurl}}"
        dest: "{{myvarfile}}"
    - name: Extract zipfile
      win_unzip:
        src: "{{myvarfile}}"
        dest: "C:\{{packagename}}"
        recurse: yes
        rm: true
Then:
ansible-playbook deploywar.yml \
--extra-vars="myvarfile=c:\putty.zip zipurl=https://the.earth.li/~sgtatham/putty/latest/w64/putty.zip packagename=putty"
Now I need to pass the package name as a parameter and concatenate it in:
dest: "C:\\{{packagename}}"
How can I achieve this?
Either use single quotes:
dest: 'C:\{{packagename}}'
or escape special characters:
dest: "C:\\{{packagename}}"
I want to override the args of a model in the "comp1" folder by passing parameters to the main file in the "component" folder, and hence need some mechanism to pass the override args.
I've run it before in WSL2 and it worked. I want it to work in the Windows cmd shell, and hence need some workaround or an alternative to echo to be able to pass the override parameters to the main file.
Adding project folder structure for reference:
Folder Component1
Folder comp1
Adding the MLproject file (used for WSL2) for reference:
name: KNN
conda_env: conda.yml

entry_points:
  main:
    parameters:
      hydra_options:
        description: Hydra parameters to override
        type: str
        default: ''
    command: >-
      python main.py $(echo {hydra_options})
I've tried the set command in Windows to assign the override params (passed through cmd) to a variable and then concatenate it with python main.py to incorporate the Hydra override parameters, but that doesn't seem to work either.
Adding for reference:
name: KNN_main
conda_env: conda.yml

entry_points:
  main:
    parameters:
      hydra_options:
        description: Hydra values to override
        type: str
        default: " "
    command: >-
      @echo off
      set command = "python main.py" and %{hydra_options}%
      echo %command%
Tech stack: MLflow==1.29.0 Hydra==1.2.0
OS: Windows 10
According to this answer, you shouldn't place spaces before or after = in the set command.
It should work if you rewrite the MLproject like this:
name: KNN_main
conda_env: conda.yml

entry_points:
  main:
    parameters:
      hydra_options:
        description: Hydra values to override
        type: str
        default: " "
    command: >-
      @echo off
      set command="python main.py %{hydra_options}%"
      echo %command%
Also, I'm not sure, but I think you don't need echo at all and this command will work:
command: >-
  python main.py %{hydra_options}%
Summary
What specific syntax must be changed in the code below in order for the multi-line contents of the $MY_SECRETS environment variable to be 1.) successfully written into the C:\\Users\\runneradmin\\somedir\\mykeys.yaml file on a Windows runner in the GitHub workflow whose code is given below, and 2.) read by the simple Python 3 main.py program given below?
PROBLEM DEFINITION:
The echo "$MY_SECRETS" > C:\\Users\\runneradmin\\somedir\\mykeys.yaml command is only printing the string literal MY_SECRETS into the C:\\Users\\runneradmin\\somedir\\mykeys.yaml file instead of printing the multi-line contents of the MY_SECRETS variable.
We confirmed that this same echo command does successfully print the same multi-line secret in an ubuntu-latest runner, and we manually validated the correct contents of the secrets.LIST_OF_SECRETS environment variable. ... This problem seems entirely isolated to either the windows command syntax, or perhaps to the windows configuration of the GitHub windows-latest runner, either of which should be fixable by changing the workflow code below.
EXPECTED RESULT:
The multi-line secret should be printed into the C:\\Users\\runneradmin\\somedir\\mykeys.yaml file and read by main.py.
The resulting printout of the contents of the C:\\Users\\runneradmin\\somedir\\mykeys.yaml file should look like:
***
***
***
***
LOGS THAT DEMONSTRATE THE FAILURE:
The result of running main.py in the GitHub Actions log is:
ccc item is: $MY_SECRETS
As you can see, the string literal $MY_SECRETS is being wrongly printed out instead of the 4 *** secret lines.
REPO FILE STRUCTURE:
Reproducing this error requires only 2 files in a repo file structure as follows:
.github/
    workflows/
        test.yml
main.py
WORKFLOW CODE:
The minimal code for the workflow to reproduce this problem is as follows:
name: write-secrets-to-file
on:
  push:
    branches:
      - dev
jobs:
  write-the-secrets-windows:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v3
      - shell: python
        name: Configure agent
        env:
          MY_SECRETS: ${{ secrets.LIST_OF_SECRETS }}
        run: |
          import os
          import subprocess
          import pathlib
          pathlib.Path("C:\\Users\\runneradmin\\somedir\\").mkdir(parents=True, exist_ok=True)
          print('About to: echo "$MY_SECRETS" > C:\\Users\\runneradmin\\somedir\\mykeys.yaml')
          output = subprocess.getoutput('echo "$MY_SECRETS" > C:\\Users\\runneradmin\\somedir\\mykeys.yaml')
          print(output)
          os.chdir('D:\\a\\myRepoName\\')
          mycmd = "python myRepoName\\main.py"
          p = subprocess.Popen(mycmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
          while(True):
              # returns None while subprocess is running
              retcode = p.poll()
              line = p.stdout.readline()
              print(line)
              if retcode is not None:
                  break
MINIMAL APP CODE:
Then the minimal main.py program that demonstrates what was actually written into the C:\\Users\\runneradmin\\somedir\\mykeys.yaml file is:
with open('C:\\Users\\runneradmin\\somedir\\mykeys.yaml') as file:
    for item in file:
        print('ccc item is: ', str(item))
        if "var1" in item:
            print("Found var1")
STRUCTURE OF MULTI-LINE SECRET:
The structure of the multi-line secret contained in the secrets.LIST_OF_SECRETS environment variable is:
var1:value1
var2:value2
var3:value3
var4:value4
These 4 lines should be what gets printed out when main.py is run by the workflow, though the print for each line should look like *** because each line is a secret.
The problem is - as it so often is - the quirks of Python with byte arrays and strings, and encoding and decoding them in the right places...
Here is what I used:
test.yml:
name: write-secrets-to-file
on:
  push:
    branches:
      - dev
jobs:
  write-the-secrets-windows:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v3
      - shell: python
        name: Configure agent
        env:
          MY_SECRETS: ${{ secrets.LIST_OF_SECRETS }}
        run: |
          import subprocess
          import pathlib
          import os
          # using os.path.expanduser() instead of hard-coding the user's home directory
          pathlib.Path(os.path.expanduser("~/somedir")).mkdir(parents=True, exist_ok=True)
          secrets = os.getenv("MY_SECRETS")
          with open(os.path.expanduser("~/somedir/mykeys.yaml"),"w",encoding="UTF-8") as file:
              file.write(secrets)
          mycmd = ["python","./main.py"]
          p = subprocess.Popen(mycmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
          while(True):
              # returns None while subprocess is running
              retcode = p.poll()
              line = p.stdout.readline()
              # If len(line)==0 we are at EOF and do not need to print this line.
              # An empty line from main.py would be '\n' with len('\n')==1!
              if len(line)>0:
                  # We decode the byte array to a string and strip the
                  # new-line characters \r and \n from the end of the line,
                  # which were read from stdout of main.py
                  print(line.decode('UTF-8').rstrip('\r\n'))
              if retcode is not None:
                  break
main.py:
import os

# using os.path.expanduser instead of hard-coding user home directory
with open(os.path.expanduser('~/somedir/mykeys.yaml'), encoding='UTF-8') as file:
    for item in file:
        # strip the new-line characters \r and \n from the end of the line
        item = item.rstrip('\r\n')
        print('ccc item is: ', str(item))
        if "var1" in item:
            print("Found var1")
secrets.LIST_OF_SECRETS:
var1: secret1
var2: secret2
var3: secret3
var4: secret4
And my output in the log was
ccc item is: ***
Found var1
ccc item is: ***
ccc item is: ***
ccc item is: ***
Edit: updated with fixed main.py and how to run it.
You can write the key file directly with Python:
- shell: python
  name: Configure agent
  env:
    MY_SECRETS: ${{ secrets.LIST_OF_SECRETS }}
  run: |
    import os
    import pathlib
    pathlib.Path('C:\\Users\\runneradmin\\somedir\\').mkdir(parents=True, exist_ok=True)
    with open('C:\\Users\\runneradmin\\somedir\\mykeys.yaml', 'w') as key_file:
        key_file.write(os.environ['MY_SECRETS'])
- uses: actions/checkout@v3
- name: Run main
  run: python main.py
To avoid newline characters in your output, you need a main.py that removes the newlines (here with .strip().splitlines()):
main.py
with open('C:\\Users\\runneradmin\\somedir\\mykeys.yaml') as file:
    for item in file.read().strip().splitlines():
        print('ccc item is: ', str(item))
        if "var1" in item:
            print("Found var1")
Here's the input:
LIST_OF_SECRETS = '
key:value
key2:value
key3:value
'
And the output:
ccc item is: ***
Found var1
ccc item is: ***
ccc item is: ***
ccc item is: ***
Here is my complete workflow:
name: write-secrets-to-file
on:
  push:
    branches:
      - master
jobs:
  write-the-secrets-windows:
    runs-on: windows-latest
    steps:
      - shell: python
        name: Configure agent
        env:
          MY_SECRETS: ${{ secrets.LIST_OF_SECRETS }}
        run: |
          import os
          import pathlib
          pathlib.Path('C:\\Users\\runneradmin\\somedir\\').mkdir(parents=True, exist_ok=True)
          with open('C:\\Users\\runneradmin\\somedir\\mykeys.yaml', 'w') as key_file:
              key_file.write(os.environ['MY_SECRETS'])
      - uses: actions/checkout@v3
      - name: Run main
        run: python main.py
Also, here is a simpler version using only the Windows shell (PowerShell):
- name: Create key file
  env:
    MY_SECRETS: ${{ secrets.LIST_OF_SECRETS }}
  run: |
    mkdir C:\\Users\\runneradmin\\somedir
    echo "$env:MY_SECRETS" > C:\\Users\\runneradmin\\somedir\\mykeys.yaml
- uses: actions/checkout@v3
- name: Run main
  run: python main.py
I tried the following code and it worked fine:
LIST_OF_SECRETS
key1:val1
key2:val2
GitHub Actions workflow (test.yml)
name: write-secrets-to-file
on:
  push:
    branches:
      - main
jobs:
  write-the-secrets-windows:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v3
      - shell: python
        name: Configure agent
        env:
          MY_SECRETS: ${{ secrets.LIST_OF_SECRETS }}
        run: |
          import base64, subprocess, sys
          import os
          secrets = os.environ["MY_SECRETS"]
          def powershell(cmd, input=None):
              cmd64 = base64.encodebytes(cmd.encode('utf-16-le')).decode('ascii').strip()
              stdin = None if input is None else subprocess.PIPE
              process = subprocess.Popen(["powershell.exe", "-NonInteractive", "-EncodedCommand", cmd64], stdin=stdin, stdout=subprocess.PIPE)
              if input is not None:
                  input = input.encode(sys.stdout.encoding)
              output, stderr = process.communicate(input)
              output = output.decode(sys.stdout.encoding).replace('\r\n', '\n')
              return output
          command = r"""$secrets = @'
          {}
          '@
          $secrets | Out-File -FilePath .\mykeys.yaml""".format(secrets)
          command1 = r"""Get-Content -Path .\mykeys.yaml"""
          powershell(command)
          print(powershell(command1))
Output
***
***
As you also mention in the question, GitHub will obfuscate any printed value containing the secrets with ***.
EDIT: Updated the code to work with multi-line secrets. This answer was highly influenced by this one.
You need to use the yaml library:
import yaml

data = {'MY_SECRETS':'''
var1:value1
var2:value2
var3:value3
var4:value4
'''}  # add your secret

with open('file.yaml', 'w') as outfile:  # Your file
    yaml.dump(data, outfile, default_flow_style=False)
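For completeness, here is a small follow-up sketch (assuming the file.yaml written above) that loads the file back to confirm the multi-line value survived the round trip:

import yaml

# Load file.yaml again and print the stored multi-line secret block.
with open('file.yaml') as infile:
    loaded = yaml.safe_load(infile)
print(loaded['MY_SECRETS'])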
This is the result:
I used this.
I'm looking to make an Ansible role and module able to list all packages actually installed on a Linux system and register them to a var.
Then upgrade all of them and put the second list in another var.
My module is there to make a diff of the two dictionaries (yum_packages1 and yum_packages2) and return it at the end of my role.
When I try to pass those two dictionaries into my module and start processing, I get a very strange error.
fatal: [centos7_]: FAILED! => {"changed": false, "msg": "argument yum_packages2 is of type and we were unable to convert to dict: cannot be converted to a dict"}
Ansible role task
---
# tasks file for ansible_yum_update_linux
- name: Listing Linux packages already installed
  yum:
    list: installed
  register: yum_packages1

- name: Upgrade Linux packages
  yum:
    name: '*'
    state: latest
    exclude: "{{ packages_exclude }}"

- name: Listing Linux packages installed and updated
  yum:
    list: installed
  register: yum_packages2

- debug:
    var: yum_packages1
- debug:
    var: yum_packages2

- name: file compare
  filecompare:
    yum_packages1: "{{ yum_packages1.results }}"
    yum_packages2: "{{ yum_packages2.results }}"
  register: result

- debug:
    var: result
Custom Ansible module
#!/usr/bin/python
import json

from ansible.module_utils.basic import AnsibleModule


def diff_update(f1, f2):
    # f3 = set(f1.keys()) == set(f2.keys())
    upd1 = set(f1.values())
    upd2 = set(f2.values())
    f3 = (upd1.difference(upd2))
    return f3


def main():
    module = AnsibleModule(
        argument_spec = dict(
            yum_packages1 = dict(required=True, type='dict'),
            yum_packages2 = dict(required=True, type='dict')
        )
    )
    f3 = diff_update(module.params['yum_packages1'], module.params['yum_packages2'])
    module.exit_json(changed=False, diff=f3)


if __name__ == '__main__':
    main()
Do you have any idea why I get this error?
Do you have any idea why I get this error?
Because a set is not JSON serializable:
import json
print(json.dumps(set(["hello", "kaboom"])))
cheerfully produces:
TypeError: set(['kaboom', 'hello']) is not JSON serializable
That's actually bitten me a couple of times in actual ansible modules, which is why I knew it was a thing; if you insist on using your custom module, you'll want to convert the set back to a list before returning it:
module.exit_json(changed=False, diff=list(f3))
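For a quick sanity check, here is a minimal standalone sketch (the set literal simply stands in for what diff_update() returns); converting to a list is all json needs:

import json

values = {"kaboom", "hello"}  # a set, standing in for the result of diff_update()
# json.dumps(values) would raise TypeError: Object of type set is not JSON serializable
print(json.dumps(list(values)))  # works, e.g. ["kaboom", "hello"] (set order may vary)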
I'm trying to extract variables from a Python script's output. Basically, I have a task that executes a Python script and I need to get the variables from that output; the Python script parses a JSON file and puts the results into different variables.
(Python script)
import json

with open('dd.json') as f:
    data = json.load(f)

for item in data['service-nat-pool-information'][0]['sfw-per-service-set-nat-pool']:
    ifname = [b['data'] for b in item['interface-name']]
    for q in item['service-nat-pool']:
        name = [a['data'] for a in q['pool-name']]
        rang = [n['data'] for n in q['pool-address-range-list'][0]['pool-address-range']]  # line added from Stack Overflow
        # ports = item['pool-port-range'][0]['data']
        # use = item['pool-ports-in-use'][0]['data']
        block = [j['data'] for j in q['effective-port-blocks']]
        mblock = [m['data'] for m in q['effective-ports']]
        maxp = [d['data'] for d in q['port-block-efficiency']]
        print("|if-name", ifname, "|name:", name, "|ip-range:", rang, "|Effective-port-blocks:", block[0], "|Effective-port:", mblock[0], "|Port-Block-Efficiency:", maxp[0])
Ansible playbook
---
- name: Parse
  hosts: localhost
  connection: local
  vars:
    pool: "{{ lookup('file','/etc/ansible/playbook/dd.json') | from_json }}"
  tasks:
    - name: Execute Script
      command: python3.7 parsetry.py
I expected a task in Ansible that gets the variables from the Python script and stores them in Ansible variables.
You have to use register. If you modify your script to output JSON, that will ease your work a little bit.
- name: Execute Script
  command: python3.7 parsetry.py
  register: script_run

- name: Debug output
  debug:
    msg: "{{ script_run.stdout | from_json }}"
If you want to keep full Python power under your fingers, you might as well consider turning your script into a custom module or a custom filter if it ever makes sense.
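If you go the register/from_json route, here is a hedged sketch of how the script could emit JSON instead of formatted text; the field names come from the script in the question, while the output keys (if_name, name, ip_range, ...) are made up for illustration:

import json

# Collect the parsed values into plain dicts and emit a single JSON document on
# stdout, so the Ansible task can register it and apply the from_json filter.
results = []
with open('dd.json') as f:
    data = json.load(f)

for item in data['service-nat-pool-information'][0]['sfw-per-service-set-nat-pool']:
    ifname = [b['data'] for b in item['interface-name']]
    for q in item['service-nat-pool']:
        results.append({
            'if_name': ifname,
            'name': [a['data'] for a in q['pool-name']],
            'ip_range': [n['data'] for n in q['pool-address-range-list'][0]['pool-address-range']],
            'effective_port_blocks': [j['data'] for j in q['effective-port-blocks']],
            'effective_ports': [m['data'] for m in q['effective-ports']],
            'port_block_efficiency': [d['data'] for d in q['port-block-efficiency']],
        })

print(json.dumps(results))

The registered script_run.stdout | from_json then gives you a normal list of dicts to work with in later tasks.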
I have a file and in that file I want to add the pattern:
dogstreams: /root/ddmonitor/pattern.txt:/opt/datadog-agent/agent/checks/libs/parsers.py:parse_web
to the end of the file.
I've tried the following so far:
---
- name: Creates directory
  file: path=/root/ddmonitor state=directory owner=root group=root mode=0775

- name: copy the pattern_search.txt file which has patterns to be grepped
  copy: src=pattern_search.txt dest=/root/ddmonitor/pattern_search.txt owner=root group=root mode=755

- name: copy the logsearchtest.sh script which greps patterns and prints pattern.txt file
  copy: src=logsearchtest.sh dest=/root/ddmonitor/logsearchtest.sh owner=root group=root mode=755

- name: schedule cron to run every 5 minutes
  cron: name="logsearch script for grepping pega alert logs" minute="*/5" job="/root/ddmonitor/logsearchtest.sh > /dev/null"

- name: copy parsers.py function to datadog lib path
  copy: src=parsers.py dest=/opt/datadog-agent/agent/checks/libs/parsers.py owner=root group=root mode=755

- name: copy datadog agent configuration file
  lineinfile: dest=/etc/dd-agent/datadog.conf regexp="^dogstreams: " line="dogstreams: /root/ddmonitor/pattern.txt:/opt/datadog-agent/agent/checks/libs/parsers.py:parse_web"

- name: wait a bit
  service: name=datadog-agent state=restarted
  when: not datadog_api_key == 'NONE'
but when I run my main playbook it shows the following error:
ERROR! Syntax Error while loading YAML.
The error appears to have been in '/etc/ansible/roles/datadog-pegalogs-apptier/tasks/main.yml': line 19, column 65, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
- name: copy datadog agent configuration file
lineinfile: dest=/etc/dd-agent/datadog.conf regexp="dogstreams: " line="dogstreams: /root/ddmonitor/pattern.txt:/opt/datadog-agent/agent/checks/libs/parsers.py:parse_web"
^ here
We could be wrong, but this one looks like it might be an issue with
unbalanced quotes. If starting a value with a quote, make sure the
line ends with the same set of quotes. For instance this arbitrary
example:
foo: "bad" "wolf"
Could be written as:
foo: '"bad" "wolf"'
Check out the Ansible Docs
I believe that you will need to do something similar to this:
Fully quoted because of the ': ' on the line. See the Gotchas in the YAML docs.
- lineinfile: "dest=/etc/sudoers state=present regexp='^%wheel' line='%wheel ALL=(ALL) NOPASSWD: ALL'"
There is a YAML limitation with the ":" character, which is listed here.
You will want to quote any hash values using colons, like so:
foo: "somebody said I should put a colon here: so I did"
So your line should look like:
- name: copy datadog agent configuration file
  lineinfile: "dest=/etc/dd-agent/datadog.conf regexp='^dogstreams: ' line='dogstreams: /root/ddmonitor/pattern.txt:/opt/datadog-agent/agent/checks/libs/parsers.py:parse_web'"
I guess you could escape the colon as {{ ":" }} like so:
lineinfile: dest=/etc/dd-agent/datadog.conf regexp="^dogstreams{{ ":" }} " line="dogstreams{{ ":" }} /root/ddmonitor/pattern.txt:/opt/datadog-agent/agent/checks/libs/parsers.py:parse_web"