I have a module called "mysync" which is a small modification of synchronize. At the bottom of this ticket is a diff between synchronize and mysync. It is placed in the library directory:
library = /sites/utils/local/ansible/modules
When running the module in a playbook, the module is not found. Recommendations?
#> cat play.yml
- hosts: all
  name: put shop onto server
  mysync:
    mode: pull
    module: shop
    src: rsync://#DEPOTHOST#::shop.ear
    dest: /sites/MODULES/
    archive: no
    compress: yes
    copy_links: yes
    delete: yes
    links: yes
    times: yes
    use_ssh_args: yes
    verify_host: no
  delegate_to: "{{ inventory_hostname }}"
#>ansible-playbook ./sync_shop.yml --limit tst37 -vvv
ERROR! 'mysync' is not a valid attribute for a Play
The error appears to have been in '/sites/utils/local/ansible/modules/sync_shop.yml': line 1, column 3, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
- hosts: all
^ here
#> diff /Library/Python/2.7/site-packages/ansible/modules/files/synchronize.py ./mysync.py
26,28c26,28
< module: synchronize
< version_added: "1.4"
< short_description: A wrapper around rsync to make common tasks in your playbooks quick and easy.
---
> module: mySync
> version_added: "2.1"
> short_description: A custom wrapper around rsync to get src-host, src and dest from ldap
302a303,304
> import socket
> from ansible.module_utils.ldapData import ldapData
370a373,382
> #myInv = ldapData(self.args.debug,self.args.file,self.args.refresh)
> myInv = ldapData()
> host = socket.getfqdn()
> mydata = hosts[host][instances]
> for inst in mydata:
> if 'depothost' in inst:
> src_host=inst['depothost']
> if src_host is None:
> module.fail_json(msg='Could not determine depothost')
>
378a391
> source.replace('#DEPOTHOST#',src_host)
You should not "call" modules (custom or standard) directly at the play level.
You should add them under a tasks section, which is missing from your play.
- hosts: all
  tasks:
    - name: put shop onto server
      mysync:
        mode: pull
        module: shop
        # etc.
Related
As the title states, I am currently struggling to output the result of my .py script. I have looked around and tried to implement many of the examples I've seen on SO/Reddit, e.g.:
using register & debug parameters
using above parameters with "stdout"
using above parameters with "stdout_lines"
Currently, when I run docker-compose up I have no issues; the desired output is seen in the CLI (it shows disk usage). However, when I run docker-compose up through an ansible-playbook, I see output that details various server-related info and is NOT what I want to see when the playbook is run. I only want to see the result of the disk clean-up script.
See python scripts + ansible playbook code below:
my_modules.py
import time, os, shutil

path = os.getcwd()
total, used, free = shutil.disk_usage(path)
total_2_dp = round(total / 2**30, 2)
used_2_dp = round(used / 2**30, 2)
free_2_dp = round(free / 2**30, 2)

def dfcleanup():
    timestamp = os.path.getmtime(path)
    days = 40
    s_in_days = time.time() - (days * 24 * 60 * 60)
    for file in os.listdir(path):
        filename = os.fsdecode(file)
        if timestamp >= s_in_days:
            os.remove(filename)
            break
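For reference, shutil.disk_usage() returns byte counts, and dividing by 2**30 converts them to GiB. A minimal standalone check, separate from the script above:

import shutil

total, used, free = shutil.disk_usage(".")
# values printed in GiB, rounded to 2 decimal places
print(round(total / 2**30, 2), round(used / 2**30, 2), round(free / 2**30, 2))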
df_cleanup.py ## (this script prints disk usage)
from clean_up_msg import clean_up_message
from my_modules import *

threshold = 80

if (used / total) * 100 >= threshold:
    dfcleanup(), clean_up_message()
    print("Clean up complete (:")
    print(f"Disk Usage:\nTotal: {total_2_dp} GiB, Used: {used_2_dp} GiB, Free: {free_2_dp} GiB")
else:
    print(f"Disk Usage:\nTotal: {total_2_dp} GiB, Used: {used_2_dp} GiB, Free: {free_2_dp} GiB")
docker-compose up to run .py script
[screenshot: running docker-compose up (output I want)]
ansible playbook to run docker-compose up
---
- hosts: all
  become: true
  tasks:
    - name: copy file with owner & permissions
      ansible.builtin.copy:
        src: /home/user/project-directory/docker-compose.yml
        dest: /home/user/docker-compose.yml

    - name: run docker-compose up
      community.docker.docker_compose:
        project_src: /home/user/
        files:
          - "docker-compose.yml"
        state: present
      register: df_output

    - debug:
        msg: "{{ df_output }}"
ansible playbook output
[screenshot: running ansible-playbook (output I get)]
I am using Ansible Tower and aiming to do something like this. Let's say I have these inventories defined in Ansible Tower:
kanto-pkmn    unova-pkmn    johto-pkmn
a             e             c
b             f             d
Now I want the user to input variables (say they enter kanto and unova), and then the playbook is only supposed to run on those hosts. However, the catch is that the hosts are supposed to maintain some form of variable that connects them to their respective inventory
(e.g. some sort of mapping should exist between a and kanto).
Ideas I have explored:
Multiple inventories seem like the best way, but Ansible Tower only allows one inventory to be set during a job.
Smart inventory is another option, but it seems to remove all the groups of the previous inventories, so all I seem to obtain is
a
b
c
f
e
f
Is there any way I can get something like a smart inventory with the groups intact, or basically get something like:
[kanto-pkmn] (or anything that can be mapped to the file)
a
b
[unova-pkmn]
e
f
Ansible Inventory
[kanto-pkmn]
a
b
[unova-pkmn]
e
f
[johto-pkmn]
c
d
[kantounova:children]
kanto-pkmn
unova-pkmn
Run against the Ansible inventory
ansible all -m setup            # --> will run on all hosts (builtin "all" group)
ansible unova-pkmn -m setup     # --> will run on e and f
ansible kantounova -m setup     # --> will run on a, b, e, and f
Ansible Playbook
In the playbook or role you want to run, you can add survey questions:
- name: Install Lab Environment
  become: true
  hosts: johto-pkmn
  vars_prompt:
    - name: rhn_username              # <-- this is your variable name
      prompt: Enter Red Hat CDN username
      private: no
    - name: rhn_password
      prompt: Enter Red Hat user CDN password
      private: yes
    - name: ORG
      prompt: What is the name of your organization?
      private: no
    - name: LOC
      prompt: Please enter the location of your env
      private: no
  tasks:
    - name: Register with red hat cdn and attach rhel subscription
      redhat_subscription:
        username: "{{ rhn_username }}"
        password: "{{ rhn_password }}"
        state: present
        pool: '^Red Hat Ansible Automation'
      when:
        - rhn_username != ""
        - "'unova-pkmn' in group_names"   # <-- runs only on hosts that are also in the unova-pkmn group and have rhn_username set
I would like to add an Ansible module locally. The module should use an external Python library. I have added just the following line:
from ansible.module_utils.foo import Bar
to the Ansible new-module template, making it look like the example below:
my_test.py
#!/usr/bin/python
# Copyright: (c) 2018, Terry Jones <terry.jones@example.org>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
ANSIBLE_METADATA = {
    'metadata_version': '1.1',
    'status': ['preview'],
    'supported_by': 'community'
}
DOCUMENTATION = '''
---
module: my_test
short_description: This is my test module
version_added: "2.4"
description:
    - "This is my longer description explaining my test module"
options:
    name:
        description:
            - This is the message to send to the test module
        required: true
    new:
        description:
            - Control to demo if the result of this module is changed or not
        required: false
extends_documentation_fragment:
    - azure
author:
    - Your Name (@yourhandle)
'''
EXAMPLES = '''
# Pass in a message
- name: Test with a message
  my_test:
    name: hello world

# pass in a message and have changed true
- name: Test with a message and changed output
  my_test:
    name: hello world
    new: true

# fail the module
- name: Test failure of the module
  my_test:
    name: fail me
'''
RETURN = '''
original_message:
    description: The original name param that was passed in
    type: str
    returned: always
message:
    description: The output message that the test module generates
    type: str
    returned: always
'''
from ansible.module_utils.basic import AnsibleModule
from ansible.module_utils.foo import Bar
def run_module():
    # define available arguments/parameters a user can pass to the module
    module_args = dict(
        name=dict(type='str', required=True),
        new=dict(type='bool', required=False, default=False)
    )

    # seed the result dict in the object
    # we primarily care about changed and state
    # change is if this module effectively modified the target
    # state will include any data that you want your module to pass back
    # for consumption, for example, in a subsequent task
    result = dict(
        changed=False,
        original_message='',
        message=''
    )

    # the AnsibleModule object will be our abstraction working with Ansible
    # this includes instantiation, a couple of common attr would be the
    # args/params passed to the execution, as well as if the module
    # supports check mode
    module = AnsibleModule(
        argument_spec=module_args,
        supports_check_mode=True
    )

    # if the user is working with this module in only check mode we do not
    # want to make any changes to the environment, just return the current
    # state with no modifications
    if module.check_mode:
        module.exit_json(**result)

    # manipulate or modify the state as needed (this is going to be the
    # part where your module will do what it needs to do)
    result['original_message'] = module.params['name']
    result['message'] = 'goodbye'

    # use whatever logic you need to determine whether or not this module
    # made any modifications to your target
    if module.params['new']:
        result['changed'] = True

    # during the execution of the module, if there is an exception or a
    # conditional state that effectively causes a failure, run
    # AnsibleModule.fail_json() to pass in the message and the result
    if module.params['name'] == 'fail me':
        module.fail_json(msg='You requested this to fail', **result)

    # in the event of a successful module execution, you will want to
    # simply call AnsibleModule.exit_json(), passing the key/value results
    module.exit_json(**result)


def main():
    run_module()


if __name__ == '__main__':
    main()
I haven't introduced any changes to the Ansible playbook template. I paste the playbook here for completeness:
testmod.yml
- name: test my new module
  hosts: localhost
  tasks:
    - name: run the new module
      my_test:
        name: 'hello'
        new: true
      register: testout

    - name: dump test output
      debug:
        msg: '{{ testout }}'
I have put the module my_test.py in the following location:
~/.ansible/plugins/modules
I have extended the ANSIBLE_MODULE_UTILS environment variable to make the foo library visible. Generally, leaving Ansible aside, parts of the foo library may be imported into a Python script in the following way:
from foo import Bar
I have tested that when, e.g., foo is a single Python script and Bar is a class inside that script, the testmod.yml playbook runs correctly. The problem is that foo is a directory; there is no foo.py file, nor Bar.py. In my case, when I run testmod.yml, I receive the traceback:
ImportError: No module named foo.config
Could you tell me what I should do to be able to use the external foo library in my local Ansible module?
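For what it's worth, the plain-Python side of the situation described above can be reproduced outside Ansible with a minimal, self-contained sketch. All names below are the question's placeholders and the package layout is an assumption: from foo import Bar only resolves for a package (a directory) when the package's __init__.py actually exposes Bar.

# Hedged sketch, outside Ansible: build a throwaway "foo" package whose
# __init__.py re-exports Bar, then import it the same way the module does.
import os
import sys
import tempfile

pkg_root = tempfile.mkdtemp()
os.makedirs(os.path.join(pkg_root, "foo"))

# foo/something.py defines the class; foo/__init__.py re-exports it
with open(os.path.join(pkg_root, "foo", "something.py"), "w") as f:
    f.write("class Bar:\n    pass\n")
with open(os.path.join(pkg_root, "foo", "__init__.py"), "w") as f:
    f.write("from .something import Bar\n")

sys.path.insert(0, pkg_root)
from foo import Bar  # works only because __init__.py exposes Bar
print(Bar)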
I'm looking to make an Ansible role and module able to list all packages actually installed on a Linux system and register them to a var.
Then upgrade all of them and put the second list in another var.
My module is here to make a diff of the two dictionaries (yum_packages1 and yum_packages2) and return it at the end of my role.
When I try to pass those two dictionaries into my module and start my processing, I get a very strange error.
fatal: [centos7_]: FAILED! => {"changed": false, "msg": "argument yum_packages2 is of type and we were unable to convert to dict: cannot be converted to a dict"}
Ansible role task
---
# tasks file for ansible_yum_update_linux
- name: Listing Linux packages already installed
  yum:
    list: installed
  register: yum_packages1

- name: Upgrade Linux packages
  yum:
    name: '*'
    state: latest
    exclude: "{{ packages_exclude }}"

- name: Listing Linux packages installed and updated
  yum:
    list: installed
  register: yum_packages2

- debug:
    var: yum_packages1

- debug:
    var: yum_packages2

- name: file compare
  filecompare:
    yum_packages1: "{{ yum_packages1.results }}"
    yum_packages2: "{{ yum_packages2.results }}"
  register: result

- debug:
    var: result
Custom Ansible module
#!/usr/bin/python
import json
from ansible.module_utils.basic import AnsibleModule


def diff_update(f1, f2):
    #f3 = set(f1.keys()) == set(f2.keys())
    upd1 = set(f1.values())
    upd2 = set(f2.values())
    f3 = (upd1.difference(upd2))
    return f3


def main():
    module = AnsibleModule(
        argument_spec = dict(
            yum_packages1 = dict(required=True, type='dict'),
            yum_packages2 = dict(required=True, type='dict')
        )
    )
    f3 = diff_update(module.params['yum_packages1'], module.params['yum_packages2'])
    module.exit_json(changed=False, diff=f3)


if __name__ == '__main__':
    main()
Do you have any idea why I get this error?
Do you have any idea why I get this error?
Because a set is not JSON serializable:
import json
print(json.dumps(set(["hello", "kaboom"])))
cheerfully produces:
TypeError: set(['kaboom', 'hello']) is not JSON serializable
That's actually bitten me a couple of times in actual ansible modules, which is why I knew it was a thing; if you insist on using your custom module, you'll want to convert the set back to a list before returning it:
module.exit_json(changed=False, diff=list(f3))
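As a quick plain-Python check of that fix, reusing the demo values from above:

import json

f3 = set(["hello", "kaboom"])     # same values as in the failing example
print(json.dumps(list(f3)))       # now serializes fine, e.g. ["kaboom", "hello"]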
I'm trying to extract variables from a Python script's response. Basically, I have a task that executes a Python script and I need to get the variables of that response; this Python script parses a JSON file and puts the response into different variables.
(python script)
import json

with open('dd.json') as f:
    data = json.load(f)

for item in data['service-nat-pool-information'][0]['sfw-per-service-set-nat-pool']:
    ifname = [b['data'] for b in item['interface-name']]
    for q in item['service-nat-pool']:
        name = [a['data'] for a in q['pool-name']]
        rang = [n['data'] for n in q['pool-address-range-list'][0]['pool-address-range']]  # line added from Stack Overflow
        # ports = item['pool-port-range'][0]['data']
        # use = item['pool-ports-in-use'][0]['data']
        block = [j['data'] for j in q['effective-port-blocks']]
        mblock = [m['data'] for m in q['effective-ports']]
        maxp = [d['data'] for d in q['port-block-efficiency']]
        print("|if-name", ifname, "|name:", name, "|ip-range:", rang, "|Effective-port-blocks:", block[0], "|Effective-port:", mblock[0], "|Port-Block-Efficiency:", maxp[0])
ansible playbook
---
- name: Parse
  hosts: localhost
  connection: local
  vars:
    pool: "{{ lookup('file','/etc/ansible/playbook/dd.json') | from_json }}"
  tasks:
    - name: Execute Script
      command: python3.7 parsetry.py
I expected a task in Ansible that gets the variables from the Python script and stores them in Ansible variables.
You have to use register. If you modify your script to output JSON, that might ease your work a little bit.
- name: Execute Script
  command: python3.7 parsetry.py
  register: script_run

- name: Debug output
  debug:
    msg: "{{ script_run.stdout | from_json }}"
If you want to keep full Python power at your fingertips, you might as well consider turning your script into a custom module or a custom filter if that ever makes sense.
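On the "output JSON" suggestion, here is a minimal sketch of how the final print in parsetry.py could be reworked. The variable names come from the script above, the field names are just placeholders, and the idea is to emit a single JSON document so the from_json filter in the debug task can parse it:

import json

results = []
# inside the inner loop of parsetry.py, instead of print(...):
#     results.append({"if-name": ifname, "name": name, "ip-range": rang,
#                     "effective-port-blocks": block[0],
#                     "effective-ports": mblock[0],
#                     "port-block-efficiency": maxp[0]})
# after the loops, emit one JSON document on stdout:
print(json.dumps(results))

With a single JSON document on stdout, script_run.stdout | from_json becomes an Ansible list you can loop over or assign to other variables with set_fact.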