Ansible - Issue with iterating through items in path - loops

I'm quite new to Ansible and I can't seem to iterate through the items in a path using "with_items".
Below is the sample code, which is meant to go through some files in a path and apply the configuration to a Juniper router.
---
- name: Get Juniper Device Facts
  hosts: "junos_devices"
  gather_facts: false
  connection: local

  tasks:
    - name: Update prefix-lists
      junos_config:
        src: prefix-lists/{{item}}
      with_items: "/home/python/prefix-lists/*"
The error I'm getting is this:
failed: [192.168.216.66] (item=/home/python/prefix-lists/*) => {"changed": false, "failed": true, "item": "/home/python/prefix-lists/*", "msg": "path specified in src not found"}
Does anyone have any idea why I'm unable to do so?

Why with_items? Use with_fileglob.
From examples:
# copy each file over that matches the given pattern
- name: Copy each file over that matches the given pattern
  copy:
    src: "{{ item }}"
    dest: "/etc/fooapp/"
    owner: "root"
    mode: 0600
  with_fileglob:
    - "/playbooks/files/fooapp/*"

Related

Can I create dynamic lists within Ansible vars_files?

I have a variables file that includes important info about our databases: the server they are on, the db version, the DB_HOME directory, etc. In the variables file, I would like to dynamically create lists that capture the unique values of those properties, so they can easily be iterated through in a task.
I can get equivalent functionality by creating the list on the fly in a task's loop option, but that means repeating the loop syntax (which violates the DRY principle), and I would like colleagues less experienced with Ansible to be able to use a pre-defined list.
Example of the variables file databases.yml:
databases:
  - name: test_db1
    server: ora_901
    listener: LISTENER_XYZ
    version: '11.2.0.4'
    oracle_home: '/app/oracle/product/11.2.0.4/db_home'
  - name: test_db2
    server: ora_902
    listener: LISTENER_ABC
    version: '11.2.0.4'
    oracle_home: '/app/oracle/product/11.2.0.4/db_home'

## This didn't work... was hoping I could build this list dynamically
listeners:
  - name: "{{ item }}"
    loop: "{{ databases | map(attribute='listener') | list | unique }}"
servers:
  - name: "{{ item }}"
    loop: "{{ databases | map(attribute='server') | list | unique }}"
I would then loop through either the 'listeners' or 'servers' lists directly in some tasks.
When I tried a task that referenced the listeners variable, it failed. Referencing databases works and all items are returned, so I know it's getting some data from the vars_file...
- vars_files:
    - vars/databases.yml
  tasks:
    - debug:
        msg: "{{ databases }}"
    - debug:
        msg: "{{ listeners }}"
TASK [debug] **************************************************************************
ok: [FQDN] => {
    "msg": [
        {
            "listener": "LISTENER_XYZ",
            "name": "test_db1",
            "oracle_home": "/app/oracle/product/11.2.0.4/db_home",
            "server": "ora_901",
            "version": "11.2.0.4"
        },
        {
            "listener": "LISTENER_ABC",
            "name": "test_db2",
            "oracle_home": "/app/oracle/product/11.2.0.4/dbhome_1",
            "server": "ora_902",
            "version": "11.2.0.4"
        },
fatal: [FQDN]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'item' is undefined\n\nThe error appears to have been in '/home/xxx/test_vars.yml': line 21, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n - debug:\n ^ here\n"}
I would really like to keep these dynamic definitions in the same place as the server definitions. I don't see why it shouldn't be possible; I'm just sure I'm using the wrong mechanism.
Check this out from the Ansible docs, just to give you an idea about "loop_control".
You can nest two looping tasks using include_tasks. However, by default Ansible sets the loop variable item for each loop. This means the inner, nested loop will overwrite the value of item from the outer loop. You can specify the name of the variable for each loop using loop_var with loop_control:
# main.yml
- include_tasks: inner.yml
  loop:
    - 1
    - 2
    - 3
  loop_control:
    loop_var: outer_item

# inner.yml
- debug:
    msg: "outer item={{ outer_item }} inner item={{ item }}"
  loop:
    - a
    - b
    - c
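For the original goal, the derived lists don't necessarily need a loop at all: Ansible templates variables lazily, so a vars file can define them as plain Jinja2 expressions that reference databases. A minimal sketch along those lines, assuming the same databases structure as above:

# vars/databases.yml
# derived lists, evaluated when first referenced in a task
listeners: "{{ databases | map(attribute='listener') | list | unique }}"
servers: "{{ databases | map(attribute='server') | list | unique }}"

A task referencing {{ listeners }} would then resolve to ['LISTENER_XYZ', 'LISTENER_ABC'].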

Ansible - Versionfile check

I want to be able to read a version file if it exists and check its contents, then return True if the version changed or the file does not exist, and False if the version file exists and the version matches its content.
Basically this:
# setup test data
- set_fact:
    version_expected: "0001"
    version_path: "/path/to/version"
    version_owner: "root"
    version_group: "root"

# this block is used to check for version changes
- name: check version change
  block:
    - name: check version file
      stat:
        path: "{{ version_path }}"
      register: version_file
    - set_fact:
        version_remote: "{{ lookup('file', version_path) | default('') }}"
      when: version_file.stat.exists
    - set_fact:
        version_changed: "{{ not version_file.stat.exists or version_remote != version_expected }}"

# test writing new version
- name: write file
  copy:
    dest: "{{ version_path }}"
    content: "{{ version_expected }}"
    owner: "{{ version_owner }}"
    group: "{{ version_group }}"
  when: version_changed
My problem is: This is somewhat ugly and becoming quite redundant in my roles.
Is there a more elegant way to do this?
Is there maybe a module for this? (though I found none)
Or should I just write a module for this?
Best regards,
2d4r
EDIT:
I'm referring only to the "check version change" block; the surrounding code is for debugging only.
To be more specific, I want to download a server binary, but only if my expected version differs from the content of the version file.
I want to write the new version to the file if (and only if) the download was successful, but that is not part of my question.
EDIT2:
This is what I have by now:
# roles/_helper/tasks/version_check.yml
- name: check if file exists
  stat:
    path: "{{ version_path }}"
  register: version_file

- name: get remote version
  slurp:
    src: "{{ version_path }}"
  register: version_changed
  when: version_file.stat.exists

# (False if versionfile exists and version is expected; True else)
- name: set return value
  set_fact:
    version_changed: "{{ not version_file.stat.exists or ((version_changed.content | b64decode) is version_compare(version_expected, 'ne')) }}"
used like this:
# /roles/example/tasks/main.yml
- include_role:
name: _helper
tasks_from: version_check
vars:
version_path: "{{file_version_path}}"
version_expected: "{{file_version_expected}}"
- name: doing awesome things
when: version_changed
block:
- name: download server
[...]
- name: write version
copy:
dest: "{{file_version_path}}"
content: "{{file_version_expected}}"
It kills the redundancy, but is still not what I want.
Sadly I can not register a return value from a role.
Delete everything except the write file task and remove the condition.
Ansible does this automatically for you.
- name: write file
  copy:
    dest: "{{ version_path }}"
    content: "{{ version_expected }}"
    owner: "{{ version_owner }}"
    group: "{{ version_group }}"
After you changed the question, given the information provided, the only thing I can point to is to use the slurp module instead of lookup, as lookup plugins run locally on the control machine.
Compare versions using your own logic or the built-in version_compare filter/test.
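If the goal is to trigger the download only when the file would actually change, one possible pattern (a sketch, not taken from the answer above) is to run the copy in check mode first and register its result; the registered changed flag then gates both the download and the real write. The URL and destination below are hypothetical placeholders:

- name: check whether the version file would change
  copy:
    dest: "{{ version_path }}"
    content: "{{ version_expected }}"
  check_mode: yes
  register: version_check

- name: download server
  get_url:
    url: "http://example.com/server.bin"   # hypothetical URL
    dest: "/opt/server.bin"                # hypothetical destination
  when: version_check.changed

- name: write version
  copy:
    dest: "{{ version_path }}"
    content: "{{ version_expected }}"
  when: version_check.changed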

Ansible to pick & iterate the location name from the file sequentially

I have written a playbook to install our custom collectd, and that's working; I'm changing the configuration with the lineinfile module. Now I have another situation: I have multiple site names, like the one already in my playbook (richmond), as in the following line:
Prefix "collectd.dns.NA.richmond.physical.
I have multiple site names, viz. richmond, carry, london, boston, hence I'm looking to put them into a variable and call that variable to assign the value.
Let's suppose that when the playbook runs on the first server it should pick up "richmond", for the second it should pick up carry, for the third london, and so on. Looking for ideas.
---
- name: Playbook to Install CollectD
  #hosts: all[1]
  hosts: all
  remote_user: root
  become: true

  tasks:
    - name: Downloading collectd
      get_url:
        url="http://my-dc/collectd-5.7.2.tar.gz"
        dest="/opt/"

    - name: Extracting collectd archive
      unarchive:
        src="/opt/collectd-5.7.2.tar.gz"
        dest="/opt/"
        remote_src=True

    - name: Creating soft link to collectd Dir
      file:
        src: "/opt/collectd-5.7.2"
        dest: "/opt/collectd"
        state: link
        owner: root
        group: root

    ##############################################################################################
    # Don't disable gather_facts at all, as {{ ansible_hostname }} absolutely depends on facts   #
    ##############################################################################################
    - name: Committing changes to collectd configuration....
      lineinfile:
        dest: "{{ item.dest }}"
        regexp: "{{ item.regexp }}"
        line: "{{ item.line }}"
        backrefs: yes
      with_items:
        - { dest: '/opt/collectd/etc/collectd.conf', regexp: '#Hostname "collectServer"', line: 'Hostname "{{ ansible_hostname }}"' }
        - { dest: '/opt/collectd/etc/collectd.conf', regexp: '# Prefix "collectd.unix."', line: ' Prefix "collectd.dns.NA.richmond.physical."' }

    - name: Copy the collectd Daemon to init ..
      copy:
        src: "/opt/collectd/startup/collectd"
        dest: "/etc/init.d/"
        remote_src: True
        mode: 0755
        owner: root
        group: root

    - name: starting collectd Service
      service:
        name: collectd
        state: started
        enabled: yes
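One way to parameterize the site name is to define it per host (or per group) in the inventory or in group_vars and reference it in the Prefix line. A minimal sketch, assuming a hypothetical site_name variable and example host names:

# inventory (hypothetical hosts)
[collectd_hosts]
server1 site_name=richmond
server2 site_name=carry
server3 site_name=london
server4 site_name=boston

The lineinfile item would then become:

- { dest: '/opt/collectd/etc/collectd.conf', regexp: '# Prefix "collectd.unix."', line: ' Prefix "collectd.dns.NA.{{ site_name }}.physical."' }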

In Ansible, how can I fetch a file from multiple nodes and store it in one centralized file?

Everything I need is in the title; for example, I want to know how I can do something like this:
---
- hosts: ansible-clients
  tasks:
    - name: Fetch source list from clients
      fetch: src=/etc/apt/sources.list
             dest=/tmp/allnodes.sourcelist
Or, put simply:
echo remote#/etc/apt/sources.list >> local#/tmp/allnodes.sourcelist
I could create and run a script locally, but the only condition I have is to do all actions in one playbook.
You can use this play:
---
- hosts: ansible-clients
  tasks:
    - name: Fetch source list from clients
      fetch:
        src: /etc/apt/sources.list
        flat: yes
        dest: "/tmp/{{ inventory_hostname }}.sourcelist"

    - name: Merge files
      run_once: yes
      delegate_to: localhost
      shell: "cat /tmp/{{ item }}.sourcelist >> /tmp/allnodes.sourcelist"
      with_items: "{{ groups['ansible-clients'] }}"
The first task fetches the file from every remote and stores it under /tmp; inventory_hostname is used in the filename to make it unique.
The second task runs once, on a single host, and appends all the fetched files (one per host in the ansible-clients group) to the final file.
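As an alternative to the shell concatenation, the fragments could also be fetched into a single directory and merged with the assemble module; a sketch (the /tmp/sources.d path is just an example):

- name: Fetch source list from clients
  fetch:
    src: /etc/apt/sources.list
    flat: yes
    dest: "/tmp/sources.d/{{ inventory_hostname }}"

- name: Merge files
  run_once: yes
  delegate_to: localhost
  assemble:
    src: /tmp/sources.d
    dest: /tmp/allnodes.sourcelist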

Ansible nested loop cross list reference?

I'm trying to solve this and frankly I'm beginning to think it doesn't work that way.
- name: Create directories
  file:
    path: "{{ item[0] }}"
    state: directory
    owner: some_user
    group: some_group
    mode: some_mode
  with_nested:
    - [ '/var/lib/{{ item[1] }}', '/var/lib/{{ item[1] }}/conf' ]
    - [ 'app1', 'app2' ]
Apparently there's a scope issue here that I'm just not getting.
If it's not clear enough: I want to create the app dirs first and then the conf dirs inside each of them.
Thanks in advance
To create directories you need just one loop:
- name: Create directories
  file:
    path: "/var/lib/{{ item }}/conf"
    state: directory
  with_items:
    - app1
    - app2
From docs:
If state=directory, all immediate subdirectories will be created if they do not exist, since 1.7 they will be created with the supplied permissions
Update: in case of multiple subfolders (conf, logs, etc):
- name: Create directories
  file:
    path: "/var/lib/{{ item[1] }}/{{ item[0] }}"
    state: directory
  with_nested:
    - [ 'conf', 'logs' ]
    - [ 'app1', 'app2' ]
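The same nested iteration can also be written with the newer loop keyword and the product filter; a sketch:

- name: Create directories
  file:
    path: "/var/lib/{{ item.1 }}/{{ item.0 }}"
    state: directory
  loop: "{{ ['conf', 'logs'] | product(['app1', 'app2']) | list }}"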
