
Ansible JSON File: Loop, Filter, Replace in Playbooks

Learn Ansible playbook steps to create JSON files, check existence, copy to directories, loop with custom loop_var, filter by tech_stack, remove ansible entries, and replace dark theme with light theme. Full code examples.


Task 2: File_IO

How do I implement the following steps in an Ansible playbook?

  • Create a JSON file (sample provided).
  • Check whether the file exists.
  • Create a directory.
  • If the file exists, copy it into the directory.
  • Loop over the JSON content to print the key/value pairs using a custom loop variable (loop_control.loop_var).
  • Remove objects where tech_stack contains “ansible” and print the remaining dictionaries.
  • Replace all occurrences of “dark theme” with “light theme” in the JSON.

I’m stuck on the “looping over it” part. How can I loop over a JSON file in Ansible to print key/value pairs using a custom loop var, filter/delete entries by tech_stack, and update the file (replace “dark theme” with “light theme”)? Example playbook snippets for reading the JSON, iterating key/value pairs, filtering/removing entries, and writing the modified JSON would be helpful.

Use the Ansible JSON file workflow: write the sample JSON with the copy module, check existence with stat, create a directory with file, and copy conditionally with copy (remote_src). Read the file with slurp | b64decode | from_json, loop over the parsed JSON with loop + loop_control.loop_var (or nested loops via include_tasks plus dict2items) to print key/value pairs, filter out items with json_query or rejectattr with the contains test, then write the modified JSON back with copy (using to_nice_json), or use replace for a simple string swap.


Ansible JSON file: create, check, copy, and directory handling

Below is a compact, working playbook that demonstrates the full File_IO flow: create a JSON file on the managed host, check if it exists, create a directory, and copy the file into that directory only when it exists.

Example playbook snippet (run on your inventory hosts):

yaml
- name: "Task 2: File_IO - create/check/copy/loop/filter/update JSON"
  hosts: all
  gather_facts: false
  vars:
    sample_path: /tmp/sample.json
    backup_dir: /tmp/sample_backup
  tasks:

    - name: Create sample JSON file on the remote host
      ansible.builtin.copy:
        dest: "{{ sample_path }}"
        mode: '0644'
        content: |
          [
            {
              "id": 1,
              "name": "project-alpha",
              "tech_stack": ["python", "ansible"],
              "theme": "dark theme"
            },
            {
              "id": 2,
              "name": "project-bravo",
              "tech_stack": ["go", "docker"],
              "theme": "dark theme"
            },
            {
              "id": 3,
              "name": "project-charlie",
              "tech_stack": ["ansible", "bash"],
              "theme": "light theme"
            }
          ]

    - name: Check whether the JSON file exists
      ansible.builtin.stat:
        path: "{{ sample_path }}"
      register: sample_stat

    - name: Create backup directory (idempotent)
      ansible.builtin.file:
        path: "{{ backup_dir }}"
        state: directory
        mode: '0755'

    - name: Copy JSON into backup directory if it exists on the host
      ansible.builtin.copy:
        src: "{{ sample_path }}"
        dest: "{{ backup_dir }}/sample.json"
        remote_src: true
      when: sample_stat.stat.exists
Notes and keywords:

  • Use the stat module to check file existence and test sample_stat.stat.exists in when: conditionals; this is the standard "check if file exists" pattern in Ansible. A short how-to on stat is available at https://phoenixnap.com/kb/ansible-check-if-file-exists.
  • Directories are created with the file module (state: directory), and copying on the same remote host uses copy with remote_src: true (or a shell cp if you prefer; a minimal sketch follows this list).
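
If you do prefer the shell route, a minimal sketch looks like this (the creates argument is my addition to keep the task idempotent; adjust the paths to your layout):

yaml
 - name: Copy JSON with a plain cp command instead of the copy module
   ansible.builtin.shell: "cp {{ sample_path }} {{ backup_dir }}/sample.json"
   args:
     # Skip the command once the destination already exists (idempotence)
     creates: "{{ backup_dir }}/sample.json"
   when: sample_stat.stat.exists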

Loop JSON with custom loop_var and print key/value pairs

Reading the JSON and iterating over it is easiest when you parse the file into an in-play variable. Use slurp to read remote files, decode with b64decode, and parse with from_json.

Read & parse JSON into myjson:

yaml
 - name: Read JSON from remote file (slurp -> from_json)
   ansible.builtin.slurp:
     src: "{{ sample_path }}"
   register: sample_slurp
   when: sample_stat.stat.exists

 - name: Parse JSON into a variable
   ansible.builtin.set_fact:
     myjson: "{{ (sample_slurp.content | b64decode) | from_json }}"
   when: sample_stat.stat.exists

Now — how do you print each key/value pair and use a custom loop variable? Two clean approaches:

  1. Single-task print (custom outer loop_var). This prints each object’s keys/values in one debug message per object:
yaml
 - name: Print key/value pairs for each object (custom loop_var 'obj')
   ansible.builtin.debug:
     msg: |
       Object {{ obj.name | default('<no-name>') }}:
       {% for p in obj | dict2items %}
       - {{ p.key }}: {{ p.value }}
       {% endfor %}
   loop: "{{ myjson | default([]) }}"
   loop_control:
     loop_var: obj
This uses dict2items to convert a dict to a list of {key, value} pairs, and loop_control.loop_var renames the loop variable from the default item to obj. That avoids item collisions when you have nested loops.

  2. Nested loops via an included task file, showing distinct loop_vars for the outer and inner loops. Ansible does not allow loop directly on a block, so the outer loop drives include_tasks and the included file (named print_pairs.yml here, but the name is arbitrary) runs the inner loop. This prints each key/value on its own debug line and demonstrates loop_control.loop_var at both levels:
yaml
 # In the main play: outer loop over the parsed JSON objects
 - name: Loop over JSON objects and include per-object tasks (outer loop_var 'obj')
   ansible.builtin.include_tasks: print_pairs.yml
   loop: "{{ myjson | default([]) }}"
   loop_control:
     loop_var: obj

 # print_pairs.yml: inner loop over the current object's key/value pairs
 - name: Print each key/value using custom inner loop_var 'pair'
   ansible.builtin.debug:
     msg: "{{ pair.key }}: {{ pair.value }}"
   loop: "{{ obj | dict2items }}"
   loop_control:
     loop_var: pair

Why two approaches? The first is concise and often sufficient. The second demonstrates explicit inner/outer loop variable naming (useful when you run more than one task per object). For more looping patterns and examples see the practical guide at https://www.freekb.net/Article?id=2417 and Adam Gardner’s post on working with JSON in Ansible: https://agardner.net/working-with-json-in-ansible/.
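
If you are unsure what dict2items actually yields, a quick illustrative debug (assuming the sample data and the myjson fact from above) makes the shape obvious:

yaml
 # Each dict becomes a list of {key, value} mappings, e.g.
 # [{'key': 'id', 'value': 1}, {'key': 'name', 'value': 'project-alpha'}, ...]
 - name: Show the dict2items shape for the first object
   ansible.builtin.debug:
     msg: "{{ (myjson | default([]) | first | default({})) | dict2items }}"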


Filter JSON entries by tech_stack (remove “ansible”)

You can remove list items whose tech_stack contains the string "ansible" either with a JMESPath json_query or with the rejectattr filter combined with Ansible's contains test. Both work; choose json_query if you prefer a declarative filter (it requires the jmespath library on the control node), or rejectattr if you want pure template logic with no extra dependency.

  1. json_query approach (JMESPath):
yaml
 - name: Filter out entries where tech_stack contains "ansible" using json_query
   ansible.builtin.set_fact:
     filtered_json: "{{ myjson | json_query('[?contains(tech_stack, `ansible`)==`false`]') }}"
   when: myjson is defined
Note: json_query leverages JMESPath and patterns like contains(...). If you don’t have the library installed, install jmespath on the control machine. A practical walkthrough of json_query is here: https://networktocode.com/blog/ansible-filtering-json-query/ and the general filters guide is at https://docs.ansible.com/projects/ansible/latest/playbook_guide/playbooks_filters.html.
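
If you want the play itself to guarantee that dependency, here is a minimal sketch, assuming pip is available on the control node (adapt it to a virtualenv or your system package manager):

yaml
 # json_query is evaluated on the control node, so install jmespath there
 - name: Ensure jmespath is available for the json_query filter
   ansible.builtin.pip:
     name: jmespath
   delegate_to: localhost
   run_once: true
   become: false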

  2. rejectattr with Ansible's contains test (pure template logic, no jmespath needed). Jinja2's select/reject filters take a test name rather than a lambda, and Ansible ships a contains test for list membership:
yaml
 - name: Filter out entries where tech_stack contains "ansible" using rejectattr
   ansible.builtin.set_fact:
     filtered_json: "{{ myjson | rejectattr('tech_stack', 'contains', 'ansible') | list }}"
   when: myjson is defined

Either way, filtered_json will contain only objects whose tech_stack list does not include "ansible". Then you can debug the remaining dictionaries:

yaml
 - name: Show remaining objects after filter
   ansible.builtin.debug:
     var: filtered_json
If tech_stack might be missing or be a string rather than a list, adjust the expression to test membership more defensively; one way to do that is sketched below.
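
A minimal defensive sketch, using a fact name of my own (filtered_json_safe) so it does not clash with the filtered_json built above: it accumulates item by item, defaults a missing tech_stack to an empty list, and treats a string tech_stack as a substring check:

yaml
 - name: Filter defensively when tech_stack may be missing or a string
   ansible.builtin.set_fact:
     # Keep the entry only when 'ansible' is not in its (defaulted) tech_stack
     filtered_json_safe: >-
       {{ filtered_json_safe | default([])
          + ([entry] if 'ansible' not in (entry.tech_stack | default([])) else []) }}
   loop: "{{ myjson | default([]) }}"
   loop_control:
     loop_var: entry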


Replace “dark theme” → “light theme” and write modified JSON

There are two common ways to update the JSON:

  • Modify the parsed structure and write the structured JSON back (recommended when you altered the data model).
  • Do a text-level find/replace on the file (quick and dirty).

A. Structured update (recommended)

Build a new list where you replace dark theme with light theme in the theme field, then write the JSON back using to_nice_json (keeps the file valid JSON and is idempotent):

yaml
 - name: Replace "dark theme" -> "light theme" in structured data (build updated_json)
   ansible.builtin.set_fact:
     updated_json: >-
       {{ updated_json | default([])
          + [ item | combine({'theme': (item.theme | default('') | replace('dark theme', 'light theme'))}) ] }}
   loop: "{{ filtered_json | default([]) }}"
   loop_control:
     loop_var: item

 - name: Write the modified JSON back to the same file
   ansible.builtin.copy:
     dest: "{{ sample_path }}"
     content: "{{ updated_json | to_nice_json }}"
     mode: '0644'

A few points:

  • Using set_fact with list concatenation is a straightforward way to accumulate transformed items (a loop-free alternative is sketched after this list).
  • to_nice_json makes the output human-readable; to_json may be used for compact output.
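
If every remaining object should simply end up with theme: light theme (rather than a targeted string replace), a loop-free variant with map and combine works too; note that it overwrites the field unconditionally, which happens to produce the same result for this data set:

yaml
 # Overwrites 'theme' on every object in a single expression (no loop needed)
 - name: Set theme to light theme on all remaining objects (loop-free alternative)
   ansible.builtin.set_fact:
     updated_json: "{{ filtered_json | default([]) | map('combine', {'theme': 'light theme'}) | list }}"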

B. Quick text replace (alternative)

If you only need a global string replacement (and aren’t restructuring objects), the replace module is simpler:

yaml
 - name: Replace 'dark theme' with 'light theme' directly in the file
   ansible.builtin.replace:
     path: "{{ sample_path }}"
     regexp: 'dark theme'
     replace: 'light theme'
   when: sample_stat.stat.exists

This just swaps the text and does not validate the JSON structure; keep in mind it edits the file on disk, so it will not reflect any filtering you performed on the in-memory data unless you wrote that data back first.

Finally, show the final file (optional read-back to confirm):

yaml
 - name: Read updated JSON back (optional verification)
   ansible.builtin.slurp:
     src: "{{ sample_path }}"
   register: updated_slurp

 - name: Show final JSON on the controller (decoded)
   ansible.builtin.debug:
     msg: "{{ (updated_slurp.content | b64decode) | from_json }}"
   when: updated_slurp is defined

Conclusion

You can implement Task 2 using standard Ansible modules: create the JSON file with copy, check it with stat, create a directory with file, then conditionally copy the file. Read the file with slurp | b64decode | from_json, loop over the JSON objects with loop and loop_control.loop_var (or nested loops via include_tasks plus dict2items) to print key/value pairs, filter entries with json_query or rejectattr with the contains test, and write the modified JSON back using copy with to_nice_json (or replace for a simple text swap). This workflow gives you safe, idempotent Ansible JSON automation you can adapt to more complex schemas.
