Multiple Ansible archives are not created (archive module)

I'm trying to create two archives from two folders using the archive module.
Unfortunately, it fails silently: no archives are created and no error is reported.
My tasks look like this:
tasks:
  - name: create a tarball of logfiles
    archive:
      path: "{{ item.path }}"
      dest: /tmp/{{ ansible_hostname }}_{{ item.name }}_{{ ansible_date_time.date }}.tar.gz
    register: ausgabe
    with_items:
      - { name: 'xxxxxx', path: '/opt/jira/xxx/xxxxxx' }
      - { name: 'xxxxxxx', path: '/opt/jira/xxxx/xxxxxxx' }
Output:
TASK [create a tarball of logfiles] ************************************************************************************************************************************************
ok: [xxxxxxx] => (item={u'path': u'/opt/jira/xxx/xxxx', u'name': u'xxxxx'})
ok: [xxxxxxx] => (item={u'path': u'/opt/jira/xxx/xxxx', u'name': u'xxxxxx'})
The tar.gz files are not created.
Can somebody help me with this?
Thanks,
Harry

Whenever you use variables or templating in your playbook, make sure you use " (double quotes) properly.
I modified your archive module's parameters and got the required result:
archive:
  dest: "/tmp/{{ ansible_hostname }}_{{ item.name }}_{{ ansible_date_time.date }}.tar.gz"
  path: "{{ item.path }}"
Output:
myHost_xxxxxx_2018-06-12.tar.gz
myHost_xxxxxxx_2018-06-12.tar.gz
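For reference, putting it together, the corrected task from the question would look like this (same loop items and placeholder paths, with the dest expression quoted):

- name: create a tarball of logfiles
  archive:
    path: "{{ item.path }}"
    dest: "/tmp/{{ ansible_hostname }}_{{ item.name }}_{{ ansible_date_time.date }}.tar.gz"
  register: ausgabe
  with_items:
    - { name: 'xxxxxx', path: '/opt/jira/xxx/xxxxxx' }
    - { name: 'xxxxxxx', path: '/opt/jira/xxxx/xxxxxxx' }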

Related

Fetch a file from task in same Ansible playbook

How do I transfer a file I have created in a previous task of my Ansible playbook? Here is what I have so far:
- name: Create Yum Report
  shell: |
    cd /tmp
    yum history info > $(hostname -s)_$(date "+%d-%m-%Y").txt
  register: after_pir

- name: Transfer PIR
  fetch:
    src: /tmp/{{ after_pir }}
    dest: /tmp/
However, I receive this error message when I run my playbook.
TASK [Transfer PIR] ************************************************************************************************************
failed: [x.x.x.x] (item=after_pir) => {"ansible_loop_var": "item", "changed": false, "item": "after_pir", "msg": "the remote file does not exist, not transferring, ignored"}
I have tried different fetch, synchronize and pull approaches, but I'm not sure what the issue is.
One way to do that:
- name: Create Yum Report
  command: yum history info
  register: yum_report

- name: Dump report on local disk for each host
  copy:
    content: "{{ yum_report.stdout }}"
    dest: "/tmp/{{ inventory_hostname_short }}-{{ '%d-%m-%Y' | strftime }}"
  delegate_to: localhost
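If you would rather keep writing the report to a file on the remote host and pull it back with fetch, another sketch is to build the file name with Ansible facts instead of shell expansion, so the shell task and the fetch task agree on the path (flat: yes stops fetch from recreating the hostname directory tree under dest; the exact path and naming here are assumptions):

- name: Build the report path once
  set_fact:
    report_path: "/tmp/{{ inventory_hostname_short }}_{{ '%d-%m-%Y' | strftime }}.txt"

- name: Create Yum Report on the remote host
  shell: "yum history info > {{ report_path }}"

- name: Transfer PIR back to the controller
  fetch:
    src: "{{ report_path }}"
    dest: /tmp/
    flat: yes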

Ansible: write to a file from a Jinja2 template

I want to launch an application:
command=/usr/bin/toto --config /var/toto/conf1.json /var/toto/conf2.json /var/toto/conf3.json
The config files are in the /var/toto directory.
task.yml
- name: Ansible find file
  find:
    paths: "/var/toto"
  register: found_files

- name: print files
  debug:
    msg: "{{ found_files['files'] | map(attribute='path') | map('regex_replace','^.*/(.*)$','\\1') | list }}"
  register: file_name

- name: Create the Jinja2 based template
  template:
    src: "etc/control/config.conf.j2"
    dest: "/etc/control/config.conf"
  with_dict: "{{ file_name }}"
config.conf.j2
command=/usr/bin/toto --config {% for item in file_name %} /var/toto/{{ item }} {% endfor %}
But I get this in my file instead:
/etc/control/config.conf
command=/usr/bin/toto --config /var/opt/msg /var/opt/failed /var/opt/changed
The file_name variable contains:
"msg": [ "conf1.json", "conf2.json", "conf3.json" ]
You're iterating over the dictionary in file_name, which is a task result. If you were to print that out, you would find that it contains something like:
TASK [debug] *********************************************************************************************************************************************************************************
ok: [localhost] => {
    "file_name": {
        "changed": false,
        "failed": false,
        "msg": [
            "file1",
            "file2",
            "file3"
        ]
    }
}
So when you iterate over it using for item in file_name, you're iterating over the top-level keys (changed, failed, msg).
But this is entirely the wrong way to do it. You should never use the debug module to create variables; that's what set_fact is for. You want something like:
- name: build list of files
  set_fact:
    file_name: "{{ found_files['files'] | map(attribute='path') | map('regex_replace','^.*/(.*)$','\\1') | list }}"

- name: Create the Jinja2 based template
  template:
    src: "config.conf.j2"
    dest: "config.conf"
After the set_fact task, the variable file_name will contain a list of file names.
It looks like you're using that regular expression to get the basename of the files found in your find task. There's a basename filter that can do that with less complexity:
- name: print files
  set_fact:
    file_name: "{{ found_files['files'] | map(attribute='path') | map('basename') | list }}"

- name: Create the Jinja2 based template
  template:
    src: "config.conf.j2"
    dest: "config.conf"
Here's the playbook I'm using to test this locally:
- hosts: localhost
  gather_facts: true
  tasks:
    - name: find files
      find:
        paths: /tmp/example
      register: found_files

    - name: print files
      set_fact:
        file_name: "{{ found_files['files'] | map(attribute='path') | map('basename') | list }}"

    - name: Create the Jinja2 based template
      template:
        src: "config.conf.j2"
        dest: "config.conf"
Before running this, I run:
mkdir /tmp/example
touch /tmp/example/file{1,2,3}
This produces a config.conf file that looks like:
command=/usr/bin/toto --config /var/toto/file3 /var/toto/file2 /var/toto/file1
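And the config.conf.j2 that goes with it can be simplified a little: instead of a for-loop (which leaves stray spaces around each entry), a map plus join builds the same argument list in one expression. A sketch, assuming the file names should still be prefixed with /var/toto:

command=/usr/bin/toto --config {{ file_name | map('regex_replace', '^', '/var/toto/') | join(' ') }}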

Using Python variables in Ansible

I tried this:
---
- name: py
  hosts: master
  tasks:
    - name:
      command: /home/vagrant/test.py
      register: csvfile
      changed_when: false

    - debug:
        var: csvfile

    - name: Create csvfile directories
      file:
        path: "/tmp/{{ item.host }}_{{ item.app }}"
        state: directory
      with_dict: "{{ csvfile }}"
Test.py results are:
{'key': 'stdout_lines', 'value': ["{'host': '123', 'app': 'abc'}", "{'host': '2345', 'app': 'def'}", "{'host': '8484', 'app': 'ghju'}", "{'host': '89393', 'app': 'yruru'}"]}
and I'm getting an error at "/{{ item.host }}_{{ item.app }}".
Can someone help me?
The registered variable csvfile must have the attribute stdout_lines:

csvfile:
  stdout_lines:
    - {'host': '123', 'app': 'abc'}
    - {'host': '2345', 'app': 'def'}
    - {'host': '8484', 'app': 'ghju'}
    - {'host': '89393', 'app': 'yruru'}
A simple loop should do the job:

- name: Create csvfile directories
  file:
    path: "/tmp/{{ item.host }}_{{ item.app }}"
    state: directory
  loop: "{{ csvfile.stdout_lines }}"
The key/value decomposition was most probably added by with_dict. Please confirm, update, and fix the question, and post the output of this task:

- debug:
    var: csvfile
There are a bunch of things wrong here, so while this is an "answer" (it is too big to fit into a comment on S.O.), it is not the fix to your problem, because there is so much wrong with your code.
As your debug output demonstrates, register: captures the output of the whole task, not just the output from your program. Thus, at the very least you would need with_dict: "{{ csvfile.stdout }}", but that, too, will not work, because the output is not in an interoperability format that Ansible can use. Just because Ansible is written in Python and your script is written in Python does not mean they can communicate.
You will need test.py to call json.dump or json.dumps on the results, rather than print() or repr or whatever it is calling now, so that Ansible can parse the output back into an actual data structure in your playbook.
Then, what happens next depends on whether you continue to write out each dictionary from test.py on a per-line basis, or you package them all into a list of dictionaries and dump that as JSON.
Start by fixing the output from test.py to be parseable by Ansible, and we'll go from there.
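To illustrate the list-of-dictionaries option: if test.py ended with print(json.dumps(rows)), where rows is the list of dicts (both names here are hypothetical, not the asker's actual code), the playbook side could parse the output with the from_json filter:

# Assumes test.py now prints one JSON document, e.g. print(json.dumps(rows)),
# where rows = [{'host': '123', 'app': 'abc'}, ...] -- hypothetical names.
- name: Create csvfile directories
  file:
    path: "/tmp/{{ item.host }}_{{ item.app }}"
    state: directory
  loop: "{{ csvfile.stdout | from_json }}"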
Sumanth, you can try the same approach but using with_items:
- name: Create csvfile directories
  file:
    path: "/tmp/{{ item.host }}_{{ item.app }}"
    state: directory
  with_items: "{{ csvfile.stdout_lines }}"

Using variables from one yml file in another playbook

I am new to Ansible and am trying to use variables from a vars.yml file in a playbook.yml file.
vars.yml
---
- firstvar:
    id: 1
    name: One
- secondvar:
    id: 2
    name: two
playbook.yml
---
- hosts: localhost
  tasks:
    - name: Import vars
      include_vars:
        file: ./vars.yml
        name: vardata

    - name: Use FirstVar
      iso_vlan:
        vlan_id: "{{ vardata.firstvar.id }}"
        name: "{{ vardata.firstvar.name }}"
        state: present

    - name: Use Secondvar
      iso_vlan:
        vlan_id: "{{ vardata.secondvar.id }}"
        name: "{{ vardata.secondvar.name }}"
        state: present
So you can see that I am treating the imported variable data, which is stored in vardata, as an object and trying to reference it in the other tasks. I suspect the vars imported in the first task are only available within that very task. How can I use them in the other tasks? As written, each task reports the variables as undefined. Any input is appreciated.
Your vars.yml file isn't formatted correctly.
Try this:
---
firstvar:
  id: 1
  name: One
secondvar:
  id: 2
  name: two
I used this to test it:
---
- hosts: localhost
  tasks:
    - name: Import vars
      include_vars:
        file: ./vars.yml
        name: vardata

    - name: debug
      debug:
        msg: "{{ vardata.firstvar.name }}"

    - name: more debug
      debug:
        msg: "{{ vardata.secondvar.id }}"
On top of the syntax error you made when declaring the variables (syntax is very important), you can also write include_vars: ./vars.yml (without the name parameter) so that you can reference {{ firstvar.name }} and {{ firstvar.id }} directly. Much leaner/shorter.
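That leaner form would look something like this (a sketch of the same test playbook, minus the vardata namespace):

---
- hosts: localhost
  tasks:
    - name: Import vars at the top level
      include_vars: ./vars.yml

    - name: debug
      debug:
        msg: "{{ firstvar.name }} has id {{ firstvar.id }}"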

How to pull a variable file from S3 in Ansible

My requirement is that I want to dynamically include a variable file in my Ansible script. I can do that by putting the following in my Ansible tasks:
- name: Include vars file
  include_vars: vars/dev.yml

- name: Some other task
  cp: copy something
The above works if I keep dev.yml in my vars directory. Now, I actually do not want to put dev.yml in that directory; I want to pull it from S3 and then use the variables in it. Something like below:
- name: Get dev file
  s3:
    bucket: bucket_name
    object: object_name
    dest: "dest_directory"  # Here I want the destination to be vars/dev.yml
    mode: get
    aws_access_key: "{{ s3.aws_access_key }}"
    aws_secret_key: "{{ s3.aws_secret_key }}"

- name: Include vars file
  include_vars: vars/dev.yml

- name: Some other task that uses vars in dev.yml
  template: render some template using vars in dev.yml and copy to server

The above will actually not work. How do I do this?
Thanks to Konstantin Suvorov for the help. I just needed to add delegate_to to my task, so that the S3 download runs on the control machine, which is where include_vars reads vars/dev.yml from.
- name: Get dev file
  s3:
    bucket: bucket_name
    object: object_name
    dest: vars/dev.yml
    mode: get
    aws_access_key: "{{ s3.aws_access_key }}"
    aws_secret_key: "{{ s3.aws_secret_key }}"
  delegate_to: localhost

- name: Include vars file
  include_vars: vars/dev.yml
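If you would rather not write the file to disk at all, the same module also has a getstr mode that returns the object's contents as a string, which set_fact can parse with the from_yaml filter. A sketch (the contents return key is from the s3/aws_s3 module docs as I recall them; verify it for your Ansible version, and note the variables end up namespaced under dev_vars rather than at the top level):

- name: Read dev vars straight from S3
  s3:
    bucket: bucket_name
    object: object_name
    mode: getstr  # returns the object body instead of writing a file
    aws_access_key: "{{ s3.aws_access_key }}"
    aws_secret_key: "{{ s3.aws_secret_key }}"
  register: dev_vars_raw
  delegate_to: localhost

- name: Load the variables without touching disk
  set_fact:
    dev_vars: "{{ dev_vars_raw.contents | from_yaml }}"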