I'm trying to set up some processors in filebeat.yml to process some logs before sending them to ELK.
An important part of the processing is determining the "level" of the event, which is not always included in the line in the log file.
This is the idea I have for it right now:
# /var/log/messages
- type: log
  processors:
    - dissect:
        tokenizer: "%{month} %{day} %{time} %{hostname} %{service}: {%message}"
        field: "message"
        target_prefix: "dissect"
    - if:
        when:
          regexp:
            message: ((E|e)rror|(f|F)ault)
      then:
        - add_fields:
            target: 'dissect'
            fields:
              level: error
      else:
        - if:
            when:
              regexp:
                message: (W|W)arning
          then:
            - add_fields:
                target: 'dissect'
                fields:
                  level: warning
          else:
            - add_fields:
                target: 'dissect'
                fields:
                  level: information
    - drop_fields:
        # duplicate
        fields: ["dissect.month","dissect.day","dissect.time","dissect.hostname","message"]
  # Change to true to enable this input configuration.
  enabled: true
  paths:
    - /var/log/messages
I'm still not sure about those patterns I'm trying... but right now I don't think they're what's causing the failure.
When trying to run filebeat with console output for a test with
filebeat -e -c filebeat.yml
I get the following error:
2022-01-26T17:45:27.174+0200 ERROR instance/beat.go:877 Exiting: Error while initializing input: failed to make if/then/else processor: missing or invalid condition
Exiting: Error while initializing input: failed to make if/then/else processor: missing or invalid condition
I'm very new to yaml in general, and the only other beat I've done before is an AuditBeat (which works, and has conditions, but not "if"s).
Does anyone know what the problem might be?
To clarify: I commented out all other "input" entries, leaving just this one, and still got this error.
Edit: Version: 7.2.0
The if part of the if-then-else processor doesn't use the when label to introduce the condition. The correct usage is:
- if:
    regexp:
      message: [...]
You have to correct the two if processors in your configuration.
Additionally, there's a mistake in your dissect expression. {%message} should be %{message}. Also, the regexp for warning should be (W|w)arning not (W|W)arning (both W's are uppercase in your config).
This is the corrected processors configuration:
processors:
  - dissect:
      tokenizer: "%{month} %{day} %{time} %{hostname} %{service}: %{message}"
      field: "message"
      target_prefix: "dissect"
  - if:
      regexp:
        message: ((E|e)rror|(f|F)ault)
    then:
      - add_fields:
          target: 'dissect'
          fields:
            level: error
    else:
      - if:
          regexp:
            message: (W|w)arning
        then:
          - add_fields:
              target: 'dissect'
              fields:
                level: warning
        else:
          - add_fields:
              target: 'dissect'
              fields:
                level: information
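As an aside, Filebeat's regexp conditions use Go's regular expression engine, which supports an inline case-insensitive flag, so the alternations could likely be simplified. A sketch of the first condition only (untested against your logs); the then/else branches stay the same as above:
- if:
    regexp:
      message: '(?i)(error|fault)'   # case-insensitive match for "error"/"fault"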
I'm trying to achieve the following using Ansible:
Define a YAML file with some variables in the dotted format inside it (variables.yml)
database.hosts[0]: "db0"
database.hosts[1]: "db1"
database.hosts[2]: "db2"
foo.bar: 1
foo.baz: 2
Load the variables in variables.yml by using the include_vars module in my playbook (playbook.yml) and print them in a tree structure
- hosts: all
  gather_facts: no
  tasks:
    - name: "Loading vars"
      run_once: true
      include_vars:
        file: 'variables.yml'
    - name: "Testing"
      debug:
        msg: "{{ foo }}"
    - name: "Testing"
      debug:
        msg: "{{ database }}"
Running this results in the following error:
fatal: [host0]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'foo' is undefined\n\nThe error appears to be in '.../playbook.yml': line 9, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n - name: \"Testing\"\n ^ here\n"}
Which makes it clear that each property in the YAML file has been loaded as a separate property and not as properties within two trees rooted in database and foo.
Of course, the playbook works as expected if I specify the properties as follows:
database:
  hosts:
    - "db0"
    - "db1"
    - "db2"
foo:
  bar: 1
  baz: 2
However, I need the YAML variables file to be in the dotted format instead of in the classic indented format. Is there any way to achieve this? E.g.: a module different from include_vars or some configuration that I can add to the ansible.cfg file? I have already tried to use hash_behaviour=merge, but that didn't help.
Q: "I need the YAML variables file to be in the dotted format instead of in the classic indented format. Is there any way to achieve this?"
A: No, it's not possible. See Creating valid variable names.
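For what it's worth, include_vars does load the dotted keys; they just become flat variables whose names happen to contain dots and brackets. As a quick check (a sketch only, and behaviour may differ between Ansible versions), they can be addressed by their literal names through the vars dictionary:
- name: "Show one flat variable by its literal name"
  debug:
    msg: "{{ vars['foo.bar'] }}"   # 'foo.bar' is the literal key loaded from variables.yml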
How do we check a registered variable when only one of two conditional tasks, both registering to the same variable, actually runs?
Below is my playbook; it executes only one of the two shell tasks.
- name: Check file
  shell: cat /tmp/front.txt
  register: myresult
  when: Layer == 'front'

- fail:
    msg: data was read from front.txt and print whatever
  when: myresult.rc != 0

- name: Check file
  shell: cat /tmp/back.txt
  register: myresult
  when: Layer == 'back'

- fail:
    msg: data was read from back.txt and print whatever
  when: myresult.rc != 0
Run the above playbook as
ansible-playbook test.yml -e Layer="front"
I get an error saying that myresult does not have an attribute rc. What is the best way to print debug statements based on the condition that was met?
Note: I want fail to terminate the execution of the play as soon as the condition is met, hence I believe ignore_errors with fail will not help.
Note: The shell modules can be any Unix command.
I tried myresult is changed but that does not help either. Can you please suggest a solution?
You may want to look at this logical grouping of tasks: blocks
- name: Check file
  block:
    - name: check file
      shell: cat /tmp/front.txt
      register: myresult
      ignore_errors: true
    - fail:
        msg: data was read from front.txt and print whatever
      when: myresult.rc != 0
  when: Layer == 'front'

- name: Check file
  block:
    - name: check file
      shell: cat /tmp/back.txt
      register: myresult
      ignore_errors: true
    - fail:
        msg: data was read from back.txt and print whatever
      when: myresult.rc != 0
  when: Layer == 'back'
When the variable Layer is set to 'front', it will execute the shell command for the front file. But in the case where the file doesn't exist, the shell task would raise a "no such file" error and stop the play, so I have put ignore_errors on the shell task; it will ignore that error and move on to the fail task.
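If the two branches really only differ in the file name (the question says the commands could be anything, so treat this as an assumption), the duplication could also be collapsed into one pair of tasks driven by the Layer variable:
- name: Check file for the selected layer
  shell: "cat /tmp/{{ Layer }}.txt"   # assumes the /tmp/<layer>.txt naming holds
  register: myresult
  ignore_errors: true

- fail:
    msg: "data was read from {{ Layer }}.txt and print whatever"
  when: myresult.rc != 0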
I have a Python process writing the following example JSON log line:
{"levelname": "DEBUG", "asctime": "2020-02-04 08:37:42,128", "module": "scale_out", "thread": 139793342834496, "filename": "scale_out.py", "lineno": 130, "funcName": "_check_if_can_remove_inactive_components", "message": "inactive_components: set([]), num_of_components_active: 0, max num_of_components_to_keep: 1"}
In the filebeat.yml, I'm trying to exclude all DEBUG logs from being sent into Elasticsearch.
I've tried using the exclude_lines keyword, but Filebeat still publishes these events.
I've also tried using a processor with drop_event:
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/my_service/*.log
    json.keys_under_root: true
    json.add_error_key: true
    json.message_key: "module"
    exclude_lines: ['DEBUG'] # also tried ['.*DEBUG.*']
    keep_null: true

processors:
  - drop_event:
      when:
        levelname: 'DEBUG'
Any ideas what I may be doing wrong?
Well...
It was much easier (and sillier) than I expected it to be.
While exclude_lines still doesn't work,
I was able to get the drop_event to work.
The problem was that 'DEBUG' should have been written without quotes.
processors:
  - drop_event:
      when:
        levelname: DEBUG
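For reference, the condition examples in the Filebeat documentation spell the comparison out with an equals clause; if the bare mapping above ever stops matching, this variant is worth trying (a sketch, not verified on this exact Filebeat version):
processors:
  - drop_event:
      when:
        equals:
          levelname: DEBUG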
I am using yaml2json for the first time. My OS is Windows 7 and I am using git bash.
Maybe I am missing something very basic; can you please help/guide me here?
I tried sending the output of the bash text-processing command to test.yml, and I can see the test.yml file is created properly. But once I feed it as an input to yaml2json, it parses just the first line ("version": 1) and exits without any error.
However, if I try to convert the test.yml contents online via http://yamltojson.com/, the resulting .json is correct.
Following are the contents of the generated test.yml file:
version: 1
layout: post
lang: en
slug: "checklist"
type: "modal"
title: "Checklist"
published: "true"
categories: "mobile"
tags: "mobile"
action:
  - title: "Disguise Now"
    link: "close"
  - title: "Cancel"
    link: "home-ready"
    status: disabled
checklist:
  - title: "Review security plan and update contacts regularly"
I encountered the same problem and solved it by starting the document with
---
So for example ...
---
version: 1
layout: post
lang: en
slug: "checklist"
type: "modal"
title: "Checklist"
published: "true"
categories: "mobile"
tags: "mobile"
... works well, but may not solve your problem because you are using a generated yaml file.
There are more problems with yaml2json (e.g. interpreting the sign of a negative number as a list-item indicator). So in many cases I use a simple Python one-liner (Python 2.x, since it uses the print statement) instead of yaml2json. The only disadvantage I can see is that, as opposed to yaml2json, the order of dictionary entries is not preserved, but that's just a cosmetic issue, not a logical one:
python -c 'import sys, json, yaml; print json.dumps(yaml.load(sys.stdin), indent=4)' < myyamlfile.yaml
yaccob's solution worked for me. I just had to add the Loader parameter, yaml.load(sys.stdin, Loader=yaml.FullLoader), to avoid the deprecation warning:
python2 -c 'import sys, json, yaml; print json.dumps(yaml.load(sys.stdin, Loader=yaml.FullLoader), indent=4)' < sample.yaml
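If only Python 3 is available (an assumption about your setup), the same idea works with the print function and safe_load:
python3 -c 'import sys, json, yaml; print(json.dumps(yaml.safe_load(sys.stdin), indent=4))' < sample.yaml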
Ansible provides a failed_when task keyword, allowing users to specify certain failure conditions on their tasks, e.g. a certain string being found in stdout or stderr.
I am trying to do the opposite: I'd like my tasks not to fail if any of a set of strings is found in stdout or stderr. In other words, I'd like something approaching the functionality of a hypothetical passed_when keyword.
I still want it to pass normally when the return code is 0.
But if it would fail (rc != 0) then it should first check for the occurrence of some string.
I.e. if some string is found it passes no matter what.
My reasoning goes like this: there are many reasons why the task could fail, but some of these, depending on the output, I do not consider a failure in the current context.
Does anybody have a good idea how this can be achieved?
Have a look here:
Is there some Ansible equivalent to "failed_when" for success
- name: ping pong redis
  command: redis-cli ping
  register: command_result
  failed_when:
    - "'PONG' not in command_result.stderr"
    - "command_result.rc != 0"
This will not fail if the return code is 0 and there is no 'PONG' in stderr.
It will not fail if there is 'PONG' in stderr.
So it passes if any item of the list is false.
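Put differently, items in a failed_when list are ANDed together, so the list above is equivalent to the single expression:
failed_when: "command_result.rc != 0 and 'PONG' not in command_result.stderr"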
Your original question was phrased like this (using boolean logic to make it easier):
Succeed a command if a set of strings is found in stdout or stderr
Rephrasing your logic:
fail if a set of strings is NOT found in stdout or stderr. Using this logic it's easy to do with failed_when. Here's a snippet:
---
- name: Test failed_when as succeed_if
  hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    - name: "'succeed_if' set of strings in stdout"
      command: /bin/echo succeed1
      register: command_result
      failed_when: "command_result.stdout not in ['succeed1',]"

    - name: "'succeed_if' set of strings in stdout (multiple values)"
      command: /bin/echo succeed2
      register: command_result
      failed_when: "command_result.stdout not in ['succeed1', 'succeed2']"

    - name: "'succeed_if' set of strings in stderr (multiple values)"
      shell: ">&2 /bin/echo succeed2 "
      register: command_result
      failed_when: "command_result.stderr not in ['succeed1', 'succeed2']"

    - name: "'succeed_if' set of strings in stderr (multiple values) or rc != 0"
      shell: ">&2 /bin/echo succeed2; /bin/false"
      register: command_result
      failed_when: "command_result.stderr not in ['succeed1', 'succeed2'] and command_result.rc != 0"

# vim: set ts=2 sts=2 fenc=utf-8 expandtab list:
Also, the documentation you are probably looking for is Jinja2 Expressions.
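Combining both answers for the original requirement (pass when rc == 0, or when any of a set of acceptable strings shows up in the output), here is a sketch with a hypothetical command and marker string:
- name: command that is allowed to "fail" in known-harmless ways
  command: /usr/bin/some-command        # hypothetical command
  register: command_result
  # the list items are ANDed: the task only fails when rc is non-zero AND the
  # marker string is absent from both stdout and stderr
  failed_when:
    - command_result.rc != 0
    - "'known harmless error' not in command_result.stdout"
    - "'known harmless error' not in command_result.stderr"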