How to specify inputs on command line - filebeat

One can specify a filebeat input with this config:
filebeat.inputs:
- type: log
  paths:
    - /path/to/dir/*
I tried doing the same on the command line:
$ filebeat run -E filebeat.inputs=[{type=log,paths=['/path/to/dir/*']}]
Exiting: Error reading config file: required 'object', but found 'string' in field 'filebeat.inputs.0' (source:'command line flag')

There are two issues:
1. The -E argument needs to be quoted.
2. The = characters inside the object need to be :.
Here goes:
filebeat -E "filebeat.inputs=[{type:log,paths:['/path/to/dir/*']}]"
Note that run is not needed (it is the default command).

How to send multiple stdin lines in Ansible

I have a .bat file like this:
@echo off
echo please enter Hostname(For Example 127.0.0.1 OR . OR PC-Name)
set /p ServerName="Server Name: "
echo "---------------------ServerName-----------------------------"
echo %ServerName%
echo "-------------------------------------------------"
cls
echo Please Select Authentication Mode:
echo 1- Windows Authentication
echo 2- SQL Server Authentication
set /p AuthMode="Please Enter 1 OR 2: "
I use stdin with win_shell, but it only supplies one input, and AuthMode is always null.
My Ansible playbook.yml:
- name: Script
  win_shell: D:\Myscript.bat
  args:
    stdin: 127.0.0.1
Is there any solution?
Try passing a multiline string:
- name: Script
  win_shell: D:\Myscript.bat
  args:
    stdin: |
      127.0.0.1
      2
You can check out https://yaml-multiline.info/ for an explanation of how to do multiline strings in YAML (and why the above | syntax means "keep newlines and have a single newline at the end").

extends in GitLab CI pipeline

I'm trying to include a file in which I declare some repetitive jobs, using extends.
I always get this error: did not find expected key while parsing a block
This is the template file:
.deploy_dev:
stage: deploy
image: nexus
script:
- ssh -i ~/.ssh/id_rsa -o "StrictHostKeyChecking=no" sellerbot@sb-dev -p 10290 'sudo systemctl restart mail.service'
only:
- dev
This is the main file:
include:
  - project: 'sellerbot/gitlab-ci'
    ref: master
    file: 'deploy.yml'

deploy_dev:
  extends: .deploy_dev
Can anyone help me, please?
It looks like just stage: deploy has to be indented. In this case it's a good idea to use the GitLab CI lint tool to check whether the pipeline code is valid, or just a YAML validator. When I checked the section from the template file in a YAML linter, I got:
(<unknown>): mapping values are not allowed in this context at line 3 column 8
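For reference, here is the template with every job key indented under .deploy_dev. The question doesn't show the exact original indentation, so this assumes the conventional two-space GitLab CI layout:

```yaml
.deploy_dev:
  stage: deploy
  image: nexus
  script:
    - ssh -i ~/.ssh/id_rsa -o "StrictHostKeyChecking=no" sellerbot@sb-dev -p 10290 'sudo systemctl restart mail.service'
  only:
    - dev
```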

Using Apache's CustomLog to regex-replace, getting an unterminated `s' command error

I'm trying to sanitise my Apache logs of sensitive data that is passed around in query-string parameters. I'm aware this is not good; it cannot be changed.
Following from this question: https://stackoverflow.com/a/9473943/1046387
This is in my apache.conf
CustomLog "|/bin/sed -u -E s/'api_key=[^& \t\n]*'/'api_key=\[FILTERED\]'/g >> /var/log/apache2/access.log" combined
I've been getting this error:
AH00106: piped log program '/bin/sed -u -E s/'api_key=[^& \\t\\n]*'/'api_key=\\[FILTERED\\]'/g >> /var/log/apache2/access.log' failed unexpectedly
/bin/sed: -e expression #1, char 14: unterminated `s' command
Despite being able to run the command on the box directly:
$ echo "api_key=343" | /bin/sed -u -E s/'api_key=[^& \t\n]*'/'api_key=\[FILTERED\]'/g
api_key=[FILTERED]
It seems like Apache isn't handing the command over to sed properly, so sed is missing some of the arguments. Is there some problem with escape sequences or quoting?
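As a point of comparison, the same substitution works when the whole sed expression is quoted as a single argument, instead of splitting the quotes mid-argument as in the CustomLog line above. A sketch of the sanitiser on its own (the sample query string is made up):

```shell
# Quote the entire sed expression so the shell passes it to sed as one
# argument; everything after api_key= up to the next '&' is replaced.
echo "api_key=343&user=bob" | sed -u -E 's/api_key=[^& \t\n]*/api_key=[FILTERED]/g'
# → api_key=[FILTERED]&user=bob
```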

Ansible lineinfile module's line attribute is not expanding an environment variable on the server

Hi, I have a task as follows:
- name: Replace log directory in configuration
  lineinfile:
    path: $HOME/amsible_test/test.txt
    regexp: '^dataDir='
    line: 'dataDir=$HOME/.zookeeper_log'
It runs fine, but the issue is that it writes the line literally as dataDir=$HOME/.zookeeper_log.
As per my understanding, $HOME should be expanded to /home/username on Ubuntu 16.04, so it should write dataDir=/home/username/.zookeeper_log, but it is not doing so.
Any suggestion as to what I am doing wrong? I tried many alternatives for string expansion, but no luck.
Thanks in advance.
Hi, this worked for me:
- name: test connection
  shell: echo $HOME
  register: user_home

- name: Replace log directory in configuration
  lineinfile:
    path: $HOME/amsible_test/test.txt
    regexp: '^dataDir='
    line: 'dataDir={{ user_home.stdout }}/.zookeeper_log'
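An alternative that avoids the extra task: when fact gathering is enabled, the remote user's environment is exposed as ansible_env, so the same task can be written as follows (a sketch reusing the question's paths):

```yaml
- name: Replace log directory in configuration
  lineinfile:
    path: "{{ ansible_env.HOME }}/amsible_test/test.txt"
    regexp: '^dataDir='
    line: 'dataDir={{ ansible_env.HOME }}/.zookeeper_log'
```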

Getting ERROR 1000: Error during parsing. Lexical error

I wrote a Pig script as follows:
my_script.pig:
bag_1 = LOAD '$INPUT' USING PigStorage('|') AS (LN_NR:chararray,ET_NR:chararray,ET_ST_DT:chararray,ED_DT:chararray,PI_ID:chararray);
bag_2 = LIMIT bag_1 $SIZE;
DUMP bag_2;
and made a parameter file:
my_param.txt:
INPUT = hdfs://0.0.0.0:8020/user/training/example
SIZE = 10
Now I am calling the script with:
pig my_param.txt my_script.pig
but I get this error:
ERROR 1000: Error during parsing. Lexical error
Any suggestions?
I think you need to provide the parameter file using the -m or -param_file option. Refer to the help documentation below.
$ pig --help
Apache Pig version 0.11.0-cdh4.7.1 (rexported)
compiled Nov 18 2014, 09:08:23
USAGE: Pig [options] [-] : Run interactively in grunt shell.
Pig [options] -e[xecute] cmd [cmd ...] : Run cmd(s).
Pig [options] [-f[ile]] file : Run cmds found in file.
options include:
-4, -log4jconf - Log4j configuration file, overrides log conf
-b, -brief - Brief logging (no timestamps)
-c, -check - Syntax check
-d, -debug - Debug level, INFO is default
-e, -execute - Commands to execute (within quotes)
-f, -file - Path to the script to execute
-g, -embedded - ScriptEngine classname or keyword for the ScriptEngine
-h, -help - Display this message. You can specify topic to get help for that topic.
properties is the only topic currently supported: -h properties.
-i, -version - Display version information
-l, -logfile - Path to client side log file; default is current working directory.
-m, -param_file - Path to the parameter file
-p, -param - Key value pair of the form param=val
-r, -dryrun - Produces script with substituted parameters. Script is not executed.
-t, -optimizer_off - Turn optimizations off. The following values are supported:
SplitFilter - Split filter conditions
PushUpFilter - Filter as early as possible
MergeFilter - Merge filter conditions
PushDownForeachFlatten - Join or explode as late as possible
LimitOptimizer - Limit as early as possible
ColumnMapKeyPrune - Remove unused data
AddForEach - Add ForEach to remove unneeded columns
MergeForEach - Merge adjacent ForEach
GroupByConstParallelSetter - Force parallel 1 for "group all" statement
All - Disable all optimizations
All optimizations listed here are enabled by default. Optimization values are case insensitive.
-v, -verbose - Print all error messages to screen
-w, -warning - Turn warning logging on; also turns warning aggregation off
-x, -exectype - Set execution mode: local|mapreduce, default is mapreduce.
-F, -stop_on_failure - Aborts execution on the first failed job; default is off
-M, -no_multiquery - Turn multiquery optimization off; default is on
-P, -propertyFile - Path to property file
$
You are not using the command correctly.
To use a parameter file, pass -param_file on the command line:
pig -param_file <file> pig_script.pig
You can find more details under Parameter Substitution in the Pig documentation.