Hi all.
I am using logstash-1.4.2 to consume messages stored in my ActiveMQ (with the Stomp plugin).
In my activemq.xml config file, I have this line:
<transportConnector name="stomp" uri="stomp://0.0.0.0:61613?maximumConnections=1000&amp;wireFormat.maxFrameSize=104857600"/>
When I run Logstash, I get this error:
C:\logstash\logstash-1.4.2\bin>logstash agent -f logstashconfig.conf
+---------------------------------------------------------+
| An unexpected error occurred. This is probably a bug.   |
| You can find help with this problem in a few places:    |
|                                                         |
|   * chat: #logstash IRC channel on freenode irc.        |
|     IRC via the web: http://goo.gl/TI4Ro                |
|   * email: logstash-users@googlegroups.com              |
|   * bug system: https://logstash.jira.com/              |
|                                                         |
+---------------------------------------------------------+
The error reported is:
Couldn't find any input plugin named 'stomp'. Are you sure this is correct? Trying to load the stomp input plugin resulted in this error: no such file to load -- logstash/inputs/stomp
In my logstashconfig.conf, I have:
input {
  stomp {
    password => "admin"
    user => "admin"
  }
}
output {
  file {
    path => "C:\logstash\logstash-1.4.2\cosumedfromstomp.txt"
  }
}
If I consume from RabbitMQ with the following logstashconfig.conf, it works correctly (here is my RabbitMQ version of the config):
input {
  rabbitmq {
    host => "amqp"
    queue => "elasticsearch"
    key => "elasticsearch"
    exchange => "elasticsearch"
    type => "all"
    durable => true
    auto_delete => false
    exclusive => false
    format => "json_event"
    debug => false
  }
}
output {
  file {
    path => "C:\logstash\logstash-1.4.2\cosumedfromstomp.txt"
  }
}
I don't have any trouble with the RabbitMQ version of my Logstash config; the text file it creates looks correct.
My questions are:
1. Did I configure my stomp input wrong? Since I don't have a "queue" name in my config, could that be the problem?
2. Or did I not set up the stomp plugin correctly? If that is the reason, it would not really be a Logstash question...
Thanks
You need to install the Contributed Plugins. These have been removed from the core download for Logstash. The Stomp plugin is located in the contributed plugins:
Stomp
Milestone: 2
This is a community-contributed plugin! It does not ship with logstash
by default, but it is easy to install! To use this, you must have
installed the contrib plugins package.
Directions here:
http://logstash.net/docs/1.4.2/contrib-plugins
Hosted on GitHub here:
https://github.com/elasticsearch/logstash-contrib
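For reference, a rough sketch of the install those directions describe (the exact commands may differ on Windows; treat this as an assumption and follow the contrib-plugins page if it disagrees):

cd C:\logstash\logstash-1.4.2
bin\plugin install contrib

If the plugin script is not available on your Windows build, the manual alternative described on that page is to download the matching logstash-contrib-1.4.2 archive and extract it over your Logstash directory. Once the contrib package is installed, your existing stomp input block should load; you will likely also need host, port and destination settings pointing at your ActiveMQ broker.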
output {
  kafka {
    topic_id => "myTopic"
    bootstrap_servers => "127.0.0.1:9092"
    value_serializer => "io.confluent.kafka.serializers.KafkaAvroSerializer"
  }
}
[[main]-pipeline-manager] kafka - Unable to create Kafka producer from given configuration {:kafka_error_message=>org.apache.kafka.common.config.ConfigException: Invalid value io.confluent.kafka.serializers.KafkaAvroSerializer for configuration value.serializer: Class io.confluent.kafka.serializers.KafkaAvroSerializer could not be found., :cause=>nil}
Has anyone made Logstash work with io.confluent.kafka.serializers.KafkaAvroSerializer?
You'll need to use the ByteArraySerializer and install this codec:
https://github.com/revpoint/logstash-codec-avro_schema_registry
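Roughly, the output then looks like the sketch below. This is only a sketch: the schema registry URL and schema path are assumptions, and the codec option names (endpoint, schema_uri, register_schema) come from that codec's README and may vary by version. Install it first with bin/logstash-plugin install logstash-codec-avro_schema_registry.

output {
  kafka {
    topic_id => "myTopic"
    bootstrap_servers => "127.0.0.1:9092"
    # send raw bytes; the codec below produces the Confluent wire format
    value_serializer => "org.apache.kafka.common.serialization.ByteArraySerializer"
    codec => avro_schema_registry {
      endpoint => "http://localhost:8081"       # assumed schema registry address
      schema_uri => "/path/to/my_schema.avsc"   # hypothetical schema file
      register_schema => true
    }
  }
}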
ES version: 2.3.5, Logstash: 2.4
Attempted to send bulk request to Elasticsearch, configured at ["xxxx.com:9200"],
an error occurred and it failed! Are you sure you can reach Elasticsearch from this machine using the configuration provided?
Error:
"SSL peer shut down incorrectly", Manticore::ClientProtocolException
My logstash Output section:
output {
  stdout { codec => rubydebug }
  stdout { codec => json }
  elasticsearch {
    user => "xxxx"
    password => "xxx"
    index => "wrike_jan"
    document_type => "data"
    hosts => ["xxxx.com:9200"]
    ssl => true
    ssl_certificate_verification => false
    truststore => "elasticsearch-2.3.5/config/truststore.jks"
    truststore_password => "83dfcdddxxxxx"
  }
}
The Logstash config runs, but it fails to send the data to ES.
Could you please suggest a fix? Thank you.
Be careful about http vs. https in the URL: in the case above I was sending data over https, but my ES was listening on plain http.
Later, upgrading the Logstash version also resolved sending data to ES.
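In other words, the scheme on both sides has to match. A minimal sketch of the non-TLS variant (assuming ES really is listening on plain http; credentials and host are the placeholders from the question):

output {
  elasticsearch {
    user => "xxxx"
    password => "xxx"
    index => "wrike_jan"
    document_type => "data"
    hosts => ["xxxx.com:9200"]
    # plain http: drop the ssl/truststore settings entirely
    ssl => false
  }
}

If ES is supposed to be reached over https instead, keep ssl => true and the truststore settings, and make sure the cluster actually has TLS enabled on port 9200.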
I'm trying to install RabbitMQ via Puppet, using the puppetlabs-rabbitmq module. It also has a section for configuring queues and exchanges, which are native types, but I can't figure out how to use these native types.
My code for the RabbitMQ installation:
class rabbitmq-concrete {
  $tools = ["vim-enhanced", "mc"]
  package { $tools: ensure => "installed" }

  $interface = "enp0s8"
  $address = inline_template("<%= scope.lookupvar('::ipaddress_${interface}') -%>")

  class { 'rabbitmq':
    config_cluster    => true,
    cluster_nodes     => ['rml01', 'rml02'],
    cluster_node_type => 'disc',
    manage_repos      => true,
    node_ip_address   => $address,
    erlang_cookie     => 'rmq_secret',
  }

  rabbitmq_exchange { "logging@${node_name}":
    type   => 'topic',
    ensure => present,
  }

  rabbitmq_queue { "logging@${node_name}":
    durable     => true,
    auto_delete => false,
    arguments   => {
      'x-message-ttl'          => 123,
      'x-dead-letter-exchange' => 'other',
    },
    ensure      => present,
  }

  rabbitmq_binding { "logging@logging@${node_name}":
    destination_type => 'logging',
    routing_key      => '#',
    arguments        => {},
    ensure           => present,
  }
}
include rabbitmq-concrete
I get the following error:
==> rml01: Error: Puppet::Parser::AST::Resource failed with error ArgumentError: Invalid resource type rabbitmq_queue at /tmp/vagrant-puppet-2/manifests/site.pp:35 on node rml01
==> rml01: Wrapped exception:
==> rml01: Invalid resource type rabbitmq_queue
==> rml01: Error: Puppet::Parser::AST::Resource failed with error ArgumentError: Invalid resource type rabbitmq_queue at /tmp/vagrant-puppet-2/manifests/site.pp:35 on node rml01
Note: when I leave out these native types, the RabbitMQ installation works well.
How do I use native types to configure rabbitmq_queue, rabbitmq_exchange and rabbitmq_binding?
Do you have the required prerequisites? You need the following packages from the Forge:
puppetlabs/stdlib
stahnma/epel
nanliu/staging
garethr/erlang
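If any of these are missing, installing them from the Forge is roughly the following (module slugs as published on the Forge; pin versions as needed):

puppet module install puppetlabs-stdlib
puppet module install stahnma-epel
puppet module install nanliu-staging
puppet module install garethr-erlang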
To your manifest I added:
include epel
include staging
class { 'erlang': epel_enable => true}
Your question is dated 13th Feb, yet looking on the Puppet Forge those features were only added to that module in the most recent release on 10th March in version 5.1.0.
Full changelog => https://forge.puppetlabs.com/puppetlabs/rabbitmq/changelog
Abridged:
"2015-03-10 - Version 5.1.0
Summary
This release adds several features for greater flexibility in configuration of rabbitmq, includes a number of bug fixes, and bumps the minimum required version of puppetlabs-stdlib to 3.0.0.
Features
Add rabbitmq_queue and rabbitmq_binding types"
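So upgrading the module should make the rabbitmq_queue, rabbitmq_exchange and rabbitmq_binding types available, roughly:

puppet module upgrade puppetlabs-rabbitmq --version 5.1.0

(or pin that version in your Puppetfile / librarian-puppet setup, whichever you use to manage modules).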
Can ActiveMQ work with Logstash?
I am switching from RabbitMQ to ActiveMQ, and trying to make Logstash work with ActiveMQ.
In my previous RabbitMQ setup, I had something like:
input {
  rabbitmq {
    host => "hostname"
    queue => "queue1"
    key => "key1"
    exchange => "ex1"
    type => "all"
    durable => true
    auto_delete => false
    exclusive => false
    format => "json_event"
    debug => false
  }
}
filter {....}
On the Logstash documentation page, ActiveMQ is not listed as a supported input:
http://logstash.net/docs/1.4.1/
Any suggestions?
You can probably use the STOMP input (I have not tried it myself); ActiveMQ supports STOMP.
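Something along these lines might work once the stomp input is installed (a sketch only: host and destination are assumptions, 61613 is ActiveMQ's default STOMP port, and the stomp transportConnector has to be enabled in activemq.xml):

input {
  stomp {
    host => "activemq-host"          # wherever your broker runs
    port => 61613
    destination => "/queue/queue1"   # hypothetical, mirroring the RabbitMQ queue above
    user => "admin"
    password => "admin"
  }
}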
I've got Logstash running and successfully reading in a file.
RabbitMQ is running; I'm watching its log, and I can see the web interface.
I've configured Logstash to output to a RabbitMQ exchange... I think!
Here's the problem: nothing ever gets posted to the exchange, as seen in the web interface.
Any ideas?
My output config:
output {
  rabbitmq {
    codec => plain
    host => localhost
    exchange => yomtvraps
    exchange_type => direct
  }
  file { path => "/tmp/heartbeat-from-logstash.log" }
}
UPDATE: I'm watching the rabbit log with
tail -F /usr/local/var/log/rabbitmq/rabbit@localhost.log
As it turns out, the problem was that there was no routing key set for the exchange and queue.
A working config is:
output {
  rabbitmq {
    codec => plain
    host => localhost
    exchange => yomtvraps
    exchange_type => direct
    key => yomtvraps
    # these are defaults but you never know...
    durable => true
    port => 5672
    user => "guest"
    password => "guest"
  }
}
Here's some sample receiver code (using the Ruby "Bunny" gem):
require "bunny"

conn = Bunny.new(:automatically_recover => false)
conn.start
ch = conn.create_channel
q = ch.queue("yomtvraps")
exchange = ch.direct("yomtvraps", :durable => true)

begin
  puts " [*] Waiting for messages. To exit press CTRL+C"
  q.bind(exchange, :routing_key => "yomtvraps").subscribe(:block => true) do |delivery_info, properties, body|
    puts " [x] Received #{body}"
  end
rescue Interrupt => _
  conn.close
  exit(0)
end
Your rabbitmq output's parameters seem incomplete; the username, password and port have not been configured.
You can configure two outputs, one to rabbitmq and the other to a file, to verify that the log is created and that Logstash itself is OK.
Pay attention to the Logstash version (and the rabbitmq plugin version); it gave me a lot of trouble in my earlier trials (Logstash to another Redis server, etc.).
You could also check RabbitMQ's log.
Run ps -ef | grep erl and you will find the log file's path in the arguments.
Make sure that RabbitMQ's web management plugin is enabled and the firewall is configured correctly, then open the web manager at ipaddress:15672.
Check that the exchange's type is correct (in this case 'direct' may be the right choice), that your message consumer is configured correctly, and that your consumer's queue has been bound to the exchange correctly.
Try posting a message to your consumer through the web manager and make sure the consumer works.
Then monitor your queue while Logstash pushes logs to your consumer.
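For the exchange/queue/binding checks, something like the following on the RabbitMQ host (rabbitmqctl ships with the server) shows whether the objects from the config above actually exist and are wired together; the column names are standard rabbitmqctl info items:

rabbitmqctl list_exchanges name type
rabbitmqctl list_queues name messages consumers
rabbitmqctl list_bindings source_name destination_name routing_key

If the binding row for your exchange and queue is missing, or the routing key does not match the one in the Logstash output, that is where the messages are being dropped.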