This article describes an approach to combining multiple events in Logstash; it may serve as a useful reference for anyone facing the same problem.

Problem description

I have a Logstash configuration in which I read simple lines from a graphite input (it could just as well be a plain tcp input, if that helps) and forward them to RabbitMQ via AMQP.

input {
  graphite {
    host => localhost
    type => carbon
    port => 22003
  }
}

output {
  rabbitmq {
    codec => json
    host => 'localhost'
    port => 5672
    user => 'guest'
    password => 'guest'
    vhost => '/'
    exchange_type => topic
    key => '%{type}'
    persistent => true
    durable => true
    ssl => false
    verify_ssl => false
    workers => 1
    exchange => 'metrics'
  }
}

Now I would like to optimize the payload/overhead ratio by packing more than one line from the graphite input into a single AMQP message.

I have looked at filters such as collate and aggregate, but they don't seem to do exactly what I need. What I'm looking for is a transport format in which one AMQP message contains something like 20 or 30 lines from this input.

Recommended answer

I figured it out myself; I'm now using multiline as the input codec:

tcp {
  host => localhost
  port => 22003
  type => carbon
  # every line ending in "\r" is merged with the line that follows,
  # so an event is only flushed once max_lines lines have been
  # collected (or a line arrives that does not match the pattern)
  codec => multiline {
    pattern => "\r"
    max_lines => 100
    what => "next"
  }
}
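
For context, here is a minimal sketch of the complete pipeline this results in, assuming the rabbitmq output from the question is kept unchanged (all output values below are copied from the question's config and are not part of the answer itself):

input {
  tcp {
    host => localhost
    port => 22003
    type => carbon
    codec => multiline { pattern => "\r" max_lines => 100 what => "next" }
  }
}

output {
  rabbitmq {
    codec => json
    host => 'localhost'
    port => 5672
    user => 'guest'
    password => 'guest'
    vhost => '/'
    exchange => 'metrics'
    exchange_type => topic
    key => '%{type}'
    persistent => true
    durable => true
    ssl => false
    verify_ssl => false
    workers => 1
  }
}

With this setup, each event that reaches the rabbitmq output should carry a batch of joined carbon lines rather than a single one, which is roughly the 20 or 30 lines per AMQP message the question was asking for.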

That concludes this article on combining multiple events in Logstash; we hope the recommended answer is helpful.
