Parsing a JSON file into Logstash

Problem Description

Hi, I am trying to send a JSON file with multiple objects to Elasticsearch via Logstash so I can display the data using Kibana. I have researched this extensively and simply cannot understand how to format the data correctly for use in Kibana.

I have tried using different filters, such as json, date, and grok.

The issue is probably in how I'm using these filters, as I don't fully understand their setup.

Here is a sample line of the input JSON file:

{"time":"2015-09-20;12:13:24","bug_code":"tr","stacktrace":"543534"},

I want to use this format to display the data in Kibana and to sort the many objects by their "time" field.
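Outside of Logstash, the same sort-by-"time" idea can be sketched in a few lines of Python; the second record below is a hypothetical extra line added only for illustration, and `"%Y-%m-%d;%H:%M:%S"` mirrors the non-standard `YYYY-MM-dd;HH:mm:ss` format used in the file:

```python
import json
from datetime import datetime

# One real sample line plus a hypothetical second record for the sort demo.
lines = [
    '{"time":"2015-09-20;12:13:24","bug_code":"tr","stacktrace":"543534"}',
    '{"time":"2015-09-20;09:01:05","bug_code":"io","stacktrace":"111111"}',
]

events = [json.loads(line) for line in lines]

# "%Y-%m-%d;%H:%M:%S" is the strptime equivalent of YYYY-MM-dd;HH:mm:ss.
events.sort(key=lambda e: datetime.strptime(e["time"], "%Y-%m-%d;%H:%M:%S"))

print([e["time"] for e in events])
# → ['2015-09-20;09:01:05', '2015-09-20;12:13:24']
```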

The following is my current filter section:

filter {
    date {
        match => ["time", "YYYY-MM-dd;HH:mm:ss Z" ]
        timezone => "America/New_York"
        locale => "en"
        target => "@timestamp"
    }
    grok {
        match => ["time", "%{TIMESTAMP_ISO8601:timestamp}"]
    }
}

At this point I know the grok is wrong because I get a "_grokparsefailure" tag, but how can I figure out the correct way to use grok? Or is there a simple way to sort the data by the given timestamp rather than by the processed timestamp assigned when the data is sent through?
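Two things likely contribute to the failure: without a json filter the "time" field doesn't exist yet for grok to match against, and even on the raw text, TIMESTAMP_ISO8601 expects a 'T' or space between date and time, not the ';' used here. A rough check with a simplified regex approximation of that grok pattern (the real pattern is more permissive about separators within the time part):

```python
import re

# Simplified sketch of grok's TIMESTAMP_ISO8601: date, then 'T' or a
# space, then time. The actual grok pattern allows more variations,
# but shares the [T ] separator requirement.
iso_like = re.compile(r"\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}")

print(bool(iso_like.search("2015-09-20;12:13:24")))  # → False, ';' separator
print(bool(iso_like.search("2015-09-20 12:13:24")))  # → True
```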

Here is what the output currently shows:

{
       "message" => "{\"time\":\"2015-09-20;12:13:24\",\"bug_code\":\"tr\",\"stacktrace\":\"543534\"},\r",
      "@version" => "1",
    "@timestamp" => "2015-11-23T09:54:50.274Z",
          "host" => "<my_computer>",
          "path" => "<path_to_.json>",
          "type" => "json",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}

Any advice would be greatly appreciated.

Recommended Answer

You're almost there; I could get it working with a few tweaks.

First, you need to add a json{} filter in the first position. Then you need to change the date pattern to YYYY-MM-dd;HH:mm:ss (your input has no timezone offset, so the trailing Z in the pattern prevents a match), and finally you can remove the grok filter at the end. Your filter configuration would look like this:

filter {
    json {
        source => "message"
    }
    date {
        match => ["time", "YYYY-MM-dd;HH:mm:ss" ]
        timezone => "America/New_York"
        locale => "en"
        target => "@timestamp"
    }
}
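The America/New_York setting is what turns the local 12:13:24 into a UTC @timestamp. A quick Python equivalent of what the date filter does with the corrected pattern (note September 20 falls in daylight saving time, so the offset is UTC-4):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Parse the "time" field with the corrected pattern, attach the
# America/New_York zone, then convert to UTC as the date filter does.
local = datetime.strptime("2015-09-20;12:13:24", "%Y-%m-%d;%H:%M:%S")
local = local.replace(tzinfo=ZoneInfo("America/New_York"))
utc = local.astimezone(ZoneInfo("UTC"))

print(utc.isoformat())  # → 2015-09-20T16:13:24+00:00
```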

The parsed event for your sample JSON line would then look like this:

{
       "message" => "{\"time\":\"2015-09-20;12:13:24\",\"bug_code\":\"tr\",\"stacktrace\":\"543534\"}",
      "@version" => "1",
    "@timestamp" => "2015-09-20T16:13:24.000Z",
          "host" => "iMac.local",
          "time" => "2015-09-20;12:13:24",
      "bug_code" => "tr",
    "stacktrace" => "543534"
}
