Why is the fluentd JSON parser not working?

Question

I'm using the image gcr.io/google-containers/fluentd-elasticsearch (v2.3.1) to have fluentd collect some logs and send them to Elasticsearch. I'm using the following configuration for fluentd:

<source>
  type forward
  port {{.Values.fluentd.forward.port}}
  bind 0.0.0.0
</source>
<filter kube.**>
  @type parser
  @log_level debug
  key_name log
  reserve_data true
  remove_key_name_field true
  <parse>
    @type json
    time_key time
    time_type string
    time_format %iso8601
  </parse>
</filter>
<filter kube.**>
  @type record_transformer
  @log_level debug
  enable_ruby
  <record>
    kubernetes ${record["kubernetes"]["cluster_name"] = "{{.Values.clusterName}}"; record["kubernetes"] }
    logtrail  {"host": "${record['kubernetes']['pod_name']}", "program":"${record['kubernetes']['container_name']}"}
  </record>
</filter>
<filter kube.**>
  @type concat
  key log
  stream_identity_key kubernetes["docker_id"]
  multiline_end_regexp /\n$/
  separator ""
</filter>

The configuration listed above was supposed to parse the JSON associated with the key log, but the JSON is not getting parsed at all. Below is the event I get after fluentd does its filtering; I expected the JSON associated with the log key to be parsed.

{"kubernetes":{"pod_name":"api-dummy-dummy-vcpqr","namespace_name":"dummy","pod_id":"dummy","labels":{"name":"api-dummy","pod-template-hash":"dummy","tier":"dummy"},"host":"dummy","container_name":"api-dummy","docker_id":"dummy","cluster_name":"dummy Dev"},"log":"{\"name\":\"dummy\",\"json\":false,\"hostname\":\"api-dummy-dummy-vcpqr\",\"pid\":24,\"component\":\"dummy\",\"level\":30,\"version\":\"1.0\",\"timestamp\":1539645856126}","stream":"stdout","logtrail":{"host":"api-dummy-dummy-vcpqr","program":"api-dummy"}}

I have spent more than 3 days trying to find a solution for this. I even tried https://github.com/edsiper/fluent-plugin-docker, but that did not help: although the plugin did parse the JSON, the parsed log messages were then rejected by my Elasticsearch.

Accepted answer

Your log field is not valid JSON:

{
  "kubernetes": {
    "pod_name": "api-dummy-dummy-vcpqr",
    "namespace_name": "dummy",
    "pod_id": "dummy",
    "labels": {
      "name": "api-dummy",
      "pod-template-hash": "dummy",
      "tier": "dummy"
    },
    "host": "dummy",
    "container_name": "api-dummy",
    "docker_id": "dummy",
    "cluster_name": "dummy Dev"
  },
  "log": "{\"name\":\"dummy\",\"json\":false,\"hostname\":\"api-dummy-dummy-vcpqr\",\"pid\":24,\"component\":\"dummy\",\"level\":30,\"version\":\"1.0\",\"timestamp\":1539645856126",
  "stream": "stdout",
  "logtrail": {
    "host": "api-dummy-dummy-vcpqr",
    "program": "api-dummy"
  }
}
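Note that the string in the log field above is cut off before its closing brace (Docker splits long log lines across multiple events), which is why the parser filter cannot handle it on its own. A small illustrative sketch of the failure, using shortened stand-in strings rather than the actual log value:

```python
import json

# Stand-in for a log line that Docker split mid-object: the closing
# brace is missing, so the fragment is not valid JSON by itself.
fragment = '{"name":"dummy","pid":24,"level":30'

try:
    json.loads(fragment)
    parsed_alone = True
except json.JSONDecodeError:
    parsed_alone = False

print("fragment parses on its own:", parsed_alone)  # False

# After concatenating the remainder of the line, parsing succeeds.
whole = fragment + ',"version":"1.0"}'
print(json.loads(whole)["name"])  # dummy
```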

You should concatenate the log field before parsing it as JSON; in other words, the concat filter needs to run before the parser filter so the split log lines are reassembled first.
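A minimal sketch of the reordered filter chain, reusing the tags and keys from the question's configuration (not tested against your cluster): the concat filter comes first and stitches the split pieces of the log field back together, and only then does the parser filter attempt the JSON parse.

```
<filter kube.**>
  @type concat
  key log
  stream_identity_key kubernetes["docker_id"]
  multiline_end_regexp /\n$/
  separator ""
</filter>
<filter kube.**>
  @type parser
  key_name log
  reserve_data true
  remove_key_name_field true
  <parse>
    @type json
    time_key time
    time_type string
    time_format %iso8601
  </parse>
</filter>
```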
