I am using Filebeat to ship Apache logs from a Windows machine to my Logstash server on a Linux EC2 instance, and from there to Elasticsearch and Kibana.

Elasticsearch and Kibana: 5.3
Logstash and Filebeat: 5.3

filebeat.yml:

filebeat.prospectors:

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    #- c:\programdata\elasticsearch\logs\*
    - C:\Users\Sagar\Desktop\elastic_test4\data\log\*

output.logstash:
  # The Logstash hosts
  hosts: ["10.101.00.11:5044"]
  template.name: "filebeat-poc"
  template.path: "filebeat.template.json"
  template.overwrite: false

logstash.conf on the Ubuntu Linux EC2 instance:
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
      match => {
        "message" => "%{COMBINEDAPACHELOG}"
      }
  }
  geoip {
      source => "clientip"
      target => "geoip"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
}
output {
  elasticsearch {
    hosts => ["elastic-instance-1.es.amazonaws.com:80"]
    index => "apache-%{+YYYY.MM.dd}"
    document_type => "apache_logs"
  }
  stdout { codec => rubydebug }
}

My dummy log file:
64.242.88.10 - - [07/Mar/2004:16:05:49 -0800] "GET /twiki/bin/edit/Main/Double_bounce_sender?topicparent=Main.ConfigurationVariables HTTP/1.1" 401 12846
64.242.88.10 - - [07/Mar/2004:16:06:51 -0800] "GET /twiki/bin/rdiff/TWiki/NewUserTemplate?rev1=1.3&rev2=1.2 HTTP/1.1" 200 4523
64.242.88.10 - - [07/Mar/2004:16:10:02 -0800] "GET /mailman/listinfo/hsdivision HTTP/1.1" 200 6291
64.242.88.10 - - [07/Mar/2004:16:11:58 -0800] "GET /twiki/bin/view/TWiki/WikiSyntax HTTP/1.1" 200 7352
64.242.88.10 - - [07/Mar/2004:16:20:55 -0800] "GET /twiki/bin/view/Main/DCCAndPostFix HTTP/1.1" 200 5253
64.242.88.10 - - [07/Mar/2004:16:23:12 -0800] "GET /twiki/bin/oops/TWiki/AppendixFileSystem?template=oopsmore&param1=1.12&param2=1.12 HTTP/1.1" 200 11382
64.242.88.10 - - [07/Mar/2004:16:24:16 -0800] "GET /twiki/bin/view/Main/PeterThoeny HTTP/1.1" 200 4924
64.242.88.10 - - [07/Mar/2004:16:29:16 -0800] "GET /twiki/bin/edit/Main/Header_checks?topicparent=Main.ConfigurationVariables HTTP/1.1" 401 12851
64.242.88.10 - - [07/Mar/2004:16:30:29 -0800] "GET /twiki/bin/attach/Main/OfficeLocations HTTP/1.1" 401 12851
64.242.88.10 - - [07/Mar/2004:16:31:48 -0800] "GET /twiki/bin/view/TWiki/WebTopicEditTemplate HTTP/1.1" 200 3732
64.242.88.10 - - [07/Mar/2004:16:32:50 -0800] "GET /twiki/bin/view/Main/WebChanges HTTP/1.1" 200 40520
64.242.88.10 - - [07/Mar/2004:16:33:53 -0800] "GET /twiki/bin/edit/Main/Smtpd_etrn_restrictions?topicparent=Main.ConfigurationVariables HTTP/1.1" 401 12851

I am able to send these logs to Elasticsearch and see them in the Kibana dashboard. The pipeline is set up and working, but geoip does not work.

Here is the Kibana output for one of my search hits:
{
        "_index": "apache-2017.06.15",
        "_type": "apache_logs",
        "_id": "AVyqJhi6ItD-cRj2_AW6",
        "_score": 1,
        "_source": {
          "@timestamp": "2017-06-15T05:06:48.038Z",
          "offset": 154,
          "@version": "1",
          "input_type": "log",
          "beat": {
            "hostname": "sagar-machine",
            "name": "sagar-machine",
            "version": "5.3.2"
          },
          "host": "by-df164",
          "source": """C:\Users\Sagar\Desktop\elastic_test4\data\log\apache-log.log""",
          "message": """64.242.88.10 - - [07/Mar/2004:16:05:49 -0800] "GET /twiki/bin/edit/Main/Double_bounce_sender?topicparent=Main.ConfigurationVariables HTTP/1.1" 401 12846""",
          "type": "log",
          "tags": [
            "beats_input_codec_plain_applied",
            "_grokparsefailure",
            "_geoip_lookup_failure"
          ]
        }
      }

Any idea why I am facing this issue?

Best Answer

You have a _grokparsefailure, so the clientip field does not exist. That in turn causes the _geoip_lookup_failure, because the geoip filter is looking up a clientip field that is not there.

Your logs match the %{COMMONAPACHELOG} pattern, not the %{COMBINEDAPACHELOG} pattern you are currently using: the combined format additionally expects quoted referrer and user-agent fields at the end of each line, which your logs do not have. So your configuration should look like this:

filter {
  grok {
    match => {
      "message" => "%{COMMONAPACHELOG}"
    }
  }
  ...
}
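To see why the combined pattern fails on these lines, you can compare the two formats directly. This is a minimal Python sketch using simplified regex approximations of the two grok patterns (the regexes below are illustrative assumptions, not Logstash's actual, more permissive pattern definitions):

```python
import re

# Rough approximations (assumptions for illustration):
#   COMMONAPACHELOG:   ip ident auth [timestamp] "request" status bytes
#   COMBINEDAPACHELOG: COMMONAPACHELOG "referrer" "user-agent"
COMMON = re.compile(
    r'^(?P<clientip>\S+) (?P<ident>\S+) (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<response>\d{3}) (?P<bytes>\S+)'
)
COMBINED = re.compile(COMMON.pattern + r' "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"')

line = ('64.242.88.10 - - [07/Mar/2004:16:10:02 -0800] '
        '"GET /mailman/listinfo/hsdivision HTTP/1.1" 200 6291')

m = COMMON.match(line)
print(m is not None)                  # True: the line is in common log format
print(m.group('clientip'))            # 64.242.88.10
print(COMBINED.match(line) is None)   # True: no referrer/agent fields to match
```

The combined format is simply the common format plus quoted referrer and user-agent fields at the end, which these log lines lack, so %{COMBINEDAPACHELOG} can never match them.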

Once you use the correct pattern, you should see that the clientip field exists, and after that the geoip filter should hopefully work. :)

Regarding "elasticsearch - geoip lookup failure Elastic Stack Logstash", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/44559202/
