I am using Elasticsearch-1.5.1, Kibana-4.0.2-linux-x86, and Logstash-1.4.2.
My logstash conf looks like this:
input {
  redis {
    data_type => "list"
    key => "pace"
    password => "bhushan"
    type => "pace"
  }
}
filter {
  geoip {
    source => "mdc.ip"
    target => "geoip"
    database => "/opt/logstash-1.4.2/vendor/geoip/GeoLiteCity.dat"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
}
output {
  if [type] == "pace" {
    elasticsearch {
      template_overwrite => true
      host => "localhost"
      index => "pace"
      template => "/opt/logstash-1.4.2/mytemplates/elasticsearch-template.json"
      template_name => "bhushan"
    }
  }
  stdout {
    codec => rubydebug
  }
}
My elasticsearch-template.json is:
{
  "template" : "bhushan",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : { "enabled" : true },
      "dynamic_templates" : [ {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fields" : {
              "raw" : { "type" : "string", "index" : "not_analyzed", "ignore_above" : 256 }
            }
          }
        }
      } ],
      "properties" : {
        "@version" : { "type" : "string", "index" : "not_analyzed" },
        "geoip" : {
          "type" : "object",
          "dynamic" : true,
          "properties" : {
            "location" : { "type" : "geo_point" }
          }
        }
      }
    }
  }
}
When I query curl http://localhost:9200/pace/_mapping/pace/field/geoip.location?pretty I get:
{
  "pace" : {
    "mappings" : {
      "pace" : {
        "geoip.location" : {
          "full_name" : "geoip.location",
          "mapping" : {
            "location" : {
              "type" : "double"
            }
          }
        }
      }
    }
  }
}
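For reference, the template that Logstash registered under template_name can also be inspected directly, assuming the default localhost:9200 used above:

curl http://localhost:9200/_template/bhushan?pretty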
A sample log record looks like this:
{
    "thread_name" => "main",
    "mdc.ip" => "14.X.X.X",
    "message" => "Hii, I m in info",
    "@timestamp" => "2015-05-15T10:18:32.904+05:30",
    "level" => "INFO",
    "file" => "Test.java",
    "class" => "the.bhushan.log.test.Test",
    "line_number" => "15",
    "logger_name" => "bhushan",
    "method" => "main",
    "@version" => "1",
    "type" => "pace",
    "geoip" => {
        "ip" => "14.X.X.X",
        "country_code2" => "IN",
        "country_code3" => "IND",
        "country_name" => "India",
        "continent_code" => "AS",
        "region_name" => "16",
        "city_name" => "Mumbai",
        "latitude" => 18.974999999999994,
        "longitude" => 72.82579999999999,
        "timezone" => "Asia/Calcutta",
        "real_region_name" => "Maharashtra",
        "location" => [
            [0] 72.82579999999999,
            [1] 18.974999999999994
        ],
        "coordinates" => [
            [0] "72.82579999999999",
            [1] "18.974999999999994"
        ]
    }
}
I thought my problem was the same as this one, so I did everything mentioned in that link, such as deleting all the old indices and restarting LS and ES, but no luck.
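For reference, that cleanup amounts to something like the following, assuming the index and template names from the config above, followed by restarting Logstash so it re-registers the template:

curl -XDELETE http://localhost:9200/pace
curl -XDELETE http://localhost:9200/_template/bhushan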
Any help is appreciated.
Best answer
Your logstash filter stores the coordinates in the field geoip.coordinates, but in the elasticsearch-template.json mapping that field is called geoip.location. This also shows up in your sample log record, where you can see the two fields location and coordinates inside the geoip sub-object.
I think you should be good if you change this in your logstash filter:
From this:
add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
To this:
add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][location]", "%{[geoip][latitude]}" ]
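For context, a sketch of the whole filter block with that change applied (same source field and database path as in the question):

filter {
  geoip {
    source => "mdc.ip"
    target => "geoip"
    database => "/opt/logstash-1.4.2/vendor/geoip/GeoLiteCity.dat"
    add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][location]", "%{[geoip][latitude]}" ]
  }
}

Note that longitude comes first: Elasticsearch expects geo_point arrays as [lon, lat].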
UPDATE
A few more things needed fixing:
1. The two geoip add_field directives in the filter can be removed, as they are unnecessary.
2. "path": "full" has been deprecated.
3. The template value in elasticsearch-template.json should be pace instead of bhushan, i.e. the name of the index in which the log records are stored (see the sketch below).
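Putting the update together, the top of elasticsearch-template.json would change so that the template pattern matches the index name, with the rest of the mapping kept as in the question (a sketch, not a verified drop-in):

{
  "template" : "pace",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  ...
}

The filter then keeps only the geoip block itself, with no add_field lines, since (as the sample record above already shows) the geoip filter populates geoip.location with a [longitude, latitude] array on its own. After deleting the old pace index and the old template and restarting Logstash, the field-mapping query from the question should report geo_point instead of double.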
Source: https://stackoverflow.com/questions/30251850/