Syncing data from MongoDB to Elasticsearch with Logstash

This article walks through syncing data from MongoDB to Elasticsearch with Logstash; it may be a useful reference if you are hitting the same problem.

Problem description

I am trying to sync a MongoDB database to Elasticsearch. I am using the logstash-input-mongodb and logstash-output-elasticsearch plugins.

The issue is that the mongodb input plugin is not able to extract all the information from the documents inserted into MongoDB, so I am seeing only a few fields indexed into Elasticsearch. I also get the entire document as a log entry in the Elasticsearch index. I tried manipulating the filters in the Logstash config file and changing the input to Elasticsearch, but could not make it work.

Any help or suggestion would be great.



Edit:
Mongo schema:

A:{
  B: 'sometext',
  C: {G: 'someText', H:'some text'}
},
D:[
 {E:'sometext',F:'sometext'},
 {E:'sometext',F:'sometext'},
 {E:'sometext',F:'sometext'}
]

Plugin configuration:

input {
    mongodb {
        uri => 'mongodb://localhost:27017/testDB'
        placeholder_db_dir => '/opt/logstash-mongodb/'
        placeholder_db_name => 'logstash_sqlite.db'
        collection => 'testCOllection'
        batch_size => 1000
    }
}
output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        action => "index"
        index => "testdb_testColl"
        hosts => ["localhost:9200"]
    }
}

Output in Elasticsearch:

{
    //some metadata
    A_B: 'sometext',
    A_C_G: 'someText',
    A_C_H: 'some text',
    log_entry: 'contains complete document inserted to mongoDB'
}

We are not getting property D of the Mongo collection in Elasticsearch. Hope this explains the problem more elaborately.

Recommended answer

Because your configuration looked good to me, I checked the issues of the phutchins/logstash-input-mongodb repo and found this one: "array not stored to elasticsearch", which pretty much describes your problem. It is still an open issue, but you might want to try the workaround suggested by ivancruzbht. That workaround uses the ruby Logstash filter to parse the log_entry field, which you confirmed contains all the fields, including D.
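
The gist of that workaround is a ruby filter that re-parses the log_entry string and copies the missing nested/array fields back onto the event. Below is a minimal sketch of that idea rather than the exact code from the issue; it assumes log_entry holds the Ruby-hash-style dump of the Mongo document shown above, and the script path /etc/logstash/parse_log_entry.rb is a hypothetical location you would create yourself:

filter {
    ruby {
        # Hypothetical script location; create this file on the Logstash host.
        path => "/etc/logstash/parse_log_entry.rb"
    }
}

And a sketch of the script itself:

# /etc/logstash/parse_log_entry.rb - sketch of the log_entry workaround
require "json"

def register(params)
end

def filter(event)
  entry = event.get("log_entry")
  return [event] if entry.nil?

  # log_entry looks like a Ruby hash dump, e.g.
  # {"A"=>{"B"=>"sometext"}, "D"=>[{"E"=>"sometext", "F"=>"sometext"}], "_id"=>BSON::ObjectId('...')}
  # Convert it to JSON text by swapping '=>' for ':' and unwrapping BSON::ObjectId(...).
  json = entry.gsub("=>", ":")
              .gsub(/BSON::ObjectId\('([0-9a-f]+)'\)/, '"\1"')

  begin
    doc = JSON.parse(json)
    # Copy the array the input plugin drops; the field name D comes from the schema above.
    event.set("D", doc["D"]) if doc.key?("D")
  rescue JSON::ParserError
    # Mark the event instead of failing the pipeline if the string cannot be parsed.
    event.set("log_entry_parse_failure", true)
  end

  [event]
end

This is only a sketch under the assumption that log_entry contains no '=>' inside string values; adapt the parsing (or use the exact code from the linked issue) to whatever log_entry actually looks like in your pipeline.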

That concludes this article on syncing data from MongoDB to Elasticsearch with Logstash; we hope the recommended answer is helpful.
