Importing CSV into Elasticsearch


Problem description

I'm doing the "Elasticsearch getting started" tutorial. Unfortunately, the tutorial doesn't cover the first step, which is importing a CSV database into Elasticsearch.

I googled for a solution, but unfortunately it doesn't work. Here is what I want to achieve and what I have:

I have a file with the data I want to import (simplified):

id,title
10,Homer's Night Out
12,Krusty Gets Busted

I would like to import it using Logstash. After researching on the internet, I ended up with the following config:

input {
    file {
        path => ["simpsons_episodes.csv"]
        start_position => "beginning"
    }
}

filter {
    csv {
        columns => [
            "id",
            "title"
        ]
    }
}

output {
    stdout { codec => rubydebug }
    elasticsearch {
        action => "index"
        hosts => ["127.0.0.1:9200"]
        index => "simpsons"
        document_type => "episode"
        workers => 1
    }
}

I'm having trouble specifying the document type: once the data is imported, I want to be able to navigate to http://localhost:9200/simpsons/episode/10 and see the result for episode 10.

Recommended answer

Good job, you're almost there; you're only missing the document ID. You need to modify your elasticsearch output like this:

elasticsearch {
    action => "index"
    hosts => ["127.0.0.1:9200"]
    index => "simpsons"
    document_type => "episode"
    document_id => "%{id}"             <---- add this line
    workers => 1
}

After this, you'll be able to query the episode with id 10:

GET http://localhost:9200/simpsons/episode/10
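
For reference, below is a minimal sketch of the full pipeline with the document_id fix applied. The %{id} sprintf reference uses the id column parsed by the csv filter as the Elasticsearch document _id. The absolute path /path/to/simpsons_episodes.csv and the sincedb_path => "/dev/null" setting are illustrative additions (the file input requires an absolute path, and a null sincedb makes Logstash re-read the file on every run); adjust them for your environment.

input {
    file {
        # the file input needs an absolute path (illustrative path, replace with yours)
        path => ["/path/to/simpsons_episodes.csv"]
        start_position => "beginning"
        # re-read the file on every run instead of remembering the last offset
        sincedb_path => "/dev/null"
    }
}

filter {
    csv {
        # map the two CSV columns to event fields
        columns => ["id", "title"]
    }
}

output {
    # print each event to the console for debugging
    stdout { codec => rubydebug }
    elasticsearch {
        action => "index"
        hosts => ["127.0.0.1:9200"]
        index => "simpsons"
        document_type => "episode"
        # use the id column as the document _id
        document_id => "%{id}"
    }
}

Assuming the config is saved as simpsons.conf, running bin/logstash -f simpsons.conf and then issuing the GET above should return the document for episode 10, whose title in the sample data is "Homer's Night Out".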
