I am new to Elasticsearch. I am trying to create an index with the mapping below, which I found online, using Kibana as my client, and it throws an error.
PUT /local_test
{
  "settings": {
    "index.mapping.total_fields.limit": 1000,
    "index.mapping.depth.limit": 20,
    "index.mapping.nested_fields.limit": 50,
    "number_of_shards": 5,
    "number_of_replicas": 1,
    "analysis": {
      "analyzer": {
        "edge_ngram_analyzer": {
          "type": "custom",
          "tokenizer": "edge_ngram_tokenizer",
          "filter": [
            "lowercase",
            "en_stopwords"
          ]
        },
        "standard_custom": {
          "type": "custom",
          "char_filter": [
            "punctuation_remap"
          ],
          "tokenizer": "standard",
          "filter": [
            "lowercase",
            "en_stopwords"
          ]
        },
        "lowercase_keyword": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": [
            "lowercase"
          ]
        }
      },
      "tokenizer": {
        "edge_ngram_tokenizer": {
          "type": "edge_ngram",
          "min_gram": 2,
          "max_gram": 50,
          "token_chars": [
            "letter",
            "digit"
          ]
        }
      },
      "filter": {
        "en_stopwords": {
          "type": "stop",
          "stopwords": "_english_"
        }
      },
      "char_filter": {
        "punctuation_remap": {
          "type": "mapping",
          "mappings": [
            ". => -",
            ": => -",
            "' => -"
          ]
        }
      }
    }
  },
  "mappings": {
    "local_test": {
      "_all": {
        "enabled": false
      },
      "properties": {
        "id": {
          "type": "keyword"
        },
        "user_id": {
          "type": "keyword"
        },
        "created_at": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss||epoch_millis"
        },
        "docvalue": {
          "type": "object",
          "dynamic": false,
          "enabled": true,
          "properties": {
            "key": {
              "type": "text",
              "analyzer": "lowercase_keyword"
            },
            "value": {
              "type": "text",
              "analyzer": "lowercase_keyword"
            }
          }
        },
        "recurring": {
          "type": "boolean"
        },
        "amount": {
          "type": "long"
        }
      }
    }
  }
}
Best Answer
There are two problems in your request. I assume you are on the latest major version, i.e. 7.x.
1. The _all field has been removed, so the "_all" block must be dropped from the mapping. See this official blog on this change.
2. The local_test type name under "mappings" must also be removed, because mapping types have been removed in the latest versions. For more information, see the removal of types.
So the request below works fine:
PUT /local_test
{
  "settings": {
    "index.mapping.total_fields.limit": 1000,
    "index.mapping.depth.limit": 20,
    "index.mapping.nested_fields.limit": 50,
    "number_of_shards": 5,
    "number_of_replicas": 1,
    "analysis": {
      "analyzer": {
        "edge_ngram_analyzer": {
          "type": "custom",
          "tokenizer": "edge_ngram_tokenizer",
          "filter": [
            "lowercase",
            "en_stopwords"
          ]
        },
        "standard_custom": {
          "type": "custom",
          "char_filter": [
            "punctuation_remap"
          ],
          "tokenizer": "standard",
          "filter": [
            "lowercase",
            "en_stopwords"
          ]
        },
        "lowercase_keyword": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": [
            "lowercase"
          ]
        }
      },
      "tokenizer": {
        "edge_ngram_tokenizer": {
          "type": "edge_ngram",
          "min_gram": 2,
          "max_gram": 50,
          "token_chars": [
            "letter",
            "digit"
          ]
        }
      },
      "filter": {
        "en_stopwords": {
          "type": "stop",
          "stopwords": "_english_"
        }
      },
      "char_filter": {
        "punctuation_remap": {
          "type": "mapping",
          "mappings": [
            ". => -",
            ": => -",
            "' => -"
          ]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "id": {
        "type": "keyword"
      },
      "user_id": {
        "type": "keyword"
      },
      "created_at": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss||epoch_millis"
      },
      "docvalue": {
        "type": "object",
        "dynamic": false,
        "enabled": true,
        "properties": {
          "key": {
            "type": "text",
            "analyzer": "lowercase_keyword"
          },
          "value": {
            "type": "text",
            "analyzer": "lowercase_keyword"
          }
        }
      },
      "recurring": {
        "type": "boolean"
      },
      "amount": {
        "type": "long"
      }
    }
  }
}
Output
{
  "acknowledged": true,
  "shards_acknowledged": true,
  "index": "local_test"
}
Regarding "elasticsearch - Error while creating index in Elastic Search", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/61545885/