Question
I'm fairly new to this and need help; I looked online but couldn't find an answer to what I'm looking for. Basically, what I'm trying to do is autocomplete based on keywords derived from some text fields.
Here is an example of my indexed documents:
"name": "One liter of Chocolate Milk"
"name": "Milo Milk 250g"
"name": "HiLow low fat milk"
"name": "Yoghurt strawberry"
"name": "Milk Nutrisoy"
So when I type in "mi", I'm expecting to get results like:
"milk"
"milo"
"milo milk"
"chocolate milk"
etc
A very good example of this is the aliexpress.com autocomplete.
Thanks in advance.
Answer
That seems like a good use case for the shingle token filter.
curl -XPUT localhost:9200/your_index -d '{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_shingles": {
          "tokenizer": "standard",
          "filter": [
            "lowercase",
            "shingles"
          ]
        }
      },
      "filter": {
        "shingles": {
          "type": "shingle",
          "min_shingle_size": 2,
          "max_shingle_size": 2,
          "output_unigrams": true
        }
      }
    }
  },
  "mappings": {
    "your_type": {
      "properties": {
        "field": {
          "type": "string",
          "analyzer": "my_shingles"
        }
      }
    }
  }
}'
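For completeness, here is one way you might index a couple of the question's sample documents against this mapping. Note that the mapping above names the field `field`, while the question's documents use `name`, so the values go into `field` here; adjust the field name (and the index/type names) to match your own setup.

```shell
# Index two of the sample documents into the shingled field.
# The index, type, and field names are the placeholders from the mapping above.
curl -XPUT 'localhost:9200/your_index/your_type/1' -d '{
  "field": "Milo Milk 250g"
}'
curl -XPUT 'localhost:9200/your_index/your_type/2' -d '{
  "field": "One liter of Chocolate Milk"
}'
```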
If you analyze Milo Milk 250g with this analyzer, you'll get the following tokens:
curl -XGET 'localhost:9200/your_index/_analyze?analyzer=my_shingles&pretty' -d 'Milo Milk 250g'
{
  "tokens" : [ {
    "token" : "milo",
    "start_offset" : 0,
    "end_offset" : 4,
    "type" : "<ALPHANUM>",
    "position" : 0
  }, {
    "token" : "milo milk",
    "start_offset" : 0,
    "end_offset" : 9,
    "type" : "shingle",
    "position" : 0
  }, {
    "token" : "milk",
    "start_offset" : 5,
    "end_offset" : 9,
    "type" : "<ALPHANUM>",
    "position" : 1
  }, {
    "token" : "milk 250g",
    "start_offset" : 5,
    "end_offset" : 14,
    "type" : "shingle",
    "position" : 1
  }, {
    "token" : "250g",
    "start_offset" : 10,
    "end_offset" : 14,
    "type" : "<ALPHANUM>",
    "position" : 2
  } ]
}
So when searching for mi, you'll get the following tokens:
- milo
- milo milk
- milk
- milk 250g
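One way to actually surface those matching tokens as suggestions (rather than whole documents) is a `terms` aggregation with an `include` regex over the shingled field. This is a sketch, not the only approach: it assumes the placeholder index/field names from the mapping above and that fielddata is available on the analyzed string field (the default for analyzed strings in the Elasticsearch versions this answer targets).

```shell
# Return the distinct shingle/unigram tokens starting with "mi"
# as aggregation buckets, without fetching any documents (size: 0).
curl -XGET 'localhost:9200/your_index/_search?pretty' -d '{
  "size": 0,
  "aggs": {
    "suggestions": {
      "terms": {
        "field": "field",
        "include": "mi.*"
      }
    }
  }
}'
```

The bucket keys in the response would be the candidate completions (milo, milo milk, milk, milk 250g, ...), which you can feed straight into the autocomplete dropdown.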