Adding a custom numeric tokenizer to an ES index
1. Create the index with shards and replicas
PUT /waybill_test
{
  "settings": {
    "index": {
      "number_of_replicas": 1,
      "number_of_shards": 3
    }
  }
}
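To confirm the shard and replica counts took effect, the index settings can simply be read back:

GET /waybill_test/_settings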
2. Close the index
POST waybill_test/_close
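Before changing the analysis settings it is worth confirming the index really is closed; the cat indices API shows a status of close for a closed index:

GET _cat/indices/waybill_test?v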
3. Add the custom analyzer; the index must be closed before the analysis settings can be changed
PUT waybill_test/_settings
{
  "settings": {
    "analysis": {
      "analyzer": {
        "search_ngram_analyzer": {
          "tokenizer": "my_ik_tokenizer",
          "filter": "ik_first_letter_and_full_ngram_filter"
        },
        "ngram_analyzer": {
          "tokenizer": "ngram_tokenizer"
        }
      },
      "tokenizer": {
        "my_ik_tokenizer": {
          "type": "ik_smart"
        },
        "ngram_tokenizer": {
          "type": "ngram",
          "min_gram": 1,
          "max_gram": 6,
          "token_chars": [ "letter", "digit" ]
        }
      },
      "filter": {
        "ik_first_letter_and_full_ngram_filter": {
          "type": "ngram",
          "min_gram": 1,
          "max_gram": 6,
          "token_chars": [ "letter", "digit" ]
        }
      }
    }
  }
}
4. Reopen the index
POST waybill_test/_open
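Once the index is open again, the _analyze API can be used to see what the new analyzer produces. This assumes the analysis-ik plugin is installed (the ik_smart tokenizer comes from it) and, on Elasticsearch 7 and later, that index.max_ngram_diff has been raised to at least 5 so the 1-6 gram range is accepted; the sample text is illustrative:

POST waybill_test/_analyze
{
  "analyzer": "search_ngram_analyzer",
  "text": "顺丰SF1234567890"
}

The response lists the 1- to 6-character grams generated from each token produced by ik_smart, which is what makes partial waybill-number matches possible.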
5. Add the mapping properties and set the analyzer on the field
PUT /waybill_test/waybill_test_type/_mappings
{
  "properties": {
    "search_remark": {
      "type": "text",
      "analyzer": "search_ngram_analyzer"
    }
  }
}
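With the mapping in place, a quick end-to-end check is to index a document and search for a fragment of the value. The document ID, text, and query below are illustrative, and the typed URL matches the pre-7.x style used above; on Elasticsearch 7 and later, where mapping types are removed, the mapping call would be PUT /waybill_test/_mapping and the document would go to PUT /waybill_test/_doc/1. The refresh parameter makes the document searchable immediately:

PUT /waybill_test/waybill_test_type/1?refresh
{ "search_remark": "SF1234567890" }

GET /waybill_test/_search
{
  "query": {
    "match": { "search_remark": "1234" }
  }
}

Because no separate search_analyzer is configured, search_ngram_analyzer is also applied to the query string, so the query text itself is broken into n-grams before matching.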
Source: https://www.cnblogs.com/yk775879106/p/16696720.html