How to deal with punctuation in an ElasticSearch field


I have a field in a document stored in Elasticsearch that I want analyzed as a full-text field. In one case, the name field contains this value:

a&b corp 

I want to be able to search for these documents from an auto-complete widget, using a query like the one below (suppose the user has typed a&b into the autocomplete field). The intention is to match documents containing terms that start with whatever was typed.

{   "query": {     "filtered": {       "query": {         "query_string": {           "query": "a&b*",           "fields": [             "firstname",             "lastname",             "name",             "key",             "email"           ]         }       },       "filter": {         "terms": {           "environmentid": [             "foo"           ]         }       }     }   } } 

```

My mapping for the name field looks like this:

"name": {     "type": "string" }, 

But I get no results. The same query structure works for documents that don't have & in the field, so I'm pretty sure the ampersand is part of the problem.

I'm not sure how to deal with this, though. I'm fairly sure I still want the field analyzed for full-text search.

In addition, if I add a space before the * in the query (i.e., "query": "a&b *",) I do get results that include a&b, but I don't think it's really matching what I typed; it seems to be discarding the ampersand and treating a and b as separate terms.
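That suspicion can be checked with the _analyze API. A minimal sketch (assuming a version of Elasticsearch that accepts a JSON body for _analyze; older versions take the analyzer and text as URL parameters instead):

```
POST _analyze
{
  "analyzer": "standard",
  "text": "a&b corp"
}
```

With the default standard analyzer on a plain string field, this returns the tokens a, b, and corp: the ampersand is dropped, so nothing in the index starts with a&b, which would explain why a prefix query on a&b finds nothing.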

Should I change the mapping? The query?

The query_string query has a set of reserved characters that need to be escaped.

query_string: read the reserved characters section of the documentation.
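As a minimal sketch of what escaping looks like (not from the original answer; the field name is just an example): a reserved character is escaped with a backslash, and because the query sits inside a JSON string the backslash itself has to be doubled.

```
{
  "query_string": {
    "query": "foo\\!bar",
    "fields": ["name"]
  }
}
```

Here the JSON string \\! reaches the query parser as \!, so the exclamation mark is treated as literal text rather than as an operator.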

So to search for

'a&b' (or) 'a&b corp' (or) 'a&b....'

your query must be "a&b\\*" so that the query_string parser treats the * as a wildcard operator.

  1. While your query is searching for an exact match of "a&b*", it expects the asterisk to be part of the data.

  2. And when you search for "a&b *", the whitespace is a reserved character, so you are searching for "a&b" (or) "*", hence the match in that case.
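For reference, dropping the answer's suggested query string into the query_string clause from the question would look roughly like this (a sketch only, reusing the question's field list and leaving the rest of the query unchanged):

```
"query_string": {
  "query": "a&b\\*",
  "fields": ["firstname", "lastname", "name", "key", "email"]
}
```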

