# ERROR: ES FAIL FOR SQ

## 1. Query string containing "now" as prefix

Problem: For query strings that begin with "now", such as "nowu" or "nowhere clothing", Elasticsearch throws a "parse_exception".

Reason: ES fails to build the query. Because the search term starts with the prefix "now" (Elasticsearch's date-math keyword), multi_match queries try to parse the query text against the "date"-typed fields, which raises a "parse_exception".

Solution: In multi_match queries, set "lenient" to true. This suppresses all format-based parse exceptions, such as string-to-number or string-to-boolean failures.

e.g.

```json
{
  "query": {
    "multi_match": {
      "query": "nowhere",
      "fields": ["product.name", "variant.startDate", "variant.isPublished"],
      "lenient": true
    }
  }
}
```

## 2. max_clause_count exceeding 1024

Problem: When a query containing a large number of clauses is sent to Elasticsearch, it throws an exception of type "too_many_clauses".

Solution: Elasticsearch can process at most 1024 clauses per query (the default value). Each element in a terms array counts as one clause. Either increase the max_clause_count setting in elasticsearch.yml, or make sure the length of the terms array (i.e. the number of clauses) never exceeds 1024.

Limitations:

i). max_clause_count does not only affect Elasticsearch's bool query; many other queries are rewritten internally into Lucene's BooleanQuery. This limit exists to prevent searches from becoming too large and taking up too much CPU and memory.

ii). Before increasing this value, make sure all other options have been exhausted. Higher values can lead to performance degradation and memory issues, especially in clusters under high load or with few resources.

## 3. Elasticsearch result window exceeding 10000

Problem: When an Elasticsearch query would return a large result set, it throws an exception of type "query_phase_execution_exception".

Reason: Right now, the value of max_result_window in Elasticsearch is 10000 (the default value).
When the number of results requested exceeds this limit, Elasticsearch throws an exception of type "query_phase_execution_exception".

Solutions:

i). Increase the value of max_result_window.
Limitation: since each call to the scroll API takes linear space and time to keep fetching data until no more results are left to return (i.e. the hits array is empty), increasing the limit increases memory and CPU overhead.

ii). Use the deep-paging concept.
Limitation: every page request has to sort (and discard) all results preceding that page, so the cost grows steeply with page depth. Not recommended.
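The lenient fix from section 1 can be sketched as a small helper that builds the multi_match request body. This is a minimal illustration (the function name is my own, not part of any client library); the resulting dict would be sent as the JSON body of a `_search` request.

```python
def build_multi_match(query_text, fields):
    """Build a multi_match query body with lenient enabled, so that text
    like "nowhere" is not parsed as date math against date-typed fields."""
    return {
        "query": {
            "multi_match": {
                "query": query_text,
                "fields": fields,
                # lenient=true suppresses format-based failures such as
                # string-to-date or string-to-number parse exceptions
                "lenient": True,
            }
        }
    }


body = build_multi_match(
    "nowhere",
    ["product.name", "variant.startDate", "variant.isPublished"],
)
```

Because `variant.startDate` is a date field, omitting `"lenient": True` here reproduces the original "now"-prefix failure.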
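For section 2, the client-side alternative to raising max_clause_count is to keep each terms array under the limit. A sketch of that approach, assuming results from the per-chunk queries are merged by the caller (helper names are hypothetical):

```python
MAX_CLAUSES = 1024  # Elasticsearch/Lucene default max_clause_count


def chunk_terms(values, limit=MAX_CLAUSES):
    """Split a list of term values into chunks that each stay within the
    clause limit, one chunk per outgoing terms query."""
    return [values[i:i + limit] for i in range(0, len(values), limit)]


def build_terms_queries(field, values, limit=MAX_CLAUSES):
    """Build one terms-query body per chunk; each body is sent as a
    separate _search request and the hits are merged client-side."""
    return [
        {"query": {"terms": {field: chunk}}}
        for chunk in chunk_terms(values, limit)
    ]
```

Splitting trades one oversized request for several small ones, which avoids "too_many_clauses" without touching elasticsearch.yml.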
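Besides the two options above, Elasticsearch also offers the search_after parameter, which pages by the sort values of the previous page's last hit instead of by offset, so depth is not capped by max_result_window. A minimal sketch of how each page's request body would be built (function name and defaults are my own):

```python
def next_page_body(base_query, sort, last_hit_sort=None, size=100):
    """Build a _search request body for search_after pagination.

    The first page omits search_after; every subsequent page passes the
    sort values of the last hit returned by the previous page."""
    body = {"query": base_query, "sort": sort, "size": size}
    if last_hit_sort is not None:
        body["search_after"] = last_hit_sort
    return body
```

The caller loops: issue the request, read the last hit's `sort` array from the response, feed it back as `last_hit_sort`, and stop when a page comes back empty.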