OK, can you tell me: in my mapping the types are set as 'keyword'. Is that the only thing required to read the search term as-is? If yes, then why is it dropping the '!' character? Also, how can I make use of the _analyze API to see what's happening exactly? I tried doing this:

`http://localhost:9210/search/_analyze?analyzer=keyword&text=f04((?!z).)*`

Seems to look fine here with result:

```
{
  "tokens": [
    {
      "token": "f04((?!z).)*",
      "start_offset": 0,
      "end_offset": 12,
      "type": "word",
      "position": 0
    }
  ]
}
```

But I don't know how to check in the code what's happening exactly.
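
In case it helps, this is roughly what I was thinking of trying next: running _analyze against the actual field from my mapping instead of naming the analyzer explicitly, and passing the text in the request body so the regex characters don't get mangled in the query string. (The field name `title` is just a placeholder for my real field; `search` is the index from the URL above.)

```
# Analyze the text with whatever analyzer the mapping assigns to the field
# ('title' is a placeholder field name)
curl -XGET 'http://localhost:9210/search/_analyze?field=title&pretty' -d 'f04((?!z).)*'

# Double-check what the mapping actually says for that field
curl -XGET 'http://localhost:9210/search/_mapping?pretty'
```

Is that the right way to verify it, or is there a better way to see what the query itself is doing?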
