Elasticsearch Notes


elasticsearch-head, a web front-end plugin for Elasticsearch ( https://github.com/mobz/elasticsearch-head )
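
To install the head plugin (this is the ES 2.x site-plugin command from the elasticsearch-head README; 1.x uses "plugin -install" instead):

bin/plugin install mobz/elasticsearch-head

Once installed, the UI is served at http://localhost:9200/_plugin/head/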

count-only search with a terms aggregation on screen_name (no hits returned, just the bucket counts):

_search?search_type=count

{
 "aggs" : {
  "all_users": {
   "terms": {
    "field": "screen_name"
   }
  }
 }
}
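
The same request end to end with curl, assuming the twitterindex_v2 index used later in these notes; note that search_type=count was deprecated in Elasticsearch 2.x and later removed in favour of ?size=0:

curl -XGET 'localhost:9200/twitterindex_v2/_search?search_type=count' -d'
{ "aggs": { "all_users": { "terms": { "field": "screen_name" } } } }'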

list indexes and summary:

curl 'localhost:9200/_cat/indices?v'

show health

curl 'localhost:9200/_cat/health?v'

list nodes:

curl 'localhost:9200/_cat/nodes?v'

delete an index

curl -XDELETE 'http://localhost:9200/twitterindex_v2/'

create an index with mappings from a file:

curl -XPUT localhost:9200/twitterindex_v2 -T 'mapping.1'
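
mapping.1 itself isn't captured in these notes; a hypothetical example of what such a file could contain (old-style string mapping, matching the ES 1.x/2.x era of these commands):

{
 "settings": { "number_of_shards": 1 },
 "mappings": {
  "tweet": {
   "properties": {
    "screen_name": { "type": "string", "index": "not_analyzed" },
    "postDate": { "type": "date" }
   }
  }
 }
}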

get the mappings for an index

curl -XGET "http://localhost:9200/test-index/_mapping" | jsonlint > mapping

Explicitly mapping date fields

from: http://joelabrahamsson.com/dynamic-mappings-and-dates-in-elasticsearch/

curl -XPUT "http://localhost:9200/myindex" -d'
{
   "mappings": {
      "tweet": {
         "date_detection": false,
         "properties": {
             "postDate": {
                 "type": "date"
             }
         }
      }
   }
}'
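
A quick way to check that mapping, using a made-up document (the message field and its value are illustrative): with date_detection off, only the explicitly mapped postDate ends up typed as date, while the date-looking text in message is dynamically mapped as a plain string:

curl -XPUT "http://localhost:9200/myindex/tweet/1" -d'
{
  "postDate": "2017-01-01T12:00:00Z",
  "message": "logged on 2017-01-01"
}'

curl -XGET "http://localhost:9200/myindex/_mapping"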

changing mapping with zero downtime

https://www.elastic.co/blog/changing-mapping-with-zero-downtime
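
The gist of that post, sketched against the twitterindex naming used above (the twitter alias, the v3 index and the mapping.new file are assumptions for illustration): build a new index with the corrected mapping, reindex into it, then switch an alias atomically so clients never point at a half-built index:

curl -XPUT 'localhost:9200/twitterindex_v3' -d @mapping.new
# reindex documents from twitterindex_v2 into twitterindex_v3, e.g. with elasticdump (next section)
curl -XPOST 'localhost:9200/_aliases' -d'
{
  "actions": [
    { "remove": { "index": "twitterindex_v2", "alias": "twitter" } },
    { "add":    { "index": "twitterindex_v3", "alias": "twitter" } }
  ]
}'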

moving data between indexes

Use ElasticDump ( https://www.npmjs.com/package/elasticdump )

1) yum install epel-release

2) yum install nodejs npm

3) npm install elasticdump

4) cd node_modules/elasticdump/bin

5)

./elasticdump \
  --input=http://192.168.1.1:9200/original \
  --output=http://192.168.1.2:9200/newCopy \
  --type=data

To copy an index within one cluster (here .kibana into a read-only copy), dump the mapping first, then the data:

elasticdump \
  --input=http://localhost:9700/.kibana \
  --output=http://localhost:9700/.kibana_read_only \
  --type=mapping

elasticdump \
  --input=http://localhost:9700/.kibana \
  --output=http://localhost:9700/.kibana_read_only \
  --type=data
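
To confirm the copy, the same _cat endpoint from earlier works against this host/port:

curl 'localhost:9700/_cat/indices?v'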