- Install elasticdump on CentOS
yum install elasticdump
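If elasticdump is not available in your configured yum repositories, it is also distributed as an npm package, so a Node.js-based install works as well (a minimal sketch, assuming Node.js and npm are already present; this is not taken from the session below):
# assumes Node.js and npm are already installed
npm install -g elasticdump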
- After installation, check the help output
[root@i-vvxxxxswtw5ne ~]# elasticdump --help
elasticdump: Import and export tools for elasticsearch
version: 2.2.0
Usage: elasticdump --input SOURCE --output DESTINATION [OPTIONS]
--input
Source location (required)
--input-index
Source index and type
(default: all, example: index/type)
--output
Destination location (required)
--output-index
Destination index and type
(default: all, example: index/type)
--limit
How many objects to move in batch per operation
limit is approximate for file streams
(default: 100)
--debug
Display the elasticsearch commands being used
(default: false)
--type
What are we exporting?
(default: data, options: [data, mapping])
--delete
Delete documents one-by-one from the input as they are
moved. Will not delete the source index
(default: false)
--searchBody
Preform a partial extract based on search results
(when ES is the input,
(default: '{"query": { "match_all": {} } }'))
--sourceOnly
Output only the json contained within the document _source
Normal: {"_index":"","_type":"","_id":"", "_source":{SOURCE}}
sourceOnly: {SOURCE}
(default: false)
--all
Load/store documents from ALL indexes
(default: false)
--ignore-errors
Will continue the read/write loop on write error
(default: false)
--scrollTime
Time the nodes will hold the requested search in order.
(default: 10m)
--maxSockets
How many simultaneous HTTP requests can we process make?
(default:
5 [node <= v0.10.x] /
Infinity [node >= v0.11.x] )
--timeout
Integer containing the number of milliseconds to wait for
a request to respond before aborting the request. Passed
directly to the request library. Mostly used when you don't
care too much if you lose some data when importing
but rather have speed.
--offset
Integer containing the number of rows you wish to skip
ahead from the input transport. When importing a large
index, things can go wrong, be it connectivity, crashes,
someone forgetting to `screen`, etc. This allows you
to start the dump again from the last known line written
(as logged by the `offset` in the output). Please be
advised that since no sorting is specified when the
dump is initially created, there's no real way to
guarantee that the skipped rows have already been
written/parsed. This is more of an option for when
you want to get most data as possible in the index
without concern for losing some rows in the process,
similar to the `timeout` option.
--inputTransport
Provide a custom js file to us as the input transport
--outputTransport
Provide a custom js file to us as the output transport
--toLog
When using a custom outputTransport, should log lines
be appended to the output stream?
(default: true, except for `$`)
--help
This page
Examples:
# Copy an index from production to staging with mappings:
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=http://staging.es.com:9200/my_index \
--type=mapping
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=http://staging.es.com:9200/my_index \
--type=data
# Backup index data to a file:
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=/data/my_index_mapping.json \
--type=mapping
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=/data/my_index.json \
--type=data
# Backup and index to a gzip using stdout:
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=$ \
| gzip > /data/my_index.json.gz
# Backup the results of a query to a file
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=query.json \
--searchBody '{"query":{"term":{"username": "admin"}}}'
Learn more @ https://github.com/taskrabbit/elasticsearch-dump
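One option worth noting from the list above is --limit: documents move in batches of 100 by default, which is exactly the batch size visible in the transfer logs below, and raising it may speed up large copies. An illustrative sketch reusing the hosts and index from this note (the value 1000 is an arbitrary example):
elasticdump \
  --input=http://192.192.16.50:9200/elasticsearch_sapdata \
  --output=http://192.192.16.30:9200/elasticsearch_sapdata \
  --type=data \
  --limit=1000 \
  --searchBody '{"query":{"match_all": {}}}'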
- Import data from one index into another: both input and output are URLs
[root@i-vvwdddtw5ne ~]# elasticdump --input=http://192.192.16.50:9200/elasticsearch_sapdata --output=http://192.192.16.30:9200/elasticsearch_sapdata --type=data
Sun, 21 Jul 2019 06:44:18 GMT | starting dump
Sun, 21 Jul 2019 06:44:18 GMT | Error Emitted => {"error":{"root_cause":[{"type":"parsing_exception","reason":"The field [fields] is no longer supported, please use [stored_fields] to retrieve stored fields or _source filtering if the field is not stored","line":1,"col":36}],"type":"parsing_exception","reason":"The field [fields] is no longer supported, please use [stored_fields] to retrieve stored fields or _source filtering if the field is not stored","line":1,"col":36},"status":400}
Sun, 21 Jul 2019 06:44:18 GMT | Total Writes: 0
Sun, 21 Jul 2019 06:44:18 GMT | dump ended with error (get phase) => Error: {"error":{"root_cause":[{"type":"parsing_exception","reason":"The field [fields] is no longer supported, please use [stored_fields] to retrieve stored fields or _source filtering if the field is not stored","line":1,"col":36}],"type":"parsing_exception","reason":"The field [fields] is no longer supported, please use [stored_fields] to retrieve stored fields or _source filtering if the field is not stored","line":1,"col":36},"status":400}
Solution: add --searchBody '{"query":{"match_all": {}}}'. The bundled elasticdump 2.2.0 appears to build its default request with the deprecated [fields] parameter, which newer Elasticsearch releases reject with the parsing_exception shown above; passing an explicit match_all search body sidesteps it.
# Both input and output point to the index URL
[root@i-vvwtw5ne ~]# elasticdump --input=http://192.192.16.50:9200/elasticsearch_sapdata --output=http://192.192.16.30:9200/elasticsearch_sapdata --type=data --searchBody '{"query":{"match_all": {}}}'
Sun, 21 Jul 2019 06:49:57 GMT | starting dump
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 0)
Sun, 21 Jul 2019 06:49:57 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 100)
Sun, 21 Jul 2019 06:49:57 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 200)
Sun, 21 Jul 2019 06:49:57 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 300)
Sun, 21 Jul 2019 06:49:57 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 400)
Sun, 21 Jul 2019 06:49:57 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 500)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 600)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 700)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 800)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 900)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 1000)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 1100)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 1200)
Sun, 21 Jul 2019 06:49:59 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:59 GMT | got 87 objects from source elasticsearch (offset: 1300)
Sun, 21 Jul 2019 06:49:59 GMT | sent 87 objects to destination elasticsearch, wrote 87
Sun, 21 Jul 2019 06:49:59 GMT | got 0 objects from source elasticsearch (offset: 1387)
Sun, 21 Jul 2019 06:49:59 GMT | Total Writes: 1387
Sun, 21 Jul 2019 06:49:59 GMT | dump complete
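The run above copies documents only. Following the mapping example from the help output, the index mapping can be copied first with --type=mapping so the destination index is created with the same field definitions before the data is loaded (same hosts and index as above):
elasticdump \
  --input=http://192.192.16.50:9200/elasticsearch_sapdata \
  --output=http://192.192.16.30:9200/elasticsearch_sapdata \
  --type=mapping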
- Import a file into an index: input is the file, output is the target index URL
[root@i-vvwtw5ne ~]# elasticdump --input=gaopan.json --output=http://192.192.16.30:9200/elasticsearch_sapdata --type=data --searchBody '{"query":{"match_all": {}}}'
Sun, 21 Jul 2019 06:53:36 GMT | starting dump
Sun, 21 Jul 2019 06:53:36 GMT | got 100 objects from source file (offset: 0)
Sun, 21 Jul 2019 06:53:36 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:53:37 GMT | got 137 objects from source file (offset: 100)
Sun, 21 Jul 2019 06:53:37 GMT | sent 137 objects to destination elasticsearch, wrote 137
Sun, 21 Jul 2019 06:53:37 GMT | got 141 objects from source file (offset: 237)
Sun, 21 Jul 2019 06:53:37 GMT | sent 141 objects to destination elasticsearch, wrote 141
Sun, 21 Jul 2019 06:53:37 GMT | got 132 objects from source file (offset: 378)
Sun, 21 Jul 2019 06:53:37 GMT | sent 132 objects to destination elasticsearch, wrote 132
Sun, 21 Jul 2019 06:53:37 GMT | got 143 objects from source file (offset: 510)
Sun, 21 Jul 2019 06:53:37 GMT | sent 143 objects to destination elasticsearch, wrote 143
Sun, 21 Jul 2019 06:53:37 GMT | got 132 objects from source file (offset: 653)
Sun, 21 Jul 2019 06:53:37 GMT | sent 132 objects to destination elasticsearch, wrote 132
Sun, 21 Jul 2019 06:53:37 GMT | got 140 objects from source file (offset: 785)
Sun, 21 Jul 2019 06:53:38 GMT | sent 140 objects to destination elasticsearch, wrote 140
Sun, 21 Jul 2019 06:53:38 GMT | got 131 objects from source file (offset: 925)
Sun, 21 Jul 2019 06:53:38 GMT | sent 131 objects to destination elasticsearch, wrote 131
Sun, 21 Jul 2019 06:53:38 GMT | got 143 objects from source file (offset: 1056)
Sun, 21 Jul 2019 06:53:38 GMT | sent 143 objects to destination elasticsearch, wrote 143
Sun, 21 Jul 2019 06:53:38 GMT | got 132 objects from source file (offset: 1199)
Sun, 21 Jul 2019 06:53:38 GMT | sent 132 objects to destination elasticsearch, wrote 132
Sun, 21 Jul 2019 06:53:38 GMT | got 56 objects from source file (offset: 1331)
Sun, 21 Jul 2019 06:53:38 GMT | sent 56 objects to destination elasticsearch, wrote 56
Sun, 21 Jul 2019 06:53:38 GMT | got 0 objects from source file (offset: 1387)
Sun, 21 Jul 2019 06:53:38 GMT | Total Writes: 1387
Sun, 21 Jul 2019 06:53:38 GMT | dump complete
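Note that --searchBody only filters when Elasticsearch is the input (as the option list above states), so for a file import it can simply be dropped:
elasticdump \
  --input=gaopan.json \
  --output=http://192.192.16.30:9200/elasticsearch_sapdata \
  --type=data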
- Export data to a JSON file (input is the index to export, output is the destination file path)
[root@i-vvwtw5ne ~]# elasticdump --input=http://192.192.16.30:9200/elasticsearch_sapdata --output=gaopan2.json --type=data --searchBody '{"query":{"match_all": {}}}'
Sun, 21 Jul 2019 06:55:57 GMT | starting dump
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 0)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 100)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 200)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 300)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 400)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 500)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 600)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 700)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 800)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 900)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 1000)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 1100)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 1200)
Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
Sun, 21 Jul 2019 06:55:57 GMT | got 87 objects from source elasticsearch (offset: 1300)
Sun, 21 Jul 2019 06:55:57 GMT | sent 87 objects to destination file, wrote 87
Sun, 21 Jul 2019 06:55:57 GMT | got 0 objects from source elasticsearch (offset: 1387)
Sun, 21 Jul 2019 06:55:57 GMT | Total Writes: 1387
Sun, 21 Jul 2019 06:55:57 GMT | dump complete
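For a complete backup, the mapping can be exported alongside the data, mirroring the file-backup examples in the help output (the /data paths here are illustrative):
elasticdump \
  --input=http://192.192.16.30:9200/elasticsearch_sapdata \
  --output=/data/elasticsearch_sapdata_mapping.json \
  --type=mapping
elasticdump \
  --input=http://192.192.16.30:9200/elasticsearch_sapdata \
  --output=/data/elasticsearch_sapdata.json \
  --type=data \
  --searchBody '{"query":{"match_all": {}}}'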