Sometimes you have to import data to Elasticsearch but you don't want to use Logstash, either because writing a configuration file is overkill for a few simple records, or because your data is already in mongoexport format. Maybe you just want to insert some CSV files into Elasticsearch from a Node module.
Whatever the case, you can do it with elastic-import, a Node.js module that is also a CLI tool for importing data into Elasticsearch.
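elastic-import is published on npm; assuming the usual npm workflow, installing it globally gives you the CLI used in the examples below:

npm install -g elastic-import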
Import data to Elasticsearch from MongoDB
If you need to import your MongoDB data into Elasticsearch, you can export it with mongoexport and then import it with elastic-import, as easily as this:
mongoexport -h mongohost -d mydatabasename -c mycollectionname -o mongodata.json
elastic-import mongodata.json elastichost:9200 myindex mytype --mongo
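The --mongo flag matters here because mongoexport's default output is not plain JSON: it is newline-delimited MongoDB Extended JSON, one document per line, with type wrappers such as $oid and $date. A line of mongodata.json would look roughly like this (field names and values are just an illustration):

{"_id":{"$oid":"553f5f54f2f0b07f391e0cbb"},"field1":"some value","createdAt":{"$date":"2015-04-28T10:15:00.000Z"}}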
If you use the standard JSON format, you can do the same like this:
mongoexport -h mongohost -d mydatabasename -c mycollectionname -o mongodata.json --jsonArray
elastic-import mongodata.json elastichost:9200 myindex mytype --json
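With --jsonArray, mongoexport instead wraps all the documents in a single JSON array, which is the standard JSON format the --json flag expects. The same illustrative record would come out as:

[{"_id":{"$oid":"553f5f54f2f0b07f391e0cbb"},"field1":"some value"}]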
Even if you use CSV, you can import all the data with these two commands:
mongoexport -h mongohost -d mydatabasename -c mycollectionname -o mongodata.json --type=csv -f field1,field2,field3,field4
elastic-import mongodata.json elastichost:9200 myindex mytype --csv -f field1,field2,field3,field4
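With --type=csv, mongoexport writes a header row containing the fields listed after -f, followed by one row per document, so the file passed to elastic-import looks like this (values are illustrative):

field1,field2,field3,field4
some value,another value,42,true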
The elastic-import module also allows you to transform your records before importing them into Elasticsearch. Just provide a file that exports a function like this:
module.exports = function (record) {
  record.field1 = record.field1.toLowerCase()
}
Then you can use your function with the -t / --transform-file option:
elastic-import mongodata.json elastichost:9200 myindex mytype --mongo -t mytransformfunction.js
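As a slightly fuller sketch of a transform file, assuming the same mutate-in-place contract shown above (the field names here are hypothetical):

// mytransformfunction.js (field names are illustrative)
module.exports = function (record) {
  // normalize a text field before indexing
  if (record.field1) {
    record.field1 = record.field1.toLowerCase()
  }
  // turn a numeric string into a real number so it is indexed as one
  if (record.field2) {
    record.field2 = parseInt(record.field2, 10)
  }
  // drop a field you don't want indexed at all
  delete record.field3
}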
You can fork this module from its repository on GitHub.