I plan to use an Elasticsearch index to store a large city database of roughly 2.9 million records and use it as the search engine for my Laravel application.
The problem: I have the cities both in a MySQL database and in a CSV file. The file is ~300 MB.
How can I import it into the index quickly?
My import script looks like this:
input {
  file {
    path => ["/home/user/location_cities.txt"]
    type => "city"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["region", "subregion", "ufi", "uni", "dsg", "cc_fips", "cc_iso", "full_name", "full_name_nd", "sort_name", "adm1", "adm1_full_name", "adm2", "adm2_full_name"]
    separator => "	" # a literal tab character, since the file is tab-separated
    remove_field => ["host", "message", "path"]
  }
}

output {
  elasticsearch {
    action => "index"
    protocol => "http"
    host => "127.0.0.1"
    port => "9200"
    index => "location"
    workers => 4
  }
}
This script imports the tab-separated file into the index named location, using the type city.
To run the script, execute bin/logstash -f import_script_file from the folder where Logstash is installed/unpacked.
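If you prefer to skip Logstash, the same import can be done directly with the Elasticsearch bulk API. Below is a minimal sketch in Python that parses the tab-separated file into bulk actions using the same column list as the csv filter above; the file path is the one from the config, and feeding the actions to elasticsearch-py's helpers.bulk is an assumption about your client setup, not something the original setup requires.

```python
import csv

# Column order copied from the Logstash csv filter above.
COLUMNS = [
    "region", "subregion", "ufi", "uni", "dsg", "cc_fips", "cc_iso",
    "full_name", "full_name_nd", "sort_name", "adm1",
    "adm1_full_name", "adm2", "adm2_full_name",
]

def bulk_actions(path, index="location", doc_type="city"):
    """Yield one Elasticsearch bulk action per tab-separated row."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f, delimiter="\t"):
            yield {
                "_index": index,
                "_type": doc_type,  # _type is only valid on older Elasticsearch versions
                "_source": dict(zip(COLUMNS, row)),
            }

# Hypothetical usage with the elasticsearch-py client (not part of the original setup):
#   from elasticsearch import Elasticsearch, helpers
#   es = Elasticsearch("http://127.0.0.1:9200")
#   helpers.bulk(es, bulk_actions("/home/user/location_cities.txt"))
```

Because bulk_actions is a generator, the 300 MB file is streamed row by row rather than loaded into memory at once, which matters at 2.9 million records.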