Using elasticsearch + logstash with docker
Posted by j-wym
1.1 Deploy elasticsearch:6.5.4
docker pull elasticsearch:6.5.4
docker run -d --name elasticsearch -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" elasticsearch:6.5.4
Open http://localhost:9200/ in a browser to confirm the node is up.
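The same check can be done from the command line. The first call should return a small JSON document describing the node (name, cluster name, version), the second a one-line cluster health summary:

curl http://localhost:9200/
curl http://localhost:9200/_cat/health?v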
1.2 Add elasticsearch-head
docker pull mobz/elasticsearch-head:5
docker create --name elasticsearch-head -p 9100:9100 mobz/elasticsearch-head:5
docker start elasticsearch-head
docker exec -it elasticsearch /bin/bash
vi config/elasticsearch.yml
Add the following two lines at the bottom of elasticsearch.yml:
http.cors.enabled: true
http.cors.allow-origin: "*"
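The CORS change only takes effect after the node restarts, so leave the container shell and restart it (assuming the containers were created as above):

exit
docker restart elasticsearch

After the restart, open http://localhost:9100 in a browser and point elasticsearch-head at http://localhost:9200 to browse the cluster.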
2.1 Deploy logstash:6.5.4
docker pull logstash:6.5.4
2.2 Map the configuration files
mkdir -p /usr/local/src/docker_logstash
mkdir -p /usr/local/src/docker_logstash/logs

vi logstash.yml
http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.url: http://sandbox:9200

vi log4j2.properties
logger.elasticsearchoutput.name = logstash.outputs.elasticsearch
logger.elasticsearchoutput.level = debug

vi pipelines.yml
- pipeline.id: my-logstash
  path.config: "/usr/share/logstash/config/*.conf"
  pipeline.workers: 3

Finally, create the pipeline config itself. Any file name ending in .conf works, since pipelines.yml loads every *.conf in the config directory; logstash.conf is used here as an example:

vi logstash.conf
# read input from the console
input {
    stdin { }
}
output {
    # print to the console (rubydebug codec)
    stdout {
        codec => rubydebug
    }
    # write to elasticsearch
    elasticsearch {
        hosts => "sandbox:9200"
        codec => json
    }
    # write to a file
    file {
        path => "/usr/share/logstash/config/logs/all.log"   # file path to write to
        flush_interval => 0                                  # flush interval; 0 means write in real time
        codec => json
    }
}
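For reference, the host directory that will be mounted into the container should now look roughly like this:

/usr/local/src/docker_logstash/
    logstash.yml
    log4j2.properties
    pipelines.yml
    logstash.conf    (the pipeline config above)
    logs/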
docker run -d -p 5044:5044 -p 9600:9600 -it --name logstash -v /usr/local/src/docker_logstash:/usr/share/logstash/config logstash:6.5.4
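Once the container is running, the monitoring API that Logstash exposes on port 9600 can serve as a quick health check; it should answer with node information in JSON:

curl http://localhost:9600/?pretty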
docker exec -it logstash /bin/bash
bin/logstash -e 'input { stdin { } } output { stdout {} }'
Type a line of text; it should come back as a structured event on stdout, which confirms the pipeline works.
2.3 mysql --> elasticsearch
Copy the JDBC driver jar (mysql-connector-java-5.1.27.jar) into /usr/local/src/docker_logstash on the host first, so that it is visible inside the container at the path referenced below. Then create the pipeline:

vi mysql.conf
input {
    stdin { }
    jdbc {
        jdbc_connection_string => "jdbc:mysql://sandbox:3306/erp_test4"
        jdbc_user => "root"
        jdbc_password => "123456"
        jdbc_driver_library => "/usr/share/logstash/config/mysql-connector-java-5.1.27.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        statement => "SELECT * FROM nrd2_project"
        type => "project"
    }
}
filter {
    json {
        source => "message"
        remove_field => ["message"]
    }
}
output {
    elasticsearch {
        hosts => "sandbox:9200"
        index => "project"
        document_id => "%{id}"
    }
    stdout {
        codec => json_lines
    }
}
Run it inside the logstash container:

bin/logstash -f config/mysql.conf
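If everything is wired up, the rows from nrd2_project should now be searchable in the project index. A quick check against Elasticsearch, assuming the port mapping from section 1.1 (the pipeline itself points at sandbox:9200):

curl "http://localhost:9200/project/_search?pretty"

The config above runs the SELECT once and then exits. If the table should be re-synced periodically, the jdbc input also supports a schedule option and incremental loading via :sql_last_value; a minimal sketch, assuming id is an auto-increment primary key:

jdbc {
    # connection settings as in mysql.conf above, then:
    schedule => "* * * * *"                                              # cron syntax: run every minute
    use_column_value => true
    tracking_column => "id"                                              # remember the last id seen
    statement => "SELECT * FROM nrd2_project WHERE id > :sql_last_value" # only fetch new rows
}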