Business requirements call for upgrading our production elasticsearch-2.3 to the latest release; according to the official announcements, 7.2.0 is a qualitative leap over earlier versions in speed and efficiency!
We will install elasticsearch-7.2.0 together with its companion components.
I. First, download the required packages from their respective official sites:
elasticsearch-7.2.0
elasticsearch-head-master
kibana-7.2.0-linux-x86_64
logstash-7.2.0
elasticsearch-analysis-ik-7.2.0
node-v8.16.0-linux-x64.tar
I won't paste the official download URLs here; instead, here is a Baidu Cloud share with the packages I downloaded:
Link: https://pan.baidu.com/s/1qUSDGHaIRHfjjyfyZ7A9Pg
Extraction code: e54v
II. Pre-installation preparation
1. elasticsearch refuses to start under the root account, so first create a dedicated user:
useradd elk
2. Raise the system resource limits:
vim /etc/security/limits.conf
Add the following entries:
elk soft nofile 65536
elk hard nofile 65536
elk soft nproc 2048
elk hard nproc 4096
vim /etc/sysctl.conf
Add:
vm.max_map_count=655360
Apply the new value with:
sysctl -p
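After logging out and back in as the elk user, you can verify that the new limits are in effect; a quick read-only check:

```shell
# Soft and hard open-file limits for the current session
ulimit -Sn
ulimit -Hn
# Kernel mmap limit applied by sysctl -p (set to 655360 above)
cat /proc/sys/vm/max_map_count
```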
III. Installation and deployment
1. Install elasticsearch
All of the packages are binary distributions, so nothing needs to be compiled; they can be used as-is.
Unpack it:
tar -xvf elasticsearch-7.2.0.tar.gz
Move it to a suitable location:
mv elasticsearch-7.2.0 /home/elk/
Change the owner to the elk user:
chown -R elk.elk /home/elk/elasticsearch-7.2.0
Edit the elasticsearch configuration file:
vim /home/elk/elasticsearch-7.2.0/config/elasticsearch.yml
cluster.name: xavito
transport.tcp.compress: true
cluster.initial_master_nodes: ["vito248","vito203"]
discovery.seed_hosts: ["192.168.1.248", "192.168.1.203"]
node.name: vito203
http.port: 9200
bootstrap.memory_lock: false
bootstrap.system_call_filter: false
network.host: 0.0.0.0
http.cors.enabled: true
http.cors.allow-origin: "*"
Mine is a two-node cluster; for a single node the cluster settings can be left out. For a cluster, take care with the discovery settings: the 7.x syntax (cluster.initial_master_nodes plus discovery.seed_hosts) differs from earlier versions.
Get it wrong and the service will not start, or elasticsearch-head-master will later fail to connect from the browser.
You can also point the data and log files at your own paths; by default they live under the installation directory:
path.data: /path/to/data
path.logs: /path/to/logs
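For reference, if you are deploying a single node rather than the two-node cluster above, 7.x still needs an explicit discovery mode; a minimal sketch (the node name here is an example):

```yaml
cluster.name: xavito
node.name: vito248
network.host: 0.0.0.0
http.port: 9200
# Tells 7.x not to look for other master-eligible nodes
discovery.type: single-node
```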
Start it directly as the elk user:
su elk
/home/elk/elasticsearch-7.2.0/bin/elasticsearch -d (-d runs it as a daemon)
Once it is up, check port 9200 with curl or a browser:
A JSON banner in the response means the startup succeeded.
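For example, `curl http://192.168.1.248:9200` should return a banner along these lines (abbreviated; the name, cluster_uuid, and build fields will reflect your own node):

```json
{
  "name" : "vito203",
  "cluster_name" : "xavito",
  "version" : {
    "number" : "7.2.0",
    "lucene_version" : "8.0.0"
  },
  "tagline" : "You Know, for Search"
}
```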
2. Install elasticsearch-head-master.zip
Unpack it:
unzip elasticsearch-head-master.zip
mv elasticsearch-head-master /home/elk/
chown -R elk.elk /home/elk/elasticsearch-head-master
Install:
head requires node.js before it can be installed, so install node first if you don't have it; node just needs to be reachable on the PATH.
Installing node:
Upload the package to a convenient location (I use /usr/local/application/) and unpack it (the commands below use node v10.6.0; substitute whichever version you downloaded):
tar -xvf node-v10.6.0-linux-x64.tar.xz
Rename the directory:
mv node-v10.6.0-linux-x64 nodejs
Make node and npm globally available through symlinks:
ln -s /usr/local/application/nodejs/bin/npm /usr/local/bin/
ln -s /usr/local/application/nodejs/bin/node /usr/local/bin/
Check that the installation succeeded:
node -v
v10.6.0
If node is already installed, skip the steps above.
cd /home/elk/elasticsearch-head-master
Run npm install
Run npm run start (or npm run start & to keep it in the background)
netstat -nltp
Seeing port 9100 listening means the installation succeeded.
You can now open it in a browser.
3. Install kibana-7.2.0-linux-x86_64.tar.gz. Since 7.2, Kibana officially supports a Chinese interface, enabled through the config file.
Unpack it:
tar -xvf kibana-7.2.0-linux-x86_64.tar.gz
mv kibana-7.2.0-linux-x86_64 /home/elk/
chown -R elk.elk /home/elk/kibana-7.2.0-linux-x86_64
vim /home/elk/kibana-7.2.0-linux-x86_64/config/kibana.yml
Add the following parameters to the config file:
server.host: "192.168.1.248"
i18n.locale: "zh-CN"
and the interface will be in Chinese.
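If Kibana talks to Elasticsearch on another host, also point it at the cluster explicitly; a sketch of the relevant kibana.yml lines (addresses match the cluster above; adjust to your own):

```yaml
server.host: "192.168.1.248"
elasticsearch.hosts: ["http://192.168.1.248:9200", "http://192.168.1.203:9200"]
i18n.locale: "zh-CN"
```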
4. Install the elasticsearch-analysis-ik-7.2.0.zip Chinese analysis plugin
unzip elasticsearch-analysis-ik-7.2.0.zip
chown -R elk.elk /home/elk/elasticsearch-analysis-ik-7.2.0
Drop it straight into elasticsearch's plugins directory:
mv elasticsearch-analysis-ik-7.2.0 /home/elk/elasticsearch-7.2.0/plugins/ik
Then restart elasticsearch.
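To confirm the plugin loaded, you can exercise the ik analyzer through the _analyze API; a sketch in Kibana Dev Tools syntax (the sample text is arbitrary):

```
POST /_analyze
{
  "analyzer": "ik_max_word",
  "text": "中華人民共和國"
}
```

A successful response lists the tokens produced by ik_max_word; an error about an unknown analyzer means the plugin did not load.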
5. Install logstash-7.2.0.tar.gz
Unpack it:
tar -xvf logstash-7.2.0.tar.gz
mv logstash-7.2.0 /home/elk/
chown -R elk.elk /home/elk/logstash-7.2.0
Edit a config file to match your actual needs, for example:
vim format.cnf
input {
  file {
    path => "/mnt/new/tomcat/LOGS/WebLogs/user/*.log"
    type => "userlog"
    start_position => "beginning"
    sincedb_path => "/root/logstash-7.2.0/sincedb"
  }
  file {
    path => "/mnt/new/tomcat/LOGS/*/output/*.log"
    type => "syslog"
    codec => multiline {
      pattern => "^\d"
      negate => true
      what => "previous"
    }
    start_position => "beginning"
    sincedb_path => "/root/logstash-7.2.0/sincedb"
  }
  file {
    path => "/mnt/new/tomcat/LOGS/*/pay/*.log"
    type => "pay"
    codec => multiline {
      pattern => "^\d"
      negate => true
      what => "previous"
    }
    start_position => "beginning"
    sincedb_path => "/root/logstash-7.2.0/sincedb"
  }
}
filter {
  if [type] == "userlog" {
    grok {
      patterns_dir => "/root/logstash-7.2.0/reg"
      match => {
        "message" => "%{logdatetime:times} %{MSG_:msg}"
      }
    }
    json {
      source => "msg"
      target => "msg"
    }
    if [msg][uri] == "/base/msg/tips/queryMsgTipsByUser.htm" {
      drop {}
    }
    if [msg][ip] !~ "127\.|192\.168\.|172\.1[6-9]\.|172\.2[0-9]\.|172\.3[01]\.|10\." {
      geoip {
        source => "[msg][ip]"
        database => "/root/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.1-java/vendor/GeoLite2-City.mmdb"
        target => "geoip"
        fields => ["region_name","city_name","location"]
      }
    }
    useragent {
      source => "[msg][ua]"
      target => "ua"
    }
    date {
      locale => "cn"
      match => ["times", "yyyy-MM-dd HH:mm:ss.SSS", "UNIX"]
      target => "@timestamp"
    }
    mutate {
      remove_field => "@version"
      remove_field => "message"
      remove_field => "[msg][ua]"
      remove_field => "times"
    }
  }
  if [type] == "syslog" {
    grok {
      patterns_dir => "/root/logstash-7.2.0/reg"
      match => ["message", "%{logdatetime:times} .*%{LOGLEVEL:loglevel} {0,4}%{CLASSNAME:classname} {0,4}- {0,4}%{INFOMSG:infomsg}"]
    }
    if [classname] =~ "^com.alibaba.dubbo.config.AbstractConfig" {
      drop {}
    }
    if [classname] =~ "^org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping" {
      drop {}
    }
    if [classname] =~ "^o.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping" {
      drop {}
    }
    if [loglevel] == "DEBUG" {
      drop {}
    }
    date {
      locale => "cn"
      match => ["times", "yyyy-MM-dd HH:mm:ss.SSS", "UNIX"]
      target => "@timestamp"
    }
    mutate {
      remove_field => "@version"
      remove_field => "times"
      remove_field => "message"
      remove_field => "tags"
    }
  }
  if [type] == "pay" {
    grok {
      patterns_dir => "/root/logstash-7.2.0/reg"
      match => ["message", "%{logdatetime:times} .*%{LOGLEVEL:loglevel} {0,4}%{CLASSNAME:classname} {0,4}- {0,4}%{INFOMSG:infomsg}"]
    }
    date {
      locale => "cn"
      match => ["times", "yyyy-MM-dd HH:mm:ss.SSS", "UNIX"]
      target => "@timestamp"
    }
    mutate {
      remove_field => "@version"
      remove_field => "times"
      remove_field => "message"
      remove_field => "tags"
    }
  }
}
output {
  elasticsearch {
    hosts => ["192.168.1.248:9200","192.168.1.203:9200"]
    codec => "json"
    index => "log-%{type}-%{+YYYYMM}"
    manage_template => true
    template_overwrite => true
    template_name => "my_logstash"
    template => "/root/logstash-7.2.0/bin/logstash.json"
  }
}
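The grok blocks above rely on custom patterns (logdatetime, MSG_, CLASSNAME, INFOMSG) loaded from the patterns_dir /root/logstash-7.2.0/reg. That file is not shown in the original; a hypothetical sketch of what it could contain (these regexes are my assumptions, not the author's actual patterns):

```
logdatetime %{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}
MSG_ .*
CLASSNAME [A-Za-z0-9_$.]+
INFOMSG .*
```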
Start it:
/home/elk/logstash-7.2.0/bin/logstash -f /path/format.cnf
(You can validate the config first by adding the --config.test_and_exit flag.)
That completes the installation!