ELK is a data-analysis stack that can ingest data from virtually any source and search, analyze, and visualize it in near real time. This article is a brief record of deploying an ELK environment; the steps on Linux are almost identical, but the commands here are run on macOS.
Prerequisites: Java 1.8.0, Node 8.14.0, Maven 3.5.0
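A quick way to confirm the prerequisites are on the PATH (a minimal check; the exact version strings printed will vary with your installation):
java -version   # expect a 1.8.0_xx build
node -v         # expect v8.14.0
mvn -v          # expect Apache Maven 3.5.0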
Target files (a network-drive link is at the end of the article, or download directly):
************************
ES 6.5.0 https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.5.0.tar.gz
Logstash 6.5.0 https://artifacts.elastic.co/downloads/logstash/logstash-6.5.0.tar.gz
Kibana 6.5.0 https://artifacts.elastic.co/downloads/kibana/kibana-6.5.0-darwin-x86_64.tar.gz
IK https://github.com/medcl/elasticsearch-analysis-ik/archive/master.zip
***********************
As of January 18, 2019, the latest ES release (https://www.elastic.co/downloads/elasticsearch) is 6.5.4, but the IK analysis plugin currently supports only up to 6.5.0, so the whole ELK stack is pinned to 6.5.0.
1. ES + IK
About ES: Elasticsearch is an open-source distributed search engine that provides data collection, analysis, and storage. Its features include a distributed architecture, zero configuration, automatic discovery, automatic index sharding, index replicas, a RESTful interface, multiple data sources, and automatic search load balancing.
The download URLs are already listed among the target files above. Using ES as the example, here is how to find them; the same steps apply to Logstash and Kibana, so they are not repeated.
1.1 Get the ES download URL
On the Elastic site https://www.elastic.co/ click the "Downloads" button at the top right, then go to Elasticsearch -> Download -> past releases to reach the version-selection page (https://www.elastic.co/downloads/past-releases), pick 6.5.0, and copy the download URL.
1.2 Download and install ES
Create an "ELK" folder and download the ES tarball into it:
mkdir /opt/soft/ELK
cd /opt/soft/ELK/
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.5.0.tar.gz
After downloading, extract it in the current directory, then go into the extracted folder and start ES to test it:
tar -zxvf elasticsearch-6.5.0.tar.gz
rm elasticsearch-6.5.0.tar.gz
cd elasticsearch-6.5.0
./bin/elasticsearch
運行l(wèi)og出現(xiàn)「started」代表運行成功智嚷,es端口默認為9200卖丸,到瀏覽器輸入http://localhost:9200/ 查看
1.3 Install the IK analysis plugin
Stop ES with Ctrl+C, then create an ik folder under plugins.
cd plugins
mkdir ik
Go back to the ELK directory and download IK:
cd ../..
wget https://github.com/medcl/elasticsearch-analysis-ik/archive/master.zip
Unzip the archive. If the unzip command is missing, install it with brew on macOS, or with yum/apt-get on Linux.
unzip master.zip
cd elasticsearch-analysis-ik-master
With the IK source unzipped and the working directory inside it, build the plugin package with Maven:
mvn clean package
The build takes a while. Once it succeeds, copy "elasticsearch-analysis-ik-6.5.0.zip" from target/releases/ into the ik folder under the ES plugins directory and unzip it:
mv target/releases/elasticsearch-analysis-ik-6.5.0.zip /opt/soft/ELK/elasticsearch-6.5.0/plugins/ik/
cd /opt/soft/ELK/elasticsearch-6.5.0/plugins/ik/
unzip elasticsearch-analysis-ik-6.5.0.zip
rm elasticsearch-analysis-ik-6.5.0.zip
After extracting, start ES again and watch the log for the IK plugin status.
If the log shows "loaded plugin [analysis-ik]" and ES reaches "started", the IK plugin is installed successfully.
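Installed plugins can also be listed directly instead of reading the startup log (run from the ES home directory):
cd /opt/soft/ELK/elasticsearch-6.5.0
./bin/elasticsearch-plugin list
# should print: analysis-ik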
1.4 Test ES + IK
Open a new terminal, create an index named index, then run a tokenization test on the sentence "你果然后面有戲份":
curl -XPUT http://localhost:9200/index
curl -H "Content-Type: application/json"? -XGET 'http://localhost:9200/index/_analyze?pretty=true' -d '
{
"analyzer" : "ik_max_word",
"text":"你果然后面有戲份"
}'
The terminal output:
{
  "tokens" : [
    {
      "token" : "你",
      "start_offset" : 0,
      "end_offset" : 1,
      "type" : "CN_CHAR",
      "position" : 0
    },
    {
      "token" : "果然",
      "start_offset" : 1,
      "end_offset" : 3,
      "type" : "CN_WORD",
      "position" : 1
    },
    {
      "token" : "然后",
      "start_offset" : 2,
      "end_offset" : 4,
      "type" : "CN_WORD",
      "position" : 2
    },
    {
      "token" : "后面",
      "start_offset" : 3,
      "end_offset" : 5,
      "type" : "CN_WORD",
      "position" : 3
    },
    {
      "token" : "面有",
      "start_offset" : 4,
      "end_offset" : 6,
      "type" : "CN_WORD",
      "position" : 4
    },
    {
      "token" : "有戲",
      "start_offset" : 5,
      "end_offset" : 7,
      "type" : "CN_WORD",
      "position" : 5
    },
    {
      "token" : "戲份",
      "start_offset" : 6,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 6
    }
  ]
}
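ik_max_word is the finest-grained segmentation the plugin offers; IK also ships an ik_smart analyzer that returns a coarser split. For comparison, the same request with the analyzer swapped (a sketch; the exact tokens depend on the IK dictionary bundled with your build):
curl -H "Content-Type: application/json" -XGET 'http://localhost:9200/index/_analyze?pretty=true' -d '
{
"analyzer" : "ik_smart",
"text":"你果然后面有戲份"
}'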
Test another sentence, "落花有意流水無情":
curl -H "Content-Type: application/json" -XGET 'http://localhost:9200/index/_analyze?pretty=true' -d '
{
"analyzer" : "ik_max_word",
"text":"落花有意流水無情"
}'
Output:
{
  "tokens" : [
    {
      "token" : "落花有意流水無情",
      "start_offset" : 0,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 0
    },
    {
      "token" : "落花有意",
      "start_offset" : 0,
      "end_offset" : 4,
      "type" : "CN_WORD",
      "position" : 1
    },
    {
      "token" : "落花",
      "start_offset" : 0,
      "end_offset" : 2,
      "type" : "CN_WORD",
      "position" : 2
    },
    {
      "token" : "有意",
      "start_offset" : 2,
      "end_offset" : 4,
      "type" : "CN_WORD",
      "position" : 3
    },
    {
      "token" : "流水無情",
      "start_offset" : 4,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 4
    },
    {
      "token" : "流水",
      "start_offset" : 4,
      "end_offset" : 6,
      "type" : "CN_WORD",
      "position" : 5
    },
    {
      "token" : "無情",
      "start_offset" : 6,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 6
    }
  ]
}
Chinese word segmentation is working, so the ES + IK installation and test are complete.
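In a real index you would normally bind IK to specific fields in the mapping rather than calling _analyze by hand. A minimal sketch, using a hypothetical articles index and the 6.x single-type mapping syntax (the index and field names here are made up for illustration):
curl -H "Content-Type: application/json" -XPUT 'http://localhost:9200/articles' -d '
{
  "mappings": {
    "doc": {
      "properties": {
        "content": {
          "type": "text",
          "analyzer": "ik_max_word",
          "search_analyzer": "ik_smart"
        }
      }
    }
  }
}'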
2. Logstash + input_jdbc
About Logstash: Logstash is mainly a tool for collecting, parsing, and filtering logs, and it supports a large number of input sources. It usually works in a client/server fashion: the client runs on each host whose logs need to be collected, and the server filters and transforms the events received from the nodes and forwards them to Elasticsearch.
2.1 Install Logstash
In the ELK folder, download and extract Logstash:
cd /opt/soft/ELK
wget https://artifacts.elastic.co/downloads/logstash/logstash-6.5.0.tar.gz
tar -zxvf logstash-6.5.0.tar.gz
2.2 Test Logstash
Run Logstash from the extracted folder:
cd logstash-6.5.0/
./bin/logstash -e 'input { stdin { } } output { stdout {} }'
運行l(wèi)ogstash后,日志面板出現(xiàn)「Successfully started」代表成功锐墙,輸入hello world 進行測試
[2019-01-18T17:07:16,543][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-18T17:07:16,789][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
hello world
{
       "message" => "hello world",
    "@timestamp" => 2019-01-18T09:07:23.194Z,
          "host" => "zhangyanandeMacBook-Pro.local",
      "@version" => "1"
}
2.3 Install the logstash-input-jdbc plugin
Install it with Logstash's bundled plugin command:
./bin/logstash-plugin install logstash-input-jdbc
When "Installation successful" appears, the plugin is installed.
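To double-check that the plugin is actually registered:
./bin/logstash-plugin list | grep jdbc
# the output should include logstash-input-jdbc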
2.4 Sync a MySQL database with Logstash
Create a sqlconfig.conf file:
vim sqlconfig.conf
input {
  jdbc {
    # path to the MySQL JDBC driver jar
    jdbc_driver_library => "/opt/soft/ELK/mysql-connector-java-5.1.47-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # connection string for the target database
    jdbc_connection_string => "jdbc:mysql://localhost:3306/infodata?useUnicode=true&characterEncoding=utf8&useSSL=true"
    # database user
    jdbc_user => "root"
    # database password
    jdbc_password => "Ac123456"
    # cron-style schedule: run once every minute
    schedule => "* * * * *"
    # SQL whose result rows are pushed to ES; a statement file can be used instead
    statement => "SELECT id,name,content,crtime from zztest"
  }
}
filter {
    json {
        source => "message"
        remove_field => ["message"]
    }
}
output {
    elasticsearch {
        hosts => "localhost:9200"
        # target index
        index => "sqlindex"
        # use a column from the query as the document id
        document_id => "%{id}"
    }
    stdout {
        codec => json_lines
    }
}
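As written, the statement re-selects every row once a minute. If the table has an auto-increment id (or an update timestamp), the jdbc input can remember the last value it saw and fetch only new rows. A minimal sketch of the relevant options, which would go inside the jdbc { } block above (assuming the id column is a usable cursor):
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    statement => "SELECT id,name,content,crtime from zztest WHERE id > :sql_last_value"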
Save the file, then run Logstash again, this time pointing it at sqlconfig.conf:
./bin/logstash -f sqlconfig.conf
Check the log to see whether the database rows are synced to ES:
Sending Logstash logs to /opt/soft/ELK/logstash-6.5.0/logs which is now configured via log4j2.properties
[2019-01-18T17:26:41,208][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-18T17:26:41,227][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.5.0"}
[2019-01-18T17:26:44,471][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-01-18T17:26:45,001][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-01-18T17:26:45,010][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2019-01-18T17:26:45,289][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-01-18T17:26:45,388][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-01-18T17:26:45,392][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2019-01-18T17:26:45,430][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-01-18T17:26:45,456][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-01-18T17:26:45,474][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-01-18T17:26:45,598][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2019-01-18T17:26:45,964][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x2b242f8a run>"}
[2019-01-18T17:26:46,011][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-18T17:26:46,326][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-18T17:27:01,682][INFO ][logstash.inputs.jdbc     ] (0.027233s) SELECT id,name,content,crtime from zztest
{"@timestamp":"2019-01-18T09:27:01.820Z","name":"zhyn","content":"elktest","crtime":"2019-01-17T08:03:00.000Z","@version":"1","id":1}
{"@timestamp":"2019-01-18T09:27:01.841Z","name":"張哈哈","content":"同步成功","crtime":"2019-01-17T08:31:08.000Z","@version":"1","id":3}
{"@timestamp":"2019-01-18T09:27:01.840Z","name":"zhyn4j","content":"eltest333","crtime":"2019-01-17T08:16:35.000Z","@version":"1","id":2}
Use the ES search API to look at everything in the "sqlindex" index:
curl -H "Content-Type: application/json" -XPOST 'localhost:9200/sqlindex/_search?pretty' -d '
{
? "query": { "match_all": {} }
}'
The response:
{
  "took" : 62,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : 3,
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "sqlindex",
        "_type" : "doc",
        "_id" : "2",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2019-01-18T09:30:00.157Z",
          "name" : "zhyn4j",
          "content" : "eltest333",
          "crtime" : "2019-01-17T08:16:35.000Z",
          "@version" : "1",
          "id" : 2
        }
      },
      {
        "_index" : "sqlindex",
        "_type" : "doc",
        "_id" : "1",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2019-01-18T09:30:00.156Z",
          "name" : "zhyn",
          "content" : "elktest",
          "crtime" : "2019-01-17T08:03:00.000Z",
          "@version" : "1",
          "id" : 1
        }
      },
      {
        "_index" : "sqlindex",
        "_type" : "doc",
        "_id" : "3",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2019-01-18T09:30:00.157Z",
          "name" : "張哈哈",
          "content" : "同步成功",
          "crtime" : "2019-01-17T08:31:08.000Z",
          "@version" : "1",
          "id" : 3
        }
      }
    ]
  }
}
With that, the Logstash database sync is working.
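A targeted query also works against the synced documents, for example matching on the content field (a sketch; which document comes back depends on your data):
curl -H "Content-Type: application/json" -XPOST 'localhost:9200/sqlindex/_search?pretty' -d '
{
  "query": { "match": { "content": "elktest" } }
}'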
3. Kibana
About Kibana: Kibana is a free, open-source tool that provides a friendly web interface for log analysis on top of Logstash and Elasticsearch, helping to aggregate, analyze, and search important log data.
3.1 Install Kibana
Install Kibana under the ELK directory:
cd /opt/soft/ELK
wget https://artifacts.elastic.co/downloads/kibana/kibana-6.5.0-darwin-x86_64.tar.gz
tar -zxvf kibana-6.5.0-darwin-x86_64.tar.gz
3.2 Run Kibana
cd kibana-6.5.0-darwin-x86_64
./bin/kibana
Kibana's default port is 5601.
Open http://localhost:5601 in a browser to confirm it is up.
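Kibana also exposes a status API that can be hit from a terminal as a quick sanity check (it returns a JSON document describing the server and plugin states):
curl http://localhost:5601/api/status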
The commonly used items in the Kibana sidebar are:
Discover: search and browse indexed data
Visualize: build charts
Dashboard: assemble dashboards
Timelion: advanced visualization and analysis of time-series data
DevTools: developer tools
Management: configuration
Go to the Discover tab and look at the available indices; if sqlindex is listed, Kibana is connected to ES and the installation succeeded.
4. Summary
The ELK environment is now deployed. I hope to cover hands-on data collection and analysis with it in later articles and help newcomers avoid some detours. The files mentioned in this article, including the MySQL JDBC driver jar, have been uploaded to the network drive:
https://pan.baidu.com/s/1VVs667OiDvZph8oq9yBk3Q  password: br7n