# Log Analysis Platform - ELK
[Course shared notes](https://paper.dropbox.com/doc/20230613-es0Q2ZfeY0EbpUhLMa1Dj)
**6/13**
Instructor Russell's email: russell.chen@bimap.co
### Elasticsearch Installation
```bash=
$ sudo -i    # switch to the root user
$ apt-get install wget
$ wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-8.5.3-amd64.deb
$ dpkg -i elasticsearch-8.5.3-amd64.deb
$ systemctl daemon-reload
$ systemctl start elasticsearch
```
* At this point, browsing to https://ip:9200 prompts for username/password authentication.
* Reset the passwords for the `elastic` and `kibana_system` users:
```bash=
$ /usr/share/elasticsearch/bin/elasticsearch-reset-password -i -u elastic
$ /usr/share/elasticsearch/bin/elasticsearch-reset-password -i -u kibana_system
```
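A quick sanity check that the node is up (an extra step, not in the original notes): query the root endpoint with the `elastic` password just set, assuming you run it on the Elasticsearch host itself. `-k` skips verification of the self-signed certificate; `<ELASTIC_PASSWORD>` is a placeholder.
```bash=
$ curl -k -u elastic:<ELASTIC_PASSWORD> https://localhost:9200
# A JSON response with the cluster name and version number means the node is running
```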
### Kibana Installation
```bash=
$ wget https://artifacts.elastic.co/downloads/kibana/kibana-8.5.3-amd64.deb
$ dpkg -i kibana-8.5.3-amd64.deb
$ vim /etc/kibana/kibana.yml
# add this line at the top of the file: server.host: "0.0.0.0"
$ systemctl start kibana
```
* Browse to http://ip:5601; it asks for an enrollment token. Generate the token on the Ubuntu host:
```bash=
$ /usr/share/elasticsearch/bin/elasticsearch-create-enrollment-token -s kibana
```

Paste the token into the page in the browser.

Then generate the verification code:
```bash=
$ /usr/share/kibana/bin/kibana-verification-code
```



Then log in with the username `elastic` and the password set earlier.

### Elasticsearch Configuration Changes
* Create the data directory first
```bash=
$ mkdir -p /data/es
$ chown -R elasticsearch. /data    # set owner and group to elasticsearch
```
* Edit the configuration file
```bash=
$ vi /etc/elasticsearch/elasticsearch.yml
```
```yaml=
# For a single node, leave cluster.name, node.name, the log path, network.host and http.port at their defaults
path.data: /data/es
bootstrap.memory_lock: true
```
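With `bootstrap.memory_lock: true`, a deb/systemd install typically also needs the service's memlock limit raised, otherwise Elasticsearch may fail to lock its heap. A minimal sketch of the systemd override (an addition to the class steps; do this before the restart below):
```bash=
$ systemctl edit elasticsearch
# in the override file that opens, add these two lines, then save:
#   [Service]
#   LimitMEMLOCK=infinity
$ systemctl daemon-reload
```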
* Restart Elasticsearch
```bash=
$ systemctl restart elasticsearch
```
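To confirm the memory lock took effect after the restart, the nodes API exposes an `mlockall` flag (a quick check, using the same placeholder password as before):
```bash=
$ curl -k -u elastic:<ELASTIC_PASSWORD> "https://localhost:9200/_nodes?filter_path=**.mlockall&pretty"
# "mlockall" : true means the heap is locked in memory
```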
### Logstash Installation
```bash=
$ wget https://artifacts.elastic.co/downloads/logstash/logstash-8.5.3-amd64.deb
$ dpkg -i logstash-8.5.3-amd64.deb
```
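A quick way to confirm the install (not in the original notes); the Logstash binaries live under /usr/share/logstash:
```bash=
$ /usr/share/logstash/bin/logstash --version
```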
## ELK Introduction



## Using Kibana


* Four ways to ingest data
a. From Logstash
b. Directly via the Bulk API (see the sketch after this list)
c. Directly from Beats
d. From Kibana with Data Visualizer (this class uses this method to import a CSV)
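For method (b), a minimal sketch of the Bulk API with curl; the index name `test-bulk` and the two documents are made up for illustration, and `<ELASTIC_PASSWORD>` is a placeholder:
```bash=
# The bulk body is NDJSON: an action line followed by a document line, ending with a newline
$ cat > bulk.ndjson <<'EOF'
{ "index": { "_index": "test-bulk" } }
{ "movie_title": "Example Movie", "tomatometer_rating": 90 }
{ "index": { "_index": "test-bulk" } }
{ "movie_title": "Another Example", "tomatometer_rating": 75 }
EOF
$ curl -k -u elastic:<ELASTIC_PASSWORD> -XPOST "https://localhost:9200/_bulk" \
    -H 'Content-Type: application/x-ndjson' --data-binary "@bulk.ndjson"
```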
* Importing CSV data




* Adjust the data before Import
a. Naming rules for the index and fields: no uppercase letters, no spaces, and most symbols are not allowed (exceptions: , - _ [ ])
b. Create a new Data View (tick the checkbox)
c. Index basic settings (leave unchanged)
d. Mappings
e. Ingest pipeline (`long` = numeric); a convert-processor sketch follows this list
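As a sketch of (e): an ingest pipeline runs processors on each document before it is indexed. The example below defines a hypothetical pipeline named `csv-convert` that casts one of the CSV columns to `long`; the name and field choice are assumptions.
```bash=
$ curl -k -u elastic:<ELASTIC_PASSWORD> -XPUT "https://localhost:9200/_ingest/pipeline/csv-convert" \
    -H 'Content-Type: application/json' -d '
{
  "description": "cast numeric CSV columns to long",
  "processors": [
    { "convert": { "field": "runtime_in_minutes", "type": "long", "ignore_missing": true } }
  ]
}'
```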







* Change the Mappings section to the code below and then press Import (the `keyword` sub-fields make charting easier later)
```json=
{"properties": {
"@timestamp": {"type": "date"},
"audience_count": {"type": "long"},
"audience_fresh_critics_count": {"type": "long"},
"audience_rating": {"type": "long"},
"audience_rotten_critics_count": {"type": "long"},
"audience_status": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}},
"audience_top_critics_count": {"type": "long"},
"cast": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}},
"critics_consensus": {"type": "text"},
"directors": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}},
"genre": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}},
"in_theaters_date": {"type": "date","format": "iso8601"},
"movie_info": {"type": "text"},
"movie_title": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}},
"on_streaming_date": {"type": "date","format": "iso8601"},
"poster_image_url": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}},
"rating": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}},
"rotten_tomatoes_link": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}},
"runtime_in_minutes": {"type": "long"},
"studio_name": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}},
"tomatometer_count": {"type": "long"},
"tomatometer_rating": {"type": "long"},
"tomatometer_status": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}},
"writers": {"type": "text","fields": {"keyword": {"type": "keyword","ignore_above": 256}}}
}}
```
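For reference, the same kind of mapping can also be applied by creating the index through the API before loading data; a compact sketch with just two of the fields above (the index name `movies` is an assumption):
```bash=
$ curl -k -u elastic:<ELASTIC_PASSWORD> -XPUT "https://localhost:9200/movies" \
    -H 'Content-Type: application/json' -d '
{
  "mappings": {
    "properties": {
      "movie_title": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "tomatometer_rating": { "type": "long" }
    }
  }
}'
```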


* Import finished

* Set the time range to the last 150 years


* How to raise the Data Visualizer upload limit (manual upload is usually only for testing; not changed in class):
Stack Management → Advanced Settings → change "Maximum file upload size"
* Ingest pipeline vs. Logstash (Logstash is usually used)

* Viewing Index Management
Menu - Stack Management - Index Management
A single node shows Yellow; it turns Green once a replica can be placed on another machine. With replica shards, disk usage grows by the same multiple.
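The same health information is exposed by the API; a quick check, using the same placeholder password as before:
```bash=
$ curl -k -u elastic:<ELASTIC_PASSWORD> "https://localhost:9200/_cluster/health?pretty"
$ curl -k -u elastic:<ELASTIC_PASSWORD> "https://localhost:9200/_cat/indices?v"
# the "health" column shows green / yellow / red per index
```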

* Data View and Index

* To see all Data Views together:
Menu - Stack Management - Kibana - Data Views

* Data View field validation: a field can only be used for charts when its aggregatable indicator is green.

**6/14**
* Working with Discover

* Filters: pick the `.keyword` field so the value box shows a preview list you can select from.

* After creating a filter, choose Temporarily disable to suspend it without deleting it.

* Select the fields you want to see.

* Fields can be quickly filtered and previewed.

* Quickly view field statistics.

* The search, together with its selected fields, can be saved.

* Results can be exported as a CSV report.

* Every Data View ID is unique; if charts are bound to it and the Data View is deleted, they can no longer find it.

* For an "is between" filter, leaving the upper bound empty means no upper limit.

* If a filter's label is too long, you can give it a custom name.

### KQL (Kibana Query Language) Queries


### Visualization
* Visualize Library


* Aggregation-based

* Metric






* Chart backup
Stack Management - Saved Objects - Export (do not tick "Include related objects", otherwise the Data View is exported as well, and importing again creates an extra Data View)
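The same backup can be scripted against Kibana's Saved Objects export API; a sketch that exports all dashboards to an NDJSON file. The `kbn-xsrf` header is required; port 5601 and the `elastic` credentials are assumptions about this setup:
```bash=
$ curl -u elastic:<ELASTIC_PASSWORD> -H 'kbn-xsrf: true' -H 'Content-Type: application/json' \
    -XPOST "http://localhost:5601/api/saved_objects/_export" \
    -d '{ "type": "dashboard", "includeReferencesDeep": false }' > dashboards.ndjson
```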

### Logstash
* Standard input/output to the screen: create a test.conf under the home directory with the following content
```conf=
input {
  stdin {}
}
output {
  stdout {
    codec => rubydebug
  }
}
```
Then run this test.conf:
```bash=
$ /usr/share/logstash/bin/logstash -f <path>/test.conf
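# Optional extra (not from the class notes): check the pipeline syntax and exit instead of starting it
$ /usr/share/logstash/bin/logstash -f <path>/test.conf --config.test_and_exit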
```
Output:

* Standard input, output to a file: create another test2.conf
```conf=
input {
  stdin {}
}
output {
  file {
    path => "/home/admin1/output.log"
  }
}
```
Output:


### Homework
* Assignment: use Logstash to set the date and latitude/longitude data types, output to Elasticsearch, and analyze it with Kibana; build a Kibana dashboard with at least 5 charts, content of your choice.
```conf=
# Create a VM on GCP and install ELK
# 34.81.114.119:9200 / 5601
# Create a conf file with the following content
input {
  file {
    path => "/home/samantha.yu0611/SALES222.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["UNIT","AC-YEAR","YEAR","MONTH","YM","DEP","STORE","BRAND","SEASON","TYPE","QUALITY","SALES(K)","COST(K)","PRICE(K)","MARGIN(K)","ORIGEN","Latitude","Longitude"]
  }
  mutate {
    remove_field => "AC-YEAR"
  }
  date {
    match => ["YM", "yyyy/MM/dd"]
    target => "YM"
  }
  mutate {
    replace => { "@timestamp" => "%{+YYYY-MM-dd}" }
  }
  mutate {
    convert => { "SALES(K)" => "float" }
    convert => { "COST(K)" => "float" }
    convert => { "PRICE(K)" => "float" }
    convert => { "MARGIN(K)" => "float" }
  }
  mutate {
    convert => { "Latitude" => "float" }
    convert => { "Longitude" => "float" }
  }
  mutate {
    add_field => { "location" => "%{Latitude}, %{Longitude}" }
  }
}
output {
  elasticsearch {
    hosts => ["https://34.81.114.119:9200"]
    index => "logstash-sales6"
    ssl => true
    ssl_certificate_verification => false
    user => elastic
    password => "123456"
  }
  stdout { codec => rubydebug }
}
```
* Run the conf file
```bash=
$ /usr/share/logstash/bin/logstash -f /root/elk/0615.conf
```
**6/27**
### Home menu - Dev Tools - Grok Debugger
* sample data
```2017-12-04 12:34:56,789 [1] INFO MemberController - Call SampleDB.SP_CreateMember [0.001234]```
* grok
```py=
%{DATA:logTimestamp} \[%{NUMBER:num}\] %{GREEDYDATA:msg}
```
Pattern names must be wrapped in curly braces `%{ }`; a literal square bracket must be escaped with a backslash.
Use NOTSPACE for a string that contains no spaces, e.g. `%{NOTSPACE:controller}` would capture `MemberController` in the sample line above.






### ES RESTful API
* Menu - Dev Tools - Console
* View index information (`?v` adds the header row)
```GET _cat/indices/?v```

* The API can also be called from the terminal
```bash=
$ curl -k -u elastic:123456 -XGET "https://0.0.0.0:9200/_cat/indices/?v"
```

* Commonly used APIs (a few examples below)
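A short sketch of frequently used endpoints; they can be run with curl as below, or pasted into the Dev Tools Console without the curl wrapper. The index name comes from the homework above and `<ELASTIC_PASSWORD>` is a placeholder:
```bash=
$ curl -k -u elastic:<ELASTIC_PASSWORD> "https://localhost:9200/_cluster/health?pretty"                  # cluster status
$ curl -k -u elastic:<ELASTIC_PASSWORD> "https://localhost:9200/_cat/nodes?v"                            # list nodes
$ curl -k -u elastic:<ELASTIC_PASSWORD> "https://localhost:9200/logstash-sales6/_count?pretty"           # document count
$ curl -k -u elastic:<ELASTIC_PASSWORD> "https://localhost:9200/logstash-sales6/_search?size=1&pretty"   # sample one document
```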

* A template is the equivalent of Index Management - Index Templates - Mappings in the UI

* View templates via the API
```py=
GET _cat/templates/?v               # list all templates
GET _index_template/ecs-logstash    # view a single template, e.g. the logstash one
```
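For reference, a template can also be created through the same API; a sketch with a made-up template name `sales-template` that maps the homework's `location` field as a `geo_point` (runnable with curl, or in Console without the curl wrapper):
```bash=
$ curl -k -u elastic:<ELASTIC_PASSWORD> -XPUT "https://localhost:9200/_index_template/sales-template" \
    -H 'Content-Type: application/json' -d '
{
  "index_patterns": ["logstash-sales*"],
  "template": {
    "settings": { "number_of_replicas": 0 },
    "mappings": {
      "properties": { "location": { "type": "geo_point" } }
    }
  }
}'
```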
* Query syntax can be looked up here first

###### tags: `Log Analysis Platform` `ELK`