Use the ELK Stack to Analyze and Store Logs
=
Preface
-
Previously, we installed and configured rsyslog, MySQL, and LogAnalyzer.
Now we will use another set of tools to collect logs and produce better statistics.
Objective
-
Deploy the ELK stack and use it to analyze logs and produce statistics.
System environment
-
[Transport iptables log file to log server](https://hackmd.io/@0zewB80fSB2qjONXd8dXPw/rkGmm-WNH)
We use the same environment as in that note, and install the ELK stack on the log server.
Let's Deploy
===
Install ELK
-
Install Java (OpenJDK)
```
yum install java
```
Download **ElasticSearch** and extract it
```
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-7.3.0-linux-x86_64.tar.gz
tar -zxvf elasticsearch-7.3.0-linux-x86_64.tar.gz
cd elasticsearch-7.3.0
bin/elasticsearch  # run once to make sure it starts
```
Modify the config file `config/elasticsearch.yml`:
```
cluster.name: log-elasticsearch
network.host: $SERVER_IP
http.port: 9200
discovery.seed_hosts: ["127.0.0.1", "[::1]", "$SERVER_IP"]
```
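Before starting the node, a quick sanity check on the file can catch typos. The sketch below writes the fragment to a scratch path and greps for the key settings; the `/tmp` path and the `192.168.0.250` address (standing in for `$SERVER_IP`, taken from the Kibana setting later in this note) are assumptions for illustration:

```shell
# Write the elasticsearch.yml fragment above to a scratch file and verify
# the key settings are present. The /tmp path and 192.168.0.250 (standing
# in for $SERVER_IP) are assumptions for this illustration.
cat > /tmp/es-check.yml <<'EOF'
cluster.name: log-elasticsearch
network.host: 192.168.0.250
http.port: 9200
discovery.seed_hosts: ["127.0.0.1", "[::1]", "192.168.0.250"]
EOF
grep -q '^cluster.name: log-elasticsearch$' /tmp/es-check.yml &&
grep -q '^http.port: 9200$' /tmp/es-check.yml &&
echo "elasticsearch.yml fragment looks OK"
```

On the real server, restart `bin/elasticsearch` after editing and confirm the node answers on port 9200.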
Download **Kibana** and extract it
```
wget https://artifacts.elastic.co/downloads/kibana/kibana-7.3.0-linux-x86_64.tar.gz
tar -zxvf kibana-7.3.0-linux-x86_64.tar.gz
cd kibana-7.3.0-linux-x86_64/config
vim kibana.yml
```
Set the following in `config/kibana.yml`:
```
server.port: 5601
server.host: $SERVER_NAME
elasticsearch.hosts: ["http://$elasticsearch_SERVER_IP:9200"]
```
Download **Logstash** and extract it
```
wget https://artifacts.elastic.co/downloads/logstash/logstash-7.3.0.tar.gz
tar -zxvf logstash-7.3.0.tar.gz
```
Transport method
===
Transport rsyslog log files through Logstash
-
Create a Logstash config file, e.g. `config/syslog.conf`:
```
input {
  syslog {
    port => "514"
  }
}
output {
  elasticsearch { hosts => ["$Elasticsearch_SERVER:9200"] }
  stdout {}
}
```
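With the pipeline listening, you can hand-feed it a syslog line. The sketch below builds a minimal RFC 3164 message and pushes it over UDP using bash's `/dev/udp` device; the hostname, tag, and the `127.0.0.1` target are made up for the test:

```shell
# Build a minimal RFC 3164 syslog line and send it over UDP to the
# Logstash syslog input. PRI <13> = facility user(1) * 8 + severity notice(5).
# 127.0.0.1:514 assumes Logstash runs on this host (an assumption here);
# a single UDP write succeeds even if nothing is listening yet.
MSG="<13>$(date '+%b %d %H:%M:%S') testhost myapp: hello from the log client"
echo "$MSG" > /dev/udp/127.0.0.1/514 && echo "sent"
```

If the pipeline is running, the message then shows up on Logstash's stdout (because of the `stdout{}` output) and in Elasticsearch.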
Start up Logstash:
```
bin/logstash -f config/syslog.conf --config.test_and_exit  # optional: validate the config first
bin/logstash -f config/syslog.conf
```
Transport log files through Filebeat
-
Download **Filebeat** and extract it
```
wget https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.3.0-linux-x86_64.tar.gz
tar -zxvf filebeat-7.3.0-linux-x86_64.tar.gz
```
Modify the Filebeat config file `./filebeat.yml`
-
```
#=====Filebeat inputs=====
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
#=====Kibana=====
setup.kibana:
  host: "192.168.0.250:5601"
#=====Outputs=====
#-----Elasticsearch output-----
output.elasticsearch:
  hosts: ["$elasticsearch_SERVER_IP:9200"]
```
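Indentation is what usually breaks this file. On the server, Filebeat can do the authoritative check with `./filebeat test config`; as a quick offline approximation, the sketch below writes the fragment to a scratch path and greps the lines most often mis-indented (the `/tmp` path, and reusing `192.168.0.250` for the Elasticsearch host, are assumptions):

```shell
# Write the filebeat.yml fragment above to a scratch file and check the
# two lines whose indentation is easiest to get wrong. The /tmp path and
# the 192.168.0.250 addresses are assumptions for this illustration.
cat > /tmp/filebeat-check.yml <<'EOF'
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
setup.kibana:
  host: "192.168.0.250:5601"
output.elasticsearch:
  hosts: ["192.168.0.250:9200"]
EOF
grep -q '^- type: log' /tmp/filebeat-check.yml &&
grep -q '^    - /var/log/\*\.log' /tmp/filebeat-check.yml &&
echo "filebeat.yml fragment looks OK"
```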
Browse to Kibana's **Discover** page and the logs will show up on screen.
Some Errors
===
Insufficient space for shared memory file
-
Free up disk space: find what is using it, then remove files or stop the offending processes.
```
df -h                     # which filesystem is full
du -h -x --max-depth=1    # which directory is largest
ps aux                    # which process is writing / holding files
kill <PID>                # stop it if appropriate
```
```
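To see how `du` pinpoints the space hog, here is a self-contained toy run; everything under `/tmp/du-demo` is fabricated for the demo:

```shell
# Build a small directory tree, then let du report which subdirectory
# is eating the space. All paths here are fabricated for illustration.
mkdir -p /tmp/du-demo/big /tmp/du-demo/small
dd if=/dev/zero of=/tmp/du-demo/big/blob bs=1024 count=2048 2>/dev/null
echo "tiny" > /tmp/du-demo/small/file
du -k -x --max-depth=1 /tmp/du-demo | sort -rn   # largest directories first
```

Remove `/tmp/du-demo` afterwards. On a real server, repeat `du` from `/` downwards until you find the offender, then delete files or `kill` the writer.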
Creating a Kibana index pattern is forbidden
-
When disk usage crosses Elasticsearch's flood-stage watermark (95% by default), it marks indices read-only. After freeing space, clear the block:
```
curl -XPUT -H "Content-Type: application/json" http://localhost:9200/_all/_settings -d '{"index.blocks.read_only_allow_delete": null}'
```
Screenshots
-
![](https://i.imgur.com/wEvXFnC.png)
<center>log from rsyslog client through logstash</center>
![](https://i.imgur.com/dGhGovD.png)
<center>log from local through filebeat</center>
---
###### tags: `ELK` `filebeat` `log` `CentOS 7`