[monitoring system] ELK log collection and analysis platform

Official site: https://www.elastic.co/products

ELK consists of Elasticsearch, Logstash and Kibana.

  • Elasticsearch is an open-source distributed search engine. Its features include: distributed operation, zero configuration, automatic discovery, automatic index sharding, an index replication mechanism, a RESTful interface, multiple data sources, and automatic search load balancing.
  • Logstash is a fully open-source tool that can collect, filter, and store your logs for later use (such as searching).
  • Kibana is also an open-source, free tool. It provides a log-analysis-friendly web interface for Logstash and Elasticsearch, helping you summarize, analyze, and search important log data.
  • Filebeat is installed on the client servers and ships logs to Logstash. Filebeat acts as the log shipping agent and communicates with Logstash using the lumberjack network protocol.

Deployment environment requirements
host1.com      # ELK server: Elasticsearch, Kibana, Logstash, Java (4 GB memory)

host2.com      # Monitored server: rsyslog, Filebeat

host3.com      # Proxy server: nginx, Filebeat

1. Java environment deployment

Download the JDK from: http://www.oracle.com/technetwork/java/javase/downloads/index.html

//Install the JDK version recommended for ELK
[root@host1 ~]# tar -zxvf jdk-8u102-linux-x64.tar.gz -C /usr/local/
[root@host1 ~]# cd /usr/local/jdk1.8.0_102/

//Configure jdk environment variables
[root@host1 ~]# vim /etc/profile
export JAVA_HOME=/usr/local/jdk1.8.0_102
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/jre/lib/rt.jar

[root@host1 ~]# source /etc/profile

//Or run the following command and select the installed Java version
[root@host1 ~]# alternatives --config java

//Check Java version
[root@host1 ~]# java -version


2. Deploy the Elasticsearch environment
See the Elasticsearch installation and deployment FAQ at the end of this article.

Elasticsearch's security mechanism does not allow it to be started by the root user, so create a new user elk and start Elasticsearch as elk:
[root@host1 ~]# useradd elk
[root@host1 ~]# passwd elk
[root@host1 ~]# su - elk
[root@host1 ~]# wget -c https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-5.3.2.rpm
[root@host1 ~]# rpm -ivh elasticsearch-5.3.2.rpm
[root@host1 ~]# chown -R elk:elk /etc/elasticsearch/
[root@host1 ~]# vim /etc/elasticsearch/elasticsearch.yml
network.host: 0.0.0.0    #Modify to listen on all interfaces
http.port: 9200

[root@host1 ~]# systemctl start elasticsearch ; systemctl status elasticsearch
[root@host1 ~]# curl http://<ELK-server-IP>:9200

//Note: with less than 4 GB of memory the Java environment will not run properly, which causes Elasticsearch to fail to start
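Beyond the bare curl above, the standard Elasticsearch `_cluster/health` endpoint reports the cluster status (green/yellow/red). A minimal shell sketch of pulling that field out; since it assumes a reachable server, a canned sample response stands in for the live call here:

```shell
# On the ELK server the live call would be:
#   curl -s http://<ELK-server-IP>:9200/_cluster/health
# Canned sample response used for illustration:
health='{"cluster_name":"elasticsearch","status":"green","number_of_nodes":1}'
# Extract the "status" field from the JSON (green / yellow / red)
status=$(printf '%s' "$health" | sed -E 's/.*"status":"([a-z]+)".*/\1/')
echo "cluster status: $status"
```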


3. Deploy Kibana environment

[root@host1 ~]# wget -c https://artifacts.elastic.co/downloads/kibana/kibana-5.3.2-x86_64.rpm
[root@host1 ~]# rpm -ivh kibana-5.3.2-x86_64.rpm
[root@host1 ~]# vim /etc/kibana/kibana.yml
server.host: "x.x.x.x"        #ELK server IP address (note the space after the colon, required by YAML)

[root@host1 ~]# systemctl start kibana ; systemctl status kibana

//Test access
http://<Kibana-server-IP>:5601


4. Deploy the Nginx proxy environment

[root@host3 ~]# yum install nginx httpd-tools

//Create the login user kibanaadmin and set the password used for access
[root@host3 ~]# htpasswd -c /etc/nginx/htpasswd.users kibanaadmin
[root@host3 ~]# vim /etc/nginx/conf.d/kibana.conf
server {
	listen 80;
	server_name xxx;    #Hostname of host3
	auth_basic "Restricted Access";
	auth_basic_user_file /etc/nginx/htpasswd.users;
 
 location / {
	proxy_pass http://<Kibana-server-IP>:5601;
	proxy_http_version 1.1;
	proxy_set_header Upgrade $http_upgrade;
	proxy_set_header Connection 'upgrade';
	proxy_set_header Host $host;
	proxy_cache_bypass $http_upgrade;
 }
}

[root@host3 ~]# systemctl start nginx ; systemctl status nginx

//Test access
//Browse to http://<proxy-nginx-server-IP>/ and log in as kibanaadmin
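Behind the scenes, basic auth sends the header `Authorization: Basic base64(user:password)`, which is what `curl -u kibanaadmin:<password> http://<proxy-nginx-server-IP>/` would transmit. A small sketch of building that header value (the password `secret` is a placeholder):

```shell
# Build the value curl puts in the Authorization header for -u kibanaadmin:secret
token=$(printf 'kibanaadmin:secret' | base64)
echo "Authorization: Basic $token"
```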


5. Deploy Logstash environment

[root@host1 ~]# wget -c https://artifacts.elastic.co/downloads/logstash/logstash-5.3.2.rpm
[root@host1 ~]# rpm -ivh logstash-5.3.2.rpm

a. Filebeat delivers logs from the client servers to the ELK server, so an SSL certificate and key pair are needed. The certificate is used by Filebeat to verify the identity of the ELK server.
[root@host1 ~]# vim /etc/pki/tls/openssl.cnf
[ v3_ca ]
subjectAltName = IP: x.x.x.x             #ELK server IP address; note the space after "IP:"

b. Use the following command to generate the SSL certificate and private key
[root@host1 ~]# cd /etc/pki/tls/
[root@host1 tls]# openssl req -config /etc/pki/tls/openssl.cnf -x509 -days 3650 -batch -nodes \
-newkey rsa:2048 -keyout private/logstash-forwarder.key -out certs/logstash-forwarder.crt
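To confirm the SAN was actually embedded, the certificate can be inspected with `openssl x509`. A self-contained sketch that generates a throwaway certificate with an IP SAN in a temp directory and checks it (10.0.0.1 is a placeholder IP; the `-addext` flag needs OpenSSL 1.1.1+, older versions rely on the openssl.cnf edit shown above):

```shell
# Generate a throwaway self-signed cert with an IP SAN, then verify the SAN is present
workdir=$(mktemp -d)
openssl req -x509 -days 3650 -batch -nodes -newkey rsa:2048 \
  -keyout "$workdir/test.key" -out "$workdir/test.crt" \
  -subj "/CN=elk" -addext "subjectAltName=IP:10.0.0.1" 2>/dev/null
# The Subject Alternative Name section should list the IP configured above
san=$(openssl x509 -in "$workdir/test.crt" -noout -text | grep 'IP Address')
echo "$san"
rm -rf "$workdir"
```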

c. Logstash configuration: the files use a JSON-like format, and the configuration consists of three parts: input, filter and output.
[root@host1 ~]# vim /etc/logstash/conf.d/02-beats-input.conf
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

d. This declares a beats input that listens on TCP port 5044 and uses the SSL certificate and key created above.
[root@host1 ~]# vim /etc/logstash/conf.d/10-syslog-filter.conf
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
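To see roughly what the grok filter above extracts, here is a shell approximation using sed (the sample line is made up, and the simplified regexes only mimic the real SYSLOGTIMESTAMP/SYSLOGHOST/DATA patterns):

```shell
# A made-up syslog line of the shape the filter above expects
line='Feb 12 03:10:37 host2 sshd[1234]: Accepted password for elk'
# Roughly what %{SYSLOGHOST:syslog_hostname} captures
host=$(printf '%s' "$line" | sed -E 's/^[A-Z][a-z]{2} +[0-9]+ [0-9:]+ ([^ ]+) .*/\1/')
# Roughly what %{DATA:syslog_program} captures (program name before the PID)
prog=$(printf '%s' "$line" | sed -E 's/^[A-Z][a-z]{2} +[0-9]+ [0-9:]+ [^ ]+ ([A-Za-z]+).*/\1/')
echo "syslog_hostname=$host syslog_program=$prog"
```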

e. This matches syslog-type events (as tagged by Filebeat) and uses grok to parse incoming syslog logs into structured, easily queried fields.
[root@host1 ~]# vim /etc/logstash/conf.d/30-elasticsearch-output.conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
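The `index` setting above produces one index per beat per day. A sketch of how that pattern expands (using today's date for illustration; Logstash actually formats each event's @timestamp):

```shell
# %{[@metadata][beat]} resolves to the shipper name, e.g. "filebeat"
beat=filebeat
# %{+YYYY.MM.dd} is the event date formatted like 2020.02.12
stamp=$(date -u +%Y.%m.%d)
echo "index name: ${beat}-${stamp}"
```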

[root@host1 ~]# systemctl start logstash ; systemctl status logstash

f. Copy the SSL certificate to the client server.
[root@host1 ~]# scp /etc/pki/tls/certs/logstash-forwarder.crt <Filebeat-client-IP>:/etc/pki/tls/certs/


6. Deploy the Filebeat package on the client

[root@host2 ~]# wget -c https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-5.3.2-x86_64.rpm
[root@host2 ~]# rpm -ivh filebeat-5.3.2-x86_64.rpm 
[root@host2 ~]# vim /etc/filebeat/filebeat.yml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/nginx/access.log    #Add the monitored log path

#Note: Filebeat allows only one output to be enabled at a time. Since logs are
#shipped through Logstash here, leave output.elasticsearch commented out:
#output.elasticsearch:
#  hosts: ["x.x.x.x:9200"]          #ELK server IP address
#  index: "host2"                   #Local host name

output.logstash:
  hosts: ["x.x.x.x:5044"]           #ELK server IP address
  ssl.certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]    #Certificate path

[root@host2 ~]# systemctl start filebeat ; systemctl status filebeat


7. On first access, Kibana prompts you to create an index pattern (enter filebeat-*); just click the Create button.
//Then select Discover to set the time range and view the monitored data



Elasticsearch installation and deployment FAQ

Question 1: [2016-11-06T16:27:21,712][WARN ][o.e.b.JNANatives] unable to install syscall filter:
java.lang.UnsupportedOperationException: seccomp unavailable: requires kernel 3.5+ with CONFIG_SECCOMP and CONFIG_SECCOMP_FILTER compiled in
at org.elasticsearch.bootstrap.Seccomp.linuxImpl(Seccomp.java:349) ~[elasticsearch-5.0.0.jar:5.0.0]
at org.elasticsearch.bootstrap.Seccomp.init(Seccomp.java:630) ~[elasticsearch-5.0.0.jar:5.0.0]

Reason: no need to panic over this wall of errors; it is only a warning, caused by the Linux kernel version being too old.
Solution:
1. Reinstall a Linux system with a newer kernel, or
2. Ignore it; the warning does not affect use

Problem 2: ERROR: bootstrap checks failed: max file descriptors [4096] for elasticsearch process is too low, increase to at least [65536]

Reason: the maximum number of files the user may open is too small
Solution: switch to root, edit the limits.conf configuration file, and add entries similar to the following:

vim /etc/security/limits.conf
* soft nofile 65536
* hard nofile 131072
* soft nproc 2048
* hard nproc 4096
Note: * applies to all Linux user names (a specific user such as hadoop may be given instead)
Save, exit, and log in again for the change to take effect
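After logging in again, the limits in effect can be checked with ulimit (the values noted in the comments are what the limits.conf entries above should produce on the ELK host, not what an unmodified machine shows):

```shell
# Soft limit on open files (nofile); should report 65536 after the edit above
ulimit -n
# Soft limit on user processes (nproc); should report 2048 after the edit above
ulimit -u
```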

Question 3: max number of threads [1024] for user [es] is too low, increase to at least [2048]

Reason: the maximum number of threads the user may create is too small
Solution: switch to the root user, enter the limits.d directory, and modify the 90-nproc.conf configuration file:

vim /etc/security/limits.d/90-nproc.conf
* soft nproc 1024       #Change to * soft nproc 2048

Question 4: max virtual memory areas vm.max_map_count [65530] is too low, increase to at least [262144]

Reason: the maximum number of virtual memory map areas is too small
Solution: switch to root and modify the sysctl.conf configuration file:

vim /etc/sysctl.conf
Add the following line:
vm.max_map_count=655360
Then apply it with the command:
sysctl -p
Restart the Elasticsearch service and it will start successfully.
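The live value can also be read back from /proc to confirm the kernel picked it up (on the ELK host it should show 655360 after sysctl -p; no reboot is needed):

```shell
# Current kernel value of vm.max_map_count
cat /proc/sys/vm/max_map_count
```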

Problem 5: "no route to host" when Elasticsearch starts

Cause: a problem with the Elasticsearch unicast configuration
Solution: check the Elasticsearch configuration file

vim config/elasticsearch.yml
Locate the following setting:
discovery.zen.ping.unicast.hosts: ["192.168.**.**:9300", "192.168.**.**:9300"]
In general the problem is in this setting; pay attention to the format (note the space after the colon)

Q6: org.elasticsearch.transport.RemoteTransportException: Failed to deserialize exception response from stream
Reason: inconsistent JDK versions between Elasticsearch nodes
Solution: use a unified JDK environment across the Elasticsearch cluster

Q7: Unsupported major.minor version 52.0

Reason: the JDK version is too low
Solution: upgrade the JDK; Elasticsearch 5.0.0 requires JDK 1.8

Q8: bin/elasticsearch-plugin install license
ERROR: Unknown plugin license

Reason: the plugin command and plugin names changed in Elasticsearch 5.0.0
Solution: install the plugin with the new command: bin/elasticsearch-plugin install x-pack


Posted on Wed, 12 Feb 2020 03:10:37 -0800 by bubazoo