Configure Elasticsearch, Kibana, and Filebeat to collect and analyze NetFlow
by Muruganandan from LinuxQuestions.org on (#5JPV2)
Hi Team,
I have configured Elasticsearch, Kibana, and Filebeat on Ubuntu 20.04 to collect NetFlow data from the router. I managed to collect the data successfully, and Kibana started to show the NetFlow data. But then I added a few lines to filebeat.yml to get a deeper analysis of the NetFlow data, and Filebeat started throwing errors, even after I removed those lines again.
Below is the NetFlow config in filebeat.yml, including the lines I added.
#
filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.

# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    #- c:\programdata\elasticsearch\logs\*

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ['^ERR', '^WARN']

- type: netflow
  max_message_size: 10KiB
  host: "0.0.0.0:2055"
  protocols: [ v5, v9, ipfix ]
  expiration_timeout: 30m
  queue_size: 8192

# This requires a Kibana endpoint configuration.
setup.kibana:
  host: "http://localhost:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]

  # Protocol - either `http` (default) or `https`.
  protocol: "http"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
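(For reference: Filebeat also ships a netflow module, with its own ingest pipeline and sample Kibana dashboards, which can be used instead of the raw netflow input shown above. A minimal sketch of modules.d/netflow.yml, assuming the same listening address and port as the input above:)

```yaml
# Sketch of modules.d/netflow.yml, enabled with: filebeat modules enable netflow
# (assumes the same collector address/port as the raw netflow input above)
- module: netflow
  log:
    enabled: true
    var:
      netflow_host: 0.0.0.0   # listen on all interfaces
      netflow_port: 2055      # UDP port the router exports flows to
```

Note that only one of the two should be active at a time; otherwise the module and the raw input will both try to bind UDP port 2055.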
==================
error
2021-06-02T12:36:50.155+0530 ERROR instance/beat.go:971 Exiting: No outputs are defined. Please define one under the output section.
Exiting: No outputs are defined. Please define one under the output section.
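(For reference: this error usually appears when Filebeat cannot parse the output section at all, most often because of a YAML indentation problem where `hosts` and the other keys end up at the top level instead of nested under `output.elasticsearch:`. A minimal output section that Filebeat accepts, with the two-space nesting it requires, looks like:)

```yaml
# Minimal working output section; everything under output.elasticsearch
# must be indented two spaces so it belongs to that block.
output.elasticsearch:
  hosts: ["localhost:9200"]
  protocol: "http"
```

The file can be checked without starting the service: `filebeat test config -c /etc/filebeat/filebeat.yml` reports whether the config parses, and `filebeat test output` checks that the Elasticsearch output is reachable.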
I disabled Logstash and I'm using Elasticsearch for log analysis. The following is the elasticsearch.yml configuration:
node.name: MIKROTIK
path.data: /var/lib/elasticsearch
# Path to log files:
path.logs: /var/log/elasticsearch
network.host: localhost
http.port: 9200
Please suggest how to get rid of this error.