Collecting Spring Cloud logs with Logstash

If the services use Sleuth and print logs through Logback in a fixed format, Logstash can filter and parse the log lines and ship them to Elasticsearch for storage, and Kibana can then visualize them. In a Kubernetes cluster, Logstash can be run as a DaemonSet, so that one instance collects the logs on every node.
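With Sleuth on the classpath, Spring Boot's default Logback pattern emits lines like the following (the concrete values here are made up for illustration); the `[service,trace,span,exportable]` block is what the grok filter below extracts:

```
2017-06-25 09:30:00.123  INFO [myservice,2485ec27856c56f4,2485ec27856c56f4,false] 12345 --- [nio-8080-exec-1] c.example.DemoController : handling request
```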

Dockerfile

FROM logstash
COPY log.conf /log.conf

CMD ["-f", "/log.conf"]
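The image can be built and pushed with the usual Docker commands; the tag below matches the one referenced in the DaemonSet further down and is only an example:

```shell
docker build -t thoreaurepo.io:5000/logstash:v1 .
docker push thoreaurepo.io:5000/logstash:v1
```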

Logstash conf

input {
    file {
        type => "spring-cloud"
        path => "/logs/*/*.log"
        codec => multiline {
            pattern => "^\d{4}-\d{2}-\d{2}"
            negate => true
            what => "previous"
        }
    }
}
filter {
    grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:severity}\s+\[%{DATA:service},%{DATA:trace},%{DATA:span},%{DATA:exportable}\]\s+%{DATA:pid}---\s+\[%{DATA:thread}\]\s+%{DATA:class}\s+:\s+%{GREEDYDATA:rest}" }
    }
}
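To see how the grok pattern decomposes a Sleuth log line, here is a rough Python equivalent (illustrative only, not the real grok engine: `DATA` is approximated by a lazy `.*?`, `GREEDYDATA` by `.*`, and the pid is tightened to digits; the sample line's values are made up):

```python
import re

# Approximate translation of the grok pattern in the Logstash filter above.
PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2}\s+\d{2}:\d{2}:\d{2}\.\d+)\s+"  # TIMESTAMP_ISO8601
    r"(?P<severity>[A-Z]+)\s+"                                        # LOGLEVEL
    r"\[(?P<service>.*?),(?P<trace>.*?),(?P<span>.*?),(?P<exportable>.*?)\]\s+"
    r"(?P<pid>\d+)\s+---\s+"
    r"\[(?P<thread>.*?)\]\s+"
    r"(?P<cls>\S+)\s+:\s+"
    r"(?P<rest>.*)"
)

# A typical Sleuth-formatted line (illustrative values).
line = ("2017-06-25 09:30:00.123  INFO "
        "[myservice,2485ec27856c56f4,2485ec27856c56f4,false] 12345 --- "
        "[nio-8080-exec-1] c.example.DemoController : handling request")

m = PATTERN.match(line)
print(m.group("service"))   # myservice
print(m.group("trace"))     # 2485ec27856c56f4
print(m.group("rest"))      # handling request
```

Each named group corresponds to a field that Logstash attaches to the event, which is what makes it possible to filter by `trace` in Kibana later.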

output {
        elasticsearch { hosts => ["192.168.99.100:9200"] }
}
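While tuning the grok pattern, it can help to also print every parsed event to the container's own log; Logstash's standard `stdout` output with the `rubydebug` codec does this (remove it once the fields look right):

```
output {
        elasticsearch { hosts => ["192.168.99.100:9200"] }
        stdout { codec => rubydebug }
}
```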

k8s DaemonSet

apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: logstash
spec:
  selector:
    matchLabels:
      app: logstash
  template: # create pods using the pod definition in this template
    metadata:
      labels:
        app: logstash
    spec:
      containers:
      - name: logstash
        volumeMounts:
        - mountPath: /logs
          name: logs
        image: thoreaurepo.io:5000/logstash:v1
        imagePullPolicy: Always
      volumes:
      - name: logs
        hostPath:
          path: /logs
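Assuming the manifest above is saved as `logstash-daemonset.yaml` (the filename is an assumption) and the services write their logs under `/logs` on each node, the DaemonSet can be deployed and checked with:

```shell
kubectl apply -f logstash-daemonset.yaml
kubectl get pods -l app=logstash -o wide
```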