Still new to Spring Boot, Gradle and Logback, and I need help! I am trying to create my own Logback filter.

The main goal is that if the application sends several logs with the same error message, my logger should only send one log message.

To test this, I created a basic Gradle project with just 2 classes.

(screenshots of build.gradle, logback.xml and the Eclipse project explorer)
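Since the build.gradle is only shown as a screenshot, here is a rough sketch of a build file that would cover what the two classes below need; the plugin and dependency versions are assumptions, not the exact ones from the project:

plugins {
    id 'org.springframework.boot' version '2.1.9.RELEASE'
    id 'io.spring.dependency-management' version '1.0.8.RELEASE'
    id 'java'
}

group = 'com.example'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = '1.8'

repositories {
    mavenCentral()
}

dependencies {
    // spring-boot-starter already brings in spring-boot-starter-logging (SLF4J + Logback)
    implementation 'org.springframework.boot:spring-boot-starter'
    // ArrayUtils.isEmpty(...) is used by DuplicateErrorLogFilter below
    implementation 'org.apache.commons:commons-lang3:3.9'
}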

I - My main class, which logs some errors

package com.example.CDOP221logback;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.joran.JoranConfigurator;
import ch.qos.logback.core.joran.spi.JoranException;

@SpringBootApplication
public class Cdop221LogbackApplication {

    private final static Logger log = LoggerFactory.getLogger("com.example.CDOP221log4j");

    public static void main(String[] args) {
        SpringApplication.run(Cdop221LogbackApplication.class, args);

        LoggerContext context = (LoggerContext)LoggerFactory.getILoggerFactory();
        context.reset();

        JoranConfigurator config = new JoranConfigurator();
        config.setContext(context);

        try {
          config.doConfigure("/home/mehdi/eclipse-workspace/CDOP-221-logback/logback.xml");
        } catch (JoranException e) {
          e.printStackTrace();
        }

        test();
    }

    private static void test() {
        log.debug("Application Cdop221 with LOGBACK logger launch succesful");
        log.error("ERROR_1");
        log.error("ERROR_1");
        log.error("ERROR_1");
        log.error("ERROR_1");
        log.error("ERROR_1");
        log.error("ERROR_1");
        log.error("ERROR_1");
        log.error("ERROR_1");
        int i = 0;
        while(i < 100) {
            log.error("ERROR_2");
            i++;
        }

    }
}

II - My own filter, which limits the number of log messages sent when several error messages are identical

package com.logback;


import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.classic.spi.IThrowableProxy;
import ch.qos.logback.classic.spi.LoggingEvent;
import ch.qos.logback.classic.spi.StackTraceElementProxy;
import ch.qos.logback.core.filter.Filter;
import ch.qos.logback.core.spi.FilterReply;
import org.apache.commons.lang3.ArrayUtils;
import org.slf4j.LoggerFactory;

import java.util.HashMap;
import java.util.Map;
import java.util.Timer;
import java.util.TimerTask;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.TimeUnit;

/**
 * Improved {@link ch.qos.logback.classic.turbo.DuplicateMessageFilter} with a timeout feature added and time window error stacking #buzzwords
 * Indeed if there's some error logs that are the same (same hashcode) they are stacked and sent after {@link DuplicateErrorLogFilter#cacheTimeoutInSec}
 */

public class DuplicateErrorLogFilter extends Filter<ILoggingEvent> {

    /**
     * Repetition number MDC property
     */
    private static final String REP_NB = "repNb";

    /**
     * The default cache size.
     */
    private static final int DEFAULT_CACHE_SIZE = 100;
    /**
     * The default cache timeout in seconds
     */
    private static final int DEFAULT_CACHE_TIMEOUT_IN_SEC = 300;

    private String smtpAppenderName;
    private int cacheSize = DEFAULT_CACHE_SIZE;
    private int cacheTimeoutInSec = DEFAULT_CACHE_TIMEOUT_IN_SEC;
    private Map<Integer, FoldingTask> tasks = new ConcurrentHashMap<>(cacheSize);
    /**
     * Timer that will expire folding tasks
     */
    private Timer foldingTimer = new Timer("folding-timer", false);

    private final class FoldingTask extends TimerTask {

        private Integer key;
        private ILoggingEvent lastEvent;
        private int foldingCount;

        @Override
        public void run() {
            // Remove current task
            tasks.remove(key);

            // And send the event to SMTP appender
            sendEvent(lastEvent, foldingCount);
        }
    }

    /**
     * Append an event that has been folded
     *
     * @param event        the last seen event of this kind
     * @param foldingCount how many events were folded
     */
    protected void sendEvent(ILoggingEvent event, int foldingCount) {
        if (event != null) {
            if (foldingCount > 1) {
                // Do that to prevent UnsupportedOp from EmptyMap
                if (event.getMDCPropertyMap().isEmpty() && event instanceof LoggingEvent) {
                    ((LoggingEvent) event).setMDCPropertyMap(new HashMap<>());
                }
                event.getMDCPropertyMap().put(REP_NB, "[" + foldingCount + "x]");
            }
            ((Logger) (LoggerFactory.getLogger(Logger.ROOT_LOGGER_NAME))).getAppender(smtpAppenderName).doAppend(event);
        }
    }

    public void setSmtpAppenderName(String smtpAppenderName) {
        this.smtpAppenderName = smtpAppenderName;
    }

    public void setCacheSize(int cacheSize) {
        this.cacheSize = cacheSize;
    }

    public void setCacheTimeoutInSec(int cacheTimeoutInSec) {
        this.cacheTimeoutInSec = cacheTimeoutInSec;
    }

    @Override
    public void start() {
        super.start();
    }

    @Override
    public void stop() {
        tasks.clear();
        tasks = null;
        super.stop();
    }

    @Override
    public FilterReply decide(ILoggingEvent event) {
        if (!event.getLevel().isGreaterOrEqual(Level.ERROR)) {
            return FilterReply.NEUTRAL;
        }

        Integer key = eventHashCode(event);
        FoldingTask task = tasks.get(key);
        if (task == null) {
            // First time we encounter this event
            task = new FoldingTask();
            task.key = key;
            // lastEvent will be set at the first folded event
            tasks.put(key, task);

            // Arm timer for this task
            foldingTimer.schedule(task, TimeUnit.SECONDS.toMillis(cacheTimeoutInSec));

            // And log this event
            return FilterReply.NEUTRAL;
        } else {
            // Fold this event
            task.lastEvent = event;
            task.foldingCount++;
            return FilterReply.DENY;
        }
    }

    /**
     * Compute a signature for an event
     */
    private int eventHashCode(ILoggingEvent event) {
        IThrowableProxy thrInfo = event.getThrowableProxy();
        if (thrInfo == null || ArrayUtils.isEmpty(thrInfo.getStackTraceElementProxyArray())) {
            // No stacktrace
            String message = event.getFormattedMessage();
            return message.hashCode();
        }

        StackTraceElementProxy[] stack = thrInfo.getStackTraceElementProxyArray();
        int hashCode = 0;
        for (StackTraceElementProxy str : stack) {
            hashCode = 31 * hashCode + str.hashCode();
        }

        return hashCode;
    }
}

So when I run my code, it doesn't really work... but I can't tell whether that is because of a configuration mistake (I am a beginner with the Logback library) or because my code is just bad.

Thanks in advance for your help.

(screenshot of the resulting log output, which is not correct)

Best Answer

You are missing the part of your Logback configuration file (logback.xml) that wires your filter (DuplicateErrorLogFilter) into Logback:

<filter class="com.logback.DuplicateErrorLogFilter"/>

More information on how to use filters: https://logback.qos.ch/manual/filters.html
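As a sketch of how that filter element could be wired into a complete logback.xml: the appender name (EMAIL), the mail settings and the patterns below are placeholders; the nested <smtpAppenderName> and <cacheTimeoutInSec> elements map to the setters on DuplicateErrorLogFilter, and %X{repNb} prints the repetition marker the filter puts into the MDC:

<configuration>

  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- The name must match the <smtpAppenderName> given to the filter -->
  <appender name="EMAIL" class="ch.qos.logback.classic.net.SMTPAppender">
    <smtpHost>smtp.example.com</smtpHost>
    <to>ops@example.com</to>
    <from>noreply@example.com</from>
    <subject>%logger{20} - %m</subject>
    <layout class="ch.qos.logback.classic.PatternLayout">
      <!-- %X{repNb} shows the "[Nx]" marker set by sendEvent() -->
      <pattern>%date %-5level %logger{35} %X{repNb} - %message%n</pattern>
    </layout>

    <!-- The missing piece: attach the custom filter to the appender -->
    <filter class="com.logback.DuplicateErrorLogFilter">
      <smtpAppenderName>EMAIL</smtpAppenderName>
      <cacheTimeoutInSec>300</cacheTimeoutInSec>
    </filter>
  </appender>

  <root level="DEBUG">
    <appender-ref ref="CONSOLE"/>
    <appender-ref ref="EMAIL"/>
  </root>

</configuration>

With that wiring, the first ERROR with a given signature goes through immediately, duplicates arriving within cacheTimeoutInSec are denied, and once the timer fires the last folded event is re-appended with the [Nx] repetition marker.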

A similar question about "gradle - Logback filter - limit error log messages" can be found on Stack Overflow: https://stackoverflow.com/questions/52879670/
