Problem Description

I'm using Spring Boot 2.1.3 and am trying to configure Spring SeekToCurrentErrorHandler with a DeadLetterPublishingRecoverer to send error records to a different topic. The new DLT queue is created and a record is inserted but the message body is empty. I was expecting the message body to be populated with the original JSON body for future analysis.

Here is the configuration I have so far. Any idea where I'm going wrong? Not sure if it's to do with using KafkaTemplate<Object, Object>, whereas the message producer uses KafkaTemplate<String, Message>.

@Configuration
@EnableKafka
public class ListenerConfig {

    private static final Logger LOG = LoggerFactory.getLogger(ListenerConfig.class);

    @Value("${kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Autowired
    private KafkaTemplate<Object, Object> kafkaTemplate;

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
        props.put(ErrorHandlingDeserializer2.KEY_DESERIALIZER_CLASS, JsonDeserializer.class);
        props.put(ErrorHandlingDeserializer2.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class.getName());
        props.put(JsonDeserializer.KEY_DEFAULT_TYPE, "java.lang.String");
        props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.test.kafka.Message");
        props.put(JsonDeserializer.TRUSTED_PACKAGES, "com.test.kafka");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return props;
    }

    @Bean
    public ConsumerFactory<String, Message> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Message> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Message> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setErrorHandler(new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(kafkaTemplate), 3));
        return factory;
    }

    @KafkaListener(topics = "test")
    public void receive(@Payload Message data,
                    @Headers MessageHeaders headers) {
        LOG.info("received data='{}'", data);

        System.out.println(data.getMessage());

        headers.keySet().forEach(key -> {
           LOG.info("{}: {}", key, headers.get(key));
        });
    }
}

Answer

The DeadLetterPublishingRecoverer simply publishes the incoming ConsumerRecord contents.

When the ErrorHandlingDeserializer2 detects a deserialization exception, there is no value() field in the ConsumerRecord (because it couldn't be deserialized).

Instead, the failure is put into one of two headers: ErrorHandlingDeserializer2.VALUE_DESERIALIZER_EXCEPTION_HEADER or ErrorHandlingDeserializer2.KEY_DESERIALIZER_EXCEPTION_HEADER.

You can obtain that via

// headerName is one of the two header constants above
Header header = record.headers().lastHeader(headerName);
DeserializationException ex = (DeserializationException) new ObjectInputStream(
    new ByteArrayInputStream(header.value())).readObject();

with the original payload in ex.getData().
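That decode step can be sketched as a plain-JDK round trip. Since Spring Kafka is not on the classpath here, the real DeserializationException is replaced by a stand-in class (StubDeserializationException is my own name); the point is that the header value is just a Java-serialized exception carrying the original record bytes, so ObjectInputStream recovers them:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class DltHeaderDemo {

    // Stand-in for Spring Kafka's DeserializationException, which is
    // Serializable and carries the raw record bytes in getData().
    static class StubDeserializationException extends RuntimeException {
        private final byte[] data;
        StubDeserializationException(String message, byte[] data) {
            super(message);
            this.data = data;
        }
        byte[] getData() {
            return data;
        }
    }

    // Simulates what the deserializer stores in the header:
    // a Java-serialized exception wrapping the original payload bytes.
    static byte[] buildHeaderValue(byte[] originalPayload) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new StubDeserializationException("deserialization failed", originalPayload));
        }
        return bos.toByteArray();
    }

    // The decode step from above: read the header bytes back into
    // the exception and pull out the original payload.
    static byte[] recoverPayload(byte[] headerValue) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(headerValue))) {
            StubDeserializationException ex = (StubDeserializationException) ois.readObject();
            return ex.getData();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] original = "{\"message\":\"hello\"}".getBytes("UTF-8");
        byte[] header = buildHeaderValue(original);
        System.out.println(new String(recoverPayload(header), "UTF-8")); // prints {"message":"hello"}
    }
}
```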

We should probably enhance the recoverer to do this when it detects such a header is present and the value() is null.
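In the meantime, one possible workaround is a custom recoverer that restores the payload from the header before publishing. This is only a sketch, not the library's built-in behavior: the class name and the ".DLT" topic suffix are my own choices, and it assumes the template's value serializer can handle the raw byte[] payload:

```java
import java.io.ByteArrayInputStream;
import java.io.ObjectInputStream;
import java.util.function.BiConsumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.header.Header;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.serializer.DeserializationException;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2;

public class PayloadRestoringRecoverer implements BiConsumer<ConsumerRecord<?, ?>, Exception> {

    private final KafkaTemplate<Object, Object> template;

    public PayloadRestoringRecoverer(KafkaTemplate<Object, Object> template) {
        this.template = template;
    }

    @Override
    public void accept(ConsumerRecord<?, ?> record, Exception exception) {
        Object value = record.value();
        Header header = record.headers()
                .lastHeader(ErrorHandlingDeserializer2.VALUE_DESERIALIZER_EXCEPTION_HEADER);
        if (value == null && header != null) {
            // value() is null because deserialization failed; recover the
            // original bytes from the serialized exception in the header
            try (ObjectInputStream ois = new ObjectInputStream(
                    new ByteArrayInputStream(header.value()))) {
                DeserializationException ex = (DeserializationException) ois.readObject();
                value = ex.getData();
            }
            catch (Exception e) {
                // fall through and publish the record as-is
            }
        }
        template.send(record.topic() + ".DLT", record.key(), value);
    }
}
```

Such a recoverer could then be passed to the SeekToCurrentErrorHandler in place of the DeadLetterPublishingRecoverer.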

I have opened a new feature issue.
