This article covers transaction synchronization in Spring Boot with a database + Kafka example. It should be a useful reference for anyone facing the same problem.

Problem Description

I want to write a new application with Spring Boot, using MySQL and MongoDB as the databases and Spring Kafka for messaging.

I tried many POCs to synchronize transactions between Kafka and the database, but they failed under certain conditions, and I have searched many repositories and blogs for at least one working example. I still haven't found one.

If anyone can share at least one example or the necessary configuration, it would be a good reference for everyone in the future.

Recommended Answer

Here you go...

import javax.sql.DataSource;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.kafka.ConcurrentKafkaListenerContainerFactoryConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.config.TopicBuilder;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.transaction.ChainedKafkaTransactionManager;
import org.springframework.kafka.transaction.KafkaTransactionManager;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class So56170932Application {

    public static void main(String[] args) {
        SpringApplication.run(So56170932Application.class, args);
    }

    // Sends a record in a local Kafka transaction at startup to kick off the flow.
    @Bean
    public ApplicationRunner runner(KafkaTemplate<String, String> template) {
        return args -> template.executeInTransaction(t -> t.send("so56170932a", "foo"));
    }

    // Chains the Kafka and JDBC transaction managers so listener work on both
    // resources is committed (or rolled back) together.
    @Bean
    public ChainedKafkaTransactionManager<Object, Object> chainedTm(KafkaTransactionManager<String, String> ktm,
            DataSourceTransactionManager dstm) {

        return new ChainedKafkaTransactionManager<>(ktm, dstm);
    }

    @Bean
    public DataSourceTransactionManager dstm(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    // Applies Boot's auto-configuration to the listener container factory and
    // plugs in the chained transaction manager.
    @Bean
    public ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<Object, Object> kafkaConsumerFactory,
            ChainedKafkaTransactionManager<Object, Object> ctm) {

        ConcurrentKafkaListenerContainerFactory<Object, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
        configurer.configure(factory, kafkaConsumerFactory);
        factory.getContainerProperties().setTransactionManager(ctm);
        return factory;
    }

    @Component
    public static class Listener {

        private final JdbcTemplate jdbcTemplate;

        private final KafkaTemplate<String, String> kafkaTemplate;

        public Listener(JdbcTemplate jdbcTemplate, KafkaTemplate<String, String> kafkaTemplate) {
            this.jdbcTemplate = jdbcTemplate;
            this.kafkaTemplate = kafkaTemplate;
        }

        // Both the outbound send and the insert run in the transaction started by
        // the listener container (the chained Kafka + JDBC transaction above).
        @KafkaListener(id = "so56170932a", topics = "so56170932a")
        public void listen1(String in) {
            this.kafkaTemplate.send("so56170932b", in.toUpperCase());
            this.jdbcTemplate.update("insert into so56170932 (data) values (?)", in);
        }

        @KafkaListener(id = "so56170932b", topics = "so56170932b")
        public void listen2(String in) {
            System.out.println(in);
        }

    }

    @Bean
    public NewTopic topicA() {
        return TopicBuilder.name("so56170932a").build();
    }

    @Bean
    public NewTopic topicB() {
        return TopicBuilder.name("so56170932b").build();
    }

}
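The runner above only demonstrates a Kafka-local transaction on the producer side. For the part of the question about writing to MySQL and publishing to Kafka atomically from ordinary application code, the usual pattern with this configuration is a @Transactional method bound to the chained transaction manager. The service below is a minimal sketch under that assumption; the class name, method name, and reuse of the so56170932 table and so56170932a topic are illustrative, not part of the original answer.

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class OutboundService {

    private final JdbcTemplate jdbcTemplate;

    private final KafkaTemplate<String, String> kafkaTemplate;

    public OutboundService(JdbcTemplate jdbcTemplate, KafkaTemplate<String, String> kafkaTemplate) {
        this.jdbcTemplate = jdbcTemplate;
        this.kafkaTemplate = kafkaTemplate;
    }

    // "chainedTm" is the ChainedKafkaTransactionManager bean defined above.
    // The insert and the send are committed in sequence when the method returns,
    // and both are rolled back if either operation throws first.
    @Transactional(transactionManager = "chainedTm")
    public void saveAndPublish(String data) {
        this.jdbcTemplate.update("insert into so56170932 (data) values (?)", data);
        this.kafkaTemplate.send("so56170932a", data);
    }

}

Note that the chained manager gives best-effort sequential commits rather than a true XA transaction; that is the same guarantee the listener configuration above relies on.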

application.properties:

spring.datasource.url=jdbc:mysql://localhost/integration?serverTimezone=UTC
spring.datasource.username=root
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver

# Consume only committed records and let the container manage offsets transactionally
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.consumer.properties.isolation.level=read_committed

# Setting a transaction-id-prefix enables Kafka transactions for the auto-configured producer factory
spring.kafka.producer.transaction-id-prefix=tx-

logging.level.org.springframework.transaction=trace
logging.level.org.springframework.kafka.transaction=debug
logging.level.org.springframework.jdbc=debug
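
The listener inserts into, and the query below reads from, a MySQL table named so56170932 with a single data column. The answer does not include the DDL; a minimal sketch, with the column type and length being assumptions:

-- column type/length are assumptions; InnoDB is needed for transactional inserts
create table so56170932 (
    data varchar(255)
) engine=InnoDB;

The mysql output below, reproduced from the answer, shows the rows inserted by the listener.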

mysql> select * from so56170932;
+------+
| data |
+------+
| foo  |
| foo  |
| foo  |
| foo  |
| foo  |
| foo  |
| foo  |
| foo  |
| foo  |
+------+
9 rows in set (0.00 sec)

That concludes this article on transaction synchronization between a database and Kafka in Spring Boot; hopefully the recommended answer is a helpful reference.
