Spring Kafka Test - no data received in @KafkaListener with EmbeddedKafka

dav*_*nko 6 java apache-kafka spring-boot spring-kafka spring-kafka-test

We are running some integration tests against an external application using Cucumber, and in those tests we exercise a @KafkaListener. We managed to set up EmbeddedKafka and produce data into it.

But the consumer never receives any data, and we cannot figure out what is going on.

Here is our code:

Producer configuration

@Configuration
@Profile("test")
public class KafkaTestProducerConfig {

    private static final String SCHEMA_REGISTRY_URL = "schema.registry.url";

    @Autowired
    protected EmbeddedKafkaBroker embeddedKafka;

    @Bean
    public Map<String, Object> producerConfig() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,
                embeddedKafka.getBrokersAsString());
        props.put(SCHEMA_REGISTRY_URL, "URL");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        return props;
    }

    @Bean
    public ProducerFactory<String, GenericRecord> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfig());
    }

    @Bean
    public KafkaTemplate<String, GenericRecord> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

}

Consumer configuration

@Configuration
@Profile("test")
@EnableKafka
public class KafkaTestConsumerConfig {

    @Autowired
    protected EmbeddedKafkaBroker embeddedKafka;

    private static final String SCHEMA_REGISTRY_URL = "schema.registry.url";

    @Bean
    public Map<String, Object> consumerProperties() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "groupId");
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
        props.put(SCHEMA_REGISTRY_URL, "URL");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "1000");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.FETCH_MAX_WAIT_MS_CONFIG, 10000);
        return props;
    }

    @Bean
    public DefaultKafkaConsumerFactory<String, Object> consumerFactory() {
        KafkaAvroDeserializer avroDeserializer = new KafkaAvroDeserializer();
        avroDeserializer.configure(consumerProperties(), false);
        return new DefaultKafkaConsumerFactory<>(consumerProperties(), new StringDeserializer(), avroDeserializer);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Object> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setBatchListener(true);
        factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
        return factory;
    }

}


Integration test

@SpringBootTest(
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT,
        classes = Application.class)
@ActiveProfiles("test")
@EmbeddedKafka(topics = {"TOPIC1", "TOPIC2", "TOPIC3"})
public class CommonStepDefinitions implements En {

    protected static final Logger LOGGER = LoggerFactory.getLogger(CommonStepDefinitions.class);

    @Autowired
    protected KafkaTemplate<String, GenericRecord> kafkaTemplate;

}
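As an aside, recent versions of spring-kafka-test let @EmbeddedKafka publish the embedded broker addresses under a property name of your choosing, which avoids wiring EmbeddedKafkaBroker into the producer and consumer configs by hand. A sketch (the bootstrapServersProperty attribute is available in spring-kafka-test 2.3+):

```java
// Expose the embedded broker list under Spring Boot's standard property,
// so Boot's auto-configured producer/consumer factories pick it up directly.
@EmbeddedKafka(
        topics = {"TOPIC1", "TOPIC2", "TOPIC3"},
        bootstrapServersProperty = "spring.kafka.bootstrap-servers")
```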

Step definitions

public class KafkaStepDefinitions extends CommonStepDefinitions {

    private static final String TEMPLATE_TOPIC = "TOPIC1";

    public KafkaStepDefinitions(){
        Given("given statement", () -> {
            OperationEntity operationEntity = new OperationEntity();
            operationEntity.setFoo("foo");
            kafkaTemplate.send(TEMPLATE_TOPIC, AvroPojoTransformer.pojoToRecord(operationEntity));
        });
    }

}

Consumer. This same code works fine against the production bootstrap servers, but it is never reached with embedded Kafka:

@KafkaListener(topics = "${kafka.topic1}", groupId = "groupId")
public void consume(List<GenericRecord> records, Acknowledgment ack) throws DDCException {
    LOGGER.info("Batch of {} records received", records.size());
    // do something with the data
    ack.acknowledge();
}
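Note that the listener's topic is resolved from the `${kafka.topic1}` placeholder, so the test profile must define it, and it must match the topic the step definition produces to. A possible sketch for `application-test.yml`, assuming the property key used by the listener above:

```yaml
# hypothetical mapping - the key must match the placeholder in @KafkaListener
kafka:
  topic1: TOPIC1
```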

Everything in the logs looks fine, but we have no idea what is missing.

Thanks in advance.

mpk*_*nje 0

The problem is that the consumer is not connecting to the embedded Kafka broker. You can fix that by running the test with the test profile active and adding the following to application-test.yml:

spring:
  kafka:
    bootstrap-servers: ${spring.embedded.kafka.brokers}

Then you also won't need the custom consumerProperties, consumerFactory and kafkaListenerContainerFactory beans. Spring Boot will auto-configure these for you. If you do want to use your own beans (not sure why you would), you should double-check KafkaAutoConfiguration to make sure you are overriding the right bean names and types.
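If you drop the custom beans, the remaining consumer settings can move into `application-test.yml` as well, under Spring Boot's standard `spring.kafka.*` keys. A possible sketch mirroring the question's consumer configuration (the schema registry URL is a placeholder, as in the question):

```yaml
spring:
  kafka:
    bootstrap-servers: ${spring.embedded.kafka.brokers}
    consumer:
      group-id: groupId
      auto-offset-reset: earliest
      enable-auto-commit: false
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
    properties:
      schema.registry.url: URL
```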