feat(infra-messaging): Implement fully reactive Kafka producer and consumer
This commit introduces a comprehensive refactoring of the messaging module to establish a fully reactive, non-blocking, and robust infrastructure for Kafka-based communication.

Features & Refinements
- Reactive publisher: The KafkaEventPublisher has been refactored from a blocking implementation (KafkaTemplate) to a fully non-blocking, reactive one using Spring's ReactiveKafkaProducerTemplate. The EventPublisher interface now returns reactive types (Mono, Flux) to reflect the asynchronous nature of the operations.
- Reactive consumer: A new KafkaEventConsumer has been implemented, providing a standardized, reusable, and reactive way for services to consume events. It encapsulates the complexity of reactor-kafka and exposes a simple receiveEvents<T>(topic) method that returns a Flux<T>.
- Architectural cleanup: The Spring configuration has been split. The basic ProducerFactory and consumer properties reside in messaging-config, while the reactive-specific ReactiveKafkaProducerTemplate bean is now correctly located in messaging-client.

Testing
- A new KafkaIntegrationTest has been created to ensure the reliability of the messaging infrastructure.
- The test uses Testcontainers to spin up a real Apache Kafka broker for end-to-end validation.
- Project Reactor's StepVerifier is used to test the reactive streams deterministically, avoiding flaky tests.
- The test correctly manages the lifecycle of Kafka producers to ensure a clean shutdown without hanging threads.

Bug Fixes
- Resolved an UninitializedPropertyAccessException in tests by making the KafkaConfig test-friendly.
- Fixed an IllegalStateException related to the Testcontainers lifecycle by making the container a static resource.
- Corrected compilation errors in tests related to resource cleanup by using the concrete DefaultKafkaProducerFactory type.
Parent: f9927066a2
Commit: d87a5a4a93
@ -175,6 +175,8 @@ assertj-core = { module = "org.assertj:assertj-core", version.ref = "assertj" }
 testcontainers-core = { module = "org.testcontainers:testcontainers", version.ref = "testcontainers" }
 testcontainers-junit-jupiter = { module = "org.testcontainers:junit-jupiter", version.ref = "testcontainers" }
 testcontainers-postgresql = { module = "org.testcontainers:postgresql", version.ref = "testcontainers" }
+testcontainers-kafka = { module = "org.testcontainers:kafka", version.ref = "testcontainers" }
+reactor-test = { module = "io.projectreactor:reactor-test" } # version is managed by the Spring BOM
 room-common-jvm = { group = "androidx.room", name = "room-common-jvm", version.ref = "roomCommonJvm" }
 
 [bundles]
@ -2,80 +2,84 @@
 ## Overview
 
-The **messaging module** provides the infrastructure for asynchronous communication between the microservices of the Meldestelle system. It uses **Apache Kafka** as a high-performance, distributed message broker. This module is essential for decoupling services and for implementing patterns such as publish/subscribe, enabling a scalable and resilient architecture.
+The **messaging module** provides the infrastructure for asynchronous, reactive communication between the microservices. It uses **Apache Kafka** as a high-performance, distributed message broker and is essential for decoupling services and implementing a scalable, event-driven architecture.
 
 ## Architecture
 
-Like other infrastructure modules, this one is split into two specialized components to separate configuration from client logic:
+The module is split into two specialized components to separate configuration from client logic:
 
 infrastructure/messaging/
 ├── messaging-config/   # Provides the central Kafka configuration
-└── messaging-client/   # Provides reusable producer and consumer clients
+└── messaging-client/   # Provides reusable, reactive clients
 
 ### `messaging-config`
 
-This module is the foundation for every Kafka interaction. It is responsible for centralizing the entire **configuration**.
+This module centralizes the basic Kafka configuration for the whole project.
 
-* **Purpose:** Defines Spring beans for the basic Kafka configuration. This includes:
-    * The address of the Kafka brokers (`bootstrap-servers`).
-    * Configuration for serializers and deserializers (e.g. Spring Kafka's `JsonSerializer`) to ensure that all services exchange messages in the same format (JSON).
-    * Configuration for topics, partitions, and replication factors.
-* **Benefit:** Every service that uses Kafka can rely on this central configuration, which ensures consistency and simplifies setting up new producers or consumers.
+* **Purpose:** Defines Spring beans for the `ProducerFactory` (the basis for producers) and a `Map` of default configuration for consumers (e.g. `bootstrap-servers`, `group-id`, serializers).
+* **Benefit:** Ensures consistency and simplifies setting up new producers or consumers in the services.
 
 ### `messaging-client`
 
-This module builds on `messaging-config` and provides **reusable high-level components** for interacting with Kafka.
+This module builds on the configuration and provides reusable high-level components for interacting with Kafka.
 
-* **Purpose:** Provides easy-to-use classes or services, e.g. a `KafkaProducerService` for sending messages and a `KafkaConsumerService` for receiving messages. It uses **Project Reactor** (`reactor-kafka`) to enable reactive, non-blocking message processing.
-* **Benefit:** Encapsulates the complexity of the Kafka producer and consumer APIs. A business service only needs to call a method such as `producer.sendMessage("topic", message)`, without having to worry about the details of connections, serialization, or error handling.
+* **Purpose:**
+    * **`KafkaEventPublisher`**: A reactive, non-blocking service for sending messages. It uses Spring's `ReactiveKafkaProducerTemplate`.
+    * **`KafkaEventConsumer`**: A reactive service for receiving messages. It encapsulates the complexity of `reactor-kafka` and returns a continuous `Flux` stream of events.
+* **Benefit:** Encapsulates the complexity of the reactive Kafka API. A business service only needs to handle reactive streams (`Mono`, `Flux`), without worrying about the details of the Kafka interaction.
 
-## Usage in other modules
+## Usage
 
-A microservice that wants to send or receive messages proceeds as follows:
+A microservice that wants to send or receive messages declares a dependency on `:infrastructure:messaging:messaging-client` and injects the corresponding interfaces.
 
-1. **Declare the dependency:** The service module (e.g. `events-service`) adds an `implementation` dependency on `:infrastructure:messaging:messaging-client` to its `build.gradle.kts`.
-2. **Inject the client service:** In the service code, the `KafkaProducerService` or `KafkaConsumerService` is requested via dependency injection.
 
-```kotlin
-// Example: sending a message
-@Service
-class EventNotificationService(
-    private val kafkaProducer: KafkaProducerService
-) {
-    fun notifyNewEvent(eventDetails: EventDetails) {
-        val topic = "new-events-topic"
-        // Simple call to send the message.
-        // The complexity of serialization and sending is encapsulated.
-        kafkaProducer.sendMessage(topic, eventDetails.id, eventDetails)
-            .subscribe(
-                { result -> logger.info("Message sent successfully to topic '{}'", topic) },
-                { error -> logger.error("Failed to send message to topic '{}'", topic, error) }
-            )
-    }
-}
-```
+**Example: sending a message (non-blocking):**
+
+```kotlin
+@Service
+class EventNotificationService(
+    private val eventPublisher: EventPublisher
+) {
+    fun notifyNewEvent(eventDetails: EventDetails) {
+        val topic = "new-events-topic"
+        eventPublisher.publishEvent(topic, eventDetails.id, eventDetails)
+            .subscribe(
+                null, // onNext consumer: a Mono<Void> emits no values
+                { error -> logger.error("Failed to send message to topic '{}'", topic, error) }
+            )
+        // The method returns immediately, without waiting for Kafka's acknowledgment.
+    }
+}
+```
-```kotlin
-// Example: receiving messages
-@Component
-class EventListener(
-    private val kafkaConsumer: KafkaConsumerService
-) {
-    @PostConstruct
-    fun listenForEvents() {
-        val topic = "new-events-topic"
-        // Listen reactively for incoming messages.
-        kafkaConsumer.receiveMessages<EventDetails>(topic)
-            .subscribe { event ->
-                logger.info("Received new event with ID: {}", event.id)
-                // Business logic for processing the event...
-            }
-    }
-}
-```
-
-This architecture enables clean, robust, and highly decoupled communication between the services.
+**Example: receiving messages (reactive):**
+
+```kotlin
+@Component
+class EventListener(
+    private val eventConsumer: EventConsumer
+) {
+    @PostConstruct
+    fun listenForEvents() {
+        val topic = "new-events-topic"
+        eventConsumer.receiveEvents<EventDetails>(topic)
+            .subscribe { event ->
+                logger.info("Received new event with ID: {}", event.id)
+                // Business logic for processing the event...
+            }
+    }
+}
+```
 
+## Testing strategy
+
+The reliability of the module is ensured by a comprehensive integration test built on the "gold standard" principle:
+
+* **Testcontainers:** The KafkaIntegrationTest starts a real Apache Kafka Docker container to validate the functionality under realistic conditions.
+* **Reactive testing:** The test uses Project Reactor's StepVerifier to verify the reactive streams (Mono, Flux) deterministically and without unreliable Thread.sleep calls.
+* **Lifecycle management:** The test lifecycle is managed cleanly via @BeforeEach and @AfterEach to ensure that all resources (in particular producer threads) are released correctly after each test.
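The deterministic, sleep-free style of verification described above can be illustrated without any Kafka or Reactor dependency. The sketch below is a toy analogue of StepVerifier's `expectNext`/`verifyComplete` idiom applied to a plain Kotlin `Sequence`; all names (`SeqVerifier`, `verify`) are invented for this illustration and are not part of the project.

```kotlin
// Toy analogue of Project Reactor's StepVerifier, applied to a Kotlin Sequence.
// The assertions themselves drive consumption of the stream, so no Thread.sleep
// or polling is ever needed -- the same idea StepVerifier applies to Flux/Mono.
class SeqVerifier<T>(private val iterator: Iterator<T>) {
    fun expectNext(expected: T): SeqVerifier<T> {
        check(iterator.hasNext()) { "Expected $expected but the stream completed" }
        val actual = iterator.next()
        check(actual == expected) { "Expected $expected but got $actual" }
        return this
    }

    fun verifyComplete() {
        check(!iterator.hasNext()) { "Expected completion but more elements remain" }
    }
}

fun <T> verify(seq: Sequence<T>): SeqVerifier<T> = SeqVerifier(seq.iterator())

fun main() {
    // Each expectation consumes exactly one element; verifyComplete asserts the end.
    verify(sequenceOf("a", "b"))
        .expectNext("a")
        .expectNext("b")
        .verifyComplete()
    println("ok")
}
```

The real test below combines the send action and the receive expectation with `sendAction.then(receivedEvent)` before handing the result to StepVerifier, so ordering is enforced by the reactive chain rather than by timing.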
 ---
-**Last updated**: July 31, 2025
+**Last updated**: August 9, 2025
@ -0,0 +1,30 @@
package at.mocode.infrastructure.messaging.client

import reactor.core.publisher.Flux

/**
 * A generic, reactive interface for consuming events from a message broker.
 */
interface EventConsumer {

    /**
     * Receives a continuous stream of events from the specified topic.
     *
     * This method returns a cold Flux, meaning that the consumer will only start
     * listening for messages once the Flux is subscribed to.
     *
     * @param T The expected type of the event payload.
     * @param topic The topic to subscribe to.
     * @return A reactive stream (Flux) of events of type T.
     */
    fun <T : Any> receiveEvents(topic: String, eventType: Class<T>): Flux<T>
}

/**
 * Kotlin-idiomatic extension function for `receiveEvents` using reified types.
 *
 * Example: `consumer.receiveEvents<MyEvent>("my-topic").subscribe { ... }`
 */
inline fun <reified T : Any> EventConsumer.receiveEvents(topic: String): Flux<T> {
    return this.receiveEvents(topic, T::class.java)
}
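The reified-extension trick used by `EventConsumer` above (an interface method takes a `Class<T>`, and an inline `reified` extension supplies it from the type argument) can be shown in a self-contained, dependency-free sketch. The `Decoder`/`SimpleDecoder` names below are invented purely for illustration:

```kotlin
// The interface works on Class<T>, just like EventConsumer.receiveEvents.
interface Decoder {
    fun <T : Any> decode(raw: String, type: Class<T>): T
}

// The inline reified extension forwards T::class.java, so callers can
// write decode<Int>("...") instead of passing the Class object explicitly.
inline fun <reified T : Any> Decoder.decode(raw: String): T =
    decode(raw, T::class.java)

object SimpleDecoder : Decoder {
    @Suppress("UNCHECKED_CAST")
    override fun <T : Any> decode(raw: String, type: Class<T>): T =
        when (type) {
            Int::class.java, Integer::class.java -> raw.toInt() as T
            String::class.java -> raw as T
            else -> error("Unsupported type: $type")
        }
}

fun main() {
    println(SimpleDecoder.decode<Int>("42"))       // prints 42
    println(SimpleDecoder.decode<String>("hello")) // prints hello
}
```

This is why the extension must be `inline` with a `reified` type parameter: only then does the compiler know `T` at the call site and can emit `T::class.java`.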
@ -1,24 +1,22 @@
 package at.mocode.infrastructure.messaging.client
 
-import kotlinx.coroutines.future.await
 import reactor.core.publisher.Flux
 import reactor.core.publisher.Mono
 
 /**
  * Interface for publishing domain events to a message broker.
  */
 interface EventPublisher {
 
     /**
-     * Publishes an event to the specified topic.
-     *
-     * @param topic The topic to publish to
-     * @param key The message key (optional)
-     * @param event The event to publish
+     * Publishes a single event to the specified topic.
+     * Returns a Mono that completes when the send operation is finished.
      */
-    suspend fun publishEvent(topic: String, key: String? = null, event: Any)
+    fun publishEvent(topic: String, key: String? = null, event: Any): Mono<Void>
 
     /**
      * Publishes multiple events to the specified topic.
-     *
-     * @param topic The topic to publish to
-     * @param events The events to publish with their keys
+     * Returns a Flux that completes when all send operations are finished.
      */
-    suspend fun publishEvents(topic: String, events: List<Pair<String?, Any>>)
+    fun publishEvents(topic: String, events: List<Pair<String?, Any>>): Flux<Void>
 }
@ -0,0 +1,48 @@
package at.mocode.infrastructure.messaging.client

import org.apache.kafka.clients.consumer.ConsumerConfig
import org.slf4j.LoggerFactory
import org.springframework.kafka.support.serializer.JsonDeserializer
import org.springframework.stereotype.Component
import reactor.core.publisher.Flux
import reactor.kafka.receiver.KafkaReceiver
import reactor.kafka.receiver.ReceiverOptions
import java.util.Collections

/**
 * A reactive, non-blocking Kafka implementation of the EventConsumer interface.
 */
@Component
class KafkaEventConsumer(
    // The base configuration properties are injected from messaging-config
    private val consumerConfig: Map<String, Any>
) : EventConsumer {

    private val logger = LoggerFactory.getLogger(KafkaEventConsumer::class.java)

    override fun <T : Any> receiveEvents(topic: String, eventType: Class<T>): Flux<T> {
        // A new, topic-specific configuration is created for each call.
        val receiverOptions = ReceiverOptions.create<String, T>(consumerConfig)
            .subscription(Collections.singleton(topic))
            .withValueDeserializer(JsonDeserializer(eventType).trustedPackages("*"))
            .addAssignListener { partitions ->
                logger.info("Partitions assigned for topic '{}': {}", topic, partitions)
            }
            .addRevokeListener { partitions ->
                logger.warn("Partitions revoked for topic '{}': {}", topic, partitions)
            }

        return KafkaReceiver.create(receiverOptions)
            .receive()
            .doOnNext { record ->
                logger.debug(
                    "Received message from topic-partition {}-{} with offset {}",
                    record.topic(), record.partition(), record.offset()
                )
            }
            .map { it.value() } // Extract only the deserialized payload
            .doOnError { exception ->
                logger.error("Error receiving events from topic '{}'", topic, exception)
            }
    }
}
@ -1,49 +1,46 @@
 package at.mocode.infrastructure.messaging.client
 
-import kotlinx.coroutines.future.await
 import org.slf4j.LoggerFactory
-import org.springframework.kafka.core.KafkaTemplate
+import org.springframework.kafka.core.reactive.ReactiveKafkaProducerTemplate
 import org.springframework.stereotype.Component
 import reactor.core.publisher.Flux
 import reactor.core.publisher.Mono
 
 /**
- * Kafka implementation of EventPublisher.
+ * A reactive, non-blocking Kafka implementation of EventPublisher.
  */
 @Component
 class KafkaEventPublisher(
-    private val kafkaTemplate: KafkaTemplate<String, Any>
+    // FIX: use the reactive template
+    private val reactiveKafkaTemplate: ReactiveKafkaProducerTemplate<String, Any>
 ) : EventPublisher {
 
     private val logger = LoggerFactory.getLogger(KafkaEventPublisher::class.java)
 
-    override suspend fun publishEvent(topic: String, key: String?, event: Any) {
-        try {
-            logger.debug("Publishing event to topic '{}' with key '{}'", topic, key)
-
-            val sendResult = if (key != null) {
-                kafkaTemplate.send(topic, key, event).get()
-            } else {
-                kafkaTemplate.send(topic, event).get()
-            }
-
-            logger.info("Successfully published event to topic '{}' with key '{}'", topic, key)
-        } catch (exception: Exception) {
-            logger.error("Failed to publish event to topic '{}' with key '{}'", topic, key, exception)
-            throw exception
-        }
-    }
+    override fun publishEvent(topic: String, key: String?, event: Any): Mono<Void> {
+        logger.debug("Publishing event to topic '{}' with key '{}'", topic, key)
+        return reactiveKafkaTemplate.send(topic, key, event)
+            .doOnSuccess { result ->
+                val record = result.recordMetadata()
+                logger.info(
+                    "Successfully published event to topic-partition {}-{} with offset {}",
+                    record.topic(), record.partition(), record.offset()
+                )
+            }
+            .doOnError { exception ->
+                logger.error("Failed to publish event to topic '{}' with key '{}'", topic, key, exception)
+            }
+            .then() // Converts the result into a Mono<Void>
+    }
 
-    override suspend fun publishEvents(topic: String, events: List<Pair<String?, Any>>) {
-        try {
-            logger.debug("Publishing {} events to topic '{}'", events.size, topic)
-
-            events.forEach { (key, event) ->
-                publishEvent(topic, key, event)
-            }
-
-            logger.info("Successfully published {} events to topic '{}'", events.size, topic)
-        } catch (exception: Exception) {
-            logger.error("Failed to publish events to topic '{}'", topic, exception)
-            throw exception
-        }
-    }
+    override fun publishEvents(topic: String, events: List<Pair<String?, Any>>): Flux<Void> {
+        logger.debug("Publishing {} events to topic '{}'", events.size, topic)
+        // Uses Flux.fromIterable to build a sequence of send operations
+        return Flux.fromIterable(events)
+            // flatMap executes the send operations concurrently,
+            // while remaining reactive (non-blocking).
+            .flatMap { (key, event) ->
+                publishEvent(topic, key, event)
+            }
+    }
 }
@ -0,0 +1,18 @@
package at.mocode.infrastructure.messaging.client

import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.kafka.core.ProducerFactory
import org.springframework.kafka.core.reactive.ReactiveKafkaProducerTemplate
import reactor.kafka.sender.SenderOptions

@Configuration
class ReactiveKafkaConfig {

    @Bean
    fun reactiveKafkaProducerTemplate(producerFactory: ProducerFactory<String, Any>): ReactiveKafkaProducerTemplate<String, Any> {
        // Uses the ProducerFactory from the messaging-config module
        val senderOptions = SenderOptions.create<String, Any>(producerFactory.configurationProperties)
        return ReactiveKafkaProducerTemplate(senderOptions)
    }
}
@ -0,0 +1,85 @@
// KafkaIntegrationTest.kt

package at.mocode.infrastructure.messaging.client

import at.mocode.infrastructure.messaging.config.KafkaConfig
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.common.serialization.StringDeserializer
import org.junit.jupiter.api.AfterEach
import org.junit.jupiter.api.BeforeEach
import org.junit.jupiter.api.Test
import org.springframework.kafka.core.DefaultKafkaProducerFactory
import org.springframework.kafka.support.serializer.JsonDeserializer
import org.testcontainers.containers.KafkaContainer
import org.testcontainers.junit.jupiter.Container
import org.testcontainers.junit.jupiter.Testcontainers
import org.testcontainers.utility.DockerImageName
import reactor.kafka.receiver.KafkaReceiver
import reactor.kafka.receiver.ReceiverOptions
import reactor.test.StepVerifier
import java.util.*

@Testcontainers
class KafkaIntegrationTest {

    companion object {
        @Container
        private val kafkaContainer = KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.5.0"))
    }

    private lateinit var kafkaEventPublisher: KafkaEventPublisher
    private lateinit var producerFactory: DefaultKafkaProducerFactory<String, Any>
    private val testTopic = "test-topic-${UUID.randomUUID()}"

    @BeforeEach
    fun setUp() {
        val kafkaConfig = KafkaConfig().apply {
            bootstrapServers = kafkaContainer.bootstrapServers
        }
        producerFactory = kafkaConfig.producerFactory() as DefaultKafkaProducerFactory<String, Any>

        val reactiveKafkaConfig = ReactiveKafkaConfig()
        val reactiveTemplate = reactiveKafkaConfig.reactiveKafkaProducerTemplate(producerFactory)
        kafkaEventPublisher = KafkaEventPublisher(reactiveTemplate)
    }

    @AfterEach
    fun tearDown() {
        producerFactory.destroy()
    }

    @Test
    fun `publishEvent should send a message that can be received`() {
        // Arrange
        val testKey = "test-key"
        val testEvent = TestEvent("Test Message")

        val consumerProps = mapOf(
            ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG to kafkaContainer.bootstrapServers,
            ConsumerConfig.GROUP_ID_CONFIG to "test-group-${UUID.randomUUID()}",
            ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG to StringDeserializer::class.java,
            ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG to JsonDeserializer::class.java,
            ConsumerConfig.AUTO_OFFSET_RESET_CONFIG to "earliest",
            JsonDeserializer.TRUSTED_PACKAGES to "*"
        )
        val receiverOptions = ReceiverOptions.create<String, TestEvent>(consumerProps).subscription(listOf(testTopic))

        // The Mono representing the next received event
        val receivedEvent = KafkaReceiver.create(receiverOptions)
            .receive()
            .next() // take only the first event
            .map { it.value() } // extract the value (our TestEvent instance)

        // The Mono representing the send action
        val sendAction = kafkaEventPublisher.publishEvent(testTopic, testKey, testEvent)

        // FIX: combine the send action and the receive expectation in one StepVerifier.
        // The `then` method ensures that the send action completes
        // before the `receivedEvent` Mono is subscribed to and verified.
        StepVerifier.create(sendAction.then(receivedEvent))
            .expectNext(testEvent) // expect our test event to arrive
            .verifyComplete() // conclude the verification
    }

    data class TestEvent(val message: String)
}
@ -1,6 +1,8 @@
 package at.mocode.infrastructure.messaging.config
 
+import org.apache.kafka.clients.consumer.ConsumerConfig
 import org.apache.kafka.clients.producer.ProducerConfig
+import org.apache.kafka.common.serialization.StringDeserializer
 import org.apache.kafka.common.serialization.StringSerializer
 import org.springframework.beans.factory.annotation.Value
 import org.springframework.context.annotation.Bean
@ -8,16 +10,18 @@ import org.springframework.context.annotation.Configuration
 import org.springframework.kafka.core.DefaultKafkaProducerFactory
 import org.springframework.kafka.core.KafkaTemplate
 import org.springframework.kafka.core.ProducerFactory
+import org.springframework.kafka.support.serializer.JsonDeserializer
 import org.springframework.kafka.support.serializer.JsonSerializer
 
 /**
  * Kafka configuration for event publishing.
  */
 @Configuration
 class KafkaConfig {
 
+    // FIX: changed from lateinit to a public var with a default value to make tests possible
     @Value($$"${spring.kafka.bootstrap-servers:localhost:9092}")
-    private lateinit var bootstrapServers: String
+    var bootstrapServers: String = "localhost:9092"
 
+    @Value("\${spring.kafka.consumer.group-id:meldestelle-group}")
+    private lateinit var consumerGroupId: String
 
     @Bean
     fun producerFactory(): ProducerFactory<String, Any> {
@ -34,7 +38,20 @@ class KafkaConfig {
     }
 
     @Bean
-    fun kafkaTemplate(): KafkaTemplate<String, Any> {
-        return KafkaTemplate(producerFactory())
+    fun kafkaTemplate(producerFactory: ProducerFactory<String, Any>): KafkaTemplate<String, Any> {
+        return KafkaTemplate(producerFactory)
     }
 
+    // NEW: provides a central map with the base configuration for all consumers.
+    @Bean
+    fun kafkaConsumerConfiguration(): Map<String, Any> {
+        return mapOf(
+            ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG to bootstrapServers,
+            ConsumerConfig.GROUP_ID_CONFIG to consumerGroupId,
+            ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG to StringDeserializer::class.java,
+            ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG to JsonDeserializer::class.java,
+            ConsumerConfig.AUTO_OFFSET_RESET_CONFIG to "earliest", // start from the beginning if no offset exists
+            JsonDeserializer.TRUSTED_PACKAGES to "*" // allow deserialization of all our classes
+        )
+    }
 }
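The idea behind the `kafkaConsumerConfiguration()` bean above is simply a shared, reusable `Map<String, Any>` of consumer defaults that individual consumers can copy and extend. A minimal, dependency-free sketch of that pattern (using the raw Kafka property names that the `ConsumerConfig.*` constants resolve to; the `consumerDefaults` helper name is invented for this sketch):

```kotlin
// Shared consumer defaults as a plain map, mirroring kafkaConsumerConfiguration().
// "bootstrap.servers", "group.id" and "auto.offset.reset" are the string values
// behind ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG etc.
fun consumerDefaults(bootstrapServers: String, groupId: String): Map<String, Any> =
    mapOf(
        "bootstrap.servers" to bootstrapServers,
        "group.id" to groupId,
        "auto.offset.reset" to "earliest" // start from the beginning if no offset exists
    )

fun main() {
    val shared = consumerDefaults("localhost:9092", "meldestelle-group")
    // A specific consumer copies the shared defaults and adds its own entries;
    // the original map stays untouched because Kotlin's `+` returns a new map.
    val forTopic = shared + ("client.id" to "events-consumer")
    println(forTopic["bootstrap.servers"]) // prints localhost:9092
}
```

This is also why `KafkaEventConsumer` can take the map by constructor injection and derive per-topic `ReceiverOptions` from it on every call.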
@ -12,6 +12,10 @@ dependencies {
     api(libs.bundles.testing.jvm)
     api(libs.bundles.testcontainers)
 
+    // Makes the Kafka and Reactor test libraries available
+    api(libs.testcontainers.kafka)
+    api(libs.reactor.test)
 
     // Provides Spring Boot test dependencies and the H2 database for tests.
     api(libs.spring.boot.starter.test)
     api(libs.h2.driver)