* Make all catalog product events carry the same product id field.
This should be useful for stream processing in Flink,
because we can then key a stream of these events on the productId
more easily.
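A minimal sketch of the idea (event and property names here are hypothetical, not the actual types in the repo): every catalog product event exposes the id under the same property, so a downstream job can key on it uniformly.

```csharp
// Hypothetical sketch: all catalog product events expose the same
// ProductId property, so a stream of them can be keyed on it directly.
public record ProductPriceChangedIntegrationEvent(int ProductId, decimal NewPrice);
public record ProductStockChangedIntegrationEvent(int ProductId, int NewStock);
```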
* Add a product bought event, which is published for each product.
This makes it easier to check whether a product is being oversold.
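As a sketch (names assumed, not taken from the repo), summing the bought units per product downstream and comparing against stock would reveal overselling:

```csharp
// Hypothetical sketch of the per-product "bought" event.
public record ProductBoughtIntegrationEvent(int ProductId, int Units);
```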
* Allow setting the catalog item id through the API call.
Previously the Entity Framework API had an identity counter in place,
ignoring the id that was sent in the catalog API request.
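With EF Core this roughly amounts to turning off value generation for the key. A sketch (entity shape simplified, mapping details assumed):

```csharp
using Microsoft.EntityFrameworkCore;

public class CatalogItem
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
}

public class CatalogContext : DbContext
{
    public DbSet<CatalogItem> CatalogItems => Set<CatalogItem>();

    protected override void OnModelCreating(ModelBuilder builder)
    {
        // Accept the id sent in the catalog API request instead of
        // generating one from a database identity counter.
        builder.Entity<CatalogItem>()
               .Property(ci => ci.Id)
               .ValueGeneratedNever();
    }
}
```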
* Add Kafka event bus to Catalog, Ordering and Payment.
Only Catalog is confirmed to work; the Ordering background
service and Payment still seem to be subscribing to RabbitMQ for some reason
(needs more investigation).
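A likely place to look is the DI registration in each service's ConfigureServices. Sketch, assuming the eShopOnContainers-style IEventBus abstraction (EventBusKafka is our implementation's assumed name):

```csharp
// Each service must swap its event bus registration; a service that still
// registers EventBusRabbitMQ here keeps consuming from RabbitMQ.
services.AddSingleton<IEventBus, EventBusKafka>();
```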
* Fix Kafka consumer group ids.
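For reference, a minimal consumer configuration sketch (broker address assumed). Each service needs its own group id: within one consumer group, partitions are split among the members, so two services sharing a group id would each see only part of the events.

```csharp
using Confluent.Kafka;

var config = new ConsumerConfig
{
    BootstrapServers = "kafka:9092",  // assumed broker address
    GroupId = "ordering-api",         // unique per service, shared by its replicas
    AutoOffsetReset = AutoOffsetReset.Earliest,
};
```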
* Fix build problems (duplicate package + lower version).
All services seem to use Kafka now, but it seems like
Kafka takes too long to start up and the services
fail to connect to the brokers. Either we have to do
retries similar to the RabbitMQ connection, or we have to
enforce the startup order in Docker Compose by
using something like healthchecks.
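A sketch of the retry option, mirroring how the RabbitMQ connection uses Polly (broker address and retry counts assumed):

```csharp
using System;
using Confluent.Kafka;
using Polly;

// Retry the initial metadata request with exponential backoff until the
// broker is reachable, instead of failing once at container startup.
var policy = Policy
    .Handle<KafkaException>()
    .WaitAndRetry(5, attempt => TimeSpan.FromSeconds(Math.Pow(2, attempt)));

policy.Execute(() =>
{
    using var admin = new AdminClientBuilder(
        new AdminClientConfig { BootstrapServers = "kafka:9092" }).Build();
    admin.GetMetadata(TimeSpan.FromSeconds(5)); // throws while Kafka is down
});
```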
* Add Kafka broker dependency for other services.
There is still a bug in the event bus subscription: for example,
the Ordering service should
handle a UserCheckoutAccepted event, but it has
no subscription for it when such an event is
published to the Kafka topic:
```
src-ordering-api-1 | [15:04:42 WRN] No subscription for Kafka event: UserCheckoutAccepted
src-ordering-api-1 | Consumed event: UserCheckoutAccepted
src-ordering-api-1 | Content: 632B63DB0CE145D499FE01904F76A475
```
* Add logging for subscription manager problem.
It seems like the subscription manager is not used
correctly in the Kafka event bus (possibly two different instances?).
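One plausible fix, assuming the eShopOnContainers-style subscriptions manager: register it as a singleton so the event bus and the startup code calling Subscribe() share one instance instead of getting two transient objects.

```csharp
// With a transient registration, the subscriptions added at startup live in
// a different object than the one the Kafka consumer checks, which would
// produce exactly the "No subscription for Kafka event" warning above.
services.AddSingleton<IEventBusSubscriptionsManager, InMemoryEventBusSubscriptionsManager>();
```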
* Add printing handlers
* Actually trigger printing handlers
* add kafkapersistentconnection registration
* Revert "add kafkapersistentconnection registration"
This reverts commit 704ee3e36f.
* Add AllowAutoCreateTopics in consumers (differs from the default) and in producers (just to be explicit).
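Configuration sketch (Confluent.Kafka; broker address assumed). In librdkafka, allow.auto.create.topics is a consumer-side setting, which is why only the consumer needs it flipped:

```csharp
using Confluent.Kafka;

var consumerConfig = new ConsumerConfig
{
    BootstrapServers = "kafka:9092",
    GroupId = "catalog-api",
    AllowAutoCreateTopics = true,  // consumers default to false
};

// Producers auto-create topics whenever the broker permits it.
var producerConfig = new ProducerConfig { BootstrapServers = "kafka:9092" };
```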
* Register DefaultKafkaPersistentConnection in Ordering.BackgroundTasks.
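In sketch form (the interface name is assumed; the concrete class name is from the commit):

```csharp
// The background tasks host needs the same registration the API hosts have.
services.AddSingleton<IKafkaPersistentConnection, DefaultKafkaPersistentConnection>();
```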
* Remove noise in logs.
* Make event names in the Kafka event bus consistent.
Do not remove the IntegrationEvent suffix; before this
change the subscription handlers and event names did not match.
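A sketch of the convention now used on both sides (method shape follows the eShopOnContainers subscriptions manager):

```csharp
// Use the full type name, suffix included, both when publishing and when
// registering subscriptions; trimming "IntegrationEvent" on only one side
// meant the names on the wire never matched the registered handlers.
private static string GetEventKey<T>() => typeof(T).Name;
```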
* Create Kafka admin background service to create an empty topic.
We have to create the eshop_event_bus Kafka topic on startup,
because otherwise the consumers of the microservices would fail.
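The core of that background service might look like this sketch (partition and replication counts assumed; "topic already exists" handling omitted):

```csharp
using Confluent.Kafka;
using Confluent.Kafka.Admin;

using var admin = new AdminClientBuilder(
    new AdminClientConfig { BootstrapServers = "kafka:9092" }).Build();

await admin.CreateTopicsAsync(new[]
{
    new TopicSpecification
    {
        Name = "eshop_event_bus",
        NumPartitions = 1,      // a single partition keeps events totally ordered
        ReplicationFactor = 1,  // single-broker dev setup
    },
});
```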
---------
Co-authored-by: Philipp Theyssen <p.theyssen@gmail.com>
Co-authored-by: Philipp Theyssen <34607877+PTheyssen@users.noreply.github.com>
This enables clients outside the Docker network to connect.
To test this you can use a tool like kafkacat (kcat),
for example by running on the Docker Compose host machine:
> kafkacat -b localhost:29092 -L
For now we have a single topic for all events. Since
we use the event name as the key for the Kafka message,
all events of the same type get assigned to the same
partition inside Kafka and are therefore consumed in order.
Alternatively one could have multiple topics.
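Publishing sketch showing that key choice (topic and broker as above, payload assumed):

```csharp
using Confluent.Kafka;

var jsonPayload = "{\"ProductId\":1,\"NewPrice\":9.99}"; // serialized event (assumed)

using var producer = new ProducerBuilder<string, string>(
    new ProducerConfig { BootstrapServers = "localhost:29092" }).Build();

// Messages with equal keys hash to the same partition, so all events that
// share an event name are consumed in publish order.
await producer.ProduceAsync("eshop_event_bus", new Message<string, string>
{
    Key = "ProductPriceChangedIntegrationEvent",
    Value = jsonPayload,
});
```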
The code for the Kafka connection is based on:
https://github.com/confluentinc/confluent-kafka-dotnet/blob/master/examples/Web/KafkaClientHandle.cs
So far we have only copied details from the other event bus implementations.
Key next steps are to implement a persistent connection abstraction (class)
for the Kafka event bus and the publish and subscribe functions. For this
we need knowledge about how Kafka works, for example how one publishes
events to topics, etc.
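A first cut at that abstraction, mirroring the RabbitMQ persistent connection interface (all member names hypothetical):

```csharp
using System;
using Confluent.Kafka;

// Sketch of a persistent connection wrapper around one long-lived producer,
// analogous to IRabbitMQPersistentConnection in the RabbitMQ event bus.
public interface IKafkaPersistentConnection : IDisposable
{
    bool IsConnected { get; }
    bool TryConnect();
    IProducer<string, string> ProducerHandle { get; }
}
```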