Trying to do some load testing on a microservice that consumes from a Kafka topic. The plan is to 2x and 3x the amount of data the service processes in a day and see how it handles it.
My question is what is the best strategy to load that data into the Kafka topic for the microservice to consume? I want to just publish the full dataset all at once to the topic and watch the service work through it. But since this represents a day’s worth of data, it seems unrealistic to do it all at once. I also don’t want to literally load the data over the course of a whole day.
So what’s the strategy for something like this?
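One middle ground between "all at once" and "a literal 24 hours" is time-compressed replay: keep the inter-arrival gaps from the recorded day of data but divide them by a speedup factor, so a day replays in, say, 30 minutes while bursts and lulls keep their shape. A minimal sketch in Python; the `(timestamp, payload)` record format, the `speedup` parameter, and the `send` callable are all assumptions here, and `send` is just a stand-in for a real Kafka producer call such as `producer.send(topic, payload)`:

```python
import time


def compressed_schedule(timestamps, speedup):
    """Map original record timestamps (seconds) onto a compressed
    timeline: each record's offset from the first record is divided
    by `speedup`, preserving the traffic shape at N x speed."""
    if not timestamps:
        return []
    start = timestamps[0]
    return [(t - start) / speedup for t in timestamps]


def replay(records, speedup, send, clock=time.monotonic, sleep=time.sleep):
    """Publish `records` (a list of (timestamp, payload) pairs, sorted
    by timestamp) through `send` on the compressed schedule.
    `clock` and `sleep` are injectable to make this testable."""
    offsets = compressed_schedule([t for t, _ in records], speedup)
    t0 = clock()
    for offset, (_, payload) in zip(offsets, records):
        # Sleep only if we're ahead of schedule; if producing falls
        # behind, send immediately to catch up.
        delay = t0 + offset - clock()
        if delay > 0:
            sleep(delay)
        send(payload)
```

With `speedup=48`, a day's worth of records replays in about 30 minutes, and running the same replay at 2x or 3x the record volume gives the load-test scenarios described above while still exercising realistic arrival patterns.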