Before Kafka existed, information exchange between the entities of a system could be very complicated, involving many databases, consumers (like Power BI), data sources (like an app), logging, and so on.
Once Kafka was released, some of that complexity remained, but in a far more organized way: Kafka's purpose is to act as an intermediate layer, and its role is to connect data producers to data consumers, as in the image below:
Extracted from: https://medium.com/trainingcenter/apache-kafka-838882261e83
Producer and Kafka
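To make the producer side concrete, here is a minimal sketch using the kafka-python package. The broker address (localhost:9092) and the topic name (user-events) are assumptions for illustration, not details from the article:

```python
from kafka import KafkaProducer

# Connect to a Kafka broker (address assumed for this sketch).
producer = KafkaProducer(bootstrap_servers="localhost:9092")

# Publish a message to a hypothetical topic; Kafka persists it
# so any number of consumers can read it later.
producer.send("user-events", b"user 42 clicked the checkout button")

# Block until all buffered messages are actually sent, then disconnect.
producer.flush()
producer.close()
```

Notice that the producer never talks to a consumer directly; it only hands the message to Kafka, which is exactly the intermediate-layer role described above.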
Consumer and Kafka
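The consumer side, again as a sketch with the same assumed broker, topic, and an illustrative group id:

```python
from kafka import KafkaConsumer

# Subscribe to the same hypothetical topic.
consumer = KafkaConsumer(
    "user-events",
    bootstrap_servers="localhost:9092",
    group_id="analytics-dashboard",  # consumer group id (illustrative)
    auto_offset_reset="earliest",    # start from the oldest message if no offset is stored
)

# Each record carries the topic, partition, offset, and payload.
for message in consumer:
    print(message.topic, message.partition, message.offset, message.value)
```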
It is worth mentioning that when data is consumed, Kafka keeps track of the reading progress: for every consumer group it records which messages have already been read, identified by topic, partition, and offset. If a consumer dies for some unknown reason and comes back later, Kafka knows how to resume the reading process, determining the next message to be read without duplications!
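This offset bookkeeping is what makes that recovery possible. The sketch below, reusing the same assumed names and a hypothetical process() function, commits offsets manually so a restarted consumer resumes right after the last committed message:

```python
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "user-events",
    bootstrap_servers="localhost:9092",
    group_id="analytics-dashboard",  # offsets are stored per group id
    enable_auto_commit=False,        # we commit explicitly after processing
)

for message in consumer:
    process(message.value)  # hypothetical processing function
    # Commit the offset only after the message is processed; if the
    # consumer crashes before this line, the message is re-read on
    # restart instead of being lost.
    consumer.commit()
```

Committing after processing is a design choice: it guarantees no message is skipped on a crash, at the cost of possibly reprocessing the last uncommitted message.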
Topic Partitioning and Scalability