Process data on Event Hubs as an Apache Kafka compatible ecosystem
The Apache Kafka output connector can be used to forward data to Event Hubs, which provides an endpoint compatible with the Apache Kafka® producer and consumer APIs. The Kafka connector is available in both Data Flow and Rule Engine.
The Kafka connector requires at least one Kafka bootstrap server, provided as a comma-separated list. It also requires the Kafka topic to which the data will be sent.
Authentication uses a simple username/password mechanism: the values for Security Protocol (SASL_SSL) and SASL Mechanism (PLAIN) are fixed and not editable. Only the username ($ConnectionString) and the password (the connection string of the Event Hubs namespace) need to be provided.
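The settings above map directly onto a standard Kafka client configuration. The following sketch shows what they would look like for a confluent-kafka style producer; the namespace, topic name, and connection string are placeholders, not values from this document.

```python
# Hypothetical placeholders -- substitute your own Event Hubs values.
EVENTHUBS_NAMESPACE = "my-namespace"
CONNECTION_STRING = "Endpoint=sb://my-namespace.servicebus.windows.net/;SharedAccessKeyName=key-name;SharedAccessKey=key-value"
TOPIC = "my-event-hub"  # the Kafka topic, i.e. the event hub name

config = {
    # One or more bootstrap servers, comma-separated; Event Hubs
    # exposes its Kafka-compatible endpoint on port 9093.
    "bootstrap.servers": f"{EVENTHUBS_NAMESPACE}.servicebus.windows.net:9093",
    # Fixed, non-editable values in the connector:
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    # The username is the literal string "$ConnectionString"; the
    # password is the Event Hubs namespace connection string.
    "sasl.username": "$ConnectionString",
    "sasl.password": CONNECTION_STRING,
}

# With the confluent-kafka package installed, the config could be used as:
#   from confluent_kafka import Producer
#   producer = Producer(config)
#   producer.produce(TOPIC, value=b'{"temperature": 21.5}')
#   producer.flush()
```

The producer call is left commented out because it requires a live Event Hubs namespace; only the configuration itself is shown.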
Optionally, a custom payload can be selected to send a templated payload in the desired structure. If no custom payload is selected, the whole data sample is forwarded.
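To illustrate the difference between the two modes, the sketch below contrasts forwarding a whole data sample with rendering a templated subset of it. The sample fields and the template are hypothetical, and the connector's actual template syntax is not shown here; Python string formatting stands in for it.

```python
import json

# Hypothetical incoming data sample from the flow.
sample = {"deviceId": "sensor-01", "temperature": 21.5, "humidity": 40}

# Without a custom payload, the whole data sample is forwarded:
full_payload = json.dumps(sample)

# With a custom payload, a template selects and restructures fields.
# This uses Python's str.format() purely as an illustrative stand-in.
template = '{{"id": "{deviceId}", "temp": {temperature}}}'
custom_payload = template.format(**sample)
# custom_payload == '{"id": "sensor-01", "temp": 21.5}'
```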
A tutorial, "How to send data to Event Hubs with Apache Kafka", can be found here.