How to send data to Event Hubs with Apache Kafka
In this tutorial, you will learn how to send data to Event Hubs with the Apache Kafka output connector.
Benefits of the Apache Kafka output connector:
- Easily process low-power device data (e.g. LoRaWAN device data) with Apache Kafka
- Real-time handling: Apache Kafka can handle real-time data pipelines at scale with no downtime and no data loss.
- High throughput: capable of handling high-velocity and high-volume data.
- Low latency: handles messages with very low latency, in the range of milliseconds.
- Variety of IoT use cases: manages many IoT use cases commonly required for a data lake (log aggregation, web activity tracking, etc.).
The following steps are explained in more detail:
- Create a new Event Hubs resource by searching for it in All services and creating a new Event Hubs namespace
- Assign a resource group by choosing an existing one or creating a new one, and fill in the namespace name and region fields; choose Public access in the Networking tab, and select a tier that fits your use case in the Management tab (use the free tier for testing)
- Click Review + create to provision the Event Hubs namespace
- Navigate to your Event Hubs namespace and select Shared access policies in the Settings section.
- Select + Add shared access policy, give it a name, and check all permission boxes (Manage, Send, Listen)
Adding a shared access policy
Once the policy is created, obtain the primary connection string by clicking the policy and copying the connection string. This connection string will be used as the JAAS password when creating the connector in akenza.
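The connection string also determines the Kafka broker address: Event Hubs exposes its Kafka endpoint on port 9093 of the namespace host. A minimal sketch of deriving the connector fields from a connection string (the namespace, policy name, and key below are made-up placeholders, not real credentials):

```python
# Sketch: derive the Kafka connector fields from an Event Hubs
# connection string. All values below are placeholders.
from urllib.parse import urlparse

def kafka_settings(connection_string: str) -> dict:
    """Map an Event Hubs connection string to Kafka connector fields."""
    parts = dict(
        kv.split("=", 1) for kv in connection_string.split(";") if "=" in kv
    )
    host = urlparse(parts["Endpoint"]).hostname  # <namespace>.servicebus.windows.net
    return {
        "kafka_brokers": f"{host}:9093",       # Event Hubs' Kafka endpoint port
        "jaas_username": "$ConnectionString",  # literal value required by Event Hubs
        "jaas_password": connection_string,    # the full connection string
    }

example = (
    "Endpoint=sb://my-namespace.servicebus.windows.net/;"
    "SharedAccessKeyName=my-policy;SharedAccessKey=abc123"
)
print(kafka_settings(example)["kafka_brokers"])
# my-namespace.servicebus.windows.net:9093
```

Note that the JAAS username is always the literal string `$ConnectionString`; only the password varies per policy.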
Obtaining the connection string
In akenza, create a new Data Flow with Kafka as the output connector. Define a connector name and fill in the Kafka brokers, Kafka topic, JAAS username, and JAAS password (the connection string obtained from the shared access policy). Save the Data Flow.
Apache Kafka template
Create a new device using the Data Flow and start sending data.
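To verify the pipeline outside akenza, a test message can also be published directly to the Event Hubs Kafka endpoint. A minimal sketch using the confluent-kafka Python client; the namespace, topic, and connection string are placeholders, and Event Hubs requires SASL_SSL with the PLAIN mechanism:

```python
# Sketch: publish a test message to the Event Hubs Kafka endpoint with
# confluent-kafka. Namespace, topic, and connection string are placeholders.
import json

BOOTSTRAP = "my-namespace.servicebus.windows.net:9093"  # placeholder namespace
CONNECTION_STRING = "Endpoint=sb://my-namespace.servicebus.windows.net/;..."  # placeholder

# Event Hubs speaks Kafka over SASL_SSL with the PLAIN mechanism; the
# SASL username is the literal string "$ConnectionString".
producer_config = {
    "bootstrap.servers": BOOTSTRAP,
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "$ConnectionString",
    "sasl.password": CONNECTION_STRING,
}

def send_test_message(topic: str = "my-topic") -> None:
    """Produce one JSON payload; requires the confluent-kafka package."""
    from confluent_kafka import Producer  # pip install confluent-kafka
    producer = Producer(producer_config)
    producer.produce(topic, json.dumps({"temperature": 21.5}).encode())
    producer.flush(10)  # wait up to 10 s for delivery
```

If the message arrives, a new Event Hubs instance named after the topic appears in the namespace, as described below.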
Important: Each Kafka topic creates a new instance in the Event Hubs namespace resource.
To find these instances, navigate to the Event Hubs namespace and choose Event Hubs under Entities in the left pane; this lists all instances in that namespace.