How to send data to Azure Event Hubs with Apache Kafka

In this tutorial, you will learn how to send data to Event Hubs with Apache Kafka.

Benefits of the Apache Kafka output connector:

  • Directly stream device data from akenza to Event Hubs with Apache Kafka and subsequently process it in Azure services

  • Easily process low-power device data, e.g. LoRaWAN device data, with Apache Kafka

  • Further benefits of Apache Kafka:

    • Real-Time Handling: Apache Kafka can handle real-time data pipelines at scale with zero downtime and zero data loss.

    • High Throughput: Capable of handling high-velocity and high-volume data.

    • Low Latency: Handles messages with very low latency, in the range of milliseconds.

    • Variety of IoT Use Cases: Manages a variety of IoT use cases commonly required for a data lake (log aggregation, web activity tracking, etc.).

In this tutorial, you will learn how to send data to Event Hubs using the Apache Kafka output connector. The following steps are explained in more detail:

1. Set up an Event Hubs namespace

First, you will need to set up an Azure Event Hubs namespace in order to successfully process data from akenza in Azure services.

Create an Event Hubs resource

  • Sign in to the Azure portal

  • Create a new Event Hubs resource by searching for it in All services and then creating a new Event Hubs namespace

    • Assign a resource group by choosing an existing group or creating a new one, fill in the namespace name and region fields, and select a pricing tier that fits your use case (note that the Basic tier does not support the Kafka endpoint); in the Networking tab, choose Public access

  • Click Review + create to provision the Event Hubs namespace resource

Add a shared access policy

  • Navigate to your Event Hubs namespace resource and select Shared access policies in the Settings section.

  • Select + Add shared access policy, give it a name, and check all boxes for the permissions (Manage, Send, Listen)

2. Obtain an Azure Event Hubs connection string

Once the policy is created, obtain the primary connection string by clicking the policy and copying the connection string. This connection string will be used as the JAAS password when creating the connector in akenza.
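
The connection string has a well-defined shape, and Event Hubs exposes a Kafka-compatible endpoint on port 9093 that authenticates via SASL PLAIN. The sketch below shows how the pieces map to Kafka client settings, using the confluent-kafka Python package as an example client; the namespace name, policy name, and key are placeholders.

```python
# Sketch: mapping an Event Hubs connection string to Kafka client settings.
# "my-namespace", "akenza-policy", and "<primary-key>" are placeholders.
connection_string = (
    "Endpoint=sb://my-namespace.servicebus.windows.net/;"
    "SharedAccessKeyName=akenza-policy;"
    "SharedAccessKey=<primary-key>"
)

kafka_config = {
    # Event Hubs exposes its Kafka endpoint on port 9093
    "bootstrap.servers": "my-namespace.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    # For Event Hubs, the SASL (JAAS) username is the literal string "$ConnectionString"
    "sasl.username": "$ConnectionString",
    # ...and the password is the full connection string copied from the policy
    "sasl.password": connection_string,
}
```

The same values go into the akenza Data Flow in the next step: the broker is the namespace host on port 9093, the JAAS username is $ConnectionString, and the JAAS password is the connection string itself.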

3. Set up a Data Flow in akenza

In akenza, create a new Data Flow with Kafka as the output connector. Define a connector name and insert the Kafka brokers, the Kafka topic, the JAAS username, and the JAAS password (the connection string obtained previously from the shared access policy). Save the Data Flow.

Create a new device using the Data Flow and start sending data.
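
To verify the credentials and topic outside of akenza, you can send a test message directly to the Kafka endpoint. This is a minimal sketch assuming the confluent-kafka package and the kafka_config mapping from step 2; the topic name test-device-data is a placeholder and must match the Kafka topic configured in the Data Flow.

```python
from confluent_kafka import Producer

producer = Producer(kafka_config)  # SASL settings from step 2

def on_delivery(err, msg):
    # Report success or failure for the test message
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}] @ {msg.offset()}")

# "test-device-data" is a placeholder; use the Kafka topic from your Data Flow
producer.produce("test-device-data", value=b'{"temperature": 21.5}', callback=on_delivery)
producer.flush(10)
```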

Important: Each Kafka topic will create a new Event Hub instance in the Event Hubs namespace resource.

The instances can be found by navigating to the Event Hubs namespace resource and choosing Event Hubs under Entities in the left pane; this lists all instances in that Event Hubs namespace.
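
You can also confirm this topic-to-instance mapping from a client by listing the topics the namespace exposes; the sketch below assumes the same confluent-kafka package and the kafka_config from step 2.

```python
from confluent_kafka.admin import AdminClient

admin = AdminClient(kafka_config)  # SASL settings from step 2

# Each topic returned here corresponds to one Event Hub instance in the namespace
metadata = admin.list_topics(timeout=10)
for topic_name in metadata.topics:
    print(topic_name)
```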

4. Use Azure Data Explorer to store and view messages

One way to monitor incoming uplinks on Event Hubs is to create an Azure Data Explorer database; more information about the setup can be found here.
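
As a quick alternative check before setting up a database, you can read the uplinks directly from the Kafka endpoint. Below is a minimal consumer sketch, again assuming the confluent-kafka package and the kafka_config from step 2; the consumer group and topic name are placeholders.

```python
from confluent_kafka import Consumer

consumer = Consumer({
    **kafka_config,              # SASL settings from step 2
    "group.id": "uplink-check",  # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["test-device-data"])  # topic from your Data Flow

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()
```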
