# How to send data to Azure Event Hubs with Apache Kafka

![](https://2165942204-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MMKXTFIN5ZlLOjBlfC4%2Fuploads%2FTyO2LI4TOPUmSpsCnH4i%2Fimage.png?alt=media\&token=c5313a0c-58c2-4d38-ac75-87062e934184)

{% hint style="success" %}
**Benefits of the Apache Kafka output connector**:

* Directly stream device data from akenza to Event Hubs with **Apache Kafka** and subsequently process it in [Azure services](https://azure.microsoft.com/en-us/services/)
* Easily process **low-power device** data, e.g. LoRaWAN device data, with **Apache Kafka**
* Benefit from the strengths of Apache Kafka:
  * **Real-time handling**: handles real-time data pipelines at scale with zero downtime and zero data loss
  * **High throughput**: capable of handling high-velocity and high-volume data
  * **Low latency**: handles messages with very low latency, in the range of milliseconds
  * **Variety of IoT use cases**: manages many IoT use cases commonly required for a data lake (log aggregation, web activity tracking, etc.)

{% endhint %}

In this tutorial, you will learn how to send data to Event Hubs using the **Apache Kafka output connector**. The following steps are explained in more detail:

1. [Setup Event Hubs namespace](#1.-setup-event-hubs-namespace)
2. [Obtain an Azure Event Hub connection string](#2.-obtain-an-azure-event-hub-connection-string)
3. [Setup a **Data Flow** in akenza](#3.-setup-a-data-flow-in-akenza)
4. [Use Azure Data Explorer storage to store and view messages](#4.-use-azure-data-explorer-storage-to-store-and-view-messages)

### 1. Setup Event Hubs namespace

First, you will need to [set up an Azure Event Hubs namespace](https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-create) in order to successfully process data from akenza in Azure services.

#### Create an Event Hub resource

* Sign in to the [Azure portal](https://portal.azure.com/)
* Create a new Event Hubs resource by searching for *Event Hubs* in *All services* and then creating a new Event Hubs namespace
  * Assign a *resource group* by choosing an existing group or creating a new one, and fill in the *namespace name* and *region* fields
  * Choose *Public access* in the *Networking* tab and select a *tier* that fits your use case in the *Management* tab (note that the Kafka endpoint requires the Standard tier or higher)
* Click **Review + create** to provision the Event Hubs namespace

#### Add a shared access policy

* Navigate to your Event Hubs namespace and select *Shared access policies* in the *Settings* section
* Select *+ Add shared access policy*, give the policy a name, and check *all boxes* for the permissions (Manage, Send, Listen)

![Adding a shared access policy](https://2165942204-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MMKXTFIN5ZlLOjBlfC4%2Fuploads%2Fsv8Tb6a91Fknxn28qcKd%2Fimage.png?alt=media\&token=1d4277ce-6dbf-4ffa-8915-0769a59b2a9e)

### 2. Obtain an Azure Event Hub connection string

Once the policy is created, obtain the *Primary Connection String* by clicking the policy and copying the connection string. This connection string will later be used as the JAAS password when creating the connector in akenza.

![Obtaining the connection string](https://2165942204-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MMKXTFIN5ZlLOjBlfC4%2Fuploads%2F2Bw2gjSkq7egc8W93bJU%2Fimage.png?alt=media\&token=257310da-0ddc-456b-a775-9e181a14a519)
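The connection string has a fixed, semicolon-delimited format. As a quick sanity check, here is a small sketch of how it breaks down; the namespace, policy name, and key below are placeholder values, not real credentials:

```python
# Split an Event Hubs connection string into its parts.
# All values in the example are placeholders, not real credentials.
def parse_connection_string(conn_str: str) -> dict:
    """Split 'Key=Value;Key=Value;...' pairs into a dict."""
    parts = {}
    for segment in conn_str.rstrip(";").split(";"):
        # partition() splits on the first '=' only, so base64 keys
        # ending in '=' padding are preserved intact
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

example = (
    "Endpoint=sb://my-namespace.servicebus.windows.net/;"
    "SharedAccessKeyName=my-policy;"
    "SharedAccessKey=placeholder-key"
)
parsed = parse_connection_string(example)
print(parsed["Endpoint"])  # sb://my-namespace.servicebus.windows.net/
```

The `Endpoint` host also tells you the Kafka broker address used in the next step: the same hostname, on port 9093.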

### 3. Setup a Data Flow in akenza

In akenza, create a new **Data Flow** with **Kafka** as the output connector. Define a **Connector Name** and insert the **Kafka brokers**, **Kafka topic**, **JAAS username**, and **JAAS password** (the *Primary Connection String* obtained from the shared access policy in the previous step). Save the Data Flow accordingly.

![Apache Kafka template](https://2165942204-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MMKXTFIN5ZlLOjBlfC4%2Fuploads%2FyfT3s3KudyweHwgGKwd9%2Fscreely-1637329818097_kafka_2.jpg?alt=media\&token=e7754f35-859c-45f0-8b4c-053ea8f711df)
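For reference, the Data Flow fields above map onto standard Kafka client settings for the Event Hubs Kafka endpoint. The following is a minimal sketch using librdkafka-style setting names (other Kafka clients use equivalent options); `my-namespace` is a placeholder:

```python
# Sketch: Kafka client settings matching the akenza Data Flow fields,
# targeting the Event Hubs Kafka endpoint. "my-namespace" is a placeholder.
def event_hubs_kafka_config(namespace: str, connection_string: str) -> dict:
    """Build Kafka client settings for an Event Hubs namespace."""
    return {
        # Kafka brokers: the namespace's Kafka endpoint listens on port 9093
        "bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        # The JAAS username is always the literal string "$ConnectionString";
        # the JAAS password is the Primary Connection String from step 2
        "sasl.username": "$ConnectionString",
        "sasl.password": connection_string,
    }

config = event_hubs_kafka_config("my-namespace", "<primary-connection-string>")
print(config["bootstrap.servers"])  # my-namespace.servicebus.windows.net:9093
```

Note that the username is not your policy name: Event Hubs expects the fixed value `$ConnectionString`, with the whole connection string as the password.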

Create a new device using the Data Flow and start sending data.

{% hint style="warning" %}
Important: Each Kafka topic will create a new Event Hub instance in the Event Hubs namespace resource.
{% endhint %}

The Event Hub instances can be found by navigating to the **Event Hubs namespace** and then choosing *Event Hubs* under *Entities* in the left pane. This will list all instances in that Event Hubs namespace.

### 4. Use Azure Data Explorer storage to store and view messages

One way to monitor incoming uplinks on Event Hubs is to create an Azure Data Explorer database and ingest the Event Hubs data into it; more information about the setup can be found [here](https://docs.microsoft.com/en-us/azure/data-explorer/ingest-data-event-hub).
