# How to send data to Azure Event Hubs with Apache Kafka

![](/files/kUAwWrwrzGGzx2OmsrkY)

{% hint style="success" %}
**Benefits of the Apache Kafka output connector**:

* Directly stream device data from akenza to Event Hubs with **Apache Kafka** and subsequently process it in [Azure services](https://azure.microsoft.com/en-us/services/)
* Easily process **low-power device** data e.g. LoRaWAN device data with **Apache Kafka**
* Benefit from Kafka's strengths:
  * **Real-Time Handling**: Apache Kafka can run real-time data pipelines at scale with zero downtime and zero data loss.
  * **High Throughput**: capable of handling high-velocity and high-volume data.
  * **Low Latency**: handles messages with latencies in the range of milliseconds.
  * **Variety of IoT Use Cases**: covers use cases commonly required for a data lake, such as log aggregation and web activity tracking.

{% endhint %}

In this tutorial, you will learn how to send data to Event Hubs using the **Apache Kafka output connector**. The following steps are explained in more detail:

1. [Setup Event Hubs namespace](#1.-setup-event-hubs-namespace)
2. [Obtain an Azure Event Hub connection string](#2.-obtain-an-azure-event-hub-connection-string)
3. [Setup a **Data Flow** in akenza](#3.-setup-a-data-flow-in-akenza)
4. [Use Azure Data Explorer storage to store and view messages](#4.-use-azure-data-explorer-storage-to-store-and-view-messages)

### 1. Setup Event Hubs namespace

First, you will need to [set up an Azure Event Hubs namespace](https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-create) in order to process data from akenza in Azure services.

#### Create an Event Hub resource

* Sign in to the [Azure portal](https://portal.azure.com/)
* Create a new Event Hubs resource by searching for *Event Hubs* in *All services* and then creating a new Event Hubs namespace
  * Assign a *resource group* by choosing an existing group or creating a new one, and fill in the *namespace name* and *region* fields
  * Choose *Public access* in the Networking tab and select a *tier* that fits your use case in the Management tab (the Basic tier is sufficient for testing)
* Click **Review + create** to provision the Event Hubs namespace

#### Add a shared access policy

* Navigate to your Event Hubs namespace and select *Shared access policies* in the *Settings* section
* Select *+ Add*, give the policy a name, and check all boxes for the permissions (Manage, Send, Listen)

![Adding a shared access policy](/files/rn8RXEJjJF67ZdoyrlDT)

### 2. Obtain an Azure Event Hub connection string

Once the policy is created, obtain the *Primary Connection String* by clicking the policy and copying the connection string. It will be used as the JAAS password when creating the connector in akenza.

![Obtaining the connection string](/files/aPowlCjwKEoHWPsbOKSs)
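The connection string also encodes everything the Kafka connector settings need. As a sketch (the `parse_connection_string` helper and the example values below are hypothetical, not part of akenza or Azure), the Kafka bootstrap server and JAAS credentials can be derived from it like this:

```python
def parse_connection_string(conn_str: str) -> dict:
    """Split an Event Hubs connection string into Kafka connector settings."""
    parts = dict(
        segment.split("=", 1)
        for segment in conn_str.rstrip(";").split(";")
    )
    namespace_host = parts["Endpoint"].removeprefix("sb://").rstrip("/")
    return {
        # Event Hubs exposes its Kafka-compatible endpoint on port 9093
        "bootstrap_server": f"{namespace_host}:9093",
        # With SASL/PLAIN, the username is the literal string
        # "$ConnectionString" and the password is the full connection string
        "jaas_username": "$ConnectionString",
        "jaas_password": conn_str,
    }

example = (
    "Endpoint=sb://my-namespace.servicebus.windows.net/;"
    "SharedAccessKeyName=akenza-policy;"
    "SharedAccessKey=abc123"
)
config = parse_connection_string(example)
print(config["bootstrap_server"])  # my-namespace.servicebus.windows.net:9093
```

These are the same values you will enter into the akenza Data Flow in the next step.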

### 3. Setup a Data Flow in akenza

In akenza, create a new **Data Flow** with **Kafka** as the output connector. Define a **Connector Name** and insert the **Kafka brokers**, **Kafka topic**, **JAAS username**, and **JAAS password** (the connection string obtained from the shared access policy in the previous step). Save the Data Flow.

![Apache Kafka template](/files/22qaZjl9JfrX3No9qH4f)
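To verify the same settings independently of akenza, any Kafka client can publish to the Event Hubs endpoint. The sketch below builds a client configuration mirroring the connector fields; the bootstrap server, topic name, and environment variable are assumptions, and the optional send uses the `confluent-kafka` package:

```python
import os

# Hypothetical values; replace with your namespace, topic, and the
# Primary Connection String obtained in step 2.
BOOTSTRAP = "my-namespace.servicebus.windows.net:9093"
TOPIC = "akenza-uplinks"
CONNECTION_STRING = os.environ.get("EVENTHUBS_CONNECTION_STRING", "")

# Kafka client settings matching the akenza connector fields:
# Event Hubs requires SASL/PLAIN over TLS, with the fixed username
# "$ConnectionString" and the connection string as the password.
producer_config = {
    "bootstrap.servers": BOOTSTRAP,
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "$ConnectionString",
    "sasl.password": CONNECTION_STRING,
}

if CONNECTION_STRING:
    # Only attempt delivery when real credentials are provided.
    from confluent_kafka import Producer  # pip install confluent-kafka

    producer = Producer(producer_config)
    producer.produce(TOPIC, b'{"temperature": 21.5}')
    producer.flush(10)
```

If the message arrives, an Event Hub instance named after the topic appears in the namespace, as described below.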

Create a new device using the Data Flow and start sending data.

{% hint style="warning" %}
Important: each Kafka topic creates a new Event Hub instance in the Event Hubs namespace.
{% endhint %}

To find these instances, navigate to the **Event Hubs namespace** and, under *Entities* in the left pane, choose *Event Hubs*. This lists all instances in that namespace.

### 4. Use Azure Data Explorer storage to store and view messages

One way to monitor incoming uplinks on Event Hubs is to create an Azure Data Explorer database; more information about the setup can be found [here](https://docs.microsoft.com/en-us/azure/data-explorer/ingest-data-event-hub).
