How to send data to Azure Event Hubs with Apache Kafka

In this tutorial, you will learn how to send data to Event Hubs with Apache Kafka.


Last updated 5 months ago


Benefits of the Apache Kafka output connector:

  • Directly stream device data from akenza to Event Hubs with Apache Kafka and subsequently process it in Azure services

  • Easily process low-power device data, e.g. LoRaWAN device data, with Apache Kafka

  • Real-Time Handling: Apache Kafka can handle real-time data pipelines at scale with zero downtime and zero data loss.

  • High Throughput: Capable of handling high-velocity and high-volume data.

  • Low Latency: Handles messages with very low latency, in the range of milliseconds.

  • Variety of IoT Use Cases: Covers a variety of IoT use cases commonly required for a data lake (log aggregation, web activity tracking, etc.).

In this tutorial, you will learn how to send data to Event Hubs using the Apache Kafka output connector. The following steps are explained in more detail:

1. Set up an Event Hubs namespace

Create an Event Hub resource

  • Create a new Event Hubs resource by searching for it in All services and then creating a new Event Hubs namespace

    • Assign a resource group by choosing an existing group or creating a new one, fill in the namespace name and region fields, choose Public access in the Networking tab, and select a pricing tier that fits your use case (the Basic tier is sufficient for testing)

  • Click Review + create to provision the Event Hubs namespace

Add a shared access policy

  • Navigate to your Event Hub resource and select Shared access policies in the Settings section.

  • Select + Add shared access policy, give it a name, and check all of the permission boxes (Manage, Send, Listen)

2. Obtain an Azure Event Hub connection string

Once the policy is created, obtain the Primary Connection String by clicking the policy and copying the connection string. This connection string will be used as the JAAS password when creating the connector in akenza.
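An Event Hubs connection string has the following general shape (placeholder values shown; your namespace, policy name, and key will differ):

```
Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy-name>;SharedAccessKey=<key>
```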

3. Set up a Data Flow in akenza

In akenza, create a new Data Flow with Kafka as the Output Connector. Define a Connector Name and insert the Kafka brokers, the Kafka topic, the JAAS username, and the JAAS password (the connection string obtained from the shared access policy in the previous step). Save the Data Flow.
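As a sketch of how these fields map onto standard Kafka client settings: the Event Hubs Kafka endpoint listens on port 9093 over TLS with SASL PLAIN, and the JAAS username is the literal string $ConnectionString (placeholder namespace and connection string below):

```
bootstrap.servers=<namespace>.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="<your connection string>";
```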

Create a new device using the Data Flow and start sending data.

Important: Each Kafka topic creates a new Event Hub instance in the Event Hubs namespace resource.

The instances can be found by navigating to the Event Hubs namespace resource and choosing Event Hubs under Entities in the left pane. This lists all Event Hub instances in that namespace.
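How the connector fields fit together can be sketched in a few lines of Python; assume a connection string as obtained in step 2 (kafka_settings is an illustrative helper, not an akenza or Azure API):

```python
# Sketch: derive the Kafka settings akenza asks for from an Event Hubs
# connection string. The Event Hubs Kafka endpoint listens on port 9093,
# the JAAS username is the literal string "$ConnectionString", and the
# JAAS password is the full connection string itself.
# kafka_settings is an illustrative helper, not part of akenza or Azure.

def kafka_settings(connection_string: str) -> dict:
    # Split "Key=Value" pairs; SharedAccessKey values may contain "=",
    # so split each pair only on the first "=".
    fields = dict(
        part.split("=", 1) for part in connection_string.split(";") if part
    )
    # Endpoint looks like "sb://<namespace>.servicebus.windows.net/"
    host = fields["Endpoint"].removeprefix("sb://").rstrip("/")
    return {
        "bootstrap_servers": f"{host}:9093",
        "sasl_username": "$ConnectionString",
        "sasl_password": connection_string,
    }

# Placeholder namespace, policy, and key for illustration:
conn = ("Endpoint=sb://my-namespace.servicebus.windows.net/;"
        "SharedAccessKeyName=akenza-policy;SharedAccessKey=abc123=")
print(kafka_settings(conn)["bootstrap_servers"])
# → my-namespace.servicebus.windows.net:9093
```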

4. Use Azure Data Explorer storage to store and view messages

First, you will need to set up an Azure IoT Hub in order to successfully process data from akenza in Azure services.

Sign in to the Azure portal.

One way to monitor incoming uplinks on Event Hubs is to create an Azure Data Explorer database; more info about the setup can be found here.
