This tutorial shows you how to connect Akka Streams through the Event Hubs support for Apache Kafka without changing your protocol clients or running your own clusters.
In this tutorial, you learn how to:
- Create an Event Hubs namespace
- Clone the example project
- Run Akka Streams producer
- Run Akka Streams consumer
Note
This sample is available on GitHub.
Prerequisites
To complete this tutorial, make sure you have the following prerequisites:
- Read through the Event Hubs for Apache Kafka article.
- An Azure subscription. If you don't have one, create a free account before you begin.
- Java Development Kit (JDK) 1.8+
- On Ubuntu, run apt-get install default-jdk to install the JDK.
- Be sure to set the JAVA_HOME environment variable to point to the folder where the JDK is installed.
- Download and install a Maven binary archive
- On Ubuntu, you can run apt-get install maven to install Maven.
- Git
- On Ubuntu, you can run sudo apt-get install git to install Git.
Create an Event Hubs namespace
An Event Hubs namespace is required to send to or receive from any Event Hubs service. See Create an event hub for detailed information. Make sure to copy the Event Hubs connection string for later use.
Clone the example project
Now that you have an Event Hubs connection string, clone the Azure Event Hubs for Kafka repository and navigate to the akka subfolder:
git clone https://github.com/Azure/azure-event-hubs-for-kafka.git
cd azure-event-hubs-for-kafka/tutorials/akka/java
Run Akka Streams producer
Using the provided Akka Streams producer example, send messages to the Event Hubs service.
Provide an Event Hubs Kafka endpoint
Producer application.conf
Update the bootstrap.servers and sasl.jaas.config values in producer/src/main/resources/application.conf to direct the producer to the Event Hubs Kafka endpoint with the correct authentication.
akka.kafka.producer {
    #Akka Kafka producer properties can be defined here
    # Properties defined by org.apache.kafka.clients.producer.ProducerConfig
    # can be defined in this configuration section.
    kafka-clients {
        bootstrap.servers="{YOUR.EVENTHUBS.FQDN}:9093"
        sasl.mechanism=PLAIN
        security.protocol=SASL_SSL
        sasl.jaas.config="org.apache.kafka.common.security.plain.PlainLoginModule required username=\"$ConnectionString\" password=\"{YOUR.EVENTHUBS.CONNECTION.STRING}\";"
    }
}
Important
Replace {YOUR.EVENTHUBS.CONNECTION.STRING} with the connection string for your Event Hubs namespace. For instructions on getting the connection string, see Get an Event Hubs connection string. Here's an example configuration: sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=XXXXXXXXXXXXXXXX";
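For reference, a producer wired up this way comes down to a short Akka Streams graph. The following is a minimal sketch, not the repository's exact AkkaTestProducer code: it assumes the Alpakka Kafka Java DSL and Akka 2.6+ (where the ActorSystem can be passed as the materializer), reads the kafka-clients settings above from application.conf, and streams 100 numbered messages to the test topic. The class name SketchProducer is only for illustration.
import akka.actor.ActorSystem;
import akka.kafka.ProducerSettings;
import akka.kafka.javadsl.Producer;
import akka.stream.javadsl.Source;
import com.typesafe.config.Config;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SketchProducer {
    public static void main(String[] args) {
        ActorSystem system = ActorSystem.create("akka-eventhubs-producer");

        // Picks up bootstrap.servers and the SASL settings from akka.kafka.producer.kafka-clients
        Config config = system.settings().config().getConfig("akka.kafka.producer");
        ProducerSettings<String, String> producerSettings =
            ProducerSettings.create(config, new StringSerializer(), new StringSerializer());

        // Stream 100 numbered messages to the "test" topic (the event hub)
        Source.range(1, 100)
            .map(Object::toString)
            .map(value -> new ProducerRecord<String, String>("test", value))
            .runWith(Producer.plainSink(producerSettings), system);
    }
}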
Run producer from the command line
To run the producer from the command line, generate the JAR and then run it from within Maven (or generate the JAR using Maven, and then run it in Java by adding the necessary Kafka Java Archive (JAR) files to the classpath):
mvn clean package
mvn exec:java -Dexec.mainClass="AkkaTestProducer"
The producer begins sending events to the event hub at topic test, and prints the events to stdout.
Run Akka Streams consumer
Using the provided consumer example, receive messages from the event hub.
Provide an Event Hubs Kafka endpoint
Consumer application.conf
Update the bootstrap.servers and sasl.jaas.config values in consumer/src/main/resources/application.conf to direct the consumer to the Event Hubs Kafka endpoint with the correct authentication.
akka.kafka.consumer {
    #Akka Kafka consumer properties defined here
    wakeup-timeout=60s
    # Properties defined by org.apache.kafka.clients.consumer.ConsumerConfig
        # can be defined in this configuration section.
    kafka-clients {
       request.timeout.ms=60000
       group.id=akka-example-consumer
       bootstrap.servers="{YOUR.EVENTHUBS.FQDN}:9093"
       sasl.mechanism=PLAIN
       security.protocol=SASL_SSL
       sasl.jaas.config="org.apache.kafka.common.security.plain.PlainLoginModule required username=\"$ConnectionString\" password=\"{YOUR.EVENTHUBS.CONNECTION.STRING}\";"
    }
}
Important
Replace {YOUR.EVENTHUBS.CONNECTION.STRING} with the connection string for your Event Hubs namespace. For instructions on getting the connection string, see Get an Event Hubs connection string. Here's an example configuration: sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=XXXXXXXXXXXXXXXX";
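The consumer side is equally compact. The following is a minimal sketch, not the repository's exact AkkaTestConsumer code: it assumes the Alpakka Kafka Java DSL and Akka 2.6+, reads the kafka-clients settings above (including group.id) from application.conf, subscribes to the test topic, and prints each record's value. The class name SketchConsumer is only for illustration.
import akka.actor.ActorSystem;
import akka.kafka.ConsumerSettings;
import akka.kafka.Subscriptions;
import akka.kafka.javadsl.Consumer;
import akka.stream.javadsl.Sink;
import com.typesafe.config.Config;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SketchConsumer {
    public static void main(String[] args) {
        ActorSystem system = ActorSystem.create("akka-eventhubs-consumer");

        // Picks up group.id, bootstrap.servers, and the SASL settings from akka.kafka.consumer.kafka-clients
        Config config = system.settings().config().getConfig("akka.kafka.consumer");
        ConsumerSettings<String, String> consumerSettings =
            ConsumerSettings.create(config, new StringDeserializer(), new StringDeserializer());

        // Subscribe to the "test" topic and print each record's value to stdout
        Consumer.plainSource(consumerSettings, Subscriptions.topics("test"))
            .map(record -> record.value())
            .runWith(Sink.foreach(System.out::println), system);
    }
}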
Run consumer from the command line
To run the consumer from the command line, generate the JAR and then run it from within Maven (or generate the JAR using Maven, and then run it in Java by adding the necessary Kafka JARs to the classpath):
mvn clean package
mvn exec:java -Dexec.mainClass="AkkaTestConsumer"
If the event hub has events (for instance, if your producer is also running), then the consumer begins receiving events from topic test.
Check out the Akka Streams Kafka Guide for more detailed information about Akka Streams.
Next steps
To learn more about Event Hubs for Kafka, see the following articles: