Deploying Kafka Connector via API

Support for launching a Kafka connector is currently only available in Java. Messages can be sent into Kafka using any supported client library; however, the connector that receives those messages and processes them for their respective real-time digital twin instances is launched as a Java process. For full API documentation, please see the JavaDocs API Reference.

Prerequisites

As a prerequisite, be sure that Java is installed and the JAVA_HOME environment variable is set. You will need either Gradle or Maven to download the required JARs.

The first step is to create a Java project and set up the classpath using your preferred build tool (Gradle or Maven). The ScaleOut JARs are published to Maven Central.

Once the project is initialized, add the real-time digital twin APIs as dependencies.

Adding Datasource Gradle Dependency

implementation group: 'com.scaleoutsoftware.digitaltwin', name: 'datasource', version: '3.2.10'

Adding Datasource Maven Dependency

<dependencies>
        <!-- ... -->
        <!-- your dependencies -->
        <!-- ... -->
        <dependency>
          <groupId>com.scaleoutsoftware.digitaltwin</groupId>
          <artifactId>datasource</artifactId>
          <version>3.2.10</version>
        </dependency>
</dependencies>

Creating a Kafka Connector

The following code snippet demonstrates how to use the Java API’s KafkaEndpointBuilder class to start a Kafka connector that connects to Azure Event Hubs’ Kafka endpoint:

KafkaEndpoint kafkaEndpoint = new KafkaEndpointBuilder(
            "<connector name>",                  // connector name
            "<connection string>")               // Azure Event Hubs connection string
            .addTopic("DT_MODEL_A",              // one of the models that processes messages for this connector
                      "datasource_to_dt_topicA", // the topic a data source should send messages to for this real-time digital twin model
                      "dt_to_datasource_topicA") // the topic that instances of this model should reply to
            .addTopic("DT_MODEL_B",
                      "datasource_to_dt_topicB",
                      "dt_to_datasource_topicB")
            .build();

Note

addTopic can be called multiple times to register multiple digital twin models with the same connector.
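Once the connector is running, any standard Kafka client can act as a data source by publishing messages to a model's request topic. The following is a minimal sketch using the Apache Kafka kafka-clients producer API (an assumed additional dependency); the broker address localhost:9092, the record key, and the JSON payload shape are placeholders for illustration, not part of the ScaleOut API:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class DataSourceProducer {
    public static void main(String[] args) {
        // The record targets the request topic registered with addTopic above.
        // Key and value are hypothetical: the key identifies a twin instance,
        // and the value is a JSON message payload.
        ProducerRecord<String, String> record = new ProducerRecord<>(
                "datasource_to_dt_topicA",   // request topic from the snippet above
                "device-17",                 // twin instance id (hypothetical)
                "{\"temperature\": 71.5}");  // message payload (hypothetical schema)
        System.out.println(record.topic() + " <- " + record.value());
        // sendRecord(record); // uncomment when a Kafka broker is reachable
    }

    static void sendRecord(ProducerRecord<String, String> record) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(record);
            producer.flush();
        }
    }
}
```

The same pattern applies to any model registered with addTopic; only the request topic name changes.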

The KafkaEndpointBuilder class can also be used to connect to a Kafka service running locally by specifying a server.properties file. Please refer to the Java API documentation for details on server.properties files. Here is a code snippet of this usage:

KafkaEndpoint kafkaEndpoint = new KafkaEndpointBuilder(
            "<connector name>",                      // connector name
            new File("/path/to/server.properties"))  // path to the server.properties Kafka configuration file
            .addTopic("DT_MODEL_A",                  // one of the models that processes messages for this connector
                      "datasource_to_dt_topicA",     // the topic a data source should send messages to for this real-time digital twin model
                      "dt_to_datasource_topicA")     // the topic that instances of this model should reply to
            .addTopic("DT_MODEL_B",
                      "datasource_to_dt_topicB",
                      "dt_to_datasource_topicB")
            .build();
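On the reply side, a data source can read the messages that twin instances send back on a model's reply topic using any standard Kafka consumer. The sketch below uses the kafka-clients consumer API under the same assumptions as before (the broker address and the group id datasource-replies are placeholders):

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ReplyConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "datasource-replies");      // hypothetical group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Subscribe to the reply topic registered with addTopic above.
            consumer.subscribe(Collections.singletonList("dt_to_datasource_topicA"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println(record.key() + " -> " + record.value());
            }
        }
    }
}
```

The reply topic is per model, so a data source interested in replies from several models subscribes to each model's reply topic.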