Kafka Topics
Kafka uses topics to logically separate where messages are sent and received. The Kafka connector uses two Kafka topics for each model: one on which the model’s real-time digital twin instances receive messages, and one on which they send messages back to data sources. Data sources implemented as Kafka clients can send messages to a real-time digital twin instance on the model’s subscribed topic and optionally receive messages from the corresponding instance on a separate topic.
Create Topic
For full documentation on creating topics, consult the Kafka documentation. Topics can be created with the Kafka APIs or with Kafka’s convenience scripts. For example, you can create the “MyDigitalTwinTopic” and “MyResponseTopic” topics used in the example below with the kafka-topics.sh script. “MyDigitalTwinTopic” is the topic to which a Kafka client representing a data source sends messages, and “MyResponseTopic” is the topic on which that client listens for responses from a digital twin instance:
$ bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 4 --topic MyDigitalTwinTopic
$ bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 4 --topic MyResponseTopic
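The same topics can also be created programmatically. The following is a minimal sketch using Kafka’s Java Admin API, assuming a broker at localhost:9092 and the same partition and replication settings as the commands above:

import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopics {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (Admin admin = Admin.create(props)) {
            // Create both topics with 4 partitions and a replication factor of 1,
            // matching the kafka-topics.sh commands above.
            NewTopic requestTopic = new NewTopic("MyDigitalTwinTopic", 4, (short) 1);
            NewTopic responseTopic = new NewTopic("MyResponseTopic", 4, (short) 1);
            admin.createTopics(Arrays.asList(requestTopic, responseTopic)).all().get();
        }
    }
}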
Kafka messages are sent as key/value pairs. Each data source sending a Kafka message to a real-time digital twin instance should specify the instance’s name as the key and the message payload as the value. When listening on the response topic for its real-time digital twin model, the data source should filter on the key to select messages intended for its corresponding instance.
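To illustrate this pattern, the sketch below uses the standard Kafka Java client to send a keyed message to “MyDigitalTwinTopic” and then poll “MyResponseTopic”, keeping only records whose key matches the sending instance. The instance name “MyInstance001”, the consumer group id, and the JSON payload are placeholders for illustration only; the actual message format is defined by your digital twin model.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class DataSourceClient {
    static final String INSTANCE_ID = "MyInstance001";   // hypothetical instance name

    public static void main(String[] args) {
        // Send one message to the digital twin instance; the key carries the instance name.
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("MyDigitalTwinTopic", INSTANCE_ID,
                    "{\"temperature\": 72.5}"));   // placeholder payload
        }

        // Listen on the response topic and keep only messages keyed to this instance.
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "my-data-source");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(Collections.singletonList("MyResponseTopic"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    if (INSTANCE_ID.equals(record.key())) {
                        System.out.println("Response for " + INSTANCE_ID + ": " + record.value());
                    }
                }
            }
        }
    }
}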
Note
The Kafka connector requires that messages sent to Kafka specify a key. The key is used as the instance identifier for the corresponding real-time digital twin instance.
Note
In multi-host environments, Kafka topics should be created with at least as many partitions as there are hosts so that the workload scales across all hosts.
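If a topic was originally created with fewer partitions than hosts, its partition count can be increased with Kafka’s Admin API (a topic’s partition count can only be increased, never decreased). The following is a minimal sketch that assumes a deployment of eight hosts and raises “MyDigitalTwinTopic” to eight partitions; adjust the count to match your environment:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewPartitions;

public class ExpandPartitions {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        int hostCount = 8;   // assumed number of hosts in the deployment

        try (Admin admin = Admin.create(props)) {
            // Raise the topic's partition count to match the number of hosts.
            admin.createPartitions(Collections.singletonMap(
                    "MyDigitalTwinTopic", NewPartitions.increaseTo(hostCount))).all().get();
        }
    }
}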