In this section we demonstrate how to extend Kafka messaging to the web.

Install MigratoryData

Download the tarball package of the MigratoryData server from the downloads page, unzip the tarball to any folder, change to that folder, and run the following command on Linux/Unix/macOS:

$ ./start-migratorydata.sh

By default, the MigratoryData server accepts client connections on the address localhost:8800. Therefore, open the following URL in a browser:

http://localhost:8800

A welcome page should load. Click the DEMO button to open a demo web app, then click the buttons Connect, Subscribe, and Publish to verify that your installation is correct.

If you encounter any issues with the installation, please check out the Installation Guide.

Install Kafka

Download the tarball binary package of Kafka from the Kafka downloads page, unzip the tarball to any folder, change to that folder, and run the following commands (each in its own terminal, since both are long-running processes) on Linux/Unix/macOS:

$ ./bin/zookeeper-server-start.sh config/zookeeper.properties
$ ./bin/kafka-server-start.sh config/server.properties

Install Kafka Connect

Download the MigratoryData Sink Connector for Kafka from the MigratoryData Connectors section of the downloads page and unzip it to any folder, say /tmp/kafka/connectors. Then change to the folder where you installed Kafka in the previous step and edit the configuration file config/connect-distributed.properties as follows:

plugin.path=/tmp/kafka/connectors/migratorydata-connector-kafka
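For reference, the relevant portion of config/connect-distributed.properties might then look like the fragment below; apart from plugin.path, these values are the defaults shipped with the Kafka distribution:

```properties
# Kafka broker(s) the Connect workers talk to (distribution default)
bootstrap.servers=localhost:9092
# Folder containing the unzipped MigratoryData Sink Connector
plugin.path=/tmp/kafka/connectors/migratorydata-connector-kafka
```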

Finally, start the Kafka Connect distributed service as follows:

$ ./bin/connect-distributed.sh config/connect-distributed.properties

Deploy the Connector

In this example, we show how to deploy the MigratoryData Sink Connector as two Kafka Connect tasks which consume the topics topic_1 and topic_2 and map them to MigratoryData subjects as follows:

Kafka topic    MigratoryData subject(s)
topic_1        /server/status
topic_2        /x/topic_2, /y/topic_2/<KEY>

where <KEY> is the key of the received Kafka message.
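The expansion of this mapping can be sketched in a few lines of Python. This is a hypothetical illustration of how the ${topic} and ${key} placeholders used in the connector configuration below expand for each Kafka record; it is not the connector's actual implementation:

```python
# Illustration of how a comma-separated subject template expands
# for one Kafka record (topic, key). Placeholder names mirror the
# connector configuration; the real connector may differ.

def expand_subjects(mapping: str, topic: str, key: str) -> list[str]:
    """Expand a comma-separated subject template for one Kafka record."""
    subjects = []
    for template in mapping.split(","):
        subject = template.strip()
        subject = subject.replace("${topic}", topic)
        subject = subject.replace("${key}", key)
        subjects.append(subject)
    return subjects

# topic_1 maps to a single fixed subject:
print(expand_subjects("/server/status", "topic_1", "key1"))
# -> ['/server/status']

# topic_2 fans out to two subjects, one of them keyed:
print(expand_subjects("/x/${topic}, /y/${topic}/${key}", "topic_2", "key1"))
# -> ['/x/topic_2', '/y/topic_2/key1']
```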

To load the connector, run the following command:

$ curl --header "Content-Type: application/json" \
  --request PUT \
  --data '{
      "connector.class":"com.migratorydata.kafka.sink.MigratoryDataSinkConnector",
      "key.converter":"org.apache.kafka.connect.storage.StringConverter",
      "value.converter":"org.apache.kafka.connect.storage.StringConverter",
      "tasks.max": "2",
      "migratorydata.servers":"127.0.0.1:8800",
      "migratorydata.entitlement_token":"some-token",
      "topics":"topic_1, topic_2",
      "kafka.topics.topic_1":"/server/status",
      "kafka.topics.topic_2":"/x/${topic}, /y/${topic}/${key}"
}' \
http://127.0.0.1:8083/connectors/migratory_data_sink_00/config
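The same PUT request can also be issued programmatically. The sketch below builds the identical configuration payload with only the Python standard library; the configuration keys are exactly those from the curl command above, and the URL assumes Kafka Connect's default REST port 8083:

```python
import json
from urllib import request

# Connector configuration, identical to the curl payload above.
config = {
    "connector.class": "com.migratorydata.kafka.sink.MigratoryDataSinkConnector",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter",
    "tasks.max": "2",
    "migratorydata.servers": "127.0.0.1:8800",
    "migratorydata.entitlement_token": "some-token",
    "topics": "topic_1,topic_2",
    "kafka.topics.topic_1": "/server/status",
    "kafka.topics.topic_2": "/x/${topic}, /y/${topic}/${key}",
}

body = json.dumps(config).encode()

# PUT the config to the Kafka Connect REST API
# (idempotent: creates the connector or updates its config).
req = request.Request(
    "http://127.0.0.1:8083/connectors/migratory_data_sink_00/config",
    data=body,
    headers={"Content-Type": "application/json"},
    method="PUT",
)

# Uncomment to send the request once Kafka Connect is running locally:
# with request.urlopen(req) as resp:
#     print(resp.status, resp.read().decode())
```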

To check that the connector is up and running, run the following command:

$ curl -s localhost:8083/connectors/migratory_data_sink_00/status | jq .

Test the Connector

Open the demo web app at http://localhost:8800 as detailed above, connect, and subscribe to the following subjects:

  • /server/status
  • /x/topic_2
  • /y/topic_2/key1

Change to the folder where you installed Kafka, and run the following Kafka publisher for the topic topic_1:

$ ./bin/kafka-console-producer.sh --topic topic_1 --bootstrap-server localhost:9092

You should see the web app display, in real time, the Kafka messages that you publish here.

To publish Kafka messages with keys, run the following Kafka publisher for the topic topic_2, prefixing each message with a key followed by the configured separator:

$ ./bin/kafka-console-producer.sh --topic topic_2 --bootstrap-server localhost:9092 --property "parse.key=true" --property "key.separator=:"
key1:abc
key1:cde
key2:efg
...
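The effect of parse.key=true with key.separator=":" can be sketched in Python. This is an illustration of how each input line is split into a record key and value, not the console producer's actual implementation:

```python
# Sketch of how the console producer splits an input line into
# (key, value) when parse.key=true and key.separator=":".

def parse_record(line: str, separator: str = ":") -> tuple[str, str]:
    """Split one input line into (key, value) at the first separator."""
    key, _, value = line.partition(separator)
    return key, value

for line in ["key1:abc", "key1:cde", "key2:efg"]:
    key, value = parse_record(line)
    print(f"key={key} value={value}")
```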

You should again see the web app display the published Kafka messages in real time. Moreover, you will see that the example message abc is published across two subjects, /x/topic_2 and /y/topic_2/key1, according to the mapping defined for the connector and the current subscriptions of your web app.

Delete the Connector

To delete the MigratoryData Sink Connector installed above, run the following command:

curl --header "Content-Type: application/json" \
  --request DELETE \
  http://127.0.0.1:8083/connectors/migratory_data_sink_00