Kafka Schema Registry Upload Avro Schema From Command Line

This example shows how to use the Kafka Schema Registry to store data schemas for Kafka topics, which we will generate using Apache Avro. The example will also demonstrate how to use the Schema Registry to produce and consume generated Apache Avro objects using an Instaclustr Kafka cluster.

Creating an Apache Kafka cluster with the Kafka Schema Registry add-on

Instaclustr now offers Kafka Schema Registry as an add-on for our Apache Kafka Managed Service. To take advantage of this offering, you can now select 'Kafka Schema Registry' as an option when creating a new Apache Kafka cluster.

Kafka Schema Registry Screenshot

If you wish to add the Kafka Schema Registry to an existing Apache Kafka cluster, you can contact [email protected] .

Using the Schema Registry

Now that the Schema Registry is up and running, you can use it in your applications to store data schemas for your Kafka topics. The following example is a Java application that uses the Schema Registry and Apache Avro to produce and consume some simulated product order events.

Allow access to your client application

Before we can access our schema registry application, we need to open the firewall to our client application IP address. Once your cluster is up and running, go to Firewall Rules and add your IP address to the Kafka Schema Registry Allowed Addresses.

Client Dependencies

Add the kafka_2.12, avro, and kafka-avro-serializer packages to your application. These packages are available via Maven (kafka_2.12, avro, kafka-avro-serializer). To add these dependencies using Maven, add the following to your pom.xml file:
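The original dependency listing was not preserved here; the following is a sketch of what the pom.xml entries might look like. The version numbers and the Confluent repository URL are assumptions; use the versions appropriate for your cluster.

```xml
<!-- Sketch only: versions are assumptions, not the exact ones from the original guide. -->
<dependencies>
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.12</artifactId>
    <version>2.8.2</version>
  </dependency>
  <dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.11.3</version>
  </dependency>
  <dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <version>6.2.1</version>
  </dependency>
</dependencies>

<!-- kafka-avro-serializer is published in the Confluent repository, not Maven Central. -->
<repositories>
  <repository>
    <id>confluent</id>
    <url>https://packages.confluent.io/maven/</url>
  </repository>
</repositories>
```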

You will also need the avro-tools utility in order to compile the data schema into a Java class. The avro-tools utility is available here.

Create the Avro Schema

Before you can produce or consume messages using Avro and the Schema Registry, you first need to define the data schema. Create a file orderEventSchema.avsc with the following content:
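The schema file itself was not preserved here. Based on the fields described below (id, timestamp, product name, and price), a plausible sketch of the schema is the following; the namespace and exact field types are assumptions:

```json
{
  "namespace": "com.example.avro",
  "type": "record",
  "name": "OrderEvent",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "timestamp", "type": "string" },
    { "name": "product", "type": "string" },
    { "name": "price", "type": "double" }
  ]
}
```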

This file specifies a simple OrderEvent data serialization schema for product orders, with each OrderEvent containing an id, timestamp, product name, and price. For more information on the Avro serialization format, see the documentation here.

Generate the Avro Object Class

With the schema file created, use the avro-tools utility to compile the schema file into an actual Java class:
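The command was stripped from this copy; the avro-tools `compile schema` invocation takes the schema file and an output directory. A sketch, assuming an avro-tools 1.11.3 jar in the current directory:

```shell
java -jar avro-tools-1.11.3.jar compile schema orderEventSchema.avsc src/main/java
```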

Note: The src/main/java file path at the end of the command can be wherever you want, just make sure the generated class will be accessible by your application code. An example file structure is:
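The example structure was not preserved; assuming the `com.example.avro` namespace sketched above, the generated class would land in a layout like:

```
src/
└── main/
    └── java/
        └── com/
            └── example/
                └── avro/
                    └── OrderEvent.java
```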

Create Kafka Topic

Use the guide here to create a new topic called orders.

Producing Avro Objects

Client configuration

Before creating a Kafka producer client, you first need to define the configuration properties for the producer client to use. In this example we provide only the required properties for the producer client. See here for the full list of configuration options.

The Connection Info page in the Instaclustr Console has these example settings pre-configured with your cluster's IP addresses, username, and password.

If your cluster has client ⇆ broker encryption enabled, create a new file named producer.properties with the following content, ensuring the password, truststore location, and bootstrap servers list are correct:
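The file contents were stripped from this copy. A sketch of what producer.properties might contain for an SSL-enabled cluster follows; every value in angle brackets is a placeholder, and your cluster's Connection Info page has the real settings:

```properties
# All <...> values are placeholders; copy the real values from the Connection Info page.
bootstrap.servers=<broker-ip-1>:9092,<broker-ip-2>:9092,<broker-ip-3>:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
schema.registry.url=https://<user>:<password>@<schema-registry-ip>:8085
basic.auth.credentials.source=URL
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="<user>" password="<password>";
ssl.truststore.location=<path-to>/truststore.jks
ssl.truststore.password=<truststore-password>
ssl.enabled.protocols=TLSv1.2
```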

If your cluster does not have client ⇆ broker encryption enabled, create a new file named producer.properties with the following content, ensuring the password and bootstrap servers list are correct:
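Again the contents were stripped; a sketch for a cluster without encryption (placeholders in angle brackets, no truststore settings needed):

```properties
# All <...> values are placeholders; copy the real values from the Connection Info page.
bootstrap.servers=<broker-ip-1>:9092,<broker-ip-2>:9092,<broker-ip-3>:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
schema.registry.url=https://<user>:<password>@<schema-registry-ip>:8085
basic.auth.credentials.source=URL
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="<user>" password="<password>";
```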

Make sure the password and bootstrap servers list are correct.

Important Notes:

  • To connect to your Kafka cluster over the private network, use port 9093 instead of 9092.
  • Instaclustr's Kafka Schema Registry is configured with basic authentication credentials in the format 'user:[email protected]:8085'
  • basic.auth.credentials.source=URL is necessary for this basic authentication to work correctly.

Java Code

Now that the configuration properties have been set up, you can create a Kafka producer.

First, load the properties:
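The loading code was stripped from this copy; a minimal sketch using the standard java.util.Properties API (the class name ProducerApp is a placeholder for your application class):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class ProducerApp {
    // Load the producer configuration from the properties file created above.
    static Properties loadProperties(String path) throws IOException {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            props.load(in);
        }
        return props;
    }

    public static void main(String[] args) throws IOException {
        // Guarded so the sketch runs even when the file has not been created yet.
        if (new File("producer.properties").exists()) {
            Properties props = loadProperties("producer.properties");
            System.out.println("Loaded " + props.size() + " properties");
        }
    }
}
```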

Once you've loaded the properties you can create the producer itself:
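The original snippet was stripped; with the loaded properties and the generated OrderEvent class, the producer construction would look something like this sketch (requires the kafka and kafka-avro-serializer dependencies on the classpath):

```java
// Sketch: props is the Properties object loaded above,
// OrderEvent is the class generated by avro-tools.
KafkaProducer<String, OrderEvent> producer = new KafkaProducer<>(props);
```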

Next, create some OrderEvents to produce:
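The event-creation code was stripped; a sketch, assuming the constructor argument order matches the schema fields (id, timestamp, product, price) and using invented sample values:

```java
// Sample data is illustrative only.
List<OrderEvent> orders = new ArrayList<>();
orders.add(new OrderEvent("1", getTimestamp(), "Laptop", 1499.99));
orders.add(new OrderEvent("2", getTimestamp(), "Keyboard", 89.95));
orders.add(new OrderEvent("3", getTimestamp(), "Monitor", 349.00));
```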

Where the getTimestamp() function is:
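The function body was stripped from this copy; a plausible implementation (an assumption, including the exact date format) that formats the current time as a string for the OrderEvent timestamp field:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class Timestamps {
    // Format the current time as a human-readable timestamp string.
    static String getTimestamp() {
        return new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS").format(new Date());
    }

    public static void main(String[] args) {
        System.out.println(getTimestamp());
    }
}
```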

Now turn each OrderEvent into a ProducerRecord to be produced to the orders topic, and send them:
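The loop was stripped; a sketch, where using the event id as the record key is an assumption:

```java
for (OrderEvent order : orders) {
    ProducerRecord<String, OrderEvent> record =
        new ProducerRecord<>("orders", order.getId().toString(), order);
    producer.send(record);
}
```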

Finally, use the producer's flush() method to ensure all messages get sent to Kafka:
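A sketch of the final step (closing the producer after flushing is a common idiom, added here as an assumption):

```java
producer.flush();
producer.close();
```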

Full code example:

Consuming Avro Objects

Client configuration

As in the producer example, before creating a Kafka consumer client you first need to define the configuration properties for the consumer client to use. In this case we provide only the required properties for the consumer client. See here for the full list of configuration options.

If your cluster has client ⇆ broker encryption enabled, create a new file named consumer.properties with the following content, ensuring the password, truststore location, and bootstrap servers list are correct:
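The file contents were stripped; a sketch for an SSL-enabled cluster, mirroring the producer configuration but with deserializers, a consumer group, and specific-record Avro decoding. All angle-bracket values and the group id are placeholders:

```properties
# All <...> values are placeholders; copy the real values from the Connection Info page.
bootstrap.servers=<broker-ip-1>:9092,<broker-ip-2>:9092,<broker-ip-3>:9092
group.id=<consumer-group-id>
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
specific.avro.reader=true
schema.registry.url=https://<user>:<password>@<schema-registry-ip>:8085
basic.auth.credentials.source=URL
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="<user>" password="<password>";
ssl.truststore.location=<path-to>/truststore.jks
ssl.truststore.password=<truststore-password>
ssl.enabled.protocols=TLSv1.2
```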

If your cluster does not have client ⇆ broker encryption enabled, create a new file named consumer.properties with the following content, ensuring the password and bootstrap servers list are correct:
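Again a sketch, this time without the truststore settings (placeholders in angle brackets):

```properties
# All <...> values are placeholders; copy the real values from the Connection Info page.
bootstrap.servers=<broker-ip-1>:9092,<broker-ip-2>:9092,<broker-ip-3>:9092
group.id=<consumer-group-id>
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
specific.avro.reader=true
schema.registry.url=https://<user>:<password>@<schema-registry-ip>:8085
basic.auth.credentials.source=URL
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="<user>" password="<password>";
```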

Make sure the password and bootstrap servers list are correct.

Important Notes:

  • To connect to your Kafka cluster over the private network, use port 9093 instead of 9092.
  • Instaclustr's Kafka Schema Registry is configured with basic authentication credentials in the format 'user:[email protected]:8085'
  • basic.auth.credentials.source=URL is necessary for this basic authentication to work correctly.

Java Code

Now that the configuration properties have been set up, you can create a Kafka consumer.

First, load the properties:

Once you've loaded the properties you can create the consumer itself:
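The snippet was stripped; as with the producer, construction is a one-liner once the properties are loaded (sketch, requiring the generated OrderEvent class and the Kafka dependencies):

```java
// Sketch: props is the Properties object loaded from consumer.properties.
KafkaConsumer<String, OrderEvent> consumer = new KafkaConsumer<>(props);
```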

Before you can consume messages, you need to subscribe the consumer to the topic(s) you wish to receive messages from, in this case the orders topic:
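A sketch of the subscription call:

```java
consumer.subscribe(Collections.singletonList("orders"));
```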

Finally, continually poll Kafka for new messages, and print each OrderEvent received:
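The poll loop was stripped; a sketch using the Duration-based poll API (the 1-second poll interval is an arbitrary choice):

```java
while (true) {
    ConsumerRecords<String, OrderEvent> records = consumer.poll(Duration.ofMillis(1000));
    for (ConsumerRecord<String, OrderEvent> record : records) {
        System.out.println(record.value());
    }
}
```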

Full code example:

Putting Them Together

Now that you have a consumer and producer ready, it's time to combine them.

Start the Consumer

Start the consumer before starting the producer, because by default, consumers only consume messages that were produced after the consumer started.

Start the Producer

Now that the consumer is set up and ready to consume messages, you can now start your producer.

If the consumer and producer are set up correctly, the consumer should output the messages sent by the producer shortly after they were produced, for example:



Source: https://www.instaclustr.com/support/documentation/kafka-add-ons/using-the-kafka-schema-registry/
