The Spring Cloud Stream Schema Avro module adds schema evolution support to your Spring Cloud Stream application by using an Avro serializer.
Add a dependency on the module to your application:
<dependency>
    <groupId>org.springframework.cloud.stream.schema</groupId>
    <artifactId>avro-codec</artifactId>
    <version>1.0.0.BUILD-SNAPSHOT</version>
</dependency>
The module uses Spring Boot's auto-configuration capabilities and bootstraps itself when it detects the Avro jars on the classpath.
You need to set the contentType property of your output bindings in order to enable the Avro MessageConverter:
spring:
  cloud:
    stream:
      bindings:
        output:
          destination: sensor-topic
          contentType: "avro/binary"
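With that binding in place, a source application can publish Avro payloads. The following is a minimal sketch, assuming the Spring Cloud Stream 1.x Source binding; Sensor is a hypothetical SpecificRecord class generated from an Avro schema, not part of the module:

// A minimal sketch of a source that emits Avro payloads.
// "Sensor" is a hypothetical class generated from an Avro schema.
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.messaging.support.MessageBuilder;

@SpringBootApplication
@EnableBinding(Source.class)
public class SensorSourceApplication {

    @Autowired
    private Source source;

    public void publish(Sensor sensor) {
        // The avro/binary contentType configured above selects the Avro
        // MessageConverter, which serializes the payload on send.
        source.output().send(MessageBuilder.withPayload(sensor).build());
    }

    public static void main(String[] args) {
        SpringApplication.run(SensorSourceApplication.class, args);
    }
}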
The module requires a schema registry to be available for registering and fetching schemas in order to encode and decode payloads.
For now, only the Confluent Schema Registry is supported.
Configure its endpoint with a property in your application.yml:
confluent:
  schemaregistry:
    endpoint: http://192.168.99.100:8081
The module supports three types of objects when serializing data (to understand how Avro handles serialization, see the Apache Avro documentation):

- SpecificRecord entities: these are generated from an Avro schema.
- GenericRecord entities: a container that uses a schema internally to validate fields and handle serialization (see the sketch after this list).
- Any Java POJO: serialized using reflection.
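As a sketch of the GenericRecord case, a record can be built against a schema parsed at runtime; the schema and field names here are illustrative, not part of the module:

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class GenericRecordExample {

    public static GenericRecord buildSensorRecord() {
        // Parse a schema at runtime; record name and fields are illustrative.
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Sensor\","
                + "\"namespace\":\"org.springframework.cloud.stream.samples\","
                + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"},"
                + "{\"name\":\"temperature\",\"type\":\"float\"}]}");

        // GenericData.Record validates field names against the schema on put().
        GenericRecord record = new GenericData.Record(schema);
        record.put("id", "sensor-1");
        record.put("temperature", 21.5f);
        return record;
    }
}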
You need to set the contentType for sources.
If you are using SpecificRecord, the module can retrieve the schema from the object itself, so no other configuration is needed.
If you decide to use POJOs with reflection instead, the module supports two approaches:
Using a schema file
You need an Avro schema file (*.avsc) on your classpath for each entity you want to use with reflection. The module matches the POJO to its schema file by comparing the fully qualified name of the class with the schema fullname (namespace + name), as in the example below.
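For example, a POJO named org.springframework.cloud.stream.samples.User would be matched by a schema file whose fullname is the same. This is an illustrative sketch; the field names are assumptions:

{
  "namespace": "org.springframework.cloud.stream.samples",
  "type": "record",
  "name": "User",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "favoriteColor", "type": ["null", "string"], "default": null}
  ]
}

Here the schema fullname is org.springframework.cloud.stream.samples.User, so it is paired with the class of the same fully qualified name.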
Dynamic schema generation
It is possible to enable dynamic schema generation, although this is not recommended. If this is what you want, enable it in your application.yml:
spring:
  cloud:
    stream:
      codec:
        avro:
          dynamicSchemaGeneration: true
Please note that any schema generated dynamically will be registered with the schema registry.
You do not need to configure a contentType for sinks; the module uses information from the message headers to enable deserialization, as in the sketch below.
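A minimal sink sketch, assuming the Spring Cloud Stream 1.x Sink binding and Spring Integration's @ServiceActivator; Sensor is the same hypothetical Avro-generated class as above:

// A minimal sketch of a sink; deserialization back to Sensor is driven
// by the schema information carried in the message headers.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Sink;
import org.springframework.integration.annotation.ServiceActivator;

@SpringBootApplication
@EnableBinding(Sink.class)
public class SensorSinkApplication {

    @ServiceActivator(inputChannel = Sink.INPUT)
    public void receive(Sensor sensor) {
        System.out.println("Received: " + sensor);
    }

    public static void main(String[] args) {
        SpringApplication.run(SensorSinkApplication.class, args);
    }
}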
However, since Avro supports schema evolution, readers always deal with two schemas: the writer schema and the reader schema.
By default, a sink is configured to use the same schema as the writer (obtained from the message). If you want a sink to use a specific reader schema, you can configure it in your application.yml:
spring:
  cloud:
    stream:
      codec:
        avro:
          readerSchema: "org.springframework.cloud.stream.samples.User"
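As an illustration of why a reader schema matters, an evolved version of the hypothetical User schema above could add a field with a default value; Avro's schema resolution then fills in the default when reading records written with the older schema:

{
  "namespace": "org.springframework.cloud.stream.samples",
  "type": "record",
  "name": "User",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "favoriteColor", "type": ["null", "string"], "default": null},
    {"name": "favoriteNumber", "type": "int", "default": 0}
  ]
}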