pyflink.datastream.connectors.kafka.KafkaRecordSerializationSchemaBuilder#
- class KafkaRecordSerializationSchemaBuilder[source]#
Builder to construct a KafkaRecordSerializationSchema.

Example:
>>> KafkaRecordSerializationSchema.builder() \
...     .set_topic('topic') \
...     .set_key_serialization_schema(SimpleStringSchema()) \
...     .set_value_serialization_schema(SimpleStringSchema()) \
...     .build()
And the sink topic can be calculated dynamically from each record:
>>> KafkaRecordSerializationSchema.builder() \
...     .set_topic_selector(lambda row: 'topic-' + row['category']) \
...     .set_value_serialization_schema(
...         JsonRowSerializationSchema.builder().with_type_info(ROW_TYPE).build()) \
...     .build()
It is necessary to configure exactly one serialization method for the value, and a topic (either fixed or computed via a topic selector).
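The topic selector passed to set_topic_selector is a plain Python callable mapping each record to a topic name. As a minimal sketch of that routing logic, independent of any running Flink job and using hypothetical dicts in place of Flink Row values:

```python
# A topic selector is just a callable from a record to a topic name.
# These records are hypothetical dicts standing in for Flink Row values.
def category_topic_selector(record):
    """Route each record to a per-category Kafka topic."""
    return 'topic-' + record['category']

records = [
    {'category': 'orders', 'payload': '...'},
    {'category': 'users', 'payload': '...'},
]
topics = [category_topic_selector(r) for r in records]
print(topics)  # ['topic-orders', 'topic-users']
```

Because the selector is evaluated per record, every distinct category value produces a distinct destination topic.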
New in version 1.16.0.
Methods

build()
    Constructs the KafkaRecordSerializationSchemaBuilder with the configured properties.

set_key_serialization_schema(...)
    Sets a SerializationSchema which is used to serialize the incoming element to the key of the producer record.

set_topic(topic)
    Sets a fixed topic which is used as the destination for all records.

set_topic_selector(topic_selector)
    Sets a topic selector which computes the target topic for every incoming record.

set_value_serialization_schema(...)
    Sets a SerializationSchema which is used to serialize the incoming element to the value of the producer record.