@Internal public abstract class KafkaTableSourceBase extends Object implements org.apache.flink.table.sources.StreamTableSource<Row>, org.apache.flink.table.sources.DefinedProctimeAttribute, org.apache.flink.table.sources.DefinedRowtimeAttributes, DefinedFieldMapping
A version-agnostic Kafka StreamTableSource. The version-specific Kafka consumers need to extend this class and override createKafkaConsumer(String, Properties, DeserializationSchema).
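A minimal sketch, not part of the official Javadoc, of what such a version-specific subclass can look like. MyKafkaTableSource is a hypothetical name, and the universal Kafka connector's FlinkKafkaConsumer(String, DeserializationSchema, Properties) constructor stands in for whichever version-specific consumer is actually targeted:

```java
// Hypothetical subclass; FlinkKafkaConsumer stands in for a version-specific consumer.
import java.util.Properties;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase;
import org.apache.flink.streaming.connectors.kafka.KafkaTableSourceBase;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.types.Row;

public class MyKafkaTableSource extends KafkaTableSourceBase {

    public MyKafkaTableSource(
            TableSchema schema,
            String topic,
            Properties properties,
            DeserializationSchema<Row> deserializationSchema) {
        // Delegates to the "generic Kafka StreamTableSource" constructor of the base class.
        super(schema, topic, properties, deserializationSchema);
    }

    @Override
    protected FlinkKafkaConsumerBase<Row> createKafkaConsumer(
            String topic,
            Properties properties,
            DeserializationSchema<Row> deserializationSchema) {
        // Return the consumer of the Kafka connector version this table source targets.
        return new FlinkKafkaConsumer<>(topic, deserializationSchema, properties);
    }
}
```

A concrete connector would swap FlinkKafkaConsumer for its own consumer class while keeping the override shape the same.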
Modifier | Constructor and Description |
---|---|
protected | KafkaTableSourceBase(TableSchema schema, Optional<String> proctimeAttribute, List<org.apache.flink.table.sources.RowtimeAttributeDescriptor> rowtimeAttributeDescriptors, Optional<Map<String,String>> fieldMapping, String topic, Properties properties, DeserializationSchema<Row> deserializationSchema, StartupMode startupMode, Map<KafkaTopicPartition,Long> specificStartupOffsets) Creates a generic Kafka StreamTableSource. |
protected | KafkaTableSourceBase(TableSchema schema, String topic, Properties properties, DeserializationSchema<Row> deserializationSchema) Creates a generic Kafka StreamTableSource. |
Modifier and Type | Method and Description |
---|---|
protected abstract FlinkKafkaConsumerBase<Row> | createKafkaConsumer(String topic, Properties properties, DeserializationSchema<Row> deserializationSchema) Creates a version-specific Kafka consumer. |
boolean | equals(Object o) |
String | explainSource() Describes the table source. |
DataStream<Row> | getDataStream(StreamExecutionEnvironment env) NOTE: This method is for internal use only for defining a TableSource. |
DeserializationSchema<Row> | getDeserializationSchema() Returns the deserialization schema. |
Map<String,String> | getFieldMapping() Returns the mapping for the fields of the TableSource's TableSchema to the fields of its return type TypeInformation. |
protected FlinkKafkaConsumerBase<Row> | getKafkaConsumer(String topic, Properties properties, DeserializationSchema<Row> deserializationSchema) Returns a version-specific Kafka consumer with the start position configured. |
String | getProctimeAttribute() |
Properties | getProperties() Returns the properties for the Kafka consumer. |
TypeInformation<Row> | getReturnType() Returns the TypeInformation for the return type of the TableSource. |
List<org.apache.flink.table.sources.RowtimeAttributeDescriptor> | getRowtimeAttributeDescriptors() |
TableSchema | getTableSchema() Returns the schema of the produced table. |
int | hashCode() |
protected KafkaTableSourceBase(TableSchema schema, Optional<String> proctimeAttribute, List<org.apache.flink.table.sources.RowtimeAttributeDescriptor> rowtimeAttributeDescriptors, Optional<Map<String,String>> fieldMapping, String topic, Properties properties, DeserializationSchema<Row> deserializationSchema, StartupMode startupMode, Map<KafkaTopicPartition,Long> specificStartupOffsets)
Creates a generic Kafka StreamTableSource.
Parameters:
schema - Schema of the produced table.
proctimeAttribute - Field name of the processing time attribute.
rowtimeAttributeDescriptors - Descriptors for the rowtime attributes.
fieldMapping - Mapping for the fields of the table schema to fields of the physical returned type.
topic - Kafka topic to consume.
properties - Properties for the Kafka consumer.
deserializationSchema - Deserialization schema for decoding records from Kafka.
startupMode - Startup mode for the contained consumer.
specificStartupOffsets - Specific startup offsets; only relevant when the startup mode is StartupMode.SPECIFIC_OFFSETS.
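A minimal sketch, not taken from the Javadoc, of how the arguments of this constructor might be assembled. MyKafkaTableSource is the hypothetical subclass from the sketch above, assumed here to also expose a constructor with this full parameter list that forwards to super(...); the topic, offsets, and schema fields are made-up values:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.Properties;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.connectors.kafka.KafkaTableSourceBase;
import org.apache.flink.streaming.connectors.kafka.config.StartupMode;
import org.apache.flink.streaming.connectors.kafka.internals.KafkaTopicPartition;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.types.Row;

public class SpecificOffsetsExample {

    public static KafkaTableSourceBase createSource(DeserializationSchema<Row> deserializationSchema) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "my-group");

        // Only consulted because startupMode below is StartupMode.SPECIFIC_OFFSETS.
        Map<KafkaTopicPartition, Long> specificOffsets = new HashMap<>();
        specificOffsets.put(new KafkaTopicPartition("orders", 0), 23L);
        specificOffsets.put(new KafkaTopicPartition("orders", 1), 31L);

        TableSchema schema = TableSchema.builder()
                .field("name", Types.STRING)
                .field("amount", Types.LONG)
                .build();

        return new MyKafkaTableSource(   // hypothetical subclass, see the sketch above
                schema,
                Optional.empty(),        // no processing time attribute
                Collections.emptyList(), // no rowtime attribute descriptors
                Optional.empty(),        // no field mapping
                "orders",
                props,
                deserializationSchema,
                StartupMode.SPECIFIC_OFFSETS,
                specificOffsets);
    }
}
```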
protected KafkaTableSourceBase(TableSchema schema, String topic, Properties properties, DeserializationSchema<Row> deserializationSchema)
Creates a generic Kafka StreamTableSource.
Parameters:
schema - Schema of the produced table.
topic - Kafka topic to consume.
properties - Properties for the Kafka consumer.
deserializationSchema - Deserialization schema for decoding records from Kafka.

public DataStream<Row> getDataStream(StreamExecutionEnvironment env)
Specified by:
getDataStream in interface org.apache.flink.table.sources.StreamTableSource<Row>
public TypeInformation<Row> getReturnType()
Description copied from interface: TableSource
Returns the TypeInformation for the return type of the TableSource. The fields of the return type are mapped to the table schema based on their name.
Specified by:
getReturnType in interface TableSource<Row>
Returns:
The type of the returned DataSet or DataStream.
public TableSchema getTableSchema()
Description copied from interface: TableSource
Returns the schema of the produced table.
Specified by:
getTableSchema in interface TableSource<Row>
Returns:
The TableSchema of the produced table.

public String getProctimeAttribute()
Specified by:
getProctimeAttribute in interface org.apache.flink.table.sources.DefinedProctimeAttribute

public List<org.apache.flink.table.sources.RowtimeAttributeDescriptor> getRowtimeAttributeDescriptors()
Specified by:
getRowtimeAttributeDescriptors in interface org.apache.flink.table.sources.DefinedRowtimeAttributes
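For the rowtimeAttributeDescriptors constructor parameter, each descriptor names a schema field and pairs it with a timestamp extractor and a watermark strategy. A hedged sketch using the legacy descriptor classes; the field name "ts" and the 30-second bound are made-up values:

```java
import java.util.Collections;
import java.util.List;

import org.apache.flink.table.sources.RowtimeAttributeDescriptor;
import org.apache.flink.table.sources.tsextractors.ExistingField;
import org.apache.flink.table.sources.wmstrategies.BoundedOutOfOrderTimestamps;

class RowtimeDescriptorSketch {
    // Declares the schema field "ts" as a rowtime attribute whose timestamps come
    // from the existing physical field "ts", with watermarks lagging 30 seconds.
    static List<RowtimeAttributeDescriptor> rowtimeDescriptors() {
        return Collections.singletonList(
                new RowtimeAttributeDescriptor(
                        "ts",
                        new ExistingField("ts"),
                        new BoundedOutOfOrderTimestamps(30_000L)));
    }
}
```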
public Map<String,String> getFieldMapping()
Description copied from interface: DefinedFieldMapping
Returns the mapping for the fields of the TableSource's TableSchema to the fields of its return type TypeInformation.
The mapping is done based on field names, e.g., a mapping "name" -> "f1" maps the schema field "name" to the field "f1" of the return type, for example in this case the second field of a Tuple.
The returned mapping must map all fields (except proctime and rowtime fields) to the return type. It can also provide a mapping for fields which are not in the TableSchema to make fields in the physical TypeInformation accessible for a TimestampExtractor.
Specified by:
getFieldMapping in interface DefinedFieldMapping
Returns:
A mapping from TableSchema fields to TypeInformation fields or null if no mapping is necessary.
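A small sketch, not from the Javadoc, of such a mapping as it could be passed through the fieldMapping constructor parameter, assuming a physical Tuple-style return type whose fields f0 and f1 back the schema fields amount and name:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

class FieldMappingExample {
    // Maps the logical schema fields onto positional fields of a physical Tuple-style
    // return type, as in the "name" -> "f1" example above.
    static Optional<Map<String, String>> fieldMapping() {
        Map<String, String> mapping = new HashMap<>();
        mapping.put("name", "f1");    // schema field "name" is read from physical field f1
        mapping.put("amount", "f0");  // schema field "amount" is read from physical field f0
        return Optional.of(mapping);  // handed to the fieldMapping constructor parameter
    }
}
```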
public String explainSource()
Description copied from interface: TableSource
Describes the table source.
Specified by:
explainSource in interface TableSource<Row>
Returns:
A String explaining the TableSource.

public Properties getProperties()
Returns the properties for the Kafka consumer.
public DeserializationSchema<Row> getDeserializationSchema()
Returns the deserialization schema.
protected FlinkKafkaConsumerBase<Row> getKafkaConsumer(String topic, Properties properties, DeserializationSchema<Row> deserializationSchema)
Returns a version-specific Kafka consumer with the start position configured.
Parameters:
topic - Kafka topic to consume.
properties - Properties for the Kafka consumer.
deserializationSchema - Deserialization schema to use for Kafka records.

protected abstract FlinkKafkaConsumerBase<Row> createKafkaConsumer(String topic, Properties properties, DeserializationSchema<Row> deserializationSchema)
Creates a version-specific Kafka consumer.
Parameters:
topic - Kafka topic to consume.
properties - Properties for the Kafka consumer.
deserializationSchema - Deserialization schema to use for Kafka records.
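As a rough sketch, not the actual implementation, of what "with the start position configured" in getKafkaConsumer(String, Properties, DeserializationSchema) amounts to on a FlinkKafkaConsumerBase:

```java
import java.util.Map;

import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase;
import org.apache.flink.streaming.connectors.kafka.config.StartupMode;
import org.apache.flink.streaming.connectors.kafka.internals.KafkaTopicPartition;
import org.apache.flink.types.Row;

class StartupPositionSketch {
    // Applies a start position to an already-created consumer, mirroring the
    // "start position configured" wording in the getKafkaConsumer description.
    static void configureStartPosition(
            FlinkKafkaConsumerBase<Row> consumer,
            StartupMode startupMode,
            Map<KafkaTopicPartition, Long> specificStartupOffsets) {
        switch (startupMode) {
            case EARLIEST:
                consumer.setStartFromEarliest();
                break;
            case LATEST:
                consumer.setStartFromLatest();
                break;
            case GROUP_OFFSETS:
                consumer.setStartFromGroupOffsets();
                break;
            case SPECIFIC_OFFSETS:
                consumer.setStartFromSpecificOffsets(specificStartupOffsets);
                break;
            default:
                // Other modes (e.g. a timestamp-based start) need additional arguments.
                break;
        }
    }
}
```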