Modifier and Type | Class and Description
---|---
`class` | `FlinkKafkaConsumer<T>`: The Flink Kafka Consumer is a streaming data source that pulls a parallel data stream from Apache Kafka.
`class` | `FlinkKafkaConsumer010<T>`: The Flink Kafka Consumer is a streaming data source that pulls a parallel data stream from Apache Kafka 0.10.x.
`class` | `FlinkKafkaConsumer011<T>`: The Flink Kafka Consumer is a streaming data source that pulls a parallel data stream from Apache Kafka 0.11.x.
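A minimal sketch of wiring one of these consumer subclasses into a DataStream job. The topic name, broker address, and group id are illustrative placeholders, and the example assumes the universal `FlinkKafkaConsumer`; the version-specific `FlinkKafkaConsumer010`/`FlinkKafkaConsumer011` classes are constructed the same way:

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class KafkaSourceExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical connection settings; replace with your cluster's values.
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "example-group");

        // Universal consumer; swap in FlinkKafkaConsumer010/011 for matching broker versions.
        FlinkKafkaConsumer<String> consumer =
                new FlinkKafkaConsumer<>("example-topic", new SimpleStringSchema(), props);

        DataStream<String> stream = env.addSource(consumer);
        stream.print();
        env.execute("Kafka source example");
    }
}
```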
Modifier and Type | Method and Description
---|---
`FlinkKafkaConsumerBase<T>` | `FlinkKafkaConsumerBase.assignTimestampsAndWatermarks(AssignerWithPeriodicWatermarks<T> assigner)` Deprecated. Please use `assignTimestampsAndWatermarks(WatermarkStrategy)` instead.
`FlinkKafkaConsumerBase<T>` | `FlinkKafkaConsumerBase.assignTimestampsAndWatermarks(AssignerWithPunctuatedWatermarks<T> assigner)` Deprecated. Please use `assignTimestampsAndWatermarks(WatermarkStrategy)` instead.
`FlinkKafkaConsumerBase<T>` | `FlinkKafkaConsumerBase.assignTimestampsAndWatermarks(WatermarkStrategy<T> watermarkStrategy)` Sets the given `WatermarkStrategy` on this consumer.
`protected FlinkKafkaConsumerBase<Row>` | `KafkaTableSource.createKafkaConsumer(String topic, Properties properties, DeserializationSchema<Row> deserializationSchema)`
`protected FlinkKafkaConsumerBase<Row>` | `Kafka011TableSource.createKafkaConsumer(String topic, Properties properties, DeserializationSchema<Row> deserializationSchema)`
`protected FlinkKafkaConsumerBase<Row>` | `Kafka010TableSource.createKafkaConsumer(String topic, Properties properties, DeserializationSchema<Row> deserializationSchema)`
`protected abstract FlinkKafkaConsumerBase<Row>` | `KafkaTableSourceBase.createKafkaConsumer(String topic, Properties properties, DeserializationSchema<Row> deserializationSchema)` Creates a version-specific Kafka consumer.
`FlinkKafkaConsumerBase<T>` | `FlinkKafkaConsumerBase.disableFilterRestoredPartitionsWithSubscribedTopics()` By default, when restoring from a checkpoint/savepoint, the consumer ignores restored partitions that are no longer associated with the currently specified topics or topic pattern; this method disables that filtering.
`protected FlinkKafkaConsumerBase<Row>` | `KafkaTableSourceBase.getKafkaConsumer(String topic, Properties properties, DeserializationSchema<Row> deserializationSchema)` Returns a version-specific Kafka consumer with the start position configured.
`FlinkKafkaConsumerBase<T>` | `FlinkKafkaConsumerBase.setCommitOffsetsOnCheckpoints(boolean commitOnCheckpoints)` Specifies whether the consumer should commit offsets back to Kafka on checkpoints.
`FlinkKafkaConsumerBase<T>` | `FlinkKafkaConsumerBase.setStartFromEarliest()` Configures the consumer to start reading from the earliest offset for all partitions.
`FlinkKafkaConsumerBase<T>` | `FlinkKafkaConsumerBase.setStartFromGroupOffsets()` Configures the consumer to start reading from any committed group offsets found in ZooKeeper / Kafka brokers.
`FlinkKafkaConsumerBase<T>` | `FlinkKafkaConsumerBase.setStartFromLatest()` Configures the consumer to start reading from the latest offset for all partitions.
`FlinkKafkaConsumerBase<T>` | `FlinkKafkaConsumerBase.setStartFromSpecificOffsets(Map<KafkaTopicPartition,Long> specificStartupOffsets)` Configures the consumer to start reading partitions from specific offsets, set independently for each partition.
`FlinkKafkaConsumerBase<T>` | `FlinkKafkaConsumerBase.setStartFromTimestamp(long startupOffsetsTimestamp)` Configures the consumer to start reading partitions from a specified timestamp.
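The builder-style setters listed above can be chained on a consumer instance. The sketch below shows the startup-mode options and the non-deprecated `WatermarkStrategy` overload; the topic, broker address, and offset values are illustrative assumptions, not values from this page:

```java
import java.time.Duration;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.connectors.kafka.internals.KafkaTopicPartition;

public class KafkaStartupModes {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // hypothetical broker
        props.setProperty("group.id", "example-group");

        FlinkKafkaConsumer<String> consumer =
                new FlinkKafkaConsumer<>("example-topic", new SimpleStringSchema(), props);

        // Exactly one startup mode applies; the most recent setter wins.
        // consumer.setStartFromGroupOffsets();           // the default
        // consumer.setStartFromEarliest();
        // consumer.setStartFromLatest();
        // consumer.setStartFromTimestamp(1_600_000_000_000L);

        // Per-partition start offsets (illustrative partition/offset values).
        Map<KafkaTopicPartition, Long> offsets = new HashMap<>();
        offsets.put(new KafkaTopicPartition("example-topic", 0), 23L);
        offsets.put(new KafkaTopicPartition("example-topic", 1), 31L);
        consumer.setStartFromSpecificOffsets(offsets);

        // Commit offsets back to Kafka only on completed checkpoints.
        consumer.setCommitOffsetsOnCheckpoints(true);

        // Replacement for the deprecated assigner overloads: generate
        // watermarks inside the Kafka source, per partition.
        consumer.assignTimestampsAndWatermarks(
                WatermarkStrategy.<String>forBoundedOutOfOrderness(Duration.ofSeconds(5)));
    }
}
```

Generating watermarks inside the source (rather than downstream) keeps per-partition event-time order information, which is why the `WatermarkStrategy` overload is preferred over the deprecated assigners.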
Modifier and Type | Class and Description
---|---
`class` | `FlinkKafkaShuffleConsumer<T>`: Flink Kafka Shuffle Consumer Function.
Modifier and Type | Method and Description
---|---
`protected FlinkKafkaConsumerBase<RowData>` | `KafkaDynamicSource.createKafkaConsumer(String topic, Properties properties, DeserializationSchema<RowData> deserializationSchema)`
`protected FlinkKafkaConsumerBase<RowData>` | `Kafka011DynamicSource.createKafkaConsumer(String topic, Properties properties, DeserializationSchema<RowData> deserializationSchema)`
`protected FlinkKafkaConsumerBase<RowData>` | `Kafka010DynamicSource.createKafkaConsumer(String topic, Properties properties, DeserializationSchema<RowData> deserializationSchema)`
`protected abstract FlinkKafkaConsumerBase<RowData>` | `KafkaDynamicSourceBase.createKafkaConsumer(String topic, Properties properties, DeserializationSchema<RowData> deserializationSchema)` Creates a version-specific Kafka consumer.
`protected FlinkKafkaConsumerBase<RowData>` | `KafkaDynamicSourceBase.getKafkaConsumer(String topic, Properties properties, DeserializationSchema<RowData> deserializationSchema)` Returns a version-specific Kafka consumer with the start position configured.
Copyright © 2014–2021 The Apache Software Foundation. All rights reserved.