@PublicEvolving public class FlinkKafkaConsumer011<T> extends FlinkKafkaConsumer010<T>
The Flink Kafka Consumer participates in checkpointing and guarantees that no data is lost during a failure, and that the computation processes elements "exactly once". (Note: These guarantees naturally assume that Kafka itself does not lose any data.)
Please note that Flink snapshots the offsets internally as part of its distributed checkpoints. The offsets committed to Kafka / ZooKeeper only serve to bring the outside view of progress in sync with Flink's view of progress. That way, monitoring and other jobs can get a view of how far the Flink Kafka consumer has consumed a topic.
Please refer to Kafka's documentation for the available configuration properties: http://kafka.apache.org/documentation.html#newconsumerconfigs
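As a usage illustration (not part of the original Javadoc): a minimal sketch of a job that enables checkpointing so the consumer's offset snapshots take part in it, using the single-topic constructor documented below. The broker address, group id, topic name, and checkpoint interval are placeholder assumptions.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class Kafka011CheckpointedSource {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Checkpointing must be enabled for the exactly-once guarantee described above;
        // the consumer snapshots its offsets as part of each distributed checkpoint.
        env.enableCheckpointing(5_000L);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.setProperty("group.id", "example-group");           // assumed consumer group

        env.addSource(new FlinkKafkaConsumer011<>("example-topic", new SimpleStringSchema(), props))
           .print();

        env.execute("FlinkKafkaConsumer011 example");
    }
}
```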
Nested classes/interfaces inherited from interface org.apache.flink.streaming.api.functions.source.SourceFunction:
SourceFunction.SourceContext<T>
Fields inherited from class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09:
DEFAULT_POLL_TIMEOUT, KEY_POLL_TIMEOUT, pollTimeout, properties
Fields inherited from class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase:
deserializer, KEY_DISABLE_METRICS, KEY_PARTITION_DISCOVERY_INTERVAL_MILLIS, MAX_NUM_PENDING_CHECKPOINTS, PARTITION_DISCOVERY_DISABLED
| Constructor | Description |
|---|---|
| `FlinkKafkaConsumer011(List<String> topics, DeserializationSchema<T> deserializer, Properties props)` | Creates a new Kafka streaming source consumer for Kafka 0.11.x. |
| `FlinkKafkaConsumer011(List<String> topics, KafkaDeserializationSchema<T> deserializer, Properties props)` | Creates a new Kafka streaming source consumer for Kafka 0.11.x. |
| `FlinkKafkaConsumer011(Pattern subscriptionPattern, DeserializationSchema<T> valueDeserializer, Properties props)` | Creates a new Kafka streaming source consumer for Kafka 0.11.x. |
| `FlinkKafkaConsumer011(Pattern subscriptionPattern, KafkaDeserializationSchema<T> deserializer, Properties props)` | Creates a new Kafka streaming source consumer for Kafka 0.11.x. |
| `FlinkKafkaConsumer011(String topic, DeserializationSchema<T> valueDeserializer, Properties props)` | Creates a new Kafka streaming source consumer for Kafka 0.11.x. |
| `FlinkKafkaConsumer011(String topic, KafkaDeserializationSchema<T> deserializer, Properties props)` | Creates a new Kafka streaming source consumer for Kafka 0.11.x. |
Methods inherited from class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010:
createFetcher, createPartitionDiscoverer, fetchOffsetsWithTimestamp, setStartFromTimestamp
Methods inherited from class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09:
getIsAutoCommitEnabled, getRateLimiter, setRateLimiter
Methods inherited from class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase:
assignTimestampsAndWatermarks, assignTimestampsAndWatermarks, cancel, close, disableFilterRestoredPartitionsWithSubscribedTopics, getProducedType, initializeState, notifyCheckpointComplete, open, run, setCommitOffsetsOnCheckpoints, setStartFromEarliest, setStartFromGroupOffsets, setStartFromLatest, setStartFromSpecificOffsets, snapshotState
Methods inherited from class org.apache.flink.api.common.functions.AbstractRichFunction:
getIterationRuntimeContext, getRuntimeContext, setRuntimeContext
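Several of the inherited FlinkKafkaConsumerBase methods above configure the start position and offset committing. A minimal sketch of how they are typically combined (the topic name is an assumed placeholder; `props` is a configured Properties object as in the example further above):

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class StartPositionExample {
    public static FlinkKafkaConsumer011<String> build(Properties props) {
        FlinkKafkaConsumer011<String> consumer =
                new FlinkKafkaConsumer011<>("example-topic", new SimpleStringSchema(), props);
        consumer.setStartFromEarliest();              // ignore committed offsets, read from the beginning
        consumer.setCommitOffsetsOnCheckpoints(true); // commit offsets back to Kafka only on completed checkpoints
        return consumer;
    }
}
```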
public FlinkKafkaConsumer011(String topic, DeserializationSchema<T> valueDeserializer, Properties props)

Parameters:
topic - The name of the topic that should be consumed.
valueDeserializer - The de-/serializer used to convert between Kafka's byte messages and Flink's objects.
props - The properties used to configure the Kafka consumer client, and the ZooKeeper client.
public FlinkKafkaConsumer011(String topic, KafkaDeserializationSchema<T> deserializer, Properties props)

This constructor allows passing a KafkaDeserializationSchema for reading key/value pairs, offsets, and topic names from Kafka.

Parameters:
topic - The name of the topic that should be consumed.
deserializer - The keyed de-/serializer used to convert between Kafka's byte messages and Flink's objects.
props - The properties used to configure the Kafka consumer client, and the ZooKeeper client.
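For illustration (not in the original page), a sketch of a KafkaDeserializationSchema implementation that uses the extra record metadata this constructor makes available; the class name and output format are arbitrary choices:

```java
import java.nio.charset.StandardCharsets;

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.connectors.kafka.KafkaDeserializationSchema;
import org.apache.kafka.clients.consumer.ConsumerRecord;

public class TopicOffsetValueSchema implements KafkaDeserializationSchema<String> {

    @Override
    public boolean isEndOfStream(String nextElement) {
        return false; // consume indefinitely
    }

    @Override
    public String deserialize(ConsumerRecord<byte[], byte[]> record) {
        // Unlike a plain DeserializationSchema, the full ConsumerRecord is available,
        // so the topic name, partition, offset, and key can all be inspected here.
        String value = record.value() == null
                ? ""
                : new String(record.value(), StandardCharsets.UTF_8);
        return record.topic() + "@" + record.offset() + ": " + value;
    }

    @Override
    public TypeInformation<String> getProducedType() {
        return Types.STRING;
    }
}
```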
public FlinkKafkaConsumer011(List<String> topics, DeserializationSchema<T> deserializer, Properties props)

This constructor allows passing multiple topics to the consumer.

Parameters:
topics - The Kafka topics to read from.
deserializer - The de-/serializer used to convert between Kafka's byte messages and Flink's objects.
props - The properties that are used to configure both the fetcher and the offset handler.
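A brief sketch of the multi-topic variant (the topic names are assumed placeholders; `props` is a configured Properties object as above):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class MultiTopicExample {
    public static FlinkKafkaConsumer011<String> build(Properties props) {
        // One consumer instance reads from all listed topics.
        List<String> topics = Arrays.asList("orders", "payments"); // assumed topic names
        return new FlinkKafkaConsumer011<>(topics, new SimpleStringSchema(), props);
    }
}
```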
public FlinkKafkaConsumer011(List<String> topics, KafkaDeserializationSchema<T> deserializer, Properties props)

This constructor allows passing multiple topics and a key/value deserialization schema.

Parameters:
topics - The Kafka topics to read from.
deserializer - The keyed de-/serializer used to convert between Kafka's byte messages and Flink's objects.
props - The properties that are used to configure both the fetcher and the offset handler.
@PublicEvolving
public FlinkKafkaConsumer011(Pattern subscriptionPattern, DeserializationSchema<T> valueDeserializer, Properties props)

If partition discovery is enabled (by setting a non-negative value for FlinkKafkaConsumerBase.KEY_PARTITION_DISCOVERY_INTERVAL_MILLIS in the properties), topics with names matching the pattern will also be subscribed to as they are created on the fly.

Parameters:
subscriptionPattern - The regular expression for a pattern of topic names to subscribe to.
valueDeserializer - The de-/serializer used to convert between Kafka's byte messages and Flink's objects.
props - The properties used to configure the Kafka consumer client, and the ZooKeeper client.
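A sketch of pattern-based subscription with partition discovery turned on; the discovery interval, pattern, and connection properties are assumptions for illustration:

```java
import java.util.Properties;
import java.util.regex.Pattern;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase;

public class PatternSubscriptionExample {
    public static FlinkKafkaConsumer011<String> build() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.setProperty("group.id", "pattern-group");           // assumed consumer group
        // A non-negative value enables periodic discovery of newly created matching topics.
        props.setProperty(FlinkKafkaConsumerBase.KEY_PARTITION_DISCOVERY_INTERVAL_MILLIS, "10000");

        return new FlinkKafkaConsumer011<>(
                Pattern.compile("metrics-.*"), new SimpleStringSchema(), props);
    }
}
```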
@PublicEvolving
public FlinkKafkaConsumer011(Pattern subscriptionPattern, KafkaDeserializationSchema<T> deserializer, Properties props)

If partition discovery is enabled (by setting a non-negative value for FlinkKafkaConsumerBase.KEY_PARTITION_DISCOVERY_INTERVAL_MILLIS in the properties), topics with names matching the pattern will also be subscribed to as they are created on the fly.

This constructor allows passing a KafkaDeserializationSchema for reading key/value pairs, offsets, and topic names from Kafka.

Parameters:
subscriptionPattern - The regular expression for a pattern of topic names to subscribe to.
deserializer - The keyed de-/serializer used to convert between Kafka's byte messages and Flink's objects.
props - The properties used to configure the Kafka consumer client, and the ZooKeeper client.