@PublicEvolving
public class FlinkKafkaProducer09<IN>
extends FlinkKafkaProducerBase<IN>

Please note that this producer does not have any reliability guarantees.

Type Parameters:
IN - Type of the messages to write into Kafka.
Nested classes/interfaces inherited from interface org.apache.flink.streaming.api.functions.sink.SinkFunction:
SinkFunction.Context<T>

Fields inherited from class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducerBase:
asyncException, callback, defaultTopicId, flinkKafkaPartitioner, flushOnCheckpoint, KEY_DISABLE_METRICS, logFailuresOnly, pendingRecords, pendingRecordsLock, producer, producerConfig, schema, topicPartitionsMap
| Constructor and Description |
| --- |
| FlinkKafkaProducer09(String topicId, KeyedSerializationSchema<IN> serializationSchema, Properties producerConfig) Creates a FlinkKafkaProducer for a given topic. |
| FlinkKafkaProducer09(String topicId, KeyedSerializationSchema<IN> serializationSchema, Properties producerConfig, FlinkKafkaPartitioner<IN> customPartitioner) Creates a FlinkKafkaProducer for a given topic. |
| FlinkKafkaProducer09(String topicId, KeyedSerializationSchema<IN> serializationSchema, Properties producerConfig, KafkaPartitioner<IN> customPartitioner) Deprecated. This constructor does not correctly handle partitioning when producing to multiple topics. Use FlinkKafkaProducer09(String, KeyedSerializationSchema, Properties, FlinkKafkaPartitioner) instead. |
| FlinkKafkaProducer09(String topicId, SerializationSchema<IN> serializationSchema, Properties producerConfig) Creates a FlinkKafkaProducer for a given topic. |
| FlinkKafkaProducer09(String topicId, SerializationSchema<IN> serializationSchema, Properties producerConfig, FlinkKafkaPartitioner<IN> customPartitioner) Creates a FlinkKafkaProducer for a given topic. |
| FlinkKafkaProducer09(String topicId, SerializationSchema<IN> serializationSchema, Properties producerConfig, KafkaPartitioner<IN> customPartitioner) Deprecated. This constructor does not correctly handle partitioning when producing to multiple topics. Use FlinkKafkaProducer09(String, SerializationSchema, Properties, FlinkKafkaPartitioner) instead. |
| FlinkKafkaProducer09(String brokerList, String topicId, KeyedSerializationSchema<IN> serializationSchema) Creates a FlinkKafkaProducer for a given topic. |
| FlinkKafkaProducer09(String brokerList, String topicId, SerializationSchema<IN> serializationSchema) Creates a FlinkKafkaProducer for a given topic. |
| Modifier and Type | Method and Description |
| --- | --- |
| protected void | flush() Flush pending records. |
Methods inherited from class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducerBase:
checkErroneous, close, getKafkaProducer, getPartitionsByTopic, getPropertiesFromBrokerList, initializeState, invoke, numPendingRecords, open, setFlushOnCheckpoint, setLogFailuresOnly, snapshotState

Methods inherited from class org.apache.flink.api.common.functions.AbstractRichFunction:
getIterationRuntimeContext, getRuntimeContext, setRuntimeContext

Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.apache.flink.streaming.api.functions.sink.SinkFunction:
invoke
public FlinkKafkaProducer09(String brokerList, String topicId, SerializationSchema<IN> serializationSchema)
Using this constructor, the default FlinkFixedPartitioner will be used as the partitioner. This default partitioner maps each sink subtask to a single Kafka partition (i.e. all records received by a sink subtask will end up in the same Kafka partition).

To use a custom partitioner, please use FlinkKafkaProducer09(String, SerializationSchema, Properties, FlinkKafkaPartitioner) instead.

Parameters:
brokerList - Comma separated addresses of the brokers.
topicId - ID of the Kafka topic.
serializationSchema - User defined key-less serialization schema.
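As a usage illustration (not part of the original Javadoc), the sketch below wires this constructor into a simple streaming job. The broker address, topic name, class name, and the choice of SimpleStringSchema are assumptions made for the example; the import paths assume a Flink 1.x distribution that still ships the 0.9 connector.

```java
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer09;

public class BrokerListProducerSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<String> stream = env.fromElements("a", "b", "c");

        // Key-less SerializationSchema: the default FlinkFixedPartitioner is used,
        // so each subtask of this sink writes to a single Kafka partition.
        FlinkKafkaProducer09<String> producer = new FlinkKafkaProducer09<>(
                "localhost:9092",           // comma-separated broker addresses (assumed)
                "example-topic",            // Kafka topic id (assumed)
                new SimpleStringSchema());

        stream.addSink(producer);
        env.execute("FlinkKafkaProducer09 broker-list sketch");
    }
}
```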
public FlinkKafkaProducer09(String topicId, SerializationSchema<IN> serializationSchema, Properties producerConfig)

Using this constructor, the default FlinkFixedPartitioner will be used as the partitioner. This default partitioner maps each sink subtask to a single Kafka partition (i.e. all records received by a sink subtask will end up in the same Kafka partition).

To use a custom partitioner, please use FlinkKafkaProducer09(String, SerializationSchema, Properties, FlinkKafkaPartitioner) instead.

Parameters:
topicId - ID of the Kafka topic.
serializationSchema - User defined key-less serialization schema.
producerConfig - Properties with the producer configuration.
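A minimal sketch of supplying the producer configuration through Properties; the broker address, topic id, and helper method name are assumptions for illustration. Only the bootstrap.servers setting is required.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer09;

public class ProducerConfigSketch {
    // Attaches the sink to an existing stream; address and topic id are assumed values.
    static void addKafkaSink(DataStream<String> stream) {
        Properties producerConfig = new Properties();
        // 'bootstrap.servers' is the only required setting; everything else is optional tuning.
        producerConfig.setProperty("bootstrap.servers", "localhost:9092");

        FlinkKafkaProducer09<String> producer = new FlinkKafkaProducer09<>(
                "example-topic",            // Kafka topic id (assumed)
                new SimpleStringSchema(),   // key-less serialization schema
                producerConfig);

        stream.addSink(producer);
    }
}
```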
public FlinkKafkaProducer09(String topicId, SerializationSchema<IN> serializationSchema, Properties producerConfig, @Nullable FlinkKafkaPartitioner<IN> customPartitioner)

Creates a FlinkKafkaProducer for a given topic, using a key-less SerializationSchema and possibly a custom FlinkKafkaPartitioner.

Since a key-less SerializationSchema is used, all records sent to Kafka will not have an attached key. Therefore, if a partitioner is also not provided, records will be distributed to Kafka partitions in a round-robin fashion.

Parameters:
topicId - The topic to write data to.
serializationSchema - A key-less serializable serialization schema for turning user objects into a Kafka-consumable byte[].
producerConfig - Configuration properties for the KafkaProducer. 'bootstrap.servers' is the only required argument.
customPartitioner - A serializable partitioner for assigning messages to Kafka partitions. If set to null, records will be distributed to Kafka partitions in a round-robin fashion.
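To illustrate plugging in a custom partitioner, the sketch below defines a hypothetical HashPartitioner that derives the target partition from each record's hash code. The class names, broker address, and topic id are assumptions; passing null instead of the partitioner would yield round-robin distribution, as described above.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer09;
import org.apache.flink.streaming.connectors.kafka.partitioner.FlinkKafkaPartitioner;

public class CustomPartitionerSketch {
    // Hypothetical partitioner: routes each record by its hash code (illustrative only).
    static class HashPartitioner extends FlinkKafkaPartitioner<String> {
        @Override
        public int partition(String record, byte[] key, byte[] value,
                             String targetTopic, int[] partitions) {
            return partitions[Math.floorMod(record.hashCode(), partitions.length)];
        }
    }

    static void addKafkaSink(DataStream<String> stream) {
        Properties producerConfig = new Properties();
        producerConfig.setProperty("bootstrap.servers", "localhost:9092"); // assumed address

        // The key-less schema attaches no Kafka key, so partition placement
        // here is decided entirely by the custom partitioner.
        FlinkKafkaProducer09<String> producer = new FlinkKafkaProducer09<>(
                "example-topic",            // Kafka topic id (assumed)
                new SimpleStringSchema(),
                producerConfig,
                new HashPartitioner());

        stream.addSink(producer);
    }
}
```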
public FlinkKafkaProducer09(String brokerList, String topicId, KeyedSerializationSchema<IN> serializationSchema)

Using this constructor, the default FlinkFixedPartitioner will be used as the partitioner. This default partitioner maps each sink subtask to a single Kafka partition (i.e. all records received by a sink subtask will end up in the same Kafka partition).

To use a custom partitioner, please use FlinkKafkaProducer09(String, KeyedSerializationSchema, Properties, FlinkKafkaPartitioner) instead.

Parameters:
brokerList - Comma separated addresses of the brokers.
topicId - ID of the Kafka topic.
serializationSchema - User defined serialization schema supporting key/value messages.
public FlinkKafkaProducer09(String topicId, KeyedSerializationSchema<IN> serializationSchema, Properties producerConfig)

Using this constructor, the default FlinkFixedPartitioner will be used as the partitioner. This default partitioner maps each sink subtask to a single Kafka partition (i.e. all records received by a sink subtask will end up in the same Kafka partition).

To use a custom partitioner, please use FlinkKafkaProducer09(String, KeyedSerializationSchema, Properties, FlinkKafkaPartitioner) instead.

Parameters:
topicId - ID of the Kafka topic.
serializationSchema - User defined serialization schema supporting key/value messages.
producerConfig - Properties with the producer configuration.
public FlinkKafkaProducer09(String topicId, KeyedSerializationSchema<IN> serializationSchema, Properties producerConfig, @Nullable FlinkKafkaPartitioner<IN> customPartitioner)

Creates a FlinkKafkaProducer for a given topic, using a KeyedSerializationSchema and possibly a custom FlinkKafkaPartitioner.

If a partitioner is not provided, written records will be partitioned by the attached key of each record (as determined by KeyedSerializationSchema.serializeKey(Object)). If written records do not have a key (i.e., KeyedSerializationSchema.serializeKey(Object) returns null), they will be distributed to Kafka partitions in a round-robin fashion.

Parameters:
topicId - The topic to write data to.
serializationSchema - A serializable serialization schema for turning user objects into a Kafka-consumable byte[] supporting key/value messages.
producerConfig - Configuration properties for the KafkaProducer. 'bootstrap.servers' is the only required argument.
customPartitioner - A serializable partitioner for assigning messages to Kafka partitions. If set to null, records will be partitioned by the key of each record (determined by KeyedSerializationSchema.serializeKey(Object)). If the keys are null, then records will be distributed to Kafka partitions in a round-robin fashion.
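A sketch of key-based partitioning with a KeyedSerializationSchema and a null partitioner; the StringKeyedSchema class, broker address, topic id, and helper method name are hypothetical. The cast on null only disambiguates this overload from the deprecated KafkaPartitioner variant.

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer09;
import org.apache.flink.streaming.connectors.kafka.partitioner.FlinkKafkaPartitioner;
import org.apache.flink.streaming.util.serialization.KeyedSerializationSchema;

public class KeyedSchemaSketch {
    // Hypothetical keyed schema: uses the record itself as both Kafka key and value.
    static class StringKeyedSchema implements KeyedSerializationSchema<String> {
        @Override
        public byte[] serializeKey(String element) {
            return element.getBytes(StandardCharsets.UTF_8);
        }

        @Override
        public byte[] serializeValue(String element) {
            return element.getBytes(StandardCharsets.UTF_8);
        }

        @Override
        public String getTargetTopic(String element) {
            return null; // fall back to the topic passed to the producer
        }
    }

    static void addKafkaSink(DataStream<String> stream) {
        Properties producerConfig = new Properties();
        producerConfig.setProperty("bootstrap.servers", "localhost:9092"); // assumed address

        // With a null partitioner, records are placed by their serialized key;
        // records whose key is null would be distributed round-robin instead.
        FlinkKafkaProducer09<String> producer = new FlinkKafkaProducer09<>(
                "example-topic",            // Kafka topic id (assumed)
                new StringKeyedSchema(),
                producerConfig,
                (FlinkKafkaPartitioner<String>) null);

        stream.addSink(producer);
    }
}
```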
@Deprecated
public FlinkKafkaProducer09(String topicId, SerializationSchema<IN> serializationSchema, Properties producerConfig, KafkaPartitioner<IN> customPartitioner)

Deprecated. Use FlinkKafkaProducer09(String, SerializationSchema, Properties, FlinkKafkaPartitioner) instead.

Parameters:
topicId - The topic to write data to.
serializationSchema - A (keyless) serializable serialization schema for turning user objects into a Kafka-consumable byte[].
producerConfig - Configuration properties for the KafkaProducer. 'bootstrap.servers' is the only required argument.
customPartitioner - A serializable partitioner for assigning messages to Kafka partitions (when passing null, Kafka's partitioner is used).
@Deprecated
public FlinkKafkaProducer09(String topicId, KeyedSerializationSchema<IN> serializationSchema, Properties producerConfig, KafkaPartitioner<IN> customPartitioner)

Deprecated. Use FlinkKafkaProducer09(String, KeyedSerializationSchema, Properties, FlinkKafkaPartitioner) instead.

Parameters:
topicId - The topic to write data to.
serializationSchema - A serializable serialization schema for turning user objects into a Kafka-consumable byte[] supporting key/value messages.
producerConfig - Configuration properties for the KafkaProducer. 'bootstrap.servers' is the only required argument.
customPartitioner - A serializable partitioner for assigning messages to Kafka partitions.
protected void flush()

Description copied from class: FlinkKafkaProducerBase
Flush pending records.

Specified by:
flush in class FlinkKafkaProducerBase<IN>
Copyright © 2014–2020 The Apache Software Foundation. All rights reserved.