- cache - Variable in class org.apache.flink.table.store.connector.lookup.RocksDBState
-
- call() - Method in class org.apache.flink.table.store.file.compact.CompactTask
-
- cancelCompaction() - Method in class org.apache.flink.table.store.file.compact.CompactFutureManager
-
- cancelCompaction() - Method in interface org.apache.flink.table.store.file.compact.CompactManager
-
Cancel the currently running compaction task.
- cancelCompaction() - Method in class org.apache.flink.table.store.file.compact.NoopCompactManager
-
- cancelJob(String) - Method in class org.apache.flink.table.store.benchmark.metric.FlinkRestClient
-
- canEqual(Object) - Method in class org.apache.flink.table.store.connector.sink.CommittableTypeInfo
-
- capabilities() - Method in class org.apache.flink.table.store.spark.SparkTable
-
- castFromString(String, LogicalType) - Static method in class org.apache.flink.table.store.utils.TypeUtils
-
- castToIntegral(DecimalData) - Static method in class org.apache.flink.table.store.utils.RowDataUtils
-
- catalog() - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- Catalog - Interface in org.apache.flink.table.store.file.catalog
-
This interface is responsible for reading and writing metadata such as database/table from a
table store catalog.
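The Catalog entries below pair metadata operations with checked exceptions for common failure modes. A minimal in-memory sketch of that contract (the class and method names here are illustrative stand-ins, not the real table store API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch mirroring the style of Catalog and its
// DatabaseAlreadyExistException / DatabaseNotExistException companions.
class DatabaseAlreadyExistException extends Exception {
    DatabaseAlreadyExistException(String db) { super("Database already exists: " + db); }
}

class DatabaseNotExistException extends Exception {
    DatabaseNotExistException(String db) { super("Database does not exist: " + db); }
}

class InMemoryCatalog {
    private final Map<String, Set<String>> databases = new HashMap<>();

    // Creating an existing database fails with a checked exception.
    void createDatabase(String name) throws DatabaseAlreadyExistException {
        if (databases.containsKey(name)) {
            throw new DatabaseAlreadyExistException(name);
        }
        databases.put(name, new HashSet<>());
    }

    // Reading metadata from a missing database fails likewise.
    List<String> listTables(String db) throws DatabaseNotExistException {
        Set<String> tables = databases.get(db);
        if (tables == null) {
            throw new DatabaseNotExistException(db);
        }
        return new ArrayList<>(tables);
    }
}
```

Callers are forced by the compiler to handle each failure mode explicitly, which is the design choice the exception hierarchy below encodes.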
- Catalog.DatabaseAlreadyExistException - Exception in org.apache.flink.table.store.file.catalog
-
Exception for trying to create a database that already exists.
- Catalog.DatabaseNotEmptyException - Exception in org.apache.flink.table.store.file.catalog
-
Exception for trying to drop a database that is not empty.
- Catalog.DatabaseNotExistException - Exception in org.apache.flink.table.store.file.catalog
-
Exception for trying to operate on a database that doesn't exist.
- Catalog.TableAlreadyExistException - Exception in org.apache.flink.table.store.file.catalog
-
Exception for trying to create a table that already exists.
- Catalog.TableNotExistException - Exception in org.apache.flink.table.store.file.catalog
-
Exception for trying to operate on a table that doesn't exist.
- CatalogFactory - Interface in org.apache.flink.table.store.file.catalog
-
- CatalogLock - Interface in org.apache.flink.table.store.file.catalog
-
An interface that allows source and sink to use a global lock for transaction-related operations.
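The usual shape of such a lock is "run this action while holding the global lock". A hedged sketch of that pattern; a real CatalogLock coordinates across processes (e.g. through a metastore), whereas this stand-in uses a process-local ReentrantLock:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.locks.ReentrantLock;

// Illustrative stand-in for a catalog lock; not the real CatalogLock API.
class SimpleCatalogLock implements AutoCloseable {
    // Process-local substitute for a cross-process global lock.
    private static final ReentrantLock GLOBAL = new ReentrantLock();

    // Run the transaction-related action while holding the lock,
    // releasing it even if the action throws.
    <T> T runWithLock(Callable<T> action) throws Exception {
        GLOBAL.lock();
        try {
            return action.call();
        } finally {
            GLOBAL.unlock();
        }
    }

    @Override
    public void close() {
        // A real implementation would release external lock resources here.
    }
}
```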
- CatalogLock.Factory - Interface in org.apache.flink.table.store.file.catalog
-
- CatalogLockFactory(CatalogLock.Factory, ObjectPath) - Constructor for class org.apache.flink.table.store.file.operation.Lock.CatalogLockFactory
-
- CatalogOptions - Class in org.apache.flink.table.store
-
Catalog options for table store.
- changelog() - Method in class org.apache.flink.table.store.file.compact.CompactResult
-
- CHANGELOG_FILE_PREFIX - Static variable in class org.apache.flink.table.store.file.io.DataFilePathFactory
-
- CHANGELOG_PRODUCER - Static variable in class org.apache.flink.table.store.CoreOptions
-
- CHANGELOG_PRODUCER_FULL_COMPACTION_TRIGGER_INTERVAL - Static variable in class org.apache.flink.table.store.CoreOptions
-
- changelogFiles() - Method in class org.apache.flink.table.store.file.io.CompactIncrement
-
- changelogFiles() - Method in class org.apache.flink.table.store.file.io.NewFilesIncrement
-
- changelogManifestList() - Method in class org.apache.flink.table.store.file.Snapshot
-
- changelogProducer() - Method in class org.apache.flink.table.store.CoreOptions
-
- changelogProducerFullCompactionTriggerInterval() - Method in class org.apache.flink.table.store.CoreOptions
-
- ChangelogValueCountFileStoreTable - Class in org.apache.flink.table.store.table
-
- ChangelogWithKeyFileStoreTable - Class in org.apache.flink.table.store.table
-
- channelReaderInputViewIterator(AbstractChannelReaderInputView) - Method in class org.apache.flink.table.store.file.sort.BinaryExternalMerger
-
- checkAlterTableOption(String) - Static method in class org.apache.flink.table.store.file.schema.SchemaManager
-
- checkMaxSleep(HiveConf) - Static method in class org.apache.flink.table.store.hive.HiveCatalogLock
-
- checkNextIndexOffset() - Method in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
Check whether we need to request the next index memory segment.
- checkOutputSpecs(FileSystem, JobConf) - Method in class org.apache.flink.table.store.mapred.TableStoreOutputFormat
-
- checkPidPgrpidForMatch() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
Verify that the given process id is the same as its process group id.
- checkPidPgrpidForMatch(String, String) - Static method in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- checkpointId() - Method in class org.apache.flink.table.store.connector.sink.Committable
-
- children() - Method in class org.apache.flink.table.store.file.predicate.CompoundPredicate
-
- cleanUp() - Method in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeBenchmark
-
- clear() - Method in class org.apache.flink.table.store.file.mergetree.SortBufferWriteBuffer
-
- clear() - Method in interface org.apache.flink.table.store.file.mergetree.WriteBuffer
-
Removes all records from this buffer.
- clear() - Method in class org.apache.flink.table.store.file.sort.BinaryExternalSortBuffer
-
- clear() - Method in class org.apache.flink.table.store.file.sort.BinaryInMemorySortBuffer
-
We add a clear() method here instead of reset() to release all memory segments.
- clear() - Method in interface org.apache.flink.table.store.file.sort.SortBuffer
-
- Clock - Class in org.apache.flink.table.store.benchmark.metric.cpu.clock
-
A clock that gives access to time.
- Clock() - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.clock.Clock
-
- close() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.CpuMetricReceiver
-
- close() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.CpuMetricSender
-
- close() - Method in class org.apache.flink.table.store.benchmark.metric.FlinkRestClient
-
- close() - Method in class org.apache.flink.table.store.benchmark.metric.MetricReporter
-
- close() - Method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess
-
- close() - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- close() - Method in class org.apache.flink.table.store.connector.lookup.FileStoreLookupFunction
-
- close() - Method in class org.apache.flink.table.store.connector.lookup.RocksDBStateFactory
-
- close() - Method in class org.apache.flink.table.store.connector.sink.CommitterOperator
-
- close() - Method in class org.apache.flink.table.store.connector.sink.StoreCommitter
-
- close() - Method in class org.apache.flink.table.store.connector.sink.StoreSinkWriteImpl
-
- close() - Method in class org.apache.flink.table.store.connector.sink.StoreWriteOperator
-
- close() - Method in class org.apache.flink.table.store.connector.source.ContinuousFileSplitEnumerator
-
- close() - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceSplitReader
-
- close() - Method in class org.apache.flink.table.store.connector.source.StaticFileStoreSplitEnumerator
-
- close() - Method in class org.apache.flink.table.store.file.append.AppendOnlyWriter
-
- close() - Method in class org.apache.flink.table.store.file.catalog.FileSystemCatalog
-
- close() - Method in class org.apache.flink.table.store.file.io.KeyValueDataFileRecordReader
-
- close() - Method in class org.apache.flink.table.store.file.io.RollingFileWriter
-
- close() - Method in class org.apache.flink.table.store.file.io.RowDataFileRecordReader
-
- close() - Method in class org.apache.flink.table.store.file.io.SingleFileWriter
-
- close() - Method in class org.apache.flink.table.store.file.mergetree.compact.ConcatRecordReader
-
- close() - Method in class org.apache.flink.table.store.file.mergetree.compact.SortMergeReader
-
- close() - Method in class org.apache.flink.table.store.file.mergetree.DropDeleteReader
-
- close() - Method in class org.apache.flink.table.store.file.mergetree.MergeTreeWriter
-
- close() - Method in class org.apache.flink.table.store.file.operation.AbstractFileStoreWrite
-
- close() - Method in interface org.apache.flink.table.store.file.operation.FileStoreWrite
-
Close the writer.
- close() - Method in class org.apache.flink.table.store.file.operation.Lock.CatalogLockImpl
-
- close() - Method in class org.apache.flink.table.store.file.operation.Lock.EmptyLock
-
- close() - Method in class org.apache.flink.table.store.file.utils.AtomicFsDataOutputStream
-
Closes this stream.
- close() - Method in class org.apache.flink.table.store.file.utils.IteratorRecordReader
-
- close() - Method in interface org.apache.flink.table.store.file.utils.RecordReader
-
Closes the reader and should release all resources.
- close() - Method in class org.apache.flink.table.store.file.utils.RecordReaderIterator
-
- close() - Method in interface org.apache.flink.table.store.file.utils.RecordWriter
-
Close this writer; the call will delete newly generated but uncommitted files.
- close() - Method in class org.apache.flink.table.store.file.utils.RenamingAtomicFsDataOutputStream
-
- close() - Method in class org.apache.flink.table.store.hive.HiveCatalog
-
- close() - Method in class org.apache.flink.table.store.hive.HiveCatalogLock
-
- close() - Method in class org.apache.flink.table.store.mapred.TableStoreRecordReader
-
- close() - Method in class org.apache.flink.table.store.table.sink.TableCommit
-
- close() - Method in interface org.apache.flink.table.store.table.sink.TableWrite
-
- close() - Method in class org.apache.flink.table.store.table.sink.TableWriteImpl
-
- closeAndCommit() - Method in class org.apache.flink.table.store.file.utils.AtomicFsDataOutputStream
-
Closes the stream, ensuring persistence of all data (similar to FSDataOutputStream.sync()).
- closeAndCommit() - Method in class org.apache.flink.table.store.file.utils.RenamingAtomicFsDataOutputStream
-
- closed - Variable in class org.apache.flink.table.store.file.io.SingleFileWriter
-
- CodeGenerator - Interface in org.apache.flink.table.store.codegen
-
- CodeGeneratorImpl - Class in org.apache.flink.table.store.codegen
-
- CodeGeneratorImpl() - Constructor for class org.apache.flink.table.store.codegen.CodeGeneratorImpl
-
- CodeGenLoader - Class in org.apache.flink.table.store.codegen
-
Copied and modified from the flink-table-planner-loader module.
- CodeGenLoader() - Constructor for class org.apache.flink.table.store.codegen.CodeGenLoader
-
- CodeGenUtils - Class in org.apache.flink.table.store.codegen
-
Utils for code generation.
- CodeGenUtils() - Constructor for class org.apache.flink.table.store.codegen.CodeGenUtils
-
- collect(RowData) - Method in class org.apache.flink.table.store.format.FieldStatsCollector
-
Update the statistics with a new row.
- columnFamily - Variable in class org.apache.flink.table.store.connector.lookup.RocksDBState
-
- combine(long, List<Committable>) - Method in interface org.apache.flink.table.store.connector.sink.Committer
-
Compute an aggregated committable from a list of committables.
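Committer.combine folds the per-writer committables of one checkpoint into a single aggregated committable that can be committed atomically. A sketch of that reduction with hypothetical committable types (the real Committable and ManifestCommittable classes carry more state):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical per-writer committable: the new files one writer produced.
class FileCommittableSketch {
    final List<String> newFiles;
    FileCommittableSketch(List<String> newFiles) { this.newFiles = newFiles; }
}

// Hypothetical aggregated committable for one checkpoint.
class ManifestCommittableSketch {
    final long checkpointId;
    final List<String> allNewFiles = new ArrayList<>();
    ManifestCommittableSketch(long checkpointId) { this.checkpointId = checkpointId; }
}

class CombineSketch {
    // Aggregate every writer's committable for the checkpoint into one
    // object, mirroring the contract of combine(long, List<Committable>).
    static ManifestCommittableSketch combine(long checkpointId, List<FileCommittableSketch> committables) {
        ManifestCommittableSketch aggregated = new ManifestCommittableSketch(checkpointId);
        for (FileCommittableSketch c : committables) {
            aggregated.allNewFiles.addAll(c.newFiles);
        }
        return aggregated;
    }
}
```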
- combine(long, List<Committable>) - Method in class org.apache.flink.table.store.connector.sink.StoreCommitter
-
- comment() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- comment() - Method in class org.apache.flink.table.store.file.schema.UpdateSchema
-
- commit(List<ManifestCommittable>) - Method in interface org.apache.flink.table.store.connector.sink.Committer
-
- commit(List<ManifestCommittable>) - Method in class org.apache.flink.table.store.connector.sink.StoreCommitter
-
- commit(ManifestCommittable, Map<String, String>) - Method in interface org.apache.flink.table.store.file.operation.FileStoreCommit
-
Commit from manifest committable.
- commit(ManifestCommittable, Map<String, String>) - Method in class org.apache.flink.table.store.file.operation.FileStoreCommitImpl
-
- commit(long, List<FileCommittable>) - Method in class org.apache.flink.table.store.table.sink.TableCommit
-
- commit(List<ManifestCommittable>) - Method in class org.apache.flink.table.store.table.sink.TableCommit
-
- COMMIT_FORCE_COMPACT - Static variable in class org.apache.flink.table.store.CoreOptions
-
- commitChanges(List<SchemaChange>) - Method in class org.apache.flink.table.store.file.schema.SchemaManager
-
- commitCreateTable(Table) - Method in class org.apache.flink.table.store.hive.TableStoreHiveMetaHook
-
- commitDropTable(Table, boolean) - Method in class org.apache.flink.table.store.hive.TableStoreHiveMetaHook
-
- commitEarliestHint(long) - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- commitForceCompact() - Method in class org.apache.flink.table.store.CoreOptions
-
- commitIdentifier() - Method in class org.apache.flink.table.store.file.Snapshot
-
- commitKind() - Method in class org.apache.flink.table.store.file.Snapshot
-
- commitLatestHint(long) - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- commitNewVersion(UpdateSchema) - Method in class org.apache.flink.table.store.file.schema.SchemaManager
-
- Committable - Class in org.apache.flink.table.store.connector.sink
-
- Committable(long, Committable.Kind, Object) - Constructor for class org.apache.flink.table.store.connector.sink.Committable
-
- CommittableSerializer - Class in org.apache.flink.table.store.connector.sink
-
- CommittableSerializer(FileCommittableSerializer) - Constructor for class org.apache.flink.table.store.connector.sink.CommittableSerializer
-
- committablesPerCheckpoint - Variable in class org.apache.flink.table.store.connector.sink.CommitterOperator
-
Group the committables by checkpoint id.
- CommittableStateManager - Interface in org.apache.flink.table.store.connector.sink
-
- CommittableTypeInfo - Class in org.apache.flink.table.store.connector.sink
-
- CommittableTypeInfo() - Constructor for class org.apache.flink.table.store.connector.sink.CommittableTypeInfo
-
- Committer - Interface in org.apache.flink.table.store.connector.sink
-
- committer - Variable in class org.apache.flink.table.store.connector.sink.CommitterOperator
-
Aggregate committables to global committables and commit the global committables to the
external system.
- CommitterOperator - Class in org.apache.flink.table.store.connector.sink
-
- CommitterOperator(boolean, String, SerializableFunction<String, Committer>, CommittableStateManager) - Constructor for class org.apache.flink.table.store.connector.sink.CommitterOperator
-
- commitUser - Variable in class org.apache.flink.table.store.connector.sink.StoreSinkWriteImpl
-
- commitUser() - Method in class org.apache.flink.table.store.file.Snapshot
-
- COMMON_IO_FORK_JOIN_POOL - Static variable in class org.apache.flink.table.store.file.utils.FileUtils
-
- compact(Path) - Static method in class org.apache.flink.table.store.connector.action.FlinkActions
-
- compact(BinaryRowData, int, boolean) - Method in class org.apache.flink.table.store.connector.sink.FullChangelogStoreSinkWrite
-
- compact(BinaryRowData, int, boolean) - Method in class org.apache.flink.table.store.connector.sink.StoreSinkWriteImpl
-
- compact(boolean) - Method in class org.apache.flink.table.store.file.append.AppendOnlyWriter
-
- compact(boolean) - Method in class org.apache.flink.table.store.file.mergetree.MergeTreeWriter
-
- compact(BinaryRowData, int, boolean) - Method in class org.apache.flink.table.store.file.operation.AbstractFileStoreWrite
-
- compact(BinaryRowData, int, boolean) - Method in interface org.apache.flink.table.store.file.operation.FileStoreWrite
-
Compact data stored in the given partition and bucket.
- compact(boolean) - Method in interface org.apache.flink.table.store.file.utils.RecordWriter
-
Compact files related to the writer.
- compact(BinaryRowData, int, boolean) - Method in interface org.apache.flink.table.store.table.sink.TableWrite
-
- compact(BinaryRowData, int, boolean) - Method in class org.apache.flink.table.store.table.sink.TableWriteImpl
-
- CompactAction - Class in org.apache.flink.table.store.connector.action
-
Table compact action for Flink.
- compactAfter() - Method in class org.apache.flink.table.store.file.io.CompactIncrement
-
- compactBefore() - Method in class org.apache.flink.table.store.file.io.CompactIncrement
-
- compactedFiles - Variable in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeBenchmark
-
- CompactedStartingScanner - Class in org.apache.flink.table.store.table.source.snapshot
-
- CompactedStartingScanner() - Constructor for class org.apache.flink.table.store.table.source.snapshot.CompactedStartingScanner
-
- CompactFutureManager - Class in org.apache.flink.table.store.file.compact
-
Base implementation of CompactManager which runs compaction in a separate thread.
- CompactFutureManager() - Constructor for class org.apache.flink.table.store.file.compact.CompactFutureManager
-
- CompactIncrement - Class in org.apache.flink.table.store.file.io
-
Files changed before and after compaction, with changelog produced during compaction.
- CompactIncrement(List<DataFileMeta>, List<DataFileMeta>, List<DataFileMeta>) - Constructor for class org.apache.flink.table.store.file.io.CompactIncrement
-
- compactIncrement() - Method in interface org.apache.flink.table.store.file.utils.RecordWriter.CommitIncrement
-
- compactIncrement() - Method in class org.apache.flink.table.store.table.sink.FileCommittable
-
- COMPACTION_MAX_FILE_NUM - Static variable in class org.apache.flink.table.store.CoreOptions
-
- COMPACTION_MAX_SIZE_AMPLIFICATION_PERCENT - Static variable in class org.apache.flink.table.store.CoreOptions
-
- COMPACTION_MAX_SORTED_RUN_NUM - Static variable in class org.apache.flink.table.store.CoreOptions
-
- COMPACTION_MIN_FILE_NUM - Static variable in class org.apache.flink.table.store.CoreOptions
-
- COMPACTION_SIZE_RATIO - Static variable in class org.apache.flink.table.store.CoreOptions
-
- COMPACTION_STYLE - Static variable in class org.apache.flink.table.store.connector.RocksDBOptions
-
- CompactionChangelogFollowUpScanner - Class in org.apache.flink.table.store.table.source.snapshot
-
- CompactionChangelogFollowUpScanner() - Constructor for class org.apache.flink.table.store.table.source.snapshot.CompactionChangelogFollowUpScanner
-
- compactionMaxFileNum() - Method in class org.apache.flink.table.store.CoreOptions
-
- compactionMinFileNum() - Method in class org.apache.flink.table.store.CoreOptions
-
- CompactManager - Interface in org.apache.flink.table.store.file.compact
-
Manager to submit compaction task.
- CompactorSink - Class in org.apache.flink.table.store.connector.sink
-
- CompactorSink(FileStoreTable, Lock.Factory) - Constructor for class org.apache.flink.table.store.connector.sink.CompactorSink
-
- CompactorSinkBuilder - Class in org.apache.flink.table.store.connector.sink
-
- CompactorSinkBuilder(FileStoreTable) - Constructor for class org.apache.flink.table.store.connector.sink.CompactorSinkBuilder
-
- CompactorSourceBuilder - Class in org.apache.flink.table.store.connector.source
-
- CompactorSourceBuilder(String, FileStoreTable) - Constructor for class org.apache.flink.table.store.connector.source.CompactorSourceBuilder
-
- CompactResult - Class in org.apache.flink.table.store.file.compact
-
Result of compaction.
- CompactResult() - Constructor for class org.apache.flink.table.store.file.compact.CompactResult
-
- CompactResult(DataFileMeta, DataFileMeta) - Constructor for class org.apache.flink.table.store.file.compact.CompactResult
-
- CompactResult(List<DataFileMeta>, List<DataFileMeta>) - Constructor for class org.apache.flink.table.store.file.compact.CompactResult
-
- CompactResult(List<DataFileMeta>, List<DataFileMeta>, List<DataFileMeta>) - Constructor for class org.apache.flink.table.store.file.compact.CompactResult
-
- CompactRewriter - Interface in org.apache.flink.table.store.file.mergetree.compact
-
Rewrite sections into new files.
- CompactStrategy - Interface in org.apache.flink.table.store.file.mergetree.compact
-
Compact strategy to decide which files to select for compaction.
- CompactTask - Class in org.apache.flink.table.store.file.compact
-
Compact task.
- CompactTask(List<DataFileMeta>) - Constructor for class org.apache.flink.table.store.file.compact.CompactTask
-
- CompactUnit - Interface in org.apache.flink.table.store.file.compact
-
A unit of files for compaction.
- comparator - Variable in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeBenchmark
-
- compare(RowData, RowData) - Method in interface org.apache.flink.table.store.codegen.RecordComparator
-
- compare(int, int) - Method in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- compare(int, int, int, int) - Method in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- compare(Object, Object, LogicalTypeRoot) - Static method in class org.apache.flink.table.store.utils.RowDataUtils
-
- compareKey(MemorySegment, int, MemorySegment, int) - Method in interface org.apache.flink.table.store.codegen.NormalizedKeyComputer
-
Compares two normalized keys in their respective MemorySegments.
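Normalized keys are fixed-length, byte-order-preserving encodings, so compareKey can decide most orderings with a plain memory compare and no record deserialization. A sketch of the idea using byte arrays as stand-ins for Flink's MemorySegment (method names here are illustrative):

```java
// Sketch of normalized-key comparison: encode an int so that unsigned
// byte order agrees with signed int order, then compare byte-wise.
class NormalizedKeySketch {
    // Flip the sign bit and store big-endian, so unsigned byte order
    // matches the natural int ordering.
    static void putNormalizedInt(byte[] seg, int offset, int key) {
        int flipped = key ^ Integer.MIN_VALUE;
        seg[offset] = (byte) (flipped >>> 24);
        seg[offset + 1] = (byte) (flipped >>> 16);
        seg[offset + 2] = (byte) (flipped >>> 8);
        seg[offset + 3] = (byte) flipped;
    }

    // Byte-wise unsigned comparison of two 4-byte normalized keys,
    // analogous to what compareKey does on MemorySegments.
    static int compareKey(byte[] segI, int offsetI, byte[] segJ, int offsetJ) {
        for (int k = 0; k < 4; k++) {
            int a = segI[offsetI + k] & 0xFF;
            int b = segJ[offsetJ + k] & 0xFF;
            if (a != b) {
                return a - b;
            }
        }
        return 0;
    }
}
```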
- compareLiteral(LogicalType, Object, Object) - Static method in class org.apache.flink.table.store.file.predicate.CompareUtils
-
- CompareUtils - Class in org.apache.flink.table.store.file.predicate
-
Utils for comparing literals.
- compile(ClassLoader, String, String) - Static method in class org.apache.flink.table.store.codegen.CompileUtils
-
Compiles generated code into a Class.
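The contract here is "source string in, loaded Class out". A hedged stand-in: the real CompileUtils compiles in memory, while this sketch writes the source to a temp directory, invokes the system Java compiler, and loads the result:

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

// Illustrative stand-in for compile(ClassLoader, String, String);
// not the actual CompileUtils implementation.
class CompileSketch {
    static Class<?> compile(ClassLoader parent, String className, String code) throws Exception {
        Path dir = Files.createTempDirectory("codegen");
        Path src = dir.resolve(className + ".java");
        Files.write(src, code.getBytes(StandardCharsets.UTF_8));
        // Requires a JDK; on a plain JRE the system compiler is null.
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        if (javac == null || javac.run(null, null, null, "-d", dir.toString(), src.toString()) != 0) {
            throw new IllegalStateException("compilation failed for " + className);
        }
        // Load the freshly compiled class from the temp directory.
        URLClassLoader loader = new URLClassLoader(new URL[] {dir.toUri().toURL()}, parent);
        return Class.forName(className, true, loader);
    }
}
```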
- compile(ClassLoader) - Method in class org.apache.flink.table.store.codegen.GeneratedClass
-
Compiles the generated code; the compiled class will be cached in the GeneratedClass.
- CompileUtils - Class in org.apache.flink.table.store.codegen
-
Utilities to compile generated code into a Class.
- CompileUtils() - Constructor for class org.apache.flink.table.store.codegen.CompileUtils
-
- complement(int) - Method in class org.apache.flink.table.store.utils.Projection
-
Complement this projection.
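Complementing a projection means keeping exactly the top-level field indices the projection does not select. A small sketch of that computation (hypothetical helper, not the real Projection class):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Illustrative complement of a top-level field projection.
class ProjectionSketch {
    // Given the selected field indices and the total field count,
    // return the indices that were not selected, in order.
    static int[] complement(int[] projectedFields, int fieldCount) {
        Set<Integer> selected = new HashSet<>();
        for (int f : projectedFields) {
            selected.add(f);
        }
        List<Integer> rest = new ArrayList<>();
        for (int i = 0; i < fieldCount; i++) {
            if (!selected.contains(i)) {
                rest.add(i);
            }
        }
        return rest.stream().mapToInt(Integer::intValue).toArray();
    }
}
```

For example, the complement of a projection selecting fields {0, 2} of a 4-field row is {1, 3}.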
- complement(DataType) - Method in class org.apache.flink.table.store.utils.Projection
-
- ComponentClassLoader - Class in org.apache.flink.table.store.plugin
-
A URLClassLoader that restricts which classes can be loaded to those contained within the given classpath, except for classes from a given set of packages that are loaded either owner-first or component-first.
- ComponentClassLoader(URL[], ClassLoader, String[], String[]) - Constructor for class org.apache.flink.table.store.plugin.ComponentClassLoader
-
- CompoundPredicate - Class in org.apache.flink.table.store.file.predicate
-
- CompoundPredicate(CompoundPredicate.Function, List<Predicate>) - Constructor for class org.apache.flink.table.store.file.predicate.CompoundPredicate
-
- CompoundPredicate.Function - Class in org.apache.flink.table.store.file.predicate
-
Evaluate the predicate result based on multiple Predicates.
- COMPRESSION_TYPE - Static variable in class org.apache.flink.table.store.connector.RocksDBOptions
-
- ConcatRecordReader<T> - Class in org.apache.flink.table.store.file.mergetree.compact
-
This reader concatenates a list of RecordReaders and reads them sequentially.
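Sequential concatenation means draining one delegate reader fully before opening the next. A sketch of the idea using plain iterators in place of RecordReaders (hypothetical class, not the real ConcatRecordReader):

```java
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;

// Concatenate delegate iterators and read them one after another,
// the pattern ConcatRecordReader applies to RecordReaders.
class ConcatIterator<T> implements Iterator<T> {
    private final Iterator<Iterator<T>> delegates;
    private Iterator<T> current = Collections.emptyIterator();

    ConcatIterator(List<Iterator<T>> parts) {
        this.delegates = parts.iterator();
    }

    @Override
    public boolean hasNext() {
        // Advance to the next non-empty delegate lazily, skipping empty ones.
        while (!current.hasNext() && delegates.hasNext()) {
            current = delegates.next();
        }
        return current.hasNext();
    }

    @Override
    public T next() {
        if (!hasNext()) {
            throw new NoSuchElementException();
        }
        return current.next();
    }
}
```

The real reader additionally defers *creating* each delegate via ConcatRecordReader.ReaderSupplier, so a reader is only opened when its turn comes.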
- ConcatRecordReader(List<ConcatRecordReader.ReaderSupplier<T>>) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.ConcatRecordReader
-
- ConcatRecordReader.ReaderSupplier<T> - Interface in org.apache.flink.table.store.file.mergetree.compact
-
- conf() - Method in class org.apache.flink.table.store.hive.SerializableHiveConf
-
- ConfigOptionsDocGenerator - Class in org.apache.flink.table.store.docs.configuration
-
Class used for generating code-based documentation of configuration parameters.
- configureInputJobCredentials(TableDesc, Map<String, String>) - Method in class org.apache.flink.table.store.hive.TableStoreHiveStorageHandler
-
- configureInputJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.flink.table.store.hive.TableStoreHiveStorageHandler
-
- configureInputJobProperties(Configuration, Properties, Map<String, String>) - Static method in class org.apache.flink.table.store.TableStoreJobConf
-
- configureJobConf(TableDesc, JobConf) - Method in class org.apache.flink.table.store.hive.TableStoreHiveStorageHandler
-
- configureOutputJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.flink.table.store.hive.TableStoreHiveStorageHandler
-
- configureTableJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.flink.table.store.hive.TableStoreHiveStorageHandler
-
- ConfigUtil - Class in org.apache.flink.table.store.benchmark.config
-
Config utils to load benchmark config from yaml on the classpath; the main class is copied from ConfigUtil
- ConfigUtil() - Constructor for class org.apache.flink.table.store.benchmark.config.ConfigUtil
-
- consumeDataStream(ProviderContext, DataStream<RowData>) - Method in class org.apache.flink.table.store.connector.TableStoreDataStreamSinkProvider
-
- contains(String) - Method in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
Returns a boolean indicating whether the given pid is in the process tree.
- containsFields(Predicate, Set<String>) - Static method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- CONTINUOUS_DISCOVERY_INTERVAL - Static variable in class org.apache.flink.table.store.CoreOptions
-
- ContinuousCompactorFollowUpScanner - Class in org.apache.flink.table.store.table.source.snapshot
-
FollowUpScanner used internally for stand-alone streaming compact job sources.
- ContinuousCompactorFollowUpScanner() - Constructor for class org.apache.flink.table.store.table.source.snapshot.ContinuousCompactorFollowUpScanner
-
- ContinuousCompactorStartingScanner - Class in org.apache.flink.table.store.table.source.snapshot
-
StartingScanner used internally for stand-alone streaming compact job sources.
- ContinuousCompactorStartingScanner() - Constructor for class org.apache.flink.table.store.table.source.snapshot.ContinuousCompactorStartingScanner
-
- ContinuousDataFileSnapshotEnumerator - Class in org.apache.flink.table.store.table.source.snapshot
-
- ContinuousDataFileSnapshotEnumerator(Path, DataTableScan, StartingScanner, FollowUpScanner, Long) - Constructor for class org.apache.flink.table.store.table.source.snapshot.ContinuousDataFileSnapshotEnumerator
-
- ContinuousDataFileSnapshotEnumerator.Factory - Interface in org.apache.flink.table.store.table.source.snapshot
-
- continuousDiscoveryInterval() - Method in class org.apache.flink.table.store.CoreOptions
-
- ContinuousFileSplitEnumerator - Class in org.apache.flink.table.store.connector.source
-
A continuously monitoring enumerator.
- ContinuousFileSplitEnumerator(SplitEnumeratorContext<FileStoreSourceSplit>, Collection<FileStoreSourceSplit>, Long, long, SnapshotEnumerator) - Constructor for class org.apache.flink.table.store.connector.source.ContinuousFileSplitEnumerator
-
- ContinuousFileStoreSource - Class in org.apache.flink.table.store.connector.source
-
- ContinuousFileStoreSource(DataTable, int[][], Predicate, Long) - Constructor for class org.apache.flink.table.store.connector.source.ContinuousFileStoreSource
-
- ContinuousFileStoreSource(DataTable, int[][], Predicate, Long, ContinuousDataFileSnapshotEnumerator.Factory) - Constructor for class org.apache.flink.table.store.connector.source.ContinuousFileStoreSource
-
- ContinuousFromSnapshotStartingScanner - Class in org.apache.flink.table.store.table.source.snapshot
-
- ContinuousFromSnapshotStartingScanner(long) - Constructor for class org.apache.flink.table.store.table.source.snapshot.ContinuousFromSnapshotStartingScanner
-
- ContinuousFromTimestampStartingScanner - Class in org.apache.flink.table.store.table.source.snapshot
-
- ContinuousFromTimestampStartingScanner(long) - Constructor for class org.apache.flink.table.store.table.source.snapshot.ContinuousFromTimestampStartingScanner
-
- ContinuousLatestStartingScanner - Class in org.apache.flink.table.store.table.source.snapshot
-
- ContinuousLatestStartingScanner() - Constructor for class org.apache.flink.table.store.table.source.snapshot.ContinuousLatestStartingScanner
-
- convert(RowType, ResolvedExpression) - Static method in class org.apache.flink.table.store.file.predicate.PredicateConverter
-
Tries its best to convert a ResolvedExpression to a Predicate.
- convert() - Method in class org.apache.flink.table.store.SearchArgumentToPredicateConverter
-
- convert(Map<String, String>) - Static method in class org.apache.flink.table.store.spark.SparkCaseSensitiveConverter
-
- convert(Filter) - Method in class org.apache.flink.table.store.spark.SparkFilterConverter
-
- convert(RowData) - Method in class org.apache.flink.table.store.table.sink.SinkRecordConverter
-
- convert(RowData) - Method in class org.apache.flink.table.store.utils.RowDataToObjectArrayConverter
-
- convertFrom(int, RowData) - Method in class org.apache.flink.table.store.file.manifest.ManifestEntrySerializer
-
- convertFrom(int, RowData) - Method in class org.apache.flink.table.store.file.manifest.ManifestFileMetaSerializer
-
- convertFrom(int, RowData) - Method in class org.apache.flink.table.store.file.utils.VersionedObjectSerializer
-
- convertJavaObject(LogicalType, Object) - Static method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- convertTo(ManifestEntry) - Method in class org.apache.flink.table.store.file.manifest.ManifestEntrySerializer
-
- convertTo(ManifestFileMeta) - Method in class org.apache.flink.table.store.file.manifest.ManifestFileMetaSerializer
-
- convertTo(T) - Method in class org.apache.flink.table.store.file.utils.VersionedObjectSerializer
-
- convertToLogSinkRecord(SinkRecord) - Method in class org.apache.flink.table.store.table.sink.SinkRecordConverter
-
- copy() - Method in class org.apache.flink.table.store.connector.sink.BucketStreamPartitioner
-
- copy() - Method in class org.apache.flink.table.store.connector.sink.OffsetRowDataHashStreamPartitioner
-
- copy() - Method in class org.apache.flink.table.store.connector.sink.TableStoreSink
-
- copy() - Method in class org.apache.flink.table.store.connector.source.SystemTableSource
-
- copy() - Method in class org.apache.flink.table.store.connector.source.TableStoreSource
-
- copy(Map<String, String>) - Method in class org.apache.flink.table.store.connector.SystemCatalogTable
-
- copy() - Method in class org.apache.flink.table.store.connector.SystemCatalogTable
-
- copy(List<String>) - Method in class org.apache.flink.table.store.file.io.DataFileMeta
-
- copy(RowDataSerializer, RowDataSerializer) - Method in class org.apache.flink.table.store.file.KeyValue
-
- copy(boolean) - Method in class org.apache.flink.table.store.file.schema.ArrayDataType
-
- copy(boolean) - Method in class org.apache.flink.table.store.file.schema.AtomicDataType
-
- copy(boolean) - Method in class org.apache.flink.table.store.file.schema.DataType
-
Returns a copy of this data type with possibly different nullability.
- copy(boolean) - Method in class org.apache.flink.table.store.file.schema.MapDataType
-
- copy(boolean) - Method in class org.apache.flink.table.store.file.schema.MultisetDataType
-
- copy(boolean) - Method in class org.apache.flink.table.store.file.schema.RowDataType
-
- copy(Map<String, String>) - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- copy() - Method in class org.apache.flink.table.store.spark.SparkArrayData
-
- copy() - Method in class org.apache.flink.table.store.spark.SparkInternalRow
-
- copy(TableSchema) - Method in class org.apache.flink.table.store.table.AbstractFileStoreTable
-
- copy(Map<String, String>) - Method in class org.apache.flink.table.store.table.AbstractFileStoreTable
-
- copy(TableSchema) - Method in class org.apache.flink.table.store.table.AppendOnlyFileStoreTable
-
- copy(TableSchema) - Method in class org.apache.flink.table.store.table.ChangelogValueCountFileStoreTable
-
- copy(TableSchema) - Method in class org.apache.flink.table.store.table.ChangelogWithKeyFileStoreTable
-
- copy(Map<String, String>) - Method in interface org.apache.flink.table.store.table.FileStoreTable
-
- copy(Map<String, String>) - Method in class org.apache.flink.table.store.table.system.AuditLogTable
-
- copy(Map<String, String>) - Method in class org.apache.flink.table.store.table.system.BucketsTable
-
- copy(Map<String, String>) - Method in class org.apache.flink.table.store.table.system.OptionsTable
-
- copy(Map<String, String>) - Method in class org.apache.flink.table.store.table.system.SchemasTable
-
- copy(Map<String, String>) - Method in class org.apache.flink.table.store.table.system.SnapshotsTable
-
- copy(Map<String, String>) - Method in interface org.apache.flink.table.store.table.Table
-
- copy(Object, LogicalType) - Static method in class org.apache.flink.table.store.utils.RowDataUtils
-
- copyArray(ArrayData, LogicalType) - Static method in class org.apache.flink.table.store.utils.RowDataUtils
-
- copyObject(Object) - Method in class org.apache.flink.table.store.hive.objectinspector.TableStoreCharObjectInspector
-
- copyObject(Object) - Method in class org.apache.flink.table.store.hive.objectinspector.TableStoreDateObjectInspector
-
- copyObject(Object) - Method in class org.apache.flink.table.store.hive.objectinspector.TableStoreDecimalObjectInspector
-
- copyObject(Object) - Method in class org.apache.flink.table.store.hive.objectinspector.TableStoreStringObjectInspector
-
- copyObject(Object) - Method in class org.apache.flink.table.store.hive.objectinspector.TableStoreTimestampObjectInspector
-
- copyObject(Object) - Method in class org.apache.flink.table.store.hive.objectinspector.TableStoreVarcharObjectInspector
-
- copyRowData(RowData, RowType) - Static method in class org.apache.flink.table.store.utils.RowDataUtils
-
- CoreOptions - Class in org.apache.flink.table.store
-
Core options for table store.
- CoreOptions(Map<String, String>) - Constructor for class org.apache.flink.table.store.CoreOptions
-
- CoreOptions(Configuration) - Constructor for class org.apache.flink.table.store.CoreOptions
-
- CoreOptions.ChangelogProducer - Enum in org.apache.flink.table.store
-
Specifies the changelog producer for the table.
- CoreOptions.Immutable - Annotation Type in org.apache.flink.table.store
-
Annotation used on ConfigOption fields to exclude them from schema changes.
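The Immutable annotation described above follows the common marker-annotation pattern: tagged option fields are filtered out when collecting options that are allowed to change. A hedged sketch of that pattern, with illustrative names rather than the actual CoreOptions implementation:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

// Sketch of the marker-annotation pattern behind CoreOptions.Immutable:
// fields tagged @Immutable are excluded when gathering options that a
// schema change may modify. Class and field names are illustrative only.
public class ImmutableOptionDemo {
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    @interface Immutable {}

    static class Options {
        @Immutable
        static final String BUCKET = "bucket";         // excluded from schema change
        static final String WRITE_MODE = "write-mode"; // mutable option
    }

    /** Collects the names of fields NOT marked @Immutable. */
    public static List<String> mutableOptionFields(Class<?> clazz) {
        List<String> names = new ArrayList<>();
        for (Field f : clazz.getDeclaredFields()) {
            if (!f.isAnnotationPresent(Immutable.class)) {
                names.add(f.getName());
            }
        }
        return names;
    }

    public static void main(String[] args) {
        List<String> mutable = mutableOptionFields(Options.class);
        if (mutable.contains("BUCKET") || !mutable.contains("WRITE_MODE")) {
            throw new AssertionError("filtering by @Immutable failed");
        }
        System.out.println(mutable);
    }
}
```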
- CoreOptions.LogChangelogMode - Enum in org.apache.flink.table.store
-
Specifies the log changelog mode for the table.
- CoreOptions.LogConsistency - Enum in org.apache.flink.table.store
-
Specifies the log consistency mode for the table.
- CoreOptions.MergeEngine - Enum in org.apache.flink.table.store
-
Specifies the merge engine for tables with primary keys.
- CoreOptions.StartupMode - Enum in org.apache.flink.table.store
-
Specifies the startup mode for the log consumer.
- countPerBatch - Variable in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeBenchmark
-
- CpuMetric - Class in org.apache.flink.table.store.benchmark.metric.cpu
-
- CpuMetric(String, int, double) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.CpuMetric
-
- CpuMetricReceiver - Class in org.apache.flink.table.store.benchmark.metric.cpu
-
- CpuMetricReceiver(String, int) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.CpuMetricReceiver
-
- CpuMetricSender - Class in org.apache.flink.table.store.benchmark.metric.cpu
-
- CpuMetricSender(String, int, Duration) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.CpuMetricSender
-
- CpuTimeTracker - Class in org.apache.flink.table.store.benchmark.metric.cpu
-
Utility for sampling and computing CPU usage.
- CpuTimeTracker(long) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.CpuTimeTracker
-
- create(String...) - Static method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess
-
- create(String[]) - Static method in class org.apache.flink.table.store.connector.action.Action.Factory
-
- create(String[]) - Static method in class org.apache.flink.table.store.connector.action.CompactAction
-
- create(String[]) - Static method in class org.apache.flink.table.store.connector.action.DropPartitionAction
-
- create(RocksDBStateFactory, RowType, List<String>, List<String>, Predicate<RowData>, long) - Static method in interface org.apache.flink.table.store.connector.lookup.LookupTable
-
- create(HybridSource.SourceSwitchContext<StaticFileStoreSplitEnumerator>) - Method in class org.apache.flink.table.store.connector.source.LogHybridSourceFactory
-
- create(String, Configuration) - Method in interface org.apache.flink.table.store.file.catalog.CatalogFactory
-
- create() - Method in interface org.apache.flink.table.store.file.catalog.CatalogLock.Factory
-
- create(String, Configuration) - Method in class org.apache.flink.table.store.file.catalog.FileSystemCatalogFactory
-
- create() - Method in class org.apache.flink.table.store.file.manifest.ManifestFile.Factory
-
- create() - Method in class org.apache.flink.table.store.file.manifest.ManifestList.Factory
-
- create(List<ConcatRecordReader.ReaderSupplier<R>>) - Static method in class org.apache.flink.table.store.file.mergetree.compact.ConcatRecordReader
-
- create() - Method in interface org.apache.flink.table.store.file.mergetree.compact.MergeFunctionFactory
-
- create(int[][]) - Method in interface org.apache.flink.table.store.file.mergetree.compact.MergeFunctionFactory
-
- create() - Method in class org.apache.flink.table.store.file.operation.Lock.CatalogLockFactory
-
- create() - Method in class org.apache.flink.table.store.file.operation.Lock.EmptyFactory
-
- create() - Method in interface org.apache.flink.table.store.file.operation.Lock.Factory
-
- create(Predicate, RowType) - Static method in class org.apache.flink.table.store.file.predicate.BucketSelector
-
- create(FileSystem) - Static method in interface org.apache.flink.table.store.file.utils.AtomicFileWriter
-
- create(Configuration) - Method in class org.apache.flink.table.store.format.avro.AvroFileFormatFactory
-
- create(Configuration) - Method in interface org.apache.flink.table.store.format.FileFormatFactory
-
- create(Path, FsPermission, boolean, int, short, long, Progressable) - Method in class org.apache.flink.table.store.format.fs.HadoopReadOnlyFileSystem
-
- create(Configuration) - Method in class org.apache.flink.table.store.format.orc.OrcFileFormatFactory
-
- create(Configuration, RowType, int[], List<OrcFilters.Predicate>) - Static method in class org.apache.flink.table.store.format.orc.OrcInputFormatFactory
-
- create(Configuration) - Method in class org.apache.flink.table.store.format.parquet.ParquetFileFormatFactory
-
- create(Configuration, RowType, TypeInformation<RowData>, boolean) - Static method in class org.apache.flink.table.store.format.parquet.ParquetInputFormatFactory
-
- create(String, Configuration) - Method in class org.apache.flink.table.store.hive.HiveCatalogFactory
-
- create(LogicalType) - Static method in class org.apache.flink.table.store.hive.objectinspector.TableStoreObjectInspectorFactory
-
- create(Path) - Static method in class org.apache.flink.table.store.table.FileStoreTableFactory
-
- create(Configuration) - Static method in class org.apache.flink.table.store.table.FileStoreTableFactory
-
- create(Path, TableSchema) - Static method in class org.apache.flink.table.store.table.FileStoreTableFactory
-
- create(Path, TableSchema, Configuration) - Static method in class org.apache.flink.table.store.table.FileStoreTableFactory
-
- create(DataTable, DataTableScan, Long) - Static method in class org.apache.flink.table.store.table.source.snapshot.ContinuousDataFileSnapshotEnumerator
-
- create(DataTable, DataTableScan, Long) - Method in interface org.apache.flink.table.store.table.source.snapshot.ContinuousDataFileSnapshotEnumerator.Factory
-
- create(DataTable, DataTableScan) - Static method in class org.apache.flink.table.store.table.source.snapshot.StaticDataFileSnapshotEnumerator
-
- create(DataTable, DataTableScan) - Method in interface org.apache.flink.table.store.table.source.snapshot.StaticDataFileSnapshotEnumerator.Factory
-
- createBatchWrapper(TypeDescription, int) - Method in class org.apache.flink.table.store.format.orc.OrcShimImpl
-
- createBuffer(NormalizedKeyComputer, AbstractRowDataSerializer<RowData>, RecordComparator, MemorySegmentPool) - Static method in class org.apache.flink.table.store.file.sort.BinaryInMemorySortBuffer
-
Create an in-memory sorter that works in `insert` mode.
- createCatalog(CatalogFactory.Context) - Method in class org.apache.flink.table.store.connector.FlinkCatalogFactory
-
- createCatalog(String, Configuration, ClassLoader) - Static method in class org.apache.flink.table.store.connector.FlinkCatalogFactory
-
- createCatalog(Configuration) - Static method in interface org.apache.flink.table.store.file.catalog.CatalogFactory
-
If the ClassLoader is not specified, the context ClassLoader of the current thread is used as the default.
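The ClassLoader default described for CatalogFactory.createCatalog is a common overload-delegation idiom: the short overload falls back to the current thread's context ClassLoader and delegates to the full one. A sketch under that assumption, with illustrative method bodies rather than the real factory logic:

```java
// Sketch of the "default to the current thread's context ClassLoader"
// convention described for CatalogFactory.createCatalog: the single-argument
// variant delegates to the two-argument one. The return value is the
// ClassLoader itself only so the delegation is observable here.
public class ContextClassLoaderDemo {
    public static ClassLoader createCatalog() {
        // no ClassLoader given: fall back to the context ClassLoader
        return createCatalog(Thread.currentThread().getContextClassLoader());
    }

    public static ClassLoader createCatalog(ClassLoader classLoader) {
        // the real method would discover a catalog implementation via this
        // loader; here we just hand it back
        return classLoader;
    }

    public static void main(String[] args) {
        if (createCatalog() != Thread.currentThread().getContextClassLoader()) {
            throw new AssertionError("delegation should use the context ClassLoader");
        }
        System.out.println("ok");
    }
}
```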
- createCatalog(Configuration, ClassLoader) - Static method in interface org.apache.flink.table.store.file.catalog.CatalogFactory
-
- createColumnOptions(ColumnFamilyOptions, Configuration) - Static method in class org.apache.flink.table.store.connector.RocksDBOptions
-
- createCommittableStateManager() - Method in class org.apache.flink.table.store.connector.sink.CompactorSink
-
- createCommittableStateManager() - Method in class org.apache.flink.table.store.connector.sink.FileStoreSink
-
- createCommittableStateManager() - Method in class org.apache.flink.table.store.connector.sink.FlinkSink
-
- createCommitterFactory(boolean) - Method in class org.apache.flink.table.store.connector.sink.CompactorSink
-
- createCommitterFactory(boolean) - Method in class org.apache.flink.table.store.connector.sink.FileStoreSink
-
- createCommitterFactory(boolean) - Method in class org.apache.flink.table.store.connector.sink.FlinkSink
-
- createConverter() - Method in class org.apache.flink.table.store.format.avro.AbstractAvroBulkFormat
-
- createDatabase(String, CatalogDatabase, boolean) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- createDatabase(String, boolean) - Method in interface org.apache.flink.table.store.file.catalog.Catalog
-
Create a database.
- createDatabase(String, boolean) - Method in class org.apache.flink.table.store.file.catalog.FileSystemCatalog
-
- createDatabase(String, boolean) - Method in class org.apache.flink.table.store.hive.HiveCatalog
-
- createDataFilePathFactory(BinaryRowData, int) - Method in class org.apache.flink.table.store.file.utils.FileStorePathFactory
-
- createDataFilters(List<DataField>, List<DataField>, List<Predicate>) - Static method in class org.apache.flink.table.store.file.schema.SchemaEvolutionUtil
-
Create predicate list from data fields.
- createDataProjection(List<DataField>, List<DataField>, int[][]) - Static method in class org.apache.flink.table.store.file.schema.SchemaEvolutionUtil
-
Create data projection from table projection.
- createDBOptions(DBOptions, Configuration) - Static method in class org.apache.flink.table.store.connector.RocksDBOptions
-
- createDynamicTableSink(DynamicTableFactory.Context) - Method in class org.apache.flink.table.store.connector.AbstractTableStoreFactory
-
- createDynamicTableSink(DynamicTableFactory.Context) - Method in class org.apache.flink.table.store.connector.TableStoreConnectorFactory
-
- createDynamicTableSource(DynamicTableFactory.Context) - Method in class org.apache.flink.table.store.connector.AbstractTableStoreFactory
-
- createDynamicTableSource(DynamicTableFactory.Context) - Method in class org.apache.flink.table.store.connector.TableStoreConnectorFactory
-
- createEmptyWriterContainer(BinaryRowData, int, ExecutorService) - Method in class org.apache.flink.table.store.file.operation.AbstractFileStoreWrite
-
- createEmptyWriterContainer(BinaryRowData, int, ExecutorService) - Method in class org.apache.flink.table.store.file.operation.AppendOnlyFileStoreWrite
-
- createEmptyWriterContainer(BinaryRowData, int, ExecutorService) - Method in class org.apache.flink.table.store.file.operation.KeyValueFileStoreWrite
-
- createEnumerator(SplitEnumeratorContext<FileStoreSourceSplit>) - Method in class org.apache.flink.table.store.connector.source.FlinkSource
-
- createFactory(HiveConf, String) - Static method in class org.apache.flink.table.store.hive.HiveCatalogLock
-
Create a Hive lock factory.
- createFieldGetters(List<LogicalType>) - Static method in class org.apache.flink.table.store.utils.RowDataUtils
-
- createFileDataDir(Configuration) - Static method in class org.apache.flink.table.store.benchmark.config.ConfigUtil
-
Create the file data directory from the given configuration.
- createFormatReader(BulkFormat<RowData, FileSourceSplit>, Path) - Static method in class org.apache.flink.table.store.file.utils.FileUtils
-
- createFunction(ObjectPath, CatalogFunction, boolean) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- createIndexMapping(List<DataField>, List<DataField>) - Static method in class org.apache.flink.table.store.file.schema.SchemaEvolutionUtil
-
Create index mapping from table fields to underlying data fields.
- createIndexMapping(int[], List<DataField>, int[], List<DataField>) - Static method in class org.apache.flink.table.store.file.schema.SchemaEvolutionUtil
-
Create index mapping from table projection to underlying data projection.
- createIndexMapping(int[], List<DataField>, List<DataField>, int[], List<DataField>, List<DataField>) - Static method in class org.apache.flink.table.store.file.schema.SchemaEvolutionUtil
-
Create index mapping from table projection to data with key and value fields.
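The index mappings built by SchemaEvolutionUtil pair fields by field id, not by name or position, so renamed and added columns stay readable in old data files. A simplified sketch of the core idea (flat field ids only; the real methods work on DataField lists and nested projections):

```java
import java.util.Arrays;

// Sketch of the index-mapping idea in SchemaEvolutionUtil: for each field in
// the (possibly evolved) table schema, record the position of the field with
// the same id in the underlying data schema, or -1 if the data file predates
// that field. Matching by id rather than name makes renames safe.
public class IndexMappingDemo {
    public static int[] createIndexMapping(int[] tableFieldIds, int[] dataFieldIds) {
        int[] mapping = new int[tableFieldIds.length];
        for (int i = 0; i < tableFieldIds.length; i++) {
            mapping[i] = -1;
            for (int j = 0; j < dataFieldIds.length; j++) {
                if (dataFieldIds[j] == tableFieldIds[i]) {
                    mapping[i] = j;
                    break;
                }
            }
        }
        return mapping;
    }

    public static void main(String[] args) {
        // table schema has field ids {0, 2, 3}; an old data file only has
        // ids {0, 1, 2}, so field id 3 was added later and maps to -1
        int[] mapping = createIndexMapping(new int[] {0, 2, 3}, new int[] {0, 1, 2});
        if (!Arrays.equals(mapping, new int[] {0, 2, -1})) {
            throw new AssertionError("unexpected mapping");
        }
        System.out.println(Arrays.toString(mapping));
    }
}
```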
- createKey() - Method in class org.apache.flink.table.store.mapred.TableStoreRecordReader
-
- createKeyValueFields(List<DataField>, List<DataField>, int) - Static method in class org.apache.flink.table.store.file.KeyValue
-
Create key-value fields; a constant is added to the id of each value field so that the ids remain consistent when compared by field id.
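The constant-offset trick mentioned for KeyValue.createKeyValueFields keeps key-field ids and value-field ids from colliding. A sketch of that scheme; the offset value and method names here are hypothetical, not the table store's actual constants:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;

// Sketch of the field-id convention described for createKeyValueFields:
// value-field ids are shifted by a constant so that key ids and value ids
// never overlap and comparisons by field id stay stable.
public class KeyValueFieldIdDemo {
    static final int VALUE_ID_OFFSET = 1000; // hypothetical constant

    public static List<Integer> keyValueFieldIds(int[] keyIds, int[] valueIds) {
        List<Integer> ids = new ArrayList<>();
        for (int id : keyIds) {
            ids.add(id);
        }
        for (int id : valueIds) {
            ids.add(id + VALUE_ID_OFFSET);
        }
        return ids;
    }

    public static void main(String[] args) {
        List<Integer> ids = keyValueFieldIds(new int[] {0, 1}, new int[] {0, 1, 2});
        // all ids are distinct even though key and value fields both use 0 and 1
        if (new HashSet<>(ids).size() != ids.size()) {
            throw new AssertionError("field ids collide");
        }
        System.out.println(ids);
    }
}
```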
- createNamespace(String[], Map<String, String>) - Method in class org.apache.flink.table.store.spark.SparkCatalog
-
- createNullCheckingFieldGetter(LogicalType, int) - Static method in class org.apache.flink.table.store.utils.RowDataUtils
-
- createPartition(ObjectPath, CatalogPartitionSpec, CatalogPartition, boolean) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- createReader(SourceReaderContext) - Method in class org.apache.flink.table.store.connector.source.FlinkSource
-
- createReader(DataSplit) - Method in class org.apache.flink.table.store.file.operation.AppendOnlyFileStoreRead
-
- createReader(DataSplit) - Method in interface org.apache.flink.table.store.file.operation.FileStoreRead
-
- createReader(DataSplit) - Method in class org.apache.flink.table.store.file.operation.KeyValueFileStoreRead
-
- createReader(Configuration, SplitT) - Method in class org.apache.flink.table.store.format.avro.AbstractAvroBulkFormat
-
- createReader(Configuration, Path) - Static method in class org.apache.flink.table.store.format.orc.OrcShimImpl
-
- createReader(InputPartition) - Method in class org.apache.flink.table.store.spark.SparkReaderFactory
-
- createReader(Split) - Method in class org.apache.flink.table.store.table.source.KeyValueTableRead
-
- createReader(Split) - Method in interface org.apache.flink.table.store.table.source.TableRead
-
- createReader(List<Split>) - Method in interface org.apache.flink.table.store.table.source.TableRead
-
- createReaderFactory(RowType, int[][], List<Predicate>) - Method in class org.apache.flink.table.store.format.avro.AvroFileFormat
-
- createReaderFactory(RowType, int[][], List<Predicate>) - Method in class org.apache.flink.table.store.format.FileFormat
-
Create a BulkFormat
from the type, with projection pushed down.
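"Projection pushed down" here means the reader is told which columns to emit instead of reading full rows and discarding columns afterwards. A much-simplified sketch of the effect (the real createReaderFactory takes a nested int[][] projection for nested fields; this uses flat top-level indices and plain arrays as rows):

```java
import java.util.Arrays;

// Sketch of what projection pushdown buys a reader factory such as
// FileFormat.createReaderFactory: given the projected field indices, only
// those columns are materialized per row.
public class ProjectionDemo {
    public static Object[] project(Object[] fullRow, int[] projection) {
        Object[] out = new Object[projection.length];
        for (int i = 0; i < projection.length; i++) {
            out[i] = fullRow[projection[i]];
        }
        return out;
    }

    public static void main(String[] args) {
        Object[] row = {1, "alice", 3.14, true};
        // push down a projection selecting columns 3 and 1, in that order
        Object[] projected = project(row, new int[] {3, 1});
        if (!Arrays.equals(projected, new Object[] {true, "alice"})) {
            throw new AssertionError("unexpected projection");
        }
        System.out.println(Arrays.toString(projected));
    }
}
```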
- createReaderFactory(RowType) - Method in class org.apache.flink.table.store.format.FileFormat
-
- createReaderFactory(RowType, int[][]) - Method in class org.apache.flink.table.store.format.FileFormat
-
- createReaderFactory(RowType, int[][], List<Predicate>) - Method in class org.apache.flink.table.store.format.orc.OrcFileFormat
-
- createReaderFactory(RowType, int[][], List<Predicate>) - Method in class org.apache.flink.table.store.format.parquet.ParquetFileFormat
-
- createRecordReader(long, String, int) - Method in class org.apache.flink.table.store.file.io.KeyValueFileReaderFactory
-
- createRecordReader(Configuration, TypeDescription, int[], List<OrcFilters.Predicate>, Path, long, long) - Method in class org.apache.flink.table.store.format.orc.OrcShimImpl
-
- createRecordWriter() - Method in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeBenchmark
-
- createReusedAvroRecord() - Method in class org.apache.flink.table.store.format.avro.AbstractAvroBulkFormat
-
- createRollingChangelogFileWriter(int) - Method in class org.apache.flink.table.store.file.io.KeyValueFileWriterFactory
-
- createRollingMergeTreeFileWriter(int) - Method in class org.apache.flink.table.store.file.io.KeyValueFileWriterFactory
-
- createSerializer(ExecutionConfig) - Method in class org.apache.flink.table.store.connector.sink.CommittableTypeInfo
-
- createSink() - Method in class org.apache.flink.table.store.kafka.KafkaLogSinkProvider
-
- createSink() - Method in interface org.apache.flink.table.store.log.LogSinkProvider
-
- createSinkProvider(DynamicTableFactory.Context, DynamicTableSink.Context) - Method in class org.apache.flink.table.store.kafka.KafkaLogStoreFactory
-
- createSinkProvider(DynamicTableFactory.Context, DynamicTableSink.Context) - Method in interface org.apache.flink.table.store.log.LogStoreTableFactory
-
Creates a LogSinkProvider instance from a CatalogTable and additional context information.
- createSource(Map<Integer, Long>) - Method in class org.apache.flink.table.store.kafka.KafkaLogSourceProvider
-
- createSource(Map<Integer, Long>) - Method in interface org.apache.flink.table.store.log.LogSourceProvider
-
Creates a Source instance.
- createSourceProvider(DynamicTableFactory.Context, DynamicTableSource.Context, int[][]) - Method in class org.apache.flink.table.store.kafka.KafkaLogStoreFactory
-
- createSourceProvider(DynamicTableFactory.Context, DynamicTableSource.Context, int[][]) - Method in interface org.apache.flink.table.store.log.LogStoreTableFactory
-
Creates a LogSourceProvider instance from a CatalogTable and additional context information.
- createSplits(TableScan.Plan) - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceSplitGenerator
-
- createStatsExtractor(RowType) - Method in class org.apache.flink.table.store.format.FileFormat
-
- createStatsExtractor(RowType) - Method in class org.apache.flink.table.store.format.orc.OrcFileFormat
-
- createStatsExtractor(RowType) - Method in class org.apache.flink.table.store.format.parquet.ParquetFileFormat
-
- createTable(ObjectPath, CatalogBaseTable, boolean) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- createTable(ObjectPath, UpdateSchema, boolean) - Method in interface org.apache.flink.table.store.file.catalog.Catalog
-
Create a new table.
- createTable(ObjectPath, UpdateSchema, boolean) - Method in class org.apache.flink.table.store.file.catalog.FileSystemCatalog
-
- createTable(ObjectPath, UpdateSchema, boolean) - Method in class org.apache.flink.table.store.hive.HiveCatalog
-
- createTable(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.flink.table.store.spark.SparkCatalog
-
- createValue() - Method in class org.apache.flink.table.store.mapred.TableStoreRecordReader
-
- createWithSnapshotStarting(DataTable, DataTableScan) - Static method in class org.apache.flink.table.store.table.source.snapshot.ContinuousDataFileSnapshotEnumerator
-
- createWriteOperator(StoreSinkWrite.Provider, boolean) - Method in class org.apache.flink.table.store.connector.sink.CompactorSink
-
- createWriteOperator(StoreSinkWrite.Provider, boolean) - Method in class org.apache.flink.table.store.connector.sink.FileStoreSink
-
- createWriteOperator(StoreSinkWrite.Provider, boolean) - Method in class org.apache.flink.table.store.connector.sink.FlinkSink
-
- createWriteProvider(String) - Method in class org.apache.flink.table.store.connector.sink.FlinkSink
-
- createWriterContainer(BinaryRowData, int, ExecutorService) - Method in class org.apache.flink.table.store.file.operation.AbstractFileStoreWrite
-
- createWriterContainer(BinaryRowData, int, ExecutorService) - Method in class org.apache.flink.table.store.file.operation.AppendOnlyFileStoreWrite
-
- createWriterContainer(BinaryRowData, int, ExecutorService) - Method in class org.apache.flink.table.store.file.operation.KeyValueFileStoreWrite
-
- createWriterFactory(RowType) - Method in class org.apache.flink.table.store.format.avro.AvroFileFormat
-
- createWriterFactory(RowType) - Method in class org.apache.flink.table.store.format.FileFormat
-
Create a BulkWriter.Factory
from the type.
- createWriterFactory(RowType) - Method in class org.apache.flink.table.store.format.orc.OrcFileFormat
-
The OrcBulkWriterFactory will create a ThreadLocalClassLoaderConfiguration from the input writer config to avoid classloader leaks.
- createWriterFactory(RowType) - Method in class org.apache.flink.table.store.format.parquet.ParquetFileFormat
-
- currentHighestFieldId(List<DataField>) - Static method in class org.apache.flink.table.store.file.schema.TableSchema
-
- currentSnapshotId() - Method in class org.apache.flink.table.store.connector.source.PendingSplitsCheckpoint
-
- currentSortIndexOffset - Variable in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- currentSortIndexSegment - Variable in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- Factory() - Constructor for class org.apache.flink.table.store.connector.action.Action.Factory
-
- Factory(SchemaManager, long, RowType, FileFormat, FileStorePathFactory, long) - Constructor for class org.apache.flink.table.store.file.manifest.ManifestFile.Factory
-
- Factory(RowType, FileFormat, FileStorePathFactory) - Constructor for class org.apache.flink.table.store.file.manifest.ManifestList.Factory
-
- factory(Configuration, List<String>, List<LogicalType>, List<String>) - Static method in class org.apache.flink.table.store.file.mergetree.compact.aggregate.AggregateMergeFunction
-
- factory() - Static method in class org.apache.flink.table.store.file.mergetree.compact.DeduplicateMergeFunction
-
- factory(boolean, List<LogicalType>) - Static method in class org.apache.flink.table.store.file.mergetree.compact.PartialUpdateMergeFunction
-
- factory() - Static method in class org.apache.flink.table.store.file.mergetree.compact.ValueCountMergeFunction
-
- factory(CatalogLock.Factory, ObjectPath) - Static method in interface org.apache.flink.table.store.file.operation.Lock
-
- factoryIdentifier() - Method in class org.apache.flink.table.store.connector.FlinkCatalogFactory
-
- factoryIdentifier() - Method in class org.apache.flink.table.store.connector.TableStoreConnectorFactory
-
- factoryIdentifier() - Method in class org.apache.flink.table.store.kafka.KafkaLogStoreFactory
-
- fetch() - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceSplitReader
-
- FieldAggregator - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
Abstract class for aggregating a field of a row.
- FieldAggregator(LogicalType) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldAggregator
-
- FieldBoolAndAgg - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
bool_and aggregator for a field of a row.
- FieldBoolAndAgg(LogicalType) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldBoolAndAgg
-
- FieldBoolOrAgg - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
bool_or aggregator for a field of a row.
- FieldBoolOrAgg(LogicalType) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldBoolOrAgg
-
- fieldComments() - Method in class org.apache.flink.table.store.hive.HiveSchema
-
- FieldIgnoreRetractAgg - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
An aggregator which ignores retraction messages.
- FieldIgnoreRetractAgg(FieldAggregator) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldIgnoreRetractAgg
-
- FieldLastNonNullValueAgg - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
last non-null value aggregator for a field of a row.
- FieldLastNonNullValueAgg(LogicalType) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldLastNonNullValueAgg
-
- FieldLastValueAgg - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
last value aggregator for a field of a row.
- FieldLastValueAgg(LogicalType) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldLastValueAgg
-
- FieldListaggAgg - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
listagg aggregator for a field of a row.
- FieldListaggAgg(LogicalType) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldListaggAgg
-
- FieldMaxAgg - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
max aggregator for a field of a row.
- FieldMaxAgg(LogicalType) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldMaxAgg
-
- FieldMinAgg - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
min aggregator for a field of a row.
- FieldMinAgg(LogicalType) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldMinAgg
-
- fieldName() - Method in class org.apache.flink.table.store.file.predicate.LeafPredicate
-
- fieldName() - Method in class org.apache.flink.table.store.file.schema.SchemaChange.AddColumn
-
- fieldName() - Method in class org.apache.flink.table.store.file.schema.SchemaChange.DropColumn
-
- fieldName() - Method in class org.apache.flink.table.store.file.schema.SchemaChange.RenameColumn
-
- fieldName() - Method in class org.apache.flink.table.store.file.schema.SchemaChange.UpdateColumnType
-
- fieldNames() - Method in class org.apache.flink.table.store.file.schema.SchemaChange.UpdateColumnComment
-
- fieldNames() - Method in class org.apache.flink.table.store.file.schema.SchemaChange.UpdateColumnNullability
-
- fieldNames() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- fieldNames() - Method in class org.apache.flink.table.store.hive.HiveSchema
-
- FieldPrimaryKeyAgg - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
primary key aggregator for a field of a row.
- FieldPrimaryKeyAgg(LogicalType) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldPrimaryKeyAgg
-
- FieldRef - Class in org.apache.flink.table.store.file.predicate
-
A reference to a field in an input.
- FieldRef(int, String, LogicalType) - Constructor for class org.apache.flink.table.store.file.predicate.FieldRef
-
- fieldRef() - Method in class org.apache.flink.table.store.file.predicate.LeafPredicate
-
- FIELDS - Static variable in class org.apache.flink.table.store.file.mergetree.compact.aggregate.AggregateMergeFunction.RowAggregator
-
- fields() - Method in class org.apache.flink.table.store.file.schema.RowDataType
-
- fields() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- fields(FieldStatsArraySerializer) - Method in class org.apache.flink.table.store.file.stats.BinaryTableStats
-
- fields(FieldStatsArraySerializer, Long) - Method in class org.apache.flink.table.store.file.stats.BinaryTableStats
-
- fieldStats() - Method in class org.apache.flink.table.store.file.io.StatsCollectingSingleFileWriter
-
- FieldStats - Class in org.apache.flink.table.store.format
-
Statistics for each field.
- FieldStats(Object, Object, long) - Constructor for class org.apache.flink.table.store.format.FieldStats
-
- FieldStatsArraySerializer - Class in org.apache.flink.table.store.file.stats
-
- FieldStatsArraySerializer(RowType) - Constructor for class org.apache.flink.table.store.file.stats.FieldStatsArraySerializer
-
- FieldStatsArraySerializer(RowType, int[]) - Constructor for class org.apache.flink.table.store.file.stats.FieldStatsArraySerializer
-
- FieldStatsCollector - Class in org.apache.flink.table.store.format
-
Collector to extract statistics of each field from a series of records.
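Given that FieldStats carries (Object min, Object max, long nullCount), the collector plausibly keeps a running min/max and a null count per field over the record stream. A sketch of that accumulation for a single long-valued field; the class and method names are illustrative, not the real FieldStatsCollector API:

```java
// Sketch of the per-field statistics a collector like FieldStatsCollector
// gathers and FieldStats(min, max, nullCount) carries: a running min/max
// and a null count over a series of records, simplified to one Long field.
public class FieldStatsDemo {
    private Long min;
    private Long max;
    private long nullCount;

    public void collect(Long value) {
        if (value == null) {
            nullCount++;
            return;
        }
        if (min == null || value < min) {
            min = value;
        }
        if (max == null || value > max) {
            max = value;
        }
    }

    public Long min() { return min; }
    public Long max() { return max; }
    public long nullCount() { return nullCount; }

    public static void main(String[] args) {
        FieldStatsDemo stats = new FieldStatsDemo();
        for (Long v : new Long[] {5L, null, 2L, 9L}) {
            stats.collect(v);
        }
        if (stats.min() != 2L || stats.max() != 9L || stats.nullCount() != 1) {
            throw new AssertionError("unexpected statistics");
        }
        System.out.println(stats.min() + " " + stats.max() + " " + stats.nullCount());
    }
}
```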
- FieldStatsCollector(RowType) - Constructor for class org.apache.flink.table.store.format.FieldStatsCollector
-
- FieldSumAgg - Class in org.apache.flink.table.store.file.mergetree.compact.aggregate
-
sum aggregator for a field of a row.
- FieldSumAgg(LogicalType) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldSumAgg
-
- fieldType - Variable in class org.apache.flink.table.store.file.mergetree.compact.aggregate.FieldAggregator
-
- fieldTypes() - Method in class org.apache.flink.table.store.hive.HiveSchema
-
- file - Variable in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeBenchmark
-
- file() - Method in class org.apache.flink.table.store.file.manifest.ManifestEntry
-
- FILE_DATA_BASE_DIR - Static variable in class org.apache.flink.table.store.benchmark.config.FileBenchmarkOptions
-
- FILE_FORMAT - Static variable in class org.apache.flink.table.store.CoreOptions
-
- FileBenchmarkOptions - Class in org.apache.flink.table.store.benchmark.config
-
Benchmark options for files.
- FileBenchmarkOptions() - Constructor for class org.apache.flink.table.store.benchmark.config.FileBenchmarkOptions
-
- FileCommittable - Class in org.apache.flink.table.store.table.sink
-
File committable for the sink.
- FileCommittable(BinaryRowData, int, NewFilesIncrement, CompactIncrement) - Constructor for class org.apache.flink.table.store.table.sink.FileCommittable
-
- fileCommittables() - Method in class org.apache.flink.table.store.file.manifest.ManifestCommittable
-
- FileCommittableSerializer - Class in org.apache.flink.table.store.table.sink
-
- FileCommittableSerializer() - Constructor for class org.apache.flink.table.store.table.sink.FileCommittableSerializer
-
- fileFormat() - Method in class org.apache.flink.table.store.CoreOptions
-
- FileFormat - Class in org.apache.flink.table.store.format
-
Factory class which creates reader and writer factories for a specific file format.
- FileFormat(String) - Constructor for class org.apache.flink.table.store.format.FileFormat
-
- FileFormatFactory - Interface in org.apache.flink.table.store.format
-
- FileKind - Enum in org.apache.flink.table.store.file.manifest
-
Kind of a file.
- fileName() - Method in class org.apache.flink.table.store.file.io.DataFileMeta
-
- fileName - Variable in class org.apache.flink.table.store.file.manifest.ManifestEntry.Identifier
-
- fileName() - Method in class org.apache.flink.table.store.file.manifest.ManifestFileMeta
-
- files() - Method in interface org.apache.flink.table.store.file.compact.CompactUnit
-
- files() - Method in class org.apache.flink.table.store.file.mergetree.SortedRun
-
- files() - Method in interface org.apache.flink.table.store.file.operation.FileStoreScan.Plan
-
- files(FileKind) - Method in interface org.apache.flink.table.store.file.operation.FileStoreScan.Plan
-
- files() - Method in class org.apache.flink.table.store.table.source.DataSplit
-
- fileSize() - Method in class org.apache.flink.table.store.file.io.DataFileMeta
-
- fileSize() - Method in class org.apache.flink.table.store.file.manifest.ManifestFileMeta
-
- FileStatsExtractor - Interface in org.apache.flink.table.store.format
-
Extracts statistics directly from files.
- FileStore<T> - Interface in org.apache.flink.table.store.file
-
File store interface.
- FileStoreCommit - Interface in org.apache.flink.table.store.file.operation
-
Commit operation which provides commit and overwrite.
- FileStoreCommitImpl - Class in org.apache.flink.table.store.file.operation
-
- FileStoreCommitImpl(long, String, RowType, FileStorePathFactory, SnapshotManager, ManifestFile.Factory, ManifestList.Factory, FileStoreScan, int, MemorySize, int, Comparator<RowData>) - Constructor for class org.apache.flink.table.store.file.operation.FileStoreCommitImpl
-
- FileStoreExpire - Interface in org.apache.flink.table.store.file.operation
-
Expire operation which provides snapshots expire.
- FileStoreExpireImpl - Class in org.apache.flink.table.store.file.operation
-
- FileStoreExpireImpl(int, int, long, FileStorePathFactory, SnapshotManager, ManifestFile.Factory, ManifestList.Factory) - Constructor for class org.apache.flink.table.store.file.operation.FileStoreExpireImpl
-
- FileStoreLookupFunction - Class in org.apache.flink.table.store.connector.lookup
-
A lookup TableFunction for file store.
- FileStoreLookupFunction(FileStoreTable, int[], int[], Predicate) - Constructor for class org.apache.flink.table.store.connector.lookup.FileStoreLookupFunction
-
- FileStorePathFactory - Class in org.apache.flink.table.store.file.utils
-
Factory which produces Paths for manifest files.
- FileStorePathFactory(Path) - Constructor for class org.apache.flink.table.store.file.utils.FileStorePathFactory
-
- FileStorePathFactory(Path, RowType, String, String) - Constructor for class org.apache.flink.table.store.file.utils.FileStorePathFactory
-
- FileStoreRead<T> - Interface in org.apache.flink.table.store.file.operation
-
- FileStoreScan - Interface in org.apache.flink.table.store.file.operation
-
Scan operation which produces a plan.
- FileStoreScan.Plan - Interface in org.apache.flink.table.store.file.operation
-
Result plan of this scan.
- FileStoreSink - Class in org.apache.flink.table.store.connector.sink
-
FlinkSink for writing records into table store.
- FileStoreSink(FileStoreTable, Lock.Factory, Map<String, String>, LogSinkFunction) - Constructor for class org.apache.flink.table.store.connector.sink.FileStoreSink
-
- FileStoreSourceReader - Class in org.apache.flink.table.store.connector.source
-
- FileStoreSourceReader(SourceReaderContext, TableRead, Long) - Constructor for class org.apache.flink.table.store.connector.source.FileStoreSourceReader
-
- FileStoreSourceSplit - Class in org.apache.flink.table.store.connector.source
-
SourceSplit of file store.
- FileStoreSourceSplit(String, Split) - Constructor for class org.apache.flink.table.store.connector.source.FileStoreSourceSplit
-
- FileStoreSourceSplit(String, Split, long) - Constructor for class org.apache.flink.table.store.connector.source.FileStoreSourceSplit
-
- FileStoreSourceSplitGenerator - Class in org.apache.flink.table.store.connector.source
-
The FileStoreSourceSplitGenerator's task is to plan all files to be read and to split them into a set of FileStoreSourceSplits.
- FileStoreSourceSplitGenerator() - Constructor for class org.apache.flink.table.store.connector.source.FileStoreSourceSplitGenerator
-
- FileStoreSourceSplitReader - Class in org.apache.flink.table.store.connector.source
-
The SplitReader implementation for the file store source.
- FileStoreSourceSplitReader(TableRead, Long) - Constructor for class org.apache.flink.table.store.connector.source.FileStoreSourceSplitReader
-
- FileStoreSourceSplitSerializer - Class in org.apache.flink.table.store.connector.source
-
- FileStoreSourceSplitSerializer() - Constructor for class org.apache.flink.table.store.connector.source.FileStoreSourceSplitSerializer
-
- FileStoreSourceSplitState - Class in org.apache.flink.table.store.connector.source
-
- FileStoreSourceSplitState(FileStoreSourceSplit) - Constructor for class org.apache.flink.table.store.connector.source.FileStoreSourceSplitState
-
- FileStoreTable - Interface in org.apache.flink.table.store.table
-
An abstraction layer above FileStore to provide reading and writing of RowData.
- FileStoreTableFactory - Class in org.apache.flink.table.store.table
-
- FileStoreTableFactory() - Constructor for class org.apache.flink.table.store.table.FileStoreTableFactory
-
- FileStoreWrite<T> - Interface in org.apache.flink.table.store.file.operation
-
- FileSystemCatalog - Class in org.apache.flink.table.store.file.catalog
-
A catalog implementation for FileSystem.
- FileSystemCatalog(Path) - Constructor for class org.apache.flink.table.store.file.catalog.FileSystemCatalog
-
- FileSystemCatalogFactory - Class in org.apache.flink.table.store.file.catalog
-
- FileSystemCatalogFactory() - Constructor for class org.apache.flink.table.store.file.catalog.FileSystemCatalogFactory
-
- FileSystemLoader - Interface in org.apache.flink.table.store.plugin
-
Loader to load FileSystemFactory.
- FileSystems - Class in org.apache.flink.table.store.filesystem
-
A FileSystem loader that supports access to file systems from Hive, Spark, Trino and other computing engines.
- FileSystems() - Constructor for class org.apache.flink.table.store.filesystem.FileSystems
-
- FileUtils - Class in org.apache.flink.table.store.file.utils
-
Utils for file reading and writing.
- FileUtils() - Constructor for class org.apache.flink.table.store.file.utils.FileUtils
-
- FileWriter<T,R> - Interface in org.apache.flink.table.store.file.io
-
File writer which accepts one record or a batch of records and generates metadata after closing.
- filterByStats(ManifestEntry) - Method in class org.apache.flink.table.store.file.operation.AbstractFileStoreScan
-
Note: Keep this thread-safe.
- filterByStats(ManifestEntry) - Method in class org.apache.flink.table.store.file.operation.AppendOnlyFileStoreScan
-
Note: Keep this thread-safe.
- filterByStats(ManifestEntry) - Method in class org.apache.flink.table.store.file.operation.KeyValueFileStoreScan
-
Note: Keep this thread-safe.
- filterCommitted(List<ManifestCommittable>) - Method in interface org.apache.flink.table.store.file.operation.FileStoreCommit
-
Find out which manifest committables need to be retried when recovering from a failure.
- filterCommitted(List<ManifestCommittable>) - Method in class org.apache.flink.table.store.file.operation.FileStoreCommitImpl
-
- filterCommitted(List<ManifestCommittable>) - Method in class org.apache.flink.table.store.table.sink.TableCommit
-
- filterRecoveredCommittables(List<ManifestCommittable>) - Method in interface org.apache.flink.table.store.connector.sink.Committer
-
Find out which global committables need to be retried when recovering from a failure.
- filterRecoveredCommittables(List<ManifestCommittable>) - Method in class org.apache.flink.table.store.connector.sink.StoreCommitter
-
- finish() - Method in class org.apache.flink.table.store.connector.sink.StoreWriteOperator
-
- FIRST_SNAPSHOT_ID - Static variable in class org.apache.flink.table.store.file.Snapshot
-
- FLINK_REST_ADDRESS - Static variable in class org.apache.flink.table.store.benchmark.BenchmarkOptions
-
- FLINK_REST_PORT - Static variable in class org.apache.flink.table.store.benchmark.BenchmarkOptions
-
- FlinkActions - Class in org.apache.flink.table.store.connector.action
-
Table maintenance actions for Flink.
- FlinkActions() - Constructor for class org.apache.flink.table.store.connector.action.FlinkActions
-
- FlinkCatalog - Class in org.apache.flink.table.store.connector
-
Catalog for table store.
- FlinkCatalog(Catalog, String, String) - Constructor for class org.apache.flink.table.store.connector.FlinkCatalog
-
- FlinkCatalogFactory - Class in org.apache.flink.table.store.connector
-
- FlinkCatalogFactory() - Constructor for class org.apache.flink.table.store.connector.FlinkCatalogFactory
-
- FlinkConnectorOptions - Class in org.apache.flink.table.store.connector
-
Options for flink connector.
- FlinkConnectorOptions() - Constructor for class org.apache.flink.table.store.connector.FlinkConnectorOptions
-
- FlinkRestClient - Class in org.apache.flink.table.store.benchmark.metric
-
An HTTP client that requests the BPS metric from the JobMaster REST API.
- FlinkRestClient(String, int) - Constructor for class org.apache.flink.table.store.benchmark.metric.FlinkRestClient
-
- FlinkSink - Class in org.apache.flink.table.store.connector.sink
-
Abstract sink of table store.
- FlinkSink(FileStoreTable, boolean) - Constructor for class org.apache.flink.table.store.connector.sink.FlinkSink
-
- FlinkSinkBuilder - Class in org.apache.flink.table.store.connector.sink
-
- FlinkSinkBuilder(FileStoreTable) - Constructor for class org.apache.flink.table.store.connector.sink.FlinkSinkBuilder
-
- FlinkSource - Class in org.apache.flink.table.store.connector.source
-
A Flink Source for table store.
- FlinkSource(Table, int[][], Predicate, Long) - Constructor for class org.apache.flink.table.store.connector.source.FlinkSource
-
- FlinkSourceBuilder - Class in org.apache.flink.table.store.connector.source
-
- FlinkSourceBuilder(ObjectIdentifier, FileStoreTable) - Constructor for class org.apache.flink.table.store.connector.source.FlinkSourceBuilder
-
- FlinkTableSource - Class in org.apache.flink.table.store.connector.source
-
A Flink ScanTableSource for table store.
- FlinkTableSource(Table) - Constructor for class org.apache.flink.table.store.connector.source.FlinkTableSource
-
- FlinkTableSource(Table, Predicate, int[][], Long) - Constructor for class org.apache.flink.table.store.connector.source.FlinkTableSource
-
- flush() - Method in class org.apache.flink.table.store.file.utils.RenamingAtomicFsDataOutputStream
-
- flush() - Method in class org.apache.flink.table.store.kafka.KafkaSinkFunction
-
- flush() - Method in interface org.apache.flink.table.store.table.sink.LogSinkFunction
-
Flush pending records.
- flushMemory() - Method in interface org.apache.flink.table.store.file.memory.MemoryOwner
-
Flush the owner's memory and release it.
- flushMemory() - Method in class org.apache.flink.table.store.file.mergetree.MergeTreeWriter
-
- flushMemory() - Method in class org.apache.flink.table.store.file.mergetree.SortBufferWriteBuffer
-
- flushMemory() - Method in interface org.apache.flink.table.store.file.mergetree.WriteBuffer
-
Flush memory, return false if not supported.
- flushMemory() - Method in class org.apache.flink.table.store.file.sort.BinaryExternalSortBuffer
-
- flushMemory() - Method in class org.apache.flink.table.store.file.sort.BinaryInMemorySortBuffer
-
- flushMemory() - Method in interface org.apache.flink.table.store.file.sort.SortBuffer
-
Flush memory, return false if not supported.
- FollowUpScanner - Interface in org.apache.flink.table.store.table.source.snapshot
-
- forAppend(String, long, long, BinaryTableStats, long, long, long) - Static method in class org.apache.flink.table.store.file.io.DataFileMeta
-
- forEach(Comparator<RowData>, MergeFunction<KeyValue>, WriteBuffer.KvConsumer, WriteBuffer.KvConsumer) - Method in class org.apache.flink.table.store.file.mergetree.SortBufferWriteBuffer
-
- forEach(Comparator<RowData>, MergeFunction<KeyValue>, WriteBuffer.KvConsumer, WriteBuffer.KvConsumer) - Method in interface org.apache.flink.table.store.file.mergetree.WriteBuffer
-
Performs the given action for each remaining element in this buffer until all elements have
been processed or the action throws an exception.
- forEachRemaining(RecordReader<T>, Consumer<? super T>) - Static method in class org.apache.flink.table.store.file.utils.RecordReaderUtils
-
Performs the given action for each remaining element in RecordReader until all elements have been processed or the action throws an exception.
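The contract above — drain a reader, applying an action to each element until exhaustion — mirrors Iterator.forEachRemaining. Below is a minimal self-contained sketch of that pattern; the single-method RecordReader interface here is a simplified stand-in for illustration, not the real batch-oriented table store type:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;
import java.util.function.Consumer;

public class ForEachRemainingSketch {

    /** Simplified stand-in for a reader: returns null when no more elements remain. */
    interface RecordReader<T> {
        T read();
    }

    /** Apply the action to every remaining element until the reader is exhausted. */
    static <T> void forEachRemaining(RecordReader<T> reader, Consumer<? super T> action) {
        T element;
        while ((element = reader.read()) != null) {
            action.accept(element);
        }
    }

    public static void main(String[] args) {
        // Queue.poll() returns null when empty, matching the reader contract.
        Queue<String> data = new ArrayDeque<>(List.of("a", "b", "c"));
        List<String> seen = new ArrayList<>();
        forEachRemaining(data::poll, seen::add);
        System.out.println(seen); // [a, b, c]
    }
}
```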
- format - Variable in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeBenchmark
-
- formatDataFreshness(Long) - Static method in class org.apache.flink.table.store.benchmark.utils.BenchmarkUtils
-
- formatDoubleValue(double) - Static method in class org.apache.flink.table.store.benchmark.utils.BenchmarkUtils
-
- formatIdentifier - Variable in class org.apache.flink.table.store.format.FileFormat
-
- formatLongValue(long) - Static method in class org.apache.flink.table.store.benchmark.utils.BenchmarkUtils
-
- formatLongValuePerSecond(long) - Static method in class org.apache.flink.table.store.benchmark.utils.BenchmarkUtils
-
- freePages() - Method in class org.apache.flink.table.store.file.memory.HeapMemorySegmentPool
-
- from(int[][]) - Static method in class org.apache.flink.table.store.utils.ProjectedRowData
-
- from(int[]) - Static method in class org.apache.flink.table.store.utils.ProjectedRowData
-
- from(Projection) - Static method in class org.apache.flink.table.store.utils.ProjectedRowData
-
- fromBinary(BinaryTableStats) - Method in class org.apache.flink.table.store.file.stats.FieldStatsArraySerializer
-
- fromBinary(BinaryTableStats, Long) - Method in class org.apache.flink.table.store.file.stats.FieldStatsArraySerializer
-
- fromBytes(byte[]) - Static method in class org.apache.flink.table.store.connector.sink.LogOffsetCommittable
-
- fromByteValue(byte) - Static method in enum org.apache.flink.table.store.file.manifest.FileKind
-
- fromCatalog(CatalogLock, ObjectPath) - Static method in interface org.apache.flink.table.store.file.operation.Lock
-
- fromCatalogTable(CatalogTable) - Static method in class org.apache.flink.table.store.file.schema.UpdateSchema
-
- fromFiles(int, List<DataFileMeta>) - Static method in interface org.apache.flink.table.store.file.compact.CompactUnit
-
- fromFlink(Object, LogicalType) - Static method in class org.apache.flink.table.store.spark.SparkInternalRow
-
- fromFlink(StringData) - Static method in class org.apache.flink.table.store.spark.SparkInternalRow
-
- fromFlink(DecimalData) - Static method in class org.apache.flink.table.store.spark.SparkInternalRow
-
- fromFlink(RowData, RowType) - Static method in class org.apache.flink.table.store.spark.SparkInternalRow
-
- fromFlink(TimestampData) - Static method in class org.apache.flink.table.store.spark.SparkInternalRow
-
- fromFlink(ArrayData, ArrayType) - Static method in class org.apache.flink.table.store.spark.SparkInternalRow
-
- fromFlink(MapData, LogicalType) - Static method in class org.apache.flink.table.store.spark.SparkInternalRow
-
- fromFlinkRowType(RowType) - Static method in class org.apache.flink.table.store.spark.SparkTypeUtils
-
- fromFlinkType(LogicalType) - Static method in class org.apache.flink.table.store.spark.SparkTypeUtils
-
- fromIdentifier(String, Configuration) - Static method in class org.apache.flink.table.store.format.FileFormat
-
Create a FileFormat from a format identifier and format options.
- fromJson(String) - Static method in class org.apache.flink.table.store.file.Snapshot
-
- fromJson(String, Class<T>) - Static method in class org.apache.flink.table.store.file.utils.JsonSerdeUtil
-
- fromJsonArray(String) - Static method in class org.apache.flink.table.store.benchmark.metric.cpu.CpuMetric
-
- fromLevelRuns(int, List<LevelSortedRun>) - Static method in interface org.apache.flink.table.store.file.compact.CompactUnit
-
- fromMap(Map<String, String>, RowType) - Static method in class org.apache.flink.table.store.file.predicate.PredicateConverter
-
- fromPath(Path) - Static method in class org.apache.flink.table.store.file.Snapshot
-
- fromRow(RowData) - Method in class org.apache.flink.table.store.file.io.DataFileMetaSerializer
-
- fromRow(RowData) - Method in class org.apache.flink.table.store.file.KeyValueSerializer
-
- fromRow(RowData) - Method in class org.apache.flink.table.store.file.utils.ObjectSerializer
-
Convert a RowData to T.
- fromRow(RowData) - Method in class org.apache.flink.table.store.file.utils.VersionedObjectSerializer
-
- fromRowData(RowData) - Static method in class org.apache.flink.table.store.file.stats.BinaryTableStats
-
- fromSingle(DataFileMeta) - Static method in class org.apache.flink.table.store.file.mergetree.SortedRun
-
- fromSorted(List<DataFileMeta>) - Static method in class org.apache.flink.table.store.file.mergetree.SortedRun
-
- fromStringArrayData(ArrayData) - Static method in class org.apache.flink.table.store.utils.RowDataUtils
-
- fromTableOptions(Configuration, ConfigOption<String>) - Static method in class org.apache.flink.table.store.format.FileFormat
-
- fromUnsorted(List<DataFileMeta>, Comparator<RowData>) - Static method in class org.apache.flink.table.store.file.mergetree.SortedRun
-
- FullChangelogMergeFunctionWrapper - Class in org.apache.flink.table.store.file.mergetree.compact
-
Wrapper for MergeFunctions to produce changelog during a full compaction.
- FullChangelogMergeFunctionWrapper(MergeFunction<KeyValue>, int) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.FullChangelogMergeFunctionWrapper
-
- FullChangelogMergeFunctionWrapper.Result - Class in org.apache.flink.table.store.file.mergetree.compact
-
Changelog and final result for the same key.
- FullChangelogMergeTreeCompactRewriter - Class in org.apache.flink.table.store.file.mergetree.compact
-
- FullChangelogMergeTreeCompactRewriter(int, KeyValueFileReaderFactory, KeyValueFileWriterFactory, Comparator<RowData>, MergeFunctionFactory<KeyValue>) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.FullChangelogMergeTreeCompactRewriter
-
- FullChangelogStoreSinkWrite - Class in org.apache.flink.table.store.connector.sink
-
StoreSinkWrite for the CoreOptions.ChangelogProducer#FULL_COMPACTION changelog producer.
- FullChangelogStoreSinkWrite(FileStoreTable, StateInitializationContext, String, IOManager, boolean, long) - Constructor for class org.apache.flink.table.store.connector.sink.FullChangelogStoreSinkWrite
-
- FullStartingScanner - Class in org.apache.flink.table.store.table.source.snapshot
-
- FullStartingScanner() - Constructor for class org.apache.flink.table.store.table.source.snapshot.FullStartingScanner
-
- function() - Method in class org.apache.flink.table.store.file.predicate.CompoundPredicate
-
- Function() - Constructor for class org.apache.flink.table.store.file.predicate.CompoundPredicate.Function
-
- function() - Method in class org.apache.flink.table.store.file.predicate.LeafPredicate
-
- functionExists(ObjectPath) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- FunctionVisitor<T> - Interface in org.apache.flink.table.store.file.predicate
-
- lastIndexEntryOffset - Variable in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- latest() - Method in class org.apache.flink.table.store.file.schema.SchemaManager
-
- LATEST - Static variable in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- latestCompactedSnapshotId() - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- latestSnapshotId() - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- latestSnapshotOfUser(String) - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- leaf(NullFalseLeafBinaryFunction, int, Object) - Method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- leaf(LeafUnaryFunction, int) - Method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- LeafFunction - Class in org.apache.flink.table.store.file.predicate
-
Function to test a field with literals.
- LeafFunction() - Constructor for class org.apache.flink.table.store.file.predicate.LeafFunction
-
- LeafPredicate - Class in org.apache.flink.table.store.file.predicate
-
- LeafPredicate(LeafFunction, LogicalType, int, String, List<Object>) - Constructor for class org.apache.flink.table.store.file.predicate.LeafPredicate
-
- LeafUnaryFunction - Class in org.apache.flink.table.store.file.predicate
-
Function to test a field.
- LeafUnaryFunction() - Constructor for class org.apache.flink.table.store.file.predicate.LeafUnaryFunction
-
- length() - Method in interface org.apache.flink.table.store.file.io.FileWriter
-
The estimated length of the current writer.
- length() - Method in class org.apache.flink.table.store.file.io.RollingFileWriter
-
- length() - Method in class org.apache.flink.table.store.file.io.SingleFileWriter
-
- LessOrEqual - Class in org.apache.flink.table.store.file.predicate
-
- lessOrEqual(int, Object) - Method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- LessThan - Class in org.apache.flink.table.store.file.predicate
-
- lessThan(int, Object) - Method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- level() - Method in class org.apache.flink.table.store.file.io.DataFileMeta
-
- level() - Method in class org.apache.flink.table.store.file.KeyValue
-
- level - Variable in class org.apache.flink.table.store.file.manifest.ManifestEntry.Identifier
-
- level() - Method in class org.apache.flink.table.store.file.mergetree.LevelSortedRun
-
- levels() - Method in class org.apache.flink.table.store.file.mergetree.compact.MergeTreeCompactManager
-
- Levels - Class in org.apache.flink.table.store.file.mergetree
-
A class which stores all level files of merge tree.
- Levels(Comparator<RowData>, List<DataFileMeta>, int) - Constructor for class org.apache.flink.table.store.file.mergetree.Levels
-
- LevelSortedRun - Class in org.apache.flink.table.store.file.mergetree
-
- LevelSortedRun(int, SortedRun) - Constructor for class org.apache.flink.table.store.file.mergetree.LevelSortedRun
-
- levelSortedRuns() - Method in class org.apache.flink.table.store.file.mergetree.Levels
-
- limit - Variable in class org.apache.flink.table.store.connector.source.FlinkSource
-
- limit - Variable in class org.apache.flink.table.store.connector.source.FlinkTableSource
-
- listAll() - Method in class org.apache.flink.table.store.file.schema.SchemaManager
-
List all schemas.
- listAllIds() - Method in class org.apache.flink.table.store.file.schema.SchemaManager
-
List all schema IDs.
- listDatabases() - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- listDatabases() - Method in interface org.apache.flink.table.store.file.catalog.Catalog
-
Get the names of all databases in this catalog.
- listDatabases() - Method in class org.apache.flink.table.store.file.catalog.FileSystemCatalog
-
- listDatabases() - Method in class org.apache.flink.table.store.hive.HiveCatalog
-
- listFunctions(String) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- listNamespaces() - Method in class org.apache.flink.table.store.spark.SparkCatalog
-
- listNamespaces(String[]) - Method in class org.apache.flink.table.store.spark.SparkCatalog
-
- listPartitions(ObjectPath) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- listPartitions(ObjectPath, CatalogPartitionSpec) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- listPartitionsByFilter(ObjectPath, List<Expression>) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- listStatus(Path) - Method in class org.apache.flink.table.store.format.fs.HadoopReadOnlyFileSystem
-
- listTables(String) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- listTables(String) - Method in interface org.apache.flink.table.store.file.catalog.Catalog
-
Get names of all tables under this database.
- listTables(String) - Method in class org.apache.flink.table.store.file.catalog.FileSystemCatalog
-
- listTables(String) - Method in class org.apache.flink.table.store.hive.HiveCatalog
-
- listTables(String[]) - Method in class org.apache.flink.table.store.spark.SparkCatalog
-
- listVersionedFiles(Path, String) - Static method in class org.apache.flink.table.store.file.utils.FileUtils
-
List versioned files for the directory.
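Versioned files such as snapshots follow a prefix-plus-number naming scheme, so listing them means filtering on the prefix and parsing the numeric suffix. A self-contained sketch of that idea — the exact file names (`snapshot-1`, a `LATEST` hint file) are assumptions used here purely for illustration:

```java
import java.util.List;
import java.util.stream.Collectors;

public class VersionedFilesSketch {

    /** Extract the numeric versions of files named prefixN, ignoring other files. */
    static List<Long> listVersionedFiles(List<String> fileNames, String prefix) {
        return fileNames.stream()
                .filter(name -> name.startsWith(prefix))
                .map(name -> Long.parseLong(name.substring(prefix.length())))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // A hypothetical directory listing; non-versioned entries are skipped.
        List<String> dir = List.of("snapshot-1", "snapshot-3", "LATEST", "snapshot-2");
        System.out.println(listVersionedFiles(dir, "snapshot-")); // [1, 3, 2]
    }
}
```

A caller like latestSnapshotId() would then take the maximum of the returned versions.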
- listViews(String) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- literals() - Method in class org.apache.flink.table.store.file.predicate.LeafPredicate
-
- load(Path) - Static method in class org.apache.flink.table.store.benchmark.Query
-
- load(Path) - Static method in class org.apache.flink.table.store.benchmark.Sink
-
- load() - Method in class org.apache.flink.table.store.oss.OSSLoader
-
- load() - Method in interface org.apache.flink.table.store.plugin.FileSystemLoader
-
- load() - Method in class org.apache.flink.table.store.s3.S3Loader
-
- load(String, Path) - Static method in class org.apache.flink.table.store.table.system.SystemTableLoader
-
- loadBenchMarkConf() - Static method in class org.apache.flink.table.store.benchmark.config.ConfigUtil
-
Load benchmark configuration from the classpath.
- loadClass(String, boolean) - Method in class org.apache.flink.table.store.plugin.ComponentClassLoader
-
- loadConfiguration() - Static method in class org.apache.flink.table.store.benchmark.utils.BenchmarkGlobalConfiguration
-
Loads the global configuration from the environment.
- loadConfiguration(Configuration) - Static method in class org.apache.flink.table.store.benchmark.utils.BenchmarkGlobalConfiguration
-
Loads the global configuration and adds the given dynamic properties configuration.
- loadConfiguration(String) - Static method in class org.apache.flink.table.store.benchmark.utils.BenchmarkGlobalConfiguration
-
Loads the configuration files from the specified directory.
- loadConfiguration(String, Configuration) - Static method in class org.apache.flink.table.store.benchmark.utils.BenchmarkGlobalConfiguration
-
Loads the configuration files from the specified directory.
- loadNamespaceMetadata(String[]) - Method in class org.apache.flink.table.store.spark.SparkCatalog
-
- loadTable(Identifier) - Method in class org.apache.flink.table.store.spark.SparkCatalog
-
- LOCAL_SORT_MAX_NUM_FILE_HANDLES - Static variable in class org.apache.flink.table.store.CoreOptions
-
- LOCAL_TZ - Static variable in class org.apache.flink.table.store.utils.DateTimeUtils
-
The local time zone.
- LocalFileUtils - Class in org.apache.flink.table.store.utils
-
Utils for local file.
- LocalFileUtils() - Constructor for class org.apache.flink.table.store.utils.LocalFileUtils
-
- localSortMaxNumFileHandles() - Method in class org.apache.flink.table.store.CoreOptions
-
- location() - Method in class org.apache.flink.table.store.table.AbstractFileStoreTable
-
- location() - Method in class org.apache.flink.table.store.table.system.AuditLogTable
-
- location() - Method in class org.apache.flink.table.store.table.system.BucketsTable
-
- location() - Method in class org.apache.flink.table.store.table.system.OptionsTable
-
- location() - Method in class org.apache.flink.table.store.table.system.SchemasTable
-
- location() - Method in class org.apache.flink.table.store.table.system.SnapshotsTable
-
- location() - Method in interface org.apache.flink.table.store.table.Table
-
- Lock - Interface in org.apache.flink.table.store.file.operation
-
An interface that allows the file store to use a global lock for transaction-related operations.
- Lock.CatalogLockFactory - Class in org.apache.flink.table.store.file.operation
-
- Lock.CatalogLockImpl - Class in org.apache.flink.table.store.file.operation
-
- Lock.EmptyFactory - Class in org.apache.flink.table.store.file.operation
-
- Lock.EmptyLock - Class in org.apache.flink.table.store.file.operation
-
An empty lock.
- Lock.Factory - Interface in org.apache.flink.table.store.file.operation
-
A factory to create Lock.
- LOCK_ACQUIRE_TIMEOUT - Static variable in class org.apache.flink.table.store.CatalogOptions
-
- LOCK_CHECK_MAX_SLEEP - Static variable in class org.apache.flink.table.store.CatalogOptions
-
- LOCK_ENABLED - Static variable in class org.apache.flink.table.store.CatalogOptions
-
- lockFactory() - Method in interface org.apache.flink.table.store.file.catalog.Catalog
-
Get lock factory from catalog.
- lockFactory() - Method in class org.apache.flink.table.store.file.catalog.FileSystemCatalog
-
- lockFactory() - Method in class org.apache.flink.table.store.hive.HiveCatalog
-
- LOG_CHANGELOG_MODE - Static variable in class org.apache.flink.table.store.CoreOptions
-
- LOG_CONSISTENCY - Static variable in class org.apache.flink.table.store.CoreOptions
-
- LOG_DIR - Static variable in class org.apache.flink.table.store.connector.RocksDBOptions
-
- LOG_FILE_NUM - Static variable in class org.apache.flink.table.store.connector.RocksDBOptions
-
- LOG_FORMAT - Static variable in class org.apache.flink.table.store.CoreOptions
-
- LOG_KEY_FORMAT - Static variable in class org.apache.flink.table.store.CoreOptions
-
- LOG_LEVEL - Static variable in class org.apache.flink.table.store.connector.RocksDBOptions
-
- LOG_MAX_FILE_SIZE - Static variable in class org.apache.flink.table.store.connector.RocksDBOptions
-
- LOG_RETENTION - Static variable in class org.apache.flink.table.store.CoreOptions
-
- LOG_SCAN_REMOVE_NORMALIZE - Static variable in class org.apache.flink.table.store.CoreOptions
-
- LOG_SYSTEM - Static variable in class org.apache.flink.table.store.connector.FlinkConnectorOptions
-
- LogHybridSourceFactory - Class in org.apache.flink.table.store.connector.source
-
- LogHybridSourceFactory(LogSourceProvider) - Constructor for class org.apache.flink.table.store.connector.source.LogHybridSourceFactory
-
- logicalBucketKeyType() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- logicalPartitionType() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- logicalRowType() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- logicalTrimmedPrimaryKeysType() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- logicalType - Variable in class org.apache.flink.table.store.file.schema.DataType
-
- logicalType() - Method in class org.apache.flink.table.store.file.schema.DataType
-
Returns the corresponding logical type.
- logicalType() - Method in class org.apache.flink.table.store.file.schema.SchemaChange.AddColumn
-
- logicalTypeToTypeInfo(LogicalType) - Static method in class org.apache.flink.table.store.hive.HiveTypeUtils
-
- logMetric(long, List<DataFileMeta>, List<DataFileMeta>) - Method in class org.apache.flink.table.store.file.compact.CompactTask
-
- logMetric(long, List<DataFileMeta>, List<DataFileMeta>) - Method in class org.apache.flink.table.store.file.mergetree.compact.MergeTreeCompactTask
-
- LogOffsetCommittable - Class in org.apache.flink.table.store.connector.sink
-
Log offset committable for a bucket.
- LogOffsetCommittable(int, long) - Constructor for class org.apache.flink.table.store.connector.sink.LogOffsetCommittable
-
- logOffsets() - Method in class org.apache.flink.table.store.file.manifest.ManifestCommittable
-
- LogSinkFunction - Interface in org.apache.flink.table.store.table.sink
-
- LogSinkFunction.WriteCallback - Interface in org.apache.flink.table.store.table.sink
-
A callback interface that the user can implement to know the offset of the bucket when the
request is complete.
- LogSinkProvider - Interface in org.apache.flink.table.store.log
-
- LogSourceProvider - Interface in org.apache.flink.table.store.log
-
- LogStoreTableFactory - Interface in org.apache.flink.table.store.log
-
Base interface for configuring a default log table connector.
- LogWriteCallback - Class in org.apache.flink.table.store.log
-
- LogWriteCallback() - Constructor for class org.apache.flink.table.store.log.LogWriteCallback
-
- LOOKUP_CACHE_ROWS - Static variable in class org.apache.flink.table.store.connector.RocksDBOptions
-
- LookupTable - Interface in org.apache.flink.table.store.connector.lookup
-
A lookup table which provides get and refresh.
- pack(Iterable<T>, Function<T, Long>, long) - Static method in class org.apache.flink.table.store.utils.OrderedPacking
-
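pack groups an ordered sequence of items into consecutive packs whose accumulated weight stays within a limit. A self-contained sketch of that in-order packing idea — edge-case semantics of the real utility, such as how a single oversized item is handled, are assumptions here:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class OrderedPackingSketch {

    /** Pack items in order; start a new pack when adding the next item would exceed the limit. */
    static <T> List<List<T>> pack(Iterable<T> items, Function<T, Long> weight, long limit) {
        List<List<T>> packs = new ArrayList<>();
        List<T> current = new ArrayList<>();
        long size = 0;
        for (T item : items) {
            long w = weight.apply(item);
            if (!current.isEmpty() && size + w > limit) {
                packs.add(current);
                current = new ArrayList<>();
                size = 0;
            }
            current.add(item);
            size += w;
        }
        if (!current.isEmpty()) {
            packs.add(current);
        }
        return packs;
    }

    public static void main(String[] args) {
        // File sizes packed under a 100-byte limit, preserving order.
        List<Long> files = List.of(40L, 30L, 50L, 10L, 80L);
        System.out.println(pack(files, f -> f, 100L)); // [[40, 30], [50, 10], [80]]
    }
}
```

Because items are never reordered, this is not optimal bin packing; it trades density for preserving the input order, which matters when the items are sorted files.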
- PAGE_SIZE - Static variable in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- PAGE_SIZE - Static variable in class org.apache.flink.table.store.benchmark.metric.cpu.SysInfoLinux
-
- PAGE_SIZE - Static variable in class org.apache.flink.table.store.CoreOptions
-
- pageSize() - Method in class org.apache.flink.table.store.CoreOptions
-
- pageSize() - Method in class org.apache.flink.table.store.file.memory.HeapMemorySegmentPool
-
- PARENT_FIRST_LOGGING_PATTERNS - Static variable in class org.apache.flink.table.store.plugin.PluginLoader
-
- ParquetFileFormat - Class in org.apache.flink.table.store.format.parquet
-
- ParquetFileFormat(Configuration) - Constructor for class org.apache.flink.table.store.format.parquet.ParquetFileFormat
-
- ParquetFileFormatFactory - Class in org.apache.flink.table.store.format.parquet
-
- ParquetFileFormatFactory() - Constructor for class org.apache.flink.table.store.format.parquet.ParquetFileFormatFactory
-
- ParquetFileStatsExtractor - Class in org.apache.flink.table.store.format.parquet
-
- ParquetFileStatsExtractor(RowType) - Constructor for class org.apache.flink.table.store.format.parquet.ParquetFileStatsExtractor
-
- ParquetInputFormatFactory - Class in org.apache.flink.table.store.format.parquet
-
Factory to create Parquet input formats for different Flink versions.
- ParquetInputFormatFactory() - Constructor for class org.apache.flink.table.store.format.parquet.ParquetInputFormatFactory
-
- ParquetUtil - Class in org.apache.flink.table.store.format.parquet
-
Parquet utilities for extracting metadata, asserting expected stats, etc.
- ParquetUtil() - Constructor for class org.apache.flink.table.store.format.parquet.ParquetUtil
-
- parseDate(String) - Static method in class org.apache.flink.table.store.utils.DateTimeUtils
-
- parseTime(String) - Static method in class org.apache.flink.table.store.utils.DateTimeUtils
-
- parseTimestampData(String, int) - Static method in class org.apache.flink.table.store.utils.DateTimeUtils
-
- PARTIAL_UPDATE_IGNORE_DELETE - Static variable in class org.apache.flink.table.store.CoreOptions
-
- PartialUpdateMergeFunction - Class in org.apache.flink.table.store.file.mergetree.compact
-
A MergeFunction where the key is the primary key (unique) and the value is the partial record; non-null fields are updated on merge.
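The partial-update rule described above can be sketched in plain Java: for each incoming field, a non-null value overwrites the current one and a null value keeps it. This is a minimal illustration, not the store's actual merge implementation.

```java
// Hypothetical sketch of partial-update merging: later non-null fields
// overwrite earlier ones, null fields keep the previous value.
public class PartialUpdateSketch {
    // Merge an incoming partial row into the current row, field by field.
    public static Object[] merge(Object[] current, Object[] partial) {
        Object[] result = current.clone();
        for (int i = 0; i < partial.length; i++) {
            if (partial[i] != null) {
                result[i] = partial[i]; // non-null field wins
            }
        }
        return result;
    }
}
```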
- PartialUpdateMergeFunction(RowData.FieldGetter[], boolean) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.PartialUpdateMergeFunction
-
- partition - Variable in class org.apache.flink.table.store.file.manifest.ManifestEntry.Identifier
-
- partition() - Method in class org.apache.flink.table.store.file.manifest.ManifestEntry
-
- partition() - Method in class org.apache.flink.table.store.file.mergetree.compact.IntervalPartition
-
Returns a two-dimensional list of SortedRuns.
- partition() - Method in class org.apache.flink.table.store.table.sink.FileCommittable
-
- partition() - Method in class org.apache.flink.table.store.table.sink.SinkRecord
-
- partition() - Method in class org.apache.flink.table.store.table.source.DataSplit
-
- PARTITION_DEFAULT_NAME - Static variable in class org.apache.flink.table.store.CoreOptions
-
- PARTITION_DEFAULT_NAME - Static variable in class org.apache.flink.table.store.file.utils.FileStorePathFactory
-
- PARTITION_EXPIRATION_CHECK_INTERVAL - Static variable in class org.apache.flink.table.store.CoreOptions
-
- PARTITION_EXPIRATION_TIME - Static variable in class org.apache.flink.table.store.CoreOptions
-
- PARTITION_TIMESTAMP_FORMATTER - Static variable in class org.apache.flink.table.store.CoreOptions
-
- PARTITION_TIMESTAMP_PATTERN - Static variable in class org.apache.flink.table.store.CoreOptions
-
- partitionColumns - Variable in class org.apache.flink.table.store.file.utils.RowDataPartitionComputer
-
- partitionDefaultName() - Method in class org.apache.flink.table.store.CoreOptions
-
- partitionExists(ObjectPath, CatalogPartitionSpec) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- PartitionExpire - Class in org.apache.flink.table.store.file.operation
-
Expire partitions.
- PartitionExpire(RowType, Duration, Duration, String, String, FileStoreScan, FileStoreCommit) - Constructor for class org.apache.flink.table.store.file.operation.PartitionExpire
-
- partitionExpireCheckInterval() - Method in class org.apache.flink.table.store.CoreOptions
-
- partitionExpireTime() - Method in class org.apache.flink.table.store.CoreOptions
-
- partitionFieldGetters - Variable in class org.apache.flink.table.store.file.utils.RowDataPartitionComputer
-
- partitioning() - Method in class org.apache.flink.table.store.spark.SparkTable
-
- partitionKeys() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- partitionKeys() - Method in class org.apache.flink.table.store.file.schema.UpdateSchema
-
- partitionKeys() - Method in interface org.apache.flink.table.store.table.FileStoreTable
-
- partitionKeys() - Method in interface org.apache.flink.table.store.table.SupportsPartition
-
- partitionStats() - Method in class org.apache.flink.table.store.file.manifest.ManifestFileMeta
-
- PartitionTimeExtractor - Class in org.apache.flink.table.store.file.partition
-
Time extractor to extract time from partition values.
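Extracting a time from partition values can be sketched with `java.time`: parse a partition value using a configurable formatter pattern. The class and method names below are illustrative stand-ins for `PartitionTimeExtractor`, whose actual pattern handling may differ.

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

// Hypothetical sketch of partition-time extraction: parse the time from a
// partition value using a configurable formatter pattern.
public class PartitionTimeSketch {
    private final DateTimeFormatter formatter;

    public PartitionTimeSketch(String formatterPattern) {
        this.formatter = DateTimeFormatter.ofPattern(formatterPattern);
    }

    // Extract the partition date from a single partition value like "2022-10-01".
    public LocalDate extract(String partitionValue) {
        return LocalDate.parse(partitionValue, formatter);
    }
}
```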
- PartitionTimeExtractor(String, String) - Constructor for class org.apache.flink.table.store.file.partition.PartitionTimeExtractor
-
- partitionTimestampFormatter() - Method in class org.apache.flink.table.store.CoreOptions
-
- partitionTimestampPattern() - Method in class org.apache.flink.table.store.CoreOptions
-
- partitionType - Variable in class org.apache.flink.table.store.file.AbstractFileStore
-
- partitionType() - Method in class org.apache.flink.table.store.file.AbstractFileStore
-
- partitionType() - Method in interface org.apache.flink.table.store.file.FileStore
-
- partitionWithBucketRowType(RowType) - Static method in class org.apache.flink.table.store.table.system.BucketsTable
-
- PATH - Static variable in class org.apache.flink.table.store.CoreOptions
-
- path() - Method in class org.apache.flink.table.store.CoreOptions
-
- path(Map<String, String>) - Static method in class org.apache.flink.table.store.CoreOptions
-
- path(Configuration) - Static method in class org.apache.flink.table.store.CoreOptions
-
- path - Variable in class org.apache.flink.table.store.file.io.SingleFileWriter
-
- path() - Method in class org.apache.flink.table.store.file.io.SingleFileWriter
-
- path - Variable in class org.apache.flink.table.store.table.AbstractFileStoreTable
-
- pathFactory() - Method in class org.apache.flink.table.store.file.AbstractFileStore
-
- pathFactory() - Method in class org.apache.flink.table.store.file.io.KeyValueFileWriterFactory
-
- PendingSplitsCheckpoint - Class in org.apache.flink.table.store.connector.source
-
A checkpoint of the current state, containing the currently pending splits that are not yet
assigned.
- PendingSplitsCheckpoint(Collection<FileStoreSourceSplit>, Long) - Constructor for class org.apache.flink.table.store.connector.source.PendingSplitsCheckpoint
-
- PendingSplitsCheckpointSerializer - Class in org.apache.flink.table.store.connector.source
-
- PendingSplitsCheckpointSerializer(FileStoreSourceSplitSerializer) - Constructor for class org.apache.flink.table.store.connector.source.PendingSplitsCheckpointSerializer
-
- pick(int, List<LevelSortedRun>) - Method in interface org.apache.flink.table.store.file.mergetree.compact.CompactStrategy
-
Pick a compaction unit from runs.
- pick(int, List<LevelSortedRun>) - Method in class org.apache.flink.table.store.file.mergetree.compact.UniversalCompaction
-
- pickFullCompaction(int, List<LevelSortedRun>) - Static method in interface org.apache.flink.table.store.file.mergetree.compact.CompactStrategy
-
Pick a compaction unit consisting of all existing files.
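A full-compaction pick, as described above, gathers every existing run's files into a single unit targeting the last level. The sketch below uses illustrative stand-in types for `LevelSortedRun` and the compaction unit; the store's real `pickFullCompaction` signature and types differ.

```java
import java.util.List;

// Hypothetical sketch of a full-compaction pick: collect all files from all
// runs into one unit whose output is the maximum level.
public class FullCompactionSketch {
    record Run(int level, List<String> files) {}
    record Unit(int outputLevel, List<String> files) {}

    static Unit pickFullCompaction(int numLevels, List<Run> runs) {
        List<String> all = runs.stream().flatMap(r -> r.files().stream()).toList();
        return new Unit(numLevels - 1, all); // compact everything to the last level
    }
}
```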
- pickTransformFieldMapping(List<Predicate>, List<String>, List<String>) - Static method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- pickTransformFieldMapping(List<Predicate>, int[]) - Static method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- plan() - Method in class org.apache.flink.table.store.file.operation.AbstractFileStoreScan
-
- plan() - Method in interface org.apache.flink.table.store.file.operation.FileStoreScan
-
- plan() - Method in class org.apache.flink.table.store.table.source.AbstractDataTableScan
-
- plan() - Method in interface org.apache.flink.table.store.table.source.DataTableScan
-
- plan() - Method in interface org.apache.flink.table.store.table.source.TableScan
-
- planInputPartitions() - Method in class org.apache.flink.table.store.spark.SparkDataSourceReader
-
- PluginLoader - Class in org.apache.flink.table.store.plugin
-
Loader to load plugin jars.
- PluginLoader(String) - Constructor for class org.apache.flink.table.store.plugin.PluginLoader
-
- preCreateTable(Table) - Method in class org.apache.flink.table.store.hive.TableStoreHiveMetaHook
-
- predicate - Variable in class org.apache.flink.table.store.connector.source.FlinkSource
-
- predicate - Variable in class org.apache.flink.table.store.connector.source.FlinkTableSource
-
- Predicate - Interface in org.apache.flink.table.store.file.predicate
-
Predicate that returns a Boolean and supports testing by stats.
- PREDICATE_CONVERTER - Static variable in class org.apache.flink.table.store.table.system.AuditLogTable
-
- PredicateBuilder - Class in org.apache.flink.table.store.file.predicate
-
A utility class to create Predicate objects for common filter conditions.
- PredicateBuilder(RowType) - Constructor for class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- PredicateConverter - Class in org.apache.flink.table.store.file.predicate
-
- PredicateConverter(RowType) - Constructor for class org.apache.flink.table.store.file.predicate.PredicateConverter
-
- PredicateConverter(PredicateBuilder) - Constructor for class org.apache.flink.table.store.file.predicate.PredicateConverter
-
- PredicateConverter.UnsupportedExpression - Exception in org.apache.flink.table.store.file.predicate
-
Thrown when an unsupported expression is encountered; the caller can choose to ignore this filter branch.
- PredicateFilter - Class in org.apache.flink.table.store.file.predicate
-
- PredicateFilter(RowType, List<Predicate>) - Constructor for class org.apache.flink.table.store.file.predicate.PredicateFilter
-
- PredicateFilter(RowType, Predicate) - Constructor for class org.apache.flink.table.store.file.predicate.PredicateFilter
-
- PredicateReplaceVisitor - Interface in org.apache.flink.table.store.file.predicate
-
- PredicateVisitor<T> - Interface in org.apache.flink.table.store.file.predicate
-
- preDropTable(Table) - Method in class org.apache.flink.table.store.hive.TableStoreHiveMetaHook
-
- prepareCommit(boolean, long) - Method in class org.apache.flink.table.store.connector.sink.FullChangelogStoreSinkWrite
-
- prepareCommit(boolean, long) - Method in class org.apache.flink.table.store.connector.sink.PrepareCommitOperator
-
- prepareCommit(boolean, long) - Method in class org.apache.flink.table.store.connector.sink.StoreCompactOperator
-
- prepareCommit(boolean, long) - Method in class org.apache.flink.table.store.connector.sink.StoreSinkWriteImpl
-
- prepareCommit(boolean, long) - Method in class org.apache.flink.table.store.connector.sink.StoreWriteOperator
-
- prepareCommit(boolean) - Method in class org.apache.flink.table.store.file.append.AppendOnlyWriter
-
- prepareCommit(boolean) - Method in class org.apache.flink.table.store.file.mergetree.MergeTreeWriter
-
- prepareCommit(boolean, long) - Method in class org.apache.flink.table.store.file.operation.AbstractFileStoreWrite
-
- prepareCommit(boolean, long) - Method in interface org.apache.flink.table.store.file.operation.FileStoreWrite
-
Prepare commit in the write.
- prepareCommit(boolean) - Method in interface org.apache.flink.table.store.file.utils.RecordWriter
-
Prepare for a commit.
- prepareCommit(boolean, long) - Method in interface org.apache.flink.table.store.table.sink.TableWrite
-
- prepareCommit(boolean, long) - Method in class org.apache.flink.table.store.table.sink.TableWriteImpl
-
- PrepareCommitOperator - Class in org.apache.flink.table.store.connector.sink
-
- PrepareCommitOperator() - Constructor for class org.apache.flink.table.store.connector.sink.PrepareCommitOperator
-
- prepareSnapshotPreBarrier(long) - Method in class org.apache.flink.table.store.connector.sink.PrepareCommitOperator
-
- PRIMARY_KEY_UNSUPPORTED_LOGICAL_TYPES - Static variable in class org.apache.flink.table.store.file.schema.TableSchema
-
- primaryKey - Variable in class org.apache.flink.table.store.connector.lookup.PrimaryKeyLookupTable
-
- primaryKey() - Method in class org.apache.flink.table.store.table.sink.SinkRecord
-
- PrimaryKeyLookupTable - Class in org.apache.flink.table.store.connector.lookup
-
- PrimaryKeyLookupTable(RocksDBStateFactory, RowType, List<String>, Predicate<RowData>, long) - Constructor for class org.apache.flink.table.store.connector.lookup.PrimaryKeyLookupTable
-
- primaryKeyMapping - Variable in class org.apache.flink.table.store.connector.lookup.PrimaryKeyLookupTable
-
- primaryKeys() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- primaryKeys() - Method in class org.apache.flink.table.store.file.schema.UpdateSchema
-
- printAndLog(Logger, String) - Static method in class org.apache.flink.table.store.benchmark.utils.BenchmarkUtils
-
- printSummary(LinkedHashMap<String, QueryRunner.Result>) - Static method in class org.apache.flink.table.store.benchmark.Benchmark
-
- processElement(StreamRecord<Committable>) - Method in class org.apache.flink.table.store.connector.sink.CommitterOperator
-
- processElement(StreamRecord<RowData>) - Method in class org.apache.flink.table.store.connector.sink.PrepareCommitOperator
-
- processElement(StreamRecord<RowData>) - Method in class org.apache.flink.table.store.connector.sink.StoreCompactOperator
-
- processElement(StreamRecord<RowData>) - Method in class org.apache.flink.table.store.connector.sink.StoreWriteOperator
-
- processSMAPTree - Variable in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- processTree - Variable in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- processWatermark(Watermark) - Method in class org.apache.flink.table.store.connector.sink.StoreWriteOperator
-
- PROCFS_CMDLINE_FILE - Static variable in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- PROCFS_STAT_FILE - Static variable in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- ProcfsBasedProcessTree - Class in org.apache.flink.table.store.benchmark.metric.cpu
-
A Proc file-system based ProcessTree.
- ProcfsBasedProcessTree() - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- ProcfsBasedProcessTree(boolean) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- ProcfsBasedProcessTree(String) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- ProcfsBasedProcessTree(String, boolean) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- ProcfsBasedProcessTree(String, String) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- ProcfsBasedProcessTree(String, String, Clock, boolean) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
Build a new process tree rooted at the pid.
- produceDataStream(ProviderContext, StreamExecutionEnvironment) - Method in class org.apache.flink.table.store.connector.TableStoreDataStreamScanProvider
-
- project(int[][], int[][], int) - Static method in class org.apache.flink.table.store.file.KeyValue
-
- project(DataType) - Method in class org.apache.flink.table.store.utils.Projection
-
Projects a (possibly nested) row data type by returning a new data type that only includes
fields of the given index paths.
- project(LogicalType) - Method in class org.apache.flink.table.store.utils.Projection
-
- project(T[]) - Method in class org.apache.flink.table.store.utils.Projection
-
Project array.
- project(List<T>) - Method in class org.apache.flink.table.store.utils.Projection
-
Project list.
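Projecting a list by index, as in `Projection.project(List<T>)`, can be sketched with a simple index-mapping wrapper. The class below is an illustrative stand-in, not the utility's actual implementation.

```java
import java.util.List;

// Hypothetical sketch of index-based projection over a list: pick the
// elements at the configured indexes, in the configured order.
public class ProjectionSketch {
    private final int[] indexes;

    public ProjectionSketch(int[] indexes) {
        this.indexes = indexes;
    }

    public <T> List<T> project(List<T> values) {
        return java.util.Arrays.stream(indexes).mapToObj(values::get).toList();
    }
}
```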
- project(RowType, int[]) - Static method in class org.apache.flink.table.store.utils.TypeUtils
-
- projectedFields - Variable in class org.apache.flink.table.store.connector.source.FlinkSource
-
- ProjectedRowData - Class in org.apache.flink.table.store.utils
-
An implementation of RowData which provides a projected view of the underlying RowData.
- ProjectedRowData(int[]) - Constructor for class org.apache.flink.table.store.utils.ProjectedRowData
-
- projectFields - Variable in class org.apache.flink.table.store.connector.source.FlinkTableSource
-
- Projection - Interface in org.apache.flink.table.store.codegen
-
Interface for a code-generated projection, which maps a RowData to another BinaryRowData.
- projection(List<String>) - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- Projection - Class in org.apache.flink.table.store.utils
-
Projection represents a list of (possibly nested) indexes that can be used to project
data types.
- properties() - Method in class org.apache.flink.table.store.spark.SparkTable
-
- pruneColumns(StructType) - Method in class org.apache.flink.table.store.spark.SparkDataSourceReader
-
- pruneColumns(StructType) - Method in class org.apache.flink.table.store.spark.SparkScanBuilder
-
- pushedFilters() - Method in class org.apache.flink.table.store.spark.SparkDataSourceReader
-
- pushedFilters() - Method in class org.apache.flink.table.store.spark.SparkScanBuilder
-
- pushFilters(Filter[]) - Method in class org.apache.flink.table.store.spark.SparkDataSourceReader
-
- pushFilters(Filter[]) - Method in class org.apache.flink.table.store.spark.SparkScanBuilder
-
- put(RowData, RowData) - Method in class org.apache.flink.table.store.connector.lookup.RocksDBValueState
-
- put(long, RowKind, RowData, RowData) - Method in class org.apache.flink.table.store.file.mergetree.SortBufferWriteBuffer
-
- put(long, RowKind, RowData, RowData) - Method in interface org.apache.flink.table.store.file.mergetree.WriteBuffer
-
Put a record with sequence number and value kind.
- putKey(RowData, MemorySegment, int) - Method in interface org.apache.flink.table.store.codegen.NormalizedKeyComputer
-
Writes a normalized key for the given record into the target MemorySegment.
- range(int, int) - Static method in class org.apache.flink.table.store.utils.Projection
-
- read(String) - Method in class org.apache.flink.table.store.file.manifest.ManifestFile
-
- read(String) - Method in class org.apache.flink.table.store.file.manifest.ManifestList
-
- read(SpecializedGetters, int, DataType) - Static method in class org.apache.flink.table.store.spark.SpecializedGettersReader
-
- read - Variable in class org.apache.flink.table.store.table.source.KeyValueTableRead
-
- readAllDataManifests(ManifestList) - Method in class org.apache.flink.table.store.file.Snapshot
-
- readBatch() - Method in class org.apache.flink.table.store.file.io.KeyValueDataFileRecordReader
-
- readBatch() - Method in class org.apache.flink.table.store.file.io.RowDataFileRecordReader
-
- readBatch() - Method in class org.apache.flink.table.store.file.mergetree.compact.ConcatRecordReader
-
- readBatch() - Method in class org.apache.flink.table.store.file.mergetree.compact.SortMergeReader
-
- readBatch() - Method in class org.apache.flink.table.store.file.mergetree.DropDeleteReader
-
- readBatch() - Method in class org.apache.flink.table.store.file.utils.IteratorRecordReader
-
- readBatch() - Method in interface org.apache.flink.table.store.file.utils.RecordReader
-
Reads one batch.
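The batch-reading contract can be sketched as a loop: pull batches until the reader returns null, consume each batch, then release it. The interfaces below are simplified stand-ins for `RecordReader` and `RecordReader.RecordIterator`, not the store's actual API.

```java
import java.util.Iterator;
import java.util.List;

// Hypothetical sketch of the readBatch() consumption pattern.
public class ReaderSketch {
    interface Batch<T> extends Iterator<T> {
        default void releaseBatch() {} // return buffers to a pool, etc.
    }

    interface Reader<T> {
        Batch<T> readBatch(); // null when the reader is exhausted
    }

    // A reader backed by in-memory batches, for demonstration only.
    static <T> Reader<T> fromBatches(List<List<T>> batches) {
        Iterator<List<T>> outer = batches.iterator();
        return () -> {
            if (!outer.hasNext()) {
                return null;
            }
            Iterator<T> inner = outer.next().iterator();
            return new Batch<T>() {
                public boolean hasNext() { return inner.hasNext(); }
                public T next() { return inner.next(); }
            };
        };
    }

    // Count every record by draining the reader batch by batch.
    static <T> int drain(Reader<T> reader) {
        int count = 0;
        Batch<T> batch;
        while ((batch = reader.readBatch()) != null) {
            while (batch.hasNext()) {
                batch.next();
                count++;
            }
            batch.releaseBatch();
        }
        return count;
    }
}
```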
- readerFactory - Variable in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeBenchmark
-
- readerFactory - Variable in class org.apache.flink.table.store.file.mergetree.compact.MergeTreeCompactRewriter
-
- readerForMergeTree(List<List<SortedRun>>, boolean, KeyValueFileReaderFactory, Comparator<RowData>, MergeFunction<KeyValue>) - Static method in class org.apache.flink.table.store.file.mergetree.MergeTreeReaders
-
- readerForRun(SortedRun, KeyValueFileReaderFactory) - Static method in class org.apache.flink.table.store.file.mergetree.MergeTreeReaders
-
- readerForSection(List<SortedRun>, KeyValueFileReaderFactory, Comparator<RowData>, MergeFunctionWrapper<KeyValue>) - Static method in class org.apache.flink.table.store.file.mergetree.MergeTreeReaders
-
- readerSchema - Variable in class org.apache.flink.table.store.format.avro.AbstractAvroBulkFormat
-
- readFields(DataInput) - Method in class org.apache.flink.table.store.mapred.TableStoreInputSplit
-
- readFields(DataInput) - Method in class org.apache.flink.table.store.RowDataContainer
-
- readFileUtf8(Path) - Static method in class org.apache.flink.table.store.file.utils.FileUtils
-
- readHint(String) - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- readListFromFile(Path, ObjectSerializer<T>, BulkFormat<RowData, FileSourceSplit>) - Static method in class org.apache.flink.table.store.file.utils.FileUtils
-
- readSchema() - Method in class org.apache.flink.table.store.spark.SparkDataSourceReader
-
- readSchema() - Method in class org.apache.flink.table.store.spark.SparkScan
-
- recordBuffer - Variable in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- RecordComparator - Interface in org.apache.flink.table.store.codegen
-
Record comparator for BinaryInMemorySortBuffer.
- recordCount() - Method in interface org.apache.flink.table.store.file.io.FileWriter
-
The total written record count.
- recordCount() - Method in class org.apache.flink.table.store.file.io.RollingFileWriter
-
- recordCount() - Method in class org.apache.flink.table.store.file.io.SingleFileWriter
-
- recordFilter - Variable in class org.apache.flink.table.store.connector.lookup.PrimaryKeyLookupTable
-
- RecordReader<T> - Interface in org.apache.flink.table.store.file.utils
-
A reader that reads batches of records.
- RecordReader.RecordIterator<T> - Interface in org.apache.flink.table.store.file.utils
-
An internal iterator interface which presents a more restrictive API than Iterator.
- RecordReaderIterator<T> - Class in org.apache.flink.table.store.file.utils
-
- RecordReaderIterator(RecordReader<T>) - Constructor for class org.apache.flink.table.store.file.utils.RecordReaderIterator
-
- RecordReaderUtils - Class in org.apache.flink.table.store.file.utils
-
- RecordReaderUtils() - Constructor for class org.apache.flink.table.store.file.utils.RecordReaderUtils
-
- recordSize() - Method in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- recordsPerSegment() - Method in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- recordsToSkip() - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceSplit
-
- recordsToSkip() - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceSplitState
-
- RecordWriter<T> - Interface in org.apache.flink.table.store.file.utils
-
The RecordWriter is responsible for writing data and handling in-progress files used to
write data that is not yet staged.
- RecordWriter.CommitIncrement - Interface in org.apache.flink.table.store.file.utils
-
Changes to commit.
- RecoverableAtomicFileWriter - Class in org.apache.flink.table.store.file.utils
-
- RecoverableAtomicFileWriter(RecoverableWriter) - Constructor for class org.apache.flink.table.store.file.utils.RecoverableAtomicFileWriter
-
- ReducerMergeFunctionWrapper - Class in org.apache.flink.table.store.file.mergetree.compact
-
- ReducerMergeFunctionWrapper(MergeFunction<KeyValue>) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.ReducerMergeFunctionWrapper
-
- ref(byte[]) - Method in class org.apache.flink.table.store.connector.lookup.RocksDBState
-
- Reference(byte[]) - Constructor for class org.apache.flink.table.store.connector.lookup.RocksDBState.Reference
-
- ReflectionUtils - Class in org.apache.flink.table.store.utils
-
Utils for Java reflection.
- ReflectionUtils() - Constructor for class org.apache.flink.table.store.utils.ReflectionUtils
-
- refresh(Iterator<RowData>) - Method in interface org.apache.flink.table.store.connector.lookup.LookupTable
-
- refresh(Iterator<RowData>) - Method in class org.apache.flink.table.store.connector.lookup.PrimaryKeyLookupTable
-
- refresh(Iterator<RowData>) - Method in class org.apache.flink.table.store.connector.lookup.SecondaryIndexLookupTable
-
- relativeTablePath(ObjectIdentifier) - Static method in class org.apache.flink.table.store.connector.FlinkConnectorOptions
-
- relativeTimeMillis() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.clock.Clock
-
- relativeTimeMillis() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.clock.SystemClock
-
- relativeTimeNanos() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.clock.Clock
-
- relativeTimeNanos() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.clock.SystemClock
-
- releaseBatch() - Method in interface org.apache.flink.table.store.file.utils.RecordReader.RecordIterator
-
Releases the batch that this iterator iterated over.
- releaseBatch() - Method in class org.apache.flink.table.store.table.source.ResetRowKindRecordIterator
-
- removeOption(String) - Static method in interface org.apache.flink.table.store.file.schema.SchemaChange
-
- rename(Path, Path) - Method in class org.apache.flink.table.store.format.fs.HadoopReadOnlyFileSystem
-
- renameColumn(String, String) - Static method in interface org.apache.flink.table.store.file.schema.SchemaChange
-
- renameTable(ObjectPath, String, boolean) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- renameTable(ObjectPath, ObjectPath, boolean) - Method in interface org.apache.flink.table.store.file.catalog.Catalog
-
Rename a table.
- renameTable(ObjectPath, ObjectPath, boolean) - Method in class org.apache.flink.table.store.file.catalog.FileSystemCatalog
-
- renameTable(ObjectPath, ObjectPath, boolean) - Method in class org.apache.flink.table.store.hive.HiveCatalog
-
- renameTable(Identifier, Identifier) - Method in class org.apache.flink.table.store.spark.SparkCatalog
-
- RenamingAtomicFsDataOutputStream - Class in org.apache.flink.table.store.file.utils
-
A stream that initially writes to hidden or temp files and only creates the target file once it
is closed and "committed".
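The write-then-rename pattern behind this stream can be sketched with `java.nio.file`: write to a sibling temp file, then atomically move it to the target on commit. The helper name is illustrative; the real class layers this over Flink's FileSystem API instead.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Hypothetical sketch of committing a file atomically: readers never observe
// a partially written target because the temp file is renamed in one step.
public class AtomicWriteSketch {
    static void writeAtomically(Path target, byte[] contents) throws Exception {
        Path temp = target.resolveSibling("." + target.getFileName() + ".tmp");
        Files.write(temp, contents); // stage all bytes in the hidden temp file
        Files.move(temp, target, StandardCopyOption.ATOMIC_MOVE); // commit
    }
}
```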
- RenamingAtomicFsDataOutputStream(FileSystem, Path, Path) - Constructor for class org.apache.flink.table.store.file.utils.RenamingAtomicFsDataOutputStream
-
- replace(RowData, RowKind, RowData) - Method in class org.apache.flink.table.store.file.KeyValue
-
- replace(RowData, long, RowKind, RowData) - Method in class org.apache.flink.table.store.file.KeyValue
-
- replace(RowData) - Method in class org.apache.flink.table.store.file.utils.OffsetRowData
-
- replace(ArrayData) - Method in class org.apache.flink.table.store.spark.SparkArrayData
-
- replace(RowData) - Method in class org.apache.flink.table.store.spark.SparkInternalRow
-
- replaceKey(RowData) - Method in class org.apache.flink.table.store.file.KeyValue
-
- replaceRow(RowData) - Method in class org.apache.flink.table.store.utils.KeyProjectedRowData
-
- replaceRow(RowData) - Method in class org.apache.flink.table.store.utils.ProjectedRowData
-
- reportMetric(String, String) - Method in class org.apache.flink.table.store.benchmark.metric.MetricReporter
-
- requiredOptions() - Method in class org.apache.flink.table.store.connector.AbstractTableStoreFactory
-
- requiredOptions() - Method in class org.apache.flink.table.store.connector.FlinkCatalogFactory
-
- requiredOptions() - Method in class org.apache.flink.table.store.kafka.KafkaLogStoreFactory
-
- reset() - Method in class org.apache.flink.table.store.file.mergetree.compact.aggregate.AggregateMergeFunction
-
- reset() - Method in class org.apache.flink.table.store.file.mergetree.compact.DeduplicateMergeFunction
-
- reset() - Method in class org.apache.flink.table.store.file.mergetree.compact.FullChangelogMergeFunctionWrapper
-
- reset() - Method in interface org.apache.flink.table.store.file.mergetree.compact.MergeFunction
-
Reset the merge function to its default state.
- reset() - Method in interface org.apache.flink.table.store.file.mergetree.compact.MergeFunctionWrapper
-
- reset() - Method in class org.apache.flink.table.store.file.mergetree.compact.PartialUpdateMergeFunction
-
- reset() - Method in class org.apache.flink.table.store.file.mergetree.compact.ReducerMergeFunctionWrapper
-
- reset() - Method in class org.apache.flink.table.store.file.mergetree.compact.ValueCountMergeFunction
-
- ResetRowKindRecordIterator - Class in org.apache.flink.table.store.table.source
-
A RecordReader.RecordIterator which resets RowKind.INSERT to the previous key value.
- ResetRowKindRecordIterator(RecordReader.RecordIterator<KeyValue>) - Constructor for class org.apache.flink.table.store.table.source.ResetRowKindRecordIterator
-
- RestoreAndFailCommittableStateManager - Class in org.apache.flink.table.store.connector.sink
-
- RestoreAndFailCommittableStateManager(SerializableSupplier<SimpleVersionedSerializer<ManifestCommittable>>) - Constructor for class org.apache.flink.table.store.connector.sink.RestoreAndFailCommittableStateManager
-
- restoreEnumerator(SplitEnumeratorContext<FileStoreSourceSplit>, PendingSplitsCheckpoint) - Method in class org.apache.flink.table.store.connector.source.ContinuousFileStoreSource
-
- restoreEnumerator(SplitEnumeratorContext<FileStoreSourceSplit>, PendingSplitsCheckpoint) - Method in class org.apache.flink.table.store.connector.source.SimpleSystemSource
-
- restoreEnumerator(SplitEnumeratorContext<FileStoreSourceSplit>, PendingSplitsCheckpoint) - Method in class org.apache.flink.table.store.connector.source.StaticFileStoreSource
-
- restoreReader(Configuration, SplitT) - Method in class org.apache.flink.table.store.format.avro.AbstractAvroBulkFormat
-
- result() - Method in interface org.apache.flink.table.store.file.io.FileWriter
-
- result() - Method in class org.apache.flink.table.store.file.io.KeyValueDataFileWriter
-
- result() - Method in class org.apache.flink.table.store.file.io.RollingFileWriter
-
- result() - Method in class org.apache.flink.table.store.file.io.RowDataFileWriter
-
- result() - Method in class org.apache.flink.table.store.file.mergetree.compact.FullChangelogMergeFunctionWrapper.Result
-
Latest full compaction result (result of merge function) for this key.
- retract(RowData, RowData) - Method in class org.apache.flink.table.store.connector.lookup.RocksDBSetState
-
- returnAll(List<MemorySegment>) - Method in class org.apache.flink.table.store.file.memory.HeapMemorySegmentPool
-
- rewrite(List<DataFileMeta>) - Method in interface org.apache.flink.table.store.file.append.AppendOnlyCompactManager.CompactRewriter
-
- rewrite(int, boolean, List<List<SortedRun>>) - Method in interface org.apache.flink.table.store.file.mergetree.compact.CompactRewriter
-
- rewrite(int, boolean, List<List<SortedRun>>) - Method in class org.apache.flink.table.store.file.mergetree.compact.FullChangelogMergeTreeCompactRewriter
-
- rewrite(int, boolean, List<List<SortedRun>>) - Method in class org.apache.flink.table.store.file.mergetree.compact.MergeTreeCompactRewriter
-
- rewriteCompaction(int, boolean, List<List<SortedRun>>) - Method in class org.apache.flink.table.store.file.mergetree.compact.MergeTreeCompactRewriter
-
- RocksDBOptions - Class in org.apache.flink.table.store.connector
-
Options for RocksDB.
- RocksDBOptions() - Constructor for class org.apache.flink.table.store.connector.RocksDBOptions
-
- RocksDBSetState - Class in org.apache.flink.table.store.connector.lookup
-
RocksDB state for key -> set of values.
- RocksDBSetState(RocksDB, ColumnFamilyHandle, TypeSerializer<RowData>, TypeSerializer<RowData>, long) - Constructor for class org.apache.flink.table.store.connector.lookup.RocksDBSetState
-
- RocksDBState<CacheV> - Class in org.apache.flink.table.store.connector.lookup
-
RocksDB state for key-value pairs.
- RocksDBState(RocksDB, ColumnFamilyHandle, TypeSerializer<RowData>, TypeSerializer<RowData>, long) - Constructor for class org.apache.flink.table.store.connector.lookup.RocksDBState
-
- RocksDBState.ByteArray - Class in org.apache.flink.table.store.connector.lookup
-
A class that wraps byte[] to implement equals and hashCode.
- RocksDBState.Reference - Class in org.apache.flink.table.store.connector.lookup
-
A class that wraps byte[] to indicate whether a value is contained or not.
- RocksDBStateFactory - Class in org.apache.flink.table.store.connector.lookup
-
Factory to create state.
- RocksDBStateFactory(String, Configuration) - Constructor for class org.apache.flink.table.store.connector.lookup.RocksDBStateFactory
-
- RocksDBValueState - Class in org.apache.flink.table.store.connector.lookup
-
RocksDB state for key -> a single value.
- RocksDBValueState(RocksDB, ColumnFamilyHandle, TypeSerializer<RowData>, TypeSerializer<RowData>, long) - Constructor for class org.apache.flink.table.store.connector.lookup.RocksDBValueState
-
- rollbackCreateTable(Table) - Method in class org.apache.flink.table.store.hive.TableStoreHiveMetaHook
-
- rollbackDropTable(Table) - Method in class org.apache.flink.table.store.hive.TableStoreHiveMetaHook
-
- RollingFileWriter<T,R> - Class in org.apache.flink.table.store.file.io
-
Writer that rolls over to a new file if the current size exceeds the target file size.
- RollingFileWriter(Supplier<? extends SingleFileWriter<T, R>>, long) - Constructor for class org.apache.flink.table.store.file.io.RollingFileWriter
-
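The roll-over policy described for RollingFileWriter can be sketched as follows. This is a minimal, hypothetical simplification (lists of strings stand in for files and the SingleFileWriter supplier), not the actual class:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the rolling idea: records go to the current "file" until its
// size reaches the target, then a new file is started.
public class RollingSketch {
    final long targetSize;
    final List<List<String>> files = new ArrayList<>();
    long currentSize = 0;

    RollingSketch(long targetSize) {
        this.targetSize = targetSize;
        files.add(new ArrayList<>());
    }

    void write(String record) {
        if (currentSize >= targetSize) { // roll over before writing
            files.add(new ArrayList<>());
            currentSize = 0;
        }
        files.get(files.size() - 1).add(record);
        currentSize += record.length();
    }

    public static void main(String[] args) {
        RollingSketch w = new RollingSketch(10);
        for (int i = 0; i < 6; i++) {
            w.write("abcde"); // 5 bytes each; rolls after every 2 records
        }
        System.out.println(w.files.size()); // prints 3
    }
}
```

In the real class, the supplier presumably creates a fresh SingleFileWriter on each roll; the sketch only illustrates the size-triggered roll-over.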
- root() - Method in class org.apache.flink.table.store.file.utils.FileStorePathFactory
-
- ROOT_PATH - Static variable in class org.apache.flink.table.store.connector.FlinkConnectorOptions
-
- row() - Method in class org.apache.flink.table.store.table.sink.SinkRecord
-
- row - Variable in class org.apache.flink.table.store.utils.ProjectedRowData
-
- row1 - Variable in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- ROW_KIND - Static variable in class org.apache.flink.table.store.table.system.AuditLogTable
-
- RowAggregator(Configuration, List<String>, List<LogicalType>, List<String>) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.aggregate.AggregateMergeFunction.RowAggregator
-
- rowCount() - Method in class org.apache.flink.table.store.file.io.DataFileMeta
-
- rowCount() - Method in class org.apache.flink.table.store.table.source.DataSplit
-
- rowCount() - Method in interface org.apache.flink.table.store.table.source.Split
-
- RowDataContainer - Class in org.apache.flink.table.store
-
A reusable object to hold RowData.
- RowDataContainer() - Constructor for class org.apache.flink.table.store.RowDataContainer
-
- RowDataFileRecordReader - Class in org.apache.flink.table.store.file.io
-
Reads RowData from data files.
- RowDataFileRecordReader(Path, BulkFormat<RowData, FileSourceSplit>, int[]) - Constructor for class org.apache.flink.table.store.file.io.RowDataFileRecordReader
-
- RowDataFileWriter - Class in org.apache.flink.table.store.file.io
-
- RowDataFileWriter(BulkWriter.Factory<RowData>, Path, RowType, FileStatsExtractor, long, LongCounter) - Constructor for class org.apache.flink.table.store.file.io.RowDataFileWriter
-
- RowDataPartitionComputer - Class in org.apache.flink.table.store.file.utils
-
PartitionComputer for RowData.
- RowDataPartitionComputer(String, RowType, String[]) - Constructor for class org.apache.flink.table.store.file.utils.RowDataPartitionComputer
-
- rowDataRecordIteratorFromKv(RecordReader.RecordIterator<KeyValue>) - Method in class org.apache.flink.table.store.table.source.KeyValueTableRead
-
- RowDataRollingFileWriter - Class in org.apache.flink.table.store.file.io
-
- RowDataRollingFileWriter(long, FileFormat, long, RowType, DataFilePathFactory, LongCounter) - Constructor for class org.apache.flink.table.store.file.io.RowDataRollingFileWriter
-
- RowDataToObjectArrayConverter - Class in org.apache.flink.table.store.utils
-
Converts RowData to an object array.
- RowDataToObjectArrayConverter(RowType) - Constructor for class org.apache.flink.table.store.utils.RowDataToObjectArrayConverter
-
- rowDataToString(RowData, RowType) - Static method in class org.apache.flink.table.store.file.KeyValue
-
- RowDataType - Class in org.apache.flink.table.store.file.schema
-
A data type that contains field data types.
- RowDataType(List<DataField>) - Constructor for class org.apache.flink.table.store.file.schema.RowDataType
-
- RowDataType(boolean, List<DataField>) - Constructor for class org.apache.flink.table.store.file.schema.RowDataType
-
- RowDataUtils - Class in org.apache.flink.table.store.utils
-
Utils for RowData structures.
- RowDataUtils() - Constructor for class org.apache.flink.table.store.utils.RowDataUtils
-
- rowNum - Variable in class org.apache.flink.table.store.benchmark.Query.WriteSql
-
- rowSerializer - Variable in class org.apache.flink.table.store.file.utils.ObjectSerializer
-
- rowType() - Method in class org.apache.flink.table.store.file.schema.UpdateSchema
-
- rowType() - Method in interface org.apache.flink.table.store.table.FileStoreTable
-
- rowType() - Method in class org.apache.flink.table.store.table.system.AuditLogTable
-
- rowType() - Method in class org.apache.flink.table.store.table.system.BucketsTable
-
- rowType() - Method in class org.apache.flink.table.store.table.system.OptionsTable
-
- rowType() - Method in class org.apache.flink.table.store.table.system.SchemasTable
-
- rowType() - Method in class org.apache.flink.table.store.table.system.SnapshotsTable
-
- rowType() - Method in interface org.apache.flink.table.store.table.Table
-
- rowType() - Method in class org.apache.flink.table.store.utils.RowDataToObjectArrayConverter
-
- run() - Method in class org.apache.flink.table.store.benchmark.QueryRunner
-
- run() - Method in interface org.apache.flink.table.store.connector.action.Action
-
The execution method of the action.
- run() - Method in class org.apache.flink.table.store.connector.action.CompactAction
-
- run() - Method in class org.apache.flink.table.store.connector.action.DropPartitionAction
-
- run() - Method in class org.apache.flink.table.store.file.mergetree.LevelSortedRun
-
- runBlocking() - Method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess.AutoClosableProcessBuilder
-
- runBlocking(Duration) - Method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess.AutoClosableProcessBuilder
-
- runBlocking(String...) - Static method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess
-
- runBlockingWithRetry(int, Duration, Duration) - Method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess.AutoClosableProcessBuilder
-
- runNonBlocking() - Method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess.AutoClosableProcessBuilder
-
- runNonBlocking(String...) - Static method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess
-
- runOfLevel(int) - Method in class org.apache.flink.table.store.file.mergetree.Levels
-
- runServer() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.CpuMetricReceiver
-
- runServerBlocking() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.CpuMetricReceiver
-
- runWithLock(String, String, Callable<T>) - Method in interface org.apache.flink.table.store.file.catalog.CatalogLock
-
Run with catalog lock.
- runWithLock(Callable<T>) - Method in class org.apache.flink.table.store.file.operation.Lock.CatalogLockImpl
-
- runWithLock(Callable<T>) - Method in class org.apache.flink.table.store.file.operation.Lock.EmptyLock
-
- runWithLock(Callable<T>) - Method in interface org.apache.flink.table.store.file.operation.Lock
-
Run with lock.
- runWithLock(String, String, Callable<T>) - Method in class org.apache.flink.table.store.hive.HiveCatalogLock
-
- S3Loader - Class in org.apache.flink.table.store.s3
-
- S3Loader() - Constructor for class org.apache.flink.table.store.s3.S3Loader
-
- safelyListFileStatus(Path) - Static method in class org.apache.flink.table.store.file.utils.FileUtils
-
- SCAN_MODE - Static variable in class org.apache.flink.table.store.CoreOptions
-
- SCAN_PARALLELISM - Static variable in class org.apache.flink.table.store.connector.FlinkConnectorOptions
-
- SCAN_PLAN_SORT_PARTITION - Static variable in class org.apache.flink.table.store.CoreOptions
-
- SCAN_SNAPSHOT_ID - Static variable in class org.apache.flink.table.store.CoreOptions
-
- SCAN_TIMESTAMP_MILLIS - Static variable in class org.apache.flink.table.store.CoreOptions
-
- scanAll() - Method in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeReaderBenchmark
-
- scanExistingFileMetas(Long, BinaryRowData, int) - Method in class org.apache.flink.table.store.file.operation.AbstractFileStoreWrite
-
- ScanKind - Enum in org.apache.flink.table.store.file.operation
-
Specifies which part of the snapshot to scan.
- scanPlanSortPartition() - Method in class org.apache.flink.table.store.CoreOptions
-
- scanRps - Variable in class org.apache.flink.table.store.benchmark.QueryRunner.Result
-
- scanSnapshotId() - Method in class org.apache.flink.table.store.CoreOptions
-
- scanTableSchema() - Method in class org.apache.flink.table.store.file.operation.AbstractFileStoreScan
-
Note: Keep this thread-safe.
- scanTableSchema(long) - Method in class org.apache.flink.table.store.file.operation.AbstractFileStoreScan
-
Note: Keep this thread-safe.
- scanTimestampMills() - Method in class org.apache.flink.table.store.CoreOptions
-
- schema() - Static method in class org.apache.flink.table.store.file.io.DataFileMeta
-
- schema(RowType, RowType) - Static method in class org.apache.flink.table.store.file.KeyValue
-
- schema() - Static method in class org.apache.flink.table.store.file.manifest.ManifestEntry
-
- schema() - Static method in class org.apache.flink.table.store.file.manifest.ManifestFileMeta
-
- schema(long) - Method in class org.apache.flink.table.store.file.schema.SchemaManager
-
Read the schema for the given schema id.
- schema() - Static method in class org.apache.flink.table.store.file.stats.FieldStatsArraySerializer
-
- schema() - Method in class org.apache.flink.table.store.spark.SparkTable
-
- schema() - Method in class org.apache.flink.table.store.table.AbstractFileStoreTable
-
- schema() - Method in interface org.apache.flink.table.store.table.FileStoreTable
-
- SchemaChange - Interface in org.apache.flink.table.store.file.schema
-
A schema change to a table.
- SchemaChange.AddColumn - Class in org.apache.flink.table.store.file.schema
-
A SchemaChange to add a field.
- SchemaChange.DropColumn - Class in org.apache.flink.table.store.file.schema
-
A SchemaChange to drop a field.
- SchemaChange.RemoveOption - Class in org.apache.flink.table.store.file.schema
-
A SchemaChange to remove a table option.
- SchemaChange.RenameColumn - Class in org.apache.flink.table.store.file.schema
-
A SchemaChange to rename a field.
- SchemaChange.SetOption - Class in org.apache.flink.table.store.file.schema
-
A SchemaChange to set a table option.
- SchemaChange.UpdateColumnComment - Class in org.apache.flink.table.store.file.schema
-
A SchemaChange to update the (nested) field comment.
- SchemaChange.UpdateColumnNullability - Class in org.apache.flink.table.store.file.schema
-
A SchemaChange to update the (nested) field nullability.
- SchemaChange.UpdateColumnType - Class in org.apache.flink.table.store.file.schema
-
A SchemaChange to update the field type.
- SchemaEvolutionUtil - Class in org.apache.flink.table.store.file.schema
-
Utils for schema evolution.
- SchemaEvolutionUtil() - Constructor for class org.apache.flink.table.store.file.schema.SchemaEvolutionUtil
-
- schemaId - Variable in class org.apache.flink.table.store.file.AbstractFileStore
-
- schemaId() - Method in class org.apache.flink.table.store.file.io.DataFileMeta
-
- schemaId() - Method in class org.apache.flink.table.store.file.manifest.ManifestFileMeta
-
- schemaId() - Method in class org.apache.flink.table.store.file.Snapshot
-
- schemaManager - Variable in class org.apache.flink.table.store.file.AbstractFileStore
-
- SchemaManager - Class in org.apache.flink.table.store.file.schema
-
Schema Manager to manage schema versions.
- SchemaManager(Path) - Constructor for class org.apache.flink.table.store.file.schema.SchemaManager
-
- schemaManager() - Method in class org.apache.flink.table.store.table.AbstractFileStoreTable
-
- SCHEMAS - Static variable in class org.apache.flink.table.store.table.system.SchemasTable
-
- SchemaSerializer - Class in org.apache.flink.table.store.file.schema
-
- SchemaSerializer() - Constructor for class org.apache.flink.table.store.file.schema.SchemaSerializer
-
- SchemasTable - Class in org.apache.flink.table.store.table.system
-
A Table for showing the schemas of a table.
- SchemasTable(Path) - Constructor for class org.apache.flink.table.store.table.system.SchemasTable
-
- SearchArgumentToPredicateConverter - Class in org.apache.flink.table.store
-
Converts SearchArgument to Predicate with best effort.
- SearchArgumentToPredicateConverter(SearchArgument, List<String>, List<LogicalType>) - Constructor for class org.apache.flink.table.store.SearchArgumentToPredicateConverter
-
- SecondaryIndexLookupTable - Class in org.apache.flink.table.store.connector.lookup
-
A LookupTable for a primary key table which provides lookup by a secondary key.
- SecondaryIndexLookupTable(RocksDBStateFactory, RowType, List<String>, List<String>, Predicate<RowData>, long) - Constructor for class org.apache.flink.table.store.connector.lookup.SecondaryIndexLookupTable
-
- select(int, int) - Method in class org.apache.flink.table.store.file.predicate.BucketSelector
-
- selectChannel(SerializationDelegate<StreamRecord<RowData>>) - Method in class org.apache.flink.table.store.connector.sink.BucketStreamPartitioner
-
- selectChannel(SerializationDelegate<StreamRecord<RowData>>) - Method in class org.apache.flink.table.store.connector.sink.OffsetRowDataHashStreamPartitioner
-
- SEQUENCE_FIELD - Static variable in class org.apache.flink.table.store.CoreOptions
-
- SEQUENCE_NUMBER - Static variable in class org.apache.flink.table.store.file.schema.TableSchema
-
- sequenceField() - Method in class org.apache.flink.table.store.CoreOptions
-
- SequenceGenerator - Class in org.apache.flink.table.store.table.sink
-
Generates sequence numbers.
- SequenceGenerator(String, RowType) - Constructor for class org.apache.flink.table.store.table.sink.SequenceGenerator
-
- sequenceNumber - Variable in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeBenchmark
-
- sequenceNumber() - Method in class org.apache.flink.table.store.file.KeyValue
-
- SerializableCommittable - Class in org.apache.flink.table.store.table.sink
-
- SerializableCommittable() - Constructor for class org.apache.flink.table.store.table.sink.SerializableCommittable
-
- SerializableHiveConf - Class in org.apache.flink.table.store.hive
-
Wraps HiveConf in a serializable class.
- SerializableHiveConf(HiveConf) - Constructor for class org.apache.flink.table.store.hive.SerializableHiveConf
-
- SerializationUtils - Class in org.apache.flink.table.store.file.utils
-
Utils for serialization.
- SerializationUtils() - Constructor for class org.apache.flink.table.store.file.utils.SerializationUtils
-
- serialize(Committable) - Method in class org.apache.flink.table.store.connector.sink.CommittableSerializer
-
- serialize(FileStoreSourceSplit) - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceSplitSerializer
-
- serialize(PendingSplitsCheckpoint) - Method in class org.apache.flink.table.store.connector.source.PendingSplitsCheckpointSerializer
-
- serialize(ManifestCommittable) - Method in class org.apache.flink.table.store.file.manifest.ManifestCommittableSerializer
-
- serialize(DataField, JsonGenerator) - Method in class org.apache.flink.table.store.file.schema.DataFieldSerializer
-
- serialize(TableSchema, JsonGenerator) - Method in class org.apache.flink.table.store.file.schema.SchemaSerializer
-
- serialize(T, JsonGenerator) - Method in interface org.apache.flink.table.store.file.utils.JsonSerializer
-
- serialize(T, DataOutputView) - Method in class org.apache.flink.table.store.file.utils.ObjectSerializer
-
Serializes the given record to the given target output view.
- serialize(Object, ObjectInspector) - Method in class org.apache.flink.table.store.hive.TableStoreSerDe
-
- serialize(SinkRecord, Long) - Method in class org.apache.flink.table.store.kafka.KafkaLogSerializationSchema
-
- serialize(FileCommittable) - Method in class org.apache.flink.table.store.table.sink.FileCommittableSerializer
-
- serialize(DataOutputView) - Method in class org.apache.flink.table.store.table.source.DataSplit
-
- serializeBinaryRow(BinaryRowData) - Static method in class org.apache.flink.table.store.file.utils.SerializationUtils
-
Serialize BinaryRowData; the difference between this and BinaryRowDataSerializer is that the arity is also serialized here, so the deserialization is schemaless.
- serializeBinaryRow(BinaryRowData, DataOutputView) - Static method in class org.apache.flink.table.store.file.utils.SerializationUtils
-
Serialize BinaryRowData to a DataOutputView.
- serializeBytes(DataOutputView, byte[]) - Static method in class org.apache.flink.table.store.file.utils.SerializationUtils
-
Serialize a byte[] prefixed with its length.
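The serializeBinaryRow/serializeBytes descriptions boil down to length-prefixed (self-describing) serialization: the length (and, for rows, the arity) is written first, so the reader needs no external schema to know where the payload ends. A minimal sketch of the length-prefix idea, with hypothetical method names rather than the actual SerializationUtils API:

```java
import java.io.*;

public class LengthPrefixSketch {
    // Write the payload length first, then the raw bytes.
    static byte[] writeWithLength(byte[] payload) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(bos);
            out.writeInt(payload.length); // length prefix
            out.write(payload);           // raw bytes
            return bos.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Read the length, then exactly that many bytes: no schema needed.
    static byte[] readWithLength(byte[] serialized) {
        try {
            DataInputStream in =
                    new DataInputStream(new ByteArrayInputStream(serialized));
            byte[] payload = new byte[in.readInt()];
            in.readFully(payload);
            return payload;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        byte[] round = readWithLength(writeWithLength("row".getBytes()));
        System.out.println(new String(round)); // prints "row"
    }
}
```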
- serializeKey(RowData) - Method in class org.apache.flink.table.store.connector.lookup.RocksDBState
-
- serializeList(List<T>, DataOutputView) - Method in class org.apache.flink.table.store.file.utils.ObjectSerializer
-
Serializes the given record list to the given target output view.
- serializeList(List<T>) - Method in class org.apache.flink.table.store.file.utils.ObjectSerializer
-
- serializeList(List<FileCommittable>, DataOutputView) - Method in class org.apache.flink.table.store.table.sink.FileCommittableSerializer
-
- serializer - Variable in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- serializer1 - Variable in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- service - Variable in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeBenchmark
-
- set(RowData) - Method in class org.apache.flink.table.store.RowDataContainer
-
- setConf(Configuration) - Method in class org.apache.flink.table.store.hive.TableStoreHiveStorageHandler
-
- setDefaultValues(Configuration) - Static method in class org.apache.flink.table.store.CoreOptions
-
Set the default values of the CoreOptions via the given Configuration.
- setLevel(int) - Method in class org.apache.flink.table.store.file.KeyValue
-
- setLockFactory(CatalogLock.Factory) - Method in class org.apache.flink.table.store.connector.sink.TableStoreSink
-
- setMemoryPool(MemorySegmentPool) - Method in interface org.apache.flink.table.store.file.memory.MemoryOwner
-
Set MemorySegmentPool for the owner.
- setMemoryPool(MemorySegmentPool) - Method in class org.apache.flink.table.store.file.mergetree.MergeTreeWriter
-
- setNullAt(int) - Method in class org.apache.flink.table.store.spark.SparkArrayData
-
- setNullAt(int) - Method in class org.apache.flink.table.store.spark.SparkInternalRow
-
- setOption(String, String) - Static method in interface org.apache.flink.table.store.file.schema.SchemaChange
-
- setPosition(RecordAndPosition<RowData>) - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceSplitState
-
- setRowKind(RowKind) - Method in class org.apache.flink.table.store.file.utils.OffsetRowData
-
- setRowKind(RowKind) - Method in class org.apache.flink.table.store.spark.SparkRowData
-
- setRowKind(RowKind) - Method in class org.apache.flink.table.store.utils.KeyProjectedRowData
-
- setRowKind(RowKind) - Method in class org.apache.flink.table.store.utils.ProjectedRowData
-
- setSmapsEnabled(boolean) - Method in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- setState(String, TypeSerializer<RowData>, TypeSerializer<RowData>, long) - Method in class org.apache.flink.table.store.connector.lookup.RocksDBStateFactory
-
- setStderrProcessor(Consumer<String>) - Method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess.AutoClosableProcessBuilder
-
- setStdInputs(String...) - Method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess.AutoClosableProcessBuilder
-
- setStdoutProcessor(Consumer<String>) - Method in class org.apache.flink.table.store.benchmark.utils.AutoClosableProcess.AutoClosableProcessBuilder
-
- setUp() - Method in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeReaderBenchmark
-
- setUp() - Method in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeWriterBenchmark
-
- setup(int) - Method in class org.apache.flink.table.store.connector.sink.BucketStreamPartitioner
-
- setup(int) - Method in class org.apache.flink.table.store.connector.sink.OffsetRowDataHashStreamPartitioner
-
- setup(StreamTask<?, ?>, StreamConfig, Output<StreamRecord<Committable>>) - Method in class org.apache.flink.table.store.connector.sink.StoreWriteOperator
-
- setWorkingDirectory(Path) - Method in class org.apache.flink.table.store.format.fs.HadoopReadOnlyFileSystem
-
- setWriteCallback(LogSinkFunction.WriteCallback) - Method in class org.apache.flink.table.store.kafka.KafkaSinkFunction
-
- setWriteCallback(LogSinkFunction.WriteCallback) - Method in interface org.apache.flink.table.store.table.sink.LogSinkFunction
-
- ShellCommandExecutor - Class in org.apache.flink.table.store.benchmark.metric.cpu
-
The shell command executor.
- ShellCommandExecutor(String[]) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.ShellCommandExecutor
-
Instantiates a new Shell command executor.
- shortName() - Method in class org.apache.flink.table.store.spark.SparkSource
-
- shouldScanSnapshot(Snapshot) - Method in class org.apache.flink.table.store.table.source.snapshot.CompactionChangelogFollowUpScanner
-
- shouldScanSnapshot(Snapshot) - Method in class org.apache.flink.table.store.table.source.snapshot.ContinuousCompactorFollowUpScanner
-
- shouldScanSnapshot(Snapshot) - Method in class org.apache.flink.table.store.table.source.snapshot.DeltaFollowUpScanner
-
- shouldScanSnapshot(Snapshot) - Method in interface org.apache.flink.table.store.table.source.snapshot.FollowUpScanner
-
- shouldScanSnapshot(Snapshot) - Method in class org.apache.flink.table.store.table.source.snapshot.InputChangelogFollowUpScanner
-
- shouldWaitCompaction() - Method in class org.apache.flink.table.store.file.append.AppendOnlyCompactManager
-
- shouldWaitCompaction() - Method in interface org.apache.flink.table.store.file.compact.CompactManager
-
Whether to wait for the compaction to finish.
- shouldWaitCompaction() - Method in class org.apache.flink.table.store.file.compact.NoopCompactManager
-
- shouldWaitCompaction() - Method in class org.apache.flink.table.store.file.mergetree.compact.MergeTreeCompactManager
-
- SimpleSystemSource - Class in org.apache.flink.table.store.connector.source
-
- SimpleSystemSource(Table, int[][], Predicate, Long) - Constructor for class org.apache.flink.table.store.connector.source.SimpleSystemSource
-
- SingleFileWriter<T,R> - Class in org.apache.flink.table.store.file.io
-
- SingleFileWriter(BulkWriter.Factory<RowData>, Path, Function<T, RowData>) - Constructor for class org.apache.flink.table.store.file.io.SingleFileWriter
-
- Sink - Class in org.apache.flink.table.store.benchmark
-
Benchmark sink.
- SINK_PARALLELISM - Static variable in class org.apache.flink.table.store.connector.FlinkConnectorOptions
-
- SINK_PATH - Static variable in class org.apache.flink.table.store.benchmark.BenchmarkOptions
-
- sinkFrom(DataStream<RowData>) - Method in class org.apache.flink.table.store.connector.sink.FlinkSink
-
- SinkRecord - Class in org.apache.flink.table.store.table.sink
-
A sink record containing key, row, partition, and bucket information.
- SinkRecord(BinaryRowData, int, BinaryRowData, RowData) - Constructor for class org.apache.flink.table.store.table.sink.SinkRecord
-
- SinkRecordConverter - Class in org.apache.flink.table.store.table.sink
-
- SinkRecordConverter(TableSchema) - Constructor for class org.apache.flink.table.store.table.sink.SinkRecordConverter
-
- size() - Method in class org.apache.flink.table.store.file.mergetree.SortBufferWriteBuffer
-
- size() - Method in interface org.apache.flink.table.store.file.mergetree.WriteBuffer
-
Number of records in this buffer.
- size() - Method in class org.apache.flink.table.store.file.sort.BinaryExternalSortBuffer
-
- size() - Method in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- size() - Method in interface org.apache.flink.table.store.file.sort.SortBuffer
-
- SMAPS - Static variable in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
- snapshot() - Method in class org.apache.flink.table.store.connector.source.StaticFileStoreSplitEnumerator
-
- Snapshot - Class in org.apache.flink.table.store.file
-
This file is the entry point to all data committed at some specific time point.
- Snapshot(long, long, String, String, String, String, long, Snapshot.CommitKind, long, Map<Integer, Long>) - Constructor for class org.apache.flink.table.store.file.Snapshot
-
- Snapshot(Integer, long, long, String, String, String, String, long, Snapshot.CommitKind, long, Map<Integer, Long>) - Constructor for class org.apache.flink.table.store.file.Snapshot
-
- snapshot(long) - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- Snapshot.CommitKind - Enum in org.apache.flink.table.store.file
-
Type of changes in this snapshot.
- SNAPSHOT_NUM_RETAINED_MAX - Static variable in class org.apache.flink.table.store.CoreOptions
-
- SNAPSHOT_NUM_RETAINED_MIN - Static variable in class org.apache.flink.table.store.CoreOptions
-
- SNAPSHOT_TIME_RETAINED - Static variable in class org.apache.flink.table.store.CoreOptions
-
- snapshotCount() - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- snapshotDirectory() - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- SnapshotEnumerator - Interface in org.apache.flink.table.store.table.source.snapshot
-
Enumerate incremental changes from newly created snapshots.
- snapshotExists(long) - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- snapshotId() - Method in interface org.apache.flink.table.store.file.operation.FileStoreScan.Plan
-
Snapshot id of this plan; returns null if the table is empty or the manifest list is specified.
- snapshotId() - Method in class org.apache.flink.table.store.table.source.DataSplit
-
- snapshotId - Variable in class org.apache.flink.table.store.table.source.DataTableScan.DataFilePlan
-
- snapshotManager() - Method in class org.apache.flink.table.store.file.AbstractFileStore
-
- snapshotManager() - Method in interface org.apache.flink.table.store.file.FileStore
-
- snapshotManager - Variable in class org.apache.flink.table.store.file.operation.AbstractFileStoreWrite
-
- SnapshotManager - Class in org.apache.flink.table.store.file.utils
-
Manager for Snapshot, providing utility methods related to paths and snapshot hints.
- SnapshotManager(Path) - Constructor for class org.apache.flink.table.store.file.utils.SnapshotManager
-
- snapshotManager() - Method in class org.apache.flink.table.store.table.AbstractFileStoreTable
-
- snapshotManager() - Method in interface org.apache.flink.table.store.table.DataTable
-
- snapshotManager() - Method in class org.apache.flink.table.store.table.source.AbstractDataTableScan
-
- snapshotManager() - Method in class org.apache.flink.table.store.table.system.AuditLogTable
-
- snapshotManager() - Method in class org.apache.flink.table.store.table.system.BucketsTable
-
- snapshotNumRetainMax() - Method in class org.apache.flink.table.store.CoreOptions
-
- snapshotNumRetainMin() - Method in class org.apache.flink.table.store.CoreOptions
-
- snapshotPath(long) - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- snapshots() - Method in class org.apache.flink.table.store.file.utils.SnapshotManager
-
- SNAPSHOTS - Static variable in class org.apache.flink.table.store.table.system.SnapshotsTable
-
- SnapshotsTable - Class in org.apache.flink.table.store.table.system
-
A Table for showing the committed snapshots of a table.
- SnapshotsTable(Path) - Constructor for class org.apache.flink.table.store.table.system.SnapshotsTable
-
- snapshotState(StateSnapshotContext, List<ManifestCommittable>) - Method in interface org.apache.flink.table.store.connector.sink.CommittableStateManager
-
- snapshotState(StateSnapshotContext) - Method in class org.apache.flink.table.store.connector.sink.CommitterOperator
-
- snapshotState(StateSnapshotContext) - Method in class org.apache.flink.table.store.connector.sink.FullChangelogStoreSinkWrite
-
- snapshotState(StateSnapshotContext, List<ManifestCommittable>) - Method in class org.apache.flink.table.store.connector.sink.NoopCommittableStateManager
-
- snapshotState(StateSnapshotContext, List<ManifestCommittable>) - Method in class org.apache.flink.table.store.connector.sink.RestoreAndFailCommittableStateManager
-
- snapshotState(StateSnapshotContext) - Method in class org.apache.flink.table.store.connector.sink.StoreSinkWriteImpl
-
- snapshotState(StateSnapshotContext) - Method in class org.apache.flink.table.store.connector.sink.StoreWriteOperator
-
- snapshotState(long) - Method in class org.apache.flink.table.store.connector.source.ContinuousFileSplitEnumerator
-
- snapshotState(long) - Method in class org.apache.flink.table.store.connector.source.StaticFileStoreSplitEnumerator
-
- snapshotTimeRetain() - Method in class org.apache.flink.table.store.CoreOptions
-
- SortBuffer - Interface in org.apache.flink.table.store.file.sort
-
Sort buffer to sort records.
- SortBufferWriteBuffer - Class in org.apache.flink.table.store.file.mergetree
-
- SortBufferWriteBuffer(RowType, RowType, MemorySegmentPool, boolean, int, IOManager) - Constructor for class org.apache.flink.table.store.file.mergetree.SortBufferWriteBuffer
-
- sortedIterator() - Method in class org.apache.flink.table.store.file.sort.BinaryExternalSortBuffer
-
- sortedIterator() - Method in class org.apache.flink.table.store.file.sort.BinaryInMemorySortBuffer
-
- sortedIterator() - Method in interface org.apache.flink.table.store.file.sort.SortBuffer
-
- SortedRun - Class in org.apache.flink.table.store.file.mergetree
-
A SortedRun is a list of files sorted by their keys.
- sortedRunSizeRatio() - Method in class org.apache.flink.table.store.CoreOptions
-
- SortFieldSpec(int, boolean, boolean) - Constructor for class org.apache.flink.table.store.codegen.SortSpec.SortFieldSpec
-
- sortIndex - Variable in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- SortMergeReader<T> - Class in org.apache.flink.table.store.file.mergetree.compact
-
This reader reads a list of RecordReaders, each already sorted by key and sequence number, and performs a sort merge algorithm.
- SortMergeReader(List<RecordReader<KeyValue>>, Comparator<RowData>, MergeFunctionWrapper<T>) - Constructor for class org.apache.flink.table.store.file.mergetree.compact.SortMergeReader
-
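The sort merge SortMergeReader performs over already-sorted readers is the classic k-way merge. A minimal sketch with int keys standing in for RowData keys and sequence numbers (a stand-in, not the real reader, which also feeds equal keys through a MergeFunctionWrapper):

```java
import java.util.*;

public class SortMergeSketch {
    // Merge already-sorted inputs by always pulling the smallest head,
    // tracked in a priority queue of {value, input index} pairs.
    static List<Integer> merge(List<Iterator<Integer>> inputs) {
        PriorityQueue<int[]> heap =
                new PriorityQueue<>(Comparator.comparingInt((int[] e) -> e[0]));
        for (int i = 0; i < inputs.size(); i++) {
            if (inputs.get(i).hasNext()) {
                heap.add(new int[] {inputs.get(i).next(), i});
            }
        }
        List<Integer> out = new ArrayList<>();
        while (!heap.isEmpty()) {
            int[] head = heap.poll();
            out.add(head[0]);
            Iterator<Integer> src = inputs.get(head[1]); // refill from same input
            if (src.hasNext()) {
                heap.add(new int[] {src.next(), head[1]});
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<Iterator<Integer>> inputs = Arrays.asList(
                Arrays.asList(1, 4, 7).iterator(),
                Arrays.asList(2, 5, 8).iterator(),
                Arrays.asList(3, 6, 9).iterator());
        System.out.println(merge(inputs)); // [1, 2, 3, 4, 5, 6, 7, 8, 9]
    }
}
```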
- SortSpec - Class in org.apache.flink.table.store.codegen
-
SortSpec describes how the data will be sorted.
- SortSpec(SortSpec.SortFieldSpec[]) - Constructor for class org.apache.flink.table.store.codegen.SortSpec
-
- SortSpec.SortFieldSpec - Class in org.apache.flink.table.store.codegen
-
Sort info for a Field.
- SortSpec.SortSpecBuilder - Class in org.apache.flink.table.store.codegen
-
SortSpec builder.
- SortSpecBuilder() - Constructor for class org.apache.flink.table.store.codegen.SortSpec.SortSpecBuilder
-
- SOURCE_SPLIT_OPEN_FILE_COST - Static variable in class org.apache.flink.table.store.CoreOptions
-
- SOURCE_SPLIT_TARGET_SIZE - Static variable in class org.apache.flink.table.store.CoreOptions
-
- SparkArrayData - Class in org.apache.flink.table.store.spark
-
Spark ArrayData to wrap Flink ArrayData.
- SparkArrayData(LogicalType) - Constructor for class org.apache.flink.table.store.spark.SparkArrayData
-
- SparkCaseSensitiveConverter - Class in org.apache.flink.table.store.spark
-
This util converts lowercase keys back to their case-sensitive keys.
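A minimal sketch of such a lowercase-to-case-sensitive restoration (the class name, method, and example key below are invented for illustration and are not the actual SparkCaseSensitiveConverter API): given the set of known case-sensitive keys, map an incoming lowercased key back to its original spelling.

```java
import java.util.Locale;
import java.util.Set;

// Hypothetical illustration: restore the case-sensitive form of a key that
// was lowercased on its way through a case-insensitive layer.
public class CaseSensitiveKeySketch {
    public static String restore(String lowercaseKey, Set<String> knownKeys) {
        for (String known : knownKeys) {
            if (known.toLowerCase(Locale.ROOT).equals(lowercaseKey)) {
                return known;
            }
        }
        return lowercaseKey; // unknown keys pass through unchanged
    }

    public static void main(String[] args) {
        // "some.maxRetainedFiles" is a made-up key, used only for the demo
        Set<String> known = Set.of("some.maxRetainedFiles");
        System.out.println(restore("some.maxretainedfiles", known));
    }
}
```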
- SparkCatalog - Class in org.apache.flink.table.store.spark
-
Spark TableCatalog
for table store.
- SparkCatalog() - Constructor for class org.apache.flink.table.store.spark.SparkCatalog
-
- SparkDataSourceReader - Class in org.apache.flink.table.store.spark
-
A Spark DataSourceReader
for table store.
- SparkDataSourceReader(Table) - Constructor for class org.apache.flink.table.store.spark.SparkDataSourceReader
-
- SparkFilterConverter - Class in org.apache.flink.table.store.spark
-
- SparkFilterConverter(RowType) - Constructor for class org.apache.flink.table.store.spark.SparkFilterConverter
-
- SparkInputPartition - Class in org.apache.flink.table.store.spark
-
A Spark InputPartition
for table store.
- SparkInputPartition(Split) - Constructor for class org.apache.flink.table.store.spark.SparkInputPartition
-
- SparkInternalRow - Class in org.apache.flink.table.store.spark
-
Spark InternalRow to wrap RowData.
- SparkInternalRow(RowType) - Constructor for class org.apache.flink.table.store.spark.SparkInternalRow
-
- SparkReaderFactory - Class in org.apache.flink.table.store.spark
-
A Spark PartitionReaderFactory
for table store.
- SparkReaderFactory(Table, int[], List<Predicate>, Configuration) - Constructor for class org.apache.flink.table.store.spark.SparkReaderFactory
-
- SparkRowData - Class in org.apache.flink.table.store.spark
-
A RowData that wraps a Spark Row.
- SparkRowData(RowType, Row) - Constructor for class org.apache.flink.table.store.spark.SparkRowData
-
- SparkScan - Class in org.apache.flink.table.store.spark
-
A Spark Scan
for table store.
- SparkScan(Table, List<Predicate>, int[], Configuration) - Constructor for class org.apache.flink.table.store.spark.SparkScan
-
- SparkScanBuilder - Class in org.apache.flink.table.store.spark
-
A Spark ScanBuilder
for table store.
- SparkScanBuilder(Table, Configuration) - Constructor for class org.apache.flink.table.store.spark.SparkScanBuilder
-
- SparkSource - Class in org.apache.flink.table.store.spark
-
The Spark source for table store.
- SparkSource() - Constructor for class org.apache.flink.table.store.spark.SparkSource
-
- SparkTable - Class in org.apache.flink.table.store.spark
-
A spark Table
for table store.
- SparkTable(Table, Lock.Factory, Configuration) - Constructor for class org.apache.flink.table.store.spark.SparkTable
-
- SparkTypeUtils - Class in org.apache.flink.table.store.spark
-
Utils for Spark DataType.
- SparkWrite - Class in org.apache.flink.table.store.spark
-
Spark V1Write; the v1 write path is required for grouping records by bucket.
- SparkWrite(SupportsWrite, String, Lock.Factory, Configuration) - Constructor for class org.apache.flink.table.store.spark.SparkWrite
-
- SparkWriteBuilder - Class in org.apache.flink.table.store.spark
-
Spark WriteBuilder.
- SparkWriteBuilder(SupportsWrite, String, Lock.Factory, Configuration) - Constructor for class org.apache.flink.table.store.spark.SparkWriteBuilder
-
- SpecializedGettersReader - Class in org.apache.flink.table.store.spark
-
Reader of Spark SpecializedGetters.
- split() - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceSplit
-
- split() - Method in class org.apache.flink.table.store.mapred.TableStoreInputSplit
-
- split() - Method in class org.apache.flink.table.store.spark.SparkInputPartition
-
- split(List<DataFileMeta>) - Method in class org.apache.flink.table.store.table.source.AppendOnlySplitGenerator
-
- split(List<DataFileMeta>) - Method in class org.apache.flink.table.store.table.source.MergeTreeSplitGenerator
-
- Split - Interface in org.apache.flink.table.store.table.source
-
An input split for reading.
- split(List<DataFileMeta>) - Method in interface org.apache.flink.table.store.table.source.SplitGenerator
-
- splitAnd(Predicate) - Static method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- splitGenerator(FileStorePathFactory) - Method in class org.apache.flink.table.store.table.source.AbstractDataTableScan
-
- SplitGenerator - Interface in org.apache.flink.table.store.table.source
-
- splitId() - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceSplit
-
- splitOpenFileCost() - Method in class org.apache.flink.table.store.CoreOptions
-
- splitOr(Predicate) - Static method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- splits() - Method in class org.apache.flink.table.store.connector.source.PendingSplitsCheckpoint
-
- splits() - Method in class org.apache.flink.table.store.spark.SparkDataSourceReader
-
- splits() - Method in class org.apache.flink.table.store.spark.SparkScan
-
- splits - Variable in class org.apache.flink.table.store.table.source.DataTableScan.DataFilePlan
-
- splits() - Method in class org.apache.flink.table.store.table.source.DataTableScan.DataFilePlan
-
- splits() - Method in interface org.apache.flink.table.store.table.source.TableScan.Plan
-
- splitTargetSize() - Method in class org.apache.flink.table.store.CoreOptions
-
- start() - Method in class org.apache.flink.table.store.connector.source.ContinuousFileSplitEnumerator
-
- start() - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceReader
-
- start() - Method in class org.apache.flink.table.store.connector.source.StaticFileStoreSplitEnumerator
-
- startClient() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.CpuMetricSender
-
- StartingScanner - Interface in org.apache.flink.table.store.table.source.snapshot
-
- startsWith(int, Object) - Method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- StartsWith - Class in org.apache.flink.table.store.file.predicate
-
- startupMode() - Method in class org.apache.flink.table.store.CoreOptions
-
- startupMode(ReadableConfig) - Static method in class org.apache.flink.table.store.CoreOptions
-
- StateUtils - Class in org.apache.flink.table.store.connector.sink
-
Utility class for sink state manipulation.
- StateUtils() - Constructor for class org.apache.flink.table.store.connector.sink.StateUtils
-
- StaticDataFileSnapshotEnumerator - Class in org.apache.flink.table.store.table.source.snapshot
-
- StaticDataFileSnapshotEnumerator(Path, DataTableScan, StartingScanner) - Constructor for class org.apache.flink.table.store.table.source.snapshot.StaticDataFileSnapshotEnumerator
-
- StaticDataFileSnapshotEnumerator.Factory - Interface in org.apache.flink.table.store.table.source.snapshot
-
- StaticFileStoreSource - Class in org.apache.flink.table.store.connector.source
-
- StaticFileStoreSource(DataTable, int[][], Predicate, Long) - Constructor for class org.apache.flink.table.store.connector.source.StaticFileStoreSource
-
- StaticFileStoreSource(DataTable, int[][], Predicate, Long, StaticDataFileSnapshotEnumerator.Factory) - Constructor for class org.apache.flink.table.store.connector.source.StaticFileStoreSource
-
- StaticFileStoreSplitEnumerator - Class in org.apache.flink.table.store.connector.source
-
- StaticFileStoreSplitEnumerator(SplitEnumeratorContext<FileStoreSourceSplit>, Snapshot, Collection<FileStoreSourceSplit>) - Constructor for class org.apache.flink.table.store.connector.source.StaticFileStoreSplitEnumerator
-
- StaticFromSnapshotStartingScanner - Class in org.apache.flink.table.store.table.source.snapshot
-
- StaticFromSnapshotStartingScanner(long) - Constructor for class org.apache.flink.table.store.table.source.snapshot.StaticFromSnapshotStartingScanner
-
- StaticFromTimestampStartingScanner - Class in org.apache.flink.table.store.table.source.snapshot
-
- StaticFromTimestampStartingScanner(long) - Constructor for class org.apache.flink.table.store.table.source.snapshot.StaticFromTimestampStartingScanner
-
- StatsCollectingSingleFileWriter<T,R> - Class in org.apache.flink.table.store.file.io
-
- StatsCollectingSingleFileWriter(BulkWriter.Factory<RowData>, Path, Function<T, RowData>, RowType, FileStatsExtractor) - Constructor for class org.apache.flink.table.store.file.io.StatsCollectingSingleFileWriter
-
- stopJobWithSavepoint(String, String) - Method in class org.apache.flink.table.store.benchmark.metric.FlinkRestClient
-
- store() - Method in class org.apache.flink.table.store.table.AbstractFileStoreTable
-
- store() - Method in class org.apache.flink.table.store.table.AppendOnlyFileStoreTable
-
- store() - Method in class org.apache.flink.table.store.table.ChangelogValueCountFileStoreTable
-
- store() - Method in class org.apache.flink.table.store.table.ChangelogWithKeyFileStoreTable
-
- StoreCommitter - Class in org.apache.flink.table.store.connector.sink
-
- StoreCommitter(TableCommit) - Constructor for class org.apache.flink.table.store.connector.sink.StoreCommitter
-
- StoreCompactOperator - Class in org.apache.flink.table.store.connector.sink
-
A dedicated operator for manually triggered compaction.
- StoreCompactOperator(FileStoreTable, StoreSinkWrite.Provider, boolean) - Constructor for class org.apache.flink.table.store.connector.sink.StoreCompactOperator
-
- StoreSinkWriteImpl - Class in org.apache.flink.table.store.connector.sink
-
Default implementation of StoreSinkWrite.
- StoreSinkWriteImpl(FileStoreTable, StateInitializationContext, String, IOManager, boolean) - Constructor for class org.apache.flink.table.store.connector.sink.StoreSinkWriteImpl
-
- StoreWriteOperator - Class in org.apache.flink.table.store.connector.sink
-
- StoreWriteOperator(FileStoreTable, LogSinkFunction, StoreSinkWrite.Provider) - Constructor for class org.apache.flink.table.store.connector.sink.StoreWriteOperator
-
- StreamExecutionEnvironmentUtils - Class in org.apache.flink.table.store.connector.utils
-
Utility methods for StreamExecutionEnvironment.
- StreamExecutionEnvironmentUtils() - Constructor for class org.apache.flink.table.store.connector.utils.StreamExecutionEnvironmentUtils
-
- submitSQLJob(String) - Method in class org.apache.flink.table.store.benchmark.QueryRunner
-
- suggestedFileSize() - Method in class org.apache.flink.table.store.file.manifest.ManifestFile
-
- SUPPORT_FILTERS - Static variable in class org.apache.flink.table.store.spark.SparkFilterConverter
-
- supportsExternalMetadata() - Method in class org.apache.flink.table.store.spark.SparkSource
-
- supportsNestedProjection() - Method in class org.apache.flink.table.store.connector.source.FlinkTableSource
-
- SupportsPartition - Interface in org.apache.flink.table.store.table
-
An interface for Table partition support.
- SupportsWrite - Interface in org.apache.flink.table.store.table
-
An interface for Table write support.
- swap(int, int) - Method in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- swap(int, int, int, int) - Method in class org.apache.flink.table.store.file.sort.BinaryIndexedSortable
-
- swapKey(MemorySegment, int, MemorySegment, int) - Method in interface org.apache.flink.table.store.codegen.NormalizedKeyComputer
-
Swaps two normalized keys in their respective MemorySegments.
- sync() - Method in class org.apache.flink.table.store.file.append.AppendOnlyWriter
-
- sync() - Method in class org.apache.flink.table.store.file.mergetree.MergeTreeWriter
-
- sync() - Method in interface org.apache.flink.table.store.file.utils.RecordWriter
-
Sync the writer.
- sync() - Method in class org.apache.flink.table.store.file.utils.RenamingAtomicFsDataOutputStream
-
- SysInfoLinux - Class in org.apache.flink.table.store.benchmark.metric.cpu
-
Plugin to calculate resource information on Linux systems.
- SysInfoLinux() - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.SysInfoLinux
-
- SysInfoLinux(String, String, String, String, String, long) - Constructor for class org.apache.flink.table.store.benchmark.metric.cpu.SysInfoLinux
-
Constructor which allows assigning the /proc/ directories.
- SYSTEM_FIELD_NAMES - Static variable in class org.apache.flink.table.store.file.schema.TableSchema
-
- SYSTEM_TABLE_SPLITTER - Static variable in interface org.apache.flink.table.store.file.catalog.Catalog
-
- SystemCatalogTable - Class in org.apache.flink.table.store.connector
-
A CatalogTable representing a system table.
- SystemCatalogTable(Table) - Constructor for class org.apache.flink.table.store.connector.SystemCatalogTable
-
- SystemClock - Class in org.apache.flink.table.store.benchmark.metric.cpu.clock
-
A clock that returns the time of the system / process.
- SystemTableLoader - Class in org.apache.flink.table.store.table.system
-
Loader to load system Tables.
- SystemTableLoader() - Constructor for class org.apache.flink.table.store.table.system.SystemTableLoader
-
- SystemTableSource - Class in org.apache.flink.table.store.connector.source
-
- SystemTableSource(Table, boolean) - Constructor for class org.apache.flink.table.store.connector.source.SystemTableSource
-
- SystemTableSource(Table, boolean, Predicate, int[][], Long) - Constructor for class org.apache.flink.table.store.connector.source.SystemTableSource
-
- table - Variable in class org.apache.flink.table.store.connector.sink.FlinkSink
-
- table - Variable in class org.apache.flink.table.store.connector.sink.StoreSinkWriteImpl
-
- table - Variable in class org.apache.flink.table.store.connector.sink.StoreWriteOperator
-
- table - Variable in class org.apache.flink.table.store.connector.source.FlinkSource
-
- table() - Method in class org.apache.flink.table.store.connector.SystemCatalogTable
-
- table - Variable in class org.apache.flink.table.store.spark.SparkScan
-
- Table - Interface in org.apache.flink.table.store.table
-
A table provides the basic abstraction for table type, table scan, and table read.
- TABLE_STORE_PREFIX - Static variable in class org.apache.flink.table.store.connector.FlinkConnectorOptions
-
- TABLE_TYPE - Static variable in class org.apache.flink.table.store.CatalogOptions
-
- TABLE_TYPE - Static variable in class org.apache.flink.table.store.table.system.OptionsTable
-
- TABLE_TYPE - Static variable in class org.apache.flink.table.store.table.system.SchemasTable
-
- TABLE_TYPE - Static variable in class org.apache.flink.table.store.table.system.SnapshotsTable
-
- TableAlreadyExistException(ObjectPath) - Constructor for exception org.apache.flink.table.store.file.catalog.Catalog.TableAlreadyExistException
-
- TableAlreadyExistException(ObjectPath, Throwable) - Constructor for exception org.apache.flink.table.store.file.catalog.Catalog.TableAlreadyExistException
-
- TableCommit - Class in org.apache.flink.table.store.table.sink
-
- TableCommit(FileStoreCommit, FileStoreExpire, PartitionExpire) - Constructor for class org.apache.flink.table.store.table.sink.TableCommit
-
- TableConfigUtils - Class in org.apache.flink.table.store.connector
-
Utils for TableConfig.
- TableConfigUtils() - Constructor for class org.apache.flink.table.store.connector.TableConfigUtils
-
- tableExists(ObjectPath) - Method in class org.apache.flink.table.store.connector.FlinkCatalog
-
- tableExists(ObjectPath) - Method in interface org.apache.flink.table.store.file.catalog.Catalog
-
Check if a table exists in this catalog.
- tableExists(ObjectPath) - Method in class org.apache.flink.table.store.file.catalog.FileSystemCatalog
-
- tableExists(ObjectPath) - Method in class org.apache.flink.table.store.hive.HiveCatalog
-
- tableName() - Method in class org.apache.flink.table.store.benchmark.Sink
-
- TableNotExistException(ObjectPath) - Constructor for exception org.apache.flink.table.store.file.catalog.Catalog.TableNotExistException
-
- TableNotExistException(ObjectPath, Throwable) - Constructor for exception org.apache.flink.table.store.file.catalog.Catalog.TableNotExistException
-
- tablePath() - Method in exception org.apache.flink.table.store.file.catalog.Catalog.TableAlreadyExistException
-
- tablePath() - Method in exception org.apache.flink.table.store.file.catalog.Catalog.TableNotExistException
-
- tableProperties() - Method in class org.apache.flink.table.store.benchmark.Sink
-
- TableRead - Interface in org.apache.flink.table.store.table.source
-
An abstraction layer above FileStoreRead to provide reading of RowData.
- TableScan - Interface in org.apache.flink.table.store.table.source
-
- TableScan.Plan - Interface in org.apache.flink.table.store.table.source
-
Plan of a scan.
- TableSchema - Class in org.apache.flink.table.store.file.schema
-
Schema of a table.
- TableSchema(long, List<DataField>, int, List<String>, List<String>, Map<String, String>, String) - Constructor for class org.apache.flink.table.store.file.schema.TableSchema
-
- tableSchema - Variable in class org.apache.flink.table.store.table.AbstractFileStoreTable
-
- tableState - Variable in class org.apache.flink.table.store.connector.lookup.PrimaryKeyLookupTable
-
- TableStoreCharObjectInspector - Class in org.apache.flink.table.store.hive.objectinspector
-
AbstractPrimitiveJavaObjectInspector
for CHAR type.
- TableStoreCharObjectInspector(int) - Constructor for class org.apache.flink.table.store.hive.objectinspector.TableStoreCharObjectInspector
-
- TableStoreConnectorFactory - Class in org.apache.flink.table.store.connector
-
A table store DynamicTableFactory
to create source and sink.
- TableStoreConnectorFactory() - Constructor for class org.apache.flink.table.store.connector.TableStoreConnectorFactory
-
- TableStoreConnectorFactory(CatalogLock.Factory) - Constructor for class org.apache.flink.table.store.connector.TableStoreConnectorFactory
-
- TableStoreDataStreamScanProvider - Class in org.apache.flink.table.store.connector
-
Table Store DataStreamScanProvider.
- TableStoreDataStreamScanProvider(boolean, Function<StreamExecutionEnvironment, DataStream<RowData>>) - Constructor for class org.apache.flink.table.store.connector.TableStoreDataStreamScanProvider
-
- TableStoreDataStreamSinkProvider - Class in org.apache.flink.table.store.connector
-
Table Store DataStreamSinkProvider.
- TableStoreDataStreamSinkProvider(Function<DataStream<RowData>, DataStreamSink<?>>) - Constructor for class org.apache.flink.table.store.connector.TableStoreDataStreamSinkProvider
-
- TableStoreDateObjectInspector - Class in org.apache.flink.table.store.hive.objectinspector
-
AbstractPrimitiveJavaObjectInspector
for DATE type.
- TableStoreDateObjectInspector() - Constructor for class org.apache.flink.table.store.hive.objectinspector.TableStoreDateObjectInspector
-
- TableStoreDecimalObjectInspector - Class in org.apache.flink.table.store.hive.objectinspector
-
AbstractPrimitiveJavaObjectInspector
for DECIMAL type.
- TableStoreDecimalObjectInspector(int, int) - Constructor for class org.apache.flink.table.store.hive.objectinspector.TableStoreDecimalObjectInspector
-
- TableStoreDynamicContext - Class in org.apache.flink.table.store.connector
-
Table Store DynamicTableFactory.Context.
- TableStoreDynamicContext(DynamicTableFactory.Context, Map<String, String>) - Constructor for class org.apache.flink.table.store.connector.TableStoreDynamicContext
-
- TableStoreHiveMetaHook - Class in org.apache.flink.table.store.hive
-
HiveMetaHook
for table store.
- TableStoreHiveMetaHook() - Constructor for class org.apache.flink.table.store.hive.TableStoreHiveMetaHook
-
- TableStoreHiveStorageHandler - Class in org.apache.flink.table.store.hive
-
HiveStorageHandler
for table store.
- TableStoreHiveStorageHandler() - Constructor for class org.apache.flink.table.store.hive.TableStoreHiveStorageHandler
-
- TableStoreInputFormat - Class in org.apache.flink.table.store.mapred
-
InputFormat
for table store.
- TableStoreInputFormat() - Constructor for class org.apache.flink.table.store.mapred.TableStoreInputFormat
-
- TableStoreInputSplit - Class in org.apache.flink.table.store.mapred
-
FileSplit
for table store.
- TableStoreInputSplit() - Constructor for class org.apache.flink.table.store.mapred.TableStoreInputSplit
-
- TableStoreInputSplit(String, DataSplit) - Constructor for class org.apache.flink.table.store.mapred.TableStoreInputSplit
-
- TableStoreJobConf - Class in org.apache.flink.table.store
-
Utility class to convert Hive table property keys and get file store specific configurations from JobConf.
- TableStoreJobConf(JobConf) - Constructor for class org.apache.flink.table.store.TableStoreJobConf
-
- TableStoreListObjectInspector - Class in org.apache.flink.table.store.hive.objectinspector
-
ListObjectInspector for ArrayData.
- TableStoreListObjectInspector(LogicalType) - Constructor for class org.apache.flink.table.store.hive.objectinspector.TableStoreListObjectInspector
-
- TableStoreManagedFactory - Class in org.apache.flink.table.store.connector
-
Default implementation of ManagedTableFactory.
- TableStoreManagedFactory() - Constructor for class org.apache.flink.table.store.connector.TableStoreManagedFactory
-
- TableStoreMapObjectInspector - Class in org.apache.flink.table.store.hive.objectinspector
-
MapObjectInspector for MapData.
- TableStoreMapObjectInspector(LogicalType, LogicalType) - Constructor for class org.apache.flink.table.store.hive.objectinspector.TableStoreMapObjectInspector
-
- TableStoreObjectInspectorFactory - Class in org.apache.flink.table.store.hive.objectinspector
-
Factory to create ObjectInspectors according to the given LogicalType.
- TableStoreObjectInspectorFactory() - Constructor for class org.apache.flink.table.store.hive.objectinspector.TableStoreObjectInspectorFactory
-
- TableStoreOutputFormat - Class in org.apache.flink.table.store.mapred
-
OutputFormat
for table split.
- TableStoreOutputFormat() - Constructor for class org.apache.flink.table.store.mapred.TableStoreOutputFormat
-
- TableStoreRecordReader - Class in org.apache.flink.table.store.mapred
-
Base RecordReader
for table store.
- TableStoreRecordReader(TableRead, TableStoreInputSplit, List<String>, List<String>) - Constructor for class org.apache.flink.table.store.mapred.TableStoreRecordReader
-
- TableStoreRowDataObjectInspector - Class in org.apache.flink.table.store.hive.objectinspector
-
StructObjectInspector for RowData.
- TableStoreRowDataObjectInspector(List<String>, List<LogicalType>, List<String>) - Constructor for class org.apache.flink.table.store.hive.objectinspector.TableStoreRowDataObjectInspector
-
- TableStoreSerDe - Class in org.apache.flink.table.store.hive
-
AbstractSerDe
for table store.
- TableStoreSerDe() - Constructor for class org.apache.flink.table.store.hive.TableStoreSerDe
-
- TableStoreSink - Class in org.apache.flink.table.store.connector.sink
-
Table sink to create StoreSink.
- TableStoreSink(ObjectIdentifier, FileStoreTable, DynamicTableFactory.Context, LogStoreTableFactory) - Constructor for class org.apache.flink.table.store.connector.sink.TableStoreSink
-
- TableStoreSource - Class in org.apache.flink.table.store.connector.source
-
- TableStoreSource(ObjectIdentifier, FileStoreTable, boolean, DynamicTableFactory.Context, LogStoreTableFactory) - Constructor for class org.apache.flink.table.store.connector.source.TableStoreSource
-
- TableStoreStringObjectInspector - Class in org.apache.flink.table.store.hive.objectinspector
-
AbstractPrimitiveJavaObjectInspector
for STRING type.
- TableStoreStringObjectInspector() - Constructor for class org.apache.flink.table.store.hive.objectinspector.TableStoreStringObjectInspector
-
- TableStoreTimestampObjectInspector - Class in org.apache.flink.table.store.hive.objectinspector
-
AbstractPrimitiveJavaObjectInspector
for TIMESTAMP type.
- TableStoreTimestampObjectInspector() - Constructor for class org.apache.flink.table.store.hive.objectinspector.TableStoreTimestampObjectInspector
-
- TableStoreVarcharObjectInspector - Class in org.apache.flink.table.store.hive.objectinspector
-
AbstractPrimitiveJavaObjectInspector
for VARCHAR type.
- TableStoreVarcharObjectInspector(int) - Constructor for class org.apache.flink.table.store.hive.objectinspector.TableStoreVarcharObjectInspector
-
- TableStreamingReader - Class in org.apache.flink.table.store.table.source
-
A streaming reader to read a table.
- TableStreamingReader(FileStoreTable, int[], Predicate) - Constructor for class org.apache.flink.table.store.table.source.TableStreamingReader
-
- TableType - Enum in org.apache.flink.table.store.table
-
Enum of catalog table type.
- TableWrite - Interface in org.apache.flink.table.store.table.sink
-
- TableWriteImpl<T> - Class in org.apache.flink.table.store.table.sink
-
- TableWriteImpl(FileStoreWrite<T>, SinkRecordConverter, TableWriteImpl.RecordExtractor<T>) - Constructor for class org.apache.flink.table.store.table.sink.TableWriteImpl
-
- TableWriteImpl.RecordExtractor<T> - Interface in org.apache.flink.table.store.table.sink
-
- TARGET_FILE_SIZE - Static variable in class org.apache.flink.table.store.benchmark.config.FileBenchmarkOptions
-
- TARGET_FILE_SIZE - Static variable in class org.apache.flink.table.store.CoreOptions
-
- TARGET_FILE_SIZE_BASE - Static variable in class org.apache.flink.table.store.connector.RocksDBOptions
-
- targetFileSize() - Method in class org.apache.flink.table.store.CoreOptions
-
- targetFileSize() - Method in class org.apache.flink.table.store.file.io.RollingFileWriter
-
- taskFuture - Variable in class org.apache.flink.table.store.file.compact.CompactFutureManager
-
- tearDown() - Method in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeReaderBenchmark
-
- tearDown() - Method in class org.apache.flink.table.store.benchmark.file.mergetree.MergeTreeWriterBenchmark
-
- test(Object[], List<Predicate>) - Method in class org.apache.flink.table.store.file.predicate.And
-
- test(long, FieldStats[], List<Predicate>) - Method in class org.apache.flink.table.store.file.predicate.And
-
- test(Object[], List<Predicate>) - Method in class org.apache.flink.table.store.file.predicate.CompoundPredicate.Function
-
- test(long, FieldStats[], List<Predicate>) - Method in class org.apache.flink.table.store.file.predicate.CompoundPredicate.Function
-
- test(Object[]) - Method in class org.apache.flink.table.store.file.predicate.CompoundPredicate
-
- test(long, FieldStats[]) - Method in class org.apache.flink.table.store.file.predicate.CompoundPredicate
-
- test(LogicalType, Object, Object) - Method in class org.apache.flink.table.store.file.predicate.Equal
-
- test(LogicalType, long, FieldStats, Object) - Method in class org.apache.flink.table.store.file.predicate.Equal
-
- test(LogicalType, Object, Object) - Method in class org.apache.flink.table.store.file.predicate.GreaterOrEqual
-
- test(LogicalType, long, FieldStats, Object) - Method in class org.apache.flink.table.store.file.predicate.GreaterOrEqual
-
- test(LogicalType, Object, Object) - Method in class org.apache.flink.table.store.file.predicate.GreaterThan
-
- test(LogicalType, long, FieldStats, Object) - Method in class org.apache.flink.table.store.file.predicate.GreaterThan
-
- test(LogicalType, Object, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.In
-
- test(LogicalType, long, FieldStats, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.In
-
- test(LogicalType, Object) - Method in class org.apache.flink.table.store.file.predicate.IsNotNull
-
- test(LogicalType, long, FieldStats) - Method in class org.apache.flink.table.store.file.predicate.IsNotNull
-
- test(LogicalType, Object) - Method in class org.apache.flink.table.store.file.predicate.IsNull
-
- test(LogicalType, long, FieldStats) - Method in class org.apache.flink.table.store.file.predicate.IsNull
-
- test(LogicalType, Object, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.LeafFunction
-
- test(LogicalType, long, FieldStats, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.LeafFunction
-
- test(Object[]) - Method in class org.apache.flink.table.store.file.predicate.LeafPredicate
-
- test(long, FieldStats[]) - Method in class org.apache.flink.table.store.file.predicate.LeafPredicate
-
- test(LogicalType, Object) - Method in class org.apache.flink.table.store.file.predicate.LeafUnaryFunction
-
- test(LogicalType, long, FieldStats) - Method in class org.apache.flink.table.store.file.predicate.LeafUnaryFunction
-
- test(LogicalType, Object, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.LeafUnaryFunction
-
- test(LogicalType, long, FieldStats, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.LeafUnaryFunction
-
- test(LogicalType, Object, Object) - Method in class org.apache.flink.table.store.file.predicate.LessOrEqual
-
- test(LogicalType, long, FieldStats, Object) - Method in class org.apache.flink.table.store.file.predicate.LessOrEqual
-
- test(LogicalType, Object, Object) - Method in class org.apache.flink.table.store.file.predicate.LessThan
-
- test(LogicalType, long, FieldStats, Object) - Method in class org.apache.flink.table.store.file.predicate.LessThan
-
- test(LogicalType, Object, Object) - Method in class org.apache.flink.table.store.file.predicate.NotEqual
-
- test(LogicalType, long, FieldStats, Object) - Method in class org.apache.flink.table.store.file.predicate.NotEqual
-
- test(LogicalType, Object, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.NotIn
-
- test(LogicalType, long, FieldStats, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.NotIn
-
- test(LogicalType, Object, Object) - Method in class org.apache.flink.table.store.file.predicate.NullFalseLeafBinaryFunction
-
- test(LogicalType, long, FieldStats, Object) - Method in class org.apache.flink.table.store.file.predicate.NullFalseLeafBinaryFunction
-
- test(LogicalType, Object, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.NullFalseLeafBinaryFunction
-
- test(LogicalType, long, FieldStats, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.NullFalseLeafBinaryFunction
-
- test(Object[], List<Predicate>) - Method in class org.apache.flink.table.store.file.predicate.Or
-
- test(long, FieldStats[], List<Predicate>) - Method in class org.apache.flink.table.store.file.predicate.Or
-
- test(Object[]) - Method in interface org.apache.flink.table.store.file.predicate.Predicate
-
Test based on the specific input column values.
- test(long, FieldStats[]) - Method in interface org.apache.flink.table.store.file.predicate.Predicate
-
Test based on the statistical information to determine whether a hit is possible.
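The statistics-based variant above enables file pruning: if a file's per-column min/max statistics rule out any match, the file can be skipped without reading its rows. A minimal sketch of the idea for an equality predicate (names here are invented; this is not the real Predicate or FieldStats API):

```java
// Hypothetical illustration of min/max pruning for `column = literal`:
// returning true means "a matching row is possible", false means the
// file can safely be skipped.
public class StatsPruningSketch {
    public static boolean equalsMayHit(int min, int max, int literal) {
        return literal >= min && literal <= max;
    }

    public static void main(String[] args) {
        System.out.println(equalsMayHit(10, 20, 15)); // within [10, 20]: must read
        System.out.println(equalsMayHit(10, 20, 42)); // outside range: skip file
    }
}
```

Note the asymmetry: statistics can only prove the *absence* of a hit, so a `true` answer still requires testing the actual rows.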
- test(RowData) - Method in class org.apache.flink.table.store.file.predicate.PredicateFilter
-
- test(LogicalType, Object, Object) - Method in class org.apache.flink.table.store.file.predicate.StartsWith
-
- test(LogicalType, long, FieldStats, Object) - Method in class org.apache.flink.table.store.file.predicate.StartsWith
-
- text - Variable in class org.apache.flink.table.store.benchmark.Query.WriteSql
-
- timeMillis() - Method in class org.apache.flink.table.store.file.Snapshot
-
- timestampPrecision(LogicalType) - Static method in class org.apache.flink.table.store.utils.TypeUtils
-
- toBatch() - Method in class org.apache.flink.table.store.spark.SparkScan
-
- toBinary(FieldStats[]) - Method in class org.apache.flink.table.store.file.stats.FieldStatsArraySerializer
-
- toBoolean(BinaryStringData) - Static method in class org.apache.flink.table.store.utils.TypeUtils
-
Parse a StringData to boolean.
- toBytes() - Method in class org.apache.flink.table.store.connector.sink.LogOffsetCommittable
-
- toByteValue() - Method in enum org.apache.flink.table.store.file.manifest.FileKind
-
- toCatalogTable() - Method in class org.apache.flink.table.store.file.schema.UpdateSchema
-
- toDataType(LogicalType, AtomicInteger) - Static method in class org.apache.flink.table.store.file.schema.TableSchema
-
- toDate(BinaryStringData) - Static method in class org.apache.flink.table.store.utils.TypeUtils
-
- toFlatJson(T) - Static method in class org.apache.flink.table.store.file.utils.JsonSerdeUtil
-
- toFlinkType(DataType) - Static method in class org.apache.flink.table.store.spark.SparkTypeUtils
-
- toInsertableRelation() - Method in class org.apache.flink.table.store.spark.SparkWrite
-
- toInternal(Date) - Static method in class org.apache.flink.table.store.utils.DateTimeUtils
-
Converts the Java type used for UDF parameters of SQL DATE type (Date) to internal representation (int).
- toInternal(LocalDate) - Static method in class org.apache.flink.table.store.utils.DateTimeUtils
-
- toJson() - Method in class org.apache.flink.table.store.file.Snapshot
-
- toJson(T) - Static method in class org.apache.flink.table.store.file.utils.JsonSerdeUtil
-
- toKafkaProperties(ReadableConfig) - Static method in class org.apache.flink.table.store.kafka.KafkaLogStoreFactory
-
- toLocalDateTimeDefault(String) - Static method in class org.apache.flink.table.store.file.partition.PartitionTimeExtractor
-
- toLogRecord(SinkRecord) - Method in class org.apache.flink.table.store.connector.sink.StoreSinkWriteImpl
-
- toLogRecord(SinkRecord) - Method in interface org.apache.flink.table.store.table.sink.TableWrite
-
Log records need to preserve the original primary key (which includes partition fields).
- toLogRecord(SinkRecord) - Method in class org.apache.flink.table.store.table.sink.TableWriteImpl
-
- toManifestFilePath(String) - Method in class org.apache.flink.table.store.file.utils.FileStorePathFactory
-
- toManifestListPath(String) - Method in class org.apache.flink.table.store.file.utils.FileStorePathFactory
-
- toMap() - Method in class org.apache.flink.table.store.CoreOptions
-
- toNestedIndexes() - Method in class org.apache.flink.table.store.utils.Projection
-
Convert this instance to nested projection index paths.
- toPath(String) - Method in class org.apache.flink.table.store.file.io.DataFilePathFactory
-
- TOPIC - Static variable in class org.apache.flink.table.store.kafka.KafkaLogOptions
-
- toRow(DataFileMeta) - Method in class org.apache.flink.table.store.file.io.DataFileMetaSerializer
-
- toRow(KeyValue) - Method in class org.apache.flink.table.store.file.KeyValueSerializer
-
- toRow(RowData, long, RowKind, RowData) - Method in class org.apache.flink.table.store.file.KeyValueSerializer
-
- toRow(T) - Method in class org.apache.flink.table.store.file.utils.ObjectSerializer
-
Convert a T to RowData.
- toRow(T) - Method in class org.apache.flink.table.store.file.utils.VersionedObjectSerializer
-
- toRowData() - Method in class org.apache.flink.table.store.file.stats.BinaryTableStats
-
- toRowType(boolean, List<DataField>) - Static method in class org.apache.flink.table.store.file.schema.RowDataType
-
- toSchemaPath(long) - Method in class org.apache.flink.table.store.file.schema.SchemaManager
-
- toSourceSplit() - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceSplitState
-
- toSplitType(String, FileStoreSourceSplitState) - Method in class org.apache.flink.table.store.connector.source.FileStoreSourceReader
-
- toSQLDate(int) - Static method in class org.apache.flink.table.store.utils.DateTimeUtils
-
Converts the internal representation of a SQL DATE (int) to the Java type used for UDF parameters (Date).
- toString() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.CpuMetric
-
- toString() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.CpuTimeTracker
-
- toString() - Method in class org.apache.flink.table.store.benchmark.metric.cpu.ProcfsBasedProcessTree
-
Returns a string printing PIDs of process present in the ProcfsBasedProcessTree.
- toString() - Method in class org.apache.flink.table.store.codegen.SortSpec.SortFieldSpec
-
- toString() - Method in class org.apache.flink.table.store.codegen.SortSpec
-
- toString() - Method in class org.apache.flink.table.store.connector.sink.BucketStreamPartitioner
-
- toString() - Method in class org.apache.flink.table.store.connector.sink.Committable
-
- toString() - Method in class org.apache.flink.table.store.connector.sink.CommittableTypeInfo
-
- toString() - Method in class org.apache.flink.table.store.connector.sink.OffsetRowDataHashStreamPartitioner
-
- toString() - Method in enum org.apache.flink.table.store.CoreOptions.ChangelogProducer
-
- toString() - Method in enum org.apache.flink.table.store.CoreOptions.LogChangelogMode
-
- toString() - Method in enum org.apache.flink.table.store.CoreOptions.LogConsistency
-
- toString() - Method in enum org.apache.flink.table.store.CoreOptions.MergeEngine
-
- toString() - Method in enum org.apache.flink.table.store.CoreOptions.StartupMode
-
- toString() - Method in class org.apache.flink.table.store.file.io.CompactIncrement
-
- toString() - Method in class org.apache.flink.table.store.file.io.DataFileMeta
-
- toString() - Method in class org.apache.flink.table.store.file.io.NewFilesIncrement
-
- toString(RowType, RowType) - Method in class org.apache.flink.table.store.file.KeyValue
-
- toString() - Method in class org.apache.flink.table.store.file.manifest.ManifestCommittable
-
- toString() - Method in class org.apache.flink.table.store.file.manifest.ManifestEntry.Identifier
-
- toString(FileStorePathFactory) - Method in class org.apache.flink.table.store.file.manifest.ManifestEntry.Identifier
-
- toString() - Method in class org.apache.flink.table.store.file.manifest.ManifestEntry
-
- toString() - Method in class org.apache.flink.table.store.file.manifest.ManifestFileMeta
-
- toString() - Method in class org.apache.flink.table.store.file.mergetree.LevelSortedRun
-
- toString() - Method in class org.apache.flink.table.store.file.mergetree.SortedRun
-
- toString() - Method in class org.apache.flink.table.store.file.predicate.FieldRef
-
- toString() - Method in class org.apache.flink.table.store.file.schema.DataField
-
- toString() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- toString() - Method in class org.apache.flink.table.store.file.schema.UpdateSchema
-
- toString() - Method in enum org.apache.flink.table.store.file.WriteMode
-
- toString() - Method in class org.apache.flink.table.store.format.FieldStats
-
- toString() - Method in class org.apache.flink.table.store.mapred.TableStoreInputSplit
-
- toString() - Method in class org.apache.flink.table.store.table.sink.FileCommittable
-
- toString() - Method in enum org.apache.flink.table.store.table.TableType
-
- toString() - Method in class org.apache.flink.table.store.utils.KeyProjectedRowData
-
- toString() - Method in class org.apache.flink.table.store.utils.ProjectedRowData
-
- toString() - Method in class org.apache.flink.table.store.utils.Projection
-
- toStringArrayData(List<String>) - Static method in class org.apache.flink.table.store.utils.RowDataUtils
-
- totalBuckets() - Method in class org.apache.flink.table.store.file.manifest.ManifestEntry
-
- totalSize() - Method in class org.apache.flink.table.store.file.mergetree.SortedRun
-
- toTime(BinaryStringData) - Static method in class org.apache.flink.table.store.utils.TypeUtils
-
- toTimestamp(BinaryStringData, int) - Static method in class org.apache.flink.table.store.utils.TypeUtils
-
Used by CAST(x as TIMESTAMP).
- toTopLevelIndexes() - Method in class org.apache.flink.table.store.utils.Projection
-
Convert this instance to a projection of top level indexes.
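`Projection` exposes both a nested and a top-level view of the same projection. As a hypothetical illustration of the difference (simplified helpers, not the actual `Projection` API): a nested path like `[2, 1]` addresses sub-field 1 inside top-level field 2, while the top-level view keeps only the outermost positions.

```java
import java.util.Arrays;

// Hypothetical sketch: deriving top-level projection indexes from
// nested index paths by keeping only the first step of each path.
public class ProjectionSketch {
    static int[] toTopLevelIndexes(int[][] nestedPaths) {
        // Each path's first element is the top-level field position.
        return Arrays.stream(nestedPaths).mapToInt(p -> p[0]).toArray();
    }

    public static void main(String[] args) {
        // Project field 0, and sub-field 1 of field 2.
        int[][] nested = {{0}, {2, 1}};
        System.out.println(Arrays.toString(toTopLevelIndexes(nested))); // [0, 2]
    }
}
```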
- toUpdateSchema() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- transform(RecordReader<L>, Function<L, R>) - Static method in class org.apache.flink.table.store.file.utils.RecordReaderUtils
-
Returns a RecordReader that applies function to each element of fromReader.
- transform(RecordReader.RecordIterator<L>, Function<L, R>) - Static method in class org.apache.flink.table.store.file.utils.RecordReaderUtils
-
Returns an iterator that applies function to each element of fromIterator.
- transformFieldMapping(Predicate, int[]) - Static method in class org.apache.flink.table.store.file.predicate.PredicateBuilder
-
- triggerCompaction(boolean) - Method in class org.apache.flink.table.store.file.append.AppendOnlyCompactManager
-
- triggerCompaction(boolean) - Method in interface org.apache.flink.table.store.file.compact.CompactManager
-
Trigger a new compaction task.
- triggerCompaction(boolean) - Method in class org.apache.flink.table.store.file.compact.NoopCompactManager
-
- triggerCompaction(boolean) - Method in class org.apache.flink.table.store.file.mergetree.compact.MergeTreeCompactManager
-
- trimmedPrimaryKeys() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- trimmedPrimaryKeysFields() - Method in class org.apache.flink.table.store.file.schema.TableSchema
-
- type() - Method in class org.apache.flink.table.store.file.predicate.FieldRef
-
- type() - Method in class org.apache.flink.table.store.file.predicate.LeafPredicate
-
- type() - Method in class org.apache.flink.table.store.file.schema.DataField
-
- TypeUtils - Class in org.apache.flink.table.store.utils
-
Type-related helper functions.
- TypeUtils() - Constructor for class org.apache.flink.table.store.utils.TypeUtils
-
- validate(Comparator<RowData>) - Method in class org.apache.flink.table.store.file.mergetree.SortedRun
-
- validate(TableSchema) - Static method in class org.apache.flink.table.store.table.source.snapshot.ContinuousDataFileSnapshotEnumerator
-
- validateKeyFormat(Format, String) - Static method in interface org.apache.flink.table.store.log.LogStoreTableFactory
-
- validateTableSchema(TableSchema) - Static method in class org.apache.flink.table.store.CoreOptions
-
- validateValueFormat(Format, String) - Static method in interface org.apache.flink.table.store.log.LogStoreTableFactory
-
- value() - Method in class org.apache.flink.table.store.file.KeyValue
-
- value() - Method in class org.apache.flink.table.store.file.schema.SchemaChange.SetOption
-
- VALUE_COUNT - Static variable in class org.apache.flink.table.store.file.schema.TableSchema
-
- VALUE_KIND - Static variable in class org.apache.flink.table.store.file.schema.TableSchema
-
- ValueContentRowDataRecordIterator - Class in org.apache.flink.table.store.table.source
-
A RecordReader.RecordIterator mapping a KeyValue to its value.
- ValueContentRowDataRecordIterator(RecordReader.RecordIterator<KeyValue>) - Constructor for class org.apache.flink.table.store.table.source.ValueContentRowDataRecordIterator
-
- ValueCountMergeFunction - Class in org.apache.flink.table.store.file.mergetree.compact
-
A MergeFunction where the key is the full record and the value is a count representing the number of records with the exact same fields.
- ValueCountMergeFunction() - Constructor for class org.apache.flink.table.store.file.mergetree.compact.ValueCountMergeFunction
-
- ValueCountRowDataRecordIterator - Class in org.apache.flink.table.store.table.source
-
A RecordReader.RecordIterator mapping a KeyValue to several RowData according to its key.
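In value-count mode the stored value is a count of identical records, and a reader expands each KeyValue back into that many rows. A hypothetical sketch of the expansion (using strings in place of RowData): a positive count yields that many insert rows, a negative count that many delete rows.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of value-count expansion: one (record, count)
// pair becomes |count| rows, flagged as inserts or deletes by the
// sign of the count.
public class ValueCountSketch {
    static List<String> expand(String keyRow, long count) {
        List<String> rows = new ArrayList<>();
        String kind = count >= 0 ? "+I " : "-D ";
        for (long i = 0; i < Math.abs(count); i++) {
            rows.add(kind + keyRow);
        }
        return rows;
    }

    public static void main(String[] args) {
        System.out.println(expand("a", 3));  // [+I a, +I a, +I a]
        System.out.println(expand("b", -2)); // [-D b, -D b]
    }
}
```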
- ValueCountRowDataRecordIterator(RecordReader.RecordIterator<KeyValue>) - Constructor for class org.apache.flink.table.store.table.source.ValueCountRowDataRecordIterator
-
- valueFields(TableSchema) - Method in interface org.apache.flink.table.store.file.schema.KeyValueFieldsExtractor
-
Extract value fields from table schema.
- valueInputView - Variable in class org.apache.flink.table.store.connector.lookup.RocksDBState
-
- valueKind() - Method in class org.apache.flink.table.store.file.KeyValue
-
- valueOf(String) - Static method in enum org.apache.flink.table.store.benchmark.metric.cpu.OperatingSystem
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.flink.table.store.CoreOptions.ChangelogProducer
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.flink.table.store.CoreOptions.LogChangelogMode
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.flink.table.store.CoreOptions.LogConsistency
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.flink.table.store.CoreOptions.MergeEngine
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.flink.table.store.CoreOptions.StartupMode
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.flink.table.store.file.manifest.FileKind
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.flink.table.store.file.operation.ScanKind
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.flink.table.store.file.Snapshot.CommitKind
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.flink.table.store.file.WriteMode
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.flink.table.store.table.TableType
-
Returns the enum constant of this type with the specified name.
- valueOutputView - Variable in class org.apache.flink.table.store.connector.lookup.RocksDBState
-
- values() - Static method in enum org.apache.flink.table.store.benchmark.metric.cpu.OperatingSystem
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.flink.table.store.CoreOptions.ChangelogProducer
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.flink.table.store.CoreOptions.LogChangelogMode
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.flink.table.store.CoreOptions.LogConsistency
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.flink.table.store.CoreOptions.MergeEngine
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.flink.table.store.CoreOptions.StartupMode
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.flink.table.store.file.manifest.FileKind
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.flink.table.store.file.operation.ScanKind
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.flink.table.store.file.Snapshot.CommitKind
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.flink.table.store.file.WriteMode
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.flink.table.store.table.TableType
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- valueSerializer - Variable in class org.apache.flink.table.store.connector.lookup.RocksDBState
-
- valueState(String, TypeSerializer<RowData>, TypeSerializer<RowData>, long) - Method in class org.apache.flink.table.store.connector.lookup.RocksDBStateFactory
-
- valueStats() - Method in class org.apache.flink.table.store.file.io.DataFileMeta
-
- valueType() - Method in class org.apache.flink.table.store.file.io.KeyValueFileWriterFactory
-
- valueType() - Method in class org.apache.flink.table.store.file.schema.MapDataType
-
- version() - Method in class org.apache.flink.table.store.file.Snapshot
-
- VersionedObjectSerializer<T> - Class in org.apache.flink.table.store.file.utils
-
- VersionedObjectSerializer(RowType) - Constructor for class org.apache.flink.table.store.file.utils.VersionedObjectSerializer
-
- versionType(RowType) - Static method in class org.apache.flink.table.store.file.utils.VersionedObjectSerializer
-
- visit(FunctionVisitor<T>, List<T>) - Method in class org.apache.flink.table.store.file.predicate.And
-
- visit(FunctionVisitor<T>, List<T>) - Method in class org.apache.flink.table.store.file.predicate.CompoundPredicate.Function
-
- visit(PredicateVisitor<T>) - Method in class org.apache.flink.table.store.file.predicate.CompoundPredicate
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.Equal
-
- visit(LeafPredicate) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visit(CompoundPredicate) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.GreaterOrEqual
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.GreaterThan
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.In
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.IsNotNull
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.IsNull
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.LeafFunction
-
- visit(PredicateVisitor<T>) - Method in class org.apache.flink.table.store.file.predicate.LeafPredicate
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.LessOrEqual
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.LessThan
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.NotEqual
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.NotIn
-
- visit(FunctionVisitor<T>, List<T>) - Method in class org.apache.flink.table.store.file.predicate.Or
-
- visit(PredicateVisitor<T>) - Method in interface org.apache.flink.table.store.file.predicate.Predicate
-
- visit(CallExpression) - Method in class org.apache.flink.table.store.file.predicate.PredicateConverter
-
- visit(ValueLiteralExpression) - Method in class org.apache.flink.table.store.file.predicate.PredicateConverter
-
- visit(FieldReferenceExpression) - Method in class org.apache.flink.table.store.file.predicate.PredicateConverter
-
- visit(TypeLiteralExpression) - Method in class org.apache.flink.table.store.file.predicate.PredicateConverter
-
- visit(Expression) - Method in class org.apache.flink.table.store.file.predicate.PredicateConverter
-
- visit(CompoundPredicate) - Method in interface org.apache.flink.table.store.file.predicate.PredicateReplaceVisitor
-
- visit(LeafPredicate) - Method in interface org.apache.flink.table.store.file.predicate.PredicateVisitor
-
- visit(CompoundPredicate) - Method in interface org.apache.flink.table.store.file.predicate.PredicateVisitor
-
- visit(FunctionVisitor<T>, FieldRef, List<Object>) - Method in class org.apache.flink.table.store.file.predicate.StartsWith
-
- visitAnd(List<T>) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitAnd(List<Optional<OrcFilters.Predicate>>) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitEqual(FieldRef, Object) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitEqual(FieldRef, Object) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitGreaterOrEqual(FieldRef, Object) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitGreaterOrEqual(FieldRef, Object) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitGreaterThan(FieldRef, Object) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitGreaterThan(FieldRef, Object) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitIn(FieldRef, List<Object>) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitIn(FieldRef, List<Object>) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitIsNotNull(FieldRef) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitIsNotNull(FieldRef) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitIsNull(FieldRef) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitIsNull(FieldRef) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitLessOrEqual(FieldRef, Object) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitLessOrEqual(FieldRef, Object) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitLessThan(FieldRef, Object) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitLessThan(FieldRef, Object) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitNotEqual(FieldRef, Object) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitNotEqual(FieldRef, Object) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitNotIn(FieldRef, List<Object>) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitNotIn(FieldRef, List<Object>) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitOr(List<T>) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- VISITOR - Static variable in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitOr(List<Optional<OrcFilters.Predicate>>) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-
- visitStartsWith(FieldRef, Object) - Method in interface org.apache.flink.table.store.file.predicate.FunctionVisitor
-
- visitStartsWith(FieldRef, Object) - Method in class org.apache.flink.table.store.format.orc.OrcPredicateFunctionVisitor
-