Modifier and Type | Method and Description |
---|---|
static Object |
PythonBridgeUtils.getPickledBytesFromRow(Row row,
DataType[] dataTypes) |
Modifier and Type | Method and Description |
---|---|
DataType |
AsyncDynamicTableSinkFactory.AsyncDynamicSinkContext.getPhysicalDataType() |
Constructor and Description |
---|
DataGenTableSource(DataGenerator<?>[] fieldGenerators,
String tableName,
DataType rowDataType,
long rowsPerSecond,
Long numberOfRows) |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> |
FileSystemTableSource.listReadableMetadata() |
Modifier and Type | Method and Description |
---|---|
void |
FileSystemTableSource.applyProjection(int[][] projectedFields,
DataType producedDataType) |
void |
FileSystemTableSource.applyReadableMetadata(List<String> metadataKeys,
DataType producedDataType) |
static Object |
RowPartitionComputer.restorePartValueFromType(String valStr,
DataType type) |
Constructor and Description |
---|
FileSystemTableSource(ObjectIdentifier tableIdentifier,
DataType physicalRowDataType,
List<String> partitionKeys,
ReadableConfig tableOptions,
DecodingFormat<BulkFormat<RowData,FileSourceSplit>> bulkReaderFormat,
DecodingFormat<DeserializationSchema<RowData>> deserializationFormat) |
RowDataPartitionComputer(String defaultPartValue,
String[] columnNames,
DataType[] columnTypes,
String[] partitionColumns) |
Modifier and Type | Method and Description |
---|---|
KinesisFirehoseDynamicSink.KinesisFirehoseDynamicSinkBuilder |
KinesisFirehoseDynamicSink.KinesisFirehoseDynamicSinkBuilder.setConsumedDataType(DataType consumedDataType) |
Constructor and Description |
---|
KinesisFirehoseDynamicSink(Integer maxBatchSize,
Integer maxInFlightRequests,
Integer maxBufferedRequests,
Long maxBufferSizeInBytes,
Long maxTimeInBufferMS,
Boolean failOnError,
DataType consumedDataType,
String deliveryStream,
Properties firehoseClientProperties,
EncodingFormat<SerializationSchema<RowData>> encodingFormat) |
Modifier and Type | Method and Description |
---|---|
void |
AbstractHBaseDynamicTableSource.applyProjection(int[][] projectedFields,
DataType producedDataType) |
Modifier and Type | Method and Description |
---|---|
static void |
HBaseConnectorOptionsUtil.validatePrimaryKey(DataType dataType,
int[] primaryKeyIndexes)
Checks that the HBase table has a row key defined.
|
Modifier and Type | Method and Description |
---|---|
DataType |
HBaseTableSchema.convertToDataType()
Converts this
HBaseTableSchema to a DataType; the fields consist of the column families
and the row key, in definition order. |
DataType[] |
HBaseTableSchema.getQualifierDataTypes(String family) |
Modifier and Type | Method and Description |
---|---|
Optional<DataType> |
HBaseTableSchema.getRowKeyDataType() |
Modifier and Type | Method and Description |
---|---|
void |
HBaseTableSchema.addColumn(String family,
String qualifier,
DataType type) |
static HBaseTableSchema |
HBaseTableSchema.fromDataType(DataType physicalRowType)
Construct a
HBaseTableSchema from a DataType . |
void |
HBaseTableSchema.setRowKey(String rowKeyName,
DataType type) |
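The methods above can be combined to describe an HBase-backed table. A minimal sketch, assuming the `org.apache.flink.connector.hbase.util` package location and with illustrative names (`rowkey`, `cf1`, `name`, `age`):

```java
import org.apache.flink.connector.hbase.util.HBaseTableSchema;
import org.apache.flink.table.api.DataTypes;

public class HBaseSchemaSketch {
    public static void main(String[] args) {
        HBaseTableSchema schema = new HBaseTableSchema();
        // Declare the row key and two qualifiers in one column family.
        schema.setRowKey("rowkey", DataTypes.STRING());
        schema.addColumn("cf1", "name", DataTypes.STRING());
        schema.addColumn("cf1", "age", DataTypes.INT());
        // Convert back to a row DataType; fields follow the definition
        // order of the families and the row key.
        System.out.println(schema.convertToDataType());
    }
}
```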
Modifier and Type | Method and Description |
---|---|
protected DataType |
AbstractJdbcCatalog.fromJDBCType(ObjectPath tablePath,
ResultSetMetaData metadata,
int colIndex) |
protected DataType |
MySqlCatalog.fromJDBCType(ObjectPath tablePath,
ResultSetMetaData metadata,
int colIndex)
Converts MySQL type to Flink
DataType . |
protected DataType |
PostgresCatalog.fromJDBCType(ObjectPath tablePath,
ResultSetMetaData metadata,
int colIndex)
Converts Postgres type to Flink
DataType . |
Modifier and Type | Method and Description |
---|---|
DataType |
JdbcDialectTypeMapper.mapping(ObjectPath tablePath,
ResultSetMetaData metadata,
int colIndex) |
Modifier and Type | Method and Description |
---|---|
DataType |
MySqlTypeMapper.mapping(ObjectPath tablePath,
ResultSetMetaData metadata,
int colIndex) |
Modifier and Type | Method and Description |
---|---|
DataType |
PostgresTypeMapper.mapping(ObjectPath tablePath,
ResultSetMetaData metadata,
int colIndex) |
Modifier and Type | Method and Description |
---|---|
void |
JdbcDynamicTableSource.applyProjection(int[][] projectedFields,
DataType producedDataType) |
JdbcOutputFormatBuilder |
JdbcOutputFormatBuilder.setFieldDataTypes(DataType[] fieldDataTypes) |
Constructor and Description |
---|
JdbcDynamicTableSink(JdbcConnectorOptions jdbcOptions,
JdbcExecutionOptions executionOptions,
JdbcDmlOptions dmlOptions,
DataType physicalRowDataType) |
JdbcDynamicTableSource(JdbcConnectorOptions options,
JdbcReadOptions readOptions,
int lookupMaxRetryTimes,
LookupCache cache,
DataType physicalRowDataType) |
JdbcRowDataLookupFunction(JdbcConnectorOptions options,
int maxRetryTimes,
String[] fieldNames,
DataType[] fieldTypes,
String[] keyNames,
RowType rowType) |
Modifier and Type | Method and Description |
---|---|
KinesisDynamicSink.KinesisDynamicTableSinkBuilder |
KinesisDynamicSink.KinesisDynamicTableSinkBuilder.setConsumedDataType(DataType consumedDataType) |
Constructor and Description |
---|
KinesisDynamicSink(Integer maxBatchSize,
Integer maxInFlightRequests,
Integer maxBufferedRequests,
Long maxBufferSizeInBytes,
Long maxTimeInBufferMS,
Boolean failOnError,
DataType consumedDataType,
String stream,
Properties kinesisClientProperties,
EncodingFormat<SerializationSchema<RowData>> encodingFormat,
PartitionKeyGenerator<RowData> partitioner) |
Modifier and Type | Method and Description |
---|---|
ExternalSystemDataReader<RowData> |
TableSinkExternalContext.createSinkRowDataReader(TestingSinkSettings sinkOptions,
DataType dataType)
Create a reader for consuming data written to the external system by the
sink under test.
|
Modifier and Type | Method and Description |
---|---|
ExternalSystemSplitDataWriter<RowData> |
TableSourceExternalContext.createSplitRowDataWriter(TestingSourceSettings sourceOptions,
DataType dataType)
Create a new split in the external system and return a data writer for writing
RowData corresponding to the new split. |
Modifier and Type | Method and Description |
---|---|
void |
HiveTableSource.applyProjection(int[][] projectedFields,
DataType producedDataType) |
Constructor and Description |
---|
HiveContinuousPartitionFetcherContext(ObjectPath tablePath,
HiveShim hiveShim,
JobConfWrapper confWrapper,
List<String> partitionKeys,
DataType[] fieldTypes,
String[] fieldNames,
Configuration configuration,
String defaultPartitionName) |
HiveRowDataPartitionComputer(HiveShim hiveShim,
String defaultPartValue,
String[] columnNames,
DataType[] columnTypes,
String[] partitionColumns) |
Modifier and Type | Field and Description |
---|---|
protected DataType[] |
HivePartitionFetcherContextBase.fieldTypes |
Constructor and Description |
---|
HiveBulkFormatAdapter(JobConfWrapper jobConfWrapper,
List<String> partitionKeys,
String[] fieldNames,
DataType[] fieldTypes,
String hiveVersion,
RowType producedRowType,
boolean useMapRedReader)
Deprecated.
|
HiveInputFormat(JobConfWrapper jobConfWrapper,
List<String> partitionKeys,
String[] fieldNames,
DataType[] fieldTypes,
String hiveVersion,
RowType producedRowType,
boolean useMapRedReader) |
HiveInputFormatPartitionReader(int threadNum,
org.apache.hadoop.mapred.JobConf jobConf,
String hiveVersion,
ObjectPath tablePath,
DataType[] fieldTypes,
String[] fieldNames,
List<String> partitionKeys,
int[] selectedFields,
boolean useMapRedReader) |
HiveMapredSplitReader(org.apache.hadoop.mapred.JobConf jobConf,
List<String> partitionKeys,
DataType[] fieldTypes,
int[] selectedFields,
HiveTableInputSplit split,
HiveShim hiveShim) |
HivePartitionFetcherContextBase(ObjectPath tablePath,
HiveShim hiveShim,
JobConfWrapper confWrapper,
List<String> partitionKeys,
DataType[] fieldTypes,
String[] fieldNames,
Configuration configuration,
String defaultPartitionName) |
HiveTableInputFormat(int threadNum,
org.apache.hadoop.mapred.JobConf jobConf,
List<String> partitionKeys,
DataType[] fieldTypes,
String[] fieldNames,
int[] projectedFields,
Long limit,
String hiveVersion,
boolean useMapRedReader,
List<HiveTablePartition> partitions) |
HiveVectorizedOrcSplitReader(String hiveVersion,
org.apache.hadoop.mapred.JobConf jobConf,
String[] fieldNames,
DataType[] fieldTypes,
int[] selectedFields,
HiveTableInputSplit split) |
HiveVectorizedParquetSplitReader(String hiveVersion,
org.apache.hadoop.mapred.JobConf jobConf,
String[] fieldNames,
DataType[] fieldTypes,
int[] selectedFields,
HiveTableInputSplit split) |
Modifier and Type | Method and Description |
---|---|
static Map<String,Object> |
HivePartitionUtils.parsePartitionValues(Map<String,String> partitionSpecs,
String[] fieldNames,
DataType[] fieldTypes,
String defaultPartitionName,
HiveShim shim)
Parse partition string specs into object values.
|
Modifier and Type | Method and Description |
---|---|
static RowType |
DebeziumAvroSerializationSchema.createDebeziumAvroRowType(DataType dataType) |
static RowType |
DebeziumAvroDeserializationSchema.createDebeziumAvroRowType(DataType databaseSchema) |
Modifier and Type | Method and Description |
---|---|
static DataType |
AvroSchemaConverter.convertToDataType(String avroSchemaString)
Converts an Avro schema string into a nested row structure with deterministic field order and
data types that are compatible with Flink's Table & SQL API.
|
Modifier and Type | Method and Description |
---|---|
static BulkWriter.Factory<RowData> |
PythonCsvUtils.createCsvBulkWriterFactory(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema schema,
DataType physicalDataType)
Util for creating a
BulkWriter.Factory that wraps CsvBulkWriter.forSchema(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvMapper, org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema, org.apache.flink.formats.common.Converter<T, R, C>, C, org.apache.flink.core.fs.FSDataOutputStream) . |
static CsvReaderFormat<Object> |
PythonCsvUtils.createCsvReaderFormat(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema schema,
DataType dataType)
Util for creating a
CsvReaderFormat . |
BulkFormat<RowData,FileSourceSplit> |
CsvFileFormatFactory.CsvBulkDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType physicalDataType,
int[][] projections) |
TableStats |
CsvFileFormatFactory.CsvBulkDecodingFormat.reportStatistics(List<Path> files,
DataType producedDataType) |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> |
CanalJsonDecodingFormat.listReadableMetadata() |
Modifier and Type | Method and Description |
---|---|
static CanalJsonDeserializationSchema.Builder |
CanalJsonDeserializationSchema.builder(DataType physicalDataType,
List<org.apache.flink.formats.json.canal.CanalJsonDecodingFormat.ReadableMetadata> requestedMetadata,
TypeInformation<RowData> producedTypeInfo)
Creates a builder for building a
CanalJsonDeserializationSchema . |
DeserializationSchema<RowData> |
CanalJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType physicalDataType,
int[][] projections) |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> |
DebeziumJsonDecodingFormat.listReadableMetadata() |
Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> |
DebeziumJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType physicalDataType,
int[][] projections) |
Constructor and Description |
---|
DebeziumJsonDeserializationSchema(DataType physicalDataType,
List<org.apache.flink.formats.json.debezium.DebeziumJsonDecodingFormat.ReadableMetadata> requestedMetadata,
TypeInformation<RowData> producedTypeInfo,
boolean schemaInclude,
boolean ignoreParseErrors,
TimestampFormat timestampFormat) |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> |
MaxwellJsonDecodingFormat.listReadableMetadata() |
Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> |
MaxwellJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType physicalDataType,
int[][] projections) |
Constructor and Description |
---|
MaxwellJsonDeserializationSchema(DataType physicalDataType,
List<org.apache.flink.formats.json.maxwell.MaxwellJsonDecodingFormat.ReadableMetadata> requestedMetadata,
TypeInformation<RowData> producedTypeInfo,
boolean ignoreParseErrors,
TimestampFormat timestampFormat) |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> |
OggJsonDecodingFormat.listReadableMetadata() |
Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> |
OggJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType physicalDataType) |
Constructor and Description |
---|
OggJsonDeserializationSchema(DataType physicalDataType,
List<org.apache.flink.formats.json.ogg.OggJsonDecodingFormat.ReadableMetadata> requestedMetadata,
TypeInformation<RowData> producedTypeInfo,
boolean ignoreParseErrors,
TimestampFormat timestampFormat) |
Modifier and Type | Method and Description |
---|---|
BulkFormat<RowData,FileSourceSplit> |
ParquetFileFormatFactory.ParquetBulkDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context sourceContext,
DataType producedDataType,
int[][] projections) |
TableStats |
ParquetColumnarRowInputFormat.reportStatistics(List<Path> files,
DataType producedDataType) |
TableStats |
ParquetFileFormatFactory.ParquetBulkDecodingFormat.reportStatistics(List<Path> files,
DataType producedDataType) |
Modifier and Type | Method and Description |
---|---|
static TableStats |
ParquetFormatStatisticsReportUtil.getTableStatistics(List<Path> files,
DataType producedDataType,
Configuration hadoopConfig,
boolean isUtcTimestamp) |
Modifier and Type | Method and Description |
---|---|
static ParquetColumnarRowSplitReader |
ParquetSplitReaderUtil.genPartColumnarRowReader(boolean utcTimestamp,
boolean caseSensitive,
Configuration conf,
String[] fullFieldNames,
DataType[] fullFieldTypes,
Map<String,Object> partitionSpec,
int[] selectedFields,
int batchSize,
Path path,
long splitStart,
long splitLength)
Util for generating partitioned
ParquetColumnarRowSplitReader . |
Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> |
PbDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType producedDataType) |
SerializationSchema<RowData> |
PbEncodingFormat.createRuntimeEncoder(DynamicTableSink.Context context,
DataType consumedDataType) |
Modifier and Type | Method and Description |
---|---|
static org.apache.orc.TypeDescription |
OrcSplitReaderUtil.convertToOrcTypeWithPart(String[] fullFieldNames,
DataType[] fullFieldTypes,
Collection<String> partitionKeys) |
BulkFormat<RowData,FileSourceSplit> |
OrcFileFormatFactory.OrcBulkDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context sourceContext,
DataType producedDataType,
int[][] projections) |
static OrcColumnarRowSplitReader<org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch> |
OrcSplitReaderUtil.genPartColumnarRowReader(String hiveVersion,
Configuration conf,
String[] fullFieldNames,
DataType[] fullFieldTypes,
Map<String,Object> partitionSpec,
int[] selectedFields,
List<OrcFilters.Predicate> conjunctPredicates,
int batchSize,
Path path,
long splitStart,
long splitLength)
Util for generating partitioned
OrcColumnarRowSplitReader . |
TableStats |
OrcColumnarRowInputFormat.reportStatistics(List<Path> files,
DataType producedDataType) |
TableStats |
OrcFileFormatFactory.OrcBulkDecodingFormat.reportStatistics(List<Path> files,
DataType producedDataType) |
Modifier and Type | Method and Description |
---|---|
static OrcColumnarRowSplitReader<org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch> |
OrcNoHiveSplitReaderUtil.genPartColumnarRowReader(Configuration conf,
String[] fullFieldNames,
DataType[] fullFieldTypes,
Map<String,Object> partitionSpec,
int[] selectedFields,
List<OrcFilters.Predicate> conjunctPredicates,
int batchSize,
Path path,
long splitStart,
long splitLength)
Util for generating partitioned
OrcColumnarRowSplitReader . |
Modifier and Type | Method and Description |
---|---|
static TableStats |
OrcFormatStatisticsReportUtil.getTableStatistics(List<Path> files,
DataType producedDataType,
Configuration hadoopConfig) |
Constructor and Description |
---|
RowRowMapper(DataType dataType) |
Modifier and Type | Method and Description |
---|---|
DataType |
BatchSQLTestProgram.GeneratorTableSource.getProducedDataType() |
Modifier and Type | Field and Description |
---|---|
protected DataType |
KafkaDynamicSink.consumedDataType
Data type of the data consumed by the sink.
|
protected DataType |
KafkaDynamicSource.physicalDataType
Data type to configure the formats.
|
protected DataType |
KafkaDynamicSink.physicalDataType
Data type to configure the formats.
|
protected DataType |
KafkaDynamicSource.producedDataType
Data type that describes the final output of the source.
|
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> |
KafkaDynamicSource.listReadableMetadata() |
Map<String,DataType> |
KafkaDynamicSink.listWritableMetadata() |
Modifier and Type | Method and Description |
---|---|
void |
KafkaDynamicSource.applyReadableMetadata(List<String> metadataKeys,
DataType producedDataType) |
void |
KafkaDynamicSink.applyWritableMetadata(List<String> metadataKeys,
DataType consumedDataType) |
protected KafkaDynamicSink |
KafkaDynamicTableFactory.createKafkaTableSink(DataType physicalDataType,
EncodingFormat<SerializationSchema<RowData>> keyEncodingFormat,
EncodingFormat<SerializationSchema<RowData>> valueEncodingFormat,
int[] keyProjection,
int[] valueProjection,
String keyPrefix,
String topic,
Properties properties,
FlinkKafkaPartitioner<RowData> partitioner,
DeliveryGuarantee deliveryGuarantee,
Integer parallelism,
String transactionalIdPrefix) |
protected KafkaDynamicSource |
KafkaDynamicTableFactory.createKafkaTableSource(DataType physicalDataType,
DecodingFormat<DeserializationSchema<RowData>> keyDecodingFormat,
DecodingFormat<DeserializationSchema<RowData>> valueDecodingFormat,
int[] keyProjection,
int[] valueProjection,
String keyPrefix,
List<String> topics,
Pattern topicPattern,
Properties properties,
StartupMode startupMode,
Map<KafkaTopicPartition,Long> specificStartupOffsets,
long startupTimestampMillis,
String tableIdentifier) |
DeserializationSchema<RowData> |
UpsertKafkaDynamicTableFactory.DecodingFormatWrapper.createRuntimeDecoder(DynamicTableSource.Context context,
DataType producedDataType) |
SerializationSchema<RowData> |
UpsertKafkaDynamicTableFactory.EncodingFormatWrapper.createRuntimeEncoder(DynamicTableSink.Context context,
DataType consumedDataType) |
Constructor and Description |
---|
KafkaDynamicSink(DataType consumedDataType,
DataType physicalDataType,
EncodingFormat<SerializationSchema<RowData>> keyEncodingFormat,
EncodingFormat<SerializationSchema<RowData>> valueEncodingFormat,
int[] keyProjection,
int[] valueProjection,
String keyPrefix,
String topic,
Properties properties,
FlinkKafkaPartitioner<RowData> partitioner,
DeliveryGuarantee deliveryGuarantee,
boolean upsertMode,
SinkBufferFlushMode flushMode,
Integer parallelism,
String transactionalIdPrefix) |
KafkaDynamicSource(DataType physicalDataType,
DecodingFormat<DeserializationSchema<RowData>> keyDecodingFormat,
DecodingFormat<DeserializationSchema<RowData>> valueDecodingFormat,
int[] keyProjection,
int[] valueProjection,
String keyPrefix,
List<String> topics,
Pattern topicPattern,
Properties properties,
StartupMode startupMode,
Map<KafkaTopicPartition,Long> specificStartupOffsets,
long startupTimestampMillis,
boolean upsertMode,
String tableIdentifier) |
Modifier and Type | Method and Description |
---|---|
DataType |
RowDataKinesisDeserializationSchema.Metadata.getDataType() |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> |
KinesisDynamicSource.listReadableMetadata() |
Modifier and Type | Method and Description |
---|---|
void |
KinesisDynamicSource.applyReadableMetadata(List<String> metadataKeys,
DataType producedDataType) |
Constructor and Description |
---|
KinesisDynamicSource(DataType physicalDataType,
String stream,
Properties consumerProperties,
DecodingFormat<DeserializationSchema<RowData>> decodingFormat) |
KinesisDynamicSource(DataType physicalDataType,
String stream,
Properties consumerProperties,
DecodingFormat<DeserializationSchema<RowData>> decodingFormat,
DataType producedDataType,
List<RowDataKinesisDeserializationSchema.Metadata> requestedMetadataFields) |
Modifier and Type | Method and Description |
---|---|
static DataType |
DataTypes.ARRAY(DataType elementDataType)
Data type of an array of elements with the same subtype.
|
static DataType |
DataTypes.BIGINT()
Data type of an 8-byte signed integer with values from -9,223,372,036,854,775,808 to
9,223,372,036,854,775,807.
|
static DataType |
DataTypes.BINARY(int n)
Data type of a fixed-length binary string (=a sequence of bytes)
BINARY(n) where
n is the number of bytes. |
static DataType |
DataTypes.BOOLEAN()
Data type of a boolean with a (possibly) three-valued logic of
TRUE, FALSE, UNKNOWN . |
static DataType |
DataTypes.BYTES()
Data type of a variable-length binary string (=a sequence of bytes) with defined maximum
length.
|
static DataType |
DataTypes.CHAR(int n)
Data type of a fixed-length character string
CHAR(n) where n is the number of
code points. |
static DataType |
DataTypes.DATE()
Data type of a date consisting of
year-month-day with values ranging from 0000-01-01 to 9999-12-31 . |
static DataType |
DataTypes.DECIMAL(int precision,
int scale)
Data type of a decimal number with fixed precision and scale
DECIMAL(p, s) where
p is the number of digits in a number (=precision) and s is the number of
digits to the right of the decimal point in a number (=scale). |
static DataType |
DataTypes.DOUBLE()
Data type of an 8-byte double precision floating point number.
|
static DataType |
DataTypes.FLOAT()
Data type of a 4-byte single precision floating point number.
|
DataType |
DataTypes.Field.getDataType() |
DataType[] |
TableSchema.getFieldDataTypes()
Deprecated.
Returns all field data types as an array.
|
DataType |
TableColumn.getType()
Deprecated.
Returns the data type of this column.
|
DataType |
WatermarkSpec.getWatermarkExprOutputType()
Deprecated.
Returns the data type of the computation result of watermark generation expression.
|
static DataType |
DataTypes.INT()
Data type of a 4-byte signed integer with values from -2,147,483,648 to 2,147,483,647.
|
static DataType |
DataTypes.INTERVAL(DataTypes.Resolution resolution)
Data type of a temporal interval.
|
static DataType |
DataTypes.INTERVAL(DataTypes.Resolution upperResolution,
DataTypes.Resolution lowerResolution)
Data type of a temporal interval.
|
static DataType |
DataTypes.MAP(DataType keyDataType,
DataType valueDataType)
Data type of an associative array that maps keys (including
NULL ) to values
(including NULL ). |
static DataType |
DataTypes.MULTISET(DataType elementDataType)
Data type of a multiset (=bag).
|
static DataType |
DataTypes.NULL()
Data type for representing untyped
NULL values. |
static DataType |
DataTypes.of(LogicalType logicalType)
Creates a
DataType from a LogicalType with default conversion class. |
static <T> DataType |
DataTypes.RAW(Class<T> clazz,
TypeSerializer<T> serializer)
Data type of an arbitrary serialized type.
|
static DataType |
DataTypes.ROW()
Data type of a row type with no fields.
|
static DataType |
DataTypes.ROW(DataType... fieldDataTypes)
Data type of a sequence of fields.
|
static DataType |
DataTypes.ROW(DataTypes.Field... fields)
Data type of a sequence of fields.
|
static DataType |
DataTypes.ROW(List<DataTypes.Field> fields) |
static DataType |
DataTypes.SMALLINT()
Data type of a 2-byte signed integer with values from -32,768 to 32,767.
|
static DataType |
DataTypes.STRING()
Data type of a variable-length character string with defined maximum length.
|
static <T> DataType |
DataTypes.STRUCTURED(Class<T> implementationClass,
DataTypes.Field... fields)
Data type of a user-defined object structured type.
|
static DataType |
DataTypes.TIME()
Data type of a time WITHOUT time zone
TIME with no fractional seconds by default. |
static DataType |
DataTypes.TIME(int precision)
Data type of a time WITHOUT time zone
TIME(p) where p is the number of digits
of fractional seconds (=precision). |
static DataType |
DataTypes.TIMESTAMP_LTZ()
Data type of a timestamp WITH LOCAL time zone.
|
static DataType |
DataTypes.TIMESTAMP_LTZ(int precision)
Data type of a timestamp WITH LOCAL time zone.
|
static DataType |
DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE()
Data type of a timestamp WITH LOCAL time zone
TIMESTAMP WITH LOCAL TIME ZONE with 6
digits of fractional seconds by default. |
static DataType |
DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE(int precision)
Data type of a timestamp WITH LOCAL time zone
TIMESTAMP(p) WITH LOCAL TIME ZONE where
p is the number of digits of fractional seconds (=precision). |
static DataType |
DataTypes.TIMESTAMP_WITH_TIME_ZONE()
Data type of a timestamp WITH time zone
TIMESTAMP WITH TIME ZONE with 6 digits of
fractional seconds by default. |
static DataType |
DataTypes.TIMESTAMP_WITH_TIME_ZONE(int precision)
Data type of a timestamp WITH time zone
TIMESTAMP(p) WITH TIME ZONE where p
is the number of digits of fractional seconds (=precision). |
static DataType |
DataTypes.TIMESTAMP()
Data type of a timestamp WITHOUT time zone
TIMESTAMP with 6 digits of fractional
seconds by default. |
static DataType |
DataTypes.TIMESTAMP(int precision)
Data type of a timestamp WITHOUT time zone
TIMESTAMP(p) where p is the number
of digits of fractional seconds (=precision). |
static DataType |
DataTypes.TINYINT()
Data type of a 1-byte signed integer with values from -128 to 127.
|
DataType |
TableSchema.toPersistedRowDataType()
Deprecated.
Converts all persisted columns of this schema into a (possibly nested) row data type.
|
DataType |
TableSchema.toPhysicalRowDataType()
Deprecated.
Converts all physical columns of this schema into a (possibly nested) row data type.
|
DataType |
TableSchema.toRowDataType()
Deprecated.
Converts all columns of this schema into a (possibly nested) row data type.
|
static DataType |
DataTypes.VARBINARY(int n)
Data type of a variable-length binary string (=a sequence of bytes)
VARBINARY(n)
where n is the maximum number of bytes. |
static DataType |
DataTypes.VARCHAR(int n)
Data type of a variable-length character string
VARCHAR(n) where n is the
maximum number of code points. |
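The `DataTypes` factory methods listed above compose into nested types. A short sketch (field names are illustrative only):

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.types.DataType;

public class DataTypesSketch {
    public static void main(String[] args) {
        // A nested row type built from the factory methods above:
        // BIGINT, DECIMAL(p, s), ARRAY, MAP, and TIMESTAMP(p).
        DataType order = DataTypes.ROW(
            DataTypes.FIELD("id", DataTypes.BIGINT()),
            DataTypes.FIELD("price", DataTypes.DECIMAL(10, 2)),
            DataTypes.FIELD("tags", DataTypes.ARRAY(DataTypes.STRING())),
            DataTypes.FIELD("attrs", DataTypes.MAP(
                DataTypes.STRING(), DataTypes.STRING())),
            DataTypes.FIELD("ts", DataTypes.TIMESTAMP(3)));
        System.out.println(order);
    }
}
```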
Modifier and Type | Method and Description |
---|---|
Optional<DataType> |
TableSchema.getFieldDataType(int fieldIndex)
Deprecated.
Returns the specified data type for the given field index.
|
Optional<DataType> |
TableSchema.getFieldDataType(String fieldName)
Deprecated.
Returns the specified data type for the given field name.
|
Modifier and Type | Method and Description |
---|---|
static DataType |
DataTypes.ARRAY(DataType elementDataType)
Data type of an array of elements with the same subtype.
|
static TableColumn.ComputedColumn |
TableColumn.computed(String name,
DataType type,
String expression)
Deprecated.
Creates a computed column that is computed from the given SQL expression.
|
TableSchema.Builder |
TableSchema.Builder.field(String name,
DataType dataType)
Add a field with name and data type.
|
static DataTypes.Field |
DataTypes.FIELD(String name,
DataType dataType)
Field definition with field name and data type.
|
TableSchema.Builder |
TableSchema.Builder.field(String name,
DataType dataType,
String expression)
Add a computed field which is generated by the given expression.
|
static DataTypes.Field |
DataTypes.FIELD(String name,
DataType dataType,
String description)
Field definition with field name, data type, and a description.
|
TableSchema.Builder |
TableSchema.Builder.fields(String[] names,
DataType[] dataTypes)
Add an array of fields with names and data types.
|
Schema.Builder |
Schema.Builder.fromRowDataType(DataType dataType)
Adopts all fields of the given row as physical columns of the schema.
|
static ApiExpression |
Expressions.lit(Object v,
DataType dataType)
Creates a SQL literal of a given
DataType . |
static DataType |
DataTypes.MAP(DataType keyDataType,
DataType valueDataType)
Data type of an associative array that maps keys (including
NULL ) to values
(including NULL ). |
static TableColumn.MetadataColumn |
TableColumn.metadata(String name,
DataType type)
Deprecated.
Creates a metadata column from metadata of the given column name.
|
static TableColumn.MetadataColumn |
TableColumn.metadata(String name,
DataType type,
boolean isVirtual)
Deprecated.
Creates a metadata column from metadata of the given column name.
|
static TableColumn.MetadataColumn |
TableColumn.metadata(String name,
DataType type,
String metadataAlias)
Deprecated.
Creates a metadata column from metadata of the given alias.
|
static TableColumn.MetadataColumn |
TableColumn.metadata(String name,
DataType type,
String metadataAlias,
boolean isVirtual)
Deprecated.
Creates a metadata column from metadata of the given column name or from metadata of the
given alias (if not null).
|
static DataType |
DataTypes.MULTISET(DataType elementDataType)
Data type of a multiset (=bag).
|
static ApiExpression |
Expressions.nullOf(DataType dataType)
Returns a null literal value of a given data type.
|
static TableColumn |
TableColumn.of(String name,
DataType type)
Deprecated.
Use
TableColumn.physical(String, DataType) instead. |
static TableColumn |
TableColumn.of(String name,
DataType type,
String expression)
Deprecated.
Use
TableColumn.computed(String, DataType, String) instead. |
static TableColumn.PhysicalColumn |
TableColumn.physical(String name,
DataType type)
Deprecated.
Creates a regular table column that represents physical data.
|
static DataType |
DataTypes.ROW(DataType... fieldDataTypes)
Data type of a sequence of fields.
|
TableSchema.Builder |
TableSchema.Builder.watermark(String rowtimeAttribute,
String watermarkExpressionString,
DataType watermarkExprOutputType)
Specifies the previously defined field as an event-time attribute and specifies the
watermark strategy.
|
Constructor and Description |
---|
WatermarkSpec(String rowtimeAttribute,
String watermarkExpressionString,
DataType watermarkExprOutputType)
Deprecated.
|
Modifier and Type | Method and Description |
---|---|
protected <T> DataType |
AbstractStreamTableEnvironmentImpl.wrapWithChangeFlag(TypeInformation<T> outputType) |
Modifier and Type | Method and Description |
---|---|
static DataType |
ListView.newListViewDataType(DataType elementDataType)
|
static DataType |
MapView.newMapViewDataType(DataType keyDataType,
DataType valueDataType)
|
Modifier and Type | Method and Description |
---|---|
static DataType |
ListView.newListViewDataType(DataType elementDataType)
|
static DataType |
MapView.newMapViewDataType(DataType keyDataType,
DataType valueDataType)
|
Modifier and Type | Method and Description |
---|---|
OutType |
BaseExpressions.cast(DataType toType)
Returns a new value being cast to
toType . |
OutType |
BaseExpressions.jsonValue(String path,
DataType returningType)
Extracts a scalar from a JSON string.
|
OutType |
BaseExpressions.jsonValue(String path,
DataType returningType,
InType defaultOnEmptyOrError)
Extracts a scalar from a JSON string.
|
OutType |
BaseExpressions.jsonValue(String path,
DataType returningType,
JsonValueOnEmptyOrError onEmpty,
InType defaultOnEmpty,
JsonValueOnEmptyOrError onError,
InType defaultOnError)
Extracts a scalar from a JSON string.
|
OutType |
BaseExpressions.tryCast(DataType toType)
Like
BaseExpressions.cast(DataType) , but in case of error, returns null rather than failing
the job. |
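The difference between `cast` and `tryCast` above can be sketched with Table API expressions; the column name `amount` is hypothetical:

```java
import static org.apache.flink.table.api.Expressions.$;

import org.apache.flink.table.api.ApiExpression;
import org.apache.flink.table.api.DataTypes;

public class CastSketch {
    public static void main(String[] args) {
        // cast(...) fails the job on a conversion error;
        // tryCast(...) evaluates to NULL instead.
        ApiExpression strict = $("amount").cast(DataTypes.DECIMAL(10, 2));
        ApiExpression lenient = $("amount").tryCast(DataTypes.DECIMAL(10, 2));
        System.out.println(strict + " / " + lenient);
    }
}
```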
Modifier and Type | Field and Description |
---|---|
protected DataType |
Column.dataType |
Modifier and Type | Method and Description |
---|---|
DataType |
DataTypeFactory.createDataType(AbstractDataType<?> abstractDataType)
Creates a type out of an
AbstractDataType . |
<T> DataType |
DataTypeFactory.createDataType(Class<T> clazz)
Creates a type by analyzing the given class.
|
DataType |
DataTypeFactory.createDataType(String typeString)
Creates a type by a fully or partially defined name.
|
<T> DataType |
DataTypeFactory.createDataType(TypeInformation<T> typeInfo)
Creates a type for the given
TypeInformation . |
DataType |
DataTypeFactory.createDataType(UnresolvedIdentifier identifier)
Creates a type by a fully or partially defined identifier.
|
<T> DataType |
DataTypeFactory.createRawDataType(Class<T> clazz)
Creates a RAW type for the given class in cases where no serializer is known and a generic
serializer should be used.
|
<T> DataType |
DataTypeFactory.createRawDataType(TypeInformation<T> typeInfo)
Creates a RAW type for the given
TypeInformation . |
DataType |
Column.getDataType()
Returns the data type of this column.
|
DataType |
SchemaTranslator.ConsumingResult.getPhysicalDataType() |
DataType |
ResolvedSchema.toPhysicalRowDataType()
Converts all physical columns of this schema into a (possibly nested) row data type.
|
DataType |
ResolvedSchema.toSinkRowDataType()
Converts all persisted columns of this schema into a (possibly nested) row data type.
|
DataType |
ResolvedSchema.toSourceRowDataType()
Converts all columns of this schema into a (possibly nested) row data type.
|
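The three row-type conversions above differ only in which columns they keep: all columns (source), persisted columns (sink), or physical columns. A minimal sketch of that filtering, using hypothetical names and plain Java rather than the Flink classes:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SchemaSketch {
    enum Kind { PHYSICAL, COMPUTED, METADATA }

    // Hypothetical column model: name, kind, and whether it is persisted.
    static class Column {
        final String name;
        final Kind kind;
        final boolean persisted;
        Column(String name, Kind kind, boolean persisted) {
            this.name = name;
            this.kind = kind;
            this.persisted = persisted;
        }
    }

    // Mirrors toPhysicalRowDataType: keep only physical columns.
    static List<String> physicalColumns(List<Column> schema) {
        List<String> names = new ArrayList<>();
        for (Column c : schema) {
            if (c.kind == Kind.PHYSICAL) {
                names.add(c.name);
            }
        }
        return names;
    }

    // Mirrors toSinkRowDataType: keep only persisted columns.
    static List<String> persistedColumns(List<Column> schema) {
        List<String> names = new ArrayList<>();
        for (Column c : schema) {
            if (c.persisted) {
                names.add(c.name);
            }
        }
        return names;
    }

    public static void main(String[] args) {
        List<Column> schema = Arrays.asList(
                new Column("id", Kind.PHYSICAL, true),
                new Column("computed_ts", Kind.COMPUTED, false),
                new Column("headers", Kind.METADATA, true));
        System.out.println(physicalColumns(schema));  // [id]
        System.out.println(persistedColumns(schema)); // [id, headers]
    }
}
```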
Modifier and Type | Method and Description |
---|---|
List<DataType> |
ResolvedSchema.getColumnDataTypes()
Returns all column data types.
|
Optional<DataType> |
SchemaTranslator.ProducingResult.getPhysicalDataType() |
Modifier and Type | Method and Description |
---|---|
abstract Column |
Column.copy(DataType newType)
Returns a copy of the column with a replaced
DataType . |
Column |
Column.PhysicalColumn.copy(DataType newDataType) |
Column |
Column.ComputedColumn.copy(DataType newDataType) |
Column |
Column.MetadataColumn.copy(DataType newDataType) |
static SchemaTranslator.ConsumingResult |
SchemaTranslator.createConsumingResult(DataTypeFactory dataTypeFactory,
DataType inputDataType,
Schema declaredSchema,
boolean mergePhysicalSchema)
Converts the given
DataType and an optional declared Schema (possibly
incomplete) into the final SchemaTranslator.ConsumingResult . |
static Column.MetadataColumn |
Column.metadata(String name,
DataType dataType,
String metadataKey,
boolean isVirtual)
Creates a metadata column from metadata of the given column name or from metadata of the
given key (if not null).
|
static ResolvedSchema |
ResolvedSchema.physical(String[] columnNames,
DataType[] columnDataTypes)
Shortcut for a resolved schema of only physical columns.
|
static Column.PhysicalColumn |
Column.physical(String name,
DataType dataType)
Creates a regular table column that represents physical data.
|
CallExpression |
ContextResolvedFunction.toCallExpression(List<ResolvedExpression> resolvedArgs,
DataType outputDataType) |
Modifier and Type | Method and Description |
---|---|
static ResolvedSchema |
ResolvedSchema.physical(List<String> columnNames,
List<DataType> columnDataTypes)
Shortcut for a resolved schema of only physical columns.
|
Modifier and Type | Method and Description |
---|---|
static DataType |
HiveTypeUtil.toFlinkType(org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector inspector)
Converts a Hive ObjectInspector to a Flink data type.
|
static DataType |
HiveTypeUtil.toFlinkType(org.apache.hadoop.hive.serde2.typeinfo.TypeInfo hiveType)
Converts a Hive data type to a Flink data type.
|
Modifier and Type | Method and Description |
---|---|
static org.apache.hadoop.hive.serde2.typeinfo.TypeInfo |
HiveTypeUtil.toHiveTypeInfo(DataType dataType,
boolean checkPrecision)
Converts a Flink DataType to a Hive TypeInfo.
|
Modifier and Type | Method and Description |
---|---|
static String[] |
CliUtils.typesToString(DataType[] types) |
Modifier and Type | Method and Description |
---|---|
abstract DataType |
Projection.project(DataType dataType)
Projects a (possibly nested) row data type by returning a new data type that only includes
fields of the given index paths.
|
Modifier and Type | Method and Description |
---|---|
static Projection |
Projection.all(DataType dataType)
Create a
Projection of all the fields in the provided dataType . |
Projection |
Projection.complement(DataType dataType)
Like
Projection.complement(int) , using the dataType fields count. |
static Projection |
Projection.fromFieldNames(DataType dataType,
List<String> projectedFields)
|
abstract DataType |
Projection.project(DataType dataType)
Projects a (possibly nested) row data type by returning a new data type that only includes
fields of the given index paths.
|
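The index-path projection described above can be illustrated with a self-contained sketch in plain Java (nested lists standing in for nested rows; this is not Flink's Projection class): each int[] path descends one nesting level per index.

```java
import java.util.Arrays;
import java.util.List;

public class ProjectionSketch {
    // Conceptual illustration of index-path projection: each int[] path
    // descends into the nested structure, here modeled as List<Object>.
    @SuppressWarnings("unchecked")
    static Object project(List<Object> row, int[] path) {
        Object current = row;
        for (int index : path) {
            current = ((List<Object>) current).get(index);
        }
        return current;
    }

    public static void main(String[] args) {
        // Row shape: [id, [street, city], name]
        List<Object> row = Arrays.asList(1, Arrays.asList("Main St", "Berlin"), "alice");
        System.out.println(project(row, new int[] {0}));    // 1
        System.out.println(project(row, new int[] {1, 1})); // Berlin
    }
}
```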
Modifier and Type | Method and Description |
---|---|
default Map<String,DataType> |
DecodingFormat.listReadableMetadata()
Returns the map of metadata keys and their corresponding data types that can be produced by
this format for reading.
|
default Map<String,DataType> |
EncodingFormat.listWritableMetadata()
Returns the map of metadata keys and their corresponding data types that can be consumed by
this format for writing.
|
Modifier and Type | Method and Description |
---|---|
default I |
ProjectableDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType projectedPhysicalDataType) |
I |
DecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType physicalDataType)
Creates a runtime decoder implementation that is configured to produce data of the given data
type.
|
I |
ProjectableDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType physicalDataType,
int[][] projections)
Creates a runtime decoder implementation that is configured to produce data of type
Projection.of(projections).project(physicalDataType) . |
I |
EncodingFormat.createRuntimeEncoder(DynamicTableSink.Context context,
DataType physicalDataType)
Creates a runtime encoder implementation that is configured to consume data of the given data
type.
|
TableStats |
FileBasedStatisticsReportableInputFormat.reportStatistics(List<Path> files,
DataType producedDataType)
Returns the estimated statistics of this input format.
|
Modifier and Type | Method and Description |
---|---|
DynamicTableSink.DataStructureConverter |
DynamicTableSink.Context.createDataStructureConverter(DataType consumedDataType)
Creates a converter for mapping between Flink's internal data structures and objects
specified by the given
DataType that can be passed into a runtime implementation. |
<T> TypeInformation<T> |
DynamicTableSink.Context.createTypeInformation(DataType consumedDataType)
Creates type information describing the internal data structures of the given
DataType . |
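A data structure converter as described above maps between external objects and Flink's internal data structures. A minimal conceptual sketch (hypothetical Converter interface, not the Flink API), round-tripping a String through an internal byte[] representation:

```java
import java.nio.charset.StandardCharsets;

public class ConverterSketch {
    // Hypothetical stand-in for a DataStructureConverter: maps between an
    // external representation (String) and an internal one (UTF-8 bytes).
    interface Converter<E, I> {
        I toInternal(E external);
        E toExternal(I internal);
    }

    static final Converter<String, byte[]> UTF8 = new Converter<>() {
        public byte[] toInternal(String external) {
            return external.getBytes(StandardCharsets.UTF_8);
        }
        public String toExternal(byte[] internal) {
            return new String(internal, StandardCharsets.UTF_8);
        }
    };

    public static void main(String[] args) {
        byte[] internal = UTF8.toInternal("hello");
        System.out.println(UTF8.toExternal(internal)); // hello
    }
}
```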
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> |
SupportsWritingMetadata.listWritableMetadata()
Returns the map of metadata keys and their corresponding data types that can be consumed by
this table sink for writing.
|
Modifier and Type | Method and Description |
---|---|
void |
SupportsWritingMetadata.applyWritableMetadata(List<String> metadataKeys,
DataType consumedDataType)
Provides a list of metadata keys that the consumed
RowData will contain as appended
metadata columns which must be persisted. |
Modifier and Type | Method and Description |
---|---|
DynamicTableSource.DataStructureConverter |
DynamicTableSource.Context.createDataStructureConverter(DataType producedDataType)
Creates a converter for mapping between objects specified by the given
DataType
and Flink's internal data structures that can be passed into a runtime implementation. |
<T> TypeInformation<T> |
DynamicTableSource.Context.createTypeInformation(DataType producedDataType)
Creates type information describing the internal data structures of the given
DataType . |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> |
SupportsReadingMetadata.listReadableMetadata()
Returns the map of metadata keys and their corresponding data types that can be produced by
this table source for reading.
|
Modifier and Type | Method and Description |
---|---|
boolean |
SupportsAggregatePushDown.applyAggregates(List<int[]> groupingSets,
List<AggregateExpression> aggregateExpressions,
DataType producedDataType)
Provides a list of aggregate expressions and the grouping keys.
|
default void |
SupportsProjectionPushDown.applyProjection(int[][] projectedFields,
DataType producedDataType)
Provides the field index paths that should be used for a projection.
|
void |
SupportsReadingMetadata.applyReadableMetadata(List<String> metadataKeys,
DataType producedDataType)
Provides a list of metadata keys that the produced
RowData must contain as appended
metadata columns. |
Modifier and Type | Method and Description |
---|---|
static ArrayObjectArrayConverter<?> |
ArrayObjectArrayConverter.create(DataType dataType) |
static RawObjectConverter<?> |
RawObjectConverter.create(DataType dataType) |
static YearMonthIntervalPeriodConverter |
YearMonthIntervalPeriodConverter.create(DataType dataType) |
static RawByteArrayConverter<?> |
RawByteArrayConverter.create(DataType dataType) |
static RowRowConverter |
RowRowConverter.create(DataType dataType) |
static StructuredObjectConverter<?> |
StructuredObjectConverter.create(DataType dataType) |
static ArrayListConverter<?> |
ArrayListConverter.create(DataType dataType) |
static <E> ArrayObjectArrayConverter<E> |
ArrayObjectArrayConverter.createForElement(DataType elementDataType) |
static MapMapConverter<?,?> |
MapMapConverter.createForMapType(DataType dataType) |
static MapMapConverter<?,?> |
MapMapConverter.createForMultisetType(DataType dataType) |
static DataStructureConverter<Object,Object> |
DataStructureConverters.getConverter(DataType dataType)
Returns a converter for the given
DataType . |
Modifier and Type | Method and Description |
---|---|
static DataFormatConverters.DataFormatConverter |
DataFormatConverters.getConverterForDataType(DataType originDataType)
|
Constructor and Description |
---|
AbstractRowDataConverter(DataType[] fieldTypes) |
CaseClassConverter(TupleTypeInfoBase t,
DataType[] fieldTypes) |
MapConverter(DataType keyTypeInfo,
DataType valueTypeInfo) |
ObjectArrayConverter(DataType elementType) |
PojoConverter(PojoTypeInfo<T> t,
DataType[] fieldTypes) |
RowConverter(DataType[] fieldTypes) |
TupleConverter(Class<Tuple> clazz,
DataType[] fieldTypes) |
Modifier and Type | Method and Description |
---|---|
DataType |
DescriptorProperties.getDataType(String key)
Deprecated.
Returns the DataType under the given existing key.
|
Modifier and Type | Method and Description |
---|---|
Optional<DataType> |
DescriptorProperties.getOptionalDataType(String key)
Deprecated.
Returns the DataType under the given key if it exists.
|
Modifier and Type | Method and Description |
---|---|
Schema |
Schema.field(String fieldName,
DataType fieldType)
Deprecated.
Adds a field with the field name and the data type.
|
Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> |
ChangelogCsvFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType producedDataType) |
Constructor and Description |
---|
SocketDynamicTableSource(String hostname,
int port,
byte byteDelimiter,
DecodingFormat<DeserializationSchema<RowData>> decodingFormat,
DataType producedDataType) |
Modifier and Type | Method and Description |
---|---|
DataType |
TableReferenceExpression.getOutputDataType() |
DataType |
LocalReferenceExpression.getOutputDataType() |
DataType |
TypeLiteralExpression.getOutputDataType() |
DataType |
ValueLiteralExpression.getOutputDataType() |
DataType |
FieldReferenceExpression.getOutputDataType() |
DataType |
ResolvedExpression.getOutputDataType()
Returns the data type of the computation result.
|
DataType |
CallExpression.getOutputDataType() |
DataType |
AggregateExpression.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
static CallExpression |
CallExpression.anonymous(FunctionDefinition functionDefinition,
List<ResolvedExpression> args,
DataType dataType)
Creates a
CallExpression to an anonymous function that has been declared inline
without a FunctionIdentifier . |
static LocalReferenceExpression |
ApiExpressionUtils.localRef(String name,
DataType dataType) |
static CallExpression |
CallExpression.permanent(BuiltInFunctionDefinition builtInFunctionDefinition,
List<ResolvedExpression> args,
DataType dataType)
Creates a
CallExpression to a resolved built-in function. |
static CallExpression |
CallExpression.permanent(FunctionIdentifier functionIdentifier,
FunctionDefinition functionDefinition,
List<ResolvedExpression> args,
DataType dataType)
|
CallExpression |
CallExpression.replaceArgs(List<ResolvedExpression> args,
DataType dataType) |
CallExpression |
UnresolvedCallExpression.resolve(List<ResolvedExpression> args,
DataType dataType) |
static CallExpression |
CallExpression.temporary(FunctionIdentifier functionIdentifier,
FunctionDefinition functionDefinition,
List<ResolvedExpression> args,
DataType dataType)
Creates a
CallExpression to a temporary function (potentially shadowing a Catalog function or providing a system function). |
static TypeLiteralExpression |
ApiExpressionUtils.typeLiteral(DataType dataType) |
static ValueLiteralExpression |
ApiExpressionUtils.valueLiteral(Object value,
DataType dataType) |
Constructor and Description |
---|
AggregateExpression(FunctionDefinition functionDefinition,
List<FieldReferenceExpression> args,
CallExpression filterExpression,
DataType resultType,
boolean distinct,
boolean approximate,
boolean ignoreNulls) |
CallExpression(boolean isTemporary,
FunctionIdentifier functionIdentifier,
FunctionDefinition functionDefinition,
List<ResolvedExpression> args,
DataType dataType) |
CallExpression(FunctionDefinition functionDefinition,
List<ResolvedExpression> args,
DataType dataType)
Deprecated.
|
CallExpression(FunctionIdentifier functionIdentifier,
FunctionDefinition functionDefinition,
List<ResolvedExpression> args,
DataType dataType)
|
FieldReferenceExpression(String name,
DataType dataType,
int inputIndex,
int fieldIndex) |
TypeLiteralExpression(DataType dataType) |
ValueLiteralExpression(Object value,
DataType dataType) |
Modifier and Type | Method and Description |
---|---|
CallExpression |
ExpressionResolver.PostResolverFactory.array(DataType dataType,
ResolvedExpression... expression) |
CallExpression |
ExpressionResolver.PostResolverFactory.cast(ResolvedExpression expression,
DataType dataType) |
CallExpression |
ExpressionResolver.PostResolverFactory.get(ResolvedExpression composite,
ValueLiteralExpression key,
DataType dataType) |
CallExpression |
ExpressionResolver.PostResolverFactory.map(DataType dataType,
ResolvedExpression... expression) |
CallExpression |
ExpressionResolver.PostResolverFactory.row(DataType dataType,
ResolvedExpression... expression) |
ExpressionResolver.ExpressionResolverBuilder |
ExpressionResolver.ExpressionResolverBuilder.withOutputDataType(DataType outputDataType) |
Modifier and Type | Method and Description |
---|---|
Optional<DataType> |
ResolverRule.ResolutionContext.getOutputDataType()
Access to the expected top-level output data type.
|
Modifier and Type | Method and Description |
---|---|
default DataType |
DynamicTableFactory.Context.getPhysicalRowDataType()
Returns the physical schema to use for encoding and decoding records.
|
Modifier and Type | Method and Description |
---|---|
SpecializedFunction.ExpressionEvaluator |
SpecializedFunction.ExpressionEvaluatorFactory.createEvaluator(BuiltInFunctionDefinition function,
DataType outputDataType,
DataType... args)
Creates a serializable factory that can be passed into a
UserDefinedFunction for
evaluating a BuiltInFunctionDefinition during runtime. |
SpecializedFunction.ExpressionEvaluator |
SpecializedFunction.ExpressionEvaluatorFactory.createEvaluator(Expression expression,
DataType outputDataType,
DataTypes.Field... args)
Creates a serializable factory that can be passed into a
UserDefinedFunction for
evaluating an Expression during runtime. |
SpecializedFunction.ExpressionEvaluator |
SpecializedFunction.ExpressionEvaluatorFactory.createEvaluator(String sqlExpression,
DataType outputDataType,
DataTypes.Field... args)
Shorthand for
createEvaluator(callSql("..."), ...) . |
BuiltInFunctionDefinition.Builder |
BuiltInFunctionDefinition.Builder.typedArguments(DataType... argumentTypes) |
Modifier and Type | Method and Description |
---|---|
DataType |
HiveFunctionArguments.getDataType(int pos) |
DataType |
HiveGenericUDAF.inferReturnType() |
DataType |
HiveGenericUDF.inferReturnType() |
DataType |
HiveGenericUDTF.inferReturnType() |
DataType |
HiveFunction.inferReturnType()
Infers the return type of the function.
|
DataType |
HiveSimpleUDF.inferReturnType() |
Modifier and Type | Method and Description |
---|---|
Optional<List<DataType>> |
HiveFunction.HiveFunctionInputStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<DataType> |
HiveFunction.HiveFunctionOutputStrategy.inferType(CallContext callContext) |
Modifier and Type | Method and Description |
---|---|
static org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector |
HiveInspectors.getObjectInspector(DataType flinkType)
Gets the Hive
ObjectInspector for a Flink DataType . |
static org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector[] |
HiveInspectors.toInspectors(HiveShim hiveShim,
Object[] args,
DataType[] argTypes)
Gets an array of ObjectInspectors from the given array of args and their types.
|
Constructor and Description |
---|
PythonAggregateFunction(String name,
byte[] serializedAggregateFunction,
DataType[] inputTypes,
DataType resultType,
DataType accumulatorType,
PythonFunctionKind pythonFunctionKind,
boolean deterministic,
boolean takesRowAsInput,
PythonEnv pythonEnv) |
PythonScalarFunction(String name,
byte[] serializedScalarFunction,
DataType[] inputTypes,
DataType resultType,
PythonFunctionKind pythonFunctionKind,
boolean deterministic,
boolean takesRowAsInput,
PythonEnv pythonEnv) |
PythonTableAggregateFunction(String name,
byte[] serializedTableAggregateFunction,
DataType[] inputTypes,
DataType resultType,
DataType accumulatorType,
PythonFunctionKind pythonFunctionKind,
boolean deterministic,
boolean takesRowAsInput,
PythonEnv pythonEnv) |
PythonTableFunction(String name,
byte[] serializedScalarFunction,
DataType[] inputTypes,
DataType resultType,
PythonFunctionKind pythonFunctionKind,
boolean deterministic,
boolean takesRowAsInput,
PythonEnv pythonEnv) |
Modifier and Type | Method and Description |
---|---|
DataType |
CollectModifyOperation.getConsumedDataType() |
DataType |
ExternalQueryOperation.getPhysicalDataType() |
DataType |
ExternalModifyOperation.getPhysicalDataType() |
DataType |
OutputConversionModifyOperation.getType() |
Modifier and Type | Method and Description |
---|---|
void |
CollectModifyOperation.setConsumedDataType(DataType consumedDataType) |
Constructor and Description |
---|
ExternalModifyOperation(ContextResolvedTable contextResolvedTable,
QueryOperation child,
ChangelogMode changelogMode,
DataType physicalDataType) |
ExternalQueryOperation(ContextResolvedTable contextResolvedTable,
DataStream<E> dataStream,
DataType physicalDataType,
boolean isTopLevelRecord,
ChangelogMode changelogMode) |
OutputConversionModifyOperation(QueryOperation child,
DataType type,
OutputConversionModifyOperation.UpdateMode updateMode) |
Modifier and Type | Method and Description |
---|---|
QueryOperation |
OperationTreeBuilder.values(DataType rowType,
Expression... expressions) |
Modifier and Type | Method and Description |
---|---|
static org.apache.calcite.rel.RelNode |
DynamicSourceUtils.convertDataStreamToRel(boolean isBatchMode,
ReadableConfig config,
FlinkRelBuilder relBuilder,
ContextResolvedTable contextResolvedTable,
DataStream<?> dataStream,
DataType physicalDataType,
boolean isTopLevelRecord,
ChangelogMode changelogMode)
Converts a given
DataStream to a RelNode . |
Modifier and Type | Method and Description |
---|---|
static DataType |
HiveParserUtils.toDataType(org.apache.calcite.rel.type.RelDataType relDataType) |
Modifier and Type | Method and Description |
---|---|
DataType |
RexNodeExpression.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
static ValueLiteralExpression |
ExpressionBuilder.literal(Object value,
DataType type) |
static ValueLiteralExpression |
ExpressionBuilder.nullOf(DataType type) |
static TypeLiteralExpression |
ExpressionBuilder.typeLiteral(DataType type) |
Constructor and Description |
---|
RexNodeExpression(org.apache.calcite.rex.RexNode rexNode,
DataType outputDataType,
String summaryString,
String serializableString) |
Modifier and Type | Method and Description |
---|---|
DataType[] |
DenseRankAggFunction.getAggBufferTypes() |
DataType[] |
NTILEAggFunction.getAggBufferTypes() |
DataType[] |
CountAggFunction.getAggBufferTypes() |
DataType[] |
CumeDistAggFunction.getAggBufferTypes() |
DataType[] |
Count1AggFunction.getAggBufferTypes() |
DataType[] |
RowNumberAggFunction.getAggBufferTypes() |
DataType[] |
LeadLagAggFunction.getAggBufferTypes() |
DataType[] |
SumAggFunction.getAggBufferTypes() |
abstract DataType[] |
DeclarativeAggregateFunction.getAggBufferTypes()
All types of the aggregate buffer.
|
DataType[] |
MaxAggFunction.getAggBufferTypes() |
DataType[] |
Sum0AggFunction.getAggBufferTypes() |
DataType[] |
ListAggFunction.getAggBufferTypes() |
DataType[] |
MinAggFunction.getAggBufferTypes() |
DataType[] |
SingleValueAggFunction.getAggBufferTypes() |
DataType[] |
AvgAggFunction.getAggBufferTypes() |
DataType[] |
RankAggFunction.getAggBufferTypes() |
DataType[] |
SumWithRetractAggFunction.getAggBufferTypes() |
DataType |
NTILEAggFunction.getResultType() |
DataType |
CountAggFunction.getResultType() |
DataType |
CumeDistAggFunction.getResultType() |
DataType |
Count1AggFunction.getResultType() |
DataType |
RowNumberAggFunction.getResultType() |
DataType |
LeadLagAggFunction.IntLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.ByteLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.ShortLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.LongLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.FloatLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.DoubleLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.BooleanLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.DecimalLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.StringLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.CharLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.DateLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.TimeLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.TimestampLeadLagAggFunction.getResultType() |
DataType |
SumAggFunction.IntSumAggFunction.getResultType() |
DataType |
SumAggFunction.ByteSumAggFunction.getResultType() |
DataType |
SumAggFunction.ShortSumAggFunction.getResultType() |
DataType |
SumAggFunction.LongSumAggFunction.getResultType() |
DataType |
SumAggFunction.FloatSumAggFunction.getResultType() |
DataType |
SumAggFunction.DoubleSumAggFunction.getResultType() |
DataType |
SumAggFunction.DecimalSumAggFunction.getResultType() |
abstract DataType |
DeclarativeAggregateFunction.getResultType()
The result type of the function.
|
DataType |
RankLikeAggFunctionBase.getResultType() |
DataType |
MaxAggFunction.IntMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.ByteMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.ShortMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.LongMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.FloatMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.DoubleMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.DecimalMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.BooleanMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.StringMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.DateMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.TimeMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.TimestampMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.TimestampLtzMaxAggFunction.getResultType() |
DataType |
Sum0AggFunction.IntSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.ByteSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.ShortSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.LongSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.FloatSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.DoubleSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.DecimalSum0AggFunction.getResultType() |
DataType |
ListAggFunction.getResultType() |
DataType |
MinAggFunction.IntMinAggFunction.getResultType() |
DataType |
MinAggFunction.ByteMinAggFunction.getResultType() |
DataType |
MinAggFunction.ShortMinAggFunction.getResultType() |
DataType |
MinAggFunction.LongMinAggFunction.getResultType() |
DataType |
MinAggFunction.FloatMinAggFunction.getResultType() |
DataType |
MinAggFunction.DoubleMinAggFunction.getResultType() |
DataType |
MinAggFunction.DecimalMinAggFunction.getResultType() |
DataType |
MinAggFunction.BooleanMinAggFunction.getResultType() |
DataType |
MinAggFunction.StringMinAggFunction.getResultType() |
DataType |
MinAggFunction.DateMinAggFunction.getResultType() |
DataType |
MinAggFunction.TimeMinAggFunction.getResultType() |
DataType |
MinAggFunction.TimestampMinAggFunction.getResultType() |
DataType |
MinAggFunction.TimestampLtzMinAggFunction.getResultType() |
DataType |
SingleValueAggFunction.ByteSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.ShortSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.IntSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.LongSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.FloatSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.DoubleSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.BooleanSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.DecimalSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.CharSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.StringSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.DateSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.TimeSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.TimestampSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.TimestampLtzSingleValueAggFunction.getResultType() |
DataType |
AvgAggFunction.ByteAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.ShortAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.IntAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.LongAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.FloatAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.DoubleAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.DecimalAvgAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.IntSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.ByteSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.ShortSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.LongSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.FloatSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.DoubleSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.DecimalSumWithRetractAggFunction.getResultType() |
abstract DataType |
AvgAggFunction.getSumType() |
DataType |
AvgAggFunction.ByteAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.ShortAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.IntAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.LongAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.FloatAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.DoubleAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.DecimalAvgAggFunction.getSumType() |
Constructor and Description |
---|
RowDataToStringConverterImpl(DataType dataType,
java.time.ZoneId zoneId,
ClassLoader classLoader,
boolean legacyBehaviour) |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
LookupCallContext.getArgumentDataTypes() |
List<DataType> |
OperatorBindingCallContext.getArgumentDataTypes() |
List<DataType> |
CallBindingCallContext.getArgumentDataTypes() |
Optional<DataType> |
LookupCallContext.getOutputDataType() |
Optional<DataType> |
OperatorBindingCallContext.getOutputDataType() |
Optional<DataType> |
CallBindingCallContext.getOutputDataType() |
Constructor and Description |
---|
BatchExecBoundedStreamScan(ReadableConfig tableConfig,
DataStream<?> dataStream,
DataType sourceType,
int[] fieldIndexes,
List<String> qualifiedName,
RowType outputType,
String description) |
Modifier and Type | Method and Description |
---|---|
protected LogicalType[] |
StreamExecWindowAggregateBase.convertToLogicalTypes(DataType[] dataTypes) |
Constructor and Description |
---|
StreamExecDataStreamScan(ReadableConfig tableConfig,
DataStream<?> dataStream,
DataType sourceType,
int[] fieldIndexes,
String[] fieldNames,
List<String> qualifiedName,
RowType outputType,
String description) |
Modifier and Type | Method and Description |
---|---|
static DataViewSpec[] |
CommonPythonUtil.extractDataViewSpecs(int index,
DataType accType) |
Modifier and Type | Method and Description |
---|---|
static DataType |
DataViewUtils.adjustDataViews(DataType accumulatorDataType,
boolean hasStateBackedDataViews)
Adjusts the data type of an accumulator to account for data views.
|
static DataType |
DataViewUtils.createDistinctViewDataType(DataType keyDataType,
int filterArgs,
int filterArgsLimit)
Creates a special
DataType for DISTINCT aggregates. |
Modifier and Type | Method and Description |
---|---|
static DataType |
DataViewUtils.adjustDataViews(DataType accumulatorDataType,
boolean hasStateBackedDataViews)
Adjusts the data type of an accumulator to account for data views.
|
static DataType |
DataViewUtils.createDistinctViewDataType(DataType keyDataType,
int filterArgs,
int filterArgsLimit)
Creates a special
DataType for DISTINCT aggregates. |
static DataViewUtils.DistinctViewSpec |
DataViewUtils.createDistinctViewSpec(int index,
DataType distinctViewDataType)
Creates a special
DataViewUtils.DistinctViewSpec for DISTINCT aggregates. |
static List<DataViewSpec> |
DataViewUtils.extractDataViews(int aggIndex,
DataType accumulatorDataType)
Searches for data views in the data type of an accumulator and extracts them.
|
Constructor and Description |
---|
DistinctViewSpec(String stateId,
DataType distinctViewDataType) |
Modifier and Type | Method and Description |
---|---|
static ArrowTableSource |
ArrowUtils.createArrowTableSource(DataType dataType,
String fileName) |
Modifier and Type | Method and Description |
---|---|
DataType |
ArrowTableSource.getProducedDataType() |
Constructor and Description |
---|
ArrowTableSource(DataType dataType,
byte[][] arrowData) |
Modifier and Type | Method and Description |
---|---|
DynamicTableSink.DataStructureConverter |
SinkRuntimeProviderContext.createDataStructureConverter(DataType consumedDataType) |
TypeInformation<?> |
SinkRuntimeProviderContext.createTypeInformation(DataType consumedDataType) |
Modifier and Type | Method and Description |
---|---|
DynamicTableSource.DataStructureConverter |
ScanRuntimeProviderContext.createDataStructureConverter(DataType producedDataType) |
DynamicTableSource.DataStructureConverter |
LookupRuntimeProviderContext.createDataStructureConverter(DataType producedDataType) |
TypeInformation<?> |
ScanRuntimeProviderContext.createTypeInformation(DataType producedDataType) |
TypeInformation<?> |
LookupRuntimeProviderContext.createTypeInformation(DataType producedDataType) |
Modifier and Type | Method and Description |
---|---|
DataType |
DataViewSpec.getDataType() |
DataType |
ListViewSpec.getElementDataType() |
DataType |
MapViewSpec.getKeyDataType() |
DataType |
MapViewSpec.getValueDataType() |
Constructor and Description |
---|
ListViewSpec(String stateId,
int fieldIndex,
DataType dataType) |
ListViewSpec(String stateId,
int fieldIndex,
DataType dataType,
TypeSerializer<?> elementSerializer)
Deprecated.
|
MapViewSpec(String stateId,
int fieldIndex,
DataType dataType,
boolean containsNullKey) |
MapViewSpec(String stateId,
int fieldIndex,
DataType dataType,
boolean containsNullKey,
TypeSerializer<?> keySerializer,
TypeSerializer<?> valueSerializer)
Deprecated.
|
Modifier and Type | Method and Description |
---|---|
DataType |
BuiltInAggregateFunction.getAccumulatorDataType() |
DataType |
FirstValueWithRetractAggFunction.getAccumulatorDataType() |
DataType |
LastValueAggFunction.getAccumulatorDataType() |
DataType |
FirstValueAggFunction.getAccumulatorDataType() |
DataType |
ListAggWsWithRetractAggFunction.getAccumulatorDataType() |
DataType |
LagAggFunction.getAccumulatorDataType() |
DataType |
CollectAggFunction.getAccumulatorDataType() |
DataType |
ListAggWithRetractAggFunction.getAccumulatorDataType() |
DataType |
MinWithRetractAggFunction.getAccumulatorDataType() |
DataType |
JsonObjectAggFunction.getAccumulatorDataType() |
DataType |
MaxWithRetractAggFunction.getAccumulatorDataType() |
DataType |
LastValueWithRetractAggFunction.getAccumulatorDataType() |
DataType |
BatchApproxCountDistinctAggFunctions.ApproxCountDistinctAggFunction.getAccumulatorDataType() |
DataType |
JsonArrayAggFunction.getAccumulatorDataType() |
DataType |
BuiltInAggregateFunction.getOutputDataType() |
DataType |
FirstValueWithRetractAggFunction.getOutputDataType() |
DataType |
LastValueAggFunction.getOutputDataType() |
DataType |
FirstValueAggFunction.getOutputDataType() |
DataType |
ListAggWsWithRetractAggFunction.getOutputDataType() |
DataType |
LagAggFunction.getOutputDataType() |
DataType |
CollectAggFunction.getOutputDataType() |
DataType |
ListAggWithRetractAggFunction.getOutputDataType() |
DataType |
MinWithRetractAggFunction.getOutputDataType() |
DataType |
JsonObjectAggFunction.getOutputDataType() |
DataType |
MaxWithRetractAggFunction.getOutputDataType() |
DataType |
LastValueWithRetractAggFunction.getOutputDataType() |
DataType |
BatchApproxCountDistinctAggFunctions.ApproxCountDistinctAggFunction.getOutputDataType() |
DataType |
JsonArrayAggFunction.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
BuiltInAggregateFunction.getArgumentDataTypes() |
List<DataType> |
FirstValueWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
LastValueAggFunction.getArgumentDataTypes() |
List<DataType> |
FirstValueAggFunction.getArgumentDataTypes() |
List<DataType> |
ListAggWsWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
LagAggFunction.getArgumentDataTypes() |
List<DataType> |
CollectAggFunction.getArgumentDataTypes() |
List<DataType> |
ListAggWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
MinWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
JsonObjectAggFunction.getArgumentDataTypes() |
List<DataType> |
MaxWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
LastValueWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
BatchApproxCountDistinctAggFunctions.ApproxCountDistinctAggFunction.getArgumentDataTypes() |
List<DataType> |
JsonArrayAggFunction.getArgumentDataTypes() |
Modifier and Type | Method and Description |
---|---|
DataType |
BuiltInScalarFunction.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
BuiltInScalarFunction.getArgumentDataTypes() |
Modifier and Type | Method and Description |
---|---|
DataType |
BuiltInTableFunction.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
BuiltInTableFunction.getArgumentDataTypes() |
Constructor and Description |
---|
PunctuatedWatermarkAssignerWrapper(PunctuatedWatermarkAssigner assigner,
int timeFieldIdx,
DataType sourceType) |
Modifier and Type | Method and Description |
---|---|
static DataType |
ClassDataTypeConverter.fromClassToDataType(Class<?> clazz) |
static DataType |
LogicalTypeDataTypeConverter.fromLogicalTypeToDataType(LogicalType logicalType)
Deprecated.
|
DataType |
DataTypePrecisionFixer.visit(AtomicDataType dataType) |
DataType |
DataTypePrecisionFixer.visit(CollectionDataType collectionDataType) |
DataType |
DataTypePrecisionFixer.visit(FieldsDataType fieldsDataType) |
DataType |
DataTypePrecisionFixer.visit(KeyValueDataType keyValueDataType) |
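Each `visit` overload above is one arm of the visitor dispatch that rewriters such as `DataTypePrecisionFixer` rely on: every concrete `DataType` subclass routes to its own overload. A minimal, hypothetical sketch of that dispatch shape (simplified stand-in types, not Flink's real classes):

```java
// Hypothetical sketch of visitor dispatch in the style of a DataTypeVisitor:
// each concrete type calls the visit overload matching its own class.
public class VisitorSketch {
    public interface TypeVisitor<R> {
        R visitAtomic(String name);
        R visitCollection(String elementName);
    }

    public interface SketchType {
        <R> R accept(TypeVisitor<R> visitor);
    }

    public record Atomic(String name) implements SketchType {
        public <R> R accept(TypeVisitor<R> v) { return v.visitAtomic(name); }
    }

    public record Collection(String elementName) implements SketchType {
        public <R> R accept(TypeVisitor<R> v) { return v.visitCollection(elementName); }
    }

    // A trivial visitor that renders the type; a precision fixer would instead
    // rebuild each node with adjusted precision in the same shape.
    public static final TypeVisitor<String> RENDERER = new TypeVisitor<>() {
        public String visitAtomic(String name) { return name; }
        public String visitCollection(String e) { return "ARRAY<" + e + ">"; }
    };

    public static String render(SketchType t) { return t.accept(RENDERER); }

    public static void main(String[] args) {
        System.out.println(render(new Atomic("INT")));        // INT
        System.out.println(render(new Collection("STRING"))); // ARRAY<STRING>
    }
}
```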
Modifier and Type | Method and Description |
---|---|
static LogicalType |
LogicalTypeDataTypeConverter.fromDataTypeToLogicalType(DataType dataType)
Deprecated.
|
static TypeInformation<?> |
TypeInfoDataTypeConverter.fromDataTypeToTypeInfo(DataType dataType)
Deprecated.
|
Modifier and Type | Method and Description |
---|---|
DataType |
InternalTypeInfo.getDataType() |
DataType |
ExternalSerializer.getDataType() |
DataType |
ExternalTypeInfo.getDataType() |
DataType |
DecimalDataTypeInfo.getDataType() |
Modifier and Type | Method and Description |
---|---|
static <I,E> ExternalSerializer<I,E> |
ExternalSerializer.of(DataType dataType)
Creates an instance of an
ExternalSerializer defined by the given DataType . |
static <T> ExternalTypeInfo<T> |
ExternalTypeInfo.of(DataType dataType)
Creates type information for a
DataType that is possibly represented by internal data
structures but serialized and deserialized into external data structures. |
static <I,E> ExternalSerializer<I,E> |
ExternalSerializer.of(DataType dataType,
boolean isInternalInput)
Creates an instance of an
ExternalSerializer defined by the given DataType . |
static <T> ExternalTypeInfo<T> |
ExternalTypeInfo.of(DataType dataType,
boolean isInternalInput)
Creates type information for a
DataType that is possibly represented by internal data
structures but serialized and deserialized into external data structures. |
Modifier and Type | Method and Description |
---|---|
DataType |
CsvTableSink.getConsumedDataType()
Deprecated.
|
default DataType |
TableSink.getConsumedDataType()
Deprecated.
Returns the data type consumed by this
TableSink . |
Constructor and Description |
---|
CsvTableSink(String path,
String fieldDelim,
int numFiles,
FileSystem.WriteMode writeMode,
String[] fieldNames,
DataType[] fieldTypes)
Deprecated.
A simple
TableSink to emit data as CSV files. |
Modifier and Type | Method and Description |
---|---|
DataType |
CsvTableSource.getProducedDataType()
Deprecated.
|
default DataType |
TableSource.getProducedDataType()
Deprecated.
Returns the
DataType for the produced data of the TableSource . |
Modifier and Type | Method and Description |
---|---|
CsvTableSource.Builder |
CsvTableSource.Builder.field(String fieldName,
DataType fieldType)
Adds a field with the field name and the data type.
|
Modifier and Type | Method and Description |
---|---|
static ResolvedFieldReference[] |
TimestampExtractorUtils.getAccessedFields(TimestampExtractor timestampExtractor,
DataType physicalInputType,
java.util.function.Function<String,String> nameRemapping)
Retrieves all field accesses needed for the given
TimestampExtractor . |
Modifier and Type | Method and Description |
---|---|
DataType |
Column.getDataType() |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
TpcdsSchema.getFieldTypes() |
Constructor and Description |
---|
Column(String name,
DataType dataType) |
Modifier and Type | Class and Description |
---|---|
class |
AtomicDataType
A data type that does not contain further data types (e.g. INT or BOOLEAN).
|
class |
CollectionDataType
A data type that contains an element type (e.g. ARRAY or MULTISET).
|
class |
FieldsDataType
A data type that contains field data types (i.e. ROW or structured types).
|
class |
KeyValueDataType
A data type that contains a key and value data type (e.g. MAP).
|
Modifier and Type | Method and Description |
---|---|
DataType |
FieldsDataType.bridgedTo(Class<?> newConversionClass) |
DataType |
KeyValueDataType.bridgedTo(Class<?> newConversionClass) |
DataType |
CollectionDataType.bridgedTo(Class<?> newConversionClass) |
DataType |
AtomicDataType.bridgedTo(Class<?> newConversionClass) |
DataType |
DataTypeQueryable.getDataType() |
DataType |
CollectionDataType.getElementDataType() |
DataType |
KeyValueDataType.getKeyDataType() |
DataType |
KeyValueDataType.getValueDataType() |
DataType |
FieldsDataType.notNull() |
DataType |
KeyValueDataType.notNull() |
DataType |
CollectionDataType.notNull() |
DataType |
AtomicDataType.notNull() |
DataType |
FieldsDataType.nullable() |
DataType |
KeyValueDataType.nullable() |
DataType |
CollectionDataType.nullable() |
DataType |
AtomicDataType.nullable() |
DataType |
UnresolvedDataType.toDataType(DataTypeFactory factory)
Converts this instance to a resolved
DataType possibly enriched with additional
nullability and conversion class information. |
DataType |
DataType.toInternal()
Creates a copy of this
DataType instance with the internal data type conversion
classes. |
Modifier and Type | Method and Description |
---|---|
abstract List<DataType> |
DataType.getChildren()
Returns the children of this data type, if any.
|
List<DataType> |
FieldsDataType.getChildren() |
List<DataType> |
KeyValueDataType.getChildren() |
List<DataType> |
CollectionDataType.getChildren() |
List<DataType> |
AtomicDataType.getChildren() |
static List<DataType> |
DataType.getFieldDataTypes(DataType dataType)
Returns the first-level field data types for the provided
DataType . |
Modifier and Type | Method and Description |
---|---|
static int |
DataType.getFieldCount(DataType dataType)
Returns the count of the first-level fields for the provided
DataType . |
static List<DataType> |
DataType.getFieldDataTypes(DataType dataType)
Returns the first-level field data types for the provided
DataType . |
static List<String> |
DataType.getFieldNames(DataType dataType)
Returns the first-level field names for the provided
DataType . |
static List<DataTypes.Field> |
DataType.getFields(DataType dataType)
Returns an ordered list of fields starting from the provided
DataType . |
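The first-level helpers above (`getFieldCount`, `getFieldNames`, `getFieldDataTypes`, `getFields`) inspect only the top level of a composite type; nested rows are not expanded. A rough, hypothetical sketch of that behavior, using a plain-string stand-in for a row type rather than Flink's real `FieldsDataType`:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;

// Hypothetical, simplified stand-in for a composite row type; the real
// FieldsDataType carries LogicalType and conversion-class information.
public class RowSketch {
    private final LinkedHashMap<String, String> fields = new LinkedHashMap<>();

    public RowSketch field(String name, String type) {
        fields.put(name, type);
        return this;
    }

    // Mirrors DataType.getFieldNames: only first-level names are returned.
    public static List<String> getFieldNames(RowSketch row) {
        return new ArrayList<>(row.fields.keySet());
    }

    // Mirrors DataType.getFieldCount: counts first-level fields only.
    public static int getFieldCount(RowSketch row) {
        return row.fields.size();
    }

    public static void main(String[] args) {
        RowSketch row = new RowSketch()
                .field("id", "INT")
                .field("payload", "ROW<a INT, b STRING>"); // nested fields are NOT expanded
        System.out.println(getFieldNames(row)); // [id, payload]
        System.out.println(getFieldCount(row)); // 2
    }
}
```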
Constructor and Description |
---|
CollectionDataType(LogicalType logicalType,
Class<?> conversionClass,
DataType elementDataType) |
CollectionDataType(LogicalType logicalType,
DataType elementDataType) |
KeyValueDataType(LogicalType logicalType,
Class<?> conversionClass,
DataType keyDataType,
DataType valueDataType) |
KeyValueDataType(LogicalType logicalType,
DataType keyDataType,
DataType valueDataType) |
Constructor and Description |
---|
FieldsDataType(LogicalType logicalType,
Class<?> conversionClass,
List<DataType> fieldDataTypes) |
FieldsDataType(LogicalType logicalType,
List<DataType> fieldDataTypes) |
UnresolvedDataType(java.util.function.Supplier<String> description,
java.util.function.Function<DataTypeFactory,DataType> resolutionFactory) |
Modifier and Type | Method and Description |
---|---|
static DataType |
DataTypeExtractor.extractFromGeneric(DataTypeFactory typeFactory,
Class<?> baseClass,
int genericPos,
Type contextType)
Extracts a data type from a type variable at
genericPos of baseClass using
the information of the most specific type contextType . |
static DataType |
DataTypeExtractor.extractFromMethodOutput(DataTypeFactory typeFactory,
Class<?> baseClass,
Method method)
Extracts a data type from a method return type by considering surrounding classes and method
annotation.
|
static DataType |
DataTypeExtractor.extractFromMethodParameter(DataTypeFactory typeFactory,
Class<?> baseClass,
Method method,
int paramPos)
Extracts a data type from a method parameter by considering surrounding classes and parameter
annotation.
|
static DataType |
DataTypeExtractor.extractFromType(DataTypeFactory typeFactory,
org.apache.flink.table.types.extraction.DataTypeTemplate template,
Type type)
Extracts a data type from a type without considering surrounding classes but templates.
|
static DataType |
DataTypeExtractor.extractFromType(DataTypeFactory typeFactory,
Type type)
Extracts a data type from a type without considering surrounding classes or templates.
|
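`extractFromGeneric` resolves the type variable at `genericPos` of `baseClass` using a concrete `contextType`. The underlying JDK reflection step can be sketched as follows; this hypothetical version only handles the simple case where the context class directly implements the parameterized base interface, whereas the real extractor walks the full type hierarchy:

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.function.Function;

// Rough sketch of resolving a type variable at a given position from a
// concrete context type, in the spirit of DataTypeExtractor.extractFromGeneric.
public class GenericExtractSketch {
    public static Type resolve(Class<?> baseClass, int genericPos, Class<?> contextClass) {
        for (Type iface : contextClass.getGenericInterfaces()) {
            if (iface instanceof ParameterizedType) {
                ParameterizedType p = (ParameterizedType) iface;
                if (p.getRawType() == baseClass) {
                    // The actual type argument bound at the requested position.
                    return p.getActualTypeArguments()[genericPos];
                }
            }
        }
        throw new IllegalArgumentException("cannot resolve type variable");
    }

    // Example context type: binds Function's variables to Integer and String.
    public interface MyFunction extends Function<Integer, String> {}

    public static void main(String[] args) {
        // Position 1 of Function<T, R> resolves to String for MyFunction.
        System.out.println(resolve(Function.class, 1, MyFunction.class));
    }
}
```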
Modifier and Type | Method and Description |
---|---|
DataType |
TypeInferenceUtil.Result.getOutputDataType() |
static DataType |
TypeInferenceUtil.inferOutputType(CallContext callContext,
TypeStrategy outputTypeStrategy)
Infers an output type using the given
TypeStrategy . |
DataType |
TypeTransformation.transform(DataType typeToTransform)
Transforms the given data type to a different data type.
|
default DataType |
TypeTransformation.transform(DataTypeFactory factory,
DataType typeToTransform)
Transforms the given data type to a different data type.
|
Modifier and Type | Method and Description |
---|---|
Optional<DataType> |
TypeInferenceUtil.Result.getAccumulatorDataType() |
List<DataType> |
CallContext.getArgumentDataTypes()
Returns a resolved list of the call's argument types.
|
List<DataType> |
TypeInferenceUtil.Result.getExpectedArgumentTypes() |
Optional<DataType> |
CallContext.getOutputDataType()
Returns the inferred output data type of the function call.
|
Optional<List<DataType>> |
TypeInference.getTypedArguments() |
Optional<DataType> |
ArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure)
Main logic for inferring and validating an argument.
|
Optional<List<DataType>> |
InputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure)
Main logic for inferring and validating the input arguments.
|
Optional<DataType> |
TypeInferenceUtil.SurroundingInfo.inferOutputType(DataTypeFactory typeFactory) |
Optional<DataType> |
TypeStrategy.inferType(CallContext callContext)
Infers a type from the given function call.
|
Modifier and Type | Method and Description |
---|---|
static CallContext |
TypeInferenceUtil.adaptArguments(TypeInference typeInference,
CallContext callContext,
DataType outputType)
Adapts the call's argument if necessary.
|
static TypeStrategy |
TypeStrategies.explicit(DataType dataType)
Type strategy that returns a fixed
DataType . |
static ExplicitArgumentTypeStrategy |
InputTypeStrategies.explicit(DataType expectedDataType)
Strategy for an argument that corresponds to an explicitly defined type casting.
|
static InputTypeStrategy |
InputTypeStrategies.explicitSequence(DataType... expectedDataTypes)
Strategy for a function signature of explicitly defined types like
f(STRING, INT) . |
static InputTypeStrategy |
InputTypeStrategies.explicitSequence(String[] argumentNames,
DataType[] expectedDataTypes)
Strategy for a named function signature of explicitly defined types like
f(s STRING, i
INT) . |
static TypeInferenceUtil.SurroundingInfo |
TypeInferenceUtil.SurroundingInfo.of(DataType dataType) |
DataType |
TypeTransformation.transform(DataType typeToTransform)
Transforms the given data type to a different data type.
|
default DataType |
TypeTransformation.transform(DataTypeFactory factory,
DataType typeToTransform)
Transforms the given data type to a different data type.
|
TypeInference.Builder |
TypeInference.Builder.typedArguments(DataType... argumentTypes) |
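`InputTypeStrategies.explicitSequence` validates a call's arguments against an explicitly defined signature such as `f(STRING, INT)`, position by position. A hypothetical sketch of that check, modeling types as plain strings instead of real `DataType`s and ignoring implicit casts:

```java
import java.util.List;
import java.util.Optional;

// Hypothetical sketch of what an explicit-sequence input strategy checks:
// arity must match and each actual type must match the expected type at the
// same position; on success the expected types are the inferred input types.
public class ExplicitSequenceSketch {
    public static Optional<List<String>> inferInputTypes(
            List<String> expected, List<String> actual) {
        if (actual.size() != expected.size()) {
            return Optional.empty(); // wrong arity: the signature cannot match
        }
        for (int i = 0; i < expected.size(); i++) {
            if (!expected.get(i).equals(actual.get(i))) {
                return Optional.empty(); // no implicit casting modeled here
            }
        }
        return Optional.of(expected);
    }

    public static void main(String[] args) {
        // f(STRING, INT) accepts (STRING, INT) but rejects (STRING, BOOLEAN).
        System.out.println(inferInputTypes(List.of("STRING", "INT"), List.of("STRING", "INT")));
        System.out.println(inferInputTypes(List.of("STRING", "INT"), List.of("STRING", "BOOLEAN")));
    }
}
```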
Modifier and Type | Method and Description |
---|---|
static TypeStrategy |
TypeStrategies.argument(int pos,
java.util.function.Function<DataType,Optional<DataType>> mapper)
Type strategy that returns the n-th input argument, mapped by the given function.
|
static TypeStrategy |
TypeStrategies.argument(int pos,
java.util.function.Function<DataType,Optional<DataType>> mapper)
Type strategy that returns the n-th input argument, mapped by the given function.
|
static ConstraintArgumentTypeStrategy |
InputTypeStrategies.constraint(String constraintMessage,
java.util.function.Predicate<List<DataType>> evaluator)
Strategy for an argument that must fulfill a given constraint.
|
TypeInference.Builder |
TypeInference.Builder.typedArguments(List<DataType> argumentTypes)
Sets the list of argument types for specifying a fixed, not overloaded, not vararg input
signature explicitly.
|
Constructor and Description |
---|
Result(List<DataType> expectedArgumentTypes,
DataType accumulatorDataType,
DataType outputDataType) |
Constructor and Description |
---|
Result(List<DataType> expectedArgumentTypes,
DataType accumulatorDataType,
DataType outputDataType) |
Modifier and Type | Method and Description |
---|---|
Optional<DataType> |
CommonArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
CompositeArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
ExplicitArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
AndArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
ConstraintArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
AnyArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
SymbolArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
OutputArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
RootArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
OrArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
FamilyArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
LiteralArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
TypeLiteralArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<List<DataType>> |
OrInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
WildcardInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
ComparableTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
SequenceInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
CommonInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
SubsequenceInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
RepeatingSequenceInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
VaryingSequenceInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<DataType> |
VaryingStringTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
ToTimestampLtzTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
MissingTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
ArgumentMappingTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
MatchFamilyTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
NullableIfArgsTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
FirstTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
CommonTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
MappingTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
ForceNullableTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
ExplicitTypeStrategy.inferType(CallContext callContext) |
Constructor and Description |
---|
ExplicitArgumentTypeStrategy(DataType expectedDataType) |
ExplicitTypeStrategy(DataType explicitDataType) |
Constructor and Description |
---|
ArgumentMappingTypeStrategy(int pos,
java.util.function.Function<DataType,Optional<DataType>> mapper) |
ArgumentMappingTypeStrategy(int pos,
java.util.function.Function<DataType,Optional<DataType>> mapper) |
ConstraintArgumentTypeStrategy(String constraintMessage,
java.util.function.Predicate<List<DataType>> evaluator) |
Modifier and Type | Method and Description |
---|---|
DataType |
LegacyRawTypeTransformation.transform(DataType typeToTransform) |
DataType |
DataTypeConversionClassTransformation.transform(DataType dataType) |
DataType |
LegacyToNonLegacyTransformation.transform(DataType typeToTransform) |
DataType |
LegacyToNonLegacyTransformation.transform(DataTypeFactory factory,
DataType dataType) |
Modifier and Type | Method and Description |
---|---|
DataType |
LegacyRawTypeTransformation.transform(DataType typeToTransform) |
DataType |
DataTypeConversionClassTransformation.transform(DataType dataType) |
DataType |
LegacyToNonLegacyTransformation.transform(DataType typeToTransform) |
DataType |
LegacyToNonLegacyTransformation.transform(DataTypeFactory factory,
DataType dataType) |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
UnknownCallContext.getArgumentDataTypes() |
List<DataType> |
AdaptedCallContext.getArgumentDataTypes() |
Optional<DataType> |
UnknownCallContext.getOutputDataType() |
Optional<DataType> |
AdaptedCallContext.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
void |
AdaptedCallContext.setExpectedArguments(List<DataType> expectedArguments) |
Constructor and Description |
---|
AdaptedCallContext(CallContext originalContext,
DataType outputDataType) |
Modifier and Type | Method and Description |
---|---|
static DataType |
DataTypeUtils.appendRowFields(DataType dataType,
List<DataTypes.Field> fields)
Appends the given list of fields to an existing row data type.
|
static DataType |
DataTypeUtils.createProctimeDataType()
Returns a PROCTIME data type.
|
static DataType |
TypeConversions.fromLegacyInfoToDataType(TypeInformation<?> typeInfo)
Deprecated.
Please don't use this method anymore. It will be removed soon and we should not
make the removal more painful. Sources and sinks should use the method available in
context to convert, within the planner you should use either
InternalTypeInfo or
ExternalTypeInfo depending on the use case. |
static DataType[] |
TypeConversions.fromLegacyInfoToDataType(TypeInformation<?>[] typeInfo)
Deprecated.
Please don't use this method anymore. It will be removed soon and we should not
make the removal more painful. Sources and sinks should use the method available in
context to convert, within the planner you should use either
InternalTypeInfo or
ExternalTypeInfo depending on the use case. |
static DataType |
TypeConversions.fromLogicalToDataType(LogicalType logicalType) |
static DataType[] |
TypeConversions.fromLogicalToDataType(LogicalType[] logicalTypes) |
static DataType |
DataTypeUtils.projectRow(DataType dataType,
int[] indexPaths)
Deprecated.
Use the
Projection type |
static DataType |
DataTypeUtils.projectRow(DataType dataType,
int[][] indexPaths)
Deprecated.
Use the
Projection type |
static DataType |
DataTypeUtils.removeTimeAttribute(DataType dataType)
Removes time attributes from the
DataType . |
static DataType |
DataTypeUtils.replaceLogicalType(DataType dataType,
LogicalType replacement)
Replaces the
LogicalType of a DataType , i.e., it keeps the bridging class. |
static DataType |
DataTypeUtils.stripRowPrefix(DataType dataType,
String prefix)
Removes a string prefix from the fields of the given row data type.
|
static DataType |
TypeInfoDataTypeConverter.toDataType(DataTypeFactory dataTypeFactory,
TypeInformation<?> typeInfo)
Converts the given
TypeInformation into DataType . |
static DataType |
TypeInfoDataTypeConverter.toDataType(DataTypeFactory dataTypeFactory,
TypeInformation<?> typeInfo,
boolean forceNullability)
Converts the given
TypeInformation into DataType but allows to make all
fields nullable independent of the nullability in the serialization stack. |
static DataType |
LogicalTypeDataTypeConverter.toDataType(LogicalType logicalType)
Returns the data type of a logical type without explicit conversions.
|
static DataType |
LegacyTypeInfoDataTypeConverter.toDataType(TypeInformation<?> typeInfo)
Deprecated.
|
static DataType |
DataTypeUtils.toInternalDataType(DataType dataType)
|
static DataType |
DataTypeUtils.toInternalDataType(LogicalType logicalType)
Creates a
DataType from the given LogicalType with internal data structures. |
static DataType |
DataTypeUtils.transform(DataTypeFactory factory,
DataType typeToTransform,
TypeTransformation... transformations)
Transforms the given data type to a different data type using the given transformations.
|
static DataType |
DataTypeUtils.transform(DataType typeToTransform,
TypeTransformation... transformations)
Transforms the given data type to a different data type using the given transformations.
|
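`DataTypeUtils.transform` applies the given `TypeTransformation`s in sequence, each one operating on the result of the previous. The chaining behavior can be sketched with a hypothetical stand-in that models transformations as `UnaryOperator<String>` over type strings rather than real transformations over `DataType`:

```java
import java.util.function.UnaryOperator;

// Hypothetical sketch of the chaining in
// DataTypeUtils.transform(typeToTransform, transformations...): each
// transformation is applied in declaration order to the previous result.
public class TransformChainSketch {
    @SafeVarargs
    public static String transform(String typeToTransform,
                                   UnaryOperator<String>... transformations) {
        String result = typeToTransform;
        for (UnaryOperator<String> t : transformations) {
            result = t.apply(result); // feed each output into the next step
        }
        return result;
    }

    public static void main(String[] args) {
        UnaryOperator<String> toNullable = t -> t.replace(" NOT NULL", "");
        UnaryOperator<String> tag = t -> t + " (internal)";
        System.out.println(transform("INT NOT NULL", toNullable, tag)); // INT (internal)
    }
}
```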
Modifier and Type | Method and Description |
---|---|
static Optional<DataType> |
ClassDataTypeConverter.extractDataType(Class<?> clazz)
Returns the clearly identifiable data type if possible.
|
static Optional<DataType> |
ValueDataTypeConverter.extractDataType(Object value)
Returns the clearly identifiable data type if possible.
|
static List<DataType> |
DataTypeUtils.flattenToDataTypes(DataType dataType)
Returns the data types of the flat representation in the first level of the given data type.
|
static Optional<DataType> |
TypeConversions.fromClassToDataType(Class<?> clazz) |
static Optional<DataType> |
DataTypeUtils.getField(DataType compositeType,
int index)
Retrieves a nested field from a composite type at the given position.
|
static Optional<DataType> |
DataTypeUtils.getField(DataType compositeType,
String name)
Retrieves a nested field from a composite type with the given name.
|
Modifier and Type | Method and Description |
---|---|
static DataType |
DataTypeUtils.appendRowFields(DataType dataType,
List<DataTypes.Field> fields)
Appends the given list of fields to an existing row data type.
|
protected abstract R |
DataTypeDefaultVisitor.defaultMethod(DataType dataType) |
static ResolvedSchema |
DataTypeUtils.expandCompositeTypeToSchema(DataType dataType)
Expands a composite
DataType to a corresponding ResolvedSchema . |
static List<DataType> |
DataTypeUtils.flattenToDataTypes(DataType dataType)
Returns the data types of the flat representation in the first level of the given data type.
|
static List<String> |
DataTypeUtils.flattenToNames(DataType dataType)
Returns the names of the flat representation of the given data type.
|
static List<String> |
DataTypeUtils.flattenToNames(DataType dataType,
List<String> existingNames) |
static LogicalType |
TypeConversions.fromDataToLogicalType(DataType dataType) |
static LogicalType[] |
TypeConversions.fromDataToLogicalType(DataType[] dataTypes) |
static TypeInformation<?> |
TypeConversions.fromDataTypeToLegacyInfo(DataType dataType)
Deprecated.
Please don't use this method anymore. It will be removed soon and we should not
make the removal more painful. Sources and sinks should use the method available in
context to convert, within the planner you should use either
InternalTypeInfo or
ExternalTypeInfo depending on the use case. |
static TypeInformation<?>[] |
TypeConversions.fromDataTypeToLegacyInfo(DataType[] dataType)
Deprecated.
Please don't use this method anymore. It will be removed soon and we should not
make the removal more painful. Sources and sinks should use the method available in
context to convert, within the planner you should use either
InternalTypeInfo or
ExternalTypeInfo depending on the use case. |
static Optional<DataType> |
DataTypeUtils.getField(DataType compositeType,
int index)
Retrieves a nested field from a composite type at the given position.
|
static Optional<DataType> |
DataTypeUtils.getField(DataType compositeType,
String name)
Retrieves a nested field from a composite type with the given name.
|
static boolean |
DataTypeUtils.isInternal(DataType dataType)
Checks whether a given data type is an internal data structure.
|
static boolean |
DataTypeUtils.isInternal(DataType dataType,
boolean autobox)
Checks whether a given data type is an internal data structure.
|
static DataType |
DataTypeUtils.projectRow(DataType dataType,
int[] indexPaths)
Deprecated.
Use the
Projection type |
static DataType |
DataTypeUtils.projectRow(DataType dataType,
int[][] indexPaths)
Deprecated.
Use the
Projection type |
static DataType |
DataTypeUtils.removeTimeAttribute(DataType dataType)
Removes time attributes from the
DataType . |
static DataType |
DataTypeUtils.replaceLogicalType(DataType dataType,
LogicalType replacement)
Replaces the
LogicalType of a DataType , i.e., it keeps the bridging class. |
static DataType |
DataTypeUtils.stripRowPrefix(DataType dataType,
String prefix)
Removes a string prefix from the fields of the given row data type.
|
static DataType |
DataTypeUtils.toInternalDataType(DataType dataType)
|
static TypeInformation<?> |
LegacyTypeInfoDataTypeConverter.toLegacyTypeInfo(DataType dataType)
Deprecated.
|
static LogicalType |
LogicalTypeDataTypeConverter.toLogicalType(DataType dataType)
Returns the logical type of a data type.
|
static DataType |
DataTypeUtils.transform(DataTypeFactory factory,
DataType typeToTransform,
TypeTransformation... transformations)
Transforms the given data type to a different data type using the given transformations.
|
static DataType |
DataTypeUtils.transform(DataType typeToTransform,
TypeTransformation... transformations)
Transforms the given data type to a different data type using the given transformations.
|
static void |
DataTypeUtils.validateInputDataType(DataType dataType)
The
DataType class can only partially verify the conversion class. |
static void |
DataTypeUtils.validateOutputDataType(DataType dataType)
The
DataType class can only partially verify the conversion class. |
Modifier and Type | Method and Description |
---|---|
DataType |
TimeIntervalTypeInfo.getDataType()
Deprecated.
|
DataType[] |
FieldInfoUtils.TypeInfoSchema.getFieldTypes() |
Modifier and Type | Method and Description |
---|---|
static int[] |
TypeMappingUtils.computePhysicalIndices(List<TableColumn> logicalColumns,
DataType physicalType,
java.util.function.Function<String,String> nameRemapping)
Computes indices of physical fields corresponding to the selected logical fields of a
TableSchema . |
static Object |
PartitionPathUtils.convertStringToInternalValue(String valStr,
DataType type)
Restores a partition value from its string representation and type.
|
static GenericRowData |
PartitionPathUtils.fillPartitionValueForRecord(String[] fieldNames,
DataType[] fieldTypes,
int[] selectFields,
List<String> partitionKeys,
Path path,
String defaultPartValue)
Extracts partition values from the path and fills them into the record.
|
Modifier and Type | Method and Description |
---|---|
static Table |
PythonTableUtils.createTableFromElement(TableEnvironment tEnv,
String filePath,
DataType schema,
boolean batched)
Creates a table from a
PythonDynamicTableSource that reads data from the input file with the
specified DataType . |
static InputFormat<RowData,?> |
PythonTableUtils.getInputFormat(List<Object[]> data,
DataType dataType)
Wraps the unpickled Python data in an InputFormat.
|
Constructor and Description |
---|
PythonDynamicTableSource(String filePath,
boolean batched,
DataType producedDataType) |
Copyright © 2014–2024 The Apache Software Foundation. All rights reserved.