Modifier and Type | Method and Description |
---|---|
static Object | PythonBridgeUtils.getPickledBytesFromRow(Row row, DataType[] dataTypes) |
Modifier and Type | Method and Description |
---|---|
DataType[] | HBaseTableSchema.getQualifierDataTypes(String family) |
Modifier and Type | Method and Description |
---|---|
Optional<DataType> | HBaseTableSchema.getRowKeyDataType() |
Modifier and Type | Method and Description |
---|---|
void | HBaseTableSchema.addColumn(String family, String qualifier, DataType type) |
void | HBaseTableSchema.setRowKey(String rowKeyName, DataType type) |
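The addColumn/setRowKey methods above can be sketched together; this is a minimal sketch assuming the no-arg HBaseTableSchema constructor and the DataType-based overloads listed in the table, with illustrative family and qualifier names:

```java
import org.apache.flink.connector.hbase.util.HBaseTableSchema;
import org.apache.flink.table.api.DataTypes;

public class HBaseSchemaSketch {
    public static void main(String[] args) {
        // Describe an HBase table: a string row key plus two qualifiers
        // in column family "cf".
        HBaseTableSchema schema = new HBaseTableSchema();
        schema.setRowKey("rowkey", DataTypes.STRING());
        schema.addColumn("cf", "age", DataTypes.INT());
        schema.addColumn("cf", "name", DataTypes.STRING());

        // The declared types can then be queried back.
        System.out.println(schema.getRowKeyDataType());
        System.out.println(schema.getQualifierDataTypes("cf").length);
    }
}
```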
Modifier and Type | Method and Description |
---|---|
DataType | JdbcTableSource.getProducedDataType() |
Modifier and Type | Method and Description |
---|---|
JdbcOutputFormatBuilder | JdbcOutputFormatBuilder.setFieldDataTypes(DataType[] fieldDataTypes) |
Constructor and Description |
---|
JdbcRowDataLookupFunction(JdbcConnectorOptions options, JdbcLookupOptions lookupOptions, String[] fieldNames, DataType[] fieldTypes, String[] keyNames, RowType rowType) |
Constructor and Description |
---|
HiveContinuousPartitionFetcherContext(ObjectPath tablePath, HiveShim hiveShim, JobConfWrapper confWrapper, List<String> partitionKeys, DataType[] fieldTypes, String[] fieldNames, Configuration configuration, String defaultPartitionName) |
HiveRowDataPartitionComputer(HiveShim hiveShim, String defaultPartValue, String[] columnNames, DataType[] columnTypes, String[] partitionColumns) |
Modifier and Type | Field and Description |
---|---|
protected DataType[] | HivePartitionFetcherContextBase.fieldTypes |
Constructor and Description |
---|
HiveBulkFormatAdapter(JobConfWrapper jobConfWrapper, List<String> partitionKeys, String[] fieldNames, DataType[] fieldTypes, String hiveVersion, RowType producedRowType, boolean useMapRedReader) |
HiveInputFormatPartitionReader(org.apache.hadoop.mapred.JobConf jobConf, String hiveVersion, ObjectPath tablePath, DataType[] fieldTypes, String[] fieldNames, List<String> partitionKeys, int[] selectedFields, boolean useMapRedReader) |
HiveMapredSplitReader(org.apache.hadoop.mapred.JobConf jobConf, List<String> partitionKeys, DataType[] fieldTypes, int[] selectedFields, HiveTableInputSplit split, HiveShim hiveShim) |
HivePartitionFetcherContextBase(ObjectPath tablePath, HiveShim hiveShim, JobConfWrapper confWrapper, List<String> partitionKeys, DataType[] fieldTypes, String[] fieldNames, Configuration configuration, String defaultPartitionName) |
HiveTableInputFormat(org.apache.hadoop.mapred.JobConf jobConf, List<String> partitionKeys, DataType[] fieldTypes, String[] fieldNames, int[] projectedFields, Long limit, String hiveVersion, boolean useMapRedReader, List<HiveTablePartition> partitions) |
HiveVectorizedOrcSplitReader(String hiveVersion, org.apache.hadoop.mapred.JobConf jobConf, String[] fieldNames, DataType[] fieldTypes, int[] selectedFields, HiveTableInputSplit split) |
HiveVectorizedParquetSplitReader(String hiveVersion, org.apache.hadoop.mapred.JobConf jobConf, String[] fieldNames, DataType[] fieldTypes, int[] selectedFields, HiveTableInputSplit split) |
Modifier and Type | Method and Description |
---|---|
static Map<String,Object> | HivePartitionUtils.parsePartitionValues(Map<String,String> partitionSpecs, String[] fieldNames, DataType[] fieldTypes, String defaultPartitionName, HiveShim shim) Parse partition string specs into object values. |
Modifier and Type | Method and Description |
---|---|
static RowType | DebeziumAvroSerializationSchema.createDebeziumAvroRowType(DataType dataType) |
static RowType | DebeziumAvroDeserializationSchema.createDebeziumAvroRowType(DataType databaseSchema) |
Modifier and Type | Method and Description |
---|---|
static DataType | AvroSchemaConverter.convertToDataType(String avroSchemaString) Converts an Avro schema string into a nested row structure with deterministic field order and data types that are compatible with Flink's Table & SQL API. |
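AvroSchemaConverter.convertToDataType takes the Avro schema as a JSON string and yields a ROW data type with one field per Avro record field. A minimal sketch, using an illustrative two-field record schema:

```java
import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
import org.apache.flink.table.types.DataType;

public class AvroToDataTypeSketch {
    public static void main(String[] args) {
        // A minimal Avro record schema as a JSON string.
        String avroSchema =
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"long\"},"
                + "{\"name\":\"name\",\"type\":\"string\"}]}";

        // Convert it into a Flink DataType (a ROW with matching fields).
        DataType rowType = AvroSchemaConverter.convertToDataType(avroSchema);
        System.out.println(rowType);
    }
}
```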
Constructor and Description |
---|
CsvInputFormat(Path[] filePaths, DataType[] fieldTypes, String[] fieldNames, org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema csvSchema, RowType formatRowType, int[] selectFields, List<String> partitionKeys, String defaultPartValue, long limit, int[] csvSelectFieldToProjectFieldMapping, int[] csvSelectFieldToCsvFieldMapping, boolean ignoreParseErrors) |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> | CanalJsonDecodingFormat.listReadableMetadata() |
static CanalJsonDeserializationSchema.Builder | CanalJsonDeserializationSchema.builder(DataType physicalDataType, List<org.apache.flink.formats.json.canal.CanalJsonDecodingFormat.ReadableMetadata> requestedMetadata, TypeInformation<RowData> producedTypeInfo) Creates a builder for building a CanalJsonDeserializationSchema. |
DeserializationSchema<RowData> | CanalJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context, DataType physicalDataType) |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> | DebeziumJsonDecodingFormat.listReadableMetadata() |
DeserializationSchema<RowData> | DebeziumJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context, DataType physicalDataType) |

Constructor and Description |
---|
DebeziumJsonDeserializationSchema(DataType physicalDataType, List<org.apache.flink.formats.json.debezium.DebeziumJsonDecodingFormat.ReadableMetadata> requestedMetadata, TypeInformation<RowData> producedTypeInfo, boolean schemaInclude, boolean ignoreParseErrors, TimestampFormat timestampFormat) |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> | MaxwellJsonDecodingFormat.listReadableMetadata() |
DeserializationSchema<RowData> | MaxwellJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context, DataType physicalDataType) |

Constructor and Description |
---|
MaxwellJsonDeserializationSchema(DataType physicalDataType, List<org.apache.flink.formats.json.maxwell.MaxwellJsonDecodingFormat.ReadableMetadata> requestedMetadata, TypeInformation<RowData> producedTypeInfo, boolean ignoreParseErrors, TimestampFormat timestampFormat) |
Modifier and Type | Method and Description |
---|---|
static ParquetColumnarRowSplitReader | ParquetSplitReaderUtil.genPartColumnarRowReader(boolean utcTimestamp, boolean caseSensitive, Configuration conf, String[] fullFieldNames, DataType[] fullFieldTypes, Map<String,Object> partitionSpec, int[] selectedFields, int batchSize, Path path, long splitStart, long splitLength) Util for generating partitioned ParquetColumnarRowSplitReader. |
Modifier and Type | Method and Description |
---|---|
static org.apache.orc.TypeDescription | OrcSplitReaderUtil.convertToOrcTypeWithPart(String[] fullFieldNames, DataType[] fullFieldTypes, Collection<String> partitionKeys) |
static OrcColumnarRowSplitReader<org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch> | OrcSplitReaderUtil.genPartColumnarRowReader(String hiveVersion, Configuration conf, String[] fullFieldNames, DataType[] fullFieldTypes, Map<String,Object> partitionSpec, int[] selectedFields, List<OrcFilters.Predicate> conjunctPredicates, int batchSize, Path path, long splitStart, long splitLength) Util for generating partitioned OrcColumnarRowSplitReader. |
Modifier and Type | Method and Description |
---|---|
static OrcColumnarRowSplitReader<org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch> | OrcNoHiveSplitReaderUtil.genPartColumnarRowReader(Configuration conf, String[] fullFieldNames, DataType[] fullFieldTypes, Map<String,Object> partitionSpec, int[] selectedFields, List<OrcFilters.Predicate> conjunctPredicates, int batchSize, Path path, long splitStart, long splitLength) Util for generating partitioned OrcColumnarRowSplitReader. |
Modifier and Type | Method and Description |
---|---|
DataType | BatchSQLTestProgram.GeneratorTableSource.getProducedDataType() |
Modifier and Type | Field and Description |
---|---|
protected DataType | KafkaDynamicSink.consumedDataType Data type of the consumed data. |
protected DataType | KafkaDynamicSource.physicalDataType Data type to configure the formats. |
protected DataType | KafkaDynamicSink.physicalDataType Data type to configure the formats. |
protected DataType | KafkaDynamicSource.producedDataType Data type that describes the final output of the source. |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> | KafkaDynamicSource.listReadableMetadata() |
Map<String,DataType> | KafkaDynamicSink.listWritableMetadata() |
Modifier and Type | Method and Description |
---|---|
void | KafkaDynamicSource.applyReadableMetadata(List<String> metadataKeys, DataType producedDataType) |
void | KafkaDynamicSink.applyWritableMetadata(List<String> metadataKeys, DataType consumedDataType) |
protected KafkaDynamicSink | KafkaDynamicTableFactory.createKafkaTableSink(DataType physicalDataType, EncodingFormat<SerializationSchema<RowData>> keyEncodingFormat, EncodingFormat<SerializationSchema<RowData>> valueEncodingFormat, int[] keyProjection, int[] valueProjection, String keyPrefix, String topic, Properties properties, FlinkKafkaPartitioner<RowData> partitioner, DeliveryGuarantee deliveryGuarantee, Integer parallelism, String transactionalIdPrefix) |
protected KafkaDynamicSource | KafkaDynamicTableFactory.createKafkaTableSource(DataType physicalDataType, DecodingFormat<DeserializationSchema<RowData>> keyDecodingFormat, DecodingFormat<DeserializationSchema<RowData>> valueDecodingFormat, int[] keyProjection, int[] valueProjection, String keyPrefix, List<String> topics, Pattern topicPattern, Properties properties, StartupMode startupMode, Map<KafkaTopicPartition,Long> specificStartupOffsets, long startupTimestampMillis, String tableIdentifier) |
DeserializationSchema<RowData> | UpsertKafkaDynamicTableFactory.DecodingFormatWrapper.createRuntimeDecoder(DynamicTableSource.Context context, DataType producedDataType) |
SerializationSchema<RowData> | UpsertKafkaDynamicTableFactory.EncodingFormatWrapper.createRuntimeEncoder(DynamicTableSink.Context context, DataType consumedDataType) |
Constructor and Description |
---|
KafkaDynamicSink(DataType consumedDataType, DataType physicalDataType, EncodingFormat<SerializationSchema<RowData>> keyEncodingFormat, EncodingFormat<SerializationSchema<RowData>> valueEncodingFormat, int[] keyProjection, int[] valueProjection, String keyPrefix, String topic, Properties properties, FlinkKafkaPartitioner<RowData> partitioner, DeliveryGuarantee deliveryGuarantee, boolean upsertMode, SinkBufferFlushMode flushMode, Integer parallelism, String transactionalIdPrefix) |
KafkaDynamicSource(DataType physicalDataType, DecodingFormat<DeserializationSchema<RowData>> keyDecodingFormat, DecodingFormat<DeserializationSchema<RowData>> valueDecodingFormat, int[] keyProjection, int[] valueProjection, String keyPrefix, List<String> topics, Pattern topicPattern, Properties properties, StartupMode startupMode, Map<KafkaTopicPartition,Long> specificStartupOffsets, long startupTimestampMillis, boolean upsertMode, String tableIdentifier) |
Modifier and Type | Method and Description |
---|---|
DataType | RowDataKinesisDeserializationSchema.Metadata.getDataType() |
Map<String,DataType> | KinesisDynamicSource.listReadableMetadata() |
void | KinesisDynamicSource.applyReadableMetadata(List<String> metadataKeys, DataType producedDataType) |

Constructor and Description |
---|
KinesisDynamicSink(DataType consumedDataType, String stream, Properties producerProperties, EncodingFormat<SerializationSchema<RowData>> encodingFormat, KinesisPartitioner<RowData> partitioner) |
KinesisDynamicSource(DataType physicalDataType, String stream, Properties consumerProperties, DecodingFormat<DeserializationSchema<RowData>> decodingFormat) |
KinesisDynamicSource(DataType physicalDataType, String stream, Properties consumerProperties, DecodingFormat<DeserializationSchema<RowData>> decodingFormat, DataType producedDataType, List<RowDataKinesisDeserializationSchema.Metadata> requestedMetadataFields) |
Modifier and Type | Method and Description |
---|---|
static DataType | DataTypes.ARRAY(DataType elementDataType) Data type of an array of elements with same subtype. |
static DataType | DataTypes.BIGINT() Data type of an 8-byte signed integer with values from -9,223,372,036,854,775,808 to 9,223,372,036,854,775,807. |
static DataType | DataTypes.BINARY(int n) Data type of a fixed-length binary string (=a sequence of bytes) BINARY(n) where n is the number of bytes. |
static DataType | DataTypes.BOOLEAN() Data type of a boolean with a (possibly) three-valued logic of TRUE, FALSE, UNKNOWN. |
static DataType | DataTypes.BYTES() Data type of a variable-length binary string (=a sequence of bytes) with defined maximum length. |
static DataType | DataTypes.CHAR(int n) Data type of a fixed-length character string CHAR(n) where n is the number of code points. |
static DataType | DataTypes.DATE() Data type of a date consisting of year-month-day with values ranging from 0000-01-01 to 9999-12-31. |
static DataType | DataTypes.DECIMAL(int precision, int scale) Data type of a decimal number with fixed precision and scale DECIMAL(p, s) where p is the number of digits in a number (=precision) and s is the number of digits to the right of the decimal point in a number (=scale). |
static DataType | DataTypes.DOUBLE() Data type of an 8-byte double precision floating point number. |
static DataType | DataTypes.FLOAT() Data type of a 4-byte single precision floating point number. |
DataType | DataTypes.Field.getDataType() |
DataType[] | TableSchema.getFieldDataTypes() Deprecated. Returns all field data types as an array. |
DataType | TableColumn.getType() Deprecated. Returns the data type of this column. |
DataType | WatermarkSpec.getWatermarkExprOutputType() Deprecated. Returns the data type of the computation result of watermark generation expression. |
static DataType | DataTypes.INT() Data type of a 4-byte signed integer with values from -2,147,483,648 to 2,147,483,647. |
static DataType | DataTypes.INTERVAL(DataTypes.Resolution resolution) Data type of a temporal interval. |
static DataType | DataTypes.INTERVAL(DataTypes.Resolution upperResolution, DataTypes.Resolution lowerResolution) Data type of a temporal interval. |
static DataType | DataTypes.MAP(DataType keyDataType, DataType valueDataType) Data type of an associative array that maps keys (including NULL) to values (including NULL). |
static DataType | DataTypes.MULTISET(DataType elementDataType) Data type of a multiset (=bag). |
static DataType | DataTypes.NULL() Data type for representing untyped NULL values. |
static <T> DataType | DataTypes.RAW(Class<T> clazz, TypeSerializer<T> serializer) Data type of an arbitrary serialized type. |
static DataType | DataTypes.ROW() Data type of a row type with no fields. |
static DataType | DataTypes.ROW(DataType... fieldDataTypes) Data type of a sequence of fields. |
static DataType | DataTypes.ROW(DataTypes.Field... fields) Data type of a sequence of fields. |
static DataType | DataTypes.SMALLINT() Data type of a 2-byte signed integer with values from -32,768 to 32,767. |
static DataType | DataTypes.STRING() Data type of a variable-length character string with defined maximum length. |
static <T> DataType | DataTypes.STRUCTURED(Class<T> implementationClass, DataTypes.Field... fields) Data type of a user-defined object structured type. |
static DataType | DataTypes.TIME() Data type of a time WITHOUT time zone TIME with no fractional seconds by default. |
static DataType | DataTypes.TIME(int precision) Data type of a time WITHOUT time zone TIME(p) where p is the number of digits of fractional seconds (=precision). |
static DataType | DataTypes.TIMESTAMP_LTZ() Data type of a timestamp WITH LOCAL time zone. |
static DataType | DataTypes.TIMESTAMP_LTZ(int precision) Data type of a timestamp WITH LOCAL time zone. |
static DataType | DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE() Data type of a timestamp WITH LOCAL time zone TIMESTAMP WITH LOCAL TIME ZONE with 6 digits of fractional seconds by default. |
static DataType | DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE(int precision) Data type of a timestamp WITH LOCAL time zone TIMESTAMP(p) WITH LOCAL TIME ZONE where p is the number of digits of fractional seconds (=precision). |
static DataType | DataTypes.TIMESTAMP_WITH_TIME_ZONE() Data type of a timestamp WITH time zone TIMESTAMP WITH TIME ZONE with 6 digits of fractional seconds by default. |
static DataType | DataTypes.TIMESTAMP_WITH_TIME_ZONE(int precision) Data type of a timestamp WITH time zone TIMESTAMP(p) WITH TIME ZONE where p is the number of digits of fractional seconds (=precision). |
static DataType | DataTypes.TIMESTAMP() Data type of a timestamp WITHOUT time zone TIMESTAMP with 6 digits of fractional seconds by default. |
static DataType | DataTypes.TIMESTAMP(int precision) Data type of a timestamp WITHOUT time zone TIMESTAMP(p) where p is the number of digits of fractional seconds (=precision). |
static DataType | DataTypes.TINYINT() Data type of a 1-byte signed integer with values from -128 to 127. |
DataType | TableSchema.toPersistedRowDataType() Deprecated. Converts all persisted columns of this schema into a (possibly nested) row data type. |
DataType | TableSchema.toPhysicalRowDataType() Deprecated. Converts all physical columns of this schema into a (possibly nested) row data type. |
DataType | TableSchema.toRowDataType() Deprecated. Converts all columns of this schema into a (possibly nested) row data type. |
static DataType | DataTypes.VARBINARY(int n) Data type of a variable-length binary string (=a sequence of bytes) VARBINARY(n) where n is the maximum number of bytes. |
static DataType | DataTypes.VARCHAR(int n) Data type of a variable-length character string VARCHAR(n) where n is the maximum number of code points. |
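The DataTypes factory methods above compose freely into nested types. A minimal sketch with illustrative field names, showing atomic, constructed, and row types:

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.types.DataType;

public class DataTypesSketch {
    public static void main(String[] args) {
        // Atomic types; factory methods return nullable types by default,
        // notNull() tightens the constraint.
        DataType id = DataTypes.BIGINT().notNull();
        DataType amount = DataTypes.DECIMAL(10, 2);
        DataType createdAt = DataTypes.TIMESTAMP(3);

        // Constructed types wrap other data types.
        DataType tags = DataTypes.ARRAY(DataTypes.STRING());
        DataType attributes = DataTypes.MAP(DataTypes.STRING(), DataTypes.DOUBLE());

        // A row type bundles named fields, possibly nested.
        DataType order = DataTypes.ROW(
            DataTypes.FIELD("id", id),
            DataTypes.FIELD("amount", amount),
            DataTypes.FIELD("created_at", createdAt),
            DataTypes.FIELD("tags", tags),
            DataTypes.FIELD("attributes", attributes));

        System.out.println(order);
    }
}
```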
Modifier and Type | Method and Description |
---|---|
Optional<DataType> | TableSchema.getFieldDataType(int fieldIndex) Deprecated. Returns the specified data type for the given field index. |
Optional<DataType> | TableSchema.getFieldDataType(String fieldName) Deprecated. Returns the specified data type for the given field name. |
Modifier and Type | Method and Description |
---|---|
static DataType | DataTypes.ARRAY(DataType elementDataType) Data type of an array of elements with same subtype. |
static TableColumn.ComputedColumn | TableColumn.computed(String name, DataType type, String expression) Deprecated. Creates a computed column that is computed from the given SQL expression. |
TableSchema.Builder | TableSchema.Builder.field(String name, DataType dataType) Add a field with name and data type. |
static DataTypes.Field | DataTypes.FIELD(String name, DataType dataType) Field definition with field name and data type. |
TableSchema.Builder | TableSchema.Builder.field(String name, DataType dataType, String expression) Add a computed field which is generated by the given expression. |
static DataTypes.Field | DataTypes.FIELD(String name, DataType dataType, String description) Field definition with field name, data type, and a description. |
TableSchema.Builder | TableSchema.Builder.fields(String[] names, DataType[] dataTypes) Add an array of fields with names and data types. |
Schema.Builder | Schema.Builder.fromRowDataType(DataType dataType) Adopts all fields of the given row as physical columns of the schema. |
static ApiExpression | Expressions.lit(Object v, DataType dataType) Creates a SQL literal of a given DataType. |
static DataType | DataTypes.MAP(DataType keyDataType, DataType valueDataType) Data type of an associative array that maps keys (including NULL) to values (including NULL). |
static TableColumn.MetadataColumn | TableColumn.metadata(String name, DataType type) Deprecated. Creates a metadata column from metadata of the given column name. |
static TableColumn.MetadataColumn | TableColumn.metadata(String name, DataType type, boolean isVirtual) Deprecated. Creates a metadata column from metadata of the given column name. |
static TableColumn.MetadataColumn | TableColumn.metadata(String name, DataType type, String metadataAlias) Deprecated. Creates a metadata column from metadata of the given alias. |
static TableColumn.MetadataColumn | TableColumn.metadata(String name, DataType type, String metadataAlias, boolean isVirtual) Deprecated. Creates a metadata column from metadata of the given column name or from metadata of the given alias (if not null). |
static DataType | DataTypes.MULTISET(DataType elementDataType) Data type of a multiset (=bag). |
static ApiExpression | Expressions.nullOf(DataType dataType) Returns a null literal value of a given data type. |
static TableColumn | TableColumn.of(String name, DataType type) Deprecated. Use TableColumn.physical(String, DataType) instead. |
static TableColumn | TableColumn.of(String name, DataType type, String expression) Deprecated. Use TableColumn.computed(String, DataType, String) instead. |
static TableColumn.PhysicalColumn | TableColumn.physical(String name, DataType type) Deprecated. Creates a regular table column that represents physical data. |
static DataType | DataTypes.ROW(DataType... fieldDataTypes) Data type of a sequence of fields. |
TableSchema.Builder | TableSchema.Builder.watermark(String rowtimeAttribute, String watermarkExpressionString, DataType watermarkExprOutputType) Specifies the previously defined field as an event-time attribute and specifies the watermark strategy. |
Constructor and Description |
---|
WatermarkSpec(String rowtimeAttribute, String watermarkExpressionString, DataType watermarkExprOutputType) Deprecated. |
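Schema.Builder.fromRowDataType, listed above, adopts an existing row data type as physical columns; this is the non-deprecated counterpart of TableSchema.Builder.field. A minimal sketch with illustrative column names and watermark expression:

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.types.DataType;

public class SchemaFromRowTypeSketch {
    public static void main(String[] args) {
        // A row data type describing the physical fields.
        DataType rowType = DataTypes.ROW(
            DataTypes.FIELD("id", DataTypes.INT()),
            DataTypes.FIELD("ts", DataTypes.TIMESTAMP(3)),
            DataTypes.FIELD("payload", DataTypes.STRING()));

        // fromRowDataType adopts each field as a physical column;
        // a watermark can then be declared on the timestamp column.
        Schema schema = Schema.newBuilder()
            .fromRowDataType(rowType)
            .watermark("ts", "ts - INTERVAL '5' SECOND")
            .build();

        System.out.println(schema);
    }
}
```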
Modifier and Type | Method and Description |
---|---|
static DataType | ListView.newListViewDataType(DataType elementDataType) |
static DataType | MapView.newMapViewDataType(DataType keyDataType, DataType valueDataType) |
Modifier and Type | Method and Description |
---|---|
static DataType | ListView.newListViewDataType(DataType elementDataType) |
static DataType | MapView.newMapViewDataType(DataType keyDataType, DataType valueDataType) |
Modifier and Type | Method and Description |
---|---|
OutType | BaseExpressions.cast(DataType toType) Converts a value to a given data type. |
OutType | BaseExpressions.jsonValue(String path, DataType returningType) Extracts a scalar from a JSON string. |
OutType | BaseExpressions.jsonValue(String path, DataType returningType, InType defaultOnEmptyOrError) Extracts a scalar from a JSON string. |
OutType | BaseExpressions.jsonValue(String path, DataType returningType, JsonValueOnEmptyOrError onEmpty, InType defaultOnEmpty, JsonValueOnEmptyOrError onError, InType defaultOnError) Extracts a scalar from a JSON string. |
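The cast and jsonValue expressions above take a DataType as the target type. A minimal sketch of building such expressions with the Table API; the column names and JSON path are illustrative:

```java
import static org.apache.flink.table.api.Expressions.$;

import org.apache.flink.table.api.ApiExpression;
import org.apache.flink.table.api.DataTypes;

public class ExpressionCastSketch {
    public static void main(String[] args) {
        // cast(DataType) converts a column's value to the given data type.
        ApiExpression price = $("price_str").cast(DataTypes.DECIMAL(10, 2));

        // jsonValue(path, DataType) extracts a scalar from a JSON string
        // column, returning it with the requested type.
        ApiExpression city = $("profile_json")
            .jsonValue("$.address.city", DataTypes.STRING());

        System.out.println(price);
        System.out.println(city);
    }
}
```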
Modifier and Type | Field and Description |
---|---|
protected DataType | Column.dataType |
Modifier and Type | Method and Description |
---|---|
DataType | DataTypeFactory.createDataType(AbstractDataType<?> abstractDataType) Creates a type out of an AbstractDataType. |
<T> DataType | DataTypeFactory.createDataType(Class<T> clazz) Creates a type by analyzing the given class. |
DataType | DataTypeFactory.createDataType(String name) Creates a type by a fully or partially defined name. |
<T> DataType | DataTypeFactory.createDataType(TypeInformation<T> typeInfo) Creates a type for the given TypeInformation. |
DataType | DataTypeFactory.createDataType(UnresolvedIdentifier identifier) Creates a type by a fully or partially defined identifier. |
<T> DataType | DataTypeFactory.createRawDataType(Class<T> clazz) Creates a RAW type for the given class in cases where no serializer is known and a generic serializer should be used. |
<T> DataType | DataTypeFactory.createRawDataType(TypeInformation<T> typeInfo) Creates a RAW type for the given TypeInformation. |
DataType | Column.getDataType() Returns the data type of this column. |
DataType | SchemaTranslator.ConsumingResult.getPhysicalDataType() |
DataType | ResolvedSchema.toPhysicalRowDataType() Converts all physical columns of this schema into a (possibly nested) row data type. |
DataType | ResolvedSchema.toSinkRowDataType() Converts all persisted columns of this schema into a (possibly nested) row data type. |
DataType | ResolvedSchema.toSourceRowDataType() Converts all columns of this schema into a (possibly nested) row data type. |
Modifier and Type | Method and Description |
---|---|
List<DataType> | ResolvedSchema.getColumnDataTypes() Returns all column data types. |
Optional<DataType> | SchemaTranslator.ProducingResult.getPhysicalDataType() |
Modifier and Type | Method and Description |
---|---|
abstract Column | Column.copy(DataType newType) Returns a copy of the column with a replaced DataType. |
Column | Column.PhysicalColumn.copy(DataType newDataType) |
Column | Column.ComputedColumn.copy(DataType newDataType) |
Column | Column.MetadataColumn.copy(DataType newDataType) |
static SchemaTranslator.ConsumingResult | SchemaTranslator.createConsumingResult(DataTypeFactory dataTypeFactory, DataType inputDataType, Schema declaredSchema, boolean mergePhysicalSchema) Converts the given DataType and an optional declared Schema (possibly incomplete) into the final SchemaTranslator.ConsumingResult. |
static Column.MetadataColumn | Column.metadata(String name, DataType dataType, String metadataKey, boolean isVirtual) Creates a metadata column from metadata of the given column name or from metadata of the given key (if not null). |
static ResolvedSchema | ResolvedSchema.physical(String[] columnNames, DataType[] columnDataTypes) Shortcut for a resolved schema of only physical columns. |
static Column.PhysicalColumn | Column.physical(String name, DataType dataType) Creates a regular table column that represents physical data. |
Modifier and Type | Method and Description |
---|---|
static ResolvedSchema | ResolvedSchema.physical(List<String> columnNames, List<DataType> columnDataTypes) Shortcut for a resolved schema of only physical columns. |
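ResolvedSchema.physical, listed above in both array and list variants, is the shortcut for building a schema of only physical columns; toPhysicalRowDataType then collapses it back into a row type. A minimal sketch with illustrative column names:

```java
import java.util.Arrays;

import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.catalog.ResolvedSchema;
import org.apache.flink.table.types.DataType;

public class ResolvedSchemaSketch {
    public static void main(String[] args) {
        // Shortcut for a resolved schema of only physical columns.
        ResolvedSchema schema = ResolvedSchema.physical(
            Arrays.asList("id", "name"),
            Arrays.asList(DataTypes.INT().notNull(), DataTypes.STRING()));

        // The column list collapses into a single (possibly nested) row
        // data type, which is what sources and sinks typically consume.
        DataType rowType = schema.toPhysicalRowDataType();
        System.out.println(rowType);
    }
}
```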
Modifier and Type | Method and Description |
---|---|
static DataType | HiveTypeUtil.toFlinkType(org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector inspector) Convert a Hive ObjectInspector to a Flink data type. |
static DataType | HiveTypeUtil.toFlinkType(org.apache.hadoop.hive.serde2.typeinfo.TypeInfo hiveType) Convert Hive data type to a Flink data type. |
Modifier and Type | Method and Description |
---|---|
static org.apache.hadoop.hive.serde2.typeinfo.TypeInfo | HiveTypeUtil.toHiveTypeInfo(DataType dataType, boolean checkPrecision) Convert Flink DataType to Hive TypeInfo. |
Modifier and Type | Method and Description |
---|---|
static String[] | CliUtils.typesToString(DataType[] types) |
Modifier and Type | Method and Description |
---|---|
default Map<String,DataType> | DecodingFormat.listReadableMetadata() Returns the map of metadata keys and their corresponding data types that can be produced by this format for reading. |
default Map<String,DataType> | EncodingFormat.listWritableMetadata() Returns the map of metadata keys and their corresponding data types that can be consumed by this format for writing. |
Modifier and Type | Method and Description |
---|---|
I | DecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context, DataType physicalDataType) Creates runtime decoder implementation that is configured to produce data of the given data type. |
I | EncodingFormat.createRuntimeEncoder(DynamicTableSink.Context context, DataType physicalDataType) Creates runtime encoder implementation that is configured to consume data of the given data type. |
Modifier and Type | Method and Description |
---|---|
DynamicTableSink.DataStructureConverter | DynamicTableSink.Context.createDataStructureConverter(DataType consumedDataType) Creates a converter for mapping between Flink's internal data structures and objects specified by the given DataType that can be passed into a runtime implementation. |
<T> TypeInformation<T> | DynamicTableSink.Context.createTypeInformation(DataType consumedDataType) Creates type information describing the internal data structures of the given DataType. |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> | SupportsWritingMetadata.listWritableMetadata() Returns the map of metadata keys and their corresponding data types that can be consumed by this table sink for writing. |
Modifier and Type | Method and Description |
---|---|
void | SupportsWritingMetadata.applyWritableMetadata(List<String> metadataKeys, DataType consumedDataType) Provides a list of metadata keys that the consumed RowData will contain as appended metadata columns which must be persisted. |
Modifier and Type | Method and Description |
---|---|
DynamicTableSource.DataStructureConverter | DynamicTableSource.Context.createDataStructureConverter(DataType producedDataType) Creates a converter for mapping between objects specified by the given DataType and Flink's internal data structures that can be passed into a runtime implementation. |
<T> TypeInformation<T> | DynamicTableSource.Context.createTypeInformation(DataType producedDataType) Creates type information describing the internal data structures of the given DataType. |
Modifier and Type | Method and Description |
---|---|
Map<String,DataType> | SupportsReadingMetadata.listReadableMetadata() Returns the map of metadata keys and their corresponding data types that can be produced by this table source for reading. |
Modifier and Type | Method and Description |
---|---|
boolean | SupportsAggregatePushDown.applyAggregates(List<int[]> groupingSets, List<AggregateExpression> aggregateExpressions, DataType producedDataType) Provides a list of aggregate expressions and the grouping keys. |
void | SupportsReadingMetadata.applyReadableMetadata(List<String> metadataKeys, DataType producedDataType) Provides a list of metadata keys that the produced RowData must contain as appended metadata columns. |
Modifier and Type | Method and Description |
---|---|
static StructuredObjectConverter<?> | StructuredObjectConverter.create(DataType dataType) |
static RowRowConverter | RowRowConverter.create(DataType dataType) |
static RawByteArrayConverter<?> | RawByteArrayConverter.create(DataType dataType) |
static ArrayListConverter<?> | ArrayListConverter.create(DataType dataType) |
static YearMonthIntervalPeriodConverter | YearMonthIntervalPeriodConverter.create(DataType dataType) |
static RawObjectConverter<?> | RawObjectConverter.create(DataType dataType) |
static ArrayObjectArrayConverter<?> | ArrayObjectArrayConverter.create(DataType dataType) |
static <E> ArrayObjectArrayConverter<E> | ArrayObjectArrayConverter.createForElement(DataType elementDataType) |
static MapMapConverter<?,?> | MapMapConverter.createForMapType(DataType dataType) |
static MapMapConverter<?,?> | MapMapConverter.createForMultisetType(DataType dataType) |
static DataStructureConverter<Object,Object> | DataStructureConverters.getConverter(DataType dataType) Returns a converter for the given DataType. |
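The converter factories above map between external (user-facing) objects and Flink's internal data structures. A minimal sketch using RowRowConverter to round-trip a Row through RowData; the field names are illustrative:

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.data.conversion.RowRowConverter;
import org.apache.flink.table.types.DataType;
import org.apache.flink.types.Row;

public class RowRowConverterSketch {
    public static void main(String[] args) {
        DataType rowType = DataTypes.ROW(
            DataTypes.FIELD("id", DataTypes.INT()),
            DataTypes.FIELD("name", DataTypes.STRING()));

        // Create and open a converter between external Row objects and
        // Flink's internal RowData representation.
        RowRowConverter converter = RowRowConverter.create(rowType);
        converter.open(Thread.currentThread().getContextClassLoader());

        RowData internal = converter.toInternal(Row.of(1, "flink"));
        Row external = converter.toExternal(internal);
        System.out.println(external);
    }
}
```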
Modifier and Type | Method and Description |
---|---|
static DataFormatConverters.DataFormatConverter | DataFormatConverters.getConverterForDataType(DataType originDataType) |
Constructor and Description |
---|
AbstractRowDataConverter(DataType[] fieldTypes) |
CaseClassConverter(TupleTypeInfoBase t, DataType[] fieldTypes) |
MapConverter(DataType keyTypeInfo, DataType valueTypeInfo) |
ObjectArrayConverter(DataType elementType) |
PojoConverter(PojoTypeInfo<T> t, DataType[] fieldTypes) |
RowConverter(DataType[] fieldTypes) |
TupleConverter(Class<Tuple> clazz, DataType[] fieldTypes) |
Modifier and Type | Method and Description |
---|---|
DataType | DescriptorProperties.getDataType(String key) Deprecated. Returns the DataType under the given existing key. |
Modifier and Type | Method and Description |
---|---|
Optional<DataType> |
DescriptorProperties.getOptionalDataType(String key)
Deprecated.
Returns the DataType under the given key if it exists.
|
Modifier and Type | Method and Description |
---|---|
Schema |
Schema.field(String fieldName,
DataType fieldType)
Deprecated.
Adds a field with the field name and the data type.
|
Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> |
ChangelogCsvFormat.createRuntimeDecoder(DynamicTableSource.Context context,
DataType producedDataType) |
Constructor and Description |
---|
SocketDynamicTableSource(String hostname,
int port,
byte byteDelimiter,
DecodingFormat<DeserializationSchema<RowData>> decodingFormat,
DataType producedDataType) |
Modifier and Type | Method and Description |
---|---|
DataType |
LocalReferenceExpression.getOutputDataType() |
DataType |
TableReferenceExpression.getOutputDataType() |
DataType |
TypeLiteralExpression.getOutputDataType() |
DataType |
FieldReferenceExpression.getOutputDataType() |
DataType |
CallExpression.getOutputDataType() |
DataType |
AggregateExpression.getOutputDataType() |
DataType |
ResolvedExpression.getOutputDataType()
Returns the data type of the computation result.
|
DataType |
ValueLiteralExpression.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
static LocalReferenceExpression |
ApiExpressionUtils.localRef(String name,
DataType dataType) |
CallExpression |
UnresolvedCallExpression.resolve(List<ResolvedExpression> args,
DataType dataType) |
static TypeLiteralExpression |
ApiExpressionUtils.typeLiteral(DataType dataType) |
static ValueLiteralExpression |
ApiExpressionUtils.valueLiteral(Object value,
DataType dataType) |
Constructor and Description |
---|
AggregateExpression(FunctionDefinition functionDefinition,
List<FieldReferenceExpression> args,
CallExpression filterExpression,
DataType resultType,
boolean distinct,
boolean approximate,
boolean ignoreNulls) |
CallExpression(FunctionDefinition functionDefinition,
List<ResolvedExpression> args,
DataType dataType) |
CallExpression(FunctionIdentifier functionIdentifier,
FunctionDefinition functionDefinition,
List<ResolvedExpression> args,
DataType dataType) |
FieldReferenceExpression(String name,
DataType dataType,
int inputIndex,
int fieldIndex) |
TypeLiteralExpression(DataType dataType) |
ValueLiteralExpression(Object value,
DataType dataType) |
Modifier and Type | Method and Description |
---|---|
CallExpression |
ExpressionResolver.PostResolverFactory.array(DataType dataType,
ResolvedExpression... expression) |
CallExpression |
ExpressionResolver.PostResolverFactory.cast(ResolvedExpression expression,
DataType dataType) |
CallExpression |
ExpressionResolver.PostResolverFactory.get(ResolvedExpression composite,
ValueLiteralExpression key,
DataType dataType) |
CallExpression |
ExpressionResolver.PostResolverFactory.map(DataType dataType,
ResolvedExpression... expression) |
CallExpression |
ExpressionResolver.PostResolverFactory.row(DataType dataType,
ResolvedExpression... expression) |
ExpressionResolver.ExpressionResolverBuilder |
ExpressionResolver.ExpressionResolverBuilder.withOutputDataType(DataType outputDataType) |
Modifier and Type | Method and Description |
---|---|
Optional<DataType> |
ResolverRule.ResolutionContext.getOutputDataType()
Access to the expected top-level output data type.
|
Modifier and Type | Method and Description |
---|---|
default DataType[] |
FileSystemFormatFactory.ReaderContext.getFormatFieldTypes()
Gets the field types without partition keys.
|
Modifier and Type | Method and Description |
---|---|
static Object |
RowPartitionComputer.restorePartValueFromType(String valStr,
DataType type) |
Constructor and Description |
---|
RowDataPartitionComputer(String defaultPartValue,
String[] columnNames,
DataType[] columnTypes,
String[] partitionColumns) |
Modifier and Type | Method and Description |
---|---|
BuiltInFunctionDefinition.Builder |
BuiltInFunctionDefinition.Builder.typedArguments(DataType... argumentTypes) |
Modifier and Type | Field and Description |
---|---|
protected DataType[] |
HiveScalarFunction.argTypes |
Modifier and Type | Method and Description |
---|---|
DataType |
HiveFunction.getHiveResultType(Object[] constantArguments,
DataType[] argTypes)
Gets the result type from the constant arguments and argument types.
|
DataType |
HiveGenericUDTF.getHiveResultType(Object[] constantArguments,
DataType[] argTypes) |
DataType |
HiveGenericUDAF.getHiveResultType(Object[] constantArguments,
DataType[] argTypes) |
protected DataType |
HiveGenericUDF.inferReturnType() |
protected abstract DataType |
HiveScalarFunction.inferReturnType()
Infers the return type of this function call.
|
protected DataType |
HiveSimpleUDF.inferReturnType() |
Modifier and Type | Method and Description |
---|---|
DataType |
HiveFunction.getHiveResultType(Object[] constantArguments,
DataType[] argTypes)
Gets the result type from the constant arguments and argument types.
|
DataType |
HiveGenericUDTF.getHiveResultType(Object[] constantArguments,
DataType[] argTypes) |
DataType |
HiveGenericUDAF.getHiveResultType(Object[] constantArguments,
DataType[] argTypes) |
void |
HiveFunction.setArgumentTypesAndConstants(Object[] constantArguments,
DataType[] argTypes)
Sets the constant arguments and argument types for the function instance.
|
void |
HiveGenericUDTF.setArgumentTypesAndConstants(Object[] constantArguments,
DataType[] argTypes) |
void |
HiveGenericUDAF.setArgumentTypesAndConstants(Object[] constantArguments,
DataType[] argTypes) |
Modifier and Type | Method and Description |
---|---|
static org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector |
HiveInspectors.getObjectInspector(DataType flinkType)
Gets the Hive
ObjectInspector for a Flink DataType. |
static org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector[] |
HiveInspectors.toInspectors(HiveShim hiveShim,
Object[] args,
DataType[] argTypes)
Gets an array of ObjectInspector from the given array of arguments and their types.
|
Modifier and Type | Method and Description |
---|---|
static boolean |
HiveFunctionUtil.isSingleBoxedArray(DataType[] argTypes) |
Constructor and Description |
---|
PythonAggregateFunction(String name,
byte[] serializedAggregateFunction,
DataType[] inputTypes,
DataType resultType,
DataType accumulatorType,
PythonFunctionKind pythonFunctionKind,
boolean deterministic,
boolean takesRowAsInput,
PythonEnv pythonEnv) |
PythonTableAggregateFunction(String name,
byte[] serializedTableAggregateFunction,
DataType[] inputTypes,
DataType resultType,
DataType accumulatorType,
PythonFunctionKind pythonFunctionKind,
boolean deterministic,
boolean takesRowAsInput,
PythonEnv pythonEnv) |
Modifier and Type | Method and Description |
---|---|
DataType |
ScalaExternalQueryOperation.getPhysicalDataType() |
DataType |
JavaExternalQueryOperation.getPhysicalDataType() |
DataType |
ExternalModifyOperation.getPhysicalDataType() |
DataType |
OutputConversionModifyOperation.getType() |
Constructor and Description |
---|
ExternalModifyOperation(ObjectIdentifier tableIdentifier,
QueryOperation child,
ResolvedSchema resolvedSchema,
ChangelogMode changelogMode,
DataType physicalDataType) |
JavaExternalQueryOperation(ObjectIdentifier identifier,
DataStream<E> dataStream,
DataType physicalDataType,
boolean isTopLevelRecord,
ChangelogMode changelogMode,
ResolvedSchema resolvedSchema) |
OutputConversionModifyOperation(QueryOperation child,
DataType type,
OutputConversionModifyOperation.UpdateMode updateMode) |
ScalaExternalQueryOperation(ObjectIdentifier identifier,
DataStream<E> dataStream,
DataType physicalDataType,
boolean isTopLevelRecord,
ChangelogMode changelogMode,
ResolvedSchema resolvedSchema) |
Modifier and Type | Method and Description |
---|---|
QueryOperation |
OperationTreeBuilder.values(DataType rowType,
Expression... expressions) |
Modifier and Type | Method and Description |
---|---|
static org.apache.calcite.rel.RelNode |
DynamicSourceUtils.convertDataStreamToRel(boolean isBatchMode,
ReadableConfig config,
org.apache.flink.table.planner.calcite.FlinkRelBuilder relBuilder,
ObjectIdentifier identifier,
ResolvedSchema schema,
DataStream<?> dataStream,
DataType physicalDataType,
boolean isTopLevelRecord,
ChangelogMode changelogMode)
Converts a given
DataStream to a RelNode. |
Modifier and Type | Method and Description |
---|---|
static DataType |
HiveParserUtils.toDataType(org.apache.calcite.rel.type.RelDataType relDataType) |
Modifier and Type | Method and Description |
---|---|
static org.apache.calcite.rel.type.RelDataType |
HiveParserUtils.toRelDataType(DataType dataType,
org.apache.calcite.rel.type.RelDataTypeFactory dtFactory) |
Modifier and Type | Method and Description |
---|---|
DataType |
RexNodeExpression.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
static ValueLiteralExpression |
ExpressionBuilder.literal(Object value,
DataType type) |
static ValueLiteralExpression |
ExpressionBuilder.nullOf(DataType type) |
static TypeLiteralExpression |
ExpressionBuilder.typeLiteral(DataType type) |
Constructor and Description |
---|
RexNodeExpression(org.apache.calcite.rex.RexNode rexNode,
DataType outputDataType,
String summaryString,
String serializableString) |
Modifier and Type | Method and Description |
---|---|
DataType |
ListAggWsWithRetractAggFunction.getAccumulatorDataType() |
DataType |
ListAggWithRetractAggFunction.getAccumulatorDataType() |
DataType |
MinWithRetractAggFunction.getAccumulatorDataType() |
DataType |
LastValueWithRetractAggFunction.getAccumulatorDataType() |
DataType |
LagAggFunction.getAccumulatorDataType() |
DataType |
FirstValueWithRetractAggFunction.getAccumulatorDataType() |
DataType |
FirstValueAggFunction.getAccumulatorDataType() |
DataType |
MaxWithRetractAggFunction.getAccumulatorDataType() |
DataType |
CollectAggFunction.getAccumulatorDataType() |
DataType |
LastValueAggFunction.getAccumulatorDataType() |
DataType[] |
RankAggFunction.getAggBufferTypes() |
DataType[] |
RowNumberAggFunction.getAggBufferTypes() |
DataType[] |
CountAggFunction.getAggBufferTypes() |
DataType[] |
IncrSumWithRetractAggFunction.getAggBufferTypes() |
DataType[] |
SingleValueAggFunction.getAggBufferTypes() |
DataType[] |
LeadLagAggFunction.getAggBufferTypes() |
DataType[] |
MaxAggFunction.getAggBufferTypes() |
abstract DataType[] |
DeclarativeAggregateFunction.getAggBufferTypes()
All types of the aggregate buffer.
|
DataType[] |
MinAggFunction.getAggBufferTypes() |
DataType[] |
AvgAggFunction.getAggBufferTypes() |
DataType[] |
Sum0AggFunction.getAggBufferTypes() |
DataType[] |
ListAggFunction.getAggBufferTypes() |
DataType[] |
SumAggFunction.getAggBufferTypes() |
DataType[] |
IncrSumAggFunction.getAggBufferTypes() |
DataType[] |
DenseRankAggFunction.getAggBufferTypes() |
DataType[] |
SumWithRetractAggFunction.getAggBufferTypes() |
DataType[] |
Count1AggFunction.getAggBufferTypes() |
DataType |
ListAggWsWithRetractAggFunction.getOutputDataType() |
DataType |
ListAggWithRetractAggFunction.getOutputDataType() |
DataType |
MinWithRetractAggFunction.getOutputDataType() |
DataType |
LastValueWithRetractAggFunction.getOutputDataType() |
DataType |
LagAggFunction.getOutputDataType() |
DataType |
FirstValueWithRetractAggFunction.getOutputDataType() |
DataType |
FirstValueAggFunction.getOutputDataType() |
DataType |
MaxWithRetractAggFunction.getOutputDataType() |
DataType |
CollectAggFunction.getOutputDataType() |
DataType |
LastValueAggFunction.getOutputDataType() |
DataType |
RowNumberAggFunction.getResultType() |
DataType |
CountAggFunction.getResultType() |
DataType |
IncrSumWithRetractAggFunction.IntIncrSumWithRetractAggFunction.getResultType() |
DataType |
IncrSumWithRetractAggFunction.ByteIncrSumWithRetractAggFunction.getResultType() |
DataType |
IncrSumWithRetractAggFunction.ShortIncrSumWithRetractAggFunction.getResultType() |
DataType |
IncrSumWithRetractAggFunction.LongIncrSumWithRetractAggFunction.getResultType() |
DataType |
IncrSumWithRetractAggFunction.FloatIncrSumWithRetractAggFunction.getResultType() |
DataType |
IncrSumWithRetractAggFunction.DoubleIncrSumWithRetractAggFunction.getResultType() |
DataType |
IncrSumWithRetractAggFunction.DecimalIncrSumWithRetractAggFunction.getResultType() |
DataType |
SingleValueAggFunction.ByteSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.ShortSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.IntSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.LongSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.FloatSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.DoubleSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.BooleanSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.DecimalSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.StringSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.DateSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.TimeSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.TimestampSingleValueAggFunction.getResultType() |
DataType |
SingleValueAggFunction.TimestampLtzSingleValueAggFunction.getResultType() |
DataType |
LeadLagAggFunction.IntLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.ByteLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.ShortLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.LongLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.FloatLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.DoubleLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.BooleanLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.DecimalLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.StringLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.CharLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.DateLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.TimeLeadLagAggFunction.getResultType() |
DataType |
LeadLagAggFunction.TimestampLeadLagAggFunction.getResultType() |
DataType |
MaxAggFunction.IntMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.ByteMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.ShortMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.LongMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.FloatMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.DoubleMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.DecimalMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.BooleanMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.StringMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.DateMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.TimeMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.TimestampMaxAggFunction.getResultType() |
DataType |
MaxAggFunction.TimestampLtzMaxAggFunction.getResultType() |
abstract DataType |
DeclarativeAggregateFunction.getResultType()
The result type of the function.
|
DataType |
MinAggFunction.IntMinAggFunction.getResultType() |
DataType |
MinAggFunction.ByteMinAggFunction.getResultType() |
DataType |
MinAggFunction.ShortMinAggFunction.getResultType() |
DataType |
MinAggFunction.LongMinAggFunction.getResultType() |
DataType |
MinAggFunction.FloatMinAggFunction.getResultType() |
DataType |
MinAggFunction.DoubleMinAggFunction.getResultType() |
DataType |
MinAggFunction.DecimalMinAggFunction.getResultType() |
DataType |
MinAggFunction.BooleanMinAggFunction.getResultType() |
DataType |
MinAggFunction.StringMinAggFunction.getResultType() |
DataType |
MinAggFunction.DateMinAggFunction.getResultType() |
DataType |
MinAggFunction.TimeMinAggFunction.getResultType() |
DataType |
MinAggFunction.TimestampMinAggFunction.getResultType() |
DataType |
MinAggFunction.TimestampLtzMinAggFunction.getResultType() |
DataType |
AvgAggFunction.ByteAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.ShortAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.IntAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.LongAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.FloatAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.DoubleAvgAggFunction.getResultType() |
DataType |
AvgAggFunction.DecimalAvgAggFunction.getResultType() |
DataType |
Sum0AggFunction.IntSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.ByteSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.ShortSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.LongSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.FloatSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.DoubleSum0AggFunction.getResultType() |
DataType |
Sum0AggFunction.DecimalSum0AggFunction.getResultType() |
DataType |
ListAggFunction.getResultType() |
DataType |
SumAggFunction.IntSumAggFunction.getResultType() |
DataType |
SumAggFunction.ByteSumAggFunction.getResultType() |
DataType |
SumAggFunction.ShortSumAggFunction.getResultType() |
DataType |
SumAggFunction.LongSumAggFunction.getResultType() |
DataType |
SumAggFunction.FloatSumAggFunction.getResultType() |
DataType |
SumAggFunction.DoubleSumAggFunction.getResultType() |
DataType |
SumAggFunction.DecimalSumAggFunction.getResultType() |
DataType |
RankLikeAggFunctionBase.getResultType() |
DataType |
IncrSumAggFunction.IntIncrSumAggFunction.getResultType() |
DataType |
IncrSumAggFunction.ByteIncrSumAggFunction.getResultType() |
DataType |
IncrSumAggFunction.ShortIncrSumAggFunction.getResultType() |
DataType |
IncrSumAggFunction.LongIncrSumAggFunction.getResultType() |
DataType |
IncrSumAggFunction.FloatIncrSumAggFunction.getResultType() |
DataType |
IncrSumAggFunction.DoubleIncrSumAggFunction.getResultType() |
DataType |
IncrSumAggFunction.DecimalIncrSumAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.IntSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.ByteSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.ShortSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.LongSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.FloatSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.DoubleSumWithRetractAggFunction.getResultType() |
DataType |
SumWithRetractAggFunction.DecimalSumWithRetractAggFunction.getResultType() |
DataType |
Count1AggFunction.getResultType() |
abstract DataType |
AvgAggFunction.getSumType() |
DataType |
AvgAggFunction.ByteAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.ShortAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.IntAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.LongAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.FloatAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.DoubleAvgAggFunction.getSumType() |
DataType |
AvgAggFunction.DecimalAvgAggFunction.getSumType() |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
ListAggWsWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
ListAggWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
MinWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
LastValueWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
LagAggFunction.getArgumentDataTypes() |
List<DataType> |
FirstValueWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
FirstValueAggFunction.getArgumentDataTypes() |
List<DataType> |
MaxWithRetractAggFunction.getArgumentDataTypes() |
List<DataType> |
CollectAggFunction.getArgumentDataTypes() |
List<DataType> |
LastValueAggFunction.getArgumentDataTypes() |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
LookupCallContext.getArgumentDataTypes() |
List<DataType> |
CallBindingCallContext.getArgumentDataTypes() |
List<DataType> |
OperatorBindingCallContext.getArgumentDataTypes() |
Optional<DataType> |
LookupCallContext.getOutputDataType() |
Optional<DataType> |
CallBindingCallContext.getOutputDataType() |
Optional<DataType> |
OperatorBindingCallContext.getOutputDataType() |
Constructor and Description |
---|
HiveTableSqlFunction(FunctionIdentifier identifier,
TableFunction hiveUdtf,
DataType implicitResultType,
org.apache.flink.table.planner.calcite.FlinkTypeFactory typeFactory,
org.apache.flink.table.planner.plan.schema.FlinkTableFunction functionImpl,
HiveTableSqlFunction.HiveOperandTypeChecker operandTypeChecker)
Deprecated.
|
Constructor and Description |
---|
BatchExecBoundedStreamScan(DataStream<?> dataStream,
DataType sourceType,
int[] fieldIndexes,
List<String> qualifiedName,
RowType outputType,
String description) |
Modifier and Type | Method and Description |
---|---|
protected LogicalType[] |
StreamExecWindowAggregateBase.convertToLogicalTypes(DataType[] dataTypes) |
Constructor and Description |
---|
StreamExecDataStreamScan(DataStream<?> dataStream,
DataType sourceType,
int[] fieldIndexes,
String[] fieldNames,
List<String> qualifiedName,
RowType outputType,
String description) |
Modifier and Type | Method and Description |
---|---|
static DataViewUtils.DataViewSpec[] |
CommonPythonUtil.extractDataViewSpecs(int index,
DataType accType) |
Modifier and Type | Method and Description |
---|---|
static DataType |
DataViewUtils.adjustDataViews(DataType accumulatorDataType,
boolean hasStateBackedDataViews)
Modifies the data type of an accumulator to account for data views.
|
static DataType |
DataViewUtils.createDistinctViewDataType(DataType keyDataType,
int filterArgs,
int filterArgsLimit)
Creates a special
DataType for DISTINCT aggregates. |
static DataType |
DataViewUtils.extractElementDataTypeForListView(DataType dataType) |
DataType |
DataViewUtils.DataViewSpec.getDataType() |
DataType |
DataViewUtils.ListViewSpec.getElementDataType() |
DataType |
DataViewUtils.MapViewSpec.getKeyDataType() |
DataType |
DataViewUtils.MapViewSpec.getValueDataType() |
Modifier and Type | Method and Description |
---|---|
static DataType |
DataViewUtils.adjustDataViews(DataType accumulatorDataType,
boolean hasStateBackedDataViews)
Modifies the data type of an accumulator to account for data views.
|
static DataType |
DataViewUtils.createDistinctViewDataType(DataType keyDataType,
int filterArgs,
int filterArgsLimit)
Creates a special
DataType for DISTINCT aggregates. |
static DataViewUtils.DistinctViewSpec |
DataViewUtils.createDistinctViewSpec(int index,
DataType distinctViewDataType)
Creates a special
DataViewUtils.DistinctViewSpec for DISTINCT aggregates. |
static List<DataViewUtils.DataViewSpec> |
DataViewUtils.extractDataViews(int aggIndex,
DataType accumulatorDataType)
Searches for data views in the data type of an accumulator and extracts them.
|
static DataType |
DataViewUtils.extractElementDataTypeForListView(DataType dataType) |
static KeyValueDataType |
DataViewUtils.extractKeyValueDataTypeForMapView(DataType dataType) |
static boolean |
DataViewUtils.isListViewDataType(DataType dataType)
Checks whether the given data type represents a
ListView. |
static boolean |
DataViewUtils.isMapViewDataType(DataType dataType)
Checks whether the given data type represents a
MapView. |
Constructor and Description |
---|
DistinctViewSpec(String stateId,
DataType distinctViewDataType) |
ListViewSpec(String stateId,
int fieldIndex,
DataType dataType) |
ListViewSpec(String stateId,
int fieldIndex,
DataType dataType,
TypeSerializer<?> elementSerializer)
Deprecated.
|
MapViewSpec(String stateId,
int fieldIndex,
DataType dataType,
boolean containsNullKey) |
MapViewSpec(String stateId,
int fieldIndex,
DataType dataType,
boolean containsNullKey,
TypeSerializer<?> keySerializer,
TypeSerializer<?> valueSerializer)
Deprecated.
|
Modifier and Type | Method and Description |
---|---|
static ArrowTableSource |
ArrowUtils.createArrowTableSource(DataType dataType,
String fileName) |
Modifier and Type | Method and Description |
---|---|
DataType |
ArrowTableSource.getProducedDataType() |
Constructor and Description |
---|
ArrowTableSource(DataType dataType,
byte[][] arrowData) |
Modifier and Type | Method and Description |
---|---|
DynamicTableSink.DataStructureConverter |
SinkRuntimeProviderContext.createDataStructureConverter(DataType consumedDataType) |
TypeInformation<?> |
SinkRuntimeProviderContext.createTypeInformation(DataType consumedDataType) |
Modifier and Type | Method and Description |
---|---|
DynamicTableSource.DataStructureConverter |
LookupRuntimeProviderContext.createDataStructureConverter(DataType producedDataType) |
DynamicTableSource.DataStructureConverter |
ScanRuntimeProviderContext.createDataStructureConverter(DataType producedDataType) |
TypeInformation<?> |
LookupRuntimeProviderContext.createTypeInformation(DataType producedDataType) |
TypeInformation<?> |
ScanRuntimeProviderContext.createTypeInformation(DataType producedDataType) |
Modifier and Type | Method and Description |
---|---|
DataType |
BuiltInAggregateFunction.getAccumulatorDataType() |
DataType |
BuiltInAggregateFunction.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
BuiltInAggregateFunction.getArgumentDataTypes() |
Modifier and Type | Method and Description |
---|---|
DataType |
BuiltInScalarFunction.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
BuiltInScalarFunction.getArgumentDataTypes() |
Modifier and Type | Method and Description |
---|---|
DataType |
BuiltInTableFunction.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
BuiltInTableFunction.getArgumentDataTypes() |
Modifier and Type | Method and Description |
---|---|
static DataType |
ClassDataTypeConverter.fromClassToDataType(Class<?> clazz) |
static DataType |
LogicalTypeDataTypeConverter.fromLogicalTypeToDataType(LogicalType logicalType)
Deprecated.
|
DataType |
DataTypePrecisionFixer.visit(AtomicDataType dataType) |
DataType |
DataTypePrecisionFixer.visit(CollectionDataType collectionDataType) |
DataType |
DataTypePrecisionFixer.visit(FieldsDataType fieldsDataType) |
DataType |
DataTypePrecisionFixer.visit(KeyValueDataType keyValueDataType) |
Modifier and Type | Method and Description |
---|---|
static LogicalType |
LogicalTypeDataTypeConverter.fromDataTypeToLogicalType(DataType dataType)
Deprecated.
|
static TypeInformation<?> |
TypeInfoDataTypeConverter.fromDataTypeToTypeInfo(DataType dataType)
Deprecated.
|
Modifier and Type | Method and Description |
---|---|
DataType |
InternalTypeInfo.getDataType() |
DataType |
DecimalDataTypeInfo.getDataType() |
DataType |
ExternalSerializer.getDataType() |
DataType |
ExternalTypeInfo.getDataType() |
Modifier and Type | Method and Description |
---|---|
static <I,E> ExternalSerializer<I,E> |
ExternalSerializer.of(DataType dataType)
Creates an instance of an
ExternalSerializer defined by the given DataType. |
static <T> ExternalTypeInfo<T> |
ExternalTypeInfo.of(DataType dataType)
Creates type information for a
DataType that is possibly represented by internal data
structures but serialized and deserialized into external data structures. |
static <I,E> ExternalSerializer<I,E> |
ExternalSerializer.of(DataType dataType,
boolean isInternalInput)
Creates an instance of an
ExternalSerializer defined by the given DataType. |
static <T> ExternalTypeInfo<T> |
ExternalTypeInfo.of(DataType dataType,
boolean isInternalInput)
Creates type information for a
DataType that is possibly represented by internal data
structures but serialized and deserialized into external data structures. |
Modifier and Type | Method and Description |
---|---|
DataType |
CsvTableSink.getConsumedDataType()
Deprecated.
|
default DataType |
TableSink.getConsumedDataType()
Deprecated.
Returns the data type consumed by this
TableSink. |
Constructor and Description |
---|
CsvTableSink(String path,
String fieldDelim,
int numFiles,
FileSystem.WriteMode writeMode,
String[] fieldNames,
DataType[] fieldTypes)
Deprecated.
A simple
TableSink to emit data as CSV files. |
Modifier and Type | Method and Description |
---|---|
DataType |
CsvTableSource.getProducedDataType()
Deprecated.
|
default DataType |
TableSource.getProducedDataType()
Deprecated.
Returns the
DataType for the produced data of the TableSource. |
Modifier and Type | Method and Description |
---|---|
CsvTableSource.Builder |
CsvTableSource.Builder.field(String fieldName,
DataType fieldType)
Adds a field with the field name and the data type.
|
Modifier and Type | Method and Description |
---|---|
static ResolvedFieldReference[] |
TimestampExtractorUtils.getAccessedFields(TimestampExtractor timestampExtractor,
DataType physicalInputType,
java.util.function.Function<String,String> nameRemapping)
Retrieves all field accesses needed for the given
TimestampExtractor. |
Modifier and Type | Method and Description |
---|---|
DataType |
Column.getDataType() |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
TpcdsSchema.getFieldTypes() |
Constructor and Description |
---|
Column(String name,
DataType dataType) |
Modifier and Type | Class and Description |
---|---|
class |
AtomicDataType
A data type that does not contain further data types (e.g. INT or BOOLEAN).
|
class |
CollectionDataType
A data type that contains an element type (e.g. ARRAY or MULTISET).
|
class |
FieldsDataType
A data type that contains field data types (i.e. ROW and structured types).
|
class |
KeyValueDataType
A data type that contains a key and value data type (e.g. MAP).
|
Modifier and Type | Method and Description |
---|---|
DataType |
AtomicDataType.bridgedTo(Class<?> newConversionClass) |
DataType |
CollectionDataType.bridgedTo(Class<?> newConversionClass) |
DataType |
FieldsDataType.bridgedTo(Class<?> newConversionClass) |
DataType |
KeyValueDataType.bridgedTo(Class<?> newConversionClass) |
DataType |
DataTypeQueryable.getDataType() |
DataType |
CollectionDataType.getElementDataType() |
DataType |
KeyValueDataType.getKeyDataType() |
DataType |
KeyValueDataType.getValueDataType() |
DataType |
AtomicDataType.notNull() |
DataType |
CollectionDataType.notNull() |
DataType |
FieldsDataType.notNull() |
DataType |
KeyValueDataType.notNull() |
DataType |
AtomicDataType.nullable() |
DataType |
CollectionDataType.nullable() |
DataType |
FieldsDataType.nullable() |
DataType |
KeyValueDataType.nullable() |
DataType |
UnresolvedDataType.toDataType(DataTypeFactory factory)
Converts this instance to a resolved
DataType possibly enriched with additional
nullability and conversion class information. |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
AtomicDataType.getChildren() |
abstract List<DataType> |
DataType.getChildren() |
List<DataType> |
CollectionDataType.getChildren() |
List<DataType> |
FieldsDataType.getChildren() |
List<DataType> |
KeyValueDataType.getChildren() |
Constructor and Description |
---|
CollectionDataType(LogicalType logicalType,
Class<?> conversionClass,
DataType elementDataType) |
CollectionDataType(LogicalType logicalType,
DataType elementDataType) |
KeyValueDataType(LogicalType logicalType,
Class<?> conversionClass,
DataType keyDataType,
DataType valueDataType) |
KeyValueDataType(LogicalType logicalType,
DataType keyDataType,
DataType valueDataType) |
Constructor and Description |
---|
FieldsDataType(LogicalType logicalType,
Class<?> conversionClass,
List<DataType> fieldDataTypes) |
FieldsDataType(LogicalType logicalType,
List<DataType> fieldDataTypes) |
UnresolvedDataType(java.util.function.Supplier<String> description,
java.util.function.Function<DataTypeFactory,DataType> resolutionFactory) |
Modifier and Type | Method and Description |
---|---|
static DataType |
DataTypeExtractor.extractFromGeneric(DataTypeFactory typeFactory,
Class<?> baseClass,
int genericPos,
Type contextType)
Extracts a data type from a type variable at
genericPos of baseClass using
the information of the most specific type contextType. |
static DataType |
DataTypeExtractor.extractFromMethodOutput(DataTypeFactory typeFactory,
Class<?> baseClass,
Method method)
Extracts a data type from a method return type by considering surrounding classes and method
annotation.
|
static DataType |
DataTypeExtractor.extractFromMethodParameter(DataTypeFactory typeFactory,
Class<?> baseClass,
Method method,
int paramPos)
Extracts a data type from a method parameter by considering surrounding classes and parameter
annotation.
|
static DataType |
DataTypeExtractor.extractFromType(DataTypeFactory typeFactory,
org.apache.flink.table.types.extraction.DataTypeTemplate template,
Type type)
Extracts a data type from a type without considering surrounding classes but templates.
|
static DataType |
DataTypeExtractor.extractFromType(DataTypeFactory typeFactory,
Type type)
Extracts a data type from a type without considering surrounding classes or templates.
|
Modifier and Type | Method and Description |
---|---|
DataType |
TypeInferenceUtil.Result.getOutputDataType() |
static DataType |
TypeInferenceUtil.inferOutputType(CallContext callContext,
TypeStrategy outputTypeStrategy)
Infers an output type using the given
TypeStrategy. |
DataType |
TypeTransformation.transform(DataType typeToTransform)
Transforms the given data type to a different data type.
|
default DataType |
TypeTransformation.transform(DataTypeFactory factory,
DataType typeToTransform)
Transforms the given data type to a different data type.
|
Modifier and Type | Method and Description |
---|---|
Optional<DataType> |
TypeInferenceUtil.Result.getAccumulatorDataType() |
List<DataType> |
CallContext.getArgumentDataTypes()
Returns a resolved list of the call's argument types.
|
List<DataType> |
TypeInferenceUtil.Result.getExpectedArgumentTypes() |
Optional<DataType> |
CallContext.getOutputDataType()
Returns the inferred output data type of the function call.
|
Optional<List<DataType>> |
TypeInference.getTypedArguments() |
Optional<DataType> |
ArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure)
Main logic for inferring and validating an argument.
|
Optional<List<DataType>> |
InputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure)
Main logic for inferring and validating the input arguments.
|
Optional<DataType> |
TypeInferenceUtil.SurroundingInfo.inferOutputType(DataTypeFactory typeFactory) |
Optional<DataType> |
TypeStrategy.inferType(CallContext callContext)
Infers a type from the given function call.
|
Modifier and Type | Method and Description |
---|---|
static CallContext |
TypeInferenceUtil.adaptArguments(TypeInference typeInference,
CallContext callContext,
DataType outputType)
Adapts the call's argument if necessary.
|
static ExplicitArgumentTypeStrategy |
InputTypeStrategies.explicit(DataType expectedDataType)
Strategy for an argument that corresponds to an explicitly defined type casting.
|
static TypeStrategy |
TypeStrategies.explicit(DataType dataType)
Type strategy that returns a fixed
DataType. |
static InputTypeStrategy |
InputTypeStrategies.explicitSequence(DataType... expectedDataTypes)
Strategy for a function signature of explicitly defined types like
f(STRING, INT). |
static InputTypeStrategy |
InputTypeStrategies.explicitSequence(String[] argumentNames,
DataType[] expectedDataTypes)
Strategy for a named function signature of explicitly defined types like
f(s STRING, i INT). |
static TypeInferenceUtil.SurroundingInfo |
TypeInferenceUtil.SurroundingInfo.of(DataType dataType) |
DataType |
TypeTransformation.transform(DataType typeToTransform)
Transforms the given data type to a different data type.
|
default DataType |
TypeTransformation.transform(DataTypeFactory factory,
DataType typeToTransform)
Transforms the given data type to a different data type.
|
TypeInference.Builder |
TypeInference.Builder.typedArguments(DataType... argumentTypes) |
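Taken together, `TypeInference.Builder.typedArguments(...)` and `TypeStrategies.explicit(...)` from the table above are enough to declare a fixed, non-overloaded function signature. A minimal sketch (the signature `f(STRING, INT) -> BOOLEAN` and the class name are invented for illustration):

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.types.inference.TypeInference;
import org.apache.flink.table.types.inference.TypeStrategies;

public class FixedSignatureExample {
    public static void main(String[] args) {
        // Declares f(STRING, INT) -> BOOLEAN: typedArguments(...) pins the
        // expected input types, TypeStrategies.explicit(...) fixes the output.
        TypeInference inference =
                TypeInference.newBuilder()
                        .typedArguments(DataTypes.STRING(), DataTypes.INT())
                        .outputTypeStrategy(TypeStrategies.explicit(DataTypes.BOOLEAN()))
                        .build();

        // getTypedArguments() returns Optional<List<DataType>>, as listed above.
        System.out.println(inference.getTypedArguments());
    }
}
```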
Modifier and Type | Method and Description |
---|---|
static ConstraintArgumentTypeStrategy |
InputTypeStrategies.constraint(String constraintMessage,
java.util.function.Function<List<DataType>,Boolean> evaluator)
Strategy for an argument that must fulfill a given constraint.
|
TypeInference.Builder |
TypeInference.Builder.typedArguments(List<DataType> argumentTypes)
Sets the list of argument types for specifying a fixed, not overloaded, not vararg input
signature explicitly.
|
Constructor and Description |
---|
Result(List<DataType> expectedArgumentTypes,
DataType accumulatorDataType,
DataType outputDataType) |
Constructor and Description |
---|
Result(List<DataType> expectedArgumentTypes,
DataType accumulatorDataType,
DataType outputDataType) |
Modifier and Type | Method and Description |
---|---|
Optional<DataType> |
ExplicitArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
FamilyArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
OrArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
AndArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
RootArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
CompositeArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
LiteralArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
AnyArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
ConstraintArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
OutputArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
SymbolArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
CommonArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<DataType> |
TypeLiteralArgumentTypeStrategy.inferArgumentType(CallContext callContext,
int argumentPos,
boolean throwOnFailure) |
Optional<List<DataType>> |
CommonInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
ComparableTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
SubsequenceInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
SequenceInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
WildcardInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
OrInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<List<DataType>> |
VaryingSequenceInputTypeStrategy.inferInputTypes(CallContext callContext,
boolean throwOnFailure) |
Optional<DataType> |
VaryingStringTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
ExplicitTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
MappingTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
FirstTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
UseArgumentTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
CommonTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
MatchFamilyTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
ForceNullableTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
MissingTypeStrategy.inferType(CallContext callContext) |
Optional<DataType> |
NullableIfArgsTypeStrategy.inferType(CallContext callContext) |
Constructor and Description |
---|
ExplicitArgumentTypeStrategy(DataType expectedDataType) |
ExplicitTypeStrategy(DataType explicitDataType) |
Constructor and Description |
---|
ConstraintArgumentTypeStrategy(String constraintMessage,
java.util.function.Function<List<DataType>,Boolean> evaluator) |
Modifier and Type | Method and Description |
---|---|
DataType |
LegacyToNonLegacyTransformation.transform(DataType typeToTransform) |
DataType |
DataTypeConversionClassTransformation.transform(DataType dataType) |
DataType |
LegacyRawTypeTransformation.transform(DataType typeToTransform) |
DataType |
LegacyToNonLegacyTransformation.transform(DataTypeFactory factory,
DataType dataType) |
Modifier and Type | Method and Description |
---|---|
DataType |
LegacyToNonLegacyTransformation.transform(DataType typeToTransform) |
DataType |
DataTypeConversionClassTransformation.transform(DataType dataType) |
DataType |
LegacyRawTypeTransformation.transform(DataType typeToTransform) |
DataType |
LegacyToNonLegacyTransformation.transform(DataTypeFactory factory,
DataType dataType) |
Modifier and Type | Method and Description |
---|---|
List<DataType> |
UnknownCallContext.getArgumentDataTypes() |
List<DataType> |
AdaptedCallContext.getArgumentDataTypes() |
Optional<DataType> |
UnknownCallContext.getOutputDataType() |
Optional<DataType> |
AdaptedCallContext.getOutputDataType() |
Modifier and Type | Method and Description |
---|---|
void |
AdaptedCallContext.setExpectedArguments(List<DataType> expectedArguments) |
Constructor and Description |
---|
AdaptedCallContext(CallContext originalContext,
DataType outputDataType) |
Modifier and Type | Method and Description |
---|---|
static DataType |
DataTypeUtils.appendRowFields(DataType dataType,
List<DataTypes.Field> fields)
Appends the given list of fields to an existing row data type.
|
static DataType |
DataTypeUtils.createProctimeDataType()
Returns a PROCTIME data type.
|
static DataType |
TypeConversions.fromLegacyInfoToDataType(TypeInformation<?> typeInfo)
Deprecated.
Please don't use this method anymore. It will be removed soon and we should not
make the removal more painful.
|
static DataType[] |
TypeConversions.fromLegacyInfoToDataType(TypeInformation<?>[] typeInfo)
Deprecated.
Please don't use this method anymore. It will be removed soon and we should not
make the removal more painful.
|
static DataType |
TypeConversions.fromLogicalToDataType(LogicalType logicalType) |
static DataType[] |
TypeConversions.fromLogicalToDataType(LogicalType[] logicalTypes) |
static DataType |
DataTypeUtils.projectRow(DataType dataType,
int[] indices)
Projects a (possibly nested) row data type by returning a new data type that only includes
fields of the given indices.
|
static DataType |
DataTypeUtils.projectRow(DataType dataType,
int[][] indexPaths)
Projects a (possibly nested) row data type by returning a new data type that only includes
fields of the given index paths.
|
static DataType |
DataTypeUtils.removeTimeAttribute(DataType dataType)
Removes time attributes from the
DataType. |
static DataType |
DataTypeUtils.replaceLogicalType(DataType dataType,
LogicalType replacement)
Replaces the
LogicalType of a DataType, i.e., it keeps the bridging class. |
static DataType |
DataTypeUtils.stripRowPrefix(DataType dataType,
String prefix)
Removes a string prefix from the fields of the given row data type.
|
static DataType |
TypeInfoDataTypeConverter.toDataType(DataTypeFactory dataTypeFactory,
TypeInformation<?> typeInfo)
Converts the given
TypeInformation into DataType. |
static DataType |
TypeInfoDataTypeConverter.toDataType(DataTypeFactory dataTypeFactory,
TypeInformation<?> typeInfo,
boolean forceNullability)
Converts the given
TypeInformation into DataType but allows making all
fields nullable, independent of the nullability in the serialization stack. |
static DataType |
LogicalTypeDataTypeConverter.toDataType(LogicalType logicalType)
Returns the data type of a logical type without explicit conversions.
|
static DataType |
LegacyTypeInfoDataTypeConverter.toDataType(TypeInformation<?> typeInfo)
Deprecated.
|
static DataType |
DataTypeUtils.toInternalDataType(DataType dataType)
|
static DataType |
DataTypeUtils.toInternalDataType(LogicalType logicalType)
Creates a
DataType from the given LogicalType with internal data structures. |
static DataType |
DataTypeUtils.transform(DataTypeFactory factory,
DataType typeToTransform,
TypeTransformation... transformations)
Transforms the given data type to a different data type using the given transformations.
|
static DataType |
DataTypeUtils.transform(DataType typeToTransform,
TypeTransformation... transformations)
Transforms the given data type to a different data type using the given transformations.
|
Modifier and Type | Method and Description |
---|---|
static Optional<DataType> |
ClassDataTypeConverter.extractDataType(Class<?> clazz)
Returns the clearly identifiable data type if possible.
|
static Optional<DataType> |
ValueDataTypeConverter.extractDataType(Object value)
Returns the clearly identifiable data type if possible.
|
static List<DataType> |
DataTypeUtils.flattenToDataTypes(DataType dataType)
Returns the data types of the flat representation in the first level of the given data type.
|
static Optional<DataType> |
TypeConversions.fromClassToDataType(Class<?> clazz) |
static Optional<DataType> |
DataTypeUtils.getField(DataType compositeType,
int index)
Retrieves a nested field from a composite type at given position.
|
static Optional<DataType> |
DataTypeUtils.getField(DataType compositeType,
String name)
Retrieves a nested field from a composite type with given name.
|
Modifier and Type | Method and Description |
---|---|
static DataType |
DataTypeUtils.appendRowFields(DataType dataType,
List<DataTypes.Field> fields)
Appends the given list of fields to an existing row data type.
|
protected abstract R |
DataTypeDefaultVisitor.defaultMethod(DataType dataType) |
static ResolvedSchema |
DataTypeUtils.expandCompositeTypeToSchema(DataType dataType)
Expands a composite
DataType to a corresponding ResolvedSchema. |
static List<DataType> |
DataTypeUtils.flattenToDataTypes(DataType dataType)
Returns the data types of the flat representation in the first level of the given data type.
|
static List<String> |
DataTypeUtils.flattenToNames(DataType dataType)
Returns the names of the flat representation in the first level of the given data type.
|
static List<String> |
DataTypeUtils.flattenToNames(DataType dataType,
List<String> existingNames)
Returns the names of the flat representation in the first level of the given data type.
|
static LogicalType |
TypeConversions.fromDataToLogicalType(DataType dataType) |
static LogicalType[] |
TypeConversions.fromDataToLogicalType(DataType[] dataTypes) |
static TypeInformation<?> |
TypeConversions.fromDataTypeToLegacyInfo(DataType dataType)
Deprecated.
Please don't use this method anymore. It will be removed soon and we should not
make the removal more painful.
|
static TypeInformation<?>[] |
TypeConversions.fromDataTypeToLegacyInfo(DataType[] dataType)
Deprecated.
Please don't use this method anymore. It will be removed soon and we should not
make the removal more painful.
|
static Optional<DataType> |
DataTypeUtils.getField(DataType compositeType,
int index)
Retrieves a nested field from a composite type at given position.
|
static Optional<DataType> |
DataTypeUtils.getField(DataType compositeType,
String name)
Retrieves a nested field from a composite type with given name.
|
static boolean |
DataTypeUtils.isInternal(DataType dataType)
Checks whether a given data type is an internal data structure.
|
static DataType |
DataTypeUtils.projectRow(DataType dataType,
int[] indices)
Projects a (possibly nested) row data type by returning a new data type that only includes
fields of the given indices.
|
static DataType |
DataTypeUtils.projectRow(DataType dataType,
int[][] indexPaths)
Projects a (possibly nested) row data type by returning a new data type that only includes
fields of the given index paths.
|
static DataType |
DataTypeUtils.removeTimeAttribute(DataType dataType)
Removes time attributes from the
DataType. |
static DataType |
DataTypeUtils.replaceLogicalType(DataType dataType,
LogicalType replacement)
Replaces the
LogicalType of a DataType, i.e., it keeps the bridging class. |
static DataType |
DataTypeUtils.stripRowPrefix(DataType dataType,
String prefix)
Removes a string prefix from the fields of the given row data type.
|
static DataType |
DataTypeUtils.toInternalDataType(DataType dataType)
|
static TypeInformation<?> |
LegacyTypeInfoDataTypeConverter.toLegacyTypeInfo(DataType dataType)
Deprecated.
|
static LogicalType |
LogicalTypeDataTypeConverter.toLogicalType(DataType dataType)
Returns the logical type of a data type.
|
static DataType |
DataTypeUtils.transform(DataTypeFactory factory,
DataType typeToTransform,
TypeTransformation... transformations)
Transforms the given data type to a different data type using the given transformations.
|
static DataType |
DataTypeUtils.transform(DataType typeToTransform,
TypeTransformation... transformations)
Transforms the given data type to a different data type using the given transformations.
|
static void |
DataTypeUtils.validateInputDataType(DataType dataType)
The
DataType class can only partially verify the conversion class. |
static void |
DataTypeUtils.validateOutputDataType(DataType dataType)
The
DataType class can only partially verify the conversion class. |
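The non-deprecated conversion pair `fromDataToLogicalType`/`fromLogicalToDataType` listed above round-trips between the two type stacks; a minimal sketch (assuming default conversion classes — the logical-to-data direction restores defaults, so bridging information set via `bridgedTo` is not preserved):

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.types.DataType;
import org.apache.flink.table.types.logical.LogicalType;
import org.apache.flink.table.types.utils.TypeConversions;

public class ConversionRoundTrip {
    public static void main(String[] args) {
        DataType dataType = DataTypes.ARRAY(DataTypes.INT().notNull());

        // DataType -> LogicalType drops the conversion-class information.
        LogicalType logical = TypeConversions.fromDataToLogicalType(dataType);

        // LogicalType -> DataType re-attaches default conversion classes.
        DataType roundTripped = TypeConversions.fromLogicalToDataType(logical);

        System.out.println(logical.asSummaryString());
        System.out.println(roundTripped);
    }
}
```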
Modifier and Type | Method and Description |
---|---|
DataType |
TimeIntervalTypeInfo.getDataType()
Deprecated.
|
DataType[] |
FieldInfoUtils.TypeInfoSchema.getFieldTypes() |
Modifier and Type | Method and Description |
---|---|
static int[] |
TypeMappingUtils.computePhysicalIndices(List<TableColumn> logicalColumns,
DataType physicalType,
java.util.function.Function<String,String> nameRemapping)
Computes indices of physical fields corresponding to the selected logical fields of a
TableSchema. |
static Object |
PartitionPathUtils.convertStringToInternalValue(String valStr,
DataType type)
Restores the partition value from its string representation using the given type.
|
static GenericRowData |
PartitionPathUtils.fillPartitionValueForRecord(String[] fieldNames,
DataType[] fieldTypes,
int[] selectFields,
List<String> partitionKeys,
Path path,
String defaultPartValue)
Extracts partition values from the path and fills them into the record.
|
Copyright © 2014–2023 The Apache Software Foundation. All rights reserved.