SQL Functions

Introduction

Ascend uses Spark SQL syntax. This page offers a list of functions supported by the Ascend platform.

❗️

These Functions are for Ascend's Legacy SQL Operator

With the general availability of Spark SQL, Ascend supports the same functions as Spark's own SQL Functions. The list below is the historical set of functions supported by Ascend's legacy SQL Transform.

Aggregate functions

ANY

Description

any(expr) - Returns true if at least one value of expr is true.

Argument typeReturn type
( Bool )Bool

More

ANY on Apache Spark Documentation

SOME

Description

some(expr) - Returns true if at least one value of expr is true.

Argument typeReturn type
( Bool )Bool

More

SOME on Apache Spark Documentation

BOOL_OR

Description

bool_or(expr) - Returns true if at least one value of expr is true.

Argument typeReturn type
( Bool )Bool

More

BOOL_OR on Apache Spark Documentation

BOOL_AND

Description

bool_and(expr) - Returns true if all values of expr are true.

Argument typeReturn type
( Bool )Bool

More

BOOL_AND on Apache Spark Documentation

EVERY

Description

every(expr) - Returns true if all values of expr are true.

Argument typeReturn type
( Bool )Bool

More

EVERY on Apache Spark Documentation

AVG

Description

avg(expr) - Returns the mean calculated from values of a group.

Argument typeReturn type
( Integer or Float )Float

More

AVG on Apache Spark Documentation

BIT_OR

Description

bit_or(expr) - Returns the bitwise OR of all non-null input values, or null if none.

Argument typeReturn type
( Integer )Integer

More

BIT_OR on Apache Spark Documentation

BIT_AND

Description

bit_and(expr) - Returns the bitwise AND of all non-null input values, or null if none.

Argument typeReturn type
( Integer )Integer

More

BIT_AND on Apache Spark Documentation

BIT_XOR

Description

bit_xor(expr) - Returns the bitwise XOR of all non-null input values, or null if none.

Argument typeReturn type
( Integer )Integer

More

BIT_XOR on Apache Spark Documentation

MEAN

Description

mean(expr) - Returns the mean calculated from values of a group.

Argument typeReturn type
( Integer or Float )Float

More

MEAN on Apache Spark Documentation

COUNT

Description

count(*) - Returns the total number of retrieved rows, including rows containing null.
count(expr) - Returns the number of rows for which the supplied expression is non-null.
count(DISTINCT expr[, expr...]) - Returns the number of rows for which the supplied expression(s) are unique and non-null.

Argument typeReturn type
( Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map )Integer

More

COUNT on Apache Spark Documentation
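The three forms above can be contrasted in one query (an informal sketch; the inline VALUES table is hypothetical):

```sql
-- Three rows, one of which is NULL in col1.
SELECT
  count(*)             AS all_rows,      -- 3: counts every row, NULL included
  count(col1)          AS non_null_rows, -- 2: skips the NULL
  count(DISTINCT col1) AS distinct_rows  -- 1: unique, non-null values only
FROM VALUES (1), (1), (NULL) AS t(col1);
```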

COUNT_IF

Description

count_if(expr) - Returns the number of TRUE values for the expression.

Argument typeReturn type
( Bool )Integer

More

COUNT_IF on Apache Spark Documentation

MAX

Description

max(expr) - Returns the maximum value of expr.

Argument typeReturn type
( Integer )Integer
( Float )Float
( Bool )Bool
( String )String
( Binary )Binary
( Date )Date
( Timestamp )Timestamp

More

MAX on Apache Spark Documentation

MAX_BY

Description

max_by(x, y) - Returns the value of x associated with the maximum value of y.

Argument typeReturn type
( Integer, Integer or Float or Bool or String or Binary or Date or Timestamp )Integer
( Float, Integer or Float or Bool or String or Binary or Date or Timestamp )Float
( Bool, Integer or Float or Bool or String or Binary or Date or Timestamp )Bool
( String, Integer or Float or Bool or String or Binary or Date or Timestamp )String
( Binary, Integer or Float or Bool or String or Binary or Date or Timestamp )Binary
( Date, Integer or Float or Bool or String or Binary or Date or Timestamp )Date
( Timestamp, Integer or Float or Bool or String or Binary or Date or Timestamp )Timestamp
( Array, Integer or Float or Bool or String or Binary or Date or Timestamp )Array
( Struct, Integer or Float or Bool or String or Binary or Date or Timestamp )Struct
( Map, Integer or Float or Bool or String or Binary or Date or Timestamp )Map

More

MAX_BY on Apache Spark Documentation
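For example, max_by can pick the attribute that accompanies a maximum without a self-join (a sketch; the inline VALUES table is hypothetical):

```sql
-- Returns 'b', the name paired with the highest price (50).
SELECT max_by(name, price)
FROM VALUES ('a', 10), ('b', 50), ('c', 20) AS t(name, price);
```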

MIN_BY

Description

min_by(x, y) - Returns the value of x associated with the minimum value of y.

Argument typeReturn type
( Integer, Integer or Float or Bool or String or Binary or Date or Timestamp )Integer
( Float, Integer or Float or Bool or String or Binary or Date or Timestamp )Float
( Bool, Integer or Float or Bool or String or Binary or Date or Timestamp )Bool
( String, Integer or Float or Bool or String or Binary or Date or Timestamp )String
( Binary, Integer or Float or Bool or String or Binary or Date or Timestamp )Binary
( Date, Integer or Float or Bool or String or Binary or Date or Timestamp )Date
( Timestamp, Integer or Float or Bool or String or Binary or Date or Timestamp )Timestamp
( Array, Integer or Float or Bool or String or Binary or Date or Timestamp )Array
( Struct, Integer or Float or Bool or String or Binary or Date or Timestamp )Struct
( Map, Integer or Float or Bool or String or Binary or Date or Timestamp )Map

More

MIN_BY on Apache Spark Documentation

MIN

Description

min(expr) - Returns the minimum value of expr.

Argument typeReturn type
( Integer )Integer
( Float )Float
( Bool )Bool
( String )String
( Binary )Binary
( Date )Date
( Timestamp )Timestamp

More

MIN on Apache Spark Documentation

SUM

Description

sum(expr) - Returns the sum calculated from values of a group.

Argument typeReturn type
( Integer )Integer
( Float )Float

More

SUM on Apache Spark Documentation

COLLECT_LIST

Description

collect_list(expr) - Collects and returns a list of non-unique elements.

Argument typeReturn type
( Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map )Array

More

COLLECT_LIST on Apache Spark Documentation

COLLECT_SET

Description

collect_set(expr) - Collects and returns a set of unique elements.

Argument typeReturn type
( Integer or Bool or String or Binary or Date or Timestamp )Array

More

COLLECT_SET on Apache Spark Documentation
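The difference between the two collectors is whether duplicates survive (a sketch; the inline VALUES table is hypothetical, and element order in both results is not guaranteed):

```sql
-- collect_list keeps duplicates; collect_set de-duplicates.
SELECT collect_list(col) AS lst,  -- e.g. [1, 2, 1]
       collect_set(col)  AS st    -- e.g. [1, 2]
FROM VALUES (1), (2), (1) AS t(col);
```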

FIRST

Description

first(expr[, isIgnoreNull]) - Returns the first value of expr for a group of rows. If isIgnoreNull is true, returns only non-null values.

Argument typeReturn type
( Integer )Integer
( Integer, Bool )Integer
( Float )Float
( Float, Bool )Float
( Bool )Bool
( Bool, Bool )Bool
( String )String
( String, Bool )String
( Binary )Binary
( Binary, Bool )Binary
( Date )Date
( Date, Bool )Date
( Timestamp )Timestamp
( Timestamp, Bool )Timestamp
( Array )Array
( Array, Bool )Array
( Struct )Struct
( Struct, Bool )Struct
( Map )Map
( Map, Bool )Map

More

FIRST on Apache Spark Documentation
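A sketch of the isIgnoreNull flag (hypothetical inline table; note that without an explicit ordering, "first" is non-deterministic in Spark):

```sql
-- With isIgnoreNull = true, the leading NULL is skipped and 5 is returned.
SELECT first(col, true)
FROM VALUES (NULL), (5), (20) AS t(col);
```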

FIRST_VALUE

Description

first_value(expr[, isIgnoreNull]) - Returns the first value of expr for a group of rows. If isIgnoreNull is true, returns only non-null values.

Argument typeReturn type
( Integer )Integer
( Integer, Bool )Integer
( Float )Float
( Float, Bool )Float
( Bool )Bool
( Bool, Bool )Bool
( String )String
( String, Bool )String
( Binary )Binary
( Binary, Bool )Binary
( Date )Date
( Date, Bool )Date
( Timestamp )Timestamp
( Timestamp, Bool )Timestamp
( Array )Array
( Array, Bool )Array
( Struct )Struct
( Struct, Bool )Struct
( Map )Map
( Map, Bool )Map

More

FIRST_VALUE on Apache Spark Documentation

LAST

Description

last(expr[, isIgnoreNull]) - Returns the last value of expr for a group of rows. If isIgnoreNull is true, returns only non-null values.

Argument typeReturn type
( Integer )Integer
( Integer, Bool )Integer
( Float )Float
( Float, Bool )Float
( Bool )Bool
( Bool, Bool )Bool
( String )String
( String, Bool )String
( Binary )Binary
( Binary, Bool )Binary
( Date )Date
( Date, Bool )Date
( Timestamp )Timestamp
( Timestamp, Bool )Timestamp
( Array )Array
( Array, Bool )Array
( Struct )Struct
( Struct, Bool )Struct
( Map )Map
( Map, Bool )Map

More

LAST on Apache Spark Documentation

LAST_VALUE

Description

last_value(expr[, isIgnoreNull]) - Returns the last value of expr for a group of rows. If isIgnoreNull is true, returns only non-null values.

Argument typeReturn type
( Integer )Integer
( Integer, Bool )Integer
( Float )Float
( Float, Bool )Float
( Bool )Bool
( Bool, Bool )Bool
( String )String
( String, Bool )String
( Binary )Binary
( Binary, Bool )Binary
( Date )Date
( Date, Bool )Date
( Timestamp )Timestamp
( Timestamp, Bool )Timestamp
( Array )Array
( Array, Bool )Array
( Struct )Struct
( Struct, Bool )Struct
( Map )Map
( Map, Bool )Map

More

LAST_VALUE on Apache Spark Documentation

Statistical Aggregate Functions

CORR

Description

corr(expr1, expr2) - Returns the Pearson correlation coefficient between a set of number pairs.

Argument typeReturn type
( Float, Float )Float

More

CORR on Apache Spark Documentation

COVAR_POP

Description

covar_pop(expr1, expr2) - Returns the population covariance of a set of number pairs.

Argument typeReturn type
( Float, Float )Float

More

COVAR_POP on Apache Spark Documentation

COVAR_SAMP

Description

covar_samp(expr1, expr2) - Returns the sample covariance of a set of number pairs.

Argument typeReturn type
( Float, Float )Float

More

COVAR_SAMP on Apache Spark Documentation

KURTOSIS

Description

kurtosis(expr) - Returns the kurtosis value calculated from values of a group.

Argument typeReturn type
( Float )Float

More

KURTOSIS on Apache Spark Documentation

SKEWNESS

Description

skewness(expr) - Returns the skewness value calculated from values of a group.

Argument typeReturn type
( Float )Float

More

SKEWNESS on Apache Spark Documentation

STDDEV_POP

Description

stddev_pop(expr) - Returns the population standard deviation calculated from values of a group.

Argument typeReturn type
( Float )Float

More

STDDEV_POP on Apache Spark Documentation

STDDEV_SAMP

Description

stddev_samp(expr) - Returns the sample standard deviation calculated from values of a group.

Argument typeReturn type
( Float )Float

More

STDDEV_SAMP on Apache Spark Documentation

STD

Description

std(expr) - Returns the sample standard deviation calculated from values of a group.

Argument typeReturn type
( Float )Float

More

STD on Apache Spark Documentation

STDDEV

Description

stddev(expr) - Returns the sample standard deviation calculated from values of a group.

Argument typeReturn type
( Float )Float

More

STDDEV on Apache Spark Documentation

VAR_POP

Description

var_pop(expr) - Returns the population variance calculated from values of a group.

Argument typeReturn type
( Float )Float

More

VAR_POP on Apache Spark Documentation

VAR_SAMP

Description

var_samp(expr) - Returns the sample variance calculated from values of a group.

Argument typeReturn type
( Float )Float

More

VAR_SAMP on Apache Spark Documentation

VARIANCE

Description

variance(expr) - Returns the sample variance calculated from values of a group.

Argument typeReturn type
( Float )Float

More

VARIANCE on Apache Spark Documentation

PERCENTILE

Description

percentile(col, percentage [, frequency]) - Returns the exact percentile value of numeric column col at the given percentage. The value of percentage must be between 0.0 and 1.0. The value of frequency must be a positive integer.

Argument typeReturn type
( Float, Float )Float
( Float, Float, Integer )Float

More

PERCENTILE on Apache Spark Documentation
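A minimal sketch of the exact percentile (hypothetical inline table); note that percentile interpolates between values:

```sql
-- Exact median (percentage = 0.5) of 0 and 10, interpolated to 5.0.
SELECT percentile(col, 0.5)
FROM VALUES (0), (10) AS t(col);
```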

PERCENTILE_APPROX

Description

percentile_approx(col, percentage [, accuracy]) - Returns the approximate percentile value of numeric column col at the given percentage. The value of percentage must be between 0.0 and 1.0. The accuracy parameter (default: 10000) is a positive numeric literal that controls approximation accuracy at the cost of memory. A higher value of accuracy yields better accuracy; 1.0/accuracy is the relative error of the approximation.

Argument typeReturn type
( Float, Float )Float
( Float, Float, Integer )Float

More

PERCENTILE_APPROX on Apache Spark Documentation

APPROX_PERCENTILE

Description

approx_percentile(col, percentage [, accuracy]) - Returns the approximate percentile value of numeric column col at the given percentage. The value of percentage must be between 0.0 and 1.0. The accuracy parameter (default: 10000) is a positive numeric literal that controls approximation accuracy at the cost of memory. A higher value of accuracy yields better accuracy; 1.0/accuracy is the relative error of the approximation.

Argument typeReturn type
( Float, Float )Float
( Float, Float, Integer )Float

More

APPROX_PERCENTILE on Apache Spark Documentation

Approximate Aggregate Functions

APPROX_COUNT_DISTINCT

Description

approx_count_distinct(expr[, relativeSD]) - Returns the estimated cardinality by HyperLogLog++. relativeSD defines the maximum estimation error allowed.

Argument typeReturn type
( Integer or Bool or String or Binary or Date or Timestamp )Integer
( Integer or Bool or String or Binary or Date or Timestamp, Float )Integer

More

APPROX_COUNT_DISTINCT on Apache Spark Documentation
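A sketch of the estimator with an explicit relativeSD (hypothetical inline table; being an estimate, the result is close to, and for tiny inputs typically equal to, the exact distinct count of 3):

```sql
-- relativeSD = 0.05 caps the allowed estimation error.
SELECT approx_count_distinct(col, 0.05)
FROM VALUES (1), (1), (2), (3) AS t(col);
```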

COUNT_MIN_SKETCH

Description

count_min_sketch(col, eps, confidence, seed) - Returns a count-min sketch of a column with the given eps, confidence and seed. The result is an array of bytes, which can be deserialized to a CountMinSketch before usage. Count-min sketch is a probabilistic data structure used for cardinality estimation using sub-linear space.

Argument typeReturn type
( Integer or Bool or String or Binary or Date or Timestamp, Float, Float, Integer )Binary

More

COUNT_MIN_SKETCH on Apache Spark Documentation

Misc Functions

ISNULL

Description

isnull(expr) - Returns true if expr is null, or false otherwise.

Argument typeReturn type
( Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map )Bool

More

ISNULL on Apache Spark Documentation

ISNOTNULL

Description

isnotnull(expr) - Returns true if expr is not null, or false otherwise.

Argument typeReturn type
( Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map )Bool

More

ISNOTNULL on Apache Spark Documentation

NULLIF

Description

nullif(expr1, expr2) - Returns null if expr1 equals to expr2, or expr1 otherwise.

Argument typeReturn type
( Integer, Integer )Integer
( Float, Float )Float
( Bool, Bool )Bool
( String, String )String
( Binary, Binary )Binary
( Date, Date )Date
( Timestamp, Timestamp )Timestamp
( Array, Array )Array
( Struct, Struct )Struct
( Map, Map )Map

More

NULLIF on Apache Spark Documentation
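A common use of nullif is guarding against division by zero (a sketch; the 'orders' table and its columns are hypothetical):

```sql
-- nullif(quantity, 0) turns a zero divisor into NULL,
-- so the division yields NULL instead of an error.
SELECT total / nullif(quantity, 0) AS unit_price
FROM orders;
```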

NOT

Description

not(expr) - Logical not.

Argument typeReturn type
( Bool )Bool

More

NOT on Apache Spark Documentation

IF

Description

if(expr1, expr2, expr3) - If expr1 evaluates to true, then returns expr2; otherwise returns expr3.

Argument typeReturn type
( Bool, Integer, Integer )Integer
( Bool, Float, Float )Float
( Bool, Bool, Bool )Bool
( Bool, String, String )String
( Bool, Binary, Binary )Binary
( Bool, Date, Date )Date
( Bool, Timestamp, Timestamp )Timestamp
( Bool, Array, Array )Array
( Bool, Struct, Struct )Struct
( Bool, Map, Map )Map

More

IF on Apache Spark Documentation

IFNULL

Description

ifnull(expr1, expr2) - Returns expr2 if expr1 is null, or expr1 otherwise.

Argument typeReturn type
( Integer, Integer )Integer
( Float, Float )Float
( Bool, Bool )Bool
( String, String )String
( Binary, Binary )Binary
( Date, Date )Date
( Timestamp, Timestamp )Timestamp
( Array, Array )Array
( Struct, Struct )Struct
( Map, Map )Map

More

IFNULL on Apache Spark Documentation

NVL

Description

nvl(expr1, expr2) - Returns expr2 if expr1 is null, or expr1 otherwise.

Argument typeReturn type
( Integer, Integer )Integer
( Float, Float )Float
( Bool, Bool )Bool
( String, String )String
( Binary, Binary )Binary
( Date, Date )Date
( Timestamp, Timestamp )Timestamp
( Array, Array )Array
( Struct, Struct )Struct
( Map, Map )Map

More

NVL on Apache Spark Documentation

NVL2

Description

nvl2(expr1, expr2, expr3) - Returns expr2 if expr1 is not null, or expr3 otherwise.

Argument typeReturn type
( Integer, Integer, Integer )Integer
( Float, Float, Float )Float
( Bool, Bool, Bool )Bool
( String, String, String )String
( Binary, Binary, Binary )Binary
( Date, Date, Date )Date
( Timestamp, Timestamp, Timestamp )Timestamp
( Array, Array, Array )Array
( Struct, Struct, Struct )Struct
( Map, Map, Map )Map

More

NVL2 on Apache Spark Documentation

COALESCE

Description

coalesce(expr1, expr2, ...) - Returns the first non-null argument if one exists; otherwise, null.

Argument typeReturn type
( Integer )Integer
( Float )Float
( Bool )Bool
( String )String
( Binary )Binary
( Date )Date
( Timestamp )Timestamp
( Array )Array
( Struct )Struct
( Map )Map

More

COALESCE on Apache Spark Documentation
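A minimal sketch of coalesce scanning its arguments left to right:

```sql
-- Returns 3, the first non-null argument.
SELECT coalesce(NULL, NULL, 3);
```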

RAISE_ERROR

Description

raise_error(msg) - Raises an error with the specified string message. The function always returns a value of string data type.

Argument typeReturn type
( String )String

More

RAISE_ERROR on Apache Spark Documentation

ASSERT_TRUE

Description

assert_true(expr) - Throws an exception if expr is not true.

Argument typeReturn type
( Bool )String

More

ASSERT_TRUE on Apache Spark Documentation

TYPEOF

Description

typeof(expr) - Return DDL-formatted type string for the data type of the input.

Argument typeReturn type
( Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map )String

More

TYPEOF on Apache Spark Documentation

Cast Functions

BIGINT

Description

bigint(expr) - Casts the value expr to the target data type bigint.

Argument typeReturn type
( Integer or Float or Bool or String )Integer

More

BIGINT on Apache Spark Documentation

DATE

Description

date(expr) - Casts the value expr to the target data type date.

Argument typeReturn type
( String or Date or Timestamp )Date

More

DATE on Apache Spark Documentation

DOUBLE

Description

double(expr) - Casts the value expr to the target data type double.

Argument typeReturn type
( Integer or Float or Bool or String )Float

More

DOUBLE on Apache Spark Documentation

INT

Description

int(expr) - Casts the value expr to the target data type int.

Argument typeReturn type
( Integer or Float or Bool or String )Integer

More

INT on Apache Spark Documentation

TIMESTAMP

Description

timestamp(expr) - Casts the value expr to the target data type timestamp.

Argument typeReturn type
( String )Timestamp

More

TIMESTAMP on Apache Spark Documentation

BINARY

Description

binary(expr) - Casts the value expr to the target data type binary.

Argument typeReturn type
( Integer or String or Binary )Binary

More

BINARY on Apache Spark Documentation

BOOLEAN

Description

boolean(expr) - Casts the value expr to the target data type boolean.

Argument typeReturn type
( Integer or Float or Bool or String or Date or Timestamp )Bool

More

BOOLEAN on Apache Spark Documentation

DECIMAL

Description

decimal(expr) - Casts the value expr to the target data type decimal.

Argument typeReturn type
( Integer or Float or Bool or String )Float

More

DECIMAL on Apache Spark Documentation

FLOAT

Description

float(expr) - Casts the value expr to the target data type float.

Argument typeReturn type
( Integer or Float or Bool or String )Float

More

FLOAT on Apache Spark Documentation

SMALLINT

Description

smallint(expr) - Casts the value expr to the target data type smallint.

Argument typeReturn type
( Integer or Float or Bool or String )Integer

More

SMALLINT on Apache Spark Documentation

STRING

Description

string(expr) - Casts the value expr to the target data type string.

Argument typeReturn type
( Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map )String

More

STRING on Apache Spark Documentation
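The cast functions above apply directly to literals or columns. A sketch:

```sql
SELECT int('42')          AS i,  -- 42
       double('3.14')     AS d,  -- 3.14
       date('2021-03-01') AS dt, -- DATE '2021-03-01'
       boolean('true')    AS b;  -- true
```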

Analytic Functions: Numbering Functions

RANK

Description

rank() - Computes the rank of a value in a group of values. The result is one plus the number of rows preceding or equal to the current row in the ordering of the partition. Tied values produce gaps in the sequence.

Argument typeReturn type
No argumentsInteger

More

RANK on Apache Spark Documentation

DENSE_RANK

Description

dense_rank() - Computes the rank of a value in a group of values. The result is one plus the previously assigned rank value. Unlike the function rank, dense_rank will not produce gaps in the ranking sequence.

Argument typeReturn type
No argumentsInteger

More

DENSE_RANK on Apache Spark Documentation

PERCENT_RANK

Description

percent_rank() - Computes the percentage ranking of a value in a group of values.

Argument typeReturn type
No argumentsFloat

More

PERCENT_RANK on Apache Spark Documentation

CUME_DIST

Description

cume_dist() - Computes the position of a value relative to all values in the partition.

Argument typeReturn type
No argumentsFloat

More

CUME_DIST on Apache Spark Documentation

NTILE

Description

ntile(n) - Divides the rows for each window partition into n buckets ranging from 1 to at most n.

Argument typeReturn type
( Integer )Integer

More

NTILE on Apache Spark Documentation

ROW_NUMBER

Description

row_number() - Assigns a unique, sequential number to each row, starting with one, according to the ordering of rows within the window partition.

Argument typeReturn type
No argumentsInteger

More

ROW_NUMBER on Apache Spark Documentation
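The three numbering functions above differ only in how they treat ties, which a single windowed query makes visible (a sketch; 'employees' and its columns are a hypothetical table):

```sql
-- rank leaves gaps after ties, dense_rank does not,
-- and row_number is always unique within a partition.
SELECT dept, salary,
       rank()       OVER (PARTITION BY dept ORDER BY salary DESC) AS rnk,
       dense_rank() OVER (PARTITION BY dept ORDER BY salary DESC) AS drnk,
       row_number() OVER (PARTITION BY dept ORDER BY salary DESC) AS rn
FROM employees;
```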

BIN

Description

bin(expr) - Returns the string representation of the long value expr represented in binary.

Argument typeReturn type
( Integer )String

More

BIN on Apache Spark Documentation

CONV

Description

conv(num, from_base, to_base) - Converts num from from_base to to_base.

Argument typeReturn type
( String, Integer, Integer )String

More

CONV on Apache Spark Documentation
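A sketch of base conversion; note that conv takes and returns strings:

```sql
SELECT conv('100', 2, 10);  -- '4'
SELECT conv('255', 10, 16); -- 'FF'
```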

MONOTONICALLY_INCREASING_ID

Description

monotonically_increasing_id() - Returns monotonically increasing 64-bit integers. The generated ID is guaranteed to be monotonically increasing and unique, but not consecutive. The current implementation puts the partition ID in the upper 31 bits, and the lower 33 bits represent the record number within each partition. The assumption is that the data frame has less than 1 billion partitions, and each partition has less than 8 billion records.

Argument typeReturn type
No argumentsInteger

More

MONOTONICALLY_INCREASING_ID on Apache Spark Documentation

Analytic Functions: Navigation Functions

LEAD

Description

lead(input[, offset[, default]]) - Returns the value of input at the offsetth row after the current row in the window. The default value of offset is 1 and the default value of default is null. If the value of input at the offsetth row is null, null is returned. If there is no such an offset row (e.g., when the offset is 1, the last row of the window does not have any subsequent row), default is returned.

Argument typeReturn type
( Integer )Integer
( Integer, Integer )Integer
( Integer, Integer, Integer )Integer
( Float )Float
( Float, Integer )Float
( Float, Integer, Float )Float
( Bool )Bool
( Bool, Integer )Bool
( Bool, Integer, Bool )Bool
( String )String
( String, Integer )String
( String, Integer, String )String
( Binary )Binary
( Binary, Integer )Binary
( Binary, Integer, Binary )Binary
( Date )Date
( Date, Integer )Date
( Date, Integer, Date )Date
( Timestamp )Timestamp
( Timestamp, Integer )Timestamp
( Timestamp, Integer, Timestamp )Timestamp
( Array )Array
( Array, Integer )Array
( Array, Integer, Array )Array
( Struct )Struct
( Struct, Integer )Struct
( Struct, Integer, Struct )Struct
( Map )Map
( Map, Integer )Map
( Map, Integer, Map )Map

More

LEAD on Apache Spark Documentation

LAG

Description

lag(input[, offset[, default]]) - Returns the value of input at the offsetth row before the current row in the window. The default value of offset is 1 and the default value of default is null. If the value of input at the offsetth row is null, null is returned. If there is no such offset row (e.g., when the offset is 1, the first row of the window does not have any previous row), default is returned.

Argument typeReturn type
( Integer )Integer
( Integer, Integer )Integer
( Integer, Integer, Integer )Integer
( Float )Float
( Float, Integer )Float
( Float, Integer, Float )Float
( Bool )Bool
( Bool, Integer )Bool
( Bool, Integer, Bool )Bool
( String )String
( String, Integer )String
( String, Integer, String )String
( Binary )Binary
( Binary, Integer )Binary
( Binary, Integer, Binary )Binary
( Date )Date
( Date, Integer )Date
( Date, Integer, Date )Date
( Timestamp )Timestamp
( Timestamp, Integer )Timestamp
( Timestamp, Integer, Timestamp )Timestamp
( Array )Array
( Array, Integer )Array
( Array, Integer, Array )Array
( Struct )Struct
( Struct, Integer )Struct
( Struct, Integer, Struct )Struct
( Map )Map
( Map, Integer )Map
( Map, Integer, Map )Map

More

LAG on Apache Spark Documentation
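lead and lag are typically used together to compare a row with its neighbors (a sketch; 'daily_sales' and its columns are a hypothetical table):

```sql
-- The explicit default 0 fills the edges of the window,
-- where no previous or next row exists.
SELECT day, sales,
       lag(sales, 1, 0)  OVER (ORDER BY day) AS prev_sales,
       lead(sales, 1, 0) OVER (ORDER BY day) AS next_sales
FROM daily_sales;
```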

Bit Functions

BIT_COUNT

Description

bit_count(expr) - Returns the number of bits that are set in the argument expr as an unsigned 64-bit integer, or NULL if the argument is NULL.

Argument typeReturn type
( Integer or Bool )Integer

More

BIT_COUNT on Apache Spark Documentation

SHIFTLEFT

Description

shiftleft(base, expr) - Bitwise left shift.

Argument typeReturn type
( Integer, Integer )Integer

More

SHIFTLEFT on Apache Spark Documentation

SHIFTRIGHT

Description

shiftright(base, expr) - Bitwise (signed) right shift.

Argument typeReturn type
( Integer, Integer )Integer

More

SHIFTRIGHT on Apache Spark Documentation

SHIFTRIGHTUNSIGNED

Description

shiftrightunsigned(base, expr) - Bitwise unsigned right shift.

Argument typeReturn type
( Integer, Integer )Integer

More

SHIFTRIGHTUNSIGNED on Apache Spark Documentation
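A sketch of the shift operators on small literals:

```sql
SELECT shiftleft(2, 3),    -- 16: 2 * 2^3
       shiftright(16, 2),  -- 4: 16 / 2^2
       shiftright(-16, 2); -- -4: the sign bit is preserved
-- shiftrightunsigned differs from shiftright only for negative inputs,
-- where zero bits are shifted in instead of copies of the sign bit.
```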

Mathematical functions

E

Description

e() - Returns Euler's number, e.

Argument typeReturn type
No argumentsFloat

More

E on Apache Spark Documentation

PI

Description

pi() - Returns pi.

Argument typeReturn type
No argumentsFloat

More

PI on Apache Spark Documentation

ABS

Description

abs(expr) - Returns the absolute value of the numeric value.

Argument typeReturn type
( Integer )Integer
( Float )Float

More

ABS on Apache Spark Documentation

NEGATIVE

Description

negative(expr) - Returns the negated value of expr.

Argument typeReturn type
( Integer )Integer
( Float )Float

More

NEGATIVE on Apache Spark Documentation

POSITIVE

Description

positive(expr) - Returns the value of expr.

Argument typeReturn type
( Integer )Integer
( Float )Float

More

POSITIVE on Apache Spark Documentation

SIGN

Description

sign(expr) - Returns -1.0, 0.0 or 1.0 depending on whether expr is negative, zero, or positive.

Argument typeReturn type
( Integer or Float )Float

More

SIGN on Apache Spark Documentation

SIGNUM

Description

signum(expr) - Returns -1.0, 0.0 or 1.0 depending on whether expr is negative, zero, or positive.

Argument typeReturn type
( Integer or Float )Float

More

SIGNUM on Apache Spark Documentation

ISNAN

Description

isnan(expr) - Returns true if expr is NaN, or false otherwise.

Argument typeReturn type
( Float )Bool

More

ISNAN on Apache Spark Documentation

RAND

Description

rand([seed]) - Returns a random value drawn independently and identically distributed (i.i.d.) from the uniform distribution on [0, 1).

Argument typeReturn type
No argumentsFloat
( Integer )Float

More

RAND on Apache Spark Documentation

RANDN

Description

randn([seed]) - Returns a random value drawn independently and identically distributed (i.i.d.) from the standard normal distribution.

Argument typeReturn type
No argumentsFloat
( Integer )Float

More

RANDN on Apache Spark Documentation

UUID

Description

uuid() - Returns a universally unique identifier (UUID) string. The value is returned as a canonical 36-character UUID string.

Argument typeReturn type
No argumentsString

More

UUID on Apache Spark Documentation

SQRT

Description

sqrt(expr) - Returns the square root of expr.

Argument typeReturn type
( Float )Float

More

SQRT on Apache Spark Documentation

CBRT

Description

cbrt(expr) - Returns the cube root of expr.

Argument typeReturn type
( Float )Float

More

CBRT on Apache Spark Documentation

HYPOT

Description

hypot(expr1, expr2) - Returns sqrt(expr1^2 + expr2^2).

Argument typeReturn type
( Float, Float )Float

More

HYPOT on Apache Spark Documentation

POW

Description

pow(expr1, expr2) - Raises expr1 to the power of expr2.

Argument typeReturn type
( Float, Float )Float

More

POW on Apache Spark Documentation

EXP

Description

exp(expr) - Returns e to the power of expr.

Argument typeReturn type
( Float )Float

More

EXP on Apache Spark Documentation

EXPM1

Description

expm1(expr) - Returns exp(expr) - 1.

Argument typeReturn type
( Float )Float

More

EXPM1 on Apache Spark Documentation

LN

Description

ln(expr) - Returns the natural logarithm (base e) of expr.

Argument typeReturn type
( Float )Float

More

LN on Apache Spark Documentation

LOG

Description

log(base, expr) - Returns the logarithm of expr with base.

Argument typeReturn type
( Float )Float
( Float, Float )Float

More

LOG on Apache Spark Documentation

LOG10

Description

log10(expr) - Returns the logarithm of expr with base 10.

Argument typeReturn type
( Float )Float

More

LOG10 on Apache Spark Documentation

LOG1P

Description

log1p(expr) - Returns log(1 + expr).

Argument typeReturn type
( Float )Float

More

LOG1P on Apache Spark Documentation

LOG2

Description

log2(expr) - Returns the logarithm of expr with base 2.

Argument typeReturn type
( Float )Float

More

LOG2 on Apache Spark Documentation

GREATEST

Description

greatest(expr, ...) - Returns the greatest value of all parameters, skipping null values.

Argument typeReturn type
( Integer, Integer )Integer
( Float, Float )Float
( Bool, Bool )Bool
( String, String )String
( Binary, Binary )Binary
( Date, Date )Date
( Timestamp, Timestamp )Timestamp

More

GREATEST on Apache Spark Documentation

LEAST

Description

least(expr, ...) - Returns the least value of all parameters, skipping null values.

Argument typeReturn type
( Integer, Integer )Integer
( Float, Float )Float
( Bool, Bool )Bool
( String, String )String
( Binary, Binary )Binary
( Date, Date )Date
( Timestamp, Timestamp )Timestamp

More

LEAST on Apache Spark Documentation

MOD

Description

mod(expr1, expr2) - Returns the remainder after expr1/expr2.

Argument typeReturn type
( Integer, Integer )Integer
( Float, Float )Float

More

MOD on Apache Spark Documentation

PMOD

Description

pmod(expr1, expr2) - Returns the positive value of expr1 mod expr2.

Argument typeReturn type
( Integer, Integer )Integer
( Float, Float )Float

More

PMOD on Apache Spark Documentation
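The two remainder functions above differ only on negative dividends, which a quick sketch shows:

```sql
-- mod keeps the sign of the dividend; pmod is always non-negative.
SELECT mod(-7, 3),  -- -1
       pmod(-7, 3); -- 2
```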

FACTORIAL

Description

factorial(expr) - Returns the factorial of expr. expr must be in [0..20]; otherwise, null is returned.

Argument typeReturn type
( Integer )Integer

More

FACTORIAL on Apache Spark Documentation

NANVL

Description

nanvl(expr1, expr2) - Returns expr1 if it's not NaN, or expr2 otherwise.

Argument typeReturn type
( Float, Float )Float

More

NANVL on Apache Spark Documentation

DIV

Description

div(expr1, expr2) - Divides expr1 by expr2. Returns NULL if an operand is NULL or expr2 is 0. The result is cast to long.

Argument typeReturn type
( Integer, Integer )Integer

More

DIV on Apache Spark Documentation

POWER

Description

power(expr1, expr2) - Raises expr1 to the power of expr2.

Argument typeReturn type
( Float, Float )Float

More

POWER on Apache Spark Documentation

RANDOM

Description

random([seed]) - Returns a random value drawn independently and identically distributed (i.i.d.) from the uniform distribution on [0, 1).

Argument typeReturn type
No argumentsFloat
( Integer )Float

More

RANDOM on Apache Spark Documentation

Rounding Functions

ROUND

Description

round(expr, d) - Returns expr rounded to d decimal places using HALF_UP rounding mode.

Argument typeReturn type
( Float )Float
( Float, Integer )Float

More

ROUND on Apache Spark Documentation

BROUND

Description

bround(expr, d) - Returns expr rounded to d decimal places using HALF_EVEN rounding mode.

Argument typeReturn type
( Float )Float
( Float, Integer )Float

More

BROUND on Apache Spark Documentation
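The two rounding modes differ only on ties: HALF_UP rounds .5 away from zero, while HALF_EVEN (banker's rounding) rounds a tie to the nearest even digit. For example:

```sql
SELECT round(2.5, 0);   -- 3
SELECT bround(2.5, 0);  -- 2
SELECT round(3.5, 0);   -- 4
SELECT bround(3.5, 0);  -- 4
```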

CEIL

Description

ceil(expr) - Returns the smallest integer not smaller than expr.

Argument typeReturn type
( Float )Float

More

CEIL on Apache Spark Documentation

CEILING

Description

ceiling(expr) - Returns the smallest integer not smaller than expr.

Argument typeReturn type
( Float )Float

More

CEILING on Apache Spark Documentation

FLOOR

Description

floor(expr) - Returns the largest integer not greater than expr.

Argument typeReturn type
( Float )Float

More

FLOOR on Apache Spark Documentation

RINT

Description

rint(expr) - Returns the double value that is closest in value to the argument and is equal to a mathematical integer.

Argument typeReturn type
( Float )Float

More

RINT on Apache Spark Documentation

Trigonometric and hyperbolic functions

DEGREES

Description

degrees(expr) - Converts radians to degrees.

Argument typeReturn type
( Float )Float

More

DEGREES on Apache Spark Documentation

RADIANS

Description

radians(expr) - Converts degrees to radians.

Argument typeReturn type
( Float )Float

More

RADIANS on Apache Spark Documentation

COS

Description

cos(expr) - Returns the cosine of expr.

Argument typeReturn type
( Float )Float

More

COS on Apache Spark Documentation

COSH

Description

cosh(expr) - Returns the hyperbolic cosine of expr.

Argument typeReturn type
( Float )Float

More

COSH on Apache Spark Documentation

ACOS

Description

acos(expr) - Returns the inverse cosine (a.k.a. arccosine) of expr if -1<=expr<=1 or NaN otherwise.

Argument typeReturn type
( Float )Float

More

ACOS on Apache Spark Documentation

ACOSH

Description

acosh(expr) - Returns inverse hyperbolic cosine of expr.

Argument typeReturn type
( Float )Float

More

ACOSH on Apache Spark Documentation

SIN

Description

sin(expr) - Returns the sine of expr.

Argument typeReturn type
( Float )Float

More

SIN on Apache Spark Documentation

SINH

Description

sinh(expr) - Returns the hyperbolic sine of expr.

Argument typeReturn type
( Float )Float

More

SINH on Apache Spark Documentation

ASIN

Description

asin(expr) - Returns the inverse sine (a.k.a. arcsine) of expr if -1<=expr<=1 or NaN otherwise.

Argument typeReturn type
( Float )Float

More

ASIN on Apache Spark Documentation

ASINH

Description

asinh(expr) - Returns inverse hyperbolic sine of expr.

Argument typeReturn type
( Float )Float

More

ASINH on Apache Spark Documentation

TAN

Description

tan(expr) - Returns the tangent of expr.

Argument typeReturn type
( Float )Float

More

TAN on Apache Spark Documentation

TANH

Description

tanh(expr) - Returns the hyperbolic tangent of expr.

Argument typeReturn type
( Float )Float

More

TANH on Apache Spark Documentation

ATANH

Description

atanh(expr) - Returns inverse hyperbolic tangent of expr.

Argument typeReturn type
( Float )Float

More

ATANH on Apache Spark Documentation

COT

Description

cot(expr) - Returns the cotangent of expr.

Argument typeReturn type
( Float )Float

More

COT on Apache Spark Documentation

ATAN

Description

atan(expr) - Returns the inverse tangent (a.k.a. arctangent).

Argument typeReturn type
( Float )Float

More

ATAN on Apache Spark Documentation

ATAN2

Description

atan2(expr1, expr2) - Returns the angle in radians between the positive x-axis of a plane and the point given by the coordinates (expr1, expr2).

Argument typeReturn type
( Float, Float )Float

More

ATAN2 on Apache Spark Documentation

Hash functions

HASH

Description

hash(expr1, expr2, ...) - Returns a hash value of the arguments.

Argument typeReturn type
( Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map )Integer

More

HASH on Apache Spark Documentation

CRC32

Description

crc32(expr) - Returns a cyclic redundancy check value of the expr as a bigint.

Argument typeReturn type
( Binary )Integer

More

CRC32 on Apache Spark Documentation

MD5

Description

md5(expr) - Returns an MD5 128-bit checksum as a hex string of expr.

Argument typeReturn type
( Binary )String

More

MD5 on Apache Spark Documentation

SHA

Description

sha(expr) - Returns a SHA-1 hash value of expr as a hex string.

Argument typeReturn type
( Binary )String

More

SHA on Apache Spark Documentation

SHA1

Description

sha1(expr) - Returns a SHA-1 hash value of expr as a hex string.

Argument typeReturn type
( Binary )String

More

SHA1 on Apache Spark Documentation

SHA2

Description

sha2(expr, bitLength) - Returns a checksum of SHA-2 family as a hex string of expr. SHA-224, SHA-256, SHA-384, and SHA-512 are supported. Bit length of 0 is equivalent to 256.

Argument typeReturn type
( Binary, Integer )String

More

SHA2 on Apache Spark Documentation

XXHASH64

Description

xxhash64(expr1, expr2, ...) - Returns a 64-bit hash value of the arguments.

Argument typeReturn type
( Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map )Integer

More

XXHASH64 on Apache Spark Documentation

String functions

LENGTH

Description

length(expr) - Returns the character length of expr or number of bytes in binary data.

Argument typeReturn type
( String or Binary )Integer

More

LENGTH on Apache Spark Documentation

ASCII

Description

ascii(str) - Returns the numeric value of the first character of str.

Argument typeReturn type
( String )Integer

More

ASCII on Apache Spark Documentation

CHAR

Description

char(expr) - Returns the ASCII character having the binary equivalent to expr. If expr is larger than 256, the result is equivalent to char(expr % 256).

Argument typeReturn type
( Integer )String

More

CHAR on Apache Spark Documentation

CHR

Description

chr(expr) - Returns the ASCII character having the binary equivalent to expr. If expr is larger than 256, the result is equivalent to chr(expr % 256).

Argument typeReturn type
( Integer )String

More

CHR on Apache Spark Documentation

ELT

Description

elt(n, str1, str2, ...) - Returns the n-th string, e.g., returns str2 when n is 2.

Argument typeReturn type
( Integer, String )String
( Integer, Binary )Binary

More

ELT on Apache Spark Documentation

CONCAT

Description

concat(str1, str2, ..., strN) - Returns the concatenation of str1, str2, ..., strN.

Argument typeReturn type
No argumentsString
( Binary )Binary

More

CONCAT on Apache Spark Documentation

CONCAT_WS

Description

concat_ws(sep, [str | array(str)]+) - Returns the concatenation of the strings separated by sep.

Argument typeReturn type
( String )String

More

CONCAT_WS on Apache Spark Documentation

FORMAT_NUMBER

Description

format_number(expr1, expr2) - Formats the number expr1 like '#,###,###.##', rounded to expr2 decimal places. If expr2 is 0, the result has no decimal point or fractional part. This is supposed to function like MySQL's FORMAT.

Argument typeReturn type
( Float, Integer )String

More

FORMAT_NUMBER on Apache Spark Documentation

FORMAT_STRING

Description

format_string(strfmt, obj, ...) - Returns a formatted string from printf-style format strings.

Argument typeReturn type
( String )String

More

FORMAT_STRING on Apache Spark Documentation

PRINTF

Description

printf(strfmt, obj, ...) - Returns a formatted string from printf-style format strings.

Argument typeReturn type
( String )String

More

PRINTF on Apache Spark Documentation

LPAD

Description

lpad(str, len, pad) - Returns str, left-padded with pad to a length of len. If str is longer than len, the return value is shortened to len characters.

Argument typeReturn type
( String, Integer, String )String

More

LPAD on Apache Spark Documentation
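A common use of lpad is zero-padding identifiers to a fixed width; note that inputs longer than len are truncated. For example:

```sql
SELECT lpad('7', 3, '0');     -- '007'
SELECT lpad('1234', 3, '0');  -- '123' (shortened to len characters)
```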

LCASE

Description

lcase(str) - Returns str with all characters changed to lowercase.

Argument typeReturn type
( String )String

More

LCASE on Apache Spark Documentation

LOWER

Description

lower(str) - Returns str with all characters changed to lowercase.

Argument typeReturn type
( String )String

More

LOWER on Apache Spark Documentation

LTRIM

Description

ltrim(str) - Removes the leading space characters from str. ltrim(trimStr, str) - Removes leading characters of str that appear in trimStr.

Argument typeReturn type
( String )String
( String, String )String

More

LTRIM on Apache Spark Documentation

REGEXP_EXTRACT

Description

regexp_extract(str, regexp[, idx]) - Extracts a group that matches regexp.

Argument typeReturn type
( String, String, Integer )String

More

REGEXP_EXTRACT on Apache Spark Documentation

REGEXP_REPLACE

Description

regexp_replace(str, regexp, rep) - Replaces all substrings of str that match regexp with rep.

Argument typeReturn type
( String, String, String )String

More

REGEXP_REPLACE on Apache Spark Documentation

REPEAT

Description

repeat(str, n) - Returns the string which repeats the given string value n times.

Argument typeReturn type
( String, Integer )String

More

REPEAT on Apache Spark Documentation

REPLACE

Description

replace(str, search[, replace]) - Replaces all occurrences of search with replace.

Argument typeReturn type
( String, String )String
( String, String, String )String

More

REPLACE on Apache Spark Documentation

REVERSE

Description

reverse(str) - Returns the given string reversed.

Argument typeReturn type
( String )String

More

REVERSE on Apache Spark Documentation

RPAD

Description

rpad(str, len, pad) - Returns str, right-padded with pad to a length of len. If str is longer than len, the return value is shortened to len characters.

Argument typeReturn type
( String, Integer, String )String

More

RPAD on Apache Spark Documentation

RTRIM

Description

rtrim(str) - Removes the trailing space characters from str. rtrim(trimStr, str) - Removes trailing characters of str that appear in trimStr.

Argument typeReturn type
( String )String
( String, String )String

More

RTRIM on Apache Spark Documentation

SPLIT

Description

split(str, regex) - Splits str around occurrences that match regex.

Argument typeReturn type
( String, String )Array

More

SPLIT on Apache Spark Documentation

INSTR

Description

instr(str, substr) - Returns the (1-based) index of the first occurrence of substr in str.

Argument typeReturn type
( String, String )Integer

More

INSTR on Apache Spark Documentation

LOCATE

Description

locate(substr, str[, pos]) - Returns the position of the first occurrence of substr in str after position pos. The given pos and return value are 1-based.

Argument typeReturn type
( String, String )Integer
( String, String, Integer )Integer

More

LOCATE on Apache Spark Documentation

POSITION

Description

position(substr, str[, pos]) - Returns the position of the first occurrence of substr in str after position pos. The given pos and return value are 1-based.

Argument typeReturn type
( String, String )Integer
( String, String, Integer )Integer

More

POSITION on Apache Spark Documentation

SUBSTRING_INDEX

Description

substring_index(str, delim, count) - Returns the substring from str before count occurrences of the delimiter delim. If count is positive, everything to the left of the final delimiter (counting from the left) is returned. If count is negative, everything to the right of the final delimiter (counting from the right) is returned. The function substring_index performs a case-sensitive match when searching for delim.

Argument typeReturn type
( String, String, Integer )String

More

SUBSTRING_INDEX on Apache Spark Documentation
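The sign of count selects which side of the string is kept. For example:

```sql
SELECT substring_index('www.apache.org', '.', 2);   -- 'www.apache'
SELECT substring_index('www.apache.org', '.', -1);  -- 'org'
```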

SUBSTRING

Description

substring(str, pos[, len]) - Returns the substring of str that starts at pos and is of length len, or the slice of byte array that starts at pos and is of length len.

Argument typeReturn type
( String, Integer )String
( String, Integer, Integer )String
( Binary, Integer )Binary
( Binary, Integer, Integer )Binary

More

SUBSTRING on Apache Spark Documentation

SUBSTR

Description

substr(str, pos[, len]) - Returns the substring of str that starts at pos and is of length len, or the slice of byte array that starts at pos and is of length len.

Argument typeReturn type
( String, Integer )String
( String, Integer, Integer )String
( Binary, Integer )Binary
( Binary, Integer, Integer )Binary

More

SUBSTR on Apache Spark Documentation

LEFT

Description

left(str, len) - Returns the leftmost len characters from the string str; if len is less than or equal to 0, the result is an empty string.

Argument typeReturn type
( String, Integer )String

More

LEFT on Apache Spark Documentation

RIGHT

Description

right(str, len) - Returns the rightmost len characters from the string str; if len is less than or equal to 0, the result is an empty string.

Argument typeReturn type
( String, Integer )String

More

RIGHT on Apache Spark Documentation

BASE64

Description

base64(bin) - Converts the argument from a binary bin to a base 64 string.

Argument typeReturn type
( Binary )String

More

BASE64 on Apache Spark Documentation

UNBASE64

Description

unbase64(str) - Converts the argument from a base 64 string str to a binary.

Argument typeReturn type
( String )Binary

More

UNBASE64 on Apache Spark Documentation

DECODE

Description

decode(bin, charset) - Decodes the first argument using the second argument character set.

Argument typeReturn type
( Binary, String )String

More

DECODE on Apache Spark Documentation

ENCODE

Description

encode(str, charset) - Encodes the first argument using the second argument character set.

Argument typeReturn type
( String, String )Binary

More

ENCODE on Apache Spark Documentation

TRIM

Description

trim(str) - Removes the leading and trailing space characters from str. trim(trimStr, str) - Removes leading and trailing characters of str that appear in trimStr.

Argument typeReturn type
( String )String
( String, String )String

More

TRIM on Apache Spark Documentation

UCASE

Description

ucase(str) - Returns str with all characters changed to uppercase.

Argument typeReturn type
( String )String

More

UCASE on Apache Spark Documentation

UPPER

Description

upper(str) - Returns str with all characters changed to uppercase.

Argument typeReturn type
( String )String

More

UPPER on Apache Spark Documentation

INITCAP

Description

initcap(str) - Returns str with the first letter of each word in uppercase. All other letters are in lowercase. Words are delimited by white space.

Argument typeReturn type
( String )String

More

INITCAP on Apache Spark Documentation

HEX

Description

hex(expr) - Converts expr to hexadecimal.

Argument typeReturn type
( Integer or String or Binary )String

More

HEX on Apache Spark Documentation

UNHEX

Description

unhex(expr) - Converts hexadecimal expr to binary.

Argument typeReturn type
( String )Binary

More

UNHEX on Apache Spark Documentation

LEVENSHTEIN

Description

levenshtein(str1, str2) - Returns the Levenshtein distance between the two given strings.

Argument typeReturn type
( String, String )Integer

More

LEVENSHTEIN on Apache Spark Documentation

SOUNDEX

Description

soundex(str) - Returns the Soundex code of the string.

Argument typeReturn type
( String )String

More

SOUNDEX on Apache Spark Documentation

TRANSLATE

Description

translate(input, from, to) - Translates the input string by replacing the characters present in the from string with the corresponding characters in the to string.

Argument typeReturn type
( String, String, String )String

More

TRANSLATE on Apache Spark Documentation
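translate performs a character-by-character substitution, not a substring replacement: each character in from maps to the character at the same position in to. For example:

```sql
SELECT translate('AaBbCc', 'abc', '123');  -- 'A1B2C3'
```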

SENTENCES

Description

sentences(str[, lang, country]) - Splits str into an array of arrays of words.

Argument typeReturn type
( String )Array
( String, String )Array
( String, String, String )Array

More

SENTENCES on Apache Spark Documentation

BIT_LENGTH

Description

bit_length(expr) - Returns the bit length of string data or number of bits of binary data.

Argument typeReturn type
( String or Binary )Integer

More

BIT_LENGTH on Apache Spark Documentation

CHAR_LENGTH

Description

char_length(expr) - Returns the character length of string data or number of bytes of binary data. The length of string data includes the trailing spaces. The length of binary data includes binary zeros.

Argument typeReturn type
( String or Binary )Integer

More

CHAR_LENGTH on Apache Spark Documentation

CHARACTER_LENGTH

Description

character_length(expr) - Returns the character length of string data or number of bytes of binary data. The length of string data includes the trailing spaces. The length of binary data includes binary zeros.

Argument typeReturn type
( String or Binary )Integer

More

CHARACTER_LENGTH on Apache Spark Documentation

XPATH

Description

xpath(xml, xpath) - Returns a string array of values within the nodes of xml that match the XPath expression.

Argument typeReturn type
( String, String )Array

More

XPATH on Apache Spark Documentation

XPATH_BOOLEAN

Description

xpath_boolean(xml, xpath) - Returns true if the XPath expression evaluates to true, or if a matching node is found.

Argument typeReturn type
( String, String )Bool

More

XPATH_BOOLEAN on Apache Spark Documentation

XPATH_DOUBLE

Description

xpath_double(xml, xpath) - Returns a double value, the value zero if no match is found, or NaN if a match is found but the value is non-numeric.

Argument typeReturn type
( String, String )Float

More

XPATH_DOUBLE on Apache Spark Documentation

XPATH_FLOAT

Description

xpath_float(xml, xpath) - Returns a float value, the value zero if no match is found, or NaN if a match is found but the value is non-numeric.

Argument typeReturn type
( String, String )Float

More

XPATH_FLOAT on Apache Spark Documentation

XPATH_INT

Description

xpath_int(xml, xpath) - Returns an integer value, or zero if no match is found or the matched value is non-numeric.

Argument typeReturn type
( String, String )Integer

More

XPATH_INT on Apache Spark Documentation

XPATH_LONG

Description

xpath_long(xml, xpath) - Returns a long integer value, or zero if no match is found or the matched value is non-numeric.

Argument typeReturn type
( String, String )Integer

More

XPATH_LONG on Apache Spark Documentation

XPATH_NUMBER

Description

xpath_number(xml, xpath) - Returns a double value, the value zero if no match is found, or NaN if a match is found but the value is non-numeric.

Argument typeReturn type
( String, String )Float

More

XPATH_NUMBER on Apache Spark Documentation

XPATH_SHORT

Description

xpath_short(xml, xpath) - Returns a short integer value, or zero if no match is found or the matched value is non-numeric.

Argument typeReturn type
( String, String )Integer

More

XPATH_SHORT on Apache Spark Documentation

XPATH_STRING

Description

xpath_string(xml, xpath) - Returns the text contents of the first xml node that matches the XPath expression.

Argument typeReturn type
( String, String )String

More

XPATH_STRING on Apache Spark Documentation

OCTET_LENGTH

Description

octet_length(expr) - Returns the byte length of string data or number of bytes of binary data.

Argument typeReturn type
( String or Binary )Integer

More

OCTET_LENGTH on Apache Spark Documentation

OVERLAY

Description

overlay(input, replace, pos[, len]) - Replaces the portion of input starting at pos (and of length len) with replace.

Argument typeReturn type
( String, String, Integer )String
( String, String, Integer, Integer )String
( Binary, Binary, Integer )Binary
( Binary, Binary, Integer, Integer )Binary

More

OVERLAY on Apache Spark Documentation

PARSE_URL

Description

parse_url(url, partToExtract[, key]) - Extracts a part from a URL.

Argument typeReturn type
( String, String )String
( String, String, String )String

More

PARSE_URL on Apache Spark Documentation
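Valid values of partToExtract include 'HOST', 'PATH', 'QUERY', 'REF', 'PROTOCOL', 'AUTHORITY', 'FILE', and 'USERINFO'; when extracting 'QUERY', the optional key selects a single query parameter. For example:

```sql
SELECT parse_url('http://spark.apache.org/path?query=1', 'HOST');
-- 'spark.apache.org'
SELECT parse_url('http://spark.apache.org/path?query=1', 'QUERY', 'query');
-- '1'
```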

SCHEMA_OF_CSV

Description

schema_of_csv(csv[, options]) - Returns the schema of a CSV string in DDL format.

Argument typeReturn type
( String )String
( String, Map )String

More

SCHEMA_OF_CSV on Apache Spark Documentation

SPACE

Description

space(n) - Returns a string consisting of n spaces.

Argument typeReturn type
( Integer )String

More

SPACE on Apache Spark Documentation

TO_CSV

Description

to_csv(expr[, options]) - Returns a CSV string with the given struct value.

Argument typeReturn type
( Struct )String
( Struct, Map )String

More

TO_CSV on Apache Spark Documentation

FIND_IN_SET

Description

find_in_set(str, str_array) - Returns the (1-based) index of the given string (str) in the comma-delimited list (str_array). Returns 0 if the string was not found or if str contains a comma.

Argument typeReturn type
( String, String )Integer

More

FIND_IN_SET on Apache Spark Documentation
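For example:

```sql
SELECT find_in_set('ab', 'abc,b,ab,c,def');  -- 3
SELECT find_in_set('x', 'abc,b,ab,c,def');   -- 0
```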

JSON functions

TO_JSON

Description

to_json(expr[, options]) - Returns a JSON string with the given struct value.

Argument typeReturn type
( Array or Struct or Map )String
( Array or Struct or Map, Map )String

More

TO_JSON on Apache Spark Documentation

GET_JSON_OBJECT

Description

get_json_object(json_txt, path) - Extracts a JSON object from json_txt at the given path.

Argument typeReturn type
( String, String )String

More

GET_JSON_OBJECT on Apache Spark Documentation
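The path uses a JSONPath-style syntax rooted at '$'. For example:

```sql
SELECT get_json_object('{"a":{"b":"value"}}', '$.a.b');  -- 'value'
```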

SCHEMA_OF_JSON

Description

schema_of_json(json[, options]) - Returns the schema of a JSON string in DDL format.

Argument typeReturn type
( String )String
( String, Map )String

More

SCHEMA_OF_JSON on Apache Spark Documentation

Array functions

ARRAY

Description

array(expr, ...) - Returns an array with the given elements.

Argument typeReturn type
No argumentsArray

More

ARRAY on Apache Spark Documentation

ARRAY_CONTAINS

Description

array_contains(array, value) - Returns true if the array contains the value.

Argument typeReturn type
( Array, Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map )Bool

More

ARRAY_CONTAINS on Apache Spark Documentation

ARRAY_DISTINCT

Description

array_distinct(array) - Removes duplicate values from the array.

Argument typeReturn type
( Array )Array

More

ARRAY_DISTINCT on Apache Spark Documentation

ARRAY_EXCEPT

Description

array_except(array1, array2) - Returns an array of the elements in array1 but not in array2, without duplicates.

Argument typeReturn type
( Array, Array )Array

More

ARRAY_EXCEPT on Apache Spark Documentation

ARRAY_INTERSECT

Description

array_intersect(array1, array2) - Returns an array of the elements in the intersection of array1 and array2, without duplicates.

Argument typeReturn type
( Array, Array )Array

More

ARRAY_INTERSECT on Apache Spark Documentation

ARRAY_POSITION

Description

array_position(array, element) - Returns the (1-based) index of the first occurrence of element in the array, as a long.

Argument typeReturn type
( Array, Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map )Integer

More

ARRAY_POSITION on Apache Spark Documentation

ARRAY_REMOVE

Description

array_remove(array, element) - Removes all elements equal to element from the array.

Argument typeReturn type
( Array, Integer or Bool or String or Binary or Date or Timestamp )Array

More

ARRAY_REMOVE on Apache Spark Documentation

ARRAY_REPEAT

Description

array_repeat(element, count) - Returns the array containing element count times.

Argument typeReturn type
( Integer or Float or Bool or String or Binary or Date or Timestamp or Array or Struct or Map, Integer )Array

More

ARRAY_REPEAT on Apache Spark Documentation

ARRAY_UNION

Description

array_union(array1, array2) - Returns an array of the elements in the union of array1 and array2, without duplicates.

Argument typeReturn type
( Array, Array )Array

More

ARRAY_UNION on Apache Spark Documentation

ARRAYS_OVERLAP

Description

arrays_overlap(a1, a2) - Returns true if a1 contains at least one non-null element that is also present in a2. If the arrays have no common element, both are non-empty, and either of them contains a null element, null is returned; otherwise, false.

Argument typeReturn type
( Array, Array )Bool

More

ARRAYS_OVERLAP on Apache Spark Documentation
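The three-valued result can be seen with and without null elements:

```sql
SELECT arrays_overlap(array(1, 2, 3), array(3, 4, 5));  -- true
SELECT arrays_overlap(array(1, 2), array(3, 4));        -- false
-- no common element, but a null is present, so the answer is unknown
SELECT arrays_overlap(array(1, NULL), array(3, 4));     -- NULL
```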

ARRAYS_ZIP

Description

arrays_zip(a1, a2, ...) - Returns a merged array of structs in which the N-th struct contains all N-th values of input arrays.

Argument typeReturn type
( Array )Array

More

ARRAYS_ZIP on Apache Spark Documentation

SHUFFLE

Description

shuffle(array) - Returns a random permutation of the given array.

Argument typeReturn type
( Array )Array

More

SHUFFLE on Apache Spark Documentation

SLICE

Description

slice(x, start, length) - Subsets array x starting from index start (array indices start at 1, or starting from the end if start is negative) with the specified length.

Argument typeReturn type
( Array, Integer, Integer )Array

More

SLICE on Apache Spark Documentation
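For example, with a positive and a negative start index:

```sql
SELECT slice(array(1, 2, 3, 4), 2, 2);   -- [2, 3]
SELECT slice(array(1, 2, 3, 4), -2, 2);  -- [3, 4]
```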

SEQUENCE

Description

sequence(start, stop, step) - Generates an array of elements from start to stop (inclusive), incrementing by step. The type of the returned elements is the same as the type of the argument expressions. The start and stop expressions must resolve to the same type. If the start and stop expressions resolve to the 'date' or 'timestamp' type, then the step expression must resolve to the 'interval' type; otherwise it must resolve to the same type as the start and stop expressions. Arguments: start - an expression, the start of the range; stop - an expression, the end of the range (inclusive); step - an optional expression, the step of the range. By default, step is 1 if start is less than or equal to stop, and -1 otherwise. For temporal sequences it is 1 day and -1 day, respectively. If start is greater than stop then step must be negative, and vice versa.

Argument typeReturn type
( Integer, Integer )Array
( Integer, Integer, Integer )Array
( Date, Date )Array
( Date, Date, Interval )Array
( Timestamp, Timestamp )Array
( Timestamp, Timestamp, Interval )Array

More

SEQUENCE on Apache Spark Documentation
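For example, with the default step and with an interval step:

```sql
SELECT sequence(1, 5);  -- [1, 2, 3, 4, 5]
SELECT sequence(5, 1);  -- [5, 4, 3, 2, 1]
SELECT sequence(to_date('2018-01-01'), to_date('2018-03-01'), interval 1 month);
-- [2018-01-01, 2018-02-01, 2018-03-01]
```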

SORT_ARRAY

Description

sort_array(array[, ascendingOrder]) - Sorts the input array in ascending or descending order according to the natural ordering of the array elements.

Argument typeReturn type
( Array )Array
( Array, Bool )Array

More

SORT_ARRAY on Apache Spark Documentation

SIZE

Description

size(expr) - Returns the size of an array or a map.

Argument typeReturn type
( Array or Map )Integer

More

SIZE on Apache Spark Documentation

ARRAY_JOIN

Description

array_join(array, delimiter[, nullReplacement]) - Concatenates the elements of the given array using the delimiter and an optional string to replace nulls. If no value is set for nullReplacement, any null value is filtered.

Argument typeReturn type
( Array, String )String
( Array, String, String )String

More

ARRAY_JOIN on Apache Spark Documentation
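For example, with and without a null replacement:

```sql
SELECT array_join(array('hello', NULL, 'world'), ' ');       -- 'hello world'
SELECT array_join(array('hello', NULL, 'world'), ' ', ',');  -- 'hello , world'
```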

CARDINALITY

Description

cardinality(expr) - Returns the size of an array or a map.

Argument typeReturn type
( Array or Map )Integer

More

CARDINALITY on Apache Spark Documentation

FLATTEN

Description

flatten(arrayOfArrays) - Transforms an array of arrays into a single array.

Argument typeReturn type
( Array )Array

More

FLATTEN on Apache Spark Documentation

Struct functions

STRUCT

Description

struct(col1, col2, col3, ...) - Creates a struct with the given field values.

Argument typeReturn type
No argumentsStruct

More

STRUCT on Apache Spark Documentation

NAMED_STRUCT

Description

named_struct(name1, val1, name2, val2, ...) - Creates a struct with the given field names and values.

Argument typeReturn type
( String )Struct

More

NAMED_STRUCT on Apache Spark Documentation

FROM_JSON

Description

from_json(jsonStr, schema[, options]) - Returns a struct value with the given jsonStr and schema.

Argument typeReturn type
( String, String )Struct
( String, String, Map )Struct

More

FROM_JSON on Apache Spark Documentation

FROM_CSV

Description

from_csv(csvStr, schema[, options]) - Returns a struct value with the given csvStr and schema.

Argument typeReturn type
( String, String )Struct
( String, String, Map )Struct

More

FROM_CSV on Apache Spark Documentation

Map functions

MAP

Description

map(key0, value0, key1, value1, ...) - Creates a map with the given key/value pairs.

Argument typeReturn type
No argumentsMap

More

MAP on Apache Spark Documentation

MAP_FROM_ARRAYS

Description

map_from_arrays(keys, values) - Creates a map with a pair of the given key/value arrays. All elements in keys must be non-null.

Argument typeReturn type
( Array, Array )Map

More

MAP_FROM_ARRAYS on Apache Spark Documentation

MAP_FROM_ENTRIES

Description

map_from_entries(arrayOfEntries) - Returns a map created from the given array of entries.

Argument typeReturn type
( Array )Map

More

MAP_FROM_ENTRIES on Apache Spark Documentation

MAP_KEYS

Description

map_keys(map) - Returns an unordered array containing the keys of the map.

Argument typeReturn type
( Map )Array

More

MAP_KEYS on Apache Spark Documentation

MAP_VALUES

Description

map_values(map) - Returns an unordered array containing the values of the map.

Argument typeReturn type
( Map )Array

More

MAP_VALUES on Apache Spark Documentation

STR_TO_MAP

Description

str_to_map(text[, pairDelim[, keyValueDelim]]) - Creates a map after splitting the text into key/value pairs using delimiters. Default delimiters are ',' for pairDelim and ':' for keyValueDelim.

Argument typeReturn type
( String )Map
( String, String )Map
( String, String, String )Map

More

STR_TO_MAP on Apache Spark Documentation

MAP_CONCAT

Description

map_concat(map, ...) - Returns the union of all the given maps.

Argument typeReturn type
No argumentsMap

More

MAP_CONCAT on Apache Spark Documentation

MAP_ENTRIES

Description

map_entries(map) - Returns an unordered array of all entries in the given map.

Argument typeReturn type
( Map )Array

More

MAP_ENTRIES on Apache Spark Documentation

DATE functions

ADD_MONTHS

Description

add_months(start_date, num_months) - Returns the date that is num_months after start_date.

Argument typeReturn type
( Date, Integer )Date

More

ADD_MONTHS on Apache Spark Documentation

CURRENT_DATE

Description

current_date() - Returns the current date at the start of query evaluation.

Argument typeReturn type
No argumentsDate

More

CURRENT_DATE on Apache Spark Documentation

DATE_ADD

Description

date_add(start_date, num_days) - Returns the date that is num_days after start_date.

Argument typeReturn type
( Date, Integer )Date

More

DATE_ADD on Apache Spark Documentation

DATE_SUB

Description

date_sub(start_date, num_days) - Returns the date that is num_days before start_date.

Argument typeReturn type
( Date, Integer )Date

More

DATE_SUB on Apache Spark Documentation

DATEDIFF

Description

datediff(endDate, startDate) - Returns the number of days from startDate to endDate.

Argument typeReturn type
( Date, Date )Integer

More

DATEDIFF on Apache Spark Documentation
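The result is positive when endDate is later than startDate, and negative otherwise. For example:

```sql
SELECT datediff('2009-07-31', '2009-07-30');  -- 1
SELECT datediff('2009-07-30', '2009-07-31');  -- -1
```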

LAST_DAY

Description

last_day(date) - Returns the last day of the month which the date belongs to.

Argument typeReturn type
( Date )Date

More

LAST_DAY on Apache Spark Documentation

NEXT_DAY

Description

next_day(start_date, day_of_week) - Returns the first date which is later than start_date and named as indicated.

Argument typeReturn type
( Date, String )Date

More

NEXT_DAY on Apache Spark Documentation
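day_of_week is case-insensitive and accepts abbreviated or full day names (e.g. 'TU', 'TUE', 'TUESDAY'). For example:

```sql
-- 2015-01-14 is a Wednesday; the next Tuesday is:
SELECT next_day('2015-01-14', 'TU');  -- 2015-01-20
```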

TO_DATE

Description

to_date(date_str[, fmt]) - Parses the date_str expression with the fmt expression to a date. Returns null with invalid input. By default, it follows casting rules to a date if the fmt is omitted.

Argument typeReturn type
( String, String )Date
( String or Date or Timestamp )Date

More

TO_DATE on Apache Spark Documentation

TRUNC

Description

trunc(date, fmt) - Returns date with the time portion of the day truncated to the unit specified by the format model fmt.

Argument typeReturn type
( Date, String )Date

More

TRUNC on Apache Spark Documentation

DATE_PART

Description

date_part(field, source) - Extracts a part of the date/timestamp or interval source. Arguments: field - selects which part of source should be extracted; the supported string values are the same as the fields of the equivalent function EXTRACT. source - a date/timestamp or interval column from which field should be extracted.

Argument typeReturn type
( String, Date or Timestamp or Interval )Float

More

DATE_PART on Apache Spark Documentation

MAKE_DATE

Description

make_date(year, month, day) - Creates a date from year, month, and day fields. Arguments: year - the year to represent, from 1 to 9999; month - the month-of-year to represent, from 1 (January) to 12 (December); day - the day-of-month to represent, from 1 to 31.

Argument typeReturn type
( Integer, Integer, Integer )Date

More

MAKE_DATE on Apache Spark Documentation
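
Python's date constructor takes the same three fields, so the happy path can be sketched directly (note that out-of-range fields raise an error here, whereas Spark's behavior on invalid input depends on configuration):

```python
from datetime import date

def make_date(year: int, month: int, day: int) -> date:
    # Builds a date from year, month, and day fields; raises ValueError
    # for out-of-range fields (Spark's handling is configuration-dependent)
    return date(year, month, day)

print(make_date(2013, 7, 15))  # 2013-07-15
```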

WEEKDAY

Description

weekday(date) - Returns the day of the week for date/timestamp (0 = Monday, 1 = Tuesday, ..., 6 = Sunday).

Argument typeReturn type
( Date or Timestamp )Integer

More

WEEKDAY on Apache Spark Documentation
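
This numbering happens to match Python's date.weekday(), which can serve as a quick mental model:

```python
from datetime import date

# Python's date.weekday() uses the same numbering as Spark's weekday():
# 0 = Monday, 1 = Tuesday, ..., 6 = Sunday
print(date(2009, 7, 30).weekday())  # 3 (a Thursday)
```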

TIMESTAMP functions

NOW

Description

now() - Returns the current timestamp at the start of query evaluation.

Argument typeReturn type
No argumentsTimestamp

More

NOW on Apache Spark Documentation

CURRENT_TIMESTAMP

Description

current_timestamp() - Returns the current timestamp at the start of query evaluation.

Argument typeReturn type
No argumentsTimestamp

More

CURRENT_TIMESTAMP on Apache Spark Documentation

FROM_UNIXTIME

Description

from_unixtime(unix_time, format) - Interprets unix_time as seconds since the Unix epoch and returns it as a string in the specified format.

Argument typeReturn type
( Integer, String )String
( Integer )String

More

FROM_UNIXTIME on Apache Spark Documentation
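
The epoch-seconds interpretation can be sketched in Python (shown in UTC for determinism; Spark formats in the session time zone and uses its own pattern letters, e.g. 'yyyy-MM-dd HH:mm:ss' rather than strftime codes):

```python
from datetime import datetime, timezone

def from_unixtime(unix_time: int, fmt: str = "%Y-%m-%d %H:%M:%S") -> str:
    # Interpret unix_time as seconds since the epoch, formatted in UTC
    return datetime.fromtimestamp(unix_time, tz=timezone.utc).strftime(fmt)

print(from_unixtime(0))  # 1970-01-01 00:00:00
```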

FROM_UTC_TIMESTAMP

Description

from_utc_timestamp(timestamp, timezone) - Given a timestamp like '2017-07-14 02:40:00.0', interprets it as a time in UTC, and renders that time as a timestamp in the given time zone. For example, 'GMT+1' would yield '2017-07-14 03:40:00.0'.

Argument typeReturn type
( Timestamp, String )Timestamp

More

FROM_UTC_TIMESTAMP on Apache Spark Documentation

TO_TIMESTAMP

Description

to_timestamp(timestamp[, fmt]) - Parses the timestamp expression using the fmt format string into a timestamp. Returns null on invalid input. If fmt is omitted, standard casting rules to a timestamp are followed.

Argument typeReturn type
( Integer or Float or String or Date or Timestamp )Timestamp
( String, String )Timestamp

More

TO_TIMESTAMP on Apache Spark Documentation

TO_UTC_TIMESTAMP

Description

to_utc_timestamp(timestamp, timezone) - Given a timestamp like '2017-07-14 02:40:00.0', interprets it as a time in the given time zone, and renders that time as a timestamp in UTC. For example, 'GMT+1' would yield '2017-07-14 01:40:00.0'.

Argument typeReturn type
( Timestamp, String )Timestamp

More

TO_UTC_TIMESTAMP on Apache Spark Documentation
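
The two functions are mirror images of each other: from_utc_timestamp treats its input as UTC and shifts it into the given zone, while to_utc_timestamp treats its input as local to the given zone and shifts it into UTC. A Python sketch using zoneinfo (Asia/Seoul, UTC+9, chosen as an arbitrary example zone):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

utc = ZoneInfo("UTC")
seoul = ZoneInfo("Asia/Seoul")  # UTC+9
ts = datetime(2017, 7, 14, 2, 40)

# from_utc_timestamp: treat ts as UTC, render it in the target zone
shifted = ts.replace(tzinfo=utc).astimezone(seoul).replace(tzinfo=None)
print(shifted)  # 2017-07-14 11:40:00

# to_utc_timestamp: treat ts as local to the given zone, render it in UTC
back = ts.replace(tzinfo=seoul).astimezone(utc).replace(tzinfo=None)
print(back)  # 2017-07-13 17:40:00
```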

TO_UNIX_TIMESTAMP

Description

to_unix_timestamp(expr[, pattern]) - Returns the UNIX timestamp of the given time.

Argument typeReturn type
( String, String )Integer
( String or Timestamp )Integer

More

TO_UNIX_TIMESTAMP on Apache Spark Documentation

UNIX_TIMESTAMP

Description

unix_timestamp([expr[, pattern]]) - Returns the UNIX timestamp of current or specified time.

Argument typeReturn type
No argumentsInteger
( String, String )Integer
( String or Timestamp )Integer

More

UNIX_TIMESTAMP on Apache Spark Documentation

DATE_TRUNC

Description

date_trunc(fmt, ts) - Returns timestamp ts truncated to the unit specified by the format model fmt. fmt should be one of ["YEAR", "YYYY", "YY", "MON", "MONTH", "MM", "DAY", "DD", "HOUR", "MINUTE", "SECOND", "WEEK", "QUARTER"].

Argument typeReturn type
( String, Timestamp )Timestamp

More

DATE_TRUNC on Apache Spark Documentation

WINDOW

Description

window(time, windowDuration[, slideDuration[, startTime]]) - Bucketizes rows into one or more time windows, given a timestamp-specifying column.

Argument typeReturn type
( Timestamp, String )Struct
( Timestamp, String, String )Struct
( Timestamp, String, String, String )Struct

More

WINDOW on Apache Spark Documentation

DATE_FORMAT

Description

date_format(timestamp, fmt) - Converts timestamp to a value of string in the format specified by the date format fmt.

Argument typeReturn type
( Date or Timestamp, String )String

More

DATE_FORMAT on Apache Spark Documentation

MONTHS_BETWEEN

Description

months_between(timestamp1, timestamp2) - Returns the number of months between timestamp1 and timestamp2; the result is positive when timestamp1 is later than timestamp2. If both dates fall on the same day of the month (or both are the last day of their months), the result is a whole number; otherwise the fractional part is computed assuming a 31-day month.

Argument typeReturn type
( Date or Timestamp, Date or Timestamp )Float

More

MONTHS_BETWEEN on Apache Spark Documentation
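
A Python sketch of that rule (whole months when the days of month match or both are month-ends; otherwise a remainder over a 31-day month, with time-of-day included):

```python
import calendar
from datetime import datetime

def months_between(ts1: datetime, ts2: datetime) -> float:
    # Whole-month difference first
    months = (ts1.year - ts2.year) * 12 + (ts1.month - ts2.month)
    last1 = calendar.monthrange(ts1.year, ts1.month)[1]
    last2 = calendar.monthrange(ts2.year, ts2.month)[1]
    # Same day of month, or both month-ends: integral result
    if ts1.day == ts2.day or (ts1.day == last1 and ts2.day == last2):
        return float(months)
    # Otherwise add a fraction assuming a 31-day month
    secs1 = ts1.hour * 3600 + ts1.minute * 60 + ts1.second
    secs2 = ts2.hour * 3600 + ts2.minute * 60 + ts2.second
    day_frac = (ts1.day - ts2.day + (secs1 - secs2) / 86400) / 31
    return round(months + day_frac, 8)

print(months_between(datetime(1997, 2, 28, 10, 30), datetime(1996, 10, 30)))
# 3.94959677
```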

DAY

Description

day(date) - Returns the day of month of the date/timestamp.

Argument typeReturn type
( Date or Timestamp )Integer

More

DAY on Apache Spark Documentation

DAYOFMONTH

Description

dayofmonth(date) - Returns the day of month of the date/timestamp.

Argument typeReturn type
( Date or Timestamp )Integer

More

DAYOFMONTH on Apache Spark Documentation

DAYOFYEAR

Description

dayofyear(date) - Returns the day of year of the date/timestamp.

Argument typeReturn type
( Date or Timestamp )Integer

More

DAYOFYEAR on Apache Spark Documentation

DAYOFWEEK

Description

dayofweek(date) - Returns the day of the week for date/timestamp (1 = Sunday, 2 = Monday, ..., 7 = Saturday).

Argument typeReturn type
( Date or Timestamp )Integer

More

DAYOFWEEK on Apache Spark Documentation
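
Note that this numbering differs from weekday() above (0 = Monday). In terms of Python's isoweekday() (1 = Monday ... 7 = Sunday), the mapping can be sketched as:

```python
from datetime import date

def dayofweek(d: date) -> int:
    # Spark's numbering (1 = Sunday ... 7 = Saturday) derived from
    # Python's isoweekday() (1 = Monday ... 7 = Sunday)
    return d.isoweekday() % 7 + 1

print(dayofweek(date(2009, 7, 30)))  # 5 (a Thursday)
```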

WEEKOFYEAR

Description

weekofyear(date) - Returns the week of the year of the given date, following ISO 8601: a week starts on a Monday, and week 1 is the first week with more than 3 days.

Argument typeReturn type
( Date or Timestamp )Integer

More

WEEKOFYEAR on Apache Spark Documentation
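
Since this is the ISO 8601 week number, Python's isocalendar() gives the same value and makes a handy cross-check:

```python
from datetime import date

# weekofyear follows ISO 8601, matching Python's isocalendar() week number
print(date(2008, 2, 20).isocalendar()[1])  # 8
```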

YEAR

Description

year(date) - Returns the year component of the date/timestamp.

Argument typeReturn type
( Date or Timestamp )Integer

More

YEAR on Apache Spark Documentation

MONTH

Description

month(date) - Returns the month component of the date/timestamp.

Argument typeReturn type
( Date or Timestamp )Integer

More

MONTH on Apache Spark Documentation

QUARTER

Description

quarter(date) - Returns the quarter of the year for date, in the range 1 to 4.

Argument typeReturn type
( Date or Timestamp )Integer

More

QUARTER on Apache Spark Documentation

HOUR

Description

hour(timestamp) - Returns the hour component of the timestamp.

Argument typeReturn type
( Timestamp )Integer

More

HOUR on Apache Spark Documentation

MINUTE

Description

minute(timestamp) - Returns the minute component of the timestamp.

Argument typeReturn type
( Timestamp )Integer

More

MINUTE on Apache Spark Documentation

SECOND

Description

second(timestamp) - Returns the second component of the timestamp.

Argument typeReturn type
( Timestamp )Integer

More

SECOND on Apache Spark Documentation

MAKE_INTERVAL

Description

make_interval(years, months, weeks, days, hours, mins, secs) - Makes an interval from years, months, weeks, days, hours, mins, and secs. Arguments: years - the number of years, positive or negative; months - the number of months, positive or negative; weeks - the number of weeks, positive or negative; days - the number of days, positive or negative; hours - the number of hours, positive or negative; mins - the number of minutes, positive or negative; secs - the number of seconds, with the fractional part in microsecond precision.

Argument typeReturn type
No argumentsInterval
( Integer )Interval
( Integer, Integer )Interval
( Integer, Integer, Integer )Interval
( Integer, Integer, Integer, Integer )Interval
( Integer, Integer, Integer, Integer, Integer )Interval
( Integer, Integer, Integer, Integer, Integer, Integer )Interval
( Integer, Integer, Integer, Integer, Integer, Integer, Integer or Float )Interval

More

MAKE_INTERVAL on Apache Spark Documentation

MAKE_TIMESTAMP

Description

make_timestamp(year, month, day, hour, min, sec[, timezone]) - Creates a timestamp from year, month, day, hour, min, sec, and timezone fields. Arguments: year - the year to represent, from 1 to 9999; month - the month-of-year to represent, from 1 (January) to 12 (December); day - the day-of-month to represent, from 1 to 31; hour - the hour-of-day to represent, from 0 to 23; min - the minute-of-hour to represent, from 0 to 59; sec - the second-of-minute and its micro-fraction to represent, from 0 to 60; if sec equals 60, the seconds field is set to 0 and 1 minute is added to the final timestamp; timezone - the time zone identifier, for example CET or UTC.

Argument typeReturn type
( Integer, Integer, Integer, Integer, Integer, Integer or Float )Timestamp
( Integer, Integer, Integer, Integer, Integer, Integer or Float, String )Timestamp

More

MAKE_TIMESTAMP on Apache Spark Documentation
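
The sec = 60 rollover described above can be sketched in Python (a simplified version without the timezone argument, where rollover falls out of plain timedelta arithmetic):

```python
from datetime import datetime, timedelta

def make_timestamp(year, month, day, hour, minute, sec):
    # Sketch without the timezone argument; sec == 60 rolls over to the
    # next minute, matching the description above
    whole = int(sec)
    micros = round((sec - whole) * 1_000_000)
    return (datetime(year, month, day, hour, minute)
            + timedelta(seconds=whole, microseconds=micros))

print(make_timestamp(2019, 6, 30, 23, 59, 60))  # 2019-07-01 00:00:00
```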