@lyne7-sc
Contributor

Which issue does this PR close?

Follow-up to #19738.

Rationale for this change

The current hex implementation expands DictionaryArray inputs into a regular array, which discards the dictionary encoding and redundantly recomputes the hex encoding for every repeated value.

What changes are included in this PR?

  • Apply hex encoding only to the dictionary values, reusing the original keys (see the sketch below)
  • Avoid expanding dictionary arrays during execution
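
A minimal sketch of the pattern, assuming Int32 keys and Binary values; the function name and body are illustrative, not the PR's actual code:

use std::sync::Arc;
use arrow::array::{Array, ArrayRef, BinaryArray, DictionaryArray, StringArray};
use arrow::datatypes::Int32Type;

// Hex-encode only the distinct dictionary values, then rebuild the
// dictionary with the original, untouched key array. Each repeated
// value is encoded exactly once and the output stays dictionary-encoded.
fn hex_dictionary(dict: &DictionaryArray<Int32Type>) -> ArrayRef {
    let values = dict
        .values()
        .as_any()
        .downcast_ref::<BinaryArray>()
        .expect("Binary dictionary values");
    let encoded: StringArray = values
        .iter()
        .map(|v| v.map(|b| b.iter().map(|byte| format!("{byte:02X}")).collect::<String>()))
        .collect();
    Arc::new(DictionaryArray::new(dict.keys().clone(), Arc::new(encoded)))
}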

Benchmark

| Size | Before  | After   | Speedup |
|------|---------|---------|---------|
| 1024 | 8.3 µs  | 7.2 µs  | 1.15×   |
| 4096 | 42.9 µs | 34.5 µs | 1.24×   |
| 8192 | 91.6 µs | 71.7 µs | 1.28×   |

Are these changes tested?

Yes. Existing unit tests and sqllogictest tests pass.

Are there any user-facing changes?

No.

github-actions bot added the sqllogictest (SQL Logic Tests (.slt)) and spark labels on Jan 15, 2026
Comment on lines 271 to 274
let encoded_values_array: ArrayRef = match encoded_values {
ColumnarValue::Array(a) => a,
ColumnarValue::Scalar(s) => Arc::new(s.to_array()?),
};
Contributor

We should probably refactor hex_encode_bytes and hex_encode_int64 to return arrays only, as their signatures say they return ColumnarValue but they never return the scalar variant, forcing handling like this.
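
A minimal sketch of the suggested shape for one of the helpers (the body is illustrative, not the crate's actual implementation):

use std::sync::Arc;
use arrow::array::{ArrayRef, Int64Array, StringArray};
use datafusion_common::Result;

// Return Result<ArrayRef> instead of Result<ColumnarValue>: the Scalar
// variant was never produced, so callers no longer need to match on it.
fn hex_encode_int64(values: &Int64Array) -> Result<ArrayRef> {
    let encoded: StringArray = values
        .iter()
        .map(|v| v.map(|n| format!("{n:X}")))
        .collect();
    Ok(Arc::new(encoded))
}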

}
-DataType::Dictionary(_, value_type) => {
+DataType::Dictionary(_, _) => {
let dict = as_dictionary_array::<Int32Type>(&array);
Contributor

nit: we should have some check that the dictionary has an Int32 key type; otherwise this will panic
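
One way to make the failure explicit, sketched with downcast_ref so a mismatched key type becomes an error rather than a panic (the function name and error message are illustrative):

use arrow::array::{Array, DictionaryArray};
use arrow::datatypes::Int32Type;
use datafusion_common::{exec_err, Result};

// as_dictionary_array::<Int32Type> panics if the keys are not Int32;
// downcast_ref returns None instead, which we can turn into an error.
fn as_int32_dictionary(array: &dyn Array) -> Result<&DictionaryArray<Int32Type>> {
    match array.as_any().downcast_ref::<DictionaryArray<Int32Type>>() {
        Some(dict) => Ok(dict),
        None => exec_err!(
            "hex expects Dictionary(Int32, _), got {:?}",
            array.data_type()
        ),
    }
}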

let dict_values = dict.values();

-match **value_type {
+let encoded_values: ColumnarValue = match dict_values.data_type() {
Contributor

We might want to consider arms for LargeUtf8, the view types, etc.
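
A sketch of the value types a widened dispatch could cover, expressed as a simple predicate (the set of types and the function name are illustrative):

use arrow::datatypes::DataType;

// Illustrative: adds the LargeUtf8 / view variants alongside the
// string, binary, and int64 inputs the diff's helpers already handle.
fn dict_value_type_supported(value_type: &DataType) -> bool {
    matches!(
        value_type,
        DataType::Utf8
            | DataType::LargeUtf8
            | DataType::Utf8View
            | DataType::Binary
            | DataType::LargeBinary
            | DataType::BinaryView
            | DataType::Int64
    )
}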

FROM VALUES ('foo'), ('bar'), ('foo'), (NULL), ('baz'), ('bar');

query T
SELECT hex(dict_col) FROM t_dict_utf8;
Contributor

Can we check the output type here with arrow_typeof to ensure the results are still dictionaries?

Contributor Author

After running the SLT tests for hex, it seems the planner might unpack dictionary-encoded inputs like Dictionary(Int32, Utf8) or Dictionary(Int32, Int64) into their underlying types (Utf8View or Int64) before calling the function. However, Dictionary(Binary) appears to stay dictionary-encoded, as the plans below show:

logical_plan
01)Projection: arrow_typeof(hex(CAST(t_dict_utf8.dict_col AS Utf8View)))
physical_plan
01)ProjectionExec: expr=[arrow_typeof(hex(CAST(dict_col@0 AS Utf8View)))]

logical_plan
01)Projection: arrow_typeof(hex(t_dict_binary.dict_col))
physical_plan
01)ProjectionExec: expr=[arrow_typeof(hex(dict_col@0))]

Contributor

I guess it's related to this issue.

We can still push through with this PR even though it only works for binary (we can change the tests to binary here)

Contributor Author

Thank you for the clarification. The tests have been updated to use Dictionary(Binary).
