Merged
Update parquet to 1.12.3 (the latest version that works with hadoop-client 2.x) * Add support for ZSTD compression * Temporarily declare LZO as not supported; it causes the following error in both the current and pre-upgrade builds: ``` ERROR: PXF server error : Class com.hadoop.compression.lzo.LzoCodec was not found (seg1 10.11.0.131:6000 pid=2567556) ``` * Add tests to cover the different types of compression
CI: Passed: 1005, Failed: 92, Skipped: 61
ostinru added a commit that referenced this pull request on Feb 24, 2026:
Add examples of how to use PXF foreign data wrappers to access ORC files on S3/HDFS.
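As a rough illustration of what such an example covers, accessing ORC files on S3 through a PXF foreign data wrapper might look like the sketch below. The wrapper name `s3_pxf_fdw`, the server configuration name, and the option values are illustrative assumptions, not taken from the referenced commit:

```sql
-- Hypothetical PXF FDW setup for reading ORC files from S3.
-- Wrapper, server, and option names are assumptions for illustration.
CREATE SERVER demo_s3 FOREIGN DATA WRAPPER s3_pxf_fdw
    OPTIONS (config 's3-site');  -- name of a PXF server configuration

CREATE USER MAPPING FOR CURRENT_USER SERVER demo_s3;

CREATE FOREIGN TABLE orc_on_s3 (id int, name text)
    SERVER demo_s3
    OPTIONS (resource 'demo-bucket/data/orc', format 'orc');

SELECT * FROM orc_on_s3 LIMIT 10;
```

The same pattern applies to HDFS by pointing the server at an HDFS-backed PXF configuration instead of an S3 one.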
* Update parquet to 1.15.2
* Add support for ZSTD compression
* Add support for the LZ4_RAW codec
* More optimal memory usage in compression codecs
* Temporarily declare LZO as not supported; it causes the following error in both the current and pre-upgrade builds:
  ```
  ERROR: PXF server error : Class com.hadoop.compression.lzo.LzoCodec was not found (seg1 10.11.0.131:6000 pid=2567556)
  ```
* Add tests to cover the different types of compression
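As a usage sketch, selecting one of the newly supported codecs when writing Parquet through a PXF external table could look like this. The profile and `COMPRESSION_CODEC` option follow PXF's external-table syntax, but the exact codec values accepted by this build (e.g. `zstd`) are an assumption:

```sql
-- Write Parquet with ZSTD compression through PXF (illustrative).
CREATE WRITABLE EXTERNAL TABLE sales_parquet_zstd (id int, amount numeric)
    LOCATION ('pxf://data/sales?PROFILE=hdfs:parquet&COMPRESSION_CODEC=zstd')
    FORMAT 'CUSTOM' (FORMATTER='pxfwritable_export');

INSERT INTO sales_parquet_zstd SELECT id, amount FROM sales;
```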
Dependency tree changes are small:
The new parquet version ships its own shaded Thrift library and does not depend on protobuf.
`parquet-hadoop` in fact expects that `hadoop-client`, `hadoop-common`, `hadoop-annotations`, and `hadoop-mapreduce-client-core` are provided.
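In Maven terms, this means the Hadoop artifacts should sit next to `parquet-hadoop` in `provided` scope rather than being pulled in transitively. A sketch, where the Hadoop version property is a placeholder and not taken from this PR:

```xml
<dependency>
  <groupId>org.apache.parquet</groupId>
  <artifactId>parquet-hadoop</artifactId>
  <version>1.15.2</version>
</dependency>
<!-- parquet-hadoop expects these on the classpath at runtime -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>${hadoop.version}</version>
  <scope>provided</scope>
</dependency>
<!-- likewise hadoop-common, hadoop-annotations,
     and hadoop-mapreduce-client-core in provided scope -->
```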