
Conversation


@dibryant dibryant commented Nov 14, 2025

Fixes for https://issues.redhat.com/browse/RHAIENG-1758

Description

Updated the GitHub Actions workflow with the missing Pandoc install

How Has This Been Tested?

Self checklist (all need to be checked):

  • Ensure that you have run make test (gmake on macOS) before asking for review
  • Changes to everything except Dockerfile.konflux files should be done in odh/notebooks and automatically synced to rhds/notebooks. For Konflux-specific changes, modify Dockerfile.konflux files directly in rhds/notebooks as these require special attention in the downstream repository and flow to the upcoming RHOAI release.

Merge criteria:

  • The commits are squashed in a cohesive manner and have meaningful messages.
  • Testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
  • The developer has manually tested the changes and verified that the changes work

Summary by CodeRabbit

  • New Features

    • Added PDF export tooling with TeX Live and Pandoc for enhanced document generation.
    • Extended optimized support for ppc64le architecture.
  • Improvements

    • Streamlined container build process and reduced installation layers for faster builds.
    • Configured container startup with a proper entrypoint.
    • Introduced conditional caching to speed PDF dependency provisioning on supported architectures.



coderabbitai bot commented Nov 14, 2025

Warning

Rate limit exceeded

@dibryant has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 19 minutes and 11 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

📥 Commits

Reviewing files that changed from the base of the PR and between b498018 and aee9043.

📒 Files selected for processing (4)
  • jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (3 hunks)
  • jupyter/utils/install_pandoc.sh (1 hunks)
  • jupyter/utils/install_texlive.sh (1 hunks)
  • tests/containers/workbenches/jupyterlab/jupyterlab_test.py (1 hunks)

Walkthrough

Adds two new architecture-aware install scripts for Pandoc and TeX Live (ppc64le-only) and updates the UBI9 Python 3.12 CPU Dockerfile to integrate PDF-tooling build stages, conditional caching, and an ENTRYPOINT.

Changes

  • Installation scripts (jupyter/utils/install_pandoc.sh, jupyter/utils/install_texlive.sh): New executable bash scripts. Both map uname -m to GOARCH and run only for ppc64le. install_pandoc.sh installs pandoc via dnf, creates /usr/local/pandoc/bin, symlinks pandoc, updates PATH, and verifies the version. install_texlive.sh installs build deps, builds and installs TeX Live 2025 into /usr/local/texlive, creates arch-specific symlinks, updates PATH, and verifies pdflatex/tlmgr. Both use strict error handling (set -euxo pipefail).
  • Docker configuration (jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu): Adds a pdf-builder stage and a ubi-repos stage for repo config; consolidates and streamlines RUN blocks; conditionally uses cached texlive/pandoc on ppc64le (copies from cache) or runs the install scripts otherwise; bundles Python dependency installation steps; adds ENTRYPOINT start-notebook.sh.
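
For orientation, a minimal sketch of the architecture-gating pattern the summary describes is below; the exact mapping, messages, and layout of the real scripts may differ, so treat this as illustrative only.

#!/bin/bash
set -euxo pipefail

# Map the machine type reported by uname -m to a GOARCH-style name (illustrative mapping)
case "$(uname -m)" in
  x86_64)  ARCH="amd64" ;;
  aarch64) ARCH="arm64" ;;
  ppc64le) ARCH="ppc64le" ;;
  s390x)   ARCH="s390x" ;;
  *)       ARCH="$(uname -m)" ;;
esac

# Only ppc64le needs the locally built TeX Live/Pandoc; other arches exit successfully
if [[ "$ARCH" != "ppc64le" ]]; then
  echo "Skipping ppc64le-only PDF tooling install on $ARCH"
  exit 0
fi

# ppc64le-only work (dnf installs, TeX Live build, symlinks, PATH updates) would follow here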

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

  • Check architecture detection, the uname -m → GOARCH mapping, and the conditional branches in both scripts.
  • Review TeX Live build/install steps, install paths, and permissions in install_texlive.sh.
  • Verify symlink targets and PATH modifications in both scripts to avoid conflicts (a spot-check sketch follows this list).
  • Inspect the Dockerfile caching logic, the new pdf-builder/ubi-repos stages, and the added ENTRYPOINT for startup compatibility.
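
One quick way to spot-check the symlink targets and PATH wiring called out above, using the paths named in the walkthrough (they may differ in the final image):

# Confirm the 'linux' symlink resolves to the arch-specific TeX Live bin directory
readlink -f /usr/local/texlive/bin/linux

# Confirm the tools resolve on PATH and report a version
command -v pandoc pdflatex tlmgr
pandoc --version | head -n 1
pdflatex --version | head -n 1
tlmgr --version | head -n 1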

Pre-merge checks and finishing touches

❌ Failed checks (2 warnings, 1 inconclusive)
  • Description check (⚠️ Warning): The PR description is largely incomplete: it provides minimal detail ('Updated GA with missing Pandoc'), lacks testing information, and all self-checklist items and merge criteria remain unchecked, indicating insufficient preparation for review. Resolution: complete the description with detailed change explanations, testing procedures, and environment details, and verify all self-checklist items before requesting review; mark completed checklist items with [x].
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 0.00%, which is insufficient; the required threshold is 80.00%. You can run @coderabbitai generate docstrings to improve docstring coverage.
  • Title check (❓ Inconclusive): The PR title references a Jira ticket (RHAIENG-1758) but is vague about the actual changes; 'WIP' indicates work-in-progress status, and the phrase 'Revise Tests for 2025b Onboarding: PandocMissing' doesn't clearly match the actual changes (adding Pandoc and TeX Live installation scripts and Dockerfile updates). Resolution: clarify the title to describe the main changes, e.g., 'Add Pandoc and TeX Live installation scripts for ppc64le architecture', removing vague prefixes like 'WIP'.


@github-actions github-actions bot added the review-requested label (GitHub Bot creates notification on #pr-review-ai-ide-team slack channel) on Nov 14, 2025
@openshift-ci openshift-ci bot requested review from atheo89 and daniellutz November 14, 2025 01:23

openshift-ci bot commented Nov 14, 2025

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please assign daniellutz for approval. For more information see the Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

Copy link
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
.github/workflows/build-notebooks-push.yaml (1)

33-34: Remove sudo and pin Pandoc version for deterministic builds.

GitHub Actions runners have sufficient permissions; sudo is typically unnecessary and can cause issues on certain runners. Additionally, Pandoc's version should be pinned to ensure reproducible builds and prevent unexpected breaking changes.

Apply this diff to remove sudo and pin the version:

-      - name: Install Pandoc
-        run: sudo apt-get update && sudo apt-get install -y pandoc
+      - name: Install Pandoc
+        run: apt-get install -y pandoc=3.1.*

Alternatively, if you simply want the distribution's current version without pinning:

-      - name: Install Pandoc
-        run: sudo apt-get update && sudo apt-get install -y pandoc
+      - name: Install Pandoc
+        run: |
+          apt-get update
+          apt-get install -y pandoc
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e9c915d and 1b7f86e.

📒 Files selected for processing (1)
  • .github/workflows/build-notebooks-push.yaml (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (31)
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
  • GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
🔇 Additional comments (1)
.github/workflows/build-notebooks-push.yaml (1)

33-34: Pandoc installation may be in the wrong job—verify placement and necessity.

The Pandoc installation step is added to the gen job, which only runs gen_gha_matrix_jobs.py to generate the build matrix. Pandoc is typically needed during actual notebook building, not matrix generation. Confirm whether:

  1. The Python matrix generation script actually requires Pandoc
  2. This step should instead be in the build job(s) or in ./.github/workflows/build-notebooks-TEMPLATE.yaml

If Pandoc is not needed for matrix generation, this step should be moved to avoid unnecessary overhead and potential ordering issues.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
.github/workflows/build-notebooks-push.yaml (1)

33-36: Install Pandoc step looks good, minor clarity suggestion on sudo usage.

The placement and logic are correct—Pandoc is installed after checkout and before the matrix generation step that likely depends on it. The -y flag for automatic confirmation is appropriate for CI/CD.

Minor note: sudo is unnecessary in GitHub Actions ubuntu-latest runners, which typically run as root. You can simplify to just apt-get update && apt-get install -y pandoc for clarity.

       - name: Install Pandoc
         run: |
-          sudo apt-get update
-          sudo apt-get install -y pandoc
+          apt-get update
+          apt-get install -y pandoc
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 4c3d496 and c9cfd1b.

📒 Files selected for processing (1)
  • .github/workflows/build-notebooks-push.yaml (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (31)
  • GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
  • GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
  • GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
🔇 Additional comments (1)
.github/workflows/build-notebooks-push.yaml (1)

20-47: Reminder: Complete the PR checklist before requesting review removal.

This PR is marked as WIP and the self-checklist indicates you should run make test (gmake on macOS) before requesting review, and ensure the merge criteria are met. Please complete these checks before changing the PR status from work-in-progress.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (4)
jupyter/utils/install_texlive.sh (4)

72-82: Critical: Code executes unconditionally for all architectures.

Lines 72-82 are not indented inside the if [[ "$ARCH" == "ppc64le" ]] block (which begins at line 13). This means they execute for all architectures, not just ppc64le. On non-ppc64le systems, the TeX Live installation is skipped, so /usr/local/texlive/bin/powerpc64le-unknown-linux-gnu does not exist, and line 76 will fail when attempting to create the symlink.

This is a repeat of a critical issue from previous reviews that remains unresolved.

Move lines 72-82 inside the ppc64le conditional block:

   ./install-tl --profile=texlive.profile --custom-bin=$TEXLIVE_INSTALL_PREFIX/bin/powerpc64le-unknown-linux-gnu

-# TeX Live binary directory
-TEX_BIN_DIR="/usr/local/texlive/bin/powerpc64le-unknown-linux-gnu"
-
-# Create standard symlink 'linux' → arch-specific folder
-ln -sf "$TEX_BIN_DIR" /usr/local/texlive/bin/linux
-
-
-  # Set up environment
-  export PATH="$TEXLIVE_INSTALL_PREFIX/bin/linux:$PATH"
-  pdflatex --version
-  tlmgr --version
-
-fi
+  # TeX Live binary directory
+  TEX_BIN_DIR="/usr/local/texlive/bin/powerpc64le-unknown-linux-gnu"
+
+  # Create standard symlink 'linux' → arch-specific folder
+  ln -sf "$TEX_BIN_DIR" /usr/local/texlive/bin/linux
+
+  # Set up environment
+  export PATH="$TEXLIVE_INSTALL_PREFIX/bin/linux:$PATH"
+  pdflatex --version
+  tlmgr --version
+
+fi

51-51: Cleanup path mismatch remains unresolved.

This line uses absolute path /texlive-build for cleanup, but line 41 creates ../texlive-build (relative path). These may not refer to the same directory, potentially leaving stale directories or removing unintended paths.

This issue was previously flagged in past reviews and remains unresolved.

Use consistent paths for creation and cleanup:

   # Create build directory and build
-  mkdir -p ../texlive-build
-  cd ../texlive-build
+  BUILD_DIR="/tmp/texlive-build"
+  mkdir -p "$BUILD_DIR"
+  cd "$BUILD_DIR"
   ../texlive-20250308-source/configure --prefix=/usr/local/texlive

And update the cleanup:

   # Cleanup sources to reduce image size
-  rm -rf /texlive-20250308-source /texlive-build
+  cd /
+  rm -rf /texlive-20250308-source "$BUILD_DIR"

57-57: Unsafe glob expansion remains unresolved.

The pattern cd install-tl-2*/ relies on glob expansion that could match zero directories (silent failure) or multiple directories (ambiguous match).

This issue was previously flagged in past reviews and remains unresolved.

Add explicit validation:

   wget https://mirror.ctan.org/systems/texlive/tlnet/install-tl-unx.tar.gz
   tar -xzf install-tl-unx.tar.gz
-  cd install-tl-2*/
+  
+  INSTALL_TL_DIR=$(find . -maxdepth 1 -type d -name 'install-tl-*' | head -1)
+  if [[ -z "$INSTALL_TL_DIR" ]]; then
+    echo "Error: TeX Live installer directory not found after extraction"
+    exit 1
+  fi
+  cd "$INSTALL_TL_DIR"

21-30: Missing checksum verification for external downloads.

External downloads lack integrity verification, creating supply-chain risks and making builds non-reproducible. This affects:

  • Lines 21-30: CentOS RPMs from mirror.stream.centos.org
  • Line 33: TeX Live 2025 source archive
  • Line 55: TeX Live installer archive

This issue was previously flagged in past reviews and remains unresolved.

Add SHA256 verification for each download. Example for the TeX Live source:

   # Step 1: Download and extract the TeX Live source
-  wget https://ftp.math.utah.edu/pub/tex/historic/systems/texlive/2025/texlive-20250308-source.tar.xz
-  tar -xf texlive-20250308-source.tar.xz
+  TEXLIVE_SOURCE="texlive-20250308-source.tar.xz"
+  TEXLIVE_SOURCE_SHA256="<expected-hash-here>"
+  wget "https://ftp.math.utah.edu/pub/tex/historic/systems/texlive/2025/${TEXLIVE_SOURCE}"
+  echo "${TEXLIVE_SOURCE_SHA256}  ${TEXLIVE_SOURCE}" | sha256sum -c - || exit 1
+  tar -xf "${TEXLIVE_SOURCE}"

Apply similar verification to the installer archive and consider pinning RPM checksums.

Also applies to: 33-33, 55-55

🧹 Nitpick comments (2)
tests/containers/workbenches/jupyterlab/jupyterlab_test.py (1)

111-112: Strengthen the success validation.

The current assertion checks for "Writing" or "PDF" in the output, but "PDF" is too generic and could appear in error messages (e.g., "Failed to generate PDF"). The exit code check alone is sufficient, or you could verify the actual PDF file was created.

Consider one of these approaches:

Option 1: Remove the loose string check and rely on exit code:

             # Otherwise PDF export should work
             assert exit_code == 0, output
-            assert "Writing" in output or "PDF" in output

Option 2: Verify the PDF file was actually created:

             # Otherwise PDF export should work
             assert exit_code == 0, output
-            assert "Writing" in output or "PDF" in output
+            pdf_file = test_file_name.replace(".ipynb", ".pdf")
+            exit_code_check, _ = container.exec(["test", "-f", f"{self.APP_ROOT_HOME}/{pdf_file}"])
+            assert exit_code_check == 0, f"PDF file {pdf_file} was not created"
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (1)

9-25: Address Hadolint warning about USER directive.

The pipeline failure indicates DL3002: Last USER should not be root at line 17. While this is a build stage and the final jupyter-minimal stage correctly switches to USER 1001 (line 104), it's good practice to switch back to a non-root user even in intermediate stages to minimize the attack surface.

Add a USER directive before the stage ends:

 RUN ./install_texlive.sh
 RUN ./install_pandoc.sh
+
+USER 1001
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 9e28c28 and 88a76fb.

📒 Files selected for processing (4)
  • jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (3 hunks)
  • jupyter/utils/install_pandoc.sh (1 hunks)
  • jupyter/utils/install_texlive.sh (1 hunks)
  • tests/containers/workbenches/jupyterlab/jupyterlab_test.py (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • jupyter/utils/install_pandoc.sh
🧰 Additional context used
🪛 GitHub Actions: Code static analysis
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu

[warning] 17-17: Hadolint DL3002: Last USER should not be root.

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (25)
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
🔇 Additional comments (3)
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (3)

127-127: LGTM: ENTRYPOINT correctly configured.

The ENTRYPOINT directive properly references the start-notebook.sh script that was copied to the working directory at line 106.


95-102: No issues found — the install_pdf_deps.sh script exists.

The script referenced at line 101 (./utils/install_pdf_deps.sh) is present in the repository at jupyter/utils/install_pdf_deps.sh. The Dockerfile reference is valid and the code is correct as-is.


95-95: The mount directive syntax is valid—no changes needed.

The original review comment incorrectly identified this as problematic syntax. Docker BuildKit supports cache mounts with from and source parameters to use a directory from another build stage as the cache base, with syntax: RUN --mount=type=cache,target=<target>,id=<id>,from=<stage-or-image>,source=<path-in-from> <command>

The Dockerfile's usage of --mount=type=cache,from=pdf-builder,source=/usr/local/,target=/pdf_builder/ aligns with this documented feature. This requires Dockerfile frontend >= 1.2/1.3, so ensure the syntax directive at the top of the Dockerfile specifies an appropriate version. The code does not require restructuring.

Likely an incorrect or invalid review comment.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (4)
jupyter/utils/install_texlive.sh (4)

72-82: CRITICAL: Code outside conditional executes unconditionally for all architectures.

Lines 72–82 are unindented and execute for all architectures, not just ppc64le. For non-ppc64le systems, line 76 attempts to create a symlink to /usr/local/texlive/bin/powerpc64le-unknown-linux-gnu, which does not exist (TeX Live installation was skipped). This causes the build to fail on non-ppc64le platforms.

Indent lines 72–82 to be inside the if [[ "$ARCH" == "ppc64le" ]] block (which ends at line 84):

   ./install-tl --profile=texlive.profile --custom-bin=$TEXLIVE_INSTALL_PREFIX/bin/powerpc64le-unknown-linux-gnu

-# TeX Live binary directory
-TEX_BIN_DIR="/usr/local/texlive/bin/powerpc64le-unknown-linux-gnu"
-
-# Create standard symlink 'linux' → arch-specific folder
-ln -sf "$TEX_BIN_DIR" /usr/local/texlive/bin/linux
-
-
-  # Set up environment
-  export PATH="$TEXLIVE_INSTALL_PREFIX/bin/linux:$PATH"
-  pdflatex --version
-  tlmgr --version
-
-fi
+  # TeX Live binary directory
+  TEX_BIN_DIR="/usr/local/texlive/bin/powerpc64le-unknown-linux-gnu"
+
+  # Create standard symlink 'linux' → arch-specific folder
+  ln -sf "$TEX_BIN_DIR" /usr/local/texlive/bin/linux
+
+  # Set up environment
+  export PATH="$TEXLIVE_INSTALL_PREFIX/bin/linux:$PATH"
+  pdflatex --version
+  tlmgr --version
+
+fi

40-51: Cleanup logic uses inconsistent path resolution (build vs removal).

Line 41 creates ../texlive-build relative to the source directory, but line 51 removes /texlive-build (absolute root path). These paths may not match, leaving stale build artifacts in the image. Move the cleanup after changing to a safe directory and use tracked variables:

   make -j"$(nproc)"
   make install

+  # Change to a safe directory before cleanup
+  cd /
+
+  # Cleanup sources to reduce image size
+  rm -rf /texlive-20250308-source /texlive-build

   # Symlink for pdflatex
   ln -sf pdftex /usr/local/texlive/bin/powerpc64le-unknown-linux-gnu/pdflatex

-  # Cleanup sources to reduce image size
-  rm -rf /texlive-20250308-source /texlive-build

55-58: Unsafe glob expansion for directory traversal without existence checks.

Line 57 uses cd install-tl-2*/ which silently expands unpredictably if the glob matches zero or multiple directories. Add explicit error handling:

   wget https://mirror.ctan.org/systems/texlive/tlnet/install-tl-unx.tar.gz
   tar -xzf install-tl-unx.tar.gz
+
+  # Find and validate the extracted installer directory
+  INSTALL_TL_DIR=$(find . -maxdepth 1 -type d -name 'install-tl-*' | head -1)
+  if [[ -z "$INSTALL_TL_DIR" ]]; then
+    echo "Error: TeX Live installer directory not found after extraction"
+    exit 1
+  fi
-  cd install-tl-2*/
+  cd "$INSTALL_TL_DIR"

21-30: Add checksum verification for all external downloads to ensure supply-chain integrity.

External downloads lack integrity verification, making builds non-reproducible and vulnerable to supply-chain attacks. Downloads occur at lines 21–30 (RPMs), 33 (TeX Live source), and 55 (TeX Live installer). Apply this pattern to each:

   # Step 1: Download and extract the TeX Live source
-  wget https://ftp.math.utah.edu/pub/tex/historic/systems/texlive/2025/texlive-20250308-source.tar.xz
-  tar -xf texlive-20250308-source.tar.xz
+  TEXLIVE_SOURCE="texlive-20250308-source.tar.xz"
+  TEXLIVE_SOURCE_SHA256="<compute-and-insert-expected-hash>"
+  wget "https://ftp.math.utah.edu/pub/tex/historic/systems/texlive/2025/${TEXLIVE_SOURCE}"
+  echo "${TEXLIVE_SOURCE_SHA256}  ${TEXLIVE_SOURCE}" | sha256sum -c - || exit 1
+  tar -xf "${TEXLIVE_SOURCE}"

Apply the same pattern to the TeX Live installer and consider pinning RPM URLs or computing their hashes locally.

Also applies to: 33-33, 55-55

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 88a76fb and 40c90bd.

📒 Files selected for processing (3)
  • jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (3 hunks)
  • jupyter/utils/install_pandoc.sh (1 hunks)
  • jupyter/utils/install_texlive.sh (1 hunks)
🧰 Additional context used
🪛 GitHub Actions: Code static analysis
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu

[warning] 17-17: DL3002 Last USER should not be root.

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (25)
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
🔇 Additional comments (5)
jupyter/utils/install_pandoc.sh (1)

13-22: PATH export within conditional won't affect subsequent Docker build steps.

Line 20 exports PATH within the conditional block, but this modification is only visible during the current RUN instruction's execution; mounted files and environment changes are not persisted in the final image when using RUN --mount constructs. For Pandoc to be available in subsequent layers, PATH must be set in the Dockerfile or in a persistent shell profile. Verify that the Dockerfile properly sets PATH after invoking this script, or restructure to embed the PATH update directly in the Dockerfile's ENV directive.
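
As a rough sketch of the 'persistent shell profile' option mentioned above (the /usr/local/pandoc/bin path comes from the script summary, and whether profile.d is sourced by the notebook entrypoint is an assumption to verify):

# Hypothetical addition to install_pandoc.sh: persist the PATH change for login shells
cat > /etc/profile.d/pandoc.sh <<'EOF'
export PATH="/usr/local/pandoc/bin:${PATH}"
EOF
chmod 0644 /etc/profile.d/pandoc.sh

Since /etc/profile.d is only sourced by login shells, an ENV PATH=... line in the Dockerfile is usually the more reliable route for processes started by the ENTRYPOINT.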

jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (4)

9-25: Effective architecture-specific PDF tooling stage with proper cache-mount technique.

The new pdf-builder stage isolates ppc64le-specific TeX Live and Pandoc builds, reducing final image bloat. Cache mounts (RUN --mount=type=cache) are only available while the RUN command executes (notably on Podman); anything needed from the cache must be copied to a persistent location within that same RUN command. Lines 92–99 correctly copy the built artifacts out of the cache during the same RUN instruction, ensuring portability across Docker and Podman.


91-99: Verify conditional ppc64le copy logic and error handling.

Lines 93–99 conditionally copy prebuilt Pandoc/TeX Live for ppc64le or invoke install_pdf_deps.sh for other architectures. Ensure that:

  1. ./utils/install_pdf_deps.sh exists in the repository and is compatible with all target architectures.
  2. The bash -c subshell within the RUN properly handles failures; currently, set -euxo pipefail is not active in the subshell, so errors may be masked.

Consider rewriting with explicit error handling:

-RUN --mount=type=cache,from=pdf-builder,source=/usr/local/,target=/pdf_builder/,rw \
+RUN set -euxo pipefail && \
+    --mount=type=cache,from=pdf-builder,source=/usr/local/,target=/pdf_builder/,rw \

Actually, set -euxo pipefail cannot precede mount syntax. Instead, ensure the conditional logic is robust or move it to the helper script.


106-120: Consolidate Python dependency installation and verify kernel.json update syntax.

Lines 106–120 combine multiple steps (pip install, kernel.json update, permissions fix, addon apply) in a single RUN, which is good for layer reduction. However, verify the sed command on line 113:

sed -i -e "s/Python.*/$(python --version | cut -d '.' -f-2)\",/" /opt/app-root/share/jupyter/kernels/python3/kernel.json

This assumes kernel.json contains a line matching Python.*. If the file structure differs or the pattern doesn't match, sed silently succeeds and the kernel.json is not updated. Add validation or use a more robust method:

# Verify kernel.json exists before sed
if [[ ! -f /opt/app-root/share/jupyter/kernels/python3/kernel.json ]]; then
  echo "Error: kernel.json not found"
  exit 1
fi
sed -i -e "s/Python.*/$(python --version | cut -d '.' -f-2)\",/" /opt/app-root/share/jupyter/kernels/python3/kernel.json

89-89: Pipeline warning: User context after file operations.

The pipeline reports DL3002: Last USER should not be root. Line 89 sets USER 0 for package installation, line 101 sets USER 1001 for general operations, and lines 106–120 perform file operations as USER 1001. Since the final effective user before ENTRYPOINT is 1001, the warning may be a false positive or refer to a different section. However, verify that all sensitive operations (package installation, permissions fixes) complete before demoting to USER 1001, which appears correct here.

Also applies to: 101-101, 106-120

Comment on lines +16 to +17
dnf install -y https://dl.fedoraproject.org/pub/epel/epel-release-latest-9.noarch.rpm
dnf install -y pandoc


⚠️ Potential issue | 🟠 Major

Add error handling and checksum verification for external dependencies.

Lines 16–17 install an external RPM and Pandoc package without verifying integrity or handling installation failures. If dnf install fails silently or the RPM is compromised, the build continues unchecked. Apply strict error checking and pin package versions or checksums:

-  dnf install -y https://dl.fedoraproject.org/pub/epel/epel-release-latest-9.noarch.rpm
-  dnf install -y pandoc
+  dnf install -y --setopt=skip_missing_names_on_install=False \
+    https://dl.fedoraproject.org/pub/epel/epel-release-latest-9.noarch.rpm && \
+  dnf install -y --setopt=skip_missing_names_on_install=False pandoc && \
+  pandoc --version  # Verify installation
🤖 Prompt for AI Agents
In jupyter/utils/install_pandoc.sh around lines 16–17, the script currently runs
dnf install on an external RPM and pandoc without integrity checks or robust
error handling; update the flow to (1) download the epel-release RPM to a
temporary file, (2) verify its checksum (and/or RPM GPG signature) against a
pinned value before installing, (3) install using dnf only if verification
succeeds, (4) pin the pandoc package/version (or install a specific released
RPM) and verify its checksum/signature as well, and (5) enable strict failure
behavior (e.g., set -e) and check dnf exit codes so the script exits with a
clear error message if any download, checksum verification, or install step
fails.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (1)

126-126: Verify ENTRYPOINT compatibility with start-notebook.sh.

Line 126 sets ENTRYPOINT ["start-notebook.sh"] but the script is copied at line 105 to ./ (which is /opt/app-root/bin due to line 87 WORKDIR). Ensure:

  • start-notebook.sh is on PATH or ENTRYPOINT uses an absolute path: ENTRYPOINT ["/opt/app-root/bin/start-notebook.sh"]
  • The script is executable (chmod +x may be needed in the COPY step or after)
  • No CMD is defined to override ENTRYPOINT with positional arguments

Current form may fail because the shell cannot locate start-notebook.sh without it being in PATH.

-ENTRYPOINT ["start-notebook.sh"]
+ENTRYPOINT ["/opt/app-root/bin/start-notebook.sh"]

Or:

 WORKDIR /opt/app-root/src
 
-ENTRYPOINT ["start-notebook.sh"]
+ENV PATH="/opt/app-root/bin:$PATH"
+ENTRYPOINT ["start-notebook.sh"]
♻️ Duplicate comments (4)
jupyter/utils/install_texlive.sh (4)

72-84: Critical: Incorrect indentation — setup code outside if block executes unconditionally.

Lines 72–84 (TEX_BIN_DIR assignment, symlink creation, PATH export, version verification) are at indentation level 0 and execute for all architectures, not just ppc64le. For non-ppc64le systems (where TeX Live was not installed), line 76 fails because /usr/local/texlive/bin/powerpc64le-unknown-linux-gnu does not exist.

Move these lines inside the if block with proper indentation:

   ./install-tl --profile=texlive.profile --custom-bin=$TEXLIVE_INSTALL_PREFIX/bin/powerpc64le-unknown-linux-gnu

-# TeX Live binary directory
-TEX_BIN_DIR="/usr/local/texlive/bin/powerpc64le-unknown-linux-gnu"
-
-# Create standard symlink 'linux' → arch-specific folder
-ln -sf "$TEX_BIN_DIR" /usr/local/texlive/bin/linux
-
-
-  # Set up environment
-  export PATH="$TEXLIVE_INSTALL_PREFIX/bin/linux:$PATH"
-  pdflatex --version
-  tlmgr --version
-
-fi
+  # TeX Live binary directory
+  TEX_BIN_DIR="/usr/local/texlive/bin/powerpc64le-unknown-linux-gnu"
+
+  # Create standard symlink 'linux' → arch-specific folder
+  ln -sf "$TEX_BIN_DIR" /usr/local/texlive/bin/linux
+
+  # Set up environment
+  export PATH="$TEXLIVE_INSTALL_PREFIX/bin/linux:$PATH"
+  pdflatex --version
+  tlmgr --version
+
+fi

57-57: Unsafe glob expansion for directory traversal.

Line 57 uses cd install-tl-2*/ which silently fails to match if extraction doesn't produce the expected directory. Add explicit error handling:

   wget https://mirror.ctan.org/systems/texlive/tlnet/install-tl-unx.tar.gz
   tar -xzf install-tl-unx.tar.gz
-  cd install-tl-2*/
+  
+  # Find and verify the extracted install-tl directory
+  INSTALL_TL_DIR=$(find . -maxdepth 1 -type d -name 'install-tl-*' | head -1)
+  if [[ -z "$INSTALL_TL_DIR" ]]; then
+    echo "Error: TeX Live installer directory not found after extraction"
+    exit 1
+  fi
+  cd "$INSTALL_TL_DIR"

40-51: Cleanup logic path mismatch — cleanup may remove unintended files or leave stale directories.

Line 41 creates ../texlive-build (relative to /texlive-20250308-source), but line 51 removes /texlive-build (absolute root path). These may not refer to the same directory. Additionally, the script removes /texlive-20250308-source at the root level (line 51) while it was extracted into the current working context. Use tracked variables to ensure cleanup removes exactly what was created:

   # Create build directory and build
-  mkdir -p ../texlive-build
-  cd ../texlive-build
+  BUILD_DIR="/texlive-build"
+  mkdir -p "$BUILD_DIR"
+  cd "$BUILD_DIR"
   ../texlive-20250308-source/configure --prefix=/usr/local/texlive
   make -j"$(nproc)"
   make install
 
   # Symlink for pdflatex
   ln -sf pdftex /usr/local/texlive/bin/powerpc64le-unknown-linux-gnu
   
   # Cleanup sources to reduce image size
-  rm -rf /texlive-20250308-source /texlive-build
+  cd /
+  rm -rf /texlive-20250308-source "$BUILD_DIR"

21-30: Missing checksum verification for external downloads — non-reproducible builds and supply-chain risk.

Downloads lack integrity verification:

  • Lines 21–30: CentOS RPMs (hardcoded URLs)
  • Lines 33–34: TeX Live source archive
  • Lines 55–56: TeX Live installer archive

Compute and verify SHA256 checksums immediately after each download. Example for TeX Live source:

   # Step 1: Download and extract the TeX Live source
-  wget https://ftp.math.utah.edu/pub/tex/historic/systems/texlive/2025/texlive-20250308-source.tar.xz
-  tar -xf texlive-20250308-source.tar.xz
+  TEXLIVE_SOURCE="texlive-20250308-source.tar.xz"
+  TEXLIVE_SOURCE_SHA256="<expected-hash-here>"
+  wget "https://ftp.math.utah.edu/pub/tex/historic/systems/texlive/2025/${TEXLIVE_SOURCE}"
+  echo "${TEXLIVE_SOURCE_SHA256}  ${TEXLIVE_SOURCE}" | sha256sum -c - || exit 1
+  tar -xf "${TEXLIVE_SOURCE}"

Apply the same pattern to the TeX Live installer and consider verifying RPM checksums.

Also applies to: 33-34, 55-56

🧹 Nitpick comments (2)
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (2)

53-53: Verify that additional packages are necessary.

Line 53 adds perl, mesa-libGL, and skopeo. Confirm:

  • perl: Required for TeX Live installer (makes sense given pdf-builder stage)
  • mesa-libGL: Ensure this is not a transitive dependency that dnf would have installed anyway
  • skopeo: Verify this is actually used by the notebooks or if it can be deferred

Consider documenting why each package is required via inline comments.


108-122: Consolidated install chain is comprehensive but complex — ensure all steps are necessary and ordering is correct.

Lines 108–122 combine Python dependency install, JupyterLab config, kernel setup, permissions, and addons in a single RUN. This reduces layer count but makes debugging harder if a step fails.

Key observations:

  • Line 111: --verify-hashes requires that all dependencies have hashes in pylock.toml. Verify this is true and that hash verification is not being skipped.
  • Line 113: Disables JupyterLab announcements plugin — confirm this is intentional and not masking security advisories.
  • Line 115: Sed substitution of kernel display name — ensure the regex correctly handles all Python versions.
  • Line 120: fix-permissions /opt/app-root -P — verify this command exists and is idempotent.

Consider splitting this into multiple RUN commands if any step is optional or frequently changes, to improve cache hit rates.
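
If the sed pattern ever proves fragile, a hedged alternative is to edit the JSON directly with the Python interpreter already in the image; the kernel.json path is the one quoted above, and the display_name key is assumed per the standard kernelspec format:

KERNEL_JSON=/opt/app-root/share/jupyter/kernels/python3/kernel.json
test -f "$KERNEL_JSON" || { echo "kernel.json not found" >&2; exit 1; }

python - "$KERNEL_JSON" <<'EOF'
import json, platform, sys

path = sys.argv[1]
with open(path) as f:
    spec = json.load(f)

# Rewrite the display name using the current major.minor Python version
major, minor = platform.python_version_tuple()[:2]
spec["display_name"] = f"Python {major}.{minor}"

with open(path, "w") as f:
    json.dump(spec, f, indent=1)
EOF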

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 40c90bd and b498018.

📒 Files selected for processing (3)
  • jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (3 hunks)
  • jupyter/utils/install_pandoc.sh (1 hunks)
  • jupyter/utils/install_texlive.sh (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • jupyter/utils/install_pandoc.sh
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (20)
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
  • GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
🔇 Additional comments (4)
jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (4)

48-49: RUN command consolidation in cpu-base stage looks good.

The upgrade and clean operations are properly chained with && and flags like --nobest --skip-broken are appropriate for this base image context.


58-67: Verify micropipenv, uv, and oc client versions and pinning strategy.

Lines 59, 63–66 install tools with fixed versions (micropipenv 1.9.0, uv 0.8.12, oc latest-stable):

  • micropipenv==1.9.0: Pinned. Verify this version is compatible with the pylock.toml format.
  • uv==0.8.12: Pinned. Verify this version is recent enough and security-sound as of November 2025.
  • oc: No version pinning — always pulls "latest". Consider explicitly pinning to a stable release version for reproducibility.

9-27: Review comment is incorrect regarding the indentation claim.

The actual install_texlive.sh and install_pandoc.sh scripts have correct indentation and proper control flow. Both scripts use a conditional if [[ "$ARCH" == "ppc64le" ]] structure that correctly handles non-ppc64le architectures by gracefully exiting with success (code 0) after doing nothing—not failing.

For non-ppc64le architectures, the Dockerfile intentionally uses a different installation strategy via ./utils/install_pdf_deps.sh at line 94, so these scripts are only meant to install for ppc64le in the pdf-builder stage.

Both scripts correctly install to the expected paths (/usr/local/texlive and /usr/local/pandoc), are made executable in the Dockerfile, and integrate properly with the cache mount strategy.

Likely an incorrect or invalid review comment.


93-102: The primary concerns in the review comment are not valid issues.

Verification results:

  1. Script existence — Confirmed: jupyter/utils/install_pdf_deps.sh exists and is properly copied into the image at build time via the COPY ${JUPYTER_REUSABLE_UTILS} utils/ instruction (line 72).

  2. Path correctness — Confirmed: Both install_texlive.sh (line 24 in pdf-builder) and install_pandoc.sh (line 25) create exactly the paths expected by the cp commands:

    • install_texlive.sh creates /usr/local/texlive via --prefix=/usr/local/texlive
    • install_pandoc.sh creates /usr/local/pandoc/bin with symlink to /usr/bin/pandoc
    • The cache mount correctly maps /usr/local/ from pdf-builder → /pdf_builder/ in the target stage
    • Copies pull from the correct paths: /pdf_builder/texlive and /pdf_builder/pandoc
  3. Error handling — This is a valid robustness suggestion rather than a bug. The install_pdf_deps.sh script uses set -euxo pipefail for error detection, but the cp commands and script invocation in the Dockerfile lack explicit error checks. Adding || exit 1 would improve reliability.

The code functions correctly as written. The suggestion to add error handling is reasonable but optional.

Likely an incorrect or invalid review comment.


openshift-ci bot commented Nov 20, 2025

@dibryant: The following tests failed, say /retest to rerun all failed tests or /retest-required to rerun all mandatory failed tests:

Test name | Commit | Required | Rerun command
  • ci/prow/notebook-jupyter-ubi9-python-3-12-pr-image-mirror | aee9043 | true | /test notebook-jupyter-ubi9-python-3-12-pr-image-mirror
  • ci/prow/notebook-cuda-jupyter-ubi9-python-3-12-pr-image-mirror | aee9043 | true | /test notebook-cuda-jupyter-ubi9-python-3-12-pr-image-mirror
  • ci/prow/notebook-rocm-jupyter-ubi9-python-3-12-pr-image-mirror | aee9043 | true | /test notebook-rocm-jupyter-ubi9-python-3-12-pr-image-mirror
  • ci/prow/notebooks-py312-ubi9-e2e-tests | aee9043 | true | /test notebooks-py312-ubi9-e2e-tests
  • ci/prow/images | aee9043 | true | /test images

Full PR test history. Your PR dashboard.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository. I understand the commands that are listed here.

@openshift-merge-robot

PR needs rebase.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.


Labels

do-not-merge/work-in-progress, needs-rebase, review-requested (GitHub Bot creates notification on #pr-review-ai-ide-team slack channel), size/l
