
feat(ci): add reusable S3 upload workflow with environment detection and AWS integration#132

Merged
guimoreirar merged 3 commits into main from s3-upload
Mar 13, 2026

Conversation


@guimoreirar guimoreirar commented Mar 11, 2026

Lerian

GitHub Actions Shared Workflows


Description

Type of Change

  • feat: New workflow or new input/output/step in an existing workflow
  • fix: Bug fix in a workflow (incorrect behavior, broken step, wrong condition)
  • perf: Performance improvement (e.g. caching, parallelism, reduced steps)
  • refactor: Internal restructuring with no behavior change
  • docs: Documentation only (README, docs/, inline comments)
  • ci: Changes to self-CI (workflows under .github/workflows/ that run on this repo)
  • chore: Dependency bumps, config updates, maintenance
  • test: Adding or updating tests
  • BREAKING CHANGE: Callers must update their configuration after this PR

Breaking Changes

None.

Testing

  • YAML syntax validated locally
  • Triggered a real workflow run on a caller repository using @develop or the beta tag
  • Verified all existing inputs still work with default values
  • Confirmed no secrets or tokens are printed in logs
  • Checked that unrelated workflows are not affected

Caller repo / workflow run:

Related Issues

Closes #

Summary by CodeRabbit

  • Chores
  • Added a reusable automation to upload files to AWS S3. It auto-detects the target environment (development, staging, production) from branches/tags or accepts a manual override, authenticates securely, and places uploads under environment-specific folders with optional prefixing. Supports file-pattern matching, optional path flattening, and dry-run preview; reports upload counts and fails when no files match.
  • Documentation
    • Added usage docs and examples for the S3 upload workflow.
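
For context, a caller repository would invoke the reusable workflow roughly like this — a sketch assembled from the inputs and secrets named later in this PR's review; the bucket name, pattern, and ref are placeholders, not values from the actual docs:

```yaml
jobs:
  upload-configs:
    permissions:
      id-token: write   # required for OIDC role assumption
      contents: read
    uses: LerianStudio/github-actions-shared-workflows/.github/workflows/s3-upload.yml@main
    with:
      s3_bucket: "example-bucket"     # placeholder
      file_pattern: "dist/*.json"     # placeholder
      s3_prefix: "configs"
      dry_run: true                   # preview only; no files are uploaded
    secrets:
      AWS_ROLE_ARN: ${{ secrets.AWS_ROLE_ARN }}
```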

@guimoreirar guimoreirar requested a review from a team as a code owner March 11, 2026 18:58

coderabbitai bot commented Mar 11, 2026

Walkthrough

Adds a reusable GitHub Actions workflow .github/workflows/s3-upload.yml to upload files to S3 with environment resolution (auto/manual), OIDC-based AWS role assumption, optional path flattening and dry-run. Also adds docs (docs/s3-upload.md) and a Dependabot group for aws-actions/*.

Changes

Cohort / File(s) Summary
S3 Upload Workflow (.github/workflows/s3-upload.yml)
New reusable workflow. Defines inputs (runner_type, s3_bucket, file_pattern, s3_prefix, aws_region, environment_detection, manual_environment, flatten, dry_run). Resolves the environment from refs/tags or manual input, configures AWS credentials via OIDC role assumption, matches and uploads files to S3, supports flattening and dry-run, and fails if no files matched. Pay attention to the environment-mapping logic, IAM role usage, and path composition.
Documentation
docs/s3-upload.md
New documentation describing workflow inputs, permission requirements, environment mapping table, usage examples (including dry-run and manual override), and permission block.
Dependabot config
.github/dependabot.yml
Adds an aws group to auto-update aws-actions/* (major, minor, patch).
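
The Dependabot grouping described here would look roughly like the following — a sketch only; the schedule interval is an assumption, and the actual contents live in .github/dependabot.yml in this PR:

```yaml
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"   # assumed; check the actual config
    groups:
      aws:
        patterns:
          - "aws-actions/*"
        update-types:
          - "major"
          - "minor"
          - "patch"
```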

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~30 minutes

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

Check name: Description check
Status: ⚠️ Warning
Explanation: PR description is incomplete: the Description section is empty, explaining nothing about what the workflow does or why. The Testing section has all items unchecked and provides no validation evidence.
Resolution: Fill the Description section with a summary of the S3 upload workflow's behavior, environment detection logic, and affected use cases. Check at least the relevant Testing items and provide a caller repo/workflow run link or detailed validation notes.
✅ Passed checks (2 passed)
Check name: Title check
Status: ✅ Passed
Explanation: The title accurately describes the main change: a new reusable S3 upload workflow with environment detection and AWS integration, matching the raw summary and file additions.

Check name: Docstring Coverage
Status: ✅ Passed
Explanation: No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.




@gandalf-at-lerian gandalf-at-lerian left a comment


LGTM: clean reusable workflow. OIDC auth, good environment detection from tag suffixes, sensible defaults. One nit: the glob expansion in the upload step (`for file in $PATTERN`) runs unquoted, which is fine for globbing, but if zero files match, the loop body runs once with the literal pattern string; the `-f` check handles that case correctly though. Good to go.
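
The nit above can be reproduced in isolation. This standalone sketch (not the workflow's actual script) shows bash keeping an unmatched glob as a literal string and the `-f` guard catching it:

```shell
#!/usr/bin/env bash
# With default bash settings, an unmatched glob stays a literal string, so the
# loop body runs exactly once with the pattern itself; the -f guard skips it.
set -euo pipefail

workdir=$(mktemp -d)
cd "$workdir"              # empty directory: nothing matches

PATTERN="*.json"
for file in $PATTERN; do   # unquoted on purpose so globbing can happen
  if [[ ! -f "$file" ]]; then
    echo "skipped literal pattern: $file"
    continue
  fi
  echo "would upload: $file"
done
```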


@gandalf-at-lerian gandalf-at-lerian left a comment


PR Review — s3-upload.yml

Good addition — clean structure, solid OIDC auth, useful environment auto-detection. A few items to address:

🔴 Script Injection Risk (Security)

Multiple inputs are interpolated directly into run: blocks via ${{ inputs.xxx }}. Even though workflow_call inputs come from caller workflows (not arbitrary users), this is a known injection vector if a caller ever passes user-controlled data through.

Affected lines: inputs.environment_detection, inputs.manual_environment, inputs.s3_bucket, inputs.s3_prefix, inputs.file_pattern, inputs.flatten

Fix: Use environment variables instead of direct interpolation:

- name: Upload files to S3
  if: steps.env.outputs.folder != ''
  env:
    BUCKET: ${{ inputs.s3_bucket }}
    FOLDER: ${{ steps.env.outputs.folder }}
    PREFIX: ${{ inputs.s3_prefix }}
    PATTERN: ${{ inputs.file_pattern }}
    FLATTEN: ${{ inputs.flatten }}
  run: |
    set -euo pipefail
    # now use $BUCKET, $FOLDER, etc. safely

Same treatment needed for the Determine environment step.

🟡 Glob Edge Case

When no files match the pattern, bash keeps the literal glob string and the [[ ! -f "$file" ]] check catches it — which works. But the error message says "No files matched" and exits 1 on the first non-matching entry. If the glob partially expands (some files exist, some don't), this check won't trigger.

Consider adding shopt -s nullglob before the loop and checking FILE_COUNT after:

shopt -s nullglob
FILE_COUNT=0

for file in $PATTERN; do
  # upload logic
  FILE_COUNT=$((FILE_COUNT + 1))
done

if [[ $FILE_COUNT -eq 0 ]]; then
  echo "⚠️ No files matched pattern: ${PATTERN}"
  exit 1
fi

🟡 PR Template

Description is empty, no type of change selected, no testing checkboxes marked. Would be good to fill these in — especially the caller repo/workflow run link to show it was validated.

✅ What's Good

  • OIDC auth (no long-lived credentials)
  • set -euo pipefail on all scripts
  • Clean environment detection logic (tag suffix + manual override)
  • Sensible defaults (flatten=true, us-east-2)
  • Good inline documentation with usage examples

…run blocks

- Move all workflow inputs to env: block to prevent script injection
- Add shopt -s nullglob for robust glob handling
- Check FILE_COUNT after loop instead of inside it

Co-authored-by: gandalf-at-lerian <gandalf@lerian.studio>
coderabbitai[bot]
coderabbitai bot previously requested changes Mar 11, 2026

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/s3-upload.yml:
- Around line 58-61: The workflow currently sets flatten: true which causes
silent overwrites when multiple files have the same basename (e.g., file_pattern
matches a/config.json and b/config.json) and uploads use aws s3 cp to
${S3_PATH}; update the workflow to either change the default setting to flatten:
false or implement a preflight check before the upload step that, when flatten
is true, scans the resolved file list for duplicate basenames and fails the job
with a clear error if any duplicates are found (reference symbols: flatten,
file_pattern, ${S3_PATH}, and the upload step that runs aws s3 cp).
- Around line 1-26: Add a new documentation file docs/s3-upload.md that
documents the reusable workflow s3-upload.yml: start the file with the Lerian
branding header "# S3 Upload", describe the workflow purpose, list and explain
inputs (s3_bucket, file_pattern, s3_prefix, aws_role_arn_secret, etc.), show
example usages mirroring the examples in s3-upload.yml, and include any notes
about environment routing (beta→development, rc→staging, release→production) and
OIDC IAM role authentication so the repo policy requiring a docs/* md for every
reusable workflow is satisfied.
- Around line 11-25: The inline examples in s3-upload.yml erroneously pass
aws_role_arn_secret as a workflow input; replace those with a secrets mapping
instead (remove aws_role_arn_secret from the with: block and add a secrets:
block) so callers map their repo secret into the reusable workflow (e.g., keep
s3_bucket, file_pattern, s3_prefix under with:, and add secrets: AWS_ROLE_ARN:
${{ secrets.AWS_INIT_DATA_ROLE_ARN }} or secrets: AWS_ROLE_ARN: ${{
secrets.AWS_MIGRATIONS_ROLE_ARN }} as appropriate), referencing the
aws_role_arn_secret identifier only via the secrets mapping expected by the
reusable workflow.
- Around line 82-84: When inputs.environment_detection is "manual" (the branch
that sets FOLDER from inputs.manual_environment), validate that
inputs.manual_environment is non-empty and exactly one of "development",
"staging", or "production"; if it is not, print a clear error and exit non‑zero
so the workflow fails instead of silently using an invalid or empty folder.
Update the manual branch that assigns FOLDER to check inputs.manual_environment
against the allowed set and call an error/exit path when validation fails.
- Around line 127-133: The loop that checks matched files uses PATTERN and exits
on no match; change the behavior to warn and skip instead of failing by
replacing the failing exit code: in the for-loop that iterates over $PATTERN
(and relies on the nullglob behavior set earlier), update the branch that
handles [[ ! -f "$file" ]] so it echoes the warning and calls exit 0 (not exit
1), ensuring the workflow returns success when no files match and merely warns
instead of failing the step.
- Around line 27-66: Add a workflow_dispatch block and a boolean input dry_run
to the reusable interface (in addition to workflow_call) and update the examples
to reference the declared secret name secrets.AWS_ROLE_ARN (not
aws_role_arn_secret); implement behavior so when dry_run is true the script uses
aws s3 cp --dryrun and emits ::notice:: lines listing resolved inputs
(s3_bucket, file_pattern, s3_prefix, aws_region, environment_detection,
manual_environment, flatten, AWS_ROLE_ARN) while when dry_run is false it
performs live uploads and emits only a success summary notice; fix the
file-matching/shopt logic so that shopt -s nullglob does not cause an unexpected
exit — detect zero matches and emit a notice and exit 0 (or fail with a clear
message if intended), and when FLATTEN="true" detect duplicate basenames before
uploading and either fail with a list of duplicates or switch to non-flattened
keys to avoid silent overwrites; finally add docs/s3-upload.md with the required
Lerian header and usage examples showing both workflow_call and
workflow_dispatch usages and the environment routing behavior.
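
The input validation asked for in the environment items above can be sketched as small guard functions. This is illustrative only; function names and the rejected value are made up, and the real step would read these values from env-mapped inputs and `exit 1` on failure:

```shell
#!/usr/bin/env bash
# Guard functions for the two inputs the review wants validated.
set -euo pipefail

valid_detection() {      # environment_detection must be tag_suffix or manual
  case "$1" in
    manual|tag_suffix) return 0 ;;
    *) return 1 ;;
  esac
}

valid_environment() {    # manual_environment must be one of the three folders
  case "$1" in
    development|staging|production) return 0 ;;
    *) return 1 ;;
  esac
}

# In the workflow these branches would emit ::error:: and exit non-zero.
if ! valid_environment "prod"; then
  echo "rejected: 'prod' is not an allowed environment"
fi
if valid_environment "production"; then
  echo "accepted: production"
fi
```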

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yml

Review profile: ASSERTIVE

Plan: Pro

Run ID: 6a72f88f-b17c-4038-95d3-034b2b9dc2f5

📥 Commits

Reviewing files that changed from the base of the PR and between 5cdd9c8 and c1220f0.

📒 Files selected for processing (1)
  • .github/workflows/s3-upload.yml

coderabbitai[bot]
coderabbitai bot previously requested changes Mar 11, 2026

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

♻️ Duplicate comments (3)
.github/workflows/s3-upload.yml (3)

27-66: ⚠️ Potential issue | 🔴 Critical

Expose workflow_dispatch and dry_run before merging.

This workflow mutates S3, but it only supports workflow_call. That removes the required manual-test entry point and preview mode for a state-changing reusable workflow.

As per coding guidelines, "Reusable workflows must support workflow_call (for external callers) and workflow_dispatch (for manual testing). They must expose explicit inputs and always include a dry_run input (type: boolean, default: false)."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/s3-upload.yml around lines 27 - 66, Add a
workflow_dispatch trigger and a boolean dry_run input to the reusable workflow
so it can be manually tested and support preview mode: keep the existing
workflow_call inputs but also add a top-level workflow_dispatch block exposing
the same inputs (including runner_type, s3_bucket, file_pattern, s3_prefix,
aws_region, environment_detection, manual_environment, flatten) and add a new
input named dry_run with type: boolean and default: false; ensure any code paths
that perform S3 mutations (the S3 upload step referencing
file_pattern/s3_prefix) check the dry_run flag and skip or echo actions when
dry_run is true.
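
The dual-trigger shape the guideline describes would look roughly like this — abbreviated to the dry_run input for clarity; the real interface would repeat every input under both triggers:

```yaml
on:
  workflow_call:
    inputs:
      dry_run:
        description: "Preview actions without uploading"
        type: boolean
        default: false
    secrets:
      AWS_ROLE_ARN:
        required: true
  workflow_dispatch:
    inputs:
      dry_run:
        description: "Preview actions without uploading"
        type: boolean
        default: false
```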

11-25: ⚠️ Potential issue | 🟠 Major

Fix the examples to pass AWS_ROLE_ARN via secrets.

aws_role_arn_secret is not a declared workflow input. A caller copying either snippet will fail interface validation and still leave secrets.AWS_ROLE_ARN unset.

Suggested update
 #   uses: LerianStudio/github-actions-shared-workflows/.github/workflows/s3-upload.yml@main
 #   with:
 #     s3_bucket: "lerian-casdoor-init-data"
 #     file_pattern: "init/casdoor/init_data*.json"
-#     aws_role_arn_secret: "AWS_INIT_DATA_ROLE_ARN"
+#   secrets:
+#     AWS_ROLE_ARN: ${{ secrets.AWS_INIT_DATA_ROLE_ARN }}
 #
 #   # Upload migration files with custom prefix
 #   uses: LerianStudio/github-actions-shared-workflows/.github/workflows/s3-upload.yml@main
 #   with:
 #     s3_bucket: "lerian-migration-files"
 #     file_pattern: "init/casdoor-migrations/migrations/*.sql"
 #     s3_prefix: "casdoor-migrations"
-#     aws_role_arn_secret: "AWS_MIGRATIONS_ROLE_ARN"
+#   secrets:
+#     AWS_ROLE_ARN: ${{ secrets.AWS_MIGRATIONS_ROLE_ARN }}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/s3-upload.yml around lines 11 - 25, The examples use the
undeclared input name aws_role_arn_secret which will fail workflow validation
and leave secrets.AWS_ROLE_ARN unset; update the examples to pass the role ARN
via the secrets mapping instead of an input — remove aws_role_arn_secret entries
and show callers supplying AWS_ROLE_ARN through the workflow call's secrets
(referencing secrets.AWS_ROLE_ARN), while keeping the other example inputs
(s3_bucket, file_pattern, s3_prefix) unchanged so users can copy/paste
correctly.

58-61: ⚠️ Potential issue | 🟠 Major

flatten: true can silently overwrite distinct files with the same basename.

If file_pattern matches a/config.json and b/config.json, both uploads target the same S3 key under ${S3_PATH} and the later copy wins. Either default flatten to false or fail fast after detecting duplicate basenames.

Also applies to: 135-140

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/s3-upload.yml around lines 58 - 61, The workflow input
"flatten" currently defaults to true which can silently overwrite files with
identical basenames; change the default for the "flatten" input in s3-upload.yml
from true to false, and/or add a fast-fail pre-check that resolves the file list
from "file_pattern", computes basenames, and errors if any basename is
duplicated (include "file_pattern" and "S3_PATH" in the error message) so
uploads don't proceed when flattening would cause collisions; implement this
check in the job step that expands file_pattern before any S3 uploads.
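
The duplicate-basename preflight proposed in this finding could be sketched like so — a standalone demo with throwaway files, not the PR's actual step:

```shell
#!/usr/bin/env bash
# Detect duplicate basenames before a flattened upload would overwrite them.
set -euo pipefail

workdir=$(mktemp -d)
mkdir -p "$workdir/a" "$workdir/b"
echo '{}' > "$workdir/a/config.json"   # two distinct files...
echo '{}' > "$workdir/b/config.json"   # ...that flatten to the same S3 key
cd "$workdir"

shopt -s nullglob
files=(*/config.json)

dupes=$(printf '%s\n' "${files[@]}" | xargs -n1 basename | sort | uniq -d)
if [[ -n "$dupes" ]]; then
  echo "flatten would overwrite duplicate basenames: $dupes"
  # the real step would exit 1 here
fi
```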
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/s3-upload.yml:
- Around line 79-87: Validate the INPUT_ENV_DETECTION value before deciding
FOLDER: check the INPUT_ENV_DETECTION env var (used in the conditional with
"$INPUT_ENV_DETECTION") and only accept "manual" or "tag_suffix"; if it's
"manual" use INPUT_MANUAL_ENV as before, if "tag_suffix" proceed with the
existing auto-detection path, otherwise exit non‑zero with an error message that
includes the invalid value so the workflow fails fast instead of silently
falling through to production.
- Around line 119-135: The for-loop uses for file in $PATTERN which performs
unsafe word-splitting before globbing; resolve PATTERN into a bash array first
(e.g., use eval "files=($PATTERN)") and then iterate with for file in
"${files[@]}"; ensure you quote "$file" when using FLATTEN and S3_PATH and still
increment FILE_COUNT after each successful upload; update the loop around the
existing PATTERN, FLATTEN, S3_PATH and FILE_COUNT logic and remove the original
for file in $PATTERN construct.
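
An alternative to the `eval`-based expansion suggested here (which would execute anything embedded in the pattern) is to let the shell glob straight into an array under `nullglob`. Glob results are never re-split by bash, so filenames with spaces survive. A standalone sketch with throwaway files, assuming PATTERN is a single glob expression:

```shell
#!/usr/bin/env bash
# Expand a glob into an array without eval; glob results are not word-split.
set -euo pipefail

workdir=$(mktemp -d)
touch "$workdir/001 init.sql" "$workdir/002_users.sql"
cd "$workdir"

PATTERN="*.sql"

shopt -s nullglob
files=($PATTERN)   # unquoted on purpose: expansion happens here, safely

FILE_COUNT=0
for file in "${files[@]}"; do
  echo "matched: $file"
  FILE_COUNT=$((FILE_COUNT + 1))
done
echo "count: $FILE_COUNT"
```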

---

Duplicate comments:
In @.github/workflows/s3-upload.yml:
- Around line 27-66: Add a workflow_dispatch trigger and a boolean dry_run input
to the reusable workflow so it can be manually tested and support preview mode:
keep the existing workflow_call inputs but also add a top-level
workflow_dispatch block exposing the same inputs (including runner_type,
s3_bucket, file_pattern, s3_prefix, aws_region, environment_detection,
manual_environment, flatten) and add a new input named dry_run with type:
boolean and default: false; ensure any code paths that perform S3 mutations (the
S3 upload step referencing file_pattern/s3_prefix) check the dry_run flag and
skip or echo actions when dry_run is true.
- Around line 11-25: The examples use the undeclared input name
aws_role_arn_secret which will fail workflow validation and leave
secrets.AWS_ROLE_ARN unset; update the examples to pass the role ARN via the
secrets mapping instead of an input — remove aws_role_arn_secret entries and
show callers supplying AWS_ROLE_ARN through the workflow call's secrets
(referencing secrets.AWS_ROLE_ARN), while keeping the other example inputs
(s3_bucket, file_pattern, s3_prefix) unchanged so users can copy/paste
correctly.
- Around line 58-61: The workflow input "flatten" currently defaults to true
which can silently overwrite files with identical basenames; change the default
for the "flatten" input in s3-upload.yml from true to false, and/or add a
fast-fail pre-check that resolves the file list from "file_pattern", computes
basenames, and errors if any basename is duplicated (include "file_pattern" and
"S3_PATH" in the error message) so uploads don't proceed when flattening would
cause collisions; implement this check in the job step that expands file_pattern
before any S3 uploads.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yml

Review profile: ASSERTIVE

Plan: Pro

Run ID: 892aa628-8102-4aa0-97d1-34d4d4bf406e

📥 Commits

Reviewing files that changed from the base of the PR and between c1220f0 and 9d7a856.

📒 Files selected for processing (1)
  • .github/workflows/s3-upload.yml

@bedatty bedatty dismissed stale reviews from coderabbitai[bot] and coderabbitai[bot] March 12, 2026 12:59

All comments addressed.

Add workflow_dispatch and dry_run input, validate environment_detection
and manual_environment inputs, fix inline examples to use secrets mapping,
create docs/s3-upload.md, and add aws-actions to dependabot.

@coderabbitai coderabbitai bot left a comment


♻️ Duplicate comments (2)
docs/s3-upload.md (1)

1-6: ⚠️ Potential issue | 🟡 Minor

Start this doc with a Markdown H1.

Line 1 begins with an HTML table, so the file still misses the repo’s expected branding-header format and triggers the MD041 warning. Put # S3 Upload first, then keep the branded table below it if you still want the logo layout.

Suggested change
-<table border="0" cellspacing="0" cellpadding="0">
+# S3 Upload
+
+<table border="0" cellspacing="0" cellpadding="0">

As per coding guidelines "The doc file must start with the Lerian branding header."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/s3-upload.md` around lines 1 - 6, The document currently starts with an
HTML table (<table> tag) which triggers MD041; update the file to begin with a
Markdown H1 heading "# S3 Upload" as the first line, then move the existing
branded HTML table (the <table> ... </table> block containing the Lerian logo)
below that heading so the file conforms to the repository's required Lerian
branding-header format and silences the MD041 warning.
.github/workflows/s3-upload.yml (1)

170-211: ⚠️ Potential issue | 🟠 Major

Make dry_run execute the AWS CLI preview path.

Lines 197-201 only print synthetic commands. That means dry_run never validates the actual aws s3 cp invocation or computed destination keys, and it also omits some resolved non-secret values (aws_region, environment_detection, manual_environment, S3_PATH) that this repo requires in verbose dry-run output.

Suggested change
       - name: Dry run summary
         if: steps.env.outputs.folder != '' && inputs.dry_run
         env:
           BUCKET: ${{ inputs.s3_bucket }}
           FOLDER: ${{ steps.env.outputs.folder }}
           PREFIX: ${{ inputs.s3_prefix }}
           PATTERN: ${{ inputs.file_pattern }}
           FLATTEN: ${{ inputs.flatten }}
+          AWS_REGION: ${{ inputs.aws_region }}
+          ENV_DETECTION: ${{ inputs.environment_detection }}
+          MANUAL_ENV: ${{ inputs.manual_environment }}
         run: |
           set -euo pipefail

           echo "::notice::DRY RUN — no files will be uploaded"
-          echo "  bucket      : ${BUCKET}"
-          echo "  folder      : ${FOLDER}"
-          echo "  prefix      : ${PREFIX:-<none>}"
-          echo "  pattern     : ${PATTERN}"
-          echo "  flatten     : ${FLATTEN}"
+          echo "::notice::bucket=${BUCKET}"
+          echo "::notice::folder=${FOLDER}"
+          echo "::notice::prefix=${PREFIX:-<none>}"
+          echo "::notice::pattern=${PATTERN}"
+          echo "::notice::flatten=${FLATTEN}"
+          echo "::notice::aws_region=${AWS_REGION}"
+          echo "::notice::environment_detection=${ENV_DETECTION}"
+          echo "::notice::manual_environment=${MANUAL_ENV:-<none>}"

           if [[ -n "$PREFIX" ]]; then
             S3_PATH="s3://${BUCKET}/${FOLDER}/${PREFIX}/"
           else
             S3_PATH="s3://${BUCKET}/${FOLDER}/"
           fi
+          echo "::notice::s3_path=${S3_PATH}"

           shopt -s nullglob
           FILE_COUNT=0

           for file in $PATTERN; do
             if [[ "$FLATTEN" == "true" ]]; then
-              echo "  [dry-run] aws s3 cp $file ${S3_PATH}"
+              aws s3 cp "$file" "${S3_PATH}" --dryrun
             else
-              echo "  [dry-run] aws s3 cp $file ${S3_PATH}${file}"
+              aws s3 cp "$file" "${S3_PATH}${file}" --dryrun
             fi
             FILE_COUNT=$((FILE_COUNT + 1))
           done

As per coding guidelines "When dry_run: true, use ::notice:: annotations to indicate dry run state, print every resolved input and computed variable, enable tool-native verbose/preview flags (--dry-run --debug, --check, --plan, --diff), and never skip silently."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/s3-upload.yml around lines 170 - 211, The dry-run branch
currently only prints synthetic aws commands and omits resolved non-secret
inputs and tool-native preview flags; update the dry-run logic in the Dry run
summary step to (1) echo all resolved inputs and computed variables (aws_region,
environment_detection, manual_environment, BUCKET, FOLDER, PREFIX, PATTERN,
FLATTEN and computed S3_PATH) using ::notice::, (2) invoke the real aws s3 cp in
the loop with the CLI's preview flags (e.g., --dryrun and/or --debug) so the
actual invocation and destination keys are validated, and (3) keep the existing
FLATTEN behavior but pass the resolved S3_PATH and filename to the aws call so
the computed destination is exercised; ensure FILE_COUNT and the no-match
warning behavior remain unchanged.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Duplicate comments:
In @.github/workflows/s3-upload.yml:
- Around line 170-211: The dry-run branch currently only prints synthetic aws
commands and omits resolved non-secret inputs and tool-native preview flags;
update the dry-run logic in the Dry run summary step to (1) echo all resolved
inputs and computed variables (aws_region, environment_detection,
manual_environment, BUCKET, FOLDER, PREFIX, PATTERN, FLATTEN and computed
S3_PATH) using ::notice::, (2) invoke the real aws s3 cp in the loop with the
CLI's preview flags (e.g., --dryrun and/or --debug) so the actual invocation and
destination keys are validated, and (3) keep the existing FLATTEN behavior but
pass the resolved S3_PATH and filename to the aws call so the computed
destination is exercised; ensure FILE_COUNT and the no-match warning behavior
remain unchanged.

In `@docs/s3-upload.md`:
- Around line 1-6: The document currently starts with an HTML table (<table>
tag) which triggers MD041; update the file to begin with a Markdown H1 heading
"# S3 Upload" as the first line, then move the existing branded HTML table (the
<table> ... </table> block containing the Lerian logo) below that heading so the
file conforms to the repository's required Lerian branding-header format and
silences the MD041 warning.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yml

Review profile: ASSERTIVE

Plan: Pro

Run ID: 30081cc9-4391-4754-947b-01998bfa9bb5

📥 Commits

Reviewing files that changed from the base of the PR and between 9d7a856 and fddd8c4.

📒 Files selected for processing (3)
  • .github/dependabot.yml
  • .github/workflows/s3-upload.yml
  • docs/s3-upload.md

@guimoreirar guimoreirar merged commit 73e2f69 into main Mar 13, 2026
2 checks passed