Bring SAN adjustments, add group and user token features.#6
📝 Walkthrough
The PR introduces Subject Alternative Names (SAN) support for certificates, creates a new certificate grouping system, adds API key authentication, and restructures the database schema by dropping the `proxy` and `session_tokens` tables.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    actor Client
    participant API as API Route<br/>(generate)
    participant CertTool as CertTooler
    participant FileSystem
    participant Database
    participant Porca as Porca<br/>(Revoke)
    Client->>API: POST with subjectAltNameData
    API->>API: Split SANs on comma
    API->>API: Encode SANs with punycode.toASCII()
    API->>CertTool: generateCSR(SANArray, CN, OU, O...)
    CertTool->>FileSystem: Write CSR file
    CertTool-->>API: Return CSR
    API->>API: Execute openssl ca -new-cert
    API->>Database: Store certificate with<br/>subjectAltNames & group
    Database-->>API: Confirm storage
    Note over Client,Database: Later: Certificate Revocation
    Client->>API: DELETE /api/certs/{id}
    API->>FileSystem: Read certificate PEM file
    alt PEM file exists
        API->>Porca: revokeCertificate(publicKey)
        Porca->>FileSystem: Write temp file<br/>(./certs/tmp_revoke_<uuid>.pem)
        Porca->>FileSystem: Execute openssl ca -revoke
        Porca->>FileSystem: Delete temp file
        Porca-->>API: Revocation complete
    else PEM file missing (ENOENT)
        API->>API: Continue without revocation
    end
    API->>Database: Delete certificate record
    Database-->>API: Record deleted
    API-->>Client: Success response
```
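The revocation branch of the diagram can be condensed into a small sketch. The helper names below (`readPem`, `revokeCertificate`, `deleteRecord`) are hypothetical stand-ins for the route's real file, Porca, and database calls; only the control flow (revoke when the PEM exists, tolerate ENOENT, delete the record either way) mirrors the diagram:

```typescript
// Sketch only: hypothetical dependency shape, not the repository's actual code.
type Deps = {
  readPem: (path: string) => Promise<string>;
  revokeCertificate: (pem: string) => Promise<void>;
  deleteRecord: (id: string) => Promise<void>;
};

async function deleteCertificate(id: string, certPath: string, deps: Deps): Promise<string> {
  try {
    const pem = await deps.readPem(certPath);
    await deps.revokeCertificate(pem); // revoke only when the PEM could be read
  } catch (e) {
    // A missing PEM (ENOENT) is tolerated: skip revocation, still delete the record.
    if ((e as { code?: string }).code !== "ENOENT") throw e;
  }
  await deps.deleteRecord(id); // the DB record goes away in both branches
  return "Certificate Deleted!";
}
```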
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~55 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 7
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
src/app/api/certs/get_file/[slug]/route.ts (1)
10-26: ⚠️ Potential issue | 🔴 Critical
Critical: Authentication removed - private keys exposed to unauthenticated access.
This endpoint serves private key files (`type=private`) but performs no authentication. The middleware in `src/proxy.ts` explicitly excludes `/api` routes (lines 5-14), so this endpoint is completely unprotected. Anyone who knows or guesses a certificate UUID can download the private key. The `checkAuthToken` variable (line 17) is read but never used, suggesting authentication was incompletely removed rather than intentionally disabled.
Restore authentication before the slug validation:
```diff
 export const GET = async (
   request: NextRequest,
   props: { params: Promise<{ slug: string }> },
 ) => {
   try {
-    const header = await headers();
-    const params = new URLSearchParams(request.url.split("?")[1]);
-    const checkAuthToken = params.get("auth_token");
-
+    const user = await checkUserLoginStatus(request.headers);
+    if (!user.loggedIn) {
+      return new Response("Unauthorized", { status: 401 });
+    }
+    const params = new URLSearchParams(request.url.split("?")[1]);
     const { slug } = await props.params;
```

Also add the import:

```typescript
import checkUserLoginStatus from "@/components/checkUserLoginStatusAPI";
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/api/certs/get_file/`[slug]/route.ts around lines 10 - 26, The GET handler currently exposes private key files because authentication was removed; restore authentication by importing and calling checkUserLoginStatus (import checkUserLoginStatus from "@/components/checkUserLoginStatusAPI") at the top of the GET function before slug validation, use the extracted checkAuthToken (or header if preferred) to call checkUserLoginStatus and if it fails return a 401/403 Response, and only then proceed to parse params and validate the slug; ensure checkAuthToken is actually passed to checkUserLoginStatus and handle/marshal any errors into a proper Response to prevent unauthenticated access to type=private files.

src/app/api/certs/route.ts (1)
23-36: ⚠️ Potential issue | 🔴 Critical
Critical: `publicKey` is out of scope at line 35, causing a ReferenceError.
The `publicKey` variable is declared inside the inner `try` block (line 25) and is not accessible at line 35. This will fail at runtime. Additionally, line 35 appears to be a duplicate call, since `revokeCertificate(publicKey)` is already called at line 26.
Remove the duplicate, out-of-scope call:
```diff
   try {
     const publicKey = await fs.promises.readFile(certPath, "utf8");
     await revokeCertificate(publicKey);
   } catch (e) {
     if ((e as NodeJS.ErrnoException).code !== "ENOENT") {
       throw e;
     }
   }
   await db
     .delete(schema.certificates)
     .where(eq(schema.certificates.id, sanitizedId));
-  await revokeCertificate(publicKey);
   return new Response("Certificate Deleted!");
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/api/certs/route.ts` around lines 23 - 36, The code calls revokeCertificate(publicKey) inside the try block where publicKey is declared, but then calls revokeCertificate(publicKey) again after the try/catch where publicKey is out of scope; remove the duplicate out-of-scope call and keep the single revokeCertificate(publicKey) executed inside the try (or if you intended to always attempt revocation regardless of read error, declare let publicKey: string | undefined before the try and guard the later call with if (publicKey) revokeCertificate(publicKey)); adjust the code around revokeCertificate, the try/catch that reads certPath, and the subsequent db.delete(...).where(eq(schema.certificates.id, sanitizedId)) accordingly.

src/components/core/certTooler.ts (1)
3-12: ⚠️ Potential issue | 🟠 Major
The `subjectAltNames` parameter is accepted but never used in the CSR generation.
The function accepts `subjectAltNames` as input (line 5) but completely ignores it. The CSR is generated using only the `-subj` flag (lines 26-29), which sets subject DN fields (CN, OU, O, L, ST, C) but does not include Subject Alternative Name extensions.
This creates a misleading API contract: callers pass SANs expecting them to be embedded in the CSR, but they are silently discarded. To fix this, either:
- Remove the unused parameter if SANs are intentionally handled elsewhere (e.g., during signing)
- Or embed SANs into the CSR using OpenSSL's `-reqexts` flag with a config file, or the `-addext` option

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/core/certTooler.ts` around lines 3 - 12, The generateCSR function currently accepts subjectAltNames but never uses them; either drop the subjectAltNames parameter from generateCSR if SANs are not intended to be embedded, or modify generateCSR to include SANs in the CSR by passing them to OpenSSL (e.g., use the -addext "subjectAltName=DNS:...,IP:..." option when calling openssl req or create a temporary OpenSSL config with a [req] and [req_ext] section and invoke openssl req with -config <tempfile> -reqexts req_ext); update any calling code or docs accordingly and ensure the parameter name subjectAltNames is referenced in the updated implementation so SANs are actually embedded in the generated CSR.
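For the second option, a minimal OpenSSL sketch (assumes OpenSSL 1.1.1 or newer for `-addext`; the `/tmp` paths and subject values are illustrative only, not the repository's layout):

```shell
# Generate a key and a CSR that embeds SANs directly via -addext.
openssl req -new -newkey rsa:2048 -nodes \
  -keyout /tmp/san_demo.key -out /tmp/san_demo.csr \
  -subj "/CN=example.com/O=Example Org" \
  -addext "subjectAltName=DNS:example.com,DNS:www.example.com"

# Confirm the extension landed in the CSR.
openssl req -in /tmp/san_demo.csr -noout -text | grep -A 1 "X509v3 Subject Alternative Name"
```

The equivalent `-reqexts` route moves the `subjectAltName=` line into a `[req_ext]` section of a config file passed with `-config`.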
🧹 Nitpick comments (3)
porca/index.ts (1)
265-267: Remove redundant dynamic import: `unlink` is already imported at the top level.
Line 2 already imports `unlink` from `node:fs/promises`. The dynamic import on line 266 is unnecessary.

```diff
 } finally {
-  const { unlink } = await import("node:fs/promises");
   await unlink(tmpCert).catch(() => {});
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@porca/index.ts` around lines 265 - 267, The finally block contains a redundant dynamic import of unlink; remove the line "const { unlink } = await import('node:fs/promises');" and call the already top-level imported unlink directly (e.g., await unlink(tmpCert).catch(() => {})) in the same finally block so cleanup behavior and error swallowing remain unchanged; update any references in that finally block to use the existing unlink symbol.

src/app/api/certs/get_file/[slug]/route.ts (1)
5-6: Remove unused imports.
`auth` and `headers` are imported but no longer used after the authentication logic was removed.

```diff
-import { auth } from "@/components/auth";
-import { headers } from "next/headers";
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/api/certs/get_file/`[slug]/route.ts around lines 5 - 6, Remove the now-unused imports in route.ts by deleting the imported symbols auth and headers from the top import lines (the import of auth from "@/components/auth" and headers from "next/headers"); ensure any remaining imports are preserved and run the linter/TS compile to confirm no other references remain (search for usages of auth and headers to verify safe removal).

src/app/api/certs/generate/route.ts (1)
53-56: Simplify redundant conditional logic.
The ternary condition `SANArray.length > 0 ? ... : [toUnicode(SANArray[0])]` is redundant. If `SANArray.length > 0` is false, `SANArray[0]` would be undefined anyway. Since `split()` on a string always returns at least one element, the else branch is unreachable.

```diff
-  subjectAltNames:
-    SANArray.length > 0
-      ? SANArray.map((i) => toUnicode(i))
-      : [toUnicode(SANArray[0])],
+  subjectAltNames: SANArray.map((i) => toUnicode(i)),
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/api/certs/generate/route.ts` around lines 53 - 56, The subjectAltNames assignment uses a redundant ternary on SANArray length; replace the ternary with a straightforward mapping so subjectAltNames: SANArray.map(i => toUnicode(i)) (referencing SANArray and toUnicode in route.ts) to simplify the logic and remove the unreachable else branch.
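The unreachable-branch claim rests on a JavaScript string fact worth pinning down: `String.prototype.split` never returns an empty array for a string receiver, so a `length > 0` check after `split(",")` can never be false. A quick illustration:

```typescript
// split(",") on any string yields at least one element, even for "".
const empty = "".split(",");
console.log(empty.length); // 1 - the array is [""], not []

const sans = "example.com, www.example.com".split(",").map((s) => s.trim());
console.log(sans.length);  // 2
```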
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@db_migrations/0011_real_cardiac.sql`:
- Around line 12-13: The migration drops the session_tokens table while
application modules still import and use the sessionToks export, causing compile
errors; either remove the references to sessionToks from the getGeneratedKey
route and delete handlers (delete.ts and route.ts) or restore the sessionToks
export in the schema (uncomment the sessionToks export and ensure the
session_tokens table exists/migration ordering preserves it) and update the
migration so it doesn't drop session_tokens until the code no longer depends on
it.
In `@package.json`:
- Line 18: Dependencies and imports for Better Auth are mismatched: package.json
adds "@better-auth/api-key"@^1.5.6 while code (auth.ts and auth-client.ts) still
imports the API-key plugin from the old core paths and schema hasn’t been
updated. Fix by aligning versions and imports: upgrade "better-auth" in
package.json to a 1.5.x release that matches "@better-auth/api-key" (or remove
"@better-auth/api-key" if you prefer downgrading), then update all plugin import
sites (references in auth.ts and auth-client.ts that import from
"better-auth/plugins" or "better-auth/client/plugins") to import the API-key
plugin from "@better-auth/api-key"; finally regenerate the Better Auth
schema/artifacts so the new plugin shape is reflected.
In `@src/app/api/certs/generate/route.ts`:
- Around line 29-35: Validate subjectAltNameData before building SANArray:
ensure subjectAltNameData exists and that
subjectAltNameData.toString().split(",").map(i=>i.trim()).filter(Boolean) yields
at least one non-empty entry; if not, return a bad request/error response (or
throw) from the route handler so you don't proceed to build a cert with an empty
CN. Update the logic around the subjectAltNameData and SANArray variables (the
consts subjectAltNameData and SANArray and any code that reads SANArray[0]) to
use the trimmed/filtered array and to stop execution with a clear error when
validation fails.
In `@src/app/certs/client.tsx`:
- Around line 705-706: The SAN badges are rendered without React keys and use
flatMap unnecessarily; update the rendering of row.original.subjectAltNames in
the component that returns Badge elements to use map() instead of flatMap() and
supply a stable key prop for each Badge, e.g.,
key={`${row.original.id}:${san}`}, where Badge and row.original.subjectAltNames
are located.
In `@src/components/auth.ts`:
- Line 31: Add the missing "apikey" table/schema and migration required by the
apiKey() plugin: update src/components/drizzle/schema.ts to define an "apikey"
table (name "apikey") with the columns expected by Better Auth's API-key plugin
(the canonical columns include an id/key, secret/token, owner/user id,
createdAt, expiresAt and revoked/disabled flags as per the Better Auth docs) and
then add a new migration file in db_migrations/ that creates the "apikey" table
with the same column types and any necessary indexes/constraints; ensure the
table name matches "apikey" exactly so the apiKey() plugin (invoked as apiKey()
in src/components/auth.ts) can find and use it at runtime.
In `@src/components/core/system.ts`:
- Line 3: Replace the literal rm('/tmp/*') call with code that enumerates and
deletes only the concrete temp paths your app creates: use a globbing step
(e.g., fast-glob or glob) to expand patterns for the specific files/dirs
(examples: /tmp/{fileUUID}/ created by caddyControl.ts and /tmp/{saveUUID}.cnf
created by porca/index.ts), then iterate matches and call fs.promises.rm(match,
{ recursive: true, force: true }) for each; ensure you remove the hardcoded
"/tmp/*" and instead target the concrete patterns and use force:true to avoid
ENOENT errors.
In `@src/components/drizzle/schema.ts`:
- Line 16: The JSONB SAN columns are untyped and should be declared as string[]
to enable compile-time type safety; update the jsonb column definitions for
subjectAltNames and issuerAltNames in schema.ts to use Drizzle's generic typing
via .$type<string[]>() so inserts, defaults and selects are typed correctly and
the default([]) remains valid for both subjectAltNames and issuerAltNames.
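The enumerate-then-delete cleanup described for `system.ts` above can be sketched without a glob dependency by filtering directory entries against a known prefix. The `app-` prefix and the helper name are illustrative assumptions, not the app's real temp-file layout:

```typescript
import fs from "node:fs";
import path from "node:path";

// Sketch: remove only entries matching a prefix the app is known to create.
// fs.promises.rm() does not expand globs, so "/tmp/*" must be enumerated first.
async function cleanTmp(tmpDir: string, prefix: string): Promise<string[]> {
  const entries = await fs.promises.readdir(tmpDir);
  const targets = entries.filter((name) => name.startsWith(prefix));
  for (const name of targets) {
    // force: true suppresses ENOENT if the entry vanished in the meantime.
    await fs.promises.rm(path.join(tmpDir, name), { recursive: true, force: true });
  }
  return targets;
}
```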
---
Outside diff comments:
In `@src/app/api/certs/get_file/`[slug]/route.ts:
- Around line 10-26: The GET handler currently exposes private key files because
authentication was removed; restore authentication by importing and calling
checkUserLoginStatus (import checkUserLoginStatus from
"@/components/checkUserLoginStatusAPI") at the top of the GET function before
slug validation, use the extracted checkAuthToken (or header if preferred) to
call checkUserLoginStatus and if it fails return a 401/403 Response, and only
then proceed to parse params and validate the slug; ensure checkAuthToken is
actually passed to checkUserLoginStatus and handle/marshal any errors into a
proper Response to prevent unauthenticated access to type=private files.
In `@src/app/api/certs/route.ts`:
- Around line 23-36: The code calls revokeCertificate(publicKey) inside the try
block where publicKey is declared, but then calls revokeCertificate(publicKey)
again after the try/catch where publicKey is out of scope; remove the duplicate
out-of-scope call and keep the single revokeCertificate(publicKey) executed
inside the try (or if you intended to always attempt revocation regardless of
read error, declare let publicKey: string | undefined before the try and guard
the later call with if (publicKey) revokeCertificate(publicKey)); adjust the
code around revokeCertificate, the try/catch that reads certPath, and the
subsequent db.delete(...).where(eq(schema.certificates.id, sanitizedId))
accordingly.
In `@src/components/core/certTooler.ts`:
- Around line 3-12: The generateCSR function currently accepts subjectAltNames
but never uses them; either drop the subjectAltNames parameter from generateCSR
if SANs are not intended to be embedded, or modify generateCSR to include SANs
in the CSR by passing them to OpenSSL (e.g., use the -addext
"subjectAltName=DNS:...,IP:..." option when calling openssl req or create a
temporary OpenSSL config with a [req] and [req_ext] section and invoke openssl
req with -config <tempfile> -reqexts req_ext); update any calling code or docs
accordingly and ensure the parameter name subjectAltNames is referenced in the
updated implementation so SANs are actually embedded in the generated CSR.
---
Nitpick comments:
In `@porca/index.ts`:
- Around line 265-267: The finally block contains a redundant dynamic import of
unlink; remove the line "const { unlink } = await import('node:fs/promises');"
and call the already top-level imported unlink directly (e.g., await
unlink(tmpCert).catch(() => {})) in the same finally block so cleanup behavior
and error swallowing remain unchanged; update any references in that finally
block to use the existing unlink symbol.
In `@src/app/api/certs/generate/route.ts`:
- Around line 53-56: The subjectAltNames assignment uses a redundant ternary on
SANArray length; replace the ternary with a straightforward mapping so
subjectAltNames: SANArray.map(i => toUnicode(i)) (referencing SANArray and
toUnicode in route.ts) to simplify the logic and remove the unreachable else
branch.
In `@src/app/api/certs/get_file/`[slug]/route.ts:
- Around line 5-6: Remove the now-unused imports in route.ts by deleting the
imported symbols auth and headers from the top import lines (the import of auth
from "@/components/auth" and headers from "next/headers"); ensure any remaining
imports are preserved and run the linter/TS compile to confirm no other
references remain (search for usages of auth and headers to verify safe
removal).
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 4358af8d-0fb1-4df0-a516-4febf3fa8aeb
⛔ Files ignored due to path filters (1)
`bun.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (16)
- db_migrations/0010_sloppy_miek.sql
- db_migrations/0011_real_cardiac.sql
- db_migrations/meta/0010_snapshot.json
- db_migrations/meta/0011_snapshot.json
- db_migrations/meta/_journal.json
- package.json
- porca/index.ts
- src/app/api/certs/generate/route.ts
- src/app/api/certs/get_file/[slug]/route.ts
- src/app/api/certs/route.ts
- src/app/certs/client.tsx
- src/components/auth-client.ts
- src/components/auth.ts
- src/components/core/certTooler.ts
- src/components/core/system.ts
- src/components/drizzle/schema.ts
```sql
DROP TABLE "proxy" CASCADE;--> statement-breakpoint
DROP TABLE "session_tokens" CASCADE;--> statement-breakpoint
```
🧩 Analysis chain
🏁 Script executed:

```shell
#!/bin/bash
# Find all references to sessionToks in the codebase
rg -n "sessionToks" --type ts
```

Repository: hpware/caddy-and-cert-manager
Length of output: 733

🏁 Script executed:

```shell
# Check schema.ts around line 44 to see the sessionToks export
sed -n '40,50p' src/components/drizzle/schema.ts
# Check route.ts to see the full import context
head -30 src/app/api/certs/getGeneratedKey/route.ts
# Check delete.ts to see the full context
cat src/app/api/certs/getGeneratedKey/delete.ts
```

Repository: hpware/caddy-and-cert-manager
Length of output: 2378

🏁 Script executed:

```shell
# Check if sessionToks is defined/exported anywhere else in the codebase
rg -n "export const sessionToks" --type ts
# Check if there are other commented session_tokens references
rg -n "session_tokens" --type ts
# Check if route.ts actually compiles or has errors
rg -n "sessionToks" src/app/api/certs/getGeneratedKey/
```

Repository: hpware/caddy-and-cert-manager
Length of output: 827
Critical: Migration drops session_tokens table while application code still references it.
The table definition in src/components/drizzle/schema.ts:44 is currently commented out, but src/app/api/certs/getGeneratedKey/route.ts and src/app/api/certs/getGeneratedKey/delete.ts still import and reference sessionToks, causing compile-time errors.
Before applying this migration, either:
- Remove the dead code in `getGeneratedKey/route.ts` and `delete.ts`, or
- Uncomment the `sessionToks` export in `schema.ts` and ensure the table exists before this migration runs
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@db_migrations/0011_real_cardiac.sql` around lines 12 - 13, The migration
drops the session_tokens table while application modules still import and use
the sessionToks export, causing compile errors; either remove the references to
sessionToks from the getGeneratedKey route and delete handlers (delete.ts and
route.ts) or restore the sessionToks export in the schema (uncomment the
sessionToks export and ensure the session_tokens table exists/migration ordering
preserves it) and update the migration so it doesn't drop session_tokens until
the code no longer depends on it.
```json
  },
  "dependencies": {
    "@base-ui/react": "^1.0.0",
    "@better-auth/api-key": "^1.5.6",
```
🧩 Analysis chain
🏁 Script executed:

```shell
cat -n package.json | grep -E "better-auth|@better-auth"
```

Repository: hpware/caddy-and-cert-manager
Length of output: 404

🏁 Script executed:

```shell
find src/components -name "*.ts" -type f | head -20
```

Repository: hpware/caddy-and-cert-manager
Length of output: 459

🏁 Script executed:

```shell
cat -n src/components/auth.ts 2>/dev/null | head -50
```

Repository: hpware/caddy-and-cert-manager
Length of output: 2081

🏁 Script executed:

```shell
cat -n src/components/auth-client.ts 2>/dev/null | head -50
```

Repository: hpware/caddy-and-cert-manager
Length of output: 593
Align Better Auth versions and plugin imports.
The PR adds @better-auth/api-key 1.5.6 while the core remains at better-auth 1.4.17. However, the code still imports the API-key plugin from the old core paths (better-auth/plugins in auth.ts and better-auth/client/plugins in auth-client.ts), and the new @better-auth/api-key package is never imported. Better Auth 1.5 moved the API-key plugin to a separate package with schema changes. Either downgrade the new package and remove it from dependencies, or upgrade the core to 1.5.x, update all plugin imports to use the new packages, and regenerate the schema.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@package.json` at line 18, Dependencies and imports for Better Auth are
mismatched: package.json adds "@better-auth/api-key"@^1.5.6 while code (auth.ts
and auth-client.ts) still imports the API-key plugin from the old core paths and
schema hasn’t been updated. Fix by aligning versions and imports: upgrade
"better-auth" in package.json to a 1.5.x release that matches
"@better-auth/api-key" (or remove "@better-auth/api-key" if you prefer
downgrading), then update all plugin import sites (references in auth.ts and
auth-client.ts that import from "better-auth/plugins" or
"better-auth/client/plugins") to import the API-key plugin from
"@better-auth/api-key"; finally regenerate the Better Auth schema/artifacts so
the new plugin shape is reflected.
```typescript
const { subjectAltNameData, OU, O, L, ST, C } =
  Object.fromEntries(formData);
const saveUUID = crypto.randomUUID();
const SANArray = subjectAltNameData
  .toString()
  .split(",")
  .map((i) => toASCII(i));
```
Missing validation: empty subjectAltNameData results in undefined behavior.
If subjectAltNameData is empty or missing, SANArray[0] will be an empty string (from "".split(",") → [""]), leading to a certificate with an empty CN. Add validation:
```diff
+  if (!subjectAltNameData || subjectAltNameData.toString().trim() === "") {
+    return Response.json(
+      { ok: false, error: "subjectAltNameData is required" },
+      { status: 400 },
+    );
+  }
   const SANArray = subjectAltNameData
     .toString()
     .split(",")
-    .map((i) => toASCII(i));
+    .map((i) => toASCII(i.trim()))
+    .filter((i) => i.length > 0);
+  if (SANArray.length === 0) {
+    return Response.json(
+      { ok: false, error: "At least one valid SAN is required" },
+      { status: 400 },
+    );
+  }
```

📝 Committable suggestion
+ }📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
const { subjectAltNameData, OU, O, L, ST, C } =
  Object.fromEntries(formData);
const saveUUID = crypto.randomUUID();
if (!subjectAltNameData || subjectAltNameData.toString().trim() === "") {
  return Response.json(
    { ok: false, error: "subjectAltNameData is required" },
    { status: 400 },
  );
}
const SANArray = subjectAltNameData
  .toString()
  .split(",")
  .map((i) => toASCII(i.trim()))
  .filter((i) => i.length > 0);
if (SANArray.length === 0) {
  return Response.json(
    { ok: false, error: "At least one valid SAN is required" },
    { status: 400 },
  );
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/app/api/certs/generate/route.ts` around lines 29 - 35, Validate
subjectAltNameData before building SANArray: ensure subjectAltNameData exists
and that
subjectAltNameData.toString().split(",").map(i=>i.trim()).filter(Boolean) yields
at least one non-empty entry; if not, return a bad request/error response (or
throw) from the route handler so you don't proceed to build a cert with an empty
CN. Update the logic around the subjectAltNameData and SANArray variables (the
consts subjectAltNameData and SANArray and any code that reads SANArray[0]) to
use the trimmed/filtered array and to stop execution with a clear error when
validation fails.
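As an aside on the `toASCII`/`toUnicode` pairing this route relies on: the encode-on-write, decode-on-read round-trip can be sketched with Node's built-in IDN helpers from `node:url` (`domainToASCII`/`domainToUnicode`), used here as stand-ins for the punycode library the route actually imports:

```typescript
import { domainToASCII, domainToUnicode } from "node:url";

// An internationalized SAN entry is stored in ASCII (punycode) form so
// OpenSSL and DNS tooling can consume it, then decoded again for display.
const sans = ["münchen.example", "example.com"];

const stored = sans.map((s) => domainToASCII(s));
console.log(stored); // ["xn--mnchen-3ya.example", "example.com"]

const shown = stored.map((s) => domainToUnicode(s));
console.log(shown);  // ["münchen.example", "example.com"]
```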
```tsx
{row.original.subjectAltNames.flatMap((i) => (
  <Badge>{i}</Badge>
```
🧩 Analysis chain
🏁 Script executed:

```shell
sed -n '700,710p' src/app/certs/client.tsx
```

Repository: hpware/caddy-and-cert-manager
Length of output: 462

🏁 Script executed:

```shell
sed -n '695,715p' src/app/certs/client.tsx
```

Repository: hpware/caddy-and-cert-manager
Length of output: 816

🏁 Script executed:

```shell
rg "row\.original\.id" src/app/certs/client.tsx -B 2 -A 2
```

Repository: hpware/caddy-and-cert-manager
Length of output: 238

🏁 Script executed:

```shell
sed -n '700,710p' src/app/certs/client.tsx | cat -n
```

Repository: hpware/caddy-and-cert-manager
Length of output: 539

🏁 Script executed:

```shell
rg "subjectAltNames" src/app/certs/client.tsx -B 1 -A 1
```

Repository: hpware/caddy-and-cert-manager
Length of output: 349

🏁 Script executed:

```shell
fd "Badge" src --type f -e tsx -e ts | head -5
```

Repository: hpware/caddy-and-cert-manager
Length of output: 55

🏁 Script executed:

```shell
rg "export.*Badge" src/components -A 5
```

Repository: hpware/caddy-and-cert-manager
Length of output: 133

🏁 Script executed:

```shell
cat -n src/components/ui/badge.tsx
```

Repository: hpware/caddy-and-cert-manager
Length of output: 2326

🏁 Script executed:

```shell
grep -n "subjectAltNames.flatMap" src/app/certs/client.tsx
```

Repository: hpware/caddy-and-cert-manager
Length of output: 139

🏁 Script executed:

```shell
rg "Badge.*key=" src/app/certs/client.tsx
```

Repository: hpware/caddy-and-cert-manager
Length of output: 55

🏁 Script executed:

```shell
rg "subjectAltNames" src/components/drizzle/schema.ts -B 2 -A 2
```

Repository: hpware/caddy-and-cert-manager
Length of output: 608

🏁 Script executed:

```shell
rg "subjectAltNames" src --type ts --type tsx -B 3 -A 3 | head -40
```

Repository: hpware/caddy-and-cert-manager
Length of output: 100

🏁 Script executed:

```shell
rg "duplicate.*SAN|SAN.*duplicate" src --type ts --type tsx
```

Repository: hpware/caddy-and-cert-manager
Length of output: 100

🏁 Script executed:

```shell
sed -n '650,750p' src/app/certs/client.tsx | head -100
```

Repository: hpware/caddy-and-cert-manager
Length of output: 3750
Give each SAN badge a stable key.
This renders a list of badges without keys, so React will warn on every row and can reconcile updates incorrectly when SANs are added or removed. Replace flatMap() with map() (flatMap is unnecessary here since each Badge is a single element, not an array), and key by ${row.original.id}:${san}.
Suggested change
```diff
-{row.original.subjectAltNames.flatMap((i) => (
-  <Badge>{i}</Badge>
-))}
+{row.original.subjectAltNames.map((san) => (
+  <Badge key={`${row.original.id}:${san}`}>{san}</Badge>
+))}
```

📝 Committable suggestion
+ ))}📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| {row.original.subjectAltNames.flatMap((i) => ( | |
| <Badge>{i}</Badge> | |
| {row.original.subjectAltNames.map((san) => ( | |
| <Badge key={`${row.original.id}:${san}`}>{san}</Badge> | |
| ))} |
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/app/certs/client.tsx` around lines 705 - 706, The SAN badges are rendered
without React keys and use flatMap unnecessarily; update the rendering of
row.original.subjectAltNames in the component that returns Badge elements to use
map() instead of flatMap() and supply a stable key prop for each Badge, e.g.,
key={`${row.original.id}:${san}`}, where Badge and row.original.subjectAltNames
are located.
```typescript
    }),
  ]
  : []),
apiKey(),
```
🧩 Analysis chain
🏁 Script executed:

```shell
git ls-files | head -20
```

Repository: hpware/caddy-and-cert-manager
Length of output: 512

🏁 Script executed:

```shell
find . -type f -name "*.ts" -o -name "*.tsx" -o -name "*.sql" -o -name "*migration*" | grep -E "(auth|migration|schema)" | head -30
```

Repository: hpware/caddy-and-cert-manager
Length of output: 892

🏁 Script executed:

```shell
cat -n src/components/auth.ts | head -50
```

Repository: hpware/caddy-and-cert-manager
Length of output: 2081

🏁 Script executed:

```shell
cat -n src/components/drizzle/schema.ts
```

Repository: hpware/caddy-and-cert-manager
Length of output: 6319

🏁 Script executed:

```shell
for file in db_migrations/{0009,0010,0011}*.sql; do
  if [ -f "$file" ]; then
    echo "=== $file ==="
    cat -n "$file"
    echo ""
  fi
done
```

Repository: hpware/caddy-and-cert-manager
Length of output: 1763

🏁 Script executed:

```shell
grep -r "apikey" db_migrations/ -i
```

Repository: hpware/caddy-and-cert-manager
Length of output: 55

🏁 Script executed:

```shell
git status
```

Repository: hpware/caddy-and-cert-manager
Length of output: 272

🏁 Script executed:

```shell
git diff --name-only
```

Repository: hpware/caddy-and-cert-manager
Length of output: 55
Add apikey table schema and migration for the apiKey() plugin.
The apiKey() plugin is enabled on line 31, but the required backing schema is missing. Better Auth's API-key plugin requires an apikey table (better-auth.com docs), which is not present in either src/components/drizzle/schema.ts or any migration in db_migrations/. The first attempt to create or use an API key will fail without the table. Add the schema definition and corresponding migration.
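A migration along these lines could back the plugin. The column set below is an assumption drawn only from the general shape described above (id, key material, owner, expiry, enabled flag); verify every name and type against the schema Better Auth actually generates for the api-key plugin before using it:

```sql
-- Hypothetical migration sketch; column names/types must be checked against
-- the schema generated by Better Auth's api-key plugin.
CREATE TABLE "apikey" (
  "id" text PRIMARY KEY,
  "name" text,
  "key" text NOT NULL,
  "user_id" text NOT NULL, -- should reference the Better Auth user table
  "enabled" boolean DEFAULT true,
  "expires_at" timestamp,
  "created_at" timestamp NOT NULL DEFAULT now(),
  "updated_at" timestamp NOT NULL DEFAULT now()
);
CREATE INDEX "apikey_user_id_idx" ON "apikey" ("user_id");
```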
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/auth.ts` at line 31, Add the missing "apikey" table/schema and
migration required by the apiKey() plugin: update
src/components/drizzle/schema.ts to define an "apikey" table (name "apikey")
with the columns expected by Better Auth's API-key plugin (the canonical columns
include an id/key, secret/token, owner/user id, createdAt, expiresAt and
revoked/disabled flags as per the Better Auth docs) and then add a new migration
file in db_migrations/ that creates the "apikey" table with the same column
types and any necessary indexes/constraints; ensure the table name matches
"apikey" exactly so the apiKey() plugin (invoked as apiKey() in
src/components/auth.ts) can find and use it at runtime.
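A minimal sketch of what the missing migration could look like, assuming Postgres and the canonical column set named in the prompt above. The filename is hypothetical, and the real Better Auth plugin defines additional columns (rate limiting, metadata, and more), so generate the authoritative schema with the Better Auth CLI rather than shipping this as-is:

```sql
-- db_migrations/00xx_add_apikey.sql (hypothetical filename)
-- Column set is an assumption based on the fields listed above; verify
-- against Better Auth's generated schema before applying.
CREATE TABLE IF NOT EXISTS "apikey" (
  "id" text PRIMARY KEY NOT NULL,
  "name" text,
  "key" text NOT NULL,
  "user_id" text NOT NULL REFERENCES "user"("id") ON DELETE CASCADE,
  "enabled" boolean NOT NULL DEFAULT true,
  "expires_at" timestamp,
  "created_at" timestamp NOT NULL DEFAULT now(),
  "updated_at" timestamp NOT NULL DEFAULT now()
);

CREATE INDEX IF NOT EXISTS "apikey_user_id_idx" ON "apikey" ("user_id");
```

The matching Drizzle table in src/components/drizzle/schema.ts would mirror these columns so the plugin and the ORM agree on the shape.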
import fs from "node:fs";
export async function tmpCleanUpFunction() {
-  await fs.promises.rm("/tmp/*", { recursive: true });
+  await fs.promises.rm(`/tmp/*`, { recursive: true });
🧩 Analysis chain
🏁 Script executed:
cd /repo && find . -name "system.ts" -type f | head -20
Length of output: 128
🏁 Script executed:
cd /repo && cat -n src/components/core/system.ts 2>/dev/null || echo "File not found in that location"
Length of output: 177
🏁 Script executed:
# Check Node.js fs.promises.rm() behavior with glob patterns
# Let's also search for any rm() calls in the codebase to understand context
rg -n "fs\.promises\.rm\|fs\.rm" --type ts --type js -B 2 -A 2 || echo "No matches found"
Length of output: 89
🏁 Script executed:
git ls-files | grep -E "system\.ts|\.ts$" | head -20
Length of output: 783
🏁 Script executed:
git ls-files | head -30
Length of output: 916
🏁 Script executed:
rg -l "fs\.promises\.rm|fs\.rm" --type ts --type js
Length of output: 102
🏁 Script executed:
cat -n src/components/core/system.ts
Length of output: 245
🏁 Script executed:
# Check Node.js documentation snippet or create a quick test
# First, let's search if there are any comments or tests related to this cleanup function
rg -n "tmpCleanUpFunction|/tmp/\*" --type ts --type js -B 3 -A 3
Length of output: 877
🏁 Script executed:
# Let's check if there's any documentation or test files that might clarify intended behavior
rg -n "rm\(" --type ts --type js -B 2 -A 2 | head -50
Length of output: 370
🌐 Web query:
Node.js fs.promises.rm() glob pattern behavior documentation
💡 Result:
Node.js fs.promises.rm does not support glob patterns directly. The path argument must be a specific PathLike (string, Buffer, or URL) referring to a single file or directory. It removes files or directories (recursively if {recursive: true}), but treats glob wildcards (`*`, `**`) as literal path segments, not expansion patterns. Node.js provides a separate fs.promises.glob (added in v22.6.0) for glob pattern matching, which returns an async iterable of matching paths. To delete files matching a glob, use glob to iterate matches and call rm on each. Example: import { glob, rm } from 'node:fs/promises'; for await (const path of glob('/tmp/*')) { await rm(path); } fs.rm internally uses a modified rimraf implementation without glob support (all glob-related code removed). Third-party libraries like rimraf support globs optionally. Official docs confirm rm(path) expects a path, with options like {recursive: true, force: true, maxRetries: number, retryDelay: number}.
Citations:
- 1: https://nodejs.org/api/fs.html
- 2: https://github.com/nodejs/node/blob/main/lib/internal/fs/rimraf.js
- 3: https://github.com/nodejs/node/blob/main/lib/fs.js
- 4: nodejs/node@1610728d7c
- 5: https://bun.com/reference/node/fs/promises/rm
🏁 Script executed:
rg -n "tmp|temp|\.tmp" --type ts --type js -i | grep -E "(mkdir|write|create|/tmp)" | head -30
Length of output: 613
🏁 Script executed:
# Check Node.js engine requirements
cat package.json | grep -A 5 '"engines"' || echo "No engines field found"
Length of output: 95
fs.promises.rm() won't expand /tmp/* and will throw ENOENT.
Node's rm() does not interpret glob patterns—/tmp/* is treated as a literal path that doesn't exist. This call will always fail. Additionally, cleaning all of /tmp is overly broad. Instead, enumerate the specific temporary files and directories your app creates (e.g., /tmp/{fileUUID}/ from caddyControl.ts, /tmp/{saveUUID}.cnf from porca/index.ts), and delete those concrete paths individually using fs.promises.glob() to match patterns if needed, then rm() each match with { recursive: true, force: true }. (Node.js fs.promises.rm() docs)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/core/system.ts` at line 3, Replace the literal rm('/tmp/*')
call with code that enumerates and deletes only the concrete temp paths your app
creates: use a globbing step (e.g., fast-glob or glob) to expand patterns for
the specific files/dirs (examples: /tmp/{fileUUID}/ created by caddyControl.ts
and /tmp/{saveUUID}.cnf created by porca/index.ts), then iterate matches and
call fs.promises.rm(match, { recursive: true, force: true }) for each; ensure
you remove the hardcoded "/tmp/*" and instead target the concrete patterns and
use force:true to avoid ENOENT errors.
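The enumerate-and-delete approach the prompt describes can be sketched as follows. This is a minimal sketch, not the project's actual code: the UUID/.cnf matching pattern and the baseDir parameter are assumptions based on the artifacts named above (/tmp/{saveUUID}.cnf and /tmp/{fileUUID}/), and it uses readdir plus a filter rather than fs.promises.glob so it does not require Node ≥ 22.6:

```typescript
import fs from "node:fs";
import path from "node:path";

// Pattern for this app's own temp artifacts: a bare UUID directory or a
// UUID-named .cnf file. Assumed shapes; adjust to what the app really creates.
const UUID_RE =
  /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}(\.cnf)?$/i;

export async function tmpCleanUpFunction(baseDir = "/tmp"): Promise<string[]> {
  let entries: string[] = [];
  try {
    entries = await fs.promises.readdir(baseDir);
  } catch {
    return []; // base dir missing or unreadable: nothing to clean
  }
  const removed: string[] = [];
  for (const name of entries) {
    if (!UUID_RE.test(name)) continue; // skip anything that is not ours
    const target = path.join(baseDir, name);
    // force:true swallows ENOENT races; recursive:true handles directories
    await fs.promises.rm(target, { recursive: true, force: true });
    removed.push(target);
  }
  return removed;
}
```

Targeting only recognized names also avoids deleting other processes' files in a shared /tmp, which the blanket wildcard would have done had it worked.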
export const certificates = pgTable("certificates", {
  id: uuid("id").primaryKey().notNull(),
  name: text("name").notNull(),
  subjectAltNames: jsonb("subject_alt_names").notNull().default([]),
🧩 Analysis chain
🏁 Script executed:
cat -n src/components/drizzle/schema.ts | head -50
Length of output: 2393
🏁 Script executed:
rg -n "subjectAltNames" src/app/certs/client.tsx -B 2 -A 2
Length of output: 511
🏁 Script executed:
rg -n "jsonb" src/components/drizzle/schema.ts -B 1 -A 1
Length of output: 788
🏁 Script executed:
cat -n package.json | grep -A 5 -B 5 "drizzle"
Length of output: 1596
🌐 Web query:
Drizzle ORM jsonb $type string[] type safety
💡 Result:
In Drizzle ORM for PostgreSQL, you can achieve type safety for a jsonb column storing a string[] using the .$type<string[]>() method on the jsonb column builder. This provides compile-time type inference for inserts, selects, and defaults without runtime validation. Example schema definition: import { jsonb, pgTable } from 'drizzle-orm/pg-core'; const table = pgTable('table', { tags: jsonb('tags').$type<string[]>(), }); This infers the column type as string[] in TypeScript. For instance: - db.select().from(table) returns { tags: string[] } - table.tags.default([]) compiles (matches string[]) - table.tags.default({}) fails to compile (type mismatch) Official docs confirm: jsonb('jsonb').$type<string[]>() "will be inferred as string[]". Note: .$type() does not perform runtime checks; values must conform at runtime. For nested querying (e.g., tags[0]), use raw sql operators, as native type-safe JSONB path queries are a requested feature but not yet implemented (open issue #1690). Recent issues (2026) note potential driver-specific insertion bugs (e.g., double-stringification with bun:sql), but core .$type<string[]>() support remains standard.
Citations:
- 1: [FEATURE]: pg JSONb Type insertion drizzle-team/drizzle-orm#2035
- 2: https://orm.drizzle.team/docs/column-types/pg
- 3: [FEATURE]: Native PG jsonb query support drizzle-team/drizzle-orm#1690
- 4: https://www.answeroverflow.com/m/1140076094804988005
Type both SAN JSONB columns as string[].
These fields are already being consumed as arrays in the UI (line 705 of client.tsx), but the schema leaves their JSON shape implicit. Drizzle's .$type<string[]>() method is the standard way to ensure JSONB defaults, inserts, and selects are type-safe at compile time. Please declare both SAN columns as .$type<string[]>() to prevent untyped payloads from spreading.
Suggested change
export const certificates = pgTable("certificates", {
id: uuid("id").primaryKey().notNull(),
name: text("name").notNull(),
- subjectAltNames: jsonb("subject_alt_names").notNull().default([]),
+ subjectAltNames: jsonb("subject_alt_names").$type<string[]>().notNull().default([]),
privateKey: boolean("private_key").notNull().default(false),
created_at: timestamp("created_at").defaultNow().notNull(),
updated_at: timestamp("updated_at").defaultNow().notNull(),
group: uuid("group").references(() => certificateGroup.id),
});
export const certificateGroup = pgTable("certificate_group", {
id: uuid("id").primaryKey().notNull(),
name: text("name").notNull(),
- subjectAltNames: jsonb("subject_alt_names").notNull().default([]),
+ subjectAltNames: jsonb("subject_alt_names").$type<string[]>().notNull().default([]),
created_at: timestamp("created_at").defaultNow().notNull(),
updated_at: timestamp("updated_at").defaultNow().notNull(),
createdByUser: text("created_by").references(() => user.id),
});

Also applies to: 26-26
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/drizzle/schema.ts` at line 16, The JSONB SAN columns are
untyped and should be declared as string[] to enable compile-time type safety;
update the jsonb column definitions for subjectAltNames on both the
certificates and certificateGroup tables in schema.ts to use Drizzle's generic
typing via .$type<string[]>() so inserts, defaults and selects are typed
correctly and the default([]) remains valid for both columns.
Add new SAN adjustments in the other tabs: currently only the EASY mode has the new SAN features, not the CSR and advanced views. This PR also brings the group feature to life.
Summary by CodeRabbit
Release Notes
New Features
Improvements