
feat: replace @buildonspark/spark-sdk with @breeztech/breez-sdk-spark#986

Open
ditto-agent wants to merge 27 commits into master from impl/breez-spark-sdk-migration

Conversation

@ditto-agent
Contributor

Summary

  • Replaced @buildonspark/spark-sdk with @breeztech/breez-sdk-spark@0.12.2 across all Spark operations (wallet init, send, receive, balance tracking)
  • Event-driven balance & status tracking via sdk.addEventListener — removes all polling (useQueries/refetchInterval/setInterval) and the zero-balance workaround
  • Simplified SparkAccount type from dual ownedBalance/availableBalance to single balance: Money | null
  • Send flow: two-step prepareSendPayment + sendPayment with idempotencyKey (replaces payLightningInvoice + manual dedup via findExistingLightningSendRequest)
  • Receive flow: receivePayment + bolt11 parsing, event-driven completion detection
  • Lightning Address spark path disabled — Breez SDK doesn't expose receiverIdentityPubkey for delegated invoices yet (cashu path unaffected)
  • Fully removed @buildonspark/spark-sdk and its patch file
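
As a rough illustration of the two-step send flow above (the method names come from this PR description; the request and response shapes below are assumptions for illustration, not the real Breez SDK types):

```typescript
// Illustrative types only; the actual Breez SDK request/response shapes differ.
interface SparkSendSdk {
  prepareSendPayment(req: { paymentRequest: string }): Promise<{ feeSats: number }>;
  sendPayment(req: {
    prepareResponse: { feeSats: number };
    idempotencyKey: string;
  }): Promise<{ paymentId: string }>;
}

async function sendLightning(
  sdk: SparkSendSdk,
  invoice: string,
  idempotencyKey: string,
) {
  // Step 1: prepare quotes the fees without moving funds.
  const prepared = await sdk.prepareSendPayment({ paymentRequest: invoice });
  // Step 2: send; the idempotencyKey makes retries safe, replacing the old
  // manual dedup via findExistingLightningSendRequest.
  return sdk.sendPayment({ prepareResponse: prepared, idempotencyKey });
}
```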

Test plan

  • App loads without errors, WASM init completes in _protected clientLoader
  • Spark wallet connects successfully (check console for [Spark]/[Breez] logs)
  • Balance displays correctly and updates on send/receive events (no polling)
  • Send lightning payment completes (prepare → send → event-driven completion)
  • Receive lightning payment completes (invoice creation → event-driven detection)
  • Cashu-to-Spark token claim works (event-driven waitForSparkReceiveToComplete)
  • Lightning Address cashu path still works (/lnurl-test)
  • Lightning Address spark path returns error (expected — disabled)
  • bun run fix:all passes with no errors
  • Zero @buildonspark/spark-sdk imports remain (rg "@buildonspark/spark-sdk" app/)

🤖 Generated with Claude Code

ditto-agent and others added 15 commits April 8, 2026 14:29
Phase C validation complete — all tests passed. These docs provide
context for the Phase A production replacement.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…rk path

Instead of keeping @buildonspark/spark-sdk for Lightning Address, throw
not-implemented error and leave TODO to fork Breez SDK to expose
receiverIdentityPubkey.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Changes:
- WASM init in _protected.tsx clientLoader (not entry.client.tsx)
- Static imports (Breez package exports handle SSR/client)
- No app/lib/breez-spark/ folder — everything in app/lib/spark/
- No getBreezSdk accessor or type re-exports
- Event-driven balance, send status, and receive status (no polling)
- Cached prepareSendPayment response (no double-prepare)
- Single balance value (ownedBalance = availableBalance = balanceSats)
- Simplified updateSparkAccountBalance cache method
- Remove all dead code (workarounds, polling, SparkProto, etc.)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- connectBreezWallet takes { mnemonic, network, apiKey } object param
- Remove createEventListener wrapper (inline addEventListener is simple enough)
- updateSparkAccountBalance takes single balance param (no owned/available split)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
No init.ts file needed. shared/spark.ts calls connect() + defaultConfig()
from @breeztech/breez-sdk-spark inline. Tasks renumbered (7 total).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…per account

- getSparkIdentityPublicKeyFromMnemonic uses static import (not dynamic)
- SparkAccount.balance replaces ownedBalance/availableBalance
- updateSparkAccountBalance works with single balance
- Event listeners: one per account, matches against all pending quotes
- Clarify addEventListener does NOT replay past events (initial check needed)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Replace @buildonspark/spark-sdk polling with Breez SDK event listeners for
receive status tracking. Simplify SparkReceiveLightningQuote type, use
receivePayment() + parseBolt11Invoice instead of createLightningInvoice,
and match completions by paymentHash from paymentSucceeded events.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…k-sdk

Breez SDK does not expose receiverIdentityPubkey for delegated invoices,
so the Spark Lightning Address path now throws with a clear TODO. The old
@buildonspark/spark-sdk dependency, its patch file, and all its imports
are fully removed. ReadUserDefaultAccountRepository returns a stub wallet
for the server-side spark branch (never called in practice now).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
- Narrow PENDING quotes type for sparkId access in send hooks
- Add Money type cast for moneyFromSats in send service
- Fix empty catch block lint errors in spark.ts

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@supabase

supabase bot commented Apr 8, 2026

This pull request has been ignored for the connected project hrebgkfhjpkbxpztqqke because there are no changes detected in supabase directory. You can change this behaviour in Project Integrations Settings ↗︎.


Preview Branches by Supabase.
Learn more about Supabase Branching ↗︎.

@vercel

vercel bot commented Apr 8, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Actions | Updated (UTC) |
| --- | --- | --- | --- |
| agicash | Ready | Preview, Comment | Apr 10, 2026 11:00am |


return 'mainnet';
case 'REGTEST':
case 'LOCAL':
return 'regtest';
Contributor:

did you play with this? Also, are these the only network options? the buildonspark sdk had signet last I looked

Collaborator:

i didn't. breez network type only has mainnet and regtest while spark sdk has this:

declare enum Network {
  MAINNET = 0,
  TESTNET = 1,
  SIGNET = 2,
  REGTEST = 3,
  LOCAL = 4
}

Collaborator:

yeah it only has mainnet and regtest in breez sdk

await this.queryClient.invalidateQueries({
queryKey: sparkBalanceQueryKey(quotes.destinationAccount.id),
});
// Balance is updated event-driven via useTrackAndUpdateSparkAccountBalances
Contributor:

do we need this comment? Assuming this is claude because it always adds comments like this in place of code that is removed

Collaborator:

yeah it's claude. I will cleanup all the code now

() => wallet.getLightningReceiveRequest(quote.sparkId),
{ receiveRequestId: quote.sparkId },
);
const handlePayment = (payment: {
Contributor:

does the Breez SDK have types we can use instead of writing our own inline?

if (settled) return;
const details = payment.details;
if (details?.type !== 'lightning') return;
if (details.htlcDetails?.paymentHash !== quote.paymentHash) return;
Contributor:

can the type or paymentHash ever not be set or is this just to make TS show these as defined?

})
.then((id) => {
listenerId = id;
// If already settled before listener registered, clean up immediately
Contributor:

how does this happen? If the timeout fires and sets settled to true? or can it happen too from the initial check below?

): error is SparkError => {
/**
* Checks if an error is an insufficient balance error from the Breez SDK.
* Phase C validation confirmed: message contains "insufficient funds".
Contributor:

i think this line should be removed

Collaborator (@jbojcic1, Apr 10, 2026):

it should. as a matter of fact I'd remove the entire tsdoc here. doesn't really add a lot of value imo


/**
* Checks if an error indicates the invoice was already paid.
* Phase C validation confirmed: message contains "preimage request already exists".
Contributor:

same with this one

Collaborator:

yeah entire tsdoc here should be removed

connect({
config,
seed: { type: 'mnemonic', mnemonic },
storageDir: `spark-${breezNetwork}`,
Contributor:

where is the storage? And do you know what's in it?

sparkBalanceQueryKey(updatedQuote.accountId),
),
});
// Balance is updated event-driven via useTrackAndUpdateSparkAccountBalances
Contributor:

is this comment needed?

typename: response.typename,
paymentPreimage: response.paymentPreimage ?? undefined,
receiverIdentityPublicKey: response.receiverIdentityPublicKey ?? undefined,
fee: moneyFromSats(0),
Contributor:

does this fee ever get set to something else? If not, then why is fee added to this type. I don't think receiver pays fees

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The package's web entry exports only the WASM init function as default,
so `import pkg from` + destructure loses the named exports in the
browser. The patch adds all named exports to the default via
Object.assign, making `import pkg from` work in both Node (CJS) and
browser (ESM) environments.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add nodejs/index.mjs that re-exports the CJS entry as named ESM
bindings, and update the package exports map to use it for ESM
imports. This lets `import { connect } from '@breeztech/breez-sdk-spark'`
work in Vite SSR without dynamic imports or Object.assign hacks.

Reverts app code to clean static named imports.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
clientMiddleware runs before clientLoader, so WASM must be initialized
in the middleware where defaultExternalSigner is first called.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…rData

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Replace @breeztech/breez-sdk-spark with @agicash/breez-sdk-spark@0.12.2-1
  (fork with ESM Node entry and receiverIdentityPubkey support)
- Remove the CJS-to-ESM bun patch (fork ships native ESM)
- Restore Spark Lightning Address flow: server wallet via
  LNURL_SERVER_SPARK_MNEMONIC creates delegated invoices with
  receiverIdentityPubkey, matching the master branch behavior
- Restore handleSparkLnurlpVerify using BreezSdk.getPayment()
- ReadUserDefaultAccountRepository accepts optional mnemonic to
  create real Spark wallets on the server

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* Hex-encoded identity public key of the receiver.
* Used for delegated invoices (Lightning Address).
*/
receiverIdentityPubkey?: string;
Collaborator:

why was this added? it was removed in RepositoryCreateQuoteParams and added here. I'd revert that change

paymentHash,
expiresAt,
sparkId,
receiverIdentityPubkey,
Collaborator:

this should be reverted

* If provided, the incoming payment can only be claimed by the Spark wallet that controls the specified public key.
* If not provided, the invoice will be created for the user that owns the Spark wallet.
*/
receiverIdentityPubkey?: string;
Collaborator:

this was removed from here and added below. no need to do such unnecessary changes. also not sure there was a need to change the comment

identityPublicKey() returns { bytes: number[] }, not ArrayBuffer.
The incorrect cast produced garbage hex, causing "malformed public key"
errors for delegated invoices and potentially Spark offline on Vercel.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
We use our own lightning address system, not Breez's. Setting
lnurlDomain to undefined prevents the 404 recovery call to
breez.tips on every SDK connect.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
## Integration Notes

- Import from `@breeztech/breez-sdk-spark` (root path, not `/bundler`) — root has conditional exports: Node.js gets CJS entry, browser gets ESM/WASM entry
- WASM init must happen in `entry.client.tsx` (client-only) via dynamic import
Collaborator:

i changed this to happen in protected route middleware so we don't init it for users that are not logged in. it is only needed for logged in users


- Import from `@breeztech/breez-sdk-spark` (root path, not `/bundler`) — root has conditional exports: Node.js gets CJS entry, browser gets ESM/WASM entry
- WASM init must happen in `entry.client.tsx` (client-only) via dynamic import
- All SDK usage in app code must use dynamic `import()` to avoid SSR module graph issues
Collaborator:

I solved this in the fork of the lib by adding an option to use esm instead of commonjs for node


**Result: SAME** — Fees are identical between Breez SDK and current Spark SDK. Both use the same Spark protocol.

## C6: Init Performance
Collaborator:

This is kind of comparing apples to oranges due to how the SDKs bundle and load WASM. They have different bundle sizes and trigger things at different times, so to make a comparison that is relevant for us we'd need to consider the time to show the balance on the home screen in different scenarios:
a) no cache hit: time to display the balance on the home page
b) cache hit: time to display the balance on the home page

also possibly try both on a slow network, etc.

| Insufficient balance | `Error` | `insufficient funds` |
| Already paid invoice | `Error` | `preimage request already exists` |

**Key difference from current Spark SDK:** Spark SDK's `SparkError` has `getContext()` returning structured data (e.g., `{ expected, value, field }` for insufficient balance). Breez SDK errors are plain `Error` with flat message strings — no typed subclasses, no structured context.
Collaborator:

One corollary of this is that we can't have those specific error messages for cases where owned balance hasn't been decreased yet and another send is attempted. This is probably fine because Breez sdk seems much better at handling that. Owned and available balances are abstracted away and balance does seem to decrease relatively fast after initiating the send even while it is still pending.

@@ -1,2 +1,6 @@
export * from './errors';
Collaborator:

we need to revert this change. this was absolutely unnecessary

});
} catch (error) {
if (isInsufficentBalanceError(error)) {
throw new DomainError(`Insufficient balance. ${error.message}`);
Collaborator:

before this change the message was `Insufficient balance. Total cost of send is ${value} sats but the available balance is ${availableSats} sats.`

we should see if there is a way to keep it the same (if we can get the total send cost and available balance values)

private readonly db: AgicashDb,
private readonly queryClient: QueryClient,
private readonly getSparkWalletMnemonic: () => Promise<string>,
private readonly getSparkWalletMnemonic?: () => Promise<string>,
Collaborator (@jbojcic1, Apr 10, 2026):

why was this made optional? it should be reverted


private async getInitializedSparkWallet(network: NetworkType) {
private async getInitializedSparkWalletForNetwork(network: SparkNetwork) {
if (!this.getSparkWalletMnemonic) {
Collaborator:

when getSparkWalletMnemonic is made mandatory again this whole change can also be reverted

}

private async getInitializedSparkWallet(network: NetworkType) {
private async getInitializedSparkWalletForNetwork(network: SparkNetwork) {
Collaborator:

this name change was unnecessary and should be reverted

* Network of the Spark account.
* Based on the NetworkType enum from the spark-sdk.
*/
network: z.enum(['MAINNET', 'TESTNET', 'SIGNET', 'REGTEST', 'LOCAL']),
Collaborator:

we should change this to only have the values 'MAINNET' and 'REGTEST' and change the comment to say it is based on the breez sdk type. then when storing the value in the db we need to convert it to uppercase (because the breez type is lowercase), and when converting the db type to the breez type we need to convert to lowercase.

we can change the type like that because in all our dbs all stored accounts atm are 'MAINNET'
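
A sketch of the narrowed type and the two case conversions described in this thread (the names here are illustrative; in the actual schema this would be `z.enum(['MAINNET', 'REGTEST'])`):

```typescript
// Illustrative only: the db stores the uppercase form, Breez SDK uses lowercase.
type SparkDbNetwork = 'MAINNET' | 'REGTEST';
type BreezNetwork = 'mainnet' | 'regtest';

// db -> breez: lowercase
const toBreezNetwork = (network: SparkDbNetwork): BreezNetwork =>
  network.toLowerCase() as BreezNetwork;

// breez -> db: uppercase
const toDbNetwork = (network: BreezNetwork): SparkDbNetwork =>
  network.toUpperCase() as SparkDbNetwork;
```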


export type SparkNetwork = SparkAccountDetailsDbData['network'];

export function toBreezNetwork(network: SparkNetwork): 'mainnet' | 'regtest' {
Collaborator:

when we do this here we can just call toLowerCase().

also this fn doesn't even belong here. we can just call toLowerCase() where needed

optimizationOptions: {
auto: true,
const breezNetwork = toBreezNetwork(network);
const apiKey = import.meta.env.VITE_BREEZ_API_KEY;
Collaborator:

lets load this env var into variable on the top of the module and throw if not set (same pattern like in database.server.ts file for example)
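
A minimal sketch of that fail-fast pattern (the helper name is hypothetical; the real code would read `import.meta.env.VITE_BREEZ_API_KEY` at module load, like database.server.ts does for its vars):

```typescript
// Hypothetical helper: resolve a required env var once at module load and
// throw immediately if it is missing, instead of failing later at connect time.
function requireEnvVar(name: string, value: string | undefined): string {
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// At the top of the module this would be:
// const breezApiKey = requireEnvVar('VITE_BREEZ_API_KEY', import.meta.env.VITE_BREEZ_API_KEY);
// A placeholder value is used here so the sketch is self-contained:
const breezApiKey = requireEnvVar('VITE_BREEZ_API_KEY', 'placeholder-key');
```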

const apiKey = import.meta.env.VITE_BREEZ_API_KEY;

// initLogging is idempotent — safe to call multiple times
try {
Collaborator:

lets extract this logger init into small helper function and also log a warning to the console in catch clause
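
Something like this small wrapper, for instance (the helper name is hypothetical, and the `initLogging` parameter stands in for the Breez SDK call of the same name; its exact signature is an assumption):

```typescript
// Hypothetical helper: initialize SDK logging, but never let a failed or
// duplicate init break wallet connect; surface it as a console warning instead.
function safeInitBreezLogging(initLogging: () => void): void {
  try {
    initLogging();
  } catch (error) {
    console.warn('[Spark] Breez SDK logger init failed', error);
  }
}
```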

}

export function useTrackAndUpdateSparkAccountBalances() {
const { data: sparkAccounts } = useAccounts({ type: 'spark' });
Collaborator:

just filter online accounts here so we don't need to do `if (!account.isOnline) continue;` below

if (
event.type === 'paymentSucceeded' ||
event.type === 'paymentPending' ||
event.type === 'synced'
Collaborator:

this should also include paymentFailed and claimedDeposits I think. those are also updating the balance
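
One way to keep that list of balance-affecting events in one place (a sketch; the constant and helper names are made up, and the event names are taken from this PR discussion):

```typescript
// Event types that can change the wallet balance, per this review thread.
const BALANCE_AFFECTING_EVENTS = new Set([
  'paymentSucceeded',
  'paymentPending',
  'paymentFailed',
  'claimedDeposits',
  'synced',
]);

// Decide whether a received SDK event warrants a balance refresh.
function shouldRefreshBalance(eventType: string): boolean {
  return BALANCE_AFFECTING_EVENTS.has(eventType);
}
```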


const sdk = account.wallet;

sdk
Collaborator:

I think we should do something like this instead:

useEffect(() => {
    const registrations = sparkOnlineAccounts
      .map((account) => {
        const sdk = account.wallet;
        const listenerPromise = sdk.addEventListener({
          onEvent(event: SdkEvent) {
            sparkDebugLog('Breez event', {
              accountId: account.id,
              type: event.type,
            });

            if (
              event.type === 'paymentSucceeded' ||
              ...
            ) {
              sdk.getInfo({}).then((info) => {
                ...
                accountCache.updateSparkAccountBalance({
                  accountId: account.id,
                  balance,
                });
              });
            }
          },
        });
        return { sdk, listenerPromise };
      });

    return () => {
      for (const { sdk, listenerPromise } of registrations) {
        listenerPromise.then((id) => sdk.removeEventListener(id)).catch(() => {});
      }
    };
  }, [sparkAccounts, accountCache]);

/**
* Optional public key of the wallet receiving the lightning invoice.
*/
receiverIdentityPubkey?: string;
Collaborator:

this should not be removed
