Proposal: Denex Developer SDK #238
Conversation
Thanks for the proposal. I have added a few comments to clarify the scope of work being proposed. It seems I am still missing the motivation behind some of the work being proposed here.
Edit: it would also help if you could showcase some of your existing work; the https://github.com/denex-io/ organization currently does not contain any public repos.
> | **Runtime inspection** | No programmatic way to query active topology, parties, ports, or users; developers must read env files, Docker volume mounts, or navigate web UIs manually | **CLI and programmatic API** to inspect running network state, such as query active parties, ports, users, and configuration |
> | **Runtime modification** | Requires stopping the network, editing config files, and restarting | **Modify a running network**: onboard parties/users, adjust topology via API or CLI |
> | **Integration testing** | No built-in test harness; developers must manually ensure the network is running and hardcode connection parameters | `test-utils` with `setupSandbox` for automated lifecycle management, party allocation, and teardown |
> | **Prerequisites** | Docker >= 27, Compose >= 2.27, direnv, Java 21, Node.js, >= 8 GB RAM; Gradle/Make orchestration | Streamlined prerequisite chain |
what is meant by "Streamlined prerequisite chain", could you elaborate?
Yes, I can see this is somewhat confusingly worded. I'll update this in the proposal.
The details here are that denex-localnet requires just Docker and Deno/Node as dependencies.
> | Milestone; Deadline | Deliverable | Category | Requested Funding (CC) | Description |
> | --- | --- | --- | --- | --- |
> | 1.1; Approval + 3 months | adz initial release | adz | 1,000,000 | Daml source code parser and TypeScript code generator producing Zod schemas and typed package descriptors from Daml projects |
> | 1.2; Approval + 3 months | SDK initial release | Denex SDK | 500,000 | Collection of TypeScript packages providing high-level, type-safe access to Canton ledger, scan, validator using Zod schemas for runtime type safety at the Daml payload level. Publish `@denex/*` packages to public npm registry. |
Could you provide more details on the kind of APIs you are targeting?
All of the components mentioned already have OpenAPI specs, so what will be the goal of these packages: to provide TypeScript clients for the OpenAPI specs?
Or are you referring to parsing of the Daml contracts provided by these endpoints? It would be helpful if these details were clarified.
Both, layered. The HTTP/OpenAPI piece is one component; the Daml-contract
handling sits on top of it and is where most of the value is.
The SDK targets the three Canton subsystems a dApp developer needs to talk to:
the participant node's JSON Ledger API v2 (query active contracts, submit
commands, read users and rights, fetch contract disclosures), the Scan Proxy
API (AmuletRules, open mining rounds, featured app rights), and the Token
Standard Registry APIs (instrument discovery, transfer and allocation
factories together with their choice context and disclosed contracts).
Internally it is two layers:

- Low-level OpenAPI TypeScript client — generated from the Canton OpenAPI specs. This is exactly the "TypeScript client for the OpenAPI spec" option in the question. It gives typed HTTP access to the raw endpoints and nothing more; it is interchangeable with other OpenAPI-generated clients (e.g. `@c7-digital/ledger`).
- High-level Ledger interface on top — the part that actually parses and produces Daml contracts. This is where the SDK differs from a plain OpenAPI client:
  - Daml contract payloads, choice arguments and choice results are described as Zod schemas. The OpenAPI spec can only describe those as `unknown`/`object`, because they are Daml-defined, not Canton-defined. The schemas provide both compile-time TypeScript types (via inference) and runtime parsing, validation and wire-format encoding of Daml scalars such as `Int`, `Numeric`, `Time`, `Party`, `ContractId`, `Optional`, whose Canton wire format does not match their natural TypeScript representation.
  - A command builder turns those schemas into typed template, contract and choice wrappers, so a choice invocation like `counter.increment({ amount: 2n })` replaces manually assembling a `templateId`/`contractId`/`choice`/`argument` object and casting `unknown` results.
  - Workflow helpers bundle the multi-call flows Canton requires — cursor-based pagination as a reusable pipeline, token-standard factory resolution (registry call + disclosed-contract collection + choice context) as a single object ready to feed into submit, disclosure retrieval as a per-call opt-in, multi-command submit that returns a tuple of per-command results with per-command types.
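To make the command-builder point concrete, here is a minimal sketch of what such a typed choice wrapper desugars to. `makeChoice`, the field names and the wire shape are hypothetical illustrations under assumed conventions, not the SDK's actual API:

```typescript
// Hypothetical sketch: a generated choice wrapper closes over template/choice
// metadata so call sites stop hand-assembling command objects.
type Codec<T> = { encode: (value: T) => unknown };

function makeChoice<Arg>(
  templateId: string,
  choice: string,
  argCodec: Codec<Arg>,
) {
  // Returns a typed function: (contractId, arg) -> wire-level exercise command.
  return (contractId: string, arg: Arg) => ({
    templateId,
    contractId,
    choice,
    choiceArgument: argCodec.encode(arg),
  });
}

// A bigint choice argument is encoded to the integer string the ledger expects.
const increment = makeChoice<{ amount: bigint }>(
  "pkg:Counter:Counter",
  "Increment",
  { encode: (a) => ({ amount: a.amount.toString() }) },
);

const cmd = increment("00abc", { amount: 2n });
// cmd.choiceArgument is { amount: "2" } — wire-ready, no manual casting
```

The point of the sketch is only the shape of the abstraction: the schema-derived codec is applied once, inside the wrapper, instead of at every call site.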
> | Dimension | Splice Wallet Kernel | Denex SDK |
> | --- | --- | --- |
> | **Primary audience** | Wallet providers, exchanges, dApp frontends connecting to wallets | dApp backends, frontends and full-stack developers, automation builders |
> | **Core abstraction** | CIP-0103 JSON-RPC protocol between dApps and wallets; `DappClient` proxies ledger calls through wallet gateway | Direct ledger interaction via `Ledger` API with typed commands, queries, and contract wrappers |
> | **Type safety at Daml layer** | OpenAPI-generated types for HTTP paths and JSON-RPC schemas; **Daml payload types are `unknown`** unless manually typed via `Ops.*` generics | Zod schemas (hand-written or adz-generated) provide **compile-time and runtime type validation** for all contract payloads, choice arguments, and results |
Could you describe why the Zod schemas would be useful over the TypeScript code provided via daml codegen, e.g. a concrete example where users would benefit from the runtime validation / decoding of a payload vs. the generated decoder currently provided by daml codegen?
daml codegen js / daml2js emits TypeScript companion objects and decoders built on `@daml/types`. They verify the JSON shape of a payload and hand back values, but the field types are transport-level rather than application-level: `Int`, `Numeric` and `Time` all stay as `string` (branded through `@daml/types`), so application code converts them on every read.

Concretely, on a contract with `count : Int`, `createdAt : Time` and `price : Numeric 10`:

```typescript
// daml codegen / @daml/ledger today
const c = Counter.decoder.runWithException(payload);
c.count;     // string (branded Int)
c.createdAt; // string (branded Time)
c.price;     // string (branded Numeric)
const count = BigInt(c.count);      // convert on every read
const at = new Date(c.createdAt);   // convert on every read
const price = new Decimal(c.price); // convert on every read, carefully
```

With a Zod schema describing the same payload the decoded values are the TypeScript types you actually use, and encoding back to the wire happens the same way on submission — bigint choice arguments serialize to the integer strings the ledger expects for `Int`, `Decimal` values to the decimal strings expected for `Numeric`, `Date` values to ISO-8601 timestamps, etc., without per-call boilerplate:

```typescript
const c = Payload.parse(raw);
c.count;     // bigint
c.createdAt; // Date
c.price;     // Decimal (from decimal.js, no precision loss)
```

Four concrete places this matters:
- `Numeric` precision. A `string → number` round-trip loses precision for balances and coin amounts. Branded-string codegen leaves the conversion (and the choice of conversion) up to every call site; Zod lets us implement `Decimal` encoding/decoding in one place.
- Semantic validation, not just shape. `.refine(...)` / regex checks let `Party` fields reject malformed identifiers, enums reject unknown variants, and domain-level invariants get enforced at the ingest boundary. This matters most when a payload arrives from another party's node (e.g. via Scan), where the data did not originate from your own codegen toolchain.
- Same validation vocabulary as the rest of the dApp. HTTP bodies, application configuration and internal events are already validated with Zod in many TypeScript projects; the Daml payloads slot into the same vocabulary instead of pulling in a Daml-specific parallel (`@daml/types`).
- Lightweight, editable artifact. The schemas are plain TypeScript files — either generated from the dApp's own Daml sources with a tool like adz, or hand-written for third-party templates you only consume (Amulet, Token Standard). There is no runtime dependency on `@daml/types`/`@daml/ledger`, and a dApp can add a few interfaces from a foreign package without needing access to its Daml source.
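The precision point can be demonstrated standalone, with no library involved: a wire string routed through `number` silently rounds above 2^53, while `bigint` (or a decimal library for fractional `Numeric` values) round-trips exactly:

```typescript
// Standalone demo of the precision-loss hazard; no SDK code involved.
const onWire = "9007199254740993"; // 2^53 + 1, a plausible large amount on the wire

// Naive decode via `number` silently rounds to the nearest representable double.
const asNumber = Number(onWire);
// Decode via `bigint`, which a schema-level codec can do once, centrally.
const asBigInt = BigInt(onWire);

console.log(asNumber.toString()); // "9007199254740992" — off by one
console.log(asBigInt.toString()); // "9007199254740993" — exact round-trip
```

Whether each call site remembers to avoid `Number` is exactly the kind of decision a central codec removes.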
> **On the current TypeScript codegen landscape:** Digital Asset's JavaScript/TypeScript codegen (`daml2js`) depends on the `@daml/ledger` package, which has not been updated to support the current Canton architecture. DA's own
Could you clarify " not been updated to support the current Canton architecture"?
These packages are being used in production systems currently to do encoding/decoding of contracts.
It seems that there are two concerns being mixed: the handling of contract parsing, and doing HTTP requests.
For doing HTTP requests, plenty of open-source production-grade client code already exists and it makes sense to use it, so it is not clear to me what issue is being solved by the adz SDK.
Fair pushback — the earlier framing was too broad. Two points to separate.
**Clarification on the "current Canton architecture" claim**
This is not a claim that @daml/ledger or daml codegen are broken or unused —
they are clearly in production. The precise points are:
- `@daml/ledger` is explicitly deprecated and is no longer distributed with Canton 3.4. The Canton 3.4 release notes for Splice 0.5.0 state verbatim: "The deprecated `@daml/ledger` and `@daml/react` TypeScript libraries are no longer distributed with Canton 3.4." The underlying transport matches: the same release notes and the Canton/Daml 3.3 release notes describe JSON Ledger API v2 as "the supported JSON Ledger API version going forward", v1 was deprecated in Canton 3.3, and v1 was removed in Canton 3.4 (a migration guide is published). The `@daml/ledger` npm package (2.10.x) still targets the v1 JSON API documented at `docs.daml.com/json-api/` (Daml 2.x), so on Canton 3.4 participant nodes it is both marked deprecated by Digital Asset and unable to reach the endpoints it calls. As far as I can verify from public Canton 3.3 / 3.4 release notes, Digital Asset has not announced a drop-in `@daml/ledger` replacement for JSON Ledger API v2 under the `@daml` scope. TypeScript codegen continues (the 3.4 notes describe changes to interface identifier output, and the new `dpm` CLI generates TypeScript alongside Java), but that is codegen, not a runtime HTTP/ledger client library. A TypeScript consumer of Canton 3.4 therefore has to either write an HTTP client against JSON Ledger API v2 themselves, use a community package such as `@c7-digital/ledger`, or use a higher-level library like this SDK.
- `@daml/types` models `Int`/`Numeric`/`Time` as branded strings. That is correct for the transport, but it pushes the "decode into application values" concern onto every call site (see previous comment).
- Neither covers Scan / Validator / Token-Standard flows. A dApp building on Amulet or Token Standard has to add those endpoints and their multi-call workflows itself.
**Two concerns — which one is the SDK actually about?**
Agreed that parsing and HTTP are separate concerns, and the HTTP concern is
well-served by existing code. The SDK is overwhelmingly about the parsing /
command-building concern:
- HTTP layer. Thin, generated from the Canton OpenAPI specs, and substitutable with another OpenAPI-generated client (`@c7-digital/ledger` does the same thing for the Ledger API). This is not where the SDK claims to add value.
- Parsing / command-building layer. This is the value. Zod-typed payloads and choice arguments, a command builder that produces typed template and contract wrappers, disclosure-aware queries, multi-command submit with typed per-command results, and workflow helpers for token-standard factory resolution, pagination, and choice-context lookups. None of this is provided by an OpenAPI-generated HTTP client, because none of it is HTTP — it is Daml-level plumbing that every Canton dApp would otherwise rebuild.
The closest existing option on the parsing concern is `@daml/ledger` + `daml codegen js`, which covers some of it, but only with transport-level typing, only for JSON API v1, and only for participant-node endpoints (no Scan / Token Standard). `@c7-digital/ledger` covers the HTTP concern for API v2 but explicitly does not attempt the parsing layer. `@canton-network/wallet-sdk` is wallet-centric (external signing, prepared-transaction flow) rather than dApp-centric and does not offer general typed templates or a command builder.
So the SDK is not trying to replace existing OpenAPI-generated HTTP clients; it
uses one internally. It is trying to consolidate the Daml-side concerns —
payload typing, choice-context and disclosure workflows, command construction,
multi-command results — into a single, v2-native, Zod-driven layer that dApp
developers would otherwise have to assemble per project.
> `useDenexContext`, typed error handling
> - **NPM-publishable:** Deno-first development with `dnt` build pipeline for NPM distribution
> - **CIP-0056 compliant** with a roadmap to CIP-0103 compliance
I think CIP-0103 compliance is a must. Commenting below why.
> The [Splice Wallet Kernel](https://github.com/nickkatsios/splice-wallet-kernel) is a comprehensive monorepo targeting **wallet provider integrations** via the CIP-0103 dApp API standard. We recognize its value and do not propose to replace it. Rather, the Denex SDK addresses a different, and currently unserved,
You're highlighting here that TypeScript could potentially be used in two rather distinct places:
- A node.js backend that sits next to the app dev's validator and interacts directly with the Ledger API.
- A dApp frontend that connects to the ledger via the CIP-0103 dApp API.
As an app developer I typically need to write both. A tooling landscape that has different tooling for purposes 1 and 2 means I have to build my backends and frontends using different libraries, codegens and type representations.
The Splice Wallet Kernel team is looking to move the TypeScript tooling to a state where there's a uniform "LedgerClient" object that behind the scenes interacts with the ledger either via 1. or 2. above, meaning it abstracts away the frontend/backend distinction of how you connect to the ledger.
I think your proposal would be much stronger if you planned to target that abstraction so that your codegen, typing, command builders, etc. worked independent of how the ledger is connected.
We agree with the principle, and this is how the Denex SDK is designed. adz, the
command builder, the contract wrappers and the pagination helpers are all
transport-agnostic by design, so the same application code can run in a Node
backend and in a browser frontend using the same schemas, package descriptors
and contract-wrapper API. Delivering this uniformly is part of the work this
grant funds.
Concretely, the SDK exposes one typed ledger interface instantiable over
different transports:
- A server-side instance implemented directly against the JSON Ledger API v2, with per-call contract-disclosure access. This is the implementation delivered under this grant.
- A client-side shape with the same surface (minus disclosure, which is removed at both the type and runtime levels), usable from the browser as soon as a suitable transport is provided. Wiring that transport up (auth, session, proxy or direct-API deployment) is application-specific and out of scope here.
This aligns with the direction the Splice Wallet Kernel team is taking with the
LedgerClient abstraction: an upstream LedgerClient would be a natural future
transport, and the Denex ledger API could be implemented on top of it without
changing application code, schemas or the command builder.
**On CIP-0103 specifically**
CIP-0103 support remains on the Denex SDK roadmap, but we would like to keep it
outside the scope of this proposal and revisit it in a follow-up:
- It addresses a different layer. CIP-0103 is a dApp ↔ wallet connectivity protocol, well-served by the Splice Wallet Kernel. The Denex SDK's focus is on typed contract interaction, codegen and command building.
- The `LedgerClient` abstraction is still maturing. Waiting until it has settled lets us integrate cleanly rather than chase a moving target.
- Backend/frontend uniformity can be delivered without it. CIP-0103 is one possible future transport under the typed ledger API, not a prerequisite for uniform developer experience across backend and frontend.
The transport-agnostic design means CIP-0103 can be added later without breaking
existing users. We are happy to discuss a follow-up or scoped extension once the
upstream work is further along.
> developer persona: **the dApp backend developer who needs to interact with their own participant node's ledger.**
>
> | Dimension | Splice Wallet Kernel | Denex SDK |
Rather than contrasting Denex SDK <> Wallet SDK, I'd present a picture of how the Denex SDK adds a layer on top of the Splice tooling as per above. Splice provides an abstract "LedgerClient". Denex SDK provides typing, codegen, builders on top.
That matches our intent. The layering we have in mind is:
- Transport — JSON Ledger API v2, Scan Proxy, Token Standard Registry APIs, and, in future, the Wallet Kernel's `LedgerClient`. The SDK consumes these; it does not reimplement them.
- Typed client — a single typed ledger interface with a direct implementation against the JSON Ledger API v2, Scan Proxy and Token Standard Registry APIs delivered under this grant, and a matching client-side shape instantiable over any suitable browser transport. Both would be implementable on top of `LedgerClient` in future without any API change.
- Developer layer — adz-generated Zod schemas and package descriptors, the command builder, template and contract wrappers (`Counter.create(...)`, `counter.increment(...)`, `counter.archive()`, `choice.asCommand()`), disclosure-aware workflows and pagination helpers.
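As one illustration of what "pagination helpers" in the developer layer can mean, a cursor loop can be hidden behind an async generator. This is a sketch: `Page`, `nextCursor` and `paginate` are hypothetical names modeling any cursor-style endpoint, not the SDK's published API:

```typescript
// Hypothetical sketch of a cursor-based pagination pipeline.
interface Page<T> {
  items: T[];
  nextCursor?: string; // absent on the last page
}

async function* paginate<T>(
  fetchPage: (cursor?: string) => Promise<Page<T>>,
): AsyncGenerator<T> {
  let cursor: string | undefined;
  do {
    const page = await fetchPage(cursor); // one transport call per page
    yield* page.items;
    cursor = page.nextCursor;
  } while (cursor !== undefined);
}

// Usage against an in-memory mock transport:
const pages: Record<string, Page<number>> = {
  start: { items: [1, 2], nextCursor: "p2" },
  p2: { items: [3] },
};
async function collect(): Promise<number[]> {
  const out: number[] = [];
  for await (const n of paginate(async (c) => pages[c ?? "start"])) out.push(n);
  return out;
}
```

Because the generator only depends on the `fetchPage` function, the same pipeline works over a server-side or client-side transport, which is the transport-agnostic property described above.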
The "vs." framing in Section 4.1 was intended to highlight the gap in the
current typed-SDK story, not to position the Denex SDK against the Wallet
Kernel. The Denex SDK does not aim to reimplement anything the Wallet Kernel
provides — it consumes whatever ledger-access transport the developer has chosen
and adds typing, codegen and command building above it.
> | Dimension | Splice Wallet Kernel | Denex SDK |
> | --- | --- | --- |
> | **Primary audience** | Wallet providers, exchanges, dApp frontends connecting to wallets | dApp backends, frontends and full-stack developers, automation builders |
Here you do include "frontends", while above stating the target to be just "the dApp backend developer who needs to interact with their own participant node's ledger". As I said, I do think that including frontend devs is of high value, but in that case you need to address somewhere how this proposal relates to @c7/react. If we end up with two stacks as follows, devs still don't have a single clear choice for all their TypeScript needs.
- dpm codegen, @c7/daml, @c7/react: Stack 1, goes all the way to UI components
- Denex codegen, Denex SDK: Stack 2, stops at types and builders

Ideally, you'd include the work required for @daml/react or @c7/react to work with the Denex stack, or you'd include a replacement that makes it easy for devs to build frontends.
The "stops at types and builders" characterization is accurate and deliberate.
The grant targets the parts of the stack where the current TypeScript landscape
has the biggest gap — codegen, typed contract interaction and command building —
which were also among the items the 2026 Developer Experience Survey ranked most
urgent.
For a frontend developer:
- The typed ledger client, codegen output and command builder work unchanged in the browser. A frontend dApp consumes the same Zod schemas, contract wrappers and `submit`/`query`/`exercise` API as the backend, so there is no second set of types, codegen or command model.
- Not included in this grant: a full browser-side dApp runtime (OAuth sign-in, session management) or a UI-component library. Auth and session wiring are typically application- and deployment-specific, and a UI-component library like `@c7/react` is a sizable project in its own right.

On @c7/react specifically:
- A developer who needs typed ledger access in React gets it from the Denex SDK directly — the typed client, schemas and command builder are usable from any JS runtime, including a React frontend.
- A developer who wants pre-built UI components can continue to use `@c7/react` or similar, fed with a Denex ledger client. There is no inherent incompatibility.
- Building a replacement for `@c7/react` is not in scope here; if the community identifies it as a priority later, the Denex SDK's typed contract wrappers are exactly the input such a library would need.
For the types, codegen, command-builder and typed-client layers, a developer can
pick the Denex stack once and use it on both sides of the wire. Above that, the
choice of UI-component library or application framework remains the developer's
to make.
Given that @leonidr-c7 submitted #270, would it be worth explicitly aligning the two to target demonstrable compatibility between the typed layer in this proposal and the UI layer in #270 ?
Thank you for your responses. I would like to further review the proposal, but I find it difficult because of its size and the number of things it covers. As I am not a user of JavaScript, the SDK and other parts are not an area I can review properly, so I will leave those to others to review.

For the localnet I am very interested in understanding and reviewing the design you are trying to create. I feel that the localnet work should be carved out into a separate proposal, with its own milestones, acceptance criteria and adoption-related milestones. Further, I would like you to add more details regarding the localnet work, such as details on the CLI tools that will be provided, and, regarding its milestones, which features would be the target of the releases for the various phases.
Thanks for this proposal. Overall I'm quite supportive of the direction, but would like to see the integration with existing dev tooling addressed more.
I'm generally wary of creating alt-dev-tooling that overlaps heavily with the Daml/Splice SDK but needs to replace large parts of the latter to be used. Because in that case, developers are likely to start on the Daml/Splice SDKs, then discover the cracks, discover the alt-tooling and have to rework. So either
> 1. **adz**: A Daml-to-TypeScript code generator that produces Zod schemas and typed package descriptors from Daml source, eliminating the need for manual type definitions or reliance on deprecated codegen tooling.
I would be interested in understanding what tooling you consider deprecated. The DA-provided codegen-js, available as the daml codegen js command in versions 3.1 to 3.4 and as the dpm codegen-js command from 3.4 onward, is not deprecated, nor is there a plan to deprecate it.
To be precise about what is and is not deprecated:

- `@daml/ledger` and `@daml/react` are explicitly deprecated by Digital Asset. The Canton 3.4 release notes for Splice 0.5.0 state verbatim that "the deprecated `@daml/ledger` and `@daml/react` TypeScript libraries are no longer distributed with Canton 3.4." Related: JSON Ledger API v1 was deprecated in Canton 3.3 and removed in 3.4, and the `@daml/ledger` 2.10.x npm package targets v1. This is the deprecation the proposal is referring to — the runtime TypeScript ledger client, not codegen.
- `daml codegen js` / `dpm codegen-js` is not deprecated, and we do not claim it is. TypeScript codegen continues under `dpm`. adz is positioned as an alternative codegen with a different output shape (Zod schemas, inferred TypeScript types and package descriptors) rather than a replacement for a deprecated tool.
- `@daml/types` is not deprecated. Our observation is only that its transport-level typing of `Int`/`Numeric`/`Time` as branded strings pushes decoding into every call site — a design-point note, not a deprecation claim.

Where the proposal says "deprecated codegen tooling," the intent is to point at the end-to-end TypeScript stack that `daml codegen js` output is typically used with (i.e. `@daml/ledger`), which is deprecated.
Our intent is to fall under the "build on top with clearly stated integration" option.

**adz vs. dpm codegen**

adz and adz will be open source, and one of its build targets is a single-file

**Getting-started docs on docs.canton.network**

Which tooling is presented on docs.canton.network is the Canton Foundation and

**On the "alt-dev-tooling" concern**

We share the concern about creating parallel tooling, and we have tried to shape

Neither component requires developers to abandon the Daml/Splice SDK to adopt
Sounds great. Thanks for your thoughtful engagement.
I've heard some complaints and concerns about the Dev Fund process from people on the committee, but "there are too few proposals" so far hasn't been one. ;) I think arguments could be made for this being split into any number of smaller proposals. Big picture,
It seems like the convention on these proposals has been not to dive too deeply into technical details in the proposal itself, which is perhaps something we should think about providing some guidance on.
I can provide a few things that might help here. This is the current CLI help output that outlines quite a few of its capabilities:
This is an example config that's passed in to create a localnet:
Yes, I think
There is some maintenance on our part, which we are happy to do because we are active users of this tool ourselves. I think it will be quite minor though: aside from the kind of sweeping breaking changes to the core Splice infrastructure that probably everyone hopes are mostly in the rear-view mirror now, it's mostly a matter of bumping versions on the container refs.
It runs a full LocalNet, including a fully operational Super Validator node as well as a configurable number of Validator/participant nodes. So anything that works on any of the public networks or in Splice localnet/cn-quickstart should work. Integrating such tools should be even easier than with these other options, as you get complete control over the environment in the config file and the ability to query the state and configuration of the running localnet programmatically, to set up environment config for other tools and apps without the fiddly hardcoding that's common today.
Yes, this is a core feature already. All the containers
I think I was not able to clarify my intention. I would like the proposals to be easier for the reviewers to review, and if there are many things being proposed then the review process becomes slow. One way to achieve this is to split the proposal into separate parts (as I said earlier, I am mostly interested in the localnet aspects, and will leave the SDK review to others). Regarding the CLI, thanks for clarifying; I wasn't expecting a single CLI design as you have described, and from a first read of the proposal I had somehow pictured multiple CLIs. Regarding Splice release support, I would like this to be mentioned in the proposal. I noticed some time back that cn-quickstart wasn't following this, so I don't consider this trivial. Regarding the "utility", sorry I wasn't clear: I meant the service provided by DA (https://docs.digitalasset.com/utilities/devnet/index.html). I believe most non-CC tokens on Canton are currently using this, and it is a big pain point for localnet testing if there is no support for the token-standard APIs that are currently provided by this service on devnet.
Ah, I follow you now. Based on my conversations with DA, my understanding is that they do not currently ship all the pieces of the Utility app that you would need to run your own registry on a localnet. So there's not much that anyone outside of DA can do about that. However, we have a prototype of a separate utility that enables minting CIP-56 compliant tokens in local and test environments that works with the
> identified in the [2026 Developer Experience and Tooling Survey](https://discuss.daml.com/t/canton-network-developer-experience-and-tooling-survey-analysis-2026/8412):
>
> 1. **adz**: A Daml-to-TypeScript code generator that produces Zod schemas and
One thing that’s awkward about this approach is that you are bypassing the Daml-LF JSON encoding that is presented over the JSON API. What guarantees would your approach provide to guard against changes in this encoding that may not be due to Daml changes?
Worth separating what the codegen reads from what implements the wire format. adz parses Daml source files; daml codegen js reads DARs (Daml-LF protobuf). Either way, the codegen learns the type structure of templates, choices, and data types — but the Daml-LF JSON encoding of those values has to be implemented separately, in the runtime that the generated code uses. daml codegen js pairs with @daml/types, where the LF JSON encoding is hand-written (source). adz pairs with the Denex SDK, which similarly implements the LF JSON encoding in a small, scoped module. The encoding spec is a documentation artifact, not a machine-readable schema, so any TS toolchain has to track it by hand against the same spec.
Our schemas don't define a parallel wire format — they target the LF JSON encoding on both sides of the transform. What changes is the in-memory TypeScript representation between decode and encode, not the bytes on the wire. A Numeric arrives as a string on the wire (LF JSON), is decoded into a Decimal for application code, and on submission is re-encoded to a string in the LF format. An Int arrives as a string, becomes a bigint in TS, and goes back out as a string.
Two things guard against drift in the LF encoding. The schemas are designed to express the full LF JSON encoding directly — Zod codecs map cleanly onto the scalar string-with-format conventions, and the shape conventions for Optional / Variant / Record / Map / Set are straightforward composable schemas. Encoding changes translate into local edits in one module rather than ripple-effect changes across generated code. The toolchain is also exercised against a real Canton sandbox in CI: contracts are created, choices exercised, and results read back through the JSON Ledger API. Encoding drift surfaces as concrete test failures against a live participant rather than as silent schema mismatches.
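A minimal sketch of what "the LF JSON encoding in one small, scoped module" means in practice (the codec names are illustrative assumptions, not actual SDK exports): each scalar pairs a decode and an encode against the wire string, so a change in the encoding is an edit in this module rather than at every call site.

```typescript
// Illustrative scalar codecs targeting the Daml-LF JSON conventions,
// where Int and Time travel as strings on the wire but have richer
// in-memory TypeScript representations.
interface ScalarCodec<T> {
  decode: (wire: string) => T;
  encode: (value: T) => string;
}

const lfInt: ScalarCodec<bigint> = {
  decode: (w) => BigInt(w),         // wire "42" -> 42n
  encode: (v) => v.toString(),      // 42n -> wire "42"
};

const lfTime: ScalarCodec<Date> = {
  decode: (w) => new Date(w),       // ISO-8601 wire string -> Date
  encode: (v) => v.toISOString(),   // Date -> ISO-8601 wire string
};

// Round-trip: what arrives on the wire goes back out in the same format.
const t = lfTime.decode("2026-01-02T03:04:05.000Z");
const wireTime = lfTime.encode(t);                 // "2026-01-02T03:04:05.000Z"
const wireInt = lfInt.encode(lfInt.decode("42"));  // "42"
```

Composite shapes (Optional, Record, Map) would then be built by composing such codecs, which is what keeps the wire format centralized.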
> …Daml payload level.
> - The community tools page lists `@c7/ledger` and `@c7/react` as replacements for `@daml/ledger`, with no explanation of why a replacement is needed and minimal adoption (1 GitHub star).
😆 That’s not true, I alone ⭐ ‘d it twice!
> - **Automatic choice name normalization:** Daml's `Offer_Accept` convention -> `.accept()` method with original name preserved for ledger submission
> - **Explicit `Ledger` vs `ClientLedger` split** enforcing disclosure access rules at the type level so that server-side code gets `$disclosure()`, while …
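The normalization quoted above can be sketched as a small pure function. This is an illustrative reading of the convention (strip the template prefix, lower-case the first letter), not the SDK's actual implementation; the function name is invented here.

```typescript
// Illustrative sketch of choice-name normalization: strip the `Template_`
// prefix and lower-case the first letter. Not the SDK's actual code; the
// function name is invented for this example.
function normalizeChoiceName(choice: string): string {
  const sep = choice.indexOf("_");
  const bare = sep === -1 ? choice : choice.slice(sep + 1);
  return bare.charAt(0).toLowerCase() + bare.slice(1);
}

// The original name travels alongside the method for ledger submission.
const accept = {
  method: normalizeChoiceName("Offer_Accept"), // "accept"
  wireName: "Offer_Accept", // preserved for the ledger
};
```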
Could you please describe this distinction in more detail?
The SDK exposes two flavors of the ledger API. Both let you submit commands and query contracts and interfaces. They differ in one capability: access to contract disclosures on query results.

- Server-side ledger: for trusted code (e.g. backend services). Queries can opt in per call to receive disclosures alongside contract payloads, which can then be fed into subsequent submit calls. Contract wrappers built from this ledger expose a `$disclosure()` helper to fetch one on demand.
- Client-side ledger: for untrusted code going through a proxy. Queries have no way to ask for disclosures, and contract wrappers don't expose `$disclosure()`. `submit` still accepts disclosures, but the caller must obtain them from elsewhere (e.g. registry endpoints that return them as part of their normal response).

The proxy strips disclosures from query responses at runtime; the two-type split enforces the same boundary at compile time, so client code can't accidentally request or depend on them.
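A rough sketch of what that two-type split could look like; all names here are invented for illustration and are not the SDK's actual API surface:

```typescript
// Hypothetical sketch of the Ledger vs ClientLedger split described above.
// Names are illustrative, not the actual SDK API. The point: disclosure
// access exists only on the server-side types, so client code that asks
// for one fails to compile.

type Disclosure = { templateId: string; blob: string };

interface Contract<T> {
  contractId: string;
  payload: T;
}

// Server-side wrapper additionally exposes $disclosure().
interface ServerContract<T> extends Contract<T> {
  $disclosure(): Promise<Disclosure>;
}

// Client ledger: no way to request disclosures on query; submit still
// accepts ones obtained out of band (e.g. from a registry endpoint).
interface ClientLedger {
  query<T>(templateId: string): Promise<Contract<T>[]>;
  submit(commands: object[], disclosures?: Disclosure[]): Promise<void>;
}

// Server ledger narrows query results to disclosure-capable wrappers.
interface Ledger extends ClientLedger {
  query<T>(templateId: string): Promise<ServerContract<T>[]>;
}

// Minimal mock to show the shape in use.
const ledger: Ledger = {
  async query<T>(templateId: string) {
    return [{
      contractId: "c1",
      payload: {} as T,
      $disclosure: async () => ({ templateId, blob: "<opaque>" }),
    }];
  },
  async submit() {},
};
```

Because `ServerContract` extends `Contract`, server code can be passed where client code is expected, but never the reverse, which mirrors the runtime proxy boundary.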
Probably not worth debating these details at the proposal level. But disclosure is a legitimate use case even on the client side, so I hope this is configurable.
This is a good proposal reflective of an impressive technology. I may disagree with the architecture of … I would support breaking this proposal into two (or three), as disagreements over the above should not be a blocker for supporting your localnet project.
Temple is in favor here. We’re already integrating this stack, and it’s shaping up to be a very practical, cohesive path for type-safe Canton app dev.
Did some substantial reworking here. The …
7b6e8ed to 5cb7b00
Development Fund Proposal Submission
Proposal file:
./proposals/denex-development-toolkit.md
Summary
The Denex Development Toolkit is a suite of tools to accelerate the development of Canton apps. It includes: a tool that generates TypeScript types and Zod schemas from Daml contracts; a type-safe, high-level SDK library for Canton Ledger interaction that integrates these generated types and schemas with the Ledger API spec, enabling both developer affordances and powerful guardrails and tool assistance for agentic/AI systems; and a tool for creating and managing Canton LocalNets via both an API and a single-file configuration format, with direct control over validators/participants, parties, users, ports, and authentication, plus tooling for integrating into CI/CD and app development flows. Together, these tools address persistent pain points raised in developer surveys around developing against smart contracts, type-safe SDKs, and environment setup, and they work cohesively to accelerate app development.
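To make the single-file LocalNet configuration concrete, here is a purely hypothetical sketch; every field name below is invented for illustration and does not reflect the actual Denex format.

```typescript
// Purely hypothetical illustration of a single-file LocalNet config; all
// field names are invented for this sketch, not the actual Denex format.
const localnet = {
  validators: [
    { name: "app-provider", participantPort: 5001 },
    { name: "app-user", participantPort: 5002 },
  ],
  parties: ["alice", "bob"],
  users: [{ id: "backend", actAs: ["alice"] }],
  auth: { mode: "unsafe-shared-secret" },
};
```

The idea such a format conveys is the one described above: one file governs validators/participants, parties, users, ports, and authentication, consumable by both the CLI and the programmatic API.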
Denex uses these tools extensively in our own projects, and they are part of our larger Denex App Platform that is currently in a closed beta launch.
Checklist
/proposals/

Notes for Reviewers
(Add anything the Tech & Ops Committee should pay attention to.)