From 70ba6bf83c52fbc6e260837cd49136668d2bb260 Mon Sep 17 00:00:00 2001
From: Kathryn May
Date: Wed, 24 Sep 2025 14:11:22 -0400
Subject: [PATCH 01/18] Draft nav + sidebar updates for langsmith platform

---
 src/docs.json | 1059 ++++++++---
 src/langgraph-platform/deployment-options.mdx | 84 --
 src/langgraph-platform/faq.mdx | 68 --
 src/langgraph-platform/plans.mdx | 5 -
 src/langgraph-platform/quick-start-studio.mdx | 134 ---
 src/langgraph-platform/release-versions.mdx | 11 -
 src/langgraph-platform/self-hosted.mdx | 77 --
 src/langgraph-platform/why-langgraph.mdx | 10 -
 .../add-auth-server.mdx | 0
 .../add-human-in-the-loop.mdx | 0
 .../agent-auth.mdx | 2 +-
 src/langsmith/agent-builder-overview.mdx | 5 +
 .../api-ref-control-plane.mdx | 0
 .../application-structure.mdx | 0
 .../assistants.mdx | 0
 .../auth.mdx | 0
 .../autogen-integration.mdx | 0
 .../background-run.mdx | 0
 src/{langgraph-platform => langsmith}/cli.mdx | 4 +-
 .../cloud.mdx | 5 +-
 .../components.mdx | 6 +-
 .../configurable-headers.mdx | 0
 .../configuration-cloud.mdx | 8 +-
 .../configure-ttl.mdx | 0
 .../control-plane.mdx | 0
 .../cron-jobs.mdx | 0
 .../custom-auth.mdx | 0
 .../custom-docker.mdx | 0
 .../custom-lifespan.mdx | 0
 .../custom-middleware.mdx | 0
 .../custom-routes.mdx | 0
 .../data-plane.mdx | 0
 .../data-storage-and-privacy.mdx | 4 +-
 .../deploy-hybrid.mdx | 3 +-
 .../deploy-self-hosted-full-platform.mdx | 4 +-
 .../deploy-standalone-server.mdx | 5 +-
 .../deploy-to-cloud.mdx | 10 +-
 src/langsmith/deployment-options.mdx | 60 +
 .../deployment-quickstart.mdx | 6 +-
 .../double-texting.mdx | 0
 .../egress-metrics-metadata.mdx | 0
 .../enqueue-concurrent.mdx | 0
 .../env-var.mdx | 0
 src/langsmith/evaluation-quickstart.mdx | 2 +-
 src/langsmith/evaluation.mdx | 58 +-
 src/langsmith/faq.mdx | 98 +-
 .../generative-ui-react.mdx | 0
 .../graph-rebuild.mdx | 0
 src/langsmith/home.mdx | 88 +-
 .../human-in-the-loop-time-travel.mdx | 0
 .../hybrid.mdx | 12 +-
 .../interrupt-concurrent.mdx | 0
 .../langgraph-cli.mdx | 1 +
 .../langgraph-js-ts-sdk.mdx} | 2 +-
 .../langgraph-python-sdk.mdx} | 2 +-
 .../langgraph-server-changelog.mdx | 0
 .../langgraph-server.mdx | 0
 .../langgraph-studio.mdx | 10 +-
 .../local-server.mdx | 8 +-
 .../monorepo-support.mdx | 0
 src/langsmith/observability-quickstart.mdx | 2 +-
 src/langsmith/observability-studio.mdx | 213 ++++
 src/langsmith/observability.mdx | 58 +-
 .../openapi-security.mdx | 0
 src/langsmith/prompt-engineering.mdx | 57 +-
 src/langsmith/quick-start-studio.mdx | 135 +++
 .../reference-overview.mdx | 0
 .../reject-concurrent.mdx | 0
 src/langsmith/release-versions.mdx | 13 +-
 .../remote-graph.mdx | 0
 .../resource-auth.mdx | 0
 .../rollback-concurrent.mdx | 0
 .../same-thread.mdx | 0
 .../scalability-and-resilience.mdx | 0
 src/{langgraph-platform => langsmith}/sdk.mdx | 0
 src/langsmith/self-hosted.mdx | 198 +++
 .../semantic-search.mdx | 0
 .../server-a2a.mdx | 0
 .../server-api-ref.mdx | 0
 .../server-mcp.mdx | 0
 .../set-up-custom-auth.mdx | 2 +-
 .../setup-app-requirements-txt.mdx | 0
 .../setup-javascript.mdx | 0
 .../setup-pyproject.mdx | 0
 .../{api-ref.mdx => smith-api-ref.mdx} | 2 +-
 .../{js-ts-sdk.mdx => smith-js-ts-sdk.mdx} | 2 +-
 .../{python-sdk.mdx => smith-python-sdk.mdx} | 2 +-
 .../stateless-runs.mdx | 0
 .../streaming.mdx | 0
 .../troubleshooting-studio.mdx | 6 +-
 .../use-remote-graph.mdx | 0
 .../use-stream-react.mdx | 0
 src/langsmith/use-studio.mdx | 139 +++
 .../use-threads.mdx | 2 +-
 .../use-webhooks.mdx | 0
 src/langsmith/why-langgraph.mdx | 10 +
 src/snippets/oss/studio.mdx | 2 +-
 97 files changed, 1643 insertions(+), 1051 deletions(-)
 delete mode 100644 src/langgraph-platform/deployment-options.mdx
 delete mode 100644 src/langgraph-platform/faq.mdx
 delete mode 100644 src/langgraph-platform/plans.mdx
 delete mode 100644 src/langgraph-platform/quick-start-studio.mdx
 delete mode 100644 src/langgraph-platform/release-versions.mdx
 delete mode 100644 src/langgraph-platform/self-hosted.mdx
 delete mode 100644 src/langgraph-platform/why-langgraph.mdx
 rename src/{langgraph-platform => langsmith}/add-auth-server.mdx (100%)
 rename src/{langgraph-platform => langsmith}/add-human-in-the-loop.mdx (100%)
 rename src/{langgraph-platform => langsmith}/agent-auth.mdx (97%)
 create mode 100644 src/langsmith/agent-builder-overview.mdx
 rename src/{langgraph-platform => langsmith}/api-ref-control-plane.mdx (100%)
 rename src/{langgraph-platform => langsmith}/application-structure.mdx (100%)
 rename src/{langgraph-platform => langsmith}/assistants.mdx (100%)
 rename src/{langgraph-platform => langsmith}/auth.mdx (100%)
 rename src/{langgraph-platform => langsmith}/autogen-integration.mdx (100%)
 rename src/{langgraph-platform => langsmith}/background-run.mdx (100%)
 rename src/{langgraph-platform => langsmith}/cli.mdx (99%)
 rename src/{langgraph-platform => langsmith}/cloud.mdx (57%)
 rename src/{langgraph-platform => langsmith}/components.mdx (84%)
 rename src/{langgraph-platform => langsmith}/configurable-headers.mdx (100%)
 rename src/{langgraph-platform => langsmith}/configuration-cloud.mdx (92%)
 rename src/{langgraph-platform => langsmith}/configure-ttl.mdx (100%)
 rename src/{langgraph-platform => langsmith}/control-plane.mdx (100%)
 rename src/{langgraph-platform => langsmith}/cron-jobs.mdx (100%)
 rename src/{langgraph-platform => langsmith}/custom-auth.mdx (100%)
 rename src/{langgraph-platform => langsmith}/custom-docker.mdx (100%)
 rename src/{langgraph-platform => langsmith}/custom-lifespan.mdx (100%)
 rename src/{langgraph-platform => langsmith}/custom-middleware.mdx (100%)
 rename src/{langgraph-platform => langsmith}/custom-routes.mdx (100%)
 rename src/{langgraph-platform => langsmith}/data-plane.mdx (100%)
 rename src/{langgraph-platform => langsmith}/data-storage-and-privacy.mdx (91%)
 rename src/{langgraph-platform => langsmith}/deploy-hybrid.mdx (99%)
 rename src/{langgraph-platform => langsmith}/deploy-self-hosted-full-platform.mdx (97%)
 rename src/{langgraph-platform => langsmith}/deploy-standalone-server.mdx (98%)
 rename src/{langgraph-platform => langsmith}/deploy-to-cloud.mdx (94%)
 create mode 100644 src/langsmith/deployment-options.mdx
 rename src/{langgraph-platform => langsmith}/deployment-quickstart.mdx (95%)
 rename src/{langgraph-platform => langsmith}/double-texting.mdx (100%)
 rename src/{langgraph-platform => langsmith}/egress-metrics-metadata.mdx (100%)
 rename src/{langgraph-platform => langsmith}/enqueue-concurrent.mdx (100%)
 rename src/{langgraph-platform => langsmith}/env-var.mdx (100%)
 rename src/{langgraph-platform => langsmith}/generative-ui-react.mdx (100%)
 rename src/{langgraph-platform => langsmith}/graph-rebuild.mdx (100%)
 rename src/{langgraph-platform => langsmith}/human-in-the-loop-time-travel.mdx (100%)
 rename src/{langgraph-platform => langsmith}/hybrid.mdx (90%)
 rename src/{langgraph-platform => langsmith}/interrupt-concurrent.mdx (100%)
 rename src/{langgraph-platform => langsmith}/langgraph-cli.mdx (98%)
 rename src/{langgraph-platform/js-ts-sdk.mdx => langsmith/langgraph-js-ts-sdk.mdx} (78%)
 rename src/{langgraph-platform/python-sdk.mdx => langsmith/langgraph-python-sdk.mdx} (78%)
 rename src/{langgraph-platform => langsmith}/langgraph-server-changelog.mdx (100%)
 rename src/{langgraph-platform => langsmith}/langgraph-server.mdx (100%)
 rename src/{langgraph-platform => langsmith}/langgraph-studio.mdx (84%)
 rename src/{langgraph-platform => langsmith}/local-server.mdx (93%)
 rename src/{langgraph-platform => langsmith}/monorepo-support.mdx (100%)
 create mode 100644 src/langsmith/observability-studio.mdx
 rename src/{langgraph-platform => langsmith}/openapi-security.mdx (100%)
 create mode 100644 src/langsmith/quick-start-studio.mdx
 rename src/{langgraph-platform => langsmith}/reference-overview.mdx (100%)
 rename src/{langgraph-platform => langsmith}/reject-concurrent.mdx (100%)
 rename src/{langgraph-platform => langsmith}/remote-graph.mdx (100%)
 rename src/{langgraph-platform => langsmith}/resource-auth.mdx (100%)
 rename src/{langgraph-platform => langsmith}/rollback-concurrent.mdx (100%)
 rename src/{langgraph-platform => langsmith}/same-thread.mdx (100%)
 rename src/{langgraph-platform => langsmith}/scalability-and-resilience.mdx (100%)
 rename src/{langgraph-platform => langsmith}/sdk.mdx (100%)
 create mode 100644 src/langsmith/self-hosted.mdx
 rename src/{langgraph-platform => langsmith}/semantic-search.mdx (100%)
 rename src/{langgraph-platform => langsmith}/server-a2a.mdx (100%)
 rename src/{langgraph-platform => langsmith}/server-api-ref.mdx (100%)
 rename src/{langgraph-platform => langsmith}/server-mcp.mdx (100%)
 rename src/{langgraph-platform => langsmith}/set-up-custom-auth.mdx (97%)
 rename src/{langgraph-platform => langsmith}/setup-app-requirements-txt.mdx (100%)
 rename src/{langgraph-platform => langsmith}/setup-javascript.mdx (100%)
 rename src/{langgraph-platform => langsmith}/setup-pyproject.mdx (100%)
 rename src/langsmith/{api-ref.mdx => smith-api-ref.mdx} (68%)
 rename src/langsmith/{js-ts-sdk.mdx => smith-js-ts-sdk.mdx} (72%)
 rename src/langsmith/{python-sdk.mdx => smith-python-sdk.mdx} (74%)
 rename src/{langgraph-platform => langsmith}/stateless-runs.mdx (100%)
 rename src/{langgraph-platform => langsmith}/streaming.mdx (100%)
 rename src/{langgraph-platform => langsmith}/troubleshooting-studio.mdx (91%)
 rename src/{langgraph-platform => langsmith}/use-remote-graph.mdx (100%)
 rename src/{langgraph-platform => langsmith}/use-stream-react.mdx (100%)
 create mode 100644 src/langsmith/use-studio.mdx
 rename src/{langgraph-platform => langsmith}/use-threads.mdx (99%)
 rename src/{langgraph-platform => langsmith}/use-webhooks.mdx (100%)
 create mode 100644 src/langsmith/why-langgraph.mdx

diff --git a/src/docs.json b/src/docs.json
index fe2953e7d..7a7c04938 100644
--- a/src/docs.json
+++ b/src/docs.json
@@ -478,235 +478,90 @@
 ] }, { - "dropdown": "LangGraph Platform", - "icon": "rocket-launch", - "description": "Platform for building and deploying AI agents", + "dropdown": "LangSmith", + "icon": "rocket", + 
"description": "Platform for deployment, LLM observability, and evaluation", "tabs": [ { "tab": "Get started", - "groups": [ - { - "group": "Overview", - "pages": [ - "langgraph-platform/index", - { - "group": "Components", - "pages": [ - "langgraph-platform/components", - "langgraph-platform/langgraph-server", - "langgraph-platform/data-plane", - "langgraph-platform/control-plane" - ] - }, - "langgraph-platform/application-structure" - ] - }, + "pages": [ + "langsmith/home", { "group": "Quickstarts", "pages": [ - "langgraph-platform/local-server", - "langgraph-platform/deployment-quickstart", - "langgraph-platform/quick-start-studio" - ] - }, - { - "group": "Plans", - "pages": [ - "langgraph-platform/plans" - ] - } - ] - }, - { - "tab": "Build", - "groups": [ - { - "group": "Build with LangGraph Platform", - "pages": [ - "langgraph-platform/why-langgraph" + "langsmith/observability-quickstart", + "langsmith/evaluation-quickstart", + "langsmith/prompt-engineering-quickstart", + "langsmith/local-server", + "langsmith/deployment-quickstart", + "langsmith/quick-start-studio" ] }, { - "group": "Data models", + "group": "App development", "pages": [ + "langsmith/why-langgraph", { - "group": "Assistants", + "group": "Data models", "pages": [ - "langgraph-platform/assistants", - "langgraph-platform/configuration-cloud", - "langgraph-platform/use-threads" + { + "group": "Assistants", + "pages": [ + "langsmith/assistants", + "langsmith/configuration-cloud", + "langsmith/use-threads" + ] + }, + { + "group": "Runs", + "pages": [ + "langsmith/background-run", + "langsmith/same-thread", + "langsmith/cron-jobs", + "langsmith/stateless-runs", + "langsmith/configurable-headers" + ] + } ] }, { - "group": "Runs", + "group": "Core capabilities", "pages": [ - "langgraph-platform/background-run", - "langgraph-platform/same-thread", - "langgraph-platform/cron-jobs", - "langgraph-platform/stateless-runs", - "langgraph-platform/configurable-headers" + "langsmith/streaming", + 
"langsmith/add-human-in-the-loop", + "langsmith/human-in-the-loop-time-travel", + "langsmith/server-mcp", + "langsmith/server-a2a", + "langsmith/use-webhooks", + { + "group": "Double-texting", + "pages": [ + "langsmith/double-texting", + "langsmith/interrupt-concurrent", + "langsmith/rollback-concurrent", + "langsmith/reject-concurrent", + "langsmith/enqueue-concurrent" + ] + } ] - } - ] - }, - { - "group": "Core capabilities", - "pages": [ - "langgraph-platform/streaming", - "langgraph-platform/add-human-in-the-loop", - "langgraph-platform/human-in-the-loop-time-travel", - "langgraph-platform/server-mcp", - "langgraph-platform/server-a2a", - "langgraph-platform/use-webhooks", + }, { - "group": "Double-texting", + "group": "Tutorials", "pages": [ - "langgraph-platform/double-texting", - "langgraph-platform/interrupt-concurrent", - "langgraph-platform/rollback-concurrent", - "langgraph-platform/reject-concurrent", - "langgraph-platform/enqueue-concurrent" + "langsmith/autogen-integration", + "langsmith/use-stream-react", + "langsmith/generative-ui-react" ] } ] }, { - "group": "LangGraph Studio", - "pages": [ - "langgraph-platform/langgraph-studio", - "langgraph-platform/use-studio", - "langgraph-platform/observability-studio", - "langgraph-platform/troubleshooting-studio" - ] - } - ] - }, - { - "tab": "Deploy", - "groups": [ - { - "group": "Overview", - "pages": [ - "langgraph-platform/deployment-options", - "langgraph-platform/cloud", - "langgraph-platform/hybrid", - "langgraph-platform/self-hosted" - ] - }, - { - "group": "Guides for deployment", - "pages": [ - "langgraph-platform/deploy-to-cloud", - "langgraph-platform/deploy-hybrid", - "langgraph-platform/deploy-self-hosted-full-platform", - "langgraph-platform/deploy-standalone-server", - "langgraph-platform/use-remote-graph" - ] - }, - { - "group": "Configure app for deployment", - "pages": [ - "langgraph-platform/setup-app-requirements-txt", - "langgraph-platform/setup-pyproject", - 
"langgraph-platform/setup-javascript", - "langgraph-platform/custom-docker", - "langgraph-platform/graph-rebuild", - "langgraph-platform/langgraph-cli", - "langgraph-platform/sdk", - "langgraph-platform/egress-metrics-metadata", - "langgraph-platform/monorepo-support" - ] - } - ] - }, - { - "tab": "Manage", - "groups": [ - { - "group": "Authentication & access control", - "pages": [ - "langgraph-platform/auth", - "langgraph-platform/custom-auth", - "langgraph-platform/set-up-custom-auth", - "langgraph-platform/resource-auth", - "langgraph-platform/add-auth-server", - "langgraph-platform/openapi-security", - "langgraph-platform/agent-auth" - ] - }, - { - "group": "Scalability & resilience", - "pages": [ - "langgraph-platform/scalability-and-resilience" - ] - }, - { - "group": "Server customization", - "pages": [ - "langgraph-platform/custom-lifespan", - "langgraph-platform/custom-middleware", - "langgraph-platform/custom-routes" - ] - }, - { - "group": "Data management", - "pages": [ - "langgraph-platform/data-storage-and-privacy", - "langgraph-platform/semantic-search", - "langgraph-platform/configure-ttl" - ] - }, - { - "group": "Tutorials", - "pages": [ - "langgraph-platform/autogen-integration", - "langgraph-platform/use-stream-react", - "langgraph-platform/generative-ui-react" - ] - } - ] - }, - { - "tab": "Reference", - "pages": [ - "langgraph-platform/reference-overview", - "langgraph-platform/server-api-ref", - "langgraph-platform/langgraph-server-changelog", - "langgraph-platform/release-versions", - "langgraph-platform/api-ref-control-plane", - "langgraph-platform/cli", - "langgraph-platform/env-var", - "langgraph-platform/python-sdk", - "langgraph-platform/js-ts-sdk", - "langgraph-platform/remote-graph", - "langgraph-platform/faq" - ] - } - ] - }, - { - "dropdown": "LangSmith", - "icon": "screwdriver-wrench", - "description": "Platform for LLM observability and evaluation", - "tabs": [ - { - "tab": "Get started", - "pages": [ - "langsmith/home", - { - 
"group": "Quickstarts", - "pages": [ - "langsmith/observability-quickstart", - "langsmith/evaluation-quickstart", - "langsmith/prompt-engineering-quickstart" - ] - }, - { - "group": "API & SDKs", + "group": "Studio", "pages": [ - "langsmith/api-ref", - "langsmith/python-sdk", - "langsmith/js-ts-sdk" + "langsmith/langgraph-studio", + "langsmith/use-studio", + "langsmith/observability-studio", + "langsmith/troubleshooting-studio" ] }, { @@ -991,78 +846,183 @@ ] }, { - "tab": "Self-hosting", - "pages": [ - "langsmith/architectural-overview", + "tab": "Deployment", + "groups": [ { - "group": "Setup", + "group": "Overview", "pages": [ - "langsmith/kubernetes", - "langsmith/docker", - "langsmith/self-host-usage", - "langsmith/self-host-upgrades", - "langsmith/self-host-egress", - "langsmith/self-host-organization-charts", - "langsmith/langsmith-managed-clickhouse" + "langsmith/deployment-options", + { + "group": "Platform components", + "pages": [ + "langsmith/components", + "langsmith/langgraph-server", + "langsmith/data-plane", + "langsmith/control-plane" + ] + } ] }, { - "group": "Configuration", + "group": "Cloud", "pages": [ - "langsmith/self-host-scale", - "langsmith/self-host-ttl", - "langsmith/self-host-ingress", - "langsmith/self-host-mirroring-images", - "langsmith/self-host-playground-environment-settings", - "langsmith/troubleshooting" + "langsmith/cloud", + "langsmith/deploy-to-cloud" ] }, { - "group": "Authentication & access control", + "group": "Hybrid", "pages": [ - "langsmith/self-host-basic-auth", - "langsmith/self-host-sso", - "langsmith/self-host-user-management", - "langsmith/self-host-custom-tls-certificates", - "langsmith/self-host-using-an-existing-secret" + "langsmith/hybrid", + "langsmith/deploy-hybrid" + ] + }, + { + "group": "Self-hosted", + "pages": [ + "langsmith/self-hosted", + { + "group": "Deployment guides", + "pages": [ + "langsmith/deploy-self-hosted-full-platform", + "langsmith/deploy-standalone-server" + ] + }, + { + "group": 
"Configuration", + "pages": [ + "langsmith/custom-docker", + "langsmith/egress-metrics-metadata" + ] + }, + { + "group": "(Existing LS content to organize) Self-host", + "pages": [ + "langsmith/architectural-overview", + { + "group": "Setup", + "pages": [ + "langsmith/kubernetes", + "langsmith/docker", + "langsmith/self-host-usage", + "langsmith/self-host-upgrades", + "langsmith/self-host-egress", + "langsmith/self-host-organization-charts", + "langsmith/langsmith-managed-clickhouse" + ] + }, + { + "group": "Configuration", + "pages": [ + "langsmith/self-host-scale", + "langsmith/self-host-ttl", + "langsmith/self-host-ingress", + "langsmith/self-host-mirroring-images", + "langsmith/self-host-playground-environment-settings", + "langsmith/troubleshooting" + ] + }, + { + "group": "Authentication & access control", + "pages": [ + "langsmith/self-host-basic-auth", + "langsmith/self-host-sso", + "langsmith/self-host-user-management", + "langsmith/self-host-custom-tls-certificates", + "langsmith/self-host-using-an-existing-secret" + ] + }, + { + "group": "Connect external services", + "pages": [ + "langsmith/self-host-blob-storage", + "langsmith/self-host-external-clickhouse", + "langsmith/self-host-external-postgres", + "langsmith/self-host-external-redis" + ] + }, + { + "group": "Scripts", + "pages": [ + "langsmith/script-delete-a-workspace", + "langsmith/script-delete-an-organization", + "langsmith/script-delete-traces", + "langsmith/script-generate-clickhouse-stats", + "langsmith/script-generate-query-stats", + "langsmith/script-running-pg-support-queries", + "langsmith/script-running-ch-support-queries" + ] + }, + { + "group": "Observability", + "pages": [ + "langsmith/export-backend", + "langsmith/langsmith-collector", + "langsmith/observability-stack" + ] + } + ] + } ] }, { - "group": "Connect external services", + "group": "Configure app for deployment", "pages": [ - "langsmith/self-host-blob-storage", - "langsmith/self-host-external-clickhouse", - 
"langsmith/self-host-external-postgres", - "langsmith/self-host-external-redis" + "langsmith/application-structure", + { + "group": "Setup", + "pages": [ + "langsmith/setup-app-requirements-txt", + "langsmith/setup-pyproject", + "langsmith/setup-javascript" + ] + }, + "langsmith/graph-rebuild", + "langsmith/use-remote-graph", + "langsmith/monorepo-support", + "langsmith/semantic-search", + "langsmith/configure-ttl" ] }, { - "group": "Scripts", + "group": "Authentication & access control", "pages": [ - "langsmith/script-delete-a-workspace", - "langsmith/script-delete-an-organization", - "langsmith/script-delete-traces", - "langsmith/script-generate-clickhouse-stats", - "langsmith/script-generate-query-stats", - "langsmith/script-running-pg-support-queries", - "langsmith/script-running-ch-support-queries" + "langsmith/auth", + "langsmith/custom-auth", + "langsmith/set-up-custom-auth", + "langsmith/resource-auth", + "langsmith/add-auth-server", + "langsmith/openapi-security", + "langsmith/agent-auth" ] }, { - "group": "Observability", + "group": "Server customization", "pages": [ - "langsmith/export-backend", - "langsmith/langsmith-collector", - "langsmith/observability-stack" + "langsmith/custom-lifespan", + "langsmith/custom-middleware", + "langsmith/custom-routes" ] } ] }, { - "tab": "Administration", + "tab": "Agent builder", + "pages": [ + "langsmith/agent-builder-overview" + ] + }, + { + "tab": "Reference", "pages": [ { - "group": "Setup", + "group": "Overview", + "pages": [ + "langsmith/reference-overview" + ] + }, + { + "group": "Account administration", "pages": [ "langsmith/administration-overview", "langsmith/create-account-api-key", @@ -1073,15 +1033,59 @@ "langsmith/user-management" ] }, + { + "group": "API & SDKs", + "pages": [ + { + "group": "SDKs", + "pages": [ + "langsmith/smith-python-sdk", + "langsmith/smith-js-ts-sdk", + "langsmith/langgraph-python-sdk", + "langsmith/langgraph-js-ts-sdk" + ] + }, + { + "group": "APIs", + "pages": [ + 
"langsmith/smith-api-ref", + "langsmith/server-api-ref", + "langsmith/api-ref-control-plane", + "langsmith/remote-graph" + ] + }, + "langsmith/cli", + "langsmith/env-var", + "langsmith/langgraph-cli" + ] + }, + { + "group": "Scalability & resilience", + "pages": [ + "langsmith/cloud-architecture-and-scalability", + "langsmith/scalability-and-resilience" + ] + }, + { + "group": "Data management", + "pages": [ + "langsmith/data-storage-and-privacy", + "langsmith/data-purging-compliance" + ] + }, + { + "group": "Releases & changelogs", + "pages": [ + "langsmith/langgraph-server-changelog", + "langsmith/release-versions" + ] + }, { "group": "Additional resources", "pages": [ "langsmith/faq", - "langsmith/cloud-architecture-and-scalability", "langsmith/regions-faq", - "langsmith/authentication-methods", - "langsmith/data-purging-compliance", - "langsmith/release-versions" + "langsmith/authentication-methods" ] } ] @@ -1534,248 +1538,90 @@ ] }, { - "dropdown": "LangGraph Platform", - "icon": "rocket-launch", - "description": "Platform for building and deploying AI agents", + "dropdown": "LangSmith", + "icon": "rocket", + "description": "Platform for deployment, LLM observability, and evaluation", "tabs": [ { "tab": "Get started", - "groups": [ - { - "group": "Overview", - "pages": [ - "langgraph-platform/index", - { - "group": "Components", - "pages": [ - "langgraph-platform/components", - "langgraph-platform/langgraph-server", - "langgraph-platform/data-plane", - "langgraph-platform/control-plane" - ] - }, - "langgraph-platform/application-structure" - ] - }, + "pages": [ + "langsmith/home", { "group": "Quickstarts", "pages": [ - "langgraph-platform/local-server", - "langgraph-platform/deployment-quickstart", - "langgraph-platform/quick-start-studio" - ] - }, - { - "group": "Plans and deployment", - "pages": [ - "langgraph-platform/plans" - ] - } - ] - }, - { - "tab": "Build", - "groups": [ - { - "group": "Build with LangGraph Platform", - "pages": [ - 
"langgraph-platform/why-langgraph" + "langsmith/observability-quickstart", + "langsmith/evaluation-quickstart", + "langsmith/prompt-engineering-quickstart", + "langsmith/local-server", + "langsmith/deployment-quickstart", + "langsmith/quick-start-studio" ] }, { - "group": "Data models", + "group": "App development", "pages": [ + "langsmith/why-langgraph", { - "group": "Assistants", + "group": "Data models", "pages": [ - "langgraph-platform/assistants", - "langgraph-platform/configuration-cloud", - "langgraph-platform/use-threads" + { + "group": "Assistants", + "pages": [ + "langsmith/assistants", + "langsmith/configuration-cloud", + "langsmith/use-threads" + ] + }, + { + "group": "Runs", + "pages": [ + "langsmith/background-run", + "langsmith/same-thread", + "langsmith/cron-jobs", + "langsmith/stateless-runs", + "langsmith/configurable-headers" + ] + } ] }, { - "group": "Runs", + "group": "Core capabilities", "pages": [ - "langgraph-platform/background-run", - "langgraph-platform/same-thread", - "langgraph-platform/cron-jobs", - "langgraph-platform/stateless-runs", - "langgraph-platform/configurable-headers" + "langsmith/streaming", + "langsmith/add-human-in-the-loop", + "langsmith/human-in-the-loop-time-travel", + "langsmith/server-mcp", + "langsmith/server-a2a", + "langsmith/use-webhooks", + { + "group": "Double-texting", + "pages": [ + "langsmith/double-texting", + "langsmith/interrupt-concurrent", + "langsmith/rollback-concurrent", + "langsmith/reject-concurrent", + "langsmith/enqueue-concurrent" + ] + } ] - } - ] - }, - { - "group": "Core capabilities", - "pages": [ - "langgraph-platform/streaming", - "langgraph-platform/add-human-in-the-loop", - "langgraph-platform/human-in-the-loop-time-travel", - "langgraph-platform/server-mcp", - "langgraph-platform/server-a2a", - "langgraph-platform/use-webhooks", + }, { - "group": "Double-texting", + "group": "Tutorials", "pages": [ - "langgraph-platform/double-texting", - "langgraph-platform/interrupt-concurrent", - 
"langgraph-platform/rollback-concurrent", - "langgraph-platform/reject-concurrent", - "langgraph-platform/enqueue-concurrent" + "langsmith/autogen-integration", + "langsmith/use-stream-react", + "langsmith/generative-ui-react" ] } ] }, { - "group": "LangGraph Studio", - "pages": [ - "langgraph-platform/langgraph-studio", - "langgraph-platform/use-studio", - "langgraph-platform/observability-studio", - "langgraph-platform/troubleshooting-studio" - ] - } - ] - }, - { - "tab": "Deploy", - "groups": [ - { - "group": "Overview", - "pages": [ - "langgraph-platform/deployment-options", - "langgraph-platform/cloud", - "langgraph-platform/hybrid", - "langgraph-platform/self-hosted" - ] - }, - { - "group": "Guides for deployment", - "pages": [ - "langgraph-platform/deploy-to-cloud", - "langgraph-platform/deploy-hybrid", - "langgraph-platform/deploy-self-hosted-full-platform", - "langgraph-platform/deploy-standalone-server", - "langgraph-platform/use-remote-graph" - ] - }, - { - "group": "Configure app for deployment", - "pages": [ - "langgraph-platform/setup-app-requirements-txt", - "langgraph-platform/setup-pyproject", - "langgraph-platform/setup-javascript", - "langgraph-platform/custom-docker", - "langgraph-platform/graph-rebuild", - "langgraph-platform/langgraph-cli", - "langgraph-platform/sdk", - "langgraph-platform/egress-metrics-metadata", - "langgraph-platform/monorepo-support" - ] - } - ] - }, - { - "tab": "Manage", - "groups": [ - { - "group": "Authentication & access control", - "pages": [ - "langgraph-platform/auth", - "langgraph-platform/custom-auth", - "langgraph-platform/set-up-custom-auth", - "langgraph-platform/resource-auth", - "langgraph-platform/add-auth-server", - "langgraph-platform/openapi-security", - "langgraph-platform/agent-auth" - ] - }, - { - "group": "Scalability & resilience", - "pages": [ - "langgraph-platform/scalability-and-resilience" - ] - }, - { - "group": "Server customization", - "pages": [ - "langgraph-platform/custom-lifespan", - 
"langgraph-platform/custom-middleware", - "langgraph-platform/custom-routes" - ] - }, - { - "group": "Data management", - "pages": [ - "langgraph-platform/data-storage-and-privacy", - "langgraph-platform/semantic-search", - "langgraph-platform/configure-ttl" - ] - }, - { - "group": "Tutorials", - "pages": [ - "langgraph-platform/autogen-integration", - "langgraph-platform/use-stream-react", - "langgraph-platform/generative-ui-react" - ] - } - ] - }, - { - "tab": "Reference", - "pages": [ - "langgraph-platform/reference-overview", - "langgraph-platform/server-api-ref", - "langgraph-platform/langgraph-server-changelog", - "langgraph-platform/release-versions", - "langgraph-platform/api-ref-control-plane", - "langgraph-platform/cli", - "langgraph-platform/env-var", - "langgraph-platform/python-sdk", - "langgraph-platform/js-ts-sdk", - "langgraph-platform/remote-graph", - "langgraph-platform/faq" - ] - } - ] - }, - { - "dropdown": "LangSmith", - "icon": "screwdriver-wrench", - "description": "Platform for LLM observability and evaluation", - "tabs": [ - { - "tab": "Get started", - "pages": [ - "langsmith/home", - { - "group": "Start tracing", - "pages": [ - "langsmith/observability-quickstart", - "langsmith/observability-concepts" - ] - }, - { - "group": "Evaluate your application", + "group": "Studio", "pages": [ - "langsmith/evaluation-quickstart", - "langsmith/evaluation-concepts" - ] - }, - { - "group": "Test your prompts", - "pages": [ - "langsmith/prompt-engineering-quickstart", - "langsmith/prompt-engineering-concepts" - ] - }, - { - "group": "API & SDKs", - "pages": [ - "langsmith/api-ref", - "langsmith/python-sdk", - "langsmith/js-ts-sdk" + "langsmith/langgraph-studio", + "langsmith/use-studio", + "langsmith/observability-studio", + "langsmith/troubleshooting-studio" ] }, { @@ -1913,7 +1759,8 @@ "tab": "Evaluation", "pages": [ "langsmith/evaluation", - "langsmith/evaluation-techniques", + "langsmith/evaluation-concepts", + "langsmith/evaluation-approaches", { 
"group": "Datasets", "pages": [ @@ -1928,19 +1775,14 @@ ] }, { - "group": "Evaluations", + "group": "Set up evaluations", "pages": [ { "group": "Run an evaluation", "pages": [ "langsmith/evaluate-llm-application", "langsmith/run-evaluation-from-prompt-playground", - { - "group": "Prebuilt evaluators", - "pages": [ - "langsmith/prebuilt-evaluators" - ] - } + "langsmith/prebuilt-evaluators" ] }, { @@ -1948,6 +1790,7 @@ "pages": [ "langsmith/code-evaluator", "langsmith/llm-as-judge", + "langsmith/composite-evaluators", "langsmith/summary", "langsmith/evaluate-pairwise" ] @@ -1986,6 +1829,16 @@ "langsmith/create-few-shot-evaluators", "langsmith/index-datasets-for-dynamic-few-shot-example-selection" ] + }, + { + "group": "Tutorials", + "pages": [ + "langsmith/evaluate-chatbot-tutorial", + "langsmith/evaluate-rag-tutorial", + "langsmith/test-react-agent-pytest", + "langsmith/evaluate-complex-agent", + "langsmith/run-backtests-new-agent" + ] } ] }, @@ -2008,16 +1861,6 @@ "langsmith/audit-evaluator-scores" ] }, - { - "group": "Tutorials", - "pages": [ - "langsmith/evaluate-chatbot-tutorial", - "langsmith/evaluate-rag-tutorial", - "langsmith/test-react-agent-pytest", - "langsmith/evaluate-complex-agent", - "langsmith/run-backtests-new-agent" - ] - }, { "group": "Common data types", "pages": [ @@ -2032,6 +1875,7 @@ "tab": "Prompt engineering", "pages": [ "langsmith/prompt-engineering", + "langsmith/prompt-engineering-concepts", { "group": "Create and update prompts", "pages": [ @@ -2062,78 +1906,183 @@ ] }, { - "tab": "Self-hosting", - "pages": [ - "langsmith/architectural-overview", + "tab": "Deployment", + "groups": [ { - "group": "Setup", + "group": "Overview", "pages": [ - "langsmith/kubernetes", - "langsmith/docker", - "langsmith/self-host-usage", - "langsmith/self-host-upgrades", - "langsmith/self-host-egress", - "langsmith/self-host-organization-charts", - "langsmith/langsmith-managed-clickhouse" + "langsmith/deployment-options", + { + "group": "Platform components", 
+ "pages": [ + "langsmith/components", + "langsmith/langgraph-server", + "langsmith/data-plane", + "langsmith/control-plane" + ] + } ] }, { - "group": "Configuration", + "group": "Cloud", "pages": [ - "langsmith/self-host-scale", - "langsmith/self-host-ttl", - "langsmith/self-host-ingress", - "langsmith/self-host-mirroring-images", - "langsmith/self-host-playground-environment-settings", - "langsmith/troubleshooting" + "langsmith/cloud", + "langsmith/deploy-to-cloud" ] }, { - "group": "Authentication & access control", + "group": "Hybrid", "pages": [ - "langsmith/self-host-basic-auth", - "langsmith/self-host-sso", - "langsmith/self-host-user-management", - "langsmith/self-host-custom-tls-certificates", - "langsmith/self-host-using-an-existing-secret" + "langsmith/hybrid", + "langsmith/deploy-hybrid" ] }, { - "group": "Connect external services", + "group": "Self-hosted", "pages": [ - "langsmith/self-host-blob-storage", - "langsmith/self-host-external-clickhouse", - "langsmith/self-host-external-postgres", - "langsmith/self-host-external-redis" + "langsmith/self-hosted", + { + "group": "Deployment guides", + "pages": [ + "langsmith/deploy-self-hosted-full-platform", + "langsmith/deploy-standalone-server" + ] + }, + { + "group": "Configuration", + "pages": [ + "langsmith/custom-docker", + "langsmith/egress-metrics-metadata" + ] + }, + { + "group": "(Existing LS content to organize) Self-host", + "pages": [ + "langsmith/architectural-overview", + { + "group": "Setup", + "pages": [ + "langsmith/kubernetes", + "langsmith/docker", + "langsmith/self-host-usage", + "langsmith/self-host-upgrades", + "langsmith/self-host-egress", + "langsmith/self-host-organization-charts", + "langsmith/langsmith-managed-clickhouse" + ] + }, + { + "group": "Configuration", + "pages": [ + "langsmith/self-host-scale", + "langsmith/self-host-ttl", + "langsmith/self-host-ingress", + "langsmith/self-host-mirroring-images", + "langsmith/self-host-playground-environment-settings", + 
"langsmith/troubleshooting" + ] + }, + { + "group": "Authentication & access control", + "pages": [ + "langsmith/self-host-basic-auth", + "langsmith/self-host-sso", + "langsmith/self-host-user-management", + "langsmith/self-host-custom-tls-certificates", + "langsmith/self-host-using-an-existing-secret" + ] + }, + { + "group": "Connect external services", + "pages": [ + "langsmith/self-host-blob-storage", + "langsmith/self-host-external-clickhouse", + "langsmith/self-host-external-postgres", + "langsmith/self-host-external-redis" + ] + }, + { + "group": "Scripts", + "pages": [ + "langsmith/script-delete-a-workspace", + "langsmith/script-delete-an-organization", + "langsmith/script-delete-traces", + "langsmith/script-generate-clickhouse-stats", + "langsmith/script-generate-query-stats", + "langsmith/script-running-pg-support-queries", + "langsmith/script-running-ch-support-queries" + ] + }, + { + "group": "Observability", + "pages": [ + "langsmith/export-backend", + "langsmith/langsmith-collector", + "langsmith/observability-stack" + ] + } + ] + } ] }, { - "group": "Scripts", + "group": "Configure app for deployment", "pages": [ - "langsmith/script-delete-a-workspace", - "langsmith/script-delete-an-organization", - "langsmith/script-delete-traces", - "langsmith/script-generate-clickhouse-stats", - "langsmith/script-generate-query-stats", - "langsmith/script-running-pg-support-queries", - "langsmith/script-running-ch-support-queries" + "langsmith/application-structure", + { + "group": "Setup", + "pages": [ + "langsmith/setup-app-requirements-txt", + "langsmith/setup-pyproject", + "langsmith/setup-javascript" + ] + }, + "langsmith/graph-rebuild", + "langsmith/use-remote-graph", + "langsmith/monorepo-support", + "langsmith/semantic-search", + "langsmith/configure-ttl" ] }, { - "group": "Observability", + "group": "Authentication & access control", + "pages": [ + "langsmith/auth", + "langsmith/custom-auth", + "langsmith/set-up-custom-auth", + "langsmith/resource-auth", + 
"langsmith/add-auth-server", + "langsmith/openapi-security", + "langsmith/agent-auth" + ] + }, + { + "group": "Server customization", "pages": [ - "langsmith/export-backend", - "langsmith/langsmith-collector", - "langsmith/observability-stack" + "langsmith/custom-lifespan", + "langsmith/custom-middleware", + "langsmith/custom-routes" ] } ] }, { - "tab": "Administration", + "tab": "Agent builder", + "pages": [ + "langsmith/agent-builder-overview" + ] + }, + { + "tab": "Reference", "pages": [ { - "group": "Setup", + "group": "Overview", + "pages": [ + "langsmith/reference-overview" + ] + }, + { + "group": "Account administration", "pages": [ "langsmith/administration-overview", "langsmith/create-account-api-key", @@ -2144,15 +2093,59 @@ "langsmith/user-management" ] }, + { + "group": "API & SDKs", + "pages": [ + { + "group": "SDKs", + "pages": [ + "langsmith/smith-python-sdk", + "langsmith/smith-js-ts-sdk", + "langsmith/langgraph-python-sdk", + "langsmith/langgraph-js-ts-sdk" + ] + }, + { + "group": "APIs", + "pages": [ + "langsmith/smith-api-ref", + "langsmith/server-api-ref", + "langsmith/api-ref-control-plane", + "langsmith/remote-graph" + ] + }, + "langsmith/cli", + "langsmith/env-var", + "langsmith/langgraph-cli" + ] + }, + { + "group": "Scalability & resilience", + "pages": [ + "langsmith/cloud-architecture-and-scalability", + "langsmith/scalability-and-resilience" + ] + }, + { + "group": "Data management", + "pages": [ + "langsmith/data-storage-and-privacy", + "langsmith/data-purging-compliance" + ] + }, + { + "group": "Releases & changelogs", + "pages": [ + "langsmith/langgraph-server-changelog", + "langsmith/release-versions" + ] + }, { "group": "Additional resources", "pages": [ "langsmith/faq", - "langsmith/cloud-architecture-and-scalability", "langsmith/regions-faq", - "langsmith/authentication-methods", - "langsmith/data-purging-compliance", - "langsmith/release-versions" + "langsmith/authentication-methods" ] } ] diff --git 
a/src/langgraph-platform/deployment-options.mdx b/src/langgraph-platform/deployment-options.mdx deleted file mode 100644 index 70b19fa87..000000000 --- a/src/langgraph-platform/deployment-options.mdx +++ /dev/null @@ -1,84 +0,0 @@ ---- -title: Deployment options -sidebarTitle: Deployment options ---- - - -You can [run LangGraph Platform locally for free](/langgraph-platform/local-server) for testing and development. - - -There are 3 main options for deploying with [LangGraph Platform](/langgraph-platform/index): - -1. [Cloud](#cloud) -2. [Hybrid](#hybrid) -3. [Self-Hosted](#self-hosted) - -A quick comparison: - -| | **Cloud** | **Hybrid** | **Self-Hosted** | -|----------------------|----------------|-------------|-----------------| -| **Description** | All components run in LangChain's cloud | Control plane runs in LangChain's cloud; data plane in your cloud | All components run in your cloud | -| **CI/CD** | Managed internally by platform | Managed externally by you | Managed externally by you | -| **Data/compute residency** | LangChain's cloud | Your cloud | Your cloud | -| **LangSmith compatibility** | Trace to LangSmith SaaS | Trace to LangSmith SaaS | Trace to Self-Hosted LangSmith | -| **[Pricing](https://www.langchain.com/pricing-langgraph-platform)** | Plus | Enterprise | Enterprise | - -## Cloud - -The [Cloud](/langgraph-platform/cloud) deployment option is a fully managed model for deployment where the [control plane](/langgraph-platform/control-plane) and [data plane](/langgraph-platform/data-plane) run in our cloud. This option provides a simple way to deploy and manage your LangGraph Servers. - -Connect your GitHub repositories to the platform and deploy your LangGraph Servers from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui). The build process (i.e. CI/CD) is managed internally by the platform. 
- -For more information, please see: - -* [Cloud Conceptual Guide](/langgraph-platform/cloud) -* [How to deploy to Cloud](/langgraph-platform/deploy-to-cloud) - -## Hybrid - - -**Important** -The Hybrid deployment option requires an [Enterprise](/langgraph-platform/plans) plan. - - -The [Hybrid](/langgraph-platform/hybrid) deployment option lets you manage the [data plane](/langgraph-platform/data-plane) in your own cloud, while we handle the [control plane](/langgraph-platform/control-plane) in ours. - -Build a Docker image using the [LangGraph CLI](/langgraph-platform/langgraph-cli) and deploy your LangGraph Server from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui). - -Supported Compute Platforms: [Kubernetes](https://kubernetes.io/) - -For more information, please see: - -* [Hybrid Conceptual Guide](/langgraph-platform/hybrid) -* [How to deploy the Hybrid](/langgraph-platform/deploy-hybrid) - -## Self-Hosted - - -**Important** -The Self-Hosted deployment option requires an [Enterprise](/langgraph-platform/plans) plan. - - -The Self-Hosted deployment option allows you to run all components entirely within your own cloud environment. - -You can deploy either: - -1. **[Full Platform](/langgraph-platform/self-hosted#full-platform)**: Deploy both control plane and data plane with full UI/API management capabilities -2. **[Standalone Server](/langgraph-platform/self-hosted#standalone-server)**: Deploy standalone instances of a LangGraph Server without the control plane UI - -With the full platform option, build a Docker image using the [LangGraph CLI](/langgraph-platform/langgraph-cli) and deploy your LangGraph Server from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui) or using the container deployment tooling of your choice. 
- -Supported Compute Platforms: [Kubernetes](https://kubernetes.io/) (for Control Plane), any compute platform (for Standalone Server Only) - -For more information, please see: - -* [Self-Hosted Conceptual Guide](/langgraph-platform/self-hosted) -* [How to deploy the Self-Hosted Full Platform](/langgraph-platform/deploy-self-hosted-full-platform) -* [How to deploy the Self-Hosted Standalone Server](/langgraph-platform/deploy-standalone-server) - -## Related - -For more information, please see: - -* [LangGraph Platform plans](/langgraph-platform/plans) -* [LangGraph Platform pricing](https://www.langchain.com/langgraph-platform-pricing) diff --git a/src/langgraph-platform/faq.mdx b/src/langgraph-platform/faq.mdx deleted file mode 100644 index caf176c1a..000000000 --- a/src/langgraph-platform/faq.mdx +++ /dev/null @@ -1,68 +0,0 @@ ---- -title: FAQ -sidebarTitle: FAQ ---- - -Common questions and their answers! - -## Do I need to use LangChain to use LangGraph? What’s the difference? - -No. LangGraph is an orchestration framework for complex agentic systems and is more low-level and controllable than LangChain agents. LangChain provides a standard interface to interact with models and other components, useful for straight-forward chains and retrieval flows. - -## How is LangGraph different from other agent frameworks? - -Other agentic frameworks can work for simple, generic tasks but fall short for complex tasks bespoke to a company’s needs. LangGraph provides a more expressive framework to handle companies’ unique tasks without restricting users to a single black-box cognitive architecture. - -## Does LangGraph impact the performance of my app? - -LangGraph will not add any overhead to your code and is specifically designed with streaming workflows in mind. - -## Is LangGraph open source? Is it free? - -Yes. LangGraph is an MIT-licensed open-source library and is free to use. - -## How are LangGraph and LangGraph Platform different? 
- -LangGraph is a stateful, orchestration framework that brings added control to agent workflows. LangGraph Platform is a service for deploying and scaling LangGraph applications, with an opinionated API for building agent UXs, plus an integrated developer studio. - -| Features | LangGraph (open source) | LangGraph Platform | -|---------------------|-----------------------------------------------------------|--------------------------------------------------------------------------------------------------------| -| Description | Stateful orchestration framework for agentic applications | Scalable infrastructure for deploying LangGraph applications | -| SDKs | Python and JavaScript | Python and JavaScript | -| HTTP APIs | None | Yes - useful for retrieving & updating state or long-term memory, or creating a configurable assistant | -| Streaming | Basic | Dedicated mode for token-by-token messages | -| Checkpointer | Community contributed | Supported out-of-the-box | -| Persistence Layer | Self-managed | Managed Postgres with efficient storage | -| Deployment | Self-managed | • Cloud
• Free self-hosted
• Enterprise (paid self-hosted) | -| Scalability | Self-managed | Auto-scaling of task queues and servers | -| Fault-tolerance | Self-managed | Automated retries | -| Concurrency Control | Simple threading | Supports double-texting | -| Scheduling | None | Cron scheduling | -| Monitoring | None | Integrated with LangSmith for observability | -| IDE integration | LangGraph Studio | LangGraph Studio | - -## Is LangGraph Platform open source? - -No. LangGraph Platform is proprietary software. - -There is a free, self-hosted version of LangGraph Platform with access to basic features. The Cloud deployment option and the Self-Hosted deployment options are paid services. [Contact our sales team](https://www.langchain.com/contact-sales) to learn more. - -For more information, see our [LangGraph Platform pricing page](https://www.langchain.com/pricing-langgraph-platform). - -## Does LangGraph work with LLMs that don't support tool calling? - -Yes! You can use LangGraph with any LLMs. The main reason we use LLMs that support tool calling is that this is often the most convenient way to have the LLM make its decision about what to do. If your LLM does not support tool calling, you can still use it - you just need to write a bit of logic to convert the raw LLM string response to a decision about what to do. - -## Does LangGraph work with OSS LLMs? - -Yes! LangGraph is totally ambivalent to what LLMs are used under the hood. The main reason we use closed LLMs in most of the tutorials is that they seamlessly support tool calling, while OSS LLMs often don't. But tool calling is not necessary (see [this section](#does-langgraph-work-with-llms-that-dont-support-tool-calling)) so you can totally use LangGraph with OSS LLMs. - -## Can I use LangGraph Studio without logging in to LangSmith - -Yes! You can use the [development version of LangGraph Server](/langgraph-platform/local-server) to run the backend locally. 
-This will connect to the studio frontend hosted as part of LangSmith. -If you set an environment variable of `LANGSMITH_TRACING=false`, then no traces will be sent to LangSmith. - -## What does "nodes executed" mean for LangGraph Platform usage? - -**Nodes Executed** is the aggregate number of nodes in a LangGraph application that are called and completed successfully during an invocation of the application. If a node in the graph is not called during execution or ends in an error state, these nodes will not be counted. If a node is called and completes successfully multiple times, each occurrence will be counted. diff --git a/src/langgraph-platform/plans.mdx b/src/langgraph-platform/plans.mdx deleted file mode 100644 index 10e6f93c1..000000000 --- a/src/langgraph-platform/plans.mdx +++ /dev/null @@ -1,5 +0,0 @@ ---- -title: LangGraph Platform plans -sidebarTitle: Plans and pricing -url: "https://www.langchain.com/pricing" ---- diff --git a/src/langgraph-platform/quick-start-studio.mdx b/src/langgraph-platform/quick-start-studio.mdx deleted file mode 100644 index 6fbf60a73..000000000 --- a/src/langgraph-platform/quick-start-studio.mdx +++ /dev/null @@ -1,134 +0,0 @@ ---- -title: Use LangGraph Studio -sidebarTitle: Use LangGraph Studio ---- - -[LangGraph Studio](/langgraph-platform/langgraph-studio) supports connecting to two types of graphs: - -- Graphs deployed on [LangGraph Platform](#langgraph-platform). -- Graphs running locally via the [LangGraph Server](#local-development-server). - -## LangGraph Platform - -LangGraph Studio is accessed from the LangSmith UI, within the **LangGraph Platform Deployments** tab. - -For applications that are [deployed](/langgraph-platform/deployment-quickstart) on LangGraph Platform, you can access Studio as part of that deployment. To do so, navigate to the deployment in LangGraph Platform within the LangSmith UI and click the **LangGraph Studio** button. 
- -This will load the Studio UI connected to your live deployment, allowing you to create, read, and update the [threads](/oss/langgraph/persistence#threads), [assistants](/langgraph-platform/assistants), and [memory](/oss/concepts/memory) in that deployment. - -## Local development server - -To test your application locally using LangGraph Studio, follow the [local application quickstart](/langgraph-platform/local-server) first. - - -**LangSmith Tracing** -For local development, if you don't want data traced to LangSmith, set `LANGSMITH_TRACING=false` in your application's `.env` file. With tracing disabled, no data leaves your local server. - - -Next, install the [LangGraph CLI](/langgraph-platform/langgraph-cli): - - -```bash pip -pip install -U "langgraph-cli[inmem]" -langgraph dev -``` - -```bash uv -uv add langgraph-cli[inmem] -langgraph dev -``` - -```bash npm -npx @langchain/langgraph-cli dev -``` - - - -**Browser Compatibility** -Safari blocks `localhost` connections to Studio. To work around this, run the above command with `--tunnel` to access Studio via a secure tunnel. - - -This will start the LangGraph Server locally, running in-memory. The server will run in watch mode, listening for and automatically restarting on code changes. Read this [reference](/langgraph-platform/cli#dev) to learn about all the options for starting the API server. - -You will see the following logs: - -``` -> Ready! -> -> - API: [http://localhost:2024](http://localhost:2024/) -> -> - Docs: http://localhost:2024/docs -> -> - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 -``` - -Once running, you will automatically be directed to LangGraph Studio. - -For an already running server, access Studio by either: - -1. Directly navigate to the following URL: `https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024`. -2. 
Within LangSmith, navigate to the **LangGraph Platform Deployments** tab, click the **LangGraph Studio** button, enter `http://127.0.0.1:2024` and click **Connect**. - -If running your server at a different host or port, simply update the `baseUrl` to match. - -### (Optional) Attach a debugger - -For step-by-step debugging with breakpoints and variable inspection: - - -```bash pip -# Install debugpy package -pip install debugpy -# Start server with debugging enabled -langgraph dev --debug-port 5678 -``` - -```bash uv -# Install debugpy package -uv add debugpy -# Start server with debugging enabled -langgraph dev --debug-port 5678 -``` - - -Then attach your preferred debugger: - - - - Add this configuration to `launch.json`: - - ```json - { - "name": "Attach to LangGraph", - "type": "debugpy", - "request": "attach", - "connect": { - "host": "0.0.0.0", - "port": 5678 - } - } - ``` - - - 1. Go to Run → Edit Configurations - 2. Click + and select "Python Debug Server" - 3. Set IDE host name: `localhost` - 4. Set port: `5678` (or the port number you chose in the previous step) - 5. Click "OK" and start debugging - - - -## Troubleshooting - -For issues getting started, refer to the [troubleshooting guide](/langgraph-platform/troubleshooting-studio). 
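The `baseUrl` handoff described above is just a query parameter on the Studio URL, so building the link for a server on a non-default host or port is mechanical. A minimal sketch — the `studio_url` helper is ours for illustration, not part of any SDK:

```python
from urllib.parse import urlencode

def studio_url(base_url: str, studio_host: str = "https://smith.langchain.com") -> str:
    """Build the Studio web UI link for a LangGraph Server running at `base_url`."""
    # Keep ':' and '/' unescaped so the link matches the form shown in the server logs.
    return f"{studio_host}/studio/?{urlencode({'baseUrl': base_url}, safe=':/')}"

print(studio_url("http://localhost:8123"))
# https://smith.langchain.com/studio/?baseUrl=http://localhost:8123
```

The same helper covers the default case (`studio_url("http://127.0.0.1:2024")` reproduces the URL from the logs above) and a self-hosted Studio frontend via the `studio_host` argument.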
- -## Next steps - -See the following guides for more information on how to use Studio: - -* [Run application](/langgraph-platform/use-studio#run-application) -* [Manage assistants](/langgraph-platform/use-studio#manage-assistants) -* [Manage threads](/langgraph-platform/use-studio#manage-threads) -* [Iterate on prompts](/langgraph-platform/observability-studio) -* [Debug LangSmith traces](/langgraph-platform/observability-studio#debug-langsmith-traces) -* [Add node to dataset](/langgraph-platform/observability-studio#add-node-to-dataset) diff --git a/src/langgraph-platform/release-versions.mdx b/src/langgraph-platform/release-versions.mdx deleted file mode 100644 index 10aa5965e..000000000 --- a/src/langgraph-platform/release-versions.mdx +++ /dev/null @@ -1,11 +0,0 @@ ---- -title: Release versions ---- - -import ReleaseVersionPolicy from "/snippets/release-version-policy.mdx"; - - - -## Current version support - -To check the current supported versions and their support levels, refer to the [LangGraph Server Changelog](/langgraph-platform/langgraph-server-changelog) for the latest release information. diff --git a/src/langgraph-platform/self-hosted.mdx b/src/langgraph-platform/self-hosted.mdx deleted file mode 100644 index e63895ab8..000000000 --- a/src/langgraph-platform/self-hosted.mdx +++ /dev/null @@ -1,77 +0,0 @@ ---- -title: Self-hosted ---- -The Self-Hosted deployment option allows you to run all components entirely within your own cloud environment. You can choose between two deployment models: - -1. **[Full Platform](#full-platform)**: Deploy both control plane and data plane with full UI/API management capabilities -2. **[Standalone Server](#standalone-server)**: Deploy standalone instances of a LangGraph Server without the control plane UI - - -**Important** -The Self-Hosted deployment options require an [Enterprise](/langgraph-platform/plans) plan. 
- - -## Full Platform - -### Overview - -The Full Platform deployment model is a fully self-hosted solution where you manage both the [control plane](/langgraph-platform/control-plane) and [data plane](/langgraph-platform/data-plane) in your cloud. This option gives you full control and responsibility of the control plane and data plane infrastructure. - -| | [Control plane](/langgraph-platform/control-plane) | [Data plane](/langgraph-platform/data-plane) | -|-------------------|-------------------|------------| -| **What is it?** | | | -| **Where is it hosted?** | Your cloud | Your cloud | -| **Who provisions and manages it?** | You | You | - -### Requirements - -* You use `langgraph-cli` and/or [LangGraph Studio](/langgraph-platform/langgraph-studio) app to test graph locally. -* You use `langgraph build` command to build image. -* You have a Self-Hosted LangSmith instance deployed. -* You are using [Ingress](/langsmith/self-host-ingress) for your LangSmith instance. All agents will be deployed as Kubernetes services behind this ingress. - -### Architecture - -![Self-Hosted Full Platform Architecture](/langgraph-platform/images/self-hosted-full-platform-architecture.png) - -### Compute Platforms - -* **Kubernetes**: The Full Platform deployment model supports deploying control plane and data plane infrastructure to any Kubernetes cluster. - - -If you would like to enable this on your LangSmith instance, please follow the [Self-Hosted Full Platform deployment guide](/langgraph-platform/deploy-self-hosted-full-platform). - - -## Standalone Server - -### Overview - -The Standalone Server Only deployment model is the least restrictive option for deployment. There is no [control plane](/langgraph-platform/control-plane). A simplified version of the [Data plane](/langgraph-platform/data-plane) infrastructure is managed by you. 
- -| | [Control plane](/langgraph-platform/control-plane) | [Data plane](/langgraph-platform/data-plane) | -|-------------------|-------------------|------------| -| **What is it?** | n/a | | -| **Where is it hosted?** | n/a | Your cloud | -| **Who provisions and manages it?** | n/a | You | - - -LangGraph Platform should not be deployed in serverless environments. Scale to zero may cause task loss and scaling up will not work reliably. - - -### Architecture - -![Standalone Container](/langgraph-platform/images/langgraph-platform-deployment-architecture.png) - -### Compute Platforms - -#### Kubernetes - -The Standalone Server deployment model supports deploying data plane infrastructure to a Kubernetes cluster. - -#### Docker - -The Standalone Server deployment model supports deploying data plane infrastructure to any Docker-supported compute platform. - - -To deploy a [LangGraph Server](/langgraph-platform/langgraph-server), follow the how-to guide for [how to deploy the Standalone Server](/langgraph-platform/deploy-standalone-server). - diff --git a/src/langgraph-platform/why-langgraph.mdx b/src/langgraph-platform/why-langgraph.mdx deleted file mode 100644 index 63339a549..000000000 --- a/src/langgraph-platform/why-langgraph.mdx +++ /dev/null @@ -1,10 +0,0 @@ ---- -title: Overview -sidebarTitle: Overview ---- - -**LangGraph Platform** extends the [LangGraph](/oss/langgraph/overview) framework for building stateful, multi-agent applications as graphs, which allows you to define control flow, manage persistence, and coordinate interactions across components or agents. LangGraph Platform is framework-agnostic, which means you can deploy and operate agents built with LangGraph or [another framework](/langgraph-platform/autogen-integration). - -To get acquainted with LangGraph's key concepts and features, complete the following [LangGraph quickstart](/oss/langgraph/quickstart). 
- -While the OSS framework introduces the core abstractions and execution model, LangGraph Platform adds capabilities including managed infrastructure, [deployment models](/langgraph-platform/deployment-options), [assistants](/langgraph-platform/configuration-cloud), and [double-texting](/langgraph-platform/double-texting) support. These platform-level features support the full lifecycle of LangGraph applications, from development to production at scale. diff --git a/src/langgraph-platform/add-auth-server.mdx b/src/langsmith/add-auth-server.mdx similarity index 100% rename from src/langgraph-platform/add-auth-server.mdx rename to src/langsmith/add-auth-server.mdx diff --git a/src/langgraph-platform/add-human-in-the-loop.mdx b/src/langsmith/add-human-in-the-loop.mdx similarity index 100% rename from src/langgraph-platform/add-human-in-the-loop.mdx rename to src/langsmith/add-human-in-the-loop.mdx diff --git a/src/langgraph-platform/agent-auth.mdx b/src/langsmith/agent-auth.mdx similarity index 97% rename from src/langgraph-platform/agent-auth.mdx rename to src/langsmith/agent-auth.mdx index 32fdda40a..3b87951fe 100644 --- a/src/langgraph-platform/agent-auth.mdx +++ b/src/langsmith/agent-auth.mdx @@ -86,7 +86,7 @@ auth_result = await client.authenticate( During execution, if authentication is required, the SDK will throw an [interrupt](https://langchain-ai.github.io/langgraph/how-tos/human_in_the_loop/add-human-in-the-loop/#pause-using-interrupt). The agent execution pauses and presents the OAuth URL to the user: -![LangGraph Studio interrupt showing OAuth URL](/images/langgraph-auth-interrupt.png) +![Studio interrupt showing OAuth URL](/images/langgraph-auth-interrupt.png) After the user completes OAuth authentication and we receive the callback from the provider, they will see the auth success page. 
diff --git a/src/langsmith/agent-builder-overview.mdx b/src/langsmith/agent-builder-overview.mdx new file mode 100644 index 000000000..1e53ee46c --- /dev/null +++ b/src/langsmith/agent-builder-overview.mdx @@ -0,0 +1,5 @@ +--- +title: Overview +--- + +Placeholder page for docs design diff --git a/src/langgraph-platform/api-ref-control-plane.mdx b/src/langsmith/api-ref-control-plane.mdx similarity index 100% rename from src/langgraph-platform/api-ref-control-plane.mdx rename to src/langsmith/api-ref-control-plane.mdx diff --git a/src/langgraph-platform/application-structure.mdx b/src/langsmith/application-structure.mdx similarity index 100% rename from src/langgraph-platform/application-structure.mdx rename to src/langsmith/application-structure.mdx diff --git a/src/langgraph-platform/assistants.mdx b/src/langsmith/assistants.mdx similarity index 100% rename from src/langgraph-platform/assistants.mdx rename to src/langsmith/assistants.mdx diff --git a/src/langgraph-platform/auth.mdx b/src/langsmith/auth.mdx similarity index 100% rename from src/langgraph-platform/auth.mdx rename to src/langsmith/auth.mdx diff --git a/src/langgraph-platform/autogen-integration.mdx b/src/langsmith/autogen-integration.mdx similarity index 100% rename from src/langgraph-platform/autogen-integration.mdx rename to src/langsmith/autogen-integration.mdx diff --git a/src/langgraph-platform/background-run.mdx b/src/langsmith/background-run.mdx similarity index 100% rename from src/langgraph-platform/background-run.mdx rename to src/langsmith/background-run.mdx diff --git a/src/langgraph-platform/cli.mdx b/src/langsmith/cli.mdx similarity index 99% rename from src/langgraph-platform/cli.mdx rename to src/langsmith/cli.mdx index 4eabf7a39..76f76cf6c 100644 --- a/src/langgraph-platform/cli.mdx +++ b/src/langsmith/cli.mdx @@ -366,7 +366,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( | `--debug-port INTEGER` | | Port for debugger to listen on | | 
`--wait-for-client` | `False` | Wait for a debugger client to connect to the debug port before starting the server | | `--no-browser` | | Skip automatically opening the browser when the server starts | - | `--studio-url TEXT` | | URL of the LangGraph Studio instance to connect to. Defaults to https://smith.langchain.com | + | `--studio-url TEXT` | | URL of the Studio instance to connect to. Defaults to https://smith.langchain.com | | `--allow-blocking` | `False` | Do not raise errors for synchronous I/O blocking operations in your code (added in `0.2.6`) | | `--tunnel` | `False` | Expose the local server via a public tunnel (Cloudflare) for remote frontend access. This avoids issues with browsers like Safari or networks blocking localhost connections | | `--help` | | Display command documentation | @@ -392,7 +392,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( | `--debug-port INTEGER` | | Port for debugger to listen on | | `--wait-for-client` | `False` | Wait for a debugger client to connect to the debug port before starting the server | | `--no-browser` | | Skip automatically opening the browser when the server starts | - | `--studio-url TEXT` | | URL of the LangGraph Studio instance to connect to. Defaults to https://smith.langchain.com | + | `--studio-url TEXT` | | URL of the Studio instance to connect to. Defaults to https://smith.langchain.com | | `--allow-blocking` | `False` | Do not raise errors for synchronous I/O blocking operations in your code | | `--tunnel` | `False` | Expose the local server via a public tunnel (Cloudflare) for remote frontend access. 
This avoids issues with browsers or networks blocking localhost connections | | `--help` | | Display command documentation | diff --git a/src/langgraph-platform/cloud.mdx b/src/langsmith/cloud.mdx similarity index 57% rename from src/langgraph-platform/cloud.mdx rename to src/langsmith/cloud.mdx index c35a59fa0..df566723a 100644 --- a/src/langgraph-platform/cloud.mdx +++ b/src/langsmith/cloud.mdx @@ -1,12 +1,13 @@ --- title: Cloud +sidebarTitle: Overview --- To deploy a [LangGraph Server](/langgraph-platform/langgraph-server), follow the how-to guide for [how to deploy to Cloud](/langgraph-platform/deploy-to-cloud). -## Overview +The Cloud deployment option is a fully managed deployment model in which the [control plane](/langgraph-platform/control-plane) and [data plane](/langgraph-platform/data-plane) run in our cloud. This option provides a simple way to deploy and manage your LangGraph Servers. -The Cloud deployment option is a fully managed model for deployment where we manage the [control plane](/langgraph-platform/control-plane) and [data plane](/langgraph-platform/data-plane) in our cloud. +Connect your GitHub repositories to the platform and deploy your LangGraph Servers from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui). The build process (i.e., CI/CD) is managed internally by the platform.
| | [Control plane](/langgraph-platform/control-plane) | [Data plane](/langgraph-platform/data-plane) | |-------------------|---------------------------------------------------|-----------------------------------------------| diff --git a/src/langgraph-platform/components.mdx b/src/langsmith/components.mdx similarity index 84% rename from src/langgraph-platform/components.mdx rename to src/langsmith/components.mdx index 8dc81862a..9facaca5b 100644 --- a/src/langgraph-platform/components.mdx +++ b/src/langsmith/components.mdx @@ -1,13 +1,13 @@ --- -title: LangGraph Platform components -sidebarTitle: Overview +title: LangSmith components +sidebarTitle: Platform components --- LangGraph Platform consists of several key components that work together to provide a complete solution for deploying and managing agentic applications: * [LangGraph Server](/langgraph-platform/langgraph-server): The server defines an opinionated API and architecture that incorporates best practices for deploying agentic applications, allowing you to focus on building your agent logic rather than developing server infrastructure. * [LangGraph CLI](/langgraph-platform/langgraph-cli): LangGraph CLI is a command-line interface that helps to interact with a local LangGraph. -* [LangGraph Studio](/langgraph-platform/langgraph-studio): LangGraph Studio is a specialized IDE that can connect to a LangGraph Server to enable visualization, interaction, and debugging of the application locally. +* [Studio](/langgraph-platform/langgraph-studio): Studio is a specialized IDE that can connect to a LangGraph Server to enable visualization, interaction, and debugging of the application locally. * [Python/JS SDK](/langgraph-platform/sdk): The Python/JS SDK provides a programmatic way to interact with deployed LangGraph Applications. * [Remote Graph](/langgraph-platform/use-remote-graph): A RemoteGraph allows you to interact with any deployed LangGraph application as though it were running locally. 
* [LangGraph Control Plane](/langgraph-platform/control-plane): The LangGraph Control Plane refers to the Control Plane UI where users create and update LangGraph Servers and the Control Plane APIs that support the UI experience. diff --git a/src/langgraph-platform/configurable-headers.mdx b/src/langsmith/configurable-headers.mdx similarity index 100% rename from src/langgraph-platform/configurable-headers.mdx rename to src/langsmith/configurable-headers.mdx diff --git a/src/langgraph-platform/configuration-cloud.mdx b/src/langsmith/configuration-cloud.mdx similarity index 92% rename from src/langgraph-platform/configuration-cloud.mdx rename to src/langsmith/configuration-cloud.mdx index 0a8fcaaf1..040aa38b3 100644 --- a/src/langgraph-platform/configuration-cloud.mdx +++ b/src/langsmith/configuration-cloud.mdx @@ -117,7 +117,7 @@ Inside your deployment, select the "Assistants" tab. This will load a table of a To create a new assistant, select the "+ New assistant" button. This will open a form where you can specify the graph this assistant is for, as well as provide a name, description, and the desired configuration for the assistant based on the configuration schema for that graph. -To confirm, click "Create assistant". This will take you to [LangGraph Studio](/langgraph-platform/langgraph-studio) where you can test the assistant. If you go back to the "Assistants" tab in the deployment, you will see the newly created assistant in the table. +To confirm, click "Create assistant". This will take you to [Studio](/langgraph-platform/langgraph-studio) where you can test the assistant. If you go back to the "Assistants" tab in the deployment, you will see the newly created assistant in the table. ## Use an assistant @@ -225,7 +225,7 @@ Receiving event of type: updates ### LangGraph Platform UI -Inside your deployment, select the "Assistants" tab. For the assistant you would like to use, click the "Studio" button. 
This will open LangGraph Studio with the selected assistant. When you submit an input (either in Graph or Chat mode), the selected assistant and its configuration will be used. +Inside your deployment, select the "Assistants" tab. For the assistant you would like to use, click the "Studio" button. This will open Studio with the selected assistant. When you submit an input (either in Graph or Chat mode), the selected assistant and its configuration will be used. ## Create a new version for your assistant @@ -287,7 +287,7 @@ Inside your deployment, select the "Assistants" tab. This will load a table of a To edit an existing assistant, select the "Edit" button for the specified assistant. This will open a form where you can edit the assistant's name, description, and configuration. -Additionally, if using LangGraph Studio, you can edit the assistants and create new versions via the "Manage Assistants" button. +Additionally, if using Studio, you can edit the assistants and create new versions via the "Manage Assistants" button. ## Use a previous assistant version @@ -324,7 +324,7 @@ If you now run your graph and pass in this assistant id, it will use the first v ### LangGraph Platform UI -If using LangGraph Studio, to set the active version of your assistant, click the "Manage Assistants" button and locate the assistant you would like to use. Select the assistant and the version, and then click the "Active" toggle. This will update the assistant to make the selected version active. +If using Studio, to set the active version of your assistant, click the "Manage Assistants" button and locate the assistant you would like to use. Select the assistant and the version, and then click the "Active" toggle. This will update the assistant to make the selected version active. 
**Deleting Assistants** diff --git a/src/langgraph-platform/configure-ttl.mdx b/src/langsmith/configure-ttl.mdx similarity index 100% rename from src/langgraph-platform/configure-ttl.mdx rename to src/langsmith/configure-ttl.mdx diff --git a/src/langgraph-platform/control-plane.mdx b/src/langsmith/control-plane.mdx similarity index 100% rename from src/langgraph-platform/control-plane.mdx rename to src/langsmith/control-plane.mdx diff --git a/src/langgraph-platform/cron-jobs.mdx b/src/langsmith/cron-jobs.mdx similarity index 100% rename from src/langgraph-platform/cron-jobs.mdx rename to src/langsmith/cron-jobs.mdx diff --git a/src/langgraph-platform/custom-auth.mdx b/src/langsmith/custom-auth.mdx similarity index 100% rename from src/langgraph-platform/custom-auth.mdx rename to src/langsmith/custom-auth.mdx diff --git a/src/langgraph-platform/custom-docker.mdx b/src/langsmith/custom-docker.mdx similarity index 100% rename from src/langgraph-platform/custom-docker.mdx rename to src/langsmith/custom-docker.mdx diff --git a/src/langgraph-platform/custom-lifespan.mdx b/src/langsmith/custom-lifespan.mdx similarity index 100% rename from src/langgraph-platform/custom-lifespan.mdx rename to src/langsmith/custom-lifespan.mdx diff --git a/src/langgraph-platform/custom-middleware.mdx b/src/langsmith/custom-middleware.mdx similarity index 100% rename from src/langgraph-platform/custom-middleware.mdx rename to src/langsmith/custom-middleware.mdx diff --git a/src/langgraph-platform/custom-routes.mdx b/src/langsmith/custom-routes.mdx similarity index 100% rename from src/langgraph-platform/custom-routes.mdx rename to src/langsmith/custom-routes.mdx diff --git a/src/langgraph-platform/data-plane.mdx b/src/langsmith/data-plane.mdx similarity index 100% rename from src/langgraph-platform/data-plane.mdx rename to src/langsmith/data-plane.mdx diff --git a/src/langgraph-platform/data-storage-and-privacy.mdx b/src/langsmith/data-storage-and-privacy.mdx similarity index 91% rename from 
src/langgraph-platform/data-storage-and-privacy.mdx rename to src/langsmith/data-storage-and-privacy.mdx index 6eb4f2133..1a1aa9f12 100644 --- a/src/langgraph-platform/data-storage-and-privacy.mdx +++ b/src/langsmith/data-storage-and-privacy.mdx @@ -2,7 +2,7 @@ title: Data storage and privacy sidebarTitle: Data storage and privacy --- -This document describes how data is processed in the LangGraph CLI and the LangGraph Server for both the in-memory server (`langgraph dev`) and the local Docker server (`langgraph up`). It also describes what data is tracked when interacting with the hosted LangGraph Studio frontend. +This document describes how data is processed in the LangGraph CLI and the LangGraph Server for both the in-memory server (`langgraph dev`) and the local Docker server (`langgraph up`). It also describes what data is tracked when interacting with the hosted Studio frontend. ## CLI @@ -37,7 +37,7 @@ If you've disabled [tracing](#langsmith-tracing), no user data is persisted exte ## Studio -[LangGraph Studio](/langgraph-platform/langgraph-studio) is a graphical interface for interacting with your LangGraph server. It does not persist any private data (the data you send to your server is not sent to LangSmith). Though the studio interface is served at [smith.langchain.com](https://smith.langchain.com), it is run in your browser and connects directly to your local LangGraph server so that no data needs to be sent to LangSmith. +[Studio](/langgraph-platform/langgraph-studio) is a graphical interface for interacting with your LangGraph server. It does not persist any private data (the data you send to your server is not sent to LangSmith). Though the studio interface is served at [smith.langchain.com](https://smith.langchain.com), it is run in your browser and connects directly to your local LangGraph server so that no data needs to be sent to LangSmith. 
If you are logged in, LangSmith does collect some usage analytics to help improve studio's user experience. This includes: diff --git a/src/langgraph-platform/deploy-hybrid.mdx b/src/langsmith/deploy-hybrid.mdx similarity index 99% rename from src/langgraph-platform/deploy-hybrid.mdx rename to src/langsmith/deploy-hybrid.mdx index dba991354..ab0eec94b 100644 --- a/src/langgraph-platform/deploy-hybrid.mdx +++ b/src/langsmith/deploy-hybrid.mdx @@ -1,6 +1,7 @@ --- title: How to deploy hybrid -sidebarTitle: Hybrid deployment +sidebarTitle: Deployment guide +icon: "server" --- Before deploying, review the [conceptual guide for the Hybrid](/langgraph-platform/hybrid) deployment option. diff --git a/src/langgraph-platform/deploy-self-hosted-full-platform.mdx b/src/langsmith/deploy-self-hosted-full-platform.mdx similarity index 97% rename from src/langgraph-platform/deploy-self-hosted-full-platform.mdx rename to src/langsmith/deploy-self-hosted-full-platform.mdx index bb2b2e3d7..32976c216 100644 --- a/src/langgraph-platform/deploy-self-hosted-full-platform.mdx +++ b/src/langsmith/deploy-self-hosted-full-platform.mdx @@ -1,6 +1,8 @@ --- title: How to deploy self-hosted full platform -sidebarTitle: Self-hosted full platform +sidebarTitle: Full-platform deployment guide +icon: "server" +iconType: "solid" --- Before deploying, review the [conceptual guide for the Self-Hosted Full Platform](/langgraph-platform/self-hosted) deployment option. 
diff --git a/src/langgraph-platform/deploy-standalone-server.mdx b/src/langsmith/deploy-standalone-server.mdx similarity index 98% rename from src/langgraph-platform/deploy-standalone-server.mdx rename to src/langsmith/deploy-standalone-server.mdx index 7ebb62f4a..bf81f6abb 100644 --- a/src/langgraph-platform/deploy-standalone-server.mdx +++ b/src/langsmith/deploy-standalone-server.mdx @@ -1,7 +1,10 @@ --- title: How to deploy self-hosted standalone server -sidebarTitle: Self-hosted standalone server +sidebarTitle: Standalone server deployment guide +icon: "server" +iconType: "solid" --- + Before deploying, review the [conceptual guide for the Standalone Server](/langgraph-platform/self-hosted#standalone-server) deployment option. ## Prerequisites diff --git a/src/langgraph-platform/deploy-to-cloud.mdx b/src/langsmith/deploy-to-cloud.mdx similarity index 94% rename from src/langgraph-platform/deploy-to-cloud.mdx rename to src/langsmith/deploy-to-cloud.mdx index b857cd320..bb9ce7815 100644 --- a/src/langgraph-platform/deploy-to-cloud.mdx +++ b/src/langsmith/deploy-to-cloud.mdx @@ -1,6 +1,8 @@ --- title: How to deploy to cloud -sidebarTitle: Cloud deployment +sidebarTitle: Deployment guide +icon: "cloud" +iconType: "solid" --- Before deploying, review the [conceptual guide for the Cloud](/langgraph-platform/cloud) deployment option. @@ -27,7 +29,7 @@ Starting from the LangSmi 2. `Production` deployments can serve up to 500 requests/second and are provisioned with highly available storage with automatic backups. 3. Determine if the deployment should be `Shareable through LangGraph Studio`. 1. If unchecked, the deployment will only be accessible with a valid LangSmith API key for the workspace. - 2. If checked, the deployment will be accessible through LangGraph Studio to any LangSmith user. A direct URL to LangGraph Studio for the deployment will be provided to share with other LangSmith users. + 2. 
If checked, the deployment will be accessible through Studio to any LangSmith user. A direct URL to Studio for the deployment will be provided to share with other LangSmith users. 4. Specify `Environment Variables` and secrets. See the [Environment Variables reference](/langgraph-platform/env-var) to configure additional variables for the deployment. 1. Sensitive values such as API keys (e.g. `OPENAI_API_KEY`) should be specified as secrets. 2. Additional non-secret environment variables can be specified as well. @@ -45,9 +47,9 @@ Starting from the LangSmi 3. In the `Deployment` view, in the top-right corner, select `+ New Revision`. 4. In the `New Revision` modal, fill out the required fields. 1. Specify the full path to the [LangGraph API config file](/langgraph-platform/cli#configuration-file) including the file name. For example, if the file `langgraph.json` is in the root of the repository, simply specify `langgraph.json`. - 2. Determine if the deployment should be `Shareable through LangGraph Studio`. + 2. Determine if the deployment should be `Shareable through Studio`. 1. If unchecked, the deployment will only be accessible with a valid LangSmith API key for the workspace. - 2. If checked, the deployment will be accessible through LangGraph Studio to any LangSmith user. A direct URL to LangGraph Studio for the deployment will be provided to share with other LangSmith users. + 2. If checked, the deployment will be accessible through Studio to any LangSmith user. A direct URL to Studio for the deployment will be provided to share with other LangSmith users. 3. Specify `Environment Variables` and secrets. Existing secrets and environment variables are prepopulated. See the [Environment Variables reference](/langgraph-platform/env-var) to configure additional variables for the revision. 1. Add new secrets or environment variables. 2. Remove existing secrets or environment variables. 
diff --git a/src/langsmith/deployment-options.mdx b/src/langsmith/deployment-options.mdx new file mode 100644 index 000000000..9153dff03 --- /dev/null +++ b/src/langsmith/deployment-options.mdx @@ -0,0 +1,60 @@ +--- +title: Deployment options +sidebarTitle: Deployment options +mode: "wide" +--- + +LangSmith deployment options include: + +- **Cloud**: Fully managed model for deployment running in LangChain's cloud. A simple way to deploy and manage your LangGraph Servers. Connect your GitHub repositories to the platform and deploy your LangGraph Servers from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui). The build process (i.e. CI/CD) is managed internally by the platform. +- **Hybrid**: (Enterprise) Manage the data plane in your own cloud, while LangChain manages the control plane. Build a Docker image using the LangGraph CLI and deploy your LangGraph Server from the control plane UI. +- **Self-Hosted**: (Enterprise) Run all components entirely within your own cloud environment. Deploy the full LangSmith platform or standalone instances of a LangGraph Server without the control plane UI. + +For comparison: + +| | **Cloud** | **Hybrid** | **Self-Hosted** | +|----------------------|----------------|-------------|-----------------| +| **Description** | All components run in LangChain's cloud | Control plane runs in LangChain's cloud; data plane in your cloud | All components run in your cloud | +| **CI/CD** | Managed internally by platform | Managed externally by you | Managed externally by you | +| **Data/compute residency** | LangChain's cloud | Your cloud | Your cloud | +| **LangSmith compatibility** | Trace to LangSmith SaaS | Trace to LangSmith SaaS | Trace to Self-Hosted LangSmith | +| **[Pricing](https://www.langchain.com/pricing-langgraph-platform)** | Plus | Enterprise | Enterprise | + +You can [run LangSmith locally for free](/langgraph-platform/local-server) for testing and development.
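Local testing typically starts from a LangGraph API config file at the repository root; a minimal `langgraph.json` sketch is shown below (the `agent` graph name and the `./agent.py:graph` path are assumed examples, not part of the source):

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./agent.py:graph"
  },
  "env": ".env"
}
```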
+ + +## Related + +For more information, refer to: + +- [Plans](/langgraph-platform/plans) +- [Pricing](https://www.langchain.com/langgraph-platform-pricing) diff --git a/src/langgraph-platform/deployment-quickstart.mdx b/src/langsmith/deployment-quickstart.mdx similarity index 95% rename from src/langgraph-platform/deployment-quickstart.mdx rename to src/langsmith/deployment-quickstart.mdx index b94d37a61..3edfe21af 100644 --- a/src/langgraph-platform/deployment-quickstart.mdx +++ b/src/langsmith/deployment-quickstart.mdx @@ -29,12 +29,12 @@ To deploy an application to **LangGraph Platform**, your application code must r 6. Click **Submit** to deploy. This may take about 15 minutes to complete. You can check the status in the **Deployment details** view. -## 3. Test your application in LangGraph Studio +## 3. Test your application in Studio Once your application is deployed: 1. Select the deployment you just created to view more details. -2. Click the **LangGraph Studio** button in the top right corner. [LangGraph Studio](/langgraph-platform/langgraph-studio) will open to display your graph. +2. Click the **Studio** button in the top right corner. [Studio](/langgraph-platform/langgraph-studio) will open to display your graph. ## 4. Get the API URL for your deployment @@ -157,5 +157,5 @@ You can now test the API: You have deployed an application using LangGraph Platform. 
Here are some other resources to check out: -* [LangGraph Studio overview](/langgraph-platform/langgraph-studio) +* [Studio overview](/langgraph-platform/langgraph-studio) * [Deployment options](/langgraph-platform/deployment-options) diff --git a/src/langgraph-platform/double-texting.mdx b/src/langsmith/double-texting.mdx similarity index 100% rename from src/langgraph-platform/double-texting.mdx rename to src/langsmith/double-texting.mdx diff --git a/src/langgraph-platform/egress-metrics-metadata.mdx b/src/langsmith/egress-metrics-metadata.mdx similarity index 100% rename from src/langgraph-platform/egress-metrics-metadata.mdx rename to src/langsmith/egress-metrics-metadata.mdx diff --git a/src/langgraph-platform/enqueue-concurrent.mdx b/src/langsmith/enqueue-concurrent.mdx similarity index 100% rename from src/langgraph-platform/enqueue-concurrent.mdx rename to src/langsmith/enqueue-concurrent.mdx diff --git a/src/langgraph-platform/env-var.mdx b/src/langsmith/env-var.mdx similarity index 100% rename from src/langgraph-platform/env-var.mdx rename to src/langsmith/env-var.mdx diff --git a/src/langsmith/evaluation-quickstart.mdx b/src/langsmith/evaluation-quickstart.mdx index 9f3100b6a..4d4a64e0c 100644 --- a/src/langsmith/evaluation-quickstart.mdx +++ b/src/langsmith/evaluation-quickstart.mdx @@ -1,6 +1,6 @@ --- title: Evaluation quickstart -sidebarTitle: Evaluate an application +sidebarTitle: Evaluate an app --- import WorkspaceSecret from '/snippets/langsmith/set-workspace-secrets.mdx'; diff --git a/src/langsmith/evaluation.mdx b/src/langsmith/evaluation.mdx index 6f3e03fe5..2f32943d6 100644 --- a/src/langsmith/evaluation.mdx +++ b/src/langsmith/evaluation.mdx @@ -6,14 +6,60 @@ mode: wide Welcome to the LangSmith Evaluation documentation. 
The following sections help you create datasets, run evaluations, and analyze results: -- **Datasets**: [Create](/langsmith/manage-datasets-in-application) and [manage](/langsmith/manage-datasets) datasets for evaluation, including creating datasets through the UI or SDK and managing existing datasets. + -- **Evaluations**: [Run evaluations](/langsmith/evaluate-llm-application) on your applications using various methods and techniques, including different evaluator types and evaluation techniques. +- **Concepts**: Review core terminology and concepts to understand how evaluations work in LangSmith. -- **Analyze experiment results**: [View and analyze your evaluation results](/langsmith/analyze-an-experiment), including comparing experiments, filtering results, and downloading data. +- **Datasets**: Create and manage datasets for evaluation through the UI or SDK. -- **Annotation & human feedback**: Collect human feedback on your application outputs through [annotation queues](/langsmith/annotation-queues) and [inline annotation](/langsmith/annotate-traces-inline). +- **Evaluations**: Evaluate your applications with different evaluators and techniques to measure quality. -- **Tutorials**: Follow step-by-step tutorials to evaluate different types of applications, from [chatbots](/langsmith/evaluate-chatbot-tutorial) to [complex agents](/langsmith/evaluate-complex-agent). +- **Analyze experiment results**: View and analyze evaluation results, compare experiments, filter data, and export findings. -For terminology definitions and core concepts, refer to the [introduction on evaluation](/langsmith/evaluation-concepts). +- **Annotation & human feedback**: Gather human feedback through annotation queues and inline annotation on outputs. +- **Tutorials**: Learn by following step-by-step tutorials, from simple chatbots to complex agent evaluations.
+ + + diff --git a/src/langsmith/faq.mdx b/src/langsmith/faq.mdx index 6ecff984c..509c43456 100644 --- a/src/langsmith/faq.mdx +++ b/src/langsmith/faq.mdx @@ -3,26 +3,28 @@ title: Frequently asked questions sidebarTitle: FAQs --- -## *I can't create API keys or manage users in the UI, what's wrong?* +## Observability + +### *I can't create API keys or manage users in the UI, what's wrong?* * You have likely deployed LangSmith without setting up SSO. LangSmith requires SSO to manage users and API keys. You can find more information on setting up SSO in the [configuration section.](/langsmith/self-host-sso) -## *How does load balancing/ingress work*? +### *How does load balancing/ingress work*? * You will need to expose the frontend container/service to your applications/users. This will handle routing to all downstream services. * You will need to terminate SSL at the ingress level. We recommend using a managed service like AWS ALB, GCP Load Balancer, or Nginx. -## *How can we authenticate to the application?* +### *How can we authenticate to the application?* * Currently, our self-hosted solution supports SSO with OAuth2.0 and OIDC as an authn solution. Note, we do offer a no-auth solution but highly recommend setting up oauth before moving into production. You can find more information on setting up SSO in the [configuration section.](/langsmith/self-host-sso) -## *Can I use external storage services?* +### *Can I use external storage services?* * You can configure LangSmith to use external versions of all storage services. In a production setting, we strongly recommend using external storage services. Check out the [configuration section](/langsmith/architectural-overview) for more information. 
-## *Does my application need egress to function properly?* +### *Does my application need egress to function properly?* Our deployment only needs egress for a few things (most of which can reside within your VPC): @@ -41,16 +43,16 @@ Our deployment only needs egress for a few things (most of which can reside with Your VPC can set up rules to limit any other access. Note: We require the `X-Organization-Id` and `X-Tenant-Id` headers to be allowed to be passed through to the backend service. These are used to determine which organization and workspace (previously called "tenant") the request is for. -## *Resource requirements for the application?* +### *Resource requirements for the application?* * In kubernetes, we recommend a minimum helm configuration which can be found in [here](https://github.com/langchain-ai/helm/blob/main/charts/langsmith/examples/medium_size.yaml). For docker, we recommend a minimum of 16GB of RAM and 4 CPUs. * For Postgres, we recommend a minimum of 8GB of RAM and 2 CPUs. * For Redis, we recommend 4GB of RAM and 2 CPUs. * For Clickhouse, we recommend 32GB of RAM and 8 CPUs. -## SAML SSO FAQs +### SAML SSO FAQs -### *How do I change a SAML SSO user's email address?* +#### *How do I change a SAML SSO user's email address?* Some identity providers retain the original `User ID` through an email change while others do not, so we recommend that you follow these steps to avoid duplicate users in LangSmith: @@ -58,33 +60,97 @@ Some identity providers retain the original `User ID` through an email change wh 2. Change their email address in the IdP 3. 
Have them login to LangSmith again via SAML SSO - this will trigger the usual [JIT provisioning](#just-in-time-jit-provisioning) flow with their new email address -### *How do I fix "405 method not allowed"?* +#### *How do I fix "405 method not allowed"?* Ensure you're using the correct ACS URL: [https://auth.langchain.com/auth/v1/sso/saml/acs](https://auth.langchain.com/auth/v1/sso/saml/acs) -## SCIM FAQs +### SCIM FAQs -### *Can I use SCIM without SAML SSO?* +#### *Can I use SCIM without SAML SSO?* * **Cloud**: No, SAML SSO is required for SCIM in cloud deployments * **Self-hosted**: Yes, SCIM works with OAuth with Client Secret authentication mode -### *What happens if I have both JIT provisioning and SCIM enabled?* +#### *What happens if I have both JIT provisioning and SCIM enabled?* JIT provisioning and SCIM can conflict with each other. We recommend disabling JIT provisioning before enabling SCIM to ensure consistent user provisioning behavior. -### *How do I change a user's role or workspace access?* +#### *How do I change a user's role or workspace access?* Update the user's group membership in your IdP. The changes will be synchronized to LangSmith according to the [role precedence rules](#role-precedence). -### *What happens when a user is removed from all groups?* +#### *What happens when a user is removed from all groups?* The user will be deprovisioned from your LangSmith organization according to your IdP's deprovisioning settings. -### *Can I use custom group names?* +#### *Can I use custom group names?* Yes. If your identity provider supports syncing alternate fields to the `displayName` group attribute, you may use an alternate attribute (like `description`) as the `displayName` in LangSmith and retain full customizability of the identity provider group name. Otherwise, groups must follow the specific naming convention described in the [Group Naming Convention](#group-naming-convention) section to properly map to LangSmith roles and workspaces. 
-#### _Why is my Okta integration not working?_ +##### _Why is my Okta integration not working?_ See Okta's troubleshooting guide here: https://help.okta.com/en-us/content/topics/users-groups-profiles/usgp-group-push-troubleshoot.htm. + +## Deployment + +### Do I need to use LangChain to use LangGraph? What's the difference? + +No. LangGraph is an orchestration framework for complex agentic systems and is more low-level and controllable than LangChain agents. LangChain provides a standard interface to interact with models and other components, useful for straightforward chains and retrieval flows. + +### How is LangGraph different from other agent frameworks? + +Other agentic frameworks can work for simple, generic tasks but fall short for complex tasks bespoke to a company’s needs. LangGraph provides a more expressive framework to handle companies’ unique tasks without restricting users to a single black-box cognitive architecture. + +### Does LangGraph impact the performance of my app? + +LangGraph will not add any overhead to your code and is specifically designed with streaming workflows in mind. + +### Is LangGraph open source? Is it free? + +Yes. LangGraph is an MIT-licensed open-source library and is free to use. + +### How are LangGraph and LangGraph Platform different? + +LangGraph is a stateful orchestration framework that brings added control to agent workflows. LangGraph Platform is a service for deploying and scaling LangGraph applications, with an opinionated API for building agent UXs, plus an integrated developer studio.
+ +| Features | LangGraph (open source) | LangGraph Platform | +|---------------------|-----------------------------------------------------------|--------------------------------------------------------------------------------------------------------| +| Description | Stateful orchestration framework for agentic applications | Scalable infrastructure for deploying LangGraph applications | +| SDKs | Python and JavaScript | Python and JavaScript | +| HTTP APIs | None | Yes - useful for retrieving & updating state or long-term memory, or creating a configurable assistant | +| Streaming | Basic | Dedicated mode for token-by-token messages | +| Checkpointer | Community contributed | Supported out-of-the-box | +| Persistence Layer | Self-managed | Managed Postgres with efficient storage | +| Deployment | Self-managed | • Cloud<br/>• Free self-hosted<br/>• Enterprise (paid self-hosted) | +| Scalability | Self-managed | Auto-scaling of task queues and servers | +| Fault-tolerance | Self-managed | Automated retries | +| Concurrency Control | Simple threading | Supports double-texting | +| Scheduling | None | Cron scheduling | +| Monitoring | None | Integrated with LangSmith for observability | +| IDE integration | Studio | Studio | + +### Is LangGraph Platform open source? + +No. LangGraph Platform is proprietary software. + +There is a free, self-hosted version of LangGraph Platform with access to basic features. The Cloud deployment option and the Self-Hosted deployment options are paid services. [Contact our sales team](https://www.langchain.com/contact-sales) to learn more. + +For more information, see our [LangGraph Platform pricing page](https://www.langchain.com/pricing-langgraph-platform). + +### Does LangGraph work with LLMs that don't support tool calling? + +Yes! You can use LangGraph with any LLM. The main reason we use LLMs that support tool calling is that this is often the most convenient way to have the LLM make its decision about what to do. If your LLM does not support tool calling, you can still use it: you just need to write a bit of logic to convert the raw LLM string response to a decision about what to do. + +### Does LangGraph work with OSS LLMs? + +Yes! LangGraph is totally agnostic to what LLMs are used under the hood. The main reason we use closed LLMs in most of the tutorials is that they seamlessly support tool calling, while OSS LLMs often don't. But tool calling is not necessary (see [this section](#does-langgraph-work-with-llms-that-dont-support-tool-calling)) so you can totally use LangGraph with OSS LLMs. + +### Can I use Studio without logging in to LangSmith? + +Yes! You can use the [development version of LangGraph Server](/langgraph-platform/local-server) to run the backend locally. +This will connect to the studio frontend hosted as part of LangSmith.
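As a minimal sketch of the fallback described in the tool-calling FAQ above: the `ACTION:`/`FINAL:` response format below is an invented convention for illustration, not part of LangGraph.

```python
import re

def route_from_text(raw: str) -> dict:
    """Parse a raw LLM string into a routing decision.

    Assumes the prompt asked the model to end its reply with either
    'ACTION: <tool> <input>' or 'FINAL: <answer>' on the last line.
    """
    last = raw.strip().splitlines()[-1]
    match = re.match(r"ACTION:\s*(\w+)\s*(.*)", last)
    if match:
        # Model chose a tool: route to that node with the parsed input.
        return {"type": "tool", "name": match.group(1), "input": match.group(2)}
    # Otherwise treat the last line as the final answer.
    return {"type": "final", "answer": last.removeprefix("FINAL:").strip()}

print(route_from_text("I should look this up.\nACTION: search weather in Paris"))
# → {'type': 'tool', 'name': 'search', 'input': 'weather in Paris'}
```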
+If you set an environment variable of `LANGSMITH_TRACING=false`, then no traces will be sent to LangSmith. + +### What does "nodes executed" mean for LangGraph Platform usage? + +**Nodes Executed** is the aggregate number of nodes in a LangGraph application that are called and completed successfully during an invocation of the application. If a node in the graph is not called during execution or ends in an error state, these nodes will not be counted. If a node is called and completes successfully multiple times, each occurrence will be counted. diff --git a/src/langgraph-platform/generative-ui-react.mdx b/src/langsmith/generative-ui-react.mdx similarity index 100% rename from src/langgraph-platform/generative-ui-react.mdx rename to src/langsmith/generative-ui-react.mdx diff --git a/src/langgraph-platform/graph-rebuild.mdx b/src/langsmith/graph-rebuild.mdx similarity index 100% rename from src/langgraph-platform/graph-rebuild.mdx rename to src/langsmith/graph-rebuild.mdx diff --git a/src/langsmith/home.mdx b/src/langsmith/home.mdx index 7e31ab6b3..37af84f6e 100644 --- a/src/langsmith/home.mdx +++ b/src/langsmith/home.mdx @@ -4,13 +4,11 @@ sidebarTitle: Overview mode: wide --- -[LangSmith](https://www.langchain.com/langsmith) is a platform for building production-grade LLM applications. Monitor and evaluate your application, so you can ship quickly and with confidence. +**LangSmith provides tools for developing, debugging, and deploying LLM applications.** +It helps you trace requests, evaluate outputs, test prompts, and manage deployments in one place. LangSmith is framework agnostic, which means you can use it with or without LangChain's open-source frameworks [`langchain`](https://python.langchain.com) and [`langgraph`](https://langchain-ai.github.io/langgraph/). You can prototype locally and then move to production with integrated monitoring and evaluation to build more reliable AI systems. 
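The "nodes executed" counting rule in the FAQ above can be illustrated with a toy tally; the event format here is invented purely for illustration.

```python
def count_nodes_executed(events):
    """Tally billable node executions per the rule above:
    only nodes that are called AND complete successfully count,
    and each successful occurrence counts separately."""
    return sum(1 for _, status in events if status == "success")

run = [
    ("plan", "success"),
    ("search", "success"),
    ("search", "success"),   # called twice, succeeded twice: counted twice
    ("summarize", "error"),  # ended in an error state: not counted
]
print(count_nodes_executed(run))  # → 3
```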
- -LangSmith is framework agnostic — you can use it with or without LangChain's open source frameworks [`langchain`](https://python.langchain.com) and [`langgraph`](https://langchain-ai.github.io/langgraph/). - +**NEW IMAGE NEEDED** -
Dark mode overview -
- - + - Gain visibility into each step your application takes when handling a request to debug faster. - + cta="Start tracing" + > + Gain visibility into every step your application takes to debug faster and improve reliability. + - - Measure quality of your applications over time to build more reliable AI applications. - + cta="Evaluate your app" + > + Measure and track quality over time to ensure your AI applications are consistent and trustworthy. + - + Take your application from testing to production with LangSmith’s deployment tools. + + + -Iterate on prompts, with automatic version control and collaboration features. - + cta="Test your prompts" + > + Iterate on prompts with built-in versioning and collaboration to ship improvements faster. + + + + Run LangSmith locally to prototype, test, and debug without leaving your dev environment. + - -Set up your workspace, configure admin settings, and invite your team to collaborate. - + cta="Build with Studio" + > + Use an intuitive visual interface to design, test, and refine applications end-to-end. + diff --git a/src/langgraph-platform/human-in-the-loop-time-travel.mdx b/src/langsmith/human-in-the-loop-time-travel.mdx similarity index 100% rename from src/langgraph-platform/human-in-the-loop-time-travel.mdx rename to src/langsmith/human-in-the-loop-time-travel.mdx diff --git a/src/langgraph-platform/hybrid.mdx b/src/langsmith/hybrid.mdx similarity index 90% rename from src/langgraph-platform/hybrid.mdx rename to src/langsmith/hybrid.mdx index 2995ea6ed..cc698dcfd 100644 --- a/src/langgraph-platform/hybrid.mdx +++ b/src/langsmith/hybrid.mdx @@ -1,5 +1,6 @@ --- title: Hybrid +sidebarTitle: Overview --- @@ -9,13 +10,20 @@ The Hybrid deployment option requires an [Enterprise](/langgraph-platform/plans) ## Requirements -* You use `langgraph-cli` and/or [LangGraph Studio](/langgraph-platform/langgraph-studio) app to test graph locally. 
+* You use the `langgraph-cli` and/or the [Studio](/langgraph-platform/langgraph-studio) app to test your graph locally. * You use the `langgraph build` command to build an image. -## Hybrid + +Supported Compute Platforms: [Kubernetes](https://kubernetes.io/). +For a guide on deployment, refer to [How to deploy Hybrid](/langgraph-platform/deploy-hybrid). + + +## Overview The [Hybrid](/langgraph-platform/deploy-hybrid) deployment option lets you manage the [data plane](/langgraph-platform/data-plane) in your own cloud, while we handle the [control plane](/langgraph-platform/control-plane) in ours. When using the Hybrid version, you authenticate with a [LangSmith](https://smith.langchain.com/) API key. +Build a Docker image using the [LangGraph CLI](/langgraph-platform/langgraph-cli) and deploy your LangGraph Server from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui). + | | [Control plane](/langgraph-platform/control-plane) | [Data plane](/langgraph-platform/data-plane) | |-------------------|-------------------|------------| | **What is it?** |
  • Control plane UI for creating deployments and revisions
  • Control plane APIs for creating deployments and revisions
|
  • Data plane "listener" for reconciling deployments with control plane state
  • LangGraph Servers
  • Postgres, Redis, etc.
| diff --git a/src/langgraph-platform/interrupt-concurrent.mdx b/src/langsmith/interrupt-concurrent.mdx similarity index 100% rename from src/langgraph-platform/interrupt-concurrent.mdx rename to src/langsmith/interrupt-concurrent.mdx diff --git a/src/langgraph-platform/langgraph-cli.mdx b/src/langsmith/langgraph-cli.mdx similarity index 98% rename from src/langgraph-platform/langgraph-cli.mdx rename to src/langsmith/langgraph-cli.mdx index 549e6d7e0..91796d515 100644 --- a/src/langgraph-platform/langgraph-cli.mdx +++ b/src/langsmith/langgraph-cli.mdx @@ -1,5 +1,6 @@ --- title: LangGraph CLI +sidebarTitle: Dupe content CLI --- **LangGraph CLI** is a multi-platform command-line tool for building and running the [LangGraph API server](/langgraph-platform/langgraph-server) locally. The resulting server includes all API endpoints for your graph's runs, threads, assistants, etc. as well as the other services required to run your agent, including a managed database for checkpointing and storage. 
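The CLI reads its project configuration from a `langgraph.json` file at the root of your application. A minimal sketch of that file (the graph ID, module path, and `.env` reference are placeholders for your own project):

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./src/agent/graph.py:graph"
  },
  "env": ".env"
}
```

`graphs` maps each graph ID the server exposes to the `path/to/file.py:variable` that defines it, and `dependencies` lists the local packages to install into the built image.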
diff --git a/src/langgraph-platform/js-ts-sdk.mdx b/src/langsmith/langgraph-js-ts-sdk.mdx similarity index 78% rename from src/langgraph-platform/js-ts-sdk.mdx rename to src/langsmith/langgraph-js-ts-sdk.mdx index f96c0c5fb..8078e0363 100644 --- a/src/langgraph-platform/js-ts-sdk.mdx +++ b/src/langsmith/langgraph-js-ts-sdk.mdx @@ -1,5 +1,5 @@ --- title: SDK (JS/TS) -sidebarTitle: SDK (JS/TS) +sidebarTitle: LG SDK (JS/TS) url: "https://langchain-ai.github.io/langgraphjs/reference/modules/sdk.html" --- diff --git a/src/langgraph-platform/python-sdk.mdx b/src/langsmith/langgraph-python-sdk.mdx similarity index 78% rename from src/langgraph-platform/python-sdk.mdx rename to src/langsmith/langgraph-python-sdk.mdx index 164418806..c43bc029b 100644 --- a/src/langgraph-platform/python-sdk.mdx +++ b/src/langsmith/langgraph-python-sdk.mdx @@ -1,5 +1,5 @@ --- title: SDK (Python) -sidebarTitle: SDK (Python) +sidebarTitle: LG SDK (Python) url: "https://langchain-ai.github.io/langgraph/cloud/reference/sdk/python_sdk_ref/" --- diff --git a/src/langgraph-platform/langgraph-server-changelog.mdx b/src/langsmith/langgraph-server-changelog.mdx similarity index 100% rename from src/langgraph-platform/langgraph-server-changelog.mdx rename to src/langsmith/langgraph-server-changelog.mdx diff --git a/src/langgraph-platform/langgraph-server.mdx b/src/langsmith/langgraph-server.mdx similarity index 100% rename from src/langgraph-platform/langgraph-server.mdx rename to src/langsmith/langgraph-server.mdx diff --git a/src/langgraph-platform/langgraph-studio.mdx b/src/langsmith/langgraph-studio.mdx similarity index 84% rename from src/langgraph-platform/langgraph-studio.mdx rename to src/langsmith/langgraph-studio.mdx index 3a3965733..aa9b24600 100644 --- a/src/langgraph-platform/langgraph-studio.mdx +++ b/src/langsmith/langgraph-studio.mdx @@ -18,11 +18,11 @@ Studio is a specialized agent IDE that enables visualization, interaction, and d Key features of Studio: * Visualize your graph 
architecture -* [Run and interact with your agent](/langgraph-platform/use-studio#run-application) -* [Manage assistants](/langgraph-platform/use-studio#manage-assistants) -* [Manage threads](/langgraph-platform/use-studio#manage-threads) -* [Iterate on prompts](/langgraph-platform/observability-studio) -* [Run experiments over a dataset](/langgraph-platform/observability-studio#run-experiments-over-a-dataset) +* [Run and interact with your agent](/langsmith/use-studio#run-application) +* [Manage assistants](/langsmith/use-studio#manage-assistants) +* [Manage threads](/langsmith/use-studio#manage-threads) +* [Iterate on prompts](/langsmith/observability-studio) +* [Run experiments over a dataset](/langsmith/observability-studio#run-experiments-over-a-dataset) * Manage [long term memory](/oss/concepts/memory) * Debug agent state via [time travel](/oss/langgraph/use-time-travel) diff --git a/src/langgraph-platform/local-server.mdx b/src/langsmith/local-server.mdx similarity index 93% rename from src/langgraph-platform/local-server.mdx rename to src/langsmith/local-server.mdx index 58450fd40..52bddca70 100644 --- a/src/langgraph-platform/local-server.mdx +++ b/src/langsmith/local-server.mdx @@ -1,6 +1,6 @@ --- -title: Run a LangGraph application locally -sidebarTitle: Run a LangGraph application locally +title: Run a LangGraph app locally +sidebarTitle: Run a LangGraph app locally --- This quickstart shows you how to set up a LangGraph application locally for testing and development. @@ -101,7 +101,7 @@ Sample output: > > - Docs: http://localhost:2024/docs > -> - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 +> - Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 ``` The `langgraph dev` command starts LangGraph Server in an in-memory mode. This mode is suitable for development and testing purposes. 
@@ -227,7 +227,7 @@ For production use, deploy LangGraph Server with access to a persistent storage Now that you have a LangGraph app running locally, take your journey further by exploring features and deployment: -- [LangGraph Studio](/langgraph-platform/langgraph-studio) is a specialized UI that you can connect to LangGraph API server to visualize, interact with, and debug your application locally. Try the [LangGraph Studio quickstart](/langgraph-platform/quick-start-studio). +- [Studio](/langgraph-platform/langgraph-studio) is a specialized UI that you can connect to the LangGraph API server to visualize, interact with, and debug your application locally. Try the [Studio quickstart](/langgraph-platform/quick-start-studio). - [Deploy on cloud](/langgraph-platform/deployment-quickstart) with the quickstart guide. - [LangGraph Server API Reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/): Explore the LangGraph Server API documentation. - [Python SDK Reference](/langgraph-platform/python-sdk): Explore the Python SDK API Reference. diff --git a/src/langgraph-platform/monorepo-support.mdx b/src/langsmith/monorepo-support.mdx similarity index 100% rename from src/langgraph-platform/monorepo-support.mdx rename to src/langsmith/monorepo-support.mdx diff --git a/src/langsmith/observability-quickstart.mdx b/src/langsmith/observability-quickstart.mdx index 3780db75f..aa854c472 100644 --- a/src/langsmith/observability-quickstart.mdx +++ b/src/langsmith/observability-quickstart.mdx @@ -1,6 +1,6 @@ --- title: Tracing quickstart -sidebarTitle: Trace an application +sidebarTitle: Trace an app --- [_Observability_](/langsmith/observability-concepts) is a critical requirement for applications built with large language models (LLMs). LLMs are non-deterministic, which means that the same prompt can produce different responses. This behavior makes debugging and monitoring more challenging than with traditional software.
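Before the first trace is sent, the SDK needs tracing enabled and an API key available in the environment. A minimal shell sketch (the key and project name are placeholders):

```shell
# Turn on LangSmith tracing for this shell session.
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY="<your-api-key>"

# Optional: group traces under a named project instead of the default one.
export LANGSMITH_PROJECT="my-app"
```
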
diff --git a/src/langsmith/observability-studio.mdx b/src/langsmith/observability-studio.mdx new file mode 100644 index 000000000..4292b27b0 --- /dev/null +++ b/src/langsmith/observability-studio.mdx @@ -0,0 +1,213 @@ +--- +title: Observability in Studio +sidebarTitle: Traces, datasets, prompts +--- + +Studio provides tools to inspect, debug, and improve your app beyond execution. By working with traces, datasets, and prompts, you can see how your application behaves in detail, measure its performance, and refine its outputs: + +- [Iterate on prompts](#iterate-on-prompts): Modify prompts inside graph nodes directly or with the LangSmith playground. +- [Run experiments over a dataset](#run-experiments-over-a-dataset): Execute your assistant over a LangSmith dataset to score and compare results. +- [Debug LangSmith traces](#debug-langsmith-traces): Import traced runs into Studio and optionally clone them into your local agent. +- [Add a node to a dataset](#add-node-to-dataset): Turn parts of thread history into dataset examples for evaluation or further analysis. + +## Iterate on prompts + +Studio supports the following methods for modifying prompts in your graph: + +- [Direct node editing](#direct-node-editing) +- [Playground interface](#playground) + +### Direct node editing + +Studio allows you to edit prompts used inside individual nodes, directly from the graph interface. + +#### Graph Configuration + +Define your [configuration](/oss/langgraph/use-graph-api#add-runtime-configuration) to specify prompt fields and their associated nodes using `langgraph_nodes` and `langgraph_type` keys. + +##### `langgraph_nodes` + +- **Description**: Specifies which nodes of the graph a configuration field is associated with. +- **Value Type**: Array of strings, where each string is the name of a node in your graph. +- **Usage Context**: Include in the `json_schema_extra` dictionary for Pydantic models or the `metadata["json_schema_extra"]` dictionary for dataclasses. 
+- **Example**: + ```python + system_prompt: str = Field( + default="You are a helpful AI assistant.", + json_schema_extra={"langgraph_nodes": ["call_model", "other_node"]}, + ) + ``` + +##### `langgraph_type` + +- **Description**: Specifies the type of configuration field, which determines how it's handled in the UI. +- **Value Type**: String +- **Supported Values**: + * `"prompt"`: Indicates the field contains prompt text that should be treated specially in the UI. +- **Usage Context**: Include in the `json_schema_extra` dictionary for Pydantic models or the `metadata["json_schema_extra"]` dictionary for dataclasses. +- **Example**: + ```python + system_prompt: str = Field( + default="You are a helpful AI assistant.", + json_schema_extra={ + "langgraph_nodes": ["call_model"], + "langgraph_type": "prompt", + }, + ) + ``` + + + +```python +## Using Pydantic +from pydantic import BaseModel, Field +from typing import Annotated, Literal + +class Configuration(BaseModel): + """The configuration for the agent.""" + + system_prompt: str = Field( + default="You are a helpful AI assistant.", + description="The system prompt to use for the agent's interactions. " + "This prompt sets the context and behavior for the agent.", + json_schema_extra={ + "langgraph_nodes": ["call_model"], + "langgraph_type": "prompt", + }, + ) + + model: Annotated[ + Literal[ + "anthropic/claude-3-7-sonnet-latest", + "anthropic/claude-3-5-haiku-latest", + "openai/o1", + "openai/gpt-4o-mini", + "openai/o1-mini", + "openai/o3-mini", + ], + {"__template_metadata__": {"kind": "llm"}}, + ] = Field( + default="openai/gpt-4o-mini", + description="The name of the language model to use for the agent's main interactions. 
" + "Should be in the form: provider/model-name.", + json_schema_extra={"langgraph_nodes": ["call_model"]}, + ) + +## Using Dataclasses +from dataclasses import dataclass, field + +@dataclass(kw_only=True) +class Configuration: + """The configuration for the agent.""" + + system_prompt: str = field( + default="You are a helpful AI assistant.", + metadata={ + "description": "The system prompt to use for the agent's interactions. " + "This prompt sets the context and behavior for the agent.", + "json_schema_extra": {"langgraph_nodes": ["call_model"]}, + }, + ) + + model: Annotated[str, {"__template_metadata__": {"kind": "llm"}}] = field( + default="anthropic/claude-3-5-sonnet-20240620", + metadata={ + "description": "The name of the language model to use for the agent's main interactions. " + "Should be in the form: provider/model-name.", + "json_schema_extra": {"langgraph_nodes": ["call_model"]}, + }, + ) + +``` + + + +#### Editing prompts in the UI + +1. Locate the gear icon on nodes with associated configuration fields. +1. Click to open the configuration modal. +1. Edit the values. +1. Save to update the current assistant version or create a new one. + +### Playground + +The [playground](/langsmith/create-a-prompt) interface allows testing individual LLM calls without running the full graph: + +1. Select a thread. +1. Click **View LLM Runs** on a node. This lists all the LLM calls (if any) made inside the node. +1. Select an LLM run to open in the playground. +1. Modify prompts and test different model and tool settings. +1. Copy updated prompts back to your graph. + +## Run experiments over a dataset + +Studio lets you run [evaluations](/langsmith/evaluation-concepts) by executing your assistant against a predefined LangSmith [dataset](/langsmith/evaluation-concepts#datasets). 
This allows you to test performance across a variety of inputs, compare outputs to reference answers, and score results with configured [evaluators](/langsmith/evaluation-concepts#evaluators). + +This guide shows you how to run a full end-to-end experiment directly from Studio. + +### Prerequisites + +Before running an experiment, ensure you have the following: + +- **A LangSmith dataset**: Your dataset should contain the inputs you want to test and, optionally, reference outputs for comparison. The schema for the inputs must match the required input schema for the assistant. For more information on schemas, see [here](/oss/langgraph/use-graph-api#schema). For more on creating datasets, refer to [How to Manage Datasets](/langsmith/manage-datasets-in-application#set-up-your-dataset). +- **(Optional) Evaluators**: You can attach evaluators (e.g., LLM-as-a-Judge, heuristics, or custom functions) to your dataset in LangSmith. These will run automatically after the graph has processed all inputs. +- **A running application**: The experiment can be run against: + - An application deployed on [LangGraph Platform](/langgraph-platform/deployment-quickstart). + - A locally running application started via the [langgraph-cli](/langgraph-platform/local-server). + +### Experiment setup + +1. Launch the experiment. Click the **Run experiment** button in the top right corner of the Studio page. +1. Select your dataset. In the modal that appears, select the dataset (or a specific dataset split) to use for the experiment and click **Start**. +1. Monitor the progress. All of the inputs in the dataset will now be run against the active assistant. Monitor the experiment's progress via the badge in the top right corner. +1. View the results. You can continue to work in Studio while the experiment runs in the background. Click the arrow icon button at any time to navigate to LangSmith and view the detailed experiment results.
+ +## Debug LangSmith traces + +This guide explains how to open LangSmith traces in Studio for interactive investigation and debugging. + +### Open deployed threads + +1. Open the LangSmith trace, selecting the root run. +1. Click **Run in Studio**. + +This will open Studio connected to the associated deployment with the trace's parent thread selected. + +### Testing local agents with remote traces + +This section explains how to test a local agent against remote traces from LangSmith. This enables you to use production traces as input for local testing, allowing you to debug and verify agent modifications in your development environment. + +#### Prerequisites + +- A LangSmith traced thread +- A [locally running agent](/langgraph-platform/local-server#local-development-server). + + +**Local agent requirements** +* langgraph>=0.3.18 +* langgraph-api>=0.0.32 +* Contains the same set of nodes present in the remote trace + + +#### Clone thread + +1. Open the LangSmith trace, selecting the root run. +2. Click the dropdown next to **Run in Studio**. +3. Enter your local agent's URL. +4. Select **Clone thread locally**. +5. If multiple graphs exist, select the target graph. + +A new thread will be created in your local agent with the thread history inferred and copied from the remote thread, and you will be navigated to Studio for your locally running application. + +## Add node to dataset + +Add [examples](/langsmith/evaluation-concepts#examples) to [LangSmith datasets](/langsmith/manage-datasets) from nodes in the thread log. This is useful to evaluate individual steps of the agent. + +1. Select a thread. +1. Click **Add to Dataset**. +1. Select nodes whose input/output you want to add to a dataset. +1. For each selected node, select the target dataset to create the example in. By default a dataset for the specific assistant and node will be selected. If this dataset does not yet exist, it will be created. +1. 
Edit the example's input/output as needed before adding it to the dataset. +1. Select **Add to dataset** at the bottom of the page to add all selected nodes to their respective datasets. + +For more details, refer to [How to evaluate an application's intermediate steps](/langsmith/evaluate-on-intermediate-steps). diff --git a/src/langsmith/observability.mdx b/src/langsmith/observability.mdx index e1ce2cd26..03dc5a18f 100644 --- a/src/langsmith/observability.mdx +++ b/src/langsmith/observability.mdx @@ -6,16 +6,60 @@ mode: wide Welcome to the LangSmith Observability documentation. The following sections help you set up and use tracing, monitoring, and observability features: -- **Set up tracing**: Configure tracing for your applications with basic configuration, [integrations](/langsmith/trace-with-langgraph) with popular frameworks, and advanced configuration options. + + + Configure tracing with basic options, framework integrations, or advanced settings for full control. + -- **View traces**: Access and manage your traces through the UI and API, including [filtering](/langsmith/filter-traces-in-application), [exporting](/langsmith/data-export), [sharing](/langsmith/share-trace), and [comparing](/langsmith/compare-traces) traces. + + Access and manage traces via UI or API with filtering, exporting, sharing, and comparison tools. + -- **Monitoring**: Set up [dashboards](/langsmith/dashboards) and [alerts](/langsmith/alerts) to monitor your application performance and receive notifications when issues arise. + + Create dashboards and set alerts to track performance and get notified when issues arise. + -- **Automations**: Configure [rules](/langsmith/rules), [webhooks](/langsmith/webhooks), and [online evaluations](/langsmith/online-evaluations) to automate your observability workflows. + + Use rules, webhooks, and online evaluations to streamline observability workflows. 
+ -- **Human Feedback**: [Collect and manage human feedback](/langsmith/attach-user-feedback) on your application outputs through annotation queues and inline annotation. + + Gather and manage annotations on outputs using queues and inline annotation. + -- **Trace a RAG application**: Follow a [tutorial to trace a Retrieval-Augmented Generation (RAG) application](/langsmith/observability-llm-tutorial) from start to finish. + + Follow a step-by-step tutorial to trace a Retrieval-Augmented Generation application from start to finish. + + -For terminology definitions and core concepts, refer to the [introduction on observability](/langsmith/observability-concepts). +For terminology definitions and core concepts, refer to [Observability concepts](/langsmith/observability-concepts). diff --git a/src/langgraph-platform/openapi-security.mdx b/src/langsmith/openapi-security.mdx similarity index 100% rename from src/langgraph-platform/openapi-security.mdx rename to src/langsmith/openapi-security.mdx diff --git a/src/langsmith/prompt-engineering.mdx b/src/langsmith/prompt-engineering.mdx index 02a34388d..b46c828f0 100644 --- a/src/langsmith/prompt-engineering.mdx +++ b/src/langsmith/prompt-engineering.mdx @@ -6,14 +6,59 @@ mode: wide Welcome to the LangSmith Prompt Engineering documentation. The following sections help you create, manage, and optimize your prompts: -- [**Create and update prompts**](/langsmith/create-a-prompt): Create promptsthrough the UI or SDK, configure settings, use tools, include multimodal content, and connect to different model providers. + + + Read definitions and key terminology for prompt engineering in LangSmith. + -- [**Manage prompts**](/langsmith/manage-prompts): Organize your prompts with tags, commit changes, trigger webhooks, and share prompts through the public prompt hub. + + Build prompts via the UI or SDK, configure settings, use tools, add multimodal inputs, and connect model providers. 
+ -- [**Prompt hub**](/langsmith/manage-prompts#public-prompt-hub): Access and manage prompt tags and explore the LangChain Hub for community prompts. + + Organize with tags, commit changes, trigger webhooks, and share through the public prompt hub. + -- [**Prompt playground**](/langsmith/managing-model-configurations): Test and experiment with prompts using custom endpoints and model configurations. + + Browse and manage prompt tags and discover community prompts from the LangChain Hub. + -- **Tutorials**: Follow tutorials to [optimize classifiers](/langsmith/optimize-classifier) and learn advanced prompt engineering techniques. + + Test and experiment with prompts using custom endpoints and model configurations. + -For terminology definitions and core concepts, refer to the [introduction on prompt engineering](/langsmith/prompt-engineering-concepts). + + Learn step-by-step techniques, like optimizing classifiers and advanced prompt engineering. + + + diff --git a/src/langsmith/quick-start-studio.mdx b/src/langsmith/quick-start-studio.mdx new file mode 100644 index 000000000..487fa763c --- /dev/null +++ b/src/langsmith/quick-start-studio.mdx @@ -0,0 +1,135 @@ +--- +title: Get started with Studio +sidebarTitle: Use Studio +--- + +[Studio](/langgraph-platform/langgraph-studio) supports connecting to two types of graphs: + +- Graphs deployed on [cloud or self-hosted](#deployed-graphs). +- Graphs running locally with [LangGraph server](#local-development-server). + +## Deployed graphs + +Studio is accessed in the [LangSmith UI](https://smith.langchain.com) from **Deployments**. + +For applications that are [deployed](/langgraph-platform/deployment-quickstart), you can access Studio as part of that deployment. To do so, navigate to the deployment in the UI and select **Studio**. 
+ +This will load Studio connected to your live deployment, allowing you to create, read, and update the [threads](/oss/langgraph/persistence#threads), [assistants](/langgraph-platform/assistants), and [memory](/oss/concepts/memory) in that deployment. + +## Local development server + +### Prerequisites + +To test your application locally using Studio: + +- Follow the [local application quickstart](/langgraph-platform/local-server) first. +- If you don't want data [traced](/langsmith/observability-concepts#traces) to LangSmith, set `LANGSMITH_TRACING=false` in your application's `.env` file. With tracing disabled, no data leaves your local server. + +### Setup + +1. Install the [LangGraph CLI](/langgraph-platform/langgraph-cli): + + + ```bash pip + pip install -U "langgraph-cli[inmem]" + langgraph dev + ``` + + ```bash uv + uv add "langgraph-cli[inmem]" + langgraph dev + ``` + + ```bash npm + npx @langchain/langgraph-cli dev + ``` + + + + **Browser Compatibility** + Safari blocks `localhost` connections to Studio. To work around this, run the command with `--tunnel` to access Studio via a secure tunnel. + + + This will start the LangGraph Server locally, running in-memory. The server will run in watch mode, listening for and automatically restarting on code changes. Read this [reference](/langgraph-platform/cli#dev) to learn about all the options for starting the API server. + + You will see the following logs: + + ``` + > Ready! + > + > - API: [http://localhost:2024](http://localhost:2024/) + > + > - Docs: http://localhost:2024/docs + > + > - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 + ``` + + Once running, you will automatically be directed to Studio. + +1. For a running server, access Studio with one of the following: + 1. Directly navigate to the following URL: `https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024`. + 1.
Navigate to **Deployments** in the UI, click the **Studio** button on a deployment, enter `http://127.0.0.1:2024` and click **Connect**. + + If running your server at a different host or port, update the `baseUrl` to match. + +### (Optional) Attach a debugger + +For step-by-step debugging with breakpoints and variable inspection, run the following: + + + ```bash pip + # Install debugpy package + pip install debugpy + # Start server with debugging enabled + langgraph dev --debug-port 5678 + ``` + + ```bash uv + # Install debugpy package + uv add debugpy + # Start server with debugging enabled + langgraph dev --debug-port 5678 + ``` + + +Then attach your preferred debugger: + + + + Add this configuration to `launch.json`: + + ```json + { + "name": "Attach to LangGraph", + "type": "debugpy", + "request": "attach", + "connect": { + "host": "0.0.0.0", + "port": 5678 + } + } + ``` + + + 1. Go to Run → Edit Configurations + 2. Click + and select "Python Debug Server" + 3. Set IDE host name: `localhost` + 4. Set port: `5678` (or the port number you chose in the previous step) + 5. Click "OK" and start debugging + + + + +For issues getting started, refer to the [troubleshooting guide](/langgraph-platform/troubleshooting-studio). 
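Studio's web UI is parameterized entirely by the `baseUrl` query parameter, so the URL for a non-default host or port can be constructed programmatically. A small illustrative sketch:

```python
from urllib.parse import urlencode

# Address of your locally running LangGraph Server.
# 2024 is the default port used by `langgraph dev`; change it to match your setup.
server_url = "http://127.0.0.1:2024"

# Studio web UI with the server address percent-encoded into the query string.
studio_url = "https://smith.langchain.com/studio/?" + urlencode({"baseUrl": server_url})
print(studio_url)
```
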
+ + +## Next steps + +For more information on how to run Studio, refer to the following guides: + +- [Run application](/langsmith/use-studio#run-application) +- [Manage assistants](/langsmith/use-studio#manage-assistants) +- [Manage threads](/langsmith/use-studio#manage-threads) +- [Iterate on prompts](/langsmith/observability-studio) +- [Debug LangSmith traces](/langsmith/observability-studio#debug-langsmith-traces) +- [Add node to dataset](/langsmith/observability-studio#add-node-to-dataset) diff --git a/src/langgraph-platform/reference-overview.mdx b/src/langsmith/reference-overview.mdx similarity index 100% rename from src/langgraph-platform/reference-overview.mdx rename to src/langsmith/reference-overview.mdx diff --git a/src/langgraph-platform/reject-concurrent.mdx b/src/langsmith/reject-concurrent.mdx similarity index 100% rename from src/langgraph-platform/reject-concurrent.mdx rename to src/langsmith/reject-concurrent.mdx diff --git a/src/langsmith/release-versions.mdx b/src/langsmith/release-versions.mdx index e117c7c5c..10aa5965e 100644 --- a/src/langsmith/release-versions.mdx +++ b/src/langsmith/release-versions.mdx @@ -4,17 +4,8 @@ title: Release versions import ReleaseVersionPolicy from "/snippets/release-version-policy.mdx"; - - -## Self-hosted considerations - -For self-hosted LangSmith installations: - -- Ensure you're running a supported version to receive security updates -- Plan regular upgrades to maintain support -- Monitor [Helm chart releases](https://github.com/langchain-ai/langsmith-helm-charts/releases) for updates -- Follow the [self-hosted upgrade guide](/langsmith/self-host-upgrades) for detailed instructions + ## Current version support -To check the current supported versions and their support levels, refer to the [LangSmith changelog](https://github.com/langchain-ai/langsmith/releases) for the latest release information.
+To check the current supported versions and their support levels, refer to the [LangGraph Server Changelog](/langgraph-platform/langgraph-server-changelog) for the latest release information. diff --git a/src/langgraph-platform/remote-graph.mdx b/src/langsmith/remote-graph.mdx similarity index 100% rename from src/langgraph-platform/remote-graph.mdx rename to src/langsmith/remote-graph.mdx diff --git a/src/langgraph-platform/resource-auth.mdx b/src/langsmith/resource-auth.mdx similarity index 100% rename from src/langgraph-platform/resource-auth.mdx rename to src/langsmith/resource-auth.mdx diff --git a/src/langgraph-platform/rollback-concurrent.mdx b/src/langsmith/rollback-concurrent.mdx similarity index 100% rename from src/langgraph-platform/rollback-concurrent.mdx rename to src/langsmith/rollback-concurrent.mdx diff --git a/src/langgraph-platform/same-thread.mdx b/src/langsmith/same-thread.mdx similarity index 100% rename from src/langgraph-platform/same-thread.mdx rename to src/langsmith/same-thread.mdx diff --git a/src/langgraph-platform/scalability-and-resilience.mdx b/src/langsmith/scalability-and-resilience.mdx similarity index 100% rename from src/langgraph-platform/scalability-and-resilience.mdx rename to src/langsmith/scalability-and-resilience.mdx diff --git a/src/langgraph-platform/sdk.mdx b/src/langsmith/sdk.mdx similarity index 100% rename from src/langgraph-platform/sdk.mdx rename to src/langsmith/sdk.mdx diff --git a/src/langsmith/self-hosted.mdx b/src/langsmith/self-hosted.mdx new file mode 100644 index 000000000..f527a20d7 --- /dev/null +++ b/src/langsmith/self-hosted.mdx @@ -0,0 +1,198 @@ +--- +title: Self-hosted LangSmith Platform deployments +sidebarTitle: Overview +--- + + +**Important** +The Self-Hosted deployment option requires an [Enterprise](/langgraph-platform/plans) plan. + + +LangSmith can be deployed in different self-hosted configurations depending on your scale, security, and infrastructure needs. 
This section provides an overview of the supported deployment types and guidance on choosing the right one for your use case. + +TODO +All self-hosted options share the same core components (API server, workers, Postgres, Redis, and object storage). What differs is how these services are packaged and deployed. + +The Self-Hosted deployment option allows you to run all components entirely within your own cloud environment. You can choose between two deployment models: + +1. **[Full Platform](#full-platform)**: Deploy both the control plane and the data plane, with full UI/API management capabilities. +2. **[Standalone Server](#standalone-server)**: Deploy standalone instances of a LangGraph Server without the control plane UI. + +## LangGraph Server + UI Deployments + + + +For a guide on deployment, refer to: + +* [How to deploy the Self-Hosted Full Platform](/langgraph-platform/deploy-self-hosted-full-platform) +* [How to deploy the Self-Hosted Standalone Server](/langgraph-platform/deploy-standalone-server) + +Supported Compute Platforms: [Kubernetes](https://kubernetes.io/) (for the Full Platform), any compute platform (for the Standalone Server only). + + + +## Full Platform + +### Overview + +The Full Platform deployment model is a fully self-hosted solution where you manage both the [control plane](/langgraph-platform/control-plane) and [data plane](/langgraph-platform/data-plane) in your cloud. This option gives you full control of, and responsibility for, the control plane and data plane infrastructure. + +| | [Control plane](/langgraph-platform/control-plane) | [Data plane](/langgraph-platform/data-plane) | +|-------------------|-------------------|------------| +| **What is it?** |
  • Control plane UI for creating deployments and revisions
  • Control plane APIs for creating deployments and revisions
|
  • Data plane "listener" for reconciling deployments with control plane state
  • LangGraph Servers
  • Postgres, Redis, etc.
| +| **Where is it hosted?** | Your cloud | Your cloud | +| **Who provisions and manages it?** | You | You | + +### Requirements + +* You use the `langgraph-cli` and/or the [Studio](/langgraph-platform/langgraph-studio) app to test your graph locally. +* You use the `langgraph build` command to build an image. +* You have a Self-Hosted LangSmith instance deployed. +* You are using [Ingress](/langsmith/self-host-ingress) for your LangSmith instance. All agents will be deployed as Kubernetes services behind this ingress. + +With the full platform option, build a Docker image using the [LangGraph CLI](/langgraph-platform/langgraph-cli) and deploy your LangGraph Server from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui) or using the container deployment tooling of your choice. + +### Architecture + +![Self-Hosted Full Platform Architecture](/langgraph-platform/images/self-hosted-full-platform-architecture.png) + +### Compute Platforms + +* **Kubernetes**: The Full Platform deployment model supports deploying control plane and data plane infrastructure to any Kubernetes cluster. + + +If you would like to enable this on your LangSmith instance, please follow the [Self-Hosted Full Platform deployment guide](/langgraph-platform/deploy-self-hosted-full-platform). + + +## Standalone Server + +### Overview + +The Standalone Server deployment model is the least restrictive deployment option. There is no [control plane](/langgraph-platform/control-plane); you manage a simplified version of the [data plane](/langgraph-platform/data-plane) infrastructure yourself. + +| | [Control plane](/langgraph-platform/control-plane) | [Data plane](/langgraph-platform/data-plane) | +|-------------------|-------------------|------------| +| **What is it?** | n/a |
  • LangGraph Servers
  • Postgres, Redis, etc.
| +| **Where is it hosted?** | n/a | Your cloud | +| **Who provisions and manages it?** | n/a | You | + + +LangGraph Platform should not be deployed in serverless environments. Scale to zero may cause task loss and scaling up will not work reliably. + + +### Architecture + +![Standalone Container](/langgraph-platform/images/langgraph-platform-deployment-architecture.png) + +### Compute Platforms + +#### Kubernetes + +The Standalone Server deployment model supports deploying data plane infrastructure to a Kubernetes cluster. + +#### Docker + +The Standalone Server deployment model supports deploying data plane infrastructure to any Docker-supported compute platform. + + +To deploy a [LangGraph Server](/langgraph-platform/langgraph-server), follow the how-to guide for [how to deploy the Standalone Server](/langgraph-platform/deploy-standalone-server). + + + + +## LangSmith overview + + +--- +title: Architectural overview +sidebarTitle: Overview +--- + + +Self-hosted LangSmith is an add-on to the Enterprise Plan designed for our largest, most security-conscious customers. See our [pricing page](https://www.langchain.com/pricing) for more detail, and [contact our sales team](https://www.langchain.com/contact-sales) if you want to get a license key to trial LangSmith in your environment. + + +You can run LangSmith in Kubernetes (recommended) or Docker in a cloud environment that you control. The LangSmith application consists of several components including LangSmith servers and stateful services: + +- [LangSmith frontend](#langsmith-frontend) +- [LangSmith backend](#langsmith-backend) +- [LangSmith platform backend](#langsmith-platform-backend) +- [LangSmith Playground](#langsmith-playground) +- [LangSmith queue](#langsmith-queue) +- [LangSmith ACE (Arbitrary Code Execution) backend](#langsmith-acearbitrary-code-execution-backend) +- [ClickHouse](#clickhouse) +- [PostgreSQL](#postgresql) +- [Redis](#redis) +- [Blob storage](#blob-storage) (Optional, but recommended) + +
+Light mode overview + +Dark mode overview +
+ +To access the LangSmith UI and send API requests, you will need to expose the [LangSmith frontend](#langsmith-frontend) service. Depending on your installation method, this can be a load balancer or a port exposed on the host machine. + +## Storage Services + + +LangSmith Self-Hosted bundles all storage services by default. You can configure LangSmith to use external versions of all storage services. In a production setting, we **strongly recommend using external storage services**. + + +### ClickHouse + +[ClickHouse](https://clickhouse.com/docs/en/intro) is a high-performance, column-oriented SQL database management system (DBMS) for online analytical processing (OLAP). + +LangSmith uses ClickHouse as the primary data store for traces and feedback (high-volume data). + +### PostgreSQL + +[PostgreSQL](https://www.postgresql.org/about/) is a powerful, open source object-relational database system that uses and extends the SQL language, combined with many features that safely store and scale the most complicated data workloads. + +LangSmith uses PostgreSQL as the primary data store for transactional workloads and operational data (almost everything besides traces and feedback). + +### Redis + +[Redis](https://github.com/redis/redis) is a powerful in-memory key-value database that persists on disk. By holding data in memory, Redis offers high performance for operations like caching. + +LangSmith uses Redis to back queuing and caching operations. + +### Blob storage + +LangSmith supports several blob storage providers, including [AWS S3](https://aws.amazon.com/s3/), [Azure Blob Storage](https://azure.microsoft.com/en-us/services/storage/blobs/), and [Google Cloud Storage](https://cloud.google.com/storage). + +LangSmith uses blob storage to store large files, such as trace artifacts, feedback attachments, and other large data objects. Blob storage is optional, but highly recommended for production deployments.
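Once the [LangSmith frontend](#langsmith-frontend) service is exposed, SDK and API clients can be pointed at your instance instead of the SaaS endpoint. A minimal sketch — the hostname and key below are placeholders for your own ingress and credentials:

```python
import os

# Point LangSmith clients at the self-hosted frontend rather than the
# default SaaS endpoint. Replace the hostname with your own ingress.
os.environ["LANGSMITH_ENDPOINT"] = "https://langsmith.example.com/api"
os.environ["LANGSMITH_API_KEY"] = "<your-api-key>"  # placeholder credential

print(os.environ["LANGSMITH_ENDPOINT"])  # → https://langsmith.example.com/api
```

With these variables set, the LangSmith SDKs and tracing integrations pick them up automatically and route traffic to your deployment.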
+ +## Services + +### LangSmith frontend + +The frontend uses Nginx to serve the LangSmith UI and route API requests to the other servers. This serves as the entrypoint for the application and is the only component that must be exposed to users. + +### LangSmith backend + +The backend is the main entrypoint for CRUD API requests and handles the majority of the business logic for the application. This includes handling requests from the frontend and SDK, preparing traces for ingestion, and supporting the hub API. + +### LangSmith queue + +The queue ingests incoming traces and feedback asynchronously, persisting them into the traces and feedback datastore. It performs data-integrity checks, confirms that each insert succeeds, and retries on failures such as database errors or a temporary inability to connect to the database. + +### LangSmith platform backend + +The platform backend is another critical service that primarily handles authentication, run ingestion, and other high-volume tasks. + +### LangSmith playground + +The playground is a service that forwards requests to various LLM APIs to support the LangSmith Playground feature. It can also be used to connect to your own custom model servers. + +### LangSmith ACE (Arbitrary Code Execution) backend + +The ACE backend is a service that executes arbitrary code in a secure environment. This is used to support running custom code within LangSmith.
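The queue's behavior — asynchronous ingestion with integrity checks and retries on transient database failures — follows a familiar worker pattern. A minimal, self-contained Python sketch of that general pattern (illustrative only, not LangSmith's actual implementation):

```python
import queue
import time

def ingest_with_retries(q, insert, max_retries=3, backoff=0.0):
    """Drain a queue of trace payloads, retrying failed inserts.

    `insert` persists one payload and raises on failure (e.g. a
    transient database error). Payloads that still fail after
    `max_retries` attempts are returned for later inspection.
    """
    failed = []
    while not q.empty():
        payload = q.get()
        for attempt in range(max_retries):
            try:
                insert(payload)
                break
            except ConnectionError:
                time.sleep(backoff * (2 ** attempt))  # exponential backoff
        else:
            failed.append(payload)  # exhausted all retries
    return failed

# Simulate a datastore that fails once before accepting each payload
attempts = {}
store = []

def flaky_insert(payload):
    attempts[payload] = attempts.get(payload, 0) + 1
    if attempts[payload] == 1:
        raise ConnectionError("temporary database outage")
    store.append(payload)

q = queue.Queue()
for trace in ["trace-1", "trace-2"]:
    q.put(trace)

failed = ingest_with_retries(q, flaky_insert)
print(store)   # ['trace-1', 'trace-2'] — both persisted after one retry each
print(failed)  # []
```

The real queue service adds durability (Redis-backed queues) and richer integrity checks, but the retry-until-persisted shape is the same.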
diff --git a/src/langgraph-platform/semantic-search.mdx b/src/langsmith/semantic-search.mdx similarity index 100% rename from src/langgraph-platform/semantic-search.mdx rename to src/langsmith/semantic-search.mdx diff --git a/src/langgraph-platform/server-a2a.mdx b/src/langsmith/server-a2a.mdx similarity index 100% rename from src/langgraph-platform/server-a2a.mdx rename to src/langsmith/server-a2a.mdx diff --git a/src/langgraph-platform/server-api-ref.mdx b/src/langsmith/server-api-ref.mdx similarity index 100% rename from src/langgraph-platform/server-api-ref.mdx rename to src/langsmith/server-api-ref.mdx diff --git a/src/langgraph-platform/server-mcp.mdx b/src/langsmith/server-mcp.mdx similarity index 100% rename from src/langgraph-platform/server-mcp.mdx rename to src/langsmith/server-mcp.mdx diff --git a/src/langgraph-platform/set-up-custom-auth.mdx b/src/langsmith/set-up-custom-auth.mdx similarity index 97% rename from src/langgraph-platform/set-up-custom-auth.mdx rename to src/langsmith/set-up-custom-auth.mdx index 99422b62f..a5d02251e 100644 --- a/src/langgraph-platform/set-up-custom-auth.mdx +++ b/src/langsmith/set-up-custom-auth.mdx @@ -142,7 +142,7 @@ Start the server again to test everything out: langgraph dev --no-browser ``` -If you didn't add the `--no-browser`, the studio UI will open in the browser. You may wonder, how is the studio able to still connect to our server? By default, we also permit access from the LangGraph studio, even when using custom auth. This makes it easier to develop and test your bot in the studio. You can remove this alternative authentication option by setting `disable_studio_auth: "true"` in your auth configuration: +If you didn't add the `--no-browser`, the studio UI will open in the browser. You may wonder, how is the studio able to still connect to our server? By default, we also permit access from the Studio, even when using custom auth. This makes it easier to develop and test your bot in the studio. 
You can remove this alternative authentication option by setting `disable_studio_auth: "true"` in your auth configuration: ```json { diff --git a/src/langgraph-platform/setup-app-requirements-txt.mdx b/src/langsmith/setup-app-requirements-txt.mdx similarity index 100% rename from src/langgraph-platform/setup-app-requirements-txt.mdx rename to src/langsmith/setup-app-requirements-txt.mdx diff --git a/src/langgraph-platform/setup-javascript.mdx b/src/langsmith/setup-javascript.mdx similarity index 100% rename from src/langgraph-platform/setup-javascript.mdx rename to src/langsmith/setup-javascript.mdx diff --git a/src/langgraph-platform/setup-pyproject.mdx b/src/langsmith/setup-pyproject.mdx similarity index 100% rename from src/langgraph-platform/setup-pyproject.mdx rename to src/langsmith/setup-pyproject.mdx diff --git a/src/langsmith/api-ref.mdx b/src/langsmith/smith-api-ref.mdx similarity index 68% rename from src/langsmith/api-ref.mdx rename to src/langsmith/smith-api-ref.mdx index b3accdba4..ba244d6d6 100644 --- a/src/langsmith/api-ref.mdx +++ b/src/langsmith/smith-api-ref.mdx @@ -1,5 +1,5 @@ --- title: LangSmith API reference -sidebarTitle: API reference +sidebarTitle: LangSmith API reference url: "https://api.smith.langchain.com/redoc" --- diff --git a/src/langsmith/js-ts-sdk.mdx b/src/langsmith/smith-js-ts-sdk.mdx similarity index 72% rename from src/langsmith/js-ts-sdk.mdx rename to src/langsmith/smith-js-ts-sdk.mdx index 45c677cdf..82238cf41 100644 --- a/src/langsmith/js-ts-sdk.mdx +++ b/src/langsmith/smith-js-ts-sdk.mdx @@ -1,5 +1,5 @@ --- title: LangSmith JS/TS SDK -sidebarTitle: JS/TS SDK +sidebarTitle: LangSmith JS/TS SDK url: "https://docs.smith.langchain.com/reference/js" --- diff --git a/src/langsmith/python-sdk.mdx b/src/langsmith/smith-python-sdk.mdx similarity index 74% rename from src/langsmith/python-sdk.mdx rename to src/langsmith/smith-python-sdk.mdx index 412319266..72a667f56 100644 --- a/src/langsmith/python-sdk.mdx +++ 
b/src/langsmith/smith-python-sdk.mdx @@ -1,5 +1,5 @@ --- title: LangSmith Python SDK -sidebarTitle: Python SDK +sidebarTitle: LangSmith Python SDK url: "https://docs.smith.langchain.com/reference/python/reference" --- diff --git a/src/langgraph-platform/stateless-runs.mdx b/src/langsmith/stateless-runs.mdx similarity index 100% rename from src/langgraph-platform/stateless-runs.mdx rename to src/langsmith/stateless-runs.mdx diff --git a/src/langgraph-platform/streaming.mdx b/src/langsmith/streaming.mdx similarity index 100% rename from src/langgraph-platform/streaming.mdx rename to src/langsmith/streaming.mdx diff --git a/src/langgraph-platform/troubleshooting-studio.mdx b/src/langsmith/troubleshooting-studio.mdx similarity index 91% rename from src/langgraph-platform/troubleshooting-studio.mdx rename to src/langsmith/troubleshooting-studio.mdx index 7dc48d792..d921d6d0c 100644 --- a/src/langgraph-platform/troubleshooting-studio.mdx +++ b/src/langsmith/troubleshooting-studio.mdx @@ -1,5 +1,5 @@ --- -title: LangGraph Studio troubleshooting +title: Studio troubleshooting sidebarTitle: Troubleshooting --- @@ -44,7 +44,7 @@ Brave blocks plain-HTTP traffic on localhost when Brave Shields are enabled. Whe Disable Brave Shields for LangSmith using the Brave icon in the URL bar. -![Brave Shields panel with Shields turned off for smith.langchain.com so LangGraph Studio can reach localhost.](/langgraph-platform/images/brave-shields.png) +![Brave Shields panel with Shields turned off for smith.langchain.com so Studio can reach localhost.](/langgraph-platform/images/brave-shields.png) ### Solution 2: Use Cloudflare Tunnel @@ -74,7 +74,7 @@ Use this URL in Brave to load Studio. Here, the `baseUrl` parameter specifies yo ## Graph Edge Issues Undefined conditional edges may show unexpected connections in your graph. This is -because without proper definition, LangGraph Studio assumes the conditional edge could access all other nodes. 
To address this, explicitly define the routing paths using one of these methods: +because without proper definition, Studio assumes the conditional edge could access all other nodes. To address this, explicitly define the routing paths using one of these methods: ### Solution 1: Path Map diff --git a/src/langgraph-platform/use-remote-graph.mdx b/src/langsmith/use-remote-graph.mdx similarity index 100% rename from src/langgraph-platform/use-remote-graph.mdx rename to src/langsmith/use-remote-graph.mdx diff --git a/src/langgraph-platform/use-stream-react.mdx b/src/langsmith/use-stream-react.mdx similarity index 100% rename from src/langgraph-platform/use-stream-react.mdx rename to src/langsmith/use-stream-react.mdx diff --git a/src/langsmith/use-studio.mdx b/src/langsmith/use-studio.mdx new file mode 100644 index 000000000..97f3e03c3 --- /dev/null +++ b/src/langsmith/use-studio.mdx @@ -0,0 +1,139 @@ +--- +title: How to use Studio +sidebarTitle: Runs, assistants, threads +--- + +This page describes the core workflows you’ll use in Studio. It explains how to run your application, manage assistant configurations, and work with conversation threads. Each section includes steps in both graph mode (full-featured view of your graph’s execution) and chat mode (lightweight conversational interface): + +- [Run application](#run-application): Execute your application or agent and observe its behavior. +- [Manage assistants](#manage-assistants): Create, edit, and select the assistant configuration used by your application. +- [Manage threads](#manage-threads): View and organize the threads, including forking or editing past runs for debugging. + +## Run application + + + + +### Specify input + +1. Define the input to your graph in the **Input** section on the left side of the page, below the graph interface. Studio will attempt to render a form for your input based on the graph's defined [state schema](/oss/langgraph/graph-api/#schema). 
To disable this, click the **View Raw** button, which will present you with a JSON editor. +1. Click the up or down arrows at the top of the **Input** section to toggle through and use previously submitted inputs. + +### Run settings + +#### Assistant + +To specify the [assistant](/langgraph-platform/assistants) that is used for the run: + +1. Click the **Settings** button in the bottom left corner. If an assistant is currently selected, the button will also list the assistant name. If no assistant is selected, it will say **Manage Assistants**. +1. Select the assistant to run. +1. Click the **Active** toggle at the top of the modal to activate it. + +For more information, refer to [Manage assistants](#manage-assistants). + +#### Streaming + +Click the dropdown next to **Submit** and click the toggle to enable or disable streaming. + +#### Breakpoints + +To run your graph with breakpoints: + +1. Click **Interrupt**. +1. Select a node and whether to pause before or after that node has executed. +1. Click **Continue** in the thread log to resume execution. + +For more information on breakpoints, refer to [Human-in-the-loop](/langgraph-platform/add-human-in-the-loop). + +### Submit run + +To submit the run with the specified input and run settings: + +1. Click the **Submit** button. This will add a [run](/langgraph-platform/assistants#execution) to the existing selected [thread](/oss/langgraph/persistence#threads). If no thread is currently selected, a new one will be created. +1. To cancel the ongoing run, click the **Cancel** button. + + + + +Specify the input to your chat application at the bottom of the conversation panel, then click the **Send message** button to submit it as a Human message and have the response streamed back. + +To cancel the ongoing run, click **Cancel**. + +Click the **Show tool calls** toggle to hide or show tool calls in the conversation.
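For reference, the input submitted through the **View Raw** JSON editor is simply an update to your graph's state. For a graph whose state schema includes a `messages` key (a common but hypothetical example schema), the raw input might look like:

```json
{
  "messages": [
    {
      "role": "human",
      "content": "What's the weather in SF?"
    }
  ]
}
```

The exact keys depend entirely on the state schema your graph defines.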
+ + + + +## Manage assistants + +Studio lets you view and edit your assistants, and run your graph using these assistant configurations. + +For more conceptual details, refer to the [Assistants overview](/langgraph-platform/assistants/). + + + + +To view your assistants: + +1. Click **Manage Assistants** in the bottom left corner. This opens a modal for you to view all the assistants for the selected graph. +1. Select the assistant and the version you would like to mark as **Active**. LangSmith will use this assistant when runs are submitted. + +The **Default configuration** option will be active, which reflects the default configuration defined in your graph. Edits made to this configuration will be used to update the run-time configuration, but will not update or create a new assistant unless you click **Create new assistant**. + + + + +Chat mode enables you to switch between the different assistants in your graph via the dropdown selector at the top of the page. To create, edit, or delete assistants, use Graph mode. + + + + +## Manage threads + +Studio provides tools to view all [threads](/oss/langgraph/persistence#threads) saved on the server and edit their state. You can create new threads, switch between threads, and modify past states both in graph mode and chat mode. + + + + +### View threads + +1. At the top of the right-hand pane, select the dropdown menu to view existing threads. +1. Select the desired thread, and the thread history will populate in the right-hand side of the page. +1. To create a new thread, click **+ New Thread** and [submit a run](#run-application). +1. To view more granular information in the thread, drag the slider at the top of the page to the right. To view less information, drag the slider to the left. Additionally, collapse or expand individual turns, nodes, and keys of the state. +1. Switch between `Pretty` and `JSON` mode for different rendering formats.
+ +### Edit thread history + +To edit the state of the thread: + +1. Select **Edit node state** next to the desired node. +1. Edit the node's output as desired and click **Fork** to confirm. This will create a new forked run from the checkpoint of the selected node. + +If you instead want to re-run the thread from a given checkpoint without editing the state, click **Re-run from here**. This will again create a new forked run from the selected checkpoint. This is useful for re-running with changes that are not specific to the state, such as the selected assistant. + + + + +1. View all threads in the right-hand pane of the page. +1. Select the desired thread and the thread history will populate in the center panel. +1. To create a new thread, click **+** and [submit a run](/langgraph-platform/use-studio#run-application#chat-mode). + +To edit a human message in the thread: + +1. Click **Edit node state** below the human message. +1. Edit the message as desired and submit. This will create a new fork of the conversation history. +1. To re-generate an AI message, click the retry icon below the AI message. + + + + +## Next steps + +Refer to the following guides for more detail on tasks you can complete in Studio: + +- [Iterate on prompts](/langsmith/observability-studio) +- [Run experiments over datasets](/langsmith/observability-studio#run-experiments-over-a-dataset) diff --git a/src/langgraph-platform/use-threads.mdx b/src/langsmith/use-threads.mdx similarity index 99% rename from src/langgraph-platform/use-threads.mdx rename to src/langsmith/use-threads.mdx index 6378f476f..2c1f9f6be 100644 --- a/src/langgraph-platform/use-threads.mdx +++ b/src/langsmith/use-threads.mdx @@ -498,4 +498,4 @@ You can also view threads in a deployment via the LangGraph Platform UI. Inside your deployment, select the "Threads" tab. This will load a table of all of the threads in your deployment. -Select a thread to inspect its current state. 
To view its full history and for further debugging, open the thread in [LangGraph Studio](/langgraph-platform/langgraph-studio). +Select a thread to inspect its current state. To view its full history and for further debugging, open the thread in [Studio](/langgraph-platform/langgraph-studio). diff --git a/src/langgraph-platform/use-webhooks.mdx b/src/langsmith/use-webhooks.mdx similarity index 100% rename from src/langgraph-platform/use-webhooks.mdx rename to src/langsmith/use-webhooks.mdx diff --git a/src/langsmith/why-langgraph.mdx b/src/langsmith/why-langgraph.mdx new file mode 100644 index 000000000..db2b3842c --- /dev/null +++ b/src/langsmith/why-langgraph.mdx @@ -0,0 +1,10 @@ +--- +title: Overview +sidebarTitle: Overview +--- + +**LangSmith** extends the [LangGraph](/oss/langgraph/overview) framework for building stateful, multi-agent applications as graphs, which allows you to define control flow, manage persistence, and coordinate interactions across components or agents. LangSmith is framework-agnostic, which means you can deploy and operate agents built with LangGraph or [another framework](/langgraph-platform/autogen-integration). + +To get acquainted with LangGraph's key concepts and features, complete the following [LangGraph quickstart](/oss/langgraph/quickstart). + +While the OSS framework introduces the core abstractions and execution model, LangSmith adds capabilities including managed infrastructure, [deployment models](/langgraph-platform/deployment-options), [assistants](/langgraph-platform/configuration-cloud), and [double-texting](/langgraph-platform/double-texting) support. These platform-level features support the full lifecycle of LangGraph applications, from development to production at scale. 
diff --git a/src/snippets/oss/studio.mdx b/src/snippets/oss/studio.mdx index d5d6a2188..10318c0e6 100644 --- a/src/snippets/oss/studio.mdx +++ b/src/snippets/oss/studio.mdx @@ -153,7 +153,7 @@ Your agent will be accessible via API (`http://127.0.0.1:2024`) and the Studio U Studio makes each step of your agent easily observable. Replay any input and inspect the exact prompt, tool arguments, return values, and token/latency metrics. If a tool throws an exception, Studio records it with surrounding state so you can spend less time debugging. -Keep your dev server running, edit prompts or tool signatures, and watch Studio hot-reload. Re-run the conversation thread from any step to verify behavior changes. See [Manage threads](/langgraph-platform/use-studio#manage-threads#edit-thread-history) for more details. +Keep your dev server running, edit prompts or tool signatures, and watch Studio hot-reload. Re-run the conversation thread from any step to verify behavior changes. See [Manage threads](/langsmith/use-studio#edit-thread-history) for more details. As your agent grows, the same view scales from a single-tool demo to multi-node graphs, keeping decisions legible and reproducible. 
From e31066d0e806fd15d31347ccf3277cb3c7de6a53 Mon Sep 17 00:00:00 2001 From: Kathryn May Date: Wed, 1 Oct 2025 14:07:05 -0400 Subject: [PATCH 02/18] Work on self-hosted section WIP --- .github/labeler.yml | 4 +- src/docs.json | 220 ++++++++-------- src/labs/oap/custom-agents/overview.mdx | 4 +- src/labs/oap/quickstart.mdx | 4 +- src/labs/oap/setup/agents.mdx | 6 +- src/labs/swe/setup/monorepo.mdx | 16 +- src/langgraph-platform/index.mdx | 18 +- .../observability-studio.mdx | 214 ---------------- src/langgraph-platform/use-studio.mdx | 14 +- src/langsmith/add-auth-server.mdx | 14 +- src/langsmith/add-human-in-the-loop.mdx | 4 +- src/langsmith/api-ref-control-plane.mdx | 2 +- src/langsmith/application-structure.mdx | 8 +- src/langsmith/assistants.mdx | 12 +- src/langsmith/auth.mdx | 22 +- src/langsmith/autogen-integration.mdx | 18 +- src/langsmith/cli.mdx | 34 +-- .../cloud-architecture-and-scalability.mdx | 2 +- src/langsmith/cloud.mdx | 10 +- src/langsmith/components.mdx | 22 +- src/langsmith/composite-evaluators.mdx | 2 +- src/langsmith/configurable-headers.mdx | 2 +- src/langsmith/configuration-cloud.mdx | 22 +- src/langsmith/configure-ttl.mdx | 4 +- src/langsmith/control-plane.mdx | 46 ++-- src/langsmith/cron-jobs.mdx | 6 +- src/langsmith/custom-auth.mdx | 12 +- src/langsmith/custom-lifespan.mdx | 10 +- src/langsmith/custom-middleware.mdx | 10 +- src/langsmith/custom-routes.mdx | 10 +- src/langsmith/data-plane.mdx | 62 ++--- src/langsmith/data-storage-and-privacy.mdx | 10 +- src/langsmith/deploy-hybrid.mdx | 21 +- .../deploy-self-hosted-full-platform.mdx | 30 +-- src/langsmith/deploy-standalone-server.mdx | 15 +- src/langsmith/deploy-to-cloud.mdx | 45 ++-- src/langsmith/deployment-options.mdx | 24 +- src/langsmith/deployment-quickstart.mdx | 14 +- src/langsmith/double-texting.mdx | 18 +- src/langsmith/egress-metrics-metadata.mdx | 8 +- src/langsmith/enqueue-concurrent.mdx | 2 +- src/langsmith/env-var.mdx | 12 +- src/langsmith/evaluation.mdx | 2 +- 
src/langsmith/export-backend.mdx | 2 +- src/langsmith/faq.mdx | 18 +- src/langsmith/generative-ui-react.mdx | 16 +- src/langsmith/graph-rebuild.mdx | 4 +- src/langsmith/home.mdx | 7 +- .../human-in-the-loop-time-travel.mdx | 2 +- src/langsmith/hybrid.mdx | 99 +++----- .../images/assistants.png | Bin .../images/authentication.png | Bin .../images/authorization.png | Bin .../images/autogen-output.png | Bin .../images/brave-shields.png | Bin .../images/double-texting.png | Bin .../images/generative-ui-sample.jpg | Bin .../images/hybrid-architecture.png | Bin .../langgraph-cloud-architecture.excalidraw | 0 .../images/langgraph-cloud-architecture.png | Bin ...graph-platform-deployment-architecture.png | Bin .../images/lg-platform.png | Bin .../images/lg-studio.png | Bin .../images/no-auth.png | Bin ...self-hosted-full-platform-architecture.png | Bin src/langsmith/interrupt-concurrent.mdx | 2 +- src/langsmith/kubernetes.mdx | 6 +- src/langsmith/langgraph-cli.mdx | 12 +- src/langsmith/langgraph-platform-logs.mdx | 4 +- src/langsmith/langgraph-server-changelog.mdx | 2 +- src/langsmith/langgraph-server.mdx | 24 +- src/langsmith/langgraph-studio.mdx | 12 +- src/langsmith/local-server.mdx | 10 +- src/langsmith/monorepo-support.mdx | 2 +- src/langsmith/observability-studio.mdx | 6 +- src/langsmith/observability.mdx | 2 +- src/langsmith/openapi-security.mdx | 16 +- src/langsmith/prompt-engineering.mdx | 2 +- src/langsmith/quick-start-studio.mdx | 14 +- src/langsmith/reference-overview.mdx | 22 +- src/langsmith/reject-concurrent.mdx | 2 +- src/langsmith/release-versions.mdx | 4 +- src/langsmith/resource-auth.mdx | 22 +- src/langsmith/rollback-concurrent.mdx | 2 +- src/langsmith/same-thread.mdx | 2 +- src/langsmith/scalability-and-resilience.mdx | 4 +- .../script-running-pg-support-queries.mdx | 4 +- src/langsmith/sdk.mdx | 10 +- src/langsmith/self-host-usage.mdx | 2 +- src/langsmith/self-hosted.mdx | 234 ++++++++---------- src/langsmith/semantic-search.mdx | 2 +- 
src/langsmith/server-a2a.mdx | 6 +- src/langsmith/server-api-ref.mdx | 2 +- src/langsmith/server-mcp.mdx | 16 +- src/langsmith/set-up-custom-auth.mdx | 24 +- src/langsmith/setup-app-requirements-txt.mdx | 12 +- src/langsmith/setup-javascript.mdx | 10 +- src/langsmith/setup-pyproject.mdx | 14 +- src/langsmith/stateless-runs.mdx | 2 +- src/langsmith/streaming.mdx | 16 +- src/langsmith/troubleshooting-studio.mdx | 4 +- src/langsmith/use-remote-graph.mdx | 8 +- src/langsmith/use-stream-react.mdx | 6 +- src/langsmith/use-studio.mdx | 10 +- src/langsmith/use-threads.mdx | 20 +- src/langsmith/use-webhooks.mdx | 14 +- src/langsmith/why-langgraph.mdx | 4 +- src/oss/concepts/memory.mdx | 2 +- src/oss/langchain/deploy.mdx | 4 +- src/oss/langchain/sql-agent.mdx | 2 +- src/oss/langgraph/add-human-in-the-loop.mdx | 2 +- src/oss/langgraph/application-structure.mdx | 8 +- src/oss/langgraph/deploy.mdx | 4 +- src/oss/langgraph/local-server.mdx | 8 +- src/oss/langgraph/observability.mdx | 2 +- src/oss/langgraph/overview.mdx | 4 +- src/oss/langgraph/persistence.mdx | 14 +- src/oss/langgraph/use-functional-api.mdx | 2 +- src/oss/langgraph/use-graph-api.mdx | 2 +- .../python/integrations/chat/anthropic.mdx | 2 +- src/snippets/oss/deploy.mdx | 4 +- src/snippets/oss/studio.mdx | 4 +- src/snippets/oss/ui.mdx | 4 +- 123 files changed, 790 insertions(+), 1053 deletions(-) delete mode 100644 src/langgraph-platform/observability-studio.mdx rename src/{langgraph-platform => langsmith}/images/assistants.png (100%) rename src/{langgraph-platform => langsmith}/images/authentication.png (100%) rename src/{langgraph-platform => langsmith}/images/authorization.png (100%) rename src/{langgraph-platform => langsmith}/images/autogen-output.png (100%) rename src/{langgraph-platform => langsmith}/images/brave-shields.png (100%) rename src/{langgraph-platform => langsmith}/images/double-texting.png (100%) rename src/{langgraph-platform => langsmith}/images/generative-ui-sample.jpg (100%) rename 
src/{langgraph-platform => langsmith}/images/hybrid-architecture.png (100%) rename src/{langgraph-platform => langsmith}/images/langgraph-cloud-architecture.excalidraw (100%) rename src/{langgraph-platform => langsmith}/images/langgraph-cloud-architecture.png (100%) rename src/{langgraph-platform => langsmith}/images/langgraph-platform-deployment-architecture.png (100%) rename src/{langgraph-platform => langsmith}/images/lg-platform.png (100%) rename src/{langgraph-platform => langsmith}/images/lg-studio.png (100%) rename src/{langgraph-platform => langsmith}/images/no-auth.png (100%) rename src/{langgraph-platform => langsmith}/images/self-hosted-full-platform-architecture.png (100%) diff --git a/.github/labeler.yml b/.github/labeler.yml index ab2aa25a5..096a9eb1b 100644 --- a/.github/labeler.yml +++ b/.github/labeler.yml @@ -37,8 +37,8 @@ oss: langgraph-platform: - changed-files: - any-glob-to-any-file: - - src/langgraph-platform/** - - src/langgraph-platform/**/* + - src/langsmith/** + - src/langsmith/**/* # Label for Labs documentation changes labs: diff --git a/src/docs.json b/src/docs.json index 7a7c04938..4b5eb3280 100644 --- a/src/docs.json +++ b/src/docs.json @@ -478,9 +478,9 @@ ] }, { - "dropdown": "LangSmith", + "dropdown": "LangSmith Platform", "icon": "rocket", - "description": "Platform for deployment, LLM observability, and evaluation", + "description": "LLM observability, evaluation, and deployment", "tabs": [ { "tab": "Get started", @@ -851,16 +851,7 @@ { "group": "Overview", "pages": [ - "langsmith/deployment-options", - { - "group": "Platform components", - "pages": [ - "langsmith/components", - "langsmith/langgraph-server", - "langsmith/data-plane", - "langsmith/control-plane" - ] - } + "langsmith/deployment-options" ] }, { @@ -882,126 +873,135 @@ "pages": [ "langsmith/self-hosted", { - "group": "Deployment guides", + "group": "Setup guides", "pages": [ + { + "group": "LangSmith Platform", + "icon": "server", + "pages": [ + 
"langsmith/kubernetes", + "langsmith/docker" + ] + }, "langsmith/deploy-self-hosted-full-platform", - "langsmith/deploy-standalone-server" + "langsmith/deploy-standalone-server", + { + "group": "Manage an installation", + "pages": [ + "langsmith/self-host-usage", + "langsmith/self-host-upgrades", + "langsmith/self-host-egress", + "langsmith/self-host-organization-charts", + "langsmith/langsmith-managed-clickhouse", + "langsmith/self-host-ingress", + "langsmith/self-host-mirroring-images" + ] + } ] }, { "group": "Configuration", "pages": [ + "langsmith/self-host-scale", + "langsmith/self-host-ttl", "langsmith/custom-docker", - "langsmith/egress-metrics-metadata" + "langsmith/egress-metrics-metadata", + "langsmith/self-host-playground-environment-settings", + "langsmith/troubleshooting" ] }, { - "group": "(Existing LS content to organize) Self-host", + "group": "Connect external services", "pages": [ - "langsmith/architectural-overview", - { - "group": "Setup", - "pages": [ - "langsmith/kubernetes", - "langsmith/docker", - "langsmith/self-host-usage", - "langsmith/self-host-upgrades", - "langsmith/self-host-egress", - "langsmith/self-host-organization-charts", - "langsmith/langsmith-managed-clickhouse" - ] - }, - { - "group": "Configuration", - "pages": [ - "langsmith/self-host-scale", - "langsmith/self-host-ttl", - "langsmith/self-host-ingress", - "langsmith/self-host-mirroring-images", - "langsmith/self-host-playground-environment-settings", - "langsmith/troubleshooting" - ] - }, - { - "group": "Authentication & access control", - "pages": [ - "langsmith/self-host-basic-auth", - "langsmith/self-host-sso", - "langsmith/self-host-user-management", - "langsmith/self-host-custom-tls-certificates", - "langsmith/self-host-using-an-existing-secret" - ] - }, - { - "group": "Connect external services", - "pages": [ - "langsmith/self-host-blob-storage", - "langsmith/self-host-external-clickhouse", - "langsmith/self-host-external-postgres", - 
"langsmith/self-host-external-redis" - ] - }, - { - "group": "Scripts", - "pages": [ - "langsmith/script-delete-a-workspace", - "langsmith/script-delete-an-organization", - "langsmith/script-delete-traces", - "langsmith/script-generate-clickhouse-stats", - "langsmith/script-generate-query-stats", - "langsmith/script-running-pg-support-queries", - "langsmith/script-running-ch-support-queries" - ] - }, - { - "group": "Observability", - "pages": [ - "langsmith/export-backend", - "langsmith/langsmith-collector", - "langsmith/observability-stack" - ] - } + "langsmith/self-host-blob-storage", + "langsmith/self-host-external-clickhouse", + "langsmith/self-host-external-postgres", + "langsmith/self-host-external-redis" + ] + }, + { + "group": "Platform components", + "pages": [ + "langsmith/components", + "langsmith/langgraph-server", + "langsmith/data-plane", + "langsmith/control-plane" + ] + }, + { + "group": "Platform auth & access control", + "pages": [ + "langsmith/self-host-basic-auth", + "langsmith/self-host-sso", + "langsmith/self-host-user-management", + "langsmith/self-host-custom-tls-certificates", + "langsmith/self-host-using-an-existing-secret" + ] + }, + { + "group": "Self-hosted observability", + "pages": [ + "langsmith/export-backend", + "langsmith/langsmith-collector", + "langsmith/observability-stack" + ] + }, + { + "group": "Scripts for management tasks", + "pages": [ + "langsmith/script-delete-a-workspace", + "langsmith/script-delete-an-organization", + "langsmith/script-delete-traces", + "langsmith/script-generate-clickhouse-stats", + "langsmith/script-generate-query-stats", + "langsmith/script-running-pg-support-queries", + "langsmith/script-running-ch-support-queries" ] } ] }, { - "group": "Configure app for deployment", + "group": "Agent deployment", "pages": [ - "langsmith/application-structure", { - "group": "Setup", + "group": "Configure for deployment", "pages": [ - "langsmith/setup-app-requirements-txt", - "langsmith/setup-pyproject", - 
"langsmith/setup-javascript" + "langsmith/application-structure", + { + "group": "Setup", + "pages": [ + "langsmith/setup-app-requirements-txt", + "langsmith/setup-pyproject", + "langsmith/setup-javascript" + ] + }, + "langsmith/graph-rebuild", + "langsmith/use-remote-graph", + "langsmith/monorepo-support", + "langsmith/semantic-search", + "langsmith/configure-ttl" ] }, - "langsmith/graph-rebuild", - "langsmith/use-remote-graph", - "langsmith/monorepo-support", - "langsmith/semantic-search", - "langsmith/configure-ttl" - ] - }, - { - "group": "Authentication & access control", - "pages": [ - "langsmith/auth", - "langsmith/custom-auth", - "langsmith/set-up-custom-auth", - "langsmith/resource-auth", - "langsmith/add-auth-server", - "langsmith/openapi-security", - "langsmith/agent-auth" - ] - }, - { - "group": "Server customization", - "pages": [ - "langsmith/custom-lifespan", - "langsmith/custom-middleware", - "langsmith/custom-routes" + { + "group": "Auth & access control", + "pages": [ + "langsmith/auth", + "langsmith/custom-auth", + "langsmith/set-up-custom-auth", + "langsmith/resource-auth", + "langsmith/add-auth-server", + "langsmith/openapi-security", + "langsmith/agent-auth" + ] + }, + { + "group": "Server customization", + "pages": [ + "langsmith/custom-lifespan", + "langsmith/custom-middleware", + "langsmith/custom-routes" + ] + } ] } ] diff --git a/src/labs/oap/custom-agents/overview.mdx b/src/labs/oap/custom-agents/overview.mdx index cc805ed81..b1d188e9a 100644 --- a/src/labs/oap/custom-agents/overview.mdx +++ b/src/labs/oap/custom-agents/overview.mdx @@ -9,7 +9,7 @@ We built Open Agent Platform with custom agents in mind. Although we offer a few ## Platform Requirements -OAP is built on top of LangGraph Platform, which means all agents which you build to be used in OAP must be LangGraph agents deployed on LangGraph Platform. 
+OAP is built on top of LangSmith Platform, which means all agents you build for use in OAP must be LangGraph agents deployed on LangSmith Platform. ## Agent Types @@ -24,7 +24,7 @@ When building custom agents, you can create three main types: The typical development flow for creating custom agents involves: 1. Developing and testing your agent locally using LangGraph -2. Deploying your agent to LangGraph Platform +2. Deploying your agent to LangSmith Platform 3. Configuring your Open Agent Platform to connect to your deployed agent 4. Testing and refining your agent through the OAP interface diff --git a/src/labs/oap/quickstart.mdx b/src/labs/oap/quickstart.mdx index e957f5677..9142bc5e3 100644 --- a/src/labs/oap/quickstart.mdx +++ b/src/labs/oap/quickstart.mdx @@ -70,7 +70,7 @@ The next step in setting up Open Agent Platform is to deploy and configure your For each agent repository: 1. Clone the repository 2. Follow the instructions in the README - 3. Deploy the agents to LangGraph Platform, or run `langgraph dev` to run the agents locally. Optionally pass `--port ` to use a custom port. This is useful if running multiple graphs locally. + 3. Deploy the agents to LangSmith Platform, or run `langgraph dev` to run the agents locally. Optionally pass `--port <port>` to use a custom port. This is useful if running multiple graphs locally. After deployment, create a configuration object for each agent (for the next step): @@ -85,7 +85,7 @@ The next step in setting up Open Agent Platform is to deploy and configure your } ``` - You can find your project & tenant IDs with a GET request to the `/info` endpoint on your LangGraph Platform deployment. + You can find your project & tenant IDs with a GET request to the `/info` endpoint on your LangSmith Platform deployment. If you are running agents locally via `langgraph dev`, the `id` (project ID), and `tenantId` should be any valid UUID version 4, such as those generated by `uuid.uuid4()`.
Ensure each graph has a unique `id`, and all graphs share the same `tenantId`. diff --git a/src/labs/oap/setup/agents.mdx b/src/labs/oap/setup/agents.mdx index 51fac0f81..2e6c96abf 100644 --- a/src/labs/oap/setup/agents.mdx +++ b/src/labs/oap/setup/agents.mdx @@ -15,7 +15,7 @@ To use these agents in your instance, you should: 1. Clone the repositories 2. Follow the instructions in the READMEs -3. Deploy the agents to LangGraph Platform, or run `langgraph dev` to run the agents locally. +3. Deploy the agents to LangSmith Platform, or run `langgraph dev` to run the agents locally. ## API Keys @@ -25,7 +25,7 @@ We've exposed `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`, and `TAVI The keys that users set in this page will be passed to any agents as part of the runtime config, under the `apiKeys` configurable field. -As a developer, you can choose to require users to bring their own API keys, or to fallback to the environment variables set in LangGraph Platform deployment itself. +As a developer, you can choose to require users to bring their own API keys, or to fall back to the environment variables set in the LangSmith Platform deployment itself. When using the demo instance of OAP, the Tools Agent and Supervisor will first use any API keys set by the user in OAP, but will fall back to API Keys set by the LangChain team in the base deployments. @@ -52,7 +52,7 @@ Once deployed or running locally, you can connect them to your instance of Open ## Finding Project & Tenant IDs -To easily find your project & tenant IDs for agents deployed to LangGraph Platform, you can make a GET request to the `/info` endpoint of your deployment URL. Then, copy the `project_id` value into `id`, and `tenant_id` into `tenantId`. +To easily find your project & tenant IDs for agents deployed to LangSmith Platform, you can make a GET request to the `/info` endpoint of your deployment URL. Then, copy the `project_id` value into `id`, and `tenant_id` into `tenantId`.
## Setting Environment Variables diff --git a/src/labs/swe/setup/monorepo.mdx b/src/labs/swe/setup/monorepo.mdx index 840273d38..ad3bc5f9a 100644 --- a/src/labs/swe/setup/monorepo.mdx +++ b/src/labs/swe/setup/monorepo.mdx @@ -78,20 +78,20 @@ Run these commands from the repository root: Always install dependencies in the specific package where they're used, never in the root `package.json` unless adding a resolution. - + ```bash # Correct - install in specific package cd apps/web yarn add some-package - + # Incorrect - don't install in root yarn add some-package ``` - + When multiple packages need the same dependency, add a resolution to the root `package.json` to ensure version consistency. - + ```json { "resolutions": { @@ -101,10 +101,10 @@ Run these commands from the repository root: } ``` - + Run `yarn build` from the root when making changes to `packages/shared` to make them available to other packages. - + ```bash # After modifying packages/shared yarn build @@ -125,7 +125,7 @@ The `apps/open-swe` package includes a critical `postinstall` hook: ``` - This postinstall hook is **required** for LangGraph Platform deployment. Since + This postinstall hook is **required** for LangSmith Platform deployment. Since Open SWE is a monorepo and the agent requires access to built files from the shared package, we must run the build process before starting the LangGraph server. @@ -133,7 +133,7 @@ The `apps/open-swe` package includes a critical `postinstall` hook: ### Why This Matters -1. **Deployment Compatibility**: The LangGraph Platform needs all dependencies built and available +1. **Deployment Compatibility**: The LangSmith Platform needs all dependencies built and available 2. **Shared Package Access**: The agent imports utilities from `@openswe/shared` which must be compiled 3. 
**Build Order**: Ensures the shared package is built before the agent attempts to use it diff --git a/src/langgraph-platform/index.mdx b/src/langgraph-platform/index.mdx index 73c2d7b72..2476ca6eb 100644 --- a/src/langgraph-platform/index.mdx +++ b/src/langgraph-platform/index.mdx @@ -6,7 +6,7 @@ mode: wide LangGraph Platform is a runtime for deploying and managing long-running, stateful agent workflows in production. It provides APIs for execution, persistence, monitoring, and scaling of agent applications. Agents built with [LangGraph](/oss/langgraph/overview) or other frameworks can be hosted on the platform and exposed through managed endpoints. -Choose from cloud, hybrid, or self-hosted deployments based on your infrastructure requirements. For more details, refer to the [Deployment options](/langgraph-platform/deployment-options) page. +Choose from cloud, hybrid, or self-hosted deployments based on your infrastructure requirements. For more details, refer to the [Deployment options](/langsmith/deployment-options) page. 
## Quickstarts @@ -15,7 +15,7 @@ Choose from cloud, hybrid, or self-hosted deployments based on your infrastructu @@ -25,7 +25,7 @@ Choose from cloud, hybrid, or self-hosted deployments based on your infrastructu @@ -35,7 +35,7 @@ Choose from cloud, hybrid, or self-hosted deployments based on your infrastructu @@ -52,7 +52,7 @@ Choose from cloud, hybrid, or self-hosted deployments based on your infrastructu @@ -63,7 +63,7 @@ Choose from cloud, hybrid, or self-hosted deployments based on your infrastructu @@ -74,7 +74,7 @@ Choose from cloud, hybrid, or self-hosted deployments based on your infrastructu @@ -85,7 +85,7 @@ Choose from cloud, hybrid, or self-hosted deployments based on your infrastructu @@ -107,7 +107,7 @@ Choose from cloud, hybrid, or self-hosted deployments based on your infrastructu diff --git a/src/langgraph-platform/observability-studio.mdx b/src/langgraph-platform/observability-studio.mdx deleted file mode 100644 index 20de79431..000000000 --- a/src/langgraph-platform/observability-studio.mdx +++ /dev/null @@ -1,214 +0,0 @@ ---- -title: Observability in Studio -sidebarTitle: Traces, datasets, prompts ---- - -Studio provides tools to inspect, debug, and improve your app beyond execution. By working with traces, datasets, and prompts, you can see how your application behaves in detail, measure its performance, and refine its outputs: - -- [Iterate on prompts](#iterate-on-prompts): Modify prompts inside graph nodes directly or with the LangSmith playground. -- [Run experiments over a dataset](#run-experiments-over-a-dataset): Execute your assistant over a LangSmith dataset to score and compare results. -- [Debug LangSmith traces](#debug-langsmith-traces): Import traced runs into Studio and optionally clone them into your local agent. -- [Add a node to a dataset](#add-node-to-dataset): Turn parts of thread history into dataset examples for evaluation or further analysis. 
- -## Iterate on prompts - -Studio supports the following methods for modifying prompts in your graph: - -- [Direct node editing](#direct-node-editing) -- [Playground interface](#playground) - -### Direct node editing - -Studio allows you to edit prompts used inside individual nodes, directly from the graph interface. - -#### Graph Configuration - -Define your [configuration](/oss/langgraph/use-graph-api#add-runtime-configuration) to specify prompt fields and their associated nodes using `langgraph_nodes` and `langgraph_type` keys. - -##### `langgraph_nodes` - -- **Description**: Specifies which nodes of the graph a configuration field is associated with. -- **Value Type**: Array of strings, where each string is the name of a node in your graph. -- **Usage Context**: Include in the `json_schema_extra` dictionary for Pydantic models or the `metadata["json_schema_extra"]` dictionary for dataclasses. -- **Example**: - ```python - system_prompt: str = Field( - default="You are a helpful AI assistant.", - json_schema_extra={"langgraph_nodes": ["call_model", "other_node"]}, - ) - ``` - -##### `langgraph_type` - -- **Description**: Specifies the type of configuration field, which determines how it's handled in the UI. -- **Value Type**: String -- **Supported Values**: - * `"prompt"`: Indicates the field contains prompt text that should be treated specially in the UI. -- **Usage Context**: Include in the `json_schema_extra` dictionary for Pydantic models or the `metadata["json_schema_extra"]` dictionary for dataclasses. 
-- **Example**: - ```python - system_prompt: str = Field( - default="You are a helpful AI assistant.", - json_schema_extra={ - "langgraph_nodes": ["call_model"], - "langgraph_type": "prompt", - }, - ) - ``` - - - -```python -## Using Pydantic -from pydantic import BaseModel, Field -from typing import Annotated, Literal - -class Configuration(BaseModel): - """The configuration for the agent.""" - - system_prompt: str = Field( - default="You are a helpful AI assistant.", - description="The system prompt to use for the agent's interactions. " - "This prompt sets the context and behavior for the agent.", - json_schema_extra={ - "langgraph_nodes": ["call_model"], - "langgraph_type": "prompt", - }, - ) - - model: Annotated[ - Literal[ - "anthropic/claude-3-7-sonnet-latest", - "anthropic/claude-3-5-haiku-latest", - "openai/o1", - "openai/gpt-4o-mini", - "openai/o1-mini", - "openai/o3-mini", - ], - {"__template_metadata__": {"kind": "llm"}}, - ] = Field( - default="openai/gpt-4o-mini", - description="The name of the language model to use for the agent's main interactions. " - "Should be in the form: provider/model-name.", - json_schema_extra={"langgraph_nodes": ["call_model"]}, - ) - -## Using Dataclasses -from dataclasses import dataclass, field - -@dataclass(kw_only=True) -class Configuration: - """The configuration for the agent.""" - - system_prompt: str = field( - default="You are a helpful AI assistant.", - metadata={ - "description": "The system prompt to use for the agent's interactions. " - "This prompt sets the context and behavior for the agent.", - "json_schema_extra": {"langgraph_nodes": ["call_model"]}, - }, - ) - - model: Annotated[str, {"__template_metadata__": {"kind": "llm"}}] = field( - default="anthropic/claude-3-5-sonnet-20240620", - metadata={ - "description": "The name of the language model to use for the agent's main interactions. 
" - "Should be in the form: provider/model-name.", - "json_schema_extra": {"langgraph_nodes": ["call_model"]}, - }, - ) - -``` - - - -#### Editing prompts in the UI - -1. Locate the gear icon on nodes with associated configuration fields. -1. Click to open the configuration modal. -1. Edit the values. -1. Save to update the current assistant version or create a new one. - -### Playground - -The [playground](/langsmith/create-a-prompt) interface allows testing individual LLM calls without running the full graph: - -1. Select a thread. -1. Click **View LLM Runs** on a node. This lists all the LLM calls (if any) made inside the node. -1. Select an LLM run to open in the playground. -1. Modify prompts and test different model and tool settings. -1. Copy updated prompts back to your graph. - -## Run experiments over a dataset - -Studio lets you run [evaluations](/langsmith/evaluation-concepts) by executing your assistant against a predefined LangSmith [dataset](/langsmith/evaluation-concepts#datasets). This allows you to test performance across a variety of inputs, compare outputs to reference answers, and score results with configured [evaluators](/langsmith/evaluation-concepts#evaluators). - -This guide shows you how to run a full end-to-end experiment directly from Studio. - -### Prerequisites - -Before running an experiment, ensure you have the following: - -- **A LangSmith dataset**: Your dataset should contain the inputs you want to test and optionally, reference outputs for comparison. The schema for the inputs must match the required input schema for the assistant. For more information on schemas, see [here](/oss/langgraph/use-graph-api#schema). For more on creating datasets, refer to [How to Manage Datasets](/langsmith/manage-datasets-in-application#set-up-your-dataset). -- **(Optional) Evaluators**: You can attach evaluators (e.g., LLM-as-a-Judge, heuristics, or custom functions) to your dataset in LangSmith. 
These will run automatically after the graph has processed all inputs. -- **A running application**: The experiment can be run against: - - An application deployed on [LangGraph Platform](/langgraph-platform/deployment-quickstart). - - A locally running application started via the [langgraph-cli](/langgraph-platform/local-server). - -### Experiment setup - -1. Launch the experiment. Click the **Run experiment** button in the top right corner of the Studio page. -1. Select your dataset. In the modal that appears, select the dataset (or a specific dataset split) to use for the experiment and click **Start**. -1. Monitor the progress. All of the inputs in the dataset will now be run against the active assistant. Monitor the experiment's progress via the badge in the top right corner. -1. You can continue to work in Studio while the experiment runs in the background. Click the arrow icon button at any time to navigate to LangSmith and view the detailed experiment results. - -## Debug LangSmith traces - -This guide explains how to open LangSmith traces in Studio for interactive investigation and debugging. - -### Open deployed threads - -1. Open the LangSmith trace, selecting the root run. -1. Click **Run in Studio**. - -This will open Studio connected to the associated deployment with the trace's parent thread selected. - -### Testing local agents with remote traces - -This section explains how to test a local agent against remote traces from LangSmith. This enables you to use production traces as input for local testing, allowing you to debug and verify agent modifications in your development environment. - -#### Prerequisites - -- A LangSmith traced thread -- A [locally running agent](/langgraph-platform/local-server#local-development-server). - - -**Local agent requirements** -* langgraph>=0.3.18 -* langgraph-api>=0.0.32 -* Contains the same set of nodes present in the remote trace - - -#### Clone thread - -1. Open the LangSmith trace, selecting the root run. -2. 
Click the dropdown next to **Run in Studio**. -3. Enter your local agent's URL. -4. Select **Clone thread locally**. -5. If multiple graphs exist, select the target graph. - -A new thread will be created in your local agent with the thread history inferred and copied from the remote thread, and you will be navigated to Studio for your locally running application. - -## Add node to dataset - -Add [examples](/langsmith/evaluation-concepts#examples) to [LangSmith datasets](/langsmith/manage-datasets) from nodes in the thread log. This is useful to evaluate individual steps of the agent. - -1. Select a thread. -1. Click **Add to Dataset**. -1. Select nodes whose input/output you want to add to a dataset. -1. For each selected node, select the target dataset to create the example in. By default a dataset for the specific assistant and node will be selected. If this dataset does not yet exist, it will be created. -1. Edit the example's input/output as needed before adding it to the dataset. -1. Select **Add to dataset** at the bottom of the page to add all selected nodes to their respective datasets. - -For more details, refer to [How to evaluate an application's intermediate steps](/langsmith/evaluate-on-intermediate-steps). - diff --git a/src/langgraph-platform/use-studio.mdx b/src/langgraph-platform/use-studio.mdx index 6822ed633..64d2bcc61 100644 --- a/src/langgraph-platform/use-studio.mdx +++ b/src/langgraph-platform/use-studio.mdx @@ -23,7 +23,7 @@ This page describes the core workflows you’ll use in Studio. It explains how t #### Assistant -To specify the [assistant](/langgraph-platform/assistants) that is used for the run: +To specify the [assistant](/langsmith/assistants) that is used for the run: 1. Click the **Settings** button in the bottom left corner. If an assistant is currently selected the button will also list the assistant name. If no assistant is selected it will say **Manage Assistants**. 1. Select the assistant to run. 
@@ -43,13 +43,13 @@ To run your graph with breakpoints: 1. Select a node and whether to pause before or after that node has executed. 1. Click **Continue** in the thread log to resume execution. -For more information on breakpoints, refer to [Human-in-the-loop](/langgraph-platform/add-human-in-the-loop). +For more information on breakpoints, refer to [Human-in-the-loop](/langsmith/add-human-in-the-loop). ### Submit run To submit the run with the specified input and run settings: -1. Click the **Submit** button. This will add a [run](/langgraph-platform/assistants#execution) to the existing selected [thread](/oss/langgraph/persistence#threads). If no thread is currently selected, a new one will be created. +1. Click the **Submit** button. This will add a [run](/langsmith/assistants#execution) to the existing selected [thread](/oss/langgraph/persistence#threads). If no thread is currently selected, a new one will be created. 1. To cancel the ongoing run, click the **Cancel** button. @@ -71,7 +71,7 @@ To cancel the ongoing run: Studio lets you view, edit, and update your assistants, and allows you to run your graph using these assistant configurations. -For more conceptual details, refer to the [Assistants overview](/langgraph-platform/assistants/). +For more conceptual details, refer to the [Assistants overview](/langsmith/assistants/). @@ -120,7 +120,7 @@ If you instead want to re-run the thread from a given checkpoint without editing 1. View all threads in the right-hand pane of the page. 1. Select the desired thread and the thread history will populate in the center panel. -1. To create a new thread, click **+** and [submit a run](/langgraph-platform/use-studio#run-application#chat-mode). +1. To create a new thread, click **+** and [submit a run](/langsmith/use-studio#submit-run).
To edit a human message in the thread: @@ -135,5 +135,5 @@ To edit a human message in the thread: Refer to the following guides for more detail on tasks you can complete in Studio: -- [Iterate on prompts](/langgraph-platform/observability-studio#iterate-on-prompts) -- [Run experiments over datasets](/langgraph-platform/observability-studio#run-experiments-over-a-dataset) +- [Iterate on prompts](/langsmith/observability-studio#iterate-on-prompts) +- [Run experiments over datasets](/langsmith/observability-studio#run-experiments-over-a-dataset) diff --git a/src/langsmith/add-auth-server.mdx b/src/langsmith/add-auth-server.mdx index 02017263e..64343bee4 100644 --- a/src/langsmith/add-auth-server.mdx +++ b/src/langsmith/add-auth-server.mdx @@ -2,9 +2,9 @@ title: Connect an authentication provider sidebarTitle: Connect an authentication provider --- -In [the last tutorial](/langgraph-platform/resource-auth), you added resource authorization to give users private conversations. However, you are still using hard-coded tokens for authentication, which is not secure. Now you'll replace those tokens with real user accounts using [OAuth2](/langgraph-platform/deployment-quickstart). +In [the last tutorial](/langsmith/resource-auth), you added resource authorization to give users private conversations. However, you are still using hard-coded tokens for authentication, which is not secure. Now you'll replace those tokens with real user accounts using [OAuth2](/langsmith/deployment-quickstart). -You'll keep the same [`Auth`](/langgraph-platform/python-sdk#langgraph_sdk.auth.Auth) object and [resource-level access control](/langgraph-platform/auth#single-owner-resources), but upgrade authentication to use Supabase as your identity provider. While Supabase is used in this tutorial, the concepts apply to any OAuth2 provider. 
You'll learn how to: +You'll keep the same [`Auth`](/langsmith/python-sdk#langgraph_sdk.auth.Auth) object and [resource-level access control](/langsmith/auth#single-owner-resources), but upgrade authentication to use Supabase as your identity provider. While Supabase is used in this tutorial, the concepts apply to any OAuth2 provider. You'll learn how to: 1. Replace test tokens with real JWT tokens 2. Integrate with OAuth2 providers for secure user authentication @@ -40,7 +40,7 @@ sequenceDiagram Before you start this tutorial, ensure you have: -* The [bot from the second tutorial](/langgraph-platform/resource-auth) running without errors. +* The [bot from the second tutorial](/langsmith/resource-auth) running without errors. * A [Supabase project](https://supabase.com/dashboard) to use its authentication server. ## 1. Install dependencies @@ -82,9 +82,9 @@ Since you're using Supabase for this, you can do this in the Supabase dashboard: ## 3. Implement token validation -In the previous tutorials, you used the [`Auth`](/langgraph-platform/python-sdk#langgraph_sdk.auth.Auth) object to [validate hard-coded tokens](/langgraph-platform/set-up-custom-auth) and [add resource ownership](/langgraph-platform/resource-auth). +In the previous tutorials, you used the [`Auth`](/langsmith/python-sdk#langgraph_sdk.auth.Auth) object to [validate hard-coded tokens](/langsmith/set-up-custom-auth) and [add resource ownership](/langsmith/resource-auth). -Now you'll upgrade your authentication to validate real JWT tokens from Supabase. The main changes will all be in the [`@auth.authenticate`](/langgraph-platform/python-sdk#langgraph_sdk.auth.Auth.authenticate) decorated function: +Now you'll upgrade your authentication to validate real JWT tokens from Supabase. 
The main changes will all be in the [`@auth.authenticate`](/langsmith/python-sdk#langgraph_sdk.auth.Auth.authenticate) decorated function: * Instead of checking against a hard-coded list of tokens, you'll make an HTTP request to Supabase to validate the token. * You'll extract real user information (ID, email) from the validated token. @@ -277,5 +277,5 @@ You've successfully built a production-ready authentication system for your Lang Now that you have production authentication, consider: 1. Building a web UI with your preferred framework (see the [Custom Auth](https://github.com/langchain-ai/custom-auth) template for an example) -2. Learn more about the other aspects of authentication and authorization in the [conceptual guide on authentication](/langgraph-platform/auth). -3. Customize your handlers and setup further after reading the [reference docs](/langgraph-platform/python-sdk#langgraph_sdk.auth.Auth). +2. Learn more about the other aspects of authentication and authorization in the [conceptual guide on authentication](/langsmith/auth). +3. Customize your handlers and setup further after reading the [reference docs](/langsmith/python-sdk#langgraph_sdk.auth.Auth). diff --git a/src/langsmith/add-human-in-the-loop.mdx b/src/langsmith/add-human-in-the-loop.mdx index 9bf7130c5..0c765d8c6 100644 --- a/src/langsmith/add-human-in-the-loop.mdx +++ b/src/langsmith/add-human-in-the-loop.mdx @@ -134,7 +134,7 @@ To review, edit, and approve tool calls in an agent or workflow, use LangGraph's This is an example graph you can run in the LangGraph API server. - See [LangGraph Platform quickstart](/langgraph-platform/deployment-quickstart) for more details. + See [LangSmith Platform quickstart](/langsmith/deployment-quickstart) for more details. ```python {highlight={7,13}} from typing import TypedDict @@ -172,7 +172,7 @@ To review, edit, and approve tool calls in an agent or workflow, use LangGraph's 3. 
Once resumed, the return value of `interrupt(...)` is the human-provided input, which is used to update the state. Once you have a running LangGraph API server, you can interact with it using - [LangGraph SDK](/langgraph-platform/python-sdk) + [LangGraph SDK](/langsmith/python-sdk) diff --git a/src/langsmith/api-ref-control-plane.mdx b/src/langsmith/api-ref-control-plane.mdx index e609c0b27..6af32b603 100644 --- a/src/langsmith/api-ref-control-plane.mdx +++ b/src/langsmith/api-ref-control-plane.mdx @@ -14,7 +14,7 @@ LangGraph Control Plane hosts for Cloud data regions: |----|----| | `https://api.host.langchain.com` | `https://eu.api.host.langchain.com` | -**Note**: Self-hosted deployments of LangGraph Platform will have a custom host for the LangGraph Control Plane. The control plane APIs can be accessed at the path `/api-host`. For example, `http(s):///api-host/v2/deployments`. See [here](../langsmith/self-host-usage#configuring-the-application-you-want-to-use-with-langsmith) for more details. +**Note**: Self-hosted deployments of LangSmith Platform will have a custom host for the LangGraph Control Plane. The control plane APIs can be accessed at the path `/api-host`. For example, `http(s):///api-host/v2/deployments`. See [here](../langsmith/self-host-usage#configuring-the-application-you-want-to-use-with-langsmith) for more details. ## Authentication diff --git a/src/langsmith/application-structure.mdx b/src/langsmith/application-structure.mdx index 487075cdb..7467f681f 100644 --- a/src/langsmith/application-structure.mdx +++ b/src/langsmith/application-structure.mdx @@ -7,11 +7,11 @@ sidebarTitle: Application structure A LangGraph application consists of one or more graphs, a configuration file (`langgraph.json`), a file that specifies dependencies, and an optional `.env` file that specifies environment variables. 
-This guide shows a typical structure of an application and shows how the required information to deploy an application using the LangGraph Platform is specified. +This guide shows the typical structure of an application and how to specify the information required to deploy it using the LangSmith Platform. ## Key Concepts -To deploy using the LangGraph Platform, the following information should be provided: +To deploy using the LangSmith Platform, the following information should be provided: 1. A [LangGraph configuration file](#configuration-file-concepts) (`langgraph.json`) that specifies the dependencies, graphs, and environment variables to use for the application. 2. The [graphs](#graphs) that implement the logic of the application. @@ -80,10 +80,10 @@ The directory structure of a LangGraph application can vary depending on the pro The `langgraph.json` file is a JSON file that specifies the dependencies, graphs, environment variables, and other settings required to deploy a LangGraph application. -See the [LangGraph configuration file reference](/langgraph-platform/cli#configuration-file) for details on all supported keys in the JSON file. +See the [LangGraph configuration file reference](/langsmith/cli#configuration-file) for details on all supported keys in the JSON file. -The [LangGraph CLI](/langgraph-platform/langgraph-cli) defaults to using the configuration file `langgraph.json` in the current directory. +The [LangGraph CLI](/langsmith/langgraph-cli) defaults to using the configuration file `langgraph.json` in the current directory. ### Examples diff --git a/src/langsmith/assistants.mdx b/src/langsmith/assistants.mdx index f5956a353..05f60a0e8 100644 --- a/src/langsmith/assistants.mdx +++ b/src/langsmith/assistants.mdx @@ -7,32 +7,32 @@ sidebarTitle: Overview For example, imagine a general-purpose writing agent built on a common graph architecture.
While the structure remains the same, different writing styles—such as blog posts and tweets—require tailored configurations to optimize performance. To support these variations, you can create multiple assistants (e.g., one for blogs and another for tweets) that share the underlying graph but differ in model selection and system prompt. -![assistant versions](/langgraph-platform/images/assistants.png) +![assistant versions](/langsmith/images/assistants.png) The LangGraph Cloud API provides several endpoints for creating and managing assistants and their versions. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/#tag/assistants) for more details. -Assistants are a [LangGraph Platform](/langgraph-platform/index) concept. They are not available in the open source LangGraph library. +Assistants are a [LangSmith Platform](/langsmith/home) concept. They are not available in the open source LangGraph library. ## Configuration Assistants build on the LangGraph open source concept of [configuration](/oss/langgraph/graph-api#runtime-context). -While configuration is available in the open source LangGraph library, assistants are only present in [LangGraph Platform](/langgraph-platform/index). This is due to the fact that assistants are tightly coupled to your deployed graph. Upon deployment, LangGraph Server will automatically create a default assistant for each graph using the graph's default configuration settings. +While configuration is available in the open source LangGraph library, assistants are only present in [LangSmith Platform](/langsmith/home). This is due to the fact that assistants are tightly coupled to your deployed graph. Upon deployment, LangGraph Server will automatically create a default assistant for each graph using the graph's default configuration settings. -In practice, an assistant is just an _instance_ of a graph with a specific configuration. 
Therefore, multiple assistants can reference the same graph but can contain different configurations (e.g. prompts, models, tools). The LangGraph Server API provides several endpoints for creating and managing assistants. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) and [this how-to](/langgraph-platform/configuration-cloud) for more details on how to create assistants. +In practice, an assistant is just an _instance_ of a graph with a specific configuration. Therefore, multiple assistants can reference the same graph but can contain different configurations (e.g. prompts, models, tools). The LangGraph Server API provides several endpoints for creating and managing assistants. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) and [this how-to](/langsmith/configuration-cloud) for more details on how to create assistants. ## Versioning Assistants support versioning to track changes over time. -Once you've created an assistant, subsequent edits to that assistant will create new versions. See [this how-to](/langgraph-platform/configuration-cloud#create-a-new-version-for-your-assistant) for more details on how to manage assistant versions. +Once you've created an assistant, subsequent edits to that assistant will create new versions. See [this how-to](/langsmith/configuration-cloud#create-a-new-version-for-your-assistant) for more details on how to manage assistant versions. ## Execution A **run** is an invocation of an assistant. Each run may have its own input, configuration, and metadata, which may affect execution and output of the underlying graph. A run can optionally be executed on a [thread](/oss/langgraph/persistence#threads). -The LangGraph Platform API provides several endpoints for creating and managing runs. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) for more details. 
+The LangSmith Platform API provides several endpoints for creating and managing runs. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) for more details. ## Video guide diff --git a/src/langsmith/auth.mdx b/src/langsmith/auth.mdx index ff27cb6a8..a98d21ab3 100644 --- a/src/langsmith/auth.mdx +++ b/src/langsmith/auth.mdx @@ -3,7 +3,7 @@ title: Authentication & access control sidebarTitle: Overview --- -LangGraph Platform provides a flexible authentication and authorization system that can integrate with most authentication schemes. +LangSmith Platform provides a flexible authentication and authorization system that can integrate with most authentication schemes. ## Core Concepts @@ -14,13 +14,13 @@ While often used interchangeably, these terms represent distinct security concep * [**Authentication**](#authentication) ("AuthN") verifies _who_ you are. This runs as middleware for every request. * [**Authorization**](#authorization) ("AuthZ") determines _what you can do_. This validates the user's privileges and roles on a per-resource basis. -In LangGraph Platform, authentication is handled by your [`@auth.authenticate`](/langgraph-platform/python-sdk#langgraph_sdk.auth.Auth.authenticate) handler, and authorization is handled by your [`@auth.on`](/langgraph-platform/python-sdk#langgraph_sdk.auth.Auth.on) handlers. +In LangSmith Platform, authentication is handled by your [`@auth.authenticate`](/langsmith/python-sdk#langgraph_sdk.auth.Auth.authenticate) handler, and authorization is handled by your [`@auth.on`](/langsmith/python-sdk#langgraph_sdk.auth.Auth.on) handlers. 
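The split between the two handlers can be illustrated with a framework-agnostic sketch. The function names, key store, and role model below are hypothetical, invented for illustration — see the `langgraph_sdk` reference for the real `@auth.authenticate` / `@auth.on` API:

```python
# Illustrative only: a hand-rolled AuthN/AuthZ split, not the langgraph_sdk API.
API_KEYS = {"secret-key-123": {"identity": "user-1", "role": "editor"}}


def authenticate(headers: dict) -> dict:
    """AuthN: verify *who* the caller is, from request headers."""
    key = headers.get("x-api-key")
    if key not in API_KEYS:
        raise PermissionError("invalid API key")
    return API_KEYS[key]  # minimal "user info" dict


def authorize(user: dict, resource: str, action: str) -> bool:
    """AuthZ: decide *what* the authenticated user may do, per resource."""
    if user["role"] == "editor":
        return True
    # non-editors may only read threads
    return resource == "threads" and action == "read"


user = authenticate({"x-api-key": "secret-key-123"})
print(user["identity"], authorize(user, "threads", "write"))  # prints: user-1 True
```

The first function runs once per request (middleware); the second runs per resource access, which mirrors how the `@auth.authenticate` and `@auth.on` handlers divide the work.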
## Default Security Models -LangGraph Platform provides different security defaults: +LangSmith Platform provides different security defaults: -### LangGraph Platform +### LangSmith Platform * Uses LangSmith API keys by default * Requires valid API key in `x-api-key` header @@ -28,7 +28,7 @@ LangGraph Platform provides different security defaults: **Custom auth** -Custom auth **is supported** for all plans in LangGraph Platform. +Custom auth **is supported** for all plans in LangSmith Platform. ### Self-Hosted @@ -76,11 +76,11 @@ sequenceDiagram LG-->>Client: 8. Return resources ``` -Your [`@auth.authenticate`](/langgraph-platform/python-sdk#langgraph_sdk.auth.Auth.authenticate) handler in LangGraph handles steps 4-6, while your [`@auth.on`](/langgraph-platform/python-sdk#langgraph_sdk.auth.Auth.on) handlers implement step 7. +Your [`@auth.authenticate`](/langsmith/python-sdk#langgraph_sdk.auth.Auth.authenticate) handler in LangGraph handles steps 4-6, while your [`@auth.on`](/langsmith/python-sdk#langgraph_sdk.auth.Auth.on) handlers implement step 7. ## Authentication -Authentication in LangGraph runs as middleware on every request. Your [`@auth.authenticate`](/langgraph-platform/python-sdk#langgraph_sdk.auth.Auth.authenticate) handler receives request information and should: +Authentication in LangGraph runs as middleware on every request. Your [`@auth.authenticate`](/langsmith/python-sdk#langgraph_sdk.auth.Auth.authenticate) handler receives request information and should: 1. Validate the credentials 2. Return [user info](https://langchain-ai.github.io/langgraph/cloud/reference/sdk/python_sdk_ref/#langgraph_sdk.auth.types.MinimalUserDict) containing the user's identity and user information if valid @@ -175,9 +175,9 @@ sequenceDiagram After authentication, the platform creates a special configuration object that is passed to your graph and all nodes via the configurable context. 
This object contains information about the current user, including any custom fields you return from your [`@auth.authenticate`](https://langchain-ai.github.io/langgraph/cloud/reference/sdk/python_sdk_ref/#langgraph_sdk.auth.Auth.authenticate) handler. -To enable an agent to act on behalf of the user, use [custom authentication middleware](/langgraph-platform/custom-auth). This will allow the agent to interact with external systems like MCP servers, external databases, and even other agents on behalf of the user. +To enable an agent to act on behalf of the user, use [custom authentication middleware](/langsmith/custom-auth). This will allow the agent to interact with external systems like MCP servers, external databases, and even other agents on behalf of the user. -For more information, see the [Use custom auth](/langgraph-platform/custom-auth#enable-agent-authentication) guide. +For more information, see the [Use custom auth](/langsmith/custom-auth#enable-agent-authentication) guide. ### Agent authentication with MCP @@ -486,5 +486,5 @@ There is a specific `create_run` handler for creating new runs because it had mo For implementation details: -* Check out the introductory tutorial on [setting up authentication](/langgraph-platform/set-up-custom-auth) -* See the how-to guide on implementing a [custom auth handlers](/langgraph-platform/custom-auth) +* Check out the introductory tutorial on [setting up authentication](/langsmith/set-up-custom-auth) +* See the how-to guide on implementing a [custom auth handlers](/langsmith/custom-auth) diff --git a/src/langsmith/autogen-integration.mdx b/src/langsmith/autogen-integration.mdx index 1d42165e2..57358b68b 100644 --- a/src/langsmith/autogen-integration.mdx +++ b/src/langsmith/autogen-integration.mdx @@ -2,13 +2,13 @@ title: How to integrate LangGraph with AutoGen, CrewAI, and other frameworks sidebarTitle: Integrate LangGraph with AutoGen, CrewAI, and other frameworks --- -This guide shows how to integrate AutoGen 
agents with LangGraph to leverage features like persistence, streaming, and memory, and then deploy the integrated solution to LangGraph Platform for scalable production use. In this guide we show how to build a LangGraph chatbot that integrates with AutoGen, but you can follow the same approach with other frameworks. +This guide shows how to integrate AutoGen agents with LangGraph to leverage features like persistence, streaming, and memory, and then deploy the integrated solution to LangSmith Platform for scalable production use. In this guide we show how to build a LangGraph chatbot that integrates with AutoGen, but you can follow the same approach with other frameworks. Integrating AutoGen with LangGraph provides several benefits: -* Enhanced features: Add [persistence](/oss/langgraph/persistence), [streaming](/langgraph-platform/streaming), [short and long-term memory](/oss/concepts/memory) and more to your AutoGen agents. +* Enhanced features: Add [persistence](/oss/langgraph/persistence), [streaming](/langsmith/streaming), [short and long-term memory](/oss/concepts/memory) and more to your AutoGen agents. * Multi-agent systems: Build [multi-agent systems](/oss/langchain/multi-agent) where individual agents are built with different frameworks. -* Production deployment: Deploy your integrated solution to [LangGraph Platform](/langgraph-platform/index) for scalable production use. +* Production deployment: Deploy your integrated solution to [LangSmith Platform](/langsmith/home) for scalable production use. 
## Prerequisites @@ -120,11 +120,11 @@ from IPython.display import display, Image display(Image(graph.get_graph().draw_mermaid_png())) ``` -![LangGraph chatbot with one step: START routes to autogen, where call_autogen_agent sends the latest user message (with prior context) to the AutoGen agent.](/langgraph-platform/images/autogen-output.png) +![LangGraph chatbot with one step: START routes to autogen, where call_autogen_agent sends the latest user message (with prior context) to the AutoGen agent.](/langsmith/images/autogen-output.png) ## 3. Test the graph locally -Before deploying to LangGraph Platform, you can test the graph locally: +Before deploying to LangSmith Platform, you can test the graph locally: ```python {highlight={2,13}} # pass the thread ID to persist agent outputs for future interactions @@ -213,7 +213,7 @@ TERMINATE ## 4. Prepare for deployment -To deploy to LangGraph Platform, create a file structure like the following: +To deploy to LangSmith Platform, create a file structure like the following: ``` my-autogen-agent/ @@ -283,7 +283,7 @@ my-autogen-agent/ builder.add_edge(START, "autogen") return builder.compile(checkpointer=checkpointer) - # Export the graph for LangGraph Platform + # Export the graph for LangSmith Platform graph = create_graph() ``` @@ -308,9 +308,9 @@ my-autogen-agent/ -## 5. Deploy to LangGraph Platform +## 5. Deploy to LangSmith Platform -Deploy the graph with the LangGraph Platform CLI: +Deploy the graph with the LangSmith Platform CLI: ```bash pip diff --git a/src/langsmith/cli.mdx b/src/langsmith/cli.mdx index 76f76cf6c..d5cee9931 100644 --- a/src/langsmith/cli.mdx +++ b/src/langsmith/cli.mdx @@ -3,7 +3,7 @@ title: LangGraph CLI sidebarTitle: CLI --- -The LangGraph command line interface includes commands to build and run a LangGraph Platform API server locally in [Docker](https://www.docker.com/). For development and testing, you can use the CLI to deploy a local API server. 
+The LangGraph command line interface includes commands to build and run a LangSmith Platform API server locally in [Docker](https://www.docker.com/). For development and testing, you can use the CLI to deploy a local API server. ## Installation @@ -37,9 +37,9 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( | Key | Description | | ------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | - | `dependencies` | **Required**. Array of dependencies for LangGraph Platform API server. Dependencies can be one of the following:
  • A single period (`"."`), which will look for local Python packages.
  • The directory path where `pyproject.toml`, `setup.py` or `requirements.txt` is located.

    For example, if `requirements.txt` is located in the root of the project directory, specify `"./"`. If it's located in a subdirectory called `local_package`, specify `"./local_package"`. Do not specify the string `"requirements.txt"` itself.
  • A Python package name.
| - | `graphs` | **Required**. Mapping from graph ID to path where the compiled graph or a function that makes a graph is defined. Example:
  • `./your_package/your_file.py:variable`, where `variable` is an instance of `langgraph.graph.state.CompiledStateGraph`
  • `./your_package/your_file.py:make_graph`, where `make_graph` is a function that takes a config dictionary (`langchain_core.runnables.RunnableConfig`) and returns an instance of `langgraph.graph.state.StateGraph` or `langgraph.graph.state.CompiledStateGraph`. See [how to rebuild a graph at runtime](/langgraph-platform/graph-rebuild) for more details.
| - | `auth` | _(Added in v0.0.11)_ Auth configuration containing the path to your authentication handler. Example: `./your_package/auth.py:auth`, where `auth` is an instance of `langgraph_sdk.Auth`. See [authentication guide](/langgraph-platform/auth) for details. | + | `dependencies` | **Required**. Array of dependencies for LangSmith Platform API server. Dependencies can be one of the following:
  • A single period (`"."`), which will look for local Python packages.
  • The directory path where `pyproject.toml`, `setup.py` or `requirements.txt` is located.

    For example, if `requirements.txt` is located in the root of the project directory, specify `"./"`. If it's located in a subdirectory called `local_package`, specify `"./local_package"`. Do not specify the string `"requirements.txt"` itself.
  • A Python package name.
| + | `graphs` | **Required**. Mapping from graph ID to path where the compiled graph or a function that makes a graph is defined. Example:
  • `./your_package/your_file.py:variable`, where `variable` is an instance of `langgraph.graph.state.CompiledStateGraph`
  • `./your_package/your_file.py:make_graph`, where `make_graph` is a function that takes a config dictionary (`langchain_core.runnables.RunnableConfig`) and returns an instance of `langgraph.graph.state.StateGraph` or `langgraph.graph.state.CompiledStateGraph`. See [how to rebuild a graph at runtime](/langsmith/graph-rebuild) for more details.
| + | `auth` | _(Added in v0.0.11)_ Auth configuration containing the path to your authentication handler. Example: `./your_package/auth.py:auth`, where `auth` is an instance of `langgraph_sdk.Auth`. See [authentication guide](/langsmith/auth) for details. | | `base_image` | Optional. Base image to use for the LangGraph API server. Defaults to `langchain/langgraph-api` or `langchain/langgraphjs-api`. Use this to pin your builds to a particular version of the langgraph API, such as `"langchain/langgraph-server:0.2"`. See https://hub.docker.com/r/langchain/langgraph-server/tags for more details. (added in `langgraph-cli==0.2.8`) | | `image_distro` | Optional. Linux distribution for the base image. Must be one of `"debian"`, `"wolfi"`, `"bookworm"`, or `"bullseye"`. If omitted, defaults to `"debian"`. Available in `langgraph-cli>=0.2.11`.| | `env` | Path to `.env` file or a mapping from environment variable to its value. | @@ -52,19 +52,19 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( | `keep_pkg_tools` | _(Added in v0.3.4)_ Optional. Control whether to retain Python packaging tools (`pip`, `setuptools`, `wheel`) in the final image. Accepted values:
  • true : Keep all three tools (skip uninstall).
  • false / omitted : Uninstall all three tools (default behaviour).
  • list[str] : Names of tools to retain. Each value must be one of "pip", "setuptools", "wheel".
By default, all three tools are uninstalled. | | `dockerfile_lines` | Array of additional lines to add to Dockerfile following the import from parent image. | | `checkpointer` | Configuration for the checkpointer. Contains a `ttl` field which is an object with the following keys:<br/>
  • `strategy`: How to handle expired checkpoints (e.g., `"delete"`).
  • `sweep_interval_minutes`: How often to check for expired checkpoints (integer).
  • `default_ttl`: Default time-to-live for checkpoints in **minutes** (integer); applied to newly created checkpoints/threads only (existing data is unchanged). Defines how long checkpoints are kept before the specified strategy is applied.
| - | `http` | HTTP server configuration with the following fields:
  • `app`: Path to custom Starlette/FastAPI app (e.g., `"./src/agent/webapp.py:app"`). See [custom routes guide](/langgraph-platform/custom-routes).
  • `cors`: CORS configuration with fields for `allow_origins`, `allow_methods`, `allow_headers`, etc.
  • `configurable_headers`: Define which request headers to exclude or include as a run's configurable values.
  • `disable_assistants`: Disable `/assistants` routes
  • `disable_mcp`: Disable `/mcp` routes
  • `disable_meta`: Disable `/ok`, `/info`, `/metrics`, and `/docs` routes
  • `disable_runs`: Disable `/runs` routes
  • `disable_store`: Disable `/store` routes
  • `disable_threads`: Disable `/threads` routes
  • `disable_ui`: Disable `/ui` routes
  • `disable_webhooks`: Disable webhooks calls on run completion in all routes
  • `mount_prefix`: Prefix for mounted routes (e.g., "/my-deployment/api")
| - | `api_version` | _(Added in v0.3.7)_ Which semantic version of the LangGraph API server to use (e.g., `"0.3"`). Defaults to latest. Check the server [changelog](/langgraph-platform/langgraph-server-changelog) for details on each release. | + | `http` | HTTP server configuration with the following fields:
  • `app`: Path to custom Starlette/FastAPI app (e.g., `"./src/agent/webapp.py:app"`). See [custom routes guide](/langsmith/custom-routes).
  • `cors`: CORS configuration with fields for `allow_origins`, `allow_methods`, `allow_headers`, etc.
  • `configurable_headers`: Define which request headers to exclude or include as a run's configurable values.
  • `disable_assistants`: Disable `/assistants` routes
  • `disable_mcp`: Disable `/mcp` routes
  • `disable_meta`: Disable `/ok`, `/info`, `/metrics`, and `/docs` routes
  • `disable_runs`: Disable `/runs` routes
  • `disable_store`: Disable `/store` routes
  • `disable_threads`: Disable `/threads` routes
  • `disable_ui`: Disable `/ui` routes
  • `disable_webhooks`: Disable webhooks calls on run completion in all routes
  • `mount_prefix`: Prefix for mounted routes (e.g., "/my-deployment/api")
| + | `api_version` | _(Added in v0.3.7)_ Which semantic version of the LangGraph API server to use (e.g., `"0.3"`). Defaults to latest. Check the server [changelog](/langsmith/langgraph-server-changelog) for details on each release. |
| Key | Description | | ------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | - | `graphs` | **Required**. Mapping from graph ID to path where the compiled graph or a function that makes a graph is defined. Example:
  • `./src/graph.ts:variable`, where `variable` is an instance of `CompiledStateGraph`
  • `./src/graph.ts:makeGraph`, where `makeGraph` is a function that takes a config dictionary (`LangGraphRunnableConfig`) and returns an instance of `StateGraph` or `CompiledStateGraph`. See [how to rebuild a graph at runtime](/langgraph-platform/graph-rebuild) for more details.
| + | `graphs` | **Required**. Mapping from graph ID to path where the compiled graph or a function that makes a graph is defined. Example:
  • `./src/graph.ts:variable`, where `variable` is an instance of `CompiledStateGraph`
  • `./src/graph.ts:makeGraph`, where `makeGraph` is a function that takes a config dictionary (`LangGraphRunnableConfig`) and returns an instance of `StateGraph` or `CompiledStateGraph`. See [how to rebuild a graph at runtime](/langsmith/graph-rebuild) for more details.
| | `env` | Path to `.env` file or a mapping from environment variable to its value. | | `store` | Configuration for adding semantic search and/or time-to-live (TTL) to the BaseStore. Contains the following fields:
  • `index` (optional): Configuration for semantic search indexing with fields `embed`, `dims`, and optional `fields`.
  • `ttl` (optional): Configuration for item expiration. An object with optional fields: `refresh_on_read` (boolean, defaults to `true`), `default_ttl` (float, lifespan in **minutes**; applied to newly created items only; existing items are unchanged; defaults to no expiration), and `sweep_interval_minutes` (integer, how often to check for expired items, defaults to no sweeping).
| | `node_version` | Specify `node_version: 20` to use LangGraph.js. | | `dockerfile_lines` | Array of additional lines to add to Dockerfile following the import from parent image. | | `checkpointer` | Configuration for the checkpointer. Contains a `ttl` field which is an object with the following keys:
  • `strategy`: How to handle expired checkpoints (e.g., `"delete"`).
  • `sweep_interval_minutes`: How often to check for expired checkpoints (integer).
  • `default_ttl`: Default time-to-live for checkpoints in **minutes** (integer); applied to newly created checkpoints/threads only (existing data is unchanged). Defines how long checkpoints are kept before the specified strategy is applied.
| - | `api_version` | _(Added in v0.3.7)_ Which semantic version of the LangGraph API server to use (e.g., `"0.3"`). Defaults to latest. Check the server [changelog](/langgraph-platform/langgraph-server-changelog) for details on each release. | + | `api_version` | _(Added in v0.3.7)_ Which semantic version of the LangGraph API server to use (e.g., `"0.3"`). Defaults to latest. Check the server [changelog](/langsmith/langgraph-server-changelog) for details on each release. |
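For example, the `checkpointer` TTL keys described in the table above could be expressed in `langgraph.json` as follows — the values are illustrative, not defaults:

```json
{
  "checkpointer": {
    "ttl": {
      "strategy": "delete",
      "sweep_interval_minutes": 60,
      "default_ttl": 43200
    }
  }
}
```

This would keep newly created checkpoints for 30 days (43,200 minutes) and delete expired ones during an hourly sweep.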
@@ -101,7 +101,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( #### Adding semantic search to the store - All deployments come with a DB-backed BaseStore. Adding an "index" configuration to your `langgraph.json` will enable [semantic search](/langgraph-platform/semantic-search) within the BaseStore of your deployment. + All deployments come with a DB-backed BaseStore. Adding an "index" configuration to your `langgraph.json` will enable [semantic search](/langsmith/semantic-search) within the BaseStore of your deployment. The `index.fields` configuration determines which parts of your documents to embed: @@ -193,7 +193,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( } ``` - See the [authentication conceptual guide](/langgraph-platform/auth) for details, and the [setting up custom authentication](/langgraph-platform/set-up-custom-auth) guide for a practical walk through of the process. + See the [authentication conceptual guide](/langsmith/auth) for details, and the [setting up custom authentication](/langsmith/set-up-custom-auth) guide for a practical walk through of the process.
#### Configuring Store Item Time-to-Live @@ -403,7 +403,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( - Build LangGraph Platform API server Docker image. + Build LangSmith Platform API server Docker image. **Usage** @@ -417,7 +417,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( | -------------------- | ---------------- | --------------------------------------------------------------------------------------------------------------- | | `--platform TEXT` | | Target platform(s) to build the Docker image for. Example: `langgraph build --platform linux/amd64,linux/arm64` | | `-t, --tag TEXT` | | **Required**. Tag for the Docker image. Example: `langgraph build -t my-image` | - | `--pull / --no-pull` | `--pull` | Build with latest remote Docker image. Use `--no-pull` for running the LangGraph Platform API server with locally built images. | + | `--pull / --no-pull` | `--pull` | Build with latest remote Docker image. Use `--no-pull` for running the LangSmith Platform API server with locally built images. | | `-c, --config FILE` | `langgraph.json` | Path to configuration file declaring dependencies, graphs and environment variables. | | `--build-command TEXT`* | | Build command to run. Runs from the directory where your `langgraph.json` file lives. Example: `langgraph build --build-command "yarn run turbo build"` | | `--install-command TEXT`* | | Install command to run. Runs from the directory where you call `langgraph build` from. Example: `langgraph build --install-command "yarn install"` | @@ -426,7 +426,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( *Only supported for JS deployments, will have no impact on Python deployments. - Build LangGraph Platform API server Docker image. + Build LangSmith Platform API server Docker image. 
**Usage** @@ -450,7 +450,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( - Start LangGraph API server. For local testing, requires a LangSmith API key with access to LangGraph Platform. Requires a license key for production use. + Start LangGraph API server. For local testing, requires a LangSmith API key with access to LangSmith Platform. Requires a license key for production use. **Usage** @@ -478,7 +478,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( | `--help` | | Display command documentation. | - Start LangGraph API server. For local testing, requires a LangSmith API key with access to LangGraph Platform. Requires a license key for production use. + Start LangGraph API server. For local testing, requires a LangSmith API key with access to LangSmith Platform. Requires a license key for production use. **Usage** @@ -508,7 +508,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( - Generate a Dockerfile for building a LangGraph Platform API server Docker image. + Generate a Dockerfile for building a LangSmith Platform API server Docker image. **Usage** @@ -557,7 +557,7 @@ The LangGraph CLI requires a JSON configuration file that follows this [schema]( - Generate a Dockerfile for building a LangGraph Platform API server Docker image. + Generate a Dockerfile for building a LangSmith Platform API server Docker image. **Usage** diff --git a/src/langsmith/cloud-architecture-and-scalability.mdx b/src/langsmith/cloud-architecture-and-scalability.mdx index 107dc254f..e0b751a17 100644 --- a/src/langsmith/cloud-architecture-and-scalability.mdx +++ b/src/langsmith/cloud-architecture-and-scalability.mdx @@ -28,7 +28,7 @@ The resources and services in this table are stored in the location correspondin | GCP | us-central1 (Iowa) | europe-west4 (Netherlands) | | Supabase | AWS us-east-1 (N. 
Virginia) | AWS eu-central-1 (Germany) | | ClickHouse Cloud | us-central1 (Iowa) | europe-west4 (Netherlands) | -| [LangGraph Cloud](/langgraph-platform/deployment-options) | us-central1 (Iowa) | europe-west4 (Netherlands) | +| [LangGraph Cloud](/langsmith/deployment-options) | us-central1 (Iowa) | europe-west4 (Netherlands) | See the [Regions FAQ](/langsmith/regions-faq) for more information. diff --git a/src/langsmith/cloud.mdx b/src/langsmith/cloud.mdx index df566723a..1d0952e7c 100644 --- a/src/langsmith/cloud.mdx +++ b/src/langsmith/cloud.mdx @@ -3,15 +3,15 @@ title: Cloud sidebarTitle: Overview --- -To deploy a [LangGraph Server](/langgraph-platform/langgraph-server), follow the how-to guide for [how to deploy to Cloud](/langgraph-platform/deploy-to-cloud). -The [Cloud](/langgraph-platform/cloud) deployment option is a fully managed model for deployment where the [control plane](/langgraph-platform/control-plane) and [data plane](/langgraph-platform/data-plane) run in our cloud. This option provides a simple way to deploy and manage your LangGraph Servers. -Connect your GitHub repositories to the platform and deploy your LangGraph Servers from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui). The build process (i.e. CI/CD) is managed internally by the platform. +The [Cloud](/langsmith/cloud) option is a fully managed model where the [_control plane_](/langsmith/control-plane) and [_data plane_](/langsmith/data-plane) run in LangChain’s cloud. This option provides a simple way to deploy and manage your application. To get started, follow the how-to guide for [Cloud](/langsmith/deploy-to-cloud). -| | [Control plane](/langgraph-platform/control-plane) | [Data plane](/langgraph-platform/data-plane) | +Connect your GitHub repositories to the platform and deploy your LangGraph Servers from the [control plane UI](/langsmith/control-plane#control-plane-ui). The build process (i.e. CI/CD) is managed automatically by the platform. 
+ +| | [Control plane](/langsmith/control-plane) | [Data plane](/langsmith/data-plane) | |-------------------|---------------------------------------------------|-----------------------------------------------| | **Managed by** | LangChain | LangChain | | **Infrastructure**| LangChain's cloud | LangChain's cloud | -![Cloud](/langgraph-platform/images/langgraph-cloud-architecture.png) +![Cloud deployment: LangChain hosts and manages both the control plane (UI/APIs) and data plane (LangGraph Servers, backing services).](/langsmith/images/langgraph-cloud-architecture.png) diff --git a/src/langsmith/components.mdx b/src/langsmith/components.mdx index 9facaca5b..c4492b641 100644 --- a/src/langsmith/components.mdx +++ b/src/langsmith/components.mdx @@ -1,16 +1,16 @@ --- -title: LangSmith components -sidebarTitle: Platform components +title: LangSmith Platform components +sidebarTitle: Deployment components --- -LangGraph Platform consists of several key components that work together to provide a complete solution for deploying and managing agentic applications: +When running the self-hosted [LangSmith Platform with agent deployment](/langsmith/deploy-self-hosted-full-platform), your installation includes several key components. Together these tools and services provide a complete solution for building, deploying, and managing graphs (including agentic applications) in your own infrastructure: -* [LangGraph Server](/langgraph-platform/langgraph-server): The server defines an opinionated API and architecture that incorporates best practices for deploying agentic applications, allowing you to focus on building your agent logic rather than developing server infrastructure. -* [LangGraph CLI](/langgraph-platform/langgraph-cli): LangGraph CLI is a command-line interface that helps to interact with a local LangGraph. 
-* [Studio](/langgraph-platform/langgraph-studio): Studio is a specialized IDE that can connect to a LangGraph Server to enable visualization, interaction, and debugging of the application locally. -* [Python/JS SDK](/langgraph-platform/sdk): The Python/JS SDK provides a programmatic way to interact with deployed LangGraph Applications. -* [Remote Graph](/langgraph-platform/use-remote-graph): A RemoteGraph allows you to interact with any deployed LangGraph application as though it were running locally. -* [LangGraph Control Plane](/langgraph-platform/control-plane): The LangGraph Control Plane refers to the Control Plane UI where users create and update LangGraph Servers and the Control Plane APIs that support the UI experience. -* [LangGraph Data Plane](/langgraph-platform/data-plane): The LangGraph Data Plane refers to LangGraph Servers, the corresponding infrastructure for each server, and the "listener" application that continuously polls for updates from the LangGraph Control Plane. +- [LangGraph Server](/langsmith/langgraph-server): Defines an opinionated API and runtime for deploying graphs and agents. Handles execution, state management, and persistence so you can focus on building logic rather than server infrastructure. +- [LangGraph CLI](/langsmith/langgraph-cli): A command-line interface to build, package, and interact with graphs locally and prepare them for deployment. +- [Studio](/langsmith/langgraph-studio): A specialized IDE for visualization, interaction, and debugging. Connects to a local LangGraph Server for developing and testing your graph. +- [Python/JS SDK](/langsmith/sdk): The Python/JS SDK provides a programmatic way to interact with deployed graphs and agents from your applications. +- [Remote Graph](/langsmith/use-remote-graph): Allows you to interact with a deployed graph as though it were running locally. 
+- [LangGraph Control Plane](/langsmith/control-plane): The UI and APIs for creating, updating, and managing LangGraph Server deployments. +- [Data plane](/langsmith/data-plane): The runtime layer that executes your graphs, including LangGraph Servers, their backing services (PostgreSQL, Redis, etc.), and the listener that reconciles state from the control plane. -![LangGraph components](/langgraph-platform/images/lg-platform.png) +![LangGraph components](/langsmith/images/lg-platform.png) diff --git a/src/langsmith/composite-evaluators.mdx b/src/langsmith/composite-evaluators.mdx index 0c5e6b167..81a353c6b 100644 --- a/src/langsmith/composite-evaluators.mdx +++ b/src/langsmith/composite-evaluators.mdx @@ -92,7 +92,7 @@ oai_client = OpenAI() examples = [ { - "inputs": {"blog_intro": "Today we’re excited to announce the general availability of LangGraph Platform — our purpose-built infrastructure and management layer for deploying and scaling long-running, stateful agents. Since our beta last June, nearly 400 companies have used LangGraph Platform to deploy their agents into production. Agent deployment is the next hard hurdle for shipping reliable agents, and LangGraph Platform dramatically lowers this barrier with: 1-click deployment to go live in minutes, 30 API endpoints for designing custom user experiences that fit any interaction pattern, Horizontal scaling to handle bursty, long-running traffic, A persistence layer to support memory, conversational history, and async collaboration with human-in-the-loop or multi-agent workflows, Native LangGraph Studio, the agent IDE, for easy debugging, visibility, and iteration "}, + "inputs": {"blog_intro": "Today we’re excited to announce the general availability of LangSmith Platform — our purpose-built infrastructure and management layer for deploying and scaling long-running, stateful agents. Since our beta last June, nearly 400 companies have used LangSmith Platform to deploy their agents into production. 
Agent deployment is the next hard hurdle for shipping reliable agents, and LangSmith Platform dramatically lowers this barrier with: 1-click deployment to go live in minutes, 30 API endpoints for designing custom user experiences that fit any interaction pattern, Horizontal scaling to handle bursty, long-running traffic, A persistence layer to support memory, conversational history, and async collaboration with human-in-the-loop or multi-agent workflows, Native LangGraph Studio, the agent IDE, for easy debugging, visibility, and iteration "}, }, { "inputs": {"blog_intro": "Klarna has reshaped global commerce with its consumer-centric, AI-powered payment and shopping solutions. With over 85 million active users and 2.5 million daily transactions on its platform, Klarna is a fintech leader that simplifies shopping while empowering consumers with smarter, more flexible financial solutions. Klarna’s flagship AI Assistant is revolutionizing the shopping and payments experience. Built on LangGraph and powered by LangSmith, the AI Assistant handles tasks ranging from customer payments, to refunds, to other payment escalations. With 2.5 million conversations to date, the AI Assistant is more than just a chatbot; it’s a transformative agent that performs the work equivalent of 700 full-time staff, delivering results quickly and improving company efficiency."}, diff --git a/src/langsmith/configurable-headers.mdx b/src/langsmith/configurable-headers.mdx index a0efb5498..0e6fcd899 100644 --- a/src/langsmith/configurable-headers.mdx +++ b/src/langsmith/configurable-headers.mdx @@ -2,7 +2,7 @@ title: Configurable headers sidebarTitle: Configurable headers --- -LangGraph allows runtime configuration to modify agent behavior and permissions dynamically. When using the [LangGraph Platform](/langgraph-platform/deployment-quickstart), you can pass this configuration in the request body (`config`) or specific request headers. 
This enables adjustments based on user identity or other requests. +LangGraph allows runtime configuration to modify agent behavior and permissions dynamically. When using the [LangSmith Platform](/langsmith/deployment-quickstart), you can pass this configuration in the request body (`config`) or specific request headers. This enables adjustments based on user identity or other requests. For privacy, control which headers are passed to the runtime configuration via the `http.configurable_headers` section in your `langgraph.json` file. diff --git a/src/langsmith/configuration-cloud.mdx index 040aa38b3..44b9fcd05 100644 --- a/src/langsmith/configuration-cloud.mdx +++ b/src/langsmith/configuration-cloud.mdx @@ -2,7 +2,7 @@ title: Manage assistants sidebarTitle: Manage assistants --- -In this guide we will show how to create, configure, and manage an [assistant](/langgraph-platform/assistants). +In this guide we will show how to create, configure, and manage an [assistant](/langsmith/assistants). First, as a brief refresher on the concept of context, consider the following simple `call_model` node and context schema. Observe that this node tries to read and use the `model_name` as defined by the `context` object's `model_name` field. @@ -45,13 +45,13 @@ Observe that this node tries to read and use the `model_name` as defined by the -For more information on configurations, [see here](/langgraph-platform/configuration-cloud#configuration). +For more information on configurations, [see here](/langsmith/configuration-cloud#configuration). ## Create an assistant ### LangGraph SDK -To create an assistant, use the [LangGraph SDK](/langgraph-platform/sdk) `create` method. See the [Python](/langgraph-platform/python-sdk#langgraph_sdk.client.AssistantsClient.create) and [JS](/langgraph-platform/js-ts-sdk#create) SDK reference docs for more information. +To create an assistant, use the [LangGraph SDK](/langsmith/sdk) `create` method.
See the [Python](/langsmith/langgraph-python-sdk#langgraph_sdk.client.AssistantsClient.create) and [JS](/langsmith/langgraph-js-ts-sdk#create) SDK reference docs for more information. This example uses the same context schema as above, and creates an assistant with `model_name` set to `openai`. @@ -109,15 +109,15 @@ Output: } ``` -### LangGraph Platform UI +### LangSmith Platform UI -You can also create assistants from the LangGraph Platform UI. +You can also create assistants from the LangSmith Platform UI. Inside your deployment, select the "Assistants" tab. This will load a table of all of the assistants in your deployment, across all graphs. To create a new assistant, select the "+ New assistant" button. This will open a form where you can specify the graph this assistant is for, as well as provide a name, description, and the desired configuration for the assistant based on the configuration schema for that graph. -To confirm, click "Create assistant". This will take you to [Studio](/langgraph-platform/langgraph-studio) where you can test the assistant. If you go back to the "Assistants" tab in the deployment, you will see the newly created assistant in the table. +To confirm, click "Create assistant". This will take you to [Studio](/langsmith/langgraph-studio) where you can test the assistant. If you go back to the "Assistants" tab in the deployment, you will see the newly created assistant in the table.
## Use an assistant @@ -223,7 +223,7 @@ Receiving event of type: updates {'agent': {'messages': [{'content': 'I was created by OpenAI, a research organization focused on developing and advancing artificial intelligence technology.', 'additional_kwargs': {}, 'response_metadata': {'finish_reason': 'stop', 'model_name': 'gpt-4o-2024-05-13', 'system_fingerprint': 'fp_157b3831f5'}, 'type': 'ai', 'name': None, 'id': 'run-e1a6b25c-8416-41f2-9981-f9cfe043f414', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': None}]}} ``` -### LangGraph Platform UI +### LangSmith Platform UI Inside your deployment, select the "Assistants" tab. For the assistant you would like to use, click the "Studio" button. This will open Studio with the selected assistant. When you submit an input (either in Graph or Chat mode), the selected assistant and its configuration will be used. @@ -231,7 +231,7 @@ Inside your deployment, select the "Assistants" tab. For the assistant you would ### LangGraph SDK -To edit the assistant, use the `update` method. This will create a new version of the assistant with the provided edits. See the [Python](/langgraph-platform/python-sdk#langgraph_sdk.client.AssistantsClient.update) and [JS](/langgraph-platform/js-ts-sdk#update) SDK reference docs for more information. +To edit the assistant, use the `update` method. This will create a new version of the assistant with the provided edits. See the [Python](/langsmith/langgraph-python-sdk#langgraph_sdk.client.AssistantsClient.update) and [JS](/langsmith/langgraph-js-ts-sdk#update) SDK reference docs for more information. **Note** @@ -279,9 +279,9 @@ For example, to update your assistant's system prompt: This will create a new version of the assistant with the updated parameters and set this as the active version of your assistant. If you now run your graph and pass in this assistant id, it will use this latest version.
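The versioning behavior described above is easy to state precisely: every update appends a new immutable version and makes it active, while rollback only moves the active pointer, so older versions are never lost. The following is an illustrative in-memory sketch of those semantics (a hypothetical `Assistant` class, not the LangGraph SDK API):

```python
class Assistant:
    """Toy model of assistant versioning; not the LangGraph SDK API."""

    def __init__(self, graph_id, config):
        self.graph_id = graph_id
        self.versions = [config]  # version 1 holds the initial config
        self.active = 1           # 1-indexed, as in the UI

    def update(self, config):
        # An update creates a new version and makes it the active one.
        self.versions.append(config)
        self.active = len(self.versions)

    def set_active(self, version):
        # Rollback/forward: only the active pointer moves.
        if not 1 <= version <= len(self.versions):
            raise ValueError("unknown version")
        self.active = version

    @property
    def config(self):
        return self.versions[self.active - 1]


a = Assistant("agent", {"model_name": "openai"})
a.update({"model_name": "openai", "system_prompt": "You are a helpful assistant."})
print(a.active)   # 2 — the update became the active version
a.set_active(1)   # roll back to the first version
print(a.config)   # {'model_name': 'openai'}
```

Running a graph with this assistant's ID always uses whatever version is currently active, which is why rolling back changes behavior without deleting anything.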
-### LangGraph Platform UI +### LangSmith Platform UI -You can also edit assistants from the LangGraph Platform UI. +You can also edit assistants from the LangSmith Platform UI. Inside your deployment, select the "Assistants" tab. This will load a table of all of the assistants in your deployment, across all graphs. @@ -322,7 +322,7 @@ In the example above, to rollback to the first version of the assistant: If you now run your graph and pass in this assistant id, it will use the first version of the assistant. -### LangGraph Platform UI +### LangSmith Platform UI If using Studio, to set the active version of your assistant, click the "Manage Assistants" button and locate the assistant you would like to use. Select the assistant and the version, and then click the "Active" toggle. This will update the assistant to make the selected version active. diff --git a/src/langsmith/configure-ttl.mdx b/src/langsmith/configure-ttl.mdx index 9484f000e..cfbe37849 100644 --- a/src/langsmith/configure-ttl.mdx +++ b/src/langsmith/configure-ttl.mdx @@ -4,7 +4,7 @@ sidebarTitle: Add TTLs to your LangGraph application --- **Prerequisites** -This guide assumes familiarity with the [LangGraph Platform](/langgraph-platform/index), [Persistence](/oss/langgraph/persistence), and [Cross-thread persistence](/oss/langgraph/persistence#memory-store) concepts. +This guide assumes familiarity with the [LangSmith Platform](/langsmith/home), [Persistence](/oss/langgraph/persistence), and [Cross-thread persistence](/oss/langgraph/persistence#memory-store) concepts. @@ -13,7 +13,7 @@ This guide assumes familiarity with the [LangGraph Platform](/langgraph-platform TTLs are only supported for LangGraph platform deployments. This guide does not apply to LangGraph OSS. -The LangGraph Platform persists both [checkpoints](/oss/langgraph/persistence#checkpoints) (thread state) and [cross-thread memories](/oss/langgraph/persistence#memory-store) (store items). 
Configure Time-to-Live (TTL) policies in `langgraph.json` to automatically manage the lifecycle of this data, preventing indefinite accumulation. +The LangSmith Platform persists both [checkpoints](/oss/langgraph/persistence#checkpoints) (thread state) and [cross-thread memories](/oss/langgraph/persistence#memory-store) (store items). Configure Time-to-Live (TTL) policies in `langgraph.json` to automatically manage the lifecycle of this data, preventing indefinite accumulation. ## Configuring Checkpoint TTL diff --git a/src/langsmith/control-plane.mdx b/src/langsmith/control-plane.mdx index e2650561c..a8ae76098 100644 --- a/src/langsmith/control-plane.mdx +++ b/src/langsmith/control-plane.mdx @@ -1,13 +1,13 @@ --- -title: LangGraph Control Plane -sidebarTitle: Control Plane +title: LangSmith Platform control plane +sidebarTitle: Control plane --- -The term "control plane" is used broadly to refer to the control plane UI where users create and update [LangGraph Servers](/langgraph-platform/langgraph-server) (deployments) and the control plane APIs that support the UI experience. +The _control plane_ is the part of LangSmith Platform that manages deployments. It includes the control plane UI, where users create and update [LangGraph Servers](/langsmith/langgraph-server), and the control plane APIs, which support the UI and provide programmatic access. -When a user makes an update through the control plane UI, the update is stored in the control plane state. The [LangGraph Data Plane](/langgraph-platform/data-plane) "listener" application polls for these updates by calling the control plane APIs. +When you make an update through the control plane, the update is stored in control plane state. The [data plane](/langsmith/data-plane) “listener” polls for these updates by calling the control plane APIs. 
-## Control Plane UI +## Control plane UI From the control plane UI, you can: @@ -20,11 +20,11 @@ From the control plane UI, you can: * View deployment metrics such as CPU and memory usage. * Delete a deployment. -The Control Plane UI is embedded in [LangSmith](https://docs.smith.langchain.com). +The control plane UI is embedded in [LangSmith](https://docs.smith.langchain.com). -## Control Plane API +## Control plane API -This section describes the data model of the control plane API. The API is used to create, update, and delete deployments. See the [control plane API reference](/langgraph-platform/api-ref-control-plane) for more details. +This section describes the data model of the control plane API. The API is used to create, update, and delete deployments. See the [control plane API reference](/langsmith/api-ref-control-plane) for more details. ### Integrations @@ -40,9 +40,9 @@ A revision is an iteration of a deployment. When a new deployment is created, an ### Listeners -A listener is an instance of a ["listener" application](/langgraph-platform/data-plane#”listener”-application). A listener contains metadata about the application (e.g. version) and metadata about the compute infrastructure where it can deploy to (e.g. Kubernetes namespaces). +A listener is an instance of a ["listener" application](/langsmith/data-plane#listener-application). A listener contains metadata about the application (e.g. version) and metadata about the compute infrastructure to which it can deploy (e.g. Kubernetes namespaces).
**Self-Hosted Deployment** -Resources for [Hybrid](/langgraph-platform/hybrid) and [Self-Hosted](/langgraph-platform/self-hosted) deployments can be fully customized. Deployment types are only applicable for [Cloud](/langgraph-platform/cloud) deployments. +Resources for [Hybrid](/langsmith/hybrid) and [Self-Hosted](/langsmith/self-hosted) deployments can be fully customized. Deployment types are only applicable for [Cloud](/langsmith/cloud) deployments. #### Production @@ -92,23 +92,23 @@ This behavior is expected. Preemptible compute infrastructure **significantly re `Production` type deployments are provisioned on durable compute infrastructure, not preemptible compute infrastructure. -Database disk size for `Development` type deployments can be manually increased on a case-by-case basis depending on use case and capacity constraints. For most use cases, [TTLs](/langgraph-platform/configure-ttl) should be configured to manage disk usage. Contact support@langchain.dev to request an increase in resources. +Database disk size for `Development` type deployments can be manually increased on a case-by-case basis depending on use case and capacity constraints. For most use cases, [TTLs](/langsmith/configure-ttl) should be configured to manage disk usage. Contact support@langchain.dev to request an increase in resources. -### Database Provisioning +### Database provisioning -The control plane and [LangGraph Data Plane](/langgraph-platform/data-plane) "listener" application coordinate to automatically create a Postgres database for each deployment. The database serves as the [persistence layer](/oss/langgraph/persistence#memory-store) for the deployment. +The control plane and [data plane](/langsmith/data-plane) "listener" application coordinate to automatically create a Postgres database for each deployment. The database serves as the [persistence layer](/oss/langgraph/persistence#memory-store) for the deployment. 
When implementing a LangGraph application, a [checkpointer](/oss/langgraph/persistence#checkpointer-libraries) does not need to be configured by the developer. Instead, a checkpointer is automatically configured for the graph. Any checkpointer configured for a graph will be replaced by the one that is automatically configured. -There is no direct access to the database. All access to the database occurs through the [LangGraph Server](/langgraph-platform/langgraph-server). +There is no direct access to the database. All access to the database occurs through the [LangGraph Server](/langsmith/langgraph-server). The database is never deleted until the deployment itself is deleted. -A custom Postgres instance can be configured for [Hybrid](/langgraph-platform/hybrid) and [Self-Hosted](/langgraph-platform/self-hosted) deployments. +A custom Postgres instance can be configured for [Hybrid](/langsmith/hybrid) and [Self-Hosted](/langsmith/self-hosted) deployments. -### Asynchronous Deployment +### Asynchronous deployment Infrastructure for deployments and revisions are provisioned and deployed asynchronously. They are not deployed immediately after submission. Currently, deployment can take up to several minutes. @@ -116,7 +116,7 @@ Infrastructure for deployments and revisions are provisioned and deployed asynch * When a subsequent revision is created for a deployment, there is no database creation step. The deployment time for a subsequent revision is significantly faster compared to the deployment time of the initial revision. * The deployment process for each revision contains a build step, which can take up to a few minutes. -The control plane and [LangGraph Data Plane](/langgraph-platform/data-plane) "listener" application coordinate to achieve asynchronous deployments. +The control plane and [data plane](/langsmith/data-plane) "listener" application coordinate to achieve asynchronous deployments. 
### Monitoring @@ -124,14 +124,14 @@ After a deployment is ready, the control plane monitors the deployment and recor * CPU and memory usage of the deployment. * Number of container restarts. -* Number of replicas (this will increase with [autoscaling](/langgraph-platform/data-plane#autoscaling)). -* [Postgres](/langgraph-platform/data-plane#postgres) CPU, memory usage, and disk usage. -* [LangGraph Server queue](/langgraph-platform/langgraph-server#persistence-and-task-queue) pending/active run count. -* [LangGraph Server API](/langgraph-platform/langgraph-server) success response count, error response count, and latency. +* Number of replicas (this will increase with [autoscaling](/langsmith/data-plane#autoscaling)). +* [PostgreSQL](/langsmith/data-plane#postgres) CPU, memory usage, and disk usage. +* [LangGraph Server queue](/langsmith/langgraph-server#persistence-and-task-queue) pending/active run count. +* [LangGraph Server API](/langsmith/langgraph-server) success response count, error response count, and latency. These metrics are displayed as charts in the Control Plane UI. -### LangSmith Integration +### LangSmith integration A [LangSmith](/langsmith/home) tracing project is automatically created for each deployment. The tracing project has the same name as the deployment. When creating a deployment, the `LANGCHAIN_TRACING` and `LANGSMITH_API_KEY`/`LANGCHAIN_API_KEY` environment variables do not need to be specified; they are set automatically by the control plane. diff --git a/src/langsmith/cron-jobs.mdx b/src/langsmith/cron-jobs.mdx index 9d64b3fa5..d712ef227 100644 --- a/src/langsmith/cron-jobs.mdx +++ b/src/langsmith/cron-jobs.mdx @@ -8,16 +8,16 @@ There are many situations in which it is useful to run an assistant on a schedul For example, say that you're building an assistant that runs daily and sends an email summary of the day's news. You could use a cron job to run the assistant every day at 8:00 PM. 
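Cron schedules like the daily 8:00 PM example above are written as five space-separated fields: minute, hour, day-of-month, month, day-of-week. A minimal, hypothetical matcher (supporting only `*` and literal numbers, not ranges, lists, or steps) makes the semantics concrete:

```python
from datetime import datetime


def cron_matches(expr: str, dt: datetime) -> bool:
    """Return True if dt matches a 5-field cron expression.

    Only '*' and literal numbers are handled in this sketch; real cron
    also supports ranges, lists, and step values.
    """
    fields = expr.split()
    if len(fields) != 5:
        raise ValueError("expected 'minute hour day-of-month month day-of-week'")
    # Cron numbers weekdays 0-6 with Sunday as 0.
    actual = [dt.minute, dt.hour, dt.day, dt.month, dt.isoweekday() % 7]
    return all(f == "*" or int(f) == value for f, value in zip(fields, actual))


daily_8pm = "0 20 * * *"  # "run the assistant every day at 8:00 PM"
print(cron_matches(daily_8pm, datetime(2025, 9, 24, 20, 0)))  # True
print(cron_matches(daily_8pm, datetime(2025, 9, 24, 8, 0)))   # False
```

The server evaluates the schedule for you; this sketch is only to show what an expression like `0 20 * * *` means.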
-LangGraph Platform supports cron jobs, which run on a user-defined schedule. The user specifies a schedule, an assistant, and some input. After that, on the specified schedule, the server will: +LangSmith Platform supports cron jobs, which run on a user-defined schedule. The user specifies a schedule, an assistant, and some input. After that, on the specified schedule, the server will: * Create a new thread with the specified assistant * Send the specified input to that thread Note that this sends the same input to the thread every time. -The LangGraph Platform API provides several endpoints for creating and managing cron jobs. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) for more details. +The LangSmith Platform API provides several endpoints for creating and managing cron jobs. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) for more details. -Sometimes you don't want to run your graph based on user interaction, but rather you would like to schedule your graph to run on a schedule - for example if you wish for your graph to compose and send out a weekly email of to-dos for your team. LangGraph Platform allows you to do this without having to write your own script by using the `Crons` client. To schedule a graph job, you need to pass a [cron expression](https://crontab.cronhub.io/) to inform the client when you want to run the graph. `Cron` jobs are run in the background and do not interfere with normal invocations of the graph. +Sometimes you don't want to run your graph in response to user interaction; instead, you want it to run on a schedule, for example, to compose and send out a weekly email of to-dos for your team. LangSmith Platform allows you to do this without writing your own scheduling script by using the `Crons` client.
To schedule a graph job, you need to pass a [cron expression](https://crontab.cronhub.io/) to inform the client when you want to run the graph. `Cron` jobs are run in the background and do not interfere with normal invocations of the graph. ## Setup diff --git a/src/langsmith/custom-auth.mdx index e193e7243..43b01617b 100644 --- a/src/langsmith/custom-auth.mdx +++ b/src/langsmith/custom-auth.mdx @@ -2,7 +2,7 @@ title: Add custom authentication sidebarTitle: Add custom authentication --- -This guide shows how to add custom authentication to your LangGraph Platform application. This guide applies to both LangGraph Platform and self-hosted deployments. It does not apply to isolated usage of the LangGraph open source library in your own custom server. +This guide shows how to add custom authentication to your LangSmith Platform application. This guide applies to both cloud and self-hosted deployments. It does not apply to isolated usage of the LangGraph open source library in your own custom server. ## Add custom authentication to your deployment @@ -114,7 +114,7 @@ To leverage custom authentication and access user-level metadata in your deploym ## Enable agent authentication -After [authentication](#add-custom-authentication-to-your-deployment), the platform creates a special configuration object (`config`) that is passed to LangGraph Platform deployment. This object contains information about the current user, including any custom fields you return from your `@auth.authenticate` handler. +After [authentication](#add-custom-authentication-to-your-deployment), the platform creates a special configuration object (`config`) that is passed to your LangSmith Platform deployment. This object contains information about the current user, including any custom fields you return from your `@auth.authenticate` handler.
To allow an agent to perform authenticated actions on behalf of the user, access this object in your graph with the `langgraph_auth_user` key: @@ -158,10 +158,10 @@ async def add_owner( return filters ``` -Only use this if you want to permit developer access to a graph deployed on the managed LangGraph Platform SaaS. +Only use this if you want to permit developer access to a graph deployed on the managed LangSmith Platform SaaS. ## Learn more -* [Authentication & Access Control](/langgraph-platform/auth) -* [LangGraph Platform](/langgraph-platform/index) -* [Setting up custom authentication tutorial](/langgraph-platform/set-up-custom-auth) +* [Authentication & Access Control](/langsmith/auth) +* [LangSmith Platform](/langsmith/home) +* [Setting up custom authentication tutorial](/langsmith/set-up-custom-auth) diff --git a/src/langsmith/custom-lifespan.mdx b/src/langsmith/custom-lifespan.mdx index 41c2960a3..42f32f7d6 100644 --- a/src/langsmith/custom-lifespan.mdx +++ b/src/langsmith/custom-lifespan.mdx @@ -2,9 +2,9 @@ title: How to add custom lifespan events sidebarTitle: Add custom lifespan events --- -When deploying agents to LangGraph Platform, you often need to initialize resources like database connections when your server starts up, and ensure they're properly closed when it shuts down. Lifespan events let you hook into your server's startup and shutdown sequence to handle these critical setup and teardown tasks. +When deploying agents to LangSmith Platform, you often need to initialize resources like database connections when your server starts up, and ensure they're properly closed when it shuts down. Lifespan events let you hook into your server's startup and shutdown sequence to handle these critical setup and teardown tasks. -This works the same way as [adding custom routes](/langgraph-platform/custom-routes). 
You just need to provide your own [`Starlette`](https://www.starlette.io/applications/) app (including [`FastAPI`](https://fastapi.tiangolo.com/), [`FastHTML`](https://fastht.ml/) and other compatible apps). +This works the same way as [adding custom routes](/langsmith/custom-routes). You just need to provide your own [`Starlette`](https://www.starlette.io/applications/) app (including [`FastAPI`](https://fastapi.tiangolo.com/), [`FastHTML`](https://fastht.ml/) and other compatible apps). Below is an example using FastAPI. @@ -15,7 +15,7 @@ We currently only support custom lifespan events in Python deployments with `lan ## Create app -Starting from an **existing** LangGraph Platform application, add the following lifespan code to your `webapp.py` file. If you are starting from scratch, you can create a new app from a template using the CLI. +Starting from an **existing** LangSmith Platform application, add the following lifespan code to your `webapp.py` file. If you are starting from scratch, you can create a new app from a template using the CLI. ```bash langgraph new --template=new-langgraph-project-python my_new_project ``` @@ -77,8 +77,8 @@ You should see your startup message printed when the server starts, and your cle ## Deploying -You can deploy your app as-is to LangGraph Platform or to your self-hosted platform. +You can deploy your app as-is to Cloud or to your self-hosted platform. ## Next steps -Now that you've added lifespan events to your deployment, you can use similar techniques to add [custom routes](/langgraph-platform/custom-routes) or [custom middleware](/langgraph-platform/custom-middleware) to further customize your server's behavior. +Now that you've added lifespan events to your deployment, you can use similar techniques to add [custom routes](/langsmith/custom-routes) or [custom middleware](/langsmith/custom-middleware) to further customize your server's behavior.
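Under FastAPI, a lifespan is just an async context manager: code before the `yield` runs at startup, code after it runs at shutdown. Stripped of the framework, the pattern looks like this (stdlib-only sketch with a stand-in resource; in a real deployment FastAPI drives the context manager for you):

```python
import asyncio
from contextlib import asynccontextmanager

events = []


@asynccontextmanager
async def lifespan(state: dict):
    # Startup: acquire resources before the server handles requests.
    events.append("startup")
    state["db"] = "fake-connection"  # stand-in for a real DB pool
    try:
        yield
    finally:
        # Shutdown: always release resources, even if serving raised.
        state.pop("db", None)
        events.append("shutdown")


async def main():
    state = {}
    async with lifespan(state):  # FastAPI would enter/exit this for you
        assert state["db"] == "fake-connection"  # "serving requests" here
    assert "db" not in state


asyncio.run(main())
print(events)  # ['startup', 'shutdown']
```

The `try`/`finally` is what guarantees teardown runs even when the serving block fails, which is the main reason to prefer lifespans over ad-hoc startup code.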
diff --git a/src/langsmith/custom-middleware.mdx b/src/langsmith/custom-middleware.mdx index ce3327dae..2698979ce 100644 --- a/src/langsmith/custom-middleware.mdx +++ b/src/langsmith/custom-middleware.mdx @@ -2,9 +2,9 @@ title: How to add custom middleware sidebarTitle: Add custom middleware --- -When deploying agents to LangGraph Platform, you can add custom middleware to your server to handle concerns like logging request metrics, injecting or checking headers, and enforcing security policies without modifying core server logic. This works the same way as [adding custom routes](/langgraph-platform/custom-routes). You just need to provide your own [`Starlette`](https://www.starlette.io/applications/) app (including [`FastAPI`](https://fastapi.tiangolo.com/), [`FastHTML`](https://fastht.ml/) and other compatible apps). +When deploying agents to LangSmith Platform, you can add custom middleware to your server to handle concerns like logging request metrics, injecting or checking headers, and enforcing security policies without modifying core server logic. This works the same way as [adding custom routes](/langsmith/custom-routes). You just need to provide your own [`Starlette`](https://www.starlette.io/applications/) app (including [`FastAPI`](https://fastapi.tiangolo.com/), [`FastHTML`](https://fastht.ml/) and other compatible apps). -Adding middleware lets you intercept and modify requests and responses globally across your deployment, whether they're hitting your custom endpoints or the built-in LangGraph Platform APIs. +Adding middleware lets you intercept and modify requests and responses globally across your deployment, whether they're hitting your custom endpoints or the built-in LangSmith Platform APIs. Below is an example using FastAPI. 
@@ -15,7 +15,7 @@ We currently only support custom middleware in Python deployments with `langgrap ## Create app -Starting from an **existing** LangGraph Platform application, add the following middleware code to your `webapp.py` file. If you are starting from scratch, you can create a new app from a template using the CLI. +Starting from an **existing** LangSmith Platform application, add the following middleware code to your `webapp.py` file. If you are starting from scratch, you can create a new app from a template using the CLI. ```bash langgraph new --template=new-langgraph-project-python my_new_project @@ -91,8 +91,8 @@ Now any request to your server will include the custom header `X-Custom-Header` ## Deploying -You can deploy this app as-is to LangGraph Platform or to your self-hosted platform. +You can deploy this app as-is to cloud or to your self-hosted platform. ## Next steps -Now that you've added custom middleware to your deployment, you can use similar techniques to add [custom routes](/langgraph-platform/custom-routes) or define [custom lifespan events](/langgraph-platform/custom-lifespan) to further customize your server's behavior. +Now that you've added custom middleware to your deployment, you can use similar techniques to add [custom routes](/langsmith/custom-routes) or define [custom lifespan events](/langsmith/custom-lifespan) to further customize your server's behavior. diff --git a/src/langsmith/custom-routes.mdx b/src/langsmith/custom-routes.mdx index 92a0f84c6..d66ed857d 100644 --- a/src/langsmith/custom-routes.mdx +++ b/src/langsmith/custom-routes.mdx @@ -2,9 +2,9 @@ title: How to add custom routes sidebarTitle: Add custom routes --- -When deploying agents to LangGraph platform, your server automatically exposes routes for creating runs and threads, interacting with the long-term memory store, managing configurable assistants, and other core functionality ([see all default API endpoints](/langgraph-platform/server-api-ref)). 
+When deploying agents to LangSmith Platform, your server automatically exposes routes for creating runs and threads, interacting with the long-term memory store, managing configurable assistants, and other core functionality ([see all default API endpoints](/langsmith/server-api-ref)). -You can add custom routes by providing your own [`Starlette`](https://www.starlette.io/applications/) app (including [`FastAPI`](https://fastapi.tiangolo.com/), [`FastHTML`](https://fastht.ml/) and other compatible apps). You make LangGraph Platform aware of this by providing a path to the app in your `langgraph.json` configuration file. +You can add custom routes by providing your own [`Starlette`](https://www.starlette.io/applications/) app (including [`FastAPI`](https://fastapi.tiangolo.com/), [`FastHTML`](https://fastht.ml/) and other compatible apps). You make LangSmith Platform aware of this by providing a path to the app in your `langgraph.json` configuration file. Defining a custom app object lets you add any routes you'd like, so you can do anything from adding a `/login` endpoint to writing an entire full-stack web-app, all deployed in a single LangGraph Server. @@ -12,7 +12,7 @@ Below is an example using FastAPI. ## Create app -Starting from an **existing** LangGraph Platform application, add the following custom route code to your `webapp.py` file. If you are starting from scratch, you can create a new app from a template using the CLI. +Starting from an **existing** LangSmith Platform application, add the following custom route code to your `webapp.py` file. If you are starting from scratch, you can create a new app from a template using the CLI. ```bash langgraph new --template=new-langgraph-project-python my_new_project ``` @@ -68,8 +68,8 @@ The routes you create in the app are given priority over the system defaults, me ## Deploying -You can deploy this app as-is to LangGraph Platform or to your self-hosted platform. 
+You can deploy this app as-is to LangSmith Platform or to your self-hosted platform. ## Next steps -Now that you've added a custom route to your deployment, you can use this same technique to further customize how your server behaves, such as defining custom [custom middleware](/langgraph-platform/custom-middleware) and [custom lifespan events](/langgraph-platform/custom-lifespan). +Now that you've added a custom route to your deployment, you can use this same technique to further customize how your server behaves, such as defining [custom middleware](/langsmith/custom-middleware) and [custom lifespan events](/langsmith/custom-lifespan). diff --git a/src/langsmith/data-plane.mdx b/src/langsmith/data-plane.mdx index 73491200d..8ccfd6dd8 100644 --- a/src/langsmith/data-plane.mdx +++ b/src/langsmith/data-plane.mdx @@ -1,22 +1,22 @@ --- -title: LangGraph Data Plane -sidebarTitle: Data Plane +title: LangSmith Platform data plane +sidebarTitle: Data plane --- -The term "data plane" is used broadly to refer to [LangGraph Servers](/langgraph-platform/langgraph-server) (deployments), the corresponding infrastructure for each server, and the "listener" application that continuously polls for updates from the [LangGraph Control Plane](/langgraph-platform/control-plane). +The _data plane_ consists of your [LangGraph Servers](/langsmith/langgraph-server) (deployments), their supporting infrastructure, and the "listener" application that continuously polls for updates from the [LangSmith Platform control plane](/langsmith/control-plane). 
-## Server Infrastructure +## Server infrastructure -In addition to the [LangGraph Server](/langgraph-platform/langgraph-server) itself, the following infrastructure components for each server are also included in the broad definition of "data plane": +In addition to the [LangGraph Server](/langsmith/langgraph-server) itself, the following infrastructure components for each server are also included in the broad definition of "data plane": -* Postgres -* Redis -* Secrets store -* Autoscalers +- **PostgreSQL**: persistence layer for user, run, and memory data. +- **Redis**: communication and ephemeral metadata for workers. +- **Secrets store**: secure management of environment secrets. +- **Autoscalers**: scale server containers based on load. -## "Listener" Application +## "Listener" application -The data plane "listener" application periodically calls [control plane APIs](/langgraph-platform/control-plane#control-plane-api) to: +The data plane "listener" application periodically calls [control plane APIs](/langsmith/control-plane#control-plane-api) to: * Determine if new deployments should be created. * Determine if existing deployments should be updated (i.e. new revisions). @@ -24,9 +24,9 @@ The data plane "listener" application periodically calls [control plane APIs](/l In other words, the data plane "listener" reads the latest state of the control plane (desired state) and takes action to reconcile outstanding deployments (current state) to match the latest state. -## Postgres +## PostgreSQL -Postgres is the persistence layer for all user, run, and long-term memory data in a LangGraph Server. This stores both checkpoints (see more info [here](/oss/langgraph/persistence)), server resources (threads, runs, assistants and crons), as well as items saved in the long-term memory store (see more info [here](/oss/langgraph/persistence#memory-store)). +PostgreSQL is the persistence layer for all user, run, and long-term memory data in a LangGraph Server. 
This stores both checkpoints (see more info [here](/oss/langgraph/persistence)), server resources (threads, runs, assistants and crons), as well as items saved in the long-term memory store (see more info [here](/oss/langgraph/persistence#memory-store)). ## Redis @@ -36,23 +36,23 @@ Redis is used in each LangGraph Server as a way for server and queue workers to All runs in a LangGraph Server are executed by a pool of background workers that are part of each deployment. In order to enable some features for those runs (such as cancellation and output streaming) we need a channel for two-way communication between the server and the worker handling a particular run. We use Redis to organize that communication. -1. A Redis list is used as a mechanism to wake up a worker as soon as a new run is created. Only a sentinel value is stored in this list, no actual run information. The run information is then retrieved from Postgres by the worker. +1. A Redis list is used as a mechanism to wake up a worker as soon as a new run is created. Only a sentinel value is stored in this list, no actual run information. The run information is then retrieved from PostgreSQL by the worker. 2. A combination of a Redis string and Redis PubSub channel is used for the server to communicate a run cancellation request to the appropriate worker. 3. A Redis PubSub channel is used by the worker to broadcast streaming output from an agent while the run is being handled. Any open `/stream` request in the server will subscribe to that channel and forward any events to the response as they arrive. No events are stored in Redis at any time. ### Ephemeral metadata -Runs in a LangGraph Server may be retried for specific failures (currently only for transient Postgres errors encountered during the run). In order to limit the number of retries (currently limited to 3 attempts per run) we record the attempt number in a Redis string when it is picked up. 
This contains no run-specific info other than its ID, and expires after a short delay. +Runs in a LangGraph Server may be retried for specific failures (currently only for transient PostgreSQL errors encountered during the run). In order to limit the number of retries (currently limited to 3 attempts per run) we record the attempt number in a Redis string when it is picked up. This contains no run-specific info other than its ID, and expires after a short delay. -## Data Plane Features +## Data plane features This section describes various features of the data plane. -### Data Region +### Data region **Only for Cloud** -Data regions are only applicable for [Cloud](/langgraph-platform/cloud) deployments. +Data regions are only applicable for [Cloud](/langsmith/cloud) deployments. Deployments can be created in 2 data regions: US and EU @@ -61,11 +61,11 @@ The data region for a deployment is implied by the data region of the LangSmith ### Autoscaling -[`Production` type](/langgraph-platform/control-plane#deployment-types) deployments automatically scale up to 10 containers. Scaling is based on 3 metrics: +[`Production` type](/langsmith/control-plane#deployment-types) deployments automatically scale up to 10 containers. Scaling is based on 3 metrics: 1. CPU utilization 2. Memory utilization -3. Number of pending (in progress) [runs](/langgraph-platform/assistants#execution) +3. Number of pending (in progress) [runs](/langsmith/assistants#execution) For CPU utilization, the autoscaler targets 75% utilization. This means the autoscaler will scale the number of containers up or down to ensure that CPU utilization is at or near 75%. For memory utilization, the autoscaler targets 75% utilization as well. @@ -75,11 +75,11 @@ Each metric is computed independently and the autoscaler will determine the scal Scale down actions are delayed for 30 minutes before any action is taken. 
In other words, if the autoscaler decides to scale down a deployment, it will first wait for 30 minutes before scaling down. After 30 minutes, the metrics are recomputed and the deployment will scale down if the recomputed metrics result in a lower number of containers than the current number. Otherwise, the deployment remains scaled up. This "cool down" period ensures that deployments do not scale up and down too frequently. -### Static IP Addresses +### Static IP addresses **Only for Cloud** -Static IP addresses are only available for [Cloud](/langgraph-platform/cloud) deployments. +Static IP addresses are only available for [Cloud](/langsmith/cloud) deployments. All traffic from deployments created after January 6th 2025 will come through a NAT gateway. This NAT gateway will have several static IP addresses depending on the data region. Refer to the table below for the list of static IP addresses: @@ -103,27 +103,27 @@ All traffic from deployments created after January 6th 2025 will come through a | 34.121.166.52 | | | 34.31.121.70 | | -### Custom Postgres +### Custom PostgreSQL -Custom Postgres instances are only available for [Hybrid](/langgraph-platform/hybrid) and [Self-Hosted](/langgraph-platform/self-hosted) deployments. +Custom PostgreSQL instances are only available for [hybrid](/langsmith/hybrid) and [self-hosted](/langsmith/self-hosted) deployments. -A custom Postgres instance can be used instead of the [one automatically created by the control plane](/langgraph-platform/control-plane#database-provisioning). Specify the [`POSTGRES_URI_CUSTOM`](/langgraph-platform/env-var#postgres_uri_custom) environment variable to use a custom Postgres instance. +A custom PostgreSQL instance can be used instead of the [one automatically created by the control plane](/langsmith/control-plane#database-provisioning). Specify the [`POSTGRES_URI_CUSTOM`](/langsmith/env-var#postgres_uri_custom) environment variable to use a custom PostgreSQL instance. 
-Multiple deployments can share the same Postgres instance. For example, for `Deployment A`, `POSTGRES_URI_CUSTOM` can be set to `postgres://<user>:<password>@/<database_name_1>?host=<hostname>` and for `Deployment B`, `POSTGRES_URI_CUSTOM` can be set to `postgres://<user>:<password>@/<database_name_2>?host=<hostname>`. `<database_name_1>` and `database_name_2` are different databases within the same instance, but `<hostname>` is shared. **The same database cannot be used for separate deployments**. +Multiple deployments can share the same PostgreSQL instance. For example, for `Deployment A`, `POSTGRES_URI_CUSTOM` can be set to `postgres://<user>:<password>@/<database_name_1>?host=<hostname>` and for `Deployment B`, `POSTGRES_URI_CUSTOM` can be set to `postgres://<user>:<password>@/<database_name_2>?host=<hostname>`. `<database_name_1>` and `database_name_2` are different databases within the same instance, but `<hostname>` is shared. **The same database cannot be used for separate deployments**. ### Custom Redis -Custom Redis instances are only available for [Hybrid](/langgraph-platform/hybrid) and [Self-Hosted](/langgraph-platform/self-hosted) deployments. +Custom Redis instances are only available for [Hybrid](/langsmith/hybrid) and [Self-Hosted](/langsmith/self-hosted) deployments. -A custom Redis instance can be used instead of the one automatically created by the control plane. Specify the [REDIS_URI_CUSTOM](/langgraph-platform/env-var#redis_uri_custom) environment variable to use a custom Redis instance. +A custom Redis instance can be used instead of the one automatically created by the control plane. Specify the [REDIS_URI_CUSTOM](/langsmith/env-var#redis_uri_custom) environment variable to use a custom Redis instance. Multiple deployments can share the same Redis instance. For example, for `Deployment A`, `REDIS_URI_CUSTOM` can be set to `redis://<hostname>:<port>/1` and for `Deployment B`, `REDIS_URI_CUSTOM` can be set to `redis://<hostname>:<port>/2`. `1` and `2` are different database numbers within the same instance, but `<hostname>` is shared. **The same database number cannot be used for separate deployments**. 
-### LangSmith Tracing +### LangSmith tracing LangGraph Server is automatically configured to send traces to LangSmith. See the table below for details with respect to each deployment option. @@ -137,7 +137,7 @@ LangGraph Server is automatically configured to report telemetry metadata for bi | Cloud | Hybrid | Self-Hosted | |------------|------------------------|----------------------| -| Telemetry sent to LangSmith SaaS. | Telemetry sent to LangSmith SaaS. | Self-reported usage (audit) for air-gapped license key.
Telemetry sent to LangSmith SaaS for LangGraph Platform License Key. | +| Telemetry sent to LangSmith SaaS. | Telemetry sent to LangSmith SaaS. | Self-reported usage (audit) for air-gapped license key.
Telemetry sent to LangSmith SaaS for LangSmith Platform License Key. | ### Licensing @@ -145,4 +145,4 @@ LangGraph Server is automatically configured to perform license key validation. | Cloud | Hybrid | Self-Hosted | |------------|------------------------|----------------------| -| LangSmith API Key validated against LangSmith SaaS. | LangSmith API Key validated against LangSmith SaaS. | Air-gapped license key or LangGraph Platform License Key validated against LangSmith SaaS. | +| LangSmith API Key validated against LangSmith SaaS. | LangSmith API Key validated against LangSmith SaaS. | Air-gapped license key or LangSmith Platform License Key validated against LangSmith SaaS. | diff --git a/src/langsmith/data-storage-and-privacy.mdx b/src/langsmith/data-storage-and-privacy.mdx index 1a1aa9f12..594c51edb 100644 --- a/src/langsmith/data-storage-and-privacy.mdx +++ b/src/langsmith/data-storage-and-privacy.mdx @@ -6,7 +6,7 @@ This document describes how data is processed in the LangGraph CLI and the LangG ## CLI -LangGraph **CLI** is the command-line interface for building and running LangGraph applications; see the [CLI guide](/langgraph-platform/langgraph-cli) to learn more. +LangGraph **CLI** is the command-line interface for building and running LangGraph applications; see the [CLI guide](/langsmith/langgraph-cli) to learn more. By default, calls to most CLI commands log a single analytics event upon invocation. This helps us better prioritize improvements to the CLI experience. Each telemetry event contains the calling process's OS, OS version, Python version, the CLI version, the command name (`dev`, `up`, `run`, etc.), and booleans representing whether a flag was passed to the command. You can see the full analytics logic [here](https://github.com/langchain-ai/langgraph/blob/main/libs/cli/langgraph-cli/analytics.py). @@ -15,7 +15,7 @@ You can disable all CLI telemetry by setting `LANGGRAPH_CLI_NO_ANALYTICS=1`. 
## LangGraph Server -The [LangGraph Server](/langgraph-platform/langgraph-server) provides a durable execution runtime that relies on persisting checkpoints of your application state, long-term memories, thread metadata, assistants, and similar resources to the local file system or a database. Unless you have deliberately customized the storage location, this information is either written to local disk (for `langgraph dev`) or a PostgreSQL database (for `langgraph up` and in all deployments). +The [LangGraph Server](/langsmith/langgraph-server) provides a durable execution runtime that relies on persisting checkpoints of your application state, long-term memories, thread metadata, assistants, and similar resources to the local file system or a database. Unless you have deliberately customized the storage location, this information is either written to local disk (for `langgraph dev`) or a PostgreSQL database (for `langgraph up` and in all deployments). ### LangSmith Tracing @@ -24,12 +24,12 @@ When running the LangGraph server (either in-memory or in Docker), LangSmith tra ### In-memory development server -`langgraph dev` runs an [in-memory development server](/langgraph-platform/local-server) as a single Python process, designed for quick development and testing. It saves all checkpointing and memory data to disk within a `.langgraph_api` directory in the current working directory. Apart from the telemetry data described in the [CLI](#cli) section, no data leaves the machine unless you have enabled tracing or your graph code explicitly contacts an external service. +`langgraph dev` runs an [in-memory development server](/langsmith/local-server) as a single Python process, designed for quick development and testing. It saves all checkpointing and memory data to disk within a `.langgraph_api` directory in the current working directory. 
Apart from the telemetry data described in the [CLI](#cli) section, no data leaves the machine unless you have enabled tracing or your graph code explicitly contacts an external service. ### Standalone Server -`langgraph up` builds your local package into a Docker image and runs the server as the [data plane](/langgraph-platform/deployment-options#self-hosted) consisting of three containers: the API server, a PostgreSQL container, and a Redis container. All persistent data (checkpoints, assistants, etc.) are stored in the PostgreSQL database. Redis is used as a pubsub connection for real-time streaming of events. You can encrypt all checkpoints before saving to the database by setting a valid `LANGGRAPH_AES_KEY` environment variable. You can also specify [TTLs](/langgraph-platform/configure-ttl) for checkpoints and cross-thread memories in `langgraph.json` to control how long data is stored. All persisted threads, memories, and other data can be deleted via the relevant API endpoints. +`langgraph up` builds your local package into a Docker image and runs the server as the [data plane](/langsmith/deployment-options#self-hosted) consisting of three containers: the API server, a PostgreSQL container, and a Redis container. All persistent data (checkpoints, assistants, etc.) are stored in the PostgreSQL database. Redis is used as a pubsub connection for real-time streaming of events. You can encrypt all checkpoints before saving to the database by setting a valid `LANGGRAPH_AES_KEY` environment variable. You can also specify [TTLs](/langsmith/configure-ttl) for checkpoints and cross-thread memories in `langgraph.json` to control how long data is stored. All persisted threads, memories, and other data can be deleted via the relevant API endpoints. Additional API calls are made to confirm that the server has a valid license and to track the number of executed runs and tasks. Periodically, the API server validates the provided license key (or API key). 
@@ -37,7 +37,7 @@ If you've disabled [tracing](#langsmith-tracing), no user data is persisted exte ## Studio -[Studio](/langgraph-platform/langgraph-studio) is a graphical interface for interacting with your LangGraph server. It does not persist any private data (the data you send to your server is not sent to LangSmith). Though the studio interface is served at [smith.langchain.com](https://smith.langchain.com), it is run in your browser and connects directly to your local LangGraph server so that no data needs to be sent to LangSmith. +[Studio](/langsmith/langgraph-studio) is a graphical interface for interacting with your LangGraph server. It does not persist any private data (the data you send to your server is not sent to LangSmith). Though the studio interface is served at [smith.langchain.com](https://smith.langchain.com), it is run in your browser and connects directly to your local LangGraph server so that no data needs to be sent to LangSmith. If you are logged in, LangSmith does collect some usage analytics to help improve studio's user experience. This includes: diff --git a/src/langsmith/deploy-hybrid.mdx b/src/langsmith/deploy-hybrid.mdx index ab0eec94b..c6d81b450 100644 --- a/src/langsmith/deploy-hybrid.mdx +++ b/src/langsmith/deploy-hybrid.mdx @@ -1,19 +1,20 @@ --- -title: How to deploy hybrid -sidebarTitle: Deployment guide -icon: "server" +title: Set up hybrid LangSmith Platform +sidebarTitle: Setup guide +icon: "cloud" --- -Before deploying, review the [conceptual guide for the Hybrid](/langgraph-platform/hybrid) deployment option. + +Before you set up a hybrid deployment, review the [hybrid overview](/langsmith/hybrid) for LangSmith Platform. **Important** -The Hybrid deployment option requires an [Enterprise](/langgraph-platform/plans) plan. +The Hybrid deployment option requires an [Enterprise](/langsmith/plans) plan. ## Prerequisites -1. Use the [LangGraph CLI](/langgraph-platform/langgraph-cli) to [test your application locally](/langgraph-platform/local-server). -2. 
Use the [LangGraph CLI](/langgraph-platform/langgraph-cli) to build a Docker image (i.e. `langgraph build`) and push it to a registry your Kubernetes cluster or Amazon ECS cluster has access to. +1. Use the [LangGraph CLI](/langsmith/langgraph-cli) to [test your application locally](/langsmith/local-server). +2. Use the [LangGraph CLI](/langsmith/langgraph-cli) to build a Docker image (i.e. `langgraph build`) and push it to a registry your Kubernetes cluster or Amazon ECS cluster has access to. ## Kubernetes @@ -33,7 +34,7 @@ The Hybrid deployment option requires an [Enterprise](/langgraph-platform/plans) ### Setup 1. Provide your LangSmith organization ID to us. Your LangSmith organization will be configured to deploy the data plane in your cloud. -2. Create a listener from the LangSmith UI. The `Listener` data model is configured for the actual ["listener" application](/langgraph-platform/data-plane#”listener”-application). +2. Create a listener from the LangSmith UI. The `Listener` data model is configured for the actual ["listener" application](/langsmith/data-plane#listener-application). 1. In the left-hand navigation, select `LangGraph Platform` > `Listeners`. 2. In the top-right of the page, select `+ Create Listener`. 3. Enter a unique `Compute ID` for the listener. The `Compute ID` is a user-defined identifier that should be unique across all listeners in the current LangSmith workspace. The `Compute ID` is displayed to end users when they are creating a new deployment. Ensure that the `Compute ID` provides context to the end user about where their LangGraph Server deployments will be deployed to. For example, a `Compute ID` can be set to `k8s-cluster-name-dev-01`. In this example, the name of the Kubernetes cluster is `k8s-cluster-name`, `dev` denotes that the cluster is reserved for "development" workloads, and `01` is a numerical suffix to reduce naming collisions. 
@@ -45,7 +46,7 @@ The Hybrid deployment option requires an [Enterprise](/langgraph-platform/plans) Creating a listener from the LangSmith UI does not install the "listener" application in the Kubernetes cluster. 3. A [Helm chart](https://github.com/langchain-ai/helm/tree/main/charts/langgraph-dataplane) is provided to install the necesssary components in your Kubernetes cluster. - - `langgraph-listener`: This is a service that listens to LangChain's [control plane](/langgraph-platform/control-plane) for changes to your deployments and creates/updates downstream CRDs. This is the ["listener" application](/langgraph-platform/data-plane#”listener”-application). + - `langgraph-listener`: This is a service that listens to LangChain's [control plane](/langsmith/control-plane) for changes to your deployments and creates/updates downstream CRDs. This is the ["listener" application](/langsmith/data-plane#listener-application). - `LangGraphPlatform CRD`: A CRD for LangGraph Platform deployments. This contains the spec for managing an instance of a LangGraph Platform deployment. - `langgraph-platform-operator`: This operator handles changes to your LangGraph Platform CRDs. 4. Configure your `langgraph-dataplane-values.yaml` file. @@ -87,7 +88,7 @@ The Hybrid deployment option requires an [Enterprise](/langgraph-platform/plans) langgraph-dataplane-operator-6b88879f9b-t76gk 1/1 Running 0 26s langgraph-dataplane-redis-0 1/1 Running 0 25s ``` -7. Create a deployment from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui). +7. Create a deployment from the [control plane UI](/langsmith/control-plane#control-plane-ui). 1. Select the desired listener from the list of `Compute IDs` in the dropdown menu. 2. Select the Kubernetes namespace to deploy to. 3. Fill out all other required fields and select `Submit` in the top-right of the panel. 
diff --git a/src/langsmith/deploy-self-hosted-full-platform.mdx b/src/langsmith/deploy-self-hosted-full-platform.mdx index 32976c216..2e92711d0 100644 --- a/src/langsmith/deploy-self-hosted-full-platform.mdx +++ b/src/langsmith/deploy-self-hosted-full-platform.mdx @@ -1,22 +1,22 @@ --- -title: How to deploy self-hosted full platform -sidebarTitle: Full-platform deployment guide +title: Self-host LangSmith Platform with agent deployment +sidebarTitle: With agent deployment icon: "server" -iconType: "solid" --- -Before deploying, review the [conceptual guide for the Self-Hosted Full Platform](/langgraph-platform/self-hosted) deployment option. + +Before setting up LangSmith Platform with agent deployment, review the [self-hosting overview page](/langsmith/self-hosted#with-agent-deployment). **Important** -The Self-Hosted Full Platform deployment option requires an [Enterprise](/langgraph-platform/plans) plan. +Self-hosting LangSmith Platform with agent deployment requires an [Enterprise](/langsmith/plans) plan. ## Prerequisites 1. You are using Kubernetes. -2. You have self-hosted LangSmith deployed. -3. Use the [LangGraph CLI](/langgraph-platform/langgraph-cli) to [test your application locally](/langgraph-platform/local-server). -4. Use the [LangGraph CLI](/langgraph-platform/langgraph-cli) to build a Docker image (i.e. `langgraph build`) and push it to a registry your Kubernetes cluster has access to. +2. You have an instance of [self-hosted LangSmith Platform](/langsmith/kubernetes) running. +3. Use the [LangGraph CLI](/langsmith/langgraph-cli) to [test your application locally](/langsmith/local-server). +4. Use the [LangGraph CLI](/langsmith/langgraph-cli) to build a Docker image (i.e. `langgraph build`) and push it to a registry your Kubernetes cluster has access to. 5. `KEDA` is installed on your cluster. 
```bash helm repo add kedacore https://kedacore.github.io/charts @@ -30,15 +30,15 @@ The Self-Hosted Full Platform deployment option requires an [Enterprise](/langgr ```bash kubectl get storageclass ``` -9. Egress to `https://beacon.langchain.com` from your network. This is required for license verification and usage reporting if not running in air-gapped mode. See the [Egress documentation](/langgraph-platform/egress-metrics-metadata) for more details. +9. Egress to `https://beacon.langchain.com` from your network. This is required for license verification and usage reporting if not running in air-gapped mode. See the [Egress documentation](/langsmith/egress-metrics-metadata) for more details. ## Setup -1. As part of configuring your Self-Hosted LangSmith instance, you enable the `langgraphPlatform` option. This will provision a few key resources. - 1. `listener`: This is a service that listens to the [control plane](/langgraph-platform/control-plane) for changes to your deployments and creates/updates downstream CRDs. - 2. `LangGraphPlatform CRD`: A CRD for LangGraph Platform deployments. This contains the spec for managing an instance of a LangGraph platform deployment. - 3. `operator`: This operator handles changes to your LangGraph Platform CRDs. - 4. `host-backend`: This is the [control plane](/langgraph-platform/control-plane). +1. As part of configuring your self-hosted LangSmith instance, you enable the `langgraphPlatform` option. This will provision a few key resources. + 1. `listener`: This is a service that listens to the [control plane](/langsmith/control-plane) for changes to your deployments and creates/updates downstream CRDs. + 2. `LangGraphPlatform CRD`: A CRD for LangSmith Platform deployments. This contains the spec for managing an instance of a LangSmith Platform deployment. + 3. `operator`: This operator handles changes to your LangSmith Platform CRDs. + 4. `host-backend`: This is the [control plane](/langsmith/control-plane). 2. 
Two additional images will be used by the chart. Use the images that are specified in the latest release. ```bash hostBackendImage: @@ -57,4 +57,4 @@ The Self-Hosted Full Platform deployment option requires an [Enterprise](/langgr ``` 4. In your `values.yaml` file, configure the `hostBackendImage` and `operatorImage` options (if you need to mirror images) 5. You can also configure base templates for your agents by overriding the base templates [here](https://github.com/langchain-ai/helm/blob/main/charts/langsmith/values.yaml#L898). -6. You create a deployment from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui). +6. You create a deployment from the [control plane UI](/langsmith/control-plane#control-plane-ui). diff --git a/src/langsmith/deploy-standalone-server.mdx b/src/langsmith/deploy-standalone-server.mdx index bf81f6abb..66b9f1b62 100644 --- a/src/langsmith/deploy-standalone-server.mdx +++ b/src/langsmith/deploy-standalone-server.mdx @@ -1,16 +1,15 @@ --- -title: How to deploy self-hosted standalone server -sidebarTitle: Standalone server deployment guide +title: Self-host standalone servers +sidebarTitle: Standalone servers icon: "server" -iconType: "solid" --- -Before deploying, review the [conceptual guide for the Standalone Server](/langgraph-platform/self-hosted#standalone-server) deployment option. +Before setting up standalone servers, review the [self-hosting overview page](/langsmith/self-hosted#standalone-server). ## Prerequisites -1. Use the [LangGraph CLI](/langgraph-platform/langgraph-cli) to [test your application locally](/langgraph-platform/local-server). -2. Use the [LangGraph CLI](/langgraph-platform/langgraph-cli) to build a Docker image (i.e. `langgraph build`). +1. Use the [LangGraph CLI](/langsmith/langgraph-cli) to [test your application locally](/langsmith/local-server). +2. Use the [LangGraph CLI](/langsmith/langgraph-cli) to build a Docker image (i.e. `langgraph build`). 3. 
The following environment variables are needed for a data plane deployment. 1. `REDIS_URI`: Connection details to a Redis instance. Redis will be used as a pub-sub broker to enable streaming real time output from background runs. The value of `REDIS_URI` must be a valid [Redis connection URI](https://redis-py.readthedocs.io/en/stable/connections.html#redis.Redis.from_url). @@ -27,9 +26,9 @@ Before deploying, review the [conceptual guide for the Standalone Server](/langg `` and `database_name_2` are different databases within the same instance, but `` is shared. **The same database cannot be used for separate deployments**. 3. `LANGSMITH_API_KEY`: LangSmith API key. - 4. `LANGGRAPH_CLOUD_LICENSE_KEY`: LangGraph Platform license key. This will be used to authenticate ONCE at server start up. + 4. `LANGGRAPH_CLOUD_LICENSE_KEY`: LangSmith Platform license key. This will be used to authenticate ONCE at server start up. 5. `LANGSMITH_ENDPOINT`: To send traces to a [self-hosted LangSmith](/langsmith/architectural-overview) instance, set `LANGSMITH_ENDPOINT` to the hostname of the self-hosted LangSmith instance. -4. Egress to `https://beacon.langchain.com` from your network. This is required for license verification and usage reporting if not running in air-gapped mode. See the [Egress documentation](/langgraph-platform/egress-metrics-metadata) for more details. +4. Egress to `https://beacon.langchain.com` from your network. This is required for license verification and usage reporting if not running in air-gapped mode. See the [Egress documentation](/langsmith/egress-metrics-metadata) for more details. 
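The environment variables above can be supplied to the server container at startup; a minimal sketch with placeholder values (none of the hostnames or keys here come from this guide):

```bash
# Illustrative only: placeholder values for a standalone server's data plane
# environment variables. Substitute your real connection details and keys.
export REDIS_URI="redis://redis.internal:6379"                             # pub-sub broker for streaming
export DATABASE_URI="postgres://user:password@pg.internal:5432/deploy_1"   # one database per deployment
export LANGSMITH_API_KEY="lsv2-example-key"
export LANGGRAPH_CLOUD_LICENSE_KEY="example-license-key"
export LANGSMITH_ENDPOINT="https://langsmith.internal.example.com"         # self-hosted LangSmith hostname
```

Remember that each deployment needs its own database, even when the Postgres instance itself is shared.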
## Kubernetes diff --git a/src/langsmith/deploy-to-cloud.mdx b/src/langsmith/deploy-to-cloud.mdx index bb9ce7815..e45ca40f8 100644 --- a/src/langsmith/deploy-to-cloud.mdx +++ b/src/langsmith/deploy-to-cloud.mdx @@ -1,28 +1,29 @@ --- -title: How to deploy to cloud -sidebarTitle: Deployment guide +title: LangSmith Platform on Cloud +sidebarTitle: Setup guide icon: "cloud" iconType: "solid" --- -Before deploying, review the [conceptual guide for the Cloud](/langgraph-platform/cloud) deployment option. + +Before setting up, review the [Cloud overview page](/langsmith/cloud). ## Prerequisites -1. LangGraph Platform applications are deployed from GitHub repositories. Configure and upload a LangGraph Platform application to a GitHub repository in order to deploy it to LangGraph Platform. -2. [Verify that the LangGraph API runs locally](/langgraph-platform/local-server). If the API does not run successfully (i.e. `langgraph dev`), deploying to LangGraph Platform will fail as well. +1. LangSmith Platform applications are deployed from GitHub repositories. Configure and upload a LangSmith Platform application to a GitHub repository in order to deploy it to LangSmith Platform. +2. [Verify that the LangGraph API runs locally](/langsmith/local-server). If the API does not run successfully (i.e. `langgraph dev`), deploying to LangSmith Platform will fail as well. ## Create New Deployment -Starting from the LangSmith UI... +Starting from the LangSmith UI: -1. In the left-hand navigation panel, select `LangGraph Platform`. The `LangGraph Platform` view contains a list of existing LangGraph Platform deployments. -2. In the top-right corner, select `+ New Deployment` to create a new deployment. +1. In the left-hand navigation panel, select **Deployments**, which contains a list of existing deployments. +2. In the top-right corner, select **+ New Deployment** to create a new deployment. 3. In the `Create New Deployment` panel, fill out the required fields. 1. `Deployment details` 1. 
Select `Import from GitHub` and follow the GitHub OAuth workflow to install and authorize LangChain's `hosted-langserve` GitHub app to access the selected repositories. After installation is complete, return to the `Create New Deployment` panel and select the GitHub repository to deploy from the dropdown menu. **Note**: The GitHub user installing LangChain's `hosted-langserve` GitHub app must be an [owner](https://docs.github.com/en/organizations/managing-peoples-access-to-your-organization-with-roles/roles-in-an-organization#organization-owners) of the organization or account. 2. Specify a name for the deployment. 3. Specify the desired `Git Branch`. A deployment is linked to a branch. When a new revision is created, code for the linked branch will be deployed. The branch can be updated later in the [Deployment Settings](#deployment-settings). - 4. Specify the full path to the [LangGraph API config file](/langgraph-platform/cli#configuration-file) including the file name. For example, if the file `langgraph.json` is in the root of the repository, simply specify `langgraph.json`. + 4. Specify the full path to the [LangGraph API config file](/langsmith/cli#configuration-file) including the file name. For example, if the file `langgraph.json` is in the root of the repository, simply specify `langgraph.json`. 5. Check/uncheck checkbox to `Automatically update deployment on push to branch`. If checked, the deployment will automatically be updated when changes are pushed to the specified `Git Branch`. This setting can be enabled/disabled later in the [Deployment Settings](#deployment-settings). 2. Select the desired `Deployment Type`. 1. `Development` deployments are meant for non-production use cases and are provisioned with minimal resources. @@ -30,7 +31,7 @@ Starting from the LangSmi 3. Determine if the deployment should be `Shareable through LangGraph Studio`. 1. If unchecked, the deployment will only be accessible with a valid LangSmith API key for the workspace. 
2. If checked, the deployment will be accessible through Studio to any LangSmith user. A direct URL to Studio for the deployment will be provided to share with other LangSmith users. - 4. Specify `Environment Variables` and secrets. See the [Environment Variables reference](/langgraph-platform/env-var) to configure additional variables for the deployment. + 4. Specify `Environment Variables` and secrets. See the [Environment Variables reference](/langsmith/env-var) to configure additional variables for the deployment. 1. Sensitive values such as API keys (e.g. `OPENAI_API_KEY`) should be specified as secrets. 2. Additional non-secret environment variables can be specified as well. 5. A new LangSmith `Tracing Project` is automatically created with the same name as the deployment. @@ -42,15 +43,15 @@ When [creating a new deployment](#create-new-deployment), a new revision is crea Starting from the LangSmith UI... -1. In the left-hand navigation panel, select `LangGraph Platform`. The `LangGraph Platform` view contains a list of existing LangGraph Platform deployments. +1. In the left-hand navigation panel, select **Deployments**, which contains a list of existing deployments. 2. Select an existing deployment to create a new revision for. 3. In the `Deployment` view, in the top-right corner, select `+ New Revision`. 4. In the `New Revision` modal, fill out the required fields. - 1. Specify the full path to the [LangGraph API config file](/langgraph-platform/cli#configuration-file) including the file name. For example, if the file `langgraph.json` is in the root of the repository, simply specify `langgraph.json`. + 1. Specify the full path to the [LangGraph API config file](/langsmith/cli#configuration-file) including the file name. For example, if the file `langgraph.json` is in the root of the repository, simply specify `langgraph.json`. 2. Determine if the deployment should be `Shareable through Studio`. 1. 
If unchecked, the deployment will only be accessible with a valid LangSmith API key for the workspace. 2. If checked, the deployment will be accessible through Studio to any LangSmith user. A direct URL to Studio for the deployment will be provided to share with other LangSmith users. - 3. Specify `Environment Variables` and secrets. Existing secrets and environment variables are prepopulated. See the [Environment Variables reference](/langgraph-platform/env-var) to configure additional variables for the revision. + 3. Specify `Environment Variables` and secrets. Existing secrets and environment variables are prepopulated. See the [Environment Variables reference](/langsmith/env-var) to configure additional variables for the revision. 1. Add new secrets or environment variables. 2. Remove existing secrets or environment variables. 3. Update the value of existing secrets or environment variables. @@ -60,7 +61,7 @@ Starting from the LangSmi Build and server logs are available for each revision. -Starting from the `LangGraph Platform` view... +Starting from the **Deployments** view: 1. Select the desired revision from the `Revisions` table. A panel slides open from the right-hand side and the `Build` tab is selected by default, which displays build logs for the revision. 2. In the panel, select the `Server` tab to view server logs for the revision. Server logs are only available after a revision has been deployed. @@ -70,9 +71,9 @@ Starting from the `LangGraph Platform` view... Starting from the LangSmith UI... -1. In the left-hand navigation panel, select `LangGraph Platform`. The `LangGraph Platform` view contains a list of existing LangGraph Platform deployments. +1. In the left-hand navigation panel, select **Deployments**, which contains a list of existing deployments. 2. Select an existing deployment to monitor. -3. Select the `Monitoring` tab to view the deployment metrics. See a list of [all available metrics](/langgraph-platform/control-plane#monitoring). +3. 
Select the `Monitoring` tab to view the deployment metrics. See a list of [all available metrics](/langsmith/control-plane#monitoring). 4. Within the `Monitoring` tab, use the date/time range picker as needed. By default, the date/time range picker is set to the `Last 15 minutes`. ## Interrupt Revision @@ -84,7 +85,7 @@ Interrupting a revision will stop deployment of the revision. Interrupted revisions have undefined behavior. This is only useful if you need to deploy a new revision and you already have a revision "stuck" in progress. In the future, this feature may be removed.
-Starting from the `LangGraph Platform` view... +Starting from the **Deployments** view: 1. In the top-right corner, select the gear icon (`Deployment Settings`). 2. Update the `Git Branch` to the desired branch. @@ -117,10 +118,10 @@ After installing and authorizing LangChain's `hosted-langserve` GitHub app, repo 3. Click `Save`. 4. When creating a new deployment, the list of GitHub repositories in the dropdown menu will be updated to reflect the repository access changes. -## Whitelisting IP Addresses +## Allowlisting IP Addresses -All traffic from `LangGraph Platform` deployments created after January 6th 2025 will come through a NAT gateway. -This NAT gateway will have several static ip addresses depending on the region you are deploying in. Refer to the table below for the list of IP addresses to whitelist: +All traffic from LangSmith Platform deployments created after January 6th, 2025 will come through a NAT gateway. +This NAT gateway will have several static IP addresses depending on the region you are deploying in.
Refer to the table below for the list of IP addresses to allowlist: | US | EU | |----------------|-----------------| diff --git a/src/langsmith/deployment-options.mdx b/src/langsmith/deployment-options.mdx index 9153dff03..ffcd6ec3c 100644 --- a/src/langsmith/deployment-options.mdx +++ b/src/langsmith/deployment-options.mdx @@ -1,10 +1,10 @@ --- -title: Deployment options -sidebarTitle: Deployment options +title: Run LangSmith Platform +sidebarTitle: Run LangSmith Platform mode: "wide" --- -LangSmith deployment options include: +[LangSmith Platform](https://smith.langchain.com) supports three hosting options, depending on your scale, security, and infrastructure needs: - Fully managed model for deployment running in LangChain's cloud. A simple way to deploy and manage your LangGraph Servers. Connect your GitHub repositories to the platform and deploy your LangGraph Servers from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui). The build process (i.e. CI/CD) is managed internally by the platform. + Run all components fully managed in LangChain’s cloud. - (Enterprise) Manage the data plane in your own cloud, while LangChain manages the control plane. Build a Docker image using the LangGraph CLI and deploy your LangGraph Server from the control plane UI. + **(Enterprise)** Manage the data plane in your cloud while LangChain manages the control plane. - (Enterprise) Run all components entirely within your own cloud environment. Deploy the full LangSmith platform or standalone instances of a LangGraph Server without the control plane UI. + **(Enterprise)** Run the full LangSmith platform or run standalone LangGraph Servers without the control plane UI. @@ -49,12 +49,12 @@ For comparison: | **[Pricing](https://www.langchain.com/pricing-langgraph-platform)** | Plus | Enterprise | Enterprise | -You can [run LangSmith locally for free](/langgraph-platform/local-server) for testing and development. 
+You can [run a LangGraph Server locally for free](/langsmith/local-server) for testing and development. ## Related For more information, refer to: -- [Plans](/langgraph-platform/plans) +- [Plans](/langsmith/plans) - [Pricing](https://www.langchain.com/langgraph-platform-pricing) diff --git a/src/langsmith/deployment-quickstart.mdx b/src/langsmith/deployment-quickstart.mdx index 3edfe21af..fe2c655b1 100644 --- a/src/langsmith/deployment-quickstart.mdx +++ b/src/langsmith/deployment-quickstart.mdx @@ -2,7 +2,7 @@ title: Deploy on cloud sidebarTitle: Deploy on cloud --- -This guide shows you how to set up and use LangGraph Platform to deploy on cloud. +This guide shows you how to set up and use LangSmith Platform to deploy on cloud. ## Prerequisites @@ -13,13 +13,13 @@ Before you begin, ensure you have the following: ## 1. Create a repository on GitHub -To deploy an application to **LangGraph Platform**, your application code must reside in a GitHub repository. Both public and private repositories are supported. For this quickstart, use the [`new-langgraph-project` template](https://github.com/langchain-ai/react-agent) for your application: +To deploy an application to **LangSmith Platform**, your application code must reside in a GitHub repository. Both public and private repositories are supported. For this quickstart, use the [`new-langgraph-project` template](https://github.com/langchain-ai/react-agent) for your application: 1. Go to the [`new-langgraph-project` repository](https://github.com/langchain-ai/new-langgraph-project) or [`new-langgraphjs-project` template](https://github.com/langchain-ai/new-langgraphjs-project). 2. Click the `Fork` button in the top right corner to fork the repository to your GitHub account. 3. Click **Create fork**. -## 2. Deploy to LangGraph Platform +## 2. Deploy to LangSmith Platform 1. Log in to [LangSmith](https://smith.langchain.com/). 2. In the left sidebar, select **Deployments**. 
@@ -34,7 +34,7 @@ To deploy an application to **LangGraph Platform**, your application code must r Once your application is deployed: 1. Select the deployment you just created to view more details. -2. Click the **Studio** button in the top right corner. [Studio](/langgraph-platform/langgraph-studio) will open to display your graph. +2. Click the **Studio** button in the top right corner. [Studio](/langsmith/langgraph-studio) will open to display your graph. ## 4. Get the API URL for your deployment @@ -155,7 +155,7 @@ You can now test the API: ## Next steps -You have deployed an application using LangGraph Platform. Here are some other resources to check out: +You have deployed an application using LangSmith Platform. Here are some other resources to check out: -* [Studio overview](/langgraph-platform/langgraph-studio) -* [Deployment options](/langgraph-platform/deployment-options) +* [Studio overview](/langsmith/langgraph-studio) +* [Deployment options](/langsmith/deployment-options) diff --git a/src/langsmith/double-texting.mdx b/src/langsmith/double-texting.mdx index 887e43d19..e6855fbf7 100644 --- a/src/langsmith/double-texting.mdx +++ b/src/langsmith/double-texting.mdx @@ -5,7 +5,7 @@ sidebarTitle: Overview **Prerequisites** -* [LangGraph Server](/langgraph-platform/langgraph-server) +* [LangGraph Server](/langsmith/langgraph-server) Many times users might interact with your graph in unintended ways. @@ -13,21 +13,21 @@ For instance, a user may send one message and before the graph has finished runn More generally, users may invoke the graph a second time before the first run has finished. We call this "double texting". -Currently, LangGraph only addresses this as part of [LangGraph Platform](/langgraph-platform/index), not in the open source. -The reason for this is that in order to handle this we need to know how the graph is deployed, and since LangGraph Platform deals with deployment the logic needs to live there. 
-If you do not want to use LangGraph Platform, we describe the options we have implemented in detail below. +Currently, LangGraph only addresses this as part of [LangSmith Platform](/langsmith/home), not in the open source. +The reason for this is that in order to handle this we need to know how the graph is deployed, and since LangSmith Platform deals with deployment the logic needs to live there. +If you do not want to use LangSmith Platform, we describe the options we have implemented in detail below. -![Double-text strategies across first vs. second run: Reject keeps only the first; Enqueue runs the second afterward; Interrupt halts the first to run the second; Rollback reverts the first and reruns with the second.](/langgraph-platform/images/double-texting.png) +![Double-text strategies across first vs. second run: Reject keeps only the first; Enqueue runs the second afterward; Interrupt halts the first to run the second; Rollback reverts the first and reruns with the second.](/langsmith/images/double-texting.png) ## Reject This is the simplest option, this just rejects any follow-up runs and does not allow double texting. -See the [how-to guide](/langgraph-platform/reject-concurrent) for configuring the reject double text option. +See the [how-to guide](/langsmith/reject-concurrent) for configuring the reject double text option. ## Enqueue This is a relatively simple option which continues the first run until it completes the whole run, then sends the new input as a separate run. -See the [how-to guide](/langgraph-platform/enqueue-concurrent) for configuring the enqueue double text option. +See the [how-to guide](/langsmith/enqueue-concurrent) for configuring the enqueue double text option. ## Interrupt @@ -38,10 +38,10 @@ If you enable this option, your graph should be able to handle weird edge cases For example, you could have called a tool but not yet gotten back a result from running that tool. 
You may need to remove that tool call in order to not have a dangling tool call. -See the [how-to guide](/langgraph-platform/interrupt-concurrent) for configuring the interrupt double text option. +See the [how-to guide](/langsmith/interrupt-concurrent) for configuring the interrupt double text option. ## Rollback This option interrupts the current execution AND rolls back all work done up until that point, including the original run input. It then sends the new user input in, basically as if it was the original input. -See the [how-to guide](/langgraph-platform/rollback-concurrent) for configuring the rollback double text option. +See the [how-to guide](/langsmith/rollback-concurrent) for configuring the rollback double text option. diff --git a/src/langsmith/egress-metrics-metadata.mdx b/src/langsmith/egress-metrics-metadata.mdx index 35706857f..68cd84819 100644 --- a/src/langsmith/egress-metrics-metadata.mdx +++ b/src/langsmith/egress-metrics-metadata.mdx @@ -3,11 +3,11 @@ title: Egress for subscription metrics and operational metadata sidebarTitle: Egress for metrics and metadata --- - **Important: self-hosted only.** This section only applies to customers who are not running in offline mode and assumes you are using a [self-hosted LangGraph Platform instance](/langgraph-platform/deployment-options). This does not apply to cloud or hybrid deployments. + **Important: self-hosted only.** This section only applies to customers who are not running in offline mode and assumes you are using a [self-hosted LangSmith Platform instance](/langsmith/deployment-options). This does not apply to cloud or hybrid deployments. -Self-hosted LangGraph Platform instances store all information locally and will never send sensitive information outside of your network. We currently only track platform usage for billing purposes according to the entitlements in your order. To better support our customers remotely, we do require egress to `https://beacon.langchain.com`. 
+Self-hosted LangSmith Platform instances store all information locally and will never send sensitive information outside of your network. We currently only track platform usage for billing purposes according to the entitlements in your order. To better support our customers remotely, we do require egress to `https://beacon.langchain.com`. -In the future, we will be introducing support diagnostics to help us ensure that the LangGraph Platform is running at an optimal level within your environment. +In the future, we will be introducing support diagnostics to help us ensure that the LangSmith Platform is running at an optimal level within your environment. This will require egress to `https://beacon.langchain.com` from your network. If using an API key, you will also need to allow egress to `https://api.smith.langchain.com` or `https://eu.api.smith.langchain.com` for API key verification. Refer to the [allowlisting IP section](/langsmith/cloud-architecture-and-scalability#allowlisting-ip-addresses) for static IP addresses, if needed. @@ -25,7 +25,7 @@ Generally, data that we send to Beacon can be categorized as follows: In an effort to maximize transparency, we provide sample payloads here: -### License Verification ([Enterprise](/langgraph-platform/plans)) +### License Verification ([Enterprise](/langsmith/plans)) **Endpoint:** diff --git a/src/langsmith/enqueue-concurrent.mdx b/src/langsmith/enqueue-concurrent.mdx index 296bad266..bbed153b5 100644 --- a/src/langsmith/enqueue-concurrent.mdx +++ b/src/langsmith/enqueue-concurrent.mdx @@ -3,7 +3,7 @@ title: Enqueue concurrent sidebarTitle: Enqueue --- -This guide assumes knowledge of what double-texting is, which you can learn about in the [double-texting conceptual guide](/langgraph-platform/double-texting). +This guide assumes knowledge of what double-texting is, which you can learn about in the [double-texting conceptual guide](/langsmith/double-texting). 
The guide covers the `enqueue` option for double texting, which adds the interruptions to a queue and executes them in the order they are received by the client. Below is a quick example of using the `enqueue` option. diff --git a/src/langsmith/env-var.mdx b/src/langsmith/env-var.mdx index ede5fa1c8..4ebe66499 100644 --- a/src/langsmith/env-var.mdx +++ b/src/langsmith/env-var.mdx @@ -44,7 +44,7 @@ For more details, refer to [Set a sampling rate for traces](/langsmith/sample-tr Type of authentication for the LangGraph Server deployment. Valid values: `langsmith`, `noop`. -For deployments to LangGraph Platform, this environment variable is set automatically. For local development or deployments where authentication is handled externally (e.g. self-hosted), set this environment variable to `noop`. +For deployments to LangSmith Platform, this environment variable is set automatically. For local development or deployments where authentication is handled externally (e.g. self-hosted), set this environment variable to `noop`. ## `LANGGRAPH_POSTGRES_POOL_MAX_SIZE` @@ -88,7 +88,7 @@ Set `LOG_JSON` to `true` to render all log messages as JSON objects using the co **Only Allowed in Self-Hosted Deployments** -The `MOUNT_PREFIX` environment variable is only allowed in Self-Hosted Deployment models, LangGraph Platform SaaS will not allow this environment variable. +The `MOUNT_PREFIX` environment variable is only allowed in Self-Hosted Deployment models; LangSmith Platform SaaS will not allow this environment variable. Set `MOUNT_PREFIX` to serve the LangGraph Server under a specific path prefix. This is useful for deployments where the server is behind a reverse proxy or load balancer that requires a specific path prefix. @@ -103,7 +103,7 @@ Number of jobs per worker for the LangGraph Server task queue. Defaults to `10`.
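Taken together, the self-hosted-only options above can be sketched as a small env file; every value below is illustrative, not prescribed by this reference:

```bash
# Hypothetical env file for a self-hosted LangGraph Server (placeholder values).
LANGGRAPH_AUTH_TYPE=noop   # auth handled externally, e.g. by your own gateway
MOUNT_PREFIX=/langgraph    # serve behind a reverse proxy under this path prefix
LOG_JSON=true              # render all log messages as JSON objects
```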
**Only for Hybrid and Self-Hosted** -Custom Postgres instances are only available for [Hybrid](/langgraph-platform/hybrid) and [Self-Hosted](/langgraph-platform/self-hosted) deployments. +Custom Postgres instances are only available for [Hybrid](/langsmith/hybrid) and [Self-Hosted](/langsmith/self-hosted) deployments. Specify `POSTGRES_URI_CUSTOM` to use a custom Postgres instance. The value of `POSTGRES_URI_CUSTOM` must be a valid [Postgres connection URI](https://www.postgresql.org/docs/current/libpq-connect.html#LIBPQ-CONNSTRING-URIS). @@ -131,7 +131,7 @@ This feature is in Alpha. **Only Allowed in Self-Hosted Deployments** -Redis Cluster mode is only available in Self-Hosted Deployment models, LangGraph Platform SaaS will provision a redis instance for you by default. +Redis Cluster mode is only available in Self-Hosted Deployment models; LangSmith Platform SaaS will provision a Redis instance for you by default. Set `REDIS_CLUSTER` to `True` to enable Redis Cluster mode. When enabled, the system will connect to Redis using cluster mode. This is useful when connecting to a Redis Cluster deployment. @@ -153,7 +153,7 @@ Defaults to `''`. **Only for Hybrid and Self-Hosted** -Custom Redis instances are only available for [Hybrid](/langgraph-platform/hybrid) and [Self-Hosted](/langgraph-platform/self-hosted) deployments. +Custom Redis instances are only available for [Hybrid](/langsmith/hybrid) and [Self-Hosted](/langsmith/self-hosted) deployments. Specify `REDIS_URI_CUSTOM` to use a custom Redis instance. The value of `REDIS_URI_CUSTOM` must be a valid [Redis connection URI](https://redis-py.readthedocs.io/en/stable/connections.html#redis.Redis.from_url). @@ -164,6 +164,6 @@ Time-to-live in seconds for resumable stream data in Redis. When a run is created and the output is streamed, the stream can be configured to be resumable (e.g. `stream_resumable=True`). If a stream is resumable, output from the stream is temporarily stored in Redis.
The TTL for this data can be configured by setting `RESUMABLE_STREAM_TTL_SECONDS`. -See the [Python](/langgraph-platform/python-sdk#langgraph_sdk.client.RunsClient.stream) and [JS/TS](https://langchain-ai.github.io/langgraphjs/reference/classes/sdk_client.RunsClient.html#stream) SDKs for more details on how to implement resumable streams. +See the [Python](/langsmith/langgraph-python-sdk#langgraph_sdk.client.RunsClient.stream) and [JS/TS](https://langchain-ai.github.io/langgraphjs/reference/classes/sdk_client.RunsClient.html#stream) SDKs for more details on how to implement resumable streams. Defaults to `120` seconds. diff --git a/src/langsmith/evaluation.mdx b/src/langsmith/evaluation.mdx index 2f32943d6..e9202064f 100644 --- a/src/langsmith/evaluation.mdx +++ b/src/langsmith/evaluation.mdx @@ -4,7 +4,7 @@ sidebarTitle: Overview mode: wide --- -Welcome to the LangSmith Evaluation documentation. The following sections help you create datasets, run evaluations, and analyze results: +Welcome to the LangSmith Platform evaluation documentation.
The following sections help you create datasets, run evaluations, and analyze results: diff --git a/src/langsmith/export-backend.mdx b/src/langsmith/export-backend.mdx index 0ecad5742..c416f70f4 100644 --- a/src/langsmith/export-backend.mdx +++ b/src/langsmith/export-backend.mdx @@ -33,7 +33,7 @@ The following LangSmith services expose metrics at an endpoint, in the Prometheu * **Backend**: `http://-backend..svc.cluster.local:1984/metrics` * **Platform Backend**: `http://-platform-backend..svc.cluster.local:1986/metrics` * **Playground**: `http://-playground..svc.cluster.local:1988/metrics` -* **(LangGraph Platform Control Plane only) Host Backend**: `http://-host-backend..svc.cluster.local:1985/metrics` +* **(LangSmith Platform Control Plane only) Host Backend**: `http://-host-backend..svc.cluster.local:1985/metrics` You can use a [Prometheus](https://prometheus.io/docs/prometheus/latest/getting_started/#configure-prometheus-to-monitor-the-sample-targets) or [OpenTelemetry](https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/prometheusreceiver) collector to scrape the endpoints, and export metrics to the backend of your choice. diff --git a/src/langsmith/faq.mdx b/src/langsmith/faq.mdx index 509c43456..1847d5c13 100644 --- a/src/langsmith/faq.mdx +++ b/src/langsmith/faq.mdx @@ -109,11 +109,11 @@ LangGraph will not add any overhead to your code and is specifically designed wi Yes. LangGraph is an MIT-licensed open-source library and is free to use. -### How are LangGraph and LangGraph Platform different? +### How are LangGraph and LangSmith Platform different? -LangGraph is a stateful, orchestration framework that brings added control to agent workflows. LangGraph Platform is a service for deploying and scaling LangGraph applications, with an opinionated API for building agent UXs, plus an integrated developer studio. +LangGraph is a stateful orchestration framework that brings added control to agent workflows.
LangSmith Platform is a service for deploying and scaling LangGraph applications, with an opinionated API for building agent UXs, plus an integrated developer studio. -| Features | LangGraph (open source) | LangGraph Platform | +| Features | LangGraph (open source) | LangSmith Platform | |---------------------|-----------------------------------------------------------|--------------------------------------------------------------------------------------------------------| | Description | Stateful orchestration framework for agentic applications | Scalable infrastructure for deploying LangGraph applications | | SDKs | Python and JavaScript | Python and JavaScript | @@ -129,13 +129,13 @@ LangGraph is a stateful, orchestration framework that brings added control to ag | Monitoring | None | Integrated with LangSmith for observability | | IDE integration | Studio | Studio | -### Is LangGraph Platform open source? +### Is LangSmith Platform open source? -No. LangGraph Platform is proprietary software. +No. LangSmith Platform is proprietary software. -There is a free, self-hosted version of LangGraph Platform with access to basic features. The Cloud deployment option and the Self-Hosted deployment options are paid services. [Contact our sales team](https://www.langchain.com/contact-sales) to learn more. +There is a free, self-hosted version of LangSmith Platform with access to basic features. The Cloud deployment option and the Self-Hosted deployment options are paid services. [Contact our sales team](https://www.langchain.com/contact-sales) to learn more. -For more information, see our [LangGraph Platform pricing page](https://www.langchain.com/pricing-langgraph-platform). +For more information, see our [LangSmith Platform pricing page](https://www.langchain.com/pricing-langgraph-platform). ### Does LangGraph work with LLMs that don't support tool calling? @@ -147,10 +147,10 @@ Yes! LangGraph is totally ambivalent to what LLMs are used under the hood. 
The m ### Can I use Studio without logging in to LangSmith -Yes! You can use the [development version of LangGraph Server](/langgraph-platform/local-server) to run the backend locally. +Yes! You can use the [development version of LangGraph Server](/langsmith/local-server) to run the backend locally. This will connect to the studio frontend hosted as part of LangSmith. If you set an environment variable of `LANGSMITH_TRACING=false`, then no traces will be sent to LangSmith. -### What does "nodes executed" mean for LangGraph Platform usage? +### What does "nodes executed" mean for LangSmith Platform usage? **Nodes Executed** is the aggregate number of nodes in a LangGraph application that are called and completed successfully during an invocation of the application. If a node in the graph is not called during execution or ends in an error state, these nodes will not be counted. If a node is called and completes successfully multiple times, each occurrence will be counted. diff --git a/src/langsmith/generative-ui-react.mdx b/src/langsmith/generative-ui-react.mdx index 008bfd09a..77e26d287 100644 --- a/src/langsmith/generative-ui-react.mdx +++ b/src/langsmith/generative-ui-react.mdx @@ -4,16 +4,16 @@ sidebarTitle: Implement generative user interfaces with LangGraph --- **Prerequisites** -* [LangGraph Platform](/langgraph-platform/index) -* [LangGraph Server](/langgraph-platform/langgraph-server) -* [`useStream()` React Hook](/langgraph-platform/use-stream-react) +* [LangSmith Platform](/langsmith/home) +* [LangGraph Server](/langsmith/langgraph-server) +* [`useStream()` React Hook](/langsmith/use-stream-react) Generative user interfaces (Generative UI) allows agents to go beyond text and generate rich user interfaces. This enables creating more interactive and context-aware applications where the UI adapts based on the conversation flow and AI responses. 
-![Agent Chat showing a prompt about booking/lodging and a generated set of hotel listing cards (images, titles, prices, locations) rendered inline as UI components.](/langgraph-platform/images/generative-ui-sample.jpg) +![Agent Chat showing a prompt about booking/lodging and a generated set of hotel listing cards (images, titles, prices, locations) rendered inline as UI components.](/langsmith/images/generative-ui-sample.jpg) -LangGraph Platform supports colocating your React components with your graph code. This allows you to focus on building specific UI components for your graph while easily plugging into existing chat interfaces such as [Agent Chat](https://agentchat.vercel.app) and loading the code only when actually needed. +LangSmith Platform supports colocating your React components with your graph code. This allows you to focus on building specific UI components for your graph while easily plugging into existing chat interfaces such as [Agent Chat](https://agentchat.vercel.app) and loading the code only when actually needed. ## Tutorial @@ -47,7 +47,7 @@ Next, define your UI components in your `langgraph.json` configuration: The `ui` section points to the UI components that will be used by graphs. By default, we recommend using the same key as the graph name, but you can split out the components however you like, see [Customise the namespace of UI components](#customise-the-namespace-of-ui-components) for more details. -LangGraph Platform will automatically bundle your UI components code and styles and serve them as external assets that can be loaded by the `LoadExternalComponent` component. Some dependencies such as `react` and `react-dom` will be automatically excluded from the bundle. +LangSmith Platform will automatically bundle your UI components code and styles and serve them as external assets that can be loaded by the `LoadExternalComponent` component. 
Some dependencies such as `react` and `react-dom` will be automatically excluded from the bundle. CSS and Tailwind 4.x is also supported out of the box, so you can freely use Tailwind classes as well as `shadcn/ui` in your UI components. @@ -208,13 +208,13 @@ export default function Page() { } ``` -Behind the scenes, `LoadExternalComponent` will fetch the JS and CSS for the UI components from LangGraph Platform and render them in a shadow DOM, thus ensuring style isolation from the rest of your application. +Behind the scenes, `LoadExternalComponent` will fetch the JS and CSS for the UI components from LangSmith Platform and render them in a shadow DOM, thus ensuring style isolation from the rest of your application. ## How-to guides ### Provide custom components on the client side -If you already have the components loaded in your client application, you can provide a map of such components to be rendered directly without fetching the UI code from LangGraph Platform. +If you already have the components loaded in your client application, you can provide a map of such components to be rendered directly without fetching the UI code from LangSmith Platform. ```tsx const clientComponents = { diff --git a/src/langsmith/graph-rebuild.mdx b/src/langsmith/graph-rebuild.mdx index 542eefd3d..5006fd48a 100644 --- a/src/langsmith/graph-rebuild.mdx +++ b/src/langsmith/graph-rebuild.mdx @@ -11,7 +11,7 @@ In most cases, customizing behavior based on the config should be handled by a s ## Prerequisites -Make sure to check out [this how-to guide](/langgraph-platform/setup-app-requirements-txt) on setting up your app for deployment first. +Make sure to check out [this how-to guide](/langsmith/setup-app-requirements-txt) on setting up your app for deployment first. 
## Define graphs @@ -148,4 +148,4 @@ Finally, you need to specify the path to your graph-making function (`make_graph } ``` -See more info on LangGraph API configuration file [here](/langgraph-platform/cli#configuration-file) +See more info on LangGraph API configuration file [here](/langsmith/cli#configuration-file) diff --git a/src/langsmith/home.mdx b/src/langsmith/home.mdx index 37af84f6e..b0b0d0e89 100644 --- a/src/langsmith/home.mdx +++ b/src/langsmith/home.mdx @@ -1,12 +1,15 @@ --- -title: Get started with LangSmith +title: Get started with LangSmith Platform sidebarTitle: Overview mode: wide --- -**LangSmith provides tools for developing, debugging, and deploying LLM applications.** +**LangSmith Platform provides tools for developing, debugging, and deploying LLM applications.** It helps you trace requests, evaluate outputs, test prompts, and manage deployments in one place. LangSmith is framework agnostic, which means you can use it with or without LangChain's open-source frameworks [`langchain`](https://python.langchain.com) and [`langgraph`](https://langchain-ai.github.io/langgraph/). You can prototype locally and then move to production with integrated monitoring and evaluation to build more reliable AI systems. + +Get started with the LangSmith Platform at [smith.langchain.com](https://smith.langchain.com). + **NEW IMAGE NEEDED** **Important** -The Hybrid deployment option requires an [Enterprise](/langgraph-platform/plans) plan. +The hybrid option requires an [Enterprise](/langsmith/plans) plan. -## Requirements +The **hybrid** model lets you run the [_data plane_](/langsmith/data-plane) in your own cloud while LangChain hosts and manages the [_control plane_](/langsmith/control-plane). This option combines the convenience of a managed control plane with the flexibility of self-hosting your own LangGraph Servers and backing stores. When using hybrid, you authenticate with a [LangSmith](https://smith.langchain.com/) API key. 
-* You use `langgraph-cli` and/or [Studio](/langgraph-platform/langgraph-studio) app to test graph locally. -* You use `langgraph build` command to build image. +| Component | Responsibilities | Where it runs | Who manages it | +|----------------|------------------|---------------|----------------| +| **Control plane** |
  • UI for creating deployments and revisions
  • APIs for creating deployments and revisions
| LangChain’s cloud | LangChain | +| **Data plane** |
  • Listener to reconcile deployments with control plane state
  • LangGraph Servers
  • Backing services (Postgres, Redis, etc.)
| Your cloud | You | - -Supported Compute Platforms: [Kubernetes](https://kubernetes.io/) -For a guide on deployment, refer to [How to deploy the Hybrid](/langgraph-platform/deploy-hybrid). - - -## Overview +### Workflow -The [Hybrid](/langgraph-platform/deploy-hybrid) deployment option lets you manage the [data plane](/langgraph-platform/data-plane) in your own cloud, while we handle the [control plane](/langgraph-platform/control-plane) in ours. When using the Hybrid version, you authenticate with a [LangSmith](https://smith.langchain.com/) API key. +1. Use the `langgraph-cli` or [Studio](/langsmith/langgraph-studio) to test your graph locally. +2. Build a Docker image using the `langgraph build` command. +3. Deploy your LangGraph Server from the [control plane UI](/langsmith/control-plane#control-plane-ui). -Build a Docker image using the [LangGraph CLI](/langgraph-platform/langgraph-cli) and deploy your LangGraph Server from the [control plane UI](/langgraph-platform/control-plane#control-plane-ui). - -| | [Control plane](/langgraph-platform/control-plane) | [Data plane](/langgraph-platform/data-plane) | -|-------------------|-------------------|------------| -| **What is it?** |
  • Control plane UI for creating deployments and revisions
  • Control plane APIs for creating deployments and revisions
|
  • Data plane "listener" for reconciling deployments with control plane state
  • LangGraph Servers
  • Postgres, Redis, etc
| -| **Where is it hosted?** | LangChain's cloud | Your cloud | -| **Who provisions and manages it?** | LangChain | You | - -For information on how to deploy a [LangGraph Server](/langgraph-platform/langgraph-server) to Hybrid, see [Deploy to Hybrid](/langgraph-platform/deploy-hybrid) + +Supported Compute Platforms: [Kubernetes](https://kubernetes.io/).

+For setup, refer to the [Hybrid setup guide](/langsmith/deploy-hybrid). +
### Architecture -![Hybrid deployment: LangChain-hosted control plane (LangSmith UI/APIs) manages deployments. Your cloud runs a listener, LangGraph Server instances, and backing stores (Postgres/Redis) on Kubernetes.](/langgraph-platform/images/hybrid-architecture.png) +![Hybrid deployment: LangChain-hosted control plane (LangSmith UI/APIs) manages deployments. Your cloud runs a listener, LangGraph Server instances, and backing stores (Postgres/Redis) on Kubernetes.](/langsmith/images/hybrid-architecture.png) ### Compute Platforms -* **Kubernetes**: The Hybrid deployment option supports deploying data plane infrastructure to any Kubernetes cluster. +- **Kubernetes**: Hybrid supports running the data plane on any Kubernetes cluster. -If you would like to deploy to Kubernetes, you can follow the [Hybrid deployment guide](/langgraph-platform/deploy-hybrid). +For setup in Kubernetes, refer to the [Hybrid setup guide](/langsmith/deploy-hybrid) ### Egress to LangSmith and the control plane @@ -56,46 +50,35 @@ AWS/Azure PrivateLink or GCP Private Service Connect is currently not supported. ## Listeners -In a hybrid deployment, one or more ["listener" applications](/langgraph-platform/data-plane#”listener”-application) can be deployed depending on the organization of LangSmith workspaces and Kubernetes clusters. - -**Kubernetes cluster organization** -- One or more listeners can be deployed on a Kubernetes cluster. -- A listener can be configured to deploy to one or more Kubernetes namespaces in the cluster. -- The owner of the Kubernetes cluster is responsible for the optimal organization of listeners for their use case. This involves carefully planning how LangGraph Server deployments should be structured in advance. - -**LangSmith workspace organization** -- A workspace can have one or more listeners associated with it. -- LangGraph Server deployments in a workspace can only be deployed to Kubernetes clusters where all the workspace listeners are also deployed. 
- -### Use Cases - -The following provides a non-exhaustive list of examples for configuring listeners based on the organization of your LangSmith workspaces and Kubernetes clusters. However, these are not strict requirements. - -#### Each LangSmith workspace deploys to a separate Kubernetes cluster +In the hybrid option, one or more ["listener" applications](/langsmith/data-plane#listener-application) can run depending on how your LangSmith workspaces and Kubernetes clusters are organized. -Example: -- Kubernetes cluster `alpha` is for workspace `A` -- Kubernetes cluster `beta` is for workspace `B` +### Kubernetes cluster organization +- One or more listeners can run in a Kubernetes cluster. +- A listener can deploy into one or more namespaces in that cluster. +- Cluster owners are responsible for planning listener layout and LangGraph Server deployments. -#### Each LangSmith workspace deploys to a separate Kubernetes cluster, but “dev” workloads can be deployed to a shared Kubernetes cluster +### LangSmith workspace organization +- A workspace can be associated with one or more listeners. +- A workspace can only deploy to Kubernetes clusters where all of its listeners are deployed. -In this use case, mulitple LangSmith workspaces deploy to a single Kubernetes cluster. 
+## Use Cases -Example: -- Kubernetes cluster `alpha` is for workspace `A` -- Kubernetes cluster `beta` is for workspace `B` -- Kubernetes cluster `dev` is for workspaces `A` and `B` -- Both workspaces have two listeners associated with them -- Kubernetes cluster `dev` has two listener deployments +Here are some common listener configurations (not strict requirements): -#### Deploy to one Kubernetes cluster, in one Kubernetes namespace +### Each LangSmith workspace → separate Kubernetes cluster +- Cluster `alpha` runs workspace `A` +- Cluster `beta` runs workspace `B` -Example: -- Kubernetes cluster `alpha` is for workspace `A` -- Kubernetes cluster `alpha` is for workspace `B` +### Separate clusters, with shared “dev” cluster +- Cluster `alpha` runs workspace `A` +- Cluster `beta` runs workspace `B` +- Cluster `dev` runs workspaces `A` and `B` +- Both workspaces have two listeners; cluster `dev` has two listener deployments -#### Deploy to one Kubernetes cluster, but in multiple Kubernetes namespaces +### One cluster, one namespace per workspace +- Cluster `alpha`, namespace `1` runs workspace `A` +- Cluster `alpha`, namespace `2` runs workspace `B` -Example: -- Kubernetes cluster `alpha` and namespace `1` is for workspace `A` -- Kubernetes cluster `alpha` and namespace `2` is for workspace `B` +### One cluster, single namespace for multiple workspaces +- Cluster `alpha` runs workspace `A` +- Cluster `alpha` runs workspace `B` diff --git a/src/langgraph-platform/images/assistants.png b/src/langsmith/images/assistants.png similarity index 100% rename from src/langgraph-platform/images/assistants.png rename to src/langsmith/images/assistants.png diff --git a/src/langgraph-platform/images/authentication.png b/src/langsmith/images/authentication.png similarity index 100% rename from src/langgraph-platform/images/authentication.png rename to src/langsmith/images/authentication.png diff --git a/src/langgraph-platform/images/authorization.png 
b/src/langsmith/images/authorization.png similarity index 100% rename from src/langgraph-platform/images/authorization.png rename to src/langsmith/images/authorization.png diff --git a/src/langgraph-platform/images/autogen-output.png b/src/langsmith/images/autogen-output.png similarity index 100% rename from src/langgraph-platform/images/autogen-output.png rename to src/langsmith/images/autogen-output.png diff --git a/src/langgraph-platform/images/brave-shields.png b/src/langsmith/images/brave-shields.png similarity index 100% rename from src/langgraph-platform/images/brave-shields.png rename to src/langsmith/images/brave-shields.png diff --git a/src/langgraph-platform/images/double-texting.png b/src/langsmith/images/double-texting.png similarity index 100% rename from src/langgraph-platform/images/double-texting.png rename to src/langsmith/images/double-texting.png diff --git a/src/langgraph-platform/images/generative-ui-sample.jpg b/src/langsmith/images/generative-ui-sample.jpg similarity index 100% rename from src/langgraph-platform/images/generative-ui-sample.jpg rename to src/langsmith/images/generative-ui-sample.jpg diff --git a/src/langgraph-platform/images/hybrid-architecture.png b/src/langsmith/images/hybrid-architecture.png similarity index 100% rename from src/langgraph-platform/images/hybrid-architecture.png rename to src/langsmith/images/hybrid-architecture.png diff --git a/src/langgraph-platform/images/langgraph-cloud-architecture.excalidraw b/src/langsmith/images/langgraph-cloud-architecture.excalidraw similarity index 100% rename from src/langgraph-platform/images/langgraph-cloud-architecture.excalidraw rename to src/langsmith/images/langgraph-cloud-architecture.excalidraw diff --git a/src/langgraph-platform/images/langgraph-cloud-architecture.png b/src/langsmith/images/langgraph-cloud-architecture.png similarity index 100% rename from src/langgraph-platform/images/langgraph-cloud-architecture.png rename to 
src/langsmith/images/langgraph-cloud-architecture.png diff --git a/src/langgraph-platform/images/langgraph-platform-deployment-architecture.png b/src/langsmith/images/langgraph-platform-deployment-architecture.png similarity index 100% rename from src/langgraph-platform/images/langgraph-platform-deployment-architecture.png rename to src/langsmith/images/langgraph-platform-deployment-architecture.png diff --git a/src/langgraph-platform/images/lg-platform.png b/src/langsmith/images/lg-platform.png similarity index 100% rename from src/langgraph-platform/images/lg-platform.png rename to src/langsmith/images/lg-platform.png diff --git a/src/langgraph-platform/images/lg-studio.png b/src/langsmith/images/lg-studio.png similarity index 100% rename from src/langgraph-platform/images/lg-studio.png rename to src/langsmith/images/lg-studio.png diff --git a/src/langgraph-platform/images/no-auth.png b/src/langsmith/images/no-auth.png similarity index 100% rename from src/langgraph-platform/images/no-auth.png rename to src/langsmith/images/no-auth.png diff --git a/src/langgraph-platform/images/self-hosted-full-platform-architecture.png b/src/langsmith/images/self-hosted-full-platform-architecture.png similarity index 100% rename from src/langgraph-platform/images/self-hosted-full-platform-architecture.png rename to src/langsmith/images/self-hosted-full-platform-architecture.png diff --git a/src/langsmith/interrupt-concurrent.mdx b/src/langsmith/interrupt-concurrent.mdx index 267a0d14f..fdc22dedd 100644 --- a/src/langsmith/interrupt-concurrent.mdx +++ b/src/langsmith/interrupt-concurrent.mdx @@ -3,7 +3,7 @@ title: Interrupt concurrent sidebarTitle: Interrupt --- -This guide assumes knowledge of what double-texting is, which you can learn about in the [double-texting conceptual guide](/langgraph-platform/double-texting). +This guide assumes knowledge of what double-texting is, which you can learn about in the [double-texting conceptual guide](/langsmith/double-texting). 
The guide covers the `interrupt` option for double-texting, which interrupts the prior run of the graph and starts a new one with the double-text. This option does not delete the first run, but rather keeps it in the database and sets its status to `interrupted`. Below is a quick example of using the `interrupt` option. diff --git a/src/langsmith/kubernetes.mdx index 904245ed0..1b566c0b2 100644 --- a/src/langsmith/kubernetes.mdx +++ b/src/langsmith/kubernetes.mdx @@ -1,5 +1,5 @@ --- -title: Self-host LangSmith on Kubernetes +title: Self-host LangSmith Platform on Kubernetes sidebarTitle: Install on Kubernetes --- @@ -7,9 +7,9 @@ sidebarTitle: Install on Kubernetes Self-hosting LangSmith is an add-on to the Enterprise Plan designed for our largest, most security-conscious customers. See our [pricing page](https://www.langchain.com/pricing) for more detail, and [contact our sales team](https://www.langchain.com/contact-sales) if you want to get a license key to trial LangSmith in your environment. -This guide will walk you through the process of deploying LangSmith to a Kubernetes cluster. We will use Helm to install LangSmith and its dependencies. +This page describes how to set up [LangSmith Platform](/langsmith/self-hosted#langsmith-platform) in a Kubernetes cluster. You'll use Helm to install LangSmith Platform and its dependencies.
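The `interrupt` double-texting strategy covered earlier can be modeled in a few lines: a second run on a busy thread supersedes the in-flight run, which is retained with status `interrupted` rather than deleted. A toy sketch (the class is invented for illustration; the real logic lives in LangGraph Server):

```python
class ThreadRuns:
    """Toy model of the `interrupt` double-texting strategy. A new run on the
    same thread marks the in-flight run as `interrupted` instead of deleting it.
    Illustration only; not the server's implementation."""

    def __init__(self):
        self.runs = []

    def create(self, run_id, multitask_strategy="reject"):
        active = [r for r in self.runs if r["status"] == "running"]
        if active:
            if multitask_strategy == "interrupt":
                for r in active:
                    r["status"] = "interrupted"  # kept in the database, not deleted
            else:
                raise RuntimeError("thread already has a run in progress")
        run = {"run_id": run_id, "status": "running"}
        self.runs.append(run)
        return run
```

After a double-text, both runs remain on the thread: the first with status `interrupted`, the second `running`.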
-We've successfully tested LangSmith on the following Kubernetes distributions: +We've successfully tested LangSmith Platform on the following Kubernetes distributions: * Google Kubernetes Engine (GKE) * Amazon Elastic Kubernetes Service (EKS) diff --git a/src/langsmith/langgraph-cli.mdx b/src/langsmith/langgraph-cli.mdx index 91796d515..544499d68 100644 --- a/src/langsmith/langgraph-cli.mdx +++ b/src/langsmith/langgraph-cli.mdx @@ -3,7 +3,7 @@ title: LangGraph CLI sidebarTitle: Dupe content CLI --- -**LangGraph CLI** is a multi-platform command-line tool for building and running the [LangGraph API server](/langgraph-platform/langgraph-server) locally. The resulting server includes all API endpoints for your graph's runs, threads, assistants, etc. as well as the other services required to run your agent, including a managed database for checkpointing and storage. +**LangGraph CLI** is a multi-platform command-line tool for building and running the [LangGraph API server](/langsmith/langgraph-server) locally. The resulting server includes all API endpoints for your graph's runs, threads, assistants, etc. as well as the other services required to run your agent, including a managed database for checkpointing and storage. ## Installation @@ -28,9 +28,9 @@ LangGraph CLI provides the following core functionality: | Command | Description | | -------- | -------| -| [`langgraph build`](/langgraph-platform/cli#build) | Builds a Docker image for the [LangGraph API server](/langgraph-platform/langgraph-server) that can be directly deployed. | -| [`langgraph dev`](/langgraph-platform/cli#dev) | Starts a lightweight development server that requires no Docker installation. This server is ideal for rapid development and testing. This is available in version 0.1.55 and up. 
-| [`langgraph dockerfile`](/langgraph-platform/cli#dockerfile) | Generates a [Dockerfile](https://docs.docker.com/reference/dockerfile/) that can be used to build images for and deploy instances of the [LangGraph API server](/langgraph-platform/langgraph-server). This is useful if you want to further customize the dockerfile or deploy in a more custom way. | -| [`langgraph up`](/langgraph-platform/cli#up) | Starts an instance of the [LangGraph API server](/langgraph-platform/langgraph-server) locally in a docker container. This requires the docker server to be running locally. It also requires a LangSmith API key for local development or a license key for production use. | +| [`langgraph build`](/langsmith/cli#build) | Builds a Docker image for the [LangGraph API server](/langsmith/langgraph-server) that can be directly deployed. | +| [`langgraph dev`](/langsmith/cli#dev) | Starts a lightweight development server that requires no Docker installation. This server is ideal for rapid development and testing. This is available in version 0.1.55 and up. +| [`langgraph dockerfile`](/langsmith/cli#dockerfile) | Generates a [Dockerfile](https://docs.docker.com/reference/dockerfile/) that can be used to build images for and deploy instances of the [LangGraph API server](/langsmith/langgraph-server). This is useful if you want to further customize the dockerfile or deploy in a more custom way. | +| [`langgraph up`](/langsmith/cli#up) | Starts an instance of the [LangGraph API server](/langsmith/langgraph-server) locally in a docker container. This requires the docker server to be running locally. It also requires a LangSmith API key for local development or a license key for production use. | -For more information, see the [LangGraph CLI Reference](/langgraph-platform/cli). +For more information, see the [LangGraph CLI Reference](/langsmith/cli). 
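The CLI commands above all read a `langgraph.json` at the project root. A minimal example of that file, generated from Python for illustration (the graph path is a placeholder; refer to the CLI configuration reference for the full schema):

```python
import json

# Minimal langgraph.json: maps a graph name to "path/to/module.py:variable"
# and declares local dependencies. The graph path below is a placeholder.
config = {
    "dependencies": ["."],
    "graphs": {"agent": "./my_agent/graph.py:graph"},
    "env": ".env",
}

with open("langgraph.json", "w") as f:
    json.dump(config, f, indent=2)
```

With this file in place, `langgraph dev` serves the `agent` graph locally, and `langgraph build` bakes the same configuration into a deployable image.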
diff --git a/src/langsmith/langgraph-platform-logs.mdx b/src/langsmith/langgraph-platform-logs.mdx index a982d2d21..7e2e68347 100644 --- a/src/langsmith/langgraph-platform-logs.mdx +++ b/src/langsmith/langgraph-platform-logs.mdx @@ -3,7 +3,7 @@ title: View server logs for a trace sidebarTitle: View server logs for a trace --- -When viewing a trace that was generated by a run in LangGraph Platform, you can access the associated server logs directly from the trace view. +When viewing a trace that was generated by a run in LangSmith Platform, you can access the associated server logs directly from the trace view. Viewing server logs for a trace only works with the [Cloud SaaS](https://langchain-ai.github.io/langgra/langsmith/observability-concepts/deployment_options/#cloud-saas) and [fully self-hosted](https://langchain-ai.github.io/langgra/langsmith/observability-concepts/deployment_options/#self-hosted-control-plane) deployment options. @@ -15,7 +15,7 @@ In the trace view, use the **See Logs** button in the top right corner, next to ![](/langsmith/images/view-server-logs-button.png) -Clicking this button will take you to the server logs view for the associated deployment in LangGraph Platform. +Clicking this button will take you to the server logs view for the associated deployment in LangSmith Platform. ## Server logs view diff --git a/src/langsmith/langgraph-server-changelog.mdx b/src/langsmith/langgraph-server-changelog.mdx index 6395997e3..183bada6b 100644 --- a/src/langsmith/langgraph-server-changelog.mdx +++ b/src/langsmith/langgraph-server-changelog.mdx @@ -3,7 +3,7 @@ title: LangGraph Server changelog sidebarTitle: Server changelog --- -[LangGraph Server](/langgraph-platform/langgraph-server) is an API platform for creating and managing agent-based applications. It provides built-in persistence, a task queue, and supports deploying, configuring, and running assistants (agentic workflows) at scale. 
This changelog documents all notable updates, features, and fixes to LangGraph Server releases. +[LangGraph Server](/langsmith/langgraph-server) is an API platform for creating and managing agent-based applications. It provides built-in persistence, a task queue, and supports deploying, configuring, and running assistants (agentic workflows) at scale. This changelog documents all notable updates, features, and fixes to LangGraph Server releases. ## v0.4.37 diff --git a/src/langsmith/langgraph-server.mdx b/src/langsmith/langgraph-server.mdx index 1f82668dd..b55c54679 100644 --- a/src/langsmith/langgraph-server.mdx +++ b/src/langsmith/langgraph-server.mdx @@ -2,13 +2,13 @@ title: LangGraph Server --- -**LangGraph Server** offers an API for creating and managing agent-based applications. It is built on the concept of [assistants](/langgraph-platform/assistants), which are agents configured for specific tasks, and includes built-in [persistence](/oss/langgraph/persistence#memory-store) and a **task queue**. This versatile API supports a wide range of agentic application use cases, from background processing to real-time interactions. +**LangGraph Server** offers an API for creating and managing agent-based applications. It is built on the concept of [assistants](/langsmith/assistants), which are agents configured for specific tasks, and includes built-in [persistence](/oss/langgraph/persistence#memory-store) and a **task queue**. This versatile API supports a wide range of agentic application use cases, from background processing to real-time interactions. -Use LangGraph Server to create and manage [assistants](/langgraph-platform/assistants), [threads](/oss/langgraph/persistence#threads), [runs](/langgraph-platform/assistants#execution), [cron jobs](/langgraph-platform/cron-jobs), [webhooks](/langgraph-platform/use-webhooks), and more. 
+Use LangGraph Server to create and manage [assistants](/langsmith/assistants), [threads](/oss/langgraph/persistence#threads), [runs](/langsmith/assistants#execution), [cron jobs](/langsmith/cron-jobs), [webhooks](/langsmith/use-webhooks), and more. -**API reference** -For detailed information on the API endpoints and data models, see [LangGraph Platform API reference docs](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref.html). +**API reference**

+For detailed information on the API endpoints and data models, refer to the [API reference docs](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref.html).
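For a quick sense of the API surface, requests can be assembled with the standard library alone. A hedged sketch against a local dev server (the base URL and port are assumptions; the `/threads` and `/threads/{thread_id}/runs/stream` routes follow the API reference linked above):

```python
import json
import urllib.request

BASE = "http://localhost:2024"  # assumed local `langgraph dev` server URL


def create_thread_request(base=BASE):
    # POST /threads creates a new thread; see the API reference for the schema.
    return urllib.request.Request(
        base + "/threads",
        data=json.dumps({}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def stream_run_request(thread_id, assistant_id, run_input, base=BASE):
    # POST /threads/{thread_id}/runs/stream starts a run and streams its events.
    body = {"assistant_id": assistant_id, "input": run_input}
    return urllib.request.Request(
        base + "/threads/" + thread_id + "/runs/stream",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

In practice the Python and JS/TS SDKs wrap these endpoints; the raw requests are shown only to make the server's REST surface concrete.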
To use the `Enterprise` version of the LangGraph Server, you must acquire a license key that you will need to specify when running the Docker image. To acquire a license key, [contact our sales team](https://www.langchain.com/contact-sales). @@ -23,7 +23,7 @@ You can run the `Enterprise` version of the LangGraph Server on the following de To deploy a LangGraph Server application, you need to specify the graph(s) you want to deploy, as well as any relevant configuration settings, such as dependencies and environment variables. -Read the [application structure](/langgraph-platform/application-structure) guide to learn how to structure your LangGraph application for deployment. +Read the [application structure](/langsmith/application-structure) guide to learn how to structure your LangGraph application for deployment. ## Parts of a deployment @@ -31,9 +31,9 @@ When you deploy LangGraph Server, you are deploying one or more [graphs](#graphs ### Graphs -When you deploy a graph with LangGraph Server, you are deploying a "blueprint" for an [Assistant](/langgraph-platform/assistants). +When you deploy a graph with LangGraph Server, you are deploying a "blueprint" for an [Assistant](/langsmith/assistants). -An [Assistant](/langgraph-platform/assistants) is a graph paired with specific configuration settings. You can create multiple assistants per graph, each with unique settings to accommodate different use cases +An [Assistant](/langsmith/assistants) is a graph paired with specific configuration settings. You can create multiple assistants per graph, each with unique settings to accommodate different use cases that can be served by the same graph. Upon deployment, LangGraph Server will automatically create a default assistant for each graph using the graph's default configuration settings. 
@@ -47,13 +47,13 @@ chatbot that only supports back-and-forth conversation, without the ability to i LangGraph Server leverages a database for [persistence](/oss/langgraph/persistence) and a task queue. -Currently, only [Postgres](https://www.postgresql.org/) is supported as a database for LangGraph Server and [Redis](https://redis.io/) as the task queue. +[PostgreSQL](https://www.postgresql.org/) is supported as a database for LangGraph Server and [Redis](https://redis.io/) as the task queue. -If you're deploying using [LangGraph Platform](/langgraph-platform/cloud), these components are managed for you. If you're deploying LangGraph Server on your own infrastructure, you'll need to set up and manage these components yourself. +If you're deploying using [LangSmith Platform cloud](/langsmith/cloud), these components are managed for you. If you're deploying LangGraph Server on your [own infrastructure](/langsmith/self-hosted), you'll need to set up and manage these components yourself. -Please review the [deployment options](/langgraph-platform/deployment-options) guide for more information on how these components are set up and managed. +For more information on how these components are set up and managed, review the [deployment options](/langsmith/deployment-options) guide. ## Learn more -* LangGraph [Application Structure](/langgraph-platform/application-structure) guide explains how to structure your LangGraph application for deployment. -* The [LangGraph Platform API Reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref.html) provides detailed information on the API endpoints and data models. +- LangGraph [Application Structure](/langsmith/application-structure) guide explains how to structure your LangGraph application for deployment. +- The [API Reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref.html) provides detailed information on the API endpoints and data models. 
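For readers following the application-structure links in the hunks above, a minimal `langgraph.json` might look like the following sketch (the `./agent.py:graph` path and the `agent` graph name are illustrative placeholders, not part of this PR):

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./agent.py:graph"
  },
  "env": ".env"
}
```

The `graphs` mapping tells LangGraph Server which compiled graph objects to expose; each deployed graph then gets a default assistant, as described above.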
diff --git a/src/langsmith/langgraph-studio.mdx b/src/langsmith/langgraph-studio.mdx index aa9b24600..cd3bd648f 100644 --- a/src/langsmith/langgraph-studio.mdx +++ b/src/langsmith/langgraph-studio.mdx @@ -4,14 +4,14 @@ title: Overview **Prerequisites** -* [LangGraph Platform](/langgraph-platform/index) -* [LangGraph Server](/langgraph-platform/langgraph-server) -* [LangGraph CLI](/langgraph-platform/langgraph-cli) +* [LangSmith Platform](/langsmith/home) +* [LangGraph Server](/langsmith/langgraph-server) +* [LangGraph CLI](/langsmith/langgraph-cli) Studio is a specialized agent IDE that enables visualization, interaction, and debugging of agentic systems that implement the LangGraph Server API protocol. Studio also integrates with tracing, evaluation, and prompt engineering. -![The LangGraph CLI creates a LangGraph Server instance. Studio can interact with LangGraph Server. RemoteGraph and the SDKs can also interact with LangGraph Server.](/langgraph-platform/images/lg-platform.png) +![The LangGraph CLI creates a LangGraph Server instance. Studio can interact with LangGraph Server. RemoteGraph and the SDKs can also interact with LangGraph Server.](/langsmith/images/lg-platform.png) ## Features @@ -26,7 +26,7 @@ Key features of Studio: * Manage [long term memory](/oss/concepts/memory) * Debug agent state via [time travel](/oss/langgraph/use-time-travel) -Studio works for graphs that are deployed on [LangGraph Platform](/langgraph-platform/deployment-quickstart) or for graphs that are running locally via the [LangGraph Server](/langgraph-platform/local-server). +Studio works for graphs that are deployed on [LangSmith Platform](/langsmith/deployment-quickstart) or for graphs that are running locally via the [LangGraph Server](/langsmith/local-server). Studio supports two modes: @@ -40,7 +40,7 @@ Chat mode is a simpler UI for iterating on and testing chat-specific agents. 
It ## Learn more -* See this guide on how to [get started](/langgraph-platform/quick-start-studio) with Studio. +* See this guide on how to [get started](/langsmith/quick-start-studio) with Studio. ## Video guide diff --git a/src/langsmith/deploy-to-cloud.mdx b/src/langsmith/deploy-to-cloud.mdx index e45ca40f8..9173d5b37 100644 --- a/src/langsmith/deploy-to-cloud.mdx +++ b/src/langsmith/deploy-to-cloud.mdx @@ -28,9 +28,9 @@ Starting from the LangSmi 2. Select the desired `Deployment Type`. 1. `Development` deployments are meant for non-production use cases and are provisioned with minimal resources. 2. `Production` deployments can serve up to 500 requests/second and are provisioned with highly available storage with automatic backups. - 3. Determine if the deployment should be `Shareable through LangGraph Studio`. + 3. Determine if the deployment should be `Shareable through the Debugger`. 1. If unchecked, the deployment will only be accessible with a valid LangSmith API key for the workspace. - 2. If checked, the deployment will be accessible through Studio to any LangSmith user. A direct URL to Studio for the deployment will be provided to share with other LangSmith users. + 2. If checked, the deployment will be accessible through the Debugger to any LangSmith user. A direct URL to the Debugger for the deployment will be provided to share with other LangSmith users. 4. Specify `Environment Variables` and secrets. See the [Environment Variables reference](/langsmith/env-var) to configure additional variables for the deployment. 1. Sensitive values such as API keys (e.g. `OPENAI_API_KEY`) should be specified as secrets. 2. Additional non-secret environment variables can be specified as well. @@ -48,9 +48,9 @@ Starting from the LangSmi 3. In the `Deployment` view, in the top-right corner, select `+ New Revision`. 4. In the `New Revision` modal, fill out the required fields. 1. 
Specify the full path to the [LangGraph API config file](/langsmith/cli#configuration-file) including the file name. For example, if the file `langgraph.json` is in the root of the repository, simply specify `langgraph.json`. - 2. Determine if the deployment should be `Shareable through Studio`. + 2. Determine if the deployment should be `Shareable through the Debugger`. 1. If unchecked, the deployment will only be accessible with a valid LangSmith API key for the workspace. - 2. If checked, the deployment will be accessible through Studio to any LangSmith user. A direct URL to Studio for the deployment will be provided to share with other LangSmith users. + 2. If checked, the deployment will be accessible through the Debugger to any LangSmith user. A direct URL to the Debugger for the deployment will be provided to share with other LangSmith users. 3. Specify `Environment Variables` and secrets. Existing secrets and environment variables are prepopulated. See the [Environment Variables reference](/langsmith/env-var) to configure additional variables for the revision. 1. Add new secrets or environment variables. 2. Remove existing secrets or environment variables. diff --git a/src/langsmith/deployment-quickstart.mdx b/src/langsmith/deployment-quickstart.mdx index 1a40f47c4..d8f715a43 100644 --- a/src/langsmith/deployment-quickstart.mdx +++ b/src/langsmith/deployment-quickstart.mdx @@ -29,16 +29,16 @@ To deploy an application to **LangSmith Platform**, your application code must r 6. Click **Submit** to deploy. This may take about 15 minutes to complete. You can check the status in the **Deployment details** view. -## 3. Test your application in Studio +## 3. Test your application in the Debugger Once your application is deployed: 1. Select the deployment you just created to view more details. -2. Click the **Studio** button in the top right corner. [Studio](/langsmith/langgraph-studio) will open to display your graph. +2. 
Click the **Debugger** button in the top right corner. The [Debugger](/langsmith/debugger) will open to display your graph. ## 4. Get the API URL for your deployment -1. In the **Deployment details** view in LangGraph, click the **API URL** to copy it to your clipboard. +1. In the **Deployment details** view, click the **API URL** to copy it to your clipboard. 2. Click the `URL` to copy it to the clipboard. ## 5. Test the API @@ -157,5 +157,5 @@ You can now test the API: You have deployed an application using LangSmith Platform. Here are some other resources to check out: -* [Studio overview](/langsmith/langgraph-studio) -* [Deployment options](/langsmith/deployment-options) +* The [Debugger overview](/langsmith/debugger) +* [Deployment options](/langsmith/deployments) diff --git a/src/langsmith/deployments.mdx b/src/langsmith/deployments.mdx index 8858f022a..675a20192 100644 --- a/src/langsmith/deployments.mdx +++ b/src/langsmith/deployments.mdx @@ -74,7 +74,7 @@ _Platform deployment_ covers how to host and manage the LangSmith Platform itsel
-### Platform Comparison +### Comparison | | **Cloud** | **Hybrid** | **Self-Hosted** | |--|--|--|--| diff --git a/src/langsmith/faq.mdx b/src/langsmith/faq.mdx index 48ea366c6..3573401fa 100644 --- a/src/langsmith/faq.mdx +++ b/src/langsmith/faq.mdx @@ -111,7 +111,7 @@ Yes. LangGraph is an MIT-licensed open-source library and is free to use. ### How are LangGraph and LangSmith Platform different? -LangGraph is a stateful, orchestration framework that brings added control to agent workflows. LangSmith Platform is a service for deploying and scaling LangGraph applications, with an opinionated API for building agent UXs, plus an integrated developer studio. +LangGraph is a stateful, orchestration framework that brings added control to agent workflows. LangSmith Platform is a service for deploying and scaling agentic applications, with an opinionated API for building agent UXs, plus an integrated developer UI. | Features | LangGraph (open source) | LangSmith Platform | |---------------------|-----------------------------------------------------------|--------------------------------------------------------------------------------------------------------| @@ -127,7 +127,7 @@ LangGraph is a stateful, orchestration framework that brings added control to ag | Concurrency Control | Simple threading | Supports double-texting | | Scheduling | None | Cron scheduling | | Monitoring | None | Integrated with LangSmith for observability | -| IDE integration | Studio | Studio | +| IDE integration | Debugger | Debugger | ### Is LangSmith Platform open source? @@ -145,10 +145,10 @@ Yes! You can use LangGraph with any LLMs. The main reason we use LLMs that suppo Yes! LangGraph is totally ambivalent to what LLMs are used under the hood. The main reason we use closed LLMs in most of the tutorials is that they seamlessly support tool calling, while OSS LLMs often don't. 
But tool calling is not necessary (see [this section](#does-langgraph-work-with-llms-that-dont-support-tool-calling)) so you can totally use LangGraph with OSS LLMs. -### Can I use Studio without logging in to LangSmith +### Can I use the Debugger without logging in to LangSmith? Yes! You can use the [development version of LangGraph Server](/langsmith/local-server) to run the backend locally. -This will connect to the studio frontend hosted as part of LangSmith. +This will connect to the Debugger frontend hosted as part of LangSmith. If you set an environment variable of `LANGSMITH_TRACING=false`, then no traces will be sent to LangSmith. ### What does "nodes executed" mean for LangSmith Platform usage? diff --git a/src/langsmith/home.mdx b/src/langsmith/home.mdx index a61419952..a7c4ddbe6 100644 --- a/src/langsmith/home.mdx +++ b/src/langsmith/home.mdx @@ -78,11 +78,11 @@ Once your account and API key are ready, choose a quickstart to begin building w Use an intuitive visual interface to design, test, and refine applications end-to-end. diff --git a/src/langsmith/hybrid.mdx b/src/langsmith/hybrid.mdx index 2614a7200..7a3f35b19 100644 --- a/src/langsmith/hybrid.mdx +++ b/src/langsmith/hybrid.mdx @@ -17,7 +17,7 @@ The **hybrid** model lets you run the [_data plane_](/langsmith/data-plane) in y ### Workflow -1. Use the `langgraph-cli` or [Studio](/langsmith/langgraph-studio) to test your graph locally. +1. Use the `langgraph-cli` or the [Debugger](/langsmith/debugger) to test your graph locally. 2. Build a Docker image using the `langgraph build` command. 3. Deploy your LangGraph Server from the [control plane UI](/langsmith/control-plane#control-plane-ui). 
diff --git a/src/langsmith/langgraph-studio.mdx b/src/langsmith/langgraph-studio.mdx deleted file mode 100644 index c9d9ed89f..000000000 --- a/src/langsmith/langgraph-studio.mdx +++ /dev/null @@ -1,55 +0,0 @@ ---- -title: Overview ---- - - -**Prerequisites** -* [LangSmith Platform](/langsmith/home) -* [LangGraph Server](/langsmith/langgraph-server) -* [LangGraph CLI](/langsmith/cli) - - -Studio is a specialized agent IDE that enables visualization, interaction, and debugging of agentic systems that implement the LangGraph Server API protocol. Studio also integrates with tracing, evaluation, and prompt engineering. - - - The LangGraph CLI creates a LangGraph Server instance. Studio can interact with LangGraph Server. RemoteGraph and the SDKs can also interact with the LangGraph Server deployment(s). - - -## Features - -Key features of Studio: - -* Visualize your graph architecture -* [Run and interact with your agent](/langsmith/use-studio#run-application) -* [Manage assistants](/langsmith/use-studio#manage-assistants) -* [Manage threads](/langsmith/use-studio#manage-threads) -* [Iterate on prompts](/langsmith/observability-studio) -* [Run experiments over a dataset](/langsmith/observability-studio#run-experiments-over-a-dataset) -* Manage [long term memory](/oss/concepts/memory) -* Debug agent state via [time travel](/oss/langgraph/use-time-travel) - -Studio works for graphs that are deployed on [LangSmith Platform](/langsmith/deployment-quickstart) or for graphs that are running locally via the [LangGraph Server](/langsmith/local-server). - -Studio supports two modes: - -### Graph mode - -Graph mode exposes the full feature-set of Studio and is useful when you would like as many details about the execution of your agent, including the nodes traversed, intermediate states, and LangSmith integrations (such as adding to datasets and playground). - -### Chat mode - -Chat mode is a simpler UI for iterating on and testing chat-specific agents. 
It is useful for business users and those who want to test overall agent behavior. Chat mode is only supported for graph's whose state includes or extends [`MessagesState`](/oss/langgraph/use-graph-api#messagesstate). - -## Learn more - -* See this guide on how to [get started](/langsmith/quick-start-studio) with Studio. - -## Video guide - diff --git a/src/langsmith/local-server.mdx b/src/langsmith/local-server.mdx index 048cac74c..5cad7cb8a 100644 --- a/src/langsmith/local-server.mdx +++ b/src/langsmith/local-server.mdx @@ -227,7 +227,7 @@ For production use, deploy LangGraph Server with access to a persistent storage Now that you have a LangGraph app running locally, take your journey further by exploring features and deployment: -- [Studio](/langsmith/langgraph-studio) is a specialized UI that you can connect to LangGraph API server to visualize, interact with, and debug your application locally. Try the [Studio quickstart](/langsmith/quick-start-studio). +- The [Debugger](/langsmith/debugger) is a specialized UI that you can connect to LangGraph API server to visualize, interact with, and debug your application locally. Try the [Debugger quickstart](/langsmith/quick-start-debugger). - [Deploy on cloud](/langsmith/deployment-quickstart) with the quickstart guide. - [LangGraph Server API Reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/): Explore the LangGraph Server API documentation. - [Python SDK Reference](/langsmith/python-sdk): Explore the Python SDK API Reference. 
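The local-server docs above connect the hosted Debugger/Studio UI to a locally running server through a `baseUrl` query parameter (`https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024`). As an illustrative sketch — the helper name below is hypothetical, not an SDK function — the URL can be assembled like this:

```python
# Build the hosted UI URL pointed at a locally running LangGraph Server.
# The default port 2024 matches the `langgraph dev` output shown in these docs.

def debugger_url(base_url: str = "http://127.0.0.1:2024") -> str:
    """Return the hosted Debugger URL with the local server as baseUrl."""
    return f"https://smith.langchain.com/studio/?baseUrl={base_url}"

print(debugger_url())
# -> https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024
```

If you run the server on a different host or port, pass that address instead, as the quickstart notes.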
diff --git a/src/langsmith/observability-studio.mdx b/src/langsmith/observability-debugger.mdx similarity index 83% rename from src/langsmith/observability-studio.mdx rename to src/langsmith/observability-debugger.mdx index 98bc071d8..d3350f523 100644 --- a/src/langsmith/observability-studio.mdx +++ b/src/langsmith/observability-debugger.mdx @@ -1,25 +1,25 @@ --- -title: Observability in Studio +title: Observability in the Debugger sidebarTitle: Traces, datasets, prompts --- -Studio provides tools to inspect, debug, and improve your app beyond execution. By working with traces, datasets, and prompts, you can see how your application behaves in detail, measure its performance, and refine its outputs: +The LangSmith Deployments [Debugger](/langsmith/debugger) provides tools to inspect, debug, and improve your app beyond execution. By working with traces, datasets, and prompts, you can see how your application behaves in detail, measure its performance, and refine its outputs: - [Iterate on prompts](#iterate-on-prompts): Modify prompts inside graph nodes directly or with the LangSmith playground. - [Run experiments over a dataset](#run-experiments-over-a-dataset): Execute your assistant over a LangSmith dataset to score and compare results. -- [Debug LangSmith traces](#debug-langsmith-traces): Import traced runs into Studio and optionally clone them into your local agent. +- [Debug LangSmith traces](#debug-langsmith-traces): Import traced runs into the Debugger and optionally clone them into your local agent. - [Add a node to a dataset](#add-node-to-dataset): Turn parts of thread history into dataset examples for evaluation or further analysis. 
## Iterate on prompts -Studio supports the following methods for modifying prompts in your graph: +The Debugger supports the following methods for modifying prompts in your graph: - [Direct node editing](#direct-node-editing) - [Playground interface](#playground) ### Direct node editing -Studio allows you to edit prompts used inside individual nodes, directly from the graph interface. +The Debugger allows you to edit prompts used inside individual nodes, directly from the graph interface. #### Graph Configuration @@ -141,9 +141,9 @@ The [playground](/langsmith/create-a-prompt) interface allows testing individual ## Run experiments over a dataset -Studio lets you run [evaluations](/langsmith/evaluation-concepts) by executing your assistant against a predefined LangSmith [dataset](/langsmith/evaluation-concepts#datasets). This allows you to test performance across a variety of inputs, compare outputs to reference answers, and score results with configured [evaluators](/langsmith/evaluation-concepts#evaluators). +The Debugger lets you run [evaluations](/langsmith/evaluation-concepts) by executing your assistant against a predefined LangSmith [dataset](/langsmith/evaluation-concepts#datasets). This allows you to test performance across a variety of inputs, compare outputs to reference answers, and score results with configured [evaluators](/langsmith/evaluation-concepts#evaluators). -This guide shows you how to run a full end-to-end experiment directly from Studio. +This guide shows you how to run a full end-to-end experiment directly from the Debugger. ### Prerequisites @@ -152,26 +152,26 @@ Before running an experiment, ensure you have the following: - **A LangSmith dataset**: Your dataset should contain the inputs you want to test and optionally, reference outputs for comparison. The schema for the inputs must match the required input schema for the assistant. For more information on schemas, see [here](/oss/langgraph/use-graph-api#schema). 
For more on creating datasets, refer to [How to Manage Datasets](/langsmith/manage-datasets-in-application#set-up-your-dataset). - **(Optional) Evaluators**: You can attach evaluators (e.g., LLM-as-a-Judge, heuristics, or custom functions) to your dataset in LangSmith. These will run automatically after the graph has processed all inputs. - **A running application**: The experiment can be run against: - - An application deployed on [LangSmith Platform](/langsmith/deployment-quickstart). + - An application deployed on [LangSmith Platform](/langsmith/deployments). - A locally running application started via the [langgraph-cli](/langsmith/local-server). ### Experiment setup -1. Launch the experiment. Click the **Run experiment** button in the top right corner of the Studio page. +1. Launch the experiment. Click the **Run experiment** button in the top right corner of the Debugger page. 1. Select your dataset. In the modal that appears, select the dataset (or a specific dataset split) to use for the experiment and click **Start**. 1. Monitor the progress. All of the inputs in the dataset will now be run against the active assistant. Monitor the experiment's progress via the badge in the top right corner. -1. You can continue to work in Studio while the experiment runs in the background. Click the arrow icon button at any time to navigate to LangSmith and view the detailed experiment results. +1. You can continue to work in the Debugger while the experiment runs in the background. Click the arrow icon button at any time to navigate to LangSmith and view the detailed experiment results. ## Debug LangSmith traces -This guide explains how to open LangSmith traces in Studio for interactive investigation and debugging. +This guide explains how to open LangSmith traces in the Debugger for interactive investigation and debugging. ### Open deployed threads 1. Open the LangSmith trace, selecting the root run. -1. Click **Run in Studio**. +1. Click **Run in Debugger**. 
-This will open Studio connected to the associated deployment with the trace's parent thread selected. +This will open the Debugger connected to the associated deployment with the trace's parent thread selected. ### Testing local agents with remote traces @@ -192,12 +192,12 @@ This section explains how to test a local agent against remote traces from LangS #### Clone thread 1. Open the LangSmith trace, selecting the root run. -2. Click the dropdown next to **Run in Studio**. +2. Click the dropdown next to **Run in Debugger**. 3. Enter your local agent's URL. 4. Select **Clone thread locally**. 5. If multiple graphs exist, select the target graph. -A new thread will be created in your local agent with the thread history inferred and copied from the remote thread, and you will be navigated to Studio for your locally running application. +A new thread will be created in your local agent with the thread history inferred and copied from the remote thread, and you will be navigated to the Debugger for your locally running application. ## Add node to dataset diff --git a/src/langsmith/langgraph-platform-logs.mdx b/src/langsmith/platform-logs.mdx similarity index 97% rename from src/langsmith/langgraph-platform-logs.mdx rename to src/langsmith/platform-logs.mdx index 7e2e68347..07dd28d79 100644 --- a/src/langsmith/langgraph-platform-logs.mdx +++ b/src/langsmith/platform-logs.mdx @@ -11,7 +11,7 @@ Viewing server logs for a trace only works with the [Cloud SaaS](https://langcha ## Access server logs from trace view -In the trace view, use the **See Logs** button in the top right corner, next to the **Run in Studio** button. +In the trace view, use the **See Logs** button in the top right corner, next to the **Run in Debugger** button. 
![](/langsmith/images/view-server-logs-button.png) diff --git a/src/langsmith/quick-start-studio.mdx b/src/langsmith/quick-start-debugger.mdx similarity index 62% rename from src/langsmith/quick-start-studio.mdx rename to src/langsmith/quick-start-debugger.mdx index 6a02a32e6..34ff1da40 100644 --- a/src/langsmith/quick-start-studio.mdx +++ b/src/langsmith/quick-start-debugger.mdx @@ -1,26 +1,26 @@ --- -title: Get started with Studio -sidebarTitle: Use Studio +title: Get started with the Debugger +sidebarTitle: Quickstart --- -[Studio](/langsmith/langgraph-studio) supports connecting to two types of graphs: +The [Debugger](/langsmith/debugger) in the [LangSmith Deployments UI](https://smith.langchain.com) supports connecting to two types of graphs: - Graphs deployed on [cloud or self-hosted](#deployed-graphs). - Graphs running locally with [LangGraph server](#local-development-server). ## Deployed graphs -Studio is accessed in the [LangSmith UI](https://smith.langchain.com) from **Deployments**. +The Debugger is accessed in the [LangSmith UI](https://smith.langchain.com) from the **Deployments** navigation. -For applications that are [deployed](/langsmith/deployment-quickstart), you can access Studio as part of that deployment. To do so, navigate to the deployment in the UI and select **Studio**. +For applications that are [deployed](/langsmith/deployment-quickstart), you can access the Debugger as part of that deployment. To do so, navigate to the deployment in the UI and select **Debugger**. -This will load the Studio connected to your live deployment, allowing you to create, read, and update the [threads](/oss/langgraph/persistence#threads), [assistants](/langsmith/assistants), and [memory](/oss/concepts/memory) in that deployment. 
+This will load the Debugger connected to your live deployment, allowing you to create, read, and update the [threads](/oss/langgraph/persistence#threads), [assistants](/langsmith/assistants), and [memory](/oss/concepts/memory) in that deployment. ## Local development server ### Prerequisites -To test your application locally using Studio: +To test your application locally using the Debugger: - Follow the [local application quickstart](/langsmith/local-server) first. - If you don't want data [traced](/langsmith/observability-concepts#traces) to LangSmith, set `LANGSMITH_TRACING=false` in your application's `.env` file. With tracing disabled, no data leaves your local server. @@ -47,7 +47,7 @@ To test your application locally using Studio: **Browser Compatibility** - Safari blocks `localhost` connections to Studio. To work around this, run the command with `--tunnel` to access Studio via a secure tunnel. + Safari blocks `localhost` connections to the Debugger. To work around this, run the command with `--tunnel` to access the Debugger via a secure tunnel. This will start the LangGraph Server locally, running in-memory. The server will run in watch mode, listening for and automatically restarting on code changes. Read this [reference](/langsmith/cli#dev) to learn about all the options for starting the API server. @@ -64,11 +64,11 @@ To test your application locally using Studio: > - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 ``` - Once running, you will automatically be directed to Studio. + Once running, you will automatically be directed to the Debugger. -1. For a running server, access Studio with one of the following: +1. For a running server, access the Debugger with one of the following: 1. Directly navigate to the following URL: `https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024`. - 1.
Navigate to **Deployments** in the UI, click the **Studio** button on a deployment, enter `http://127.0.0.1:2024` and click **Connect**. + 1. Navigate to **Deployments** in the UI, click the **Debugger** button on a deployment, enter `http://127.0.0.1:2024` and click **Connect**. If running your server at a different host or port, update the `baseUrl` to match. @@ -120,16 +120,16 @@ Then attach your preferred debugger: -For issues getting started, refer to the [troubleshooting guide](/langsmith/troubleshooting-studio). +For issues getting started, refer to the [troubleshooting guide](/langsmith/troubleshooting-debugger). ## Next steps -For more information on how to run studio, refer to the following guides: +For more information on how to run the Debugger, refer to the following guides: -- [Run application](/langsmith/use-studio#run-application) -- [Manage assistants](/langsmith/use-studio#manage-assistants) -- [Manage threads](/langsmith/use-studio#manage-threads) -- [Iterate on prompts](/langsmith/observability-studio) -- [Debug LangSmith traces](/langsmith/observability-studio#debug-langsmith-traces) -- [Add node to dataset](/langsmith/observability-studio#add-node-to-dataset) +- [Run application](/langsmith/use-debugger#run-application) +- [Manage assistants](/langsmith/use-debugger#manage-assistants) +- [Manage threads](/langsmith/use-debugger#manage-threads) +- [Iterate on prompts](/langsmith/observability-debugger) +- [Debug LangSmith traces](/langsmith/observability-debugger#debug-langsmith-traces) +- [Add node to dataset](/langsmith/observability-debugger#add-node-to-dataset) diff --git a/src/langsmith/self-hosted.mdx b/src/langsmith/self-hosted.mdx index 654089ff8..a412b9143 100644 --- a/src/langsmith/self-hosted.mdx +++ b/src/langsmith/self-hosted.mdx @@ -19,7 +19,7 @@ Self-hosting allows you to run all components entirely within your own cloud env Model | Includes | Best for | Methods 
------------------|------------------|----------|-------------------- **LangSmith Platform** |
  • LangSmith app (UI + API)
  • Backend services (queue, playground, ACE)
  • Datastores: PostgreSQL, Redis, ClickHouse, optional blob storage
|
  • Teams who need self-hosted observability, tracing, and evaluation
  • Running the LangSmith app without deploying agents/graphs
|
  • Docker Compose (dev/test)
  • Kubernetes + Helm (production)
-**With agent deployment** |
  • Everything from LangSmith Platform
  • Control plane (deployments UI, revision management, Studio)
  • Data plane (LangGraph Server pods)
  • Kubernetes operator for orchestration
|
  • Enterprise teams needing a private LangChain Cloud
  • Centralized UI/API for managing multiple agents/graphs
  • Integrated observability and orchestration
|
  • Kubernetes with Helm (required)
  • Runs on EKS, GKE, AKS, or self-managed clusters
+**With agent deployment** |
  • Everything from LangSmith Platform
  • Control plane (deployments UI, revision management, the Debugger)
  • Data plane (LangGraph Server pods)
  • Kubernetes operator for orchestration
|
  • Enterprise teams needing a private LangChain Cloud
  • Centralized UI/API for managing multiple agents/graphs
  • Integrated observability and orchestration
|
  • Kubernetes with Helm (required)
  • Runs on EKS, GKE, AKS, or self-managed clusters
**Standalone server** |
  • LangGraph Server container(s)
  • Requires PostgreSQL + Redis (shared or dedicated)
  • Optional LangSmith integration for tracing
|
  • Lightweight deployments of one or a few agents
  • Integrating LangGraph Servers as microservices
  • Teams preferring to manage scaling & CI/CD themselves
|
  • Docker / Docker Compose (dev/test)
  • Kubernetes + Helm (production)
  • Any container runtime or VM (ECS, EC2, ACI, etc.)
@@ -103,7 +103,7 @@ You run both the control plane and the data plane entirely within your own infra ### Requirements -1. Use the `langgraph-cli` or Studio to test your graph locally. +1. Use the `langgraph-cli` or the [Debugger](/langsmith/debugger) to test your graph locally. 2. Build a Docker image with `langgraph build`. 3. Deploy your LangGraph Server via the LangSmith control plane UI or through your container tooling of choice. 4. All agents are deployed as Kubernetes services behind the ingress configured for your LangSmith instance. @@ -137,19 +137,15 @@ Do not run standalone servers in serverless environments. Scale-to-zero may caus ### Workflow -1. Define and test your graph locally using the `langgraph-cli` or Studio. +1. Define and test your graph locally using the `langgraph-cli` or the [Debugger](/langsmith/debugger). 2. Package your agent as a Docker image. 3. Deploy the LangGraph Server to your compute platform of choice (Kubernetes, Docker, VM). 4. Optionally, configure LangSmith API keys and endpoints so the server reports traces and evaluations back to LangSmith Platform (self-hosted or SaaS). ---- - ### Architecture ![Standalone Container](/langsmith/images/langgraph-platform-deployment-architecture.png) ---- - ### Supported Compute Platforms - **Kubernetes**: Use the LangSmith Helm chart to run LangGraph Servers in a Kubernetes cluster. This is the recommended option for production-grade deployments. 
diff --git a/src/langsmith/set-up-custom-auth.mdx b/src/langsmith/set-up-custom-auth.mdx
index 4e88cb649..63962867b 100644
--- a/src/langsmith/set-up-custom-auth.mdx
+++ b/src/langsmith/set-up-custom-auth.mdx
@@ -55,7 +55,7 @@
 npx @langchain/langgraph-cli dev
 ```
 
-The server will start and open the studio in your browser:
+The server will start and open the [Debugger](/langsmith/debugger) in your browser:
 
 ```
 > - 🚀 API: http://127.0.0.1:2024
@@ -142,7 +142,7 @@ Start the server again to test everything out:
 langgraph dev --no-browser
 ```
 
-If you didn't add the `--no-browser`, the studio UI will open in the browser. You may wonder, how is the studio able to still connect to our server? By default, we also permit access from the Studio, even when using custom auth. This makes it easier to develop and test your bot in the studio. You can remove this alternative authentication option by setting `disable_studio_auth: "true"` in your auth configuration:
+If you didn't add the `--no-browser`, the Debugger UI will open in the browser. By default, we also permit access from the Debugger, even when using custom auth. This makes it easier to develop and test your bot in the Debugger. You can remove this alternative authentication option by setting `disable_studio_auth: "true"` in your auth configuration:
 
 ```json
 {
diff --git a/src/langsmith/setup-javascript.mdx b/src/langsmith/setup-javascript.mdx
index 1c0f8fa3b..d273e80a6 100644
--- a/src/langsmith/setup-javascript.mdx
+++ b/src/langsmith/setup-javascript.mdx
@@ -153,7 +153,7 @@
 const workflow = new StateGraph(MessagesAnnotation)
   // will be called after the source node is called.
   routeModelOutput,
   // List of the possible destinations the conditional edge can route to.
-  // Required for conditional edges to properly render the graph in Studio
+  // Required for conditional edges to properly render the graph in the Debugger
   ["tools", "__end__"]
 )
 // This means that after `tools` is called, `callModel` node is called next.
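For reference, the `disable_studio_auth` setting shown in the set-up-custom-auth hunk above lives under the `auth` key of `langgraph.json`. A minimal sketch, with a placeholder path pointing at your own auth module:

```json
{
  "graphs": {
    "agent": "./src/agent/graph.py:graph"
  },
  "auth": {
    "path": "./src/security/auth.py:auth",
    "disable_studio_auth": "true"
  }
}
```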
diff --git a/src/langsmith/troubleshooting-studio.mdx b/src/langsmith/troubleshooting-debugger.mdx
similarity index 77%
rename from src/langsmith/troubleshooting-studio.mdx
rename to src/langsmith/troubleshooting-debugger.mdx
index 4a7f877fe..98174c3ec 100644
--- a/src/langsmith/troubleshooting-studio.mdx
+++ b/src/langsmith/troubleshooting-debugger.mdx
@@ -1,11 +1,11 @@
 ---
-title: Studio troubleshooting
+title: Debugger troubleshooting
 sidebarTitle: Troubleshooting
 ---
 
 ## Safari Connection Issues
 
-Safari blocks plain-HTTP traffic on localhost. When running Studio with `langgraph dev`, you may see "Failed to load assistants" errors.
+Safari blocks plain-HTTP traffic on localhost. When running the Debugger with `langgraph dev`, you may see "Failed to load assistants" errors.
 
 ### Solution 1: Use Cloudflare Tunnel
 
@@ -30,7 +30,7 @@ The command outputs a URL in this format:
 
 https://smith.langchain.com/studio/?baseUrl=https://hamilton-praise-heart-costumes.trycloudflare.com
 ```
 
-Use this URL in Safari to load Studio. Here, the `baseUrl` parameter specifies your agent server endpoint.
+Use this URL in Safari to load the Debugger. Here, the `baseUrl` parameter specifies your agent server endpoint.
 
 ### Solution 2: Use Chromium Browser
 
@@ -38,14 +38,12 @@ Chrome and other Chromium browsers allow HTTP on localhost. Use `langgraph dev`
 
 ## Brave Connection Issues
 
-Brave blocks plain-HTTP traffic on localhost when Brave Shields are enabled. When running Studio with `langgraph dev`, you may see "Failed to load assistants" errors.
+Brave blocks plain-HTTP traffic on localhost when Brave Shields are enabled. When running the Debugger with `langgraph dev`, you may see "Failed to load assistants" errors.
 
 ### Solution 1: Disable Brave Shields
 
 Disable Brave Shields for LangSmith using the Brave icon in the URL bar.
-![Brave Shields panel with Shields turned off for smith.langchain.com so Studio can reach localhost.](/langsmith/images/brave-shields.png)
-
 ### Solution 2: Use Cloudflare Tunnel
 
@@ -69,12 +67,12 @@ The command outputs a URL in this format:
 
 https://smith.langchain.com/studio/?baseUrl=https://hamilton-praise-heart-costumes.trycloudflare.com
 ```
 
-Use this URL in Brave to load Studio. Here, the `baseUrl` parameter specifies your agent server endpoint.
+Use this URL in Brave to load the Debugger. Here, the `baseUrl` parameter specifies your agent server endpoint.
 
 ## Graph Edge Issues
 
 Undefined conditional edges may show unexpected connections in your graph. This is
-because without proper definition, Studio assumes the conditional edge could access all other nodes. To address this, explicitly define the routing paths using one of these methods:
+because without proper definition, the Debugger assumes the conditional edge could access all other nodes. To address this, explicitly define the routing paths using one of these methods:
 
 ### Solution 1: Path Map
 
@@ -106,7 +104,7 @@ def routing_function(state: GraphState) -> Literal["node_b","node_c"]:
         return "node_c"
 ```
 
-## Experiment troubleshooting in Studio
+## Experiment troubleshooting in the Debugger
 
 ### **Run experiment** button is disabled
 
diff --git a/src/langsmith/use-studio.mdx b/src/langsmith/use-debugger.mdx
similarity index 80%
rename from src/langsmith/use-studio.mdx
rename to src/langsmith/use-debugger.mdx
index a5bab43f2..be281cbbf 100644
--- a/src/langsmith/use-studio.mdx
+++ b/src/langsmith/use-debugger.mdx
@@ -1,9 +1,9 @@
 ---
-title: How to use Studio
+title: How to use the Debugger
 sidebarTitle: Runs, assistants, threads
 ---
 
-This page describes the core workflows you’ll use in Studio. It explains how to run your application, manage assistant configurations, and work with conversation threads. Each section includes steps in both graph mode (full-featured view of your graph’s execution) and chat mode (lightweight conversational interface):
+This page describes the core workflows you’ll use in the Debugger. It explains how to run your application, manage assistant configurations, and work with conversation threads. Each section includes steps in both graph mode (full-featured view of your graph’s execution) and chat mode (lightweight conversational interface):
 
 - [Run application](#run-application): Execute your application or agent and observe its behavior.
 - [Manage assistants](#manage-assistants): Create, edit, and select the assistant configuration used by your application.
@@ -16,7 +16,7 @@ This page describes the core workflows you’ll use in t
 
 ### Specify input
 
-1. Define the input to your graph in the **Input** section on the left side of the page, below the graph interface. Studio will attempt to render a form for your input based on the graph's defined [state schema](/oss/langgraph/graph-api/#schema). To disable this, click the **View Raw** button, which will present you with a JSON editor.
+1. Define the input to your graph in the **Input** section on the left side of the page, below the graph interface. The Debugger will attempt to render a form for your input based on the graph's defined [state schema](/oss/langgraph/graph-api/#schema). To disable this, click the **View Raw** button, which will present you with a JSON editor.
 1. Click the up or down arrows at the top of the **Input** section to toggle through and use previously submitted inputs.
 
 ### Run settings
 
@@ -69,7 +69,7 @@ To cancel the ongoing run:
 
 ## Manage assistants
 
-Studio lets you view, edit, and update your assistants, and allows you to run your graph using these assistant configurations.
+The Debugger lets you view, edit, and update your assistants, and allows you to run your graph using these assistant configurations.
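The `routing_function` signature shown in the troubleshooting hunk above works because the `Literal` return type enumerates the possible destinations. A self-contained sketch of that pattern (node names and the state field are hypothetical; with a real `StateGraph` you would pass the function and path map to `add_conditional_edges`):

```python
from typing import Literal, TypedDict


class GraphState(TypedDict):
    route_to_b: bool


def routing_function(state: GraphState) -> Literal["node_b", "node_c"]:
    # The Literal annotation enumerates every node this edge can reach,
    # so the graph renderer draws only these two connections.
    if state["route_to_b"]:
        return "node_b"
    return "node_c"


# Equivalent explicit path map, for builders that take a path_map argument:
path_map = ["node_b", "node_c"]
```

In a real graph this would look roughly like `builder.add_conditional_edges("node_a", routing_function, path_map)`.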
 For more conceptual details, refer to the [Assistants overview](/langsmith/assistants/).
 
@@ -93,7 +93,7 @@ Chat mode enables you to switch through the different assistants in your graph v
 
 ## Manage threads
 
-Studio provides tools to view all [threads](/oss/langgraph/persistence#threads) saved on the server and edit their state. You can create new threads, switch between threads, and modify past states both in graph mode and chat mode.
+The Debugger provides tools to view all [threads](/oss/langgraph/persistence#threads) saved on the server and edit their state. You can create new threads, switch between threads, and modify past states both in graph mode and chat mode.
 
@@ -120,7 +120,7 @@ If you instead want to re-run the thread from a given checkpoint without editing
 
 1. View all threads in the right-hand pane of the page.
 1. Select the desired thread and the thread history will populate in the center panel.
-1. To create a new thread, click **+** and [submit a run](/langsmith/use-studio#run-application#chat-mode).
+1. To create a new thread, click **+** and submit a run.
 
 To edit a human message in the thread:
 
@@ -133,7 +133,7 @@ To edit a human message in the thread:
 
 ## Next steps
 
-Refer to the following guides for more detail on tasks you can complete in Studio:
+Refer to the following guides for more detail on tasks you can complete in the Debugger:
 
-- [Iterate on prompts](/langsmith/observability-studio)
-- [Run experiments over datasets](/langsmith/observability-studio#run-experiments-over-a-dataset)
+- [Iterate on prompts](/langsmith/observability-debugger)
+- [Run experiments over datasets](/langsmith/observability-debugger#run-experiments-over-a-dataset)
diff --git a/src/langsmith/use-threads.mdx b/src/langsmith/use-threads.mdx
index 24ab690c6..b5f94be41 100644
--- a/src/langsmith/use-threads.mdx
+++ b/src/langsmith/use-threads.mdx
@@ -498,4 +498,4 @@ You can also view threads in a deployment via the LangSmith Platform UI.
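The Input form described in the use-debugger hunks above is rendered from the graph's state schema, and **View Raw** exposes the same input as JSON. A minimal sketch of that correspondence (the `topic` and `messages` fields are hypothetical):

```python
import json
from typing import TypedDict


class State(TypedDict):
    # Each top-level key becomes a field in the rendered Input form.
    topic: str
    messages: list[str]


# What the "View Raw" JSON editor would accept for the same input:
raw_input = '{"topic": "weather", "messages": []}'
parsed: State = json.loads(raw_input)
```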
 Inside your deployment, select the "Threads" tab. This will load a table of all of the threads in your deployment.
 
-Select a thread to inspect its current state. To view its full history and for further debugging, open the thread in [Studio](/langsmith/langgraph-studio).
+Select a thread to inspect its current state. To view its full history and for further debugging, open the thread in the [Debugger](/langsmith/debugger).
diff --git a/src/oss/langchain/debugger.mdx b/src/oss/langchain/debugger.mdx
new file mode 100644
index 000000000..86a6dc850
--- /dev/null
+++ b/src/oss/langchain/debugger.mdx
@@ -0,0 +1,14 @@
+---
+title: Debugger
+---
+
+import AlphaCallout from '/snippets/alpha-lc-callout.mdx';
+import debugger from '/snippets/oss/debugger.mdx';
+
+
+
+
+
+For more information about local and deployed agents, see [Set up local LangGraph Server](/oss/langchain/debugger#setup-local-langgraph-server) and [Deploy](/oss/langchain/deploy).
+
diff --git a/src/oss/langchain/deploy.mdx b/src/oss/langchain/deploy.mdx
index 363e6a73c..8593d9313 100644
--- a/src/oss/langchain/deploy.mdx
+++ b/src/oss/langchain/deploy.mdx
@@ -20,6 +20,6 @@ Before you begin, ensure you have the following:
 
 ### 1. Create a repository on GitHub
 
-Your application's code must reside in a GitHub repository to be deployed on LangSmith Platform. Both public and private repositories are supported. For this quickstart, first make sure your app is LangGraph-compatible by following the [local server setup guide](/oss/langchain/studio#setup-local-langgraph-server). Then, push your code to the repository.
+Your application's code must reside in a GitHub repository to be deployed on LangSmith Platform. Both public and private repositories are supported. For this quickstart, first make sure your app is LangGraph-compatible by following the [local server setup guide](/oss/langchain/debugger#setup-local-langgraph-server). Then, push your code to the repository.
diff --git a/src/oss/langchain/sql-agent.mdx b/src/oss/langchain/sql-agent.mdx
index 0cb002c92..acf685a32 100644
--- a/src/oss/langchain/sql-agent.mdx
+++ b/src/oss/langchain/sql-agent.mdx
@@ -675,10 +675,10 @@ The agent correctly wrote a query, checked the query, and ran it to inform its f
 
 You can inspect all aspects of the above run, including steps taken, tools invoked, what prompts were seen by the LLM, and more in the [LangSmith trace](https://smith.langchain.com/public/653d218b-af67-4854-95ca-6abecb9b2520/r).
 
-#### (Optional) Use Studio
+#### (Optional) Use the Debugger
 
-[Studio](/langsmith/langgraph-studio) provides a "client side" loop as well as memory so you can run this as a chat interface and query the database. You can ask questions like "Tell me the scheme of the database" or "Show me the invoices for the 5 top customers". You will see the SQL command that is generated and the resulting output. The details of how to get that started are below.
-
+The [Debugger](/langsmith/debugger) provides a "client side" loop as well as memory so you can run this as a chat interface and query the database. You can ask questions like "Tell me the schema of the database" or "Show me the invoices for the 5 top customers". You will see the SQL command that is generated and the resulting output. The details of how to get that started are below.
+
 In addition to the previously mentioned packages, you will need to:
diff --git a/src/oss/langchain/studio.mdx b/src/oss/langchain/studio.mdx
deleted file mode 100644
index 1c40c0038..000000000
--- a/src/oss/langchain/studio.mdx
+++ /dev/null
@@ -1,14 +0,0 @@
----
-title: Studio
----
-
-import AlphaCallout from '/snippets/alpha-lc-callout.mdx';
-import studio from '/snippets/oss/studio.mdx';
-
-
-
-
-
-
-For more information about local and deployed agents, see [Set up local LangGraph Server](/oss/langchain/studio#setup-local-langgraph-server) and [Deploy](/oss/langchain/deploy).
-
diff --git a/src/oss/langchain/ui.mdx b/src/oss/langchain/ui.mdx
index cbd4dbd4a..389805f2b 100644
--- a/src/oss/langchain/ui.mdx
+++ b/src/oss/langchain/ui.mdx
@@ -11,7 +11,7 @@
 import chat_ui from '/snippets/oss/ui.mdx';
 
 ### Connect to your agent
 
-Agent Chat UI can connect to both [local](/oss/langchain/studio#setup-local-langgraph-server) and [deployed agents](/oss/langchain/deploy).
+Agent Chat UI can connect to both [local](/oss/langchain/debugger#setup-local-langgraph-server) and [deployed agents](/oss/langchain/deploy).
 
 After starting Agent Chat UI, you'll need to configure it to connect to your agent:
diff --git a/src/oss/langgraph/add-human-in-the-loop.mdx b/src/oss/langgraph/add-human-in-the-loop.mdx
index 1779c754f..ec512c1eb 100644
--- a/src/oss/langgraph/add-human-in-the-loop.mdx
+++ b/src/oss/langgraph/add-human-in-the-loop.mdx
@@ -1614,13 +1614,13 @@ Static interrupts are **not** recommended for human-in-the-loop workflows. Use [
 ```
 
-### Use static interrupts in LangGraph Studio
+### Use static interrupts in the Debugger
 
-You can use [LangGraph Studio](/langsmith/langgraph-studio) to debug your graph. You can set static breakpoints in the UI and then run the graph. You can also use the UI to inspect the graph state at any point in the execution.
+You can use [the Debugger](/langsmith/debugger) to debug your graph. You can set static breakpoints in the UI and then run the graph. You can also use the UI to inspect the graph state at any point in the execution.
 
 ![image](/oss/images/static-interrupt.png)
 
-LangGraph Studio is free with [locally deployed applications](/oss/langgraph/local-server) using `langgraph dev`.
+The Debugger is free with [locally deployed applications](/oss/langgraph/local-server) using `langgraph dev`.
 :::
 
 ## Considerations
diff --git a/src/oss/langgraph/local-server.mdx b/src/oss/langgraph/local-server.mdx
index c7d05b368..ff179cbeb 100644
--- a/src/oss/langgraph/local-server.mdx
+++ b/src/oss/langgraph/local-server.mdx
@@ -123,9 +123,9 @@ Sample output:
 
 The `langgraph dev` command starts LangGraph Server in an in-memory mode. This mode is suitable for development and testing purposes. For production use, deploy LangGraph Server with access to a persistent storage backend. For more information, see [Deployment options](/langsmith/deployment-options).
 
-## 6. Test your application in LangGraph Studio
+## 6. Test your application in the Debugger
 
-[LangGraph Studio](/langsmith/langgraph-studio) is a specialized UI that you can connect to LangGraph API server to visualize, interact with, and debug your application locally. Test your graph in LangGraph Studio by visiting the URL provided in the output of the `langgraph dev` command:
+The [Debugger](/langsmith/debugger) is a specialized UI that you can connect to the LangGraph API server to visualize, interact with, and debug your application locally. Test your graph in the Debugger by visiting the URL provided in the output of the `langgraph dev` command:
 
 ```
 > - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024
diff --git a/src/oss/langgraph/observability.mdx b/src/oss/langgraph/observability.mdx
index 395da9302..5271c032c 100644
--- a/src/oss/langgraph/observability.mdx
+++ b/src/oss/langgraph/observability.mdx
@@ -9,7 +9,7 @@ import observability from '/snippets/oss/observability.mdx';
 
 Traces are a series of steps that your application takes to go from input to output. Each of these individual steps is represented by a run. You can use [LangSmith](https://smith.langchain.com/) to visualize these execution steps. To use it, [enable tracing for your application](/langsmith/trace-with-langgraph).
 This enables you to do the following:
 
-* [Debug a locally running application](/langsmith/observability-studio#debug-langsmith-traces).
+* [Debug a locally running application](/langsmith/observability-debugger#debug-langsmith-traces).
 * [Evaluate the application performance](/oss/langchain/evals).
 * [Monitor the application](/langsmith/dashboards).
diff --git a/src/oss/langgraph/overview.mdx b/src/oss/langgraph/overview.mdx
index 5167babff..a6f733f66 100644
--- a/src/oss/langgraph/overview.mdx
+++ b/src/oss/langgraph/overview.mdx
@@ -101,7 +101,7 @@ LangGraph provides low-level supporting infrastructure for *any* long-running, s
 
 While LangGraph can be used standalone, it also integrates seamlessly with any LangChain product, giving developers a full suite of tools for building agents. To improve your LLM application development, pair LangGraph with:
 
 * [LangSmith](http://www.langchain.com/langsmith) — Helpful for agent evals and observability. Debug poor-performing LLM app runs, evaluate agent trajectories, gain visibility in production, and improve performance over time.
-* [LangSmith Platform](/langgraph-platform) — Deploy and scale agents effortlessly with a purpose-built deployment platform for long running, stateful workflows. Discover, reuse, configure, and share agents across teams — and iterate quickly with visual prototyping in [LangGraph Studio](/langsmith/langgraph-studio).
+* [LangSmith Platform](/langgraph-platform) — Deploy and scale agents effortlessly with a purpose-built deployment platform for long running, stateful workflows. Discover, reuse, configure, and share agents across teams — and iterate quickly with visual prototyping in [the Debugger](/langsmith/debugger).
 * [LangChain](/oss/langchain/overview) - Provides integrations and composable components to streamline LLM application development. Contains agent abstractions built on top of LangGraph.
 ## Acknowledgements
diff --git a/src/oss/langgraph/persistence.mdx b/src/oss/langgraph/persistence.mdx
index e3d32aa32..73c37faf3 100644
--- a/src/oss/langgraph/persistence.mdx
+++ b/src/oss/langgraph/persistence.mdx
@@ -1046,7 +1046,7 @@ for await (const update of await graph.stream(
 ```
 :::
 
-When we use the LangSmith Platform, either locally (e.g., in LangGraph Studio) or with LangSmith Platform, the base store is available to use by default and does not need to be specified during graph compilation. To enable semantic search, however, you **do** need to configure the indexing settings in your `langgraph.json` file. For example:
+When we use the LangSmith Platform, either locally (e.g., in the Debugger) or with LangSmith Platform, the base store is available to use by default and does not need to be specified during graph compilation. To enable semantic search, however, you **do** need to configure the indexing settings in your `langgraph.json` file. For example:
 
 ```json
 {
diff --git a/src/oss/langgraph/studio.mdx b/src/oss/langgraph/studio.mdx
index b3cc851e4..b95af8487 100644
--- a/src/oss/langgraph/studio.mdx
+++ b/src/oss/langgraph/studio.mdx
@@ -1,10 +1,10 @@
 ---
-title: Studio
+title: Debugger
 ---
 
 import AlphaCallout from '/snippets/alpha-lg-callout.mdx';
-import studio from '/snippets/oss/studio.mdx';
+import debugger from '/snippets/oss/debugger.mdx';
 
-
+
diff --git a/src/oss/langgraph/ui.mdx b/src/oss/langgraph/ui.mdx
index 2ae1f9286..5a7aea1dc 100644
--- a/src/oss/langgraph/ui.mdx
+++ b/src/oss/langgraph/ui.mdx
@@ -11,7 +11,7 @@
 import chat_ui from '/snippets/oss/ui.mdx';
 
 ### Connect to your agent
 
-Agent Chat UI can connect to both [local](/oss/langgraph/studio#setup-local-langgraph-server) and [deployed agents](/oss/langgraph/deploy).
+Agent Chat UI can connect to both [local](/oss/langgraph/debugger#setup-local-langgraph-server) and [deployed agents](/oss/langgraph/deploy).
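The `langgraph.json` example truncated at the end of the persistence hunk configures the store's indexing settings for semantic search. A hedged sketch of what such a configuration can look like (the embedding model name and dimensions are illustrative):

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./graph.py:graph"
  },
  "store": {
    "index": {
      "embed": "openai:text-embedding-3-small",
      "dims": 1536,
      "fields": ["$"]
    }
  }
}
```

Here `fields` selects which parts of stored items are embedded; `"$"` indexes the whole item.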
 After starting Agent Chat UI, you'll need to configure it to connect to your agent:
diff --git a/src/oss/langgraph/use-graph-api.mdx b/src/oss/langgraph/use-graph-api.mdx
index a64684756..67f646c63 100644
--- a/src/oss/langgraph/use-graph-api.mdx
+++ b/src/oss/langgraph/use-graph-api.mdx
@@ -1465,7 +1465,7 @@ This allows state to be checkpointed in between the execution of nodes, so your
 
 * How we can "rewind" and branch-off executions using LangGraph's [time travel](/oss/langgraph/use-time-travel) features
 
 They also determine how execution steps are [streamed](/oss/langgraph/streaming), and how your application is visualized
-and debugged using [LangGraph Studio](/langsmith/langgraph-studio).
+and debugged using [the Debugger](/langsmith/debugger).
 
 Let's demonstrate an end-to-end example. We will create a sequence of three steps:
diff --git a/src/snippets/oss/studio.mdx b/src/snippets/oss/debugger.mdx
similarity index 69%
rename from src/snippets/oss/studio.mdx
rename to src/snippets/oss/debugger.mdx
index 7ceb3b3cd..064383fa1 100644
--- a/src/snippets/oss/studio.mdx
+++ b/src/snippets/oss/debugger.mdx
@@ -1,12 +1,12 @@
-This guide will walk you through how to use **LangGraph Studio** to visualize, interact, and debug your agent locally.
+This guide will walk you through how to use the **Debugger** to visualize, interact with, and debug your agent locally.
 
-LangGraph Studio is our free-to-use, powerful agent IDE that integrates with [LangSmith](/langsmith/home) to enable tracing, evaluation, and prompt engineering. See exactly how your agent thinks, trace every decision, and ship smarter, more reliable agents.
+The Debugger is our free-to-use, powerful agent IDE that integrates with [LangSmith](/langsmith/home) to enable tracing, evaluation, and prompt engineering. See exactly how your agent thinks, trace every decision, and ship smarter, more reliable agents.
diff --git a/src/langsmith/deploy-to-cloud.mdx b/src/langsmith/deploy-to-cloud.mdx
index ba8bd004f..d69376d68 100644
--- a/src/langsmith/deploy-to-cloud.mdx
+++ b/src/langsmith/deploy-to-cloud.mdx
@@ -28,9 +28,9 @@ Starting from the LangSmi
 2. Select the desired `Deployment Type`.
    1. `Development` deployments are meant for non-production use cases and are provisioned with minimal resources.
    2. `Production` deployments can serve up to 500 requests/second and are provisioned with highly available storage with automatic backups.
-3. Determine if the deployment should be `Shareable through the Debugger`.
+3. Determine if the deployment should be `Shareable through Studio`.
    1. If unchecked, the deployment will only be accessible with a valid LangSmith API key for the workspace.
-   2. If checked, the deployment will be accessible through the Debugger to any LangSmith user. A direct URL to the Debugger for the deployment will be provided to share with other LangSmith users.
+   2. If checked, the deployment will be accessible through Studio to any LangSmith user. A direct URL to Studio for the deployment will be provided to share with other LangSmith users.
 4. Specify `Environment Variables` and secrets. See the [Environment Variables reference](/langsmith/env-var) to configure additional variables for the deployment.
    1. Sensitive values such as API keys (e.g. `OPENAI_API_KEY`) should be specified as secrets.
    2. Additional non-secret environment variables can be specified as well.
@@ -48,9 +48,9 @@ Starting from the LangSmi
 3. In the `Deployment` view, in the top-right corner, select `+ New Revision`.
 4. In the `New Revision` modal, fill out the required fields.
    1. Specify the full path to the [LangGraph API config file](/langsmith/cli#configuration-file) including the file name. For example, if the file `langgraph.json` is in the root of the repository, simply specify `langgraph.json`.
-   2. Determine if the deployment should be `Shareable through the Debugger`.
+   2. Determine if the deployment should be `Shareable through Studio`.
      1. If unchecked, the deployment will only be accessible with a valid LangSmith API key for the workspace.
-     2. If checked, the deployment will be accessible through the Debugger to any LangSmith user. A direct URL to the Debugger for the deployment will be provided to share with other LangSmith users.
+     2. If checked, the deployment will be accessible through Studio to any LangSmith user. A direct URL to Studio for the deployment will be provided to share with other LangSmith users.
    3. Specify `Environment Variables` and secrets. Existing secrets and environment variables are prepopulated. See the [Environment Variables reference](/langsmith/env-var) to configure additional variables for the revision.
      1. Add new secrets or environment variables.
      2. Remove existing secrets or environment variables.
diff --git a/src/langsmith/deployment-quickstart.mdx b/src/langsmith/deployment-quickstart.mdx
index cdbe1b4ea..54bef6e58 100644
--- a/src/langsmith/deployment-quickstart.mdx
+++ b/src/langsmith/deployment-quickstart.mdx
@@ -29,12 +29,12 @@ To deploy an application to **LangSmith**, your application code must reside in
 6. Click **Submit** to deploy. This may take about 15 minutes to complete. You can check the status in the **Deployment details** view.
 
-## 3. Test your application in the Debugger
+## 3. Test your application in Studio
 
 Once your application is deployed:
 
 1. Select the deployment you just created to view more details.
-2. Click the **Debugger** button in the top right corner. The [Debugger](/langsmith/debugger) will open to display your graph.
+2. Click the **Studio** button in the top right corner. [Studio](/langsmith/studio) will open to display your graph.
 
 ## 4. Get the API URL for your deployment
 
@@ -157,5 +157,5 @@ You can now test the API:
 
 You have deployed an application using LangSmith. Here are some other resources to check out:
 
-- The [Debugger overview](/langsmith/debugger)
+- The [Studio overview](/langsmith/studio)
 - [Hosting options](/langsmith/hosting)
diff --git a/src/langsmith/deployments.mdx b/src/langsmith/deployments.mdx
index 29aab4e2e..3b39fa2ff 100644
--- a/src/langsmith/deployments.mdx
+++ b/src/langsmith/deployments.mdx
@@ -1,6 +1,6 @@
 ---
 title: LangSmith Deployment
-sidebarTitle: Deployment
+sidebarTitle: Overview
 mode: wide
 ---
 
@@ -18,7 +18,7 @@ You’ll learn how to:
 
 - Build, deploy, and update [LangGraph Servers](/langsmith/langgraph-servers).
 - Secure your deployments with [authentication and access control](/langsmith/auth).
 - Customize your server runtime ([lifespan hooks](/langsmith/custom-lifespan), [middleware](/langsmith/custom-middleware), and [routes](/langsmith/custom-routes)).
-- Debug, observe, and troubleshoot deployed agents using the [Debugger UI](/langsmith/debugger).
+- Debug, observe, and troubleshoot deployed agents using the [Studio UI](/langsmith/studio).
-- Use an intuitive visual interface to design, test, and refine applications end-to-end.
+- Use a visual interface to design, test, and refine applications end-to-end.
\ No newline at end of file
diff --git a/src/langsmith/local-server.mdx b/src/langsmith/local-server.mdx
index aa1020c98..50f3b2055 100644
--- a/src/langsmith/local-server.mdx
+++ b/src/langsmith/local-server.mdx
@@ -44,8 +44,8 @@ Create a new app from the [`new-langgraph-project-python` template](https://gith
 
-**Additional templates**
-If you use `langgraph new` without specifying a template, you will be presented with an interactive menu that will allow you to choose from a list of available templates.
+**Additional templates**

+If you use [`langgraph new`](/langsmith/cli) without specifying a template, you will be presented with an interactive menu that will allow you to choose from a list of available templates.
 ## 3. Install dependencies
 
@@ -69,7 +69,7 @@ In the root of your new LangGraph app, install the dependencies in `edit` mode s
 
 ## 4. Create a `.env` file
 
-You will find a `.env.example` in the root of your new LangGraph app. Create a `.env` file in the root of your new LangGraph app and copy the contents of the `.env.example` file into it, filling in the necessary API keys:
+You will find a [`.env.example`](/langsmith/application-structure#configuration-file) in the root of your new LangGraph app. Create a `.env` file in the root of your new LangGraph app and copy the contents of the `.env.example` file into it, filling in the necessary API keys:
 
 ```bash
 LANGSMITH_API_KEY=lsv2...
 
@@ -104,7 +104,7 @@ Sample output:
 
 > - Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024
 ```
 
-The `langgraph dev` command starts LangGraph Server in an in-memory mode. This mode is suitable for development and testing purposes.
+The [`langgraph dev`](/langsmith/cli) command starts [LangGraph Server](/langsmith/langgraph-server) in an in-memory mode. This mode is suitable for development and testing purposes.
 
 For production use, deploy LangGraph Server with access to a persistent storage backend. For more information, refer to the LangSmith [hosting options](/langsmith/hosting).
 
@@ -227,7 +227,7 @@ For production use, deploy LangGraph Server with access to a persistent storage
 
 Now that you have a LangGraph app running locally, take your journey further by exploring features and deployment:
 
-- The [Debugger](/langsmith/debugger) is a specialized UI that you can connect to LangGraph API server to visualize, interact with, and debug your application locally. Try the [Debugger quickstart](/langsmith/quick-start-debugger).
+- [Studio](/langsmith/studio) is a specialized UI that you can connect to the LangGraph API server to visualize, interact with, and debug your application locally. Try the [Studio quickstart](/langsmith/quick-start-studio).
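The `.env` hunk above shows only the first key; a typical local-server `.env` might contain a couple of entries like the following (both values are placeholders, and which provider keys you need depends on your graph):

```bash
LANGSMITH_API_KEY=lsv2...
OPENAI_API_KEY=sk-...
```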
 - [Deploy on cloud](/langsmith/deployment-quickstart) with the quickstart guide.
 - [LangGraph Server API Reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/): Explore the LangGraph Server API documentation.
 - [Python SDK Reference](/langsmith/python-sdk): Explore the Python SDK API Reference.
diff --git a/src/langsmith/observability-llm-tutorial.mdx b/src/langsmith/observability-llm-tutorial.mdx
index b825ad849..7aaebee02 100644
--- a/src/langsmith/observability-llm-tutorial.mdx
+++ b/src/langsmith/observability-llm-tutorial.mdx
@@ -1,6 +1,6 @@
 ---
-title: Trace a RAG application
-sidebarTitle: Tutorial - Trace a RAG application
+title: Trace a RAG application tutorial
+sidebarTitle: Trace a RAG application
 ---
 
 In this tutorial, we'll build a simple RAG application using the OpenAI SDK. We'll add observability to the application at each stage of development, from prototyping to production.
diff --git a/src/langsmith/observability-quickstart.mdx b/src/langsmith/observability-quickstart.mdx
index aa854c472..37b14fed4 100644
--- a/src/langsmith/observability-quickstart.mdx
+++ b/src/langsmith/observability-quickstart.mdx
@@ -1,6 +1,6 @@
 ---
 title: Tracing quickstart
-sidebarTitle: Trace an app
+sidebarTitle: Quickstart
 ---
 
 [_Observability_](/langsmith/observability-concepts) is a critical requirement for applications built with large language models (LLMs). LLMs are non-deterministic, which means that the same prompt can produce different responses. This behavior makes debugging and monitoring more challenging than with traditional software.
diff --git a/src/langsmith/observability-debugger.mdx b/src/langsmith/observability-studio.mdx similarity index 83% rename from src/langsmith/observability-debugger.mdx rename to src/langsmith/observability-studio.mdx index d385dbdb8..afe65ce4e 100644 --- a/src/langsmith/observability-debugger.mdx +++ b/src/langsmith/observability-studio.mdx @@ -1,25 +1,25 @@ --- -title: Observability in the Debugger +title: Observability in Studio sidebarTitle: Traces, datasets, prompts --- -The LangSmith Deployments [Debugger](/langsmith/debugger) provides tools to inspect, debug, and improve your app beyond execution. By working with traces, datasets, and prompts, you can see how your application behaves in detail, measure its performance, and refine its outputs: +LangSmith [Studio](/langsmith/studio) provides tools to inspect, debug, and improve your app beyond execution. By working with traces, datasets, and prompts, you can see how your application behaves in detail, measure its performance, and refine its outputs: - [Iterate on prompts](#iterate-on-prompts): Modify prompts inside graph nodes directly or with the LangSmith playground. - [Run experiments over a dataset](#run-experiments-over-a-dataset): Execute your assistant over a LangSmith dataset to score and compare results. -- [Debug LangSmith traces](#debug-langsmith-traces): Import traced runs into the Debugger and optionally clone them into your local agent. +- [Debug LangSmith traces](#debug-langsmith-traces): Import traced runs into Studio and optionally clone them into your local agent. - [Add a node to a dataset](#add-node-to-dataset): Turn parts of thread history into dataset examples for evaluation or further analysis. 
## Iterate on prompts -The Debugger supports the following methods for modifying prompts in your graph: +Studio supports the following methods for modifying prompts in your graph: - [Direct node editing](#direct-node-editing) - [Playground interface](#playground) ### Direct node editing -The Debugger allows you to edit prompts used inside individual nodes, directly from the graph interface. +Studio allows you to edit prompts used inside individual nodes, directly from the graph interface. #### Graph Configuration @@ -141,9 +141,9 @@ The [playground](/langsmith/create-a-prompt) interface allows testing individual ## Run experiments over a dataset -The Debugger lets you run [evaluations](/langsmith/evaluation-concepts) by executing your assistant against a predefined LangSmith [dataset](/langsmith/evaluation-concepts#datasets). This allows you to test performance across a variety of inputs, compare outputs to reference answers, and score results with configured [evaluators](/langsmith/evaluation-concepts#evaluators). +Studio lets you run [evaluations](/langsmith/evaluation-concepts) by executing your assistant against a predefined LangSmith [dataset](/langsmith/evaluation-concepts#datasets). This allows you to test performance across a variety of inputs, compare outputs to reference answers, and score results with configured [evaluators](/langsmith/evaluation-concepts#evaluators). -This guide shows you how to run a full end-to-end experiment directly from the Debugger. +This guide shows you how to run a full end-to-end experiment directly from Studio. ### Prerequisites @@ -157,21 +157,21 @@ Before running an experiment, ensure you have the following: ### Experiment setup -1. Launch the experiment. Click the **Run experiment** button in the top right corner of the Debugger page. +1. Launch the experiment. Click the **Run experiment** button in the top right corner of the Studio page. 1. Select your dataset. 
In the modal that appears, select the dataset (or a specific dataset split) to use for the experiment and click **Start**. 1. Monitor the progress. All of the inputs in the dataset will now be run against the active assistant. Monitor the experiment's progress via the badge in the top right corner. -1. You can continue to work in the Debugger while the experiment runs in the background. Click the arrow icon button at any time to navigate to LangSmith and view the detailed experiment results. +1. You can continue to work in Studio while the experiment runs in the background. Click the arrow icon button at any time to navigate to LangSmith and view the detailed experiment results. ## Debug LangSmith traces -This guide explains how to open LangSmith traces in the Debugger for interactive investigation and debugging. +This guide explains how to open LangSmith traces in Studio for interactive investigation and debugging. ### Open deployed threads 1. Open the LangSmith trace, selecting the root run. -1. Click **Run in Debugger**. +1. Click **Run in Studio**. -This will open the Debugger connected to the associated deployment with the trace's parent thread selected. +This will open Studio connected to the associated deployment with the trace's parent thread selected. ### Testing local agents with remote traces @@ -192,12 +192,12 @@ This section explains how to test a local agent against remote traces from LangS #### Clone thread 1. Open the LangSmith trace, selecting the root run. -2. Click the dropdown next to **Run in Debugger**. +2. Click the dropdown next to **Run in Studio**. 3. Enter your local agent's URL. 4. Select **Clone thread locally**. 5. If multiple graphs exist, select the target graph. -A new thread will be created in your local agent with the thread history inferred and copied from the remote thread, and you will be navigated to the Debugger for your locally running application. 
+A new thread will be created in your local agent with the thread history inferred and copied from the remote thread, and you will be navigated to Studio for your locally running application. ## Add node to dataset diff --git a/src/langsmith/platform-logs.mdx b/src/langsmith/platform-logs.mdx index ef73b0a80..a4c3931e3 100644 --- a/src/langsmith/platform-logs.mdx +++ b/src/langsmith/platform-logs.mdx @@ -11,7 +11,7 @@ Viewing server logs for a trace only works with the [Cloud SaaS](https://langcha ## Access server logs from trace view -In the trace view, use the **See Logs** button in the top right corner, next to the **Run in Debugger** button. +In the trace view, use the **See Logs** button in the top right corner, next to the **Run in Studio** button. ![](/langsmith/images/view-server-logs-button.png) diff --git a/src/langsmith/prompt-engineering-quickstart.mdx b/src/langsmith/prompt-engineering-quickstart.mdx index cc1b3d882..c62b0741c 100644 --- a/src/langsmith/prompt-engineering-quickstart.mdx +++ b/src/langsmith/prompt-engineering-quickstart.mdx @@ -1,6 +1,6 @@ --- title: Prompt engineering quickstart -sidebarTitle: Test prompts +sidebarTitle: Quickstart --- import WorkspaceSecret from '/snippets/langsmith/set-workspace-secrets.mdx'; diff --git a/src/langsmith/quick-start-debugger.mdx b/src/langsmith/quick-start-studio.mdx similarity index 63% rename from src/langsmith/quick-start-debugger.mdx rename to src/langsmith/quick-start-studio.mdx index 34ff1da40..fda5131bb 100644 --- a/src/langsmith/quick-start-debugger.mdx +++ b/src/langsmith/quick-start-studio.mdx @@ -1,26 +1,26 @@ --- -title: Get started with the Debugger +title: Get started with Studio sidebarTitle: Quickstart --- -The [Debugger](/langsmith/debugger) in the [LangSmith Deployments UI](https://smith.langchain.com) supports connecting to two types of graphs: +[Studio](/langsmith/studio) in the [LangSmith Deployments UI](https://smith.langchain.com) supports connecting to two types of graphs: - 
Graphs deployed on [cloud or self-hosted](#deployed-graphs). - Graphs running locally with [LangGraph server](#local-development-server). ## Deployed graphs -The Debugger is accessed in the [LangSmith UI](https://smith.langchain.com) from the **Deployments** navigation. +Studio is accessed in the [LangSmith UI](https://smith.langchain.com) from the **Deployments** navigation. -For applications that are [deployed](/langsmith/deployment-quickstart), you can access the Debugger as part of that deployment. To do so, navigate to the deployment in the UI and select **Debugger**. +For applications that are [deployed](/langsmith/deployment-quickstart), you can access Studio as part of that deployment. To do so, navigate to the deployment in the UI and select **Studio**. -This will load the Debugger connected to your live deployment, allowing you to create, read, and update the [threads](/oss/langgraph/persistence#threads), [assistants](/langsmith/assistants), and [memory](/oss/concepts/memory) in that deployment. +This will load Studio connected to your live deployment, allowing you to create, read, and update the [threads](/oss/langgraph/persistence#threads), [assistants](/langsmith/assistants), and [memory](/oss/concepts/memory) in that deployment. ## Local development server ### Prerequisites -To test your application locally using the Debugger: +To test your application locally using Studio: - Follow the [local application quickstart](/langsmith/local-server) first. - If you don't want data [traced](/langsmith/observability-concepts#traces) to LangSmith, set `LANGSMITH_TRACING=false` in your application's `.env` file. With tracing disabled, no data leaves your local server. @@ -47,7 +47,7 @@ To test your application locally using the Debugger: **Browser Compatibility** - Safari blocks `localhost` connections to the Debugger. To work around this, run the command with `--tunnel` to access the Debugger via a secure tunnel. 
+ Safari blocks `localhost` connections to Studio. To work around this, run the command with `--tunnel` to access Studio via a secure tunnel. This will start the LangGraph Server locally, running in-memory. The server will run in watch mode, listening for and automatically restarting on code changes. Read this [reference](/langsmith/cli#dev) to learn about all the options for starting the API server. @@ -61,14 +61,14 @@ To test your application locally using the Debugger: > > - Docs: http://localhost:2024/docs > - > - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 + > - LangSmith Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 ``` - Once running, you will automatically be directed to the Debugger. + Once running, you will automatically be directed to Studio. - 1. For a running server, access the Debugger with one of the following: + 1. For a running server, access Studio with one of the following: 1. Directly navigate to the following URL: `https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024`. - 1. Navigate to **Deployments** in the UI, click the **Debugger** button on a deployment, enter `http://127.0.0.1:2024` and click **Connect**. + 1. Navigate to **Deployments** in the UI, click the **Studio** button on a deployment, enter `http://127.0.0.1:2024` and click **Connect**. If running your server at a different host or port, update the `baseUrl` to match. @@ -120,16 +120,16 @@ Then attach your preferred debugger: -For issues getting started, refer to the [troubleshooting guide](/langsmith/troubleshooting-debugger). +For issues getting started, refer to the [troubleshooting guide](/langsmith/troubleshooting-studio).
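The `baseUrl` convention described above can be sketched as follows. This is an illustrative example, not part of the patch; the port `8123` is hypothetical — use whatever host and port your `langgraph dev` output reports.

```shell
# Build the Studio URL for a local server running on a non-default port.
# Port 8123 is only an example; substitute your own server address.
BASE_URL="http://127.0.0.1:8123"
STUDIO_URL="https://smith.langchain.com/studio/?baseUrl=${BASE_URL}"
echo "$STUDIO_URL"
```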
## Next steps -For more information on how to run the Debugger, refer to the following guides: +For more information on how to run Studio, refer to the following guides: -- [Run application](/langsmith/use-debugger#run-application) -- [Manage assistants](/langsmith/use-debugger#manage-assistants) -- [Manage threads](/langsmith/use-debugger#manage-threads) -- [Iterate on prompts](/langsmith/observability-debugger) -- [Debug LangSmith traces](/langsmith/observability-debugger#debug-langsmith-traces) -- [Add node to dataset](/langsmith/observability-debugger#add-node-to-dataset) +- [Run application](/langsmith/use-studio#run-application) +- [Manage assistants](/langsmith/use-studio#manage-assistants) +- [Manage threads](/langsmith/use-studio#manage-threads) +- [Iterate on prompts](/langsmith/observability-studio) +- [Debug LangSmith traces](/langsmith/observability-studio#debug-langsmith-traces) +- [Add node to dataset](/langsmith/observability-studio#add-node-to-dataset) diff --git a/src/langsmith/self-hosted.mdx b/src/langsmith/self-hosted.mdx index b9611e779..753452e6a 100644 --- a/src/langsmith/self-hosted.mdx +++ b/src/langsmith/self-hosted.mdx @@ -19,7 +19,7 @@ Self-hosting allows you to run all components entirely within your own cloud env Model | Includes | Best for | Methods ------------------|------------------|----------|-------------------- **LangSmith** |
  • LangSmith app (UI + API)
  • Backend services (queue, playground, ACE)
  • Datastores: PostgreSQL, Redis, ClickHouse, optional blob storage
|
  • Teams who need self-hosted observability, tracing, and evaluation
  • Running the LangSmith app without deploying agents/graphs
|
  • Docker Compose (dev/test)
  • Kubernetes + Helm (production)
-**LangSmith with agent deployment** |
  • Everything from LangSmith
  • Control plane (deployments UI, revision management, the Debugger)
  • Data plane (LangGraph Server pods)
  • Kubernetes operator for orchestration
|
  • Enterprise teams needing a private LangChain Cloud
  • Centralized UI/API for managing multiple agents/graphs
  • Integrated observability and orchestration
|
  • Kubernetes with Helm (required)
  • Runs on EKS, GKE, AKS, or self-managed clusters
+**LangSmith with agent deployment** |
  • Everything from LangSmith
  • Control plane (deployments UI, revision management, Studio)
  • Data plane (LangGraph Server pods)
  • Kubernetes operator for orchestration
|
  • Enterprise teams needing a private LangChain Cloud
  • Centralized UI/API for managing multiple agents/graphs
  • Integrated observability and orchestration
|
  • Kubernetes with Helm (required)
  • Runs on EKS, GKE, AKS, or self-managed clusters
**Standalone server** |
  • LangGraph Server container(s)
  • Requires PostgreSQL + Redis (shared or dedicated)
  • Optional LangSmith integration for tracing
|
  • Lightweight deployments of one or a few agents
  • Integrating LangGraph Servers as microservices
  • Teams preferring to manage scaling & CI/CD themselves
|
  • Docker / Docker Compose (dev/test)
  • Kubernetes + Helm (production)
  • Any container runtime or VM (ECS, EC2, ACI, etc.)
@@ -101,7 +101,7 @@ You run both the control plane and the data plane entirely within your own infra ### Requirements -1. Use the `langgraph-cli` or the [Debugger](/langsmith/debugger) to test your graph locally. +1. Use the `langgraph-cli` or [Studio](/langsmith/studio) to test your graph locally. 2. Build a Docker image with `langgraph build`. 3. Deploy your LangGraph Server via the LangSmith control plane UI or through your container tooling of choice. 4. All agents are deployed as Kubernetes services behind the ingress configured for your LangSmith instance. @@ -135,7 +135,7 @@ Do not run standalone servers in serverless environments. Scale-to-zero may caus ### Workflow -1. Define and test your graph locally using the `langgraph-cli` or the [Debugger](/langsmith/debugger). +1. Define and test your graph locally using the `langgraph-cli` or [Studio](/langsmith/studio). 2. Package your agent as a Docker image. 3. Deploy the LangGraph Server to your compute platform of choice (Kubernetes, Docker, VM). 4. Optionally, configure LangSmith API keys and endpoints so the server reports traces and evaluations back to LangSmith (self-hosted or SaaS). diff --git a/src/langsmith/set-up-custom-auth.mdx b/src/langsmith/set-up-custom-auth.mdx index 9cf96cf1c..39c502a4b 100644 --- a/src/langsmith/set-up-custom-auth.mdx +++ b/src/langsmith/set-up-custom-auth.mdx @@ -55,7 +55,7 @@ npx @langchain/langgraph-cli dev ``` -The server will start and open the [Debugger](/langsmith/debugger) in your browser: +The server will start and open [Studio](/langsmith/studio) in your browser: ``` > - 🚀 API: http://127.0.0.1:2024 @@ -142,7 +142,7 @@ Start the server again to test everything out: langgraph dev --no-browser ``` -If you didn't add the `--no-browser`, the Debugger UI will open in the browser. By default, we also permit access from the Debugger, even when using custom auth. This makes it easier to develop and test your bot in the Debugger. 
You can remove this alternative authentication option by setting `disable_studio_auth: "true"` in your auth configuration: +If you didn't add the `--no-browser`, the Studio UI will open in the browser. By default, we also permit access from Studio, even when using custom auth. This makes it easier to develop and test your bot in Studio. You can remove this alternative authentication option by setting `disable_studio_auth: "true"` in your auth configuration: ```json { diff --git a/src/langsmith/setup-javascript.mdx b/src/langsmith/setup-javascript.mdx index bbf6eb560..b5ae1aa37 100644 --- a/src/langsmith/setup-javascript.mdx +++ b/src/langsmith/setup-javascript.mdx @@ -153,7 +153,7 @@ const workflow = new StateGraph(MessagesAnnotation) // will be called after the source node is called. routeModelOutput, // List of the possible destinations the conditional edge can route to. - // Required for conditional edges to properly render the graph in the Debugger + // Required for conditional edges to properly render the graph in Studio ["tools", "__end__"] ) // This means that after `tools` is called, `callModel` node is called next. diff --git a/src/langsmith/studio.mdx b/src/langsmith/studio.mdx new file mode 100644 index 000000000..159a310fe --- /dev/null +++ b/src/langsmith/studio.mdx @@ -0,0 +1,62 @@ +--- +title: LangSmith Studio +sidebarTitle: Overview +--- + + +**Prerequisites** +* [LangSmith](/langsmith/home) +* [LangGraph Server](/langsmith/langgraph-server) +* [LangGraph CLI](/langsmith/cli) + + +Studio is a specialized agent IDE that enables visualization, interaction, and debugging of agentic systems that implement the LangGraph Server API protocol. Studio also integrates with [tracing](/langsmith/observability-concepts), [evaluation](/langsmith/evaluation), and [prompt engineering](/langsmith/prompt-engineering). 
+ +## Features + +Key features of Studio: + +* Visualize your graph architecture +* [Run and interact with your agent](/langsmith/use-studio#run-application) +* [Manage assistants](/langsmith/use-studio#manage-assistants) +* [Manage threads](/langsmith/use-studio#manage-threads) +* [Iterate on prompts](/langsmith/observability-studio) +* [Run experiments over a dataset](/langsmith/observability-studio#run-experiments-over-a-dataset) +* Manage [long term memory](/oss/concepts/memory) +* Debug agent state via [time travel](/oss/langgraph/use-time-travel) + +```mermaid +flowchart + subgraph LangSmith Deployment + A[LangGraph CLI] -->|creates| B(LangGraph Server deployment) + B <--> D[Studio] + B <--> E[SDKs] + B <--> F[RemoteGraph] + end +``` + +Studio works for graphs that are deployed on [LangSmith](/langsmith/deployment-quickstart) or for graphs that are running locally via the [LangGraph Server](/langsmith/local-server). + +Studio supports two modes: + +### Graph mode + +Graph mode exposes the full feature set and is useful when you want as much detail as possible about the execution of your agent, including the nodes traversed, intermediate states, and LangSmith integrations (such as adding to datasets and the playground). + +### Chat mode + +Chat mode is a simpler UI for iterating on and testing chat-specific agents. It is useful for business users and those who want to test overall agent behavior. Chat mode is only supported for graphs whose state includes or extends [`MessagesState`](/oss/langgraph/use-graph-api#messagesstate). + +## Learn more + +* See this guide on how to [get started](/langsmith/quick-start-studio) with Studio.
+ +## Video guide + diff --git a/src/langsmith/troubleshooting-debugger.mdx b/src/langsmith/troubleshooting-studio.mdx similarity index 79% rename from src/langsmith/troubleshooting-debugger.mdx rename to src/langsmith/troubleshooting-studio.mdx index bf83c1998..521d44eb0 100644 --- a/src/langsmith/troubleshooting-debugger.mdx +++ b/src/langsmith/troubleshooting-studio.mdx @@ -1,11 +1,11 @@ --- -title: Debugger troubleshooting +title: Studio troubleshooting sidebarTitle: Troubleshooting --- ## Safari Connection Issues -Safari blocks plain-HTTP traffic on localhost. When running the Debugger with `langgraph dev`, you may see "Failed to load assistants" errors. +Safari blocks plain-HTTP traffic on localhost. When running Studio with `langgraph dev`, you may see "Failed to load assistants" errors. ### Solution 1: Use Cloudflare Tunnel @@ -30,7 +30,7 @@ The command outputs a URL in this format: https://smith.langchain.com/studio/?baseUrl=https://hamilton-praise-heart-costumes.trycloudflare.com ``` -Use this URL in Safari to load the Debugger. Here, the `baseUrl` parameter specifies your agent server endpoint. +Use this URL in Safari to load Studio. Here, the `baseUrl` parameter specifies your agent server endpoint. ### Solution 2: Use Chromium Browser @@ -38,7 +38,7 @@ Chrome and other Chromium browsers allow HTTP on localhost. Use `langgraph dev` ## Brave Connection Issues -Brave blocks plain-HTTP traffic on localhost when Brave Shields are enabled. When running the Debugger with `langgraph dev`, you may see "Failed to load assistants" errors. +Brave blocks plain-HTTP traffic on localhost when Brave Shields are enabled. When running Studio with `langgraph dev`, you may see "Failed to load assistants" errors. ### Solution 1: Disable Brave Shields @@ -67,12 +67,12 @@ The command outputs a URL in this format: https://smith.langchain.com/studio/?baseUrl=https://hamilton-praise-heart-costumes.trycloudflare.com ``` -Use this URL in Brave to load the Debugger. 
Here, the `baseUrl` parameter specifies your agent server endpoint. +Use this URL in Brave to load Studio. Here, the `baseUrl` parameter specifies your agent server endpoint. ## Graph Edge Issues Undefined conditional edges may show unexpected connections in your graph. This is -because without proper definition, the Debugger assumes the conditional edge could access all other nodes. To address this, explicitly define the routing paths using one of these methods: +because without proper definition, Studio assumes the conditional edge could access all other nodes. To address this, explicitly define the routing paths using one of these methods: ### Solution 1: Path Map @@ -104,7 +104,7 @@ def routing_function(state: GraphState) -> Literal["node_b","node_c"]: return "node_c" ``` -## Experiment troubleshooting in the Debugger +## Experiment troubleshooting in Studio ### **Run experiment** button is disabled diff --git a/src/langsmith/use-debugger.mdx b/src/langsmith/use-studio.mdx similarity index 81% rename from src/langsmith/use-debugger.mdx rename to src/langsmith/use-studio.mdx index be281cbbf..c3da483e1 100644 --- a/src/langsmith/use-debugger.mdx +++ b/src/langsmith/use-studio.mdx @@ -1,9 +1,9 @@ --- -title: How to use the Debugger +title: How to use Studio sidebarTitle: Runs, assistants, threads --- -This page describes the core workflows you’ll use in the Debugger. It explains how to run your application, manage assistant configurations, and work with conversation threads. Each section includes steps in both graph mode (full-featured view of your graph’s execution) and chat mode (lightweight conversational interface): +This page describes the core workflows you’ll use in Studio. It explains how to run your application, manage assistant configurations, and work with conversation threads. 
Each section includes steps in both graph mode (full-featured view of your graph’s execution) and chat mode (lightweight conversational interface): - [Run application](#run-application): Execute your application or agent and observe its behavior. - [Manage assistants](#manage-assistants): Create, edit, and select the assistant configuration used by your application. @@ -16,7 +16,7 @@ This page describes the core workflows you’ll use in the Debugger. It explains ### Specify input -1. Define the input to your graph in the **Input** section on the left side of the page, below the graph interface. The Debugger will attempt to render a form for your input based on the graph's defined [state schema](/oss/langgraph/graph-api/#schema). To disable this, click the **View Raw** button, which will present you with a JSON editor. +1. Define the input to your graph in the **Input** section on the left side of the page, below the graph interface. Studio will attempt to render a form for your input based on the graph's defined [state schema](/oss/langgraph/graph-api/#schema). To disable this, click the **View Raw** button, which will present you with a JSON editor. 1. Click the up or down arrows at the top of the **Input** section to toggle through and use previously submitted inputs. ### Run settings @@ -69,7 +69,7 @@ To cancel the ongoing run: ## Manage assistants -The Debugger lets you view, edit, and update your assistants, and allows you to run your graph using these assistant configurations. +Studio lets you view, edit, and update your assistants, and allows you to run your graph using these assistant configurations. For more conceptual details, refer to the [Assistants overview](/langsmith/assistants/). @@ -93,7 +93,7 @@ Chat mode enables you to switch through the different assistants in your graph v ## Manage threads -The Debugger provides tools to view all [threads](/oss/langgraph/persistence#threads) saved on the server and edit their state. 
You can create new threads, switch between threads, and modify past states both in graph mode and chat mode. +Studio provides tools to view all [threads](/oss/langgraph/persistence#threads) saved on the server and edit their state. You can create new threads, switch between threads, and modify past states both in graph mode and chat mode. @@ -133,7 +133,7 @@ To edit a human message in the thread: ## Next steps -Refer to the following guides for more detail on tasks you can complete in the Debugger: +Refer to the following guides for more detail on tasks you can complete in Studio: -- [Iterate on prompts](/langsmith/observability-debugger) -- [Run experiments over datasets](/langsmith/observability-debugger#run-experiments-over-a-dataset) +- [Iterate on prompts](/langsmith/observability-studio) +- [Run experiments over datasets](/langsmith/observability-studio#run-experiments-over-a-dataset) diff --git a/src/langsmith/use-threads.mdx b/src/langsmith/use-threads.mdx index 437ae202d..b10dac82a 100644 --- a/src/langsmith/use-threads.mdx +++ b/src/langsmith/use-threads.mdx @@ -498,4 +498,4 @@ You can also view threads in a deployment via the LangSmith UI. Inside your deployment, select the "Threads" tab. This will load a table of all of the threads in your deployment. -Select a thread to inspect its current state. To view its full history and for further debugging, open the thread in the [Debugger](/langsmith/debugger). +Select a thread to inspect its current state. To view its full history and for further debugging, open the thread in [Studio](/langsmith/studio). 
diff --git a/src/oss/langchain/debugger.mdx b/src/oss/langchain/debugger.mdx deleted file mode 100644 index 288b1bea0..000000000 --- a/src/oss/langchain/debugger.mdx +++ /dev/null @@ -1,14 +0,0 @@ ---- -title: Debugger ---- - -import AlphaCallout from '/snippets/alpha-lc-callout.mdx'; -import Debugger from '/snippets/oss/debugger.mdx'; - - - - - - -For more information about local and deployed agents, see [Set up local LangGraph Server](/oss/langchain/debugger#setup-local-langgraph-server) and [Deploy](/oss/langchain/deploy). - diff --git a/src/oss/langchain/deploy.mdx b/src/oss/langchain/deploy.mdx index 8e49b2898..68717e56c 100644 --- a/src/oss/langchain/deploy.mdx +++ b/src/oss/langchain/deploy.mdx @@ -20,6 +20,6 @@ Before you begin, ensure you have the following: ### 1. Create a repository on GitHub -Your application's code must reside in a GitHub repository to be deployed on LangSmith. Both public and private repositories are supported. For this quickstart, first make sure your app is LangGraph-compatible by following the [local server setup guide](/oss/langchain/debugger#setup-local-langgraph-server). Then, push your code to the repository. +Your application's code must reside in a GitHub repository to be deployed on LangSmith. Both public and private repositories are supported. For this quickstart, first make sure your app is LangGraph-compatible by following the [local server setup guide](/oss/langchain/studio#setup-local-langgraph-server). Then, push your code to the repository. 
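The "push your code to the repository" step above can be sketched as a short git workflow. This is an assumption-laden example, not from the docs: the repository name `my-langgraph-app`, the remote URL, and the commit message are all placeholders.

```shell
# Sketch: put a LangGraph app under version control before pushing to GitHub.
# Repository name and remote URL below are placeholders.
mkdir -p my-langgraph-app && cd my-langgraph-app
git init -q
echo "# my-langgraph-app" > README.md
git add README.md
git -c user.name="You" -c user.email="you@example.com" commit -qm "Initial LangGraph app"
# Then connect and push to a GitHub repository you own:
#   git remote add origin git@github.com:<your-org>/my-langgraph-app.git
#   git push -u origin main
```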
diff --git a/src/oss/langchain/sql-agent.mdx b/src/oss/langchain/sql-agent.mdx index acf685a32..c8ce70b33 100644 --- a/src/oss/langchain/sql-agent.mdx +++ b/src/oss/langchain/sql-agent.mdx @@ -675,10 +675,10 @@ The agent correctly wrote a query, checked the query, and ran it to inform its f You can inspect all aspects of the above run, including steps taken, tools invoked, what prompts were seen by the LLM, and more in the [LangSmith trace](https://smith.langchain.com/public/653d218b-af67-4854-95ca-6abecb9b2520/r). -#### (Optional) Use the Debugger +#### (Optional) Use Studio -The [Debugger](/langsmith/debugger) provides a "client side" loop as well as memory so you can run this as a chat interface and query the database. You can ask questions like "Tell me the scheme of the database" or "Show me the invoices for the 5 top customers". You will see the SQL command that is generated and the resulting output. The details of how to get that started are below. - +[Studio](/langsmith/studio) provides a "client side" loop as well as memory so you can run this as a chat interface and query the database. You can ask questions like "Tell me the schema of the database" or "Show me the invoices for the 5 top customers". You will see the SQL command that is generated and the resulting output. The details of how to get that started are below. + In addition to the previously mentioned packages, you will need to: diff --git a/src/oss/langchain/studio.mdx b/src/oss/langchain/studio.mdx new file mode 100644 index 000000000..7d7078b5f --- /dev/null +++ b/src/oss/langchain/studio.mdx @@ -0,0 +1,14 @@ +--- +title: Studio +--- + +import AlphaCallout from '/snippets/alpha-lc-callout.mdx'; +import Studio from '/snippets/oss/studio.mdx'; + + + + + + +For more information about local and deployed agents, see [Set up local LangGraph Server](/oss/langchain/studio#setup-local-langgraph-server) and [Deploy](/oss/langchain/deploy).
+ diff --git a/src/oss/langchain/ui.mdx b/src/oss/langchain/ui.mdx index 389805f2b..cbd4dbd4a 100644 --- a/src/oss/langchain/ui.mdx +++ b/src/oss/langchain/ui.mdx @@ -11,7 +11,7 @@ import chat_ui from '/snippets/oss/ui.mdx'; ### Connect to your agent -Agent Chat UI can connect to both [local](/oss/langchain/debugger#setup-local-langgraph-server) and [deployed agents](/oss/langchain/deploy). +Agent Chat UI can connect to both [local](/oss/langchain/studio#setup-local-langgraph-server) and [deployed agents](/oss/langchain/deploy). After starting Agent Chat UI, you'll need to configure it to connect to your agent: diff --git a/src/oss/langgraph/add-human-in-the-loop.mdx b/src/oss/langgraph/add-human-in-the-loop.mdx index ec512c1eb..a823082e4 100644 --- a/src/oss/langgraph/add-human-in-the-loop.mdx +++ b/src/oss/langgraph/add-human-in-the-loop.mdx @@ -1614,13 +1614,13 @@ Static interrupts are **not** recommended for human-in-the-loop workflows. Use [ ``` -### Use static interrupts in the Debugger +### Use static interrupts in Studio -You can use [the Debugger](/langsmith/debugger) to debug your graph. You can set static breakpoints in the UI and then run the graph. You can also use the UI to inspect the graph state at any point in the execution. +You can use [Studio](/langsmith/studio) to debug your graph. You can set static breakpoints in the UI and then run the graph. You can also use the UI to inspect the graph state at any point in the execution. ![image](/oss/images/static-interrupt.png) -The Debugger is free with [locally deployed applications](/oss/langgraph/local-server) using `langgraph dev`. +Studio is free with [locally deployed applications](/oss/langgraph/local-server) using `langgraph dev`. 
::: ## Considerations diff --git a/src/oss/langgraph/deploy.mdx b/src/oss/langgraph/deploy.mdx index 98b95e4dd..071d9265b 100644 --- a/src/oss/langgraph/deploy.mdx +++ b/src/oss/langgraph/deploy.mdx @@ -20,6 +20,6 @@ Before you begin, ensure you have the following: ### 1. Create a repository on GitHub -Your application's code must reside in a GitHub repository to be deployed on LangSmith. Both public and private repositories are supported. For this quickstart, first make sure your app is LangGraph-compatible by following the [local server setup guide](/oss/langgraph/debugger#setup-local-langgraph-server). Then, push your code to the repository. +Your application's code must reside in a GitHub repository to be deployed on LangSmith. Both public and private repositories are supported. For this quickstart, first make sure your app is LangGraph-compatible by following the [local server setup guide](/oss/langgraph/studio#setup-local-langgraph-server). Then, push your code to the repository. diff --git a/src/oss/langgraph/local-server.mdx b/src/oss/langgraph/local-server.mdx index e88af1368..e57a15814 100644 --- a/src/oss/langgraph/local-server.mdx +++ b/src/oss/langgraph/local-server.mdx @@ -123,9 +123,9 @@ Sample output: The `langgraph dev` command starts LangGraph Server in an in-memory mode. This mode is suitable for development and testing purposes. For production use, deploy LangGraph Server with access to a persistent storage backend. For more information, see the [Hosting overview](/langsmith/hosting). -## 6. Test your application in the Debugger +## 6. Test your application in Studio -The [Debugger](/langsmith/debugger) is a specialized UI that you can connect to LangGraph API server to visualize, interact with, and debug your application locally. 
Test your graph in the Debugger by visiting the URL provided in the output of the `langgraph dev` command: +[Studio](/langsmith/studio) is a specialized UI that you can connect to the LangGraph API server to visualize, interact with, and debug your application locally. Test your graph in Studio by visiting the URL provided in the output of the `langgraph dev` command: ``` > - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 diff --git a/src/oss/langgraph/observability.mdx b/src/oss/langgraph/observability.mdx index 5271c032c..395da9302 100644 --- a/src/oss/langgraph/observability.mdx +++ b/src/oss/langgraph/observability.mdx @@ -9,7 +9,7 @@ import observability from '/snippets/oss/observability.mdx'; Traces are a series of steps that your application takes to go from input to output. Each of these individual steps is represented by a run. You can use [LangSmith](https://smith.langchain.com/) to visualize these execution steps. To use it, [enable tracing for your application](/langsmith/trace-with-langgraph). This enables you to do the following: -* [Debug a locally running application](/langsmith/observability-debugger#debug-langsmith-traces). +* [Debug a locally running application](/langsmith/observability-studio#debug-langsmith-traces). * [Evaluate the application performance](/oss/langchain/evals). * [Monitor the application](/langsmith/dashboards). diff --git a/src/oss/langgraph/overview.mdx b/src/oss/langgraph/overview.mdx index 6fe3bbcbb..b23e1fb63 100644 --- a/src/oss/langgraph/overview.mdx +++ b/src/oss/langgraph/overview.mdx @@ -101,7 +101,7 @@ LangGraph provides low-level supporting infrastructure for *any* long-running, s While LangGraph can be used standalone, it also integrates seamlessly with any LangChain product, giving developers a full suite of tools for building agents.
To improve your LLM application development, pair LangGraph with: * [LangSmith](http://www.langchain.com/langsmith) — Helpful for agent evals and observability. Debug poor-performing LLM app runs, evaluate agent trajectories, gain visibility in production, and improve performance over time. -* [LangSmith](/langsmith/home) — Deploy and scale agents effortlessly with a purpose-built deployment platform for long running, stateful workflows. Discover, reuse, configure, and share agents across teams — and iterate quickly with visual prototyping in [the Debugger](/langsmith/debugger). +* [LangSmith](/langsmith/home) — Deploy and scale agents effortlessly with a purpose-built deployment platform for long-running, stateful workflows. Discover, reuse, configure, and share agents across teams — and iterate quickly with visual prototyping in [Studio](/langsmith/studio). * [LangChain](/oss/langchain/overview) - Provides integrations and composable components to streamline LLM application development. Contains agent abstractions built on top of LangGraph. ## Acknowledgements diff --git a/src/oss/langgraph/persistence.mdx b/src/oss/langgraph/persistence.mdx index 9ac1133fb..875a54c45 100644 --- a/src/oss/langgraph/persistence.mdx +++ b/src/oss/langgraph/persistence.mdx @@ -1046,7 +1046,7 @@ for await (const update of await graph.stream( ``` ::: -When we use the LangSmith, either locally (e.g., in the Debugger) or with LangSmith, the base store is available to use by default and does not need to be specified during graph compilation. To enable semantic search, however, you **do** need to configure the indexing settings in your `langgraph.json` file. For example: +When you use LangSmith, either locally (e.g., in [Studio](/langsmith/studio)) or [hosted with LangSmith](/langsmith/hosting), the base store is available to use by default and does not need to be specified during graph compilation.
To enable semantic search, however, you **do** need to configure the indexing settings in your `langgraph.json` file. For example: ```json { diff --git a/src/oss/langgraph/debugger.mdx b/src/oss/langgraph/studio.mdx similarity index 52% rename from src/oss/langgraph/debugger.mdx rename to src/oss/langgraph/studio.mdx index c8d4946af..5dae28fc7 100644 --- a/src/oss/langgraph/debugger.mdx +++ b/src/oss/langgraph/studio.mdx @@ -1,10 +1,10 @@ --- -title: Debugger +title: Studio --- import AlphaCallout from '/snippets/alpha-lg-callout.mdx'; -import Debugger from '/snippets/oss/debugger.mdx'; +import Studio from '/snippets/oss/studio.mdx'; - + diff --git a/src/oss/langgraph/ui.mdx b/src/oss/langgraph/ui.mdx index 5a7aea1dc..2ae1f9286 100644 --- a/src/oss/langgraph/ui.mdx +++ b/src/oss/langgraph/ui.mdx @@ -11,7 +11,7 @@ import chat_ui from '/snippets/oss/ui.mdx'; ### Connect to your agent -Agent Chat UI can connect to both [local](/oss/langgraph/debugger#setup-local-langgraph-server) and [deployed agents](/oss/langgraph/deploy). +Agent Chat UI can connect to both [local](/oss/langgraph/studio#setup-local-langgraph-server) and [deployed agents](/oss/langgraph/deploy). After starting Agent Chat UI, you'll need to configure it to connect to your agent: diff --git a/src/oss/langgraph/use-graph-api.mdx b/src/oss/langgraph/use-graph-api.mdx index 67f646c63..eb14a3a9a 100644 --- a/src/oss/langgraph/use-graph-api.mdx +++ b/src/oss/langgraph/use-graph-api.mdx @@ -1464,8 +1464,7 @@ This allows state to be checkpointed in between the execution of nodes, so your * How interruptions are resumed in [human-in-the-loop](/oss/langgraph/add-human-in-the-loop) workflows * How we can "rewind" and branch-off executions using LangGraph's [time travel](/oss/langgraph/use-time-travel) features -They also determine how execution steps are [streamed](/oss/langgraph/streaming), and how your application is visualized -and debugged using [the Debugger](/langsmith/debugger). 
+They also determine how execution steps are [streamed](/oss/langgraph/streaming), and how your application is visualized and debugged using [Studio](/langsmith/studio). Let's demonstrate an end-to-end example. We will create a sequence of three steps: diff --git a/src/snippets/oss/deploy.mdx b/src/snippets/oss/deploy.mdx index 767d92176..2b498350f 100644 --- a/src/snippets/oss/deploy.mdx +++ b/src/snippets/oss/deploy.mdx @@ -16,12 +16,12 @@ -### 3. Test your application in the Debugger +### 3. Test your application in Studio Once your application is deployed: 1. Select the deployment you just created to view more details. -2. Click the **Debugger** button in the top right corner. Debugger will open to display your graph. +2. Click the **Studio** button in the top right corner. Studio will open to display your graph. ### 4. Get the API URL for your deployment diff --git a/src/snippets/oss/debugger.mdx b/src/snippets/oss/studio.mdx similarity index 69% rename from src/snippets/oss/debugger.mdx rename to src/snippets/oss/studio.mdx index 064383fa1..8af5f0b8f 100644 --- a/src/snippets/oss/debugger.mdx +++ b/src/snippets/oss/studio.mdx @@ -1,12 +1,12 @@ -This guide will walk you through how to use the **Debugger** to visualize, interact, and debug your agent locally. +This guide will walk you through how to use **Studio** to visualize, interact with, and debug your agent locally. -The Debugger is our free-to-use, powerful agent IDE that integrates with [LangSmith](/langsmith/home) to enable tracing, evaluation, and prompt engineering. See exactly how your agent thinks, trace every decision, and ship smarter, more reliable agents. +Studio is our free-to-use, powerful agent IDE that integrates with [LangSmith](/langsmith/home) to enable tracing, evaluation, and prompt engineering. See exactly how your agent thinks, trace every decision, and ship smarter, more reliable agents.