Commit cda3371

Add documentation
Signed-off-by: Christian Tzolov <[email protected]>
1 parent bb45742 commit cda3371

2 files changed: +62 -16 lines changed

spring-ai-docs/src/main/antora/modules/ROOT/pages/api/chatclient.adoc

Lines changed: 17 additions & 0 deletions
@@ -283,6 +283,23 @@ List<ActorFilms> actorFilms = chatClient.prompt()
   .entity(new ParameterizedTypeReference<List<ActorFilms>>() {});
 ----
 
+==== Native Structured Output
+
+As more AI models support structured output natively, you can take advantage of this feature by using the `AdvisorParams.WITH_NATIVE_STRUCTURED_OUTPUT` advisor parameter when calling the `ChatClient`.
+You can use the `defaultAdvisors()` method on the `ChatClient.Builder` to set this parameter globally for all calls, or set it per call as shown below:
+
+[source,java]
+----
+ActorFilms actorFilms = chatClient.prompt()
+    .advisors(AdvisorParams.WITH_NATIVE_STRUCTURED_OUTPUT)
+    .user("Generate the filmography for a random actor.")
+    .call()
+    .entity(ActorFilms.class);
+----
+
+NOTE: Some AI models, such as OpenAI, don't support arrays of objects natively.
+In such cases, you can use the Spring AI default structured output conversion.
+
 === Streaming Responses
 
 The `stream()` method lets you get an asynchronous response as shown below:
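The NOTE in the hunk above points to the default (prompt-based) conversion as the fallback for top-level arrays. Below is a minimal sketch of that fallback, reusing the `ParameterizedTypeReference` call from the context line at the top of the hunk; the `ActorFilms` record and the `chatClient` instance are assumed for illustration and are not part of this commit:

[source,java]
----
// Assumed shape of the target type used throughout these examples.
record ActorFilms(String actor, List<String> movies) {}

// Without the native structured output advisor, Spring AI falls back to its
// default prompt-based conversion, which also handles top-level collections.
List<ActorFilms> actorFilms = chatClient.prompt()
    .user("Generate the filmography for two random actors.")
    .call()
    .entity(new ParameterizedTypeReference<List<ActorFilms>>() {});
----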

spring-ai-docs/src/main/antora/modules/ROOT/pages/api/structured-output-converter.adoc

Lines changed: 45 additions & 16 deletions
@@ -2,9 +2,6 @@
 
 = Structured Output Converter
 
-NOTE: As of 02.05.2024 the old `OutputParser`, `BeanOutputParser`, `ListOutputParser` and `MapOutputParser` classes are deprecated in favor of the new `StructuredOutputConverter`, `BeanOutputConverter`, `ListOutputConverter` and `MapOutputConverter` implementations.
-The latter are drop-in replacements for the former and provide the same functionality. The change was primarily about naming, since no parsing is actually done, and it also aligns with the Spring `org.springframework.core.convert.converter` package, bringing in some improved functionality.
-
 The ability of LLMs to produce structured outputs is important for downstream applications that rely on reliably parsing output values.
 Developers want to quickly turn results from an AI model into data types, such as JSON, XML or Java classes, that can be passed to other application functions and methods.
 
@@ -17,6 +14,8 @@ Generating structured outputs from Large Language Models (LLMs) using generic co
 
 Before the LLM call, the converter appends format instructions to the prompt, providing explicit guidance to the models on generating the desired output structure. These instructions act as a blueprint, shaping the model's response to conform to the specified format.
 
+NOTE: As more AI models natively support structured outputs, you can leverage this capability using the xref:api/chatclient.adoc#_native_structured_output[Native Structured Output] feature with `AdvisorParams.WITH_NATIVE_STRUCTURED_OUTPUT`. This approach uses the generated JSON schema directly with the model's native structured output API, eliminating the need for pre-prompt formatting instructions and providing more reliable results.
+
 After the LLM call, the converter takes the model's output text and transforms it into instances of the structured type. This conversion process involves parsing the raw text output and mapping it to the corresponding structured data representation, such as JSON, XML, or domain-specific data structures.
 
 TIP: The `StructuredOutputConverter` is a best effort to convert the model output into a structured output.
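To make the before/after phases above concrete, here is a minimal sketch of the same round trip driven directly through a `ChatModel`, in the style of the converter examples later in this file; the `ActorsFilms` record and the `chatModel` instance are assumed:

[source,java]
----
// Target type for the conversion (assumed for this sketch).
record ActorsFilms(String actor, List<String> movies) {}

BeanOutputConverter<ActorsFilms> outputConverter = new BeanOutputConverter<>(ActorsFilms.class);

// Before the call: the converter supplies the format instructions appended to the prompt.
String format = outputConverter.getFormat();
Prompt prompt = new PromptTemplate("""
        Generate the filmography of 5 movies for {actor}.
        {format}
        """).create(Map.of("actor", "Tom Hanks", "format", format));

// After the call: the converter parses the raw text and maps it onto the target type.
Generation generation = chatModel.call(prompt).getResult();
ActorsFilms actorsFilms = outputConverter.convert(generation.getOutput().getText());
----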
@@ -253,22 +252,52 @@ Generation generation = this.chatModel.call(this.prompt).getResult();
 List<String> list = this.listOutputConverter.convert(this.generation.getOutput().getText());
 ----
 
-== Supported AI Models
+== Native Structured Output
+
+Many modern AI models now provide native support for structured output, which offers more reliable results compared to prompt-based formatting. Spring AI supports this through the xref:api/chatclient.adoc#_native_structured_output[Native Structured Output] feature.
+
+When using native structured output, the JSON schema generated by `BeanOutputConverter` is sent directly to the model's structured output API, eliminating the need for format instructions in the prompt. This approach provides:
+
+* **Higher reliability**: The model guarantees output conforming to the schema
+* **Cleaner prompts**: No need to append format instructions
+* **Better performance**: Models can optimize for structured output internally
+
+=== Using Native Structured Output
+
+To enable native structured output, use the `AdvisorParams.WITH_NATIVE_STRUCTURED_OUTPUT` parameter:
+
+[source,java]
+----
+ActorsFilms actorsFilms = ChatClient.create(chatModel).prompt()
+    .advisors(AdvisorParams.WITH_NATIVE_STRUCTURED_OUTPUT)
+    .user("Generate the filmography for a random actor.")
+    .call()
+    .entity(ActorsFilms.class);
+----
+
+You can also set this globally using `defaultAdvisors()` on the `ChatClient.Builder`:
+
+[source,java]
+----
+@Bean
+ChatClient chatClient(ChatClient.Builder builder) {
+    return builder
+        .defaultAdvisors(AdvisorParams.WITH_NATIVE_STRUCTURED_OUTPUT)
+        .build();
+}
+----
+
+=== Supported Models for Native Structured Output
+
+The following models currently support native structured output:
 
-The following AI Models have been tested to support List, Map and Bean structured outputs.
+* **OpenAI**: GPT-4o and later models with JSON Schema support
+* **Anthropic**: Claude 3.5 Sonnet and later models
+* **Vertex AI Gemini**: Gemini 1.5 Pro and later models
 
-[cols="2,5"]
-|====
-| Model | Integration Tests / Samples
-| xref:api/chat/openai-chat.adoc[OpenAI] | link:https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/test/java/org/springframework/ai/openai/chat/OpenAiChatModelIT.java[OpenAiChatModelIT]
-| xref:api/chat/anthropic-chat.adoc[Anthropic Claude 3] | link:https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-anthropic/src/test/java/org/springframework/ai/anthropic/AnthropicChatModelIT.java[AnthropicChatModelIT.java]
-| xref:api/chat/azure-openai-chat.adoc[Azure OpenAI] | link:https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-azure-openai/src/test/java/org/springframework/ai/azure/openai/AzureOpenAiChatModelIT.java[AzureOpenAiChatModelIT.java]
-| xref:api/chat/mistralai-chat.adoc[Mistral AI] | link:https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-mistral-ai/src/test/java/org/springframework/ai/mistralai/MistralAiChatModelIT.java[MistralAiChatModelIT.java]
-| xref:api/chat/ollama-chat.adoc[Ollama] | link:https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-ollama/src/test/java/org/springframework/ai/ollama/OllamaChatModelIT.java[OllamaChatModelIT.java]
-| xref:api/chat/vertexai-gemini-chat.adoc[Vertex AI Gemini] | link:https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-vertex-ai-gemini/src/test/java/org/springframework/ai/vertexai/gemini/VertexAiGeminiChatModelIT.java[VertexAiGeminiChatModelIT.java]
-|====
+NOTE: Some AI models, such as OpenAI, don't support arrays of objects natively at the top level. In such cases, you can use the Spring AI default structured output conversion (without the native structured output advisor).
 
-== Built-in JSON mode
+=== Built-in JSON mode
 
 Some AI Models provide dedicated configuration options to generate structured (usually JSON) output.
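As one example of such a built-in option, OpenAI exposes a JSON mode through its chat options. The sketch below assumes the `OpenAiChatOptions`/`ResponseFormat` API of recent Spring AI releases and an existing `chatModel`; check the OpenAI chat documentation for the exact signatures in your version:

[source,java]
----
// Built-in JSON mode sketch (assumed API names; verify against your Spring AI version).
ChatResponse response = chatModel.call(
    new Prompt(
        "List three Tom Hanks movies as JSON.",   // JSON mode expects the prompt to mention JSON
        OpenAiChatOptions.builder()
            .responseFormat(new ResponseFormat(ResponseFormat.Type.JSON_OBJECT, null))
            .build()));

String json = response.getResult().getOutput().getText();
----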
