@@ -16,6 +16,8 @@ placeholders filled in based on the provided arguments.

 ## Defining Prompts

+Prompts can be defined in your configuration files using the `prompts` key:
+
 ```yaml
 prompts:
   - id: generate-controller
@@ -40,31 +42,43 @@ Each prompt contains:
 - **id**: Unique identifier for the prompt
 - **description**: Human-readable description
 - **schema** (optional): Defines input parameters with descriptions and required fields
-- **messages**: The sequence of conversation messages that make up the prompt
+- **messages**: The sequence of conversation messages that make up the prompt template

 ## Variable Substitution

 Prompts support variable substitution in message content using the format `{{variableName}}`. When the LLM requests a
 prompt with arguments, the MCP server replaces these placeholders with the provided values.

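For illustration, a minimal sketch of the substitution (the `className` parameter and its value are hypothetical, not taken from the configuration above):

```yaml
# As written in the prompt configuration (parameter name is illustrative):
content: "Generate a controller class named {{className}}."
---
# As returned when the prompt is requested with className: UserController
content: "Generate a controller class named UserController."
```
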
+## Prompt Message Structure
+
+Each message in the `messages` array must include:
+
+- **role**: The role of the message sender (`user` or `assistant`)
+- **content**: The content of the message (can include variable placeholders)
+
+Valid role values are defined in the `Mcp\Types\Role` enum and include:
+
+- `user`
+- `assistant`
+
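A minimal sketch of such a `messages` block, assuming a hypothetical controller-generation prompt (the wording and the `{{className}}` placeholder are illustrative):

```yaml
messages:
  - role: user
    content: "Generate a REST controller named {{className}}."
  - role: assistant
    content: "I will scaffold {{className}} with index, show, store, update, and delete actions."
```
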
 ## Available Prompt Tools

 When connected via MCP, the LLM has access to the following prompt-related tools:

 ### Prompts Tools

-- `prompts.list`: List all available prompts defined in the configuration
-- `prompts.get`: Get a specific prompt by ID
+- `prompts-list`: List all available prompts defined in the configuration
+- `prompt-get`: Get a specific prompt by ID

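As a rough sketch of how `prompt-get` might be invoked (the argument shape and the `className` parameter are assumptions for illustration, not defined by this document):

```yaml
# Hypothetical prompt-get call (the argument field names are illustrative):
tool: prompt-get
arguments:
  id: generate-controller        # prompt id from the configuration
  arguments:                     # values substituted into {{...}} placeholders
    className: UserController
```
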
 ## Example Usage

 Here's how an LLM might use prompts during a conversation:

 1. **Listing available prompts**:
-   Claude can request a list of all available prompts to discover what templates are available.
+   The LLM can request a list of all available prompts to discover what templates are available.

 2. **Using a prompt with arguments**:
-   Claude can request a specific prompt with arguments, which will return the prompt messages with variables
+   The LLM can request a specific prompt with arguments, which will return the prompt messages with variables
    substituted.

 3. **Custom workflows**: