Describe the bug
When I call genkit.Generate using the googlegenai plugin with both streaming and tools enabled, it panics with the following error:
panic: invalid output: data did not match expected schema:
- message.content.0: Invalid type. Expected: object, given: null
goroutine 1 [running]:
main.main-range1(0x14000114480?, {0x1039ffe60?, 0x14000426680?})
/../dev/main.go:65 +0xfc
main.main.(*Flow[...]).Stream.func3(...)
/../golang/1.24.3/packages/pkg/mod/github.com/firebase/genkit/[email protected]/core/flow.go:139
main.main()
/../dev/main.go:63 +0x2c0
exit status 2
I have only been able to reproduce this bug with the googlegenai plugin. Tool calling appears to work with the Ollama plugin.
To Reproduce
Use the following program to reproduce:
package main

import (
	"context"
	"fmt"

	"github.com/firebase/genkit/go/ai"
	"github.com/firebase/genkit/go/core"
	"github.com/firebase/genkit/go/genkit"
	"github.com/firebase/genkit/go/plugins/googlegenai"
)

type Input struct {
	Location string `json:"file" jsonschema_description:"The location to check weather"`
}

func main() {
	modelName := "googleai/gemini-2.0-flash"
	ctx := context.Background()
	g, err := genkit.Init(ctx,
		genkit.WithPlugins(&googlegenai.GoogleAI{}),
		genkit.WithDefaultModel(modelName),
	)
	if err != nil {
		panic(err)
	}

	weatherTool := genkit.DefineTool(g, "getWeather", "Get weather in a location",
		func(ctx *ai.ToolContext, i Input) (string, error) {
			return fmt.Sprintf("The weather in %s is 19 degrees and partly cloudy", i.Location), nil
		},
	)

	getWeatherFlow := genkit.DefineStreamingFlow(g, "getWeatherFlow",
		func(ctx context.Context, message string, callback core.StreamCallback[string]) (string, error) {
			resp, err := genkit.Generate(
				ctx,
				g,
				ai.WithTools(weatherTool),
				ai.WithSystem(`
You have access to a tool, getWeather that lets you get the current weather
in a location.
Always use tools if it is relevant to the user's request. Carefully
consider the user's input and determine if it contains inputs to your tools.
ALWAYS LET THE USER KNOW YOU ARE ABOUT TO USE TOOLS.
`),
				ai.WithPrompt(message),
				ai.WithStreaming(func(ctx context.Context, chunk *ai.ModelResponseChunk) error {
					callback(ctx, chunk.Text())
					return nil
				}),
			)
			if err != nil {
				return "", err
			}
			return resp.Message.Text(), nil
		})

	streamCh := getWeatherFlow.Stream(ctx, "What is the weather in Vancouver?")
	for result, err := range streamCh {
		if err != nil {
			panic(err)
		}
		if result.Done {
			fmt.Printf("Final Message: %s \n", result.Output)
		} else {
			fmt.Printf("Received Token Chunk: %s \n", result.Stream)
		}
	}
}
Just run with go run.
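For completeness, the run steps I used look roughly like this. Note the environment variable name (GEMINI_API_KEY) is an assumption based on how the googlegenai plugin typically picks up credentials; adjust to however you supply your Gemini API key.

```shell
# GEMINI_API_KEY is assumed here; substitute your own key/variable setup.
export GEMINI_API_KEY="your-api-key"
go mod tidy
go run main.go
```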
Expected behavior
I would expect to see a stream of tokens followed by the complete output.
Screenshots
N/A
Runtime (please complete the following information):
- OS: macOS
- Version: 15.3.1
Go version
- 1.24.3