
New serverless pattern - appsync websockets with lambda and bedrock #2773


Conversation

humem001
Contributor

This AI chat application works through a simple request-response flow. A client sends a GraphQL mutation to the AWS AppSync API endpoint, and AppSync routes the request to the AWS Lambda function via the configured data source and resolver. The Lambda function receives the user's input message and constructs a properly formatted request for Amazon Bedrock's Claude model, including the Anthropic API version, token limit, and user message. It then invokes Amazon Bedrock through the AWS SDK, waits for the AI response, and returns the generated text to AWS AppSync, which delivers the response to the client. A subscription on the mutation pushes real-time notifications to connected clients whenever a new response is available, creating an interactive chat experience where users send messages and receive AI-generated replies in real time.
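
For illustration, here is a minimal sketch of what the Lambda resolver step could look like, assuming a Python runtime and the Anthropic Claude Messages API on Amazon Bedrock. The model ID, the `message` argument name, and the token limit are illustrative assumptions, not taken from the pattern's actual code:

```python
# Minimal sketch of the AppSync -> Lambda -> Bedrock resolver described above.
# Model ID, argument names, and max_tokens are assumptions for illustration.
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"  # assumed model


def handler(event, context):
    # AppSync Lambda data sources pass resolver arguments under "arguments".
    user_message = event["arguments"]["message"]

    # Request body in the format expected by the Anthropic Messages API on Bedrock:
    # API version, token limit, and the user's message.
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": user_message}],
    }

    # Invoke the model synchronously via the AWS SDK and wait for the response.
    response = bedrock.invoke_model(
        modelId=MODEL_ID,
        contentType="application/json",
        accept="application/json",
        body=json.dumps(body),
    )

    result = json.loads(response["body"].read())
    generated_text = result["content"][0]["text"]

    # The returned object maps onto the GraphQL mutation's response type;
    # AppSync delivers it to the caller and to any subscribers on that mutation.
    return {"message": user_message, "response": generated_text}
```

The real-time behavior comes from AppSync itself: a GraphQL subscription attached to the mutation pushes each returned response to connected clients over WebSockets, so no extra code is needed in the Lambda function for that part.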

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.
