The MariaDB Vector MCP server provides tools that LLM agents can use to interact with a MariaDB database with vector support, giving users a natural-language interface for storing and querying their data. Thanks to the Model Context Protocol (MCP), the server is compatible with any MCP client, including those provided by applications like Claude Desktop, Cursor, and Windsurf, as well as LLM agent frameworks like LangGraph and PydanticAI.
Using the MariaDB Vector MCP server, users can for example:

- Provide context from a knowledge base to their conversations with LLM agents
- Store and query their conversations with LLM agents

The server offers the following features:

- **Vector Store Management**
  - Create and delete vector stores in a MariaDB database
  - List all vector stores in a MariaDB database
- **Document Management**
  - Add documents with optional metadata to a vector store
  - Query a vector store using semantic search
- **Embedding Provider**
  - Use OpenAI's embedding models to embed documents
The server exposes the following tools:

- `mariadb_create_vector_store`: Create a vector store in a MariaDB database
- `mariadb_delete_vector_store`: Delete a vector store in a MariaDB database
- `mariadb_list_vector_stores`: List all vector stores in a MariaDB database
- `mariadb_insert_documents`: Add documents with optional metadata to a vector store
- `mariadb_search_vector_store`: Query a vector store using semantic search
Note: From here on, it is assumed that you have a running MariaDB instance with vector support (version 11.7 or higher). If you don't have one, you can quickly spin up a MariaDB instance using Docker:
```shell
docker run -p 3306:3306 --name mariadb-instance -e MARIADB_ROOT_PASSWORD=password -e MARIADB_DATABASE=database_name mariadb:11.8
```
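To verify that the instance is up and that its version supports vectors (11.7 or higher), you can query it with the client bundled in the image; the container name and credentials below match the `docker run` command above:

```shell
docker exec mariadb-instance mariadb -u root -ppassword -e "SELECT VERSION();"
```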
First clone the repository:
```shell
git clone https://github.com/MariaDB/mcp mcp-server-mariadb-vector
```
There are two ways to run the MariaDB Vector MCP server: as a Python package using uv or as a Docker container built from the provided Dockerfile.
Depending on the method you choose, you will need the following prerequisites:

- MariaDB Connector/C - installation instructions
- uv - installation instructions
- Docker - installation instructions
The server needs to be configured with the following environment variables:
| Name | Description | Default Value |
|---|---|---|
| `MARIADB_HOST` | host of the running MariaDB database | `127.0.0.1` |
| `MARIADB_PORT` | port of the running MariaDB database | `3306` |
| `MARIADB_USER` | user of the running MariaDB database | None |
| `MARIADB_PASSWORD` | password of the running MariaDB database | None |
| `MARIADB_DATABASE` | name of the running MariaDB database | None |
| `EMBEDDING_PROVIDER` | provider of the embedding models | `openai` |
| `EMBEDDING_MODEL` | model of the embedding provider | `text-embedding-3-small` |
| `OPENAI_API_KEY` | API key for OpenAI's platform | None |
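For example, these variables can be collected in a `.env` file; the values below are placeholders (matching the Docker quick-start above where applicable) and should be adapted to your setup:

```
MARIADB_HOST=127.0.0.1
MARIADB_PORT=3306
MARIADB_USER=root
MARIADB_PASSWORD=password
MARIADB_DATABASE=database_name
EMBEDDING_PROVIDER=openai
EMBEDDING_MODEL=text-embedding-3-small
OPENAI_API_KEY=your-openai-api-key
```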
Using uv, you can add a `.env` file with the environment variables to the root of the cloned repository and run the server with the following command:

```shell
uv run --directory path/to/mcp-server-mariadb-vector/ --env-file path/to/mcp-server-mariadb-vector/.env mcp_server_mariadb_vector
```
The dependencies will be installed automatically. An optional `--transport` argument can be added to specify the transport protocol to use. The default value is `stdio`.
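For example, to serve over SSE instead of the default stdio transport (the `sse` flag value is an assumption here, mirroring the Docker setup below):

```shell
uv run --directory path/to/mcp-server-mariadb-vector/ --env-file path/to/mcp-server-mariadb-vector/.env mcp_server_mariadb_vector --transport sse
```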
Build the Docker container from the root directory of the cloned repository by running the following command:
```shell
docker build -t mcp-server-mariadb-vector .
```
Then run the container (replace with your own configuration):
```shell
docker run -p 8000:8000 \
  --add-host host.docker.internal:host-gateway \
  -e MARIADB_HOST="host.docker.internal" \
  -e MARIADB_PORT="port" \
  -e MARIADB_USER="user" \
  -e MARIADB_PASSWORD="password" \
  -e MARIADB_DATABASE="database" \
  -e EMBEDDING_PROVIDER="openai" \
  -e EMBEDDING_MODEL="embedding-model" \
  -e OPENAI_API_KEY="your-openai-api-key" \
  mcp-server-mariadb-vector
```
The server will be available at `http://localhost:8000/sse`, using the SSE transport protocol. Make sure to leave `MARIADB_HOST` set to `host.docker.internal` if you are running the MariaDB database as a Docker container on your host machine.
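To sanity-check that the server is reachable, you can connect to the SSE endpoint with curl; the stream stays open, so interrupt it with Ctrl-C:

```shell
curl -N http://localhost:8000/sse
```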
Claude Desktop, Cursor, and Windsurf can run and connect to the server automatically using the stdio transport. To do so, add the following to your configuration file (`claude_desktop_config.json` for Claude Desktop, `mcp.json` for Cursor, or `mcp_config.json` for Windsurf):
```json
{
  "mcpServers": {
    "mariadb-vector": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "path/to/mcp-server-mariadb-vector/",
        "--env-file",
        "path/to/mcp-server-mariadb-vector/.env",
        "mcp-server-mariadb-vector"
      ]
    }
  }
}
```
Alternatively, Cursor and Windsurf can connect to an already running server on your host machine (e.g. if you are running the server as a Docker container) using SSE transport. To do so, add the following to the corresponding configuration file:
"mcpServers": {
"mariadb-vector": {
"url": "http://localhost:8000/sse"
}
}
}
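Finally, if you want to exercise the server's tools outside of these applications, one option is the MCP Inspector from the Model Context Protocol project (requires Node.js). The invocation below is a sketch that launches the server over stdio with the same uv command used in the configuration above:

```shell
npx @modelcontextprotocol/inspector uv run --directory path/to/mcp-server-mariadb-vector/ --env-file path/to/mcp-server-mariadb-vector/.env mcp_server_mariadb_vector
```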