
We’ve entered the Agentic AI Era—a new phase where large language models (LLMs) go beyond understanding instructions to actively making decisions and autonomously executing tasks. In this landscape, the ability to efficiently integrate LLMs with external systems, particularly databases, has become critical for building truly powerful AI applications.
The Model Context Protocol (MCP) has emerged as the standard bridge connecting LLMs and external data sources or tool systems. It simplifies access to external resources, allowing developers to inject real-time data and executable functionality into LLM applications in a unified way.
Today, we’re excited to announce that PyTiDB—our AI SDK for TiDB with vector storage and AI-powered search capabilities—now offers native support for the MCP protocol. This integration eliminates the need to build complex plugins or custom APIs. With a simple MCP server configuration, your TiDB resources become directly accessible within popular agent platforms like Claude Desktop and Cursor, empowering AI agents with enhanced memory, analytical reasoning, and action-taking abilities.
In this post, we’ll dive deep into the core mechanisms behind MCP and TiDB’s role within the MCP ecosystem. We’ll also explore a practical example with complete source code that shows how to build an intelligent data analysis app powered by TiDB and MCP.
What is Model Context Protocol (MCP)?
The Model Context Protocol (MCP) is an open standard that defines how LLMs can access external data and tools.
How MCP Works: The Three Core Components
- MCP Host: The LLM platform that wants to access external data (e.g., Claude Desktop, Cursor).
- MCP Client: A host can include multiple MCP clients, each maintaining a one-to-one connection with an MCP server.
- MCP Server: Handles the actual requests, communicates with external systems (e.g., databases, APIs), and returns the results to the model.
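To make this concrete, below is a minimal sketch of an MCP server exposing a single database tool, written with the official mcp Python SDK. It is illustrative only and not PyTiDB's actual implementation; the tool name, the connection defaults, and the use of pymysql are assumptions.

from mcp.server.fastmcp import FastMCP
import pymysql

mcp = FastMCP("tidb-demo")  # server name is arbitrary

@mcp.tool()
def db_query(sql: str) -> list[dict]:
    """Run a SQL query against TiDB and return the rows."""
    # Connection values are placeholders; TiDB listens on port 4000 by default.
    conn = pymysql.connect(host="127.0.0.1", port=4000, user="root",
                           password="", database="test",
                           cursorclass=pymysql.cursors.DictCursor)
    try:
        with conn.cursor() as cur:
            cur.execute(sql)
            return list(cur.fetchall())
    finally:
        conn.close()

if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport used by desktop clients

An MCP host such as Claude Desktop spawns a process like this, lists its tools, and calls them on the model's behalf.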

Why MCP Matters: Key Benefits
- Breaks Down Data Silos: MCP bridges the gap between AI applications and external systems like databases, enabling LLMs to work with up-to-date, real-time data instead of outdated training sets.
- Evolves Chatbots into True Agents: MCP fosters integration with third-party tools, accelerating the evolution from static chatbots to autonomous agents with perception, decision-making, and action capabilities.
- No Need for Custom Agent Plugins: With a unified protocol, each external tool only needs to implement one MCP server. This avoids repeated integration efforts for each AI platform.
Demo: GitHub Data Insights with TiDB and MCP
In the past, we stored over 9 billion public GitHub event records in TiDB for the OSSInsight project, offering pre-written SQL-based analytics and online dashboards. Today, the development pattern for data applications is shifting.
With the TiDB MCP Server, developers can now:
1. Easily Implement Agentic Workflows: Traditional Text-to-SQL apps follow fixed workflows, so a single bad query derails them. An AI agent connected to the database can automatically retry or correct its SQL based on error feedback, dramatically reducing manual intervention (see the sketch after this list).
2. Enable Continuous Learning for AI: New-generation AI tools include memory modules to record user preferences. Over time, the LLM becomes better at generating accurate SQL queries, tailored to user intent.
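Conceptually, the retry loop in point 1 looks like the sketch below. The generate_sql and execute_sql callables are hypothetical stand-ins for the LLM call and the TiDB MCP server's query tool, not real APIs.

def answer_question(question, schema, generate_sql, execute_sql, max_attempts=3):
    """Sketch of an agentic Text-to-SQL loop.

    generate_sql(question, schema, feedback) -> str   # the LLM call (hypothetical)
    execute_sql(sql) -> rows                           # e.g. the MCP query tool (hypothetical)
    """
    feedback = ""
    for attempt in range(1, max_attempts + 1):
        # Ask the LLM for SQL, feeding back the previous error (if any).
        sql = generate_sql(question, schema, feedback)
        try:
            return execute_sql(sql)
        except Exception as err:  # syntax error, unknown column, timeout, ...
            feedback = f"Attempt {attempt} failed with: {err}. Please fix the SQL."
    raise RuntimeError("Could not produce a working query")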
To see this in action, we configured the TiDB MCP Server in Claude Desktop and connected it to OSSInsight’s read-only TiDB cluster. We then prompted the AI agent with this request:
Analyze 9 billion GitHub events stored in the TiDB database. Retrieve the star history of the pingcap/tidb repository (cumulative by year, counting only the first time each user starred). Present the result using an ECharts line chart, and keep the explanation concise.
Watch how the AI agent responds:
As shown in the video, the AI agent autonomously:
- Explored the database structure using tools like show_tables to understand the schema.
- Crafted and executed the appropriate SQL query with db_query.
- Transformed the raw data into a visual ECharts line chart.
- Delivered a concise explanation of the findings.
This entire workflow happened without any manual coding or database expertise required from the user—demonstrating how AI agents can make complex data analysis accessible through natural language.
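For reference, the query the agent converges on looks something like the sketch below. The github_events table and its columns follow OSSInsight's public schema, but treat those names, and the placeholder connection parameters, as assumptions to adapt to your own data; any MySQL-compatible client works because TiDB speaks the MySQL protocol.

import pymysql

# Cumulative stars per year, counting each user only once (their first WatchEvent).
# Table and column names assume OSSInsight's github_events schema.
SQL = """
WITH first_stars AS (
    SELECT actor_login, YEAR(MIN(created_at)) AS star_year
    FROM github_events
    WHERE repo_name = 'pingcap/tidb' AND type = 'WatchEvent'
    GROUP BY actor_login
),
yearly AS (
    SELECT star_year, COUNT(*) AS new_stars FROM first_stars GROUP BY star_year
)
SELECT star_year, SUM(new_stars) OVER (ORDER BY star_year) AS cumulative_stars
FROM yearly
ORDER BY star_year
"""

conn = pymysql.connect(host="{host}", port=4000, user="{username}",
                       password="{password}", database="{database}")
try:
    with conn.cursor() as cur:
        cur.execute(SQL)
        for star_year, cumulative_stars in cur.fetchall():
            print(star_year, cumulative_stars)
finally:
    conn.close()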
Try TiDB MCP Server Yourself
Let’s walk through running the TiDB MCP Server step by step so you can interact with your database and gain insights directly from an MCP client.
Prerequisites
- An MCP-compatible client (e.g., Claude Desktop, Cursor, Cline, DeepChat, ChatWise).
- A TiDB database cluster for testing (👉 Create a free TiDB Serverless cluster on TiDB Cloud).
- Sample data prepared (TiDB Cloud provides a lightweight sample dataset of GitHub repositories. You can follow the steps in the screenshot below to import it.)

Setup Steps
For comprehensive documentation, please visit https://pingcap.github.io/pytidb/integrations/mcp.
- Clone the PyTiDB repo:
git clone https://github.com/pingcap/pytidb
cd pytidb
- Install dependencies. We recommend using the uv package manager for the best experience:
uv sync --extra mcp
- In Claude Desktop, go to Settings -> Developer -> Edit Config and open the configuration file:

- Add the TiDB MCP server config:
You can find the database connection parameters on the cluster details page in TiDB Cloud.
{
  "mcpServers": {
    "tidb": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/pytidb",
        "run",
        "-m",
        "pytidb.ext.mcp"
      ],
      "env": {
        "TIDB_HOST": "{host}",
        "TIDB_PORT": "4000",
        "TIDB_USERNAME": "{username}",
        "TIDB_PASSWORD": "{password}",
        "TIDB_DATABASE": "{database}"
      }
    }
  }
}
- Restart Claude Desktop. You’ll see the TiDB MCP Server tools listed in the agent’s tool panel.
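Claude Desktop isn’t the only way to exercise the server. As a rough sketch, the official mcp Python SDK can spawn the same server process and call its tools directly. The db_query tool name comes from the demo above; its "sql" argument name is an assumption.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Same command/env as the Claude Desktop config above.
server = StdioServerParameters(
    command="uv",
    args=["--directory", "/path/to/pytidb", "run", "-m", "pytidb.ext.mcp"],
    env={
        "TIDB_HOST": "{host}", "TIDB_PORT": "4000",
        "TIDB_USERNAME": "{username}", "TIDB_PASSWORD": "{password}",
        "TIDB_DATABASE": "{database}",
    },
)

async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])  # e.g. show_tables, db_query
            # The argument name "sql" is an assumption about the tool's schema.
            result = await session.call_tool("db_query", {"sql": "SELECT 1"})
            print(result.content)

asyncio.run(main())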

Looking Ahead
The Model Context Protocol is rapidly establishing itself as the standard interface between AI models and external systems. By creating this universal communication layer, MCP allows AI model providers to focus on enhancing core capabilities while enabling tool developers to build once and deploy across multiple AI platforms.
The release of the TiDB MCP Server marks a significant step forward in our AI ecosystem strategy. We’re excited to see what you’ll create with these capabilities!
If you have any questions or want to share what you’re building, join our community on Discord where our team and other developers are ready to help. Your feedback directly shapes TiDB and makes it better for everyone.
Additional Resources:
- TiDB Vector Search for Generative AI: https://www.pingcap.com/ai/
- Code: https://github.com/pingcap/pytidb/tree/main/pytidb/ext
- Documentation: https://pingcap.github.io/pytidb/integrations/mcp