MCP Apps is a stable standard (version 2026-01-26) that lets MCP servers render interactive interfaces (charts, forms, maps, 3D scenes, and data panels) directly inside AI chat clients such as Claude and ChatGPT.
You can install the @modelcontextprotocol/ext-apps SDK via npm, use it to build applications or a host environment, and run more than 20 examples (budgeting tools, PDF viewers, real-time monitoring panels, and more) locally or inside a client.
With this standard, you can create immersive, interactive AI tools that embed rich visualizations and interaction controls directly into conversations.
Recently, a project has been drawing attention in the open source community: modelcontextprotocol/ext-apps.
At first glance, many people mistake this repository for an ordinary collection of examples. Viewed within the whole Model Context Protocol (MCP) ecosystem, though, it is doing something important: turning the AI chat window into an interface that can actually run applications.
For the past few years, large language models have had essentially one form of interaction: text. You ask the AI a question and it returns a passage of text, with at most some code blocks or tables attached. Even when the AI calls tools, queries databases, or runs scripts in the background, what the user sees is still a textual description.
This pattern is powerful enough for information lookup and simple analysis, but for more complex tasks such as data analysis, budget calculation, monitoring dashboards, or document browsing, text becomes limiting. What users really need is often not an explanation but an interface they can operate.
This is exactly the problem MCP Apps sets out to solve.
MCP itself is a protocol that defines how AI models call external tools, access data, and interact with other systems. MCP Apps goes one step further: it allows those tools to return not only data, but also interfaces.
As a result, the AI no longer merely outputs a description; it can present charts, forms, maps, and even 3D scenes directly in the chat window. Users interact with these interfaces inside the conversation, modifying parameters, filtering data, and submitting forms, while the AI continues working based on those interactions.
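Conceptually, a UI-capable tool result bundles ordinary text or data with a renderable resource. The sketch below illustrates that shape; the field names (`ui`, `mimeType`, the `ui://` scheme) are illustrative assumptions modeled on general MCP resource conventions, not the normative MCP Apps schema, so consult the spec in the repository for the real types:

```typescript
// Illustrative sketch only: a tool result carrying both data and an
// embeddable interface. Field names are assumptions, not the exact
// MCP Apps schema -- see modelcontextprotocol/ext-apps for the spec.

interface UiResource {
  uri: string;      // e.g. a "ui://" address the host resolves
  mimeType: string; // an HTML payload the host can sandbox in an iframe
  text: string;     // the interface markup itself
}

interface ToolResult {
  content: Array<{ type: "text"; text: string }>;
  ui?: UiResource;  // hypothetical field: the interface to render
}

// A budget tool could answer with a number AND a live calculator panel:
const result: ToolResult = {
  content: [{ type: "text", text: "Estimated budget: $2,000" }],
  ui: {
    uri: "ui://budget/calculator",
    mimeType: "text/html",
    text: "<form><input name='income'><canvas id='chart'></canvas></form>",
  },
};

// A text-only client can ignore `ui` and still get a usable answer;
// a UI-capable host renders the panel inline in the conversation.
console.log(result.ui ? "render panel" : "plain text"); // prints "render panel"
```

The key design point is graceful degradation: the plain-text `content` stands on its own, so hosts that cannot render interfaces lose nothing.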
If traditional AI conversations look like this:
AI: Based on your data analysis, the budget is $2,000.
Then the experience with MCP Apps is more like this:
A budget calculator appears directly in the chat window, where you can adjust parameters and watch the chart update in real time. Or you open a PDF viewer and discuss the content with the AI while reading the document. You can even watch a live monitoring panel and check system status as if you were looking at an ops dashboard.
This is exactly why the ext-apps repository exists.
The project provides the development tooling and a sample environment for MCP Apps. Developers can install the @modelcontextprotocol/ext-apps SDK through npm and use it to build their own MCP applications, or directly run the more than 20 example programs included in the repository.
These examples are not throwaway demos but complete small applications: budgeting tools, PDF readers, real-time monitoring panels, and data visualization interfaces. Together they give a concrete sense of how MCP Apps embeds genuinely interactive interfaces into AI conversations.
From a technical point of view, this moves the "application layer" into the chat window.
The AI is no longer just a question-answering system; it becomes something like an operating system entry point. Users launch tools and call services through natural language, and each tool's interface appears directly in the conversation flow. Charts, forms, maps, and data tables that used to live in web applications now show up inside AI chat interfaces.
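An "application layer in the chat window" implies a two-way message loop: the embedded interface posts events (a changed parameter, a submitted form) to the host, and the host routes them to a server tool and pushes the result back so the panel can redraw. Here is a minimal simulation of that loop; every name and message shape is invented for illustration, since real hosts sandbox the UI (typically in an iframe) and exchange structured messages over a channel such as postMessage:

```typescript
// Minimal simulation of the host <-> embedded-app message loop.
// All message shapes and handler names here are illustrative
// assumptions, not the actual MCP Apps wire protocol.

type AppMessage = { kind: "callTool"; tool: string; args: Record<string, number> };
type HostReply = { kind: "toolResult"; data: number };

// The "server tools" the host can dispatch to: here, a toy budget calculator.
const tools: Record<string, (args: Record<string, number>) => number> = {
  budget: ({ income, rent }) => income - rent,
};

// Host side: receive a message from the embedded interface, run the tool,
// and send the result back so the panel can update its chart.
function hostDispatch(msg: AppMessage): HostReply {
  return { kind: "toolResult", data: tools[msg.tool](msg.args) };
}

// The user drags a slider in the panel; the panel notifies the host:
const reply = hostDispatch({
  kind: "callTool",
  tool: "budget",
  args: { income: 5000, rent: 3000 },
});
console.log(reply.data); // prints 2000
```

The point of the loop is that the model is not the only actor: the interface itself can trigger tool calls, and the conversation stays consistent because everything flows through the host.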
The potential for this change is huge.
Imagine a data analyst who no longer switches between multiple programs, but views data charts and runs analysis scripts directly in the AI conversation; an ops engineer who sees a live monitoring panel in the chat window and lets the AI diagnose anomalies automatically; or an everyday user who completes budget planning, document reading, or project management entirely in conversation.
In this mode, the chat window gradually becomes a new application container.
This capability is likely to become common as more AI clients, such as Claude Desktop and ChatGPT, begin to support MCP Apps.
Developers will no longer build only web apps, but "AI apps" that can be embedded in AI conversations. Users will no longer need to open dozens of websites or programs; talking to the AI will be enough to call these tools.
From this perspective, modelcontextprotocol/ext-apps is not just a sample repository; it looks more like the starting point of a new ecosystem.
When AI conversations begin to host application interfaces, the software forms we are used to may change. Many future tools may no longer appear as standalone websites or apps, but as interactive panels inside AI conversations.
And MCP Apps may be an important piece of the puzzle that leads to this future.
GitHub: https://github.com/modelcontextprotocol/ext-apps