Artificial intelligence has a new buzzword: MCP, short for Model Context Protocol.
Buzzwords are nothing new in AI, but this one feels different, Baidu chairman Robin Li suggested during the Baidu Create conference on April 25.
“Right now, developing agents based on MCP is like building mobile apps in 2010,” Li said during the event.
Those unfamiliar with MCP have likely heard of the last big buzzword: agent. Chinese AI startup Manus catapulted that term into the mainstream almost overnight with the launch of its agentic AI platform earlier this year.
What made AI agents go viral was their perceived functionality. Previously, large models could answer questions, but many regarded them as glorified chat windows—limited by the data they were trained on. Without added infrastructure, getting these models to interact with external tools was clunky and time-consuming.
This is where MCP enters the picture. The concept is tightly linked to agents and offers a critical path to realizing their full potential. With MCP, large models can freely access external tools that support the protocol and execute specific tasks.
Some consumer apps, like Amap and WeRead, have already launched official MCP servers. That enables developers to treat tools like modular components: pick a foundation model, connect it to an MCP server run by Amap or WeRead, and the model can retrieve maps or read data in real time.
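As a rough sketch of what that looks like from the developer's side, the snippet below uses the official MCP Python SDK (the "mcp" package) to connect to a provider's server, list its tools, and call one. The server command "amap-mcp-server" and the tool name "maps_geocode" are hypothetical placeholders, not Amap's actual interface.

```python
# Minimal MCP client sketch, assuming the official Python SDK ("mcp" package).
# The server command and tool name below are hypothetical placeholders.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the provider's MCP server as a local subprocess (hypothetical command).
    server = StdioServerParameters(command="amap-mcp-server", args=[])

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Ask the server which tools it exposes, in a standard format
            # any MCP-aware model application can consume.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call one tool; the arguments follow the JSON schema
            # the server advertised for it.
            result = await session.call_tool(
                "maps_geocode", arguments={"address": "Beijing West Railway Station"}
            )
            print(result)


if __name__ == "__main__":
    asyncio.run(main())
```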
Since February, interest in MCP has surged globally. Nearly every major AI player—including OpenAI, Google, Meta, and China’s Alibaba, Tencent, ByteDance, and Baidu—has announced support for MCP. Each has launched its own MCP platform, with recruitment of developers and service providers underway.
Back in 2024, one of the hottest topics in China’s AI circles was the rise of the “superapp” concept. Many believed that year would mark a turning point, but innovation remained fragmented, hindered by the lack of a shared foundation. MCP’s rapid ascent has drawn comparisons to Qin Shi Huang’s unification of ancient China—standardizing writing, currency, and transportation to accelerate commerce.
Analysts now believe MCP and similar protocols could finally spark a true boom in AI apps.
In truth, MCP isn’t entirely new. It was quietly introduced by Anthropic in November 2024.
As an open standard, MCP enables large model applications to interface with external data sources and tools, as long as both sides use the same standardized “language.”
Still sounds abstract? Think of the ports on your phone or laptop. MCP acts like a universal socket—a standardized “USB interface” for large models.
This plug-and-play setup allows developers to integrate external data and workflows far more efficiently.
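Under the hood, the "universal socket" is simply JSON-RPC 2.0 with fixed method names such as tools/list and tools/call. The sketch below shows roughly what those messages look like; the weather tool and its arguments are invented for illustration.

```python
# What the standardized "language" looks like on the wire: MCP messages are
# JSON-RPC 2.0. The method names "tools/list" and "tools/call" come from the
# MCP specification; the tool name and arguments are made up.
import json

# Client asks the server which tools it offers.
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# Client invokes one of those tools with JSON arguments.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "get_weather",            # hypothetical tool exposed by a server
        "arguments": {"city": "Beijing"},
    },
}

print(json.dumps(list_request, indent=2))
print(json.dumps(call_request, indent=2))
```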
Before MCP, building AI applications was a heavy lift. Take the example of a developer building a travel assistant. The app would need the model to read maps, scrape travel content online, and generate itineraries based on user preferences.
To achieve that, developers had to handle each provider’s quirks. OpenAI and Anthropic had slightly different implementations for function calling. Switching models meant rewriting compatibility layers, essentially coding a custom tool-use manual for each model. Without it, output quality would plummet.
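To make the mismatch concrete, here is the same hypothetical weather tool declared once for OpenAI's chat completions API and once for Anthropic's messages API. The field names and nesting differ, so the declarations, and the code that routes the model's tool calls, had to be rewritten per provider.

```python
# One made-up tool, described twice: the underlying JSON schema is identical,
# but each provider wraps it differently.

schema = {
    "type": "object",
    "properties": {"city": {"type": "string"}},
    "required": ["city"],
}

# OpenAI chat completions: wrapped in a "function" object, schema under "parameters".
openai_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": schema,
    },
}

# Anthropic messages API: flat object, schema under "input_schema".
anthropic_tool = {
    "name": "get_weather",
    "description": "Look up the current weather for a city.",
    "input_schema": schema,
}

# Same schema, different wrapping: this is the compatibility layer in miniature.
print(openai_tool["function"]["parameters"] == anthropic_tool["input_schema"])
```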
In short, the absence of a standard made tool integrations fragile and time-consuming, slowing the entire AI ecosystem.
“As an AI application developer, before MCP, you had to understand both the model and how to integrate third-party tools yourself. And if something didn’t work, you had to figure out whether the issue was with your app or the tool itself,” said Chen Ziqian, an algorithm expert from Alibaba Cloud’s ModelScope community, in an interview with 36Kr.
Manus is a prime example. A recent 36Kr review showed the app needed access to more than a dozen tools just to write a basic news article. That included browsers, scraping tools, writing modules, verification systems, and delivery frameworks. Each step required custom functions. If Manus overloaded itself, it crashed—typically from token limits triggered by complex task chains.
MCP spares developers from worrying about external tool performance. They only need to manage their app logic. Compatibility is handled by the individual MCP servers maintained by providers like Amap or Alipay.
That said, the MCP ecosystem is still in its early stages and usage is far from seamless. Some developers argue MCP is a standard for standardization's sake. In many cases, traditional APIs are lighter and more practical. And if an MCP server is poorly maintained or lacks documentation, it raises concerns about security and stability.
Developer Tang Shuang shared a revealing case involving a map provider's MCP server. It offered fewer than 20 tools. Five required geographic coordinates. Another, for weather forecasts, asked for an administrative region ID, but gave no guidance on how to obtain it. Users had to go back to the provider's native ecosystem just to look up the required ID.
So while MCP is gaining traction, the power dynamics haven’t shifted. Big tech still controls access. No one wants to build an entire ecosystem just to uphold Anthropic’s standard. Without platform-level support and high-quality services, developers risk doing twice the work for half the return.
Why, then, is MCP taking off now?
When Anthropic first introduced the protocol, it barely registered. Only a few apps, like Claude Desktop, supported it. Developers were still building in isolation.
That changed in February, when tools like Cursor, Visual Studio Code, and Cline adopted MCP, drawing interest from the developer community. Support from major model providers also accelerated MCP’s rise.
The true tipping point came on March 27, however, when OpenAI announced it would support the protocol. Google followed soon after.
Google CEO Sundar Pichai initially showed public hesitation. On March 31, he posted on X: “To MCP or not to MCP, that’s the question.” Four days later, Google joined.
This shift marked a turning point, as adopting MCP became less about technical implementation and more about signaling alignment. Instead of competing in silos, companies began rallying around a shared standard to grow the broader ecosystem.
For the past two years, large model companies followed a familiar playbook: build walled gardens and evolve into platforms. Apple did it successfully, growing a vast developer ecosystem.
OpenAI attempted the same. In March 2023, it launched ChatGPT plugins, an open system that let third-party developers extend the chatbot's functionality. But in January 2024, OpenAI replaced plugins with custom GPTs and a curated store, making the ecosystem more closed. These GPTs could only run inside OpenAI's interface, and OpenAI took a cut. Developers had to optimize apps for a tightly controlled environment.
That pivot has not paid off. The GPT store is overcrowded with low-quality clones, and monetization remains elusive.
To be clear, Anthropic’s MCP doesn’t reinvent everything. OpenAI’s function calling already enabled models to interact with tools. MCP builds on that foundation.
The difference? Function calling demands custom code and adaptation. MCP simplifies this by packaging services into modular “Lego blocks,” making development easier.
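As an illustration of one such block, the sketch below packages a single tool as an MCP server using the FastMCP helper from the official Python SDK. The server name and tool logic are invented stubs; a real provider would plug in its own services.

```python
# A minimal "Lego block": one tool exposed as an MCP server, using the FastMCP
# helper from the official Python SDK ("mcp" package). Any MCP-aware client
# (Claude Desktop, Cursor, a custom agent) can discover and call it without
# model-specific glue code. The tool logic here is a stub for illustration.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("itinerary-tools")


@mcp.tool()
def estimate_travel_time(origin: str, destination: str) -> str:
    """Roughly estimate travel time between two places."""
    # Stub: a real server would query a routing or map service here.
    return f"About 45 minutes from {origin} to {destination}."


if __name__ == "__main__":
    # Serve over stdio so a local MCP client can launch and talk to this process.
    mcp.run()
```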
MCP’s greatest strength is its openness. It’s a neutral protocol that isn’t tied to any one model or deployment method. It works across cloud and local setups and can theoretically support any model. That neutrality means no single company owns the ecosystem.
In many ways, this is Anthropic’s bid to reclaim the developer community: with an open-door approach, in contrast to OpenAI’s increasingly gated system. The early signals from DeepSeek—and now MCP—suggest that in emerging tech, openness and open standards may still be the best long-term play.
KrASIA Connection features translated and adapted content that was originally published by 36Kr. This article was written by Deng Yongyi for 36Kr.