
One of the biggest AI developments to emerge in the past year is the Model Context Protocol (MCP), an open standard introduced by Anthropic in November 2024 to facilitate seamless integration between large language models (LLMs) and external data sources and tools.
OpenAI is an early adopter of the MCP standard, incorporating it into its products, including the ChatGPT Desktop app and the Agents SDK.
people love MCP and we are excited to add support across our products.
available today in the agents SDK and support for chatgpt desktop app + responses api coming soon!
— Sam Altman (@sama) March 26, 2025
Developer tools such as Zed, Replit, Windsurf, Cursor, and VS Code have also incorporated MCP into their platforms.
Now, Google is the latest company to announce plans to add support for the standard. In a recent X post, Google DeepMind CEO Demis Hassabis said:
MCP is a good protocol and it's rapidly becoming an open standard for the AI agentic era. We're excited to announce that we'll be supporting it for our Gemini models and SDK. Look forward to developing it further with the MCP team and others in the industry.
MCP works by establishing a standardized connection between two components: MCP Clients, which are typically AI-powered applications like chatbots or productivity tools that need to access external data or execute specific functions, and MCP Servers, which expose structured interfaces to data sources, tools, or prompt templates.
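To make the client side of that connection concrete, here is a minimal sketch using the official Python MCP SDK (the `mcp` package). The spawned `server.py` script and the `add` tool name are illustrative assumptions, not details from the announcement:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Assumption: a local MCP server script named "server.py" launched over stdio.
    server_params = StdioServerParameters(command="python", args=["server.py"])

    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Handshake: negotiate protocol version and capabilities.
            await session.initialize()

            # Discover what the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Invoke a tool by name (hypothetical "add" tool).
            result = await session.call_tool("add", {"a": 2, "b": 3})
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```

The client application (the chatbot or productivity tool) drives this session, while the server decides what data and functions are exposed.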
The capabilities a server exposes fall into three categories: Resources, which include documents, images, and other data objects; Tools, which are executable functions the model can call to retrieve information or perform actions; and Prompts, which are structured templates that guide the model's behavior for specific tasks or domains.
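As a sketch of the server side, the same Python SDK's FastMCP helper registers each of these three categories with a decorator. The names below (`sum_numbers`, the `notes://` URI, `summarize_notes`) are hypothetical examples rather than anything defined by the protocol itself:

```python
from mcp.server.fastmcp import FastMCP

# Create a named MCP server (the name "demo-server" is illustrative).
mcp = FastMCP("demo-server")


# Tool: an executable function the model can call.
@mcp.tool()
def sum_numbers(a: int, b: int) -> int:
    """Add two numbers and return the result."""
    return a + b


# Resource: a data object addressed by a URI template.
@mcp.resource("notes://{note_id}")
def get_note(note_id: str) -> str:
    """Return the contents of a note (hypothetical lookup)."""
    return f"Contents of note {note_id}"


# Prompt: a reusable template that guides the model for a specific task.
@mcp.prompt()
def summarize_notes(topic: str) -> str:
    """Build a prompt asking the model to summarize notes on a topic."""
    return f"Please summarize all notes related to {topic}."


if __name__ == "__main__":
    # Serve over stdio so an MCP client can launch and talk to this process.
    mcp.run()
```

A client such as the one sketched above could list and call `sum_numbers`, read `notes://` resources, or fetch the `summarize_notes` prompt template through the same standardized interface.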