Following the Linux Foundation's establishment of the Agentic AI Foundation (AAIF), which lists MCP (Model Context Protocol) as a core standard, Google has acted swiftly, announcing full support for the MCP protocol. The launch of a fully managed remote MCP server means developers no longer need to maintain the connection layer themselves: AI agents can directly operate Google Maps and BigQuery, and even manage cloud infrastructure such as Google Compute Engine and GKE.
Embracing the "USB-C of AI," Google officially jumps in
MCP is an open protocol proposed by Anthropic in 2024. Essentially, it is a standard interface specifying how external tools communicate with AI models. Because it offers a consistent interface format that lets models call various services and APIs as easily as plugging in a USB-C device, the industry jokingly calls it the "USB-C of the AI world."
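The "consistent interface format" is concretely JSON-RPC 2.0: every tool invocation is a `tools/call` request carrying the tool name and its arguments. A minimal sketch of building such a message (the tool name and arguments below are invented for illustration, not a real Google tool):

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool name, purely illustrative.
msg = make_tool_call(1, "search_places", {"query": "coffee near Taipei 101"})
print(json.dumps(msg, indent=2))
```

Because every server speaks this same envelope, swapping one tool for another changes only the `name` and `arguments`, which is exactly the "plug and unplug" property the USB-C analogy refers to.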
Google points out that developers who wanted MCP servers for Google services previously had to find open-source versions from the community and run and maintain them on their own machines, which was costly to integrate and difficult to keep stable. With Google's fully managed service, developers only need to point their AI agent at the MCP endpoint Google provides to interact directly with Google Cloud services.
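"Pointing the agent at the endpoint" amounts to an MCP handshake over HTTP: an `initialize` request, then `tools/list` to discover what the server offers. The sketch below builds those messages with the standard library; the endpoint URL, bearer token, and protocol version string are placeholders, since the real values come from Google's documentation and your own credentials:

```python
import json
import urllib.request

# Placeholder endpoint -- the real address comes from Google's docs.
MCP_ENDPOINT = "https://example.googleapis.com/mcp"

def build_rpc(request_id, method, params=None):
    """Assemble one JSON-RPC 2.0 request body."""
    body = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        body["params"] = params
    return body

# Step 1: the MCP handshake declares client info and capabilities.
handshake = build_rpc(1, "initialize", {
    "protocolVersion": "2025-03-26",  # illustrative version string
    "capabilities": {},
    "clientInfo": {"name": "demo-agent", "version": "0.1"},
})

# Step 2: ask the managed server which tools it exposes.
list_tools = build_rpc(2, "tools/list")

# The request is prepared but not sent here; auth would use an OAuth
# access token issued for your Google Cloud project.
req = urllib.request.Request(
    MCP_ENDPOINT,
    data=json.dumps(list_tools).encode(),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer <access-token>"},
)
print(req.get_method())  # POST
```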
The first wave of four services: from map lookup to container service management
The first wave of support covers three main areas: data querying, geographic information, and infrastructure management.
• Google Maps (Grounding Lite): Provides reliable information such as location details, weather forecasts, and route times, helping the AI agent answer travel-planning or nearby-recommendation questions while significantly reducing the risk of the model hallucinating.
• BigQuery: Enables AI agents to natively parse schemas and execute queries. Crucially, data can be analyzed in place without being imported into the context window, while preserving data governance and security.
• Google Compute Engine (GCE): Empowers the AI agent to autonomously manage infrastructure, from initial setup to subsequent maintenance (Day 2 operations), dynamically adjusting resources based on workload.
• Google Kubernetes Engine (GKE): Provides a structured interface for interacting directly with the Kubernetes API, so developers no longer need the model to parse complex CLI text output; the agent can diagnose container issues, handle faults, and optimize costs in a consistent environment.
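The common thread across the four services is structured tool routing: the server maps a tool name to a handler that returns typed data rather than CLI text. A toy in-process dispatcher illustrates the shape (the tool names, arguments, and result fields are all hypothetical; the real servers are hosted and defined by Google):

```python
def run_query(arguments):
    # Stand-in for a BigQuery tool executing SQL in place and returning
    # aggregated rows, so raw data never enters the model's context.
    return {"rows": [{"region": "north", "revenue": 1200}],
            "row_count": 1}

def list_instances(arguments):
    # Stand-in for a GCE tool returning structured machine state
    # instead of CLI output the model would have to parse.
    return {"instances": [{"name": "web-1", "status": "RUNNING"}]}

# Hypothetical tool registry, analogous to what tools/list would expose.
TOOLS = {
    "bigquery.run_query": run_query,
    "gce.list_instances": list_instances,
}

def dispatch(tool_name, arguments):
    """Route a tools/call request to the matching handler."""
    return TOOLS[tool_name](arguments)

result = dispatch("bigquery.run_query",
                  {"sql": "SELECT region, SUM(revenue) FROM sales GROUP BY region"})
```

The structured return values are what make the GKE point concrete: an agent checking `instances[0]["status"]` is far more reliable than one regex-parsing terminal output.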
Example: automated site selection with Gemini 3 Pro
Google also cited a practical application scenario: developers can use the Agent Development Kit (ADK) to create an agent service centered on Gemini 3 Pro. Through the MCP protocol, this AI agent can first pull sales data from BigQuery to forecast revenue, then use Google Maps to survey the surrounding business environment and delivery routes, and finally combine both sources into retail site-selection recommendations, with the whole process coordinated automatically.
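The flow above can be sketched as plain orchestration logic, with stub functions standing in for the BigQuery and Maps MCP tool calls. Every function name, region name, and number here is invented for illustration; the scoring rule is a deliberately naive example, not Google's method:

```python
# Stub for a BigQuery MCP tool: aggregate sales into a revenue forecast.
def query_sales(region):
    revenue = {"xinyi": 150_000, "daan": 110_000}  # fabricated figures
    return {"region": region, "projected_revenue": revenue[region]}

# Stub for a Google Maps MCP tool: nearby businesses and delivery times.
def survey_surroundings(region):
    data = {"xinyi": {"competitors": 3, "avg_delivery_min": 12},
            "daan": {"competitors": 5, "avg_delivery_min": 15}}
    return data[region]

def recommend_site(regions):
    """Combine revenue and competition signals into one recommendation."""
    scored = []
    for r in regions:
        sales = query_sales(r)
        area = survey_surroundings(r)
        # Naive score: expected revenue discounted by local competition.
        score = sales["projected_revenue"] / (1 + area["competitors"])
        scored.append((score, r))
    return max(scored)[1]

best = recommend_site(["xinyi", "daan"])
```

In the real scenario, the model plans which tool to call next and the ADK handles the MCP plumbing; the point is that both data sources arrive as structured results the agent can combine directly.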
On the security side, Google emphasizes that Cloud API Registry and Apigee API Hub will serve as centralized management tools, combined with Cloud IAM for access control, while Model Armor can defend against prompt injection attacks to keep enterprise data safe.
Google stated that in the coming months, it will gradually include more core services such as Cloud Run, Cloud Storage, and Cloud SQL in the scope of MCP protocol support.
Analysis
Google's move is clearly aimed at seizing control of infrastructure in the era of "Agentic AI." By supporting the universal standard MCP, Google not only lowers the barrier for developers to create AI agents on its own cloud platform, but also allows AI to move from simple "content generation" to the level of "actual execution." When AI assistants can directly help you turn VMs on and off and query databases, the stickiness of Google Cloud users will naturally be higher.
