Google launches managed MCP servers that let AI agents simply plug into its tools

AI agents are often promoted as the ultimate solution for planning trips, answering business questions, and solving a wide range of problems. However, integrating these agents with tools and data outside their chat interfaces has been a persistent challenge. Developers have typically had to patch together and maintain their own connectors, an approach that is fragile, hard to scale, and creates significant governance issues.

Google claims it is addressing this problem by launching its own fully managed, remote MCP servers. These servers are designed to make Google and Google Cloud services, such as Maps and BigQuery, easier for AI agents to connect to and use.

This move follows the launch of Google’s latest Gemini 3 model. The company aims to pair stronger AI reasoning with more reliable connections to real-world tools and data. According to Steren Giannini, product management director at Google Cloud, the goal is to make Google agent-ready by design. Instead of spending one to two weeks setting up connectors, developers can now essentially paste in a URL to a managed endpoint.

At launch, Google is starting with MCP servers for Maps, BigQuery, Compute Engine, and Kubernetes Engine. In practice, this could enable an analytics assistant to query BigQuery directly or an operations agent to interact with infrastructure services. Maps illustrates the difference: without a connection, an agent has to rely on the model's built-in knowledge of places, but with the Google Maps MCP server available as a tool, it is grounded in actual, up-to-date location information for place lookups or trip planning.
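For a sense of what "paste in a URL" means in practice, the sketch below uses the open-source MCP Python SDK to connect to a remote MCP server over HTTP, list the tools it exposes, and call one. The endpoint URL and tool name are hypothetical placeholders, not Google's actual endpoints or tool schemas.

```python
# Illustrative only: connect an MCP client to a remote MCP server by URL,
# using the open-source MCP Python SDK. URL and tool name are hypothetical.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

MCP_URL = "https://example.invalid/bigquery/mcp"  # placeholder, not a real endpoint

async def main() -> None:
    # Open an HTTP transport to the remote server, then start an MCP session.
    async with streamablehttp_client(MCP_URL) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()

            # Discover which tools the server exposes (e.g. a SQL query tool).
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

            # Call a tool by name with JSON arguments (names are illustrative).
            result = await session.call_tool("execute_sql", {"query": "SELECT 1"})
            print(result.content)

asyncio.run(main())
```

The point of the managed offering is that the server side of this exchange, including hosting, scaling, and updates, is Google's responsibility; the developer only supplies the endpoint and credentials.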

While the MCP servers will eventually be offered across all of Google's tools, they are initially launching in public preview, which means they are not yet fully covered by Google Cloud's terms of service. They are, however, available at no extra cost to enterprise customers who already pay for the underlying Google services. Giannini stated that Google expects to bring them to general availability very soon in the new year, with more MCP servers trickling in every week.

MCP, which stands for Model Context Protocol, was developed by Anthropic about a year ago as an open-source standard to connect AI systems with data and tools. The protocol has been widely adopted across the agent tooling world. Anthropic recently donated MCP to a new Linux Foundation fund dedicated to open-sourcing and standardizing AI agent infrastructure.

Because MCP is a standard, a server Google provides can connect to any client. MCP clients are the AI apps on the other end that talk to MCP servers and call the tools they offer; for Google, that includes Gemini CLI and AI Studio. Giannini noted he has also tried Anthropic's Claude and OpenAI's ChatGPT as clients, and they just work.

Google argues this initiative is about more than just connecting agents to its services. The bigger enterprise play involves Apigee, its API management product, which many companies already use to issue API keys, set quotas, and monitor traffic. Giannini explained that Apigee can essentially translate a standard API into an MCP server. This turns endpoints, like a product catalog API, into tools an agent can discover and use, with existing security and governance controls layered on top. In other words, the same API guardrails companies use for human-built apps can now apply to AI agents.
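To make that idea concrete, here is a minimal, hypothetical sketch of what such a wrapper looks like at the protocol level, written with the open-source MCP Python SDK rather than with Apigee itself: a server that exposes a product-catalog lookup as a tool an agent can discover and call. The server name, tool name, and data are illustrative assumptions, not Google's implementation.

```python
# Illustrative only: a tiny MCP server exposing one "product catalog" tool,
# built with the open-source MCP Python SDK (not Apigee or Google's servers).
from mcp.server.fastmcp import FastMCP

server = FastMCP("product-catalog")  # hypothetical server name

# Toy stand-in for an existing product catalog API.
_CATALOG = {
    "SKU-001": {"name": "Travel mug", "price_usd": 19.99},
    "SKU-002": {"name": "Field notebook", "price_usd": 7.50},
}

@server.tool()
def lookup_product(sku: str) -> dict:
    """Return catalog details for a SKU, or an error entry if unknown."""
    return _CATALOG.get(sku, {"error": f"unknown SKU: {sku}"})

if __name__ == "__main__":
    # Runs over stdio by default; the SDK also supports HTTP-based transports
    # so a remote client can reach the server by URL.
    server.run()
```

An MCP client connecting to a server like this would find lookup_product in its tool list and call it with JSON arguments, exactly as in the earlier client sketch; Apigee's role, per Giannini, is to generate that server layer from an existing API while keeping the organization's key, quota, and monitoring controls in place.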

Google’s new MCP servers are protected by Google Cloud IAM, the platform’s identity and access management system, which explicitly controls what an agent is allowed to do with each server. They are also protected by Google Cloud Model Armor, described as a firewall dedicated to agentic workloads that defends against advanced threats such as prompt injection and data exfiltration. Administrators can also rely on audit logging for additional observability.

Google plans to expand MCP support beyond the initial set of servers. In the next few months, the company will roll out support for services across areas like storage, databases, logging and monitoring, and security. As Giannini stated, they built the plumbing so that developers don’t have to.