GitHub is enhancing its AI-powered coding assistant, Copilot, to make it more autonomous and integrated with developers’ workflows.
Headlining the updates are the general availability of ‘Agent Mode’ in Visual Studio Code, enhanced Model Context Protocol (MCP) support, and several new large language models (LLMs) accessible via a premium request system.
Thomas Dohmke, CEO of GitHub, said: “GitHub Copilot is getting a whole lot more agentic with increased context of your tools and services, powered by the world’s leading models, starting today.”
Agent Mode awakens in VS Code
First previewed to VS Code Insiders in February, Agent Mode is now progressively rolling out to all stable channel users, with full availability expected in the coming weeks.
Unlike standard chat interactions or multi-file edits, Agent Mode is designed to proactively take action based on user prompts. It can analyse a high-level goal, break it down into subtasks, identify or generate necessary files, suggest terminal commands or tool calls for execution, and even attempt self-healing when encountering runtime errors.
“Compared to chat or multi-file edits, which allow you to propose code changes across multiple files in your workspace, Agent Mode is fundamentally capable of taking action to translate your ideas into code,” Dohmke explained.
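The plan-act-observe-and-retry behaviour described above can be sketched as a toy loop. This is an illustrative assumption about how such an agent is typically structured, not GitHub Copilot's actual implementation; all function names here (`plan`, `execute`, `run_agent`) are hypothetical.

```python
# Toy sketch of an agentic loop: plan subtasks, execute each one, and
# "self-heal" by retrying with error feedback. Illustrative only --
# these are NOT Copilot's real internals.

def plan(goal: str) -> list[str]:
    """Break a high-level goal into ordered subtasks (stubbed)."""
    return [f"{goal}: step {i}" for i in range(1, 4)]

def execute(subtask: str) -> tuple[bool, str]:
    """Attempt a subtask; return (success, output). Stubbed to succeed."""
    return True, f"done: {subtask}"

def run_agent(goal: str, max_retries: int = 2) -> list[str]:
    log = []
    for subtask in plan(goal):
        for attempt in range(max_retries + 1):
            ok, output = execute(subtask)
            if ok:
                log.append(output)
                break
            # Self-healing: fold the failure back into the subtask and retry.
            subtask = f"{subtask} (fix attempt {attempt + 1})"
        else:
            log.append(f"gave up on: {subtask}")
    return log
```

A real agent would replace the stubs with LLM calls, file edits, and terminal commands, but the control flow is the same shape.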
Early feedback suggests developers are already finding practical uses for this enhanced capability.
Agent Mode allows users to select from several powerful LLMs, including Anthropic’s Claude 3.5 and 3.7 Sonnet, Google’s Gemini 2.0 Flash, and OpenAI’s GPT-4o. GitHub reported that Agent Mode currently achieves a 56.0% pass rate on the SWE-bench Verified benchmark using Claude 3.7 Sonnet, with expectations for improvement as underlying models advance.
Alongside the Agent Mode rollout, GitHub announced the general availability of two other features:
- Code Review Agent: After a preview period where it was reportedly used by over one million developers, this agent assists in reviewing pull requests on GitHub.
- Next Edit Suggestions: This feature aims to streamline coding by suggesting subsequent edits, allowing developers to quickly accept changes and move forward—enabling them to “tab tab tab your way to coding glory.”
MCP: The ‘USB port for intelligence’
Central to the enhanced agent functionality for GitHub Copilot is the Model Context Protocol (MCP), now available in public preview. Dohmke described MCP as allowing developers to “equip Agent Mode with the context and capabilities it needs to help you, like a USB port for intelligence.”
MCP enables Copilot Agent Mode to interact with various tools within a developer’s engineering stack – from telemetry systems to infrastructure management tools. When a prompt is entered, the agent can consult available MCP tools to gather context (like database schemas) or perform actions (like querying a web service).
As an example, GitHub outlined a prompt: “Update my GitHub profile to include the title of the PR that was assigned to me yesterday.” The agent, leveraging MCP, would identify the necessary tools, interact with them iteratively (potentially calling multiple tools in sequence), and ultimately fulfil the request.
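MCP frames these tool interactions as JSON-RPC 2.0 messages; per the MCP specification, a client discovers tools via `tools/list` and invokes one via `tools/call`. The sketch below builds such a request. The tool name and arguments are hypothetical, chosen only to echo the profile-update example above.

```python
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialise a tools/call request using MCP's JSON-RPC 2.0 framing."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool and arguments -- a real MCP server advertises its
# actual tool names and schemas in response to a "tools/list" request.
msg = mcp_tool_call(1, "search_pull_requests", {"assignee": "octocat"})
```

An agent would send this over the server's transport (stdio or HTTP), read the tool result from the response, and feed it back into the model as context for the next step.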
To foster this ecosystem, GitHub has released a new open-source and local GitHub MCP server. This server allows any LLM tool supporting MCP to integrate with GitHub functionalities like searching repositories, managing issues, and creating pull requests. A community inventory repository highlighting available MCP servers is also available.
GitHub introduces premium models and new Copilot pricing tiers
Expanding on its multi-model strategy, GitHub is making Anthropic Claude 3.5, 3.7 Sonnet, 3.7 Sonnet Thinking, Google Gemini 2.0 Flash, and OpenAI o3-mini generally available.
Access to these specific models will operate via a new ‘premium request’ system, complementing the unlimited access to the base model (currently GPT-4o) included in all paid Copilot plans for core features like Agent Mode, chat, and code completions.
Starting 5 May 2025, GitHub Copilot Pro users will receive 300 monthly premium requests. Copilot Business and Enterprise customers will receive 300 and 1000 monthly premium requests respectively, beginning between 12-19 May 2025. Usage of these premium models is unlimited until these dates.
A new tier, Copilot Pro+, has also been introduced for individual developers. Priced at $39 per month, it includes 1500 monthly premium requests and promises access to “the best models, like GPT-4.5.”
For users exceeding their monthly allowance, a pay-as-you-go option is available for additional premium requests, starting at $0.04 per request. Both individuals and organisations can opt in and set spending limits. Different premium models will consume varying numbers of requests per use, allowing users to balance capability with cost.
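The billing arithmetic above is straightforward to sketch. The $0.04 overage rate and the 300-request Pro allowance come from the announcement; the per-model request multiplier is an assumption, since GitHub has only said that models will consume varying numbers of requests.

```python
def requests_consumed(calls: int, model_multiplier: float) -> float:
    """Premium requests used by `calls` invocations of a given model.
    The multiplier value is hypothetical; GitHub has not published rates here."""
    return calls * model_multiplier

def overage_cost(requests_used: float, monthly_allowance: int,
                 rate_per_request: float = 0.04) -> float:
    """Pay-as-you-go cost for premium requests beyond the monthly allowance."""
    extra = max(0.0, requests_used - monthly_allowance)
    return round(extra * rate_per_request, 2)

# A Copilot Pro user (300 monthly premium requests) making 450 requests
# pays for the 150 extra at $0.04 each.
print(overage_cost(450, 300))  # 6.0
```

Staying under the allowance costs nothing extra, so `overage_cost(100, 300)` is simply `0.0`.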
“With GitHub Copilot, what started out as a developer platform company is now a platform where anyone can be a developer. Together, GitHub and Microsoft fully intend on enabling a world with 1 billion developers,” Dohmke concluded.
(Image credit: GitHub)
See also: KAI Scheduler: NVIDIA open-sources Kubernetes GPU scheduler
