Yohan Lasorsa explains how to deploy a Node.js MCP server to Azure Functions using the official Anthropic MCP SDK, including practical advice for automation, infrastructure setup, and cost management.

Host Your Node.js MCP Server on Azure Functions in 3 Simple Steps

Author: Yohan Lasorsa

Deploying AI agents and MCP (Model Context Protocol) servers to production just got easier thanks to Azure Functions’ serverless hosting capabilities. This tutorial explains how to host your Node.js MCP server with the Anthropic MCP SDK on Azure Functions, enabling scalable, cost-effective deployment with minimal code changes.

What is MCP?

Model Context Protocol (MCP) is an open standard that lets AI models interact with external tools and data sources securely. Instead of hardcoding tool integrations, you build an MCP server that exposes capabilities—such as database queries or placing orders—so any MCP-compatible AI agent can discover and use them. MCP is model-agnostic and works with models from major providers, including OpenAI and Anthropic.

Why use Azure Functions?

Azure Functions provides a serverless compute platform for:

  • Zero infrastructure management
  • Automatic scaling
  • Pay-per-use pricing
  • Built-in monitoring (Application Insights)
  • Global distribution

Recent updates enable hosting Node.js servers as custom handlers, making Azure Functions a great fit for MCP servers and similar APIs.

Prerequisites

  • An Azure account with an active subscription
  • Node.js (a recent LTS version) and npm
  • Azure Developer CLI (azd) for deployment, or GitHub Codespaces as an alternative to a local setup

Quick Start: Three Simple Steps

  1. Create host.json configuration

     {
       "version": "2.0",
       "extensions": { "http": { "routePrefix": "" } },
       "customHandler": {
         "description": {
           "defaultExecutablePath": "node",
           "workingDirectory": "",
           "arguments": ["lib/server.js"]
         },
         "enableForwardingHttpRequest": true
       }
     }
    

    Adjust the arguments path as needed for your build output.

  2. Port Configuration in Server Code

     const PORT = process.env.FUNCTIONS_CUSTOMHANDLER_PORT || process.env.PORT || 3000;
     app.listen(PORT, () => {
       console.log(`MCP server listening on port ${PORT}`);
     });
    

    This ensures compatibility with Azure Functions’ environment.

  3. Create a function.json file in a handler/ directory

     {
       "bindings": [
         {
           "authLevel": "anonymous",
           "type": "httpTrigger",
           "direction": "in",
           "name": "req",
           "methods": ["get", "post", "put", "delete", "patch", "head", "options"],
           "route": "{*route}"
         },
         {
           "type": "http",
           "direction": "out",
           "name": "res"
         }
       ]
     }
    

    Routes all HTTP requests to your MCP server handler.
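With this in place, the Functions host simply forwards every incoming request to your Node.js process on the port given by FUNCTIONS_CUSTOMHANDLER_PORT. A minimal sketch of that contract with a plain Node server (no MCP involved; the echoed response is purely illustrative):

```javascript
import http from 'node:http';

// Azure Functions injects FUNCTIONS_CUSTOMHANDLER_PORT; fall back for local runs.
const port = process.env.FUNCTIONS_CUSTOMHANDLER_PORT || 3000;

const server = http.createServer((req, res) => {
  // With enableForwardingHttpRequest, req.url, method, headers, and body are
  // the client's original values, so any standard Node server works unchanged.
  res.setHeader('Content-Type', 'application/json');
  res.end(JSON.stringify({ path: req.url, method: req.method }));
});

server.listen(port, () => console.log(`Custom handler listening on ${port}`));
```

This is why an existing Express or plain Node server needs almost no changes to run on Azure Functions: the platform acts as a transparent HTTP proxy in front of your process.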

Automate with GitHub Copilot

You can automate the entire setup using GitHub Copilot’s prompt helper. This adds all required config files, updates code, and prepares Infrastructure as Code.

Example Project: Burger MCP Server

  • Implements AI-powered burger ordering via a set of MCP tools
  • Built on Express with the official MCP SDK, exposing tools such as get_burgers and place_order
  • Integrates with test and production environments
  • Full sample repo

MCP Tool Example

import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';

// Base URL of the backing burger API (assumed to be provided via an environment variable)
const burgerApiUrl = process.env.BURGER_API_URL;

export function getMcpServer() {
  const server = new McpServer({
    name: 'burger-mcp',
    version: '1.0.0',
  });

  server.registerTool(
    'get_burgers',
    { description: 'Get a list of all burgers in the menu' },
    async () => {
      const response = await fetch(`${burgerApiUrl}/burgers`);
      const burgers = await response.json();
      return { content: [{ type: 'text', text: JSON.stringify(burgers, null, 2) }] };
    }
  );
  // ...more tools
  return server;
}
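To expose this server over HTTP, the SDK's stateless Streamable HTTP pattern fits the serverless model: create a fresh transport per request and let Express handle routing. A sketch assuming the tool server above lives in mcp-server.js and is served on a /mcp route (both names are assumptions, not confirmed by the sample):

```javascript
import express from 'express';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import { getMcpServer } from './mcp-server.js';

const app = express();
app.use(express.json());

app.post('/mcp', async (req, res) => {
  // Stateless mode: no session IDs, one transport per request, which matches
  // the serverless connection model of Azure Functions.
  const server = getMcpServer();
  const transport = new StreamableHTTPServerTransport({
    sessionIdGenerator: undefined,
  });
  res.on('close', () => {
    transport.close();
    server.close();
  });
  await server.connect(transport);
  await transport.handleRequest(req, res, req.body);
});

const port = process.env.FUNCTIONS_CUSTOMHANDLER_PORT || process.env.PORT || 3000;
app.listen(port);
```

Because no session state is kept between requests, any instance can serve any request, which is what allows Azure Functions to scale the server out (and down to zero) freely.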

Testing Locally

  • Use Codespaces
  • Clone and run:

      npm install
      npm start
    
  • Use @modelcontextprotocol/inspector to test the server:

      npx -y @modelcontextprotocol/inspector
    
  • Connect with GitHub Copilot by updating .vscode/mcp.json for real-world testing
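For the Copilot connection, a minimal .vscode/mcp.json might look like this (the server name, port, and /mcp route are assumptions for illustration):

```json
{
  "servers": {
    "burger-mcp": {
      "type": "http",
      "url": "http://localhost:3000/mcp"
    }
  }
}
```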

Deployment with Azure Developer CLI

  • Define infrastructure in azure.yaml and infra/
  • Deploy with:

      azd auth login
      azd up
    
  • Choose a region and wait for deployment
  • You get a ready-to-use, globally available MCP server endpoint

Costs

  • Free tier: 1 million requests and 400,000 GB-s/month
  • Scale down to zero when idle
  • Pay only for execution time
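To get a feel for the numbers, here is a back-of-the-envelope estimate (instance size and execution time are illustrative assumptions, not measured values):

```javascript
// Illustrative free-grant estimate for the Consumption plan
const memoryGb = 0.5;        // assumed 512 MB instance size
const avgDurationSec = 0.2;  // assumed 200 ms average execution time
const gbSecondsPerRequest = memoryGb * avgDurationSec;

const freeGrantGbSeconds = 400_000;   // monthly compute grant
const executionsCovered = Math.round(freeGrantGbSeconds / gbSecondsPerRequest);

console.log(executionsCovered); // 4000000
```

Under these assumptions the compute grant covers about 4 million executions, so the 1 million free requests are typically the first limit you would hit.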

Current Limitations & Tips

  • Only the stateless Streamable HTTP transport is supported
  • The older SSE transport is not supported, since it relies on long-lived connections that don't fit the serverless connection model
  • Most use cases should migrate to Streamable HTTP, which also scales better

Resources

For community help, join the Azure AI community on Discord.


This guide enables you to rapidly bring AI agent integration scenarios to Azure cloud, reducing operational friction and focusing on delivering value through open protocols and managed serverless technology.

This post appeared first on “Microsoft Blog”.