<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Designing with AI: MCP For Software Engineers]]></title><description><![CDATA[A series covering all you need to know about building AI applications with the Model Context Protocol]]></description><link>https://newsletter.victordibia.com/s/mcp-for-software-engineers</link><image><url>https://substackcdn.com/image/fetch/$s_!1FgP!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fff160652-8bbb-475e-80a7-ba4eb5f80dcb_504x504.png</url><title>Designing with AI: MCP For Software Engineers</title><link>https://newsletter.victordibia.com/s/mcp-for-software-engineers</link></image><generator>Substack</generator><lastBuildDate>Sun, 19 Apr 2026 06:00:33 GMT</lastBuildDate><atom:link href="https://newsletter.victordibia.com/feed" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><webMaster><![CDATA[victordibia@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[victordibia@substack.com]]></itunes:email><itunes:name><![CDATA[Victor Dibia, PhD]]></itunes:name></itunes:owner><itunes:author><![CDATA[Victor Dibia, PhD]]></itunes:author><googleplay:owner><![CDATA[victordibia@substack.com]]></googleplay:owner><googleplay:email><![CDATA[victordibia@substack.com]]></googleplay:email><googleplay:author><![CDATA[Victor Dibia, PhD]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[MCP For Software Engineers | Part 2: Interactive & Long-Running Tools (Progress streaming, User Input, Cancellation), Resources & Prompts]]></title><description><![CDATA[#45 | A deep dive into implementing Tools, Resources, 
Prompts, Roots in the MCP]]></description><link>https://newsletter.victordibia.com/p/mcp-for-software-engineers-part-2</link><guid isPermaLink="false">https://newsletter.victordibia.com/p/mcp-for-software-engineers-part-2</guid><dc:creator><![CDATA[Victor Dibia, PhD]]></dc:creator><pubDate>Fri, 01 Aug 2025 13:29:53 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!cRS7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!cRS7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!cRS7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png 424w, https://substackcdn.com/image/fetch/$s_!cRS7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png 848w, https://substackcdn.com/image/fetch/$s_!cRS7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png 1272w, https://substackcdn.com/image/fetch/$s_!cRS7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!cRS7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png" width="1456" height="1010" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1010,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:544723,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://newsletter.victordibia.com/i/167856675?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!cRS7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png 424w, https://substackcdn.com/image/fetch/$s_!cRS7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png 848w, https://substackcdn.com/image/fetch/$s_!cRS7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png 1272w, https://substackcdn.com/image/fetch/$s_!cRS7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png 
1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>In <a href="https://newsletter.victordibia.com/p/mcp-for-software-engineers-part-1">Part 1</a> of this <a href="https://newsletter.victordibia.com/s/mcp-for-software-engineers">series</a>, we built a simple MCP server with a single tool using the high-level server API in the Python MCP SDK, and demonstrated how to connect to this server via a host application that implements an MCP client. </p><p>In practice, MCP is far more flexible and offers a set of <em><strong>advanced features</strong></em>, many of which are only available via the low-level API. 
In this part, we&#8217;ll explore these advanced features and how to use them effectively. </p><blockquote><p><strong>The Long-Running Tool Misconception</strong><br>Most MCP tutorials show quick request/response patterns for tools, creating the <em><strong>incorrect</strong></em> impression that MCP is unsuitable (compared to protocols like A2A) for handling long-running operations. In reality, MCP supports sophisticated tools that can run for hours, pause to request user input, send real-time progress updates, and handle cancellation gracefully. We will cover these in this article.</p></blockquote><p>Specifically, we&#8217;ll cover:</p><ul><li><p><strong>Tools</strong>: Advanced features including annotations, requesting user input (elicitation), LLM assistance (sampling), progress notifications, cancellation, and structured return types</p></li><li><p><strong>Resources</strong>: Defining server resources, client operations (list/read/subscribe), and real-time update notifications</p></li><li><p><strong>Prompts</strong>: Creating reusable LLM prompt templates that can modify host application behavior without changes to client/host application code</p></li><li><p><strong>Roots</strong>: Understanding client-suggested operation boundaries</p></li></ul><p>As done previously, we&#8217;ll use the Python SDK to illustrate these concepts, but the principles apply across languages. This time around we will use the <a href="https://github.com/modelcontextprotocol/python-sdk?tab=readme-ov-file#low-level-server">low-level API</a>, which provides more flexibility and control.</p><p>All of the code for this tutorial is available at the end of the article.</p><blockquote><p><a href="https://newsletter.victordibia.com/p/autogen-studio-v04-a-no-code-tool">AutoGen Studio</a> now has an MCP playground feature that lets users test out these interactive tool capabilities (streaming progress notifications, elicitation, sampling). 
See video below.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;cd9e0b67-159e-4dde-822a-7c85b97eb820&quot;,&quot;duration&quot;:null}"></div><h2>Defining an MCP Server in the low-level Python SDK API</h2><p>The MCP Python <a href="https://github.com/modelcontextprotocol/python-sdk?tab=readme-ov-file#low-level-server">low-level API</a> provides more flexibility and control (but with more code) compared to the high-level API. Many production systems may require this level of control, especially related to how resources, authentication, and transport security are implemented.</p><pre><code><code>from mcp.server import Server

class AdvancedMCPServer(Server):
    """Advanced MCP server with tools, resources, and prompts."""
    
    def __init__(self, name: str = "advanced_mcp_server"):
        super().__init__(name)
        # Handlers will be defined in __init__ using decorators
</code></code></pre><p>On the client side, we can write a simple client that connects to this server and prints out available tools.</p><pre><code><code>from mcp.client.session import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def test_client():
    server_url = "http://127.0.0.1:8006/mcp"
    
    async with streamablehttp_client(server_url) as (read_stream, write_stream, get_session_id):
        async with ClientSession(read_stream, write_stream) as session:
            # Initialize connection
            result = await session.initialize()
            print(f"Connected to: {result.serverInfo.name}")
            
            # List available tools
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])
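
            # For illustration (requires a running server): invoke a tool by
            # name - here the "travel_agent" tool that this article defines
            result = await session.call_tool("travel_agent", {"destination": "Tokyo"})
            for block in result.content:
                if block.type == "text":
                    print("Tool output:", block.text)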
</code></code></pre><p>We will build on this to implement advanced tool capabilities, resources, prompts, and more.</p><h2>Tools</h2><p>Tools in MCP servers can be thought of as functions that clients can call (typically driven by an LLM) - anything from a simple calculator to a full data analysis pipeline. Underneath, a tool call is a request made by the client to the server. Each request includes the tool call data structure and, importantly, a unique request ID. This ID is crucial for tracking the request, especially for long-running operations. </p><blockquote><p><strong>Tool Calls can be Long-Running</strong><br>MCP tools can be long-running processes that interact with users and systems over time.<br>Examples: Research agents that analyze data for hours while asking for user input, deployment pipelines that seek approval before critical steps, or data processing jobs that send status updates overnight.</p></blockquote><p>A tool can be defined by creating a function on our server and decorating it with <code>@self.list_tools()</code> to make it discoverable, and another function that handles the tool call with the <code>@self.call_tool()</code> decorator.</p><p>The code below shows a simple example that lists a <code>travel_agent</code> tool for booking trips:</p><pre><code><code>@self.list_tools()
async def handle_list_tools() -&gt; list[Tool]:
    """List available tools."""
    return [
        Tool(
            name="travel_agent",
            description="Book a travel trip with progress updates and price confirmation",
            inputSchema={
                "type": "object", 
                "properties": {
                    "destination": {
                        "type": "string", 
                        "description": "Travel destination",
                        "default": "Paris"
                    }
                }
            }
        )
    ]

@self.call_tool()
async def handle_call_tool(name: str, args: dict) -&gt; list[TextContent]:
    """Handle tool execution."""
    if name == "travel_agent":
        destination = args.get("destination", "Paris")
        result = f"&#9989; Trip booked successfully to {destination}!"
        return [TextContent(type="text", text=result)]
    else:
        raise ValueError(f"Unknown tool: {name}")
</code></code></pre><h3>Tool Annotations</h3><p>Tools can include metadata (annotations) such as <code>readOnlyHint</code>, <code>destructiveHint</code>, <code>idempotentHint</code>, and <code>openWorldHint</code>. These help host applications and users understand what a tool does and <em>how</em> it should be presented in the UI. For example, a tool that deletes files should have <code>destructiveHint: true</code>.</p><p>We can annotate our tool using the following code:</p><pre><code><code>Tool(
    name="travel_agent",
    description="Book a travel trip with progress updates and price confirmation",
    inputSchema={
        "type": "object",
        "properties": {
            "destination": {
                "type": "string", 
                "description": "Travel destination",
                "default": "Paris"
            }
        }
    },
    annotations=ToolAnnotations(
        title="Travel Booking Agent",
        readOnlyHint=False,     # Modifies booking state
        destructiveHint=False,  # Safe, doesn't delete data
        idempotentHint=False,   # Each booking is unique
        openWorldHint=True      # Interacts with external systems
    )
)
</code></code></pre><p><strong>Note:</strong> The list of tools can change during a session. Servers send <code>notifications/tools/list_changed</code> when tools are added or removed. Clients should refresh their tool list when receiving this notification.</p><h3>Requesting (User) Input During Tool Calls</h3><p>Tools can pause execution to request additional input primarily through the Elicitation feature in MCP. Elicitation allows tools to request structured input from users. Here's how to use it within a tool implementation:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!rDTp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!rDTp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png 424w, https://substackcdn.com/image/fetch/$s_!rDTp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png 848w, https://substackcdn.com/image/fetch/$s_!rDTp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png 1272w, https://substackcdn.com/image/fetch/$s_!rDTp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!rDTp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png" width="1456" height="441" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:441,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:92850,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://newsletter.victordibia.com/i/167856675?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!rDTp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png 424w, https://substackcdn.com/image/fetch/$s_!rDTp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png 848w, https://substackcdn.com/image/fetch/$s_!rDTp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png 1272w, https://substackcdn.com/image/fetch/$s_!rDTp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0934cfb7-d6f0-4a95-84f9-caa9358ad8a6_3463x1049.png 
1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><pre><code><code>@self.call_tool()
async def handle_call_tool(name: str, args: dict) -&gt; list[TextContent]:
    """Handle tool execution."""
    ctx = self.request_context  # Get the request context
    
    if name == "travel_agent":
        destination = args.get("destination", "Paris")
        
        try:
            # Request user confirmation via elicitation
            elicit_result = await ctx.session.elicit(
                message=f"Please confirm the estimated price of $1200 for your trip to {destination}",
                requestedSchema=PriceConfirmationSchema.model_json_schema(),
                related_request_id=ctx.request_id,
            )
            
            if elicit_result and elicit_result.action == "accept":
                # User confirmed, continue booking
                result = f"&#9989; Trip booked successfully to {destination}!"
                return [TextContent(type="text", text=result)]
            else:
                # User declined or cancelled
                return [TextContent(type="text", text="Booking cancelled")]
                
        except Exception as e:
            # Handle elicitation failures gracefully: fall back to a default
            # response instead of letting the tool call return nothing
            logger.info(f"Elicitation request failed: {e}")
            return [TextContent(type="text", text="Booking not completed: price confirmation unavailable")]
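
# Note: PriceConfirmationSchema above is not part of the MCP SDK - it is a
# user-defined Pydantic model (field names here are illustrative) whose JSON
# schema tells the client what structured input to collect from the user
from pydantic import BaseModel, Field

class PriceConfirmationSchema(BaseModel):
    confirm: bool = Field(description="Accept the quoted price?")
    notes: str = Field(default="", description="Optional notes for the booking")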
</code></code></pre><p>Tools can also request LLM completions via the Sampling feature in MCP:</p><pre><code><code>@self.call_tool()
async def handle_call_tool(name: str, args: dict) -&gt; list[TextContent]:
    """Handle tool execution."""
    ctx = self.request_context
    
    if name == "research_agent":
        topic = args.get("topic", "AI trends")
        
        try:
            # Request AI assistance during tool execution
            sampling_result = await ctx.session.create_message(
                messages=[
                    SamplingMessage(
                        role="user",
                        content=TextContent(type="text", text=f"Please summarize the key findings for research on: {topic}")
                    )
                ],
                max_tokens=100,
                related_request_id=ctx.request_id,
            )
            
            if sampling_result and sampling_result.content:
                summary = sampling_result.content.text
                result = f"&#128269; Research on '{topic}' completed!\n\nKey Findings: {summary}"
                return [TextContent(type="text", text=result)]
                
        except Exception as e:
            logger.info(f"Sampling request failed: {e}")

        # Fallback when sampling is unavailable or returned no content
        return [TextContent(type="text", text=f"Research on '{topic}' completed (LLM summary unavailable)")]
</code></code></pre><p></p><h3>Tool Progress Notifications</h3><p>For long-running operations, tools can send progress updates. Here's how to integrate progress notifications into your tool implementation:</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9O_A!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9O_A!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png 424w, https://substackcdn.com/image/fetch/$s_!9O_A!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png 848w, https://substackcdn.com/image/fetch/$s_!9O_A!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png 1272w, https://substackcdn.com/image/fetch/$s_!9O_A!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9O_A!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png" width="1456" height="250" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:67847,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://newsletter.victordibia.com/i/167856675?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!9O_A!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png 424w, https://substackcdn.com/image/fetch/$s_!9O_A!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png 848w, https://substackcdn.com/image/fetch/$s_!9O_A!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png 1272w, https://substackcdn.com/image/fetch/$s_!9O_A!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa38f07e8-8f9f-40f0-b495-888c5218d833_3483x598.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><pre><code><code>@self.call_tool()
async def handle_call_tool(name: str, args: dict) -&gt; list[TextContent]:
    """Handle tool execution."""
    ctx = self.request_context
    
    if name == "travel_agent":
        destination = args.get("destination", "Paris")
        
        # Define steps for progress tracking
        steps = [
            "Checking flights...",
            "Finding available dates...", 
            "Confirming prices...",
            "Booking flight..."
        ]
        
        for i, step in enumerate(steps):
            # Send progress updates during tool execution
            await ctx.session.send_progress_notification(
                progress_token=ctx.request_id,
                progress=i * 25,
                total=100,
                message=step,
                related_request_id=str(ctx.request_id)
            )
            
            # Simulate work being done
            await anyio.sleep(2)
        
        # Final progress update
        await ctx.session.send_progress_notification(
            progress_token=ctx.request_id,
            progress=100,
            total=100,
            message="Trip booked successfully"
        )
        
        return [TextContent(type="text", text=f"&#9989; Trip booked successfully to {destination}!")]
</code></code></pre><h3>Tool Cancellation</h3><p>Tools can be cancelled mid-execution. Each tool call has a unique request ID that clients can use to send cancellation requests. The server should handle cancellation gracefully and clean up any ongoing operations.</p><p>On the server side, tools should be designed to handle cancellation gracefully, checking for cancellation during long-running operations. On the client side, cancellation is typically handled through asyncio task cancellation:</p><pre><code><code>from mcp.client.session import ClientSession
from mcp.client.streamable_http import streamablehttp_client
import asyncio

async def cancel_tool_example():
    server_url = "http://127.0.0.1:8006/mcp"
    
    async with streamablehttp_client(server_url) as (read_stream, write_stream, get_session_id):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            
            # Start a long-running tool
            async def call_long_tool():
                return await session.call_tool("long_running_agent", {})
            
            # Create task for the tool call
            tool_task = asyncio.create_task(call_long_tool())
            
            # Wait briefly, then cancel
            await asyncio.sleep(5)
            tool_task.cancel()
            
            try:
                await tool_task
            except asyncio.CancelledError:
                print("Tool call cancelled successfully")
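
# Server-side sketch (hypothetical helper, not an SDK API): structure
# long-running tool work as awaitable steps, so a cancellation request -
# which surfaces at an await point as CancelledError - can trigger cleanup
# before the exception propagates
async def run_cancellable_steps(steps):
    completed = []
    try:
        for step in steps:
            await asyncio.sleep(0)  # yield point where cancellation can land
            completed.append(step())
    except asyncio.CancelledError:
        completed.clear()  # roll back partial work before re-raising
        raise
    return completed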
</code></code></pre><p>This is particularly useful for long-running operations where users may want to stop the process.</p><h3>Tool Return Types</h3><p>Based on the MCP specification, tools can return flexible content types in their responses:</p><ul><li><p><strong>Unstructured Content</strong>: Text, Image (base64 with MIME type), Audio, Resource Links, and Embedded Resources</p></li><li><p><strong>Structured Content</strong>: Optional <code>structuredContent</code> field containing structured data (JSON), which should also be included as serialized JSON in a TextContent block for backwards compatibility</p></li><li><p><strong>Error State</strong>: The <code>isError</code> boolean flag indicates whether the response represents an error</p></li><li><p><strong>Metadata</strong>: Optional <code>_meta</code> field for additional annotations and information</p></li></ul><p>Example tool result structure:</p><pre><code><code>{
  "content": [
    {
      "type": "text",
      "text": "Analysis complete: Temperature is 22.5&#176;C"
    }
  ],
  "structuredContent": {
    "temperature": 22.5,
    "unit": "celsius",
    "conditions": "Partly cloudy"
  },
  "isError": false
}
</code></code></pre><ul><li><p><strong>Output Schema</strong>: Tools can provide an optional output schema to validate structured results and help clients understand the expected response structure. When an output schema is provided:</p><ul><li><p>Servers MUST provide results conforming to the schema</p></li><li><p>Clients SHOULD validate results against the schema</p></li><li><p>The schema guides LLMs in parsing tool outputs and improves type safety</p></li></ul></li></ul><p>Example tool definition with output schema:</p><pre><code><code>{
  "name": "get_weather",
  "description": "Get current weather data",
  "inputSchema": {
    "type": "object",
    "properties": { "location": { "type": "string" } }
  },
  "outputSchema": {
    "type": "object",
    "properties": {
      "temperature": { "type": "number" },
      "conditions": { "type": "string" },
      "humidity": { "type": "number", "minimum": 0, "maximum": 100 }
    },
    "required": ["temperature", "conditions"]
  }
}
</code></code></pre><p>To use the <code>structuredContent</code> field effectively, tools should define output schemas that clients can use for validation and type checking.</p><h2>Resources</h2><p>Resources in MCP are how you expose data: files, database records, API responses, logs, images, and more. Each resource is identified by a unique URI (e.g., <code>file:///data/report.csv</code>, <code>postgres://db/table</code>). Resources can be text (UTF-8) or binary (base64-encoded).</p><p>Clients can discover resources via <code>resources/list</code> or by using URI templates for dynamic resources. To read a resource, clients send a <code>resources/read</code> request with the resource URI. Servers can also notify clients when resources change, and clients can subscribe to updates for real-time workflows.</p><p>We can define a resource in the server by creating a resource handler:</p><pre><code><code>@self.list_resources()
async def handle_list_resources() -&gt; list[Resource]:
    """List available resources."""
    return [
        Resource(
            uri=AnyUrl("research://data/sources"),
            name="Research Data Sources",
            description="Collection of research sources and references",
            mimeType="application/json"
        )
    ]

@self.read_resource()
async def handle_read_resource(uri: AnyUrl) -&gt; list[ReadResourceContents]:
    """Read resource content based on URI."""
    uri_str = str(uri)
    
    if uri_str == "research://data/sources":
        # Mock research data (serialize with json.dumps so the payload is
        # guaranteed-valid JSON, rather than patching str(dict) output)
        import json  # in real code, import at module top
        research_data = {
            "sources": [
                {"title": "AI Trends 2024", "url": "https://example.com/ai-trends"}
            ],
            "last_updated": "2024-01-15T10:30:00Z"
        }
        return [ReadResourceContents(
            content=json.dumps(research_data),
            mime_type="application/json"
        )]
    else:
        raise ValueError(f"Unknown resource: {uri_str}")
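</code></code></pre><p>The handler above serves a single fixed URI. The prose earlier mentions URI templates for dynamic resources; as a sketch (hand-rolled matching with a hypothetical template syntax, not the SDK's own template support), extracting parameters from a templated URI could look like:</p><pre><code><code>def match_uri_template(template, uri):
    """Extract parameters from a URI given a template like 'research://data/{category}'.

    Returns a dict of parameter values, or None if the URI does not match."""
    t_parts = template.split("/")
    u_parts = uri.split("/")
    if len(t_parts) != len(u_parts):
        return None
    params = {}
    for t, u in zip(t_parts, u_parts):
        if t.startswith("{") and t.endswith("}"):
            params[t[1:-1]] = u  # template placeholder: capture this segment
        elif t != u:
            return None  # literal segment mismatch
    return params

print(match_uri_template("research://data/{category}", "research://data/ai"))  # prints {'category': 'ai'}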
</code></code></pre><p>Clients can interact with resources through several operations:</p><pre><code><code>from mcp.client.session import ClientSession
from pydantic import AnyUrl

async def resource_client_example(client_session: ClientSession):
    # List available resources
    resources_result = await client_session.list_resources()
    print(f"Available resources: {resources_result.resources}")
    
    # Read a specific resource
    resource_uri = AnyUrl("research://data/sources")
    resource_content = await client_session.read_resource(resource_uri)
    print(f"Resource content: {resource_content.contents}")
    
    # Subscribe to resource updates
    await client_session.subscribe_resource(resource_uri)
    
    # Later, unsubscribe when no longer needed
    await client_session.unsubscribe_resource(resource_uri)
</code></code></pre><p>Servers can notify subscribed clients when resources change. On the server side, you can send notifications:</p><pre><code><code># In a tool or other server operation that modifies a resource
async def handle_call_tool(name: str, args: dict) -&gt; list[TextContent]:
    ctx = self.request_context
    
    if name == "update_data":
        # Perform the update...
        
        # Notify subscribed clients about the resource change
        await ctx.session.send_resource_updated(
            uri=AnyUrl("research://data/sources")
        )
        
        return [TextContent(type="text", text="Data updated successfully")]
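</code></code></pre><p><code>send_resource_updated</code> here notifies the current session. If your server needs to track many subscribers itself, the bookkeeping can be as simple as an in-memory registry (illustrative only; the SDK handles the subscribe/unsubscribe requests on the wire, while deciding which sessions to notify is up to your server):</p><pre><code><code>class SubscriptionRegistry:
    """Track which session ids are subscribed to which resource URIs."""

    def __init__(self):
        self._subs = {}  # maps each uri to a set of session ids

    def subscribe(self, uri, session_id):
        self._subs.setdefault(uri, set()).add(session_id)

    def unsubscribe(self, uri, session_id):
        self._subs.get(uri, set()).discard(session_id)

    def subscribers(self, uri):
        return sorted(self._subs.get(uri, set()))

registry = SubscriptionRegistry()
registry.subscribe("research://data/sources", "session-1")
registry.subscribe("research://data/sources", "session-2")
registry.unsubscribe("research://data/sources", "session-2")
print(registry.subscribers("research://data/sources"))  # prints ['session-1']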
</code></code></pre><p>Clients can handle these notifications by setting up a message handler:</p><pre><code><code>from mcp import types

async def handle_notifications(message):
    if isinstance(message, types.ServerNotification):
        match message.root:
            case types.ResourceUpdatedNotification(params=params):
                print(f"Resource updated: {params.uri}")
                # Refresh the resource content
            case types.ResourceListChangedNotification():
                print("Resource list changed - refreshing available resources")
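</code></code></pre><p>A common client-side use for these notifications is cache invalidation: keep resource contents in a local cache and drop an entry when its update notification arrives. A sketch, with plain strings standing in for the SDK types:</p><pre><code><code>class ResourceCache:
    """Cache resource contents by URI; invalidate entries on update notifications."""

    def __init__(self):
        self._cache = {}

    def put(self, uri, content):
        self._cache[uri] = content

    def get(self, uri):
        return self._cache.get(uri)

    def on_resource_updated(self, uri):
        # Drop the stale entry; the next get() miss triggers a fresh read_resource.
        self._cache.pop(uri, None)

cache = ResourceCache()
cache.put("research://data/sources", '{"sources": []}')
cache.on_resource_updated("research://data/sources")
print(cache.get("research://data/sources"))  # prints None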
</code></code></pre><blockquote><p>Tip<br>When working with resources, use descriptive URIs and set appropriate MIME types for better client compatibility. Handle errors gracefully and consider supporting subscriptions for frequently changing resources to enable real-time applications.</p></blockquote><h2>Prompts</h2><p>Prompts are reusable templates for LLM interactions, defined on the server and surfaced to clients. Each prompt has a name, description, and optional arguments. Prompts can accept dynamic arguments, embed resource context, and support multi-step workflows.</p><p>Clients discover prompts via <code>prompts/list</code> and retrieve them with <code>prompts/get</code>. Prompts are especially useful for standardizing common LLM tasks (e.g., "summarize this file", "generate a commit message") and can be improved on the server side without changing the host application.</p><p>Example prompt definition:</p><pre><code><code>{
  "name": "explain-code",
  "description": "Explain how code works",
  "arguments": [
    { "name": "code", "description": "Code to explain", "required": true },
    {
      "name": "language",
      "description": "Programming language",
      "required": false
    }
  ]
}
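</code></code></pre><p>Given a definition like this, arguments can be checked before the prompt is rendered. A minimal, hand-rolled checker (illustrative; not an SDK API):</p><pre><code><code>def check_prompt_arguments(definition, supplied):
    """Return the names of required arguments missing from the supplied dict."""
    missing = []
    for arg in definition.get("arguments", []):
        if arg.get("required") and arg["name"] not in supplied:
            missing.append(arg["name"])
    return missing

explain_code = {
    "name": "explain-code",
    "arguments": [
        {"name": "code", "required": True},
        {"name": "language", "required": False},
    ],
}

print(check_prompt_arguments(explain_code, {"language": "python"}))  # prints ['code']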
</code></code></pre><p>To define a prompt on the server, we can create a prompt handler:</p><pre><code><code>@self.list_prompts()
async def handle_list_prompts() -&gt; list[Prompt]:
    """List available prompt templates."""
    return [
        Prompt(
            name="task_summary",
            description="Generate a summary for any completed task",
            arguments=[
                PromptArgument(
                    name="task_name",
                    description="Name of the completed task",
                    required=True
                ),
                PromptArgument(
                    name="outcome",
                    description="The result or outcome of the task",
                    required=False
                )
            ]
        )
    ]

@self.get_prompt()
async def handle_get_prompt(name: str, arguments: dict[str, str] | None) -&gt; GetPromptResult:
    """Generate prompt content based on template name and arguments."""
    if name != "task_summary":
        raise ValueError(f"Unknown prompt: {name}")
    
    if arguments is None:
        arguments = {}
    
    task_name = arguments.get("task_name", "Unknown Task")
    outcome = arguments.get("outcome", "task completed successfully")
    
    prompt_text = f"""Please create a concise summary for the following completed task:

Task: {task_name}
Outcome: {outcome}

Please provide:
1. What was accomplished
2. Key results or deliverables
3. Any important observations or lessons learned

Keep the summary brief and professional."""
    
    return GetPromptResult(
        description=f"Task summary prompt for {task_name}",
        messages=[
            PromptMessage(
                role="user",
                content=TextContent(type="text", text=prompt_text)
            )
        ]
    )
</code></code></pre><blockquote><p>Tip<br>When creating prompts, use clear names and detailed descriptions, validate arguments properly, and consider versioning prompt templates for backward compatibility.</p></blockquote><h2>Roots</h2><p>Roots are URIs (like file paths or URLs) that a client suggests to a server as the boundaries or focus areas for operations. When a client connects, it can declare support for roots and provide a list of relevant roots (e.g., project directories, API endpoints). Servers should respect these roots, using them to locate and prioritize resources, but roots are informational&#8212;not strictly enforced.</p><p><strong>Common use cases:</strong></p><ul><li><p>Defining project directories or repository locations</p></li><li><p>Specifying API endpoints or configuration boundaries</p></li></ul><p>Example roots declaration:</p><pre><code><code>{
  "roots": [
    {
      "uri": "file:///home/user/projects/frontend",
      "name": "Frontend Repository"
    },
    { "uri": "https://api.example.com/v1", "name": "API Endpoint" }
  ]
}
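</code></code></pre><p>Because roots are informational, a well-behaved server enforces the boundary itself before touching a path. A sketch of scoping file URIs to the declared <code>file://</code> roots (naive prefix matching; a real implementation should also normalize paths, e.g. resolve ".." segments, before comparing):</p><pre><code><code>def is_within_roots(uri, roots):
    """Return True if a file:// URI falls under one of the declared file:// roots."""
    for root in roots:
        root_uri = root["uri"]
        # Exact match, or a path strictly under the root directory.
        if uri == root_uri:
            return True
        if root_uri.startswith("file://") and uri.startswith(root_uri.rstrip("/") + "/"):
            return True
    return False

roots = [{"uri": "file:///home/user/projects/frontend", "name": "Frontend Repository"}]

print(is_within_roots("file:///home/user/projects/frontend/src/app.ts", roots))  # prints True
print(is_within_roots("file:///etc/passwd", roots))  # prints False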
</code></code></pre><blockquote><p>Tip:<br>When working with roots, only suggest necessary ones and use clear, descriptive names. Monitor accessibility and handle changes gracefully since clients rely on these URIs for scoping operations.</p></blockquote>
      <p>
          <a href="https://newsletter.victordibia.com/p/mcp-for-software-engineers-part-2">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[MCP For Software Engineers | Part 1 : Building Your First Server]]></title><description><![CDATA[#44 | New to MCP? Heres how to build your first MCP server and a Host Application that integrates and uses the server.]]></description><link>https://newsletter.victordibia.com/p/mcp-for-software-engineers-part-1</link><guid isPermaLink="false">https://newsletter.victordibia.com/p/mcp-for-software-engineers-part-1</guid><dc:creator><![CDATA[Victor Dibia, PhD]]></dc:creator><pubDate>Wed, 02 Jul 2025 15:31:12 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Ct1Z!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Ct1Z!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 424w, https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 848w, https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 1272w, 
https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png" width="1456" height="880" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:880,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:185668,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://newsletter.victordibia.com/i/167292158?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 424w, https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 848w, 
https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 1272w, https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>When I <a href="https://newsletter.victordibia.com/p/no-mcps-have-not-won-yet">first tried out</a> the Model Context Protocol 
(MCP) from Anthropic (March 2025), the developer experience was <em><strong>rough</strong></em>. Integrating and bundling MCP with existing applications was challenging, and there were a few security pitfalls that developers had to navigate. I wrote about that experience <a href="https://newsletter.victordibia.com/p/no-mcps-have-not-won-yet">here</a>. </p><blockquote><p>If you are visually inclined, here is a <a href="https://youtu.be/-yrqkwZr3Nc?si=ISuvOoOcGRPYuYJP">video walkthrough of this post.</a></p></blockquote><blockquote><p><a href="https://modelcontextprotocol.io/specification/2025-06-18">Model Context Protocol (MCP) is a standard</a> for how AI applications connect to tools and data sources. Simply put: your AI app needs to call tools, monitor requests, handle prompts, and get user approval. MCP standardizes all of this. </p></blockquote><p>However, like all <em>good</em> standards or protocols, MCP has evolved and <em><strong>gotten better (see the recent <a href="https://modelcontextprotocol.io/specification/2025-06-18/changelog">changelog</a>, including fixes to SDKs, improved support for remote servers, and improved auth)</strong></em>. And I can <em>now</em> see it solving several critical problems faced by teams building AI applications. </p><ul><li><p><strong>Integration</strong>: Without MCP, Team X spends weeks integrating Team Z's new tool. With MCP, Team X announces day-one support for any MCP tool (including Team Z&#8217;s new capabilities!).</p></li><li><p><strong>Distribution</strong>: Without MCP, Team X writes a Cursor extension, a Windsurf plugin, a VSCode extension, a Claude Desktop add-on, and more. With MCP, Team X writes one MCP server that works everywhere.</p></li><li><p><strong>Discovery</strong>: Without MCP, teams ask "Does anyone have a tool that does X?" 
With MCP, there's a central registry where teams publish and find tools.</p></li><li><p><strong>Security</strong>: Without MCP, each team implements (or skips) their own security/auth for the tools or resources that LLMs use. With MCP, it's possible to implement centralized auth and a managed registry of MCP servers. </p></li><li><p><strong>Runtime Flexibility</strong>: Without MCP, you're stuck with hard-coded tool configurations. With MCP, tools can be dynamically discovered based on context. Also, it can be helpful to have aspects of the application logic (e.g., tool execution on remote MCP servers) managed by an MCP server as opposed to running within the host application.</p><p></p></li></ul><p>In this tutorial (part 1 of a series on <a href="https://newsletter.victordibia.com/s/mcp-for-software-engineers">MCP for Software Engineers</a>), we will cover the following: </p><ul><li><p>Building an MCP server that exposes tools (fetch news from TechCrunch)</p></li><li><p>Creating a client to connect to the server</p></li><li><p>Building a host application that uses an LLM to translate user requests to tool calls on the MCP server.</p></li><li><p>Choosing between stdio and Streamable HTTP transports for MCP</p></li><li><p>Bonus: how to use the MCP server we create in VSCode (or any other tool) </p></li></ul><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;e485e6ad-6c99-4a2a-98f2-c7ee78c93b5e&quot;,&quot;caption&quot;:&quot;In Part 1 of this series, we built a simple MCP server with a single tool using the high-level server API in the Python MCP SDK, and demonstrated how to connect to this server via a host application that implements an MCP client.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;sm&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;MCP For Software Engineers | Part 2: Interactive &amp; Long-Running Tools (Progress streaming, User Input, Cancellation), Resources &amp; 
Prompts&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:85692678,&quot;name&quot;:&quot;Victor Dibia, PhD&quot;,&quot;bio&quot;:&quot;Hacker, Research Scientist (Microsoft Research), Author working on and writing about Generative AI, Agents. Core maintainer for the AutoGen Multi-Agent Framework (35k Stars on GitHub). Previously at Cloudera, IBM Research. All views are my own.&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/27536f63-7d8e-48dc-b34c-150acfacdc8b_1726x1396.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2025-08-01T13:29:53.324Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!cRS7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9b2419e0-ff4c-48cb-870c-2c74df3c5f54_2723x1889.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://newsletter.victordibia.com/p/mcp-for-software-engineers-part-2&quot;,&quot;section_name&quot;:&quot;MCP For Software Engineers&quot;,&quot;video_upload_id&quot;:null,&quot;id&quot;:167856675,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:7,&quot;comment_count&quot;:0,&quot;publication_id&quot;:null,&quot;publication_name&quot;:&quot;Designing with AI&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!1FgP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fff160652-8bbb-475e-80a7-ba4eb5f80dcb_504x504.png&quot;,&quot;belowTheFold&quot;:false,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><blockquote><p>Note: This series is not for those seeking to learn about the "<em>latest productivity hacks with MCP in some existing host application (Cursor, Windsurf etc)</em>." 
</p></blockquote><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!r1_H!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!r1_H!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png 424w, https://substackcdn.com/image/fetch/$s_!r1_H!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png 848w, https://substackcdn.com/image/fetch/$s_!r1_H!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png 1272w, https://substackcdn.com/image/fetch/$s_!r1_H!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!r1_H!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png" width="1456" height="839" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:839,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:377128,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://newsletter.victordibia.com/i/167292158?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!r1_H!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png 424w, https://substackcdn.com/image/fetch/$s_!r1_H!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png 848w, https://substackcdn.com/image/fetch/$s_!r1_H!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png 1272w, https://substackcdn.com/image/fetch/$s_!r1_H!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffef6df6b-6387-4261-9239-d7521d081ad1_1568x903.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Screenshot of the server we will build, shown in the AutoGen Studio MCP Playground</figcaption></figure></div><div><hr></div><p></p><div id="youtube2--yrqkwZr3Nc" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;-yrqkwZr3Nc&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/-yrqkwZr3Nc?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><div><hr></div><h2>Key MCP Concepts in Brief</h2><p>MCP has an excellent and well maintained <a href="https://modelcontextprotocol.io/introduction">documentation site</a>. 
In brief, here are key concepts to get started. </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Ct1Z!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 424w, https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 848w, https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 1272w, https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png" width="1456" height="880" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:880,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:185668,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://newsletter.victordibia.com/i/167292158?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 424w, https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 848w, https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 1272w, https://substackcdn.com/image/fetch/$s_!Ct1Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc644e807-9e03-483c-abee-d8586149fc9c_2681x1621.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg 
role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><ul><li><p><strong>Server:</strong> MCP servers expose capabilities through a standardized interface. A single server can provide multiple tools (functions to call), resources (data to read), prompts (templates for LLM interactions), and sampling (request LLM completions from the client).</p></li><li><p><strong>Client: </strong>MCP clients maintain 1:1 connections with servers and handle the protocol communication. Hosts embed clients to talk to servers.</p></li><li><p><strong>Host</strong> MCP hosts are user-facing applications like Claude Desktop, Cursor, or VSCode. 
They use clients to connect to servers and decide which tools to call based on user needs.</p></li><li><p><strong>Transport</strong>: MCP uses JSON-RPC 2.0 with <a href="https://modelcontextprotocol.io/docs/concepts/transports">two transport options</a> - <em><strong>stdio</strong></em> and <em><strong>streamable HTTP</strong></em>. <strong>Stdio</strong> runs the server as a subprocess using standard input/output - ideal for local integrations where the server runs as a subprocess of the client (e.g., IDE extensions, local development tools). <strong>Streamable HTTP</strong> uses network requests - better for web applications, distributed systems, multiple clients connecting to one server, and easier debugging/monitoring.</p></li></ul><p>For this tutorial, we'll use the <strong>Streamable HTTP</strong> transport, as it provides a better learning experience with clearer separation of concerns (you can run the server on a remote machine) and easier debugging.</p><div><hr></div><h2>Building Your First MCP Server</h2><p>Now that we've covered the core concepts, let's build your first MCP server. We'll use the <a href="https://github.com/modelcontextprotocol/python-sdk">Python MCP SDK</a>, which is mature and widely adopted, but the same concepts apply to other languages.</p><blockquote><p>Protocol vs SDK?<br>MCP mostly defines <a href="https://modelcontextprotocol.io/specification/2025-06-18">a protocol or standard</a> - essentially a set of rules that say what clients and servers MUST, SHOULD, or SHALL NOT do in order to communicate. SDKs are implementations of those rules.<br>While you can build your own compliant servers/clients, in general, it is recommended that you use an SDK for standardized behavior where possible.</p></blockquote><p>Our goal: build a tool that can answer news-related queries like "What is the latest AI news on TechCrunch?"</p><h3>1. 
Set Up Your Project</h3><p>Create a new Python project and install the MCP SDK.</p><p>Using <code>uv</code> (recommended):</p><pre><code><code>uv init mcp-news-demo
cd mcp-news-demo
uv add "mcp[cli]"</code></code></pre><p>Using pip:</p><pre><code><code>pip install "mcp[cli]"</code></code></pre><h3>2. Create the MCP Server</h3><p>Let's create a simple MCP server with a tool that fetches TechCrunch news. </p><pre><code><code># server.py
import os
from mcp.server.fastmcp import FastMCP
import requests

mcp = FastMCP(
    "TechCrunch News Server", 
    host=os.environ.get("MCP_SERVER_HOST", "localhost"), 
    port=int(os.environ.get("MCP_SERVER_PORT", 8011))
)

@mcp.tool(title="Fetch from TechCrunch")
def fetch_from_techcrunch(category: str = "latest") -&gt; str:
    """Fetch the latest news from TechCrunch for a given category."""
    allowed = {"ai", "startup", "security", "venture", "latest"}
    cat = category.lower()
    
    if cat not in allowed:
        cat = "latest"
    
    url = f"https://techcrunch.com/tag/{cat}/" if cat != "latest" else "https://techcrunch.com/"
    
    try:
        response = requests.get(url, timeout=10)  # avoid hanging the tool call on a slow response
        if response.ok:
            try:
                from bs4 import BeautifulSoup
                soup = BeautifulSoup(response.text, "html.parser")
                text = soup.get_text(separator=' ', strip=True)
                return text[:1000] + ("..." if len(text) &gt; 1000 else "")
            except ImportError:
                return response.text[:1000] + ("..." if len(response.text) &gt; 1000 else "")
        return "Failed to fetch news."
    except Exception as e:
        return f"Error fetching news: {str(e)}"

if __name__ == "__main__":
    mcp.run(transport="streamable-http")
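</code></code></pre><p>As an aside, the same server can run over the <em>stdio</em> transport instead - useful when a local host (e.g., an IDE extension) launches the server as a subprocess. A minimal sketch under that assumption; no host/port is needed and only the final line changes:</p><pre><code><code># stdio variant of server.py (sketch) - the host spawns this script and
# communicates with it over stdin/stdout
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("TechCrunch News Server")

# ... same @mcp.tool definitions as above ...

if __name__ == "__main__":
    mcp.run(transport="stdio")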
</code></code></pre><h3>3. Run Your Server</h3><p>Start your server:</p><pre><code><code>python server.py</code></code></pre><p>You should see:</p><pre><code><code>INFO: Started server process [65618]
INFO: Waiting for application startup.
[07/01/25 12:40:34] INFO StreamableHTTP session manager started
INFO: Application startup complete.
INFO: Uvicorn running on http://localhost:8011 (Press CTRL+C to quit)</code></code></pre><p>Your server is now running at <code>http://localhost:8011/mcp</code> and ready to accept requests.</p><blockquote><p><strong>Note</strong>: This example uses the high-level FastMCP API with Streamable HTTP transport. For advanced use cases, check the <a href="https://github.com/modelcontextprotocol/python-sdk">MCP Python SDK documentation</a> for the low-level API.</p></blockquote><div><hr></div><h2>Building Your First MCP Client</h2><p>MCP clients connect to servers and handle communication. They operate within "sessions" - logical groupings of requests and responses. Let's build a client that connects to our TechCrunch server, using the same Streamable HTTP transport as the server:</p><pre><code><code># client.py
import asyncio
import os
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def run_client():
    # Connect to the HTTP MCP server
    host = os.environ.get("MCP_SERVER_HOST", "localhost")
    port = os.environ.get("MCP_SERVER_PORT", "8011")
    server_url = f"http://{host}:{port}/mcp"
    
    async with streamablehttp_client(server_url) as (read, write, _):
        async with ClientSession(read, write) as session:
            # Initialize the connection
            await session.initialize()
            
            # List available tools
            tools_response = await session.list_tools()
            print("Available tools:")
            for tool in tools_response.tools:
                print(f"- {tool.name}: {tool.description}")
            
            # Call a tool
            result = await session.call_tool(
                "fetch_from_techcrunch",
                arguments={"category": "ai"}
            )
            print(f"\nTool result: {result.content}")

if __name__ == "__main__":
    asyncio.run(run_client())
</code></code></pre><p>You should now see a list of tools and the result of a tool call made with <code>session.call_tool</code>:</p><pre><code>Available tools:
- fetch_from_techcrunch: Fetch the latest news from TechCrunch for a given category.
Tool result: [TextContent(type='text', text='AI | TechCrunch AI | TechCrunch TechCrunch Desktop Logo TechCrunch Mobile Logo Latest Startups Venture Apple Security AI Apps Events Podcasts Newsletters Search Submit Site Search Toggle Mega Menu Toggle Topics Latest AI Amazon  ...
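</code></pre><p>Note that <code>result.content</code> is a list of content blocks (e.g., <code>TextContent</code> with <code>type</code> and <code>text</code> fields) rather than a plain string. If you just want the text, a small sketch (the helper name here is mine, not part of the SDK):</p><pre><code>def extract_text(content_blocks):
    """Join the text of all text-type blocks in a tool result (hypothetical helper)."""
    return "\n".join(
        block.text for block in content_blocks
        if getattr(block, "type", None) == "text"
    )

# Usage: print(extract_text(result.content))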
</code></pre><h3>Server and Client Capabilities</h3><p>Note: <code>await session.initialize()</code> returns important details about the server, including its <a href="https://modelcontextprotocol.io/specification/2025-06-18/schema#servercapabilities">capabilities</a> (e.g., whether it provides tools, resources, prompts, logging, or any experimental/custom features). The client or host application can use this information to define dynamic behavior.</p><p>Similarly, clients can &#8220;advertise support&#8221; for <a href="https://modelcontextprotocol.io/specification/2025-06-18/schema#clientcapabilities">client capabilities</a> such as sampling, elicitation, and roots during initialization. In the Python SDK, this is done by providing callbacks for each capability.</p><pre><code>async with ClientSession(read, write, sampling_callback=sampling_callback, elicitation_callback=elicitation_callback, list_roots_callback=list_roots_callback, logging_callback=logging_callback) as session:</code></pre><div><hr></div><h2>Building the Host Application</h2><p>The host application is where the magic happens. It bridges MCP with the outside world (users or business applications), turning user queries or business tasks into tool calls and responses.</p><p>Most MCP tutorials skip over how complex host applications can really be. They're not just pass-through layers - they're intelligent orchestrators that often must:</p><ul><li><p>Accept user requests</p></li><li><p>Discover available tools from MCP servers</p></li><li><p>Use an LLM to choose the right tools</p></li><li><p>Execute tool calls with proper parameters</p></li><li><p>Handle errors gracefully</p></li><li><p>Present results in a user-friendly way</p></li></ul><p>Let's build a simple host that uses OpenAI to intelligently orchestrate our MCP tools.</p><h3>Setup</h3><p>First, install dependencies:</p><pre><code><code>uv add openai
# or: pip install openai
</code></code></pre><p>Set your OpenAI API key:</p><pre><code><code>export OPENAI_API_KEY="your-api-key-here"
</code></code></pre><h3>The Host Application</h3><pre><code><code># app.py
import asyncio
import json
import os
import sys
from openai import AsyncOpenAI
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

def convert_mcp_tools_to_openai_format(tools):
    """Convert MCP tool definitions to OpenAI function calling format."""
    openai_tools = []
    for tool in tools:
        openai_tools.append({
            "type": "function",
            "function": {
                "name": tool.name,
                "description": tool.description,
                "parameters": tool.inputSchema if hasattr(tool, 'inputSchema') else {}
            }
        })
    return openai_tools

async def handle_user_request(session, openai_client, tools, user_input: str):
    """Process user request using LLM and MCP tools."""
    openai_tools = convert_mcp_tools_to_openai_format(tools)
    
    # Ask LLM to decide which tools to use
    response = await openai_client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system", 
                "content": "You are a helpful assistant that can fetch news from TechCrunch. Use tools when needed."
            },
            {"role": "user", "content": user_input}
        ],
        tools=openai_tools,
        tool_choice="auto"
    )
    
    message = response.choices[0].message
    
    # If LLM wants to use a tool, execute it
    if message.tool_calls:
        tool_call = message.tool_calls[0]
        function_name = tool_call.function.name
        function_args = json.loads(tool_call.function.arguments)
        
        print(f"&#128295; Calling tool: {function_name} with args: {function_args}")
        
        result = await session.call_tool(function_name, arguments=function_args)
        
        # Format the response
        content = str(result.content)[:500]
        return f"Here's what I found:\n\n{content}...\n\nFor more details, visit TechCrunch directly."
    else:
        return message.content

async def main():
    if not os.getenv("OPENAI_API_KEY"):
        print("Error: Set OPENAI_API_KEY environment variable")
        return
    
    # Get user input
    user_input = " ".join(sys.argv[1:]).strip() or "What is the latest news on AI?"
    
    # Initialize OpenAI client
    openai_client = AsyncOpenAI()
    
    # Connect to MCP server
    host = os.environ.get("MCP_SERVER_HOST", "localhost")
    port = os.environ.get("MCP_SERVER_PORT", "8011")
    server_url = f"http://{host}:{port}/mcp"
    
    async with streamablehttp_client(server_url) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            
            # Get available tools
            tools = (await session.list_tools()).tools
            
            print(f"Task: {user_input}\n")
            
            # Process the request
            response = await handle_user_request(
                session, openai_client, tools, user_input
            )
            
            print(response)

if __name__ == "__main__":
    # Make sure server is running first!
    asyncio.run(main())
</code></code></pre><p>The host application above combines three key components: an MCP client, an LLM (OpenAI), and user interface logic. When you run <code>python app.py "What's the latest AI news?"</code>, here's what happens: First, the host connects to the MCP server using its embedded client and calls <code>list_tools()</code> to discover available tools. It then converts these MCP tool definitions into OpenAI's function-calling format using <code>convert_mcp_tools_to_openai_format()</code>. Next, it sends your question along with the tool definitions to OpenAI's API. OpenAI analyzes your question and returns which tool to call - in this case, <code>fetch_from_techcrunch</code> with <code>category='ai'</code>. The host then executes this tool call through <code>session.call_tool()</code>, gets the raw results from TechCrunch, and formats them into a readable response. This architecture lets you ask natural language questions without knowing which tools exist or how to call them - the LLM figures that out based on what's available.</p><p>The same host-application pattern can be extended to build more complex agentic or multi-agent applications.</p><h2>Putting It All Together</h2><p>You now have a complete MCP application:</p><ol><li><p><strong>Server</strong> (<code>server.py</code>) - Exposes the TechCrunch fetching tool</p></li><li><p><strong>Client</strong> (<code>client.py</code>) - Shows how to connect and call tools directly</p></li><li><p><strong>Host</strong> (<code>app.py</code>) - Uses AI to intelligently orchestrate tool usage</p></li></ol><p>To run the full system:</p><pre><code><code># Terminal 1: Start the server
python server.py

# Terminal 2: Run the host application
python app.py "What's the latest AI news?"
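</code></code></pre><p>A natural extension (my sketch, not part of the original <code>app.py</code>): rather than returning truncated raw tool output, append the tool result to the conversation and make a second LLM call so the model writes the final answer. This follows OpenAI's standard pattern of a <code>tool</code>-role message keyed by <code>tool_call_id</code>; the helper name is hypothetical:</p><pre><code><code># answer_with_tool_result is a hypothetical helper that could replace
# the formatting logic in handle_user_request
import json

async def answer_with_tool_result(session, openai_client, model, messages, assistant_message):
    """Execute the first tool call, then ask the LLM to compose a final answer."""
    tool_call = assistant_message.tool_calls[0]
    args = json.loads(tool_call.function.arguments)
    result = await session.call_tool(tool_call.function.name, arguments=args)

    # Append the assistant turn (with its tool_calls) and the tool result,
    # then make a second completion call without tools.
    followup = messages + [
        assistant_message,
        {
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": str(result.content)[:4000],  # keep context bounded
        },
    ]
    final = await openai_client.chat.completions.create(model=model, messages=followup)
    return final.choices[0].message.content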
</code></code></pre><p>Oh, and the server above can be used in any of your favorite MCP host applications. For example, you can add it to your VS Code agent by adding an MCP server config to your <a href="https://code.visualstudio.com/docs/copilot/chat/mcp-servers#_add-an-mcp-server-to-your-user-settings">user settings</a>. The same configuration will work for <a href="https://modelcontextprotocol.io/quickstart/user">Claude Desktop</a>, <a href="https://docs.cursor.com/context/model-context-protocol#manual-configuration">Cursor</a>, <a href="https://docs.windsurf.com/windsurf/cascade/mcp#mcp-config-json">Windsurf</a> &#8230;</p><pre><code>"demo": {
        "type": "http",
        "url": "http://localhost:8011/mcp/",
        "headers": { "VERSION": "1.2" }
      }</code></pre><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!1YrV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!1YrV!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png 424w, https://substackcdn.com/image/fetch/$s_!1YrV!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png 848w, https://substackcdn.com/image/fetch/$s_!1YrV!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png 1272w, https://substackcdn.com/image/fetch/$s_!1YrV!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!1YrV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png" width="1456" height="865" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:865,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:747317,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://newsletter.victordibia.com/i/167292158?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!1YrV!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png 424w, https://substackcdn.com/image/fetch/$s_!1YrV!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png 848w, https://substackcdn.com/image/fetch/$s_!1YrV!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png 1272w, https://substackcdn.com/image/fetch/$s_!1YrV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc0610a27-9024-4fae-b8f3-ea9ae27625eb_2938x1746.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div>
<blockquote><p><strong>Further Reading:</strong><br>I wrote a book on <a href="https://buy.multiagentbook.com/">Designing Multi-Agent Systems</a> - Chapter 12 (Protocols for Distributed Agents) provides a perspective on MCP and A2A and when you should reach for each as you build agents.</p><ul><li><p>Book Digital PDF: <a href="https://buy.multiagentbook.com/">https://buy.multiagentbook.com/</a></p></li><li><p>Book on Amazon: <a href="https://www.amazon.com/dp/B0G2BCQQJY">https://www.amazon.com/dp/B0G2BCQQJY</a></p></li></ul></blockquote>
      <p>
          <a href="https://newsletter.victordibia.com/p/mcp-for-software-engineers-part-1">
              Read more
          </a>
      </p>
   ]]></content:encoded></item></channel></rss>