
Building an AI Agent with TypeScript and MCP

Written by Enxhi Hamzallari

Jul 30, 2025
Livestream Recap: How to Build a TypeScript MCP Client

If you're experimenting with AI agents and looking to move beyond simple LLM prompts, this guide is for you. In this session, Dino, Staff Engineer at Hygraph and the lead behind our AI Labs initiative, walks you through how to build a working TypeScript MCP client from scratch.

We’re going hands-on. No fluff, just real code. By the end, you’ll understand how to connect an AI model with external tools and APIs using the Model Context Protocol (MCP) and why that matters for building powerful AI-native applications.

Editor's Note

Bonus: You can watch the full livestream here to follow along with Dino’s live-coding session.

#What we’re building

You’ll create a basic MCP client that allows an AI agent to:

  • Receive a user message
  • Dynamically query an MCP server
  • Use an external tool like a weather API
  • Return a structured, intelligent response

All using TypeScript, Node.js, and a simple frontend.

#What is MCP and why should you care?

MCP (Model Context Protocol) is an open standard that lets LLM-powered applications (the MCP clients) discover and use external tools dynamically. Instead of hardcoding a fixed set of functions, an MCP server exposes a list of tools at runtime, and the AI knows what it can call based on the server’s capabilities.

MCP unlocks:

  • Dynamic tool discovery
  • Standardized schemas
  • Plug-and-play agent extensions

In short, it gives your AI superpowers without relying on brittle prompt stuffing or custom wrappers.
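To make dynamic tool discovery concrete, here is roughly what a tools/list result looks like: each tool advertises a name, a human-readable description, and a JSON Schema for its input. The shape below follows the MCP spec; the getWeather tool is the one built in this session.

```typescript
// A sketch of a tools/list result. The client reads this at runtime,
// so the AI learns what it can call without any hardcoded function list.
const toolsListResult = {
  tools: [
    {
      name: "getWeather",
      description: "Get the current weather forecast for a location",
      inputSchema: {
        type: "object",
        properties: {
          location: { type: "string", description: "City name, e.g. Berlin" },
        },
        required: ["location"],
      },
    },
  ],
};

// The client can now discover capabilities instead of assuming them:
const toolNames = toolsListResult.tools.map((t) => t.name);
```

Because the schema travels with the tool, adding a new capability to the server requires no client changes — the next tools/list call simply returns a longer list.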

#Step 1: Build the MCP server

Start by creating a small server that exposes two tools:

  • getWeather (calls a weather API)
  • searchWeb (mocked to show error handling)

Here’s what you need:

  • Handle the tools/list request by returning a JSON Schema for each tool
  • Handle the tools/call request by parsing the tool name and arguments the AI sent and performing the right action

Dino used the OpenMeteo API for weather data and wrote basic logic to transform locations into coordinates, query the forecast, and return readable results.
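A minimal sketch of that geocode-then-forecast flow is below. The endpoint paths follow Open-Meteo’s public API; the helper names (`geocodeUrl`, `forecastUrl`, `getWeather`) are ours, not from the livestream code.

```typescript
// Two-step Open-Meteo flow: 1) resolve a location name to coordinates,
// 2) query the forecast for those coordinates and return readable text.

function geocodeUrl(location: string): string {
  const params = new URLSearchParams({ name: location, count: "1" });
  return `https://geocoding-api.open-meteo.com/v1/search?${params}`;
}

function forecastUrl(latitude: number, longitude: number): string {
  const params = new URLSearchParams({
    latitude: String(latitude),
    longitude: String(longitude),
    current_weather: "true",
  });
  return `https://api.open-meteo.com/v1/forecast?${params}`;
}

async function getWeather(location: string): Promise<string> {
  const geo = await fetch(geocodeUrl(location)).then((r) => r.json());
  const place = geo.results?.[0];
  if (!place) return `Could not find coordinates for "${location}"`;

  const forecast = await fetch(forecastUrl(place.latitude, place.longitude))
    .then((r) => r.json());
  const { temperature, windspeed } = forecast.current_weather;
  return `${place.name}: ${temperature}°C, wind ${windspeed} km/h`;
}
```

The tools/call handler for getWeather then just invokes this function with the parsed `location` argument and returns the string as the tool result.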

Each tool’s schema should clearly describe the tool’s name, what it does, and the input parameters it expects, so the model can decide when and how to call it.

[Screenshot from the livestream: example tool schema]

#Step 2: Set up the client in TypeScript

Next, you'll build the MCP client, which is technically your Node.js backend. It:

  • Accepts messages from the user via HTTP
  • Sends messages to the AI model (in this case, Claude running on Amazon Bedrock)
  • Parses responses and checks for tool calls
  • If needed, calls the MCP server and sends results back to the AI for final output


The conversation loop looks like this:

  1. User says “What’s the weather in Berlin?”
  2. AI responds: “I’ll use the weather tool”
  3. AI emits tool call: getWeather({ location: "Berlin" })
  4. Node.js client invokes the MCP server and sends results back to the AI
  5. AI finalizes the response and replies to the user
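The loop above can be sketched as a small control-flow skeleton. The model and MCP calls are stubbed out as parameters here to show only the shape of the loop; in the session these were Bedrock and the stdio MCP client, and the real history bookkeeping is richer.

```typescript
// Agent loop: ask the model, execute any tool call it emits via the
// MCP server, feed the result back, and repeat until it answers in text.

type Message =
  | { role: "user" | "assistant"; content: string }
  | { role: "tool"; name: string; content: string };

type ModelTurn =
  | { type: "text"; text: string }
  | { type: "tool_call"; name: string; input: unknown };

async function runAgent(
  userMessage: string,
  callModel: (history: Message[]) => Promise<ModelTurn>,
  callTool: (name: string, input: unknown) => Promise<string>,
): Promise<string> {
  const history: Message[] = [{ role: "user", content: userMessage }];
  for (;;) {
    const turn = await callModel(history);
    if (turn.type === "text") return turn.text; // final answer for the user
    // The model asked for a tool: invoke it through the MCP server
    const result = await callTool(turn.name, turn.input);
    history.push({ role: "tool", name: turn.name, content: result });
  }
}
```

Note that the loop can go around more than once: nothing stops the model from chaining several tool calls before it produces its final text reply.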

Important: Dino used stdio (stdin/stdout) as the transport between client and MCP server for simplicity. MCP also supports an HTTP transport if you’re exposing the server over the network.

#Step 3: Connect it all

Dino wired this all up in a lightweight Express app:

  • /api/conversations manages in-memory chats
  • /api/message triggers the AI+MCP pipeline
  • Frontend is React, but very minimal

He also included logic to:

  • Translate MCP tool schemas to match the AI provider’s expectations
  • Handle multiple tools in a single conversation
  • Filter/structure messages based on roles (user, assistant, tool)
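The first of those glue pieces — translating an MCP tool definition into the AI provider’s format — can be sketched as a pure mapping function. The target shape below follows Bedrock’s Converse API `toolSpec` structure as we understand it; verify the exact fields against your provider’s documentation.

```typescript
// Translate an MCP tool definition into the provider's tool format.
// MCP side: { name, description, inputSchema } (a JSON Schema object).
// Bedrock Converse side: { toolSpec: { name, description, inputSchema: { json } } }.

interface McpTool {
  name: string;
  description?: string;
  inputSchema: object;
}

function toBedrockTool(tool: McpTool) {
  return {
    toolSpec: {
      name: tool.name,
      description: tool.description ?? "",
      inputSchema: { json: tool.inputSchema },
    },
  };
}

const translated = toBedrockTool({
  name: "getWeather",
  description: "Get the forecast for a location",
  inputSchema: {
    type: "object",
    properties: { location: { type: "string" } },
    required: ["location"],
  },
});
```

Keeping this translation in one place means swapping AI providers only touches this mapper, not the MCP server or the conversation logic.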

⚠️ Tip: Always manage the full conversation server-side to prevent prompt injection attacks. Don’t expose the full message history to the frontend.

#Step 4: Extend with Hygraph MCP

At the end of the livestream, Dino showed how easy it is to plug in Hygraph’s own MCP server. With one line added to your config, your AI agent can start accessing content models, running queries, and manipulating schema inside a real CMS.

If you’re building anything that touches content or structured data, this is where things get really exciting.


#Why this matters

LLMs are only as smart as the context and tools you give them. By integrating with MCP, your AI agents gain real-world awareness, decision-making power, and dynamic capabilities that go far beyond chat.

Whether you’re building internal automations, dev tools, or next-gen CMS experiences, the TypeScript MCP client approach gives you full control and flexibility.

Want to see it in action? Watch the livestream with Dino to follow along as he builds and debugs live.

Curious about AI-native content management?
Check out hygraph.ai to explore our vision for building with AI and structured content at the core.

Blog Author

Enxhi Hamzallari

Sr. Field Marketing Manager

Enxhi is the Senior Field Marketing Manager at Hygraph. When she’s not bringing people together through content and events, you’ll find her dancing the night away or cheering on her favorite drag queens.
