Create Your First MCP Server
This guide will walk you through creating your first Model Context Protocol (MCP) server with TRMX AI.
In this tutorial, you'll learn how to:
- Set up a new project - Create the foundation for your MCP server
- Configure the server - Set up essential configuration options
- Add tools and resources - Enhance your server with AI capabilities
- Test your server locally - Ensure everything works before deploying
What is an MCP Server?
An MCP server is a serverless application that provides context-aware AI capabilities through a standardized protocol. It handles the communication between your application and AI models.
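Under the hood, MCP exchanges messages as JSON-RPC 2.0 requests and responses. As a rough illustration of the protocol (the transport details and the `summarize` tool below are assumptions for the example, not part of the TRMX scaffold), a client discovering a server's tools looks like this:

```jsonc
// Client -> server: ask which tools the server exposes
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

// Server -> client: one hypothetical tool named "summarize"
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "summarize",
        "description": "Summarize a block of text",
        "inputSchema": {
          "type": "object",
          "properties": { "text": { "type": "string" } },
          "required": ["text"]
        }
      }
    ]
  }
}
```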
Prerequisites
Before you begin
Make sure you have the following installed:
- Node.js (v16 or later)
- TRMX CLI (`npm install -g @trmx/cli`)
- An OpenAI API key (or a key for another supported model provider)
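To confirm everything is in place, check the installed versions from a terminal (a `--version` flag on the TRMX CLI is assumed here, following the usual CLI convention):

```bash
# Should print v16.x or later
node --version

# Confirms the TRMX CLI is on your PATH (--version flag assumed)
trmx --version
```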
Step 1: Setting Up Your Project
Create a new MCP server project using the TRMX CLI:
```bash
# Create a new MCP server project
trmx create my-first-mcp

# Navigate to your project
cd my-first-mcp
```
The CLI will guide you through the setup process, asking for your API keys and other configuration options.
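If you prefer not to type your key into an interactive prompt, you can export it as an environment variable beforehand. `OPENAI_API_KEY` is the standard OpenAI convention; whether the TRMX CLI picks it up automatically is an assumption here, so fall back to the prompt if it does not:

```bash
# Standard OpenAI environment variable; automatic pickup by the TRMX CLI is assumed
export OPENAI_API_KEY="sk-..."
```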
Step 2: Exploring the Project Structure
Project Files
Your newly created project contains these key files:
- `src/index.js` - The main entry point for your server
- `src/tools/` - Directory for defining AI tools
- `src/resources/` - Directory for defining data resources
- `config.json` - Server configuration settings
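Put together, the layout looks roughly like this (only the files listed above are shown; the scaffold may also include standard files such as `package.json`, and `config.json` is assumed to sit at the project root):

```text
my-first-mcp/
├── src/
│   ├── index.js     # Main entry point
│   ├── tools/       # AI tool definitions
│   └── resources/   # Data resource definitions
└── config.json      # Server configuration
```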
Step 3: Configuring Your Server
Edit the `config.json` file to configure your MCP server:
```json
{
  "name": "my-first-mcp",
  "modelProvider": "openai",
  "contextRetention": "session",
  "maxTokens": 2048
}
```
Key configuration options:
- name: The name of your MCP server
- modelProvider: The AI model provider (e.g., "openai", "anthropic")
- contextRetention: How long to retain conversation context
- maxTokens: Maximum context size in tokens
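For example, switching the same server to Anthropic only means changing `modelProvider`. The other values below are illustrative; check the TRMX documentation for the full set of accepted values:

```json
{
  "name": "my-first-mcp",
  "modelProvider": "anthropic",
  "contextRetention": "session",
  "maxTokens": 4096
}
```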
Step 4: Running Your Server Locally
Start your server in development mode:
```bash
# Install dependencies
npm install

# Start the server
npm run dev
```
Your MCP server should now be running at `http://localhost:3000`.
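A quick way to confirm the process is listening is to hit that address from another terminal. The response body depends on how TRMX exposes the protocol over HTTP, so treat this as a connectivity check rather than a documented endpoint:

```bash
# Any HTTP response (status line plus headers) means the dev server is up
curl -i http://localhost:3000
```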
Next Steps
Continue Your MCP Journey
Now that you've created your MCP server, you can:
- Deploy your MCP server - Put your server into production
- Test your MCP server - Ensure reliability and performance
- Add custom tools - Extend your server's capabilities