Transport for London

by anoopt
Get real-time London transport updates, plan journeys across London’s underground & bus routes, and manage your Oyster card.
Provides real-time London transport data including tube line status, detailed disruption information, and comprehensive journey planning between any London locations or UK postcodes, with intelligent location disambiguation and multi-modal route options.
best for
- Travel apps and commuter tools
- London-based services needing transport data
- Journey planning integrations
- Real-time transport monitoring
capabilities
- Check tube line status and delays
- Get detailed disruption information
- Plan journeys between London locations
- Search UK postcodes for route planning
- Access multi-modal transport options
- Resolve ambiguous location names
what it does
Get real-time London transport data including tube status, disruptions, and journey planning between any UK locations.
about
Transport for London is a community-built MCP server published by anoopt that provides AI assistants with tools and capabilities via the Model Context Protocol. It lets you get real-time London transport updates, plan journeys across London’s underground and bus routes, and manage your Oyster card. It is categorized under developer tools.
how to install
You can install Transport for London in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
license
MIT
Transport for London is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
readme
TfL (Transport for London) Status & Journey Planner MCP Server
This Model Context Protocol (MCP) server provides AI assistants with access to real-time Transport for London data through a set of automated tools.
⚠️ Important Disclaimer: This is not an official Transport for London (TfL) MCP server. This is an independent project that uses the publicly available TfL Unified API to provide transport data. It is not affiliated with, endorsed by, or officially supported by Transport for London.
Demo Video

🚇 What This MCP Server Does
This server enables AI assistants (like Claude Desktop and VS Code GitHub Copilot) to access live TfL data by providing three main capabilities:
🔧 Available Tools
- get_line_status: Get the current status of any TfL line (e.g., Central, Victoria, Piccadilly)
- get_line_status_detail: Get detailed status information, including disruption details, for a TfL line
- plan_journey: Plan journeys between two locations using the TfL Journey Planner
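Under the hood, these tools wrap the public TfL Unified API. As a rough sketch (not necessarily this server's actual implementation), a line-status lookup reduces to a single HTTP call against the documented Line Status endpoint; the response shape shown is the TfL one, and app_key is your TfL API key:
// Sketch only; run with Node 18+ as an ES module (e.g. check-status.mjs).
const lineId = "central"; // e.g. "central", "victoria", "piccadilly"
const url = new URL(`https://api.tfl.gov.uk/Line/${lineId}/Status`);
if (process.env.TFL_API_KEY) url.searchParams.set("app_key", process.env.TFL_API_KEY);
const res = await fetch(url);
if (!res.ok) throw new Error(`TfL API error: ${res.status}`);
const [line] = await res.json();
console.log(line.lineStatuses[0].statusSeverityDescription); // e.g. "Good Service"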
🎯 Use Cases
With this MCP server connected, AI assistants can help users:
- Check if their tube line is running normally before commuting
- Get detailed information about service disruptions
- Plan optimal routes between London locations
- Provide real-time transport advice for London travel
Example interactions:
- "Is the Central line running normally?"
- "Plan a journey from King's Cross to Heathrow Airport"
- "What's causing delays on the Northern line today?"
Let's set things up!
🚦 Getting Started
Choose your preferred installation method:
📦 Option 1: Quick Install via npm (Recommended)
The easiest way to use this MCP server is through npm:
Installation
npm install -g london-transport-mcp
🔐 Set up your TfL API key
You can get a free API key from the TfL API Portal.
Method 1: Environment Variable (Recommended)
Set the environment variable in your system:
# Windows (PowerShell)
$env:TFL_API_KEY="your_actual_tfl_api_key_here"
# macOS/Linux
export TFL_API_KEY="your_actual_tfl_api_key_here"
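To confirm the variable is actually visible to Node before connecting an assistant, a quick one-liner helps:
node -e "console.log(process.env.TFL_API_KEY ? 'TFL_API_KEY is set' : 'TFL_API_KEY is missing')"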
Method 2: MCP Configuration
Include the API key directly in your MCP configuration (see examples below).
AI Assistant Configuration
For Claude Desktop (Settings → Developers → Edit Config):
{
  "mcpServers": {
    "london-transport": {
      "command": "npx",
      "args": ["london-transport-mcp"],
      "env": {
        "TFL_API_KEY": "your_actual_tfl_api_key_here"
      }
    }
  }
}
For VS Code GitHub Copilot (Settings → GitHub Copilot › MCP: Servers):
{
  "london-transport": {
    "command": "npx",
    "args": ["london-transport-mcp"],
    "env": {
      "TFL_API_KEY": "your_actual_tfl_api_key_here"
    }
  }
}
That's it! No manual installation or path configuration required.
🛠️ Option 2: Local Development Setup
For developers who want to modify the code or contribute:
⚙️ Prerequisites
Before starting, please ensure you have:
- Node.js (v18+ required, v20+ recommended)
- npm (included with Node)
Warning: on Node versions below 18, the global fetch API is not available, and the tools use fetch to make HTTP calls. To work around this, modify the tools to use node-fetch instead: install node-fetch as a dependency, then import it as fetch in each tool file (see the sketch below).
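A minimal sketch of that workaround, assuming the tool files are ES modules (adapt to the actual layout under tools/tfl/):
// Install the polyfill first: npm install node-fetch
// Then, at the top of each tool file, alias it to the global name the code expects:
import fetch from "node-fetch";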
📥 Installation & Setup
1. Clone the repository
git clone https://github.com/anoopt/london-tfl-journey-status-mcp-server.git
cd london-tfl-journey-status-mcp-server
2. Install dependencies
npm install
3. Configure your TfL API key
Create a .env file in the project root with your TfL API key:
TFL_API_KEY=your_actual_tfl_api_key_here
You can get a free API key from the TfL API Portal.
🧪 Test the MCP Server with Postman
We strongly recommend testing your MCP server with Postman before connecting it to an AI assistant. The Postman Desktop Application provides the easiest way to run and test MCP servers.
Step 1: Download Postman Desktop
Download the latest Postman Desktop Application from postman.com/downloads.
Step 2: Create an MCP Request
- Open Postman Desktop
- Create a new MCP Request (see the documentation for detailed steps)
- Set the type to STDIO
- Set the command to the full path to your node executable followed by the full path to mcpServer.js
To get the required paths, run these commands in your terminal:
# Get the full path to node
which node
# Get the full path to mcpServer.js
realpath mcpServer.js
# Check your node version (should be 18+)
node --version
Example command format:
/usr/local/bin/node /full/path/to/TfL-Status-MCP-Server/mcpServer.js
Step 3: Test Your Tools
- Click Connect in your Postman MCP Request
- You should see the three TfL tools listed
- Test each tool:
  - Try get_line_status with lineId: "central"
  - Try plan_journey with fromLocation: "King's Cross" and toLocation: "Westminster"
  - Try get_line_status_detail with lineId: "piccadilly"
If all tools work correctly in Postman, you're ready to connect to an AI assistant!
🤖 Connect to AI Assistants
Once you've tested with Postman, you can connect your MCP server to AI assistants:
For Local Development Setup (Option 2)
If you're using the local development setup, you'll need to specify full paths:
Claude Desktop
Step 1: Use the same node and mcpServer.js paths from the Postman testing step.
Step 2: Open Claude Desktop → Settings → Developers → Edit Config and add:
{
  "mcpServers": {
    "london-transport": {
      "command": "node",
      "args": ["/full/path/to/mcpServer.js"]
    }
  }
}
Step 3: Restart Claude Desktop and verify the MCP server shows with a green circle.
VS Code GitHub Copilot
Step 1: Install the GitHub Copilot extension in VS Code if you haven't already.
Step 2: Open VS Code → Settings (Ctrl+,) → Search for "MCP" → GitHub Copilot › MCP: Servers
Step 3: Add your TfL MCP server configuration:
{
  "london-transport": {
    "command": "node",
    "args": ["/full/path/to/mcpServer.js"]
  }
}
Step 4: Restart VS Code and the MCP server will be available to GitHub Copilot.
Now you can ask your AI assistant things like:
- "Check the status of the Central line"
- "Plan a journey from London Bridge to Camden Town"
Additional Options
🛠️ List Available Tools
View all available tools and their parameters:
npm run list-tools
🚀 Quick Postman Integration
Open Postman with the correct MCP configuration automatically:
npm run postman
🐳 Docker Deployment (Production)
For production deployments, you can use Docker:
1. Build Docker image
docker build -t <your_server_name> .
2. AI Assistant Integration
Add Docker server configuration to your AI assistant:
For Claude Desktop (Settings → Developers → Edit Config):
{
  "mcpServers": {
    "tfl-status": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "--env-file=.env", "tfl-mcp-server"]
    }
  }
}
For VS Code GitHub Copilot (Settings → GitHub Copilot › MCP: Servers):
{
  "tfl-status": {
    "command": "docker",
    "args": ["run", "-i", "--rm", "--env-file=.env", "tfl-mcp-server"]
  }
}
Add your environment variables (API keys, etc.) inside the .env file.
The project comes bundled with the following minimal Docker setup:
FROM node:22.12-alpine AS builder
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm install
COPY . .
ENTRYPOINT ["node", "mcpServer.js"]
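Assuming you tagged the image tfl-mcp-server (the name used in the configurations above), you can smoke-test the container over stdio before wiring it into an assistant:
docker build -t tfl-mcp-server .
docker run -i --rm --env-file=.env tfl-mcp-server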
🌐 Streamable HTTP
To run the server with Streamable HTTP support, use the --streamable-http flag. This launches the server with the /mcp endpoint enabled:
node mcpServer.js --streamable-http
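The listening port is whatever mcpServer.js configures, so treat this as a hedged sketch (port 3000 is an assumption). It sends a standard MCP initialize message to the /mcp endpoint:
curl -N -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"curl-test","version":"0.0.0"}}}'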
🌐 Server-Sent Events (SSE)
To run the server with Server-Sent Events (SSE) support, use the --sse flag. This launches the server with the /sse and /messages endpoints enabled:
node mcpServer.js --sse
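Again assuming port 3000, you can verify the stream is up by opening the /sse endpoint; SSE-based MCP servers typically reply with an endpoint event pointing at the /messages URL:
curl -N http://localhost:3000/sse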
🖥️ Stdio (Standard Input/Output)
To run the server using standard input/output (stdio), simply run the script without any flags. This mode is ideal for CLI tools or programmatic integration via stdin and stdout.
node mcpServer.js
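For programmatic integration, here is a minimal Node sketch (not part of this repo) that spawns the server and sends an MCP initialize request; MCP stdio transports conventionally frame messages as newline-delimited JSON-RPC:
// stdio-demo.mjs: spawn the MCP server and exchange one JSON-RPC message.
import { spawn } from "node:child_process";
const server = spawn("node", ["mcpServer.js"], { stdio: ["pipe", "pipe", "inherit"] });
// Echo the server's responses (one JSON-RPC message per line).
server.stdout.on("data", (chunk) => process.stdout.write(chunk));
// Send the standard MCP initialize request, newline-terminated.
server.stdin.write(JSON.stringify({
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05",
    capabilities: {},
    clientInfo: { name: "stdio-demo", version: "0.0.0" },
  },
}) + "\n");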
🛠️ Extending the Server
To add more TfL API endpoints or other transport APIs:
- Create new tool files in the tools/tfl/ directory
- Follow the pattern in existing tools like tools/tfl/status.js (a hypothetical sketch follows this list)
- Add your new tool file to tools/paths.js
- Test with Postman before deploying
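The exact module shape inside tools/tfl/status.js is not reproduced here, so the following is a hypothetical sketch of a new tool file: the exported apiTool object, the get_line_arrivals name, and the parameter schema are assumptions to adapt to the repo's real pattern (the TfL /Line/{id}/Arrivals endpoint itself is documented):
// tools/tfl/arrivals.js (hypothetical): upcoming arrivals for a TfL line.
const executeFunction = async ({ lineId }) => {
  // Call the documented TfL Unified API arrivals endpoint.
  const url = new URL(`https://api.tfl.gov.uk/Line/${lineId}/Arrivals`);
  if (process.env.TFL_API_KEY) url.searchParams.set("app_key", process.env.TFL_API_KEY);
  const res = await fetch(url);
  if (!res.ok) throw new Error(`TfL API error: ${res.status}`);
  return res.json();
};

export const apiTool = {
  function: executeFunction,
  definition: {
    type: "function",
    function: {
      name: "get_line_arrivals",
      description: "Get upcoming arrival predictions for a TfL line.",
      parameters: {
        type: "object",
        properties: {
          lineId: { type: "string", description: "TfL line id, e.g. 'central'" },
        },
        required: ["lineId"],
      },
    },
  },
};
After registering the new file in tools/paths.js, re-run npm run list-tools to confirm it is picked up.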
📚 API Reference
This server uses the Transport for London Unified API. All tools automatically include your API key from the .env file.
➕ Adding New Tools
Extend your MCP server with more tools easily:
- Visit Postman MCP Generator.
- Pick new API request(s), generate a new MCP server, and download it.
- Copy the newly generated tool(s) into your existing project's tools/ folder.
- Update your tools/paths.js file to include the new tool references.
💬 Questions & Support
Visit the Postman MCP Generator page for updates and new capabilities.
Join the #mcp-lab channel in the Postman Discord.
FAQ
- What is the Transport for London MCP server?
- Transport for London is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
- How do MCP servers relate to agent skills?
- Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
- How are reviews shown for Transport for London?
- This profile displays 10 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.5 out of 5; verify behavior in your own environment before production use.
Ratings
4.5 ★★★★★ · 10 reviews
- ★★★★★ Shikha Mishra · Oct 10, 2024
Transport for London is among the better-indexed MCP projects we tried; the explainx.ai summary tracks the official description.
- ★★★★★ Piyush G · Sep 9, 2024
We evaluated Transport for London against two servers with overlapping tools; this profile had the clearer scope statement.
- ★★★★★ Chaitanya Patil · Aug 8, 2024
Useful MCP listing: Transport for London is the kind of server we cite when onboarding engineers to host + tool permissions.
- ★★★★★ Sakshi Patil · Jul 7, 2024
Transport for London reduced integration guesswork; categories and install configs on the listing matched the upstream repo.
- ★★★★★ Ganesh Mohane · Jun 6, 2024
I recommend Transport for London for teams standardizing on MCP; the explainx.ai page compares cleanly with sibling servers.
- ★★★★★ Oshnikdeep · May 5, 2024
Strong directory entry: Transport for London surfaces stars and publisher context so we could sanity-check maintenance before adopting.
- ★★★★★ Dhruvi Jain · Apr 4, 2024
Transport for London has been reliable for tool-calling workflows; the MCP profile page is a good permalink for internal docs.
- ★★★★★ Rahul Santra · Mar 3, 2024
According to our notes, Transport for London benefits from clear Model Context Protocol framing; fewer ambiguous "AI plugin" claims.
- ★★★★★ Pratham Ware · Feb 2, 2024
We wired Transport for London into a staging workspace; the listing's GitHub and npm pointers saved time versus hunting across READMEs.
- ★★★★★ Yash Thakker · Jan 1, 2024
Transport for London is a well-scoped MCP server in the explainx.ai directory; install snippets and categories matched our Claude Code setup.