ai-mlanalytics-data

Langfuse

by z9905080

Monitor LLM performance with Langfuse: AI data analytics and analysis for actionable insights.

Connects AI models to Langfuse analytics workspaces, enabling access to LLM performance metrics by time range for monitoring and analysis.

GitHub stars: 3

  • Direct Langfuse workspace integration
  • Time-based metrics queries

best for

  • AI developers monitoring model performance
  • Teams tracking LLM usage analytics
  • Teams analyzing AI assistant effectiveness over time

capabilities

  • Query LLM metrics by time range
  • Connect to Langfuse workspaces
  • Access performance analytics data
  • Retrieve model execution metrics

what it does

Connects AI models to Langfuse analytics workspaces for querying LLM performance metrics. Requires Langfuse project setup with public/private keys.

about

Langfuse is a community-built MCP server published by z9905080 that provides AI assistants with tools and capabilities via the Model Context Protocol. It lets you monitor LLM performance with AI data analytics for actionable insights. It is categorized under ai-ml and analytics-data.

how to install

You can install Langfuse in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
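For clients configured through a JSON file, an entry for this server might look like the following. This is a minimal sketch in the shape of Claude Desktop's claude_desktop_config.json; it assumes `mcp-server-langfuse` is on your PATH from a global npm install, and the key values are placeholders you must replace:

```json
{
  "mcpServers": {
    "langfuse": {
      "command": "mcp-server-langfuse",
      "env": {
        "LANGFUSE_DOMAIN": "https://api.langfuse.com",
        "LANGFUSE_PUBLIC_KEY": "your-public-key",
        "LANGFUSE_PRIVATE_KEY": "your-private-key"
      }
    }
  }
}
```

Consult your client's documentation for the exact config file location and schema, as field names can vary between MCP hosts.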

license

Apache-2.0

Langfuse is released under the Apache-2.0 license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

readme

MCP Server for Langfuse


A Model Context Protocol (MCP) server implementation for integrating AI assistants with Langfuse workspaces.

Overview

This package provides an MCP server that enables AI assistants to interact with Langfuse workspaces. It allows AI models to:

  • Query LLM Metrics by Time Range

Installation

# Install from npm
npm install shouting-mcp-langfuse

# Or install globally
npm install -g shouting-mcp-langfuse

You can find the package on npm: shouting-mcp-langfuse

Prerequisites

Before using the server, you need to create a Langfuse project and obtain your project's public and private keys. You can find these keys in the Langfuse dashboard.

  1. Set up a Langfuse project
  2. Get the public and private keys
  3. Set the environment variables

Configuration

The server requires the following environment variables:

  • LANGFUSE_DOMAIN: The Langfuse domain (default: https://api.langfuse.com)
  • LANGFUSE_PUBLIC_KEY: Your Langfuse Project Public Key
  • LANGFUSE_PRIVATE_KEY: Your Langfuse Project Private Key

Usage

Running as a CLI Tool

# Set environment variables
export LANGFUSE_DOMAIN="https://api.langfuse.com"
export LANGFUSE_PUBLIC_KEY="your-public-key"
export LANGFUSE_PRIVATE_KEY="your-private-key"

# Run the server
mcp-server-langfuse

Using in Your Code

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { LangfuseClient } from "shouting-mcp-langfuse";

// Initialize the server and the Langfuse client
const server = new Server({...});
const langfuseClient = new LangfuseClient(
  process.env.LANGFUSE_DOMAIN,
  process.env.LANGFUSE_PUBLIC_KEY,
  process.env.LANGFUSE_PRIVATE_KEY
);

// Register your custom handlers
// ...

Available Tools

The server provides the following Langfuse integration tools:

  • getLLMMetricsByTimeRange: retrieve LLM metrics for a given time range
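A time-range tool like this typically expects start and end timestamps. The helper below is a hypothetical sketch of validating and building such arguments before passing them to a tool call; the parameter names `startTime` and `endTime` are assumptions, so check the tool's actual input schema (e.g. via an MCP `tools/list` request) before relying on them:

```typescript
// Hypothetical helper: build arguments for a time-range metrics query.
// The argument names (startTime/endTime) are assumptions, not the
// server's documented schema -- verify against the tool's inputSchema.
function buildMetricsArgs(
  start: Date,
  end: Date
): { startTime: string; endTime: string } {
  // Reject inverted or empty ranges up front, before any network call.
  if (end <= start) {
    throw new Error("end must be after start");
  }
  // ISO-8601 strings are a safe interchange format for timestamps.
  return { startTime: start.toISOString(), endTime: end.toISOString() };
}

const args = buildMetricsArgs(
  new Date("2024-01-01T00:00:00Z"),
  new Date("2024-01-02T00:00:00Z")
);
console.log(args.startTime); // "2024-01-01T00:00:00.000Z"
```

Validating the range client-side keeps malformed requests from ever reaching the server, which simplifies debugging when a metrics query returns no data.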

License

ISC

Author

shouting.hsiao@gmail.com

Repository

https://github.com/z9905080/mcp-langfuse

FAQ

What is the Langfuse MCP server?
Langfuse is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
How do MCP servers relate to agent skills?
Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
How are reviews shown for Langfuse?
This profile displays 10 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.5 out of 5—verify behavior in your own environment before production use.
MCP server reviews

Ratings

4.5 average · 10 reviews
  • Shikha Mishra· Oct 10, 2024

    Langfuse is among the better-indexed MCP projects we tried; the explainx.ai summary tracks the official description.

  • Piyush G· Sep 9, 2024

    We evaluated Langfuse against two servers with overlapping tools; this profile had the clearer scope statement.

  • Chaitanya Patil· Aug 8, 2024

    Useful MCP listing: Langfuse is the kind of server we cite when onboarding engineers to host + tool permissions.

  • Sakshi Patil· Jul 7, 2024

    Langfuse reduced integration guesswork — categories and install configs on the listing matched the upstream repo.

  • Ganesh Mohane· Jun 6, 2024

    I recommend Langfuse for teams standardizing on MCP; the explainx.ai page compares cleanly with sibling servers.

  • Oshnikdeep· May 5, 2024

    Strong directory entry: Langfuse surfaces stars and publisher context so we could sanity-check maintenance before adopting.

  • Dhruvi Jain· Apr 4, 2024

    Langfuse has been reliable for tool-calling workflows; the MCP profile page is a good permalink for internal docs.

  • Rahul Santra· Mar 3, 2024

    According to our notes, Langfuse benefits from clear Model Context Protocol framing — fewer ambiguous “AI plugin” claims.

  • Pratham Ware· Feb 2, 2024

    We wired Langfuse into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.

  • Yash Thakker· Jan 1, 2024

    Langfuse is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.