LLM observability platform for tracing, evaluation, prompt management, and cost tracking. Use when setting up Langfuse, monitoring LLM costs, tracking token usage, or implementing prompt versioning.
January 25, 2026

Install with `npx add-skill https://github.com/yonatangross/orchestkit/blob/main/skills/langfuse-observability/SKILL.md -a claude-code --skill langfuse-observability`.

Installation path: `.claude/skills/langfuse-observability/`

# Langfuse Observability
## Overview
**Langfuse** is an open-source LLM observability platform that OrchestKit uses for tracing, monitoring, evaluation, and prompt management. Unlike LangSmith, which OrchestKit has deprecated, Langfuse is free to self-host and designed for production LLM applications.
**When to use this skill:**
- Setting up LLM observability from scratch
- Debugging slow or incorrect LLM responses
- Tracking token usage and costs
- Managing prompts in production
- Evaluating LLM output quality
- Migrating from LangSmith to Langfuse
**OrchestKit Integration:**
- **Status**: Migrated from LangSmith (Dec 2025)
- **Location**: `backend/app/shared/services/langfuse/`
- **MCP Server**: `orchestkit-langfuse` (optional)
---
## Quick Start
### Setup
```python
# backend/app/shared/services/langfuse/client.py
from langfuse import Langfuse

from app.core.config import settings

langfuse_client = Langfuse(
    public_key=settings.LANGFUSE_PUBLIC_KEY,
    secret_key=settings.LANGFUSE_SECRET_KEY,
    host=settings.LANGFUSE_HOST,  # Self-hosted or cloud URL
)
```
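The `settings` object above comes from `app.core.config`, which is not shown here, so its exact shape is an assumption. A minimal stand-in that reads the same three values from environment variables (class and field names hypothetical, mirroring the attributes used above) could look like:

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class LangfuseSettings:
    """Stand-in for app.core.config.settings (assumed fields)."""
    LANGFUSE_PUBLIC_KEY: str
    LANGFUSE_SECRET_KEY: str
    LANGFUSE_HOST: str


def load_settings() -> LangfuseSettings:
    # Default host assumes a local self-hosted Langfuse instance;
    # point LANGFUSE_HOST at Langfuse Cloud otherwise.
    return LangfuseSettings(
        LANGFUSE_PUBLIC_KEY=os.environ.get("LANGFUSE_PUBLIC_KEY", ""),
        LANGFUSE_SECRET_KEY=os.environ.get("LANGFUSE_SECRET_KEY", ""),
        LANGFUSE_HOST=os.environ.get("LANGFUSE_HOST", "http://localhost:3000"),
    )
```

Keeping credentials in the environment (rather than hard-coded) also matches how Langfuse's own SDK resolves keys when none are passed explicitly.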
### Basic Tracing with @observe
```python
from langfuse.decorators import observe, langfuse_context

@observe()  # Automatic tracing of inputs, outputs, and latency
async def analyze_content(content: str):
    langfuse_context.update_current_observation(
        metadata={"content_length": len(content)}
    )
    return await llm.generate(content)
```
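To make the decorator's behavior concrete, here is a toy stand-in (not the real Langfuse implementation) showing the pattern `@observe` follows: wrap the function, time the call, and record an observation with the function's name. The `toy_observe` name and the in-memory `OBSERVATIONS` store are illustrative only; the real SDK ships these observations to the Langfuse backend.

```python
import functools
import time

# Toy in-memory trace store; Langfuse sends observations to its server instead.
OBSERVATIONS: list[dict] = []


def toy_observe(func):
    """Simplified sketch of what an @observe-style decorator records."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        OBSERVATIONS.append({
            "name": func.__name__,
            "duration_s": time.perf_counter() - start,
        })
        return result
    return wrapper


@toy_observe
def analyze(content: str) -> int:
    return len(content)
```

Nesting decorated functions is what produces the parent-child spans mentioned in the feature table below: each inner call becomes a child observation of the caller's.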
### Session & User Tracking
```python
# Uses the langfuse_client created in Setup above
langfuse_client.trace(
    name="analysis",
    user_id="user_123",
    session_id="session_abc",
    metadata={"content_type": "article", "agent_count": 8},
    tags=["production", "orchestkit"],
)
```
---
## Core Features Summary
| Feature | Description | Reference |
|---------|-------------|-----------|
| Distributed Tracing | Track LLM calls with parent-child spans | `references/tracing-setup.md` |
| Cost Tracking | Automatic token & cost calculation | `references/cost-tracking.md` |
| Prompt Management | Version control for prompts | `references/prompt-management.
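Cost tracking boils down to multiplying the token counts recorded on each generation by per-model unit prices. A rough sketch of that calculation, with hypothetical prices (Langfuse maintains its own model price registry, so these numbers are illustrative only):

```python
# Hypothetical per-1K-token prices in USD; real values come from
# Langfuse's model price registry, not from this table.
PRICES = {
    "gpt-4o": {"input": 0.0025, "output": 0.0100},
}


def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Sketch of the usage-times-unit-price arithmetic behind cost tracking."""
    price = PRICES[model]
    return (
        (input_tokens / 1000) * price["input"]
        + (output_tokens / 1000) * price["output"]
    )
```

For example, a call with 1,000 input and 1,000 output tokens at these prices would cost 0.0025 + 0.0100 = $0.0125.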