langfuse-observability (verified)

LLM observability platform for tracing, evaluation, prompt management, and cost tracking. Use when setting up Langfuse, monitoring LLM costs, tracking token usage, or implementing prompt versioning.

Marketplace: orchestkit (yonatangross/skillforge-claude-plugin)
Plugin: orchestkit-complete (development)
Repository: yonatangross/skillforge-claude-plugin (33 stars)
Source: ./skills/langfuse-observability/SKILL.md
Last Verified: January 23, 2026

Install with the add-skill CLI:

npx add-skill https://github.com/yonatangross/skillforge-claude-plugin/blob/main/./skills/langfuse-observability/SKILL.md -a claude-code --skill langfuse-observability

Installation path (Claude): .claude/skills/langfuse-observability/

Instructions

# Langfuse Observability

## Overview

**Langfuse** is the open-source LLM observability platform that OrchestKit uses for tracing, monitoring, evaluation, and prompt management. Unlike LangSmith (which OrchestKit has deprecated), Langfuse is open source, free to self-host, and designed for production LLM applications.

**When to use this skill:**
- Setting up LLM observability from scratch
- Debugging slow or incorrect LLM responses
- Tracking token usage and costs
- Managing prompts in production
- Evaluating LLM output quality
- Migrating from LangSmith to Langfuse

**OrchestKit Integration:**
- **Status**: Migrated from LangSmith (Dec 2025)
- **Location**: `backend/app/shared/services/langfuse/`
- **MCP Server**: `orchestkit-langfuse` (optional)

---

## Quick Start

### Setup

```python
# backend/app/shared/services/langfuse/client.py
from langfuse import Langfuse
from app.core.config import settings

langfuse_client = Langfuse(
    public_key=settings.LANGFUSE_PUBLIC_KEY,
    secret_key=settings.LANGFUSE_SECRET_KEY,
    host=settings.LANGFUSE_HOST  # Self-hosted or cloud
)
```
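
The `settings` object above comes from the project's config module. A minimal sketch of the fields it would need, assuming pydantic-settings; the class layout here is illustrative, not taken from the OrchestKit repo:

```python
# backend/app/core/config.py (hypothetical sketch)
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    # Keys come from the Langfuse project settings page
    LANGFUSE_PUBLIC_KEY: str = ""
    LANGFUSE_SECRET_KEY: str = ""
    # Self-hosted URL or the Langfuse Cloud endpoint
    LANGFUSE_HOST: str = "https://cloud.langfuse.com"

settings = Settings()
```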

### Basic Tracing with @observe

```python
from langfuse.decorators import observe, langfuse_context

@observe()  # Automatic tracing
async def analyze_content(content: str):
    langfuse_context.update_current_observation(
        metadata={"content_length": len(content)}
    )
    return await llm.generate(content)
```
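
Decorated functions nest automatically: calling one `@observe()` function from another produces parent-child spans within the same trace. A small sketch under that assumption (the function names and the `llm.generate` call are placeholders, as in the example above):

```python
from langfuse.decorators import observe

@observe(as_type="generation")  # Recorded as a generation (model call) span
async def summarize(chunk: str) -> str:
    return await llm.generate(f"Summarize: {chunk}")

@observe()  # Parent span; child generations attach to it automatically
async def summarize_document(chunks: list[str]) -> list[str]:
    return [await summarize(c) for c in chunks]
```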

### Session & User Tracking

```python
# Low-level SDK call on the client created in Setup
langfuse_client.trace(
    name="analysis",
    user_id="user_123",
    session_id="session_abc",
    metadata={"content_type": "article", "agent_count": 8},
    tags=["production", "orchestkit"]
)
```
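
Cost tracking builds on this: generations attached to a trace can report the model name and token usage, and Langfuse derives cost from its model price table. A sketch using the low-level SDK (the model name, message content, and token counts are illustrative):

```python
trace = langfuse_client.trace(name="analysis", user_id="user_123")

trace.generation(
    name="summary-call",
    model="gpt-4o-mini",  # Should match a model known to Langfuse's pricing table
    input=[{"role": "user", "content": "Summarize the article"}],
    output="...",
    usage={"input": 1200, "output": 350},  # Token counts; cost is calculated from them
)

langfuse_client.flush()  # Send buffered events before the process exits
```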

---

## Core Features Summary

| Feature | Description | Reference |
|---------|-------------|-----------|
| Distributed Tracing | Track LLM calls with parent-child spans | `references/tracing-setup.md` |
| Cost Tracking | Automatic token & cost calculation | `references/cost-tracking.md` |
| Prompt Management | Version control for prompts | `references/prompt-management.md` |
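
As a quick illustration of the prompt-management flow referenced in the table, the client can fetch a versioned prompt and compile it with variables before calling the LLM. The prompt name and variable below are hypothetical, not prompts confirmed to exist in the OrchestKit project:

```python
# Fetch the current production-labeled version of a managed prompt
prompt = langfuse_client.get_prompt("content-analysis")

# Fill in template variables defined in the Langfuse UI
compiled = prompt.compile(content_type="article")

# `compiled` is the final prompt text to send to the LLM
```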
