# langchain-webhooks-events (verified)

- **Marketplace:** claude-code-plugins-plus
- **Plugin:** langchain-pack (ai-ml)
- **Repository:** jeremylongshore/claude-code-plugins-plus-skills (1.1k stars)
- **Path:** plugins/saas-packs/langchain-pack/skills/langchain-webhooks-events/SKILL.md
- **Last Verified:** January 22, 2026

Install with the add-skill CLI:
`npx add-skill https://github.com/jeremylongshore/claude-code-plugins-plus-skills/blob/main/plugins/saas-packs/langchain-pack/skills/langchain-webhooks-events/SKILL.md -a claude-code --skill langchain-webhooks-events`

Installs to `.claude/skills/langchain-webhooks-events/`.

# LangChain Webhooks & Events

## Overview
Implement callback handlers and event-driven patterns for LangChain applications, including streaming, webhooks, and real-time updates.

## Prerequisites
- LangChain application configured
- Understanding of async programming
- Webhook endpoint (for external integrations)
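For local testing of the webhook prerequisite, a stdlib-only receiver can stand in for a real endpoint. This is a sketch under assumptions: the `{"event": ..., "data": ...}` payload shape and the receiver class name are illustrative, not defined by LangChain.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookReceiver(BaseHTTPRequestHandler):
    """Accepts JSON event payloads of the assumed shape {"event": ..., "data": ...}."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length))
        print(f"received: {event.get('event')}")
        self.send_response(204)  # acknowledge with no response body
        self.end_headers()

server = HTTPServer(("127.0.0.1", 0), WebhookReceiver)  # port 0: pick a free port
# server.serve_forever()  # blocks; run in a thread or separate process
```

Returning 204 quickly matters: the sender blocks (or times out) until the endpoint responds, so heavy processing belongs in a queue, not in `do_POST`.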

## Instructions

### Step 1: Create Custom Callback Handler
```python
from langchain_core.callbacks import BaseCallbackHandler
from langchain_core.messages import BaseMessage
from typing import Any, Dict, List
import httpx

class WebhookCallbackHandler(BaseCallbackHandler):
    """Send events to external webhook."""

    def __init__(self, webhook_url: str):
        self.webhook_url = webhook_url
        self.client = httpx.Client(timeout=10.0)

    def on_llm_start(
        self,
        serialized: Dict[str, Any],
        prompts: List[str],
        **kwargs
    ) -> None:
        """Called when LLM starts."""
        self._send_event("llm_start", {
            "model": serialized.get("name"),
            "prompt_count": len(prompts)
        })

    def on_llm_end(self, response, **kwargs) -> None:
        """Called when LLM completes."""
        self._send_event("llm_end", {
            "generations": len(response.generations),
            "token_usage": response.llm_output.get("token_usage") if response.llm_output else None
        })

    def on_llm_error(self, error: Exception, **kwargs) -> None:
        """Called on LLM error."""
        self._send_event("llm_error", {
            "error_type": type(error).__name__,
            "message": str(error)
        })

    def on_chain_start(
        self,
        serialized: Dict[str, Any],
        inputs: Dict[str, Any],
        **kwargs
    ) -> None:
        """Called when chain starts."""
        self._send_event("chain_start", {
            "chain": serialized.get("name"),
            "input_keys": list(inputs.keys())
        })

    def on_chain_end(self, outputs: Dict[str, Any], **kwargs) -> None:
        """Called when chain completes."""
        self._send_event("chain_end", {
            "output_keys": list(outputs.keys())
        })

    def _send_event(self, event_type: str, data: Dict[str, Any]) -> None:
        """POST the event payload to the webhook endpoint."""
        try:
            self.client.post(self.webhook_url, json={
                "event": event_type,
                "data": data
            })
        except httpx.HTTPError:
            # A failing webhook must never break the chain run itself
            pass
```