January 22, 2026
Install with:

    npx add-skill https://github.com/jeremylongshore/claude-code-plugins-plus-skills/blob/main/plugins/saas-packs/langchain-pack/skills/langchain-debug-bundle/SKILL.md -a claude-code --skill langchain-debug-bundle

Installation path: `.claude/skills/langchain-debug-bundle/`

# LangChain Debug Bundle
## Overview
Collect comprehensive debug information for LangChain issues, including traces, package versions, and reproduction steps.
## Prerequisites
- LangChain installed
- Reproducible error condition
- Access to logs and environment
## Instructions
### Step 1: Collect Environment Info
```python
# debug_bundle.py
import platform
import subprocess
import sys


def collect_environment():
    """Collect system and package information."""
    info = {
        "python_version": sys.version,
        "platform": platform.platform(),
        "packages": {},
    }

    # LangChain packages and common provider SDKs to report versions for
    packages = [
        "langchain",
        "langchain-core",
        "langchain-community",
        "langchain-openai",
        "langchain-anthropic",
        "openai",
        "anthropic",
    ]

    for pkg in packages:
        result = subprocess.run(
            [sys.executable, "-m", "pip", "show", pkg],
            capture_output=True,
            text=True,
        )
        # pip show exits non-zero with empty output for a missing package
        # (it does not raise), so inspect stdout rather than catching exceptions
        version = "not installed"
        for line in result.stdout.splitlines():
            if line.startswith("Version:"):
                version = line.split(":", 1)[1].strip()
        info["packages"][pkg] = version

    return info


print(collect_environment())
```
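For attaching to a bug report, the snapshot is easiest to share as a file. A minimal sketch of writing it to JSON (the `debug_env.json` filename is illustrative, and the inline dict stands in for `collect_environment()` output so the snippet runs standalone):

```python
import json
import platform
import sys

# Stand-in for the collect_environment() output above,
# kept inline so this snippet runs on its own
info = {
    "python_version": sys.version,
    "platform": platform.platform(),
}

# Write the snapshot to a file that can be attached to an issue
with open("debug_env.json", "w") as f:
    json.dump(info, f, indent=2)
```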
### Step 2: Enable Full Tracing
```python
import os
from datetime import datetime

import langchain
from langchain_core.callbacks import BaseCallbackHandler

# Enable global debug mode (prints all chain and LLM inputs/outputs)
langchain.debug = True

# Enable LangSmith tracing (requires LANGCHAIN_API_KEY in the environment)
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_PROJECT"] = "debug-session"


class DebugCallback(BaseCallbackHandler):
    """Custom callback that records LLM events for the debug bundle."""

    def __init__(self):
        self.logs = []

    def on_llm_start(self, serialized, prompts, **kwargs):
        self.logs.append({
            "event": "llm_start",
            "time": datetime.now().isoformat(),
            "prompts": prompts,
        })

    def on_llm_end(self, response, **kwargs):
        self.logs.append({
            "event": "llm_end",
            "time": datetime.now().isoformat(),
            # LLMResult.generations is a list of lists of Generation objects
            "generations": [[g.text for g in gens] for gens in response.generations],
        })
```
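Once tracing has run, the callback logs and the environment snapshot can be merged into a single file to attach to an issue. A minimal sketch using placeholder data in place of the earlier steps' real output (the `make_bundle` helper and `debug_bundle.json` filename are illustrative, not part of any LangChain API):

```python
import json
from datetime import datetime, timezone


def make_bundle(env_info, callback_logs, error_text=""):
    """Assemble environment info, callback logs, and the error into one dict."""
    return {
        "created": datetime.now(timezone.utc).isoformat(),
        "environment": env_info,
        "events": callback_logs,
        "error": error_text,
    }


# Placeholder data standing in for the output of Steps 1 and 2
bundle = make_bundle(
    env_info={"python_version": "3.12.0", "packages": {"langchain": "not installed"}},
    callback_logs=[{"event": "llm_start", "prompts": ["hello"]}],
    error_text="ValueError: example traceback",
)

with open("debug_bundle.json", "w") as f:
    json.dump(bundle, f, indent=2)
```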