# LangChain Production Checklist
## Overview
A comprehensive checklist for deploying LangChain applications to production, covering reliability, security, and performance.
## Prerequisites
- LangChain application developed and tested
- Infrastructure provisioned
- CI/CD pipeline configured
## Production Checklist
### 1. Configuration & Secrets
- [ ] All API keys in secrets manager (not env vars in code)
- [ ] Environment-specific configurations separated
- [ ] Fallback values for non-critical settings
- [ ] Configuration validation on startup
```python
from pydantic import Field, SecretStr
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    """Validated configuration loaded from the environment (or .env)."""

    model_config = SettingsConfigDict(env_file=".env")

    # Field names map to environment variables (OPENAI_API_KEY, LANGSMITH_API_KEY)
    openai_api_key: SecretStr
    langsmith_api_key: SecretStr  # used by the tracing setup below
    model_name: str = "gpt-4o-mini"
    max_retries: int = Field(default=3, ge=1, le=10)
    timeout_seconds: int = Field(default=30, ge=5, le=120)

settings = Settings()  # Validates on import; fails fast if required keys are missing
```
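As a usage sketch (the parameter wiring is illustrative, not part of the checklist), the validated settings can then be passed to the model client instead of reading raw environment variables at call sites:
```python
from langchain_openai import ChatOpenAI

# Sketch: construct the client from validated settings at startup.
llm = ChatOpenAI(
    model=settings.model_name,
    api_key=settings.openai_api_key.get_secret_value(),
    max_retries=settings.max_retries,
    timeout=settings.timeout_seconds,
)
```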
### 2. Error Handling & Resilience
- [ ] Retry logic with exponential backoff
- [ ] Fallback models configured
- [ ] Circuit breaker for cascading failures
- [ ] Graceful degradation strategy
```python
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic

# Primary model retries transient errors; a second provider covers sustained failures
primary = ChatOpenAI(model="gpt-4o-mini", max_retries=3)
fallback = ChatAnthropic(model="claude-3-5-sonnet-20241022")
robust_llm = primary.with_fallbacks([fallback])
```
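For the exponential-backoff item above, LangChain runnables expose `with_retry`; a minimal sketch (the attempt count is illustrative, not prescribed by this checklist):
```python
# Sketch: retry the primary with exponential backoff, then fall back to another provider.
resilient_llm = primary.with_retry(
    wait_exponential_jitter=True,  # exponential backoff with jitter between attempts
    stop_after_attempt=3,          # illustrative cap; tune for your workload
).with_fallbacks([fallback])
```
Applying the retry before the fallback keeps transient errors on the primary model and reserves the second provider for sustained outages.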
### 3. Observability
- [ ] Structured logging configured
- [ ] Metrics collection enabled
- [ ] Distributed tracing (LangSmith or OpenTelemetry)
- [ ] Alerting rules defined
```python
import os
# LangSmith tracing
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = settings.langsmith_api_key
os.environ["LANGCHAIN_PROJECT"] = "production"
# Prometheus metrics
from prometheus_client import Counter, Histogram
llm_requests = Counter("langchain_llm_requests_total", "Total LLM requests")
llm_latency = Histogram("langchain_llm_latency_seconds", "LLM request latency in seconds")