# LangChain Reference Architecture
## Overview
Production-ready architectural patterns for building scalable, maintainable LangChain applications.
## Prerequisites
- Understanding of LangChain fundamentals
- Experience with software architecture
- Knowledge of cloud infrastructure
## Architecture Patterns
### Pattern 1: Layered Architecture
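Split the codebase into an API layer, a business-logic (core) layer, and an infrastructure layer, so chains, agents, and tools never import web-framework or vendor SDK code directly: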
```
src/
├── api/                  # API layer (FastAPI/Flask)
│   ├── __init__.py
│   ├── routes/
│   │   ├── chat.py
│   │   └── agents.py
│   └── middleware/
│       ├── auth.py
│       └── rate_limit.py
├── core/                 # Business logic layer
│   ├── __init__.py
│   ├── chains/
│   │   ├── __init__.py
│   │   ├── chat_chain.py
│   │   └── rag_chain.py
│   ├── agents/
│   │   ├── __init__.py
│   │   └── research_agent.py
│   └── tools/
│       ├── __init__.py
│       └── search.py
├── infrastructure/       # Infrastructure layer
│   ├── __init__.py
│   ├── llm/
│   │   ├── __init__.py
│   │   └── provider.py
│   ├── vectorstore/
│   │   └── pinecone.py
│   └── cache/
│       └── redis.py
├── config/               # Configuration
│   ├── __init__.py
│   └── settings.py
└── main.py
```
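A minimal sketch of how the layers can be wired together at the composition root; the module and router names below are assumptions derived from the tree above, not part of the original skill:

```python
# main.py -- hypothetical composition root (names assumed from the tree above)
from fastapi import FastAPI

# The API layer is the only layer that touches HTTP; route modules call into
# core/ (chains, agents), which in turn depends on infrastructure/ for LLMs,
# vector stores, and caching.
from api.routes import agents, chat  # assumes each module exposes an APIRouter named `router`

app = FastAPI(title="LangChain Service")
app.include_router(chat.router, prefix="/chat", tags=["chat"])
app.include_router(agents.router, prefix="/agents", tags=["agents"])
```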
### Pattern 2: Provider Abstraction
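Wrap vendor-specific chat models behind a common interface so application code depends only on `BaseChatModel` and the concrete provider can be selected by configuration: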
```python
# infrastructure/llm/provider.py
from abc import ABC, abstractmethod

from langchain_core.language_models import BaseChatModel
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic


class LLMProvider(ABC):
    """Abstract LLM provider."""

    @abstractmethod
    def get_chat_model(self, **kwargs) -> BaseChatModel:
        pass


class OpenAIProvider(LLMProvider):
    def get_chat_model(self, model: str = "gpt-4o-mini", **kwargs) -> BaseChatModel:
        return ChatOpenAI(model=model, **kwargs)


class AnthropicProvider(LLMProvider):
    def get_chat_model(self, model: str = "claude-3-5-sonnet-20241022", **kwargs) -> BaseChatModel:
        return ChatAnthropic(model=model, **kwargs)


class LLMFactory:
    """Factory for creating LLM instances."""