January 22, 2026
# LangChain Migration Deep Dive
## Overview
Comprehensive strategies for migrating to LangChain from legacy LLM implementations or other frameworks.
## Prerequisites
- Existing LLM application to migrate
- Understanding of current architecture
- Test coverage for validation
- Staging environment for testing
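Before migrating, it helps to lock in a behavioral baseline so you can run the legacy and migrated implementations side by side. The sketch below is a minimal illustration (the helper name `check_outputs` and the non-empty-string check are assumptions, not part of this skill); real parity testing would also compare semantics, but this catches basic wiring errors:

```python
# parity_check.py
# Hypothetical helper: run the legacy and migrated chat functions over
# a fixed prompt set and collect (label, prompt) pairs whose output
# fails a basic non-empty-string sanity check.

def check_outputs(legacy_chat, new_chat, prompts):
    failures = []
    for prompt in prompts:
        for label, fn in (("legacy", legacy_chat), ("new", new_chat)):
            out = fn(prompt)
            if not isinstance(out, str) or not out.strip():
                failures.append((label, prompt))
    return failures
```

Run it against both `chat` implementations in a staging environment; an empty result means both functions at least return usable strings for every prompt.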
## Migration Scenarios
### Scenario 1: Raw OpenAI SDK to LangChain
#### Before (Raw SDK)
```python
# legacy_openai.py
import openai
client = openai.OpenAI()
def chat(message: str, history: list | None = None) -> str:
    messages = [{"role": "system", "content": "You are helpful."}]
    if history:
        messages.extend(history)
    messages.append({"role": "user", "content": message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
        temperature=0.7,
    )
    return response.choices[0].message.content
```
#### After (LangChain)
```python
# langchain_chat.py
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.output_parsers import StrOutputParser
from langchain_core.messages import HumanMessage, AIMessage
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are helpful."),
    MessagesPlaceholder(variable_name="history"),
    ("user", "{message}"),
])
chain = prompt | llm | StrOutputParser()
def chat(message: str, history: list | None = None) -> str:
    # Convert legacy OpenAI-style dicts to LangChain message objects
    lc_history = []
    if history:
        for msg in history:
            if msg["role"] == "user":
                lc_history.append(HumanMessage(content=msg["content"]))
            elif msg["role"] == "assistant":
                lc_history.append(AIMessage(content=msg["content"]))
    return chain.invoke({"message": message, "history": lc_history})
```
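As a lighter-weight alternative to constructing message objects, LangChain prompt templates also coerce `(role, content)` tuples into messages, so the history conversion can stay a plain mapping. A sketch (the helper name `to_lc_tuples` and `ROLE_MAP` are illustrative, not part of the LangChain API):

```python
# Hypothetical helper: map OpenAI-style role dicts to (role, content)
# tuples, which ChatPromptTemplate coerces into messages at invoke time.
ROLE_MAP = {"user": "human", "assistant": "ai", "system": "system"}

def to_lc_tuples(history: list[dict]) -> list[tuple[str, str]]:
    return [
        (ROLE_MAP[m["role"]], m["content"])
        for m in history
        if m["role"] in ROLE_MAP
    ]
```

With this, the `chat` body reduces to `chain.invoke({"message": message, "history": to_lc_tuples(history or [])})`, at the cost of being slightly less explicit than building `HumanMessage`/`AIMessage` objects directly.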
### Scenario 2: LlamaIndex to LangChain
#### Before (LlamaIndex)
```python
# legacy_llamaindex.py
f