prompt-engineering-suite

verified

Comprehensive prompt engineering with Chain-of-Thought, few-shot learning, prompt versioning, and optimization. Use when designing prompts, improving accuracy, managing prompt lifecycle.

Marketplace: orchestkit (yonatangross/orchestkit)
Plugin: ork-llm-core (ai)
Repository: yonatangross/orchestkit (33 stars)
Path: plugins/ork-llm-core/skills/prompt-engineering-suite/SKILL.md
Last Verified: January 25, 2026

Install Skill

```bash
npx add-skill https://github.com/yonatangross/orchestkit/blob/main/plugins/ork-llm-core/skills/prompt-engineering-suite/SKILL.md -a claude-code --skill prompt-engineering-suite
```

Installation paths:

Claude: .claude/skills/prompt-engineering-suite/

Instructions

# Prompt Engineering Suite

Design, version, and optimize prompts for production LLM applications.

## Overview

Use this skill when:

- Designing prompts for new LLM features
- Improving accuracy with Chain-of-Thought reasoning
- Adding few-shot examples with dynamic example selection
- Managing prompts in production (versioning, A/B testing)
- Automating prompt optimization with DSPy

## Quick Reference

### Chain-of-Thought Pattern

```python
from langchain_core.prompts import ChatPromptTemplate

COT_SYSTEM = """You are a helpful assistant that solves problems step-by-step.

When solving problems:
1. Break down the problem into clear steps
2. Show your reasoning for each step
3. Verify your answer before responding
4. If uncertain, acknowledge limitations

Format your response as:
STEP 1: [description]
Reasoning: [your thought process]

STEP 2: [description]
Reasoning: [your thought process]

...

FINAL ANSWER: [your conclusion]"""

cot_prompt = ChatPromptTemplate.from_messages([
    ("system", COT_SYSTEM),
    ("human", "Problem: {problem}\n\nThink through this step-by-step."),
])
```
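
To run the template, pipe it into a chat model with LCEL. A minimal usage sketch, assuming `langchain-openai` is installed and `OPENAI_API_KEY` is set (the model name is illustrative):

```python
from langchain_openai import ChatOpenAI

# Any chat model works here; the model name is an illustrative choice.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# LCEL: prompt | model. invoke() fills {problem} and returns an AIMessage.
cot_chain = cot_prompt | llm
result = cot_chain.invoke(
    {"problem": "A train travels 60 km in 45 minutes. What is its average speed in km/h?"}
)
print(result.content)
```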

### Few-Shot with Dynamic Examples

```python
from langchain_core.prompts import ChatPromptTemplate, FewShotChatMessagePromptTemplate

examples = [
    {"input": "What is 2+2?", "output": "4"},
    {"input": "What is the capital of France?", "output": "Paris"},
]

few_shot = FewShotChatMessagePromptTemplate(
    examples=examples,
    example_prompt=ChatPromptTemplate.from_messages([
        ("human", "{input}"),
        ("ai", "{output}"),
    ]),
)

final_prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant. Answer concisely."),
    few_shot,
    ("human", "{input}"),
])
```
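
The example list above is static. For truly dynamic selection, a semantic example selector can pick the k most relevant examples per query. A sketch, assuming `langchain-openai` and `langchain-chroma` are installed and `OPENAI_API_KEY` is set; any embeddings/vector store pair works:

```python
from langchain_chroma import Chroma
from langchain_core.example_selectors import SemanticSimilarityExampleSelector
from langchain_openai import OpenAIEmbeddings

# Embed the examples and pick the k most similar to each incoming input.
example_selector = SemanticSimilarityExampleSelector.from_examples(
    examples,
    OpenAIEmbeddings(),
    Chroma,
    k=2,
)

dynamic_few_shot = FewShotChatMessagePromptTemplate(
    input_variables=["input"],  # variable used for the similarity lookup
    example_selector=example_selector,
    example_prompt=ChatPromptTemplate.from_messages([
        ("human", "{input}"),
        ("ai", "{output}"),
    ]),
)
```

Drop `dynamic_few_shot` into `final_prompt` in place of `few_shot` to select examples per request.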

### Prompt Versioning with Langfuse SDK v3

```python
from langfuse import Langfuse
# Note: Langfuse SDK v3 is OTEL-native (acquired by ClickHouse Jan 2026)

langfuse = Langfuse()

# Get versioned prompt with label
prompt = langfuse.get_prompt(
    name="customer-support-v2",
    label="production",  # production, staging, canary

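
Once fetched, the prompt is compiled with its template variables before being sent to a model. A minimal sketch, assuming the stored prompt is a text prompt whose template uses a `{{question}}` variable (the variable name is hypothetical):

```python
# compile() substitutes {{...}} variables in the stored template.
# "question" is a hypothetical variable name -- match what the stored prompt defines.
compiled = prompt.compile(question="How do I reset my password?")

# Version and label metadata are available for logging and rollbacks.
print(prompt.name, prompt.version, prompt.labels)
```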