Templates and patterns for common ML training scenarios, including text classification, text generation, fine-tuning, and PEFT/LoRA. Provides ready-to-use training configurations, dataset preparation scripts, and complete training pipelines. Use when building ML training pipelines, fine-tuning models, implementing classification or generation tasks, setting up PEFT/LoRA training, or when the user mentions model training, fine-tuning, classification, generation, or parameter-efficient tuning.
February 1, 2026
Install with:

```bash
npx add-skill https://github.com/vanman2024/ai-dev-marketplace/blob/main/plugins/ml-training/skills/training-patterns/SKILL.md -a claude-code --skill training-patterns
```

Installation paths:
`.claude/skills/training-patterns/`

# ML Training Patterns

**Purpose:** Provide production-ready training templates, configuration files, and automation scripts for common ML training scenarios, including classification, generation, fine-tuning, and PEFT/LoRA approaches.

**Activation Triggers:**

- Building text classification models (sentiment, intent, NER, etc.)
- Training text generation models (summarization, Q&A, chatbots)
- Fine-tuning pre-trained models for specific tasks
- Implementing PEFT (Parameter-Efficient Fine-Tuning) with LoRA
- Setting up training pipelines with HuggingFace Transformers
- Configuring training hyperparameters and optimization
- Preparing datasets for model training

**Key Resources:**

- `scripts/setup-classification.sh` - Classification training setup automation
- `scripts/setup-generation.sh` - Generation training setup automation
- `scripts/setup-fine-tuning.sh` - Full fine-tuning setup automation
- `scripts/setup-peft.sh` - PEFT/LoRA training setup automation
- `templates/classification-config.yaml` - Classification training configuration
- `templates/generation-config.yaml` - Generation training configuration
- `templates/peft-config.json` - PEFT/LoRA configuration
- `examples/sentiment-classifier.md` - Complete sentiment classification example
- `examples/text-generator.md` - Complete text generation example

## Training Scenarios Overview

### 1. Text Classification

**Use cases:** Sentiment analysis, intent classification, topic categorization, spam detection, named entity recognition (NER)

**Key characteristics:**

- Input: Text → Output: Class label(s)
- Typically uses encoder models (BERT, RoBERTa, DistilBERT)
- Fast inference, suitable for production
- Requires labeled training data

**Setup command:**

```bash
./scripts/setup-classification.sh <project-name> <model-name> <num-classes>
```

**Example:**

```bash
./scripts/setup-classification.sh sentiment-model distilbert-base-uncased 3
```

### 2. Text Generation

**Use cases:** Summarization, question answering
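Classification training (scenario 1 above) requires labeled data split into train and validation sets, and the dataset-preparation step benefits from a stratified split so that class proportions match across splits. A minimal stdlib-only sketch of that idea is below; the function name `stratified_split` and the toy 3-class data are illustrative, not part of this skill's scripts:

```python
import random
from collections import defaultdict

def stratified_split(examples, val_fraction=0.2, seed=42):
    """Split (text, label) pairs into train/validation sets while
    preserving the per-class label distribution in both splits."""
    by_label = defaultdict(list)
    for text, label in examples:
        by_label[label].append((text, label))
    rng = random.Random(seed)
    train, val = [], []
    for label, items in by_label.items():
        rng.shuffle(items)
        # Reserve at least one validation example per class.
        n_val = max(1, int(len(items) * val_fraction))
        val.extend(items[:n_val])
        train.extend(items[n_val:])
    return train, val

# Toy 3-class dataset (matching the num-classes=3 sentiment example above).
data = [(f"sample {i}", i % 3) for i in range(30)]
train, val = stratified_split(data)
# 24 train / 6 validation examples, with each class appearing twice in val.
```

In a real pipeline the resulting splits would be tokenized and fed to a HuggingFace `Trainer`; the point here is only that the split is done per class, not over the raw list.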
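The PEFT/LoRA option mentioned in the triggers and in `templates/peft-config.json` is "parameter-efficient" because LoRA trains two low-rank factors instead of a full weight update. A back-of-the-envelope sketch of that arithmetic, using an assumed 768×768 projection (BERT-base sized) and rank 8 purely for illustration:

```python
def lora_trainable_params(d_out, d_in, rank):
    """LoRA keeps the d_out x d_in weight frozen and learns an update
    B @ A, where A is (rank x d_in) and B is (d_out x rank), so only
    rank * (d_in + d_out) parameters are trained per adapted matrix."""
    return rank * (d_in + d_out)

full = 768 * 768                            # params updated by full fine-tuning
lora = lora_trainable_params(768, 768, 8)   # params trained by LoRA at rank 8
fraction = lora / full                      # roughly 2% of the full matrix
```

This is why the rank (`r`) field in a PEFT/LoRA configuration is the main knob trading capacity against memory: trainable parameters scale linearly with it.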