automl-optimizer (verified)

Automated machine learning with hyperparameter optimization using Optuna, Hyperopt, or AutoML libraries. Activates for "automl", "hyperparameter tuning", "optimize hyperparameters", "auto tune model", "neural architecture search", "automated ml". Systematically explores model and hyperparameter spaces, tracks all experiments, and finds optimal configurations with minimal manual intervention.

Marketplace: specweave
Plugin: sw-ml (development)
Repository: anton-abyzov/specweave (27 stars)
Skill file: plugins/specweave-ml/skills/automl-optimizer/SKILL.md
Last verified: January 25, 2026

Install (Claude Code, via the add-skill CLI):

npx add-skill https://github.com/anton-abyzov/specweave/blob/main/plugins/specweave-ml/skills/automl-optimizer/SKILL.md -a claude-code --skill automl-optimizer

Installation path: .claude/skills/automl-optimizer/

Instructions

# AutoML Optimizer

## Overview

This skill automates the tedious process of hyperparameter tuning and model selection. Instead of trying configurations by hand, you define a search space and let the optimizer explore it intelligently, tracking every trial along the way.

## Why AutoML?

**Manual Tuning Problems**:
- Time-consuming (hours/days of trial and error)
- Subjective (depends on intuition)
- Incomplete (exhaustive search is infeasible: a grid of just 10 values across 5 parameters is already 10^5 = 100,000 training runs)
- Not reproducible (hard to document search process)

**AutoML Benefits**:
- ✅ Systematic exploration of the search space
- ✅ Intelligent sampling (Bayesian optimization rather than blind grid or random search)
- ✅ All experiments tracked automatically
- ✅ Strong configurations found in far fewer trials
- ✅ Reproducible (the search process is documented)

## AutoML Strategies

### Strategy 1: Hyperparameter Optimization (Optuna)

```python
from xgboost import XGBClassifier
from sklearn.model_selection import cross_val_score

from specweave import OptunaOptimizer

# Define search space
def objective(trial):
    # Suggest hyperparameters
    params = {
        'n_estimators': trial.suggest_int('n_estimators', 100, 1000),
        'max_depth': trial.suggest_int('max_depth', 3, 10),
        'learning_rate': trial.suggest_float('learning_rate', 0.01, 0.3, log=True),
        'subsample': trial.suggest_float('subsample', 0.5, 1.0),
        'colsample_bytree': trial.suggest_float('colsample_bytree', 0.5, 1.0)
    }
    
    # Train model
    model = XGBClassifier(**params)
    
    # 5-fold cross-validated AUC (X_train / y_train assumed defined in scope)
    scores = cross_val_score(model, X_train, y_train, cv=5, scoring='roc_auc')
    
    return scores.mean()

# Run optimization
optimizer = OptunaOptimizer(
    objective=objective,
    n_trials=100,
    direction='maximize',
    increment="0042"
)

best_params = optimizer.optimize()

# Creates:
# - .specweave/increments/0042.../experiments/optuna-study/
#   ├── study.db (Optuna database)
#   ├── optimization_history.png
#   ├── param_importances.png
#   ├── parallel_coordinate.png
#   └── best_params.json
```
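
The `OptunaOptimizer` wrapper is SpecWeave-specific, but the study it manages is plain Optuna underneath. For reference, here is a minimal sketch of the equivalent vanilla-Optuna workflow; the study name and storage path are placeholder assumptions, while the `optuna` calls themselves are standard API:

```python
import optuna
from optuna.visualization import (
    plot_optimization_history,
    plot_parallel_coordinate,
    plot_param_importances,
)

# Persist trials to SQLite so the study is resumable and auditable.
# Study name and storage path are placeholders, not SpecWeave conventions.
study = optuna.create_study(
    study_name="automl-optimizer-demo",
    storage="sqlite:///study.db",
    load_if_exists=True,
    direction="maximize",
    sampler=optuna.samplers.TPESampler(seed=42),  # seeded for reproducibility
)

# Reuses the `objective` function defined above.
study.optimize(objective, n_trials=100)

print("Best AUC:", study.best_value)
print("Best params:", study.best_params)

# Standard Optuna plots, analogous to the PNGs listed above
# (written as HTML to avoid the extra image-export dependency).
plot_optimization_history(study).write_html("optimization_history.html")
plot_param_importances(study).write_html("param_importances.html")
plot_parallel_coordinate(study).write_html("parallel_coordinate.html")
```

Optuna's default TPE sampler is the "intelligent sampling (Bayesian optimization)" referred to above: it models which regions of the search space produce good scores and concentrates later trials there, which is why roughly 100 trials can compete with a six-figure grid.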

**Optimization Report**:
```markdown
# Optuna Optimization Report

## Search Space
```
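
After the study completes, the tuned configuration can be reloaded to fit the final model. A minimal sketch, assuming `best_params.json` (from the experiment directory shown above) has been copied into the working directory:

```python
import json

from xgboost import XGBClassifier

# Path is an assumption; the real file lives under the increment's
# experiments/optuna-study/ directory created by the optimizer.
with open("best_params.json") as f:
    best_params = json.load(f)

# Refit on the full training set using the tuned hyperparameters.
final_model = XGBClassifier(**best_params)
final_model.fit(X_train, y_train)
```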
