Optimize Windsurf API performance with caching, batching, and connection pooling. Use when experiencing slow API responses, implementing caching strategies, or optimizing request throughput for Windsurf integrations. Trigger with phrases like "windsurf performance", "optimize windsurf", "windsurf latency", "windsurf caching", "windsurf slow", "windsurf batch".
plugins/saas-packs/windsurf-pack/skills/windsurf-performance-tuning/SKILL.md
February 1, 2026
# Windsurf Performance Tuning
## Overview
Optimize Windsurf API performance with caching, batching, and connection pooling.
## Prerequisites
- Windsurf SDK installed
- Understanding of async patterns
- Redis or in-memory cache available (optional)
- Performance monitoring in place
## Latency Benchmarks
| Operation | P50 | P95 | P99 |
|-----------|-----|-----|-----|
| Read | 50ms | 150ms | 300ms |
| Write | 100ms | 250ms | 500ms |
| List | 75ms | 200ms | 400ms |
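To compare your own integration against these targets, you can record per-request durations and compute percentiles over them. The helper below is a sketch; the names are illustrative, not part of the Windsurf SDK:

```typescript
// Nearest-rank percentile over a sample of request durations (ms).
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const idx = Math.min(sorted.length - 1, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[Math.max(0, idx)];
}

// Wrap any async call and push its duration into a sample array,
// even when the call throws.
async function timed<T>(samples: number[], fn: () => Promise<T>): Promise<T> {
  const start = Date.now();
  try {
    return await fn();
  } finally {
    samples.push(Date.now() - start);
  }
}
```

With enough samples, `percentile(samples, 50)`, `percentile(samples, 95)`, and `percentile(samples, 99)` map directly onto the P50/P95/P99 columns above.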
## Caching Strategy
### Response Caching
```typescript
import { LRUCache } from 'lru-cache';
const cache = new LRUCache<string, any>({
max: 1000,
ttl: 60000, // 1 minute
updateAgeOnGet: true,
});
async function cachedWindsurfRequest<T>(
key: string,
fetcher: () => Promise<T>,
ttl?: number
): Promise<T> {
const cached = cache.get(key);
if (cached !== undefined) return cached as T; // strict check so cached falsy values (0, '', null) still hit
const result = await fetcher();
cache.set(key, result, { ttl });
return result;
}
```
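One gap in a read-through cache like the one above: when an entry expires, every concurrent caller misses and hits the API at once (a cache stampede). A common fix is to deduplicate in-flight requests so concurrent callers for the same key share one promise. A self-contained sketch, assuming nothing beyond standard TypeScript (the names are illustrative):

```typescript
// Shared registry of in-flight requests, keyed the same way as the cache.
const inFlight = new Map<string, Promise<unknown>>();

// Concurrent callers for the same key await a single fetch; the entry is
// cleared once the fetch settles, so later calls fetch fresh data.
async function dedupedRequest<T>(
  key: string,
  fetcher: () => Promise<T>
): Promise<T> {
  const pending = inFlight.get(key);
  if (pending) return pending as Promise<T>;
  const promise = fetcher().finally(() => inFlight.delete(key));
  inFlight.set(key, promise);
  return promise;
}
```

This composes with the LRU cache: check the cache first, then route misses through `dedupedRequest` so only one fetch per key is outstanding.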
### Redis Caching (Distributed)
```typescript
import Redis from 'ioredis';
const redis = new Redis(process.env.REDIS_URL);
async function cachedWithRedis<T>(
key: string,
fetcher: () => Promise<T>,
ttlSeconds = 60
): Promise<T> {
const cached = await redis.get(key);
if (cached) return JSON.parse(cached);
const result = await fetcher();
await redis.setex(key, ttlSeconds, JSON.stringify(result));
return result;
}
```
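Distributed caching only pays off if logically identical requests produce identical keys. A small key builder that sorts parameters makes the key order-independent; this is a hypothetical helper, not part of the Windsurf SDK:

```typescript
// Build a stable Redis key from a resource name and its query parameters.
// Sorting the parameter names means {a: 1, b: 2} and {b: 2, a: 1} collide
// on the same cache entry, as they should.
function cacheKey(
  resource: string,
  params: Record<string, string | number>
): string {
  const normalized = Object.keys(params)
    .sort()
    .map((k) => `${k}=${params[k]}`)
    .join('&');
  return `windsurf:${resource}:${normalized}`;
}
```

Namespacing keys with a `windsurf:` prefix also makes it easy to scan or flush only this integration's entries.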
## Request Batching
```typescript
import DataLoader from 'dataloader';
const windsurfLoader = new DataLoader<string, any>(
async (ids) => {
// Batch fetch from Windsurf
const results = await windsurfClient.batchGet(ids);
return ids.map(id => results.find(r => r.id === id) || null);
},
{
maxBatchSize: 100,
batchScheduleFn: callback => setTimeout(callback, 10),
}
);
// Usage - automatically batched
const [item1, item2, item3] = await Promise.all([
windsurfLoader.load('id-1'),
windsurfLoader.load('id-2'),
windsurfLoader.load('id-3'),
]);
```
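If you would rather not take a DataLoader dependency, the same scheduling-window idea can be hand-rolled in a few lines: calls made within one short window are queued and resolved together. A minimal sketch (`batchGet` here is a stand-in for your real batch API, and error handling is omitted for brevity):

```typescript
// Returns a single-ID loader that transparently coalesces calls made
// within `windowMs` into one batch request.
function createBatcher<T>(
  batchGet: (ids: string[]) => Promise<Map<string, T>>,
  windowMs = 10
): (id: string) => Promise<T | null> {
  let queue: { id: string; resolve: (v: T | null) => void }[] = [];
  let timer: ReturnType<typeof setTimeout> | null = null;

  return (id) =>
    new Promise((resolve) => {
      queue.push({ id, resolve });
      if (timer) return; // a flush is already scheduled for this window
      timer = setTimeout(async () => {
        const batch = queue;
        queue = [];
        timer = null;
        const results = await batchGet(batch.map((e) => e.id));
        for (const e of batch) e.resolve(results.get(e.id) ?? null);
      }, windowMs);
    });
}
```

The 10 ms window trades a small fixed latency for far fewer round trips, the same trade DataLoader's `batchScheduleFn` makes above.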
## Connection Optimization
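The main lever here is HTTP keep-alive pooling: reuse TCP (and TLS) connections across requests instead of paying the handshake cost on every call. A minimal sketch for a Node.js runtime; the pool sizes below are illustrative starting points, not Windsurf-documented values:

```typescript
import { Agent } from 'node:https';

// Reuse sockets across Windsurf API calls instead of opening a new
// connection per request.
const keepAliveAgent = new Agent({
  keepAlive: true,     // keep sockets open between requests
  maxSockets: 50,      // cap concurrent connections per host
  maxFreeSockets: 10,  // idle sockets kept warm in the pool
  timeout: 30_000,     // socket inactivity timeout (ms)
});
```

Pass the agent to your HTTP client (for example as the `agent` option in Node's `https.request`) so all Windsurf traffic shares the pool.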