# Log Aggregation Setup
This skill provides automated assistance for log aggregation setup tasks.
## Overview
Sets up centralized log aggregation (ELK/Loki/Splunk) including ingestion pipelines, parsing, retention policies, dashboards, and security controls.
## Prerequisites
Before using this skill, ensure:
- Target infrastructure is identified (Kubernetes, Docker, VMs)
- Storage requirements are calculated based on log volume (see the sizing example after this list)
- Network connectivity exists between log sources and the aggregation platform
- An authentication mechanism is defined (LDAP, OAuth, basic auth)
- Resource allocation is planned (CPU, memory, disk)
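As a rough sizing example (the figures are illustrative, not a recommendation): 20 GB/day of raw logs retained for 30 days with roughly 1.2x indexing overhead needs about 20 × 30 × 1.2 ≈ 720 GB of hot storage per copy, before replication.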
## Instructions
1. **Select Platform**: Choose the ELK Stack, Grafana Loki, or Splunk
2. **Configure Ingestion**: Set up log shippers (Filebeat, Promtail, Fluentd); a Promtail sketch follows this list
3. **Define Storage**: Configure retention policies and index lifecycle
4. **Set Up Processing**: Create parsing rules and field extractions
5. **Deploy Visualization**: Configure Kibana/Grafana dashboards
6. **Implement Security**: Enable authentication, encryption, and RBAC
7. **Test Pipeline**: Verify logs flow from sources to visualization
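For the Loki path, steps 2 and 4 can be handled in a single Promtail configuration. The sketch below is a minimal example only; the Loki URL, file paths, and the log-level regex are assumptions to adapt to your environment.

```yaml
# promtail-config.yml (minimal sketch; adjust paths and the Loki URL)
server:
  http_listen_port: 9080

positions:
  filename: /tmp/positions.yaml

clients:
  # assumed Loki endpoint; point this at your Loki instance
  - url: http://loki:3100/loki/api/v1/push

scrape_configs:
  - job_name: system
    static_configs:
      - targets: [localhost]
        labels:
          job: varlogs
          __path__: /var/log/*.log
    pipeline_stages:
      # illustrative parsing: extract a log level and promote it to a label
      - regex:
          expression: '(?P<level>INFO|WARN|ERROR)'
      - labels:
          level:
```

Doing the parsing in Promtail's `pipeline_stages` (rather than at query time) keeps label cardinality explicit and under your control.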
## Output
**ELK Stack (Docker Compose):**
```yaml
# {baseDir}/elk/docker-compose.yml
version: '3.8'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.11.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=true
      # with security enabled, set a bootstrap password for the elastic user
      - ELASTIC_PASSWORD=${ELASTIC_PASSWORD:-changeme}
    volumes:
      - es-data:/usr/share/elasticsearch/data
    ports:
      - "9200:9200"
  logstash:
    image: docker.elastic.co/logstash/logstash:8.11.0
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    depends_on:
      - elasticsearch
  kibana:
    image: docker.elastic.co/kibana/kibana:8.11.0
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch

# named volume referenced by the elasticsearch service above
volumes:
  es-data:
```
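The compose file above runs Logstash but does not include a shipper. A minimal Filebeat sketch that forwards container logs to it might look like the following; the mounted log path and the `logstash:5044` endpoint are assumptions, and Logstash must expose a beats input on that port.

```yaml
# filebeat.yml (runs on each Docker host, or as a container with the log dir mounted)
filebeat.inputs:
  - type: container
    paths:
      - /var/lib/docker/containers/*/*.log

output.logstash:
  # assumed hostname/port of the logstash service from the compose file
  hosts: ["logstash:5044"]
```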
**Loki Configuration:**