Upload files to S3 and invalidate CloudFront cache. Use when user needs to (1) upload HTML/files to treenod-static-origin bucket, (2) invalidate CloudFront cache, or (3) generate doc.treenod.com URLs. Primarily for HTML files, but supports CSS, JS, images.
February 4, 2026
Install with:

`npx add-skill https://github.com/treenod-IDQ/treenod-market/blob/main/plugins/util/skills/s3-uploader/SKILL.md -a claude-code --skill s3-uploader`

Installation path: `.claude/skills/s3-uploader/`

## Setup
### Configure AWS Credentials
**Option A: Claude Code Settings (Recommended)**
Add to `~/.claude/settings.json` (macOS/Linux/WSL) or `%USERPROFILE%\.claude\settings.json` (Windows):
```json
{
  "environmentVariables": {
    "AWS_ACCESS_KEY_ID": "your-access-key",
    "AWS_SECRET_ACCESS_KEY": "your-secret-key"
  }
}
```
**Option B: Environment Variables**
```bash
export AWS_ACCESS_KEY_ID="your-access-key"
export AWS_SECRET_ACCESS_KEY="your-secret-key"
```
**Option C: AWS CLI**
```bash
aws configure
```
The skill automatically detects credentials from any of these sources.
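The resolution order can be pictured with a small sketch: environment variables are checked first, then the Claude Code settings file. The function name and structure here are illustrative, not the skill's actual code in `scripts/s3_upload.py`:

```python
import json
import os
from pathlib import Path

def load_aws_credentials(settings_path=None):
    """Return (access_key, secret_key), preferring environment variables
    and falling back to Claude Code's settings.json; (None, None) if
    neither source has them."""
    key = os.environ.get("AWS_ACCESS_KEY_ID")
    secret = os.environ.get("AWS_SECRET_ACCESS_KEY")
    if key and secret:
        return key, secret
    path = Path(settings_path or Path.home() / ".claude" / "settings.json")
    if path.exists():
        env = json.loads(path.read_text()).get("environmentVariables", {})
        return env.get("AWS_ACCESS_KEY_ID"), env.get("AWS_SECRET_ACCESS_KEY")
    return None, None
```

Option C (`aws configure`) is handled by boto3 itself, which reads `~/.aws/credentials` when no explicit keys are present.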
### Optional Configuration
| Variable | Default | Description |
|----------|---------|-------------|
| `S3_BUCKET` | `treenod-static-origin` | S3 bucket name |
| `S3_PREFIX` | `doc.treenod.com/data/` | S3 key prefix |
| `CLOUDFRONT_DISTRIBUTION_ID` | `EYDV6OWEIIKXK` | CloudFront distribution |
| `PUBLIC_URL_BASE` | `https://doc.treenod.com/data/` | Public URL base |
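For a one-off run against different infrastructure, the variables in the table can be exported before invoking the scripts. The values below are placeholders, not real buckets or distributions:

```shell
# Override the skill's defaults for this shell session only.
export S3_BUCKET="my-staging-bucket"              # default: treenod-static-origin
export S3_PREFIX="staging.example.com/data/"      # default: doc.treenod.com/data/
export CLOUDFRONT_DISTRIBUTION_ID="E2EXAMPLE"     # default: EYDV6OWEIIKXK
export PUBLIC_URL_BASE="https://staging.example.com/data/"

# Verify the overrides took effect.
echo "$S3_BUCKET/$S3_PREFIX"
```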
## Notes
- **boto3 auto-install**: The skill automatically installs boto3 if not found on first run.
- **Cross-platform**: Works on Windows, macOS, Linux, and WSL.
- **Credential auto-load**: Automatically loads AWS credentials from Claude Code settings if not in environment.
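The boto3 auto-install behavior noted above amounts to a try-import-then-pip pattern. This is a rough sketch with an illustrative helper name; the skill's actual bootstrap code may differ:

```python
import importlib
import subprocess
import sys

def ensure_module(name: str):
    """Import a module, pip-installing it into the current
    interpreter first if it is missing."""
    try:
        return importlib.import_module(name)
    except ImportError:
        subprocess.check_call([sys.executable, "-m", "pip", "install", name])
        return importlib.import_module(name)

# The skill would then do something like:
# boto3 = ensure_module("boto3")
```

Installing with `sys.executable -m pip` (rather than a bare `pip`) keeps the install tied to the interpreter actually running the script, which is what makes this reliable across Windows, macOS, Linux, and WSL.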
## Commands
Run these commands from the skill directory.
### upload - Upload file to S3
```bash
python scripts/s3_upload.py upload <file_path> # Upload file
python scripts/s3_upload.py upload <file_path> --invalidate # Upload + invalidate cache
python scripts/s3_upload.py upload <file_path> --force # Skip duplicate check
python scripts/s3_upload.py upload <file_path> --key custom.html # Custom S3 key name
python scripts/s3_upload.py upload <file_path> --auto-name # Auto-generate filename
python scripts/s3_upload.py upload <file_path> --auto-name --description dau-report # With description
```
**Output:**