Message queues and event-driven backend architecture. RabbitMQ, Kafka, pub/sub patterns, and async communication.
Repository: `pluginagentmarketplace/custom-plugin-backend`
backend-development-assistant
January 20, 2026
```bash
npx add-skill https://github.com/pluginagentmarketplace/custom-plugin-backend/blob/main/skills/messaging/SKILL.md -a claude-code --skill messaging
```
Installation path: `.claude/skills/messaging/`

# Messaging & Event-Driven Skill
**Bonded to:** `architecture-patterns-agent` (Secondary)
---
## Quick Start
```bash
# Invoke messaging skill
"Set up RabbitMQ for my microservices"
"Implement event-driven order processing"
"Configure Kafka for high-throughput messaging"
```
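The prompts above all center on the same underlying pub/sub pattern: producers emit events to a topic, and any number of subscribers react independently. As a broker-agnostic sketch of that pattern (the `InProcessBus` class is illustrative only, not part of any library):

```python
from collections import defaultdict
from typing import Any, Callable

class InProcessBus:
    """Minimal in-process pub/sub bus; real brokers generalize this
    across processes and hosts (with delivery guarantees on top)."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: Any) -> None:
        # Fan out to every subscriber of this topic; subscribers of
        # other topics never see the event.
        for handler in self._subscribers[topic]:
            handler(event)

bus = InProcessBus()
received = []
bus.subscribe('orders', received.append)
bus.publish('orders', {'id': 1})
bus.publish('payments', {'id': 2})  # no 'payments' subscriber; dropped
```

The brokers compared below differ mainly in what they add on top of this core: persistence, ordering, consumer groups, and delivery guarantees.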
---
## Message Broker Comparison
| Broker | Best For | Throughput | Ordering |
|--------|----------|------------|----------|
| RabbitMQ | Task queues, RPC | Medium | Per queue |
| Kafka | Event streaming, logs | Very high | Per partition |
| Redis Pub/Sub | Real-time, simple | High | None |
| SQS | AWS serverless | Medium | FIFO optional |
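The "Per partition" entry for Kafka means ordering is only guaranteed among messages that share a partition; producers achieve per-key ordering by hashing the message key to a partition. A sketch of the idea (Kafka's default partitioner uses murmur2; `md5` here is just a deterministic stand-in for illustration):

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    # Deterministic hash of the key -> stable partition assignment,
    # so all events for one key preserve their relative order.
    # (Kafka's real default partitioner uses murmur2, not md5.)
    digest = hashlib.md5(key.encode('utf-8')).digest()
    return int.from_bytes(digest[:4], 'big') % num_partitions

# Every event keyed by 'user-123' lands on the same partition,
# so a single consumer of that partition sees them in order.
p1 = partition_for('user-123', 6)
p2 = partition_for('user-123', 6)
```

This is why choosing a good key (user ID, order ID) matters: it sets both the ordering scope and the load distribution across partitions.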
---
## Examples
### RabbitMQ Producer/Consumer
```python
import pika
import json

# Producer
connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
channel.queue_declare(queue='orders', durable=True)

def publish_order(order):
    channel.basic_publish(
        exchange='',
        routing_key='orders',
        body=json.dumps(order),
        properties=pika.BasicProperties(delivery_mode=2),  # persistent message
    )

# Consumer
def process_order(ch, method, properties, body):
    order = json.loads(body)
    print(f"Processing order: {order['id']}")
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_qos(prefetch_count=1)  # at most one unacked message per consumer
channel.basic_consume(queue='orders', on_message_callback=process_order)
channel.start_consuming()
```
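Because RabbitMQ redelivers any message that is not acked (e.g. after a consumer crash), consumers see at-least-once delivery and must tolerate duplicates. A common mitigation is an idempotency check keyed on a message ID; a minimal broker-agnostic sketch (the in-memory `processed_ids` set is a stand-in for a durable store such as Redis or a database):

```python
processed_ids: set[str] = set()  # stand-in store; use Redis/DB in production

def handle_once(order: dict) -> bool:
    """Process an order at most once; returns True if work was done."""
    if order['id'] in processed_ids:
        return False  # duplicate delivery: safe to ack and skip
    processed_ids.add(order['id'])
    # ... real side effects (charge card, write row) would go here ...
    return True
```

Calling this from `process_order` before doing side effects makes redelivered messages harmless: the duplicate is acked but produces no second charge or write.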
### Kafka Event Streaming
```python
from kafka import KafkaProducer, KafkaConsumer
import json

# Producer
producer = KafkaProducer(
    bootstrap_servers=['localhost:9092'],
    value_serializer=lambda v: json.dumps(v).encode('utf-8')
)
producer.send('user-events', {'type': 'USER_CREATED', 'user_id': '123'})
producer.flush()  # block until buffered messages are actually sent

# Consumer
consumer = KafkaConsumer(
    'user-events',
    bootstrap_servers=['localhost:9092'],
    group_id='notification-service',
    auto_offset_reset='earliest',
    value_deserializer=lambda m: json.loads(m.decode('utf-8'))
)
for message in consumer:
    print(f"Received event: {message.value['type']}")
```