Master Agent Protocol Reference
Overview
This document provides a reference for the Master Agent Protocol (MAP) used in cpm development and operational workflows. The protocol defines structured approaches for complex task orchestration, quality assurance, and systematic problem-solving.
Protocol Purpose
The Master Agent Protocol serves to:
- Provide a systematic approach to complex development tasks
- Ensure consistent quality and thoroughness
- Enable structured delegation and coordination
- Maintain clear documentation and audit trails
- Support reproducible operational procedures
Core Principles
1. Systematic Approach
Break complex tasks into manageable phases:
- Assessment
- Planning
- Execution
- Validation
- Documentation
2. Quality Focus
Maintain high standards through:
- Clear success criteria
- Validation checkpoints
- Error detection and recovery
- Documentation requirements
3. Structured Communication
Use consistent formats for:
- Task specifications
- Progress reporting
- Error handling
- Result documentation
Protocol Phases
Phase 0: INTAKE
Purpose: Receive and comprehend the task
Activities:
- Parse user requirements
- Identify scope and boundaries
- Determine complexity level
- Select execution strategy
- Clarify ambiguities
Example:
Task: Implement neighbor discovery feature
Scope: Network scanning, TCP connectivity, database registration
Complexity: Medium (requires networking, concurrency, database)
Strategy: Incremental development with testing
Phase 1: ASSESS
Purpose: Understand current state
Activities:
- Analyze existing codebase
- Identify integration points
- Document dependencies
- Assess risks
- Establish baselines
Example:
Assessment Results:
- Existing: Server management infrastructure
- Integration: internal/server/neighbor.go
- Dependencies: net package, database layer
- Risks: Network scanning performance, timeout handling
- Baseline: Current server registration mechanism
Phase 2: PLAN
Purpose: Design execution strategy
Activities:
- Break the work down into tasks
- Identify dependencies
- Determine parallelization
- Establish milestones
- Define success criteria
Example:
Plan:
1. Implement network scanning (net.Dial)
2. Add concurrent connection testing
3. Create neighbor registration
4. Build discovery command
5. Add database integration
6. Write tests
Success Criteria:
- Scan a /24 network in under 30 seconds
- Detect all reachable cpm servers
- 95%+ reliability
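A minimal sketch of plan steps 1 and 2 (network scanning with concurrent connection tests) might look like the following; the scanSubnet helper, the port choice, and the timeout are illustrative assumptions rather than cpm's actual implementation.

package discovery

import (
	"fmt"
	"net"
	"sync"
	"time"
)

// scanSubnet probes every host in a /24 prefix (e.g. "192.168.1.") on the
// given port and returns the addresses that accepted a TCP connection.
func scanSubnet(prefix string, port int, timeout time.Duration) []string {
	var (
		mu        sync.Mutex
		wg        sync.WaitGroup
		reachable []string
	)
	for host := 1; host <= 254; host++ {
		addr := fmt.Sprintf("%s%d:%d", prefix, host, port)
		wg.Add(1)
		go func(addr string) {
			defer wg.Done()
			conn, err := net.DialTimeout("tcp", addr, timeout)
			if err != nil {
				return // unreachable host or closed port
			}
			conn.Close()
			mu.Lock()
			reachable = append(reachable, addr)
			mu.Unlock()
		}(addr)
	}
	wg.Wait()
	return reachable
}

Bounding the number of in-flight dials (for example with a buffered-channel semaphore) helps keep the scan inside the 30-second budget on slower networks.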
Phase 3: EXECUTE
Purpose: Perform the implementation
Activities:
- Implement planned tasks
- Monitor progress
- Handle errors
- Adjust as needed
- Document changes
Pattern:
For each task:
1. Implement core functionality
2. Add error handling
3. Write tests
4. Document behavior
5. Verify integration
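As a hedged illustration of steps 1-3, a task might pair a small, well-scoped function with explicit error handling and a table-driven test; the parseHostPort name is hypothetical, and in practice the test would live in a separate _test.go file.

package discovery

import (
	"fmt"
	"net"
	"testing"
)

// parseHostPort validates an address of the form "host:port" and returns its
// parts, wrapping malformed input in a descriptive error (steps 1 and 2).
func parseHostPort(addr string) (host, port string, err error) {
	host, port, err = net.SplitHostPort(addr)
	if err != nil {
		return "", "", fmt.Errorf("invalid address %q: %w", addr, err)
	}
	return host, port, nil
}

// TestParseHostPort exercises the happy path and a malformed input (step 3).
func TestParseHostPort(t *testing.T) {
	cases := []struct {
		in      string
		wantErr bool
	}{
		{"10.0.0.5:8080", false},
		{"not-an-address", true},
	}
	for _, c := range cases {
		if _, _, err := parseHostPort(c.in); (err != nil) != c.wantErr {
			t.Errorf("parseHostPort(%q) error = %v, wantErr %v", c.in, err, c.wantErr)
		}
	}
}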
Phase 4: VALIDATE
Purpose: Verify quality and completeness
Activities:
- Run all tests
- Check against requirements
- Verify edge cases
- Validate performance
- Review security
Validation Checklist:
- All tests pass
- Requirements met
- Error handling complete
- Performance acceptable
- Security reviewed
- Documentation updated
Phase 5: REMEDIATE
Purpose: Fix identified issues
Activities:
- Prioritize issues
- Implement fixes
- Re-validate
- Iterate as needed
- Document resolutions
Issue Priority:
- Critical: Security, data loss
- High: Core functionality broken
- Medium: Performance, usability
- Low: Minor improvements
Phase 6: DELIVER
Purpose: Finalize and present
Activities:
- Final validation
- Documentation completion
- Result compilation
- User communication
- Artifact archival
Deliverables:
- Working code
- Tests
- Documentation
- Examples
- Migration notes (if applicable)
Phase 7: REFLECT
Purpose: Capture learnings
Activities:
- What worked well
- What could be improved
- Protocol updates
- Knowledge capture
- Best practices documentation
Quality Framework
Success Criteria
Define measurable criteria for each task:
accuracy:
  definition: "Correctness of implementation"
  threshold: 0.90
  measurement: "Test pass rate"

completeness:
  definition: "All requirements addressed"
  threshold: 0.85
  measurement: "Requirements coverage"

rigor:
  definition: "Thoroughness of implementation"
  threshold: 0.95
  measurement: "Edge cases handled, docs complete"

performance:
  definition: "Efficiency of solution"
  threshold: acceptable_range
  measurement: "Benchmarks, profiling"
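One way to make such criteria machine-checkable is a small evaluation helper; the structure below is an assumed sketch, not part of cpm itself.

package quality

// Criterion pairs a measured score in [0, 1] with the threshold it must meet.
type Criterion struct {
	Name      string
	Score     float64
	Threshold float64
}

// Evaluate reports whether every criterion meets its threshold and lists the
// names of any that fall short, so a validation gate can fail early.
func Evaluate(criteria []Criterion) (pass bool, failing []string) {
	for _, c := range criteria {
		if c.Score < c.Threshold {
			failing = append(failing, c.Name)
		}
	}
	return len(failing) == 0, failing
}

For example, an accuracy score of 0.88 (test pass rate) checked against the 0.90 threshold would fail the gate and report "accuracy" as the blocking criterion.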
Validation Gates
Pre-Implementation Gate:
- Requirements clear and documented
- Scope defined and approved
- Success criteria established
- Dependencies identified
Mid-Implementation Gate:
- Core functionality working
- Basic tests passing
- Integration validated
- No blocking issues
Pre-Delivery Gate:
- All tests passing
- Documentation complete
- Performance acceptable
- Security reviewed
- User acceptance
Error Handling Protocol
Error Classification
| Level | Severity | Action |
|---|---|---|
| Critical | System failure, data loss | Immediate halt, escalate |
| High | Feature broken, security issue | Fix required before proceeding |
| Medium | Degraded functionality | Document, fix in iteration |
| Low | Minor issue, cosmetic | Document, fix when convenient |
Recovery Strategy
1. Identify error type and severity
2. Document error context
3. Attempt automatic recovery (if safe)
4. If recovery fails:
   a. Escalate to user
   b. Provide options
   c. Document decision
5. Update procedures to prevent recurrence
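The classification and recovery steps could be expressed roughly as follows; the Severity type and escalate helper are illustrative assumptions, not existing cpm code.

package protocol

import "log"

// Severity mirrors the error classification table above.
type Severity int

const (
	SeverityLow Severity = iota
	SeverityMedium
	SeverityHigh
	SeverityCritical
)

// handle documents the error, attempts automatic recovery only when it is
// safe, and escalates when recovery fails or the error is critical.
func handle(err error, sev Severity, tryRecover func() error) {
	log.Printf("error (severity %d): %v", sev, err) // step 2: document context

	if sev == SeverityCritical {
		escalate(err) // critical errors halt immediately and go to the user
		return
	}
	if tryRecover != nil && tryRecover() == nil {
		return // step 3: automatic recovery succeeded
	}
	escalate(err) // step 4: recovery failed, hand the decision to the user
}

// escalate stands in for surfacing the error and available options to the
// user (steps 4a-4c).
func escalate(err error) {
	log.Printf("escalating to user: %v", err)
}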
Escalation Triggers
Escalate to user when:
- Ambiguous requirements
- Multiple valid approaches
- Resource constraints
- Security concerns
- User preference needed
Documentation Standards
Code Documentation
// Function documentation
// Purpose: Brief description of what function does
// Parameters:
// - param1: Description
// - param2: Description
// Returns:
// - return1: Description
// - error: Error conditions
// Example:
// result, err := Function(param1, param2)
// if err != nil {
// // handle error
// }
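Applied to a concrete function, the template might read as follows; CheckNeighbor is a hypothetical example, not an existing cpm API.

package discovery

import (
	"net"
	"time"
)

// CheckNeighbor reports whether a cpm server appears reachable at addr.
// Purpose: probe a candidate neighbor before registering it.
// Parameters:
//   - addr: host:port of the candidate neighbor
//   - timeout: how long to wait for the TCP handshake
// Returns:
//   - bool: true if the connection succeeded
//   - error: non-nil on dial failure or timeout
// Example:
//   ok, err := CheckNeighbor("10.0.0.5:8080", 2*time.Second)
//   if err != nil {
//       // handle error
//   }
func CheckNeighbor(addr string, timeout time.Duration) (bool, error) {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return false, err
	}
	conn.Close()
	return true, nil
}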
Task Documentation
# Task: [Name]
## Objective
What needs to be accomplished
## Context
Background information and constraints
## Approach
How it will be implemented
## Success Criteria
Measurable outcomes
## Results
What was accomplished
## Lessons Learned
Insights for future reference
Best Practices
Planning
- Start with clear requirements
- Break down complex tasks
- Identify dependencies early
- Establish checkpoints
- Plan for validation
Implementation
- Incremental development
- Test as you go
- Document while fresh
- Handle errors gracefully
- Review regularly
Communication
- Clear, concise updates
- Proactive problem reporting
- Document decisions
- Share learnings
- Maintain audit trail
Quality Assurance
- Define success criteria upfront
- Validate continuously
- Test edge cases
- Review security implications
- Document thoroughly
Application to cpm Development
Feature Development
1. ASSESS: Review codebase for integration points
2. PLAN: Design feature architecture
3. EXECUTE: Implement incrementally
4. VALIDATE: Test thoroughly
5. DELIVER: Document and deploy
Bug Fixes
1. ASSESS: Reproduce and diagnose issue
2. PLAN: Design fix approach
3. EXECUTE: Implement fix
4. VALIDATE: Verify resolution, check regressions
5. DELIVER: Deploy and document
Performance Optimization
1. ASSESS: Measure baseline performance
2. PLAN: Identify optimization opportunities
3. EXECUTE: Implement optimizations
4. VALIDATE: Measure improvements
5. DELIVER: Document gains
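For the assessment and validation steps, Go's built-in benchmarking gives a reproducible baseline; BenchmarkScanSubnet assumes the hypothetical scanner sketched earlier in this document.

package discovery

import (
	"testing"
	"time"
)

// BenchmarkScanSubnet records a baseline for the (hypothetical) subnet
// scanner so later optimizations can be compared run-to-run.
func BenchmarkScanSubnet(b *testing.B) {
	for i := 0; i < b.N; i++ {
		scanSubnet("127.0.0.", 9, 50*time.Millisecond)
	}
}

Running go test -bench=ScanSubnet -benchmem before and after a change quantifies the gain.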
Integration with cpm Workflow
Development Workflow
# Assessment
cpm config show # Understand current state
cpm list # See existing repositories
# Implementation
# Make changes to code
# Validation
go test ./... # Run tests
go vet ./... # Static analysis
# Delivery
git commit -m "feat: description"
git push
Operational Workflow
# Assessment
cpm servers status # Check server health
cpm neighbors discover # Find peers
# Execution
cpm push myrepo # Sync repositories
cpm neighbors sync repo # Distribute to neighbors
# Validation
cpm list --remote server # Verify sync
Protocol Evolution
The Master Agent Protocol is a living document that evolves based on:
- Operational experience
- Identified improvements
- New capabilities
- Team feedback
- Industry best practices
Updates are versioned and documented in the protocol changelog.