
[ENH] Decompose CNC into Planner and Executor modules #24

@OppaAI

Description


Summary

CNC currently acts as orchestrator, context builder, stream handler, and memory coordinator all in one. As GRACE grows more capable, this becomes a bottleneck. Decomposing CNC into specialized modules makes the architecture more scalable and maintainable.

Current Behavior

CNC does everything:

  • Receives user input (orchestrator)
  • Builds context (context builder)
  • Calls Cosmos (inference handler)
  • Streams response (stream handler)
  • Manages memory via MCC (memory coordinator)
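To make the coupling concrete, the five responsibilities above all live in one handler today. A minimal sketch of that shape — the class layout, dependency names (`cosmos`, `mcc`, `stream`), and method names are assumptions for illustration, not quoted from the repo:

```python
# Hypothetical sketch of the current monolithic handler.
# All names here (cosmos, mcc, stream, _build_prompt) are illustrative.
class CNC:
    def __init__(self, cosmos, mcc, stream):
        self.cosmos = cosmos   # inference backend
        self.mcc = mcc         # memory coordinator (MCC)
        self.stream = stream   # response streamer

    def _build_prompt(self, user_input, memory_context):
        # context assembly: fold recalled memory into the prompt
        return f"{memory_context}\n{user_input}"

    async def _handle(self, user_input):
        # orchestrator + context builder
        memory_context = await self.mcc.recall(user_input)
        prompt = self._build_prompt(user_input, memory_context)
        # inference handler
        response = await self.cosmos.infer(prompt)
        # stream handler
        await self.stream.send(response)
        # memory coordinator
        await self.mcc.store(user_input, response)
        return response
```

Every concern is welded to every other: swapping the streamer, testing context assembly, or adding tool use all mean editing the same method.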

Proposed Enhancement

Decompose into specialized modules:

CNC (conductor — thin orchestrator only)
    ├── Planner    — decides what to do and how
    │   ├── context assembly
    │   ├── tool selection (future agentic)
    │   └── goal tracking
    └── Executor   — carries out the plan
        ├── Cosmos inference
        ├── streaming
        └── memory updates

```python
# Future CNC._handle()
plan = await self.planner.plan(user_input, memory_context)
result = await self.executor.execute(plan)
await self.mcc.store(result)
```
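The module shapes implied by the tree and the snippet above could look roughly like this — a sketch only, assuming a `Plan` value object carries the Planner's decisions to the Executor; every name beyond `planner.plan` / `executor.execute` is hypothetical:

```python
from dataclasses import dataclass, field

# Hypothetical Plan object: the Planner's output, the Executor's input.
@dataclass
class Plan:
    prompt: str
    tools: list = field(default_factory=list)  # tool selection (future agentic)
    goal: str = ""                             # goal tracking

class Planner:
    """Decides what to do and how: context assembly, tool selection, goals."""
    async def plan(self, user_input, memory_context):
        # context assembly: fold recalled memory into the prompt
        return Plan(prompt=f"{memory_context}\n{user_input}", goal=user_input)

class Executor:
    """Carries out the plan: Cosmos inference, then streaming."""
    def __init__(self, cosmos, stream):
        self.cosmos = cosmos
        self.stream = stream

    async def execute(self, plan):
        response = await self.cosmos.infer(plan.prompt)
        await self.stream.send(response)
        return response
```

With this split, CNC only wires Planner output into Executor input; the memory update stays in CNC (via MCC) as the snippet shows, though it could equally move into the Executor per the tree.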

Impact

  • CNC becomes a thin conductor, not the whole orchestra
  • Each module has single responsibility
  • Easier to add agentic capabilities (Planner decides tools)
  • Easier to test each module independently
  • Maps to human prefrontal cortex (planner) + motor cortex (executor)

Notes

  • Not needed for M1-M3 — implement when agentic missions begin (M10/M11)
  • Planner module is the foundation for MCP tool server
  • CNC decomposition aligns with AIVA server taking over orchestration role

Metadata

Labels: enhancement (New feature or request)
Status: Todo
Milestone: no milestone