Architecture
Context Layer is a runtime execution boundary between applications and LLM providers. Provider credential handling uses BYOK (bring your own key): applications supply their own provider credentials.
Architecture Overview
Application
│
▼
Context Layer
│
▼
LLM Provider
Application
Defines the execution request, determines how results are used, and orchestrates external tools and services.
Context Layer
Governs how execution occurs. The runtime determines how execution requests are admitted, how execution context is constructed, and how provider invocations are controlled. Developers interact with Context Layer through invokeCL(message, options).
LLM Provider
Executes the model and returns generated output.
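The call flow above can be sketched in TypeScript. The invokeCL(message, options) signature comes from the description above; the option names, return shape, and stub behavior are assumptions for illustration only, not the actual runtime API.

```typescript
// Hypothetical sketch of an application calling the Context Layer runtime.
// Option fields and the result shape are assumed, not documented.

interface InvokeOptions {
  model?: string;     // assumed: target provider model
  maxTokens?: number; // assumed: generation limit
}

interface InvokeResult {
  output: string;     // assumed: generated text returned via the runtime
}

// Local stub standing in for the real runtime so this sketch is self-contained.
// A real runtime would admit the request, construct execution context,
// and control the provider invocation here.
async function invokeCL(
  message: string,
  options: InvokeOptions = {}
): Promise<InvokeResult> {
  return { output: `(${options.model ?? "default"}) ${message}` };
}

async function main() {
  // The application decides what work to execute; the runtime governs how.
  const result = await invokeCL("Summarize the quarterly report.", {
    model: "gpt-4o",
  });
  console.log(result.output);
}

main();
```

The application never talks to the provider directly; every request passes through the runtime boundary.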
Why This Architecture Exists
Context Layer separates application intent from provider invocation. The runtime applies the same rules to every execution, regardless of how or where the application is deployed, and those rules cannot be bypassed at the application layer.
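One way this non-bypassability can be realized is by keeping the provider call private to the boundary, so admission rules always run first. This is an illustrative sketch, not the actual runtime; every name and rule below is hypothetical.

```typescript
// Sketch of a non-bypassable execution boundary: the provider call is
// private, so the application layer cannot skip the admission rules.

type Rule = (message: string) => boolean;

class ExecutionBoundary {
  constructor(private rules: Rule[]) {}

  // Private: applications have no direct handle on the provider.
  private callProvider(message: string): string {
    return `model-output-for: ${message}`; // stand-in for a real provider call
  }

  // The only public entry point; rules are checked on every invocation.
  invoke(message: string): string {
    for (const rule of this.rules) {
      if (!rule(message)) throw new Error("execution request rejected");
    }
    return this.callProvider(message);
  }
}

const boundary = new ExecutionBoundary([
  (m) => m.length > 0,     // hypothetical admission rule: non-empty request
  (m) => m.length <= 4096, // hypothetical admission rule: bounded size
]);

console.log(boundary.invoke("Classify this support ticket."));
```

Because invoke() is the sole path to the provider, changing or redeploying the application cannot change which rules are enforced.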
What Exists Outside the Runtime
The runtime does not include:
- application business logic
- prompt authoring strategies
- external services
- databases
- external APIs
Applications orchestrate these systems; Context Layer governs only the LLM execution boundary.
Conceptual Summary
Applications decide what work should be executed.
Context Layer governs how execution occurs.
Model providers generate the output.
Related Runtime Documentation
- Describes how Execution Authority governs admission, validation, and invocation during runtime execution.
- Documents the Authority Contract and the behavioral guarantees enforced during execution.
- Explains the separation of responsibilities between applications, Context Layer, and LLM providers, and describes what Context Layer controls and what lies outside its authority.