Contextualizing with Circuits

Context

A previous post discussed the importance of Context in Observability. To recap, Context is crucial to understanding the circumstances (or situation) surrounding an observation of behavior (action) or change (state). It gives meaning and significance to observations and enhances the ability to infer internal states and project implications.

When discussing software, a key element of Context is the set of computational components that serve a functional purpose or support the structuring of processing.

Identifying such components requires defining a Boundary. Boundaries are formed conceptually by taking a particular management perspective on a system and its subsystems.

Three typical component classifications are Configuration, Resource, and Service.
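To make these terms concrete, here is a minimal sketch. The type names (Classification, Boundary, Component) are illustrative stand-ins introduced for this example, not types from any particular library.

```java
// Illustrative sketch only: hypothetical types modeling the terms above.
enum Classification { CONFIGURATION, RESOURCE, SERVICE }

// A Boundary is a management perspective taken on a system and its subsystems.
record Boundary(String perspective) {}

// A Component is identified within a Boundary and given one of the classifications.
record Component(String name, Classification kind, Boundary boundary) {}

class BoundaryExample {
  public static void main(String[] args) {
    var deployment = new Boundary("deployment");
    var cache = new Component("order-cache", Classification.RESOURCE, deployment);
    System.out.println(cache);
  }
}
```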

Scene

In interpreting a script or a scene within a movie, humans must, at a minimum, identify the setting (foreground, background) and the actors (humans, relations, states) and understand the dialog (interactions and intentions), all from the multi-sensory feed flowing into the brain, which is then decoded and fused by networks of neural circuits.

Observability is similar, except that most vendor solutions today have not had a billion years to evolve efficient and effective ways of detecting the salient features and aspects within a setting (environment), scene, or situation.

Instead, Observability solutions must rely on noisy channels, yesteryear technologies such as traces, logs, and metrics, and then attempt to fuse them into partial models of form, function, and flow.

Fusion

This after-the-fact component (boundary) analysis is inefficient and ineffective. The detail being collected pertains to items that can undergo change, including renaming, at rates that make the notion of identity (continuity) impractical, particularly when the identification process rests mainly on a set of rules referencing namespaces within code artifacts. Added to that is the problem of fusing multiple telemetry channels with diverse emittance (events) and traffic (packets) characteristics.

The fusion of sensory feeds takes place so far removed from the local Context that it places a considerable burden on memory (buffers), storage (streams), and compute (sorts) in the backends, limiting capabilities and adding unnecessary constraints.

Circuit

The Substrates API takes a very different approach. Instead of attempting component segmentation and sensory fusion in the backend, the instrumentation developer creates and configures a Circuit for each contextual Component within an application, process, or service.

Each Circuit has a contextual channel for each dedicated sensor type; the sensors are created and wired to the Circuit. Each Circuit, representing one or more Contexts, is in effect a mini-backend with an event queue, an executor, and a pipeline consisting of one or more sensor types (Instruments), channels (Conduits), and observers (Outlets), as sketched below.
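The following is a minimal sketch of that structure, not the actual Substrates API: every type name and method here (Circuit, Conduit, Outlet, Event, conduit, post) is an assumption introduced for illustration, showing only the shape of a mini-backend with an event queue, an executor, and channels fanning out to observers.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

// A unit of telemetry emitted on a named channel.
record Event(String channel, Object value) {}

// An observer at the end of a pipeline.
interface Outlet { void observe(Event event); }

// A channel dedicated to one sensor type; it fans events out to its outlets.
final class Conduit {
  private final String name;
  private final List<Outlet> outlets = new CopyOnWriteArrayList<>();
  Conduit(String name) { this.name = name; }
  String name() { return name; }
  void subscribe(Outlet outlet) { outlets.add(outlet); }
  void emit(Event event) { outlets.forEach(o -> o.observe(event)); }
}

// The Circuit itself: a mini-backend with an event queue, an executor,
// and a set of conduits that route queued events to their observers.
final class Circuit implements AutoCloseable {
  private final BlockingQueue<Event> queue = new LinkedBlockingQueue<>();
  private final ExecutorService executor = Executors.newSingleThreadExecutor();
  private final List<Conduit> conduits = new CopyOnWriteArrayList<>();

  Circuit() {
    executor.submit(() -> {
      try {
        while (!Thread.currentThread().isInterrupted()) {
          Event event = queue.take();                 // drain locally, near the Context
          conduits.stream()
                  .filter(c -> c.name().equals(event.channel()))
                  .forEach(c -> c.emit(event));
        }
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
      }
    });
  }

  // One contextual channel per dedicated sensor type.
  Conduit conduit(String name) {
    Conduit conduit = new Conduit(name);
    conduits.add(conduit);
    return conduit;
  }

  // An Instrument (sensor) is anything that posts events onto the queue.
  void post(String channel, Object value) {
    queue.offer(new Event(channel, value));
  }

  @Override public void close() {
    executor.shutdownNow();
  }
}
```

Because the queue and executor live inside the Circuit, events are ordered, routed, and observed next to the Context that produced them rather than in a remote backend.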

Circuits can also share Events with others, allowing for higher-order Context abstractions to be embedded locally and then exposed remotely.
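Continuing the same illustrative sketch (again, hypothetical names rather than the actual API), event sharing can be as simple as wiring one Circuit's channel into another Circuit's queue:

```java
// One Circuit forwards selected Events into another, composing a
// higher-order Context locally before anything is exposed remotely.
class SharingExample {
  public static void main(String[] args) throws Exception {
    try (Circuit composite = new Circuit(); Circuit service = new Circuit()) {
      composite.conduit("latency")
               .subscribe(event -> System.out.println("composite observed " + event));

      // Wire the service Circuit's latency channel into the composite Circuit.
      service.conduit("latency")
             .subscribe(event -> composite.post(event.channel(), event.value()));

      service.post("latency", 42);      // in practice an Instrument would emit this
      Thread.sleep(100);                // let both asynchronous pipelines drain
    }
  }
}
```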