Our approach to observability has been constrained by the notion that instruments and observers are fundamentally distinct entities. The prevailing paradigm positions low-level instrumentation within applications and views observers as passive consumers of this data. This post offers an alternative perspective: observers are instruments, constructing more comprehensive and insightful observations by integrating data from lower-level instruments. Coupled with location-independent deployment, this hierarchical view of instrumentation facilitates a more robust and adaptable approach.
Elevating Observability
Traditional observability frameworks offer a limited perspective: low-level instruments (logs, metrics, traces) reside within applications, observers process this data externally, and the results are stored or visualized. This view overlooks something significant: observers generate new observations that are valuable telemetry data in their own right. We should be building more of these higher-level instruments, observers on observers, rather than only traditional loggers and metrics instruments.
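To make "observers on observers" concrete, here is a minimal sketch. All names (Observation, Instrument, Observer) are hypothetical, not taken from any real API: an observer consumes lower-level observations and, because it shares the instrument's emitting interface, its synthesized results are themselves telemetry that higher-level observers can subscribe to.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch: a single observation flowing through the hierarchy.
@dataclass
class Observation:
    name: str
    value: float

class Instrument:
    """Anything that emits observations to registered subscribers."""
    def __init__(self):
        self._subscribers: list[Callable[[Observation], None]] = []

    def subscribe(self, fn: Callable[[Observation], None]) -> None:
        self._subscribers.append(fn)

    def emit(self, obs: Observation) -> None:
        for fn in self._subscribers:
            fn(obs)

class Observer(Instrument):
    """An observer IS an instrument: it consumes lower-level
    observations and emits new, higher-level ones."""
    def __init__(self, window: int):
        super().__init__()
        self.window = window
        self._values: list[float] = []

    def observe(self, obs: Observation) -> None:
        self._values.append(obs.value)
        if len(self._values) == self.window:
            avg = sum(self._values) / self.window
            self._values.clear()
            # The synthesized average is itself telemetry.
            self.emit(Observation("latency.avg", avg))

latency = Instrument()
averager = Observer(window=3)
latency.subscribe(averager.observe)   # observer consumes the instrument

results: list[Observation] = []
averager.subscribe(results.append)    # and is consumed like one in turn

for v in (10.0, 20.0, 30.0):
    latency.emit(Observation("latency", v))
# results now holds one higher-level Observation("latency.avg", 20.0)
```

Because Observer extends Instrument, another Observer could subscribe to `averager` just as easily, stacking analysis levels without any new API.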
Breaking Down Boundaries
The OpenTelemetry project, for example, provides distinct APIs for observability data producers (metrics, traces, logs) within applications and for consumers (processors, receivers) within Collector processes. These consumers then forward the data to remote backends that process it via vendor-specific APIs. This separation forces developers to become proficient in several different APIs for essentially the same tasks. It also forces them to make early architectural decisions about where processing happens, to deploy intricate collection infrastructure even for rudimentary analysis needs, and to maintain separate codebases for local and remote processing logic.
Unified Architecture
Such frameworks create artificial boundaries between data collection (instruments, sensors) and analysis (observers, pipelines). This architectural split stems from an assumption that local collection and remote processing are fundamentally different concerns. They are not! Observers should be deployable wherever they make the most sense given actual requirements: responsiveness, scope of analysis, resource capacity, network and storage cost, and data privacy.
Anywhere Analysis
An observability toolkit should abstract away the location of the analysis engine, enabling developers to define monitoring logic that can execute either locally within the instrumented process or in a remote aggregating analysis environment without modification. It should switch seamlessly between local and remote execution modes without code changes, providing consistent and coherent APIs and patterns regardless of the analysis location and thereby reducing developers' cognitive load.
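One way to picture such a toolkit, sketched here with hypothetical names (error_rate, LocalEngine, RemoteEngine): the monitoring logic is a plain function with no location assumptions, and which engine runs it becomes a deployment decision rather than a rewrite.

```python
from typing import Callable, Iterable

def error_rate(events: Iterable[dict]) -> float:
    """Monitoring logic, written once, with no location assumptions."""
    events = list(events)
    errors = sum(1 for e in events if e.get("status", 200) >= 500)
    return errors / len(events) if events else 0.0

class LocalEngine:
    """Executes analysis in-process, next to the instruments."""
    def run(self, logic: Callable, events) -> float:
        return logic(events)

class RemoteEngine:
    """Stand-in for a remote aggregator; a real implementation would
    ship the events (or the logic) over the network first."""
    def run(self, logic: Callable, events) -> float:
        return logic(events)

events = [{"status": 200}, {"status": 503},
          {"status": 200}, {"status": 500}]
local = LocalEngine().run(error_rate, events)
remote = RemoteEngine().run(error_rate, events)
# both engines compute the same result from the same logic
```

The point is not the trivial engines but the contract: because `error_rate` never references its execution location, moving it from process-local to remote aggregation requires swapping the engine, not the logic.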
A New Foundation
The Humanary Substrates API introduces a novel approach to observability, elevating observers from passive data consumers to active participants. This transformation rests on a flexible, hierarchical architecture in which observers synthesize and analyze data wherever they are deployed. The result is an observability solution that is more adaptable, contextually aware, and capable of generating insights that drive tangible actions.
By enabling each observer to function as a pluggable module with a standardized interface, we allow observers at various levels to aggregate and filter data before forwarding it to higher-level observers or storage. The API gives teams the autonomy to optimize observability as an integral component of their systems, supporting proactive and intelligent monitoring with minimal computational overhead.
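A closing sketch of that pluggability, again with hypothetical names rather than the Substrates API itself: because every module exposes the same single method, filters, aggregators, and stores compose into a chain in any order before data reaches a higher-level observer or storage.

```python
# Hypothetical sketch: pluggable observer modules sharing one interface.

class Store:
    """Terminal module: stands in for storage or a higher-level observer."""
    def __init__(self):
        self.items = []
    def submit(self, obs: dict) -> None:
        self.items.append(obs)

class Filter:
    """Drops low-value observations before they travel upward."""
    def __init__(self, threshold: float, downstream):
        self.threshold = threshold
        self.downstream = downstream
    def submit(self, obs: dict) -> None:
        if obs["value"] >= self.threshold:
            self.downstream.submit(obs)

class Aggregator:
    """Rolls every `size` observations into one summary observation."""
    def __init__(self, size: int, downstream):
        self.size = size
        self.downstream = downstream
        self.buffer = []
    def submit(self, obs: dict) -> None:
        self.buffer.append(obs["value"])
        if len(self.buffer) == self.size:
            total = sum(self.buffer)
            self.buffer = []
            self.downstream.submit({"name": "sum", "value": total})

store = Store()
chain = Filter(10, Aggregator(2, store))  # filter, then aggregate, then store
for v in (5, 12, 3, 20):                  # 5 and 3 fall below the threshold
    chain.submit({"name": "metric", "value": v})
# store.items == [{"name": "sum", "value": 32}]
```

Swapping the order, adding a second aggregation level, or replacing Store with a network forwarder requires no changes to the other modules, which is exactly the autonomy the standardized interface buys.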