Packages

Table of contents

  1. Core
    1. Config
  2. Log
  3. Interceptors
  4. Errors
    1. Notifier
  5. Tracing
  6. Hystrix Prometheus (Deprecated — will be archived in v1)
  7. Options
  8. grpcpool
  9. Workers
  10. Data Builder

Core

The core module is the base of ColdBrew: it provides the base implementation and works in conjunction with the other modules to deliver the full framework functionality.

Documentation can be found at core-docs

Config

The ColdBrew config package contains the configuration for the core package. It uses envconfig to load the configuration from environment variables.

Documentation can be found at config-docs

Log

log provides slog-native structured logging for ColdBrew services. It uses a custom slog.Handler that automatically injects per-request context fields (trace ID, gRPC method, HTTP path) into every log record. Native slog.LogAttrs calls work out of the box after core.New() initializes the framework. Use log.AddAttrsToContext to add typed context fields without interface boxing, or log.AddToContext for untyped key-value pairs. The Handler is composable — it can wrap any slog.Handler for custom output formats or fan-out.

Documentation can be found at log-docs

Interceptors

interceptors provides a common set of reusable interceptors for gRPC services.
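
The composition idea behind reusable interceptors can be sketched with plain function types standing in for gRPC's UnaryHandler and UnaryServerInterceptor (the names below are illustrative, not the package's API):

```go
package main

import "fmt"

// handler and interceptor are simplified stand-ins for gRPC's handler
// and unary-interceptor signatures.
type handler func(req string) string

type interceptor func(req string, next handler) string

// chain composes interceptors so the first one wraps all the others.
func chain(h handler, ics ...interceptor) handler {
	for i := len(ics) - 1; i >= 0; i-- {
		ic, next := ics[i], h
		h = func(req string) string { return ic(req, next) }
	}
	return h
}

func main() {
	logging := func(req string, next handler) string {
		fmt.Println("-> request:", req) // runs before the inner layers
		return next(req)
	}
	tagging := func(req string, next handler) string {
		return next(req) + " [tagged]" // decorates the response
	}
	serve := chain(func(req string) string { return "echo:" + req }, logging, tagging)
	fmt.Println(serve("ping")) // echo:ping [tagged]
}
```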

Documentation can be found at interceptor-docs

Errors

A drop-in replacement for the standard errors package that adds stack trace capture and gRPC status codes. Standard library helpers (Is, As, Unwrap, Join) are re-exported. Error notification is provided by the Notifier sub-package below.

Documentation can be found at errors-docs

Notifier

notifier provides error-reporting integrations (Airbrake, Bugsnag, Rollbar, Sentry). It relies on the errors package for stack trace information.

Documentation can be found at notifier-docs

Tracing

Tracing is a library that provides distributed tracing for Go applications: it collects performance data, identifies where requests spend most of their time, and segments requests. It supports exporting traces via OpenTelemetry to any OTLP-compatible backend (Jaeger, Grafana Tempo, Honeycomb, etc.) as well as to New Relic.

Documentation can be found at tracing-docs

Hystrix Prometheus (Deprecated — will be archived in v1)

hystrixprometheus provides a Prometheus metrics collector for Hystrix. This package is deprecated as hystrix-go is unmaintained. Use interceptors.SetDefaultExecutor with failsafe-go instead. See integrations.

Documentation can be found at hystrixprometheus-docs

Options

options is a request-scoped key-value store that propagates metadata through context.Context, backed by an RWMutex-guarded map. Interceptors use it to pass metadata between layers. Only string keys are supported.

Documentation can be found at options-docs

grpcpool

grpcpool is a pool of grpc.ClientConn instances for making requests to a gRPC server. It implements grpc.ClientConnInterface so it can be used directly with generated proto stubs.
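
The pooling idea can be sketched without gRPC itself; conn below stands in for *grpc.ClientConn, and the API is illustrative:

```go
package main

import (
	"fmt"
	"sync/atomic"
)

// conn stands in for *grpc.ClientConn. Pooling matters because a single
// HTTP/2 connection can become a throughput bottleneck under load.
type conn struct{ id int }

type pool struct {
	conns []*conn
	next  atomic.Uint64
}

// newPool dials n connections up front via the supplied dial function.
func newPool(n int, dial func(i int) *conn) *pool {
	p := &pool{}
	for i := 0; i < n; i++ {
		p.conns = append(p.conns, dial(i))
	}
	return p
}

// Get hands out connections round-robin; exposing this behind
// grpc.ClientConnInterface is what lets generated stubs use a pool
// without knowing it exists.
func (p *pool) Get() *conn {
	return p.conns[p.next.Add(1)%uint64(len(p.conns))]
}

func main() {
	p := newPool(3, func(i int) *conn { return &conn{id: i} })
	for i := 0; i < 4; i++ {
		fmt.Println("using conn", p.Get().id)
	}
}
```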

Documentation can be found at grpcpool-docs

Workers

Workers is a worker lifecycle library that manages background goroutines with automatic panic recovery, configurable restart with backoff, tracing, and structured shutdown. Built on suture, it provides a builder pattern for defining workers, helpers for common patterns (periodic tasks, channel consumers, batch processors), and dynamic child worker management via WorkerContext. See the Workers howto for usage examples.

Documentation can be found at workers-docs

Data Builder

Data builder is a library for compiling and executing data-processing logic. Users express the logic as functions that accept and return structs. From these struct types, the library resolves dependencies at compile time, catching issues in the computation graph such as missing inputs, missing data-builder functions, and cyclic dependencies. Compilation infers a sequence in which the data-processing functions can run (and can support parallel execution). Any app that acts on a request, processes it in multiple steps, and returns data that depends on those steps could be written using data builder.
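
The compile step can be sketched as dependency resolution over type names (the builder model and function names are illustrative, not the library's API):

```go
package main

import "fmt"

// builder models a data-builder function by the names of the struct
// types it consumes and produces.
type builder struct {
	name     string
	consumes []string
	produces string
}

// compile resolves an execution order: repeatedly schedule any builder
// whose inputs are already available, and fail if nothing can make
// progress (a missing input, missing builder, or cycle).
func compile(builders []builder, inputs []string) ([]string, error) {
	have := map[string]bool{}
	for _, in := range inputs {
		have[in] = true
	}
	var order []string
	done := map[string]bool{}
	for len(order) < len(builders) {
		progressed := false
		for _, b := range builders {
			if done[b.name] {
				continue
			}
			ready := true
			for _, c := range b.consumes {
				if !have[c] {
					ready = false
					break
				}
			}
			if ready {
				done[b.name] = true
				have[b.produces] = true
				order = append(order, b.name)
				progressed = true
			}
		}
		if !progressed {
			return nil, fmt.Errorf("unsatisfiable graph: missing input or cycle")
		}
	}
	return order, nil
}

func main() {
	builders := []builder{
		{"BuildQuote", []string{"Cart", "Discount"}, "Quote"},
		{"BuildDiscount", []string{"Cart"}, "Discount"},
	}
	order, err := compile(builders, []string{"Cart"})
	if err != nil {
		panic(err)
	}
	fmt.Println(order) // [BuildDiscount BuildQuote]
}
```

Builders scheduled in the same pass have no dependency on each other, which is where parallel execution becomes possible.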

Documentation can be found at data-builder-docs