Reducer
The Reducer drops or compacts noisy events before the forwarder ships them to the log analyzer. Drop mode runs against a per-node budget; Compact mode is lossless, with events compacted at the edge and expanded at search time in Splunk and Elasticsearch.
Workflow
The Reducer app processes events from a variety of log forwarders, such as Fluentd, Fluent Bit, Filebeat, and Logstash. Configure it to process all events or only a subset, enabling targeted analysis and event regulation.
```mermaid
graph LR
A["<div style='font-size: 14px;'>Forwarder</div><div style='font-size: 10px; text-align: center;'>Sidecar Process</div>"] --> B["<div style='font-size: 14px;'>Receive</div><div style='font-size: 10px; text-align: center;'>Stream Events</div>"]
B --> C["<div style='font-size: 14px;'>Transform</div><div style='font-size: 10px; text-align: center;'>into TenXObjects</div>"]
C --> D["<div style='font-size: 14px;'>Enrich</div><div style='font-size: 10px; text-align: center;'>Add Context</div>"]
D --> E["<div style='font-size: 14px;'>Regulate</div><div style='font-size: 10px; text-align: center;'>Filter Events</div>"]
E --> F["<div style='font-size: 14px;'>Output</div><div style='font-size: 10px; text-align: center;'>Write to Forwarder</div>"]
classDef deploy fill:#7c3aed88,stroke:#6d28d9,color:#ffffff,stroke-width:2px,rx:8,ry:8
classDef receive fill:#9333ea88,stroke:#7c3aed,color:#ffffff,stroke-width:2px,rx:8,ry:8
classDef transform fill:#2563eb88,stroke:#1d4ed8,color:#ffffff,stroke-width:2px,rx:8,ry:8
classDef enrich fill:#059669,stroke:#047857,color:#ffffff,stroke-width:2px,rx:8,ry:8
classDef regulate fill:#dc2626,stroke:#b91c1c,color:#ffffff,stroke-width:2px,rx:8,ry:8
classDef output fill:#ea580c88,stroke:#c2410c,color:#ffffff,stroke-width:2px,rx:8,ry:8
class A deploy
class B receive
class C transform
class D enrich
class E regulate
class F output
```
- Forwarder: Runs 10x as a sidecar process to log forwarders for real-time event analysis
- Receive: Reads events continuously from log forwarders via IPC
- Transform: Structures log events into well-defined TenXObjects
- Enrich: Applies enrichment rules to augment TenXObjects with intelligent context
- Report: Publishes cost-insight metrics for visualization and alerting
- Regulate: Filters noisy events to prevent over-billing, or losslessly compacts them for Splunk/Elasticsearch search-time expansion
- Output: Writes regulated events back to the forwarder to ship to destination analyzers
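The lossless Compact mode described above can be illustrated with a minimal sketch: split each event into a message template and its variable parts, ship the compact pair, and rebuild the original at search time. This is only an illustration of the idea; the function names and the digit-based templating are assumptions, not the Reducer's actual wire format.

```python
import re

# Variables are approximated here as numeric runs in the message;
# a real compactor would use richer symbol extraction.
NUM = re.compile(r"\d+")

def compact(message: str) -> tuple[str, list[str]]:
    """Split a message into (template, variables) with no information loss."""
    values = NUM.findall(message)
    template = NUM.sub("{}", message)
    return template, values

def expand(template: str, values: list[str]) -> str:
    """Search-time expansion: reinsert the variables into the template."""
    return template.format(*values)
```

Because `expand(compact(m))` reproduces `m` exactly, the analyzer can store the template once and the variables compactly without losing any data.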
Architecture
The Reducer executes as a forwarder sidecar to filter "noisy" events or losslessly compact survivors before they ship to a log analyzer.
Reducers filter "noisy" events before forwarding, using symbol identities to track per-event-type spend against a local hourly budget and probabilistically shed events that push any one pattern over its share. This mode is simple and autonomous, requiring no coordination between nodes. See per-node budget mode.
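The per-node budget mechanism can be sketched as follows. Everything here is an assumption for illustration: the class name, the fair-share policy, and the digit-stripping symbol derivation are not the Reducer's actual implementation, only one plausible shape of "track per-pattern spend, shed probabilistically when a pattern exceeds its share."

```python
import hashlib
import random
import time

class PerNodeBudget:
    """Hypothetical sketch of budget-based probabilistic shedding."""

    def __init__(self, hourly_budget_bytes: int):
        self.budget = hourly_budget_bytes
        self.window_start = time.time()
        self.spend: dict[str, int] = {}  # symbol identity -> bytes this window

    def _symbol(self, event: dict) -> str:
        # Derive a stable per-event-type identity; hashing the message
        # with digits removed stands in for real symbol extraction.
        skeleton = "".join(c for c in event.get("message", "") if not c.isdigit())
        return hashlib.sha1(skeleton.encode()).hexdigest()[:12]

    def admit(self, event: dict) -> bool:
        now = time.time()
        if now - self.window_start >= 3600:  # start a new hourly window
            self.window_start, self.spend = now, {}
        sym = self._symbol(event)
        used = self.spend.get(sym, 0) + len(str(event))
        self.spend[sym] = used
        # Fair share: budget split across the patterns seen this window.
        share = self.budget / max(len(self.spend), 1)
        if used <= share:
            return True
        # Over its share: shed probabilistically, harder the further over.
        return random.random() < share / used
```

Each node makes its decisions from local state only, which is what makes the mode autonomous and coordination-free.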
A declarative file keyed by the joined rateReducerFieldNames values (e.g. symbolMessage, container) caps specific patterns with an explicit sample rate and expiry. Operators (or an AI assistant via the Log10x MCP) append entries based on Reporter cost attribution, commit to git, and every reducer pulls the file on its next reload. Each mute is diff-reviewed, self-expires, and maps 1:1 to the field-sets the Reporter attributes cost to. See mute file mode.
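A mute file in this mode might look like the following. The schema shown is hypothetical; only the keying by joined rateReducerFieldNames values (here symbolMessage and container), the explicit sample rate, and the expiry come from the description above.

```yaml
# Hypothetical mute-file entries; each key joins the configured
# rateReducerFieldNames values (symbolMessage | container).
mutes:
  - key: "health-check-ok|nginx-ingress"
    sampleRate: 0.01            # keep 1 in 100 matching events
    expires: "2025-07-01T00:00:00Z"
  - key: "cache-miss|redis-sidecar"
    sampleRate: 0.1
    expires: "2025-06-15T00:00:00Z"
```

Because entries live in git, each mute is reviewable as a diff and removed automatically once its expiry passes.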
Safety & Reliability
The Reducer runs as a sidecar alongside your log forwarder with a fail-open design: if the reducer crashes or stops, your logs continue flowing normally at full volume to your analyzer.
| Topic | Detail |
|---|---|
| Fail-open design | Logs continue flowing if 10x goes down |
| Backpressure handling | Disk buffering prevents data loss during spikes |
| Resource requirements | 512MB heap + 2 threads handles 100+ GB/day |
| Rollback | helm uninstall takes ~1 minute, no data loss |
See the Reducer FAQ for complete operational details, capacity planning, and deployment guidance.