Event Streaming

Quint publishes real-time event streams to Kafka for integration with SIEMs, data warehouses, and custom analytics pipelines. Three topics cover agent activity, spawn detection, and relationship graphs.

Topics

| Topic | Description |
| --- | --- |
| agent.events.raw | Every scored tool call / HTTP request |
| agent.spawns.detected | Spawn pattern matches from MCP tool calls |
| agent.relationships | Parent-child relationships from the correlation engine |

Configuration

{
  "kafka": {
    "brokers": ["broker1:9092", "broker2:9092"],
    "enabled": true,
    "async": true,
    "batch_size": 100,
    "batch_time_ms": 1000
  }
}
| Field | Default | Description |
| --- | --- | --- |
| brokers | (required) | Kafka broker addresses |
| enabled | false | Enable/disable streaming |
| async | true | Fire-and-forget publishing |
| batch_size | 100 | Messages per batch |
| batch_time_ms | 1000 | Flush interval in milliseconds |
Producer settings: Snappy compression, leader-ack (RequireOne), LeastBytes balancing. Internal buffer holds up to 10,000 messages; messages are dropped with a warning if the buffer is full.
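The drop-on-full buffering described above can be modeled as a bounded queue. This is an illustrative sketch of the semantics, not Quint's actual implementation:

```python
import logging
from collections import deque

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("kafka-buffer")

class BoundedBuffer:
    """Illustrative model of the internal publish buffer: holds up to
    max_size messages and drops new ones, with a warning, when full
    (matching the fire-and-forget async mode)."""

    def __init__(self, max_size=10_000):
        self.max_size = max_size
        self.queue = deque()
        self.dropped = 0

    def publish(self, message):
        # Drop rather than block when the buffer is full.
        if len(self.queue) >= self.max_size:
            self.dropped += 1
            log.warning("buffer full, dropping message")
            return False
        self.queue.append(message)
        return True

    def flush(self, batch_size=100):
        """Drain up to batch_size messages, as one batch send would."""
        n = min(batch_size, len(self.queue))
        return [self.queue.popleft() for _ in range(n)]

buf = BoundedBuffer(max_size=2)
buf.publish({"event_id": "a"})
buf.publish({"event_id": "b"})
buf.publish({"event_id": "c"})  # dropped: buffer is full
```

With async enabled, a full buffer never blocks the proxy's request path; the trade-off is silent message loss under sustained backpressure, which the warning log surfaces.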

Message Schemas

agent.events.raw

Published for every scored tool call (MCP gateway/relay) or HTTP request (forward proxy).
{
  "event_id": "uuid",
  "timestamp": "2025-03-05T10:30:00Z",
  "agent_id": "anthropic:bold-amber-falcon",
  "agent_name": "bold-amber-falcon",
  "session_id": "sess-uuid",
  "server_name": "github",
  "tool_name": "list_repos",
  "action": "mcp:github:list_repos.list",
  "risk_score": 25,
  "risk_level": "low",
  "verdict": "allow",
  "trace_id": "trace-uuid",
  "depth": 0,
  "parent_agent_id": "",
  "transport": "stdio",
  "arguments_hash": "sha256:abc123...",
  "scoring_source": "local+remote",
  "behavioral_flags": ["new_resource_access"],
  "metadata": {
    "local_score": 20,
    "remote_score": 25
  }
}
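A consumer of agent.events.raw typically deserializes the JSON payload and routes on verdict and risk_level. A minimal sketch; the alert threshold and routing rule below are illustrative choices, not part of the schema:

```python
import json

# Illustrative threshold: which risk_level values warrant an alert.
HIGH_RISK_LEVELS = {"high", "critical"}

def should_alert(raw_payload: bytes) -> bool:
    """Deserialize one agent.events.raw message and decide whether to alert:
    anything not allowed, or allowed but high-risk, is flagged."""
    event = json.loads(raw_payload)
    return event["verdict"] != "allow" or event["risk_level"] in HIGH_RISK_LEVELS

msg = b'{"event_id": "uuid", "risk_score": 25, "risk_level": "low", "verdict": "allow"}'
should_alert(msg)  # False: allowed and low risk
```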

agent.spawns.detected

Published when a spawn pattern matches a tool call.
{
  "event_id": "uuid",
  "timestamp": "2025-03-05T10:30:05Z",
  "pattern_id": "openai-handoff",
  "parent_agent": "openai:swift-coral-otter",
  "child_hint": "research_bot",
  "spawn_type": "delegation",
  "confidence": 0.90,
  "tool_name": "transfer_to_research_bot",
  "server_name": "openai-agents",
  "arguments_ref": "sha256:def456..."
}
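Downstream consumers often gate on the confidence field before acting on a detection. A small sketch; the 0.8 cutoff is an arbitrary illustration, and the pattern_id values below are examples only:

```python
def confident_spawns(messages, threshold=0.8):
    """Keep only spawn detections at or above the confidence threshold."""
    return [m for m in messages if m["confidence"] >= threshold]

detections = [
    {"pattern_id": "openai-handoff", "confidence": 0.90},
    {"pattern_id": "example-low-confidence", "confidence": 0.55},  # illustrative
]
confident_spawns(detections)  # keeps only the openai-handoff match
```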

agent.relationships

Published when the correlation engine establishes or updates a parent-child relationship.
{
  "timestamp": "2025-03-05T10:30:10Z",
  "parent_agent": "anthropic:bold-amber-falcon",
  "child_agent": "derived_bold-amber-falcon_a3f2",
  "confidence": 0.955,
  "depth": 1,
  "spawn_type": "direct",
  "signal_type": "context",
  "signal_count": 3
}
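Because the correlation engine both establishes and updates relationships, a consumer can fold agent.relationships messages into an in-memory parent-to-children graph, keeping the latest high-confidence edge per pair. A minimal sketch under that assumption:

```python
from collections import defaultdict

def build_graph(messages):
    """Fold relationship messages into a parent -> [(child, confidence)]
    adjacency map, keeping the highest-confidence edge per (parent, child)
    pair as updates arrive."""
    edges = {}
    for m in messages:
        key = (m["parent_agent"], m["child_agent"])
        if key not in edges or m["confidence"] > edges[key]:
            edges[key] = m["confidence"]
    graph = defaultdict(list)
    for (parent, child), conf in edges.items():
        graph[parent].append((child, conf))
    return dict(graph)

msgs = [
    {"parent_agent": "anthropic:bold-amber-falcon",
     "child_agent": "derived_bold-amber-falcon_a3f2",
     "confidence": 0.90},
    {"parent_agent": "anthropic:bold-amber-falcon",
     "child_agent": "derived_bold-amber-falcon_a3f2",
     "confidence": 0.955},  # later update with a stronger signal
]
build_graph(msgs)  # one edge, at confidence 0.955
```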

SIEM Integration Patterns

Splunk

# Index Kafka topic via Splunk Connect for Kafka
[kafka_input://agent_events]
topic = agent.events.raw
brokers = broker1:9092
sourcetype = quint:agent:event

Elasticsearch

{
  "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
  "topics": "agent.events.raw,agent.spawns.detected,agent.relationships",
  "connection.url": "http://elasticsearch:9200",
  "type.name": "_doc",
  "key.ignore": true
}

Datadog

Use the Datadog Kafka integration to forward events as logs:
logs:
  - type: kafka
    topic: agent.events.raw
    brokers: broker1:9092
    service: quint-proxy
    source: kafka
Kafka streaming is independent of cloud scoring. You can use Kafka without the cloud API, or the cloud API without Kafka, or both.