# Datadog Integration Guide

Arbitex streams audit events to Datadog using the Logs Intake API v2. Events are formatted as OCSF v1.1 JSON, sent in batches, and tagged with configurable source, service, and tag metadata.
## Prerequisites

- A Datadog account with a Logs plan that supports custom log ingestion
- A Datadog API key with `logs_write` permission
- Outbound HTTPS from Arbitex platform pods to `http-intake.logs.{site}` on port 443
## Configuration

| Variable | Required | Default | Description |
|---|---|---|---|
| `DATADOG_API_KEY` | Yes | — | Datadog API key |
| `DATADOG_SITE` | No | `datadoghq.com` | Datadog site domain (see table below) |
| `DATADOG_SOURCE` | No | `arbitex` | Log source name (`ddsource` field) |
| `DATADOG_SERVICE` | No | `arbitex-platform` | Service name (`service` field) |
| `DATADOG_TAGS` | No | `env:production` | Comma-separated tags applied to all events |
| `DATADOG_BATCH_SIZE` | No | `100` | Maximum events per batch |
| `DATADOG_FLUSH_INTERVAL` | No | `5` | Maximum seconds between batch flushes |
| `DATADOG_MAX_RETRIES` | No | `3` | Maximum retry attempts on transient failures |
| `DATADOG_DEAD_LETTER_PATH` | No | `/var/log/arbitex/datadog_dead_letter.jsonl` | Path for the dead-letter queue file |
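For illustration, the table above maps onto environment-variable lookups with defaults. This is a minimal Python sketch; `load_datadog_config` is a hypothetical helper, not part of the Arbitex connector:

```python
import os

def load_datadog_config(env=os.environ):
    """Read the connector settings, applying the documented defaults."""
    return {
        "api_key": env.get("DATADOG_API_KEY"),  # required; no default
        "site": env.get("DATADOG_SITE", "datadoghq.com"),
        "source": env.get("DATADOG_SOURCE", "arbitex"),
        "service": env.get("DATADOG_SERVICE", "arbitex-platform"),
        "tags": env.get("DATADOG_TAGS", "env:production"),
        "batch_size": int(env.get("DATADOG_BATCH_SIZE", "100")),
        "flush_interval": int(env.get("DATADOG_FLUSH_INTERVAL", "5")),
        "max_retries": int(env.get("DATADOG_MAX_RETRIES", "3")),
        "dead_letter_path": env.get(
            "DATADOG_DEAD_LETTER_PATH",
            "/var/log/arbitex/datadog_dead_letter.jsonl",
        ),
    }
```

If `DATADOG_API_KEY` is unset, the connector reports `status: not_configured` (see Troubleshooting below).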
### Site domains

| Site | `DATADOG_SITE` value |
|---|---|
| US1 (default) | datadoghq.com |
| US3 | us3.datadoghq.com |
| US5 | us5.datadoghq.com |
| EU1 | datadoghq.eu |
| AP1 | ap1.datadoghq.com |
| US1-FED | ddog-gov.com |
### Example configuration

```bash
DATADOG_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
DATADOG_SITE=datadoghq.com
DATADOG_SOURCE=arbitex
DATADOG_SERVICE=arbitex-platform
DATADOG_TAGS=env:production,team:security
```

## Event format

Each event is sent as a Datadog log entry. The OCSF event object is JSON-serialized into the `message` field:
```json
[
  {
    "ddsource": "arbitex",
    "ddtags": "env:production,team:security",
    "hostname": "arbitex-platform-pod-abc123",
    "service": "arbitex-platform",
    "message": "{\"class_uid\":6003,\"class_name\":\"API Activity\",\"severity\":\"Informational\",\"time\":1741737600000,\"actor\":{\"user\":{\"email_addr\":\"alice@example.com\"},\"org\":{\"uid\":\"org_01jq...\"}},\"src_endpoint\":{\"ip\":\"203.0.113.45\"},...}"
  }
]
```

Events are posted to `https://http-intake.logs.{DATADOG_SITE}/api/v2/logs` with the header `DD-API-KEY: {api_key}`.

The hostname is derived from the platform pod's system hostname at runtime.
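The wrapping described above can be sketched in a few lines of Python. The function names (`intake_url`, `to_log_entry`) are hypothetical illustrations, not the connector's internals:

```python
import json
import socket

def intake_url(site: str) -> str:
    """Build the Logs Intake v2 endpoint for a Datadog site."""
    return f"https://http-intake.logs.{site}/api/v2/logs"

def to_log_entry(ocsf_event: dict, source: str, service: str, tags: str) -> dict:
    """Wrap one OCSF event as a Datadog log entry.

    The event itself is JSON-serialized into the `message` field; Datadog
    metadata (ddsource, ddtags, hostname, service) travels alongside it.
    """
    return {
        "ddsource": source,
        "ddtags": tags,
        "hostname": socket.gethostname(),  # pod hostname at runtime
        "service": service,
        "message": json.dumps(ocsf_event, separators=(",", ":")),
    }

# A batch is a JSON array of such entries, POSTed with a DD-API-KEY header.
event = {"class_uid": 6003, "class_name": "API Activity", "severity": "Informational"}
batch_body = json.dumps([to_log_entry(event, "arbitex", "arbitex-platform", "env:production")])
```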
## Health check

The connector validates the API key by calling `GET https://api.{DATADOG_SITE}/api/v1/validate`. A `200` response indicates the key is valid; a `403` response indicates the key is invalid or lacks the required permissions.
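The same check can be reproduced with Python's standard library. This is an illustrative sketch under the endpoint described above (`check_api_key` is a hypothetical helper, not the connector's actual code):

```python
import urllib.request
import urllib.error

def validate_url(site: str) -> str:
    """Key-validation endpoint for a Datadog site."""
    return f"https://api.{site}/api/v1/validate"

def check_api_key(api_key: str, site: str = "datadoghq.com") -> bool:
    """Return True on HTTP 200 (valid key), False on 403 (invalid or unauthorized)."""
    req = urllib.request.Request(validate_url(site), headers={"DD-API-KEY": api_key})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 403:
            return False
        raise  # 5xx or anything unexpected: treat as degraded, not invalid

# Usage (requires network access and a real key):
#   check_api_key("xxxxxxxx", site="datadoghq.eu")
```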
## Verification

### Check connector health

```bash
curl -s -H "Authorization: Bearer $ADMIN_TOKEN" \
  https://api.arbitex.ai/api/admin/siem/connectors \
  | jq '.[] | select(.connector_id == "datadog")'
```

### Search in Datadog

In the Datadog Logs explorer, filter by:
```
source:arbitex service:arbitex-platform
```

To find DLP events:

```
source:arbitex @class_name:"Security Finding"
```

Note: Datadog parses the `message` field as JSON automatically when log processing is enabled, so OCSF fields become searchable as `@field` attributes in the Logs explorer.
## Troubleshooting

| Symptom | Likely cause | Resolution |
|---|---|---|
| `status: not_configured` | `DATADOG_API_KEY` not set | Set the variable and restart the platform |
| `status: error` | Invalid API key | Verify the key in Datadog → Organization settings → API keys |
| `status: degraded` | Unexpected validation response | Check platform logs and the Datadog status page |
| Logs not appearing | Wrong `DATADOG_SITE` | Verify your Datadog account's site matches `DATADOG_SITE` |
| Logs appear as plain text | Log processing pipeline not configured | Enable log processing in Datadog for JSON parsing of the `message` field |
Dead-letter events are written to `DATADOG_DEAD_LETTER_PATH` in JSONL format (one JSON object per line) and are not automatically replayed.