
Datadog Integration Guide

Arbitex streams audit events to Datadog using the Logs Intake API v2. Events are formatted as OCSF v1.1 JSON and sent in batches, tagged with configurable source, service, and tag metadata.


Prerequisites

  • A Datadog account with a Logs plan that supports custom log ingestion
  • A Datadog API key with logs_write permission
  • Outbound HTTPS from Arbitex platform pods to http-intake.logs.{site} on port 443

| Variable | Required | Default | Description |
| --- | --- | --- | --- |
| DATADOG_API_KEY | Yes | — | Datadog API key |
| DATADOG_SITE | No | datadoghq.com | Datadog site domain (see table below) |
| DATADOG_SOURCE | No | arbitex | Log source name (ddsource field) |
| DATADOG_SERVICE | No | arbitex-platform | Service name (service field) |
| DATADOG_TAGS | No | env:production | Comma-separated tags applied to all events |
| DATADOG_BATCH_SIZE | No | 100 | Maximum events per batch |
| DATADOG_FLUSH_INTERVAL | No | 5 | Maximum seconds between batch flushes |
| DATADOG_MAX_RETRIES | No | 3 | Maximum retry attempts on transient failures |
| DATADOG_DEAD_LETTER_PATH | No | /var/log/arbitex/datadog_dead_letter.jsonl | Path for the dead letter queue file |
| Site | DATADOG_SITE value |
| --- | --- |
| US1 (default) | datadoghq.com |
| US3 | us3.datadoghq.com |
| US5 | us5.datadoghq.com |
| EU1 | datadoghq.eu |
| AP1 | ap1.datadoghq.com |
| US1-FED | ddog-gov.com |
Example configuration:
DATADOG_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
DATADOG_SITE=datadoghq.com
DATADOG_SOURCE=arbitex
DATADOG_SERVICE=arbitex-platform
DATADOG_TAGS=env:production,team:security
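The batching parameters above (DATADOG_BATCH_SIZE and DATADOG_FLUSH_INTERVAL) describe buffer-and-flush behavior. A minimal sketch of that behavior, assuming a size-or-interval trigger; the BatchBuffer class is hypothetical, not the connector's actual implementation:

```python
import time


class BatchBuffer:
    """Hypothetical sketch: flush when the buffer reaches batch_size
    events, or when flush_interval seconds have passed since the last
    flush (DATADOG_BATCH_SIZE / DATADOG_FLUSH_INTERVAL)."""

    def __init__(self, sender, batch_size=100, flush_interval=5.0):
        self.sender = sender                  # callable that ships a list of events
        self.batch_size = batch_size          # DATADOG_BATCH_SIZE
        self.flush_interval = flush_interval  # DATADOG_FLUSH_INTERVAL
        self.buffer = []
        self.last_flush = time.monotonic()

    def add(self, event):
        self.buffer.append(event)
        if (len(self.buffer) >= self.batch_size
                or time.monotonic() - self.last_flush >= self.flush_interval):
            self.flush()

    def flush(self):
        if self.buffer:
            self.sender(self.buffer)  # e.g. POST the batch to Datadog
            self.buffer = []
        self.last_flush = time.monotonic()
```

Injecting the sender as a callable keeps the buffering logic independent of the HTTP transport, which is also where DATADOG_MAX_RETRIES and the dead letter file would come into play.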

Each event is sent as a Datadog log entry. The OCSF event object is JSON-serialized into the message field:

[
  {
    "ddsource": "arbitex",
    "ddtags": "env:production,team:security",
    "hostname": "arbitex-platform-pod-abc123",
    "service": "arbitex-platform",
    "message": "{\"class_uid\":6003,\"class_name\":\"API Activity\",\"severity\":\"Informational\",\"time\":1741737600000,\"actor\":{\"user\":{\"email_addr\":\"alice@example.com\"},\"org\":{\"uid\":\"org_01jq...\"}},\"src_endpoint\":{\"ip\":\"203.0.113.45\"},...}"
  }
]
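Assembling one such entry can be sketched as follows; the field names match the example above, but the helper function itself is hypothetical:

```python
import json
import socket


def make_log_entry(ocsf_event: dict, source: str, service: str, tags: str) -> dict:
    """Build one Datadog log entry. The OCSF event object is
    JSON-serialized into the message field, as in the example above."""
    return {
        "ddsource": source,                # DATADOG_SOURCE
        "ddtags": tags,                    # DATADOG_TAGS
        "hostname": socket.gethostname(),  # pod's system hostname at runtime
        "service": service,                # DATADOG_SERVICE
        "message": json.dumps(ocsf_event, separators=(",", ":")),
    }
```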

Events are posted to:

https://http-intake.logs.{DATADOG_SITE}/api/v2/logs

with the header DD-API-KEY: {api_key}.

The hostname is derived from the platform pod’s system hostname at runtime.
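The endpoint and header above can be exercised directly. A stdlib-only sketch, assuming a batch is a JSON array of log entries as shown earlier; post_batch and intake_url are illustrative names:

```python
import json
import urllib.request


def intake_url(site: str = "datadoghq.com") -> str:
    # Endpoint format from this guide: https://http-intake.logs.{DATADOG_SITE}/api/v2/logs
    return f"https://http-intake.logs.{site}/api/v2/logs"


def post_batch(api_key: str, entries: list, site: str = "datadoghq.com") -> int:
    """POST a batch of log entries with the DD-API-KEY header."""
    req = urllib.request.Request(
        intake_url(site),
        data=json.dumps(entries).encode("utf-8"),
        headers={"DD-API-KEY": api_key, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status  # Datadog's logs intake accepts batches with 202


if __name__ == "__main__":
    post_batch("your-api-key", [{"ddsource": "arbitex", "message": "{}"}])
```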

The connector validates the API key by calling GET https://api.{DATADOG_SITE}/api/v1/validate. A 200 response indicates the key is valid. A 403 response indicates the key is invalid or lacks permissions.
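The validation call could be reproduced as below. Note the guide does not spell out the header used for validation; passing the key as DD-API-KEY, as on the intake endpoint, is an assumption here:

```python
import urllib.error
import urllib.request


def validate_url(site: str = "datadoghq.com") -> str:
    # Validation endpoint from this guide: GET https://api.{DATADOG_SITE}/api/v1/validate
    return f"https://api.{site}/api/v1/validate"


def validate_api_key(api_key: str, site: str = "datadoghq.com") -> bool:
    """Return True on HTTP 200 (key valid), False on 403 (invalid or
    lacking permissions)."""
    # Assumption: the key goes in the DD-API-KEY header, as with intake.
    req = urllib.request.Request(validate_url(site), headers={"DD-API-KEY": api_key})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 403:
            return False
        raise  # any other response would surface as the degraded status
```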


Check the connector's status via the admin API:
curl -s -H "Authorization: Bearer $ADMIN_TOKEN" \
https://api.arbitex.ai/api/admin/siem/connectors | jq '.[] | select(.connector_id == "datadog")'

In the Datadog Logs explorer, filter by:

source:arbitex service:arbitex-platform

To find DLP events:

source:arbitex @class_name:"Security Finding"

Note: Datadog parses the message field as JSON automatically when log processing is enabled. OCSF fields become searchable as @field attributes in the log explorer.


Troubleshooting

| Symptom | Likely cause | Resolution |
| --- | --- | --- |
| status: not_configured | DATADOG_API_KEY not set | Set the variable and restart the platform |
| status: error | Invalid API key | Verify the key in Datadog → Organization settings → API keys |
| status: degraded | Unexpected validation response | Check platform logs and the Datadog status page |
| Logs not appearing | Wrong DATADOG_SITE | Verify your Datadog account’s site matches DATADOG_SITE |
| Logs appear as plain text | Log processing pipeline not configured | Enable log processing in Datadog for JSON parsing of the message field |

Dead letter events are written to DATADOG_DEAD_LETTER_PATH in JSONL format and are not automatically replayed.
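Because dead letter events are not replayed automatically, an operator has to read them back before re-submitting. A sketch of the parsing half, assuming one JSON object per line as JSONL implies; re-submission would reuse whatever mechanism ships normal batches:

```python
import json
from pathlib import Path


def read_dead_letters(path: str) -> list:
    """Parse the JSONL dead letter file (DATADOG_DEAD_LETTER_PATH)
    into a list of event dicts, skipping blank lines."""
    events = []
    for line in Path(path).read_text().splitlines():
        if line.strip():
            events.append(json.loads(line))
    return events
```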