Datadog Logs connector

The Datadog connector is a P0 production connector that forwards Arbitex audit events to Datadog via the Logs Intake API v2. Events are wrapped in Datadog log envelopes and sent as JSON arrays for efficient batched ingestion.


  • Events are accumulated in an internal buffer (up to 100 events or 5 seconds, whichever comes first).
  • Batches are sent as a JSON array to POST https://http-intake.logs.{site}/api/v2/logs.
  • Each event is wrapped in a Datadog log envelope:
    {
      "ddsource": "arbitex",
      "ddtags": "env:production",
      "hostname": "<system hostname>",
      "service": "arbitex-platform",
      "message": "<OCSF event JSON as string>"
    }
  • The connector accepts HTTP 200 and 202 as success. On 429 or 503, it retries with exponential backoff (up to max_retries attempts).
  • On persistent failure, events are written to a dead letter JSONL file.
  • The health check calls GET https://api.{site}/api/v1/validate with the API key. A 200 response indicates a valid key; a 403 marks the connector status as Error.
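The batching, envelope, and retry behavior above can be sketched in Python. This is a minimal illustration, not the shipped connector: the function names are invented, `urllib` is used for brevity, and the envelope fields mirror the example shown earlier.

```python
import json
import socket
import time
import urllib.error
import urllib.request


def wrap_event(event: dict, tags: str = "env:production") -> dict:
    """Wrap one OCSF event in the Datadog log envelope shown above."""
    return {
        "ddsource": "arbitex",
        "ddtags": tags,
        "hostname": socket.gethostname(),
        "service": "arbitex-platform",
        "message": json.dumps(event),  # OCSF event serialized as a string
    }


def send_batch(events, api_key, site="datadoghq.com", max_retries=3):
    """POST a JSON array of envelopes; retry 429/503 with exponential backoff."""
    url = f"https://http-intake.logs.{site}/api/v2/logs"
    body = json.dumps([wrap_event(e) for e in events]).encode()
    for attempt in range(max_retries + 1):
        req = urllib.request.Request(
            url,
            data=body,
            method="POST",
            headers={"Content-Type": "application/json", "DD-API-KEY": api_key},
        )
        try:
            with urllib.request.urlopen(req) as resp:
                if resp.status in (200, 202):  # both accepted as success
                    return True
        except urllib.error.HTTPError as err:
            if err.code not in (429, 503):  # non-transient: give up immediately
                return False
        time.sleep(2 ** attempt)  # backoff: 1s, 2s, 4s, ...
    return False  # caller writes the batch to the dead letter file
```

On persistent failure the real connector falls through to the dead letter path described below; the sketch simply returns `False`.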

| Site          | DATADOG_SITE value | Region              |
|---------------|--------------------|---------------------|
| US1 (default) | datadoghq.com      | United States       |
| EU1           | datadoghq.eu       | European Union      |
| US3           | us3.datadoghq.com  | United States (US3) |
| US5           | us5.datadoghq.com  | United States (US5) |
| AP1           | ap1.datadoghq.com  | Asia Pacific        |
| US1-FED       | ddog-gov.com       | US Government       |

Set DATADOG_SITE to the appropriate value for your Datadog organization.
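Both endpoints the connector uses are derived from `DATADOG_SITE`. A small sketch (helper names are illustrative):

```python
DEFAULT_SITE = "datadoghq.com"


def intake_url(site: str = DEFAULT_SITE) -> str:
    """Logs intake endpoint for a given Datadog site (see the table above)."""
    return f"https://http-intake.logs.{site}/api/v2/logs"


def validate_url(site: str = DEFAULT_SITE) -> str:
    """API key validation endpoint used by the health check."""
    return f"https://api.{site}/api/v1/validate"
```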


| Variable                 | Required | Default                                    | Description |
|--------------------------|----------|--------------------------------------------|-------------|
| DATADOG_API_KEY          | Yes      | (none)                                     | Datadog API key. Obtain from Organization Settings → API Keys. |
| DATADOG_SITE             | No       | datadoghq.com                              | Datadog site domain (see table above). |
| DATADOG_SOURCE           | No       | arbitex                                    | Log source (ddsource) field — used for automatic pipeline matching. |
| DATADOG_SERVICE          | No       | arbitex-platform                           | Service name (service) field — used in Log Explorer and APM correlation. |
| DATADOG_TAGS             | No       | env:production                             | Comma-separated ddtags applied to all logs, e.g. env:production,region:us-east-1. |
| DATADOG_BATCH_SIZE       | No       | 100                                        | Maximum events per batch send. |
| DATADOG_FLUSH_INTERVAL   | No       | 5                                          | Maximum seconds between buffer flushes. |
| DATADOG_MAX_RETRIES      | No       | 3                                          | Maximum retry attempts on transient failures. |
| DATADOG_DEAD_LETTER_PATH | No       | /var/log/arbitex/datadog_dead_letter.jsonl | Path for dead letter JSONL fallback. |
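Loading these variables with their documented defaults can be sketched as follows (illustrative only; `DatadogConfig` and `load_config` are assumed names, not the actual implementation):

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class DatadogConfig:
    api_key: str
    site: str = "datadoghq.com"
    source: str = "arbitex"
    service: str = "arbitex-platform"
    tags: str = "env:production"
    batch_size: int = 100
    flush_interval: float = 5.0
    max_retries: int = 3
    dead_letter_path: str = "/var/log/arbitex/datadog_dead_letter.jsonl"


def load_config(env=os.environ) -> DatadogConfig:
    """Build a config from the environment, falling back to the defaults above."""
    if not env.get("DATADOG_API_KEY"):
        raise RuntimeError("DATADOG_API_KEY is required")
    d = DatadogConfig(api_key="")  # carries the documented defaults
    return DatadogConfig(
        api_key=env["DATADOG_API_KEY"],
        site=env.get("DATADOG_SITE", d.site),
        source=env.get("DATADOG_SOURCE", d.source),
        service=env.get("DATADOG_SERVICE", d.service),
        tags=env.get("DATADOG_TAGS", d.tags),
        batch_size=int(env.get("DATADOG_BATCH_SIZE", d.batch_size)),
        flush_interval=float(env.get("DATADOG_FLUSH_INTERVAL", d.flush_interval)),
        max_retries=int(env.get("DATADOG_MAX_RETRIES", d.max_retries)),
        dead_letter_path=env.get("DATADOG_DEAD_LETTER_PATH", d.dead_letter_path),
    )
```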

  1. In Datadog, go to Organization Settings → API Keys (or Account Settings → API Keys in some plans).
  2. Click New Key, enter a name (e.g., arbitex-siem), and copy the key value.
  3. API keys have no scope restrictions for log ingestion — no additional permissions are needed.

Set the environment variables on your Arbitex deployment:

```sh
export DATADOG_API_KEY="your-api-key-here"
export DATADOG_SITE="datadoghq.com"   # or your region's site
export DATADOG_SOURCE="arbitex"
export DATADOG_SERVICE="arbitex-platform"
export DATADOG_TAGS="env:production,team:platform"
```
Step 3 — Set up a log pipeline (recommended)

Datadog Log Pipelines parse the message field (which contains the OCSF JSON string) into structured attributes for searching and alerting.

  1. In Datadog, go to Logs → Configuration → Pipelines.
  2. Click Add a new pipeline.
  3. Set the filter to source:arbitex.
  4. Add a JSON Parser processor:
    • Source attribute: message
    • This extracts OCSF fields (class_uid, time, severity_id, actor.user.uid, etc.) as top-level log attributes.
  5. (Optional) Add a Date Remapper processor:
    • Source attribute: time (epoch milliseconds)
    • This sets the official log timestamp from the OCSF time field.
  6. (Optional) Add a Severity Remapper to map severity → status.
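As an illustration of the pipeline's effect (field values here are invented; the exact attribute layout depends on your processors), a log arrives with the OCSF event serialized inside message:

```json
{
  "ddsource": "arbitex",
  "service": "arbitex-platform",
  "message": "{\"class_uid\": 3002, \"time\": 1741564800000, \"severity_id\": 4}"
}
```

After the JSON Parser runs, class_uid, time, and severity_id become searchable attributes (@class_uid, @severity_id, …), and the Date Remapper sets the official log timestamp from time.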

Step 4 — Create log indexes and retention


By default, logs land in the default index. For compliance retention requirements, create a dedicated index:

  1. Go to Logs → Configuration → Indexes.
  2. Click New Index, name it arbitex-audit.
  3. Set a filter of source:arbitex.
  4. Set the retention period (e.g., 90 days for SOC 2, 1 year for HIPAA).

In the Arbitex admin UI, go to Admin → SIEM. The Datadog connector row shows:

  • Healthy — API key validation returned 200
  • Error — API key is invalid (403)
  • Degraded — API validation returned an unexpected status
  • Not configured — DATADOG_API_KEY is not set

Click Send test event to send a synthetic OCSF event. In Datadog Log Explorer, search:

```
source:arbitex @api.operation:siem_test_event
```

The event should appear within a few seconds.


Once the log pipeline is in place and fields are extracted, use Datadog Log Explorer to investigate Arbitex events:

```
# All DLP blocks
source:arbitex @class_uid:2001 @finding.types:DLP

# Auth failures in the last hour
source:arbitex @class_uid:3002 status:error

# Specific user activity
source:arbitex @actor.user.uid:usr_01HZ_ALICE

# High-severity events
source:arbitex @severity_id:[4 TO 5]
```

Failed batches are written to /var/log/arbitex/datadog_dead_letter.jsonl (or the path set by DATADOG_DEAD_LETTER_PATH). Each line is a JSON record of the form:

```json
{
  "event": { ... },
  "error": "HTTP 429: Too Many Requests",
  "connector": "datadog",
  "timestamp": 1741564800.0
}
```
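Appending a record in this format can be sketched as follows (illustrative; `write_dead_letter` is an assumed name, not the connector's actual API):

```python
import json
import time


def write_dead_letter(event: dict, error: str,
                      path: str = "/var/log/arbitex/datadog_dead_letter.jsonl") -> str:
    """Append one failed event as a JSONL record in the format shown above."""
    record = {
        "event": event,
        "error": error,
        "connector": "datadog",
        "timestamp": time.time(),  # epoch seconds, matching the example
    }
    line = json.dumps(record)
    with open(path, "a") as fh:   # append-only: one JSON object per line
        fh.write(line + "\n")
    return line
```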

To replay, extract event objects and POST them directly to the Datadog Logs API:

```sh
jq -sc '[.[].event]' /var/log/arbitex/datadog_dead_letter.jsonl \
  | curl -s -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
      -H "Content-Type: application/json" \
      -H "DD-API-KEY: $DATADOG_API_KEY" \
      --data-binary @-
```