recalled.dev
Core concepts

Events API

The events API is how your app pushes audit records into Recalled and reads them back.

Examples below show JSON payloads. For ready-to-run snippets in curl, Python, Go, PHP, Ruby, Java and Rust, see Use from any language.

Create an event

POST /v1/events

```json
{
  "action": "invoice.deleted",
  "actor": {
    "type": "user",
    "id": "user_123",
    "name": "Alice",
    "email": "alice@example.com"
  },
  "organization": "org_abc",
  "targets": [{ "type": "invoice", "id": "inv_42" }],
  "metadata": { "reason": "duplicate" },
  "occurred_at": "2026-04-14T09:12:45.000Z"
}
```

Required: action. Recommended: actor.id, organization. Everything else is optional metadata.

The server computes two things on ingest: a SHA-256 hash that chains the event to the previous event in the same project, and an HMAC-SHA256 signature over the canonical payload, keyed by a secret that lives outside the database. The chain detects reordering and gaps; the signature detects content rewrites. Call `GET /v1/events/verify` to audit both at once.
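The chain-plus-signature scheme can be sketched in a few lines. This is an illustration, not Recalled's implementation: the canonical form is assumed here to be compact sorted-key JSON, and `sign_event` is a hypothetical helper.

```python
import hashlib
import hmac
import json

def sign_event(event: dict, prev_hash: "str | None", secret: bytes) -> dict:
    # Assumed canonicalization: compact JSON with sorted keys.
    canonical = json.dumps(event, sort_keys=True, separators=(",", ":")).encode()
    # Chain: the hash covers the previous event's hash plus this payload,
    # so reordering or deleting a row breaks every later link.
    chained = hashlib.sha256((prev_hash or "").encode() + canonical).hexdigest()
    # Signature: the HMAC covers the payload only, keyed by an out-of-band
    # secret, so rewriting a row's content without the key is detectable.
    sig = "v1:" + hmac.new(secret, canonical, hashlib.sha256).hexdigest()
    return {"hash": chained, "signature": sig}
```

Note how the two values fail independently: the same payload always yields the same signature, while its hash depends on its position in the chain.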

Field reference

Every field you can send on POST /v1/events, with its type, required status and what it's for.

action, required

String, 1 to 255 chars. The verb-style name of what happened. This is the only mandatory field.

Recalled doesn't enforce a naming scheme but we recommend domain.subject.verb dot-separated, past tense:

  • Good: user.logged_in, invoice.deleted, billing.subscription.updated, api_key.rotated
  • Bad: click, error, something happened, User Login

Consistent naming pays off later: it's what powers exact-match filtering (?action=user.deleted), wildcard retention rules (user.*) and full-text search.

For the full naming convention, the standard verb list and a category-by-category catalogue of what to log, see What to log.
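The recommended shape can be linted client-side before an event is ever sent. The regex below is a hypothetical check encoding the dot-separated, lowercase convention; the API itself only enforces the 1 to 255 char length.

```python
import re

# Recommended shape: domain.subject.verb, lowercase snake_case segments,
# at least two dot-separated parts. Stricter than the API requires.
ACTION_RE = re.compile(r"[a-z][a-z0-9_]*(\.[a-z][a-z0-9_]*)+")

def is_recommended_action(action: str) -> bool:
    return 1 <= len(action) <= 255 and ACTION_RE.fullmatch(action) is not None
```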

organization, optional

String, max 128 chars. The tenant identifier in your own product, not a Recalled concept.

If your SaaS is multi-tenant, put your internal customer/tenant ID here (e.g. org_acme, tenant_42). Recalled uses it to:

  • Filter events in dashboard and API (?organization=org_acme)
  • Narrow an embed token so <RecalledFeed /> acts as a per-tenant drill-down inside your admin panel
  • Route GDPR deletion by organization if needed

If your app is single-tenant or the event isn't tied to a specific customer (cron, system tasks), leave it empty.

actor, optional object

Who performed the action. All sub-fields are optional, but actor.id is strongly recommended when the action is triggered by a human user.

  • actor.id (string, 1-255 chars): stable user ID from your DB. Enables per-user filtering and GDPR deletion via DELETE /v1/actors/:id.
  • actor.type (string, 1-64 chars): user, service, api_key, system, etc. Distinguishes human from automated actors.
  • actor.name (string, max 255 chars): display name, shown in dashboard and embed feed.
  • actor.email (string, max 255 chars, valid email): optional, shown in dashboard.

Leave actor out entirely for system events (cron, migration, startup tasks).

targets, optional array

List of resources the action operated on. Max 20 entries per event, and the serialized JSON of the whole array must stay under 4 KB. Each entry has:

  • type (string, 1-64 chars, required): resource type (invoice, project, api_key).
  • id (string, 1-255 chars, required): resource ID in your DB.
  • name (string, max 255 chars, optional): display name.

Example: a user moved two items to a folder:

```json
{
  "action": "folder.items.moved",
  "actor": { "id": "user_1" },
  "targets": [
    { "type": "item", "id": "item_a", "name": "Invoice Q1" },
    { "type": "item", "id": "item_b", "name": "Invoice Q2" },
    { "type": "folder", "id": "folder_archive", "name": "Archive" }
  ]
}
```

metadata, optional object

Free-form JSON. Put anything you want to remember about the context:

```json
{
  "metadata": {
    "reason": "duplicate",
    "source": "admin_panel",
    "diff": { "before": "draft", "after": "paid" }
  }
}
```

No schema is enforced, so it's flexible but not searchable by inner field. Serialized JSON must stay under 8 KB; typical events come in well under 1 KB. Beyond the limit, the API rejects the event with HTTP 413.

occurred_at, optional ISO 8601

When the action actually happened, as seen by your app. Format 2026-04-14T09:12:45.000Z.

If omitted, the server timestamps the event at ingest time. That's what you want for real-time logging. Only set it explicitly when replaying historical events or when there's a meaningful delay between the action and the API call.
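When you do set occurred_at for a replay, a small helper keeps the timestamp in the documented millisecond UTC form. A sketch (`iso_millis` is a hypothetical name):

```python
from datetime import datetime, timezone

def iso_millis(dt: datetime) -> str:
    # Normalize to UTC, then render with millisecond precision and a Z
    # suffix, matching the documented format: 2026-04-14T09:12:45.000Z
    dt = dt.astimezone(timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S") + f".{dt.microsecond // 1000:03d}Z"
```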

Per-event size limits

Each event's payload is capped at ingest. These caps apply to POST /v1/events only.

  • action: 255 chars
  • metadata: 8 KB serialized JSON
  • targets: 4 KB serialized JSON, 20 entries max
  • actor.id, actor.name, actor.email: 255 chars each

A typical event weighs under 500 bytes total. The caps are roughly 20× the usual metadata size: generous enough to absorb a richly tagged event without leaving the door open to a client that accidentally dumps a stack trace, a request body or an entire document into a single event.

When a payload exceeds a cap, the API returns:

```http
HTTP/1.1 413 Payload Too Large
Content-Type: application/json

{
  "error": {
    "code": "EVENT_TOO_LARGE",
    "message": "metadata is too large: 12453 bytes, limit is 8192",
    "details": {
      "field": "metadata",
      "size": 12453,
      "limit": 8192
    }
  }
}
```

If you keep hitting this in legitimate cases, you probably want to split the data: log a slim event referencing an external resource (S3 key, blob storage URL) instead of inlining the payload itself.
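A client-side pre-check can catch oversized events before the round-trip. This sketch uses the caps from the table above; `oversize_fields` is a hypothetical helper, and your serializer may not byte-match the server's, so treat it as an early warning rather than a guarantee.

```python
import json

BYTE_CAPS = {"metadata": 8192, "targets": 4096}  # limits from the table above

def oversize_fields(event: dict) -> list:
    """Names of fields that would likely trip EVENT_TOO_LARGE."""
    bad = []
    for field, cap in BYTE_CAPS.items():
        if field in event:
            size = len(json.dumps(event[field], separators=(",", ":")).encode())
            if size > cap:
                bad.append(field)
    # targets also has an entry-count cap, independent of its byte size.
    if len(event.get("targets", [])) > 20 and "targets" not in bad:
        bad.append("targets")
    return bad
```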

Fields the server fills in

You never send these; Recalled adds them on ingest:

  • id: UUID assigned at ingest.
  • project_id: inferred from the API key.
  • ip_address: IP of the ingest request.
  • user_agent: User-Agent header of the ingest request.
  • hash: SHA-256 of prev_hash concatenated with the canonical event payload. Chain evidence.
  • prev_hash: the hash of the previous event in this project; null for the very first one.
  • signature: HMAC-SHA256 of the canonical payload, prefixed with the key version (e.g. v1:...). The server-side secret is never stored in the database.
  • anonymized_at: ISO timestamp set when PII was scrubbed via GDPR erasure; null otherwise.

List events

GET /v1/events?limit=50&cursor=<iso>

Query params:

  • limit (default 50, max 200)
  • cursor, ISO timestamp from the previous page's nextCursor
  • organization, tenant filter
  • actor_id, filter on a specific actor id
  • action, exact match filter on a single action
  • actions, comma-separated list of actions to include (e.g. user.login,user.logout). Max 50 entries.
  • actions_exclude, comma-separated list of actions to exclude. Max 50 entries.
  • ip_address, filter on a specific IP
  • date_from, date_to, ISO bounds

Returns { data: Event[], nextCursor: string | null }.
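Paging through the list endpoint is mechanical enough to wrap in a generator, with the HTTP call injected so the loop stays testable. `iter_events` and `fetch_page` are hypothetical names; `fetch_page(cursor)` would perform the GET and return the decoded JSON body.

```python
from typing import Callable, Iterator, Optional

def iter_events(fetch_page: Callable[[Optional[str]], dict]) -> Iterator[dict]:
    # Follow nextCursor until the API signals the last page with null.
    cursor = None
    while True:
        page = fetch_page(cursor)
        yield from page["data"]
        cursor = page.get("nextCursor")
        if cursor is None:
            break
```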

Search

GET /v1/events/search?q=<term>

Full-text search across action, actor_name, actor_email, actor_id. Cursor-paginated like list.

Query params:

  • q, required, the search term (1-255 chars)
  • limit, cursor, pagination like list
  • organization, actor_id, actions, actions_exclude, ip_address, date_from, date_to, same filter semantics as list, applied on top of the text search

Get one

GET /v1/events/:id

Returns a single event (same shape as list items), scoped to the project of the API key.

Export

GET /v1/exports?format=csv or format=json

Streams the filtered events as a downloadable file. Same filters as list.

Verify the chain

GET /v1/events/verify

Walks every event in the project in occurred-at order and checks:

  • Chain link: each prev_hash equals the previous row's hash.
  • Stored hash: recompute sha256(prev_hash || canonical_payload) and compare to hash.
  • HMAC signature: recompute hmac-sha256(secret, canonical_payload) and compare to signature.

Optional query params ?from=<ISO> and ?to=<ISO> limit the check to a window.

The endpoint always returns HTTP 200; the response payload tells you what happened:

```json
{
  "data": {
    "ok": true,
    "verified": 1284,
    "anonymized": 3,
    "unsigned": 0,
    "gaps": [
      { "at": "2026-03-01T00:00:00.000Z", "reason": "plan_retention", "purged_count": 112 }
    ],
    "failure": null
  }
}
```

When something fails, ok is false and failure pinpoints the offender:

```json
{
  "data": {
    "ok": false,
    "verified": 842,
    "anonymized": 0,
    "unsigned": 0,
    "gaps": [],
    "failure": {
      "event_id": "01HX...",
      "reason": "signature_mismatch",
      "at": "2026-04-12T14:07:13.000Z"
    }
  }
}
```

Failure reasons:

  • hash_mismatch: a row's payload no longer matches its stored hash.
  • signature_mismatch: a row's payload no longer matches its HMAC signature.
  • chain_broken: a row's prev_hash points nowhere, and no retention_checkpoint explains the gap.

Anonymized rows are reported as anonymized (skipped safely). Rows predating the HMAC rollout are reported as unsigned; running the backfill script clears the count.
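Because list responses and exports include hash and prev_hash, the chain-link portion of this check can be reproduced offline. A minimal sketch (`check_chain_links` is a hypothetical helper; it covers only link continuity, since recomputing stored hashes and HMAC signatures needs the canonical payload and the server-side secret):

```python
from typing import Optional

def check_chain_links(events: list) -> Optional[str]:
    """Given events in chain order, return the id of the first event whose
    prev_hash does not match its predecessor's hash, or None if intact."""
    prev = None
    for event in events:
        if event.get("prev_hash") != prev:
            return event["id"]
        prev = event["hash"]
    return None
```

Note that legitimate retention purges, which the server explains via gaps, would surface as mismatches in this naive check, so reconcile any hit against the verify endpoint's gap list.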

Receipts: a portable, citable proof for one event

GET /v1/events/:id/receipt

Returns a single self-contained JSON receipt for one event, with two URLs you can hand out to anyone:

```json
{
  "data": {
    "type": "recalled.receipt.v1",
    "event_id": "01HX...",
    "action": "file.deleted",
    "actor": { "type": "agent", "id": "claude-sonnet-4.6" },
    "target": { "type": "file", "id": "f_42" },
    "occurred_at": "2026-05-02T17:42:00.000Z",
    "hash": "...",
    "prev_hash": "...",
    "signature": "v1:...",
    "verification_url": "https://api.recalled.dev/v1/receipts/01HX...",
    "view_url": "https://recalled.dev/receipts/01HX..."
  }
}
```

The view_url is a public webpage that confirms the event exists and the chain is intact, with no API key required. The verification_url is the raw JSON version of the same check. Use this when an AI agent needs to cite an action it took, or when you want to prove to a customer that an event happened without giving them dashboard access. See the Agent audit guide for the full pattern.