POST /v1/events/batch

Batch Ingest Events
curl --request POST \
  --url https://api.example.com/v1/events/batch \
  --header 'Authorization: Bearer <api-key>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "events": [
    {
      "event_name": "<string>",
      "external_customer_id": "<string>",
      "event_id": "<string>",
      "customer_id": "<string>",
      "timestamp": "<string>",
      "source": "<string>",
      "properties": {}
    }
  ]
}
'
{
  "message": "<string>"
}
Use this endpoint to batch usage events for high-volume ingestion or to backfill historical data. It is more efficient than individual event calls when you have multiple events to send.

Use Cases

  • High-volume event ingestion (logs, metrics, telemetry)
  • Backfilling historical usage data
  • Batch imports from data warehouses
  • Scheduled bulk uploads
  • End-of-day usage summaries

Request Body

events
array
required
Array of event objects to ingest. Minimum 1 event, maximum 1000 events per request. Each event object has the same structure as the single event endpoint.

Event Object Structure

events[].event_name
string
required
Unique identifier for the type of event.
events[].external_customer_id
string
required
Your system’s unique identifier for the customer.
events[].event_id
string
Optional idempotency key. Recommended for ensuring exactly-once processing.
events[].customer_id
string
FlexPrice’s internal customer ID. Optional if using external_customer_id.
events[].timestamp
string
ISO 8601 timestamp. Defaults to current server time if omitted.
events[].source
string
Optional identifier for the event source.
events[].properties
object
Arbitrary key-value pairs with event metadata.
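The required and optional fields above can be checked locally before a request is made, so a single malformed event does not reject the whole batch. A minimal sketch in JavaScript; `validateBatch` is an illustrative helper, not part of any SDK:

```javascript
// Validate a batch locally before calling the API.
// Returns a list of problems; an empty list means the batch looks sendable.
function validateBatch(events) {
  const problems = [];
  if (!Array.isArray(events) || events.length === 0) {
    problems.push("events must be a non-empty array");
    return problems;
  }
  if (events.length > 1000) {
    problems.push("maximum 1000 events per request");
  }
  events.forEach((event, i) => {
    if (!event.event_name) {
      problems.push(`events[${i}]: event_name is required`);
    }
    if (!event.external_customer_id && !event.customer_id) {
      problems.push(`events[${i}]: external_customer_id (or customer_id) is required`);
    }
    if (event.timestamp && Number.isNaN(Date.parse(event.timestamp))) {
      problems.push(`events[${i}]: timestamp is not a valid ISO 8601 string`);
    }
  });
  return problems;
}
```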

Response

message
string
Confirmation message: "Events accepted for processing"

Examples

curl -X POST https://us.api.flexprice.io/v1/events/batch \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "events": [
      {
        "event_name": "api_request",
        "external_customer_id": "customer_123",
        "event_id": "req_001",
        "timestamp": "2024-03-20T10:00:00Z",
        "properties": {
          "endpoint": "/v1/users",
          "method": "GET",
          "status": 200
        }
      },
      {
        "event_name": "api_request",
        "external_customer_id": "customer_123",
        "event_id": "req_002",
        "timestamp": "2024-03-20T10:05:00Z",
        "properties": {
          "endpoint": "/v1/orders",
          "method": "POST",
          "status": 201
        }
      },
      {
        "event_name": "storage_usage",
        "external_customer_id": "customer_456",
        "event_id": "storage_001",
        "timestamp": "2024-03-20T10:00:00Z",
        "properties": {
          "storage_gb": 250.5,
          "region": "us-east-1"
        }
      }
    ]
  }'
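The same request can be issued from Node.js (18+, where `fetch` is global). A sketch under that assumption, reusing the endpoint and headers from the curl example; `buildBatchRequest` and `ingestBatch` are illustrative names:

```javascript
// Build the request for the batch endpoint (URL and headers as in the curl example).
function buildBatchRequest(events, apiKey) {
  return {
    url: "https://us.api.flexprice.io/v1/events/batch",
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ events }),
    },
  };
}

// Send it with Node 18+'s global fetch.
async function ingestBatch(events, apiKey) {
  const { url, options } = buildBatchRequest(events, apiKey);
  const response = await fetch(url, options);
  if (!response.ok) {
    throw new Error(`Batch ingest failed: ${response.status}`);
  }
  return response.json(); // 202 body: { message: "Events accepted for processing" }
}
```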

Response Codes

202
Accepted
Events accepted for processing. All events are queued and will be processed asynchronously.
400
Bad Request
Invalid request. Common issues:
  • Empty events array
  • More than 1000 events in a single request
  • Missing required fields in one or more events
  • Invalid timestamp format
401
Unauthorized
Missing or invalid API key.
500
Server Error
Internal server error. Events were not accepted.
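One minimal way to branch on these codes in client code; mapping 5xx to "retry" is a common convention (safe here when events carry an event_id for idempotency), not something the API mandates:

```javascript
// Map a batch response status to a client action.
// 202: done; 400/401: fix the payload or credentials (retrying won't help);
// 5xx: events were not accepted, so resending is safe.
function classifyBatchResponse(status) {
  if (status === 202) return "accepted";
  if (status === 400 || status === 401) return "fix-and-resend";
  if (status >= 500) return "retry";
  return "unexpected";
}
```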

Limits

  • Batch size: 1-1000 events per request
  • Rate limits: Same as single event endpoint (per-API-key limits apply)
  • Payload size: Recommended max 10MB per request
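The 10MB recommendation can be enforced client-side by measuring the serialized body before sending; a sketch assuming Node.js, where `Buffer` is a global:

```javascript
// Check the serialized batch against the recommended 10MB ceiling.
const MAX_PAYLOAD_BYTES = 10 * 1024 * 1024;

function payloadTooLarge(events) {
  const body = JSON.stringify({ events });
  // Buffer.byteLength counts UTF-8 bytes, not characters.
  return Buffer.byteLength(body, "utf8") > MAX_PAYLOAD_BYTES;
}
```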

Best Practices

  1. Optimal batch size: Use 100-500 events per request for best performance. Larger batches reduce HTTP overhead but may time out.
  2. Include event IDs: Always provide unique event_id values for idempotency, especially for retries.
  3. Handle partial failures: If the entire batch fails, consider splitting it into smaller batches and retrying.
  4. Order matters for backfills: When backfilling, send events in chronological order (oldest first) to ensure accurate billing period calculations.
  5. Monitor ingestion lag: Use the monitoring endpoint to track processing delays during high-volume ingestion.
  6. Validate before sending: Check event structure locally before making the API call to avoid rejecting entire batches.
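Practice 3 can be sketched as a recursive bisect-and-retry: when a batch is rejected, split it in half and resend each half until the bad events are isolated. `sendBatch` here stands in for whatever client call you use:

```javascript
// On failure, split the batch in half and retry each half.
// Stops splitting at single events so one bad event can't block the rest.
async function sendWithSplitting(events, sendBatch) {
  try {
    await sendBatch(events);
    return { sent: events.length, failed: 0 };
  } catch (err) {
    if (events.length === 1) {
      return { sent: 0, failed: 1 }; // isolated the bad event
    }
    const mid = Math.ceil(events.length / 2);
    const left = await sendWithSplitting(events.slice(0, mid), sendBatch);
    const right = await sendWithSplitting(events.slice(mid), sendBatch);
    return { sent: left.sent + right.sent, failed: left.failed + right.failed };
  }
}
```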

Backfilling Historical Data

When importing historical usage data:
// Batch events by day for manageable request sizes.
// groupEventsByDay and client are assumed to exist in your codebase;
// chunkArray and sleep are small helpers defined here.
const chunkArray = (arr, size) =>
  Array.from({ length: Math.ceil(arr.length / size) }, (_, i) =>
    arr.slice(i * size, (i + 1) * size));
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const eventsByDay = groupEventsByDay(historicalEvents);

for (const [day, events] of Object.entries(eventsByDay)) {
  // Split into chunks of 500, within the 1000-event batch limit
  const chunks = chunkArray(events, 500);

  for (const chunk of chunks) {
    await client.events.ingestBatch({ events: chunk });

    // Add a short delay between requests to stay under rate limits
    await sleep(100);
  }
}