Event Logs
Event logs are the foundation of observability and operational intelligence in ChatBotKit, automatically capturing detailed information about every significant action, interaction, and state change across the platform. These logs provide a time-series record of system behavior that supports multiple critical use cases including real-time monitoring, historical analysis, debugging complex issues, compliance reporting, usage analytics, and billing reconciliation.
The event logging system operates transparently in the background, recording events from all major platform components including conversations, message exchanges, integration activities, bot interactions, dataset operations, skillset executions, and file processing. Each event captures relevant context including resource identifiers, event type, timestamp, metadata, and relationships to other platform objects, creating a comprehensive audit trail that can be queried, analyzed, and exported for various purposes.
Event logs are particularly valuable for understanding system behavior at scale, identifying performance bottlenecks, troubleshooting integration issues, generating usage reports, detecting anomalies, and ensuring platform reliability. The logs provide visibility into both successful operations and errors, making them essential for maintaining production systems and optimizing conversational AI deployments.
Listing Event Logs
The event log listing endpoint provides flexible querying capabilities for retrieving historical event data with comprehensive filtering, pagination, and ordering options. You can query events by various dimensions including time range, event type, associated resources, and custom metadata, enabling both broad operational monitoring and targeted investigation of specific scenarios or issues.
Event log queries support cursor-based pagination for efficient traversal of large result sets, with configurable page sizes and ordering (ascending or descending by timestamp). The endpoint can return data in both JSON and JSONL (JSON Lines) formats, with JSONL being particularly efficient for processing large volumes of events through streaming pipelines or batch analysis tools.
The response includes an array of event log entries, each containing:
- Event metadata: Unique identifier, event type, creation timestamp
- Resource associations: References to related bots, conversations, integrations, datasets
- Context information: User ID, organization context, platform identifiers
- Custom metadata: Event-specific data captured at the time of occurrence
- Pagination data: Cursor tokens for retrieving subsequent pages
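Combining these fields, a single entry might look like the following sketch; the property names are illustrative assumptions rather than a guaranteed schema:

```javascript
// Illustrative shape of a single event log entry (property names are assumptions).
const exampleEntry = {
  id: 'evlog_abc123',            // unique event identifier
  type: 'conversation.complete', // event type
  createdAt: 1717430400000,      // creation timestamp (epoch milliseconds)
  meta: {
    botId: 'bot_xyz789',         // resource associations captured as metadata
    conversationId: 'conv_def456',
  },
}
```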
Filtering Event Logs
Event logs can be filtered by metadata fields using deep object notation, allowing you to query for specific types of events, resources, or conditions:
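For example, a request that narrows the results to a single conversation might look like the sketch below; the api.chatbotkit.com base URL, the bearer-token header, and the exact meta[...] and order parameter names are assumptions for illustration:

```javascript
// Sketch: list events whose metadata references a specific conversation.
// Base URL, header, and parameter names are assumptions for illustration.
const params = new URLSearchParams({
  'meta[conversationId]': 'conv_def456', // deep object notation filter
  order: 'desc',                         // newest events first
})

const response = await fetch(
  `https://api.chatbotkit.com/v1/event/log/list?${params}`,
  { headers: { Authorization: `Bearer ${process.env.CHATBOTKIT_TOKEN}` } }
)

const { items } = await response.json() // assumed response shape
```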
This filtering capability enables targeted queries for:
- Resource-specific events: All events related to a particular bot, conversation, or integration
- Event type filtering: Focus on specific categories like errors, completions, or status changes
- Time-based analysis: Retrieve events within specific time windows for trend analysis
- Custom attribute queries: Filter by any metadata field captured during event creation
Pagination and Performance
For optimal performance when working with large event log datasets, the endpoint implements cursor-based pagination. Use the cursor parameter with values returned from previous queries to traverse the result set efficiently:
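A minimal pagination loop might look like the following sketch; the take parameter, the items response field, and the convention of using the last item's id as the next cursor are assumptions:

```javascript
// Sketch: traverse the full event log using cursor-based pagination.
// The take parameter, items field, and cursor derivation are assumptions.
async function* listAllEvents(token) {
  let cursor

  while (true) {
    const params = new URLSearchParams({ take: '200' })
    if (cursor) params.set('cursor', cursor)

    const response = await fetch(
      `https://api.chatbotkit.com/v1/event/log/list?${params}`,
      { headers: { Authorization: `Bearer ${token}` } }
    )

    const { items } = await response.json()
    if (!items.length) break

    yield* items

    cursor = items[items.length - 1].id // common convention: last item id as next cursor
  }
}

// Usage: process every event across all pages.
for await (const event of listAllEvents(process.env.CHATBOTKIT_TOKEN)) {
  console.log(event.type, event.createdAt)
}
```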
Best Practices:
- Use reasonable page sizes (50-500 events) to balance latency and data transfer
- Implement cursor-based pagination for processing large event volumes
- Filter events to reduce result set size when possible
- Consider using JSONL format for bulk processing and analysis
- Cache event log data when performing repeated queries
- Use descending order (newest first) for monitoring recent activity
Use Cases:
- Real-time operational monitoring and alerting
- Troubleshooting conversation or integration issues
- Generating usage and billing reports
- Compliance and audit trail documentation
- Performance analysis and optimization
- Customer support investigation and debugging
- Product analytics and feature usage tracking
- Security monitoring and anomaly detection
Important Notes:
- Event logs are retained according to your plan's data retention policy
- High-volume applications should implement appropriate filtering and pagination
- Event log queries count against API rate limits
- Bulk exports may be more efficient for large historical analysis
- Events are immutable once created and cannot be modified or deleted
- Sensitive data may be redacted in event logs based on privacy settings
Exporting Event Logs
The event log export endpoint provides bulk data extraction capabilities for downloading large volumes of historical event data in multiple formats including JSON, JSONL (JSON Lines), and CSV. This endpoint is specifically optimized for batch processing, data warehousing integration, compliance archival, and detailed analytics workflows that require processing comprehensive event datasets outside the platform.
Unlike the list endpoint which is designed for interactive queries with pagination, the export endpoint streams complete result sets efficiently, making it ideal for periodic data exports, backup operations, integration with external analytics platforms, and generating comprehensive audit reports that span extended time periods or encompass thousands of events.
The export functionality supports the same powerful filtering capabilities as the list endpoint, allowing you to narrow exports to specific event types, resources, time ranges, or metadata attributes. This granular control ensures you can extract precisely the data you need without transferring unnecessary information, optimizing bandwidth usage and downstream processing efficiency.
For large-scale data exports, JSONL format offers significant advantages over standard JSON by streaming events as newline-delimited records, enabling incremental processing without loading the entire dataset into memory:
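The sketch below shows one way to consume a JSONL export incrementally in Node.js 18+; the /event/log/export path is an assumption based on the endpoint naming used elsewhere in this section:

```javascript
// Sketch: stream a JSONL export line by line without buffering the whole file (Node.js 18+).
// The /event/log/export path is an assumption; Accept selects the JSONL format.
const response = await fetch('https://api.chatbotkit.com/v1/event/log/export', {
  headers: {
    Authorization: `Bearer ${process.env.CHATBOTKIT_TOKEN}`,
    Accept: 'application/jsonl',
  },
})

const decoder = new TextDecoder()
let buffer = ''

for await (const chunk of response.body) {
  buffer += decoder.decode(chunk, { stream: true })

  let newline
  while ((newline = buffer.indexOf('\n')) >= 0) {
    const line = buffer.slice(0, newline).trim()
    buffer = buffer.slice(newline + 1)

    if (line) {
      const event = JSON.parse(line) // one event per line
      // ...process the event incrementally...
    }
  }
}
```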
CSV format provides maximum compatibility with spreadsheet applications, business intelligence tools, and traditional data processing pipelines:
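A CSV export can be saved directly to disk for use in those tools; the sketch below assumes the same hypothetical /event/log/export path:

```javascript
// Sketch: download a CSV export and save it to disk (Node.js 18+).
// The /event/log/export path is an assumption; Accept: text/csv selects CSV output.
import { writeFile } from 'node:fs/promises'

const response = await fetch('https://api.chatbotkit.com/v1/event/log/export', {
  headers: {
    Authorization: `Bearer ${process.env.CHATBOTKIT_TOKEN}`,
    Accept: 'text/csv',
  },
})

await writeFile('event-logs.csv', await response.text())
```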
Filtering Exported Data
Apply the same filtering parameters available in the list endpoint to focus your export on specific subsets of event data:
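For instance, an export limited to a single bot could be requested as follows; the meta[...] parameter style mirrors the list endpoint and the exact names are assumptions:

```javascript
// Sketch: export only events related to a specific bot, in JSONL format.
// The meta[...] filter notation and endpoint path are assumptions.
const params = new URLSearchParams({ 'meta[botId]': 'bot_xyz789' })

const response = await fetch(
  `https://api.chatbotkit.com/v1/event/log/export?${params}`,
  {
    headers: {
      Authorization: `Bearer ${process.env.CHATBOTKIT_TOKEN}`,
      Accept: 'application/jsonl',
    },
  }
)
```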
Common filtering scenarios include:
- Resource-specific exports: Extract all events for a particular bot, conversation, integration, or dataset
- Event type filtering: Export only specific event categories like completions, errors, or status changes
- Time-bounded extracts: Retrieve events within defined time windows for periodic reporting
- Metadata-based selection: Filter by custom attributes to extract events matching specific business criteria
Format Selection and Use Cases
JSON Format (Accept: application/json):
- Best for: Direct API consumption, single-shot data transfers, small to medium datasets
- Characteristics: Complete array response, easy to parse, widely supported
- Limitations: Entire dataset loaded into memory, not suitable for very large exports
JSONL Format (Accept: application/jsonl):
- Best for: Large-scale exports, streaming processing, data pipeline integration
- Characteristics: One JSON object per line, streamable, memory-efficient
- Use cases: ETL processes, log aggregation systems, big data platforms
CSV Format (Accept: text/csv):
- Best for: Business analytics, spreadsheet analysis, reporting tools
- Characteristics: Tabular structure, universal compatibility, human-readable
- Use cases: Business reports, Excel analysis, BI tool imports
Export Best Practices
Performance Optimization:
- Use JSONL for exports exceeding 10,000 events
- Apply filtering to reduce dataset size when possible
- Consider incremental exports using time-based filtering
- Schedule large exports during off-peak hours
Data Management:
- Implement checkpointing for resumable exports
- Validate data integrity after export completion
- Compress exported files for storage and transfer
- Establish retention policies aligned with compliance requirements
Integration Patterns:
- Stream JSONL exports directly to data lakes or warehouses
- Use CSV exports for business intelligence and reporting tools
- Implement automated periodic exports for backup and archival
- Coordinate exports with event log retention policies
Metadata Handling in Exports
Event metadata is preserved in JSON and JSONL formats as structured objects, enabling rich downstream analysis. In CSV format, complex metadata fields are serialized as YAML strings within CSV cells, allowing representation of hierarchical data while maintaining CSV compatibility. This approach ensures no data loss during format conversion while supporting tools that expect flat tabular structures.
Important Considerations:
- Export operations count against API rate limits in proportion to the volume of data retrieved
- Very large exports may experience timeouts; use filtering and incremental approaches
- Exported data reflects the event log as of the time of export; events generated while the export runs may not be included
- Sensitive data handling: ensure appropriate security for exported files containing event details
- Export file sizes should be monitored to prevent storage issues
- Exported event logs are immutable snapshots; they don't update if source events change
Subscribing to Live Event Logs
Event log subscription enables real-time streaming of events as they occur in your account. This creates a persistent connection that continuously delivers events, making it ideal for building real-time monitoring dashboards, live debugging tools, and reactive integrations.
To subscribe to live event logs, establish a streaming connection using the subscribe endpoint:
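A connection can be opened with a plain HTTP request; the sketch below assumes a /event/log/subscribe path and bearer-token authentication:

```javascript
// Sketch: open a long-lived streaming connection to the subscribe endpoint.
// The /event/log/subscribe path is an assumption.
const response = await fetch('https://api.chatbotkit.com/v1/event/log/subscribe', {
  headers: { Authorization: `Bearer ${process.env.CHATBOTKIT_TOKEN}` },
})

// response.body is a stream that stays open and delivers events as they occur.
```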
Once connected, the endpoint returns a streaming response that remains open, continuously sending event log entries as they are generated. Each event contains the full event data including type, related resource IDs, metadata, and timestamp.
Understanding the Streaming Response
The subscription endpoint uses JSON Lines (JSONL) format, where each line represents a separate event. Events are wrapped in an envelope with type and data fields, matching the format used by other list endpoints:
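The sketch below shows how one streamed line might be parsed; the "item" envelope type and the field values are illustrative assumptions:

```javascript
// Illustrative parsing of one streamed line; the "item" envelope type and
// the field values are assumptions.
const line =
  '{"type":"item","data":{"id":"evlog_abc123","type":"conversation.complete","createdAt":1717430400000}}'

const envelope = JSON.parse(line)

if (envelope.type === 'item') {
  const event = envelope.data // the actual event log entry
}
```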
The streaming connection remains active until one of the following occurs:
- The client closes the connection
- A network error or timeout occurs
- The server terminates the connection due to inactivity
Catching Up with Historical Events
When connecting, you can optionally request recent historical events to be replayed before receiving live updates. This is useful for ensuring you don't miss events that occurred during connection setup:
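For example, the following sketch requests the 50 most recent events before the live stream begins (the endpoint path remains an assumption):

```javascript
// Sketch: replay the 50 most recent events before streaming live ones.
// The endpoint path is an assumption; historyLength is described below.
const response = await fetch(
  'https://api.chatbotkit.com/v1/event/log/subscribe?historyLength=50',
  { headers: { Authorization: `Bearer ${process.env.CHATBOTKIT_TOKEN}` } }
)
```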
The historyLength parameter specifies how many recent events to replay. These historical events are delivered first, followed by live events.
Use Cases
Live event streaming is particularly valuable for:
- Real-time dashboards: Display live activity and metrics
- Debugging: Monitor events as they occur during development
- External integrations: Stream events to third-party systems
- Alerting: Trigger actions based on specific event types
- Analytics: Feed events into real-time analytics pipelines
Implementation Example
Here's how to implement an event subscriber in JavaScript:
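The following is a minimal sketch using the Fetch API in Node.js 18+; the endpoint path, envelope shape, and reconnection policy are assumptions, and a production implementation (or the official SDK) may differ:

```javascript
// Sketch of a resilient event subscriber (Node.js 18+).
// Endpoint path, envelope shape, and backoff policy are assumptions.
async function subscribeToEvents(token, onEvent) {
  while (true) {
    try {
      const response = await fetch(
        'https://api.chatbotkit.com/v1/event/log/subscribe?historyLength=10',
        { headers: { Authorization: `Bearer ${token}` } }
      )

      const decoder = new TextDecoder()
      let buffer = ''

      for await (const chunk of response.body) {
        buffer += decoder.decode(chunk, { stream: true })

        let newline
        while ((newline = buffer.indexOf('\n')) >= 0) {
          const line = buffer.slice(0, newline).trim()
          buffer = buffer.slice(newline + 1)
          if (!line) continue

          const envelope = JSON.parse(line)
          await onEvent(envelope.data) // hand each event to the caller
        }
      }
    } catch (error) {
      console.error('Stream dropped, reconnecting shortly', error)
    }

    await new Promise((resolve) => setTimeout(resolve, 5000)) // simple fixed backoff
  }
}

// Usage: log every incoming event.
subscribeToEvents(process.env.CHATBOTKIT_TOKEN, async (event) => {
  console.log(event.type, event.id)
})
```

A production subscriber would typically add exponential backoff and track the last processed event so it can choose an appropriate historyLength when reconnecting.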
Best Practices:
- Implement reconnection logic to handle connection drops
- Use historyLength to catch up on missed events after reconnecting
- Process events asynchronously to avoid blocking the stream
- Filter events client-side based on type if you only need specific events
- For historical analysis, use the /event/log/list endpoint instead