Send Usage Events
Send usage events to Alguna to power usage-based billing. Events are aggregated according to your billable metrics and automatically appear on customer invoices.
Event Structure
Events are the basic unit of billable usage. Each event represents a single action a customer may be billed for.
Required Fields
| Field | Type | Description |
|---|---|---|
| uniqueId | string | Unique identifier for deduplication |
| eventName | string | Event type (matches your metric) |
| accountId | string | Customer ID or alias |
| timestamp | ISO 8601 | When the event occurred |
| properties | object | Event data for filtering/aggregation |
Example Event
{
  "uniqueId": "evt_abc123xyz789",
  "eventName": "api.request",
  "accountId": "acc_customer123",
  "timestamp": "2024-01-20T10:30:00Z",
  "properties": {
    "endpoint": "/api/v1/users",
    "method": "GET",
    "response_time_ms": 45,
    "bytes_transferred": 2048,
    "status_code": 200
  }
}
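A lightweight client-side check of the required fields can catch malformed events before they reach the API. This is an illustrative helper based on the field table above, not part of the Alguna SDK:

```javascript
// Illustrative pre-send validation for the required fields listed above.
// Returns an array of error messages (empty when the event looks valid).
function validateEvent(event) {
  const errors = [];
  for (const field of ['uniqueId', 'eventName', 'accountId', 'timestamp']) {
    if (typeof event[field] !== 'string' || event[field].length === 0) {
      errors.push(`${field} must be a non-empty string`);
    }
  }
  if (typeof event.timestamp === 'string' && Number.isNaN(Date.parse(event.timestamp))) {
    errors.push('timestamp must be a parseable ISO 8601 string');
  }
  if (event.properties !== undefined && typeof event.properties !== 'object') {
    errors.push('properties must be an object');
  }
  return errors;
}
```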
Send via API
The recommended method for real-time event ingestion.
Single Event
await alguna.events.ingest({
  uniqueId: 'evt_' + crypto.randomUUID(),
  eventName: 'api.request',
  accountId: 'acc_customer123',
  timestamp: new Date().toISOString(),
  properties: {
    endpoint: '/api/v1/users',
    method: 'GET',
    bytes_transferred: 2048,
  },
});
Batch Events
Send multiple events in a single request (recommended for high volume):
await alguna.events.ingestBatch([
  {
    uniqueId: 'evt_001',
    eventName: 'api.request',
    accountId: 'acc_customer123',
    timestamp: '2024-01-20T10:30:00Z',
    properties: { endpoint: '/api/v1/users', bytes: 1024 },
  },
  {
    uniqueId: 'evt_002',
    eventName: 'api.request',
    accountId: 'acc_customer123',
    timestamp: '2024-01-20T10:30:01Z',
    properties: { endpoint: '/api/v1/orders', bytes: 2048 },
  },
  {
    uniqueId: 'evt_003',
    eventName: 'api.request',
    accountId: 'acc_customer456',
    timestamp: '2024-01-20T10:30:02Z',
    properties: { endpoint: '/api/v1/products', bytes: 512 },
  },
]);
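For event streams larger than one request can hold, a small chunking helper keeps each `ingestBatch` call at a fixed size. `chunkEvents` is an illustrative utility, not an SDK method, and the default batch size of 500 is an assumption to tune for your volume:

```javascript
// Split a large event array into fixed-size batches before sending.
function chunkEvents(events, batchSize = 500) {
  const batches = [];
  for (let i = 0; i < events.length; i += batchSize) {
    batches.push(events.slice(i, i + batchSize));
  }
  return batches;
}
```

Usage sketch, assuming an `alguna` client configured as in the examples above: `for (const batch of chunkEvents(allEvents)) { await alguna.events.ingestBatch(batch); }`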
cURL Example
curl -X POST https://api.alguna.io/events/ingest \
  -H "Authorization: Bearer $ALGUNA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "events": [
      {
        "unique_id": "evt_abc123",
        "event_name": "api.request",
        "account_id": "acc_customer123",
        "timestamp": "2024-01-20T10:30:00Z",
        "properties": {
          "endpoint": "/api/v1/users",
          "method": "GET"
        }
      }
    ]
  }'
Response
{
  "accepted": 3,
  "rejected": 0,
  "errors": []
}
Send via CSV
Upload CSV files for batch ingestion or historical data.
unique_id,event_name,account_id,timestamp,properties
evt_001,api.request,acc_123,2024-01-20T10:00:00Z,"{""endpoint"":""/api/users"",""bytes"":1024}"
evt_002,api.request,acc_123,2024-01-20T10:01:00Z,"{""endpoint"":""/api/orders"",""bytes"":2048}"
evt_003,api.request,acc_456,2024-01-20T10:02:00Z,"{""endpoint"":""/api/products"",""bytes"":512}"
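The quoting in the properties column follows standard CSV rules (RFC 4180): the JSON value is wrapped in double quotes and every inner double quote is doubled. A hypothetical helper that produces rows in the layout above:

```javascript
// Build one CSV row matching the sample file above. The properties object is
// JSON-encoded, then inner double quotes are doubled per RFC 4180.
function csvRow(uniqueId, eventName, accountId, timestamp, properties) {
  const json = JSON.stringify(properties).replace(/"/g, '""');
  return [uniqueId, eventName, accountId, timestamp, `"${json}"`].join(',');
}
```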
Upload via Dashboard
1. Navigate to Usage Metering in the dashboard
2. Click Upload CSV
3. Select your CSV file
4. Map columns to fields
5. Click Upload
Upload via API
const result = await alguna.events.uploadCsv({
  file: csvFile,
  mapping: {
    uniqueId: 'unique_id',
    eventName: 'event_name',
    accountId: 'account_id',
    timestamp: 'timestamp',
    properties: 'properties',
  },
});

console.log('Processed:', result.processed);
console.log('Errors:', result.errors);
CSV Best Practices
Maximum file size: 100MB
Maximum rows per file: 1,000,000
Use ISO 8601 timestamps
JSON-encode properties column
Include header row
Send via Data Sources
Connect external data sources for automated ingestion.
BigQuery Integration
await alguna.dataSources.create({
  type: 'bigquery',
  name: 'Production Usage',
  config: {
    projectId: 'your-gcp-project',
    datasetId: 'analytics',
    tableId: 'usage_events',
  },
  mapping: {
    uniqueId: 'event_id',
    eventName: 'event_type',
    accountId: 'customer_id',
    timestamp: 'created_at',
    properties: {
      bytes: 'bytes_transferred',
      endpoint: 'api_endpoint',
    },
  },
  schedule: '0 * * * *', // Every hour
});
Supported Data Sources
| Source | Description |
|---|---|
| BigQuery | Google BigQuery tables |
| Snowflake | Snowflake data warehouse |
| PostgreSQL | Direct database connection |
| S3 | AWS S3 buckets (CSV/JSON) |
| Segment | Segment events |
| Webhooks | Incoming webhooks |
See Data Ingestion for detailed setup.
Customer Identification
Using Alguna Account ID
{
  accountId: 'acc_xyz789' // Alguna account ID
}
Using External ID (Alias)
Use your own customer identifiers:
// First, set up an alias on the customer
await alguna.accounts.update('acc_xyz789', {
  externalId: 'customer_123', // Your system's ID
});

// Then use it in events
{
  accountId: 'customer_123' // Automatically resolved
}
Using Multiple Aliases
await alguna.accounts.update('acc_xyz789', {
  aliases: [
    { type: 'stripe', value: 'cus_abc123' },
    { type: 'salesforce', value: '001ABC' },
    { type: 'internal', value: 'user_12345' },
  ],
});

// Use any alias
{ accountId: 'cus_abc123' } // Stripe ID
{ accountId: '001ABC' } // Salesforce ID
{ accountId: 'user_12345' } // Internal ID
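Conceptually, alias resolution is a lookup from any registered identifier back to a single Alguna account. The sketch below models that lookup locally for illustration only; the actual resolution happens server-side:

```javascript
// Model of alias resolution: the account's own ID and every alias value all
// map back to the same Alguna account. Illustrative, not an SDK API.
function buildAliasIndex(accountId, aliases) {
  const index = new Map([[accountId, accountId]]);
  for (const { value } of aliases) {
    index.set(value, accountId);
  }
  return index;
}
```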
Idempotency & Deduplication
Unique ID Requirements
The uniqueId field prevents duplicate processing:
// Good: unique per event
{ uniqueId: 'request_12345_2024-01-20T10:30:00Z' }

// Good: UUID
{ uniqueId: crypto.randomUUID() }

// Bad: same ID for different events (they will be deduplicated)
{ uniqueId: 'event' }
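A deterministic ID built from stable event facts gives idempotency for free: retrying the same logical event reuses the same ID, while distinct events never collide. An illustrative pattern matching the first "Good" example above (the helper name is hypothetical):

```javascript
// Derive a unique ID from facts that identify the event itself, so a retried
// send of the same event produces the same ID and is deduplicated.
function deterministicId(requestId, timestamp) {
  return `request_${requestId}_${timestamp}`;
}
```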
Deduplication Window
Events with the same uniqueId are deduplicated within a 7-day window.
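The window behaves like a keyed cache of recently seen IDs. This in-memory sketch mirrors the rule for local reasoning only; the real deduplication happens inside Alguna:

```javascript
// Client-side model of the 7-day dedup window. `seen` maps uniqueId to the
// last time (ms) that ID was ingested. Illustrative only.
const DEDUP_WINDOW_MS = 7 * 24 * 60 * 60 * 1000;

function isDuplicate(seen, uniqueId, timestampMs) {
  const prev = seen.get(uniqueId);
  if (prev !== undefined && timestampMs - prev < DEDUP_WINDOW_MS) {
    return true; // same ID inside the window: dropped as a duplicate
  }
  seen.set(uniqueId, timestampMs);
  return false;
}
```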
Upsert Mode
Update existing events by using the same uniqueId:
// Initial event
await alguna.events.ingest({
  uniqueId: 'job_12345',
  eventName: 'compute.job',
  accountId: 'acc_123',
  timestamp: '2024-01-20T10:00:00Z',
  properties: { duration_seconds: 300 },
});

// Update with the final duration
await alguna.events.ingest({
  uniqueId: 'job_12345', // Same ID
  eventName: 'compute.job',
  accountId: 'acc_123',
  timestamp: '2024-01-20T10:00:00Z',
  properties: { duration_seconds: 450 }, // Updated value
});
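The semantics are last-write-wins: re-ingesting an event with the same uniqueId replaces the stored values rather than adding a second event. A minimal in-memory model of that behavior, for illustration only:

```javascript
// Last-write-wins model of upsert mode: a Map keyed by uniqueId holds at most
// one event per ID, and re-ingesting replaces it. Illustrative, not the SDK.
function upsert(store, event) {
  store.set(event.uniqueId, event);
  return store;
}
```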
Timestamps
Requirements
Must be ISO 8601 format
Must be within the current or previous billing period
Future timestamps are rejected
Examples
// Valid formats
'2024-01-20T10:30:00Z' // UTC
'2024-01-20T10:30:00+05:30' // With timezone
'2024-01-20T10:30:00.123Z' // With milliseconds
// Invalid
'2024-01-20' // Missing time
'01/20/2024 10:30 AM' // Wrong format
'2025-01-20T10:30:00Z' // Future date
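The first and third rules (ISO 8601 date-time, no future dates) can be checked before sending; the billing-period rule depends on server-side state and is omitted here. The helper below is an illustrative sketch, not an SDK function:

```javascript
// Returns null when the timestamp passes the local checks, or a reason string.
function timestampError(ts, now = Date.now()) {
  // Require a full date-time, not a bare date like '2024-01-20'.
  if (!/^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}/.test(ts)) return 'not an ISO 8601 date-time';
  const ms = Date.parse(ts);
  if (Number.isNaN(ms)) return 'unparseable timestamp';
  if (ms > now) return 'future timestamps are rejected';
  return null;
}
```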
Backdating Events
Ingest historical events (within billing period):
await alguna.events.ingestBatch(
  historicalEvents.map(e => ({
    uniqueId: `historical_${e.id}`,
    eventName: 'api.request',
    accountId: e.customerId,
    timestamp: e.occurredAt, // Historical timestamp
    properties: e.data,
  }))
);
Properties
Properties carry event data used for filtering and aggregation.
Common Property Patterns
// API usage
{
  properties: {
    endpoint: '/api/v1/users',
    method: 'GET',
    status_code: 200,
    response_time_ms: 45,
    bytes_transferred: 2048,
  }
}

// Compute usage
{
  properties: {
    instance_type: 'gpu.large',
    duration_seconds: 3600,
    region: 'us-east-1',
  }
}

// Storage usage
{
  properties: {
    storage_class: 'standard',
    bytes_stored: 1073741824,
    region: 'eu-west-1',
  }
}

// AI/ML usage
{
  properties: {
    model: 'gpt-4',
    input_tokens: 1500,
    output_tokens: 500,
    latency_ms: 2300,
  }
}
Property Types
| Type | Example | Use Case |
|---|---|---|
| String | "us-east-1" | Filtering, grouping |
| Number | 1024 | Aggregation (sum, max) |
| Boolean | true | Filtering |
Nested Properties
{
  properties: {
    request: {
      method: 'POST',
      path: '/api/users',
    },
    response: {
      status: 200,
      bytes: 1024,
    },
  }
}
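One way to think about nested properties is as flat dot-path keys (request.method, response.status). The helper below performs that flattening; the dot-path convention here is an assumption for illustration, not documented Alguna behavior:

```javascript
// Flatten a nested properties object into dot-path keys, e.g.
// { response: { status: 200 } } becomes { 'response.status': 200 }.
function flattenProperties(obj, prefix = '') {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      Object.assign(out, flattenProperties(value, path));
    } else {
      out[path] = value;
    }
  }
  return out;
}
```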
Error Handling
Validation Errors
try {
  await alguna.events.ingest(event);
} catch (error) {
  if (error.code === 'VALIDATION_ERROR') {
    console.error('Invalid event:', error.details);
    // {
    //   field: 'timestamp',
    //   message: 'Invalid ISO 8601 format'
    // }
  }
}
Common Errors
| Error | Cause | Solution |
|---|---|---|
| INVALID_TIMESTAMP | Wrong format or future date | Use ISO 8601, current/past dates |
| ACCOUNT_NOT_FOUND | Unknown accountId | Create account or check alias |
| DUPLICATE_EVENT | Same uniqueId processed | Use unique IDs per event |
| INVALID_PROPERTIES | Non-JSON properties | Ensure valid JSON structure |
Retry Strategy
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function ingestWithRetry(events, maxRetries = 3) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await alguna.events.ingestBatch(events);
    } catch (error) {
      if (error.status >= 500 && attempt < maxRetries) {
        // Server error: retry with exponential backoff
        await sleep(Math.pow(2, attempt) * 1000);
        continue;
      }
      throw error;
    }
  }
}
High-Volume Ingestion
Batch Recommendations
| Volume | Batch Size | Frequency |
|---|---|---|
| < 1,000/min | 1-100 | Real-time |
| 1,000-10,000/min | 100-500 | Every 1-5 seconds |
| > 10,000/min | 500-1,000 | Continuous batching |
Async Ingestion
For very high volume, use async mode:
const result = await alguna.events.ingestAsync({
  events: largeEventBatch,
  callbackUrl: 'https://yourapp.com/webhooks/ingestion',
});

// Returns immediately
console.log('Job ID:', result.jobId);
console.log('Status:', result.status); // 'processing'
Rate Limits
| Tier | Requests/sec | Events/request |
|---|---|---|
| Standard | 100 | 1,000 |
| Enterprise | 500 | 10,000 |
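To stay under the requests-per-second limits, a client-side token bucket is a common throttle. This is an illustrative sketch, not part of the Alguna SDK; set the rate to your tier's limit:

```javascript
// Token bucket: refills at `ratePerSec` tokens per second up to capacity;
// each request removes one token, and requests without a token are throttled.
class TokenBucket {
  constructor(ratePerSec, nowMs = Date.now()) {
    this.capacity = ratePerSec;
    this.tokens = ratePerSec;
    this.lastRefill = nowMs;
  }

  // Returns true if a request may be sent right now.
  tryRemove(nowMs = Date.now()) {
    const elapsedSec = (nowMs - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.capacity);
    this.lastRefill = nowMs;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```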
Query Usage
Current Period Usage
const usage = await alguna.usage.get({
  accountId: 'acc_customer123',
  metricCode: 'api_calls',
});

console.log('Usage:', usage.value);
console.log('Period:', usage.periodStart, '-', usage.periodEnd);
Usage History
const history = await alguna.usage.history({
  accountId: 'acc_customer123',
  metricCode: 'api_calls',
  startDate: '2024-01-01',
  endDate: '2024-01-31',
  granularity: 'daily',
});
Usage by Property
const breakdown = await alguna.usage.breakdown({
  accountId: 'acc_customer123',
  metricCode: 'api_calls',
  groupBy: ['endpoint'],
});
Webhooks
| Event | Description |
|---|---|
| usage.recorded | Event processed (opt-in) |
| usage.threshold_exceeded | Usage exceeded threshold |
Threshold Alerts
await alguna.alerts.create({
  accountId: 'acc_customer123',
  metricCode: 'api_calls',
  threshold: 80, // 80% of plan limit
  action: 'webhook',
  webhookUrl: 'https://yourapp.com/alerts',
});
Best Practices
Batch Events: Send events in batches for better performance.
Unique IDs: Generate truly unique IDs to prevent duplicates.
Accurate Timestamps: Use the actual event time, not ingestion time.
Rich Properties: Include properties needed for filtering and aggregation.
Troubleshooting
Events Not Appearing
Check event timestamp is within billing period
Verify accountId exists or alias is configured
Confirm eventName matches your metric
Review API response for errors
Usage Shows Zero
Verify events are being sent successfully
Check metric configuration and filters
Confirm event properties match metric requirements
Duplicate Charges
Ensure uniqueId is truly unique per event
Check for duplicate event sends in your code
Review event logs for repeated ingestion
Next Steps
Define Metrics: Configure billable metrics.
Metrics Reference: Complete metrics configuration guide.