Eliq API Guidelines

Eliq enforces API rules to ensure stability, with action taken only if critical issues arise.

1. Purpose

To ensure the stability and optimal performance of the Eliq Data Management and Insights APIs, we ask clients to follow the usage rules outlined in this document. These rules are designed to maintain a consistent, reliable, and secure experience across our platform.

Eliq monitors API usage to proactively detect potential issues. If abnormal usage is observed, we will initiate a review with your team to identify the root cause and agree on corrective actions. Clients are expected to engage within 5 business days of initial contact and to implement agreed measures within 10 business days, unless otherwise mutually agreed.

If the issue remains unresolved or recurs without effective mitigation, Eliq may apply additional charges to offset the operational impact. Temporary access restrictions or termination of the service agreement would only be considered in critical cases, specifically where continued misuse presents a significant risk to the platform—and always as a last resort, after all collaborative options have been exhausted.

2. Scope

These rules apply to all consumers of the APIs, including Eliq customers, partners, internal teams, and third-party integrators.

3. Rule Types

  • Requirements: Must be followed. Breach may lead to operational action or contractual consequences.
  • Recommendations: Should be followed. These improve performance and resilience but are not enforced unless misuse causes broader impact.

4. API Usage Requirements

  • Batch endpoints (e.g., /jobs) must be used instead of multiple single requests when loading data via the Data Management API in production.
  • Documented format requirements (e.g., ndjson, JSON) must be adhered to.
  • For upsert/delete/export jobs, file naming and file size limits must be respected; unintentional filename reuse should be avoided.
  • HTTP status codes and error response bodies must be inspected. Requests must not be retried blindly upon error; errors must be handled according to their type (a sketch follows this list):
    • 4xx (Client error): the request must be reviewed and corrected before any retry.
    • 5xx (Server error): retries must use exponential backoff.
  • Repeated retries of failing or malformed requests must be avoided.
  • Duplicate resources must not be created (e.g., by re-posting the same job file under a different name to force reprocessing).
  • User consents must only be updated following an explicit user action.
  • Client-side timeouts (suggested: 30 seconds for standard API calls) must be set.
  • Compromised tokens must be revoked immediately.
  • Secrets (API keys, Client Secrets or tokens) must not be exposed in public code or repositories.
  • TLS (HTTPS) must always be used for API calls.
  • Only the data required for the specific service, in line with privacy laws, must be accessed.
  • Load or penetration testing must not be performed without prior written permission.
  • Data must not be reverse-engineered, scraped, or replicated at scale beyond the scope of intended usage.
  • Agreed-upon product limitations of the APIs must not be circumvented (e.g., unauthorized data exports).
  • Deprecated endpoints and fields must be migrated away from within 180 days of notice (Note: any endpoint may be changed, without notice, if required to address a critical security issue).
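
The error-handling, retry, timeout, and TLS requirements above can be combined in a single request wrapper. The sketch below assumes a Python client using the requests library; the base URL, endpoint paths, and bearer-token header are illustrative assumptions, not confirmed details of the Eliq API.

    import time
    import requests

    BASE_URL = "https://api.example-eliq-host.com"  # illustrative host, not the real API URL
    TIMEOUT = 30       # client-side timeout in seconds, per the requirement above
    MAX_RETRIES = 5

    def call_api(path, token, params=None):
        """Call an endpoint over HTTPS, retrying only on 5xx with exponential backoff."""
        for attempt in range(MAX_RETRIES):
            response = requests.get(
                f"{BASE_URL}{path}",               # TLS (HTTPS) is always used
                headers={"Authorization": f"Bearer {token}"},
                params=params,
                timeout=TIMEOUT,                   # never wait indefinitely on a stalled call
            )
            if response.status_code < 400:
                return response.json()
            if response.status_code < 500:
                # 4xx: client error; inspect the body and fix the request, do not retry blindly
                raise ValueError(f"Client error {response.status_code}: {response.text}")
            # 5xx: server error; back off exponentially before the next attempt
            time.sleep(2 ** attempt)
        raise RuntimeError(f"Gave up on {path} after {MAX_RETRIES} attempts")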

5. API Usage Recommendations

  • Only /health or /heartbeat endpoints should be used for availability checks, not functional endpoints like GET /user or GET /location.
  • Polling should only be performed when necessary, and interval guidelines should be followed (see the sketch after this list):
    • Status endpoints should not be polled more than once every 5 seconds.
    • More than 1000 requests per minute should not be sent to any single endpoint, unless explicitly agreed otherwise in writing.
    • Insights API resources (e.g., user/location) should not be polled more than once every 60 seconds, unless recently updated.
  • Datapoints already loaded in the Eliq data platform should not be re-ingested unless they were erroneous.
  • Static or infrequently changing data (e.g., user/location metadata) should be cached for at least 24 hours.
  • Malformed, empty, or invalid files should not be submitted to batch jobs.
  • The request ID (X-Request-Id in the response headers or transaction_id in the response body) should be logged and stored so it can be quoted when a response is referenced in a support request.
  • Unnecessary jobs should be cancelled via /jobs/{jobId}/cancel rather than allowing unnecessary processing.
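
The polling, caching, and request-ID recommendations above can be illustrated with a small sketch, again assuming a Python client using the requests library; the job status endpoint shape (GET /jobs/{jobId}) and the terminal status values used here are assumptions for illustration, not documented behaviour.

    import time
    import requests

    BASE_URL = "https://api.example-eliq-host.com"   # illustrative host
    POLL_INTERVAL = 5            # seconds; do not poll job status more often than this
    CACHE_TTL = 24 * 60 * 60     # cache static metadata for at least 24 hours

    _metadata_cache = {}         # simple in-process cache: path -> (fetched_at, payload)

    def get_metadata(path, token):
        """Return cached metadata while fresh; otherwise fetch, log the request ID, and cache."""
        cached = _metadata_cache.get(path)
        if cached and time.time() - cached[0] < CACHE_TTL:
            return cached[1]
        response = requests.get(f"{BASE_URL}{path}",
                                headers={"Authorization": f"Bearer {token}"}, timeout=30)
        response.raise_for_status()
        # Store the request ID so it can be quoted in a support request
        print("X-Request-Id:", response.headers.get("X-Request-Id"))
        _metadata_cache[path] = (time.time(), response.json())
        return _metadata_cache[path][1]

    def wait_for_job(job_id, token):
        """Poll a job's status no more than once every 5 seconds until it reaches a terminal state."""
        while True:
            response = requests.get(f"{BASE_URL}/jobs/{job_id}",
                                    headers={"Authorization": f"Bearer {token}"}, timeout=30)
            response.raise_for_status()
            status = response.json().get("status")
            if status in ("completed", "failed"):   # assumed terminal states
                return status
            time.sleep(POLL_INTERVAL)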

6. Enforcement and Remediation

Violations are handled as follows:

  • Engagement expected within 5 business days of Eliq's contact.
  • Corrective action expected within 10 business days, unless otherwise agreed.
  • Repeated or unaddressed violations may result in:
    • Operational surcharges
    • Temporary access restrictions
    • Termination of the service agreement (only in critical, high-risk scenarios)

Updated on: May 8, 2025