Getting Started

Updated August 19, 2024

Quick Start

Welcome to Telemetry! Telemetry is a logging and analytics platform designed to help you capture, store, and query your data effortlessly. Whether you're tracking application metrics, monitoring user behavior, or analyzing business performance, Telemetry provides a seamless experience with consistent APIs across multiple programming languages.

If you're interested in learning more about how Telemetry works under the hood, check out our Telemetry Architecture Doc for a detailed overview. For a more in-depth use case, such as tracking OpenAI costs by instrumenting API calls, refer to our guide on Tracking OpenAI Costs.

Install Telemetry

This tutorial uses JavaScript. For other languages, please refer to the individual SDKs. The overall API structure is consistent across different programming languages.

npm install telemetry-sh

Log some data

Telemetry automatically creates tables when data is logged. In the following example, we log some Uber ride data to a table called uber_rides. Telemetry will automatically create this table and its corresponding schema with columns: city, price, and timestamp.

import telemetry from "telemetry-sh";

telemetry.init("YOUR_API_KEY");

telemetry.log("uber_rides", {
  city: "paris",
  price: 42,
  timestamp: new Date().toISOString()
});

Explore data with our UI

You can build visualizations and run queries in our interactive UI at https://telemetry.sh.

The telemetry.sh dashboard UI

Explore View

The Explore tab is the fastest way to go from raw events to charts and tables.

Explore is scoped per table. To open it:

  1. Go to your team.
  2. Open Tables.
  3. Select a table.
  4. Open the Explore tab on that table page.

Route shape:

/team/{team}/table/{table}?tab=explore

Once Explore is open:

  1. Pick a graph type (Samples, Table, Line, Bar, or Stacked Area).
  2. Set a time range and add one or more filters.
  3. Choose columns to include (including nested fields like data.toolName).
  4. Click Run to execute and render results.

Tip: the controls hold draft state; results update only when you click Run.

Hypothetical example dataset (fictional)

For the Explore examples below, assume you logged synthetic test events (no real user data) with a structure like this:

{
  "timestamp_utc": "2026-02-27T12:00:00.000Z",
  "event": "tool_call",
  "status": "success",
  "data": {
    "toolName": "smart_avantis_buy",
    "args": {
      "symbol": "BTC-USD",
      "amountUsd": 2500
    }
  }
}

You can log this to a table such as agent_demo_events and use that table in Explore.
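As a sketch, you could generate one of these synthetic events in code. `buildToolCallEvent` is a hypothetical helper, not part of the telemetry-sh SDK, and the `telemetry.log` call is left commented so the snippet stands alone:

```javascript
// Hypothetical helper that builds a synthetic tool_call event matching
// the fictional schema above (not part of the telemetry-sh SDK).
function buildToolCallEvent(toolName, symbol, amountUsd) {
  return {
    timestamp_utc: new Date().toISOString(),
    event: "tool_call",
    status: "success",
    data: {
      toolName,
      args: { symbol, amountUsd },
    },
  };
}

const event = buildToolCallEvent("smart_avantis_buy", "BTC-USD", 2500);

// With an initialized SDK, you would then log it:
// telemetry.log("agent_demo_events", event);
```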

Example 1: Filter tool calls and inspect raw rows

Using the fictional schema above, try this in Explore:

  1. Set graph type to Samples.
  2. Add filter: event = tool_call.
  3. Add filter: data.toolName = smart_avantis_buy.
  4. Click Run.

URL template (replace placeholders):

/team/{team}/table/agent_demo_events?tab=explore&graphType=samples&f=event:=:tool_call&f=data.toolName:=:smart_avantis_buy
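If you need to build these Explore links programmatically (for runbooks or internal tooling), here is a minimal sketch. The query-parameter names (`tab`, `graphType`, `f`) are assumed from the URL template above:

```javascript
// Build an Explore URL from a team, table, and equality filters.
// The parameter names are taken from the URL template above; adjust
// if your workspace uses different ones.
function exploreUrl(team, table, filters) {
  const base = `/team/${team}/table/${table}?tab=explore&graphType=samples`;
  const f = filters.map(([column, value]) => `f=${column}:=:${value}`).join("&");
  return f ? `${base}&${f}` : base;
}

const url = exploreUrl("my-team", "agent_demo_events", [
  ["event", "tool_call"],
  ["data.toolName", "smart_avantis_buy"],
]);
```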

Example 2: Compare chart view vs table view

Use the same filters, then:

  1. Select Line and click Run.
  2. Switch to Table and click Run again.

This gives you a quick way to validate both trends and exact values.

Example 3: Nested columns in All Columns mode

When you keep All columns selected in Explore, generated SQL explicitly enumerates columns so nested fields are included.

Example generated SQL shape:

SELECT "timestamp_utc", "event", "data.toolName", "data.args.symbol"
FROM "your_table"
WHERE timestamp_utc >= now() - INTERVAL '7 days'
ORDER BY timestamp_utc DESC
LIMIT 200

Dashboards

Dashboards let you pin Explore charts/tables and query results into one shared view.

To create and use dashboards:

  1. Open any table in Explore or open a saved query result.
  2. Click Add to Dashboard.
  3. Choose an existing dashboard, or create a new dashboard first.
  4. Give the widget a title and confirm.

Dashboard route shape:

/team/{team}/dashboard/{dashboardSlug}

Use dashboards to keep key signals in one place for your team (latency, error rates, cost, and conversion metrics).

Create alerts

Once you have a chart or query result, you can turn it into an alert in the Telemetry UI.

  1. Open a table in Explore or run a query in Results/Chart view.
  2. Click Create Alert.
  3. Configure the condition (aggregation, last N points, comparison, and threshold).
  4. Set a check interval and one or more email recipients.
  5. Click Create Alert to save. You will be redirected to /team/{team}/alert/{alertSlug} to monitor status and history.

Tip: leave "Ignore the last data point" enabled for bucketed time series, since the newest bucket is often incomplete.

For a full walkthrough, see the Alerts guide.

Interactive inline examples

Example 1: Latency spike detector (p95)

Try this in Explore on a table with latency_ms:

  1. Set graph type to Line.
  2. Set aggregation to p95 with metric latency_ms.
  3. Click Run, then click Create Alert.
  4. Condition: p95 of last 5 data points is Greater than 850.
  5. Interval: Every minute, then add recipient emails.

URL template (replace placeholders):

/team/{team}/table/{table}?tab=explore&graphType=line&agg=p95&metric=latency_ms&time=7d
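To make the threshold concrete, here is what a p95 aggregation computes over a set of latency samples. This is a nearest-rank sketch for illustration; Telemetry's own percentile implementation may interpolate differently:

```javascript
// Nearest-rank p95: sort ascending and take the value at the 95th
// percentile rank. Illustrates the alert condition above; the
// platform's percentile method may differ in detail.
function p95(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const rank = Math.ceil(0.95 * sorted.length);
  return sorted[rank - 1];
}

// 100 synthetic latency samples: 1ms..100ms.
const samples = Array.from({ length: 100 }, (_, i) => i + 1);
const latencyP95 = p95(samples); // 95 with nearest-rank
```

An alert configured as above would fire only once this value stays above 850 across the last 5 data points.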

Example 2: Error burst detector (count)

Detect spikes in failed requests:

  1. In Explore, filter rows with status >= 500.
  2. Use a line chart with aggregation count.
  3. Click Create Alert.
  4. Condition: Average of last 3 data points is Greater than 20.

This catches short error bursts while avoiding noise from a single bad bucket.
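The "average of last N points" condition can be sketched as plain JavaScript. `errorBurst` is a hypothetical helper mirroring the alert logic, not SDK code:

```javascript
// Return true when the average of the last n bucket counts exceeds the
// threshold — the same shape as the alert condition above.
function errorBurst(counts, n, threshold) {
  const window = counts.slice(-n);
  const avg = window.reduce((sum, c) => sum + c, 0) / window.length;
  return avg > threshold;
}

const calm = errorBurst([2, 3, 1, 4], 3, 20);     // false: avg ≈ 2.7
const burst = errorBurst([2, 35, 28, 41], 3, 20); // true: avg ≈ 34.7
```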

Example 3: Quiet traffic detector (drop below threshold)

Use this for cron jobs, queues, or ingest pipelines:

  1. Query or chart event volume over time.
  2. Create an alert with comparison Less than.
  3. Condition: Sum of last 10 data points is Less than 50.

If this triggers, it usually means data stopped flowing before customers notice.
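Likewise, the quiet-traffic condition is just a sum over the last N points compared against a floor. A hypothetical sketch:

```javascript
// True when the sum of the last n bucket counts drops below the floor,
// matching the "Sum of last 10 data points is Less than 50" condition.
function trafficQuiet(counts, n, floor) {
  const total = counts.slice(-n).reduce((sum, c) => sum + c, 0);
  return total < floor;
}

const healthy = trafficQuiet([12, 9, 11, 14, 10, 13, 8, 12, 9, 11], 10, 50); // false: sum 109
const quiet = trafficQuiet([12, 9, 2, 1, 0, 0, 0, 1, 0, 0], 10, 50);         // true: sum 25
```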

Integrate with your codebase

You can integrate Telemetry into your product or internal dashboards by querying your data with SQL through the query API.

const results = await telemetry.query(`
  SELECT
    city,
    AVG(price)
  FROM
    uber_rides
  GROUP BY
    city
`);
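The exact result shape depends on your query. Assuming rows come back as objects keyed by column name (here with a hypothetical `avg_price` alias), post-processing is plain JavaScript:

```javascript
// Hypothetical result rows, as if the query aliased AVG(price) to
// avg_price; the real column name depends on your SQL.
const rows = [
  { city: "paris", avg_price: 42 },
  { city: "berlin", avg_price: 37 },
];

// Index average price by city for quick lookups in app code.
const avgByCity = Object.fromEntries(rows.map((r) => [r.city, r.avg_price]));
```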