Data Teams

For analysts and data engineers building pipelines, writing queries, and creating reusable analyses.

Try These First

Open the agent and ask:

  • "Show me a funnel from signup to first feature used"
  • "Which queries are taking longest to run in our events table?"
  • "Find null rates across all columns in the accounts table"
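
If you want to see the kind of query behind the first prompt, a signup-to-first-feature funnel can be sketched directly in SQL. This is a minimal sketch that assumes events has user_id, event_name, and timestamp columns and that signups and feature usage are logged as 'signup' and 'feature_used' events; adjust the names to match your schema.

-- Signup -> first-feature funnel (column and event names are assumptions).
with signups as (
  select user_id, min(timestamp) as signed_up_at
  from events
  where event_name = 'signup'
  group by user_id
),
first_feature as (
  select e.user_id, min(e.timestamp) as first_feature_at
  from events e
  join signups s on s.user_id = e.user_id
  where e.event_name = 'feature_used'
    and e.timestamp >= s.signed_up_at
  group by e.user_id
)
select
  count(*)                                       as signed_up,
  count(f.user_id)                               as used_a_feature,
  round(100.0 * count(f.user_id) / count(*), 1)  as conversion_pct
from signups s
left join first_feature f on f.user_id = s.user_id;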

Key Tables

Table           What's in it
events          Product usage events with timestamps
users           User profiles linked to accounts
accounts        Customer accounts with MRR/ARR
feature_usage   Aggregated feature adoption
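
These tables join along a common path. A minimal sketch of an account-level rollup, assuming users carries an account_id and events carries a user_id (verify the actual key names in your schema):

-- Hypothetical join path: events -> users -> accounts.
-- Key column names are assumptions; check your schema.
select
  a.account_id,
  count(distinct e.user_id) as active_users,
  count(*)                  as event_count
from events e
join users u    on u.user_id = e.user_id
join accounts a on a.account_id = u.account_id
group by a.account_id
order by event_count desc;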

Common Workflows

Build a DAG pipeline

  1. Start with a base query (e.g., raw events)
  2. Add derived cells that transform upstream results (see the sketch after this list)
  3. Use the canvas to visualize and edit dependencies
  4. Run downstream to refresh the entire pipeline
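
Steps 1 and 2 might look like the following pair of SQL cells. The name base_events is made up for illustration; use whatever mechanism your tool provides for referencing an upstream cell's result.

-- Cell 1 (base query): raw events for the last 30 days.
select user_id, event_name, timestamp
from events
where timestamp >= current_date - interval '30 days';

-- Cell 2 (derived): aggregates the upstream result, referenced here as base_events.
select date_trunc('day', timestamp) as day,
       count(distinct user_id)      as daily_active_users
from base_events
group by 1
order by 1;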

Data quality checks

Ask the agent:

  • "Find duplicate user IDs in the events table"
  • "Show me null rates for each column in accounts"
  • "Compare row counts between users and feature_usage"

Create reusable metrics

Once you've validated a calculation, save it to the semantic layer:

measures:
  - name: daily_active_users
    type: count_distinct
    sql: user_id
    filters:
      - sql: "timestamp >= current_date - interval '1 day'"
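
For reference, assuming the measure is defined against the events table, daily_active_users evaluates to roughly this SQL:

-- Approximate expansion of the daily_active_users measure.
select count(distinct user_id) as daily_active_users
from events
where timestamp >= current_date - interval '1 day';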

Next Steps