SQL for Product Managers: 10 Queries You Need to Know (2026)
Stop waiting for the data team. These 10 SQL queries let you answer your own product questions about engagement, retention, revenue, churn, and more.
Why Product Managers Need SQL
Every product decision comes down to data. How many users are active today? Which features drive retention? Where do people drop off in the signup funnel? These are questions product managers ask daily, and most of the time, the answers sit in a database waiting to be queried.
The problem is the bottleneck. You write a Jira ticket, the data team picks it up next sprint, and by the time you get the CSV, the context has shifted. The feature you were evaluating already shipped a v2. The A/B test window closed. The exec meeting happened without the numbers.
SQL removes that bottleneck. With read-only access to your analytics database and a handful of queries, you can answer 80% of your product questions in minutes instead of days. You do not need to become a database engineer. You need to know 10 patterns well enough to modify them for your specific tables.
This guide gives you those 10 patterns. Each one includes the plain English question it answers, a working SQL query, and a line-by-line explanation so you understand what every part does. Copy them, adapt the table and column names to your schema, and start pulling your own data.
1. DAU/MAU (Daily and Monthly Active Users)
The question: How many unique users are active each day, and what is our DAU/MAU ratio?
DAU/MAU is the single most-watched engagement metric in SaaS. It tells you what percentage of your monthly user base shows up on any given day. A ratio above 20% is considered healthy for most B2B products. Below 10% signals a habit problem.
-- Daily active users for the last 30 days
SELECT
DATE(event_timestamp) AS day,
COUNT(DISTINCT user_id) AS dau
FROM events
WHERE event_timestamp >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY DATE(event_timestamp)
ORDER BY day;
-- DAU/MAU ratio
WITH daily AS (
SELECT
DATE(event_timestamp) AS day,
COUNT(DISTINCT user_id) AS dau
FROM events
WHERE event_timestamp >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY DATE(event_timestamp)
),
monthly AS (
SELECT COUNT(DISTINCT user_id) AS mau
FROM events
WHERE event_timestamp >= CURRENT_DATE - INTERVAL '30 days'
)
SELECT
d.day,
d.dau,
m.mau,
ROUND(100.0 * d.dau / m.mau, 1) AS dau_mau_ratio
FROM daily d
CROSS JOIN monthly m
ORDER BY d.day;
COUNT(DISTINCT user_id) ensures each user is counted once per day, regardless of how many events they triggered. The CTE named monthly calculates the total unique users over 30 days, and CROSS JOIN attaches that single number to every daily row so you can divide.
2. Feature Adoption Rate
The question: What percentage of active users tried a specific feature this week?
When you launch a new feature, you need to know if people actually find and use it. Feature adoption rate compares the number of users who triggered a specific event against the total active user base for the same period.
WITH active_users AS (
SELECT COUNT(DISTINCT user_id) AS total_active
FROM events
WHERE event_timestamp >= CURRENT_DATE - INTERVAL '7 days'
),
feature_users AS (
SELECT COUNT(DISTINCT user_id) AS feature_active
FROM events
WHERE event_name = 'export_csv_clicked'
AND event_timestamp >= CURRENT_DATE - INTERVAL '7 days'
)
SELECT
au.total_active,
fu.feature_active,
ROUND(100.0 * fu.feature_active / au.total_active, 1) AS adoption_pct
FROM active_users au
CROSS JOIN feature_users fu;
Replace 'export_csv_clicked' with whatever event your feature logs. If adoption is below 5% after two weeks, the feature is either hard to discover or not solving a real problem. Both are actionable insights.
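To watch adoption as a trend rather than a single snapshot, you can group the same counts by week. This is a sketch using the same events table and example event name as above; swap in your own event:

```sql
-- Weekly feature adoption trend over the last 8 weeks
WITH weekly AS (
  SELECT
    DATE_TRUNC('week', event_timestamp) AS week,
    COUNT(DISTINCT user_id) AS total_active,
    COUNT(DISTINCT CASE WHEN event_name = 'export_csv_clicked'
          THEN user_id END) AS feature_active
  FROM events
  WHERE event_timestamp >= CURRENT_DATE - INTERVAL '8 weeks'
  GROUP BY DATE_TRUNC('week', event_timestamp)
)
SELECT
  week,
  total_active,
  feature_active,
  ROUND(100.0 * feature_active / NULLIF(total_active, 0), 1) AS adoption_pct
FROM weekly
ORDER BY week;
```

A flat or declining line after launch week usually means discovery is the problem, not the feature itself.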
3. User Retention by Cohort
The question: Of the users who signed up in a given week, how many came back in week 1, week 2, week 3?
Retention is the single best predictor of product-market fit. This query groups users into weekly signup cohorts and counts how many returned in each subsequent week.
WITH cohorts AS (
SELECT
user_id,
DATE_TRUNC('week', created_at) AS cohort_week
FROM users
),
activity AS (
SELECT
user_id,
DATE_TRUNC('week', event_timestamp) AS activity_week
FROM events
)
SELECT
c.cohort_week,
COUNT(DISTINCT c.user_id) AS cohort_size,
COUNT(DISTINCT CASE
WHEN a.activity_week = c.cohort_week + INTERVAL '1 week'
THEN a.user_id END) AS week_1,
COUNT(DISTINCT CASE
WHEN a.activity_week = c.cohort_week + INTERVAL '2 weeks'
THEN a.user_id END) AS week_2,
COUNT(DISTINCT CASE
WHEN a.activity_week = c.cohort_week + INTERVAL '3 weeks'
THEN a.user_id END) AS week_3,
COUNT(DISTINCT CASE
WHEN a.activity_week = c.cohort_week + INTERVAL '4 weeks'
THEN a.user_id END) AS week_4
FROM cohorts c
LEFT JOIN activity a ON c.user_id = a.user_id
GROUP BY c.cohort_week
ORDER BY c.cohort_week;
Each CASE WHEN checks if a user from the cohort was active in a specific week offset. LEFT JOIN ensures users with zero activity still appear in the cohort count. To get percentages, divide each week column by cohort_size.
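The division step can be done in SQL as well. A sketch of the percentage version, assuming you have saved the cohort query above as a view named weekly_retention (a hypothetical name; you can also paste the full query in as a CTE):

```sql
-- Retention percentages: each week column divided by cohort size
WITH retention AS (
  SELECT cohort_week, cohort_size, week_1, week_2, week_3, week_4
  FROM weekly_retention  -- hypothetical saved view of the cohort query above
)
SELECT
  cohort_week,
  cohort_size,
  ROUND(100.0 * week_1 / NULLIF(cohort_size, 0), 1) AS week_1_pct,
  ROUND(100.0 * week_2 / NULLIF(cohort_size, 0), 1) AS week_2_pct,
  ROUND(100.0 * week_3 / NULLIF(cohort_size, 0), 1) AS week_3_pct,
  ROUND(100.0 * week_4 / NULLIF(cohort_size, 0), 1) AS week_4_pct
FROM retention
ORDER BY cohort_week;
```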
4. Conversion Funnel Analysis
The question: How many users make it through each step of the signup-to-payment funnel?
Funnel analysis shows you exactly where users drop off. This query counts unique users at each funnel step and calculates the conversion rate between steps.
WITH funnel AS (
SELECT
COUNT(DISTINCT CASE WHEN event_name = 'signup_page_viewed'
THEN user_id END) AS step_1_viewed,
COUNT(DISTINCT CASE WHEN event_name = 'signup_completed'
THEN user_id END) AS step_2_signed_up,
COUNT(DISTINCT CASE WHEN event_name = 'onboarding_completed'
THEN user_id END) AS step_3_onboarded,
COUNT(DISTINCT CASE WHEN event_name = 'trial_started'
THEN user_id END) AS step_4_trial,
COUNT(DISTINCT CASE WHEN event_name = 'payment_completed'
THEN user_id END) AS step_5_paid
FROM events
WHERE event_timestamp >= CURRENT_DATE - INTERVAL '30 days'
)
SELECT
step_1_viewed,
step_2_signed_up,
ROUND(100.0 * step_2_signed_up / NULLIF(step_1_viewed, 0), 1) AS view_to_signup_pct,
step_3_onboarded,
ROUND(100.0 * step_3_onboarded / NULLIF(step_2_signed_up, 0), 1) AS signup_to_onboard_pct,
step_4_trial,
ROUND(100.0 * step_4_trial / NULLIF(step_3_onboarded, 0), 1) AS onboard_to_trial_pct,
step_5_paid,
ROUND(100.0 * step_5_paid / NULLIF(step_4_trial, 0), 1) AS trial_to_paid_pct
FROM funnel;
NULLIF(..., 0) prevents division-by-zero errors. Each step counts distinct users who triggered that event within the time window. The result is a single-row summary showing absolute numbers and percentages for every transition in your funnel.
5. Revenue per User (ARPU)
The question: What is our average revenue per user, and how does it trend month over month?
ARPU helps you understand whether your monetization is improving even if total users stay flat. A rising ARPU with stable user count means your pricing or upsell strategy is working.
SELECT
DATE_TRUNC('month', p.payment_date) AS month,
COUNT(DISTINCT u.id) AS paying_users,
SUM(p.amount) AS total_revenue,
ROUND(SUM(p.amount) / COUNT(DISTINCT u.id), 2) AS arpu
FROM payments p
JOIN users u ON p.user_id = u.id
WHERE p.payment_date >= CURRENT_DATE - INTERVAL '6 months'
AND p.status = 'succeeded'
GROUP BY DATE_TRUNC('month', p.payment_date)
ORDER BY month;
This query joins payments to users, groups by month, and divides total revenue by the number of unique paying users. Filter by status = 'succeeded' to exclude refunds and failed charges. Compare ARPU across months to spot pricing trends.
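To make the month-over-month comparison explicit instead of eyeballing it, a LAG window function can compute the change for you. A sketch built on the same payments table:

```sql
-- ARPU with month-over-month change
WITH monthly_arpu AS (
  SELECT
    DATE_TRUNC('month', payment_date) AS month,
    ROUND(SUM(amount) / COUNT(DISTINCT user_id), 2) AS arpu
  FROM payments
  WHERE payment_date >= CURRENT_DATE - INTERVAL '6 months'
    AND status = 'succeeded'
  GROUP BY DATE_TRUNC('month', payment_date)
)
SELECT
  month,
  arpu,
  arpu - LAG(arpu) OVER (ORDER BY month) AS arpu_change
FROM monthly_arpu
ORDER BY month;
```

The first row's change is NULL because there is no prior month to compare against.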
6. Churn Analysis
The question: How many users churned this month, and what is our monthly churn rate?
Churn is the inverse of retention. This query identifies users who were active last month but not this month, giving you both the churn count and rate.
WITH last_month_users AS (
SELECT DISTINCT user_id
FROM events
WHERE event_timestamp >= DATE_TRUNC('month', CURRENT_DATE) - INTERVAL '1 month'
AND event_timestamp < DATE_TRUNC('month', CURRENT_DATE)
),
this_month_users AS (
SELECT DISTINCT user_id
FROM events
WHERE event_timestamp >= DATE_TRUNC('month', CURRENT_DATE)
AND event_timestamp < DATE_TRUNC('month', CURRENT_DATE) + INTERVAL '1 month'
),
churned_users AS (
SELECT user_id
FROM last_month_users
WHERE user_id NOT IN (SELECT user_id FROM this_month_users)
)
SELECT
(SELECT COUNT(*) FROM last_month_users) AS last_month_active,
(SELECT COUNT(*) FROM churned_users) AS churned,
ROUND(100.0 * (SELECT COUNT(*) FROM churned_users)
/ NULLIF((SELECT COUNT(*) FROM last_month_users), 0),
1) AS churn_rate_pct;
The query defines two sets: users active last month and users active this month. Users in the first set but not the second are churned. For subscription businesses, you can replace event-based activity with subscription status checks on the subscriptions table.
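For the subscription version, a sketch of the status-based check. This assumes a subscriptions table with status and canceled_at columns, which may be named differently in your schema, and it uses current active count as a rough denominator:

```sql
-- Subscription churn: cancellations this month vs. active subscriptions
-- (assumed columns: status, canceled_at -- adjust to your schema)
SELECT
  COUNT(*) FILTER (WHERE status = 'active') AS active_subs,
  COUNT(*) FILTER (
    WHERE status = 'canceled'
      AND canceled_at >= DATE_TRUNC('month', CURRENT_DATE)
  ) AS churned_this_month,
  ROUND(100.0 *
    COUNT(*) FILTER (
      WHERE status = 'canceled'
        AND canceled_at >= DATE_TRUNC('month', CURRENT_DATE))
    / NULLIF(COUNT(*) FILTER (WHERE status = 'active'), 0),
  1) AS churn_rate_pct
FROM subscriptions;
```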
7. A/B Test Results
The question: Did variant B outperform variant A in conversion rate, and is the difference statistically meaningful?
This query pulls the core numbers you need to evaluate an A/B test: sample size, conversions, and conversion rate per variant.
SELECT
e.variant,
COUNT(DISTINCT e.user_id) AS users,
COUNT(DISTINCT CASE WHEN c.user_id IS NOT NULL
THEN e.user_id END) AS conversions,
ROUND(100.0 *
COUNT(DISTINCT CASE WHEN c.user_id IS NOT NULL
THEN e.user_id END)
/ COUNT(DISTINCT e.user_id),
2) AS conversion_rate_pct
FROM experiment_assignments e
LEFT JOIN conversions c
ON e.user_id = c.user_id
AND c.event_name = 'purchase_completed'
AND c.event_timestamp >= e.assigned_at
WHERE e.experiment_name = 'pricing_page_v2'
GROUP BY e.variant
ORDER BY e.variant;
The LEFT JOIN to conversions ensures users who never converted are still counted in the denominator. The condition c.event_timestamp >= e.assigned_at makes sure you only count conversions that happened after the user entered the experiment. Feed the output numbers into a significance calculator to determine if the difference is real.
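If you want a quick sanity check before opening a calculator, a two-proportion z-score can be computed directly in SQL. The numbers below are placeholders; substitute the users and conversions from the query above. As a rule of thumb, an absolute z-score above roughly 1.96 corresponds to 95% confidence:

```sql
-- Two-proportion z-test sketch (placeholder numbers; use your own results)
WITH results AS (
  SELECT 4800.0 AS n_a, 312.0 AS conv_a,   -- variant A: users, conversions
         4750.0 AS n_b, 365.0 AS conv_b    -- variant B: users, conversions
),
calc AS (
  SELECT
    conv_a / n_a AS p_a,
    conv_b / n_b AS p_b,
    (conv_a + conv_b) / (n_a + n_b) AS p_pooled,
    n_a, n_b
  FROM results
)
SELECT
  ROUND(p_b - p_a, 4) AS lift,
  ROUND((p_b - p_a)
    / SQRT(p_pooled * (1 - p_pooled) * (1 / n_a + 1 / n_b)), 2) AS z_score
FROM calc;
```

This is a sketch, not a full experimentation pipeline; for anything high-stakes, still run the numbers through a proper significance calculator.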
8. User Segmentation
The question: How do power users, regular users, and dormant users compare in terms of count and revenue?
Segmenting your user base by activity level reveals who your most valuable users are and how large each segment is. This helps prioritize features for the right audience.
WITH user_activity AS (
SELECT
u.id AS user_id,
u.plan,
COUNT(e.id) AS event_count,
MAX(e.event_timestamp) AS last_active
FROM users u
LEFT JOIN events e ON u.id = e.user_id
AND e.event_timestamp >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY u.id, u.plan
),
segments AS (
SELECT
user_id,
plan,
event_count,
last_active,
CASE
WHEN event_count >= 50 THEN 'power'
WHEN event_count >= 10 THEN 'regular'
WHEN event_count >= 1 THEN 'light'
ELSE 'dormant'
END AS segment
FROM user_activity
)
SELECT
segment,
COUNT(*) AS user_count,
ROUND(AVG(event_count), 1) AS avg_events,
COUNT(CASE WHEN plan = 'paid' THEN 1 END) AS paid_count,
ROUND(100.0 * COUNT(CASE WHEN plan = 'paid' THEN 1 END)
/ COUNT(*), 1) AS paid_pct
FROM segments
GROUP BY segment
ORDER BY
CASE segment
WHEN 'power' THEN 1
WHEN 'regular' THEN 2
WHEN 'light' THEN 3
WHEN 'dormant' THEN 4
END;
Adjust the thresholds (50, 10, 1) based on your product. The CASE statement in ORDER BY ensures results display from most engaged to least engaged. The paid percentage per segment shows which engagement levels convert to revenue.
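To attach actual revenue to each segment rather than just a paid flag, you can join activity counts to the payments table from Query 5. A compact sketch, assuming the same events and payments tables used earlier:

```sql
-- Revenue per engagement segment over the last 30 days
WITH activity AS (
  SELECT user_id, COUNT(*) AS event_count
  FROM events
  WHERE event_timestamp >= CURRENT_DATE - INTERVAL '30 days'
  GROUP BY user_id
),
revenue AS (
  SELECT user_id, SUM(amount) AS total_paid
  FROM payments
  WHERE payment_date >= CURRENT_DATE - INTERVAL '30 days'
    AND status = 'succeeded'
  GROUP BY user_id
)
SELECT
  CASE
    WHEN a.event_count >= 50 THEN 'power'
    WHEN a.event_count >= 10 THEN 'regular'
    ELSE 'light'
  END AS segment,
  COUNT(*) AS user_count,
  SUM(COALESCE(r.total_paid, 0)) AS segment_revenue
FROM activity a
LEFT JOIN revenue r ON a.user_id = r.user_id
GROUP BY 1
ORDER BY segment_revenue DESC;
```

Note this version only covers users with at least one event; truly dormant users have no activity row to join from.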
9. Time-to-Value
The question: How long does it take new users to reach their first meaningful action?
Time-to-value measures the gap between signup and the moment a user first gets value from your product. A shorter time-to-value strongly correlates with higher retention.
WITH first_value AS (
SELECT
u.id AS user_id,
u.created_at AS signup_time,
MIN(e.event_timestamp) AS first_value_time
FROM users u
JOIN events e ON u.id = e.user_id
AND e.event_name = 'query_generated'
WHERE u.created_at >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY u.id, u.created_at
)
SELECT
COUNT(*) AS users_reached_value,
ROUND(AVG(EXTRACT(EPOCH FROM (first_value_time - signup_time)) / 3600), 1)
AS avg_hours_to_value,
PERCENTILE_CONT(0.5) WITHIN GROUP (
ORDER BY EXTRACT(EPOCH FROM (first_value_time - signup_time)) / 3600
) AS median_hours_to_value,
PERCENTILE_CONT(0.9) WITHIN GROUP (
ORDER BY EXTRACT(EPOCH FROM (first_value_time - signup_time)) / 3600
) AS p90_hours_to_value
FROM first_value;
Replace 'query_generated' with whatever event represents the first "aha moment" in your product. The query calculates average, median, and 90th percentile time-to-value in hours. Median is more useful than average here because a few outliers (users who come back weeks later) can skew the mean.
10. Support Ticket Trends
The question: Are support tickets increasing, and which categories are growing fastest?
Tracking ticket volume by category over time helps you identify emerging product issues before they become widespread. A spike in a specific category often signals a bug or a confusing UX flow.
SELECT
DATE_TRUNC('week', t.created_at) AS week,
t.category,
COUNT(*) AS ticket_count,
ROUND(AVG(EXTRACT(EPOCH FROM (t.resolved_at - t.created_at)) / 3600), 1)
AS avg_resolution_hours
FROM support_tickets t
WHERE t.created_at >= CURRENT_DATE - INTERVAL '12 weeks'
GROUP BY DATE_TRUNC('week', t.created_at), t.category
ORDER BY week DESC, ticket_count DESC;
This groups tickets by week and category, showing both volume and average resolution time. If a category spikes in one week, investigate. If resolution time is climbing, the support team may need better internal tooling or documentation for that issue type.
For a week-over-week comparison, add a window function:
WITH weekly_tickets AS (
SELECT
DATE_TRUNC('week', created_at) AS week,
category,
COUNT(*) AS ticket_count
FROM support_tickets
WHERE created_at >= CURRENT_DATE - INTERVAL '12 weeks'
GROUP BY DATE_TRUNC('week', created_at), category
)
SELECT
week,
category,
ticket_count,
LAG(ticket_count) OVER (PARTITION BY category ORDER BY week) AS prev_week,
ticket_count - LAG(ticket_count) OVER (PARTITION BY category ORDER BY week) AS change
FROM weekly_tickets
ORDER BY week DESC, ticket_count DESC;
How AI Makes SQL Accessible for Product Managers
You do not need to memorize any of the queries above. AI-powered SQL tools let you describe what you want in plain English and get a working query back in seconds.
For example, instead of writing a retention cohort query from scratch, you can type: "Show me weekly retention cohorts for users who signed up in the last 3 months." The AI generates the SQL, you review it, and you run it against your database.
This approach works well for PMs because:
- You think in questions, not syntax. PMs are trained to ask "what happened" and "why." AI SQL tools match that mental model by accepting natural language input.
- Schema awareness matters. Tools like AI2SQL connect to your actual database schema, so the generated queries use your real table and column names instead of generic placeholders.
- Iteration is fast. If the first query is not quite right, you refine the prompt. "Add a filter for paid users only" is faster than debugging a WHERE clause.
- You learn SQL by reading it. Every generated query teaches you a pattern. After a few weeks of using AI-generated SQL, you start recognizing GROUP BY, CTEs, and window functions naturally.
The goal is not to replace SQL knowledge. It is to remove the blank-page problem so you can focus on the question, not the syntax.
Try AI2SQL free and generate any of these queries by describing what you need in plain English.
Getting Started: A Checklist for PMs
If you have never queried your company's database before, here is a practical checklist to get from zero to running queries:
- Ask your engineering team for read-only database access. Specify that you want access to the analytics replica, not the production database. Most companies already have one.
- Identify your key tables. You typically need users (signup data), events (product activity), payments or subscriptions (revenue data), and support_tickets if applicable.
- Install a SQL client. DBeaver (free), DataGrip, or even the query editor in your BI tool. Connect it to your read-only credentials.
- Start with Query 1 (DAU). Modify the table and column names to match your schema. Run it. If it works, you are set up.
- Save your queries. Build a personal library of the 10 queries above, customized for your database. Name them clearly: pm-dau-mau.sql, pm-retention-cohorts.sql, and so on.
- Schedule a weekly data pull. Every Monday, run your DAU, funnel, and churn queries. Paste the results into your product update. You now own your own metrics.
Frequently Asked Questions
Do product managers really need to learn SQL?
Yes. SQL lets PMs pull their own data without waiting for engineering or data teams. Even basic SELECT, WHERE, and GROUP BY queries cover 80% of product questions. You do not need to become a database expert, but knowing how to answer simple questions like DAU, retention, or conversion rates directly from the database saves hours of back-and-forth every week.
How long does it take a PM to learn enough SQL to be useful?
Most product managers can learn enough SQL to run the 10 queries in this guide within one to two weeks of casual practice. The core concepts are SELECT (read data), WHERE (filter), GROUP BY (aggregate), and JOIN (combine tables). Once you understand those four, you can answer most product analytics questions yourself.
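A single query can exercise all four concepts at once. This sketch uses the users and events tables assumed throughout the guide:

```sql
-- SELECT + WHERE + JOIN + GROUP BY in one query:
-- event volume per plan over the last 7 days
SELECT
  u.plan,                          -- SELECT: choose columns
  COUNT(*) AS events_last_week     -- GROUP BY: aggregate per plan
FROM events e
JOIN users u ON e.user_id = u.id   -- JOIN: combine tables
WHERE e.event_timestamp >= CURRENT_DATE - INTERVAL '7 days'  -- WHERE: filter
GROUP BY u.plan
ORDER BY events_last_week DESC;
```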
What SQL queries should product managers know?
The most important SQL queries for product managers are: DAU/MAU (daily and monthly active users), feature adoption rate, user retention by cohort, conversion funnel analysis, revenue per user (ARPU), churn analysis, A/B test results, user segmentation, time-to-value measurement, and support ticket trend analysis. These 10 queries cover the majority of product decisions.
Can AI tools write SQL queries for product managers?
Yes. Tools like AI2SQL let you describe what you need in plain English and generate the correct SQL automatically. For example, you can type "show me daily active users for the last 30 days" and get a working query. This is especially useful for PMs who need answers fast but are still learning SQL syntax.
What database access do product managers typically get?
Most companies give PMs read-only access to a replica or analytics database. This means you can run SELECT queries to pull data but cannot modify production data. Common setups include read replicas of PostgreSQL or MySQL, data warehouses like Snowflake or BigQuery, or BI tools that connect to the database. Ask your engineering team for read-only credentials to get started.