{"id":16822,"date":"2026-05-14T09:14:25","date_gmt":"2026-05-14T09:14:25","guid":{"rendered":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/"},"modified":"2026-05-15T07:07:49","modified_gmt":"2026-05-15T07:07:49","slug":"customer-engagement-dashboard","status":"publish","type":"post","link":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/","title":{"rendered":"Customer Engagement Dashboard in 2026: Track Signals that Predict Churn, Not Just Report It"},"content":{"rendered":"<p>A <a href=\"https:\/\/userpilot.com\/product\/product-analytics\/\">customer engagement dashboard<\/a> that shows daily active users climbing while net revenue retention quietly drops is not doing its job. I&#8217;ve seen this scenario more times than I&#8217;d like: a product team confident in their engagement numbers, then blindsided by a renewal conversation. Activity was tracked; intent was not.<\/p>\n<p>What engagement dashboards originally promised was simple: collect <a href=\"https:\/\/userpilot.com\/blog\/using-analytics-to-improve-customer-engagement\/\">customer engagement data<\/a> in one place and know how your product is performing. For a long time, that was enough. In 2026, the definition of &#8220;engaged&#8221; has shifted, and a dashboard built around login frequency and session counts is increasingly insufficient for the decisions that actually matter.<\/p>\n<p>AI agents now interact with SaaS products alongside human users, executing tasks through API calls and MCP integrations without generating the behavioral signals dashboards were built to track. Unlike human users who click, scroll, and trigger events, agents complete tasks without ever touching your UI. 
A dashboard that can&#8217;t tell the difference is measuring half the picture and optimizing for the wrong half.<\/p>\n<p>The dashboards that actually reduce churn in 2026 are built around leading indicators that predict what&#8217;s coming, not lagging ones that confirm what already happened.<\/p>\n<h2 id=\"why-dashboards-fail\">Why most engagement dashboards fail before they start<\/h2>\n<p>The structural failure isn&#8217;t in the data, which is usually fine. It&#8217;s in which metrics get prioritized, and almost universally, teams default to metrics that are easy to track over metrics that are predictive.<\/p>\n<p>James Mitchinson, Userpilot&#8217;s Head of Customer Success, described exactly this pattern when talking about an account his team nearly lost:<\/p>\n<blockquote><p>&#8220;It was clear progress wasn&#8217;t being made, but there were still a lot of logins, and being able to look at the difference between those two things, lots of activity, but the outcomes aren&#8217;t really materializing, it gave us the opportunity to go and have a frank conversation with the executive stakeholder about some of the challenges and frustrations they were experiencing, and we got them back on track before they gave up. If we hadn&#8217;t intervened at that moment, I wonder, would they have looked for another solution or decided it wasn&#8217;t worth their time?&#8221;<br \/>\n<cite>\u2014 James Mitchinson, Head of Customer Success, <a href=\"https:\/\/userpilot.com\/\">Userpilot<\/a><\/cite><\/p><\/blockquote>\n<p>The <a href=\"https:\/\/userpilot.com\/blog\/real-time-customer-engagement\/\">dangerous accounts<\/a> aren&#8217;t the inactive ones, which are easy to spot. 
They&#8217;re accounts that look engaged on the surface but aren&#8217;t extracting value underneath, and a standard engagement dashboard built around login counts and session duration can&#8217;t surface that distinction.<\/p>\n<p>Wes Bush, founder of ProductLed, frames the root of this problem clearly:<\/p>\n<blockquote><p>&#8220;Engagement is the second-order effect of your product&#8217;s usefulness to customers.&#8221;<\/p><\/blockquote>\n<p>That reframe changes how you build a dashboard. Instead of tracking how often users log in, you track whether they&#8217;re completing the outcomes they came for.<\/p>\n<p>Before building your first metric list, define your engagement north star: one metric that reflects genuine product value, not activity. Everything else on the dashboard becomes a supporting signal. Without a north star, you end up with 25 metrics and no clear decision to make from any of them.<\/p>\n<div style=\"background-color: #e9e5fe; padding: 20px; color: black;\">\ud83d\udca1 Read related blog posts: <a href=\"https:\/\/userpilot.com\/blog\/product-usage-metrics\/\">Product usage metrics: which ones actually predict retention<\/a><\/div>\n<h2 id=\"leading-indicators\">Leading indicators: The metrics that predict what happens next<\/h2>\n<p>Leading indicators are the metrics you want dominating your engagement dashboard. They correlate with future retention and expansion rather than confirming past behavior. The following four are the baseline I&#8217;d build any SaaS <a href=\"https:\/\/userpilot.com\/blog\/product-dashboard\/\">product dashboard<\/a> around.<\/p>\n<h3>Time to value<\/h3>\n<p>Time to value (TTV) measures how long it takes a new user to reach their first meaningful outcome in your product. In SaaS, the average sits around 36 hours, though for any product where core value is delivered in a single session, faster is consistently better. 
Users who don&#8217;t experience the product&#8217;s value quickly tend to disengage before retention can set in, which makes TTV one of the clearest early-stage predictors of churn you can track.<\/p>\n<p>TTV matters most in the first 7-14 days of a new user&#8217;s life. If a user doesn&#8217;t reach their <a href=\"https:\/\/userpilot.com\/blog\/customer-activation-examples\/\">activation milestone<\/a> in that window, the probability of long-term retention drops significantly. Track it by cohort, not just as an overall average, because TTV often varies sharply by acquisition channel or user segment.<\/p>\n<h3>Activation rate<\/h3>\n<p>Activation rate measures the percentage of new users who complete the milestone that predicts long-term retention: the moment a user has experienced enough of your product to understand its core value. Defining that milestone is the hard part, and most teams define it either too early (completing a profile) or too broadly (logging in three times).<\/p>\n<p>When I look at activation rates in Userpilot, I break them down by the specific steps in the <a href=\"https:\/\/userpilot.com\/blog\/product-success\/\">activation funnel<\/a> rather than tracking a single aggregate. That&#8217;s where the actionable insight lives: not &#8220;activation is at 34%,&#8221; but &#8220;users are dropping off at step 3, specifically during domain verification.&#8221;<\/p>\n<p>When we launched Userpilot&#8217;s email feature, the funnel showed a sharp drop-off at exactly that verification step. Within a few hours, I built a targeting tooltip and checklist inside Userpilot highlighting the correct steps, with no engineering ticket required. 
The drop-off closed within days, and I wouldn&#8217;t have caught it without watching activation at that level of granularity.<\/p>\n<figure id=\"attachment_1\" aria-describedby=\"caption-attachment-1\" style=\"width: 1000px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"size-full wp-image-1\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2024\/02\/New-user-activation-dashboard-in-Userpilot.gif\" alt=\"New user activation dashboard in Userpilot showing step-by-step funnel with drop-off points\" width=\"1000\" height=\"563\" \/><figcaption id=\"caption-attachment-1\" class=\"wp-caption-text\"><br \/>New user activation dashboard in Userpilot.\u00a0<\/figcaption><\/figure>\n<h3>Feature adoption rate<\/h3>\n<p><a href=\"https:\/\/userpilot.com\/blog\/feature-adoption-101\/\">Feature adoption rate<\/a> measures what percentage of users have started engaging with a specific feature. The number that matters is adoption of your core features, not your peripheral ones. High adoption of a sidebar widget tells you almost nothing about retention; high adoption of the feature that delivers your product&#8217;s primary value is one of the clearest retention signals you have.<\/p>\n<p>I also track <a href=\"https:\/\/userpilot.com\/blog\/feature-adoption-101\/\">feature adoption<\/a> alongside continued usage, because initial uptake tells you about discovery while return visits tell you about value. 
When adoption is high but continued usage drops off sharply after the first week, that&#8217;s a signal the feature isn&#8217;t delivering what users expected, which is a product problem worth diagnosing before it shows up in churn.<\/p>\n<figure id=\"attachment_2\" aria-describedby=\"caption-attachment-2\" style=\"width: 1000px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"size-full wp-image-2\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2024\/02\/Core-feature-engagement-dashboard-in-Userpilot.gif\" alt=\"Core feature engagement dashboard in Userpilot showing adoption and usage trends over time\" width=\"1000\" height=\"563\" \/><figcaption id=\"caption-attachment-2\" class=\"wp-caption-text\">Tracking adoption rate alongside continued usage rate reveals whether a feature is being discovered (adoption) or genuinely delivering value (return usage).<\/figcaption><\/figure>\n<h3>Customer health score<\/h3>\n<p>A customer health score is a composite metric that bridges <a href=\"https:\/\/userpilot.com\/blog\/product-analytics\/\">product usage data<\/a> and <a href=\"https:\/\/userpilot.com\/blog\/churn-prediction\/\">churn prediction<\/a>. It aggregates signals from multiple sources, typically login frequency, feature usage breadth, support ticket volume, and recent NPS responses, into a single score that gives customer success teams a read on account health without requiring them to cross-reference five separate reports.<\/p>\n<p>The specific signals you weight will depend on your product. In a B2B tool used by teams, feature breadth (how many seats are actively using core features) carries more weight than raw session count. 
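To make the weighting idea concrete, here is a minimal sketch of how a composite health score could be assembled; every signal name, normalization cap, and weight below is a hypothetical illustration for this discussion, not Userpilot's actual formula.

```python
# Hypothetical composite health score: a weighted sum of normalized signals.
# All signal names, caps, and weights here are illustrative assumptions.

def health_score(account: dict, weights: dict) -> float:
    """Return a 0-100 health score from normalized account signals."""
    # Normalize each raw signal into a 0-1 range before weighting.
    normalized = {
        "feature_breadth": min(account["active_core_seats"] / account["total_seats"], 1.0),
        "login_frequency": min(account["logins_last_30d"] / 20, 1.0),   # cap at 20 logins
        "support_load": 1.0 - min(account["open_tickets"] / 10, 1.0),   # more tickets = worse
        "nps": account["last_nps"] / 10,                                # 0-10 response scale
    }
    score = sum(weights[k] * normalized[k] for k in weights)
    return round(100 * score, 1)

# B2B-style weighting: feature breadth counts for more than raw login activity.
weights = {"feature_breadth": 0.4, "login_frequency": 0.2, "support_load": 0.2, "nps": 0.2}

account = {"active_core_seats": 6, "total_seats": 10, "logins_last_30d": 25,
           "open_tickets": 2, "last_nps": 8}
print(health_score(account, weights))  # -> 76.0
```

The point of the sketch is the shape, not the numbers: each signal is normalized first, then weighted according to what your own cohort data says predicts retention.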
Self-serve products tend to find time-to-activation and return visits in the first 30 days most predictive, though the right answer always comes from your own cohort data.<\/p>\n<figure id=\"attachment_3\" aria-describedby=\"caption-attachment-3\" style=\"width: 1351px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"size-full wp-image-3\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2022\/12\/product-usage-dashboard-1.png\" alt=\"Product usage dashboard in Userpilot showing user stickiness, feature engagement, and retention signals\" width=\"1351\" height=\"639\" \/><figcaption id=\"caption-attachment-3\" class=\"wp-caption-text\"><br \/>The combination of stickiness, feature engagement, and retention data in a single view is what makes health score calculations possible without building a custom BI report.<\/figcaption><\/figure>\n<h3>Customer engagement score<\/h3>\n<p>A customer engagement score measures how engaged your existing customers and free trial prospects are, typically through a weighted combination of in-product actions. It&#8217;s distinct from a health score in that it captures engagement quality rather than account-level health. A user who completes a core workflow earns more engagement score than one who opens the same feature and immediately closes it.<\/p>\n<p>In Userpilot, <a href=\"https:\/\/userpilot.com\/blog\/userpilot-custom-analytics-dashboards\/\">custom analytics dashboards<\/a> let you configure which events contribute to your engagement score and how they&#8217;re weighted. That customization matters: an engagement score built around the wrong events tracks activity without tracking value, which puts you back at the same problem most dashboards start with.<\/p>\n<h2 id=\"lagging-indicators\">Lagging indicators: The metrics that still belong on your dashboard<\/h2>\n<p>Lagging indicators aren&#8217;t useless. 
They give you essential context, validate product decisions, and surface patterns that leading indicators miss. The mistake is treating them as your primary signal for retention risk, because by the time they move, the retention window for most at-risk accounts has already closed.<\/p>\n<p><a href=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2025\/02\/image-2-leading-vs-lagging.png\"><img decoding=\"async\" class=\"alignnone size-full wp-image-638052\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2025\/02\/image-2-leading-vs-lagging.png\" alt=\"Leading vs lagging indicators\" width=\"1800\" height=\"1300\" srcset=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2025\/02\/image-2-leading-vs-lagging.png 1800w, https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2025\/02\/image-2-leading-vs-lagging-450x325.png 450w, https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2025\/02\/image-2-leading-vs-lagging-1024x740.png 1024w, https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2025\/02\/image-2-leading-vs-lagging-768x555.png 768w, https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2025\/02\/image-2-leading-vs-lagging-1536x1109.png 1536w\" sizes=\"(max-width: 1800px) 100vw, 1800px\" \/><\/a><\/p>\n<h3>Net Promoter Score (NPS)<\/h3>\n<p>NPS measures customer loyalty by asking users how likely they are to recommend your product on a 0-10 scale. Customers who score 9 or 10 are promoters; those who score 0-6 are detractors. The resulting score gives you a directional read on sentiment that&#8217;s useful for trend tracking across cohorts, not for catching individual accounts before they churn.<\/p>\n<p>Leading off your dashboard with NPS has one consistent problem: timing. By the time NPS drops, the damage is usually done. 
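The score itself is simple arithmetic on those bands: the percentage of promoters minus the percentage of detractors. A minimal sketch:

```python
def nps(scores: list[int]) -> int:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    computed over a list of 0-10 survey responses."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 4 promoters, 3 passives (7-8), and 3 detractors out of 10 responses.
print(nps([10, 9, 9, 10, 8, 7, 8, 6, 4, 2]))  # -> 10
```

Note that passives (7-8) drop out of the numerator entirely, which is one reason the aggregate number hides so much movement underneath it.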
A customer frustrated enough to score you a 3 is not starting a retention conversation with that survey: that conversation has been underway for weeks without a signal on your dashboard.<\/p>\n<figure id=\"attachment_4\" aria-describedby=\"caption-attachment-4\" style=\"width: 2880px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"size-full wp-image-4\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2024\/02\/NPS-dashboard.png\" alt=\"NPS dashboard in Userpilot showing promoter, passive, and detractor breakdown with trend over time\" width=\"2880\" height=\"1616\" \/><figcaption id=\"caption-attachment-4\" class=\"wp-caption-text\"><br \/>NPS dashboard in Userpilot.<\/figcaption><\/figure>\n<h3>Customer Satisfaction Score (CSAT)<\/h3>\n<p>CSAT measures satisfaction with specific interactions: a support ticket resolution, a new feature release, an onboarding call. It&#8217;s more transactional than NPS and more accurate as a result, because it captures satisfaction in context rather than asking users to rate their entire relationship with your product from memory. <a href=\"https:\/\/userpilot.com\/blog\/b2b-customer-satisfaction-surveys\/\">B2B CSAT surveys<\/a> work best triggered immediately after a meaningful interaction, not sent on a quarterly cadence.<\/p>\n<h3>Customer Effort Score (CES)<\/h3>\n<p>CES measures how easy it is for users to complete tasks with your product, typically by asking &#8220;how easy was it to complete this task&#8221; on a 1-7 scale after a specific interaction. A low CES score on a core feature is one of the clearest signals you have that a usability problem exists, which is worth knowing well before it starts driving churn. 
Of the three satisfaction metrics, CES is the most actionable for product teams because it points at friction rather than just reporting dissatisfaction.<\/p>\n<p>I watch CES scores on <a href=\"https:\/\/userpilot.com\/blog\/user-friction\/\">high-friction steps<\/a> specifically, the places in your product where users are most likely to give up or submit a support ticket. If CES drops after a UI change, you know the change introduced friction; if it improves after a redesign, you have evidence the fix worked.<\/p>\n<h3>User retention rates<\/h3>\n<p>Retention rates (daily, weekly, monthly) measure the percentage of users who return to your product in a given period. Like NPS, they&#8217;re diagnostic: they confirm that churn happened rather than predicting it. By the time a monthly <a href=\"https:\/\/userpilot.com\/blog\/customer-retention-rate-metrics-saas\/\">retention rate<\/a> drops, you&#8217;re already past the intervention window for the accounts that drove the decline.<\/p>\n<p>Track retention rates by cohort rather than as an overall aggregate. A product with stable overall retention but declining retention in its newest cohorts has a structural onboarding problem that aggregate numbers will hide until it compounds into a revenue problem.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/04\/retention-dashboard-engagement-dashboard_af86e1d3add6bfeb15d3ba5a24f00640.gif\" \/><\/p>\n<h3>Customer Lifetime Value (CLV)<\/h3>\n<p><a href=\"https:\/\/userpilot.com\/blog\/calculate-lifetime-value-saas\/\">Customer Lifetime Value<\/a> estimates the total revenue a customer generates across their relationship with your product. It ties everything else together: strong retention and strong feature adoption translate directly into higher CLV. 
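One common back-of-the-envelope estimate (a deliberate simplification I'm adding here, not a model the post prescribes) divides average revenue per account by the churn rate, since churn rate implies an average customer lifetime:

```python
def simple_clv(arpa_monthly: float, monthly_churn_rate: float) -> float:
    """Naive CLV estimate: average monthly revenue per account / monthly churn.
    A 3% monthly churn rate implies an average lifetime of ~33 months."""
    return arpa_monthly / monthly_churn_rate

print(simple_clv(200.0, 0.03))  # $200/month at 3% monthly churn -> ~$6,667
```

Real CLV models layer in expansion revenue, gross margin, and discounting, but even this crude version makes the retention link visible: halve the churn rate and the estimate doubles.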
Track it by segment and by acquisition channel to understand which customer types are actually driving your business, rather than just inflating your user count.<\/p>\n<h3>Churn rate and funnel metrics<\/h3>\n<p>Churn rate is the ultimate lagging indicator: by the time it shows up in your dashboard, the damage is already done. Track it as your revenue baseline, but pair it with <a href=\"https:\/\/userpilot.com\/blog\/saas-funnel-metrics\/\">funnel metrics<\/a> that show where users are converting (or dropping off) before they reach the churn stage. Cost per acquisition and conversion rates belong alongside churn data, not siloed in a marketing-only report.<\/p>\n<figure id=\"attachment_6\" aria-describedby=\"caption-attachment-6\" style=\"width: 2940px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"size-full wp-image-6\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2022\/02\/funnel-analysis-Userpilot.png\" alt=\"Funnel analysis in Userpilot showing drop-off points across the user journey\" width=\"2940\" height=\"1699\" \/><figcaption id=\"caption-attachment-6\" class=\"wp-caption-text\"><br \/>Funnel analysis in Userpilot. The step-level drop-off view is what transforms funnel data from a reporting exercise into a prioritization tool for your onboarding and activation work.<\/figcaption><\/figure>\n<h2 id=\"ai-sessions\">The 2026 addition: Tracking AI-assisted sessions<\/h2>\n<p>Standard <a href=\"https:\/\/userpilot.com\/blog\/product-usage-metrics\/\">customer engagement metrics<\/a> were designed around human behavior: clicks, scrolls, session duration, page depth. That design assumption is starting to break down. 
AI agents now interact with SaaS products through API calls and MCP integrations, completing tasks without generating any of the behavioral signals your dashboard was built to capture.<\/p>\n<p>Yazan Sehwail, Userpilot&#8217;s CEO, identified the compounding problem when describing what increased shipping velocity means for product teams trying to track feature engagement:<\/p>\n<blockquote><p>&#8220;As producing and building features become a lot cheaper, instead of every quarter, you&#8217;re releasing one or two features, now you&#8217;re releasing 7, 8, 9. It becomes even harder for product teams to manually have to track each one and understand usage for each one.&#8221;<\/p><\/blockquote>\n<p>More features shipping means more engagement patterns to track across your dashboard. Layered on top of that, AI-assisted sessions generate activity that traditional metrics will misread as human engagement, inflating session counts and usage numbers without a corresponding signal about whether real value is being delivered. If an AI agent is using your product intensively through an MCP integration, your dashboard might read it as a highly engaged human account.<\/p>\n<p>The new metric layer worth adding for AI-assisted sessions:<\/p>\n<ul>\n<li><strong>Agent task completion rate:<\/strong> What percentage of tasks initiated by an AI agent are successfully completed? Incomplete tasks at scale are a product quality signal, not just a usage anomaly.<\/li>\n<li><strong>Human vs. agent session ratio:<\/strong> What share of your active sessions are agent-driven? A product where agents account for a significant share of sessions needs separate analysis tracks for each user type.<\/li>\n<li><strong>AI interaction quality:<\/strong> For products with a built-in AI assistant, what&#8217;s the rate of successful interactions versus failed or abandoned ones? 
This is where Lia&#8217;s usage data becomes a meaningful engagement signal.<\/li>\n<\/ul>\n<p><img decoding=\"async\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/team-product-Userpilot.png\" \/><\/p>\n<p>Yazan&#8217;s framing of what MCP changes specifically for teams trying to get a complete picture of product usage:<\/p>\n<blockquote><p>&#8220;If you as a marketer wanted to see, using session replay, NPS data, survey data, and product usage data, you&#8217;re able to get your answer without having to go to Userpilot, without having to pull data and upload it to someone. So this is why MCP is gonna be a game changer.&#8221;<\/p><\/blockquote>\n<p>For most teams in 2026, AI-assisted session data is still sparse, and the analysis infrastructure for separating agent sessions from human sessions is new. Adding a segment to your existing dashboard that filters by session type is the practical starting point, with the full layer built incrementally as data accumulates. What matters most on day one isn&#8217;t a complete picture: it&#8217;s not misreading agent activity as human engagement while you build toward one.<\/p>\n<div style=\"background-color: #e9e5fe; padding: 20px; color: black;\">\ud83d\udca1 Read related blog posts: <a href=\"https:\/\/userpilot.com\/blog\/product-analytics\/\">Userpilot Product Analytics: track feature adoption, retention, and user behavior in one dashboard<\/a><\/div>\n<h2 id=\"how-to-build\">How to build a customer engagement dashboard that actually drives decisions<\/h2>\n<p>Building a customer engagement dashboard is not a configuration task. It&#8217;s a prioritization task. 
Before you touch any dashboard settings, you need one clear answer: what is the single metric that best reflects whether your customers are getting genuine value from your product?<\/p>\n<h3>Step 1: Define your engagement north star<\/h3>\n<p>Your north star is the one metric that your retained users almost always cross and your churned users almost never do. For a project management tool, it might be &#8220;users who complete their first project milestone.&#8221; For a collaboration tool, it might be &#8220;users who invite at least two teammates within their first week.&#8221; For any product, it should map to the moment value becomes undeniable.<\/p>\n<p>Getting this right takes more than intuition. Pull cohort data comparing your most retained users against your churned users and look for the behavioral divergence point: the action or milestone that separates the two groups most clearly. That&#8217;s your north star candidate, and it&#8217;s worth validating in a few customer conversations before locking it in as your <a href=\"https:\/\/userpilot.com\/blog\/new-user-retention\/\">retention<\/a> anchor.<\/p>\n<h3>Step 2: Layer in 3-5 supporting signals<\/h3>\n<p>Once you have a north star, build your supporting metric layer around it. Keep it to five metrics maximum for the primary dashboard view. A longer list turns into background noise that nobody checks until something goes wrong.<\/p>\n<p>For most SaaS products, the right supporting signals are: activation rate (broken down by step, not overall), feature adoption rate for your top 3 core features, time to value, <a href=\"https:\/\/userpilot.com\/blog\/increase-product-stickiness-saas\/\">user stickiness<\/a>, and a customer health score weighted to your product&#8217;s specific retention predictors. 
Each metric should have a named owner and a defined intervention threshold, which is the point at which someone takes action rather than just noting the number.<\/p>\n<h3>Step 3: Autocapture events, then label what matters<\/h3>\n<p>To measure <a href=\"https:\/\/userpilot.com\/blog\/userpilot-autocapture\/\">customer engagement<\/a> accurately, you need to track what users actually do in your product. Autocapture handles this automatically by recording every user interaction without requiring manual event setup, which means you don&#8217;t lose data while you&#8217;re still figuring out which events matter. The discipline is labeling meaningful events post-capture so your dashboard doesn&#8217;t drown in interaction noise with no signal hierarchy.<\/p>\n<p>Autocapture also enables retroactive analysis: you can look back at historical data to understand what users were doing before a churn spike rather than realizing after the fact that you weren&#8217;t tracking the right events. That retroactive capability is one of the most practically valuable advantages of tool-level autocapture over custom event tracking, particularly for teams without a dedicated data engineering resource.<\/p>\n<figure id=\"attachment_7\" aria-describedby=\"caption-attachment-7\" style=\"width: 1444px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"size-full wp-image-7\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2024\/02\/Autocapturing-event-data-in-Userpilot.png\" alt=\"Autocapture event data interface in Userpilot showing recorded user interactions and labeling options\" width=\"1444\" height=\"1028\" \/><figcaption id=\"caption-attachment-7\" class=\"wp-caption-text\"><br \/>Autocapture in Userpilot records every user interaction automatically. 
The labeling step that follows is what turns raw event data into the meaningful signals your engagement dashboard is built on.<\/figcaption><\/figure>\n<h3>Step 4: Pair the numbers with qualitative insight<\/h3>\n<p>Quantitative data tells you where users drop off, but it takes qualitative insight to understand why. Without that pairing, you&#8217;ll optimize the wrong step or fix the wrong feature. Two methods that work consistently: <a href=\"https:\/\/userpilot.com\/blog\/rage-clicks\/\">session recordings<\/a> to watch how users actually navigate your product, and in-app surveys triggered at the right moment to collect feedback while the experience is still fresh.<\/p>\n<figure id=\"attachment_8\" aria-describedby=\"caption-attachment-8\" style=\"width: 1440px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"size-full wp-image-8\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2024\/02\/User-Recording-Full-Screen-1.png\" alt=\"Watching a user session recording in Userpilot showing navigation patterns and friction points\" width=\"1440\" height=\"1024\" \/><figcaption id=\"caption-attachment-8\" class=\"wp-caption-text\"><br \/>Session replay in Userpilot. Filtering recordings by the specific events that feed your engagement metrics narrows the search from hundreds of sessions to the handful that explain a metric movement.<\/figcaption><\/figure>\n<p>Session recordings are especially useful when a metric moves unexpectedly. If feature adoption drops 15% after a UI change, watching 10 recordings of users who stopped using that feature will surface the problem faster than any amount of additional quantitative slicing. I filter recordings by the specific events feeding my engagement metrics, which narrows the search considerably.<\/p>\n<p>In-app surveys close the feedback loop between quantitative signals and user intent. 
If activation rate drops, a triggered survey asking users what stopped them during onboarding gives you the verbatim answer needed to fix it rather than guess at it. <a href=\"https:\/\/userpilot.com\/product\/user-feedback\/\">In-app surveys<\/a> work best tied to a specific product event, not sent on a time-based schedule that has no relationship to what the user just experienced.<\/p>\n<figure id=\"attachment_9\" aria-describedby=\"caption-attachment-9\" style=\"width: 824px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"size-full wp-image-9\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2024\/02\/User-survey-in-Userpilot-1.gif\" alt=\"In-app survey created and triggered in Userpilot appearing contextually during user flow\" width=\"824\" height=\"747\" \/><figcaption id=\"caption-attachment-9\" class=\"wp-caption-text\"><br \/>In-app survey in Userpilot. Triggering a survey at the moment a user hits friction yields far higher response rates and more actionable answers than a scheduled email sent three days later.<\/figcaption><\/figure>\n<h2 id=\"what-healthy-looks-like\">What a healthy engagement dashboard looks like in practice<\/h2>\n<p>A healthy customer engagement dashboard has three layers: one north star at the top, 3-5 supporting signals in the middle, and a qualitative context layer at the bottom. Every metric on it connects to a decision your team can act on. If a metric doesn&#8217;t have a named owner and a defined intervention threshold, it&#8217;s taking up space.<\/p>\n<p>The dashboard I work from at Userpilot is organized around feature health reports. Once a feature ships, I create a report tracking meaningful events to see usage and adoption over time, then look for where drop-off is happening and in which step users are getting stuck. 
The fix is sometimes engineering, but often it&#8217;s in-app messaging: a tooltip, a checklist, a contextual prompt at the moment a user hesitates.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/product-dashboard-showing-friction-scaled.png\" \/><\/p>\n<p>Shifting from overall metrics to cohort views was the single most important change I made to how I read <a href=\"https:\/\/userpilot.com\/blog\/improve-user-experience\/\">engagement data<\/a>. An overall activation rate of 35% means almost nothing without knowing whether it&#8217;s 50% for your enterprise cohort and 20% for your self-serve cohort. Engagement failures are almost always cohort-specific, not universal, and aggregate views hide them until they compound into revenue problems.<\/p>\n<p>Lia, Userpilot&#8217;s AI analytics assistant, changes what&#8217;s possible here for teams without dedicated data analysts. You can ask Lia what&#8217;s driving a retention drop, which segment has the lowest activation rate this month, or which feature is seeing unexpected drop-off, and get a data-backed answer without pulling a report or waiting for an analyst to free up. That kind of on-demand analysis makes engagement dashboards genuinely useful to people who aren&#8217;t already fluent <a href=\"https:\/\/userpilot.com\/blog\/product-analytics\/\">in product analytics<\/a>, which is most of the people whose decisions the dashboard is supposed to inform.<\/p>\n<p>For teams building their first engagement dashboard, the most common mistake is adding metrics before defining the north star. Start with one north star, three supporting signals, and one qualitative layer. 
Add a metric when you have a specific decision to make that the current dashboard can&#8217;t answer, and remove one that hasn&#8217;t driven a decision in 60 days.<\/p>\n<p>James put the timing problem directly when we talked about what AI-assisted monitoring makes possible for his team at scale:<\/p>\n<blockquote><p>&#8220;Churn prevention really has to come from earlier intervention where you&#8217;re working with a customer when they are still motivated to be successful with the platform, and the only way that you&#8217;re gonna be able to do that early intervention at scale with a large book of customers is if you have meaningful signals to tell you when the customer is likely to be engaged with you and when that level of support is gonna be impactful.&#8221;<br \/>\n<cite>\u2014 James Mitchinson, Head of Customer Success, Userpilot<\/cite><\/p><\/blockquote>\n<p>A dashboard that predicts churn is built before the churn signal arrives. Accounts that warn you through NPS or a falling retention rate are already past the intervention window by the time the data shows up. What separates the accounts you keep is catching the leading indicators early, when the signal still points at something fixable and there&#8217;s time to act on it.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A customer engagement dashboard is invaluable for interpreting customer data and making the right business decisions. Having one will enable your team to collaborate more effectively. 
With ease, you&#8217;ll gather, analyze, and share user engagement insights that will reduce churn and drive product growth.<\/p>\n","protected":false},"author":71,"featured_media":638054,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"content-type":"","inline_featured_image":false,"footnotes":""},"categories":[290,488],"tags":[1090,332,388,1040,495,986,316,391,694,693,182],"class_list":["post-16822","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-updated","category-user-engagement","tag-best-tools-for-customer-engagement","tag-customer-engagement","tag-customer-engagement-software","tag-engagement-data","tag-saas-customer-engagement","tag-saas-user-engagement","tag-user-engagement","tag-user-engagement-metrics","tag-user-engagement-strategies","tag-user-engagement-strategy","tag-user-engagement-tools"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v27.2 (Yoast SEO v27.2) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>Customer Engagement Dashboard: Track Signals to Predict Churn<\/title>\n<meta name=\"description\" content=\"A customer engagement dashboard is invaluable for interpreting customer data and making the right business decisions. Learn how to build one.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Customer Engagement Dashboard: Track Signals to Predict Churn\" \/>\n<meta property=\"og:description\" content=\"A customer engagement dashboard is invaluable for interpreting customer data and making the right business decisions. 
Learn how to build one.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/\" \/>\n<meta property=\"og:site_name\" content=\"Thoughts about Product Adoption, User Onboarding and Good UX | Userpilot Blog\" \/>\n<meta property=\"article:published_time\" content=\"2026-05-14T09:14:25+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-05-15T07:07:49+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/Product-Dashboard-in-2026-Build-For-the-Decision-Not-the-Sprint-Review.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1800\" \/>\n\t<meta property=\"og:image:height\" content=\"945\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Abrar Abutouq\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Abrar Abutouq\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"18 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/\"},\"author\":{\"name\":\"Abrar Abutouq\",\"@id\":\"https:\/\/userpilot.com\/blog\/#\/schema\/person\/de3e3a90716a9ee4b1d8e559d76ecf17\"},\"headline\":\"Customer Engagement Dashboard in 2026: Track Signals that Predict Churn, Not Just Report It\",\"datePublished\":\"2026-05-14T09:14:25+00:00\",\"dateModified\":\"2026-05-15T07:07:49+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/\"},\"wordCount\":3699,\"commentCount\":0,\"image\":{\"@id\":\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/Product-Dashboard-in-2026-Build-For-the-Decision-Not-the-Sprint-Review.png\",\"keywords\":[\"best tools for customer engagement\",\"customer engagement\",\"customer engagement software\",\"engagement data\",\"saas customer engagement\",\"saas user engagement\",\"user engagement\",\"user engagement metrics\",\"user engagement strategies\",\"user engagement strategy\",\"user engagement tools\"],\"articleSection\":[\"Updated\",\"User Engagement\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/\",\"url\":\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/\",\"name\":\"Customer Engagement Dashboard: Track Signals to Predict 
Churn\",\"isPartOf\":{\"@id\":\"https:\/\/userpilot.com\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/Product-Dashboard-in-2026-Build-For-the-Decision-Not-the-Sprint-Review.png\",\"datePublished\":\"2026-05-14T09:14:25+00:00\",\"dateModified\":\"2026-05-15T07:07:49+00:00\",\"author\":{\"@id\":\"https:\/\/userpilot.com\/blog\/#\/schema\/person\/de3e3a90716a9ee4b1d8e559d76ecf17\"},\"description\":\"A customer engagement dashboard is invaluable for interpreting customer data and making the right business decisions. Learn how to build one.\",\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#primaryimage\",\"url\":\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/Product-Dashboard-in-2026-Build-For-the-Decision-Not-the-Sprint-Review.png\",\"contentUrl\":\"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/Product-Dashboard-in-2026-Build-For-the-Decision-Not-the-Sprint-Review.png\",\"width\":1800,\"height\":945,\"caption\":\"Customer Engagement Dashboard in 2026: Track Signals that Predict Churn, Not Just Report It\"},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/userpilot.com\/blog\/#website\",\"url\":\"https:\/\/userpilot.com\/blog\/\",\"name\":\"Thoughts about Product Adoption, User Onboarding and Good UX | Userpilot 
Blog\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/userpilot.com\/blog\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/userpilot.com\/blog\/#\/schema\/person\/de3e3a90716a9ee4b1d8e559d76ecf17\",\"name\":\"Abrar Abutouq\",\"description\":\"Product Manager at Userpilot \u2013 Building products, product adoption, User Onboarding. I'm passionate about building products that serve user needs and solve real problems. With a strong foundation in product thinking and a willingness to constantly challenge myself, I thrive at the intersection of user experience, technology, and business impact. I\u2019m always eager to learn, adapt, and turn ideas into meaningful solutions that create value for both users and the business.\",\"sameAs\":[\"https:\/\/www.linkedin.com\/in\/abrar-abutouq-93aa8b147\/\"],\"url\":\"https:\/\/userpilot.com\/blog\/author\/abraruserpilot-co\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Customer Engagement Dashboard: Track Signals to Predict Churn","description":"A customer engagement dashboard is invaluable for interpreting customer data and making the right business decisions. Learn how to build one.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/","og_locale":"en_US","og_type":"article","og_title":"Customer Engagement Dashboard: Track Signals to Predict Churn","og_description":"A customer engagement dashboard is invaluable for interpreting customer data and making the right business decisions. 
Learn how to build one.","og_url":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/","og_site_name":"Thoughts about Product Adoption, User Onboarding and Good UX | Userpilot Blog","article_published_time":"2026-05-14T09:14:25+00:00","article_modified_time":"2026-05-15T07:07:49+00:00","og_image":[{"width":1800,"height":945,"url":"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/Product-Dashboard-in-2026-Build-For-the-Decision-Not-the-Sprint-Review.png","type":"image\/png"}],"author":"Abrar Abutouq","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Abrar Abutouq","Est. reading time":"18 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#article","isPartOf":{"@id":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/"},"author":{"name":"Abrar Abutouq","@id":"https:\/\/userpilot.com\/blog\/#\/schema\/person\/de3e3a90716a9ee4b1d8e559d76ecf17"},"headline":"Customer Engagement Dashboard in 2026: Track Signals that Predict Churn, Not Just Report It","datePublished":"2026-05-14T09:14:25+00:00","dateModified":"2026-05-15T07:07:49+00:00","mainEntityOfPage":{"@id":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/"},"wordCount":3699,"commentCount":0,"image":{"@id":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#primaryimage"},"thumbnailUrl":"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/Product-Dashboard-in-2026-Build-For-the-Decision-Not-the-Sprint-Review.png","keywords":["best tools for customer engagement","customer engagement","customer engagement software","engagement data","saas customer engagement","saas user engagement","user engagement","user engagement metrics","user engagement strategies","user engagement strategy","user engagement tools"],"articleSection":["Updated","User 
Engagement"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/","url":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/","name":"Customer Engagement Dashboard: Track Signals to Predict Churn","isPartOf":{"@id":"https:\/\/userpilot.com\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#primaryimage"},"image":{"@id":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#primaryimage"},"thumbnailUrl":"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/Product-Dashboard-in-2026-Build-For-the-Decision-Not-the-Sprint-Review.png","datePublished":"2026-05-14T09:14:25+00:00","dateModified":"2026-05-15T07:07:49+00:00","author":{"@id":"https:\/\/userpilot.com\/blog\/#\/schema\/person\/de3e3a90716a9ee4b1d8e559d76ecf17"},"description":"A customer engagement dashboard is invaluable for interpreting customer data and making the right business decisions. 
Learn how to build one.","inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/userpilot.com\/blog\/customer-engagement-dashboard\/#primaryimage","url":"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/Product-Dashboard-in-2026-Build-For-the-Decision-Not-the-Sprint-Review.png","contentUrl":"https:\/\/blog-static.userpilot.com\/blog\/wp-content\/uploads\/2026\/05\/Product-Dashboard-in-2026-Build-For-the-Decision-Not-the-Sprint-Review.png","width":1800,"height":945,"caption":"Customer Engagement Dashboard in 2026: Track Signals that Predict Churn, Not Just Report It"},{"@type":"WebSite","@id":"https:\/\/userpilot.com\/blog\/#website","url":"https:\/\/userpilot.com\/blog\/","name":"Thoughts about Product Adoption, User Onboarding and Good UX | Userpilot Blog","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/userpilot.com\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/userpilot.com\/blog\/#\/schema\/person\/de3e3a90716a9ee4b1d8e559d76ecf17","name":"Abrar Abutouq","description":"Product Manager at Userpilot \u2013 Building products, product adoption, User Onboarding. I'm passionate about building products that serve user needs and solve real problems. With a strong foundation in product thinking and a willingness to constantly challenge myself, I thrive at the intersection of user experience, technology, and business impact. 
I\u2019m always eager to learn, adapt, and turn ideas into meaningful solutions that create value for both users and the business.","sameAs":["https:\/\/www.linkedin.com\/in\/abrar-abutouq-93aa8b147\/"],"url":"https:\/\/userpilot.com\/blog\/author\/abraruserpilot-co\/"}]}},"_links":{"self":[{"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/posts\/16822","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/users\/71"}],"replies":[{"embeddable":true,"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/comments?post=16822"}],"version-history":[{"count":9,"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/posts\/16822\/revisions"}],"predecessor-version":[{"id":638185,"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/posts\/16822\/revisions\/638185"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/media\/638054"}],"wp:attachment":[{"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/media?parent=16822"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/categories?post=16822"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/userpilot.com\/blog\/wp-json\/wp\/v2\/tags?post=16822"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}