What is Real User Monitoring and Why is it Valuable?

Real user monitoring (RUM) collects detailed analytics on the actual experience of visitors interacting with a website, mobile app or other digital service. As the name suggests, RUM focuses on capturing metrics from real users under real conditions – not simulated tests.

By passively monitoring live user sessions, RUM reveals invaluable insights including:

  • The detailed performance each visitor experiences on web pages and app screens
  • Errors and failures causing problems for users in different locations
  • The specific usage paths and behaviors of customers through sites
  • How experiences vary across device types, browsers and operating systems
  • The impact website changes have on visitor metrics over time

RUM sheds light on issues that load tests, synthetic monitoring and lab analytics cannot detect. It highlights real-world performance, reliability and engagement issues affecting real visitors to empower data-driven optimization of digital experiences.

For example, web analysis firm Convert found common problems RUM identifies:

  • 57% of website visitors encounter delays, errors or failures
  • Issues are 3x worse on mobile compared to desktop
  • 25% of shopping carts are abandoned on slow-loading sites

Without RUM providing user-centric visibility, teams remain unaware of many customer experience shortfalls on their apps and sites.

How RUM Differs From Synthetic Monitoring

While sometimes confused or conflated, RUM and synthetic monitoring take very different approaches:

Synthetic monitoring uses automated scripts and checks to simulate user journeys across apps and report on performance metrics. It validates availability and baseline responsiveness of sites and services from outside the firewall.

RUM passively collects analytics generated by actual user activity within apps to reveal real-world experience and engagement.

Synthetic monitoring asks: Does this website work correctly when tested in prerecorded ways from different locations?

RUM asks: What performance and reliability do real users see on this site during live, production activity?
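
To make the contrast concrete, below is a minimal sketch (in TypeScript) of a synthetic-style check: a scripted probe that fetches a URL on a schedule and records availability and response time. The URL, interval and result shape are illustrative rather than taken from any particular monitoring product.

```typescript
// Minimal synthetic-style check: probe a URL on a schedule and record
// availability plus response time. Real synthetic monitoring products
// script full browser journeys from many locations; this is only the idea.

interface CheckResult {
  url: string;
  timestamp: string;
  ok: boolean;
  status?: number;
  responseTimeMs?: number;
  error?: string;
}

async function runCheck(url: string): Promise<CheckResult> {
  const started = Date.now();
  try {
    const response = await fetch(url);
    return {
      url,
      timestamp: new Date(started).toISOString(),
      ok: response.ok,
      status: response.status,
      responseTimeMs: Date.now() - started,
    };
  } catch (err) {
    return {
      url,
      timestamp: new Date(started).toISOString(),
      ok: false,
      error: String(err),
    };
  }
}

// Probe every 60 seconds and log the result; a real checker would report
// to a monitoring backend rather than the console.
setInterval(async () => {
  console.log(JSON.stringify(await runCheck("https://example.com/")));
}, 60_000);
```

Because the probe is scripted and repeated identically, it answers the availability question; it says nothing about what an actual visitor on a congested mobile network experiences – that is the gap RUM fills.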

The two approaches offer complementary lenses onto application health:

  • Synthetic reveals infrastructural and baseline code quality issues
  • RUM uncovers real visitor experience deficiencies

Used together, they provide complete visibility for DevOps and application teams to identify and resolve the full spectrum of potential performance and functional issues.

Why Real User Data Matters

Modern web and mobile apps have become hugely complex systems under the hood. Even a simple website may connect dozens of front-end, API, microservice, database and CDN components across multiple cloud services.

This intricacy makes effectively testing and modeling real-world use cases nearly impossible. Factors like user locations, browser choices, journey variations and code changes further complicate app behavior profiling.

Without real usage data, digital teams lack visibility into critical aspects of customer experiences including:

Real load performance – how fast key user journeys actually unfold across interconnected system components.

Regional variations – for example, higher latency for visitors in Asia than in Europe, producing uneven performance across regions.

Browser differences – differences in rendering engines and network stack optimizations between browsers such as Chrome and Safari, leading to large experience gaps even on the same device.

Mobile vs desktop contrasts – the heavier resource demands placed on constrained mobile devices and networks, leading to more failures.

Recent Catchpoint RUM research found 2x higher error rates for mobile visitors than for desktop users across the sites analyzed, due to these amplified demands.

User flow failure points – for example, checkout payment issues that only affect users arriving from specific campaign links, or customers being involuntarily signed out when an account-details update fails.

Without real customer data, critical reliability, speed and engagement challenges stay undetected and unresolved – directly impacting business metrics:

  • 9% revenue lift from improving web performance per Akamai
  • 18% conversion gains via smoothing mobile purchase flows per Monetate

Leveraging RUM supplies the contextual live-site analytics digital teams need to identify and tackle the experience deficiencies that drive visitors away.

What Data Does Real User Monitoring Capture?

Sophisticated RUM platforms ingest a wealth of data detailing exactly how real visitors use web and mobile applications, including:

Performance metrics – Exact timings for page, API and component load speeds and response times, plus network request transfer throughput.

Errors and failures – Server and browser exceptions or warning messages plus failover handoff notices.

Usage behaviors – Button clicks, gestures, form inputs and submissions, or purchases.

Session context – User type, location, device model, OS version, browser and more.

Site navigation journeys – Timestamped sequence of pages, content or screens accessed during visits.

By centralizing multiple analytics signals associated with live visitor experiences, RUM solutions unlock deep understanding of how real application usage unfolds across critical dimensions like performance, reliability and engagement.
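
As an illustration, a single RUM beacon might carry a record shaped roughly like the following TypeScript interface. The field names are hypothetical; every vendor defines its own schema, but the categories mirror the signals listed above.

```typescript
// Hypothetical shape of a single RUM beacon. Field names are illustrative;
// each vendor defines its own schema, but the categories mirror the signals
// described above: performance, errors, behavior, context and journey.

interface RumBeacon {
  sessionId: string;                    // groups events from one visit
  pageUrl: string;
  timestamp: string;                    // ISO-8601 time the event was recorded

  // Performance metrics
  timings?: {
    domContentLoadedMs: number;
    loadEventMs: number;
    largestContentfulPaintMs?: number;
  };

  // Errors and failures
  error?: {
    message: string;
    source?: string;                    // script URL or failing API endpoint
    stack?: string;
  };

  // Usage behaviors
  action?: {
    type: "click" | "submit" | "purchase" | "custom";
    target?: string;                    // e.g. a button id or form name
  };

  // Session context (used later for segmentation)
  context: {
    country?: string;
    deviceType: "desktop" | "mobile" | "tablet";
    browser: string;
    osVersion?: string;
  };
}
```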

Segmenting these datasets by geography, application module and other facets yields further insight into production application health. API-level granularity, for example, shows precisely which services and endpoints fail most often for users when application response times spike.
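
A rough sketch of that kind of segmentation, assuming the hypothetical beacon shape above, might group collected beacons by country and compute an error rate for each:

```typescript
// Group collected beacons by country and compute an error rate for each,
// using the hypothetical RumBeacon shape sketched above.

function errorRateByCountry(beacons: RumBeacon[]): Map<string, number> {
  const totals = new Map<string, { errors: number; all: number }>();
  for (const beacon of beacons) {
    const country = beacon.context.country ?? "unknown";
    const entry = totals.get(country) ?? { errors: 0, all: 0 };
    entry.all += 1;
    if (beacon.error) entry.errors += 1;
    totals.set(country, entry);
  }
  const rates = new Map<string, number>();
  for (const [country, { errors, all }] of totals) {
    rates.set(country, errors / all);
  }
  return rates;
}
```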

These rich contextual insights let teams move from guesswork to data-driven precision in identifying and resolving the specific elements that degrade real visitors' website and mobile application outcomes day to day.

How Real User Monitoring Technically Works

Modern RUM solutions use lightweight JavaScript tags, mobile SDKs, or packet inspection to gather detailed client-side telemetry during live production sessions. The pipeline typically has three stages:

Collection

  1. Embedding lightweight JavaScript tags on web application pages and screens enables passive gathering of metrics like page load timings, network requests, errors and user actions as visitors navigate sites (a minimal sketch follows this list).

  2. For mobile apps, SDK instrumentation similarly records performance, reliability and engagement analytics with minimal overhead.

  3. Packet-level network taps transparently inspect traffic flows in high-security environments, avoiding any application code changes.
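
A minimal sketch of the tag-based collection in step 1 might look like the following, using only standard browser APIs (Navigation Timing, error events and navigator.sendBeacon). The /rum/collect endpoint and the data-rum-track attribute are hypothetical.

```typescript
// Minimal browser-side collection sketch using only standard web APIs.
// The /rum/collect endpoint is hypothetical; production tags add sampling,
// batching and session handling on top of this same pattern.

const RUM_ENDPOINT = "/rum/collect";

function send(payload: object): void {
  // sendBeacon queues data without blocking navigation or page unload.
  navigator.sendBeacon(RUM_ENDPOINT, JSON.stringify(payload));
}

// 1. Page load timings from the Navigation Timing API.
window.addEventListener("load", () => {
  const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
  if (nav) {
    send({
      type: "timing",
      url: location.href,
      domContentLoadedMs: nav.domContentLoadedEventEnd,
      loadEventMs: nav.loadEventEnd,
    });
  }
});

// 2. Uncaught errors the visitor actually hits.
window.addEventListener("error", (event) => {
  send({
    type: "error",
    url: location.href,
    message: event.message,
    source: event.filename,
  });
});

// 3. A simple user-action signal: clicks on elements marked for tracking
//    with a data-rum-track attribute.
document.addEventListener("click", (event) => {
  const target = event.target as HTMLElement | null;
  if (target?.dataset.rumTrack) {
    send({ type: "action", url: location.href, target: target.dataset.rumTrack });
  }
});
```

A vendor tag does essentially this, plus batching, sampling and resource-level timing, while keeping its own footprint small.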

Processing

  1. The embedded tag or tap-based collector publishes the performance and behavior statistics to nearby RUM servers for processing.

  2. Servers normalize raw metrics from many front-end clients into consistent indicators of back-end and middleware responsiveness, availability and usage (a simplified sketch follows this list).

  3. Results get stored in time-series databases for flexible segmentation and analysis by web and mobile teams.
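
A simplified sketch of that normalization step might roll raw load-time samples up into per-minute percentile indicators before writing them to a time-series store. The one-minute bucket and the p50/p95 choices are illustrative.

```typescript
// Roll raw load-time samples up into per-minute percentile indicators,
// the kind of normalized series a time-series database would store.
// The one-minute bucket and the p50/p95 choices are illustrative.

interface TimingSample {
  timestamp: number;    // epoch milliseconds
  loadEventMs: number;
}

interface MinuteRollup {
  minute: number;       // epoch minute (timestamp / 60000, floored)
  count: number;
  p50: number;
  p95: number;
}

function percentile(sorted: number[], p: number): number {
  const index = Math.min(sorted.length - 1, Math.floor((p / 100) * sorted.length));
  return sorted[index];
}

function rollupByMinute(samples: TimingSample[]): MinuteRollup[] {
  const buckets = new Map<number, number[]>();
  for (const sample of samples) {
    const minute = Math.floor(sample.timestamp / 60_000);
    const bucket = buckets.get(minute) ?? [];
    bucket.push(sample.loadEventMs);
    buckets.set(minute, bucket);
  }
  return [...buckets.entries()].map(([minute, values]) => {
    const sorted = [...values].sort((a, b) => a - b);
    return {
      minute,
      count: sorted.length,
      p50: percentile(sorted, 50),
      p95: percentile(sorted, 95),
    };
  });
}
```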

Visualization and Alerting

  1. RUM dashboards spotlight application areas needing optimization via detailed historical charts, session replays and real-time monitoring.

  2. Alert rules trigger notifications on detection of performance degradations or user experience regressions that require investigation.
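
As a sketch of such an alert rule, the check below flags any minute whose p95 load time exceeds a baseline by a configurable factor, reusing the per-minute rollup shape from the processing sketch above. The 1.5x factor and sample values are illustrative.

```typescript
// Simplified alert rule: flag any minute whose p95 load time exceeds a
// baseline by a configurable factor. The 1.5x factor and sample values
// are illustrative.

interface MinuteRollup {   // same shape as in the processing sketch above
  minute: number;
  count: number;
  p50: number;
  p95: number;
}

function findRegressions(
  baselineP95Ms: number,
  rollups: MinuteRollup[],
  factor = 1.5,
): MinuteRollup[] {
  return rollups.filter((rollup) => rollup.p95 > baselineP95Ms * factor);
}

// Example: alert if any recent minute is 50% worse than last week's p95 baseline.
const recentRollups: MinuteRollup[] = [
  { minute: 29_000_000, count: 420, p50: 1200, p95: 2100 },
  { minute: 29_000_001, count: 415, p50: 1250, p95: 3400 },
];
const regressions = findRegressions(1800, recentRollups);
if (regressions.length > 0) {
  console.warn(`p95 load time regression in ${regressions.length} minute bucket(s)`);
}
```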

RUM can either run as an on-premise solution, with the enterprise collecting and owning all user analytics, or leverage externally hosted Software-as-a-Service platforms. Each approach balances costs, data security, compliance and ease-of-deployment considerations differently.

Who Can Benefit From RUM?

The rich production usage visibility RUM offers aids multiple personas:

Site reliability engineers rapidly identify and fix infrastructure bottlenecks and backend outages degrading visitor application performance based on granular operational metrics.

Software developers pinpoint front end code enhancements needed using clear visibility into which application modules, journeys and API integrations frustrate users most through excessive latency, errors or complexity.

Product owners verify, using historical comparisons, that application changes improve rather than regress real visitor outcomes once released.

Digital marketers discover through segmented analytics which campaign landing pages inadvertently hamper conversions through delays or failures in key workflows like checkout.

Executives track the return on web performance investments using calibrated insight into how customer experience improvements lift business metrics.

For all these stakeholders, RUM analytics offers an objective source of truth on production application effectiveness and on continuous improvement opportunities, grounded in real visitor perspectives.

Challenges and Considerations With Real User Monitoring

Despite the rich insights RUM unlocks, incorporating enhanced production user observability introduces important considerations around:

  • Data protection – Collecting extensive visitor usage statistics raises privacy concerns that require careful handling under regulations such as GDPR, along with encryption and access controls to prevent data loss.

  • System overhead – Injected page tags consume client-device CPU cycles the application itself needs, which especially impacts mobile visitors and calls for a minimalist scripting approach (see the sketch after this list).

  • Analytics complexity – The multitude of captured metrics, such as page timings, transfer speeds and error codes experienced by individual users across sessions, must be carefully aggregated and normalized to yield clear operational insights.

  • Actionable reporting – Reporting must align to business KPIs such as user funnel drop-off rates or regional load-time differences, not just raw technology metrics.
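
To illustrate the overhead and data protection points above, the sketch below shows two common client-side mitigations a RUM tag might apply: sampling only a fraction of sessions and redacting obviously sensitive fields before a beacon leaves the browser. The sample rate, field list and endpoint are illustrative assumptions.

```typescript
// Two common client-side mitigations, with illustrative values:
// (1) sample only a fraction of sessions to limit overhead, and
// (2) redact obviously sensitive fields before a beacon leaves the browser.

const SAMPLE_RATE = 0.1;   // instrument roughly 10% of sessions (illustrative)

// Decide once per page load whether this session is instrumented at all.
const sessionSampled = Math.random() < SAMPLE_RATE;

const SENSITIVE_KEYS = ["email", "password", "token", "card"];

function redact(payload: Record<string, unknown>): Record<string, unknown> {
  const clean: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(payload)) {
    clean[key] = SENSITIVE_KEYS.some((s) => key.toLowerCase().includes(s))
      ? "[redacted]"
      : value;
  }
  return clean;
}

function maybeSend(payload: Record<string, unknown>): void {
  if (!sessionSampled) return;   // unsampled sessions send nothing
  navigator.sendBeacon("/rum/collect", JSON.stringify(redact(payload)));
}
```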

Sophisticated RUM platforms provide turnkey capabilities in all these areas. But choices between open source and commercial tools, or between extensibility and ease of use, still require consideration based on each application owner's user data risk tolerance, skill levels and reporting needs.

Leading Real User Monitoring Solutions

All major application performance management (APM) vendors offer RUM-based digital experience capabilities, either natively or via integration partnerships. Options that teams should evaluate include:

Datadog – Specialist SaaS solution emphasizing visualizing web vitals and mapping detailed user journeys.

Dynatrace – Heavy focus on advanced analytics like AI-assisted visitor cohort segmentation for precision web personalization.

New Relic – Long-time APM player now enriching RUM behavioral analytics via applied machine learning models.

Catchpoint – Leading independent web monitoring platform with enterprise-scale RUM extract, transform and load pipelines.

Open source RUM – Lightweight open source JavaScript agents providing basic RUM with back-end integration flexibility.

When assessing solutions, key capabilities like configurability of data flows, compliance support and tying user metrics to business outcomes all require due diligence beyond baseline technical feature checklists.

The Necessity of a User-Centric Lens

Real user monitoring delivers an invaluable user-centric lens on the true quality of the digital experiences that modern websites, mobile and web applications deliver to customers. By directly observing production visitor sessions, RUM provides the contextual insights DevOps, SRE and application teams need to expose and then optimize the reliability, speed and engagement of online services. Blending RUM with synthetic monitoring offers a complete view of both simulated and real-world performance, driving continuous improvements that balance application owners' needs from code quality to business outcomes.