Types of Web Speed Measurements: Synthetic, Chrome UX Report, and Real User Monitoring
Let's clarify the differences between various types of web speed measurements. We’ll discuss synthetic data (“synth”), Google user data (“CrUX”), and data from all users (“RUM”).
These measurements fall under automated web speed monitoring, which we highly recommend you keep active.
CrUX as the Foundation, Synth as a Necessary Complement
The foundation is always to have data from Google users (Chrome UX Report, CrUX), which reflects real user experience well and also influences organic search traffic and cost-per-click in Google Ads.
We supplement Google user numbers in our speed tester with synthetic measurements (known as “synth”), which we gather daily, allowing us to monitor metric values or identify specific issues.
Synthetic and CrUX are indispensable partners.
For larger websites or in justified cases, we supplement CrUX and synthetic data for web speed consulting with monitoring of all users (Real User Monitoring, RUM).
Synthetic Data (Synth)
Data is machine-gathered, for example, using Lighthouse or WebpageTest.org.
- Advantage: Detailed speed reports.
- Disadvantage: It’s only a report for the first load, so user metrics like INP or CLS are absent or inaccurate.
In our monitoring, we use our own Lighthouse for testing.
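Since Lighthouse writes its results as JSON, the lab metrics are easy to pull out programmatically. A minimal Python sketch: the audit IDs follow the public Lighthouse report format, while the `sample` report fragment below is fabricated purely for illustration.

```python
def extract_lab_metrics(report: dict) -> dict:
    """Pull a few lab metrics out of a Lighthouse JSON report.

    Metric values live under audits[<id>]["numericValue"], in
    milliseconds (CLS is a unitless score).
    """
    audits = report["audits"]
    return {
        "LCP_ms": audits["largest-contentful-paint"]["numericValue"],
        "CLS": audits["cumulative-layout-shift"]["numericValue"],
        "TBT_ms": audits["total-blocking-time"]["numericValue"],
    }

# Fabricated report fragment, just to show the shape:
sample = {
    "audits": {
        "largest-contentful-paint": {"numericValue": 1570.0},
        "cumulative-layout-shift": {"numericValue": 0.01},
        "total-blocking-time": {"numericValue": 120.0},
    }
}
print(extract_lab_metrics(sample))
# {'LCP_ms': 1570.0, 'CLS': 0.01, 'TBT_ms': 120.0}
```

Note that these are lab values for a single load on a single device; as discussed above, they should be read as trend indicators rather than domain-wide truth.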
Data from Google Users (CrUX)
Data from the Chrome UX Report (CrUX for short) comes directly from Google Chrome users.
- Advantage: Includes all Core Web Vitals metrics directly from users.
- Disadvantage: Data is a rolling aggregate over the last 28 days, and for many pages it is available only at the origin (whole-domain) level, not for specific URLs.
In our PLUS monitoring, we download data for individual pages from the CrUX API. Monthly graph data in the “Domains” report comes from the Chrome UX Report on BigQuery.
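Querying the CrUX API for a single page looks roughly like this. This is a hedged, standard-library-only Python sketch: the endpoint and metric names follow Google's public CrUX API documentation, while the API key and the `sample` response fragment are placeholders, and the live call is not executed here.

```python
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def crux_request_body(url: str, form_factor: str = "PHONE") -> dict:
    """Body for a single-URL CrUX query.

    Use an "origin" field instead of "url" to get whole-domain data.
    """
    return {"url": url, "formFactor": form_factor}

def p75(response: dict, metric: str):
    """Read the 75th-percentile value for a metric from a CrUX response."""
    return response["record"]["metrics"][metric]["percentiles"]["p75"]

def query_crux(url: str, api_key: str) -> dict:
    """Fire the actual API call (needs a real key; not run in this sketch)."""
    req = urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=json.dumps(crux_request_body(url)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Fabricated response fragment mirroring the documented shape:
sample = {"record": {"metrics": {
    "largest_contentful_paint": {"percentiles": {"p75": 2200}},
    "interaction_to_next_paint": {"percentiles": {"p75": 310}},
}}}
print(p75(sample, "largest_contentful_paint"))  # 2200
```

Because the API only returns data for pages with enough traffic, a production version also has to handle 404 responses for low-traffic URLs.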
Data from All Users (RUM)
So-called Real User Monitoring (RUM) can collect information from all users or a sample.
- Advantage: Data is collected via JavaScript on your own pages, so it is available immediately, without CrUX’s 28-day lag.
- Disadvantage: It’s expensive, complex to implement and evaluate, and returns different values than CrUX.
Currently, PageSpeed.ONE monitoring does not collect RUM data. We recommend collecting this data only for larger sites or if you have specific reasons (e.g., optimizing the INP metric) and use SpeedCurve RUM for these purposes.
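One practical detail when comparing RUM with CrUX: CrUX reports the 75th percentile, so raw RUM samples have to be aggregated the same way before the numbers are comparable. A toy sketch with fabricated INP samples, using the simple nearest-rank percentile method (one of several reasonable choices):

```python
import math

def p75(samples):
    """75th percentile, nearest-rank method.

    Close enough for sanity-checking RUM numbers against CrUX,
    which also reports p75.
    """
    ranked = sorted(samples)
    return ranked[math.ceil(0.75 * len(ranked)) - 1]

# Fabricated INP samples in milliseconds:
inp_samples_ms = [40, 56, 72, 88, 104, 136, 200, 310]
print(p75(inp_samples_ms))  # 136
```

Even with identical aggregation, the populations still differ (sampling, consent, device mix), which is one reason RUM and CrUX rarely agree exactly.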
Differences Between Specific Metrics
Broadly speaking, the differences in Web Vitals collection across synthetic, CrUX, and RUM are as follows:
- LCP (Loading Speed) – collected in much the same way everywhere. In synthetic measurements, however, it comes from a handful of individual runs (or their medians), so it is a poor proxy for the domain as a whole.
- CLS (Layout Shift) – can show significant differences. In synthetic, only the first page load is usually measured. In RUM, CLS calculation stops when data is sent. CrUX, however, continues measuring until the user switches windows.
- INP (Interaction Speed) – can show significant differences too. INP is typically not measured in synthetic data, whereas it is cumulatively measured in RUM and CrUX for specific pages.
It's important to note that data varies across different measurement tools and is sensitive to their settings.
Additionally, you’ll get varied results for MPA applications (each page loads from the server) versus SPA (only the first page loads from the server). Simply put, for SPA, both CLS and INP are measured cumulatively for the entire session. See also the article on SPA and Web Vitals on web.dev.
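A toy model makes the SPA effect on CLS concrete: the same layout shifts counted per page load (MPA) versus accumulated over one long-lived page (SPA). This is deliberately simplified; real CLS takes the worst session window of shifts, not a plain sum, but the direction of the effect is the same.

```python
# Fabricated layout-shift scores for three "views" the user visits in a row.
shifts_per_view = [[0.01, 0.02], [0.03], [0.02, 0.01]]

# MPA: each view is a separate page load, so each reports its own CLS.
mpa_cls = [round(sum(view), 2) for view in shifts_per_view]

# SPA: one page load with soft navigations, so shifts keep accumulating.
spa_cls = round(sum(sum(view) for view in shifts_per_view), 2)

print(mpa_cls, spa_cls)  # [0.03, 0.03, 0.03] 0.09
```

The MPA never reports more than 0.03 per page, while the SPA session reports 0.09 for the "same" experience, which is why SPA field metrics tend to look worse.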
Sample Differences Between Metrics
Let's look at an example from one of our clients for whom we conducted a RUM analysis to see how metrics can vary across measurements:
| | synth | RUM | CrUX |
|---|---|---|---|
| LCP | 1.57 s | 1.66 s | 2.22 s |
| CLS | 0.01 | 0 | 0.05 |
| INP | - | 136 ms | 310 ms |
Several notes on the table:
- The worse LCP in CrUX (data from the Chrome UX Report) compared to RUM can be explained by slower devices, since mobile CrUX data is collected only on Android.
- Synthetic is limited to only a few tests on a few devices, so we don’t take its values too seriously, but it helps us monitor changes on the web, see for instance the Watchdog report.
- The INP metric is not collected by synthetic measurements. In CrUX, INP accumulates across soft navigations in SPA-type applications, so it will typically be worse there.
- Similarly, CLS can be worst in CrUX. It is calculated throughout the entire page usage, not just at the first load (synthetic) or until measurement data is sent (RUM).
If you're interested in the topic, check out the article on the differences between CrUX and RUM according to web.dev.