“Technical” Report

The “Technical” report doesn’t focus on user metrics but rather on technical indicators like data volume or request count.

We test every URL you enter in the test settings, and each day we take the median of those test runs to produce the figures shown here.

[Figure: Technical report, overall data volume] An occasional blip is no cause for concern, but if a decline persists and doesn’t bounce back, it’s a cue for reflection.

While we don’t treat technical indicators as primary metrics, they can help you identify the root causes of problems with user metrics: the Core Web Vitals reported by the Watchdog or seen in Google’s data (Chrome UX Report) in the Domains or Pages reports.

🔐 The “Technical” report is a feature available only in PLUS tests.

Why Measure Technical Indicators for Web Speed?

In the modern field of web speed optimization, these technical indicators are not typically front and center.

From our experience in web speed consulting, we know that even websites that download large amounts of data can still be fast for users if the initial load is well optimized.

Nevertheless, in our “Technical” report, we do track these secondary indicators. Data volume, request count, and other metrics are worth monitoring for various reasons:

  • They are useful for finding correlations between changes in user metrics (e.g., LCP, CLS, INP) and website development changes (e.g., image data volume or the size of blocking JavaScript).
  • Conserving data volume is courteous towards users, who may have limited mobile data allowances.
  • Often, less data transferred means the website operator saves on infrastructure.

In PLUS tests, you will see the “Technical” report for all page types you include in the test settings.

The graph below shows both the current metric status and its development over time, separated for mobile and desktop.

Based on our consulting experience, we’ve included the following information and reports in the “Technical” report:

HTML Data Volume

The graph shows the development of HTML data volume for individual pages. It reflects the state in which the HTML arrives at the test browser, i.e., after compression such as Gzip or Brotli. The smaller the HTML, the better.

We recommend keeping the size under 20 kB. This affects loading metrics like FCP or LCP.
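If you want an independent, back-of-the-envelope check of this number, the browser exposes it through the standard Navigation Timing API. A minimal sketch to run in the DevTools console of the page you are interested in (this is not how the report itself is computed):

```ts
// Read the size of the main HTML document from the Navigation Timing API.
const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];

console.log('HTML transferred (compressed body + headers):', nav.transferSize, 'bytes');
console.log('HTML body after Gzip/Brotli compression:', nav.encodedBodySize, 'bytes');
console.log('HTML body after decompression:', nav.decodedBodySize, 'bytes');
```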

Number of DOM Nodes

The development of the number of DOM nodes for individual pages over time. A complex DOM tree complicates JavaScript’s work, potentially affecting metrics like TBT or INP.

We recommend a maximum of 1,500 DOM nodes per page, an ambitious target but one worth striving for.
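For a quick, rough check of a live page, counting element nodes in the DevTools console gets you close to this figure (a sketch; the report’s own counting may differ slightly, e.g., it may or may not include non-element nodes):

```ts
// Count element nodes in the current document (text and comment nodes are ignored).
const domNodeCount = document.querySelectorAll('*').length;
console.log(
  'DOM element count:',
  domNodeCount,
  domNodeCount > 1500 ? '(over the 1,500 target)' : '(within the 1,500 target)'
);
```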

CSS Data Volume

The development of CSS file data volume for individual pages over time. This shows the state after potential Gzip or Brotli compression on your server. CSS typically blocks the first render, so its size impacts metrics like FCP or LCP. Ideally, keep CSS data volume under 50 kB.
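As an independent check on a loaded page, you can sum the transfer size of stylesheets with the Resource Timing API. A sketch that matches files by the .css extension (an approximation; cross-origin files report a size of 0 unless they send a Timing-Allow-Origin header):

```ts
// Sum the compressed transfer size of all CSS files loaded by the page.
const cssBytes = (performance.getEntriesByType('resource') as PerformanceResourceTiming[])
  .filter((r) => r.name.split('?')[0].endsWith('.css'))
  .reduce((sum, r) => sum + r.transferSize, 0);

console.log('CSS transferred:', (cssBytes / 1024).toFixed(1), 'kB (target: under 50 kB)');
```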

Number of Blocking JS

The development of the number of JS files that block the first render. Unlike CSS, JavaScript doesn’t have to be blocking.

The fewer blocking JS files you have, the better for metrics like FCP or LCP.
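A rough way to spot parser-blocking scripts on a page is to look for classic external scripts without async or defer (a sketch that approximates what counts as render-blocking; module scripts are deferred by default and are excluded):

```ts
// Classic external scripts without async/defer block HTML parsing and the first render.
const blockingScripts = document.querySelectorAll(
  'script[src]:not([async]):not([defer]):not([type="module"])'
);

console.log('Blocking JS files:', blockingScripts.length);
blockingScripts.forEach((s) => console.log((s as HTMLScriptElement).src));
```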

JS Data Volume

The development of JS file data volume for individual pages over time. This includes both blocking and non-blocking files. The graph shows the state after potential Gzip or Brotli compression on your server.

Smaller JS means less code for the browser to process and execute, impacting interaction metrics like TBT or INP. Blocking JS also affects FCP and LCP, thus the first render.
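The Resource Timing API again gives a comparable number for a single page load. A sketch that reports both the compressed transfer size and the uncompressed code the browser actually has to parse:

```ts
// Sum JS transfer size (compressed) and decoded size (what the browser parses).
const scripts = (performance.getEntriesByType('resource') as PerformanceResourceTiming[])
  .filter((r) => r.initiatorType === 'script' || r.name.split('?')[0].endsWith('.js'));

const transferred = scripts.reduce((sum, r) => sum + r.transferSize, 0);
const decoded = scripts.reduce((sum, r) => sum + r.decodedBodySize, 0);

console.log('JS transferred:', (transferred / 1024).toFixed(1), 'kB');
console.log('JS to parse and execute:', (decoded / 1024).toFixed(1), 'kB');
```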

Third-party JS Data Volume

The development of third-party JS file data volume for individual pages over time. This category covers JS served from domains other than your main domain, so it may also include your own files hosted elsewhere.

Typically, this includes data volume for external code like analytics tools, chats, A/B testing, ads, etc.

This code also impacts interaction metrics like TBT or INP. Keeping it as small as possible is crucial.
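To split that number by origin yourself, you can compare each script’s hostname with the page’s own. A sketch along the same lines as above (third-party files without a Timing-Allow-Origin header report a size of 0):

```ts
// Sum the transfer size of JS served from hosts other than the current page's host.
const thirdPartyJsBytes = (performance.getEntriesByType('resource') as PerformanceResourceTiming[])
  .filter((r) =>
    (r.initiatorType === 'script' || r.name.split('?')[0].endsWith('.js')) &&
    new URL(r.name).hostname !== location.hostname
  )
  .reduce((sum, r) => sum + r.transferSize, 0);

console.log('Third-party JS transferred:', (thirdPartyJsBytes / 1024).toFixed(1), 'kB');
```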

Font Data Volume

The development of font data volume for individual pages over time. Fonts are typically needed for rendering content and thus impact the LCP metric.

We recommend keeping font data volume under 50 kB.
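A quick check of font weight on a loaded page, matching common font file extensions (a sketch; fonts are usually requested by CSS, so matching by URL is the simplest filter):

```ts
// Sum the transfer size of web font files (WOFF2/WOFF/TTF/OTF).
const fontBytes = (performance.getEntriesByType('resource') as PerformanceResourceTiming[])
  .filter((r) => /\.(woff2?|ttf|otf)(\?|$)/.test(r.name))
  .reduce((sum, r) => sum + r.transferSize, 0);

console.log('Fonts transferred:', (fontBytes / 1024).toFixed(1), 'kB (target: under 50 kB)');
```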

Image Data Volume

The development of image data volume for individual pages over time. Images are often needed for rendering content and can impact the LCP metric.

We recommend keeping image data volume under 100 kB per page.
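The same approach works for images (a sketch; it catches <img> requests plus files with common image extensions, which also covers most CSS background images):

```ts
// Sum the transfer size of image files loaded by the page.
const imageBytes = (performance.getEntriesByType('resource') as PerformanceResourceTiming[])
  .filter((r) => r.initiatorType === 'img' || /\.(png|jpe?g|gif|webp|avif|svg)(\?|$)/.test(r.name))
  .reduce((sum, r) => sum + r.transferSize, 0);

console.log('Images transferred:', (imageBytes / 1024).toFixed(1), 'kB (target: under 100 kB)');
```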

Total Data Volume

The development of total data volume for all downloaded files for individual pages over time. Total data volume might not directly affect metrics and user experience, but we recommend keeping it under 0.5 MB per page.
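And the total for a single page load, combining the HTML document with every downloaded resource (a sketch; resources that are lazy-loaded and not yet requested are not counted):

```ts
// Total compressed bytes: the HTML document plus every resource downloaded so far.
const [navEntry] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];
const resourceBytes = (performance.getEntriesByType('resource') as PerformanceResourceTiming[])
  .reduce((sum, r) => sum + r.transferSize, 0);

const totalMB = (navEntry.transferSize + resourceBytes) / (1024 * 1024);
console.log('Total data volume:', totalMB.toFixed(2), 'MB (target: under 0.5 MB)');
```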

Monitoring technical indicators can greatly assist when you decide to focus on any of the above metrics and specific page types.

What to Do When You See a Deterioration in a Technical Indicator?

First, don’t panic. Check whether user metrics are affected in the same time frame (see the Watchdog, Pages, or Domains reports).

Sometimes you may not be able to influence the deterioration, such as with third-party components. Nevertheless, it’s wise to pay attention to third parties.

[Figure: Technical report, changes in overall data volume] Improving technical indicators is always good news.

In the Lighthouse detail of a test run, you can see the broader context.

Do you still see a problem on your side in the Technical report and, at the same time, a deterioration affecting Core Web Vitals, such as issues in the Watchdog or Domains reports? If so:

  1. Identify the specific change in the graphs. Which page types are affected? Is it the same for mobile and desktop?
  2. Click on the specific value in the graph to access the Lighthouse test, which will show you the measured value with additional context.
  3. Ask developers, marketers, and other team members what changed during the specific period.

On Vzhůru dolů, there’s a complete tutorial on speed tuning using PageSpeed.ONE monitoring.

Schedule a Monitoring Demo