
Improving the Quality of Web Pages with Lighthouse


The Internet gets heavier every year. If we check the state of the web, we can see that the median page on mobile weighs about 1.5 MB, with the majority of that being JavaScript and images. The growing size of websites, together with other factors like network latency, CPU limitations, and render-blocking patterns, contributes to the complicated performance puzzle.

Web performance is all about the users. We want our users to interact with the applications we build in meaningful ways. Performance plays a vital role in the success of a website, be it a social media platform or an online retailer. If users can’t find what they are looking for, they leave the site, and that takes less than a second.

Web performance testing is important at all phases of your development life cycle, from the earliest stages of development up to end-to-end performance testing on your staging environments. There are several highly popular and recommended tools for end-to-end performance testing, such as JMeter and Flood, but this article focuses on early-stage testing, meaning improving the quality of web pages during the development phase. Web performance testing should be moved to the left in the testing process, in other words: shift-left testing.

We often underestimate the effects of testing early in the software development life cycle, as the agile methodology describes. Testing the code regularly with each increment not only helps guarantee the quality of the project but also saves you a lot of time and money.

Early Stage Performance Testing Metrics

We saw how important it is to capture performance issues at the early stages of the development cycle. When it comes to web performance, these are the key metrics we should be looking at:

  1. First Contentful Paint (FCP): Measures the time from navigation to when the browser renders the first bit of content from the DOM. This is an important milestone for users because it provides feedback that the page is actually loading (see the sketch after this list).
  2. First Meaningful Paint (FMP): Measures when the primary content of a page is visible, which reflects how a user perceives the load performance of your page.
  3. Speed Index: A page load performance metric that shows how quickly the contents of a page are visibly populated. The lower the score, the better.
  4. Time to Interactive (TTI): Measures how long it takes a page to become fully interactive.

One more thing to keep in mind: mobile devices have much less CPU power than desktops and laptops, so whenever you profile a page, use CPU throttling to simulate how it performs on mobile devices.
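To make FCP concrete, here is a minimal sketch of how the browser itself exposes this metric through the standard Paint Timing API (this snippet runs in the page, not in Node.js, and is an illustration rather than part of the Lighthouse setup below):

// Log First Contentful Paint using the standard Paint Timing API.
// `buffered: true` also delivers entries recorded before the observer
// was registered.
new PerformanceObserver(list => {
  for (const entry of list.getEntries()) {
    if (entry.name === 'first-contentful-paint') {
      console.log(`FCP: ${entry.startTime.toFixed(0)} ms`);
    }
  }
}).observe({ type: 'paint', buffered: true });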

To fetch these performance metrics, Google has open-sourced Lighthouse, an automated tool for improving the quality of web pages. You can run it against any web page, public or one requiring authentication. It has audits for performance, accessibility, progressive web apps, SEO, and more.

You can run Lighthouse in Chrome DevTools, from the command line, or as a Node module. You give Lighthouse a URL to audit, it runs a series of audits against the page, and then it generates a report on how well the page did.
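For example, a command-line run is a one-liner (a minimal sketch, assuming the lighthouse CLI has been installed globally through npm; --output and --output-path control the report format and location):

npm install -g lighthouse
lighthouse https://example.com --output=html --output-path=./report.html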

Let’s see how we can use Lighthouse from Chrome DevTools. Open Chrome DevTools, navigate to the Audits panel, and run an audit against the page.

In the image below, we can see that the Lighthouse audit has provided some useful information on how the web page performs, along with suggestions on how to improve it:

Lighthouse Audit - Performance Testing

Now, let’s see how we can get this information as part of a continuous deployment pipeline. Lighthouse provides a Node.js module which can be used to fetch the audit metrics.

Lighthouse’s architecture is built around the Chrome DevTools Protocol, a set of low-level APIs to interact with a Chrome instance. Lighthouse interfaces with a Chrome instance through the Driver. Gatherers collect data from the page using the Driver. The output of a Gatherer is an Artifact, a collection of grouped metrics. An Artifact is then used by an Audit to test for a metric: Audits assert and assign a score to a specific metric. The output of the Audits is used to generate the Lighthouse report that we are familiar with.
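To make the Audit concept concrete, here is a minimal custom audit sketch (a simplified, hypothetical example: the PageLoadTime artifact would have to be produced by a custom Gatherer, and the audit registered through a custom Lighthouse config):

// A hypothetical audit that scores a page on its load time.
const { Audit } = require('lighthouse');

class LoadAudit extends Audit {
  static get meta() {
    return {
      id: 'load-audit',
      title: 'Page loaded quickly',
      failureTitle: 'Page loaded slowly',
      description: 'Asserts that the page loads in under 3 seconds.',
      // Provided by a custom Gatherer (not shown here).
      requiredArtifacts: ['PageLoadTime'],
    };
  }

  static audit(artifacts) {
    const loadTime = artifacts.PageLoadTime;
    // Scores are in the 0–1 range; 1 means the audit passed.
    return {
      score: loadTime < 3000 ? 1 : 0,
      numericValue: loadTime,
    };
  }
}

module.exports = LoadAudit;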

Lighthouse Architecture - Performance Testing

Let’s create a simple Node.js project and add the Lighthouse dependency:
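A minimal setup might look like this (assuming npm is available; taiko and lighthouse are the npm packages used in the script below):

npm init -y
npm install taiko lighthouse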


We use Taiko, a Node.js module that will help us drive the Chrome browser:

const {
  openBrowser,
  goto,
  currentURL,
  closeBrowser,
  client
} = require('taiko');
const lighthouse = require('lighthouse');
const config = require('lighthouse/lighthouse-core/config/lr-desktop-config.js');
const ReportGenerator = require('lighthouse/lighthouse-core/report/report-generator');
const fs = require('fs');

(async () => {
  try {
    await openBrowser();
    await goto('taiko.dev');
    const url = await currentURL();
    // Extract the debugging port from the WebSocket URL of the Chrome
    // instance that Taiko launched (e.g. ws://127.0.0.1:9222/devtools/...),
    // so Lighthouse can attach to the same browser.
    const port = client()
      .webSocketUrl.split('/devtools/')[0]
      .replace('ws://', '')
      .split(':')[1];
    // Run the audits against the current page.
    const result = await lighthouse(
      url,
      {
        port,
        output: 'html',
        logLevel: 'error'
      },
      config
    );
    // result.lhr is the Lighthouse result object; render it as HTML.
    const report = ReportGenerator.generateReport(result.lhr, 'html');
    fs.writeFile('audit.html', report, err => {
      if (err) throw err;
    });
  } catch (error) {
    console.error(error);
  } finally {
    await closeBrowser();
  }
})();

Running this code will create an audit.html report for the given URL. We now have a running Lighthouse project improving the quality of our web page! 🎉
Let’s see how we can capture metrics from Lighthouse in the next blog – Coming Soon 😎


About the author

Sai Krishna

I work at ThoughtWorks as a Lead Consultant with 8 years of experience. Over the course of my career, I have worked on testing different mobile applications and building automation frameworks. I am an active contributor to Appium and a member of the Appium organization.

I love contributing to open-source technologies and am passionate about new ways of thinking.
