pwmetrics
In favour of better support and many cool features, consider these alternatives:
- Lighthouse CI: a suite of tools that makes continuously running, saving, retrieving, and asserting against Lighthouse results as easy as possible.
- Lighthouse CI Action: integrates Lighthouse CI with the GitHub Actions environment, making it simple to see failed tests, upload results, run jobs in parallel, store secrets, and interpolate env variables.
- Treo.sh: page speed monitoring made simple.
Progressive web metrics at your fingertipz. 💅
CLI tool and lib to gather performance metrics via Lighthouse.
Documentation on these metrics is in the works. If you hit bugs in the metrics collection, report them at Lighthouse issues. See the How to use article for a walkthrough.
```sh
$ yarn global add pwmetrics
# or
$ yarn add --dev pwmetrics
```

```sh
$ pwmetrics http://example.com/
```
--runs=n  Does n runs (e.g. 3, 5) and reports the median run's numbers. The median run is the one with the median TTI.

```sh
pwmetrics http://example.com/ --runs=3
```
--json  Reports JSON details to stdout.

```sh
pwmetrics http://example.com/ --json
```
returns...

```
{
  "runs": [{
    "timings": [
      {
        "name": "First Contentful Paint",
        "value": 289.642
      },
      {
        "name": "Largest Contentful Paint",
        "value": 292
      },
      ...
```
--output-path  File path to save results (a sketch of reading a saved report follows these options).

```sh
pwmetrics http://example.com/ --output-path='pathToFile/file.json'
```
--config  Provide configuration (defaults to package.json). See Defining config below.

```sh
pwmetrics --config=pwmetrics-config.js
```
--submit  Submit results to Google Sheets. See Defining submit below.

```sh
pwmetrics --submit
```

--upload  Upload Lighthouse traces to Google Drive. See Defining upload below.

```sh
pwmetrics --upload
```

--view  View Lighthouse traces, which were uploaded to Google Drive, in DevTools. See Defining view below.

```sh
pwmetrics --view
```
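When results are written as JSON (via --json together with --output-path), they can be post-processed in Node. Below is a minimal sketch, assuming the report shape shown above for --json and the example file path; the script name is illustrative and not part of pwmetrics:

```js
// read-results.js (illustrative name): reads a report saved via
// `pwmetrics http://example.com/ --json --output-path='pathToFile/file.json'`
// and prints every timing of every run, assuming the shape shown above.
const fs = require('fs');

const report = JSON.parse(fs.readFileSync('pathToFile/file.json', 'utf8'));

for (const run of report.runs) {
  for (const timing of run.timings) {
    console.log(`${timing.name}: ${timing.value}`);
  }
}
```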
CLI options useful for CI
--expectations  Assert metrics results against provided values. See Defining expectations below.

```sh
pwmetrics --expectations
```

--fail-on-error  Exit pwmetrics with an error status code after the first unmet expectation.

```sh
pwmetrics --fail-on-error
```
```sh
# run pwmetrics with config in package.json
pwmetrics --config
```

package.json

```
...
"pwmetrics": {
  "url": "http://example.com/",
  // other configuration options
}
...
```
```sh
# run pwmetrics with config in pwmetrics-config.js
pwmetrics --config=pwmetrics-config.js
```

pwmetrics-config.js

```js
module.exports = {
  url: 'http://example.com/',
  // other configuration options. Read "All available configuration options" below
}
```
All available configuration options

pwmetrics-config.js

```js
const METRICS = require('pwmetrics/lib/metrics');

module.exports = {
  url: 'http://example.com/',
  flags: { // AKA feature flags
    runs: 3, // number of runs
    submit: true, // turn on submitting to Google Sheets
    upload: true, // turn on uploading to Google Drive
    view: true, // open uploaded traces to Google Drive in DevTools
    expectations: true, // turn on asserting metrics results against provided values
    json: true, // not required, set to true if you want json output
    outputPath: 'stdout', // not required, only needed if you have specified json output, can be "stdout" or a path
    chromePath: '/Applications/Google\ Chrome\ Canary.app/Contents/MacOS/Google\ Chrome\ Canary', // optional path to a specific Chrome location
    chromeFlags: '', // custom flags to pass to Chrome. For a full list of flags, see http://peter.sh/experiments/chromium-command-line-switches/.
                     // Note: pwmetrics supports all flags from Lighthouse
    showOutput: true, // not required, set to false if you don't want pwmetrics to output any console.log messages
    failOnError: false // not required, set to true if you want to fail the process on expectation errors
  },
  expectations: {
    // these expectation values are examples; set your own for your case
    // it's not required to use all metrics, you can use just a few of them
    // Read "Available metrics" below, where all keys are defined
    [METRICS.TTFCP]: {
      warn: '>=1500',
      error: '>=2000'
    },
    [METRICS.TTLCP]: {
      warn: '>=2000',
      error: '>=3000'
    },
    [METRICS.TTI]: { ... },
    [METRICS.TBT]: { ... },
    [METRICS.SI]: { ... },
  },
  sheets: {
    type: 'GOOGLE_SHEETS', // sheets service type. Available types: GOOGLE_SHEETS
    options: {
      spreadsheetId: 'sheet_id',
      tableName: 'data',
      uploadMedian: false // not required, set to true if you want to upload only the median run
    }
  },
  clientSecret: {
    // Data object. Can be obtained either
    // by applying everything in step 1 here: https://developers.google.com/sheets/api/quickstart/nodejs#step_1_turn_on_the_api_name
    // or
    // by applying everything in step 1 here: https://developers.google.com/drive/v3/web/quickstart/nodejs
    //
    // example format:
    //
    // installed: {
    //   client_id: "sample_client_id",
    //   project_id: "sample_project_id",
    //   auth_uri: "https://accounts.google.com/o/oauth2/auth",
    //   token_uri: "https://accounts.google.com/o/oauth2/token",
    //   auth_provider_x509_cert_url: "https://www.googleapis.com/oauth2/v1/certs",
    //   client_secret: "sample_client_secret",
    //   redirect_uris: [
    //     "url",
    //     "http://localhost"
    //   ]
    // }
  }
}
```
Recipes for using with CI
```sh
# run pwmetrics with config in package.json
pwmetrics --expectations
```

package.json

```
...
"pwmetrics": {
  "url": "http://example.com/",
  "expectations": {
    ...
  }
}
...
```

```sh
# run pwmetrics with config in pwmetrics-config.js
pwmetrics --expectations --config=pwmetrics-config.js
```
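The same recipe can also be driven from a Node script using the programmatic API shown further below. A minimal sketch; the script name and the threshold values are illustrative:

```js
// ci-perf-check.js (illustrative name): run pwmetrics programmatically and
// fail the CI job when an expectation error occurs.
const PWMetrics = require('pwmetrics');
const METRICS = require('pwmetrics/lib/metrics');

const pwMetrics = new PWMetrics('http://example.com/', {
  flags: {
    runs: 3,
    expectations: true, // assert results against the values below
    failOnError: true   // exit with an error status on the first unmet expectation
  },
  expectations: {
    // threshold values are illustrative; set your own
    [METRICS.TTFCP]: { warn: '>=1500', error: '>=2000' },
    [METRICS.TTI]: { warn: '>=3000', error: '>=5000' }
  }
});

pwMetrics.start()
  .catch((err) => {
    console.error(err);
    process.exit(1);
  });
```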
Submit results to Google Sheets
Instructions:
- Put the id of your spreadsheet into the config as the value of the sheets.options.spreadsheetId property.
- Get client_secret and put it into the config as the value of the clientSecret property.
```sh
# run pwmetrics with config in package.json
pwmetrics --submit

# run pwmetrics with config in pwmetrics-config.js
pwmetrics --submit --config=pwmetrics-config.js
```

pwmetrics-config.js

```js
module.exports = {
  'url': 'http://example.com/',
  'sheets': { ... },
  'clientSecret': { ... }
}
```
Upload Lighthouse traces to Google Drive
Instructions:
- Get client_secret and put it into the config as the value of the clientSecret property.
```sh
# run pwmetrics with config in package.json
pwmetrics --upload

# run pwmetrics with config in pwmetrics-config.js
pwmetrics --upload --config=pwmetrics-config.js
```

pwmetrics-config.js

```js
module.exports = {
  'url': 'http://example.com/',
  'clientSecret': { ... }
}
```
Show Lighthouse traces in timeline-viewer.
Requires the upload flag.
timeline-viewer - Shareable URLs for your Chrome DevTools Timeline traces.
```sh
# run pwmetrics with config in package.json
pwmetrics --upload --view

# run pwmetrics with config in your-own-file.js
pwmetrics --upload --view --config=your-own-file.js
```

pwmetrics-config.js

```js
module.exports = {
  'url': 'http://example.com/',
  'clientSecret': { ... }
}
```
Available metrics

All metrics are now stored in a separate constant object located in pwmetrics/lib/metrics:

```ts
// lib/metrics/metrics.ts
{
  METRICS: {
    TTFCP: 'firstContentfulPaint',
    TTLCP: 'largestContentfulPaint',
    TBT: 'totalBlockingTime',
    TTI: 'interactive',
    SI: 'speedIndex'
  }
}
```
Read the article Performance metrics. What's this all about?, which decodes these metrics.
```js
const PWMetrics = require('pwmetrics');

const options = {
  flags: {
    runs: 3, // number of runs
    submit: true, // turn on submitting to Google Sheets
    upload: true, // turn on uploading to Google Drive
    view: true, // open uploaded traces to Google Drive in DevTools
    expectations: true, // turn on asserting metrics results against provided values
    chromeFlags: '--headless' // run in headless Chrome
  }
};

const pwMetrics = new PWMetrics('http://example.com/', options); // all available configuration options can be used as `options`
pwMetrics.start(); // returns a Promise
```
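A minimal sketch of reusing the API above to collect metrics for several pages one after another; the URL list and the loop are illustrative additions:

```js
const PWMetrics = require('pwmetrics');

const urls = ['http://example.com/', 'http://example.com/other-page']; // illustrative URLs
const options = { flags: { runs: 3, chromeFlags: '--headless' } };

(async () => {
  for (const url of urls) {
    // one PWMetrics instance per page; start() returns a Promise
    await new PWMetrics(url, options).start();
  }
})().catch((err) => {
  console.error(err);
  process.exit(1);
});
```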
| Option | Type | Default | Description |
| --- | --- | --- | --- |
| flags* | Object | `{ runs: 1, submit: false, upload: false, view: false, expectations: false, disableCpuThrottling: false, chromeFlags: '' }` | Feature flags |
| expectations | Object | `{}` | See Defining expectations above. |
| sheets | Object | `{}` | See Defining submit above. |
| clientSecret | Object | `{}` | Client secret data generated by the Google API console. To set up a Google Developer project and get credentials, apply everything in step 1 here. |
*pwmetrics supports all flags from Lighthouse. See here for the complete list.
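For example, a minimal sketch assuming Lighthouse flag names are passed straight through the flags object, as the disableCpuThrottling default above suggests:

```js
const PWMetrics = require('pwmetrics');

const pwMetrics = new PWMetrics('http://example.com/', {
  flags: {
    runs: 2,
    disableCpuThrottling: true, // a Lighthouse flag, listed in the defaults above
    chromeFlags: '--headless'
  }
});

pwMetrics.start();
```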
Apache 2.0. Google Inc.