We get that page speed matters, but practically optimizing a website for performance is hardly as straightforward as “let’s make this shit blazing fast.” There’s more to it. We know, for instance, that the order in which elements load may matter more than total page load time, but optimizing for that is even harder to get right. These efforts accrue real technical debt, which means they cost real money. For folks whose budgets, talent, and time are constrained, we need to be able to determine where cranking that speedometer has the most bang for the buck: where speed matters most, and where it doesn’t (gasp).

The Activity Impact Score introduced by Tammy Everts for SOASTA measures the impact page speed has on how long people spend on your site. It compares a performance metric [1] like load time in milliseconds against session length, which can be a useful indicator that people are consuming content: longer sessions mean likelier discovery of new events, new and old services, cool repos, archives, all the myriad things (let’s say) that libraries do and their patrons forget.

Pages are grouped into content types [2] such as lists, events, searches, and landing pages. For each group, the proportion of overall requests associated with that group is combined with the Spearman rank correlation between its pages’ load times and users’ session lengths to calculate an activity impact score [3] on a scale from -1 (low impact) to 1 (high impact).
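To make the mechanics concrete, here’s a minimal sketch in Python of how a score like this could be computed from a per-page-view analytics export. The exact formula isn’t spelled out here, so the way the traffic share and the correlation get combined (simple multiplication), the column names, and the path-based grouping rules are all assumptions for illustration, not SOASTA’s method.

```python
# A sketch of an activity-impact-style score, assuming a per-page-view
# export with hypothetical columns: "path" (e.g. "/", "/search?q=hours"),
# "load_time_ms", and "session_length_s".
import pandas as pd
from scipy.stats import spearmanr

def content_type(path: str) -> str:
    """Bucket a path into a content-type group with naive rules (assumed)."""
    path = path.split("?")[0]
    if path.rstrip("/") == "":
        return "homepage"
    if path.startswith("/search"):
        return "search"
    if path.startswith("/events"):
        return "events"
    return "about"  # informational pages: parking, policies, and the like

def activity_impact_scores(views: pd.DataFrame) -> pd.Series:
    views = views.assign(group=views["path"].map(content_type))
    # Each group's share of overall requests.
    share = views["group"].value_counts(normalize=True)
    scores = {}
    for name, group in views.groupby("group"):
        # Spearman rank correlation between load time and session length.
        rho, _p = spearmanr(group["load_time_ms"], group["session_length_s"])
        # Weight the correlation by the group's traffic share
        # (an assumed way of combining the two).
        scores[name] = rho * share[name]
    return pd.Series(scores).sort_values(ascending=False)
```

Feed it something like `activity_impact_scores(pd.read_csv("page_views.csv"))` and you get groups ranked by score, which is roughly the shape of the chart below.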

The bar chart represents relative activity impact scores and the line represents load time in milliseconds.

Higher scores (the homepage, search, and subject guides) demonstrate a greater correlation between page speed and session length. So we can use the example above to determine that our “about” page group, informational pages where I threw in parking, policies, and the like, has a relatively low activity impact score despite fast load times; these kinds of pages don’t benefit all that much from really cranking it up.

And although we might hear the carrion call of those databases, baking pitifully in the lag desert, the score of our homepage shows its speed has way more impact on how long people hang around. Our time, then, is better spent doting there, leaving the poorer scorers to choke in the dust a little longer.


  1. I wrote a thing for Weave: Journal of Library User Experience about Meaningfully Judging Performance in Terms of User Experience.
  2. With some headscratching I managed to group pages with Google Analytics, but a tool like mPulse (from the folks who brought you the Activity Impact Score) might well be easier.
  3. The activity impact score uses a method similar to that of the conversion impact score, which Tammy explains better than I can.


Michael Schofield is a service and user-experience designer specializing in libraries and the higher-ed web. He is a co-founding partner of the Library User Experience Co., a developer at Springshare, a librarian, and part of the leadership team for the Practical Service Design community.