Exciting new ideas from the web performance hackathon

What’s the best way to get your team to develop new, innovative solutions?

A few years ago, we started running regular hackathons – intensive two-day events during which developers race against the clock to build new products (or extend existing ones) to solve a given web performance problem.

Recently, we decided to do more to harness all the talent in the office, and we widened the net, inviting representatives from every department in the web performance division, from customer support to sales.

We weren’t disappointed with the result, and the hackathon we ran this January yielded some of the most original ideas we’ve seen to date.

Here’s a quick round-up of what they came up with…

Team A


This team was all about making web performance data more visible, easier to interpret and more convenient to access. Businesses tend to care more about how their sites are performing when key individuals across departments can see and understand what that means.

The team used our recently developed API to take data from scheduled tests in Performance Analyser and present it in different ways on a variety of custom dashboards, each one carefully designed for different kinds of user.
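The core of that idea can be sketched as a small shaping step between the API response and each dashboard: keep only the metrics a given audience cares about. Everything below is illustrative – the profile names and field names are assumptions, not the real Performance Analyser API schema.

```javascript
// Hypothetical audience profiles: which metrics each dashboard shows.
// These names are illustrative, not the real API's field names.
const DASHBOARD_PROFILES = {
  executive: ['speedIndex', 'pageLoadTime'],
  developer: ['speedIndex', 'pageLoadTime', 'firstByteTime', 'requestCount'],
};

// Reduce one scheduled-test result to the metrics a given audience needs.
function shapeForDashboard(testResult, audience) {
  const wanted = DASHBOARD_PROFILES[audience] ?? [];
  return Object.fromEntries(
    wanted
      .filter((metric) => metric in testResult)
      .map((metric) => [metric, testResult[metric]])
  );
}
```

Each custom dashboard would then render only its own shaped slice of the data, rather than every dashboard querying and filtering the full result set itself.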

This could be used to keep track of different aspects of performance over time, including key visual performance metrics such as Speed Index:


…or the performance and scale of third-party content:


Team A didn’t stop there, though. They also built a Chrome extension to highlight image size inefficiency. Again, this was all about making important performance data quick and easy to access.
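One plausible way such an extension could spot inefficiency – a sketch under my own assumptions, not Team A's actual implementation – is to compare an image's intrinsic pixel count with the pixels it actually needs at its rendered size:

```javascript
// Ratio of pixels an image file contains to pixels it needs at its
// rendered size (accounting for device pixel ratio). A ratio of 1 is
// ideal; large ratios mean wasted bytes. The threshold is an assumption.
function imageWasteRatio(naturalW, naturalH, displayW, displayH, dpr = 1) {
  const neededPixels = displayW * dpr * (displayH * dpr);
  const actualPixels = naturalW * naturalH;
  return neededPixels > 0 ? actualPixels / neededPixels : Infinity;
}

function isInefficient(ratio, threshold = 2) {
  return ratio > threshold;
}
```

In a content script, the extension could iterate `document.images`, feed each image's `naturalWidth`/`naturalHeight` and `clientWidth`/`clientHeight` into this check, and visually outline the offenders.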

Team B


This team sought to make data from NCC Group’s Real User Monitoring (RUM) service more accessible by emulating the Chrome extension for Google Analytics. The idea was to present data in a way that a lot of people are already familiar with, keeping it on the screen while users navigate the site, highlighting interesting changes.

This included a button for the browser toolbar that would highlight whether the current page’s performance was better (green) or worse (red) than it was the previous day.
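The badge logic itself is simple to sketch. The neutral colour for an unchanged result is my own addition; only green and red are described above.

```javascript
// Pick a toolbar badge colour by comparing today's page performance
// with yesterday's (e.g. median load time in ms). 'grey' for no change
// is an assumption, not part of the original description.
function badgeColour(todayMs, yesterdayMs) {
  if (todayMs < yesterdayMs) return 'green'; // faster than yesterday
  if (todayMs > yesterdayMs) return 'red';   // slower than yesterday
  return 'grey';
}
```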


Clicking the button displays a popup with a few more details:


Finally, clicking ‘Show More’ in the pop-up brings up the Performance Charts view, anchored to the top of the current web page, showing:

  • Page load duration – yesterday vs today
  • Impressions – yesterday vs today
  • Performance distribution comparison (All URLs/Current URL) – yesterday vs today


Team C


Team C looked at new ways to visualise and interact with RUM data, with a particular focus on drilling down into the key factors affecting performance and conversion. They achieved this through an interactive interface that used groups of bubbles of different colours and sizes to represent the data.


Team D


This team was interested in taking a fresh look at what marketers would like to get out of performance data.

Their concept was a dashboard providing an initial, uncluttered summary of important data clusters, such as:

  • Users who have never been on the site before
  • Returning users
  • Conversion rate for new users
  • Conversion rate for returning users
  • Bounce rate

This would include the ability to compare the data for different time periods, and to drill down into a Sankey chart of performance data for the selected cluster.

Team E


Team E focused on third-party content – this is a common source of performance problems and is also potentially difficult for organisations to keep track of. The team therefore used scheduled jobs in Performance Analyser to produce a timeline showing the third-party resources on a page for a given period.
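The grouping step behind such a timeline can be sketched with Resource Timing-style entries: bucket every request whose hostname falls outside the first-party domain. This is a minimal sketch of the general technique, not Team E's actual code.

```javascript
// Count third-party resources per hostname, given Resource Timing-style
// entries ({ name: url }) and the site's first-party host. Subdomains of
// the first party (e.g. www.example.com) are treated as first-party.
function groupThirdParties(entries, firstPartyHost) {
  const groups = new Map();
  for (const { name } of entries) {
    const host = new URL(name).hostname;
    if (host === firstPartyHost || host.endsWith('.' + firstPartyHost)) {
      continue; // first-party resource
    }
    groups.set(host, (groups.get(host) ?? 0) + 1);
  }
  return groups;
}
```

Run against each scheduled test in turn, these per-host counts would give one point per third party per test run – enough to plot presence and volume over time.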


Team F


Team F did something a bit different. Taking a slightly more liberal interpretation of the brief, this team built a scrolling shooter-style arcade game based on monitoring data. Errors appeared in the form of enemy spacecraft – unfortunately, though, taking out these alien invaders didn’t deal with the errors in the real world.

Team G


This team focused on breaking down barriers between different products, combining the breadth of coverage offered by Real User Monitoring and the more detailed insight available from Performance Analyser.

The idea was to make it easier for customers to investigate anomalies surfaced by RUM. Tests could either be fired off automatically if certain thresholds were breached, or manually, if a user simply wanted to take a more detailed look at a particular page’s performance.
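The trigger decision described above reduces to a simple rule: run a detailed test when any monitored metric breaches its threshold, or when a user asks for one. The function and metric names below are my own illustrative assumptions.

```javascript
// Decide whether to fire off a detailed synthetic test for a page,
// given one RUM sample (metric -> observed value), a thresholds map
// (metric -> limit), and an optional manual request from the user.
function shouldRunDetailedTest(rumSample, thresholds, manualRequest = false) {
  if (manualRequest) return true;
  return Object.entries(thresholds).some(
    ([metric, limit]) => (rumSample[metric] ?? 0) > limit
  );
}
```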


Once again, NCC Group’s web performance hackathon proved an excellent way to generate innovative ideas and fast turnarounds on proofs of concept. It also proved a great way for a number of participants to learn new skills. Prizes were awarded for the best, most developed ideas, and this time first prize went to Team A for a complete solution that helped to deliver the most relevant performance insights to different audiences.

Learn more about Performance Monitoring, Real User Monitoring or Performance Analyser.
