On October 9, results for New Zealand's Local Body Elections were announced. The day saw nearly 80 mayors and hundreds of other local representatives democratically elected. As we were responsible for the elections2010.co.nz website, an official source of election results, this was a big day for us. We began publishing results from noon, the deadline for people to cast their vote. We thought there was a good chance it would attract a little extra traffic from that point forward.

As the screenshot from Dawn™ below shows, we were right about the increased traffic. More specifically, in the 24 hours from noon Saturday to noon Sunday the site recorded:

  • Over 15 million hits,
  • Almost 1 million page views,
  • Over 80,000 visits,
  • Over 40 GB of traffic (notable given the site is mostly text-based, with few images; all video content is served through YouTube and is not included in this figure).

These are big numbers for a site running on a single Linux server, especially given that content was fetched from databases and the website was being updated constantly as results for electorates around the country came in over the course of several hours.
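To put those totals in perspective, here is a rough back-of-the-envelope calculation of the average rates over the 24-hour window, using the rounded figures above (actual peak rates were of course much higher than these averages):

    # Rough averages over the 24-hour reporting window, based on the
    # rounded totals quoted above; actual peaks were much higher.
    SECONDS = 24 * 60 * 60  # 86,400 seconds

    hits = 15_000_000
    page_views = 1_000_000
    traffic_mb = 40 * 1000  # ~40 GB expressed in MB

    print(f"hits per second:       {hits / SECONDS:.0f}")        # ~174
    print(f"page views per second: {page_views / SECONDS:.1f}")  # ~11.6
    print(f"traffic MB per second: {traffic_mb / SECONDS:.2f}")  # ~0.46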

People wanted to know who was winning around the country, so it was vital that elections2010.co.nz stay up to provide those answers. stuff.co.nz, the most popular news website in the country, was linking to the elections website, and was the single biggest source of traffic. That's why we used Dawn, our website monitoring software, to keep an eye on the site.

"Dawn helped us to ensure that people could get access to the results of the elections as soon as they were published,” says our Head of Development Rainer Spittel.

We knew the CMS was going to be updated constantly throughout the day as more results came in, and that CMS updates use far more CPU than ordinary visits to the website. The graph from Dawn above, which covers early morning to midnight on Saturday (the busiest day for the website), shows that the server handled the traffic smoothly from a CPU perspective.

"During peak load, I was using Dawn to compare and monitor present and historical CPU and memory use, and other metrics like page-load times. Dawn presents this information clearly and intuitively, and it means you can make assessments quickly and confidently," adds Rainer.

Despite anticipating heavy traffic, the project team here at SilverStripe didn't have to spend all of Saturday in the office watching the servers, because they knew they'd receive SMS alerts from Dawn if there were any issues. We'd also prepared in advance, naturally.

"Dawn allowed us to forecast how much traffic the server could handle. It made it obvious where some pages were slow when we were building the website, and wouldn’t survive the level of traffic we anticipated for election results day. It focussed our attention to webpages needing optimisation, and as a result, we optimised important pages on the website to a level where a single server could handle over 500 page views per second. Dawn validated that we made the right choices, and we met all our targets in terms of performance, robustness and response time,” says Rainer.

Another great thing about Dawn is that its simple interface means less technical people can use it too, such as our Project Manager, Diana Hennessy: "I was reassured by logging into Dawn during the peak traffic on Saturday afternoon. The system is very intuitive, and it was easy to see that the server was coping brilliantly under the load and that the website was not generating errors or faults."

Best of all, we were able to add Dawn to the site during a normal upgrade in the lead-up to the election, without needing an outage. People had constant access to information about their candidates, ensuring once again that democracy was the winner on the day.
