article on how to measure web page load times. As they rightly say, measuring load times for a web page is both very important and very difficult to do well. I would like to explore some of these factors and add a few considerations of my own that don't always get included under the topic of web site performance.
Build it and they may come
If people have a choice of whether or not to visit your new web site, they may decide the wait is simply too long. Standing in line at the grocery store, we may roll our eyes when the cashier takes too long, but in the same situation on-line, the person "in line" simply hits the back arrow and finds a more responsive site. So load times are in some ways as important as the quality of the content. The easier it is for a visitor to find the same content or service elsewhere, the faster your site has to load.
As bad as it may seem that you are losing people to other sites due to slow load times, the problem becomes even worse if those people are finding your site via a Google search. Google measures user activity to determine what its search users like and don't like: things they like go up, and things they don't like go down. A click from the search results is seen as a valid indicator of value, but when a user quickly returns to the search results to try the next link, Google sees that as evidence that the site at the end of the link was not valuable. And, as you might have concluded, subsequent searches will then list your site lower in the results.
Don't evaluate a site under optimal conditions
One of the most common mistakes I've seen with inexperienced development teams is that the web site owner reviews the site on the developer's computer. It's a natural thing to do -- the site has not been optimized, or the web server may not have been prepared, and all of these reasons are quite valid -- but if you don't look at the site under some sort of realistic conditions while developing it, you may end up with a slow site that cannot be easily fixed. And before you announce your site publicly, you certainly must test it out under a wide range of conditions.
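As a starting point, you can simply time how long a page takes to download from wherever you (or your testers) happen to be. The sketch below is a minimal Python example; the URL shown is a placeholder, and a real measurement would also have to account for images, CSS, and scripts, so treat the result as a lower bound:

```python
import time
import urllib.request

def timed(fn):
    """Run fn() and return (elapsed_seconds, result)."""
    start = time.monotonic()
    result = fn()
    return time.monotonic() - start, result

def fetch_page(url, timeout=10):
    """Download just the HTML of a page. Real pages also pull in
    images, CSS, and scripts, so this understates total load time."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read()

# Example (hypothetical URL -- substitute your own site):
# seconds, body = timed(lambda: fetch_page("https://example.com/"))
# print(f"{len(body)} bytes in {seconds:.2f}s")
```

Running this from several different locations and connection types gives a far more honest picture than a single check on the developer's own machine.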
I was involved in a situation where an office I was a consultant to put up its very first web page for an online learning program -- this was a long time ago -- and they did not check their new web site under normal conditions. They loved their new site because it looked so much better than all the other web sites: they had wall-to-wall, intricately designed graphics. Today that could be accomplished in a reasonable way, but back in the web 1.0 days, there was no way to get such images into a design without very slow load times. And that's what they had, but they didn't know it. So they went to announce their new online learning program by purchasing a $50k+ full-page advertisement in the New York Times Sunday edition. Saturday night I got a call saying the site was down and could I look at it. It didn't take long to see that the site was running but that the graphics were choking the system. What would have happened if large numbers of Times readers had tried to access it on Sunday is yet another issue. It turns out they thought it was broken because the manager went to show his wife the new site and it would not come up. That was the first time the site had been reviewed on any computer other than the graphic designer's laptop. I was able to optimize the images and get the site down to merely slow, but the lesson I saw that night is one I've seen various versions of many times since.
Even if a site loads well under realistic conditions, it's unlikely that you will have hundreds or thousands of simultaneous testers. A properly designed site can still come to a crawl if the server has limited capacity. Usually, it's a limitation of bandwidth that will slow the site first. Any decent server purchased within the last five years is going to have enough capacity to serve even the largest Internet connection, but as people divide their servers into multiple virtual servers, the capacity allocated to the web site might not be sufficient for heavy use. So one way to evaluate the capacity of the server and the Internet connection is to perform a stress test. The cheapest way is to get as many people as possible to use the site at the same time and then see what the levels of used resources are. Divide the resources consumed by the number of testers to get a per-user figure, then divide your total capacity by that figure to estimate how many people could be served. This will give you a basic feel for how many users you might support, but there are so many variables that this number should be considered only an approximation.

It's not an easy calculation even if you do know how many resources a user typically consumes. Users who return to your site multiple times may have your site's graphics in their browser cache: they take a lot of resources on the first visit but not on subsequent visits, because they are no longer downloading your banners and other graphics from your site. Users who come from some ISPs, such as AOL, may get images from their ISP's cache. In that case, what matters is not whether it's a person's first visit but whether it's the first visit from that ISP: once the ISP's cache is populated, other users from that ISP, even on their first visits, will be downloading your content from the ISP's cache and not from your site. Of course, this is good news for your site in terms of reducing strain, but, as you can see, calculating resources is never easy.
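The back-of-envelope arithmetic above can be sketched in a few lines; the function name and the numbers here are purely illustrative, not measurements from any real site:

```python
def estimate_capacity(test_users, bandwidth_used_mbps, bandwidth_total_mbps):
    """Scale a manual stress test up to the full link: if test_users
    together consumed bandwidth_used_mbps, the whole link supports
    proportionally more users. A ballpark figure, not a guarantee --
    caching, virtual-server limits, and usage patterns all distort it."""
    return round(test_users * bandwidth_total_mbps / bandwidth_used_mbps)

# Hypothetical numbers: 20 testers consumed 4 Mbps of a 100 Mbps link
print(estimate_capacity(20, 4.0, 100.0))  # → 500
```

Because returning visitors and ISP caches consume far less bandwidth than first-time visitors, an estimate made with a fresh-eyed test group like this tends to understate the real capacity.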
But, if you are on a fixed set of equipment and Internet connections, the most comprehensive review comes from using an external stress test firm. Such firms have servers located around the world and will attempt to reproduce varying levels of traffic from a number of geographically dispersed locations. They will also use different browsers to see if there are any browser-related bugs in your site's design.
The solution I like most is to host the web site on a platform where more capacity can be added instantly. Some providers allow a web site to burst up to a higher level of capacity than the paid level; if the site consistently goes above the paid level, the hosting provider will expect you to upgrade to the next service tier. While this is a great solution, don't let this option make you lazy. A slow site may not choke the Internet connection of the hosting firm, but your users will still leave if the site is slow -- no one's connection on the other end is getting any faster.
Evaluate and Adjust
Many times the parts of the site that will be of most interest to users are obvious; at other times a site may generate traffic to pages or parts of the site that were never envisioned to be of great interest. Because Google search results flatten a web site's hierarchy of content levels, you may find that a page three levels down in your site is more popular than the pages above it. There can be multiple reasons for this. Perhaps the higher-level pages were not clear about how to find the content users really wanted, so visitors use Google to find content on your site. But in other cases, your site may simply be supporting needs you did not realize your audience had -- or you found an audience that you didn't expect to be interested. If the cause is user confusion, fix the text and menu options. Web site statistics can be used to see the "path" users take as they click through the site. Look for results that don't make sense: if a large number of people are going to a page and not continuing on as would be expected, perhaps this indicates a problem. Consider bringing the content up a level or putting a link to it on the home page.
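One rough way to spot those dead-end pages is to compute, for each page, how often a visit ends there. The sketch below assumes you have already grouped your log entries into per-visitor paths; the function name and the sample data are illustrative only:

```python
from collections import Counter

def exit_rates(sessions):
    """sessions: list of per-visitor page paths, e.g. ['/', '/docs'].
    Returns {page: (visits, fraction of visits that ended there)}."""
    visits, exits = Counter(), Counter()
    for path in sessions:
        visits.update(path)          # count every page view
        if path:
            exits[path[-1]] += 1     # last page in the path is the exit
    return {p: (visits[p], exits[p] / visits[p]) for p in visits}

# Hypothetical clickstream data:
sessions = [
    ["/", "/pricing"],
    ["/docs/setup"],                 # deep page reached via search, then gone
    ["/docs/setup"],
    ["/", "/docs/setup", "/signup"],
]
for page, (n, rate) in sorted(exit_rates(sessions).items()):
    print(f"{page}: {n} visits, {rate:.0%} exit rate")
```

A deep page with many visits and a high exit rate is exactly the pattern described above: either users found what they wanted and left satisfied, or they hit a dead end -- and your content and navigation determine which.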
You should also ask yourself if the usage patterns you are seeing indicate a different form of interest than what the site was originally built to address. If so, adjust your site to meet these new needs -- assuming the new audience is of interest to your organization, of course. One of the lessons that's important to learn is that the goal is not just to make a site load faster; rather, it's to get the user to the desired content as quickly as possible.