Nov 5, 2009 | 10 minute read
written by Linda Bustos
An under-performing site has serious consequences for revenue – online and across channels.
Today's shoppers have high expectations when it comes to buying online. Websites that take too long to load can result in negative brand perception, diminished goodwill and a significant loss in overall sales. In our recent webinar, Every Second Counts: How Website Performance Impacts Shopper Behavior, we explored the findings of a new study conducted by Forrester Research on behalf of Akamai, which identified two seconds as the new threshold for acceptable web page response times.
Slow Rendering Websites Lead To Lost Online Sales
Consumers Who Make Purchases Are Particularly Concerned About Performance
Consumers who actually complete purchases at online stores are more likely to cite site performance as the reason they are dissatisfied with an online experience. After pricing and shipping issues, poor site performance is a leading cause of dissatisfaction.
A Majority of Consumers Abandon Intended Purchases in the Checkout Process – Directly Impacting Sales
More respondents are willing to, and do, abandon purchases than ever before. The percentage of consumers who intend to make a purchase but leave after the checkout process has begun is up 18% from 2006.
Forrester asked “Thinking of the last time you visited an online store where you intended to buy a product but did not finish the online purchase, at what point did you leave the site?”
The Ripples of a Bad Experience Go Beyond Web Sales
The Overall Brand or Image of the Company Will Also Suffer
When faced with a dissatisfying shopping experience:
Site performance also impacts cross-channel shopping:
A Poor Performing Site Opens the Doors for Competitors
Not only does an under-performing site lead to customer frustration, but 64% of shoppers say they will simply purchase from another online store, up 16% from the 2006 study.
Consumer Expectations for Site Performance are Changing
This represents a significant evolution in consumer expectations from the 2006 study, which showed the majority of customers expecting load times of less than four seconds. A major factor is the increase in broadband access: three years ago, only 54% of consumers had broadband access at home. Today it's 91%, and nearly half have it at work or school.
Conclusions From Forrester / Akamai Study
Shoppers Demand Even Faster Sites
We know performance is a problem. With half of consumers expecting a page to load in less than two seconds, many online retailers are setting the bar even higher at sub-one-second response times.
Dynamically Generated Content on the Rise
As demand for faster page load times increases, so does demand for more engaging functionality and rich content, such as RIAs (rich Internet applications) like price sliders or search and category results that re-sort on the fly. This creates a challenge for online companies: adding content increases page weight, which impacts response time. Web page size has more than tripled in the last five years.
Performance Issues Multiply with Distance
People often try to fix performance issues by "thinning out" their site (decreasing page weight), but many problems lie outside of your data center.
The farther visitors are from the data center where the site is hosted, the more response times degrade. Your response time may be two seconds at your data center in Atlanta, but three seconds on the opposite coast, and even longer for international visitors.
Traffic spikes (holidays, special promotion campaigns, etc.) can also slow down your site. Response times can spike as high as 50 seconds during the holiday season.
Consumers love rich images, dynamic content and personalized information, but the Web is only "fun" when it's fast. Image caching has been around for a while, but today's Web 2.0 features need acceleration technologies such as intelligent caching, which pre-fetches uncacheable content to deliver the dynamic components of your site closer to where your consumers are located. Route optimization finds the best performing route, and connection optimization aims to make fewer, shorter round trips.
The 80/20 Rule Applied to Page Loads
When optimizing a site for speed, there is an 80/20 rule for page loads: only about 20% of the time is spent loading the HTML itself, while the other 80% goes to downloading and rendering everything else. So even if a super-fast dual quad-core application server with 32GB of RAM serves the dynamic HTML in 200ms, it doesn't matter if the rest of your UI takes five seconds to download and render in the user's browser.
In one measured example, the redirect and HTML accounted for 300ms, while the rest of the page took three seconds to load.
Minimize, Minimize, Minimize!
We want to minimize connections and minimize data transfer (file size). We can do this by reducing object counts, compressing images and text, and caching properly.
Minimize HTTP Requests
Reducing connections as much as possible has shown the most performance improvement for the least change. Combine your JS and CSS files and keep the image count to a minimum. If you need to load a lot of images for styling, use a CSS image sprite. Sprites require a bit of extra effort in your style sheets, but you can convert 10 images into one, cutting the number of connections and potentially reducing total file size, as in the sketch below.
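As a rough sketch of the sprite approach (the file name and pixel offsets here are hypothetical), one combined image is downloaded once and each element shows only its slice via background-position:

    /* icons-sprite.png (hypothetical) stacks all the small icons vertically, 16px apart */
    .icon {
        background-image: url(/images/icons-sprite.png);  /* one HTTP request covers every icon */
        background-repeat: no-repeat;
        width: 16px;
        height: 16px;
    }
    /* each class shifts the background so only its own slice is visible */
    .icon-cart   { background-position: 0 0; }
    .icon-search { background-position: 0 -16px; }
    .icon-user   { background-position: 0 -32px; }

Ten separate image requests collapse into one cacheable download; the only cost is keeping the offsets in sync whenever the sprite image changes.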
Use HTTPS/SSL Only Where Necessary
Creating and tearing down HTTP connections is expensive, and adding SSL (Secure Sockets Layer) to the mix makes it even worse because of the extra handshaking and encryption/decryption. Restrict SSL to data-sensitive areas of your site, such as the checkout.
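As one illustration with Apache's mod_rewrite (the /checkout path is a placeholder for whatever your secure area actually is), you can force HTTPS only where it's needed:

    # Force HTTPS for the checkout path only; the rest of the catalog stays on plain HTTP
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^/checkout(/.*)?$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]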
“Minify” Your JavaScript and Style Sheets
Compilers and browsers don’t care about nicely spaced, human-readable JavaScript and style sheets as long as the syntax is valid. JavaScript doesn’t need extra whitespace and often doesn’t need semicolons; removing them produces smaller JS/CSS files and reduces transfer time. Use a tool to compress JS/CSS for production pages; a number of simple, open source tools are available.
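YUI Compressor is one such open source option; a typical build step looks something like this (file names and jar version are illustrative):

    # Strip whitespace and comments from production JS and CSS
    java -jar yuicompressor-2.4.jar site.js  -o site.min.js  --charset utf-8
    java -jar yuicompressor-2.4.jar site.css -o site.min.css --charset utf-8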
Put JavaScript Includes at the Bottom
The HTTP 1.1 specification says a “single-user client SHOULD NOT maintain more than 2 connections with any server” at one time. This means the browser can download in parallel (two connections at once), but when it hits a JavaScript include, it will not begin any other downloads until that script has transferred. If you have scripts at the top of your HTML, you severely limit the browser's ability to retrieve in parallel.
If a script is not needed immediately, put its include tag at the bottom. Scripts at the end of the HTML allow maximum concurrent downloads and let the page render while the JavaScript finishes loading. (Of course, some JavaScript cannot go at the bottom.)
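A minimal sketch (file names are hypothetical): stylesheets stay in the head so the page renders with styling, while non-critical scripts move to just before the closing body tag:

    <html>
    <head>
      <title>Product page</title>
      <!-- CSS stays in the head so content renders styled -->
      <link rel="stylesheet" type="text/css" href="/css/site.min.css" />
    </head>
    <body>
      <!-- page content downloads and renders first -->
      ...
      <!-- non-critical scripts load last, so they don't block parallel downloads -->
      <script type="text/javascript" src="/js/analytics.js"></script>
      <script type="text/javascript" src="/js/ui-widgets.js"></script>
    </body>
    </html>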
Use Cache Control Headers
Properly caching objects at the browser level can significantly reduce load times, and it helps subsequent page loads when content is shared (the same JS/CSS files). If there is no cache control header on an HTTP response, the browser does not know how long it can cache the object and therefore won’t cache it at all. It's best to set static content to "never expires" and dynamic content to an appropriate expiration. Make sure your JS and CSS are externalized into files so they can be cached.
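With Apache, for example, mod_expires can add these headers; the lifetimes below are only illustrative, and "never expires" in practice means a far-future date:

    # Far-future lifetimes for static objects, a short one for dynamic HTML
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType text/css   "access plus 1 year"
    ExpiresByType application/x-javascript "access plus 1 year"
    ExpiresByType text/html  "access plus 5 minutes"

If you use far-future expiry, version the file names (for example site.v2.min.css) so that updated files still reach returning visitors.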
Other Thoughts on Design Best Practices
HTTP 302 redirects are slow – avoid them if possible.
Use multiple static content servers. Although HTTP 1.1 limits the browser to two connections per host name, if the servers have different domain names the browser will make additional parallel connections to those servers and download more content concurrently. A good example is Google Maps Street View: there is a lot of heavy image content, so it uses multiple image servers and downloads a great deal of data in parallel, keeping page load times quick.
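A sketch of the same idea for a product page (the host names are hypothetical aliases pointing at the same static content):

    <!-- spreading images across two host names lets the browser open
         parallel connections to each, roughly doubling concurrent downloads -->
    <img src="http://static1.example.com/images/hero.jpg" alt="Hero shot" />
    <img src="http://static2.example.com/images/thumb-1.jpg" alt="Thumbnail 1" />
    <img src="http://static2.example.com/images/thumb-2.jpg" alt="Thumbnail 2" />

Two to four host names is usually plenty; each additional name costs an extra DNS lookup.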
Use Gzip compression for page responses. The server compresses the HTML response and sends it to the browser, which decompresses it. There is some extra CPU overhead, but the faster transfer is typically worth it.
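In Apache this is typically a single mod_deflate directive (the content types listed are the usual text formats):

    # Compress text responses on the way out; browsers that send
    # Accept-Encoding: gzip receive the compressed version
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript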
Manual Testing
The problem with manual testing is that it's usually done with clients and servers on the same gigabit or 100-megabit network (or sub-network), so you're not really seeing the impact of page size or HTTP request counts. Tools like YSlow and Firebug will show you relative load time comparisons, and these Firefox plugins are an excellent way to verify that your response headers are setting browser caching properly and that the browser is actually caching the objects.
The Throttle Control Linux tool simulates an Internet connection with varying throughput and latency levels. Another way to test manually is to route your connection out over the Internet where possible, or to use remote data centers; even VPNs add a good degree of latency and can be used for testing.
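If you don't have a dedicated throttling tool handy, Linux's built-in tc/netem queueing discipline can approximate a slower connection (the interface name and delay values are illustrative; run as root):

    # Add ~80ms of latency with 20ms of jitter to traffic leaving eth0
    tc qdisc add dev eth0 root netem delay 80ms 20ms
    # Remove the rule when testing is finished
    tc qdisc del dev eth0 root netem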
Automated Testing
Testing of HTML efficiency usually doesn’t come up during the load/performance test cycle. Load generators are typically on the sub-network, so page size and UI caching are not a major concern. Automated testing should be done at some point with only the HTML, and then with all objects loading, for comparison. Testing can also be done through a firewall/WAN simulator or with a throttling tool to understand how latency and limited bandwidth affect you.
Load and performance testing typically focuses on the application itself and is used to work out any slow-performing components. For example, if we’re running a Java web application, we’ll do all the usual JVM tuning so that the application runs quickly and smoothly, but we usually don’t focus on the UI very much. Furthermore, because load generators are typically on the same sub-network, the tests aren’t affected by latency or bandwidth limitations, which means large page sizes or invalid caching largely go unnoticed.
How do companies typically engage with Akamai?
There are different solutions for different-sized businesses. It's a managed service with monthly billing, which may be priced on various metrics such as secure transactions or page views.
What about minifying HTML?
Minifying JS/CSS is easy because you only do it once, but HTML changes with every request. For HTML we recommend the Gzip method, with your Apache server in front of your application server: the Gzipped HTML is considerably smaller than the uncompressed HTML.
Do you have any data on the impact of conversion rates between 2 seconds and 4 seconds?
This survey was apples to apples with the 2006 survey: we only measured how long consumers would wait before abandoning a site. Only consumers were surveyed, not retailers.
Where do you typically see designing for page load expertise in the organization? Where should this expertise lie?
It needs to be considered across all the business areas. We often see problems when the marketing people are not talking to the IT people (especially with outsourcing and platform-as-a-service). We always recommend that great ideas be delivered in a way that creates a positive customer (site) experience, and that design and feature performance be brought to the table every time these discussions happen.
What about flaky wifi connections, are people more tolerant than when they are "wired"?
The research did not address connection types specifically in the survey.