In 2006, the average online shopper expected a page to load in four seconds; today's shopper expects a page to load in two seconds or less. Organizations have realized that consistently high online performance is difficult to deliver. Even with advanced hardware and services in place, site performance can be mediocre at best because of inefficiencies in page construction and rendering.
According to PhoCusWright research, up to 57% of shoppers will abandon a site after waiting three seconds for a page to load, and 8 out of 10 of them will not return to a site after a disappointing page load experience. Almost one third of these dissatisfied users will go on to tell others about it. Almost 60% of mobile web users expect sites to perform as quickly on their handheld devices as on their home computers, and about the same number say they would be unlikely to revisit a site after a poor mobile web experience.
The web is a great leveler: even a fortune spent on hardware and software infrastructure does not guarantee results when it comes to site performance. The average load time for Fortune 500 sites is still seven seconds. This raises the question: how do some companies deliver an excellent show online, bringing in more revenue and more page views?
Companies that depend on web-based applications to generate revenue soon realize that site performance directly correlates with higher conversions, larger shopping carts for ecommerce, more page views for advertising, and higher search rankings. We found that successful companies tackle web performance challenges by following a course of action that has a direct bearing on business metrics:
Recording and analyzing site traffic: Despite advances in technology and increased bandwidth, a browser cannot render a page without making a number of requests, or 'roundtrips', to the server. The average web page is a staggering 965 KB in size and contains 85 objects. This means dozens of requests to the server to retrieve the page's content, slowing the overall page load. Web-savvy companies identify the need for resource consolidation and caching. They invest in clever systems that detect patterns of resource usage and formulate rules for consolidating and caching resources, grouping them according to the pages that request them.
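As an illustration of the consolidation idea, the sketch below (with hypothetical file names and contents) concatenates several small stylesheets into one bundle and derives a content-hash cache key, so a single cacheable request replaces several roundtrips. This is a minimal sketch, not any particular vendor's implementation.

```python
# Minimal sketch of resource consolidation: concatenate small CSS files
# into one bundle and derive a content hash to use as a cache-busting key.
import hashlib

def bundle_resources(resources: dict) -> tuple:
    """Concatenate resource bodies (in a stable order) and hash the result."""
    combined = "\n".join(resources[name] for name in sorted(resources))
    cache_key = hashlib.sha256(combined.encode()).hexdigest()[:12]
    return combined, cache_key

# Three hypothetical stylesheets that would otherwise cost three roundtrips.
css_files = {
    "reset.css": "* { margin: 0; padding: 0; }",
    "layout.css": ".grid { display: flex; }",
    "theme.css": "body { color: #333; }",
}

bundle, key = bundle_resources(css_files)
print(f"one request instead of {len(css_files)}: /static/bundle-{key}.css")
```

Because the key is derived from the content, the bundle can be cached aggressively: any change to a source file changes the URL, so stale copies are never served.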
Modifying site responses in real time: How quickly companies can gauge audience interests and preferences determines their strike rate. Successful companies know the importance of changing the behavior of web pages without altering their appearance to the user, which helps them devise reliable customer engagement plans. They are willing to invest in systems that make dynamic page modifications based on cached processing instructions for each page.
Working towards a higher SEO ranking means more visits: Google and other search engines allocate either a set period of time or a set quantity of data for crawling each site. Increasing the number of crawled pages directly affects rankings and traffic. By implementing a clever set of site optimization measures, companies can ensure that the Googlebot web crawler covers approximately twice as many pages as on 'unaccelerated' sites.
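The crawl-budget arithmetic can be illustrated with assumed numbers: if the crawler spends a fixed data budget per site, roughly halving the average page weight roughly doubles the pages it can cover. The 50 MB budget below is purely hypothetical; the 965 KB figure is the average page size cited earlier.

```python
# Back-of-the-envelope crawl-budget arithmetic with assumed numbers.
def pages_crawled(budget_kb: float, avg_page_kb: float) -> int:
    """Whole pages a crawler can fetch within a fixed data budget."""
    return int(budget_kb // avg_page_kb)

budget = 50_000                       # hypothetical 50 MB crawl budget
print(pages_crawled(budget, 965))     # unoptimized: 965 KB average page
print(pages_crawled(budget, 480))     # optimized to roughly half the weight
```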
Increasing the page speed brings in more people: Eighty percent of performance issues happen at the front end, at the browser level.
AOL conducted studies measuring site performance and page views. They found that visitors to the top 10% best-performing sites viewed 50% more pages than visitors to sites in the bottom 10%. On average, visitors to the faster half of the sites viewed 9% more pages than visitors to the slower half. Smart businesses use automated front-end optimization tools to make pages render faster in the browser.
Predicting where visitors are likely to go: Successful companies keep learning about how visitors use the site and predict what pages they would most likely want to see next. These companies use tools that automatically analyze usage patterns and page content, and develop a dynamic repository of rules and cached resources. By strategically modifying web pages on the fly, companies lead users along a dynamically determined path.
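A minimal sketch of this kind of prediction, with made-up page names and sessions: count the observed page-to-page transitions and cache the most frequent follower of each page as a rule, the sort of entry a dynamic rule repository might hold.

```python
# Minimal next-page prediction: count transitions seen in past sessions
# and map each page to its most frequently observed successor.
from collections import Counter, defaultdict

def build_model(sessions: list) -> dict:
    transitions = defaultdict(Counter)
    for session in sessions:
        for current, nxt in zip(session, session[1:]):
            transitions[current][nxt] += 1
    # Keep only the single most likely successor per page.
    return {page: counts.most_common(1)[0][0]
            for page, counts in transitions.items()}

# Hypothetical navigation sessions.
sessions = [
    ["home", "catalog", "product", "cart"],
    ["home", "catalog", "product"],
    ["home", "search", "product", "cart"],
]
model = build_model(sessions)
print(model["home"])   # "catalog": seen twice, versus "search" once
```

A real system would weight recency and page content as well, but even this counting rule is enough to decide which resources to prefetch.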
Relying less on widgets and snippets: Third-party scripts such as ads, social media widgets, and analytics tags block pages from rendering until all the scripts are loaded. This delay degrades the overall user experience, leading to lower conversion and revenue. Microsoft's Bing conducted a study in which it slowed page load times by 2 seconds; users ran approximately 2% fewer queries and clicked 3.75% less often. Conversely, speeding up the site by 2 seconds resulted in a 5% revenue increase. Prudent companies use expert systems that improve performance by making better use of the browser cache.
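One common remedy, sketched below with hypothetical domain names, is to rewrite blocking third-party script tags with the standard defer attribute so the browser downloads them without holding up initial rendering. The host list and sample markup are illustrative assumptions, not a real widget inventory.

```python
# Minimal sketch: add `defer` to script tags from known third-party hosts
# so they stop blocking initial page rendering. First-party scripts are
# left untouched. Host names here are illustrative.
import re

THIRD_PARTY_HOSTS = ("ads.example.net", "widgets.example.com")

def defer_third_party(html: str) -> str:
    def rewrite(match):
        tag = match.group(0)
        if any(host in tag for host in THIRD_PARTY_HOSTS) and "defer" not in tag:
            return tag[:-1] + " defer>"   # re-close the tag with `defer` added
        return tag

    # Match opening <script ...> tags only; closing </script> tags don't match.
    return re.sub(r"<script\b[^>]*>", rewrite, html)

page = ('<script src="https://ads.example.net/ad.js"></script>'
        '<script src="/app.js"></script>')
print(defer_third_party(page))
```

In practice this kind of rewrite is done at the proxy or build step; `defer` preserves execution order, while `async` would allow scripts to run as soon as they arrive.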
Recognizing that not all browsers are created equal: The days of being able to create websites optimized for only one or two browsers are clearly behind us. Each browser type has its own preferences for how it renders pages, how many connections it opens, and so on. Techniques that work for one browser type can slow down, or even break, pages in a different browser. Browsers based on the WebKit open source engine, including Apple Safari and Google Chrome, as well as a growing array of mobile browsers, are increasingly taking market share from Internet Explorer and Firefox. Although Internet Explorer continues to be widely used, its capabilities vary significantly across versions. Web performance optimization can be of great assistance to businesses looking to accelerate their reach: the intuitive capabilities of advanced tools help companies tailor the behavior of web pages to exploit the features of each user's browser.
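How such per-browser tailoring might look, as a simplified sketch: the rule names and User-Agent checks below are illustrative assumptions, not a real detection library, which would be far more nuanced.

```python
# Simplified sketch of choosing optimization rules per browser family
# from the User-Agent string. Rule names are hypothetical labels.
def optimizations_for(user_agent: str) -> list:
    ua = user_agent.lower()
    if "chrome" in ua or "safari" in ua:    # WebKit/Blink family
        rules = ["webp_images", "preconnect_hints"]
    elif "firefox" in ua:
        rules = ["preconnect_hints"]
    elif "msie" in ua or "trident" in ua:   # legacy Internet Explorer
        rules = ["domain_sharding", "css_sprites"]
    else:
        rules = []                          # unknown client: stay conservative
    return rules + ["minify", "gzip"]       # safe everywhere
```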
Performance optimization is an essential business driver. Website performance correlates directly with revenue in both ecommerce and advertiser-supported applications. Given the high demands that modern web applications place on servers, the high expectations of users, and the need to support a wide variety of browsers, companies need to address application-level inefficiencies continuously, with the right tools, and without making changes to the network, servers, or software.
The author is Managing Director - INDIA & SAARC, Radware
Disclaimer: This article is published as part of the IDG Contributor Network. The views expressed in this article are solely those of the contributing authors and not of IDG Media and its editor(s).