The Internet has an obesity problem. Yes, you heard that right. Websites are too fat, and they’re getting fatter. In fact, a recent article in Wired magazine said that the average web page is now the same size, data-wise, as the classic computer game Doom. A compressed copy of the shareware version of the game takes up 2.39MB of space, according to the article, whereas the average web page takes about 2.3MB to download. That’s a lot of girth.
Compare that to the first web pages, which averaged about 200KB and could easily fit on a 1.44MB floppy disk (remember floppy disks?). And many sites today are even larger: Wired’s home page is about 7.8MB, which would require about six of those floppy disks.
So let’s look at some of the causes of the Internet’s weight problem.
With our constant need to track every visitor action on our websites, we’ve been adding all sorts of code. A web page used to be about 50 lines long; today, the tracking snippet from a tool such as Google Analytics can run 50-100 lines all by itself. Add the code for online forms, display ads, and other extras, and you’ve got a lot of overhead.
Then there are coding practices that aren’t very conscientious about bandwidth. Every new framework brings libraries and other overhead along with it, and programmers throw everything but the kitchen sink into their websites without considering whether a piece of code is even necessary for a given page. For example, if you have 10 libraries but only need two specific functions on a particular page, why make the user load all 10?
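To make that concrete, here’s a back-of-the-envelope sketch of what selective loading saves. The library names and sizes below are entirely made up for illustration:

```python
# Hypothetical library bundle sizes in KB -- illustrative numbers only.
BUNDLES_KB = {
    "charts": 350, "forms": 120, "carousel": 200, "analytics": 90,
    "maps": 500, "video": 400, "icons": 150, "animations": 250,
    "ab_testing": 80, "chat_widget": 300,
}

def payload_kb(needed: set) -> int:
    """Total download weight if we ship only the bundles this page needs."""
    return sum(size for name, size in BUNDLES_KB.items() if name in needed)

everything = payload_kb(set(BUNDLES_KB))          # ship all 10 bundles
just_enough = payload_kb({"forms", "analytics"})  # the two this page uses
print(everything, just_enough)  # 2440 vs 210 -- over 90% saved
```

Under these made-up numbers, shipping only what the page uses cuts the payload from about 2.4MB of scripts to roughly 200KB.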
So what’s the problem? Why, with seemingly unlimited and extremely fast bandwidth, are we even worried about this? Two important reasons. First, Internet service providers are starting to cut back on unlimited data plans and to throttle download rates when users exceed their monthly limits. This is happening more and more often as the large providers need to show a profit after they’ve locked you into their crazy introductory deals.
Second, not everybody in the world has a fast Internet connection. There are still about 2.5 million users who access the Internet via dial-up (remember the familiar, yet all-too-frustrating whistling sounds that told you you’d eventually connect for about 10 minutes of email downloading and web surfing?). And with developing countries providing the fastest Internet growth, many new users access the web primarily on cell phones. Heavy sites make browsing on a handset a painful experience. Yes, some websites still don’t understand that mobile phones and 1800px pictures don’t mix well.
The first solution is to think before you code. A bit of planning up front keeps a web page from quickly ballooning to 7-8MB. Don’t make a user load 60 libraries when two will do. Don’t serve full-resolution images; instead, resize and compress them so they still look sharp at the size they’re actually displayed.
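Resizing pays off more than you might expect, because image weight scales roughly with pixel area, not width. A rough sketch of the savings, assuming compressed size is proportional to megapixels (the 150 KB-per-megapixel figure is an illustrative assumption, not a measured constant):

```python
def estimated_kb(width_px: int, height_px: int,
                 kb_per_megapixel: float = 150.0) -> float:
    """Rough estimate: assume compressed image weight scales with pixel area.

    kb_per_megapixel is a made-up illustrative figure; real values depend
    on format, quality setting, and image content.
    """
    return width_px * height_px / 1_000_000 * kb_per_megapixel

full = estimated_kb(1800, 1200)   # the oversized original: ~324 KB
resized = estimated_kb(800, 533)  # scaled to the display width: ~64 KB
print(full, resized)
```

Cutting the width from 1800px to 800px cuts the pixel count (and, under this assumption, the weight) by about 80 percent.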
And what if your website is already super heavy? Conduct a code audit and carefully cut out the fat (or you could hire a consulting company to help you with that).
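A first pass at such an audit is simply listing what a page pulls in. Here’s a minimal sketch using Python’s standard-library `html.parser`; the page markup and file names below are made-up examples:

```python
from html.parser import HTMLParser

class AssetCounter(HTMLParser):
    """Collects the external assets a page references -- step one of a weight audit."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("script", "img") and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.assets.append(attrs["href"])

# A made-up page for illustration.
page = """<html><head>
<link rel="stylesheet" href="site.css">
<script src="analytics.js"></script>
<script src="charts.js"></script>
</head><body><img src="hero-1800px.jpg"></body></html>"""

counter = AssetCounter()
counter.feed(page)
print(len(counter.assets), counter.assets)
```

From a list like this, the next step is asking, for each asset, whether the page actually needs it. (Browser developer tools will also give you per-resource byte counts directly.)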
The ideal size per web page is about 2MB or less, but sometimes it’s hard to keep it that low. According to the above-cited Wired article, Google’s Accelerated Mobile Pages (AMP) program might help as well.
And if you’d like more information on how Softtek can help your website slim down and provide faster load times for your clients while improving your users’ online experience, read more about our outsourcing software development services here.
Finally, if you’ve been in this situation or know of any other healthy diet regime for websites, please drop me a line below!