Software bloat
From Free net encyclopedia
Software bloat is a derogatory term used to describe the tendency of newer computer programs to use larger amounts of system resources (mass storage space, processing power and/or RAM) than older programs. It is also used in a more general context to describe programs which appear to be using more system resources than necessary, or implementing extraneous features. Software exhibiting these tendencies is referred to as bloatware or, less commonly, fatware.
Background
Software developers in the 1970s worked under severe limitations on disk space and memory. Every byte and clock cycle counted, and much work went into fitting programs into the available resources. The extra time spent by programmers yielded smaller, more efficient software products, and this efficiency was seen as translating directly into sales revenue.
However, technological advances have since multiplied processing capacity and storage density by orders of magnitude, while reducing the relative costs by similar orders of magnitude (see Moore's Law). Additionally, the spread of computers through all levels of business and home life has produced a software industry many times larger than it was in the 1970s.
Possible causes
Some of the observed bloat is caused simply by the addition of new features and content, such as templates, or by the use of higher-level programming languages. However, at other times the cause may be a programmer's lack of attention to optimization or design, often frowned upon by other programmers as a sign of carelessness or laziness. As a result, the emphasis in software design could be argued to have shifted away from tightness of design, algorithms and resource usage. Instead, time-to-market may be seen as becoming the key focus of developers.
The extra time needed to optimize software delays time-to-market, losing some potential sales and increasing labor costs. The improvement in quality due to optimization was previously thought to more than make up for these costs, but with modern hardware, it is now more common that the payoff from optimization is too small to justify it.
The software industry has responded to this changing tradeoff by emphasizing rapid code development, automating programming tasks that were previously areas of fine craftsmanship, and then automating further on top of that. The result is multiple layers of software abstraction resting on one another, and the work of the modern programmer often consists more of directing automatic code generators and pre-written components than of hand-tuning software for full optimization. With well-founded, stable, optimized and dependable software toolkits, however, functional code can be created much faster than by coding equivalents by hand, where development time would be significantly longer. A case in point is NeXT's OpenStep Foundation Kit and Application Kit, a set of reusable objects that enabled developers to create functional and usable code faster than conventional methods allowed.
Since any single application usually fits easily on a modern computer's hard disk and in its RAM, developers rarely consider the cumulative size implications for a user who must install many bloated products.
Some hold that, by forgoing the optimization practices of the past, modern rapid application development produces software that does not feel significantly faster to the user even when running on faster computers: the underlying technical advances are consumed by layers of software abstraction in the pursuit of time-to-market. This user impression is the essence of bloatware. Abolishing that abstraction, however, can hamper the program's further development: software structures that are well crafted for easy extensibility and maintenance make upgrading existing code simpler and faster.
However, optimization at the machine-code level need not be done by hand. Modern compilers optimize the code they generate, removing much of the need for hand-written assembly. Naturally, such automatic optimization is never perfect, but the additional improvement a programmer could achieve by hand-tuning the compiler's output is usually negligible.
Reasons for existence
Why is there so much bloatware around?
Joel Spolsky, in his Strategy Letter IV: Bloatware and the 80/20 Myth, argues that while 80% of users only use 20% of the features (a variant of the Pareto principle), each one uses a different 20%. Thus, "lite" software editions turn out to be useless for most users, as they lack the one or two special features present in the "bloated" version. Spolsky sums up the article with a quote by Jamie Zawinski: "Convenient though it would be if it were true, Mozilla is not big because it's full of useless crap. Mozilla is big because your needs are big. Your needs are big because the Internet is big. There are lots of small, lean web browsers out there that, incidentally, do almost nothing useful. But being a shining jewel of perfection was not a goal when we wrote Mozilla."
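Spolsky's argument can be sketched with a small simulation (all numbers here are illustrative assumptions, not figures from his article): even when every user needs only 5% of the features, a fixed "lite" subset covering 20% of them satisfies almost nobody, because each user needs a different handful.

```python
import random

random.seed(42)

N_FEATURES = 100             # hypothetical total feature count
NEEDS_PER_USER = 5           # each user needs only 5% of the features
LITE_SUBSET = set(range(20)) # a "lite" edition shipping a fixed 20% of them
N_USERS = 10_000

# Count how many users find every feature they need in the lite edition.
satisfied = sum(
    1 for _ in range(N_USERS)
    if set(random.sample(range(N_FEATURES), NEEDS_PER_USER)) <= LITE_SUBSET
)

coverage = satisfied / N_USERS
print(f"lite edition fully serves {coverage:.2%} of users")
```

Under these assumptions the coverage is a tiny fraction of a percent: although each individual user touches far less than 20% of the features, no fixed 20% subset serves more than a handful of them.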
Examples
A common example of software bloat is the evolution of word processors, which have long been considered especially resource-hungry among typical productivity applications. It can be argued that the basic tasks of writing and simple typesetting have been possible since the first such programs were introduced, and that more advanced features add needless weight for those who rarely need them. On the other hand, users have since grown accustomed to modern convenience features, and less feature-packed applications remain available for those who prefer them.
In the second half of the 1990s and early 2000s, Java was the platform of choice for bloatware. Client-side scripting techniques were not yet mature, and many web applications chose to implement small user-interface features, such as a tree view, in Java, which required loading a massive virtual machine. Moreover, some early Java applications required a specific virtual machine (namely Microsoft's implementation) that was available only on a specific platform, defeating the cross-platform purpose of using Java at all. Advances in virtual machines and developers' preference for more native techniques for web-based UI features, such as JavaScript, greatly reduced this Java-related bloatware feel.
The term bloatware in Linux circles tends to be used by advanced users as a pejorative for distributions that contain what they perceive to be an excess of software; it can also be applied to Windows. The perception is that installing a beginner-friendly distribution such as Linspire, Mandrake or SuSE brings in a considerable number of unnecessary programs that have a negative effect on the operating system's stability or speed. On the other hand, other beginner-friendly distributions such as Kubuntu or Ubuntu are distributed on a single CD with basic and popular programs, and users can easily add more programs after installation.
The wide range of Linux distributions means that they can fit on anything from a single floppy disk (1.44 MB) to several CDs or DVDs. Mandriva, a popular distribution, is traditionally delivered on 3 CDs or a single DVD. The complete latest version of Debian (including all packages) requires no fewer than 14 CDs. Slackware, often the distribution of choice for Linux users opposed to bloatware, also comes on 4 CDs (only 2 of which are required for a full installation). Most popular distributions ship with hundreds of packages, which can make individual selection infeasible; the installers of Mandriva or Fedora Core do not show the complete package listing and may install additional packages the user will never use, although such software can be removed, or more added, after installation. Moreover, because distributors aim to build generic packages, a package often depends on other packages that are prerequisites only for features the user may not plan to use.