Agner's CPU blog

Why is my computer so slow?
Author: Agner Fog Date: 2009-10-06 07:48

When I buy a new computer, I am happy that it runs much faster than my previous one. But after a year or two it gradually becomes slower and slower. Finally, I have to buy a new and faster computer, and the story starts all over again. But why?

I remember the good old 1980s, when a PC was just a PC. The computing power of the original IBM PCs was ten thousand times less than today, and so were the RAM and hard disk capacities. Yet they appeared to be faster than computers are today. In those days, you could press a key (there was no mouse) and expect an immediate response on the screen. Today, you sometimes get an immediate response, sometimes not. The other day I had to wait more than half a minute from the moment I clicked on a drop-down menu until the menu actually appeared on my screen. The computer was out of RAM and was busy installing an update I never asked for. Come on, computer industry, you can do better than that! My computer is ten thousand times faster than it was back then, and yet I am wasting more time waiting in front of my screen than I did back in the old DOS days.

As you know, I am doing a lot of research on computer performance, and here is my list of the main culprits:

  • Platform-independent programming. Programming languages like Java, C# and Visual Basic produce a byte code or intermediate code that is interpreted or compiled on the end user's computer. Interpreting the byte code takes at least ten times as long as executing compiled code (a minimal sketch of such an interpreter loop follows this list). Some systems use a just-in-time compiler that translates the byte code to machine code, but the just-in-time compilation takes time too, of course, and the two-step compilation process produces less optimal machine code in the end.
      
  • Higher level of programming. The trend in programming tools goes toward ever higher levels of abstraction. This enables the programmer to implement more complex functionality in less time. But the data and instructions have to pass through multiple layers, tiers or domains of an ever more complex software engineering model.
      
  • Runtime frameworks. Higher-level programming tools often require various frameworks to be running on the end user's machine. This includes the Java virtual machine, the .NET runtime framework, Flash, Adobe AIR, Microsoft Silverlight, various script interpreters, and runtime libraries. These frameworks often use far more resources than the application itself.
      
  • Databases. Some applications use a database for storing simple user data where a plain old data file would suffice.
      
  • System database. The Windows system database (the registry) is such a big resource-thief that it deserves a separate point. The system database grows every time you install a new application, and it is not always cleaned up when you uninstall something. Most applications store their configuration data in the system database, and it can take several seconds to retrieve these data.
      
  • Virus scanners. Some virus scanners check all files every time they are accessed. Other scanners check files only when they are created, or scan in a background process. Some of the most common virus scanners slow down everything so much that they become a real nuisance to the user.
     
  • Graphical user interface. User interfaces tend to become more and more fancy. Many resources are used on graphical elements that have only aesthetic value and tend to distract the user's attention away from his or her work.
     
  • Background processes. A typical computer may have fifty or more processes running in the background. Many of these processes are unnecessary. Some of them belong to a specific application but are started every time the computer starts up, even if that application is never used.
     
  • Network. Some applications rely on remote data, assuming that the user has a fast internet connection. This makes response times more unpredictable than when data are stored locally.
     
  • Automatic updates. It is becoming more and more common for applications to update themselves automatically. This is not always good from the user's point of view. The program may suddenly behave differently from what the user is used to - and new features also mean a potential for new bugs and incompatibilities. Automatic updates are necessary when there is a security reason, such as updating a virus scanner or closing a security hole in the operating system. Other updates should be made only if the user wants them. The automatic checking for updates takes a lot of network resources, and when an update has been downloaded it has to be installed, which can take a lot of time. Some applications will search for updates every time the computer is started, even if the application is never used.
     
  • Installation. Even though many software developers pay attention to the performance of their programs, they pay less attention to the installation process. Some installation procedures are written in script languages that run incredibly slowly. The user may want to go away and do something else while the program is being installed, but it keeps popping up questions that the user has to answer.
     
  • Boot-up. The time from when the user turns on the computer until he or she can actually start to work is often several minutes. This time goes to downloading and installing updates, starting background processes, checking for plug-and-play hardware, reading the system database, and loading system components and drivers.

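To illustrate the interpretation overhead mentioned in the first point above, here is a minimal sketch in C++ of a byte-code dispatch loop (a toy stack machine of my own, not the actual Java or .NET virtual machine). Every virtual instruction costs an opcode fetch, a decode branch and stack bookkeeping on top of the real work, which is why an interpreter easily takes ten times as long as compiled code:

    // Toy stack-machine interpreter that adds the numbers 1..100.
    // Each virtual instruction requires an opcode fetch, a decode
    // branch and stack bookkeeping - work that compiled code would
    // do in one or two machine instructions.
    #include <cstdio>
    #include <vector>

    enum Op { PUSH, ADD, PRINT, HALT };

    int main() {
        // "Byte code": push 0, then push each number and add it.
        std::vector<int> code;
        code.push_back(PUSH); code.push_back(0);
        for (int i = 1; i <= 100; i++) {
            code.push_back(PUSH); code.push_back(i);
            code.push_back(ADD);
        }
        code.push_back(PRINT);
        code.push_back(HALT);

        std::vector<int> stack;
        int pc = 0;                          // virtual program counter
        for (;;) {
            switch (code[pc++]) {            // fetch + decode, every time
            case PUSH:
                stack.push_back(code[pc++]);
                break;
            case ADD: {
                int b = stack.back(); stack.pop_back();
                stack.back() += b;           // the actual useful work
                break;
            }
            case PRINT:
                std::printf("%d\n", stack.back());   // prints 5050
                break;
            case HALT:
                return 0;
            }
        }
    }
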
Now we are getting closer to answering the question of why a computer gets slower after a couple of years. Every time a new piece of software is installed, we are also installing a potential new resource-thief. Newer software is possibly programmed with more resource-hungry high-level tools and frameworks than older software. We keep updating software all the time. Sometimes it is updated automatically. Sometimes we update it because we want new features. And sometimes we have to update our software because the old version is incompatible with something else we have installed on the computer. The increasing number of background processes uses more RAM. The system database keeps growing. And the hard disk becomes fragmented.

There are many incentives to install new software: a website we visit requires a new version of Java, Flash, QuickTime, or whatever. A document we have received requires a new version of the word processor. A software package we install tells us that it needs the latest version of .NET. And the next gadget we buy tells us to upgrade to a newer version of Windows.

A lot of the software we are running and the websites we are visiting have built-in incentives to make us update and buy more. The software industry keeps making products that require bigger and faster computers. And the hardware industry follows Moore's law and produces ever more powerful computers that make these developments in the software industry possible. This is what keeps the whole business going.

I don't think there is an evil conspiracy between the hardware and software industries to make us buy more and more. It's simply market mechanisms at work. The software industry focuses on producing more advanced features in a shorter development time for competitive reasons. This makes them choose the highest-level development tools.

As consumers, we would be more satisfied if software developers focused more on optimizing performance and minimizing resource use. New software products are typically tested under the most favorable conditions. The software developer may test his product on the fastest state-of-the-art computer, running nothing else. Network features are tested with an isolated test server - not on an overloaded server in a crowded network. It is time that software developers began to test their products under worst-case conditions rather than best-case conditions.

   
Why is my computer so slow?
Author: Declan Cooper Date: 2009-11-28 08:54
I really like this article, and could not agree more with you regarding the spirit of optimal-case (or, say, "stable production mode") code testing. If I may, I would like to elaborate on your statements to include the "compounding" effect this has on computer centers, such as core labs for medical imaging and pharmaceutical research and development.

In certain industries such as biotech/pharmaceutical, computer systems validation is required before any application can be used to determine a clinical endpoint or diagnosis.

Simply stated, computer system validation requires evidence of known outcomes with predictable and reproducible results. The basic components of validation are installation qualification (IQ), operational qualification (OQ), and performance qualification (PQ).

The point I want to make is that the last component, PQ, is increasingly completed in the context of a poorly designed or poorly developed user requirements specification document, or within a testing framework that does not adequately "stress-test" the production environment. This is analogous to the many operations of a computer system that aren't taken into account in the design specifications, as you imply.

In this case, the "production environment" (e.g. a core lab containing many end-user systems, server(s), and the appropriate contingency systems and sub-systems) has a baseline idle time as well as peak periods. These peak periods are the time points that usually drive decisions to modify the environment to address bottlenecks.

If there were ways to better simulate these peak times in the PQ (including the points of your article, which I believe are overlooked), then designers and developers could move to a model with more rational predictability of outcomes; User Acceptance Testing, for example, could then play a greater role earlier in the code development process. Other benefits might include optimizing the operating system(s) for the intended use of the computer while running the application.

Thanks for the great information...

   
Why is my computer so slow?
Author: Igor Levicki Date: 2010-01-02 11:17
The root of the issue as I see it is uneducated consumers who accept all the junk that is being shoveled onto them.

People don't know that they don't need jqs.exe loaded for Java applets to run, and the same goes for many other applications like Adobe Reader, Office, WinAMP, QuickTime, Real Player, HP/Epson/XYZ printer drivers, etc., which all shovel their useless processes into RAM under the excuse of faster startup, as if the whole computer belongs to them, when in fact the icon in the tray is really there to spread brand awareness.

Not only do people accept junk -- they ask for more of it!

It's all about "bang for the buck" nowadays. Why disc-burning software would need media player functionality is really beyond me. But by having one bundled, it offers "more value" to the customer while allowing the software company to keep the same high price. What is devastating is that consumers end up paying for unwanted features, the software company gets to keep its profit at the same level instead of competing through price reductions, and the developer is in effect paid less for more work. The net result is two poor pieces of code instead of a single good one.

   
Why is my computer so slow?
Author: Shervin Emami Date: 2010-12-29 11:25
Nice summary, I totally agree!

But I think you also forgot to mention the effect of bloated software design, which causes huge delays in memory caches and hard-disk activity. Since even simple applications now use so many runtime libraries and other files, the hard disk seems to be accessed for almost any operation, thereby significantly limiting the power of a fast CPU and fast memory caches. These days our computers have multi-core CPUs running at gigahertz speeds with gigabytes of fast RAM, and yet we still expect noticeable delays when doing basic things like opening Notepad or viewing the contents of a folder. And like you said, it's still normal to have sudden delays of several seconds while things update in the background or large chunks of memory are swapped between RAM and the hard disk! Obviously Linux is generally better than Windows in this regard, but I'm surprised at how slow even Linux behaves these days.

It's just amazing that software developers never question the idea that a 2-second delay in opening Notepad means about 6 billion CPU cycles, when it should surely be possible in just a few thousand cycles if the operating system and software were written more efficiently. Programmers believe all the marketing hype that current compilers automatically generate better code from C/C++/Java than would normally be possible with hand-optimized assembly code, so everyone just assumes that more and more layers of abstraction in C/C++/Java/.NET code will simply be optimized away by their infinitely wise compiler, which will automatically make full use of the cache and SIMD instructions to their full potential, etc...

Little do they know that while they can expect maybe a 20% improvement in their code by upgrading parts of their hardware, they can expect a 2000% or greater improvement by writing highly efficient code that takes advantage of SIMD and other low-level features. Sure, it would be great if the C/C++/JIT compilers could generate code as fast as a reasonable assembly language programmer, but that just seems too complex to expect compilers to achieve for many years to come.
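
For illustration, here is a minimal sketch of the kind of SIMD code I mean, using SSE intrinsics to sum an array of floats four at a time next to a plain scalar loop (the function names and the trivial example are just placeholders; the real gain depends on the compiler, the data size and the memory bandwidth):

    // Minimal SIMD sketch: sum an array of floats with SSE intrinsics,
    // processing four elements per instruction, versus a scalar loop.
    // Compile with e.g. g++ -O2 -msse2 sumdemo.cpp
    #include <cstdio>
    #include <xmmintrin.h>   // SSE intrinsics

    float sum_scalar(const float* p, int n) {
        float s = 0.0f;
        for (int i = 0; i < n; i++) s += p[i];       // one add per element
        return s;
    }

    float sum_sse(const float* p, int n) {
        __m128 acc = _mm_setzero_ps();               // four partial sums
        int i = 0;
        for (; i + 4 <= n; i += 4)
            acc = _mm_add_ps(acc, _mm_loadu_ps(p + i));  // four adds at once
        float tmp[4];
        _mm_storeu_ps(tmp, acc);
        float s = tmp[0] + tmp[1] + tmp[2] + tmp[3]; // combine partial sums
        for (; i < n; i++) s += p[i];                // leftover elements
        return s;
    }

    int main() {
        float data[1000];
        for (int i = 0; i < 1000; i++) data[i] = 1.0f;
        std::printf("%.1f %.1f\n", sum_scalar(data, 1000), sum_sse(data, 1000));
        return 0;
    }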

To see the difference between efficient code and standard code, I just tested a 3D modelling program (Draw3D) that I wrote about a decade ago and hand-optimized for a 75 MHz Pentium CPU; it now runs approximately 50 times faster on my new laptop (using just software rendering, without any GPU or SIMD instructions), whereas most software like MS Word, PhotoShop and even Notepad barely runs any faster now than it did a decade ago!

   
Why is my computer so slow?
Author: Agner Date: 2010-12-29 12:02
I agree that bloated software is a big problem, but I don't agree that assembly coding is the solution. In my experience, a well-designed C++ program (without .NET) will run just as fast. Assembly programming is too tedious and error-prone, and not necessary in the general case.