What Does “Personal Computer” Mean? Communicating a New Paradigm

Historic desktops: businessmen of the late 19th and early 20th century.

Things have changed significantly since the advent of the personal computing paradigm. The idea of reserving an entire machine for just a single user was absurd in the late 1960s. Then the future was invented at Xerox PARC with the Xerox Alto computer (also known as the interim Dynabook) and its graphical user interface with early components of the desktop metaphor, Ethernet access, and laser printing. Since the mid-1980s, PCs have become standard equipment for almost every office and knowledge worker. Today, the family tree of PCs extends from desktop PCs—placed on top of, or underneath, the desk—to laptops, notebooks, sub-notebooks, netbooks, tablets, and the like. However, all these terms just refer to the form factor of the hardware case. They miss the point of the personal relationship between users and their digital desktops.

The user is key in this equation; therefore it is the P in PC that needs the care and attention of user experience experts. It’s the data, tools, and preferences that make up the user’s personal working environment. Damage to, or loss of, any of these components can have a severe impact on the usefulness and perceived robustness of the system. This is most obvious for data loss, but tools that are not backward-compatible have the same effect. Worse yet, an unexpected change of user preferences—by updating the operating system, for instance—causes a drop in efficiency because the user’s familiarity with the system suffers when the system does not behave as expected any more.

The scope of the issue is even broader today than it was fifteen years ago, because network-based services and social software now draw the user’s attention to the World Wide Web. Formerly local and private documents are now transformed into social objects by uploading them to photo, video, slide sharing, collaboration, and community sites.

What’s a PC anyway? It is the personal relationship between users and their documents, applications, customized settings, and online data and connections that matters. The hardware is only a means of lighting the pixels that make up a magical window into the digital world. The PC continues to offer a familiar local desktop environment; in the online environment, though, it is a mere access device that can be exchanged for any other computer with a web browser and Internet access.

Things are changing once again and will make the computer box as we know it obsolete or, rather, replace it with a virtual box.

The requirements for PCs in large and medium-sized enterprises are ease of administration, low energy costs, and, last but not least, the flexibility for employees to move to other workplaces, to work from home, or to work on the move from any other location. Physical PCs do not sufficiently address these requirements because the personal working environment is confined to the hardware on which it is installed and running. An alternative is the virtualized PC, which runs the operating system on emulated PC hardware. This is called desktop virtualization.

It is quite easy to confuse the “desktop” in “desktop computer” with the one in the “desktop metaphor.” The latter is, in a sense, a virtual desktop already. Now, in addition, desktop virtualization turns the computer into software by introducing a new layer between the hardware and the operating system. The virtualization layer consists of a hypervisor that emulates PC hardware on top of a host computer in order to run standard operating systems, such as Windows or Linux.

Running thousands of desktop instances in data centers is quite energy- and cost-efficient compared to the same number of physical PCs. Defining pool policies for sizing, cloning, and recycling desktops, as well as group assignments between user directories and desktop pools, saves the administrator a lot of work. He or she decides whether the assignment between employee and virtual machine should be persistent, as it used to be with a PC under the desk in the office, or flexible, to grant temporary access to PCs in call centers, classrooms, or Internet cafés.
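The pool policies and assignment modes described above can be sketched as a simple data structure. This is a minimal illustration only; the field and pool names are hypothetical and not taken from any specific VDI product.

```python
from dataclasses import dataclass

# Hypothetical sketch of a desktop pool policy; field names are
# illustrative assumptions, not any vendor's actual configuration schema.
@dataclass
class DesktopPool:
    name: str
    size: int                # number of cloned desktop instances
    template: str            # master image the clones are derived from
    persistent: bool         # persistent pools pin one VM to one user
    recycle_on_logout: bool  # flexible pools wipe and reuse desktops

# A persistent pool behaves like the classic PC under the office desk ...
office = DesktopPool("office-staff", size=500, template="win-base",
                     persistent=True, recycle_on_logout=False)

# ... while a flexible pool grants temporary access, e.g. in classrooms.
classroom = DesktopPool("classroom", size=30, template="linux-lab",
                        persistent=False, recycle_on_logout=True)
```

The point of the sketch is the either/or decision the administrator makes per pool: persistence for regular employees, recycling for shared, temporary seats.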

What’s left of the PC on the user’s end? There is still a mouse, keyboard, and monitor. But, since all computation takes place in the data center, neither a powerful CPU, nor memory, nor a hard drive is required on the client side. The fan is obsolete as well, which leads to an access device with no moving parts and no noise; thus, lower power consumption, longevity of the client, and improved ergonomics at the work place are advantages of the virtual desktop. Maintenance costs are also low or non-existent, à la thin clients that no longer have a local operating system. Mobile access to the desktop session is possible with any remote desktop protocol (RDP) client. (Another example is the Oracle Virtual Desktop Infrastructure which uses a Java-enabled web browser.)

In order to deliver a competitive end-user experience, several areas need to be considered. CPU, memory, and storage capacity do not usually pose a problem on the virtualization side because the requirements for a specified number of machines can be estimated in advance. The usage of other shared resources is more difficult to predict, such as access to the network and to storage when hundreds of virtual machines run simultaneously. When it comes to motion graphics, the bandwidth between data center and thin client, as well as the client’s display performance, also becomes an issue.
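The "estimated in advance" part can be made concrete with a back-of-the-envelope calculation. The per-VM figures and the overcommit factor below are illustrative assumptions, not vendor recommendations.

```python
# Rough sizing for a virtual desktop pool. All numbers here are
# illustrative assumptions for the sake of the example.
def pool_requirements(vms, ram_gb_per_vm=2, disk_gb_per_vm=20,
                      overcommit=1.5):
    """Estimate aggregate RAM and storage for `vms` desktops.

    `overcommit` models the fact that not all desktops peak at once,
    so hosts can be provisioned for less than the nominal sum.
    """
    ram_gb = vms * ram_gb_per_vm / overcommit
    disk_gb = vms * disk_gb_per_vm  # capacity is rarely overcommitted as aggressively
    return ram_gb, disk_gb

ram, disk = pool_requirements(500)
# roughly 667 GB of RAM and 10,000 GB of storage for 500 desktops
```

Static capacity like this is easy to plan; it is the concurrent load on the network and storage paths, which depends on what users actually do, that resists such simple arithmetic.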

Response times are typically discussed in three orders of magnitude, from 0.1 to 10 seconds. A system response that takes longer than 0.05 to 0.2 seconds is no longer perceived as instantaneous. For example, a skilled typist produces 300 characters per minute, which equates to pressing a key every 0.2 seconds on average. If the response time is about the same, then the output is at least one letter late! The range between 0.2 and 2 seconds is where the user feels in control of the system: the delay is noticeable, but the loop of command and result still feels like a smooth dialog with the system. If an operation takes longer than two seconds, a progress indicator should be displayed. But even a progress bar cannot hold the user’s attention for longer than ten seconds. After that, the user has to reassess the system state and plan the next interaction steps to accomplish the task. These measurements for the perceptual, dialog, and cognitive levels have been well known for decades. Thus, web developers should remember that rich Internet applications must also respond in time, or provide appropriate feedback, to preserve usability when network or computation latency is too high.
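The three levels above amount to a feedback policy that can be written down directly. The function name and return values are illustrative; the thresholds follow the classic 0.1 s / 2 s / 10 s guidelines from the paragraph.

```python
# The response-time levels expressed as a feedback policy.
# Names and return values are illustrative, not from any real toolkit.
def feedback_for(delay_seconds):
    if delay_seconds <= 0.1:
        return "none"        # perceived as instantaneous
    if delay_seconds <= 2.0:
        return "subtle"      # user still feels in control; e.g. a busy cursor
    if delay_seconds <= 10.0:
        return "progress"    # show a progress indicator
    return "background"      # run asynchronously and notify on completion

# A skilled typist at 300 characters per minute presses a key every
# 60 / 300 = 0.2 seconds, so keystroke echo must stay well below that.
assert 60 / 300 == 0.2
```

For a remoted desktop, the network round trip adds to every one of these delays, which is why latency budgets matter more for VDI than for a local PC.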

In order to perceive smooth animations, a fourth time range, an order of magnitude below 0.1 seconds, should be considered. Movie cameras use a standard frame rate of 24 fps (frames per second). However, the human eye can detect flicker at frequencies up to about 60 Hz, which is also the typical frame rate for HDTV. When it comes to games on large screens with fast animations, rates of up to 100 fps are necessary to maintain the illusion of motion.
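Frame rates translate directly into per-frame time budgets, which is what a remoting protocol must meet for each full-screen update. A quick calculation:

```python
# Per-frame time budget for the frame rates mentioned above.
def frame_budget_ms(fps):
    return 1000 / fps

for fps in (24, 60, 100):
    print(f"{fps:3d} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 24 fps leaves about 41.7 ms per frame, 60 fps about 16.7 ms,
# and 100 fps only 10 ms for encoding, transfer, and display.
```

At 100 fps, the entire pipeline between data center and thin client has just 10 ms per frame, which explains why fast full-screen animation is the hardest case for desktop virtualization.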

Today, games and other applications with high-frequency, full-screen updates (video editing or CAD, for instance) are out of scope for desktop virtualization. But fast refresh rates are still necessary for cursor movements and direct-manipulation tasks like dragging windows or scrolling through long documents. Otherwise, the user may perceive hiccups in the flow and become confused or even irritated while interacting with the system. Flash movies and other video content should run smoothly as well. Here, it is better to sacrifice image quality a little than to let image and sound run out of sync. This trade-off is managed by the protocol between the thin client and the virtualization server.

Desktop virtualization is ready for prime time because virtual desktop infrastructure (VDI) systems provide a level of quality and service for enterprise customers that is comparable to the classic gray PC boxes. Flexibility, mobility, and total cost of ownership considerations convince hospitals, universities, telcos, banks, and other companies with hundreds or thousands of employees to deploy virtualized desktops. For example, there was an installation for JavaOne 2009 at the Moscone Center in San Francisco. Throughout the week, every conference attendee could access three personally assigned virtual machines running Windows 7 RC, Ubuntu 8.10, and OpenSolaris 2009.06. Altogether, 12,000 desktops were created with four Sun VDI hosts, five hypervisor hosts, and three storage servers.

The PC is no longer the center of the digital universe. On one end of the spectrum, the user’s data and attention move to social web services; on the other end, the personal computer itself will move into the data center and become part of the “cloud” service. New kinds of business models are being developed right now. Sooner rather than later, service providers will offer desktop as a service to companies of all sizes.
