Module II
Chapter Highlights

Section I: Computer Systems: End User and Enterprise Computing
Introduction
A Brief History of Computer Hardware
Real World Case: AstraZeneca, UnitedHealth, and Others: IT Asset Management—Do You Know What You’ve Got?
Types of Computer Systems
Microcomputer Systems
Midrange Systems
Mainframe Computer Systems
Technical Note: The Computer System Concept
Moore’s Law: Where Do We Go from Here?

Section II: Computer Peripherals: Input, Output, and Storage Technologies
Peripherals
Input Technologies
Real World Case: IT in Healthcare: Voice Recognition Tools Make Rounds at Hospitals
Output Technologies
Storage Trade-Offs
Semiconductor Memory
Magnetic Disks
Magnetic Tape
Optical Disks
Radio Frequency Identification
Predictions for the Future
Real World Case: IBM, Wachovia, and PayPal: Grid Computing Makes It Easier and Cheaper
Real World Case: Apple, Microsoft, IBM, and Others: The Touch Screen Comes of Age
Learning Objectives

1. Understand the history and evolution of computer hardware.
2. Identify the major types and uses of microcomputer, midrange, and mainframe computer systems.
3. Outline the major technologies and uses of computer peripherals for input, output, and storage.
4. Identify and give examples of the components and functions of a computer system.
5. Identify the computer systems and peripherals you would acquire or recommend for a business of your choice, and explain the reasons for your selections.
78 ● Module II / Information Technologies
SECTION I: Computer Systems: End User and Enterprise Computing

All computers are systems of input, processing, output, storage, and control components. In this section, we discuss the history, trends, applications, and some basic concepts of the many types of computer systems in use today. In Section II, we will cover the changing technologies for input, output, and storage that are provided by the peripheral devices that are part of modern computer systems.

Read the Real World Case regarding management of IT assets. We can learn a lot from this case about how different organizations keep track of their IT assets and manage their lifecycle. See Figure 3.1.
Today we are witnessing rapid technological change on a broad scale. However, many centuries elapsed before technology was sufficiently advanced to develop computers. Without computers, many technological achievements of the past would not have been possible. To fully appreciate their contribution, we must understand their history and evolution. Although a thorough discussion of computing history is beyond the scope of this text, let’s look quickly into the development of computers.

At the dawn of the human concept of numbers, humans used their fingers and toes to perform basic mathematical activities. Then our ancestors realized that by using objects to represent digits, they could perform computations beyond the limited scope of their own fingers and toes. Can’t you just see in your mind a cave full of cavemen performing some group accounting function using their fingers, toes, sticks, and rocks? It creates a comical, yet accurate, picture.

Shells, chicken bones, or any number of objects could have been used, but the fact that the word calculate is derived from calculus, the Latin word for “small stone,” suggests that pebbles or beads were arranged to form the familiar abacus, arguably the first human-made computing device. By manipulating the beads, it was possible with some skill and practice to make rapid calculations.

Blaise Pascal, a French mathematician, invented what is believed to be the first mechanical adding machine in 1642. The machine partially adopted the principles of the abacus but did away with using the hand to move the beads or counters. Instead, Pascal used wheels to move counters. The principle of Pascal’s machine is still in use today, for example in the counters of tape recorders and odometers. In 1674, Gottfried Wilhelm von Leibniz improved Pascal’s machine so that it could divide and multiply as easily as it could add and subtract.
When the age of industrialization spread throughout Europe, machines became fixtures in agricultural and production sites. An invention that made profound changes in the history of industrialization, as well as in the history of computing, was the mechanical loom, invented by a Frenchman named Joseph Jacquard. With the use of cards punched with holes, the Jacquard loom could weave fabrics in a variety of patterns. The loom was controlled by a program encoded into the punched cards. The operator created the program once and could duplicate it many times over with consistency and accuracy.

The idea of using punched cards to store a predetermined pattern to be woven by the loom clicked in the mind of Charles Babbage, an English mathematician who lived in the 19th century. He foresaw a machine that could perform all mathematical calculations, store values in its memory, and perform logical comparisons among values. He called it the Analytical Engine. Babbage’s Analytical Engine, however, was never built. It lacked one thing: electronics. Herman Hollerith eventually adapted Jacquard’s concept of the punched card to record census data in the late 1880s.
A Brief History of Computer Hardware
Chapter 3 / Computer Hardware ● 79
Global pharmaceuticals giant AstraZeneca needed some strong medicine of its own to fix a burgeoning IT asset management problem. It was brought about by multiple acquisitions and their nonstandard gear, a high-tech workforce spread across 255 facilities in 147 countries, and a total of more than 67,000 employees using more than 90,000 hardware and software assets ranging from notebooks to SAP and Oracle enterprise applications and databases.

With software vendors becoming more aggressive on audits as sales of new products are generally weak, and with greater internal collaboration requiring a more consistent set of tools to simplify processes and maintenance, the $31 billion pharmaceuticals company realized a few years ago that Microsoft’s Systems Management Server was simply overmatched for the job of managing the global enterprise’s complex base of IT assets.

So Microsoft recommended the asset management products offered by a French company called PS’Soft, which is a subsidiary of BDNA Corp., a top provider of IT infrastructure inventory and analysis solutions. In the years that AstraZeneca has been steadily getting its IT assets under control, PS’Soft has distinguished itself like few other IT vendors, according to AstraZeneca Global IT Asset lead Bernard Warrington. “In all my years, our engagement with PS’Soft was one of the first and only times we had an IT vendor show such willingness to work as a true partner and really try to solve our problems with us,” Warrington says.

Referring to PS’Soft’s Julian Moreau, Warrington described the uniquely open collaboration that allowed him and his team to understand the problem, design the solution, and then execute the plan. “Julian and I worked extremely closely together, and from there our partnership cascaded down to the other members of the team,” Warrington says. “But I need to be very clear about that: in the beginning, the knowledge and expertise were clearly with them—they were teaching and we were learning.”

The problem, Warrington says, is that in the increasingly strategic world of IT asset management, “the toolset itself meets only 30 percent of the overall need: on top of that, you need to build the processes, understand the costs, come up with standards, develop interfaces with other major vendors, and much more—we simply didn’t have all the skills necessary to cover that total lifecycle. But PS’Soft did have those skills, both in-house and through their contacts.”

In addition, says Warrington, PS’Soft and BDNA had the global experience necessary to help AstraZeneca get its arms around its global sprawl of IT gear, which was essential so the company could (1) begin to gain greater leverage in purchasing negotiations, and (2) be able to fairly, but aggressively, hold its own during audits by software vendors.

“In so many countries where we operate, the tradition has been that budgets are managed locally, making it impossible to see the global aggregate in detail,” Warrington says. “We simply did not have the ability to get a global view. The old tools we used gave us something of a snapshot, but didn’t let us have enough insight to be able to manage the situation. At the same time, the IT vendors are getting very aggressive with audits, and without offering a specific number I can tell you that millions and millions of dollars are at stake—and before our engagement with PS’Soft, no matter how hard we tried with the old toolset, we were just not able to achieve those potential cost savings from vendors.”

Over time, Warrington says, AstraZeneca gained that necessary level of control and knowledge: “Now AstraZeneca is in a position to enter negotiations from a position of strength, confidence, and knowledge.” And that achievement has given the company a new perspective on the realm of IT asset management. Warrington says, “Too many companies just look on IT asset management as nothing more than bean counting, versus looking deeper and understanding the ROI and ROA that can be achieved. But we learned firsthand that there is a huge opportunity to get control over what you have, to satisfy even the most rigorous audit, and to negotiate better contracts. And that’s a lot more than bean counting.”

IT organizations in diversified companies—particularly those grown through acquisition—wage a seemingly endless battle against unnecessary IT diversity and related costs. Conceived, planned, and executed in 18 months, UnitedHealth Group’s (UHG) Hercules program proves
FIGURE 3.1
AstraZeneca, UnitedHealth, and Others: IT Asset Management—Do You Know What You’ve Got?
Companies are increasingly focusing on managing the myriad of platforms, hardware, and software that make up their IT infrastructures.
Source: © Royalty Free/Corbis.
that the complexity can be conquered, while protecting or improving IT’s service levels. By creating a standard desktop configuration and consistent management processes, Hercules reduced total cost of ownership to $76 per month per desktop, from more than $240.

In 2004, with the CEO’s support, Alistair Jacques, then SVP of UHG-IT, launched Hercules, focusing it on standardizing and streamlining the processes behind desktop management: procurement, configuration, installation, life cycle, and asset management. In addition to this focus on process, two techniques stand out as key to the program’s success. Working with finance, IT developed a chargeback model that imposes a premium on nonstandardized desktop configurations: $170 per month versus $45 per month for a standard configuration. This value pricing encourages business managers to choose the more efficient infrastructure. UHG also reduced costly on-site support by reorganizing it: A central IT team manages high-level support activities, completing 95 percent remotely, while select, on-site end users (often non-IT administrative staff trained by IT) provide basic support to colleagues.

UHG-IT treated desktop management as a business process challenge rather than a technology issue. This approach freed them to use tactics like non-IT staff for desktop support and value pricing. To date, UHG has converted 75,000 out of 90,000 devices to the new standards, delivering $42 million in annual savings. UHG can now manage nearly four times the number of end users with the same number of IT personnel as in 2004, all while actually improving—not diminishing—service levels. IT now deploys 99.4 percent of releases, updates, and patches in three hours, instead of 65 percent in three weeks.

Indeed, companies that blow off asset management do so at their own peril.
At the same time, 99 percent of companies that her organization comes across don’t have a proper asset management process in place, according to Elisabeth Vanderveldt, vice president of business development at Montreal-based IT services and consulting firm Conamex International Software Corp. That’s a staggering number, considering the value that life-cycle management can bring to an organization. And it’s indicative of the widespread lack of respect for this important aspect of IT operations.

The ideal time to start considering an asset management program is before the business and its IT infrastructure are even up and running, but the common scenario is that corporations look to asset management only after they’ve encountered a problem running the infrastructure. Businesses’ mentality about asset management is evolving, however. Companies used to consider only reliability, availability, and overall equipment effectiveness in the equation. But now there is recognition of factors like continuing pressures on cost and green technology. “It really requires a mature organization to understand what’s going to be needed to assess and execute a life-cycle management strategy,” says Don Barry, associate partner in global business services in the supply chain operations and asset management solutions at IBM.

Why is a life-cycle management program important? For one thing, it puts IT in much better control of its assets, and this can have a number of benefits. “IT can make really intelligent decisions around what they should get rid of, and they might even find they have more money in the budget and they can start taking a look at newer technology and see if they can bring it in-house. Without that big picture, they just end up spending more and more money than had they been proactive,” says Vanderveldt.

Life-cycle management also has value as a risk management tool, and it aids in the disaster recovery process as well, she adds. “It’s also beneficial for those moments that are just completely out of your control, like mergers, acquisitions and uncontrolled corporate growth, either organic or inorganic,” says Darin Stahl, an analyst at London, Ontario-based Info-Tech Research Group. “IT leaders without this tool set are now charged with pulling all this information together on short notice. That could be diminished considerably in terms of turnaround time and effort for IT guys if they have a holistic asset management program in place.”
Source: Adapted from Bob Evans, “Global CIO Quick Takes: AstraZeneca Saves Millions with BDNA,” InformationWeek, February 22, 2010; Rick Swanborg, “Desktop Management: How UnitedHealth Used Standardization to Cut Costs,” CIO.com, April 28, 2009; and Kathleen Lau, “Asset Management: Do You Know What You’ve Got?,” CIO Canada, August 13, 2008.
1. What are the companies mentioned in the case trying to control, or manage, through these projects? What is the problem? And how did they get there?
2. What are the business benefits of implementing strong IT asset management programs? In what ways have the companies discussed in the case benefited? Provide several examples.
3. One of the companies in the case, UnitedHealth Group, tackled the issue by imposing standardization and “charging” those stepping outside standard models. How should they balance the need to standardize with being able to provide business units with the technologies best suited to their specific needs? Justify your answer.
1. An important metric in this area considered by companies is the Total Cost of Ownership (TCO) of their IT assets. Go online and research what TCO is and how it is related to IT asset management. How are companies using TCO to manage their IT investments? Prepare a presentation to share your research with the rest of your class.
2. What does Don Barry of IBM mean by “life-cycle” in the context of this case? How would this life-cycle management work when it comes to IT assets? Break into small groups with your classmates and create a working definition of life-cycle management and how it works as you understand it from the case.
CASE STUDY QUESTIONS
REAL WORLD ACTIVITIES
Census data were translated into a series of holes in a punched card to represent the digits and the letters of the alphabet. The card was then passed through a machine with a series of electrical contacts that were either turned off or on, depending on the existence of holes in the punched cards. These different combinations of off/on situations were recorded by the machine and represented a way of tabulating the result of the census. Hollerith’s machine was highly successful; it cut the time it took to tabulate the result of the census by two-thirds, and it made money for the company that manufactured it. In 1911, this company merged with several others to form what would become International Business Machines (IBM).

The ENIAC (Electronic Numerical Integrator and Computer) was the first electronic digital computer. It was completed in 1946 at the Moore School of Electrical Engineering of the University of Pennsylvania. With no moving parts, ENIAC was programmable and had the capability to store problem calculations using vacuum tubes (about 18,000). A computer that uses vacuum tube technology is called a first-generation computer. The ENIAC could add in 0.2 of a millisecond, or about 5,000 computations per second. The principal drawbacks of ENIAC were its size and processing ability. It occupied more than 1,500 square feet of floor space and could process only one program or problem at a time. As an aside, ENIAC’s power requirements were such that adjacent common-area lighting dimmed during its power-up and calculation cycles. Figure 3.2 shows the ENIAC complex.

In the 1950s, Remington Rand manufactured the UNIVAC I (UNIVersal Automatic Computer). It could calculate at the rate of 10,000 additions per second. In 1957, IBM developed the IBM 704, which could perform 100,000 calculations per second. In the late 1950s, transistors were invented and quickly replaced the thousands of vacuum tubes used in electronic computers.
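The add-time and throughput figures quoted for ENIAC are two views of the same number: operations per second is just the reciprocal of the time per operation. A quick check of that conversion, sketched in Python for illustration:

```python
# ENIAC's quoted add time, converted to additions per second.
add_time_seconds = 0.2e-3                 # 0.2 milliseconds per addition
adds_per_second = 1.0 / add_time_seconds  # reciprocal gives throughput

print(round(adds_per_second))  # 5000, matching the text's "about 5,000"
```

The same reciprocal relationship applies to the UNIVAC I and IBM 704 figures that follow: 10,000 additions per second corresponds to 0.1 millisecond per addition.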
A transistor-based computer could perform 200,000–250,000 calculations per second. The transistorized computer represents the second generation of computers. It was not until the mid-1960s that the third generation of computers came into being. These were characterized by solid-state technology and integrated circuitry coupled with extreme miniaturization.

No history of electronic computing would be complete without acknowledging Jack Kilby. Kilby was awarded the Nobel Prize in Physics in 2000 for his invention of the integrated circuit in 1958 while working at Texas Instruments (TI). He is also a co-inventor of the handheld calculator and the thermal printer. Without his work, which generated a patent for a “Solid Circuit made of Germanium,” our world, and most certainly our computers, would be much different and less productive than what we enjoy today.
FIGURE 3.2 ENIAC was the first electronic digital computer. It is easy to see how far we have come in the evolution of computers.
Source: Photo courtesy of United States Army.
In 1971, the fourth generation of computers was characterized by further miniaturization of circuits, increased multiprogramming, and virtual storage memory. In the 1980s, the fifth generation of computers operated at speeds of 3–5 million calculations per second (for small-scale computers) and 10–15 million instructions per second (for large-scale computers).

The age of microcomputers began in 1975 when a company called MITS introduced the ALTAIR 8800. The computer was programmed by flicking switches on the front. It came as a kit and had to be soldered together. It had no software programs, but it was a personal computer available to the consumer for a few thousand dollars when most computer companies were charging tens of thousands of dollars. In 1977, both Commodore and Radio Shack announced that they were going to make personal computers. They did, and trotting along right beside them were Steve Jobs and Steve Wozniak, who built their first computer in a garage. Mass production of the Apple began in 1979, and by the end of 1981, it was the fastest selling of all the personal computers. In August 1981, the IBM PC was born, and many would argue that the world changed forever as a result.

Following the introduction of the personal computer in the early 1980s, we used our knowledge of computer networks gained in the early days of computing and combined it with new and innovative technologies to create massive networks of people, computers, and data on which anyone can find almost anything: the Internet. Today we continue to see amazing advancements in computing technologies. Okay, it’s time to slow down a bit and begin our discussion of today’s computer hardware.
Today’s computer systems come in a variety of sizes, shapes, and computing capabilities. Rapid hardware and software developments and changing end-user needs continue to drive the emergence of new models of computers, from the smallest handheld personal digital assistant/cell phone combinations to the largest multiple-CPU mainframes for enterprises. See Figure 3.3.
Types of Computer Systems
FIGURE 3.3 Examples of computer system categories.

Microcomputer systems: personal computers, network computers, technical workstations, personal digital assistants, information appliances, etc.
Midrange systems: network servers, minicomputers, Web servers, multiuser systems, etc.
Mainframe systems: enterprise systems, superservers, transaction processors, supercomputers, etc.

Source: Courtesy of Hewlett-Packard.
Categories such as mainframe, midrange, and microcomputer systems are still used to help us express the relative processing power and number of end users that can be supported by different types of computers. These are not precise classifications, and they do overlap each other. Thus, other names are commonly given to highlight the major uses of particular types of computers. Examples include personal computers, network servers, network computers, and technical workstations.

In addition, experts continue to predict the merging or disappearance of several computer categories. They feel, for example, that many midrange and mainframe systems have been made obsolete by the power and versatility of networks composed of microcomputers and servers. Other industry experts have predicted that the emergence of network computers and information appliances for applications on the Internet and corporate intranets will replace many personal computers, especially in large organizations and in the home computer market. Still others suggest that the concept of nanocomputers (computing devices smaller than microcomputers) will eventually pervade our entire understanding of personal computing. Only time will tell whether such predictions will equal the expectations of industry forecasters.
The entire center of gravity in computing has shifted. For millions of consumers and business users, the main function of desktop PCs is as a window to the Internet. Computers are now communications devices, and consumers want them to be as cheap as possible.
Microcomputers are the most important category of computer systems for both businesspeople and consumers. Although usually called a personal computer, or PC, a microcomputer is much more than a small computer for use by an individual as a communication device. The computing power of microcomputers now exceeds that of the mainframes of previous computer generations, at a fraction of their cost. Thus, they have become powerful networked professional workstations for business professionals.

Consider the computing power on the Apollo 11 spacecraft. Most certainly, landing men on the moon and returning them safely to Earth was an extraordinary feat. The computer that assisted them in everything from navigation to systems monitoring was equally extraordinary. Apollo 11 had a 2.048 MHz CPU that was built by MIT; by today’s standards, many home PCs run at 4 GHz (1 MHz is 1 million computing cycles per second, and 1 GHz is 1 billion computing cycles per second). Further, the Apollo 11 computer weighed 70 pounds, whereas today’s powerful laptops weigh as little as 1 pound. This is progress, for sure.

Microcomputers come in a variety of sizes and shapes for a variety of purposes, as Figure 3.4 illustrates. For example, PCs are available as handheld, notebook, laptop, tablet, portable, desktop, and floor-standing models. Or, based on their use, they include home, personal, professional, workstation, and multiuser systems. Most microcomputers are desktops designed to fit on an office desk or laptops for those who want a small, portable PC. Figure 3.5 offers advice on some of the key features you should consider when acquiring a high-end professional workstation, multimedia PC, or beginner’s system. This breakdown should give you some idea of the range of features available in today’s microcomputers.
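The Apollo-era versus modern clock speeds quoted above reduce to simple ratio arithmetic. A minimal sketch in Python, using only the figures given in the text (the 4 GHz number is illustrative of a fast home PC, not a specific model):

```python
# Clock rates from the text: Apollo 11's guidance computer vs. a modern home PC.
apollo_hz = 2.048e6  # 2.048 MHz (1 MHz = 1 million cycles per second)
modern_hz = 4.0e9    # 4 GHz (1 GHz = 1 billion cycles per second)

ratio = modern_hz / apollo_hz
print(ratio)  # 1953.125 -> roughly a 2,000-fold difference in clock rate
```

Note that clock rate alone understates the real gap, since modern processors also complete far more work per cycle, but even this crude ratio makes the text’s point about progress.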
Some microcomputers are powerful workstation computers (technical workstations) that support applications with heavy mathematical computing and graphics display demands, such as computer-aided design (CAD) in engineering or investment and portfolio analysis in the securities industry. Other microcomputers are used as network servers. These are usually more powerful microcomputers that coordinate telecommunications and resource sharing in small local area networks (LANs) and on Internet and intranet Web sites.
FIGURE 3.4 Examples of microcomputer systems:
a. A notebook microcomputer. Source: Hewlett-Packard.
b. The microcomputer as a professional workstation. Source: Corbis.
c. The microcomputer as a technical workstation. Source: Courtesy of Hewlett-Packard.
FIGURE 3.5 Examples of recommended features for the three types of PC users. Note: www.dell.com and www.gateway.com are good sources for the latest PC features available.
To track products, customers, and firm performance, more than just a fast machine is necessary:
• 3–4 GHz dual-core processor
• 4–8 GB RAM
• 500 GB hard drive
• Up to 19-inch flat-panel display
• CD-RW/DVD+RW
• Network interface card
• Color laser printer

Multimedia Heavy or Gamer

Media pros and dedicated gamers will want at least a Mac G4 or a 2–3 GHz Intel dual-core chip, and:
• 4–8 GB RAM
• 250+ GB hard drive
• 19-inch or better flat-panel display
• 16x or better DVD+RW
• Video cards (as fast and as powerful as budget permits)
• Sound cards
• Laser printer (color or B&W)