Tuesday, September 2, 2008

Evolution Of Computers:

The first computer was conceptualized by a 19th-century British mathematician named Charles Babbage -- also credited with inventing the speedometer -- whose Analytical Engine, a programmable logic center with memory, was never built even though Babbage worked on it for 40 years. During World War II, the British built Colossus I, a computer designed to break Nazi military codes, while at Harvard, IBM's Mark I computer was assembled. In 1946, ENIAC (Electronic Numerical Integrator and Calculator) was put to work at the University of Pennsylvania. Designed to perform artillery firing calculations, ENIAC was America's first digital electronic computer. Unlike the Mark I, which used electromechanical relays, ENIAC used vacuum tubes -- 18,000 of them, in fact -- and was much faster. Computer designers soon standardized on the binary system, which reduced the counting process to two digits, 0 and 1. Binary was a natural fit for the basis of computer operation: combinations of on-off (yes-no) switches called logic gates that responded to electrical charges and produced information called bits. (A cluster of eight bits was called a byte.)

In 1947, Bell Labs invented the transistor, which replaced the vacuum tube as a faster, smaller conduit of electrical current, and a few years later the Labs produced Leprechaun, an early fully transistorized computer. One of the transistor's inventors, William Shockley, left Bell Labs and started his own firm in Palo Alto, California -- the center of an area that would soon be known as Silicon Valley. It was soon discovered that any number of interconnected transistors or circuits could be placed on a microchip of silicon. One drawback of the microchip -- that it was "hard-wired" and thus could perform only the duties for which it was designed -- was resolved by Intel Corporation's invention of the microprocessor, a single chip that could be programmed to perform a variety of tasks. These technological advances made computers smaller, faster and more affordable, paving the way for the advent of the personal computer.
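To make the binary idea concrete, here is a small present-day sketch (written in Python purely for illustration, and obviously far removed from 1940s hardware) of numbers reduced to 0s and 1s, logic gates as on-off decisions, and a byte as a cluster of eight bits:

    def and_gate(a, b):
        # A logic gate: output is on (1) only when both inputs are on.
        return a & b

    def or_gate(a, b):
        # A logic gate: output is on (1) when either input is on.
        return a | b

    number = 42
    bits = format(number, "08b")          # eight bits, i.e. one byte
    print(number, "in binary is", bits)   # prints: 42 in binary is 00101010

    # Combining individual bits the way hardware switches do:
    print(and_gate(1, 0), or_gate(1, 0))  # prints: 0 1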
The first personal computer was the Altair 8800, which briefly appeared on the scene in 1975. Two years later, the Apple II -- the creation of Steve Jobs and Steve Wozniak -- was unveiled; according to Time, it was "the machine that made the revolution." Wozniak was Apple's star designer, while Jobs used uncanny marketing skills to build Apple into a highly profitable concern with stock worth $1.7 billion by 1983. The personal computer did not long remain the private domain of Apple and one or two other upstart companies. In 1981, IBM, which had previously focused its efforts on manufacturing mainframe business computers, introduced its PC. Incorporating the Intel microprocessor, the IBM PC set the standard for quality. That same year, Adam Osborne, a Bangkok-born British columnist, introduced a 24-pound portable computer with a detachable keyboard, 64K of memory and a diminutive five-inch screen, which retailed for $1,795. The progenitor of the laptop, the Osborne 1 was so successful that a host of imitators quickly followed. Even more compact than the Osborne, Clive Sinclair's 12-ounce ZX81 -- sold in the U.S. as the Timex Sinclair 1000 -- introduced many people to the possibilities of computers, in spite of its limitations, thanks to a $99 list price. One of the most popular and least expensive personal computers of the early Eighties was the $595 Commodore 64.
Concerns were expressed about the consequences of the computer revolution. One argument was that computers would widen the gap between the "haves" and the "have-nots," as it seemed that only those with good educations and plenty of discretionary income could afford to plug into the "information network" represented, early in the decade, by nearly 1,500 databases such as The Source (a Reader's Digest subsidiary), the legal database Westlaw, and the American Medical Association's AMA/NET. Others worried that people would come to rely too much on computers to do what they had once done in their own heads, namely to remember and analyze, thereby making a lot of learning unnecessary. Still others feared that the computer revolution would result in an increasingly isolated populace; when people worked at home, networking with their company via computer, they would lose the personal contact that made the workplace an essential element in the social fabric of the community. There was fear, as well, that the computerization of industry would cost workers their jobs. Futurists like Alvin Toffler, author of the 1980 bestseller The Third Wave, envisioned a not-too-distant future in which an entire family would learn, work and play around an "electronic hearth" -- the computer -- at home.
Proponents countered that the computer would prove to be a tremendous boon to humanity. The salutary effect of computers in the field of medicine was considerable, providing more precise measurements and monitoring in such procedures as surgical anesthesia, blood testing and intravenous injections. Computers promised widened employment horizons for 10 million physically handicapped Americans, many of whom were unable to commute back and forth to work. And a majority of Americans believed computers were an excellent educational tool. A 1982 Yankelovich poll showed Americans were generally sanguine about the dawning Computer Age: 68 percent thought computers would improve their children's education, 80 percent expected computers to become as commonplace as televisions in the home, and 67 percent believed they would raise living standards through enhanced productivity. Sales reflected that optimism: 2.8 million computers were sold in the U.S. in 1982, up from 1.4 million in 1981, which was in turn double the number sold in 1980. In 1982, 100 companies shared the $5 billion in sales, with Texas Instruments, Apple, IBM, Commodore, Timex and Atari the frontrunners in the personal computer market. And by 1982 there were more than 100,000 computers in the nation's public schools. Educators were delighted to report that students exposed to computers studied more and were more proficient in problem solving. Kids joined computer clubs and attended summer computer camps. In lieu of naming a "Man of the Year" for 1982, Time named the computer the "Machine of the Year."
By mid-decade the computer craze was at full throttle. On Wall Street, 36 computer-related companies had gone public with $798 million worth of stock. By 1984 retail sales of personal computers and software had reached $15 billion. A flurry of entrepreneurship in the so-called computer "aftermarket" produced $1.4 billion in sales of such commodities as computer training and tutoring services, specialty furniture, and publishing. In the latter field there were 4,200 books on personal computing in print (compared to only 500 in 1980), as well as 300 magazines, including Byte and Popular Computing. In addition, thousands of software developers competed fiercely for their share of sales that exceeded $2 billion annually by 1984. Another $11 billion was made leasing mainframe software to banks, airlines and the government. The biggest software manufacturer was Microsoft, with 1984 revenues of about $100 million. Systems software, like Microsoft's MS-DOS and AT&T's UNIX, instructed the various elements of a computer system to work in unison. Another popular program was Lotus 1-2-3, a spreadsheet plus electronic filing system aimed at the business market. Software quality ranged from outstanding to awful, as did prices. Software errors delayed the release of Apple's Macintosh computer by two years. Software pirating became a big business as well; it was estimated that as many as 20 pirated copies of a program were peddled for every one legitimate purchase.
The public sector embraced the revolution. The Grace Commission -- the presidential review of waste in government -- criticized what it saw as a failure to seize the opportunities presented by the new technologies, and was followed by a five-year plan, launched in 1983, to increase government computer expenditures. The goal, according to one General Services Administration official, was to put "a million computers in the hands of managers as well as those who need them daily for such things as air traffic control, customs, passports and Social Security." The White House itself became computerized, with 150 terminals linked to a powerful central system. Cabinet members like Treasury Secretary Donald Regan carried desktop units with them everywhere. An email network linked the administration with 22 federal agencies. The Library of Congress began copying its more valuable collections onto digital-optical disks for public retrieval. The National Library of Medicine at Bethesda, Maryland stored electronic copies of five million medical books and articles in a single database. In 1984 the Internal Revenue Service began using optical scanners to process 18 million 1040EZ tax forms into a computer system. And the FBI's Organized Crime Information System, a $7 million computer, began compiling data on thousands of criminals and suspects.
By 1988, computer viruses had become a major concern. In that year alone, over 250,000 computers were infected in a nine-month period, raising grave concerns about the vulnerability of data systems. In September of that year the first computer virus criminal trial was conducted in Fort Worth, Texas. The defendant, a disgruntled ex-employee, was accused of infecting his former company's computer with a virus that deleted 168,000 sales commission records. Like their biological counterparts, such viruses were designed to reproduce perfect replicas of themselves, infecting any software that came into contact with the host computer. And there was no telling where a virus would strike. One virus, produced by the owners of a computer store in Lahore, Pakistan, managed to infect 10,000 IBM PC disks at George Washington University in the U.S. The SCORES virus spread from a Dallas-based computer services company to Boeing, NASA, the IRS and the House of Representatives. There was concern that one day a "killer virus" would find its way into the nation's electronic funds-transfer system, crash the stock exchange computer centers, or scramble air-traffic control systems. In response, 48 states quickly passed computer mischief laws, while companies and computer users scrambled to create antiviral programs.
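The earliest antiviral programs worked, in essence, by searching files for the telltale byte patterns ("signatures") of known viruses. Below is a minimal present-day sketch of that idea in Python; the signature and the filename in the usage note are invented placeholders, not taken from any real virus or commercial scanner:

    # Minimal signature scanner: flag any file containing a known byte pattern.
    # The signature below is a made-up placeholder for illustration only.
    KNOWN_SIGNATURES = {
        "EXAMPLE-VIRUS": b"\xde\xad\xbe\xef",
    }

    def scan_file(path):
        # Read the file as raw bytes and report which signatures appear in it.
        with open(path, "rb") as f:
            data = f.read()
        return [name for name, sig in KNOWN_SIGNATURES.items() if sig in data]

    # Usage (hypothetical filename):
    # hits = scan_file("PROGRAM.EXE")
    # if hits:
    #     print("Possible infection:", ", ".join(hits))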
Clearly, the computer revolution of the 1980s opened up a new frontier, one that Americans, true to their pioneer traditions, were eager to explore. Unanticipated perils lay in wait, but the possibilities in terms of enhancing our lives were limitless.
Supercomputers
As the end of the decade approached, supercomputers -- costing between $5 million and $25 million each -- were being used to locate new oil deposits, create spectacular Hollywood special effects, and design new military weapons, not to mention artificial limbs, jet engines, and a host of other products for private industry. These machines were able to crunch data at speeds measured in gigaFLOPS -- billions of floating-point operations per second. In size, most were no larger than a hot tub. The National Science Foundation established five supercomputer centers, which by 1988 were linked to 200 universities and research labs. The Los Alamos National Laboratory utilized eleven supercomputers. In 1988, IBM financed a machine incorporating parallel processing -- the use of 64 processors in tandem -- which would make it 100 times faster than the supercomputers then in use. At the same time, IBM was working on the TF-1, consisting of 33,000 high-speed processing units -- a computer that would be 20,000 times faster than anything on the market.
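As a rough illustration of what those speed figures mean, the sketch below (modern Python, with a hypothetical workload size and per-processor speed) shows how a gigaFLOPS rating and a count of parallel processors translate into time-to-solution, assuming the work divides evenly:

    GIGA = 1_000_000_000

    def seconds_to_solve(total_operations, gigaflops_per_processor, processors=1):
        # Ideal case: the workload splits evenly and every processor stays busy.
        ops_per_second = gigaflops_per_processor * GIGA * processors
        return total_operations / ops_per_second

    workload = 1e12   # a trillion floating-point operations (hypothetical)

    one = seconds_to_solve(workload, gigaflops_per_processor=1.0)
    sixty_four = seconds_to_solve(workload, gigaflops_per_processor=1.0, processors=64)

    print(round(one), "seconds on one processor")          # 1000 seconds
    print(round(sixty_four), "seconds on 64 processors")   # about 16 seconds

In practice the speedup falls short of the ideal, since not every problem splits evenly across processors, which is why the machines described above paired parallelism with faster individual processing units.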
