History of Computing

Computer Literacy, Spring 2000

Hopefully you plan on taking advantage of the extra credit assignment associated with this lecture and visiting the American Computer Museum.
As you can tell, this lecture is like a chapter in a book - no neat, clean definitions - so READ, READ, READ before you take the test. You will need to know a lot from this lecture for the final. Plus, a lot of this is just stuff you should know; it will make you look smart in everyday conversations.
 


Introduction

Some people get turned off by "history." Their reaction to the subject is "Why do I need to know this, and what effect does it have on my life?" An understandable reaction. I feel that way sometimes about the history of the Roman empire. My goal for this subject is to introduce you to the people and events that helped shape the way we use computers today, and to the personalities behind why we do what we do with computers. Many of these people are dead and gone, but many of them are still kickin'! I'm not concerned about you remembering dates except to use them to keep events in perspective. "Back to the future" is the best way to describe computer history.

Way-y-y back in time

The original term "computers" actually referred to humans, usually women, who used old-fashioned desk calculators to compute artillery shell trajectories in World War I. It took about 40 hours to produce one firing table. You can get an idea of how slow this process was!

The abacus, which emerged about 5,000 years ago in Asia Minor and is still in use today, may be considered the first computer.

In 1642, Blaise Pascal (1623-1662), the 18-year-old son of a French tax collector, invented what he called a numerical wheel calculator to help his father with his duties. This brass rectangular box, also called a Pascaline, used eight movable dials to add sums up to eight figures long. Pascal's device used a base of ten to accomplish this. For example, as one dial moved ten notches, or one complete revolution, it moved the next dial - which represented the tens column - one place. When the tens dial moved one revolution, the dial representing the hundreds place moved one notch, and so on. The drawback to the Pascaline, of course, was its limitation to addition.
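
Just to make the carrying idea concrete, here is a tiny sketch in Python (my own illustration, not Pascal's actual mechanism): eight base-ten "dials," where every full revolution of one dial bumps the next dial along.

    # Hypothetical sketch of the Pascaline's carry: dials[0] is the ones dial,
    # dials[7] the ten-millions dial. Ten notches on one dial is a full
    # revolution, which advances the next dial by one notch.
    def turn_dial(dials, position, notches):
        total = dials[position] + notches
        dials[position] = total % 10
        if total >= 10 and position + 1 < len(dials):
            turn_dial(dials, position + 1, total // 10)   # the carry

    dials = [0] * 8              # a blank machine
    turn_dial(dials, 0, 7)       # add 7 on the ones dial
    turn_dial(dials, 1, 9)       # add 9 on the tens dial (i.e. 90)
    turn_dial(dials, 0, 5)       # add 5 more: 7 + 90 + 5 = 102
    print(dials)                 # [2, 0, 1, 0, 0, 0, 0, 0], which reads as 102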

In 1694, a German mathematician and philosopher, Gottfried Wilhelm von Leibniz (1646-1716), improved the Pascaline by creating a machine that could also multiply.

The first mechanical computer was designed in the mid-1800s by Charles Babbage. It was a big hunk of metal that took hours and hours to "program" and performed very limited functions - mostly simple addition, subtraction, multiplication, and division. There were no keyboards, no screens, and certainly no memory as we think of it today.

Whoever has the impression that women didn't play an integral role in the development of computers doesn't know their history. Babbage's collaborator was Lady Augusta Ada Byron; yes, the daughter of Lord Byron, the famous poet. She helped develop instructions for doing computations on Babbage's machine and has been credited with writing the first computer program. In the late 1970s the U.S. Department of Defense developed a new programming language and named it Ada in her honor. That language is still in use today.

The first major use of automated data processing was for the 1890 government census. Ever wonder why the Census is only completed every 10 years in the United States? Because that's about how long it took to complete the count using manual processes. Herman Hollerith built a tabulating machine that he used to complete the 1890 census, reducing the time from seven and a half years to about two and a half years. With all the political bickering we have nowadays over the Census, I think we're back up to seven years.

In 1889, an American inventor, Herman Hollerith (1860-1929), also applied the Jacquard loom concept to computing. His first task was to find a faster way to compute the U.S. census. The previous census in 1880 had taken nearly seven years to count, and with an expanding population, the bureau feared it would take 10 years to count the latest census. Unlike Babbage's idea of using perforated cards to instruct the machine, Hollerith's method used cards to store data, which he fed into a machine that compiled the results mechanically. Each punch on a card represented one number, and combinations of two punches represented one letter. As many as 80 variables could be stored on a single card. Instead of ten years, census takers compiled an initial count in just six weeks with Hollerith's machine (the full tabulation still took about two and a half years). In addition to their speed, the punch cards served as a storage method for data and helped reduce computational errors. Hollerith brought his punch card reader into the business world, founding the Tabulating Machine Company in 1896, which became International Business Machines (IBM) in 1924 after a series of mergers. Other companies such as Remington Rand and Burroughs also manufactured punch card readers for business use. Both business and government used punch cards for data processing until the 1960s.
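
To picture how a punch card stores data, here is a much-simplified sketch in Python (a hypothetical illustration; real Hollerith and later IBM card codes were more elaborate, and letters needed two punches per column).

    # One card = 80 columns; punching a hole in one of rows 0-9 of a column
    # records a single digit in that column.
    BLANK = None

    def punch_digit(card, column, digit):
        card[column] = digit                      # "punch" the hole

    def read_card(card):
        return "".join(str(d) if d is not BLANK else "." for d in card)

    card = [BLANK] * 80                           # a fresh, unpunched card
    for col, digit in enumerate([1, 8, 9, 0]):
        punch_digit(card, col, digit)             # store the year 1890
    print(read_card(card))                        # "1890" then 76 dots (unpunched columns)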

Early calculating machines were cumbersome because hundreds of gears and shafts were required to represent numbers and their various relationships to each other. To eliminate this bulkiness, John V. Atanasoff (b. 1903), a professor at Iowa State College (now called Iowa State University), and his graduate student, Clifford Berry, envisioned an all-electronic computer that applied Boolean algebra to computer circuitry. This approach was based on the mid-19th century work of George Boole (1815-1864), who formalized a binary system of algebra in which any logical statement can be expressed as simply true or false. By extending this concept to electronic circuits in the form of on or off, Atanasoff and Berry had developed the first all-electronic computer by 1940. Their project, however, lost its funding and their work was overshadowed by similar developments by other scientists.
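
Here is a quick sketch (in Python) of why Boole's true/false algebra matters for circuits: with nothing but on/off values and a couple of logic operations you can already do arithmetic. This one-bit "half adder" is my own illustration, not Atanasoff and Berry's actual design.

    # Add two binary digits using only logic: XOR gives the sum bit,
    # AND gives the carry bit - exactly what an electronic circuit can do
    # with switches that are either on (1) or off (0).
    def half_adder(a, b):
        sum_bit = a ^ b      # XOR: on when exactly one input is on
        carry = a & b        # AND: on only when both inputs are on
        return sum_bit, carry

    for a in (0, 1):
        for b in (0, 1):
            print(a, "+", b, "->", half_adder(a, b))
    # 1 + 1 -> (0, 1): sum 0, carry 1, which is binary 10 - that is, two.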
 

Generations

To make it easier to keep a perspective on the fast changing computer world, the idea of "generations" was created.

First generation - Vacuum Tube

This generation lasted most of the 1950s.  Some of us "old-timers" remember when televisions used little glass tubes as the main component.  Thousands of these tubes were used in a typical computer and created vast amounts of heat.   They also burned out easily and were hard to maintain.

In 1943, the British completed a secret code-breaking computer called Colossus to decode German messages. The Colossus's impact on the development of the computer industry was rather limited for two important reasons. First, Colossus was not a general-purpose computer; it was only designed to decode secret messages. Second, the existence of the machine was kept secret until decades after the war. Shortly after this, the ENIAC and UNIVAC were developed; you will see below the impact of these two computers.

First generation computers were characterized by the fact that operating instructions were made-to-order for the specific task for which the computer was to be used. Each computer had a different binary-coded program called a machine language that told it how to operate. This made the computer difficult to program and limited its versatility and speed. Other distinctive features of first generation computers were the use of vacuum tubes (responsible for their breathtaking size) and magnetic drums for data storage.
 

Second generation - Transistors

In 1947 three scientists working at Bell Labs invented the transistor, which eventually replaced the vacuum tube used to transfer the electrical signals inside computers and numerous other electronic appliances. I remember having a tiny transistor radio when I was a child - great stuff back then! The transistor allowed for better computers because it reduced the amount of heat generated and didn't wear out as easily as the vacuum tube.

Throughout the early 1960s, there were a number of commercially successful second generation computers used in business, universities, and government from companies such as Burroughs, Control Data, Honeywell, IBM, Sperry-Rand, and others. These second generation computers were also of solid state design, and contained transistors in place of vacuum tubes. They also contained all the components we associate with the modern day computer: printers, tape storage, disk storage, memory, operating systems, and stored programs. One important example was the IBM 1401, which was universally accepted throughout industry, and is considered by many to be the Model T of the computer industry. By 1965, most large businesses routinely processed financial information using second generation computers.
 

Third generation - Integrated Circuits on Chips

Around the mid-60s, scientists discovered sand! Yes, sand. You know, that stuff you find on all the beaches and that gets down in your bathing suit whenever you swim in the ocean. In the scientific world they call it silicon. Now you know why they call it "Silicon Valley" - that area south of San Francisco where a great number of technology companies are located.

Silicon was transformed into "chips" that hold tiny electronic circuits. One tiny chip of silicon was able to replace huge boards of transistors, which further reduced the size and cost of computers and allowed for more reliability and compactness. This breakthrough led to the IBM 360 series of computers, manufactured of course by IBM. The 360 computer was housed in a blue metal box, hence the nickname of "Big Blue" for IBM.

Fourth Generation - Microprocessors

In 1971 Intel Corporation introduced the first microprocessor.  Even though it too was built on a silicon chip, the architecture was so different from the previous methods that it revolutionized the computer industry.  Microprocessors were used not only in computers but watches, radios, cars, toasters, microwave ovens, etc.  Because of the microprocessor, computers could be made smaller, faster, cheaper, simpler, and with a greater capacity than ever before.  No longer did we need these huge rooms that were environmentally controlled, with specially trained technicians to operate them.  Some experts say that the PC was made possible only by the invention of the microprocessor.

Fifth - Networks

Where it was easy to differentiate between the first, second, and third generations, it hasn't been that simple to determine the beginning of the Fifth Generation. Some folks say we aren't there yet. While this generation may not be marked by a distinctly different machine, it can be marked by a much different way of using computers. Networks have been around since the late 60s and early 70s. Back then, however, there was no such thing as a personal computer and the Internet was only used by a select few. In the 90s we have small computers which can be operated as stand-alone machines or connected to larger computers to transmit data back and forth. Certainly the explosion of Internet usage by the average person over the last five years has left a definite mark on the way we use computers and communicate with others.

Electronic computer industry began about 50 years ago

The ENIAC (Electronic Numerical Integrator and Calculator) was the first large-scale electronic digital computer. Now there's a catchy name for a computer, don't ya think!

It was built for the U.S. Army during WWII, though it wasn't completed until 1946. It weighed 30 tons, had 18,000 vacuum tubes, occupied a space about 30 by 50 feet, and stood two stories high. Try putting that in your dorm room or on your dining room table at home! Today you can replace it with one fingernail-sized chip.

The first commercial electronic digital computer was the UNIVAC - short for Universal Automatic Computer.  Where do they come up with these names!  Again, the U.S. Census Bureau was the first customer to use this computer.   It was also the first computer built for business applications and not strictly military usage.

Both the U.S. Census Bureau and General Electric owned UNIVACs. One of UNIVAC's impressive early achievements was predicting the winner of the 1952 presidential election, Dwight D. Eisenhower.
 

Interesting Facts, Events, and People

Here are some interesting tidbits of history -
Of all the pioneers of personal computers and the Web, Douglas Engelbart may be the poorest and least-known. But the man who invented the mouse is finally winning recognition as the 1997 winner of the Lemelson-MIT prize - a $500,000 jackpot awarded annually to an American inventor. Engelbart, 72, began his work in 1951. In the 1960s, at Stanford Research Institute, his inventions included on-screen windows with menus, groupware, and hypertext. He came up with the mouse in 1963, carving the first one from wood. He got no royalties, though, because researchers then assigned patents to their employers. Engelbart says his big frustration is that his inventions have never fulfilled his dream: having computers help solve society's problems. He still promotes his ideals through the Bootstrap Institute, a think tank he founded in 1989. Compared with the challenges ahead, he warns, "everything in the past is peanuts." As quoted in Business Week magazine, 1997.
An interesting note: Douglas Engelbart visited Montana State University and the American Computer Museum in Bozeman in May of this year. I had a chance to talk to him personally and asked him how he came up with the idea of the mouse. He replied that he sketched the idea of the mouse at a conference because he was bored! The original mouse had three buttons on it. When I asked how he came up with the number three, he stated that three were all he could fit on the device. Interestingly, we still work with three buttons on some mice. Guess what the original name was for the mouse device: X-Y Position Indicator for a Display System! Okay, now we understand why the name changed.

Personal Computer Industry

It wasn't until the late 1980s that the  PC industry really took off and made significant inroads into the way computers were used.  Until that time they were a "tinkerer's toy" so to speak.  Several forces converged around 1988 to 1990 that caused a significant shift in computing.  More powerful, more reliable, easy-to-use PCs were available to replace mainframes and minicomputers in the workplace.   Prices of PCs began falling from the $5000-$7000 range down to the $2000-$3000 range.  More people could afford to purchase one for their home.  A wider variety of PC software became available that turned the PC from a "business machine" into one that had more personal uses such as word processing,  home accounting, and games.  In short, the PC was no longer considered strictly a "business" machine.  In the early 90s Microsoft introduced software that made the IBM PC compatibles easier to use for the average person.

All of these events happened about the same timeframe and now, in the late 90s, about 55%-60% of the U.S. population considers a PC in the home an essential appliance.
 

Grace Hopper

I always like to talk about Grace Hopper because she was such an interesting person and very influential in computer history. What I remember most about this extraordinary woman was her motto "It's better to beg forgiveness than to ask permission." You see, many of her ideas were scoffed at, ridiculed, and outright denied. She was a reserve officer in the U.S. Navy and during the 50s and 60s developed many concepts and ideas that are still with us today. She was a driving force behind the programming language COBOL, which is still used in business. After a while, she got tired of asking permission to develop something or other for computers and having her superiors shoot it down as another dumb idea. So she just did it and then told the bosses about it later.

Several times the Navy brass tried to get rid of her and force her retirement from the Navy.  She just picked up the phone, called the President of the United States, who in turn told the brass to leave her alone.  She eventually retired on her own in the late 80s as the oldest person to ever retire from the U.S. military.

Back when computers used vacuum tubes and electromechanical relays, the computer that Grace was using quit running. Her colleagues spent hours trying to determine the cause of the malfunction. She finally opened up the back of the computer, found a dead moth in one of the relays, and declared that she had removed the "bug" from the computer. Since that time, the term "bug" has been used to describe any kind of problem with computer hardware or software.

Graphical User Interface and the Mouse

Along the road of evolution and revolution in computers, many companies have made mistakes.  None more so than Xerox.  You're most familiar with their copy machines.  I bet you didn't know that they developed some of the greatest inventions for computers.  At the time they didn't know it either.  They're still paying for their mistakes.  Read on -

The first modern GUI (Graphical User Interface) and mouse were in fact engineered at a Xerox research center called PARC, the Palo Alto Research Center. (Remember, Douglas Engelbart invented the very first mouse.) A couple of their scientists created these tools to make it easier for people to use and interact with computers. However, management didn't see a need for them since the company was more concerned about copy machines. Here's an excerpt from Business Week magazine:

It was one of the great fumbles of all time. In the 1970s, Xerox Corp.'s Palo Alto Research Center (PARC) developed the technologies that would drive the personal computer revolution. "By 1979, we had it all - graphical user interfaces, mice, windows and pull-down menus, laser printing, distributed computing, and Ethernet," recalls M. Frank Squires, a PARC founder in 1970 and now chief administrative officer of Sematech Inc., the chip-industry consortium in Austin, Tex.
Xerox had the PC and networking businesses firmly hooked - but didn't try to reel them in. It didn't even patent PARC's innovations. Management was too preoccupied with aggressive competition from Japan in its core copier business, says CEO Paul A. Allaire. "If we'd been really good, we could have done both. We probably should have," he admits.
Instead, PARC's technologies became the foundation for such icons as Apple Computer Inc. and 3Com Corp. Apple co-founder Steven P. Jobs visited PARC in 1979 and was astonished at what he saw. His PARC tour inspired the Macintosh. That same year, Ethernet inventor Robert M. Metcalfe left PARC to start 3Com (the company whose name now adorns Candlestick Park in San Francisco!). These are just two famous examples of the great Xerox giveaway.
"Xerox won't duplicate past errors." Business Week, September 29, 1997
And all you Apple diehards thought Steve Jobs and Steve Wozniak developed the mouse and GUI.  Nope!  After Steve Jobs' visit to Xerox PARC he took what he saw back to his garage and applied it to his computing invention.  Thus, Xerox lost out big time and Apple went on to become one of the most successful computing companies ever.

In 1990 Microsoft adapted these two creations to the IBM PC compatible machines.

Software Industry

Until the late 70s, computer "software" as we know and use it today didn't exist. Hardly any computer magazines existed until the late 70s. Each computer built before the 70s had to have its own individual programs written specially for it. Can you imagine the time it took to do that? It was common back then for a company to order a computer and have to wait two or three years for it to be built, programmed, and delivered. We get impatient now if we have to wait two or three days for our new computers. Most companies wrote their own software, and usually the program only performed a single task - payroll, for example. If the company wanted a computer to also control their inventory, they had to purchase and wait for another computer that would do just that one operation.

The 1970s saw the beginning of the software revolution, in that one program could be written to operate on many computers. Still, a given program would only perform one task. But at least it cut down the time that companies had to wait for their programs.

PCs helped revolutionize the software industry by creating a need for "off the shelf" software, first made available in the early 80s. Most people have the impression that word processing software was the first type of program created for PCs. Nope, it was a program called VisiCalc, a spreadsheet program. Even though VisiCalc was the first "off the shelf" software program for PCs, Lotus 1-2-3 later became the most popular spreadsheet.

Wider usage of PCs called for simpler installation of software that the average user could complete on their own.  This type of software was made available in late 80s and 90s.  While some of you may not agree that today's software is easier to install and use, trust me, it is.  I remember in 1986 typing in seven pages of numbers that represented the latest, greatest word processing program.  It took me about two days of nonstop typing to "load" that program to my computer's memory - which by the way, consisted of an old fashioned cassette tape recorder.

Apple Computer

Along about 1977 the "two Steves," Steve Jobs and Steve Wozniak, started building small computers in their garage using about $1,000 they got from selling a VW bus. Their invention was an immediate success, as was the Apple Computer Company. To look at the company today you wouldn't realize it was one of the most successful, fastest growing companies of all time. Apple sold more PCs in 1980 and 1981 than any other company. Of course, at that time they were just about the only company selling personal computers. There were a few others - Radio Shack sold the TRS-80, nicknamed the "Trash 80," and Atari had a simple PC out on the market. In fact, an Atari was the first computer I ever owned.

So what happened to Apple? If they were so dominant during the 80s, why do they only have about 3-5 percent of the market today? They make terrific computers and, if you ask any Apple user, they swear by the "Macs." Mostly, bad business management throughout the years contributed to their severe decline. Also, for years Apple refused to let other companies license their technology and make compatible machines and devices for the Apple. Some experts say this decision alone caused most of Apple's trouble.

If you compare the open architecture of the IBM-PC compatible PCs with Apple's closed architecture, you begin to understand this problem. Let's say you are a young entrepreneur with a hankering to start your own computer device company. One company, Apple, says to you, "NO, you can't have our blueprints. We don't want you making a product that competes with ours." Another company, IBM, says, "Well sure, you can make devices that work with our computers. Yeah, they may compete with our products, but that's okay. Here are the blueprints, have at it." Which company are you going to go with? Over the years that's exactly what has happened. More and more companies began making products and software for IBM compatibles and fewer and fewer for the Apple. Pretty soon market forces and consumer demand took hold, and Apple computers faded from the forefront.

There's little argument that Apples are superior products in many ways to the IBM PCs.   But, if you have to spend extra money for that machine, are you willing?  And, if you can't get the latest, greatest software for the machine, are you willing to wait?   Most consumers have said no.  So little by little, the Apple lost market share and the IBM PC compatibles grew in popularity.

IBM

As I mentioned before, IBM entered the computer business with mainframes in the 1950s. And they only saw a market for maybe five of these beasts in the entire world. Yeah, okay. Another bad business decision. Except that IBM quickly recovered from this misstep and went on to become the biggest manufacturer of mainframes and minicomputers. In 1981 they introduced their version of the personal computer. Because they chose to share their architecture with other companies, many clones were produced. These clones were cheaper and used the same software as the original IBM. To differentiate between the Apple architecture and that of IBM and its clones, the term "IBM-PC compatible" was coined. That moniker has now been shortened to simply "PC" to describe a non-Apple personal computer.

In 1996, IBM was the third-biggest seller of desktop PCs to corporate buyers.

Microsoft

A great picture of Microsoft 1978
And what good is a computer history lesson if you don't mention Microsoft?
Their stated mission was "putting a computer on every desk and in every home, running Microsoft software." Although it seemed ludicrous at the time, they meant it. Accidental Empires by Robert Cringley
Microsoft was born in the mid-70s, founded by Bill Gates and Paul Allen. They originally started the company in Albuquerque, New Mexico, where the Altair 8800 was being manufactured. Later they moved the company to Bellevue, Washington. Paul Allen was actually the computer genius behind the company, while Bill was the brains of the business side.

It's interesting to note that Bill got his computing start when he was still in high school near Seattle, stealing computer time.  Back then, a company that owned a computer would sell time on the machine to other users.  Bill couldn't exactly afford all the computer time he wanted so he and some friends devised a way to use the computer without paying for it.  Ironic since now Bill is the biggest crusader against computer piracy!

Remember earlier in this lecture, I discussed Ed Roberts visiting MSU in May of this year. I mentioned that he created the first personal computer, the Altair 8800. He owned MITS, the company that Bill Gates and Paul Allen worked for prior to their starting Microsoft. In fact, Paul Allen worked fulltime for MITS and Bill was parttime. Bill complained to Ed Roberts that he wasn't making enough money writing software, so he was going to quit and do something else. Mr. Roberts relented and put Bill on fulltime status with the company. Several months later, for reasons unmentioned by Mr. Roberts, Paul Allen and Bill Gates were fired (or should we say "let go") from MITS and began their own company, Microsoft, writing software for the Altair 8800.

Mr. Roberts is now a practicing physician and is not directly involved in the computer industry except to tinker with the machines every once in a while.

If you don't remember anything else about this history lesson, I'd like you to remember that Microsoft DID NOT invent DOS (Disk Operating System). Tim Paterson, who worked for a small company called Seattle Computer Products, actually created it. Here's the story:

In 1980 and 1981, when IBM was building their first personal computer, they were looking for some software to run on the machine. They attempted to write their own but failed miserably. They heard of this small company near Seattle that had gotten its start writing software for the Altair 8800 and contacted it to see if they had an operating system that could be used on the new IBM PC. Microsoft confirmed that they did indeed already have software that could be used on the IBM machine. Bill Gates flew to Boca Raton, Florida, signed the agreement to supply the software, then immediately flew back to Seattle and bought the software that Microsoft had just licensed to IBM. If you're not already convinced that it was a pretty shady deal, here's the rest of the story.

Microsoft refused to sell the software outright to IBM. They would only supply it on a "license" basis; that is, IBM had to pay a royalty for each copy of the software that they used. But when Microsoft obtained the software from Seattle Computer Products, Microsoft would only agree to an outright purchase of the programming code. At the time Seattle Computer Products was more interested in the hardware side of computers and, in fact, saw no future in software. They were more than happy to get the software out of their hair and sold it to Microsoft for a whopping $75,000. At the time, Seattle Computer Products thought they had made the deal of a lifetime and that Microsoft was pretty stupid. So much for stupid, huh!

Intel Corporation

In 1971 Intel introduced the first microprocessor, the 4004, with an initial cost of $200. It had 2,300 transistors and processed a maximum of 0.06 MIPS (millions of instructions per second).

In 1996 the Intel chip population worldwide was 350 billion. That equals 2 chips for every person on earth. The cost of this chip was $974, the number of transistors on the chip was 5.5 million, and it processed 300 MIPS. That's 300 million instructions per second - roughly 5,000 times faster than the original 1971 chip. Now that's fast.

Internet

Just a few tidbits about the Internet and then we'll be done. As I mentioned way back at the beginning of the semester, the Internet was begun in the late 1960s by the military. It was mostly used by researchers, universities, and the military. In 1990, a scientist by the name of Tim Berners-Lee got tired of the way you had to find documents on the Internet. That is, you either had to know the exact location of the document or spend literally days looking for it. He decided there had to be a better way of storing, locating, and displaying documents on the Internet. So he developed the idea of a World Wide Web, where documents could be linked to other documents and displayed in a more visually appealing way. He never patented his creation and to this day has never received a dime from any profits made off the Web concept. He's now the head of the World Wide Web Consortium (W3C), the group that determines standards for the Web and guides the way it operates.

In 1993 a young college student at the University of Illinois decided that there had to be a better way of maneuvering around on this thing called the Web. He, with a few friends, set about writing a software program that would help the average user navigate the Internet and the Web. His name was Marc Andreessen and the program he developed was originally called Mosaic. He wanted the university to license and sell the program on a commercial basis. A dispute arose between the university and Marc over rights to the software license fees, so Marc headed out to Silicon Valley, started his own company, and began selling the Web browser program that you now know as Netscape Navigator! On the day he took Netscape public he was worth $171 million. Because of the stock price decline he's now worth about $20 million.

Just one more quick note.  In 1995 Microsoft wrote off as foolishness this idea of anyone other than scientists and researchers using the Internet.  Bill Gates stated that the average person wasn't interested and that any company that spent their resources developing products for the Internet was stupid.  A programmer that worked in the "bowels" of Microsoft visited his old college campus one day and saw a bunch of students using the World Wide Web.  He was flabbergasted.  He returned to Microsoft and told his bosses that the company needed to start developing products for the Internet or they were going to lose out big time.  All the bosses, in essence, told the programmer to sit down and be quiet since Bill had already stated that Microsoft wasn't interested in the Internet.

The kid was desperate so he sent an email message directly to Bill and told him of the campus experience.  Bill looked for himself and, sure enough, the Internet was bursting at the seams.  In 1996 Bill Gates publicly stated that every resource available to Microsoft would be directed towards the Internet and the Web. A quick turnabout in corporate philosophy!

I don't know what happened to the young kid but Bill Gates is now worth $50 billion dollars and going strong!

History of computers in film and TV (My favorite subject)
 

Summary

Hopefully you enjoyed this walk back in time and realize how much our computing world has changed over the years.  I also hope you now understand why we do what we do and how much of history plays a part in our everyday lives.

There are several good books I recommend for reading more about the history of computing.

Web Sites for Exploring Computer History

There are many, many web sites devoted to the history of the computer and a couple of on-line museums if you're so interested:

Winding It Up

You are living history in the making. Computers are here to stay, like it or not. They will become more versatile and more pervasive in everything we do. The computer industry vows to make them easier to use (they've come a long way already) and much more common than what we have today. When the Star Trek TV series first came on television, many people thought that some of the technology wouldn't be available until the 21st century - if at all. But if you think about it, much of that "farfetched" technology is available to us today. It's amazing to think of how far we've come in just the last 20 years - but, we ain't seen nothin' yet!