The abacus, which emerged about 5,000 years ago in Asia Minor and is still in use today, may be considered the first computer.
In 1642, Blaise Pascal (1623-1662), the 18-year-old son of a French tax collector, invented what he called a numerical wheel calculator to help his father with his duties. This brass rectangular box, also called a Pascaline, used eight movable dials to add sums up to eight figures long. Pascal's device used a base of ten to accomplish this. For example, as one dial moved ten notches, or one complete revolution, it moved the next dial - which represented the tens column - one place. When the tens dial moved one revolution, the dial representing the hundreds place moved one notch, and so on. The drawback to the Pascaline, of course, was its limitation to addition.
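Just to make that carry idea concrete, here's a minimal sketch in Python (my own illustration, not anything Pascal built, and the function name is made up) of a chain of base-ten dials: every time a dial completes a full revolution of ten notches, it bumps the next dial one place.

    # Illustrative sketch of the Pascaline's carry mechanism: eight base-ten
    # dials, least significant first, where a full revolution of one dial
    # advances the next dial by one notch.
    def add_on_dials(dials, amount):
        dials = dials[:]          # work on a copy of the dial positions
        dials[0] += amount        # turn the ones dial
        for column in range(len(dials)):
            carry, dials[column] = divmod(dials[column], 10)
            if carry == 0:
                break
            if column + 1 == len(dials):
                raise OverflowError("sum exceeds the machine's eight dials")
            dials[column + 1] += carry   # one notch per full revolution
        return dials

    dials = [0] * 8                  # eight dials, all set to zero
    dials = add_on_dials(dials, 7)
    dials = add_on_dials(dials, 5)
    print(dials)                     # [2, 1, 0, 0, 0, 0, 0, 0] -> reads as 12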
In 1694, a German mathematician and philosopher, Gottfried Wilhelm von Leibniz (1646-1716), improved the Pascaline by creating a machine that could also multiply.
The first mechanical computer was invented in the mid-1800s by Charles Babbage. It was a big hunk of metal that took hours and hours to "program" and performed very limited functions - mostly simple addition, subtraction, multiplication, and division. There were no keyboards, no screens, and certainly no memory.
Anyone who has the impression that women didn't play an integral role in the development of computers doesn't know their history. Babbage's assistant was Lady Augusta Ada Byron; yes, the daughter of Lord Byron the famous poet. She helped develop instructions for doing computations on Babbage's machine and has been credited with developing the first computer program. In the late 1970s the U.S. Department of Defense developed a new programming language and named it Ada in her honor. That language is still in use today.
The first major use of a computer was for the 1890 government census. Ever wonder why the Census is only completed every 10 years in the United States? Because that's how long it took to complete the count using manual processes. Herman Hollerith built a computer that he used to complete the 1890 census and reduced the time from seven and a half years to 2 1/2 years. With all the political bickering we have nowadays over the Census I think we're back up to seven years.
In 1889, an American inventor, Herman Hollerith (1860-1929), also applied the Jacquard loom concept to computing. His first task was to find a faster way to compute the U.S. census. The previous census in 1880 had taken nearly seven years to count and with an expanding population, the bureau feared it would take 10 years to count the latest census. Unlike Babbage's idea of using perforated cards to instruct the machine, Hollerith's method used cards to store data, which he fed into a machine that compiled the results mechanically. Each punch on a card represented one number, and combinations of two punches represented one letter. As many as 80 variables could be stored on a single card. Instead of ten years, census takers compiled their results in just six weeks with Hollerith's machine. In addition to their speed, the punch cards served as a storage method for data and they helped reduce computational errors. Hollerith brought his punch card reader into the business world, founding the Tabulating Machine Company in 1896, later to become International Business Machines (IBM) in 1924 after a series of mergers. Other companies such as Remington Rand and Burroughs also manufactured punch card readers for business use. Both business and government used punch cards for data processing until the 1960s.
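To give a rough feel for that encoding, here's a simplified sketch in Python - my own illustration, not the actual Hollerith card code - of a card with 80 columns, where a digit is stored as a single punch and a letter as a combination of two punches (a "zone" punch plus a digit punch).

    # Simplified sketch of the punch-card idea: 80 columns per card, one
    # character per column; a digit is a single punch, a letter is a pair of
    # punches. This is illustrative only, not the real Hollerith code.
    from string import ascii_uppercase

    def encode_column(ch):
        if ch.isdigit():
            return {int(ch)}                        # one punch for a digit
        if ch.upper() in ascii_uppercase:
            zone, digit = divmod(ascii_uppercase.index(ch.upper()), 9)
            return {"zone%d" % zone, digit + 1}     # two punches for a letter
        return set()                                # blank column

    def encode_card(text):
        return [encode_column(ch) for ch in text[:80].ljust(80)]

    card = encode_card("CENSUS 1890")
    print(card[:11])    # punches for the first eleven columns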
Early mechanical computing machines were cumbersome because hundreds of gears and shafts were required to represent numbers and their various relationships to each other.
To eliminate this bulkiness, John V. Atanasoff (b. 1903), a professor at
Iowa State College (now called Iowa State University) and his graduate
student, Clifford Berry, envisioned an all-electronic computer that applied
Boolean algebra to computer circuitry. This approach was based on the mid-19th
century work of George Boole (1815-1864), who developed a binary system of algebra in which any logical statement can be expressed as either true or false. By extending this concept to electronic
circuits in the form of on or off, Atanasoff and Berry had developed the
first all-electronic computer by 1940. Their project, however, lost its
funding and their work was overshadowed by similar developments by other
scientists.
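To give a feel for the idea, here's a minimal sketch in Python - purely illustrative, not Atanasoff and Berry's actual design - showing how Boolean true/false values, standing in for circuits that are on or off, can be combined with AND, OR, and XOR to add binary numbers.

    # Illustrative sketch: arithmetic from Boolean logic alone. A one-bit full
    # adder built from AND, OR, and XOR, then chained to add binary numbers.
    def full_adder(a, b, carry_in):
        sum_bit = (a ^ b) ^ carry_in
        carry_out = (a and b) or (carry_in and (a ^ b))
        return sum_bit, carry_out

    def add_binary(x_bits, y_bits):
        # Both inputs are lists of booleans, least significant bit first.
        result, carry = [], False
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        result.append(carry)
        return result

    five  = [True, False, True]    # binary 101, least significant bit first
    three = [True, True, False]    # binary 011
    print(add_binary(five, three)) # [False, False, False, True] -> binary 1000 = 8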
In 1943, the British completed a secret code-breaking computer called Colossus to decode German messages. The Colossus's impact on the development of the computer industry was rather limited for two important reasons. First, Colossus was not a general-purpose computer; it was only designed to decode secret messages. Second, the existence of the machine was kept secret until decades after the war. Shortly after this, the ENIAC and UNIVAC were developed; you will see below the impact of these two computers.
First generation computers were characterized by the fact that operating
instructions were made-to-order for the specific task for which the computer
was to be used. Each computer had a different binary-coded program called
a machine language that told it how to operate. This made the computer
difficult to program and limited its versatility and speed. Other distinctive
features of first generation computers were the use of vacuum tubes (responsible
for their breathtaking size) and magnetic drums for data storage.
Throughout the early 1960's, there were a number of commercially successful
second generation computers used in business, universities, and government
from companies such as Burroughs, Control Data, Honeywell, IBM, Sperry-Rand,
and others. These second generation computers were also of solid state
design, and contained transistors in place of vacuum tubes. They also contained
all the components we associate with the modern day computer: printers,
tape storage, disk storage, memory, operating systems, and stored programs.
One important example was the IBM 1401, which was universally accepted
throughout industry, and is considered by many to be the Model T of the
computer industry. By 1965, most large businesses routinely processed financial
information using second generation computers.
Silicon was transformed into "chips" that hold tiny electronic circuits. One tiny chip of silicon was able to replace huge boards of transistors, which further reduced the size and cost of computers and made them more reliable and compact. This breakthrough led to the IBM 360 series of computers, manufactured of course by IBM. The 360 computer was housed in a blue metal box, hence the nickname of "Big Blue" for IBM.
The ENIAC was introduced in the 1940s and built for the U.S. Army during WWII. It weighed 30 tons, had 18,000 vacuum tubes, occupied a space about 30 x 50 feet, and stood two stories high. Try putting that in your dorm room or on your dining room table at home! Today you can replace it with one fingernail-sized chip.
The first commercial electronic digital computer was the UNIVAC - short for Universal Automatic Computer. Where do they come up with these names! Again, the U.S. Census Bureau was the first customer to use this computer. It was also the first computer built for business applications and not strictly military usage.
Both the U.S. Census Bureau and General Electric owned UNIVACs. One
of UNIVAC's impressive early achievements was predicting the winner of
the 1952 presidential election, Dwight D. Eisenhower.
Of all the pioneers of personal computers and the Web, Douglas Engelbart may be the poorest and least-known. But the man who invented the mouse is finally winning recognition as the 1997 winner of the Lemelson-MIT prize - a $500,000 jackpot awarded annually to an American inventor. Engelbart, 72, began his work in 1951. In the 1960s, at Stanford Research Institute, his inventions included on-screen windows with menus, groupware, and hypertext. He came up with the mouse in 1963, carving the first one from wood. He got no royalties, though, because researchers then assigned patents to their employers. Engelbart says his big frustration is that his inventions have never fulfilled his dream: having computers help solve society's problems. He still promotes his ideals through the Bootstrap Institute, a think tank he founded in 1989. Compared with the challenges ahead, he warns, "everything in the past is peanuts." As quoted in Business Week magazine, 1997.

An interesting note: Douglas Engelbart visited Montana State University and the American Computer Museum, Bozeman, in May of this year. I had a chance to talk to him personally and asked him how he came up with the idea of the mouse. He replied that he sketched the idea of the mouse at a conference because he was bored! The original mouse had three buttons on it. When I asked how he came up with the number three, he stated that three were all he could fit on the device. Interestingly, we still work with three buttons on some mice. Guess what the original name was for the mouse device: X-Y Position Indicator for a Display System! Okay, now we understand why the name changed.
All of these events happened about the same timeframe and now, in the
late 90s, about 55%-60% of the U.S. population considers a PC in the home
an essential appliance.
Several times the Navy brass tried to get rid of Grace Hopper and force her retirement from the Navy. She just picked up the phone, called the President of the United States, who in turn told the brass to leave her alone. She eventually retired on her own in the late 80s as the oldest person ever to retire from the U.S. military.
Back when computers used vacuum tubes, the computer that Grace was using quit running. Her colleagues spent hours trying to determine the cause of the malfunction. She finally opened up the back of the computer, found a dead moth on one of the vacuum tubes, and declared that she had removed the "bug" from the computer. Since that time, the term "bug" has been used to describe any kind of problem with computer hardware or software.
The first modern GUI (Graphical User Interface) and mouse were in fact engineered at a Xerox research center called PARC, the Palo Alto Research Center. (Remember, Douglas Engelbart invented the very first mouse.) A couple of their scientists created these tools to make it easier for people to use and interact with computers. However, management didn't see a need for them since the company was more concerned about copy machines. Here's an excerpt from Business Week magazine:
It was one of the great fumbles of all time. In the 1970s, Xerox Corp.'s Palo Alto Research Center (PARC) developed the technologies that would drive the personal computer revolution. "By 1979, we had it all - graphical user interfaces, mice, windows and pull-down menus, laser printing, distributed computing, and Ethernet," recalls M. Frank Squires, a PARC founder in 1970 and now chief administrative officer of Sematech Inc., the chip-industry consortium in Austin, Tex.
Xerox had the PC and networking businesses firmly hooked - but didn't try to reel them in. It didn't even patent PARC's innovations. Management was too preoccupied with aggressive competition from Japan in its core copier business, says CEO Paul A. Allaire. "If we'd been really good, we could have done both. We probably should have," he admits.
Instead, PARC's technologies became the foundation for such icons as Apple Computer Inc. and 3Com Corp. Apple co-founder Steven P. Jobs visited PARC in 1979 and was astonished at what he saw. His PARC tour inspired the Macintosh. That same year, Ethernet inventor Robert M. Metcalfe left PARC to start 3Com (his company later bought the naming rights to Candlestick Park in San Francisco!). These are just two famous examples of the great Xerox giveaway.
"Xerox won't duplicate past errors." Business Week, September 29, 1997And all you Apple diehards thought Steve Jobs and Steve Wozniak developed the mouse and GUI. Nope! After Steve Jobs' visit to Xerox PARC he took what he saw back to his garage and applied it to his computing invention. Thus, Xerox lost out big time and Apple went on to become one of the most successful computing companies ever.
In 1990 Microsoft adapted these two creations to the IBM PC compatible machines.
The 1970s saw the beginning of the software revolution, in that one program could be written to operate on many computers. Each program still performed only one task on one machine at a time, but at least it cut down the time that companies had to wait for new programs.
PCs helped revolutionize the software industry by creating a need for "off the shelf" software, first made available in the early 80s. Most people have the impression that word processing software was the first type of program created for PCs. Nope, it was a program called VisiCalc, a spreadsheet program. Even though VisiCalc was the first "off the shelf" software program for PCs, Lotus 1-2-3 became the most popular.
Wider usage of PCs called for simpler installation of software that the average user could complete on their own. This type of software was made available in the late 80s and 90s. While some of you may not agree that today's software is easier to install and use, trust me, it is. I remember in 1986 typing in seven pages of numbers that represented the latest, greatest word processing program. It took me about two days of nonstop typing to "load" that program into my computer's memory - which, by the way, consisted of an old-fashioned cassette tape recorder.
So what happened to Apple? If they were so dominant during the 80s, why do they only have about 3-5 percent of the market today? They make terrific computers and, if you ask any Apple user, they swear by the "Macs." Mostly, bad business management throughout the years contributed to their severe decline. Also, Apple has never allowed other companies to license their technology and make peripheral devices for the Apple. Some experts say this decision alone caused most of Apple's trouble.
If you compare the open architecture of the IBM-PC compatible PCs with Apple's closed architecture, you begin to understand this problem. Let's say you are a young entrepreneur with a hankering to start your own computer device company. One company, Apple, says to you, "NO, you can't have our blueprints. We don't want you making a product that competes with ours." Another company, IBM, says, "Well sure, you can make devices that work with our computers. Yeah, they may compete with our products, but that's okay. Here are the blueprints, have at it." Which company are you going to go with? Over the years that's exactly what has happened. More and more companies began making products and software for IBM compatibles and fewer and fewer for the Apple. Pretty soon market forces and consumer demand took hold and the Apple computers faded from the forefront.
There's little argument that Apples are superior products in many ways to the IBM PCs. But, if you have to spend extra money for that machine, are you willing? And, if you can't get the latest, greatest software for the machine, are you willing to wait? Most consumers have said no. So little by little, the Apple lost market share and the IBM PC compatibles grew in popularity.
In 1996, IBM was the third biggest seller of desktop PCs to corporate buyers.
Microsoft was founded in the mid 70s by Bill Gates and Paul Allen. They originally started the company in Albuquerque, New Mexico, where the Altair 8800 was being manufactured. Later they moved the company to Bellevue, Washington. Paul Allen was actually the computer genius behind the company, while Bill was the brains of the business side.
- Their stated mission was "putting a computer on every desk and in every home, running Microsoft software." Although it seemed ludicrous at the time, they meant it. Accidental Empires by Robert X. Cringely
It's interesting to note that Bill got his computing start when he was still in high school near Seattle, stealing computer time. Back then, a company that owned a computer would sell time on the machine to other users. Bill couldn't exactly afford all the computer time he wanted so he and some friends devised a way to use the computer without paying for it. Ironic since now Bill is the biggest crusader against computer piracy!
Remember earlier in this lecture, I discussed Ed Roberts visiting MSU in May of this year. I mentioned that he created the first personal computer, the Altair 8800. He owned the company, MITS, that Bill Gates and Paul Allen worked for prior to their starting Microsoft. In fact, Paul Allen worked full time for MITS and Bill was part time. Bill complained to Ed Roberts that he wasn't making enough money writing software, so he was going to quit and do something else. Mr. Roberts relented and put Bill on full-time status with the company. Several months later, for reasons unmentioned by Mr. Roberts, Paul Allen and Bill Gates were fired (or should we say "let go") from MITS and began their own company, Microsoft, writing software for the Altair 8800.
Mr. Roberts is now a practicing physician and is not directly involved in the computer industry except to tinker with the machines every once in a while.
If you don't remember anything else about this history lesson, I'd like you to remember that Microsoft DID NOT invent DOS (Disk Operating System). Tim Paterson, who worked for a small company called Seattle Computer Products, actually created DOS. Here's the story:
In 1980 and 1981, when IBM was building their first personal computer, they were looking for some software to run on the machine. They attempted to write their own but failed miserably. They heard of this small company that had gotten its start writing software for the Altair 8800 and contacted the company to see if they had an operating system that could be used on the new IBM PC. Microsoft confirmed that they did indeed already have software that could be used on the IBM machine. Bill Gates flew to Boca Raton, Florida, signed the agreement to supply the software, then immediately flew to Seattle and bought the software that they had just licensed to IBM. If you're not already convinced that it was a pretty shady deal, here's the rest of the story.
Microsoft refused to sell the software outright to IBM. They would only supply it on a "license" basis; that is, IBM had to pay a royalty for each copy of the software they used. But when Microsoft obtained the software from Seattle Computer Products, Microsoft would only agree to an outright purchase of the programming code. At the time, Seattle Computer Products was more interested in the hardware side of computers and, in fact, saw no future in software. They were more than happy to get the software out of their hair and sold it to Microsoft for a whopping $75,000. At the time, Seattle Computer Products thought they had made the deal of a lifetime and that Microsoft was pretty stupid. So much for stupid, huh!
In 1996 the Intel chip population worldwide was 350 billion. That equals 2 chips for every person on earth. The cost of this chip was $974, the number of transistors on the chip was 5.5 million, and the initial rating was 300 MIPS - that's 300 million instructions per second! Now that's fast.
In the early 1990s a young college student in Illinois decided that there had to be a better way of maneuvering around on this thing called the Web. He, with a few friends, set about writing a software program that would help the average user navigate the Internet and the Web. His name was Marc Andreessen and the program he developed was originally called Mosaic. He wanted the university to license and sell the program on a commercial basis. A dispute arose between the University and Marc over rights to the software license fees, so Marc headed out to Silicon Valley, started his own company, and began selling the Web browser program that you now know as Netscape Navigator! On the day he took Netscape Corporation public he was worth $171 million. Because of the stock price decline he's now worth about $20 million.
Just one more quick note. In 1995 Microsoft wrote off as foolishness the idea of anyone other than scientists and researchers using the Internet. Bill Gates stated that the average person wasn't interested and that any company that spent its resources developing products for the Internet was stupid. A programmer who worked in the "bowels" of Microsoft visited his old college campus one day and saw a bunch of students using the World Wide Web. He was flabbergasted. He returned to Microsoft and told his bosses that the company needed to start developing products for the Internet or they were going to lose out big time. All the bosses, in essence, told the programmer to sit down and be quiet since Bill had already stated that Microsoft wasn't interested in the Internet.
The kid was desperate so he sent an email message directly to Bill and told him of the campus experience. Bill looked for himself and, sure enough, the Internet was bursting at the seams. In 1996 Bill Gates publicly stated that every resource available to Microsoft would be directed towards the Internet and the Web. A quick turnabout in corporate philosophy!
I don't know what happened to the young kid, but Bill Gates is now worth $50 billion and going strong!
History of computers in film and TV (My favorite subject)
There are several good books I recommend for reading more about the history of computing.