VIRGINIA MONTECINO
CS103 - syllabus
Montecino's CS 103 Course Page
Lecture 11 -  History of Computing

History of Computing

DEFINITION OF COMPUTER (based on von Neumann's concepts): a device that accepts input, processes data, stores data, and produces output.
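To make the definition concrete, here is a minimal Python sketch of the four functions; the doubling step is just a stand-in for any computation:

    # Toy illustration of the definition above: input -> process -> store -> output.
    # The doubling "process" is an arbitrary stand-in for any computation.
    storage = []                         # stores data

    raw = input("Enter a number: ")      # accepts input
    value = int(raw) * 2                 # processes data
    storage.append(value)                # stores data
    print("Result:", value)              # produces output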

See "The Machine that Changed the World: Part I Giant Brains," available at the Johnson Center Library and/or visit the companion Web site

We have gone from the vacuum tube to the transistor to the microchip. Then the microchip started talking to the modem. Now we exchange text, sound, photos, and movies in a digital environment.

Examples of machines in the computer evolution:

  • 14th C. - Abacus - an instrument for performing calculations by sliding counters along rods or in grooves 
  • 17th C. - Slide rule -  a manual device used for calculation that consists in its simple form of a ruler and a movable middle piece which are graduated with similar logarithmic scales 
  • 1642 - Pascaline--a mechanical calculator built by Blaise Pascal 
  • 1804 - Jacquard loom--a loom programmed with punched cards invented by Joseph Marie Jacquard
  • 1939-1942 - Atanasoff Berry Computer - built at Iowa State by Prof. John V. Atanasoff and graduate student Clifford Berry. Represented several "firsts" in computing, including a binary system of arithmetic, parallel processing, regenerative memory, separation of memory and computing functions, and more. Weighed 750 lbs. and had a memory storage of 3,000 bits (0.4K). Recorded numbers by scorching marks into cards as it worked through a problem.
  • 1946 - ENIAC--world's first electronic, large-scale, general-purpose computer, built by Mauchly and Eckert and activated at the University of Pennsylvania in 1946. ENIAC has been recreated on a modern computer chip; see the explanation of ENIAC on a Chip by the Moore School of Electrical Engineering, University of Pennsylvania. The ENIAC was a 30-ton machine filled with 19,000 vacuum tubes and 6,000 switches, and could add 5,000 numbers in a second, a remarkable accomplishment at the time. A reprogrammable machine, the ENIAC performed initial calculations for the H-bomb.
  • 1940s - Colossus - vacuum tube computing machines built at Bletchley Park, where Alan Turing worked as a codebreaker; Turing's earlier electromechanical Bombe machines broke Hitler's Enigma codes.
  • 1950s-1960s - UNIVAC - "punch card technology." The first commercially successful computer, introduced in 1951 by Remington Rand. Over 40 systems were sold. Its memory was made of mercury-filled acoustic delay lines that held 1,000 12-digit numbers. It used magnetic tapes that stored 1MB of data at a density of 128 cpi. UNIVAC became synonymous with "computer" (for a while). See UNIVAC photo. See UNIVAC diagram.
back to top
Pioneer computer scientists

Charles Babbage (1791-1871) - Difference Engine, Analytical Engine. Ada Byron, daughter of the poet Lord Byron, worked with him. His description, in 1837, of the Analytical Engine, a mechanical digital computer, anticipated virtually every aspect of present-day computers. Sketch of the Engine and notes by Ada Byron King, Countess of Lovelace.

Alan Turing -- (1912-1954). British codebreaker. Worked on the Colossus (code-breaking machine, precursor to the computer) and the ACE (Automatic Computing Engine). Noted for many brilliant ideas, Turing is perhaps best remembered for the concepts of the Turing Test for artificial intelligence and the Turing Machine, an abstract model for modeling computer operations. The Turing Test is the "acid test" of true artificial intelligence: in the 1940s, Turing said that "a machine has artificial intelligence when there is no discernible difference between the conversation generated by the machine and that of an intelligent person." Turing was instrumental in breaking the German Enigma code during WWII.
Explanation of the Turing Test by the PT-Project, Illinois State University. Try the Turing Test applet.
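To make the Turing Machine model concrete, here is a minimal Python sketch -- an illustration of the general idea (a tape, a read/write head, a current state, and a rule table), not any particular historical machine. This toy machine simply flips every bit on its tape and halts:

    # A minimal Turing machine: tape, head, state, and transition rules.
    # (state, symbol) -> (symbol to write, head movement, next state)
    tape = list("110100")
    head = 0
    state = "flip"

    rules = {
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
    }

    while True:
        symbol = tape[head] if head < len(tape) else None
        if (state, symbol) not in rules:   # no applicable rule: halt
            break
        write, move, state = rules[(state, symbol)]
        tape[head] = write                 # write a symbol
        head += move                       # move the head

    print("".join(tape))                   # prints 001011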
 
Pictures of the Enigma machine

Description of the Enigma machine

How the Enigma machine works

More information about the Enigma machine

Try these Enigma applets by Ian Noble or Russell Schwager

J. von Neumann -- (1903-1957). A child prodigy in mathematics, he authored a landmark paper explaining how programs could be stored as data (unlike ENIAC, which had to be re-wired to be re-programmed). Virtually all computers today, from toys to supercomputers costing millions of dollars, are variations on the computer architecture that John von Neumann created on the foundation of Alan Turing's work in the 1940s. It included three components used by most computers today: a CPU; a slow-to-access secondary storage area, like a hard drive; and fast-access primary memory (RAM). The machine stored instructions as binary values (creating the stored-program concept) and executed instructions sequentially - the processor fetched instructions one at a time and processed them: an instruction is analyzed, data is processed, the next instruction is analyzed, and so on. Today "von Neumann architecture" often refers to the sequential nature of computers based on this model. See another von Neumann source.
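Here is a toy Python sketch of the stored-program idea described above; the three-instruction machine is invented purely for illustration. Note that instructions and data share one memory, and the processor fetches and executes one instruction at a time:

    # Toy stored-program machine: program and data live in the same memory.
    memory = [
        ("LOAD", 4),      # copy the value at address 4 into the accumulator
        ("ADD", 5),       # add the value at address 5
        ("PRINT", None),  # output the accumulator
        ("HALT", None),
        2,                # data at address 4
        3,                # data at address 5
    ]

    accumulator = 0
    pc = 0                        # program counter

    while True:
        op, arg = memory[pc]      # fetch the next instruction
        pc += 1
        if op == "LOAD":          # decode and execute
            accumulator = memory[arg]
        elif op == "ADD":
            accumulator += memory[arg]
        elif op == "PRINT":
            print(accumulator)    # prints 5
        elif op == "HALT":
            break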

John V. Atanasoff -- (1904-1995) - one of the contenders, along with Konrad Zuse, H. Edward Roberts, and others, for the title of inventor of the first computer. Eckert and Mauchly drew on Atanasoff's work to create the ENIAC. Atanasoff's Computer.

Konrad Zuse -- (1910-1995). German engineer who, during WW II, designed mechanical and electromechanical computers. See the Konrad Zuse Multimedia show - documentation about the development of his legendary computers.

H. Edward Roberts -- developed the MITS Altair 8800 in 1975. The Altair is considered by some to be the first microcomputer (personal computer). The MITS Altair 8800 was based on a 2 MHz Intel 8080 chip, with 256 bytes of standard RAM. It came out a year before the first Apple computer, developed by Steve Wozniak and Steve Jobs. Paul Allen and Bill Gates (then a student at Harvard) wrote a scaled-down version of the BASIC programming language to run on the Altair, which was the beginning of Microsoft.

Origins of Modern Computing

1951-1959 - vacuum tube based technology. First commercial computers: UNIVAC, IBM 701. Vacuum tubes are electronic devices consisting of a glass or steel vacuum envelope and two or more electrodes between which electrons can move freely.

1960-1968 - transistor based technology. The transistor, invented in 1948 by Dr. John Bardeen, Dr. Walter Brattain, and Dr. William Shockley, almost completely replaced the vacuum tube because of its reduced cost, weight, and power consumption and its higher reliability. See explanation of what a transistor is. See what the first transistor looked like. The transistor alters its state from a starting condition of conductivity (switched 'on', full current flow) to a final condition of insulation (switched 'off', no current flow).

1969-1977 - integrated circuit (IC) based technology. The first integrated circuit was demonstrated by Texas Instruments inventor Jack Kilby in 1958. It was 7/16" wide and contained two transistors. Examples of early integrated circuit technology: the Intel 4004, the DEC PDP-8, and the CRAY-1 (1976) - a 75 MHz, 64-bit machine with a peak speed of 160 megaflops (one megaflop is one million floating-point operations per second), the world's fastest processor at that time. Now circuits may contain hundreds of thousands of transistors on a small piece of material, which has revolutionized computing. Here is a diagram of a modern integrated circuit, known as a chip.

1978 to 1986 - large scale integration (LSI); the Xerox Alto--a workstation with a mouse; the Apple, designed by Steve Wozniak and Steve Jobs. Apple was the first to bring a "windows"-type graphical interface and the computer mouse to a mass market. The PC and clone market began to expand, creating the first mass market for desktop computers.

1986 to today - the age of networked computing, the Internet, and the WWW.

1992 - Bill Gates' Microsoft Corp. released Windows 3.1, an operating system that made IBM and IBM-compatible PCs more user-friendly by integrating a graphical user interface into the software. In replacing the old command-line system, however, Microsoft created a program similar to the Macintosh operating system. Apple sued for copyright infringement, but Microsoft prevailed. Windows 3.1 was followed by Windows 95, then Windows 98.... (There are other OSs, of course, but Windows is the dominant OS today. Macs, by Apple, still have faithful followers.)

back to top

We can't talk about computers without mentioning:

The Birth of the Internet

The Internet, originally the ARPAnet (Advanced Research Projects Agency network), began as a military computer network. Other government agencies and universities created internal networks based on the ARPAnet model. The catalyst for the Internet today was provided by the National Science Foundation (NSF). Rather than have a physical communications connection from each institution to a supercomputing center, the NSF began a "chain" of connections in which institutions would be connected to their "neighbor" computing centers, which all tied into central supercomputing centers. This beginning expanded into a global network of computer networks, which allows computers all over the world to communicate with one another and share information stored at various computer "servers," either on a local computer or on a computer located anywhere in the world. Universities were early users of the Internet. In 1995, large commercial Internet service providers (ISPs), such as MCI, Sprint, AOL, and UUNET, began offering service to large numbers of customers.

The Internet now links thousands of computer networks reaching people all over the world. Since traffic on the Internet has become so heavy, some of the scientific and academic institutions that formed the original Internet developed a new global network called Internet2. Known as the Abilene Project and running on fast fiber-optic cable, it officially opened for business in February 1999 at a ceremony in Washington, D.C. The network's 2.4 gigabit-per-second speed is roughly 45,000 times faster than a 56K modem.
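The comparison is easy to check with a little arithmetic (ignoring protocol overhead):

    # Rough check of the speed claim above.
    abilene_bps = 2.4e9              # 2.4 gigabits per second
    modem_bps = 56e3                 # a 56K modem: 56 kilobits per second
    print(abilene_bps / modem_bps)   # about 42,857 -- i.e., roughly 45,000 times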

See The Internet Companion: A Beginner's Guide to Global Networking (2nd edition) by Tracy LaQuey.

The Birth of the WWW 

1990 - Tim Berners-Lee, a CERN computer scientist, invented the World Wide Web. It was originally conceived and developed for the high-energy physics collaborations, which require instantaneous information sharing among physicists working in different universities and institutes all over the world. Now the WWW is used by people all over the world, children and adults, for personal, commercial, and academic uses. Berners-Lee and Robert Cailliau wrote the first WWW client and server software, defining Web addresses (URLs), the hypertext transfer protocol (HTTP), and the hypertext markup language (HTML). In December 1993, both men, along with Marc Andreessen and E. Bina of NCSA, shared the Association for Computing Machinery (ACM) Software System Award for developing the World Wide Web. The graphical Web browser, Mosaic, evolved into Netscape.
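Those three inventions still fit together the same way today. Here is a minimal Python sketch (the URL is only an example): a URL names a document, HTTP fetches it, and HTML is what comes back:

    # URL + HTTP + HTML in a few lines, using only the standard library.
    from urllib.request import urlopen

    url = "http://example.com/"           # a Web address (URL)
    with urlopen(url) as response:        # an HTTP GET request
        html = response.read().decode()   # the returned document is HTML
    print(html[:80])                      # e.g. "<!doctype html> ..."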

The ease of using the World Wide Web has made it easier for people to connect with one another, overcoming the obstacles of time and space. This networking has spawned numerous virtual communities and cybercultures. See this list of resources on cybercultures.  The WWW has also become a convenient way to buy and sell services and goods. 

The Internet and WWW do not come without ethical and legal ramifications, such as copyright infringement, computer spying and hacking, computer viruses, fraud, and privacy issues. See an overview of some of the ethical and legal issues. See a basic overview of Internet copyright guidelines.

What's next?? Something interesting to ponder: nanotechnology. K. Eric Drexler is the founding father of nanotechnology, the idea of using individual atoms and molecules to build living and mechanical "things" in miniature factories. His vision is that if scientists can engineer DNA on a molecular level, why can't we build machines out of atoms and program them to build more machines? The requirement for low cost creates an interest in these "self-replicating manufacturing systems," studied by von Neumann in the 1940s. These "nanorobots," programmed by miniature computers smaller than the human cell, could go through the bloodstream curing disease, performing surgery, etc. If this technology comes about, the barriers between engineered and living systems may be broken. Researchers at various institutions and organizations, like NASA and Xerox, are working on this technology. Drexler's talk is available on the Web. You need the RealPlayer from www.RealNetworks.com to see him and hear his talk.

back to top


Some of the Many Women Pioneers in Computing: 

Ada Byron King, Countess of Lovelace (1815-1852) - Portrait. Daughter of the British poet Lord Byron, Ada was a mathematician and wrote extensive notes on Charles Babbage's calculating machine, suggesting how the engine might calculate Bernoulli numbers. This plan is now regarded as the first "computer program." Sketch of the Engine and notes by Ada Byron King, Countess of Lovelace. A software language developed by the U.S. Department of Defense was named "Ada" in her honor in 1979.
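Lovelace's notes showed how the Analytical Engine could compute Bernoulli numbers. Here is a modern Python sketch of that computation, using a standard recurrence rather than her exact step-by-step procedure for the Engine:

    # Bernoulli numbers from the standard recurrence
    #   B_0 = 1,   B_m = -1/(m+1) * sum over j < m of C(m+1, j) * B_j
    # (a modern restatement, not Lovelace's exact procedure)
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        B = [Fraction(1)]                 # B_0 = 1
        for m in range(1, n + 1):
            s = sum(comb(m + 1, j) * B[j] for j in range(m))
            B.append(-s / (m + 1))
        return B

    print([str(b) for b in bernoulli(6)])
    # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']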

Edith Clarke (1883-1959) - At MIT, in June 1919, Clarke received the first Electrical Engineering degree awarded to a woman. She developed and disseminated mathematical methods that simplified calculations and reduced the time spent in solving problems in the design and operation of electrical power systems.

Grace Murray Hopper (1906-1992) - See her Picture. Hopper earned an MA in 1930 and a Ph.D. in Mathematics in 1934, both from Yale University. She retired from the Navy in 1986 with the rank of Rear Admiral. Hopper created a compiler system that translated mathematical code into machine language. Later versions, under her direction, became the forerunner to modern programming languages. She pioneered the integration of English into programs with FLOW-MATIC. Hopper received the Computer Sciences "Man of the Year Award" in 1969. In 1973 she became the first woman to be made a Distinguished Fellow of the British Computer Society. The term "bug," an error or defect in software or hardware that causes a program to malfunction, originated, according to computer folklore, when Grace and her team found a dead moth that had been "zapped" by a relay and caused the device to fail.

Erna Hoover - invented a computerized switching system for telephone traffic. For this achievement, she was awarded the first software patent ever issued (Patent #3,623,007, issued Nov. 23, 1971). She was the first female supervisor of a technical department (at Bell Labs).

Kay McNulty Mauchly Antonelli and Alice Burks - made calculations for tables of firing and bombing trajectories as part of the war effort. This work prompted the development, in 1946, of the ENIAC, the world's first electronic digital computer.

Adele Goldstine - assisted in the creation of the ENIAC and wrote the manual to use it. 

Joan Margaret Winters - scientific programmer in SLAC Computing Services at the Stanford Linear  Accelerator Center, among other achievements. 

Alexandra Illmer Forsythe (1918-1980) - During the 1960s and 1970s, she co-authored a series of textbooks on computer science. She wrote the first computer science textbook.

Evelyn Boyd Granville - was one of the first African American women to earn a Ph.D. in Mathematics. During her career, she developed computer programs that were used for trajectory analysis in the Mercury Project (the first U.S. manned mission in space) and in the Apollo Project (which sent U.S. astronauts to the moon). 

See Pioneering Women of Computing for more details

back to top
