Courses/Computer Science/CPSC 203/CPSC 203 2008Winter L03/CPSC 203 2008Winter L03 Lectures/Lecture 18

  • Housekeeping
    • Text Reading -- Logic Associated Topics -- Chapter 9 (see pages below): Understand -- Switches, Binary Numbers.
    • Text Reading -- Algorithms to Computers Associated Topics. Chapters 5/6 (see pages below): Understand -- Operating Systems, Computer Parts (CPU, RAM, Hard Drive)
    • NOTE: Tutorial 1 the week of March 24th will be devoted to TAs working with you on Group Projects
      • An example of output from TBA3 is on Blackboard, under "Course Documents", "TBA3 Word Example"


Lecture 18

Last class we introduced a few more complex patterns in (primarily) Boolean logic, and how they can be implemented as digital circuits. Today we are going to trace the conceptual evolution of the computer. Computers begin with the notion of an Algorithm, essentially a mechanical procedure for calculation. Both the capabilities and the limitations of computers stem from this starting point.


OBJECTIVES:

  • You will be able to recognize algorithms in various guises.
  • You will appreciate the notion that a "Universal Turing Machine" is essential to computers. Anything that can act as a Universal Turing Machine can act as a computer.

From Algorithms to the Modern Computer

Algorithms

Long before the computer existed, the 'idea' of a computer fascinated humans. Interesting fact: at Los Alamos, "computers" were human beings who did computations on a pad of paper and passed their results to the next person.

Algorithm -- Named after the Persian mathematician Mohammed al-Khwarizmi (who worked out the step-by-step rules for decimal arithmetic). An algorithm is a set of well-defined instructions for completing a task. One of the oldest algorithms is Euclid's algorithm for finding the greatest common divisor of two positive integers.
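Euclid's algorithm is short enough to write out in full. A sketch in Python (the variable names are our own, not from the text):

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace the pair (a, b) with
    (b, a mod b) until the remainder b is zero; the surviving value
    is the greatest common divisor."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 36))  # 12
```

Note that the loop always halts: the remainder strictly decreases each step and can never go below zero.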

Examples of Algorithms

  • A recipe for baking a cake
  • Multiplication
  • A guessing game

Common Features

  • Inputs
  • Outputs
  • Process (the algorithm).

Question: Will the Algorithm halt?
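The guessing game from the examples above illustrates all three features, and is one where we can answer the halting question with a definite "yes". A sketch (the function and its range are our own illustration, not from the text):

```python
def guess_number(secret, low=1, high=100):
    """Binary-search guessing game. Inputs: the range and the secret
    number; output: the secret plus the number of guesses taken.
    The process halves the range each round, so it must halt."""
    guesses = 0
    while low <= high:
        guesses += 1
        mid = (low + high) // 2   # guess the middle of the range
        if mid == secret:
            return mid, guesses
        elif mid < secret:
            low = mid + 1         # secret is higher: discard lower half
        else:
            high = mid - 1        # secret is lower: discard upper half
    raise ValueError("secret was outside the range")

print(guess_number(42))
```

For a range of 100 numbers the loop runs at most 7 times, so this algorithm provably halts; as we will see, not every algorithm comes with such a guarantee.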


Thought Experiment If the inputs were "mathematical conjectures" and the outputs were "mathematical proofs" -- would an algorithm be sufficient to derive all mathematical proofs from a series of conjectures? This problem is called "Hilbert's Entscheidungsproblem" ("decision problem"), and it inspired a young mathematician, Alan Turing to "mechanize" the notion of an algorithm.

The Turing Machine is the Idea of a Computer

See: http://en.wikipedia.org/wiki/Turing_machine

Turing invented a "machine" to embody the idea of an algorithm. However, his "machine" is very abstract.

A Turing Machine has:
  • Tape (where the data is read from and written) of infinite size (sort of an infinite memory)
  • A Head (that can read from and write to the tape) which has a series of states
  • A Table (of instructions). For every combination of Tape-Value/Head-State it gives an action on the tape and an action on the head
  • A State-Register -- stores the machine's current state (i.e. which row of the Table applies next).

Imagine the Reading Head is a person who has memorized a table of instructions. The 'Tape' can be a row of people who each call out a number. The Table is the computer program.

Our question is: will the Head-person ever stop working? This is called the 'Halting Problem' (closely related to the decision problem).
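The four parts above can be sketched as a tiny simulator. The rule table shown is an assumed example of our own (add one to a block of 1s, i.e. increment in unary), not a machine from the text:

```python
def run_turing_machine(table, tape, state="start", head=0, max_steps=1000):
    """Minimal Turing machine. The Table maps (state, symbol) to
    (symbol-to-write, move L/R, next-state); the State-Register is
    the variable `state`; '_' is the blank symbol."""
    cells = dict(enumerate(tape))          # sparse, "infinite" tape
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")      # the Head reads the tape
        write, move, state = table[(state, symbol)]
        cells[head] = write                # the Head writes the tape
        head += 1 if move == "R" else -1   # the Head moves
    result = "".join(cells[i] for i in sorted(cells)).strip("_")
    return result, state

# Assumed example table: scan right over the 1s, then append one more.
add_one = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
print(run_turing_machine(add_one, "111"))  # ('1111', 'halt')
```

The `max_steps` cap is a practical concession: in general there is no way to tell in advance whether an arbitrary table will ever reach the halt state.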

Turing constructed numerous Turing machines to enact specific algorithms. Then he did something rather grand: he constructed a Turing Machine where the initial part of the Tape holds data that further instructs the machine. This is called a "Universal Turing Machine" (UTM).

Essentially Turing's UTM was the design of a computer.

  • The Tape is computer memory (and also input and output)
  • The Head/State Table is the CPU
  • The initial set of data on the tape is a "computer program" or software
    • Note -- to a computer, software is just data.

The Von Neumann Architecture is the Basic Design of a Computer

see: http://en.wikipedia.org/wiki/Von_Neumann_architecture

Once the idea of a computer existed, there was a trans-Atlantic race to build it.

There was a British effort and an American effort, and both brought together logicians and engineers. The British team was headed by Turing; the American team was headed by Von Neumann. It was John Von Neumann's architecture that was built first, and ever since, computers have also been called "Von Neumann Machines" or said to have the "Von Neumann Architecture".

A Von Neumann Machine has:
  • Memory -- stores data and programs
  • Control Unit and Arithmetic/Logic Unit -- i.e. the central processing unit.
  • Input/Output -- a way of getting data in and sending data out.
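The three parts above, and the fact that programs and data share one memory, can be sketched as a fetch-decode-execute loop. The instruction set here (LOAD/ADD/STORE/HALT) is an illustrative invention of ours, not a real machine's:

```python
def run(memory):
    """Tiny Von Neumann-style machine: instructions and data live in
    the same memory list. The control unit fetches the instruction at
    pc; the arithmetic/logic unit operates on the accumulator."""
    acc, pc = 0, 0
    while True:
        op, arg = memory[pc]       # fetch the next instruction
        pc += 1
        if op == "LOAD":           # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":          # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":        # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return acc

memory = [
    ("LOAD", 4),     # the program...
    ("ADD", 5),
    ("STORE", 5),
    ("HALT", None),
    2,               # ...and its data, in the same memory
    3,
]
print(run(memory))   # 5
```

Because the program is just more data in memory, a program could in principle read or even rewrite its own instructions -- exactly the "software is just data" point made above.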

Finally, We Have the Modern Computer

A modern computer has:
  • Hardware and software
    • Hardware includes:
      • CPU
      • Various Forms of Memory
      • The Bus along which data moves
      • Various peripheral devices (monitors, USB devices, keyboards)
    • Software includes:
      • Operating System Kernel -- the essential software that runs at all times
      • Systems Software -- makes the computer convenient for you to use
      • Applications -- lets the computer do the things you want it to


You might notice that the Modern Computer is essentially a Von Neumann Machine with a few more parts. Very little has changed at the "architectural" level.

The Post Modern Computer

We've noted that the essence of a computer is the Universal Turing Machine. Anything that can act like a Universal Turing Machine can be a computer. Some examples:


TEXT READINGS

Circuits

TIA 4th Edn: Chapter 9 pp 406 - 431

TIA 3rd Edn: Chapter 9 pp 386 - 409

Let's Build a Computer

TIA 4th Edition: Chapter 5, 202 - 214; Chapter 6, 246 - 265

TIA 3rd Edition: Chapter 5, 190 - 202; Chapter 6, 234 - 252

Resources

The Universal Computer. The Road from Leibniz to Turing. By Martin Davis. 2000. WW Norton & Co.

Computers Ltd. What They Really Can't Do. By David Harel. 2000. Oxford University Press.

Connecting with Computer Science. By Greg Anderson, David Ferro, and Robert Hilton. 2005. Thompson Press.

Mathematics and Logic. By Mark Kac and Stanislaw Ulam. 1968. Mentor Press.