Lecture 18
Housekeeping
(none)
Required Reading
TEXT
- Fluency, Chapter 10: "What's the Plan? Algorithmic Thinking".
- Fluency, Chapter 23: "Computers Can Do Almost {Everything, Nothing}: Limits to Computation".
SCIENTIFIC AMERICAN ARTICLES
Introduction
In this lecture, we'll take a change of pace and look at the origins of the modern computer from a historical perspective, as a natural outcome of attempts to formalize the notion of an "Algorithm".
At the end of this lecture you will appreciate that:
- A Turing machine is an idealized computer that can embody an algorithm.
- The Turing machine was used to demonstrate that certain questions in logic are not decidable (no algorithm can ever tell us whether they are true or false).
- The "Von Neumann" architecture took the Turing Machine concept and physically embodied it.
- The "Von Neumann" architecture remains the standard design of computers today -- though there are initial investigations into other models of computation such as quantum and cellular computing.
Glossary
(none)
Concepts
Algorithms
- Long before the computer existed, the idea of a computer fascinated humans. Interesting fact: at Los Alamos, "computers" were human beings who did computations on a pad of paper and passed their results to the next person.
- Algorithm
- Named after the Persian mathematician Muhammad al-Khwarizmi (who worked out the step-by-step rules for decimal arithmetic). An algorithm is a set of well-defined instructions for completing a task. (See Lecture 1 for some other capsule definitions of algorithms [Lecture1].)
- Examples of Algorithms
- A recipe for baking a cake
- Multiplication
- A guessing game
- Common Features
- Inputs
- Outputs
- Process (the algorithm).
- Question
- Will the Algorithm halt? (See the guessing-game sketch at the end of this section.)
- Thought Experiment
- If the inputs were "mathematical conjectures" and the outputs were "mathematical proofs" -- would an algorithm be sufficient to derive all mathematical proofs from a series of conjectures? This problem is called "Hilbert's Entscheidungsproblem" ("decision problem"), and it inspired a young mathematician, Alan Turing, to "mechanize" the notion of an algorithm.
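To make the common features above concrete, here is a minimal sketch in Python (an illustration added to this outline, not something from the text) of the guessing-game algorithm: its inputs, output, and process are labelled, and the comments note why this particular process is guaranteed to halt.

```python
# Hypothetical example: the "guessing game" as an algorithm.
# Inputs:  a secret number and a range [lo, hi]
# Output:  how many guesses were needed to find it
# Process: binary search -- halve the remaining range after every guess

def guessing_game(secret, lo=1, hi=100):
    guesses = 0
    while lo <= hi:                 # guaranteed to halt: the range shrinks
        guesses += 1                # by about half on every iteration
        mid = (lo + hi) // 2
        if mid == secret:
            return guesses
        elif mid < secret:
            lo = mid + 1
        else:
            hi = mid - 1
    raise ValueError("secret was outside the range")

print(guessing_game(37))            # prints 3; never more than about log2(100) ~ 7 guesses
```

This process always halts because the range gets strictly smaller each time; a guesser who picked numbers at random without narrowing the range might never stop, which is exactly the halting question asked above.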
The Turing Machine is the Idea of a Computer
- See: http://en.wikipedia.org/wiki/Turing_machine
- Turing invented a "machine" to embody the idea of an algorithm. However, his "machine" is very abstract.
- A Turing Machine has (a minimal simulator sketch appears at the end of this section):
- Tape (from which data is read and to which data is written) of infinite size (in effect, an infinite memory)
- A Head (that can read/write from the tape), which can be in one of a series of states
- A Table (of instructions): for every combination of Tape-Value/Head-State it gives an action on the tape and an action on the head
- A State-Register -- stores the current state of the Head (i.e., which row of the Table applies next)
- Imagine the Reading Head is a person who has memorized a table of instructions.
- The 'Tape" can be a row of people who call out a number.
- The Table is the computer program.
- Our question is: will the Head-person ever stop working?
- This is called the 'Halting Problem', a close relative of Hilbert's Decision Problem.
- Turing constructed numerous Turing machines to enact specific algorithms. Then he did something rather grand: he constructed a Turing Machine where the initial part of the Tape holds data that further instructs the machine. This is called a "Universal Turing Machine" (UTM).
- Essentially Turing's UTM was the design of a computer.
- The Tape is computer memory (and also input and output)
- The Head/State Table is the CPU
- The initial set of data on the tape is a "computer program" or software
- Note -- to a computer, software is just data.
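A minimal simulator sketch in Python (an illustration added to this outline; the machine and its table are invented for the example) showing the four parts listed above: a Tape, a Head, a State-Register, and a Table of instructions. This particular table flips every bit on the tape and then halts.

```python
# Toy Turing machine (illustrative sketch only): flip each bit, then halt.
# Table: for every (state, symbol read) pair -> (symbol to write, head move, next state)
table = {
    ('flip', '0'): ('1', +1, 'flip'),   # read 0: write 1, move right, stay in 'flip'
    ('flip', '1'): ('0', +1, 'flip'),   # read 1: write 0, move right, stay in 'flip'
    ('flip', '_'): ('_', 0, 'halt'),    # read a blank: stop
}

def run(table, input_string, start_state='flip'):
    tape = {i: s for i, s in enumerate(input_string)}   # the Tape (conceptually infinite)
    head = 0                                             # the Head's position
    state = start_state                                  # the State-Register
    while state != 'halt':
        symbol = tape.get(head, '_')                     # '_' stands for a blank cell
        write, move, state = table[(state, symbol)]      # look up the instruction
        tape[head] = write
        head += move
    return ''.join(tape[i] for i in sorted(tape) if tape[i] != '_')

print(run(table, '10110'))   # prints 01001
```

The table plays the role of the program: hand the same run() function a different table and it computes something else. Turing's Universal Machine goes one step further by placing the table itself on the tape, so that program and data look alike -- just as the last bullet above says.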
The von Neumann Architecture is the Basic Design of a Computer
- see: http://en.wikipedia.org/wiki/Von_Neumann_architecture
- Once the idea of a computer existed, there was a trans-Atlantic race to build it.
- There was a British effort and an American effort. Both efforts brought together logicians and engineers. The British team was headed by Turing; the American team was headed by von Neumann. It was John von Neumann's architecture that was built first, and ever since, computers have also been called "von Neumann machines" or said to follow the "von Neumann architecture".
- A von Neumann machine has (a toy sketch follows this list):
- Memory -- stores data and programs
- Control Unit and Arithmetic/Logic Unit -- i.e., the central processing unit.
- Input/Output -- a way of getting data and sending data.
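To make these three components concrete, here is a toy von Neumann-style machine in Python (a hypothetical sketch with an invented five-instruction set; it does not correspond to any real machine): program and data sit side by side in one memory, a small control loop fetches and decodes each instruction, and the arithmetic happens in an accumulator.

```python
# Toy von Neumann machine (illustrative sketch only).
# Invented instruction set:
#   ('LOAD', addr)   acc = memory[addr]
#   ('ADD', addr)    acc = acc + memory[addr]
#   ('STORE', addr)  memory[addr] = acc
#   ('PRINT', addr)  output memory[addr]
#   ('HALT',)        stop

memory = [
    ('LOAD', 6),     # 0: the program starts here...
    ('ADD', 7),      # 1
    ('STORE', 8),    # 2
    ('PRINT', 8),    # 3
    ('HALT',),       # 4
    None,            # 5: unused
    40,              # 6: ...and the data lives in the same memory
    2,               # 7
    0,               # 8: the result goes here
]

acc = 0              # accumulator (inside the CPU)
pc = 0               # program counter (the control unit's place-keeper)
while True:
    op, *args = memory[pc]             # fetch and decode the next instruction
    pc += 1
    if op == 'LOAD':
        acc = memory[args[0]]
    elif op == 'ADD':
        acc = acc + memory[args[0]]    # the arithmetic/logic unit at work
    elif op == 'STORE':
        memory[args[0]] = acc
    elif op == 'PRINT':
        print(memory[args[0]])         # input/output -- prints 42
    elif op == 'HALT':
        break
```

The program (addresses 0-4) and the data (addresses 6-8) live in the same memory, which is the defining feature of the stored-program design.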
Modern Computer
- A modern computer has:
- Hardware and software
- Hardware includes:
- CPU
- Various Forms of Memory
- The Bus along which data moves
- Various peripheral devices (monitors, USB devices, keyboards)
- Software includes:
- Operating System Kernel -- the essential software that runs at all times
- Systems Software -- makes the computer convenient for you to use
- Applications -- let the computer do the things you want it to do
- You might notice that the Modern Computer is essentially a von Neumann Machine with a few more parts. Very little has changed at the "architectural" level.
The Postmodern Computer
- We've noted that the essence of a computer is the Universal Turing Machine. Anything that can act like a Universal Turing Machine can be a computer.
- Some examples:
- Cellular Automata
- Quantum Computing
- Biological Computing
- see: http://en.wikipedia.org/wiki/DNA_computing and http://www.cs.virginia.edu/~robins/Computing_with_DNA.pdf (a particularly good article from Scientific American).
Summary
- The Universal Turing Machine is the mathematical idealization of a computer.
- The Universal Turing Machine is also a device that can embody any clearly defined algorithm.
- It is realized physically by von Neumann machines -- whose design remains the architecture of modern computers.
Text Readings
Resources
- The Universal Computer. The Road from Leibniz to Turing. By Martin Davis. 2000. WW Norton & Co.
- The Pattern on the Stone. The Simple Ideas that Make Computers Work. By W. D. Hillis. 1998. Basic Books.
- Computers Ltd. What They Really Can't Do. By David Harel. 2000. Oxford University Press.
- Connecting with Computer Science. By Greg Anderson, David Ferro, and Robert Hilton. 2005. Thomson Course Technology.
- Mathematics and Logic. By Mark Kac and Stanislaw Ulam. 1968. Mentor Press.
Homework
(none)