Georgetown University
Graduate School of Arts and Sciences
Communication, Culture & Technology Program
CCTP-5038 History and Philosophy of Computing
Professor Martin Irvine
Spring 2025
Have you wondered about the deeper history of ideas that enabled us to get where we are today in all our computing and data systems? This course provides students with an inside view of the history of ideas that have made computing possible, starting from the earliest beginnings in the 1600s to our contemporary digital electronic systems, digital media, data networks, and AI. To facilitate access to the deeper history, students will learn how to use and interpret original primary documents and early examples of technologies archived in libraries and museums. We will study the history of computing from the perspective of the key concepts and design principles that have made computing possible, and not as a story of the inevitable "progress" of machines, products, or innovations. You will discover the essential human ideas and capabilities that motivated the beginnings of computing in the Enlightenment era (1600s-1700s), and that have continued to motivate the designs of computing systems in their various technical implementations down to the present day.
Recovering the history of ideas in computing reveals that the key concepts in computing are accessible to everyone, but they have accidentally become “black boxed” (inaccessible, closed off from understanding) from two main causes: (1) our modern political economy for technology, with computing tech industries that market closed “products” only accessible to specialists, and (2) the false modern institutionalized division of disciplines and sciences into "math/science/tech" and "humanities/social science/arts" domains of knowledge. We will apply our CCT interdisciplinary approach to “de-blackbox” (make accessible for understanding) the key concepts in computing, and create a truer view of computing that draws from overlapping histories in philosophy, logic, mathematics, science, communications, social contexts, and supporting technologies. Students will work on interdisciplinary research projects for doing their own “deep dive” into the history of concepts and contexts that have led to where we are today.
Syllabus Structure
The course's 14 weekly units are organized around six "Nodal Moments" in the history of ideas and how they became implemented in different stages of the technologies. A "node" in this context means a moment in time and place where people, ideas, social conditions, and developments in underlying technologies intersect and interconnect, like a node in a network that connects many active "links" at the same time. We will also follow how earlier nodes produce links that become connected to further nodes through the whole history, leading right to our present moment.
Conceptual Framework and Methods Used in the Course
Intellectual history (history of ideas). We will study major moments in the history of philosophy, mathematics, logic, and design concepts for technologies, studied in their social contexts and as reinterpreted and re-applied throughout developments in other contexts.
Systems and Design Thinking: learning to understand how everything in computing – past and present – is based on designing complex (multi-part) systems of combined subsystems (modules); that is, as interconnecting components combined and orchestrated in a unifying master design termed an architecture.
Semiotics. Theory of symbol systems (signs, symbols, diagrams, writing, notation for math, logic, and code), symbolic thought, interpretation, and physical representation that underlies all design thinking in computing, math, code, programming.
Archival methods. How to access and interpret earlier, original, primary sources (documents, artefacts, devices, technical documentation) from online data sources (archives, libraries, museums).
Learning Goals and Outcomes
Students will learn the combined methods in our framework for a truer and more complete understanding of the core concepts in computing technologies from their deeper histories of development, and be able to apply this knowledge to their own further research, learning, and career development. By the end of the course, students will be able to apply this interdisciplinary knowledge to many other fields and careers, including design and applications in digital media, and to become capable communicators of "de-blackboxed" explanations of computing, including our complex data and AI systems. Students will be able to explain the reasons behind the design of our contemporary computing systems, and better understand what is possible (and not) in future developments.
Learning archival methods for access to the history in original primary sources is an important skill in many disciplines and career paths. Students with this competency will always be in demand for being able to research, interpret, and apply the historical knowledge gained from direct access to primary sources (original documents, designs, artefacts, earlier technologies) in understanding the foundation of computing systems today, and what can be possible in new future designs.
View and download the pdf syllabus document:
Full description of the course, Georgetown Policies, and Georgetown Student Services.
Course Format
The course will be conducted as a seminar and requires each student’s direct participation in the learning objectives in each week’s class discussions. The course has a dedicated website designed by the professor with a detailed syllabus and links to weekly readings and assignments. Each syllabus unit is designed as a building block in the interdisciplinary learning path of the seminar. For each week, students will write a short essay with comments and questions about the readings and topics of the week (posted in the Canvas Discussions module). Students will also work in teams and groups on collaborative in-class projects and group presentations prepared before class meetings.
Students will participate in the course through a suite of Web-based learning platforms and e-text resources:
(1) A custom-designed Website created by the professor for the syllabus, links to readings, and weekly assignments: https://irvine.georgetown.domains/5038/ [this site].
(2) An e-text course library and access to shared Google Docs: most readings (and research resources) will be available in pdf format in a shared Google Drive folder prepared by the professor. Students will also create and contribute to shared, annotatable Google Docs for certain assignments and dialogue.
(3) The Canvas discussion platform for weekly assignments.
Grades
Grades will be based on:
- Weekly short writing assignments (posted to the Canvas Discussions platform) and participation in class discussions (50%). Weekly writing must be posted at least 4 hours before each class so that students have time to read each other's work beforehand, for a better-informed discussion in class.
- A final research "capstone" project written as an essay or a creative application of concepts developed in the seminar (50%). Due date: one week after last day of class. Final projects will be posted as pdf documents in the Final Projects category in the Canvas Discussions platform.
Professor's Office Hours
To be announced. I will also be available most days before and after class meetings.
Books and Resources
This course will be based on an extensive online library of book chapters and articles in PDF format in a shared Google Drive folder (access only for enrolled students with GU ID). Most readings in each week's unit will be listed with links to pdf texts in the shared folder, or to other online resources in the GU Library.
The required books below are available at the GU bookstore, but, of course, you can get them from online sellers as well. Selections from these books are in pdfs in the e-text library, but these books are important to have (in print) for ongoing reference and your own annotations.
Required Books:
- Howard Rheingold, Tools for Thought: The History and Future of Mind-Expanding Technology. rev. ed. Cambridge, MA: MIT Press, 2000.
- Peter J. Denning and Craig H. Martell. Great Principles of Computing. Cambridge, MA: MIT Press, 2015.
Course Online Library (Google Drive: GU student login required)
- Main course library folders:
- Directory of folders for all topics (for your own reading and research)
- Georgetown Library Main Search Page
- Learn how to search for books, journals, databases, and other media.
Using Research Tools for this Course (and beyond)
- Required: Use Zotero for managing bibliography and data for references and footnotes.
Directions and link to app, Georgetown Library (click open the "Zotero" tab).
You can save, organize, export, and copy and paste your references with formatted metadata into any writing project.
- Required: Internet Archive (archive.org). Sign up for a free account (gets you access to books and documents that can only be read online).
- The Archive is a rich library of free-access and downloadable books and journals in the history of computing and technology. You will learn how to do your own research in primary documents published before the 1950s.
Introduction to the Methods, Topics, and Learning Goals of the Course
- Introduction to the History and Philosophy of Computing.
Introductions
- Professor's Personal Introduction.
- Student introductions: Who are we? Backgrounds and interests to be considered in developing the course.
Course Introduction: Requirements, Expectations, Orientation
- Format of course, requirements, participation, weekly assignments, projects, outcomes (see above).
- Using the course Website syllabus (the main launchpad) and online etext library (shared Google Drive).
- Why I use custom-designed Websites for courses: teaching philosophy, instructional design, student access to materials.
- I update and revise this course Website as we study together: check back each week for updates.
- Classroom rules for PCs and mobile devices: no social media or attention sinks during class.
Using Research Tools for this Course (and beyond)
- Required: Learn how to use Zotero for managing bibliography and data for references and footnotes.
Directions and link to app, Georgetown Library (click open the "Zotero" tab).
You can save, organize, export, and copy and paste your references with appropriate formatted metadata into any writing project.
- Required: Using Georgetown Library, Main Search Page: https://www.library.georgetown.edu/
- Learn how to search and access journals, books, and databases for further study and research.
- Required: Sign up for a free account on Internet Archive (archive.org) (gets you access to books and documents that can only be read online).
- The Archive is a rich library of free-access and downloadable books and journals in the history of computing and technology (and many other topics). This is a great resource for learning how to do your own research in primary documents published before the 1960s.
Introduction to the course: methods and main topics (Prof. Irvine, Presentation)
Main Topics and Learning Objectives
This week, students will learn about the conceptual framework, methods, and main topics of the course. Students will be introduced to the design principles for modern computing systems (including forms of digital data and media) so that we can ask important questions about "how and why did we get here," and "what makes all this possible"?
In this introductory week, we will begin framing the major questions which the course will enable all students to answer. In each course unit, students will be introduced to research methods and foundational concepts for doing their own investigations and developing answers and explanations. By the end of the course, students will be able to answer the following questions and explain why they are important for our understanding of computing today.
- Why is computing (as we know it) and everything digital based on philosophical assumptions that go back 400 years? What are the major ideas in computing that aren't technological in themselves?
- What is Computation? How long have we humans been doing computation?
- What is a Computer? What makes a computer a computer? And why should we always say Computer System rather than computer (as an object or product)?
- What do we mean by automation? What can, and can't, be automated in computer systems? Why is computation, as we understand it in computer system design, called "automatic symbol processing" and "automated reasoning"?
- How can we "compute" all kinds of human symbolic forms (all our digital media from text and numbers to images and audio) that are not "about" numbers, and not numerical in themselves?
- What are the historical "roots" (deep histories) and "routes" (paths of development) of computing that have led us to where we are today? What "routes" haven't yet been followed -- well, fully, or at all?
About our required texts and main online sources:
Our required texts and online sources often overlap in the presentation of topics, but this is good: you will read explanations and historical contexts from the different ways that the history of computing has been conceived.
Introductory Readings
- Review: Course Introduction (Slide Presentation: Review)
- Prof. Irvine, Introduction to the Course (Methods and Approach) (Print out, save for reference.)
- Subrata Dasgupta, It Began with Babbage: The Genesis of Computer Science. Oxford: Oxford University Press, 2014. Selections. Read the Prologue, pp.1-8 for this week.
- Note: Computing didn't begin with Babbage, but, as we will see, he was the first to work out the many complex details for a mechanical computing system.
- George Ifrah, The Universal History of Computing. Available online in pdf. Skim Part 1, pp.3-96 (for an appreciation of the deep history of thinking about, and using, numbers and symbols before all the cumulative knowledge began to be applied to computing machines), and read pp. 302-310 in Chap. 6. Pp. 99-154 are a preview of our first historical node.
- This book is a treasure of deep research and thinking by Ifrah, who is famous for his masterful book, The Universal History of Numbers. The sections are sometimes oddly organized, and contain the translator's comments; but you can easily skip around to the sections that you want to study.
- Peter J. Denning and Craig H. Martell, Great Principles of Computing. MIT Press, 2015 (in pdf). Introduction.
- This is a great, accessible introduction to the main principles of computing by a leading computer scientist. Many chapters and topics may seem beyond what you're ready for now, but keep reading and re-reading, and we will discuss the main topics throughout the course.
Preview Online Sources: get to know the sources that we will use in the course
- Go to these online sources this week just to become familiar with the contents. We will use topics in these sources in our course units, and they are great to know for reference (and fuller details and background than Wikipedia).
- Stan Augarten, Bit by Bit: An Illustrated History of Computers (Web version).
- Link is to the Table of Contents. Use the drop down menu (BIT BY BIT) in the navigation row at the top.
- For this week, go to the Preface and Chapter One: Introduction. Browse any other chapters to look ahead, and consult them throughout the course.
- This will be a major online textbook through the course.
- History of Information (Online Encyclopedia of Technology, by Jeremy Norman)
- This is a richly resourced site. View by "Theme" (Topic Categories).
- Computer Timeline (Developed by Georgi Dalakov). This is a good resource that combines many solid sources in the history of computing. See the Sitemap for a list of "Timeline Stories" [not sorted].
Video Lessons Online:
- Crash Course Computer Science (list of video lessons)
- View the Preview and Lesson #1, Early Computing. This is a great series of lessons, and we will use them throughout the course.
- Code.org: How Computers Work (2): What Makes a Computer a Computer?
- We will also refer to other video lessons in this series later in the course.
Examples for Discussion (Prof. Irvine, in class):
- Examples of primary documents and artefacts for introducing our methods of study.
Writing assignment (Canvas Discussions Link)
- Discuss one or two important things that you learned from the readings and video lessons for this week, and write questions that you would like to go over in class. Were the concepts and methods in the "Introduction to the Course" (pdf) clear and understandable? What wasn't? Include questions or topics that you would like to have covered or explained more fully this week and throughout the course.
Learning Objectives and Main Topics
Learning the contexts and concepts for the first mechanical calculating machines, designed by Blaise Pascal and Gottfried Leibniz during the 1640s-1710s. Learning from the original texts, documents, and examples of the machines. Learning about the social and intellectual conditions of the 17th-century "Scientific Revolution" (defined by major developments in mathematics and applied mathematics for scientific observations, instruments, and machines) as the foundations for the 18th-century "Enlightenment" (new developments in all fields of knowledge separate from the authority of the church).
Important historical question:
Why were machines for automating computation (calculation) in decimal numbers first developed in France and Europe from the 1640s-1670s? Why then and there? Why not earlier (or later) or in some other country or region?
"Scientific Revolution" and "Enlightenment" Era
In learning about the context of this historical period, we will
use these two terms "Scientific Revolution" and "Enlightenment Era" just as conventional labels for what is really a very complex, and contested, period of history. Our focus will be on what converged to make the very conception of computation and design models for machines possible at all, and possible to implement in actual physical materials.
Pascal is important for his contributions to geometry, mathematics, scientific experiments, the philosophy of science, and for conceiving, designing, and building the very first mechanical arithmetic machine (when he was 19 years old). His machine is notable for a number of "firsts" in history, but we will study it for the major conceptual leaps that made it possible to automate human abstract, conceptual structures (numbers represented in symbols) in physical components.
Leibniz is important for many contributions in the history of human thought (the calculus, notation signs), but we will consider his work on the theory of signs and symbols, his design concepts for an arithmetic machine (mechanical calculator), and his development of the binary (base 2) number system.
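As a quick illustration (my own, not part of the assigned readings), here is the base-2 notation Leibniz described, generated by repeated division by 2: any whole number can be written with only the two symbols 0 and 1.

```python
# A small illustration (not from the readings) of base-2 notation:
# repeated division by 2 yields the binary digits of a whole number.

def to_binary(n):
    digits = []
    while n > 0:
        digits.append(str(n % 2))   # the remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits)) or "0"

print(to_binary(13))   # '1101'  (8 + 4 + 0 + 1)
```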
What difference does it make?
How does it change (does it?) your understanding of the history of computing by having your own direct access to original primary documents and views of the early machines?
Readings and Background Sources:
- Brief backgrounds on the intellectual historical contexts: 1600-1840
- Wikipedia: "Scientific Revolution".
- Wikipedia: "Age of Enlightenment"
- Introductions to the origins of "calculating" and "computing"
- Joel Shurkin, Engines of the Mind (1996), Chap. 1.
- George Ifrah, The Universal History of Computing, Chap. 5, pp.99-132. (Online pdf.)
- Augarten, Bit by Bit, Read Chap. 1 (Introduction), and sections 1.2-1.9.
- Background Histories (Computer Timeline): Main Index | Linear Scroll Through Histories
- Prof. Irvine: Introduction to Node 1: Background and Sources (Slides)
[We will discuss in class.]
- Introduction to Readings from Primary Sources: Part 1, edited and translated by Prof. Irvine. [Online pdf of Primary Sources: Part 1].
- For this week, read the short paragraph by Galileo (p.1); and the sections from Leibniz on The True Method, A Dialog, and the Letter to Walter von Tschirnhaus (pp.13-20).
Windows Into the Intellectual World of Pascal and Leibniz:
Important Primary Texts:
View and Survey the Contents of these Famous Books
[we will discuss in class]
- René Descartes, Geometry (Part of Discourse on Method) (1637) (Wikipedia background)
- Original Text (in French) (Internet Archive)
The link opens on the pages for Chap. 2: page through to p.318, p.356, p.370. Descartes calls his drawn lines in a diagram a "machine" for working out demonstrations. This is the traditional method with a ruler and compass.
- Parallel Text edition (original French and English translation)
Note: Descartes' combination of algebra and geometry, and use of "paper machines." He is using an "analog" computing device. The translator renders the term as "instrument."
- Christiaan Huygens, The Pendulum Clock, or, Geometrical Demonstrations Concerning the Motion of Pendula as Applied to Clocks (1673) (Wikipedia Background)
- Original Latin edition (1673) of The Pendulum Clock (Horologium oscillatorium; sive, De motu pendulorum ad horologia aptato demonstrationes geometricae)
The link opens on the page with the illustration of the clock with gear components and pendulum. The Latin word horologium (and French horloge) comes from the Greek (via Latin) word meaning "time counter, time reckoner."
- English Translation (Blackwell, 1986). Go to the page with Huygens' illustration of the clock, and read a few pages of his description of the geometrical-mathematical principles of the clock design.
- Note: The translation of arithmetic and geometry into numbers of gear teeth, wheel diameters, and a mesh-work system of components for the design of clocks and watches was already a well-known Art (a practiced craft, trade, technique). 17th- and 18th-century mathematicians described the formal (abstract, mathematical) rules and laws for time-keeping and navigation instruments. Pascal obviously saw how this technology could be re-purposed and redesigned for a calculating machine.
- Isaac Newton, Universal Arithmetic: Or, A Treatise of Arithmetical Composition and Resolution
(1707) (Wikipedia Background)
- Original Latin Text (Arithmetica universalis, first printed in 1707) (Internet Archive)
- English Translation (1720): (Google Books) and (Internet Archive)
Go to p.1 for the definition of "Computation," and survey the presentation of arithmetic and algebra. (Newton sums up the accepted knowledge of his time.)
- Newton is famous for much more (optics, theory of gravity, applied mathematics in many fields), and for his important work, The Mathematical Principles of Natural Philosophy (1687, and many later editions), which is usually cited as the embodiment of Enlightenment mathematics and science. He and Leibniz were rivals, and engaged in a long, fruitless dispute over who first discovered and formalized the notation (mathematical symbols) for the calculus.
Examples for discussion in class
- Examples of primary documents and artefacts from libraries and museums
- Preview of Pascal's and Leibniz's Arithmetic Machines.
Writing assignment (Canvas Discussions Link)
- Choose one of these questions to write about with references to the readings and to one of the primary source texts assigned for this week (as above):
- What did you find most interesting, challenging, or difficult about studying the ideas and intellectual and technical contexts for this period?
- How is Leibniz's philosophy of symbols connected to his view of reasoning (rational thinking), mathematics, and necessary ("mechanical") thinking processes?
- Could you see how the concepts, assumptions, and ways of thinking with mathematics exemplified in the primary texts by Descartes, Huygens, and Newton would have been understood and applied by Pascal and Leibniz for conceiving and designing an arithmetic machine?
Learning Objectives
Learning the essential concepts and design principles of computing systems by studying major original documents and historical examples of the first calculating machines.
What can we learn about the key concepts in computing from studying the primary sources for the design principles of the earliest arithmetic machines?
Readings and background:
- Prof. Irvine, ed., Primary Sources, Part 1. [Online pdf of Primary Sources: Part 1].
- Read the selection of texts on Pascal's and Leibniz's Arithmetic Machines.
We will study the Binary (base 2) Number System in the following historical Node (connecting Leibniz's work with that of George Boole).
- Prof. Irvine: Introduction to Node 1: Background and Sources (Slides).
Review the last 5 slides for the underlying design concepts for Pascal's and Leibniz's Arithmetic Machines.
- Calculating in Pre-modern Times: Online Exhibition by the Arithmeum, Bonn, Germany.
- Scroll through the examples up through the 18th Century.
- History of Information (site), Calculator and Computer Design (Ancient to Modern)
- See the historical context of Pascal's and Leibniz's work, 1642-1705 in a timeline view.
Video Lessons: Visualizing How the Design Principles Were Implemented
- View these videos for the design and operations of the machines from our point of view:
can you see how each component and its linked actions represent a technical solution to a design problem? How can we map physical instances of arithmetical numbers in the physical structures of components with the operations that we apply to them (addition, subtraction, ...) in a system that "does arithmetic" (from human inputs), and is not just a "box of parts"? (A short simulation sketch after the video list below illustrates this mapping.)
- How Pascal's Arithmetic Machine Works (Mechanical Computing)
- Pascal's Machine: Digital Simulation (Dresden State Art Collections, Milestones of Knowledge)
- Pascal's Machine (Arithmeum, Bonn; in German but you can follow.)
- Leibniz's Machine (Arithmeum, Bonn; in German but you can follow.)
- Leibniz's Machine (Exhibition and video, Arithmeum, Bonn)
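To help visualize the "correspondence mapping" mentioned above, here is a minimal simulation sketch (in Python, my own illustration rather than anything from the sources): each decimal digit corresponds to a ten-position wheel, and a carry advances the next wheel whenever a wheel passes from 9 back to 0, roughly what Pascal's sautoir mechanism did mechanically.

```python
# A minimal sketch (my own illustration) of correspondence mapping in a
# Pascaline-style adder: digits correspond to wheel positions, addition
# corresponds to rotations, and carries ripple to the next wheel.

class DecimalWheels:
    def __init__(self, n_wheels=6):
        self.wheels = [0] * n_wheels              # wheels[0] = units, wheels[1] = tens, ...

    def add_digit(self, position, amount):
        """Rotate one wheel by 'amount' steps and propagate carries."""
        for _ in range(amount):                   # the stylus turns one step at a time
            self.wheels[position] += 1
            pos = position
            while pos < len(self.wheels) and self.wheels[pos] == 10:
                self.wheels[pos] = 0              # wheel passes 9 -> 0
                if pos + 1 < len(self.wheels):
                    self.wheels[pos + 1] += 1     # trip the carry to the next wheel
                pos += 1

    def add_number(self, n):
        for position, digit in enumerate(str(n)[::-1]):
            self.add_digit(position, int(digit))

    def read(self):
        return int("".join(str(d) for d in reversed(self.wheels)))

m = DecimalWheels()
m.add_number(758)
m.add_number(346)
print(m.read())    # 1104 -- the sum appears as the positions of the wheels
```

The point of the sketch is the mapping itself: numeral symbols correspond to wheel positions, and an arithmetical operation corresponds to a physically constrained sequence of rotations and carries.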
Case Studies: Pascal's and Leibniz's machines in the original documents
- In addition to the texts and images in the Primary Sources, Part 1 pdf, you can also access and think about the primary documents in online digital versions:
- Pascal, Dedicatory Letter and Information for Users for his Arithmetic Machine (1645). Digital copies of the texts that Pascal had printed for his machine.
- Library of Clermont-Auvergne, France | Internet Archive (with the Royal Patent)
- View the description and illustrations of Pascal's Arithmetic Machine from the original printing of the Encyclopedia
- ARTFL edition, engraving Plate with description (Vol. 22:2).
Writing assignment (Canvas Discussions Link)
- Using the primary documents and demonstrations of details in Pascal's and Leibniz's machines (videos and illustrations in the readings), describe how the two early designers used the principle of "correspondence mapping" for numeral symbol representations and arithmetical operations in a system of physical components. Select one or two examples of components that show the features for physically registering numerals and for conducting actions corresponding to arithmetical operations. Draw your own diagrams with labels and captions, if this helps and makes the learning more fun. Can you also see how "user agency" is designed into the system for directing input numeral representations to output numeral representations? Explain, as best you can so far, the "why" and "how" of the designs. Ask questions that we can follow up on in class.
Learning Objectives and Main Topics:
Learning the key concepts, computational models, and design principles that Charles Babbage found ways to implement in the physical components available in England and Europe in the 1820s-1840s.
Background: A Computing System, From Design to Implementation
We will study Node 2 in two steps (as before): a week devoted to the main ideas and design concepts, followed by a week on the technical implementations and interpreting the primary sources that document the developments.
This week, you will learn the basic background on Babbage's great "conceptual leaps" that enabled him to come close to conceiving a "general purpose" mechanical computing system, and the ideas that he attempted to realize in actual working machines.
Next week, we will focus on the design principles and technical implementations in Babbage's Engines, and study the descriptions and commentary on the machines in the primary sources.
Readings: Background and Primary Sources
- Prof. Irvine, "Introduction to Node 2." Read p.1 of the Primary Sources, Node 2 compilation (in-class handout and online).
- Dasgupta, It Began with Babbage, Chaps. 1-2, pp. 9-27.
- Charles Babbage (background history, Computer Timeline)
- Ada Lovelace (background history, Computer Timeline)
- Augarten, Bit by Bit, From Chapter 2: Read Sections 2.1 - 2.8.
- Section 2.1 connects the ideas and designs from Leibniz's work to other developments in aids for computation and the first commercially established, small calculating machine: Thomas of Colmar's Arithmometer, which used a version of Leibniz's graduated cylinder ("Stepped Drum").
- See the version of the Arithmometer on display at the Smithsonian Museum of American History [link], which is part of the larger collection of Stepped Drum machines [link]. See other calculating machines in the Smithsonian collections [link], including Difference Engines designed with Babbage's concepts.
- Rheingold, Tools for Thought: The History and Future of Mind-Expanding Technology (Rev. ed., 2000), Excerpts from Chap. 1 and Chap. 2: The First Programmer was a Lady.
- Babbage and his Engines: Introduction. Computer History Museum
- This is an excellent short introduction. Read each section with the Heading titles (Overview - Modern Sequel).
- Survey of Primary Documents (the original publications for the selections in Primary Sources, Node 2) [pdf in Google Drive]. Survey the sources for this week. We will study in detail next week.
Writing assignment (Canvas Discussions Link)
- Discuss two or three of the key concepts in Babbage's method and design for a mechanical computer system. Can you explain the significance of the motivation for his design and implementation of the Difference Engine (computing error-free tables of pre-calculated numbers)?
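For a concrete sense of what the Difference Engine was built to automate, here is a minimal sketch (in Python, mine rather than from the readings) of the method of finite differences: tabulating a polynomial using only repeated addition, the one operation that the Engine's columns of wheels performed.

```python
# A minimal sketch (not from the course readings) of the method of finite
# differences that the Difference Engine mechanized: for a degree-n polynomial,
# the nth difference is constant, so a whole table can be produced by addition alone.

def difference_table(initial_values, steps):
    """initial_values: [f(0), first difference, second difference, ...]"""
    registers = list(initial_values)           # one "column of wheels" per order of difference
    table = [registers[0]]
    for _ in range(steps):
        for i in range(len(registers) - 1):
            registers[i] += registers[i + 1]   # add each difference into the order below it
        table.append(registers[0])
    return table

# f(x) = x^2 + x + 41: f(0) = 41, first difference = 2, second difference = 2
print(difference_table([41, 2, 2], 5))   # [41, 43, 47, 53, 61, 71]
```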
Learning Objectives and Main Topics
The Primary Sources for Node 2: Learning the fundamental design principles for Babbage's Analytical Engine from the original papers, diagrams, Ada Lovelace's explanations, and actual versions of Babbage's machines.
Understanding the design principles and conceptual leaps for Babbage's Engines as computing systems in the context of applied mathematics. We will study Lovelace's interpretations of the design for the Analytical Engine, and the meanings of the diagrams, algorithms, symbols and operations:
Babbage and Lovelace provided the first macro design concepts for what became the standard model for computation: defining and organizing two orders or classes of symbols (for numerical representations and for operations on, over, or with them), both requiring homologous mapping to corresponding system components. This was the beginning of concepts for "data" and "programs."
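As a rough illustration of these two classes of symbols (with hypothetical names, not taken from the primary sources), the sketch below keeps number symbols in a "store" of variables and keeps the operations to be performed on them in a separate ordered list, echoing the separation of variable cards and operation cards in the Analytical Engine design.

```python
# A rough sketch (hypothetical, not from the primary sources) of the two orders
# of symbols: variables that hold number symbols (data) and an ordered list of
# operations to perform on them (a program).

store = {"V1": 5.0, "V2": 3.0, "V3": 0.0, "V4": 0.0}     # the Store: symbols that mean

program = [                                              # operation cards: symbols that do
    ("multiply", "V1", "V2", "V3"),   # V3 <- V1 * V2
    ("add",      "V3", "V1", "V4"),   # V4 <- V3 + V1
]

operations = {
    "add":      lambda a, b: a + b,
    "multiply": lambda a, b: a * b,
}

for op, in1, in2, out in program:                        # the Mill works through the cards in order
    store[out] = operations[op](store[in1], store[in2])

print(store)   # {'V1': 5.0, 'V2': 3.0, 'V3': 15.0, 'V4': 20.0}
```

Lovelace's table in Note G organizes a computation in essentially this way, row by row: which operation is performed, which variables it reads, and which variable receives the result.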
Terminology:
- Analytical: computations that use algebra and mathematical functions, and not only basic arithmetical operations.
- Engine: a designed device that embodies mathematical and/or mechanical principles, and intended as an aid for human capabilities and/or for labor-saving (as in reducing the mental labor of calculations).
Readings and Primary Sources: Deeper into the history of ideas and machines
- Prof. Irvine, Primary Sources: Node 2 [online pdf].
Read the excerpts from the primary documents on Babbage's Engines and Ada Lovelace's contribution to the description of the Analytical Engine and explanation of the main design concepts.
- Accessing the sources in the historical archive (optional, view and return to later):
If you are curious about the original texts (as they were published and received in the 19th century) or about Babbage's original writings, here are links to good-quality digital scans (view/read online, and/or download):
- Ada Lovelace, translator, “Sketch of the Analytical Engine Invented by Charles Babbage, By L. F. Menabrea of Turin, from the Bibliothèque Universelle de Genève, October, 1842, with Notes upon the Memoir by the Translator, Ada Lovelace,” in Scientific Memoirs, Selected from the Transactions of Foreign Academies of Science and Learned Societies, ed. Richard Taylor, vol. 3 (London: Richard and John E. Taylor, 1843), 666–731.
- Lovelace's Table Diagram for computing the Bernoulli Numbers in the Analytical Engine, based on Babbage's algorithm (in Lovelace's Note G).
- Charles Babbage, Passages from the Life of a Philosopher. First edition, London, 1864. [Internet Archive] [Copy on Google Drive] (Babbage's autobiographical account of his work and major ideas.)
- Charles Babbage, Babbage’s Calculating Engines: Being a Collection of Papers Relating to Them; Their History and Construction. Ed. Henry P. Babbage. First published, 1889. Reprinted: Cambridge: Cambridge University Press, 2010. (The main collection of publications by and about Babbage's work.)
- The Babbage Papers, Science Museum, London, UK. The original manuscripts and diagrams in Babbage's papers. Scroll down to "Browse this Archive," and select a collection of papers by topic to view digital photos of the papers.
Technical and Theoretical Background: Design and Implementation
- Background on Babbage's Difference Engine (Computer History Timeline)
- Background on Babbage's Analytical Engine (Computer History Timeline)
- Babbage's Engines: History and Reconstruction (Video, Computer History Museum)
(Good overview of the two Engines, and their reconstruction in the Science Museum.)
- Ifrah, Universal History of Computing [online pdf], read pp.177 (bottom)-180, 189-197.
- Doron Swade, “Automatic Computation: Charles Babbage and Computational Method.” The Rutherford Journal 3 (2010), online version; pdf version (see photos and images at end).
- This is an excellent, detailed essay on Babbage's philosophy of computing by the leading expert on Babbage's machines. Note the photos of versions of the machines and the original documents for diagrams and illustrations.
- The Jacquard Loom (background history, Computer Timeline)
- Babbage combined different technologies in his designs; the two most important were the meshing gear-work in clock components and the punched cards used in automated weaving looms for reproducing woven patterns and controlling the actions of horizontal and vertical weaving. Several inventors contributed to the technology, but the most widely known is Joseph-Marie Jacquard, who standardized the grid-pattern modeling and mapping of pattern information and controls into rows of punched holes in cards sequenced as loom operations.
- Programming Patterns: The Jacquard Loom (Science Museum, London)
- History of cylinders and punched cards and tape for information (Koichi Arts & Science)
- Oxford University, Ada Lovelace: The Making of a Computer Scientist (Historical sources used in a recent museum exhibition.)
Video Lessons: Historical Background and Observing Machine Operations
- Doron Swade, Design and Operation of Babbage's Difference Engine No. 2 (Video)
- Charles Babbage, Archive and Engines, The Science Museum, London, UK
- Background and Context of Babbage's Engines (Science Museum, London) [Excellent!]
- Difference Engine No. 2 (Constructed from Babbage's designs by the Science Museum, 1991-2001; move and scroll through images)
- Section of the Analytical Engine, Trial Model (1834-1871)
- Sydney Padua, Babbage's Analytical Engine (3D digital graphics modeling!) [This video is by the author of an illustrated graphic novel (!) of Babbage's and Lovelace's work.]
Writing assignment (Canvas Discussions Link)
- From your study of the primary documents, designs, and examples of Babbage's engines, what seemed the most interesting in our context? What were the limitations of the mechanical computing system in comparison with modern electronic computers? What kinds of assumptions and design principles continue from Pascal and Leibniz?
- Go to your individual assignment topics (Google Doc).
- Hint: copy and paste directly from the online pdf of the Primary Sources document for citing and quoting statements to illustrate or support your points in your post.
Learning Objectives and Main Topics
Learning the foundational ideas for symbolic systems, code, and applications to electronic systems in the mid-19th century.
We can trace the strands of Leibniz's "mechanical thread" of symbols (in rule-governed sequences of abstract symbols and signs for operations and relations) as they connect to design concepts for computable, symbolic notation systems -- for both human computers, and then for the technical designs of physical systems.
This week, we will use Samuel Morse's design for an electronic telegraph code as a case study for understanding how switched electrical states can be designed symbolically as a "system of signs," a code.
Next week, we will study the important parallel and overlapping histories during the same decades in the mid-19th century. These ideas later converged in the foundational concepts for modern electronic computing: continued developments in electronic signals designs in telecommunications, further developments in computing, George Boole's symbolic logic for binary symbols and operations, and C. S. Peirce's development of semiotic theory.
Binary logic and further developments in mathematical logic had become formalized in symbolic notation systems in the 1930s-40s, just when it was becoming technically feasible to employ electrical circuits as the material substrates for representing data and operations (as Babbage, Lovelace, and others had defined them). With a further development of Morse's breakthrough concepts for encoding typographic symbols in electrical units, there was another "just in time" convergence of philosophy, math, logic, and technical means for automating computation in a system of electronic components. By the late 1940s and through the 1950s, we see the foundations of modern computing -- encoding symbols that mean (data) combined with encoded symbols that do (programs) in an overall design for a controlled, time-sequenced, automated system.
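Here is a rough sketch (my own, not from the readings) of Morse's core move: building a "system of signs" out of only two physical states of a circuit, current on and current off, held for shorter or longer durations. The timing units below follow the common dot/dash convention and are only illustrative.

```python
# A rough sketch (illustrative, not from the readings): letters become a
# sequence of on/off circuit states with durations, the kind of pattern a
# telegraph key produces on the line.

MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}     # a few code points for illustration

def to_line_signal(message):
    """Translate letters into (state, duration) units of switched current."""
    signal = []
    for letter in message:
        for mark in MORSE[letter]:
            signal.append(("on", 1 if mark == "." else 3))   # dot = 1 unit, dash = 3 units
            signal.append(("off", 1))                        # gap between marks
        signal.append(("off", 3))                            # gap between letters
    return signal

print(to_line_signal("SOS"))
```

The meaning of the message is not in the electrical states themselves; it is held by the people (and conventions) at each end of the line who interpret the code, a point we will return to with Peirce and Shannon.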
Readings:
- Prof. Irvine, Samuel Morse: Case Study: From Morse Code to Binary Code
Background for museum visits; color pdf. Read before the class meeting at the museums.
- Prof. Irvine, Introduction to Peirce's Semiotics and the Foundations of Computing
Read for background on terms and concepts in C. S. Peirce's semiotic theory, and how we can apply them in the history of information and computing.
Primary Sources
- Samuel Irenaeus Prime, The Life of Samuel F. B. Morse, Ll. D., Inventor of the Electro-Magnetic Recording Telegraph (New York, 1875). (Internet Archive) (you can download the book in pdf for reference)
- Go to pp.251-53, where Prime records Morse's account of his first ideas for a "system of signs" that used switched electrical current.
- From the 1840s on, Morse and other inventors were involved in a long patent dispute about who "first" invented the electrical telegraph. This chapter records the issues and background history of the dispute.
- This account of the first "system of signs" is from Morse's letter to the Secretary of the Treasury and the US Congress, published in 1837. The government solicited proposals for building a US telegraph system, and Morse's design and plan was one of those proposed. Morse received the government funding.
- See the original document with Morse's letter (Internet Archive).
- Samuel Morse: Primary Documents (Morse Papers, Library of Congress)
- About the Morse Papers Collection
- Timeline History and Morse Papers: to 1839 | 1840-1872
- Table of the first plan for code symbols (with Morse's comments) (Morse notebook, 1838) [The prototype "dot-dash" code is on the last lines]
Video Lesson
- Background History: Electrical Signals and Morse Code (Art of the Problem series)
- This video is good on the background technical history, but it doesn't go further to explain how the basic concepts in Morse Code became the foundation for everything in binary electronic code. We will connect the dots (!).
Backgrounds for Museum Visits
- Morse Telegraph Instruments: Smithsonian American History Museum Collection
- Samuel Morse and Washington, DC: Historical Map of Morse's art and technology in DC.
- Thomas of Colmar's Arithmometer: Arithmometers In the Smithsonian Collection
- The Thomas Arithmometer was the first commercially produced calculating machine, and used by C. S. Peirce in the 1870s. It was designed on a smaller scale (not like Babbage's Engines), and was portable. It embodied many design features first developed by Leibniz.
- The Arithmometer model on exhibition at the museum now. This is a famous model: it belonged to Frederick Barnard, President of Columbia University (a colleague of C. S. Peirce in American scientific circles). Barnard wrote the first history of calculating machines in his report on the technology at the Paris Exposition of 1867, where the Thomas Arithmometer was widely praised. Barnard brought the machine to Columbia to be used by the astronomy observatory.
- How the Arithmometer Works (Video. Mechanical Computing)
Writing assignment (Canvas Discussions Link)
- In preparation for our museum visits, discuss the key concepts in Morse's discovery of a symbolic code based on switched states of electrical current, and how these basic concepts became extended to our modern binary electronic systems. Try out your own application of Peirce's terms and concepts for Morse's "system of signs" in electronic code.
Learning Objectives and Main Topics:
This week, students will learn the intellectual-historical background for the applied mathematics and logic for "shaping signals" in electricity: the engineering design solutions to the core semiotic problems for symbolic representation in electronic communications and for logic in digital electronic computer systems.
Background:
Morse's proof of concept for a "system of signs" (an electronic code mapped to switched circuits) had two important consequences in the history of ideas for our modern computing technologies. Electrical engineers working for telecommunications companies (Hartley, Nyquist, Shannon) developed mathematical models for electrical signals and for switched network connections. (1) They discovered how to apply the abstract, formal concepts from logic and mathematics (Boole, Peirce, and others) to describe connected circuits in telegraph and telephone switching networks (telecom networks), which led to using combinations of switched circuits designed for representing and performing sequences of Boolean logic in computations; and (2) they developed mathematical models for electronic "information" in structured ("shaped") signals, as design solutions for using electricity (and radio waves) in regulated patterns that could encode messages (sequences of symbols) from sending to receiving points. The applied math and logic in electrical engineering led to Shannon's formalization of the binary bit as the minimal reliable unit for encoding and decoding in controlled electronic states. The binary bit formalizes Boole's binary (one of only two possible values) true/false, 1/0, symbolic representation in a context of interpretation.
The developments in philosophy, logic, mathematics, calculating machines, electricity, and electrical communications from the 1830s to the 1930s converged in another unanticipated nodal moment. Morse's basic design concepts for electrical switched states as a representational system for "a system of signs," plus Boole's ideas for modeling logic with a two-value binary system, and Claude Shannon's application of Boolean symbolic logic to networks of telecom switches -- these ideas and applications in electronic technologies became the foundation of modern binary electronic computing and all forms of digital media (text characters in all languages, graphics and images, photos and video, audio). (A short sketch after the case-study outline below shows how the mapping from logic to switches works.)
Using our methods and key concepts, we will develop a historical case study through this important path of ideas and technologies:
- From Morse's concept of a "system of signs" used for telegraph Code (controlled switched signals), which became used internationally (1830s-1930s),
- Through the philosophy of representation, signs, meaning, and the discovery of how to apply Boolean logic to electrical circuits developed by C. S. Peirce (1890s-1910),
- To Claude Shannon's development of systematic formulas for circuits based on Boolean symbolic logic, and then Shannon's mathematical concept of binary information (the bit) (1930s-50s).
- These two key concepts for binary electronic logic switches and binary units of information became the essential foundation for modern computing as we know it (1950s-present).
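To make Shannon's insight concrete before you read the sources: in a relay network, switches wired in series pass current only if both are closed (Boolean AND), while switches wired in parallel pass current if either is closed (OR). Here is a minimal sketch, with function names of my own choosing; it uses the modern convention where True/1 means a closed circuit (see the note on Shannon's original "hindrance" notation below).

```python
# A minimal sketch (my own) of the correspondence Shannon formalized:
# series wiring behaves like Boolean AND, parallel wiring like Boolean OR.

def series(switch_a, switch_b):
    return switch_a and switch_b        # current flows only if both switches are closed

def parallel(switch_a, switch_b):
    return switch_a or switch_b         # current flows if either branch is closed

# Truth table for a small circuit: a lamp lit by (a AND b) OR (NOT a)
for a in (False, True):
    for b in (False, True):
        lamp = parallel(series(a, b), not a)
        print(int(a), int(b), "->", int(lamp))
```

Because circuits can be described this way, whole networks of switches can be designed, simplified, and verified with the algebra of logic rather than by trial and error, which is the practical significance of Shannon's thesis.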
Background Reading
- Martin Irvine, "Semiotic Foundations of Computing and Information Systems." Chapter in Bloomsbury Semiotics (London: Bloomsbury, 2022), vol. 2. Read pp. 203-212 for this week. Download and print for reference. We will return to the key concepts outlined here throughout the course.
Primary Sources
- George Boole: Primary Documents. See the first editions of his important books in pdf digital copies (Internet Archive). (View briefly and download, if you'd like your own pdf copies.)
- Excerpts from Boole's original works: The Mathematical Analysis of Logic (1847) and The Laws of Thought (1854). Download, survey the contents, read the highlighted pages.
- I prepared these excerpts and highlighted sections so that you can see the important statements in Boole's original works. Don't worry about the difficulties; we will discuss in class. What became known later as "Boolean logic" in computing is a reduced version of Boole's whole program for a symbolic algebra of logic.
- Boole was a member of the same circle of Cambridge mathematicians with Charles Babbage and Augustus DeMorgan (Ada Lovelace's math teacher). Boole and the Cambridge circle followed Leibniz on extending the methods of abstraction and use of symbols in algebra for developing notations for symbolic logic. DeMorgan's famous book, Formal Logic, or, the Calculus of Inference [see on Internet Archive] and Boole's The Mathematical Analysis of Logic were both published in the same year (1847). Both were major influences on C. S. Peirce, who wrote the first papers in America on extending Boole's logic in the 1870s, and founded the American tradition of symbolic logic and semiotics. (More next week.)
- Claude Shannon, A Symbolic Analysis of Relay and Switching Circuits, MIT Thesis (1938/1940). Copy of original typed thesis: excerpt of Part 1, and References. (Link to complete text.)
- See the IEEE Journal Article version (1938). This is the version of Shannon's thesis work that became well-known in the engineering community.
- Note: In describing circuits, Shannon was using electrical engineering concepts for "impedance" (resistance or hindrance of the flow of electricity at the contact points of a circuit), so he initially used "1" for an open circuit (i.e., full "hindrance," and no connection) and "0" for a closed circuit (i.e., where there is a connection and no hindrance in the circuit.) This is one way of mapping Boole's "Everything or Nothing" "1/0" values. Engineers and mathematicians soon revised the notation (symbolism) into the form we use today: "1" represents a connected (closed) circuit = the "on" state (electricity flows through the switched contact point); "0" represents an open circuit = the "off" state (electricity does not flow through the switched contact point).
- Shannon didn't know (and most people still don't) that C. S. Peirce drew the first diagram for applying Boolean logic to circuit switches in 1886. Peirce diagrammed what we now call AND and OR gates in the circuit logic for all processor units. (More next week on Peirce.)
Background on Key Concepts in Logic and Computing
- Charles Petzold, Code: The Hidden Language of Computer Hardware and Software, 2nd ed. (2023): excerpts (Chaps.2, 6-8).
- This is such a wonderful book for an introduction to computing concepts. We will include other chapters in the following weeks. (Full book link.)
- Petzold designed an interactive website for the book where you can try out configurations of switches and logic gates and see the results. Go to: https://codehiddenlanguage.com/ and try out the interactive demos for Chaps. 6 and 8.
Video Lessons: How Boole's Binary Algebra of Logic is Applied in Computing
- Crash Course Computer Science
- Electronic Computing (Lesson 2)
- Boolean Logic and Logic Gates (Lesson 3)
- Representing Numbers and Letters with Binary (Lesson 4)
- Understanding Binary Logic Gates (Spanning Tree)
Class Discussion: Step by step discussion and explanation of sources.
Writing assignment (Canvas Discussions Link)
- Describe the key concepts that connect Morse's design for "a system of signs" for an electronic code based on switched states of electrical current, the formal developments of binary logic in Boole's work, and Shannon's application of Boolean symbolic logic for mapping switched circuits in telecommunications. How did the basic ideas that were formalized (given a symbolic notation) in this "nodal moment" in our history provide the foundations for designing digital electronic computing systems?
Learning Objectives and Main Topics:
Learning the key terms and concepts for semiotic systems and computation developed by C. S. Peirce, and how his philosophy of symbolic systems and technology anticipated (and describes) modern 20th-century computer system design.
Peirce's unfinished project for "Logic as Semeiotic" as a unified philosophy of logic, science, and technology provides important concepts for explaining automated reasoning, information theory, and computation based on binary systems. Many of his concepts and approaches have become part of the common discourse in logic and computing, and others have been rediscovered in other contexts with different terminology. His work also provides valuable concepts for understanding computer systems as designs for semiotic systems (based on symbolic-to-physical homologies) and for interactive programming design.
Next week, we will continue a Peircean approach to modern electronic information theory (Claude Shannon) and to the early design principles for digital binary computing systems.
Introductions, Background
- Review: Prof. Irvine, Introduction to Peirce's Semiotics and the Foundations of Computing
Review this introduction to Peirce's terms and concepts before reading the compilations of Primary Texts (below).
- Martin Irvine, “Semiotics in Computation and Information Systems,” in Bloomsbury Semiotics, ed. Jamin Pelkey, vol. 2; review main points and finish pp. 212-225 for this week.
Table 9.3 (pp. 213-15) provides an outline of important moments and "nodes" in the history of computing; we won't be able to cover all of them, but you can follow up and make connections in your own research.
- C. S. Peirce: Background Biography (Wikipedia). [This article is packed with information, and will give you a sense of the scope of Peirce's thought and work.]
Primary Sources (Main Readings)
- C. S. Peirce, Primary Sources: Semiotic Foundations of Logic and Computing (pdf)
- Download and print (if possible). Make notes and mark sections for questions. I will bring copies for discussion in class.
- Survey the whole group of texts, and focus on some that attract your interest.
- I have included texts (at the end of the document) that show how Peirce's ideas have become part of the main discourse in computer science (or have been rediscovered with other terms), and how we can extend and apply the ideas further.
Writing assignment (Canvas Discussions Link)
- We will have a class seminar-style discussion on Peirce's key concepts for symbolic systems and how his "Logic as Semeiotic" applies to (and is often assumed in) our technologies for digital electronic information and computation. In your discussion post, write some notes on your main "aha" moments when reading the texts in the compilation of Primary Sources, as well as difficulties, and ask questions to go over in class.
Learning Objectives and Main Topics:
Learning how design principles for electrical communications and models for computing based on Boolean logic and the binary system converged in the development of modern digital electronic computers in the 1940s-50s. Electrical engineering for telecommunications (from Morse's telegraph to telephone and data networks) and designs for calculating machines had different origins and histories, but the two projects converged in the use of electronics and the adoption of the binary bit (for encoding).
The concept of the bit enabled the mapping of mathematics to electronics for the design of internal structures of components and for the structures for physical signals. Shannon's model for "information theory" in electrical signals provided an engineering solution to a semiotic problem: designing and engineering structures of electricity as a substrate or subsystem for human symbolic representations in an encoded form. By 1948, it became clear that the binary system for logic and the binary system for numbers (represented in base 2) could be combined as a system for both data representations (symbols that mean) and computational operations (symbols that do).
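As a small illustration (my own, not from the readings) of how the two binary systems combine, the sketch below builds a full adder out of nothing but logic operations and chains four of them to add two 4-bit numbers: base-2 number symbols ("symbols that mean") processed entirely by Boolean operations ("symbols that do").

```python
# A small sketch (my own) of arithmetic done entirely with logic operations:
# a full adder from XOR/AND/OR, chained into a 4-bit ripple-carry adder.

def full_adder(a, b, carry_in):
    sum_bit = a ^ b ^ carry_in                      # XOR of the three input bits
    carry_out = (a & b) | (carry_in & (a ^ b))      # carry produced by AND/OR gates
    return sum_bit, carry_out

def add_4bit(x, y):
    carry, result = 0, 0
    for i in range(4):                              # least significant bit first
        a, b = (x >> i) & 1, (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result | (carry << 4)                    # a final carry becomes bit 4

print(bin(add_4bit(0b0110, 0b0111)))   # 0b1101  (6 + 7 = 13)
```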
Readings
- Prof. Irvine, Introduction to Information and the Design Concepts for the First Digital Electronic Computer Systems (download and print out).
- Stan Augarten, Bit by Bit [site contents]. Background for this week: Chapter 4 and Chapter 5.
- Focus on Sections 4.2-4.4, and 4.7-4.8; 5.1-5.3; and 5.8-5.10.
- We can't cover all the steps and contributors for the design of the modern computer system, but this is a good resource for the main historical contexts (1930s-50s).
- Michael S. Mahoney, “The Histories of Computing(s),” Interdisciplinary Science Reviews 30, no. 2 (June 2005): 119-35.
- Review Mahoney's good summary of the many historical contexts (institutional, political-economic, disciplinary), and attend carefully to pp. 127-129 (on software and symbol systems).
- Primary Texts: Information and Digital Computer System Design
- Claude Shannon, "A Mathematical Theory of Communication" (1948). (Modern edition of the original article.) This paper defined the concepts and mathematics for "Information" as quantizable/quantized binary signals (bits) for electronic communications and computing. Read pp.1 - 3 (top); skim the rest for the headings to main sections.
- See the Original version in the Bell System Technical Journal (July, 1948).
- Note: The mathematical formulas and diagrams are applied mathematics for "the fundamental problem of [electronic] communication" in electrical signals, stated in par. 2, p. 1.
- Primary Texts (in a compilation of edited and reprinted versions):
John von Neumann and the Logical Design of Electronic Digital Computer Systems. Excerpts from the original papers by designers of the ENIAC and EDVAC systems. Download to read with pdf software.
- Read and refer to this compilation of texts. I have highlighted important statements. Read the highlighted sections with as much of the context as you can. We will discuss in class. View the digital copies of the original documents below:
- Original Document: John Von Neumann, First Draft of a Report on the EDVAC (June 30, 1945) (Philadelphia, PA: Moore School of Electrical Engineering, University of Pennsylvania, 1945). Internet Archive: digital copy of original document.
- Original Document: Adele Goldstine, Arthur W. Burks, et al., A Report on the ENIAC (Electronic Numerical Integrator and Computer): Part I: Technical Description of the ENIAC (Moore School, University of Pennsylvania, 1946). Internet Archive: digital copy of original document. Thumbnail image view. Scroll through to the pages of diagrams.
Video Lessons and Documentaries
- Backgrounds for Code and Electronic Information Theory (Art of the Problem series)
- The "Von Neumann Architecture" as Implemented in Contemporary Computer Systems
- How Computers Work (Code.org Series): View Lessons 3-5.
- Crash Course Computer Science [series]:
Lessons for this week: 2: Electronic Computing | 5: How Computers Calculate (ALU) | 6: Registers and RAM | 7: The Central Processing Unit (CPU)
- Computer History Archives Project:
- The Development of the ENIAC in 1946 (Documentary, Historical Film Footage)
Writing assignment (Canvas Discussions Link)
- Don't worry that (again) the Primary Source documents are difficult; we will discuss them with your questions in class. Even so, we can capture the key concepts and understand why and how they became so important for the whole history of digital information and computing, all the way to how they are implemented in our most recent technologies.
- In preparation for discussion, and continuing from last week, write about the following topics as far as you understand them so far (with references to the readings and sources). Ask questions about difficulties, and anything needing further explanation.
- Why does Shannon state that "meaning" is not part of the engineering problem for electrical signals (as waves) or bits (binary discrete units)? Here is where Peirce's semiotic theory clarifies the description of "information" -- the quantified electrical units of "communication." Shannon explains what is required for producing reliable (predictably structured) substrates (the subsystem of electronic structures in physical media) for what we use as perceptible tokens of symbolic forms at our input and output points in the signal transmission path (which have "transducers" for converting electrical current from and to human-perceptible forms). (Remember the Morse Code demo? We hold the meaning of the code, and understand the meanings, intentions, and contexts of what is encoded, none of which are properties of the tokenized signals themselves.) A short illustrative sketch of this point follows after these prompts.
- What are the most important conceptual and design steps that enabled the first designs for electronic digital computing systems in the 1940s-50s? What did you discover about the concepts behind the "Von Neumann Architecture," the master system plan or model that we still use in computer system components today? (A toy sketch of the stored-program idea also follows below.)
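For the first prompt above, here is an illustrative sketch in Python (my own toy example, not Shannon's). The transmitted "signal" is only a structured string of tokens; the code table and all interpretation belong to the human users at the two endpoints. The last lines measure the message in bits the way Shannon's theory does, from the symbol alphabet alone, without ever consulting meaning. The four-letter code table is kept small only for brevity.

```python
# Illustrative sketch: the signal carries structure, not meaning.
import math

# The code table belongs to the human users at the endpoints, not to the channel.
MORSE = {"S": "...", "O": "---", "H": "....", "E": "."}
REVERSE = {v: k for k, v in MORSE.items()}

def encode(text):
    """Map letters to Morse tokens; the channel only sees dots, dashes, and spaces."""
    return " ".join(MORSE[ch] for ch in text)

def decode(signal):
    """Recover letters from tokens; meaning is still supplied by the human reader."""
    return "".join(REVERSE[tok] for tok in signal.split(" "))

signal = encode("SOS")
print(signal)             # ... --- ...
print(decode(signal))     # SOS

# Shannon's measure: information in bits, computed from the alphabet alone.
# With 4 equally likely letters, each letter carries log2(4) = 2 bits,
# no matter what the message "means" to the sender or receiver.
bits_per_letter = math.log2(len(MORSE))
print(f"{bits_per_letter} bits per letter, {bits_per_letter * 3} bits for 'SOS'")
```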
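For the second prompt, here is a toy sketch (a simplification written for illustration, not any actual historical machine) of the stored-program idea at the center of the "Von Neumann Architecture": instructions and data live in the same memory, and the processor loops through fetch, decode, and execute.

```python
# Toy sketch of the stored-program (von Neumann) idea:
# one memory holds both instructions and data; the CPU loops fetch-decode-execute.

memory = [
    ("LOAD", 8),     # address 0: copy the number at address 8 into the accumulator
    ("ADD", 9),      # address 1: add the number at address 9
    ("STORE", 10),   # address 2: write the result to address 10
    ("HALT", None),  # address 3: stop
    None, None, None, None,
    6,               # address 8: data
    7,               # address 9: data
    0,               # address 10: the result will be stored here
]

accumulator = 0
pc = 0  # program counter: address of the next instruction

while True:
    op, addr = memory[pc]   # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "STORE":
        memory[addr] = accumulator
    elif op == "HALT":
        break

print("result at address 10:", memory[10])   # 13
```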
Learning Objectives and Main Topics:
What were the big conceptual and technical leaps in computing that enabled the kinds of interactive, multimedia systems that we know today?
In this Node (for Weeks 11-12), we will make clear what the model for a digital binary computer system was by the 1960s, and then how computing could be reconceived as creative "tools for thought" that anyone could use. We will find answers to these questions:
- How did it become possible to re-conceive, redesign, and re-engineer a computing system to become a universal "information processing system" for all forms of symbolic representation encodable in a digital, binary form?
- How did we go from large, "number crunching" computer systems, designed for specialists and for application in mathematics, engineering, and business, to designs for "general symbol processing systems" that could be used by anyone?
- How were the concepts for all digital data types (text, images, audio) developed, and how did they become today's "multimedia"? How are all the data types designed to be computable? (A short sketch follows after this list.)
- How were the designs for "interactive" computing first conceived and developed: what made the new designs for ongoing user-agent inputs for both data and software possible?
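As a preview of how every data type becomes computable, here is a small illustrative sketch (my own example, not from the readings) in which text, an image, and audio are all reduced to the same raw material: sequences of numbers that can be stored and operated on as binary.

```python
# Illustrative sketch: every media type becomes numbers, and numbers become bits.
import math

# Text: each character maps to an integer code (Unicode code points here).
text = "Hi"
codes = [ord(ch) for ch in text]
print(codes, [format(c, "08b") for c in codes])   # [72, 105] ['01001000', '01101001']

# Image: a tiny 2x2 grayscale "picture" is a grid of brightness values (0-255).
image = [
    [0, 255],
    [128, 64],
]
inverted = [[255 - px for px in row] for row in image]   # a computation on the image
print(inverted)

# Audio: a sound wave sampled at discrete moments becomes a list of amplitudes.
samples = [round(100 * math.sin(2 * math.pi * 440 * t / 8000)) for t in range(8)]
print(samples)   # eight samples of a 440 Hz tone at an 8 kHz sampling rate
```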
Readings
- Prof. Irvine, Introduction to Node 5: Contemporary Computing Systems.
Key Concepts for Moving Beyond "Number Crunching" Computers to "General Symbol Processors"
- Howard Rheingold, Tools for Thought: The History and Future of Mind-Expanding Technology (MIT, 2000) [download]. Read Chapters 3 (Alan Turing), 6 (Shannon), 7 (Licklider and imagining new kinds of computers), 8 (Project MAC at MIT), and 9 (Doug Engelbart and the new model of multimedia, interactive computing).
- I ordered this book for the course, and you can download the whole book from our shared drive. (It's good to have your own print copy for taking notes and marking up passages for your own memory and questions.) Read these chapters and attend carefully to the historical contexts and ideas. Rheingold is especially clear about the ideas and design concepts that motivated developments in the modern history of computing (as signaled in his title).
- Compare any of the chapters of Rheingold's book (especially 7-9) to those of Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine, 4th ed. (2023) [download]. [You do not need to read these chapters in full; review contents to compare with Rheingold.]
- This is a valuable history and the authors are expert historians of computing, but you will see that the whole focus is on computing machines as machines (and the people and companies who built them). There is much important history here, but the ideas and the reasons behind the designs (what makes a computer a computer) get lost in the linear history of machine products.
- What's missing in this presentation of the history, and why?
- Optional (for further background)
Similarly, compare the above books with Stan Augarten, Bit By Bit: An Illustrated History of Computers (1984) (online version: use the drop-down menu for "Bit by Bit" at the top to go to each chapter and section).
- This book has many good features, and is written to be accessible for a popular readership. But the key concepts and ideas also get buried in the history of machines and technologies (Chapters 6-9).
Main Primary Text Reading:
A Turning Point in Re-Conceiving What a "Computer System" Is or Could Be
- Vannevar Bush, "As We May Think," Atlantic Monthly, July, 1945. Illustrated version, Life Magazine, Sept. 1945.
See the historical background on Bush's career (Wikipedia article).
- View the original archival versions of the article in pdf:
- Download and Read [print out]:
- Re-edited version of the article (with context), from ACM Interactions (1996).
- Primary sources edited by Prof. Irvine, including:
(1) Introduction to Vannevar Bush's article, (2) Selections from Bush's "As We May Think," (3) Doug Engelbart's letter to Bush (1962), (4) original pages and illustrations of the Memex from the Life Magazine version (Sept. 1945).
- Note the date: the same year as Von Neumann's "Report on the EDVAC" (the founding document on digital electronic computer architecture). Bush was imagining a human cognitive-semiotic system that used analog media (microfilm, cameras, desk displays of film), a system for extending memory and combining information sources that was far beyond what was then possible with computing machines. But the ideas were quickly translated into design concepts that could move digital computing far beyond calculating numbers.
- In this famous essay, Bush outlines his idea for a "Memex" -- a memory-extender -- designed as a "desktop" system for presenting all forms of media that could be searched, organized, interpreted, and commented on by a user.
- This essay was an inspiration for J. C. R. Licklider, Doug Engelbart, Alan Kay, and many others who saw how computer systems could be reconceived by placing ordinary "intelligent users" at the center of the design. Bush's vision for a new system that supported learning, knowledge, and creativity motivated the first steps for developing the interactive, multimedia, interpretive-agent-based computing systems that we have today (though mostly reduced to consumer products).
Video Lessons:
From the Von Neumann-Turing Machine Model to Computing Beyond Number Crunching
- Crash Course Computer Science
- Background: Alan Turing (Lesson 15) [Turing's model of computing was combined with Von Neumann's for programming concepts in the 1950s.]
- Developments: The Cold War and Consumerism (Lesson 24)
- Development of the Personal Computer (Lesson 25)
- The Model of a "Turing Machine" (Art of the Problem).
In-Class Discussion: Continuing from last week (digital computing system design)
Writing assignment (Canvas Discussions Link)
- From your reading of the sources and background, discuss some important ideas in the 1940s-60s that enabled computing systems to be reconceived as symbol processors and "mind expanders" with many types of media beyond numerical data used for "number crunching" systems. Refer to sections in Rheingold's Tools for Thought and statements in Vannevar Bush's "As We May Think" and Engelbart's letter for "aha" moments and questions to discuss in class.
- Thinking ahead: In the next few weeks, we will study (1) the design concepts and supporting technologies for interactive computing and graphical interfaces, (2) computing designed for all forms of digital media, and (3) the design concepts and technical architecture for the Internet and Web. (We won't be able to go all the way to AI and Machine Learning, but you will have a good foundation for asking the right questions and learning more.) Are there specific topics or questions that you would like to have included in our study of these main developments? If so, mention topics or questions that interest you at the end of your discussion post.
Learning Objectives:
Learning the key design concepts and developments in underlying technologies in the interactive computing Node: Doug Engelbart's "Augmenting Human Intellect" lab at the Stanford Research Institute (SRI), and the research and development at Xerox PARC that applied Engelbart's concepts to smaller, networked, interactive computers that everyone from children to office workers and academics could use.
- How was it possible to redefine computing beyond number-crunching processes and applications for specialists?
- What can we learn from the key ideas in the primary documents and the early "proofs of concept" that made working implementations of design concepts now built in to all our interactive systems and devices today?
- What design concepts were developed in the 1960s-80s that are still not implemented in commercially made computers, PCs, tablets, and mobile devices? (Hints: multimedia (text and graphics) software with the ability for a user-agent to create two-way hyperlinks to other files in any form of media; the ability to write your own software and run it on the same computer.) A small sketch of a two-way link structure follows after this list.
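To make the first hint concrete, here is a small illustrative sketch (hypothetical names and data, not any real system's API) of a two-way link store as a data structure: every link is recorded at both of its endpoints, so from any document you can ask both "what do I link to?" and "what links to me?", something the Web's one-way hyperlinks do not support.

```python
# Illustrative sketch: a two-way (bidirectional) link store.
# Unlike a Web hyperlink, every link is visible from both of its endpoints.
from collections import defaultdict

outgoing = defaultdict(set)   # document -> documents it links to
incoming = defaultdict(set)   # document -> documents that link to it

def link(source, target):
    """Record a link so that both endpoints know about it."""
    outgoing[source].add(target)
    incoming[target].add(source)

# Hypothetical file names, for illustration only.
link("notes-on-bush.txt", "memex-diagram.png")
link("essay-draft.doc", "memex-diagram.png")

# From the image's side we can ask: who links to me?
print(sorted(incoming["memex-diagram.png"]))
# ['essay-draft.doc', 'notes-on-bush.txt']
```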
Readings
- Howard Rheingold, Tools for Thought: The History and Future of Mind-Expanding Technology (MIT, 2000) [download]. Continuing from the background on Engelbart in Chap. 9, read Chap. 10 (on the ARPANET research team and Xerox PARC) and Chap. 11 (on Alan Kay and the new interactive, graphical designs at PARC).
- Bill Moggridge, ed., Designing Interactions (MIT Press, 2007). Excerpts from Chapters 1 and 2: the designs for the "Desktop Computer" and the first PCs.
- The background and excerpts from interviews are very revealing about the semiotic systems design thinking behind the development of PCs, graphical interfaces, and systems design for interaction.
- This well-illustrated book provides an insider's view of the ideas developed at Engelbart's lab and Xerox PARC, and then how many of the same teams of people worked on consumer-level PCs at Apple and on the interface designs for the first Macintosh.
- Brad A. Myers, “A Brief History of Human-Computer Interaction Technology,” Interactions 5, no. 2 (March 1998): 44-54.
- This is an excellent brief overview of the technical history, and though written in 1998, we continue to use all the interface design principles summed up here.
- This can be a good research starting point if you want to follow up on interface design and what it takes to make it technically possible.
Primary Documents on Expanding Computing and Interface Design
- Primary Source Documents on Graphical Interfaces and Interaction [download]
- Main Contents (descriptions are in the pdf file):
- Douglas Engelbart, "Augmenting Human Intellect." Main report, Oct. 1962.
- Ivan Sutherland, "Sketchpad: A Man-Machine Graphical Communication System" (1963).
- Alan Kay, Xerox PARC, and the Dynabook/ Metamedium Concept for a "Personal Computer" (1970s-80s)
- What to look for in the primary sources (applying what you've learned about interpreting the primary documents in the history of computing):
- First, skim each document, then review to find the main statements that reveal the motivating ideas and design concepts that we have been studying. Each text will reveal assumptions about the underlying designs of electronic computer systems as semiotic systems, and about how people are expected to use them.
- DIY (Do-It-Yourself) highlighting: highlight the sentences that you find most significant for the history of ideas that we have been following. (For your discussion post and discussion and questions in class.)
- Doug Engelbart's "Mother of All Demos" (1968). Film documentary of the live event.
The Human Augmentation Lab demonstration of the multi-user graphical "oN-Line System" (NLS), San Francisco, Oct. 1968.
- Highlights (5 mins), SRI
- Complete 3-reels of film footage (remastered) (Doug Engelbart Institute)
- Use the original film segments (as much as you have time to view) as a primary source document for interpreting in the same way that we interpreted all the original sources.
- Background and context on the 1968 Demo: Doug Engelbart Institute
Online Research Sources
- Doug Engelbart Institute (main page; use menu at left) || Publications Archive
- Computer History Museum (search on topics like Engelbart, Alan Kay, Xerox PARC, Alto)
Video Lessons: Technical Background for Interactive Graphical Systems
Crash Course Computer Science
- These lessons trace the technical means by which the designs for "symbol processing" and human interactive interpretation were implemented to achieve the overall design goals:
- Keyboards & Command Line Interfaces (Lesson 22)
- Screens & 2D Graphics (Lesson 23)
- The Development of Graphical User Interfaces (Lesson 26)
Video Documentaries (For in-class discussion)
- Alan Kay on the history of graphical interfaces:
- Youtube version | Internet Archive (video will play in browser) (1987)
- Demo of Ivan Sutherland's Sketchpad, Lincoln Labs, MIT (c.1963)
- This is the "proof of concept" for a two-way graphical display interface.
- See also the short video of Alan Kay's commentary on Sutherland's Sketchpad graphical system.
Writing assignment (Canvas Discussions Link)
- In your review of the primary source documents and background in the Rheingold and Moggridge selected reading, discuss the key concepts and design principles that enabled computer systems to be redesigned and engineered to become the interactive semiotic systems that we use today.
- Use your highlighted statements in the primary source texts (or however you have marked them for yourself) to select those that you find most interesting, revealing, or important to discuss in your post. Include questions and topics that you would like to discuss further in class.
Learning Objectives
- Continuing with the design concepts for networked systems.
- The design principles for information networks: the Internet and World Wide Web. (A brief illustrative sketch of the layering principle follows after this list.)
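As a brief illustration of the layering principle behind these design concepts, here is a small sketch in Python (my own example, not from the course readings; the target host example.com is only illustrative). The Web's application protocol (HTTP) is just structured text handed to the Internet's transport layer (TCP), which the socket API exposes as a reliable byte stream riding on IP.

```python
# Illustrative sketch: HTTP (the Web) layered on top of TCP/IP (the Internet).
import socket

host = "example.com"
request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

# Transport layer: the socket API gives us a reliable byte stream (TCP over IP).
with socket.create_connection((host, 80)) as conn:
    conn.sendall(request.encode("ascii"))     # application layer: HTTP is plain text
    reply = b""
    while chunk := conn.recv(4096):           # read until the server closes the stream
        reply += chunk

print(reply.decode("ascii", errors="replace")[:300])   # status line and headers
```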
Readings
- Introduction to the Design Principles for the Internet and Web.
Video Lessons
- Crash Course Computer Science
- Computer Networks (Introduction) (Lesson 28)
- The Internet (Lesson 29)
- The World Wide Web (Lesson 30)
- Other computer history reference books:
- Thomas Haigh and Paul E. Ceruzzi, A New History of Modern Computing (Cambridge, MA: MIT Press, 2021).
Writing assignment (Canvas Discussions Link)
Class Discussion:
Discussion of final capstone projects.
Post Your Final Project Ideas in Week 14 (Canvas Discussions Link)
Instructions for the Final Version of Your Essay and Posting to Canvas
- Detailed instructions for the Capstone Essay Project [download and print out for reference.]
- Upload the pdf file of your essay either to Canvas or Google Drive, and create a link to your file in your Canvas Discussion post for "Final Projects" [Canvas Discussions Link]. Write the Title of your essay as a heading for your post, and insert your brief abstract below the title. Then below this text information, provide the link to your file (you can use the URL or a short title with the embedded link). Test the link after you save the post to make sure it works; revise and edit if needed.
Due Date for Posting Your Final Project:
- Using your final capstone project after the course: You can use your final capstone essay as part of your "digital portfolio" wherever it can be useful to you (in a resume, LinkedIn, social media, internship applications, job applications, and applications for further graduate studies).