Media History

Georgetown University
Graduate School of Arts and Sciences
Communication, Culture & Technology Program

CCTP-5038 History and Philosophy of Computing
Professor Martin Irvine

Spring 2024

Have you wondered about the deeper history of ideas that enabled us to get where we are today in all our computing and data systems? This course provides students with an inside view of the history of ideas that have made computing possible, starting from the earliest beginnings in the 1600s to our contemporary digital electronic systems, digital media, data networks, and AI. To facilitate access to the deeper history, students will learn how to use and interpret earlier original primary documents and earlier examples of technologies archived in libraries and museums. We will study the history of computing from the perspective of the key concepts and design principles that have made computing possible, and not as a story of the inevitable "progress" of machines, products, or innovations. You will discover the essential human ideas and capabilities that motivated the beginnings of computing in the Enlightenment era (1600s-1700s), and that have continued to motivate the designs of computing systems in their various technical implementations down to us today.

Recovering the history of ideas in computing reveals that the key concepts in computing are accessible to everyone, but have accidentally become “black boxed” (inaccessible, closed off from understanding) from two main causes: (1) our modern political economy for technology, with computing tech industries that market closed “products” accessible only to specialists, and (2) the false modern institutionalized division of disciplines and sciences into "math/science/tech" and "humanities/social science/arts" domains of knowledge. We will apply our CCT interdisciplinary approach to “de-blackbox” (make accessible for understanding) the key concepts in computing, and create a truer view of computing that draws from overlapping histories in philosophy, logic, mathematics, science, communications, social contexts, and supporting technologies. Students will work on interdisciplinary research projects, doing their own “deep dive” into the history of concepts and contexts that have led to where we are today.

Syllabus Structure

The course's 14 weekly units are organized around six "Nodal Moments" in the history of ideas and how they became implemented in different stages of the technologies. A "node" in this context means a moment in time and place where people, ideas, social conditions, and developments in underlying technologies intersect and interconnect, like a node in a network that connects many active "links" at the same time. We will also follow how earlier nodes produce links that become connected to further nodes through the whole history, leading right to our present moment.

Conceptual Framework and Methods Used in the Course

Intellectual history (history of ideas). We will study major moments in the history of philosophy, mathematics, logic, and design concepts for technologies, studied in their social contexts and as reinterpreted and re-applied throughout developments in other contexts.

Systems and Design Thinking: learning to understand how everything in computing – past and present – is based on designing complex (multi-part) systems of combined subsystems (modules); that is, as interconnecting components combined and orchestrated in a unifying master design termed an architecture.

Semiotics. Theory of symbol systems (signs, symbols, diagrams, writing, notation for math, logic, and code), symbolic thought, interpretation, and physical representation that underlies all design thinking in computing, math, code, programming.

Archival methods. How to access and interpret earlier, original, primary sources (documents, artefacts, devices, technical documentation) from online data sources (archives, libraries, museums).

Learning Goals and Outcomes

Students will learn the combined methods in our framework for a truer and more complete understanding of the core concepts in computing technologies from their deeper histories of development, and be able to apply this knowledge to their own further research, learning, and career development. By the end of the course, students will be able to apply this interdisciplinary knowledge to many other fields and careers, including design and applications in digital media, and to become capable communicators of "de-blackboxed" explanations of computing, including our complex data and AI systems. Students will be able to explain the reasons behind the design of our contemporary computing systems, and better understand what is possible (and not) in future developments.

Learning archival methods for access to the history in original primary sources is an important skill in many disciplines and career paths. Students with this competency will always be in demand for being able to research, interpret, and apply the historical knowledge gained from direct access to primary sources (original documents, designs, artefacts, earlier technologies) in understanding the foundation of computing systems today, and what can be possible in new future designs.

View and download the pdf syllabus document:
Full description of the course, Georgetown Policies, and Georgetown Student Services.

Course Format

The course will be conducted as a seminar and requires each student’s direct participation in the learning objectives through each week’s class discussions. The course has a dedicated website designed by the professor with a detailed syllabus and links to weekly readings and assignments. Each syllabus unit is designed as a building block in the interdisciplinary learning path of the seminar. For each week, students will write a short essay with comments and questions about the readings and topics of the week (posted in the Canvas Discussions module). Students will also work in teams and groups on collaborative in-class projects and group presentations prepared before class meetings.

Students will participate in the course through a suite of Web-based learning platforms and e-text resources:

(1) A custom-designed Website created by the professor for the syllabus, links to readings, and weekly assignments: https://irvine.georgetown.domains/5038/ [this site].
(2) An e-text course library and access to shared Google Docs: most readings (and research resources) will be available in pdf format in a shared Google Drive folder prepared by the professor. Students will also create and contribute to shared, annotatable Google Docs for certain assignments and dialogue.
(3) The Canvas discussion platform for weekly assignments.

Grades

Grades will be based on:

  • Weekly short writing assignments (posted to the Canvas Discussions platform) and participation in class discussions (50%). Weekly writing must be posted at least 4 hours before each class so that students have time to read each other's work for a better-informed discussion in class.
  • A final research "capstone" project written as an essay or a creative application of concepts developed in the seminar (50%). Due date: one week after last day of class. Final projects will be posted as pdf documents in the Final Projects category in the Canvas Discussions platform.

Professor's Office Hours
To be announced. I will also be available most days before and after class meetings.

Books and Resources

This course will be based on an extensive online library of book chapters and articles in PDF format in a shared Google Drive folder (access only for enrolled students with GU ID). Most readings in each week's unit will be listed with links to pdf texts in the shared folder, or to other online resources in the GU Library.

The required books below are available at the GU bookstore, but, of course, you can get them from online sellers as well. Selections from these books are available as pdfs in the e-text library, but these books are important to have (in print) for ongoing reference and your own annotations.

Required Books:

  • Georges Ifrah, The Universal History of Computing: From the Abacus to the Quantum Computer. New York: Wiley, 2001.
  • Howard Rheingold, Tools for Thought: The History and Future of Mind-Expanding Technology. rev. ed. Cambridge, MA: MIT Press, 2000.
  • Peter J. Denning and Craig H. Martell. Great Principles of Computing. Cambridge, MA: MIT Press, 2015.
  • Doron Swade, The History of Computing: A Very Short Introduction. Oxford: Oxford University Press, 2022.

Course Online Library (Google Drive: GU student login required)

University Resources

Using Research Tools for this Course (and beyond)

  • Required: Use Zotero for managing bibliography and data for references and footnotes.
    Directions and link to app, Georgetown Library (click open the "Zotero" tab).
    You can save, organize, export and copy and paste your references with formatted metadata into any writing project.
  • Required: Internet Archive (archive.org). Sign up for a free account (gets you access to books and documents that can only be read online).
    • The Archive is a rich library of free-access and downloadable books and journals in the history of computing and technology. You will learn how to do your own research in primary documents published before the 1950s.

Introduction to the Methods, Topics, and Learning Goals of the Course

  • Introduction to the History and Philosophy of Computing.

Introductions

  • Professor's Personal Introduction. (Where is this guy coming from?)
  • Student introductions: Who are we? Backgrounds and interests to be considered in developing the course.

Course Introduction: Requirements, Expectations, Orientation

  • Format of course, requirements, participation, weekly assignments, projects, outcomes (see above).
  • Using the course Website syllabus (the main launchpad) and online etext library (shared Google Drive).
    • Why I use custom-designed Websites for courses: teaching philosophy, instructional design, student access to materials.
    • I update and revise this course Website as we study together: check back each week for updates.
  • Classroom rules for PCs and mobile devices: no social media or attention sinks during class.

Using Research Tools for this Course (and beyond)

  • Required: Learn how to use Zotero for managing bibliography and data for references and footnotes.
    Directions and link to app, Georgetown Library (click open the "Zotero" tab).
    You can save, organize, export, and copy and paste your references with appropriate formatted metadata into any writing project.
  • Required: Using Georgetown Library, Main Search Page
    https://www.library.georgetown.edu/
    • Learn how to search and access journals, books, and databases for further study and research.
  • Required: Sign up for a free account on Internet Archive (archive.org) (gets you access to books and documents that can only be read online).
    • The Archive is a rich library of free-access and downloadable books and journals in the history of computing and technology (and many other topics). This is a great resource for learning how to do your own research in primary documents published before the 1960s.

Introduction to the course: methods and main topics (Prof. Irvine, Presentation)

Main Topics and Learning Objectives

This week, students will learn about the conceptual framework, methods, and main topics of the course. Students will be introduced to the design principles for modern computing systems (including forms of digital data and media) so that we can ask important questions about "how and why did we get here," and "what makes all this possible"?

In this introductory week, we will begin framing the major questions which the course will enable all students to answer. In each course unit, students will be introduced to research methods and foundational concepts for doing their own investigations and developing answers and explanations. By the end of the course, students will be able to answer the following questions and explain why they are important for our understanding of computing today.

  • Why is computing (as we know it) and everything digital based on philosophical assumptions that go back 400 years? What are the major ideas in computing that aren't technological in themselves?
  • What is Computation? How long have we humans been doing computation?
  • What is a Computer? What makes a computer a computer? And why should we always say Computer System rather than computer (as an object or product)?
  • What do we mean by automation? What can, and can't, be automated in computer systems? Why is computation, as we understand it in computer system design, called "automatic symbol processing" and "automated reasoning"?
  • How can we "compute" all kinds of human symbolic forms (all our digital media from text and numbers to images and audio) that are not "about" numbers, and not numerical in themselves?
  • What are the historical "roots" (deep histories) and "routes" (paths of development) of computing that have led us to where we are today? What "routes" haven't yet been followed -- well, fully, or at all?

About our required texts and main online sources:
Our required texts and online sources often overlap in their presentation of topics, but this is good: you will read explanations and historical contexts that reflect the different ways the history of computing has been conceived.

Introductory Readings

  • Review: Course Introduction (Slide Presentation: Review)
  • Prof. Irvine, Introduction to the Course. (Print out, save for reference.)
  • Subrata Dasgupta, It Began with Babbage: The Genesis of Computer Science. Oxford: Oxford University Press, 2014. Selections. Read the Prologue, pp. 1-8, for this week.
    • Note: Computing didn't begin with Babbage, but, as we will see, he was the first to work out the many complex details for a mechanical computing system.
  • Georges Ifrah, The Universal History of Computing, skim Part 1, pp. 3-96 (for an appreciation of the deep history of thinking about, and using, numbers and symbols before all this cumulative knowledge began to be applied to computing machines), and read pp. 302-310 in Chap. 6. (Available online in pdf). Pp. 99-154 are a preview of our first historical node.
    • This book is a treasure of deep research and thinking by Ifrah, who is famous for his masterful book, The Universal History of Numbers. The sections are sometimes oddly organized, and contain the translator's comments; but you can easily skip around to the sections that you want to study.
  • Peter J. Denning and Craig H. Martell, Great Principles of Computing. MIT Press, 2015 (in pdf). Introduction.
    • This is a great, accessible introduction to the main principles of computing by leading computer scientists. Many chapters and topics may seem beyond what you're ready for now, but keep reading and re-reading, and we will discuss the main topics throughout the course.

Preview Online Sources: get to know the sources that we will use in the course

  • Go to these online sources this week just to become familiar with the contents. We will use topics in these sources in our course units, and they are great to know for reference (with fuller details and background than Wikipedia).
  • Stan Augarten, Bit by Bit: An Illustrated History of Computers (Web version).
    • Link is to the Table of Contents. Use the drop down menu (BIT BY BIT) in the navigation row at the top.
    • For this week, go to the Preface and Chapter One: Introduction. Browse any other chapters to look ahead, and consult throughout the course.
    • This will be a major online textbook throughout the course.
  • History of Information (Online Encyclopedia of Technology, by Jeremy Norman)
    • This is a richly resourced site. View by "Theme" (Topic Categories).
  • Computer Timeline (Developed by Georgi Dalakov). This is a good resource that combines many solid sources in the history of computing. See the Sitemap for a list of "Timeline Stories" [not sorted].
  • Koichi Arts and Science (Koichi, India).

Video Lessons Online:

Examples for Discussion (Prof. Irvine, in class):

  • Examples of primary documents and artefacts for introducing our methods of study.

Prof. Irvine, Introduction: Key Concepts in Computing (Slides)

Writing assignment (Canvas Discussions Link)

  • Discuss one or two important things that you learned from the readings and video lessons for this week, and write questions that you would like to go over in class. Were the concepts and methods in the "Introduction to the Course" (pdf) clear and understandable? What wasn't? Include questions or topics that you would like to have covered or explained more fully this week and throughout the course.

Learning Objectives and Main Topics

Learning the contexts and concepts for the first mechanical calculating machines, designed by Blaise Pascal and Gottfried Leibniz during the 1640s-1710s. Learning from the original texts, documents, and examples of the machines. Learning about the social and intellectual conditions of the 17th-century "Scientific Revolution" (defined by major developments in mathematics and applied mathematics for scientific observations, instruments, and machines) as the foundations for the 18th-century "Enlightenment" (new developments in all fields of knowledge separate from the authority of the church).

Important historical question:
Why were machines for automating computation (calculation) in decimal numbers first developed in France and Europe from the 1640s-1670s? Why then and there? Why not earlier (or later) or in some other country or region?

"Scientific Revolution" and "Enlightenment" Era
In learning about the context of this historical period, we will use these two terms "Scientific Revolution" and "Enlightenment Era" just as conventional labels for what is really a very complex, and contested, period of history. Our focus will be on what converged to make the very conception of computation and design models for machines possible at all, and possible to implement in actual physical materials.

Pascal is important for his contributions to geometry, mathematics, scientific experiments, the philosophy of science, and for conceiving, designing, and building the very first mechanical arithmetic machine (when he was 19 years old). His machine is notable for a number of "firsts" in history, but we will study it for the major conceptual leaps that made it possible to automate human abstract, conceptual structures (numbers represented in symbols) in physical components.

Leibniz is important for many contributions to the history of human thought (the calculus, systems of notation), but we will consider his work on the theory of signs and symbols, his design concepts for an arithmetic machine (mechanical calculator), and his development of the binary (base 2) number system.
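To make Leibniz's base-2 idea concrete, here is a minimal illustrative sketch in Python (my own example, not part of the assigned readings): any whole number can be written using only the two symbols 0 and 1, by repeated halving and keeping the remainders.

    # Illustrative sketch (not from the course sources): Leibniz's base-2 idea.
    # Any non-negative whole number can be written with only the symbols 0 and 1.

    def to_binary(n: int) -> str:
        """Convert a decimal integer to its base-2 numeral by repeated halving."""
        if n == 0:
            return "0"
        digits = []
        while n > 0:
            digits.append(str(n % 2))   # the remainder is the next binary digit
            n //= 2                     # halve, as in Leibniz's binary tables
        return "".join(reversed(digits))

    def from_binary(bits: str) -> int:
        """Interpret a string of 0s and 1s as a decimal integer."""
        value = 0
        for bit in bits:
            value = value * 2 + int(bit)  # each place is a power of 2
        return value

    for n in range(9):
        print(n, "->", to_binary(n))      # 0 -> 0, 1 -> 1, 2 -> 10, 3 -> 11, ...
    print(from_binary("1101"))            # 13

The same two-symbol principle reappears, in electrical form, in the later units on Boole, Morse, and Shannon.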

What difference does it make?
How does it change (does it?) your understanding of the history of computing to have your own direct access to original primary documents and views of the early machines?

Readings:

Windows Into the Intellectual World of Pascal and Leibniz: Important Primary Texts
View and Survey the Contents of these Famous Books

  • René Descartes, Geometry (Part of Discourse on Method) (1637) (Wikipedia background)
    • Original Text (in French) (Internet Archive)
      The link opens on the pages for Chap. 2: page through to p.318, p.356, p.370. Descartes calls his drawn lines in a diagram a "machine" for working out demonstrations. This is the traditional method with a ruler and compass.
    • Parallel Text edition (original French and English translation)
      Note: Descartes' combination of algebra and geometry, and his use of "paper machines." He is using an "analog" computing device. The translator renders the term as "instrument."
  • Christiaan Huygens, The Pendulum Clock, or, Geometrical Demonstrations Concerning the Motion of Pendula as Applied to Clocks (1673) (Wikipedia Background)
    • Original Latin edition (1673) of The Pendulum Clock (Horologium oscillatorium; sive, De motu pendulorum ad horologia aptato demonstrationes geometricae)
      The link opens on the page with the illustration of the clock with gear components and pendulum. The Latin word horologium (and French horloge) comes from the Greek (via Latin) word meaning "time counter, time reckoner."
    • English Translation (Blackwell, 1986). Go to the page with Huygens' illustration of the clock, and read a few pages of his description of the geometrical-mathematical principles of the clock design.
    • Note: The translation of arithmetic and geometry into numbers of gear teeth, wheel diameters, and a mesh-work system of components for the design of clocks and watches was already a well-known Art (a practiced craft, trade, technique). 17th- and 18th-century mathematicians described the formal (abstract, mathematical) rules and laws for time-keeping and navigation instruments. Pascal obviously saw how this technology could be re-purposed and redesigned for a calculating machine.
  • Isaac Newton, Universal Arithmetic: Or, A Treatise of Arithmetical Composition and Resolution (1707) (Wikipedia Background)
    • Original Latin Text (Arithmetica universalis, first printed in 1707) (Internet Archive)
    • English Translation (1720): (Google Books) and (Internet Archive)
      Go to p.1 for the definition of "Computation," and survey the presentation of arithmetic and algebra. (Newton sums up the accepted knowledge of his time).
    • Newton is famous for much more (optics, the theory of gravity, applied mathematics in many fields), and for his important work, The Mathematical Principles of Natural Philosophy (1687, and many later editions), which is usually cited as the embodiment of Enlightenment mathematics and science. He and Leibniz were rivals, and engaged in a long, fruitless dispute over who first discovered and formalized the notation (mathematical symbols) for the calculus.

Examples for discussion in class

Writing assignment (Canvas Discussions Link)

  • Choose one of these questions to write about with references to the readings and to one of the primary source texts assigned for this week (as above):
  • What did you find most interesting, challenging, or difficult about studying the ideas and intellectual and technical contexts for this period?
  • How is Leibniz's philosophy of symbols connected to his view of reasoning (rational thinking), mathematics, and necessary ("mechanical") thinking processes?
  • Could you see how the concepts, assumptions, and ways of thinking with mathematics exemplified in the primary texts by Descartes, Huygens, and Newton would have been understood and applied by Pascal and Leibniz for conceiving and designing an arithmetic machine?

Learning Objectives

Learning the essential concepts and design principles of computing systems by studying major original documents and historical examples of the first calculating machines.

What can we learn about the key concepts in computing from studying the primary sources for the design principles of the earliest arithmetic machines?

Readings and background:

  • Prof. Irvine, ed., Primary Sources, Part 1. [Online pdf of Primary Sources: Part 1].
    • Read the selection of texts on Pascal's and Leibniz's Arithmetic Machines.
      We will study the Binary (base 2) Number System in the following historical Node (connecting Leibniz's work with that of George Boole).
  • Prof. Irvine: Introduction to Node 1: Background and Sources (Slides).
    Review the last 5 slides for the underlying design concepts for Pascal's and Leibniz's Arithmetic Machines.
  • Calculating in Pre-modern Times: Online Exhibition by the Arithmeum, Bonn, Germany.
    • Scroll through the examples up through the 18th Century.
  • History of Information (site), Calculator and Computer Design (Ancient to Modern)
    • See the historical context of Pascal's and Leibniz's work, 1642-1705 in a timeline view.

Video Lessons: Visualizing How the Design Principles Were Implemented

  • View these videos for the design and operations of the machines from our point of view:
    can you see how each component and its linked actions represent a technical solution to a design problem? How can we map physical instances of arithmetical numbers onto the physical structures of components, together with the operations that we apply to them (addition, subtraction, ...), in a system that "does arithmetic" (from human inputs) and is not just a "box of parts"? (See the illustrative sketch after this list.)
  • How Pascal's Arithmetic Machine Works (Mechanical Computing)
  • Pascal's Machine: Digital Simulation (Dresden State Art Collections, Milestones of Knowledge)
  • Pascal's Machine (Arithmeum, Bonn; in German but you can follow.)
  • Leibniz's Machine (Arithmeum, Bonn; in German but you can follow.)
  • Leibniz's Machine (Exhibition and video, Arithmeum, Bonn)
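To help visualize the mapping asked about above (numerals onto component positions, operations onto movements), here is a hypothetical Python sketch; the class and method names are my own, not drawn from the primary sources. It models a row of ten-position counting wheels with a carry that ripples to the next wheel, roughly analogous to the dials and carry mechanism (the sautoir) in Pascal's machine, and it previews the principle of "correspondence mapping" in the writing assignment below.

    # Hypothetical sketch: decimal numerals are represented by the positions of
    # ten-position wheels, and addition is performed by advancing wheels and
    # propagating carries -- roughly analogous to the counting wheels and carry
    # mechanism in Pascal's machine. Names and structure are illustrative only.

    class DialRegister:
        def __init__(self, num_dials: int = 6):
            # Each dial holds one decimal digit (0-9); index 0 is the units dial.
            self.dials = [0] * num_dials

        def add_digit(self, position: int, amount: int) -> None:
            """Advance one dial tooth by tooth, carrying into the next dial
            each time the dial passes from 9 back to 0."""
            for _ in range(amount):
                self.dials[position] += 1
                if self.dials[position] == 10:           # the dial completes a revolution
                    self.dials[position] = 0
                    if position + 1 < len(self.dials):
                        self.add_digit(position + 1, 1)  # the carry advances the next dial

        def add_number(self, n: int) -> None:
            """Enter a whole number by advancing each dial by the matching digit."""
            for position, digit in enumerate(str(n)[::-1]):
                self.add_digit(position, int(digit))

        def reading(self) -> int:
            """Read the result shown on the dials, most significant digit first."""
            return int("".join(str(d) for d in reversed(self.dials)))

    register = DialRegister()
    register.add_number(728)
    register.add_number(349)
    print(register.reading())   # 1077 -- the carries ripple just as in the machine

The point of the sketch is the mapping itself: a dial position stands for a numeral, and a mechanical movement stands for an arithmetical operation.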

Case Studies: Pascal's and Leibniz's machines in the original documents

  • In addition to the texts and images in the Primary Sources, Part 1 pdf, you can also access and think about the primary documents in online digital versions:
  • Pascal, Dedicatory Letter and Information for Users for his Arithmetic Machine (1645). Digital copies of the texts that Pascal had printed for his machine.
  • View the description and illustrations of Pascal's Arithmetic Machine from the original printing of the Encyclopedia

Writing assignment (Canvas Discussions Link)

  • Using the primary documents and demonstrations of details in Pascal's and Leibniz's machines (videos and illustrations in the readings), describe how the two early designers used the principle of "correspondence mapping" for numeral symbol representations and arithmetical operations in a system of physical components. Select one or two examples of components that show the features for physically registering numerals and for conducting actions corresponding to arithmetical operations. Draw your own diagrams with labels and captions, if this helps and makes the learning more fun. Can you also see how "user agency" is designed into the system for directing input numeral representations to output numeral representations? Explain, as best you can so far, the "why" and "how" of the designs. Ask questions that we can follow up on in class.

Learning Objectives and Main Topics:

Learning the key concepts, computational models, and design principles that Charles Babbage found ways to implement in the physical components available in England and Europe in the 1820s-1840s.

Background: A Computing System, From Design to Implementation

We will study Node 2 in two steps (as before): a week devoted to the main ideas and design concepts, followed by a week on the technical implementations and interpreting the primary sources that document the developments.

This week, you will learn the basic background on Babbage's great "conceptual leaps" that enabled him to come close to conceiving a "general purpose" mechanical computing system, and the ideas that he attempted to realize in actual working machines.

Next week, we will focus on the design principles and technical implementations in Babbage's Engines, and study the descriptions and commentary on the machines in the primary sources.

Readings: Background and Primary Sources

Writing assignment (Canvas Discussions Link)

  • Discuss two or three of the key concepts in Babbage's method and design for a mechanical computer system. Can you explain the significance of the motivation for his design and implementation of the Difference Engine (computing error-free tables of pre-calculated numbers)?

Learning Objectives and Main Topics

Learning the fundamental design principles for Babbage's Analytical Engine from the original papers, diagrams, Ada Lovelace's explanations, and actual versions of Babbage's machines.

Case studies with primary sources: Babbage's Engines and the context of applied mathematics. Lovelace's interpretations, diagrams, and algorithm: symbols and operations (two orders, or classes, of symbols requiring homologous mapping in system components), the beginnings of the concepts of data and programs, and diagrams of the Analytical Engine.

Terminology:

  • Analytical: computations that use algebra and mathematical functions, and not only basic arithmetical operations.
  • Engine: a designed device that embodies mathematical and/or mechanical principles, intended as an aid to human capabilities and/or for labor-saving (as in reducing the mental labor of calculations).

Readings and Primary Sources: Deeper into the history of ideas and machines

Technical and Theoretical Background: Design and Implementation

Video Lessons: Historical Background and Observing Machine Operations

Writing assignment (Canvas Discussions Link)

  • From your study of the primary documents, designs, and examples of Babbage's engines, what seemed the most interesting in our context? What were the limitations of the mechanical computing system in comparison with modern electronic computers? What kinds of assumptions and design principles continue from Pascal and Leibniz?
  • Go to your individual assignment topics (Google Doc).
  • Hint: copy and paste directly from the online pdf of the Primary Sources document for citing and quoting statements to illustrate or support your points in your post.

Learning Objectives and Main Topics

Learning the foundational ideas for symbolic systems, code, and applications to electronic systems in the mid-19th century.

We can trace the strands of Leibniz's "mechanical thread" of symbols (in rule-governed sequences of abstract symbols and signs for operations and relations) as they connect to design concepts for computable, symbolic notation systems -- for both human computers, and then for the technical designs of physical systems.

This week, we will use Samuel Morse's design for an electronic telegraph code as a case study for understanding how switched electrical states can be designed symbolically as a "system of signs," a code.
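As a concrete illustration (my own sketch, not from the readings), a fragment of the International Morse alphabet can be written as a simple lookup table in Python, showing how typographic symbols are mapped onto sequences built from only two elementary signal states (short and long pulses of switched current):

    # Sketch of a "system of signs": each letter is mapped to a sequence built
    # from two elementary signal states (short "." and long "-" pulses of current).
    # Only a fragment of the International Morse alphabet is shown here.

    MORSE = {
        "A": ".-",   "B": "-...", "C": "-.-.", "D": "-..",  "E": ".",
        "H": "....", "I": "..",   "O": "---",  "S": "...",  "T": "-",
    }

    def encode(message: str) -> str:
        """Encode letters as Morse signal sequences, separated by spaces."""
        return " ".join(MORSE[ch] for ch in message.upper() if ch in MORSE)

    print(encode("SOS"))    # ... --- ...
    print(encode("IDEA"))   # .. -.. . .-

Note that the code is a convention of interpretation: nothing in the electrical pulses themselves "contains" the letters.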

Next week, we will study the important parallel and overlapping histories during the same decades in the mid-19th century. These ideas later converged in the foundational concepts for modern electronic computing: continued developments in electronic signal design in telecommunications, further developments in computing, George Boole's symbolic logic for binary symbols and operations, and C. S. Peirce's development of semiotic theory.

Binary logic and further developments in mathematical logic had become formalized in symbolic notation systems by the 1930s-40s, just when it was becoming technically feasible to employ electrical circuits as the material substrates for representing data and operations (as Babbage, Lovelace, and others defined). With a further development of Morse's breakthrough concepts for encoding typographic symbols in electrical units, there was another "just in time" convergence of philosophy, math, logic, and the technical means for automating computation in a system of electronic components. By the late 1940s and through the 1950s, we see the foundations of modern computing -- encoding symbols that mean (data) combined with encoded symbols that do (programs) in an overall design for a controlled, time-sequenced, automated system.
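As a small follow-on illustration (my own, using today's Unicode/ASCII conventions as a stand-in for the many historical character codes), the sketch below shows the general pattern described above: a typographic symbol is assigned a number, and that number is represented as a fixed-length pattern of binary bits that can be registered in two-state electronic components.

    # Illustrative sketch (modern Unicode/ASCII conventions used as a stand-in):
    # typographic symbols are assigned numbers, and those numbers are written as
    # fixed-length bit patterns -- "symbols that mean" held in two-state components.

    def char_to_bits(ch: str, width: int = 8) -> str:
        """Return the character's code point as a fixed-width bit string."""
        return format(ord(ch), f"0{width}b")

    for ch in "CCT":
        print(ch, "->", char_to_bits(ch))
    # C -> 01000011, C -> 01000011, T -> 01010100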

Readings:

Primary Sources

  • Samuel Irenaeus Prime, The Life of Samuel F. B. Morse, Ll. D., Inventor of the Electro-Magnetic Recording Telegraph (New York, 1875). (Internet Archive) (you can download the book in pdf for reference)
    • Go to pp.251-53, where Prime records Morse's account of his first ideas for a "system of signs" that used switched electrical current.
    • From the 1840s on, Morse and other inventors were involved in a long patent dispute about who "first" invented the electrical telegraph. This chapter records the issues and background history of the dispute.
    • This account of the first "system of signs" is from Morse's letter to the Secretary of the Treasury and the US Congress, published in 1837. The government solicited proposals for building a US telegraph system, and Morse's design and plan was one of those proposed. Morse received the government funding.
    • See the original document with Morse's letter (Internet Archive).
  • Samuel Morse: Primary Documents (Morse Papers, Library of Congress)

Video Lesson

  • Background History: Electrical Signals and Morse Code (Art of the Problem series)
    • This video is good on the background technical history, but it doesn't go further to explain how the basic concepts in Morse Code became the foundation for everything in binary electronic code. We will connect the dots (!).

Backgrounds for Museum Visits

Writing assignment (Canvas Discussions Link)

  • In preparation for our museum visits, discuss the key concepts in Morse's discovery of a symbolic code based on switched states of electrical current, and how these basic concepts became extended to our modern binary electronic systems. Try out your own application of Peirce's terms and concepts for Morse's "system of signs" in electronic code.

Learning Objectives and Main Topics:

This week, students will learn the intellectual-historical background for the applied mathematics and logic for "shaping signals" in electricity: the engineering design solutions to the core semiotic problems for symbolic representation in electronic communications and for logic in digital electronic computer systems.

Background:
Morse's proof of concept for a "system of signs" for an electronic code mapped to switched circuits had two important consequences in the history of ideas for our modern computing technologies. Electrical engineers (Hartley, Nyquist, Shannon), who worked for telecommunications companies, developed mathematical models for electrical signals and for switched network connections. (1) They discovered how to apply the abstract, formal concepts from logic and mathematics (Boole, Peirce, and others) to describe connected circuits in telegraph and telephone switched circuits (telecom networks), which led to using combinations of switched circuits designed for representing and performing sequences of Boolean logic in computations; and (2) they developed mathematical models for electronic "information" in structured ("shaped") signals, as design solutions for using electricity (and radio waves) in regulated patterns that could be used to encode messages (sequences of symbols) from sending to receiving points. The applied math and logic in electrical engineering led to Shannon's formalization of the binary bit as the minimal reliable unit for encoding and decoding in controlled electronic states. The binary bit formalizes Boole's binary (one of only two possible values) true/false, 1/0, symbolic representation in a context of interpretation.
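To make the phrase "combinations of switched circuits designed for representing and performing Boolean logic" concrete, here is a small illustrative sketch of my own, using the modern convention discussed in the Shannon reading below (1 = closed circuit/current flows, 0 = open circuit/no current): two switches wired in series behave as Boolean AND, and two switches wired in parallel behave as Boolean OR.

    # Illustrative sketch of the circuit-logic mapping (modern convention:
    # 1 = closed circuit / current flows, 0 = open circuit / no current).
    # Switches in series realize Boolean AND; switches in parallel realize OR.

    def series(a: int, b: int) -> int:
        """Two switches in series: current flows only if both are closed (AND)."""
        return a & b

    def parallel(a: int, b: int) -> int:
        """Two switches in parallel: current flows if either is closed (OR)."""
        return a | b

    print(" a b | series (AND) | parallel (OR)")
    for a in (0, 1):
        for b in (0, 1):
            print(f" {a} {b} |      {series(a, b)}       |       {parallel(a, b)}")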

The developments in philosophy, logic, mathematics, calculating machines, electricity, and electrical communications from the 1830s to the 1930s converged in another unanticipated nodal moment. Morse's basic design concepts for electrical switched states as a representational system for "a system of signs," plus Boole's ideas for modeling logic with a two-value binary system, and Claude Shannon's application of Boolean symbolic logic to networks of telecom switches -- these ideas and applications in electronic technologies became the foundation of modern binary electronic computing and all forms of digital media (text characters in all languages, graphics and images, photos and video, audio).

Using our methods and key concepts, we will develop a historical case study through this important path of ideas and technologies:

  • From Morse's concept of a "system of signs" used for telegraph code (controlled switched signals), which came into international use (1830s-1930s),
  • Through the philosophy of representation, signs, meaning, and the discovery of how to apply Boolean logic to electrical circuits developed by C. S. Peirce (1890s-1910),
  • To Claude Shannon's development of systematic formulas for circuits based on Boolean symbolic logic, and then Shannon's mathematical concept of binary information (the bit) (1930s-50s).
  • These two key concepts for binary electronic logic switches and binary units of information became the essential foundation for modern computing as we know it (1950s-present).

Background Reading

Primary Sources

  • George Boole: Primary Documents. See the first editions of his important books in pdf digital copies (Internet Archive). (View briefly and download, if you'd like your own pdf copies.)
  • Excerpts from Boole's original works: The Mathematical Analysis of Logic (1847) and The Laws of Thought (1854). Download, survey the contents, read the highlighted pages.
    • I prepared these excerpts and highlighted sections so that you can see the important statements in Boole's original works. Don't worry about the difficulties; we will discuss in class. What became known later as "Boolean logic" in computing is a reduced version of Boole's whole program for a symbolic algebra of logic.
    • Boole was a member of the same circle of Cambridge mathematicians as Charles Babbage and Augustus DeMorgan (Ada Lovelace's math teacher). Boole and the Cambridge circle followed Leibniz in extending the methods of abstraction and the use of symbols in algebra to develop notations for symbolic logic. DeMorgan's famous book, Formal Logic, or, the Calculus of Inference [see on Internet Archive], and Boole's The Mathematical Analysis of Logic were both published in the same year (1847). Both were major influences on C. S. Peirce, who wrote the first papers in America extending Boole's logic in the 1870s, and founded the American tradition of symbolic logic and semiotics. (More next week.)
  • Claude Shannon, A Symbolic Analysis of Relay and Switching Circuits, MIT Thesis (1938/1940). Copy of original typed thesis: excerpt of Part 1, and References. (Link to complete text.)
    • See the IEEE Journal Article version (1938). This is the version of Shannon's thesis work that became well-known in the engineering community.
    • Note: In describing circuits, Shannon was using electrical engineering concepts for "impedance" (resistance or hindrance of the flow of electricity at the contact points of a circuit), so he initially used "1" for an open circuit (i.e., full "hindrance," and no connection) and "0" for a closed circuit (i.e., where there is a connection and no hindrance in the circuit.) This is one way of mapping Boole's "Everything or Nothing" "1/0" values. Engineers and mathematicians soon revised the notation (symbolism) into the form we use today: "1" represents a connected (closed) circuit = the "on" state (electricity flows through the switched contact point); "0" represents an open circuit = the "off" state (electricity does not flow through the switched contact point).
    • Shannon didn't know (and most people still don't) that C. S. Peirce drew the first diagram for applying Boolean logic to circuit switches in 1886. Peirce diagrammed what we now call AND and OR gates in the circuit logic for all processor units. (More next week on Peirce.)

Background on Key Concepts in Logic and Computing

Video Lessons: How Boole's Binary Algebra of Logic is Applied in Computing

Class Discussion: Step by step discussion and explanation of sources.

Writing assignment (Canvas Discussions Link)

  • Describe the key concepts that connect Morse's design for "a system of signs" for an electronic code based on switched states of electrical current, the formal developments of binary logic in Boole's work, and Shannon's application of Boolean symbolic logic for mapping switched circuits in telecommunications. How did the basic ideas that were formalized (given a symbolic notation) in this "nodal moment" in our history provide the foundations for designing digital electronic computing systems?

Learning Objectives and Main Topics:

The development of modern digital electronic computing systems. Information theory and computation based on binary systems. Underlying semiotic system designs, and applied logic and mathematics in homologous systems. Models of computation in the works of Alan Turing, John von Neumann, and system designers in the 1940s-50s.

Introductions, Background

Video Introductions and Lessons

Writing assignment (Canvas Discussions Link)

Learning Objectives and Main Topics:

 

Readings

  • Primary texts by modern computer system designers.

Writing assignment (Canvas Discussions Link)

Learning Objectives and Main Topics:

What were the big conceptual and technical leaps in computing that enabled the kinds of interactive, multimedia systems that we know today?

How did we go from large, "number crunching" computer systems, designed for specialists, to designs for "general symbol processing systems" that could be used by anyone? How did "computation" become more than just applications for math and science? How were the concepts for all digital data types (text, images, audio) developed, and how did they become today's "multimedia"? How were the designs for "interactive" computing with ongoing user inputs and controls for software first conceived and developed?

Readings and Video Introductions:

Writing assignment (Canvas Discussions Link)


Learning Objectives:

Learning the key design concepts and developments in underlying technologies in the interactive computing Node: Doug Engelbart's "Augmenting Human Intellect" lab at the Stanford Research Institute (SRI), and the research and development at Xerox PARC that applied Engelbart's concepts to smaller, networked, interactive computers that everyone from children to office workers and academics could use. What can we learn about the key ideas in the primary documents, events, and artefacts of design that are still with us today?

Readings and Video:

  • Primary documents on Interface Design: Doug Engelbart, Proposal for the Augmenting Human Intellect Project (1961)
  • Xerox PARC: Alan Kay and designs for interactive computers that anyone can use

Writing assignment (Canvas Discussions Link)

Learning Objectives

  • The design principles for information networks: the Internet and World Wide Web.

Readings and video Lessons

  • Crash Course Computer Science

Writing assignment (Canvas Discussions Link)

Class Discussion:

Discussion and presentation of final capstone projects.

Post Your Final Project Ideas in Week 14 (Canvas Discussions Link)

Instructions for the Final Version of Your Essay and Posting to Canvas

  • Detailed instructions for the Capstone Essay Project [download and print out for reference.]
  • Upload the pdf file of your essay either to Canvas or Google Drive, and create a link to your file in your Canvas Discussion post for "Final Projects" [Canvas Discussions Link]. Write the Title of your essay as a heading for your post, and insert your brief abstract below the title. Then below this text information, provide the link to your file (you can use the URL or a short title with the embedded link). Test the link after you save the post to make sure it works; revise and edit if needed.

Due Date for Posting Your Final Project:

  • Using your final capstone project after the course: You can use your final capstone essay as part of your "digital portfolio" wherever it can be useful to you (in a resume, LinkedIn, social media, internship applications, job applications, and applications for further graduate studies).