Un projet Labex Arts H2H
2012 - 2016

The Continuum in the Code Base: Computation as Automated Semiosis
By Martin Irvine

Georgetown University, Communication, Culture & Technology Program


[A] sign is something by knowing which we know something more.
All my notions are too narrow. Instead of ‘Sign,’ ought I not to say Medium?
--C. S. Peirce (c.1906)


A symbol is any device whereby we are enabled to make an abstraction….
All abstraction involves symbolization. --Susanne Langer (1962)


Computation is about automating abstractions.
-- Jeannette Wing, “Computational Thinking” (2008)


The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same. --George Dyson, Turing’s Cathedral (2012)


Computation is information processing through transformations of symbolic representations. --Peter Denning, “What is Computation” (2010)


Introduction: A Continuum From Morse to Peirce and Computation
On his way home to New York from Paris in 1832, carrying with him his famous metapainting The Gallery of the Louvre (a view of the Salon Carré), Samuel Morse described how he arrived at his invention of the electromagnetic telegraph:


[O]n my voyage from Europe, [I recalled] the electrical experiment of Franklin...in which experiment it was ascertained that the electricity traveled through the whole circuit in a time not appreciable, but apparently instantaneous. It immediately occurred to me that, if the presence of electricity could be made VISIBLE in any desired part of this circuit, it would not be difficult to construct a SYSTEM OF SIGNS by which intelligence could be instantaneously transmitted. The thought, thus conceived, took strong hold of my mind… and I planned a system of signs, and an apparatus to carry it into effect. [Emphases as in original text]


Everyone learns that Morse’s “system of signs” and the universal adoption of his model of the electromagnetic telegraph mark the beginning of global electronic telecommunications. Morse also designed his telegraph to record its coded electrical pulses automatically as strings of imprinted code (typically on tape), a process he considered his crowning achievement. What is less known is that Morse’s intellectual property became the foundation of AT&T and Bell Labs, the research lab that gave us Claude Shannon’s famous paper, “A Mathematical Theory of Communication” (1948), the transistor (1947-50s), the UNIX operating system for time-sharing computing (late 1960s), the C programming language (early 1970s), and the computing infrastructure used by the Arpanet and by the designers of the Internet’s data-packet protocols. Lesser known still is that Morse began his career as a history painter, became friends with Louis Daguerre in Paris, brought daguerreotype technology to New York, and collaboratively made the first daguerreotype photographs in America in the 1840s, at the same time that he (successfully) lobbied the US Congress to fund the first telegraph line from Washington to Baltimore. Morse was obsessed with code--with symbolic media used for encoding the transmission of meaning--and he had no sense of our recent partitioning of knowledge domains for meaning systems, language, art, communications, mathematics, science, and technology.

There is a “Morse moment”--an integrative view of our symbolic and technical artefacts--now emerging at the intersection of Peircean semiotics, computation, linguistics, communication theory, and the cognitive sciences. Focusing on the convergence of research programs and conceptual models at this intersection of knowledge domains, I will propose a synthesis of promising lines of research that unify the humanistic disciplines with knowledge formation in technical and scientific fields around the key questions of human symbolic cognition, sign systems, meaning generation, and information representation; i.e., the foundations of everything we call code. In short, computation, digitization, and digital media are artefacts of human symbolic cognition and part of a long continuum of technical mediations for human sign systems. The position of estrangement or alienation from computers, computation, and digital media (now a ritual trope in most humanities studies) is not an effect of designs for automating symbolic activities in digital substrates (symbolic and cognitive artefacts), but of very recent ideological and political-economic forces that have hijacked and blackboxed pre-existing symbolic functions.


1 Semiosis Across Material States and Time Sequences:
Translating and Correlating Cognition and Action
C. S. Peirce provides the most useful heuristic model for describing sign functions as correlated material-cognitive processes: processes based on material-perceptible sign-vehicles that co-constitute meaning structures over dynamic states of time in situated uses (semiosis). His model includes descriptions for sign processes that both mark conceptual and abstractive sequences and cause actions on material forms (summarized in figure 1). The logical, conceptual, generative, and abstractive features of sign functions are well known in Peirce’s writings--since he was primarily concerned with explaining logic and mathematics as sign and symbol processes--but in his pragmatist-driven model, signs are motivated by the results that they can produce in communities of use, including actions distributed to machines (e.g., the holes in a punch card for a Jacquard loom). Interpretant sequences of symbolic structures can be compound and complex, combining cognitive-conceptual “growth” with actions and physical responses. Simplifying: the meaning of a set of signs is the response you get--or can cause to be enacted--which, as a moment in the continuum of symbolic possibilities, is likewise representable in (an)other pattern of signs.
The program of action designed to be initiated by a human agent in any app or software module reveals that a material-perceptible interface (screen layout on the pixel-mapped substrate, icons, other affordances for interaction) is part of a symbolic structure that translates meaning generation, agency, and intentionality back and forth to/from unobservable symbolic structures (digital states and their transitions materially instantiated). Digital user interfaces to the internal computational symbolic states are thus designed to correlate meaning, action, and material structures. Software is designed to materially instantiate interfaces between symbolic cognition and agency. Computation allows us to enact the dual functions of semiosis by designating symbols that mean things, and symbols that do things.
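This dual binding--symbols bound to representations that mean things, and symbols bound to procedures that do things--can be caricatured in a few lines of code. The sketch below is purely illustrative; the token, the dictionaries, and the `ring_bell` action are hypothetical names of my own, not drawn from any cited system:

```python
# Illustrative sketch (all names hypothetical): one and the same symbol
# system can bind a token to a value ("mean something") or to an
# executable procedure ("do something").

def ring_bell():
    """A delegated physical action, standing in for any machine effect."""
    return "ding"

# Symbols that mean things: tokens bound to further representations.
meanings = {"SOS": "distress signal"}

# Symbols that do things: tokens bound to executable procedures.
actions = {"SOS": ring_bell}

def interpret(token):
    # The "interpretant" here is the response the sign produces: a further
    # representation, an enacted effect, or (as here) both at once.
    meaning = meanings.get(token)
    effect = actions[token]() if token in actions else None
    return meaning, effect

print(interpret("SOS"))  # ('distress signal', 'ding')
```

The point of the toy is only that the interpreting system, not the token itself, decides whether a sign is read as data or enacted as an action--the dual function of semiosis described above.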


2 Computation, Code, Cognitive Artefacts, and Distributed Cognition
Extrapolating from recent cognitive science literature, computer code and media digitization can be described as the combination of multiple technologies for implementing cumulative symbolic processes that have been accruing in human societies ever since humans became “the symbolic species” (Deacon) with the ability to use both natural language and “exosomatic,” durable, symbolic representations as externalized cultural memory (Donald) for technical mediation over space and time (Stiegler). The material, perceptible, conceptual/cognitive, and intersubjective/interindividual properties of human sign systems are a “cognitive scaffolding” (Clark) that we repurpose for designing technologies, communication media, and methods for material representation in combinatorial and scalable modular systems (Arthur). From this perspective, everything digital and computational--processes, representations, data formats, software, operating systems, network architecture and data protocols--is an artefact of human symbolic cognition, and thus motivated by the meanings, values, and intentions of human communities (including the delegated or assigned agency in automated, unobservable background processes). This view provides the necessary conceptual hacks for constructively intervening in recent debates about interfaces and networks, and in the “magical” blackboxing of symbolic functions in the relentless productization of consumer computational devices.


3 Computation as Automated Semiosis:
Semiosic Substrates and Distributing the Sign

Using Peirce’s description of sign functions as an extensible reference model for sign transitions and transformations over time sequences, I will introduce a group of related concepts: (1) the necessity of a semiosic substrate (or the substrate function) as the generalizable requirement in the material/perceptible component of symbolic structures (a substrate presupposed in Peirce’s Representamen as a stack of material substrate(s) and phenomenologically based percept(s)), and (2) a built-in feature of semiosis enabling us to split and redistribute signs in n instances of de- and re-tokenization in the same or different substrates in space and time (decomposition and recomposition). Programming and software are based on internally chaining what Peirce categorized as indexical and iconic signs (for example, signs pointing to states and physical locations in memory matrices that are used to represent numeric, textual, or image components as recomposable data types in the software flow). In all programming and digitization schemes, these electronically instantiated, materially distributed sign components are designed to “return values” in other symbolic representations (again, combining cognitive-abstractive-conceptual meanings with actions). Because sign tokens (replicable material occurrences) instantiate types (classes, categories, codes, values), we have multiple means for chaining abstractions and correlating them with other material representations, concepts, and actions. Extrapolating from what I think is implicit in Peirce’s attempts to describe the material-conceptual correlates of symbolic activity, we have a productive explanation for describing computation and digitization as mediated symbolic action through motivated sign distributions in re-presentable instances across multiple material substrates (cf. transcoding and remediation).
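The type/token distinction and the indexical pointing described above can be loosely illustrated in code. In this sketch the mapping of Peircean terms onto programming constructs is my own assumption, offered only as an analogy (the class name and values are invented for illustration):

```python
# Illustrative sketch: a type is an abstract class/category; tokens are its
# replicable material occurrences, each instantiated at a distinct location
# in the memory substrate.

class SignType:
    """A symbolic type; each instance is a token of that type."""
    def __init__(self, value):
        self.value = value

t1 = SignType("semiosis")        # one tokenization
t2 = SignType("semiosis")        # another token of the same type
assert t1.value == t2.value      # type-level identity is shared...
assert id(t1) != id(t2)          # ...but the material instantiations differ

# Indexical signs: addresses pointing to where each token "lives" in memory.
print(hex(id(t1)), hex(id(t2)))

# De- and re-tokenization across substrates: the same type distributed into
# a byte substrate and recovered again as a new perceptible token.
encoded = t1.value.encode("utf-8")   # distribute into bytes
decoded = encoded.decode("utf-8")    # retokenize in another context
assert decoded == "semiosis"
```

The encode/decode round trip is the programming-level analogue of splitting a sign across substrates and returning it as a new instance of the same type.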
Since both individual sign units and complex symbolic structures (e.g., connected discourse, musical forms, images, film) are wholly or partially decomposable and recomposable at multiple levels of their constituent types, we are able to produce an unlimited series of retokenizations (new instances of symbolic types, or of sets/classes of types, in other material forms) across shifts in material substrates and technical means of implementation.
The 2D material substrate alone (as a primary “user interface”) has provided a long history of symbolic mediations, remediations, and transitions: from clay and wax tablets to computer tablets; from the codex book (through script and industrial print production transitions) and all kinds of writing supports to ebooks and “pages” on screens; from painting, drawing, and image supports of all kinds to projections and displays on 2D screens. While material semiotics reminds us that all material media are culturally encoded prior to the registration of specific symbolic “content,” the long history of retokenizations and redistributions of sign functions across, in, and through material substrates is an obvious fact. Software and digitization enable the automation of semiosic processes and unlimited retokenization in the design of computer systems for structuring and processing transformations of symbolic states. Visual and acoustic information--encoded to be received and perceived through the material substrates of symbolic surfaces (planar structures, pages, screens, tableaux) and/or structured audible waveforms--can be digitally encoded and represented, but it must then be retokenized in analogue form in material substrates for human perception (except when digital representations are objects designed to be interpreted further by software agents, “running” as automated semioses in electronic memory-matrix substrates, with output transitions that may communicate only with other software processes before returning values in a human-perceptible interface substrate). There is no analogue/digital dichotomy: human perception requires material-perceptible substrates (the substrate function in symbolic form). Analogue and digital representations form a continuum, and regardless of all the transitions in and out of digital processing, symbolic artefacts are designed always to return to a human-perceptible substrate, or to perform delegated actions in other material systems.
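The analogue-digital continuum described above--analogue values sampled into discrete symbolic states and then returned toward a perceptible substrate--can be illustrated with a toy quantizer. All parameters and function names below are arbitrary assumptions chosen for readability, not any real codec:

```python
import math

# Illustrative sketch of the digitization continuum: an "analogue" waveform
# is sampled and quantized into discrete sign tokens (integers), then mapped
# back toward a perceptible analogue amplitude on output.

SAMPLE_RATE = 8    # samples per cycle (assumption, for readability)
LEVELS = 16        # quantization levels (assumption)

def digitize(cycle_fraction):
    amplitude = math.sin(2 * math.pi * cycle_fraction)   # analogue value
    return round((amplitude + 1) / 2 * (LEVELS - 1))     # discrete token

def retokenize(level):
    return level / (LEVELS - 1) * 2 - 1                  # back toward analogue

samples = [digitize(i / SAMPLE_RATE) for i in range(SAMPLE_RATE)]
reconstructed = [retokenize(s) for s in samples]
print(samples)          # discrete symbolic states (digital substrate)
print(reconstructed)    # values destined for a perceptible substrate
```

The round trip makes the continuum concrete: the digital states are meaningful only as stages between one analogue, perceptible instantiation and the next.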
(I am working on a formal model and a software simulation that will help visualize the description of structures here.)


The testable hypotheses that emerge from this new orientation to semiosic processes will require much more elaboration, formalization, and application to symbolic artefacts in all media. But, for me, the theoretical elaboration is not only an academic or philosophical project (as important as it is to get that project right). My 20 years of teaching topics in technology, media, communication, and semiotics convinces me that we need a new integration of knowledge in the human sciences that embraces all knowledge domains. By recovering a “Morse paradigm” for unifying the cultural, technical, and scientific bases of symbolic representation and transmission, we have an opportunity to incorporate important missing knowledge in the human sciences for understanding sign systems, symbolic cognition, computation, and technical mediation. By repositioning symbolic mediation and the computational automation of symbolic actions in this longer continuum of cumulative cultural, cognitive, and social implementations, we have a far more productive way to embrace and incorporate this knowledge in research, theory, and practical expertise. Further, this repositioning enables non-technical people to reclaim ownership over our cognitive symbolic technologies, not as alienating machine products controlled by specialists or as determinist instruments of political economy, but as implementations of core human cognitive-symbolic capacities in social-material situations. A new integrative paradigm thus enables non-technical students and professionals to counter prevailing ideologies, mythologies, and collective misrecognitions that construct computational and digital technologies as nonhuman agencies external to core human faculties or as inhuman/posthuman forces imposed by very recent political-economic conditions.
Computation is a human design for complex, multileveled automated semiosis distributed through digital material substrates and their structured time-state transitions. We use all kinds of computation and digital media processes to represent and instantiate symbols that mean things and activate symbols that do things in the most sophisticated way we’ve yet invented. If “digital humanities” is ever going to mean anything outside an academic enclave, it will only come by way of integrating all relevant knowledge about our cumulative symbolic activities in a continuum of technical mediation and by redefining computation and digitization as artefacts of human symbolic cognition. I firmly believe that this reorientation enables us to reclaim ownership (in all senses of this term) over what has always been ours from the beginning but obscured from recognition in recent instrumental configurations.


Select Representative Bibliography

Computation and Semiotics
Andersen, Peter Bøgh. A Theory of Computer Semiotics: Semiotic Approaches to Construction and Assessment of Computer Systems. Cambridge; New York, NY: Cambridge University Press, 1997.
Denning, Peter J. “What Is Computation?” Ubiquity, ACM, August 26, 2010.
Denning, Peter J., and Craig H. Martell. Great Principles of Computing. Cambridge, MA: The MIT Press, 2015.
Goguen, Joseph. “An Introduction to Algebraic Semiotics, with Application to User Interface Design.” In Computation for Metaphors, Analogy, and Agents, edited by Chrystopher L. Nehaniv, 242–91. Lecture Notes in Computer Science 1562. Springer Berlin Heidelberg, 1999.
Nadin, Mihai. “Information and Semiotic Processes: The Semiotics of Computation.” Cybernetics & Human Knowing 18, no. 1–2 (January 1, 2011): 153–75.
Nöth, Winfried. “Representation in Semiotics and in Computer Science.” Semiotica 115, no. 3/4 (August 1997): 203–13.
Nöth, Winfried. “Semiotic Machines.” SEED 3, no. 3 (December 2003). http://www.library.utoronto.ca/see/SEED/Vol3-3/Winfried.htm.
Rocchi, Paolo. Logic of Analog and Digital Machines. Hauppauge, New York: Nova Science Publishers, 2013.
———. “Ubiquity Symposium: What Is Information? Beyond the Jungle of Information Theories.” Ubiquity, ACM, March 2011: 1:1–1:9.
Smith, Brian Cantwell. “Age of Significance.” 2010. http://www.ageofsignificance.org/aos/en/toc.html.
Sowa, John, ed. Principles of Semantic Networks: Explorations in the Representation of Knowledge. San Mateo, CA: Morgan Kaufmann, 1991.
Sowa, John F. Conceptual Structures: Information Processing in Mind and Machine. Reading, MA: Addison-Wesley, 1983.
———. Knowledge Representation: Logical, Philosophical, and Computational Foundations. Pacific Grove: Brooks / Cole, Thomson Learning, 1999.
———. “Peircean Foundations for a Theory of Context.” In Conceptual Structures: Fulfilling Peirce’s Dream: Fifth International Conference on Conceptual Structures, ICCS’97, Seattle, Washington, edited by Dickson Lukose, Harry Delugach, Mary Keeler, Leroy Searle, and John F. Sowa, 41–64. Berlin; New York: Springer, 1997.
Tanaka-Ishii, Kumiko. Semiotics of Programming. New York: Cambridge University Press, 2010.


History of Code, Semiotics, Symbolic Systems
Brownlee, Peter John, ed. Samuel F. B. Morse’s “Gallery of the Louvre” and the Art of Invention. New Haven: Yale University Press, 2014.
Peirce, Charles S. The Essential Peirce: Selected Philosophical Writings (1867-1893). Volumes 1 and 2. Edited by Nathan Houser and Christian J. W. Kloesel. Bloomington, IN: Indiana University Press, 1992 and 1998.
Peirce, Charles S. Collected Papers of Charles Sanders Peirce, 8 Volumes. Edited by Charles Hartshorne, Paul Weiss, and A. W. Burks. Cambridge, MA: Harvard University Press, 1931-1966.
Prime, Samuel Irenæus. The Life of Samuel F. B. Morse, LL.D.: Inventor of the Electro-Magnetic Recording Telegraph. New York: D. Appleton and Company, 1875.
Shannon, Claude E. “A Mathematical Theory of Communication.” The Bell System Technical Journal 27 (October 1948): 379–423, 623–56.


Cognitive Science, Artefacts, and Technology Theory
Arthur, W. Brian. The Nature of Technology: What It Is and How It Evolves. New York, NY: Free Press, 2009.
Clark, Andy. “Material Symbols.” Philosophical Psychology 19, no. 3 (2006): 291–307.
———. Supersizing the Mind: Embodiment, Action, and Cognitive Extension. New York, NY: Oxford University Press, USA, 2008.
Clark, Andy, and David Chalmers. “The Extended Mind.” Analysis 58, no. 1 (January 1, 1998): 7–19.
Deacon, Terrence W. The Symbolic Species: The Co-Evolution of Language and the Brain. New York, NY: W. W. Norton & Company, 1997.
Donald, Merlin. A Mind So Rare: The Evolution of Human Consciousness. New York, NY: W. W. Norton & Company, 2001.
———. Origins of the Modern Mind: Three Stages in the Evolution of Culture and Cognition. Cambridge, MA: Harvard University Press, 1991.
Dupuy, Jean-Pierre. On the Origins of Cognitive Science: The Mechanization of the Mind. Cambridge, MA: The MIT Press, 2009.
Harnad, Stevan, and Itiel Dror. “Distributed Cognition: Cognizing, Autonomy and the Turing Test.” Pragmatics & Cognition 14, no. 2 (July 2006): 209–13.
Marcus, Gary F. The Algebraic Mind: Integrating Connectionism and Cognitive Science. Cambridge, Mass.: MIT Press, 2001.
Pinker, Steven. “The Cognitive Niche: Coevolution of Intelligence, Sociality, and Language.” Proceedings of the National Academy of Sciences 107, no. Supplement 2 (May 5, 2010): 8993–99.
Winograd, Terry, and Fernando Flores. Understanding Computers and Cognition: A New Foundation for Design. Reading, MA: Addison-Wesley, 1987.
Zhang, Jiajie, and Vimla L. Patel. “Distributed Cognition, Representation, and Affordance.” Pragmatics & Cognition 14, no. 2 (July 2006): 333–41.


Humanities and Social Science Studies

Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: The MIT Press, 2000.
Flusser, Vilem. Writings. Minneapolis, MN: University of Minnesota Press, 2004.
Galloway, Alexander R, Eugene Thacker, and McKenzie Wark. Excommunication: Three Inquiries in Media and Mediation. Chicago, IL: University Of Chicago Press, 2014.
Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago, IL: University of Chicago Press, 1999.
———. How We Think: Digital Media and Contemporary Technogenesis. Chicago, IL: University Of Chicago Press, 2012.
Latour, Bruno. “On Technical Mediation.” Common Knowledge 3, no. 2 (1994): 29–64.
Manovich, Lev. Software Takes Command: Extending the Language of New Media. London; New York: Bloomsbury Academic, 2013.
———. The Language of New Media. Cambridge, MA: MIT Press, 2001.
Stiegler, Bernard. Technics and Time, 1: The Fault of Epimetheus. Translated by Richard Beardsworth and George Collins. Stanford, CA: Stanford University Press, 1998.
———. The Re-Enchantment of the World: The Value of the Human Spirit vs Industrial Populism. Translated by Trevor Arthur. Continuum, 2012.


About the author

Martin Irvine is an Associate Professor and the Founding Director of the graduate program in Communication, Culture & Technology (CCT) at Georgetown University in Washington, DC, where he has taught and held many administrative positions for over 20 years. His research, teaching, and publications span a wide range of fields, including communication and media, computation and the Internet, semiotics, linguistics, philosophy of language, art theory, and technology design theory. Martin first learned computing (UNIX and C) while writing his dissertation on classical, medieval, and renaissance semiotics and textual culture in the early 1980s at Harvard University. Since that time, he has been trying to be at home in both worlds. After learning Internet and Web technologies, he set up the first Web server at Georgetown in 1993, and went on to found the first interdisciplinary graduate program in post-Internet media and communication studies (CCT) in 1995. After many years of developing Web media, Martin also designed and implemented the first online-only graduate course in CCT in 2014. His recent publications include a semiotic and urban-theory study of street art, “The Work on the Street: Street Art and Visual Culture” (chapter in The Handbook of Visual Culture, Berg, 2012), and a semiotic and interdisciplinary study of remix and hybrid culture, “Remix and the Dialogic Engine of Culture: A Model for Generative Combinatoriality” (chapter in The Routledge Companion to Remix Studies, 2014). The current paper is part of a longer book project on semiotics, media, computation, and technology.


Key words

Semiosis; code; symbolic artefact; computation; sign actions.

