
0x003 Report II ::: Algorithms and Algorismus

Document compiled from salon notes, interpretation & additional theory.

Published on Dec 09, 2020

The 0x Salon is an experiment in reformulating the salon context as a venue for post-disciplinary exchange.

The purpose of this text (and others to follow) is twofold: to provide some written documentation for the recent 0x Salon discussions on Algorithmic Realism and as an opportunity to further develop concepts arising in conversations at and around salon topics. We aim to capture the spirit of early-stage salon ideation whilst bridging the gap from speculative & generalistic discourse to substantive analysis, critique, synthesis and extension of the concepts in source materials and salon discussions.

This is a public draft / living document - we welcome your comments and critiques!

This is the second in a series of articles on Algorithmic Realism. Access the entire collection here.

0x003 Event Information

0xSalon003 (a) & (b) ::: Algorithmic Realism
a) Tuesday 14th April 2020 ::: Praxis Branch ::: Virtual
b) Thursday 27th August 2020 ::: Theory Branch ::: Berlin

Algorithms and Algorismus

The definitional scope of algorithms has crept well beyond the historical conception of formulaic, sequential and deterministic procedures consisting of a set of instructions. The formalisation of computing from a mathematical viewpoint by Church and Turing is partly responsible for this antiquated and constrained description, rooted as it is in the perspective of a (solo, analogue) mathematician equipped with a simple toolbox of mental arithmetic.

A photograph of Al-Khwārizmī’s Hisab al-jabr w’al-muqabala, 9th Century CE


“‘Algorithm’ derives from the late medieval Latin ‘algorismus,’ from the name of the Islamic mathematician Al-Khwārizmī, whose manuscripts in Arabic described the Indian system of arithmetic. The word’s meaning eventually developed into the technical definition employed in computer science today.”

https://phenomenalworld.org/analysis/long-history-algorithmic-fairness#fnref1:3 (edited)

To emphasise the way in which this mathematical notion of algorithms is rooted in pre-Modern/early Modern rationality, we will use the historic term algorismus. (This is a novel and somewhat idiosyncratic formulation, and none of the texts we quote use this terminological distinction between algorismus and algorithms.) For the purposes of this article, we can divide conceptions of the algorithm into two general categories. The first is an arithmetic or mathematics-oriented notion of an algorithm, and the second is oriented around statistics and aggregate scale.

“In the 12th century and for a long time thereafter the spelling “algorism,” with an “s,” meant the rules and procedures for using the nine Hindu-Arabic numerals 1, 2, 3, 4, 5, 6, 7, 8, 9 and the cypher (Arabic “sifr”) 0, though the actual shapes of these characters were different in those days. “Algorism” therefore referred only to a very small collection of algorithms in our modern sense”

John N. Crossley, Alan S. Henry, “Thus spake al-Khwārizmī: A translation of the text of Cambridge University Library Ms. Ii.vi.5”

Algorismus is mathematical in the sense that it operates in a procedural and sequential manner. In a key computer science textbook, a succinct proposed definition is presented alongside a simple example. The “sorting problem” poses a scenario where a sequence of numbers must be programmatically reordered.

“We will see several ways to solve the sorting problem. Each way will be expressed as an algorithm: a well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output.

We express algorithms in whatever way is the clearest and most concise. English is sometimes the best way.”
Thomas H. Cormen et al., Introduction to Algorithms, 2nd ed. (MIT Press, 2001), 5
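The textbook's opening example of such a "well-defined computational procedure" is insertion sort. A minimal Python sketch of it, to make the procedural, step-by-step character of algorismus concrete:

```python
def insertion_sort(seq):
    """Solve the sorting problem in the 'algorismus' sense:
    a sequential, deterministic procedure mapping input to output."""
    a = list(seq)  # copy so the input sequence is left untouched
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # shift elements larger than key one position to the right
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a

print(insertion_sort([31, 41, 59, 26, 41, 58]))  # → [26, 31, 41, 41, 58, 59]
```

Every step here can be followed and verified by hand, which is precisely the property that, as argued below, modern learned algorithms lack.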

Calculating Table by Gregor Reisch, Margarita Philosophica, 1508


Though it is a frequent point of discussion in scholarship on the history of computing, the original usage of the term ‘computer’ is worth repeating: it referred to the person (often a woman) who performed calculations by hand. And while computing today is automated, human labour still plays an indispensable role in producing computers.

“In the nineteen-eighties, Apple called its headquarters the Robot Factory. “To understand the electronics industry is simple: every time someone says ‘robot,’ simply picture a woman of color,” Hyman advises. One in five electronics companies used no automation at all, and the rest used very little.”
J. Lepore, “Are Robots Competing For Your Job?”

Any labour that follows a procedural sequence and involves the use or manipulation of material signifiers should be considered algorismus. The activities involved in both algorismus and algorithms have been transformed by recent advances in computational technology and their implementation in society. The two are intimately intertwined, despite the lack of attention given to the former. Algorismus seems to have faded from mainstream discourse in favor of a focus on big data and machine learning. But recently there’s been a surge of interest among scholars and activists in uncovering the role of labour in algorithmic technology and highlighting the ways that labour is invisibilised within algorithmic systems. This rediscovery of algorismus within algorithms is exemplified in the notion of ‘ghost work’:

“Beyond some basic decisions, today’s artificial intelligence can’t function without humans in the loop. Whether it’s delivering a relevant newsfeed or carrying out a complicated texted-in pizza order, when the artificial intelligence (AI) trips up or can’t finish the job, thousands of businesses call on people to quietly complete the project.”

Mary L. Gray, “Ghost Work”, Houghton Mifflin Harcourt, 2019.

The relationship between algorismus and algorithms is characterized by extimacy – a portmanteau of ‘exterior’ and ‘intimacy’ coined by Lacan. Like a Möbius strip, algorismus is the internal part of an algorithm that is exterior to it. When algorismus is obscured, the remaining algorithm appears mystical and incomprehensible. Structuralist thinkers such as Lacan have shown how notions of truth and meaning emerge from complex intersubjective social relations. When the social relationship at the core of machinic computation is obscured, meaning is obscured along with it. Non-human machines know no meaning.

Mathematicians living in the Middle Ages and early Enlightenment understood algorismus as a process of producing clarity and knowledge. This practice of using mathematics to understand the world has remained relatively constant from the Middle Ages to modernity. By contrast, we use the term algorithm to describe the modern breed of programs that obscure information and introduce uncertainty and incomprehensibility into computing. Algorithms mystify the people they operate upon, including their creators. Expert technicians are unable to follow, characterise or formalise how such algorithms solve problems. Modern algorithmic systems that organise sensing, perception and the ingress of inputs, and that operate on those inputs through contingent, non-sequential processes, no longer fall under the strict umbrella of algorismus. In turn, this historical notion of “algorithmically” solving problems, hitherto central to computing, is liable to itself be obsoleted.

Crawford & Joler, Anatomy of an AI, 2018. High resolution copy available at https://anatomyof.ai/


Today, these reflexive learning systems are acting inductively in the wild, with increasingly noticeable impacts on the outside world. These models can absorb huge amounts of information from their environments in sophisticated ways through sensory technology, APIs, and public and private datasets. Real-time scraping of data feeds from financial markets and meteorological measurements means that these algorithmic systems are responding immediately to stimuli across a tremendous diversity of information types. The programs that constitute these models and meta-models are essentially writing themselves, or at least their authorship is not traceable to a human sovereign. For example, the designers of the Facebook newsfeed algorithm could not draw up a full formal specification of it, because the algorithm is defined to such an extent by incoming data that it is incomplete without its connection to a massive data stream. Much like labyrinthine monolithic enterprise software systems such as SAP, the algorithmic system as a whole becomes incomprehensible even to its designers and programmers.

“So, on the one hand, we have algorithms* – a set of mathematical procedures whose purpose is to expose some truth or tendency about the world. On the other hand, we have algorisms – coding systems that might reveal, but that are equally if not more likely to conceal.

Claude Shannon’s chief concern was parsing signal and noise, thereby increasing the odds that a system would achieve a sufficient degree of order. Hence, he needed to devise a set of formal instructions – an algorithm, if you will, although he did not use the term specifically – capable of dealing with the cascade of determinations that governed communicative encounters. Shannon may have believed he was developing a ‘Mathematical Theory of Communication’, but in fact he produced among the first algorithmic theories of information. Prior to this, during WW2, he worked on secret projects at Bell Labs and produced a lesser-known, originally classified paper, ‘A Mathematical Theory of Cryptography’. Invoking these theories, ordinary communication is nothing other than a simpler case of cryptographic ciphering and deciphering - in effect using algorithms to attenuate algorisms.”

Ted Striphas, “Algorithmic Culture”, European Journal of Cultural Studies, 2015 - edited.  *Striphas uses algorithm where we would employ algorismus, and algorism/algorismus where we would use algorithm.
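Shannon's parsing of signal and noise can be made concrete with his entropy measure, which quantifies the information carried per symbol of a message. A minimal illustrative sketch (ours, not drawn from the salon materials or Striphas's text):

```python
import math
from collections import Counter

def entropy(message):
    """Shannon entropy in bits per symbol: H = sum over symbols of -p(x) * log2 p(x)."""
    counts = Counter(message)
    n = len(message)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A perfectly ordered message carries no information per symbol...
print(entropy("aaaaaaaa"))  # → 0.0
# ...while a uniform mix of four symbols carries the maximum, 2 bits each.
print(entropy("abcdabcd"))  # → 2.0
```

Low entropy corresponds to the "sufficient degree of order" Shannon sought; high entropy is indistinguishable from noise, which is also why well-enciphered text looks random.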


Software is yet another epistemically muddied concept today, as the hardware-machine / software-code binary distinction does not translate well to non-digital, non-classical and organic modes of computation. Code in the most general sense refers to a symbolic language used in some abstraction process. Source code refers more specifically to input material intended for computational execution. Digital programs, algorithms and software can all be thought of as being constructed from source code.

“Code draws a line between what is material and what is active, in essence saying that writing (hardware) cannot do anything, but must be transformed into code (software) to be effective ... code is language, but a very special kind of language - the only one which is executable. The material substrate of code, which must always exist as an amalgam of electrical signals and logical operations in silicon, however large or small, demonstrates that code exists first and foremost as commands issued to a machine. Code essentially has no other reason for being than instructing some machine in how to act. One cannot say the same for the natural languages.”

Alexander Galloway, “Protocol: How Control Exists after Decentralization”, MIT Press, 2004 - edited


“With the advent of alpha-numeric languages, human-written, nonexecutable code becomes source code and the compiled code, the object code. Source code thus is arguably symptomatic of human language’s tendency to attribute a sovereign source to an action, a subject to a verb. By converting action into language, source code emerges. The translation from source code to executable is arguably as involved as the execution of any command. More importantly, it depends on the action (human or otherwise) of compiling/interpreting and executing. Some programs may be executable, but not all compiled code within that program is executed; rather, lines are read in as necessary. So, source code thus only becomes source after the fact. Code is also not always the source, because hardware does not need software to “do something.”
Wendy Hui Kyong Chun, “On Sourcery, or Code as Fetish”, Configurations, 2008 - edited

Source code is by necessity legible to the people who write it. Machine learning (ML) algorithms, on the other hand, automate away the programmer’s work and excommunicate the author from the loop. Having discarded human authorship, ML algorithms do away with modern reasoning as well. Algorismus depends on reasoning, whereas algorithms emerge from pattern recognition. The language of ML algorithms is often unintelligible, but some researchers are working to bridge the gap between symbolic language and ML language.
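The contrast between authored rules and learned ones can be sketched with a toy example (ours, not from the source texts): a hand-written threshold rule beside a perceptron-style classifier whose decision boundary emerges from data rather than from an author.

```python
# Algorismus: a rule authored by a human, legible as reasoning.
def classify_by_rule(x):
    return 1 if x > 0.5 else 0

# Algorithm: the rule's parameters emerge from pattern recognition.
def train_threshold(samples, labels, lr=0.1, epochs=100):
    """Fit a 1-D classifier (1 if w*x + b > 0 else 0) by perceptron updates."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if w * x + b > 0 else 0
            w += lr * (y - pred) * x  # nudge weights toward the correct label
            b += lr * (y - pred)
    return w, b

xs = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_threshold(xs, ys)
learned = [1 if w * x + b > 0 else 0 for x in xs]
print(learned)  # → [0, 0, 0, 1, 1, 1], yet w and b were never written by hand
```

Even in this trivially small case, the fitted values of `w` and `b` are artefacts of the training trajectory rather than of any author's stated reasoning; at the scale of modern models, that gap is what renders the resulting system illegible.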

Scientists who work with algorithms and lay people alike are mystified by the way that ML produces unexplainable results. Spirituality seems to offer a language capable of re-inscribing ML algorithms inside the bounds of human reasoning. It also points to people’s desire for a “theory of algorithms” - a desire shared by some of the authors of this paper and the 0x Salon discussion participants. An upcoming article ‘Corpus Algorithmi’ will discuss this in greater detail.

“Science, art and religion are all about making the invisible visible. Supposedly the domain of science is what's measurable, the domain of art is what's experienceable but not measurable, the domain of religion is what is beyond the realm of experience. There's a huge overlap, and unsurprisingly one anecdotally experiences them to be highly contingent.”
Oliver Beige, Epistemic Trespassing Salon Report, 0x Salon

0x Salon, Corpus Algorithmi, 2020.


“Today, amidst the expanding capacities of AI, there is a tendency to perceive algorithms as an application or imposition of abstract mathematical ideas upon concrete data. On the contrary, the genealogy of the algorithm shows that its form has emerged from material practices, from a mundane division of space, time, labor, and social relations. Ritual procedures, social routines, and the organization of space and time are the source of algorithms, and in this sense they existed even before the rise of complex cultural systems such as mythology, religion, and especially language. In terms of anthropogenesis, it could be said that algorithmic processes encoded into social practices and rituals were what made numbers and numerical technologies emerge, and not the other way around. Modern computation, just looking at its industrial genealogy in the workshops studied by both Charles Babbage and Karl Marx, evolved gradually from concrete towards increasingly abstract forms.”
Matteo Pasquinelli, “Three Thousand Years of Algorithmic Rituals: The Emergence of AI from the Computation of Space”
