
0x003 Report III ::: Constraints of the Computational Paradigm

Document compiled from salon notes, interpretation & additional theory.

Published on Jan 27, 2021

The 0x Salon is an experiment in reformulating the salon context as a venue for post-disciplinary exchange.

The purpose of this text (and others to follow) is twofold: to provide written documentation of the recent 0x Salon discussions on Algorithmic Realism, and to further develop concepts arising in conversations at and around salon topics. We aim to capture the spirit of early-stage salon ideation whilst bridging the gap from speculative, generalist discourse to substantive analysis, critique, synthesis and extension of the concepts in source materials and salon discussions.

This is a public draft / living document - we welcome your comments and critiques!

This is the third in a series of articles on Algorithmic Realism. Access the entire collection here.

0x003 Event Information

0xSalon003 (a) & (b) ::: Algorithmic Realism
a) Tuesday 14th April 2020 ::: Praxis Branch ::: Virtual
b) Thursday 27th August 2020 ::: Theory Branch ::: Berlin

Constraints of the Computational Paradigm

In modern computing approaches such as machine learning and genetic programming, a large part of the technology’s perceived “magical properties” is in reality brute-force numerical processing and statistics, consuming huge quantities of CPU cycles and electricity along the way. The programs typically make - and hide - assumptions about the nature of the statistical paradigms employed and the interpretations applied to the phenomena under observation, serving clarity or obfuscation depending on the intentions and goals of the system designers and implementers. In other words, certain statistical approaches may be employed transparently, with auditable code repositories and permissive licensing, or in a clandestine ‘black box’ manner. The historically informed view of the objective nature of an algorithm (as discussed in 0x003 articles I and II) means that the enduringly hegemonic Church-Turing model of computation acts as yet another epistemic constraint.

Diagram of an algorithm for the Analytical Engine for the computation of Bernoulli numbers, from Sketch of The Analytical Engine Invented by Charles Babbage by Luigi Menabrea, with notes by Ada Lovelace. Wikimedia.
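To make the ‘brute force’ claim concrete, the sketch below - an illustrative toy of our own, not drawn from any salon materials - implements a minimal genetic algorithm. Stripped of its evolutionary vocabulary, it is simply repeated random perturbation, numerical scoring and selection: statistics plus a very large number of evaluations.

```python
# Minimal genetic algorithm sketch (illustrative only): "evolving" a bit string
# towards a target. The apparent magic is nothing more than random mutation,
# a numerical fitness score, and selection repeated many thousands of times.
import random

TARGET = [1] * 64                      # toy objective: an all-ones bit string
POP_SIZE, GENERATIONS, MUTATION_RATE = 100, 500, 0.01

def fitness(candidate):
    """Score = number of bits matching the target (plain counting statistics)."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate):
    """Flip each bit with a small probability."""
    return [b ^ (random.random() < MUTATION_RATE) for b in candidate]

population = [[random.randint(0, 1) for _ in range(len(TARGET))]
              for _ in range(POP_SIZE)]

for generation in range(GENERATIONS):
    # Selection: keep the better-scoring half, discard the rest.
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP_SIZE // 2]
    # Reproduction: refill the population with mutated copies of survivors.
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(POP_SIZE - len(survivors))]
    if fitness(population[0]) == len(TARGET):
        break

print(f"best fitness {fitness(population[0])}/{len(TARGET)} "
      f"after {generation + 1} generations")
```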

Even more problematic than obscurity via computational and statistical complexity is the increasing incidence of what can be thought of as ‘negative results’ in the mathematical sense. Theoretical, simulated and/or experimental investigations imply that some questions are never answerable or computable. The answer is not true or false - the answer is that there is no answer. The assumptions employed in algorithmic system designs, simulations, formal validations and implementations critically require these more complete notions of incompleteness and incomputability - how can we apply frameworks such as Gödel’s Incompleteness Theorems to systems or environments which exhibit some degree of realism owing to their unformalisability?

(Janitor’s Note: Later articles in this series will explore this in more detail)
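As one concrete instance of a question whose answer is ‘there is no answer’, consider Turing’s halting problem. The sketch below is a hypothetical illustration of our own, not taken from the salon materials: it writes down the oracle that would be needed and the diagonal program that defeats it, with invented names throughout.

```python
# Sketch of Turing's diagonal argument: no total, always-correct `halts`
# oracle can exist. The names below are hypothetical, for illustration only.

def halts(program_source: str, argument: str) -> bool:
    """Supposed oracle: True iff the program halts on the given argument.
    Turing's result is that no such function can be implemented in general."""
    raise NotImplementedError("provably impossible for arbitrary programs")

DIAGONAL_SOURCE = '''
def diagonal(source):
    # Ask the oracle about this very program running on its own source.
    if halts(source, source):
        while True:       # oracle said "halts" -> loop forever instead
            pass
    return None           # oracle said "loops" -> halt immediately
'''

# Contradiction: halts(DIAGONAL_SOURCE, DIAGONAL_SOURCE) == True implies the
# diagonal program loops forever; == False implies it halts. Either way the
# oracle is wrong, so the question has no general algorithmic answer.
```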

These notions of computationalist realism in turn need to be embedded in both human- and machine-mediated decision-making procedures, without excluding domains outside the narrowly defined computational systems paradigm. In particular, biology and the natural sciences are areas where very strong cases can be made that discretised, sequential and deterministic modes of ‘cognitive’ function are not how natural ecologies, neural systems and inter-species assemblages operate - see for example the 0x004 discussions of ants and slime mould. Revisiting alternative forms of computation - biological, analog electronic, paper, hydraulic, pneumatic, quantum and so on - some of which employ modes of action quite different from binary digital computing, may provide additional insights and affordances.

“The realization that computation was an incomplete affair radically challenged the axiomatic method and forced formal computation to admit that infinite quantities of information had entered its core. If a program is left to run according to precise algorithmic instructions based on the evolutionary drive of growth, change, adaptation, and fitness, then the computational limit arrives as the space of incomputable probabilities that reveal how abstract quantities can reprogram preset rules. 

Algorithmic architectures are used not simply to build profiles based on prefixed sets of algorithms, but to exploit the self-delimiting power of computation, defined by its capacity to decide when a program should stop … the incompleteness of computational models cannot simply be explained away by the paradigmatic substitution of biological dynamics for mathematical axiomatics. On the contrary, one must explain the incompleteness of computation by addressing contingency within algorithmic processing. This is to say that it is in the axiomatic method of computation that incomputable algorithms reveal the incompressible (infinite, nondenumerable, uncountable) discrete unities, which are strictly neither mathematical nor biological. Incomputable algorithms instead can only be defined by the immanence of infinities in finite sets of data.” 

Luciana Parisi, Contagious Architecture, MIT Press, 2013 (edited)
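One way to read the ‘incomputable probabilities’ and the ‘self-delimiting power of computation’ invoked above - a gloss of ours rather than Parisi’s own formulation - is via Chaitin’s halting probability. For a prefix-free (self-delimiting) universal machine $U$ it can be written as

$$\Omega_U \;=\; \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|},$$

a perfectly well-defined real number whose binary expansion is nonetheless incompressible and uncomputable: an immanence of infinities inside a finite definition.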
