
0x003 Report VI ::: A Recipe for Disaster?

Document compiled from salon notes, interpretation & additional theory.

Published on Feb 13, 2021

The 0x Salon is an experiment in reformulating the salon context as a venue for post-disciplinary exchange.

The purpose of this text (and others to follow) is twofold: to provide written documentation of the recent 0x Salon discussions on Algorithmic Realism, and to further develop concepts arising in conversations at and around salon topics. We aim to capture the spirit of early-stage salon ideation whilst bridging the gap from speculative & generalist discourse to substantive analysis, critique, synthesis and extension of the concepts in source materials and salon discussions.

This is a public draft / living document - we welcome your comments and critiques!

This is the sixth in a series of articles on Algorithmic Realism. Access the entire collection here.

0x003 Event Information

0xSalon003 (a) & (b) ::: Algorithmic Realism
a) Tuesday 14th April 2020 ::: Praxis Branch ::: Virtual
b) Thursday 27th August 2020 ::: Theory Branch ::: Berlin

A Recipe For Disaster?

Let's situate the algorithm in context as part of a greater whole, for example a cybernetic feedback system. We can then consider an algorithm as a socio-technical (or cyber-physical) primitive, perhaps most easily conceived of as an arbiter - adjudicating when policies are created or adjusted, and determining when other network / system-level priorities and actions take place.
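To make this framing tangible, here is a minimal sketch - entirely our own illustration, with none of the names drawn from salon materials - of an algorithm acting as arbiter inside a toy feedback loop:

```python
# A minimal sketch of an algorithm-as-arbiter inside a cybernetic
# feedback loop. All names (Policy, read_sensor, act) are hypothetical.

from dataclasses import dataclass

@dataclass
class Policy:
    threshold: float  # system-level setpoint the arbiter enforces

def read_sensor() -> float:
    """Stand-in for an observation of the wider system."""
    return 0.7

def act(adjustment: float) -> None:
    """Stand-in for a network- / system-level action."""
    print(f"applying adjustment: {adjustment:+.2f}")

def arbiter(observation: float, policy: Policy) -> float:
    """The algorithmic primitive: adjudicates whether and how
    the system responds, given the current policy."""
    return policy.threshold - observation

policy = Policy(threshold=1.0)
error = arbiter(read_sensor(), policy)
if abs(error) > 0.1:  # the arbiter's verdict triggers an action
    act(error)
```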

One of the simplest analogies for algorithms (or algorisms in our proposed terminology) is that of the cooking recipe - let's develop it to see how well it holds up. (As illustrated by experiments in applying machine learning algorithms to data from cooking recipes, there's a stark difference between algorithmically-generated recipes and standard algorismic recipes, each with different limitations. At a glance both appear to be in the same format - recipe name, list of ingredients, sequential instructions - but the items produced by the two are dramatically different. The algorithmic instructions are generally syntactically correct but semantically absurd or impossible to execute. Rather than offering a path to satiating the reader's hunger, these algorithmic recipes are more likely to produce laughter than an edible treat. The apparent absurdity of the recipes isn't an "optional ingredient" - it's baked into the structure of ML algorithms. The recipe-as-data fed to the ML algorithm is stripped of its culinary meaning, and it would be chaotic/schizophrenic to treat the algorithmic recipes as if their terms were still tied to their traditional meanings.) An algorism is a procedure that is administered, for example, through a compiler. There's a formalist, deterministic "if this, then that" decision-making logic, and our goal is to reduce it to a linear procedure. Naturally, when implemented, the algorithmic multiple has social implications - for example, meals cooked using a recipe. Here the analogy weakens for the strictly symbolic, technologically-mediated case, where procedures are 'enshrined' in code and then executed in some downstream environment at a later time, with outcomes beyond our control or influence but with implication and (ethical / legal) liability extending through time and space.
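The "linear procedure" character of an algorism can be made concrete in code. The following Python sketch is our own toy illustration (the recipe content and function names are invented): executing the recipe is nothing more than walking an ordered list of deterministic steps - precisely the structure an ML-generated recipe mimics syntactically while severing the culinary semantics.

```python
# An algorismic recipe: a finite, ordered list of deterministic steps.
# Executing it is just walking the list in order - "if this, then that".

brownie_recipe = [
    "preheat oven to 180 C",
    "melt 200 g chocolate with 150 g butter",
    "whisk in 3 eggs and 200 g sugar",
    "fold in 100 g flour",
    "bake for 25 minutes",
]

def execute(recipe: list[str]) -> None:
    for number, step in enumerate(recipe, start=1):
        print(f"step {number}: {step}")

execute(brownie_recipe)
```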

Sticking with the cooking analogy, the sharp, glossy images on food shows like the Bon Appetit Test Kitchen are immaterial in the same way that algorithms are immaterial/symbolic. They satisfy their viewers simply by being seen, without the viewer needing to prepare the recipe themselves. 

Image: Tatsiana Tsyhanova

There's a lot of semantic baggage entangled in these words - algorithm, policy, recipe, procedure, law - which leaves the analogy with an impossible amount of work to do in this case. Cooking in practice is rather a realist activity, as there are few penalties for deviating from the specification, so perhaps a more constrained analogy would be useful. Chemical synthesis is defined as the artificial execution of useful chemical reactions to obtain one or several products. This occurs by physical and chemical manipulations, usually involving one or more reactions. In modern laboratory use, the process is reproducible and reliable.1

Coincidentally, these reactions (and their symbolic representations) can also be thought of as code, and the execution of that code is one or more chemical reactions in an experimental procedure. [Indeed, a friend of the salon developed the ChemPuter software.] A successful chemical synthesis consists of many steps (sub-processes), some in serial and some in parallel. Due to the often exacting requirements of conducting these reactions (some require the absence of water, light, heat, oxygen, or some combination thereof), there is much less tolerance for deviation from the precise formulation than in our culinary example. In a way this epistemic tolerance delta is akin to the epistemic gap (in 0x Salon parlance), or Ballantyne's notion of analogical distance - the closure (or indeed commensurability) of formalist and realist approaches that must necessarily take place on account of their eventual contingency (a play on words with 'eventual consistency'). Is it the case that the greater the gap to bridge, the more fraught with danger the epistemic leap? The mistake that the Algorithmic Realism paper addresses (where programmers fail to account for the actual context their algorithms run in - "computer science lacks the language and methods to fully recognize, reason about, and evaluate the social aspects and impacts of algorithmic interventions") could be rehashed as: programmers think they're doing chemistry when they're actually writing a cookbook.
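As a toy illustration of this tolerance delta - our own sketch, with hypothetical step and condition names, and no claim about how ChemPuter actually works - synthesis-as-code might encode preconditions that abort the run on any deviation:

```python
# Synthesis-as-code: each reaction step declares strict preconditions,
# and any deviation aborts the run - far less tolerance than the
# culinary case. All names here are hypothetical.

ANHYDROUS = "anhydrous"
INERT = "inert_atmosphere"
DARK = "dark"

def reaction_step(name: str, required: set[str], conditions: set[str]) -> str:
    missing = required - conditions
    if missing:
        raise RuntimeError(f"{name} aborted: missing {sorted(missing)}")
    return f"product of {name}"

conditions = {ANHYDROUS, INERT}

# Serial sub-processes: each step consumes the previous step's product.
intermediate = reaction_step("Grignard formation", {ANHYDROUS, INERT}, conditions)
final = reaction_step("carbonyl addition", {ANHYDROUS}, conditions)

# A step whose preconditions are unmet halts the whole procedure:
try:
    reaction_step("photosensitive workup", {DARK}, conditions)
except RuntimeError as deviation:
    print(deviation)  # the run aborts rather than tolerating the deviation
```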

Image: Mothers for Nuclear / Pinterest

The analogy of the recipe is nice in a way, as we can take it further to delineate between the algorithm and a system that executes or implements it. Writing a recipe for chocolate fudge brownies is just specifying an algorithm we haven't implemented or pushed into the wild. So if the salon collectively writes a recipe for chocolate fudge brownies and puts potassium cyanide in that recipe, we haven't hurt anyone by creating the algorithm. But once that algorithm is implemented and people start cooking - let's say it goes into a best-selling cookbook - and harm occurs, you can't even say that there's a lack of understanding of the consequences; those consequences are probably quite deliberate. It is the implementation of the algorithm into something that exists in the wild where the real-world 'impact' comes from. Was it wrong for the salon to write a book of dark chemistry cooking jokes if there's a risk of someone in the future finding that book and interpreting it as an actual cookbook? Then suppose an institution is formed - a community cooking club - and people start cooking with our poisonous procedures. What occurs then is a weird algorithmic mixing: of procedures, the instantiation of those procedures in text, the taking of that text into the context of some other institution, the implementation of those procedures in the cooking itself, and the resulting outcomes in the real world.

However, the way cookbooks and algorithms circulate in the market has some key differences. Individual people can choose to buy a cookbook and cook the meals described inside. Algorithms can work this way too. But the algorithms with the greatest impact are bought or developed by large organisations, e.g. governments and companies, and then run on users who have no choice but to click "consent". Sticking with the food analogy again, a more comparable example would be a restaurant making and selling the cyanide-laced brownies from the 0x Salon Cookbook.
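In code, the delineation drawn above is literal: defining a procedure is inert text, and consequences only arise when some executing system runs it. A minimal sketch of our own, with all names invented:

```python
# Specifying the procedure is inert: writing it harms no one.
def cursed_brownie_recipe() -> list[str]:
    return [
        "melt 200 g chocolate",
        "stir in potassium cyanide",  # a harmful step, but still only text
        "bake for 25 minutes",
    ]

# Potential harm arises only when some executing system - a cook, a
# cooking club, a restaurant kitchen - actually runs the procedure:
for step in cursed_brownie_recipe():
    print(step)  # the 'implementation in the wild'
```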

As simple procedural algorithms have the property of recursive composability, and given the preponderance of open-source software, anyone can implement an algorithm as a sub-procedure in another program. To what extent can the creator of the original procedure be held liable or responsible for these actions, by necessity downstream in time and space from them? Is this analogous to including harmful or potentially allergenic ingredients (with or without intent) in a formulation such as a recipe? There appear to be very few ways to align a creator's intent with desirable downstream uses other than the usual intellectual property and licensing approaches, and the extent to which these can be relied upon in a statutory law court largely remains to be seen. Even in this very simple idealised case of the recipe and cooking club, we see potential consequences spiralling out of control, despite each individual step being ostensibly reasonable in itself.
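A minimal sketch of recursive composability - again our own illustration, with invented names:

```python
# Recursive composability: any published procedure can be embedded as a
# sub-procedure in a downstream program, far from its author's control.
# 'upstream_recipe' stands in for code pulled from any open-source package.

def upstream_recipe() -> list[str]:
    # Imagine this was imported from someone else's cookbook/library.
    return ["melt chocolate", "add mystery ingredient", "bake"]

def downstream_program() -> None:
    """A later program composing procedures its author did not write."""
    for step in upstream_recipe():  # the original algorism, reused
        print(step)

downstream_program()
```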


1. Vogel, A.I., Tatchell, A.R., Furniss, B.S., Hannaford, A.J., and Smith, P.W.G. Vogel's Textbook of Practical Organic Chemistry, 5th Edition. Prentice Hall, 1996.
