Eric Schliesser proposed the idea of synthetic philosophy, first in his 2019 review of Daniel Dennett’s From Bacteria to Bach and Back: The Evolution of Minds and Peter Godfrey-Smith’s Other Minds: The Octopus and the Evolution of Intelligent Life and then in a recent Substack post. In the latter, he defines it as follows:
Synthetic philosophy is a style of philosophy that presupposes or develops expertise in a general theory (or a model, a certain method/technique, etc.) that is thin and flexible enough to be applied in/to different special sciences, but rich enough that, when applied, it allows for connections to be developed among them with the aim to offer a coherent account of complex systems and connect these to a wider culture, the sciences, or other philosophical projects (or both).
Examples of such general theories are Darwin’s theory of evolution (which plays a central role in the books by Dennett and Godfrey-Smith), game theory, and information theory. In the opening of his 2019 paper, Schliesser writes:
‘synthetic philosophy’ [is] a style of philosophy that brings together insights, knowledge, and arguments from the special sciences with the aim to offer a coherent account of complex systems and connect these to a wider culture or other philosophical projects (or both).
Regular readers will certainly not be surprised when I say that engineering theories, encompassing computer and system sciences, should play the leading role here: the complex systems that fall within the purview of synthetic philosophy are open systems that reside in their own specific environments, have internal values, and interact with external entities and observers who in turn have their own systems of values. This view can be neatly summed up by a quote from A Pluralistic Universe by William James:
everything is in an environment, a surrounding world of other things, and ... if you leave it to work there it will inevitably meet with friction and opposition from its neighbors.
Again, as I have written here on multiple occasions, modern control theory is precisely the right “general theory” for the job. Control theory deals with open systems which exchange matter, energy, and information with their environments; it brings the interplay of structure, function, and organization of complex systems to the fore; and it makes contact with all the other disciplines listed by Schliesser and enlisted by Dennett and Godfrey-Smith (information theory, game theory, the theory of computing, and evolutionary theory). It would not be a stretch, then, to envision a synthetic philosophy of society (and of social systems) built on the scaffolding of control theory. A nontrivial, if not entirely convincing, attempt at something like this can be found in the first hundred and twenty pages of James R. Beniger’s 1986 book The Control Revolution: Technological and Economic Origins of the Information Society. Now, given the wide net Beniger is casting, his approach is not without its critics. As an example, here is what Henry Farrell wrote on Bluesky:
When social scientists start by mixing Gödel, evolutionary theory, Weberian bureaucracy, the heat death of the universe, Talcott Parsons, the halting problem and the industrial revolution into a brightly painted swirl, I start getting nervous.
Cosma Shalizi is similarly unsparing in his review of Beniger’s book:
do we really need mini-articles on Principia Mathematica, Gödel, Talcott Parsons, the origin of molecular biology in physics, Structuralism, and even an “addendum” on the nature of life?
All of these are fair points, but hey—if it was somehow ok for Dennett to throw Darwin, Turing, Gödel, von Uexküll, Sellars, and Shannon into the mix, then it is surely ok for Beniger to do something similar. Any synthetic philosophy worth its salt should be wide-ranging and ambitious. This is not to say that I don’t have a few bones to pick with Beniger, which I will reveal in due time.
Vive la révolution du contrôle!
Here is Beniger’s objective in a nutshell. He wants to explain the historical roots of “the information society,” the peculiar nexus of communication, control, and material conditions that constitutes modern capitalism. In this, he follows the economist Colin Clark, who had identified three modes of economic activity: primary (extraction of natural resources), secondary (manufacturing), and tertiary (services). Beniger’s thesis is that the management of all the organized complexity around these activities, apart from necessitating the growth of state and corporate bureaucracy, benefited in no small part from the concomitant spread of communication and computing technologies. These, in turn, have been co-evolving with a wide variety of control mechanisms, ranging from laws, rules, and regulations to communication networks, data mining, marketing surveys, political polls, and the like.

The control revolution he has in mind is technological; but, for it to have become possible, three other major transitions (to borrow an apt term from John Maynard Smith and Eörs Szathmáry) had to have occurred first: the emergence of life (or molecular programming) approximately one billion years ago, followed by the emergence of culture (learning by imitation and then learning from demonstrations) approximately one hundred million years ago, followed by the emergence of bureaucracy (the institutions of taxation, conscription, and forced labor; see James C. Scott) approximately ten thousand years ago, and finally by the emergence of industrial technology (the control revolution proper) about two hundred years ago.

Beniger associates control with organized, as opposed to merely ordered, systems, and the story of the control revolution(s) is one of the forces of organization and many varieties of programming locked in a struggle against the forces of entropy and what Émile Durkheim had called anomie (the peculiarly modern manifestation of the Hobbesian state of nature: the breakdown of norms and regulations in the face of the increasing decentralization brought about by industrial society). But, in order to arrive at this point, he has to first locate the emergence of control in the origin of life, and then trace the progression of complexity from single-celled organisms all the way to societies and their culture (from bacteria to Bach, as it were).
The varieties of controlled experience
Preparatory to anything else, let’s agree upon the definitions. Beniger defines control as “purposive influence toward a predetermined goal.” This is not far from existing definitions offered by control theorists. For example, Alistair G.J. MacFarlane defines control as “effective interaction” and gives two examples taken from the biological and the technological realms, respectively: a bird hovering in midair and an aircraft coming in for a landing. Jan Willems, on the other hand, speaks of “control as interconnection”: two (or more) systems, each with its own laws prescribing its behavior, mutually constrain each other and thus restrict the resultant behavior in some desired way.
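In Willems’ behavioral language this has a compact formal rendering (a bare-bones sketch in his notation, not a full treatment): a dynamical system is just a set of trajectories, its behavior, and interconnecting two systems that share a signal space means intersecting their behaviors:

$$
\mathfrak{B}_{\mathrm{controlled}} \;=\; \mathfrak{B}_{\mathrm{plant}} \cap \mathfrak{B}_{\mathrm{controller}}, \qquad \mathfrak{B}_{\mathrm{plant}},\, \mathfrak{B}_{\mathrm{controller}} \subseteq W^{\mathbb{T}}.
$$

The design problem is then to choose $\mathfrak{B}_{\mathrm{controller}}$ so that $\mathfrak{B}_{\mathrm{controlled}} \subseteq \mathfrak{B}_{\mathrm{desired}}$: each system’s laws cut down the set of trajectories the other can exhibit, with no mention of signal flow or information processing.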
Willems’ definition is, perhaps, the broadest, as he wants to expand the scope of what we interpret as control to include not just the usual cybernetic idea of the feedback loop involving the processing of sensor outputs to determine the control inputs which then drive the actuators (what he calls “intelligent control”), but also such passive control systems as, say, shock absorbers or pressure valves or heat sinks. While it may be expedient for the purposes of analysis or human understanding to interpret their operation in terms of signal flow diagrams, there isn’t really any signal processing going on there; rather, one system is physically coupled to another with the goal of reducing (or redirecting) the variety of possible behaviors that could be generated. Beniger’s definition is in good accord with this: markets, corporations, bureaucracies, the Internet, HVAC systems, clocks, flush toilets, biological organisms, individual cells, etc. etc. are all instances of control(led) systems. We will come back to the whole issue of “intelligent” control a bit later when we get to Maxwell’s demon, everyone’s favorite intelligent controller.
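To see the passive case in miniature, here is a toy simulation (my own sketch, not Beniger’s or Willems’): a unit mass on a spring, with and without a damper. The damper senses nothing and processes no signals; physically coupling it to the mass simply restricts which long-run trajectories the interconnected system can exhibit.

```python
def simulate(damping, x0=1.0, v0=0.0, k=1.0, m=1.0, dt=1e-3, steps=20_000):
    """Forward-Euler integration of m*x'' + damping*x' + k*x = 0."""
    x, v = x0, v0
    for _ in range(steps):
        a = -(k * x + damping * v) / m  # Newton's second law
        x, v = x + dt * v, v + dt * a
    return round(x, 4), round(v, 4)

# Undamped: the mass is still oscillating after 20 simulated seconds.
print(simulate(damping=0.0))
# Damped: the shock absorber, a purely passive interconnection,
# drives every trajectory toward the equilibrium (0, 0).
print(simulate(damping=0.5))
```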
The importance of being open
The importance of open systems is not lost on Beniger. He starts by introducing the “society as processor” metaphor, which naturally leads him to the fundamental recognition that, just like living organisms, societies are
open systems with significant inputs, throughputs, and outputs of various sorts of matter-energy and information. Processing these is all they do—a deceptively simple fact not widely recognized by the scientists who study them
(he takes this quote from James Grier Miller’s 1978 book Living Systems). And if there is one thing we know about open systems, it is that they rely on control mechanisms to maintain certain relations with their environments (each such system “will inevitably meet with friction and opposition from its neighbors”). Again, the contrast between order and organization is apt: Beniger brings up the example of an amoeba, which is
not at all well ordered; it is a formless bag full of sticky fluid in which irregularly shaped molecules float haphazardly. In stark contrast to even the most complex crystalline structure, however, the amoeba is highly organized, and indeed its description requires several hundred large volumes, the information storage capacity of the DNA in which the structure is in fact recorded. That living material has greater complexity of organization holds even at the molecular level, which explains why students normally learn physical before organic chemistry.
This highly organized structure is one salient feature of controlled (and controlling) systems, both biological and technological. It goes hand in hand with complexity and the management thereof. In fact, one key aspect of control is the reduction of externally perceived complexity at the expense of increased internal complexity, an observation made by many control theorists, such as George Zames and Alistair MacFarlane. For Beniger, this is inextricably linked with purposiveness, but he is quick to dispel any suspicions of teleology or vitalism in favor of teleonomy, the Leibnizian trick of disguising final causes in a web of efficient causes operating at a lower level of abstraction. Again, openness is key, since it is the only way of exorcising Laplace’s demon via the duality between knowledge and control. As Claude Shannon aptly noted, “we know the past but cannot control it; we control the future but cannot know it.”
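A textbook illustration of this external-simplicity-from-internal-complexity trade (my gloss, not Beniger’s) is Harold Black’s feedback amplifier: wrap a simple feedback path of gain $\beta$ around a messy, high-gain forward amplifier $A$, and the closed-loop gain

$$
A_{\mathrm{cl}} \;=\; \frac{A}{1 + A\beta} \;\longrightarrow\; \frac{1}{\beta} \quad \text{as } A \to \infty
$$

is set almost entirely by the feedback element. Relative variations in the complicated innards are attenuated by the loop, since $\frac{dA_{\mathrm{cl}}}{A_{\mathrm{cl}}} = \frac{1}{1+A\beta}\,\frac{dA}{A}$: the externally perceived behavior is simple and robust precisely because the internal gain is enormous.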
The revolution will be programmed
This gets us to one of the key components of Beniger’s whole framework, which I also find to be one of the most questionable—that of control being implemented by means of programs:
all control is thus programmed: it depends on physically encoded information, which must include both the goals toward which a process is to be influenced and the procedures for processing additional information toward that end.
To be fair to Beniger, he is very liberal with his notion of what constitutes a program: it could be adjustable, like the punched-card patterns of the Jacquard loom, or built directly into the fixed physical mechanism of the system, as in the naval chronometer. Nevertheless, while one could conceivably attribute program semantics to clocks (as Brian Cantwell Smith had done, for example), it would be much less convincing to view the function of, say, a suspension system in an automobile from the vantage point of programs. On the other hand, the emphasis Beniger puts on programs is perfectly in line with his emphasis on teleonomy as opposed to teleology: since the entire program must be physically instantiated in some form before it can be put into action, this automatically blocks any attempt to smuggle in some contraband élan vital.
At this point, Beniger brings in another important feature of control systems: they exert control by making decisions. Indeed, since programs are syntactic (or “inert,” per Beniger), they must interface with the material world through transducers and effectors, and their end result is a decision being made and carried out. This is how Beniger ends up dragging in Principia Mathematica, Turing, Gödel, the halting problem, and all that. In order to be used, decision procedures (a term apparently first introduced by W.V.O. Quine, as Beniger hastens to point out) must be representable by finite means and executable with finite resources, which naturally brings in questions of decidability, computability, and complexity.
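As a tiny, runnable emblem of the “finite means, finite resources” constraint (my illustration, not Beniger’s): the best a resource-bounded halting check can honestly report is “halts” or “unknown,” never “runs forever.”

```python
def bounded_halts(step, state, max_steps=10_000):
    """Resource-bounded halting check: iterate a transition function
    until it signals termination (None) or the step budget runs out.
    By Turing's theorem, no budget-free version of this can exist."""
    for _ in range(max_steps):
        state = step(state)
        if state is None:
            return "halts"
    return "unknown"

# The Collatz map: whether it halts on *every* input is a famous open
# problem -- a vivid reminder of why "unknown" cannot be eliminated.
collatz = lambda n: None if n == 1 else (n // 2 if n % 2 == 0 else 3 * n + 1)
print(bounded_halts(collatz, 27))  # "halts": 27 reaches 1 after 111 steps
```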
This particular detour, I must say, is not as outlandish as it may seem. Indeed, A.G.J. MacFarlane had been emphasizing the tension between parsimony (process description) and experience (data description), as seen through the lens of algorithmic information theory (Kolmogorov complexity). Another control theorist, Roger W. Brockett, opened his 2009 Peter Sagirow lecture Cybernetics, Artificial Intelligence, and the Avoidance of Mathematical Blind Spots at the University of Stuttgart, with the following (rough) historical outline of “200 years of thinking” in systems and control:
Vitalism explains nothing, physics explains everything.
Feedback and signal processing explains everything (that physics cannot).
Rules (Boolean manipulation) explains everything (that physics and feedback cannot).
Statistical analysis and Bayes’ rule explains everything (that physics and feedback and rules cannot).
Computational Complexity explains everything (that physics and feedback and rules and statistical analysis cannot).
This caricature is not too bad, taking us all the way from Julien Offray de la Mettrie’s L'homme machine (1747) to the Cook-Levin theorem (1971) and beyond, by way of Shannon, Wiener, Bode, Nyquist, Kalman, McCulloch and Pitts, Turing, Church, Vapnik and Chervonenkis, Neyman and Pearson, Ramsey, de Finetti, Savage, Wald, Blackwell, and others. Viewing the endless variety of both natural and artificial control systems through the functionalist lens, as computational and informational abstractions, is a powerful tool for conceptual analysis and for system design.
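MacFarlane’s parsimony/experience tension can even be made tangible with a crude, computable stand-in for Kolmogorov complexity (a standard trick; the toy below is mine, not his): data generated by a short “process description” compresses well, while patternless data can only be stored, never summarized.

```python
import os
import zlib

patterned = b"01" * 5_000         # generated by a one-line process description
patternless = os.urandom(10_000)  # (almost surely) no description shorter than itself

print(len(zlib.compress(patterned)))    # tiny: a law stands in for the data
print(len(zlib.compress(patternless)))  # ~10,000: the data must be kept whole
```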
But we must be careful not to fall prey to what Alfred North Whitehead termed “the fallacy of misplaced concreteness”: just because so many of our technological control systems are based on programs and signal processing (especially given, as Willems points out, the ubiquity and ease of implementing control using digital logic devices and computers), we must not mistake the abstract functional picture of control for its physical implementation, especially in living systems. Yet this is precisely the trap Beniger’s account falls into, as I will show in the next post. That story will begin with Maxwell’s demon.
(to be continued)
If I can reconstruct what was going through my mind 29 (!) years ago when I read and reviewed the book, it wasn't so much that I objected to the _fact_ of Beniger addressing such a wide range of topics, but rather that I wasn't impressed by how he did so: "long-winded, excessively detailed, ... not as well-grounded as the remaining, historical parts of the book". Bluntly, I was not nearly as confident in Beniger's mastery of this material, and so in his reliability as an informant, as I was in his historical narrative. I'd have to re-read it to see _why_ I had that impression, and perhaps it was unfair. Certainly, in retrospect, those lines in my review also had an element of "the average person only knows the formulas for olivine and one or two feldspars" [https://xkcd.com/2501/] (though in justice to young-Cosma, he also wrote "perhaps this is demanded by Beniger's audience, historians and social scientists who are unfamiliar with these matters").
This might be a question for Eric Schliesser too.
Schliesser's Synthetic Philosophy (SP) sounds a lot like the General Systems Theories (GSTs) of yore. It is especially interesting that both Cosma Shalizi's and Henry Farrell's critiques apply (without changes, imo) to GSTs too. I'm curious whether you've thought about the similarities and dissimilarities between SP and the GST of von Bertalanffy, Boulding, Parsons, Rapoport, and others.