A hit, a very palpable hit – Hamlet
Never before have we had so many researchers in the world, so much knowledge to build on, and so much computational power and experimental technology. However, science productivity has dropped and technological change has stagnated in recent decades. This may result from too much science being centralised and disconnected from technology and from people’s lives. Furthermore, we need to develop better ways of moving young researchers faster to the frontiers from which they can then begin to create new knowledge.
Learning is evolutionary and cumulative. It is Bayesian. It accords with Occam's razor, minimising the energy needed to learn. Learning comes largely from doing and involves sensory feedback that helps retain learning.
Learning retention is directly related to the extent that environmental or experiential feedback is palpable. This ensures learning feedback is felt in impactful ways and therefore retained. Some feedback is feather-light, some shocking enough to trigger amygdala-mediated recalibration, as in PTSD.
Sensory feedback is translated through and shaped by touch, sound, vision, smell, bodily sensation, respiration, temperature and heart rate, and by the emotional, biochemical and physiological processes associated with them. Palpability works at all learning levels – including the highest. Great scientific advances often involve feelings of transcendence and awe that are visceral, “sending a shiver down your spine”.
Learning is therefore haptic, emoted and tangible – that is, palpable. There is no mind–matter or self–world dichotomy, only learning endogenous to the world.
Learning forms connect through an ensemble of processes that operate from primitive life right through to advanced technological innovation.
All life evolves through natural selection. Within a rugged fitness landscape, variations are selected through differential survival. What survives reflects what the past has “taught”. Primitive life learns as new information is genetically encoded, cumulates, is passed on genotypically and expresses itself phenotypically. Later comes consciousness, and from this learning by doing, and purposeful and prospective higher learning.
The Second Law of Thermodynamics holds that entropy – disorder – tends to increase. Karl Friston asked how, given entropy, an organism can stay alive. Organisms create structures, from the cellular to the most complex levels, that act like Markov blankets, protecting them from external harm whilst drawing resources and learning from the external environment.
Friston argues that neuronal processes start from prior beliefs and make predictions which are then compared with sensory feedback. They aim to minimise the gap between predictions and feedback. Friston labels this gap “free energy” or “surprise”, though it is better termed “waste energy” that life depends on minimising. Cognitive processes are very energy-intensive, and therefore energy must be used as efficiently as possible, given the learning to be acquired.
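Friston's prediction–feedback loop can be sketched in a few lines. This is a toy illustration with invented names and parameters, not Friston's actual formulation: a prior belief is repeatedly nudged toward sensory observations, shrinking the prediction error – the “surprise” – at each step.

```python
# A minimal sketch of prediction-error minimisation in the spirit of the
# free-energy principle. All names and parameters are illustrative
# assumptions, not Friston's formal mathematics.

def update_belief(belief: float, observation: float,
                  learning_rate: float = 0.1) -> float:
    """Nudge the prior belief toward the sensory observation.

    The prediction error is the gap between what was predicted
    and what was sensed - the "surprise" to be minimised.
    """
    prediction_error = observation - belief
    return belief + learning_rate * prediction_error

belief = 0.0                 # prior belief about some hidden state
world = 10.0                 # the true state generating sensations
for _ in range(50):          # repeated rounds of sensory feedback
    belief = update_belief(belief, world)

print(round(belief, 2))      # belief has converged close to 10.0
```

With each round the remaining gap shrinks by a fixed fraction, so most of the cognitive “work” is done early; once predictions match feedback, little further energy is spent – the parsimony the essay describes.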
When feedback differs from predictions, the brain can update its assumptions, or it can seek to change reality, that is to remake the environment within which it is living. The brain can also deny the feedback and fail to adjust to reality, which can be fatal as natural selection does its work.
Efficient cognitive energy use requires focus and parsimony. Einstein said “everything should be made as simple as possible, and no simpler.” Occam’s razor holds that where there is more than one explanation for a phenomenon, the one with the fewest assumptions is likeliest to be correct. This parsimony principle applies in physics, biology and learning. Rivers rise in the mountains and follow the easiest path to the sea. Some metal alloys and muscle have memory. Neurons that fire together wire together, and so minimal brain energy is used once connections are well developed.
In its primitive origins most learning was concerned with survival. Learning by doing is amplified by necessity, that is by the drive to survive through natural selection. Unschooled street urchins learn the language and arithmetic needed to trade.
Blue skies thinking, ideation and thought experiments have long contributed new insights and generated theory and propositions far in advance of means of testing them, let alone of practical applications. However, science is not generally driven by free-ranging intellects rating each other’s papers. It comes from engagement with the real world. Science may lead, but more commonly it follows technology that links science to human experience. Technological innovation solves real world problems, and technology then advances science.
Most fundamental scientific advances are grounded in or spin-offs from practical problem solving. Radio astronomy arose from Jansky’s telecommunications work at Bell Labs. Louis Pasteur became microbiology’s father by tackling health and agricultural problems, leading to advances in vaccination, industrial fermentation and of course pasteurisation.
Technological change typically starts with trial and error learning rather than ex ante theory. From this learning, theory with explanatory and predictive power is developed and used to refine technology towards optimality. That is, reality creates data, and from this data theory is developed, rather than abstract theory generating hypotheses that selectively narrow the data against which they are tested.
Rather than being concentrated in universities or research laboratories, learning by doing typically draws on tacit knowledge that is decentralised and differentiated by local context. It encompasses Alfred Marshall’s view that “the secrets of industry are in the air”. It fits with the Austrian economic preference for methodological individualism, and its discomfort with mathematical modelling and macroeconomic analysis.
In industrial settings learning can occur as an unplanned and natural social process. Erik Lundberg demonstrated that, beginning in the 1930s, the Horndal steel works in Sweden achieved productivity gains of 2% a year for 15 years with virtually no new capital investment. In his famous 1962 learning by doing paper, Kenneth Arrow argued that technological change is a process of learning about the environment in which we operate. Production activity gives rise to problems for which favourable responses are selected over time. That is, evolutionary natural selection is at work.
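The Horndal figures compound strikingly. A minimal arithmetic sketch, using only the 2 per cent and 15-year figures cited above:

```python
# Compound effect of the Horndal works' learning-by-doing gains:
# 2% a year sustained for 15 years, with virtually no new capital.
annual_gain = 0.02
years = 15
cumulative = (1 + annual_gain) ** years
print(f"Productivity multiple after {years} years: {cumulative:.2f}")
# roughly 1.35 - about a one-third cumulative gain from learning alone
```

Small, steady gains from unplanned social learning compound into roughly a one-third productivity improvement without any new investment.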
Arrow integrated learning by doing into endogenous growth theory, arguing that productivity growth results from internal factors not from knowledge showers from welkin ivory towers. Technological innovation arises from differentiated, domain-dependent learning that is socially interactive and integrating. It is an evolutionary and cumulative process that lays a platform for further knowledge creation and technological functionality that is superior to that which came before. Knowledge’s indivisibility and non-rivalry means sharply diminishing marginal costs as ascending knowledge platforms build from antecedents and create exponentially new potentialities and affordances. To survive and to be passed on, knowledge has to be valuable. Such knowledge tends to be irreversible, and if it underpins new learning it is typically cumulative.
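The cumulative, diminishing-cost character of such learning is conventionally illustrated by a learning curve of the Wright's-law form – unit cost falling by a fixed fraction with each doubling of cumulative output. The parameters below are illustrative, not drawn from Arrow's 1962 paper:

```python
# A standard learning-curve illustration (Wright's law), in the spirit of
# learning by doing. An 80% "progress ratio" means each doubling of
# cumulative output cuts unit cost by 20%. Figures are invented.
import math

def unit_cost(cumulative_output: float, first_unit_cost: float = 100.0,
              progress_ratio: float = 0.8) -> float:
    """Cost of the next unit after `cumulative_output` units."""
    exponent = math.log(progress_ratio, 2)   # negative learning exponent
    return first_unit_cost * cumulative_output ** exponent

for n in (1, 2, 4, 8):
    print(n, round(unit_cost(n), 1))
# each doubling multiplies unit cost by 0.8: 100.0, 80.0, 64.0, 51.2
```

Because knowledge accumulates and is non-rival, each doubling of experience delivers the same proportional cost reduction – the “sharply diminishing marginal costs” of ascending knowledge platforms.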
Nassim Taleb describes as a “Ludic Fallacy” the abstract modelling of the real world and the making of long-term predictions from it. A map is not the territory – a representation of something is not the thing itself. A model can so simplify reality as to turn it into non-reality. Desktop modelling that purports to predict long-run future states departs from the messy reality that can only be understood by those who live in, engage with and receive feedback from the world.
Wall Street and City of London financiers can do a lot of damage through naïve and unworldly modelling and manipulative rent seeking. In contrast, London’s Inns of Court live through experience and felt necessity. Common law is not academic brain coinage. It builds enduring legal principles from case law derived from interactions between warm-blooded people doing things that matter to them. Common law has its feet on the ground, not its hand in the till.
Abstract models may start with well-informed prior beliefs, but they are projected long into the future without ongoing updating from real-world feedback. In contrast, plumbers, chefs and musicians all receive instantaneous or near-term feedback. Not so financial and macroeconomic modellers.
Learning by doing is undertaken by those with “skin in the game”, and with incentives to minimise energy use. Abstract modellers typically face no risk from failure, and lose nothing when models mislead because others bear the consequences. Economic and financial modellers move on before their prognostications are mugged by reality – with some exceptions.
Long-Term Capital Management (LTCM), a hedge fund, was established in 1994 around an absolute-returns, highly leveraged trading model. On its board were Robert Merton and Myron Scholes, awarded the 1997 Nobel Prize for a “new method to determine the value of financial derivatives”. LTCM was caught out by the 1997 Asian and 1998 Russian financial crises. Its collapse threatened a wider international financial crisis, prompting a rescue organised by the Federal Reserve Bank of New York; the fund was dissolved in 2000.
In contrast with much financial and economic forecasting, epidemiological models are grounded in empirical evidence, with exponential projections updated with new data from real-world observations. Coronavirus modelling projections were Bayesian and constantly revised in near real-time. Testing, diagnosis, treatment, demographic, sociological and behavioural data led to revised inferences, and sometimes to the abandonment of flawed prior beliefs.
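The Bayesian revision described above can be sketched concretely. In this toy example – all numbers invented for illustration – a discrete prior over an epidemic's daily growth rate is reweighted as each new day's case count arrives:

```python
# A minimal sketch of Bayesian updating of an epidemic growth-rate
# estimate. The candidate rates, prior, noise scale and case counts
# are all invented for illustration.
import math

rates = [0.05, 0.10, 0.15, 0.20, 0.25]   # candidate daily growth rates
prior = [0.2] * 5                        # flat prior belief over rates

def update(prior, rates, cases_today, cases_yesterday):
    """Reweight each candidate rate by how well it predicts today's count."""
    posterior = []
    for p, r in zip(prior, rates):
        predicted = cases_yesterday * (1 + r)
        # Gaussian likelihood of the observed count given the prediction
        likelihood = math.exp(-0.5 * ((cases_today - predicted) / 10.0) ** 2)
        posterior.append(p * likelihood)
    total = sum(posterior)
    return [p / total for p in posterior]

counts = [100, 110, 121, 133]            # observed daily case counts
belief = prior
for yesterday, today in zip(counts, counts[1:]):
    belief = update(belief, rates, today, yesterday)

best = rates[belief.index(max(belief))]
print(best)   # the data are most consistent with ~10% daily growth
```

Each day's observation shifts weight toward the growth rates that predicted it well and away from those that did not – and a prior belief that keeps failing to predict the data is, in time, abandoned.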
Learning involves natural selection, so that beliefs that are maladaptive pass away funeral by funeral. However, some flawed or false beliefs, and the institutions, societal rules and norms that instantiate them persist long after beliefs have been logically discredited. Dogmatic belief defies reality because it is not exposed to it.
In past times belief, only some of which was knowledge, was translated into traditions, rituals, religious beliefs or memeplexes that were passed on through the generations. Uncritical acceptance economised on cognitive energy, but stifled new learning. The lack of Bayesian updating of beliefs over time made them maladaptive.
The changing world leaves behind outdated cultural, religious and scientific beliefs, except where they ossify into sects, dogmas or identitarian habitations. These can function as group Markov blankets, with a surface tension that blocks out external sensory feedback and critical thinking – for a time… “Darwin’s dangerous acid”, the receding tide on Dover Beach, new Kuhnian scientific revolutions, the Vatican’s 1992 newspaper headline “Galileo got off”, and the 2008 global financial crisis are surfactant waymarks.
Transformative technological advances and potentialities have trended down over recent decades. Different reasons are cited. Perhaps the greatest advances have already occurred, and we are up against natural-law limits? Perhaps too many fine minds are playing online games, or are engrossed in Facebook? Or are scientific and technological advances constrained by their public-good nature, which means industry does not invest in them because their benefits are not privately appropriable?
What seems plausible is that science’s self-referential peer-review culture has led to disconnection from technology and therefore from people’s lives. In our times, the overweighting of abstract science and disembodied information technology means we have become remote from tangible material technology and engrossed in abstractions. Yet living standards depend on the material engineered world, not its internet-mediated phantasmagoria. Our mobiles require metals and rare earths, and our bitcoin consumes electricity produced by coal, natural gas, concrete hydro-dams and metal wind turbines.
Prior knowledge is the starting point for new learning. As Pasteur said, “chance favours the prepared mind”. As knowledge in every discipline accumulates, there is a higher threshold learning burden to reach the starting point for new knowledge creation, let alone to move beyond it. This can lead to longer research apprenticeships, as doctoral graduates must complete post-doctoral fellowships before they have a chance of a secure career. It also leads to narrower specialisations, and often to larger teams being required to pull together the capabilities needed to make progress.
However, these constraints can be overcome. Much science is translated through technology into optimisation routines that codify complex knowledge and make it accessible without too much need for every technology user to have deep understanding.
Knowledge embodied in technology that is digitally retrievable can be functionally valuable to learners without needing to be visible. New knowledge and means of acquiring it can be subject to low or near-zero marginal costs of dissemination. Digital technology codifies pertinent knowledge and enhances its retrieval. Computational technology will only get better. Autonomous intelligence will drive learning advances unforeseeable to us now. Whether this will take eons, eras, periods, epochs, ages or next year only time will tell.