
Plants doing arithmetic

On June 23, phys.org provided a preview of a paper to be published in the journal eLife. The title given their report: Plants do sums to get through the night, researchers show.

New research shows that to prevent starvation at night, plants perform accurate arithmetic division. The calculation allows them to use up their starch reserves at a constant rate so that they run out almost precisely at dawn. “This is the first concrete example in a fundamental biological process of such a sophisticated arithmetic calculation,” said mathematical modeller Professor Martin Howard from the John Innes Centre.

…During the night, mechanisms inside the leaf measure the size of the starch store and estimate the length of time until dawn. Information about time comes from an internal clock, similar to our own body clock. The size of the starch store is then divided by the length of time until dawn to set the correct rate of starch consumption, so that, by dawn, around 95% of starch is used up.

An ‘in press’ copy of the paper by Antonio Scialdone, Sam Mugford, Doreen Feike, Alastair Skeffington, Philippa Borrill, Alison Smith and Martin Howard (all of the John Innes Centre), together with Alexander Graf (ETH Zurich), can be accessed here.  From the paper:

Overall, these results demonstrate that the control of starch degradation at night to achieve almost complete consumption at the expected time of dawn can accommodate unexpected variation in the time of onset of darkness, starch content at the start of the night, and patterns of starch accumulation during the preceding day. Although the rate of degradation is different in a circadian clock mutant with an altered period from that in the wild-type, the capacity to adjust starch degradation in response to an unexpectedly early night is not compromised.

It’s important to understand that these computations do seem to be actions, since they accommodate variations in the length of the night as well as in the amount of starch acquired. The quantities involved are established by mechanisms within the plant that respond to the start and end of night, and by molecular activity that encodes the amount of starch stored. “Since it is conceptually unclear how such a computation might be performed,” the authors tell us, “we turned to mathematical modeling to generate possible mechanisms.” They assumed the presence of two molecules, one whose concentration is related to the amount of starch present, and one whose concentration encodes information about the expected time to dawn. Arithmetic operations are then implemented by chemical reactions.
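To see the arithmetic concretely, here is a minimal sketch of the division being described – my own toy model, not the paper’s actual reaction scheme, with all names and constants arbitrary. A store S is consumed at the rate S/T, where T is the clock’s estimate of the time remaining until dawn; the result is a constant rate of consumption that exhausts the store right on time.

```python
# Toy version of the starch division (a sketch; units and constants arbitrary).
# S plays the role of the quantity tracking the starch store, and T the
# clock-derived estimate of time remaining until dawn. Consuming the store
# at rate S/T uses it up at a constant rate that hits zero at dawn.

night_hours = 12.0    # expected length of the night
dt = 0.01             # time step, hours
S = 100.0             # starch store at dusk, arbitrary units

t = 0.0
while t < night_hours - dt:
    T = night_hours - t        # time remaining until dawn (clock input)
    rate = S / T               # the arithmetic division
    S -= rate * dt             # starch consumed this step
    t += dt

print(f"starch left at dawn: {S:.2f} of 100 units")   # ~0: used up on time
```

Because the rate is recomputed from the current store, an unexpectedly early night or an unusually large store simply changes the quotient – which is the robustness the paper reports. The authors place this kind of analog computation in a broader context: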

It is a longstanding idea that cells are able to use proteins to store and process information through networks of interactions (Bray, 1995). Understanding how such biochemical networks work and what kind of computations they perform is an ongoing challenge (see Deckard and Sauro, 2004, Lim et al., 2013). Our analysis here has underlined the utility of analog chemical kinetics in performing arithmetic computations in biology. Importantly, we have for the first time provided a concrete example of a biological system where such a computation is of fundamental importance.

There is something fascinating, I believe, about the relationship between these molecular actions and our symbolic arithmetic. At the very least, this research demonstrates that outside of our symbolic representation of a computation, the action of a computation exists. It reminds me of a discussion of mathematics I once read where it was noted that the derivative can be expressed in more than one way – as a rate of change, as the slope of the tangent to a curve, or fully arithmetically as a limit. So, the question was asked, what is the derivative? Inevitably, the careful observation of something like a plant’s performance of division will contribute fresh insight into the way one might try to answer this question.

Finding ourselves between physics and biology

The Institute of Physics (IOP) Biological Physics Group has a conference coming up June 24 to June 26 in Brighton, UK.  The title of the conference is what first got my attention: Physics of Emergent Behavior/From single cells to groups of individuals.

The following text appears on the conference home page to introduce their interest in emergent collective behavior:

Biological systems are often conceptualised as networks of interacting genes and proteins. Nevertheless, a simple analysis of the fundamental genetic programs is often not sufficient to explain higher-level functions such as multi-cellular aggregation, tissue organization, embryonic development, and collective behaviour of groups of individuals. Furthermore, various aspects of these processes are often emergent properties of the underlying complex system, irrespectively to its microscopic details. In the past few years, larger scale experiments allowed the construction of statistical mechanics models of biological systems directly from real data, producing immense progress in our understanding of emergent collective behaviour in biology.

On their list of invited speakers is Dante Chialvo from the National Research Council of Argentina. He happened to be first alphabetically and, since his work is described as exploring “the interface of physics and biology on a variety of problems,” I decided to try to find some of his papers and talks. Chialvo’s work is largely devoted to finding evidence for the idea that the brain itself is in what physics calls a critical state. The rationale for proposing this idea is presented at the beginning of a paper that Chialvo co-authored with Enzo Tagliazucchi in 2012.

Complexity, in simple terms, is all about how diversity and non-uniformity arises from the uniform interaction of similar units. In all cases, the dynamics of the emergent complex behavior of the whole cannot be directly anticipated from the knowledge of the laws of motion of the isolated parts. Early forerunners of complexity science, namely statistical mechanics and condensed matter physics, have identified a peculiar scenario at which, under certain general conditions, such complexity can emerge: near the critical point of a second order phase transition. At this point, complexity appears as a product of the competition between ordering and disordering collective tendencies, such that the final result is a state with a wide variety of dynamic patterns exhibiting a mixture of order and disorder.

…From the cognitive side, brain’s complexity is an almost obvious statement: the ultimate products of such complexity are, for instance, the nearly unpredictable human behavior and the underlying subjective experience of consciousness, with its bewildering repertoire of possible contents. However, the proposal that the same mechanisms underlying physical complexity also underlie the biological complexity of the brain is surprisingly recent…. Being a relatively recent proposal, the consequences of such hypothesis are still far from clear.

Characteristic of a critical state is a subtle balance between order and disorder, a balance now being observed in neuronal activity. An article discussing very recent work appeared on APS’s website. In a talk Chialvo gave in 2010, he argued that it is reasonable to expect this hypothesis to be true. He began his argument with a simple question: “Why do we need a brain at all?” His answer is also simple. The brain is necessary to navigate a complex, critical world. By this he means that the world itself rests on the border between order (subcritical) and disorder (supercritical), where the critical state is understood as one in which islands of order are bordered by the edges of disorder.

In a sub-critical world everything would be simple and uniform – there would be nothing to learn.  In a supercritical world, everything would be changing all the time – it would be impossible to learn.

The brain took shape to navigate in a complex, critical world.  But why must the brain itself be critical? Simple again.

In a sub-critical brain memories would be frozen.  In a supercritical brain, patterns change all the time so no long term memory would be possible.  To be highly susceptible, the brain itself has to be in the (in-between) critical state.
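The sub/critical/supercritical vocabulary can be made concrete with a toy model. The sketch below is my own illustration, not a model from Chialvo’s talk: a simple branching process in which each active unit triggers two others with probability sigma/2, so the mean number of offspring is sigma. When sigma is below 1 activity dies out quickly; above 1 it tends to explode; only at the critical value 1 do cascades of every size appear.

```python
import random

def avalanche_size(sigma, cap=10_000):
    """Total activity triggered by one seed unit when each active unit
    activates two others with probability sigma/2 (mean offspring sigma)."""
    active, size = 1, 0
    while active and size < cap:
        size += active
        active = sum(2 for _ in range(active) if random.random() < sigma / 2)
    return size

for sigma in (0.8, 1.0, 1.2):   # subcritical, critical, supercritical
    sizes = [avalanche_size(sigma) for _ in range(500)]
    print(f"sigma={sigma}: mean size {sum(sizes)/len(sizes):8.1f}, "
          f"largest {max(sizes)}")
```

Subcritical runs stay tiny and uniform, supercritical runs hit the cap, and the critical case produces the wide mixture of small and large events – the “mixture of order and disorder” of the quoted passage.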

It was on this thought that my attention became focused. It created the impression of the brain (or the organism) and the world (or the universe) as joined, or interlocked. Chialvo’s answer to ‘why a brain at all?’ seems right, since organisms and their worlds are fully interactive – they belong to each other. At life’s most fundamental levels, it makes sense that an effective brain would acquire a structure that somehow matches its world. But I think it’s interesting to extend this view of the brain (as it has to do essentially with learning) to our more purely intellectual pursuits as well. Considering this biological notion of criticality could shed new light on how it is that we come to understand our experience in the symbolic forms created in the arts and the sciences. This brings my mind back again to the way David Deutsch once expressed the baffling awareness we seem to have of aspects of our origins, ones that lie far outside our time and space. I’ll finish by reproducing it once again.

The one physical system, the brain, contains an accurate working model of the other, the quasar, not just a superficial image of it (though it contains that as well) but an explanatory model embodying the same mathematical relationships and the same causal structure… The faithfulness with which the one structure resembles the other is increasing with time… Physical objects that are as unlike each other as they could possibly be can, nevertheless, embody the same mathematical and causal structure and do it more and more so over time… This place is a hub which contains within itself the structural and causal essence of the whole of the rest of physical reality.

What’s the tool, what’s the reality, what are we doing?

I am intrigued by the current debate in physics concerning the significance of the wave function in quantum theory. The nature of the debate opens the door to a host of philosophical issues surrounding both physics and mathematics. In an article appearing in the June issue of Scientific American, I was introduced to a relatively new, alternative view of the nature of the wave function (known now as QBism by its proponents). Theoretical particle physicist and author Hans Christian von Baeyer describes something of the history of interpretations of quantum strangeness, and outlines the way the issues are dealt with in QBism.

QBism, which combines quantum theory with probability theory, maintains that the wave function has no objective reality. Instead QBism portrays the wave function as a user’s manual, a mathematical tool that an observer uses to make wiser decisions about the surrounding world—the quantum world. Specifically, the observer employs the wave function to assign his or her personal belief that a quantum system will have a specific property, realizing that the individual’s own choices and actions affect the system in an inherently uncertain way. Another observer, using a wave function that describes the world as the person sees it, may come to a completely different conclusion about the same quantum system. One system—one event—can have as many different wave functions as there are observers. After observers have communicated with one another and modified their private wave functions to account for the newly acquired knowledge, a coherent worldview emerges.

Seen this way, the wave function “may well be the most powerful abstraction we have ever found,” says theoretical physicist N. David Mermin of Cornell University, a recent convert to QBism.

The most immediate allure of not attributing a physical reality to the wave function is that it rids the quantum realm of disturbing paradoxes, like a particle occupying two locations at the same time, or information traveling faster than the speed of light. In the 1930s Niels Bohr emphasized the formalism that gave the wave function its power as a computational tool. But this view of the wave function is strengthened considerably in QBism, which incorporates the use of Bayesian statistics. Bayesian statistics was established by Thomas Bayes in the 18th century, and then independently rediscovered and developed further by Laplace in the early 19th century. The Bayesian paradigm employs probability as a conditional measure of uncertainty, much like the ordinary use of the word. Statistical inference, then, is the modification of the uncertainty about a quantity in the light of new evidence, and Bayes’ Theorem specifies how this modification is made. These probabilities contrast with the probabilities defined by observed frequencies – how many times something actually happens. In simple cases like coin tosses, the two kinds of probabilities agree. But, von Baeyer explains,

For the prediction of the weather or of the outcome of a military action, the Bayesian, unlike the frequentist, is at liberty to combine quantitative statistical information with intuitive estimates based on previous experience.

I find it worth noting that Bayesian statistics are also being used widely to model perception, cognitive development and learning in general.
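The updating step itself is compact enough to show. Below is a minimal sketch of Bayes’ Theorem at work on a coin example of my own devising – the hypotheses, priors and observed tosses are hypothetical numbers, not from von Baeyer’s article.

```python
from fractions import Fraction

# Bayes' Theorem: posterior is proportional to likelihood times prior.
# Two hypotheses about a coin: fair (P(heads)=1/2) or biased (P(heads)=3/4).
priors = {"fair": Fraction(1, 2), "biased": Fraction(1, 2)}
p_heads = {"fair": Fraction(1, 2), "biased": Fraction(3, 4)}

def update(beliefs, observed_heads):
    """Modify the uncertainty about the coin in light of one toss."""
    likelihood = {h: (p_heads[h] if observed_heads else 1 - p_heads[h])
                  for h in beliefs}
    evidence = sum(likelihood[h] * beliefs[h] for h in beliefs)
    return {h: likelihood[h] * beliefs[h] / evidence for h in beliefs}

beliefs = priors
for toss in [True, True, False, True]:   # heads, heads, tails, heads
    beliefs = update(beliefs, toss)

for hypothesis, p in beliefs.items():
    print(hypothesis, float(p))
```

After these four tosses the belief shifts toward the biased coin (roughly 0.63 against 0.37). The frequentist, as von Baeyer notes, has no comparable way of folding a prior hunch into the numbers.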

Christopher A. Fuchs, now at the Perimeter Institute in Ontario, is a prominent spokesperson for QBism. Von Baeyer reports in his article that Fuchs recently made an important mathematical discovery that lends QBism even greater strength. He has shown that experimental results can be predicted using probabilities alone, without using wave functions at all.

There is an illustration in the Scientific American article that, I believe, allows a mistaken oversimplification of the issue. In the illustration, the wave function is placed either in the box that holds Schrödinger’s famous cat, or inside the head of the person looking at the box. The wave function may be an instrument, but the complexity of its development, and of its use, should serve to amplify the complexity of our relationship to the world we perceive.

In a paper collecting answers to interview questions, Fuchs takes note of the fact that QBism makes Wheeler’s question, “why the quantum,” even more pressing.

In other words, even if quantum theory is purely a theory for apportioning and structuring degrees of belief, the question of “Why the quantum?” is nonetheless a question of what it is about the actual, real, objective character of the world that compels us to use this framework for reasoning rather than another. We observers are floating in the world, making decisions on all that we experience around us: Why are we well-advised to use the formalism of quantum theory for that purpose and not some other formalism? Surely it connotes something about the general character of the world—something that is contingent, something that might have been otherwise, something that goes deeper than our decision-making itself.

Then later,

What has been lost sight of is that physics as a subject of thought is a dynamic interplay between storytelling and equation writing. Neither one stands alone, not even at the end of the day.

No doubt the difficulty with QBism is the manner in which notions of objectivity and subjectivity are being handled. But this is an intriguing problem, and one which the content of mathematics always poses. I’ll end with Fuchs’s take on the problem of allowing ‘subjectivity’ into science.

“Subjective” is such a frightening word. All our lives we are taught that science strives for objectivity. Science is not a game of opinions, we are told. That diamond is harder than calcite is no one’s opinion! Mr. Mohs identified such a fact once, and it has been on the books ever since. In much the same way, quantum theory has been on the books since 1925, and it doesn’t appear that it will be leaving any time soon. That isn’t lessened in any way by being honest of quantum theory’s subject matter:  That, on the QBist view, it is purely a calculus for checking the consistency of one’s personal probabilities. If by subjective probabilities one means probabilities that find their only source in the agent who has assigned them, then, yes, quantum probabilities are subjective probabilities. They represent an agent’s attempt to quantify his beliefs to the extent he can articulate them. Why should this role for quantum theory—that it is a calculus in the service of improving subjective degrees of belief—be a frightening one? I don’t know, but a revulsion or fear does seem to be the reaction of many if not most upon hearing it. It is as if it is a demotion or a slap in the face of this once grand and majestic theory. Of course QBism thinks just the opposite: For the QBist, the lesson that the structure of quantum theory calls out to be interpreted in only this way is that the world is an unimaginably rich one in comparison to the reductionist dream. It says that the world has excitement, risk, and adventure at its very core.

And it says that we are made of stuff belonging to that world that possesses an inexhaustible talent for building structure from sensation.


Multiverse, buses and emergent space-time

There was once what many call a ‘foundational crisis in mathematics’ – disputes among mathematicians about both their ideas and their methods.  But while one needn’t now address the relationship between mathematics and reality in order to pursue a successful career in mathematics, the conceptual and experimental puzzles of modern physics likely reflect a similar difficulty that we have reconciling our ideas with our reality.

Some of these puzzles are addressed in a recent article appearing on the Simons Foundation website by Natalie Wolchover.  The article was given the title Is Nature Unnatural?  In it Wolchover describes difficulties that physicists have reconciling very effective mathematical models with experimental data.  The accuracy with which the mathematics of the Standard Model represents experimental results has forced the consideration of some fairly radical ideas about the nature of our reality.  Some physicists find these alternatives unacceptable, while others see them as inevitable.

With the discovery of only one particle, the LHC experiments deepened a profound problem in physics that had been brewing for decades. Modern equations seem to capture reality with breathtaking accuracy, correctly predicting the values of many constants of nature and the existence of particles like the Higgs. Yet a few constants — including the mass of the Higgs boson — are exponentially different from what these trusted laws indicate they should be, in ways that would rule out any chance of life, unless the universe is shaped by inexplicable fine-tunings and cancellations…

Physicists reason that if the universe is unnatural, with extremely unlikely fundamental constants that make life possible, then an enormous number of universes must exist for our improbable case to have been realized.

It’s interesting that this multiverse possibility is essentially a product of missing pieces in our ‘breathtakingly accurate’ model of the universe, taken together with what we understand about probabilities. Shifting the mind’s eye a bit: a well-investigated pattern in mathematics, one that characterizes quantum chaotic systems, has also been observed in the departure times of buses at a specific location in Mexico, where drivers alter their behavior based on information they receive about the buses ahead of them. In another article, Wolchover explains:

Subatomic particles have little to do with decentralized bus systems. But in the years since the odd coupling was discovered, the same pattern has turned up in other unrelated settings. Scientists now believe the widespread phenomenon, known as “universality,” stems from an underlying connection to mathematics, and it is helping them to model complex systems from the Internet to Earth’s climate.

The concept of universality is grounded in the purely mathematical exploration of eigenvalues, whose roots trace back to the late 18th century.

“It seems to be a law of nature,” said Van Vu, a mathematician at Yale University who, with Terence Tao of the University of California, Los Angeles, has proven universality for a broad class of random matrices.

Universality is thought to arise when a system is very complex, consisting of many parts that strongly interact with each other to generate a spectrum. The pattern emerges in the spectrum of a random matrix, for example, because the matrix elements all enter into the calculation of that spectrum. But random matrices are merely “toy systems” that are of interest because they can be rigorously studied, while also being rich enough to model real-world systems, Vu said. Universality is much more widespread. Wigner’s hypothesis (named after Eugene Wigner, the physicist who discovered universality in atomic spectra) asserts that all complex, correlated systems exhibit universality, from a crystal lattice to the Internet.
The more complex a system is, the more robust its universality should be, said László Erdös of the University of Munich, one of Yau’s collaborators. “This is because we believe that universality is the typical behavior.”
In many simple systems, individual components can assert too great an influence on the outcome of the system, changing the spectral pattern. With larger systems, no single component dominates. “It’s like if you have a room with a lot of people and they decide to do something, the personality of one person isn’t that important,” Vu said.

The technique is being applied to the analysis of the evolution of the Internet, climate change models, and tests of bone tissue related to an understanding of osteoporosis.
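The random-matrix signature itself is easy to reproduce. Below is a minimal sketch of mine: build a large random symmetric (GOE) matrix, compute the gaps between neighboring eigenvalues, and compare the histogram to the Wigner surmise – the “level repulsion” shape reported for the Cuernavaca bus departures. (A careful treatment would first “unfold” the spectrum; taking spacings from the middle of the spectrum is a rough shortcut.)

```python
import numpy as np

# Nearest-neighbor eigenvalue spacings of random symmetric matrices,
# compared with the Wigner surmise p(s) = (pi/2) s exp(-pi s^2 / 4).
rng = np.random.default_rng(0)
n, trials = 1000, 10
spacings = []
for _ in range(trials):
    a = rng.normal(size=(n, n))
    eigs = np.linalg.eigvalsh((a + a.T) / 2)   # symmetric (GOE-like) matrix
    mid = eigs[int(0.45 * n): int(0.55 * n)]   # stay in the spectrum's bulk
    s = np.diff(mid)
    spacings.extend(s / s.mean())              # normalize mean spacing to 1

spacings = np.array(spacings)
hist, edges = np.histogram(spacings, bins=12, range=(0, 3), density=True)
centers = (edges[:-1] + edges[1:]) / 2
wigner = (np.pi / 2) * centers * np.exp(-np.pi * centers ** 2 / 4)
for c, h, w in zip(centers, hist, wigner):
    print(f"s = {c:.2f}   observed {h:.3f}   Wigner surmise {w:.3f}")
```

The striking point of universality is that the matrix entries were Gaussian here only for convenience; broad classes of entry distributions, and systems with no matrix in sight, produce the same curve.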

Another take on the current state of affairs is given in an interview with Nobel Prize-winning physicist David J. Gross by Peter Byrne (also on the Simons Foundation website). Byrne tells us:

Gross characterizes theoretical physics as rife with esoteric speculations, a strange superposition of practical robustness and theoretical confusion.

In response to the question “Is there a crisis in physics?”  Gross says:

I do not view the present situation as a crisis, but as the kind of acceptable scientific confusion that discovery eventually transcends.

But I found the next question unexpected and the answer provocative.

What does it mean to say that space-time is an emergent phenomenon?
[Chuckles.] That is a very sophisticated concept, which takes from about birth until the age of two to grasp. We do not really experience space-time; it’s a model. It describes how to get that piece of food that’s on the rug over there: crawl.
Our model of space-time, as amended by Einstein, is extremely useful, but perhaps it is not fundamental. It might be a derived concept. It seems to emerge from a more fundamental physical process that informs the mathematical pictures drawn by string theory and quantum field theory.

It seems that the conceptual difficulties in modern physics cannot avoid an intriguing question about us, namely, “What are we doing when we build a concept?”  Gross also says this:

There are frustrating theoretical problems in quantum field theory that demand solutions, but the string theory “landscape” of 10^500 solutions does not make sense to me. Neither does the multiverse concept or the anthropic principle, which purport to explain why our particular universe has certain physical parameters. These models presume that we are stuck, conceptually.

When asked about human consciousness and objective reality, Gross sounds like a Platonist.

I believe that there is a real world, out there, and that we see shadows of it: our models, our theories. I believe that mathematics exists. It may be entirely real in a physical sense; it may also contain “things” that are ideal. But, to be clear, the human mind is a physical object. It’s put together by real molecules and quarks.

One of the more aggressive moves toward an interdisciplinary look at how our views of reality are constructed was the 2011 International and Interdisciplinary Conference organized by the Foundational Questions Institute, which I plan to go back to in a future post:
Setting Time Aright: An international and inter-disciplinary meeting investigating the Nature of Time.

Quantum mechanical biology

In my guest blog for Scientific American, I wrote about the work of Bob Coecke, who has designed a graphical mathematics based on a branch of mathematics called category theory. He uses this diagrammatic calculus to describe and investigate quantum mechanical processes. Coecke’s work has found application in biology and linguistics, suggesting some interesting links between mathematics, physics and biology. Yet another link between quantum mechanics and biology is discussed in a recent article on the Foundational Questions Institute website: Could quantum effects explain the mechanisms behind smell, photosynthesis and bird navigation?

In the article Carinne Piekema reviews the work of three researchers – quantum physicists Simon Benjamin at Oxford University and Alex Chin at the University of Cambridge, and biophysicist Luca Turin. Turin was one of the first to suggest a quantum effect in a biological process, namely, in how we distinguish odors.  This description of biological receptors sets up the problem Turin’s work addresses.

In general, for biological receptors, shape is everything.  Molecules of a certain shape are able to bind with particular receptors, triggering them via what’s known as the “lock-and-key” mechanism. This is true for antibodies, hormones, enzymes and even many neurotransmitters, so it seems perfectly reasonable to assume that molecules with similar shapes would bind to the same receptors in the nose, generating identical smell sensations. Indeed, this idea forms the basis of the conventional, non-quantum theory of smell.

But, Turin explains, “There doesn’t seem to be a correlation between molecular structure and its smell.” Recognizing this led Turin to propose that we distinguish smells on the basis of the frequencies with which their bonds vibrate, rather than on their shape. A quantum mechanical process is needed, however, to explain how that kind of interaction can happen at the nanoscale of the proteins in your nose. Experimental verification of this idea has run into some difficulties, but Turin has remedied some of them and is encouraged by a related investigation of photosynthesis, an extremely efficient process that works even in the low light conditions of the ocean bottom.

The extreme efficiency seems surprising given that the signal needs to travel all the way down the leaf, to its “reaction centre,” which in itself costs energy. In 2007, chemist Greg Engel, then at the University of California, Berkeley, and colleagues ran a series of experiments that suggested that the process might actually make use of a quantum property known as superposition—the ability to be in two or more places at the same time. “When people were able to zoom in on what was happening in these tiny time windows, they saw that actually energy doesn’t just hop from molecule to molecule,” explains Chin. “It actually spreads in a wave-like manner, thus evolving according to the laws of quantum mechanics.”

But I most enjoyed the discussion of  bird navigation.  Based on the observation that vision plays a role in the European robin’s sense of direction, Benjamin and colleagues have come up with a hypothesis to describe the quantum mechanical processes involved in the birds’ sense of the earth’s magnetic field:

The quantum physicists now believe that the bird might actually “see” a grid like pattern on its eyeballs.  The idea is that when a molecule in the eye absorbs a photon from sun light, it gives an energetic nudge to a pair of electrons in the molecule. The electrons are entangled—that is, they are inextricably linked by a quantum property so that they influence each other, no matter how far apart they are separated. One of the electrons in the pair is dislodged and kicked off to a new location, but it remains linked to its partner, and each feels a slightly different level of magnetism, due to the Earth’s field. Before the transported electron relaxes back to its original state, a small electric dipole field is created, leaving a little trace on the bird’s vision. The orientation of the molecule with respect to the Earth’s magnetic field dictates how quickly the electron relaxes back and thus controls the strength of the superimposed image on the bird’s vision.
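A drastically simplified version of this radical-pair story fits in a few lines. The sketch below is my own toy model with hypothetical constants, not the researchers’ calculation: the pair starts in an entangled singlet state; because the two electrons feel slightly different fields, the pair oscillates between singlet and triplet at a difference frequency that depends on the molecule’s angle theta to the Earth’s field; averaging the singlet probability cos²(Δω·t/2) against relaxation at rate k gives an orientation-dependent yield of ½(1 + k²/(k² + Δω²)).

```python
import numpy as np

# Toy radical-pair compass (a sketch; all constants hypothetical, geometry
# drastically simplified). One electron of the entangled pair feels an extra,
# orientation-dependent hyperfine field, so the singlet probability
# oscillates as cos^2(d_omega * t / 2). With the pair relaxing back at
# rate k, the time-averaged singlet yield is
#     0.5 * (1 + k**2 / (k**2 + d_omega**2)),
# which varies with the compass angle theta -- the faint pattern
# superimposed on the bird's vision.

A = 1.0e7        # anisotropic hyperfine coupling, rad/s (hypothetical)
gamma_B = 8.8e6  # electron precession rate in Earth's ~50 uT field, rad/s
k = 1.0e6        # relaxation rate, 1/s (hypothetical)

for theta_deg in range(0, 91, 15):
    theta = np.radians(theta_deg)
    d_omega = gamma_B + A * np.cos(theta)    # toy orientation dependence
    singlet_yield = 0.5 * (1 + k**2 / (k**2 + d_omega**2))
    print(f"theta = {theta_deg:2d} deg   singlet yield = {singlet_yield:.3f}")
```

Real treatments track the nuclear spins and solve the full quantum dynamics; the point of the toy is only that an entangled pair plus a small, orientation-dependent frequency difference yields a direction-dependent signal.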

What I find most interesting about these reports is the way the unshakeable presence of quantum mechanics in physics seems to be prying open conceptual structures, making room for newly imagined views of the living material that is the subject of biology. Despite the fact that we call it quantum ‘mechanics,’ quantum theory shattered the mechanical world view that prevailed in classical physics. Yet that classical world view continues to have some general influence. The conceptual shift in the research discussed here is not fully clarified. But it does look like the mind’s eye is getting nudged a bit, and beginning to see that there are many things that may happen in ways we have not yet imagined. This kind of thinking will undoubtedly refresh our sense of what is real.


Quantum Mechanical Words and Mathematical Organisms

My post appeared on the Scientific American Guest Blog this morning.  Here’s the link:

Quantum Mechanical Words and Mathematical Organisms

Juggling, interviews and grant opportunities

My time this week is again taken up with work on a few writing projects that I’m trying to wrap up (not to mention end of the term grading).  But I should be back on track with my regular blogs next week.

In the meantime, an article on scientificamerican.com caught my attention, being about the mathematics of juggling!  But the article originated elsewhere, at Simons Science News.  And this is how I became aware of the Simons Foundation website.  It’s worth exploring.

The Simons Foundation’s mission is to advance the frontiers of research in mathematics and the basic sciences. We sponsor a range of programs that aim to promote a deeper understanding of our world.

It is a private foundation that provides some interesting grant opportunities.

The site provides some video interviews with mathematicians organized under the heading Science Lives. These include interviews with John Nash, Michael Atiyah and, from my own alma mater – The Courant Institute of Mathematical Sciences – Cathleen Morawetz. There’s also an interesting article on new observations in biology that are consistent with a mathematical idea Turing proposed in 1952.

Now, more than 60 years later, biologists are uncovering evidence of the patterning mechanisms that Turing proposed in his paper, prompting a resurgence of interest in them, with the potential to shed light on such developmental questions as how genes ultimately make a hand.


A brief note and a little from Deutsch

I’m short on time today and working on a guest blog, which I hope to be able to link to shortly. But I did begin exploring a website that has short video interviews with some of my favorite thinkers. Among the participants listed on the website Closer To Truth, I found Gregory Chaitin and David Deutsch. They have each participated in a number of video interviews that are always short, but always interesting.

I listened to one of Deutsch’s this evening called What is Ultimate Reality? In the interview he described the four fundamental, interdependent aspects of reality presented in his book The Fabric of Reality: quantum physics, the theory of evolution, the theory of computation and the theory of knowledge. Briefly, he says that quantum physics constrains the kinds of theories that one can express, and that evolution is the theory of emergent properties that cannot be expressed in terms of atoms. Computation is about the processes in nature that are independent of, or transcend, the substance in which they are embodied. And knowledge is the kind of information that can do things. Knowledge is embodied in DNA, brains, books, computers, etc. But Deutsch makes the point that what moves things, changes things or creates things is information, not the substances in which the information is embodied.

For me, the most refreshing and important thought was his observation that all of these aspects of reality have been underestimated: each is accepted as the right explanation in its own field, but they have not become integrated. The depth of these specialized fields has made it increasingly difficult to consider their interdependence.

In another interview, Gregory Chaitin explores his own Platonism.

I recommend listening.

Structure, structure and more structure

I was expecting to write about a paper I found recently by Oran Magal, a postdoc at McGill University, On the mathematical nature of logic. I was attracted to the paper because the title was followed by the phrase Featuring P. Bernays and K. Gödel.

I’m often intrigued by disputes over whether mathematics can be reduced to logic or whether logic is, in fact, mathematics, because these disputes often remind me of questions addressed by cognitive science, questions related to how the mind uses abstraction to build meaning. This particular paper acknowledges, in the end, that its purpose is two-fold. It makes the philosophical argument that an examination of the interrelationship between mathematics and logic shows that “a central characteristic of each has an essential role within the other.” But the paper is also a historical reconstruction and analysis of the ideas presented by Bernays, Hilbert and Gödel (the detail of which is not particularly relevant to my concerns). It was Bernays’ perspective that I was most interested in pursuing.

Magal begins with the observation that

the relationship between logic and mathematics is especially close, closer than between logic and any other discipline, since the very language of logic is arguably designed to capture the conceptual structure of what we express and prove in mathematics.

While some have seen logic as more general than mathematics, there has also been the view that mathematics is more general than logic. It is here that Magal introduces Bernays’ idea that logic and mathematics are equally abstract but in different directions. And so they cannot be derived one from the other but must be developed side-by-side. When logic is stripped of content it becomes the study of inference, of things like negation and implication. But while logical abstraction leaves the logical terms constant, according to Bernays, mathematical abstraction leaves structural properties constant. These structural properties do seem to be the content of mathematics, and what makes mathematics so powerful.

Magal describes how Bernays understands Hilbert’s axiomatic treatment of geometry. Here, the purely mathematical part of knowledge is separated from geometry (where geometry is thought of as the science of spatial figures) and is then investigated directly.

The spatial relationships are, as it were, mapped into the sphere of the abstract mathematical in which the structure of their interconnections appears as an object of pure mathematical thought. This structure is subjected to a mode of investigation that concentrates only on the logical relations and is indifferent to the question of the factual truth, that is, the question whether the geometrical connections determined by the axioms are found in reality (or even in our spatial intuition). (Bernays, 1922a, p. 192) (emphasis added)

Magal then uses abstract algebra to illustrate the point:

To understand Bernays’ point, that this is a structural direction of abstraction, and the sense in which this is a mathematical treatment of logic, it is useful to compare this to abstract algebra. The algebra familiar to everyone from our school days abstracts away from particular calculations, and discusses the rules that hold generally (the invariants, in mathematical terminology) while the variable letters are allowed to stand for any numbers whatsoever. Abstract algebra goes further, and ‘forgets’ not just which number the variables stand for, but also what the basic operations standardly mean. The sign ‘+’ need not necessarily stand for addition. Rather, the sign ‘+’ stands for anything which obeys a few rules; for example, the rule that a + b = b + a, that a + 0 = a, and so on. Remember that the symbol ‘a’ need not stand for a number, and the numeral ‘0’ need not stand for the number zero, merely for something that plays the same role with respect to the symbol ‘+’ that zero plays with respect to addition. By following this sort of reasoning, one arrives at an abstract algebra; a mathematical study of what happens when the formal rules are held invariant, but the meaning of the signs is deliberately ‘forgotten’. This leads to the study of general structures such as groups, rings, and fields, with immensely broad applicability in mathematics, not restricted to operations on numbers.
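Magal’s ‘+’ that need not mean addition is easy to make concrete. In the little sketch below – my example, not Magal’s – ‘+’ is exclusive-or on the bits {0, 1} and ‘0’ is the bit 0; none of the signs mean what they do in arithmetic, yet the structural rules all hold, so this tiny system is a group.

```python
from itertools import product

# XOR on {0, 1} playing the abstract roles of '+' and '0'.
elements = [0, 1]

def plus(a, b):
    return a ^ b          # exclusive-or standing in for '+'

zero = 0                  # the bit 0 standing in for '0'

# a + b == b + a (commutativity)
assert all(plus(a, b) == plus(b, a) for a, b in product(elements, repeat=2))
# a + 0 == a (identity)
assert all(plus(a, zero) == a for a in elements)
# (a + b) + c == a + (b + c) (associativity)
assert all(plus(plus(a, b), c) == plus(a, plus(b, c))
           for a, b, c in product(elements, repeat=3))
# every a has an inverse: some b with a + b == 0
assert all(any(plus(a, b) == zero for b in elements) for a in elements)

print("XOR on {0,1} satisfies the group axioms")
```

It is exactly this indifference to what the signs mean, while the structure is held invariant, that Bernays identifies as mathematics’ own direction of abstraction.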

Again the key to the discussion is the question of content. When mathematics is viewed as a variant of logic, it could easily be judged to have no specific content. The various arguments presented are complex, and not everyone writes with respect to the same logic. But the consistency of Bernays’ argument is most interesting to me. He is very clear on the question of content in mathematics. And reading this sent me back to another of his essays, from 1959, in which he responds to Wittgenstein’s thoughts on the foundations of mathematics. Here he challenges Wittgenstein’s view with the observation that colors are not just ‘a nothing.’

Where, however, does the initial conviction of Wittgenstein’s arise that in the region of mathematics there is no proper knowledge about objects, but that everything here can only be techniques, standards and customary attitudes? He certainly reasons: ‘There is nothing here at all to which knowing could refer.’ That is bound up, as already mentioned, with the circumstance that he does not recognize any kind of phenomenology. What probably induces his opposition here are such phrases as the one which refers to the ‘essence’ of a colour; here the word ‘essence’ evokes the idea of hidden properties of the color, whereas colors as such are nothing other than what is evident in their manifest properties and relations. But this does not prevent such properties and relations from being the content of objective statements; colors are not just a nothing… That in the region of colors and sounds the phenomenological investigation is still in its beginnings, is certainly bound up with the fact that it has no great importance for theoretical physics, since in physics we are induced, at an early stage, to eliminate colors and sounds as qualities. Mathematics, however, can be regarded as the theoretical phenomenology of structures. In fact, what contrasts phenomenologically with the qualitative is not the quantitative, as is taught by traditional philosophy, but the structural, i.e. the forms of being aside and after, and of being composite, etc., with all the concepts and laws that relate to them. (emphasis added)

Near the end of the essay he makes a reference to the Leibnizian conception of the characteristica universalis which, Bernays says, was intended “to establish a concept-world which would make possible an understanding of all connections existing in reality.” This dream of Leibniz’s (which it seems Gödel thought feasible) is probably the subject of another blog. But in closing I would make the following remarks:

Cognitive scientists have found that abstraction is fundamental to how the body builds meaning or brings structure to its world. This is true in visual processes where we find cells in the visual system that respond only to things like verticality, and it is seen in studies that show that a child’s maturing awareness seems to begin with simple abstractions. Mathematics is the powerful enigma that it is because it cuts right into the heart of how we see and how we find meaning.

Pigeons, rats, monkeys and real numbers

I’d like today to stay on the topic of mathematics from the cognitive science perspective, and in particular, to make available another set of interesting studies summarized by C. R. Gallistel, Rochel Gelman and Sara Cordes. The studies are described in their contribution to the book Evolution and Culture (edited by Stephen C. Levinson and Pierre Jaisson and published by MIT press) and entitled: The Cultural and Evolutionary History of the Real Numbers. A pdf of this selection can be found here. These are provocative ideas that don’t seem to be getting a lot of attention yet.

Their premise:

that a system for arithmetic reasoning with real numbers evolved before language evolved. When language evolved, it picked out from the real numbers only the integers, thereby making the integers the foundation of the cultural history of the number.

Observations of the conceptual need for the real numbers, as well as of their sometimes unwelcome presence, are peppered throughout the history of mathematics. But the reals were only formally defined in the 19th century. The authors clarify that this real number system – a continuous, uncountable set of rational and irrational numbers –

is used by modern humans to represent many distinct systems of continuous quantity–duration, length, area, volume, density, rate, intensity, and so on. Because the system of real numbers is isomorphic to a system of magnitudes, the terms real number and magnitude are used interchangeably. Thus, when we refer to “mental magnitudes” we are referring to a real number system in the brain. Like the culturally specified real number system, the real number system in the brain is used to represent both continuous quantity and numerosity.

I liked their summary of the observed weakness of the rationals.

The geometric failing of the integers and their offspring the rational numbers arises when we attempt to use proportions between integers to represent proportions between continuous quantities, as, for example when we say that one person is half again as tall as another, or one farmer has only a tenth as much land as another. These locutions show the seemingly natural expansion of the integers to the rational numbers, numbers that represent proportions. This expansion seemed so natural and unproblematic to the Pythagoreans that they believed that the natural numbers and the proportions between them (the rational numbers) were the inner essence of reality, the carriers of all the deep truths about the world. They were, therefore, greatly unsettled when they discovered that there were geometric proportions that could not be represented by a rational number, for example, the proportion between the diagonal and the side of a square. The Greeks proved that no matter how fine you made your unit of length, it would never go an integer number of times into both the side and the diagonal. Put another way, they proved that the square root of two is an irrational number, an entity that cannot be constructed from the natural numbers by a finite procedure.
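The Greek discovery mentioned in the passage compresses into the familiar few-line argument. Here is a sketch of the standard proof:

```latex
% Sketch of the classical proof that \sqrt{2} is irrational.
% Suppose, for contradiction, that \sqrt{2} = p/q with p, q integers
% sharing no common factor. Then
\[
  \sqrt{2} = \frac{p}{q} \quad\Longrightarrow\quad p^{2} = 2q^{2},
\]
% so p^2 is even, hence p is even: write p = 2m. Substituting gives
\[
  4m^{2} = 2q^{2} \quad\Longrightarrow\quad q^{2} = 2m^{2},
\]
% so q is even too, contradicting the assumption that p and q share no
% common factor. No unit of length, however fine, divides both the side
% and the diagonal of a square an integer number of times.
```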

And this is the heart of my interest:

Our thesis is that this cultural creation of the real numbers was a Platonic rediscovery of the underlying non-verbal system of arithmetic reasoning. The cultural history of the number concept is the history of our learning to talk coherently about a system of reasoning with real numbers that predates our ability to talk, both phylogenetically and ontogenetically.

What I find provocative about the history of mathematics is that while it may look like mathematics is just the conscious organization of practical symbols, over time it is inevitably discovered that these symbols contain more than was put into them. They grow deeper, become more entwined, and produce unanticipated new possibilities. This has always suggested to me that every formalized idea emerges from a well-spring of possibilities to which the mathematician keeps gaining proximity. This alone is full of implications about the nature of abstract ideas, what they accomplish, and what moves the development of human culture. Recent papers, like this one on the evolutionary history of the real numbers, consistently encourage me to keep thinking along these lines.

The way these investigators identify the presence of this primitive use of continuous mental magnitudes is interesting. Some of the first studies cited involve pigeons, rats and monkeys, whose memory of ‘duration’ is observed by exploiting one of the difficulties with continuous measurements: the difference between nearby magnitudes is hard to discern when, for example, numerosities are represented by voltage levels, because of the noise in those levels. This is contrasted with numerosities represented by digital computers. Experiments were designed to identify these creatures’ subjective judgments of duration, using the animals’ behavior as the indicator of their memory of a duration. The variability in these judgments (called scalar variability) increases as the remembered durations get longer, and it is believed that this is because the noise in a magnitude is proportional to the size of the magnitude. The observations are fairly precise, and were even extended to allow the observation of non-verbal animals doing arithmetic with these continuous magnitudes. Other studies, designed to produce non-verbal counting in humans, produced the same results. These mental magnitudes were also seen mediating judgments of the numerical ordering of symbolically presented integers.
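Scalar variability is easy to picture in a simulation. Below is a minimal sketch of mine, with hypothetical parameters: remembered durations are read out with noise whose standard deviation grows in proportion to the duration, so the spread of judgments grows with the magnitude while the ratio of spread to mean (the coefficient of variation) stays flat – which is why nearby large values blur together.

```python
import random

# Scalar variability (a sketch; the Weber fraction below is hypothetical):
# noise in a remembered magnitude is proportional to the magnitude itself,
# so 9 vs 10 is harder to tell apart than 2 vs 3.

WEBER = 0.15   # coefficient of variation (hypothetical)

def remembered(duration):
    """One noisy readout of a stored duration, in seconds."""
    return random.gauss(duration, WEBER * duration)

for true_duration in (2.0, 4.0, 8.0, 16.0):
    samples = [remembered(true_duration) for _ in range(10_000)]
    mean = sum(samples) / len(samples)
    sd = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
    print(f"duration {true_duration:5.1f}s: sd {sd:.2f}s, "
          f"cv {sd / mean:.3f}")   # sd grows with duration, cv stays ~0.15
```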

I expect this kind of evidence will continue to grow.

For now I’ll leave you with their summaries:

In summary, research with vertebrates, some of which have not shared a common ancestor with man since before the rise of the dinosaurs, implies that they represent both countable and uncountable quantity by means of mental magnitudes (real numbers). The system of arithmetic reasoning with these mental magnitudes is closed under the basic operations of arithmetic, that is, mental magnitudes may be mentally added, subtracted, multiplied, divided and ordered without restriction.

In short, we suggest that the integers are picked out by language because they are the magnitudes that represent countable quantity. Countable quantity is the only kind of quantity that can readily be represented by a system founded on discrete symbols, as language is. It is language that makes us think that God made the integers, because the learning of the integers is the beginning of linguistically mediated mathematical thinking about both countable and uncountable quantity.