This is a serious and interesting attack on a truly fundamental problem. He is the author of over three hundred research papers on the mathematical, physical, and philosophical foundations of quantum mechanics, and of the Springer book Mind, Matter, and Quantum Mechanics. He worked personally with W. Heisenberg, W. Pauli, and J. Wheeler on these issues.
He is an invited author of entries on quantum theories of consciousness in several forthcoming encyclopedias, and an invited plenary speaker at numerous international conferences. For the book cover: Henry Stapp has spent his entire career working in frontier areas of theoretical physics. After Pauli's early death, he turned to von Neumann's ideas about the mathematical foundations of quantum theory. The essay 'Mind, Matter and Quantum Mechanics' that developed out of this work eventually evolved into Stapp's classic book of the same title.
His deep interest in the quantum measurement problem led him to pursue extensive work pertaining to the influence of our conscious thoughts on physical processes occurring in our brains. The understandings achieved in this work have been described in many technical articles and now, in more accessible prose, in the present book.
This conception of man undermines the foundation of rational moral philosophy, and science is doubly culpable: It not only erodes the foundations of earlier value systems, but also acts to strip man of any vision of himself and his place in the universe that could be the rational basis for an elevated set of values.
During the twentieth century this morally corrosive mechanical conception of nature was found to be profoundly incorrect. The scientists who uncovered this were forced to a wholesale revision of the entire subject matter of physical theory by the peculiar character of the new mathematical rules, which were invariably validated by reliable empirical data. Newton himself rejected the idea that gravity could really act at a distance without any intervening carrier.
Nevertheless, provisional rules were found that were imagined to control the behavior of these tiny entities, and thus also the objects composed of them. These laws were independent of whether or not anyone was observing the physical universe: they took no special cognizance of any acts of observation performed by human beings, or of any knowledge acquired from such observations, or of the conscious thoughts of human beings.
All such things were believed, during the reign of classical physics, to be completely determined, insofar as they had any physical consequences, by the physically described properties and laws that acted wholly mechanically at the microscopic scale. Perhaps I should say that they turned right side up what had been upside down. It is rather what we human beings can know, and can do in order to know more. It is the three hundred years of indoctrination with mechanistic ideas that now makes puzzling a conception of ourselves that is fully concordant with both normal human intuition and the full range of empirical facts.
The founders of quantum mechanics presented this theory to their colleagues as essentially a set of rules about how to make predictions about the empirical feedbacks that we human observers will experience if we take certain actions. All observers and their acts of observation are conceived to be simply parts or aspects of the continuously evolving fully mechanically pre-determined physically described universe. Thus a natural mind—brain connection should give, it would seem, a continuously changing state of consciousness, composed of parts in a way analogous to the neural activity that it represents.
But this surmise seems at odds with the empirical evidence. According to William James: [...] We either perceive nothing, or something already there in a sensible amount. Either your experience is of no content, of no change, or it is of a perceptible amount of content or change. Your acquaintance with reality grows literally by buds or drops of perception. Thus Bohr, p. The observer does not create what is not potentially there, but does participate in the extraction from the mass of existing potentialities individual items that have interest and meaning to the perceiving self.
Quantum theory exhibits, as we shall see, a similar feature. Insofar as it has been tested, the new theory, quantum theory, accounts for all the observed successes of the earlier physical theories, and also for the immense accumulation of new data that the earlier concepts cannot accommodate. But, according to the new conception, the physically described world is built not out of bits of matter, as matter was understood in the nineteenth century, but out of objective tendencies — potentialities — for certain discrete, whole actual events to occur.
Each such event has both a psychologically described aspect, which is essentially an increment in knowledge, and also a physically described aspect, which is an action that abruptly changes the mathematically described set of potentialities to one that is concordant with the increase in knowledge. The most radical change wrought by this switch to quantum mechanics is the injection directly into the dynamics of certain choices made by human beings about how they will act. Human actions enter, of course, also in classical physics. In the classical case the way a person acts is fully determined in principle by the physically described aspects of reality alone.
But in the quantum case there is an essential gap in physical causation. In classical physics the only needed setting of boundary conditions is the one done by God at the beginning of time. The conventional laws of quantum mechanics, on the other hand, have both a dynamical opening for, and a logical need for, additional choices made later on. Thus contemporary orthodox physics delegates some of the responsibilities formerly assigned to an inscrutable God, acting in the distant past, to our present knowable conscious actions.
Bohr, p. In fact, it is precisely the absence from classical physics of any notion of experiential-type realities, or of any job for them to do, or of any possibility for them to do anything not already done locally by the mechanical elements, that has been the bane of philosophy for three hundred years. The preceding remarks give a brief overview of the theme of this work. Let there be no doubt about this point. The original form of quantum theory is subjective, in the sense that it is forthrightly about relationships among conscious human experiences, and it expressly recommends to scientists that they resist the temptation to try to understand the reality responsible for the correlations between our experiences that the theory correctly describes.
The following brief collection of quotations by the founders gives a conspectus of the Copenhagen philosophy: The conception of objective reality of the elementary particles has thus evaporated not into the cloud of some obscure new reality concept but into the transparent clarity of a mathematics that represents no longer the behavior of particles but rather our knowledge of this behavior. Heisenberg a, p. The discontinuous change in the probability function [...] Heisenberg b, p.
Wigner b, p. He in no way claims or admits that there is an actual objective reality out there that conforms to the precepts of classical physics. Each man had his own bias and intuitions, but in spite of intense effort no rational comprehension was forthcoming. Finally, at the Solvay conference a group including Bohr, Heisenberg, Pauli, Dirac, and Born came into concordance on a solution that came to be called the Copenhagen interpretation, due to the central role of Bohr and those working with him at his institute in Denmark. We, and in particular our mental aspects, have entered into the structure of basic physical theory.
The very idea that in order to comprehend atomic phenomena one must abandon physical ontology, and construe the mathematical formulas to be directly about the knowledge of human observers, rather than about external reality itself, is so seemingly preposterous that no group of eminent and renowned scientists would ever embrace it except as an extreme last measure. Einstein never accepted the Copenhagen interpretation.
He said: What does not satisfy me, from the standpoint of principle, is its attitude toward what seems to me to be the programmatic aim of all physics: the complete description of any individual real situation as it supposedly exists irrespective of any act of observation or substantiation. Einstein , p. He did not succeed! Rather he admitted ibid. He also referred ibid, p. This is the only theory at present which permits a unitary grasp of experiences concerning the quantum character of micro-mechanical events.
Or one can try to claim that these problems concern only atoms and molecules, but not the big things built out of them. In this connection Einstein said ibid, p. To answer this query I begin with a few remarks on the development of quantum theory.
The original version of quantum theory, called the Copenhagen quantum theory, or the Copenhagen interpretation, is forthrightly pragmatic. It aims to show how the mathematical structure of the theory can be employed to make useful, testable predictions about our future possible experiences on the basis of our past experiences and the forms of the actions that we choose to make. The devices are treated as extensions of our bodies. However, the boundary between our empirically described selves and the physically described system we are studying is somewhat arbitrary.
The interaction between the psychologically and physically described aspects in quantum theory thereby becomes the mind—brain interaction of neuroscience and neuropsychology. It is this von Neumann extension of Copenhagen quantum theory that provides the foundation for a rationally coherent ontological interpretation of quantum theory — for a putative description of what is really happening. My aim in this book is to explain to non-physicists the interplay between the psychologically and physically described components of mind—brain dynamics, as it is understood within the orthodox von Neumann—Heisenberg quantum framework.
The founders of quantum mechanics made the revolutionary move of bringing conscious human experiences into basic physical theory in a fundamental way. After two hundred years of neglect, our thoughts were suddenly thrust into the limelight. This was an astonishing reversal of precedent because the enormous successes of the prior physics were due in large measure to the policy of excluding all mention of idea-like qualities from the formulation of the physical laws.
What sort of crisis could have forced the creators of quantum theory to contemplate, and eventually embrace, this radical idea of injecting our thoughts explicitly into the basic laws of physics? The answer to this question begins with a discovery that occurred at the end of the nineteenth century: the discovery of Planck's quantum of action, the third of three basic quantities to enter physics, after the gravitational constant and the speed of light. The integration into physics of each of these three basic quantities generated a monumental shift in our conception of nature. Isaac Newton discovered the gravitational constant, which linked our understandings of celestial and terrestrial dynamics.
It connected the motions of the planets and their moons to the trajectories of cannon balls here on earth, and to the rising and falling of the tides. The speed of light, in turn, is not merely the rate at which light travels: it is a fundamental number that enters into the equations of motion of every kind of material substance, and, among other things, prevents any piece of matter from traveling faster than this universal maximum value. Planck's constant revealed itself in the study of thermal radiation: the radiant energy emerging from a tiny hole in a heated hollow container can be decomposed into its various frequency components. During the years that followed many experiments were performed on systems whose behaviors depend sensitively upon the properties of their atomic constituents.
The fundamental laws of physics, which every physics student had been taught, and upon which much of the industrial and technological world of that era was based, were failing. Something was fundamentally amiss. No one could foresee whether a new theory could be constructed that would explain these strange and unexpected results, and restore rational order to our understanding of nature. Heisenberg discovered the completely amazing and wholly unprecedented solution to the puzzle: the quantities that classical physical theory was based upon, and which were thought to be numbers, must be treated not as numbers but as actions!
Ordinary numbers, such as 2 and 3, have the property that the product of any two of them does not depend on the order of the factors: 2 times 3 is the same as 3 times 2. But Heisenberg discovered that one could get the correct answers out of the old classical laws if one decreed that certain numbers that occur in classical physics as the magnitudes of certain physical properties of a material system are not ordinary numbers.
Rather, they must be treated as actions having the property that the order in which they act matters! The ordinary numbers that we use for everyday purposes like buying a loaf of bread or paying taxes are just a very special case from among a broad set of rationally coherent mathematical possibilities. In this simplest case, A times B happens to be the same as B times A. But there is no logical reason why Nature should not exploit one of the more general cases: there is no compelling reason why our physical theories must be based exclusively on ordinary numbers rather than on actions.
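The distinction between ordinary numbers and order-sensitive "actions" can be made concrete. The following sketch is my own illustration, not anything from the book: it models two actions as 2x2 matrices (the mathematical objects Heisenberg actually used) and shows that, unlike the numbers 2 and 3, they give different results depending on the order in which they are applied.

```python
# Illustrative sketch (not from the text): ordinary numbers commute,
# but "actions" -- modeled here as 2x2 matrices -- need not.

def matmul2(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Ordinary numbers: the order of the factors is irrelevant.
assert 2 * 3 == 3 * 2

# Two simple "actions" (a raising and a lowering matrix).
A = [[0, 1],
     [0, 0]]
B = [[0, 0],
     [1, 0]]

AB = matmul2(A, B)   # [[1, 0], [0, 0]]
BA = matmul2(B, A)   # [[0, 0], [0, 1]]

print("A*B =", AB)
print("B*A =", BA)
print("Order matters:", AB != BA)
```

Ordinary numbers are recovered as the special case of 1x1 matrices, for which the two orderings always agree — exactly the sense in which everyday arithmetic is "a very special case from among a broad set of rationally coherent mathematical possibilities."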
It is called quantum mechanics, or quantum theory. Thus this tweaking of the laws of physics might seem to be a bit of mathematical minutia that could scarcely have any great bearing on the fundamental nature of the universe, or of our role within it. But replacing numbers by actions upsets the whole apple cart. It produced a seismic shift in our ideas about both the nature of reality, and the nature of our relationship to the reality that envelops and sustains us.
The aspects of nature represented by the theory are converted from elements of being to elements of doing. A purported theory of matter alone is converted into a theory of the relationship between matter and mind. What is this momentous change introduced by Heisenberg? In classical physics one uses the set of three numbers denoted by x, y, z to represent the position of the center point of an object, and the set of three numbers labeled by p, q, r to represent the momentum of that object.
Thus modern quantum theory was born by recognizing, or declaring, that the symbols used in classical physical theory to represent ordinary numbers actually represent actions such that their ordering in a sequence of actions matters. You might mutter that scientists should try to make things simpler, rather than abandoning one of the things we really know for sure, namely that the order in which one multiplies factors does not matter.
But against that intuition one must recognize that this change works beautifully in practice: all of the tested predictions of quantum mechanics are borne out, and these include predictions that are correct to the incredible accuracy of one part in a hundred million. There must be something very, very right about this replacement of numbers by actions. In quantum theory one generally considers systems of many particles, but insofar as one can consider one particle alone the state of that particle at any instant of time would be represented by a cloud of pairs of numbers, with one pair of numbers called a complex number assigned to each point in three-dimensional position space.
Someone might choose to perform a phenomenologically (i.e., experientially) described probing action. In quantum mechanics each such possible probing action turns out to have an associated set of distinct experientially distinguishable possible outcomes. The cloud of numbers taken as a whole determines the probability for the appearance of each of the alternative possible outcomes of that chosen probing action.
But in quantum theory one arrives instead at clouds, or quantum smears, of numbers that taken as a whole have empirical meaning in terms of probabilities of alternative possible experiences. Each physically described part corresponds to one perceivable outcome from the set of distinct alternative possible perceivable outcomes of that particular probing action. If such a probing action is performed, then one of its allowed perceivable feedbacks will appear in the stream of consciousness of the observer, and the mathematically described state of the probed system will then jump abruptly from the form it had prior to the intervention to the partitioned portion of that state that corresponds to the observed feedback.
Presumably the choice has some cause or reason — it is unreasonable that it should simply pop out of nothing at all — but the existing theory gives no reason to believe that this cause must be determined exclusively by the physically described aspects of the psychophysically described nature alone.
Thus classical physics is an approximation to quantum physics. In this approximation the quantum smearing does not occur: each cloud is reduced to a point. Each such action is intended to produce an experiential feedback. Probing actions of this kind are performed not only by scientists. Thus both empirical science and normal human life are based on paired realities of this action—response kind, and our physical and psychological theories are both basically attempts to understand these linked realities within a rational conceptual framework.
A purposeful action by a human agent has two aspects. One aspect is his conscious intention, which is described in psychological terms. I shall retain that terminology. This process, called process 2, is the one controlled by the Schroedinger equation. There are two other associated processes that need to be recognized. It is the absence from orthodox quantum theory of any description of the workings of process zero that constitutes the causal gap in contemporary orthodox physical theory.
According to classical physics any such system has a state of lowest possible energy. In this state the center point of the object lies motionless at the base point. In quantum theory this system again has a state of lowest possible energy. But this state is not localized at the base point. If one were to put this state of lowest energy into a container, then squeeze it into a more narrow space, and then let it loose, the cloudlike form would explode outward, but then settle into an oscillating motion. Thus the cloudlike spatial structure behaves rather like a swarm of bees, such that the more they are squeezed in space the faster they move relative to their neighbors, and the faster the squeezed cloud will explode outward if the squeezing constraint is released.
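The swarm-of-bees trade-off described above is quantified by the Heisenberg uncertainty relation, dx * dp >= hbar/2: the narrower the spatial confinement dx, the larger the minimum momentum spread dp. The following numeric sketch is my own illustration of that relation; the specific confinement widths are arbitrary choices, not values from the text.

```python
# Sketch (illustrative numbers, not the author's): the Heisenberg
# relation dx * dp >= hbar/2 implies that squeezing a quantum "cloud"
# into a narrower region forces a larger minimum spread in momentum.

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def min_momentum_spread(dx):
    """Minimum momentum uncertainty (kg*m/s) for confinement width dx (m)."""
    return HBAR / (2.0 * dx)

wide = min_momentum_spread(1e-9)     # confined to ~1 nanometer
narrow = min_momentum_spread(1e-10)  # squeezed to ~0.1 nanometer

print(f"dp at 1 nm confinement:   {wide:.3e} kg*m/s")
print(f"dp at 0.1 nm confinement: {narrow:.3e} kg*m/s")

# Squeezing the cloud 10x more tightly raises the minimum momentum
# spread 10x -- the squeezed cloud "explodes outward" faster.
assert abs(narrow / wide - 10.0) < 1e-9
```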
If one shoots an electron, a calcium ion, or any other quantum counterpart of a tiny classical object, at a narrow slit then if the object passes through the slit the associated cloudlike form will fan out over a wide angle, due essentially to the reaction to squeezing mentioned above. But if one opens two closely neighboring narrow slits, then what passes through the slits is described by a probability distribution that is not just the sum of the two separate fanlike structures that would be present if each slit were opened separately.
Instead, at some points the probability value will be nearly twice the sum of the values associated with the two individual slits, and in other places the probability value drops nearly to zero, even though both individual fanlike structures give a large probability value at that place.
This non-additivity property, which holds for a quantum particle such as an electron or a calcium ion, persists even when the particles come one at a time! According to classical ideas each tiny individual object must pass through either one slit or the other, so the probability distribution must be just the sum of the contributions from the two separate slits.
But this is not what happens empirically. Quantum mechanics deals consistently with this non-additivity property, and with all the other non-classical properties of these cloudlike structures.
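The non-additivity can be shown in a few lines. This toy sketch is my own illustration of the standard amplitude rule, not the book's formalism: quantum theory adds complex amplitudes, not probabilities, so with both slits open the probability is |a1 + a2|^2, which at some points is nearly twice the classical sum |a1|^2 + |a2|^2 and at others nearly zero.

```python
# Toy sketch (my illustration): interference of two slit amplitudes.
import cmath

def prob(amplitude):
    """Probability associated with a complex amplitude."""
    return abs(amplitude) ** 2

a = 1.0 / cmath.sqrt(2)   # amplitude through slit 1
b_in_phase = a            # slit 2 amplitude, in phase with slit 1
b_out_of_phase = -a       # slit 2 amplitude, half a wavelength out of phase

classical_sum = prob(a) + prob(b_in_phase)  # what additivity would give: 1.0
bright = prob(a + b_in_phase)               # constructive: ~2x the sum
dark = prob(a + b_out_of_phase)             # destructive: ~0, though each
                                            # slit alone gives 0.5 here

print(f"classical sum: {classical_sum:.2f}")
print(f"constructive : {bright:.2f}")
print(f"destructive  : {dark:.2f}")
```

The "dark" point is the striking case: each slit separately contributes a large probability, yet with both open the amplitudes cancel and the probability drops to nearly zero, exactly as described above.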
The non-additivity property is not at all mysterious or strange if one accepts the basic idea that reality is not made out of any material substance, but out of potentialities. Potentialities are not material realities, and there is no logical requirement that they be simply additive. According to the mathematically consistent rules of quantum theory, the quantum potentialities are not simply additive: they have a wave-like nature, and can interfere like waves. But classical physics has bottom-up causation, and the direct rational basis for the claim that classical physics is applicable to the full workings of the brain rests on the basic presumption that it is applicable at the microscopic level.
However, empirical evidence about what is actually happening at the trillions of synapses on the billions of neurons in a conscious brain is virtually nonexistent, and, according to the uncertainty principle, empirical evidence is in principle unable to justify the claim that deterministic behavior actually holds in the brain at the microscopic ionic scale. Classical physical theory is adequate, in principle, precisely to the extent that the smear of potentialities generated at the microscopic level by the uncertainty principle leads via the purely physically described aspects of quantum dynamics to a macroscopic brain state that is essentially one single classically describable state, rather than a cloud of such states representing a set of alternative possible conscious experiences.
In this latter case the quantum mechanical state of the brain needs to be reduced, somehow, to the state corresponding to the experienced phenomenal reality. Neuroscientists have developed, on the basis of empirical data, fairly detailed classical models of how these important parts of the brain work. Within the terminal are vesicles, which are small storage areas containing chemicals called neurotransmitters. Thus the nerve terminals, as connecting links between neurons, are basic elements in brain dynamics. The channels through which the calcium ions enter the nerve terminal are called ion channels.
At their narrowest points they are only about a nanometer in width, hence not much larger than the calcium ions themselves. This extreme smallness of the opening in the ion channels has profound quantum mechanical import.
The consequence of this narrowness is essentially the same as the consequence of the squeezing of the state of the simple harmonic oscillator, or of the narrowness of the slits in the double-slit experiments. The narrowness of the channel restricts the lateral spatial dimension. This causes the quantum probability cloud associated with the calcium ion to fan out over an increasing area as it moves away from the tiny channel to the target region where the ion will be absorbed as a whole on some small triggering site, or will not be absorbed at all on that site.
Thus the probability cloud becomes spread out over a region that is much larger than the size of the calcium ion itself, or of the trigger site. This spreading of the ion wave packet means that the ion may or may not be absorbed on the small triggering site. This very large quantum uncertainty at the level of the individual calcium ion entails that the quantum state of the nerve terminal will become a quantum mixture of states in which the neurotransmitter is released and states in which it is not released.
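A back-of-envelope estimate conveys the scale of this effect. The numbers below are my own order-of-magnitude assumptions (a ~1 nm channel width, a calcium ion of ~40 atomic mass units, an assumed transit time of ~100 ns to the trigger site), not the author's calculation; the point is only that the uncertainty-principle spread dwarfs the size of the ion.

```python
# Back-of-envelope sketch (assumed order-of-magnitude inputs, not the
# author's figures).  Confining a calcium ion laterally to the ~1 nm
# width of an ion channel forces a minimum transverse velocity
# uncertainty of hbar/(2*m*dx); during the transit to the trigger
# site the wave packet fans out far beyond the ion's own size.

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
AMU = 1.66053907e-27     # atomic mass unit, kg
m_ca = 40.0 * AMU        # mass of a calcium ion, ~40 u
dx = 1e-9                # channel width, ~1 nanometer
ion_size = 2e-10         # ionic diameter, roughly 0.2 nm
transit = 1e-7           # assumed transit time to the target, ~100 ns

dv = HBAR / (2.0 * m_ca * dx)   # minimum transverse velocity spread
spread = dv * transit           # transverse spread at the target

print(f"velocity uncertainty: {dv:.2f} m/s")         # of order 1 m/s
print(f"spread after transit: {spread * 1e9:.0f} nm")
print(f"spread / ion size:    {spread / ion_size:.0f}x")
```

Even with generous slack in the assumed inputs, the spread comes out orders of magnitude larger than the ion or the trigger site, which is why the absorption of the ion — and hence the release of the neurotransmitter — cannot be treated as classically determined.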
This quantum splitting occurs at every one of the trillions of nerve terminals in the brain. In complex situations where the outcome at the classical level depends on noisy elements the corresponding quantum brain will evolve into a quantum mixture of the corresponding states. The process 2 evolution of the brain is highly nonlinear, in the classical sense that small events can trigger much larger events, and that there are very important feedback loops. In a system with such a sensitive dependence on unstable elements, and on massive feedbacks, it is not reasonable to suppose, and not possible to demonstrate, that the process 2 dynamical evolution will lead generally to a single nearly classically describable quantum state.
There might perhaps be particular special situations during which the massively parallel processing all conspires to cause the brain dynamics to become essentially deterministic and perhaps even nearly classically describable. The exact details of the chosen plan will, for a classical model, obviously depend upon the exact values of many noisy and uncontrolled variables. The automatic mechanical process 2 evolution generates this smearing, and is in principle unable to resolve or remove it. The form of such an intervention is not determined by the quantum analog process 2 of the physically deterministic continuous dynamical process of classical physics: some other kind of input is needed.
Thus the neural or brain correlate of an intentional act should be something like a collection of the vibratory modes of a drumhead in which many particles move in a coordinated way for an extended period of time. In quantum theory the enduring states are vibratory states. They are like the lowest-energy state of the simple harmonic oscillator discussed above, which tends to endure for a long time, or like the states obtained from such lowest-energy states by spatial displacements and shifts in velocity. Such states tend to endure as organized oscillating states, rather than quickly dissolving into chaotic disorder.
My earlier discussion of the quantum indeterminacies that enter brain dynamics in association with the entry of calcium ions into the nerve terminals was given in order to justify the claim that the brain must be treated as a quantum system. This dynamics expresses the core idea of the quantum theory of observation, which is that the reduction events are associated with increments in knowledge, and correspondingly reduce the physical state to the part of itself that is compatible with the knowledge entering a stream of consciousness.
On the other hand, the only freedom provided by the quantum rules is the freedom to select the next process 1 action, and the instant at which it is applied. Take an example. Suppose you are in a situation that calls for you to raise your arm. Associations via stored memories should elicit a brain activity having a component that when active on former occasions resulted in your experiencing your arm rise, and in which the template for arm-raising is active.
This effect was named, by Misra and Sudarshan, after the Greek philosopher Zeno the Eleatic. I believe that the array of things we can attend to is so determined. No object can catch our attention except by the neural machinery. But the amount of the attention which an object receives after it has caught our attention is another question. Though it introduce no new idea, it will deepen and prolong the stay in consciousness of innumerable ideas which else would fade more quickly away. When developed it may make us act, and that act may seal our doom.
When we come to the chapter on the Will we shall see that the whole drama of the voluntary life hinges on the attention, slightly more or slightly less, which rival motor ideas may receive. James apparently recognized the incompatibility of these pronouncements with the physics of his day. At the end of Psychology: The Briefer Course, he said, presciently, of the scientists who would one day illuminate the mind—body problem: The best way in which we can facilitate their advent is to understand how great is the darkness in which we grope, and never forget that the natural-science assumptions with which we started are provisional and revisable things.
The connections described by James are explained on the basis of the same dynamical principles that had been introduced by physicists to explain atomic phenomena. The reasons for this failure are easy to see: classical physics systematically exorcizes all traces of mind from its precepts, thereby banishing any logical foothold for recovering mind. However, many psychologists, neuroscientists, and philosophers who intended to stay in tune with the basic precepts of physics became locked to the ideas of nineteenth century physicists and failed to acknowledge or recognize the jettisoning by twentieth century physicists of classical materialism and the principle of the causal closure of the physical.
So we can now inquire: How well does the above-described quantum-theory-based approach to mind—brain dynamics square with these newer data? Pashler organizes his discussion by separating perceptual processing from post-perceptual processing.
The former covers processing that, H. Pashler emphasizes p. A huge industry has developed that traces these essentially classically describable processes in the brain. Two kinds of process 1 actions are possible. One kind would be determined by brain activity alone. This quantum conceptualization of the action of mind on brain is, as we shall now see, in good accord with the details of the data described by Pashler. Those data did not necessarily — from non-quantum considerations — need to have the detailed structure that they are empirically found to have.
Indeed, the various classical-type theories examined by Pashler did not entail it. The perceptual aspects of the data described by Pashler can, I believe, be accounted for by essentially classical parallel mechanical processing. He emphasizes p. This kind of bottleneck is what the quantum-physics-based theory predicts: the bottleneck is precisely the single linear sequence of process 1 actions that enters so importantly into the quantum theoretic description of the mind—matter connection. So if part of this processing capacity is directed to another task, then the muscular force will diminish.
An interesting experiment mentioned by Pashler involves the simultaneous tasks of doing an IQ test and giving a foot response to rapidly presented sequences of tones of either or Hz. Pashler also notes p. If arousal is essentially the rate of occurrence of conscious events then this result is what the quantum model would predict. After analyzing various possible mechanisms that could cause the central bottleneck, Pashler pp. An example is the experiment of Ochsner et al.
The trial began with a 4-second presentation of a negative or neutral photo, during which participants were instructed simply to view the stimulus on the screen. This interval was intended to provide time for participants to apprehend complex scenes and allow an emotional response to be generated that participants would then be asked to regulate.
To verify whether the participants had, in fact, reappraised in this manner, during the post-scan rating session participants were asked to indicate for each photo whether they had reinterpreted the photo as instructed or had used some other type of reappraisal strategy. The fMRI results were that reappraisal was positively correlated with increased activity in the left lateral prefrontal cortex and the dorsal
Nor has this conjecture any rational foundation in science or basic physics. The conjecture originates from the classical principle of the causal closure of the physical, which does not generally hold in quantum theory. That principle rests on a classical-physics-based bottom-up determinism that starts at the elementary particle level and works up to the macro-level. But, according to the quantum principles, the determinism at the bottom ionic level fails badly in the brain. The presumption that it gets restored at the macro-level is wishful and unprovable.
According to quantum mechanics, the microscopic uncertainties must rationally be expected to produce, via the Schroedinger equation (of brain plus environment), macroscopic variations that, to match observation, need to be cut back by quantum reductions. This means process 1 interventions. This replacement of inaccessible-in-principle data by accessible-in-practice data leads to statistical predictions connecting empirically describable conscious intentions to empirically describable perceptual feedbacks.
The psychologically described and mathematically described components of the theory become cemented together by quantum rules that work in practice. What is the rational motivation for adhering to the classical approximation? The applicability of the classical approximation to this phenomenon certainly does not follow from physics considerations: calculations based on the known properties of nerve terminals indicate that quantum theory must in principle be used.
Nor does it follow from the fact that classical physics works reasonably well in neuroanatomy and neurophysiology: quantum theory explains why the classical approximation works well in those domains. The materialist claim is that someday the mind will be understood to be the product of completely mindless matter. Neither of these opinions has any rational basis in contemporary science, as will be further elaborated upon in the sections that follow. And they leave unanswered the hard question: Why should causally inert consciousness exist at all, and massively deceive us about its nature and function?
Quantitative estimates that appear to back up this negative opinion have been made by Tegmark. The lack of long-range quantum coherence in a living brain, expected by most physicists, is in fact a great asset to the von Neumann approach described in this book. This lack of coherence (decoherence) means that the quantum brain can be conceived to be, to a very good approximation, simply a collection of classically conceived alternative possible states of the brain.
This permits neuroscientists unfamiliar with quantum theory to have a very accurate, simple, intuitive idea of the quantum state of a brain. It can be imagined to be an evolving set of nearly classical brains with, however, the following four non-classical properties: 1. Each almost-classical possibility is slightly smeared out in space relative to a strictly classical idealization, and it fans out in accordance with the uncertainty principle. 2. At each occurrence of a conscious thought, the set of possibilities is reduced to the subset compatible with the occurring increment of knowledge. 3. Microscopic chemical interactions are treated quantum mechanically. 4. These process 1 interventions are not required by present understanding to be governed by algorithmic processes. The theory leaves open the important question of how these interventions, which are treated pragmatically simply as experimenter-selected choices of boundary conditions, come to be what they turn out to be: this is the causal gap!
I use the term more broadly to include, at the pragmatic level, also the Copenhagen formulation. But at the ontological level I mean the von Neumann-Tomonaga-Schwinger description that includes the entire physical universe in the physically described quantum world, and that accepts the occurrence of the process 1 interventions in the process 2 evolution of the physically described state of the universe.
This conventional formulation of quantum theory — with experimenter-induced interventions — is the one used in practice by experimental physicists who need to compare the predictions of the theory to empirical data. It is consequently the form of the theory that is actually supported by the empirical facts. It might seem odd, therefore, that any quantum physicist would want to promote an alternative formulation.
Yet theoretical physicists who favor such a reversion do in fact exist. There are three main non-orthodox approaches to the problem of imbedding pragmatically validated quantum theory in some coherent conception of reality itself. These are the many-worlds approach initiated by Everett, the pilot-wave approach of Bohm, and the spontaneous-reductions approach of Ghirardi, Rimini, and Weber. The many-worlds approach is the most radical and sweeping. It asserts that the quantum state of the physical universe exists and evolves always under the exclusive control of the local deterministic process 2.
In this scheme no reduction events occur at the level of objective reality itself. The fact that we seem to choose particular experiments, and that these experiments seem to have outcomes that conform to the predictions of quantum theory, then needs to be explained as essentially some kind of persisting subjective illusion: one that produces coherent long-term streams of human conscious events which somehow conform, over long times, to the statistical predictions of the orthodox theory, even though the physical reduction events that logically entail these properties in the orthodox approach are now asserted not to occur.
The consciously perceived experiences that conform to the statistical rules of pragmatic quantum theory then need to be explained as intricate properties of the purely mental by-products of a continuous physical process that eschews the interventions and reductions that provide the mathematical foundation of the orthodox understanding of the empirical facts. The spontaneous-reductions approach maintains that the evolution via the local mechanical process 2 is interrupted from time to time by a sudden spontaneous and random reduction event that keeps the physical universe, at the visible level, roughly in accord with what is actually observed. There is much disagreement in the literature about the reduction process and how it works, including controversy over whether there is any such thing as reduction.
Joos, D. Zeh, and W. Zurek make no such strong claim. Zurek, for example, writes: The Many Worlds Interpretation aims to abolish the need for a border altogether. Every potential outcome is accommodated in the ever-proliferating branches of the wave function of the Universe. Yet our perception of a reality with alternatives, not a coherent superposition of alternatives, demands an explanation of when, where, and how it is decided what the observer actually records.
Considered in this context, the Many Worlds Interpretation in its original version does not really abolish the border but pushes it all the way to the boundary between the physical universe and consciousness. Needless to say, this is a very uncomfortable place to do physics. Later on, Zurek adds: There is even less doubt that this rough outline will be further extended. Much work needs to be done, both on technical issues [...]
If each of these parts of the brain were accompanied by the corresponding experience, then there would exist not just one experience corresponding to seeing the object in just one place, but a continuous aggregation of experiences, with one experience for each of the possible locations of the object in the large region. The discreteness condition is a technical point, but it constitutes the essential core of the measurement problem, the problem of relating the quantum mathematics to the empirically observed phenomena. Hence I must explain it!
Evolution according to the Schroedinger equation (process 2) generates in general, as I have just explained, a state of the brain of an observer that is a smeared-out continuum of component parts. The need to specify a particular countable set of parts is the essential problem in the construction of a satisfactory quantum theory.
The problem is to divide a continuum of brain states into a countable set of discrete and orthogonal components by means of the strictly continuous process 2 alone, and in a way such that the distinct parts correspond to distinguishable experiences. Copenhagen quantum theory accomplishes this selection of a preferred set of discrete states by means of an intervention by the experimenter. Von Neumann addresses this discreteness problem in the following way. Suppose a pen that draws a line on a moving scroll is caused to draw a blip when a radioactive decay is detected by some detector.
If the only process in nature is process 2, then the state of the scroll will be a blurred out state in which the blip occurs in a continuum of alternative possible locations. Correspondingly, the brain of a person who is observing the scroll will be in a smeared out state containing a continuously connected collection of components, with one component corresponding to each of the possible locations of the blip on the scroll. Thus it is the empirically experienced discreteness of the choice made by the experimenter that resolves the discreteness problem.
This was a giant break from tradition. But the enormity of the problem demanded drastic measures. Because such powerful thinkers as Wolfgang Pauli and John von Neumann found it necessary to embrace this revolutionary idea, anyone who claims that this unprecedented step was wholly unnecessary certainly needs to carefully explain why. This has not yet been done. See the next chapter for further elaboration. Although bringing the consciousness of human agents into the dynamics is certainly quite contrary to the ideas of classical physics, the notion that our streams of consciousness play a causal role in the determination of our behavior is not outlandish: it is what one naturally expects on the basis of everyday experience.
He said that he completely agreed! Notice, in this connection, that in the last two chapters of his book with Hiley, Bohm goes beyond this simple model and tries to come to grips with the deeper problems that are being considered here by introducing the notions of implicate and explicate order. But those extra ideas are considerably less mathematical, and much more speculative and vague, than the pilot-wave model that many other physicists want to take more seriously than did Bohm himself. Bohm certainly appreciated the need to deal more substantively with the problem of consciousness.
Over and beyond these problems with consciousness there is a serious technical problem: a Bohm-type deterministic model apparently cannot be made to accommodate relativistic particle creation and annihilation, which is an important feature of the actual world in which we live. Completing the ontology by adding a classically conceived, mechanistically determined world, instead of choices made by agents and by nature, has never been satisfactorily achieved. The ultimate problem with this Bohmian approach is precisely the discreteness problem previously emphasized.
But the state of a universe with no collapses at all will be one in which every physical feature of every device and every brain is completely smeared out, with no partitioning into discrete parts. A model of this kind was originally proposed by Ghirardi, Rimini, and Weber, and has been pursued vigorously by Philip Pearle. The bottom line is that it has not been possible to construct a model of this sort that accommodates particle creation and annihilation and that is relativistically invariant in the same satisfactory sense that the orthodox von Neumann-Tomonaga-Schwinger theory is relativistically invariant.
A quasi-relativistic theory of this kind has recently been proposed by Pearle, who expounds also on the inability of these spontaneous-collapse models to do better. This chapter, which is more technical than the others, explains these aspects and, with the aid of some pictures, their relevance to the basis and decoherence problems. Suppose a single particle is confined to a box, and that we divide this box into a very large number N of tiny cubic regions. A special rule can be introduced to cover the case where the particle lies exactly on a boundary.
Quantum mechanics is somewhat analogous to classical statistical mechanics. That theory covers situations where one wishes to make statistical predictions about future observations on the basis of the known equations of motion, when one has only statistical information about the initial state. A number is assigned to each tiny region; this number represents the initial probability that the combination of the location and the momentum of the particle lies in that tiny region. These numbers will sum to unity (one). Then the classical equations of motion can be used to determine how this probability density changes over the course of time.
The case just described is a very simple case in which the physical system being observed is just one single point particle. But the same discussion applies essentially unchanged to any physical system, including, in particular, the brain of a conscious human being. In a classically conceived statistical context a set of probability contributions that sums to unity can be distributed in any chosen way among these small boxes, each of which can in principle be shrunk to an arbitrarily small size.
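The classical statistical picture just described can be sketched numerically. The following toy model (all names and numbers here are illustrative, not taken from the text) divides the phase space of a single particle on a line into tiny position-momentum cells, assigns a probability to each cell, and checks that the probabilities sum to unity:

```python
import random

# Toy discretization of a 1-D phase space (position x, momentum p)
# into N_X * N_P tiny cells, as in the classical statistical picture.
N_X, N_P = 10, 10  # illustrative cell counts

# Assign an arbitrary non-negative weight to each cell, then normalize
# so that the probabilities sum to unity, as the theory requires.
random.seed(0)
weights = {(i, j): random.random() for i in range(N_X) for j in range(N_P)}
total = sum(weights.values())
prob = {cell: w / total for cell, w in weights.items()}

assert abs(sum(prob.values()) - 1.0) < 1e-12

# Classically, any such distribution over these cells is allowed, and the
# cells can in principle be shrunk to arbitrarily small size.
```

The point of the sketch is the classical freedom it exhibits: any normalized distribution over arbitrarily small cells is admissible, which is exactly the freedom that the quantum formalism, discussed next, takes away.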
The intrinsic wholeness of each conscious thought renders the phase space of classical physics inadequate. The physical state of the brain is represented, rather, as a vector in an appropriate vector space, and each permissible conscious observation associated with that brain is associated with some set of mutually orthogonal (perpendicular) basis vectors.
Thus the basic mathematical structure needed for the conscious-observation-based quantum theory of phenomena is fundamentally incompatible with the mathematical structure used in the physical-measurement-based classical theory of phenomena. An irreducible element of wholeness is present in the former but absent from the latter. The neural correlates of our conscious thoughts are, according to quantum mechanics, represented in a vector space of a very large number of dimensions.
But the basic idea of a vector in a vector space can be illustrated by the simple example in which that space has just two dimensions.
Pick a point in a plane. Draw a straight line that starts at this point, called the origin, and that extends out by a certain amount in a certain direction: this directed line segment is a vector. Any pair of unit-length vectors in this space that are perpendicular to each other constitutes a basis in this two-dimensional space. They are in fact an orthonormal basis, but that is the only kind of basis that will be considered here.
Given a basis, there is a unique way of decomposing any vector in the space into a sum of displacements, one along each of the two perpendicular basis vectors. The two individual terms in this sum are a pair of perpendicular vectors called the components of the vector in this basis. One such decomposition is indicated in the figure. By the theorem of Pythagoras, the squares of the lengths of the two components sum to the square of the length of the vector itself; if the vector has unit length, this sum is unity. This is what a sum of probabilities should be. Consequently, the concept of probability can be naturally linked to the concept of vectors in a space of vectors. An N-dimensional vector space is similar, but has N dimensions instead of just two.
This means that it allows not just two mutually perpendicular basis vectors, but N of them. For any N , and for any basis in the N -dimensional space, there is a unique way of decomposing any vector in that space into a sum of displacements each lying along one of the mutually perpendicular basis vectors.
Each possible observational process is, according to the basic principles of quantum theory, associated with such a choice of basis vectors. The N -dimensional generalization of the theorem of Pythagoras says that the sum of the squares of lengths of the mutually perpendicular components of the unit length vector V that represents the quantum state of the physical system is unity. Consequently, the probability interpretation of the lengths of the components of the vector V carries neatly over to the N -dimensional case. Vectors in a vector space provide, therefore, a way to represent in an abstract mathematical space the probabilities associated with the perceptual realities that form the empirical basis of science.
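The link just described between components and probabilities can be illustrated with a small numerical sketch (the vector and basis below are invented for illustration): a unit vector in an N-dimensional real vector space is projected onto an orthonormal basis, and by the N-dimensional theorem of Pythagoras the squared lengths of its components sum to one, so they can be read as probabilities of the alternative outcomes.

```python
import math

def component_probabilities(v, basis):
    """Squared lengths of the components of v along orthonormal basis vectors."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return [dot(v, e) ** 2 for e in basis]

# A unit vector in N = 3 dimensions (the numbers are illustrative).
v = (0.6, 0.0, 0.8)
assert abs(math.sqrt(sum(x * x for x in v)) - 1.0) < 1e-12

# The standard orthonormal basis; any rotated orthonormal basis would do.
basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

probs = component_probabilities(v, basis)

# Pythagoras in N dimensions: the squared component lengths sum to unity.
assert abs(sum(probs) - 1.0) < 1e-12
print(probs)  # approximately [0.36, 0.0, 0.64]
```

Nothing here depends on N = 3: for any N, and any orthonormal basis, the squared components of a unit vector sum to one, which is what allows each choice of basis to define a complete set of mutually exclusive observational alternatives.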
According to quantum theory, the alternative possible phenomenal outcomes of any process of observation are associated with a set of mutually perpendicular basis vectors. Each such basis vector is associated with an (in principle fuzzy) region in the phase space of the system that is being probed, hence acted upon.
Thus the mathematical entities corresponding to possible perceptions in quantum theory are very restrictive as compared to the completely general sizes and shapes of the phase-space regions that are allowed to represent measurable properties of physical systems in classical physics. The transition to quantum theory imposes a severe restriction on observational realities, in comparison to the micro-structure that is deemed measurable in classical mechanics. That means that, given a basis, there is a unique decomposition of the state of the system into a countable set of elementary components.
The countability of the set of distinct or discrete possibilities is important. The decomposition into discrete holistic components associated with a set of mutually perpendicular basis vectors in a vector space is the foundation of the relationship of the quantum mathematics to empirical phenomena.
This feature blocks the association of arbitrarily tiny regions R in phase space with observation. To discuss decoherence adequately it is useful to employ the density matrix formulation of quantum mechanics described by von Neumann. This matrix representation is useful when the system of interest, say a human brain, is interacting with an environment upon which no actual measurements will ever be made. The brain of an observer can be represented, then, by a matrix with a very large number N of rows and columns.
The diagonal elements of the matrix are the elements that lie in a row and in a column that both correspond to the same basis vector. Each diagonal element can be made to correspond roughly to a smeared-out, classically described possible state of the entire brain or of a macroscopic part of the brain. Quantum theory allows statistical predictions to be made on the basis of the numbers in this density matrix. According to a classical model, your brain will, if well conditioned, decide on one plan or another, not produce both plans with no decision between them. The figure indicates the two corresponding sets of rows; the two corresponding sets of columns are also indicated.
It is assumed that the available energy and organizational structure will go to one template or the other, so that at the classical level of description the two templates will not be simultaneously activated. The diagonal elements correspond most nearly to the phase space of classical physics.
The diagram illustrates the two main points: 1. The entire portion of the matrix that corresponds to classically describable possibilities is retained essentially untouched. 2. Decoherence sets certain of the other elements of that matrix to zero and leaves the rest unchanged. It is important that the quantum decomposition into separate boxes is in terms of elements corresponding to basis vectors associated with distinguishable experiences.
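The effect sketched in the diagram can be mimicked in a few lines. In this toy model (the matrix size, numbers, and "template" labels are invented for illustration), a density matrix over basis states belonging to two macroscopically distinct templates has its cross-template (interference) elements set to zero by decoherence, while the within-template blocks, including all diagonal elements, are left untouched:

```python
# Toy density matrix over 4 basis states: indices 0-1 belong to
# "template A", indices 2-3 to "template B" (labels are illustrative).
TEMPLATE = [0, 0, 1, 1]  # which template each basis state belongs to

# A symmetric matrix with unit trace (real entries for simplicity).
rho = [
    [0.30, 0.10, 0.05, 0.02],
    [0.10, 0.20, 0.01, 0.03],
    [0.05, 0.01, 0.25, 0.08],
    [0.02, 0.03, 0.08, 0.25],
]

def decohere(rho, template):
    """Zero the elements linking distinct templates; leave the rest unchanged."""
    n = len(rho)
    return [
        [rho[i][j] if template[i] == template[j] else 0.0 for j in range(n)]
        for i in range(n)
    ]

rho_d = decohere(rho, TEMPLATE)

# The diagonal (the classically describable probabilities) is untouched,
# and the trace is still unity.
assert all(rho_d[i][i] == rho[i][i] for i in range(4))
assert abs(sum(rho_d[i][i] for i in range(4)) - 1.0) < 1e-12
assert rho_d[0][2] == 0.0 and rho_d[1][3] == 0.0
```

The design choice mirrors the text's two points: decoherence only deletes the elements that connect the two macroscopically distinct alternatives; everything classically describable survives.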
In general, the basic realities in quantum theory are psychophysical events, and for each such event its physically described aspect is the reduction of the quantum state of an observed system to the part of that state that is compatible with the psychologically described aspect, which is an increment in knowledge entering a stream of consciousness. The evolving physical state is thereby kept in accord with our evolving knowledge. These choices are not determined by the currently known laws of physics, and they link the quantum dynamics to observation. However, many of the greatest advances in science have come from unifying the treatments of neighboring realms of phenomena.
What seems pertinent is that basic physics was forced, by the character of empirical phenomena, to adopt an incredibly successful way of linking these same two realms. It seems reasonable to at least try to apply the solution discovered by physicists to the parallel problem in neuropsychology. Contemporary physics is essentially psychophysical, hence dualistic. As a consequence, the quantum approach tends to be peremptorily rejected, because it belongs to this despised category. But why are dualistic theories held in such contempt?
There is an historical reason. Reacting against the failures of the introspectionist psychologists to achieve reliable results, Watson went to the opposite extreme, declaring that the theoretical goal of psychology is the prediction and control of behavior. Yet the pariah status assigned to dualism by behaviorists lingered on after the fall of behaviorism, and it still persists today. But why should this bias continue after the demise of the discredited philosophy that spawned it? Daniel Dennett's book Consciousness Explained has a chapter entitled Why Dualism Is Forlorn, which begins with the words: The idea of mind as distinct [...]
In short, the mind is the brain. Why is it in such disfavor? Contemporary physical theory allows, and its orthodox von Neumann form entails, an interactive dualism that is fully in accord with all the laws of physics. Any perception merely reduces the possibilities. Steven Pinker is an able reporter on contemporary neuroscience. He says: The Hard Problem is explaining how subjective experience arises from neural computation.
And, not surprisingly, everyone agrees that the hard problem (if it is a problem) is a mystery. But the mystery immediately dissolves when one passes over to quantum theory, which was formulated from the outset as a theory of the interplay between physical descriptions and conscious thoughts, and which comes with an elaborate and highly tested machinery for relating these two kinds of elements. That approach would be a misuse of the quantum mechanical use of the concepts of classical mechanics. The founders of quantum mechanics were very clear about the use, in the theory of observations, of the concepts of classical mechanics.
The use of the classical concepts is appropriate in that context because those pertinent experiences are actually describable in terms of the classical concepts, not because something was mysteriously supposed to actually happen merely when things became big enough for classical ideas to make sense. That criterion was too vague and ambiguous to be used to construct a satisfactory physical theory. The boundary between the large and the small could be shifted at will, within limits, but actuality cannot be shifted in this way.
There is, as in the classical approach, no intrinsic conceptual place for, or dynamical need for, our conscious experiences. There is within the given structure no entailment either of any reason for conscious experiences to exist at all, or of any principle that governs how these experiences are tied to brain activity. He expressed general approval, but raised one point: There is one problem I would like to mention, not in order to criticize the wording of your paper, but for inducing you to more investigation of this special point, which however is a very deep and old philosophical problem.
When you speak about the ideas (especially in [section 3 ...]), the question arises whether they exist independently of any human mind. In other words: have these ideas existed at the time when no human mind existed in the world?