Quantum statistics vs classical reductionism

apokrisis
semper politice rectam!

Usergroup: Members
Joined: Feb 27, 2010

Total Topics: 16
Total Posts: 3584
#1 - Quote - Permalink
1 of 2 people found this post helpful
Posted Feb 5, 2013 - 6:15 PM:
Subject: quantum statistics vs classical reductionism
Quantum indistinguishability and boson/fermion statistics look to fatally undermine classical concepts of causality - that package of ideas that includes reductionism, determinism, randomness and microphysical effective cause.

Here is the guts of it. In the classical view, if you have a two-particle system, and each particle can be randomly in one of two states - either 1 or 0 (for example spin up or spin down) - then the overall statistics say there are four equally likely states: 1/1, 0/0, 1/0 and 0/1. So the chance of an antisymmetrical outcome is .5.

However this is just not the way nature works at the microphysical level where things are quantumly entangled.

Instead, if the two particles are bosons, the chance of an antisymmetrical outcome is .33 - a third, not a half. While if the two particles are fermions, the chance instead becomes 1 - there is now no chance of finding a symmetrical outcome where the particles share the same spin state.

See...
en.wikipedia.org/wiki/Ident...ies_of_bosons_and_fermions
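
To make that counting concrete, a few lines of Python (just brute-force enumeration of the allowed states, with no quantum machinery behind it) reproduce the three numbers:

```python
from itertools import product

# Classical, distinguishable particles: every ordered spin pair is a state.
classical = list(product([0, 1], repeat=2))   # (0,0), (0,1), (1,0), (1,1)

# Bosons: swapping the particles changes nothing, so (0,1) and (1,0)
# are one and the same state - only three states remain.
bosons = [(0, 0), (0, 1), (1, 1)]

# Fermions: the particles can never share a state (Pauli exclusion),
# which leaves the single mixed state.
fermions = [(0, 1)]

def prob_antisymmetric(states):
    """Fraction of allowed states in which the two spins differ."""
    return sum(a != b for a, b in states) / len(states)

print(prob_antisymmetric(classical))   # 0.5
print(prob_antisymmetric(bosons))      # 0.333...
print(prob_antisymmetric(fermions))    # 1.0
```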

So the classical view of reality that is adopted in many philosophical debates about randomness vs determinism, or Humean supervenience, or the reduction of causality to local effective causes, just is not supported by the empirical evidence.

If you tossed a pair of coins, you would be surprised if only a third of the trials came up antisymmetric, and more surprised still if they only ever came up antisymmetric. So something has to give about classical notions of ontology. A different set of intuitions is needed.



andrewk
Inexhaustibly Curious

Usergroup: Moderators
Joined: Oct 13, 2011
Location: Sydney, Australia

Total Topics: 41
Total Posts: 4657

Last Blog: On being, and eating, the juices of other animals

#2 - Quote - Permalink
1 of 1 people found this post helpful
Posted Feb 5, 2013 - 6:48 PM:

I like how the OP starts, apokrisis, because quantum indistinguishability is a topic that has had me mesmerised for some time, with its deep, mysterious philosophical overtones and yet the crystal-clear, unambiguous equations it gives rise to.

But I couldn't see how you got to:

So the classical view of reality that is adopted in many philosophical debates about randomness vs determinism, or Humean supervenience, or the reduction of causality to local effective causes, just is not supported by the empirical evidence.


I can't see the connection between the indistinguishability of fundamental particles and issues of randomness and determinism.

It would be great if you could expand on how you see that connection arising.

Thanks
apokrisis
semper politice rectam!

Usergroup: Members
Joined: Feb 27, 2010

Total Topics: 16
Total Posts: 3584
#3 - Quote - Permalink
Posted Feb 5, 2013 - 7:51 PM:

andrewk wrote:
I can't see the connection between the indistinguishability of fundamental particles and issues of randomness and determinism.

It would be great if you could expand on how you see that connection arising.


I am talking about the belief that causality is founded on concrete particulars (and so there is no room for the 'magic' of holistic contextuality or downward causality, for example).

So for instance take David Lewis's famous assertion...


... all there is to the world is a vast mosaic of local matters of particular fact, just one little thing and then another. ... We have geometry: a system of external relations of spatiotemporal distances between points. Maybe points of space-time itself, maybe point-sized bits of matter or aether or fields, maybe both. And at those points we have local qualities: perfectly natural intrinsic properties which need nothing bigger than a point at which to be instantiated. For short: we have an arrangement of qualities. And that is all. There is no difference without difference in the arrangement of qualities. All else supervenes on that.

http://arxiv.org/ftp/arxiv/papers/0904/0904.2702.pdf


So this seems to validate a belief that even the random is secretly determined - supervenient on micro-causes. If we toss a coin, it seems the outcome is random. But the argument goes that this boils down to merely epistemic ignorance on our part, because in fact a combination of initial conditions and dynamical laws gave reality no other ontic choice. If the coin came up heads, there was some deterministic trail of micro-causal events that fixed that outcome. There was some exact microscale state of play that fixed the observed macroscale outcome.

Yet at the most fundamental level, reality doesn't act according to the statistical picture such an ontology would entail. We instead have superfluidity and Bose-Einstein condensates on the one hand - bosons crowding into the very same state - and stable electron shells on the other, where fermions can never share one.

QM offers a whole bunch of reasons to doubt a classical picture of the world. But this one seems especially pointed because it breaks with intuition in both directions - you have to be able to explain why you get a probability of a third with bosons and of 1 with fermions. This greatly constrains the scope of any pro-reductionist argument. It has to shoot down two things with the same bullet. :)
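
To put some flesh on that, here is a minimal sketch of where the three boson states and the single fermion state actually come from - it just symmetrises and antisymmetrises the two-spin basis, nothing interpretive:

```python
import numpy as np

# Single-particle spin basis: |0> (down) and |1> (up).
basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

def product_state(a, b):
    """Two-particle product state |a>|b>."""
    return np.kron(basis[a], basis[b])

def exchange_state(a, b, sign):
    """(|a>|b> + sign*|b>|a>), normalised; sign=+1 for bosons, -1 for fermions.
    Returns None if the combination vanishes - that state simply doesn't exist."""
    v = product_state(a, b) + sign * product_state(b, a)
    norm = np.linalg.norm(v)
    return v / norm if norm > 1e-12 else None

pairs = [(0, 0), (0, 1), (1, 1)]
boson_states   = [s for s in (exchange_state(a, b, +1) for a, b in pairs) if s is not None]
fermion_states = [s for s in (exchange_state(a, b, -1) for a, b in pairs) if s is not None]

print(len(boson_states), len(fermion_states))   # 3 and 1
# With every allowed state taken as equally likely, the spins differ in 1 of
# the 3 boson states, but in the only fermion state there is - hence 1/3 and 1.
```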

I think it is also a stronger challenge because it connects QM tightly to special relativity. The explanation for fermion vs boson statistics goes back to a global fact about chiral symmetry in a relativistic reference frame.

So the statistics can't be waved away with the familiar tactic of "oh, we don't have a proper interpretation of quantum theory yet, so I'll just wait until there is one". Most people think special relativity is pretty straightforward, and it certainly seems an accepted constraint on some of an ardent reductionist's possible get-out clauses, like "maybe Bohmian mechanics will rescue realism for me, if not locality".






Edited by apokrisis on Feb 13, 2013 - 1:09 AM. Reason: Added references for clarification
-Jove-
Unmoderated Member

Usergroup: Unmoderated Member
Joined: Jul 20, 2007
Location: Ireland

Total Topics: 8
Total Posts: 0
#4 - Quote - Permalink
Posted Feb 5, 2013 - 9:21 PM:

Perhaps making predictions through understanding within a controlled experiment is not so reflective of reality. I say this because I imagine that to predict the outcome of said coin one would need quite a substantial amount of real-time data and a way of using it, or perhaps just many years of practice, since the coin flipper/observer was the one who set up these initial conditions and dynamical laws. And for thought-play, perhaps by sheer chance (or enough experiments) ignorance of an imminent plane crash would mean the coin came up neither heads nor tails. :D

In all seriousness, while I have little or no understanding of QM, I do believe that in an epistemological sense the seeming 'secretly determined' aspect is only a reflection of our own aspiration to 'knowing'. We can see that the coin moves in a particular way in relation to gravity; we can throw it a certain way and predict its path. It can follow our prediction, many times. So I argue: repeat it until the end of time, and if that could be done I presume there would be many instances when something external to the control occurred, known or unknown, that changed the results. This external thing, I must point out, would have to come from reality.

Like a stubborn old man, I argue that because of this all knowledge is ultimately belief, and so I get into many endless debates that of course go nowhere. :)
apokrisis
semper politice rectam!

Usergroup: Members
Joined: Feb 27, 2010

Total Topics: 16
Total Posts: 3584
#5 - Quote - Permalink
Posted Feb 5, 2013 - 9:37 PM:

-Jove- wrote:
Perhaps making predictions through understanding within a controlled experiment is not so reflective of reality.


The plane crash is certainly an example of spontaneity crashing through from a higher unobserved scale of being. But here we are talking about spontaneity acting from the very bottom in terms of scale. And the controlled conditions would be needed to prevent the equivalent of plane crashes obscuring a clear view of whatever is the actual story at the microscale.

So probability depends on some system of constraints. This is obvious in a manufactured situation such as a coin-toss. But how is it also the case with reality in general? And why is the way it happens at a quantum level so different from our naive expectations - the kind of expectations we project onto a manufactured situation like a coin toss?

Quantum theory can seem to be talking about such spontaneity - the local effective causes of fluctuations and virtual particles. But look closer at the mathematical formalism and that kind of talk starts to be seen for what it is, a comforting epistemic fiction, a way to continue to think classically about something that is now behaving quite differently.




John Creighton
PF Addict

Usergroup: Members
Joined: Apr 22, 2012

Total Topics: 107
Total Posts: 1034
#6 - Quote - Permalink
Posted Feb 5, 2013 - 10:12 PM:

With regards to the supervenience issue, we discussed this in another thread starting here:
forums.philosophyforums.com...ndpost=1049401#post1049401

and concluded that there is both top-down and bottom-up supervenience because of the uncertainty principle.

With regards to this thread, I haven't been able to think of a killer argument which uses "a two-state, two-boson system" to drive home the point that apokrisis is making.


However, let's presume the converse and say that the quantum state is the result of some random or deterministic process which actually does depend on the state of each particle. In our two-particle system there are then 16 possible transitions, since each of the four classical configurations can move to any of the four. We went from considering only three possible quantum states to 16 possible transitions. This is quite an increase in complexity, and only for a two-particle system, and it would seem a dubious choice of artifice unless we had some concise theory of the dynamics. It certainly isn't consistent with Occam's razor.

The disparity between the actual number of physical states according to quantum physics and the number of state transitions we would have to consider if there were hidden dynamics will grow very fast with the number of particles. Further, by what law would these particles change state?
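
To give a rough sense of that scaling - this is just counting, under the assumption of N two-state particles - the gap opens up very quickly:

```python
# For N two-state particles:
#   bosons: a symmetric state is fixed by how many spins are up, so N + 1 states
#   hidden-dynamics picture: 2**N classical configurations, (2**N)**2 transitions
for n in (2, 3, 5, 10):
    boson_states = n + 1
    classical_configs = 2 ** n
    transitions = classical_configs ** 2
    print(f"N={n}: {boson_states} boson states vs {transitions} transitions")
```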

Here is a simple model of the converse of apokrisis's position:

Say that in the two-particle case, when the states have opposite polarity the particles spend half as long in that state, and all state transitions are equally likely. This would make the system appear as though the particles had indistinguishable statistics. For more particles we could simply divide by the binomial coefficient to get the time the system would spend in a given state, and continue to assume all state transitions are equally likely.
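
As a rough check that this weighting really would mimic the quantum numbers, a quick simulation of the model as described (every transition equally likely, half the dwell time in opposite-polarity states) spends about a third of its time with the spins opposed:

```python
import random

# The four classical configurations of the two particles.
states = [(0, 0), (0, 1), (1, 0), (1, 1)]

def dwell(state):
    # The model's rule: opposite-polarity configurations are held
    # for half as long as aligned ones.
    a, b = state
    return 0.5 if a != b else 1.0

random.seed(1)
time_in = {s: 0.0 for s in states}
current = random.choice(states)
for _ in range(200_000):
    time_in[current] += dwell(current)
    current = random.choice(states)   # every transition equally likely

total = sum(time_in.values())
print(round((time_in[(0, 1)] + time_in[(1, 0)]) / total, 3))   # ~0.333
```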

However, the question would be: "How is the arrangement of the particles causing the transition times to be such that the system appears to have the statistics of indistinguishable particles?" Why would we choose a much more complex model just to hold onto the notion that the system dynamics are actually governed by the behavior of individual particles? If we are going to be so stubborn with our presumptions, why don't we just presuppose god?
apokrisis
semper politice rectam!

Usergroup: Members
Joined: Feb 27, 2010

Total Topics: 16
Total Posts: 3584
#7 - Quote - Permalink
Posted Feb 5, 2013 - 11:11 PM:

John Creighton wrote:
However, let's presume the converse and say that the quantum state is the result of some random or deterministic process which actually does depend on the state of each particle. In our two-particle system there are then 16 possible transitions, since each of the four classical configurations can move to any of the four.


You appear to be talking about something else now. This poser is specifically about quantum situations where there are two entangled particles whose state has been "randomised" by allowing their wavefunction to evolve in time and interact with a noisy environment. As a composite system, it isn't undergoing separable transitions. There is only the one "transition" when a definite measurement is made.

John Creighton
PF Addict

Usergroup: Members
Joined: Apr 22, 2012

Total Topics: 107
Total Posts: 1034
#8 - Quote - Permalink
Posted Feb 5, 2013 - 11:21 PM:

apokrisis wrote:


You appear to be talking about something else now. This poser is specifically about quantum situations where there are two entangled particles whose state has been "randomised" by allowing their wavefunction to evolve in time and interact with a noisy environment. As a composite system, it isn't undergoing separable transitions. There is only the one "transition" when a definite measurement is made.



Yeah, but we could get the same statistics this way - as erroneous as it would be. However, because we can't measure the states of the individual particles without disturbing the system we have good reason to believe that such a classical interpretation is wrong.
apokrisis
semper politice rectam!

Usergroup: Members
Joined: Feb 27, 2010

Total Topics: 16
Total Posts: 3584
#9 - Quote - Permalink
Posted Feb 5, 2013 - 11:27 PM:

John Creighton wrote:
With regards to the supervenience issue, we discussed this in another thread starting here:
forums.philosophyforums.com...ndpost=1049401#post1049401

and concluded that there is both top-down and bottom-up supervenience because of the uncertainty principle.


Yes, in case it isn't clear, the problem here is that even if the situation is explained in terms of downward-acting constraints, there must still be some local degrees of freedom for the constraints to act upon. There must be some requisite variety or spontaneity coming up the other way to get shaped into a statistical regularity.

apokrisis
semper politice rectam!

Usergroup: Members
Joined: Feb 27, 2010

Total Topics: 16
Total Posts: 3584
#10 - Quote - Permalink
Posted Feb 5, 2013 - 11:34 PM:

John Creighton wrote:


Yeah, but we could get the same statistics this way - as erroneous as it would be. However, because we can't measure the states of the individual particles without disturbing the system we have good reason to believe that such a classical interpretation is wrong.


I'd have to see the working out of the example to follow you here.

For instance, two fermions could never transition into 0/0 or 1/1 states. Even with bosons, where are the 16 different combinations now coming from?