Wednesday, April 14, 2010

A Hands-On Introduction to Single Photons and Quantum Mechanics for Undergraduates

This is another very useful demonstration for undergraduate students of the quantum nature of light and how it differs from the classical description. Fascinatingly enough, it appears that this is a lab at the sophomore level. Maybe sophomores nowadays are more "sophisticated" than when I was in college, but this would certainly require quite a bit more knowledge of QM than I had at that level back then.

Abstract: We describe a series of experiments used in a sophomore-level quantum physics course that are designed to provide students with a hands-on introduction to quantum mechanics. By measuring correlations, we demonstrate that a helium-neon laser produces results consistent with a classical model of light. We then demonstrate that a light source derived from a spontaneous parametric down-conversion process produces results that can only be described using a quantum theory of light, thus providing a (nearly) single-photon source. These single photons are then sent into a Mach–Zehnder interferometer, and interference fringes are observed whenever the path of the photons cannot be determined. These experiments are investigated theoretically using straightforward quantum-mechanical calculations.

B.J. Pearson and D.P. Jackson, Am. J. Phys. v.78, p.471 (2010).
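For readers who want to see what the "measuring correlations" step amounts to in practice: the standard figure of merit in this kind of Grangier-style test is the anticorrelation parameter alpha = N_A N_ABB' / (N_AB N_AB'), where N_A is the number of herald (gate) events at one detector, N_AB and N_AB' are the twofold coincidences with the two beam-splitter outputs, and N_ABB' is the threefold coincidences. Any classical field model gives alpha >= 1, while an ideal single-photon source gives alpha = 0. Here is a minimal sketch of the computation; the function name and the counts are illustrative, not taken from the paper:

```python
def anticorrelation_alpha(n_a, n_ab, n_abp, n_abbp):
    """Grangier-style anticorrelation parameter.

    n_a    -- herald (gate) counts at detector A
    n_ab   -- A-B twofold coincidence counts
    n_abp  -- A-B' twofold coincidence counts
    n_abbp -- A-B-B' threefold coincidence counts

    A classical field predicts alpha >= 1; an ideal
    single-photon source gives alpha = 0.
    """
    return (n_a * n_abbp) / (n_ab * n_abp)

# Illustrative counts only; a result much less than 1
# signals nonclassical (single-photon) light.
print(anticorrelation_alpha(n_a=100_000, n_ab=9_000,
                            n_abp=9_100, n_abbp=6))  # ~0.007
```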

What makes this paper so wonderful is that it is full of references and resources, and it also discusses much of the relevant background physics. So for someone who wants to learn about the physics behind these phenomena, it is almost a one-stop shop.

This paper complements the ones that I've mentioned earlier that also described undergraduate laboratory experiments or demonstrations.

Zz.

2 comments:

Peter said...

From my point of view, this is very much better than the paper that was the subject of yesterday's post. Enough better, indeed, that I find it inspiring. I particularly like their hat-tip to Willis Lamb, "It has been suggested that the word photon itself contributes to the widespread misunderstanding that surrounds the word and should therefore be eliminated" (their ref 21), which fits well with your approving notice on Art Hobson's approach to quantum theory, at http://physicsandphysicists.blogspot.com/2008/08/art-hobsons-selected-papers.html. You also have Art Hobson in your links sidebar, so I will comment on this aspect.

Art Hobson's approach is, I take it, to say that what causes events in measurement apparatus is the quantum field that is generated by the preparation apparatus and contained by the experimental apparatus as a whole, as a way of not talking about particles. I invite readers to refer to the papers you link to for details. Although we can talk about states that are created by the action of a single creation operator on the vacuum, we can also construct arbitrary superpositions of such a state with the vacuum (at least, we can for the EM field). Such superpositions undermine the idea that there is an irreducible "single-photon state", because a superposition is non-trivially different from a mixture of the same two states, and because we can continuously deform the ratio of the vacuum and single-photon components.
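For concreteness, a minimal sketch in the 0/1-photon-number basis (my notation, not P&J's):

```latex
|\psi\rangle = \cos\theta\,|0\rangle + e^{i\phi}\sin\theta\,|1\rangle,
\qquad
\rho_{\mathrm{sup}} = |\psi\rangle\langle\psi| =
\begin{pmatrix}
  \cos^2\theta & e^{-i\phi}\cos\theta\sin\theta \\
  e^{i\phi}\cos\theta\sin\theta & \sin^2\theta
\end{pmatrix},
\qquad
\rho_{\mathrm{mix}} =
\begin{pmatrix}
  \cos^2\theta & 0 \\
  0 & \sin^2\theta
\end{pmatrix}.
```

The off-diagonal coherences are what make the superposition non-trivially different from the mixture, and varying theta continuously deforms the vacuum/single-photon ratio.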

Section IIIA of the Pearson and Jackson paper, "Photons at last", is where things get interesting. They discuss the two outputs of a parametric down-conversion crystal. They point out something that I had not previously focused on: for one of those outputs considered in isolation, the measure of coincidences between measurement events after a beam splitter (B and B' in P&J's paper) is given by alpha = 1, the same as for a coherent state. In contrast, and crucially, the same coincidence measure conditioned on the times at which measurement events are observed in the other down-conversion channel (BB' conditioned on A in P&J's paper) is given by alpha << 1. We can therefore infer that if we condition on times when measurement events are not observed in the other down-conversion channel, the measure of coincidences would be given by alpha > 1 (BB' conditioned on not A).
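The last inference can be made explicit with the law of total probability (a sketch, under the assumptions that the A-conditioned coincidence rate is negligible and that conditioning on A raises the singles rates at B and B'):

```latex
P(BB') = P(BB'\mid A)\,P(A) + P(BB'\mid\lnot A)\,P(\lnot A)
\;\approx\; P(BB'\mid\lnot A)\,P(\lnot A),
```

so P(BB'|not A) ≈ P(BB')/P(not A) >= P(BB') = P(B)P(B'), where the last equality uses alpha = 1 for the unconditioned case. Heralding makes P(B|A) > P(B), which forces P(B|not A) < P(B), and likewise for B'. Hence

```latex
\alpha_{\lnot A}
= \frac{P(BB'\mid\lnot A)}{P(B\mid\lnot A)\,P(B'\mid\lnot A)}
> \frac{P(B)\,P(B')}{P(B)\,P(B')} = 1.
```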

Thus, we can say that the BB' coincidences are "modulated" by the down-conversion process. The "modulation" of BB' coincidences is random, because the timing of measurement events at A is not periodic, but nonetheless the same random modulation is applied to the A signal and to the BB' signal.

The advantage of this way of talking is that we no longer have to worry about whether a photon goes one way or another. We can instead understand a quantum field state that is a tensor product of two classically oscillating signals, which we take to cause measurement events and coincidences of measurement events, as describing specific random modulations of measurement coincidences.

Discussing QFT in something like signal-processing language does not change the formalism, which remains that of Fock space and the operators that act on it. Hilbert-space methods are routinely used in signal processing for deterministic signals; the discussion of the mathematics of Fock space suggested here just introduces a specific kind of random modulation of deterministic signal processing as an interpretation of the math.

Apologies for length.

Anonymous said...

Agreed: only photon-counting statistics can be fit to antibunching data. Yet these statistics are derived for isotropic radiation in a cavity, not for the highly directional radiation in an interferometer. What's going on here? A PDC source, for example, doesn't seem like a cavity. Please enlighten!