My previous blog discussed some historical papers related to the intensity interferometer and its connection to quantum optics. Here, I explain the basic physics of an intensity interferometer.
In the context of spatial coherence, coherence theory expresses the degree of spatial coherence as

\[ \gamma_{12} = \frac{\left\langle U_1^{*}(t)\, U_2(t) \right\rangle}{\sqrt{\left\langle \left| U_1(t) \right|^{2} \right\rangle \left\langle \left| U_2(t) \right|^{2} \right\rangle}}, \]

with \( U_i(t) \) representing the fields of sources \( i = 1 \) and \( 2 \).
An intensity interferometer measures the intensity correlation function between such sources. If \( I_1(t) \) is the intensity of source 1 and \( I_2(t) \) is the intensity of source 2, then, for thermal light, the intensity correlation function is given by:

\[ \left\langle I_1(t)\, I_2(t) \right\rangle = \left\langle I_1 \right\rangle \left\langle I_2 \right\rangle + \left\langle I_1 \right\rangle \left\langle I_2 \right\rangle \left| \gamma_{12} \right|^{2}. \]
If one ignores the background (the first term in the sum of the above equation) and considers only the fluctuations in the signal (the second term), then the term of relevance will be:

\[ \left\langle \Delta I_1(t)\, \Delta I_2(t) \right\rangle = \left\langle I_1 \right\rangle \left\langle I_2 \right\rangle \left| \gamma_{12} \right|^{2}. \]

The signal in the intensity interferometer is thus proportional to \( \left| \gamma_{12} \right|^2 \).
A conventional interferometer measures a signal that is proportional to \( \left| \gamma_{12} \right| \), which includes the amplitude and phase, whereas an intensity interferometer measures a signal proportional to \( \left| \gamma_{12} \right|^2 \), which is not sensitive to the phase.
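To make the phase insensitivity concrete, here is a minimal numerical sketch (my own illustration, not from the papers discussed): two thermal fields are synthesized with an assumed complex degree of coherence \( \gamma_{12} = 0.6\, e^{i\,0.3} \) (the value is arbitrary), and the normalized correlation of their intensity fluctuations is estimated. The estimate recovers \( \left| \gamma_{12} \right|^2 \), and the phase of \( \gamma_{12} \) drops out entirely.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
gamma = 0.6 * np.exp(1j * 0.3)  # assumed degree of coherence (illustrative)

# Circular complex Gaussian (thermal) fields with unit mean intensity
A = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
B = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
U1 = A
U2 = gamma * A + np.sqrt(1 - abs(gamma) ** 2) * B  # <U1* U2> = gamma

I1, I2 = abs(U1) ** 2, abs(U2) ** 2

# Normalized intensity-fluctuation correlation; should approach |gamma|^2
corr = np.mean((I1 - I1.mean()) * (I2 - I2.mean())) / (I1.mean() * I2.mean())
print(corr, abs(gamma) ** 2)
```

Changing the assumed phase 0.3 to any other value leaves the estimate unchanged, which is precisely why an intensity interferometer can do without sub-wavelength path stability.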
Intensity interferometers have certain advantages compared to conventional interferometers (such as the Michelson interferometer). Below is a partial list:
Intensity measurements (unlike amplitude or phase) can be done directly using optoelectronic instruments.
They do not require precise, sub-wavelength optical alignment, unlike amplitude- or wavefront-dividing interferometers.
They can be used with two detectors that are placed far apart, thereby improving the spatial resolution of the measurement (relevant in astronomy).
A constraint of an intensity interferometer is that the participating source must be bright.
There is an important connection between quantum optics and radio astronomy. Hanbury Brown and Twiss in the 1950s devised the intensity interferometer.
Particularly, they were interested in measuring the ‘diameter of discrete radio sources’. The title of their seminal paper reads “A new type of interferometer for use in radio astronomy”. As the authors claimed in their paper: “The principle of the instrument is based upon the correlation between the rectified outputs of two independent receivers at each end of a baseline, and it is shown that the cross-correlation coefficient between these outputs is proportional to the square of the amplitude of the Fourier transform of the intensity distribution across the source.”(Brown and Twiss, 1954)
First, they tested their technique in a laboratory situation and followed it up with a measurement of the diameter of Sirius. Their technique was a game-changer in measuring the diameter of bright stars.
As the intensity interferometers were being developed, the laser was realized in the early 1960s. Unlike conventional light sources, laser light is coherent, and this brings in unique features that can be used to understand the nature of light. In the context of laser optics, intensity interferometers had immediate utility in studying coherence through correlation measurement. It was logical to combine lasers with intensity interferometers and study the correlation. This combination is what led to the discovery of some fascinating aspects of quantum properties of light, including anti-bunching.
If the book by Born and Wolf is considered a classic on the electromagnetic theory of light, its quantum counterpart is the book by Leonard Mandel and Emil Wolf titled Optical Coherence and Quantum Optics.
This book discusses the interface of statistical optics, optical coherence, and quantum optics. The core argument of the book starts with probability theory and its connection to fluctuations of light and builds optical coherence, polarization, and eventually quantum optical effects of light. It is a well-written treatise on light with a flavor of experiments (Mandel did some pioneering experiments in quantum optics) and theoretical explanation (a hallmark of Wolf).
In the preface of the book, they bring together the importance of intensity interferometers and the discovery of lasers and explain how and why it led to a deeper understanding of quantum optics:
“Prior to the development of the first lasers in the 1960s, optical coherence was not a subject with which many scientists had much acquaintance, even though early contributions to the field were made by several distinguished physicists, including Max von Laue, Erwin Schrodinger and Frits Zernike. However, the situation changed once it was realized that the remarkable properties of laser light depended on its coherence. An earlier development that also triggered interest in optical coherence was a series of important experiments by Hanbury Brown and Twiss in the 1950s, showing that correlations between the fluctuations of mutually coherent beams of thermal light could be measured by photoelectric correlation and two-photon coincidence counting experiments. The interpretation of these experiments was, however, surrounded by controversy, which emphasized the need for understanding the coherence properties of light and their effect on the interaction between light and matter.” (Mandel and Wolf, 1995, p. 1)
This further led to a series of studies on light-matter interaction from a coherence perspective, including analysis of the fluctuations of light through the randomness and the associated statistics of those fluctuations. Mandel, Wolf, Glauber, E. C. G. Sudarshan and many others across the world laid the foundation for the connection between optical coherence and quantum optics. What started as a technical development in radio astronomy turned out to be a vital tool in quantum optics.
Brown, R. Hanbury, and R. Q. Twiss. ‘LXXIV. A New Type of Interferometer for Use in Radio Astronomy’. Philosophical Magazine 45, no. 366 (1954): 663–82. https://doi.org/10.1080/14786440708520475.
Brown, R. Hanbury, and R. Q. Twiss. ‘Correlation between Photons in Two Coherent Beams of Light’. Nature 177, no. 4497 (1956): 27–29. https://doi.org/10.1038/177027a0.
Hanbury Brown, R., and R. Q. Twiss. ‘A Test of a New Type of Stellar Interferometer on Sirius’. Nature 178, no. 4541 (1956): 1046–48. https://doi.org/10.1038/1781046a0.
If you want to admire complex analysis for its elegance and visual utility, try quantum optics, specifically the description of quantum states. Thanks to creation and annihilation operators, the position- and momentum-like variables of a quantum optical field can be represented as quadratures. These entities can then be represented on the orthogonal axes of a complex plane. Building up Argand diagrams from the classical electromagnetic field and then extrapolating them to quantum theory is a tribute to this geometrical representation. The fact that two axes can be utilized to represent the real and imaginary parts of the defined state is itself an interesting thing. By certain operations within the plane, one can realize the vacuum state, the coherent state, and the squeezed state of quantum optics.
The Vacuum Spread – One of the major consequences of quantum theory, and especially of second quantization, is the realization of the vacuum state. Even when there are zero photons, there is a residual energy in the system that manifests as the vacuum state. How to define the presence or absence of a photon is a different proposition, because vacuum states are also associated with something called virtual photons; that needs a separate discussion. Anyway, in the complex plane of quadratures, a vacuum state is represented by a circular blob and not a point (see Fig. 1). It is the spread of the blob that indicates the uncertainty. In a way, it is an elegant representation of the uncertainty principle itself, because the spread in the plane indicates the error in its measurement. Importantly, it emphasizes the point that no matter how low the energy of the system is, there is an inherent uncertainty in the quadratures of the field. This also forms the fundamental difference between a classical and a quantum state. The measurement of vacuum fluctuations is a challenging task, but one of their most prevalent consequences is the ubiquitous spontaneous emission. If one looks at the emission process in terms of stimulated and spontaneous pathways, then the logical consequence of the vacuum state becomes evident: in the quantum-optics literature, spontaneous emission is also defined as stimulated emission triggered by vacuum-state fluctuations. It is an interesting viewpoint and helps us build a picture of the emission process vis-à-vis stimulated emission.
Figure 1. Vacuum state representation. Note that its centre is at the origin and that it has a finite spread across all four quadrants. Figure adapted from ref. 2.
Another manifestation of the vacuum state is the Casimir effect, where an attractive force is induced as you bring two parallel plates close to each other. The distance being of the order of the wavelength or below this triggers a fascinating phenomenon which has deep implications not only in understanding the fundamentals of quantum optics and electrodynamics, but also in the design and development of quantum nanomechanical devices.
A shift in the plane – Coherent states are also described as displaced vacuum states, and this displacement is evident in the Argand diagram. The quadratures can now help us visualize the uncertainty in the phase and in the number of photons in the optical field. One of the logical consequences of the coherent state is the number-phase uncertainty. This becomes clear if one observes the spread in the angle of the vector and the radius of the represented blob (see Fig. 2). Notice that the blob still exists; the only difference is that its location has shifted. The consequence of this spread has a deeper connection to the uncertainty in the average number of photons and in the phase of the optical field. The connection to the number of photons is through \( |\alpha| \), which is the square root of the average number of photons. Taken together, the blob in the Argand diagram represents the number-phase uncertainty.
Figure 2. Coherent state representation. Note that its centre is displaced. Figure adapted from ref. 2.
Lasers are the prototypical examples of coherent states. The fact that they obey Poissonian statistics is a direct consequence of the photon-number variance being equal to the average number of photons, so that the standard deviation is the square root of the mean. This means one can use photon statistics to discriminate between sources that are sub-Poissonian, Poissonian, or super-Poissonian in nature. Thermal light is the super-Poissonian case, while sub-Poissonian light is non-classical, with number states and single-photon sources as the extreme examples. The coherent states sit in the middle, obeying Poissonian statistics.
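The three regimes can be checked with a short Monte-Carlo sketch (my own illustration; the mean photon number nbar = 5 is arbitrary), using the Fano factor \( \mathrm{Var}(n)/\langle n \rangle \) as the discriminator: it equals 1 for Poissonian, exceeds 1 for super-Poissonian, and falls below 1 for sub-Poissonian light.

```python
import numpy as np

rng = np.random.default_rng(1)
nbar, N = 5.0, 500_000

# Photon-number samples for three kinds of light, all with mean nbar
samples = {
    "coherent": rng.poisson(nbar, N),                 # Poissonian
    "thermal": rng.geometric(1 / (1 + nbar), N) - 1,  # Bose-Einstein, super-Poissonian
    "fock": np.full(N, int(nbar)),                    # number state, sub-Poissonian
}

# Fano factor Var(n)/<n>: = 1 Poissonian, > 1 super-, < 1 sub-Poissonian
fano = {name: n.var() / n.mean() for name, n in samples.items()}
for name, f in fano.items():
    print(name, round(f, 3))
```

For thermal light the Fano factor comes out close to \( 1 + \bar{n} \), since the Bose-Einstein variance is \( \bar{n}(1 + \bar{n}) \).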
Everything has a cost – Once you have a circle with a defined area, it will be interesting to ask: Can you ‘squeeze’ this circle without changing its area? The answer is yes, and that is what manifests as a squeezed state. In this special state, one can squeeze the blob along one of the axes at the cost of a spread in the orthogonal direction. This converts the circle into an ellipse (see Fig. 3).
Figure 3. Squeezed State. Note the circle has been squeezed into an ellipse. Figure adapted from ref. 2.
Note that the area must be conserved, which means that the uncertainty principle still holds; the reduction in the uncertainty along one axis is compensated by an increase along the other. This geometrical trick has a deep connection to the behaviour of an optical field. If one squeezes the blob along the amplitude axis, one creates an amplitude-squeezed state. This means the uncertainty in counting the photons in that state is reduced, albeit at the cost of increased uncertainty in the measurement of phase. Similarly, if one squeezes the blob along the phase axis, then we end up with a lowered uncertainty in the optical phase, at the cost of increased uncertainty in the photon number. I should mention that the concept of optical phase itself is not clearly defined in quantum optics, because the phase is only defined modulo \( 2\pi \), which makes the construction of a well-behaved phase operator problematic. An interesting application of phase-squeezed quantum states is in interferometric measurements. By reducing the uncertainty in the phase, one can make highly accurate measurements of phase shifts, so much so that this has direct implications for high-precision measurements, including gravitational-wave detection. The anticipation is also that such tiny shifts can be helpful in observing feeble fluctuations in macroscopic quantum systems.
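The area-conservation argument can be written down in two lines. In the convention where each vacuum quadrature has variance 1/4 (an assumption on my part; conventions differ by factors of 2), a squeezed vacuum with squeezing parameter r has variances \( e^{-2r}/4 \) and \( e^{+2r}/4 \), whose product stays fixed:

```python
import numpy as np

# Quadrature variances of a squeezed vacuum state, assuming the convention
# where the vacuum has Var(X) = Var(P) = 1/4.
for r in [0.0, 0.5, 1.0, 1.5]:            # r = squeezing parameter (illustrative)
    var_x = np.exp(-2 * r) / 4            # squeezed quadrature shrinks
    var_p = np.exp(+2 * r) / 4            # anti-squeezed quadrature grows
    print(r, var_x, var_p, var_x * var_p) # product stays at 1/16
```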
Pictures can lead to more than 1,000 words. And if you add them to a quantum optical description, as in the case of the states I have described, they create a quantum tapestry. Perhaps this is the beauty of physics, where there is a coherence between mathematical language, geometrical representation, and physical reality. Feynman may have semi-jokingly said, “Nobody understands quantum mechanics,” but he forgot to add that there is great joy in the process of understanding through mathematical pictures. After all, he knew the power of diagrams.
References:
Ficek, Zbigniew, and Mohamed Ridza Wahiddin. Quantum Optics for Beginners. 1st edition. Jenny Stanford Publishing, 2014.
Fox, Mark. Quantum Optics: An Introduction. Oxford Master Series in Physics 15. Oxford University Press, 2006.
Gerry, Christopher C., and Peter L. Knight. Introductory Quantum Optics. Cambridge University Press, 2024.
Saleh, B. E. A., and M. C. Teich. Fundamentals of Photonics. 2nd edition. Wiley India Pvt Ltd, 2012.
Jan 2026 – Apr 2026 – I am teaching a course on Quantum Optics. Below you will find some random thoughts and notes related to my reading. I will be updating the list as I go along the semester. You can add your comments below.
Anyone interested in physics should know a bit about renormalized QED and the efforts that went into it… It remains a benchmark of how experiment and theory elevate each other…
Hari Dass (erstwhile IMSc) made an interesting observation on FB: “It’s unfortunate that after all those and subsequent developments, a mystery is being built out of renormalisation. It was the price to pay for assuming, without any justification, that the microscopic description held to arbitrarily small distances. Wilson, Schwinger and even Feynman have clarified that the right way to do physics is to start with an effective description with a cutoff, which can be fully quantum in nature, and keep extending it to higher and higher scales with the help of further data, as well as with better theoretical understanding.”
“The photon is the only particle that was known as a field before it was detected as a particle.”
This is how Weinberg introduces the birth of quantum field theory. He further adds: “Thus it is natural that the formalism of quantum field theory should have been developed in the first instance in connection with radiation and only later applied to other particles and fields.” (S. Weinberg, The Quantum Theory of Fields, p. 15, 1995)
Sudipta Sarkar (IIT G) made an interesting observation on Facebook:
“In some sense, it did right! Dirac started QFT with the effort to quantise radiation! But formally, it is not easy to write down the quantum version of electrodynamics owing to gauge symmetry. It took quite a bit of time to understand how to manage a quantum theory with massless states!“
My reply: “indeed..the reconciliation of symmetry was a bottleneck. I am also amazed by the progress of thought, especially by Dirac, who took the harmonic oscillator problem and treated it the way he did. Historically, the question of quantization of particles was already an established programme, but to quantize the field was indeed a major challenge, and hence ‘second quantization’.“
The concept of creation and annihilation operators is an intriguing one because it takes the bracket structure of classical mechanics and transfers it into the commutation relations of quantum mechanics. This intellectual connection is mainly attributed to Dirac, and historically it has been one of the most important connections to be made. The question of field quantization already existed in the 1920s, but it was Dirac who made the connection in a systematic and mathematically consistent way.
In the context of the quantum harmonic oscillator model of electromagnetic radiation, the shift from canonical variables such as position and momentum to creation and annihilation operators is a fascinating one. Interestingly, this progression further leads to the so-called number operator. It is also a progression from Hermitian to non-Hermitian and back again to a Hermitian operator. In the process of understanding the number operator, one realizes that the ground state carries the so-called zero-point energy. Taken further, the fact that the number operator does not commute with the electric field of the radiation results in the number-amplitude uncertainty. This gives an insight into why the field amplitude has a non-zero spread even for the n = 0 state, which results in the so-called vacuum fluctuations.
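This progression is easy to play with numerically. Below is a small sketch (the standard matrix representation in a truncated Fock basis; the dimension 20 is an arbitrary choice of mine) of the annihilation operator, the number operator, their commutator, and the zero-point term in the Hamiltonian:

```python
import numpy as np

dim = 20                 # truncated Fock-space dimension (illustrative)
n = np.arange(dim)

# Annihilation operator in the number basis: a|n> = sqrt(n)|n-1>
a = np.diag(np.sqrt(n[1:]), k=1)
adag = a.conj().T        # creation operator (non-Hermitian pair)

num = adag @ a           # number operator: Hermitian again, diag(0, 1, 2, ...)
comm = a @ adag - adag @ a   # [a, a^dag] = identity, except at the truncation edge

H = num + 0.5 * np.eye(dim)  # H/(hbar*omega): ground state carries the 1/2
print(np.diag(num)[:4])      # eigenvalues 0, 1, 2, 3
print(np.diag(comm)[:4])     # 1, 1, 1, 1
print(H[0, 0])               # zero-point energy 0.5
```

The last diagonal entry of the commutator is spoiled by the truncation, a standard artifact of working in a finite-dimensional Fock space.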
The word photon has an interesting and surprising origin – see this paper.
also see: Kragh, Helge. ‘The Names of Physics: Plasma, Fission, Photon’. The European Physical Journal H 39, no. 3 (2014): 263–81. https://doi.org/10.1140/epjh/e2014-50007-7.
Born & Wolf to Mandel & Wolf – a blog on a famous book and on the connection between radio astronomy and quantum optics.
Hong, C. K., Z. Y. Ou, and L. Mandel. ‘Measurement of Subpicosecond Time Intervals between Two Photons by Interference’. Physical Review Letters 59, no. 18 (1987): 2044–46. https://doi.org/10.1103/PhysRevLett.59.2044.
The references below discuss a few contemporary yet simple approaches toward the HOM experiment.
DiBrita, Nicholas S., and Enrique J. Galvez. ‘An Easier-to-Align Hong–Ou–Mandel Interference Demonstration’. American Journal of Physics 91, no. 4 (2023): 307–15. https://doi.org/10.1119/5.0119906.
Bjurlin, Cyrus, and Theresa Chmiel. ‘A Versatile Hong–Ou–Mandel Interference Experiment in Optical Fiber for the Undergraduate Laboratory’. American Journal of Physics 93, no. 2 (2025): 180–86. https://doi.org/10.1119/5.0210869.
Prof. Supradeepa from IISc made an important observation related to the non-Poissonian distribution and anti-bunching, as follows: “When I taught quantum optics earlier this semester, there was an interesting discussion with students which I had not given thought to previously. I have seen the terms anti-bunching and g2(0) < 1 sometimes interchanged. But the idea is that g2(0) < 1 is only non-Poissonian, while the stronger condition g2(0) < g2(τ) is also needed to have anti-bunching. An easy-to-calculate example is Fock states |n⟩ for n > 1: g2(τ) = 1 − 1/n, so these states are non-Poissonian because g2(0) < 1, but not anti-bunched.”
My reply: This is an important point, and Fox’s book has a small discussion related to this non-equivalence: a sub-Poissonian distribution and anti-bunching can overlap, but need not be the same. As you mentioned, both g2(0) < 1 and g2(0) < g2(τ) have to be satisfied. The criteria for a single-photon source are much stricter than those for a sub-Poissonian light source. In my lecture, I do mention this, as seen in the picture…
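The Fock-state example can be verified in a couple of lines, using \( g^{(2)}(0) = \langle n(n-1) \rangle / \langle n \rangle^2 \), which for \( |n\rangle \) evaluates to \( 1 - 1/n \):

```python
# g2(0) = <n(n-1)>/<n>^2 for a Fock state |n>, which works out to 1 - 1/n
g2 = {n: n * (n - 1) / n**2 for n in range(1, 6)}
for n, val in g2.items():
    print(n, val)
# All values lie below 1 (non-Poissonian), yet g2 is flat in tau for a
# number state, so the anti-bunching condition g2(0) < g2(tau) is not met.
```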
The first of these quotes by Feynman is a guiding principle for anyone who wants to learn. The second quote is an idealistic one, but a good approach to becoming a ‘problem-solving’ researcher. Feynman was a master of this approach.
From a philosophy of science perspective, researchers can be both ‘problem creators’ and ‘problem solvers’. The latter ones are usually famous.
Michael Nielsen, a pioneer of quantum computing and a champion of the open science movement, has an essay titled “Principles of Effective Research”, in which he explicitly identifies these two categories of researchers and mentions that “they’re not really disjoint or exclusive styles of working, but rather idealizations which are useful ways of thinking about how people go about creative work.”
He defines a problem solver as one “who works intensively on well-posed technical problems, often problems known (and sometimes well-known) to the entire research community in which they work.” Interestingly, he connects this to the sociology of researchers, and mentions that they “often attach great social cachet to the level of difficulty of the problem they solve.”
On the other hand, problem creators, as Nielsen indicates, “ask an interesting new question, or pose an old problem in a new way, or demonstrate a simple but fruitful connection that no-one previously realized existed.”
He acknowledges that such bifurcation of researchers is an idealization, but a good model to “clarify our thinking about the creative process.”
Central to both of these processes is the problem itself, and what is a good research problem depends both on the taste of an individual and the consensus of a research community. This is one of the main reasons why researchers emphasize defining a problem so much. A counterintuitive aspect of the definition of the problem is that one does not know how good the ‘question’ is until one tries to answer and communicate it to others. This means feedback plays an important role in pursuing the problem further, and this aptly circles back to Feynman’s quote: “What I cannot create, I do not understand”.
More than 22 years ago, I started my journey as a research student in theoretical physics – Quantum Electrodynamics (QED) + Radiative Transfer (MSc summer project at the Indian Institute of Astrophysics), and my special paper in the MSc final semester was QED. Later in my PhD, I branched into experiments on light scattering (Raman, Mie & Rayleigh).
Over the years, QED and quantum optics have always been at the back of my mind while studying, researching and teaching.
Come January, I will be teaching a course on Quantum Optics to MS (Quantum Tech), MS-PhD, and 4th-year physics UG students.
I designed the first course on this topic at IISER Pune about a decade ago with the able inputs from Prof. Rajaram Nityananda, and I have taught the course a few times. Now, after a few years, I will teach it again.
With the emergence of quantum sci & tech, there is a new impetus and excitement on this topic.
Having said that, the foundations of the topic remain the same, and Quantum Optics has a wonderful history and philosophy associated with it…and where better to start than Dirac’s classic (see below).
Recently, I was talking to a college student who had read some of my blogs. He was interested in knowing what it means to humanize science. I told him that there are at least three aspects to it.
The first is to bring out the wonder and curiosity of a human being in the pursuit of science. The second is to emphasize human qualities such as compassion, effort, mistakes, wrong directions, greed, competition and humour in the pursuit of science. The third is to bring out the utilitarian perspective.
The student was able to understand the first two points but wondered why utility was important in the pursuit of humanizing science. I mentioned that the origins of curiosity and various human tendencies can also be intertwined with the ability to use ideas. Some of the great discoveries and inventions, including those in the so-called “pure science” categories, have happened in the process of addressing a question that had its origin in some form of an application.
Some of the remarkable ideas in science have emerged in the process of applying another idea. Two great examples come to mind: the invention of the laser, and pasteurization.
I mentioned that economics has had a major role in influencing human ideas – directly or indirectly. As we conversed, I told the student that there is sometimes a tendency among young people who are motivated to do science to look down upon ideas that may have application and utility. I said that this needs a change in the mindset, and one way to do so is to study the history, philosophy and economics of science. I said that there are umpteen examples in history where applications have led to great ideas, both experimental and theoretical in nature, including mathematics.
“….So many people today—and even professional scientists—seem to me like someone who has seen thousands of trees but has never seen a forest. A knowledge of the historic and philosophical background gives that kind of independence from prejudices of his generation from which most scientists are suffering. This independence created by philosophical insight is—in my opinion—the mark of distinction between a mere artisan or specialist and a real seeker after truth.” (Einstein, in a 1944 letter to Robert Thornton)
The student was pleasantly surprised and asked me how this is connected to economics. I mentioned that physicists like Marie Curie, Einstein and Feynman did think of applications and referred to the famous lecture by Feynman titled “There is Plenty of Room at the Bottom” (1959).
To give a gist of his thinking, I showed what Feynman had to say on miniaturization:
“There may even be an economic point to this business of making things very small. Let me remind you of some of the problems of computing machines. In computers we have to store an enormous amount of information. The kind of writing that I was mentioning before, in which I had everything down as a distribution of metal, is permanent. Much more interesting to a computer is a way of writing, erasing, and writing something else. (This is usually because we don’t want to waste the material on which we have just written. Yet if we could write it in a very small space, it wouldn’t make any difference; it could just be thrown away after it was read. It doesn’t cost very much for the material).”
I mentioned that this line of thinking on miniaturization is now a major area of physics and has reached the quantum limit. The student was excited and left after noting the references.
On reflecting on the conversation, now I think that there is plenty of room to humanize science.
The 2025 Nobel Prize in physics has been awarded to John Clarke, Michel H. Devoret and John M. Martinis “for the discovery of macroscopic quantum mechanical tunnelling and energy quantisation in an electric circuit.“
The Nobel Prize webpage has an excellent summary of the work at popular and technical levels.
In this blog, I want to draw attention to an interesting review article from 1984 by Anthony Leggett that presages the awarded work. Leggett is himself a Nobel laureate (2003), and his work on the theory of superconductors and superfluids forms one of the conceptual foundations of this year’s prize. Four of his papers are cited by the Nobel committee in the scientific background to the award, and one of them is the review article I wish to emphasize here.
The title of Leggett’s review is “Schrödinger’s cat and her laboratory cousins”. It discusses the detection and implications of macroscopic quantum mechanical entities, and includes a description of the so-called Schrödinger’s cat, which is essentially a thought experiment describing quantum superposition and the bizarre consequences of such a formulation. Leggett utilizes this conceptual picture of the cat (in alive and dead states) and extends it to possible scenarios in which macroscopic quantum superposition can be detected and verified under laboratory conditions. The text below from his review captures the essence of the concept and connects it to experimental verification:
“It is probably true to say that most physicists who are even conscious of the existence of the Cat paradox are inclined to dismiss it somewhat impatiently as a typical philosophers’ problem which no practising scientist need worry about. The reason why such an attitude can be maintained is that close examination of the paradox has seemed to lead to the conclusion that, worrying or not at the metaphysical level, it has at any rate no observable consequences: that is, the experimental consequences of the above apparently bizarre description are quite undetectable. Since physicists, by virtue of their profession, tend to be impatient of questions to which they know a priori no experiment can conceivably be relevant, they have tended to shrug off the paradox and leave the philosophers to worry about it.
Over the last few years, thanks to rapid advances in cryogenics, noise control and microfabrication technologies, it has become clear that the above conclusion may be over-optimistic (or pessimistic, depending on one’s point of view!). To be sure, it is unlikely that in the foreseeable future we will be able to exhibit the experimental consequences of a cat being in a linear superposition of states corresponding to life and death. However, at a more modest level it is possible to ask whether it would be possible to exhibit any macroscopic object in a superposition of states which by a reasonable common-sense criterion could be called macroscopically different. The answer which seems to be emerging is that it is almost certainly possible to obtain circumstantial evidence of such a state of affairs, and not out of the question that one might be able to set up a more spectacular, direct demonstration. This is the subject of this article.“
Cut to 2025: this is also the subject of the Nobel Prize in Physics today.
This department is steeped in history, and this post is to give you a pictorial glimpse of some people who worked there.
Werner Heisenberg, aged 25, became a Professor at the University of Leipzig, Germany. It was an illustrious department then, with professors such as Peter Debye, Gustav Hertz (of Franck-Hertz experiment fame), Friedrich Hund and many others. Felix Bloch was a student of Heisenberg in Leipzig.
As the AIP archives describe, “Only 25 years old in October 1927, Heisenberg accepted appointment as professor of theoretical physics at the University of Leipzig, Germany. Friedrich Hund soon joined his former Göttingen colleague as Leipzig’s second professor of theoretical physics. Heisenberg headed the Institute for Theoretical Physics, which was a sub-section of the university’s Physics Institute, headed until 1936 by the experimentalist Peter Debye. Each of the three professors had his own students, assistants, postdocs, and laboratory technicians.”
Below are a few snapshots that I took while visiting the department. Special thanks to Diptabrata Paul (my former PhD student and currently a post-doc in Cichos’ group) for showing me around the department.