He chose a much more controversial topic, namely preons, for his fresh article in Scientific American:
The Inner Life of Quarks

Preons are hypothetical particles smaller than leptons and quarks out of which leptons and quarks would be built. But can there be such particles?
At first sight, the proposal seems natural and may be described by the word "compositeness". Atoms were not indivisible, as the Greek word suggested, but they had smaller building blocks – the nucleus and the electrons. The nuclei weren't indivisible, either – they had protons and neutrons inside. And the protons and neutrons weren't indivisible – they turned out to have quarks inside.
Why shouldn't this process continue? Why shouldn't there be smaller particles inside quarks? Or inside the electron and other leptons?
Many people who pose this question believe that it is a rhetorical question and they don't expect any answer. Instead, they overwhelm you with detailed speculations about the possible composition of quarks and leptons, and they rely on lots of wishful thinking when they assume that all the problems they encounter are just details that can be overcome.
(Pati and Salam introduced preons for the first time in 1974. Another early concrete realization of preons was the "rishon" model by Harari, Shupe, and a young Seiberg; "rishon" means "primary" in Hebrew. I guess that prazdrojs and urquells would be the Czech counterparts. The terminology describing preons has been much more diverse than the actual number of promising ideas coming from this research. The names for "almost the same thing" have included prequarks, subquarks, maons, alphons, quinks, rishons, tweedles, helons, haplons, Y-particles, and primons.)
However, the question above is a very good, serious question and it actually has a pretty good answer that explains why the process should not continue indefinitely.
Mass scales and length scales
Since the mid-1920s and the realizations due to Louis de Broglie, Werner Heisenberg, and a few others, we've known about a fundamental relationship between the momentum of a particle and the wavelength of the wave that is secretly associated with it:\[
\lambda = \frac{2\pi \hbar}{p}.
\] You may use units of mature particle physicists in which \(\hbar=1\). In those units, you may omit all factors of \(\hbar\) because they're equal to one and the momentum has the same dimension as the inverse length. Note that adult physicists also tend to set \(c=1\) because the speed of light is such a natural "conversion factor" between distances and times that has been appreciated since Einstein's discovery of special relativity in 1905.
In those \(\hbar=c=1\) units, energy and momentum (and the mass) have the same units, and space and time have the same units, too. The first group is inverse to the second group. Particle physicists love to use \(1\GeV\) for the energy (and therefore also momentum and mass); the inverse \(1\GeV^{-1}\) is therefore a unit for distances and times. One gigaelectronvolt is approximately the rest mass of the proton, slightly larger than the kinetic and potential energies of the quarks inside the proton; the inverse gigaelectronvolt interpreted as a distance is relatively close to the radius of the proton.
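To make these conversions concrete, here is a tiny numerical sketch of my own (not anything from the article) that uses the standard value \(\hbar c \approx 0.1973\GeV\cdot{\rm fm}\) to translate an inverse gigaelectronvolt into femtometers and to evaluate the de Broglie wavelength of a particle with \(p=1\GeV\); the function names are purely illustrative.

```python
# A minimal numerical sketch of the natural-unit conversions above.
# hbar*c ~ 0.19733 GeV*fm is a standard constant; the rest is illustrative.
import math

HBAR_C_GEV_FM = 0.19733  # hbar * c in GeV * femtometers

def inverse_gev_to_fm(energy_gev):
    """Length in fm corresponding to 1/E in hbar = c = 1 units."""
    return HBAR_C_GEV_FM / energy_gev

def de_broglie_wavelength_fm(momentum_gev):
    """lambda = 2*pi*hbar/p, expressed in femtometers."""
    return 2.0 * math.pi * HBAR_C_GEV_FM / momentum_gev

print(inverse_gev_to_fm(1.0))         # ~0.20 fm, within a factor of a few of the proton radius
print(de_broglie_wavelength_fm(1.0))  # ~1.24 fm for p = 1 GeV
```

The first number is indeed within a factor of a few of the proton's radius of roughly 0.84 fm, which is what the paragraph above meant by "relatively close".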
At any rate, the de Broglie relationship above says that the greater momentum a particle has, the shorter the wave associated with it is. Similarly, the periodicity of the wave obeys\[
\Delta t = \frac{2\pi\hbar}{E}
\] where \(E\) is the energy. The phase of the wave returns to the original value after a period of time that is inversely proportional to the energy. Now, it is sort of up to you whether \(E\) is the total energy that contains the latent energies \(E=mc^2\) or whether these terms are removed. If you want a fully relativistic description and you're ready to create and annihilate particles, you obviously need to include all the terms such as \(E=mc^2\).
On the other hand, if you study a non-relativistic system, it may be OK to remove \(E=mc^2\) from the total energy and consider \(mv^2/2\) to be the leading kinetic contribution to the energy. That's how we're doing it in non-relativistic quantum mechanics. These two conventions differ by a time-dependent reparameterization of the phase of the wave function (which isn't observable),\[
\psi_\text{relativistic}(\vec x,t) = \psi_\text{non-relativistic}(\vec x,t) \cdot \exp(-i\cdot Mc^2\cdot t/ \hbar)
\] where \(M\) is the total rest mass of all the particles. The relativistic wave function's phase is just rotating around much more quickly than the non-relativistic one.
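Just to illustrate how fast that extra phase rotates, here is a small sketch of the arithmetic for a single electron at rest (my own illustration with rounded constants): the angular frequency \(Mc^2/\hbar\) comes out near \(10^{21}\) radians per second.

```python
# Sketch: angular frequency of the extra factor exp(-i*M*c^2*t/hbar)
# for a single electron at rest. Rounded constants, purely illustrative.
import math

ELECTRON_REST_ENERGY_J = 8.187e-14   # m_e * c^2 in joules (~0.511 MeV)
HBAR_J_S = 1.0546e-34                # reduced Planck constant in J*s

omega = ELECTRON_REST_ENERGY_J / HBAR_J_S   # ~7.8e20 rad/s
period = 2.0 * math.pi / omega              # ~8.1e-21 s per full phase rotation

print(omega, period)
```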
Preons don't explain any patterns
Fine. Let's return to compositeness and preons. When you conjecture that leptons and quarks have a substructure, you want this idea to lead to exciting consequences. For example, you want to explain why there are many types (flavors) of leptons and quarks out of a more economical basic list of preonic building blocks. It's not a necessary condition for preons to exist but it would be nice and sort of needed for the idea to be attractive.
This goal doesn't really work with preons. Note that it did work with quarks; that's how Gell-Mann discovered or invented quarks. There was a whole zoo of hadrons – particles related to the proton and neutron, including these two – and the idea that all of them were composed of quarks was actually able to explain that zoo out of a more economical list of quark types.
Gell-Mann's success can't really be repeated with the preons. The list of known leptons and quarks is far from "minimal" but it is not sufficiently complicated, either. Quarks have three colors under \(SU(3)_c\). And both leptons and quarks are typically \(SU(2)_W\) doublets. And both leptons and quarks come in three generations.
These are three ways in which there seems to be a "pattern" in the list of types of quarks and leptons; three directions in which the lists of quarks and leptons seem to be "extended". But none of them may be nicely explained by preons. First, you can't really explain why there are \(SU(2)_W\) doublets or \(SU(3)_c\) triplets. Whatever elementary particles you choose, they must ultimately carry some nonzero \(SU(2)_W\) and \(SU(3)_c\) charges – and the charges of the doublets and triplets are really the minimal ones (the simplest representations) so whatever the preons are, they can't really be simpler than quarks or leptons.
(Here I am assuming that the gauge bosons and gauge fields aren't "composite". The possibility of their compositeness is related to preons and the discussion why it's problematic would be similar to this one but it would differ in some important details. The conclusion is that composite gauge bosons are even more problematic than preons.)
Also, you won't be able to produce three families out of a "simpler list of preons". To produce exactly three families, you need something that comes in three flavors, i.e. a particle of "pure flavor" that has three subtypes and that binds to other particles to make them first- or second- or third-generation quarks or leptons. But there must still be other particles that carry the weak and strong charges so the result just can't be simpler.
The comments above were really way too optimistic. The actual problems with the "diversity of the bound states" that you get out of preons are much worse. Much like there are hundreds of hadron species, you typically predict hundreds of bound states of preons. Moreover, they should allow multiple arrangements of the preons' spins, they should be ready to be excited, and they should produce much more structured bound states. None of these things is observed and the predicted structure just doesn't seem to have anything to do with the observed, rather simple, list of quark and lepton species.
But there exists a problem with preons that is even more serious: their mass.
If it makes any sense to talk about them as new particles, they must have some intrinsic rest mass, much like quarks and leptons. What can the mass be? We may divide the possibilities into two groups. The masses may either be smaller than \(1\TeV\) or greater than \(1\TeV\). I chose this energy because it's slightly smaller than the energy of the LHC beams and it is already "pretty nicely accessible" by the LHC collider. Maybe I should have said \(100\GeV\) but let's not be too picky.
If the new hypothetical preons are lighter than \(1\TeV\), then the new hypothetical particles are so light that the LHC collider must be producing them rather routinely. If that were so, they would add extra bumps and resonances and corrections and dilution to various charts coming from the LHC. Those graphs would be incompatible with the Standard Model that assumes that there are no preons, of course. But it's not happening. The Standard Model works even though it shouldn't work if the preons were real and light.
So we're left with the other possibility, namely that preons are heavier than \(1\TeV\) or \(100\GeV\) or whatever energy similar to the cutting-edge energies probed by the LHC these days. But that's even worse because the very purpose of preons is to explain quarks and leptons as bound states of preons – and the known quarks and leptons are much lighter than \(1\TeV\).
To get a \(100\MeV\) strange quark, to pick a random "mediocre mass" example, the rest mass of the preon(s) inside the quark, several \(\TeV\), would have to be almost precisely cancelled by other contributions to the mass and energy, with an accuracy better than 1 in 10,000. Clearly, the extra terms can't be kinetic energy, which is positive definite: the compensating terms would have to be types of negative (binding) potential energy.
But it's extremely unlikely for the energy to be canceled this accurately, especially if you expect that the cancellation holds for many different bound states of preons (because many quarks and leptons are light).
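Here is a back-of-the-envelope sketch of the required precision – the \(3\TeV\) preon scale below is just a number I picked for illustration, not anything taken from a specific model:

```python
# How precisely would heavy-preon contributions have to cancel to leave
# a ~100 MeV strange quark? The 3 TeV scale is an arbitrary illustration.
preon_scale_mev = 3.0e6          # assumed preon contribution, 3 TeV in MeV
strange_quark_mass_mev = 100.0

allowed_mismatch = strange_quark_mass_mev / preon_scale_mev
print("cancellation to about 1 part in", round(1.0 / allowed_mismatch))  # ~30,000
```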
Note that the virial theorem tells us that in non-relativistic physics, it's normal that the kinetic energy and the potential energy are of the same order. For example, for the harmonic oscillator with the \(kx^2/2\) potential energy, the average kinetic energy and the average potential energy are the same. For the Kepler/Coulomb problem, \(V\sim - k/r\), and the kinetic energy is \((-1/2)\) times the (negative) potential energy. More generally,\[
2\langle E_{\rm kin}\rangle = -\sum_{m=1}^N \langle \vec F_m\cdot \vec r_m\rangle
\] and if the potential goes like \(V\sim k r^n\), then \[
\langle E_{\rm kin} \rangle =\frac{n}{2} \langle V\rangle.
\] If you need the potential energy to cancel the kinetic one, you have to assume \(n=-2\). But the attractive potentials \(-1/r^2\) are extremely unnatural in 3+1 dimensions where \(-1/r\) is the only natural solution to the Poisson-like equations you typically derive from quantum field theories. You won't be able to derive them from any meaningful theory. Moreover, relativistic corrections will destroy the agreement even if you managed to reach one. I was assuming that the motion of preons may be represented by non-relativistic physics – because the preons are pretty heavy, and at relativistic speeds they would be superheavy. If you assume that they're heavy and relativistic (near the speed of light), you will face an even tougher task to compensate their relativistically enhanced kinetic energy.
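One quick way to see the \(\langle E_{\rm kin}\rangle = (n/2)\langle V\rangle\) rule – and the special role of \(n=-2\) – is to check it for classical circular orbits in a power-law potential \(V=Cr^n\), where the force balance \(mv^2/r = dV/dr\) immediately gives \(E_{\rm kin} = (n/2)V\). The following is only a sketch of that arithmetic with arbitrary parameters of my choosing:

```python
# Check E_kin = (n/2)*V for classical circular orbits in V(r) = C * r**n,
# using the force balance m*v^2/r = dV/dr = n*C*r**(n-1). Arbitrary parameters.
def circular_orbit_energies(n, C, r=2.0, m=1.0):
    V = C * r**n
    v_squared = n * V / m        # from m*v^2 = r * dV/dr = n*V
    E_kin = 0.5 * m * v_squared
    return E_kin, V

for n, C in ((2, 1.0), (-1, -1.0), (-2, -1.0)):   # oscillator, Coulomb-like, -1/r^2
    E_kin, V = circular_orbit_energies(n, C)
    print(n, E_kin, 0.5 * n * V, "total:", E_kin + V)   # E_kin equals (n/2)*V in each case
```

Note that the total \(E_{\rm kin}+V\) vanishes identically only in the \(n=-2\) case, which is exactly the unnatural potential discussed above.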
Even if you fine-tuned some parameters to get a cancellation, it will probably not work for other preon bound states. The degree of fine-tuning needed to obtain many light bound states is probably amazingly high. And we're just imposing a few conditions – the existence of light bound states that may be called "leptons and quarks". We should also impose all other known conditions – e.g. the non-existence of all the other bound states that the preon model could predict and the right interactions of the bound states with each other and with other particles – and if we do so, we find out that our problems are worse than just a huge amount of fine-tuning. We simply won't find any working model at all even if we're eager to insert arbitrarily fine-tuned parameters.
If you think about the arguments above, you are essentially learning that you shouldn't even attempt to explain light elementary particles – those that are lighter than the energy frontier, i.e. the energy scale that is being probed by the current collider – as composites. It can never really work. Quarks and leptons are much lighter than the LHC beam energy and because no sign of compositeness (involving new point-like particles) has been found, it really means that there can't be any.
Compositeness has done everything for us
So while the idea of compositeness is responsible for many advances in the history of physics, nothing guarantees that such "easy steps" may be repeated indefinitely. In fact, it seems likely that there won't be another step of this sort, although some bold proposals exist that the top quark etc. could still be composite, and they are marginally compatible with the known facts.
After all, wouldn't you find it painful if the progress in physics were reduced to repeating the same step "our particles are composed of even smaller ones" that you would repeatedly and increasingly more mechanically apply to the current list of particles? The creativity in physics would be evaporating.
There exists a sense in which quarks and leptons are composite and the counter-arguments above are circumvented. In string theory, a lepton or a quark is a string. That means that you may interpret each such elementary particle as a bound state of "many string bits", pearls or beads along the string. If the number of conjectured "smaller building blocks" becomes infinite, like it is in the case of the stringy shape of an elementary particle, the cancellation between the kinetic and potential energy may become totally natural.
Despite the inner structure of elementary particles, string theory has an explanation why there are massless (or approximately massless, in various approximations) particles in the stringy spectrum. To some extent, this masslessness is guaranteed by having the "critical spacetime dimension" \(D=10\) or \(D=26\) for the superstring and bosonic string case, respectively. Well, string theory circumvents another problem we mentioned, too. We said that the kinetic energy is positive and the sum of all such positive terms must be positive, too. However, string theory uses the important fact that the sum of all positive integers equals \(-1/12\) which provides us with a very natural opportunity to cancel infinitely many terms although all of them seem to be positive.
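The \(1+2+3+\ldots = -1/12\) statement is shorthand for zeta-function regularization. As a numerical sketch (my own illustration; it assumes the mpmath package is available for the zeta function), the value \(-1/12\) shows up both as \(\zeta(-1)\) and as the finite remainder of the exponentially regulated sum \(\sum_n n\,e^{-n\varepsilon} - 1/\varepsilon^2\):

```python
# Two standard ways to see "1 + 2 + 3 + ... = -1/12" via regularization.
# Assumes the mpmath package is installed for the zeta function.
import math
from mpmath import zeta

print(zeta(-1))                      # -0.0833333... = -1/12

# Exponential (heat-kernel) cutoff: sum_{n>=1} n*exp(-n*eps) = 1/eps**2 - 1/12 + O(eps**2)
for eps in (0.1, 0.01, 0.001):
    s = sum(n * math.exp(-n * eps) for n in range(1, int(60.0 / eps)))
    print(eps, s - 1.0 / eps**2)     # the remainder approaches -1/12 as eps -> 0
```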
Comparing preons and superpartners
The LHC hasn't found traces of any new particles beyond those postulated by the Standard Model of particle physics yet. However, that doesn't mean that all proposals for new physics are in the same trouble. In particular, I think it's important to explicitly compare preons with the superpartners predicted by supersymmetry.
At some point in the discussion above, I mentioned that preons could be either lighter or heavier than \(1\TeV\). The case of "light new particles" is generally excluded by the LHC (and previous experiments) because we would have already produced these new particles if they existed and if they were light.
The case of preons heavier than \(1\TeV\) was problematic because their "already high mass" must have been accurately cancelled by some negative contributions to the total energy/mass of the bound states and the negative potential energy required to do so seemed impossible, fine-tuned, and generally hopeless.
But the case of superpartners heavier than \(1\TeV\) doesn't have any problems of this sort. No supersymmetry phenomenologist really has any "rock solid" argument of this sort that would imply that the gluino is lighter than \(1\TeV\) or heavier than \(1\TeV\). We just don't know, these new particles may be discovered at any moment, and even at several \(\TeV\) or so, they would still immensely improve the situation with the fine-tuning of the Higgs mass etc.
So while preons are pretty much completely dead – because you just can't construct light particles out of heavy ones, if I oversimplify just a tiny bit – superpartners remain immensely viable and well-motivated. The superpartners may still be rather light – the lower bounds on their masses are often significantly lower than the lower bounds on other particles' masses in models of new physics – but there's nothing wrong with their being much heavier, either.
As in many other cases, it's important not to become a dogmatic advocate of some ideas you decide to "love" in the first five minutes of your research. You could fall in love with preons. Except that if you impartially study them in much more detail, you find out that this paradigm doesn't really agree well with the known features of the world of particles, and some clever enough arguments may actually exclude rather vast and almost universal classes of such models. You should never become an advocate of a theory who is blind to arguments of a certain type, e.g. the negative ones that unmask a general disease of your pet theory.
Preons are pretty much hopeless while other models of new physics remain extremely well motivated and promising.
And that's the memo.
P.S.: There will be a Hadron Collider Physics HCP 2012 conference in Kyoto in two weeks; see some of the ATLAS talks under HCP-2012. The detectors should update some of their results from 5 to 12+ inverse femtobarns of the 2012 data, which means from 10 to 17 inverse femtobarns of total data. That's just a 30% improvement in the accuracy. Expect much more in March 2013 in Moriond.
Also, Czechia celebrates the main national holiday today, the anniversary of the 1918 birth of Czechoslovakia.