TeV Scale Tuesday

This is Day 2 of my outreach week challenge: why are we so bothered with energy scales, and why is now a crucial time in particle physics? Also, what even is a TeV?

If you’ve never seen it before, now is a good time to watch Charles and Ray Eames’ excellent documentary Powers of Ten.

The message is clear: the laws of physics apply to, and allow us to describe, length scales 40 orders of magnitude apart! The movie shows very different structures at almost every power of ten (from electrons and atoms to superclusters of galaxies), but there is (almost always) structure. This is perhaps an idea that makes sense naturally: there is a priori no reason why structure shouldn’t appear at every scale, and perhaps we’re in a privileged position (according to the documentary), sitting roughly at the logarithmic middle of all known scales. But it’s an idea that breaks down when physicists consider the other unit of measurement they’re interested in: energy.

So, what’s a TeV?

In the International System of Units, energy is measured in joules (J). Right. But how much energy really is 1 J? It turns out to be pretty reasonable: about the amount of energy required to lift a tomato up by one meter, or for a human to take a very slow step. You apparently also burn off about 60 J per second as body heat. So the joule is quite appropriate to the human scale.
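To check that claim, here is a back-of-the-envelope sketch in Python (the 100 g tomato is my own ballpark assumption):

```python
# Sanity check: the energy needed to lift a tomato by one meter, E = m*g*h.
m = 0.1   # mass of a tomato in kg (ballpark assumption)
g = 9.81  # gravitational acceleration in m/s^2
h = 1.0   # height in m

E = m * g * h
print(f"Lifting a tomato by 1 m takes about {E:.2f} J")  # ~0.98 J
```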

However, as soon as you start talking about the energy consumption of heavy machinery, fast cars and power plants, you have to add prefixes like “kilo”, “mega” or “giga” (for thousands, millions and billions respectively). That’s fair enough, because you’re still working across familiar scales: you can quickly answer questions like “how many light bulbs can I power with this generator?”. But if you decide to build a whole new area of physics, very far removed from the scale of the joule, perhaps it is time to come up with a new unit that is more representative of that physics (a bit like the calorie).

Enter the electronvolt:

By definition, it is the amount of energy gained (or lost) by the charge of a single electron moving across an electric potential difference of one volt: about $1.602 \times 10^{-19}$ J.

That is, the electrons moving around in the AA battery of your TV remote probably have kinetic energies of a few eV. It therefore seems like a good ballpark unit for talking about particle physics. Interestingly, by the famous formula $E=mc^2$, we can also define a new unit of mass, the $\text{eV}/c^2$, which turns out to be rather appropriate for describing particle masses. In numbers:

$$1\ \text{eV}/c^2 \approx 1.783 \times 10^{-36}\ \text{kg}, \qquad m_\text{proton} \approx 938\ \text{MeV}/c^2 \approx 1.673 \times 10^{-27}\ \text{kg},$$

so you can either keep using ridiculously low powers of ten in kilograms, or accept the electronvolt in your heart. While they’re at it, physicists usually pick a set of so-called “natural” units, where the speed of light is set to 1. This has the useful consequence that mass and energy are now both expressed in eV. By a slight abuse of language, then, a physicist might say that the mass of a proton is 938 MeV, or that the Large Hadron Collider collides them at a combined energy of 13 TeV.

And there we have it: the teraelectronvolt, or TeV, $10^{12}$ eV.
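To make those conversions concrete, here is a minimal sketch, assuming the standard values of the elementary charge and the speed of light:

```python
# Converting between joules, electronvolts and kilograms.
e = 1.602176634e-19  # elementary charge in C, so 1 eV = e joules
c = 2.99792458e8     # speed of light in m/s

eV = e                 # 1 eV in joules
eV_per_c2 = eV / c**2  # 1 eV/c^2 in kilograms

print(f"1 eV        = {eV:.3e} J")          # ~1.602e-19 J
print(f"1 eV/c^2    = {eV_per_c2:.3e} kg")  # ~1.783e-36 kg

# The proton, at 938 MeV/c^2, in ridiculously low powers of ten in kilograms:
m_proton = 938.272e6 * eV_per_c2
print(f"proton mass = {m_proton:.3e} kg")   # ~1.673e-27 kg

# And the unit of this post:
print(f"1 TeV       = {1e12 * eV:.3e} J")   # ~1.602e-7 J
```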

But why is it so special?

There are a few reasons for that…

Because a lot of physics happens near it

In the Standard Model of particle physics, all the particles have masses below the TeV scale. The two heaviest are the top quark, at about 0.173 TeV, and the Higgs boson, at 0.125 TeV. In a way, it’s the scale from which you start to play with all the cards available. Likewise, the number that governs electroweak interactions (the electroweak scale) is about 0.25 TeV. All these numbers (one electroweak scale and ten particle masses) also happen to be free parameters of the theory: they are not inherently predicted, and must be measured. They say something about the particular universe we live in; another universe with slightly different parameters would still be self-consistent, but would give rise to wildly different physics.
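For reference, a quick sketch tabulating a few of those measured numbers (rounded values; the list is illustrative, not exhaustive):

```python
# The heaviest Standard Model particles, in TeV (rounded measured values).
# None of these is predicted by the theory: they are all free parameters.
masses_TeV = {
    "top quark":    0.173,
    "Higgs boson":  0.125,
    "Z boson":      0.0912,
    "W boson":      0.0804,
    "bottom quark": 0.0042,
}
ew_scale_TeV = 0.246  # the electroweak scale (Higgs vacuum expectation value)

for name, m in masses_TeV.items():
    print(f"{name:13s} {m:7.4f} TeV  ({m / ew_scale_TeV:5.1%} of the EW scale)")
```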

Because that’s where the LHC is looking right now

As I mentioned earlier, the LHC is now running at a center-of-mass energy of 13 TeV. This is a bit like budgeting: you collide two protons as hard as you can, and get a large amount of energy together for a short while. How you spend it is up to you (well, up to the laws of quantum physics, really). By repeatedly applying the formula $E=mc^2$, you can “create” various massive (or not, up to you) particles, allocating some of that energy budget to mass, and the rest to forming bound states and giving all these new particles a good kick.
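As a toy illustration of that budgeting picture (the choice of producing a top-antitop pair plus a Higgs boson is mine, and says nothing about how likely that process actually is):

```python
# The "budgeting" picture: spend a 13 TeV collision on mass (via E = mc^2),
# and keep the rest as kinetic energy.
budget_TeV = 13.0
m_top, m_higgs = 0.173, 0.125  # rounded masses in TeV/c^2

# Producing a top-antitop pair plus a Higgs boson...
mass_bill = 2 * m_top + m_higgs
print(f"spent on mass:  {mass_bill:.3f} TeV")               # ~0.471 TeV
# ...leaves the rest for bound states and the "good kick":
print(f"left for kicks: {budget_TeV - mass_bill:.3f} TeV")  # ~12.529 TeV

# (In reality, only a fraction of the 13 TeV is carried by the quarks and
# gluons that actually collide inside each proton, so the usable budget
# per collision is smaller.)
```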

The idea is that running at this particular energy allows all the known physics of the Standard Model to happen, albeit at different rates. Then, by recording more and more data over the years, even the rare processes accumulate high enough statistics that we can say something about them. The higher in energy you go, the better; but even now, we should be able to tell a lot about particle physics - and we have! All the elementary particles have been observed (the most recent being the Higgs boson, in July 2012), predicted rare bound states (possible combinations of elementary constituents) are regularly being ticked off the list, and we’re now engaged in an era of precision measurements. The name speaks for itself, but it’s really about nailing down those uncertainties that could make a particular number go either way in the description of a physical process, like an undecided voter at a general election…

Because that’s where we expect new physics

On top of all the “standard” physics we know and love, we would very much like to find new, possibly unforeseen particles (or even forces, why not…). After all, we know that the Standard Model is, at best, a good approximation valid at the energy scales we’ve been working at. Mathematically, it doesn’t answer all the questions, and so theorists have come up with a plethora of new hypotheses and models to fill in the gaps. That’s why you might have heard of mini black holes, extra dimensions, ghost particles, heavy neutrinos, dark matter, etc. Perhaps the most “theoretically successful” (read: mathematically pleasing, consistent, useful and wide-reaching) is the idea of supersymmetry, which predicts no less than a doubling of the already observed particle content…

For supersymmetry to come to the rescue of the Standard Model, it needs to happen at a scale which is, you might have guessed it, the TeV scale. Too far above it, and the theory becomes unnatural, bringing more problems than it solves. The subject of my PhD thesis is precisely to look for it at the LHC, with the ATLAS experiment; so far, we haven’t found anything, and we are pushing natural “SUSY” into its last trenches…

Of course, it’s not the only theory to go beyond the Standard Model, and a variety of new particles could be lurking out there, just waiting to be discovered. Or, as some have argued, it could be more subtle: rather than jumping out at us in the form of unseen particles, new physics could reveal itself as anomalies in rare processes that are only now being studied, thanks to the amount of data collected at high energy. This is exactly the notion of “precision measurements” I mentioned above, and it will only get better with time.

Because there’s basically nothing else

This is a rather grim prospect, but it stems from our best understanding of the universe, through quantum physics and general relativity.

On the one hand, we have this TeV scale where all of the above happens, and that’s great. All the numbers are obtained experimentally, correspond to the particular configuration our universe decided to take, and are in this sense unique. It’s the cosmos-given scale.

On the other hand, from the other constants of nature we know, we can build our own scale, with dimensions of energy (or equivalently, mass):

$$M_\text{P} = \sqrt{\frac{\hbar c}{G}} \approx 2.18 \times 10^{-8}\ \text{kg} \approx 1.22 \times 10^{16}\ \text{TeV}/c^2.$$

This Planck mass is built out of the reduced Planck constant $\hbar$, the speed of light $c$ and the Newtonian gravitational constant $G$. All these numbers also had to be experimentally measured, and correspond to the uniqueness of our universe. $M_\text{P}$ itself is only a derived quantity; still, it can be understood as the energy at which our understanding of gravity breaks down, because spacetime itself starts exhibiting quantum properties. Alternatively, it is also the energy ceiling above which the Standard Model is powerless, because it cannot accommodate gravitational effects. This combined problem is the subject of much effort and research, and goes by the name of quantum gravity.
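A minimal sketch of that construction, assuming the standard values of the three constants:

```python
import math

# Building the Planck mass out of hbar, c and G, and measuring how far it
# sits above the TeV scale.
hbar = 1.054571817e-34  # reduced Planck constant in J*s
c    = 2.99792458e8     # speed of light in m/s
G    = 6.67430e-11      # Newtonian gravitational constant in m^3/(kg*s^2)
e    = 1.602176634e-19  # elementary charge in C, for the eV conversion

M_P = math.sqrt(hbar * c / G)    # Planck mass in kg
E_P_TeV = M_P * c**2 / e / 1e12  # equivalent energy in TeV

print(f"M_P = {M_P:.3e} kg = {E_P_TeV:.3e} TeV")  # ~2.18e-8 kg, ~1.22e16 TeV
print(f"the desert spans {math.log10(E_P_TeV):.1f} orders of magnitude")
```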

If you look at the value of $M_\text{P}$ and take the above argument at face value, it’s as if we had only two interesting energy scales, intrinsically connected to the nature of Nature, yet separated by sixteen orders of magnitude! That no new physics inhabits this “energy desert” is definitely a possibility, but not a very cheerful one: how would we then go on learning about physics, if everything else is so far out of reach?

Personally, I prefer to think that, although quantum gravity remains the Holy Grail, a number of solutions to the problems of the Standard Model lie along the way to $M_\text{P}$, in the form of new free parameters (i.e. new, experimentally grounded theories) we haven’t thought of yet…

Appendix: some perspective

Although there is much talk of “high energy physics”, and the Planck scale sounds like a tremendous amount of energy, do remember that these are still small numbers in human-scale units: the Planck mass is only about $2 \times 10^{-8}$ kg, and 1 TeV only a fraction of a joule. It’s also quite funny to realize that 1 TeV is about the energy of motion of a flying mosquito.
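A quick check of that comparison (the mosquito’s mass and speed are my own ballpark assumptions):

```python
# 1 TeV versus the kinetic energy of a mosquito in flight.
e = 1.602176634e-19   # 1 eV in joules

TeV_in_J = 1e12 * e   # ~1.6e-7 J

m = 2.5e-6  # mosquito mass in kg (~2.5 mg, assumed)
v = 0.4     # flight speed in m/s (assumed)
KE = 0.5 * m * v**2   # ~2.0e-7 J

print(f"1 TeV    = {TeV_in_J:.2e} J")
print(f"mosquito = {KE:.2e} J")
```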

No, really, it’s all about squeezing that energy down into incredibly small volumes (like that of an elementary particle). Then we can talk about “high energy (density) physics”…
