One of the main topics of my PhD, the “stop0L search” with the ATLAS detector, is a good example of how physicists have been looking for, but haven’t found, new physics – and how we move on from there.

# A very brief introduction to SUSY

Supersymmetry, or “SUSY” for short, is a theory of particle physics that goes Beyond the Standard Model (BSM); like most BSM theories, it tries to solve a problem with our current model (the Standard Model, SM) and ends up predicting new particles that we can look for – a falsifiable prediction. So what problem do SUSY theorists set out to solve? Well, people will most often mention “the hierarchy problem” or “Dark Matter candidates”, but in my thesis I took a slightly different approach, which I’ll try to summarise briefly here.

Modern theories of particle physics use a rather “low-level” set of mathematical tools, called symmetries (specifically, Lie groups and Lie algebras): these simplified descriptions effectively encode the whole particle content of the theory as well as its interactions. In the SM, this is how we describe fermions (particles of matter) and bosons (force carriers). In fact, these are so called “gauge” or internal symmetries, as opposed to external ones, like symmetries of space-time itself.

Talking about symmetries of space-time might seem like an odd concept, but it is rather intuitive: we want physics to be independent of various space and time translations and rotations, i.e. repeating an experiment at a different time and place, ceteris paribus, should not affect its outcome. Around 1915–1918, German mathematician Emmy Noether showed, in an eponymous theorem, that these symmetries of space-time in fact lead to more familiar concepts, such as conservation of energy and momentum. Furthermore, the same concept applies to gauge symmetries too, leading to conservation of quantum currents and charges.

The simple problem SUSY solves is therefore the following: how can we extend the symmetries of space-time without breaking known physics, and especially without violating the Coleman–Mandula no-go theorem? This is quite a technical issue, but its resolution is rather beautiful: we need to make space-time anti-commute. This is in fact akin to introducing new components to space-time that behave in a fermion-like fashion, where regular “old” space-time was purely bosonic. As it turns out, this new fermion–boson symmetry is naturally communicated to the gauge symmetries of the theory, and this is really the power of supersymmetry: simply by making the description of space-time more general and elegant, we must invoke new particles to complete the picture. Unlike in many BSM theories, these come about completely naturally: as soon as you put a SM fermion on your SUSY space-time, you need a new partner boson (called a sfermion); and SM bosons need new fermion partners (bosinos).
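The phrase “making space-time anti-commute” can be made concrete. Schematically, supersymmetry adds fermionic generators $Q$ (the “supercharges”) to the usual Poincaré generators, and it is their *anticommutator* that closes on the momentum operator $P_\mu$:

$$\{Q_\alpha, \bar{Q}_{\dot{\beta}}\} = 2\,\sigma^{\mu}_{\alpha\dot{\beta}}\,P_\mu$$

In words: performing two supersymmetry transformations in a row amounts to an ordinary space-time translation, which is how the extension evades the no-go theorem. And since acting with $Q$ turns a boson into a fermion and vice versa, this is exactly where the partner particles come from.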

This effectively doubles the particle content of the SM:

## SUSY partners at the LHC

Out of all these new particles, which may well be very massive and out of the energy range of the LHC, two are crucially important: the supersymmetric partner of the top quark (the stop) and the neutralino. The stop is a sfermion, in this case the bosonic scalar partner of the top quark. A number of theoretical arguments motivate the case for an inverted mass hierarchy in SUSY: while the first generation of SM fermions (electrons, up and down quarks) is much lighter than the third (tau leptons, top and bottom quarks), the situation is reversed for their partner sfermions. This makes the stop (and the sbottom) the lightest sfermions, usually within reach of the LHC (I relate those arguments in my thesis to a solution of the “hierarchy problem” mentioned earlier).

The neutralino is equally important. A mixture of the neutral bosinos (hence the name), it is a massive (although usually the lightest in the SUSY spectrum) neutral particle; furthermore, it is stable, since it cannot decay to (more massive) SUSY particles and interacts only very weakly, if at all, with SM particles. Under this description (more technically known as “R-parity conserving”), the neutralino is a prime candidate for Dark Matter. As it turns out, it is possible to imagine proton collisions at the LHC where both these key players appear:

Above you can see a mechanism for pair production of stops, each decaying into its SM partner (the top quark) and a neutralino. The top quark then further decays, as usual, into a $W^\pm$ boson and a bottom quark, all reconstructed in the detector as jets (collimated streams of hadrons).

# Searches at ATLAS

I contributed to the results presented in the following two papers:

• ATLAS, Search for a scalar partner of the top quark in the jets plus missing transverse momentum final state at ${\sqrt{s}=13\,\mathrm{TeV}}$ with the ATLAS detector, JHEP 12 (2017) 085

• ATLAS, Search for a scalar partner of the top quark in the all-hadronic $t\bar{t}$ plus missing transverse momentum final state at $\sqrt{s}=13\,\mathrm{TeV}$ with the ATLAS detector, ATLAS-CONF-2020-004

with the latter being an update of the former, using nearly 4 times as much data. Neither found any signs of SUSY.

You can see in the following two plots what a SUSY signal might look like (pink dashed line), when we compare it to data (black dots) and our best estimates of the SM processes that produce the same final state signature of many jets + missing energy (from the neutralinos escaping undetected):

Since we don’t know a priori what the masses of the stop and the neutralino are, we generate a grid of signal points to cover as many different mass scenarios as possible. With these simulated signals, we then optimise our analysis to produce signal regions, like the two shown above, that maximise our sensitivity to the production of these SUSY particles – note how few background events remain, after we’ve applied such a harsh selection to our entire dataset of billions of collisions!
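For intuition, the figure of merit behind such optimisations can be as simple as an expected-significance estimate. Here is a minimal sketch with made-up yields, using the naive $s/\sqrt{b+\sigma_b^2}$ approximation (real analyses use full likelihood-based formulas, but the trade-off it captures is the same):

```python
import math

def expected_significance(s, b, sigma_b=0.0):
    """Naive expected significance Z ~= s / sqrt(b + sigma_b^2).

    A common back-of-the-envelope figure of merit for cut optimisation;
    real analyses use full likelihood-based formulas instead.
    """
    return s / math.sqrt(b + sigma_b ** 2)

# Made-up yields: a loose selection keeps a lot of background...
print(expected_significance(s=40.0, b=1000.0))           # ~1.26 sigma
# ...while a harsher selection sacrifices signal but wins overall.
print(expected_significance(s=8.0, b=4.0, sigma_b=1.0))  # ~3.58 sigma
```

This is why the signal regions end up so sparsely populated: cutting away background buys more sensitivity than the signal it costs.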

If, instead of plotting kinematic distributions like the ones above, we make a histogram of all the data and background counts in each signal region, it looks like this:

It is quite clear that, as we say in the business, no significant data excess above the SM prediction is observed, i.e. no SUSY… There is a tiny excess in the “SRB-TT” region, but it is more consistent with a statistical fluctuation than with SUSY (especially given the deficit in the previous bin). But out of all the models we generated along our mass grid, which ones have we really excluded, and for which do we simply lack enough data (or data at high enough energies) to tell?
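To make “consistent with a fluctuation” quantitative, a counting excess can be turned into a p-value and a Gaussian-equivalent significance. A minimal sketch with entirely hypothetical yields (not the published SRB-TT numbers), ignoring the background uncertainty that a real profile-likelihood fit would include:

```python
from scipy.stats import norm, poisson

def excess_significance(n_obs, b_exp):
    """One-sided p-value for observing at least n_obs events given a
    background expectation b_exp, converted to a Gaussian significance Z.
    Ignores the uncertainty on b_exp, unlike a real profile-likelihood fit."""
    p = poisson.sf(n_obs - 1, b_exp)  # P(N >= n_obs | b_exp)
    return p, norm.isf(p)

# Hypothetical counts, purely for illustration:
p, z = excess_significance(n_obs=12, b_exp=7.0)
print(f"p = {p:.3f}, Z = {z:.2f} sigma")  # around 1.6 sigma: not exciting
```

Anything much below $3\,\sigma$ is routinely produced by chance when you look at many regions at once, which is why such small bumps don’t make headlines.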

By computing statistical significances for each model (a point along the two-dimensional mass grid) and interpolating, we can produce the following exclusion contour:

The entire region enclosed by the red line is excluded by this analysis; the previous exclusion (first paper linked above) is shown in grey, and you can see the improvements made, both statistically (more data) and systematically (better understanding of our detector and background processes). Stop masses are excluded up to about $1.2\,\mathrm{TeV}$ for neutralino masses below $400\,\mathrm{GeV}$. The less stable behaviour near the dashed diagonal lines corresponds to much more difficult cases, where the mass difference between the stop and the neutralino is not large enough to produce on-shell (“real”) top quarks, making them harder to identify and reconstruct.
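Schematically, going from a discrete grid of simulated points to a smooth contour is an interpolation problem. The sketch below is a toy version: the grid spacing, the significance values and the fall-off model are all made up, and a real analysis would interpolate CLs values from full likelihood fits with dedicated statistical tools rather than this linear stand-in:

```python
import numpy as np
from scipy.interpolate import griddata

# Toy signal grid: (m_stop, m_neutralino) in GeV, with a made-up
# "exclusion significance" Z at each simulated mass point.
points = np.array([(m1, m2)
                   for m1 in range(400, 1401, 200)
                   for m2 in range(0, 601, 200) if m2 < m1])
# Pretend sensitivity falls off with both masses (purely illustrative):
z_values = np.clip(6.0 - 4.0 * (points[:, 0] - 400) / 1000
                       - 2.0 * points[:, 1] / 600, 0.0, None)

def is_excluded(m_stop, m_neut, threshold=1.64):
    """Interpolate Z between the simulated points; 1.64 corresponds to a
    one-sided 95% confidence level. Points outside the grid return False."""
    z = griddata(points, z_values, (m_stop, m_neut), method="linear")
    return bool(np.isfinite(z) and z > threshold)

print(is_excluded(600, 100))    # light stop, light neutralino: excluded
print(is_excluded(1350, 500))   # heavy and compressed: not excluded
```

The exclusion contour is then just the level set of this interpolated surface at the 95% CL threshold.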

# What’s next?

The negative result presented above is an important milestone (the first search for stops using the full LHC Run 2 dataset) but clearly not the end of the story. Crucially, SUSY itself is not quite “dead” yet, because:

• this is only a partial exclusion: we need to extend the reach of the red line in the plot above;
• how far should we extend it? In theory, until we find SUSY, which sounds a bit silly – thankfully, there are arguments from cosmology that, for SUSY to be “useful” in solving problems of the Standard Model, it should appear in the TeV energy range we’re currently probing at the LHC (if it doesn’t, it becomes a much less attractive theory!);
• we made a number of assumptions in deriving this result (the stop is the lightest sfermion, the neutralino is the lightest SUSY particle, the stop decays to a top quark and a neutralino 100% of the time, etc.).

So what should we do? Here are a few ideas…

## Improve sensitivity

Before we get more statistics (Run 3 of the LHC doesn’t start before 2021), we can try to make our exclusions more stringent by designing signal regions that are more sensitive to the class of SUSY models we’re studying. For instance, one could add targeted top quark reconstruction algorithms to better isolate the $t\bar{t}$-like processes: in regions SRA and SRB (see the two plots above), the $Z+\text{jets}$ background still dominates. Combinations with other stop channels (considering semi- and di-leptonic decays of the $t\bar{t}$ system) will also improve the limits. Machine learning techniques might be able to provide additional discriminating variables; this is of course a highly active area of research right now.

## Reduce background uncertainty

With the impressive quantity of data collected in Run 2 of the LHC, we’ve clearly entered the era of systematic-dominated measurements, where the statistical uncertainty is much smaller than that related to detector effects and theoretical modelling. While reducing detector uncertainties is usually largely independent of the analysis team (the study of experimental systematics is done centrally in ATLAS), we now have more than enough data to conduct detailed and data-driven background studies, which would allow us to drop quite a few theoretical uncertainties on key background processes. The interplay of SUSY searches and differential SM measurements (such as the one presented here) should then become more apparent, with the latter providing state-of-the-art SM predictions and the former being able to access extreme regions of phase-space (such as large jet multiplicity or missing energy).
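One standard data-driven technique of the kind alluded to here is control-region extrapolation: normalise the simulation to data in a background-enriched control region, then carry that normalisation into the signal region via a simulated transfer factor. A minimal sketch with hypothetical yields:

```python
def transfer_factor_estimate(n_data_cr, n_mc_cr, n_mc_sr):
    """Data-driven background estimate: normalise the simulation to data in a
    background-enriched control region (CR), then extrapolate to the signal
    region (SR) using the simulated CR -> SR transfer factor. The overall
    normalisation uncertainty of the simulation cancels in the ratio."""
    transfer_factor = n_mc_sr / n_mc_cr
    return n_data_cr * transfer_factor

# Hypothetical yields for a Z+jets control region (illustrative only):
estimate = transfer_factor_estimate(n_data_cr=250.0, n_mc_cr=230.0, n_mc_sr=4.6)
print(round(estimate, 2))  # 5.0 expected Z+jets events in the signal region
```

Because only the *ratio* of simulated yields enters, theoretical uncertainties that affect the overall normalisation drop out, leaving mostly the (smaller) uncertainty on the CR-to-SR extrapolation.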

## Move away from model-driven searches

This is the realm of model-independent and general searches, such as those conducted by CMS and the ATLAS Exotics group. A variety of methods are available to go looking for excesses (direct hints of new particles) and deviations (indirect effects) in data, from “bump hunting” in invariant mass spectra to performing data analysis over thousands of signal regions. I’m particularly interested in the development of machine learning tools towards anomaly detection, which I tried to motivate in this blogpost.