Monday 25 October 2010

The die has been RECAST

RECAST is an idea for making more efficient use of the experimental data collected by particle physics experiments. A paper outlining the proposal appeared on arXiv 2 weeks ago. In order to explain what RECAST is and why it is a good thing, I need to make a small detour.

In the best of all worlds, all experimental data acquired by humanity would be stored in a convenient format and could be freely accessed by everyone. Believe it or not, the field of astrophysics is not so far from this utopia. The policy of the biggest sponsors in that field - NASA and ESA - is to require that more-or-less all data (sometimes in a pre-processed form) be posted some time, typically 1-2 years, after the experiment starts. This policy is followed by such cutting-edge experiments as WMAP, Fermi, or, in the near future, Planck. And it is not a futile gesture: quite a few people from outside these collaborations have made good use of the publicly available data, and more than once maverick researchers have made important contributions to physics.

Although the above open-access approach appears successful, it is not being extended to other areas of fundamental research. There is a general consensus that in particle physics an open-access approach could not work because:
  • bla bla bla,
  • tra ta ta tra ta ta,
  • chirp chirp,
  • no way.
Consequently, data acquired by particle physics collaborations are classified and never become available outside the collaboration. However, our past experience suggests that some policy shift might be in order. Take for example the case of the LEP experiments. Back in the 90s the bulk of experimental analyses was narrowly focused on a limited set of models, and it is often difficult or impossible to deduce how those analyses constrain more general models. One disturbing consequence is that to this day we don't know for sure whether the Higgs boson was beyond LEP's reach, or whether it was missed because it has unexpected properties. After LEP's shutdown, new theoretical developments suggested possible Higgs signatures that were never analyzed by the LEP collaborations. But now, after 10 years, accessing the old LEP data requires extensive archeological excavations that few are willing to undertake, and in consequence a trove of valuable information is rotting in the CERN basements. The situation does not appear much better at the Tevatron, where the full potential of the collected data has not been explored, and may never be, either because of theoretical prejudice or simply for lack of manpower within the collaborations.

Now, what will happen at the LHC? It may well be that new physics will hit us straight in the face, leaving no doubt about what the underlying model is and which signals we should analyze. But it may not... It would therefore be wise to organize the data so that they can be easily accessed and tested against multiple theoretical interpretations. Since open access is not realistic at the moment, another idea is welcome.

Enter RECAST, a semi-automated framework for recycling existing analyses to test for alternative signals. The idea goes as follows. Imagine that a collaboration performs a search for some fancy new physics model. In practice, what is searched for is a set of final-state particles, say, a pair of muons, jets with unbalanced transverse energy, etc. The same final state may arise in a large class of models, many of which the experimenters would not have thought of, or which might not even exist at the time the analysis is done. The idea of RECAST is to provide an interface via which theorists or other experimentalists could submit a new signal (simply at the partonic level, in the common Les Houches format). RECAST would run the new signal through the analysis chain, including hadronization, detector simulation, and exactly the same kinematic cuts as in the original analysis. Typically, most of the experimental effort goes into simulating the standard model background, which has already been done for the original analysis. Thus, simulating the new signal and producing limits on the production cross section of the new model would be comparatively cheap. At the same time, the impact of the original analysis would be tremendously expanded.
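
To make the workflow concrete, here is a minimal toy sketch in Python of the counting-experiment logic behind such a reinterpretation. It is purely illustrative: the paper does not define an interface in code, and the event variables, cuts, and numbers below are all invented for the example.

    import math
    import random

    def signal_efficiency(events, cuts):
        """Fraction of simulated signal events passing all analysis cuts."""
        passed = sum(1 for ev in events if all(cut(ev) for cut in cuts))
        return passed / len(events)

    def poisson(n, mu):
        """Poisson probability of observing n events given mean mu."""
        return math.exp(-mu) * mu ** n / math.factorial(n)

    def upper_limit_events(n_obs, n_bkg, cl=0.95, s_max=50.0, steps=5000):
        """Bayesian upper limit (flat prior) on the signal yield s in a
        counting experiment with n_obs observed events and n_bkg expected
        background, both inherited from the original analysis."""
        ds = s_max / steps
        grid = [(i + 0.5) * ds for i in range(steps)]
        post = [poisson(n_obs, s + n_bkg) for s in grid]
        norm = sum(post)
        acc = 0.0
        for s, p in zip(grid, post):
            acc += p / norm
            if acc >= cl:
                return s
        return s_max

    # Invented stand-in for a new signal sample: dimuon invariant mass and
    # missing transverse energy per event (in GeV).
    random.seed(1)
    signal = [{"m_mumu": random.gauss(90.0, 15.0),
               "met": random.expovariate(1.0 / 40.0)} for _ in range(10000)]

    # The cuts are frozen by the original analysis and simply reapplied.
    cuts = [lambda ev: ev["m_mumu"] > 60.0,
            lambda ev: ev["met"] > 30.0]

    eff = signal_efficiency(signal, cuts)
    s_up = upper_limit_events(n_obs=4, n_bkg=3.2)  # numbers from the old search
    lumi = 35.0                                    # integrated luminosity, 1/pb
    print(f"efficiency = {eff:.2f}, "
          f"sigma < {s_up / (lumi * eff):.2f} pb at 95% CL")

The expensive ingredients (background simulation, detector response, the choice of cuts) never change; only the signal efficiency has to be recomputed for each newly submitted model, which is what makes the recycling cheap.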

There is some hope that RECAST may click with experimentalists. First of all, it does not put much additional burden on the collaborations. For a given analysis, it requires only a one-time effort of interfacing it to RECAST (and one could imagine that at some point this step could be automated too). The return on this extra work would be higher exposure of the analysis, which means more citations, which means more fame, more job offers, more money, more women... At the same time, RECAST ensures that no infidel hands ever touch the raw data. Finally, RECAST is not designed as a discovery tool, so the collaborations would keep their monopoly on that most profitable part of the business. All in all, lots of profit for a small price. Will it be enough to overcome the inertia? For the moment the only analysis available in the RECAST format is the search for the Higgs decaying into 4 tau leptons performed recently by the ALEPH collaboration. For the program to take off, more analyses have to be incorporated. That depends on you...

Come visit the RECAST web page and tell the authors what you think about their proposal. See also another report, more in a this-will-never-happen vein.

Monday 18 October 2010

Maybe all that exists is the standard model...or even less

Throughout the previous decade Gia Dvali argued that there are $10^{32}$ copies of the standard model out there. Now he has made a U-turn and says that there is only one. Or even less. Let me explain.

The reason why we are pretty sure that we are going to observe new phenomena at the LHC goes under the nickname unitarity of WW scattering. What hides behind this is, technically speaking, that the tree-level scattering amplitude of longitudinally polarized W bosons, computed in the standard model without the Higgs particle, grows as the square of the scattering energy, and at some point around 1 TeV it becomes inconsistent with unitarity, that is, with conservation of probability. In the full standard model this problem is cured: the contribution from Higgs exchange cancels the dangerously growing terms and the full amplitude is well behaved at arbitrarily high energies. A slightly different mechanism is realized in technicolor theories, where the consistent UV behavior of the amplitude is ensured by the exchange of spin-1 resonances.
In spite of 40 years of intensive research, we are aware of only these two ways of unitarizing the WW amplitude. Thus the LHC should see either the Higgs or new spin-1 resonances. Time will tell which of the two possibilities is realized in nature.
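
To see where the 1 TeV figure comes from, here is the textbook back-of-the-envelope estimate (schematic; O(1) factors vary between conventions). In the Higgsless standard model the low-energy theorem gives the longitudinal amplitude

$\mathcal{A}(W_L^+ W_L^- \to Z_L Z_L) \simeq \frac{s}{v^2}, \qquad v \simeq 246\,\mathrm{GeV}.$

The s-wave projection is $a_0 = s/(16 \pi v^2)$, and the unitarity condition $|\mathrm{Re}\, a_0| \leq 1/2$ then requires

$\sqrt{s} \lesssim \sqrt{8\pi}\, v \approx 1.2\,\mathrm{TeV},$

by which point the Higgs, spin-1 resonances, or something else entirely must intervene.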

A paper last week by Dvali and co. suggests that there may be a third possibility. The authors conjecture that the standard model without a Higgs and without any other embellishments could be a fully consistent theory, even though it appears to conflict with unitarity. They argue that the uncontrolled growth of the WW scattering amplitude is just an artifact of the perturbative approximation, while at the non-perturbative level the theory could be completely sane. The idea is that, as the scattering energy increases above a TeV, the theory defends itself by producing "large" classical configurations during the scattering process. The higher the energy, the larger and more classical these objects become, and they decay preferentially into many-body (rather than 2-body) final states. This way 2-to-2 WW scattering remains unitary at energies above a TeV. The authors, somewhat dully, call this mechanism classicalization. To put it differently, as we increase the scattering energy, at some point we stop probing physics at short distance scales; these small distances are screened from external observers, similar in spirit to black holes screening short-distance physics in transplanckian scattering when gravity is in the game.
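
Schematically (this is my paraphrase of the paper's scaling argument; the exponent is model-dependent and the factors should not be trusted): a scattering process at center-of-mass energy $\sqrt{s}$ far above the scale $M_*$ where perturbation theory fails is claimed to produce a classical configuration of radius

$r_*(s) \sim \frac{1}{M_*} \left( \frac{\sqrt{s}}{M_*} \right)^{\gamma}, \qquad \gamma > 0,$

so that the geometric cross section $\sigma \sim \pi r_*^2$ grows with energy and distances shorter than $r_*$ are never resolved. Gravity is the prototype: there $r_*$ is the Schwarzschild radius, $r_s \sim \sqrt{s}/M_P^2$, which screens transplanckian scattering in just this way.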

If this is the case, what would it mean in practice, that is, in experiment? Much as in technicolor, at TeV energies the LHC should observe resonances in WW scattering which ensure the unitarity of the perturbative amplitude in the low-energy effective theory. However, as the scattering energy increases, the resonances become more and more classical and decay spectacularly into many-particle final states. There are no new fundamental degrees of freedom at high energies, no new fundamental forces to discover, just the standard model and its non-perturbative classical dynamics.

Now, can this be true? The paper is rather cryptic and provides few technical details. In this sense it feels like another emergent gravity. What it does demonstrate is that, in a class of theories that includes the standard model, there exist classical solutions whose large-distance behavior depends only on how much energy sources them, and whose size grows in a universal way with that energy. The rest seems to be just words, and there is a long way to go before proving that classicalization can indeed lead to a fully consistent quantum theory. Nevertheless, given the scarcity of ideas concerning electroweak symmetry breaking, there is definitely some philosophical potential in the paper. We'll see whether it leads to something more concrete...

Update: See also Lubos' stance.

Saturday 16 October 2010

Back in Town

More than 2 months have passed since my last post. Sorry for the perturbations related to changing continents. I'm about to resume blogging, after a few changes and adaptations due to my new environment:
  • One is the cute new banner you must have seen already.
  • Furthermore, the name of this blog has been changed from Resonaances to Résonaances.
  • Seriously ;-)
  • All you fellow bloggers, you should update the name in your blog roll, otherwise you risk being sued by La Commission de la Protection de la Langue Française.
  • Mhm, I actually found out that La Commission does not exist anymore, but one never knows...
  • And all you readers, mind that the pronunciation has changed :-)
  • The subsequent posts will have an abstract in French.
  • Joking, of course. French is perfect for flirting, but not as much for talking science.
  • Consequently, the author's name remains Jester; it has *NOT* been changed to Le Bouffon ;-)
The last two months, while I was away, have been quiet anyway. Dark matter was discovered, again. The Higgs was rumored to have been seen at the Tevatron, again. Some unexplained events have been seen at the LHC. Just business as usual. Rumors of the latter kind should grow exponentially with each picobarn acquired by the LHC; just in case, you know where to ask ;-)