Dr Giovanni Lamanna, the Director of the CNRS-IN2P3 Laboratory of Particle Physics in Annecy, LAPP, discusses the laboratory’s research programme through a journey into particle and astroparticle physics and the scientific and technological challenges inherent therein.
The French National Centre for Scientific Research (CNRS) is an interdisciplinary public research organisation. IN2P3 is the CNRS thematic Institute carrying out the national mission of co-ordination in the fields of nuclear, particle and astroparticle physics, including the associated technological developments and applications.
LAPP, the Laboratoire d’Annecy de Physique des Particules, one of IN2P3’s major experimental research laboratories, is also associated with the local university (Université Savoie Mont-Blanc, USMB). More than 40 years ago, the CNRS established this laboratory in the French Alps, close to the Swiss Geneva region, aiming to provide scientists with a national ‘gateway’ to the CERN particle accelerator experiments. Today at LAPP, more than 150 physicists, engineers, technicians and administrative personnel study the ultimate constituents of matter and their fundamental interactions, and explore their connections with the large-scale structures of the Universe.
It is within international collaborations, gathering thousands of researchers, that LAPP pursues its experimental programme. LAPP has an international reputation and a recognised engagement in the next-generation facilities prioritised by the European Strategy Forum on Research Infrastructures (ESFRI) and in other world-class projects. The research conducted at LAPP has contributed to major discoveries, several of which have been recognised with Nobel Prizes in Physics, most recently the Higgs boson in 2013 and the discovery of gravitational waves in 2017.
Particle and astroparticle physics
Accelerator-based particle physics and astroparticle physics are two research paths seeking to answer some of the most fundamental questions, either on their own or through complementary approaches.
At LAPP, accelerator-based particle physics research is conducted mainly at the CERN Large Hadron Collider (LHC). Scientists explore the nature of fundamental forces that have shaped our Universe since the beginning of time and which will determine its future; they also search for new particles which could, for example, explain dark matter, and probe the asymmetry between matter and antimatter.
Astroparticle physics research is conducted through large instruments dedicated to the observation of the cosmos. It exploits cosmic ‘messengers’, including particles, electromagnetic radiation and gravitational waves to infer insights on how the Universe functions and on more fundamental physics topics such as:
- The detection of dark-matter particles;
- Tests of general relativity;
- The study of the matter-antimatter asymmetry; and
- The understanding of dark energy, responsible for the accelerating expansion of the Universe.
The international high-energy particle physics (HEP) community is now entering a two-year period of intensive discussions that will lead to the definition of a new European Strategy, setting priorities to guide the direction of the field to the mid-2020s and beyond.
Different paths will be proposed, among which are electron-positron linear colliders for the detailed study of the Higgs sector and a future circular collider (FCC) about four times larger than the current LHC, which would enable us to explore even higher energies than the LHC’s current 13 TeV regime. However, the High-Luminosity LHC (HL-LHC) upgrade, which will increase the number of collisions in the detectors from 40 to 140 each time the two counter-circulating proton beams cross each other, will be the major project in this domain for the next 10-20 years.
New devices for astroparticle physics
Astroparticle physics research is entering a precision and high sensitivity epoch in which new devices will exploit all confirmed high-energy cosmic messengers such as gamma rays, neutrinos, cosmic rays and gravitational waves, as has emerged from the recently published European consortium APPEC roadmap for the period 2017-2026.
The new astroparticle physics research projects, aimed at building large and complex infrastructures, have now reached the same size as the HEP projects, mobilising hundreds to thousands of scientists. Amongst the most challenging scientific infrastructures to date, the Cherenkov Telescope Array (CTA) is composed of more than 100 third generation high-energy gamma-ray atmospheric Cherenkov telescopes and is the first astronomical observatory for the exploration of the non-thermal Universe at extreme energies.
The first discovery of gravitational waves (GW), emitted by binary systems of multi-solar-mass black holes, and last year’s LIGO-Virgo detection of GW emitted by the coalescence of two neutron stars (the event called GW170817), which was followed up by several observatories at other wavelengths, have marked the start of the era of GW astronomy. They have also triggered interest in real-time multi-messenger observation campaigns and boosted the technological developments aimed at increasing the sensitivity of GW interferometers such as LIGO and Virgo. The astroparticle field today also embraces cosmology through large galaxy-survey campaigns, combining spectroscopic, photometric and weak-lensing techniques with upcoming new facilities such as the LSST astronomical telescope.
“This is a revolution!” says Dr Giovanni Lamanna. “This new precision era is bringing particle physics, astroparticle physics, modern astronomy and cosmology closer to one another. Together, they are providing scientists with a global multi-probe approach to understanding the laws of the Universe and its evolution.”
All this is also having an important sociological impact on the evolving profile of new-generation researchers, as well as on the way the diverse communities share experiences, provide open access to their data, learn from each other, and coherently face the preparation and operation of next-generation Big Science projects.
Big Science research is all about building and operating large and complex research infrastructures, most of which are conceived to address fundamental science questions; they are the fruit of large worldwide efforts.
Lamanna says: “The many Big Science projects we commit to demand periods of R&D, prototyping, and continuous iterative and co-operative processes before converging on a final technical design and going on to construction and operation. Big Science projects are the result of a long-term process from inception to construction, and of the deep scientific and technological engagement of many colleagues who all together experience a real human adventure, where mutual emulation, co-operation and excellence are the key words.”
Fundamental research conducted through Big Science projects is aimed at increasing our basic knowledge. At the same time, it gives rise to new technological developments in different applied domains, from electronics to mechanics and computing, and, as has often happened in the past, it could change the world in which we live.
Challenging performance and specifications
The need, and the consequent capacity, to trigger new developments arises every time physicists commit to upgrading their research infrastructures.
LAPP scientists involved in the ATLAS project, one of the experiments based at the LHC at CERN, have contributed to measuring the properties of the Higgs boson, will continue to investigate them in the future, and will search for new physics phenomena through rare processes. These topics are among the main motivations for the HL-LHC. The tenfold boost of the LHC’s luminosity foreseen in the HL-LHC project, with the consequent increase in the data to be collected and therefore a higher probability of seeing rare processes, requires critical upgrades to the ATLAS detectors.
In such a context, a large part of the electronics of the electromagnetic calorimeter, which was fundamental in the Higgs discovery, will have to be replaced, because the current data selection process (the trigger) will clearly be insufficient at the HL-LHC in terms of granularity, latency and bandwidth. The new electronics is challenging in terms of acquisition speed, close to the terabit-per-second level, and of programmable data-processing capacity, enabling greater on-the-fly flexibility in choosing the collisions that are interesting for physics.
The construction of a very high-performance tracker detector, able to reconstruct the primary vertex of the protons’ interactions and achieve sufficient resolution on the momentum measurements of high-energy tracks, is critical. LAPP physicists have proposed an innovative silicon-sensor detector concept with inclined modules, baptised the ‘alpine design’ in tribute to the nearby Alps.
ATLAS has retained the alpine tracker, based on the use of inclined 2.5 × 10⁻³ mm² silicon pixel modules instead of a classic planar design. It responds to the optimisation of four main tracking objectives:
- Reducing the total silicon area for the same angular coverage;
- Reducing the extrapolation distance between measurement points;
- Increasing the number of measuring points per layer; and
- Reducing the quantity of material crossed, thanks to a sensor orientation closer to normal incidence for the particles.
The construction of this new detector implies intense development of advanced technical solutions in diverse domains, such as the optimisation of a dual-phase CO2-based compact cooling system, thermo-mechanical engineering, the design of bent flex connections for electrical services, data flow management, and so on.
Designing new concepts
Large international consortia of scientists and engineers work together, both during the design definition and later in the construction phase. The achievement of the final technical design of a new experiment inspires new pathways for conceiving high-performance detectors and innovative concepts. Often, these stimulate competition among partners supporting alternative ideas, engaging prototyping stages and comparative reviews. Finally, the big challenges are shared and the international collaborations work together to support the retained solutions.
In the USA, at the Fermi National Accelerator Laboratory in Batavia, Illinois, the world’s most intense accelerator-based neutrino beam is being prepared. A huge cryogenic particle detector, installed underground 1,300km downstream of the neutrino source, is being developed by the international Deep Underground Neutrino Experiment (DUNE) consortium. Thanks to this experiment, scientists will explore the phenomenon of neutrino and antineutrino oscillation, with the aim of better understanding why the Universe is made of matter rather than antimatter. Thanks to its large geometric acceptance, the DUNE detector could also observe thousands of cosmic neutrinos coming from a core-collapse supernova in the Milky Way, and we may also learn more about black hole genesis.
DUNE’s detectors record the tracks of the particles that emerge from the rare collisions of neutrinos with the atoms of the target material, namely liquid argon. LAPP physicists are involved in the study of a dual-phase ProtoDUNE prototype, featuring signal amplifiers that operate in a layer of gaseous argon above the volume of liquid argon kept at -185°C. A huge 216m³ cubic detector containing liquid argon is bathed in a very strong electric field. Ionisation electrons left by the passage of a charged particle in the liquid drift upward for up to 6m before being detected on a plane of anodes (the charge readout planes – CRP). The CRP amplification electronics sit above the liquid argon, suspended in a gaseous argon environment.
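As a back-of-the-envelope illustration of the dual-phase geometry, one can estimate how long an ionisation electron takes to cross the full drift length. In this sketch the 6m drift length comes from the description above, while the drift velocity is an assumed, typical value for electrons in liquid argon under a strong field (roughly 1.6mm per microsecond); it is not an official ProtoDUNE parameter.

```python
# Rough drift-time estimate for the dual-phase liquid-argon design.
# Assumption: a typical electron drift velocity in liquid argon of
# ~1.6 mm/us (illustrative only, not an official ProtoDUNE figure).
DRIFT_LENGTH_M = 6.0                      # maximum drift length, from the article
DRIFT_VELOCITY_M_PER_S = 1.6e-3 / 1e-6    # 1.6 mm per microsecond, in m/s

drift_time_s = DRIFT_LENGTH_M / DRIFT_VELOCITY_M_PER_S
print(f"maximum drift time ~ {drift_time_s * 1e3:.2f} ms")  # ~3.75 ms
```

Even this rough estimate shows the signal electrons spend milliseconds drifting through the liquid, which sets the timescale the readout electronics and the high-voltage system must accommodate.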
The major technical challenges currently being tackled are the optimisation of the read-out electronics of the photomultipliers, the design and construction of the mechanical structure that will hold the 36m² readout plane above the liquid argon bath with great precision, and the automation system that will keep the CRP at a constant distance from the liquid surface with sub-millimetre precision. These important contributions are the result of challenging ideas and innovative solutions, supported by thermo-mechanical calculations and cryogenic tests to secure the proposed design, and finally validated by physics performance evaluations.
In the Big Science context, close collaboration with industry in the very first stage of design as well as during the first technological developments is critical. The European Commission supports ESFRI projects’ co-operative technical works with industries during their preparatory phases. This is, for instance, what CTA successfully experienced.
CTA is an array of telescopes of different sizes, deployed at two sites: on the island of La Palma (Canary Islands) in Spain, and in the Atacama Desert in Chile. CTA is a global effort with more than 1,400 scientists and engineers. The CTA Large Sized Telescope (LST) is a fundamental instrument for observations of transient cosmic phenomena at extreme energies. The LSTs expand the CTA science reach to cosmological distances and fainter sources. Both the re-positioning speed and the low energy threshold provided by the LSTs are critical for CTA to respond promptly to alerts from other instruments, enabling the observation of rare astrophysical transient sources in our own galaxy as well as the study of active galactic nuclei and gamma-ray bursts at high redshift.
Recently, the first prototype, LST-1, was deployed in La Palma. It has a 23-metre diameter parabolic reflective surface, supported by tubular structures. The 400m² reflective surface collects and focuses the Cherenkov light into the camera, where photomultipliers convert the light into electrical signals that are processed by dedicated electronics. Although the LST-1 stands 45 metres tall and weighs around 100 tonnes, it is extremely nimble, with the ability to re-position within 20 seconds in order to capture brief gamma-ray signals.
LAPP plays a key role in CTA, taking responsibility for the design and delivery of various critical elements of the LST. Chief among these are the high mechanical structures (the Camera Support Structure – CSS) made of reinforced carbon fibre, which include the parabolic ‘arch’ and the mechanical camera housing.
LAPP researchers were able to validate their concept through intense numerical calculation studies and prototyping, shared with leading SMEs in the field of designing and manufacturing large monolithic structures made of composite carbon-fibre-based material. Prepreg technology was chosen for the resulting CSS: by using a high proportion of fibres with suitable orientation, it makes it possible to achieve the required global mechanical properties of the structure in relation to the localised loading direction. This was key for a CSS that is light enough to reposition the camera at a speed of 5m/s while maintaining the high mechanical stability needed to guarantee a very good angular resolution for the telescope.
Big Science projects are often early adopters of the latest technological developments; they provide stimulating contexts for multidisciplinary R&D activities and they have also often been pushing the state-of-the-art in many different fields.
The European GW interferometer, Virgo, is a LAPP landmark project. The Annecy scientists’ commitment goes back 28 years and was finally rewarded by the recent detection of the GW170814 and GW170817 events. This ground-breaking discovery opens up the prospect of observing many such events in the next few years and, through a multi-messenger approach, will contribute to revolutionising our understanding of the expansion of the Universe, the mechanisms of the most violent astrophysical phenomena, and the laws of gravitation.
For such major objectives, Virgo is moving forward through successive upgrading phases, known as the ‘advanced’ programme, which will extend its sensitivity to GW sources at distances of more than 60Mpc very soon, and up to 260Mpc by 2025. Virgo performs its observations in triangulation with the two LIGO antennas in the USA.
Virgo is a Michelson laser interferometer with two orthogonal arms, each 3km long. A beam splitter divides the incident laser beam into two equal components sent into the two arms of the interferometer. These beams are recombined out of phase on a detector so that, in principle, no light reaches the detector except when a GW crosses the interferometer plane. Instruments like Virgo and LIGO are designed to measure variations of the interferometer mirror positions of the order of 10⁻¹⁸m.
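The dark-fringe operation described above can be sketched numerically. In an idealised Michelson model, the fraction of laser power reaching the output port depends only on the differential arm-length change. The 3km arm length is from the article; the laser wavelength (1.064µm, typical of the Nd:YAG lasers used in GW detectors) and the illustrative strain amplitude are assumptions for the sake of the sketch.

```python
import math

ARM_LENGTH_M = 3_000.0     # Virgo arm length, from the article
WAVELENGTH_M = 1.064e-6    # assumed laser wavelength (typical Nd:YAG), illustrative

def output_fraction(delta_l: float) -> float:
    """Fraction of input laser power reaching the output port of an idealised
    Michelson interferometer for a differential arm-length change delta_l.
    At delta_l = 0 the two beams recombine destructively: the port stays dark."""
    delta_phi = 4.0 * math.pi * delta_l / WAVELENGTH_M  # round-trip phase difference
    return math.sin(delta_phi / 2.0) ** 2

# A quarter-wavelength differential displacement gives a fully bright fringe...
print(output_fraction(WAVELENGTH_M / 4.0))   # 1.0 (bright fringe)

# ...whereas a GW of illustrative strain h stretches one arm and squeezes the
# other by roughly h * ARM_LENGTH_M, i.e. displacements of order 1e-18 m,
# producing an almost immeasurably small change in the output light.
h = 1e-21
print(output_fraction(h * ARM_LENGTH_M))
```

The contrast between the two printed values illustrates why reaching the 10⁻¹⁸m scale requires the extreme noise-suppression techniques discussed below, far beyond what a bare Michelson layout provides.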
LAPP has a leading role in Virgo: it provides many components and new solutions, and operates diverse sub-systems (vacuum enclosures, optical benches, electronics, the data acquisition system, etc.).
In order to increase Virgo’s sensitivity, some major technological challenges are under study. They target improving the sensitivity of the instrument below the quantum limit; the sensitivity of such detectors is indeed limited by quantum vacuum fluctuations which add to the laser beam. The squeezing project, for instance, aims at acting on the laser light in order to reduce this quantum noise. The envisaged solution consists of injecting vacuum states characterised by smaller fluctuations in the phase used to measure the GW signal (at the cost of larger ones in the other phase). This squeezed vacuum is generated using a nonlinear crystal and a 100m-scale low-loss optical cavity. Encouraging results have already been obtained with this squeezing technique.
The calibration of the detectors also demands stringent requirements. One challenging project is equipping the Virgo antenna with a Newtonian calibrator: a device enabling the variation of the Newtonian gravitational field thanks to fast rotating masses. The displacement induced by such a device on the interferometer mirrors would be used as an absolute reference for calibration.
Virgo and LIGO run day and night, listening for signals that may arrive at any time from any part of the Universe. Starting from the next observation campaign in 2019, the data coming from the interferometers will be made available to the scientific community.
All the next-generation facilities to which LAPP is committed are Big Data projects, since they will generate huge volumes of data up to the exabyte scale. Over the last decades, particle and astroparticle physics have been early adopters and major developers of the latest ICT and data management solutions. They have been able to take great advantage of the huge investments, and the resulting advances, in high-speed data transport, digital processing, and high-performance computing and storage capacity. This has led to the establishment of important competences in the laboratories and the emergence of new profiles of e-technologists, data scientists, data analysts and software experts.
The emerging multi-messenger fundamental research paradigm pushes towards an open cloud of scientific data and requires closer co-operation and shared challenges among the different communities.
Lamanna says: “We are playing a key role internationally in supporting links among different ESFRI facilities facing similar Big Data challenges, performing co-developments, organising transversal training initiatives, contributing to human capital development, and facilitating a flow of expertise between scientists and e-infrastructure experts.”
These actions, supported by the European Commission’s Horizon 2020 programme, are also strengthening the MUST data centre operated by LAPP at a local level, which is evolving into a real ‘digital platform’ for excellence in science, open to society, training and industry.
The LAPP Director is the co-ordinator of the H2020 ESCAPE cluster project, aimed at committing the particle physics and astrophysics communities to setting up the European Open Science Cloud (EOSC). EOSC is the cloud for research data in Europe, federating existing resources across data centres, e-infrastructures and research infrastructures, and allowing researchers and citizens to access and re-use data produced by other scientists.
“There is a need to build participative bridges between Big Science projects and society, and we are engaging with that,” Lamanna concludes.