Professor Lucio Mayer of the University of Zurich's Center for Theoretical Astrophysics and Cosmology explains how 2016 will remain a landmark in the history of the physical sciences and of the study of galaxies.
2016 was the year in which the first direct detection of gravitational waves was made, by a vast international consortium employing the Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO), a ground-based interferometer.
The Laser Interferometer Space Antenna (LISA) Pathfinder space probe, designed to test the drag-free technology necessary for LISA – the future space-borne gravitational wave detector planned by the European Space Agency (ESA) with international partners – has not only flown successfully after its launch at the end of 2015, but has delivered a performance greatly superior to expectations, with a measured noise level already of the order of what will be needed for LISA.
The ability to detect gravitational waves pushes our current technology to its limits in a number of areas, but it opens a window onto a completely new way of looking at the Universe.
Until now astronomy, astrophysics and cosmology have been based on some form of electromagnetic information, coming from any of the known emitting sources in our cosmos, from individual stars to entire galaxies. Astronomers have created communities specialised in the detection, analysis and interpretation of photons received from such sources in diverse regions of the electromagnetic spectrum, from visible optical frequencies to radio, and to the X-rays or gamma rays associated with the most violent phenomena of the Universe, such as quasars or supernova explosions. But, from the early astronomers of the ancient Egyptian and Sumerian civilisations to the modern astronomers using the Hubble Space Telescope, the Chandra X-ray Observatory or the Very Large Telescope, our Universe has always been studied by means of electromagnetic signals.
With gravitational waves we are truly facing a transformative step in the way we look at the sky. One of the many testable, and now tested, predictions of our theory of gravity, General Relativity, gravitational waves will become our new tool to unveil the nature of the Universe, probing for the first time the fabric of spacetime.
The ultimate fate of stars
What are the prime sources in this new era, the ones that replace, in importance, the stars of conventional astronomy? The answer is binaries of compact objects resulting from the ultimate fate of stars. Among these, binaries of massive black holes living at the centres of galaxies are the loudest sources, giving the strongest and most easily detectable signals once LISA is operational.
The black holes in these binaries can weigh from just shy of a million solar masses to more than ten billion solar masses. Our own Milky Way, for example, hosts a (single) black hole weighing a few million solar masses (Schödel et al. 2010). LISA will preferentially detect massive black hole mergers happening in the early stages of the Universe, ten or more billion years before our time, when galaxies were still young and often collided with each other, as the Universe was much denser than it is today.
As a scientist I like to think I should understand, as much as possible, the tools I need to carry out my research and go after the most challenging problems. Modern astronomy came about because we first elaborated a beautiful, coherent theory of stellar structure and evolution. The stars have been astronomy's prime tool.
Without that, most of what we now know would not have been possible. Cosmology itself started as a quantitative, verifiable science in the 1920s because it became possible to measure the distances of objects such as galaxies, and this too is done using stars. Now the question is: in our time, do we understand the nature of our new sources, massive black hole binaries, in the same way as we understood stars in the late 1800s? The answer is no. But this is an exciting time to bring the knowledge of such objects to a new level.
When galaxies collide
It all begins when two galaxies, each with its own massive black hole sitting at its centre, collide and then gradually merge into a single galaxy as their large halos of dark matter create an irresistible mutual gravitational pull.
Massive black hole binaries are then thought to evolve across an enormous range of spatial scales. This was already clear at the time of the first major theoretical work on massive black hole binaries (Begelman, Blandford and Rees 1980, Nature).
For typical massive black holes, weighing 10-100 million solar masses, the stage at which gravitational wave emission becomes the dominant mechanism to drain the orbital energy of the binary and bring it to coalescence is reached only when the two black holes come within a separation of about a milliparsec. But they start their journey tens of thousands of light years apart, when they are still in the nuclei of their merging host galaxies.
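To get a feeling for why the milliparsec scale is the relevant threshold, here is a minimal back-of-the-envelope sketch, not taken from the simulations described in this article, using the standard Peters (1964) formula for the gravitational wave coalescence time of a circular binary; the masses and separations used are purely illustrative.

```python
# Back-of-the-envelope estimate (illustrative only, not part of the simulations
# described in this article): gravitational wave coalescence time of a circular
# black hole binary, using the standard Peters (1964) formula
#   t_gw = 5 c^5 a^4 / (256 G^3 m1 m2 (m1 + m2)).

G = 6.674e-11        # gravitational constant [m^3 kg^-1 s^-2]
C = 2.998e8          # speed of light [m/s]
M_SUN = 1.989e30     # solar mass [kg]
PARSEC = 3.086e16    # parsec [m]
YEAR = 3.156e7       # year [s]

def coalescence_time_years(m1_msun, m2_msun, a_parsec):
    """Gravitational wave coalescence time (years) of a circular binary
    with masses m1, m2 (solar masses) at separation a (parsec)."""
    m1, m2 = m1_msun * M_SUN, m2_msun * M_SUN
    a = a_parsec * PARSEC
    t = 5.0 * C**5 * a**4 / (256.0 * G**3 * m1 * m2 * (m1 + m2))
    return t / YEAR

# Two black holes of 50 million solar masses each (illustrative values):
for a_pc in (1.0, 1e-2, 1e-3):
    print(f"a = {a_pc:g} pc  ->  t_gw ~ {coalescence_time_years(5e7, 5e7, a_pc):.1e} yr")
```

Because this timescale scales as the fourth power of the separation, it is vastly longer than the age of the Universe at parsec separations but drops to mere thousands of years at a milliparsec, which is why gravitational wave emission takes over only at that point.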
Ideas about the physical processes governing the evolution of the orbit of the pair of massive black holes have been around for a while, but modelling them correctly requires complex computer models that solve, at the very least, the set of coupled partial differential equations for gravity, pressure forces and radiation.
The early part of their journey, until they are well above the milliparsec scale, can be described by Newtonian equations, while the latter part needs general relativistic calculations solving Einstein's equations, or at least some approximation of the latter in the form of the so-called post-Newtonian expansion (Prieto et al. 2008). Calculations of this type require the use of supercomputers. Indeed, solving even the simplest of these models requires so many operations that it would take a thousand years on a conventional notebook or workstation.
A critical stage is when the two black holes become close enough to be mutually bound by gravity. At this point we can say that the binary has formed. Supercomputer calculations through the years have shown that in this phase the drag by the dense, cold interstellar gas in galactic nuclei is the dominant process (Mayer et al. 2007; Chapon, Mayer et al. 2013).
After the binary has formed, the jury is still out on what the main source of drag is, but it may well depend on the type of galaxy in which the binary is evolving. If there is plenty of cold gas drawn down to the heart of the nucleus, this gas can torque the binary as long as there are asymmetries in its distribution, a process similar to planet migration (Mayer 2013).
Alternatively, stars can 'rob' the binary of kinetic energy and angular momentum through their gravitational pull when they fly close to it, and bring it into the gravitational wave regime (Milosavljevic and Merritt 2001; Khan et al. 2012; Vasiliev and Merritt 2014).
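For the stellar channel, a rough feel for the numbers can be had from the standard 'hardening rate' measured in scattering experiments (Quinlan 1996). The sketch below is my own illustrative estimate, with assumed but plausible values for the nuclear stellar density, velocity dispersion and binary separation, and it takes the optimistic case in which the reservoir of stars able to interact with the binary is continually refilled.

```python
# Toy estimate (illustrative assumption, not a result from this article's simulations)
# of binary hardening by stellar encounters. Scattering experiments (Quinlan 1996)
# give d(1/a)/dt = H * G * rho / sigma with H ~ 15 for a hard binary, implying a
# shrinking timescale t_h = sigma / (H * G * rho * a). This holds only if the
# 'loss cone' of stars on binary-crossing orbits stays full.

G = 6.674e-11        # gravitational constant [m^3 kg^-1 s^-2]
M_SUN = 1.989e30     # solar mass [kg]
PARSEC = 3.086e16    # parsec [m]
YEAR = 3.156e7       # year [s]

def hardening_time_years(rho_msun_pc3, sigma_kms, a_parsec, H=15.0):
    """Timescale (years) for stellar encounters to shrink the binary separation a."""
    rho = rho_msun_pc3 * M_SUN / PARSEC**3   # stellar mass density [kg/m^3]
    sigma = sigma_kms * 1.0e3                # velocity dispersion [m/s]
    a = a_parsec * PARSEC                    # binary separation [m]
    return sigma / (H * G * rho * a) / YEAR

# Illustrative dense nucleus: 1e5 solar masses per cubic parsec, sigma = 200 km/s,
# binary separation 0.01 pc.
print(f"t_hardening ~ {hardening_time_years(1e5, 200.0, 0.01):.1e} yr")
```

Under these optimistic assumptions the binary shrinks in a few million years; if the loss cone empties, the process can stall, which is one reason the gas torques mentioned above may prove decisive.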
Traditionally, computer models that were able to describe the effect of encounters with stars were not able to model friction and torques by gas, nor was it possible to study the whole binary shrinking process from the galaxy merger stage to when relativistic effects begin.
Recently we have used some of the fastest supercomputers in the world, located in Switzerland, China and Germany, to carry out the first simulation that follows all the phases of the evolution of the binary, up to the point when gravitational wave radiation begins (Khan, Fiacconi, Mayer et al. 2016).
We started from a galaxy merger extracted from a state-of-the-art simulation of galaxy formation, called ARGO (Feldmann & Mayer 2015), which had previously been run on the Piz Daint supercomputer in Switzerland. The result is unexpected: the two black holes, which weigh more than 100 million solar masses, fuse into one, with a burst of gravitational waves, less than 10 million years after the galaxy collision. We also demonstrate that the emitted waves fall into the LISA band before the black holes coalesce.
The timescale of the process is almost 100 times shorter than usually assumed when making forecasts for how many black hole merger events LISA should detect. This is exciting news, and it is also well understood; it is simply a consequence of the fact that galaxies were about 100 times denser several billion years ago than they are today, and the key processes determining the shrinking of the binary all depend on density.
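As a rough order-of-magnitude check of that factor, and purely as my own back-of-the-envelope assumption rather than a result of the simulation: if the internal density of galactic nuclei broadly tracks the mean matter density of the Universe, which grows with redshift z as

$$\rho \propto (1+z)^{3}, \qquad \frac{\rho(z \approx 3.6)}{\rho(z = 0)} \approx (4.6)^{3} \approx 100,$$

then the epochs LISA will probe, at redshifts of three and above, are naturally associated with densities about 100 times higher than today.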
The power of machines
Now the challenge ahead of us is mostly computational. This simulation is the first of its kind, and required more than a year of nearly continuous computing, even while harnessing the power of such big machines. But there is a catch. Even the best simulation programmes we currently have cannot use even 10% of the total computing power of these supercomputers at once.
This inefficiency could become even more evident when the bigger and more powerful exascale supercomputers appear in a couple of years. Yet computer science offers us new techniques to improve the so-called 'scalability' of simulation codes, namely their ability to run in parallel on a large number of processing units, from traditional CPUs to Graphics Processing Units (GPUs).
If we can bring our codes close to 100% efficiency on the new supercomputers, we could run tens of simulations in the time it currently takes to run a single one. This will be the way to provide the necessary theoretical support to produce realistic forecasts for LISA, and to help with the interpretation of the data afterwards.
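To see why scalability is the crux, here is a toy strong-scaling illustration based on Amdahl's classic law; the parallel fractions are invented numbers for illustration, not measurements of any real simulation code.

```python
# Toy illustration of strong scaling (Amdahl's law): if a fraction p of the work
# can run in parallel, the speedup on N processing units is
#   speedup(N) = 1 / ((1 - p) + p / N).
# The values of p below are illustrative assumptions, not measurements of any
# real astrophysics code.

def amdahl_speedup(p, n_units):
    """Ideal strong-scaling speedup for parallel fraction p on n_units processors."""
    return 1.0 / ((1.0 - p) + p / n_units)

for p in (0.90, 0.99, 0.999):
    for n in (1_000, 100_000):
        s = amdahl_speedup(p, n)
        print(f"parallel fraction {p:.3f}, {n:>6} units -> "
              f"speedup {s:8.1f}, efficiency {100.0 * s / n:6.2f}%")
```

Even with 99.9% of the work parallelised, a machine with 100,000 processing units would run at about 1% efficiency in this idealised picture, which illustrates why pushing codes towards full scalability is worth so much effort.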
We can envision a supercomputer entirely dedicated to black hole merger simulations, including those focusing on the final phase of coalescence in full general relativity. This may seem ambitious but it may be the only way to go. The parameter space is huge and has to be explored with an ambitious simulation campaign. Supercomputers dedicated to very important tasks, such as weather forecasting, already exist.
A point in history
The endeavour of looking at galaxies and the Universe through the new window of gravitational waves might be a revolutionary step in Mankind's knowledge; it might mark history as the first astronomical observations of Galileo, Kepler and Copernicus did, five centuries before us. It definitely deserves an unprecedented effort in dedicating computational resources, and any other necessary resources, to it.