This snapshot compares the distribution of galaxies in a simulated universe used to train SimBIG (right) to the distribution of galaxies observed in the real universe (left). Credit: Bruno Régaldo-Saint Blancard/SimBIG collaboration
The Standard Model of the universe is based on just six numbers. Using a new approach based on artificial intelligence, Flatiron Institute researchers and their colleagues have extracted information hidden in the distribution of galaxies to estimate the values of five of these cosmological parameters with incredible precision.
The results represent a significant improvement over values obtained by previous methods. Using the same galaxy data as conventional techniques, the approach produced less than half the uncertainty for the parameter describing the clumpiness of matter in the universe. The AI-based estimates also closely match estimates of cosmological parameters derived from observations of other phenomena, such as the universe's oldest light.
The researchers present their method, Simulation-Based Inference of Galaxies (SimBIG), in a series of recent papers, including a new study published August 21 in Nature Astronomy.
According to Shirley Ho, a co-author of the study and group leader at the Center for Computational Astrophysics (CCA) at the Flatiron Institute in New York City, generating tighter constraints on the parameters while using the same data will be crucial to studying everything from the composition of dark matter to the nature of the dark energy that is driving the universe apart. That's especially true as new surveys of the cosmos come online in the coming years, she says.
“Each of these surveys costs hundreds of millions, if not billions, of dollars,” Ho says. “The whole reason for these surveys is that we want to better understand these cosmological parameters. So from a very practical perspective, each parameter is worth tens of millions of dollars. You want the best analysis possible to extract as much knowledge as possible from these surveys and push the boundaries of our understanding of the universe.”
The six cosmological parameters describe the amount of ordinary matter, dark matter, and dark energy in the universe, as well as the conditions following the Big Bang, such as how opaque the early universe was as it cooled and whether mass in the cosmos is scattered or in large clumps. The parameters “are essentially the ‘settings’ of the universe that determine how it works on the largest scale,” says Liam Parker, a co-author of the study and a research analyst at CCA.
One of the methods cosmologists use most widely to calculate these parameters is the study of how galaxies cluster in the universe. Previously, such analyses considered only the large-scale distribution of galaxies.
“We haven’t been able to get down to the small scales,” says ChangHoon Hahn, a research associate at Princeton University and lead author of the study. “For a few years, we’ve known there was additional information, but we didn’t have an efficient way to extract it.”
Hahn proposed a way to harness AI to extract this information on a small scale. His plan involved two phases. First, he and his colleagues would train an AI model to determine the values of cosmological parameters based on the appearance of simulated universes. Then, they would show their model real observations of the distribution of galaxies.
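The two-phase plan can be illustrated with a minimal simulation-based-inference sketch. This is not SimBIG's actual code or architecture (which uses neural density estimators); here a simple least-squares regression stands in for the trained network, and the "universes" and their summary features are made-up toy data:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Phase 1: train on simulated universes ---
# Toy setup: each simulated universe is reduced to a feature vector
# (standing in for clustering statistics), generated here from
# randomly drawn "cosmological parameters".
n_sims, n_params, n_features = 2000, 5, 10
true_params = rng.uniform(0.1, 1.0, size=(n_sims, n_params))
mixing = rng.normal(size=(n_params, n_features))   # hypothetical forward model
features = true_params @ mixing + 0.01 * rng.normal(size=(n_sims, n_features))

# Stand-in for the neural network: fit a linear map from observed
# features back to the parameters that produced them.
coeffs, *_ = np.linalg.lstsq(features, true_params, rcond=None)

# --- Phase 2: apply the trained model to "observed" data ---
obs_params = rng.uniform(0.1, 1.0, size=n_params)  # unknown in reality
obs_features = obs_params @ mixing                 # what a survey would measure
estimate = obs_features @ coeffs

print(np.round(estimate, 2))  # recovered parameter estimates
```

The key idea mirrors the article: the model never sees the real universe during training, only simulations labeled with the parameter values that generated them.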
Hahn, Ho, Parker, and their colleagues trained their model by showing it 2,000 box-shaped universes from the CCA's Quijote simulation suite, each universe created with different values of the cosmological parameters. The researchers also processed the 2,000 universes to look like the data generated by galaxy surveys, including imperfections introduced by the atmosphere and the telescopes themselves, to give the model realistic training data.
“That’s a lot of simulations, but it’s a manageable number,” Hahn says. “Without machine learning, it would take hundreds of thousands.”
By working through the simulations, the model gradually learned how the values of the cosmological parameters correlate with small-scale differences in galaxy clustering, such as the distance between individual pairs of galaxies. SimBIG also learned to extract information about the overall arrangement of galaxies in the universe by considering three or more galaxies at a time and analyzing the shapes formed between them, such as long, stretched triangles or squat equilateral triangles.
An infographic presenting the methodology of the SimBIG (Simulation-Based Inference of Galaxies) project. Credit: Lucy Reading-Ikkanda/Simons Foundation
Once the model was trained, the researchers presented it with 109,636 real galaxies measured by the Baryon Oscillation Spectroscopic Survey. As they hoped, the model exploited both small- and large-scale details in the data to improve the accuracy of its estimates of cosmological parameters. These estimates were so accurate that they were equivalent to a traditional analysis using about four times as many galaxies.
That’s important, Ho explains, because there are only a limited number of galaxies in the universe. By achieving greater precision with less data, SimBIG can push the boundaries of what’s possible.
One exciting application of this precision, Hahn says, will be to the cosmological crisis known as the Hubble tension. The tension arises from conflicting estimates of the Hubble constant, which describes how quickly everything in the universe is moving apart.
To calculate the Hubble constant, one must estimate the size of the universe using “cosmic rulers.” Estimates based on the distances of exploding stars, called supernovae, in distant galaxies are about 10 percent larger than those based on the spacing of fluctuations in the universe’s earliest light.
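The size of that discrepancy is simple to quantify. The values below are illustrative round numbers close to commonly reported late-universe (supernova distance ladder) and early-universe (cosmic microwave background) estimates; the article itself quotes only the rough 10 percent figure:

```python
# Illustrative Hubble constant estimates in km/s/Mpc (approximate,
# not taken from the article): late-universe vs. early-universe probes.
h0_supernovae = 73.0   # from supernova distances in nearby galaxies
h0_early_light = 67.0  # from fluctuations in the universe's earliest light

# Fractional discrepancy between the two "cosmic rulers"
discrepancy = (h0_supernovae - h0_early_light) / h0_early_light
print(f"{discrepancy:.1%}")  # prints 9.0%
```

With these representative numbers the gap is about 9 percent, consistent with the roughly 10 percent discrepancy described above; what matters for the tension is that the gap far exceeds each measurement's stated uncertainty.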
New surveys, coming online in the next few years, will shed more light on the history of the universe. Combining their data with SimBIG will help clarify the extent of the Hubble tension and determine whether the discrepancy can be resolved or requires a revised model of the universe, Hahn says. “If we measure the quantities very precisely and can say with certainty that there is a tension, it could reveal new physics about dark energy and the expansion of the universe,” he says.
Hahn, Ho and Parker worked on the study alongside Michael Eickenberg of the Center for Computational Mathematics (CCM) at the Flatiron Institute, Pablo Lemos of CCA, Chirag Modi of CCA and CCM, Bruno Régaldo-Saint Blancard of CCM, Simons Foundation President David Spergel, Jiamin Hou of the University of Florida, Elena Massara of the University of Waterloo and Azadeh Moradinezhad Dizgah of the University of Geneva.
More information:
ChangHoon Hahn et al, Cosmological constraints from non-Gaussian and non-linear galaxy clustering using the SimBIG inference framework, Nature Astronomy (2024). DOI: 10.1038/s41550-024-02344-2
Provided by the Simons Foundation
Astrophysicists use AI to precisely calculate the ‘parameters’ of the universe (2024, August 26) retrieved August 26, 2024 from
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.