Astronomers working with data from NASA's new James Webb Space Telescope (JWST) may hit a snag unless the models used to interpret that data are updated, researchers at the Massachusetts Institute of Technology (MIT) warn.
The authors of the new study, recently published in Nature Astronomy, argue that the existing tools astronomers use to "decode light-based signals" obtained from telescopes may require retuning to match the precision of the data supplied by JWST.
“Currently, the model we use to decrypt spectral information is not up to par with the precision and quality of data we have from the James Webb telescope,” Prajwal Niraula, a graduate student at MIT and co-author of the study, said in a press release published on EurekAlert. “We need to up our game and tackle together the opacity problem.”
The researchers point specifically to so-called opacity models, which astronomers use to infer the properties of distant material, such as an exoplanet's atmosphere, from the way light interacts with it.
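The stakes can be illustrated with a minimal, hypothetical sketch: in a simple Beer-Lambert picture of absorption, a small error in the assumed opacity of a molecule translates almost directly into an error of similar size in how much of that molecule is inferred from a spectrum. The toy example below uses invented numbers and is not drawn from the study; it only shows how such a bias propagates.

```python
# Illustrative sketch only (hypothetical numbers, not from the study): a toy
# Beer-Lambert transmission model showing how an error in the assumed opacity
# propagates directly into the inferred amount of absorbing gas.
import numpy as np

wavelength = np.linspace(1.0, 2.0, 500)                           # microns
true_opacity = 1e-21 * np.exp(-((wavelength - 1.4) / 0.05) ** 2)  # cm^2 per molecule
true_column = 1e21                                                 # molecules per cm^2

# "Observed" transmission spectrum generated with the true opacity
transmission = np.exp(-true_opacity * true_column)

# Suppose the laboratory opacity model is 10% too strong near the line core
model_opacity = 1.10 * true_opacity

# Inverting Beer-Lambert with the imperfect model at the line core
core = np.argmax(true_opacity)
inferred_column = -np.log(transmission[core]) / model_opacity[core]

print(f"True column density:     {true_column:.2e} molecules/cm^2")
print(f"Inferred column density: {inferred_column:.2e} molecules/cm^2")  # ~9% too low
```

In this toy setup the inferred abundance comes out roughly 9 percent low, an error that earlier, noisier data would have buried but that JWST-quality spectra can expose.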
Julien de Wit, assistant professor at MIT and co-leader of the study, argued that while the existing state-of-the-art opacity model “has been doing OK” so far, “now that we’re going to the next level with Webb’s precision, our translation process will prevent us from catching important subtleties, such as those making the difference between a planet being habitable or not.”
De Wit and his colleagues have already suggested ways the existing opacity models could be improved, “including the need for more laboratory measurements and theoretical calculations to refine the models’ assumptions of how light and various molecules interact”, the press release states.