Astronomers have many ways to measure the distance to galaxies billions of light years away, but most of them rely upon standard candles. These are astrophysical processes that have a brightness we can calibrate, such as Cepheid variable stars or Type Ia supernovae. Of course, all of these standard candles have some inherent variability, so astronomers also look for where our assumptions about them can lead us astray. As a case in point, a recent study in The Astrophysical Journal shows how galactic dust can bias distance observations.
The study compares two slightly different ways to measure galactic distances. The first method compares the X-ray luminosity of a galaxy to its brightness at ultraviolet wavelengths. Known as LX–LUV, this approach relies on the fact that the X-ray and ultraviolet luminosities of active galactic nuclei (AGNs) are correlated with their overall brightness. The LX–LUV relation allows astronomers to determine the absolute magnitude, and therefore the distance to the galaxy. The second method, known as R–L, compares the ultraviolet luminosity of the accretion disk around the galaxy's supermassive black hole with the radius of that disk. The bigger the disk, the brighter it shines, which again yields an absolute magnitude.
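Both methods ultimately hinge on the same step: once a standard candle's absolute magnitude M is known, comparing it to the apparent magnitude m gives the distance through the standard distance-modulus relation, m − M = 5 log₁₀(d / 10 pc). A minimal sketch of that final step (the numbers below are illustrative, not taken from the study):

```python
import math

def luminosity_distance_pc(apparent_mag: float, absolute_mag: float) -> float:
    """Distance in parsecs from the distance modulus: m - M = 5*log10(d/10pc)."""
    return 10 ** ((apparent_mag - absolute_mag) / 5 + 1)

# Illustrative example: a source observed at m = 19.3 whose calibrated
# absolute magnitude is M = -19.1 (values chosen for demonstration only).
d_pc = luminosity_distance_pc(19.3, -19.1)
d_mly = d_pc * 3.2616e-6  # convert parsecs to millions of light-years
```

A sanity check on the relation: a source with m = M sits at exactly 10 parsecs, by the definition of absolute magnitude.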
Both of these methods focus on the brightness of the AGN, and both involve UV brightness, so both methods should give us a similar distance. But often they don’t. The authors of this paper wanted to find out why, so they looked at 58 galaxies where both methods had been used to determine their distance. They then looked for factors that might skew the results of one method relative to the other.
The team found that dust within a galaxy can affect the LX–LUV method. Galactic dust, mostly made of carbon and silicates, can absorb X-ray and ultraviolet light and re-emit it at longer wavelengths. The dustier a galaxy is, the more significantly it can skew the distance result. The team also found that the presence of dust doesn't significantly bias the R–L method. Based on this, the authors recommend that the LX–LUV method not be used to measure galactic distances. That's a little unfortunate, since the R–L method is a bit more difficult to measure, but it means we can rule out data that could be skewing our cosmic distance measures. This could help us better understand the underlying issues of the Hubble tension, which continues to nag cosmologists.
The discovery of this bias doesn’t in any way undermine the standard model of cosmology, as these methods aren’t the only ones we can use to determine cosmic distances. Instead, it further improves our methods, so that we now have an even clearer understanding of how our Universe came to be.
Reference: Zajaček, Michal, et al. "Effect of Extinction on Quasar Luminosity Distances Determined from UV and X-Ray Flux Measurements." The Astrophysical Journal 961.2 (2024): 229.
Meanwhile, the South Pole Telescope has released a new cosmology paper ["A Measurement of Gravitational Lensing of the Cosmic Microwave Background Using SPT-3G 2018 Data", arXiv:2308.11608]. "Due to the different combination of angular scales and sky area, this lensing analysis provides an independent check on lensing measurements by ACT and Planck."
They agree with Planck but see fewer anomalies, and their results are consistent with a flat universe without adding BAO data. Their Hubble constant estimate is H_0 = 68.8 +1.3/−1.6 km s⁻¹ Mpc⁻¹.
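To give that number a physical feel: at low redshift, the Hubble constant converts a galaxy's redshift into an approximate distance via d ≈ cz/H₀. A minimal sketch using the SPT-3G central value quoted above (the redshift chosen is illustrative):

```python
C_KM_S = 299_792.458  # speed of light in km/s
H0 = 68.8             # SPT-3G Hubble constant estimate, km/s/Mpc

def hubble_distance_mpc(z: float) -> float:
    """Approximate distance in Mpc for a small redshift z, using d = c*z / H0.

    This linear approximation is only valid for z << 1.
    """
    return C_KM_S * z / H0

# Illustrative example: a galaxy at z = 0.02 sits roughly 87 Mpc away.
d = hubble_distance_mpc(0.02)
```

The +1.3/−1.6 uncertainty on H₀ propagates directly into this distance, which is one reason the Hubble tension matters for the cosmic distance ladder.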
Their lensing observations also prepare for their upcoming inflation observations. "High fidelity lensing maps generated by SPT-3G will be critical in removing the lensing contamination in the search of primordial gravitational waves, parametrized by the tensor-to-scalar ratio r, when jointly analyzed with data from BICEP Array [114]. The most stringent constraint on r to date from [115] is already limited by lensing. With SPT-3G's full-survey lensing map, we expect to improve BICEP Array's r uncertainty by a factor of about 2.5, reaching σ(r) of about 0.003."