Gravitational waves are, apparently, devilishly difficult things to model with the Einstein field equations, since they are highly dynamic and non-symmetric. Traditionally, the only way to get close to predicting the likely effects of gravitational waves was to estimate the required Einstein equation parameters by assuming that the objects generating the waves did not produce strong gravitational fields themselves – and did not move at velocities anywhere close to the speed of light.
Trouble is, the most likely candidate objects that might generate detectable gravitational waves – close binary neutron stars and merging black holes – have exactly those properties. They are highly compact, very massive bodies that often move at relativistic (i.e. close to the speed of light) velocities.
Isn’t it weird, then, that the ‘guesstimate’ approach described above actually works brilliantly in predicting the behavior of close massive binaries and merging black holes? Hence a recent paper titled: On the unreasonable effectiveness of the post-Newtonian approximation in gravitational physics.
So, firstly, no one has yet detected gravitational waves. But even in 1916, Einstein considered their existence likely and demonstrated mathematically that gravitational radiation should arise when you replace a spherical mass with a rotating dumbbell of the same mass, which, due to its geometry, will generate a dynamic ebb and flow in space-time as it rotates.
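In modern textbook notation (not Einstein’s original 1916 form), this result is usually expressed as the quadrupole formula, which ties the power radiated as gravitational waves to the changing shape of the source:

P = \frac{G}{5c^5} \left\langle \dddot{Q}_{ij}\, \dddot{Q}_{ij} \right\rangle

Here Q_{ij} is the source’s traceless mass quadrupole moment, the triple dots denote third time derivatives, and the angle brackets denote an average over a wave period. A spinning sphere has a constant Q_{ij} and radiates nothing, while Einstein’s dumbbell has a Q_{ij} that varies as it rotates – which is exactly why its geometry matters.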
To test Einstein’s theory, it’s necessary to design very sensitive detection equipment – and to date all such attempts have failed. Hopes now largely rest on the Laser Interferometer Space Antenna (LISA), which is not expected to launch before 2025.
However, as well as sensitive detection equipment like LISA, you also need to know in advance what sort of phenomena, and what sort of data, would represent definitive evidence of a gravitational wave – and this is where the theory and mathematics required to calculate these expected values become vital.
Initially, theoreticians worked out a post-Newtonian (i.e. Einstein-era) approximation (i.e. guesstimate) for a rotating binary system – although it was acknowledged that this approximation would only work reliably for a low-mass, low-velocity system, where the complicating relativistic and tidal effects arising from the self-gravity and velocities of the binary objects themselves could be ignored.
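To give a flavour of what the lowest order of this approximation looks like in practice, here is a minimal Python sketch (my illustration, not the paper’s code) that integrates the leading-order, quadrupole-level formula for how a binary’s gravitational-wave frequency sweeps upward as the orbit decays. The masses, starting frequency, and step size are illustrative assumptions.

```python
import math

# Physical constants (SI units)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def chirp_mass(m1, m2):
    """The combination of the two masses that controls the
    leading-order (quadrupole) gravitational-wave signal."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

def df_dt(f, mc):
    """Leading-order rate of change of the gravitational-wave
    frequency f of an inspiralling circular binary:
    df/dt = (96/5) * pi^(8/3) * (G*Mc/c^3)^(5/3) * f^(11/3)."""
    k = (G * mc / C**3) ** (5.0 / 3.0)
    return (96.0 / 5.0) * math.pi ** (8.0 / 3.0) * k * f ** (11.0 / 3.0)

# Illustrative binary: two 1.4-solar-mass neutron stars.
mc = chirp_mass(1.4 * M_SUN, 1.4 * M_SUN)

# Crude Euler integration of the frequency 'chirp' from 10 Hz
# to 1000 Hz; the approximation itself breaks down near merger.
f, t, dt = 10.0, 0.0, 0.01   # Hz, s, time step in s
while f < 1000.0:
    f += df_dt(f, mc) * dt
    t += dt

print(f"Leading-order chirp from 10 Hz to 1000 Hz: ~{t:.0f} s")
```

For a neutron star pair like this, the leading-order ‘chirp’ from 10 Hz to 1000 Hz lasts roughly a quarter of an hour; higher post-Newtonian orders then add corrections in successive powers of (v/c)².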
Then came the era of numerical relativity, when the advent of supercomputers made it possible to actually model all the dynamics of close massive binaries moving at relativistic speeds, much as supercomputers can model very dynamic weather systems on Earth.
Surprisingly – or, if you like, unreasonably – the values calculated with numerical relativity were almost identical to those calculated with the supposedly bodgy post-Newtonian approximation. The post-Newtonian approach just isn’t supposed to work for these situations.
All the paper’s author is left with is the possibility that gravitational redshift makes processes near very massive objects appear slower and gravitationally ‘weaker’ to an external observer than they really are. That could – kind of, sort of – explain the unreasonable effectiveness… but only kind of, sort of.
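For the record, the effect being appealed to here is standard physics. For a clock sitting at radius r outside a non-rotating mass M, general relativity predicts that the clock’s proper time τ runs slow relative to a distant observer’s time t by

\frac{d\tau}{dt} = \sqrt{1 - \frac{2GM}{rc^2}}

so dynamics deep in a strong gravitational field genuinely do look slowed, and hence ‘weaker’, from far away. (This is the textbook Schwarzschild result; whether it fully accounts for the approximation’s effectiveness is exactly what remains unclear.)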
Further reading: Will, C. On the unreasonable effectiveness of the post-Newtonian approximation in gravitational physics.