Measuring Fundamental Constants with Methanol

Diagram of the methanol molecule


Key to the astronomical modeling process by which scientists attempt to understand our universe is a comprehensive knowledge of the constants that go into these models. These are generally measured to exceptionally high confidence levels in laboratories. Astronomers then assume these constants are just that – constant. This generally seems to be a good assumption, since models often produce reasonably accurate pictures of our universe. But just to be sure, astronomers like to check that these constants haven’t varied across space or time. Checking, however, is a difficult challenge. Fortunately, a recent paper suggests that we may be able to probe the fundamental masses of the proton and electron (or at least their ratio) by looking at the relatively common molecule methanol.

The new report is based on the complex spectrum of the methanol molecule. In simple atoms, photons are generated by transitions between atomic orbitals, since atoms have no other way to store and release energy. But in molecules, the chemical bonds between the component atoms can store energy in vibrational modes, in much the same way that masses connected by springs can vibrate. Additionally, because molecules are not spherically symmetric the way single atoms are, they can also store energy in rotation. For this reason the spectra of cool stars show far more absorption lines than those of hot stars, since the cooler temperatures allow molecules to begin forming.

Many of these spectral features are present in the microwave portion of the spectrum, and some are extremely dependent on quantum mechanical effects which in turn depend on the precise masses of the proton and electron. If those masses were to change, the positions of some spectral lines would change as well. By comparing the observed positions of these lines to their expected ones, astronomers can gain valuable insight into whether these fundamental values vary.
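To make that concrete, the bookkeeping behind such a test can be sketched with the standard sensitivity-coefficient relation Δν/ν = K_μ × Δμ/μ, where μ is the proton-to-electron mass ratio and K_μ depends on the particular transition. Part of methanol’s appeal is that different transitions have quite different sensitivity coefficients, so comparing several lines from the same cloud helps separate a genuine change in μ from an ordinary Doppler shift. The snippet below is only an illustration: the 6.7 GHz line is a well-known methanol maser transition, but the K_μ value used is a placeholder, not a measured coefficient.

```python
# Illustrative sketch (not from the paper): how a fractional change in the
# proton-to-electron mass ratio mu would shift a molecular line, using
#   delta_nu / nu = K_mu * (delta_mu / mu)
# K_MU below is a placeholder value chosen for the example, not a measured one.

def shifted_frequency(nu_rest_hz, k_mu, dmu_over_mu):
    """Observed frequency after a fractional change dmu_over_mu in mu."""
    return nu_rest_hz * (1.0 + k_mu * dmu_over_mu)

NU_REST = 6.7e9       # Hz: a well-known methanol maser line near 6.7 GHz
K_MU = -1.0           # assumed sensitivity coefficient (illustrative only)
DMU_OVER_MU = 3e-8    # the rough upper limit quoted later in this article

shift_hz = shifted_frequency(NU_REST, K_MU, DMU_OVER_MU) - NU_REST
print(f"line shift: {shift_hz:+.1f} Hz on the {NU_REST / 1e9:.1f} GHz transition")
```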

The primary difficulty is that, in the grand scheme of things, methanol (CH3OH) is rare, since our universe is 98% hydrogen and helium. The remaining 2% is composed of every other element (with oxygen and carbon being the next most common). So while methanol is made of three of the four most common elements, those atoms still have to find one another to form the molecule in question. On top of that, the molecule must exist in the right temperature range: too hot and it is broken apart; too cold and there’s not enough energy to produce the emission we need to detect it. Given these requirements, you might expect that finding enough methanol, especially across the galaxy or the universe, would be challenging.

Fortunately, methanol is one of the few molecules that are prone to forming astronomical masers. Masers are the microwave equivalent of lasers: a small input of light triggers a cascade in which the molecules it strikes are induced to emit more light at the same specific frequencies. This can greatly enhance the brightness of a cloud containing methanol, increasing the distance at which it can be readily detected.

By studying methanol masers within the Milky Way using this technique, the authors found that, if the ratio of the mass of an electron to that of a proton does change, it does so by less than three parts in one hundred million. Similar studies have also been conducted using ammonia as the tracer molecule (which can also form masers) and have come to similar conclusions.

Australian Student Uncovers the Universe’s Missing Mass

Cosmic Microwave Background. Courtesy of NASA / WMAP Science Team


Not since the work of Fritz Zwicky has the astronomy world been so excited about the missing mass of the Universe. Evidence for it has come from the orbital velocities of galaxies in clusters, galactic rotation speeds, and the gravitational lensing of background objects. Now there’s even more evidence that Zwicky was right, as Australian student Amelia Fraser-McKelvie has made another breakthrough in the world of astrophysics.

Working with a team at the Monash School of Physics, the 22-year-old undergraduate Aerospace Engineering/Science student conducted a targeted X-ray search for the hidden matter and within just three months made a very exciting discovery. Astrophysicists predicted the mass would be low in density, but high in temperature – approximately one million degrees Celsius. According to theory, the matter should have been observable at X-ray wavelengths and Amelia Fraser-McKelvie’s discovery has proved the prediction to be correct.
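As a rough order-of-magnitude check (not taken from the study itself): gas at about a million degrees has a characteristic thermal energy and emission peak that land in the extreme-ultraviolet/soft X-ray range, which is why an X-ray search is the natural approach. The sketch below treats the gas as a simple blackbody, which the real, optically thin filament plasma is not, so the numbers are only indicative.

```python
# Back-of-envelope estimate: why million-degree gas is an X-ray/extreme-UV target.
# Treats the gas as a blackbody purely to get a rough number; real filament
# plasma is optically thin, so this is indicative only.

K_BOLTZMANN_EV = 8.617e-5      # eV per kelvin
WIEN_CONSTANT_M_K = 2.898e-3   # metre-kelvin (Wien displacement constant)
HC_EV_NM = 1239.84             # eV * nm (photon energy <-> wavelength)

T = 1.0e6                                          # kelvin, the predicted filament temperature
thermal_energy_ev = K_BOLTZMANN_EV * T             # characteristic particle energy kT
peak_wavelength_nm = WIEN_CONSTANT_M_K / T * 1e9   # Wien peak of a blackbody at T
peak_photon_ev = HC_EV_NM / peak_wavelength_nm

print(f"kT ~ {thermal_energy_ev:.0f} eV")                                             # ~86 eV
print(f"Wien peak ~ {peak_wavelength_nm:.1f} nm (~{peak_photon_ev:.0f} eV photons)")  # ~2.9 nm, ~430 eV
```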

Dr Kevin Pimbblet from the School of Astrophysics explains: “It was thought from a theoretical viewpoint that there should be about double the amount of matter in the local Universe compared to what was observed. It was predicted that the majority of this missing mass should be located in large-scale cosmic structures called filaments – a bit like thick shoelaces.”

Until now, theories were based solely on numerical models, so Fraser-McKelvie’s observations represent a true breakthrough in determining just how much of this mass is caught up in filamentary structure. “Most of the baryons in the Universe are thought to be contained within filaments of galaxies, but as yet, no single study has published the observed properties of a large sample of known filaments to determine typical physical characteristics such as temperature and electron density,” says Amelia. “We examine if a filament’s membership to a supercluster leads to an enhanced electron density as reported by Kull & Bohringer (1999). We suggest it remains unclear if supercluster membership causes such an enhancement.”

Still a year away from undertaking her Honors year (which she will complete under the supervision of Dr Pimbblet), Ms Fraser-McKelvie is being hailed as one of Australia’s most exciting young students… and we can see why!

AMS Now Attached to the Space Station, Ready to Observe the Invisible Universe

The AMS sits near the center of this graphic, which shows where the experiment is located on the truss of the ISS. Credit: NASA

The long-awaited Alpha Magnetic Spectrometer, a particle physics detector that could unlock mysteries about dark matter and other cosmic radiation, has now been installed outside the International Space Station. It is the largest and most complex scientific instrument yet on board the orbiting laboratory, and will examine ten thousand cosmic-ray hits every minute, looking for nature’s best-kept particle secrets, searching for clues into the fundamental nature of matter.

“Thank you very much for the great ride and safe delivery of AMS to the station,” said Dr. Samuel Ting, speaking via radio to the crew on orbit who installed the AMS. Ting is the AMS Principal Investigator who has worked on the project for close to 20 years. “Your support and fantastic work have taken us one step closer to realizing the science potential of AMS. With your help, for the next 20 years, AMS on the station will provide us a better understanding of the origin of the universe.”

“Thank you, Sam,” Endeavour commander Mark Kelly radioed back, “I was just looking out the window of the orbiter and AMS looks absolutely fantastic on the truss. I know you guys are really excited and you’re probably getting data and looking at it already.”

By collecting and measuring vast numbers of cosmic rays and their energies, particle physicists hope to learn more about how and where the rays are born – a long-standing mystery. They could be created in the magnetic fields of exploded stars, or perhaps in the hearts of active galaxies, or maybe in places as yet unseen by astronomers.

The AMS now on the station is actually AMS-02; a prototype, AMS-01, was launched on board the space shuttle in 1998 and showed great potential. But Ting and his collaborators from around the world knew that to make a significant contribution to particle science, they needed a detector that could stay in space for a long period of time.

AMS-02 will operate on the ISS until at least 2020, and hopefully longer, depending on the life of the space station.


The AMS will also search for antimatter within the cosmic rays, and attempt to determine whether the antimatter is formed from collisions between particles of dark matter, the mysterious substance that astronomers believe may make up about 22% of the Universe.

There is also the remote chance that AMS-02 will detect a particle of anti-helium, left over from the Big Bang itself.

“The most exciting objective of AMS is to probe the unknown; to search for phenomena which exist in nature that we have not yet imagined nor had the tools to discover,” said Ting.

For more information about the AMS, NASA has a detailed article.

Source: ESA, NASA TV

Did the Early Universe Have Just One Dimension?

Planck all-sky image. Credit: ESA, HFI and LFI consortia.


From a University at Buffalo press release:

Did the early universe have just one spatial dimension? That’s the mind-boggling concept at the heart of a theory that physicist Dejan Stojkovic from the University at Buffalo and colleagues proposed in 2010. They suggested that the early universe — which exploded from a single point and was very, very small at first — was one-dimensional (like a straight line) before expanding to include two dimensions (like a plane) and then three (like the world in which we live today).

The theory, if valid, would address important problems in particle physics.

Now, in a new paper in Physical Review Letters, Stojkovic and Loyola Marymount University physicist Jonas Mureika describe a test that could prove or disprove the “vanishing dimensions” hypothesis.

Because it takes time for light and other waves to travel to Earth, telescopes peering out into space can, essentially, look back into time as they probe the universe’s outer reaches.

Gravitational waves can’t exist in one- or two-dimensional space. So Stojkovic and Mureika have reasoned that the Laser Interferometer Space Antenna (LISA), a planned international gravitational observatory, should not detect any gravitational waves emanating from the lower-dimensional epochs of the early universe.

Stojkovic, an assistant professor of physics, says the theory of evolving dimensions represents a radical shift from the way we think about the cosmos — about how our universe came to be.

The core idea is that the dimensionality of space depends on the size of the space we’re observing, with smaller spaces associated with fewer dimensions. That means that a fourth dimension will open up — if it hasn’t already — as the universe continues to expand.

The theory also suggests that space has fewer dimensions at very high energies of the kind associated with the early, post-big bang universe.

If Stojkovic and his colleagues are right, they will be helping to address fundamental problems with the standard model of particle physics, including the following:

The incompatibility between quantum mechanics and general relativity. Quantum mechanics and general relativity are mathematical frameworks that describe the physics of the universe. Quantum mechanics is good at describing the universe at very small scales, while relativity is good at describing the universe at large scales. Currently, the two theories are considered incompatible; but if the universe, at its smallest levels, had fewer dimensions, mathematical discrepancies between the two frameworks would disappear.

Physicists have observed that the expansion of the universe is speeding up, and they don’t know why. The addition of new dimensions as the universe grows would explain this acceleration. (Stojkovic says a fourth dimension may have already opened at large, cosmological scales.)

The standard model of particle physics predicts the existence of an as yet undiscovered elementary particle called the Higgs boson. For equations in the standard model to accurately describe the observed physics of the real world, however, researchers must artificially adjust the mass of the Higgs boson for interactions between particles that take place at high energies. If space has fewer dimensions at high energies, the need for this kind of “tuning” disappears.

“What we’re proposing here is a shift in paradigm,” Stojkovic said. “Physicists have struggled with the same problems for 10, 20, 30 years, and straight-forward extensions of the existing ideas are unlikely to solve them.”

“We have to take into account the possibility that something is systematically wrong with our ideas,” he continued. “We need something radical and new, and this is something radical and new.”

Because the planned deployment of LISA is still years away, it may be a long time before Stojkovic and his colleagues are able to test their ideas this way.

However, some experimental evidence already points to the possible existence of lower-dimensional space.

Specifically, scientists have observed that the main energy flux of cosmic ray particles with energies exceeding 1 teraelectronvolt – the kind of high energy associated with the very early universe – is aligned along a two-dimensional plane.

If high energies do correspond with lower-dimensional space, as the “vanishing dimensions” theory proposes, researchers working with the Large Hadron Collider particle accelerator in Europe should see planar scattering at such energies.

Stojkovic says the observation of such events would be “a very exciting, independent test of our proposed ideas.”

Sources: EurekAlert, Physical Review Letters.

The Universe in a Chocolate Creme Egg

Can chocolate creme eggs help explain the mysteries of the Universe? As part of the University of Nottingham’s Sixty Symbols science video series, the Cadbury creme egg has been featured this week, with several eggcellent videos just in time for Easter. This one discusses the cosmological constant, and the possibility that we might be surrounded by tiny eggs from another dimension. Surprisingly, scientists can explain and demonstrate some of the fundamental scientific laws that govern the universe with yummy cream-filled chocolate eggs. See more egg-themed discussions at Sixty Symbols.

No Joy for Dark Matter Detector’s First 100 Days

Bottom photomultiplier tube array on the XENON 100 detector. Credit: the XENON collaboration


We’re still mostly in the dark about Dark Matter, and the highly anticipated results from the XENON100 detector have perhaps shed a tad more light on the subject – by not making a detection in the first 100 days of the experiment. Researchers from the project say they have now been able to place the most stringent limits yet on the properties of dark matter.

To look for any possible hints of Dark Matter interacting with ordinary matter, the project has been searching for WIMPs (weakly interacting massive particles) – but for now, there is no new evidence for the existence of WIMPs, and therefore none for Dark Matter either.

The extremely sensitive XENON100 detector is buried beneath the Gran Sasso mountain in central Italy, shielding it from cosmic radiation so that it can, hopefully, detect WIMPs – hypothetical particles that might be heavier than atomic nuclei and are the most popular candidate for what Dark Matter might be made of. The detector consists of 62 kg of liquid xenon contained within a heavily shielded tank. If a WIMP were to enter the detector, it should interact with a xenon nucleus to generate light and electrical signals – which would be a kind of “You Have Won!” indicator.

Dark Matter is thought to make up more than 80% of all mass in the universe, but its nature is still unknown. Scientists believe it is made up of exotic particles unlike the normal (baryonic) matter of which we, the Earth, the Sun and the stars are made. And because it is invisible, its presence has only been inferred from its gravitational effects.

The XENON detector ran from January to June 2010 for its first science run, and in their paper on arXiv, the team revealed they found three candidate events that might be due to Dark Matter. But two of these were expected to appear anyway because of background noise, the team said, so their results are effectively negative.

Does this rule out the existence of WIMPs? Not necessarily – the team will keep working on their search. Plus, results from a preliminary analysis of 11.2 days’ worth of data, taken during the experiment’s commissioning phase in October and November 2009, already set new upper limits on the interaction rate of WIMPs – the world’s best for WIMP masses below about 80 times the mass of a proton.

Even so, the XENON100 team is optimistic. “These new results reveal the highest sensitivity reported as yet by any dark matter experiment, while placing the strongest constraints on new physics models for particles of dark matter,” the team said in a statement.

Read the team’s paper.

More info on XENON100

Sources: EurekAlert, physicsworld

A New Way to Visualize Warped Space and Time

Two doughnut-shaped vortexes ejected by a pulsating black hole. Also shown at the center are two red and two blue vortex lines attached to the hole, which will be ejected as a third doughnut-shaped vortex in the next pulsation. Credit: The Caltech/Cornell SXS Collaboration


Trying to understand the warping of space and time is something like visualizing a scene from Alice in Wonderland, where rooms can change sizes and locations. The most-used description of the warping of space-time is how a heavy object deforms a stretched elastic sheet, but in actuality, physicists say, this warping is so complicated that they really haven’t been able to understand the details of what goes on. Now, new conceptual tools that combine theory and computer simulations are providing a better way for scientists to visualize what takes place when gravity from an object or event changes the fabric of space.

Researchers at Caltech, Cornell University, and the National Institute for Theoretical Physics in South Africa developed conceptual tools that they call tendex lines and vortex lines. These lines describe the gravitational forces caused by warped space-time and are analogous to the electric and magnetic field lines that describe electric and magnetic forces.

“Tendex lines describe the stretching force that warped space-time exerts on everything it encounters,” said David Nichols, the Caltech graduate student who came up with the term ‘tendex.’ “Tendex lines sticking out of the Moon raise the tides on the Earth’s oceans, and the stretching force of these lines would rip apart an astronaut who falls into a black hole.”

Vortex lines, on the other hand, describe the twisting of space. So, if an astronaut’s body is aligned with a vortex line, it would get wrung like a wet towel.

Two spiral-shaped vortexes (yellow) of whirling space sticking out of a black hole, and the vortex lines (red curves) that form the vortexes. Credit: The Caltech/Cornell SXS Collaboration

They tried out the tools on computer-simulated black-hole collisions and saw that such impacts would produce doughnut-shaped vortex lines that fly away from the merged black hole like smoke rings. The researchers also found that a bundle of vortex lines spirals out of the black hole like water from a rotating sprinkler. Depending on the angles and speeds of the collision, the vortex and tendex lines – or gravitational waves – would behave differently.

“Though we’ve developed these tools for black-hole collisions, they can be applied wherever space-time is warped,” says Dr. Geoffrey Lovelace, a member of the team from Cornell. “For instance, I expect that people will apply vortex and tendex lines to cosmology, to black holes ripping stars apart, and to the singularities that live inside black holes. They’ll become standard tools throughout general relativity.”

The researchers say the tendex and vortex lines provide a powerful new way to understand the nature of the universe. “Using these tools, we can now make much better sense of the tremendous amount of data that’s produced in our computer simulations,” says Dr. Mark Scheel, a senior researcher at Caltech and leader of the team’s simulation work.

Their paper was published April 11 in Physical Review Letters.

Source: Caltech

Particle Physicists See Something Little That Could be Really Big

The dijet invariant mass distribution seen by Fermilab. The blue histogram represents something that is not predicted by the Standard Model. Credit: Fermilab


Physicists from Fermilab have seen a “bump” in their data that could indicate a brand new particle unlike any ever seen before. If verified, this could re-write particle physics as we know it. “Essentially, the Tevatron has seen evidence for a new particle, 150 times mass of proton, that doesn’t behave like a standard Higgs particle,” said physicist Brian Cox on Twitter. “If this stands up to scrutiny and more data (there is not yet enough data for a “discovery”), then it is RIP Standard Model.”

“It was hard for us to not go crazy when we saw the results,” said Viviana Cavaliere from the University of Illinois (UIUC), one of the 500-member team working with the CDF particle detector at Fermi National Accelerator Laboratory in Batavia, Illinois, speaking on a webcast on April 6. “But for now, we need to stay focused on what we do know.”

The result comes from CDF’s (the Collider Detector at Fermilab) analysis of billions of collisions of protons and antiprotons produced by Fermilab’s Tevatron collider. In high energy collisions, subatomic particles can be detected that otherwise can’t be seen. Physicists try to identify the particles they see by studying the combinations of more-familiar particles into which they decay, while trying to find new particles, such as the theoretical Higgs Boson which is predicted by the Standard Model of particle physics.

The Standard Model describes the elementary particles and forces inside atoms that make up everything around us, and it has been successful at making predictions that have subsequently been verified. There are sixteen named particles in the Standard Model; the most recently discovered were the W and Z bosons in 1983, the top quark in 1995, and the tau neutrino in 2000. But most physicists agree the Standard Model is probably not the final word in particle physics.

The researchers at Fermilab were searching for collisions that produced a W boson, which weighs about 87 times as much as a proton, as well as some other particles that disintegrate into two sprays of particles called “jets,” which are produced when a collision scatters out a particle called a quark.

Instead, they saw about 250 events which indicate a new particle weighing about 150 times as much as a proton, the team said at the webcast from Fermilab and in their paper on arXiv.

The researchers estimate the statistical chances of random jets or jet pairs from other sources producing a fake signal that strong at 1 in 1300.
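For readers used to results quoted in “sigmas”: assuming the 1-in-1300 figure is a one-sided Gaussian tail probability, it corresponds to roughly 3.2 sigma – intriguing, but well short of the 5-sigma convention for claiming a discovery. A minimal conversion sketch:

```python
# Sketch: converting the quoted 1-in-1300 chance of a fake signal into the
# "sigma" significance particle physicists usually quote, assuming a
# one-sided Gaussian tail probability.
from scipy.stats import norm

p_value = 1.0 / 1300.0
significance = norm.isf(p_value)   # inverse survival function of the standard normal

print(f"p = {p_value:.2e}  ->  about {significance:.1f} sigma")
# ~3.2 sigma: interesting, but below the 5-sigma threshold conventionally
# required before claiming a discovery.
```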

The Standard Model does not predict anything like what was seen in the CDF experiment, and since this particle has not been seen before and appears to have some strange properties, the physicists want to verify and retest before claiming a discovery.

“If it is not a fluctuation, it is a new particle,” Cox said.

The Tevatron accelerator at Fermilab is scheduled to be shut down later this year, due to lack of funding and because of sentiments that it would be redundant to the Large Hadron Collider.

You can see more complete discussions and interpretations of the results at:

Cosmic Variance

Science News

MSNBC

Astronomy Without A Telescope – Doubly Special Relativity

The Large Hadron Collider - destined to deliver fabulous science data, but uncertain if these will include an evidence basis for quantum gravity theories. Credit: CERN.


General relativity, Einstein’s theory of gravity, gives us a useful basis for mathematically modeling the large scale universe – while quantum theory gives us a useful basis for modeling sub-atomic particle physics and the likely small-scale, high-energy-density physics of the early universe – nanoseconds after the Big Bang – which general relativity just models as a singularity and has nothing else to say on the matter.

Quantum gravity theories may have more to say. By extending general relativity into a quantized structure for space-time, maybe we can bridge the gap between small and large scale physics. For example, there’s doubly special relativity.

With conventional special relativity, two different inertial frames of reference may measure the speed of the same object differently. So, if you are on a train and throw a tennis ball forward, you might measure it moving at 10 kilometers an hour. But someone standing on the station platform, watching your train pass by at 60 kilometers an hour, measures the speed of the ball at 60 + 10 – i.e. 70 kilometers an hour. Give or take an immeasurably small relativistic correction, you are both correct.

However, as Einstein pointed out, if you do the same experiment but shine a torch beam forward on the train rather than throwing a ball, both you on the train and the person on the platform measure the torch beam’s speed as the speed of light – without that additional 60 kilometers an hour – and you are both correct.
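For the curious, the arithmetic behind the train example is the relativistic velocity-addition rule w = (u + v) / (1 + uv/c²). The sketch below, using the same everyday speeds, shows just how tiny the correction to the naive 60 + 10 answer really is.

```python
# Minimal sketch of relativistic velocity addition for the train example:
#   w = (u + v) / (1 + u*v / c**2)
# At everyday speeds the correction to the naive sum u + v is absurdly small.

C = 299_792_458.0                      # speed of light, m/s

def add_velocities(u, v):
    """Combine two velocities (in m/s) the special-relativity way."""
    return (u + v) / (1.0 + u * v / C**2)

kmh = 1000.0 / 3600.0                  # one km/h expressed in m/s
train, ball = 60.0 * kmh, 10.0 * kmh

naive = train + ball
exact = add_velocities(train, ball)

print(f"naive sum    : {naive / kmh:.6f} km/h")
print(f"relativistic : {exact / kmh:.6f} km/h")
print(f"difference   : {naive - exact:.1e} m/s")   # of order 1e-14 m/s
```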

It works out that, for the person on the platform, the components of speed (distance and time) are changed on the train, so that distances are contracted and time is dilated (i.e. clocks run slower). And by the math of Lorentz transformations, these effects become more obvious the faster the train goes. It also turns out that the mass of objects on the train increases as well – although, before anyone asks, the train can’t turn into a black hole even at 99.9999(etc) per cent of the speed of light.
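The “bigger effect the faster you go” part is set by the Lorentz factor γ = 1/√(1 − v²/c²), which governs how much lengths contract, clocks slow and relativistic mass grows. A small illustrative calculation, with arbitrarily chosen speeds:

```python
# Sketch: the Lorentz factor gamma = 1 / sqrt(1 - v**2 / c**2), which sets how
# strongly lengths contract, clocks run slow and relativistic mass increases.
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_gamma(v):
    """Lorentz factor for a speed v in m/s."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

for label, v in [("train at 100 km/h", 100.0 / 3.6),
                 ("half the speed of light", 0.5 * C),
                 ("99.9999% of the speed of light", 0.999999 * C)]:
    print(f"{label:32s} gamma = {lorentz_gamma(v):.10g}")
```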

Now, doubly special relativity, proposes that not only is the speed of light always the same regardless of your frame of reference, but Planck units of mass and energy are also always the same. This means that relativistic effects (like mass appearing to increase on the train) do not occur at the Planck (i.e. very small) scale – although at larger scales, doubly special relativity should deliver results indistinguishable from conventional special relativity.

The Planck spacecraft - an observatory exploring the universe and named after the founder of quantum theory. Coincidence? Credit: ESA.

Doubly special relativity might also be generalized towards a theory of quantum gravity – which, when extended up from the Planck scale, should deliver results indistinguishable from general relativity.

It turns out that in Planck units E = m, even though in everyday units E = mc². A Planck mass is 2.17645 × 10⁻⁸ kg – supposedly about the mass of a flea’s egg – and has a Schwarzschild radius of roughly a Planck length, meaning that if you compressed this mass into such a tiny volume, it would become a very small black hole containing one Planck unit of energy.

To put it another way, at the Planck scale gravity becomes a significant force in quantum physics. Although really, all we are saying is that there is one Planck unit of gravitational force between two Planck masses separated by a Planck length – and, by the way, a Planck length is the distance that light moves within one unit of Planck time!
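Those Planck-scale numbers fall straight out of G, ħ and c, so they are easy to check. The sketch below reproduces the figures quoted above; note that the Schwarzschild radius of a Planck mass actually works out to two Planck lengths, which is the “roughly a Planck length” of the previous paragraph.

```python
# Sketch: deriving the Planck units from G, hbar and c, and checking that a
# Planck mass squeezed into (roughly) a Planck length is indeed a black hole.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.0546e-34    # reduced Planck constant, J s
C = 2.99792458e8     # speed of light, m/s

planck_mass = math.sqrt(HBAR * C / G)        # ~2.18e-8 kg
planck_length = math.sqrt(HBAR * G / C**3)   # ~1.6e-35 m
planck_time = planck_length / C              # time for light to cross one Planck length

schwarzschild_radius = 2.0 * G * planck_mass / C**2   # = 2 * planck_length

print(f"Planck mass   : {planck_mass:.3e} kg")
print(f"Planck length : {planck_length:.3e} m")
print(f"Planck time   : {planck_time:.3e} s")
print(f"Schwarzschild radius of a Planck mass: {schwarzschild_radius:.3e} m "
      f"({schwarzschild_radius / planck_length:.1f} Planck lengths)")
```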

And since one Planck unit of energy (1.22 × 10¹⁹ GeV) is considered the maximum possible energy of a particle, it’s tempting to consider that this represents the conditions expected in the Planck epoch, the very first stage of the Big Bang.

It all sounds terribly exciting, but this line of thinking has been criticized as being just a trick to make the math work better, one that removes important information about the physical systems under consideration. You also risk undermining fundamental principles of conventional relativity since, as the paper below outlines, a Planck length can be considered an invariant constant, independent of an observer’s frame of reference, while the speed of light becomes variable at very high energy densities.

Nonetheless, since even the Large Hadron Collider is not expected to deliver direct evidence about what may or may not happen at the Planck scale – for now, making the math work better does seem to be the best way forward.

Further reading: Zhang et al. Photon Gas Thermodynamics in Doubly Special Relativity.

Astronomy Without A Telescope – Unreasonable Effectiveness


Gravitational waves are apparently devilishly difficult things to model with the Einstein field equations, since they are highly dynamic and non-symmetric. Traditionally, the only way to get close to predicting the likely effects of gravity waves was to estimate the required Einstein equation parameters by assuming that the objects generating the gravity waves did not themselves produce strong gravity fields – and did not move at velocities anywhere close to the speed of light.

Trouble is, the most likely candidate objects that might generate detectable gravity waves – close binary neutron stars and merging black holes – have exactly those properties. They are highly compact, very massive bodies that often move at relativistic (i.e. close to the speed of light) velocities.

Isn’t it weird, then, that the ‘guesstimate’ approach described above actually works brilliantly in predicting the behavior of close massive binaries and merging black holes? Hence a recent paper titled: On the unreasonable effectiveness of post-Newtonian approximation in gravitational physics.

So, firstly, no one has yet directly detected gravity waves. But as early as 1916, Einstein considered their existence likely and demonstrated mathematically that gravitational radiation should arise if you replace a spherical mass with a rotating dumbbell of the same mass, which, due to its geometry, generates a dynamic ebb and flow in space-time as it rotates.
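The modern textbook descendant of Einstein’s dumbbell argument is the quadrupole formula, which for a circular, equal-mass binary gives a radiated power of P = (32/5)(G/c⁵) μ² a⁴ ω⁶, with μ the reduced mass, a the separation and ω the orbital angular frequency. The sketch below plugs in an illustrative pair of neutron stars; the masses and separation are assumptions chosen for the example, not numbers taken from the paper discussed here.

```python
# Rough quadrupole-formula estimate of the gravitational-wave power radiated by
# a circular, equal-mass binary.  The 1.4-solar-mass, 100-km-separation values
# are illustrative assumptions, not figures from the paper under discussion.
import math

G = 6.674e-11            # m^3 kg^-1 s^-2
C = 2.99792458e8         # m/s
M_SUN = 1.989e30         # kg

m = 1.4 * M_SUN          # each neutron star (assumed)
a = 100e3                # separation in metres (assumed, a late-inspiral value)

omega = math.sqrt(G * (2.0 * m) / a**3)              # Kepler: orbital angular frequency
mu = m / 2.0                                         # reduced mass of an equal-mass pair
power = (32.0 / 5.0) * (G / C**5) * mu**2 * a**4 * omega**6

print(f"orbital frequency ~ {omega / (2.0 * math.pi):.0f} Hz")
print(f"radiated power    ~ {power:.1e} W  (the Sun's light output is ~4e26 W)")
```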

To test Einstein’s theory, it’s necessary to design very sensitive detecting equipment – and to date all such attempts have failed. Further hopes now largely rest on the Laser Interferometer Space Antenna (LISA), which is not expected to launch before 2025.

The proposed Laser Interferometer Space Antenna (LISA) system using laser interferometry to monitor the fluctuations in the relative distances between three spacecraft, arranged in an equilateral triangle with five million kilometer sides. Hopefully, this will be sensitive enough to detect gravity waves. Credit: NASA.

However, as well as sensitive detection equipment like LISA, you also need to calculate what sort of phenomena and what sort of data would represent definitive evidence of a gravity wave – which is where all the theory and math required to determine these expected values is vital.

Initially, theoreticians worked out a post-Newtonian (i.e. Einstein era) approximation (i.e. guesstimate) for a rotating binary system – although it was acknowledged that this approximation would only work effectively for a low mass, low velocity system – where any complicating relativistic and tidal effects, arising from the self-gravity and velocities of the binary objects themselves, could be ignored.

Then came the era of numerical relativity where the advent of supercomputers made it possible to actually model all the dynamics of close massive binaries moving at relativistic speeds, much as how supercomputers can model very dynamic weather systems on Earth.

Surprisingly, or if you like unreasonably, the calculated values from numerical relativity were almost identical to those calculated by the supposedly bodgy post-Newtonian approximation. The post-Newtonian approximation approach just isn’t supposed to work for these situations.

All the authors are left with is the possibility that gravitational redshift makes processes near very massive objects appear slower and gravitationally ‘weaker’ to an external observer than they really are. That could – kind of, sort of – explain the unreasonable effectiveness… but only kind of, sort of.
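To put a number on how much slower such processes appear: outside a non-rotating mass, clocks at radius r run at a factor √(1 − r_s/r) of the far-away rate, where r_s is the Schwarzschild radius. A small illustrative sketch, with arbitrarily chosen radii:

```python
# Sketch: gravitational time dilation outside a non-rotating mass.  Clocks at
# radius r (in units of the Schwarzschild radius r_s) tick at sqrt(1 - r_s/r)
# of the rate measured by a distant observer.  Radii below are illustrative.
import math

def slowdown_factor(r_over_rs):
    """Local clock rate relative to a distant observer, for r given in units of r_s."""
    return math.sqrt(1.0 - 1.0 / r_over_rs)

for r in (10.0, 3.0, 1.5, 1.1):
    print(f"r = {r:>4} r_s  ->  clocks run at {slowdown_factor(r):.3f} of the distant rate")
```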

Further reading: Will, C. On the unreasonable effectiveness of the post-Newtonian approximation in gravitational physics.