First Look At Interstellar Turbulence

Regions of gas where the density and magnetic field are changing rapidly as a result of turbulence. [Technical note: the image shows the gradient of linear polarisation over an 18-square-degree region of the Southern Galactic Plane.] Image credit – B. Gaensler et al. Data: CSIRO/ATCA


The space that surrounds us isn’t empty. We’ve always known the Milky Way is filled with great swathes of turbulent gas, but we’ve never been able to see them… until now. Professor Bryan Gaensler of the University of Sydney, Australia, and his team used a CSIRO radio telescope in eastern Australia to create this first-ever image, published in Nature today.

“This is the first time anyone has been able to make a picture of this interstellar turbulence,” said Professor Gaensler. “People have been trying to do this for 30 years.”

So what’s the point behind the motion? Turbulence distributes magnetism, disperses heat from supernova events and even plays a role in star formation.

“We now plan to study turbulence throughout the Milky Way. Ultimately this will help us understand why some parts of the galaxy are hotter than others, and why stars form at particular times in particular places,” Professor Gaensler said.

The team employed CSIRO’s Australia Telescope Compact Array because, as Dr. Robert Braun, Chief Scientist at CSIRO Astronomy and Space Science, explained, “it is one of the world’s best telescopes for this kind of work.” They set their sights on a region about 10,000 light-years away in the constellation of Norma, aiming to document the radio signals emanating from that section of the Milky Way. As the radio waves pass through the swirling gas, their polarisation – the direction in which the waves “vibrate” – is altered, and sensitive equipment can pick up these small variations.
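The diagnostic itself is simple to sketch. Below is a minimal, hedged illustration using synthetic maps (not the actual ATCA data): given Stokes Q and U images, the quantity mapped is the magnitude of the spatial gradient of the complex polarisation P = Q + iU, which spikes wherever the gas changes abruptly.

```python
import numpy as np

def polarization_gradient(Q, U):
    """|grad P| for the complex linear polarisation P = Q + iU:
    sqrt(Qx**2 + Qy**2 + Ux**2 + Uy**2). Sharp features mark places
    where density or magnetic field change abruptly."""
    Qy, Qx = np.gradient(Q)  # derivatives along rows, then columns
    Uy, Ux = np.gradient(U)
    return np.sqrt(Qx**2 + Qy**2 + Ux**2 + Uy**2)

# Synthetic stand-in for a polarisation map (not real ATCA data):
# a smooth sheet with an abrupt sign flip, mimicking a turbulent front.
y, x = np.mgrid[0:64, 0:64]
Q = np.where(x < 32, 1.0, -1.0)
U = np.zeros_like(Q)

grad = polarization_gradient(Q, U)
print(grad.max())  # → 1.0, peaking along the front
```

In a real map the bright tendrils of |∇P| trace exactly these fronts, which is what the image above shows.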

By measuring these polarisation changes, the team painted a radio portrait of the gaseous regions where turbulence causes the density and magnetic field to fluctuate wildly. The tendrils in the image are important, too: they show how fast the changes are occurring, which is critical for characterising the turbulence. Team member Blakesley Burkhart, a PhD student from the University of Wisconsin, made several computer simulations of turbulent gas moving at different speeds. By matching the simulations to the actual image, the team concluded that “the speed of the swirling in the turbulent interstellar gas is around 70,000 kilometers per hour — relatively slow by cosmic standards.”
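As a quick back-of-the-envelope check on that quoted figure, here is the unit conversion plus a rough Mach number; the 10 km/s sound speed is an assumed typical value for warm ionized interstellar gas, not a number from the paper.

```python
# The quoted turbulent speed, converted, plus a rough Mach number.
# The 10 km/s sound speed is an assumed typical value for warm ionized
# interstellar gas, not a figure taken from the paper.
speed_kms = 70_000.0 / 3600.0   # km/h -> km/s
assumed_sound_speed_kms = 10.0
mach = speed_kms / assumed_sound_speed_kms
print(f"{speed_kms:.1f} km/s, Mach ~ {mach:.1f}")  # → 19.4 km/s, Mach ~ 1.9
```

A Mach number of order unity is consistent with the “low Mach number” in the paper’s title.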

Original Story Source: CSIRO Astronomy and Space Science News Release. For Further Reading: Low Mach number turbulence in interstellar gas revealed by radio polarization gradients.

Uncloaking Type Ia Supernovae

This three-color composite of a portion of the Subaru Deep Field shows mostly galaxies with a few stars. The inset shows one of the 10 most distant and ancient Type Ia supernovae discovered by the American, Israeli and Japanese team.

Type Ia supernovae… Right now they are among the most studied – and most mysterious – of all stellar phenomena. Their origins remain conjecture, but explaining them is only half the story: they also offer a look back to almost the very beginnings of our Universe. A team of Japanese, Israeli, and U.S. astronomers has employed the Subaru Telescope to give us the most up-to-date information on these elementally explosive cosmic players.

By understanding the energy release of a Type Ia supernova, astronomers have been able to measure unfathomable distances and speculate on dark energy expansion. It was popular opinion that what caused them was a white dwarf star pulling in so much matter from a companion that it finally exploded, but new research points in a different direction. According to the latest buzz, it may very well be the merging of two white dwarfs.

“The nature of these events themselves is poorly understood, and there is a fierce debate about how these explosions ignite,” said Dovi Poznanski, one of the main authors of the paper and a post-doctoral fellow at the University of California, Berkeley, and Lawrence Berkeley National Laboratory.

“The main goal of this survey was to measure the statistics of a large population of supernovae at a very early time, to get a look at the possible star systems,” he said. “Two white dwarfs merging can explain well what we are seeing.”

Can you imagine the power behind this theory? A Type Ia unleashes a thermonuclear explosion so bright that it can be detected nearly all the way back to the beginning of the expansion after the Big Bang. By employing the Subaru Telescope and its prime-focus camera (Suprime-Cam), the team imaged a small area named the Subaru Deep Field on four separate occasions. In those images they caught 150,000 individual galaxies and, among them, a total of 40 Type Ia supernova events. One of the most incredible findings is that these events happened about five times more frequently in the early Universe. But no worries… Even though the mechanics behind them are still poorly understood, they still serve as “cosmic distance markers”.

“As long as Type Ias explode in the same way, no matter what their origin, their intrinsic brightnesses should be the same, and the distance calibrations would remain unchanged,” says Alex Filippenko, UC Berkeley professor of astronomy.

Original Story Source: University of California, Berkeley News Release. For Further Reading: National Astronomical Observatory of Japan: Subaru News Release.

Accelerating Expansion of Universe Discovery Wins 2011 Nobel Prize in Physics

The accelerating, expanding Universe. Credit: NASA/WMAP


Three scientists shared the 2011 Nobel Prize for physics for the discovery that the expansion of the universe is speeding up, the Nobel prize committee announced today. Half of the $1.5 million prize went to American Saul Perlmutter and the rest to two members of a second team which conducted similar work: American Adam Riess and U.S.-born Brian Schmidt, who is based in Australia. All three made the discovery through observations of distant supernovae.

Perlmutter is from the Lawrence Berkeley National Laboratory and the University of California, Berkeley, and worked on the Supernova Cosmology Project. Schmidt is from the Australian National University and Riess is from the Johns Hopkins University and the Space Telescope Science Institute in Baltimore; Schmidt and Riess worked together on the High-z Supernova Search Team.

In response to the announcement, Professor Sir Peter Knight, President of the Institute of Physics, said, “The recipients of today’s award are at the frontier of modern astrophysics and have triggered an enormous amount of research on dark energy.

“These researchers have opened our eyes to the true nature of our Universe. They are very well-deserved recipients.”

Source: IOP

New Simulation Shows How the Universe Evolved

Bolshoi Simulation

How has the universe evolved over time? A new supercomputer simulation has provided what scientists say is the most accurate and detailed large cosmological simulation of the evolution of the large-scale structure of the universe. Called the Bolshoi simulation, it gives physicists and astronomers a powerful new tool for understanding cosmic mysteries such as galaxy formation, dark matter, and dark energy.

If the simulation is right, it is showing that the standard cosmological model is fairly spot-on.

Dark Energy Ignited By Gamma-Ray Bursts?

An artistic image of the explosion of a star leading to a gamma-ray burst. (Source: FUW/Tentaris/Maciej Frołow)


Dark energy… We’re still not exactly sure of what it is or where it comes from. Is it possible this mysterious force is what’s driving the expansion of the Universe? A group of astronomers from the universities in Warsaw and Naples, headed by Dr. Ester Piedipalumbo, are taking a closer look at a way to measure this energetic enigma and they’re doing it with one of the most intense sources they can find – gamma-ray bursts.

“We are able to determine the distance of an explosion on the basis of the properties of the radiation emitted during gamma-ray bursts. Given that some of these explosions are related to the most remote objects in space that we know about, we are able, for the first time, to assess the speed of space-time expansion even in the relatively early periods after the Big Bang,” says Prof. Marek Demianski (FUW).

What spawned this new method? In 1998, astronomers measuring the energy given off by Type Ia supernova events realized that the energy released was consistent from event to event. Like any standard candle, that consistent output could be used to determine cosmic distances. But there was just one caveat… The more remote the event, the weaker the signature.

While these faint events weren’t lighting up the night, they were lighting up the way science thought about things. Perhaps these Type Ia supernovae were farther away than surmised… and if that were true, then instead of slowing down, the expansion of the Universe might be accelerating! To set the cosmological model to rights, a new form of mass-energy needed to be introduced – dark energy – and there needed to be twenty times more of it than the matter and energy we could perceive. “Overnight, dark energy became, quite literally, the greatest mystery of the Universe,” says Prof. Demianski. In one model, going back to Einstein, it is the cosmological constant, a property of space-time itself; another model suggests the accelerated expansion is caused by some unknown scalar field. “In other words, it is either-or: either space-time expands by itself or is expanded by a scalar physical field inside it,” says Prof. Demianski.

So what’s the point behind the studies? If it is possible to use gamma-ray bursts as a type of standard candle, then astronomers can better assess the density of dark energy, allowing them to further refine their models. If that density stays constant over cosmic time, dark energy belongs to the cosmological constant and is a property of space-time. However, if the acceleration of the Universe is the property of a scalar field, the density of dark energy should change as the Universe expands. “This used to be a problem. In order to assess the changes in the density of dark energy immediately after the Big Bang, one needs to know how to measure the distance to very remote objects. So remote that even Type Ia supernovae connected to them are too faint to be observed,” says Demianski.

Now the real research begins. Gamma-ray bursts needed to have their energy levels measured and to do that accurately meant looking at previous studies which contained verified sources of distance, such as Type Ia supernovae. “We focused on those instances. We knew the distance to the galaxy and we also knew how much energy of the burst reached the Earth. This allowed us to calibrate the burst, that is to say, to calculate the total energy of the explosion,” explains Prof. Demianski. Then the next step was to find statistical dependencies between various properties of the radiation emitted during a gamma-ray burst and the total energy of the explosion. Such relations were discovered. “We cannot provide a physical explanation of why certain properties of gamma-ray bursts are correlated,” points out Prof. Demianski. “But we can say that if registered radiation has such and such properties, then the burst had such and such energy. This allows us to use bursts as standard candles, to measure distances.”
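The standard-candle logic here can be sketched numerically. Assuming (for illustration only) that a correlation has pinned down a burst’s total isotropic energy, the inverse-square law turns the observed fluence into a distance. The numbers below are made up, and real analyses include the redshift and k-corrections this sketch ignores.

```python
import math

LY_IN_M = 9.461e15  # metres per light-year

def distance_from_calibrated_energy(e_iso_joules, fluence_j_per_m2):
    """If a calibrated correlation gives a burst's total (isotropic)
    energy E, the observed fluence S returns the distance through the
    inverse-square law, E = 4*pi*d**2 * S. Redshift and k-corrections,
    which real analyses need, are deliberately ignored here."""
    return math.sqrt(e_iso_joules / (4.0 * math.pi * fluence_j_per_m2))

# Purely illustrative numbers, not from the Warsaw/Naples study:
d_m = distance_from_calibrated_energy(1e46, 1e-6)
print(f"{d_m / LY_IN_M:.1e} light-years")
```

The crucial step, as Demianski describes, is the calibration: without an independently known distance for some bursts, the total energy on the left-hand side could never be fixed.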

Dr. Ester Piedipalumbo and a team of researchers from the universities in Warsaw and Naples then took up the gauntlet. Despite this fascinating new concept, the reality is that suitably distant gamma-ray bursts are rare. Even with 95 candidates listed in the Amati catalogue, there simply wasn’t enough information to pinpoint the nature of dark energy. “It is quite a disappointment. But what is important is the fact that we have in our hands a tool for verifying hypotheses about the structure of the Universe. All we need to do now is wait for the next cosmic fireworks,” concludes Prof. Demianski.

Let the games begin…

Original Story Source: University of Warsaw Press Release. For Further Reading: Cosmological models in scalar tensor theories of gravity and observations: a class of general solutions.

Astronomy Without A Telescope – The Edge Of Significance

A two hemisphere spherical mapping of the cosmic microwave background. Credit: WMAP/NASA.


Some recent work on Type 1a supernovae velocities suggests that the universe may not be as isotropic as our current standard model (LambdaCDM) requires it to be.

The standard model requires the universe to be isotropic and homogeneous – meaning it can be assumed to have the same underlying structure and principles operating throughout and it looks measurably the same in every direction. Any significant variation from this assumption means the standard model can’t adequately describe the current universe or its evolution. So any challenge to the assumption of isotropy and homogeneity, also known as the cosmological principle, is big news.

Of course since you are hearing about such a paradigm-shifting finding within this humble column, rather than as a lead article in Nature, you can safely assume that the science is not quite bedded down yet. The Union2 data set of 557 Type 1a supernovae, released in 2010, is allegedly the source of this latest challenge to the cosmological principle – even though the data set was released with the unequivocal statement that the flat concordance LambdaCDM model remains an excellent fit to the Union2 data.

Anyhow, in 2010 Antoniou and Perivolaropoulos ran a hemisphere comparison – essentially comparing supernova velocities in the northern hemisphere of the sky with those in the southern hemisphere. These hemispheres were defined using galactic coordinates, in which the plane of the Milky Way is set as the equator and the Sun, which lies more or less in that plane, is the origin.

The galactic coordinate system. Credit: thinkastronomy.com

Antoniou and Perivolaropoulos’ analysis determined a preferred axis of anisotropy – with more supernovae showing higher than average velocities towards a point in the northern hemisphere (within the same ranges of redshift). This suggests that a part of the northern sky represents a part of the universe that is expanding outwards with a greater acceleration than elsewhere. If correct, this means the universe is neither isotropic nor homogeneous.

However, they note that their statistical analysis does not necessarily correspond with statistically significant anisotropy, and they then seek to strengthen their finding by appealing to other anomalies in cosmic microwave background data which also show anisotropic tendencies. So this seems to be a case of looking at a number of unrelated findings with common trends – findings that in isolation are not statistically significant – and then arguing that, put together, they somehow achieve a consolidated significance they did not possess in isolation.

More recently, Cai and Tuo ran much the same hemispherical analysis and, not surprisingly, got much the same result. They then tested whether these data favoured one dark energy model over another – which they didn’t. Nonetheless, on the strength of this, Cai and Tuo gained a write up in the Physics Arxiv blog under the heading More Evidence for a Preferred Direction in Spacetime – which seems a bit of a stretch since it’s really just the same evidence that has been separately analysed for another purpose.

It’s reasonable to doubt that anything has been definitively resolved at this point. The weight of current evidence still favours an isotropic and homogeneous universe. And while there’s no harm in mucking about at the edge of statistical significance with whatever limited data are available, such fringe findings may be quickly washed away when new data come in – e.g. more Type 1a supernova velocity measures from a new sky survey, or a higher-resolution view of the cosmic microwave background from the Planck spacecraft. Stay tuned.
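For the curious, the hemisphere-comparison idea is easy to mock up. The sketch below is not the authors’ pipeline: it draws isotropic synthetic data, scans random trial axes, and reports the largest north-minus-south difference in mean residuals.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a Union2-style data set: a sky position (unit
# vector) and a residual (e.g. a distance-modulus residual) per supernova.
# The residuals are isotropic by construction.
n = 557
vecs = rng.normal(size=(n, 3))
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
residuals = rng.normal(0.0, 0.2, size=n)

def hemisphere_asymmetry(axis, vecs, residuals):
    """Mean residual in the hemisphere toward `axis` minus the mean
    residual in the opposite hemisphere."""
    toward = vecs @ axis > 0
    return residuals[toward].mean() - residuals[~toward].mean()

# Scan random trial axes and keep the largest asymmetry found.
axes = rng.normal(size=(200, 3))
axes /= np.linalg.norm(axes, axis=1, keepdims=True)
best = max(hemisphere_asymmetry(a, vecs, residuals) for a in axes)
print(f"largest asymmetry in isotropic data: {best:.3f}")
```

Even data with no built-in anisotropy yields a nonzero “preferred axis” this way, which is exactly why the significance of such an axis has to be tested rather than assumed.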

Further reading:
– Antoniou and Perivolaropoulos. Searching for a Cosmological Preferred Axis: Union2 Data Analysis and Comparison with Other Probes.
– Cai and Tuo. Direction Dependence of the Deceleration Parameter.

Astronomy Without A Telescope – Cosmic Coincidence


Cosmologists tend not to get all that excited about the universe being 74% dark energy and 26% conventional energy and matter (albeit most of the matter is dark and mysterious as well). Instead they get excited about the fact that the density of dark energy is of the same order of magnitude as that more conventional remainder.

After all, it is quite conceivable that the density of dark energy might be ten, one hundred or even one thousand times more (or less) than the remainder. But nope, it seems it’s about three times as much – which is less than ten and more than one, meaning that the two parts are of the same order of magnitude. And given the various uncertainties and error bars involved, you might even say the density of dark energy and of the more conventional remainder are roughly equivalent. This is what is known as the cosmic coincidence.
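The arithmetic behind the coincidence is small enough to write down, using the percentages quoted above:

```python
import math

# The quoted split: 74% dark energy vs 26% matter and other energy.
dark_energy, remainder = 0.74, 0.26
ratio = dark_energy / remainder
# "Same order of magnitude" means the ratio lies between 0.1 and 10,
# i.e. its base-10 logarithm rounds toward zero.
same_order = -1 < math.log10(ratio) < 1
print(round(ratio, 2), same_order)  # → 2.85 True
```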

To a cosmologist, particularly a philosophically-inclined cosmologist, this coincidence is intriguing and raises all sorts of ideas about why it is so. However, Lineweaver and Egan suggest this is actually the natural experience of any intelligent beings/observers across the universe, since their evolution will always roughly align with the point in time at which the cosmic coincidence is achieved.

A current view of the universe describes its development through the following steps:

Inflationary era – a huge whoomp of volume growth driven by something or other. This is a very quick era, lasting from 10⁻³⁵ to 10⁻³² seconds after the Big Bang.
Radiation dominated era – the universe continues expanding, but at a less furious rate. Its contents cool as their density declines. Hadrons begin to condense out of the hot quark-gluon soup while dark matter forms out of whatever it forms out of – all steadily adding matter to the universe, although radiation still dominates. This era lasts for maybe 50,000 years.
Matter dominated era – this era begins when the density of matter exceeds the density of radiation and continues through to the release of the cosmic microwave background radiation at 380,000 years, when the first atoms formed – and then continues on for a further 5 billion years. Throughout this era, the energy/matter density of the whole universe continues to gravitationally restrain the rate of expansion of the universe, even though expansion does continue.
Cosmological constant dominated era – from 5 billion years to now (13.7 billion) and presumably for all of hereafter, the energy/matter density of the universe is so diluted that it begins losing its capacity to restrain the expansion of universe – which hence accelerates. Empty voids of space grow ever larger between local clusters of gravitationally-concentrated matter.
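The handovers between those eras follow from how each component’s density scales with the cosmic scale factor. A minimal sketch, using illustrative round numbers for today’s density fractions (not precision cosmology):

```python
# Each component's energy density scales differently with the scale
# factor a (a = 1 today): radiation ~ a**-4, matter ~ a**-3, and a
# cosmological constant stays fixed. Present-day fractions below are
# illustrative round numbers.
omega_r, omega_m, omega_l = 8e-5, 0.26, 0.74

# Radiation-matter equality: omega_r * a**-4 == omega_m * a**-3
a_eq = omega_r / omega_m
# Matter-Lambda equality: omega_m * a**-3 == omega_l
a_lam = (omega_m / omega_l) ** (1 / 3)

print(f"radiation gives way to matter at a ~ {a_eq:.1e}")
print(f"matter gives way to Lambda at a ~ {a_lam:.2f}")
```

The second crossover lands at a scale factor of roughly 0.7, i.e. a few billion years ago, matching the start of the cosmological-constant-dominated era above.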

And here we are. Lineweaver and Egan propose that it is unlikely that any intelligent life could have evolved in the universe much earlier than now (give or take a couple of billion years) since you need to progressively cycle through the star formation and destruction of Population III, II and then I stars to fill the universe with sufficient ‘metals’ to allow planets with evolutionary ecosystems to develop.

The four eras of the universe mapped over a logarithmic time scale. Note that "Now" occurs as the decline in matter density and the acceleration in cosmic expansion cross over. Credit: Lineweaver and Egan.

So any intelligent observer in this universe is likely to find the same data which underlie the phenomenon we call the cosmological coincidence. Whether any aliens describe their finding as a ‘coincidence’ may depend upon what mathematical model they have developed to formulate the cosmos. It’s unlikely to be the same one we are currently running with – full of baffling ‘dark’ components, notably a mysterious energy that behaves nothing like energy.

It might be enough for them to note that their observations have been taken at a time when the universe’s contents no longer have sufficient density to restrain the universe’s inherent tendency to expand – and so it expands at a steadily increasing rate.

Further reading: Lineweaver and Egan. The Cosmic Coincidence as a Temporal Selection Effect Produced by the Age Distribution of Terrestrial Planets in the Universe (subsequently published in Astrophysical Journal 2007, Vol 671, 853.)

Q&A with Brian Cox, part 1: Recent Hints of the Higgs

Brian Cox at CERN with Kevin Eldon and Simon Munnery. Photo by Gia Milinovich, courtesy Brian Cox

[/caption]

At two separate conferences in July, particle physicists announced some provocative news about the Higgs boson, and while the Higgs has not yet been found, physicists are continuing to zero in on the elusive particle. Universe Today had the chance to talk with Professor Brian Cox about these latest findings, and he says that within six to twelve months, physicists should be able to make a definite statement about the existence of the Higgs particle. Cox is the Chair in Particle Physics at the University of Manchester, and works on the ATLAS experiment (A Toroidal LHC ApparatuS) at the Large Hadron Collider at CERN. But he’s also active in the popularization of science, specifically with his new television series and companion book, Wonders of the Universe, a follow up to the 2010 Peabody Award-winning series, Wonders of the Solar System.

Universe Today readers will have a chance to win a copy of the book, so stay tuned for more information on that. But today, enjoy the first of a three-part interview with Cox:


Universe Today: Can you tell us about your work with ATLAS and its potential for finding things like extra dimensions, the unification of forces or dark matter?

Brian Cox, during the filming of one of his television series. Image courtesy Brian Cox.

Brian Cox: The big question is the origin of mass in the universe. It is very, very important because it is not an end in itself. It is a fundamental part of Quantum Field Theory, which is our theory of three of the four forces of nature. So if you ask the question on the most basic level of how does the universe work, there are only two pillars of our understanding at the moment. There is Einstein’s Theory of General Relativity, which deals with gravity – the weakest force in the Universe – and with the shape of space and time and all those things. But everything else – electromagnetism, the way atomic nuclei work, the way molecules work, chemistry, all that – everything else is what’s called a Quantum Field Theory. Embedded in that is what’s called the Standard Model of particle physics. And embedded in that is this mechanism for generating mass, and it’s just so fundamental. It’s not just kind of an interesting add-on, it’s right at the heart of the way the theory works.

So, understanding whether our current picture of the Universe is right — and if there is this thing called the Higgs mechanism or whether there is something else going on — is critical to our progress because it is built into that picture. There are hints in the data recently that maybe that mechanism is right. We have to be careful. It’s not a very scientific thing to say that we have hints. We have these thresholds for scientific discovery, and we have them for a reason, because you get these statistical flukes that appear in the data and when you get more data they go away again.

The statement from CERN now is that if they turn out to be more than just fluctuations, really, within six months we should be able to make some definite statement about the existence of the Higgs particle.

I think it is very important to emphasize that this is not just a lot of particle physicists looking for particles because that’s their job. It is the fundamental part of our understanding of three of the four forces of nature.

Brian Cox at Fermilab. Photo by Paul Olding.

UT : So these very interesting results from CERN and the Tevatron at Fermilab giving us hints about the Higgs, could you can talk little bit more about that and your take on the latest findings?

COX: The latest results were presented at a set of conferences a few weeks ago and they are just under what is called the three-sigma level. That is the way of assessing how significant the results are. The thing about quantum theory, and particle physics in general, is that it is all statistical. If you do this a thousand times, then three times this should happen, and eight times that should happen. So it’s all statistics. As you know, if you toss a coin, it can come up heads ten times; there is a probability for that to happen. It doesn’t mean the coin is weighted or there’s something wrong with it. That’s just how statistics works.

So there are intriguing hints that they have found something interesting. Both experiments at the Large Hadron Collider, ATLAS and the Compact Muon Solenoid (CMS), recently reported “excess events,” where there were more events than would be expected if the Higgs does not exist. It is about the right mass: we think the Higgs particle should be somewhere between about 120 and 150 gigaelectron volts [GeV – a unit of energy that is also a unit of mass, via E = mc², where the speed of light, c, is set to a value of one]. These hints are around 140 GeV, so that’s good, it’s where it should be, and it is behaving in the way the theory predicts. The theory also predicts how it should decay away, and what the probabilities should be, so all the data are consistent with the so-called Standard Model Higgs.

But so far, these events are not statistically significant enough to make the call. It is important that the Tevatron has glimpsed it as well, but with even lower significance, because the Tevatron runs at lower energy and has recorded fewer collisions. So you’ve got to be scientific about things. There is a reason we have these barriers: these thresholds have to be cleared to claim a discovery. And we haven’t cleared them yet.
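Cox’s coin-toss point, and the sigma thresholds he mentions, can be made concrete with a couple of lines of arithmetic; the event counts below are hypothetical, not real ATLAS or CMS numbers.

```python
import math

# Cox's coin example: ten heads in a row needs no weighted coin.
p_ten_heads = 0.5 ** 10
print(p_ten_heads)  # → 0.0009765625

# A crude counting-experiment significance, in standard deviations:
# z = (observed - expected background) / sqrt(expected background).
# The counts are hypothetical, not real ATLAS/CMS numbers.
def excess_significance(observed, background):
    return (observed - background) / math.sqrt(background)

z = excess_significance(130, 100)
print(f"z = {z:.1f} sigma")  # → z = 3.0 sigma
```

A three-sigma excess like this is a “hint”; particle physics reserves the word “discovery” for five sigma, precisely because flukes at the three-sigma level come and go as more data arrive.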

But it is fascinating. It’s the first time one of these rumors has been, you know, not just nonsense. It really is a genuine piece of exciting physics. But you have to be scientific about these things. It’s not that we know it is there and we’re just not going to announce it yet; it’s that the statistics aren’t there yet to claim the discovery.

Brian Cox, while filming a BBC series in the Sahara. Image courtesy Brian Cox

UT : Well, my next question was going to be, what happens next? But maybe you can’t really answer that because all you can do is keep doing the research!

COX: The thing about the Higgs, it is so fundamentally embedded in quantum theory. You’ve got to explore it because it is one thing to see a hint of a new particle, but it’s another thing to understand how that particle behaves. There are lots of different ways the Higgs particles can behave and there are lots of different mechanisms.

There is a very popular theory called supersymmetry which would also explain dark matter, one of the great mysteries in astrophysics. There seems to be a lot of extra stuff in the Universe that is not behaving the way the particles of matter we know of behave – about five times more “stuff” than what makes up everything we can see in the Universe. We can’t see dark matter, but we see its gravitational influence. In supersymmetric theories we have a very strong candidate for it: a new kind of particle called a supersymmetric particle. Those theories have five Higgs particles rather than one. So the next question is, if that is a Higgs-like particle we’ve discovered, then what is it? How does it behave? How does it talk to the other particles?

And then there are a huge number of questions. The Higgs theory as it stands doesn’t explain why the particles have the masses they do. It doesn’t explain why the top quark, which is the heaviest of the fundamental particles, is something like 180 times heavier than the proton. It’s a tiny point-like thing with no size, but it’s 180 times the mass of a proton! That is heavier than some of the heaviest atomic nuclei!

Why? We don’t know.

I think it is correct to say there is a door that needs to be opened that has been closed in our understanding of the Universe for decades. It is so fundamental that we’ve got to open it before we can start answering these further questions, which are equally intriguing but we need this answered first.

UT: When we do get some of these questions answered, how is that going to change our outlook and the way that we do things, or perhaps the way YOU do things, anyway! Maybe not us regular folks…

COX: Well, I think it will – because this is part of THE fundamental theory of the forces of nature. So quantum theory in the past has given us an understanding, for example, of the way semiconductors work, and it underpins our understanding of modern technology, and the way chemistry works, the way that biological systems work – it’s all there. This is the theory that describes it all. I think having a radical shift and deepening in understanding of the basic laws of nature will change the way that physics proceeds in the 21st century, without a doubt. It is that fundamental. So, who knows? At every paradigm shift in science, you never really could predict what it was going to do; but the history of science tells you that it did something quite remarkable.

There is a famous quote from Alexander Fleming, who discovered penicillin: he said that when he woke up on a certain September morning in 1928, he certainly didn’t expect to revolutionize modern medicine by discovering the world’s first antibiotic. He said that in hindsight; at the time he had just discovered some mold, basically, but there it was.

But it was fundamental and that is the thing to emphasize.

Some of our theories, you look at them and wonder how we ever worked them out! The answer is mathematically – the same way that Einstein came up with General Relativity, through mathematical predictions. It is remarkable that we’ve been able to predict something so fundamental about the way that empty space behaves. We might turn out to be right.

Tomorrow: Part 2: The space exploration and hopes for the future

Find out more about Brian Cox at his website, Apollo’s Children

Astronomy Without A Telescope – A Photon’s Point Of View

What would you see at the speed of light?


From a photon’s point of view, it is emitted and then instantaneously reabsorbed. This is true for a photon emitted in the core of the Sun, which might be reabsorbed after crossing a fraction of a millimetre’s distance. And it is equally true for a photon that, from our point of view, has travelled for over 13 billion years after being emitted from the surface of one of the universe’s first stars.

So it seems that not only does a photon not experience the passage of time, it does not experience the passage of distance either. But since you can’t move a massless consciousness at the speed of light in a vacuum, the real point of this thought experiment is to indicate that time and distance are just two apparently different aspects of the same thing.

If we attempt to approach the speed of light, our clocks will slow relative to our point of origin and we will arrive at our destination quicker than we anticipate we should – as though both the travel time and the distance have contracted.
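A minimal sketch of that contraction, using the textbook Lorentz factor (the 10-light-year trip is purely illustrative):

```python
import math

def gamma(beta):
    """Lorentz factor for speed beta = v/c; it diverges as beta -> 1,
    which is why a photon's own travel time and distance go to zero."""
    return 1.0 / math.sqrt(1.0 - beta**2)

trip_ly = 10.0  # illustrative 10-light-year journey
for beta in (0.5, 0.9, 0.99, 0.999):
    t_origin = trip_ly / beta        # years elapsed at the point of origin
    t_ship = t_origin / gamma(beta)  # years elapsed on the traveller's clock
    print(f"v = {beta}c: traveller ages {t_ship:5.2f} yr, origin clock {t_origin:5.2f} yr")
```

As the speed creeps toward c the traveller’s elapsed time shrinks toward zero – the photon’s limiting case described above.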

Similarly, as we approach the surface of a massive object, our clocks will slow relative to a point of higher altitude – and we will arrive at the surface quicker than we might anticipate, as though time and distance contract progressively as we approach the surface.

Again, time and distance are just two aspects of the same thing, space-time, but we struggle to visualise this. We have evolved to see the world in snapshot moments, perhaps because a failure to scan the environment with every step we take might leave us open to attack by a predator.

Science advocates and skeptics say that we should accept the reality of evolution in the same way that we accept the reality of gravity – but actually this is a terrible analogy. Gravity is not real, it’s just our dumbed-down interpretation of space-time curvature.

If you could include the dimension of time in this picture you might get a rough idea of why things appear to accelerate towards a massive object - even though they do not themselves experience any acceleration.

Astronauts moving at a constant velocity through empty space feel weightless. Put a planet in their line of trajectory and they will continue to feel weightless right up until the moment they collide with its surface.

A person on the surface will watch them steadily accelerate from high altitude until that moment of collision. But such doomed astronauts will not themselves experience any such change to their velocity. After all, if they were accelerating, surely they would be pushed back into their seat as a consequence.

Nonetheless, the observer on the planet’s surface is not suffering from an optical illusion when they perceive a falling spacecraft accelerate. It’s just that they fail to acknowledge their particular context of having evolved on the surface of a massive object, where space-time is all scrunched up.

So they see the spacecraft move from an altitude where distance and time (i.e. space-time) is relatively smooth – down to the surface, where space-time (from the point of view of a high altitude observer) is relatively scrunched up. A surface dweller hence perceives that a falling object is experiencing acceleration and wrongly assumes that there must be a force involved.

As for evolution – there are fossils, vestigial organs and mitochondrial DNA. Get real.

Footnote: If you were falling into a black hole you would still not experience acceleration. However, your physical structure would be required to conform to the extremely scrunched up space-time that you move through – and spaghettification would result.
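That footnote, too, yields to a back-of-the-envelope estimate: a Newtonian head-to-toe tidal acceleration of roughly 2GML/r³ across a body of length L, evaluated at the Schwarzschild radius. The numbers below are illustrative, and the Newtonian formula is only indicative this close to a black hole.

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg

def tidal_accel_at_horizon(mass_kg, body_length_m=2.0):
    """Newtonian head-to-toe tidal acceleration, ~2*G*M*L / r**3,
    evaluated at the Schwarzschild radius r_s = 2*G*M / c**2.
    Indicative only: full general relativity applies this close."""
    r_s = 2.0 * G * mass_kg / c**2
    return 2.0 * G * mass_kg * body_length_m / r_s**3

# Stellar-mass vs supermassive: tidal stress at the horizon falls as
# 1/M**2, so spaghettification is ferocious at the small hole and
# gentle at the giant one.
print(f"10 solar masses:  {tidal_accel_at_horizon(10 * M_SUN):.1e} m/s^2")
print(f"4e6 solar masses: {tidal_accel_at_horizon(4e6 * M_SUN):.1e} m/s^2")
```

Because the horizon radius grows in proportion to the mass while the tidal term falls as the cube of the radius, you could cross the horizon of a supermassive black hole intact and only be spaghettified later, deep inside.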