And Now Exo-magnetospheres

An artist’s impression of WASP-12b being slowly consumed as a result of its ridiculously tight orbit around its star. More recent observations suggest the exoplanet has a magnetosphere which may be partially protecting it from stellar wind erosion. Credit: NASA.


New observations of one of the biggest and hottest known exoplanets in the galaxy, WASP-12b, suggest that it is generating a powerful magnetic field, sufficient to divert much of its star’s stellar wind into a bow shock.

Like exoplanets themselves, the discovery of an exo-magnetosphere isn’t that much of a surprise – indeed it would be a surprise if Jovian-type gas giants didn’t have magnetic fields, since the gas giants in our own backyard have quite powerful ones. But, assuming the data for this finding remains valid on further scrutiny, it is a first – even if it is just a confirming-what-everyone-had-suspected-all-along first.

WASP-12 is a Sun-like, G-type yellow star about 870 light years from Earth. The exoplanet WASP-12b orbits it at a distance of only 3.4 million km, with an orbital period of just 26 hours. Compare this with Mercury, which has an orbital period of 88 days and a perihelion distance of 46 million kilometers from the Sun.
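
As a rough sanity check on those numbers – and purely as an illustration, taking the figures quoted above at face value – Kepler’s third law lets you back out the stellar mass implied by a 26 hour orbit at 3.4 million km. A minimal Python sketch:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # one solar mass, kg

# Values quoted above (approximate)
a = 3.4e9          # WASP-12b's orbital distance, metres
P = 26 * 3600.0    # orbital period, seconds

# Kepler's third law: P^2 = 4 * pi^2 * a^3 / (G * M), rearranged for the stellar mass
M_star = 4 * math.pi**2 * a**3 / (G * P**2)

print(f"Implied stellar mass: {M_star:.2e} kg ({M_star / M_SUN:.2f} solar masses)")
```

The answer comes out at roughly 1.3 solar masses – consistent with WASP-12 being a Sun-like star.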

So habitable zone, this ain’t – but a giant among gas giants ploughing through a dense stellar wind of charged particles sounds like an ideal set of circumstances to look for an exo-magnetosphere.

The bow shock was detected as an initial dip in the star’s ultraviolet light output, ahead of the larger dip produced by the transiting planet itself. Given the rapid orbital speed of the planet, some bow wave effect might be expected regardless of whether or not the planet generates a strong magnetic field. But apparently the data from WASP-12b are best fit by a model where the bow shock is produced by a magnetic effect, rather than just a dynamic physical one.

The finding is based on data from the SuperWASP (Wide Angle Search for Planets) project as well as Hubble Space Telescope data. Team leader Dr. Aline Vidotto of the University of St. Andrews said of the new finding: “The location of this bow shock provides us with an exciting new tool to measure the strength of planetary magnetic fields. This is something that presently cannot be done in any other way.”

Although WASP-12b’s magnetic field may be prolonging its life somewhat, by offering some protection from its star’s stellar wind – which might otherwise be blowing away its outer layers – WASP-12b is still doomed due to the gravitational effects of its nearby star, which has already been observed drawing material from the planet. Current estimates are that WASP-12b will be completely consumed in about 10 million years.

WASP-12b is not only one of the hottest hot Jupiters we've found, but also one of the biggest (although this may be largely a result of expansion due to heating).

There is at least one puzzle here, not really testable from such a distance. Presuming that a planet so close to its star is tidally locked, it would not be spinning on its axis – which is generally thought to be a key feature of planets that generate strong magnetic fields, at least the ones in our Solar System. This may need something like an OverwhelminglySuperWASP to investigate further.

Further reading: RAS National Astronomy Meeting 2011 press release.

Astronomy Without A Telescope – Alien Mining

A disk of debris around a star is a likely indicator of planets. A disk of debris with a wildly atypical chemistry could mean aliens.


Recently, some researchers speculated on what types of observational data from distant planetary systems might indicate the presence of an alien civilization. They determined that asteroid mining was likely to be worth looking for – but ended up concluding that most of the effects of such activity would be difficult to distinguish from natural phenomena.

And in any case, aren’t we just anthropomorphizing by assuming that intelligent alien activity will be anything like human activity?

Currently – apart from a radio, or other wavelength, transmission carrying artificial and presumably intelligent content – it’s thought that indicators of the presence of an alien civilization might include:
• Atmospheric pollutants, like chlorofluorocarbons – which, unlike methane or molecular oxygen, are clearly manufactured rather than just biogenically produced;
• Propulsion signatures – remember how the Vulcans detected humanity in First Contact (or at least they decided we were worth visiting after all, despite all the I Love Lucy re-runs);
• Stellar engineering – where a star’s lifetime is artificially extended to maintain the habitable zone of its planetary system; and
• Dyson spheres – or at least their more plausible off-shoots, such as Dyson swarms.

And perhaps add to this list – asteroid mining, which would potentially create a lot of dust and debris around a star on a scale that might be detectable from Earth.

There is a lot of current interest in debris disks around other stars, which are detectable when they are heated by the star they surround and then radiate that heat at infrared and sub-millimeter wavelengths. For mainstream science, debris disk observations may offer another way to detect exoplanets, which might produce clumping patterns in the dust through gravitational resonance. Indeed it may turn out that the presence of a debris disk strongly correlates with the existence of rocky terrestrial planets in that system.

But now going off the mainstream… presuming that we can eventually build up a representative database of debris disk characteristics, including their density, granularity and chemistry derived from photometric and spectroscopic analysis, it might become possible to identify anomalous debris disks that could indicate alien mining activities.

Some recent astronomy pareidolia. Not an alien mining operation on Mercury, but a chunk of solidified ejecta commonly found in the center of many impact craters. Credit: NASA.

For example, we might see a significant deficiency in a characteristic element (say, iron or platinum) because the aliens had extracted these elements – or we might see an unusually fine granularity in the disk because the aliens had ground everything down to fine particles before extracting what they wanted.

But surely it’s equally plausible to propose that if the aliens are technologically advanced enough to undertake asteroid mining, they would also do it with efficient techniques that would not leave any debris behind.

On Earth, gravity makes it easy enough to just blow up big chunks of rock to get at what you want, since all the debris falls back to the ground and you can sort through it later for secondary extraction.

Following this approach with an asteroid would produce a floating debris field that might represent a risk to spacecraft, as well as leaving you without any secondary extraction opportunities. Better to mine under a protective canopy or just send in some self-replicating nanobots, which can separate out an enriched chunk of the desired material and leave the remainder intact.

If you’re going to play the alien card, you might as well go all in.

Further reading: Forgan and Elvis. Extrasolar Asteroid Mining as Forensic Evidence for Extraterrestrial Intelligence.

Some useful tips on asteroid mining can be found here.

Astronomy Without A Telescope – Assumptions

This model assumes the cosmological principle. The LCDM universe is homogeneous and isotropic. Time dilation and redshift z are attributed to a Doppler-like shift in electromagnetic radiation as it travels across expanding space. This model assumes a nearly "flat" spatial geometry. Light traveling in this expanding model moves along null geodesics. Light waves are 'stretched' by the expansion of space as a function of time. The expansion is accelerating due to a vacuum energy or dark energy inherent in empty space. Approximately 73% of the energy density of the present universe is estimated to be dark energy. In addition, a dark matter component is currently estimated to constitute about 23% of the mass-energy density of the universe. The 5% remainder comprises all the matter and energy observed as subatomic particles, chemical elements and electromagnetic radiation; the material of which gas, dust, rocks, planets, stars, galaxies, etc., are made. This model includes a single originating big bang event, or initial singularity, which constitutes an abrupt appearance of expanding space containing radiation. This event was immediately followed by an exponential expansion of space (inflation).


The current standard model of the universe, Lambda-Cold Dark Matter, assumes that the universe is expanding in accordance with the geometrical term Lambda – which represents the cosmological constant used in Einstein’s general relativity. Lambda might be assumed to represent dark energy, a mysterious force driving what we now know to be an accelerating expansion of space-time. Cold dark matter is then assumed to be the scaffolding that underlies the distribution of visible matter at a large scale across the universe.

But to make any reasonable attempt at modelling how the universe is – and how it unfolded in the past and will unfold in the future – we first have to assume that it is roughly the same everywhere.

This is sometimes called the Cosmological Principle, which states that when viewed on a sufficiently large scale, the properties of the Universe are the same for all observers. This captures two concepts – isotropy, which means the universe looks roughly the same in whatever direction you (that is, you specifically) look – and homogeneity, which means the universe looks roughly the same to any observer, wherever they happen to be and wherever they look. Homogeneity is not something we can expect to ever confirm by observation – so we must assume that the part of the universe we can directly observe is a fair and representative sample of the rest of the universe.

An assessment of isotropy is at least theoretically possible down our past light cone. In other words, we look out into the universe and receive historical information about how it behaved in the past. We then assume that those parts of the universe we can observe have continued to behave in a consistent and predictable manner up until the present – even though we can’t confirm whether this is true until more time has passed. But anything outside our light cone is not something we can expect to ever know about, and hence we can only ever assume the universe is homogeneous throughout.

You occupy a position in space-time from which a proportion of the universe can be observed in your past light cone. You can also shine a torch beam forwards towards a proportion of the future universe - knowing that one day that light beam can reach an object that lies in your future light cone. However, you can never know about anything happening right now at a distant position in space - because it lies on the 'hypersurface of the present'. Credit: Aainsqatsi.

Maartens has a go at developing an argument as to why it might be reasonable for us to assume that the universe is homogeneous. Essentially, if the universe we can observe shows a consistent level of isotropy over time, this strongly suggests that our bit of the universe has unfolded in a manner consistent with it being part of a homogeneous universe.

The isotropy of the observable universe is strongly implied if you look out in any direction and find:
• consistent matter distribution;
• consistent bulk velocities of galaxies and galactic clusters moving away from us via universal expansion;
• consistent measurements of angular diameter distance (where objects of the same absolute size look smaller at greater distances – until around redshift 1.5, beyond which they start looking larger again; a toy calculation of this turnover follows the list); and
• consistent gravitational lensing by large scale objects like galactic clusters.
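
To illustrate the angular diameter distance turnover mentioned in that list – as a toy calculation only, assuming a flat Lambda CDM universe with H0 = 70 km/s/Mpc, a matter density parameter of 0.27 and a dark energy density of 0.73 – the Python sketch below numerically integrates the comoving distance, divides by (1 + z) to get the angular diameter distance, and finds where it peaks.

```python
import numpy as np

# Assumed flat Lambda CDM parameters (illustrative only)
H0 = 70.0                  # km/s/Mpc
OMEGA_M, OMEGA_L = 0.27, 0.73
C = 299792.458             # speed of light, km/s
D_H = C / H0               # Hubble distance, Mpc

def comoving_distance(z, steps=2000):
    """D_C = D_H * integral from 0 to z of dz' / E(z')."""
    zs = np.linspace(0.0, z, steps)
    E = np.sqrt(OMEGA_M * (1 + zs)**3 + OMEGA_L)
    return D_H * np.trapz(1.0 / E, zs)

# Angular diameter distance D_A = D_C / (1 + z)
zs = np.linspace(0.05, 5.0, 200)
d_a = [comoving_distance(z) / (1 + z) for z in zs]
print(f"Angular diameter distance peaks near z = {zs[np.argmax(d_a)]:.2f}")
```

With these parameters the peak lands around z ≈ 1.6 – close to the 1.5 quoted above – beyond which objects of the same physical size start to subtend larger angles on the sky.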

These observations support the assumption that both the matter distribution and the underlying space-time geometry of the observable universe are isotropic. If this isotropy holds for all observers, then the universe is consistent with the Friedmann–Lemaître–Robertson–Walker (FLRW) metric. This would mean it is homogeneous, isotropic and connected – so you can travel anywhere (simply connected) – or it might have wormholes (multiply connected), so not only can you travel anywhere, but there are short cuts.

That the observable universe has always been isotropic – and is likely to continue being so into the future – is strongly supported by observations of the cosmic microwave background, which is isotropic down to a fine scale. If this same isotropy is visible to all observers, then it is likely that the universe has been, is, and always will be homogeneous as well.

Finally, Maartens appeals to the Copernican Principle – which says that not only are we not the center of the universe, but our position is largely arbitrary. In other words, the part of the universe we can observe may well be a fair and representative sample of the wider universe.

Further reading: Maartens, Is the universe homogeneous?

Astronomy Without A Telescope – Our Unlikely Solar System

A circumstellar disk of debris around a matured stellar system may indicate that Earth-like planets lie within. LUVOIR will be able to see inside the disk to watch planets forming. Credit: NASA


Recent modeling of Sun-like stars with planetary systems found that a system with four rocky planets and four gas giants in stable orbits – and only a sparsely populated outer belt of planetesimals – has only a 15 to 25% likelihood of developing. While you might be skeptical about the validity of a model that puts our best-known planetary system in the unlikely basket, there may be some truth in this finding.

This modeling has been informed by the current database of known exoplanets and is otherwise based on some prima facie reasonable assumptions. Firstly, it is assumed that gas giants are unable to form within the frost line of a system – a line beyond which hydrogen compounds, like water, methane and ammonia, exist as ice. For our Solar System, this line is about 2.7 astronomical units from the Sun – roughly in the middle of the asteroid belt.

Gas giants are thought to only be able to form this far out because their formation requires a large volume of solid material (in the form of ices), which then becomes the cores of the gas giants. While there may be just as much rocky material like iron, nickel and silicon outside the frost line, these materials are not abundant enough to play a significant role in forming giant planets, and any planetesimals they may form are either gobbled up by the giants or flung out of orbit.

However, within the frost line, rocky materials are the dominant basis for planet forming – since most light gas is blown out of the region by the force of the stellar wind, and other light compounds (such as H2O and CO2) are only retained by accretion within planetesimals forming from heavier materials (such as iron, nickel and silicates). Appreciably sized rocky planets would probably form in these regions within 10-100 million years of the star’s birth.

So, perhaps a little parochially, it is assumed that you start with a system of three regions – an inner terrestrial planet forming region, a gas giant forming region and an outer region of unbound planetesimals, where the star’s gravity is not sufficient to draw material in to engage in further accretion.

From this base, Raymond et al ran a set of 152 variations, from which a number of broad rules emerged. Firstly, it seems that the likelihood of sustaining inner terrestrial planets is very dependent on the stability of the gas giants’ orbits. Frequently, gravitational perturbations amongst the gas giants result in their adopting more eccentric elliptical orbits, which then clear out all the terrestrial planets – or send them crashing into the star. Only 40% of systems retained more than one terrestrial planet, 20% had just one and 40% had lost them all.

The Moon has retained a comprehensive record of the Late Heavy Bombardment from 4.1 to 3.8 billion years ago - resulting from a reconfiguration of the gas giants. As well as clearing out much of the debris disk of the early Solar System, this reconfiguration flung material into the inner solar system to bombard the rocky planets.

Debris disks of hot and cold dust were found to be common phenomena in matured systems which did retain terrestrial planets. In all systems, primordial dust is largely cleared out within the first few hundred million years – by radiation or by planets. But, where terrestrial planets are retained, there is a replenishment of this dust – presumably via collisional grinding of rocky planetesimals.

This finding is reflected in the paper’s title Debris disks as signposts of terrestrial planet formation. If this modeling work is an accurate reflection of reality, then debris disks are common in systems with stable gas giants – and hence persisting terrestrial planets – but are absent from systems with highly eccentric gas giant orbits, where the terrestrial planets have been cleared out.

Nonetheless, the Solar System appears unusual in this schema. It is proposed that the perturbations within our gas giants’ orbits leading to the Late Heavy Bombardment were indeed late with respect to how other systems usually behave. This has left us with an unusually high number of terrestrial planets, which had formed before the gas giant reconfiguration began. And the lateness of the event, coming after all the collisions that built the terrestrial planets were finished, cleared out most of the debris disk that might otherwise have remained – apart from that faint hint of Zodiacal light that you might notice in a dark sky after sunset or before dawn.

Further reading: Raymond et al Debris disks as signposts of terrestrial planet formation.

Astronomy Without A Telescope – Our Inferred Universe

A galaxy far, far away - long, long ago. UDFy-38135539 is the most distant confirmed object yet observed, where UDF stands for (Hubble) Ultra-Deep Field.


The universe is a big place – and getting bigger all the time – so at a large scale all unbound structures are moving away from each other. So when we look out at distant objects, we need to remind ourselves that not only are we seeing them as they appeared in the past, when the light that hits our eyes first left them, but also that they are no longer in the location where they appear to be.

This issue reaches an extreme when we consider observations of the first luminous stars and galaxies – with the galaxy UDFy-38135539 currently holding the record as the most distant object observed and one of the youngest, existing 13.1 billion years ago – although UDFj-39546284 may be the next contender at 13.2 billion years old, subject to further spectroscopic confirmation.

UDFy-38135539 has a redshift (z) of 8.55 and provides no measurable light at visible wavelengths. Although light from it took 13.1 billion years to reach us, it is not correct to say that it is 13.1 billion light years away. In that intervening period, it and our own location have moved much further away from each other.

So not only is it now further away than it appears, but when the light that we see now was first emitted, it and the location that we now occupy were much closer together than 13.1 billion light years. For this reason it appears larger, but much dimmer than it would appear in a static universe – where it might genuinely be 13.1 billion light years away.

So we need to express UDFy-38135539’s distance as a comoving distance (calculated from its redshift and the assumed expansion rate of the universe). This represents the proper distance between us and it – as if a tape measure could right now be instantaneously laid down between us and it.

This distance works out to be about 30 billion light years. But we are just guessing that UDFy-38135539 is still there – more likely it has merged with other young galaxies – perhaps becoming part of a huge spiral galaxy similar to our own Milky Way, which itself contains stars that are over 13 billion years old.
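
To show where a number like 30 billion light years comes from – a sketch only, assuming illustrative flat Lambda CDM parameters (H0 = 70 km/s/Mpc, matter density 0.27, dark energy density 0.73) and a redshift of 8.55 – the standard comoving distance integral can be evaluated numerically in a few lines of Python.

```python
import numpy as np

# Assumed flat Lambda CDM parameters (illustrative only)
H0 = 70.0                 # km/s/Mpc
OMEGA_M, OMEGA_L = 0.27, 0.73
C = 299792.458            # speed of light, km/s
MPC_TO_GLY = 3.2616e-3    # 1 Mpc is about 3.26 million light years

def comoving_distance_mpc(z, steps=5000):
    """D_C = (c / H0) * integral from 0 to z of dz' / E(z')."""
    zs = np.linspace(0.0, z, steps)
    E = np.sqrt(OMEGA_M * (1 + zs)**3 + OMEGA_L)
    return (C / H0) * np.trapz(1.0 / E, zs)

d_c = comoving_distance_mpc(8.55)   # redshift of UDFy-38135539
print(f"Comoving distance: {d_c:.0f} Mpc, or {d_c * MPC_TO_GLY:.1f} billion light years")
```

This lands at a bit over 30 billion light years – the exact figure shifts a little depending on which cosmological parameters you plug in.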

The observable - or inferred - universe. Even this may just be a small component of the whole ball game. At this scale, our immediate galactic neighborhood, the Virgo Supercluster, is too small to be seen. And it is extremely unlikely that it represents the center of the universe. Credit: Azcolvin429.

It is generally said that the comoving distance to the particles that emitted the cosmic microwave background is about 45.7 billion light years – even though the photons those particles emitted have only been traveling for almost 13.7 billion years. Similarly, by inference, the absolute edge of the observable universe is 46.6 billion light years away.

However, you can’t conclude that this is the actual size of the universe – nor should you conclude that the cosmic microwave background has a distant origin. Your coffee cup may contain particles that originally emitted the cosmic microwave background – and the photons they emitted may be 45.7 billion light years away now – perhaps just now being collected by alien astronomers who will hence have their own 46.6 billion light year radius universe to infer – most of which they can’t directly observe either.

All universal residents have to infer the scale of the universe from the age of the photons that reach them and the other information those photons carry. And this will always be historical information.

From Earth we can’t expect to ever come to know about anything that is happening right now in objects that are more distant than a comoving distance of around 16 billion light years, being the cosmic event horizon (equivalent to a redshift of around z = 1.8).

This is because those objects are right now receding from us faster than the speed of light, even though we may continue receiving updated historical data about them for many billions of years to come – until they become so redshifted as to appear to wink out of existence.
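
To put a rough number on that ‘faster than the speed of light’ recession – a toy estimate only, using Hubble’s law v = H0 × D with an assumed H0 of 70 km/s/Mpc, and glossing over the subtleties of exactly which distance and velocity are meant – a comoving distance of 16 billion light years converts to a recession speed like this:

```python
# Toy recession-velocity estimate via Hubble's law, v = H0 * D
H0 = 70.0                    # assumed Hubble constant, km/s/Mpc
C = 299792.458               # speed of light, km/s
GLY_TO_MPC = 1e9 / 3.2616e6  # billion light years -> megaparsecs

d = 16 * GLY_TO_MPC          # ~16 billion light years, the cosmic event horizon
v = H0 * d                   # recession speed, km/s

print(f"Recession speed: {v:.0f} km/s, or about {v / C:.2f} c")
```

That works out at a bit more than the speed of light – anything at or beyond that comoving distance is receding superluminally right now.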

Further reading: Davis and Lineweaver. Expanding Confusion: common misconceptions of cosmological horizons and the superluminal expansion of the universe.

Astronomy Without A Telescope – Dark Statistics

The dark flow hypothesis. A region of the observable universe is being influenced by a mysterious something outside the observable universe. Source: universe-review.ca


The hypothetical dark flow seen in the movement of galaxy clusters requires that we can reliably identify a clear statistical correlation in the motion of distant objects which are, in any case, flowing outwards with the expansion of the universe and may also have their own individual (or peculiar) motion arising from gravitational interactions.

For example, although galaxies have a general tendency to rush away from each other as space-time expands between them, the Milky Way and the Andromeda Galaxy are currently on a gravitationally bound collision course.

So, if you are interested in the motion of the universe at a large scale, it’s best to study bulk flow – where you step back from consideration of individual objects and instead look for general tendencies in the motion of large numbers of objects.

In 2008, Kashlinsky et al proposed that very large scale observations of the motion of galaxy clusters indicate a region of aberrant flow – inconsistent with the general motion and velocity expected from the expansion of the universe, and not accounted for by localized gravitational interactions.

On the basis of such findings, Kashlinsky has proposed that inhomogeneities in the early universe may have existed prior to cosmic inflation – which would represent a violation of the currently favored standard model for the evolution of the universe, known as the Lambda Cold Dark Matter (Lambda CDM) model.

The aberrant bulk flow might result from the existence of a large concentration of mass beyond the edge of the observable universe – or heck, maybe it is another adjacent universe. Since the cause is unknown – and perhaps unknowable, if the cause is beyond our observable horizon – the astronomical interrobang ‘dark’ is invoked – giving us the term ‘dark flow’.

To be fair, a lot of the more ‘out there’ suggestions to account for these data are made by commentators on Kashlinsky’s work, rather than by Kashlinsky and fellow researchers themselves – and that includes use of the term dark flow. Nonetheless, if the Kashlinsky data isn’t rock solid, all this wild speculation becomes a little redundant – and Occam’s razor suggests we should continue assuming that the universe is best explained by the current standard Lambda CDM model.

The apparent aberrant 'dark flow' (between the constellations of Centaurus and Vela) is alleged to show up in both close and distant galaxy clusters - where red is most distant, blue is least distant. This would suggest it is something that has been there since the universe was very young. Credit: Kashlinsky, NASA.

The Kashlinsky interpretation does have its critics. For example, Dai et al have provided a recent assessment of bulk flow based on the individual (peculiar) velocities of Type Ia supernovae.

The Kashlinsky analysis is based on observations of the Sunyaev–Zel’dovich effect – which involves faint distortions in the cosmic microwave background (CMB) resulting from CMB photons interacting with energetic electrons – and these observations are only considered useful for identifying and observing the behavior of very large scale structures such as galaxy clusters. Dai et al instead use specific data points – being standard candle Type Ia supernova observations – and look at the statistical fit of these data to the expected bulk flow of the universe.

So, while Kashlinsky et al say we should ignore the motion of individual units and just look at the bulk flow – Dai et al counter with saying we should look at the motion of individual units and determine how well those data fit an assumed bulk flow.

It turns out that Dai et al find the supernova data can fit the general trend of bulk flow proposed by Kashlinsky – but only in closer (low redshift) regions. More significantly, they are unable to replicate any aberrant velocities. Kashlinsky measured an aberrant bulk flow of more than 600 kilometers a second, while Dai et al found velocities derived from Type Ia supernova observations to best fit a bulk flow of only 188 kilometers a second. This is a close fit with the bulk flow expected from the Lambda CDM model of the expanding universe, which is around 170 kilometers a second.
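
To give a feel for the Dai et al style of analysis – a toy sketch only, not their actual pipeline – the Python below generates mock line-of-sight peculiar velocities for supernovae scattered over the sky and fits a single bulk-flow vector to them by least squares. The fitted amplitude is the kind of number that gets compared against figures like 188 or 600 kilometers a second.

```python
import numpy as np

rng = np.random.default_rng(42)

# Mock supernova sky positions: random unit vectors
n_sne = 200
vecs = rng.normal(size=(n_sne, 3))
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)

# Assume a 'true' bulk flow of ~190 km/s, plus 300 km/s of measurement scatter
u_true = np.array([150.0, 100.0, 50.0])             # km/s
v_los = vecs @ u_true + rng.normal(0, 300, n_sne)   # line-of-sight velocities

# Least-squares fit: each v_los_i is modelled as u . n_i, solve for the 3-vector u
u_fit, *_ = np.linalg.lstsq(vecs, v_los, rcond=None)

print(f"True bulk flow amplitude:   {np.linalg.norm(u_true):.0f} km/s")
print(f"Fitted bulk flow amplitude: {np.linalg.norm(u_fit):.0f} km/s")
```

The real analyses carry much messier error budgets, but the principle is the same – fit one coherent flow to many individually noisy velocities and see how big it needs to be.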

Either way, it’s all down to a statistical analysis of general tendencies. More data would help here.

Further reading: Dai et al. Measuring the cosmological bulk flow using the peculiar velocities of supernovae.

Astronomy Without A Telescope – Doubly Special Relativity

The Large Hadron Collider - destined to deliver fabulous science data, but uncertain if these will include an evidence basis for quantum gravity theories. Credit: CERN.


General relativity, Einstein’s theory of gravity, gives us a useful basis for mathematically modeling the large scale universe – while quantum theory gives us a useful basis for modeling sub-atomic particle physics and the likely small-scale, high-energy-density physics of the early universe – nanoseconds after the Big Bang – which general relativity just models as a singularity and has nothing else to say on the matter.

Quantum gravity theories may have more to say. By extending general relativity into a quantized structure for space-time, maybe we can bridge the gap between small and large scale physics. For example, there’s doubly special relativity.

With conventional special relativity, two different inertial frames of reference may measure the speed of the same object differently. So, if you are on a train and throw a tennis ball forward, you might measure it moving at 10 kilometers an hour. But someone standing on the station platform, watching your train pass by at 60 kilometers an hour, measures the speed of the ball at 60 + 10 – that is, 70 kilometers an hour. Give or take a few nanometers per second, you are both correct.

However, as Einstein pointed out, if you do the same experiment but shine a torch beam forward on the train rather than throwing a ball, both you on the train and the person on the platform measure the torch beam’s speed as the speed of light – without that additional 60 kilometers an hour – and you are both correct.

It works out that, for the person on the platform, the components of speed (distance and time) are changed on the train so that distances are contracted and time is dilated (i.e. clocks run slower). And by the math of Lorentz transformations, these effects become more obvious the faster the train goes. It also turns out that the mass of objects on the train increases as well – although, before anyone asks, the train can’t turn into a black hole even at 99.9999(etc) per cent of the speed of light.
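
A quick way to see why the ball and the torch beam behave differently is the relativistic velocity addition formula, w = (u + v) / (1 + uv/c²), together with the Lorentz factor that governs time dilation and length contraction. A minimal Python sketch of the train example:

```python
import math

C = 299792.458  # speed of light, km/s

def add_velocities(u, v):
    """Relativistic velocity addition: w = (u + v) / (1 + u*v/c^2)."""
    return (u + v) / (1 + u * v / C**2)

def lorentz_factor(v):
    """Gamma = 1 / sqrt(1 - v^2/c^2): time dilation and length contraction factor."""
    return 1.0 / math.sqrt(1 - (v / C)**2)

train = 60 / 3600.0   # 60 km/h expressed in km/s
ball = 10 / 3600.0    # 10 km/h expressed in km/s

print(add_velocities(train, ball) * 3600)  # ~70 km/h, minus an immeasurably tiny correction
print(add_velocities(train, C))            # exactly c - light ignores the train's motion
print(lorentz_factor(train))               # indistinguishable from 1 at everyday speeds
print(lorentz_factor(0.999999 * C))        # ~707 - extreme time dilation and contraction
```

The ‘give or take a few nanometers per second’ above is exactly that tiny uv/c² correction term in the denominator.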

Now, doubly special relativity proposes that not only is the speed of light always the same regardless of your frame of reference, but Planck units of mass and energy are also always the same. This means that relativistic effects (like mass appearing to increase on the train) do not occur at the Planck (i.e. very small) scale – although at larger scales, doubly special relativity should deliver results indistinguishable from conventional special relativity.

The Planck spacecraft - an observatory exploring the universe and named after the founder of quantum theory. Coincidence? Credit: ESA.

Doubly special relativity might also be generalized towards a theory of quantum gravity – which, when extended up from the Planck scale, should deliver results indistinguishable from general relativity.

It turns out that at the Planck scale E = m (since in Planck units c is set to 1), even though at macro scales E = mc². And a Planck mass is 2.17645 × 10⁻⁸ kg – supposedly about the mass of a flea’s egg – with a Schwarzschild radius of around a Planck length – meaning that if you compressed this mass into such a tiny volume, it would become a very small black hole containing one Planck unit of energy.

To put it another way, at the Planck scale, gravity becomes a significant force in quantum physics. Although really, all we are saying is that there is one Planck unit of gravitational force between two Planck masses when they are separated by a Planck length – and by the way, a Planck length is the distance that light moves within one unit of Planck time!

And since one Planck unit of energy (1.22 × 10¹⁹ GeV) is considered the maximal energy of particles – it’s tempting to consider that this represents conditions expected in the Planck epoch, the very first stage of the Big Bang.
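
All the Planck quantities mentioned above follow from just three constants – G, ħ and c. As an illustration (a sketch with rounded constant values, not a definitive calculation), the Python below computes them and checks the claim that a Planck mass has a Schwarzschild radius of around a Planck length.

```python
import math

# Rounded physical constants (SI units)
G = 6.674e-11       # m^3 kg^-1 s^-2
HBAR = 1.0546e-34   # J s
C = 2.9979e8        # m/s
EV = 1.602e-19      # joules per electronvolt

planck_mass = math.sqrt(HBAR * C / G)        # ~2.18e-8 kg
planck_length = math.sqrt(HBAR * G / C**3)   # ~1.6e-35 m
planck_time = planck_length / C              # ~5.4e-44 s
planck_energy = planck_mass * C**2           # ~2e9 J

schwarzschild_radius = 2 * G * planck_mass / C**2   # works out to 2 Planck lengths

print(f"Planck mass:   {planck_mass:.3e} kg")
print(f"Planck length: {planck_length:.3e} m")
print(f"Planck time:   {planck_time:.3e} s")
print(f"Planck energy: {planck_energy / EV / 1e9:.2e} GeV")
print(f"Schwarzschild radius of a Planck mass: {schwarzschild_radius / planck_length:.1f} Planck lengths")
```

The energy comes out at the 1.22 × 10¹⁹ GeV quoted above, and the Schwarzschild radius at two Planck lengths – which is why ‘around a Planck length’ is an order-of-magnitude statement rather than an exact one.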

It all sounds terribly exciting, but this line of thinking has been criticized as just a trick to make the math work better, by removing important information about the physical systems under consideration. You also risk undermining fundamental principles of conventional relativity since, as the paper below outlines, a Planck length can be considered an invariant constant independent of an observer’s frame of reference, while the speed of light becomes variable at very high energy densities.

Nonetheless, since even the Large Hadron Collider is not expected to deliver direct evidence about what may or may not happen at the Planck scale – for now, making the math work better does seem to be the best way forward.

Further reading: Zhang et al. Photon Gas Thermodynamics in Doubly Special Relativity.

Astronomy Without A Telescope – Black Hole Entropy

Black holes - throw something in them and that's the end of the story, right? Well, some physicists can't leave it at that.


An easy way to think about the entropy of black holes is to consider that entropy represents the loss of free energy – that is, energy that is available to do work – from a system. Needless to say, anything you throw into a black hole is no longer available to do any work in the wider universe.

An easy way to think about the second law of thermodynamics (which is the one about entropy) is to consider that heat can’t flow from a colder location to a hotter location – it only flows the other way. As a result, any isolated system should eventually achieve a state of thermal equilibrium. Or if you like, the entropy of an isolated system will tend to increase over time – achieving a maximum value when that system achieves thermal equilibrium.

If you express entropy mathematically – it is a calculable value and one that tends to increase over time. In the seventies, Jacob Bekenstein expressed black hole entropy as a problem for physics. No doubt he could explain it much better than I could, but I think the idea is that if you suddenly transfer a system with a known entropy value past the event horizon of a black hole, it becomes immeasurable – as though its entropy vanishes. This represents a violation of the second law of thermodynamics – since the entropy of a system should at best stay constant – or more often increase – it can’t suddenly plummet like that.

So the best way to handle that is to acknowledge that whatever entropy a system possesses is transferred to the black hole when the system goes into it. This is another reason why black holes can be considered to have a very high entropy.
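
The ‘very high entropy’ point can be made concrete with the Bekenstein–Hawking formula, S = k·A / (4·l_P²), where A is the area of the event horizon and l_P is the Planck length. As an illustration only (a sketch with rounded constants), here it is evaluated for a one solar mass black hole:

```python
import math

# Rounded physical constants (SI units)
G = 6.674e-11      # m^3 kg^-1 s^-2
C = 2.9979e8       # m/s
HBAR = 1.0546e-34  # J s
M_SUN = 1.989e30   # kg

def bh_entropy_in_kb(mass):
    """Bekenstein-Hawking entropy S = A / (4 * l_P^2), in units of Boltzmann's constant."""
    r_s = 2 * G * mass / C**2      # Schwarzschild radius
    area = 4 * math.pi * r_s**2    # event horizon area
    l_p2 = HBAR * G / C**3         # Planck length squared
    return area / (4 * l_p2)

print(f"One solar mass black hole: S ~ {bh_entropy_in_kb(M_SUN):.1e} in units of k_B")
```

That comes out at around 10⁷⁷ units of Boltzmann’s constant – enormously more than the ordinary thermodynamic entropy of the star that collapsed to form it, which is the sense in which black holes soak up the entropy of whatever falls in.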

Then we come to the issue of information. The sentence ‘The quick brown fox jumped over the lazy dog’ is a highly engineered system with a low level of entropy – while drawing out 26 tiles from a scrabble set and laying them down however they come delivers a randomly ordered object with a high level of entropy and uncertainty (to the extent that it could be any of a billion possible variations).

Throw your scrabble tiles into a black hole – they will carry with them whatever entropy value they began with – which is likely to increase further within the black hole. Indeed it’s likely that the tiles will not only become more disorganized but actually be crushed to bits within the black hole.

Now there is a fundamental principle in quantum mechanics which requires that information cannot be destroyed or lost. It’s more about wave functions than about scrabble tiles – but let’s stick with the analogy.

You won’t violate the conservation of information principle by filling a black hole with scrabble tiles. Their information is just transferred to the black hole rather than being lost – and even if the tiles are crushed to bits, the information is still there in some form. This is OK.

But, there is a problem if in a googol or so years, the black hole evaporates via Hawking radiation, which arises from quantum fluctuations at the event horizon and has no apparent causal connection with the contents of the black hole.

The Hawking radiation story. A quantum fluctuation proximal to a black hole's event horizon produces a particle and an antiparticle. The antiparticle enters the black hole and annihilates when it collides with a particle in there. The remaining particle is free to join the rest of the universe outside the event horizon. To an external observer, the black hole appears to have lost mass and radiated a particle. Over time this process would result in the black hole evaporating. To date - good story, evidence nil, but watch this space. Credit: NAU.
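
To put a number on ‘a googol or so years’ – a sketch under standard textbook assumptions of uninterrupted Hawking evaporation, ignoring anything falling in and the warmth of the cosmic microwave background – the Hawking temperature T = ħc³ / (8πGMk) and the usual evaporation-time estimate t ≈ 5120·π·G²·M³ / (ħc⁴) can be evaluated directly:

```python
import math

G = 6.674e-11      # m^3 kg^-1 s^-2
C = 2.9979e8       # m/s
HBAR = 1.0546e-34  # J s
K_B = 1.3807e-23   # J/K
M_SUN = 1.989e30   # kg
YEAR = 3.156e7     # seconds

def hawking_temperature(mass):
    """T = hbar * c^3 / (8 * pi * G * M * k_B), in kelvin."""
    return HBAR * C**3 / (8 * math.pi * G * mass * K_B)

def evaporation_time_years(mass):
    """Standard estimate t = 5120 * pi * G^2 * M^3 / (hbar * c^4), converted to years."""
    return 5120 * math.pi * G**2 * mass**3 / (HBAR * C**4) / YEAR

for m_solar in (1, 1e6, 1e11):
    m = m_solar * M_SUN
    print(f"{m_solar:>6g} solar masses: T ~ {hawking_temperature(m):.1e} K, "
          f"evaporates in ~ {evaporation_time_years(m):.1e} years")
```

On these numbers a stellar mass black hole takes around 10⁶⁷ years to evaporate, and the very largest supermassive black holes get into genuine googol (10¹⁰⁰) year territory.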

A currently favored solution to this problem is the holographic principle – which suggests that whatever enters the black hole leaves an imprint on its event horizon – such that information about the entire contents of the black hole can be derived from just the event horizon ‘surface’ – and any subsequent Hawking radiation is influenced at a quantum level by that information – such that Hawking radiation does succeed in carrying information out of the black hole as the black hole evaporates.

Zhang et al offer another approach, suggesting that Hawking radiation, via quantum tunneling, carries entropy out of the black hole – and since reduced entropy means reduced uncertainty, this represents a net gain of information drawn out of the black hole. So Hawking radiation carries not only entropy, but also information, out of the black hole.
But is this more or less convincing than the hologram idea? Well, that’s uncertain…

Further reading: Zhang et al. An interpretation for the entropy of a black hole.

Astronomy Without A Telescope – Unreasonable Effectiveness

Gravitational waves are apparently devilishly difficult things to model with the Einstein field equations, since they are highly dynamic and non-symmetric. Traditionally, the only way to get close to predicting the likely effects of gravity waves was to estimate the required Einstein equation parameters by assuming the objects causing the gravity waves did not generate strong gravity fields themselves – nor move at velocities anywhere close to the speed of light.

Trouble is, the most likely candidate objects that might generate detectable gravity waves – close binary neutron stars and merging black holes – have exactly those properties. They are highly compact, very massive bodies that often move at relativistic (i.e. close to the speed of light) velocities.

Isn’t it weird, then, that the ‘guesstimate’ approach described above actually works brilliantly in predicting the behaviors of close massive binaries and merging black holes? Hence a recent paper titled: On the unreasonable effectiveness of post-Newtonian approximation in gravitational physics.

So, firstly no-one has yet detected gravity waves. But even in 1916, Einstein considered their existence likely and demonstrated mathematically that gravitational radiation should arise when you replace a spherical mass with a rotating dumbbell of the same mass which, due to its geometry, will generate dynamic ebb and flow effects on space-time as it rotates.

To test Einstein’s theory, it’s necessary to design very sensitive detecting equipment – and to date all such attempts have failed. Further hopes now largely rest on the Laser Interferometer Space Antenna (LISA), which is not expected to launch before 2025.

The proposed Laser Interferometer Space Antenna (LISA) system using laser interferometry to monitor the fluctuations in the relative distances between three spacecraft, arranged in an equilateral triangle with five million kilometer sides. Hopefully, this will be sensitive enough to detect gravity waves. Credit: NASA.

However, as well as sensitive detection equipment like LISA, you also need to calculate what sort of phenomena and what sort of data would represent definitive evidence of a gravity wave – which is where all the theory and math required to determine these expected values is vital.

Initially, theoreticians worked out a post-Newtonian (i.e. Einstein era) approximation (i.e. guesstimate) for a rotating binary system – although it was acknowledged that this approximation would only work effectively for a low mass, low velocity system – where any complicating relativistic and tidal effects, arising from the self-gravity and velocities of the binary objects themselves, could be ignored.
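
The starting point of that approximation is Einstein’s quadrupole formula. For a circular binary it gives a gravitational wave luminosity of P = (32/5)·G⁴·(m₁m₂)²·(m₁+m₂) / (c⁵·a⁵) – and as a sketch only, with made-up but plausible numbers for a close neutron star binary, it can be evaluated like this:

```python
G = 6.674e-11      # m^3 kg^-1 s^-2
C = 2.9979e8       # m/s
M_SUN = 1.989e30   # kg

def quadrupole_power(m1, m2, a):
    """Leading-order (quadrupole) gravitational wave luminosity of a circular binary, in watts."""
    return (32.0 / 5.0) * G**4 * (m1 * m2)**2 * (m1 + m2) / (C**5 * a**5)

# Illustrative close binary: two 1.4 solar mass neutron stars, two million kilometres apart
m1 = m2 = 1.4 * M_SUN
a = 2.0e9  # separation, metres

print(f"Gravitational wave luminosity: {quadrupole_power(m1, m2, a):.1e} W")
```

That comes out at around 5 × 10²³ watts for this toy setup. Higher-order post-Newtonian terms and full numerical relativity refine numbers like this – but, as the paper argues, they refine them by less than anyone expected.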

Then came the era of numerical relativity where the advent of supercomputers made it possible to actually model all the dynamics of close massive binaries moving at relativistic speeds, much as how supercomputers can model very dynamic weather systems on Earth.

Surprisingly, or if you like unreasonably, the calculated values from numerical relativity were almost identical to those calculated by the supposedly bodgy post-Newtonian approximation. The post-Newtonian approximation approach just isn’t supposed to work for these situations.

All the authors are left with is the possibility that gravitational redshift makes processes near very massive objects appear slower and gravitationally ‘weaker’ to an external observer than they really are. That could – kind of, sort of – explain the unreasonable effectiveness… but only kind of, sort of.

Further reading: Will, C. On the unreasonable effectiveness of the post-Newtonian approximation in gravitational physics.

Astronomy Without A Telescope – Knots In Space

A double Einstein ring. Either two distant galaxies are coincidentally lined up directly behind a closer massive galactic cluster - or it's a donut-shaped portal to an alternate universe. Tough choice, huh?


So finally you possess that most valuable of commodities, a traversable wormhole – and somehow or other you grab one end of it and accelerate it to a very rapid velocity.

This might only take you a couple of weeks, since you accelerate along with your end of the wormhole. But for a friend who has sat waiting at the first entrance to the wormhole, time dilation means that ten years might have passed while you have mucked about at close-to-light-speed velocities with the other end of the wormhole.
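
Just to put a number on that – a toy calculation that takes the ‘couple of weeks versus ten years’ at face value and pretends the trip happens at one constant speed, ignoring the acceleration phases – the ratio of the two elapsed times is the Lorentz factor, from which the required velocity follows:

```python
import math

# Assumed elapsed times from the scenario above
proper_time_weeks = 2.0        # your time, travelling with the wormhole mouth
stationary_time_years = 10.0   # your friend's time back at the other mouth

WEEKS_PER_YEAR = 365.25 / 7

gamma = stationary_time_years * WEEKS_PER_YEAR / proper_time_weeks  # ~260
beta = math.sqrt(1 - 1 / gamma**2)                                  # speed as a fraction of c

print(f"Required Lorentz factor: {gamma:.0f}")
print(f"Required speed: {beta:.7f} c")
```

So ‘close-to-light-speed’ here means something like 99.9999 per cent of the speed of light.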

When you decide to travel back through the wormhole to see your friend, you naturally maintain your own frame of reference and hence your own proper time, as indicated by the watch on your wrist. So when you emerge at the other end of the wormhole, you can surprise your ageing friend with a newspaper you grabbed from 2011 – since he now lives in 2021.

You encourage your friend to come back with you through the wormhole – and, traveling ten years back in time to 2011, he spends an enjoyable few days following his ten-years-younger self around, sending cryptic text messages that encourage his younger self to invent transparent aluminum. However, your friend is disappointed to find that when you both travel back through the wormhole to 2021, his bank account remains depressingly low, because the wormhole is connected to what has become an alternate universe – where the time travel event that you just experienced never happened.

You also realize that your wormhole time machine has other limits. You can further accelerate your end of the wormhole to 100 or even 1000 years of time dilation, but it still remains the case that you can only travel back in time as far as 2011, when you first decided to accelerate your end of the wormhole.

But anyway, wouldn’t it be great if any of this was actually possible? If you looked out into the universe to try and observe a traversable wormhole – you might start by looking for an Einstein ring. A light source from another universe (or a light source from a different time in an analogue of this universe) should be ‘lensed’ by the warped space-time of the wormhole – if the wormhole and the light source are in your direct line of sight. If all of that is plausible, then the light source should appear as a bright ring of light.

The theoretical light signatures of a donut-shaped 'ringhole' type wormhole and a Klein bottle 'time machine'. The ringhole signature is a double Einstein ring - and the Klein bottle signature is two concentric truncated spirals. A Klein bottle time machine is a wormhole of warped space-time where the exit has the identical spatial position as the entrance - so going through it means you should only travel in time. Credit: González-Díaz and Alonso-Serrano.

In fact there are lots of these Einstein rings out there, but their existence is generally attributed to the more mundane cause of gravitational lensing by a massive object (like a galactic cluster) situated between you and a bright light source – all of which are still in our universe.

A recent theoretical letter has proposed that a ringhole rather than a wormhole structure might arise from an unlikely set of circumstances (i.e. this is pure theory – best just to go with it). So rather than a straight tube you could have a toroidal ‘donut’ connection with an alternate universe – which should then create a double Einstein ring – being two concentric circles of light.

This is a much rarer phenomenon and the authors suggest that the one well known instance (SDSSJ0946+1006) needs to be explained by the fortuitous alignment of three massive galactic clusters – which is starting to stretch belief a little… maybe?

Whether or not you find that a convincing argument, the authors then propose that if a Klein bottle wormhole existed – it would create such an unlikely visual phenomenon (two concentric truncated spirals of light) that surely then we might concede that such exotic structures exist?

And OK, if we ever do observe two concentric truncated spirals in the sky that could be pause for thought. Watch this space.

Further reading: González-Díaz and Alonso-Serrano Observing other universes through ringholes and Klein-bottle holes.