The Future of Gravitational Wave Astronomy: Pulsar Webs, Space Interferometers and Everything

A merging of two massive objects, sending ripples through the fabric of space and time. Image credit: R. Hurt/Caltech JPL

It's the hot new field in modern astronomy. The recent announcement of the direct detection of gravitational waves by the Laser Interferometer Gravitational-wave Observatory (LIGO) ushers in a new era of observational astronomy, one that operates completely off the electromagnetic spectrum. The detection itself occurred on September 14th, 2015, and the signal was later designated GW150914. It came just days after Advanced LIGO began operating that month, an encouraging early sign of the upgraded detectors' capabilities.

The Definitive Guide To Terraforming

Artist's impression of the terraforming of Mars, from its current state to a livable world. Credit: Daein Ballard

Terraforming. Chances are you’ve heard that word uttered before, most likely in the context of some science fiction story. However, in recent years, thanks to renewed interest in space exploration, this word is being used in an increasingly serious manner. And rather than being talked about like a far-off prospect, the issue of terraforming other worlds is being addressed as a near-future possibility.

In recent years, we’ve heard luminaries like Elon Musk and Stephen Hawking claiming that humanity needs a “backup location” to ensure our survival, private ventures like Mars One enlisting thousands of volunteers to colonize the Red Planet, and space agencies like NASA, the ESA, and China discussing the prospect of long-term habitability on Mars or the Moon. From all indications, it looks like terraforming is yet another science-fiction concept that is migrating into the realm of science fact.

But just what does terraforming entail? Where exactly could we go about using this process? What kind of technology would we need? Does such technology already exist, or do we have to wait? How much in the way of resources would it take? And above all, what are the odds of it succeeding? Answering any or all of these questions requires a bit of digging. Not only is terraforming a time-honored concept, but as it turns out, humanity already has quite a bit of experience in this area!

Origin Of The Term:

To break it down, terraforming is the process whereby a hostile environment (i.e., a planet that is too cold, too hot, and/or has an unbreathable atmosphere) is altered to make it suitable for human life. This could involve modifying the temperature, atmosphere, surface topography, ecology, or all of the above to make a planet or moon more “Earth-like.”

Venus is considered by many to be a prime candidate for terraforming. Credit: NASA/JPL/io9.com

The term was coined by Jack Williamson, an American science fiction writer who has also been called "the Dean of science fiction" (a title he inherited after the death of Robert Heinlein in 1988). It first appeared in his story "Collision Orbit," published in a 1942 issue of the magazine Astounding Science Fiction. This is the first known use of the word, though examples of the concept appear in fiction beforehand.

Terraforming in Fiction:

Science fiction is filled with examples of altering planetary environments to make them more suitable to human life, many of which predate scientific studies by decades. For example, in H.G. Wells' The War of the Worlds (1898), the Martian invaders begin transforming Earth's ecology for the sake of long-term habitation.

In Olaf Stapledon's Last and First Men (1930), two chapters are dedicated to describing how humanity's descendants terraform Venus after Earth becomes uninhabitable – committing genocide against the native aquatic life in the process. By the 1950s and 60s, with the beginning of the Space Age, terraforming appeared in works of science fiction with increasing frequency.

One such example is Farmer in the Sky (1950) by Robert A. Heinlein, in which Jupiter's moon Ganymede is being transformed into an agricultural settlement. It was a significant work in that it was the first to present terraforming as a serious, scientific matter rather than the subject of mere fantasy.

Scene from 2010: The Year We Make Contact, the movie adaptation of Clarke’s novel. Credit: Metro-Goldwyn-Mayer

In 1951, Arthur C. Clarke published The Sands of Mars, the first novel to depict the terraforming of Mars. In it, Martian settlers heat up the planet by converting Mars' moon Phobos into a second sun and grow plants that break down the Martian sands in order to release oxygen. In 2010: Odyssey Two – the sequel to his seminal 2001: A Space Odyssey – Clarke presents a race of ancient beings ("Firstborn") turning Jupiter into a second sun so that Europa will become a life-bearing world.

Poul Anderson also wrote extensively about terraforming in the 1950s. In his 1954 novel The Big Rain, Venus is altered through planetary engineering techniques over a very long period of time. The book was so influential that the term "Big Rain" has since become synonymous with the terraforming of Venus. It was followed in 1958 by The Snows of Ganymede, in which the Jovian moon is made habitable through a similar process.

In Isaac Asimov's Robot series, colonization and terraforming are performed by a powerful race of humans known as "Spacers," who conduct this process on fifty planets in the known universe. In his Foundation series, humanity has effectively colonized every habitable planet in the galaxy and terraformed them to become part of the Galactic Empire.

In 1984, James Lovelock and Michael Allaby wrote what is considered by many to be one of the most influential books on terraforming. Titled The Greening of Mars, the novel explores the formation and evolution of planets, the origin of life, and Earth’s biosphere. The terraforming models presented in the book actually foreshadowed future debates regarding the goals of terraforming.

Kim Stanley Robinson’s Red Mars Trilogy. Credit: variety.com

In the 1990s, Kim Stanley Robinson released his famous trilogy that deals with the terraforming of Mars. Known as the Mars Trilogy – Red Mars, Green Mars, and Blue Mars – this series centers on the transformation of Mars over the course of many generations into a thriving human civilization. This was followed up in 2012 with the release of 2312, which deals with the colonization of the Solar System – including the terraforming of Venus and other planets.

Countless other examples can be found in popular culture, ranging from television and print to films and video games.

Study of Terraforming:

In an article published by the journal Science in 1961, famed astronomer Carl Sagan proposed using planetary engineering techniques to transform Venus. This involved seeding the atmosphere of Venus with algae, which would convert the atmosphere’s ample supplies of water, nitrogen, and carbon dioxide into organic compounds and reduce Venus’ runaway greenhouse effect.

In 1973, he published an article in the journal Icarus titled "Planetary Engineering on Mars," in which he proposed two scenarios for transforming the planet: transporting low-albedo material to the polar ice caps and/or planting dark plants on them, so that they would absorb more heat, melt, and bring the planet closer to "Earth-like conditions."

In 1976, NASA officially addressed the issue of planetary engineering in a study titled "On the Habitability of Mars: An Approach to Planetary Ecosynthesis." The study concluded that photosynthetic organisms, the melting of the polar ice caps, and the introduction of greenhouse gases could all be used to create a warmer, oxygen- and ozone-rich atmosphere. The first conference session on terraforming – referred to at the time as "Planetary Modeling" – was organized that same year.

Artist concept of a ‘Living’ Mars. Credit: Kevin Gill

And then in March of 1979, NASA engineer and author James Oberg organized the First Terraforming Colloquium – a special session at the Tenth Lunar and Planetary Science Conference, which is held annually in Houston, Texas. In 1981, Oberg popularized the concepts that were discussed at the colloquium in his book New Earths: Restructuring Earth and Other Planets.

In 1982, planetologist Christopher McKay wrote "Terraforming Mars," a paper for the Journal of the British Interplanetary Society. In it, McKay discussed the prospects of a self-regulating Martian biosphere, covering both the methods required to create one and the ethics of doing so. This was the first time the word terraforming was used in the title of a published article, and it would henceforth become the preferred term.

This was followed by James Lovelock and Michael Allaby’s The Greening of Mars in 1984. This book was one of the first to describe a novel method of warming Mars, where chlorofluorocarbons (CFCs) are added to the atmosphere in order to trigger global warming. This book motivated biophysicist Robert Haynes to begin promoting terraforming as part of a larger concept known as Ecopoiesis.

Derived from the Greek words oikos (“house”) and poiesis (“production”), this word refers to the origin of an ecosystem. In the context of space exploration, it involves a form of planetary engineering where a sustainable ecosystem is fabricated from an otherwise sterile planet. As described by Haynes, this begins with the seeding of a planet with microbial life, which leads to conditions approaching that of a primordial Earth. This is then followed by the importation of plant life, which accelerates the production of oxygen, and culminates in the introduction of animal life.

An engineer suggests building a roof over a small planet so that Earth-like conditions could be maintained. Credit: Karl Tate/space.com

In 2009, Kenneth Roy – an engineer with the US Department of Energy – presented his concept for a "Shell World" in a paper published in the Journal of the British Interplanetary Society. Titled "Shell Worlds – An Approach To Terraforming Moons, Small Planets and Plutoids," his paper explored the possibility of using a large "shell" to encase an alien world, keeping its atmosphere contained long enough for long-term changes to take root.

There is also the concept of enclosing a usable part of a planet in a dome in order to transform its environment, which is known as "paraterraforming." The concept was originally coined by British mathematician Richard L.S. Taylor in his 1992 publication Paraterraforming – The Worldhouse Concept, and it could be used to terraform sections of planets that are otherwise inhospitable or cannot be altered in whole.

Potential Sites:

Within the Solar System, several possible locations exist that could be well-suited to terraforming. Consider the fact that, besides Earth, Venus and Mars also lie within the Sun's habitable zone (aka. "Goldilocks Zone"). However, owing to Venus' runaway greenhouse effect and Mars' lack of a magnetosphere, their atmospheres are either too thick and hot, or too thin and cold, to sustain life as we know it. That said, this could theoretically be altered through the right kind of ecological engineering.

Other potential sites in the Solar System include some of the moons that orbit the gas giants. Several Jovian (i.e. in orbit of Jupiter) and Cronian (in orbit of Saturn) moons have an abundance of water ice, and scientists have speculated that if the surface temperatures were increased, viable atmospheres could be created through electrolysis and the introduction of buffer gases.

Artist’s conception of a terraformed Mars. Credit: Ittiz/Wikimedia Commons

There is even speculation that Mercury and the Moon (or at least parts thereof) could be terraformed in order to be suitable for human settlement. In these cases, terraforming would require not only altering the surface but perhaps also adjusting their rotation. In the end, each case presents its own share of advantages, challenges, and likelihoods for success. Let’s consider them in order of distance from the Sun.

Inner Solar System:

The terrestrial planets of our Solar System present the best possibilities for terraforming. Not only are they located closer to our Sun, and thus in a better position to absorb its energy, but they are also rich in silicates and minerals – which any future colonies will need to grow food and build settlements. And as already mentioned, two of these planets (Venus and Mars) skirt the inner and outer edge of the Sun’s habitable zone.

Mercury:
The vast majority of Mercury's surface is hostile to life, with temperatures that swing between extreme heat and cold – i.e. from 700 K (427 °C; 800 °F) down to 100 K (-173 °C; -280 °F). This is due to its proximity to the Sun, its almost total lack of an atmosphere, and its very slow rotation. However, at the poles, temperatures remain consistently low – around -93 °C (-135 °F) – because these regions are permanently shadowed.
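
Those figures are straightforward to verify, since converting between Kelvin, Celsius, and Fahrenheit is simple arithmetic. Here is a minimal Python sketch (the helper functions are our own, written purely to check the numbers quoted above):

```python
def kelvin_to_celsius(t_k):
    """Convert a temperature from Kelvin to degrees Celsius."""
    return t_k - 273.15

def celsius_to_fahrenheit(t_c):
    """Convert a temperature from degrees Celsius to degrees Fahrenheit."""
    return t_c * 9.0 / 5.0 + 32.0

# Mercury's daytime and nighttime extremes quoted above
for t_k in (700.0, 100.0):
    t_c = kelvin_to_celsius(t_k)
    t_f = celsius_to_fahrenheit(t_c)
    print(f"{t_k:.0f} K = {t_c:.0f} deg C = {t_f:.0f} deg F")
# Output: 700 K = 427 deg C = 800 deg F
#         100 K = -173 deg C = -280 deg F
```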

Images of Mercury’s northern polar region, provided by MESSENGER. Credit: NASA/JPL

The presence of water ice and organic molecules in the northern polar region has also been confirmed thanks to data obtained by the MESSENGER mission. Colonies could therefore be constructed in these regions, and limited terraforming (aka. paraterraforming) could take place. For example, if domes (or a single dome) of sufficient size could be built over the Kandinsky, Prokofiev, Tolkien, and Tryggvadottir craters, the northern region could be altered for human habitation.

Theoretically, this could be done by using mirrors to redirect sunlight into the domes, which would gradually raise the temperature. The water ice would then melt and, when combined with organic molecules and finely ground sand, could be made into soil. Plants could then be grown to produce oxygen, which, combined with nitrogen gas, would produce a breathable atmosphere.

Venus:
As "Earth's Twin," Venus presents many possibilities and advantages when it comes to terraforming. The first to propose this was Sagan, with his 1961 article in Science. However, subsequent discoveries – such as the high concentrations of sulfuric acid in Venus' clouds – made this idea unfeasible. Even if algae could survive in such an atmosphere, converting the extremely dense supply of atmospheric CO₂ into oxygen would result in an over-dense oxygen environment.

In addition, graphite would become a by-product of these chemical reactions, likely forming a thick powder on the surface. Through combustion, this would become CO₂ again, restarting the entire greenhouse effect. However, more recent proposals advocate using carbon sequestration techniques, which are arguably much more practical.

In these scenarios, chemical reactions would be relied upon to convert Venus' atmosphere into something breathable while also reducing its density. In one scenario, hydrogen and an iron aerosol would be introduced to convert the atmospheric CO₂ into graphite and water. This water would then fall to the surface, where it would cover roughly 80% of the planet – a consequence of Venus having little variation in elevation.
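
The chemistry behind that scenario is essentially the Bosch reaction (CO₂ + 2H₂ → C + 2H₂O, with iron serving as a catalyst). The Python sketch below is a rough, back-of-the-envelope estimate of the quantities involved; the atmospheric mass figure is an approximation on our part, and the calculation treats the atmosphere as pure CO₂:

```python
# Back-of-the-envelope scale of the hydrogen/iron-aerosol scenario:
# reduce Venus' CO2 via the Bosch reaction, CO2 + 2 H2 -> C + 2 H2O.
# The figures below are rough assumptions used purely for illustration.
M_CO2 = 44.0   # molar mass of CO2, g/mol
M_H2 = 2.0     # molar mass of H2, g/mol
M_H2O = 18.0   # molar mass of H2O, g/mol

venus_atmosphere_kg = 4.8e20   # approximate total mass, treated as all CO2

# The reaction consumes 2 mol of H2 and yields 2 mol of H2O per mol of CO2
hydrogen_kg = venus_atmosphere_kg * (2 * M_H2 / M_CO2)
water_kg = venus_atmosphere_kg * (2 * M_H2O / M_CO2)

print(f"Hydrogen required: ~{hydrogen_kg:.1e} kg")
print(f"Water produced:    ~{water_kg:.1e} kg")
# On the order of 4e19 kg of hydrogen -- an enormous amount by any measure.
```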

Another scenario calls for the introduction of vast amounts of calcium and magnesium into the atmosphere. This would sequester carbon in the form of calcium and magnesium carbonates. An advantage to this plan is that Venus already has deposits of both minerals in its mantle, which could then be exposed to the atmosphere through drilling. However, most of the minerals would have to come from off-world in order to reduce the temperature and pressure to sustainable levels.

Yet another proposal is to freeze the atmospheric carbon dioxide out as dry ice and let it accumulate on the surface. Once there, it could be buried, where it would remain in a solid state due to pressure, and it could even be mined for local and off-world use. And then there is the possibility of bombarding the surface with icy comets (which could be mined from one of Jupiter's or Saturn's moons) to create a liquid ocean on the surface, which would sequester carbon and aid in any of the above processes.

Last, there is the scenario in which Venus' dense atmosphere could simply be removed – the most direct approach to thinning an atmosphere that is far too dense for human occupation. By slamming large comets or asteroids into the surface, some of the dense CO₂ atmosphere could be blasted into space, leaving less of it to be converted.

Artist’s conception of a terraformed Venus, showing a surface largely covered in oceans. Credit: Wikipedia Commons/Ittiz

A slower method could be achieved using mass drivers (aka. electromagnetic catapults) or space elevators, which would gradually scoop up the atmosphere and either lift it into space or fire it away from the surface. And beyond altering or removing the atmosphere, there are also concepts that call for reducing the heat and pressure by either limiting sunlight (i.e. with solar shades) or altering the planet’s rotational velocity.

The concept of solar shades involves using either a series of small spacecraft or a single large lens to divert sunlight from a planet’s surface, thus reducing global temperatures. For Venus, which absorbs twice as much sunlight as Earth, solar radiation is believed to have played a major role in the runaway greenhouse effect that has made it what it is today.

Such a shade could be space-based, located in the Sun-Venus L1 Lagrangian Point, where it would not only prevent some sunlight from reaching Venus but also serve to reduce the amount of radiation Venus is exposed to. Alternately, solar shades or reflectors could be placed in the atmosphere or on the surface. This could consist of large reflective balloons, sheets of carbon nanotubes or graphene, or low-albedo material.

Placing shades or reflectors in the atmosphere offers two advantages: first, atmospheric reflectors could be built in-situ, using locally-sourced carbon. Second, Venus' atmosphere is dense enough that such structures could easily float atop the clouds. However, the amount of material required would be vast, and it would have to remain in place long after the atmosphere had been modified. Also, since Venus already has highly reflective clouds (with an albedo of about 0.65), any approach would have to significantly surpass that reflectivity to make a difference.
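
To see why the bar is so high, it helps to look at the simple radiative balance that a shade or reflector manipulates: the equilibrium temperature of a planet scales as T_eq = [S(1 − A) / 4σ]^(1/4), which ignores any greenhouse effect. The Python sketch below is a hedged illustration only; the solar constant at Venus is taken to be roughly 2,600 W/m², and the albedo values are simply examples:

```python
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S_VENUS = 2601.0     # approximate solar constant at Venus' orbit, W/m^2

def equilibrium_temp(albedo, solar_constant=S_VENUS):
    """Blackbody equilibrium temperature, ignoring any greenhouse effect."""
    return (solar_constant * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

for albedo in (0.65, 0.80, 0.90):
    print(f"albedo {albedo:.2f}: T_eq ~ {equilibrium_temp(albedo):.0f} K")
# Even at an albedo of 0.65 the equilibrium value is only ~250 K; Venus'
# roughly 737 K surface comes from its greenhouse effect, which is why
# shading is usually paired with removing or converting the CO2.
```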

Solar shades placed in orbit of Venus are a possible means of terraforming the planet. Credit: IEEE Spectrum/John MacNeill

The idea of speeding up Venus' rotation has also been floated as a possible means of terraforming. If Venus could be spun up to the point where its diurnal (day-night) cycle was similar to Earth's, the planet might begin to generate a stronger magnetic field. This would reduce the amount of solar wind (and hence radiation) reaching the surface, making it safer for terrestrial organisms.

The Moon:
As Earth's closest celestial body, the Moon would be comparatively easy to colonize. But when it comes to terraforming it, the possibilities and challenges closely resemble those of Mercury. For starters, the Moon has an atmosphere so thin that it can only be referred to as an exosphere. What's more, the volatile elements that are necessary for life – hydrogen, nitrogen, and carbon – are in short supply.

These problems could be addressed by capturing comets that contain water ices and volatiles and crashing them into the surface. The comets would sublimate, dispersing these gases and water vapor to create an atmosphere. These impacts would also liberate water that is contained in the lunar regolith, which could eventually accumulate on the surface to form natural bodies of water.

The transfer of momentum from these comets would also speed up the Moon's rotation, to the point where it was no longer tidally locked. A Moon that rotated once on its axis every 24 hours would have a steady diurnal cycle, which would make colonization and adapting to life there easier.

There is also the possibility of paraterraforming parts of the Moon in a way that would be similar to terraforming Mercury’s polar region. In the Moon’s case, this would take place in the Shackleton Crater, where scientists have already found evidence of water ice. Using solar mirrors and a dome, this crater could be turned into a micro-climate where plants could be grown and a breathable atmosphere created.

Mars:
When it comes to terraforming, Mars is the most popular destination. There are several reasons for this, ranging from its proximity to Earth and its similarities to our planet, to the fact that it once had an environment very much like Earth's – one that included a thicker atmosphere and warm, flowing water on the surface. Lastly, it is currently believed that Mars may have additional sources of water beneath its surface.

In brief, Mars has diurnal and seasonal cycles that are very close to what we experience here on Earth. In the former case, a single day on Mars lasts 24 hours and 40 minutes. In the latter case, and owing to Mars' similarly-tilted axis (25.19° compared to Earth's 23.44°), Mars experiences seasonal changes that are very similar to Earth's. Though a single season on Mars lasts roughly twice as long, the overall temperature variation is comparable – about 178 °C (320 °F) on Mars versus roughly 160 °C (288 °F) on Earth.

Beyond these similarities, however, Mars would need to undergo vast transformations in order for human beings to live on its surface. The atmosphere would need to be thickened drastically, and its composition would need to be changed. Currently, Mars' atmosphere is composed of 96% carbon dioxide, 1.93% argon, and 1.89% nitrogen, and the surface air pressure is equivalent to less than 1% of Earth's at sea level.

Above all, Mars lacks a magnetosphere, which means that its surface receives significantly more radiation than we are used to here on Earth. In addition, it is believed that Mars once had a magnetosphere and that the disappearance of this magnetic field led to the stripping of Mars’ atmosphere by solar wind. This in turn is what led Mars to become the cold, desiccated place it is today.

Scientists were able to gauge the rate of water loss on Mars by measuring the ratio of water and HDO from today and 4.3 billion years ago. Credit: Kevin Gill

Ultimately, this means that in order for the planet to become habitable by human standards, its atmosphere would need to be significantly thickened and the planet significantly warmed. The composition of the atmosphere would need to change as well, from the current CO₂-heavy mix to a nitrogen-oxygen balance of about 70/30. And above all, the atmosphere would need to be replenished periodically to compensate for the ongoing loss to space.
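
One way to make that 70/30 target concrete is to look at partial pressures, since what matters for breathability is the partial pressure of oxygen (about 21 kPa at Earth's sea level). A minimal Python sketch, with all figures rounded:

```python
EARTH_SEA_LEVEL_KPA = 101.3   # total surface pressure on Earth
EARTH_O2_FRACTION = 0.21      # oxygen fraction of Earth's air

# Partial pressure of O2 that humans are adapted to at sea level
target_o2_kpa = EARTH_SEA_LEVEL_KPA * EARTH_O2_FRACTION   # ~21 kPa

# With a 70/30 nitrogen-oxygen mix, the total pressure needed to
# deliver that same O2 partial pressure is lower than Earth's:
o2_fraction = 0.30
required_total_kpa = target_o2_kpa / o2_fraction

print(f"Target O2 partial pressure:       {target_o2_kpa:.1f} kPa")
print(f"Total pressure needed at 30% O2:  {required_total_kpa:.1f} kPa")
# ~71 kPa -- about 70% of Earth's sea-level pressure, but still roughly a
# hundred times Mars' current ~0.6 kPa surface pressure.
```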

Luckily, the first three requirements are largely complementary, and they present a wide range of possible solutions. For starters, Mars' atmosphere could be thickened, and the planet warmed, by bombarding its polar regions with impactors (small asteroids or comets). These would cause the poles to melt, releasing their deposits of frozen carbon dioxide and water into the atmosphere and triggering a greenhouse effect.

The introduction of volatiles such as ammonia and methane would also help to thicken the atmosphere and trigger warming. Both could be mined from the icy moons of the outer Solar System – particularly from moons like Ganymede, Callisto, and Titan – and delivered to the surface via meteoric impacts.

After impacting the surface, the ammonia ice would sublimate and break down into hydrogen and nitrogen – the hydrogen reacting with the CO₂ to form water and graphite, while the nitrogen acts as a buffer gas. The methane, meanwhile, would act as a greenhouse gas, further enhancing global warming. In addition, the impacts would throw tons of dust into the air, further fueling the warming trend.

In time, Mars' ample supplies of water ice – found not only at the poles but in vast subsurface deposits of permafrost – would melt to form warm, flowing water. And with significantly increased air pressure and a warmer atmosphere, humans might be able to venture out onto the surface without the need for pressure suits.

However, the atmosphere would still need to be converted into something breathable. This would be far more time-consuming, as the process of converting the atmospheric CO₂ into oxygen gas would likely take centuries. In any case, several possibilities have been suggested, most of which involve converting the atmosphere through photosynthesis – either with cyanobacteria or with Earth plants and lichens.

Other suggestions include building orbital mirrors near the poles to direct sunlight onto the surface, melting the polar ice caps and releasing their CO₂ to trigger a cycle of warming. Using dark dust from Phobos and Deimos to reduce the surface's albedo, thus allowing it to absorb more sunlight, has also been suggested.

In short, there are plenty of options for terraforming Mars. And many of them, if not being readily available, are at least on the table…

Outer Solar System:

Beyond the Inner Solar System, there are several sites that would make for good terraforming targets as well. Particularly around Jupiter and Saturn, there are several sizable moons – some of which are larger than Mercury – that have an abundance of water in the form of ice (and in some cases, maybe even interior oceans).

The moons of the Solar System, shown to scale. Credit: planetary.org

At the same time, many of these same moons contain other necessary ingredients for functioning ecosystems, such as frozen volatiles  – like ammonia and methane. Because of this, and as part of our ongoing desire to explore farther out into our Solar System, many proposals have been made to seed these moons with bases and research stations. Some plans even include possible terraforming to make them suitable for long-term habitation.

The Jovian Moons:
Jupiter's largest moons – Io, Europa, Ganymede, and Callisto, known as the Galileans after their discoverer, Galileo Galilei – have long been the subject of scientific interest. For decades, scientists have speculated about the possible existence of a subsurface ocean on Europa, based on theories about the moon's tidal heating (a consequence of its eccentric orbit and orbital resonance with the other moons).

Analysis of images provided by the Voyager 1 and Galileo probes added weight to this theory, showing regions where it appeared that the subsurface ocean had melted through. What’s more, the presence of this warm water ocean has also led to speculation about the existence of life beneath Europa’s icy crust – possibly around hydrothermal vents at the core-mantle boundary.

Because of this potential for habitability, Europa has also been suggested as a possible site for terraforming. As the argument goes, if the surface temperature could be increased and the surface ice melted, the entire moon could become an ocean world. Sublimation of the ice would release water vapor and gaseous volatiles, which could then undergo electrolysis – a process that already produces Europa's thin oxygen atmosphere.

However, Europa has no magnetosphere of its own and lies within Jupiter's powerful magnetic field. As a result, its surface is exposed to significant amounts of radiation – around 540 rem per day, versus a typical natural background dose of roughly 0.3 rem per year here on Earth – and any atmosphere we created would begin to be stripped away by Jupiter. Ergo, radiation shielding would need to be put in place to deflect the majority of this radiation.
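
To put that surface dose in perspective, the short Python sketch below compares cumulative exposure. The 500-rem figure used for an acutely dangerous dose is a commonly cited round number, and the Earth background value is approximate; both are assumptions for illustration only:

```python
EUROPA_SURFACE_REM_PER_DAY = 540.0    # unshielded surface dose quoted above
EARTH_BACKGROUND_REM_PER_YEAR = 0.3   # approximate natural background on Earth
DANGEROUS_ACUTE_DOSE_REM = 500.0      # round figure for a life-threatening dose

# How long before an unshielded visitor accumulates a dangerous dose?
days_to_danger = DANGEROUS_ACUTE_DOSE_REM / EUROPA_SURFACE_REM_PER_DAY
print(f"Time to a ~500 rem dose on Europa's surface: {days_to_danger:.1f} days")

# One unshielded day on Europa versus Earth's annual background dose
ratio = EUROPA_SURFACE_REM_PER_DAY / EARTH_BACKGROUND_REM_PER_YEAR
print(f"One day on Europa ~ {ratio:.0f} years of Earth background")
```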

And then there is Ganymede, the third of Jupiter's Galilean moons in order of distance from the planet. Much like Europa, it is a potential site for terraforming and presents numerous advantages. For one, it is the largest moon in our Solar System – larger than our own Moon and even larger than the planet Mercury. In addition, it has ample supplies of water ice, is believed to have an interior ocean, and even has its own magnetosphere.

Hence, if the surface temperature were increased and the ice sublimated, Ganymede's atmosphere could be thickened. Like Europa, it would become an ocean world, and its own magnetosphere would allow it to hold on to more of its atmosphere. However, Jupiter's magnetic field still exerts a powerful influence over the moon, which means radiation shields would still be needed.

Lastly, there is Callisto, the outermost of the Galileans. Here too, abundant supplies of water ice, volatiles, and the possibility of an interior ocean all point towards potential habitability. But in Callisto's case, there is the added bonus of it orbiting beyond Jupiter's main radiation belts, which reduces the threat of radiation and atmospheric loss.

Artist’s cut-away representation of the internal structure of Ganymede. Credit: Wikipedia Commons/kelvinsong

The process would begin with surface heating, which would sublimate the water ice and Callisto's supplies of frozen ammonia. Electrolysis of the resulting oceans would lead to the formation of an oxygen-rich atmosphere, and the ammonia could be converted into nitrogen to act as a buffer gas. However, since a large fraction of Callisto's mass is ice, the moon would lose considerable mass and have no continents. Once again, an ocean world would result, necessitating floating cities or massive colony ships.

The Cronian Moons:
Much like the Jovian moons, Saturn's moons (also known as the Cronian moons) present opportunities for terraforming. Again, this is due to the presence of water ice, interior oceans, and volatile elements. Titan, Saturn's largest moon, also has an abundance of methane, both in liquid form (the methane lakes around its northern polar region) and in gaseous form in its atmosphere. Large caches of ammonia are also believed to exist beneath the surface ice.

Titan is also the only natural satellite to have a dense atmosphere (one and a half times the pressure of Earth's) and the only world other than Earth whose atmosphere is nitrogen-rich. Such a thick atmosphere would make it far easier to equalize pressure for habitats on the moon. What's more, scientists believe this atmosphere is a prebiotic environment rich in organic chemistry – i.e. similar to Earth's early atmosphere (only much colder).

Diagram of the internal structure of Titan according to the fully differentiated dense-ocean model. Credit: Wikipedia Commons/Kelvinsong

As such, converting it to something Earth-like would be feasible. First, the surface temperature would need to be increased. Since Titan is very distant from the Sun and already has an abundance of greenhouse gases, this could only be accomplished through orbital mirrors. This would sublimate the surface ice, releasing ammonia beneath, which would lead to more heating.

The next step would involve converting the atmosphere to something breathable. As already noted, Titan’s atmosphere is nitrogen-rich, which would remove the need for introducing a buffer gas. And with the availability of water, oxygen could be introduced by generating it through electrolysis. At the same time, the methane and other hydrocarbons would have to be sequestered, in order to prevent an explosive mixture with the oxygen.

But given the thickness and multi-layered nature of Titan's ice, which is estimated to account for half of its mass, the moon would be very much an ocean world – i.e. with no continents or landmasses to build on. So once again, any habitats would have to take the form of either floating platforms or large ships.

Enceladus is another possibility, thanks to the recent discovery of a subsurface ocean. Analysis by the Cassini space probe of the water plumes erupting from its southern polar region has also indicated the presence of organic molecules. As such, terraforming Enceladus would be similar to terraforming Jupiter's moon Europa, and would yield a similar ocean moon.

Artist’s rendering of possible hydrothermal activity that may be taking place on and under the seafloor of Enceladus. Credit: NASA/JPL

Again, this would likely involve orbital mirrors, given Enceladus' distance from our Sun. Once the ice began to sublimate, electrolysis could generate oxygen gas. The ammonia present in the subsurface ocean would also be released, helping to raise the temperature and serving as a source of nitrogen gas with which to buffer the atmosphere.

Exoplanets:
In addition to the Solar System, extra-solar planets (aka. exoplanets) are also potential sites for terraforming. Of the 1,941 exoplanets confirmed at the time of writing, the candidates are those that have been designated "Earth-like" – in other words, terrestrial planets that have atmospheres and, like Earth, occupy the region around their star where average surface temperatures allow for liquid water (aka. the habitable zone).

The first planet confirmed by Kepler to have an average orbital distance that placed it within its star’s habitable zone was Kepler-22b. This planet is located about 600 light-years from Earth in the constellation of Cygnus, was first observed on May 12th, 2009, and then confirmed on Dec 5th, 2011. Based on all the data obtained, scientists believe that this world is roughly 2.4 times the radius of Earth, and is likely covered in oceans or has a liquid or gaseous outer shell.

In addition, there are star systems with multiple “Earth-like” planets occupying their habitable zones. Gliese 581 is a good example, a red dwarf star that is located 20.22 light-years away from Earth in the Libra constellation. Here, three confirmed and two possible planets exist, two of which are believed to orbit within the star’s habitable zone. These include the confirmed planet Gliese 581 d and the hypothetical Gliese 581 g.

Tau Ceti is another example. This G-class star, located roughly 12 light-years from Earth in the constellation Cetus, has five possible planets orbiting it. Two of these are Super-Earths believed to orbit within the star's habitable zone – Tau Ceti e and Tau Ceti f. However, Tau Ceti e is thought to be too close to its star for anything other than Venus-like conditions to exist on its surface.
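
A crude way to estimate where any star's habitable zone lies is to scale the Sun's by the square root of the star's luminosity, since the flux a planet receives falls off with the square of its distance. The Python sketch below does exactly that; the Sun's habitable-zone boundaries (0.95 to 1.4 AU) and the stellar luminosities are rough assumed values:

```python
import math

# Approximate boundaries of the Sun's habitable zone, in AU (assumed values)
SUN_HZ_INNER_AU = 0.95
SUN_HZ_OUTER_AU = 1.4

def habitable_zone(luminosity_solar):
    """Scale the Sun's habitable zone by sqrt(L / L_sun).

    Equilibrium temperature depends on stellar flux, and flux ~ L / d^2,
    so the same temperatures occur at distances proportional to sqrt(L)."""
    scale = math.sqrt(luminosity_solar)
    return SUN_HZ_INNER_AU * scale, SUN_HZ_OUTER_AU * scale

# Very rough luminosities, in solar units, for the stars mentioned above
stars = {"Sun": 1.0, "Tau Ceti": 0.52, "Gliese 581": 0.013}
for name, luminosity in stars.items():
    inner, outer = habitable_zone(luminosity)
    print(f"{name}: ~{inner:.2f} to {outer:.2f} AU")
```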

In all cases, terraforming the atmospheres of these planets would most likely involve the same techniques used to terraform Venus and Mars, though to varying degrees. For those located on the outer edge of their habitable zones, terraforming could be accomplished by introducing greenhouse gases or covering the surface with low albedo material to trigger global warming. On the other end, solar shades and carbon sequestering techniques could reduce temperatures to the point where the planet is considered hospitable.

The latest list of potentially habitable exoplanets, courtesy of The Planetary Habitability Laboratory. Credit: phl.upr.edu

Potential Benefits:

When addressing the issue of terraforming, there is the inevitable question – “why should we?” Given the expenditure in resources, the time involved, and other challenges that naturally arise (see below), what reasons are there to engage in terraforming? As already mentioned, there are the reasons cited by Musk, about the need to have a “backup location” to prevent any particular cataclysm from claiming all of humanity.

Putting aside for the moment the prospect of a nuclear holocaust, there is also the likelihood that life will become untenable on certain parts of our planet in the coming century. As NOAA reported in March of 2015, carbon dioxide levels in the atmosphere have now surpassed 400 ppm, a level not seen since the Pliocene Epoch – when global temperatures and sea levels were significantly higher.

And as a series of scenarios computed by NASA shows, this trend is likely to continue until 2100, with serious consequences. In one scenario, carbon dioxide concentrations level off at about 550 ppm toward the end of the century, resulting in an average temperature increase of 2.5 °C (4.5 °F). In the second, carbon dioxide concentrations rise to about 800 ppm, resulting in an average increase of about 4.5 °C (8 °F). Whereas the increase predicted in the first scenario might be manageable, under the second, life would become untenable in many parts of the planet.
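
Those projections line up reasonably well with the standard logarithmic relationship between CO₂ concentration and warming. The back-of-the-envelope Python check below assumes an equilibrium climate sensitivity of about 3 °C per doubling of CO₂ and a pre-industrial baseline of 280 ppm; both are rounded, commonly used figures rather than values taken from the NASA study:

```python
import math

PREINDUSTRIAL_PPM = 280.0    # commonly used baseline concentration
DEG_C_PER_DOUBLING = 3.0     # assumed equilibrium climate sensitivity

def warming_estimate(co2_ppm):
    """Rough equilibrium warming relative to pre-industrial CO2 levels."""
    doublings = math.log2(co2_ppm / PREINDUSTRIAL_PPM)
    return DEG_C_PER_DOUBLING * doublings

for ppm in (400, 550, 800):
    print(f"{ppm} ppm -> ~{warming_estimate(ppm):.1f} deg C of warming")
# 550 ppm gives ~2.9 deg C and 800 ppm ~4.5 deg C -- in the same ballpark
# as the scenarios quoted above.
```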

NASA predicts that, based on current emissions rates, temperatures could increase by up to 4.5 degrees Celsius by 2100. Credit: svs.gsfc.nasa.gov

As a result of this, creating a long-term home for humanity on Mars, the Moon, Venus, or elsewhere in the Solar System may become necessary. In addition to offering us new locations from which to extract resources and cultivate food – and a possible outlet for population pressures – having colonies on other worlds could mean the difference between long-term survival and extinction.

There is also the argument that humanity is already well-versed in altering planetary environments. For centuries, humanity’s reliance on industrial machinery, coal, and fossil fuels has had a measurable effect on Earth’s environment. And whereas the Greenhouse Effect that we have triggered here was not deliberate, our experience and knowledge in creating it here on Earth could be put to good use on planets where surface temperatures need to be raised artificially.

In addition, it has also been argued that working with environments where there is a runaway Greenhouse Effect – i.e. Venus – could yield valuable knowledge that could in turn be used here on Earth. Whether it is the use of extreme bacteria, introducing new gases, or mineral elements to sequester carbon, testing these methods out on Venus could help us to combat Climate Change here at home.

It has also been argued that Mars’ similarities to Earth are a good reason to terraform it. Essentially, Mars once resembled Earth, until its atmosphere was stripped away, causing it to lose virtually all the liquid water on its surface. Ergo, terraforming it would be tantamount to returning it to its once-warm and watery glory. The same argument could be made of Venus, where efforts to alter it would restore it to what it was before a runaway Greenhouse Effect turned it into the harsh, extremely hot world it is today.

One of several fan-made design concepts for SpaceX's Mars Colonization Transport (MCT), dominated by massive spherical fuel tanks and inflatable modules to house 100 Mars colonists. Credit: Reddit user P3rkoz

Last, but not least, there is the argument that colonizing the Solar System could usher in an age of "post-scarcity." If humanity were to build outposts and bases on other worlds, mine the asteroid belt, and harvest the resources of the outer Solar System, we would effectively have enough minerals, gases, energy, and water to last us indefinitely. It could also help trigger a massive acceleration in human development, defined by leaps and bounds in technological and social progress.

Potential Challenges:

When it comes right down to it, all of the scenarios listed above suffer from one or more of the following problems:

  1. They are not possible with existing technology
  2. They require a massive commitment of resources
  3. They solve one problem, only to create another
  4. They do not offer a significant return on the investment
  5. They would take a really, REALLY long time

Case in point, all of the potential ideas for terraforming Venus and Mars involve infrastructure that does not yet exist and would be very expensive to create. For instance, the orbital shade concept that would cool Venus calls for a structure that would need to be four times the diameter of Venus itself (if it were positioned at L1). It would therefore require megatons of material, all of which would have to be assembled on site.

All asteroids and comets visited by spacecraft as of November 2010. Credits: Emily Lakdawalla/NASA/JPL/Ted Stryk/ESA/OSIRIS team/JHUAPL/ISAS/JAXA/RAS/UMD

In contrast, increasing the speed of Venus' rotation would require energy many orders of magnitude greater than building orbiting solar mirrors. As with removing Venus' atmosphere, the process would also require a significant number of impactors, which would have to be harvested from the outer Solar System – mainly from the Kuiper Belt.

In order to do this, a large fleet of spaceships would be needed to haul them, and these ships would need to be equipped with advanced drive systems that could make the trip in a reasonable amount of time. Currently, no such drive systems exist, and conventional methods – ranging from ion engines to chemical propellants – are neither fast nor economical enough.

To illustrate, NASA's New Horizons mission took nearly ten years to make its historic rendezvous with Pluto in the Kuiper Belt, using conventional rockets and gravity assists. Meanwhile, the Dawn mission, which relied on ion propulsion, took almost four years to reach Vesta in the Asteroid Belt. Neither method is practical for making repeated trips to the Kuiper Belt and hauling back icy comets and asteroids, and humanity has nowhere near the number of ships we would need to do this.
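
To see why hauling material back from the Kuiper Belt is so daunting with conventional propulsion, consider a minimum-energy (Hohmann-style) transfer, whose one-way flight time is half the period of the transfer ellipse. The simplified Python sketch below ignores gravity assists – which is precisely how New Horizons beat this estimate – and assumes circular, coplanar orbits:

```python
def hohmann_transfer_years(r1_au, r2_au):
    """One-way time for a minimum-energy transfer between two circular,
    coplanar orbits around the Sun, in years (via Kepler's third law)."""
    a = (r1_au + r2_au) / 2.0     # semi-major axis of the transfer ellipse, AU
    period_years = a ** 1.5       # T^2 = a^3 when T is in years and a in AU
    return period_years / 2.0

# From Earth (1 AU) out to a Kuiper Belt object at roughly 40 AU
print(f"Earth to 40 AU: ~{hohmann_transfer_years(1.0, 40.0):.0f} years one way")
# ~46 years each way -- without gravity assists or advanced propulsion,
# repeatedly ferrying icy bodies inward is simply not practical.
```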

The Moon's proximity makes it an attractive option for terraforming. But once more, the resources needed – which would likely include several hundred comets – would have to be imported from the outer Solar System. And while Mercury's resources could be harvested in-situ or brought from Earth to paraterraform its northern polar region, the concept still calls for a large fleet of ships and robot builders that do not yet exist.

The moons of Saturn, from left to right: Mimas, Enceladus, Tethys, Dione, Rhea; Titan in the background; Iapetus (top) and irregularly shaped Hyperion (bottom). Some small moons are also shown. All to scale. Credit: NASA/JPL/Space Science Institute

The outer Solar System presents a similar problem. In order to begin terraforming these moons, we would need infrastructure between here and there, which would mean bases on the Moon, Mars, and within the Asteroid Belt. There, ships could refuel as they transport materials to the Jovian and Cronian systems, and resources could be harvested from all three of these locations, as well as from within the systems themselves.

But of course, it would take many, many generations (or even centuries) to build all of that, and at considerable cost. Ergo, any attempt at terraforming the outer Solar System would have to wait until humanity had effectively colonized the inner Solar System. And terraforming the inner Solar System will not be possible until humanity has plenty of space haulers on hand – fast ones, at that!

The necessity of radiation shields also presents a problem. The size and cost of manufacturing shields that could deflect the radiation trapped in Jupiter's magnetic field would be astronomical. And while the resources could be harvested from the nearby Asteroid Belt, transporting and assembling them in space around the Jovian moons would again require many ships and robotic workers. And again, extensive infrastructure would have to exist between Earth and the Jovian system before any of this could proceed.

As for item three, there are plenty of problems that could result from terraforming. For instance, transforming Jupiter's and Saturn's moons into ocean worlds could be pointless, as the resulting oceans would account for a major portion of each moon's volume. Combined with their low surface gravities, high orbital velocities, and the tidal effects of their parent planets, this could lead to extremely high waves on their surfaces. In fact, these moons could become totally unstable as a result of being altered.

Mars-manned-mission vehicle (NASA Human Exploration of Mars Design Reference Architecture 5.0) Feb. 2009. Credit: NASA

There are also several questions about the ethics of terraforming. Basically, altering other planets in order to make them more suitable to human needs raises the natural question of what would happen to any lifeforms already living there. If in fact Mars and other Solar System bodies have indigenous microbial (or more complex) life, which many scientists suspect, then altering their ecology could impact or even wipe out these lifeforms. In short, future colonists and terrestrial engineers would effectively be committing genocide.

Another argument that is often made against terraforming is that any effort to alter the ecology of another planet does not present any immediate benefits. Given the cost involved, what possible incentive is there to commit so much time, resources, and energy to such a project? While the idea of utilizing the resources of the Solar System makes sense in the long run, the short-term gains are far less tangible.

Basically, harvesting resources from other worlds is not economically viable when you can extract them here at home for much less. And real estate is only the basis of an economic model if the real estate itself is desirable. While Mars One has certainly shown us that there are plenty of human beings willing to make a one-way trip to Mars, turning the Red Planet, Venus, or anywhere else into a "new frontier" where people can buy up land will first require some serious advances in technology, some serious terraforming, or both.

As it stands, the environments of Mars, Venus, the Moon, and the outer Solar System are all hostile to life as we know it. Even with the requisite commitment of resources and people willing to be the "first wave," life would be very difficult for those living out there. And this situation would not change for centuries or even millennia. Like it or not, transforming a planet's ecology is very slow, laborious work.

Artist’s concept of a Martian astronaut standing outside the Mars One habitat. Credit: Bryan Versteeg/Mars One

Conclusion:

So… after considering all of the places where humanity could colonize and terraform, what it would take to make that happen, and the difficulties in doing so, we are once again left with one important question. Why should we? Assuming that our very survival is not at stake, what possible incentives are there for humanity to become an interplanetary (or interstellar) species?

Perhaps there is no good reason. Much like sending astronauts to the Moon, taking to the skies, and climbing the highest mountain on Earth, colonizing other planets may be nothing more than something we feel we need to do. Why? Because we can! Such a reason has been good enough in the past, and it will likely be sufficient again in the not-too-distant future.

This should in no way deter us from considering the ethical implications, the sheer cost involved, or the cost-to-benefit ratio. But in time, we might find that we have no choice but to get out there, simply because Earth is becoming too stuffy and crowded for us!

Check out the full Definitive Guide here:

We've also got articles that explore the more radical side of terraforming, like Could We Terraform Jupiter?, Could We Terraform The Sun?, Could We Terraform A Black Hole?, and Student Team Wants to Terraform Mars Using Cyanobacteria.

Astronomy Cast also has good episodes on the subject, like Episode 96: Humans to Mars, Part 3 – Terraforming Mars.

For more information, check out Terraforming Mars at NASA Quest! and NASA's Journey to Mars.

Retro Travel Posters Show Us The Future

Visitors to Jupiter view the Jovian auroras from balloons. Image: NASA/JPL.

One of the greatest things about being a space enthusiast is all of the discoveries that come out on an almost daily basis. One of the saddest things about being a space enthusiast is all of the discoveries and destinations that are so close, just beyond the horizon of our lifespan.

Will we colonize other planets? Sure, but most of us living today will be gone by then. Will we spend time in glorious, gleaming space habitats? Obviously, but we'll just be epitaphs by then. Will sentient alien species gift us faster-than-light travel and other wonders? Maybe, but not before my bucket list has its final item checked off.

Citizen space travel? Hmmmm, tantalizingly within reach.

But now, new retro style posters from NASA, designed by the team at Invisible Creature, are making us feel nostalgic about things that haven’t even happened yet, and are helping us leave behind gloomy thoughts of being born at the wrong time.

The Grand Tour. Image: NASA/JPL

The Grand Tour celebrates a time when our probes toured the planets, using gravity assist to propel them on their missions.

“Grandpa, do you remember the Grand Tour, when spacecraft used gravity assist to visit other worlds?”

“I sure do. Gravity assist. Those were the days. Swooping so close to Jupiter, you could feel the radiation killing your hair follicles. Only to be sling-shotted on to the next planet.”

“But why didn’t you just use a quantum drive to bend space time and appear at your destination?”

“Quantum drives! Those things ain’t natural. And neither is bending space-time. Give me a good old-fashioned chemical rocket any time.”

“Oh Grandpa.”

Visit Historic Mars. Image: NASA/JPL

Visit the Historic Sites of Mars recalls a time when space pioneers colonized and terraformed Mars.

“Grandpa, what was Mars like in the Early Days?”

“You mean before it was terraformed? Very tough times.”

“Because conditions were so difficult? And food was hard to grow?”

“No. Because of the protesters.”

“Protesters? On Mars?”

“Yup. Every time we found a good spot for a Bacterial Production Facility (BPF), it seemed like there was an expired old rover in the way. The protesters didn’t think we should move ’em. Part of our heritage.”

“So what did you do Grandpa?”

“We created a network of computers that everybody would stare at all day. After that, nobody noticed what we did anymore.”

“Oh Grandpa.”

Visit Beautiful Southern Enceladus. Image: NASA/JPL

Visit Beautiful Southern Enceladus invites vacationers to visit Saturn’s sixth largest moon to view the ice geysers there.

“Grandpa, did you ever visit Enceladus?”

“I sure did. A beautiful, haunting place.”

“Was it scary? With all of the ice geysers erupting unpredictably?”

“Oh no. I always knew when one was going to erupt.”

“What? How did you know?”

“My arthritis would flare up.”

“Oh Grandpa.”

Other Posters

NASA has a growing collection of other posters. You can see them here.

SpaceX has their own posters, which you can see here. They also have cool t-shirts with the same designs.

Challenges We’re Overcoming Following the Challenger Accident

The crew of Challenger, lost on January 28, 1986. Credit: NASA.

It was thirty years ago, on January 28, 1986, that space shuttle Challenger exploded 73 seconds into its flight, killing seven astronauts. This is a tough time of year in the history of human spaceflight: nineteen years earlier, on January 27, 1967, three astronauts died in a fire in the Apollo 1 command module. Then, on February 1, 2003, space shuttle Columbia disintegrated as it re-entered Earth's atmosphere, killing all seven crew members.

Remembering these events brings home the fact that even today, spaceflight remains far from routine. But over the years, what else have we learned from these tragedies?

I recently touched base with long-time NASA engineer Jerry Woodfill, whose name you may recall from our two series about Apollo 13 — 13 Things That Saved Apollo 13 and 13 More Things That Saved Apollo 13.

Christa McAuliffe. Credit: Challenger’s Lost Lessons

But Jerry was also featured in an article we did in 2008. A year earlier, he had come across a file of papers from 1985 that proposed how teacher Christa McAuliffe's eight lessons would be performed on orbit as part of the Challenger mission. Woodfill tracked down old videos, photographs, and other materials that had been tucked away in sadness and grief following the loss of Challenger, put together lesson plans, and gave them to the Challenger Center. The lessons are available on the Center's website.

Jerry and I discussed other “lessons” that may have been learned from the tragedies, and he had some interesting ideas about paradigm shifts that have occurred over the past 30-plus years. Here are a few “old” ideas that have changed or are in the process of changing:

Civilians, especially women, should not be launched on risky missions to space

The 2013 astronaut candidate class. Front row, left to right: Jessica Meir, Christina Hammock, Andrew Morgan. Back row (left to right), Anne McClain, Nicole Mann, Tyler (Nick) Hague, Josh Cassada and Victor Glover. Credit: NASA

We're certainly beyond the "women can't do what men can" mindset in our society (for the most part, anyway), and NASA's most recent astronaut class was 50% women (4 out of 8). It did, however, take NASA until 1978 to select its first female astronauts.

As for civilians being part of spaceflight… that's the whole reason the pioneers of spaceflight did what they did: to try to make flying to space as routine as flying in an airplane.

“While we're not quite there yet,” said Woodfill, “the prospect of civilian space travel is altogether more plausible. Now we have a maturing commercial space paradigm that wholly embraces the idea of everybody someday being eligible for a trip to space.”

Woodfill also mentioned that he used to hear some people say that a Challenger-like mission should never be attempted again.

“That is refuted by the Challenger’s Lost Lessons project in 2008 and how much these recovered lessons mean to the families of the crew,” he said, “and to the teachers who are now using these lessons in their classrooms.”
McAuliffe’s backup, Barbara Morgan, completed her own space shuttle flight in 2007 as a mission specialist, conducting educational activities during the mission.

Nothing good can come of such a tragedy.

“An obvious challenge to such a posture was a redesigned, safer, more robust Solid Rocket Booster system,” Woodfill said. “In fact, it led to the work-horse SRBs adapted and upgraded for the Space Launch System (SLS) which will likely take us to Mars.”

The tragedies have provided lessons to be learned. “Go-fever” has been tempered with a more analytical view of each mission and the potential risks it entails. Crew safety has become NASA’s top priority. All NASA workers are told to “speak up” if they see something that might compromise any mission.

Human spaceflight is too risky.

Dr. Robert H. Goddard (second from right) and his colleagues hold a liquid-propellant rocket in 1932 at their New Mexico workshop. Credit: NASA Goddard Space Flight Center

This debate will likely continue, but ask anyone associated with spaceflight and they’ll tell you they know the risks and that it’s all worth it for what it means for humanity. You can read Neil deGrasse Tyson’s ideas about this here.

National Geographic is currently running a show it produced called “Challenger Disaster: Lost Tapes,” which includes old footage shot at NASA following the accident. Shown are then-Vice President George H.W. Bush and astronaut-turned-Senator John Glenn, who met with NASA’s space shuttle launch team at Kennedy Space Center in Florida. Bush said he met with the families of the lost astronauts and relayed that they pleaded that the space shuttle program continue “forward full speed.”

Glenn said, in part, “We’ve had tremendous triumph. …. And with this program, we’ve succeeded. Really, if we’re honest about it, beyond our wildest dreams. I would have never thought we’d go this far without losing some people, at something where you’re going at 5 miles a second, with the heat of reentry and the complexity of a system where everything has to go right. Now, we have a tragedy that goes along with our triumphs. I guess that’s the story of mankind.”

As many have said, the future doesn’t belong to the faint of heart, and it is part of human nature to explore and push the boundaries. But there are always lessons to be learned and ideas to be challenged. That’s part of the story of humankind, too.

Find out more about the National Geographic special here.

What Is The Plum Pudding Atomic Model?

Diagram of J.J. Thomson's "Plum Pudding Model" of the atom. Credit: boundless.com

Ever since it was first proposed by Democritus in the 5th century BCE, the atomic model has gone through several refinements over the past few thousand years. From its humble beginnings as an inert, indivisible solid that interacts mechanically with other atoms, ongoing research and improved methods have led scientists to conclude that atoms are actually composed of even smaller particles that interact with each other electromagnetically.

This was the basis of the atomic theory devised by English physicist J.J. Thomson in the late 19th and early 20th centuries. As part of the revolution that was taking place at the time, Thomson proposed a model of the atom that consisted of more than one fundamental unit. Based on its appearance, which consisted of a “sea of uniform positive charge” with electrons distributed throughout, Thomson’s model came to be nicknamed the “Plum Pudding Model”.

Though defunct by modern standards, the Plum Pudding Model represents an important step in the development of atomic theory. Not only did it incorporate new discoveries, such as the existence of the electron, it also introduced the notion of the atom as a non-inert, divisible mass. Henceforth, scientists would understand that atoms were themselves composed of smaller units of matter and that all atoms interacted with each other through many different forces.

Atomic Theory to the 19th century:

The earliest known examples of atomic theory come from ancient Greece and India, where philosophers such as Democritus postulated that all matter was composed of tiny, indivisible and indestructible units. The term “atom” was coined in ancient Greece and gave rise to the school of thought known as “atomism”. However, this theory was more of a philosophical concept than a scientific one.

Various atoms and molecules as depicted in John Dalton’s A New System of Chemical Philosophy (1808). Credit: Public Domain

It was not until the 19th century that the theory of atoms became articulated as a scientific matter, with the first evidence-based experiments being conducted. For example, in the early 1800s, English scientist John Dalton used the concept of the atom to explain why chemical elements reacted in certain observable and predictable ways.

Dalton began with the question of why elements reacted in ratios of small whole numbers and concluded that these reactions occurred in whole-number multiples of discrete units – i.e. atoms. Through a series of experiments involving gases, Dalton went on to develop what is known as Dalton’s Atomic Theory. This theory expanded on the laws of conservation of mass and definite proportions – formulated by the end of the 18th century – and remains one of the cornerstones of modern physics and chemistry.

The theory comes down to five premises: elements, in their purest state, consist of particles called atoms; atoms of a specific element are all the same, down to the very last atom; atoms of different elements can be told apart by their atomic weights; atoms of elements unite to form chemical compounds; and atoms can neither be created nor destroyed in chemical reactions, only their grouping ever changes.

By the late 19th century, scientists also began to theorize that the atom was made up of more than one fundamental unit. However, most scientists ventured that this unit would be the size of the smallest known atom – hydrogen. By the end of the 19th century, the situation would change drastically.

Lateral view of a Crookes tube with a standing cross. Credit: Wikimedia Commons/D-Kuru

Thomson’s Experiments:

Sir Joseph John Thomson (better known as J.J. Thomson) was an English physicist and the Cavendish Professor of Physics at the University of Cambridge from 1884 onwards. During the 1880s and 1890s, his work largely revolved around developing mathematical models for chemical processes, the transformation of energy in mathematical and theoretical terms, and electromagnetism.

However, by the late 1890s, he began conducting experiments using a cathode ray tube known as the Crookes’ Tube. This consists of a sealed glass container with two electrodes that are separated by a vacuum. When voltage is applied across the electrodes, cathode rays are generated (which take the form of a glowing patch of gas that stretches to the far end of the tube).

Through experimentation, Thomson observed that these rays could be deflected by electric and magnetic fields. He concluded that rather than being composed of light, they were made up of negatively charged particles he called “corpuscles”. Upon measuring the mass-to-charge ratio of these particles, he found that they were more than 1,000 times lighter than a hydrogen atom (modern measurements put the figure at roughly 1,800).
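
A quick check with modern values (numbers Thomson did not yet have; shown only to illustrate the scale of the result) compares the charge-to-mass ratios of the electron and the hydrogen ion:

$$
\frac{e}{m_e} \approx 1.76\times10^{11}\ \mathrm{C\,kg^{-1}}, \qquad
\frac{e}{m_{\mathrm{H}^+}} \approx 9.6\times10^{7}\ \mathrm{C\,kg^{-1}}, \qquad
\frac{m_{\mathrm{H}^+}}{m_e} = \frac{e/m_e}{e/m_{\mathrm{H}^+}} \approx 1836.
$$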

This effectively disproved the notion that the hydrogen atom was the smallest unit of matter, and Thomson went further to suggest that atoms were divisible. To explain the overall charge of the atom, which consisted of both positive and negative charges, Thomson proposed a model whereby the negatively charged corpuscles were distributed in a uniform sea of positive charge.

A depiction of the atomic structure of the helium atom. Credit: Creative Commons

These corpuscles would later be named “electrons”, based on the theoretical particle predicted by Anglo-Irish physicist George Johnstone Stoney in 1874. And from this, the Plum Pudding Model was born, so named because it resembled the English dessert in which raisins (the “plums”) are scattered throughout a pudding. The concept was introduced to the world in the March 1904 edition of the UK’s Philosophical Magazine, to wide acclaim.

Problems With the Plum Pudding Model:

Unfortunately, subsequent experiments revealed a number of scientific problems with the model. For starters, there was the problem of demonstrating that the atom possessed a uniform positive background charge, which came to be known as the “Thomson Problem”. Five years later, the model would be disproved by Hans Geiger and Ernest Marsden, who conducted a series of experiments using alpha particles and gold foil.

In what would come to be known as the “gold foil experiment”, they measured the scattering pattern of the alpha particles with a fluorescent screen. If Thomson’s model were correct, the alpha particles would pass through the atomic structure of the foil unimpeded. However, they noted instead that while most shot straight through, some of them were scattered in various directions, with some going back in the direction of the source.

Geiger and Marsden concluded that the particles had encountered an electrostatic force far greater than that allowed for by Thomson’s model. Since alpha particles are just helium nuclei (which are positively charged) this implied that the positive charge in the atom was not widely dispersed, but concentrated in a tiny volume. In addition, the fact that those particles that were not deflected passed through unimpeded meant that these positive spaces were separated by vast gulfs of empty space.

The anticipated results of the Geiger-Marsden experiment (left), compared to the actual results (right). Credit: Wikimedia Commons/Kurzon


By 1911, physicist Ernest Rutherford had interpreted the Geiger-Marsden experiments and rejected Thomson’s model of the atom. Instead, he proposed a model in which the atom consisted mostly of empty space, with all its positive charge concentrated at its center in a very tiny volume surrounded by a cloud of electrons. This came to be known as the Rutherford Model of the atom.

Subsequent work by Antonius Van den Broek and Niels Bohr refined the model further. While Van den Broek suggested that the atomic number of an element corresponds to its nuclear charge, the latter proposed a Solar-System-like model of the atom, where a nucleus contains the atomic number of positive charge and is surrounded by an equal number of electrons in orbital shells (aka. the Bohr Model).

Though it would come to be discredited in just five years’ time, Thomson’s “Plum Pudding Model” would prove to be a crucial step in the development of the Standard Model of particle physics. His work in determining that atoms were divisible, as well as the existence of electromagnetic forces within the atom, would also prove to be a major influence on the field of quantum physics.

We have written many interesting articles on the subject of atomic theory here at Universe Today. For instance, here is How Many Atoms Are There In The Universe?, John Dalton’s Atomic Model, What Are The Parts Of The Atom?, and Bohr’s Atomic Model.

For more information, be sure to check out Physics World’s pages on 100 years of the electron: from discovery to application and Proton and neutron masses calculated from first principles.

Astronomy Cast also has some episodes on the subject: Episode 138: Quantum Mechanics, Episode 139: Energy Levels and Spectra, Episode 378: Rutherford and Atoms and Episode 392: The Standard Model – Intro.

What Is The Heliocentric Model Of The Universe?

Andreas Cellarius's illustration of the Copernican system, from the Harmonia Macrocosmica (1708). Credit: Public Domain

The Scientific Revolution, which took place in the 16th and 17th centuries, was a time of unprecedented learning and discovery. During this period, the foundations of modern science were laid, thanks to breakthroughs in the fields of physics, mathematics, chemistry, biology, and astronomy. And when it comes to astronomy, the most influential scholar was definitely Nicolaus Copernicus, the man credited with the creation of the Heliocentric model of the Universe.

Based on ongoing observations of the motions of the planets, as well as previous theories from classical antiquity and the Islamic World, Copernicus proposed a model of the Universe where the Earth, the planets and the stars all revolved around the Sun. In so doing, he resolved the mathematical problems and inconsistencies arising out of the classic geocentric model and laid the foundations for modern astronomy.

While Copernicus was not the first to propose a model of the Solar System in which the Earth and planets revolved around the Sun, his model of a heliocentric universe was both novel and timely. For one, it came at a time when European astronomers were struggling to resolve the mathematical and observational problems that arose out of the then-accepted Ptolemaic model of the Universe, a geocentric model proposed in the 2nd century CE.

In addition, Copernicus’ model was the first astronomical system that offered a complete and detailed account of how the Universe worked. Not only did his model resolve issues arising out of the Ptolemaic system, it offered a simplified view of the universe that did away with the complicated mathematical devices needed for the geocentric model to work. And with time, the model gained influential proponents who contributed to it becoming the accepted convention of astronomy.

The Geocentric View of the Solar System
An illustration of the Ptolemaic geocentric system by Portuguese cosmographer and cartographer Bartolomeu Velho, 1568. Credit: Bibliothèque Nationale, Paris

The Ptolemaic (Geocentric) Model:

The geocentric model, in which planet Earth is the center of the Universe and is circled by the Sun and all the planets, had been the accepted cosmological model since ancient times. By late antiquity, this model had come to be formalized by ancient Greek and Roman astronomers, such as Aristotle (384 – 322 BCE) – whose theories on physics became the basis for the motion of the planets – and Ptolemy (ca. 100 – ca. 170 CE), who proposed the mathematical solutions.

The geocentric model essentially came down to two common observations. First of all, to ancient astronomers, the stars, the Sun, and the planets appeared to revolve around the Earth on a daily basis. Second, from the perspective of the Earth-bound observer, the Earth did not appear to move, making it a fixed point in space.

The belief that the Earth was spherical, which became an accepted fact by the 3rd century BCE, was incorporated into this system. As such, by the time of Aristotle, the geocentric model of the universe became one where the Earth, Sun and all the planets were spheres, and where the Sun, planets and stars all moved in perfect circular motions.

However, it was not until Egyptian-Greek astronomer Claudius Ptolemaeus (aka. Ptolemy) released his treatise Almagest in the 2nd century CE that the details became standardized. Drawing on centuries of astronomical tradition, from the Babylonians down to his own time, Ptolemy argued that the Earth was in the center of the universe and that the stars were all at a modest distance from the center of the universe.

The planet Mars, undergoing “retrograde motion” – a phenomenon where it appears to be moving backwards in the sky – in late 2009 and early 2010. About every two years, the Earth passes Mars as both orbit the Sun, which is when this apparent reversal occurs. Credit: NASA

Each planet in this system is also moved by a system of two spheres – a deferent and an epicycle. The deferent is a circle whose center point is removed from the Earth, which was used to account for the differences in the lengths of the seasons. The epicycle is embedded in the deferent sphere, acting as a sort of “wheel within a wheel”. The purpose of the epicycle was to account for retrograde motion, where planets in the sky appear to be slowing down, moving backwards, and then moving forward again.
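
To make the geometry concrete, here is a minimal, illustrative sketch in Python (the radii and speeds below are arbitrary, not Ptolemy’s actual parameters) showing how a deferent-plus-epicycle construction produces retrograde motion as seen from a central Earth:

```python
import numpy as np

# A planet carried on an epicycle whose center rides around a deferent centered on Earth.
# When the epicycle turns fast enough relative to the deferent, the planet's apparent
# longitude as seen from Earth periodically reverses direction: retrograde motion.
R_DEFERENT = 1.0    # radius of the deferent (arbitrary units)
R_EPICYCLE = 0.4    # radius of the epicycle
W_DEFERENT = 1.0    # angular speed of the epicycle's center (radians per time unit)
W_EPICYCLE = 6.0    # angular speed of the planet around the epicycle

t = np.linspace(0.0, 2.0 * np.pi, 2000)
x = R_DEFERENT * np.cos(W_DEFERENT * t) + R_EPICYCLE * np.cos(W_EPICYCLE * t)
y = R_DEFERENT * np.sin(W_DEFERENT * t) + R_EPICYCLE * np.sin(W_EPICYCLE * t)

# Apparent ecliptic longitude of the planet as seen from Earth (at the origin).
longitude = np.unwrap(np.arctan2(y, x))

# Whenever the longitude decreases, the planet appears to move "backwards" in the sky.
retrograde = np.diff(longitude) < 0
print(f"Fraction of the cycle spent in retrograde: {retrograde.mean():.2f}")
```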

Unfortunately, these explanations did not account for all the observed behaviors of the planets. Most noticeably, the size of a planet’s retrograde loop (especially that of Mars) was sometimes smaller, and sometimes larger, than expected. To alleviate the problem, Ptolemy developed the equant – a point near the center of a planet’s orbit about which the center of the planet’s epicycle appears to move at a uniform angular speed.

To an observer standing at this point, a planet’s epicycle would always appear to move at uniform speed, whereas it would appear to be moving at non-uniform speed from all other locations.

While this system remained the accepted cosmological model within the Roman, Medieval European and Islamic worlds for over a thousand years, it was unwieldy by modern standards.

However, it did manage to predict planetary motions with a fair degree of accuracy, and was used to prepare astrological and astronomical charts for the next 1500 years. By the 16th century, this model was gradually superseded by the heliocentric model of the universe, as espoused by Copernicus, and then Galileo and Kepler.

Picture of George of Trebizond’s Latin translation of Almagest. Credit: Public Domain

The Copernican (Heliocentric) Model:

In the 16th century, Nicolaus Copernicus began devising his version of the heliocentric model. Like others before him, Copernicus built on the work of the Greek astronomer Aristarchus, as well as paying homage to the Maragha school and several notable philosophers from the Islamic world (see below). By the early 16th century, Copernicus had summarized his ideas in a short treatise titled Commentariolus (“Little Commentary”).

By 1514, Copernicus began circulating copies amongst his friends, many of whom were fellow astronomers and scholars. This forty-page manuscript described his ideas about the heliocentric hypothesis, which was based on seven general principles. These principles stated that:

  • Celestial bodies do not all revolve around a single point
  • The center of Earth is the center of the lunar sphere—the orbit of the moon around Earth
  • All the spheres rotate around the Sun, which is near the center of the Universe
  • The distance between Earth and the Sun is an insignificant fraction of the distance from Earth and Sun to the stars, so parallax is not observed in the stars
  • The stars are immovable – their apparent daily motion is caused by the daily rotation of Earth
  • Earth is moved in a sphere around the Sun, causing the apparent annual migration of the Sun. Earth has more than one motion
  • Earth’s orbital motion around the Sun causes the seeming reverse in direction of the motions of the planets

Thereafter he continued gathering data for a more detailed work, and by 1532, he had come close to completing the manuscript of his magnum opus – De revolutionibus orbium coelestium (On the Revolutions of the Heavenly Spheres). In it, he advanced his seven major arguments, but in more detailed form and with detailed computations to back them up.

A comparison of the geocentric and heliocentric models of the universe. Credit: history.ucsb.edu

By placing the orbits of Mercury and Venus between the  Earth and the Sun, Copernicus was able to account for changes in their appearances. In short, when they are on the far side of the Sun, relative to Earth, they appear smaller but full. When they are on the same side of the Sun as the Earth, they appear larger and “horned” (crescent-shaped).

It also explained the retrograde motion of planets like Mars and Jupiter by showing that astronomers on Earth do not have a fixed frame of reference but a moving one. This further explained how Mars and Jupiter could appear significantly larger at certain times than at others. In essence, they are significantly closer to Earth when at opposition than when they are at conjunction.
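
A rough calculation shows the size of the effect for Mars, whose average distance from the Sun is about 1.52 AU (treating both orbits as circles and ignoring their eccentricities):

$$
d_{\mathrm{opposition}} \approx 1.52 - 1.00 = 0.52\ \mathrm{AU}, \qquad
d_{\mathrm{conjunction}} \approx 1.52 + 1.00 = 2.52\ \mathrm{AU},
$$

so Mars can appear nearly five times larger in apparent diameter near opposition than near conjunction (2.52/0.52 ≈ 4.8).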

However, due to fears that the publication of his theories would lead to condemnation from the church (as well as, perhaps, worries that his theory presented some scientific flaws) he withheld his research until a year before he died. It was only in 1542, when he was near death, that he sent his treatise to Nuremberg to be published.

Historical Antecedents:

As already noted, Copernicus was not the first to advocate a heliocentric view of the Universe, and his model was based on the work of several previous astronomers. The first recorded examples of this are traced to classical antiquity, when Aristarchus of Samos (ca. 310 – 230 BCE) published writings whose ideas survive mainly through the references made to them by his contemporaries (such as Archimedes).

Aristarchus’s 3rd century BC calculations on the relative sizes of, from left, the Sun, Earth and Moon. Credit: Wikipedia Commons

In his treatise The Sand Reckoner, Archimedes described another work by Aristarchus in which he advanced an alternative hypothesis of the heliocentric model. As he explained:

Now you are aware that ‘universe’ is the name given by most astronomers to the sphere whose center is the center of the earth and whose radius is equal to the straight line between the center of the sun and the center of the earth. This is the common account… as you have heard from astronomers. But Aristarchus of Samos brought out a book consisting of some hypotheses, in which the premises lead to the result that the universe is many times greater than that now so called. His hypotheses are that the fixed stars and the sun remain unmoved, that the earth revolves about the sun in the circumference of a circle, the sun lying in the middle of the orbit, and that the sphere of the fixed stars, situated about the same center as the sun, is so great that the circle in which he supposes the earth to revolve bears such a proportion to the distance of the fixed stars as the center of the sphere bears to its surface.

This gave rise to the notion that there should be an observable parallax with the “fixed stars” (i.e. an observed movement of the stars relative to each other as the Earth moved around the Sun). According to Archimedes, Aristarchus claimed that the stars were much farther away than commonly believed, and that this was the reason no parallax was discernible.
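
Modern numbers bear this out. The annual parallax of a star is roughly the angle subtended by the Earth-Sun distance at the star’s distance; even for the nearest star (about 270,000 AU away) this works out to

$$
p \approx \frac{1\ \mathrm{AU}}{d} \approx \frac{1}{2.7\times10^{5}}\ \mathrm{rad} \approx 0.77'' ,
$$

nearly a hundred times smaller than the roughly one arcminute the unaided eye can resolve, which is why stellar parallax was not actually measured until 1838.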

The only other philosopher from antiquity whose writings on heliocentrism have survived is Seleucus of Seleucia (ca. 190 – 150 BCE). A Hellenistic astronomer who lived in the Near-Eastern Seleucid empire, Seleucus was a proponent of the heliocentric system of Aristarchus, and is said to have proved the heliocentric theory.

According to contemporary sources, Seleucus may have done this by determining the constants of the geocentric model and applying them to a heliocentric theory, as well as computing planetary positions (possibly using trigonometric methods). Alternatively, his explanation may have involved the phenomenon of tides, which he supposedly theorized to be related to the influence of the Moon and the revolution of the Earth around the Earth-Moon ‘center of mass’.

In the 5th century CE, Roman philosopher Martianus Capella of Carthage expressed an opinion that the planets Venus and Mercury revolved around the Sun, as a way of explaining the discrepancies in their appearances. Capella’s model was discussed in the Early Middle Ages by various anonymous 9th-century commentators, and Copernicus mentions him as an influence on his own work.

During the Late Middle Ages, Bishop Nicole Oresme (ca. 1320-1325 to 1382 CE) discussed the possibility that the Earth rotated on its axis. In his 1440 treatise De Docta Ignorantia (On Learned Ignorance) Cardinal Nicholas of Cusa (1401 – 1464 CE) asked whether there was any reason to assert that the Sun (or any other point) was the center of the universe.

Indian astronomers and cosmologists also hinted at the possibility of a heliocentric universe during late antiquity and the Middle Ages. In 499 CE, Indian astronomer Aryabhata published his magnum opus Aryabhatiya, in which he proposed a model where the Earth was spinning on its axis and the periods of the planets were given with respect to the Sun. He also accurately calculated the periods of the planets, times of the solar and lunar eclipses, and the motion of the Moon.

Ibn al-Shatir’s model for the appearances of Mercury, showing the multiplication of epicycles using the Tusi couple, thus eliminating the Ptolemaic eccentrics and equant. Credit: Wikipedia Commons

In the 15th century, Nilakantha Somayaji published the Aryabhatiyabhasya, which was a commentary on Aryabhata’s Aryabhatiya. In it, he developed a computational system for a partially heliocentric planetary model, in which the planets orbit the Sun, which in turn orbits the Earth. In the Tantrasangraha (1500), he revised the mathematics of his planetary system further and incorporated the Earth’s rotation on its axis.

Also, the heliocentric model of the universe had proponents in the medieval Islamic world, many of whom would go on to inspire Copernicus. Prior to the 10th century, the Ptolemaic model of the universe was the accepted standard among astronomers in the West and Central Asia. However, in time, manuscripts began to appear that questioned several of its precepts.

For instance, the 10th-century Iranian astronomer Abu Sa’id al-Sijzi contradicted the Ptolemaic model by asserting that the Earth revolved on its axis, thus explaining the apparent diurnal cycle and the rotation of the stars relative to Earth. In the early 11th century, Egyptian-Arab astronomer Alhazen wrote a critique entitled Doubts on Ptolemy (ca. 1028) in which he criticized many aspects of his model.

Entrance to the observatory of Ulug’Beg in Samarkand (Uzbekistan). Credit: Wikipedia Commons/Sigismund von Dobschütz

Around the same time, Iranian philosopher Abu Rayhan Biruni (973 – 1048) discussed the possibility of Earth rotating about its own axis and around the Sun – though he considered this a philosophical issue and not a mathematical one. At the Maragha and Ulugh Beg (aka. Samarkand) observatories, the Earth’s rotation was discussed by several generations of astronomers between the 13th and 15th centuries, and many of the arguments and evidence put forward resembled those used by Copernicus.

Impact of the Heliocentric Model:

Despite his fears about his arguments producing scorn and controversy, the publication of Copernicus’ theories resulted in only mild condemnation from religious authorities. Over time, many religious scholars tried to argue against his model. But within a few generations, Copernicus’ theory became more widespread and accepted, and it gained many influential defenders in the meantime.

These included Galileo Galilei (1564-1642), whose investigations of the heavens using the telescope allowed him to resolve what were seen as flaws in the heliocentric model, as well as to discover aspects of the heavens that supported heliocentrism. For example, Galileo discovered moons orbiting Jupiter, sunspots, and imperfections on the Moon’s surface – all of which helped to undermine the notion that the heavenly bodies were perfect orbs rather than worlds similar to Earth. While Galileo’s advocacy of Copernicus’ theories resulted in his house arrest, others soon followed.

German mathematician and astronomer Johannes Kepler (1571-1630) also helped to refine the heliocentric model with his introduction of elliptical orbits. Prior to this, the heliocentric model still made use of circular orbits, which did not explain why planets orbited the Sun at different speeds at different times. By showing how the planets sped up at certain points in their orbits and slowed down at others, Kepler resolved this.

In addition, Copernicus’ theory about the Earth being capable of motion would go on to inspire a rethinking of the entire field of physics. Whereas previous ideas of motion depended on an outside force to instigate and maintain it (i.e. wind pushing a sail), Copernicus’ theories helped to inspire the concepts of gravity and inertia. These ideas would be articulated by Sir Isaac Newton, whose Principia formed the basis of modern physics and astronomy.

Although its progress was slow, the heliocentric model eventually replaced the geocentric model. In the end, the impact of its introduction was nothing short of revolutionary. Henceforth, humanity’s understanding of the universe and our place in it would be forever changed.

We have written many interesting articles on the heliocentric model here at Universe Today. For starters, here’s Galileo Returns to the Vatican and The Earth Goes Around the Sun, Who Was Nicolaus Copernicus? and What is the Difference Between the Geocentric and Heliocentric Models?

For more information on heliocentrism, take a look at these articles from NASA on Copernicus or the center of the galaxy.

Astronomy Cast also has episodes on the subject: Episode 77: Where is the Center of the Universe and Episode 302: Planetary Motion in the Sky.

Timeline of the Universe, From the Big Bang to the Death of Our Sun

A teeny, tiny, minuscule portion of Martin Vargic’s Timeline of the Universe.

Don’t know much about history? How about the future? A new infographic by graphic designer Martin Vargic portrays both past and forthcoming events in our Universe, from the Big Bang to the death of our Sun. The graphic is color-coded and shows “significant events in cosmic and natural history.” It also illustrates how briefly humanity has been part of the scene.

Fun future events include the time when Earth’s day will become 25 hours long (Earth’s rotation is slowing down), and the amazingly distant moment when the Solar System finally completes one orbit around the galactic core.
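
As a rough sanity check on the first of these (assuming a long-term tidal slowdown of roughly 1.8 milliseconds per century; the infographic’s own placement may differ):

```python
# Order-of-magnitude estimate only; the real slowdown rate fluctuates over geologic time.
ms_per_century = 1.8                   # approximate lengthening of the day per century
extra_ms_for_25h_day = 60 * 60 * 1000  # one additional hour, in milliseconds

centuries = extra_ms_for_25h_day / ms_per_century
print(f"~{centuries / 10_000:.0f} million years until a 25-hour day")  # roughly 200

# For comparison, one orbit of the Sun around the galactic core (a "galactic year")
# takes roughly 225-250 million years.
```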

The full infographic is below, and be prepared to give your scroll wheel a workout. This thing is huge, but very comprehensive for covering about 23.8 billion years!
Continue reading “Timeline of the Universe, From the Big Bang to the Death of Our Sun”

The Solar Heliospheric Observatory at 20


Flashback to 1995: Clinton was in the White House, Star Trek Voyager premiered, we all carried pagers in the pre-mobile phone era, and Windows 95 and the Internet itself were shiny and new to most of us. It was also on this day in late 1995 that our premier eyes on the Sun—the SOlar and Heliospheric Observatory (SOHO)—was launched. A joint mission between NASA and the European Space Agency, SOHO lit up the pre-dawn sky over the Florida Space Coast as it headed spaceward atop an Atlas IIAS rocket at 3:08 AM EST from Launch Complex 36B at Cape Canaveral Air Force Station.

Envisioning SOHO

SOHO on Earth

There aren’t a whole lot of 20th century spacecraft still in operation; SOHO joins the ranks of Hubble and the twin Voyager spacecraft as platforms from another era that have long exceeded their operational lives. Seriously, think back to what YOU were doing in 1995, and what sort of technology graced your desktop. Heck, just thinking of how many iterations of mobile phones spanned the last 20 years is a bit mind-bending. A generation of solar astronomers has grown up with SOHO, and the space-based observatory has consistently come through for researchers and scientists, delivering more bang for the buck.

“SOHO has been truly extraordinary and revolutionary in countless ways,” says  astrophysicist Karl Battams at the Naval Research Laboratory in Washington D.C. “SOHO has completely changed our way of thinking about the Sun, solar active regions, eruptive events, and so much more. I honestly can’t think of a more broadly influential space mission than SOHO.”

SOHO has monitored the Sun now through the whole of solar cycle #23 and well into the ongoing solar cycle #24. SOHO is a veritable Swiss Army Knife for solar astrophysics, not only monitoring the Sun across optical and ultraviolet wavelengths, but also employing the Michelson Doppler Imager to record magnetogram data and the Large Angle and Spectrometric Coronagraph (LASCO), which can create an artificial solar eclipse and monitor the pearly white corona of the Sun.

Peering into the solar interior.

SOHO observes the Sun from its perch about one million miles sunward of Earth at the Sun-Earth L1 point. It actually circles this point in space in what is known as a Lissajous, or ‘halo’, orbit.
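
That “one million miles” figure follows from a standard approximation for the distance to the Sun-Earth L1 point, roughly the radius of Earth’s gravitational sphere of influence along the Sun-Earth line:

$$
r_{L1} \approx a\left(\frac{M_\oplus}{3M_\odot}\right)^{1/3}
       \approx 1.496\times10^{8}\ \mathrm{km}\times\left(\frac{3.0\times10^{-6}}{3}\right)^{1/3}
       \approx 1.5\times10^{6}\ \mathrm{km} \approx 9.3\times10^{5}\ \mathrm{miles}.
$$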

SOHO has revolutionized solar physics and the way we perceive our host star. We nearly lost SOHO early on in its career in 1998, when gyroscope failures caused the spacecraft to lose a lock on the Sun, sending it into a lazy one revolution per minute spin. Quick thinking by engineers led to SOHO using its reaction wheels as a virtual gyroscope, the first spacecraft to do so. SOHO has used this ad hoc method to point sunward ever since. SOHO was also on hand to document the 2003 Halloween flares, the demise of comet ISON on U.S. Thanksgiving Day 2013, and the deep and strangely profound solar minimum that marked the transition from solar cycle 23 to 24.

What was your favorite SOHO moment?

A massive sunspot witnessed by SOHO in 2000, compared to the Earth.

SOHO is also a champion comet hunter, recently topping an amazing 3000 comets and counting. Though it wasn’t designed to hunt for sungrazers, SOHO routinely sees ’em via its LASCO C2 and C3 cameras, as well as planets and background stars near the Sun. The effort to hunt for sungrazing comets crossing the field of view of SOHO’s LASCO C3 and C2 cameras represents one of the earliest crowd-sourced efforts to do volunteer science online. SOHO has discovered enough comets to characterize and classify the Kreutz family of sungrazers, and much of this effort is volunteer-based. SOHO grew up with the internet, and the images and data made publicly available are an invaluable resource that we now often take for granted.

A ‘neat’ image…  Comet NEAT photobombs the view of SOHO’s LASCO C3 camera.

NASA/ESA has extended SOHO’s current mission out to the end of 2016. With any luck, SOHO will complete solar cycle 24, and take us into cycle 25 to boot.

“Right now, it (SOHO) is operating in a minimally funded mode, with the bulk of its telemetry dedicated solely to the LASCO coronagraph,” Battams told Universe Today. “Many of its instruments have now been superseded by instruments on other missions. As of today it remains healthy, and I think that’s a testament to the amazing collaboration between ESA and NASA. Together, they’ve kept a spacecraft designed for a two-year mission operating for twenty years.”

Today, missions such as the Solar Dynamics Observatory, Hinode, and Proba-2 have joined SOHO in watching the Sun around the clock. The solar occulting disk capabilities of SOHO’s LASCO C2 and C3 camera remains unique, though ESA’s Proba-3 mission launching in 2018 will feature a free-flying solar occulting disk.

Happy 20th SOHO… you’ve taught us lots about our often tempestuous host star.

-It’s also not too late to vote for your favorite SOHO image.

Spotting Asterix: France Marks 50 Years of Space Exploration


Author’s note: In the wake of the November 13th terrorist attacks, the French Space Agency CNES canceled the celebration of the 50th anniversary of the launch of Asterix. This post commemorates the launch of France’s first satellite 50 years ago this week, and pays a small tribute to the noblest of human endeavors, namely the exploration of space and the pioneering spirit of humanity exemplified by a heroic nation.

A milestone in space flight occurred 50 years ago tomorrow, when France became the sixth nation—behind the U.S.S.R., the United States, Canada, the United Kingdom and Italy—to field a satellite. The A1 mission, renamed Asterix after a popular cartoon character, launched from a remote desert base in Algeria a few hours after dawn at 9:52 UT on November 26th.

Though France was the sixth nation in space, it was the third—following the Soviet Union and the United States—to launch a satellite atop its very own rocket: the three-stage Diamant-A.

The launch of Asterix into the blue desert skies over Algeria.

The satellite launch was intended mainly to test the ability of the French-built rocket, which flew 11 more times before its retirement in 1975. Asterix did carry a signal transmitter, and was due to carry out ionospheric measurements during its short battery-powered life span. With a high elliptical orbit, Asterix won’t reenter the Earth’s atmosphere for several centuries to come.

The launch occurred from the remote desert air base of Hammaguir, located 31 degrees north of the equator in western Algeria. Then as today, the site is a forlorn and austere location with very few creature comforts, though we can personally attest from our deployment to a similar French Air Base in Djibouti that the French military does serve wine in their mess hall…

The tense control room during the launch of Asterix.

The French space program started in 1961 under president Charles de Gaulle and centered around the construction and use of the Diamant rocket. Three variants were built, including the one used to place Asterix in orbit. One of the stranger tales of the early space age involved the first—and thus far only—sub-orbital launch of a cat into space from the same Algerian site in 1963, though Iran recently made a vague statement that it would do the same in 2013.

An aerial shot of Hammaguir Air Base in Algeria from the early 1960s.

Contact with Asterix was lost due to a damaged satellite antenna shortly after launch. Founded in 1961, the French space agency CNES (The Centre National d’Etudes Spatiales, or National Centre for Space Studies) now partners with NASA and the European Space Agency on missions including micro-gravity studies on the International Space Station, Rosetta’s historic exploration of comet 67P Churyumov-Gerasimenko and more. And although the Hammaguir space facility in Algeria is no longer in use, CNES operates out of the Kourou Space Center in French Guiana and the Toulouse Space Center in southern France today.

A stamp series commemorating the Diamant rocket and Asterix.

Tracking Asterix

Though inoperative, Asterix still orbits the Earth once every 107 minutes in an elliptical low Earth orbit. Asterix ranges from a perigee of 523 kilometers to an apogee of 1,697 kilometers. In an orbit inclined 34 degrees relative to the Earth’s equator, Asterix isn’t expected to reenter for several centuries.
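
As a quick cross-check of those figures, Kepler’s third law applied to the quoted perigee and apogee altitudes (a two-body approximation with a spherical Earth) reproduces the roughly 107-minute period:

```python
import math

MU_EARTH = 398_600.4418   # km^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6_371.0         # km, mean Earth radius

perigee_alt = 523.0       # km (quoted above)
apogee_alt = 1_697.0      # km (quoted above)

semi_major_axis = R_EARTH + (perigee_alt + apogee_alt) / 2.0
period_s = 2.0 * math.pi * math.sqrt(semi_major_axis**3 / MU_EARTH)
print(f"Orbital period: {period_s / 60.0:.1f} minutes")   # ~107 minutes
```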

A replica of Asterix hanging in the Paris Air and Space Museum. Image credit: Pine/Wikimedia Commons

A 42-kilogram satellite approximately a meter across, Asterix is visible worldwide from about 40 degrees north to 40 degrees south latitude. Essentially a binocular object, Asterix can nonetheless be seen from your backyard if you know exactly where and when to look for it in the sky. Asterix will appear brightest on a perigee pass directly overhead.

Asterix’s NORAD ID satellite catalog number is 01778/COSPAR ID 1965-096A.

The orbital trace of Asterix. Image credit: Orbitron

When it comes to hunting for binocular satellites, you need to know exactly where the satellite will be in the sky and at what time. We use Heavens-Above to discern exactly when a given satellite will pass a bright star, then simply watch at the appointed time with binoculars. We also run WWV radio in the background for a precise audio time hack. This allows us to keep our eyes continuously on the sky. This simple method is similar to that used by Project Moonwatch volunteers to track and record satellites starting in the late 1950s.
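
For readers who would rather script the prediction themselves, here is a rough, illustrative sketch using the open-source Skyfield library; the TLE source URL, the observer’s coordinates and the altitude cutoff are all assumptions to adapt, and a pass is only actually visible when the satellite is sunlit and your sky is dark:

```python
from datetime import timedelta
from skyfield.api import load, wgs84

# Pull current orbital elements for Asterix (NORAD catalog number 1778).
url = 'https://celestrak.org/NORAD/elements/gp.php?CATNR=1778&FORMAT=TLE'
asterix = load.tle_file(url)[0]

ts = load.timescale()
observer = wgs84.latlon(43.6, 1.44)     # example observer: Toulouse, France
t0 = ts.now()
t1 = ts.from_datetime(t0.utc_datetime() + timedelta(days=7))

# find_events returns matching arrays of times and event codes:
# 0 = rises above the altitude cutoff, 1 = culminates, 2 = sets below it.
times, events = asterix.find_events(observer, t0, t1, altitude_degrees=30.0)
for t, event in zip(times, events):
    label = ('rise above 30 deg', 'culminate', 'set below 30 deg')[event]
    print(t.utc_strftime('%Y-%m-%d %H:%M:%S'), label)
```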

The evolution of the Diamant rocket.

Other satellite challenges from the early Space Age include Alouette-1 (Canada’s first satellite), Prospero (UK’s first and only indigenous satellite) and the oldest of them all, the first three Vanguard satellites launched by the United States.

Don’t miss a chance to see this living relic of the early space age, still in orbit. Happy 50th to the CNES space agency: may your spirit of space exploration continue to soar and inspire us all.

Who was Stephen Hawking?

In honor of Dr. Stephen Hawking, the COSMOS center will be creating the most detailed 3D mapping effort of the Universe to date. Credit: BBC, Illus.: T.Reyes

When we think of major figures in the history of science, many names come to mind. Einstein, Newton, Kepler, Galileo – all great theorists and thinkers who left an indelible mark during their lifetime. In many cases, the full extent of their contributions would not be appreciated until after their death. But those of us who lived in his time were fortunate to have a great scientist among us who made considerable contributions – Dr. Stephen Hawking.

Considered by many to be the “modern Einstein”, Hawking’s work in cosmology and theoretical physics was unmatched among his contemporaries. In addition to his work on gravitational singularities and quantum mechanics, he was also responsible for discovering that black holes emit radiation. On top of that, Hawking was a cultural icon, endorsing countless causes, appearing on many television shows as himself, and penning several books that have made science accessible to a wider audience.

Early Life:

Hawking was born on January 8th, 1942 (the 300th anniversary of the death of Galileo) in Oxford, England. His parents, Frank and Isobel Hawking, were both students at Oxford University, where Frank studied medicine and Isobel studied philosophy, politics and economics. The couple originally lived in Highgate, a suburb of London, but moved to Oxford to get away from the bombings during World War II and give birth to their child in safety. The two would go on to have two daughters, Philippa and Mary, and one adopted son, Edward.

The family moved again in 1950, this time to St. Albans, Hertfordshire, because Stephen’s father became the head of parasitology at the National Institute for Medical Research (now part of the Francis Crick Institute). While there, the family gained the reputation for being highly intelligent, if somewhat eccentric. They lived frugally, living in a large, cluttered and poorly maintained house, driving around in a converted taxicab, and constantly reading (even at the dinner table).

Stephen Hawking as a young man. Credit: gazettereview.com

Education:

Hawking began his schooling at the Byron House School, where he experienced difficulty in learning to read (which he later blamed on the school’s “progressive methods”). While in St. Albans, the eight-year-old Hawking attended St. Albans High School for Girls for a few months (which was permitted at the time for younger boys). In September of 1952, he was enrolled at Radlett School for a year, but would remain at St. Albans for the majority of his teen years due to the family’s financial constraints.

While there, Hawking made many friends, with whom he played board games, manufactured fireworks, built model airplanes and boats, and had long discussions on subjects ranging from religion to extrasensory perception. From 1958, and with the help of the mathematics teacher Dikran Tahta, Hawking and his friends built a computer from clock parts, an old telephone switchboard and other recycled components.

Though he was not initially academically successful, Hawking showed considerable aptitude for scientific subjects and was nicknamed “Einstein”. Inspired by his teacher Tahta, he decided to study mathematics at university. His father had hoped that his son would attend Oxford and study medicine, but since it was not possible to study math there at the time, Hawking chose to study physics and chemistry.

Stephen Hawking (holding the handkerchief) and the Oxford Boat Club. Credit: focusfeatures.com

In 1959, when he was just 17, Hawking took the Oxford entrance exam and was awarded a scholarship. For the first 18 months, he was bored and lonely, owing to the fact that he was younger than his peers and found the work “ridiculously easy”. During his second and third year, Hawking made greater attempts to bond with his peers and developed into a popular student, joining the Oxford Boat Club and developing an interest in classical music and science fiction.

When it came time for his final exam, Hawking’s performance was lackluster. Instead of answering all the questions, he chose to focus on theoretical physics questions and avoided any that required factual knowledge. The result was a score that put him on the borderline between first- and second-class honors. Needing first-class honors for his planned graduate studies in cosmology at Cambridge, he was forced to take a viva (oral exam).

Concerned that he was viewed as a lazy and difficult student, Hawking described his future plans as follows during the viva: “If you award me a First, I will go to Cambridge. If I receive a Second, I shall stay in Oxford, so I expect you will give me a First.” However, Hawking was held in higher regard than he believed, and received a first-class BA (Hons.) degree, thus allowing him to pursue graduate work at Cambridge University in October 1962.

Hawking on graduation day in 1962. Credit: telegraph.co.uk

Hawking experienced some initial difficulty during his first year of doctoral studies. He found his background in mathematics inadequate for work in general relativity and cosmology, and was assigned Dennis William Sciama (one of the founders of modern cosmology) as his supervisor, rather than noted astronomer Fred Hoyle (whom he had been hoping for).

In addition, it was during his graduate studies that Hawking was diagnosed with early-onset amyotrophic lateral sclerosis (ALS). During his final year at Oxford, he had experienced an accident where he fell down a flight of stairs, and also began experiencing difficulties when rowing and incidents of slurred speech. When the diagnosis came in 1963, he fell into a state of depression and felt there was little point in continuing his studies.

However, his outlook soon changed, as the disease progressed more slowly than the doctors had predicted – initially, he was given two years to live. Then, with the encouragement of Sciama, he returned to his work, and quickly gained a reputation for brilliance and brashness. This was demonstrated when he publicly challenged the work of noted astronomer Fred Hoyle, who was famous for rejecting the Big Bang theory, at a lecture in June of 1964.

Stephen Hawking and Jane Wilde on their wedding day, July 14, 1966. Credit: telegraph.co.uk

When Hawking began his graduate studies, there was much debate in the physics community about the prevailing theories of the creation of the universe: the Big Bang and the Steady State theories. In the former, the universe was conceived in a gigantic explosion, in which all matter in the known universe was created. In the latter, new matter is constantly created as the universe expands. Hawking quickly joined the debate.

Hawking became inspired by Roger Penrose’s theorem that a spacetime singularity – a point where the quantities used to measure the gravitational field of a celestial body become infinite – exists at the center of a black hole. Hawking applied the same thinking to the entire universe, and wrote his 1965 thesis on the topic. He went on to receive a research fellowship at Gonville and Caius College and obtained his PhD degree in cosmology in 1966.

It was also during this time that Hawking met his first wife, Jane Wilde. Though he had met her shortly before his diagnosis with ALS, their relationship continued to grow as he returned to complete his studies. The two became engaged in October of 1964 and were married on July 14th, 1966. Hawking would later say that his relationship with Wilde gave him “something to live for”.

Scientific Achievements:

In his doctoral thesis, which drew heavily on Penrose’s work, Hawking extended the existence of singularities to the notion that the universe might have started as a singularity. His essay – entitled “Singularities and the Geometry of Space-Time” – was the runner-up in the 1968 Gravity Research Foundation competition and shared top honors with one by Penrose to win Cambridge’s prestigious Adams Prize.

In 1970, Hawking became part of the Sherman Fairchild Distinguished Scholars visiting professorship program, which allowed him to lecture at the California Institute of Technology (Caltech). It was during this time that he and Penrose published a proof that incorporated the theories of General Relativity and the physical cosmology developed by Alexander Friedmann.

Based on Einstein’s equations, Friedmann asserted that the universe was dynamic and changed in size over time. He also asserted that space-time had a geometry, which is determined by its overall mass/energy density. If that density is equal to the critical density, the universe has zero curvature (a flat configuration); if it is less than critical, the universe has negative curvature (an open configuration); and if it is greater than critical, the universe has positive curvature (a closed configuration).
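
In modern notation (a standard textbook statement, given here for reference rather than taken from Hawking’s own papers), the Friedmann equation and its critical density read:

$$
H^{2} = \left(\frac{\dot a}{a}\right)^{2} = \frac{8\pi G}{3}\rho - \frac{kc^{2}}{a^{2}},
\qquad
\rho_{c} \equiv \frac{3H^{2}}{8\pi G},
$$

so that $\rho > \rho_c$ corresponds to positive curvature (closed), $\rho = \rho_c$ to zero curvature (flat), and $\rho < \rho_c$ to negative curvature (open).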

According to the Hawking-Penrose singularity theorem, if the universe truly obeyed the models of general relativity, then it must have begun as a singularity. This essentially meant that, prior to the Big Bang, the entire universe existed as a point of infinite density that contained all of the mass and space-time of the universe, before quantum fluctuations caused it to rapidly expand.

Per the Friedmann equations, the geometry of the universe is determined by its overall mass/energy density, and can have either flat, negative, or positive curvature. Credit: NASA/GSFC

Also in 1970, Hawking postulated what became known as the second law of black hole dynamics. With James M. Bardeen and Brandon Carter, he proposed the four laws of black hole mechanics, drawing an analogy with the four laws of thermodynamics.

These four laws stated that:

  • for a stationary black hole, the horizon has constant surface gravity
  • for perturbations of stationary black holes, the change of energy is related to the changes of area, angular momentum, and electric charge
  • the horizon area is, assuming the weak energy condition, a non-decreasing function of time
  • it is not possible to form a black hole with vanishing surface gravity
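
In compact form (geometrized units with G = c = 1; a sketch of the standard statement rather than a quotation from the original paper), the first and second laws are usually written as

$$
dM = \frac{\kappa}{8\pi}\,dA + \Omega_{H}\,dJ + \Phi_{H}\,dQ,
\qquad
dA \ge 0,
$$

where $\kappa$ is the surface gravity (the analogue of temperature), $A$ the horizon area (the analogue of entropy), and $\Omega_{H}$, $\Phi_{H}$ the horizon’s angular velocity and electric potential.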

In 1971, Hawking released an essay titled “Black Holes in General Relativity” in which he conjectured that the surface area of black holes can never decrease, and therefore certain limits can be placed on the amount of energy they emit. This essay won Hawking the Gravity Research Foundation Award in January of that year.

In 1973, Hawking’s first book, which he wrote during his post-doc studies with George Ellis, was published. Titled, The Large Scale Structure of Space-Time, the book describes the foundation of space itself and the nature of its infinite expansion, using differential geometry to examine the consequences of Einstein’s General Theory of Relativity.

Hawking was elected a Fellow of the Royal Society (FRS) in 1974, a few weeks after the announcement of Hawking radiation (see below). In 1975, he returned to Cambridge and was given a new position as Reader, which is reserved for senior academics with a distinguished international reputation in research or scholarship.

The mid-to-late 1970s was a time of growing interest in black holes, as well as the researchers associated with them. As such, Hawking’s public profile began to grow and he received increased academic and public recognition, appearing in print and television interviews and receiving numerous honorary positions and awards.

In the late 1970s, Hawking was elected Lucasian Professor of Mathematics at the University of Cambridge, an honorary position created in 1663 which is considered one of the most prestigious academic posts in the world. Prior to Hawking, its former holders included such scientific greats as Sir Isaac Newton, Joseph Larmor, Charles Babbage, George Stokes, and Paul Dirac.

His inaugural lecture as Lucasian Professor of Mathematics was titled: “Is the end in sight for Theoretical Physics”. During the speech, he proposed N=8 Supergravity – a quantum field theory which involves gravity in 8 supersymmetries – as the leading theory to solve many of the outstanding problems physicists were studying.

Hawking’s promotion coincided with a health crisis which led to Hawking being forced to accept some nursing services at home. At the same time, he began making a transition in his approach to physics, becoming more intuitive and speculative rather than insisting on mathematical proofs. By 1981, this saw Hawking begin to focus his attention on cosmological inflation theory and the origins of the universe.

Inflation theory – which had been proposed by Alan Guth that same year – posits that following the Big Bang, the universe initially expanded very rapidly before settling into a slower rate of expansion. In response, Hawking presented work at the Vatican conference that year, where he suggested that there might be no boundary or beginning to the universe.

During the summer of 1982, he and his colleague Gary Gibbons organized a three-week workshop on the subject titled “The Very Early Universe” at Cambridge University. With Jim Hartle, an American physicist and professor of physics at the University of California, he proposed that during the earliest period of the universe (aka. the Planck epoch) the universe had no boundary in space time.

In 1983, they published this model, known as the Hartle-Hawking state. Among other things, it asserted that before the Big Bang, time did not exist, and that the concept of a beginning of the universe is therefore meaningless. It also replaced the initial singularity of the Big Bang with something akin to the North Pole: just as there is nothing north of the North Pole, where the lines of longitude converge without forming an edge, there is no boundary to spacetime at the beginning of the universe.

This proposal predicted a closed universe, which had many existential implications, particularly about the existence of God. At no point did Hawking rule out the existence of God, choosing to use God in a metaphorical sense when explaining the mysteries of the universe. However, he would often suggest that the existence of God was unnecessary to explain the origin of the universe, or the existence of a unified field theory.

In 1982, he also began work on a book that would explain the nature of the universe, relativity and quantum mechanics in a way that would be accessible to the general public. This led him to sign a contract with Bantam Books for the sake of publishing A Brief History of Time, the first draft of which he completed in 1984.

After multiple revisions, the final draft was published in 1988, and was met with much critical acclaim. The book was translated into multiple languages, remained at the top of bestseller lists in both the US and UK for months, and ultimately sold an estimated 9 million copies. Media attention was intense, and a Newsweek magazine cover and a television special both described him as “Master of the Universe”.

Further work by Hawking on the arrow of time led to the 1985 publication of a paper theorizing that if the no-boundary proposition were correct, then when the universe stopped expanding and eventually collapsed, time would run backwards. He would later withdraw this claim after independent calculations disputed it, but the work provided valuable insight into the possible connections between time and cosmic expansion.

During the 1990s, Hawking continued to publish and lecture on his theories regarding physics, black holes and the Big Bang. In 1993, he co-edited a book with Gary Gibbons on Euclidean quantum gravity, an approach they had been developing together since the late 1970s. In this framework, the gravitational field is evaluated using a functional (path) integral over Euclidean – that is, imaginary-time – geometries, which allows the calculation to sidestep the singularities that plague other formulations.
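In rough terms, the Euclidean approach replaces ordinary time t with imaginary time \tau = it and evaluates a gravitational partition function of the form:

Z \;=\; \int \mathcal{D}g \; e^{-I_{E}[g]/\hbar}

taken over smooth, non-singular Euclidean geometries. For the Schwarzschild black hole, the regular Euclidean geometry turns out to be periodic in \tau with period \hbar/(k_B T_H), so the black hole’s temperature and entropy emerge from the geometry itself rather than from its singular interior.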

That same year, a popular-level collection of essays, interviews, and talks titled Black Holes and Baby Universes and Other Essays was also published. In 1994, Hawking and Penrose delivered a series of six lectures at Cambridge’s Newton Institute, which were published in 1996 under the title “The Nature of Space and Time”.

It was also in the 1990s that major developments occurred in Hawking’s personal life. In 1990, he and Jane Hawking commenced divorce proceedings after many years of strained relations, owing to his disability, the constant presence of caregivers, and his celebrity status. In 1995, Hawking married Elaine Mason, who had been one of his caregivers for many years.

Stephen Hawking lectured regularly throughout the 1990s and 2000s; some of these lectures were collected and published in “The Nature of Space and Time” in 1996. Credit: educatinghumanity.com

In the 2000s, Hawking produced many new books and new editions of older ones. These included The Universe in a Nutshell (2001), A Briefer History of Time (2005), and God Created the Integers (2006). He also began collaborating with Jim Hartle of the University of California, Santa Barbara, and the European Organization for Nuclear Research (CERN) to produce new cosmological theories.

Foremost among these was Hawking’s “top-down cosmology”, which states that the universe had not one unique initial state but many different ones, and that predicting the universe’s current state from a single initial state is therefore inappropriate. Consistent with quantum mechanics, top-down cosmology posits that the present “selects” the past from a superposition of many possible histories.

In so doing, the theory also offered a possible resolution of the “fine-tuning question”, which addresses the fact that life appears to be possible only when certain physical constants lie within a narrow range. By offering this new model of cosmology, Hawking opened up the possibility that life may not be bound by such restrictions and could be much more plentiful than previously thought.

In 2006, Hawking and his second wife, Elaine Mason, quietly divorced, and Hawking resumed closer relationships with his first wife Jane, his children (Robert, Lucy and Timothy), and his grandchildren. In 2009, he retired as Lucasian Professor of Mathematics, as required by Cambridge University regulations, and went on to serve as Director of Research at the Cambridge University Department of Applied Mathematics and Theoretical Physics.

“Hawking Radiation” and the “Black Hole Information Paradox”:

In the early 1970s, Hawking began working on what is known as the “no-hair theorem”. Based on the Einstein-Maxwell equations of gravitation and electromagnetism in general relativity, the theorem stated that all black holes can be completely characterized by only three externally observable classical parameters: mass, electric charge, and angular momentum.

In this scenario, all other information about the matter that formed a black hole, or that falls into it (the “hair” of the metaphor), “disappears” behind the event horizon, where it remains preserved but permanently inaccessible to external observers.

In 1973, Hawking traveled to Moscow and met with Soviet scientists Yakov Borisovich Zel’dovich and Alexei Starobinsky. During his discussions with them about their work, they showed him how the uncertainty principle implies that black holes should emit particles. This contradicted Hawking’s second law of black hole mechanics (i.e., that the surface area of a black hole’s event horizon can never decrease), since by losing energy they must also be losing mass.

What’s more, it supported a theory advanced by Jacob Bekenstein – a graduate student of John Wheeler’s at Princeton University – that black holes should have a finite, non-zero temperature and entropy. All of this appeared to contradict the classical picture of black holes described by the “no-hair” theorem. Hawking revisited the problem shortly thereafter, showing that when quantum mechanical effects are taken into account, black holes emit thermal radiation at a temperature that is inversely proportional to their mass.
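For reference, the standard expressions for the temperature and entropy of a non-rotating, uncharged black hole of mass M and event-horizon area A are:

T_{H} = \frac{\hbar c^{3}}{8\pi G M k_{B}}, \qquad S_{BH} = \frac{k_{B} c^{3} A}{4 G \hbar}

where G is the gravitational constant, c the speed of light, \hbar the reduced Planck constant, and k_B Boltzmann’s constant. Because the temperature scales as 1/M, a black hole of one solar mass radiates at only about 60 nanokelvin – far colder than the cosmic microwave background – which is why Hawking radiation has never been detected directly.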

From 1974 onward, Hawking presented these results, demonstrating that black holes emit radiation. This came to be known as “Hawking radiation”, and was initially controversial. However, by the late 1970s and following the publication of further research, the discovery was widely accepted as a significant breakthrough in theoretical physics.

However, one of the outgrowths of this theory was the likelihood that black holes gradually lose mass and energy. Because of this, black holes that lose more mass than they gain through other means are expected to shrink and ultimately vanish – a phenomenon known as black hole “evaporation”.
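The same reasoning yields a rough lifetime for an isolated black hole. The commonly quoted estimate (which ignores the details of exactly which particle species are emitted) is:

t_{\mathrm{evap}} \;\approx\; \frac{5120\,\pi\, G^{2} M^{3}}{\hbar c^{4}}

For a stellar-mass black hole this works out to roughly 10^{67} years – vastly longer than the age of the universe – so evaporation matters today only for hypothetical, very small “primordial” black holes.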

In 1981, Hawking proposed that information is irretrievably lost when a black hole evaporates, a claim which came to be known as the “Black Hole Information Paradox”. It states that physical information could permanently disappear in a black hole, allowing many distinct physical states to evolve into the same final state.

This was controversial because it violated two fundamental tenets of quantum physics. In principle, quantum physics tells us that complete information about a physical system – i.e. the state of its matter (mass, position, spin, temperature, etc.) – is encoded in its wave function up to the point when that wave function collapses. This in turn gives rise to two other principles.

The first is Quantum Determinism, which states that – given a present wave function – future changes are uniquely determined by the evolution operator. The second is Reversibility, which states that the evolution operator has an inverse, meaning that the past wave functions are similarly unique. The combination of these means that the information about the quantum state of matter must always be preserved.
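Both principles can be captured in a single line of standard quantum-mechanical notation: a closed system evolves unitarily, so the earlier state can always, in principle, be recovered from the later one:

|\psi(t)\rangle = U(t, t_{0})\, |\psi(t_{0})\rangle, \qquad U^{\dagger}U = \mathbb{1} \;\;\Rightarrow\;\; |\psi(t_{0})\rangle = U^{\dagger}(t, t_{0})\, |\psi(t)\rangle

Hawking’s claim that evaporation destroys information amounts to saying this evolution cannot be unitary once a black hole forms and disappears.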

By proposing that this information disappears once a black hole evaporates, Hawking essentially created a fundamental paradox: if a black hole can evaporate, erasing all the information about a quantum wave function, then information can in fact be lost forever. This has been the subject of ongoing debate among scientists, one which has remained largely unresolved.

However, by 2003, the growing consensus among physicists was that Hawking was wrong about the loss of information in a black hole. In a 2004 lecture in Dublin, he conceded the bet he had made in 1997 with fellow physicist John Preskill of Caltech, but described his own, somewhat controversial, solution to the paradox – that black holes may have more than one topology.

In the 2005 paper he published on the subject – “Information Loss in Black Holes” – he argued that the paradox could be resolved by examining all the alternative histories of the universe, with the information loss in histories containing black holes being cancelled out by those without. In January 2014, Hawking described his original insistence that information is destroyed as his “biggest blunder”.

Other Accomplishments:

In addition to advancing our understanding of black holes and cosmology through the application of general relativity and quantum mechanics, Stephen Hawking has also been pivotal in bringing science to a wider audience. Over the course of his career, he published many popular books, traveled and lectured extensively, made numerous appearances and did voice-over work for television shows and movies, and even provided narration for the Pink Floyd song “Keep Talking”.

Stephen Hawking’s theories on black holes became the subject of television specials, such as “Stephen Hawking’s Universe” on PBS. Credit: discovery.com

A film version of A Brief History of Time, directed by Errol Morris and produced by Steven Spielberg, premiered in 1992. Hawking had wanted the film to be scientific rather than biographical, but he was persuaded otherwise. In 1997, a six-part television series Stephen Hawking’s Universe premiered on PBS, with a companion book also being released.

In 2007, Hawking and his daughter Lucy published George’s Secret Key to the Universe, a children’s book designed to explain theoretical physics in an accessible fashion and featuring characters similar to those in the Hawking family. The book was followed by three sequels: George’s Cosmic Treasure Hunt (2009), George and the Big Bang (2011), and George and the Unbreakable Code (2014).

Since the 1990s, Hawking has also been a major role model for people dealing with disabilities and degenerative illnesses, and his outreach for disability awareness and research has been unparalleled. At the turn of the century, he and eleven other luminaries joined with Rehabilitation International to sign the Charter for the Third Millennium on Disability, which called on governments around the world to prevent disabilities and protect disability rights.

Professor Stephen Hawking participating in a zero-gravity flight (aka. the “Vomit Comet”) in 2007. Credit: gozerog.com

Motivated by the desire to increase public interest in spaceflight and to show the potential of people with disabilities, Hawking participated in a zero-gravity flight in 2007 aboard a “Vomit Comet” – a specially fitted aircraft that dips and climbs through the air to simulate the feeling of weightlessness – courtesy of the Zero Gravity Corporation. During the flight, he experienced weightlessness eight times.

In August 2012, Hawking narrated the “Enlightenment” segment of the 2012 Summer Paralympics opening ceremony. In September of 2013, he expressed support for the legalization of assisted suicide for the terminally ill. In August of 2014, Hawking accepted the Ice Bucket Challenge to promote ALS/MND awareness and raise contributions for research. As he had pneumonia in 2013, he was advised not to have ice poured over him, but his children volunteered to accept the challenge on his behalf.

During his career, Hawking has also been a committed educator, having personally supervised 39 successful PhD students. He has also lent his name to the ongoing search for extra-terrestrial intelligence and the debate regarding the development of robots and artificial intelligence. On July 20th, 2015, Stephen Hawking helped launch Breakthrough Initiatives, an effort to search for extraterrestrial life in the universe.

Also in 2015, Hawking lent his voice and celebrity status to the promotion of The Global Goals, a series of 17 goals adopted at the United Nations Sustainable Development Summit aimed at ending extreme poverty, reducing social inequality, and combating climate change over the course of the following 15 years.

President Barack Obama talks with Stephen Hawking in the Blue Room of the White House before a ceremony presenting him and 15 others the Presidential Medal of Freedom, August 12th, 2009. Credit: Pete Souza/White House photo stream

Honors and Legacy:

As already noted, in 1974, Hawking was elected a Fellow of the Royal Society (FRS), and was one of the youngest scientists to become a Fellow. At that time, his nomination read:

Hawking has made major contributions to the field of general relativity. These derive from a deep understanding of what is relevant to physics and astronomy, and especially from a mastery of wholly new mathematical techniques. Following the pioneering work of Penrose he established, partly alone and partly in collaboration with Penrose, a series of successively stronger theorems establishing the fundamental result that all realistic cosmological models must possess singularities. Using similar techniques, Hawking has proved the basic theorems on the laws governing black holes: that stationary solutions of Einstein’s equations with smooth event horizons must necessarily be axisymmetric; and that in the evolution and interaction of black holes, the total surface area of the event horizons must increase. In collaboration with G. Ellis, Hawking is the author of an impressive and original treatise on “Space-time in the Large”.

Other important work by Hawking relates to the interpretation of cosmological observations and to the design of gravitational wave detectors.

Peter Higgs and Stephen Hawking visiting the “Collider” exhibition at London’s Science Museum on November 12th, 2013, in honor of the discovery of the Higgs Boson. Credit: sciencemuseum.org.uk

In 1975, he was awarded both the Eddington Medal and the Pius XI Gold Medal, and in 1976 the Dannie Heineman Prize, the Maxwell Prize, and the Hughes Medal. In 1977, he was appointed a professor with a chair in gravitational physics, and the following year he received the Albert Einstein Medal and an honorary doctorate from the University of Oxford.

In 1981, Hawking was awarded the American Franklin Medal, followed by his appointment as a Commander of the Order of the British Empire (CBE) the following year. Over the remainder of the decade, he was honored with the Gold Medal of the Royal Astronomical Society in 1985, the Paul Dirac Medal in 1987 and, jointly with Penrose, the prestigious Wolf Prize in 1988. In 1989, he was appointed a Member of the Order of the Companions of Honour (CH), but reportedly declined a knighthood.

In 1999, Hawking was awarded the Julius Edgar Lilienfeld Prize of the American Physical Society. In 2002, following a UK-wide vote, the BBC included him in their list of the 100 Greatest Britons. More recently, Hawking has been awarded the Copley Medal from the Royal Society (2006), the Presidential Medal of Freedom, America’s highest civilian honor (2009), and the Russian Special Fundamental Physics Prize (2013).

Several buildings have been named after him, including the Stephen W. Hawking Science Museum in San Salvador, El Salvador, the Stephen Hawking Building in Cambridge, and the Stephen Hawking Center at Perimeter Institute in Canada. And given Hawking’s association with time, he was chosen to unveil the mechanical “Chronophage” – aka. the Corpus Clock – at Corpus Christi College Cambridge in September of 2008.

Stephen Hawking being presented by his daughter Lucy Hawking at the lecture he gave for NASA’s 50th anniversary. Credit: NASA/Paul Alers

Also in 2008, while traveling to Spain, Hawking received the Fonseca Prize – an annual award created by the University of Santiago de Compostela to recognize outstanding achievement in science communication. Hawking was singled out for the award because of his “exceptional mastery in the popularization of complex concepts in Physics at the very edge of our current understanding of the Universe, combined with the highest scientific excellence, and for becoming a public reference of science worldwide.”

Multiple films have been made about Stephen Hawking over the years as well. These include the previously mentioned A Brief History of Time, the 1992 documentary directed by Errol Morris and produced by Steven Spielberg; Hawking, a 2004 BBC drama starring Benedict Cumberbatch in the title role; and the 2013 documentary “Hawking”, by Stephen Finnigan.

Most recently, there was the 2014 film The Theory of Everything that chronicled the life of Stephen Hawking and his wife Jane. Directed by James Marsh, the movie stars Eddie Redmayne as Professor Hawking and Felicity Jones as Jane Hawking.

Death:

Dr. Stephen Hawking passed away in the early hours of Wednesday, March 14th, 2018 at his home in Cambridge. According to a statement made by his family, he died peacefully. He was 76 years old, and is survived by his first wife, Jane Wilde, and their three children – Lucy, Robert and Tim.

When all is said and done, Stephen Hawking was arguably the most famous scientist of the modern era. His work in the fields of astrophysics and quantum mechanics led to breakthroughs in our understanding of time and space, and will likely be pored over by scientists for decades. In addition, he did more than perhaps any other scientist of his time to make science accessible and interesting to the general public.

Stephen Hawking holding a public lecture at the Stockholm Waterfront congress center, 24 August 2015. Credit: Public Domain/photo by Alexandar Vujadinovic

To top it off, he traveled all over the world and lectured on topics ranging from science and cosmology to human rights, artificial intelligence, and the future of the human race. He also used the celebrity status afforded him to advance the causes of scientific research, space exploration, disability awareness, and humanitarian causes wherever possible.

In all of these respects, he was very much like his predecessor, Albert Einstein – another influential scientist-turned-celebrity who used his fame to combat ignorance and promote humanitarian causes. But what was especially impressive in all of this is that Hawking managed to maintain his commitment to science and a very busy schedule while dealing with a degenerative disease.

For over 50 years, Hawking lived with a disease that doctors initially thought would take his life within just two years. And yet, he not only managed to make his greatest scientific contributions while dealing with ever-increasing problems of mobility and speech, he also became a jet-setting personality who travelled all around the world to address audiences and inspire people.

His passing was mourned by millions worldwide and, in the words of famed scientist and science communicator Neil deGrasse Tyson, “left an intellectual vacuum in its wake”. Without a doubt, history will place Dr. Hawking among such luminaries as Einstein, Newton, Galileo, and Curie as one of the greatest scientific minds that ever lived.

We have many great articles about Stephen Hawking here at Universe Today. Here are a few about Hawking Radiation, How Do Black Holes Evaporate?, why Hawking Could be Wrong About Black Holes, and recent experiments to Replicate Hawking Radiation in a Laboratory.

And here are some video interviews where Hawking addresses how God is not necessary for the creation of the Universe, and the trailer for Theory of Everything.

Astronomy Cast has a number of great podcasts that deal with Hawking and his discoveries, like: Episode 138: Quantum Mechanics, and Questions Show: Hidden Fusion, the Speed of Neutrinos, and Hawking Radiation.

For more information, check out Stephen Hawking’s website and his page at Biography.com.