Venus Near Pleiades For a Few Days

Image credit: NASA
The Pleiades are elusive. You rarely find them on purpose. They’re best seen out of the corner of your eye, a pretty little surprise that pops out of the night sky when you’re staring elsewhere.

Venus is just the opposite. Dazzling, bright enough to cast faint shadows, it beams down from the heavens and grabs you, mesmerizing. You can’t take your eyes off it.

This weekend, Venus and the Pleiades are coming together. It happens every 8 years: Venus glides through the Pleiades star cluster and, while dissimilar things don’t always go well together, these do. It’s going to be a beautiful ensemble.

Step outside after dark on Thursday, April 1st, and look west. Venus is the improbably bright “star” about halfway up the sky. Just above Venus lies the Pleiades, often mistaken for the Little Dipper because the faint stars of the Pleiades trace the shape of a little dipper.

If you go outside and look several nights in a row, you can see how fast Venus travels across the sky. On Friday, April 2nd, Venus enters the Pleiades, just below the dipper’s bowl. On Saturday, April 3rd, Venus scoots upward to join the stars in the dipper’s handle. On Sunday, April 4th, Venus exits the cluster altogether. Compared to what you saw on April 1st, the two have switched places.

Here are a few things to think about while you’re watching the show:

The Pleiades are a clutch of baby stars. They formed barely 100 million years ago, during the age of dinosaurs on Earth, from a collapsing cloud of interstellar gas. The biggest and brightest of the cluster are blue-white and about five times wider than our own sun.

The Pleiades didn’t exist when Venus first emerged from the protosolar nebula 4.5 billion years ago. No one knows what Venus was like in those early days of the solar system. It might have been lush, verdant, Earth-like. Today, though, it’s hellish. A runaway greenhouse effect on Venus has super-heated the planet to nearly 900°F, hot enough to melt lead. Dense gray clouds laced with sulfuric acid completely hide Venus’ surface from telescopes on Earth. The smothering clouds, it turns out, are excellent reflectors of sunlight, and that’s why Venus looks so bright.

As seen from Earth, Venus shines about 600 times brighter than Alcyone, the most luminous star in the Pleiades. During the weekend try scanning the group with binoculars. You’ll see dozens of faint Pleiades invisible to the unaided eye. Among them, bright Venus looks like a supernova.

But, really, it’s just an ancient planet gliding in front of some baby stars–a dissimilar ensemble that you won’t want to miss.

Original Source: NASA Science

New Images of Titan

Image credit: ESO
Titan, the largest moon of Saturn, was discovered by the Dutch astronomer Christiaan Huygens in 1655 and certainly deserves its name. With a diameter of no less than 5,150 km, it is larger than Mercury and twice as large as Pluto. It is unique in having a hazy atmosphere of nitrogen, methane and oily hydrocarbons. Although it was explored in some detail by the NASA Voyager missions, many aspects of the atmosphere and surface still remain unknown. Thus, the existence of seasonal or diurnal phenomena, the presence of clouds, the surface composition and topography are still under debate. There have even been speculations that some kind of primitive life (now possibly extinct) may be found on Titan.

Titan is the main target of the NASA/ESA Cassini/Huygens mission, launched in 1997 and scheduled to arrive at Saturn on July 1, 2004. The ESA Huygens probe is designed to enter the atmosphere of Titan, and to descend by parachute to the surface.

Ground-based observations are essential to optimize the return of this space mission, because they will complement the information gained from space and add confidence to the interpretation of the data. Hence, the advent of the adaptive optics system NAOS-CONICA (NACO) [1] in combination with ESO’s Very Large Telescope (VLT) at the Paranal Observatory in Chile now offers a unique opportunity to study the resolved disc of Titan with high sensitivity and increased spatial resolution.

Adaptive Optics (AO) systems work by means of a computer-controlled deformable mirror that counteracts the image distortion induced by atmospheric turbulence. It is based on real-time optical corrections computed from image data obtained by a special camera at very high speed, many hundreds of times each second.
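
A minimal sketch of that closed-loop principle is shown below. It is purely illustrative and not the actual NACO control software: the "atmosphere", the number of actuators, the loop gain and the one-number-per-actuator wavefront sensor are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_actuators = 32                     # hypothetical number of mirror actuators
gain = 0.4                           # integrator gain of the control loop
dm_commands = np.zeros(n_actuators)  # current shape of the deformable mirror

# A quasi-static aberration stands in for the slowly varying atmosphere;
# a small per-frame jitter stands in for turbulence the loop cannot follow.
static_aberration = rng.normal(0.0, 1.0, n_actuators)

def sense_wavefront(dm):
    """Residual wavefront error seen by the sensor: atmosphere minus correction."""
    jitter = rng.normal(0.0, 0.05, n_actuators)
    return (static_aberration + jitter) - dm

residual = sense_wavefront(dm_commands)
print("rms wavefront error before correction:", np.std(residual))

# The real system closes this loop many hundreds of times per second.
for _ in range(300):
    residual = sense_wavefront(dm_commands)
    dm_commands += gain * residual   # push the mirror toward cancelling the error

print("rms wavefront error with the loop closed:", np.std(residual))
```

The point is simply that an integrating controller, run fast enough, drives the residual error down to the level of the uncorrectable frame-to-frame jitter.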

A team of French astronomers [2] have recently used the NACO state-of-the-art adaptive optics system on the fourth 8.2-m VLT unit telescope, Yepun, to map the surface of Titan by means of near-infrared images and to search for changes in the dense atmosphere.

These extraordinary images have a nominal resolution of 1/30th arcsec and show details of the order of 200 km on the surface of Titan. To provide the best possible views, the raw data from the instrument were subjected to deconvolution (image sharpening).
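
The release does not say which sharpening algorithm was applied, so the sketch below uses Richardson-Lucy deconvolution, a standard choice, purely to illustrate the idea: iteratively undo the blurring of a known (or estimated) point spread function. The Gaussian PSF and the toy "surface" are invented for the example.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, n_iter=30):
    """Iteratively estimate the unblurred image from the observed one and a PSF."""
    estimate = np.full_like(observed, observed.mean())
    psf_mirror = psf[::-1, ::-1]
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # guard against division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy usage: blur a synthetic "albedo feature" with a Gaussian PSF, then sharpen it.
y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()
truth = np.zeros((64, 64))
truth[20:30, 35:50] = 1.0
observed = fftconvolve(truth, psf, mode="same")
sharpened = richardson_lucy(observed, psf)
```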

Images of Titan were obtained through 9 narrow-band filters, sampling near-infrared wavelengths with large variations in methane opacity. This permits sounding of different altitudes ranging from the stratosphere to the surface.

At 1.24 and 2.12 μm, Titan harbours a “southern smile”, that is, a north-south asymmetry, while the opposite situation is observed with filters probing higher altitudes, such as 1.64, 1.75 and 2.17 μm.

A high-contrast bright feature is observed at the South Pole and is apparently caused by a phenomenon in the atmosphere, at an altitude below 140 km or so. This feature was found to change its location on the images from one side of the south polar axis to the other during the week of observations.

Original Source: ESO News Release

Teams of Spacecraft Might Explore Better

Image credit: ESA
Will swarms of co-operating robots one day be exploring some of the most intriguing worlds in the solar system? James Law, an engineer who is a doctoral student at the Open University, supports the idea that using whole teams of robotic explorers working together offers distinct advantages, especially when it comes to tackling the challenges presented by remote bodies such as Europa and Titan. In a presentation on Wednesday 31 March at the Royal Astronomical Society’s National Astronomy Meeting at the Open University, he will be reviewing some current ideas on co-operative robot technology and suggesting how it might be applied to a Titan mission with a concept for a ‘Master’ robot controlling a bevy of ‘Slaves’.

Of the 17 landers sent to investigate Mars, only 5 have survived to perform their missions. In spite of this, scientists are already looking for their next planetary targets, with Saturn’s moon Titan and Jupiter’s moon Europa being distinct possibilities. Given both the greater distances involved and the extreme climatic conditions, how can the likelihood of a successful robotic surface mission be increased? Although robotic rovers have become the preferred choice over static landers, due to their greater versatility, the addition of motion systems increases their weight and reduces the reliability of these already complex mechanisms.

Advantages of teamwork
One alternative, proposed in 1989 by Rodney Brooks of the Massachusetts Institute of Technology, is finally coming to fruition – the idea of replacing solitary rovers with swarms of cooperative robots. With scientific equipment evenly distributed between them, each rover can be made smaller, lighter, and less complex. These robots can then work together or independently, in order to complete the mission objectives.

This approach has several distinct advantages. Launch costs could be reduced and soft landings achieved by delivering lighter payloads. Robustness is improved, since a critical failure on any rover is isolated from the rest. Although losing a rover may restrict the capabilities of the swarm, it is not likely to result in termination of the mission. Indeed, in many cases the affected rover will still be able to play a useful, though limited role.

Robotic swarms permit a variety of new missions, such as simultaneous measurements over wide areas, useful in climate monitoring and seismic sounding, or multiple experiments performed concurrently by different robots. Rovers can also work together to access areas of greater scientific interest, for example cliff faces. James Law cites David Barnes of the University of Wales at Aberystwyth, who is developing a swarm of aerobots – flying robots which could be used for terrain mapping or deploying smaller micro rovers. Another benefit of using small cooperative rovers is that additional robots can be launched and integrated into the swarm to extend a mission, enabling new experiments, or replacement of lost and damaged rovers.

Robots for Titan
In his talk, James Law will present his own vision for a mission to Titan. Though we have to wait for the Huygens probe, due to land on Titan early next year, to discover the true nature of Titan’s surface, it is likely to be mixed. “In this situation, a Master-Slave robot configuration with a variety of transport modes could be favourable,” he suggests. “A ‘Master’ lander supplying power and communications provides an outpost for a number of small ‘Slave’ rovers and balloons. The lander would be equipped with a range of scientific packages, which it could distribute amongst its slave robots depending on the environment around the landing site. These subordinate robots are then able to act either cooperatively – for example, to dig and image a trench in order to investigate its geological layers – or on their own, analysing or collecting samples and returning them to the lander for more in-depth analysis. The rovers would return to the lander to recharge their batteries and change their scientific payloads. Robots capable of operating in a liquid environment could be dispersed on any Titan sea to measure wave motion, perhaps by balloon, then be sacrificed, by ‘drowning’, to measure conditions below the surface.”

Exploring Europa
Among schemes proposed by others that James Law will review is one for the exploration of Europa, devised by Jeff Johnson of the Open University and Rodney Buckland of the University of Kent. It involves Self Organising IMAging Robots, or soimars, small cube-shaped robots each carrying a single-pixel imaging device (such as a photodiode) and weighing as little as 10 grams. Each one is able to communicate with its neighbours and is capable of moving in water, using small propulsion screws. A swarm of these tiny robots could be deployed into a sub-surface ocean on Europa to image the environment.

A transport craft containing communications and power facilities would land on Europa’s ice crust and release an ice-penetrating device containing the soimars. This device would bore through the ice and release the soimars into the ocean. The soimars then self-organise into a stack, aligning their imaging devices. By cooperatively swimming, the stack scans an area under the ice. If a single imaging device fails, the faulty soimar is simply released and the swarm reorganises to form an error free array. This also enables more soimars, perhaps from subsequent landers, to join the swarm and improve the image resolution. In this configuration, the soimars are physically attached to one another. An alternative use would be to equip them with touch sensors and have them swim as a dispersed cloud along the ocean floor, mapping its elevation. A simulation has been developed at the Open University to demonstrate the self-organising behaviour of the swarm.
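
The Open University simulation is not described in detail, but the fault-tolerance idea can be sketched in a few lines: treat the stack as a list of single-pixel imagers, drop any unit that fails, and let the survivors close ranks. Everything below (class names, pixel values) is hypothetical.

```python
# Toy sketch of the soimar fault-tolerance idea: each robot contributes one
# pixel to a stacked line-scan imager; when a unit fails it is released and
# the stack re-packs itself, continuing at slightly reduced resolution.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Soimar:
    ident: int
    healthy: bool = True
    def read_pixel(self) -> float:
        return 1.0  # placeholder single-pixel measurement

@dataclass
class Stack:
    units: List[Soimar] = field(default_factory=list)

    def reorganise(self) -> None:
        """Release failed units and re-pack the survivors into one contiguous stack."""
        self.units = [u for u in self.units if u.healthy]

    def scan_line(self) -> List[float]:
        self.reorganise()
        return [u.read_pixel() for u in self.units]

swarm = Stack([Soimar(i) for i in range(16)])
swarm.units[4].healthy = False        # one imaging device fails
print(len(swarm.scan_line()), "working pixels after reorganising")   # 15
```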

A mechanical workforce for Mars
The Jet Propulsion Laboratory (JPL) has research underway on cooperative robot teams, including robotic work crews for carrying large items, robotic excavation teams, and robots that can rappel one another down steep cliff faces. An objective of this work at JPL is to deploy a robotic workforce on Mars to construct mining and refining facilities, which will provide fuel for future human missions. With proposals to land men on Mars, and eventually more distant locations, these robotic work crews will be indispensable in both investigating the destinations, and creating outposts to support our arrival.

Original Source: RAS News Release

What Would Titan’s Oceans Look Like?

Image credit: ESA
When the European Huygens probe on the Cassini space mission parachutes down through the opaque smoggy atmosphere of Saturn’s moon Titan early next year, it may find itself splashing into a sea of liquid hydrocarbons. In what is probably the first piece of “extraterrestrial oceanography” ever carried out, Dr Nadeem Ghafoor of Surrey Satellite Technology and Professor John Zarnecki of the Open University, with Drs Meric Srokecz and Peter Challenor of the Southampton Oceanography Centre, calculated how any seas on Titan would compare with Earth’s oceans. Their results predict that waves driven by the wind would be up to 7 times higher but would move more slowly and be much farther apart. Dr Ghafoor will present their findings at the RAS National Astronomy Meeting at the Open University on Wednesday 31 March.

The team worked with a computer simulation, or ‘model’, that predicts how wind-driven waves on the surface of the sea are generated on Earth, but they changed all the basic inputs, such as the local gravity, and the properties of the liquid, to values they might expect on Titan.

Arguments about the nature of Titan’s surface have raged for a number of years. Following the flyby of the Voyager 1 spacecraft in 1980, some researchers suggested that Titan’s concealed surface might be at least partly covered by a sea of liquid methane and ethane. But there are several other theories, ranging from a hard icy surface at one extreme to a near-global hydrocarbon ocean at the other. Other variants include the notion of hydrocarbon ‘sludge’ overlying an icy surface. Planetary scientists hope that the Cassini/Huygens mission will provide an answer to this question, with observations from Cassini during several flybys of Titan and from Huygens, which will land (or ‘splash’) on 14 January 2005.

The idea that Titan has significant bodies of surface liquid has recently been reinforced by the announcement that radar reflections from Titan have been detected using the giant Arecibo radio dish in Puerto Rico. Importantly, the returned signals in 12 out of the 16 attempts made contained reflections of the kind expected from a polished surface, like a mirror. (This is similar to seeing a blinding patch of light on the surface of the sea where the Sun is being reflected.) The radar researchers concluded that 75% of Titan’s surface may be covered by ‘open bodies of liquid hydrocarbons’ – in other words, seas.

The exact nature of the reflected radar signal can be used to determine how smooth or choppy the liquid surface is. This interpretation says that the slope of the waves is typically less than 4 degrees, which is consistent with the predictions of the British scientists, who showed that the maximum possible slope of waves generated by wind speeds up to 7 mph would be 11 degrees.

“Hopefully ESA’s Huygens probe will end the speculation” says Dr Ghafoor. “Not only will this be by far the most remote soft landing of a spacecraft ever attempted but Huygens might become the first extraterrestrial boat if it does indeed land on a hydrocarbon lake or sea.” Although not designed specifically to survive landing or to float, the chances it will do so are reasonable. However, the link back to Earth from Huygens via Cassini, which will be flying past Titan and acting as a relay, will only last for a maximum of 2 hours. During this time, if the probe is floating on a sea, one of the 6 instruments Huygens is carrying, the Surface Science Package experiment, which is led by John Zarnecki, will be making oceanography measurements. Among the 9 sensors that it carries are ones that will measure the height and frequency of the waves and also the depth of the sea using sonar. It will also attempt to determine the composition of the sea.

What would the sea look like? “Huygens does carry a camera so it is possible we shall have some direct images,” says Professor Zarnecki, “but let’s try to imagine that we are sitting onboard the probe after it has landed in a Titan ocean. What would we see? Well, the waves would be more widely dispersed than on Earth but they will be very much higher – mostly as a result of the fact that Titan gravity is only about 15% of that on Earth. So the surface around us would probably appear flat and deceptively calm, but in the distance we might see a rather tall, slow-moving wave advancing towards us – a wave that could overwhelm or sink us.”
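
The gravity argument can be checked with a back-of-the-envelope calculation. This is not the team's wave model, which also folds in the liquid's density and viscosity; it simply applies the classical deep-water scalings (wave height proportional to U²/g for a fully developed sea, phase speed c = sqrt(gλ/2π)) with an assumed wavelength.

```python
import math

g_earth = 9.81     # m/s^2
g_titan = 1.35     # m/s^2, roughly 14% of Earth's gravity
wavelength = 20.0  # m, an assumed wavelength for the speed comparison

# At a fixed wind speed, fully developed wave height scales as 1/g.
height_ratio = g_earth / g_titan
c_earth = math.sqrt(g_earth * wavelength / (2 * math.pi))
c_titan = math.sqrt(g_titan * wavelength / (2 * math.pi))

print(f"wave height on Titan ~ {height_ratio:.1f}x the Earth value")    # about 7x
print(f"phase speed: {c_titan:.1f} m/s on Titan vs {c_earth:.1f} m/s on Earth")
```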

Original Source: RAS News Release

New Study Finds Fundamental Force Hasn’t Changed Over Time

Image credit: ESO
Detecting or constraining the possible time variations of fundamental physical constants is an important step toward a complete understanding of basic physics and hence the world in which we live, a step in which astrophysics proves most useful.

Previous astronomical measurements of the fine structure constant – the dimensionless number that determines the strength of interactions between charged particles and electromagnetic fields – suggested that this particular constant is increasing very slightly with time. If confirmed, this would have very profound implications for our understanding of fundamental physics.

New studies, conducted using the UVES spectrograph on Kueyen, one of the 8.2-m telescopes of ESO’s Very Large Telescope array at Paranal (Chile), secured new data of unprecedented quality. These data, combined with a very careful analysis, have provided the strongest astronomical constraints to date on the possible variation of the fine structure constant. They show that, contrary to previous claims, no evidence exists for a time variation of this fundamental constant.

A fine constant
To explain the Universe and to represent it mathematically, scientists rely on so-called fundamental constants or fixed numbers. The fundamental laws of physics, as we presently understand them, depend on about 25 such constants. Well-known examples are the gravitational constant, which defines the strength of the force acting between two bodies, such as the Earth and the Moon, and the speed of light.

One of these constants is the so-called “fine structure constant”, alpha = 1/137.03599958, a combination of the electrical charge of the electron, the Planck constant and the speed of light. The fine structure constant describes how electromagnetic forces hold atoms together and the way light interacts with atoms.
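
Written out explicitly (in SI units the combination also involves the vacuum permittivity), alpha can be recomputed in a couple of lines; the sketch below uses standard values of the constants and reproduces the 1/137.036 quoted above.

```python
import math

e = 1.602176634e-19           # electron charge, C
hbar = 1.054571817e-34        # reduced Planck constant, J*s
c = 2.99792458e8              # speed of light, m/s
epsilon_0 = 8.8541878128e-12  # vacuum permittivity, F/m

alpha = e**2 / (4 * math.pi * epsilon_0 * hbar * c)
print(f"alpha = {alpha:.9f} = 1/{1/alpha:.3f}")   # ~1/137.036
```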

But are these fundamental physical constants really constant? Are those numbers always the same, everywhere in the Universe and at all times? This is not as naive a question as it may seem. Contemporary theories of fundamental interactions, such as the Grand Unification Theory or super-string theories that treat gravity and quantum mechanics in a consistent way, not only predict a dependence of fundamental physical constants on energy – particle physics experiments have shown the fine structure constant to grow to a value of about 1/128 at high collision energies – but allow for their cosmological time and space variations. A time dependence of the fundamental constants could also easily arise if, besides the three space dimensions, there exist more hidden dimensions.

Already in 1955, the Russian physicist Lev Landau considered the possibility of a time dependence of alpha. In the late 1960s, George Gamow in the United States suggested that the charge of the electron, and therefore also alpha, may vary. It is clear however that such changes, if any, cannot be large or they would already have been detected in comparatively simple experiments. Tracking these possible changes thus requires the most sophisticated and precise techniques.

Looking back in time
In fact, quite strong constraints are already known to exist on the possible variation of the fine structure constant alpha. One such constraint is geological in nature. It is based on measurements taken in the ancient natural fission reactor located near Oklo (Gabon, West Africa), which was active roughly 2,000 million years ago. By studying the distribution of a given set of elements – isotopes of the rare earths, for example samarium – which were produced by the fission of uranium, one can estimate whether the physical process happened at a faster or slower pace than we would expect nowadays. Thus we can measure a possible change in the value of the fundamental constant at play here, alpha. However, the observed distribution of the elements is consistent with calculations assuming that the value of alpha at that time was precisely the same as the value today. Over those 2 billion years, the change in alpha must therefore have been smaller than about 2 parts per 100 million. If present at all, this is a rather small change indeed.
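
To see just how tight that is, the Oklo limit can be turned into an average rate, roughly one part in 10^17 per year:

```python
delta_alpha_over_alpha = 2e-8   # "2 parts per 100 million"
elapsed_years = 2e9             # the reactor ran roughly 2,000 million years ago
rate_limit = delta_alpha_over_alpha / elapsed_years
print(f"average |d(alpha)/dt| / alpha < {rate_limit:.0e} per year")   # ~1e-17 per year
```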

But what about changes much earlier in the history of the Universe?

To measure this we must find means to probe still further into the past. And this is where astronomy can help. Because, even though astronomers can’t generally do experiments, the Universe itself is a huge atomic physics laboratory. By studying very remote objects, astronomers can look back over a long time span. In this way it becomes possible to test the values of the physical constants when the Universe had only 25% of its present age, that is, about 10,000 million years ago.

Very far beacons
To do so, astronomers rely on spectroscopy – the measurement of the properties of light emitted or absorbed by matter. When the light from a flame is observed through a prism, a rainbow is visible. When salt is sprinkled on the flame, distinct yellow lines, so-called emission lines, are superimposed on the usual colours of the rainbow. If a gas cell is placed between the flame and the prism, dark lines appear in the rainbow instead: these are absorption lines. The wavelengths of these emission and absorption lines are directly related to the energy levels of the atoms in the salt or in the gas. Spectroscopy thus allows us to study atomic structure.

The fine structure of atoms can be observed spectroscopically as the splitting of certain energy levels in those atoms. So if alpha were to change over time, the emission and absorption spectra of these atoms would change as well. One way to look for any changes in the value of alpha over the history of the Universe is therefore to measure the spectra of distant quasars, and compare the wavelengths of certain spectral lines with present-day values.

Quasars are used here only as a beacon – the flame – in the very distant Universe. Interstellar clouds of gas in galaxies, located between the quasars and us on the same line of sight and at distances varying from six to eleven thousand million light-years, absorb parts of the light emitted by the quasars. The resulting spectrum consequently presents dark “valleys” that can be attributed to well-known elements.

If the fine-structure constant changed during the light’s journey, the energy levels in the atoms would be affected and the wavelengths of the absorption lines would be shifted by different amounts. By comparing the relative gaps between the valleys with the laboratory values, it is possible to calculate alpha as a function of distance from us, that is, as a function of the age of the Universe.
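
A schematic version of that comparison is sketched below. In practice such analyses use laboratory wavelengths and computed sensitivity coefficients q for each transition (how strongly its frequency responds to a change in alpha) and fit the tiny shifts for delta-alpha/alpha; the numbers here are illustrative placeholders, not the published coefficients or the VLT data.

```python
import numpy as np

# Model: omega_observed = omega_lab + q * x, with x = (alpha_z/alpha_0)**2 - 1,
# which is ~ 2 * (delta alpha / alpha) for small changes. The redshift is
# assumed to have been divided out already. All numbers are illustrative.
omega_lab = np.array([35669.30, 38458.99, 62171.62])   # rest frequencies, cm^-1
q = np.array([1110.0, 1460.0, 120.0])                  # sensitivity coefficients, cm^-1

true_x = 2 * 1e-6                                      # pretend delta alpha/alpha = 1e-6
rng = np.random.default_rng(1)
omega_obs = omega_lab + q * true_x + rng.normal(0.0, 5e-4, omega_lab.size)

# Least-squares fit of x from the measured shifts, then convert back to dalpha/alpha.
x_fit = np.sum(q * (omega_obs - omega_lab)) / np.sum(q**2)
print(f"recovered delta alpha / alpha ~ {x_fit / 2:.1e}")   # close to the 1e-6 put in
```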

These measurements are, however, extremely delicate and require very good modelling of the absorption lines. They also put exceedingly strong requirements on the quality of the astronomical spectra. They must have enough resolution to allow very precise measurement of minuscule shifts in the spectra. And a sufficient number of photons must be captured in order to provide a statistically unambiguous result.

For this, astronomers have to turn to the most advanced spectral instruments on the largest telescopes. This is where the combination of the Ultra-violet and Visible Echelle Spectrograph (UVES) and ESO’s Kueyen 8.2-m telescope at the Paranal Observatory is unbeatable, thanks to its unequalled spectral quality and large collecting mirror area.

Constant or not?
A team of astronomers [1], led by Patrick Petitjean (Institut d’Astrophysique de Paris and Observatoire de Paris, France) and Raghunathan Srianand (IUCAA Pune, India), very carefully studied a homogeneous sample of 50 absorption systems observed with UVES and Kueyen along the lines of sight to 18 distant quasars. They recorded the spectra of the quasars over a total of 34 nights to achieve the highest possible spectral resolution and the best signal-to-noise ratio. Sophisticated automatic procedures specially designed for this programme were applied.

In addition, the astronomers used extensive simulations to show that they can correctly model the line profiles to recover a possible variation of alpha.

The result of this extensive study is that over the last 10,000 million years, the relative variation of alpha must be less than 0.6 parts per million. This is the strongest constraint from quasar absorption-line studies to date. More importantly, this new result does not support previous claims of a statistically significant change of alpha with time.

Interestingly, this result is supported by another – less extensive – analysis, also conducted with the UVES spectrometer on the VLT [2]. Even though those observations were only concerned with one of the brightest known quasars, HE 0515-4414, this independent study lends further support to the hypothesis of no variation of alpha.

Even though these new results represent a significant improvement in our knowledge of the possible (non-) variation of one of the fundamental physical constants, the present set of data would in principle still allow variations that are large compared to those implied by the measurements from the Oklo natural reactor. Nevertheless, further progress in this field is expected with the new very-high-accuracy radial velocity spectrometer HARPS on ESO’s 3.6-m telescope at the La Silla Observatory (Chile). This spectrograph works at the limit of modern technology and is mostly used to detect new planets around stars other than the Sun – it may provide an order of magnitude improvement on the determination of the variation of alpha.

Other fundamental constants can be probed using quasars. In particular, by studying the wavelengths of molecular hydrogen in the remote Universe, one can probe the variations of the ratio between the masses of the proton and the electron. The same team is now engaged in such a large survey with the Very Large Telescope that should lead to unprecedented constraints on this ratio.

Original Source: ESO News Release

Interview with Greg Klerkx, Author of “Lost in Space”

Image credit: NASA
You target NASA as being responsible for many of the problems within the space sector today. If you were given one day as NASA administrator what would you do to address the problems?

First, I would immediately initiate an independent review of all NASA centers in the context of their relevance to NASA’s new mission. Regardless of how one feels about the new Bush plan (and I have some reservations), it is the equivalent of an order to NASA from the highest office: the plan lays out what NASA is to do in coming decades, and by omission also decrees what isn’t required. Yet from the standpoint of structure and operation, the agency’s response to date has been to simply assume that one way or another, every center will be found to have some critical contribution to that new mission. That’s hard to believe; even if centers like Ames or Glenn contribute something of value to the Moon/Mars effort, it’s hard to believe a whole center is needed, along with its huge cost burden. If the center structure isn’t overhauled – which will almost certainly involve closing or consolidating one or more – it’s hard to see how the Bush plan stands a chance.

Second (and I’d probably only have time for two big things), I’d send every senior manager out into the real world – beyond the aerospace contractors, the groupies, the space media – and have them strike up conversations with ordinary people about the importance of space exploration. Much of NASA’s problem is that it’s a mutual admiration society with little connection to what those who aren’t ‘space interested’ actually think about space. I’m sure there’d be some surprises. To be fair, this problem also afflicts the alt.space sector, too.

Your book had a brief reference to Earth problems in the sense that they need to be resolved before space gets developed. In particular overpopulation and exhaustion of natural resources seem to resonate. How do you see space development advancing given these ‘Earthly’ challenges?

The first reference was to Carl Sagan’s position on human space exploration; the second, that of overpopulation and resource exhaustion, referred to Gerard O’Neill’s thinking. My own thoughts are somewhere in the middle: I think human space exploration serves a useful social purpose, yet I don’t think it’s the cure-all for humanity’s woes that some believe it to be.

There don’t seem to be any references to space advocacy groups outside of the United States. Is this because there are none, because they are not very vocal or because they are not germane to the book?

Most non-U.S. space advocacy groups tend to be small; the larger ones tend to be international branches of U.S. groups like the Mars Society and Planetary Society. It’s not that they’re not important, but I felt I represented their interests in reference to the U.S. groups.

There are also very few references to other national space institutes. Is this because other countries and citizens are less interested in space?

One of my primary missions in writing this book was to deflate some of the mythology that sustained (and still sustains) the original ‘Space Age’, the theory being that only through an honest assessment of the past can one find a clear way to the future. This naturally meant focusing more on the U.S. and Soviet/Russian space programs than on the programs of other countries. I think there is another book to be written on ‘international space’, or perhaps it’s more of a long magazine article since certainly the U.S. and Russia remain the most space-interested societies on Earth (this is true even with Russia’s diminished capability). Again, there are certainly other national space programs of note and which I touch on briefly – Europe’s, Japan’s, China’s – but they’re not central to what I was trying to accomplish.

If manned space flight capability were to disappear in the next 20 years do you think it will ever reappear? If so, how?

At present, human spaceflight has little military, scientific or economic significance (except for the latter’s significance to certain aerospace contractors): from a societal standpoint, human spaceflight is an endeavor sustained almost purely on emotional terms, as a beacon of national pride, creativity and adventure. If it disappeared, it would be difficult to restart, both because of the technological challenge (look at how NASA is scrambling to figure out how to return to the Moon, something that was almost routine by 1972) and because the geopolitical rationale that produced the space race and the spaceflight technology we have today is unlikely to be replicated in the future. Thus, it’s hard to imagine a future society spending the resources and energy to develop human spaceflight unless there was some new, compelling reason.

However, I don’t think the disappearance of government-sponsored human spaceflight would necessarily mean the end of human spaceflight altogether. Within 20 years, alt.space vehicles should be robust enough to ensure that at least sub-orbital spaceflight would still be around. If government-sponsored flight went away, perhaps some of the terminated technology (and technologists) would beef up the alt.space sector sufficiently to move it from sub-orbital to orbital flight. That might not be a bad scenario, actually!

If you met a bright, energetic youth that has the aptitude for science and engineering (as in the prologue) would you encourage them to enter the space sector? If so, how? If not, where would you direct them?

If they had interest in space, I wouldn’t discourage them. But I’d encourage them first to get a job with NASA and then, quickly, to leave the agency for the burgeoning entrepreneurial sector: you have to understand the beast in order to tame it, or at least to avoid being killed by it.

I’m sure you’ve heard of the latest government call to return to the Moon and then on to Mars. Again. Any thoughts on its chances for success and on what this directive means for NASA in the short and long term?

I applaud the idea of destinations for human spaceflight, but I’m disheartened that this idea is being shoved into the same old box. The initiative seems primarily designed to reinvigorate NASA, not to reinvigorate general interest in human spaceflight. Unless someone of vision and influence can see the distinction and act on it, I’m not optimistic that the initiative will meet a fate any different than those of Bush senior or Reagan (both, you’ll recall, also announced bold Moon/Mars plans to great fanfare).

The initiative has certainly caused a lot of bustle within NASA: ‘codes’ are being formed, projects are being sketched out, etc. Meanwhile, Congress is squabbling over just the first of the budget boosts needed to make the initiative happen – and remember, this is a Bush-friendly Congress! Thus, we see again the problem with a politically driven space agenda. The best thing to come out of the initiative will probably be the retirement of the shuttle and the gradual pullout from the space station. Beyond that, at present I’d give the Moon/Mars plan a 50/50 chance of success.

I have this feeling that people are building systems that are so complex that they can’t manage them adequately, whether space shuttles, 777s or computer operating systems. Thoughts?

I don’t think complexity alone is the Achilles’ heel of any given system or device. 777s are fine machines with a great track record of functioning as advertised. I complain about my Windows OS as much as anyone, but if I step back from my irritation, it actually works quite well most of the time.

The shuttle is a disastrous machine not because of its innate complexity but because of its Rube Goldberg design: it’s not just complex, it’s overcomplicated – it’s a bit of this, a bit of that, all the while being sold as every cure for every problem (well, less so now, but that’s how it went originally). Worse, NASA and its shuttle contractors have known this from the beginning and yet have continued to sell the shuttle as a robust, operational vehicle. It isn’t, it never has been, and it never will be.

To me, an individual’s pursuit of life, liberty and happiness is contrary to a state project that requires effort from all taxpayers yet only benefits a few. How does space development amplify life, liberty and happiness for everyone? From reading your book I get the feeling that you disapprove of a strong central government with a lot of control. As the government gets stronger, more controlling and more centralized do you see better times or worse times for space development?

I am neither for nor against ‘big government’ as a rule. That said, I believe there are some enterprises that are absolutely the province of government, such as health care, environmental protection and education. All members of a given society deserve a minimum standard of quality where such things are concerned; it should be the responsibility of the state – or, if you like, that collective of citizens that governs and funds itself – to provide such things, and they should never be subject to the necessarily cold machinations of the market.

There are other things that can, and should, be largely removed from government control. Spaceflight is one of them, at least in part. I am obviously a fan of space exploration and space travel, but I do not consider them to be fundamental to ‘life, liberty and the pursuit of happiness’. Therefore, I see tremendous potential for the market to grab hold of certain aspects of spaceflight that are now monopolized by the government and paid for by the taxpayers: or, as you put it, which require the effort of many and benefit few.

Buy Lost in Space from Amazon.com

Interview by Mark Mortimer

What’s Creating the Methane, Life or Volcanoes?

Image credit: ESA
Suggestive of life, methane in another planet’s atmosphere is considered one of the four best candidates for detecting habitable conditions using remote sensing and telescope spectrographs. While methane can be made both by biological and non-biological processes, it is also degraded by non-biological means, so a high concentration is often interpreted as requiring a source to replenish it. If metabolism is that source, then some of the prerequisites for a steady-state ecosystem may be in play.

On Earth there are four gases linked to the presence of life and habitable conditions: water vapor, carbon dioxide, methane and molecular oxygen (O2, or its proxy, ozone O3). Water is essential to all biology we understand today, while the exchange of carbon dioxide and oxygen constitutes the collective respirator for photosynthesis and breathable worlds. The dominant gas on Mars today is by far carbon dioxide.

With methane, there are methanogenic organisms whose metabolism produces this gas: methanogenesis converts carbon dioxide to methane. Since strong chemical reactions quickly destroy (oxidize) methane at the Martian surface, any methane found today requires some source of replenishment, which could be a clue to active biology. Such biosynthesis leaves a signature of life even in specimens where no fossils are visible.

Michael J. Mumma of Goddard Space Flight Center first reported in a poster at a recent planetary conference [DPS] that his preliminary search for methane with two ground-based infrared telescopes had found something interesting. His survey turned up intriguing signs of what may be methane’s spectral line in the Martian atmosphere.

These hints have now been confirmed by the European orbiter, Mars Express. Using an instrument called the Planetary Fourier Spectrometer (PFS), the work reported in Nature magazine identified the characteristic spectral fingerprint of methane. “We have detected methane at concentrations of ten parts per billion,” said Vittorio Formisano of the Institute of Physics of Interplanetary Space in Rome and the principal investigator in the PFS team.

The current martian atmosphere is 99% thinner than the Earth’s. The surface temperature averages -64°F (-53°C), but varies from about 200°F below zero during polar nights to 80°F (27°C) at midday peaks near the equator. The global picture of Mars is sometimes compared terrestrially to Antarctic dry regions, only colder.

Carbon, nitrogen and methane would be the gaseous precursors required to sustain or transform Mars from its current inhospitable state to a warmer, microbe-friendly planet. Because researchers believe that methane can persist in the Martian atmosphere for less than 300 years, any methane they find must arise from a recent source, such as production by methane-producing bacteria. This close link to biology gives methane its less scientific name of swamp gas.
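
Those two numbers, the roughly ten parts per billion reported by Mars Express and a photochemical lifetime of a few hundred years, already say something about the size of the required source. A rough steady-state estimate is sketched below; the atmospheric mass and the 300-year lifetime are assumed round figures, so the answer is only an order of magnitude.

```python
mars_atmos_mass = 2.5e16   # kg, approximate total mass of the Martian atmosphere
molar_mass_co2 = 0.044     # kg/mol (CO2 dominates the atmosphere)
molar_mass_ch4 = 0.016     # kg/mol
mixing_ratio = 10e-9       # 10 parts per billion by mole fraction
lifetime_years = 300.0     # assumed photochemical lifetime

total_moles = mars_atmos_mass / molar_mass_co2
ch4_mass = total_moles * mixing_ratio * molar_mass_ch4   # kg of CH4 currently aloft
source = ch4_mass / lifetime_years                       # kg/yr needed to sustain it

print(f"CH4 inventory ~ {ch4_mass / 1e3:,.0f} tonnes")            # ~90,000 tonnes
print(f"required steady source ~ {source / 1e3:,.0f} tonnes/yr")  # a few hundred tonnes/yr
```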

The European Mars Express mission is capable of detecting methane in the martian atmosphere. As Agustin Chicarro, Mars Express Project Scientist said, these “investigations will provide clues as to why the north of the planet is so smooth and the south so rugged, how the Tharsis and Elysium mounds were lifted up and whether active volcanoes exist on Mars today.”

There are some problems with trying to understand the history of methane and other greenhouse gases on Mars. There is no evidence on Mars of large limestone deposits from the first billion years, which would be directly linked to large amounts of CO2, a greenhouse gas.

Methane — which can be created naturally by volcanic eruptions or produced by primitive life — may thus be a missing piece of the puzzle in finding out whether organic remnants might once have sustained a primordial Mars. The last period of active volcanism on Mars ended well before the 300 years or so that methane can survive in today’s martian atmosphere. University of Buffalo volcanologist Tracy Gregg told Astrobiology Magazine, “the youngest surficial activity discovered to date (and it’s probably 1 million years old, which would be considered quite young, and possibly “active” on Mars) is in a region that contains no large volcanic structures of any kind.” Mars’ gigantic volcano Olympus Mons was active until about 100 million years ago.

Earlier observations had speculated on methane concentrations as high as 50-70 parts per million, far above the ten parts per billion detected by Mars Express. This low level could not likely sustain a global pattern suggestive of a biosphere, but might support local ecologies if the methane has some underground source. Whatever the final concentration might be, its appearance in such an unstable atmosphere has taken on importance for unravelling the mysteries of a possible martian biosphere. The most frequently mentioned example of a martian methane economy centers on a deep biosphere of methane-rich biochemistry, or anaerobic methanogens.

Original Source: Astrobiology Magazine

Landing on a Comet

Image credit: ESA
Rosetta’s lander Philae will do something never before attempted: land on a comet. But how will it do this, when the kind of surface it will land on is unknown?

With the surface composition and condition largely a mystery, engineers found themselves with an extraordinary challenge; they had to design something that would land equally well on either solid ice or powder snow, or any state in between.

In the tiny gravitational field of a comet, landing on a hard icy surface might cause Philae to bounce off again. Alternatively, hitting a soft snowy one could result in it sinking. To cope with either possibility, Philae will touch down as softly as possible. In fact, engineers have likened it more to docking in space.

Landing on a comet is nothing like landing on a large planet: you do not have to fight against the pull of the planet’s gravity, and there is no atmosphere.

The final touchdown velocity will be about one metre per second. That is near walking pace. However, as anyone who has walked into a wall by mistake will tell you, it is still fast enough to do some damage. So, two other strategies have been implemented.

Firstly, to guard against bouncing off, Philae will fire harpoons upon contact to secure itself to the comet.

Secondly, to prevent Philae from disappearing into a snowy surface, the landing gear is equipped with large pads to spread its weight across a broad area – which is how snowshoes work on Earth, allowing us to walk on powdery falls of snow.

When necessity forced Rosetta’s target comet to be changed in spring 2003 from Comet Wirtanen to Comet 67P/Churyumov-Gerasimenko, the landing team re-analysed Philae’s ability to cope. Because Comet Churyumov-Gerasimenko is larger than Wirtanen, with three times the radius, it has a stronger gravitational field with which to pull down Philae.
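
The bouncing worry is easy to quantify: for a body the size of a comet nucleus, the escape velocity is comparable to Philae’s planned touchdown speed, which is why the harpoons matter. The radius and density below are assumed round numbers for a 67P-class nucleus, not measured values.

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
radius = 2000.0   # m, assumed ~2 km nucleus radius
density = 500.0   # kg/m^3, assumed porous ice/dust mixture

mass = density * (4.0 / 3.0) * math.pi * radius**3
v_escape = math.sqrt(2 * G * mass / radius)
print(f"escape velocity ~ {v_escape:.1f} m/s, versus a planned touchdown at ~1 m/s")
```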

In testing it was discovered that the landing gear is capable of withstanding a landing at 1.5 metres per second – better than originally assumed.

In addition, Rosetta will gently push out the lander from a low altitude, to lessen its fall. In the re-analysis, one small worry was that Philae might topple if it landed on a slope at high speed. So the lander team developed a special device called a ‘tilt limiter’ and attached it to the lander before lift-off to prevent this happening.

In fact, the unknown nature of the landing environment only serves to highlight why the Rosetta mission is vital in the first place. Astronomers and planetary scientists need to learn more about these dirty snowballs that orbit the Sun.

Original Source: ESA News Release

Mars Express Confirms Methane Discovery

Image credit: ESA
During recent observations from the ESA Mars Express spacecraft in orbit around Mars, methane was detected in its atmosphere.

Whilst it is too early to draw any conclusions on its origin, exciting as they may be, scientists are thinking about the next steps to take in order to understand more.

From the time of its arrival at Mars, the Mars Express spacecraft started producing stunning results. One of the aims of the mission is analysing in detail the chemical composition of the Martian atmosphere, known to consist of 95% carbon dioxide plus 5% minor constituents. It is also from these minor constituents, which scientists expect to be oxygen, water, carbon monoxide, formaldehyde and methane, that we may get important information on the evolution of the planet and possible implications for the presence of past or present life.

The presence of methane has been confirmed thanks to the observations of the Planetary Fourier Spectrometer (PFS) on board Mars Express during the past few weeks. This instrument is able to detect the presence of particular molecules by analysing their ‘spectral fingerprints’ – the specific way each molecule absorbs the sunlight it receives.

The measurements confirm so far that the amount of methane is very small – about 10 parts in a thousand million – so its production rate is probably small. However, the exciting question ‘where does this methane come from?’ remains.

Methane, unless it is continuously produced by a source, only survives in the Martian atmosphere for a few hundred years because it quickly oxidises to form water and carbon dioxide, both present in the Martian atmosphere. So, there must be a mechanism that refills the atmosphere with methane.

“The first thing to understand is how exactly the methane is distributed in the Martian atmosphere,” says Vittorio Formisano, Principal Investigator for the PFS instrument. “Since the amount of methane is so small, we need to take more measurements. Only then will we have enough data to make a statistical analysis and understand whether there are regions of the atmosphere where methane is more concentrated.”

Once this is done, scientists will try to establish a link between the planet-wide distribution of methane and possible atmospheric or surface processes that may produce it. “Based on our experience on Earth, the methane production could be linked to volcanic or hydro-thermal activity on Mars. The High Resolution Stereo Camera (HRSC) on Mars Express could help us identify visible activity, if it exists, on the surface of the planet,” continues Formisano. Clearly, if that were the case, it would have a very important consequence, as present-day volcanic activity has never been detected on Mars.

Other hypotheses could also be considered. On Earth, methane is a by-product of biological activity, such as fermentation. “If we have to exclude the volcanic hypothesis, we could still consider the possibility of life,” concludes Formisano.

“In the next few weeks, the PFS and other instruments on-board Mars Express will continue gathering data on the Martian atmosphere, and by then we will be able to draw a more precise picture,” says Agustin Chicarro, ESA Mars Express Project Scientist.

Thanks to the PFS instrument, scientists are also gathering precious data about isotopes in atmospheric molecules such as water and carbon dioxide – very important for understanding how the planet was formed and for adding clues about atmospheric escape. The PFS also gives important hints about water-cloud formation above the tops of volcanoes, and shows the presence of active photochemical processes in the atmosphere.

Original Source: ESA News Release

Book Review: Lost in Space

In Greg’s view NASA, the premier space institution, is a government bureaucracy more concerned with preserving itself than with extending the space frontier. He alludes to conspiracies and to too close a relationship between NASA and its predominant suppliers, Boeing and Lockheed Martin. Much of the source of this problem is the perceived current purpose of NASA, which is to garner as many votes as possible for the party in power. One need only consider NASA’s inception. When President Kennedy was looking for a means of countering the Soviets, he considered space but argued, “We shouldn’t be spending this kind of money because I’m not that interested in space”. Nonetheless, soon after saying this, he gave NASA the mandate to go to the Moon. NASA then achieved this goal while at the same time ensuring contracts were provided to constituents in each of the 50 states. Ever since then, NASA has not had the necessary political backing for a large-scale enterprise or even the maintenance of the status quo. Sadly, without political necessity or an immediate economic benefit, the dreamers are getting drenched with the reality of too high a cost to extend the frontier for too low an economic return.

The deciding factor for all this dreaming is the cost of accessing space. Usually quoted as a cost per pound (or kilogram), the current value is one or two orders of magnitude too high for establishing an industry. Further, according to Greg, established big business and government garner greater benefits from maintaining their control over all elements, and they therefore don’t want to reduce the cost, nor see anyone else reduce it. This doesn’t mean it won’t happen. There is the X-Prize and its front runner, Rutan’s White Knight aircraft, which will launch the rocket ship SpaceShipOne into sub-orbital flight. Robert Zubrin has his Mars Society. The Mars Habitat analogue on Devon Island is conditioning people for an eventual presence on Mars. MirCorp was an endeavour to privatize the Mir space station, thus annulling governments’ current monopoly on housing humans in space. Almost all the well-known advocates of alternative access to space get a mention. Yet, with all their brilliant engineering constructs and all their courtship of politicians, somehow the feeling from reading the book is that there is just not enough of a reward to ever overcome the cost.

This book is about a dream, not some academic juxtaposing of facts and issues for dissemination to automatons sitting around a boardroom table. It is an anguished cry as this dream founders, not because of any lack of ability, but, in the author’s view, because of the short-sightedness or incomprehension of bureaucracy. You can’t sit on the fence after reading this book: either you want things changed for the better or you want to give up altogether. If you are interested in advancing humans in space, you will read of many other like-minded people and their successes. You will also find many routes for pursuing your own preference for advocating space development, most of which don’t involve a boardroom table.

As a view into the alternative access to space movement, this book is excellent. However, as a view into the contributions of NASA and the space industry, it is very one-sided. NASA the institution is given a very negative persona: a self-interested, overpowering bore. Yet the individuals within NASA all seem to be exceptionally fine. Then, in considering the people working on alternative access to space, the book seems to say that they can do no wrong. All their ideas are eminently favourable and worthy of public support and funding. A more balanced view would have been fairer, but likely less passionate.

In summary, if you want to know where NASA has gone wrong, or about the many ideas that individuals have been and are expounding for space access, Lost in Space is the book. Perhaps unexpectedly, it also contains an interesting view of the power of individuals within a large democracy. Just be ready for passion about a dream, as this book has lots of it.

Review by Mark Mortimer

More information from Amazon.com