Detailed Dark Matter Maps

Dwarf galaxy I Zwicky 18. Image credit: NASA.
Clues revealed by the recently sharpened view of the Hubble Space Telescope have allowed astronomers to map the location of invisible “dark matter” in unprecedented detail in two very young galaxy clusters.

A Johns Hopkins University-Space Telescope Science Institute team reports its findings in the December issue of the Astrophysical Journal. (Other, less-detailed observations appeared in the January 2005 issue of that publication.)

The team’s results lend credence to the theory that the galaxies we can see form at the densest regions of “cosmic webs” of invisible dark matter, just as froth gathers on top of ocean waves, said study co-author Myungkook James Jee, assistant research scientist in the Henry A. Rowland Department of Physics and Astronomy in Johns Hopkins’ Krieger School of Arts and Sciences.

“Advances in computer technology now allow us to simulate the entire universe and to follow the coalescence of matter into stars, galaxies, clusters of galaxies and enormously long filaments of matter from the first hundred thousand years to the present,” Jee said. “However, it is very challenging to verify the simulation results observationally, because dark matter does not emit light.”

Jee said the team measured the subtle gravitational “lensing” apparent in Hubble images, that is, the small distortions of galaxies’ shapes caused by gravity from unseen dark matter, to produce its detailed dark matter maps. They conducted their observations in two clusters of galaxies that were forming when the universe was about half its present age.
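As a rough illustration of the weak-lensing idea, the toy sketch below averages the shapes of many simulated background galaxies in cells on the sky; the coherent part of the average traces the lensing distortion while randomly oriented intrinsic shapes cancel out. The numbers and the simple binning scheme are illustrative assumptions, not the team’s actual pipeline, which involves careful PSF correction and a full mass-reconstruction algorithm.

```python
# Toy weak-lensing sketch: averaging galaxy ellipticities in sky cells.
# All numbers are illustrative; a real analysis of ACS images needs PSF
# correction and a mass-reconstruction method such as Kaiser-Squires.
import numpy as np

rng = np.random.default_rng(0)
n_gal = 20000
x = rng.uniform(0.0, 1.0, n_gal)            # galaxy positions (arbitrary sky units)
y = rng.uniform(0.0, 1.0, n_gal)

# A fake coherent shear signal centered on a "cluster", plus shape noise.
signal = 0.05 * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.02)
e1 = signal + rng.normal(0.0, 0.3, n_gal)   # measured ellipticity component

# Average ellipticities in coarse cells: intrinsic shapes average toward zero,
# so the cell means trace the (toy) lensing distortion.
nbins = 20
sums, _, _ = np.histogram2d(x, y, bins=nbins, weights=e1)
counts, _, _ = np.histogram2d(x, y, bins=nbins)
shear_map = sums / np.maximum(counts, 1)
print("peak cell-averaged distortion:", shear_map.max())
```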

“The images we took show clearly that the cluster galaxies are located at the densest regions of the dark matter haloes, which are rendered in purple in our images,” Jee said.

The work buttresses the theory that dark matter, which constitutes 90 percent of the matter in the universe, and visible matter should coalesce at the same places because gravity pulls them together, Jee said. Concentrations of dark matter should attract visible matter, and as a result, assist in the formation of luminous stars, galaxies and galaxy clusters.

Dark matter presents one of the most puzzling problems in modern cosmology. Invisible, yet undoubtedly there (scientists can measure its effects), its exact characteristics remain elusive. Previous attempts to map dark matter in detail with ground-based telescopes were handicapped by turbulence in the Earth’s atmosphere, which blurred the resulting images.

“Observing through the atmosphere is like trying to see the details of a picture at the bottom of a swimming pool full of waves,” said Holland Ford, one of the paper’s co-authors and a professor of physics and astronomy at Johns Hopkins.

The Johns Hopkins-STScI team was able to overcome the atmospheric obstacle through the use of the space-based Hubble telescope. The installation of the Advanced Camera for Surveys in the Hubble three years ago was an additional boon, increasing the discovery efficiency of the previous HST by a factor of 10.

The team concentrated on two galaxy clusters (each containing more than 400 galaxies) in the southern sky.

“These images were actually intended mainly to study the galaxies in the clusters, and not the lensing of the background galaxies,” said co-author Richard White, a STScI astronomer who also is head of the Hubble data archive for STScI. “But the sharpness and sensitivity of the images made them ideal for this project. That’s the real beauty of Hubble images: they will be used for years for new scientific investigations.”

The result of the team’s analysis is a series of vividly detailed, computer-simulated images illustrating the dark matter’s location. According to Jee, these images provide researchers with an unprecedented opportunity to infer dark matter’s properties.

The clumped structure of dark matter around the cluster galaxies is consistent with the current belief that dark matter particles are “collision-less,” Jee said. Unlike normal matter particles, physicists believe, they do not collide and scatter like billiard balls but rather simply pass through each other.

“Collision-less particles do not bombard one another, the way two hydrogen atoms do. If dark matter particles were collisional, we would observe a much smoother distribution of dark matter, without any small-scale clumpy structures,” Jee said.

Ford said this study demonstrates that the ACS is uniquely advantageous for gravitational lensing studies and will, over time, substantially enhance understanding of the formation and evolution of the cosmic structure, as well as of dark matter.

“I am enormously gratified that the seven years of hard work by so many talented scientists and engineers to make the Advanced Camera for Surveys is providing all of humanity with deeper images and understandings of the origins of our marvelous universe,” said Ford, who is principal investigator for ACS and a leader of the science team.

The ACS science and engineering team is concentrated at the Johns Hopkins University and the Space Telescope Science Institute on the university’s Homewood campus in Baltimore. It also includes scientists from other major universities in the United States and Europe. ACS was developed by the team under NASA contract NAS5-32865 and this research was supported by NASA grant NAG5-7697.

Original Source: JHU News Release

Dione and Rhea in the Same Frame

Saturn’s moons, Rhea and Dione. Image credit: NASA/JPL/SSI
Saturn’s sibling moons, Rhea and Dione, pose for the Cassini spacecraft in this view.

Even at this distance, it is easy to see that Dione (below) appears to have been geologically active in the more recent past, compared to Rhea (above). Dione’s smoother surface and linear depressions mark a contrast with Rhea’s cratered terrain.

Sunlit terrain seen on Rhea (1,528 kilometers, or 949 miles across) is on the moon’s Saturn-facing hemisphere. Lit terrain on Dione (1,126 kilometers, or 700 miles across) is on that moon’s leading hemisphere. North is up.

The image was taken in visible light with the Cassini spacecraft narrow-angle camera on Nov. 1, 2005, at a distance of approximately 1.8 million kilometers (1.1 million miles) from Rhea and 1.2 million kilometers (800,000 miles) from Dione. The image scale is 11 kilometers (7 miles) per pixel on Rhea and 7 kilometers (4 miles) per pixel on Dione.
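As a quick consistency check, the quoted image scales follow from the camera-to-target distances and the narrow-angle camera’s pixel size on the sky. The field-of-view figure used below (about 0.35 degrees across 1024 pixels) is an assumption for illustration, not a number from the caption.

```python
import math

# Assumed Cassini NAC optics: ~0.35 deg field of view across a 1024-pixel detector.
ifov_rad_per_pixel = math.radians(0.35) / 1024   # ~6e-6 radians per pixel

for moon, distance_km in [("Rhea", 1.8e6), ("Dione", 1.2e6)]:
    scale_km = distance_km * ifov_rad_per_pixel  # small-angle approximation
    print(f"{moon}: ~{scale_km:.0f} km per pixel")
# Rhea: ~11 km per pixel, Dione: ~7 km per pixel, matching the caption.
```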

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA’s Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo.

For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . The Cassini imaging team homepage is at http://ciclops.org .

Original Source: NASA/JPL/SSI News Release

Northern Lights on the Move

Earth’s northern lights. Image credit: Philippe Moussette.
After some 400 years of relative stability, Earth’s North Magnetic Pole has moved nearly 1,100 kilometers out into the Arctic Ocean during the last century and at its present rate could move from northern Canada to Siberia within the next half-century.
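For a rough sense of the numbers, the drift implied by the article’s figures works out to about 11 kilometers per year averaged over the past century, while reaching Siberia within fifty years would require a present-day rate several times higher. The Canada-to-Siberia distance used below is an illustrative assumption, not a figure from the study.

```python
# Rough drift-rate arithmetic from the figures in the article.
displacement_km = 1100      # movement over the last century (from the article)
avg_rate_km_per_yr = displacement_km / 100
print(f"century-average drift: ~{avg_rate_km_per_yr:.0f} km/yr")

remaining_km = 2000         # assumed distance still to cover to reach Siberia
years = 50                  # "within the next half-century"
print(f"implied present-day rate: ~{remaining_km / years:.0f} km/yr")
```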

If that happens, Alaska may be in danger of losing one of its most stunning natural phenomena – the Northern Lights.

But the surprisingly rapid movement of the magnetic pole doesn’t necessarily mean that our planet is going through a large-scale change that would result in the reversal of the Earth’s magnetic field, Oregon State University paleomagnetist Joseph Stoner reported at the annual meeting of the American Geophysical Union in San Francisco, Calif.

“This may be part of a normal oscillation and it will eventually migrate back toward Canada,” said Stoner, an assistant professor in OSU’s College of Oceanic and Atmospheric Sciences. “There is a lot of variability in its movement.”

Calculations of the North Magnetic Pole’s location from historical records go back only about 400 years, while polar observations trace back to John Ross in 1838 at the west coast of Boothia Peninsula. To track its history beyond that, scientists have to dig into the Earth to look for clues.

Stoner and his colleagues have examined the sediment record from several Arctic lakes. These sediments – magnetic particles called magnetite – record the Earth’s magnetic field at the time they were deposited. Using carbon dating and other technologies – including layer counting – the scientists can determine approximately when the sediments were deposited and track changes in the magnetic field.

The Earth last went through a magnetic reversal some 780,000 years ago. These episodic reversals, in which south becomes north and vice versa, take thousands of years and are the result of complex changes in the Earth’s outer core. Liquid iron within the core generates the magnetic field that blankets the planet.

Because of that field, a compass reading of north in Oregon will be approximately 17 degrees east from “true geographic north.” In Florida, farther away and more in line with the poles, the declination is only 4-5 degrees west.
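Those declination figures translate into a simple compass correction, with east declination taken as positive. The sketch below uses that generic navigation convention for illustration; it is not drawn from the OSU study.

```python
def true_bearing(magnetic_bearing_deg, declination_deg):
    """Convert a compass (magnetic) bearing to a true bearing.
    East declination is positive, west declination is negative."""
    return (magnetic_bearing_deg + declination_deg) % 360

# A compass pointing "north" in Oregon (declination ~17 deg east):
print(true_bearing(0, +17))    # 17.0 -> the needle sits ~17 deg east of true north
# The same reading in Florida (declination ~4-5 deg west):
print(true_bearing(0, -4.5))   # 355.5 -> ~4.5 deg west of true north
```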

The Northern Lights, which are triggered by the sun and fixed in position by the magnetic field, drift with the movement of the North Magnetic Pole and may soon be visible in more southerly parts of Siberia and Europe – and less so in northern Canada and Alaska.

In their research, funded by the National Science Foundation, Stoner and his colleagues took core samples from several lakes, but focused on Sawtooth Lake and Murray Lake on Ellesmere Island in the Canadian Arctic. These lakes, about 40 to 80 meters deep, are covered by 2-3 meters of ice. The researchers drill through the ice, extend their corer down through the water, and retrieve sediment cores about five meters deep from the bottom of the lakes.

The 5-meter core samples provide sediments deposited up to about 5,000 years ago. Below that is bedrock, scoured clean by ice about 7,000 to 8,000 years ago.

“The conditions there give us nice age control,” Stoner said. “One of the problems with tracking the movement of the North Magnetic Pole has been tying the changes in the magnetic field to time. There just hasn’t been very good time constraint. But these sediments provide a reliable and reasonably tight timeline, having consistently been laid down at the rate of about one millimeter a year in annual layers.

“We’re trying to get the chronology down to a decadal scale or better.”
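The chronology Stoner describes is easy to check against the core length: at roughly one millimeter of sediment per year, a five-meter core spans about five thousand years, consistent with the scoured bedrock from 7,000 to 8,000 years ago lying just below.

```python
# Quick check of the sediment chronology quoted above.
core_length_mm = 5 * 1000          # ~5-meter cores (from the article)
deposition_rate_mm_per_yr = 1      # ~1 mm of annually layered sediment per year
print(core_length_mm / deposition_rate_mm_per_yr, "years of record")   # 5000
```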

What their research has told Stoner and his colleagues is that the North Magnetic Pole has moved all over the place over the last few thousand years. In general, it moves back and forth between northern Canada and Siberia. But it also can veer sideways.

“There is a lot of variability in the polar motion,” Stoner pointed out, “but it isn’t something that occurs often. There appears to be a ‘jerk’ of the magnetic field that takes place every 500 years or so. The bottom line is that geomagnetic changes can be a lot more abrupt than we ever thought.”

Shifts in the North Magnetic Pole are of interest beyond the scientific community. Radiation influx is associated with the magnetic field, and charged particles streaming down through the atmosphere can affect airplane flights and telecommunications.

Original Source: NASA Astrobiology

Hayabusa Probably Didn’t Get a Sample After All

Artist’s impression of Hayabusa spacecraft. Image credit: JAXA
As has been reported, it is estimated that part of a series of attitude and orbit control commands to restore Hayabusa from its safe-hold mode did not go well, and that the functions of its major systems, including attitude control and communications, have significantly deteriorated. However, on Nov. 29, a beacon line through a low gain antenna was restored.

On Nov. 30, we started a restoration operation by turning on and off the radio frequency modulation through the autonomous diagnostic function. Subsequently, on Dec. 1, telemetry data were acquired at 8 bits per second through the low gain antenna, although the line was weak and often disconnected. According to the data transmitted so far, the attitude and orbit control commands sent on Nov. 27 did not work well due to an unknown reason, and either major attitude control trouble or a large electric power loss seems to have occurred. It is estimated that the overall power switching systems for many pieces of onboard equipment were reset as their temperature dropped substantially due to the evaporation of leaked propellant, and also because of a serious discharge of electricity from the batteries of many sets of onboard equipment and systems due to declining power generation. Details are still under analysis.

On Dec. 2, we tried to restart the chemical engine, but, even though a small thrust was confirmed, we were not able to restore full-scale operations. Consequently, the cause of the anomaly on Nov. 27 is still under investigation, and we suspect that one of the causes could be a malfunction of the chemical engine.

On Dec. 3, we found that the angles between the axis of the onboard high gain antenna (the +Z axis) and the Sun, and between that axis and the Earth, had increased to 20 to 30 degrees. As an emergency attitude control method, we decided to adopt a method of jetting out xenon from the ion engine system. Accordingly, we immediately started to create the necessary operation software. As we completed the software on Dec. 4, we changed the spin speed by xenon jet, and its function was confirmed. Without delay, we sent an attitude change command through this function.
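The off-pointing angle JAXA reports is the angle between the spacecraft’s +Z (high-gain antenna) axis and the direction to the Sun or the Earth, which follows from a simple dot product between unit vectors. The Sun direction below is a made-up example for illustration, not flight data.

```python
import numpy as np

def angle_deg(u, v):
    """Angle between two direction vectors, in degrees."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    cos_ang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))

z_axis = [0.0, 0.0, 1.0]          # spacecraft +Z (high-gain antenna) axis
sun_dir = [0.35, 0.20, 0.92]      # hypothetical Sun direction in the body frame
print(f"off-pointing angle: {angle_deg(z_axis, sun_dir):.1f} deg")   # ~23.7 deg
```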

As a result, on Dec. 5, the angles between the +Z axis and the Sun, and between the +Z axis and the Earth, recovered to 10 to 20 degrees, and the telemetry data reception and acquisition speed was restored to the maximum 256 bits per second through the mid gain antenna.
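The difference between the two quoted data rates matters for recovery work: at 8 bits per second even a small housekeeping frame takes a long time to arrive. The one-kilobyte frame size below is a hypothetical figure for illustration.

```python
frame_bits = 1024 * 8                # a hypothetical 1-kilobyte housekeeping frame
for rate_bps in (8, 256):            # the two downlink rates quoted by JAXA
    minutes = frame_bits / rate_bps / 60
    print(f"{rate_bps:>3} bps -> ~{minutes:.1f} minutes per frame")
# 8 bps -> ~17 minutes; 256 bps -> ~0.5 minutes
```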

After that, we found that there was a high possibility that the projectile (bullet) for sampling had not been discharged on Nov. 26, as we finally acquired a record from the pyrotechnics control device for projectile discharge in which we were not able to confirm data showing a successful firing. However, this may be an effect of the system power reset; therefore, we are now analyzing the details, including confirmation of the sequence before and after the landing on Nov. 26.

As of Dec. 6, the distance between Hayabusa and Itokawa is about 550 kilometers, and the distance from the Earth is about 290 million kilometers. The explorer is moving away from Itokawa toward the Earth at a relative speed of about 5 kilometers per hour.

We are now engaging in turning on, testing, and verifying onboard equipment of the Hayabusa one by one to start the ion engine. We currently plan to shift the attitude control to one using the Z-axis reaction wheel, and restart the ion engine. The restart is expected to happen no earlier than the 14th. We are currently rescheduling the plan for the return trip to earth. We need to study how to relax the engine operation efficiency. We will do our utmost to solve the problem with the attitude control (such as the restoration of the chemical engine), then find a solution for the return trip.

Original Source: JAXA News Release

Women Wrap Up 60 Days of Simulated Spaceflight

WISE bed rest study participant Dorota. Image credit: ESA
When the first women astronauts set foot on Mars, they may spare a thought for the 24 women who paved the way for lengthy space trips by giving three months of their lives to space science, two months of which involved staying in bed.

From March to May and from September to November, two different groups of 12 volunteers from eight European countries – the Czech Republic, Finland, France, Germany, the Netherlands, Poland, Switzerland, and the United Kingdom – took part in the Women International Space Simulation for Exploration (WISE) campaign on behalf of the European Space Agency (ESA), the French space agency (CNES), the Canadian Space Agency (CSA) and the US National Aeronautics and Space Administration (NASA).

The volunteers of the WISE female bedrest study underwent numerous medical tests.
They gathered at the MEDES Space Clinic in Rangueil Hospital in Toulouse, France, to take up an extraordinary challenge: a 60-day campaign of female bedrest. For two months, they had to lie down and undertake all daily activities in beds tilted at an angle of 6 degrees below horizontal, so that their heads were slightly lower than their feet. This unusual position induces physiological changes similar to those experienced by astronauts in weightlessness.
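The 6-degree head-down tilt is small but not trivial: for a bed roughly two meters long (an assumed length, not stated in the article), the head ends up about 20 centimeters lower than the feet.

```python
import math

bed_length_m = 2.0     # assumed bed length
tilt_deg = 6.0         # head-down tilt used in the WISE study
drop_cm = bed_length_m * math.sin(math.radians(tilt_deg)) * 100
print(f"head lies ~{drop_cm:.0f} cm below the feet")   # ~21 cm
```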

The last volunteers of the second WISE campaign got up on 30 November, and are now undergoing rehabilitation and medical tests lasting until 20 December. Similar tests were conducted in the pre-bedrest period for comparison.

MEDES, the French Institute for Space Medicine and Physiology, organised the selection of the volunteers and provided medical, paramedical and technical staff to support the extensive science experiments.

The main objective of the WISE campaign has been to assess the roles of nutrition and physical exercise with adapted equipment in countering the adverse effects of prolonged microgravity conditions, in order to develop the counter-measures that will be required when future astronauts venture beyond Earth orbit to explore other worlds.

The data collected by the international science teams during the WISE study will improve our knowledge of muscle condition, blood parameters, cardiovascular condition, coordination of movements, changes in endocrine and immune systems, metabolism, bone status, as well as psychological wellbeing. This will serve not only the future of human spaceflight, but our everyday lives on Earth too, by providing clues as to how to deal with osteoporosis, fight the “metabolic syndrome” that affects millions of sedentary workers who take insufficient physical exercise, assist recovery of bedridden patients, or prevent some cardiovascular conditions.

Twelve scientific teams from 11 countries – Belgium, Canada, Denmark, France, Germany, Italy, the Netherlands, Sweden, Switzerland, the United Kingdom and the United States – are involved in the study. It will take them several months to analyse their data and start publishing their findings. In order to answer certain scientific questions, a follow-up of the volunteers will continue for three more years.

“The WISE campaign has now come to a successful conclusion and I look forward to further campaigns in the future where there is this degree of international involvement and complexity,” said Didier Schmitt, Head of the Life Sciences Unit in ESA’s Directorate of Human Spaceflight, Microgravity and Exploration. “Planning for future research is already under way with a programme of bedrest campaigns being prepared, covering the next three years. This will be a combination of short-term, intermediate and long-term bedrest studies, lasting 5, 21 and 60 days, respectively. A research announcement covering this period is due to be released in the near future as part of the European programme for Life and Physical Sciences and Applications using the ISS (ELIPS). A further two bedrest studies are planned, one in Berlin and the other at the DLR in Cologne, and they have already been selected as part of the ESA Microgravity Applications Programme (MAP). These studies are currently awaiting the necessary funding, also from the ELIPS Programme.”

To mark the completion of the WISE 2005 campaigns, ESA, CNES and MEDES are to hold a press conference, together with representatives from NASA and CSA, science teams and volunteers from the second WISE campaign, at the “Cité de l’Espace” in Toulouse on 13 December.

Media representatives wishing to attend this press conference are requested to apply using the attached form, which should be returned to the address shown at the bottom of the form.

For additional information, ESA has created a website on the WISE study at:
http://www.spaceflight.esa.int/wise

Original Source: ESA Portal

Hopping Microrobots

Planetary MicroBots. Image credit: NASA
Interview with Penny Boston, Part I

If you want to travel to distant stars, or find life on another world, it takes a bit of planning. That’s why NASA has established NIAC, the NASA Institute for Advanced Concepts. For the past several years, NASA has been encouraging scientists and engineers to think outside the box, to come up with ideas just this side of science fiction. Their hope is that some of these ideas will pan out, and provide the agency with technologies it can use 20, 30, or 40 years down the road.

NIAC provides funding on a competitive basis. Only a handful of the dozens of proposals submitted are funded. Phase I funding is minimal, just enough for researchers to flesh out their idea on paper. If the idea shows merit, it then may get Phase II funding, allowing the research to continue from the pure-concept to the crude-prototype stage.

One of the projects that received Phase II funding earlier this year was a collaboration between Dr. Penelope Boston and Dr. Steven Dubowsky to develop “hopping microbots” capable of exploring hazardous terrain, including underground caves. If the project pans out, hopping microbots may some day be sent to search for life below the surface of Mars.

Boston spends a lot of time in caves, studying the microorganisms that live there. She is the director of the Cave and Karst Studies Program and an associate professor at New Mexico Tech in Socorro, New Mexico. Dubowsky is the director of the Field and Space Robotics Laboratory at MIT in Cambridge, Massachusetts. He is known in part for his research into artificial muscles.

Astrobiology Magazine interviewed Boston shortly after she and Dubowsky received their Phase II NIAC grant. This is the first of a two-part interview.

Astrobiology Magazine (AM): You and Dr. Steven Dubowsky recently received funding from NIAC to work on the idea of using miniature robots to explore subsurface caves on Mars. How did this project come about?

Penny Boston (PB): We’ve been doing quite a lot of work in caves on Earth with an eye to looking at the microbial inhabitants of these unique environments. We think they can serve as templates for looking for life forms on Mars and other extraterrestrial bodies. I published a paper in 1992, with Chris McKay and Michael Ivanov, suggesting that the subsurface of Mars would be the last refuge of life on that planet as it became colder and drier over geological time. That got us into the business of looking into the subsurface on Earth. When we did, we discovered that there is an amazing array of organisms that are apparently indigenous to the subsurface. They interact with the mineralogy and produce unique biosignatures. So it became a very fertile area for us to study.

Getting into difficult caves even on this planet is not that easy. Translating that to robotic extraterrestrial missions requires some thought. We have good imaging data from Mars showing distinct geomorphological evidence for at least lava-tube caves. So we know that Mars has at least that one type of cave that could be a useful scientific target for future missions. It’s plausible to think that there are also other types of caves and we have a paper in press in an upcoming Geological Society of America Special Paper exploring unique cave-forming (speleogenetic) mechanisms on Mars. The big sticking point is how to get around in such rigorous and difficult terrain.

AM: Can you describe what you did in the first phase of the project?

PB: In Phase I, we wanted to focus on robotic units that were small, very numerous (hence expendable), largely autonomous, and that had the mobility that was needed for getting into rugged terrains. Based on Dr. Dubowsky’s ongoing work with artificial-muscle-activated robotic motion, we came up with the idea of many, many, tiny little spheres, about the size of tennis balls, that essentially hop, almost like Mexican jumping beans. They store up muscle energy, so to speak, and then they boink themselves off in various directions. That’s how they move.

Planetary setting for large-scale planetary surface and subsurface exploration. Image credit: Render by R.D. Gus Frederick

We’ve calculated that we could probably pack about a thousand of these guys into a payload mass the size of one of the current MERs (Mars Exploration Rovers). That would give us the flexibility to suffer the loss of a large percentage of the units and still have a network that could be doing recon and sensing, imaging, and perhaps even some other science functions.
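A rough mass budget shows why “about a thousand” is plausible: dividing a MER-class payload mass (assumed here to be roughly 180 kilograms; the interview gives no figure) across a thousand units leaves on the order of 180 grams per microbot, before accounting for packing and deployment structure.

```python
payload_mass_kg = 180.0    # assumed MER-class rover mass (illustrative)
n_microbots = 1000         # "about a thousand of these guys" (from the interview)
grams_each = payload_mass_kg / n_microbots * 1000
print(f"~{grams_each:.0f} g per microbot, ignoring packing and structure")
```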

AM: How do all these little spheres co-ordinate with each other?

PB: They behave as a swarm. They relate to each other using very simple rules, but that produces a great deal of flexibility in their collective behavior that enables them to meet the demands of unpredictable and hazardous terrain. The ultimate product that we’re envisioning is a fleet of these little guys being sent to some promising landing site, exiting from the lander and then making their way over to some subsurface or other hazardous terrain, where they deploy themselves as a network. They create a cellular communication network, on a node-to-node basis.

AM: Are they able to control the direction in which they hop?

PB: We have aspirations for them ultimately to be very capable. As we move into Phase II, we’re working with Fritz Prinz at Stanford on ultra-miniature fuel cells to power these little guys, which would enable them to be able to do a fairly complex array of things. One of those capabilities is to have some control over the direction in which they go. There are certain ways that they can be built that can allow them to preferentially go in one direction or another. It’s not quite as precise as it would be if they were wheeled rovers just going on a straight path. But they can preferentially cant themselves more or less in the direction that they wish to go. So we’re envisioning that they will have at least crude control over direction. But a lot of their value has to do with their swarm motion as an expanding cloud.
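A toy simulation of that “crude control over direction” might look like the sketch below: each hop points mostly at random, with a mild bias toward a goal direction, so the swarm drifts where it is sent while still spreading out as a cloud. This is an illustrative model, not the team’s actual control scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def hop_swarm(n_bots=100, n_hops=50, hop_len=0.3, bias=0.4):
    """Toy biased-hopping swarm: each hop is random but nudged toward +x."""
    pos = np.zeros((n_bots, 2))
    for _ in range(n_hops):
        angles = rng.uniform(-np.pi, np.pi, n_bots)
        headings = bias * np.array([1.0, 0.0]) + np.column_stack((np.cos(angles), np.sin(angles)))
        headings /= np.linalg.norm(headings, axis=1, keepdims=True)
        pos += hop_len * headings
    return pos

cloud = hop_swarm()
print("mean drift:", cloud.mean(axis=0))   # drifts toward +x while the cloud spreads
```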

As wonderful as the MER rovers are, for the kind of science I do, I need something more akin to the insect robot idea pioneered by Rodney Brooks at MIT. Being able to tap into the model of insect intelligence and adaptation for exploration had long appealed to me. Adding that to the unique mobility provided by Dr. Dubowsky’s hopping idea, I think, can enable a reasonable percentage of these little units to survive the hazards of subsurface terrain – that just seemed like a magical combination to me.

AM: So in Phase I, did any of these actually get built?

PB: No. Phase I, with NIAC, is a six-month-long brain-straining, pencil-pushing study, to scope out the state of the art of the relevant technologies. In Phase II, we’re going to do a limited amount of prototyping and field-testing, over a two-year period. This is far less than what one might need for an actual mission. But, of course, that is NIAC’s mandate, to examine technology 10 to 40 years out. We’re thinking this is probably in the 10- to 20-year range.

AM: What kinds of sensors or scientific equipment do you imagine being able to put on these things?

PB: Imaging is clearly something that we would like to do. As cameras become incredibly tiny and robust, there are already units in the size range that could be mounted on these things. Possibly some of the units could be fitted with magnification capability, so one could look at the textures of the materials that they are landing on. Integrating images taken by tiny cameras on lots of different little units is one of the areas for future development. That’s beyond the scope of this project, but that’s what we’re thinking of for imaging. And then, certainly chemical sensors, being able to sniff and sense the chemical environment, which is very critical. Everything from tiny laser noses to ion-selective electrodes for gases.

We are envisioning having them not all identical, but rather an ensemble, with enough of the different kinds of units fitted out with different kinds of sensors so that the probability would still be high, even given fairly high losses of numbers of units, that we would still have a complete suite of sensors. Even though each individual unit cannot have a giant payload of sensors on it, you could have enough so that it could give significant overlap with its fellow units.
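That redundancy argument can be made concrete with a little probability: if losses are independent, the chance that at least one unit of every sensor type survives stays high even with heavy attrition. The unit counts and survival rate below are illustrative, not mission numbers.

```python
def p_complete_suite(n_per_type, p_survive, n_types):
    """Probability that at least one unit of each sensor type survives,
    assuming independent unit losses (illustrative model)."""
    p_type_ok = 1 - (1 - p_survive) ** n_per_type
    return p_type_ok ** n_types

# 1000 bots split over 5 sensor types, with 40% of units lost in the terrain:
print(p_complete_suite(n_per_type=200, p_survive=0.6, n_types=5))   # ~1.0
# Even with only 5 units per type, a full suite is still likely to survive:
print(p_complete_suite(n_per_type=5, p_survive=0.6, n_types=5))     # ~0.95
```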

AM: Will it be possible to do biological testing?

PB: I think so. Particularly if you imagine the time frame that we’re looking at, with the advances that are coming online with everything from quantum dots to lab-on-a-chip devices. Of course, the difficulty is getting sample material to those. But when we’re dealing with little ground-contacting units like our hopping microbots, you might be able to position them directly over the material that they wish to test. In combination with microscopy and wider-field imagery, I think that the capability is there to do some serious biological work.

AM: Do you have an idea of what the milestones are that you’re hoping to hit during your two-year project?

PB: We’re anticipating that by March we may have crude prototypes that have the relevant mobility. But that may be overly ambitious. Once we do have mobile units, our plan is to do field testing in real lava-tube caves that we are doing science on in New Mexico.

The field site’s already tested. As part of Phase I the MIT group came out and I taught them a little bit about caving and what the terrain was actually like. It was a big eye-opener for them. It’s one thing to design robots for the halls of MIT, but it’s another thing to design them for real-world rocky environments. It was a very educational experience for us all. I think they have a pretty good idea what the conditions are that they have to meet with their design.

AM: What are those conditions?

PB: Extremely uneven terrain, lots of crevices that these guys could get temporarily jammed in. So we’ll need modes of operation that will allow them to extricate themselves, at least with a reasonable chance of success. The challenges of line-of-sight communication in a highly rough surface. Getting over big boulders. Getting stuck in little cracks. Things of that sort.

Lava is not smooth. The interior of lava tubes is intrinsically smooth after they’re formed, but there is a lot of material that shrinks and cracks and falls down. So there are rubble piles to get around and over, and a lot of elevational change. And these are things that conventional robots don’t have the capability to do.

Original Source: NASA Astrobiology

Book Review: Miss Leavitt’s Stars

Miss Henrietta Swan Leavitt obtained work at Harvard Observatory to review photographic plates. These were coming in fast and furious from the many large observatories being built in the Americas. These plates recorded the moment, but humans needed to interpret the dots. Small differences may be due to atmospheric effects, telescope adjustments, emulsion reactions or human intervention. Yet interpreting dots was considered an unworthy task for men, so women like Miss Leavitt were paid about minimum wage to spend hours every day looking at these plates, comparing each against another and against various metrics. With their effort, characteristics were catalogued for tens of thousands of stars.

The biography of a human computer sounds dry without even cracking open a book’s cover. The task would simply be onerous repetition of the mundane. However, Johnson spends little time describing this aspect of Leavitt’s life. Actually, as Johnson acknowledges, there’s precious little remaining that describes Leavitt at all. Almost no firsthand records exist. Most documents are secondhand in nature and regard her circumstances from a very businesslike view. For example, either the observatory director or another astronomer would write discussing Leavitt’s work, her results and interest for future work. Johnson even had to dig into census data to discover where she lived and with whom. With such a dearth of information, Johnson has had to expand upon writing a biography, so he adds a good look at the venture directly related to Leavitt’s work: the estimation of the size of the universe.

As such, Johnson smoothly takes the reader on a journey through parallax measurements, red-blue shifting, luminosity, galaxies and variables. Certainly there’s Leavitt’s discovery, published in 1908, where she noted that brighter variables have longer periods. This observation came in a publication that gave a full account of 1777 variables in the Magellanic Clouds, and was so entitled. We also read of Shapley’s and Curtis’s debate in 1920 on whether the Milky Way was the universe or whether the Milky Way was just one typical galaxy amongst others. Eventually Edwin Hubble used Leavitt’s relationship for Cepheid variables to show that Barnard’s Galaxy was over 700,000 light-years away and certainly outside the realm of the Milky Way. Johnson then ends the book with a discussion of Hubble’s constant, which relates a galaxy’s velocity to its distance.
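The chain of reasoning Johnson describes can be sketched with a worked example: a period-luminosity relation gives a Cepheid’s absolute magnitude from its period, and the distance modulus then gives its distance. The calibration constants and the example star below are illustrative, not Leavitt’s or Hubble’s actual numbers.

```python
import math

def cepheid_distance_pc(period_days, apparent_mag, a=-2.81, b=-1.43):
    """Distance to a Cepheid from its period and apparent magnitude.
    M = a*log10(P) + b is an assumed, illustrative period-luminosity calibration;
    the distance modulus m - M = 5*log10(d / 10 pc) then yields the distance."""
    absolute_mag = a * math.log10(period_days) + b
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A hypothetical 30-day Cepheid seen at apparent magnitude 16.1:
d_pc = cepheid_distance_pc(30, 16.1)
print(f"{d_pc:.2e} pc  (~{d_pc * 3.26:.0f} light-years)")   # ~700,000 light-years
```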

As one can tell, this book is much more than just about Leavitt. There’s some mention of her childhood, her accommodations and relatives. There’s also some information about her vacation travels, her frequent time off for convalescence and the onset of her deafness. Johnson does add nice touches about society at the time, such as Leavitt completing the requirements for a Bachelor of Arts degree but, because she wasn’t male, only being able to receive a certificate. He also notes the better-known information, such as her epic 1914 paper on the North Polar Sequence, which at 84 pages defined 96 stars for use as a standard for all astronomers. But as most of this could have been done in a small number of pages, Johnson ably and expansively enlarges this biography to include the topic that so dominated Leavitt’s work.

Therefore, though the title may be a bit misleading, this book does an admirable job of presenting Leavitt’s life and especially her life’s interest. As well, Johnson writes all astronomical details from a generalist’s point of view, so they can easily be understood by anyone without training. Corollaries are common and clear. The occasional wandering in the subject adds to the reading rather than distracts the reader. The few pictures help visualize the main characters, while the adherence to the subject keeps the book tight and informative.

Computers will do what they’re told. But they can’t step back and deduce patterns or generalize. Humans excel at this, and in his book, Miss Leavitt’s Stars, George Johnson presents the debt all astronomers owe to Miss Henrietta Swan Leavitt, the human computer who first came to understand the relationship between the periodicity of Cepheid variables and their brightness. His book shows she was a special person who admirably worked above the call of duty to augment our knowledge one step further.

Review by Mark Mortimer

Read more reviews online, or purchase a copy from Amazon.com.

Saturn’s Graceful Crescent

Saturn’s crescent. Image credit: NASA/JPL/SSI
Feathery cloud bands fill Saturn’s graceful crescent. Features in the atmosphere are visible all the way to the terminator, the boundary between night and day, where the Sun’s rays are coming in almost horizontally.

The fact that it is possible to see down to the same level, regardless of how high the Sun is above the horizon, indicates that the atmosphere above the clouds is relatively clear.

The dark line across the top of the image is the nearly edge-on ringplane.

The image was taken in infrared light (centered at 728 nanometers) with the Cassini spacecraft wide-angle camera on Oct. 31, 2005, at a distance of approximately 1.2 million kilometers (800,000 miles) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 131 degrees. Image scale is 69 kilometers (43 miles) per pixel. The image was contrast enhanced to improve visibility of features in the atmosphere.

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA’s Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo.

For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . The Cassini imaging team homepage is at http://ciclops.org .

Original Source: NASA/JPL/SSI News Release

Dust Storms on the Moon

The Lunar Ejecta and Meteorites Experiment (LEAM). Image credit: NASA
Every lunar morning, when the sun first peeks over the dusty soil of the moon after two weeks of frigid lunar night, a strange storm stirs the surface.

The next time you see the moon, trace your finger along the terminator, the dividing line between lunar night and day. That’s where the storm is. It’s a long and skinny dust storm, stretching all the way from the north pole to the south pole, swirling across the surface, following the terminator as sunrise ceaselessly sweeps around the moon.

Never heard of it? Few have. But scientists are increasingly confident that the storm is real.

The evidence comes from an old Apollo experiment called LEAM, short for Lunar Ejecta and Meteorites. “Apollo 17 astronauts installed LEAM on the moon in 1972,” explains Timothy Stubbs of the Solar System Exploration Division at NASA’s Goddard Space Flight Center. “It was designed to look for dust kicked up by small meteoroids hitting the moon’s surface.”

Billions of years ago, meteoroids hit the moon almost constantly, pulverizing rocks and coating the moon’s surface with their dusty debris. Indeed, this is the reason why the moon is so dusty. Today these impacts happen less often, but they still happen.

Apollo-era scientists wanted to know, how much dust is ejected by daily impacts? And what are the properties of that dust? LEAM was to answer these questions using three sensors that could record the speed, energy, and direction of tiny particles: one each pointing up, east, and west.

LEAM’s three-decade-old data are so intriguing, they’re now being reexamined by several independent groups of NASA and university scientists. Gary Olhoeft, professor of geophysics at the Colorado School of Mines in Golden, is one of them:

“To everyone’s surprise,” says Olhoeft, “LEAM saw a large number of particles every morning, mostly coming from the east or west (rather than above or below), and mostly slower than speeds expected for lunar ejecta.”

What could cause this? Stubbs has an idea: “The dayside of the moon is positively charged; the nightside is negatively charged.” At the interface between night and day, he explains, “electrostatically charged dust would be pushed across the terminator sideways,” by horizontal electric fields. (Learn more: “Moon Fountains.”)
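An order-of-magnitude sketch shows why a horizontal electric field could plausibly move such grains: the electrostatic acceleration on a micron-sized grain can reach a sizeable fraction of lunar gravity. Every number below (grain size, density, surface potential, field strength) is an assumption for illustration, not a value from the LEAM data.

```python
import math

# Order-of-magnitude electrostatics for a charged moondust grain (all values assumed).
r = 1e-6                                   # grain radius: 1 micrometer
rho = 3000.0                               # grain density, kg/m^3
m = (4.0 / 3.0) * math.pi * r**3 * rho     # grain mass ~1.3e-14 kg

V = 5.0                                    # assumed grain surface potential, volts
eps0 = 8.854e-12                           # vacuum permittivity, F/m
q = 4.0 * math.pi * eps0 * r * V           # grain charge ~5.6e-16 C

E = 10.0                                   # assumed horizontal field near the terminator, V/m
a = q * E / m                              # electrostatic acceleration
print(f"a ~ {a:.2f} m/s^2  vs lunar gravity ~1.62 m/s^2")   # ~0.44 m/s^2
```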

Even more surprising, Olhoeft continues, a few hours after every lunar sunrise, the experiment’s temperature rocketed so high (near that of boiling water) that “LEAM had to be turned off because it was overheating.”

Those strange observations could mean that “electrically-charged moondust was sticking to LEAM, darkening its surface so the experiment package absorbed rather than reflected sunlight,” speculates Olhoeft.

But nobody knows for sure. LEAM operated for a very short time: only 620 hours of data were gathered during the icy lunar night and a mere 150 hours of data from the blazing lunar day before its sensors were turned off and the Apollo program ended.

Astronauts may have seen the storms, too. While orbiting the Moon, the crews of Apollo 8, 10, 12, and 17 sketched “bands” or “twilight rays” where sunlight was apparently filtering through dust above the moon’s surface. This happened before each lunar sunrise and just after each lunar sunset. NASA’s Surveyor spacecraft also photographed twilight “horizon glows,” much like what the astronauts saw.

It’s even possible that these storms have been spotted from Earth: For centuries, there have been reports of strange glowing lights on the moon, known as “lunar transient phenomena” or LTPs. Some LTPs have been observed as momentary flashes, now generally accepted to be visible evidence of meteoroids impacting the lunar surface. But others have appeared as amorphous reddish or whitish glows or even as dusky hazy regions that change shape or disappear over seconds or minutes. Early explanations, never satisfactory, ranged from volcanic gases to observers’ overactive imaginations (including visiting extraterrestrials).

Now a new scientific explanation is gaining traction. “It may be that LTPs are caused by sunlight reflecting off rising plumes of electrostatically lofted lunar dust,” Olhoeft suggests.

All this matters to NASA because, by 2018 or so, astronauts are returning to the Moon. Unlike Apollo astronauts, who never experienced lunar sunrise, the next explorers are going to establish a permanent outpost. They’ll be there in the morning when the storm sweeps by.

The wall of dust, if it exists, might be diaphanous, invisible, harmless. Or it could be a real problem, clogging spacesuits, coating surfaces and causing hardware to overheat.

Which will it be? Says Stubbs, “we’ve still got a lot to learn about the Moon.”

Original Source: NASA News Release