JIMO Ion Engine Passes the Test

Image credit: NASA/JPL

A new ion engine design, under consideration for NASA’s Jupiter Icy Moons Orbiter mission, has been successfully tested. This was the first performance test of the Nuclear Electric Xenon Ion System, which will use a nuclear reactor to generate electricity for the spacecraft’s ion engine – previous ion engines, like those on Deep Space 1 and SMART-1, are solar powered. The new engine operated at nearly 10 times the power of the Deep Space 1 thruster, and should be able to run for 10 years; enough time to visit each of Jupiter’s icy moons, which are potential candidates for life.

A new ion propulsion engine design, one of several candidate propulsion technologies under study by NASA’s Project Prometheus for possible use on the proposed Jupiter Icy Moons Orbiter mission, has been successfully tested by a team of engineers at NASA’s Jet Propulsion Laboratory, Pasadena, Calif.

The event marked the first performance test of the Nuclear Electric Xenon Ion System (Nexis) ion engine at the high-efficiency, high-power, and high-thrust operating conditions needed for use in nuclear electric propulsion applications. For this test the Nexis engine was powered using commercial utility electrical power. Ion engines used on the proposed Jupiter Icy Moons Orbiter spacecraft would draw their power from an on-board space nuclear reactor. The ion engines, or electric thrusters, would propel the orbiter around each of the icy worlds orbiting Jupiter — Ganymede, Callisto and Europa — to conduct extensive, close-range exploration of their makeup, history and potential for sustaining life.

“On the very first day of performance testing, the Nexis thruster demonstrated one of the highest efficiencies of any xenon ion thruster ever tested,” said Dr. James Polk, the principal investigator of the ion engine under development at JPL.

The test was conducted on December 12, in the same vacuum chamber at JPL where earlier this year, the Deep Space 1 flight spare ion thruster set the all-time endurance record of 30,352 hours (nearly 3.5 years) of continuous operation. The Nexis engine operated at a power level of over 20 kilowatts, nearly 10 times that of the Deep Space 1 thruster, which enables greater thrust and ultimately higher spacecraft velocities for a given spacecraft mass. It is designed to process two metric tons of propellant, 10 times the capability of the Deep Space 1 engine, and operate for 10 years, two to three times the Deep Space 1 thruster life.

Team members working on the Nexis engine also helped develop the first ion engine ever flown on NASA’s highly successful Deep Space 1 mission, which validated 12 high-risk advanced technologies, among them the use of the first ion engine in space.

“The Nexis thruster is a larger, high performance descendant of the Deep Space 1 thruster that achieves its extraordinary life by replacing the metal, previously used in key components, with advanced carbon based materials,” said Tom Randolph, the Nexis program manager at JPL. “The thruster’s revolutionary performance results from an extensive design process including simulations using detailed computer models developed and validated with the Deep Space 1 life test, and other component test data.”

Unlike the short, high-thrust burns of most chemical rocket engines, which use solid or liquid fuels, an ion engine emits only a faint blue glow of electrically charged atoms of xenon – the same gas found in photo flash tubes and in many lighthouse bulbs. The thrust from the engine is as gentle as the force exerted by a sheet of paper held in the palm of your hand. Over the long haul, though, the engine can deliver 20 times as much thrust per kilogram of fuel as traditional rockets.

Key to the ion technology is its high exhaust velocity. The ion engine can run on a few hundred grams of propellant per day, making it lightweight. Less weight means less cost to launch, yet an ion-propelled spacecraft can go much faster and farther than any other spacecraft.
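As a rough illustration (an arithmetic check, not from the release), the figures quoted above are mutually consistent: two metric tons of xenon spread over a ten-year operating life does work out to "a few hundred grams of propellant per day."

```python
# Consistency check using the article's quoted figures:
# two metric tons of xenon processed over a ten-year life.
propellant_kg = 2000.0        # two metric tons
lifetime_days = 10 * 365.25   # ten years in days

grams_per_day = propellant_kg * 1000 / lifetime_days
print(round(grams_per_day))   # -> 548, i.e. "a few hundred grams" per day
```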

“This test, in combination with the recent test of the High Power Electric Propulsion ion engine at NASA’s Glenn Research Center, is another example of the progress we are making in developing the technologies needed to support flagship space exploration missions throughout the solar system and beyond,” said Alan Newhouse, director, Project Prometheus. “We have challenged our team with difficult performance goals and they are demonstrating their ability to be creative in overcoming technical challenges.”

NASA’s Project Prometheus is making strategic investments in space nuclear fission power and electric propulsion technologies that would enable a new class of missions to the outer Solar System, with capabilities far beyond those possible with current power and propulsion systems. The first such mission under study, the Jupiter Icy Moons Orbiter, would launch in the next decade and provide NASA significantly improved scientific and telecommunications capabilities and mission design options. Instead of generating only hundreds of watts of electricity like the Cassini or Galileo missions, which used radioisotope thermoelectric generators, the Jupiter Icy Moons Orbiter could have up to tens of thousands of watts of power, increasing the potential science return many times over.

Development of the Nexis ion engine is being carried out by a team of engineers from JPL; Aerojet, Redmond, Wash.; Boeing Electron Dynamic Devices, Torrance, Calif.; NASA’s Marshall Space Flight Center, Huntsville, Ala.; Colorado State University, Fort Collins, Colo.; Georgia Institute of Technology, Atlanta, Ga.; and the Aerospace Corporation, Los Angeles, Calif.

For more information about Project Prometheus on the Internet, visit: http://spacescience.nasa.gov/missions/prometheus.htm .

Information on the proposed Jupiter Icy Moons Orbiter mission is available at the NASA JIMO mission site.

Original Source: NASA/JPL News Release

Dark Matter Bends Light from a Distant Quasar

Image credit: SDSS

Gravitational lensing happens when the light from a distant object, such as a quasar, is distorted by the gravity of a closer object. Astronomers have discovered just such a lens, where the distortions are so great that they must be caused by a significant amount of dark matter – the visible material alone couldn’t be responsible. Dark matter is inferred from its gravitational influence on galaxies and stars in the Universe, but so far astronomers aren’t really sure what it is: whether it’s just regular matter too cold and dim to be seen from Earth, or some kind of exotic particle.

Sloan Digital Sky Survey scientists have discovered a gravitationally lensed quasar with the largest separation ever recorded, and, contrary to expectations, found that four of the most distant, most luminous quasars known are not gravitationally lensed.

Albert Einstein’s Theory of General Relativity predicts that the gravitational pull of a massive body can act as a lens, bending and distorting the light of a distant object. A massive structure somewhere between a distant quasar and Earth can “lens” the light of a quasar, making the image substantially brighter and producing several images of one object.

In a paper published in the December 18/25 edition of NATURE magazine, a Sloan Digital Sky Survey (SDSS) team led by University of Tokyo graduate students Naohisa Inada and Masamune Oguri report that four quasars in close proximity are, in fact, the light from one quasar split into four images by gravitational lensing.

More than 80 gravitationally lensed quasars have been discovered since the first example was found in 1979. A dozen of the cataloged lensed quasars are SDSS discoveries, of which half are the result of the work of Inada and his team.

But what makes this latest finding so dramatic is that the separation between the four images is twice as large as that of any previously known gravitationally lensed quasar. Until the discovery of this quadruple lens quasar, the largest separation known in a gravitationally lensed quasar was 7 arcseconds. The quasar found by the SDSS team lies in the constellation Leo Minor; it consists of four images separated by 14.62 arcseconds.

In order to produce such a large separation, the concentration of matter giving rise to the lensing has to be particularly high. There is a cluster of galaxies in the foreground of this gravitational lens; the dark matter associated with the cluster must be responsible for the unprecedented large separation.

“Additional observations obtained at the Subaru 8.2 meter telescope and Keck telescope confirmed that this system is indeed a gravitational lens,” explains Inada. “Quasars split this much by gravitational lensing are predicted to be very rare, and thus can only be discovered in very large surveys like the SDSS.”

Oguri added: “Discovering one such wide gravitational lens out of over 30,000 SDSS quasars surveyed to date is perfectly consistent with theoretical expectations of models in which the universe is dominated by cold dark matter. This offers additional strong evidence for such models.” (Cold dark matter, unlike hot dark matter, forms tight clumps, the kind that causes this kind of gravitational lens.)

“The gravitational lens we have discovered will provide an ideal laboratory to explore the relation between visible objects and invisible dark matter in the universe,” Oguri explained.

In a second paper to be published in the Astronomical Journal in March 2004, a team led by Gordon Richards of Princeton University used the high resolution of the Hubble Space Telescope to examine four of the most distant known quasars discovered by SDSS for signs of gravitational lensing.

Looking to great distances in astronomy is looking back in time. These quasars are seen at a time when the universe was less than 10 percent of its present age. These quasars are tremendously luminous, and are thought to be powered by enormous black holes with masses several billion times that of the Sun. The researchers said it is a real mystery how such massive black holes could have formed so early in the universe. Yet if these objects are gravitationally lensed, SDSS researchers would infer substantially smaller luminosities and therefore black hole masses, making it easier to explain their formation.

“The more distant a quasar, the more likely a galaxy lies between it and the viewer. This is why we expected the most distant quasars to be lensed,” explained SDSS researcher Xiaohui Fan of the University of Arizona. However, contrary to expectations, none of the four shows any sign of the multiple images that are the hallmark of lensing.

“Only a small fraction of quasars are gravitationally lensed. However, quasars this bright are very rare in the distant universe. Since lensing causes quasars to appear brighter and therefore easier to detect, we expected that our distant quasars were the ones most likely to be lensed,” suggested team member Zoltan Haiman of Columbia University.

“The fact that these quasars are not lensed says that astronomers have to take seriously the idea that quasars a few billion times the mass of the Sun formed less than a billion years after the Big Bang”, said Richards. “We’re now looking for more examples of high-redshift quasars in the SDSS to give theorists even more supermassive black holes to explain.”

Original Source: SDSS News Release

Rover Cameras Will Be Like Human Vision on Mars

Image credit: NASA/JPL

The mast-mounted cameras on board the Mars Exploration Rovers, Spirit and Opportunity, will provide the best view so far of the surface of the Red Planet. The cameras are the equivalent of 20/20 human vision, with a resolution of one millimetre per pixel at a range of three metres. The cameras can tilt 90 degrees up or down and pan a full 360 degrees. The first rover, Spirit, will arrive on Mars on January 3, with Opportunity arriving on January 25.

The Cornell University-developed, mast-mounted panoramic camera, called the Pancam, on board the rovers Spirit and Opportunity will provide the clearest, most-detailed Martian landscapes ever seen.

The image resolution – equivalent to 20/20 vision for a person standing on the Martian surface – will be three times higher than that recorded by the cameras on the Mars Pathfinder mission in 1997 or the Viking Landers in the mid-1970s.

From 10 feet away, Pancam has a resolution of 1 millimeter per pixel. “It’s Mars like you’ve never seen it before,” says Steven Squyres, Cornell professor of astronomy and principal investigator for the suite of scientific instruments carried by the rovers.
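As a back-of-envelope check (an illustrative calculation, not from the release), one millimetre per pixel at 10 feet corresponds to an angle of just over one arcminute, which is indeed about the finest detail 20/20 human vision can resolve:

```python
import math

# One millimetre per pixel at a range of 10 feet:
distance_m = 10 * 0.3048     # 10 feet in metres
pixel_size_m = 0.001         # 1 millimetre per pixel

angle_rad = math.atan(pixel_size_m / distance_m)
angle_arcmin = math.degrees(angle_rad) * 60
print(round(angle_arcmin, 2))  # -> 1.13 arcminutes, close to 20/20 acuity (~1 arcmin)
```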

Spirit is scheduled to land on Mars on Jan. 3 at 11:35 p.m. EST. Opportunity will touch down Jan. 25 at 12:05 a.m. EST.

The Jet Propulsion Laboratory (JPL) in Pasadena, a division of the California Institute of Technology, manages the Mars Exploration Rover project for NASA’s Office of Space Science, Washington, D.C. Cornell, in Ithaca, N.Y., is managing the rovers’ science instruments.

Pancam’s mast can swing the camera 360 degrees across the horizon and 90 degrees up or down. Scientists will know a rover’s orientation each day on the Martian surface by using data gained as the camera searches for and finds the sun in the sky at a known time of day. Scientists will determine a rover’s location on the planet by triangulating the positions of features seen on the distant horizon in different directions.

Rover science team member James Bell, Cornell associate professor of astronomy and the lead scientist for Pancam, says that high resolution is important for conducting science on Mars. “We want to see fine details. Maybe there is layering in the rocks, or the rocks are formed from sediments instead of volcanoes. We need to see the rock grains, whether they are wind-formed or shaped by water,” he says.

Also, Pancam is important for determining a rover’s travel plans. Says Bell: “We need to see details of possible obstacles that may be way off in the distance.”

As each twin-lens CCD (charge-coupled device) camera takes pictures, the electronic images will be sent to the rover’s onboard computer for a number of image processing steps, including compression, before the data are sent to Earth.

Each image, reduced to nothing more than a stream of zeros and ones, will be part of a once- or twice-daily stream of information beamed to Earth, a journey that takes 10 minutes. The data will be retrieved by NASA’s Deep Space Network, delivered to mission controllers at JPL and converted into raw images. From there, the images will be sent to the new Mars image processing facility at Cornell’s Space Sciences Building, where researchers and students will hover over computers to produce scientifically useful pictures.

During the rovers’ surface operations, from January to May 2004, there will be extensive daily planning by the Mars science team, led by Squyres. Research specialists Elaina McCartney and Jon Proton will participate in these meetings and decide how to implement the plans for Pancam and each rover’s five other instruments.

Processing pictures from 100 million miles away will be no easy feat. It took three years for Cornell faculty, staff and students to precisely calibrate the Pancam lenses, filters and detectors, and to write the software that tells the special camera what to do.

For instance, researchers Jonathan Joseph and Jascha Sohl-Dickstein wrote and perfected software that will produce images of great clarity. One of Joseph’s software routines patches the images together into larger pictures, called mosaics, and another brings out details within single images. Sohl-Dickstein’s software will allow scientists to generate color pictures and conduct spectral analysis, which is important in understanding the planet’s geology and composition.

Extensive work on the camera also was accomplished by Cornell graduates Miles Johnson, Heather Arneson and Alex Hayes. Hayes, who started working on the Mars mission as a Cornell sophomore, built a mock-up of the panoramic camera that aided the delicate color calibration and calculation of the actual Mars camera’s focal length and field of view. Johnson and Arneson spent eight months at JPL running Pancam under Mars-like conditions and collecting calibration data for the camera’s 16 filters.

For the students and recent graduates on the Pancam team, the research has been both valuable experience and education. “I stood inside a clean room at the Jet Propulsion Laboratory and performed testing on the real rovers,” says Johnson. “It was a weird but an exciting feeling standing next to such a really complex piece of equipment that would soon be on Mars.”

Original Source: Cornell University

Three Dusty Galaxy Images

Image credit: ESO

The European Southern Observatory has released three new images of distant spiral galaxies, which were taken while astronomers were searching for quasars. NGC 613 is a beautiful barred spiral galaxy in the southern constellation of Sculptor; NGC 1792 is a starburst spiral galaxy located in the southern constellation of Columba; and NGC 3627 is also known as Messier 66 and located in the constellation Leo.

Not so long ago, the real nature of the “spiral nebulae”, spiral-shaped objects observed in the sky through telescopes, was still unknown. This long-standing issue was finally settled in 1924 when the famous American astronomer Edwin Hubble provided conclusive evidence that they are located outside our own galaxy and are in fact “island universes” of their own.

Nowadays, we know that the Milky Way is just one of billions of galaxies in the Universe. They come in vastly different shapes – spiral, elliptical, irregular – and many of them are simply beautiful, especially the spiral ones.

Astronomers Mark Neeser from the Universitäts-Sternwarte München (Germany) and Peter Barthel from the Kapteyn Institute in Groningen (The Netherlands) were clearly not insensitive to this when they obtained images of three beautiful spiral galaxies with ESO’s Very Large Telescope (VLT). They did this during morning twilight, when they had to stop their normal observing programme – a search for very distant and faint quasars.

The resulting colour images (ESO PR Photos 33a-c/03) were produced by combining several CCD images in three different wavebands from the FORS multi-mode instruments.

The three galaxies are known as NGC 613, NGC 1792 and NGC 3627. They are characterized by strong far-infrared, as well as radio emission, indicative of substantial ongoing star-formation activity. Indeed, these images all display prominent dust as well as features related to young stars, clear signs of intensive star-formation.

NGC 613
NGC 613 is a beautiful barred spiral galaxy in the southern constellation Sculptor. This galaxy is inclined by 32 degrees and, contrary to most barred spirals, has many arms that give it a tentacular appearance.

Prominent dust lanes are visible along the large-scale bar. Extensive star-formation occurs in this area, at the ends of the bar, and also in the nuclear regions of the galaxy. The gas at the centre, as well as the radio properties are indicative of the presence of a massive black hole in the centre of NGC 613.

NGC 1792
NGC 1792 is located in the southern constellation Columba (The Dove) – almost on the border with the constellation Caelum (The Graving Tool) – and is a so-called starburst spiral galaxy. Its optical appearance is quite chaotic, due to the patchy distribution of dust throughout the disc of this galaxy. It is very rich in neutral hydrogen gas – fuel for the formation of new stars – and is indeed rapidly forming such stars. The galaxy is characterized by unusually luminous far-infrared radiation; this is due to dust heated by young stars.

M 66 (NGC 3627)
The third galaxy is NGC 3627, also known as Messier 66, i.e. it is the 66th object in the famous catalogue of nebulae by French astronomer Charles Messier (1730 – 1817). It is located in the constellation Leo (The Lion).

NGC 3627 is a beautiful spiral with a well-developed central bulge. It also displays large-scale dust lanes. Many regions of warm hydrogen gas are seen throughout the disc of this galaxy. The latter regions are being ionised by radiation from clusters of newborn stars. Very active star-formation is most likely also occurring in the nuclear regions of NGC 3627.

The galaxy forms, together with its neighbours M 65 and NGC 3628, the so-called “Leo Triplet”; they are located at a distance of about 35 million light-years. M 66 is the largest of the three. Its spiral arms appear distorted and displaced above the main plane of the galaxy. The asymmetric appearance is most likely due to gravitational interaction with its neighbours.

Original Source: ESO News Release

Delta II Launches GPS Satellite

Image credit: Boeing

A Boeing Delta II rocket successfully launched a Global Positioning System satellite for the US Air Force on December 21. The rocket lifted off from Cape Canaveral at 0805 UTC (3:05 a.m. EST), and the satellite was deployed 68 minutes later. The satellite, designated GPS IIR-10, was the tenth of 21 IIR-class GPS satellites that Boeing will be responsible for launching. The next scheduled Delta launch will also be carrying a GPS satellite; it’s expected to lift off in early 2004.

A Boeing [NYSE: BA] Delta II rocket has successfully deployed a Global Positioning System (GPS) satellite for the U.S. Air Force. This satellite, GPS IIR-10, was the tenth of 21 IIR class GPS satellites Boeing will launch for the Air Force.

Liftoff of the Delta II occurred at 3:05 a.m. EST from Space Launch Complex 17A, Cape Canaveral Air Force Station, Fla. The deployment sequence was completed in 68 minutes at 4:13 a.m. EST.

The GPS satellite, which will orbit nearly 11,000 miles above the Earth, was launched aboard a Delta II 7925-9.5 vehicle.
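For readers curious about the orbit, Kepler’s third law gives the period of a GPS-type orbit. This sketch assumes the standard gravitational parameter for Earth and a nominal GPS semi-major axis of about 26,560 km; neither value appears in the release:

```python
import math

# Kepler's third law: T = 2*pi * sqrt(a^3 / mu)
MU_EARTH = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
semi_major_axis_m = 26_560e3 # nominal GPS semi-major axis, ~26,560 km

period_s = 2 * math.pi * math.sqrt(semi_major_axis_m**3 / MU_EARTH)
print(round(period_s / 3600, 2))  # -> 11.97 hours: half a sidereal day
```

The roughly 12-hour period means each GPS satellite traces the same ground track twice per sidereal day, which is why the constellation is arranged the way it is.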

“Our Delta team has done an outstanding job in supporting the customer by providing another flawless launch,” said Dan Collins, vice president and program manager, Delta Programs, for Boeing. “This successful Delta launch re-affirms our pride in being a part of the GPS program, which is so vital to our nation’s security.”

Operated by U.S. Air Force Space Command, the GPS constellation provides precise navigation and timing to worldwide military and civilian users 24 hours a day, in all weather conditions. For the warfighter, GPS has enabled the development and use of cost-effective precision-guided munitions, and is considered a major component of DoD’s transformational architecture plans.

The next Delta II mission will carry the GPS IIR-11 satellite, with the launch scheduled for the first quarter of 2004 from SLC-17B, Cape Canaveral Air Force Station, Fla.

Boeing Launch Services Inc., based in Huntington Beach, Calif., is responsible for the marketing and sales of the Sea Launch and Delta family of launch vehicles to Boeing national security, civil space and commercial customers.

A unit of The Boeing Company, Integrated Defense Systems is one of the world’s largest space and defense businesses. Headquartered in St. Louis, Boeing Integrated Defense Systems is a $25 billion business. It provides systems solutions to its global military, government, and commercial customers. It is a leading provider of intelligence, surveillance, and reconnaissance; the world’s largest military aircraft manufacturer; the world’s largest satellite manufacturer and a leading provider of space-based communications; the primary systems integrator for U.S. missile defense; NASA’s largest contractor; and a global leader in launch services.

Original Source: Boeing News Release

Rovers Will Dig Trenches with Their Wheels

Image credit: NASA/JPL

Scientists are always looking for ways to cram more scientific instruments into spacecraft, and they’ve come up with an innovative idea for the Mars Exploration Rovers: using the wheels to dig trenches to see what the environment on Mars is like a few centimetres beneath the surface. Researchers from Cornell University perfected a technique in which the rover locks all but one of its six wheels and then uses the remaining wheel to churn up the dirt – tests in the lab allowed them to get at material more than 10 cm deep.

After the twin Mars Exploration Rovers bounce onto the red planet and begin touring the Martian terrain in January, onboard spectrometers and cameras will gather data and images — and the rovers’ wheels will dig holes.

Working together, a Cornell University planetary geologist and a civil engineer have found a way to use the wheels to study the Martian soil by digging the dirt with a spinning wheel. “It’s nice to roll over geology, but every once in a while you have to pull out a shovel, dig a hole and find out what is really underneath your feet,” says Robert Sullivan, senior research associate in space sciences and a planetary geology member of the Mars mission’s science team. He devised the plan with Harry Stewart, Cornell associate professor of civil engineering, and engineers at the Jet Propulsion Laboratory (JPL) in Pasadena.

The researchers perfected a digging method to lock all but one of a rover’s wheels on the Martian surface. The remaining wheel will spin, digging the surface soil down about 5 inches, creating a crater-shaped hole that will enable the remote study of the soil’s stratigraphy and an analysis of whether water once existed. For controllers at JPL, the process will involve complicated maneuvers — a “rover ballet,” according to Sullivan — before and after each hole is dug to coordinate and optimize science investigations of each hole and its tailings pile.

JPL, a division of the California Institute of Technology, manages the Mars Exploration Rover project for NASA’s Office of Space Science, Washington, D.C. Cornell, in Ithaca, N.Y., is managing the science suite of instruments carried by the two rovers.

Each rover has a set of six wheels carved from aluminum blocks, and inside each wheel hub is a motor. To spin a wheel independently, JPL operators will simply switch off the other five wheel motors. Sullivan, Stewart and Cornell undergraduates Lindsey Brock and Craig Weinstein used Cornell’s Takeo Mogami Geotechnical Laboratory to examine various soil strengths and characteristics. They also used Cornell’s George Winter Civil Infrastructure Laboratory to test the interaction of a rover wheel with the soil. Each rover wheel has spokes arranged in a spiral pattern, with strong foam rubber between the spokes; these features will help the rover wheels function as shock absorbers while rolling over rough terrain on Mars.

In November, Sullivan used JPL’s Martian terrain proving ground to collect data on how a rover wheel interacts with different soil types and loose sand. He used yellow, pink and green sand — dyed with food coloring and baked by Brock. Sullivan used a stack of large picture frames to layer the different colored sands to observe how a wheel churned out sloping tailings piles and where the yellow, pink and green sand finally landed. “Locations where the deepest colors were concentrated on the surface suggest where analysis might be concentrated when the maneuver is repeated for real on Mars,” he says.

Stewart notes similarities between these tests and those for the lunar-landing missions in the late-1960s, when engineers needed to know the physical characteristics of the moon’s surface. Back then, geologists relied on visual observations from scouting missions to determine if the lunar lander would sink or kick up dust, or whether the lunar surface was dense or powdery.

“Like the early lunar missions, we’ll be doing the same thing, only this time examining the characteristics of the Martian soil,” Stewart says. “We’ll be exposing fresh material to learn the mineralogy and composition.”

Original Source: Cornell News Release

Chandra Observes Supernova Remnant

Image credit: Chandra

A new image released from the Chandra X-Ray Observatory shows a glowing shell of gas created by the explosion of a massive star. The supernova remnant is called N63A and is thought to be 2,000 to 5,000 years old; it’s located in the Large Magellanic Cloud. A comparison of this image with optical and radio observations shows that the supernova’s shock wave is engulfing a massive cloud of material, heating gas in the remnant to about ten million degrees Celsius.

Chandra has imaged the glowing shell created by the destruction of a massive star. X-rays from Chandra (blue), combined with optical (green) and radio (red) data, reveal new details in the supernova remnant known as N63A, located in the nearby galaxy of the Large Magellanic Cloud.

The X-ray glow is from material heated to about ten million degrees Celsius by a shock wave generated by the supernova explosion. The age of the remnant is estimated to be in the range of 2,000 to 5,000 years.

Optical and radio light are brightest in the central region of the remnant, which appears as a triangular-shaped “hole” in the X-ray image. The hole is produced by absorption of X-rays in a dense cloud of cooler gas and dust on the side of the remnant nearest the Earth. A comparison of the X-ray image with the radio and optical images suggests that the shock wave is engulfing this massive cloud, so we see only the edge nearest the Earth. Collisions such as this are thought to trigger the formation of new generations of stars.

The fluffy crescent-shaped X-ray features that appear around the edge of the remnant are thought to be fragments of high-speed matter shot out from the star when it exploded, like shrapnel from a bomb. In the only other supernova remnant (the Vela supernova remnant) where such features have been observed, the crescent shapes are clearly produced by ejecta fragments. An alternative explanation is that they were produced when the shock wave swept over less-massive clouds located several light years away from the site of the explosion.

Original Source: Chandra News Release

Paul Allen is Backing SpaceShipOne

Scaled Composites has confirmed that billionaire Paul Allen is the financial backer for the company’s SpaceShipOne suborbital rocket plane – a rumour that’s been circulating in the space industry for several months. Allen’s announcement coincided with SpaceShipOne’s recent flight test, which broke the sound barrier. Allen is the third-richest person in the United States, with an estimated wealth of $22 billion.

Beagle 2 Separates from Mars Express

Image credit: ESA

The European Space Agency’s Mars Express spacecraft successfully released the British-built Beagle 2 lander this morning, completing a major milestone on its trip to Mars. Mars Express fired a pyrotechnic device which slowly released a spring and separated the two spacecraft. Since Beagle 2 has no propulsion system, controllers have no way of fine-tuning the lander’s flight path. If everything goes as planned, Beagle 2 will enter the planet’s atmosphere on December 25.

This morning, ESA’s Mars Express flawlessly released the Beagle 2 lander that it has been carrying since its launch on 2 June this year. Beagle 2 is now on its journey towards the surface of Mars, where it is expected to land early in the morning of 25 December. Mars Express, Europe’s first mission to Mars, has passed another challenging milestone on its way towards its final destination.

At 9:31 CET, the crucial sequence started to separate the Beagle 2 lander from Mars Express. As data from Mars Express confirm, the pyrotechnic device was fired to slowly release a loaded spring, which gently pushed Beagle 2 away from the mother spacecraft. An image from the on-board visual monitoring camera (VMC) showing the lander drifting away is expected to be available later today.

Since the Beagle 2 lander has no propulsion system of its own, it had to be put on the correct course for its descent before it was released. For this reason, on 16 December, the trajectory of the whole Mars Express spacecraft had to be adjusted to ensure that Beagle 2 would be on course to enter the atmosphere of Mars. This manoeuvre, called ‘retargeting’, was critical: if the entry angle is too steep, the lander could overheat and burn up in the atmosphere; if the angle is too shallow, the lander might skim like a pebble on the surface of a lake and miss its target.

This fine targeting and today’s release were crucial manoeuvres for which ESA’s Ground Control Team at ESOC (European Space Operations Centre) had trained over the past several months. The next major milestone for Mars Express will be the manoeuvre to enter into orbit around Mars. This will happen at 2:52 CET on Christmas morning, when Beagle 2 is expected to land on the surface of Mars.

“Good teamwork by everybody – ESA, industry and the Beagle 2 team – has got one more critical step accomplished. Mars, here comes Europe!” said David Southwood, ESA Director of Science.

Original Source: ESA News Release

The Universe Used to Be More Blue

Image credit: ESO

Although the Universe is currently a beige colour overall, it used to be more blue, according to astronomers with the European Southern Observatory. This was caused by the predominantly hot, young blue stars in the most distant galaxies – astronomers are seeing them as they were when the Universe was only 2.5 billion years old. The astronomers worked out the distances and colours of more than 300 galaxies contained within the Hubble Deep Field South survey, which took a deep look at a region of sky in the southern constellation of Tucana.

An international team of astronomers [1] has determined the colour of the Universe when it was very young. While the Universe is now kind of beige, it was much bluer in the distant past, at a time when it was only 2,500 million years old.

This is the outcome of an extensive and thorough analysis of more than 300 galaxies seen within a small southern sky area, the so-called Hubble Deep Field South. The main goal of this advanced study was to understand how the stellar content of the Universe was assembled and has changed over time.

Dutch astronomer Marijn Franx, a team member from the Leiden Observatory (The Netherlands), explains: “The blue colour of the early Universe is caused by the predominantly blue light from young stars in the galaxies. The redder colour of the Universe today is caused by the relatively larger number of older, redder stars.”

The team leader, Gregory Rudnick from the Max-Planck-Institut für Astrophysik (Garching, Germany) adds: “Since the total amount of light in the Universe in the past was about the same as today and a young blue star emits much more light than an old red star, there must have been significantly fewer stars in the young Universe than there are now. Our new findings imply that the majority of stars in the Universe were formed comparatively late, not so long before our Sun was born, at a moment when the Universe was around 7,000 million years old.”

These new results are based on unique data collected during more than 100 hours of observations with the ISAAC multi-mode instrument at ESO’s Very Large Telescope (VLT), as part of a major research project, the Faint InfraRed Extragalactic Survey (FIRES). The distances to the galaxies were estimated from their brightness in different optical and near-infrared wavelength bands.

Observing the early Universe
It is now well known that the Sun was formed some 4.5 billion years ago. But when did most of the other stars in our home Galaxy form? And what about stars in other galaxies? These are some of the key questions in present-day astronomy, but they can only be answered by means of observations with the world’s largest telescopes.

One way to address these issues is to observe the very young Universe directly – by looking back in time. For this, astronomers take advantage of the fact that light emitted by very distant galaxies travels a long time before reaching us. Thus, when astronomers look at such remote objects, they see them as they appeared long ago.

Those remote galaxies are extremely faint, however, and these observations are therefore technically difficult. Another complication is that, due to the expansion of the Universe, light from those galaxies is shifted towards longer wavelengths [2], out of the optical wavelength range and into the infrared region.

In order to study those early galaxies in some detail, astronomers must therefore use the largest ground-based telescopes, collecting their faint light during very long exposures. In addition they must use infrared-sensitive detectors.

Telescopes as giant eyes
The “Hubble Deep Field South (HDF-S)” is a very small portion of the sky in the southern constellation Tucana (“the Toucan”). It was selected for very detailed studies with the Hubble Space Telescope (HST) and other powerful telescopes. Optical images of this field obtained by the HST represent a total exposure time of 140 hours. Many ground-based telescopes have also obtained images and spectra of objects in this sky area, in particular the ESO telescopes in Chile.

A sky area of 2.5 × 2.5 arcmin² in the direction of HDF-S was observed in the context of a thorough study (the Faint InfraRed Extragalactic Survey; FIRES, see ESO PR 23/02). It is slightly larger than the field covered by the WFPC2 camera on the HST, but still 100 times smaller than the area subtended by the full moon.

Whenever this field was visible from the ESO Paranal Observatory and the atmospheric conditions were optimal, ESO astronomers pointed the 8.2-m VLT ANTU telescope in this direction, taking near-infrared images with the ISAAC multi-mode instrument. Altogether, the field was observed for more than 100 hours and the resulting images (see ESO PR 23/02), are the deepest ground-based views in the near-infrared Js- and H-bands. The Ks-band image is the deepest ever obtained of any sky field in this spectral band, whether from the ground or from space.

These unique data provide an exceptional view and have now allowed unprecedented studies of the galaxy population in the young Universe. Indeed, because of the exceptional seeing conditions at Paranal, the data obtained with the VLT have an excellent image sharpness (a “seeing” of 0.48 arcsec) and can be combined with the HST optical data with almost no loss of quality.

A bluer colour
The astronomers were able to detect unambiguously about 300 galaxies on these images. For each of them, they measured the distance by determining the redshift [2]. This was done by means of a newly improved method that is based on the comparison of the brightness of each object in all the individual spectral bands with that of a set of nearby galaxies.
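The team’s actual method is more sophisticated, but the underlying idea of matching an object’s brightness in several bands against redshifted reference fluxes can be sketched in a few lines of Python. Everything here is invented for illustration – the template fluxes, the crude (1+z)⁻² dimming standing in for the real wavelength shift, and the redshift grid – and is not the team’s procedure:

```python
# Sketch of the template-matching ("photometric redshift") idea:
# compare an object's fluxes in several bands against a reference
# template redshifted over a grid, and keep the best-fitting z.

def chi_square(observed, model):
    """Sum of squared flux differences between observation and model."""
    return sum((o - m) ** 2 for o, m in zip(observed, model))

def redshifted_template(template, z):
    """Crudely dim a rest-frame template: flux falls as (1+z)^-2 here,
    a stand-in for the real redshifting of the spectrum."""
    return [f / (1.0 + z) ** 2 for f in template]

def estimate_redshift(observed_fluxes, template, z_grid):
    """Return the grid redshift whose redshifted template fits best."""
    return min(
        z_grid,
        key=lambda z: chi_square(observed_fluxes, redshifted_template(template, z)),
    )
```

In practice many templates, realistic filter response curves, and proper error weighting are used; the sketch only shows the grid-search-over-chi-square structure of the comparison.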

In this way, galaxies were found in the field with redshifts as high as z = 3.2, corresponding to distances around 11,500 million light-years. In other words, the astronomers were seeing the light of these very remote galaxies as they were when the Universe was only about 2.2 billion years old.
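Figures like these follow from a short numerical integration of the light-travel (“lookback”) time as a function of redshift. The cosmological parameters in this sketch (H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7, a flat Universe) are assumptions for illustration and may differ slightly from those the team adopted:

```python
# Lookback time for a flat Universe: integrate dt = dz / [(1+z) H(z)]
# from z = 0 out to the galaxy's redshift, using the midpoint rule.

def lookback_time_gyr(z, h0=70.0, om=0.3, ol=0.7, steps=10000):
    """Light-travel time in gigayears out to redshift z."""
    hubble_time_gyr = 977.8 / h0  # 1/H0 in Gyr when H0 is in km/s/Mpc
    dz = z / steps
    total = 0.0
    for i in range(steps):
        zi = (i + 0.5) * dz                     # midpoint of each step
        e = (om * (1 + zi) ** 3 + ol) ** 0.5    # H(z)/H0 for a flat Universe
        total += dz / ((1 + zi) * e)
    return hubble_time_gyr * total
```

With these assumed parameters, lookback_time_gyr(3.2) comes out near 11.5 Gyr, consistent with the light-travel distance quoted above.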

The astronomers next determined the amount of light emitted by each galaxy in such a way that the effects of the redshift were “removed”. That is, they measured the amount of light at different wavelengths (colours) as it would have been recorded by an observer near that galaxy. This, of course, only refers to the light from stars that are not heavily obscured by dust.

Summing up the light emitted at different wavelengths by all galaxies at a given cosmic epoch, the astronomers could then also determine the average colour of the Universe (the “cosmic colour”) at that epoch. Moreover, they were able to measure how that colour has changed, as the Universe became older.
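The “cosmic colour” is essentially a luminosity-weighted average: add up the rest-frame light of all galaxies band by band, then compare the totals. A minimal sketch, with made-up galaxy fluxes standing in for the real band-by-band measurements:

```python
# Sketch of the "cosmic colour" idea: sum the rest-frame blue and red
# light of all galaxies at one epoch and express the average colour as
# a blue-to-red flux ratio (larger means bluer overall).

def cosmic_colour(galaxies):
    """galaxies: list of (blue_flux, red_flux) pairs with redshift effects
    already removed. Returns total blue flux / total red flux."""
    total_blue = sum(blue for blue, _ in galaxies)
    total_red = sum(red for _, red in galaxies)
    return total_blue / total_red

# A young, blue-dominated epoch versus an older, redder one:
young_epoch = [(5.0, 1.0), (4.0, 1.5)]
old_epoch = [(1.0, 3.0), (1.5, 4.0)]
# cosmic_colour(young_epoch) exceeds cosmic_colour(old_epoch):
# the Universe reddens as it ages.
```

The real analysis works with the full spectral energy distributions rather than two bands, but the weighting principle is the same.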

They conclude that the cosmic colour is getting redder with time. In particular, it was much bluer in the past; now, at the age of nearly 14,000 million years, the Universe has a kind of beige colour.

When did stars form?
The change of the cosmic colour with time may be interesting in itself, but it is also an essential tool for determining how rapidly stars were assembled in the Universe.

Indeed, while the star-formation in individual galaxies may have complicated histories, sometimes accelerating into true “star-bursts”, the new observations – now based on many galaxies – show that the “average history” of star-formation in the Universe is much simpler. This is evident from the observed, smooth change of the cosmic colour as the Universe became older.

Using the cosmic colour the astronomers were also able to determine how the mean age of relatively unobscured stars in the Universe changed with time. Since the Universe was much bluer in the past than it is now, they concluded that the Universe is not producing as many blue (high mass, short-lived) stars now as it was earlier, while at the same time the red (low mass, long-lived) stars from earlier generations of star formation are still present. Blue, massive stars die more quickly than red, low-mass stars, and therefore as the age of a group of stars increases, the blue short-lived stars die and the average colour of the group becomes redder. So did the Universe as a whole.

This behaviour bears some resemblance to the ageing trend in modern Western countries, where fewer babies are born than in the past and people live longer, with the overall effect that the mean age of the population is rising.

The astronomers determined how many stars had already formed when the Universe was only about 3,000 million years old. Young stars (of blue colour) emit more light than older (redder) stars. However, since there was just about as much light in the young Universe as there is today – although the galaxies are now much redder – this implies that there were fewer stars in the early Universe than today. The present study indicates that there were ten times fewer stars at that early time than there are now.

Finally, the astronomers found that roughly half of the stars in the observed galaxies have been formed after the time when the Universe was about half as old (7,000 million years after the Big Bang) as it is today (14,000 million years).

Although this result was derived from a study of a very small sky field, and therefore may not be completely representative of the Universe as a whole, it has been shown to hold in other sky fields as well.

Original Source: ESO News Release