Warp Drives Probably Impossible After All

No warp speed ahead

Just when I was getting excited about the possibility of travelling to distant worlds, scientists have uncovered a deep flaw with faster-than-light-speed travel. There appears to be a quantum limit on how fast an object can travel through space-time, regardless of whether we are able to create a bubble in space-time or not…

First off, we have no clue about how to generate enough energy to create a “bubble” in space-time. This idea was first put on a scientific footing by Miguel Alcubierre from the University of Mexico in 1994, but before that it was popularized only by science fiction universes such as Star Trek. However, to create this bubble we need some form of exotic matter to fuel some hypothetical energy generator capable of outputting 10^45 Joules (according to calculations by Richard K. Obousy and Gerald Cleaver in the paper “Putting the Warp into Warp Drive“). Physicists are not afraid of big numbers, and we are not afraid of words like “hypothetical” and “exotic”, but to put this energy in perspective, we would need to turn all of Jupiter’s mass into energy to even hope to distort space-time around an object.

This is a lot of energy.
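
How much? Here is a quick back-of-the-envelope check (my own arithmetic, not a figure from the Obousy & Cleaver paper), using E = mc² and an approximate Jupiter mass:

```python
# Back-of-the-envelope check: converting roughly one Jupiter mass entirely
# into energy via E = mc^2 lands within an order of magnitude of the
# ~10^45 J quoted above.
M_JUPITER = 1.9e27   # kg, approximate mass of Jupiter
C = 3.0e8            # m/s, speed of light in vacuum

energy = M_JUPITER * C**2
print(f"E = mc^2 ~ {energy:.1e} J")   # ~1.7e44 J, i.e. of order 10^44-10^45 J
```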

If a sufficiently advanced human race could generate this much energy, I would argue that we would already be masters of our Universe; who would need warp drive when we could just as well create wormholes, star gates or access parallel universes? Yes, warp drive is science fiction, but it’s interesting to investigate the possibility and open up physical scenarios where warp drive might work. Let’s face it, anything less than light-speed travel is a real downer for our potential to travel to other star systems, so we need to keep our options open, no matter how futuristic.

The space-time bubble. Unfortunately, quantum physics may have the final word (Richard K. Obousy & Gerald Cleaver, 2008)

Although warp speed is highly theoretical, at least it is based on some real physics. It’s a mix of superstring and multi-dimensional theory, and warp speed does seem possible, assuming a vast supply of energy. If we can “simply” squash the tightly curled extra dimensions (beyond the “normal” four we live in) in front of a futuristic spacecraft and expand them behind it, a bubble of stationary space will be created for the spacecraft to reside in. This way, the spaceship isn’t travelling faster than light inside the bubble; the bubble itself is zipping through the fabric of space-time, facilitating faster-than-light travel. Easy.
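
For reference, the space-time geometry Alcubierre originally wrote down in 1994 (the four-dimensional version, without the extra-dimensional machinery described above) captures exactly this picture: flat space inside a bubble centred on x_s(t), with the bubble itself free to move at any coordinate speed v_s:

```latex
% Alcubierre's 1994 warp-drive metric: f(r_s) is a smooth "top-hat"
% function, roughly 1 inside the bubble and falling to 0 far outside it.
ds^2 = -c^2\,dt^2 + \bigl[\,dx - v_s(t)\,f(r_s)\,dt\,\bigr]^2 + dy^2 + dz^2,
\qquad v_s(t) = \frac{dx_s(t)}{dt}
```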

Not so fast.

According to new research on the subject, quantum physics has something to say about our dreams of zipping through space-time faster than c. What’s more, Hawking radiation would most likely cook anything inside this theoretical space-time bubble anyway. The Universe does not want us to travel faster than the speed of light.

“On one side, an observer located at the center of a superluminal warp-drive bubble would generically experience a thermal flux of Hawking particles,” say Stefano Finazzi and co-authors from the International School for Advanced Studies in Trieste, Italy. “On the other side, such Hawking flux will be generically extremely high if the exotic matter supporting the warp drive has its origin in a quantum field satisfying some form of Quantum Inequalities.”

In short, Hawking radiation (usually associated with the radiation of energy, and therefore loss of mass, of evaporating black holes) will be generated, heating the occupants of the bubble to unimaginably high temperatures. The Hawking radiation arises because horizons form at the front and rear of the bubble. Remember those big numbers physicists aren’t afraid of? Hawking radiation is predicted to roast anything inside the bubble to a possible 10^30 K (the maximum possible temperature, the Planck temperature, is 10^32 K).
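
As a sanity check on that upper limit (my own calculation from textbook constants, not a number from the Finazzi paper), the Planck temperature quoted above follows directly from fundamental constants:

```python
# The Planck temperature, T_P = sqrt(hbar * c^5 / (G * k_B^2)), evaluated
# with approximate values of the fundamental constants.
import math

HBAR = 1.055e-34   # J s, reduced Planck constant
C    = 3.0e8       # m/s, speed of light
G    = 6.674e-11   # m^3 kg^-1 s^-2, gravitational constant
K_B  = 1.381e-23   # J/K, Boltzmann constant

t_planck = math.sqrt(HBAR * C**5 / (G * K_B**2))
print(f"T_P ~ {t_planck:.1e} K")   # ~1.4e32 K, i.e. the 10^32 K quoted above
```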

Even if we could overcome this obstacle, Hawking radiation appears to be symptomatic of an even bigger problem: the space-time bubble would be unstable at the quantum level.

“Most of all, we find that the RSET [renormalized stress-energy tensor] will exponentially grow in time close to, and on, the front wall of the superluminal bubble. Consequently, one is led to conclude that the warp-drive geometries are unstable against semiclassical back-reaction,” Finazzi adds.

However, if you wanted to create a space-time bubble for subluminal (slower-than-light) travel, no horizons form, and therefore no Hawking radiation is generated. In this case, you might not be beating the speed of light, but you do have a fast and stable way of getting around the Universe. Unfortunately, we still need “exotic” matter to create the space-time bubble in the first place…

Sources: “Semiclassical instability of dynamical warp drives,” Stefano Finazzi, Stefano Liberati, Carlos Barceló, 2009, arXiv:0904.0141v1 [gr-qc], “Investigation into Compactified Dimensions: Casimir Energies and Phenomenological Aspects,” Richard K. Obousy, 2009, arXiv:0901.3640v1 [gr-qc]

Via: The Physics arXiv Blog

Astrophysics Satellite Detects Dark Matter Clue?

An international collaboration of astronomers is reporting an unusual spike of atmospheric particles that could be a long-sought signature of dark matter.

The orbiting PAMELA satellite, an astrophysics mission operated by Italy, Russia, Germany and Sweden, has detected a glut of positrons — antimatter counterparts to electrons — in the energy range theorized to be associated with the decay of dark matter. The results appear in this week’s issue of the journal Nature.

Dark matter is the unseen substance that accounts for most of the mass of our universe, and whose presence can be inferred from its gravitational effects on visible matter. When dark matter particles collide and annihilate one another, they should yield a variety of subatomic particles, including electrons and positrons.

Antiparticles account for a small fraction of cosmic rays and are also known to be produced in interactions between cosmic-ray nuclei and atoms in the interstellar medium, which is referred to as a “secondary source.”

Previous statistically limited measurements of the ratio of positron and electron fluxes have been interpreted as evidence for a primary source for the positrons, as has an increase in the total electron-positron flux at energies between 300 and 600 GeV. Primary sources could include pulsars, microquasars or dark matter annihilation. 

Lead study author Oscar Adriani, an astrophysics researcher at the University of Florence in Italy, and his colleagues are reporting a positron to electron ratio that systematically increases in a way that could indicate dark matter annihilation.

The new paper reports a measurement of the positron fraction in the energy range 1.5–100 GeV.

“We find that the positron fraction increases sharply over much of that range, in a way that appears to be completely inconsistent with secondary sources,” the authors wrote in the Nature paper. “We therefore conclude that a primary source, be it an astrophysical object or dark matter annihilation, is necessary.” Another feasible source for the antimatter particles, besides dark matter annihilation, could be a pulsar, they note.
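
For clarity, the “positron fraction” the paper reports is simply the positron flux divided by the combined electron-plus-positron flux in each energy bin. A minimal illustration (with made-up counts, not PAMELA data):

```python
# The positron fraction: phi(e+) / (phi(e+) + phi(e-)), computed per
# energy bin. The counts below are invented purely for illustration.
def positron_fraction(n_positrons, n_electrons):
    return n_positrons / (n_positrons + n_electrons)

# Hypothetical bin counts (low-energy bin, high-energy bin):
for label, n_pos, n_ele in [("~10 GeV", 120, 1880), ("~80 GeV", 45, 380)]:
    print(label, f"fraction = {positron_fraction(n_pos, n_ele):.3f}")
```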

PAMELA, which stands for Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics, was launched in June 2006 and initially slated to last three years. Mission scientists now say it will continue to collect data until at least December 2009, which will help pin down whether the positrons are coming from dark matter annihilation or a single, nearby source.

Source: Nature (there is also an arXiv/astro-ph version here)

New Particle Throws Monkeywrench in Particle Physics

The hits just keep on coming from the Department of Energy’s Fermi National Accelerator Laboratory. So far this month, the lab has announced the discovery of a rare single top quark, and then narrowed the gap (twice, actually) for the mass of the elusive Higgs boson, or “God particle,” thought to give all other particles their mass.

Now, scientists have detected a new, completely untheorized particle that challenges what physicists thought they knew about how quarks combine to form matter. They’re calling it Y(4140), reflecting its measured mass of 4140 mega-electron volts (MeV).

“It must be trying to tell us something,” said Jacobo Konigsberg of the University of Florida, a spokesman for Fermilab’s collider detector team. “So far, we’re not sure what that is, but rest assured we’ll keep on listening.”

The Standard Model of elementary particles and forces includes six quarks, which bind together to form composite particles. Credit: Fermilab

Matter as we know it comprises building blocks called quarks. Quarks fit together in various well-established ways to build other particles: mesons, made of a quark-antiquark pair, and baryons, made of three quarks. 

But recently, electron-positron colliders at Stanford’s SLAC National Accelerator Laboratory and the Japanese laboratory KEK have revealed examples of composite quark structures — named X and Y particles — that are not the usual mesons and baryons. And now, the Collider Detector at Fermilab (CDF) collaboration has found evidence for the Y(4140) particle.

The Y(4140) particle decays into a pair of other particles, the J/psi and the phi, suggesting to physicists that it might be a composition of charm and anticharm quarks. However, the characteristics of this decay do not fit the conventional expectations for such a make-up. Other possible interpretations beyond a simple quark-antiquark structure are hybrid particles that also contain gluons, or even four-quark combinations.
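
The mass value itself comes from standard invariant-mass reconstruction of the decay products. Here is a toy sketch of that kinematics (the four-momenta are invented numbers, not CDF data), showing how a J/psi and a phi with a small relative momentum combine to roughly 4.14 GeV:

```python
# Invariant mass of a two-body system from its four-momenta (E, px, py, pz),
# all in GeV with c = 1. The momenta below are chosen by hand so the J/psi
# and phi masses come out right and the pair sits near 4.14 GeV.
import math

def invariant_mass(p1, p2):
    E  = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(E**2 - px**2 - py**2 - pz**2)

jpsi = (3.103, 0.0, 0.0,  0.2)   # m = sqrt(E^2 - p^2) ~ 3.097 GeV
phi  = (1.039, 0.0, 0.0, -0.2)   # m ~ 1.019 GeV

print(f"m(J/psi phi) ~ {invariant_mass(jpsi, phi):.2f} GeV")   # ~4.14 GeV
```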

The Fermilab scientists observed Y(4140) particles in the decay of a much more commonly produced particle containing a bottom quark, called the B+ meson. Sifting through trillions of proton-antiproton collisions from Fermilab’s Tevatron, they identified a small sampling of B+ mesons that decayed in an unexpected pattern. Further analysis showed that the B+ mesons were decaying into Y(4140).

The Y(4140) particle is the newest member of a family of particles of similar unusual characteristics observed in the last several years by experimenters at Fermilab’s Tevatron as well as at KEK and the SLAC lab, which operates at Stanford through a partnership with the U.S. Department of Energy.

“We congratulate CDF on the first evidence for a new unexpected Y state that decays to J/psi and phi,” said Japanese physicist Masanori Yamauchi, a KEK spokesperson. “This state may be related to the Y(3940) state discovered by Belle and might be another example of an exotic hadron containing charm quarks. We will try to confirm this state in our own Belle data.”

Theoretical physicists are trying to decode the true nature of these exotic combinations of quarks that fall outside our current understanding of mesons and baryons. Meanwhile, experimentalists happily continue to search for more such particles.

“We’re building upon our knowledge piece by piece,” said Fermilab spokesperson Rob Roser, “and with enough pieces, we’ll understand how this puzzle fits together.”

The Y(4140) observation is the subject of an article submitted by CDF to Physical Review Letters this week. Besides announcing Y(4140), the CDF experiment collaboration is presenting more than 40 new results at the Moriond Conference on Quantum Chromodynamics in Europe this week, including the discovery of electroweak top-quark production and a new limit on the Higgs boson, in concert with experimenters from Fermilab’s DZero collaboration. 

Source: Fermilab

Fermilab Putting the Squeeze on Higgs Boson

The Standard Model describes the interactions of fundamental particles. The W boson, the carrier of the electroweak force, has a mass that is fundamentally relevant for many predictions, from the energy emitted by our sun to the mass of the elusive Higgs boson. Credit: Fermilab

Scientists at the Department of Energy’s Fermi National Accelerator Laboratory have achieved the world’s most precise measurement of the mass of the W boson by a single experiment. Combined with other measurements, a tighter understanding of the W boson mass will also lead researchers closer to the mass of the elusive Higgs boson particle.

The Higgs particle is a theoretical but as yet unseen particle, also called the “God particle,” that is believed to give other particles their mass. The W boson, which is about 85 times heavier than a proton, enables radioactive beta decay and makes the sun shine. 

Today’s announcement marks the second major discovery in a week for the international DZero collaboration at Fermilab. Earlier this week, the group announced the production of a single top quark at Fermilab’s Tevatron collider. 

For the W mass precision measurement, the DZero collaboration analyzed about 500,000 decays of W bosons into electrons and neutrinos and determined the particle's mass with a precision of 0.05 percent. Credit: Fermilab

DZero is an international experiment of about 550 physicists from 90 institutions in 18 countries. It is supported by the U.S. Department of Energy, the National Science Foundation and a number of international funding agencies. In the last year, the collaboration has published 46 scientific papers based on measurements made with the DZero particle detector.

The W boson is a carrier of the weak nuclear force and a key element of the Standard Model of elementary particles and forces, which also predicts the Higgs boson. The W boson’s exact mass is crucial for calculations that estimate the likely mass of the Higgs boson through the Higgs’ subtle quantum effects on the W boson and the top quark, an elementary particle that was discovered at Fermilab in 1995.

Scientists working on the DZero experiment now have measured the mass of the W boson with a precision of 0.05 percent. The exact mass of the particle measured by DZero is 80.401 +/- 0.044 GeV/c^2. The collaboration presented its result at the annual conference on Electroweak Interactions and Unified Theories known as Rencontres de Moriond on Sunday.
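
That 0.05 percent figure follows directly from the numbers in the result (simple arithmetic, not part of DZero’s analysis):

```python
# Relative precision of the DZero W mass measurement quoted above.
m_w      = 80.401   # GeV/c^2, central value
sigma_mw = 0.044    # GeV/c^2, uncertainty

print(f"relative precision ~ {100 * sigma_mw / m_w:.3f}%")   # ~0.055%, i.e. ~0.05%
```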

“This beautiful measurement illustrates the power of the Tevatron as a precision instrument and means that the stress test we have ordered for the Standard Model becomes more stressful and more revealing,” said Fermilab theorist Chris Quigg.

The DZero team determined the W mass by measuring the decay of W bosons to electrons and electron neutrinos. Performing the measurement required calibrating the DZero particle detector with an accuracy around three hundredths of one percent, an arduous task that required several years of effort from a team of scientists including students.

Since its discovery at the European laboratory CERN in 1983, many experiments at Fermilab and CERN have measured the mass of the W boson with steadily increasing precision. Now DZero has achieved the best precision through a painstaking analysis of a large data sample delivered by the Tevatron particle collider at Fermilab. The consistency of the DZero result with previous results speaks to the validity of the different calibration and analysis techniques used.

“This is one of the most challenging precision measurements at the Tevatron,” said DZero co-spokesperson Dmitri Denisov, of Fermilab. “It took many years of effort from our collaboration to build the 5,500-ton detector, collect and reconstruct the data and then perform the complex analysis to improve our knowledge of this fundamental parameter of the Standard Model.”

Source: Fermilab

Fermilab Scientists Discover Rare Single Top Quark

This proton-antiproton collision, recorded by the DZero collaboration, is among the single top quark candidate events. The top quark decayed and produced a bottom quark jet (b jet), a muon and a neutrino. Credit: DZero collaboration.

Scientists at Fermilab have observed particle collisions that produce single top quarks, a 1 in 20 billion find. This discovery confirms important parameters of particle physics, including the total number of quarks. Previously, top quarks had only been observed when produced by the strong nuclear force. That interaction leads to the production of pairs of top quarks. The production of single top quarks involves the weak nuclear force and is harder to identify experimentally. This observation came almost 14 years to the day after the top quark discovery in 1995.

Fermilab’s Tevatron, located near Chicago, Illinois, is currently the world’s most powerful operating particle accelerator, and the discovery was made by scientists working together on the CDF and DZero collaborations. Scientists say finding single top quarks has significance for the ongoing search for the Higgs particle.

The Fermilab accelerator complex. Credit: Fermilab

“Observation of the single top quark production is an important milestone for the Tevatron program,” said Dr. Dennis Kovar, Associate Director of the Office of Science for High Energy Physics at the U.S. Department of Energy. “Furthermore, the highly sensitive and successful analysis is an important step in the search for the Higgs.”

Searching for single-top production makes finding a needle in a haystack look easy. Only one in every 20 billion proton-antiproton collisions produces a single top quark. Even worse, the signal of these rare occurrences is easily mimicked by other “background” processes that occur at much higher rates.

Discovering single top quark production presents challenges similar to the Higgs boson search in the need to extract an extremely small signal from a very large background. Advanced analysis techniques pioneered for the single top discovery are now in use for the Higgs boson search. In addition, the single top and the Higgs signals have backgrounds in common, and the single top is itself a background for the Higgs particle.

To make the single-top discovery, physicists of the CDF and DZero collaborations spent years independently combing through the results of proton-antiproton collisions recorded by their respective experiments.

CDF is an international experiment of 635 physicists from 63 institutions in 15 countries. DZero is an international experiment conducted by 600 physicists from 90 institutions in 18 countries.

The CDF detector, about the size of a 3-story house, weighs about 6,000 tons. Credit: Fermilab

Each team identified several thousand collision events that looked the way experimenters expect single top events to appear. Sophisticated statistical analysis and detailed background modeling showed that a few hundred collision events produced the real thing. On March 4, the two teams submitted their independent results to Physical Review Letters.

The two collaborations earlier had reported preliminary results on the search for the single top. Since then, experimenters have more than doubled the amount of data analyzed and sharpened selection and analysis techniques, making the discovery possible. For each experiment, the probability that background events have faked the signal is now only one in nearly four million, allowing both collaborations to claim a bona fide discovery that paves the way to more discoveries.
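
That “one in nearly four million” figure is how particle physicists state the conventional five-sigma discovery threshold as a one-sided tail probability. A minimal sketch of the conversion, using only the standard normal distribution (my own illustration, not the experiments’ statistical machinery):

```python
# Convert a background-only ("fake signal") probability into the
# equivalent number of standard deviations of a one-sided Gaussian tail.
from statistics import NormalDist

p_value = 1 / 3.7e6   # roughly "one in nearly four million"
significance = NormalDist().inv_cdf(1 - p_value)
print(f"p = {p_value:.2e}  ->  {significance:.2f} sigma")   # ~5 sigma
```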

“I am thrilled that CDF and DZero achieved this goal,” said Fermilab Director Pier Oddone. “The two collaborations have been searching for this rare process for the last fifteen years, starting before the discovery of the top quark in 1995. Investigating these subatomic processes in more detail may open a window onto physics phenomena beyond the Standard Model.”

Source: Fermilab

Is There a Mysterious Black Hole Constant?

Space-time warping as a small black hole orbits a larger black hole (Don Davis)

If you found yourself in the unfortunate situation of orbiting a black hole, you may be in for a rather dizzying and unpredictable ride. If the black hole is spinning, it will flatten out under centrifugal forces, much like the Earth bulges slightly at the equator, but the black hole’s bulge will be radically greater. As the shape of the black hole changes, so does its gravitational profile.

As you are not orbiting a spherical black hole, you can no longer expect a boring, predictable orbit; your path will become wild and chaotic, seemingly random. However, it would appear that there is an underlying constant to the mayhem, and what’s more, this constant has also been observed in a more pedestrian system: a three-body Newtonian system. So what’s the link? Physicists aren’t quite sure…

When a massive star exhausts its fuel, it may collapse in on itself to create a black hole (after some exciting supernova action). The angular momentum of the original star is expected to be preserved, producing a rapidly spinning black hole. If the black hole “has no hair” (i.e. it has no electrical charge), the gravitational field solely depends on its mass and spin. If there is deformation due to the spin, the gravitational field changes, sending any orbiting body (like a neutron star) on a crazy roller-coaster ride.

In a new paper, Clifford Will of Washington University in St. Louis describes the scenario with obvious excitement. “The orbits go wild — they gyrate and spin, they’re incredibly complex. It’s fantastic,” Will says.

However, physicist Brandon Carter discovered a mathematical constant back in 1968 showing that these apparently chaotic orbits are predictable, and the constant even applies to orbits around extremely warped space-time. “Black holes have this extra constant that restores the regularity of the orbits,” comments Saul Teukolsky of Cornell University. “It’s a mystery. Every other situation where we have these extra constants, we have symmetry. But there’s no symmetry for an orbiting black hole — that’s why it is regarded as a miracle.”

Quite simply, physicists have no idea why the Carter constant could arise from the General Relativity description of a spinning black hole. Now, to make the problem even more perplexing, Will carried out a classical (Newtonian) 2-body simulation with a third body orbiting. Again, the same constant appeared. It would appear that there is something special about the predictability of an orbit around this black hole configuration.
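
To make “extra constant” concrete: for a test particle of mass μ moving around a Kerr (spinning, uncharged) black hole with spin parameter a, the conserved quantity Carter found, alongside the energy E and the axial angular momentum L_z, is usually written in Boyer-Lindquist coordinates as:

```latex
% Carter constant for Kerr geodesics (Boyer-Lindquist coordinates):
% p_theta is the polar momentum, E the energy, L_z the axial angular
% momentum, a the spin parameter and mu the particle's rest mass.
Q = p_\theta^{2} + \cos^{2}\theta \left[ a^{2}\left(\mu^{2} - E^{2}\right)
    + \frac{L_z^{2}}{\sin^{2}\theta} \right]
```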

Teukolsky, who worked on similar problems for his Ph.D. in 1970, remains baffled by these results. However, Will continues to investigate the problem by including a term for black hole frame dragging. In this situation, the spinning black hole drags space-time around with it, “creases” (or ripples) in space-time being pulled along in the direction of spin. In this case, the Carter constant disappears, only to return when higher-order terms are added to the equations.

This all means one of two things. It could simply be an artefact of the mathematics, a curiosity that will eventually be rooted out of the equations. But there is a tantalising possibility that we are seeing a characteristic of exotic rotating black holes, where the configuration of the surrounding fabric of space-time can allow a predictable orbit to emerge from the apparent chaos…

Source: Science News

Powerful Fusion Laser to Recreate Conditions Inside Exoplanets

A powerful laser could create the conditions inside a giant exoplanet (Sunbeamtech)

We’ve all heard that the Large Hadron Collider (LHC) will collide particles together at previously unimaginable energies. In doing so, the LHC will recreate the conditions immediately after the Big Bang, thereby allowing us to catch a glimpse of what particles the Universe would have been filled with at this time. In a way, the LHC will be a particle time machine, allowing us to see the high energy conditions last seen immediately after the Big Bang, 13.7 billion years ago.

So, if we wanted to understand the conditions inside a giant exoplanet, how could we do it? We can’t measure them directly; we have to create a laboratory experiment that recreates the conditions in the core of one of these huge gas giants. Much like the LHC will recreate the conditions of the Big Bang, a powerful laser intended to kick-start fusion reactions will be used in an effort to help scientists get a very brief look into the cores of these distant worlds…

The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory in California is ready for action. The facility will perform fusion experiments, hopefully making a self-sustaining nuclear fusion reaction a reality using an incredibly powerful laser (firing at a hydrogen isotope fuel). Apart from the possibility of finding a way to kick-start a viable fusion energy source (other laboratories have tried, but only sustained fusion for an instant before fizzing out), the results from the laser tests will aid the management of the US nuclear weapon stockpile (since there have been no nuclear warhead tests in 15 years, data from the experiments may help the military deduce whether or not their bombs still work).

Fusion energy and nuclear bombs to one side, there is another use for the laser. It could be used to recreate the crushing pressures inside a massive exoplanet so we can glean a better understanding of what happens to matter at these crushing depths.

The NIF laser can deliver 500 trillion watts in a 20-nanosecond burst, which may not sound very long, but the energy delivered is immense. Raymond Jeanloz, an astronomer at the University of California, Berkeley, will have the exciting task of using the laser, aiming it at a small iron sample (800 micrometres in diameter) to generate, for a moment, pressures exceeding a billion times atmospheric pressure. That’s 1,000 times the pressure at the centre of the Earth.
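
Taking those two figures at face value (a quick sketch of the arithmetic, not an official NIF specification), the energy in a single burst works out to roughly ten megajoules, all delivered onto a sub-millimetre target:

```python
# Energy per shot implied by the quoted figures: 500 trillion watts
# sustained over a 20-nanosecond burst.
power    = 500e12   # W
duration = 20e-9    # s

energy = power * duration
print(f"energy per burst ~ {energy:.1e} J")   # ~1.0e7 J, i.e. ~10 MJ
```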

On firing the laser, the heat will vaporize the iron, blasting a jet of gas so powerful, it will send a shock wave through the metal. The resulting compression is what will be observed and measured, revealing how the metal’s crystalline structure and melting point change at these pressures. The results from these tests will hopefully shed some light on the formation of the hundreds of massive exoplanets discovered in the last two decades.

“The chemistry of these planets is completely unexplored,” says Jeanloz. “It’s never been accessible in the laboratory before.”

Now that is one impressive laboratory experiment…

Source: New Scientist

Repaired too Late? Tevatron May Beat LHC in Hunt for Higgs Boson

The CDF detector at Fermilab's Tevatron accelerator (Fermilab)

The Large Hadron Collider (LHC) is billed as the next great particle accelerator that will give us our best chance yet at discovering the elusive exchange particle (or boson) of the Higgs field. The discovery (or not) of the Higgs boson will answer so many questions about our universe, and our understanding of the quantum world could be revolutionized.

But there’s a problem. The LHC isn’t scheduled for restart until September 2009 (a full year after the last attempt) and particle collisions aren’t expected until October. Even then, high energy collisions won’t be likely until 2010, leaving the field wide open for competing accelerator facilities to redouble their efforts at making this historic discovery before the LHC goes online.

The Tevatron, at Fermi National Accelerator Laboratory (Fermilab) in Illinois, is currently the most powerful accelerator in the world and has refined high-energy particle collisions so much that scientists estimate there is a 50% chance of a Higgs boson discovery by the end of 2009…

If this was a USA vs. Europe competition to discover the Higgs particle, the Tevatron would have a clear advantage. Although it’s old (the first configuration was completed in 1984), and set to be superseded by the LHC in 2010, the Tevatron is a proven particle accelerator with an impressive track record. Accelerator techniques and technology have been refined, making high energy hadron collisions routine. However, Fermilab scientists are keen to emphasise that they aren’t trying to beat the LHC in the search for the Higgs boson.

“We’re not racing CERN,” said Fermilab Director Pier Oddone. He points out that there is a lot of collaborative work between Fermilab and CERN, and that all scientists, no matter which continent they are on, are working toward a common goal. In reality, I doubt this is the case. When searching for one of the most coveted prizes in modern quantum physics, it’s more a case of ‘every lab for itself.’ Scientists at Fermilab have confirmed as much, saying they are “working their tails off” analysing data from the Tevatron.

“Indirectly, we’re helping them,” says Dmitri Denisov, DZero (one of the Tevatron’s detectors) spokesman, of his European competition. “They’re definitely feeling the heat and working a little harder.”

For the Standard Model to be complete, the Higgs particle must be found. If it does exist, physicists have put upper and lower bounds on its possible mass: between 114 and 184 GeV, well within the sensitivity of the Tevatron detectors. Physicists have calculated that a Higgs particle in this range can be created in the Tevatron’s high-energy proton-antiproton collisions, so it should only be a matter of time until it is discovered. They even give the Tevatron a 50:50 chance of a Higgs particle discovery by the New Year.

Last summer, both key particle experiments (CDF and DZero) focused on detecting Higgs particles with a mass of 170 GeV (at this value a particle would be easier to detect from the background noise). However, no Higgs particles were detected. Now physicists will expand the search above and below this value. Therefore, if the Higgs boson exists, it would be useful if it has a mass as close as possible to 170 GeV. Estimates suggest a 150 GeV Higgs boson could be discovered as early as this summer, well before the LHC has even been repaired. If the mass of the Higgs boson is around the 120 GeV mark, it might take Tevatron scientists until 2010 to verify whether a Higgs boson has been detected.

Source: New Scientist

Stellar Jets are Born Knotted

Herbig Haro object HH47 (a stellar jet), observed with the Hubble Space Telescope

Some of the most beautiful structures observed in the Universe are the intricate jets of supersonic material speeding away from accreting stars, such as young proto-stars and stellar mass black holes. These jets are composed of highly collimated gas, rapidly accelerated and ejected from circumstellar accretion disks. The in-falling gas from the disks, usually feeding the black hole or hungry young star, is somehow redirected and blown into the interstellar medium (ISM).

Much work is being done to understand how accretion disk material is turned into a rapid outflow, forming an often knotted, clumpy cloud of outflowing gas. The general idea was that the stellar jet is ejected in a steady flow (like a fire hose), only to break up as it interacts with the surrounding ISM. However, a unique collaboration between plasma physicists, astronomers and computational scientists may have uncovered the true nature of these knotted structures. They didn’t become knotted; they were born that way…

“The predominant theory says that jets are essentially fire hoses that shoot out matter in a steady stream, and the stream breaks up as it collides with gas and dust in space—but that doesn’t appear to be so after all,” said Adam Frank, professor of astrophysics at the University of Rochester and co-author of the recent publication. According to Frank, the exciting results uncovered by the international collaboration suggest that, far from being a steady stream of gas ejected from the circumstellar accretion disk, the jets are “fired out more like bullets or buckshot.” It is therefore little wonder that the vast stellar jets appear twisted, knotted and highly structured.

A member of the collaboration, Professor Sergey Lebedev, and his team at Imperial College London made an attempt to replicate the physics of a star in the laboratory, and the experiment matched the known physics of stellar jets very well. The pioneering work by Lebedev is being lauded as possibly the “best” astrophysical experiment ever carried out.

Lebedev applied a high-powered pulse of energy to an aluminium disk. Within the first few billionths of a second, the aluminium began to evaporate, generating a small cloud of plasma. This plasma became an accretion disk analogue, a microscopic equivalent of the plasma being dragged into a proto-star. In the centre of the disk, the aluminium eroded completely, creating a hole through which a magnetic field, applied from below the disk, could penetrate.

It would appear that the dynamics of the magnetic field interacting with the plasma accurately depicts the observed characteristics of extended stellar jets. At first, the magnetic field pushes the plasma aside around the disk’s hole, but its structure evolves by creating a bubble, then twisting and warping, forming a knot in the plasma jet. Then, a very important event occurs; the initial magnetic “bubble” pinches off and is propelled away. Another magnetic bubble forms to continue the process all over again. These dynamic processes cause packets of plasma to be released in bursts and not in the steady, classical “fire hose” manner.

“We can see these beautiful jets in space, but we have no way to see what the magnetic fields look like,” says Frank. “I can’t go out and stick probes in a star, but here we can get some idea—and it looks like the field is a weird, tangled mess.”

By shrinking this cosmic phenomenon into a laboratory experiment, the investigators have shed some light on the possible mechanism driving the structure of stellar jets. It appears that magnetic processes, not ISM interactions, shape the knotted structure of stellar jets when they are born, not after they have evolved.

Source: EurekAlert

The Journey of Space Exploration: Ex-Astronaut Views on NASA

Why has "one small step for man" turned into "one giant leap backward" for NASA? (NASA)

It reads like the annual progress report from my first year at university. He lacks direction, he’s not motivated and he has filled his time with extra-curricular activities, causing a lack of concentration in lectures. However, it shouldn’t read like an 18-year-old’s passage through the first year of freedom; it should read like a successful, optimistic and inspirational prediction about NASA’s future in space.

What am I referring to? It turns out that the Houston university where President John F. Kennedy gave his historic “We choose to go to the Moon” speech back in 1962 has commissioned a report recommending that NASA give up its quest to return to the Moon and focus more on environmental and energy projects. The reactions of several astronauts from the Mercury, Apollo and Shuttle eras have now been published. The conclusions in the Rice University report may have been controversial, but the reactions of the six ex-astronauts went well beyond that, summing up the concern and frustration they feel for a space agency they once risked their lives for.

At the end of the day, it all comes down to how we interpret the importance of space exploration. Is it an unnecessary expense, or is it part of scientific endeavour where the technological spin-offs are more important than we think?

John F. Kennedy speaking at Rice University in 1962. How times have changed (NASA)
The article published on the Houston Chronicle website (Chron.com) talks about the “surprising reactions” of the six former astronauts questioned about the recommendation for NASA from Rice University’s James A. Baker III Institute for Public Policy. However, I’d argue that much of what they say is not surprising in the slightest. These men and women were active in the US space agency during some of the most profound and exciting times in spaceflight history, so it is little wonder that they may be a little exasperated by the spaceflight problems currently besieging NASA. The suggestion that NASA should give up the Moon for more terrestrial pursuits is a tough pill to swallow, especially for these pioneers of spaceflight.

It is widely accepted that NASA is underfunded, mismanaged and falling short of its promises. Many would argue that this is a symptom of an old, cumbersome government department that has lost its way. This could be down to institutional failings, lack of investment or loss of vision, but the situation is getting worse for NASA. Regardless, something isn’t right, and we are now faced with a five-year gap in US manned spaceflight capability, forcing NASA to buy Russian Soyuz flights. The Shuttle replacement, the Constellation Program, has been written off by many before it has even carried out its first test launch.

So, from their unique perspective, what do these retired astronauts think of the situation? It turns out that some agree with the report, others are strongly opposed to it, and all voice concern for the future of NASA.

Kathryn Thornton, before a Shuttle mission (NASA)
Walt Cunningham flew aboard Apollo 7 in 1968, the first manned mission of the Apollo Program. At 76, Cunningham sees no urgency in going back to the Moon, but he also believes the concerns about global warming are “a great big scam.” His feelings about global warming may be misplaced, but he is acutely aware of the funding issue facing NASA, concerned the agency will “keep sliding downhill” if nothing is done.

Four-time Shuttle astronaut Kathryn Thornton agrees that the agency is underfunded and overstretched, and she is dubious about the Institute’s recommendation that NASA should focus all its attention on environmental issues for four years. “I find it hard to believe we would be finished with the energy and environment issues in four years. If you talk about a re-direction, I think you talk about a permanent re-direction,” Thornton added.

Gene Cernan, commander of the 1972 Apollo 17 mission, believes that space exploration is essential to inspire the young and invigorate the educational system. He is shocked by the Institute’s recommendation to pull back on space exploration. The 74-year-old was the last human to walk on the Moon, and he believes NASA shouldn’t be focused on ways to save the planet; other agencies and businesses can do that.

“It just blows my mind what they would do to an organization like NASA that was designed and built to explore the unknown.” — Gene Cernan

Apollo astronaut Gene Cernan covered with moon dust (NASA)
John Glenn, the first US astronaut to orbit the Earth and a former senator, is appalled at the suggestion of abandoning projects such as the International Space Station. Although Glenn, now 87, agrees with many of the points argued in the report, he said, “We have a $115 billion investment in the most unique laboratory ever put together, and we are cutting out the ability to do research that may have enormous value to everybody right here on the Earth? This is folly.”

Sally Ride, 57, a physicist and the first American woman to fly into space, believes the risky option of extending the life of the Shuttle should be considered to allow continued US manned access to the space station. The greater risk, being frozen out of the outpost, simply is not an option. However, she supports the report’s suggestion that NASA should also focus on finding solutions to climate change. “It will take us awhile to dig ourselves out,” she said. “But the long-term challenge we have is solving the predicament we have put ourselves in with energy and the environment.”

Franklin Chang Diaz, who shares the world record for the most spaceflights (seven), believes that NASA has been given a very bad deal. He agrees with many of the report’s recommendations, not because the space agency should turn its back on space exploration, but because the agency has been put in an impossible situation.

“NASA has moved away from being at the edge of high tech and innovation,” said Chang Diaz. “That’s a predicament NASA has found itself in because it had to carry out a mission to return humans to the moon by a certain time (2020) and within a budget ($17.3 billion for 2008). It’s not possible.”

In Conclusion

This discussion reminds me of a recent debate not about space exploration, but about another science and engineering endeavour here on Earth. The Large Hadron Collider (LHC) has its critics, who argue that this $5 billion piece of kit is not worth the effort, and that the money spent on accelerating particles could be better spent on finding solutions for climate change, or a cure for cancer.

You did NOT just say that! Brian Cox's expression says it all... (still from the BBC's Newsnight program)
In a September 2008 UK televised debate on BBC Newsnight between Sir David King (former Chief Scientific Advisor to the UK government) and particle physicist Professor Brian Cox, King questioned the importance of the science behind the LHC. By his limited reasoning, the LHC was more “navel-searching”, “curiosity-driven” research with little bearing on the advancement of mankind. In King’s view the money would be better spent on finding solutions to known problems, such as climate change. It is fortunate Brian Cox was there to set the record straight.

Prof. Cox explained that the science behind the LHC is “part of a journey” where the technological spin-offs and the knowledge gained from such a complex experiment cannot be predicted before embarking on scientific endeavour. Indeed, advanced medical technologies are being developed as a result of LHC research; the Internet may be revolutionized by new techniques being derived from work at the LHC; even the cooling system for the LHC accelerator electromagnets can be adapted for use in fusion reactors.

The point is that we may never fully comprehend what technologies, science or knowledge we may gain from huge experiments such as the LHC, and we certainly don’t know what spin-offs we can derive from continued advancement of space travel technology. Space exploration can only enhance our knowledge and scientific understanding.

If NASA starts pulling back on endeavours in space, taking a more introverted view of finding specific solutions to particular problems (such as tackling climate change to the detriment of space exploration, as suggested by the Rice University report), we may never fully realise our potential as a race, and many of the problems here on Earth will never be solved…

Sources: Chron.com, Astroengine.com