So how do you take the temperature of one of the most exotic objects in the Universe? A neutron star (~1.35 to 2.1 solar masses, measuring only 24 km across) is the remnant of a supernova after a large star has died. Although they are not massive enough to become black holes, neutron stars still accrete matter, pulling gas from a binary partner, often undergoing prolonged periods of flaring.
Fortunately, we can observe X-ray flares (using instrumentation such as Chandra), but it isn’t the flare itself that reveals the temperature or structure of a neutron star.
At the AAS conference last week, results from an X-ray observing campaign of MXB 1659-29, a quasi-persistent X-ray transient source (i.e. a neutron star that flares for long periods), revealed some fascinating insights into the physics of neutron stars: as the crust of a neutron star cools, the crustal composition is revealed and the temperature of these exotic supernova remnants can be measured…
During a flare outburst, neutron stars generate X-rays. These X-ray sources can be measured and their evolution tracked. In the case of MXB 1659-29, Ed Cackett (Univ. of Michigan) used data from NASA’s Rossi X-ray Timing Explorer (RXTE) to monitor the cooling of the neutron star crust after an extended period of X-ray flaring. MXB 1659-29 flared for 2.5 years until it “turned off” in September 2001. Since then, the source has been periodically observed to measure the exponential decrease in its X-ray emissions.
So why is this important? After a long period of X-ray flaring, the crust of a neutron star will heat up. However, it is thought that the core of the neutron star will remain comparatively cool. When the neutron star stops flaring (as the accretion of gas, feeding the flare, shuts off), the heating source for the crust is lost. During this period of “quiescence” (no flaring), the diminishing X-ray flux from the cooling neutron star crust reveals a huge wealth of information about the characteristics of the neutron star.
During quiescence, astronomers observe X-rays emitted from the surface of the neutron star (as opposed to the flares), so direct measurements of the star itself can be made. In his presentation, Cackett showed how the X-ray flux from MXB 1659-29 decayed exponentially and then levelled off at a constant flux. This means the crust cooled rapidly after the flaring and eventually reached thermal equilibrium with the neutron star core. Therefore, by using this method, the neutron star core temperature can be inferred.
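To make that inference concrete, the cooling curve can be modelled as an exponential decay that levels off at a constant, core-set flux. Here’s a minimal sketch in Python of how such a fit might work (the flux values below are illustrative placeholders, not the actual RXTE measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

# Crust-cooling model: exponential decay toward a constant level set by the core.
#   F(t) = A * exp(-t / tau) + C
#   A   -- initial excess flux from the flare-heated crust
#   tau -- e-folding (cooling) timescale of the crust
#   C   -- quiescent flux from the core, i.e. the thermal equilibrium level
def cooling_curve(t, A, tau, C):
    return A * np.exp(-t / tau) + C

# Illustrative data only (NOT the MXB 1659-29 observations):
# time since the end of the outburst (days) and observed X-ray flux (arb. units)
t_obs = np.array([30, 100, 200, 400, 600, 900, 1200, 1500], dtype=float)
f_obs = np.array([9.1, 6.4, 4.2, 2.1, 1.3, 1.05, 1.01, 1.0])

params, _ = curve_fit(cooling_curve, t_obs, f_obs, p0=(10.0, 300.0, 1.0))
A, tau, C = params
# C tracks the core temperature; tau tells you how fast the crust cooled
print(f"cooling timescale ~ {tau:.0f} days, core flux level ~ {C:.2f}")
```

The fitted constant C is the quantity that lets astronomers infer the core temperature once the crust has fully relaxed.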
Including data from another neutron star X-ray transient, KS 1731-260, the cooling rates observed during the onset of quiescence suggest these objects have well-ordered crustal lattices with very few impurities. The rapid temperature decrease (from flare to quiescence) took approximately 1.5 years to reach thermal equilibrium with the neutron star core. Further work will now be carried out using Chandra data so more information about these rapidly spinning exotic objects can be uncovered.
Suddenly, neutron stars became a little less mysterious to me in a 10-minute talk last Tuesday. I love conferences…
High altitude balloons are an inexpensive means of getting payloads to the brink of space, where all sorts of great science and astronomy can be done. A new prototype balloon, made of material as thin as plastic food wrap, was successfully checked out in an 11-day test flight, and the new design may usher in a new era of high-altitude flight. NASA and the National Science Foundation sponsored the test, which was launched from McMurdo Station in Antarctica. The balloon reached a float altitude of more than 111,000 feet and maintained it for the entire 11 days of the flight. It’s hoped that the super-pressure balloon will ultimately carry large scientific experiments to the edge of space for 100 days or more.
The flight tested the durability and functionality of the scientific balloon’s novel globe-shaped design and the unique lightweight and thin polyethylene film. It launched on December 28, 2008 and returned on January 8, 2009.
“Our balloon development team is very proud of the tremendous success of the test flight and is focused on continued development of this new capability to fly balloons for months at a time in support of scientific investigations,” said David Pierce, chief of the Balloon Program Office at NASA’s Wallops Flight Facility at Wallops Island, Va. “The test flight has demonstrated that 100 day flights of large, heavy payloads is a realistic goal.”
This seven-million-cubic-foot super-pressure balloon is the largest single-cell, super-pressure, fully-sealed balloon ever flown. When development concludes, NASA will have a 22 million-cubic-foot balloon that can carry a one-ton instrument to an altitude of more than 110,000 feet, which is three to four times higher than passenger planes fly. Ultra-long duration missions using the super pressure balloon cost considerably less than a satellite and the scientific instruments flown can be retrieved and launched again, making them ideal very-high altitude research platforms.
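To get a feel for why a balloon that size can float a one-ton instrument at such altitudes, here’s a rough back-of-envelope buoyancy estimate in Python. The air and helium densities are assumed, textbook-style values for roughly 34 km altitude, not NASA’s figures:

```python
# Rough buoyancy check for a 22-million-cubic-foot super-pressure balloon.
# Densities are approximate values at ~34 km (110,000+ ft), NOT official numbers.
FT3_TO_M3 = 0.0283168

volume_m3 = 22e6 * FT3_TO_M3          # ~623,000 m^3
rho_air = 0.010                       # kg/m^3, thin stratospheric air
rho_he = rho_air * 4.0 / 28.97        # helium at the same pressure and temperature

# Gross lift is the weight of displaced air minus the weight of the helium
gross_lift_kg = (rho_air - rho_he) * volume_m3
print(f"gross lift ~ {gross_lift_kg:,.0f} kg")   # roughly 5 tonnes
# ...leaving room for the balloon film, rigging and a one-ton instrument
```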
In addition to the super pressure test flight, two additional long-duration balloons were launched from McMurdo during the 2008-2009 campaign. The University of Maryland’s Cosmic Ray Energetics and Mass, or CREAM IV, experiment launched December 19, 2008, and landed January 6, 2009. The CREAM investigation was used to directly measure high energy cosmic-ray particles arriving at Earth after originating from distant supernova explosions elsewhere in the Milky Way galaxy. The payload for this experiment was refurbished from an earlier flight. The team released data and their findings from their first flight in August 2008.
The University of Hawaii Manoa’s Antarctic Impulsive Transient Antenna launched December 21, 2008, and is still aloft. Its radio telescope is searching for indirect evidence of extremely high-energy neutrino particles possibly coming from outside our Milky Way galaxy.
Magnetars are the violent, exotic cousins of the well-known neutron star. They emit excessive amounts of gamma-rays and X-rays, and they possess extraordinarily powerful magnetic fields. Neutron stars also have very strong magnetic fields (although weak when compared with magnetars), conserving the magnetic field of the parent star that exploded as a supernova. However, the huge magnetic field strength inferred from observations of magnetars is a mystery. Where do magnetars get their strong magnetic fields? According to new research, the answer could lie in the even more mysterious quark star…
It is well known that neutron stars have very strong magnetic fields. Neutron stars, born from supernovae, preserve the angular momentum and magnetism of the parent star. Therefore, neutron stars are extremely magnetic, often rapidly spinning bodies, ejecting powerful streams of radiation from their poles (seen from Earth as a pulsar should the collimated radiation sweep through our field of view). Sometimes, neutron stars don’t behave as they should, ejecting copious amounts of X-rays and gamma-rays, exhibiting a very powerful magnetic field. These strange, violent entities are known as magnetars. As they are a fairly recent discovery, scientists are working hard to understand what magnetars are and how they acquired their strong magnetic field.
Denis Leahy, from the University of Calgary, Canada, presented a study on magnetars at a January 6th session at this week’s AAS meeting in Long Beach, revealing that the hypothetical “quark star” could explain what we are seeing. Quark stars are thought to be the next stage up from neutron stars: as gravitational forces overwhelm the structure of neutron degenerate matter, quark matter (or strange matter) is the result. However, the formation of a quark star may have an important side effect. Colour ferromagnetism in colour-flavour locked quark matter (the densest form of quark matter) could be a viable mechanism for generating the immensely powerful magnetic flux observed in magnetars. Therefore, magnetars may be the consequence of very compressed quark matter.
These results were arrived at by computer simulation, so how can we observe the effect of a quark star — or the “quark star phase” of a magnetar — in a supernova remnant? According to Leahy, the transition from neutron star to quark star could occur from days to thousands of years after the supernova event, depending on the conditions of the neutron star. And what would we see when this transition occurs? There should be a secondary flash of radiation from the neutron star after the supernova, due to the liberation of energy as the neutron structure collapses, possibly providing astronomers with an opportunity to “see” a magnetar being “switched on”. Leahy also calculates that 1-in-10 supernovae should produce a magnetar remnant, so we have a pretty good chance of spotting the mechanism in action.
If you’re a PlayStation 3 fan, or if you just received one as a holiday gift, you may be able to do more with the system than just gaming. A group of gravity researchers have configured 16 PlayStation 3s together to create a type of supercomputer that is helping them estimate properties of the gravitational waves produced by the merger of two black holes. The research team from the University of Alabama in Huntsville and the University of Massachusetts, Dartmouth, calls their configuration the Gravity Grid, and they say the Sony PlayStation 3 has a number of unique features that make it particularly suited to scientific computation. Equally important, the raw computing power per dollar provided by the PS3 is significantly higher than anything else on the market today.
PlayStation 3s have also been used by the Folding@Home project, which harnesses the PS3’s technology to help study how proteins fold in the human body and how they sometimes fold incorrectly. This helps research into several diseases such as Parkinson’s, Alzheimer’s, cystic fibrosis, and even mad cow disease.
The PS3 uses a powerful new processor called the Cell Broadband Engine to run its highly realistic games, and can connect to the Internet so gamers can download new programs and take each other on.
The PlayStation 3 cluster used by the gravity research team can solve some astrophysical problems, such as those involving many calculations but low memory usage, at speeds matching a rented supercomputer.
“If we had rented computing time from a supercomputer center it would have cost us about $5,000 to run our [black hole] simulation one time. For this project we ran our simulation several dozens of times to test different parameters and circumstances,” study author Lior Burko told Inside Science News Service.
One of the unique features of the PS3 is that it is an open platform on which different system software can be run. Its special processor has a main CPU (called the PPU) and six special compute engines (called SPUs) available for raw computation. Moreover, each SPU performs vector operations, meaning it can compute on multiple data elements in a single step.
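As an analogy for what that SIMD-style vector hardware buys you, here’s a minimal sketch in Python with NumPy. It isn’t Cell/SPU code, but it illustrates the same idea: one operation applied across many data values at once instead of one value per step:

```python
import numpy as np

# Scalar style: one multiply-add per loop iteration, one data element at a time.
def scale_and_shift_scalar(x, a, b):
    return [a * xi + b for xi in x]

# Vector (SIMD) style: the same multiply-add expressed as a single operation
# over the whole array. An SPU does this in hardware, applying one instruction
# to several packed values simultaneously; NumPy approximates it in software.
def scale_and_shift_vector(x, a, b):
    return a * x + b

x = np.linspace(0.0, 1.0, 100_000)
assert np.allclose(scale_and_shift_scalar(x, 2.0, 1.0),
                   scale_and_shift_vector(x, 2.0, 1.0))
```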
But the low cost is especially attractive to university researchers. The Gravity Grid team received a partial donation from Sony and uses “stock” PS3s for the cluster, with no hardware modifications; the consoles are networked together using inexpensive equipment.
Gravitational waves are “ripples” in space-time that travel at the speed of light. They were predicted by Einstein’s theory of general relativity, but have never been directly observed. Research in this area is being carried out by the newly constructed NSF LIGO laboratory and various other such observatories in Europe and Asia. The ESA and NASA also have a mission planned for the near future – the LISA mission – that will attempt to detect these waves. To learn more about these waves and the recent attempts to observe them, please visit the LISA mission website.
2008 has been a landmark year for space science and physics endeavour. We’ve peered deep into the cosmos and fitted new pieces into some of the most intriguing universal puzzles. We’ve explored other planets with technology we wouldn’t have recognised a decade ago. We’ve assembled some of the most complex experiments to test theories of the very small and the very big. 2008 has built strong foundations for the future of the exploration of the Universe in so many ways…
This week, Time Magazine published the top 10 “Scientific Discoveries” of 2008. Technically, as many readers pointed out, a few of the entries are not “discoveries” but “achievements”. Be that as it may, space exploration and physics dominated, with the #1 slot going to the LHC and #2 going to the Phoenix Mars Lander (#4 and #6 went to the Chinese spacewalk and exoplanets respectively). Reading the superb suggestion put forward by Astrofiend (thanks!) gave me the push I needed to create a Universe Today version of a “Top 10” for 2008 (I’d love to do a top 20, but I have to find some time for Christmas shopping).
This top ten will focus on the last 12 months of Universe Today articles, so take a journey through the year’s events in space science and physics to find your favourite scientific endeavour of 2008. If you can’t find the article, just leave the name of the specific endeavour and we’ll do the rest. Please leave all nominations in the comments box below…
You have one week to get your nominations in (so your deadline is December 19th), and I’ll compile the list of winners hopefully in time for Christmas. The winners will be chosen not only according to popularity, but also by your unbiased Universe Today team…
2008 has been an astounding year of scientific discovery. To celebrate this fact, Time Magazine has listed the “Top 10 Scientific Discoveries”, where space exploration and physics dominate. Other disciplines are also listed, including zoology, microbiology, technology and biochemistry, but the number 1 slot goes to the most ambitious physics experiment of our time. Can you guess what it is? Also, of all our endeavours in space, can you pick out the three that Time Magazine has singled out as being the most important?
As we approach the end of the year, ready to welcome in 2009, it is good to take stock and celebrate the mind-blowing achievements mankind has accomplished. Read on for the top 10 scientific discoveries of 2008…
The best thing about writing for a leading space news blog is that you gain a wonderful overview of all our endeavours in astronomy, space flight, physics, politics (yes, space exploration has everything to do with politics), space commercialization and science in general. 2008 has been such a rich year for space exploration: we’ve landed probes on other worlds, studied other worlds orbiting distant stars, peered deep into the quantum world, learnt profound things about our own planet, developed cutting-edge instrumentation and redefined the human existence in the cosmos. We might not have all the answers (in fact, I think we are only just beginning to scratch the surface of our understanding of the Universe), but we have embarked on an enlightening journey on which we hope to build strong foundations for the next year of scientific discovery.
In an effort to assemble some of the most profound scientific endeavours of this year, Time Magazine has somehow narrowed the focus down to just 10 discoveries. Out of the ten, four are space and physics related, so here they are:
6. Exoplanets Directly Imaged
Considering there had never been any direct observations of exoplanets before November 2008 (although we have known about the presence of worlds orbiting other stars for many years via indirect methods), this has been a revolutionary year for exoplanet hunters.
4. The First Chinese Spacewalk
Despite early controversy surrounding recorded spaceship transmissions before the rocket had even launched, and the sustained efforts of conspiracy theorists to convince the world that the whole thing was staged, mission commander Zhai Zhigang did indeed become the first Chinese citizen to carry out a spacewalk. Zhai spent 16 minutes outside the capsule, attached by an umbilical cable, to triumphantly wave the Chinese flag and retrieve a test sample of solid lubricant attached to the outside of the module. His crew mate Liu Boming was also able to do some spacewalking.
Probably the most incredible thing about the first Chinese spacewalk wasn’t the spacewalk itself; it was the speed at which China achieved this goal. The first one-man mission into space was in 2003, the second in 2005, and the third was this year. Getting a man into space is no easy task; building an entire manned program from the ground up in such a short space of time is an outstanding achievement.
2. The North Pole – of Mars: The Phoenix Mars Lander
Phoenix studied the surface of the Red Planet for five months. It was intended to only last for three. In that time, this robotic explorer captured the hearts and minds of the world; everybody seemed to be talking about the daily trials and tribulations of this highly successful mission. Perhaps it was because of the constant news updates via the University of Arizona website, or the rapid micro-blogging via Twitter; whatever the reason, Phoenix was a short-lived space celebrity.
To give the highly communicative lander the last word, MarsPhoenix on Twitter has recently announced: “Look who made Time Mag’s Top 10 list for Scientific Discoveries in 2008: http://tinyurl.com/5mwt2l”
1. The Large Hadron Collider
In the run-up to the switch-on of the LHC in September, the world’s media focused its attention on the grandest physics experiment ever constructed. The LHC will ultimately probe deep into the world of subatomic particles to help explain some of the fundamental questions of our Universe. Primarily, the LHC has been designed to hunt for the elusive Higgs boson, but the quest will influence many facets of science. From designing an ultra-fast method of data transmission to unfolding the theoretical microscopic dimensions curled up in space-time, the LHC is a diverse scientific undertaking, with applications we won’t fully appreciate for many years.
Unfortunately, as you may have noticed, the LHC hasn’t actually discovered anything yet, but the high-energy collisions of protons and other, larger subatomic particles will revolutionize physics. I’d argue that the simple fact that the multi-billion-euro machine has been built at all is a discovery in itself: a demonstration of how advanced our technological ability is becoming.
The replacement parts for the damaged components of the Large Hadron Collider (LHC) are arriving, and cautious estimates push the recommissioning date back to July 2009. We now know the repair job will cost several million dollars (£14 million according to a recent report), and scientists have identified the cause of the September 19th quench that kick-started an explosive helium leak, buckling and ripping the heavy supercooled magnets from their mounts. But how can this be avoided in the future? After all, the LHC is the most complex experiment ever constructed; there are a huge number of variables that could spell disaster when it is switched back on. The “S34 Incident” was triggered by a small electrical fault, so what can prevent this from happening again?
According to the official report, the LHC requires an additional “early warning system” that will be tailored to detect small electrical shorts, hopefully shutting the system down before any further damage to the LHC blocks the search for the Higgs boson again…
It looks like official reports are being published thick and fast. Yesterday, I reported on two CERN reports that contained further details behind the problems faced by the engineers and physicists working on the repair of the LHC. One report suggested that it was an option to push back the date of LHC commissioning until 2010, whereas the other identified July 2009 as a good date to begin circulating protons once more. Now, a BBC news item has exposed some more facts behind the future of the LHC, indicating an early warning system is being considered to prevent an accident like the S34 Incident from happening again.
The incident, known as a “quench”, was caused by an electrical short between two of the 1200 electromagnets that make up the ring of the particle accelerator. This seemingly small fault was anything but; it initiated the rapid release of a tonne of helium, buckling and breaking the magnets between Sectors 3-4. Describing what happened, LHC project leader Professor Lyn Evans said, “Basically, they have been pulled off their feet and the interconnects have been broken.”
The electrical fault occurred right at the end of the commissioning process, after the first protons had already circulated around the accelerator ring on September 10th. At the time, the LHC had seven of its eight sectors powered up to full energy. “We are extremely disappointed, especially as we had already commissioned seven of the eight sections of the LHC up to full energy,” Evans said. “This was the last sector to be commissioned and this was really the very last electrical circuit. I must say it felt like a real kick in the teeth.”
If the experiments had continued as planned, scientists would be analysing ground-breaking particle collision data by now, but it looks like CERN will be taking an even more cautious approach from here on in. “You can think of the LHC as a Formula 1 racing car. It’s a complex tool, a complex machine,” commented Dr Francisco Bertinelli, one of the engineers repairing the magnets. “We will not run it from zero to top speed over one afternoon. We will build up our confidence and lower our risks.”
Generally, although frustrated, scientists are very excited about the future of the LHC. Prof. Tejinder Virdee of Imperial College London reminds us why this is only a minor glitch in the grand scheme of things: “This science has the potential to alter the way we see nature and the way nature operates at a fundamental level so this potential still remains, albeit a few months delayed. The great science is still out there ahead of us, which is greatly motivating.”
The unravelling of the fabric of the Universe has just been delayed and the physics revolution can wait a few more months…
On September 19th, CERN announced that the Large Hadron Collider had suffered a major incident, known as a “quench”. An electrical short between two of the superconducting magnets had kick-started a helium coolant leak inside the tunnels housing the accelerator ring. The quench caused the magnets to rapidly heat up, severely damaging them. The violent release of coolant ripped equipment from its concrete anchors, ensuring a huge repair operation would need to be carried out. However, it was a while before engineers were able to access the damage, and the news wasn’t good: the LHC would be out of commission until the spring of 2009 at the earliest. That was such a sad day.
Late last month, CERN Director-General Robert Aymar gave a presentation to the 84th Plenary Meeting of the European Committee for Future Accelerators, showing the first public images of the quench aftermath, an accident that has become known as the “S34 Incident”.
In addition to these images, there are suggestions that there may be no particle collisions next year. Although the most recent report doesn’t appear to back up these plans, and replacement parts have started to arrive at the facility (above), it looks like the first collisions probably won’t happen until July 2009 at the earliest (that’s four months later than previously estimated)…
It looks like the September 19th quench between Sectors 3-4 of the LHC ring is now being referred to as the “S34 Incident”. And what an incident it was. Fortunately nobody was injured during the quench, but the LHC wasn’t so lucky. For a rundown of the official account of the S34 Incident, I’ll hand over to Robert Aymar’s November 28th presentation (page 15):
Within the first second, an electrical arc developed and punctured the helium enclosure, leading to release of helium into the insulation vacuum of the cryostat. The spring-loaded relief discs on the vacuum enclosure opened when the pressure exceeded atmospheric, thus relieving the helium to the tunnel. They were however unable to contain the pressure rise below the nominal 0.15 MPa absolute in the vacuum enclosures of subsector 23-25, thus resulting in large pressure forces acting on the vacuum barriers separating neighboring subsectors, which most probably damaged them. These forces displaced dipoles in the subsectors affected from their cold internal supports, and knocked the Short Straight Section cryostats housing the quadrupoles and vacuum barriers from their external support jacks at positions Q23, Q27 and Q31, in some locations breaking their anchors in the concrete floor of the tunnel. The displacement of the Short Straight Section cryostats also damaged the “jumper” connections to the cryogenic distribution line, but without rupture of the transverse vacuum barriers equipping these jumper connections, so that the insulation vacuum in the cryogenic line did not degrade.
The first image (pictured above) clearly shows the extent of the concrete damage caused by the huge pressure forces generated by the leaking helium, which ripped the electromagnets off their supports (the red boxes in the photo) and shattered the floor.
In this second image, the extent of the damage is pretty clear. Assuming the accelerator beam-line used to be straight (unfortunately, there is no “before” picture), the violent displacement of a huge magnet (weighing several tonnes) is obvious.
Later in the presentation, Aymar points out that 5 quadrupole and 24 dipole magnets need to be repaired, and around 57 magnets have to be removed to be cleaned. This will be a huge task, one that will last many months. According to one eagle-eyed blogger at High Energy PhDs, a previous report, presented a few days before the Aymar report, signalled that there may be no high-energy particle collisions until 2010. Jörg Wenninger outlined two possibilities for the LHC: 1) partial operations in 2009, allowing only low-energy particle acceleration while awaiting full-scale repairs through the 2009-10 winter shutdown, or 2) forget 2009 operations and work toward full-scale experiments in 2010. Aymar’s more recent report did not mention these scenarios, simply stating, “the LHC will restart operation in the next spring.”
Judging by the mixed signals, we’ll have to wait patiently until it is clear when the LHC is expected to recover. Either way, it will be a long, painstaking and expensive task that needs to be completed as soon as possible. I really hope we don’t have to wait until 2010 for the restart.
British scientists invent “mini-magnetosphere” to protect astronauts during solar storms.
Space travel during a solar storm just became a little less risky. UK scientists working at Rutherford Appleton Laboratory near Oxford and the universities of York and Strathclyde have tested a “mini-magnetosphere” enveloping a model spacecraft in the lab. It turns out that their prototype offers almost total protection against high energy solar particles. By mimicking the natural protective environment of the Earth, the researchers have scaled the protective magnetic bubble down into an energy efficient, yet powerful deflector shield.
This astounding achievement is a big step toward protecting sensitive electronics and the delicate human body against the radiation hazards of manned missions between the planets. It may sound like science fiction, but future astronauts may well shout the order to “RAISE SHIELDS!” if the Sun flares up during a 36 million mile journey to Mars…
When I wrote “Scientists Designing ‘Ion Shield’ To Protect Astronauts From Solar Wind” back in January, I was a little dubious as to whether the preliminary results could be replicated on a full-scale spaceship. At the time, Dr Ruth Bamford (the lead researcher from Rutherford Appleton) had created a mini version of a magnetic shield that acted as a “bubble” in a stream of ions. Because the ions are charged, a magnetic field can deflect their paths around the void it encapsulates, acting as a barrier. All that had to be done was to scale the idea up a notch or two and then place a spaceship in the middle of the protected void. Solved!
Not so fast. The biggest drawback I could see back in January was the large amount of energy that would be required to power the system. After all, to generate a stable, spaceship-sized mini-magnetosphere would need a vast quantity of electricity (and be very bulky), or it would need to be highly efficient (and compact). As this is space travel we’re talking about, the scientists would need to look into the latter. The mini-magnetosphere would need to be a highly efficient device.
Eleven months later and it looks like the British team have found their answer. In results just published in the journal Plasma Physics and Controlled Fusion, they have devised a system no bigger than a large desk that uses the same energy as an electric kettle. Two mini-magnetospheres will be contained within two mini satellites located outside the spaceship. Should there be an increase in solar wind flux, or an approaching cloud of energetic particles from a flare and/or coronal mass ejection (CME), the magnetospheres can be switched on and the solar ions are deflected away from the spacecraft.
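The underlying physics here is the gyroradius (Larmor radius) of a charged particle in a magnetic field, r = mv/(qB): make the radius comparable to the size of the protected region and incoming ions curve away. Below is a quick back-of-envelope check in Python, using assumed, illustrative values for the solar wind speed and the shield’s field strength (not numbers from the paper):

```python
# Gyroradius of a solar-wind proton in a magnetic field: r = m*v / (q*B)
# All values below are illustrative assumptions, not figures from the study.
PROTON_MASS = 1.67e-27    # kg
PROTON_CHARGE = 1.60e-19  # C

v = 500e3   # m/s, a typical solar wind bulk speed
B = 1e-4    # tesla (about 1 gauss), assumed field near the shield

r = PROTON_MASS * v / (PROTON_CHARGE * B)
print(f"gyroradius ~ {r:.0f} m")
# ~50 m: a field of this strength bends proton paths on spacecraft scales,
# so a compact magnetic bubble can steer ions around the protected volume
```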
“These initial experiments have shown promise and that it may be possible to shield astronauts from deadly space weather,” Dr Bamford said. After all, the effects of radiation poisoning can be devastating.
Prof. Bob Bingham, a theoretical physicist at the University of Strathclyde, gives a graphic account as to why this technology is important:
“Solar storms or winds are one of the greatest dangers of deep space travel. If you got hit by one not only would it take out the electronics of a ship but the astronauts would soon take on the appearance of an overcooked pizza. It would be a bit like being near the Hiroshima blast. Your skin would blister, hair and teeth fall out and before long your internal organs would fail. It is not a very nice way to go. This system creates a Magnetic Field Bubble that would deflect the dangerous radiation away from the spacecraft.” – Prof. Bob Bingham
Bingham added that the team was currently patenting the technology and hopes to have a working full size prototype within five years. So we might have to wait some time until we see some pictures of the system in action…
If you thought any quantum discoveries would have to wait until the Large Hadron Collider (LHC) is switched back on in 2009, you’d be wrong. Just because the LHC represents the next stage in particle accelerator evolution does not mean the world’s established and long-running accelerator facilities have already closed shop and left town. It would appear that the Tevatron particle accelerator at Fermilab in Batavia, Illinois, has discovered…
…something.
Scientists at the Tevatron are reluctant to hail new results from the Collider Detector at Fermilab (CDF) as a “new discovery”, since they simply do not know what their results suggest. During collisions between protons and anti-protons, the CDF was monitoring the decay of bottom quarks and bottom anti-quarks into muons. However, CDF scientists uncovered something strange. Too many muons were being generated by the collisions, and muons were popping into existence outside the beam pipe…
The Tevatron was opened in 1983 and is currently the most powerful particle accelerator in the world. It is the only collider that can accelerate protons and anti-protons to 1 TeV energies, but it will be surpassed by the LHC when it finally goes into operation sometime early next year. Once the LHC goes online, the sub-atomic flame will be passed to the European accelerator and the Tevatron will be prepared for decommissioning some time in 2010. But before this powerful facility closes down, it will continue probing matter for a little while yet.
In recent proton collision experiments, scientists using the CDF started seeing something they couldn’t explain with our current understanding of modern physics.
The particle collisions occur inside the 1.5 cm-wide “beam pipe” that collimates the relativistic particle beams and focuses them to a point for the collision to occur. After the collision, the resulting spray of particles is detected by the surrounding layers of electronics. However, the CDF team detected too many muons being generated after the collisions. Plus, muons were being generated inexplicably outside the beam pipe, with no tracks detected in the innermost layers of the CDF detectors.
CDF spokesperson Jacobo Konigsberg is keen to emphasise that more investigations need to be done before an explanation can be arrived at. “We haven’t ruled out a mundane explanation for this, and I want to make that very clear,” he said.
However, theorists aren’t so reserved and are very excited about what this could mean for the Standard Model of sub-atomic particles. If the detection of these excess muons proves to be correct, the “unknown” particle has a lifetime of 20 picoseconds, long enough to travel about 1 cm, pass through the side of the beam pipe, and then decay into muons.
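Those two numbers hang together: the lab-frame decay length of a relativistic particle is d = γβcτ. Here’s a quick sanity check in Python (the boost factor γβ is an assumed, illustrative value, not something from the CDF analysis):

```python
# Decay length of a relativistic particle: d = gamma * beta * c * tau
C_LIGHT = 3.0e8   # m/s, speed of light
TAU = 20e-12      # s, the quoted 20-picosecond lifetime

# Even without any boost, flight at ~c already covers:
print(f"c * tau = {C_LIGHT * TAU * 100:.1f} cm")        # 0.6 cm

# With a modest Lorentz boost (gamma*beta ~ 1.7, assumed here for a light
# particle produced in a TeV-scale collision), the decay length reaches the
# ~1 cm needed to escape the beam pipe before decaying into muons:
gamma_beta = 1.7
print(f"decay length ~ {gamma_beta * C_LIGHT * TAU * 100:.1f} cm")  # ~1 cm
```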
Dan Hooper, another Fermilab scientist, points out that if this really is a previously unknown particle, it would be a huge discovery. “A centimetre is a long way for most kinds of particles to make it before decaying,” he says. “It’s too early to say much about this. That being said, if it turns out that a new ‘long-lived’ particle exists, it would be a very big deal.”
Neal Weiner of New York University agrees with Hooper. “If this is right, it is just incredibly exciting,” he says. “It would be an indication of physics perhaps even more interesting than we have been guessing beforehand.”
Particle accelerators have a long history of producing unexpected results; perhaps this could be an indicator of a particle that has previously been overlooked or, more interestingly, not predicted. Naturally, scientists are quick to postulate that dark matter might be behind all this.
Weiner, with colleague Nima Arkani-Hamed, has formulated a model that predicts the existence of dark matter particles in the Universe. In their theory, dark matter particles interact among themselves via force-carrying particles with a mass of approximately 1 GeV. The CDF muons generated outside the beam pipe have been calculated to be produced by an “unknown” decaying parent particle with a mass of approximately 1 GeV.
The comparison is striking, but Weiner is quick to point out that more work is needed before the CDF results can be linked with dark matter. “We are trying to figure that out,” he said. “But I would be excited by the CDF data regardless.”
Perhaps we don’t have to wait for the LHC; some new physics may be uncovered before the brand new CERN accelerator is even repaired…