It may seem that the delay is getting longer and longer for the restart of the LHC after the catastrophic quench in September 2008, but progress is being made. Repair costs are expected to hit the $16 million mark as engineers rebuild the damaged electromagnets and track down any further electrical faults that could jeopardize the future operation of the complex particle accelerator.
According to the European Organization for Nuclear Research (CERN), the Large Hadron Collider will resume operations in September. But the best news is: we could be seeing the first particle collisions only a month later…
If, like me, you were restlessly awaiting the grand LHC “switch-on” on September 10th, 2008, only to be disappointed by the transformer breakdown the following day, then buoyed up by the fact LHC science was still on track, only for your hopes to be completely quenched by the quench that explosively ripped the high-tech magnets from their mounts on September 19th, you’ll probably be wary about getting your hopes up too high. However, allow yourself a little optimism: the LHC repairs are going well, potential faults are being identified and fixed, and replacement parts are falling into place. But there is more good news.
Via Twitter, one of my contacts (@dpodolsky) hinted that he’d heard, via word of mouth, that LHC scientists’ optimism was growing for an October 2009 start to particle collisions. However, as of February 2nd, there was no official word from CERN. Today, the CERN Director General issued a statement.
“The schedule we have now is without a doubt the best for the LHC and for the physicists waiting for data,” Rolf Heuer said. “It is cautious, ensuring that all the necessary work is done on the LHC before we start-up, yet it allows physics research to begin this year.”
So, the $5 billion LHC is expected to restart in September, and the first experiments will hopefully commence by the end of October 2009. That may be a year later than originally planned, but at least a better idea is forming of when the hunt for the Higgs particle will recommence…
In September 2008, the Large Hadron Collider (LHC) suffered a catastrophic quench, triggered by a faulty electrical connection between two of the supercooled magnets in Sectors 3 and 4 of the 27 km-circumference particle accelerator. The “S34 incident” caused tonnes of helium coolant to explosively leak into the LHC tunnel, ripping the heavy electromagnets from their concrete mounts.
Naturally, this was a huge blow for CERN, delaying the first particle collisions by several months. However, the repair work is progressing well, and hopes are high for commencement of LHC science as early as this summer. Now engineers are working hard to avoid a recurrence of the S34 Incident, tracking down similar electrical faults between the accelerator magnets. It seems like they have found many more faults than expected…
According to a recently published progress report, the LHC repairs are progressing as planned, but more electrical faults have been discovered in other sections of the accelerator. An electrical short has been blamed for the quench four months ago, little more than a week after the first circulation of protons around the LHC at the beginning of September 2008. It is now of paramount importance to isolate any further potential shorts in the complex experiment, and it would appear engineers are doing a good job of tracking them down.
The LHC uses ribbons of superconducting niobium-titanium wire to carry thousands of amps of current to the magnets. Connecting the ribbon from electromagnet to electromagnet are splices that are soldered in place. Should one of these splices be weakened by poor soldering, an electrical fault can occur, causing the magnets to lose superconductivity and initiating a quench that rapidly heats the sensitive equipment. Various sections are being re-examined and re-soldered. The good news is that this additional work is not compounding the delay any further.
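To get a rough feel for why a weak splice is so dangerous, consider the ohmic heating it produces. Here is a minimal back-of-the-envelope sketch; the resistance and current figures are illustrative assumptions on my part, not CERN’s measured values:

```python
# Rough illustration: ohmic heating in a resistive magnet splice.
# The current and resistance values are assumed for illustration only.

current = 9_000.0        # amps in the superconducting bus (assumed)
good_splice = 0.35e-9    # a sound joint: well under a nano-ohm (assumed)
bad_splice = 200e-9      # a poorly soldered joint: hundreds of nano-ohms (assumed)

for label, resistance in [("good splice", good_splice), ("bad splice", bad_splice)]:
    power = current**2 * resistance   # P = I^2 * R, deposited as heat
    print(f"{label}: {power:.3f} W dissipated into the helium bath")
```

Even the ~16 W produced by the bad joint in this example is enormous by cryogenic standards: at superfluid-helium temperatures the local heat budget is measured in fractions of a watt, so the joint warms, superconductivity is lost, and a quench begins.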
It has been confirmed that there was a lack of solder on the splice joint. Each sector has more than 2500 splices and a single defective splice can now be identified in situ when the sector is cold. Using this method another magnet showing a similar defect has been identified in sector 6-7. This sector will be warmed and the magnet removed. The warm up of this additional sector can be performed in the shadow of the repair to sector 3-4 and will therefore not add any additional delay to the restart schedule. — CERN
Hopefully we’ll see a second circulation of protons this summer, and according to informal rumours from a contact involved in the LHC science, the first particle collisions could start as early as October 2009. I will listen out for any further official confirmation of this information…
So how do you take the temperature of one of the most exotic objects in the Universe? A neutron star (~1.35 to 2.1 solar masses, measuring only 24 km across) is the remnant left behind after a large star has died in a supernova. Although they are not massive enough to become black holes, neutron stars still accrete matter, pulling gas from a binary partner and often undergoing prolonged periods of flaring.
Fortunately, we can observe X-ray flares (using instrumentation such as Chandra), but it isn’t the flare itself that can reveal the temperature or structure of a neutron star.
At the AAS conference last week, results from an X-ray observing campaign of MXB 1659-29, a quasi-persistent X-ray transient source (i.e. a neutron star that flares for long periods), revealed some fascinating insights into the physics of neutron stars. As the crust of a neutron star cools, its composition is revealed and the temperature of these exotic supernova remnants can be measured…
During a flare outburst, neutron stars generate X-rays. These X-ray sources can be measured and their evolution tracked. In the case of MXB 1659-29, Ed Cackett (Univ. of Michigan) used data from NASA’s Rossi X-ray Timing Explorer (RXTE) to monitor the cooling of the neutron star crust after an extended period of X-ray flaring. MXB 1659-29 flared for 2.5 years until it “turned off” in September 2001. Since then, the source has been periodically observed to measure the exponential decrease in its X-ray emission.
So why is this important? After a long period of X-ray flaring, the crust of a neutron star will heat up. However, it is thought that the core of the neutron star will remain comparatively cool. When the neutron star stops flaring (as the accretion of gas, feeding the flare, shuts off), the heating source for the crust is lost. During this period of “quiescence” (no flaring), the diminishing X-ray flux from the cooling neutron star crust reveals a huge wealth of information about the characteristics of the neutron star.
During quiescence, astronomers observe X-rays emitted from the surface of the neutron star itself (as opposed to the flares), so direct measurements of the star can be made. In his presentation, Cackett examined how the X-ray flux from MXB 1659-29 decayed exponentially and then levelled off at a constant flux. This means the crust cooled rapidly after the flaring, eventually reaching thermal equilibrium with the neutron star core. Therefore, by using this method, the neutron star core temperature can be inferred.
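As a sketch of how that inference works in practice: fit the quiescent light curve with an exponential decay that levels off at a constant, and read the core-temperature information from the constant term. A minimal illustration, with invented data (the model form follows the description above; the numbers are made up):

```python
import numpy as np
from scipy.optimize import curve_fit

def cooling_curve(t, a, tau, c):
    """Crust cooling: flux(t) = a * exp(-t / tau) + c,
    where c is the equilibrium level set by the core."""
    return a * np.exp(-t / tau) + c

# Invented observation times (days since flaring stopped) and fluxes.
t_obs = np.array([30, 60, 120, 240, 480, 700, 1000, 1400], dtype=float)
flux = np.array([8.1, 7.3, 6.0, 4.1, 2.2, 1.5, 1.15, 1.03])

(a, tau, c), _ = curve_fit(cooling_curve, t_obs, flux, p0=(8.0, 200.0, 1.0))
print(f"e-folding time: {tau:.0f} days; equilibrium flux: {c:.2f}")
```

The fitted constant is the flux once crust and core have reached thermal equilibrium; converting that equilibrium flux into a temperature is what lets the core temperature be inferred.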
Including data from another neutron star X-ray transient, KS 1731-260, the cooling rates observed during the onset of quiescence suggest these objects have well-ordered crustal lattices with very few impurities. The rapid temperature decrease (from flare to quiescence) took approximately 1.5 years to reach thermal equilibrium with the neutron star core. Further work will now be carried out using Chandra data so more information about these rapidly spinning exotic objects can be uncovered.
Suddenly, neutron stars became a little less mysterious to me during that 10-minute talk last Tuesday. I love conferences…
High altitude balloons are an inexpensive means of getting payloads to the brink of space, where all sorts of great science and astronomy can be done. A new balloon prototype that uses material as thin as plastic food wrap was successfully checked out in an 11-day test flight, and this new design may usher in a new era of high altitude flight. NASA and the National Science Foundation sponsored the test, which was launched from McMurdo Station in Antarctica. The balloon reached a float altitude of more than 111,000 feet and maintained it for the entire 11 days of flight. It is hoped that the super-pressure balloon will ultimately carry large scientific experiments to the edge of space for 100 days or more.
The flight tested the durability and functionality of the scientific balloon’s novel globe-shaped design and the unique lightweight and thin polyethylene film. It launched on December 28, 2008 and returned on January 8, 2009.
“Our balloon development team is very proud of the tremendous success of the test flight and is focused on continued development of this new capability to fly balloons for months at a time in support of scientific investigations,” said David Pierce, chief of the Balloon Program Office at NASA’s Wallops Flight Facility at Wallops Island, Va. “The test flight has demonstrated that 100 day flights of large, heavy payloads is a realistic goal.”
This seven-million-cubic-foot super-pressure balloon is the largest single-cell, super-pressure, fully-sealed balloon ever flown. When development concludes, NASA will have a 22 million-cubic-foot balloon that can carry a one-ton instrument to an altitude of more than 110,000 feet, which is three to four times higher than passenger planes fly. Ultra-long duration missions using the super pressure balloon cost considerably less than a satellite and the scientific instruments flown can be retrieved and launched again, making them ideal very-high altitude research platforms.
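As a rough sanity check on those numbers, here is a minimal buoyancy estimate; the air density at float altitude is my assumption, not a NASA figure:

```python
# Back-of-the-envelope gross lift for the planned 22-million-cubic-foot
# super-pressure balloon near 110,000 ft. Densities are rough assumptions.

CUBIC_FT_TO_M3 = 0.0283168

volume_m3 = 22e6 * CUBIC_FT_TO_M3     # ~623,000 cubic metres
rho_air = 0.012                       # kg/m^3, thin air near 34 km (assumed)
rho_helium = rho_air * 4.0 / 29.0     # helium at the same pressure and temperature

gross_lift_kg = (rho_air - rho_helium) * volume_m3
print(f"Gross lift: ~{gross_lift_kg:,.0f} kg")
```

Roughly 6,500 kg of gross lift leaves room for the mass of the balloon itself plus a one-ton science payload, which is consistent with NASA’s stated goal.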
In addition to the super pressure test flight, two additional long-duration balloons were launched from McMurdo during the 2008-2009 campaign. The University of Maryland’s Cosmic Ray Energetics and Mass, or CREAM IV, experiment launched December 19, 2008, and landed January 6, 2009. The CREAM investigation was used to directly measure high energy cosmic-ray particles arriving at Earth after originating from distant supernova explosions elsewhere in the Milky Way galaxy. The payload for this experiment was refurbished from an earlier flight. The team released data and their findings from their first flight in August 2008.
The University of Hawaii Manoa’s Antarctic Impulsive Transient Antenna launched December 21, 2008, and is still aloft. Its radio telescope is searching for indirect evidence of extremely high-energy neutrino particles possibly coming from outside our Milky Way galaxy.
Magnetars are the violent, exotic cousins of the well-known neutron star. They emit excessive amounts of gamma-rays and X-rays, and possess an extraordinarily powerful magnetic field. Neutron stars also have very strong magnetic fields (although weak when compared with magnetars), conserving the magnetic field of the parent star before it exploded as a supernova. However, the huge magnetic field strength inferred from observations of magnetars is a mystery. Where do magnetars get their strong magnetic fields? According to new research, the answer could lie in the even more mysterious quark star…
It is well known that neutron stars have very strong magnetic fields. Neutron stars, born from supernovae, preserve the angular momentum and magnetism of the parent star. Therefore, neutron stars are extremely magnetic, often rapidly spinning bodies, ejecting powerful streams of radiation from their poles (seen from Earth as a pulsar should the collimated radiation sweep through our field of view). Sometimes, neutron stars don’t behave as they should, ejecting copious amounts of X-rays and gamma-rays, exhibiting a very powerful magnetic field. These strange, violent entities are known as magnetars. As they are a fairly recent discovery, scientists are working hard to understand what magnetars are and how they acquired their strong magnetic field.
Denis Leahy, from the University of Calgary, Canada, presented a study on magnetars at a January 6th session of this week’s AAS meeting in Long Beach, revealing that the hypothetical “quark star” could explain what we are seeing. Quark stars are thought to be the next stage up from neutron stars: as gravitational forces overwhelm the structure of the neutron degenerate matter, quark matter (or strange matter) is the result. However, the formation of a quark star may have an important side effect. Colour ferromagnetism in colour-flavour-locked quark matter (the densest form of quark matter) could be a viable mechanism for generating the immensely powerful magnetic flux observed in magnetars. Therefore, magnetars may be the consequence of very compressed quark matter.
These results were arrived at by computer simulation, so how can we observe the effect of a quark star (or the “quark star phase” of a magnetar) in a supernova remnant? According to Leahy, the transition from neutron star to quark star could occur anywhere from days to thousands of years after the supernova event, depending on the conditions of the neutron star. And what would we see when this transition occurs? There should be a secondary flash of radiation from the neutron star after the supernova, due to the energy liberated as the neutron structure collapses, possibly providing astronomers with an opportunity to “see” a magnetar being “switched on”. Leahy also calculates that 1 in 10 supernovae should produce a magnetar remnant, so we have a pretty good chance of spotting the mechanism in action.
If you’re a PlayStation 3 fan, or if you just received one as a holiday gift, you may be able to do more with the system than just gaming. A group of gravity researchers have networked 16 PlayStation 3s together to create a type of supercomputer that is helping them estimate properties of the gravitational waves produced by the merger of two black holes. The research team, from the University of Alabama in Huntsville and the University of Massachusetts, Dartmouth, calls the configuration the Gravity Grid, and they say the Sony PlayStation 3 has a number of unique features that make it particularly suited to scientific computation. Equally important, the raw computing power per dollar provided by the PS3 is significantly higher than anything else on the market today.
PlayStation 3s have also been used by the Folding@Home project, which harnesses the PS3’s technology to help study how proteins fold in the human body and how they sometimes fold incorrectly. This aids research into several diseases, such as Parkinson’s, Alzheimer’s, cystic fibrosis, and even mad cow disease.
The PS3 uses a powerful new processor called the Cell Broadband Engine to run its highly realistic games, and can connect to the Internet so gamers can download new programs and take each other on.
For some astrophysical problems, such as those involving many calculations but low memory usage, the PlayStation 3 cluster used by the gravity research team can equal the speed of a rented supercomputer.
“If we had rented computing time from a supercomputer center it would have cost us about $5,000 to run our [black hole] simulation one time. For this project we ran our simulation several dozens of times to test different parameters and circumstances,” study author Lior Burko told Inside Science News Service.
One of the unique features of the PS3 is that it is an open platform, on which different system software can be run. Its special processor has a main CPU (called the PPU) and six additional compute engines (called SPUs) available for raw computation. Moreover, each SPU performs vector operations, meaning it can compute on multiple data elements in a single step.
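That “multiple data in a single step” style of computation is what is now commonly called SIMD (single instruction, multiple data). As a loose analogy, here is the same multiply-add written element-by-element and in vectorized form; NumPy is my choice for illustration, not what the Gravity Grid actually runs:

```python
import numpy as np

# Scalar style: one multiply-add per loop iteration.
def scalar_axpy(a, x, y):
    out = [0.0] * len(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]   # one element at a time
    return out

# Vector (SIMD) style: the same multiply-add applied to whole arrays
# at once, analogous to an SPU operating on several floats per register.
def vector_axpy(a, x, y):
    return a * x + y

x = np.arange(10_000, dtype=np.float32)
y = np.ones_like(x)
assert np.allclose(vector_axpy(2.0, x, y), scalar_axpy(2.0, x, y))
```

The same idea, applied to the inner loops of a black hole simulation, is why six SPUs working on vectors can chew through numerically intensive, low-memory problems so cheaply.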
But the low cost is especially attractive to university researchers. The Gravity Grid team received a partial donation from Sony and uses “stock” PS3s with no hardware modifications, networked together using inexpensive equipment.
Gravitational waves are “ripples” in space-time that travel at the speed of light. They were predicted theoretically by Einstein’s general relativity, but have never been directly observed. Research in this area is also being done by the newly constructed NSF LIGO laboratory and various other such observatories in Europe and Asia. ESA and NASA have a joint mission planned for the near future, LISA, that will attempt to detect these waves. To learn more about gravitational waves and the recent attempts to observe them, please visit the LISA mission website.
2008 has been a landmark year for space science and physics endeavour. We’ve peered deep into the cosmos and fitted new pieces into some of the most intriguing universal puzzles. We’ve explored other planets with technology we wouldn’t have recognised a decade ago. We’ve assembled some of the most complex experiments to test theories of the very small and the very big. 2008 has built strong foundations for the future of the exploration of the Universe in so many ways…
This week, Time Magazine published the top 10 “Scientific Discoveries” of 2008. Technically, as many readers pointed out, a few of the entries are not “discoveries” but “achievements”. Be that as it may, space exploration and physics dominated, with the #1 slot going to the LHC and #2 going to the Phoenix Mars Lander (#4 and #6 went to the Chinese spacewalk and exoplanets respectively). The superb suggestion put forward by Astrofiend (thanks!) was the push I needed to create a Universe Today version of a “Top 10” for 2008 (I’d love to do a top 20, but I have to find some time for Christmas shopping).
This top ten will focus on the last 12 months of Universe Today articles, so take a journey through the year’s events in space science and physics to find your favourite scientific endeavour of 2008. If you can’t find the article, just leave the name of the specific endeavour and we’ll do the rest. Please leave all nominations in the comments box below…
You have one week to get your nominations in (so your deadline is December 19th), and I’ll compile the list of winners hopefully in time for Christmas. The nominations will be considered not only according to popularity, but also chosen by your unbiased Universe Today team…
2008 has been an astounding year of scientific discovery. To celebrate, Time Magazine has listed the “Top 10 Scientific Discoveries”, in which space exploration and physics dominate. Other disciplines are also represented, including zoology, microbiology, technology and biochemistry, but the number 1 slot goes to the most ambitious physics experiment of our time. Can you guess what it is? And of all our endeavours in space, can you pick out the three that Time Magazine has singled out as the most important?
As we approach the end of the year, ready to welcome in 2009, it is good to take stock and celebrate the mind-blowing achievements mankind has accomplished. Read on for the top 10 scientific discoveries of 2008…
The best thing about writing for a leading space news blog is that you gain a wonderful overview of all our endeavours in astronomy, space flight, physics, politics (yes, space exploration has everything to do with politics), space commercialization and science in general. 2008 has been such a rich year for space exploration; we’ve landed probes on other worlds, studied planets orbiting distant stars, peered deep into the quantum world, learnt profound things about our own planet, developed cutting-edge instrumentation and redefined humanity’s place in the cosmos. We might not have all the answers (in fact, I think we are only just beginning to scratch the surface of our understanding of the Universe), but we have embarked on an enlightening journey on which we hope to build strong foundations for the next year of scientific discovery.
In an effort to assemble some of the most profound scientific endeavours of this year, Time Magazine has somehow narrowed the focus down to just 10 discoveries. Out of the ten, four are space and physics related, so here they are:
Considering there had never been any direct observations of exoplanets before November 2008 (although we have known about the presence of worlds orbiting other stars for many years via indirect methods), this has been a revolutionary year for exoplanet hunters.
Despite early controversy surrounding recorded spaceship transmissions before the rocket had even launched, and then the sustained efforts by conspiracy theorists to convince the world that the whole thing was staged, mission commander Zhai Zhigang did indeed become the first ever Chinese citizen to carry out a spacewalk. Zhai spent 16 minutes outside of the capsule, attached by an umbilical cable, to triumphantly wave the Chinese flag and retrieve a test sample of solid lubricant attached to the outside of the module. His crew mate Liu Boming was also able to do some spacewalking.
Probably the most incredible thing about the first Chinese spacewalk wasn’t the spacewalk itself, but the speed at which China achieved the goal. The country’s first one-man mission into space was in 2003, the second in 2005, and the third was this year. Getting a human into space is no easy task; to build an entire manned program from the ground up in such a short space of time is an outstanding achievement.
2. The North Pole – of Mars: The Phoenix Mars Lander
Phoenix studied the surface of the Red Planet for five months. It was intended to only last for three. In that time, this robotic explorer captured the hearts and minds of the world; everybody seemed to be talking about the daily trials and tribulations of this highly successful mission. Perhaps it was because of the constant news updates via the University of Arizona website, or the rapid micro-blogging via Twitter; whatever the reason, Phoenix was a short-lived space celebrity.
To give the highly communicative lander the last word, MarsPhoenix on Twitter has recently announced: “Look who made Time Mag’s Top 10 list for Scientific Discoveries in 2008: http://tinyurl.com/5mwt2l”
In the run-up to the switch-on of the LHC in September, the world’s media focused its attention on the grandest physics experiment ever constructed. The LHC will ultimately probe deep into the world of subatomic particles to help explain some of the fundamental questions of our Universe. Primarily, the LHC has been designed to hunt for the elusive Higgs boson, but the quest will influence many facets of science. From designing ultra-fast methods of data transmission to unfolding the theoretical microscopic dimensions curled up in space-time, the LHC encompasses a diverse range of science, with applications we won’t fully appreciate for many years.
Unfortunately, as you may have guessed, the LHC hasn’t actually discovered anything yet, but the high-energy collisions of protons and other, heavier particles will revolutionize physics. I’d argue that the simple fact that the multi-billion euro machine has been built at all is a discovery in itself: a measure of how advanced our technological ability is becoming.
The replacement parts for the damaged components of the Large Hadron Collider (LHC) are arriving, and cautious estimates push the recommissioning date back to July 2009. We now know the repair job will cost several million dollars (£14 million according to a recent report), and scientists have identified the cause of the September 19th quench that kick-started an explosive helium leak, buckling and ripping the heavy supercooled magnets from their mounts. But how can this be avoided in the future? After all, the LHC is the most complex experiment ever constructed, and there are a huge number of variables that could spell disaster when it is switched back on. The “S34 Incident” was triggered by a small electrical fault; what can prevent something similar from happening again?
According to the official report, the LHC requires an additional “early warning system” that will be tailored to detect small electrical shorts, hopefully shutting the system down before any further damage to the LHC blocks the search for the Higgs boson again…
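No technical details of such a system have been published, but in spirit it amounts to continuously watching the resistive voltage across each splice and ramping down long before the heating can run away. A purely hypothetical sketch of that logic (the names, threshold and numbers below are invented for illustration, not CERN’s actual design):

```python
# Hypothetical splice early-warning check; the threshold and values
# are invented for illustration and are not CERN's actual parameters.

TRIP_RESISTANCE_OHMS = 50e-9   # flag anything above ~50 nano-ohms (assumed)

def splice_is_healthy(voltage_v: float, current_a: float) -> bool:
    """Infer the splice resistance via Ohm's law and compare to a threshold."""
    if current_a <= 0:
        return True                      # no current, nothing to infer
    resistance = voltage_v / current_a   # R = V / I across the joint
    return resistance < TRIP_RESISTANCE_OHMS

# Example: 1.8 millivolts across a splice carrying 9 kA is 200 nano-ohms.
if not splice_is_healthy(voltage_v=1.8e-3, current_a=9_000.0):
    print("Anomalous splice resistance detected: ramp down and investigate")
```

The engineering challenge, of course, is measuring such tiny signals reliably across thousands of joints in a noisy cryogenic environment, which is why a system like this takes time to design and install.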
It looks like official reports are being published thick and fast. Yesterday, I reported on two CERN reports that contained further details behind the problems faced by the engineers and physicists working on the repair of the LHC. One report suggested that it was an option to push back the date of LHC commissioning until 2010, whereas the other identified July 2009 as a good date to begin circulating protons once more. Now, a BBC news item has exposed some more facts behind the future of the LHC, indicating an early warning system is being considered to prevent an accident like the S34 Incident from happening again.
The incident, known as a “quench”, was caused by an electrical short between two of the 1200 electromagnets that make up the ring of the particle accelerator. This seemingly small fault was anything but; it initiated the rapid release of a tonne of helium, buckling and breaking the magnets between Sectors 3-4. Describing what happened, LHC project leader Professor Lyn Evans said, “Basically, they have been pulled off their feet and the interconnects have been broken.”
The electrical fault occurred right at the end of the commissioning process, after the first protons had already circulated around the accelerator ring on September 10th. At the time, the LHC had seven of its eight sectors powered up to full energy. “We are extremely disappointed, especially as we had already commissioned seven of the eight sections of the LHC up to full energy,” Evans said. “This was the last sector to be commissioned and this was really the very last electrical circuit. I must say it felt like a real kick in the teeth.”
If the experiments had continued as planned, scientists would be analysing ground-breaking particle collision data by now, but it looks like CERN will be taking an even more cautious approach from here on in. “You can think of the LHC as a Formula 1 racing car. It’s a complex tool, a complex machine,” commented Dr Francisco Bertinelli, one of the engineers repairing the magnets. “We will not run it from zero to top speed over one afternoon. We will build up our confidence and lower our risks.”
Generally, although frustrated, scientists are very excited about the future of the LHC. Prof. Tejinder Virdee of Imperial College London reminds us why this is only a minor glitch in the grand scheme of things: “This science has the potential to alter the way we see nature and the way nature operates at a fundamental level so this potential still remains, albeit a few months delayed. The great science is still out there ahead of us, which is greatly motivating.”
The unravelling of the fabric of the Universe has just been delayed and the physics revolution can wait a few more months…
On September 19th, CERN announced that the Large Hadron Collider had suffered a major incident, known as a “quench”. An electrical short between two of the superconducting magnets had kick-started a helium coolant leak inside the tunnels housing the accelerator ring. The quench caused the magnets to rapidly heat up, severely damaging them. The violent release of coolant ripped equipment from its concrete anchors, ensuring a huge repair operation would need to be carried out. However, it was a while before engineers were able to access the damage, and the news wasn’t good: the LHC would be out of commission until the spring of 2009 at the earliest. That was such a sad day.
Late last month, CERN Director-General Robert Aymar gave a presentation to the 84th Plenary Meeting of the European Committee for Future Accelerators, showing the first public images of the quench aftermath, an accident that has become known as the “S34 Incident”.
In addition to these images, there are suggestions that there may be no particle collisions next year. Although the most recent report doesn’t appear to back up these plans, and replacement parts have started to arrive at the facility (above), it looks like the first collisions probably won’t happen until July 2009 at the earliest (that’s four months later than previously estimated)…
It looks like the September 19th quench between Sectors 3-4 of the LHC ring is now being referred to as the “S34 Incident”. And what an incident it was. Fortunately nobody was injured during the quench, but the LHC wasn’t so lucky. For a rundown of the official account of the S34 Incident, I’ll hand over to Robert Aymar’s November 28th presentation (page 15):
Within the first second, an electrical arc developed and punctured the helium enclosure, leading to release of helium into the insulation vacuum of the cryostat. The spring-loaded relief discs on the vacuum enclosure opened when the pressure exceeded atmospheric, thus relieving the helium to the tunnel. They were however unable to contain the pressure rise below the nominal 0.15 MPa absolute in the vacuum enclosures of subsector 23-25, thus resulting in large pressure forces acting on the vacuum barriers separating neighboring subsectors, which most probably damaged them. These forces displaced dipoles in the subsectors affected from their cold internal supports, and knocked the Short Straight Section cryostats housing the quadrupoles and vacuum barriers from their external support jacks at positions Q23, Q27 and Q31, in some locations breaking their anchors in the concrete floor of the tunnel. The displacement of the Short Straight Section cryostats also damaged the “jumper” connections to the cryogenic distribution line, but without rupture of the transverse vacuum barriers equipping these jumper connections, so that the insulation vacuum in the cryogenic line did not degrade. — CERN
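A pressure excess of a few hundredths of a megapascal sounds modest, but spread over a cryostat-sized cross-section the force is brutal. As a rough, hedged estimate (the barrier diameter below is my assumption, purely for illustration):

```python
import math

# Rough force on a vacuum barrier from over-pressurised helium.
# The barrier diameter is an assumed, illustrative value.

barrier_diameter_m = 0.9                  # cryostat-scale barrier (assumed)
area_m2 = math.pi * (barrier_diameter_m / 2) ** 2

overpressure_pa = 0.15e6 - 0.101e6        # excess over 1 atm at the 0.15 MPa limit
force_n = overpressure_pa * area_m2

print(f"~{force_n / 1000:.0f} kN, roughly {force_n / 9.81 / 1000:.1f} tonnes-force")
```

Several tonnes-force pushing axially on each barrier, and more as the pressure climbed past the nominal limit, goes a long way to explaining how multi-tonne magnets were shoved off their supports.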
The first image (pictured above) clearly shows the extent of the concrete damage caused by the huge pressure forces generated by the leaking helium, which ripped the electromagnets off their supports (the red boxes in the photo) and shattered the floor.
In this second image, the extent of the damage is pretty clear. Assuming the accelerator beam-line used to be straight (unfortunately, there is no “before” picture), the violent displacement of a huge magnet (weighing several tonnes) is obvious.
Later in the presentation, Aymar points out that 5 quadrupole and 24 dipole magnets need to be repaired, and around 57 magnets have to be removed to be cleaned. This will be a huge task, one that will last many months. According to one eagle-eyed blogger at High Energy PhDs, a previous report, presented a few days before Aymar’s, signalled that there may be no high-energy particle collisions until 2010. Jörg Wenninger outlined two possibilities for the LHC: 1) partial operations in 2009, allowing only low-energy particle acceleration while full-scale repairs await the 2009-10 winter shutdown, or 2) forgo 2009 operations and work toward full-scale experiments in 2010. Aymar’s more recent report did not mention these scenarios, simply stating, “the LHC will restart operation in the next spring.”
Judging by the mixed signals, we’ll have to wait patiently until it is clear when the LHC is expected to recover. Either way, it will be a long, painstaking and expensive task that needs to be completed as soon as possible. I really hope we don’t have to wait until 2010 for the restart.