The Search for Dark Energy Just Got Easier

The Victor M. Blanco telescope at the Cerro Tololo Inter-American Observatory (CTIO) in the Chilean Andes. Credit: Berkeley Lab

Ever since the discovery of cosmic acceleration in the late 1990s, scientists have been faced with explaining how and why the Universe appears to be expanding at an accelerating rate. The most widely accepted explanation is that the cosmos is permeated by a mysterious force known as “dark energy”. In addition to being responsible for cosmic acceleration, this energy is also thought to comprise about 68.3% of the universe’s total mass-energy content.

Much like dark matter, the existence of this invisible force is inferred rather than directly detected: it is invoked because it fits observable phenomena and our current models of cosmology. Scientists must therefore rely on indirect observations, watching how fast cosmic objects (specifically Type Ia supernovae) recede from us as the universe expands.

This process would be extremely tedious for scientists – like those who work for the Dark Energy Survey (DES) – were it not for the new algorithms developed collaboratively by researchers at Lawrence Berkeley National Laboratory and UC Berkeley.

“Our algorithm can classify a detection of a supernova candidate in about 0.01 seconds, whereas an experienced human scanner can take several seconds,” said Danny Goldstein, a UC Berkeley graduate student who developed the code to automate the process of supernova discovery on DES images.

Currently in its second season, the DES takes nightly pictures of the Southern Sky with DECam – a 570-megapixel camera that is mounted on the Victor M. Blanco telescope at the Cerro Tololo Inter-American Observatory (CTIO) in the Chilean Andes. Every night, the camera generates between 100 Gigabytes (GB) and 1 Terabyte (TB) of imaging data, which is sent to the National Center for Supercomputing Applications (NCSA) and DOE’s Fermilab in Illinois for initial processing and archiving.

A Type Ia supernova occurs when a white dwarf accretes material from a companion star until it exceeds the Chandrasekhar limit and explodes. By studying these exploding stars, astronomers can measure dark energy and the expansion of the universe. Credit: NASA/CXC/M. Weiss

Object recognition programs developed at the National Energy Research Scientific Computing Center (NERSC) and implemented at NCSA then comb through the images in search of possible detections of Type Ia supernovae. These powerful explosions occur in binary star systems where one star is a white dwarf, which accretes material from a companion star until it reaches a critical mass and explodes in a Type Ia supernova.

“These explosions are remarkable because they can be used as cosmic distance indicators to within 3-10 percent accuracy,” says Goldstein.

Distance is important because the further away an object is located in space, the further back in time it is. By tracking Type Ia supernovae at different distances, researchers can measure cosmic expansion throughout the universe’s history. This allows them to put constraints on how fast the universe is expanding and maybe even provide other clues about the nature of dark energy.

“Scientifically, it’s a really exciting time because several groups around the world are trying to precisely measure Type Ia supernovae in order to constrain and understand the dark energy that is driving the accelerated expansion of the universe,” says Goldstein, who is also a student researcher in Berkeley Lab’s Computational Cosmology Center (C3).

UC Berkeley/Berkeley Lab graduate student Danny Goldstein developed a new code that uses the machine learning technique Random Forest to vet detections of supernova candidates automatically, in real time, optimized for the Dark Energy Survey. Credit: Danny Goldstein, UC Berkeley/Berkeley Lab

The DES begins its search for Type Ia explosions by uncovering changes in the night sky, which is where the image subtraction pipeline developed and implemented by researchers in the DES supernova working group comes in. The pipeline subtracts images that contain known cosmic objects from new images that are exposed nightly at CTIO.

Each night, the pipeline produces between 10,000 and a few hundred thousand detections of supernova candidates that need to be validated.

“Historically, trained astronomers would sit at the computer for hours, look at these dots, and offer opinions about whether they had the characteristics of a supernova, or whether they were caused by spurious effects that masquerade as supernovae in the data. This process seems straightforward until you realize that the number of candidates that need to be classified each night is prohibitively large and only one in a few hundred is a real supernova of any type,” says Goldstein. “This process is extremely tedious and time-intensive. It also puts a lot of pressure on the supernova working group to process and scan data fast, which is hard work.”

To simplify the task of vetting candidates, Goldstein developed a code that uses the machine learning technique “Random Forest” to vet detections of supernova candidates automatically and in real time, optimized for the DES. The technique employs an ensemble of decision trees to automatically ask the types of questions that astronomers would typically consider when classifying supernova candidates.

Evolution of a Type Ia supernova. Credit: NASA/ESA/A. Feild

At the end of the process, each candidate detection is given a score based on the fraction of decision trees that judged it to have the characteristics of a real supernova detection. The closer the classification score is to one, the stronger the candidate. Goldstein notes that in preliminary tests, the classification pipeline achieved 96 percent overall accuracy.
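The scoring idea can be sketched in a few lines. This is a toy illustration, not the DES pipeline: the “trees” below are simple random threshold tests on invented candidate features, but the final score is computed exactly as described, as the fraction of trees that vote “supernova”.

```python
import random

def make_tree(rng):
    """A stand-in 'decision tree': a random threshold test on one feature."""
    feature = rng.randrange(3)
    threshold = rng.uniform(-1, 1)
    return lambda candidate: candidate[feature] > threshold

rng = random.Random(0)
forest = [make_tree(rng) for _ in range(100)]

def score(candidate):
    """Fraction of trees voting 'real': closer to 1.0 means a stronger candidate."""
    votes = sum(tree(candidate) for tree in forest)
    return votes / len(forest)

# Feature values here are invented; in a real pipeline they would be
# measurements such as shape, flux, and proximity to image defects.
print(score([0.9, 0.8, 0.7]))     # strong candidate: score near 1
print(score([-0.9, -0.8, -0.7]))  # weak candidate: score near 0
```

A real Random Forest trains each tree on labeled examples rather than drawing thresholds at random, but the vote-counting step that produces the score works the same way.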

“When you do subtraction alone you get far too many ‘false-positives’ — instrumental or software artifacts that show up as potential supernova candidates — for humans to sift through,” says Rollin Thomas, of Berkeley Lab’s C3, who was Goldstein’s collaborator.

He notes that with the classifier, researchers can quickly and accurately separate artifacts from genuine supernova candidates. “This means that instead of having 20 scientists from the supernova working group continually sift through thousands of candidates every night, you can just appoint one person to look at maybe a few hundred strong candidates,” says Thomas. “This significantly speeds up our workflow and allows us to identify supernovae in real time, which is crucial for conducting follow-up observations.”

“Using about 60 cores on a supercomputer we can classify 200,000 detections in about 20 minutes, including time for database interaction and feature extraction,” says Goldstein.

Goldstein and Thomas note that the next step in this work is to add a second-level of machine learning to the pipeline to improve the classification accuracy. This extra layer would take into account how the object was classified in previous observations as it determines the probability that the candidate is “real.” The researchers and their colleagues are currently working on different approaches to achieve this capability.

Further Reading: Berkeley Lab

Africa’s First Mission to the Moon Announced

Africa2Moon will be Africa's first venture into space. Credit: developspacesa.org

Africa is home to 7 out of 10 of the world’s fastest-growing economies. Its population is also the “youngest” in the world, with 50% of the population being 19 years old or younger. And amongst these young people are scores of innovators and entrepreneurs who are looking to bring homegrown innovation to their continent and share it with the outside world.

Nowhere is this more apparent than with the #Africa2Moon Mission, a crowdfunded campaign that aims to send a lander or orbiter to the Moon in the coming years.

Spearheaded by the Foundation for Space Development – a non-profit organization headquartered in Cape Town, South Africa – the goal of this project is to fund the development of a robotic craft that will either land on or establish orbit around the Moon. Once there, it will transmit video images back to Earth, and then distribute them via the internet into classrooms all across Africa.

In so doing, the project’s founders and participants hope to help the current generation of Africans realize their own potential. Or, as it says on their website: “The #Africa2Moon Mission will inspire the youth of Africa to believe that ‘We Can Reach for the Moon’ by really reaching for the moon!”

Through their crowdfunding and a social media campaign (Twitter hashtag #Africa2Moon) they hope to raise a minimum of $150,000 for Phase I, which will consist of developing the mission concept and associated feasibility study. This mission concept will be developed collaboratively by experts assembled from African universities and industries, as well as international space experts, all under the leadership of the Mission Administrator – Professor Martinez.

The ZACube-1 was one of several cubesats launched under the direction of the South African Space Council. Here, an artist’s rendering of the cubesat pays homage to Nelson Mandela. Credit: SA Space Council

Martinez is a veteran when it comes to space affairs. In addition to being the convener for the space studies program at the University of Cape Town, he is also the Chairman of the South African Council for Space Affairs (the national regulatory body for space activities in South Africa). He is joined by Jonathan Weltman, the Project Administrator, who is both an aeronautical engineer and the current CEO of the Foundation for Space Development.

Phase I is planned to run from January to November 2015 and will be the starting point for Phase II of #Africa2Moon, which will be a detailed mission design. At this point, the #Africa2Moon mission planners and engineering team will determine precisely what will be needed to see it through to completion and to reach the Moon.

Beyond inspiring young minds, the program also aims to promote education in the four major fields of Science, Technology, Engineering, and Mathematics (aka STEM). Towards this end, they have pledged to commit 25% of all the funds they raise towards STEM education through a series of #Africa2Moon workshops for educators and students. In addition, numerous public engagement activities will be mounted in partnership with other groups committed to STEM education, science awareness, and outreach.

Africa is so often thought of as a land in turmoil – a place that is perennially plagued by ethnic violence, dictators, disease, drought, and famine. This popular misconception belies very positive facts about the growing economy of the world’s second-largest and second-most populous continent.

That being said, all those working on the #Africa2Moon project hope it will enable future generations of Africans to bridge the humanitarian and economic divide and end Africa’s financial dependence on the rest of the world. It is also hoped that the mission will provide a platform for one or more scientific experiments, contribute to humankind’s knowledge of the moon, and form part of Africa’s contribution to global space exploration activities.

The project’s current list of supporters includes the SpaceLab at the University of Cape Town, The South African Space Association, Women in Aerospace Africa, The Cape Town Science Centre, Space Commercial Services Group, Space Advisory Company, and the Space Engineering Academy. They have also launched a seed-funding campaign drive through their partnership with the UN Foundation’s #GivingTuesday initiative.

For more information, go to the Foundation’s website, or check out the mission’s Indiegogo or CauseVox page.

Further Reading: Foundation for Space Development

NASA’s “Remastered” View of Europa is the Best Yet

The cracked, icy surface of Europa. The smoothness of the surface has led many scientists to conclude that oceans exist beneath it. Credit: NASA/JPL

Europa, Jupiter’s sixth-closest moon, has long been a source of fascination and wonder for astronomers. Not only is it unique amongst its Jovian peers for having a smooth, ice-covered surface, but it is believed that a warm ocean exists beneath that crust – which also makes it a strong candidate for extraterrestrial life.

And now, combining a mosaic of color images with modern image processing techniques, NASA has produced a new version of what is perhaps the best view of Europa yet. And it is quite simply the closest approximation to what the human eye would see, and the next best thing to seeing it up close.

The high-resolution color image, which shows the largest portion of the moon’s surface, was made from images taken by NASA’s Galileo probe. Using the Solid-State Imaging (SSI) experiment, the craft captured these images during its first and fourteenth orbits through the Jupiter system, in 1995 and 1998 respectively.

The view was previously released as a mosaic with lower resolution and strongly enhanced color (as seen on the JPL’s website). To create this new version, the images were assembled into a realistic color view of the surface that approximates how Europa would appear to the human eye.

The puzzling, fascinating surface of Jupiter’s icy moon Europa looms large in this newly-reprocessed color view, made from images taken by NASA’s Galileo spacecraft in the late 1990s. Image credit: NASA/JPL-Caltech/SETI Institute

As shown above, the new image shows the stunning diversity of Europa’s surface geology. Long, linear cracks and ridges crisscross the surface, interrupted by regions of disrupted terrain where the surface ice crust has been broken up and re-frozen into new patterns.

Images taken through near-infrared, green, and violet filters have been combined to produce this view. The images have been corrected for light scattered outside of the image to provide a color correction that is calibrated by wavelength. Gaps in the images have been filled with simulated color based on the color of nearby surface areas with similar terrain types.

These color variations across the surface are associated with differences in geologic feature type and location. For example, areas that appear blue or white contain relatively pure water ice, while reddish and brownish areas include non-ice components in higher concentrations.

The polar regions, visible at the left and right of this view, are noticeably bluer than the more equatorial latitudes, which look more white. This color variation is thought to be due to differences in ice grain size in the two locations.

Artist’s concept of the Galileo space probe passing through the Jupiter system. Credit: NASA

This view of Europa stands out as the color view that shows the largest portion of the moon’s surface at the highest resolution. An earlier, lower-resolution version of the view, published in 2001, featured colors that had been strongly enhanced. Space imaging enthusiasts have produced their own versions of the view using the publicly available data, but NASA has not previously issued its own rendition using near-natural color.

The image also features many long, curving, and linear fractures in the moon’s bright ice shell. Scientists are eager to learn if the reddish-brown fractures, and other markings spattered across the surface, contain clues about the geological history of Europa and the chemistry of the global ocean that is thought to exist beneath the ice.

This is of particular interest to scientists since this supposed ocean is the most promising place in our Solar System, beyond Earth, to look for present-day environments that are suitable for life. The Galileo mission found strong evidence that a subsurface ocean of salty water is in contact with a rocky seafloor. The cycling of material between the ocean and ice shell could potentially provide sources of chemical energy that could sustain simple life forms.

Future missions to Europa, which could involve anything from landers to space penetrators, may finally answer the question of whether or not life exists beyond our small, blue planet. Picturing this world in all of its icy glory is another small step along that path.

In addition to the newly processed image, JPL has released a new video that explains why this likely ocean world is a high priority for future exploration:

Further Reading: NASA

The Orbit of Earth. How Long is a Year on Earth?

Diagram of the Earth’s orbit around the Sun. Credit: NASA/H. Zell

Ever since the 16th century, when Nicolaus Copernicus demonstrated that the Earth revolves around the Sun, scientists have worked tirelessly to understand this relationship in mathematical terms. If this bright celestial body – upon which the seasons, the diurnal cycle, and all life on Earth depend – does not revolve around us, then what exactly is the nature of our orbit around it?

For several centuries, astronomers have applied the scientific method to answer this question, and have determined that the Earth’s orbit around the Sun has many fascinating characteristics. What they have found has also helped us understand why we measure time the way we do.

Orbital Characteristics:

First of all, the average speed of the Earth’s orbit around the Sun is about 108,000 km/h, which means that our planet travels roughly 940 million km during a single orbit. The Earth completes one orbit every 365.242199 mean solar days, a fact which goes a long way towards explaining why we need an extra calendar day every four years (a leap year).
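These figures are easy to sanity-check against each other. A quick back-of-the-envelope sketch, using only the numbers quoted in this section:

```python
speed_kmh = 108_000     # quoted average orbital speed, km/h
year_days = 365.242199  # mean solar days per orbit

# Distance traveled in one orbit: roughly 940-950 million km.
distance_km = speed_kmh * 24 * year_days
print(f"distance per orbit: {distance_km / 1e6:.0f} million km")

# The leftover ~0.24 day per year adds up to roughly one full day
# every four years, which is why we insert a leap day.
print(f"extra days accumulated over 4 years: {(year_days - 365) * 4:.2f}")
```

The small remaining mismatch (0.97 days rather than exactly 1.0 every four years) is why century years not divisible by 400 skip their leap day.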

The planet’s distance from the Sun varies as it orbits. In fact, the Earth is never the same distance from the Sun from day to day. When the Earth is closest to the Sun, it is said to be at perihelion. This occurs around January 3rd each year, when the Earth is at a distance of about 147,098,074 km.

The average distance of the Earth from the Sun is about 149.6 million km, which is also referred to as one astronomical unit (AU). When it is at its farthest distance from the Sun, Earth is said to be at aphelion – which happens around July 4th, when the Earth reaches a distance of about 152,097,701 km.

And those of you in the northern hemisphere will notice that “warm” or “cold” weather does not coincide with how close the Earth is to the Sun. That is determined by axial tilt (see below).

Elliptical Orbit:

Next, there is the nature of the Earth’s orbit. Rather than being a perfect circle, the Earth moves around the Sun in a slightly flattened circle, or oval pattern. This is what is known as an “elliptical” orbit. This orbital pattern was first described by Johannes Kepler, a German mathematician and astronomer, in his seminal work Astronomia nova (New Astronomy).

An illustration of Kepler’s three laws of planetary motion, showing two planets in elliptical orbits around the Sun. Credit: Wikipedia/Hankwang

After measuring the orbits of the Earth and Mars, he noticed that at times, both planets appeared to speed up or slow down in their orbits. This coincided directly with the planets’ aphelion and perihelion, meaning that the planets’ distance from the Sun bore a direct relationship to the speed of their orbits. It also meant that neither Earth nor Mars orbits the Sun in a perfectly circular pattern.

In describing the nature of elliptical orbits, scientists use a factor known as “eccentricity”, which is expressed in the form of a number between zero and one. If a planet’s eccentricity is close to zero, then the ellipse is nearly a circle. If it is close to one, the ellipse is long and slender.

Earth’s orbit has an eccentricity of less than 0.02, which means that it is very close to being circular. That is why the difference between the Earth’s distance from the Sun at perihelion and aphelion is very little – less than 5 million km.
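These numbers hang together. Plugging the perihelion and aphelion distances quoted above into the standard formula for an ellipse, e = (r_ap - r_per) / (r_ap + r_per), recovers the eccentricity directly:

```python
r_per = 147_098_074  # km, perihelion distance (around January 3rd)
r_ap = 152_097_701   # km, aphelion distance (around July 4th)

# Standard formula: eccentricity from the closest and farthest distances.
e = (r_ap - r_per) / (r_ap + r_per)
print(f"eccentricity: {e:.4f}")  # about 0.0167, comfortably under 0.02

print(f"perihelion-aphelion difference: {(r_ap - r_per) / 1e6:.1f} million km")
```

The difference works out to just under 5 million km, about 3% of the average Earth-Sun distance.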

Seasonal Change:

Third, there is the role Earth’s orbit plays in the seasons, which we referred to above. The four seasons are determined by the fact that the Earth’s axis is tilted 23.4° from the perpendicular to its orbital plane, which is referred to as “axial tilt.” This quirk in our orbit determines the solstices – the points in the orbit of maximum axial tilt toward or away from the Sun – and the equinoxes, when the direction of the tilt and the direction to the Sun are perpendicular.

Over the course of a year, the orientation of the axis remains fixed in space, producing changes in the distribution of solar radiation. These changes in the pattern of radiation reaching Earth’s surface cause the succession of the seasons. Credit: NOAA/Thomas G. Andrews

In short, when the northern hemisphere is tilted away from the Sun, it experiences winter while the southern hemisphere experiences summer. Six months later, when the northern hemisphere is tilted towards the Sun, the seasonal order is reversed.

In the northern hemisphere, winter solstice occurs around December 21st, summer solstice is near June 21st, spring equinox is around March 20th and autumnal equinox is about September 23rd. The axial tilt in the southern hemisphere is exactly the opposite of the direction in the northern hemisphere. Thus the seasonal effects in the south are reversed.

While it is true that Earth does have a perihelion, or point at which it is closest to the sun, and an aphelion, its farthest point from the Sun, the difference between these distances is too minimal to have any significant impact on the Earth’s seasons and climate.

Lagrange Points:

Another interesting characteristic of the Earth’s orbit around the Sun has to do with Lagrange points. These are the five positions in Earth’s orbital configuration around the Sun where the combined gravitational pull of the Earth and the Sun provides precisely the centripetal force required for a small object to orbit along with them.

Sun-Earth Lagrange Points. Credit: Xander89/Wikimedia Commons

The five Lagrange points of the Sun-Earth system are labelled (somewhat unimaginatively) L1 to L5. L1, L2, and L3 sit along a straight line that goes through the Earth and Sun. L1 sits between them, L3 is on the opposite side of the Sun from the Earth, and L2 is on the opposite side of the Earth from L1. These three Lagrange points are unstable, which means that a satellite placed at any one of them will drift off course if disturbed in the slightest.

The L4 and L5 points lie at the tips of two equilateral triangles whose base is the line between the Sun and Earth. These points lie along Earth’s orbit, with L4 60° ahead of the Earth and L5 60° behind. These two Lagrange points are stable, which is why they are popular destinations for satellites and space telescopes.
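The distance of L1 and L2 from Earth can be estimated with a standard textbook approximation not mentioned in the article: both points sit roughly one Hill radius away, r ≈ a (m_Earth / 3 m_Sun)^(1/3). A minimal sketch, using commonly quoted values for the masses:

```python
a = 149.6e6         # km, average Earth-Sun distance (1 AU)
m_earth = 5.972e24  # kg
m_sun = 1.989e30    # kg

# Hill-radius approximation for the distance of L1/L2 from Earth.
r_l1l2 = a * (m_earth / (3 * m_sun)) ** (1 / 3)
print(f"L1/L2 distance from Earth: {r_l1l2 / 1e6:.2f} million km")
```

The result, about 1.5 million km, matches where real spacecraft such as SOHO (at L1) and JWST (at L2) actually orbit.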

The study of Earth’s orbit around the Sun has taught scientists much about other planets as well. Knowing where a planet sits in relation to its parent star, its orbital period, its axial tilt, and a host of other factors are all central to determining whether or not life may exist on one, and whether or not human beings could one day live there.

We have written many interesting articles about the Earth’s orbit here at Universe Today. Here’s 10 Interesting Facts About Earth, How Far is Earth from the Sun?, What is the Rotation of the Earth?, Why are there Seasons?, and What is Earth’s Axial Tilt?

For more information, check out the Windows to the Universe article on elliptical orbits, or NASA’s Earth: Overview.

Astronomy Cast also has episodes that are relevant to the subject. Here’s Questions Show: Black Holes, Unbalancing the Earth, and Space Pollution.


Subaru Telescope Spots Galaxies From The Early Universe

A team of astronomers has used the Subaru Telescope to look back more than 13 billion years to find 7 early galaxies that appeared quite suddenly within 700 million years of the Big Bang. Credit: NASA/WMAP Science Team

It’s an amazing thing, staring into deep space with the help of a high-powered telescope. In addition to being able to peer across the vast reaches of space, one is also able to effectively see back through time.

Using the Subaru Telescope’s Suprime-Cam, a team of astronomers has done just that. In short, they looked back 13 billion years and discovered 7 early galaxies that appeared quite suddenly within 700 million years of the Big Bang. In so doing, they discovered clues to one of astronomy’s most burning questions: when and how early galaxies formed in our universe.

The team, led by graduate student Akira Konno and Dr. Masami Ouchi (Associate Professor at the University of Tokyo’s ICRR) was looking for a specific kind of galaxy called a Lyman-alpha emitter (LAE), to understand the role such galaxies may have played in an event called “cosmic reionization”.

The current cosmological model states that the universe was born in the Big Bang some 13.8 billion years ago. In its earliest epochs, it was filled with a hot “soup” of charged protons and electrons. As the newborn universe expanded, its temperature decreased uniformly.

It is estimated that the first stars and galaxies formed 12.8 billion years ago, during a period of “cosmic reionization”. Credit: NASA/ESA/A. Feild (STScI)

When the universe was 400,000 years old, conditions were cool enough to allow the protons and electrons to bond and form neutral hydrogen atoms. That event is called “recombination” and resulted in a universe filled with a “fog” of these neutral atoms.

Eventually the first stars and galaxies began to form, and their ultraviolet light ionized the hydrogen atoms, and “divided” the neutral hydrogen into protons and electrons again. As this occurred, the “fog” of neutral hydrogen cleared.

Astronomers call this event “cosmic reionization” and think that it ended about 12.8 billion years ago – a billion years after the Big Bang. The timing of this event – when it started and how long it lasted – is one of the big questions in astronomy.

To investigate this cosmic reionization, the Subaru team searched for early LAE galaxies at a distance of 13.1 billion light-years. Although the Hubble Space Telescope has found more distant galaxies, the discovery of seven LAEs at 13.1 billion light-years represents a distance milestone for the Subaru Telescope.
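To give a sense of why such distant LAEs are hard to catch, consider where their signature line ends up. The numbers below are illustrative assumptions, not figures from the study: the Lyman-alpha line has a rest wavelength of about 121.6 nm, and galaxies at roughly this lookback time have redshifts around z ≈ 7.

```python
rest_nm = 121.567  # Lyman-alpha rest wavelength, nm (far ultraviolet)
z = 7.3            # assumed redshift for a galaxy at ~13.1 billion light-years

# Cosmic expansion stretches the wavelength by a factor of (1 + z),
# pushing the line from the ultraviolet toward the near-infrared.
observed_nm = rest_nm * (1 + z)
print(f"observed wavelength: {observed_nm:.0f} nm")
```

The line lands near 1000 nm, at the very red edge of what a CCD camera like Suprime-Cam can detect, which is one reason a dedicated filter and very long exposures are needed to pick these galaxies out.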

Color composite images of the seven LAEs found in this study, as they appeared 13.1 billion years ago, combining three filter images from the Subaru Telescope. The red objects between the two white lines are the LAEs; they appear quite red due to the effects of cosmic expansion on their light. Credit: ICRR, University of Tokyo

Mr. Konno, the graduate student heading the analysis of the data from the Subaru Telescope, pointed out the obstacles that Subaru had to overcome to make the observations. “It is quite difficult to find the most distant galaxies due to the faintness of the galaxies,” he said. “So, we developed a special filter to be able to find a lot of faint LAEs. We loaded the filter onto Suprime-Cam and conducted the most distant LAE survey with an integration time of 106 hours.”

That extremely long integration time was one of the longest ever performed at Subaru Telescope. It allowed for unprecedented sensitivity and enabled the team to search for as many of the most distant LAEs as possible.

According to Konno, the team expected to find several tens of LAEs. Instead they only found seven.

“At first we were very disappointed at this small number,” Konno said. “But we realized that this indicates LAEs appeared suddenly about 13 billion years ago. This is an exciting discovery. We can see that the luminosities suddenly brightened during the 700-800 million years after the Big Bang. What would cause this?”

This shows the evolution of the Lyman-alpha luminosities of the galaxies. Credit: ICRR, University of Tokyo; Hubble Space Telescope/NASA/ESA

As the figure above illustrates, the study traces how the luminosities of LAEs evolved over time. The yellow circles come from previous studies (the one at 1 billion years after the Big Bang is used for normalization), and the yellow dashed line shows the expected evolutionary trend of the luminosity.

The current finding is shown by a red circle, and we can see that the galaxies appear suddenly when the universe was 700 million years old. This indicates that the neutral hydrogen fog was suddenly cleared, allowing the galaxies to shine out, as indicated by the backdrop shown for scale and illustration.

According to the team’s analysis, one reason that LAEs appeared so suddenly is cosmic reionization itself: during that epoch, LAEs appeared dimmer than their actual luminosity due to the presence of the neutral hydrogen fog.

In the team’s analysis of their observations, they suggest the possibility that the neutral fog filling the universe was cleared about 13.0 billion years ago and LAEs suddenly appeared in sight for the first time.

“However, there are other possibilities to explain why LAEs appeared suddenly,” said Dr. Ouchi, who is the principal investigator of this program. “One is that clumps of neutral hydrogen around LAEs disappeared. Another is that LAEs became intrinsically bright. The reason for the intrinsic brightening would be that Lyman-alpha emission is not efficiently produced by the ionized clouds in an LAE, due to the significant escape of ionizing photons from the galaxy. In either case, our discovery is an important key to understanding cosmic reionization and the properties of LAEs in the early universe.”

Dr. Masanori Iye, who is a representative of the Thirty Meter Telescope (TMT) project of Japan, commented on the observations and analysis: “To investigate which possibility is correct, we will observe with HSC (Hyper Suprime-Cam) on the Subaru Telescope, which has a field of view seven times wider than Suprime-Cam, and in the future with the TMT, currently being built on the summit of Mauna Kea in Hawaii. With these observations, we will clarify the mystery of how galaxies were born and how cosmic reionization occurred.”

Further Reading: Subaru Telescope

Two New Subatomic Particles Found

Particle Collider
Today, CERN announced that the LHCb experiment had revealed the existence of two new baryon subatomic particles. Credit: CERN/LHC/GridPP

Since its first run of proton collisions, from 2009 to 2013, the Large Hadron Collider has provided a stream of experimental data that scientists rely on to test predictions arising out of particle and high-energy physics. In fact, today CERN made the first data produced by LHC experiments publicly available. And with each passing day, new information is released that is helping to shed light on some of the deeper mysteries of the universe.

This week, for example, CERN announced the discovery of two new subatomic particles belonging to the baryon family. The particles, known as the Xi_b’ and Xi_b*, were discovered thanks to the efforts of the LHCb experiment – an international collaboration involving roughly 750 scientists from around the world.

The existence of these particles was predicted by the quark model, but had never been seen before. What’s more, their discovery could help scientists to further confirm the Standard Model of particle physics, which is considered virtually unassailable now thanks to the discovery of the Higgs Boson.

Like the well-known protons that the LHC accelerates, the new particles are baryons made from three quarks bound together by the strong force. The types of quarks are different, though: the new Xi_b particles both contain one beauty (b), one strange (s), and one down (d) quark. Thanks to the heavyweight b quarks, they are more than six times as massive as the proton.
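As a quick sanity check on that mass comparison, the approximate masses reported for the new states (about 5935 MeV/c² for the Xi_b’ and 5955 MeV/c² for the Xi_b*, against the proton’s 938 MeV/c²) can be compared directly; treat these as rounded illustrative values:

```python
# Rough check of the "more than six times as massive as the proton" claim.
# Masses in MeV/c^2; the Xi_b values are approximate figures from the
# LHCb measurement, rounded for illustration.
m_proton = 938.27
m_xi_b_prime = 5935.0  # Xi_b'
m_xi_b_star = 5955.3   # Xi_b*

print(f"Xi_b' / proton: {m_xi_b_prime / m_proton:.2f}")
print(f"Xi_b* / proton: {m_xi_b_star / m_proton:.2f}")
```

Both ratios come out a little above 6.3, consistent with the "more than six times" figure.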

Cross-section of the Large Hadron Collider, showing where its detectors are placed and collisions occur. The LHC lies as much as 175 meters (574 ft) below ground on the French-Swiss border near Geneva, Switzerland; the accelerator ring is 27 km (17 miles) in circumference. Credit: CERN

However, their mass also depends on how they are configured. Each of the quarks has an attribute called “spin”; and in the Xi_b’ state, the spins of the two lighter quarks point in the opposite direction to the b quark, whereas in the Xi_b* state they are aligned. This difference makes the Xi_b* a little heavier.

“Nature was kind and gave us two particles for the price of one,” said Matthew Charles of the CNRS’s LPNHE laboratory at Paris VI University. “The Xi_b’ is very close in mass to the sum of its decay products: if it had been just a little lighter, we wouldn’t have seen it at all using the decay signature that we were looking for.”

“This is a very exciting result,” said Steven Blusk from Syracuse University in New York. “Thanks to LHCb’s excellent hadron identification, which is unique among the LHC experiments, we were able to separate a very clean and strong signal from the background. It demonstrates once again the sensitivity and how precise the LHCb detector is.”

Blusk and Charles jointly analyzed the data that led to this discovery. The existence of the two new baryons had been predicted in 2009 by Canadian particle physicists Randy Lewis of York University and Richard Woloshyn of the TRIUMF, Canada’s national particle physics lab in Vancouver.

The bare masses of all 6 flavors of quarks, proton and electron, shown in proportional volume. Credit: Wikipedia/Incnis Mrsi

As well as the masses of these particles, the research team studied their relative production rates, their widths – a measure of how unstable they are – and other details of their decays. The results match up with predictions based on the theory of Quantum Chromodynamics (QCD).

QCD is part of the Standard Model of particle physics, the theory that describes the fundamental particles of matter, how they interact, and the forces between them. Testing QCD at high precision is a key to refining our understanding of quark dynamics, models of which are tremendously difficult to calculate.

“If we want to find new physics beyond the Standard Model, we need first to have a sharp picture,” said LHCb’s physics coordinator Patrick Koppenburg from Nikhef Institute in Amsterdam. “Such high precision studies will help us to differentiate between Standard Model effects and anything new or unexpected in the future.”

The measurements were made with the data taken at the LHC during 2011-2012. The LHC is currently being prepared – after its first long shutdown – to operate at higher energies and with more intense beams. It is scheduled to restart by spring 2015.

The research was published online yesterday on the physics preprint server arXiv and has been submitted to the scientific journal Physical Review Letters.

Further Reading: CERN, LHCb

Elusive Dark Matter Could Be Detected with GPS Satellites

GPS Satellite
According to a new proposal, GPS satellites may be the key to finding dark matter. Credit: NASA

You know the old saying: “if you want to hide something, put it in plain sight?” Well, according to a new proposal by two professors of physics, this logic may be the reason why scientists have struggled for so long to find the mysterious mass that is believed to make up 27% of the mass-energy of the universe.

In short, these two physicists believe that dark matter can be found the same way you can find the fastest route to work: by consulting the Global Positioning System.

Andrei Derevianko, of the University of Nevada, Reno, and Maxim Pospelov, of the University of Victoria and the Perimeter Institute for Theoretical Physics in Canada, proposed this method earlier this year at a series of renowned scientific conferences, where it met with general approval.

Their idea calls for using GPS satellites and other atomic clock networks and comparing their times to look for discrepancies. Derevianko and Pospelov suggest that dark matter could have a disruptive effect on atomic clocks, and that by monitoring existing networks of atomic clocks it might be possible to spot pockets of dark matter by their distinctive signature.

The two are starting to test this theory by analyzing clock data from the 30 GPS satellites, which use atomic clocks for everyday navigation. Correlated networks of atomic clocks, such as GPS and some ground networks already in existence, can be used as a powerful tool to search for topological-defect dark matter: initially synchronized clocks would become desynchronized as a defect sweeps through the network.
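To illustrate the kind of search this implies, here is a minimal sketch of the idea; this is not the researchers’ actual pipeline, and the clock positions, sweep speed, and glitch amplitude are invented for illustration. A defect sweeping across a network at constant speed produces timing steps that arrive at each clock in sequence, and fitting those arrival times recovers the sweep speed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (all numbers invented for the sketch): five clocks
# spaced along the sweep direction. A dark-matter "wall" crossing at
# speed v shifts each clock's time offset as it passes, at t_i = x_i / v.
positions_km = np.array([5_000.0, 15_000.0, 25_000.0, 35_000.0, 45_000.0])
v_km_s = 300.0                       # assumed galactic-wind speed
t = np.arange(0.0, 300.0, 1.0)       # 5 minutes of 1 Hz clock residuals

glitch_ns = 2.0                      # assumed desynchronization step (ns)
noise_ns = 0.1                       # white clock noise (ns)
residuals = noise_ns * rng.standard_normal((len(positions_km), len(t)))
for i, x in enumerate(positions_km):
    residuals[i, t >= x / v_km_s] += glitch_ns   # step when the wall passes

# Detection sketch: locate each clock's step, then check that the step
# times line up with a single constant sweep speed across the network.
step_times = np.array([t[np.argmax(np.abs(np.diff(r)))] for r in residuals])
fit_v, _ = np.polyfit(step_times, positions_km, 1)   # km/s from the fit
print(f"inferred sweep speed: {fit_v:.0f} km/s")
```

The key signature is the ordering: a correlated, velocity-consistent pattern of steps is hard to mimic with independent clock noise.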

Hubble Space Telescope WFPC2 image of gravitational lensing in the galaxy cluster Abell 2218, indicating the presence of a large amount of dark matter. Credit: NASA/Andrew Fruchter/STScI

“Despite solid observational evidence for the existence of dark matter, its nature remains a mystery,” Derevianko, a professor in the College of Science at the University, said. “Some research programs in particle physics assume that dark matter is composed of heavy-particle-like matter. This assumption may not hold true, and significant interest exists for alternatives.”

Their proposal builds on the idea that dark matter could come from cracks in the universe’s quantum fields that could disturb such fundamental properties as the mass of an electron, and have an effect on the way we measure time. This represents a break from the more conventional view that dark matter consists of subatomic particles such as WIMPs and axions.

“Our research pursues the idea that dark matter may be organized as a large gas-like collection of topological defects, or energy cracks,” Derevianko said. “We propose to detect the defects, the dark matter, as they sweep through us with a network of sensitive atomic clocks. The idea is, where the clocks go out of synchronization, we would know that dark matter, the topological defect, has passed by. In fact, we envision using the GPS constellation as the largest human-built dark-matter detector.”

Derevianko is collaborating on analyzing GPS data with Geoff Blewitt, director of the Nevada Geodetic Laboratory, also in the College of Science at the University of Nevada, Reno. The Geodetic Lab developed and maintains the largest GPS data processing center in the world, able to process information from about 12,000 stations around the globe continuously, 24/7.

Artist’s rendering of a vacuum tube, one of the main components of an atomic clock. Credit: NASA

Blewitt, also a physicist, explained how an array of atomic clocks could possibly detect dark matter.

“We know the dark matter must be there, for example, because it is seen to bend light around galaxies, but we have no evidence as to what it might be made of,” he said. “If the dark matter were not there, the normal matter that we know about would not be sufficient to bend the light as much as it does. That’s just one of the ways scientists know there is a massive amount of dark matter somewhere out there in the galaxy. One possibility is that the dark matter in this gas might not be made out of particles like normal matter, but of macroscopic imperfections in the fabric of space-time.

“The Earth sweeps through this gas as it orbits the galaxy. So to us, the gas would appear to be like a galactic wind of dark matter blowing through the Earth system and its satellites. As the dark matter blows by, it would occasionally cause clocks of the GPS system to go out of sync with a tell-tale pattern over a period of about 3 minutes. If the dark matter causes the clocks to go out of sync by more than a billionth of a second we should easily be able to detect such events.”
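The three-minute figure in the quote above is roughly the time a defect moving at galactic speeds would take to cross the GPS constellation. A back-of-the-envelope check, assuming a GPS orbital radius of about 26,600 km and a wind speed of about 300 km/s (typical round numbers, not figures from the paper):

```python
# Rough check of the ~3 minute crossing time mentioned above.
gps_orbit_radius_km = 26_600      # GPS satellites orbit ~26,600 km from Earth's center
constellation_diameter_km = 2 * gps_orbit_radius_km
wind_speed_km_s = 300             # assumed speed of Earth through the dark-matter halo
crossing_time_s = constellation_diameter_km / wind_speed_km_s
print(f"crossing time: {crossing_time_s / 60:.1f} minutes")  # about 3 minutes
```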

“This type of work can be transformative in science and could completely change how we think about our universe,” Jeff Thompson, a physicist and dean of the University’s College of Science, said. “Andrei is a world class physicist and he has already made seminal contributions to physics. It’s a wonder to watch the amazing work that comes from him and his group.”

Derevianko teaches quantum physics and related subjects at the University of Nevada, Reno. He has authored more than 100 refereed publications in theoretical physics. He is a fellow of the American Physical Society, a Simons fellow in theoretical physics and a Fulbright scholar. Among a variety of research topics, he has contributed to the development of several novel classes of atomic clocks and precision tests of fundamental symmetries with atoms and molecules.

Their research appeared earlier this week in the online version of the scientific journal Nature Physics, ahead of the print version.

Further Reading: University of Nevada

“Spotter’s Guide” for Detecting Black Hole Collisions

A team of astronomers from South Africa have noticed a series of supermassive black holes in distant galaxies that are all spinning in the same direction. Credit: NASA/JPL-Caltech

When it comes to the many mysteries of the Universe, a special category is reserved for black holes. Since they emit no light, they cannot be observed directly, and scientists are forced to rely on “seeing” the effects their intense gravity has on nearby stars and gas clouds in order to study them.

That may be about to change, thanks to a team from Cardiff University, whose researchers have achieved a breakthrough that could help scientists discover hundreds of black holes throughout the Universe.

Led by Dr. Mark Hannam from the School of Physics and Astronomy, the researchers have built a theoretical model which aims to predict all potential gravitational-wave signals that might be found by scientists working with the Laser Interferometer Gravitational-Wave Observatory (LIGO) detectors.

These detectors, which act like microphones, are designed to pick up the signals of black hole collisions. When they are switched on, the Cardiff team hopes their research will act as a sort of “spotter’s guide,” helping scientists pick up the faint ripples of collisions – known as gravitational waves – that took place millions of years ago.

X-ray/radio composite image of two supermassive black holes spiraling towards each other near the center of Abell 400 galaxy cluster. Credit: X-ray: NASA/CXC/AIfA/D.Hudson & T.Reiprich et al.; Radio: NRAO/VLA/NRL

Made up of postdoctoral researchers, PhD students, and collaborators from universities in Europe and the United States, the Cardiff team will work with scientists across the world as they attempt to unravel the origins of the Universe.

“The rapid spinning of black holes will cause the orbits to wobble, just like the last wobbles of a spinning top before it falls over,” Hannam said. “These wobbles can make the black holes trace out wild paths around each other, leading to extremely complicated gravitational-wave signals. Our model aims to predict this behavior and help scientists find the signals in the detector data.”

Already, the new model has been programmed into the computer codes that LIGO scientists all over the world are preparing to use to search for black-hole mergers when the detectors switch on.

Dr Hannam added: “Sometimes the orbits of these spinning black holes look completely tangled up, like a ball of string. But if you imagine whirling around with the black holes, then it all looks much clearer, and we can write down equations to describe what is happening. It’s like watching a kid on a high-speed spinning amusement park ride, apparently waving their hands around. From the side lines, it’s impossible to tell what they’re doing. But if you sit next to them, they might be sitting perfectly still, just giving you the thumbs up.”

Researchers crunched Einstein’s theory of general relativity on the Columbia supercomputer at the NASA Ames Research Center to create a three-dimensional simulation of merging black holes. Credit: Henze, NASA

But of course, there’s still work to do: “So far we’ve only included these precession effects while the black holes spiral towards each other,” said Dr. Hannam. “We still need to work out exactly what the spins do when the black holes collide.”

For that they need to perform large computer simulations to solve Einstein’s equations for the moments before and after the collision. They’ll also need to produce many simulations to capture enough combinations of black-hole masses and spin directions to understand the overall behavior of these complicated systems.

In addition, time is somewhat limited for the Cardiff team. Once the detectors are switched on, it will only be a matter of time before the first gravitational-wave detections are made. The calculations that Dr. Hannam and his colleagues are producing will have to be ready in time if they hope to make the most of them.

But Dr. Hannam is optimistic. “For years we were stumped on how to untangle the black-hole motion,” he said. “Now that we’ve solved that, we know what to do next.”

Further Reading: News Center – Cardiff U

Amazingly Detailed New Maps of Asteroid Vesta

Artist's concept of the Dawn spacecraft arriving at Vesta. Image credit: NASA/JPL-Caltech

Vesta is one of the largest asteroids in the Solar System. Comprising 9% of the mass in the Asteroid Belt, it is second in size only to the dwarf planet Ceres. And now, thanks to data obtained by NASA’s Dawn spacecraft, Vesta’s surface has been mapped out in unprecedented detail. These high-resolution geological maps reveal the variety of Vesta’s surface features and provide a window into the asteroid’s history.

“The geologic mapping campaign at Vesta took about two-and-a-half years to complete, and the resulting maps enabled us to recognize a geologic timescale of Vesta for comparison to other planets,” said David Williams of Arizona State University.

Geological mapping is a technique used to derive the geologic history of a planetary object from detailed analysis of surface morphology, topography, color and brightness information. The team found that Vesta’s geological history is characterized by a sequence of large impact events, primarily by the Veneneia and Rheasilvia impacts in Vesta’s early history and the Marcia impact in its late history.

The geologic mapping of Vesta was made possible by the Dawn spacecraft’s framing camera, which was provided by the Max Planck Institute for Solar System Research of the German Max Planck Society and the German Aerospace Center.  This camera takes panchromatic images and seven bands of color-filtered images, which are used to create topographic models of the surface that aid in the geologic interpretation.

A team of 14 scientists mapped the surface of Vesta using Dawn data. The study was led by three NASA-funded participating scientists: Williams; R. Aileen Yingst of the Planetary Science Institute; and W. Brent Garry of the NASA Goddard Space Flight Center.

This high-res geological map of Vesta is derived from Dawn spacecraft data. Brown colors represent the oldest, most heavily cratered surface. Credit: NASA/JPL-Caltech/ASU

The brown colored sections of the map represent the oldest, most heavily cratered surface. Purple colors in the north and light blue represent terrains modified by the Veneneia and Rheasilvia impacts, respectively. Light purples and dark blue colors below the equator represent the interior of the Rheasilvia and Veneneia basins. Greens and yellows represent relatively young landslides or other downhill movement and crater impact materials, respectively.

The map indicates the prominence of impact events – such as the Veneneia, Rheasilvia, and Marcia impacts – in shaping the asteroid’s surface. It also indicates that the oldest crust on Vesta pre-dates even the earliest of these, the Veneneia impact. The relative timescale is supplemented by model-based absolute ages from two different approaches that apply crater statistics to date the surface.

“This mapping was crucial for getting a better understanding of Vesta’s geological history, as well as providing context for the compositional information that we received from other instruments on the spacecraft: the visible and infrared (VIR) mapping spectrometer and the gamma-ray and neutron detector (GRaND),” said Carol Raymond, Dawn’s deputy principal investigator at NASA’s Jet Propulsion Laboratory in Pasadena, California.

The objective of NASA’s Dawn mission is to characterize the two most massive objects in the main asteroid belt between Mars and Jupiter – Vesta and the dwarf planet Ceres.

These Hubble Space Telescope images of Vesta and Ceres show two of the most massive asteroids in the asteroid belt, a region between Mars and Jupiter. Credit: NASA/European Space Agency

Asteroids like Vesta are remnants of the formation of the solar system, giving scientists a peek at its early history. They can also harbor molecules that are the building blocks of life and reveal clues about the origins of life on Earth. This is why scientists are eager to learn more about their secrets.

The Dawn spacecraft was launched in September of 2007 and, using ion propulsion to spiral outward from Earth, orbited Vesta between July 2011 and September 2012.

The high resolution maps were included with a series of 11 scientific papers published this week in a special issue of the journal Icarus. The Dawn spacecraft is currently on its way to Ceres, the largest object in the asteroid belt, and will arrive at Ceres in March 2015.

Further Reading: NASA

Warm, Flowing Water on Mars Was Episodic, Study Suggests

Credit: NASA/MRO/Rendering: James Dickson, Brown University

Though the surface of Mars is a dry, desiccated and bitterly cold place today, it is strongly believed that the planet once had rivers, streams, lakes, and flowing water on its surface. Thanks to a combination of spacecraft imagery, remote sensing techniques and surface investigations from landers and rovers, ample evidence has been assembled to support this theory.

However, it is hard to reconcile this view with the latest climate models of Mars which suggest that it should have been a perennially cold and icy place. But according to a new study, the presence of warm, flowing water may have been an episodic occurrence, something that happened for decades or centuries when the planet was warmed sufficiently by volcanic eruptions and greenhouse gases.

The study, which was conducted by scientists from Brown University and Israel’s Weizmann Institute of Science, suggests that warmth and water flow on ancient Mars were probably episodic, related to brief periods of volcanic activity that spewed tons of greenhouse-inducing sulfur dioxide gas into the atmosphere.

The work combines the effect of volcanism with the latest climate models of early Mars and suggests that periods of temperatures warm enough for water to flow likely lasted for only tens or hundreds of years at a time.

The notion that Mars had surface water predates the space age by centuries. Long before Giovanni Schiaparelli observed what he took to be channels (“canali”) on the Martian surface in 1877, astronomers had interpreted the polar ice caps and dark surface markings as indications of liquid water.

Curiosity found evidence of an ancient, flowing stream on Mars at a few sites, including the “Hottah” rock outcrop pictured here. Credit: NASA/JPL

But with all that’s been learned about Mars in recent years, the mystery of the planet’s ancient water has only deepened. The latest generation of climate models for early Mars suggests that the atmosphere was too thin to heat the planet enough for water to flow. Billions of years ago, the sun was also much dimmer than it is today, which further complicates this picture of a warmer early Mars.

“These new climate models that predict a cold and ice-covered world have been difficult to reconcile with the abundant evidence that water flowed across the surface to form streams and lakes,” said James W. Head, professor of earth, environmental and planetary sciences at Brown University and co-author of the new paper with Weizmann’s Itay Halevy. “This new analysis provides a mechanism for episodic periods of heating and melting of snow and ice that could have each lasted decades to centuries.”

Halevy and Head explored the idea that heating may have been linked to periodic volcanism. Many of the geological features that suggest water was flowing on the Martian surface have been dated to 3.7 billion years ago, a time when massive volcanoes are thought to have been active.

And whereas on Earth, widespread volcanism has often led to global dimming rather than warming – on account of sulfuric acid particles reflecting the sun’s rays – Head and Halevy think the effects may have been different in Mars’ dusty atmosphere.

To test this theory, they created a model of how sulfuric acid might react with the widespread dust in the Martian atmosphere. The work suggests that those sulfuric acid particles would have glommed onto dust particles and reduced their ability to reflect the sun’s rays. Meanwhile, sulfur dioxide gas would have produced enough greenhouse effect to warm the Martian equatorial region so that water could flow.

Image of the McMurdo Dry Valleys, Antarctica,  acquired by Landsat 7’s Enhanced Thematic Mapper plus (ETM+) instrument. Credit: NASA/EO
Image of the McMurdo Dry Valleys, Antarctica, acquired by Landsat 7’s Enhanced Thematic Mapper plus (ETM+) instrument. Credit: NASA/EO

Head has been doing fieldwork for years in Antarctica and thinks the climate on early Mars may have been very similar to what he has observed in the cold, desert-like McMurdo Dry Valleys.

“The average yearly temperature in the Antarctic Dry Valleys is way below freezing, but peak summer daytime temperatures can exceed the melting point of water, forming transient streams, which then refreeze,” Head said. “In a similar manner, we find that volcanism can bring the temperature on early Mars above the melting point for decades to centuries, causing episodic periods of stream and lake formation.”

As that early active volcanism on Mars ceased, so did the possibility of warmer temperatures and flowing water.

According to Head, this theory might also help in the ongoing search for signs that Mars once hosted life. If it ever did exist, this new research may offer clues as to where the fossilized remnants ended up.

“Life in Antarctica, in the form of algal mats, is very resistant to extremely cold and dry conditions and simply waits for the episodic infusion of water to ‘bloom’ and develop,” he said. “Thus, the ancient and currently dry and barren river and lake floors on Mars may harbor the remnants of similar primitive life, if it ever occurred on Mars.”

The research was published in Nature Geoscience.

Further Reading: Brown University