Ever since Galileo pointed his telescope at Jupiter and saw moons in orbit around that planet, we have been forced to realize that we don’t occupy a central, important place in the Universe. In 2013, a study showed that we may be further out in the boondocks than we imagined. Now, a new study confirms it: we live in a void in the filamental structure of the Universe, a void that is bigger than we thought.
In 2013, a study by University of Wisconsin–Madison astronomer Amy Barger and her student Ryan Keenan showed that our Milky Way galaxy is situated in a large void in the cosmic structure. The void contains far fewer galaxies, stars, and planets than we thought. Now, a new study from University of Wisconsin student Ben Hoscheit confirms it, and at the same time eases some of the tension between different measurements of the Hubble Constant.
The void has a name: it’s called the KBC void, for Keenan, Barger and the University of Hawaii’s Lennox Cowie. With a radius of about 1 billion light years, the KBC void is seven times larger than the average void, and it is the largest void we know of.
The large-scale structure of the Universe consists of filaments and clusters of normal matter separated by voids, where there is very little matter. It’s been described as “Swiss cheese-like.” The filaments themselves are made up of galaxy clusters and superclusters, which are themselves made up of stars, gas, dust and planets. Finding out that we live in a void is interesting on its own, but it’s the implications it has for Hubble’s Constant that are even more interesting.
Hubble’s Constant is the rate at which objects move away from each other due to the expansion of the Universe: the farther apart two objects are, the faster they recede from each other. Dr. Brian Cox explains it in this short video.
The problem with Hubble’s Constant is that you get a different result depending on how you measure it. Obviously, this is a problem. “No matter what technique you use, you should get the same value for the expansion rate of the universe today,” explains Ben Hoscheit, the Wisconsin student who presented his analysis of the KBC void on June 6th at a meeting of the American Astronomical Society. “Fortunately, living in a void helps resolve this tension.”
There are a couple ways of measuring the expansion rate of the Universe, known as Hubble’s Constant. One way is to use what are known as “standard candles.” Supernovae are used as standard candles because their luminosity is so well-understood. By measuring their luminosity, we can determine how far away the galaxy they reside in is.
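To make the arithmetic concrete, here is a minimal sketch of the distance-modulus calculation that underlies standard candles. The magnitudes below are illustrative values, not data from any particular survey; the absolute magnitude of roughly -19.3 for Type Ia supernovae is the standard-candle assumption.

```python
def luminosity_distance_pc(apparent_mag, absolute_mag):
    """Invert the distance modulus: m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Type Ia supernovae peak at a well-calibrated absolute magnitude
# of roughly -19.3 -- that calibration is what makes them "standard".
M_SN_IA = -19.3

# Hypothetical observed peak brightness of a supernova:
m_observed = 18.2

d_pc = luminosity_distance_pc(m_observed, M_SN_IA)
# 1 parsec is about 3.2616 light years
print(f"Distance: {d_pc:.2e} pc (~{d_pc * 3.2616e-9:.1f} billion light years)")
```

The fainter the supernova appears relative to its known intrinsic brightness, the farther away its host galaxy must be.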
Another way is by measuring the CMB, the Cosmic Microwave Background. The CMB is the left over energy imprint from the Big Bang, and studying it tells us the state of expansion in the Universe.
The two methods can be compared. The standard candle approach measures more local distances, while the CMB approach measures large-scale distances. So how does living in a void help resolve the two?
Measurements from inside a void will be affected by the much larger amount of matter outside the void. The gravitational pull of all that matter will affect the measurements taken with the standard candle method. But that same matter, and its gravitational pull, will have no effect on the CMB method of measurement.
“One always wants to find consistency, or else there is a problem somewhere that needs to be resolved.” – Amy Barger, University of Hawaii, Dept. of Physics and Astronomy
According to Barger, the author of the 2013 study, Hoscheit’s new analysis shows that Keenan’s first estimations of the KBC void (which is shaped like a sphere, with a shell of increasing thickness made up of galaxies, stars and other matter) are not ruled out by other observational constraints.
“It is often really hard to find consistent solutions between many different observations,” says Barger, an observational cosmologist who also holds an affiliate graduate appointment at the University of Hawaii’s Department of Physics and Astronomy. “What Ben has shown is that the density profile that Keenan measured is consistent with cosmological observables. One always wants to find consistency, or else there is a problem somewhere that needs to be resolved.”
We humans have an insatiable hunger to understand the Universe. As Carl Sagan said, “Understanding is Ecstasy.” But to understand the Universe, we need better and better ways to observe it. And that means one thing: big, huge, enormous telescopes.
In this series, we’ll look at the world’s upcoming Super Telescopes. This installment covers WFIRST, the Wide-Field Infrared Survey Telescope.
It’s easy to forget the impact that the Hubble Space Telescope has had on our state of knowledge about the Universe. In fact, that might be the best measurement of its success: We take the Hubble, and all we’ve learned from it, for granted now. But other space telescopes are being developed, including the WFIRST, which will be much more powerful than the Hubble. How far will these telescopes extend our understanding of the Universe?
“WFIRST has the potential to open our eyes to the wonders of the universe, much the same way Hubble has.” – John Grunsfeld, NASA Science Mission Directorate
The WFIRST might be the true successor to the Hubble, even though the James Webb Space Telescope (JWST) is often touted as such. But it may be incorrect to even call WFIRST a telescope; it’s more accurate to call it an astrophysics observatory. That’s because one of its primary science objectives is to study Dark Energy, that rather mysterious force that drives the expansion of the Universe, and Dark Matter, the difficult-to-detect matter that slows that expansion.
WFIRST will have a 2.4 meter mirror, the same size as Hubble’s. But it will have a camera that expands the power of that mirror. The Wide Field Instrument is a 288-megapixel multi-band near-infrared camera. Once it’s in operation, it will capture images that are every bit as sharp as those from Hubble. But there is one huge difference: the Wide Field Instrument will capture images that cover over 100 times the area of sky that Hubble’s do.
Alongside the Wide Field Instrument, WFIRST will have the Coronagraphic Instrument. The Coronagraphic Instrument will advance the study of exoplanets. It’ll use a system of filters and masks to block out the light from other stars, and home in on planets orbiting those stars. This will allow very detailed study of the atmospheres of exoplanets, one of the main ways of determining habitability.
WFIRST is slated to be launched in 2025, although it’s too soon to have an exact date. But when it launches, the plan is for WFIRST to travel to the Sun-Earth Lagrange Point 2 (L2). L2 is a gravitationally balanced point in space where WFIRST can do its work without interruption. The mission is set to last about 6 years.
Probing Dark Energy
“WFIRST has the potential to open our eyes to the wonders of the universe, much the same way Hubble has,” said John Grunsfeld, astronaut and associate administrator of NASA’s Science Mission Directorate at Headquarters in Washington. “This mission uniquely combines the ability to discover and characterize planets beyond our own solar system with the sensitivity and optics to look wide and deep into the universe in a quest to unravel the mysteries of dark energy and dark matter.”
In a nutshell, there are two proposals for what Dark Energy can be. The first is the cosmological constant, where Dark Energy is uniform throughout the cosmos. The second is what’s known as scalar fields, where the density of Dark Energy can vary in time and space.
Since the 1990s, observations have shown us that the expansion of the Universe is accelerating. That acceleration started about 5 billion years ago. We think that Dark Energy is responsible for that accelerated expansion. By providing such large, detailed images of the cosmos, WFIRST will let astronomers map expansion over time and over large areas. WFIRST will also precisely measure the shapes, positions and distances of millions of galaxies to track the distribution and growth of cosmic structures, including galaxy clusters and the Dark Matter accompanying them. The hope is that this will give us a next level of understanding when it comes to Dark Energy.
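As a rough illustration of where the “about 5 billion years ago” figure comes from, one can ask when dark energy began to dominate the deceleration in a flat model universe. This is a sketch using round, assumed density parameters (matter 0.3, dark energy 0.7) and an assumed Hubble Constant of 70 km/s/Mpc, not WFIRST data:

```python
from scipy.integrate import quad

Om, OL = 0.3, 0.7   # assumed matter and dark-energy density parameters
H0 = 70.0           # assumed Hubble Constant, km/s/Mpc

# In a flat universe with a cosmological constant, deceleration turns
# into acceleration when Om * (1+z)^3 = 2 * OL:
z_accel = (2 * OL / Om) ** (1 / 3) - 1   # ~0.67

# Lookback time to that redshift, integrating 1 / ((1+z) * H(z)):
H0_per_gyr = H0 / 3.0857e19 * 3.156e16   # convert km/s/Mpc to 1/Gyr
E = lambda z: (Om * (1 + z) ** 3 + OL) ** 0.5
t, _ = quad(lambda z: 1 / ((1 + z) * E(z)), 0, z_accel)

print(f"Acceleration began at z ~ {z_accel:.2f}, "
      f"roughly {t / H0_per_gyr:.0f} billion years ago")
```

With these round numbers the onset comes out at roughly six billion years ago, in the same ballpark as the figure quoted above; the exact value depends on the cosmological parameters assumed.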
If that all sounds too complicated, look at it this way: We know the Universe is expanding, and we know that the expansion is accelerating. We want to know why it’s expanding, and how. We’ve given the name ‘Dark Energy’ to the force that’s driving that expansion, and now we want to know more about it.
Probing Exoplanets
Dark Energy and the expansion of the Universe make up a huge mystery, and a question that drives cosmologists. (They really want to know how the Universe will end!) But for many of the rest of us, another question is even more compelling: Are we alone in the Universe?
There’ll be no quick answer to that one, but any answer we find begins with studying exoplanets, and that’s something that WFIRST will also excel at.
“WFIRST is designed to address science areas identified as top priorities by the astronomical community,” said Paul Hertz, director of NASA’s Astrophysics Division in Washington. “The Wide-Field Instrument will give the telescope the ability to capture a single image with the depth and quality of Hubble, but covering 100 times the area. The coronagraph will provide revolutionary science, capturing the faint, but direct images of distant gaseous worlds and super-Earths.”
“The coronagraph will provide revolutionary science, capturing the faint, but direct images of distant gaseous worlds and super-Earths.” – Paul Hertz, NASA Astrophysics Division
The difficulty in studying exoplanets is that they are all orbiting stars. Stars are so bright they make it impossible to see their planets in any detail. It’s like staring into a lighthouse miles away and trying to study an insect near the lighthouse.
The Coronagraphic Instrument on board WFIRST will excel at blocking out the light of distant stars. It does that with a system of mirrors and masks. This is what makes studying exoplanets possible. Only when the light from the star is dealt with, can the properties of exoplanets be examined.
This will allow detailed measurements of the chemical composition of an exoplanet’s atmosphere. By doing this over thousands of planets, we can begin to understand the formation of planets around different types of stars. There are some limitations to the Coronagraphic Instrument, though.
The Coronagraphic Instrument was kind of a late addition to WFIRST. Some of the other instrumentation on WFIRST isn’t optimized to work with it, so there are some restrictions to its operation. It will only be able to study gas giants, and so-called Super-Earths. These larger planets don’t require as much finesse to study, simply because of their size. Earth-like worlds will likely be beyond the power of the Coronagraphic Instrument.
These limitations are no big deal in the long run. The Coronagraph is actually more of a technology demonstration, and it doesn’t represent the end-game for exoplanet study. Whatever is learned from this instrument will help us in the future. There will be an eventual successor to WFIRST some day, perhaps decades from now, and by that time Coronagraph technology will have advanced a great deal. At that future time, direct snapshots of Earth-like exoplanets may well be possible.
But maybe we won’t have to wait that long.
Starshade To The Rescue?
There is a plan to boost the effectiveness of the Coronagraph on WFIRST that would allow it to image Earth-like planets. It’s called the EXO-S Starshade.
The EXO-S Starshade is a 34-meter-diameter deployable shading system that will block starlight from impairing the function of WFIRST. It would actually be a separate craft, launched separately and sent on its way to rendezvous with WFIRST at L2. It would not be tethered, but would orient itself with WFIRST through a system of cameras and guide lights. In fact, part of the power of the Starshade is that it would be about 40,000 to 50,000 km away from WFIRST.
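A quick back-of-the-envelope sketch, using only the figures quoted above, shows why the separation has to be so enormous: at 40,000 km, a 34 m shade shrinks to a fraction of an arcsecond on the sky, small enough to cover a star without covering the planets around it.

```python
shade_diameter_m = 34.0    # Starshade diameter, from the article
separation_m = 40_000e3    # lower end of the quoted 40,000-50,000 km range

# Small-angle approximation: angle (radians) = size / distance
theta_rad = shade_diameter_m / separation_m
theta_arcsec = theta_rad * 206_265   # radians to arcseconds

print(f"The shade subtends ~{theta_arcsec:.2f} arcseconds")  # ~0.18 arcsec
```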
Dark Energy and Exoplanets are priorities for WFIRST, but there are always other discoveries awaiting better telescopes. It’s not possible to predict everything that we’ll learn from WFIRST. With images as detailed as Hubble’s, but 100 times larger, we’re in for some surprises.
“This mission will survey the universe to find the most interesting objects out there.” – Neil Gehrels, WFIRST Project Scientist
“In addition to its exciting capabilities for dark energy and exoplanets, WFIRST will provide a treasure trove of exquisite data for all astronomers,” said Neil Gehrels, WFIRST project scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “This mission will survey the universe to find the most interesting objects out there.”
With all of the Super Telescopes coming on line in the next few years, we can expect some amazing discoveries. In 10 to 20 years’ time, our knowledge will have advanced considerably. What will we learn about Dark Matter and Dark Energy? What will we know about exoplanet populations?
Right now it seems like we’re just groping towards a better understanding of these things, but with WFIRST and the other Super Telescopes, we’re poised for more purposeful study.
Since the 1960s, astronomers have been aware of the electromagnetic background radiation that pervades the Universe. Known as the Cosmic Microwave Background, this radiation is the oldest light in the Universe and what is left over from the Big Bang. By 2004, astronomers also became aware that a large region within the CMB appeared to be colder than its surroundings.
Known as the “CMB Cold Spot”, scientists have puzzled over this anomaly for years, with explanations ranging from a data artifact to it being caused by a supervoid. According to a new study conducted by a team of scientists from Durham University, the presence of a supervoid has been ruled out. This conclusion once again opens the door to more exotic explanations – like the existence of a parallel Universe!
The Cold Spot is one of several anomalies that astronomers have been studying since the first maps of the CMB were created using data from the Wilkinson Microwave Anisotropy Probe (WMAP). These anomalies are regions in the CMB that fall beneath the average background temperature of 2.73 degrees above absolute zero (-270.42 °C; -454.76 °F). In the case of the Cold Spot, the area is just 0.00015° colder than its surroundings.
And yet, this temperature difference is enough that the Cold Spot has become something of a thorn in the side of standard models of cosmology. Previously, the smart money appeared to be on it being caused by a supervoid – an area of space measuring billions of light years across which contains few galaxies. To test this theory, the Durham team conducted a survey of the galaxies in the region.
They did so by measuring redshifts. This technique, which measures the extent to which visible light coming from an object is shifted towards the red end of the spectrum, has been the standard method for determining the distance to other galaxies for over a century. For the sake of their study, the Durham team relied on data from the Anglo-Australian Telescope to conduct a survey where they measured the redshifts of 7,000 nearby galaxies.
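For nearby galaxies like the ones in this survey, converting a measured redshift into a distance is essentially an application of Hubble’s Law. Here is a minimal sketch with an assumed Hubble Constant and a hypothetical redshift; the survey’s actual analysis would use a full cosmological model:

```python
C_KM_S = 299_792.458   # speed of light in km/s
H0 = 70.0              # assumed Hubble Constant, km/s/Mpc

def hubble_distance_mpc(z):
    """Low-redshift approximation: recession velocity v = c*z, d = v / H0."""
    return C_KM_S * z / H0

# Hypothetical redshift for a 'nearby' survey galaxy:
z = 0.05
print(f"z = {z} corresponds to ~{hubble_distance_mpc(z):.0f} Mpc")  # ~214 Mpc
```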
Based on this high-fidelity dataset, the researchers found no evidence that the Cold Spot corresponded to a relative lack of galaxies. In other words, there was no indication that the region is a supervoid. The results of their study will be published in the Monthly Notices of the Royal Astronomical Society (MNRAS) under the title “Evidence Against a Supervoid Causing the CMB Cold Spot”.
As Ruari Mackenzie – a postgraduate student in the Dept. of Physics at Durham University, a member of the Centre for Extragalactic Astronomy, and the lead author on the paper – explained in an RAS press release:
“The voids we have detected cannot explain the Cold Spot under standard cosmology. There is the possibility that some non-standard model could be proposed to link the two in the future but our data place powerful constraints on any attempt to do that.”
Specifically, the Durham team found that the Cold Spot region could be split into smaller voids, each of which was surrounded by clusters of galaxies. This distribution was consistent with that of a control field chosen for the study, both fields exhibiting the same “soap bubble” structure. The question therefore arises: if the Cold Spot is not the result of a void or a relative lack of galaxies, what is causing it?
This is where the more exotic explanations come in, which emphasize that the Cold Spot may be due to something that exists outside the standard model of cosmology. As Tom Shanks, a Professor with the Dept. of Physics at Durham and a co-author of the study, explained:
“Perhaps the most exciting of these is that the Cold Spot was caused by a collision between our universe and another bubble Universe. If further, more detailed, analysis of CMB data proves this to be the case then the Cold Spot might be taken as the first evidence for the multiverse – and billions of other Universes may exist like our own.”
Multiverse Theory – a term coined by philosopher and psychologist William James – states that there may be multiple, or even an infinite number of, Universes that exist parallel to our own. Together, these Universes comprise the entirety of existence and all cosmological phenomena – i.e. space, time, matter, energy, and all of the physical laws that bind them.
Whereas it is often treated as a philosophical concept, the theory arose in part from the study of cosmological phenomena, like black holes, and from problems arising from the Big Bang Theory. In addition, variations on multiverse theory have been suggested as potential resolutions to theories that go beyond the Standard Model of particle physics – such as String Theory and M-theory.
Another variation – the Many-Worlds interpretation – has also been offered as a possible resolution to the problem of wavefunction collapse in quantum mechanics. Essentially, it states that all possible outcomes of a quantum measurement are realized in alternate universes, and there really is no such thing as “wavefunction collapse”. Could it therefore be argued that an alternate or parallel Universe is too close to our own, and thus responsible for the anomalies we see in the CMB?
As explanations go, it certainly is exciting, if perhaps a bit fantastic. And the Durham team is not prepared to rule out that the Cold Spot could be the result of fluctuations that can be explained by the standard model of cosmology. Right now, the only thing that can be said definitively is that the Cold Spot cannot be explained by something as straightforward as a supervoid and the absence of galaxies.
And in the meantime, additional surveys and experiments need to be conducted. Otherwise, this mystery may become a real sticking point for cosmology!
Dark matter is mysterious stuff, because we can’t really “see” it. But that hasn’t stopped scientists from researching it, and from theorizing about it. One theory says that there should be filament structures of dark matter connecting galaxies. Scientists from the University of Waterloo have now imaged one of those dark matter filaments for the first time.
Theory predicts that filaments of dark matter connect galaxies together, reaching from the dark matter halo of one galaxy to the halo of another. Other researchers have found dark matter filaments connecting entire galaxy clusters, but this is the first time that filaments have been imaged between individual galaxies.
“This image moves us beyond predictions to something we can see and measure.” – Mike Hudson, University of Waterloo
“For decades, researchers have been predicting the existence of dark-matter filaments between galaxies that act like a web-like superstructure connecting galaxies together,” said Mike Hudson, a professor of astronomy at the University of Waterloo. “This image moves us beyond predictions to something we can see and measure.”
Dark matter makes up about 25% of the Universe. But it doesn’t shine, reflect, or interact with light in any way, so it’s difficult to study. The only way we can really study it is by observing gravity. In this study, the pair of astronomers used the weak gravitational lensing technique.
Weak gravitational lensing relies on the effect that mass has on light. Enough concentrated mass in the foreground—dark matter in this case—will warp light from distant sources in the background.
When dealing with something as massive as a supermassive black hole, gravitational lensing is quite pronounced. But galaxy-to-galaxy filaments of dark matter are much less dense than a black hole, so their individual effect is minimal. What the astronomers needed was the combined data from multiple galaxy pairs in order to detect the weak gravitational lensing.
Key to this work was the Canada-France-Hawaii Telescope, whose multi-year sky survey laid the groundwork for the study. The researchers combined lensing images of over 23,000 pairs of galaxies 4.5 billion light years away. The resulting composite image revealed the filament bridges between the paired galaxies.
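The reason so many pairs were needed is statistical: the lensing distortion from any single filament is buried in noise, but averaging many aligned pairs beats the noise down by a factor of the square root of N. The toy numbers below simply illustrate that scaling; they are not the survey’s actual signal or noise levels:

```python
import numpy as np

rng = np.random.default_rng(42)

signal = 0.01       # hypothetical per-pair distortion (toy value)
noise_sigma = 0.3   # per-pair noise, swamping the signal 30-to-1
n_pairs = 23_000

# Each pair contributes signal + noise; stacking takes the average.
measurements = signal + rng.normal(0.0, noise_sigma, size=n_pairs)
stacked = measurements.mean()
stacked_error = noise_sigma / np.sqrt(n_pairs)   # noise shrinks as 1/sqrt(N)

print(f"Stacked estimate: {stacked:.4f} +/- {stacked_error:.4f}")
# The averaged error (~0.002) is now well below the 0.01 signal,
# turning an invisible effect into a clear detection.
```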
“By using this technique, we’re not only able to see that these dark matter filaments in the universe exist, we’re able to see the extent to which these filaments connect galaxies together.” – Seth D. Epps, University of Waterloo
We still don’t know what dark matter is, but the fact that scientists were able to predict these filaments, and then actually find them, shows that we’re making progress understanding it.
We’ve known about the large scale structure of the Universe for some time, and we know that dark matter is a big part of it. Galaxies tend to cluster together, under the influence of dark matter’s gravitational pull. Finding a dark matter bridge between galaxies is an intriguing discovery. It at least takes a little of the mystery out of dark matter.
Over the past decades, scientists have wrestled with a problem involving the Big Bang Theory. The Big Bang Theory suggests that there should be three times as much lithium as we can observe. Why is there such a discrepancy between prediction and observation?
To get into that problem, let’s back up a bit.
The Big Bang Theory (BBT) is well-supported by multiple lines of evidence and theory. It’s widely accepted as the explanation for how the Universe started. Three key pieces of evidence support the BBT:
- the observed expansion of the Universe, in accordance with Hubble’s Law;
- the discovery and measurement of the Cosmic Microwave Background;
- rough agreement between calculations and observations of the abundance of primordial light nuclei (Do NOT attempt to say this three times in rapid succession!)
But the BBT still has some niggling questions.
The missing lithium problem is centred around the earliest stages of the Universe: from about 10 seconds to 20 minutes after the Big Bang. The Universe was super hot and it was expanding rapidly. This was the beginning of what’s called the Photon Epoch.
At that time, atomic nuclei formed through nucleosynthesis. But the extreme heat that dominated the Universe prevented the nuclei from combining with electrons to form atoms. The Universe was a plasma of nuclei, electrons, and photons.
Only the lightest nuclei were formed during this time, including most of the helium in the Universe, and small amounts of other light nuclides, like deuterium and our friend lithium. For the most part, heavier elements weren’t formed until stars appeared, and took on the role of nucleosynthesis.
The problem is that our understanding of the Big Bang tells us that there should be three times as much lithium as there is. The BBT gets it right when it comes to other primordial nuclei. Our observations of primordial helium and deuterium match the BBT’s predictions. So far, scientists haven’t been able to resolve this inconsistency.
But a new paper from researchers in China may have solved the puzzle.
One assumption in Big Bang nucleosynthesis is that all of the nuclei are in thermodynamic equilibrium, and that their velocities conform to what’s called the classical Maxwell-Boltzmann distribution. But the Maxwell-Boltzmann distribution describes what happens in what is called an ideal gas. Real gases can behave differently, and this is what the researchers propose: that nuclei in the plasma of the early Photon Epoch behaved slightly differently than thought.
The authors applied what is known as non-extensive statistics to solve the problem. In the paper’s key graph, the dotted lines of the authors’ model predict a lower abundance of the beryllium isotope. This is key, since beryllium decays into lithium. Also key is that the resulting amounts of lithium, and of the other lighter nuclei, now all conform to the abundances we actually observe. It’s a eureka moment for cosmology aficionados.
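To see what a “slightly different” velocity distribution means in practice, here is a sketch comparing the classical Maxwell-Boltzmann speed distribution with a Tsallis-style q-generalized one, the usual form used in non-extensive statistics. The value of q below is an illustrative choice, not the paper’s fitted value:

```python
import numpy as np

def maxwell_boltzmann(v, m=1.0, kT=1.0):
    """Classical MB speed distribution (normalization constants dropped)."""
    return v**2 * np.exp(-m * v**2 / (2 * kT))

def tsallis(v, m=1.0, kT=1.0, q=1.08):
    """q-generalized form: exp(-x) -> [1 + (q-1)x]^(-1/(q-1)), for q > 1.

    The q -> 1 limit recovers the classical exponential.
    """
    x = m * v**2 / (2 * kT)
    return v**2 * (1 + (q - 1) * x) ** (-1 / (q - 1))

# Near typical thermal speeds the two distributions nearly coincide...
print(f"{tsallis(1.0) / maxwell_boltzmann(1.0):.2f}")   # ~1.01
# ...but the q > 1 tail is heavier, changing the rates of the nuclear
# reactions that create and destroy nuclei such as beryllium-7.
print(f"{tsallis(4.0) / maxwell_boltzmann(4.0):.1f}")   # ~6x at v = 4
```

Even a tiny departure from q = 1 leaves the bulk of the distribution untouched while substantially reshaping the high-speed tail, which is exactly where the relevant nuclear reactions happen.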
What this all means is that scientists can now accurately predict the abundances in the primordial universe of the three primordial nuclei: helium, deuterium, and lithium – without any discrepancy, and without any missing lithium.
This is how science grinds away at problems, and if the authors of the paper are correct, then it further validates the Big Bang Theory, and brings us one step closer to understanding how our Universe was formed.
Let’s be honest. Dark matter’s a pain in the butt. Astronomers have gone to great lengths to explain why it must exist and exist in huge quantities, yet it remains hidden. Unknown. Emitting no visible energy yet apparently strong enough to keep galaxies in clusters from busting free like wild horses, it’s everywhere in vast quantities. What is the stuff – axions, WIMPs, gravitinos, Kaluza-Klein particles?
It’s estimated that dark matter accounts for 27% of the mass-energy content of the universe, while ordinary matter – everything from PB&J sandwiches to quasars – accounts for just 4.9%. But a new theory of gravity proposed by theoretical physicist Erik Verlinde of the University of Amsterdam offers a way to dispense with the pesky stuff.
Unlike the traditional view of gravity as a fundamental force of nature, Verlinde sees it as an emergent property of space. Emergence is a process where nature builds something large using small, simple pieces such that the final creation exhibits properties that the smaller bits don’t. Take a snowflake. The complex symmetry of a snowflake begins when a water droplet freezes onto a tiny dust particle. As the growing flake falls, water vapor freezes onto this original crystal, naturally arranging itself into a hexagonal (six-sided) structure of great beauty. The sensation of temperature is another emergent phenomenon, arising from the motion of molecules and atoms.
So too with gravity, which according to Verlinde, emerges from entropy. We all know about entropy and messy bedrooms, but it’s a bit more subtle than that. Entropy is a measure of disorder in a system or put another way, the number of different microscopic states a system can be in. One of the coolest descriptions of entropy I’ve heard has to do with the heat our bodies radiate. As that energy dissipates in the air, it creates a more disordered state around us while at the same time decreasing our own personal entropy to ensure our survival. If we didn’t get rid of body heat, we would eventually become disorganized (overheat!) and die.
Emergent or entropic gravity, as the new theory is called, predicts the exact same deviation in the rotation rates of stars in galaxies currently attributed to dark matter. Gravity emerges in Verlinde’s view from changes in fundamental bits of information stored in the structure of space-time, that four-dimensional continuum revealed by Einstein’s general theory of relativity. In a word, gravity is a consequence of entropy and not a fundamental force.
Space-time, composed of the three familiar dimensions of space plus time, is flexible. Mass warps the 4-D fabric into hills and valleys that direct the motion of smaller objects nearby. The Sun doesn’t so much “pull” on the Earth as envisaged by Isaac Newton but creates a great pucker in space-time that Earth rolls around in.
In a 2010 article, Verlinde showed how Newton’s law of gravity, which describes everything from how apples fall from trees to little galaxies orbiting big galaxies, derives from these underlying microscopic building blocks.
His latest paper, titled Emergent Gravity and the Dark Universe, delves into dark energy’s contribution to the mix. The entropy associated with dark energy, a still-unknown form of energy responsible for the accelerating expansion of the universe, turns the geometry of spacetime into an elastic medium.
“We find that the elastic response of this ‘dark energy’ medium takes the form of an extra ‘dark’ gravitational force that appears to be due to ‘dark matter’,” writes Verlinde. “So the observed dark matter phenomena is a remnant, a memory effect, of the emergence of spacetime together with the ordinary matter in it.”
I’ll be the first one to say how complex Verlinde’s concept is, wrapped in arcane entanglement entropy, tensor fields and the holographic principle, but the basic idea, that gravity is not a fundamental force, makes for a fascinating new way to look at an old face.
Physicists have tried for decades to reconcile gravity with quantum physics, with little success. And while Verlinde’s theory should rightly be taken with a grain of salt, he may offer a way to combine the two disciplines into a single narrative that describes how everything from falling apples to black holes is connected in one coherent theory.
Under Mount Ikeno, Japan, in an old mine that sits one-thousand meters (3,300 feet) beneath the surface, lies the Super-Kamiokande Observatory (SKO). Since 1996, when it began conducting observations, researchers have been using this facility’s Cherenkov detector to look for signs of proton decay and neutrinos in our galaxy. This is no easy task, since neutrinos are very difficult to detect.
But thanks to a new computer system that will be able to monitor neutrinos in real-time, the researchers at the SKO will be able to research these mysterious particles more closely in the near future. In so doing, they hope to understand how stars form and eventually collapse into black holes, and sneak a peek at how matter was created in the early Universe.
Neutrinos, put simply, are one of the fundamental particles that make up the Universe. Compared to other fundamental particles, they have very little mass, no charge, and only interact with other types of particles via the weak nuclear force and gravity. They are created in a number of ways, most notably through radioactive decay, the nuclear reactions that power a star, and in supernovae.
In accordance with the standard Big Bang model, the neutrinos left over from the creation of the Universe are the most abundant particles in existence. At any given moment, trillions of these particles are believed to be moving around us and through us. But because of the way they interact with matter (i.e. only weakly) they are extremely difficult to detect.
For this reason, neutrino observatories are built deep underground to avoid interference from cosmic rays. They also rely on Cherenkov detectors, which are essentially massive water tanks that have thousands of sensors lining their walls. These detect particles by the faint glow – known as Cherenkov radiation – that charged particles emit when they travel through the water faster than the local speed of light (i.e. the speed of light in water).
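The phrase “local speed of light” translates into a concrete energy threshold for Cherenkov light. Here is a sketch of that threshold for an electron in water, assuming the standard refractive index of about 1.33 (the exact value for the detector’s ultra-pure water may differ slightly):

```python
import math

N_WATER = 1.33    # approximate refractive index of water
M_E_MEV = 0.511   # electron rest energy, MeV

# Cherenkov light is emitted only while the particle moves faster than
# light does in the medium, i.e. when beta = v/c exceeds 1/n.
beta_min = 1.0 / N_WATER
gamma_min = 1.0 / math.sqrt(1.0 - beta_min**2)

E_min = gamma_min * M_E_MEV
print(f"Electron Cherenkov threshold in water: ~{E_min:.2f} MeV")  # ~0.78 MeV
```

Only interaction products above this energy produce the glow the photomultiplier tubes can see, which is part of what makes low-energy neutrino events so hard to catch.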
The detector at the SKO is currently the largest in the world. It consists of a cylindrical stainless steel tank that is 41.4 m (136 ft) tall and 39.3 m (129 ft) in diameter, and holds some 50,000 metric tons (55,000 US tons) of ultra-pure water. In the interior, 11,146 photomultiplier tubes are mounted, which detect light in the ultraviolet, visible, and near-infrared ranges of the electromagnetic spectrum with extreme sensitivity.
For years, researchers at the SKO have used the facility to examine solar neutrinos, atmospheric neutrinos and man-made neutrinos. However, those that are created by supernovae are very difficult to detect, since they appear suddenly and are difficult to distinguish from other kinds. But with the newly-added computer system, the Super-Kamiokande researchers are hoping that will change.
As Luis Labarga, a physicist involved with the project, explained:
“Supernova explosions are one of the most energetic phenomena in the universe and most of this energy is released in the form of neutrinos. This is why detecting and analyzing neutrinos emitted in these cases, other than those from the Sun or other sources, is very important for understanding the mechanisms in the formation of neutron stars – a type of stellar remnant – and black holes.”
Basically, the new computer system is designed to analyze the events recorded in the depths of the observatory in real-time. If it detects an abnormally large flow of neutrinos, it will quickly alert the experts manning the controls. They will then be able to assess the significance of the signal within minutes and see if it is actually coming from a nearby supernova.
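The experiment’s actual monitoring software is not public, but the core idea is a simple rate-spike alarm: compare the event count in a short window against the expected background and flag statistically improbable excesses. Here is a hypothetical sketch of that logic, not the SKO’s real code:

```python
import math

BASELINE_RATE = 0.5   # assumed background events per second (toy value)
ALERT_SIGMA = 5.0     # how unusual a burst must be to trigger an alert

def is_supernova_candidate(counts, window_s):
    """Flag a time window whose count is far above the Poisson background."""
    expected = BASELINE_RATE * window_s
    # Gaussian approximation to the Poisson significance:
    significance = (counts - expected) / math.sqrt(expected)
    return significance > ALERT_SIGMA

print(is_supernova_candidate(40, 10.0))  # True: ~15 sigma, alert the experts
print(is_supernova_candidate(6, 10.0))   # False: ordinary background wobble
```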
“During supernova explosions an enormous number of neutrinos is generated in an extremely small space of time – a few seconds – and this why we need to be ready,” Labarga added. “This allows us to research the fundamental properties of these fascinating particles, such as their interactions, their hierarchy and the absolute value of their mass, their half-life, and surely other properties that we still cannot even imagine.”
Equally as important is the fact that this system will give the SKO the ability to issue early warnings to research centers around the world. Ground-based observatories, where astronomers are keen to watch the creation of cosmic neutrinos by supernovae, will then be able to point all of their optical instruments towards the source in advance (since the electromagnetic signal will take longer to arrive).
Through this collaborative effort, astrophysicists may be able to better understand some of the most elusive neutrinos of all. Discerning how these fundamental particles interact with others could bring us one step closer to a Grand Unified Theory – one of the major goals of the Super-Kamiokande Observatory.
To date, only a few neutrino detectors exist in the world. These include the Irvine-Michigan-Brookhaven (IMB) detector in Ohio, the Sudbury Neutrino Observatory (SNOLAB) in Ontario, Canada, and the Super-Kamiokande Observatory in Japan.
Ever since human beings learned that the Milky Way was not unique or alone in the night sky, astronomers and cosmologists have sought to find out just how many galaxies there are in the Universe. And until recently, our greatest scientific minds believed they had a pretty good idea – between 100 and 200 billion.
However, a new study produced by researchers from the UK has revealed something startling about the Universe. Using Hubble’s Deep Field Images and data from other telescopes, they have concluded that these previous estimates were off by a factor of about 10. The Universe, as it turns out, may have had up to 2 trillion galaxies in it during the course of its history.
Led by Prof. Christopher Conselice of the University of Nottingham, U.K., the team combined images taken by the Hubble Space Telescope with other published data to produce a 3-D map of the Universe. They then incorporated a series of new mathematical models that allowed them to infer the existence of galaxies which are not bright enough to be observed by current instruments.
Using these, they then began reviewing how galaxies have evolved over the past 13 billion years. What they learned was quite fascinating. For one, they observed that the distribution of galaxies throughout the history of the Universe was not even. What’s more, they found that in order for everything in their calculations to add up, there had to be 10 times more galaxies in the early Universe than previously thought.
Most of these galaxies would be similar in mass to the satellite galaxies that have been observed around the Milky Way, and would be too faint to be spotted by today’s instruments. In other words, astronomers have only been able to see about 10% of the early Universe until now, because most of its galaxies were too small and faint to be visible.
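The “mathematical models” in question are, at heart, galaxy mass functions. A standard way to estimate how many galaxies are too faint to see is to integrate a Schechter-type function down to low masses, where the steep faint-end slope makes dwarf galaxies dominate the count. The parameter values below are illustrative round numbers, not the study’s actual fits:

```python
import numpy as np
from scipy.integrate import quad

phi_star = 1e-3   # normalization, galaxies per Mpc^3 (illustrative)
M_star = 1e11     # characteristic stellar mass, solar masses
alpha = -1.9      # faint-end slope; steeper means more dwarfs

def schechter(M):
    """Schechter mass function: phi* (M/M*)^alpha exp(-M/M*) / M*."""
    x = M / M_star
    return (phi_star / M_star) * x**alpha * np.exp(-x)

def n_density(M_min, M_max):
    """Integrate in log-mass for numerical stability over wide ranges."""
    n, _ = quad(lambda lnM: schechter(np.exp(lnM)) * np.exp(lnM),
                np.log(M_min), np.log(M_max))
    return n

n_bright = n_density(1e9, 1e13)   # galaxies massive enough to detect
n_all = n_density(1e8, 1e13)      # include unseen dwarf galaxies

print(f"All/bright ratio: ~{n_all / n_bright:.0f}x")  # roughly an order of magnitude
```

With a faint-end slope this steep, lowering the mass cutoff by just one decade multiplies the galaxy count severalfold, which conveys the flavor of the “ten times more galaxies” result.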
As Prof. Conselice explained in a Hubble Science Release, this may help resolve a lingering debate about the structure of the Universe:
“These results are powerful evidence that a significant galaxy evolution has taken place throughout the universe’s history, which dramatically reduced the number of galaxies through mergers between them — thus reducing their total number. This gives us a verification of the so-called top-down formation of structure in the universe.”
To break it down, the “top-down model” of galaxy formation states that galaxies formed from huge gas clouds larger than the resulting galaxies. These clouds began collapsing because their internal gravity was stronger than the pressures in the cloud. Based on the speed at which the gas clouds rotated, they would either form a spiral or an elliptical galaxy.
In contrast, the “bottom-up model” states that galaxies formed during the early Universe due to the merging of smaller clumps that were about the size of globular clusters. These galaxies could then have been drawn into clusters and superclusters by their mutual gravity.
In addition to helping to resolve this debate, this study also offers a possible solution to the Olbers’ Paradox (aka. “the dark night sky paradox”). Named after the 18th/19th century German astronomer Heinrich Wilhelm Olbers, this paradox addresses the question of why – given the expanse of the Universe and all the luminous matter in it – is the sky dark at night?
Based on their results, the UK team has surmised that while every point in the night sky contains part of a galaxy, most of them are invisible to the human eye and modern telescopes. This is due to a combination of factors, which includes the effects of cosmic redshift, the fact that the Universe is dynamic (i.e. always expanding) and the absorption of light by cosmic dust and gas.
Needless to say, future missions will be needed to confirm the existence of all these unseen galaxies. And in that respect, Conselice and his colleagues are looking to future missions – ones that are capable of observing stars and galaxies in the non-visible spectrum – to make that happen.
“It boggles the mind that over 90 percent of the galaxies in the universe have yet to be studied,” he added. “Who knows what interesting properties we will find when we discover these galaxies with future generations of telescopes? In the near future, the James Webb Space Telescope will be able to study these ultra-faint galaxies.”
Understanding how many galaxies have existed over time is a fundamental aspect of understanding the Universe as a whole. With every passing study that attempts to resolve what we can see with our current cosmological models, we are getting that much closer!
And be sure to enjoy this video about some of Hubble’s most stunning images, courtesy of HubbleESA:
Direction is something we humans are pretty accustomed to. Living in our friendly terrestrial environment, we are used to seeing things in terms of up and down, left and right, forwards or backwards. And to us, our frame of reference is fixed and doesn’t change, unless we move or are in the process of moving. But when it comes to cosmology, things get a little more complicated.
For a long time now, cosmologists have held the belief that the universe is homogeneous and isotropic – i.e. fundamentally the same in all directions. In this sense, there is no such thing as “up” or “down” when it comes to space, only points of reference that are entirely relative. And thanks to a new study by researchers from the University College London, that view has been shown to be correct.
For the sake of their study, titled “How isotropic is the Universe?”, the research team used survey data of the Cosmic Microwave Background (CMB) – the thermal radiation left over from the Big Bang. This data was obtained by the ESA’s Planck spacecraft between 2009 and 2013.
The team then analyzed it using a supercomputer to determine if there were any polarization patterns that would indicate if space has a “preferred direction” of expansion. The purpose of this test was to see if one of the basic assumptions that underlies the most widely-accepted cosmological model is in fact correct.
The first of these assumptions is that the Universe was created by the Big Bang, which is based on the discovery that the Universe is in a state of expansion, and the discovery of the Cosmic Microwave Background. The second assumption is that space is homogeneous and isotropic, meaning that there are no major differences in the distribution of matter over large scales.
This belief, which is also known as the Cosmological Principle, is based partly on the Copernican Principle (which states that Earth has no special place in the Universe) and Einstein’s Theory of Relativity – which demonstrated that the measurement of inertia in any system is relative to the observer.
This theory has always had its limitations, as matter is clearly not evenly distributed at smaller scales (i.e. star systems, galaxies, galaxy clusters, etc.). However, cosmologists have argued around this by saying that fluctuations on small scales are due to quantum fluctuations that occurred in the early Universe, and that the large-scale structure is one of homogeneity.
By looking for fluctuations in the oldest light in the Universe, scientists have been attempting to determine if this is in fact correct. In the past thirty years, these kinds of measurements have been performed by multiple missions, such as the Cosmic Background Explorer (COBE) mission, the Wilkinson Microwave Anisotropy Probe (WMAP), and the Planck spacecraft.
For the sake of their study, the UCL research team – led by Daniela Saadeh and Stephen Feeney – looked at things a little differently. Instead of searching for imbalances in the microwave background, they looked for signs that space could have a preferred direction of expansion, and how these might imprint themselves on the CMB.
As Daniela Saadeh – a PhD student at UCL and the lead author on the paper – told Universe Today via email:
“We analyzed the temperature and polarization of the cosmic microwave background (CMB), a relic radiation from the Big Bang, using data from the Planck mission. We compared the real CMB against our predictions for what it would look like in an anisotropic universe. After this search, we concluded that there is no evidence for these patterns and that the assumption that the Universe is isotropic on large scales is a good one.”
Basically, their results showed that there is only a 1 in 121,000 chance that the Universe is anisotropic. In other words, the evidence indicates that the Universe has been expanding in all directions uniformly, removing any doubt that there is any actual sense of direction on the large scale.
And in a way, this is a bit disappointing, since a Universe that is not homogeneous and the same in all directions would correspond to a different set of solutions to Einstein’s field equations. By themselves, these equations do not impose any symmetries on space-time, but the standard cosmological model built on them does accept homogeneity as a sort of given.
These solutions are known as the Bianchi models, which were proposed by Italian mathematician Luigi Bianchi in the late 19th century. These algebraic theories, which can be applied to three-dimensional spacetime, are obtained by being less restrictive, and thus allow for a Universe that is anisotropic.
On the other hand, the study performed by Saadeh, Feeney, and their colleagues has shown that one of the main assumptions that our current cosmological models rest on is indeed correct. In so doing, they have also provided a much-needed sense of closure to a long-term debate.
“In the last ten years there has been considerable discussion around whether there were signs of large-scale anisotropy lurking in the CMB,” said Saadeh. “If the Universe were anisotropic, we would need to revise many of our calculations about its history and content. Planck’s high-quality data came with a golden opportunity to perform this health check on the standard model of cosmology, and the good news is that it is safe.”
So the next time you find yourself looking up at the night sky, remember… that’s a luxury you have only while you’re standing on Earth. Out there, it’s a whole ‘nother ballgame! So enjoy this thing we call “direction” when and where you can.
And be sure to check out this animation produced by the UCL team, which illustrates the Planck mission’s CMB data:
For almost a century, astronomers and cosmologists have postulated that space is filled with an invisible mass known as “dark matter”. Accounting for 27% of the mass and energy in the observable universe, the existence of this matter was intended to explain all the “missing” mass in cosmological models. Unfortunately, the concept of dark matter has solved one cosmological problem, only to create another.
If this matter does exist, what is it made of? So far, theories have ranged from saying that it is made up of cold, warm or hot matter, with the most widely-accepted theory being the Lambda Cold Dark Matter (Lambda-CDM) model. However, a new study produced by a team of European astronomers suggests that the Warm Dark Matter (WDM) model may be able to explain the latest observations made of the early Universe.