A team of researchers from the University of Nebraska–Lincoln recently conducted an experiment where they were able to accelerate plasma electrons to close to the speed of light. This “optical rocket”, which pushed electrons at a force a trillion-trillion times greater than that generated by a conventional rocket, could have serious implications for everything from space travel to computing and nanotechnology.
Despite decades of ongoing research, scientists are still trying to understand how the four fundamental forces of the Universe fit together. Whereas quantum mechanics can explain how three of these forces work together on the smallest of scales (electromagnetism, and the weak and strong nuclear forces), General Relativity explains how things behave on the largest of scales (i.e. gravity). In this respect, gravity remains the holdout.
To understand how gravity interacts with matter on the tiniest of scales, scientists have developed some truly cutting-edge experiments. One of these is NASA’s Cold Atom Laboratory (CAL), located aboard the ISS, which recently achieved a milestone by creating clouds of atoms known as Bose-Einstein condensates (BECs). This was the first time that BECs have been created in orbit, and offers new opportunities to probe the laws of physics.
Originally predicted by Satyendra Nath Bose and Albert Einstein in the mid-1920s, BECs are essentially clouds of ultracold atoms chilled to temperatures just above absolute zero, the point at which atoms should (in theory) stop moving entirely. These particles are long-lived and precisely controlled, which makes them an ideal platform for studying quantum phenomena.
This is the purpose of the CAL facility: to study ultracold quantum gases in a microgravity environment. The laboratory was installed in the US Science Lab aboard the ISS in late May and is the first of its kind in space. It is designed to advance scientists’ ability to make precision measurements of gravity and study how it interacts with matter at the smallest of scales.
As Robert Thompson, the CAL project scientist and a physicist at NASA’s Jet Propulsion Laboratory, explained in a recent press release:
“Having a BEC experiment operating on the space station is a dream come true. It’s been a long, hard road to get here, but completely worth the struggle, because there’s so much we’re going to be able to do with this facility.”
About two weeks ago, CAL scientists confirmed that the facility had produced BECs from atoms of rubidium – a soft, silvery-white metallic element in the alkali group. According to their report, they had reached temperatures as low as 100 nanokelvin, one ten-millionth of a kelvin above absolute zero (-273 °C; -459 °F). This is colder than the average temperature of space, which is about 3 K (-270 °C; -454 °F).
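To put those numbers in perspective, here is a quick back-of-the-envelope check (the 2.73 K value for the background temperature of deep space is a standard, assumed figure, not taken from the NASA release):

```python
# Rough comparison of the reported BEC temperature with the temperature of deep space.
# The 2.73 K background value is an assumed, commonly quoted figure.
T_bec = 100e-9      # 100 nanokelvin, the temperature reported for the CAL condensates (K)
T_space = 2.73      # approximate temperature of the cosmic microwave background (K)

print(f"100 nK is {T_bec:.0e} K above absolute zero (one ten-millionth of a kelvin)")
print(f"That is about {T_space - T_bec:.2f} K colder than the average temperature of space")
```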
Because of their unique behavior, BECs are characterized as a fifth state of matter, distinct from gases, liquids, solids and plasma. In BECs, atoms act more like waves than particles on the macroscopic scale, whereas this behavior is usually only observable on the microscopic scale. In addition, the atoms all assume their lowest energy state and take on the same wave identity, making them indistinguishable from one another.
In short, the atom clouds begin to behave like a single “super atom” rather than individual atoms, which makes them easier to study. The first BECs were produced in a lab in 1995 by a science team consisting of Eric Cornell, Carl Wieman and Wolfgang Ketterle, who shared the 2001 Nobel Prize in Physics for their accomplishment. Since that time, hundreds of BEC experiments have been conducted on Earth and some have even been sent into space aboard sounding rockets.
But the CAL facility is unique in that it is the first of its kind on the ISS, where scientists can conduct daily studies over long periods. The facility consists of two standardized containers: the larger “quad locker” and the smaller “single locker”. The quad locker houses CAL’s physics package, the compartment where CAL will produce clouds of ultra-cold atoms.
This is done by using magnetic fields or focused lasers to create frictionless containers known as “atom traps”. As the atom cloud decompresses inside the atom trap, its temperature naturally drops, getting colder the longer it remains in the trap. On Earth, when these traps are turned off, gravity causes the atoms to begin moving again, which means they can only be studied for fractions of a second.
Aboard the ISS, which is a microgravity environment, BECs can decompress to colder temperatures than with any instrument on Earth and scientists are able to observe individual BECs for five to ten seconds at a time and repeat these measurements for up to six hours per day. And since the facility is controlled remotely from the Earth Orbiting Missions Operation Center at JPL, day-to-day operations require no intervention from astronauts aboard the station.
Robert Shotwell, the chief engineer of JPL’s astronomy and physics directorate, has overseen the project since February 2017. As he indicated in a recent NASA press release:
“CAL is an extremely complicated instrument. Typically, BEC experiments involve enough equipment to fill a room and require near-constant monitoring by scientists, whereas CAL is about the size of a small refrigerator and can be operated remotely from Earth. It was a struggle and required significant effort to overcome all the hurdles necessary to produce the sophisticated facility that’s operating on the space station today.”
Looking ahead, the CAL scientists want to go even further and achieve temperatures lower than anything achieved on Earth. In addition to rubidium, the CAL team is also working towards making BECs using two different isotopes of potassium atoms. At the moment, CAL is still in a commissioning phase, which consists of the operations team conducting a long series of tests to see how the CAL facility will operate in microgravity.
However, once it is up and running, five science groups – including groups led by Cornell and Ketterle – will conduct experiments at the facility during its first year. The science phase is expected to begin in early September and will last three years. As Kamal Oudrhiri, JPL’s mission manager for CAL, put it:
“There is a globe-spanning team of scientists ready and excited to use this facility. The diverse range of experiments they plan to perform means there are many techniques for manipulating and cooling the atoms that we need to adapt for microgravity, before we turn the instrument over to the principal investigators to begin science operations.”
Given time, the Cold Atom Lab (CAL) may help scientists to understand how gravity works on the tiniest of scales. Combined with high-energy experiments conducted by CERN and other particle physics laboratories around the world, this could eventually lead to a Theory of Everything (ToE) and a complete understanding of how the Universe works.
Neutron stars are famous for combining very high density with a very small radius. As the remnants of massive stars that have undergone gravitational collapse, their interiors are compressed to the point where the pressure conditions are similar to those of atomic nuclei. Basically, they become so dense that they experience the same amount of internal pressure as the equivalent of 2.6 to 4.1 quadrillion Suns!
In spite of that, neutron stars have nothing on protons, according to a recent study by scientists at the Department of Energy’s Thomas Jefferson National Accelerator Facility. After conducting the first measurement of the mechanical properties of subatomic particles, the scientific team determined that near the center of a proton, the pressure is about 10 times greater than the pressure in the heart of a neutron star.
The study that describes the team’s findings, titled “The pressure distribution inside the proton”, recently appeared in the scientific journal Nature. The study was led by Volker Burkert, a nuclear physicist at the Thomas Jefferson National Accelerator Facility (TJNAF), and co-authored by Latifa Elouadrhiri and Francois-Xavier Girod – also from the TJNAF.
Basically, they found that the pressure at the center of a proton is about 100 decillion pascals – roughly 10 times the pressure at the heart of a neutron star. However, they also found that the pressure inside the particle is not uniform, dropping off as the distance from the center increases. As Volker Burkert, the Jefferson Lab Hall B Leader, explained:
“We found an extremely high outward-directed pressure from the center of the proton, and a much lower and more extended inward-directed pressure near the proton’s periphery… Our results also shed light on the distribution of the strong force inside the proton. We are providing a way of visualizing the magnitude and distribution of the strong force inside the proton. This opens up an entirely new direction in nuclear and particle physics that can be explored in the future.”
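To put the pressures quoted above side by side, here is a rough order-of-magnitude check (the neutron-star figure below is an assumed round value of about 10^34 pascals, used only for illustration):

```python
# Order-of-magnitude comparison of the pressures discussed above.
proton_core_pressure = 1e35        # ~100 decillion pascals, reported for the proton's centre
neutron_star_core_pressure = 1e34  # assumed round figure for a neutron star's core (Pa)

ratio = proton_core_pressure / neutron_star_core_pressure
print(f"The proton's centre is roughly {ratio:.0f}x the pressure of a neutron star's core")
```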
Protons are composed of three quarks that are bound together by the strong nuclear force, one of the four fundamental forces that govern the Universe – the others being electromagnetism, gravity and the weak nuclear force. Whereas electromagnetism and gravity produce the effects that govern matter on larger scales, the weak and strong nuclear forces govern matter at the subatomic level.
Previously, scientists thought that it was impossible to obtain this kind of detailed information about subatomic particles. However, the researchers were able to obtain results by pairing two theoretical frameworks with existing data – modelling systems that rely on electromagnetism and gravity, respectively. The first model concerns generalized parton distributions (GPDs), while the second involves gravitational form factors.
Parton modelling refers to modelling subatomic entities (like quarks) inside protons and neutrons, which allows scientists to create 3D images of a proton’s or neutron’s structure (as probed by the electromagnetic force). The second model describes the scattering of subatomic particles by classical gravitational fields, which characterizes the mechanical structure of protons when probed via the gravitational force.
As noted, scientists previously thought that this was impossible due to the extreme weakness of the gravitational interaction. However, recent theoretical work has indicated that it could be possible to determine the mechanical structure of a proton using electromagnetic probes as a substitute for gravitational probes. According to Latifa Elouadrhiri – a Jefferson Lab staff scientist and co-author on the paper – that is what their team set out to prove.
“This is the beauty of it. You have this map that you think you will never get,” she said. “But here we are, filling it in with this electromagnetic probe.”
For the sake of their study, the team used the DOE’s Continuous Electron Beam Accelerator Facility at the TJNAF to create a beam of electrons. These were then directed into the nuclei of atoms where they interacted electromagnetically with the quarks inside protons via a process called deeply virtual Compton scattering (DVCS). In this process, an electron exchanges a virtual photon with a quark, transferring energy to the quark and proton.
Shortly thereafter, the proton releases this energy by emitting another photon while remaining intact. Through this process, the team was able to produce detailed information about the mechanics going on inside the protons they probed. As Francois-Xavier Girod, a Jefferson Lab staff scientist and co-author on the paper, explained:
“There’s a photon coming in and a photon coming out. And the pair of photons both are spin-1. That gives us the same information as exchanging one graviton particle with spin-2. So now, one can basically do the same thing that we have done in electromagnetic processes — but relative to the gravitational form factors, which represent the mechanical structure of the proton.”
The next step, according to the research team, will be to apply the technique to even more precise data that will soon be released. This will reduce uncertainties in the current analysis and allow the team to reveal other mechanical properties inside protons – like the internal shear forces and the proton’s mechanical radius. These results, and those the team hope to reveal in the future, are sure to be of interest to other physicists.
“We are providing a way of visualizing the magnitude and distribution of the strong force inside the proton,” said Burkert. “This opens up an entirely new direction in nuclear and particle physics that can be explored in the future.”
Perhaps, just perhaps, it will bring us closer to understanding how the four fundamental forces of the Universe interact. While scientists understand how electromagnetism and weak and strong nuclear forces interact with each other (as described by Quantum Mechanics), they are still unsure how these interact with gravity (as described by General Relativity).
If and when the four forces can be unified in a Theory of Everything (ToE), one of the last and greatest hurdles to a complete understanding of the Universe will finally be removed.
Stephen Hawking is rightly seen as one of the most influential scientists of our time. In his time on this planet, the famed physicist, science communicator, author and luminary became a household name, synonymous with the likes of Einstein, Newton and Galileo. What is even more impressive is the fact that he managed to maintain his commitment to science, education and humanitarian efforts despite suffering from a slow, degenerative disease.
Even though Hawking recently passed away, his influence is still being felt. Shortly before his death, Hawking submitted a paper offering his final theory on the origins of the Universe. The paper, which was published earlier this week (on Wednesday, May 2nd), offers a new take on the Big Bang Theory that could revolutionize the way we think of the Universe, how it was created, and how it evolved.
The paper, titled “A smooth exit from eternal inflation?“, was published in the Journal of High Energy Physics. The theory was first announced at a conference at the University of Cambridge in July of last year, where Professor Thomas Hertog (a Belgian physicist at KU Leuven University) shared Hawking’s paper (which Hertog co-authored) on the occasion of his 75th birthday.
According to the current scientific consensus, all of the current and past matter in the Universe came into existence at the same time – roughly 13.8 billion years ago. At this time, all matter was compacted into a very small ball with infinite density and intense heat. Suddenly, this ball started to inflate at an exponential rate, and the Universe as we know it began.
However, it is widely believed that since this inflation started, quantum effects will keep it going forever in some regions of the Universe. This means that globally, the Universe’s inflation is eternal. In this respect, the observable part of our Universe (measuring 13.8 billion light-years in any direction) is just a region in which inflation has ended and stars and galaxies formed.
As Hawking put it: “The usual theory of eternal inflation predicts that globally our universe is like an infinite fractal, with a mosaic of different pocket universes, separated by an inflating ocean. The local laws of physics and chemistry can differ from one pocket universe to another, which together would form a multiverse. But I have never been a fan of the multiverse. If the scale of different universes in the multiverse is large or infinite the theory can’t be tested.”
In their new paper, Hawking and Hertog offer a new theory that predicts that the Universe is not an infinite fractal-like multiverse, but is finite and reasonably smooth. In short, they theorize that the eternal inflation, as part of the theory of the Big Bang, is wrong. As Hertog explained:
“The problem with the usual account of eternal inflation is that it assumes an existing background universe that evolves according to Einstein’s theory of general relativity and treats the quantum effects as small fluctuations around this. However, the dynamics of eternal inflation wipes out the separation between classical and quantum physics. As a consequence, Einstein’s theory breaks down in eternal inflation.”
In contrast to this, Hawking and Hertog offer an explanation based on String Theory, a branch of theoretical physics that attempts to unify General Relativity with quantum physics. This theory was proposed to explain how gravity interacts with the three other fundamental forces of the Universe (weak and strong nuclear forces and electromagnetism), thus producing a Theory of Everything (ToE).
To put it simply, this theory describes the fundamental constituents of the Universe as tiny, one-dimensional vibrating strings. Hawking and Hertog’s approach uses the holography concept of string theory, which postulates that the Universe is a large and complex hologram. In this theory, physical reality in certain 3D spaces can be mathematically reduced to 2D projections on a surface.
Together, Hawking and Hertog developed a variation of this concept to project out the dimension of time in eternal inflation. This enabled them to describe eternal inflation without having to rely on General Relativity, thus reducing inflation to a timeless state defined on a spatial surface at the beginning of time. In this respect, the new theory represents a change from Hawking’s earlier work on “no boundary theory”.
Also known as the Hartle and Hawking No Boundary Proposal, this theory viewed the Universe as a quantum particle – assigning it a wave function that described all possible Universes. This theory also predicted that if you go back in time to the beginning of the Universe, it would shrink and close off like a sphere. Lastly, it predicted that the Universe would eventually stop expanding and collapse in on itself.
As Hertog explains, this new theory is a departure from that earlier work:
“When we trace the evolution of our universe backwards in time, at some point we arrive at the threshold of eternal inflation, where our familiar notion of time ceases to have any meaning. Now we’re saying that there is a boundary in our past.”
Using this theory, Hawking and Hertog were able to derive more reliable predictions about the global structure of the Universe. In addition, a Universe predicted to emerge from eternal inflation on the past boundary is also finite and much simpler. Last, but not least, the theory is more predictive and testable than the infinite Multiverse predicted by the old theory of eternal inflation.
“We are not down to a single, unique universe, but our findings imply a significant reduction of the multiverse, to a much smaller range of possible universes,” said Hawking. In theory, a finite and smooth Universe is one we can observe (at least locally) and will be governed by physical laws that we are already familiar with. Compared to an infinite number of Universes governed by different physical laws, it certainly simplifies the math!
Looking ahead, Hertog plans to study the implications of this theory on smaller scales using data obtained by space telescopes about the local Universe. In addition, he hopes to take advantage of recent studies concerning gravitational waves (GWs) and the many events that have been detected. Essentially, Hertog believes that primordial GWs generated at the exit from eternal inflation are the most promising means to test the model.
Due to the expansion of our Universe since the Big Bang, these GWs would have very long wavelengths, ones that fall outside the detection range of the Laser Interferometer Gravitational-Wave Observatory (LIGO) and Virgo. However, the Laser Interferometer Space Antenna (LISA) – an ESA-led plan for a space-based gravitational-wave observatory – and other future experiments may be capable of measuring them.
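As a rough illustration of why a space-based detector is needed, converting each detector’s band into wavelengths shows how different the two regimes are (the band edges below are assumed, commonly quoted ranges rather than values from the paper):

```python
# Approximate gravitational-wave bands for ground- and space-based detectors.
# Band edges are assumed, commonly quoted values used only for illustration.
c = 3.0e8  # speed of light (m/s)

bands_hz = {
    "LIGO/Virgo (ground)": (10.0, 1.0e4),   # ~10 Hz to ~10 kHz
    "LISA (space)": (1.0e-4, 1.0e-1),       # ~0.1 mHz to ~0.1 Hz
}

for name, (f_low, f_high) in bands_hz.items():
    print(f"{name}: wavelengths from ~{c / f_high:.1e} m to ~{c / f_low:.1e} m")
```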
Even though he is no longer with us, Hawking’s final theory could be his most profound contribution to science. If future research proves him correct, then Hawking will have resolved one of the most daunting problems in modern astrophysics and cosmology. Just one more achievement from a man who spent his life changing how people think about the Universe!
Lightning has always been a source of awe and mystery for us lowly mortals. In ancient times, people associated it with Gods like Zeus and Thor, the fathers of the Greek and Norse pantheons. With the birth of modern science and meteorology, lightning is no longer considered the province of the divine. However, this does not mean that the sense of mystery it carries has diminished one bit.
For example, scientists have found that lightning occurs in the atmospheres of other planets, like the gas giant Jupiter (appropriately!) and the hellish world of Venus. And according to a recent study from Kyoto University, gamma rays caused by lightning interact with air molecules, regularly producing radioisotopes and even positrons – the antimatter counterparts of electrons.
The study, titled “Photonuclear Reactions Triggered by Lightning Discharge“, recently appeared in the scientific journal Nature. The study was led by Teruaki Enoto, a researcher from The Hakubi Center for Advanced Research at Kyoto University, and included members from the University of Tokyo, Hokkaido University, Nagoya University, the RIKEN Nishina Center, the MAXI Team, and the Japan Atomic Energy Agency.
For some time, physicists have been aware that small bursts of high-energy gamma rays can be produced by lightning storms – what are known as “terrestrial gamma-ray flashes”. They are believed to be the result of static electric fields accelerating electrons, which are then slowed by the atmosphere. This phenomenon was first discovered by space-based observatories, and photons with energies of up to 100 million electron volts (100 MeV) have been observed.
Given the energy levels involved, the Japanese research team sought to examine how these bursts of gamma rays interact with air molecules. As Teruaki Enoto from Kyoto University, who leads the project, explained in a Kyoto University press release:
“We already knew that thunderclouds and lightning emit gamma rays, and hypothesized that they would react in some way with the nuclei of environmental elements in the atmosphere. In winter, Japan’s western coastal area is ideal for observing powerful lightning and thunderstorms. So, in 2015 we started building a series of small gamma-ray detectors, and placed them in various locations along the coast.”
Unfortunately, the team ran into funding problems along the way. As Enoto explained, they decided to reach out to the general public and established a crowdfunding campaign to fund their work. “We set up a crowdfunding campaign through the ‘academist’ site,” he said, “in which we explained our scientific method and aims for the project. Thanks to everybody’s support, we were able to make far more than our original funding goal.”
Thanks to the success of their campaign, the team built and installed particle detectors across the northwest coast of Honshu. In February of 2017, they installed four more detectors in Kashiwazaki City, in Niigata Prefecture. Shortly after the detectors were installed, a lightning strike took place just a few hundred meters away, and the team was able to study it.
What they found was something entirely new and unexpected. After analyzing the data, the team detected three distinct gamma-ray bursts of varying duration. The first was less than a millisecond long, the second was a gamma-ray afterglow that took several milliseconds to decay, and the last was a prolonged emission lasting about one minute. As Enoto explained:
“We could tell that the first burst was from the lightning strike. Through our analysis and calculations, we eventually determined the origins of the second and third emissions as well.”
They determined that the second emission – the afterglow – was caused by the lightning’s gamma rays reacting with nitrogen in the atmosphere. Essentially, these gamma rays are capable of knocking a neutron out of nitrogen nuclei, and it was the reabsorption of these neutrons by other atmospheric particles that produced the gamma-ray afterglow. The final, prolonged emission was the result of the now-unstable nitrogen atoms breaking down.
It was here that things really got interesting. As the unstable nitrogen broke down, it released positrons that then collided with electrons, causing matter-antimatter annihilations that released more gamma rays. As Enoto explained, this demonstrated, for the first time, that antimatter is something that can occur in nature due to common mechanisms.
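In symbols, the sequence described above can be sketched as follows (the specific isotopes and the 0.511 MeV annihilation energy are standard nuclear-physics values, offered here as an illustrative reading rather than details quoted from the paper):

```latex
\gamma + {}^{14}\mathrm{N} \;\rightarrow\; {}^{13}\mathrm{N} + n
\qquad
{}^{13}\mathrm{N} \;\rightarrow\; {}^{13}\mathrm{C} + e^{+} + \nu_{e}
\qquad
e^{+} + e^{-} \;\rightarrow\; 2\gamma \;(\approx 0.511\ \mathrm{MeV\ each})
```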
“We have this idea that antimatter is something that only exists in science fiction,” he said. “Who knew that it could be passing right above our heads on a stormy day? And we know all this thanks to our supporters who joined us through ‘academist’. We are truly grateful to all.”
If these results are indeed correct, then antimatter is not the extremely rare substance that we tend to think it is. In addition, the study could present new opportunities for high-energy physics and antimatter research. All of this research could also lead to the development of new or refined techniques for creating antimatter.
Looking ahead, Enoto and his team hope to conduct more research using the ten detectors they still have operating along the coast of Japan. They also hope to continue involving the public with their research, a process that goes far beyond crowdfunding and includes the efforts of citizen scientists to help process and interpret data.
For more than three decades, the internal structure and evolution of Uranus and Neptune has been a subject of debate among scientists. Given their distance from Earth and the fact that only a few robotic spacecraft have studied them directly, what goes on inside these ice giants is still something of a mystery. In lieu of direct evidence, scientists have relied on models and experiments to replicate the conditions in their interiors.
For instance, it has been theorized that within Uranus and Neptune, the extreme pressure conditions squeeze hydrogen and carbon into diamonds, which then sink down into the interior. Thanks to an experiment conducted by an international team of scientists, this “diamond rain” was recreated under laboratory conditions for the first time, giving us the first glimpse into what things could be like inside ice giants.
For decades, scientists have held that the interiors of planets like Uranus and Neptune consist of solid cores surrounded by dense concentrations of “ices”. In this case, ice refers to hydrogen compounds containing heavier elements (i.e. carbon, oxygen and/or nitrogen), which form substances like water and ammonia. Under extreme pressure conditions, these compounds become semi-solid, forming “slush”.
And at roughly 10,000 kilometers (6,214 mi) beneath the surface of these planets, the compression of hydrocarbons is thought to create diamonds. To recreate these conditions, the international team subjected a sample of polystyrene plastic to two shock waves using an intense optical laser at the Matter in Extreme Conditions (MEC) instrument, which they then paired with X-ray pulses from SLAC’s Linac Coherent Light Source (LCLS).
As Dominik Kraus, the physicist who led the research, explained: “So far, no one has been able to directly observe these sparkling showers in an experimental setting. In our experiment, we exposed a special kind of plastic – polystyrene, which also consists of a mix of carbon and hydrogen – to conditions similar to those inside Neptune or Uranus.”
The plastic in this experiment simulated compounds formed from methane, a molecule that consists of one carbon atom bound to four hydrogen atoms. It is the presence of this compound that gives both Uranus and Neptune their distinct blue coloring. In the intermediate layers of these planets, it also forms hydrocarbon chains that are compressed into diamonds that could be millions of carats in weight.
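For a sense of scale, a simple unit conversion (using the standard definition of the metric carat, 0.2 grams) shows that a million-carat diamond would weigh hundreds of kilograms:

```python
# Quick unit conversion: how heavy is a million-carat diamond?
CARAT_IN_GRAMS = 0.2  # definition of the metric carat
mass_kg = 1_000_000 * CARAT_IN_GRAMS / 1000
print(f"A one-million-carat diamond would weigh about {mass_kg:.0f} kg")
```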
The optical laser the team employed created two shock waves which accurately simulated the temperature and pressure conditions at the intermediate layers of Uranus and Neptune. The first shock was smaller and slower, and was then overtaken by the stronger second shock. When they overlapped, the pressure peaked and tiny diamonds began to form. At this point, the team probed the reactions with x-ray pulses from the LCLS.
This technique, known as x-ray diffraction, allowed the team to see the small diamonds form in real-time, which was necessary since a reaction of this kind can only last for fractions of a second. As Siegfried Glenzer, a professor of photon science at SLAC and a co-author of the paper, explained:
“For this experiment, we had LCLS, the brightest X-ray source in the world. You need these intense, fast pulses of X-rays to unambiguously see the structure of these diamonds, because they are only formed in the laboratory for such a very short time.”
In the end, the research team found that nearly every carbon atom in the original plastic sample was incorporated into small diamond structures. While they measured just a few nanometers in diameter, the team predicts that on Uranus and Neptune, the diamonds would be much larger. Over time, they speculate that these could sink into the planets’ atmospheres and form a layer of diamond around the core.
In previous studies, attempts to recreate the conditions in Uranus and Neptune’s interior met with limited success. While they showed results that indicated the formation of graphite and diamonds, the teams conducting them could not capture the measurements in real-time. As noted, the extreme temperatures and pressures that exist within gas/ice giants can only be simulated in a laboratory for very short periods of time.
However, thanks to LCLS – which creates X-ray pulses a billion times brighter than previous instruments and fires them at a rate of about 120 pulses per second (each one lasting just quadrillionths of a second) – the science team was able to directly measure the chemical reaction for the first time. In the end, these results are of particular significance to planetary scientists who specialize in the study of how planets form and evolve.
As Kraus explained, it could cause scientists to rethink the relationship between a planet’s mass and its radius, and lead to new models of planet classification:
“With planets, the relationship between mass and radius can tell scientists quite a bit about the chemistry. And the chemistry that happens in the interior can provide additional information about some of the defining features of the planet… We can’t go inside the planets and look at them, so these laboratory experiments complement satellite and telescope observations.”
This experiment also opens new possibilities for matter compression and the creation of synthetic materials. Nanodiamonds currently have many commercial applications – e.g. medicine, electronics, scientific equipment, etc. – and creating them with lasers would be far more cost-effective and safe than current methods (which involve explosives).
Fusion research, which also relies on creating extreme pressure and temperature conditions to generate abundant energy, could also benefit from this experiment. On top of that, the results of this study offer a tantalizing hint at what the cores of massive planets look like. In addition to being composed of silicate rock and metals, ice giants may also have a diamond layer at their core-mantle boundary.
Assuming we can create probes of sufficiently strong super-materials someday, wouldn’t that be worth looking into?
Neutrinos are one of the fundamental particles that make up the Universe. Compared to other types of particles, they have very little mass, no charge, and only interact with others via the weak nuclear force and gravity. As such, finding evidence of their interactions is extremely difficult, requiring massive instruments located deep underground to shield them from any interference.
However, using the Spallation Neutron Source (SNS) – a research facility located at the Oak Ridge National Laboratory (ORNL) – an international team of researchers recently made a historic discovery about neutrinos using an entirely different method. As part of the COHERENT experiment, these results confirm a prediction made 43 years ago and offer new possibilities for neutrino research.
Quantum entanglement remains one of the most challenging fields of study for modern physicists. Described by Einstein as “spooky action at a distance”, entanglement is an aspect of quantum mechanics that scientists have long sought to reconcile with classical mechanics. Essentially, the fact that two particles can be connected over great distances violates the rules of locality and realism.
Formally, this is a violation of Bell’s Inequality, a relation that any theory based on locality and realism must obey, and which quantum mechanics is predicted to violate. However, in a recent study, a team of researchers from the Ludwig-Maximilian University (LMU) and the Max Planck Institute for Quantum Optics in Munich conducted tests which once again violate Bell’s Inequality and prove the existence of entanglement.
Bell’s Inequality (named after Irish physicist John Bell, who proposed it in 1964) rests on the assumptions that properties of objects exist independent of being observed (realism), and that no information or physical influence can propagate faster than the speed of light (locality). These rules perfectly describe the reality we human beings experience on a daily basis, where things are rooted in a particular space and time and exist independent of an observer.
However, at the quantum level, things do not appear to follow these rules. Not only can particles be connected in non-local ways over large distances (i.e. entanglement), but the properties of these particles cannot be defined until they are measured. And while all experiments have confirmed that the predictions of quantum mechanics are correct, some scientists have continued to argue that there are loopholes that allow for local realism.
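One concrete way to express the conflict is the CHSH form of Bell’s Inequality: any local realist theory caps a particular combination of correlations at 2, while quantum mechanics predicts up to 2√2 ≈ 2.83 for entangled particles. The sketch below uses the textbook singlet-state correlation and the standard optimal measurement angles; these are illustrative choices, not details of the Munich experiment:

```python
import math

# Quantum prediction for two spin-1/2 particles in a singlet state measured
# along directions separated by the angle (a - b): E(a, b) = -cos(a - b).
def correlation(a: float, b: float) -> float:
    return -math.cos(a - b)

# Standard angle choices that maximise the CHSH combination.
a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, 3 * math.pi / 4

S = (correlation(a, b) - correlation(a, b_prime)
     + correlation(a_prime, b) + correlation(a_prime, b_prime))

print(f"|S| = {abs(S):.3f}  (any local realist theory requires |S| <= 2)")
```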
To address this, the Munich team conducted an experiment using two laboratories at LMU. While the first lab was located in the basement of the physics department, the second was located in the basement of the economics department – roughly 400 meters away. In both labs, the teams captured a single rubidium atom in an optical trap and then began exciting it until it released a single photon.
As Dr. Wenjamin Rosenfeld explained in a Max Planck Institute press release:
“Our two observer stations are independently operated and are equipped with their own laser and control systems. Because of the 400 meters distance between the laboratories, communication from one to the other would take 1328 nanoseconds, which is much more than the duration of the measurement process. So, no information on the measurement in one lab can be used in the other lab. That’s how we close the locality loophole.”
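The 1,328-nanosecond figure Rosenfeld quotes is essentially the light-travel time between the two buildings; a straight-line estimate over 400 meters lands in the same ballpark:

```python
# Light-travel time between the two LMU laboratories (straight-line estimate).
SPEED_OF_LIGHT = 299_792_458.0   # m/s
distance_m = 400.0               # approximate separation of the two labs

travel_time_ns = distance_m / SPEED_OF_LIGHT * 1e9
print(f"Any signal needs at least ~{travel_time_ns:.0f} ns to cross between the labs")
```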
Once the two rubidium atoms were excited to the point of releasing a photon, the spin states of the rubidium atoms and the polarization states of the photons were effectively entangled. The photons were then coupled into optical fibers and guided to a set-up where they were brought to interference. After conducting a measurement run for eight days, the scientists were able to collect around 10,000 events to check for signs of entanglement.
This would have been indicated by the spins of the two trapped rubidium atoms, which would be pointing in the same direction (or in the opposite direction, depending on the kind of entanglement). What the Munich team found was that for the vast majority of the events, the atoms were in the same state (or in the opposite state), and that there were only six deviations consistent with Bell’s Inequality.
These results were also statistically more significant than those obtained by a team of Dutch physicists in 2015. For the sake of that study, the Dutch team conducted experiments using electrons in diamonds at labs that were 1.3 km apart. In the end, their results (and other recent tests of Bell’s Inequality) demonstrated that quantum entanglement is real, effectively closing the local realism loophole.
As Wenjamin Rosenfeld explained, the tests conducted by his team also went beyond these other experiments by addressing another major issue. “We were able to determine the spin-state of the atoms very fast and very efficiently,” he said. “Thereby we closed a second potential loophole: the assumption, that the observed violation is caused by an incomplete sample of detected atom pairs”.
By obtaining proof of the violation of Bell’s Inequality, scientists are not only helping to resolve an enduring incongruity between classical and quantum physics. They are also opening the door to some exciting possibilities. For instance, for years, scientists have anticipated the development of quantum processors, which rely on entanglement to simulate the zeros and ones of binary code.
Computers that rely on quantum mechanics would be exponentially faster than conventional microprocessors, ushering in a new age of research and development. The same principles have been proposed for cybersecurity, where quantum encryption would be used to encrypt information, making it invulnerable to hackers who rely on conventional computers.
Last, but certainly not least, there is the concept of Quantum Entanglement Communications, a method that would allow us to transmit information faster than the speed of light. Imagine the possibilities for space travel and exploration if we are no longer bound by the limits of relativistic communication!
Einstein wasn’t wrong when he characterized quantum entanglement as “spooky action”. Indeed, many of the implications of this phenomenon are still as frightening as they are fascinating to physicists. But the closer we come to understanding it, the closer we will be to developing an understanding of how all the known physical forces of the Universe fit together – aka. a Theory of Everything!
When it comes to the future of space exploration, some truly interesting concepts are being developed. Hoping to reach farther and reduce associated costs, one of the overarching goals is to find more fuel-efficient and effective means of sending robotic spacecraft, satellites and even crewed missions to their destinations. Towards this end, ideas like nuclear propulsion, ion engines and even antimatter are all being considered.
But this idea has to be the strangest one to date! It’s known as a ferrofluid thruster, a new concept that relies on ionic fluids that become strongly magnetized and release ions when exposed to a magnetic field. According to a new study produced by researchers from the Ion Space Propulsion Laboratory at Michigan Tech, this concept could very well be the future of satellite propulsion.
This study, which was recently published in the journal Physics of Fluids, presents an entirely new method for creating microthrusters – tiny nozzles that are used by small satellites to maneuver in orbit. Thanks to improvements in technology, small satellites – which are typically defined as those that weigh less than 500 kg (1,100 lbs) – can perform tasks that were once reserved for larger ones.
As such, they are making up an increasingly large share of the satellite market, and many more are expected to be launched in the near future. In fact, it is estimated that between 2015 and 2019, over 500 small satellites will be launched to LEO, with an estimated market value of $7.4 billion. Little wonder then why researchers are looking at various types of microthrusters to ensure that these satellites can maneuver effectively.
While there is no shortage of possibilities, finding one that balances cost-effectiveness and reliability has been difficult. To address this, an MTU research team began conducting a study that considered ferrofluids as a possible solution. As noted, ferrofluids are ionic liquids that become active when exposed to a magnetic field, forming peaks that emit small amounts of ions.
The formation of these peaks – which relax back to a flat surface once the magnetic field is removed – is a phenomenon known as the Rosensweig instability. Led by Brandon A. Jackson – a doctoral candidate in mechanical engineering at Michigan Technological University – the MTU research team began to consider how this could be turned into propulsion. Other members included fellow doctoral candidate Kurt Terhune and Professor Lyon B. King.
Prof. King, the Ron & Elaine Starr Professor in Space Systems at Michigan Tech, has been researching the physics of ferrofluids for many years, thanks to support provided by the Air Force Office of Scientific Research (AFOSR). In 2012, he proposed using such ionic fluids to create a microthruster for modern satellites, based on previous studies conducted by researchers at the University of Sydney.
As he explained in a MTU press release, this method offers a simple and effective way to create a reliable microthruster:
“We’re working with a unique material called an ionic liquid ferrofluid. When we put a magnet underneath a small pool of the ferrofluid, it turns into a beautiful hedgehog structure of aligned peaks. When we apply a strong electric field to that array of peaks, each one emits an individual micro-jet of ions.”
With the help of King, who oversees MTU’s Ion Space Propulsion Laboratory, Jackson and Terhune began conducting an experimental and computational study on the dynamics of the ferrofluid. From this, they created a computational model that taught them much about the relationships between magnetic, electric and surface-tension stresses, and they were even surprised by some of what they saw.
“We wanted to learn what led up to emission instability in one single peak of the ferrofluid microthruster,” said Jackson. “We learned that the magnetic field has a large effect in preconditioning the fluid electric stress.”
Ultimately, what they had created was a model for an electrospray ionic liquid ferrofluid thruster. Unlike conventional electrospray thrusters – which generate propulsion with electrical charges that send tiny jets of fluid through microscopic needles – a ferrofluid electrospray thruster would be able to do away with these needles, which are expensive to manufacture and vulnerable to damage.
Instead, the thruster they are proposing would be able to assemble itself out of its own propellant, would rely on no fragile parts, and would essentially be indestructible. It would also present advantages over conventional plasma thrusters, which are apparently unreliable when scaled down for small satellites. With the success of their model, the AFOSR recently decided to award King a second contract to continue studying ferrofluids.
With this funding secured, King is confident that they can put what they learned with this study to good use, and scale it up to examine what happens with multiple peaks. As he explained:
“Often in the lab we’ll have one peak working and 99 others loafing. Brandon’s model will be a vital tool for the team going forward. If we are successful, our thruster will enable small inexpensive satellites with their own propulsion to be mass produced. That could improve remote sensing for better climate modeling, or provide better internet connectivity, which three billion people in the world still do not have.”
Looking ahead, the team wants to conduct experiments on how an actual thruster might perform. The team has also begun working with Professor Juan Fernandez de la Mora of Yale University, one of the world’s leading experts on electrospray propulsion, to help bring their proposal to fruition. Naturally, it will take many years before a prototype is ready, and such a thruster would likely need to operate with about 100 peaks to be considered viable.
Nevertheless, the technology holds promise for a market that is expected to grow by leaps and bounds in the coming years and decades. Facilitating everything from worldwide internet access and telecommunications to scientific research, there is likely to be no shortage of smallsats, cubesats, nanosats, etc. taking to space very soon. They will all need reliable propulsion if they are to stay clear of each other and do their jobs!
Michigan Tech also has patents pending for the technology, which has applications that go beyond propulsion to include spectrometry, pharmaceuticals, and nanofabrication.
The Standard Model of particle physics has been the predominant means of explaining what the basic building blocks of matter are and how they interact for decades. First proposed in the 1970s, the model claims that for every particle created, there is an anti-particle. As such, an enduring mystery posed by this model is why the Universe can exist if it is theoretically made up of equal parts of matter and antimatter.
This seeming disparity is believed to be connected to charge-parity (CP) violation, which has been the subject of experiments for many years. But so far, no definitive demonstration has been made of this violation in neutrinos, or of how so much matter can exist in the Universe without its counterpart. But thanks to new findings released by the international Tokai-to-Kamioka (T2K) collaboration, we may be one step closer to understanding why this disparity exists.
First observed in 1964, CP violation occurs when, under certain conditions, the laws of charge-symmetry and parity-symmetry (aka. CP-symmetry) do not apply. These laws state that the physics governing a particle should be the same if it were interchanged with its antiparticle while its spatial coordinates were inverted. From this observation, one of the greatest cosmological mysteries emerged.
If the laws governing matter and antimatter are the same, then why is it that the Universe is so matter-dominated? Alternately, if matter and antimatter are fundamentally different, then how does this accord with our notions of symmetry? Answering these questions is not only important for our predominant cosmological theories, it is also intrinsic to understanding how the weak interactions that govern particles work.
Established in June of 2011, the international T2K collaboration is the first experiment in the world dedicated to answering this mystery by studying neutrino and anti-neutrino oscillations. The experiment begins with high-intensity beams of muon neutrinos (or muon anti-neutrinos) being generated at the Japan Proton Accelerator Research Complex (J-PARC), which are then fired towards the Super-Kamiokande detector 295 km away.
This detector is currently one of the world’s largest and most sophisticated, dedicated to the detection and study of solar and atmospheric neutrinos. As neutrinos travel between the two facilities, they change “flavor” – going from muon neutrinos or anti-neutrinos to electron neutrinos or anti-neutrinos. In monitoring these neutrino and anti-neutrino beams, the experiment watches for different rates of oscillation.
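As a rough illustration of why the 295 km baseline matters, the standard two-flavour oscillation formula predicts a near-maximal effect at T2K’s typical beam energy. The parameter values below are assumed, textbook-level numbers used purely for illustration, not figures from the T2K analysis:

```python
import math

# Two-flavour approximation for muon-neutrino disappearance:
#   P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, L in km and E in GeV.
sin2_2theta = 1.0   # assumed near-maximal atmospheric mixing
dm2 = 2.5e-3        # assumed mass-squared splitting (eV^2)
L = 295.0           # J-PARC to Super-Kamiokande baseline (km)
E = 0.6             # typical T2K beam energy (GeV)

probability = sin2_2theta * math.sin(1.27 * dm2 * L / E) ** 2
print(f"Oscillation probability over {L:.0f} km at {E} GeV: ~{probability:.2f}")
```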
This difference in oscillation would show that there is an imbalance between particles and antiparticles, and thus provide the first definitive evidence of CP violation in neutrinos. It would also indicate that there is physics beyond the Standard Model that scientists have yet to probe. This past April, the first data set produced by T2K was released, which provided some telling results.
As Mark Hartz, a T2K collaborator and the Kavli IPMU Project Assistant Professor, said in a recent press release:
“While the data sets are still too small to make a conclusive statement, we have seen a weak preference for large CP violation and we are excited to continue to collect data and make a more sensitive search for CP violation.”
These results, which were recently published in Physical Review Letters, include all data runs from January 2010 to May 2016. In total, this data comprised 7.482 × 10²⁰ protons (in neutrino mode), which yielded 32 electron neutrino and 135 muon neutrino events, and 7.471 × 10²⁰ protons (in antineutrino mode), which yielded 4 electron anti-neutrino and 66 muon anti-neutrino events.
In other words, the first batch of data has provided some evidence for CP violation, with a confidence level of 90%. But this is just the beginning, and the experiment is expected to run for another ten years before wrapping up. “If we are lucky and the CP violation effect is large, we may expect 3 sigma evidence, or about 99.7% confidence level, for CP violation by 2026,” said Hartz.
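For readers unfamiliar with the jargon, the “sigma” levels Hartz mentions map onto confidence levels through the normal distribution; a quick conversion shows where the 99.7% figure comes from:

```python
import math

# Two-sided confidence level associated with an n-sigma result (normal distribution).
def confidence_level(n_sigma: float) -> float:
    return math.erf(n_sigma / math.sqrt(2))

for n in (1, 2, 3):
    print(f"{n} sigma  ->  {confidence_level(n) * 100:.1f}% confidence")
```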
If the experiment proves successful, physicists may finally be able to explain how the early Universe avoided annihilating itself. It is also likely to help reveal aspects of the Universe that particle physicists are anxious to get into! For it is here that the answers to the deepest secrets of the Universe, like how all of its fundamental forces fit together, are likely to be found.